Sample records for distributed tsunami warning

  1. Implementation of a Global Navigation Satellite System (GNSS) Augmentation to Tsunami Early Warning Systems

    NASA Astrophysics Data System (ADS)

    LaBrecque, John

    2016-04-01

    The Global Geodetic Observing System has issued a Call for Participation to research scientists, geodetic research groups and national agencies in support of the implementation of the IUGG recommendation for a Global Navigation Satellite System (GNSS) Augmentation to Tsunami Early Warning Systems. The call seeks to establish a working group to be a catalyst and motivating force for the definition of requirements, identification of resources, and for the encouragement of international cooperation in the establishment, advancement, and utilization of GNSS for Tsunami Early Warning. During the past fifteen years the populations of the Indo-Pacific region experienced a series of mega-thrust earthquakes followed by devastating tsunamis that claimed nearly 300,000 lives. The future resiliency of the region will depend upon improvements to infrastructure and emergency response that will require very significant investments from the Indo-Pacific economies. The estimation of earthquake moment magnitude, source mechanism and the distribution of crustal deformation are critical to rapid tsunami warning. Geodetic research groups have demonstrated the use of GNSS data to estimate earthquake moment magnitude, source mechanism and the distribution of crustal deformation sufficient for the accurate and timely prediction of tsunamis generated by mega-thrust earthquakes. GNSS data have also been used to measure the formation and propagation of tsunamis via ionospheric disturbances acoustically coupled to the propagating surface waves; thereby providing a new technique to track tsunami propagation across ocean basins, opening the way for improving tsunami propagation models, and providing accurate warning to communities in the far field. These two new advancements can deliver timely and accurate tsunami warnings to coastal communities in the near and far field of mega-thrust earthquakes. This presentation will present the justification for and the details of the GGOS Call for Participation.

  2. Development of a GNSS-Enhanced Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Bawden, G. W.; Melbourne, T. I.; Bock, Y.; Song, Y. T.; Komjathy, A.

    2015-12-01

    The past decade has witnessed a terrible loss of life and economic disruption caused by large earthquakes and resultant tsunamis impacting coastal communities and infrastructure across the Indo-Pacific region. NASA has funded the early development of a prototype real-time Global Navigation Satellite System (RT-GNSS) based rapid earthquake and tsunami early warning (GNSS-TEW) system that may be used to enhance seismic tsunami early warning systems for large earthquakes. This prototype GNSS-TEW system geodetically estimates fault parameters (earthquake magnitude, location, strike, dip, and slip magnitude/direction on a gridded fault plane both along strike and at depth) and tsunami source parameters (seafloor displacement, tsunami energy scale, and 3D tsunami initials) within minutes after the mainshock based on dynamic numerical inversions/regressions of the real-time measured displacements within a spatially distributed real-time GNSS network(s) spanning the epicentral region. It is also possible to measure fluctuations in the ionosphere's total electron content (TEC) in the RT-GNSS data caused by the pressure wave from the tsunami. This TEC approach can detect if a tsunami has been triggered by an earthquake, track its waves as they propagate through the oceanic basins, and provide upwards of 45 minutes early warning. These combined real-time geodetic approaches will very quickly address a number of important questions in the immediate minutes following a major earthquake: How big was the earthquake and what are its fault parameters? Could the earthquake have produced a tsunami and was a tsunami generated?
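    As a rough illustration of the geodetic estimation step described above, the following sketch (hypothetical station geometry and Green's functions, not the prototype's actual code) performs a damped least-squares slip inversion of GNSS displacements and converts the summed slip into a moment magnitude.

```python
# Minimal sketch of a linearized GNSS slip inversion: real-time displacements d
# are regressed against precomputed Green's functions G (surface displacement per
# unit slip on each subfault), and the resulting slip yields a magnitude estimate.
import numpy as np

def invert_slip(G, d, damping=0.1):
    """Damped least squares: minimize ||G m - d||^2 + damping * ||m||^2."""
    n = G.shape[1]
    A = np.vstack([G, np.sqrt(damping) * np.eye(n)])
    b = np.concatenate([d, np.zeros(n)])
    slip, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(slip, 0.0, None)            # forbid back-slip in this toy example

def moment_magnitude(slip, subfault_area_m2, rigidity_pa=3.0e10):
    """M0 = mu * A * sum(slip); Mw from the Hanks & Kanamori relation."""
    m0 = rigidity_pa * subfault_area_m2 * np.sum(slip)
    return (2.0 / 3.0) * (np.log10(m0) - 9.1)

# Hypothetical numbers purely for illustration: 3 GNSS stations (E,N,U) x 4 subfaults.
rng = np.random.default_rng(0)
G = rng.normal(scale=0.05, size=(9, 4))        # m of displacement per m of slip
true_slip = np.array([2.0, 5.0, 4.0, 1.0])     # m
d = G @ true_slip + rng.normal(scale=0.005, size=9)   # observed displacements (m)

slip = invert_slip(G, d)
print("estimated slip (m):", np.round(slip, 2))
print("Mw estimate:", round(moment_magnitude(slip, subfault_area_m2=30e3 * 30e3), 2))
```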

  3. 2006 - 2016: Ten Years Of Tsunami In French Polynesia

    NASA Astrophysics Data System (ADS)

    Reymond, D.; Jamelot, A.; Hyvernaud, O.

    2016-12-01

    Located in the South-central Pacific and despite its far-field situation, French Polynesia is very much concerned by the tsunamis generated along the major subduction zones located around the Pacific. At the time of writing, 10 tsunamis have been generated in the Pacific Ocean since 2006; all of these events were recorded in French Polynesia and produced different levels of warning, ranging from a simple seismic warning with an information bulletin up to an effective tsunami warning with evacuation of the coastal zone. These tsunamigenic events represent an invaluable opportunity to evolve and test the tsunami warning system developed in French Polynesia: during the last ten years, the warning rules have evolved from a simple magnitude criterion to the computation of the main seismic source parameters (location, slowness determinant (Newman & Okal, 1998) and focal geometry) using two independent methods: the first uses an inversion of W-phases (Kanamori & Rivera, 2012) and the second performs an inversion of long-period surface waves (Clément & Reymond, 2014); the source parameters thus estimated allow the expected distributions of tsunami heights to be computed in near real time (with the help of a supercomputer and parallelized numerical simulation codes). Furthermore, two kinds of numerical modeling are used: the first, very rapid (about 5 minutes of computation time), is based on Green's law (Jamelot & Reymond, 2015), and a more detailed and precise one uses classical numerical simulations through nested grids (about 45 minutes of computation time). Consequently, the criteria for tsunami warning are presently based on the expected tsunami heights in the different archipelagos and islands of French Polynesia. This major evolution allows different levels of warning to be used for the different archipelagos, working in tandem with the Civil Defense. We present a comparison of the historical observed tsunami heights (instrumental records, including deep-ocean measurements provided by DART buoys and measured tsunami run-ups) with the computed ones. In addition, the sites known for their amplification and resonance effects are well reproduced by the numerical simulations.
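    At decision time, the criteria described above reduce to comparing forecast coastal heights per archipelago against level thresholds; the sketch below shows only that final mapping step, with hypothetical heights and thresholds rather than the operational French Polynesian values.

```python
# Illustrative only: mapping forecast coastal tsunami heights to warning levels
# per archipelago. Heights and thresholds are hypothetical and do not reproduce
# the operational criteria of the French Polynesian warning center.
FORECAST_HEIGHTS_M = {            # e.g. output of the rapid Green's-law forecast
    "Marquesas": 1.8,
    "Society": 0.4,
    "Austral": 0.15,
    "Tuamotu-Gambier": 0.6,
}

def warning_level(height_m):
    """Map an expected coastal height to a (hypothetical) warning level."""
    if height_m >= 1.0:
        return "tsunami warning - evacuate coastal zone"
    if height_m >= 0.3:
        return "tsunami advisory - beach and harbor hazard"
    return "information bulletin only"

for archipelago, h in FORECAST_HEIGHTS_M.items():
    print(f"{archipelago}: {h:.2f} m -> {warning_level(h)}")
```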

  4. Tsunami Forecast Progress Five Years After Indonesian Disaster

    NASA Astrophysics Data System (ADS)

    Titov, Vasily V.; Bernard, Eddie N.; Weinstein, Stuart A.; Kanoglu, Utku; Synolakis, Costas E.

    2010-05-01

    Almost five years after the 26 December 2004 Indian Ocean tragedy, tsunami warnings are finally benefiting from decades of research toward effective model-based forecasts. Since the 2004 tsunami, two seminal advances have been (i) deep-ocean tsunami measurements with tsunameters and (ii) their use in accurately forecasting tsunamis after the tsunami has been generated. Using direct measurements of deep-ocean tsunami heights, assimilated into numerical models for specific locations, greatly improves the real-time forecast accuracy over earthquake-derived magnitude estimates of tsunami impact. Since 2003, this method has been used to forecast tsunamis at specific harbors for different events in the Pacific and Indian Oceans. Recent tsunamis illustrated how this technology is being adopted in global tsunami warning operations. The U.S. forecasting system was used by both research and operations to evaluate the tsunami hazard. Tests demonstrated the effectiveness of operational tsunami forecasting using real-time deep-ocean data assimilated into forecast models. Several examples also showed the potential of distributed forecast tools. With IOC and USAID funding, NOAA researchers at PMEL developed the Community Model Interface for Tsunami (ComMIT) tool and distributed it through extensive capacity-building sessions in the Indian Ocean. Over a hundred scientists have been trained in tsunami inundation mapping, leading to the first generation of inundation models for many Indian Ocean shorelines. These same inundation models can also be used for real-time tsunami forecasts, as was demonstrated during several events.

  5. Web-based Tsunami Early Warning System with instant Tsunami Propagation Calculations in the GPU Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Spazier, J.; Reißland, S.

    2014-12-01

    Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype that opens up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and to react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the concept a whirl and to shape science's future. Further functionality, improvements and possibly profound changes have to be implemented successively based on the users' evolving needs.

  6. Anatomy of Historical Tsunamis: Lessons Learned for Tsunami Warning

    NASA Astrophysics Data System (ADS)

    Igarashi, Y.; Kong, L.; Yamamoto, M.; McCreery, C. S.

    2011-11-01

    Tsunamis are high-impact disasters that can cause death and destruction locally within a few minutes of their occurrence and across oceans hours, even up to a day, afterward. Efforts to establish tsunami warning systems to protect life and property began in the Pacific after the 1946 Aleutian Islands tsunami caused casualties in Hawaii. Seismic and sea level data were used by a central control center to evaluate tsunamigenic potential and then issue alerts and warnings. The ensuing events of 1952, 1957, and 1960 tested the new system, which continued to expand and evolve from a United States system to an international system in 1965. The Tsunami Warning System in the Pacific (ITSU) steadily improved through the decades as more stations became available in real and near-real time through better communications technology and greater bandwidth. New analysis techniques, coupled with more data of higher quality, resulted in better detection, greater solution accuracy, and more reliable warnings, but limitations still exist in constraining the source and in accurately predicting propagation of the wave from source to shore. Tsunami event data collected over the last two decades through international tsunami science surveys have led to more realistic models for source generation and inundation, and within the warning centers, real-time tsunami wave forecasting will become a reality in the near future. The tsunami warning system is an international cooperative effort amongst countries supported by global and national monitoring networks and dedicated tsunami warning centers; the research community has contributed to the system by advancing and improving its analysis tools. Lessons learned from the earliest tsunamis provided the backbone for the present system, but despite 45 years of experience, the 2004 Indian Ocean tsunami reminded us that tsunamis strike and kill everywhere, not just in the Pacific. Today, a global intergovernmental tsunami warning system is coordinated under the United Nations. This paper reviews historical tsunamis, their warning activities, and their sea level records to highlight lessons learned with the focus on how these insights have helped to drive further development of tsunami warning systems and their tsunami warning centers. While the international systems do well for teletsunamis, faster detection, more accurate evaluations, and widespread timely alerts are still the goals, and challenges still remain to achieving early warning against the more frequent and destructive local tsunamis.

  7. Streamlining Tsunami Messages (e.g., Warnings) of the US National Tsunami Warning Center, Palmer, Alaska

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Sorensen, J. H.; Vogt Sorensen, B.; Whitmore, P.; Johnston, D. M.

    2016-12-01

    Spurred in part by worldwide interest in improving warning messaging for, and response to, tsunamis in the wake of several catastrophic tsunamis since 2004, and by growing interest at the US National Weather Service (NWS) in integrating social science into their Tsunami Program, the NWS Tsunami Warning Centers in Alaska and Hawaii have made great progress toward enhancing tsunami messages. These include numerous products, among them Tsunami Warnings, Tsunami Advisories and Tsunami Watches. Beginning in 2010, we worked with US National Tsunami Hazard Mitigation Program (NTHMP) Warning Coordination and Mitigation and Education Subcommittee members; Tsunami Program administrators; and NWS Weather Forecast Officers to conduct a series of focus group meetings with stakeholders in coastal areas of Alaska, American Samoa, California, Hawaii, North Carolina, Oregon, US Virgin Islands and Washington to understand end-user perceptions of existing messages and their needs in message products. We also reviewed research literature on behavioral response to warnings to develop a Tsunami Warning Message Metric that could be used to guide revisions to tsunami warning messages of both warning centers. The message metric is divided into categories of Message Content, Style, Order, Formatting, and Receiver Characteristics. A sample message is evaluated by cross-referencing the message with the operational definitions of metric factors. Findings are then used to guide revisions of the message until the characteristics of each factor are met, whether the message is a full-length or short message. Incrementally, this work contributed to revisions in the format, content and style of message products issued by the National Tsunami Warning Center (NTWC). Since that time, interest in short warning messages has continued to increase, and in May 2016 the NTWC began efforts to revise message products to take advantage of recent NWS policy changes allowing use of mixed-case text format and expanded punctuation, a practice which the NWS first started in 2010. Here we describe our application of a modified warning message metric to develop new streamlined messages using mixed-case text. These messages reflect current state-of-the-art knowledge on warning message effectiveness.

  8. Evolution of tsunami warning systems and products.

    PubMed

    Bernard, Eddie; Titov, Vasily

    2015-10-28

    Each year, about 60 000 people and $4 billion (US$) in assets are exposed to the global tsunami hazard. Accurate and reliable tsunami warning systems have been shown to provide a significant defence for this flooding hazard. However, the evolution of warning systems has been influenced by two processes: deadly tsunamis and available technology. In this paper, we explore the evolution of science and technology used in tsunami warning systems, the evolution of their products using warning technologies, and offer suggestions for a new generation of warning products, aimed at the flooding nature of the hazard, to reduce future tsunami impacts on society. We conclude that coastal communities would be well served by receiving three standardized, accurate, real-time tsunami warning products, namely (i) tsunami energy estimate, (ii) flooding maps and (iii) tsunami-induced harbour current maps to minimize the impact of tsunamis. Such information would arm communities with vital flooding guidance for evacuations and port operations. The advantage of global standardized flooding products delivered in a common format is efficiency and accuracy, which leads to effectiveness in promoting tsunami resilience at the community level. © 2015 The Authors.

  9. Evolution of tsunami warning systems and products

    PubMed Central

    Bernard, Eddie; Titov, Vasily

    2015-01-01

    Each year, about 60 000 people and $4 billion (US$) in assets are exposed to the global tsunami hazard. Accurate and reliable tsunami warning systems have been shown to provide a significant defence for this flooding hazard. However, the evolution of warning systems has been influenced by two processes: deadly tsunamis and available technology. In this paper, we explore the evolution of science and technology used in tsunami warning systems, the evolution of their products using warning technologies, and offer suggestions for a new generation of warning products, aimed at the flooding nature of the hazard, to reduce future tsunami impacts on society. We conclude that coastal communities would be well served by receiving three standardized, accurate, real-time tsunami warning products, namely (i) tsunami energy estimate, (ii) flooding maps and (iii) tsunami-induced harbour current maps to minimize the impact of tsunamis. Such information would arm communities with vital flooding guidance for evacuations and port operations. The advantage of global standardized flooding products delivered in a common format is efficiency and accuracy, which leads to effectiveness in promoting tsunami resilience at the community level. PMID:26392620

  10. The seismic project of the National Tsunami Hazard Mitigation Program

    USGS Publications Warehouse

    Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.

    2005-01-01

    In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic-range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network utilizing Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington, within 2 minutes, compared to an average response time of over 10 minutes for the previous 18 years. © Springer 2005.

  11. Coastal Amplification Laws for the French Tsunami Warning Center: Numerical Modeling and Fast Estimate of Tsunami Wave Heights Along the French Riviera

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Hébert, H.; Schindelé, F.; Reymond, D.

    2017-11-01

    Tsunami modeling tools in the French tsunami Warning Center operational context provide rapidly derived warning levels with a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test-site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep ocean tsunami amplitude and using a transfer function derived from the Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here defined regarding high resolution nested grids simulations. The preliminary results for the Nice test site on the basis of nine historical and synthetic sources show a good agreement with the time-consuming high resolution modeling: the linear approximation is obtained within 1 min in general and provides estimates within a factor of two in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude is something that cannot be really assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat forecast.
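    For illustration, here is a minimal sketch of the Green's-law transfer-function idea described above; the site amplification factor and the example numbers are placeholders, not the calibrated values for the Nice test site.

```python
# Minimal sketch of a Green's-law coastal amplification estimate, in the spirit
# of the transfer-function approach described above (coefficients are illustrative).
def greens_law_amplitude(offshore_amplitude_m, offshore_depth_m, coastal_depth_m, alpha=1.0):
    """
    Green's law: the amplitude of a shoaling wave scales as depth^(-1/4), so
    eta_coast ~ alpha * eta_offshore * (h_offshore / h_coast)**0.25,
    where alpha is a site-specific factor fitted to high-resolution simulations.
    """
    return alpha * offshore_amplitude_m * (offshore_depth_m / coastal_depth_m) ** 0.25

# Example: 5 cm simulated at 2500 m depth, extrapolated to a 5 m deep coastal point.
print(round(greens_law_amplitude(0.05, 2500.0, 5.0), 2), "m expected near shore")
```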

  12. Coastal amplification laws for the French tsunami Warning Center: numerical modeling and fast estimate of tsunami wave heights along the French Riviera

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Schindelé, F.; Hebert, H.; Reymond, D.

    2017-12-01

    Tsunami modeling tools in the French tsunami Warning Center operational context currently provide warning levels based on a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test-site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep ocean tsunami amplitude and using a transfer function derived from the Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here defined with respect to high resolution nested-grid simulations. The first encouraging results for the Nice test site, on the basis of 9 historical and synthetic sources, show a good agreement with the time-consuming high resolution modeling: the linear approximation is generally obtained within 1 minute and provides estimates within a factor of 2 in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude cannot be fully assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat forecast.

  13. Coastal Amplification Laws for the French Tsunami Warning Center: Numerical Modeling and Fast Estimate of Tsunami Wave Heights Along the French Riviera

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Hébert, H.; Schindelé, F.; Reymond, D.

    2018-04-01

    Tsunami modeling tools in the French tsunami Warning Center operational context provide rapidly derived warning levels with a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test-site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep ocean tsunami amplitude and using a transfer function derived from the Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here defined regarding high resolution nested grids simulations. The preliminary results for the Nice test site on the basis of nine historical and synthetic sources show a good agreement with the time-consuming high resolution modeling: the linear approximation is obtained within 1 min in general and provides estimates within a factor of two in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude is something that cannot be really assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat forecast.

  14. Application of a Tsunami Warning Message Metric to refine NOAA NWS Tsunami Warning Messages

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Johnston, D.; Sorensen, J.; Whitmore, P.

    2013-12-01

    In 2010, the U.S. National Weather Service (NWS) funded a three-year project to integrate social science into their Tsunami Program. One of three primary requirements of the grant was to make improvements to the tsunami warning messages of the NWS' two Tsunami Warning Centers - the West Coast/Alaska Tsunami Warning Center (WCATWC) in Palmer, Alaska and the Pacific Tsunami Warning Center (PTWC) in Ewa Beach, Hawaii. We conducted focus group meetings with a purposive sample of local, state and Federal stakeholders and emergency managers in six states (AK, WA, OR, CA, HI and NC) and two US Territories (US Virgin Islands and American Samoa) to qualitatively assess information needs in tsunami warning messages, using WCATWC tsunami messages for the March 2011 Tohoku earthquake and tsunami event. We also reviewed research literature on behavioral response to warnings to develop a tsunami warning message metric that could be used to guide revisions to tsunami warning messages of both warning centers. The message metric is divided into categories of Message Content, Style, Order and Formatting, and Receiver Characteristics. A message is evaluated by cross-referencing the message with the operational definitions of metric factors. Findings are then used to guide revisions of the message until the characteristics of each factor are met. Using findings from this project and findings from a parallel NWS Warning Tiger Team study led by T. Nicolini, the WCATWC implemented the first of two phases of revisions to their warning messages in November 2012. A second phase of additional changes, which will fully implement the redesign of messages based on the metric, is in progress. The resulting messages will reflect current state-of-the-art knowledge on warning message effectiveness. Here we present the message metric; the evidence-based rationale for message factors; and examples of previous, existing and proposed messages.

  15. NOAA/West coast and Alaska Tsunami warning center Atlantic Ocean response criteria

    USGS Publications Warehouse

    Whitmore, P.; Refidaff, C.; Caropolo, M.; Huerfano-Moreno, V.; Knight, W.; Sammler, W.; Sandrik, A.

    2009-01-01

    West Coast/Alaska Tsunami Warning Center (WCATWC) response criteria for earthquakes occurring in the Atlantic and Caribbean basins are presented. Initial warning center decisions are based on an earthquake's location, magnitude, depth, distance from coastal locations, and precomputed threat estimates based on tsunami models computed from similar events. The new criteria will help limit the geographical extent of warnings and advisories to threatened regions, and complement the new operational tsunami product suite. Criteria are set for tsunamis generated by earthquakes, which are by far the main cause of tsunami generation (either directly through sea floor displacement or indirectly by triggering of sub-sea landslides). The new criteria require development of a threat database which sets warning or advisory zones based on location, magnitude, and pre-computed tsunami models. The models determine coastal tsunami amplitudes based on likely tsunami source parameters for a given event. Based on the computed amplitude, warning and advisory zones are pre-set.
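    Schematically, such response criteria amount to a lookup keyed on magnitude, depth and distance to the coast. The thresholds in the sketch below are invented for illustration and are not the operational WCATWC values.

```python
# Schematic sketch of magnitude/depth/distance-based response criteria, in the
# spirit of the criteria described above. Thresholds are invented for illustration.
def response_product(magnitude, depth_km, nearest_coast_km):
    if depth_km > 100 or magnitude < 6.0:
        return "information statement"        # too deep or too small to be tsunamigenic
    if magnitude >= 7.5:
        return "warning" if nearest_coast_km <= 1000 else "watch"
    if magnitude >= 6.5:
        return "advisory" if nearest_coast_km <= 300 else "information statement"
    return "information statement"

print(response_product(magnitude=7.8, depth_km=25, nearest_coast_km=150))   # -> warning
```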

  16. Development and Application of a Message Metric for NOAA NWS Tsunami Warnings and Recommended Guidelines for the NWS TsunamiReady Program

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Johnston, D. M.; Ricthie, L.; Meinhold, S.; Johnson, V.; Scott, C.; Farnham, C.; Houghton, B. F.; Horan, J.; Gill, D.

    2012-12-01

    Improving the quality and effectiveness of tsunami warning messages and of the TsunamiReady community preparedness program of the US National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) Tsunami Program are two key objectives of a three-year project (Award NA10NWS4670015) to help integrate social science into the NWS' Tsunami Program and improve the preparedness of member states and territories of the National Tsunami Hazard Mitigation Program (NTHMP). Research was conducted in collaboration with state and local emergency managers. Based on findings from focus group meetings with a purposive sample of local, state and Federal stakeholders and emergency managers in six states (AK, WA, OR, CA, HI and NC) and two US Territories (US Virgin Islands and American Samoa), and upon review of research literature on behavioral response to warnings, we developed a warning message metric to help guide revisions to tsunami warning messages issued by the NWS' West Coast/Alaska Tsunami Warning Center, Alaska and Pacific Tsunami Warning Center, Hawaii. The metric incorporates factors that predict response to warning information, which are divided into categories of Message Content, Style, Order and Formatting, and Receiver Characteristics. A message is evaluated by cross-referencing the message with the meaning of metric factors and assigning a maximum score of one point per factor. Findings are then used to guide revisions of the message until the characteristics of each factor are met. From focus groups that gathered information on the usefulness and achievability of tsunami preparedness actions, we developed recommendations for revisions to the proposed draft guidelines of the TsunamiReady Improvement Program. Proposed key revisions include the incorporation of community vulnerability to distant (far-field) versus local (near-field) tsunamis, rather than community population, as a primary determinant of mandatory actions. Our team continues to work with NWS personnel, including an NWS Tsunami Warning Improvement Team, and the focus group participants to finalize and pilot test prototype warning products and the draft TsunamiReady guidelines.
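    The scoring idea, cross-referencing a draft message against metric factors and awarding at most one point per factor, can be sketched as follows; the factor names and checks are simplified placeholders rather than the project's operational definitions.

```python
# Toy scoring sketch of the warning message metric described above: a draft
# message earns at most one point per factor. Factors here are simplified placeholders.
import re

METRIC_FACTORS = {
    "hazard stated":        lambda msg: "tsunami" in msg.lower(),
    "location given":       lambda msg: bool(re.search(r"\b(coast|island|bay|county)\b", msg, re.I)),
    "time of impact given": lambda msg: bool(re.search(r"\b\d{1,2}:\d{2}\b", msg)),
    "protective action":    lambda msg: bool(re.search(r"\b(evacuate|move inland|higher ground)\b", msg, re.I)),
    "source identified":    lambda msg: "tsunami warning center" in msg.lower(),
}

def score_message(msg):
    hits = {name: check(msg) for name, check in METRIC_FACTORS.items()}
    return sum(hits.values()), hits

draft = ("TSUNAMI WARNING issued by the National Tsunami Warning Center. "
         "Coastal areas of Kodiak Island should move to higher ground. "
         "First wave expected at 14:35 local time.")
score, detail = score_message(draft)
print(f"{score}/{len(METRIC_FACTORS)} factors satisfied:", detail)
```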

  17. Sea Level Station Metadata for Tsunami Detection, Warning and Research

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Marra, J.; Kari, U. S.; Weinstein, S. A.; Kong, L.

    2007-12-01

    The devastating earthquake and tsunami of December 26, 2004 have greatly increased recognition of the need for water level data both from the coasts and from the deep ocean. In 2006, the National Oceanic and Atmospheric Administration (NOAA) completed a Tsunami Data Management Report describing the management of data required to minimize the impact of tsunamis in the United States. One of the major gaps identified in this report is access to global coastal water level data. NOAA's National Geophysical Data Center (NGDC) and National Climatic Data Center (NCDC) are working cooperatively to bridge this gap. NOAA relies on a network of global data, acquired and processed in real time to support tsunami detection and warning, as well as high-quality global databases of archived data to support research and advanced scientific modeling. In 2005, parties interested in enhancing the access and use of sea level station data united under the NOAA NCDC's Integrated Data and Environmental Applications (IDEA) Center's Pacific Region Integrated Data Enterprise (PRIDE) program to develop a distributed metadata system describing sea level stations (Kari et al., 2006; Marra et al., in press). This effort started with pilot activities in a regional framework and is targeted at tsunami detection and warning systems being developed by various agencies. It includes development of the components of a prototype sea level station metadata web service and an accompanying Google Earth-based client application, which use an XML-based schema to expose, at a minimum, the information in the NOAA National Weather Service (NWS) Pacific Tsunami Warning Center (PTWC) station database needed to use the PTWC's Tide Tool application. As identified in the Tsunami Data Management Report, the need also exists for long-term retention of the sea level station data. NOAA envisions that the retrospective water level data and metadata will also be available through web services, using an XML-based schema. Five high-priority metadata requirements identified at a water level workshop held at the XXIV IUGG Meeting in Perugia will be addressed: consistent, validated, and well defined numbers (e.g. amplitude); exact location of sea level stations; a complete record of sea level data stored in the archive; identification of high-priority sea level stations; and consistent definitions. NOAA's National Geophysical Data Center (NGDC) and the co-located World Data Center for Solid Earth Geophysics (including tsunamis) would hold the archive of the sea level station data and distribute the standard metadata. Currently, NGDC is also archiving and distributing the DART buoy deep-ocean water level data and metadata in standards-based formats. Kari, U. S., J. J. Marra, and S. A. Weinstein (2006). A Tsunami Focused Data Sharing Framework for Integration of Databases that Describe Water Level Station Specifications. AGU Fall Meeting 2006, San Francisco, California. Marra, J. J., U. S. Kari, and S. A. Weinstein (in press). A Tsunami Detection and Warning-focused Sea Level Station Metadata Web Service. IUGG XXIV, July 2-13, 2007, Perugia, Italy.
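    A hypothetical sketch of a station metadata record covering the five high-priority requirements listed above; the field names are invented and do not follow the PTWC/NGDC XML schema.

```python
# Hypothetical sea level station metadata record capturing the five high-priority
# requirements (units, exact location, record completeness, priority, consistent
# definitions). Field names are invented for illustration only.
from dataclasses import dataclass, asdict
import json

@dataclass
class SeaLevelStation:
    station_id: str
    name: str
    latitude_deg: float          # exact sensor location, WGS84
    longitude_deg: float
    amplitude_units: str         # validated, well defined measurement units
    record_start: str            # ISO dates bounding the archived record
    record_end: str
    high_priority: bool          # flagged for tsunami warning use
    datum: str                   # consistent vertical reference definition

station = SeaLevelStation(
    station_id="DEMO-001", name="Hypothetical Harbor Gauge",
    latitude_deg=21.3069, longitude_deg=-157.8583,
    amplitude_units="meters", record_start="1993-01-01", record_end="2007-06-30",
    high_priority=True, datum="station zero (local chart datum)",
)
print(json.dumps(asdict(station), indent=2))
```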

  18. Modeling of influence from remote tsunami at the coast of Sakhalin and Kuriles islands.

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Pelinovsky, Efim; Yalciner, Ahmet; Chernov, Anton; Kostenko, Irina

    2010-05-01

    The Far East coast of Russia (Kuril Islands, Sakhalin, Kamchatka) is an area exposed to the dangerous natural phenomenon of tsunamis. Considerable work has been carried out to reduce the impact of tsunamis: tsunami mapping and mitigation strategies have been developed for some regions, centers of the Tsunami Warning System have been opened, and a fairly large number of tsunami records have been collected. The properties of local tsunamis are well studied. At the same time, the catastrophic Indonesian tsunami of December 2004, whose waves reached the coasts of Africa and South America, showed that coasts far from an earthquake epicenter can also suffer catastrophic effects. Moreover, this is practically the unique case in which the use of a Tsunami Warning System can reduce the number of human victims to zero. The development of computer technologies and of numerical methods for solving systems of nonlinear differential equations has made computer modeling of real and hypothetical tsunamis the basic method for studying the propagation of waves in water areas and their influence at the coast. Numerical modeling of the propagation of historical tsunamis from seismic sources in the Pacific Ocean was carried out, considering events with epicenters remote from the Far East coast of Russia. The propagation of remote tsunami waves was estimated and the impact force of the tsunamis was assessed. The passage of tsunamis through the Kuril straits was examined, and a spectral analysis of records at settlements on Sakhalin and the Kuriles was performed. The NAMI-DANCE program was used for numerical modeling of tsunami propagation; it uses finite-element numerical schemes for the Shallow Water Equations and Nonlinear-Dispersive Equations, with nested grids.
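    For orientation, here is a minimal 1-D linear shallow-water propagation sketch on a staggered grid; it is not the NAMI-DANCE code and uses a flat, frictionless basin purely for demonstration.

```python
# Minimal 1-D linear shallow-water sketch (forward-backward staggered scheme)
# illustrating the kind of propagation modeling described above.
import numpy as np

g = 9.81
depth = 4000.0                 # uniform ocean depth (m)
dx = 5000.0                    # grid spacing (m)
nx = 400
c = np.sqrt(g * depth)         # long-wave speed
dt = 0.5 * dx / c              # satisfies the CFL condition

eta = np.exp(-((np.arange(nx) * dx - 500e3) ** 2) / (2 * (50e3) ** 2))  # initial hump (m)
u = np.zeros(nx + 1)           # velocities on cell faces (closed boundaries)

for _ in range(600):
    # momentum: du/dt = -g * d(eta)/dx  (interior faces only)
    u[1:-1] -= g * dt * (eta[1:] - eta[:-1]) / dx
    # continuity: d(eta)/dt = -depth * du/dx
    eta -= depth * dt * (u[1:] - u[:-1]) / dx

print("max amplitude in the last 200 km of the domain:", round(float(eta[-40:].max()), 3), "m")
```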

  19. Faster from the Depths to Decision: Collecting, Distributing, and Applying Data from NOAA's Deep-Sea Tsunameters

    NASA Astrophysics Data System (ADS)

    Bouchard, R. H.; Wang, D.; Branski, F.

    2008-05-01

    The National Oceanic and Atmospheric Administration (NOAA) operates two tsunami warning centers (TWCs): the West Coast/Alaska Tsunami Warning Center (ATWC) and the Pacific Tsunami Warning Center (PTWC). ATWC provides tsunami alerts to Canadian coastal regions, the Virgin Islands, Puerto Rico, and the coasts of the continental US and Alaska. PTWC provides local/regional tsunami alerts/advisories to the state of Hawaii. As an operational center of the Tsunami Warning System of the Pacific, it provides tsunami alerts to most countries of the Pacific Rim. PTWC also provides tsunami alerts for the Caribbean and Indian Ocean countries on an interim basis. The TWCs aim to issue first tsunami bulletins within 10-15 minutes of the earthquake for tele-tsunamis and within a few minutes for local tsunamis. The TWCs have a requirement for offshore tsunami detection in real time with a data latency of 1 minute or less. Offshore detection of tsunamis is the purpose of NOAA's recently completed 39-station array of deep-sea tsunameters. The tsunameters, employing the second-generation DART (Deep-ocean Assessment and Reporting of Tsunamis) technology, can speed tsunami detection information to the TWCs in less than 3 minutes from depths of 6000 meters in the Pacific and Western Atlantic oceans. The tsunameters consist of a Bottom Pressure Recorder (BPR) and a surface buoy. Communication from the BPR to the buoy is via underwater acoustic transmissions. Satellite communications carry the data from the buoy to NOAA's National Data Buoy Center (NDBC), which operates the tsunameters. The BPRs make pressure measurements, convert them to an equivalent water-column height, and pass them through a tsunami detection algorithm. If the algorithm detects a sufficient change in the height, the tsunameter goes into a rapid reporting mode or Event Mode. The acoustic modem-satellite telecommunications path takes approximately 50 seconds to reach the NDBC server. In a few seconds, NDBC reformats the data and pushes them as messages to the National Weather Service Telecommunications Gateway, also known as World Meteorological Organization (WMO) Regional Telecommunication Hub (RTH) Washington. RTH Washington can route more than 50 routine messages per second, with 99.9 percent reliability for dissemination to all of its users. It provides a latency for high-priority traffic of 10 seconds or less and routinely handles 1.2 TB of information per day. Its switching centers are on the Main Trunk Network of the WMO's Global Telecommunication System (GTS), which provides international distribution of the tsunameter data. The GTS is required to deliver tsunami data and warnings to any connected center anywhere in the world within two minutes. TWCs receive the tsunameter data from RTH Washington via GTS circuits, or download the data from servers at the RTH in the event the GTS circuits fail. TWCs display the data in real time in their operations. When a tsunameter goes into Event Mode, the TWCs receive alerts. After subtracting the tide, tsunameter signals can measure tsunamis as small as a few millimeters. The usefulness of the tsunameter data at TWCs was demonstrated in some of the recent events in the Pacific Ocean (Kuril tsunamis of November 2006 and January 2007, Peru tsunamis of August 2007 and September 2007) and the Indian Ocean (Southern Sumatra tsunami of September 2007).
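    A simplified sketch of the BPR processing chain described above: hydrostatic conversion of bottom pressure to an equivalent water-column height, followed by a threshold test against a background estimate. The 30 mm threshold and the background filter are illustrative, not NOAA/NDBC's operational detection algorithm.

```python
# Simplified tsunameter processing sketch: pressure -> water-column height -> detection.
RHO_SEAWATER = 1025.0   # kg/m^3
G = 9.81                # m/s^2

def pressure_to_height(pressure_pa):
    """Hydrostatic conversion: h = P / (rho * g)."""
    return pressure_pa / (RHO_SEAWATER * G)

def detect_event(heights_m, window=12, threshold_m=0.030):
    """Flag an event when the newest sample departs from the recent mean
    (a stand-in for the tide/background prediction) by more than the threshold."""
    if len(heights_m) <= window:
        return False
    background = sum(heights_m[-window - 1:-1]) / window
    return abs(heights_m[-1] - background) > threshold_m

# Example: ~6000 m of water plus a 5 cm tsunami signal on the last sample.
base = RHO_SEAWATER * G * 6000.0
series = [pressure_to_height(base)] * 20 + [pressure_to_height(base + RHO_SEAWATER * G * 0.05)]
print("event detected:", detect_event(series))
```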

  20. A Walk through TRIDEC's intermediate Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Reißland, S.; Lendholt, M.

    2012-04-01

    The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC is based on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS) providing a service platform for both sensor integration and warning dissemination. In TRIDEC new developments in Information and Communication Technology (ICT) are used to extend the existing platform realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. Successively, the demonstrators are addressing challenges, such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors, unconventional sensors and sensor networks also play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using the OGC Web Map Service (WMS) and Web Feature Service (WFS), spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and the potentials of the implemented approach are demonstrated covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. Developments of the system are based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been, and will continue to be, placed on leveraging open source technologies that support mature system architecture models wherever appropriate. 
All open source software produced is foreseen to be published on a publicly available software repository thus allowing others to reuse results achieved and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
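    As a concrete illustration of the dissemination step, here is a hedged sketch of composing a minimal OASIS CAP 1.2 alert of the kind such a warning chain might emit; the identifiers, values and area are invented, and real messages carry additional elements plus EDXL-DE addressing around them.

```python
# Hedged sketch of a minimal OASIS CAP 1.2 alert; values are invented for illustration.
import datetime
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_cap_alert(event_desc, area_desc):
    ET.register_namespace("", CAP_NS)
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    for tag, text in [
        ("identifier", "TRIDEC-DEMO-0001"),               # hypothetical id
        ("sender", "ntwc-demo@example.org"),              # hypothetical sender
        ("sent", datetime.datetime.now(datetime.timezone.utc).isoformat(timespec="seconds")),
        ("status", "Exercise"),
        ("msgType", "Alert"),
        ("scope", "Public"),
    ]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, text in [
        ("category", "Geo"), ("event", event_desc),
        ("urgency", "Immediate"), ("severity", "Extreme"), ("certainty", "Observed"),
        ("headline", f"Tsunami warning for {area_desc}"),
    ]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = text
    area = ET.SubElement(info, f"{{{CAP_NS}}}area")
    ET.SubElement(area, f"{{{CAP_NS}}}areaDesc").text = area_desc
    return ET.tostring(alert, encoding="unicode")

print(build_cap_alert("Tsunami", "Eastern Mediterranean coastal zone"))
```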

  1. Tsunami prevention and mitigation necessities and options derived from tsunami risk assessment in Indonesia

    NASA Astrophysics Data System (ADS)

    Post, J.; Zosseder, K.; Wegscheider, S.; Steinmetz, T.; Mück, M.; Strunz, G.; Riedlinger, T.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Risk and vulnerability assessment is an important component of an effective End-to-End Tsunami Early Warning System and therefore contributes significantly to disaster risk reduction. Risk assessment is a key strategy to implement and design adequate disaster prevention and mitigation measures. Knowledge about expected tsunami hazard impacts, exposed elements, their susceptibility, and coping and adaptation mechanisms is a precondition for the development of people-centred warning structures and locally specific response and recovery policy planning. The developed risk assessment and its components reflect the disaster management cycle (disaster time line) and cover the early warning as well as the emergency response phase. Consequently the components hazard assessment, exposure (e.g. how many people/critical facilities are affected?), susceptibility (e.g. are the people able to receive a tsunami warning?), coping capacity (are the people able to evacuate in time?) and recovery (are the people able to restore their livelihoods?) are addressed and quantified. Thereby the risk assessment encompasses three steps: (i) identifying the nature, location, intensity and probability of potential tsunami threats (hazard assessment); (ii) determining the existence and degree of exposure and susceptibility to those threats; and (iii) identifying the coping capacities and resources available to address or manage these threats. The paper presents results of the research work conducted in the framework of the GITEWS project and the Joint Indonesian-German Working Group on Risk Modelling and Vulnerability Assessment. The assessment methodology applied follows a people-centred approach to deliver relevant risk and vulnerability information for the purposes of early warning and disaster management. The analyses consider the entire coastal areas of Sumatra, Java and Bali facing the Sunda trench. Selected results and products like risk maps, guidelines, decision support information and other GIS products will be presented. The focus of the products is, on the one hand, to provide relevant risk assessment products as decision support for issuing a tsunami warning within the early warning stage. On the other hand, the maps and GIS products shall provide relevant information to enable local decision makers to act adequately concerning their local risks. It is shown that effective prevention and mitigation measures can be designed based on risk assessment results and information, especially when used proactively and before a disaster strikes. The conducted hazard assessment provides the probability of an area being affected by a tsunami threat, divided into two ranked impact zones. The two impact zones directly relate to tsunami warning levels issued by the Early Warning Center and consequently enable local decision makers to base their planning (e.g. evacuation) accordingly. Within the tsunami hazard assessment several hundred pre-computed tsunami scenarios are analysed. This is combined with statistical analysis of historical event data. Probabilities of tsunami occurrence, considering probabilities of different earthquake magnitudes, occurrences of specific wave heights at the coast and spatial inundation probability, are computed. Hazard assessment is then combined with a comprehensive vulnerability assessment. Here, deficits in, e.g., people's ability to receive and understand a tsunami warning and deficits in their ability to respond adequately (evacuate on time) are quantified and visualized for the respective coastal areas. Socio-economic properties (determining people's ability to understand a warning and to react) are combined with environmental conditions (land cover, slope, population density) to calculate the time needed to evacuate (reach a tsunami-safe area derived through the hazard assessment). This is implemented using a newly developed GIS cost-distance weighting approach. For example, the number of people affected in a certain area depends on expected tsunami intensity, inundated area, estimated tsunami arrival time and available time for evacuation. Referring to the Aceh 2004 tsunami, an estimated number of people affected (dead/injured) of 21000 for Kabupaten Aceh Jaya and 85000 for Kab. Banda Aceh is in a comparable range with reported values of 19661 and 78417 (JICA 2005), respectively. Hence the established methodology provides reliable estimates of people affected and of people's ability to reach a safe area. Based on the spatially explicit detection of, e.g., high tsunami risk areas (and their assessed root causes), specific disaster risk reduction and early warning strategies can be designed. For example, additional installation of technical warning dissemination devices, community-based preparedness and awareness programmes (education), structural and non-structural measures (e.g. land use conversion, coastal engineering), effective evacuation, contingency and household recovery aid planning can be employed and/or optimized within high tsunami risk areas as a first priority. In the context of early warning, spatially distributed information like degree of expected hazard impact, exposure of critical facilities (e.g. hospitals, schools), potential people dead/injured depending on available response times, and location of safe and shelter areas can be disseminated and used for decision making. Keywords: Tsunami risk, hazard and vulnerability assessment, early warning, tsunami mitigation and prevention, Indonesia
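    The cost-distance weighting idea can be sketched as a shortest-time search over a raster whose per-cell walking speed reflects land cover; the grid, speeds and safe cells below are invented for illustration, not the GITEWS parameterization.

```python
# Simplified cost-distance sketch: evacuation time to the nearest safe cell on a
# raster with land-cover-dependent walking speeds. All values are illustrative.
import heapq

CELL_SIZE_M = 50.0
SPEED_M_S = {"road": 1.6, "open": 1.2, "forest": 0.8, "swamp": 0.3}

def evacuation_times(landcover, safe_cells):
    """Dijkstra over the grid; edge cost = time to cross into the neighboring cell."""
    rows, cols = len(landcover), len(landcover[0])
    times = [[float("inf")] * cols for _ in range(rows)]
    heap = [(0.0, r, c) for r, c in safe_cells]
    for _, r, c in heap:
        times[r][c] = 0.0
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > times[r][c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = CELL_SIZE_M / SPEED_M_S[landcover[nr][nc]]
                if t + step < times[nr][nc]:
                    times[nr][nc] = t + step
                    heapq.heappush(heap, (t + step, nr, nc))
    return times

grid = [["open", "open", "forest"],
        ["road", "swamp", "forest"],
        ["road", "open", "open"]]
t = evacuation_times(grid, safe_cells=[(0, 2)])   # safe high ground in the corner
print("minutes to safety from (2,0):", round(t[2][0] / 60.0, 1))
```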

  2. GPS-TEC of the Ionospheric Disturbances as a Tool for Early Tsunami Warning

    NASA Astrophysics Data System (ADS)

    Kunitsyn, Viacheslav E.; Nesterov, Ivan A.; Shalimov, Sergey L.; Krysanov, Boris Yu.; Padokhin, Artem M.; Rekenthaler, Douglas

    2013-04-01

    Recently, GPS measurements have been used to retrieve information on the various types of ionospheric responses to seismic events (earthquakes, seismic Rayleigh waves, and tsunamis), which generate atmospheric waves propagating up to ionospheric altitudes, where collisions between the neutrals and charged particles give rise to motion of the ionospheric plasma. These experimental results can well be used in the architecture of future tsunami warning systems. The point is an earlier detection (in comparison with seismological methods) of the ionospheric signal that can indicate the moment of tsunami generation. As an example, we consider the two-dimensional distributions of the vertical total electron content (TEC) variations in the ionosphere both close to and far from the epicenter of the Japan undersea earthquake of March 11, 2011, using radio tomographic (RT) reconstruction of high-temporal-resolution (2-minute) data from the Japanese and US GPS networks. Near-zone TEC variations show a diverging ionospheric perturbation with multi-component spectral composition emerging after the main shock. The initial phase of the disturbance can be used as an indicator of tsunami generation and subsequently for early tsunami warning. Far-zone TEC variations reveal a distinct wave train associated with gravity waves generated by the tsunami. According to observations, the tsunami arrives at Hawaii, and later at the coast of Southern California, with a delay relative to the gravity waves. Therefore the gravity wave pattern can be used in early tsunami warning. We support this scenario with the results of modeling using parameters of the ocean surface perturbation corresponding to the considered earthquake. In addition, the modeling showed that at long distances from the source the gravity wave can pass ahead of the tsunami. The work was supported by the Russian Foundation for Basic Research (grants 11-05-01157 and 12-05-33065).
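    A minimal sketch of the basic signal-processing step behind such TEC analyses: removing a slowly varying background from a TEC time series to expose wave-like perturbations. The synthetic series and the 10-minute window are illustrative only.

```python
# Minimal dTEC sketch: subtract a running-mean background from a TEC series to
# expose wave-like perturbations (synthetic data, illustrative window length).
import numpy as np

def detrend_tec(tec, sample_s=30.0, window_min=10.0):
    """Return TEC minus a centered running mean (a crude band-pass for wave-like dTEC)."""
    n = max(3, int(window_min * 60.0 / sample_s) | 1)        # odd window length
    kernel = np.ones(n) / n
    background = np.convolve(tec, kernel, mode="same")
    return tec - background

t = np.arange(0, 3600, 30.0)                                  # one hour, 30 s samples
tec = 25.0 + 0.002 * t                                        # slowly varying background (TECU)
tec += 0.3 * np.sin(2 * np.pi * t / 900.0) * (t > 1800)       # 15-min wave after "onset"
dtec = detrend_tec(tec)
print("max |dTEC| before onset:", round(float(np.abs(dtec[15:55]).max()), 3), "TECU")
print("max |dTEC| after onset :", round(float(np.abs(dtec[65:105]).max()), 3), "TECU")
```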

  3. Design and Implementation of a C++ Software Package to scan for and parse Tsunami Messages issued by the Tsunami Warning Centers for Operational use at the Pacific Tsunami Warning Center

    NASA Astrophysics Data System (ADS)

    Sardina, V.

    2012-12-01

    The US Tsunami Warning Centers (TWCs) have traditionally generated their tsunami message products primarily as blocks of text that are then tagged with headers identifying them on each particular communications (comms) circuit. Each warning center has a primary area of responsibility (AOR) within which it has an authoritative role regarding parameters such as earthquake location and magnitude. This means that when a major tsunamigenic event occurs, the other warning centers need to quickly access the earthquake parameters issued by the authoritative warning center before issuing their message products intended for customers in their own AOR. Thus, within the operational context of the TWCs, the scientists on duty have an operational need to access the information contained in the message products issued by other warning centers as quickly as possible. As a solution to this operational problem we designed and implemented a C++ software package that allows scanning for and parsing the entire suite of tsunami message products issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center (WCATWC), and the Japan Meteorological Agency (JMA). The scanning and parsing classes composing the resulting C++ software package allow parsing both non-official message products (observatory messages) routinely issued by the TWCs, and all official tsunami message products such as tsunami advisories, watches, and warnings. This software package currently allows scientists on duty at the PTWC to automatically retrieve the parameters contained in tsunami messages issued by WCATWC, JMA, or PTWC itself. Extension of the capabilities of the classes composing the software package would make it possible to generate XML and CAP compliant versions of the TWCs' message products until new messaging software natively adds these capabilities. Customers who receive the TWCs' tsunami message products could also use the package to automatically retrieve information from messages sent via any text-based communications circuit currently used by the TWCs to disseminate their tsunami message products.
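    An illustrative sketch, in Python rather than C++, of the scan-and-parse idea; the bulletin layout and field patterns are simplified placeholders, not the official TWC message formats.

```python
# Illustrative scan-and-parse sketch for a simplified, hypothetical text bulletin.
import re

SAMPLE_BULLETIN = """\
TSUNAMI MESSAGE NUMBER 1
PRELIMINARY EARTHQUAKE PARAMETERS
  MAGNITUDE  8.1
  ORIGIN TIME  0712Z 15 NOV 2006
  COORDINATES  46.6 NORTH 153.3 EAST
  DEPTH  10 KM
"""

PATTERNS = {
    "magnitude": re.compile(r"MAGNITUDE\s+([\d.]+)"),
    "latitude":  re.compile(r"COORDINATES\s+([\d.]+)\s+(NORTH|SOUTH)"),
    "longitude": re.compile(r"(NORTH|SOUTH)\s+([\d.]+)\s+(EAST|WEST)"),
    "depth_km":  re.compile(r"DEPTH\s+([\d.]+)\s*KM"),
}

def parse_bulletin(text):
    params = {}
    if m := PATTERNS["magnitude"].search(text):
        params["magnitude"] = float(m.group(1))
    if m := PATTERNS["latitude"].search(text):
        params["latitude"] = float(m.group(1)) * (1 if m.group(2) == "NORTH" else -1)
    if m := PATTERNS["longitude"].search(text):
        params["longitude"] = float(m.group(2)) * (1 if m.group(3) == "EAST" else -1)
    if m := PATTERNS["depth_km"].search(text):
        params["depth_km"] = float(m.group(1))
    return params

print(parse_bulletin(SAMPLE_BULLETIN))
```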

  4. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

    Greece and adjacent coastal areas are characterized by a high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis. We performed probabilistic tsunami hazard assessment for selected locations along the Greek coastline, which are the forecasting points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog spanning 10,000 years for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, with the real events included in this catalog. For each event in the synthetic catalog, a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence for each event was determined from the Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. Hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In this form our results can be easily compared to those obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no. 603839, 2013-10-30.
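    Schematically, the probabilistic workflow can be sketched as drawing a long synthetic catalog from a truncated Gutenberg-Richter law, attaching a coastal amplitude to each event, and counting exceedances; the b-value, rate and amplitude scaling below are placeholders, not the study's Boussinesq-based values.

```python
# Schematic probabilistic tsunami hazard sketch with placeholder parameters.
import numpy as np

rng = np.random.default_rng(42)
B_VALUE, M_MIN, M_MAX = 1.0, 6.0, 8.5
EVENTS_PER_YEAR, YEARS = 0.5, 10_000          # hypothetical rate of M>=6 tsunamigenic events

def sample_magnitudes(n):
    """Inverse-transform sampling of a doubly truncated Gutenberg-Richter law."""
    u = rng.random(n)
    beta = B_VALUE * np.log(10.0)
    c = 1.0 - np.exp(-beta * (M_MAX - M_MIN))
    return M_MIN - np.log(1.0 - u * c) / beta

n_events = rng.poisson(EVENTS_PER_YEAR * YEARS)
mags = sample_magnitudes(n_events)
# Placeholder source-to-coast scaling: amplitude grows roughly exponentially with Mw.
amplitudes = 0.05 * 10 ** (0.6 * (mags - 6.0)) * rng.lognormal(sigma=0.4, size=n_events)

for level in (0.5, 1.0, 2.0):                 # peak coastal amplitude thresholds (m)
    annual_rate = np.sum(amplitudes >= level) / YEARS
    print(f"P(exceed {level:.1f} m in a year) ~ {1 - np.exp(-annual_rate):.4f}")
```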

  5. A Walk through TRIDEC's intermediate Tsunami Early Warning System for the Turkish and Portuguese NEAMWave12 exercise tsunami scenarios

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Reißland, Sven; Schulz, Jana

    2013-04-01

    On November 27-28, 2012, the Kandilli Observatory and Earthquake Research Institute (KOERI) and the Portuguese Institute for the Sea and Atmosphere (IPMA) joined other countries in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region as participants in an international tsunami response exercise. The exercise, titled NEAMWave12, simulated widespread Tsunami Watch situations throughout the NEAM region. It was the first international exercise of its kind in the region, in which the UNESCO-IOC ICG/NEAMTWS tsunami warning chain was tested at full scale with different systems. One of these systems was developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) and was validated in this exercise by, among others, KOERI and IPMA. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform into a component-based technology framework for building distributed tsunami warning systems for deployment in, e.g., the NEAM region. The TRIDEC system will be implemented in three phases, each with a demonstrator, and the demonstrators successively address related challenges. The first- and second-phase system demonstrators, deployed in KOERI's crisis management room and at IPMA, were designed and implemented, first, to support plausible scenarios for the Turkish and Portuguese NTWCs and to demonstrate the treatment of simulated tsunami threats with an essential subset of NTWC functionality. Second, the feasibility and potential of the implemented approach are demonstrated, covering ICG/NEAMTWS standard operations as well as tsunami detection and alerting functions beyond ICG/NEAMTWS requirements. The demonstrator presented addresses information management and decision-support processes for hypothetical tsunami-related crisis situations in the context of the ICG/NEAMTWS NEAMWave12 exercise for the Turkish and Portuguese tsunami exercise scenarios. Impressions gained with the standards-compliant TRIDEC system during the exercise will be reported. The system version presented is based on event-driven architecture (EDA) and service-oriented architecture (SOA) concepts and makes use of relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Spatial data are accessed through the OGC Web Map Service (WMS) and Web Feature Service (WFS) to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE).
This demonstration is linked with the talk 'Experiences with TRIDEC's Crisis Management Demonstrator in the Turkish NEAMWave12 exercise tsunami scenario' (EGU2013-2833) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.6).
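
    Since the warning messages in this system are compiled in the OASIS CAP standard, the sketch below shows how a minimal CAP 1.2 alert could be assembled in Python; it is a generic illustration with placeholder identifiers and content, not the TRIDEC implementation.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_cap_alert(event_id: str, headline: str, area_desc: str) -> bytes:
    """Assemble a minimal CAP 1.2 alert document (illustrative content only)."""
    ET.register_namespace("", CAP_NS)
    alert = ET.Element(f"{{{CAP_NS}}}alert")

    def add(parent, tag, text):
        el = ET.SubElement(parent, f"{{{CAP_NS}}}{tag}")
        el.text = text
        return el

    add(alert, "identifier", event_id)
    add(alert, "sender", "example-tsunami-warning-centre")  # placeholder sender
    add(alert, "sent", datetime.now(timezone.utc).isoformat(timespec="seconds"))
    add(alert, "status", "Exercise")  # exercise-style message, as in NEAMWave12
    add(alert, "msgType", "Alert")
    add(alert, "scope", "Public")

    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    add(info, "category", "Geo")
    add(info, "event", "Tsunami")
    add(info, "urgency", "Immediate")
    add(info, "severity", "Extreme")
    add(info, "certainty", "Observed")
    add(info, "headline", headline)
    area = ET.SubElement(info, f"{{{CAP_NS}}}area")
    add(area, "areaDesc", area_desc)

    return ET.tostring(alert, encoding="utf-8", xml_declaration=True)

print(build_cap_alert("EX-2012-001", "Tsunami watch (exercise)", "Aegean coast").decode())
```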

  6. Tsunami Warning Protocol for Eruptions of Augustine Volcano, Cook Inlet, Alaska

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Neal, C.; Nyland, D.; Murray, T.; Power, J.

    2006-12-01

    Augustine is an island volcano in Cook Inlet that has generated at least one tsunami. During its January 2006 eruption, coastal residents of lower Cook Inlet became concerned about tsunami potential. To address this concern, NOAA's West Coast/Alaska Tsunami Warning Center (WC/ATWC) and the Alaska Volcano Observatory (AVO) jointly developed a tsunami warning protocol for the most likely tsunami-generation scenario at Augustine: a debris avalanche into Cook Inlet. Tsunami modeling indicates that a wave generated at Augustine Volcano could reach coastal communities in approximately 55 minutes. If a shallow seismic event with magnitude greater than 4.5 occurred near Augustine and the AVO had set the level-of-concern color code to orange or red, the WC/ATWC would immediately issue a warning for lower Cook Inlet. Given the short tsunami travel times involved, potentially affected communities would be given as much lead time as possible. Large debris avalanches that could trigger a tsunami in lower Cook Inlet are expected to be accompanied by a strong seismic signal, and the seismograms produced by such debris avalanches have distinctive spectral characteristics. After issuing a warning, the WC/ATWC would compare the observed waveform with those of known debris avalanches and would consult with AVO, using AVO's on-island networks (web cameras, seismic network, etc.) to further evaluate the event and refine or cancel the warning. After the 2006 eruptive phase ended, the WC/ATWC, with support from AVO and the University of Alaska Tsunami Warning and Environmental Observatory for Alaska (TWEAK) program, developed and installed "splash gauges" that will provide confirmation of tsunami generation.
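
    The joint protocol reduces to a simple conjunctive rule; the sketch below encodes that logic, with the depth cutoff for "shallow" chosen arbitrarily because the protocol summary above does not quantify it.

```python
def augustine_warning_needed(magnitude: float,
                             depth_km: float,
                             avo_color: str,
                             shallow_cutoff_km: float = 20.0) -> bool:
    """Return True if the protocol calls for an immediate lower Cook Inlet warning.

    Criteria from the protocol: a shallow seismic event near Augustine with
    magnitude greater than 4.5 while the AVO level-of-concern color code is
    orange or red. The shallow_cutoff_km default is an assumed value used
    only for illustration.
    """
    return (magnitude > 4.5
            and depth_km <= shallow_cutoff_km
            and avo_color.lower() in ("orange", "red"))

print(augustine_warning_needed(4.8, 5.0, "red"))     # True -> issue warning
print(augustine_warning_needed(4.8, 5.0, "yellow"))  # False
```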

  7. A Pilot Tsunami Inundation Forecast System for Australia

    NASA Astrophysics Data System (ADS)

    Allen, Stewart C. R.; Greenslade, Diana J. M.

    2016-12-01

    The Joint Australian Tsunami Warning Centre (JATWC) provides a tsunami warning service for Australia. Warnings are currently issued according to a technique that does not include explicit modelling at the coastline, including any potential coastal inundation. This paper investigates the feasibility of developing and implementing tsunami inundation modelling as part of the JATWC warning system. An inundation model was developed for a site in Southeast Australia, on the basis of the availability of bathymetric and topographic data and observations of past tsunamis. The model was forced using data from T2, the operational deep-water tsunami scenario database currently used for generating warnings. The model was evaluated not only for its accuracy but also for its computational speed, particularly with respect to operational applications. Limitations of the proposed forecast processes in the Australian context and areas requiring future improvement are discussed.

  8. NOAA Operational Tsunameter Support for Research

    NASA Astrophysics Data System (ADS)

    Bouchard, R.; Stroker, K.

    2008-12-01

    In March 2008, the National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) completed deployment of the last station in its 39-station network of deep-sea tsunameters. As part of NOAA's effort to strengthen tsunami warning capabilities, NDBC expanded the network from 6 to 39 stations and upgraded all stations to the second-generation Deep-ocean Assessment and Reporting of Tsunamis technology (DART II). Consisting of a bottom pressure recorder (BPR) and a surface buoy, the tsunameters deliver water-column heights, estimated from pressure measurements at the sea floor, to the Tsunami Warning Centers in less than 3 minutes. This network provides coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico with faster and more accurate tsunami warnings. In addition, both the coarse-resolution real-time data and the high-resolution (15-second) recorded data provide invaluable contributions to research, such as the detection of the 2004 Sumatran tsunami in the Northeast Pacific (Gower and González, 2006) and the experimental tsunami forecast system (Bernard et al., 2007). NDBC normally recovers the BPRs every 24 months and sends the recovered high-resolution data to NOAA's National Geophysical Data Center (NGDC) for archive and distribution. NGDC edits and processes these raw binary data to obtain research-quality data and provides access to retrospective BPR data from 1986 to the present. The DART database includes pressure and temperature data from the ocean floor, stored in a relational database, enabling data integration with the global tsunami and significant earthquake databases. All data are accessible to researchers around the world via the Web as tables, reports, interactive maps, OGC Web Map Services (WMS), and Web Feature Services (WFS). References: Gower, J. and F. González, 2006. U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10). Bernard, E. N., C. Meinig, and A. Hilton, 2007. Deep Ocean Tsunami Detection: Third Generation DART, Eos Trans. AGU, 88(52), Fall Meet. Suppl., Abstract S51C-03.
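
    The water-column heights reported by the tsunameters are derived from bottom-pressure measurements; a simplified version of that conversion, ignoring the atmospheric, density and tide corrections applied in practice, is sketched below.

```python
RHO_SEAWATER = 1025.0  # kg/m^3, nominal seawater density
G = 9.81               # m/s^2

def water_column_height(bottom_pressure_pa: float,
                        atmospheric_pressure_pa: float = 101325.0) -> float:
    """Hydrostatic estimate of water-column height above a bottom pressure recorder.

    h = (p_bottom - p_atm) / (rho * g); real DART processing also corrects for
    temperature, density stratification, and tides.
    """
    return (bottom_pressure_pa - atmospheric_pressure_pa) / (RHO_SEAWATER * G)

# ~5000 m deep site: a 1 cm tsunami changes bottom pressure by roughly 100 Pa.
p0 = 101325.0 + RHO_SEAWATER * G * 5000.0
print(water_column_height(p0))                                    # ~5000.0 m
print(water_column_height(p0 + 100.0) - water_column_height(p0))  # ~0.01 m
```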

  9. Quantifying human response capabilities towards tsunami threats at community level

    NASA Astrophysics Data System (ADS)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of the potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. A major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capability depends on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, the time people need to reach a safe area) and the actually available response time (RsT = ETA - ToNW - RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt to do so is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult to quantify, as intrinsic human factors such as educational level, beliefs, tsunami knowledge and experience, among others, play a role; an attempt to quantify this variable under high uncertainty is also presented. Quantifying ET is based on GIS modelling using a Cost Weighted Distance approach. The basic principle is to define the best evacuation path from a given point to the next safe area (shelter location), i.e. the fastest path from that point to the shelter location. The impacts of land cover, slope, population density, and population age and gender distribution are taken into account, as literature studies show these factors to be highly important. Knowing the fastest path and the distance to the next safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to reach a safe area. By comparing this with the obtained RsT value, the coverage area of an evacuation target point (safe area) can be assigned, and by incorporating knowledge of the capacity of an evacuation target point the respective coverage area is refined. Hence areas with weak, moderate and good human response capabilities can be identified. This allows calculation of the potential number of people affected (dead or injured) and the number of people displaced. First results for Kuta (Bali) for a worst-case tsunami event indicate approximately 25,000 people affected when RT = 0 minutes (immediate evacuation upon receiving a tsunami warning) and up to 120,000 when RT > ETA (no evacuation action before the tsunami reaches land). Additionally, the fastest evacuation routes to the evacuation target points can be assigned.
Areas with weak response capabilities can be designated as priority areas in which to install, e.g., additional evacuation target points or to increase tsunami knowledge and awareness so as to promote a faster reaction time. In particular, analyzing the underlying socio-economic properties that cause deficiencies in responding to a tsunami threat can yield valuable information and directly inform the planning of adaptation measures. Keywords: Community level, Risk and vulnerability assessment, Early warning, Disaster management, Tsunami, Indonesia
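
    The time budget described above reduces to a single comparison; the sketch below encodes RsT = ETA - ToNW - RT and the RsT versus ET test, with all inputs in minutes and the example values purely illustrative.

```python
def response_budget(eta_min: float, tonw_min: float, rt_min: float,
                    et_min: float) -> dict:
    """Evaluate whether evacuation is feasible for one location.

    RsT = ETA - ToNW - RT (available response time); evacuation succeeds
    if RsT is larger than the evacuation time ET.
    """
    rst = eta_min - tonw_min - rt_min
    return {"RsT_min": rst, "ET_min": et_min, "evacuation_feasible": rst > et_min}

# Illustrative values: tsunami arrives 35 min after the earthquake, warning
# issued within 5 min (presidential decree), people react within 10 min,
# and the modelled evacuation time to the nearest shelter is 15 min.
print(response_budget(eta_min=35, tonw_min=5, rt_min=10, et_min=15))
# {'RsT_min': 20, 'ET_min': 15, 'evacuation_feasible': True}
```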

  10. Real time assessment of the 15 July 2009 New Zealand tsunami

    NASA Astrophysics Data System (ADS)

    Uslu, Burak; Power, William; Greenslade, Diana; Titov, Vasily

    2010-05-01

    On 15 July 2009 a Mw 7.6 earthquake occurred off the coast of Fiordland in the South Island of New Zealand, about 1200 km from Auckland, New Zealand, 1500 km from Hobart, Tasmania, and 1800 km from Sydney, Australia. A tsunami was generated and an initial warning was issued by the PTWC. The Centre for Australian Weather and Climate Research issued its first tsunami warning for coastal regions of eastern Australia and New Zealand 24 minutes after the earthquake. By serendipitous coincidence, the earthquake struck while the International Tsunami Symposium was in session in Novosibirsk, Russia. This provided the opportunity to test, in real time, several tsunami warning systems in front of the attending scientists (Schiermeier, 2009). Scientists from the NOAA Center for Tsunami Research, the Pacific Tsunami Warning Center, GNS Science, and the Centre for Australian Weather and Climate Research were present at the symposium and worked together. Vasily Titov demonstrated NOAA's methodology (Bernard et al., 2006) live to assess the tsunami potential and, in consultation with colleagues, provided warning guidance; the warning was eventually canceled. We discuss how the forecast was done and how accurate the initial determination was. References: Bernard E.N. et al., 2006, Tsunami: scientific frontiers, mitigation, forecasting and policy implications, Phil. Trans. R. Soc. A, 364:1989-2007; doi:10.1098/rsta.2006.1809. Schiermeier, Q., 2009, Tsunami forecast in real time, published online 16 July 2009, Nature, doi:10.1038/news.2009.702.

  11. Suitability of Open-Ocean Instrumentation for Use in Near-Field Tsunami Early Warning Along Seismically Active Subduction Zones

    NASA Astrophysics Data System (ADS)

    Williamson, Amy L.; Newman, Andrew V.

    2018-05-01

    Over the past decade, the number of open-ocean gauges capable of capturing information about a passing tsunami has steadily increased, particularly through national cabled networks and international buoy efforts such as the Deep-ocean Assessment and Reporting of Tsunamis (DART) program. This information is analyzed to disseminate tsunami warnings to affected regions. However, most current warnings that incorporate tsunami observations are directed at mid- and far-field localities. In this study, we analyze the regions surrounding four seismically active subduction zones, Cascadia, Japan, Chile, and Java, for their potential to facilitate local tsunami early warning using such systems. We assess which regions currently have instrumentation positioned appropriately for direct tsunami observation with enough time to provide useful warning to the nearest affected coastline, and which are poorly suited for such systems. Our primary findings are that while some regions, such as the coastlines of Chile, are ill-suited for this type of early warning, other localities, like Java, Indonesia, could incorporate direct tsunami observations into their hazard forecasts with enough lead time to be effective for coastal community emergency response. We take into account the effect of tsunami propagation over shallow bathymetry on the fore-arc as well as the effect of earthquake source placement. While it is impossible to account for every type of offshore tsunamigenic event in these locales, this study characterizes a typical large tsunamigenic event occurring in the shallow part of the megathrust as a guide to what is feasible for early tsunami warning.
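
    One way to express the feasibility test underlying such an assessment is to compare the tsunami's travel time to the nearest offshore gauge, plus detection and processing latency, against its travel time to the coast; the sketch below does exactly that, with illustrative times rather than modelled values from the study.

```python
def warning_lead_time(t_gauge_min: float,
                      t_coast_min: float,
                      detection_latency_min: float = 5.0) -> float:
    """Lead time available to a coastal community once an open-ocean gauge
    has observed the tsunami and the observation has been processed.

    Returns a negative number when the wave reaches the coast before a
    gauge-based warning could be issued (an ill-suited configuration).
    """
    return t_coast_min - (t_gauge_min + detection_latency_min)

# Illustrative comparison of two hypothetical gauge/coast configurations.
print(warning_lead_time(t_gauge_min=10, t_coast_min=40))  # 25 min of lead time
print(warning_lead_time(t_gauge_min=18, t_coast_min=20))  # -3 min: not useful
```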

  12. NOAA/West Coast and Alaska Tsunami Warning Center Pacific Ocean response criteria

    USGS Publications Warehouse

    Whitmore, P.; Benz, H.; Bolton, M.; Crawford, G.; Dengler, L.; Fryer, G.; Goltz, J.; Hansen, R.; Kryzanowski, K.; Malone, S.; Oppenheimer, D.; Petty, E.; Rogers, G.; Wilson, Jim

    2008-01-01

    New West Coast/Alaska Tsunami Warning Center (WCATWC) response criteria for earthquakes occurring in the Pacific basin are presented. Initial warning decisions are based on earthquake location, magnitude, depth, and - dependent on magnitude - either distance from source or precomputed threat estimates generated from tsunami models. The new criteria will help limit the geographical extent of warnings and advisories to threatened regions, and complement the new operational tsunami product suite. Changes to the previous criteria include: adding hypocentral depth dependence, reducing geographical warning extent for the lower magnitude ranges, setting special criteria for areas not well-connected to the open ocean, basing warning extent on pre-computed threat levels versus tsunami travel time for very large events, including the new advisory product, using the advisory product for far-offshore events in the lower magnitude ranges, and specifying distances from the coast for on-shore events which may be tsunamigenic. This report sets a baseline for response criteria used by the WCATWC considering its processing and observational data capabilities as well as its organizational requirements. Criteria are set for tsunamis generated by earthquakes, which are by far the main cause of tsunami generation (either directly through sea floor displacement or indirectly by triggering of slumps). As further research and development provides better tsunami source definition, observational data streams, and improved analysis tools, the criteria will continue to adjust. Future lines of research and development capable of providing operational tsunami warning centers with better tools are discussed.
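
    A criteria set of this kind can be expressed compactly as a decision function; the sketch below mirrors the structure of the criteria (magnitude, depth and offshore-distance dependence) but uses invented thresholds, not the actual WCATWC values.

```python
def initial_response(magnitude: float, depth_km: float,
                     far_offshore: bool) -> str:
    """Map basic hypocentral parameters to an initial product type.

    The structure follows the criteria described above; the numeric
    cutoffs are placeholders chosen for illustration only.
    """
    if depth_km > 100.0:  # deep events are unlikely to be tsunamigenic
        return "information statement"
    if magnitude >= 7.5:
        return "warning (extent from precomputed threat estimates)"
    if magnitude >= 7.0:
        return "advisory" if far_offshore else "warning (limited extent)"
    if magnitude >= 6.5:
        return "advisory"
    return "information statement"

for m, z, off in [(6.6, 10, False), (7.2, 30, True), (8.1, 25, False), (7.0, 150, False)]:
    print(m, z, off, "->", initial_response(m, z, off))
```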

  13. Source modeling and inversion with near real-time GPS: a GITEWS perspective for Indonesia

    NASA Astrophysics Data System (ADS)

    Babeyko, A. Y.; Hoechner, A.; Sobolev, S. V.

    2010-07-01

    We present the GITEWS approach to source modeling for tsunami early warning in Indonesia. Near-field tsunamis impose special requirements on both warning time and the detail of source characterization. To meet these requirements, we employ geophysical and geological information to predefine as many rupture parameters as possible. We discretize the tsunamigenic Sunda plate interface into an ordered grid of patches (150×25) and employ the concept of Green's functions for forward and inverse rupture modeling. Rupture Generator, a forward modeling tool, additionally employs different scaling laws and slip shape functions to construct physically reasonable source models using basic seismic information only (magnitude and epicenter location). GITEWS runs a library of semi- and fully-synthetic scenarios that is extensively employed for system testing as well as for teaching and training warning center personnel. Near real-time GPS observations are a very valuable complement to the local tsunami warning system. Their inversion provides a quick (within a few minutes of an event) estimation of the earthquake magnitude and rupture position and, in the case of sufficient station coverage, details of the slip distribution.
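
    The inversion step can be viewed as a linear least-squares problem d = Gm, where d are the near real-time GPS displacements, G contains precomputed Green's functions for the gridded patches and m is the slip vector; the damped least-squares sketch below is a generic illustration with synthetic values, not the GITEWS implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_patches, n_obs = 40, 60                    # toy sizes; GITEWS uses a 150x25 patch grid
G = rng.normal(size=(n_obs, n_patches))      # Green's functions (placeholder values)
true_slip = np.zeros(n_patches)
true_slip[15:25] = 2.0                       # a compact slip patch with 2 m of slip

d = G @ true_slip + rng.normal(scale=0.02, size=n_obs)   # GPS displacements + noise

# Damped (Tikhonov-regularised) least squares: minimise |Gm - d|^2 + eps^2 |m|^2
eps = 0.5
A = np.vstack([G, eps * np.eye(n_patches)])
b = np.concatenate([d, np.zeros(n_patches)])
slip_est, *_ = np.linalg.lstsq(A, b, rcond=None)

# Scalar moment and moment magnitude from the recovered slip
mu, patch_area = 3.0e10, 20e3 * 20e3   # rigidity (Pa) and patch area (m^2), assumed values
m0 = mu * patch_area * np.abs(slip_est).sum()
mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)
print(f"estimated Mw ~ {mw:.2f}")
```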

  14. U.S. Tsunami Warning System: Advancements since the 2004 Indian Ocean Tsunami (Invited)

    NASA Astrophysics Data System (ADS)

    Whitmore, P.

    2009-12-01

    The U.S. government embarked on a strengthening program for the U.S. Tsunami Warning System (TWS) in the aftermath of the disastrous 2004 Indian Ocean tsunami. The program was designed to improve several facets of the U.S. TWS, including: upgrade of the coastal sea level network - 16 new stations plus higher transmission rates; expansion of the deep ocean tsunameter network - 7 sites increased to 39; upgrade of seismic networks - both USGS and Tsunami Warning Center (TWC); increase of TWC staff to allow 24x7 coverage at two centers; development of an improved tsunami forecast system; increased preparedness in coastal communities; expansion of the Pacific Tsunami Warning Center facility; and improvement of the tsunami data archive effort at the National Geophysical Data Center. The strengthening program has been completed and has contributed to the many improvements attained in the U.S. TWS since 2004. Some of the more significant enhancements to the program are: the number of sea level and seismic sites worldwide available to the TWCs has more than doubled; the TWC areas-of-responsibility expanded to include the U.S./Canadian Atlantic coasts, Indian Ocean, Caribbean Sea, Gulf of Mexico, and U.S. Arctic coast; event response time decreased by approximately one-half; product accuracy has improved; a tsunami forecast system developed by NOAA capable of forecasting inundation during an event has been delivered to the TWCs; warning areas are now defined by pre-computed or forecasted threat versus distance or travel time, significantly reducing the amount of coast put in a warning; new warning dissemination techniques have been implemented to reach a broader audience in less time; tsunami product content better reflects the expected impact level; the number of TsunamiReady communities has quadrupled; and the historical data archive has increased in quantity and accuracy. In addition to the strengthening program, the U.S. National Tsunami Hazard Mitigation Program (NTHMP) has expanded its efforts since 2004 and improved tsunami preparedness throughout U.S. coastal communities. The NTHMP is a partnership of federal agencies and state tsunami response agencies whose efforts include: development of inundation and evacuation maps for most highly threatened communities; tsunami evacuation and educational signage for coastal communities; support for tsunami educational, awareness and planning seminars; increased number of local tsunami warning dissemination devices such as sirens; and support for regional tsunami exercises. These activities are major factors that have contributed to the increase of TsunamiReady communities throughout the country.

  15. U.S. Tsunami Warning Centers

    Science.gov Websites

    Interactive alert-status map interface showing current tsunami warnings, advisories, watches, and threats ("There is No Tsunami Warning" when none are active), with earthquake and alert map layers and regional views for Alaska, Hawaii, Guam/CNMI, American Samoa, the Caribbean, North America, and South America.

  16. Tsunami Warning Services for the U.S. and Canadian Atlantic Coasts

    NASA Astrophysics Data System (ADS)

    Whitmore, P. M.; Knight, W.

    2008-12-01

    In January 2005, the National Oceanic and Atmospheric Administration (NOAA) developed a tsunami warning program for the U.S. Atlantic and Gulf of Mexico coasts. Within a year, this program was extended to the Atlantic coast of Canada and the Caribbean Sea. Warning services are provided to U.S. and Canadian coasts (including Puerto Rico and the Virgin Islands) by the NOAA/West Coast and Alaska Tsunami Warning Center (WCATWC), while the NOAA/Pacific Tsunami Warning Center (PTWC) provides services for non-U.S. entities in the Caribbean Basin. The Puerto Rico Seismic Network (PRSN) is also an active partner in the Caribbean Basin warning system. While the nature of the tsunami threat in the Atlantic Basin is different from that in the Pacific, the warning system philosophy is similar: initial messages are based strictly on seismic data so that information is provided to those at greatest risk as fast as possible, while supplementary messages are refined with sea level observations and forecasts when possible. The Tsunami Warning Centers (TWCs) acquire regional seismic data through many agencies, such as the United States Geological Survey, Earthquakes Canada, regional seismic networks, and the PRSN. Seismic data quantity and quality are generally sufficient throughout most of the Atlantic area-of-responsibility to issue initial information within five minutes of origin time. Sea level data are mainly provided by the NOAA/National Ocean Service. Coastal tide gauge coverage is generally denser along the Atlantic coast than in the Pacific. Seven deep-ocean pressure sensors (DARTs), operated by the National Weather Service (NWS) National Data Buoy Center, are located in the Atlantic Basin (5 in the Atlantic Ocean, 1 in the Caribbean, and 1 in the Gulf of Mexico). The DARTs provide the TWCs with the means to verify tsunami generation in the Atlantic and provide critical data with which to calibrate forecast models. Setting tsunami warning response criteria in the Atlantic Basin poses a challenge due to the lack of historic events, and the probability and nature of potential sources along the offshore U.S./Canada region are not well understood. Warning/watch/advisory criteria are under review to improve TWC response. Primary tsunami warning contact points consist of NWS Weather Forecast Offices, state warning points, the U.S. Coast Guard, and the military. These entities each have responsibility to propagate the message through specific channels. To help communities prepare for a tsunami warning, the NWS established the TsunamiReady program. TsunamiReady sets criteria for communities which include reliable methods to receive TWC warnings, reliable methods to disseminate messages locally, pre-event planning, defined hazard and safe zones, and public education. Once the criteria are met, a community can be recognized as TsunamiReady. A hypothetical event off the east coast is examined and a timeline given for TWC analysis and product issuance.

  17. Developing Tsunami Evacuation Plans, Maps, And Procedures: Pilot Project in Central America

    NASA Astrophysics Data System (ADS)

    Arcos, N. P.; Kong, L. S. L.; Arcas, D.; Aliaga, B.; Coetzee, D.; Leonard, J.

    2015-12-01

    In the end-to-end tsunami warning chain, once a forecast is provided and a warning alert issued, communities must know what to do and where to go. The 'where to go' answer requires reliable and practical community-level tsunami evacuation maps. Following Exercise Pacific Wave 2011, a questionnaire was sent to the 46 Member States of the Pacific Tsunami Warning System (PTWS). The results revealed that over 42 percent of Member States lacked tsunami mass coastal evacuation plans. Additionally, a significant gap in mapping was exposed, as over 55 percent of Member States lacked tsunami evacuation maps, routes, signs and assembly points. Thus, a significant portion of countries in the Pacific lack appropriate tsunami planning and mapping for their at-risk coastal communities. While a variety of tools exist to establish tsunami inundation areas, they are applied inconsistently, and no standard methodology has been developed to assist countries in developing tsunami evacuation maps, plans, and procedures. The International Tsunami Information Center (ITIC) and its partners are leading a Pilot Project in Honduras demonstrating that globally standardized tools and methodologies can be applied by a country with minimal tsunami warning and mitigation resources towards the determination of tsunami inundation areas and, subsequently, community-owned tsunami evacuation maps and plans for at-risk communities. The Pilot involves a 1- to 2-year process centered on a series of linked tsunami training workshops on evacuation planning, evacuation map development, inundation modeling and map creation, tsunami warning and emergency response Standard Operating Procedures (SOPs), and the conduct of tsunami exercises (including evacuation). The Pilot's completion is capped with a UNESCO/IOC document so that other countries can replicate the process in their tsunami-prone communities.

  18. Bodrum-Kos (Turkey-Greece) Mw 6.6 earthquake and tsunami of 20 July 2017: a test for the Mediterranean tsunami warning system

    NASA Astrophysics Data System (ADS)

    Heidarzadeh, Mohammad; Necmioglu, Ocal; Ishibe, Takeo; Yalciner, Ahmet C.

    2017-12-01

    Various Tsunami Service Providers (TSPs) within the Mediterranean Basin supply tsunami warnings, including CAT-INGV (Italy), KOERI-RETMC (Turkey), and NOA/HL-NTWC (Greece). The 20 July 2017 Bodrum-Kos (Turkey-Greece) earthquake (Mw 6.6) and tsunami provided an opportunity to assess the response of these TSPs. Although the Bodrum-Kos tsunami was moderate (e.g., runup of 1.9 m) with little damage to property, it was the first noticeable tsunami in the Mediterranean Basin since the 21 May 2003 western Mediterranean tsunami. Tsunami waveform analysis revealed that the trough-to-crest height was 34.1 cm at the near-field tide gauge station of Bodrum (Turkey). The tsunami period band was 2-30 min, with peak periods at 7-13 min. We proposed a source fault model for this tsunami with a length of 25 km, a width of 15 km, and a uniform slip of 0.4 m. Tsunami simulations using both nodal planes produced almost the same results in terms of agreement between tsunami observations and simulations. The TSPs provided tsunami warnings at 10 min (CAT-INGV), 19 min (KOERI-RETMC), and 18 min (NOA/HL-NTWC) after the earthquake origin time. Apart from CAT-INGV, whose initial Mw estimate differed by 0.2 units from the final value, the responses from the other two TSPs came relatively late compared with the desired warning time of 10 min, given the difficulties of timely and accurate calculation of earthquake magnitude and tsunami impact assessment. It is argued that even if a warning time of 10 min had been achieved, it might not have been sufficient for addressing near-field tsunami hazards. Despite considerable progress and achievements made within the upstream components of NEAMTWS (North-East Atlantic, Mediterranean and Connected Seas Tsunami Warning System), the experience from this moderate tsunami may highlight the need for improving the operational capabilities of TSPs, but more importantly for effectively integrating civil protection authorities into NEAMTWS and strengthening tsunami education programs.

  19. Implementation and Challenges of the Tsunami Warning System in the Western Mediterranean

    NASA Astrophysics Data System (ADS)

    Schindelé, F.; Gailler, A.; Hébert, H.; Loevenbruck, A.; Gutierrez, E.; Monnier, A.; Roudil, P.; Reymond, D.; Rivera, L.

    2015-03-01

    The French Tsunami Warning Center (CENALT) has been in operation since 2012. It contributes to the North-East Atlantic and Mediterranean (NEAM) tsunami warning and mitigation system coordinated by the United Nations Educational, Scientific, and Cultural Organization, and benefits from data exchange with several foreign institutes. The center is supported by the French Government and provides French civil protection authorities and member states of the NEAM region with relevant messages for assessing potential tsunami risk when an earthquake has occurred in the Western Mediterranean Sea or the North-East Atlantic Ocean. To achieve its objectives, CENALT has developed a series of innovative techniques based on recent research results in seismology for early tsunami warning, in the monitoring of sea level variations and detection capability, and in the effective numerical computation of ongoing tsunamis.

  20. Educating and Preparing for Tsunamis in the Caribbean

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Aliaga, B.; Edwards, S.

    2013-12-01

    The Caribbean and Adjacent Regions have a long history of tsunamis and earthquakes. Over the past 500 years, more than 75 tsunamis have been documented in the region by the NOAA National Geophysical Data Center. Since 1842 alone, 3446 lives have been lost to tsunamis; this is more than in the Northeastern Pacific for the same time period. With a population of almost 160 million, over 40 million visitors a year and a heavy concentration of residents, tourists, businesses and critical infrastructure along its shores (especially in the northern and eastern Caribbean), the risk to lives and livelihoods is greater than ever before. The only way to survive a tsunami is to get out of harm's way before the waves strike. In the Caribbean, given the relatively short distances from faults, potential submarine landslides and volcanoes to some of the coastlines, tsunamis are likely to be short-fused, so it is imperative that tsunami warnings be issued extremely quickly and that people be educated on how to recognize and respond to them. Nevertheless, because tsunamis occur infrequently compared with hurricanes, it is a challenge for them to receive the priority they require in order to save lives when the next one strikes the region. Close cooperation among countries and territories is required for warning, but also for education and public awareness. Geographical vicinity and spoken languages need to be factored in when developing tsunami preparedness in the Caribbean, to make sure citizens receive a clear, reliable and sound science-based message about the hazard and the risk. In 2006, in the wake of the Indian Ocean tsunami and after advocating without success for a Caribbean Tsunami Warning System since the mid-1990s, the Intergovernmental Oceanographic Commission of UNESCO established the Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). Its purpose is to advance an end-to-end tsunami warning system that serves regionally and delivers locally, saving lives and livelihoods not only from tsunamis but from all coastal hazards. Through this and other platforms, physical and social scientists, emergency managers and elected officials have been working together via different mechanisms. Community-based recognition programs such as the TsunamiReady™ Program, regional tsunami exercises, sub-regional public education activities such as the Tsunami Smart campaigns, internet technologies, social media, meetings and conferences, identification of local and national champions, capitalization on breaking news of tsunamis and earthquakes, and economic resources for equipment and training have all been key to developing a tsunami-safer Caribbean. Given these efforts, according to a 2013 survey, 93% of the countries covered by CARIBE EWS have tsunami response protocols in place, although much more work is required. In 2010 the US National Weather Service established the Caribbean Tsunami Warning Program as the first step towards a Caribbean Tsunami Warning Center in the region, and in 2013 the Caribbean Tsunami Information Center was established in Barbados. Both institutions serve the region and play a key role in promoting both the warning and educational components of the warning system.

  1. Experiences integrating autonomous components and legacy systems into tsunami early warning systems

    NASA Astrophysics Data System (ADS)

    Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.

    2012-04-01

    Fostered by and embedded in the general development of Information and Communication Technology (ICT), Tsunami Early Warning Systems (TEWS) have evolved significantly from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data covers not only sensors but also other components and systems offering services, such as the delivery of feasible simulations used for forecasting during an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk describes experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a shared infrastructure. The talk also covers task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. providing seismic and sea level data, and for the utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.
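
    One common way to integrate a legacy stand-alone system, as described above, is a thin software adapter that translates the system-specific feed into the warning centre's internal data model; the sketch below is a generic illustration with hypothetical class and field names, not TRIDEC, DEWS or GITEWS code.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class SeaLevelSample:
    """Common internal representation used by a (hypothetical) warning core."""
    station_id: str
    epoch_s: float
    height_m: float

class SensorSource(Protocol):
    def poll(self) -> list[SeaLevelSample]: ...

class LegacyTideGaugeAdapter:
    """Wraps a legacy gauge feed that delivers semicolon-separated text records."""

    def __init__(self, station_id: str, read_raw):
        self.station_id = station_id
        self._read_raw = read_raw  # callable returning raw legacy lines

    def poll(self) -> list[SeaLevelSample]:
        samples = []
        for line in self._read_raw():
            epoch, millimetres = line.strip().split(";")
            samples.append(SeaLevelSample(self.station_id,
                                          float(epoch),
                                          float(millimetres) / 1000.0))
        return samples

# Usage with a fake legacy feed (epoch;height-in-mm records).
fake_feed = lambda: ["1700000000;1234", "1700000060;1240"]
adapter: SensorSource = LegacyTideGaugeAdapter("TG01", fake_feed)
print(adapter.poll())
```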

  2. The One-Meter Criterion for Tsunami Warning: Time for a Reevaluation?

    NASA Astrophysics Data System (ADS)

    Fryer, G. J.; Weinstein, S.

    2013-12-01

    The U.S. tsunami warning centers issue warnings when runup is anticipated to exceed one meter. The origins of the one-meter criterion are unclear, though Whitmore et al. (2008) showed from tsunami history that one meter is roughly the threshold above which damage occurs. Recent experience in Hawaii, however, suggests that the threshold could be raised. Tsunami Warnings were issued for the 2010 Chile, 2011 Tohoku, and 2012 Haida Gwaii tsunamis; each exceeded one meter of runup somewhere in the state. Evacuation, however, was necessary only in 2011, and even then onshore damage (as opposed to damage from currents) occurred only where runup exceeded 1.5 m. During both the Chile and Haida Gwaii tsunamis the existing criteria led to unnecessary evacuation. Maximum runup during the Chile tsunami was 1.1 m at Hilo's Wailoa Boat Harbor, while the Haida Gwaii tsunami peaked at 1.2 m at Honouliwai Bay on Molokai. Both tsunamis caused only minor damage and minimal flooding; in both cases a Tsunami Advisory (i.e., there is no need to evacuate, but stay off the beach and out of the water) would have been adequate. The Advisory was originally developed as an ad hoc response to the mildly threatening 2006 Kuril tsunami and has since been formalized as the product we issue when maximum runup is expected to be 0.3-1.0 m. At the time it was introduced, however, there was no discussion that this new low-level warning might allow the criterion for the Tsunami Warning itself to be adjusted. We now suggest that the divide between Advisory and Warning be raised from 1.0 m to something greater, possibly 1.2 m. If the warning threshold were raised to 1.2 m, the over-warning for the Chile tsunami still could not have been avoided: models calibrated against DART data consistently forecast runup just over 1.2 m for that event. For Haida Gwaii, adjusting the models to match the DART data increased the forecast runup to almost 2 m, which again meant a warning, though in retrospect we should have been skeptical. The nearest DART to Haida Gwaii was off the Washington coast, in line with the long axis (strike direction) of the rupture, and so provided little constraint on the tsunami directed towards Hawaii (the dip direction). The finite fault model obtained by inverting the DART data extended the rupture too far along strike and pushed the rupture to the wrong (east) side of Haida Gwaii, in conflict with the W-phase CMT. The inferred wave height at the Langara Point tide gauge, just outside the epicentral region, was also too large by a factor of two. Forcing the tsunami inversion to be consistent with the CMT would have rendered the inferred rupture much closer to reality, matched the Langara Point record well, and forecast a maximum runup at Kahului of only 1.0 m (the actual runup there was 0.8 m). If the warning criterion had been 1.2 m, the unnecessary coastal evacuation for the Haida Gwaii tsunami could have been avoided. So increasing the warning threshold by only 20 cm would eliminate one of the two recent unnecessary evacuations. Can the threshold be raised even more? We are considering that possibility, though the uncertainties and time constraints of an actual warning demand that we remain very conservative.

  3. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics (WDS) provide long-term archiving, data management, and access for national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunamis (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, and the inventory, archive and delivery capabilities were upgraded based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archiving, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.

  4. DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim

    2010-05-01

    The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective of creating a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Building on the upstream information flow, DEWS focuses on improving the downstream capacities of warning centres, especially information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels are used for the dissemination of warning messages, and existing standards have been integrated wherever possible. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7], spatial data are utilized to depict the situation picture, and a simulation system is integrated via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other hazard types are expected to follow, e.g. volcanic eruptions or landslides, so multi-hazard functionality is conceivable in the future. The specific software architecture of DEWS makes it possible to dock varying sensors onto the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI in conjunction with details of information logistics. The DEWS Wide Area Centre, which connects national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis

  5. Application of Seismic Array Processing to Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    An, C.; Meng, L.

    2015-12-01

    Tsunami wave predictions in current tsunami warning systems rely on accurate earthquake source inversions of wave height data. They are of limited effectiveness for near-field areas, since the tsunami waves arrive before sufficient data are collected. Recent seismic and tsunami disasters have revealed the need for early warning to protect near-source coastal populations. In this work we developed the basis for a tsunami warning system based on rapid earthquake source characterisation through regional seismic array back-projections. We explored rapid earthquake source imaging using dense onshore seismic arrays located at regional distances on the order of 1000 km, which provides faster source images than conventional teleseismic back-projections. We implemented this method in a simulated real-time environment and analysed the 2011 Tohoku earthquake rupture with two clusters of Hi-net stations in Kyushu and northern Hokkaido, and the 2014 Iquique event with the EarthScope USArray Transportable Array. The results yield reasonable estimates of the rupture area, which is approximated by an ellipse and leads to the construction of simple slip models based on empirical scaling of rupture area, seismic moment and average slip. The slip model is then used as the input to the tsunami simulation package COMCOT to predict the tsunami waves. In the example of the Tohoku event, the earthquake source model can be acquired within 6 minutes from the start of rupture and the simulation of tsunami waves takes less than 2 minutes, which could facilitate a timely tsunami warning. The predicted arrival times and wave amplitudes fit the observations reasonably well. Based on this method, we propose to develop an automatic warning mechanism that provides rapid near-field warning for areas of high tsunami risk. The initial focus will be Japan, the Pacific Northwest and Alaska, where dense seismic networks with real-time data telemetry and open data access, such as the Japanese Hi-net (>800 instruments) and the EarthScope USArray Transportable Array (~400 instruments), are established.
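
    The step from an imaged rupture ellipse to a simple slip model relies on empirical scaling between rupture area, seismic moment and average slip; the sketch below uses a constant-stress-drop crack approximation with assumed parameter values, not the specific scaling relations of this study.

```python
import math

def ellipse_area_m2(semi_major_km: float, semi_minor_km: float) -> float:
    return math.pi * semi_major_km * 1e3 * semi_minor_km * 1e3

def scale_rupture(area_m2: float,
                  stress_drop_pa: float = 3.0e6,   # assumed ~3 MPa stress drop
                  rigidity_pa: float = 3.0e10) -> tuple[float, float, float]:
    """Average slip, scalar moment and Mw from rupture area.

    Uses D ~ C * stress_drop * sqrt(A) / mu with C ~ 0.4 (rough crack-model
    constant), M0 = mu * A * D, and Mw = (2/3)(log10 M0 - 9.1). All constants
    here are illustrative assumptions.
    """
    avg_slip = 0.4 * stress_drop_pa * math.sqrt(area_m2) / rigidity_pa
    m0 = rigidity_pa * area_m2 * avg_slip
    mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)
    return avg_slip, m0, mw

# Back-projection ellipse roughly 200 km x 100 km (semi-axes 100 km and 50 km).
slip, m0, mw = scale_rupture(ellipse_area_m2(100.0, 50.0))
print(f"slip ~ {slip:.1f} m, M0 ~ {m0:.2e} N*m, Mw ~ {mw:.1f}")
```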

  6. GPS water level measurements for Indonesia's Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Schöne, T.; Pandoe, W.; Mudita, I.; Roemer, S.; Illigner, J.; Zech, C.; Galas, R.

    2011-03-01

    On Boxing Day 2004, a severe tsunami was generated by a strong earthquake off Northern Sumatra, causing a large number of casualties. At that time, neither an offshore buoy network to measure tsunami waves nor a system to disseminate tsunami warnings to local governmental entities was in place. Since then, buoys have been developed by Indonesia and Germany, complemented by NOAA's Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys, and have been moored offshore of Sumatra and Java. The suite of sensors for offshore tsunami detection in Indonesia has been advanced by adding GPS technology for water level measurements. The use of GPS buoys in tsunami warning systems is a relatively new approach. The concept of the German Indonesian Tsunami Early Warning System (GITEWS) (Rudloff et al., 2009) combines GPS technology and ocean bottom pressure (OBP) measurements. Especially for near-field installations, where seismic noise may degrade the OBP data, GPS-derived sea level heights provide additional information. The GPS buoy technology is precise enough to detect medium to large tsunamis with amplitudes larger than 10 cm. The analysis presented here suggests that, for about 68% of the time, tsunamis larger than 5 cm may be detectable.

  7. A Tsunami-Focused Tide Station Data Sharing Framework

    NASA Astrophysics Data System (ADS)

    Kari, U. S.; Marra, J. J.; Weinstein, S. A.

    2006-12-01

    The Indian Ocean tsunami of 26 December 2004 made it clear that information about tide stations that could be used to support detection and warning (such as location, collection and transmission capabilities, and operator identification) is insufficiently known or not readily accessible. Parties interested in addressing this problem united under the Pacific Region Integrated Data Enterprise (PRIDE), and in 2005 began a multiyear effort to develop a distributed metadata system describing tide stations, starting with pilot activities in a regional framework and focusing on tsunami detection and warning systems being developed by various agencies. First, a plain semantic description of the tsunami-focused tide station metadata was developed. This semantic metadata description was, in turn, developed into a formal metadata schema championed by the International Tsunami Information Center (ITIC) as part of a larger effort to develop a prototype web service under the PRIDE program in 2005. Under the 2006 PRIDE program, the formal metadata schema was then expanded to capture input parameters for the TideTool application, used by the Pacific Tsunami Warning Center (PTWC) to drill down into wave activity at a tide station located via a web service built on this metadata schema. This effort contributed to the formalization of web service dissemination of PTWC watch and warning tsunami bulletins. During this time, the data content and sharing issues embodied in this schema have been discussed at various forums. The result is that the various stakeholders have different data provider and user perspectives (semantic content) as well as different exchange formats (not limited to XML). The challenge, then, is not only to capture all data requirements but also to have a formal representation that is easily transformed into any specified format. The latest revision of the tide gauge schema (Version 0.3) begins to address this challenge. It encompasses a broader range of provider and user perspectives, such as station operators, warning system managers, disaster managers, and other marine hazard warning systems (such as storm surge and sea-level-change monitoring and research). In the next revision(s), we hope to take into account various relevant standards, specifically including the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) framework, so as to serve all prospective stakeholders in the most useful (extensible, scalable) manner. SensorML has already addressed many of the challenges we face, through useful fundamental modeling considerations and data types particular to sensors in general, with perhaps some extension needed for tide gauges. As a result of developing this schema and associated client application architectures, we hope to have a much more distributed network of data providers who are able to contribute to global tide station metadata from the comfort of their own Information Technology (IT) departments.
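
    As an illustration of the kind of record such a schema describes (location, operator identification, collection and transmission capabilities), the sketch below serialises a hypothetical station entry to XML; the element names are placeholders, not the ITIC/PRIDE schema.

```python
import xml.etree.ElementTree as ET

def tide_station_record(station: dict) -> bytes:
    """Serialise one tide-station metadata entry to a small XML document."""
    root = ET.Element("tideStation", id=station["id"])
    for tag in ("name", "operator", "latitude", "longitude",
                "samplingIntervalSeconds", "transmission"):
        ET.SubElement(root, tag).text = str(station[tag])
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

# Hypothetical station entry with the metadata categories named in the abstract.
example = {
    "id": "HONO-01",
    "name": "Example Harbor",
    "operator": "Example National Hydrographic Service",
    "latitude": 21.31,
    "longitude": -157.87,
    "samplingIntervalSeconds": 60,
    "transmission": "GTS, 15-minute blocks",
}
print(tide_station_record(example).decode())
```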

  8. Science and Engineering of an Operational Tsunami Forecasting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Frank

    2009-04-06

    After a review of tsunami statistics and the destruction caused by tsunamis, a means of forecasting tsunamis is discussed as part of an overall program of reducing fatalities through hazard assessment, education, training, mitigation, and a tsunami warning system. The forecast is accomplished via a concept called Deep Ocean Assessment and Reporting of Tsunamis (DART). Small changes of pressure at the sea floor are measured and relayed to warning centers. Under development is an international modeling network to transfer, maintain, and improve tsunami forecast models.

  9. Science and Engineering of an Operational Tsunami Forecasting System

    ScienceCinema

    Gonzalez, Frank

    2017-12-09

    After a review of tsunami statistics and the destruction caused by tsunamis, a means of forecasting tsunamis is discussed as part of an overall program of reducing fatalities through hazard assessment, education, training, mitigation, and a tsunami warning system. The forecast is accomplished via a concept called Deep Ocean Assessment and Reporting of Tsunamis (DART). Small changes of pressure at the sea floor are measured and relayed to warning centers. Under development is an international modeling network to transfer, maintain, and improve tsunami forecast models.

  10. Introduction to "Tsunami Science: Ten Years After the 2004 Indian Ocean Tsunami. Volume I"

    NASA Astrophysics Data System (ADS)

    Rabinovich, Alexander B.; Geist, Eric L.; Fritz, Hermann M.; Borrero, Jose C.

    2015-03-01

    Twenty-two papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Tsunami Science: Ten Years after the 2004 Indian Ocean Tsunami." Eight papers examine various aspects of past events with an emphasis on case and regional studies. Five papers are on tsunami warning and forecast, including the improvement of existing tsunami warning systems and the development of new warning systems in the northeast Atlantic and Mediterranean region. Three more papers present the results of analytical studies and discuss benchmark problems. Four papers report the impacts of tsunamis, including the detailed calculation of inundation onshore and into rivers and probabilistic analysis for engineering purposes. The final two papers relate to important investigations of the source and tsunami generation. Overall, the volume not only addresses the pivotal 2004 Indian Ocean (Sumatra) and 2011 Japan (Tohoku) tsunamis, but also examines the tsunami hazard posed to other critical coasts in the world.

  11. Improving tsunami warning systems with remote sensing and geographical information system input.

    PubMed

    Wang, Jin-Feng; Li, Lian-Fa

    2008-12-01

    An optimal and integrative tsunami warning system is introduced that takes full advantage of remote sensing and geographical information systems (GIS) in monitoring, forecasting, detection, loss evaluation, and relief management for tsunamis. Using the primary impact zone in Banda Aceh, Indonesia as the pilot area, we conducted three simulations showing that, while the 26 December 2004 Indian Ocean tsunami claimed about 300,000 lives in the absence of any tsunami warning system, the toll might have been reduced to about 15,000 lives if the area had been served by a warning system like the one currently in use in the Pacific Ocean. The simulations further indicated that the death toll could have been about 3,000 if a disaster system further optimized with full use of remote sensing and GIS had been in place, although the number of badly damaged or destroyed houses (29,545) would likely have remained unchanged.

  12. Tsunami Early Warning via a Physics-Based Simulation Pipeline

    NASA Astrophysics Data System (ADS)

    Wilson, J. M.; Rundle, J. B.; Donnellan, A.; Ward, S. N.; Komjathy, A.

    2017-12-01

    Through independent efforts, physics-based simulations of earthquakes, tsunamis, and the atmospheric signatures of these phenomena have been developed. With the goal of producing tsunami forecasts and early warning tools for at-risk regions, we join these three spheres to create a simulation pipeline. The Virtual Quake simulator can produce thousands of years of synthetic seismicity on large, complex fault geometries, as well as the expected surface displacement in tsunamigenic regions. These displacements are used as initial conditions for tsunami simulators, such as Tsunami Squares, to produce catalogs of potential tsunami scenarios with probabilities. Finally, these tsunami scenarios can act as input for simulations of the associated ionospheric total electron content, signals which can be detected by GNSS satellites for purposes of early warning in the event of a real tsunami. We present the most recent developments in this project.
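
    A purely structural sketch of such a pipeline is given below. The three stage functions are placeholders standing in for Virtual Quake, a tsunami simulator, and a TEC model; the numbers they return are meaningless and serve only to show how the stages chain together.

```python
# Structural sketch only: the three simulation stages described above chained
# into one pipeline.  The stage functions are placeholders, not the actual
# Virtual Quake / Tsunami Squares / ionospheric codes.
from typing import Dict, List

def synthetic_seismicity(n_events: int) -> List[Dict]:
    """Placeholder for a simulator producing synthetic ruptures with seafloor uplift (m)."""
    return [{"event_id": i, "uplift_m": 1.0 + 0.5 * i} for i in range(n_events)]

def tsunami_scenario(rupture: Dict) -> Dict:
    """Placeholder: map seafloor displacement to a coarse open-ocean amplitude."""
    return {"event_id": rupture["event_id"], "ocean_amp_m": 0.5 * rupture["uplift_m"]}

def ionospheric_signature(scenario: Dict) -> Dict:
    """Placeholder: map tsunami amplitude to a relative TEC perturbation."""
    return {"event_id": scenario["event_id"], "tec_perturbation": 0.02 * scenario["ocean_amp_m"]}

catalog = [ionospheric_signature(tsunami_scenario(r)) for r in synthetic_seismicity(3)]
print(catalog)
```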

  13. Warnings and reactions to the Tohoku tsunami in Hawaii

    NASA Astrophysics Data System (ADS)

    Houghton, B. F.; Gregg, C. E.

    2012-12-01

    The 2011 Tohoku tsunami was the first chance within the USA to document and interpret large-scale response and protective action behavior with regard to a large, destructive tsunami since 1964. The 2011 tsunami offered a unique, short-lived opportunity to transform our understanding of individual and collective behavior in the US in response to a well-publicized tsunami warning and, in particular, to look at the complex interplay of official information sources, informal warnings and information-seeking in communities with significant physical impact from the 2011 tsunami. This study is focused in Hawaii, which suffered significant ($30 M), but localized damage, from the 2011 Tohoku tsunami and underwent a full-scale tsunami evacuation. The survey contrasts three Hawaiian communities which either experienced significant tsunami damage (Kona) or little physical impact (Hilo, Honolulu). It also contrasts a long-established local community with experience of evacuation, destruction and loss of life in two tsunamis (Hilo) with a metropolitan population with a large visitor presence (Honolulu) that has not experienced a damaging tsunami in decades. Many factors such as personal perceptions of risk, beliefs, past exposure to the hazard, forecast uncertainty, trust in information sources, channels of transmission of information, the need for message confirmation, responsibilities, obligations, mobility, the ability to prepare, the availability of transportation and transport routes, and an acceptable evacuation center affected behavior. We provide new information on how people reacted to warnings and tsunamis, especially with regard to social integration of official warnings and social media. The results of this study will strengthen community resilience to tsunamis, working with emergency managers to integrate strengths and weaknesses of the public responses with official response plans.

  14. What caused a large number of fatalities in the Tohoku earthquake?

    NASA Astrophysics Data System (ADS)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    The Mw 9.0 earthquake left about 20,000 people dead or missing in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of which was a "tsunami earthquake" that resulted in a death toll of 22,000. Since then, numerous breakwaters have been constructed along the entire northeastern coast, tsunami evacuation drills have been carried out, and hazard maps have been distributed to local residents in numerous communities. Despite these constructions and preparedness efforts, the March 11 Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so residents recognized it as the strongest and longest earthquake they had ever experienced. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find out the reasons behind the high number of fatalities due to the March 11 tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate prefecture, in mid-April and early June 2011. Interviews lasted about 30 minutes or longer and focused on the survivors' evacuation behavior and the behavior they had observed in others. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly due to, or influenced by, earthquake science results. Some of the factors that affected residents' decisions are listed below. 1. Earthquake hazard assessments turned out to be incorrect: the expected earthquake magnitudes and resultant hazards in northeastern Japan assessed and publicized by the government were significantly smaller than the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings: the first tsunami warnings quoted heights that were far too small compared with the actual tsunami. 3. Previous frequent warnings with overestimated tsunami heights had influenced the behavior of the residents. 4. Many local residents above 55 years old had experienced the 1960 Chile tsunami, which was significantly smaller than the 11 March tsunami; this sense of "knowing" put their lives at high risk. 5. Some local residents believed that, with the presence of a breakwater, only slight flooding would occur. 6. Many people did not understand how a tsunami is generated under the sea, so the relation between the earthquake and the tsunami was not clear to them. These interviews made it clear that many deaths resulted because current technology and earthquake science underestimated tsunami heights, warning systems failed, and breakwaters were not strong or high enough. However, even if these problems recur in future earthquakes, better knowledge of earthquakes and tsunami hazards could save more lives. It is therefore necessary to teach the basic mechanism of tsunami generation from elementary school onwards, while children are most receptive.

  15. 33 CFR 165.14-1414 - Safety Zones; Hawaiian Islands Commercial Harbors; HI.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... harbors, or all of these harbors, dependent upon details in the tsunami warning. These safety zones extend... period. Paragraph (b) of this section will be enforced when a tsunami warning has been issued for the... Coast Guard's Homeport Web site. Following the passage of the tsunami or tsunami threat and harbor...

  16. Global Tsunami Warning System Development Since 2004

    NASA Astrophysics Data System (ADS)

    Weinstein, S.; Becker, N. C.; Wang, D.; Fryer, G. J.; McCreery, C.; Hirshorn, B. F.

    2014-12-01

    The 9.1 Mw Great Sumatra Earthquake of Dec. 26, 2004, generated the most destructive tsunami in history, killing 227,000 people along Indian Ocean coastlines, and was recorded by sea-level instruments worldwide. This tragedy showed that the Indian Ocean needed a tsunami warning system to prevent another disaster on this scale, and it highlighted the need for tsunami warning systems in other ocean basins as well. Instruments recording earthquake and sea-level data useful for tsunami monitoring were scarce outside of the Pacific Ocean in 2004: seismometers were few in number, even fewer were high-quality, long-period broadband instruments, and not much of their data was made available to the US tsunami warning centers (TWCs). In 2004 the US TWCs relied exclusively on instrumentation provided and maintained by IRIS and the USGS for areas outside of the Pacific. Since 2004, the US TWCs and their partners have made substantial improvements to seismic and sea-level monitoring networks through the addition of new and better instruments, densification of existing networks, better communications infrastructure, and improved data sharing among tsunami warning centers. In particular, the number of sea-level stations transmitting data in near real time and the amount of seismic data available to the tsunami warning centers have more than tripled. The DART network, which consisted of a half-dozen Pacific stations in 2004, now totals nearly 60 stations worldwide. Earthquake and tsunami science has progressed as well. It took nearly three weeks to obtain the first reliable estimates of the 2004 Sumatra Earthquake's magnitude. Today, thanks to improved seismic networks and modern computing power, TWCs use the W-phase seismic moment method to determine accurate earthquake magnitudes and focal mechanisms for great earthquakes within 25 minutes. TWC scientists have also leveraged these modern computers to generate tsunami forecasts in a matter of minutes. Progress towards a global tsunami warning system has been substantial, and today fully functioning TWCs protect most of the world's coastlines. These improvements have also led to a substantial reduction in the time required by the TWCs to detect, locate, and assess the tsunami threat from earthquakes occurring worldwide.
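
    The W-phase inversion yields a seismic moment, which is converted to moment magnitude via Kanamori's standard relation; the snippet below illustrates that conversion only and is not warning-center code.

```python
# Illustration of the standard moment-magnitude relation applied once a moment
# estimate (e.g. from a W-phase inversion) is available:
#   Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m  (Kanamori's definition).
import math

def moment_magnitude(m0_newton_metres: float) -> float:
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# A seismic moment of roughly 4e22 N*m corresponds to about Mw 9.0:
print(round(moment_magnitude(4.0e22), 1))
```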

  17. Preliminary numerical simulations of the 27 February 2010 Chile tsunami: first results and hints in a tsunami early warning perspective

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Tonini, R.; Armigliato, A.; Zaniboni, F.; Pagnoni, G.; Gallazzi, Sara; Bressan, Lidia

    2010-05-01

    The tsunamigenic earthquake (M 8.8) that occurred offshore central Chile on 27 February 2010 can be classified as a typical subduction-zone earthquake. The effects of the ensuing tsunami were devastating along the Chilean coast, especially between the cities of Valparaiso and Talcahuano and in the Juan Fernandez islands. The tsunami propagated across the entire Pacific Ocean, hitting with variable intensity almost all the coasts facing the basin. While the far-field propagation was tracked quite well, almost in real time, by the warning centres and reasonably well reproduced by the forecast models, the loss of life and the severity of the damage caused by the tsunami in the near field occurred with no local alert or warning, sadly confirming that the protection of communities located close to tsunami sources is still an unresolved problem in the tsunami early warning field. The purpose of this study is twofold. First, we perform numerical simulations of the tsunami starting from different earthquake models, which we built on the basis of the preliminary seismic parameters (location, magnitude and focal mechanism) made available by the seismological agencies immediately after the event, or retrieved from more detailed and refined studies published online in the following days and weeks. The comparison with the available records of both offshore DART buoys and coastal tide gauges is used to place some preliminary constraints on the best-fitting fault model. The numerical simulations are performed by means of the finite-difference code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna, Italy, which can solve both the linear and non-linear versions of the shallow-water equations on nested grids. The second purpose of this study is to use the conclusions drawn in the first part in a tsunami early warning perspective. In the framework of the EU-funded project DEWS (Distant Early Warning System), we offer some points for discussion on the deficiencies of existing tsunami early warning concepts as regards warning the areas close to the tsunami source, and on the strategies that should be followed in the near future in order to make significant progress in the protection and safeguarding of local communities.
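
    For orientation, the sketch below integrates the 1-D linear shallow-water equations on a flat-bottom staggered grid. It is only an illustration of the linearized system, not the UBO-TSUFD code, which is 2-D, supports nonlinearity, and uses nested grids with real bathymetry.

```python
# Minimal 1-D linear shallow-water solver on a staggered grid, illustrating the
# kind of equations solved by tsunami propagation codes; flat bottom, closed ends.
import numpy as np

g, depth = 9.81, 4000.0              # gravity (m/s^2), uniform ocean depth (m)
nx, dx = 400, 2000.0                 # grid points, spacing (m)
dt = 0.5 * dx / np.sqrt(g * depth)   # time step respecting the CFL condition

eta = np.exp(-((np.arange(nx) - nx / 2) * dx / 50e3) ** 2)  # initial 1 m hump
u = np.zeros(nx + 1)                                        # velocities at cell faces

for _ in range(300):
    # momentum equation: du/dt = -g * d(eta)/dx
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity equation: d(eta)/dt = -depth * du/dx
    eta -= depth * dt / dx * (u[1:] - u[:-1])

print(f"max amplitude after propagation: {eta.max():.3f} m")  # ~0.5 m (split wave)
```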

  18. Implications Of The 11 March Tohoku Tsunami On Warning Systems And Vertical Evacuation Strategies

    NASA Astrophysics Data System (ADS)

    Fraser, S.; Leonard, G.; Johnston, D.

    2011-12-01

    The Mw 9.0 Tohoku earthquake and tsunami of March 11th 2011 claimed over 20,000 lives in an event which inundated over 500 km2 of land on the north-east coast of Japan. Successful execution of tsunami warning procedures and evacuation strategies undoubtedly saved thousands of lives, and there is evidence that vertical evacuation facilities were a key part of reducing the fatality rate in several municipalities in the Sendai Plains. As with all major disasters, however, post-event observations show that there are lessons to be learned in minimising life loss in future events. This event has raised or reinforced several key points that should be considered for implementation in all areas at risk from tsunami around the world. Primary areas for discussion are the need for redundant power supplies in tsunami warning systems; considerations of natural warnings when official warnings may not come; adequate understanding and estimation of the tsunami hazard; thorough site assessments for critical infrastructure, including emergency management facilities and tsunami refuges; and adequate signage of evacuation routes and refuges. This paper will present observations made on two field visits to the Tohoku region during 2011, drawing conclusions from field observations and discussions with local emergency officials. These observations will inform the enhancement of current tsunami evacuation strategies in New Zealand; it is believed discussion of these observations can also benefit continuing development of warning and evacuation strategies existing in the United States and elsewhere.

  19. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional Action, the National Tsunami Hazards Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas of tsunami preparedness: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Lead the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure models used in the inundation studies meet consistent, NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and, - Providing guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities, and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  20. Local Tsunami Warnings using GNSS and Seismic Data.

    NASA Astrophysics Data System (ADS)

    Hirshorn, B. F.

    2017-12-01

    Tsunami Warning Centers (TWCs) must issue warnings based on imperfect and limited data. Uncertainties increase in the near field, where a tsunami reaches the coastal populations closest to the causative earthquake in half an hour or less. In the absence of a warning, the usual advice is "When the ground shakes so severely that it's difficult to stand, move uphill and away from the coast." But what if the shaking is not severe? If, for example, the earthquake ruptures slowly (producing very little perceived shaking), this advice will fail. Unfortunately, these "tsunami earthquakes" are not rare: tsunamis from slow earthquakes off Nicaragua in 1992 and Java in 1994 and 2006 killed 179, 250 and 637 people, respectively, even though very few nearby coastal residents felt any strong ground shaking. TWCs must therefore warn the coastal populations closest to the causative earthquake, where over 80% of tsunami casualties typically occur, as soon as possible after earthquake rupture begins. The NWS Tsunami Warning Centers currently issue local Tsunami Warnings for the US West Coast, Hawaii, and the Puerto Rico - Virgin Islands region within 2-4 minutes after origin time. However, our initial short-period magnitude estimates saturate above about Mw 6.5, and Mwp underestimates Mw for events larger than about Mw 7.5 when using data in the 0 to 3 degree epicentral distance range, severely underestimating the danger of a potential tsunami in the near field. Coastal GNSS networks complement seismic monitoring networks and enable unsaturated estimates of Mw within 2-3 minutes of earthquake origin time. NASA/JPL, SIO, USGS, CWU, UCB and UW, with funding and guidance from NASA and leveraging the USGS-funded ShakeAlert development, have been working with the National Weather Service TWCs to incorporate real-time GNSS and seismogeodetic data into their operations. These data will soon provide unsaturated estimates of moment magnitude, Centroid Moment Tensor solutions, coseismic crustal deformation, and fault slip models within a few minutes after earthquake initiation. The seafloor deformation associated with the earthquake slip can then be used as an initial condition for an automatically generated tsunami propagation and coastal inundation model for coastal warnings.
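
    One way GNSS displacements give unsaturated magnitudes is through peak-ground-displacement (PGD) scaling; the sketch below inverts a regression of the published form, but the coefficients are illustrative placeholders rather than the values used operationally.

```python
# Sketch of the peak-ground-displacement (PGD) scaling approach that lets GNSS
# data give unsaturated magnitude estimates.  The coefficients below are
# illustrative placeholders for the published regression form
#   log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km)
# and should be replaced by the calibrated values of the operational system.
import math

A, B, C = -6.69, 1.50, -0.21   # assumed, for illustration only

def mw_from_pgd(pgd_cm: float, hypocentral_km: float) -> float:
    """Invert the scaling relation for Mw given observed PGD and distance."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypocentral_km))

# e.g. 100 cm of peak displacement observed 100 km from the hypocenter:
print(round(mw_from_pgd(100.0, 100.0), 1))   # ~8
```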

  1. The role of integrating natural and social science concepts for risk governance and the design of people-centred early warning systems. Case study from the German-Indonesian Tsunami Early Warning System Project (GITEWS)

    NASA Astrophysics Data System (ADS)

    Gebert, Niklas; Post, Joachim

    2010-05-01

    The development of early warning systems is one of the key domains of adaptation to global environmental change and contributes substantially to the development of societal reaction and adaptive capacities to deal with extreme events. Indonesia, in particular, is highly exposed to tsunamis: on average, small and medium-size tsunamis occur in the region every three years, causing damage and death. In the aftermath of the Indian Ocean Tsunami of 2004, the German and Indonesian governments agreed on a joint cooperation to develop a People Centered End-to-End Early Warning System (GITEWS). The analysis of risk and vulnerability, as an important step in risk (and early warning) governance, is a precondition for the design of effective early warning structures: it delivers the knowledge base for developing institutionalized quick-response mechanisms for the organizations involved in issuing a tsunami warning and for the exposed population to react to warnings and to manage evacuation before the first tsunami wave hits. Thus, a special challenge for developing countries is the governance of complex cross-sectoral and cross-scale institutional, social and spatial processes and requirements for the conceptualization, implementation and optimization of a people-centred tsunami early warning system. In support of this, the risk and vulnerability assessment of the case study aims at identifying the factors that constitute the causal structure of the (dis)functionality between the technological warning system and the social response system that causes loss of life during an emergency: Which social groups are likely to be less able to receive and respond to an early warning alert? And are people able to evacuate in due time? Only an interdisciplinary research approach is capable of analyzing the socio-spatial and environmental conditions of vulnerability and risk and of producing valuable results for decision makers and civil society to manage tsunami risk in the early warning context. This requires the integration of natural/spatial and social science concepts, methods and data. For example, a scenario-based approach to tsunami inundation modeling was developed to provide decision makers with options to decide up to what level they aim to protect their people and territory; household surveys were conducted for the spatial analysis of the evacuation preparedness of the population as a function of place-specific hazard, risk, warning and evacuation perception; remote sensing was applied for the spatial (land-use) analysis of the socio-physical conditions of a city and region for evacuation; and existing social/population statistics were combined with land-use data for the precise spatial mapping of the population exposed to tsunami risk. Only by utilizing such a comprehensive assessment approach can valuable information for risk governance be generated. The results are mapped using GIS and designed according to the specific needs of different end-users, such as public authorities involved in the design of warning dissemination strategies, land-use planners (shelter planning, road network configuration) and NGOs mandated to provide education for the general public on tsunami risk and evacuation behavior.
The case study of the city of Padang, Indonesia (one of the pilot areas of GITEWS) clearly shows that only by intersecting social (vulnerability) and natural hazards research can a comprehensive picture of tsunami risk be provided, with which risk governance in the early warning context can be conducted in a comprehensive, systemic and sustainable manner.

  2. Preliminary Report Summarizes Tsunami Impacts and Lessons Learned from the September 7, 2017, M8.1 Tehuantepec Earthquake

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Ramirez-Herrera, M. T.; Dengler, L. A.; Miller, K.; LaDuke, Y.

    2017-12-01

    The preliminary tsunami impacts from the September 7, 2017, M8.1 Tehuantepec Earthquake have been summarized in the following report: https://www.eeri.org/wp-content/uploads/EERI-Recon-Rpt-090717-Mexico-tsunami_fn.pdf. Although the tsunami impacts were not as significant as those from the earthquake itself (98 fatalities and 41,000 homes damaged), the following are highlights and lessons learned: The Tehuantepec earthquake was one of the largest down-slab normal faulting events ever recorded. This situation complicated the tsunami forecast since forecast methods and pre-event modeling are primarily associated with megathrust earthquakes where the most significant tsunamis are generated. Adding non-megathrust source modeling to the tsunami forecast databases of conventional warning systems should be considered. Offshore seismic and tsunami hazard analyses using past events should incorporate the potential for large earthquakes occurring along sources other than the megathrust boundary. From an engineering perspective, initial reports indicate there was only minor tsunami damage along the Mexico coast. There was damage to Marina Chiapas where floating docks overtopped their piles. Increasing pile heights could reduce the potential for damage to floating docks. Tsunami warning notifications did not get to the public in time to assist with evacuation. Streamlining the messaging in Mexico from the warning system directly to the public should be considered. And, for local events, preparedness efforts should place emphasis on responding to feeling the earthquake and not waiting to be notified. Although the U.S. tsunami warning centers were timely with their international and domestic messaging, there were some issues with how those messages were presented and interpreted. The use of a "Tsunami Threat" banner on the new main warning center website created confusion with emergency managers in the U.S. where no tsunami threat was expected to exist. Also, some U.S. states and territories in the Pacific were listed in both domestic and international messages, which caused confusion for American Samoa where these messages contained somewhat conflicting information. These issues are being addressed by the warning centers and the U.S. National Tsunami Hazard Mitigation Program.

  3. REWSET: A prototype seismic and tsunami early warning system in Rhodes island, Greece

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Gerasimos; Argyris, Ilias; Aggelou, Savvas; Karastathis, Vasilis

    2014-05-01

    Tsunami warning in near-field conditions is a critical issue in the Mediterranean Sea, since the most important tsunami sources lie within tsunami travel times starting from about five minutes. The project NEARTOWARN (2012-2013), supported by EU DG ECHO, contributed substantially to the development of new tools for near-field tsunami early warning in the Mediterranean. One of the main achievements is the development of a local warning system at the test site of Rhodes island (Rhodes Early Warning System for Earthquakes and Tsunamis, REWSET). The system is composed of three main subsystems: (1) a network of eight seismic early warning devices installed in four localities on the island, including the civil protection office, the Fire Brigade, and two municipality buildings; (2) two radar-type (ultrasonic) tide gauges installed in the eastern coastal zone of the island, which was selected because research on historical earthquake and tsunami activity has indicated that the most important near-field tsunami sources are situated offshore to the east of Rhodes; and (3) a crisis Geographic Management System (GMS), a web-based, GIS-based application incorporating a variety of thematic maps and other information types. The seismic early warning devices are activated by strong earthquakes (magnitude around 6 or more) occurring at distances up to about 100 km from Rhodes, providing immediate mobilization of civil protection. The tide gauges transmit sea-level data, while during a crisis the GMS supports decision-making by civil protection. In the near future it is planned that REWSET will be integrated with national and international systems. REWSET is a prototype that could certainly be deployed in other coastal areas of the Mediterranean and beyond.
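
    The activation logic for the seismic devices, as described, reduces to a simple threshold test; a sketch of that test follows, with thresholds taken from the abstract and an implementation that is purely hypothetical.

```python
# Hedged sketch of the activation criterion described for the seismic early
# warning devices: strong earthquakes (around magnitude 6 or more) within
# about 100 km of Rhodes trigger immediate civil-protection mobilization.
def should_trigger(magnitude: float, distance_km: float,
                   mag_threshold: float = 6.0, dist_threshold_km: float = 100.0) -> bool:
    return magnitude >= mag_threshold and distance_km <= dist_threshold_km

print(should_trigger(6.3, 80.0))   # True  -> mobilize civil protection
print(should_trigger(5.4, 60.0))   # False -> below magnitude threshold
```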

  4. How soon is too soon? When to cancel a warning after a damaging tsunami

    NASA Astrophysics Data System (ADS)

    Fryer, G. J.; Becker, N. C.; Wang, D.; Weinstein, S.; Richards, K.

    2012-12-01

    Following an earthquake, a tsunami warning center (TWC) must determine whether a coastal evacuation is necessary, and must do so fast enough for the warning to be useful to affected coastlines. Once a damaging tsunami has arrived, the TWC must decide when to cancel its warning, a task often more challenging than the initial hazard assessment. Here we demonstrate the difficulties by investigating the impact of the Tohoku tsunami of 11 March 2011 on the State of Hawaii, which relies on the Pacific Tsunami Warning Center (PTWC) for tsunami hazard guidance. PTWC issued a Tsunami Watch for Hawaii at 1956 HST on 10 March (10 minutes after the earthquake) and upgraded it to a Tsunami Warning at 2131 HST. The tsunami arrived in Hawaii just before 0300 HST the next day, reached a maximum runup of over 5 m, and did roughly $50 million in damage throughout the state. PTWC downgraded the Warning to an Advisory at 0730 HST and cancelled the Advisory at 1140 HST. The timing of the downgrade was appropriate: by then it was safe for coastal residents to re-enter the evacuation zone, but not to enter the water. In retrospect, however, PTWC cancelled its Advisory too early. By late morning tide gauges throughout the state had all registered maximum wave heights of 30 cm or less for a couple of hours, so PTWC cancelled. The Center was unaware, however, of ocean behavior at locations without instruments. At Ma'alaea Harbor on the island of Maui, for example, sea-level oscillations exposed the harbor bottom every 20 minutes for several hours after the cancellation. At Waikiki on Oahu, lifeguards rescued 25 swimmers (who had either ignored or were unaware of the cancellation message's caution about hazardous currents) in the hours after the cancellation and performed CPR on one near-drowning victim. Fortunately, there were no deaths. Because of dangerous surges, ocean safety officials closed Hanauma Bay, a popular snorkeling spot on Oahu, for a full day after the tsunami hit. They reassessed the bay the following morning just as waves reflected from South America started to arrive (36 hours after the earthquake), and prudently chose to keep the bay closed for two further days. The Tohoku tsunami showed that resonances and trapped waves in shallow water can last for many hours and that energy reflected from distant shorelines can rejuvenate them. PTWC's real-time simulation of the tsunami, including animation of its propagation, now helps to identify which reflections will be most troublesome and should permit the Center to specify in advance how long a Warning should remain in effect. The current open-ended warnings, which specify when the tsunami will arrive but not how long the Warning should last, should be replaced with warnings active for a specified time ("until 3 a.m. tomorrow"), with PTWC adjusting the projected cancellation time based on coastal sea-level observations. Such warnings should greatly reduce public misconceptions, and state and local government expectations, about how long the hazard will last. The National Weather Service, parent agency of the US TWCs, already issues weather Warnings and Advisories active for specific durations, so this message format is already familiar to both the public and emergency managers.

  5. Emergency management response to a warning-level Alaska-source tsunami impacting California: Chapter J in The SAFRR (Science Application for Risk Reduction) Tsunami Scenario

    USGS Publications Warehouse

    Miller, Kevin M.; Long, Kate

    2013-01-01

    This chapter is directed towards two audiences. First, it targets non-emergency-management readers, providing them with insight into the process and challenges facing emergency managers in responding to a tsunami Warning, particularly given this “short fuse” scenario. It is called “short fuse” because there is only a 5.5-hour window following the earthquake before arrival of the tsunami within which to evaluate the threat, disseminate alert and warning messages, and respond. This initiates a period when crisis communication is of paramount importance. An additional dynamic that is important to note is that within 15 minutes of the earthquake, the National Oceanic and Atmospheric Administration (NOAA) and the National Weather Service (NWS) will issue alert bulletins for the entire Pacific Coast. This is one-half the time actually presented by recent tsunamis from Japan, Chile, and Samoa. Second, the chapter provides emergency managers at all levels with insights into key considerations they may need to address in order to augment their existing plans and effectively respond to tsunami events. We look at emergency management response to the tsunami threat from three perspectives: “Top Down” (threat analysis and Alert/Warning information from the Federal agency charged with Alert and Warning); “Bottom Up” (emergency management’s Incident Command approach to responding to emergencies and disasters based on the needs of impacted local jurisdictions); and “Across Time” (from the initiating earthquake event through emergency response). We focus on these questions: What are the government roles, relationships, and products that support Tsunami Alert and Warning dissemination? (Emergency planning and preparedness.) What roles, relationships, and products support emergency management response to Tsunami Warning and impact? (Engendering prudent public safety response.) What are the key emergency management activities, considerations, and challenges brought out by the SAFRR tsunami scenario? (Real emergencies.) How do these activities, considerations, and challenges play out as the tsunami event unfolds across the “life” of the event? (Lessons.)

  6. Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Satake, K.

    2012-12-01

    The March 11, 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history and the best-recorded subduction-zone earthquake in the world. In particular, various offshore geophysical observations revealed large horizontal and vertical seafloor movements, and the tsunami was recorded on high-quality, high-sampling-rate gauges. Analysis of these tsunami waveforms yields the temporal and spatial slip distribution of the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both the deep and shallow parts of the plate interface. Very large slip (~25 m) off Miyagi on the deep part of the plate interface corresponds to an interplate earthquake of M 8.8, similar in location and size to the 869 Jogan earthquake model, and was responsible for the large tsunami inundation of the Sendai and Ishinomaki plains. Huge slip of more than 50 m occurred on the shallow part near the trench axis ~3 min after the earthquake origin time. This delayed shallow rupture (M 8.8) was similar to the 1896 "tsunami earthquake" and was responsible for the large tsunami on the northern Sanriku coast, measured ~100 km north of the largest slip. Thus the Tohoku earthquake can be decomposed into an interplate earthquake and a triggered "tsunami earthquake." The Japan Meteorological Agency issued a tsunami warning 3 minutes after the earthquake and saved many lives. However, its initial estimate of tsunami height was too low, because the earthquake magnitude was initially estimated as M 7.9; hence the computed tsunami heights were lower. JMA is working to improve the tsunami warning system, including technical developments to estimate the earthquake size within a few minutes by using varied and redundant information, to deploy and utilize offshore tsunami observations, and to issue warnings based on the worst-case scenario if the possibility of a giant earthquake exists. Predicting the triggering of another large earthquake remains a challenge. Tsunami hazard assessments and long-term earthquake forecasts have not considered such triggering or the simultaneous occurrence of different types of earthquakes. The large tsunami at the Fukushima nuclear power station was due to the combination of the deep and shallow slip. Disaster prevention for low-frequency but large-scale hazards must be considered. The Japanese government has established a general policy for two levels of tsunami countermeasures: L1 and L2. L2 tsunamis are the largest possible tsunamis, with a low frequency of occurrence but devastating consequences when they occur. For such events, saving people's lives is the first priority, and soft measures such as tsunami hazard maps, evacuation facilities and disaster education will be prepared. L1 tsunamis are expected to occur more frequently, typically once in a few decades, and for these hard countermeasures such as breakwaters must be prepared to protect the lives and property of residents as well as economic and industrial activities.
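
    A quick arithmetic check of the decomposition into two roughly M 8.8 sub-events is shown below, using the standard moment-magnitude relation; it is illustrative only.

```python
# Worked check of the decomposition described above: two sub-events of about
# Mw 8.8 (deep interplate slip plus delayed shallow slip) sum, in seismic
# moment, to roughly the observed Mw 9.0.  Uses Mw = (2/3)(log10 M0 - 9.1).
import math

def moment_from_mw(mw: float) -> float:
    return 10 ** (1.5 * mw + 9.1)

def mw_from_moment(m0: float) -> float:
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

combined = moment_from_mw(8.8) + moment_from_mw(8.8)
print(round(mw_from_moment(combined), 1))   # ~9.0
```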

  7. Rapid estimate of earthquake source duration: application to tsunami warning.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique; Jamelot, Anthony; Hyvernaud, Olivier

    2016-04-01

    We present a method for estimating the source duration of the fault rupture, based on the high-frequency envelope of teleseismic P waves and inspired by the original work of Ni et al. (2005). The main interest of this seismic parameter is to detect abnormally slow ruptures, which are characteristic of so-called "tsunami earthquakes" (Kanamori, 1972). Source durations estimated by this method are validated against two independent methods: the duration obtained by W-phase inversion (Kanamori and Rivera, 2008; Duputel et al., 2012) and the duration calculated by the SCARDEC process, which determines the source time function (Vallée et al., 2011). The estimated source duration is also compared with the slowness discriminant defined by Newman and Okal (1998), which is calculated routinely for all earthquakes detected by our tsunami warning process (named PDFM2, Preliminary Determination of Focal Mechanism; Clément and Reymond, 2014). From the operational tsunami warning point of view, numerical tsunami simulations depend strongly on the source estimate: the better the source estimate, the better the tsunami forecast. The source duration is not injected directly into the numerical tsunami simulations, because the kinematics of the source are presently ignored (Jamelot and Reymond, 2015). But in the case of a tsunami earthquake occurring in the shallower part of a subduction zone, we have to consider a source in a medium of low rigidity modulus; consequently, for a given seismic moment, the source dimensions will decrease while the slip increases, like a "compact" source (Okal and Hébert, 2007). Conversely, a rapid "snappy" earthquake with poor tsunami excitation will be characterized by a higher rigidity modulus and will produce weaker displacement and smaller source dimensions than a "normal" earthquake. References: Clément, J. and Reymond, D. (2014). New tsunami forecast tools for the French Polynesia tsunami warning system. Pure Appl. Geophys. 171. Duputel, Z., Rivera, L., Kanamori, H. and Hayes, G. (2012). W phase source inversion for moderate to large earthquakes. Geophys. J. Int. 189, 1125-1147. Kanamori, H. (1972). Mechanism of tsunami earthquakes. Phys. Earth Planet. Inter. 6, 246-259. Kanamori, H. and Rivera, L. (2008). Source inversion of W phase: speeding up seismic tsunami warning. Geophys. J. Int. 175, 222-238. Newman, A. and Okal, E. (1998). Teleseismic estimates of radiated seismic energy: the E/M0 discriminant for tsunami earthquakes. J. Geophys. Res. 103, 26885-26898. Ni, S., Kanamori, H. and Helmberger, D. (2005). Energy radiation from the Sumatra earthquake. Nature 434, 582. Okal, E. A. and Hébert, H. (2007). Far-field modeling of the 1946 Aleutian tsunami. Geophys. J. Int. 169, 1229-1238. Vallée, M., Charléty, J., Ferreira, A. M. G., Delouis, B. and Vergoz, J. (2011). SCARDEC: a new technique for the rapid determination of seismic moment magnitude, focal mechanism and source time functions for large earthquakes using body wave deconvolution. Geophys. J. Int. 184, 338-358.
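
    For orientation, the slowness discriminant of Newman and Okal (1998) can be written as Theta = log10(E/M0); the sketch below computes it and flags strongly deficient values. The flag threshold is an assumption for illustration, not an operational setting.

```python
# Sketch of the slowness discriminant mentioned above: Theta = log10(E / M0),
# where E is the radiated seismic energy (J) and M0 the seismic moment (N*m).
# Typical earthquakes cluster near Theta ~ -4.9; markedly deficient values
# (around -5.5 or lower) have been associated with slow "tsunami earthquakes".
# The threshold below is an assumption for illustration only.
import math

def slowness_theta(radiated_energy_j: float, seismic_moment_nm: float) -> float:
    return math.log10(radiated_energy_j / seismic_moment_nm)

def looks_like_tsunami_earthquake(theta: float, threshold: float = -5.5) -> bool:
    return theta <= threshold

theta = slowness_theta(radiated_energy_j=1.0e15, seismic_moment_nm=5.0e20)
print(round(theta, 2), looks_like_tsunami_earthquake(theta))   # -5.7 True
```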

  8. The EarthScope Plate Boundary Observatory and allied networks, the makings of nascent Earthquake and Tsunami Early Warning System in Western North America.

    NASA Astrophysics Data System (ADS)

    Mattioli, Glen; Mencin, David; Hodgkinson, Kathleen; Meertens, Charles; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl

    2016-04-01

    The NSF-funded GAGE Facility, managed by UNAVCO, operates approximately 1300 GNSS stations distributed across North and Central America and in the circum-Caribbean. Following community input starting in 2011 from several workshops and associated reports, UNAVCO has been exploring ways to increase the capability and utility of the geodetic resources under its management to improve our understanding in diverse areas of geophysics, including the properties of seismic, volcanic, magmatic and tsunami deformation sources. Networks operated by UNAVCO for the NSF have the potential to profoundly transform our ability to rapidly characterize events, provide rapid warning, and improve hazard mitigation and response. Specific applications currently under development include earthquake early warning, tsunami early warning, and tropospheric modeling with university, commercial, non-profit and government partners on national and international scales. In the case of tsunami early warning, for example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of earthquake sources and associated deformation, which leads to the initial model of ocean forcing and tsunami generation. In addition, terrestrial GNSS can provide direct measurements of the tsunami through the associated traveling ionospheric disturbance from several hundreds of kilometers away as the waves approach the shoreline, which can be used to refine tsunami inundation models. Any operational system like this involves multiple communities that rely on a pan-Pacific real-time open data set. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Combining existing data sets and user communities, for example seismic data and tide gauge observations, with GNSS and meteorological data products has proven complicated because of issues related to metadata, appropriate data formats, real-time data quality assessment, and other issues related to using these products in operational forecasting. While progress has been made toward more open and free data access across national borders and toward more cooperation among cognizant government-sanctioned "early warning" agencies, some impediments remain, making a truly operational system a work in progress. Accordingly, UNAVCO has embarked on significant improvements, and improvement goals, to the original infrastructure and scope of the PBO. We anticipate that the PBO and related networks will form a backbone for these disparate efforts, providing high-quality, low-latency raw and processed GNSS data. This requires substantial upgrades to the entire system, from the basic GNSS receiver, through robust data collection, archiving and open distribution mechanisms, to efficient data-processing strategies. UNAVCO is currently in partnership with commercial and scientific stakeholders to define, develop and deploy all segments of this improved geodetic network. We present the overarching goals and the current and planned future state of this international resource.

  9. Tsunami.gov: NOAA's Tsunami Information Portal

    NASA Astrophysics Data System (ADS)

    Shiro, B.; Carrick, J.; Hellman, S. B.; Bernard, M.; Dildine, W. P.

    2014-12-01

    We present the new Tsunami.gov website, which delivers a single authoritative source of tsunami information for the public and emergency management communities. The site efficiently merges information from NOAA's Tsunami Warning Centers (TWCs) by way of a comprehensive XML feed called Tsunami Event XML (TEX). The resulting unified view allows users to quickly see the latest tsunami alert status in geographic context without having to understand the complex TWC areas of responsibility. The new site provides for the creation of a wide range of products beyond the traditional ASCII-based tsunami messages. The publication of modern formats such as the Common Alerting Protocol (CAP) can drive geographically aware emergency alert systems like FEMA's Integrated Public Alert and Warning System (IPAWS). Other popular information delivery systems are also supported, including email, text messaging, and social media updates. The Tsunami.gov portal allows NOAA staff to easily edit content and provides the facility for users to customize their viewing experience. In addition to access by the public, emergency managers and government officials may be offered the capability to log into the portal for special access rights to decision-making and administrative resources relevant to their respective tsunami warning systems. The site follows modern HTML5 responsive design practices for optimized use on mobile as well as non-mobile platforms. It meets all federal security and accessibility standards. Moving forward, we hope to expand Tsunami.gov to encompass tsunami-related content currently offered on separate websites, including the NOAA Tsunami Website, the National Tsunami Hazard Mitigation Program, the NOAA Center for Tsunami Research, the National Geophysical Data Center's Tsunami Database, and the National Data Buoy Center's DART Program. This project is part of the larger Tsunami Information Technology Modernization Project, which is consolidating the software architectures of NOAA's existing TWCs into a single system. We welcome your feedback to help Tsunami.gov become an effective public resource for tsunami information and a medium to enable better global tsunami warning coordination.
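
    To make the alerting-format discussion concrete, the sketch below builds a minimal CAP 1.2 message with placeholder content; it is not an actual PTWC or TEX product, and the identifier, sender and area values are invented.

```python
# Minimal, illustrative CAP 1.2 message built with the standard library, of the
# kind a geographically aware alerting pipeline (e.g. one feeding IPAWS) could
# consume.  All content values are placeholders.
from datetime import datetime, timezone
from xml.etree import ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_cap_alert(headline: str, area_desc: str) -> str:
    alert = ET.Element("alert", xmlns=CAP_NS)
    ET.SubElement(alert, "identifier").text = "EXAMPLE-2014-0001"      # placeholder
    ET.SubElement(alert, "sender").text = "example@tsunami.gov"        # placeholder
    ET.SubElement(alert, "sent").text = datetime.now(timezone.utc).isoformat()
    ET.SubElement(alert, "status").text = "Exercise"
    ET.SubElement(alert, "msgType").text = "Alert"
    ET.SubElement(alert, "scope").text = "Public"
    info = ET.SubElement(alert, "info")
    ET.SubElement(info, "category").text = "Geo"
    ET.SubElement(info, "event").text = "Tsunami Warning"
    ET.SubElement(info, "urgency").text = "Immediate"
    ET.SubElement(info, "severity").text = "Extreme"
    ET.SubElement(info, "certainty").text = "Observed"
    ET.SubElement(info, "headline").text = headline
    area = ET.SubElement(info, "area")
    ET.SubElement(area, "areaDesc").text = area_desc
    return ET.tostring(alert, encoding="unicode")

print(build_cap_alert("Tsunami Warning (exercise)", "Example coastal zone"))
```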

  10. Using Multi-Scenario Tsunami Modelling Results combined with Probabilistic Analyses to provide Hazard Information for the South-West Coast of Indonesia

    NASA Astrophysics Data System (ADS)

    Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.

    2009-04-01

    Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent undersea earthquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis will also occur in the near future, owing to increased tectonic stress leading to abrupt vertical seafloor displacement after a century of relative tectonic quiescence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For a tsunami impact, the hazard assessment is mostly based on numerical modelling, because the model results normally offer the most precise database for a hazard analysis: they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is usually chosen by a worst-case approach: the location and magnitude that are likely to occur and that are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami modelling approach was developed to provide a reliable hazard assessment covering large areas. For the Indonesian early warning system, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves, caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast using the multi-scenario approach is to overlay all scenario inundation results and to determine how often a point on land is significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence a statistical analysis of historical data and geophysical investigation results is added to the hazard assessment, which clearly improves its significance. For this purpose the present method was developed; it contains a logical combination of the various probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability that any point on land will be hit by a tsunami.
The values are combined by a logical-tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before. This results in a tsunami inundation probability map covering the south-west coast of Indonesia which nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
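
    A much-simplified version of the idea of weighting overlapping scenarios by their source likelihoods is sketched below; it assumes independent sources and is not the logical-tree method used in the project.

```python
# Simplified illustration of combining overlapping scenario results into an
# inundation probability for one point on land.  Each scenario carries a
# probability p of its source occurring and a flag saying whether it inundates
# the point; assuming independent sources, the combined probability is
# 1 - prod(1 - p) over the inundating scenarios.
from math import prod

def inundation_probability(scenarios):
    """scenarios: iterable of (source_probability, inundates_point: bool)."""
    return 1.0 - prod(1.0 - p for p, hits in scenarios if hits)

scenarios = [(0.02, True), (0.05, False), (0.01, True), (0.03, True)]
print(f"{inundation_probability(scenarios):.3f}")   # ~0.059
```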

  11. Assessment of the Initial Response from Tsunami Monitoring Services Provided to the Northeastern Caribbean

    NASA Astrophysics Data System (ADS)

    Soto-Cordero, L.; Meltzer, A.

    2014-12-01

    A magnitude 6.4 earthquake offshore northern Puerto Rico earlier this year (13 January 2014) is a reminder of the high risk of earthquakes and tsunamis in the northeastern Caribbean. Had the magnitude of this event been 0.1 larger (M 6.5), a tsunami warning would have been issued for the Puerto Rico-Virgin Islands (PRVI) region based on the West Coast and Alaska Tsunami Warning Center (WCATWC) and Puerto Rico Seismic Network (PRSN) response procedures at the time. Such an alert level would have led local authorities to issue evacuation orders for all PRVI coastal areas. Since the number of deaths associated with tsunamis in the Caribbean region is greater than the total casualties from tsunamis in the entire US (including the Hawaii and Alaska coasts), having an effective and redundant warning system is critical in order to save lives and to minimize false alarms that could result in significant economic costs and loss of confidence among Caribbean residents. We are evaluating three fundamental components of the tsunami monitoring protocols currently in place in the northeastern Caribbean: 1) preliminary earthquake parameters (used to determine the potential that a tsunami will be generated and the basis of tsunami alert levels), 2) adequacy of the tsunami alert levels, and 3) tsunami message dissemination. We compiled a catalog of earthquake locations (2007-2014) and dissemination times from the PTWC, WCATWC and NEIC (final locations). The events were classified into three categories: local [17°-20°N, 63.5°-69°W], regional (Caribbean basin) and distant/teleseismic (Atlantic basin). A total of 104 local earthquakes, 31 regional and 25 distant events were analyzed. We found that, in general, preliminary epicentral locations are accurate to within 40 km; 64% of local events were located with an accuracy of 20 km. For local events shallower than 50 km and for regional and distant earthquakes, depth errors are usually smaller than 30 km. For deeper local events the error distribution shows more variability (-32 to 81 km), and preliminary locations tend to underestimate depth. A trade-off between epicentral location and depth was observed for several local events deeper than 50 km.

  12. 2011 Tohoku, Japan tsunami data available from the National Oceanic and Atmospheric Administration/National Geophysical Data Center

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Mccullough, H. L.; Mungov, G.; Harris, E.

    2012-12-01

    The U.S. National Oceanic and Atmospheric Administration (NOAA) has primary responsibility for providing tsunami warnings to the Nation, and a leadership role in tsunami observations and research. A key component of this effort is easy access to authoritative data on past tsunamis, a responsibility of the National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics. Archive responsibilities include the global historical tsunami database, coastal tide-gauge data from US/NOAA-operated stations, Deep-ocean Assessment and Reporting of Tsunamis (DART®) data, damage photos, and other related hazards data. Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Understanding the severity and timing of tsunami effects is important for tsunami hazard mitigation and warning. The global historical tsunami database includes the date, time, and location of the source event, the magnitude of the source, event validity, maximum wave height, the total number of fatalities and dollar damage. The database contains additional information on run-ups (locations where tsunami waves were observed by eyewitnesses, field reconnaissance surveys, tide gauges, or deep-ocean sensors). The run-up table includes arrival times, distance from the source, measurement type, maximum wave height, and the number of fatalities and damage for the specific run-up location. Tide gauge data are required for modeling the interaction of tsunami waves with the coast and for verifying propagation and inundation models. NGDC is the long-term archive for all NOAA coastal tide gauge data and is currently archiving 15-second to 1-minute water level data from the NOAA Center for Operational Oceanographic Products and Services (CO-OPS) and the NOAA Tsunami Warning Centers. DART® buoys, which are essential components of tsunami warning systems, are now deployed in all oceans, giving coastal communities faster and more accurate tsunami warnings. NOAA's National Data Buoy Center disseminates real-time DART® data, and NGDC processes and archives post-event 15-second high-resolution bottom pressure time series data. An event-specific archive of DART® observations recorded during recent significant tsunamis, including the March 2011 Tohoku, Japan event, is now available through new tsunami event pages integrated with the NGDC global historical tsunami database. These pages are designed to deliver comprehensive summaries of each tsunami event, including socio-economic impacts, tsunami travel time maps, raw observations, de-tided residuals, spectra of the tsunami signal compared to the energy of the background noise, and wavelets. These data are invaluable to tsunami researchers and educators, as they are essential to providing a more thorough understanding of tsunamis, their propagation in the open ocean, and their subsequent inundation of coastal communities. NGDC has collected 289 tide gauge observations, 34 DART® and bottom pressure recorder (BPR) station observations, and over 5,000 eyewitness reports and post-tsunami field survey measurements for the 2011 Tohoku event.
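
    As an illustration of how de-tided residuals can be produced, the sketch below high-pass filters a synthetic 1-minute water-level record. The cutoff, filter order and data are assumptions, and operational processing may instead subtract a predicted tide.

```python
# Sketch of producing "de-tided residuals" like those in the event archive:
# high-pass filter a 1-minute water-level series so that tidal periods are
# removed and the shorter-period tsunami signal remains.  Illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 60.0                      # sampling frequency: one sample per minute (Hz)
cutoff_hz = 1.0 / (3.0 * 3600.0)     # remove periods longer than ~3 hours (assumed)

t = np.arange(0, 2 * 24 * 3600, 60.0)                       # two days of samples (s)
tide = 0.8 * np.sin(2 * np.pi * t / (12.42 * 3600))         # synthetic M2 tide (m)
tsunami = 0.3 * np.sin(2 * np.pi * t / (20 * 60)) * (t > 24 * 3600)  # arrives on day 2
water_level = tide + tsunami

b, a = butter(4, cutoff_hz, btype="high", fs=fs)
residual = filtfilt(b, a, water_level)                       # de-tided series
print(f"residual peak: {residual.max():.2f} m")              # ~0.3 m tsunami signal
```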

  13. On the importance of risk knowledge for an end-to-end tsunami early warning system

    NASA Astrophysics Data System (ADS)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    Warning systems commonly use information provided by networks of sensors that monitor and detect impending disasters, aggregate and condense this information into a reliable basis on which a decision maker can decide whether or not to warn, disseminate the warning message, and provide this information to people at risk. The ultimate aim is to enable those in danger to make decisions (e.g., initiate protective actions for buildings) and to take action to save their lives. This involves very complex issues when considering all four elements of early warning systems (UNISDR-PPEW), namely (1) risk knowledge, (2) monitoring and warning service, (3) dissemination and communication, and (4) response capability, with the ultimate aim of gaining as much time as possible to empower individuals and communities to act in an appropriate manner to reduce injury, loss of life, damage to property and the environment, and loss of livelihoods. Most warning systems concentrate their strengths and attention on the technical/structural dimension (monitoring and warning service, dissemination tools), with weaknesses and less attention on the social/cultural dimension (e.g., human response capabilities, a defined warning chain, and people knowing what to do). Also, the use of risk knowledge in early warning is most often treated in a theoretical manner (knowing that it is somehow important) rather than in an operational, practical sense. Risk assessments and risk maps help to motivate people, prioritise early warning system needs, and guide preparations for response and disaster prevention activities. Beyond this, risk knowledge can be seen as a tie between national-level early warning and community-level reaction schemes. This presentation focuses on results, key findings and lessons learnt related to tsunami risk assessment in the context of early warning within the GITEWS (German-Indonesian Tsunami Early Warning System) project. Here a novel methodology reflecting risk information needs in the early warning context has been worked out. The generated results contribute significantly in the fields of (1) warning decisions and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief, and (6) enhancing communities' awareness of and preparedness for tsunami threats. Additionally, examples will be given of the potential for operational use of risk information in early warning systems, based on first experiences with the tsunami early warning centre in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g., linking warning message information on expected intensity with the respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that the practical use of risk information is an important and indispensable component of end-to-end early warning.

  14. Tsunami Risk for the Caribbean Coast

    NASA Astrophysics Data System (ADS)

    Kozelkov, A. S.; Kurkin, A. A.; Pelinovsky, E. N.; Zahibo, N.

    2004-12-01

    The tsunami problem for the coast of the Caribbean basin is discussed. The historical data on tsunamis in the Caribbean Sea are briefly presented. Numerical simulation of potential tsunamis in the Caribbean Sea is performed in the framework of nonlinear shallow-water theory. The tsunami wave height distribution along the Caribbean coast is computed. These results are used to estimate the far-field tsunami potential of various coastal locations in the Caribbean Sea. Five zones with low tsunami risk are identified on the basis of the prognostic computations: the bay "Golfo de Batabano" and the coast of the province "Ciego de Avila" in Cuba, the Nicaraguan coast (between Bluefields and Puerto Cabezas), the border between Mexico and Belize, and the bay "Golfo de Venezuela" in Venezuela. The analysis of historical data confirms that no tsunamis have been observed in the selected zones. The wave attenuation in the Caribbean Sea is also investigated; wave amplitude decreases by an order of magnitude when the tsunami source is located at a distance of up to 1000 km from the coastal location. Both factors, wave attenuation and wave height distribution, should be taken into account in the planned warning system for the Caribbean Sea.

  15. Fast Simulation of Tsunamis in Real Time

    NASA Astrophysics Data System (ADS)

    Fryer, G. J.; Wang, D.; Becker, N. C.; Weinstein, S. A.; Walsh, D.

    2011-12-01

    The U.S. Tsunami Warning Centers primarily base their wave height forecasts on precomputed tsunami scenarios, such as the SIFT model (Short-term Inundation Forecasting for Tsunamis) developed by NOAA's Center for Tsunami Research. In SIFT, tsunami simulations for about 1600 individual earthquake sources, each 100x50 km, define shallow subduction worldwide. These simulations are stored in a database and combined linearly to make up the tsunami from any great earthquake. Precomputation is necessary because the nonlinear shallow-water wave equations are too time-consuming to compute during an event. While such scenario-based models are valuable, they tacitly assume all energy in a tsunami comes from thrust at the décollement. The thrust assumption is often violated (e.g., 1933 Sanriku, 2007 Kurils, 2009 Samoa), while a significant number of tsunamigenic earthquakes are completely unrelated to subduction (e.g., 1812 Santa Barbara, 1939 Accra, 1975 Kalapana). Finally, parts of some subduction zones are so poorly defined that precomputations may be of little value (e.g., 1762 Arakan, 1755 Lisbon). For all such sources, a fast means of estimating tsunami size is essential. At the Pacific Tsunami Warning Center, we have been using our model RIFT (Real-time Inundation Forecasting of Tsunamis) experimentally for two years. RIFT is fast by design: it solves only the linearized form of the equations. At 4-arc-minute resolution, calculations for the entire Pacific take just a few minutes on an 8-processor Linux box. Part of the rationale for developing RIFT was to handle earthquakes of M 7.8 or smaller, which approach the lower limit of the abilities of the more complex SIFT. For such events we currently issue a fixed warning to areas within 1,000 km of the source, which typically means a lot of over-warning. With sources defined by W-phase CMTs, exhaustive comparison with runup data shows that we can reduce the warning area significantly. Even before CMTs are available, we routinely run models based on the local tectonics, which provide a useful first estimate of the tsunami. Our runup comparisons show that Green's Law (i.e., 1-D runup estimates) works very well indeed, especially if computations are run at 2 arc-minutes. We are developing an experimental RIFT-based product showing expected runups on open coasts. While these will necessarily be rather crude, they will be a great help to emergency managers trying to assess the hazard. RIFT is typically run using a single source, but it can already handle multiple sources. In particular, it can handle multiple sources of different orientations such as 1993 Okushiri, or the décollement-splay combinations to be expected during major earthquakes in accretionary margins such as Nankai, Cascadia, and Middle America. As computers get faster and the number-crunching burden is off-loaded to GPUs, we are convinced there will still be a use for a fast, linearized modeling capability. Rather than applying scaling laws to a CMT, or distributing slip over 100x50 km sub-faults, for example, it would be preferable to model tsunamis using the output from a finite-fault analysis. To accomplish such a compute-bound task fast enough for warning purposes will demand a rapid, approximate technique like RIFT.
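
    The Green's Law estimate referred to above relates the amplitude of a linear long wave to the local water depth. The sketch below shows the 1-D form of that scaling purely as an illustration of the idea; the function and variable names are not part of the RIFT code.

      def greens_law_amplitude(a_offshore_m, h_offshore_m, h_target_m):
          """Green's Law for a linear long wave: amplitude scales with water depth as
          A2 = A1 * (h1 / h2) ** 0.25.  Used as a crude 1-D shoaling estimate from an
          offshore model cell toward the coast (no refraction, friction, or breaking)."""
          return a_offshore_m * (h_offshore_m / h_target_m) ** 0.25

      # A 0.2 m wave computed in 1000 m of water shoals to roughly
      # 0.2 * (1000 / 5) ** 0.25, i.e. about 0.75 m, in 5 m of water.
      print(greens_law_amplitude(0.2, 1000.0, 5.0))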

  16. Towards a certification process for tsunami early warning systems

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Wächter, Jochen; Hammitzsch, Martin

    2013-04-01

    The natural disaster of the Boxing Day Tsunami of 2004 was followed by an information catastrophe. Crucial early warning information could not be delivered to the communities under imminent threat, resulting in over 240,000 casualties in 14 countries. This tragedy sparked the development of a new generation of integrated modular Tsunami Early Warning Systems (TEWS). While significant advances have been accomplished in recent years, recent events, such as the Chile 2010 and Tohoku 2011 tsunamis, demonstrate that the key technical challenge for Tsunami Early Warning research on the supranational scale still lies in the timely issuing of status information and reliable early warning messages in a proven workflow. A second challenge stems from the main objective of the Intergovernmental Oceanographic Commission of UNESCO (IOC) Tsunami Programme, the integration of national TEWS towards ocean-wide networks: each of the increasing number of integrated Tsunami Early Warning Centres has to cope with the continuing evolution of sensors, hardware and software while having to maintain reliable inter-center information exchange services. To avoid future information catastrophes, the performance of all components, ranging from individual sensors, to Warning Centres within their particular end-to-end warning system environments, and up to federated Systems of Tsunami Warning Systems, has to be regularly validated against defined criteria. Since 2004, the GFZ German Research Centre for Geosciences (GFZ) has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CeGIT) has already focused its work on the geoinformatics aspects of TEWS in two projects, the German-Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS). This activity is continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises), funded under the European Union's Seventh Framework Programme (FP7). TRIDEC focuses on real-time intelligent information management in Earth management and its long-term application. The technical development is based on mature system architecture models and industry standards. The use of standards already applies to the operation of individual TRIDEC reference installations and their interlinking into an integrated service infrastructure for supranational warning services. This is a first step towards best practices and service lifecycles for Early Warning Centre IT service management, including Service Level Agreements (SLA) and Service Certification. While on a global scale the integration of TEWS progresses towards Systems of Systems (SoS), there is still an absence of accredited and reliable certifications for national TEWS or regional Tsunami Early Warning Systems of Systems (TEWSoS). Concepts for TEWS operations have already been published under the guidance of the IOC, and can now be complemented by the recent research advances concerning SoS architecture. Combined with feedback from the real world, such as the NEAMwave 2012 Tsunami exercise in the Mediterranean, this can serve as a starting point to formulate initial requirements for TEWS and TEWSoS certification: certification activities will cover the establishment of new TEWS and TEWSoS, as well as both maintenance and enhancement of existing TEWS/TEWSoS. 
While the IOC is expected to take a central role in the development of the certification strategy, it remains to be defined which bodies will actually conduct the certification process. Certification requirements and results are likely to become a valuable information source for various target groups, ranging from national policy decision makers, government agency planners, national and local government preparedness officials, TWC staff members, Disaster Responders, the media and the insurance industry.

  17. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to tsunami warning operations, such as those about earthquake magnitudes, how earthquakes are located, where and how often earthquakes occur, and fault rupture length. The second group uses the PTWC-developed tsunami forecast model, RIFT (Wang et al., 2012), to show how various historic tsunamis propagated through the world's oceans. These animations illustrate important concepts about tsunami behavior such as their speed, how they bend around and bounce off of seafloor features, how their wave heights vary from place to place and in time, and how their behavior is strongly influenced by the type of earthquake that generated them. PTWC's YouTube channel also includes an animation that simulates both seismic and tsunami phenomena together as they occurred for the 2011 Japan tsunami including actual sea-level measurements and proper timing for tsunami alert status, thus serving as a video 'time line' for that event and showing the time scales involved in tsunami warning operations. Finally, PTWC's scientists can use their YouTube channel to communicate with their colleagues in the research community by supplementing their peer-reviewed papers with video 'figures' (e.g., Wang et al., 2012).

  18. An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.

    2009-12-01

    For several years GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS), the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German-Indonesian Tsunami Early Warning System) project and capable of acquiring, archiving and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore, not only real-time waveform data but also processing results are routed through GFZ to the attached warning centers. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network, GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to the data collection over the Internet, a GFZ VSAT hub for secured data collection from the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established through this backbone. For the Southeast Asia region, a VSAT hub was established in Jakarta as early as 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by email and SMS. Manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step in starting a tsunami warning process. However, for the secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.

  19. 75 FR 25842 - Notice of a Grant With the Public Broadcasting Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-10

    ... development of the Commercial Mobile Alert System (CMAS), a national system to distribute emergency alert...-27 (Feb. 8, 2006) (establishing the National Alert and Tsunami Warning Program); Section 606 of the... requirements to support the distribution of geographically targeted alerts by commercial mobile service...

  20. NOAA's Integrated Tsunami Database: Data for improved forecasts, warnings, research, and risk assessments

    NASA Astrophysics Data System (ADS)

    Stroker, Kelly; Dunbar, Paula; Mungov, George; Sweeney, Aaron; McCullough, Heather; Carignan, Kelly

    2015-04-01

    The National Oceanic and Atmospheric Administration (NOAA) has primary responsibility in the United States for tsunami forecast, warning, research, and supports community resiliency. NOAA's National Geophysical Data Center (NGDC) and co-located World Data Service for Geophysics provide a unique collection of data enabling communities to ensure preparedness and resilience to tsunami hazards. Immediately following a damaging or fatal tsunami event there is a need for authoritative data and information. The NGDC Global Historical Tsunami Database (http://www.ngdc.noaa.gov/hazard/) includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. The long-term data from these events, including photographs of damage, provide clues to what might happen in the future. NGDC catalogs the information on global historical tsunamis and uses these data to produce qualitative tsunami hazard assessments at regional levels. In addition to the socioeconomic effects of a tsunami, NGDC also obtains water level data from the coasts and the deep-ocean at stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services, the NOAA Tsunami Warning Centers, and the National Data Buoy Center (NDBC) and produces research-quality data to isolate seismic waves (in the case of the deep-ocean sites) and the tsunami signal. These water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC is also building high-resolution digital elevation models (DEMs) to support real-time forecasts, implemented at 75 US coastal communities. After a damaging or fatal event NGDC begins to collect and integrate data and information from many organizations into the hazards databases. Sources of data include our NOAA partners, the U.S. Geological Survey, the UNESCO Intergovernmental Oceanographic Commission (IOC) and International Tsunami Information Center, Smithsonian Institution's Global Volcanism Program, news organizations, etc. NGDC assesses the data and then works to promptly distribute the data and information. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource, posted in an online report, which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent data and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public. This paper will demonstrate the extensive collection of data and how it is used.

  1. A Collaborative Effort Between Caribbean States for Tsunami Numerical Modeling: Case Study CaribeWave15

    NASA Astrophysics Data System (ADS)

    Chacón-Barrantes, Silvia; López-Venegas, Alberto; Sánchez-Escobar, Rónald; Luque-Vergara, Néstor

    2018-04-01

    Historical records show that tsunamis have affected the Caribbean region in the past. Although infrequent, recent studies have demonstrated that they pose a latent hazard for the countries within this basin. The Hazard Assessment Working Group of the ICG/CARIBE-EWS (Intergovernmental Coordination Group of the Early Warning System for Tsunamis and Other Coastal Threats for the Caribbean Sea and Adjacent Regions) of IOC/UNESCO has a modeling subgroup, which seeks to develop a modeling platform to assess the effects of possible tsunami sources within the basin. The CaribeWave tsunami exercise is carried out annually in the Caribbean region to increase awareness and test tsunami preparedness of countries within the basin. In this study we present results of tsunami inundation using the CaribeWave15 exercise scenario for four selected locations within the Caribbean basin (Colombia, Costa Rica, Panamá and Puerto Rico), performed by tsunami modeling researchers from those countries. The purpose of this study was to provide the states with additional results for the exercise. The results obtained here were compared to the co-seismic deformation and tsunami heights within the basin (energy plots) provided for the exercise, to assess the performance of the decision support tools distributed by PTWC (Pacific Tsunami Warning Center), the tsunami service provider for the Caribbean basin. However, comparison of coastal tsunami heights was not possible, due to inconsistencies between the provided fault parameters and the modeling results within the provided exercise products. Still, the modeling performed here allowed us to analyze tsunami characteristics at the mentioned states for sources within the North Panamá Deformed Belt. The occurrence of a tsunami in the Caribbean may affect several countries, because many of them share coastal zones in this basin. Therefore, collaborative efforts similar to the one presented in this study, particularly between neighboring countries, are critical to assess tsunami hazard and increase preparedness within the countries.

  2. Tsunami warnings: Understanding in Hawai'i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, Bruce F.; Paton, Douglas; Johnston, David M.; Swanson, D.A.; Yanagi, B.S.

    2007-01-01

    The devastating southeast Asian tsunami of December 26, 2004 has brought home the destructive consequences of coastal hazards in the absence of effective warning systems. Since the 1946 tsunami that destroyed much of Hilo, Hawai'i, a network of pole-mounted sirens has been used to provide an early public alert of future tsunamis. However, studies in the 1960s showed that understanding of the meaning of siren soundings was very low and that ambiguity in understanding had contributed to fatalities in the 1960 tsunami that again destroyed much of Hilo. The Hawaiian public has since been exposed to monthly tests of the sirens for more than 25 years, and descriptions of the system have been widely published in telephone books for at least 45 years. However, there currently remains some uncertainty in the level of public understanding of the sirens and their implications for behavioral response. Here, we show from recent surveys of Hawai'i residents that awareness of the siren tests and test frequency is high, but these factors do not equate with increased understanding of the meaning of the siren, which remains disturbingly low (13%). Furthermore, the length of time people have lived in Hawai'i is not correlated systematically with understanding of the meaning of the sirens. An additional issue is that warning times for tsunamis generated locally in Hawai'i will be of the order of minutes to tens of minutes, limiting the immediate utility of the sirens. Natural warning signs of such tsunamis may provide the earliest warning to residents. Analysis of a survey subgroup from Hilo suggests that awareness of natural signs is only moderate, and a majority may expect notification via alerts provided by official sources. We conclude that a major change is needed in tsunami education, even in Hawai'i, to increase public understanding of, and effective response to, both future official alerts and natural warning signs of future tsunamis. © Springer 2006.

  3. Tsunami Detection by High-Frequency Radar Beyond the Continental Shelf

    NASA Astrophysics Data System (ADS)

    Grilli, Stéphan T.; Grosdidier, Samuel; Guérin, Charles-Antoine

    2016-12-01

    Where coastal tsunami hazard is governed by near-field sources, such as submarine mass failures or meteo-tsunamis, tsunami propagation times may be too short for detection based on deep or shallow water buoys. To offer sufficient warning time, it has been proposed to implement early warning systems relying on high-frequency (HF) radar remote sensing, which can provide a dense spatial coverage as far offshore as 200-300 km (e.g., for Diginext Ltd.'s Stradivarius radar). Shore-based HF radars have been used to measure nearshore currents (e.g., CODAR SeaSonde® system; http://www.codar.com/) by inverting the Doppler spectral shifts that these currents cause on ocean waves at the Bragg frequency. Both modeling work and an analysis of radar data following the Tohoku 2011 tsunami have shown that, given proper detection algorithms, such radars could be used to detect tsunami-induced currents and issue a warning. However, long wave physics is such that tsunami currents only rise above noise and background currents (i.e., reach at least 10-15 cm/s), and hence become detectable, in fairly shallow water, which would limit the direct detection of tsunami currents by HF radar to nearshore areas unless there is a very wide shallow shelf. Here, we use numerical simulations of both HF radar remote sensing and tsunami propagation to develop and validate a new type of tsunami detection algorithm that does not have these limitations. To simulate the radar backscattered signal, we develop a numerical model including second-order effects in both wind waves and radar signal, with the wave angular frequency being modulated by a time-varying surface current, combining tsunami and background currents. In each "radar cell", the model represents wind waves with random phases and amplitudes extracted from a specified (wind-speed-dependent) energy density frequency spectrum, and includes effects of random environmental noise and background current; phases, noise, and background current are extracted from independent Gaussian distributions. The principle of the new algorithm is to compute correlations of HF radar signals measured/simulated in many pairs of distant "cells" located along the same tsunami wave ray, shifted in time by the tsunami propagation time between these cell locations; both rays and travel time are easily obtained as a function of long wave phase speed and local bathymetry. It is expected that, in the presence of a tsunami current, correlations computed as a function of range and an additional time lag will show a narrow elevated peak near the zero time lag, whereas no pattern in correlation will be observed in the absence of a tsunami current; this is because surface waves and background current are uncorrelated between pairs of cells, particularly when time-shifted by the long-wave propagation time. This change in correlation pattern can be used as a threshold for tsunami detection. To validate the algorithm, we first identify key features of tsunami propagation in the Western Mediterranean Basin, where Stradivarius is deployed, by way of direct numerical simulations with a long wave model. Then, for the purpose of validating the algorithm, we only model HF radar detection for idealized tsunami wave trains and bathymetry, but verify that such idealized case studies capture well the salient tsunami wave physics. 
Results show that, in the presence of strong background currents, the proposed method still allows detecting a tsunami with currents as low as 0.05 m/s, whereas a standard direct inversion based on radar signal Doppler spectra fails to reproduce tsunami currents weaker than 0.15-0.2 m/s. Hence, the new algorithm allows detecting tsunami arrival in deeper water, beyond the shelf and further away from the coast, and thus providing an earlier warning. Because the standard detection of tsunami currents works well at short range, we envision that, in a field situation, the new algorithm could complement the standard approach of direct near-field detection by providing a warning that a tsunami is approaching, at larger range and in greater depth. This warning would then be confirmed at shorter range by a direct inversion of tsunami currents, from which the magnitude of the tsunami would also be estimated. Hence, both algorithms would be complementary. In future work, the algorithm will be applied to actual tsunami case studies performed using a state-of-the-art long wave model, such as the one briefly presented here for the Mediterranean Basin.
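
    The detection idea described above can be made concrete with a small sketch: correlate the radar-derived current at one cell with the current at a second cell further along the same wave ray, after shifting by the long-wave travel time between them; a tsunami common to both cells produces a correlation peak near zero residual lag. This is only an illustration under idealized assumptions (equally long, uniformly sampled series and a single representative depth along the ray); the function and variable names are not those of the authors' algorithm.

      import numpy as np

      def long_wave_travel_time(distance_m, depth_m, g=9.81):
          """Travel time of a shallow-water (long) wave between two radar cells,
          assuming one representative depth along the ray: t = d / sqrt(g * h)."""
          return distance_m / np.sqrt(g * depth_m)

      def shifted_correlation(u_cell1, u_cell2, dt_s, travel_time_s, max_lag_s=600.0):
          """Correlate the current at cell 1 with the current at cell 2 (further along
          the same ray, so the tsunami arrives there travel_time_s later), as a function
          of residual lag.  Uncorrelated wind waves and background noise give no peak;
          a tsunami-driven current gives a narrow peak near zero residual lag."""
          a = np.asarray(u_cell1, dtype=float)
          b = np.asarray(u_cell2, dtype=float)
          a -= a.mean()
          b -= b.mean()
          shift = int(round(travel_time_s / dt_s))
          max_lag = int(round(max_lag_s / dt_s))
          lags = np.arange(-max_lag, max_lag + 1)
          corr = np.zeros(lags.size)
          for i, lag in enumerate(lags):
              k = shift + lag                      # total sample offset between the series
              if abs(k) >= len(a):
                  continue
              if k >= 0:
                  x, y = a[: len(a) - k], b[k:]
              else:
                  x, y = a[-k:], b[: len(b) + k]
              denom = np.linalg.norm(x) * np.linalg.norm(y)
              if denom > 0:
                  corr[i] = float(np.dot(x, y) / denom)
          return lags * dt_s, corr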

  4. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors, as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  5. Determination of Tsunami Warning Criteria for Current Velocity

    NASA Astrophysics Data System (ADS)

    Chen, R.; Wang, D.

    2015-12-01

    At present, tsunami warning issuance largely depends on an event's predicted wave height and inundation depth. Specifically, a warning is issued if the on-shore wave height is greater than 1 m. This project examines whether any consideration should also be given to current velocity. We apply the idea of force balance to determine theoretical minimum velocity thresholds for injuring people and damaging property as a function of wave height. Results show that even at a water depth of less than 1 m, a current velocity of 2 m/s is enough to pose a threat to humans and cause potential damage to cars and houses. Next, we employ a 1-dimensional shallow water model to simulate tsunamis with various amplitudes and an assumed wavelength of 250 km. This allows for the profiling of current velocity and wave height behavior as the tsunamis reach shore. We compare these data against our theoretical thresholds to see if any real-world scenarios would be dangerous to people and property. We conclude that for such tsunamis, the present warning criteria are effective at protecting people against larger events with amplitudes greater than ~0.3 m. However, for events with amplitudes less than ~0.2 m, it is possible to have waves less than 1 m high with current velocities high enough to endanger humans. Thus, the inclusion of current velocity data would help make the present tsunami warning criteria more robust and efficient, especially for smaller tsunami events.
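
    A quick way to see why small-amplitude waves can still carry dangerous currents is the linear long-wave relation between surface elevation and depth-averaged velocity, u roughly equal to the amplitude times sqrt(g/h). The sketch below applies that relation against the approximately 2 m/s threshold cited above; it is an illustration only, not the study's 1-D shallow water model, and the names and example numbers are ours.

      import math

      G = 9.81  # gravitational acceleration, m/s^2

      def long_wave_current(amplitude_m, depth_m):
          """Depth-averaged current under a linear long wave: u ~ amplitude * sqrt(g / h)."""
          return amplitude_m * math.sqrt(G / depth_m)

      def endangers_people(amplitude_m, depth_m, threshold_ms=2.0):
          """Compare the estimated current against the ~2 m/s danger threshold
          cited in the abstract for people, cars, and houses."""
          return long_wave_current(amplitude_m, depth_m) >= threshold_ms

      # A 0.5 m wave in 2 m of water gives u ~ 0.5 * sqrt(9.81 / 2), about 1.1 m/s,
      # while a 1.0 m wave in the same depth gives about 2.2 m/s, above the threshold.
      print(endangers_people(0.5, 2.0), endangers_people(1.0, 2.0))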

  6. Tsunamis: bridging science, engineering and society.

    PubMed

    Kânoğlu, U; Titov, V; Bernard, E; Synolakis, C

    2015-10-28

    Tsunamis are high-impact, long-duration disasters that in most cases allow for only minutes of warning before impact. Since the 2004 Boxing Day tsunami, there have been significant advancements in warning methodology, pre-disaster preparedness and basic understanding of related phenomena. Yet, the trail of destruction of the 2011 Japan tsunami, broadcast live to a stunned world audience, underscored the difficulties of implementing advances in applied hazard mitigation. We describe state of the art methodologies, standards for warnings and summarize recent advances in basic understanding, and identify cross-disciplinary challenges. The stage is set to bridge science, engineering and society to help build up coastal resilience and reduce losses. © 2015 The Author(s).

  7. Operational Tsunami Modelling with TsunAWI for the German-Indonesian Tsunami Early Warning System: Recent Developments

    NASA Astrophysics Data System (ADS)

    Rakowsky, N.; Harig, S.; Androsov, A.; Fuchs, A.; Immerz, A.; Schröter, J.; Hiller, W.

    2012-04-01

    Starting in 2005, the GITEWS project (German-Indonesian Tsunami Early Warning System) established from scratch a fully operational tsunami warning system at BMKG in Jakarta. Numerical simulations of prototypic tsunami scenarios play a decisive role in a priori risk assessment for coastal regions and in the early warning process itself. Repositories currently containing 3470 regional tsunami scenarios for GITEWS and 1780 Indian-Ocean-wide scenarios, in support of Indonesia as a Regional Tsunami Service Provider (RTSP), were computed with the non-linear shallow water model TsunAWI. The model is based on a finite element discretisation, employs unstructured grids with high resolution along the coast, and includes inundation. This contribution gives an overview of the model itself, the enhancement of the model physics, and the experiences gained during the process of establishing an operational code suited for thousands of model runs. Technical aspects like computation time, disk space needed for each scenario in the repository, or post-processing techniques have a much larger impact than they had in the beginning, when TsunAWI started as a research code. Of course, careful testing on artificial benchmarks and real events remains essential, but in addition, quality control for the large number of scenarios becomes an important issue.

  8. Physical Observations of the Tsunami during the September 8th 2017 Tehuantepec, Mexico Earthquake

    NASA Astrophysics Data System (ADS)

    Ramirez-Herrera, M. T.; Corona, N.; Ruiz-Angulo, A.; Melgar, D.; Zavala-Hidalgo, J.

    2017-12-01

    The September 8th, 2017, Mw 8.2 earthquake offshore Chiapas, Mexico, is the largest earthquake recorded in Chiapas since 1902. It caused damage in the states of Oaxaca, Chiapas and Tabasco; it caused more than 100 fatalities, affected over 1.5 million people, and damaged 41,000 homes in the state of Chiapas alone. This earthquake, a deep intraplate event on a normal fault in the subducting oceanic plate, generated a tsunami recorded at several tide gauge stations in Mexico and across the Pacific Ocean. Here we report the physical effects of the tsunami on the Chiapas coast and analyze the societal implications of this tsunami on the basis of our field observations. Tide gauge data indicate 11.3 and 8.2 cm of coastal subsidence at the Salina Cruz and Puerto Chiapas stations. The associated tsunami waves were recorded first at the Salina Cruz tide gauge station at 5:13 (GMT). We made ground observations along 41 km of the coast of Chiapas, encompassing the sites with the highest projected wave heights based on the preliminary tsunami model (maximum tsunami amplitudes between 94.5°W and 93.0°W). Runup and inundation distances were measured with an RTK GPS and a Sokkia B40 level at 8 sites. We corrected the runup data with estimated astronomical tide levels at the time of the tsunami. The tsunami occurred at low tide. The maximum runup was 3 m at Boca del Cielo, and the maximum inundation distance was 190 m at Puerto Arista, corresponding to the coast directly opposite the epicenter and in the central sector of the Gulf of Tehuantepec. In general, our field data agree with the predicted results from the preliminary tsunami model. Tsunami scour and erosion were evident on the Chiapas coast. Tsunami deposits, mainly sand, reached up to 32 cm in thickness, thinning landward up to 172 m from the shore. Even though the Mexican tsunami early warning system (CAT) issued several warnings, the tsunami struck the Chiapas coast prior to the arrival of official warnings to the residents of small coastal towns, owing to the multi-tiered notification system. Thus, a tsunami early warning system with direct warnings to all coastal communities is needed. Some people evacuated on their own initiative, but others did not. Therefore, community-based education and awareness programs are needed.

  9. A Multi-Disciplinary Approach to Tsunami Disaster Prevention in Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Horns, D. M.; Hall, S.; Harris, R. A.

    2016-12-01

    The island of Java in Indonesia is the most densely populated island on earth, and is situated within one of the most tectonically active regions on the planet. Deadly tsunamis struck Java in 1994 and 2006. We conducted an assessment of tsunami hazards on the south coast of Java using a team of geologists, public health professionals, and disaster education specialists. The social science component included tsunami awareness surveys, education in communities and schools, evacuation drills, and evaluation. We found that the evacuation routes were generally appropriate for the local hazard, and that most people were aware of the routes and knew how to use them. However, functional tsunami warning systems were lacking in most areas and knowledge of natural warning signs was incomplete. We found that while knowledge of when to evacuate improved after our educational lesson, some incorrect beliefs persisted (e.g. misconceptions about types of earthquakes able to generate tsunamis and how far inland tsunamis can reach). There was a general over-reliance on government to alert when evacuation is needed as well as reluctance on the part of local leaders to take initiative to sound tsunami alerts. Many people on earth who are at risk of tsunamis live in areas where the government lacks resources to maintain a functional tsunami warning system. The best hope for protecting those people is direct education working within the local cultural belief system. Further collaboration is needed with government agencies to design consistent and repeated messages challenging misperceptions about when to evacuate and to encourage individuals to take personal responsibility based on natural warning signs.

  10. The November 15, 2006 Kuril Islands-Generated Tsunami in Crescent City, California

    NASA Astrophysics Data System (ADS)

    Dengler, L.; Uslu, B.; Barberopoulou, A.; Yim, S. C.; Kelly, A.

    2009-02-01

    On November 15, 2006, Crescent City in Del Norte County, California was hit by a tsunami generated by an Mw 8.3 earthquake in the central Kuril Islands. Strong currents that persisted over an eight-hour period damaged floating docks and several boats and caused an estimated $9.2 million in losses. Initial tsunami alert bulletins issued by the West Coast and Alaska Tsunami Warning Center (WCATWC) in Palmer, Alaska were cancelled about three and a half hours after the earthquake, nearly five hours before the first surges reached Crescent City. The largest wave, 1.76 meters peak to trough, was the sixth cycle and arrived over two hours after the first wave. Strong currents, estimated at over 10 knots, damaged or destroyed three docks and caused cracks in most of the remaining docks. As a result of the November 15 event, WCATWC changed the definition of an Advisory from a region-wide alert bulletin meaning that a potential tsunami is 6 hours or further away to a localized alert that tsunami water heights may approach warning-level thresholds in specific, vulnerable locations like Crescent City. On January 13, 2007 a similar Kuril event occurred, and hourly conferences between the warning center and regional weather forecast offices were held, with a considerable improvement in the flow of information to local coastal jurisdictions. The event highlighted the vulnerability of harbors to a relatively modest tsunami and underscored the need to improve public education regarding the duration of tsunami hazards, improve dialog between tsunami warning centers and local jurisdictions, and better understand the currents produced by tsunamis in harbors.

  11. Tsunami Amplitude Estimation from Real-Time GNSS.

    NASA Astrophysics Data System (ADS)

    Jeffries, C.; MacInnes, B. T.; Melbourne, T. I.

    2017-12-01

    Tsunami early warning systems currently comprise modeling of observations from the global seismic network, deep-ocean DART buoys, and a global distribution of tide gauges. While these tools work well for tsunamis traveling teleseismic distances, saturation of seismic magnitude estimation in the near field can result in significant underestimation of tsunami excitation for local warning. Moreover, DART buoy and tide gauge observations cannot be used to rectify the underestimation in the available time, typically 10-20 minutes, before local runup occurs. Real-time GNSS measurements of coseismic offsets may be used to estimate finite faulting within 1-2 minutes and, in turn, tsunami excitation for local warning purposes. We describe here a tsunami amplitude estimation algorithm, implemented for the Cascadia subduction zone, that uses continuous GNSS position streams to estimate finite faulting. The system is based on a time-domain convolution of fault slip that uses a pre-computed catalog of hydrodynamic Green's functions generated with the GeoClaw shallow-water wave simulation software, maps seismic slip along each section of the fault to points located off the Cascadia coast in 20 m of water depth, and relies on the principle of linearity in tsunami wave propagation. The system draws continuous slip estimates from a message broker and convolves the slip with the appropriate Green's functions, which are then superimposed to produce the wave amplitude at each coastal location. The maximum amplitude and its arrival time are then passed into a database for subsequent monitoring and display. We plan on testing this system using a suite of synthetic earthquakes calculated for Cascadia whose ground motions are simulated at 500 existing Cascadia GPS sites, as well as real earthquakes for which we have continuous GNSS time series and surveyed runup heights, including Maule, Chile 2010 and Tohoku, Japan 2011. This system has been implemented in the CWU Geodesy Lab for the Cascadia subduction zone but will be expanded to the circum-Pacific as real-time processing of international GNSS data streams becomes available.
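
    The convolution-and-superposition step described above can be sketched in a few lines: each subfault's slip history is convolved with a precomputed unit-slip waveform at a coastal forecast point, and the contributions are summed. The array names, shapes, and units below are illustrative assumptions, not the interfaces of the CWU system or GeoClaw.

      import numpy as np

      def forecast_waveform(slip_rate, unit_response, dt_s):
          """Linear tsunami forecast at one offshore forecast point.

          slip_rate     : (n_subfaults, n_time) slip rate on each subfault, m/s
          unit_response : (n_subfaults, n_time) precomputed sea-surface response at the
                          forecast point for 1 m of slip on each subfault
          Returns the superposed waveform, its peak amplitude, and the peak arrival time."""
          n_sub, n_t = slip_rate.shape
          wave = np.zeros(2 * n_t - 1)
          for k in range(n_sub):
              # convolve the slip increments (rate * dt) with the unit-slip Green's function
              wave += np.convolve(slip_rate[k] * dt_s, unit_response[k])
          i_peak = int(np.argmax(np.abs(wave)))
          return wave, wave[i_peak], i_peak * dt_s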

  12. Evidence-Based Support for the Characteristics of Tsunami Warning Messages for Local, Regional and Distant Sources

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Johnston, D. M.; Sorensen, J. H.; Vogt Sorensen, B.; Whitmore, P.

    2014-12-01

    Many studies since 2004 have documented the dissemination and receipt of risk information for local to distant tsunamis and the factors influencing people's responses. A few earlier tsunami studies and numerous studies of other hazards provide additional support for developing effective tsunami messages. This study explores evidence-based approaches to developing such messages for the Pacific and National Tsunami Warning Centers in the US. It extends a message metric developed for the NWS Tsunami Program. People at risk from tsunamis receive information from multiple sources through multiple channels. Sources include official and informal sources as well as environmental and social cues. Traditionally, official tsunami messages followed a linear dissemination path through relatively few channels, from warning center to emergency management to the public and the media. However, the digital age has brought about a fundamental change in the dissemination and receipt of official and informal communications. Information is now disseminated along very non-linear paths, and all end-user groups may receive the same message simultaneously. Research has demonstrated a range of factors that influence rapid response to an initial real or perceived threat. Immediate response is less common than one involving delayed protective actions, where people first engage in "milling behavior" to exchange information and confirm the warning before taking protective action. The most important message factors for achieving rapid response concern the content and style of the message and the frequency of dissemination. Previously we developed a tsunami message metric consisting of 21 factors divided into message content and style, and receiver characteristics. Initially, each factor was equally weighted to identify gaps, but here we extend the work by weighting specific factors. This utilizes recent research that identifies the most important determinants of protective action. We then discuss the prioritization of message information in the context of potentially limited space in evolving tsunami messages issued by the warning centers.
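
    The weighting extension described above amounts to replacing an equally weighted average of factor scores with a weighted one. A minimal sketch follows; the factor names, scores, and weights are invented for illustration and do not reproduce the 21-factor metric.

      def message_score(factor_scores, weights=None):
          """Combine per-factor scores (0-1) into a single message score.
          With weights=None every factor counts equally, as in the original metric;
          supplying weights emphasizes the factors found to drive protective action."""
          if weights is None:
              weights = {name: 1.0 for name in factor_scores}
          total = sum(weights[name] for name in factor_scores)
          return sum(weights[name] * score for name, score in factor_scores.items()) / total

      scores = {"source": 0.9, "hazard_description": 0.7, "guidance": 0.5, "time_to_impact": 0.8}
      equal_weighting = message_score(scores)
      emphasized = message_score(scores, weights={"source": 1.0, "hazard_description": 2.0,
                                                  "guidance": 3.0, "time_to_impact": 2.0})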

  13. Tsunami Detection Systems for International Requirements

    NASA Astrophysics Data System (ADS)

    Lawson, R. A.

    2007-12-01

    Results are presented regarding the first commercially available, fully operational tsunami detection system to have passed stringent U.S. government testing requirements and to have successfully demonstrated its ability to detect an actual tsunami at sea. Spurred by the devastation of the December 26, 2004, Indian Ocean tsunami that killed more than 230,000 people, the private sector actively supported the Intergovernmental Oceanographic Commission's (IOC's) efforts to develop a tsunami warning system and mitigation plan for the Indian Ocean region. As each country in the region developed its requirements, SAIC recognized that many of these underdeveloped countries would need significant technical assistance to fully execute their plans. With the original focus on data fusion, consequence assessment tools, and warning center architecture, it was quickly realized that the cornerstone of any tsunami warning system would be reliable tsunami detection buoys that could meet very stringent operational standards. Our goal was to leverage extensive experience in underwater surveillance and oceanographic sensing to produce an enhanced and reliable deep water sensor that could meet emerging international requirements. Like the NOAA Deep-ocean Assessment and Reporting of Tsunamis (DART™) buoy, the SAIC Tsunami Buoy (STB) system consists of three subsystems: a surface communications buoy subsystem, a bottom pressure recorder subsystem, and a buoy mooring subsystem. With the operational success that DART has demonstrated, SAIC decided to build and test to the same high standards. The tsunami detection buoy system measures small changes in the depth of the deep ocean caused by tsunami waves as they propagate past the sensor. This is accomplished by using an extremely sensitive bottom pressure sensor/recorder to measure very small changes in pressure as the waves move past the buoy system. The bottom pressure recorder component includes a processor with algorithms that recognize these characteristics, and it immediately alerts a tsunami warning center through the communications buoy when the processor senses one of these waves. In addition to the tsunami detection buoy system, an end-to-end tsunami warning system was developed that builds upon each country's existing disaster warning infrastructure. This warning system includes 1) components that receive, process, and analyze buoy, seismic and tide gauge data; 2) predictive tools and a consequence assessment tool set to provide decision support; 3) operation center design and implementation; and 4) tsunami buoy operations and maintenance support. The first buoy was deployed Oct. 25, 2006, approximately 200 nautical miles west of San Diego in 3,800 meters of water. Just three weeks later, it was put to the test during an actual tsunami event. On Nov. 15, 2006, an 8.3 magnitude earthquake rocked the Kuril Islands, located between Japan and the Kamchatka Peninsula of Russia. That quake generated a small tsunami. Waves from the tsunami propagated approximately 4,000 nautical miles across the Pacific Ocean in about nine hours (a speed of about 445 nautical miles per hour) before this commercial buoy first detected them. Throughout that event, the tsunami buoy system showed excellent correlation with data collected by a NOAA DART buoy located 28 nautical miles north of it. Subsequent analysis revealed that the STB matched DART operational capabilities and performed flawlessly. The buoy proved its capabilities again on Jan. 13, 2007, when an 8.1 magnitude earthquake occurred in the same region, and the STB detected the seismic event. As a result of the successes of this entire project, SAIC recently applied for and received a license from NOAA to build DART systems.
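
    The trigger logic described above can be illustrated with a toy version of the bottom-pressure test: compare the observed water-column height against a predicted, tide-dominated value and flag departures beyond a threshold of a few centimetres. The threshold value and function name are illustrative, not the DART or STB algorithms.

      import numpy as np

      def tsunami_trigger(height_obs_m, height_pred_m, threshold_m=0.03):
          """Compare observed water-column height against a predicted, tide-dominated
          value; return the residual and a boolean flag array marking samples that
          depart by more than the threshold (deep-ocean tsunami signals are typically
          only a few centimetres)."""
          residual = np.asarray(height_obs_m, dtype=float) - np.asarray(height_pred_m, dtype=float)
          return residual, np.abs(residual) > threshold_m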

  14. Far-field tsunami of 2017 Mw 8.1 Tehuantepec, Mexico earthquake recorded by Chilean tide gauge network: Implications for tsunami warning systems

    NASA Astrophysics Data System (ADS)

    González-Carrasco, J. F.; Benavente, R. F.; Zelaya, C.; Núñez, C.; Gonzalez, G.

    2017-12-01

    The 2017 Mw 8.1 Tehuantepec earthquake generated a moderate tsunami, which was registered by the near-field tide gauge network, activating a tsunami threat state for Mexico issued by PTWC. In the case of Chile, the forecast tsunami waves indicated amplitudes of less than 0.3 meters above the tide level, advising an informative state of threat, without activation of evacuation procedures. Nevertheless, during sea level monitoring of the network we detected wave amplitudes (> 0.3 m) indicating a possible change of threat state. Finally, the NTWS maintained the informative level of threat based on mathematical filtering analysis of the sea level records. After the 2010 Mw 8.8 Maule earthquake, the Chilean National Tsunami Warning System (NTWS) has increased its observational capabilities to improve early response. The most important operational efforts have focused on strengthening the tide gauge network for the national area of responsibility. Furthermore, technological initiatives such as the Integrated Tsunami Prediction and Warning System (SIPAT) have segmented the area of responsibility into blocks to focus early warning and evacuation procedures on the most affected coastal areas, while maintaining an informative state for areas distant from a near-field earthquake. In the case of far-field events, the NTWS follows the recommendations proposed by the Pacific Tsunami Warning Center (PTWC), including comprehensive monitoring of sea level records, such as tide gauges and DART (Deep-ocean Assessment and Reporting of Tsunami) buoys, to evaluate the state of tsunami threat in the area of responsibility. The main objective of this work is to analyze the first-order physical processes involved in the far-field propagation and coastal impact of the tsunami, including implications for the decision-making of the NTWS. To explore our main question, we construct a finite-fault model of the 2017 Mw 8.1 Tehuantepec earthquake. We employ the rupture model to simulate a transoceanic tsunami modeled with Neowave2D. We generate synthetic time series at tide gauge stations and compare them with the recorded sea level data, to rule out meteorological processes such as storms and surges. Resonance analysis is performed using the wavelet technique.
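
    The "mathematical filtering analysis" referred to above typically means isolating the tsunami frequency band from tides, storm surge, and other meteorological oscillations in the tide gauge records. Below is a minimal band-pass sketch of that idea; the cut-off periods, filter order, and function name are illustrative choices, not the NTWS operational settings.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def tsunami_band(sea_level_m, dt_s, tmin_s=300.0, tmax_s=7200.0):
          """Band-pass a de-meaned sea-level record to periods of roughly 5 minutes to
          2 hours, suppressing tides and surge at long periods and high-frequency
          noise at short periods."""
          nyquist_hz = 0.5 / dt_s
          band = [(1.0 / tmax_s) / nyquist_hz, (1.0 / tmin_s) / nyquist_hz]
          b, a = butter(4, band, btype="band")
          return filtfilt(b, a, np.asarray(sea_level_m, dtype=float) - np.mean(sea_level_m))

      # e.g. for 1-minute tide gauge samples: filtered = tsunami_band(record_m, dt_s=60.0)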

  15. Insights from interviews regarding high fatality rate caused by the 2011 Tohoku-Oki earthquake

    NASA Astrophysics Data System (ADS)

    Ando, M.; Ishida, M.

    2012-12-01

    The 11 March 2011 Tohoku-Oki earthquake (Mw 9.0) caused approximately 19,000 casualties, including missing persons, along the entire coast of the Tohoku region. Three historical tsunamis had occurred in the 115 years preceding this tsunami. Since those tsunamis, numerous countermeasures against future tsunamis, such as breakwaters, early tsunami warning systems and tsunami evacuation drills, were implemented. Despite this preparedness, a large number of deaths and missing persons occurred. This death toll is approximately 4% of the population in the severely inundated areas; 96% safely evacuated or otherwise managed to survive the tsunami. To understand why some people evacuated immediately while others delayed, survivors were interviewed in the northern part of the Tohoku region. Our interviews revealed that many residents received no appropriate warnings and that many chose to remain in dangerous locations, partly because they had a mistaken idea of the risks. In addition, our interviews indicated that the resulting high casualties were due to malfunctions of current technology, underestimated earthquake size and tsunami heights, and failure of warning systems. Furthermore, the existing breakwaters gave the local community a false sense of security. The advanced technology did not work properly, especially at the time of this severe disaster. If residents had taken immediate action after the major shaking stopped, most local residents might have survived, considering that safer highlands are within a 5 to 20 minute walking distance of the interviewed areas. However, the elderly and physically disabled would still be in a much more difficult situation when walking such distances to safety. Nevertheless, even if these problems occur in future earthquakes, better knowledge of earthquake and tsunami hazards could save more lives. People must take immediate action without waiting for official warnings or help. To avoid similarly high tsunami death ratios in the future, residents, including young children, should be taught the basic mechanism of tsunami generation. Such basic knowledge can lead local residents to evacuate sooner, enabling more people to survive a tsunami even if warning systems or other technology fail to function.

  17. The First Real-Time Tsunami Animation

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; McCreery, C.; Weinstein, S.; Ward, B.

    2014-12-01

    For the first time, a U.S. tsunami warning center created and issued a tsunami forecast model animation while the tsunami was still crossing an ocean. Pacific Tsunami Warning Center (PTWC) scientists had predicted they would have this ability (Becker et al., 2012) with their RIFT forecast model (Wang et al., 2009) by using rapidly determined W-phase centroid moment tensor earthquake focal mechanisms as tsunami sources in the RIFT model (Wang et al., 2012). PTWC acquired its own YouTube channel in 2013 for its outreach efforts, which showed animations of historic tsunamis (Becker et al., 2013) and could also serve as a platform for sharing future tsunami animations. The Mw 8.2 earthquake of 1 April 2014 prompted PTWC to issue official warnings for a dangerous tsunami in Chile, Peru and Ecuador. PTWC ended these warnings five hours later, then issued its new tsunami marine hazard product (i.e., no coastal evacuations) for the State of Hawaii. With the international warning canceled but a domestic hazard still present, PTWC generated a forecast model animation and uploaded it to its YouTube channel six hours before the arrival of the first waves in Hawaii. PTWC also gave copies of this animation to television reporters, who in turn passed it on to their national broadcast networks. PTWC then created a version for NOAA's Science on a Sphere system so it could be shown on those exhibits while the tsunami was still crossing the Pacific Ocean. While it is difficult to determine how many people saw this animation, since local, national, and international news networks showed it in their broadcasts, PTWC's YouTube channel provides some statistics. As of 1 August 2014 this animation had garnered more than 650,000 views. Previous animations, typically released during significant anniversaries, rarely get more than 10,000 views, and even then only when external websites share them. Clearly there is high demand for a tsunami graphic that shows both the speed and the severity of a tsunami before it reaches impacted coastlines, much as radar and satellite images show the advancement of storms. Though this animation showed that most of the tsunami waves would not be dangerous, future publication of such animations will require additional outreach and education to avoid unnecessary alarm. https://www.youtube.com/user/PacificTWC

  18. Tsunami Hockey

    NASA Astrophysics Data System (ADS)

    Weinstein, S.; Becker, N. C.; Wang, D.; Fryer, G. J.

    2013-12-01

    An important issue that vexes tsunami warning centers (TWCs) is when to cancel a tsunami warning once it is in effect. Emergency managers often face a variety of pressures to allow the public to resume their normal activities, but allowing coastal populations to return too quickly can put them at risk. A TWC must, therefore, exercise caution when cancelling a warning. Kim and Whitmore (2013) show that in many cases a TWC can use the decay of tsunami oscillations in a harbor to forecast when their amplitudes will fall to safe levels. This technique should prove reasonably robust for local tsunamis (those that are potentially dangerous only within 100 km of their source region) and for regional tsunamis (whose danger is limited to within 1000 km of the source region) as well. For ocean-crossing destructive tsunamis such as the 11 March 2011 Tohoku tsunami, however, this technique may be inadequate. When a tsunami propagates across an ocean basin, it encounters topographic obstacles such as seamount chains or coastlines, resulting in coherent reflections that can propagate great distances. When these reflections reach previously impacted coastlines, they can recharge decaying tsunami oscillations and make them hazardous again. Warning center scientists should therefore forecast sea-level records for 24 hours beyond the initial tsunami arrival in order to observe any potential reflections that may pose a hazard. Animations are a convenient way to visualize reflections and gain a broad geographic overview of their impacts. The Pacific Tsunami Warning Center has developed tools based on tsunami simulations using the RIFT tsunami forecast model. RIFT is a linear, parallelized numerical tsunami propagation model that runs very efficiently on a multi-CPU system (Wang et al., 2012). It can simulate 30 hours of tsunami wave propagation in the Pacific Ocean at 4 arc-minute resolution in approximately 6 minutes of real time on a 12-CPU system. Constructing a 30-hour animation using 1-minute simulated time steps takes approximately 50 minutes on the same system. These animations are generated quickly enough to provide decision support for emergency managers whose coastlines may be impacted by the tsunami several hours later. Tsunami reflections can also help determine the source region of tsunamis generated by non-seismic mechanisms without a clear source, such as meteotsunamis (tsunamis generated by meteorological phenomena). A derecho that crossed the New Jersey coast and entered the Atlantic Ocean at approximately 1500 UTC on June 13, 2013 generated a meteotsunami that struck the northeast coast of the US, causing several injuries. A DART sensor off Montauk, NY, recorded tsunami waves approximately 200 minutes apart. We show how the arrival times of the tsunamis recorded by this DART can help constrain the source region of the meteotsunami. We also examine other reflections produced by the 2012 Haida Gwaii, 2011 Tohoku, and other tsunamis.
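
    As a rough illustration of the harbor-decay idea attributed above to Kim and Whitmore (2013), the sketch below fits an exponential envelope to de-tided gauge maxima and estimates when the oscillation should drop below a "safe" amplitude; as the abstract notes, reflections can recharge the oscillations and invalidate such a forecast. The threshold, sampling and data are illustrative assumptions, not operational warning-center values.

    ```python
    import numpy as np

    def decay_forecast(times_hr, envelope_m, safe_amp_m=0.3):
        """Least-squares fit of ln(A) = ln(A0) - t/tau; returns (tau, hours until safe)."""
        mask = envelope_m > 0
        b, lnA0 = np.polyfit(times_hr[mask], np.log(envelope_m[mask]), 1)
        tau = -1.0 / b                                # e-folding time in hours (b is negative)
        A0 = np.exp(lnA0)
        t_safe = tau * np.log(A0 / safe_amp_m)        # hours after reference when A = safe_amp
        return tau, max(t_safe - times_hr[-1], 0.0)

    # usage with made-up hourly envelope amplitudes (m) after the first arrival
    t = np.arange(1, 9, dtype=float)
    env = 1.8 * np.exp(-t / 6.0)
    tau, hours_left = decay_forecast(t, env)
    print(f"decay time ~{tau:.1f} h, roughly {hours_left:.1f} h until amplitudes reach safe levels")
    ```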

  19. Implementation of the NEAMTWS in Portugal

    NASA Astrophysics Data System (ADS)

    Matias, L. M.; Annunziato, A.; Carrilho, F.; Baptista, M.

    2008-12-01

    In this paper we present the ongoing implementation of a national tsunami warning system in Portugal. After the Sumatra event in December 2004, UNESCO, through its Intergovernmental Oceanographic Commission, recognized the need for an end-to-end global tsunami warning system, and International Coordination Groups have been established for different areas around the globe: the Indian, Caribbean, Atlantic and Mediterranean ocean basins. This system is the natural response to the historical and recent instrumental events generated along the western segment of the boundary between the Eurasia and Nubia plates, whose eastern end corresponds to the Gulf of Cadiz. The TWS includes three main components: seismic detection, tsunami detection, and the issuing of warnings/alerts. In Portugal, automatic earthquake processing is installed at IM (Instituto de Meteorologia), which is the only national institution operating on a 24x7 basis. This makes IM the natural candidate to host the Portuguese tsunami warning system. The TWS under implementation has several key elements: the definition of tsunami scenarios, tsunami detection, and tsunami protocol messages. The system will also be able to predict the potential tsunami impact along the coast, and wave heights and arrival times at pre-defined coastal locations. In this study we present recent results on the definition of tsunami scenarios, the establishment of the scenario database and the tsunami analysis tool. This work is a joint effort between the Instituto de Meteorologia (Portugal) and the Joint Research Centre (JRC-Ispra, Italy), with the coordination of the Portuguese Group for the implementation of NEAMTWS in the area. This work has been financed by different European projects, such as NEAREST and TRANSFER, and also by the JRC, the IM and the CGUL/IDL institutions.
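
    To make the scenario-database concept concrete, here is a minimal sketch of how a pre-computed scenario might be selected for an incoming event and its stored wave heights and arrival times reported at forecast points. The scenario table, forecast points and weighting are invented placeholders, not the actual Portuguese database or matching rule.

    ```python
    import math

    SCENARIOS = [  # (lon, lat, Mw, {forecast point: (max height m, travel time min)})
        (-9.0, 36.5, 8.0, {"Lisbon": (1.2, 38), "Lagos": (2.4, 22)}),
        (-8.5, 36.0, 8.5, {"Lisbon": (2.9, 35), "Lagos": (5.1, 20)}),
    ]

    def closest_scenario(lon, lat, mw, mag_weight=100.0):
        """Nearest pre-computed scenario by a simple distance in (degrees, weighted magnitude) space."""
        def cost(s):
            slon, slat, smw, _ = s
            return math.hypot(slon - lon, slat - lat) + mag_weight * abs(smw - mw) / 10.0
        return min(SCENARIOS, key=cost)

    # usage: hypothetical Gulf of Cadiz event
    _, _, mw, forecast = closest_scenario(-8.7, 36.2, 8.3)
    for point, (height, eta) in forecast.items():
        print(f"{point}: ~{height} m expected, first wave in ~{eta} min (scenario Mw {mw})")
    ```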

  20. 2009 Samoa tsunami: factors that exacerbated or reduced impacts in Samoa and American Samoa

    NASA Astrophysics Data System (ADS)

    Dengler, L. A.; Ewing, L.; Brandt, J.; Irish, J. L.; Jones, C.; Long, K.; Lazrus, H.; McCullough, N.

    2009-12-01

    An interdisciplinary team with expertise in coastal and port engineering, coastal management, environmental science, anthropology, emergency management, and mitigation visited Samoa and American Samoa in late October and November 2009. The team, sponsored by ASCE/COPRI, EERI, and the NTHMP, focused on identifying the factors that affected the impacts of the September 29, 2009 tsunami. The engineering group assessed the value of engineered coastal protection and natural protective features (reefs, mangroves, etc.) in reducing tsunami inundation by comparing protected and unprotected coastlines, and examined possible correlations between damage to the built environment and hydrodynamic forcing, namely loading by runup and velocity. The EERI group looked at how coastal land use planning and management, emergency planning and response, and culture, education and awareness of tsunami hazards affected outcomes. The group also looked at public response to the natural warnings of September 29 and the official warnings following the October 7 Vanuatu tsunami warning.

  1. Geoethical issues involved in Tsunami Warning System concepts and operations

    NASA Astrophysics Data System (ADS)

    Charalampakis, Marinos; Papadopoulos, Gerassimos A.; Tinti, Stefano

    2016-04-01

    The main goal of a Tsunami Warning System (TWS) is to mitigate the effect of an incoming tsunami by alerting the coastal population early enough to allow people to evacuate safely from inundation zones. Though this representation might seem oversimplified, achieving this goal successfully requires a positive synergy of geoscience, communication, emergency management, technology, education, social sciences and politics. Geoethical issues always arise when there is an interaction between geoscience and society, and a TWS is a paradigmatic case where this interaction is very strong and is made critical because a) the tsunami alert has to be formulated in as short a time as possible and therefore on uncertain data, and b) any evaluation error (underestimation or overestimation) can lead to serious, and sometimes catastrophic, consequences involving wide areas and large populations. From the geoethical point of view, three issues are critical: how to (i) combine forecasts and uncertainties reasonably and usefully, (ii) cope with, and possibly resolve, the dilemma of whether it is better to over-alert or under-alert the population, and (iii) deal with the responsibility and liability of geoscientists, TWS operators, emergency operators and the coastal population. The discussion will be based on the experience of the Hellenic National Tsunami Warning Center (HL-NTWC, Greece), which operates on a 24/7 basis as a special unit of the Institute of Geodynamics, National Observatory of Athens, and also acts as a Candidate Tsunami Service Provider (CTSP) in the framework of the North-Eastern Atlantic, the Mediterranean and connected seas Tsunami Warning System (NEAMTWS) of the IOC/UNESCO. Since August 2012, when HL-NTWC was officially declared operational, 14 tsunami warning messages have been disseminated to a large number of subscribers after strong submarine earthquakes occurring in Greece and elsewhere in the eastern Mediterranean. It is recognized that the alerting process and procedures are quite complex and deserve an open and wide debate, which at the moment seems to be absent from the media, the scientific community and society, very likely until the next tsunami disaster.

  2. Interviewing insights regarding the fatalities inflicted by the 2011 Great East Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Ando, M.; Ishida, M.; Hayashi, Y.; Mizuki, C.; Nishikawa, Y.; Tu, Y.

    2013-09-01

    One hundred fifty survivors of the 11 March 2011 Great East Japan Earthquake (Tohoku-Oki earthquake, Mw = 9.0) were interviewed to study the causes of deaths from the associated tsunami in coastal areas of Tohoku. The first official tsunami warning underestimated the height of the tsunami, and 40% of the interviewees did not receive this warning because of immediate blackouts and a lack of communication after the earthquake. Many chose to remain in dangerous locations based on the underestimated warning and their experiences with previous smaller tsunamis, and/or because they misunderstood the mitigating effects of nearby breakwaters in blocking incoming tsunamis. Some delayed their evacuation to perform family safety checks, and in many situations the people affected misunderstood the risks involved in tsunamis. In this area, three large tsunamis had struck in the 115 years preceding the 2011 tsunami. These tsunamis remained in the collective memory of communities, and numerous measures against future tsunami damage, such as breakwaters and tsunami evacuation drills, had been implemented. Despite these preparedness efforts, approximately 18,500 deaths and cases of missing persons occurred. The death rate among those aged 65 and above was particularly high, four times higher than that of other age groups. These interviews indicate that deaths resulted from a variety of reasons, but if residents had taken immediate action after the major ground motion stopped, most might have been saved. Education about the science behind earthquakes and tsunamis could help save more lives in the future.

  3. Tsunamis warning from space :Ionosphere seismology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larmat, Carene

    2012-09-04

    The ionosphere is the layer of the atmosphere from about 85 to 600 km altitude containing electrons and electrically charged atoms produced by solar radiation. It is perturbed by the day-night changes in its layering, X-rays and high-energy protons from solar flares, geomagnetic storms, lightning, and drivers from below, and it is strategic for radio-wave transmission. This project discusses the inversion of ionospheric signals for tsunami wave amplitude and coupling parameters, which improves tsunami warning systems.

  4. Implementation of tsunami disaster prevention measures in the municipality of San Rafael del Sur, Nicaragua

    NASA Astrophysics Data System (ADS)

    Strauch, W.; Talavera, E.; Acosta, N.; Sanchez, M.; Mejia, E.

    2007-05-01

    The Nicaraguan Pacific coast presents considerable tsunami risk. On September 1, 1992, a tsunami caused enormous damage to infrastructure and killed more than 170 people. A pilot project was conducted between 2006 and 2007 in the municipality of San Rafael del Sur, in the Masachapa area. The project covered multiple tsunami prevention measures with the direct participation of the local population, including: general education on disaster prevention and participatory events; investigation of the awareness level and information needs of different population groups; specific educational measures in schools; publication of brochures, calendars, newspaper articles, radio programs and TV spots; development of local tsunami hazard maps at 1:5,000 scale (based on previous regional tsunami hazard mapping projects and local participation); development of a tsunami warning plan; improvements to the national tsunami warning system; installation of sirens for tsunami warning; installation of tsunami signs indicating hazardous areas, evacuation routes and safe places; and evacuation drills in schools. Based on the experience gained in Masachapa, similar projects are planned for other areas along the Nicaraguan Pacific coast. Project participants included the local municipality and local stakeholders of San Rafael del Sur, the Ministry of Education, the National Police, the Nicaraguan Red Cross, the Ministry of Health, the Ministry of Tourism, the Nicaraguan Geosciences Institute (INETER), the National System for Disaster Prevention (SINAPRED) and the Swiss Agency for Development and Cooperation (SDC). The project was financed by SDC and INETER.

  5. New Coastal Tsunami Gauges: Application at Augustine Volcano, Cook Inlet, Alaska

    NASA Astrophysics Data System (ADS)

    Burgy, M.; Bolton, D. K.

    2006-12-01

    Recent eruptive activity at Augustine Volcano and its associated tsunami threat to lower Cook Inlet pointed out the need for a quickly deployable tsunami detector that could be installed on Augustine Island's coast. The detector's purpose is to verify tsunami generation by direct observation of the wave at the source, in support of tsunami warning decisions along populated coastlines. To fill this need, the Tsunami Mobile Alert Real-Time (TSMART) system was developed at NOAA's West Coast/Alaska Tsunami Warning Center with support from the University of Alaska Tsunami Warning and Environmental Observatory for Alaska (TWEAK) program and the Alaska Volcano Observatory (AVO). The TSMART system consists of a pressure sensor installed as near as possible to the low-tide line. The sensor is enclosed in a water-tight Hypalon bag filled with propylene glycol to prevent silt damage and freezing. The bag is housed in a strong, perforated plastic pipe about 16 inches long and 8 inches in diameter, capped at both ends for protection. The sensor is cabled to a data logger/radio/power station up to 300 feet away. Data are transmitted to a base station and made available to the warning center in real time through the internet. This data telemetry system can be incorporated within existing AVO and Plate Boundary Observatory networks, which makes it ideal for volcano-tsunami monitoring. A TSMART network can be deployed anywhere in the world within 120 miles of an internet connection. At Augustine, two test stations were installed on the east side of the island in August 2006. The sensors were located very near the low-tide limit and covered with rock, and the cable was buried up to the data logger station, which was located well above the high-tide mark. The data logger, radio, battery and other electronics are housed in an enclosure mounted on a pole that also supports an antenna and solar panel. The radio signal is transmitted to a repeater station higher up on the island, which then relays the data to a base station in Homer, Alaska. Sea-level values are transmitted every 15 seconds and displayed at the tsunami warning center in Palmer, Alaska.
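
    As a simple illustration of how such 15-second sea-level values might be screened for a tsunami-like signal, the sketch below removes a slowly varying tide estimate with a running mean and flags residuals above a threshold. The window length, threshold and synthetic record are assumptions for the example, not the TSMART or warning-center processing.

    ```python
    import numpy as np

    def flag_anomalies(sea_level_m, sample_s=15, tide_window_min=60, threshold_m=0.25):
        """Return sample indices where |level - running tide estimate| exceeds the threshold."""
        n = int(tide_window_min * 60 / sample_s)
        kernel = np.ones(n) / n
        tide = np.convolve(sea_level_m, kernel, mode="same")   # crude low-pass "tide"
        residual = sea_level_m - tide
        return np.flatnonzero(np.abs(residual) > threshold_m)

    # usage with a synthetic record: a semidiurnal tide plus a 0.5 m pulse lasting 10 minutes
    t = np.arange(0, 6 * 3600, 15)
    level = 0.8 * np.sin(2 * np.pi * t / (12.4 * 3600)) + np.where((t > 9000) & (t < 9600), 0.5, 0.0)
    hits = flag_anomalies(level)
    print("first flagged sample index:", hits[0] if hits.size else None)
    ```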

  6. Short-term Inundation Forecasting for Tsunamis Version 4.0 Brings Forecasting Speed, Accuracy, and Capability Improvements to NOAA's Tsunami Warning Centers

    NASA Astrophysics Data System (ADS)

    Sterling, K.; Denbo, D. W.; Eble, M. C.

    2016-12-01

    Short-term Inundation Forecasting for Tsunamis (SIFT) software was developed by NOAA's Pacific Marine Environmental Laboratory (PMEL) for use in tsunami forecasting and has been used by both U.S. Tsunami Warning Centers (TWCs) since 2012, when SIFT v3.1 was operationally accepted. Since then, advances in research and modeling have resulted in several new features being incorporated into SIFT forecasting. Following the priorities and needs of the TWCs, these upgrades were implemented in SIFT v4.0, scheduled to become operational in October 2016. Because every minute counts in the early warning process, two major time-saving features were implemented in SIFT v4.0. To increase processing speed and generate high-resolution flooding forecasts more quickly, the tsunami propagation and inundation codes were modified to run on Graphics Processing Units (GPUs). To reduce the time demand on duty scientists during an event, an automated DART inversion (or fitting) process was implemented. To increase forecasting accuracy, the forecast amplitudes and inundations were adjusted to include dynamic tidal oscillations, thereby reducing the overestimates of flooding common in SIFT v3.1 due to the static tide stage conservatively set at Mean High Water. Further improvements to forecasts were gained through the assimilation of additional real-time observations. Cabled-array measurements from Bottom Pressure Recorders (BPRs) in the Ocean Networks Canada NEPTUNE network are now available to SIFT for use in the inversion process. To better meet the needs of harbor masters and emergency managers, SIFT v4.0 adds a tsunami currents graphical product to the suite of disseminated forecast results. When delivered, these new features in SIFT v4.0 will improve the operational tsunami forecasting speed, accuracy, and capabilities at NOAA's Tsunami Warning Centers.
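
    For readers unfamiliar with what a DART "inversion (or fitting)" involves, here is a minimal sketch of the general idea: observed deep-ocean waveforms are modeled as a non-negative combination of pre-computed unit-source waveforms, solved by least squares. The unit sources and data below are synthetic placeholders, not PMEL's propagation database or the SIFT implementation.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def invert_dart(observed, unit_waveforms):
        """observed: (n_samples,); unit_waveforms: (n_samples, n_unit_sources).
        Returns non-negative scaling coefficients for each unit source and the residual norm."""
        coeffs, residual_norm = nnls(unit_waveforms, observed)
        return coeffs, residual_norm

    # usage with synthetic data: the "true" source is 2.0 x unit source 0 plus 0.5 x unit source 2
    t = np.linspace(0, 3600, 361)
    units = np.column_stack([np.sin(2 * np.pi * t / p) * np.exp(-t / 2500) for p in (900, 1200, 1500)])
    obs = 2.0 * units[:, 0] + 0.5 * units[:, 2] + 0.01 * np.random.default_rng(0).standard_normal(t.size)
    coeffs, misfit = invert_dart(obs, units)
    print("estimated unit-source weights:", np.round(coeffs, 2))
    ```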

  7. Disaster risk reduction policies and regulations in Aceh after the 2004 Indian Ocean Tsunami

    NASA Astrophysics Data System (ADS)

    Syamsidik; Rusydy, I.; Arief, S.; Munadi, K.; Melianda, E.

    2017-02-01

    The 2004 Indian Ocean Tsunami that struck most of the coastal cities in Aceh has motivated numerous changes in disaster risk reduction, including changes to policies and regulations at the local level in Aceh. This paper elaborates the changes in policies and regulations in Aceh captured and monitored during the 12 years of the tsunami recovery process. A set of questionnaires was distributed to about 245 respondents representing government officials in 6 districts of Aceh that were severely damaged by the 2004 tsunami. The research investigated tsunami evacuation mechanisms and infrastructure, disaster risk maps, disaster data accessibility, perceptions of tsunami risk, and the development of local tsunami early warning in Aceh. It found that the spatial plans of several districts in Aceh have adopted tsunami mitigation, although only in terms of land-use planning within several hundred meters of the coastline. The perceptions of government officials toward all investigated aspects were relatively good. One remaining concern is coordination among disaster stakeholders in Aceh.

  8. A possible space-based tsunami early warning system using observations of the tsunami ionospheric hole.

    PubMed

    Kamogawa, Masashi; Orihara, Yoshiaki; Tsurudome, Chiaki; Tomida, Yuto; Kanaya, Tatsuya; Ikeda, Daiki; Gusman, Aditya Riadi; Kakinami, Yoshihiro; Liu, Jann-Yenq; Toyoda, Atsushi

    2016-12-01

    Ionospheric plasma disturbances after a large tsunami can be detected by measuring the total electron content (TEC) between a Global Positioning System (GPS) satellite and its ground-based receivers. A TEC depression lasting from a few minutes to tens of minutes, termed a tsunami ionospheric hole (TIH), forms above the tsunami source area. Here we describe the quantitative relationship between initial tsunami height and the TEC depression rate caused by a TIH for seven tsunamigenic earthquakes in Japan and Chile. We found that the percentage of TEC depression and the initial tsunami height are correlated, and that the largest TEC depressions appear 10 to 20 minutes after the main shocks. Our findings imply that ionospheric TEC measurements using existing ground receiver networks could be used in an early warning system for near-field tsunamis that take more than 20 minutes to arrive at coastal areas.
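
    To illustrate how such a correlation could be turned into a first-guess warning product, here is a minimal regression sketch between TEC depression (percent) and initial tsunami height. The numbers are invented placeholders, not the seven events analyzed in the paper, and the linear form is an assumption for the example.

    ```python
    import numpy as np

    tec_depression_pct = np.array([2.0, 3.5, 5.0, 6.5, 8.0])   # hypothetical TIH depths (%)
    initial_height_m   = np.array([0.5, 1.1, 1.8, 2.3, 3.0])   # hypothetical initial heights (m)

    slope, intercept = np.polyfit(tec_depression_pct, initial_height_m, 1)
    r = np.corrcoef(tec_depression_pct, initial_height_m)[0, 1]
    print(f"height ~ {slope:.2f} * depression% + {intercept:.2f}  (r = {r:.2f})")

    def estimate_height(depression_pct):
        """Early-warning style estimate of initial tsunami height from an observed TIH."""
        return slope * depression_pct + intercept

    print("estimated height for a 4% depression:", round(estimate_height(4.0), 2), "m")
    ```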

  9. A possible space-based tsunami early warning system using observations of the tsunami ionospheric hole

    PubMed Central

    Kamogawa, Masashi; Orihara, Yoshiaki; Tsurudome, Chiaki; Tomida, Yuto; Kanaya, Tatsuya; Ikeda, Daiki; Gusman, Aditya Riadi; Kakinami, Yoshihiro; Liu, Jann-Yenq; Toyoda, Atsushi

    2016-01-01

    Ionospheric plasma disturbances after a large tsunami can be detected by measuring the total electron content (TEC) between a Global Positioning System (GPS) satellite and its ground-based receivers. A TEC depression lasting from a few minutes to tens of minutes, termed a tsunami ionospheric hole (TIH), forms above the tsunami source area. Here we describe the quantitative relationship between initial tsunami height and the TEC depression rate caused by a TIH for seven tsunamigenic earthquakes in Japan and Chile. We found that the percentage of TEC depression and the initial tsunami height are correlated, and that the largest TEC depressions appear 10 to 20 minutes after the main shocks. Our findings imply that ionospheric TEC measurements using existing ground receiver networks could be used in an early warning system for near-field tsunamis that take more than 20 minutes to arrive at coastal areas. PMID:27905487

  10. Community participation in tsunami early warning system in Pangandaran town

    NASA Astrophysics Data System (ADS)

    Hadian, Sapari D.; Khadijah, Ute Lies Siti; Saepudin, Encang; Budiono, Agung; Yuliawati, Ayu Krishna

    2017-07-01

    Disaster-resilient communities are communities capable of anticipating and minimizing destructive forces through adaptation. Disaster is an ever-present concern for the people of Indonesia, especially in the small tourism town of Pangandaran in West Java. On July 17, 2006, the town was hit by a Mw 7.8 earthquake and tsunami that affected over 300 km of coastline; more than 600 people were killed and run-up heights exceeded 20 m in places. The devastation of the tsunami made the community more alert, and together with the local government and other stakeholders it developed a tsunami Early Warning System. This study is intended to examine issues in the tsunami Early Warning System (EWS), the disaster risk reduction measures taken, and community participation. The research method used is descriptive and explanatory. The study describes the tsunami EWS and community-based disaster risk reduction in Pangandaran, the implementation of the tsunami alert/EWS in disaster preparedness, and observations of community participation in the EWS. Data were gathered through secondary data collection as well as primary data from interviews, focus group discussions and field observations. The research resulted in a description of EWS implementation and community participation, and recommendations to reduce disaster risk in Pangandaran.

  11. Survey of the July 17, 2006 Central Javan tsunami reveals 21m runup heights

    NASA Astrophysics Data System (ADS)

    Fritz, H.; Goff, J.; Harbitz, C.; McAdoo, B.; Moore, A.; Latief, H.; Kalligeris, N.; Kodjo, W.; Uslu, B.; Titov, V.; Synolakis, C.

    2006-12-01

    The Monday, July 17, 2006 Central Javan magnitude 7.7 earthquake triggered a substantial tsunami that killed 600 people along a 200 km stretch of coastline. The earthquake was not reported as felt along the coastline. While a warning was issued by the PTWC, it did not trigger an evacuation warning (Synolakis, 2006). The Indian Ocean Tsunami Warning System, announced by UNESCO as operational in a press release two weeks before the event, did not function as promised. There were no seismic recordings transmitted to the PTWC, and two German tsunameter buoys had broken off their moorings and were not operational. Lifeguards along a tourist beach reported that, while they observed the harbinger shoreline recession, they attributed it to extreme storm waves that were pounding the beaches that day. Had the tsunami struck on the preceding Sunday instead of Monday, the death toll would have been far higher. The International Tsunami Survey Team (ITST) surveyed the coastline, measuring runup, inundation, flow depths and sediment deposition with standard methods (Synolakis and Okal, 2004). Runup values ranged up to 21 m, with several readings over 10 m, while sand sheets up to 15 cm thick were deposited. The parent earthquake was similar, albeit of smaller magnitude, to that of the 1994 East Javan tsunami, which struck about 200 km to the east (Synolakis et al., 1995) and reached a maximum runup height of 11 m at only one location on steep cliffs. The unusual distribution of runup heights, and the pronounced extreme values near Nusa Kambangan, suggest a local coseismic landslide may have triggered an additional tsunami (Okal and Synolakis, 2005). The ITST observed that many coastal villages were completely abandoned after the tsunami, even in locales with no casualties. Whether residents will return is uncertain, but it is clear that an education campaign in tsunami hazard mitigation is urgently needed. In the aftermath of the tsunami, the Government of Indonesia enforced urgent emergency preparedness measures, including sirens, identification of rapid evacuation routes, and emergency drills, which were under way at some locations the team visited. Synolakis, C.E., What went wrong, Wall Street Journal, p. 12, July 25, 2006. Synolakis, C.E., and E.A. Okal, 1992-2002: Perspective on a decade of post-tsunami surveys, in: Tsunamis: Case Studies, K. Satake (ed), Adv. Natur. Technol. Hazards, 23, 1-30, 2005. Okal, E.A., and Synolakis, C.E., Source discriminants for near-field tsunamis, Geophysical Journal International, 158, 899-912, 2004. Synolakis, C.E., Imamura, F., Tsuji, Y., Matsutomi, S., Tinti, B., Cook, B., and Ushman, M., Damage conditions of East Java tsunami of 1994 analyzed, EOS, 76(26), 257 and 261-262, 1995.

  12. Seismogeodesy for rapid earthquake and tsunami characterization

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2016-12-01

    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurement of displacements by GPS networks at subduction zones allows rapid magnitude and slip estimation in the near-source region that is not affected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations into NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment currently limited to traditional seismic data. We are focusing on a network of dozens of seismogeodetic stations available through the Pacific Northwest Seismic Network (University of Washington), the Plate Boundary Observatory (UNAVCO) and the Pacific Northwest Geodetic Array (Central Washington University) as the basis for local tsunami warnings for a large subduction zone earthquake in Cascadia.
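
    To give a flavor of a PGD-based magnitude scaling module of the kind listed above, here is a minimal sketch that inverts a relation of the form log10(PGD) = A + B*Mw + C*Mw*log10(R) at each station and averages the results. The coefficients, units and station values below are illustrative assumptions, not the operational calibration used in this project.

    ```python
    import numpy as np

    A, B, C = -4.434, 1.047, -0.138   # assumed coefficients for PGD in cm, hypocentral distance R in km

    def mw_from_pgd(pgd_cm, dist_km):
        """Invert the assumed scaling law per station, then average the per-station Mw estimates."""
        pgd_cm, dist_km = np.asarray(pgd_cm, float), np.asarray(dist_km, float)
        mw = (np.log10(pgd_cm) - A) / (B + C * np.log10(dist_km))
        return mw.mean(), mw.std()

    # usage with made-up near-field observations from three GNSS stations
    mw, spread = mw_from_pgd(pgd_cm=[95.0, 40.0, 18.0], dist_km=[80.0, 150.0, 300.0])
    print(f"Mw ~ {mw:.1f} +/- {spread:.1f}")
    ```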

  13. Research to Operations: From Point Positions, Earthquake and Tsunami Modeling to GNSS-augmented Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    Stough, T.; Green, D. S.

    2017-12-01

    This collaborative research-to-operations demonstration brings together the data and algorithms from NASA research, technology, and applications-funded projects to deliver relevant data streams, algorithms, predictive models, and visualization tools to the NOAA National Tsunami Warning Center (NTWC) and Pacific Tsunami Warning Center (PTWC). Using real-time GNSS data and models in an operational environment, we will test and evaluate an augmented capability for tsunami early warning. Each of three research groups collects data from a selected network of real-time GNSS stations; the groups exchange data consisting of independently processed 1 Hz station displacements and merge the output into a single, more accurate and reliable set. The resulting merged data stream is delivered from three redundant locations to the TWCs with a latency of 5-10 seconds. Data from a number of seismogeodetic stations with collocated GPS and accelerometer instruments are processed for displacements and seismic velocities and also delivered. Algorithms for locating earthquakes and determining their magnitude, as well as algorithms that compute the source function of a potential tsunami from this new data stream, are included in the demonstration. The delivered data, algorithms, models and tools are hosted on NOAA-operated machines at both warning centers, and, once tested, the results will be evaluated for their utility in improving the speed and accuracy of tsunami warnings. This collaboration has the potential to dramatically improve the speed and accuracy of the TWCs' local tsunami information over the current seismometer-only methods. In the first year of this work, we have established and deployed an architecture for data movement and algorithm installation at the TWCs. We are addressing data quality issues and porting algorithms into the TWCs' operating environment. Our initial module deliveries will focus on estimating moment magnitude (Mw) from Peak Ground Displacement (PGD) within 2-3 minutes of the event, and on coseismic displacements converging to static offsets. We will also develop visualizations of module outputs tailored to the operational environment. In the context of this work, we will also discuss the research-to-operations approach and other opportunities within the NASA Applied Science Disaster Program.
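
    As an illustration of one way the three independently processed 1 Hz displacement streams could be merged into a single, more robust set, the sketch below takes the per-epoch, per-station, component-wise median of whichever solutions are present. The stream layout and station name are assumptions for the example, not the project's actual merging algorithm.

    ```python
    from statistics import median

    def merge_epoch(solutions):
        """solutions: list of dicts {station: (east_m, north_m, up_m)}, one per analysis center.
        Returns one dict with the component-wise median across the available centers."""
        merged = {}
        stations = set().union(*(s.keys() for s in solutions))
        for sta in stations:
            vals = [s[sta] for s in solutions if sta in s]
            merged[sta] = tuple(median(v[i] for v in vals) for i in range(3))
        return merged

    # usage: three centers report slightly different displacements for one epoch
    center_a = {"P494": (0.112, -0.031, 0.020)}
    center_b = {"P494": (0.108, -0.029, 0.018)}
    center_c = {"P494": (0.150, -0.030, 0.090)}   # an outlier solution
    print(merge_epoch([center_a, center_b, center_c]))   # the median suppresses the outlier
    ```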

  14. Integration of WERA Ocean Radar into Tsunami Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Dzvonkovskaya, Anna; Helzel, Thomas; Kniephoff, Matthias; Petersen, Leif; Weber, Bernd

    2016-04-01

    High-frequency (HF) ocean radars offer a unique capability to deliver simultaneous wide-area measurements of ocean surface current fields and sea state parameters far beyond the horizon. The WERA® ocean radar system is a shore-based remote sensing system that monitors the ocean surface in near real time and in all weather conditions up to 300 km offshore. Tsunami-induced surface currents increase orbital velocities relative to the normal oceanographic situation and affect the measured radar spectra. Theoretical work on the tsunami influence on radar spectra showed that a tsunami wave train generates a specific, unusual pattern in the HF radar spectra. While the tsunami wave is approaching the beach, the surface current pattern changes slightly in deep water and significantly in the shelf area, as was shown in theoretical considerations and later confirmed during the 2011 Japan tsunami. These observed tsunami signatures showed that the velocity of tsunami currents depends on the tsunami wave height and the bathymetry. The HF ocean radar does not measure the approaching wave height of a tsunami; however, it can resolve the surface current velocity signature generated when the tsunami reaches the shelf edge. This strong change of the surface current can be detected by a phased-array WERA system in real time; thus the WERA ocean radar is a valuable tool to support Tsunami Early Warning Systems (TEWS). Based on real tsunami measurements, requirements for the integration of ocean radar systems into TEWS have already been defined. These requirements include high range resolution, narrow beam directivity of the phased-array antennas, and an accelerated data update mode to allow offshore tsunami detection in real time. The developed software package reconstructs an ocean surface current map of the area observed by the HF radar from the radar power spectrum, which makes it possible for the WERA radars to issue an automated tsunami identification message to a TEWS. The radar measurements can be used to confirm a pre-warning and raise a tsunami alert. The output data of the WERA processing software can be easily integrated into existing TEWS thanks to its flexible data format, fast update rate and quality control of measurements. The archived radar data can be used for further hazard analysis and research purposes. The newly launched Tsunami Warning Center in Oman is one of the most sophisticated tsunami warning systems worldwide, applying a mix of well-proven, state-of-the-art subsystems. It acquires data from many different sensor systems, including seismic stations, GNSS, tide gauges, and WERA ocean radars, in one acquisition system providing access to all sensor data via a common interface. The TEWS in Oman also integrates measurements from a modern network of HF ocean radars to verify tsunami simulations, which provides additional scenario quality information and confirmation to decision support.
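
    The detection principle described above (a sudden, spatially coherent change in the radar-derived surface current as the tsunami reaches the shelf edge) can be illustrated with a toy screening routine: compare the latest current map against a running background and flag widespread strong changes. The thresholds, grid and data are illustrative assumptions, not parameters of the WERA processing software.

    ```python
    import numpy as np

    def tsunami_current_alert(current_maps, threshold_cm_s=15.0, background_n=10):
        """current_maps: list of 2-D arrays (cm/s), oldest first, newest last.
        Returns (alert flag, fraction of grid cells exceeding the change threshold)."""
        background = np.nanmean(np.stack(current_maps[-background_n - 1:-1]), axis=0)
        delta = np.abs(current_maps[-1] - background)
        frac = np.nanmean(delta > threshold_cm_s)
        return frac > 0.05, frac   # alert if more than 5% of cells changed strongly

    # usage: a quiet background followed by a sudden 25 cm/s change over part of the shelf
    rng = np.random.default_rng(1)
    quiet = [10 + 2 * rng.standard_normal((50, 50)) for _ in range(11)]
    event = quiet[-1].copy(); event[:, 30:] += 25.0
    print(tsunami_current_alert(quiet + [event]))
    ```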

  15. Tsunami disaster risk management capabilities in Greece

    NASA Astrophysics Data System (ADS)

    Marios Karagiannis, Georgios; Synolakis, Costas

    2015-04-01

    Greece is vulnerable to tsunamis, due to the length of its coastline, its islands and its geographical proximity to the Hellenic Arc, an active subduction zone. Historically, about 10% of all the world's tsunamis occur in the Mediterranean region. Here we review existing tsunami disaster risk management capabilities in Greece. We analyze capabilities across the disaster management continuum, including prevention, preparedness, response and recovery. Specifically, we focus on issues such as legal requirements, stakeholders, hazard mitigation practices, emergency operations plans, public awareness and education, community-based approaches and early warning systems. Our research is based on a review of existing literature and official documentation, on previous projects, and on interviews with civil protection officials in Greece. In terms of tsunami disaster prevention and hazard mitigation, the lack of tsunami inundation maps, except for some areas in Crete, makes it quite difficult to obtain public support for hazard mitigation practices. Urban and spatial planning tools in Greece allow planners to take hazards into account and establish buffer zones near hazard areas. However, the application of such ordinances at the local and regional levels is often difficult. Eminent domain is not supported by law and there are no regulatory provisions regarding tax abatement as a disaster prevention tool. Building codes require buildings and other structures to withstand lateral dynamic earthquake loads, but there are no provisions for resistance to impact loading from water-borne debris. Public education about tsunamis has increased during the last half-decade but remains sporadic. In terms of disaster preparedness, Greece does have a National Tsunami Warning Center (NTWC) and is a member of UNESCO's Tsunami Program for the North-eastern Atlantic, the Mediterranean and connected seas (NEAM) region. Several exercises have been organized in the framework of the NEAM Tsunami Warning System, with the Greek NTWC actively participating as a Candidate Tsunami Watch Provider. In addition, Greece designed and conducted the first tsunami exercise program in the Union Civil Protection Mechanism in 2011, which also considered the attrition of response capabilities by the earthquake generating the tsunami. These exercises have demonstrated the capability of the Greek NTWC to provide early warning to local civil protection authorities, but warning dissemination to the population remains an issue, especially during the summer season. Moreover, there is no national earthquake or tsunami emergency operations plan, and we found that tsunami disaster planning and preparedness activities are rather limited at the local level. We acknowledge partial support by the project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe) FP7-ENV2013 6.4-3, Grant 603839 to the Technical University of Crete.

  16. ASTARTE: Assessment Strategy and Risk Reduction for Tsunamis in Europe

    NASA Astrophysics Data System (ADS)

    Baptista, M. A.; Yalciner, A. C.; Canals, M.

    2014-12-01

    Tsunamis are low frequency but high impact natural disasters. In 2004, the Boxing Day tsunami killed hundreds of thousands of people from many nations along the coastlines of the Indian Ocean; tsunami run-up exceeded 35 m. Seven years later, and in spite of some of the best warning technologies and levels of preparedness in the world, the Tohoku-Oki tsunami in Japan dramatically showed the limitations of scientific knowledge on tsunami sources, coastal impacts and mitigation measures. The experience from Japan raised serious questions on how to improve the resilience of coastal communities, how to upgrade the performance of coastal defenses, how to adopt better risk management, and also on the strategies and priorities for the reconstruction of damaged coastal areas. Societal resilience requires the reinforcement of capabilities to manage and reduce risk at national and local scales. ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe), a 36-month FP7 project, aims to develop a comprehensive strategy to mitigate tsunami impact in this region. To achieve this goal, an interdisciplinary consortium has been assembled. It includes all CTWPs of NEAM and expert institutions across Europe and worldwide. ASTARTE will improve i) basic knowledge of tsunami generation and recurrence, going beyond simple catalogues, with novel empirical data and new statistical analyses for assessing long-term recurrence and hazards of large events in sensitive areas of NEAM, ii) numerical techniques for tsunami simulation, with a focus on real-time codes and novel statistical emulation approaches, and iii) methods for the assessment of hazard, vulnerability, and risk. ASTARTE will also provide i) guidelines for tsunami Eurocodes, ii) better tools for forecast and warning for CTWPs and NTWCs, and iii) guidelines for decision makers to increase the sustainability and resilience of coastal communities. In summary, ASTARTE will develop the basic scientific and technical elements allowing for a significant enhancement of the Tsunami Warning System in the NEAM region in terms of monitoring, early warning and forecast, governance and resilience. This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Grant 603839, 7th FP (ENV.2013.6.4-3).

  17. Lessons Learned and Unlearned from the 2004 Great Sumatran Tsunami.

    NASA Astrophysics Data System (ADS)

    Synolakis, C.; Kanoglu, U.

    2014-12-01

    Huppert and Sparks (2006, Phil Trans Math Phys Eng Sci) wrote: "It is likely that in the future, we will experience several disasters per year that kill more than 10,000 people." The 2011 Great East Japan Earthquake Disaster alone resulted in more than 20,000 casualties. Synolakis and Bernard (2006, Phil Trans Math Phys Eng Sci) concluded that "Before the next Sumatra-type tsunami strikes, we must resolve to create a world that can coexist with the tsunami hazard." The 2011 Japan tsunami dramatically showed that we are not there yet. Despite substantial advances after the 2004 Boxing Day tsunami, substantial challenges remain for improving tsunami hazard mitigation. If the tsunami community appeared at first perplexed in the aftermath of the 2004 tsunami, it was not due to the failure of recognized hydrodynamic paradigms, much as certain geophysical ones and scaling laws failed, but to the worst surprise: the lack of preparedness and education. Synolakis et al. (2008, Pure Appl Geophys) presented standards for tsunami modeling, for both warnings and inundation maps (IMs). Although at least one forecasting methodology has gone through extensive testing and is now officially in use by the warning centers (WCs), standards urgently need to be formalized for warnings. In Europe, several WCs have been established, but none has yet issued an operational warning for a hazardous event. When one does, there might be confusion from possibly contradictory or competing warnings. Never again should there be a repeat of the TEPCO analysis for the safety of the Fukushima NPP, which was primarily due to a lack of familiarity with the context of numerical predictions and of experience with real tsunamis. The accident was the result of "a cascade of stupid errors, almost impossible to ignore by anyone in the field" (Synolakis, 26.03.2011, The New York Times). Current practices in tsunami studies for US NPPs and for IMs do not give us optimism that the Fukushima lessons have been absorbed, and bagatellomania is still rabid. What saves human lives is ancestral knowledge and community preparedness, as demonstrated repeatedly. Efforts need to be focused on improving education worldwide in the simple steps people can take. We acknowledge partial support from the 7th FP (ASTARTE, Grant 603839), TUBITAK, TR (109Y387) and GSRT, GR (10TUR/1-50-1) projects.

  18. Lessons unlearned in Japan before 2011: Effects of the 2004 Indian Ocean tsunami on a nuclear plant in India

    NASA Astrophysics Data System (ADS)

    Sugimoto, M.

    2015-12-01

    The 2004 Indian Ocean tsunami killed around 220,000 people and startled the world. North of Chennai (Madras), an Indian nuclear plant, the Madras Atomic Power Station, was nearly affected by the tsunami in 2004, and local residents in India received no warning. "On December 26, the Madras Atomic Power Station looked like a desolate place with no power, no phones, no water, no security arrangement and no hindrance whatsoever for outsiders to enter any part of the plant," said S.P. Udaykumar of SACCER. Nuclear issues were hidden behind the extensive tsunami damage, and few media outside India reported on them. In the US, the San Francisco Chronicle reported on 11 July 2005 that the 2004 tsunami had forced scientists to rethink nuclear power plant safety, yet few tsunami scientists in the US paid attention to nuclear power plants nearly affected by tsunamis. The US government, on the other hand, took note of the Indian plant nearly affected in 2004 and supported nuclear disaster management in several countries. The Japanese government mainly concentrated on reconstruction in the affected areas and on the tsunami early warning system. I worked at the Japanese embassy in Jakarta, Indonesia, at that time and did not receive information about the Indian plant nearly affected by the tsunami or about US support for nuclear safety in other countries. The 2011 Tohoku earthquake and tsunami damaged society and nuclear power stations, and the Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in the largest release of radioactive material since the 1986 Chernobyl accident. Why did Japanese tsunami scientists not learn, before the 2011 Fukushima accident, from the warning signs at the nuclear plant in India during the 2004 Indian Ocean tsunami? In my presentation I clarify why few tsunami scientists noticed this point.

  19. The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.

    2011-12-01

    The largest Pacific basin earthquake in 47 years, and the largest-magnitude earthquake anywhere since the 2004 Sumatra earthquake, struck off the east coast of the Tohoku region of Honshu, Japan at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40 m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean, causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO) alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle-wave, and W-phase magnitude estimates based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude was at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station, located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor. Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of nearly three meters. The evacuation of Hawaii's coastlines commenced at 7:31 UTC. Concurrent with this tsunami event, a widely felt Mw 4.6 earthquake occurred beneath the island of Hawai`i at 8:58 UTC; PTWC responded within three minutes of origin time with a Tsunami Information Statement stating that the Hawaii earthquake would not generate a tsunami. After issuing 27 international tsunami bulletins to Pacific basin countries and 16 messages to the State of Hawaii during the 25 hours after the event began, PTWC concluded its role in the Tohoku tsunami event with the issuance of the corresponding warning cancellation message at 6:36 UTC on 12 March 2011. During the following weeks, however, PTWC continued to respond to dozens of aftershocks related to the earthquake. We will present a complete timeline of PTWC's activities, both domestic and international, during the Tohoku tsunami event. We will also illustrate the immense number of website hits, phone calls, and media requests that flooded PTWC during the course of the event, as well as the growing role social media plays in communicating tsunami hazard information to the public.

  20. 77 FR 6785 - Proposed Information Collection; Comment Request; Feedback Survey for Annual Tsunami Warning...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... information following testing of the associated NWS communications systems. The tests are planned annually, in March/April and again in September. Post-test feedback information will be requested from emergency... Collection; Comment Request; Feedback Survey for Annual Tsunami Warning Communications Tests AGENCY: National...

  1. Near-Field Tsunami Models with Rapid Earthquake Source Inversions from Land and Ocean-Based Observations: The Potential for Forecast and Warning

    NASA Astrophysics Data System (ADS)

    Melgar, D.; Bock, Y.; Crowell, B. W.; Haase, J. S.

    2013-12-01

    Computation of predicted tsunami wave heights and runup in the regions adjacent to large earthquakes immediately after rupture initiation remains a challenging problem. The limitations of traditional seismological instrumentation in the near field, which cannot be objectively employed for real-time inversions, and the non-uniqueness of source inversion results are major concerns for tsunami modelers. Employing near-field seismic, GPS and wave gauge data from the Mw 9.0 2011 Tohoku-oki earthquake, we test the capacity of static finite-fault slip models obtained from newly developed algorithms to produce reliable tsunami forecasts. First we demonstrate the ability of seismogeodetic source models, determined from combined land-based GPS and strong-motion seismometers, to forecast near-source tsunamis within ~3 minutes of earthquake origin time (OT). We show that these models, based on land-based sensors only, tend to underestimate the tsunami but are good enough to provide a realistic first warning. We then demonstrate that rapid ingestion of offshore shallow-water (100-1000 m) wave gauge data significantly improves the model forecasts and possible warnings. We ingest data from 2 near-source ocean-bottom pressure sensors and 6 GPS buoys into the earthquake source inversion process. Tsunami Green's functions (tGFs) are generated using the GeoClaw package, a benchmarked finite-volume code with adaptive mesh refinement. These tGFs are used in a joint inversion with the land-based data and substantially improve the earthquake source and tsunami forecast. Model skill is assessed by detailed comparisons of the simulation output to more than 2000 tsunami runup survey measurements collected after the event. We update the source model and tsunami forecast and warning at 10-minute intervals. We show that by 20 minutes after OT the tsunami is well predicted, with a high variance reduction relative to the survey data, and that by ~30 minutes a model that can be considered final is achieved, since little change is observed afterwards. This is an indirect approach to tsunami warning: it relies on automatic determination of the earthquake source prior to tsunami simulation. It is more robust than ad hoc approaches because it relies on computation of a finite-extent centroid moment tensor to objectively determine the style of faulting and the fault plane geometry on which to launch the heterogeneous static slip inversion. Operator interaction and physical assumptions are minimal. Thus, the approach can provide the initial conditions for tsunami simulation (seafloor motion) irrespective of the type of earthquake source, and it relies heavily on oceanic wave gauge measurements for source determination. It reliably distinguishes among strike-slip, normal and thrust faulting events, all of which have recently been observed to occur in subduction zones and pose distinct tsunami hazards.
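
    To illustrate the structure of a regularized joint inversion of the kind described (land displacements and offshore wave-gauge records stacked into one data vector, tied to subfault slip through their respective Green's functions), here is a minimal sketch with non-negativity and smoothing constraints. The matrix sizes, weights and synthetic data are illustrative assumptions, not the authors' algorithm or tGFs.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def joint_slip_inversion(G_land, d_land, G_tsun, d_tsun, w_tsun=1.0, smooth=0.1):
        """G_*: Green's function matrices (n_obs x n_subfaults); d_*: data vectors.
        Returns non-negative slip on each subfault from a damped, weighted least-squares fit."""
        n_sub = G_land.shape[1]
        L = np.eye(n_sub) - np.eye(n_sub, k=1)          # first-difference smoothing operator
        A = np.vstack([G_land, w_tsun * G_tsun, smooth * L])
        b = np.concatenate([d_land, w_tsun * d_tsun, np.zeros(n_sub)])
        slip, _ = nnls(A, b)
        return slip

    # usage with a tiny synthetic problem: 4 subfaults, true slip = [2, 1, 0, 0]
    rng = np.random.default_rng(2)
    G_land, G_tsun = rng.random((6, 4)), rng.random((8, 4))
    true_slip = np.array([2.0, 1.0, 0.0, 0.0])
    slip = joint_slip_inversion(G_land, G_land @ true_slip, G_tsun, G_tsun @ true_slip)
    print(np.round(slip, 2))   # approximately recovers the true slip
    ```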

  2. Program and abstracts of the Second Tsunami Source Workshop; July 19-20, 2010

    USGS Publications Warehouse

    Lee, W.H.K.; Kirby, S.H.; Diggles, M.F.

    2010-01-01

    In response to a request by the National Oceanic and Atmospheric Administration (NOAA) for computing tsunami propagations in the western Pacific, Eric Geist asked Willie Lee for assistance in providing parameters of earthquakes which may be future tsunami sources. The U.S. Geological Survey (USGS) Tsunami Source Working Group (TSWG) was initiated in August 2005. An ad hoc group of diverse expertise was formed, with Steve Kirby as the leader. The founding members are: Rick Blakely, Eric Geist, Steve Kirby, Willie Lee, George Plafker, Dave Scholl, Roland von Huene, and Ray Wells. Half of the founding members are USGS emeritus scientists. A report was quickly completed because of NOAA's urgent need to precalculate tsunami propagation paths for early warning purposes. It was clear to the group that much more work needed to be done to improve our knowledge about tsunami sources worldwide. The group therefore started an informal research program on tsunami sources and meets irregularly to share ideas, data, and results. Because our group activities are open to anyone, we have more participants now, including, for example, Harley Benz and George Choy (USGS, Golden, Colo.), Holly Ryan and Stephanie Ross (USGS, Menlo Park, Calif.), Hiroo Kanamori (Caltech), Emile Okal (Northwestern University), and Gerard Fryer and Barry Hirshorn (Pacific Tsunami Warning Center, Hawaii). To celebrate the fifth anniversary of the TSWG, a workshop is being held in the Auditorium of Building 3, USGS, Menlo Park, on July 19-20, 2010 (Willie Lee and Steve Kirby, Conveners). All talks (except one) will be video broadcast. The first tsunami source workshop was held in April 2006 with about 100 participants from many institutions. This second workshop (on a much smaller scale) will be devoted primarily to recent work by the USGS members. In addition, Hiroo Kanamori (Caltech) will present his recent work on the 1960 and 2010 Chile earthquakes, Barry Hirshorn and Stuart Weinstein (Pacific Tsunami Warning Center) will present their work on tsunami warning, and Rick Wilson (California Geological Survey) will display three posters on tsunami studies by him and his colleagues.

  3. 1854-2014: 160 years of far-field tsunami detection and warning

    NASA Astrophysics Data System (ADS)

    Okal, Emile

    2014-05-01

    The first scientific study of a tsunami generated by a distant earthquake can be traced to Bache [1856], who correctly identified waves from the 1854 Nankai earthquake on California tide gauges. We will review developments in the study of the relationship between the earthquake source and the far-field tsunami, with their logical application to distant warning. Among the principal milestones, we discuss Hochstetter's [1869] work on the 1868 Arica tsunami; Jaggar's real-time, but ignored, warning of the 1923 Kamchatka tsunami in Hawaii and his much greater success with the 1933 Showa Sanriku event; the catastrophic 1946 Aleutian event, which led to the implementation of the PTWC; the 1960 events in Hilo; and the 1964 Alaska tsunami, which led to the development of the A[now N]TWC. From the scientific standpoint, we will review the evolution of our attempts to measure the seismic source (in practice its seismic moment) ever faster and at ever lower frequencies, culminating in the W-phase inversion heralded by Kanamori and co-workers in the wake of the Sumatra disaster. Specific problems arise from events violating scaling laws, such as the so-called "tsunami earthquakes", and we will review methodologies to recognize them in real time, such as energy-to-moment ratios. Finally, we will briefly discuss modern technologies aimed at directly detecting the tsunami independently of the seismic source.
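
    For readers unfamiliar with the energy-to-moment discriminant mentioned above, here is a minimal sketch in the spirit of the Theta parameter of Newman and Okal: slow "tsunami earthquakes" radiate anomalously little high-frequency energy for their seismic moment. The cutoff value used below is an illustrative assumption, not an operational threshold.

    ```python
    import math

    def theta(radiated_energy_j, seismic_moment_nm):
        """Theta = log10(E / M0), both in SI units (J and N*m)."""
        return math.log10(radiated_energy_j / seismic_moment_nm)

    def looks_like_tsunami_earthquake(radiated_energy_j, seismic_moment_nm, cutoff=-5.7):
        """Flag events whose Theta falls well below the roughly -4.9 value typical of ordinary events."""
        return theta(radiated_energy_j, seismic_moment_nm) < cutoff

    # usage: an ordinary event of roughly Mw 7.8 versus an energy-deficient one of the same moment
    m0 = 6.3e20            # N*m, roughly Mw 7.8
    print(theta(8e15, m0), looks_like_tsunami_earthquake(8e15, m0))   # about -4.9, False
    print(theta(3e14, m0), looks_like_tsunami_earthquake(3e14, m0))   # about -6.3, True
    ```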

  4. Impact of Hellenic Arc Tsunamis on Corsica (France)

    NASA Astrophysics Data System (ADS)

    Gailler, Audrey; Schindelé, F.; Hébert, H.

    2016-12-01

    In the historical period, the Eastern Mediterranean has been devastated by several tsunamis, the two most damaging being those of AD 365 and AD 1303, generated by great earthquakes of magnitude >8 at the Hellenic plate boundary. Recently, events of magnitude 6-7 have occurred in this region. Because the French tsunami warning center has to provide warnings for the French coastlines, the question has been raised whether a major tsunami triggered along the Hellenic arc could impact the French coasts. The focus is especially on the Corsican coasts, to estimate the expected wave heights and the magnitude threshold above which the population would need to take cover. This study shows that a magnitude 8.0 earthquake nucleated along the Hellenic arc could in some cases induce a tsunami observable along the Corsican coasts, and that for events of magnitude 8.5 amplitudes exceeding 50 cm can be expected, which would be dangerous especially in harbors and beach areas. The main contribution of these results is the establishment of specific magnitude thresholds for tsunami warning along the French coasts for tsunamis generated along the Hellenic arc: 7.8 for the advisory level (coastal marine threat, with evacuation of harbors and beaches) and 8.3 for the watch level (inland inundation threat).
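
    A minimal sketch of how the magnitude thresholds reported in this abstract (7.8 for advisory, 8.3 for watch) could be encoded as a decision rule for Hellenic-arc sources. The function and level names are illustrative assumptions, not the operational logic of the French warning center.

    ```python
    # Minimal sketch: mapping a Hellenic-arc earthquake magnitude to a warning
    # level for the Corsican coasts, using the thresholds reported in the
    # abstract (7.8 advisory, 8.3 watch). Names and structure are illustrative.

    def corsica_warning_level(magnitude: float) -> str:
        """Return a warning level for a Hellenic-arc earthquake of given magnitude."""
        if magnitude >= 8.3:
            return "watch"        # inland inundation threat
        if magnitude >= 7.8:
            return "advisory"     # coastal marine threat: evacuate harbors and beaches
        return "information"      # no significant tsunami threat expected for Corsica

    if __name__ == "__main__":
        for m in (7.5, 8.0, 8.5):
            print(m, corsica_warning_level(m))
    ```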

  5. A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises

    NASA Astrophysics Data System (ADS)

    Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.

    2012-04-01

    The presentation will describe work on the system architecture being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold, and the success of a system depends crucially on its architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems, such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system, one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer, it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, as well as innovative and unconventional sensors, such as streams of messages from social media services. At the top layer, it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision-support workflows.
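
    The abstract describes an event-driven mechanism in which events are published, detected and consumed by loosely coupled applications. The following is a minimal in-memory sketch of that publish/subscribe pattern; the bus, topic names and payloads are hypothetical and are not the TRIDEC implementation.

    ```python
    # Minimal sketch of the publish/detect/consume pattern described above:
    # components publish events to topics on a bus, and loosely coupled
    # consumers (e.g., a simulation service, a decision-support client) react.
    # Topic names and payloads are hypothetical, not the TRIDEC implementation.
    from collections import defaultdict
    from typing import Callable, Dict, List

    class EventBus:
        def __init__(self) -> None:
            self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
            self._subscribers[topic].append(handler)

        def publish(self, topic: str, event: dict) -> None:
            for handler in self._subscribers[topic]:
                handler(event)

    bus = EventBus()

    # A simulation service consumes seismic events and publishes tsunami forecasts.
    def simulation_service(event: dict) -> None:
        if event["magnitude"] >= 7.5:
            bus.publish("tsunami.forecast", {"source": event, "max_height_m": 1.2})

    # A decision-support client consumes forecasts and would alert the operator.
    def decision_support(event: dict) -> None:
        print("Forecast received, max height:", event["max_height_m"], "m")

    bus.subscribe("seismic.event", simulation_service)
    bus.subscribe("tsunami.forecast", decision_support)
    bus.publish("seismic.event", {"magnitude": 8.1, "lat": 38.3, "lon": 26.1})
    ```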

  6. Availability and Reliability of Disaster Early Warning Systems and the IT Infrastructure Library

    NASA Astrophysics Data System (ADS)

    Wächter, J.; Loewe, P.

    2012-12-01

    The Boxing Day Tsunami of 2004 caused an information catastrophe. Crucial early warning information could not be delivered to the communities under imminent threat, resulting in over 240,000 casualties in 14 countries. This tragedy sparked the development of a new generation of integrated modular Tsunami Early Warning Systems (TEWS). While significant advances have been accomplished in the past years, recent events, like the Chile 2010 and Tohoku 2011 tsunamis, demonstrate that the key technical challenge for Tsunami Early Warning research on the supranational scale still lies in the timely issuing of status information and reliable early warning messages. A key challenge stems from the main objective of the IOC Tsunami Programme, the integration of national TEWS into ocean-wide networks: each of the increasing number of integrated Tsunami Early Warning Centres has to cope with the continuing evolution of sensors, hardware and software while having to maintain reliable inter-center information exchange services. To avoid future information catastrophes, the performance of all components, ranging from sensors to Warning Centres, has to be regularly validated against defined criteria. This task is complicated by the fact that, in terms of ICT system life cycles, tsunamis are very rare events, resulting in very difficult framing conditions for safeguarding the availability and reliability of TEWS. Since 2004, the GFZ German Research Centre for Geosciences (GFZ) has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CEGIT) has focused its work on the geoinformatics aspects of TEWS in two projects already: the German Indonesian Tsunami Early Warning System (GITEWS), funded by the German Federal Ministry of Education and Research (BMBF), and the Distant Early Warning System (DEWS), a European project funded under the sixth Framework Programme (FP6). These developments are continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises), funded under the European Union's seventh Framework Programme (FP7). This ongoing project focuses on real-time intelligent information management in Earth management and its long-term application. All technical development in TRIDEC is based on mature system architecture models and industry standards. The use of standards applies also to the operation of individual TRIDEC reference installations and their interlinking into an integrated service infrastructure for supranational warning services: a set of best practices for IT service management is used to align the TEWS software services with the requirements of the Early Warning Centre management by defining Service Level Agreements (SLA) and ensuring compliance. For this, the concept of service lifecycles is adapted for the TEWS domain, as laid out in the IT Infrastructure Library (ITIL) by the United Kingdom's Office of Government Commerce (OGC). The cyclic procedures, tasks and checklists described by ITIL are used to establish a baseline to plan, implement, and maintain TEWS service components in the long run. This makes it possible to ensure compliance with given international TEWS standards and to measure improvement of the provided services against a gold standard.

  7. Novel Algorithms Enabling Rapid, Real-Time Earthquake Monitoring and Tsunami Early Warning Worldwide

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Michelini, A.

    2012-12-01

    We have recently introduced new methods to determine rapidly the tsunami potential and magnitude of large earthquakes (e.g., Lomax and Michelini, 2009ab, 2011, 2012). To validate these methods we have implemented them along with other new algorithms within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it, http://early-est.alomax.net). Early-est is a lightweight software package for real-time earthquake monitoring (including phase picking, phase association and event detection, location, magnitude determination, first-motion mechanism determination, ...), and for tsunami early warning based on discriminants for earthquake tsunami potential. In a simulation using archived broadband seismograms for the devastating M9, 2011 Tohoku earthquake and tsunami, Early-est determines: the epicenter within 3 min after the event origin time, discriminants showing very high tsunami potential within 5-7 min, and magnitude Mwpd(RT) 9.0-9.2 and a correct shallow-thrusting mechanism within 8 min. Real-time monitoring with Early-est gives similar results for most large earthquakes using currently available, real-time seismogram data. Here we summarize some of the key algorithms within Early-est that enable rapid, real-time earthquake monitoring and tsunami early warning worldwide: >>> FilterPicker - a general purpose, broad-band, phase detector and picker (http://alomax.net/FilterPicker); >>> Robust, simultaneous association and location using a probabilistic, global-search; >>> Period-duration discriminants TdT0 and TdT50Ex for tsunami potential available within 5 min; >>> Mwpd(RT) magnitude for very large earthquakes available within 10 min; >>> Waveform P polarities determined on broad-band displacement traces, focal mechanisms obtained with the HASH program (Hardebeck and Shearer, 2002); >>> SeisGramWeb - a portable-device ready seismogram viewer using web-services in a browser (http://alomax.net/webtools/sgweb/info.html). References (see also: http://alomax.net/pub_list.html): Lomax, A. and A. Michelini (2012), Tsunami early warning within 5 minutes, Pure and Applied Geophysics, 169, nnn-nnn, doi: 10.1007/s00024-012-0512-6. Lomax, A. and A. Michelini (2011), Tsunami early warning using earthquake rupture duration and P-wave dominant period: the importance of length and depth of faulting, Geophys. J. Int., 185, 283-291, doi: 10.1111/j.1365-246X.2010.04916.x. Lomax, A. and A. Michelini (2009b), Tsunami early warning using earthquake rupture duration, Geophys. Res. Lett., 36, L09306, doi:10.1029/2009GL037223. Lomax, A. and A. Michelini (2009a), Mwpd: A Duration-Amplitude Procedure for Rapid Determination of Earthquake Magnitude and Tsunamigenic Potential from P Waveforms, Geophys. J. Int., 176, 200-214, doi:10.1111/j.1365-246X.2008.03974.x
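
    The abstract lists period-duration discriminants (TdT0, TdT50Ex) for tsunami potential. The sketch below only illustrates the general idea of combining a P-wave dominant period Td with a rupture duration T0; the threshold values are placeholders I have assumed for illustration, not the operational values defined in the cited Lomax and Michelini papers.

    ```python
    # Illustrative sketch of a period-duration tsunami-potential check in the
    # spirit of the TdT0 discriminant cited above. The thresholds below are
    # placeholders only; operational values are defined in Lomax and Michelini
    # (2009-2012) and are not reproduced here.

    def tsunami_potential(dominant_period_s: float, rupture_duration_s: float,
                          td_t0_threshold: float = 500.0,      # placeholder value
                          duration_threshold_s: float = 50.0   # placeholder value
                          ) -> str:
        """Classify tsunami potential from P-wave dominant period Td and duration T0."""
        td_t0 = dominant_period_s * rupture_duration_s
        if td_t0 >= td_t0_threshold and rupture_duration_s >= duration_threshold_s:
            return "high tsunami potential"
        if rupture_duration_s >= duration_threshold_s:
            return "elevated tsunami potential"
        return "low tsunami potential"

    print(tsunami_potential(dominant_period_s=12.0, rupture_duration_s=160.0))
    ```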

  8. An automatic tsunami warning system: TREMORS application in Europe

    NASA Astrophysics Data System (ADS)

    Reymond, D.; Robert, S.; Thomas, Y.; Schindelé, F.

    1996-03-01

    An integrated system named TREMORS (Tsunami Risk Evaluation through seismic Moment of a Real-time System) has been installed at the EVORA station in Portugal, which has been affected by historical tsunamis. The system is based on a three-component long-period seismic station linked to an IBM-compatible PC running dedicated software. The goals of this system are the following: detect earthquakes, locate them, compute their seismic moment, and issue a seismic warning. The warnings are based on the seismic moment estimation, and all processing is performed automatically. The aim of this study is to check the quality of the estimates of the main parameters of interest for tsunami warning: the location, which depends on azimuth and distance, and finally the seismic moment, M0, which controls the earthquake size. The sine qua non condition for obtaining an automatic location is that the 3 main seismic phases P, S, R must be visible. This study gives satisfying results (automatic analysis): ±5° errors in azimuth and epicentral distance, and a standard deviation of less than a factor of 2 for the seismic moment M0.

  9. International year of planet earth 7. Oceans, submarine land-slides and consequent tsunamis in Canada

    USGS Publications Warehouse

    Mosher, D.C.

    2009-01-01

    Canada has the longest coastline and largest continental margin of any nation in the world. As a result, it is more likely than other nations to experience marine geohazards such as submarine landslides and consequent tsunamis. Coastal landslides represent a specific threat because of their possible proximity to societal infrastructure and high tsunami potential; they occur without warning and with little time lag between failure and tsunami impact. Continental margin landslides are common in the geologic record but rare on human timescales. Some ancient submarine landslides are massive, but more recent events indicate that even relatively small slides on continental margins can generate devastating tsunamis. Tsunami impact can occur hundreds of km away from the source event, and with less than 2 hours' warning. Identification of high-potential submarine landslide regions, combined with an understanding of landslide and tsunami processes and sophisticated tsunami propagation models, is required to identify areas at high risk of impact.

  10. Real-time forecasting of the April 11, 2012 Sumatra tsunami

    USGS Publications Warehouse

    Wang, Dailin; Becker, Nathan C.; Walsh, David; Fryer, Gerard J.; Weinstein, Stuart A.; McCreery, Charles S.; ,

    2012-01-01

    The April 11, 2012, magnitude 8.6 earthquake off the northern coast of Sumatra generated a tsunami that was recorded at sea-level stations as far as 4800 km from the epicenter and at four ocean bottom pressure sensors (DARTs) in the Indian Ocean. The governments of India, Indonesia, Sri Lanka, Thailand, and Maldives issued tsunami warnings for their coastlines. The United States' Pacific Tsunami Warning Center (PTWC) issued an Indian Ocean-wide Tsunami Watch Bulletin in its role as an Interim Service Provider for the region. Using an experimental real-time tsunami forecast model (RIFT), PTWC produced a series of tsunami forecasts during the event that were based on rapidly derived earthquake parameters, including initial location and Mwp magnitude estimates and the W-phase centroid moment tensor solutions (W-phase CMTs) obtained at PTWC and at the U. S. Geological Survey (USGS). We discuss the real-time forecast methodology and how successive, real-time tsunami forecasts using the latest W-phase CMT solutions improved the accuracy of the forecast.

  11. Challenges in Defining Tsunami Wave Height

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.

    2017-12-01

    The NOAA National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.

  12. Challenges in Defining Tsunami Wave Heights

    NASA Astrophysics Data System (ADS)

    Dunbar, Paula; Mungov, George; Sweeney, Aaron; Stroker, Kelly; Arcos, Nicolas

    2017-08-01

    The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 coastal tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height for each tide gauge and deep-ocean buoy, NCEI will consider adding an additional field for the maximum peak measurement.
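
    A short sketch comparing the two wave-height definitions discussed in these two records, assuming a de-tided water-level series is already available. "Maximum amplitude" is taken here as half the largest peak-to-trough range, which is one common reading of the ½ peak-to-trough definition; exact operational definitions may differ.

    ```python
    # Minimal sketch comparing the two wave-height definitions discussed above,
    # assuming a de-tided water-level time series in metres. "Maximum peak" is
    # the largest positive excursion; "maximum amplitude" is taken here as half
    # the largest peak-to-trough range. Operational definitions may differ.
    import numpy as np

    def wave_height_measures(detided: np.ndarray) -> dict:
        max_peak = float(np.max(detided))                                  # largest crest
        max_amplitude = float((np.max(detided) - np.min(detided)) / 2.0)   # ½ peak-to-trough
        return {"max_peak_m": max_peak, "max_amplitude_m": max_amplitude}

    # Synthetic example: a decaying tsunami-like oscillation over six hours.
    t = np.linspace(0, 6 * 3600, 2000)                                     # seconds
    eta = 0.4 * np.exp(-t / 7200.0) * np.sin(2 * np.pi * t / 1800.0)
    print(wave_height_measures(eta))
    ```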

  13. Meteotsunamis, destructive tsunami-like waves: from observations and simulations towards a warning system (MESSI)

    NASA Astrophysics Data System (ADS)

    Sepic, Jadranka; Vilibic, Ivica

    2016-04-01

    Atmospherically generated tsunami-like waves, also known as meteotsunamis, pose a severe threat to exposed coastlines. Although not as destructive as ordinary tsunamis, meteotsunami waves several meters high can bring destruction, cause loss of human lives and cause panic. For that reason, MESSI, an integrative meteotsunami research and warning project, has been developed and will be presented herein. The project has a threefold base: (1) research on atmosphere-ocean interaction, with a focus on (i) source processes in the atmosphere, (ii) energy transfer to the ocean and (iii) growth of meteotsunami waves during propagation; (2) estimation of meteotsunami occurrence rates in past, present and future climate, and mapping of meteotsunami hazard; (3) construction of a meteotsunami warning system prototype, the latter being the main objective of the project. Because of the high frequency of meteotsunamis and its complex bathymetry, which varies from a shallow shelf in the north to deep pits in the south, with a number of funnel-shaped bays and harbours that substantially amplify incoming tsunami-like waves, the Adriatic, the northernmost of the Mediterranean seas, has been chosen as an ideal area for realization of the MESSI project and implementation of the warning system. This warning system will, however, be designed for wider applicability and easy transfer to other endangered locations. The architecture of the warning system will integrate several components: (1) real-time measurements of key oceanographic and atmospheric parameters, (2) coupled atmosphere-ocean models run in real-time (warning) mode, and (3) semi-automatic procedures and protocols for warning civil protection, local authorities and the public. The effectiveness of the warning system will be tested against historical events.

  14. Optimization of the Number and Location of Tsunami Stations in a Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    An, C.; Liu, P. L. F.; Pritchard, M. E.

    2014-12-01

    Optimizing the number and location of tsunami stations in designing a tsunami warning system is an important and practical problem. It is always desirable to maximize the capability of the data obtained from the stations to constrain the earthquake source parameters, and at the same time to minimize the number of stations. During the 2011 Tohoku tsunami event, 28 coastal gauges and DART buoys in the near field recorded tsunami waves, providing an opportunity for assessing the effectiveness of those stations in identifying the earthquake source parameters. Assuming a single-plane fault geometry, inversions of tsunami data from combinations of various numbers (1-28) of stations and locations are conducted, and their effectiveness is evaluated according to the residuals of the inverse method. Results show that the optimized locations of stations depend on the number of stations used. If the stations are optimally located, 2-4 stations are sufficient to constrain the source parameters. Regarding the optimized locations, stations must be spread uniformly in all directions, which is not surprising. It is also found that stations within the source region generally give a worse constraint on the earthquake source than stations farther from the source, due to the exaggeration of model error when matching large-amplitude waves at near-source stations. Quantitative discussions of these findings will be given in the presentation. Applying a similar analysis to the Manila Trench, based on artificial scenarios of earthquakes and tsunamis, the optimal locations of tsunami stations are obtained, which provides guidance for deploying a tsunami warning system in this region.
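
    A sketch of the kind of experiment this abstract describes: invert waveforms for fault-slip parameters by linear least squares using only a chosen subset of stations, and score each subset by its residual norm. The Green's functions and "observations" below are random placeholders under the assumption of a linear slip-to-waveform relation, not real data or the authors' inversion code.

    ```python
    # Sketch: rank station subsets by the residual of a linear least-squares
    # slip inversion. Green's functions and observations are random placeholders.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n_subfaults, n_stations, n_samples = 6, 10, 200

    # G[j] maps slip on each subfault to the waveform at station j (placeholder).
    G = rng.normal(size=(n_stations, n_samples, n_subfaults))
    true_slip = rng.uniform(0.0, 5.0, size=n_subfaults)
    obs = np.einsum("jsk,k->js", G, true_slip) + 0.1 * rng.normal(size=(n_stations, n_samples))

    def residual_for_subset(stations: tuple) -> float:
        """Invert using only the given stations; return the normalized residual."""
        A = np.vstack([G[j] for j in stations])
        d = np.concatenate([obs[j] for j in stations])
        slip, *_ = np.linalg.lstsq(A, d, rcond=None)
        return float(np.linalg.norm(d - A @ slip) / np.linalg.norm(d))

    # Rank all 3-station subsets by residual (brute force, for illustration only).
    best = min(itertools.combinations(range(n_stations), 3), key=residual_for_subset)
    print("best 3-station subset:", best, "residual:", round(residual_for_subset(best), 4))
    ```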

  15. Generating tsunami risk knowledge at community level as a base for planning and implementation of risk reduction strategies

    NASA Astrophysics Data System (ADS)

    Wegscheider, Stephanie; Post, Joachim; Mück, Matthias; Zosseder, Kai; Muhari, Abdul; Anwar, Herryal Z.; Gebert, Niklas; Strunz, Günter; Riedlinger, Torsten

    2010-05-01

    More than 4 million Indonesians live in tsunami-prone areas on the southern and western coasts of Sumatra, Java and Bali. Depending on the location of the tsunamigenic earthquake, in many cases the time to reach a tsunami-safe area is as short as 15 or 20 minutes. To increase the chances of a successful evacuation, comprehensive and thorough planning and preparation are necessary. For this purpose, detailed knowledge of potential hazard impact and safe areas, exposed elements such as people, critical facilities and lifelines, deficiencies in response capabilities, and evacuation routes is crucial. The major aims of this paper are (i) to assess and quantify people's response capabilities and (ii) to identify high-risk areas with a strong need for action to improve response capabilities and thus reduce the risk. The major factor influencing people's ability to evacuate successfully is time. The estimated time of arrival of a tsunami at the coast, which determines the overall time available for evacuation after a tsunami is triggered, can be derived by analyzing modeled tsunami scenarios for the respective area. In most cases, however, this available time frame is diminished by other time components, including the time until natural or technical warning signs are received and the time until a reaction follows a warning (understanding the warning and deciding to take appropriate action). For the time to receive a warning, we assume that the early warning centre is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. Reaction time is difficult to quantify, as intrinsic human factors such as educational level, beliefs, tsunami knowledge and experience play a role. Although we are aware of the great importance of this factor and of minimizing the reaction time, it is not considered in this paper. Quantifying the needed evacuation time is based on a GIS approach. This approach is relatively simple and can be implemented by local authorities at low technical complexity and with relatively low cost and time requirements. The basic principle is to define the best evacuation route from a given point to the nearest safe area. Here the fastest path from that point to the shelter location has to be found. The impact of land cover, slope, population density, and population age and gender distribution is taken into account, as literature studies show these factors to be highly important. Knowing the fastest path and the distance to the nearest safe area, together with a spatially distributed pattern of evacuation speed, delivers the time needed from each location to a shelter. A shelter location can either be a horizontal area or an evacuation building (vertical evacuation). For both kinds of evacuation target points, one limiting factor can again be time: are people able to reach the target point within the available time? Especially for evacuation buildings, there is a second possibly limiting factor, namely capacity. In the majority of cases in all three study areas where this approach was applied, capacity rather than time was the critical factor. Consequently, for planning purposes it is essential to know which area can be served by an evacuation building and which areas have to be assigned to a different evacuation target point due to exhausted capacity of the nearest one. 
The coverage of a building is also derived on the basis of a GIS approach, using the previously derived available and required evacuation times and detailed population distribution data. Evacuation time and the derived evacuable areas are then used to identify high-risk areas. In combination with detailed population distribution data, hazard probability and hazard intensity, it is possible to identify areas with high risk and large deficiencies in response capabilities. Often, human response capabilities can be increased by thorough disaster planning, and thus the results of this paper provide valuable information for planning authorities to decrease the risk. This paper presents exemplary results for the study area of Kuta, Bali, where we tested this approach and where implementation by local authorities is in progress.
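
    A minimal sketch of the cost-distance idea behind the evacuation-time mapping described above: a multi-source Dijkstra search over a raster of local walking speeds (which in the full approach would encode land cover, slope and population characteristics) yields, for every cell, the time needed to reach the nearest safe area. The grid, speeds and shelter cells are purely illustrative assumptions.

    ```python
    # Sketch: evacuation time to the nearest shelter via multi-source Dijkstra
    # over a walking-speed raster. Grid, speeds and shelters are illustrative.
    import heapq
    import numpy as np

    speed = np.array([[1.2, 1.2, 0.6, 1.2],     # walking speed in m/s per cell
                      [1.2, 0.3, 0.6, 1.2],
                      [1.2, 1.2, 1.2, 1.2]])
    cell_size_m = 50.0
    shelters = [(0, 3)]                          # indices of tsunami-safe cells

    def evacuation_time(speed, shelters, cell_size_m):
        """Return seconds from every cell to the nearest shelter cell."""
        rows, cols = speed.shape
        time = np.full((rows, cols), np.inf)
        heap = [(0.0, r, c) for r, c in shelters]
        heapq.heapify(heap)
        for _, r, c in heap:
            time[r, c] = 0.0
        while heap:
            t, r, c = heapq.heappop(heap)
            if t > time[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    step = cell_size_m / ((speed[r, c] + speed[nr, nc]) / 2.0)
                    if t + step < time[nr, nc]:
                        time[nr, nc] = t + step
                        heapq.heappush(heap, (t + step, nr, nc))
        return time

    print(np.round(evacuation_time(speed, shelters, cell_size_m), 1))
    ```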

  16. The TRIDEC Project: Future-Saving FOSS GIS Applications for Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    Loewe, P.; Wächter, J.; Hammitzsch, M.

    2011-12-01

    The Boxing Day Tsunami of 2004 killed over 240,000 people in 14 countries and inundated the affected shorelines with waves reaching heights of up to 30 m. This natural disaster coincided with an information catastrophe, as potentially life-saving early warning information existed, yet no means were available to deliver it to the communities under imminent threat. Tsunami early warning capabilities have improved in the meantime through the continuing development of modular Tsunami Early Warning Systems (TEWS). However, recent tsunami events, like the Chile 2010 and Tohoku 2011 tsunamis, demonstrate that the key challenge for ongoing TEWS research on the supranational scale still lies in the timely issuing of reliable early warning messages. Since 2004, the GFZ German Research Centre for Geosciences has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CEGIT) has focused its work on the geoinformatics aspects of TEWS in two projects already: the German Indonesian Tsunami Early Warning System (GITEWS), funded by the German Federal Ministry of Education and Research (BMBF), and the Distant Early Warning System (DEWS), a European project funded under the sixth Framework Programme (FP6). These developments are continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises), funded under the European Union's seventh Framework Programme (FP7). This ongoing project focuses on real-time intelligent information management in Earth management and its long-term application. All TRIDEC developments are based on Free and Open Source Software (FOSS) components and industry standards wherever possible. Tsunami early warning in TRIDEC is also based on mature system architecture models to ensure long-term usability and the flexibility to adapt to future generations of tsunami sensors. All open source software produced by the project consortium is foreseen to be published on FOSSLAB, a publicly available software repository provided by CEGIT. FOSSLAB serves as a platform for the development of FOSS projects in a geospatial context, allowing results achieved in previous and ongoing project activities to be saved, advanced and reused, and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. FOSSLAB's potential to preserve and advance existing best practices for reuse in new scenarios is documented by a first case study: for TEWS education and public outreach, a comprehensive approach to generating high-resolution globe maps was compiled using GRASS GIS and the POV-Ray rendering software. The task resulted in the merging of isolated technical know-how, previously maintained in disparate GIS and rendering communities, into publicly available best practices. Beyond the scope of TRIDEC, FOSSLAB constitutes an umbrella encompassing several geoinformatics-related activities, such as the documentation of best practices for experiences and results while working with Spatial Data Infrastructures (SDI), Geographic Information Systems (GIS), Geomatics, and future spatial processing on computation clusters and in cloud computing.

  17. Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.

    2014-12-01

    The Tsunami Information Technology Modernization (TIM) is a project spearheaded by the National Oceanic and Atmospheric Administration to update the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Ewa Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near-real-time seismic processing aspect of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide better tsunami threat assessments, we are developing open source tools for magnitude estimation (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g., W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic community standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or could easily be integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that a developer or scientist can efficiently create upgrades, replacements, and/or new metrics for the seismic data processing with only a cursory knowledge of the underlying SC3.
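
    The abstract emphasizes reliance on common data formats such as QuakeML. Below is a small example of reading a QuakeML file with the open-source ObsPy package, which is a widely used seismology library but not necessarily what AWIPS2 or SeisComP 3 use internally; the file name is a placeholder for a locally available QuakeML document.

    ```python
    # Example of working with the QuakeML format mentioned above, using ObsPy
    # (an assumption for illustration, not necessarily the TIM toolchain).
    # "example_event.xml" is a placeholder for a local QuakeML file.
    from obspy import read_events

    catalog = read_events("example_event.xml")
    for event in catalog:
        origin = event.preferred_origin() or event.origins[0]
        magnitude = event.preferred_magnitude() or event.magnitudes[0]
        print(origin.time, origin.latitude, origin.longitude, origin.depth,
              magnitude.mag, magnitude.magnitude_type)
    ```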

  18. Knowledge base and sensor bus messaging service architecture for critical tsunami warning and decision-support

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.

    2012-04-01

    The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure that is supported by an agile knowledge base for critical decision-support. In the TRIDEC project (2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich, OGC SWE-compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.

  19. The 2017 México Tsunami Record, Numerical Modeling and Threat Assessment in Costa Rica

    NASA Astrophysics Data System (ADS)

    Chacón-Barrantes, Silvia

    2018-03-01

    An Mw 8.2 earthquake and tsunami occurred offshore the Pacific coast of México on 2017-09-08, at 04:49 UTC. Costa Rican tide gauges have registered a total of 21 local, regional and far-field tsunamis. The Quepos gauge registered 12 tsunamis between 1960 and 2014 before it was relocated inside a harbor in late 2014, where it registered two more tsunamis. This paper analyzes the 2017 México tsunami as recorded by the Quepos gauge. It took 2 h for the tsunami to arrive at Quepos, with a first peak height of 9.35 cm and a maximum amplitude of 18.8 cm occurring about 6 h later. As a decision-support tool, this tsunami was modeled for Quepos in real time using ComMIT (Community Model Interface for Tsunami), with the finest grid having a resolution of 1 arcsec (about 30 m). However, the model did not replicate the tsunami record well, probably due to the lack of a finer and more accurate bathymetry. In 2014, the National Tsunami Monitoring System of Costa Rica (SINAMOT) was created, acting as a national tsunami warning center. The occurrence of the 2017 México tsunami raised concerns about warning dissemination mechanisms for most coastal communities in Costa Rica, due to its short travel time.

  20. Marine, Tropical, and Tsunami Services

    Science.gov Websites

    NOAA services providing information essential to the conduct of safe and efficient maritime operations and the protection of the marine environment, managed in part by the National Data Buoy Center (NDBC), together with tsunami preparedness campaigns and awareness weeks ("Tsunami Preparedness: Applying Lessons from the Past") and the Pacific Tsunami Warning Center.

  1. On mitigating rapid onset natural disasters: Project THRUST (Tsunami Hazards Reduction Utilizing Systems Technology)

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.

    Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.

  2. Tsunami Early Warning in Europe: NEAMWave Exercise 2012 - the Portuguese Scenario

    NASA Astrophysics Data System (ADS)

    Lendholt, Matthias; Hammitzsch, Martin; Schulz, Jana; Reißland, Sven

    2013-04-01

    On 27 and 28 November 2012, the first European-wide tsunami exercise took place under the auspices of the UNESCO Intergovernmental Coordination Group for the Tsunami Early Warning and Mitigation System in the North-eastern Atlantic, the Mediterranean and connected seas (ICG/NEAMTWS). Four international scenarios were performed, one for each candidate tsunami watch provider: France, Greece, Portugal and Turkey. Their task was to generate and disseminate tsunami warning bulletins on time and in compliance with the official NEAMTWS specifications. The Instituto Português do Mar e da Atmosfera (IPMA, [1]) in Lisbon and the Kandilli Observatory and Earthquake Research Institute (KOERI [2]) in Istanbul are the national agencies of Portugal and Turkey responsible for tsunami early warning. Both institutes are partners in the TRIDEC [3] project and were using the TRIDEC Natural Crisis Management (NCM) system during the NEAMWave exercise. The software demonstrated the seamless integration of diverse components, including sensor systems, simulation data, and dissemination hardware. The functionalities that were showcased significantly exceeded the internationally agreed range of capabilities. Special attention was given to the Command and Control User Interface (CCUI), serving as the central application for the operator. Its origins lie in the DEWS project [4], but numerous new functionalities were added to master all requirements defined by the complex NEAMTWS workflows. It was of utmost importance to develop an application handling the complexity of tsunami science while providing a clearly arranged and comprehensible interface that relieves the operator during time-critical hazard situations. [1] IPMA: www.ipma.pt/ [2] KOERI: www.koeri.boun.edu.tr/ [3] TRIDEC: www.tridec-online.eu [4] DEWS: www.dews-online.org

  3. Experience from three years of local capacity development for tsunami early warning in Indonesia: challenges, lessons and the way ahead

    NASA Astrophysics Data System (ADS)

    Spahn, H.; Hoppe, M.; Vidiarina, H. D.; Usdianto, B.

    2010-07-01

    Five years after the 2004 tsunami, a lot has been achieved to make communities in Indonesia better prepared for tsunamis. This achievement is primarily linked to the development of the Indonesian Tsunami Early Warning System (InaTEWS). However, many challenges remain. This paper describes the experience with local capacity development for tsunami early warning (TEW) in Indonesia, based on the activities of a pilot project. TEW in Indonesia is still new to disaster management institutions and the public, as is the paradigm of Disaster Risk Reduction (DRR). The technology components of InaTEWS will soon be fully operational. The major challenge for the system is the establishment of clear institutional arrangements and capacities at national and local levels that support the development of public and institutional response capability at the local level. Due to a lack of information and national guidance, most local actors have a limited understanding of InaTEWS and DRR, and often show little political will and priority to engage in TEW. The often-limited capacity of local governments is contrasted by strong engagement of civil society organisations that opt for early warning based on natural warning signs rather than technology-based early warning. Bringing together the various actors, developing capacities in a multi-stakeholder cooperation for an effective warning system are key challenges for the end-to-end approach of InaTEWS. The development of local response capability needs to receive the same commitment as the development of the system's technology components. Public understanding of and trust in the system comes with knowledge and awareness on the part of the end users of the system and convincing performance on the part of the public service provider. Both sides need to be strengthened. This requires the integration of TEW into DRR, clear institutional arrangements, national guidance and intensive support for capacity development at local levels as well as dialogue between the various actors.

  4. Integrating TEWS and Satellite-based remote sensing: Lessons learned from the Honshu 2011 Tsunami

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Wächter, Joachim

    2013-04-01

    The Boxing Day Tsunami killed 240,000 people and inundated the affected shorelines with waves reaching heights of up to 30 m. Tsunami early warning capabilities have improved in the meantime through the continuing development of modular Tsunami Early Warning Systems (TEWS). However, recent tsunami events, like the Chile 2010 and Honshu 2011 tsunamis, demonstrate that the key challenge for TEWS research still lies in the timely issuing of reliable early warning messages to areas at risk, but also to other stakeholders professionally involved in the unfolding event. Until now, remote sensing products for tsunami events, including crisis maps and change detection products, have been exclusively linked to the phases of the disaster life cycle that follow the early warning stage: response, recovery and mitigation. The International Charter for Space and Major Disasters was initiated by the European Space Agency (ESA) and the Centre National d'Etudes Spatiales (CNES) in 1999. It coordinates a voluntary group of governmental space agencies and industry partners to provide rapid crisis imaging and mapping to disaster and relief organisations, in order to mitigate the effects of disasters on human life, property and the environment. The efficiency of this approach has been demonstrated in the field of tsunami early warning by Charter activations following the Boxing Day Tsunami 2004, the Chile Tsunami 2010 and the Honshu Tsunami 2011. Traditional single-satellite operations allow at best bimonthly repeat rates over a given Area of Interest (AOI). This allows a lot of time for image acquisition campaign planning between imaging windows for the same AOI. The advent of constellations of identical remote sensing satellites in the early 21st century resulted both in daily AOI revisit capabilities and in drastically reduced time frames for acquisition planning. However, the image acquisition planning for optical remote sensing satellite constellations is constrained by orbital and communication requirements: defined time slots exist to commandeer the tasking of image acquisitions. If such a time slot has been missed, another attempt to image an AOI can only be made about 24 hours later, due to the sun-synchronous satellite orbits. It is therefore critical to establish automated disaster early warning dissemination services for the remote sensing community, to supply them with the timeliest opportunity to trigger the tasking process for the affected AOI. For very large events, like a tsunami in the Pacific, this approach provides the chance to gain additional pre-disaster imagery as a reference for change detection. In the case of the Tohoku earthquake, an ad hoc warning dissemination process was manually dispatched by the Centre for GeoInformation Technology (CEGIT) at the GFZ German Research Centre for Geosciences, contacting RapidEye AG once the severity of the earthquake had been confirmed by the GEOFON geoseismic network. RapidEye AG decided to launch an imaging campaign which yielded 78 georectified image tiles (L3A) of Honshu island during the next imaging window. Of these, 26 tiles cover the affected coastline, resulting in 16,250 km² of content for crisis mapping efforts such as the Humanitarian OpenStreetMap (OSM) Team. This data was made available by RapidEye as part of the Charter activation requested by Japan on March 11, 2011. [1] Hoja, D., Schwinger, M., Wendleder, A., Löwe, P., Konstanski, H., Weichelt, H.: Optimised Near-Real Time Data Acquisition for Disaster Related Rapid Mapping

  5. Tsunami hazard assessment and monitoring for the Black Sea area

    NASA Astrophysics Data System (ADS)

    Partheniu, Raluca; Ionescu, Constantin; Constantin, Angela; Moldovan, Iren; Diaconescu, Mihail; Marmureanu, Alexandru; Radulian, Mircea; Toader, Victorin

    2016-04-01

    NIEP has recently improved its research regarding tsunamis in the Black Sea. As part of the routine earthquake and tsunami monitoring activity, the first tsunami early-warning system in the Black Sea was implemented in 2013 and has been active during the last years. In order to monitor the seismic activity of the Black Sea, NIEP uses a total of 114 real-time stations and 2 seismic arrays, 18 of the stations being located in the Dobrogea area, situated in the vicinity of the Romanian Black Sea shoreline. Moreover, there is a data exchange with the countries surrounding the Black Sea, involving the acquisition of real-time data from 17 stations in Bulgaria, Turkey, Georgia and Ukraine. This improves the capability of the Romanian Seismic Network to monitor and more accurately locate earthquakes occurring in the Black Sea area. For tsunami monitoring and warning, 6 sea-level monitoring stations, 1 infrasound barometer, 3 offshore marine buoys and 7 GPS/GNSS stations are installed at different locations along and near the Romanian shoreline. In the framework of the ASTARTE project, several objectives regarding the seismic hazard and tsunami wave-height assessment for the Black Sea were accomplished. The seismic hazard estimation was based on statistical studies of the seismic sources and their characteristics, compiled using different seismic catalogues. Two probabilistic methods were used for the evaluation of the seismic hazard: the Cornell method, based on the Gutenberg-Richter distribution parameters, and the Gumbel method, based on extreme-value statistics. The results show maximum values of possible magnitudes and their recurrence periods for each seismic source. Using the Tsunami Analysis Tool (TAT) software, a set of tsunami modelling scenarios has been generated for the Shabla area, the seismic source that could most affect the Romanian shore. These simulations are structured in a database in order to determine the maximum possible tsunami waves that could be generated and to establish the minimum magnitude values that could trigger tsunamis in this area. Some particularities of the Shabla source are past observed magnitudes >7 and a recurrence period of 175 years. Other important objectives of NIEP are to continue monitoring the seismic activity of the Black Sea, to improve the database of tsunami simulations for this area, to provide near-real-time fault-plane solution estimates for the warning system, and to add new seismic, GPS/GNSS and sea-level monitoring equipment to the existing network. Acknowledgements: This work was partially supported by the FP7-ENV2013 6.4-3 "Assessment, Strategy And Risk Reduction For Tsunamis in Europe" (ASTARTE) Project 603839/2013 and PNII, Capacity Module III ASTARTE RO Project 268/2014. This work was partially supported by the "Global Tsunami Informal Monitoring Service - 2" (GTIMS2) Project, JRC/IPR/2015/G.2/2006/NC 260286, Ref. Ares (2015)1440256 - 01.04.2015.
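
    A short sketch of the Gutenberg-Richter step underlying a Cornell-type hazard estimate like the one mentioned above: fit log10 N(>=M) = a - b·M to a catalogue and derive a mean recurrence period for a target magnitude. The synthetic catalogue and observation period below are purely illustrative assumptions, not the NIEP data.

    ```python
    # Sketch: estimate Gutenberg-Richter a and b values from a (synthetic)
    # catalogue and compute a recurrence period for a target magnitude.
    import numpy as np

    magnitudes = np.random.default_rng(1).exponential(scale=1.0 / 2.3, size=2000) + 3.0
    catalogue_years = 100.0                      # assumed observation period

    bins = np.arange(3.0, 7.1, 0.1)
    n_cum = np.array([(magnitudes >= m).sum() for m in bins], dtype=float)
    mask = n_cum > 0
    slope, a = np.polyfit(bins[mask], np.log10(n_cum[mask] / catalogue_years), 1)
    b = -slope

    def recurrence_period_years(m: float) -> float:
        """Mean recurrence period for events of magnitude >= m."""
        annual_rate = 10.0 ** (a - b * m)
        return 1.0 / annual_rate

    print(f"a = {a:.2f}, b = {b:.2f}, T(M>=7.0) = {recurrence_period_years(7.0):.0f} years")
    ```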

  6. New Tsunami Forecast Tools for the French Polynesia Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Clément, Joël; Reymond, Dominique

    2015-03-01

    This paper presents the tsunami warning tools used for the estimation of seismic source parameters. These tools are grouped under a method called Preliminary Determination of Focal Mechanism_2 (PDFM2), which has been developed at the French Polynesia Warning Center, in the framework of the warning system, as a plug-in concept. The first tool determines the seismic moment and the focal geometry (strike, dip, and slip), and the second tool identifies "tsunami earthquakes" (earthquakes that cause much bigger tsunamis than their magnitude would imply). In a tsunami warning operation, the initial assessment of the tsunami potential is based on location and magnitude. The usual quick magnitude methods work fine for smaller earthquakes; for major earthquakes these methods drastically underestimate the magnitude and the tsunami potential because the radiated energy shifts to longer-period waves. Since French Polynesia is located far away from the subduction zones of the Pacific rim, the tsunami threat is not imminent, and this luxury of time allows the use of long-period surface-wave data to determine the true size of a major earthquake. The source inversion method presented in this paper uses a combination of surface-wave amplitude spectra and P-wave first motions. The advantage of using long-period surface-wave data is a much more accurate determination of earthquake size, and the advantage of using P-wave first motions is a better constraint on the focal geometry than surface waves alone can provide. The method routinely gives stable results within tens of minutes after the origin time of an earthquake. Our results are then compared to the Global Centroid Moment Tensor catalog to validate both the seismic moment and the source geometry. The second tool discussed in this paper is the slowness parameter, the energy-to-moment ratio. It has been used to identify tsunami earthquakes, which are characterized by unusually slow rupture velocities and by seismic energy release shifted to longer periods, and which therefore have low values of this parameter. The slow rupture velocity would indicate weaker material and bigger uplift and, thus, bigger tsunami potential. The slowness parameter is an efficient tool for the near-real-time identification of tsunami earthquakes.
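
    A sketch of the energy-to-moment "slowness" discriminant described above, following the widely used form Θ = log10(E/M0). The reference value and deficiency threshold below are indicative assumptions for illustration, not the operational PDFM2 settings.

    ```python
    # Sketch of the energy-to-moment "slowness" discriminant: Theta = log10(E/M0).
    # Reference value and deficiency threshold are assumptions, not PDFM2 settings.
    import math

    def slowness_parameter(radiated_energy_j: float, seismic_moment_nm: float) -> float:
        """Theta = log10(E / M0), both in SI units (J and N·m)."""
        return math.log10(radiated_energy_j / seismic_moment_nm)

    def is_tsunami_earthquake(theta: float,
                              reference: float = -4.9,   # assumed typical value for "normal" events
                              deficiency: float = 1.0    # assumed deficiency threshold
                              ) -> bool:
        return theta <= reference - deficiency

    # Example: an event with M0 = 5e20 N·m radiating only 5e14 J of seismic energy.
    theta = slowness_parameter(5.0e14, 5.0e20)
    print(round(theta, 2), "-> tsunami earthquake?", is_tsunami_earthquake(theta))
    ```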

  7. Defining Tsunami Magnitude as Measure of Potential Impact

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Tang, L.

    2016-12-01

    The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of a tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from the available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of tsunami magnitude estimation, including the collection of vast amounts of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations. The uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since real-time seismic data are available for real-time processing and ample seismic data are available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and the generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful for tsunami warning, as a quick estimate of tsunami impact, and for post-event analysis, as a universal scale for tsunami inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present an application of the magnitude analysis to several historical events for inter-comparison with existing methods.

  8. Tsunami field survey in French Polynesia of the 2015 Chilean earthquake Mw = 8.2 and what we learned.

    NASA Astrophysics Data System (ADS)

    Jamelot, Anthony; Reymond, Dominique; Savigny, Jonathan; Hyvernaud, Olivier

    2016-04-01

    The tsunami generated by the earthquake of magnitude Mw = 8.2 near the coast of central Chile on 16 September 2015 was observed on 7 tide gauges distributed over the five archipelagoes composing French Polynesia, a territory as large as Europe. We summarize all the observations of the tsunami and the field survey done in Tahiti (Society Islands) and Hiva-Oa (Marquesas Islands) to evaluate the preliminary tsunami forecast tool (MERIT) and the detailed tsunami forecast tool (COASTER) of the French Polynesian Tsunami Warning Center. The preliminary tool forecasted a maximum tsunami height between 0.5 m and 2.3 m over the Marquesas Islands, but only the island of Hiva-Oa had a forecast greater than 1 m, especially in Tahauku Bay, well known for its local response due to its resonance properties. In Tahauku Bay, the tide gauge located at the entrance of the bay recorded a maximum tsunami height of about 1.7 m above mean sea level, and we measured at the bottom of the bay a run-up of about 2.8 m at 388 m inland from the shoreline in the river bed, and a run-up of 2.5 m at 155 m inland. The multi-grid simulation over Tahiti was done one hour after the origin time of the earthquake and gave a very localized tsunami impact on the north shore. Our forecast indicated inundation about 10 m inland, which led the civil authorities to evacuate 6 houses. It was the first operational use of this new fine grid covering the northern part of Tahiti, which is not protected by a coral reef, so we paid close attention to the feedback from the alert, which confirmed the forecast that the maximum height would arrive 1 hour after the first arrival. The tsunami warning system forecasts strong impacts as well as low impacts well, as long as an early, robust description of the seismic parameters is available and fine grids of about 10 m spatial resolution are used to simulate the tsunami impact. As of January 2016, we are able to forecast tsunami heights for 72 points located on 35 islands of French Polynesia.

  9. Rapid inundation estimates at harbor scale using tsunami wave heights offshore simulation and Green's law approach

    NASA Astrophysics Data System (ADS)

    Gailler, Audrey; Hébert, Hélène; Loevenbruck, Anne

    2013-04-01

    Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the numerical computation, which is exacerbated when detailed grids are required for the precise modeling of the coastline response on the scale of an individual harbor. In fact, when facing the problem of the interaction of the tsunami wavefield with a shoreline, any numerical simulation must be performed over an increasingly fine grid, which in turn mandates a reduced time step and the use of a fully non-linear code. Such calculations then become prohibitively time-consuming, which is clearly unacceptable in the framework of real-time warning. Thus, only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimates of tsunami wave heights on the high seas and tsunami warning maps at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these deep-water wave height simulations. The method involves an empirical correction relation derived from Green's law, expressing conservation of wave energy flux, to extend the gridded wave field into the harbor with respect to the nearby deep-water grid node. The main limitation of this method is that its application to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, a set of synthetic mareograms is calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids characterized by a coarse resolution over deep-water regions and an increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). This synthetic dataset is then used to approximate the empirical parameters of the correction equation. Results of inundation estimates in several French Mediterranean harbors obtained with the fast Green's-law-derived method are presented and compared with values given by time-consuming nested-grid simulations.
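
    Green's law, conservation of wave energy flux in shoaling water, gives the baseline amplitude scaling A_coast = A_offshore * (h_offshore / h_coast)^(1/4). The sketch below implements that baseline with an optional empirical site factor standing in for the calibrated correction described in the abstract; the site factor and example numbers are placeholders, not CENALT-calibrated values.

    ```python
    # Minimal sketch of the Green's-law baseline behind the rapid-estimation
    # approach: amplitude scales with (h_offshore / h_coast) ** 0.25, optionally
    # modulated by an empirically calibrated site factor (placeholder here).

    def greens_law_amplitude(a_offshore_m: float, h_offshore_m: float,
                             h_coast_m: float, site_factor: float = 1.0) -> float:
        """Extrapolate an offshore tsunami amplitude toward the coast or a harbor."""
        return site_factor * a_offshore_m * (h_offshore_m / h_coast_m) ** 0.25

    # Example: 0.2 m of amplitude at a 2500 m deep grid node, extrapolated to 10 m depth.
    print(round(greens_law_amplitude(0.2, 2500.0, 10.0), 2), "m near the coast")
    ```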

  10. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

    A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time-series data in the GUI as well. This GUI also includes mouse-clickable functions such as zooming or expanding the time-series display, measuring tsunami signal characteristics (arrival time, wave period and amplitude, etc.), and removing the tide signal from the time-series data. De-tiding of the time series is necessary to obtain accurate measurements of tsunami wave parameters and to maintain accurate historical tsunami databases. With TIDE TOOL, de-tiding is accomplished with a set of tide harmonic coefficients routinely computed and updated at PTWC for many of the stations in PTWC's inventory (~570). PTWC also uses the decoded time series files (the previous 3-5 days' worth) to compute on-the-fly tide coefficients. The latter is useful in cases where the station is new and a long-term stable set of tide coefficients is not available or cannot be easily obtained due to various non-astronomical effects. The international tsunami warning system is coordinated globally by the UNESCO IOC, and a number of countries in the Pacific Ocean, Indian Ocean, and Caribbean depend on TIDE TOOL to monitor tsunamis in real time.
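    The de-tiding step described above can be illustrated with a minimal harmonic-fit sketch; the constituent set and least-squares approach below are simplifications for illustration and not TIDE TOOL's actual implementation, which relies on precomputed harmonic coefficients or on-the-fly fits over the previous days of data.

```python
import numpy as np

# Fit a few major tidal constituents by least squares and subtract them,
# leaving the higher-frequency tsunami signal in the residual.
CONSTITUENT_PERIODS_H = {"M2": 12.4206, "S2": 12.0, "K1": 23.9345, "O1": 25.8193}

def detide(t_hours, sea_level):
    """t_hours, sea_level: 1-D arrays of equal length."""
    cols = [np.ones_like(t_hours)]                # mean sea level term
    for period in CONSTITUENT_PERIODS_H.values():
        w = 2.0 * np.pi / period
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, sea_level, rcond=None)
    tide = design @ coeffs
    return sea_level - tide                        # de-tided time series
```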

  11. Rapid Determination of Appropriate Source Models for Tsunami Early Warning using a Depth Dependent Rigidity Curve: Method and Numerical Tests

    NASA Astrophysics Data System (ADS)

    Tanioka, Y.; Miranda, G. J. A.; Gusman, A. R.

    2017-12-01

    Recently, tsunami early warning techniques have been improved using tsunami waveforms observed at ocean-bottom pressure gauges such as the NOAA DART system or the DONET and S-NET systems in Japan. However, for tsunami early warning of near-field tsunamis, it is essential to determine appropriate source models using seismological analysis before large tsunamis hit the coast, especially for tsunami earthquakes, which generate disproportionately large tsunamis. In this paper, we develop a technique to determine appropriate source models from which appropriate tsunami inundation along the coast can be numerically computed. The technique is tested for four large earthquakes, the 1992 Nicaragua tsunami earthquake (Mw7.7), the 2001 El Salvador earthquake (Mw7.7), the 2004 El Astillero earthquake (Mw7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw7.3), which occurred off Central America. In this study, fault parameters were estimated from the W-phase inversion, then the fault length and width were determined from scaling relationships. At first, the slip amount was calculated from the seismic moment with a constant rigidity of 3.5 x 10^10 N/m2. The tsunami numerical simulation was carried out and compared with the observed tsunami. For the 1992 Nicaragua tsunami earthquake, the computed tsunami was much smaller than the observed one. For the 2004 El Astillero earthquake, the computed tsunami was overestimated. In order to solve this problem, we constructed a depth-dependent rigidity curve, similar to that suggested by Bilek and Lay (1999). The curve, with a central depth estimated by the W-phase inversion, was used to calculate the slip amount of the fault model. Using those new slip amounts, the tsunami numerical simulation was carried out again. The observed tsunami heights, run-up heights, and inundation areas for the 1992 Nicaragua tsunami earthquake were then well explained by the computed ones, and the tsunamis from the other three earthquakes were also reasonably well explained. Therefore, our technique using a depth-dependent rigidity curve works to estimate an appropriate fault model which reproduces tsunami heights near the coast in Central America. The technique may also work in other subduction zones once a depth-dependent rigidity curve is found for that particular subduction zone.
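    The slip recomputation at the heart of the method can be sketched as follows; the depth-dependent rigidity curve used here is an assumed, illustrative form (rigidity increasing with depth, roughly in the spirit of Bilek and Lay, 1999), not the calibrated curve of the paper, and the fault dimensions in the example are placeholders.

```python
# Once fault length L and width W are fixed from scaling relations, the
# average slip follows from D = M0 / (mu * L * W). A lower near-trench
# rigidity yields a larger slip for the same seismic moment.

def rigidity_from_depth(depth_km):
    """Illustrative depth-dependent rigidity in Pa (assumed form, not the paper's curve)."""
    return 1.0e10 + 4.0e10 * min(depth_km, 40.0) / 40.0

def average_slip(m0_nm, depth_km, length_m, width_m, depth_dependent=True):
    mu = rigidity_from_depth(depth_km) if depth_dependent else 3.5e10
    return m0_nm / (mu * length_m * width_m)

# 1992 Nicaragua-like example: Mw 7.7 -> M0 ~ 4.5e20 N*m, shallow source,
# placeholder fault of 150 km x 50 km.
m0 = 10 ** (1.5 * 7.7 + 9.1)
print(average_slip(m0, 10.0, 150e3, 50e3, depth_dependent=False))  # ~1.7 m
print(average_slip(m0, 10.0, 150e3, 50e3, depth_dependent=True))   # ~3.0 m (larger slip at shallow depth)
```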

  12. Potential coping capacities to avoid tsunamis in Mentawai

    NASA Astrophysics Data System (ADS)

    Panjaitan, Berton; Gomez, Christopher; Pawson, Eric

    2017-07-01

    In 2010 a tsunamigenic earthquake triggered tsunami waves that reached the Mentawai archipelago in less than ten minutes. Similar events can occur at any time, as seismologists predict that enormous energy remains trapped on the Sunda Megathrust, approximately 30 km offshore from the archipelago. The local community of Mentawai is therefore vulnerable to tsunami hazards. In the absence of modern technology to monitor the sea surface, existing strategies need to be improved. This study was based on qualitative research and a literature review on developing coping capacity for tsunami hazards in Mentawai. A community early-warning system is the main strategy for developing coping capacity at the community level. It consists of risk knowledge, monitoring, warning dissemination, and response capability, which are interlocked in an end-to-end effort. The study found that risk assessments and risk maps were mostly not available at the dusun (hamlet) level, even though they are effective in increasing tsunami risk knowledge. Monitoring of tsunami waves can also be improved by strengthening and expanding community systems that help people avoid the waves. Moreover, traditional tools have potential for delivering warnings. Lastly, although the local government has provided a few public facilities to increase response capability, people often ignore them; their traditional values should therefore be revitalized.

  13. Tsunami Forecasting and Monitoring in New Zealand

    NASA Astrophysics Data System (ADS)

    Power, William; Gale, Nora

    2011-06-01

    New Zealand is exposed to tsunami threats from several sources that vary significantly in their potential impact and travel time. One route for reducing the risk from these tsunami sources is to provide advance warning based on forecasting and monitoring of events in progress. In this paper the National Tsunami Warning System framework, including the responsibilities of key organisations and the procedures that they follow in the event of a tsunami threatening New Zealand, is summarised. A method for forecasting threat levels based on tsunami models is presented, similar in many respects to that developed for Australia by Allen and Greenslade (Nat Hazards 46:35-52, 2008), and a simple system for easy access to the threat-level forecasts using a clickable pdf file is presented. Once a tsunami enters or initiates within New Zealand waters, its progress and evolution can be monitored in real time using a newly established network of online tsunami gauge sensors placed at strategic locations around the New Zealand coasts and offshore islands. Information from these gauges can be used to validate and revise forecasts, and assist in making the all-clear decision.

  14. Tsunami preparedness at the resort facilities along the coast of the Ryukyu Islands - their actions against the 27 February 2010 Okinawan and Chilean tsunami warning

    NASA Astrophysics Data System (ADS)

    Matsumoto, T.

    2010-12-01

    The economy (including tourism) in tropical and subtropical coastal areas such as Okinawa Prefecture (Ryukyu) relies heavily on the sea. The sea has both a “gentle” side that gives people healing and a “fierce” side that can kill. If we are going to utilise the sea for marine tourism, such as constructing resort facilities on the oceanfront, we should understand the whole nature of the sea. Tsunamis are the typical case of the “fierce” side of the sea, and we have already learned a lesson about this issue from the Sumatra tsunami in 2004. Early in the morning (5:31 am Japan Standard Time = JST) on 27 February 2010, a M6.9 earthquake occurred near the coast of Okinawa Island, Ryukyu, Japan, and just afterwards the Japan Meteorological Agency (JMA) issued a tsunami warning along the coastal area of Okinawa Prefecture. About one hour later the tsunami warning was cancelled. The CMT solution of this earthquake was found to be of strike-slip type with a NE-SW P-axis; therefore it did not induce a tsunami. However, in the afternoon of the same day (JST) a M8.6 earthquake occurred off the coast of Chile, and soon afterwards a tsunami warning was issued along the Pacific coastal area including Japan and the Ryukyu Islands. Indeed, a tsunami of up to 1 m hit the eastern coast of Okinawa Island on 28 February (Nakamura, 2010, personal communication). After the 27 February tsunami warning, the author conducted a survey of the major resort hotels along the coast of the Ryukyu Islands about their actions against both tsunamis. A questionnaire was sent to about 20 hotels and 6 hotels replied. Most of these hotels reported regular training for tsunami attacks, preparation of a disaster prevention manual, close communication with the local fire station authority, evacuation procedures to the upper stories of the hotel building, etc. The tsunami took place in the “winter season”; had it occurred in the “summer season”, other problems would have to be considered, such as how to make people enjoying the beach evacuate as quickly as possible. The author will show the details of the answers to the questionnaire and discuss the best ways of ensuring tsunami preparedness at waterfront resort facilities through this presentation.

  15. The GNSS-based component for the new Indonesian tsunami early warning centre provided by GITEWS

    NASA Astrophysics Data System (ADS)

    Falck, C.; Ramatschi, M.; Bartsch, M.; Merx, A.; Hoeberechts, J.; Rothacher, M.

    2009-04-01

    Introduction Nowadays GNSS technologies are used for a large variety of precise positioning applications. The accuracy can reach the mm level depending on the data analysis methods. GNSS technologies thus offer a high potential to support tsunami early warning systems, e.g., by detecting ground motion due to earthquakes and tsunami waves on the ocean with GNSS instruments on a buoy. Although GNSS-based precise positioning is a standard method, it is not yet common to apply this technique under tight time constraints and, hence, in the absence of precise satellite orbits and clocks. The newly developed GNSS-based component utilises on- and offshore measured GNSS data and is the first system of its kind to be integrated into an operational early warning system (the Indonesian Tsunami Early Warning Centre INATEWS, inaugurated at BMKG, Jakarta, on 11 November 2008). Motivation After the tsunami event of 26 December 2004, the German government initiated the GITEWS project (German Indonesian Tsunami Early Warning System) to develop a tsunami early warning system for Indonesia. GFZ Potsdam (German Research Centre for Geosciences), as the consortium leader of GITEWS, also covers several work packages, most of them related to sensor systems. The geodetic branch (Department 1) of the GFZ was assigned to develop a GNSS-based component. Brief system description The system covers all aspects from sensor stations with newly developed hardware and software designs, manufacturing and installation of stations, and real-time data transfer issues, to a newly developed automatic near-real-time data processing chain and a graphical user interface for early warning centre operators, including training on the system. GNSS sensors are installed on buoys, at tide gauges and as real-time reference stations (RTR stations), either stand-alone or co-located with seismic sensors. The GNSS data are transmitted to the warning centre where they are processed in a near-real-time data processing chain. For sensors on land, the processing system delivers deviations from their normal mean coordinates. These deviations, or so-called displacements, are indicators of land mass movements which can occur, e.g., due to strong earthquakes. The ground motion information is a valuable source for a fast understanding of an earthquake's mechanism, with possible relevance for a potentially following tsunami. In this way the GNSS system supports the decision-making process on whether a tsunami has most probably been generated or not. For buoy-based GNSS data, the processing (differential, with a GNSS reference station on land) delivers coordinates as well. Only the vertical component is of interest, as it corresponds to the instantaneous sea level height. Deviations from the mean sea level height are an indicator of a possibly passing tsunami wave. The graphical user interface (GUI) of the GNSS system supports both a quick view for all staff members at the warning centre (24h/7d shifts) and deeper analysis by GNSS experts. The GNSS GUI system is implemented as a web-based application and allows all views to be displayed on different screens at the same time, even at remote locations. This is part of the concept, as it can support the dialogue between warning centre staff on duty or on standby and sensor station maintenance staff. 
Acknowledgements The GITEWS project (German Indonesian Tsunami Early Warning System) is carried out by a large group of scientists and engineers from the (GFZ) German Research Centre for Geosciences and its partners from the German Aerospace Centre (DLR), the Alfred Wegener Institute for Polar and Marine Research (AWI), the GKSS Research Centre, the Konsortium Deutsche Meeresforschung (KDM), the Leibniz Institute for Marine Sciences (IFM-GEOMAR), the United Nations University (UNU), the Federal Institute for Geosciences and Natural Resources (BGR), the German Agency for Technical Cooperation (GTZ) and other international partners. The most relevant partners in Indonesia with respect to the GNSS component of GITEWS are the National Coordinating Agency for Surveys and Mapping (BAKOSURTANAL), the National Meteorology and Geophysics Agency (BMKG) and the National Agency for the Assessment and Application of Technology (BPPT). Funding is provided by the German Federal Ministry for Education and Research (BMBF), Grant 03TSU01.
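    A minimal sketch of the land-station displacement detection idea described above follows; the window contents and the 5 cm threshold are illustrative assumptions, not the values used in the GITEWS processing chain.

```python
import numpy as np

# Compare the latest near-real-time coordinate solutions of an RTR station
# with its long-term mean position and flag a significant co-seismic
# displacement that could indicate a tsunamigenic earthquake.

def detect_displacement(enu_history_m, enu_recent_m, threshold_m=0.05):
    """enu_* are (N, 3) arrays of east/north/up coordinates in metres."""
    mean_pos = np.mean(enu_history_m, axis=0)      # long-term mean position
    offset = np.mean(enu_recent_m, axis=0) - mean_pos
    magnitude = float(np.linalg.norm(offset))
    return magnitude > threshold_m, offset, magnitude
```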

  16. Tsunami Warning Center in Turkey : Status Update 2012

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.; Necmioglu, O.; Yalciner, A. C.; Kalafat, D.; Yilmazer, M.; Comoglu, M.; Sanli, U.; Gurbuz, C.; Erdik, M.

    2012-04-01

    This is an update to EGU2011-3094 on the progress of the establishment of a National Tsunami Warning Center in Turkey (NTWC-TR) under the UNESCO Intergovernmental Oceanographic Commission - Intergovernmental Coordination Group for the Tsunami Early Warning and Mitigation System in the North-eastern Atlantic, the Mediterranean and connected seas (IOC-ICG/NEAMTWS) initiative. NTWC-TR is integrated into the 24/7 operational National Earthquake Monitoring Center (NEMC) of KOERI, comprising 129 BB and 61 strong-motion sensors. Based on an agreement with the Disaster and Emergency Management Presidency (DEMP), data from 10 BB stations located on the Aegean and Mediterranean coasts are now transmitted in real time to KOERI. Real-time data transmission from 6 primary and 10 auxiliary stations of the International Monitoring System will be in place in the very near future based on an agreement concluded with the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in 2011. Under an agreement with a major Turkish GSM company, KOERI is enlarging its strong-motion network to promote real-time seismology and to extend the Earthquake Early Warning system countrywide. 25 accelerometers (included in the number given above) have been purchased and installed at Base Transceiver Station sites in coastal regions within the scope of this initiative. Data from 3 tide gauge stations operated by the General Command of Mapping (GCM) are being transmitted to KOERI via satellite connection, and the aim is to integrate all tide-gauge stations operated by GCM into NTWC-TR. A collaborative agreement has been signed with the European Commission - Joint Research Centre (EC-JRC); the MOD1 Tsunami Scenario Database and TAT (Tsunami Analysis Tool) have been received by KOERI and user training was provided. The database and the tool are linked to SeisComp3 and are currently operational. In addition, KOERI is continuing the work towards providing contributions to JRC in order to develop an improved database (MOD2), and is also continuing work related to the development of its own scenario database using the NAMI DANCE Tsunami Simulation and Visualization Software. Further improvement of the Tsunami Warning System at NTWC-TR will be accomplished through KOERI's participation in the FP-7 Project TRIDEC, focusing on new technologies for real-time intelligent earth information management to be used in Tsunami Early Warning Systems. In cooperation with the Turkish State Meteorological Service (TSMS), KOERI now has its own GTS system and is connected to the GTS via its own satellite hub. The system was successfully utilized during the First Enlarged Communication Test Exercise (NEAMTWS/ECTE1), where KOERI acted as the message provider. KOERI is providing guidance and assistance to a working group established within the DEMP on issues such as communication and tsunami exercises, national procedures and a National Tsunami Response Plan. KOERI is also participating in the NEAMTIC (North-Eastern Atlantic and Mediterranean Tsunami Information Centre) Project. Finally, during the 8th Session of NEAMTWS in November 2011, KOERI announced that NTWC-TR would be operational as of January 2012, covering the Eastern Mediterranean, Aegean, Marmara and Black Seas, and that KOERI is also ready to operate as an Interim Candidate Tsunami Watch Provider.

  17. Detecting Tsunami Source Energy and Scales from GNSS & Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Song, Y. T.; Yim, S. C.; Mohtat, A.

    2016-12-01

    Historically, tsunami warnings based on the earthquake magnitude have not been very accurate. According to the 2006 U.S. Government Accountability Office report, an unacceptable 75% false alarm rate has prevailed in the Pacific Ocean (GAO-06-519). One of the main reasons for those inaccurate warnings is that an earthquake's magnitude is not the scale or power of the resulting tsunami. For the last 10 years, we have been developing both theories and algorithms to detect tsunami source energy and scales, instead of earthquake magnitudes per se, directly from real-time Global Navigation Satellite System (GNSS) stations along coastlines for early warnings [Song 2007; Song et al., 2008; Song et al., 2012; Xu and Song 2013; Titov et al., 2016]. Here we will report recent progress on two fronts: 1) examples of using GNSS to detect the tsunami energy scales for the 2004 Sumatra M9.1 earthquake, the 2005 Nias M8.7 earthquake, the 2010 M8.8 Chilean earthquake, the 2011 M9.0 Tohoku-Oki earthquake, and the 2015 M8.3 Illapel earthquake; and 2) new results from recent state-of-the-art wave-maker experiments and comparisons with GNSS data. Related references: Titov, V., Y. T. Song, L. Tang, E. N. Bernard, Y. Bar-Sever, and Y. Wei (2016), Consistent estimates of tsunami energy show promise for improved early warning, Pure Appl. Geophys., DOI: 10.1007/s00024-016-1312-1. Xu, Z. and Y. T. Song (2013), Combining the all-source Green's functions and the GPS-derived source for fast tsunami prediction - illustrated by the March 2011 Japan tsunami, J. Atmos. Oceanic Tech., jtechD1200201. Song, Y. T., I. Fukumori, C. K. Shum, and Y. Yi (2012), Merging tsunamis of the 2011 Tohoku-Oki earthquake detected over the open ocean, Geophys. Res. Lett., doi:10.1029/2011GL050767. Song, Y. T., L.-L. Fu, V. Zlotnicki, C. Ji, V. Hjorleifsdottir, C.K. Shum, and Y. Yi (2008), The role of horizontal impulses of the faulting continental slope in generating the 26 December 2004 tsunami, Ocean Modelling, doi:10.1016/j.ocemod.2007.10.007. Song, Y. T. (2007), Detecting tsunami genesis and scales directly from coastal GPS stations, Geophys. Res. Lett., 34, L19602, doi:10.1029/2007GL031681.

  18. U.S. Tsunami Information technology (TIM) Modernization: Performance Assessment of Tsunamigenic Earthquake Discrimination System

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.

    2015-12-01

    Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes, and we examine the accuracy of the various discrimination methods and discuss issues related to their successful real-time application.
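    One of the discriminants listed above can be illustrated with a short sketch of the slowness parameter Theta = log10(E_R/M0) of Newman and Okal (1998); the numerical cut-off and the example values are illustrative only and are not the operational thresholds of the TIM system.

```python
import math

# Energy-deficient ("slow") ruptures have anomalously low Theta and are one
# indicator of a tsunami earthquake.

def slowness_theta(radiated_energy_j, seismic_moment_nm):
    return math.log10(radiated_energy_j / seismic_moment_nm)

def looks_like_tsunami_earthquake(radiated_energy_j, seismic_moment_nm,
                                  cutoff=-5.5):
    # -5.5 is an illustrative cut-off; typical interplate events sit near -4.9.
    return slowness_theta(radiated_energy_j, seismic_moment_nm) < cutoff

# Order-of-magnitude example: M0 ~ 3e20 N*m with E_R ~ 3e14 J gives Theta = -6.0
print(slowness_theta(3e14, 3e20))                       # -6.0
print(looks_like_tsunami_earthquake(3e14, 3e20))        # True
```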

  19. Rapid assessment of tsunami impact from real-time seismology and geographic, historical, and other datasets using machine learning

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto; Lomax, Anthony

    2017-04-01

    The impact of an earthquake, tsunami, volcanic eruption, severe weather or other natural disaster is related to: the intensity of the hazard; the vulnerability or exposure of the population, such as housing quality, infrastructure and proximity to a coastline; and the capacity to resist and cope with the disaster. Rapid assessment by monitoring agencies of the impact of a natural event is fundamental for early warning and response. We previously* proposed the "tsunami importance" parameter, It, for characterizing the strength of a tsunami. This parameter combines 5 descriptive indices from the NOAA/WDC Historical Tsunami Database: 4 tsunami impact measures (deaths, injuries, damage, houses destroyed), and maximum water height. Accordingly, It = 2 corresponds approximately to the JMA threshold for issuing a "Tsunami Warning", whereas the largest or most devastating tsunamis typically have It = 10. Here we discuss extending this simple, 5-component parameter with additional impact-related measures from relevant databases (e.g., LandScan population density, major infrastructure) and historical/archaeological information, and measures that might be obtained in near-real-time (e.g., emergency services, news, social media). We combine these measures with seismological and other real-time observations as an ensemble of features within automated procedures to estimate impact and guide decision making. We examine using modern machine learning methodologies to train and calibrate the procedures while working with a high-dimensional feature space. * Lomax, A. and A. Michelini (2011), Tsunami early warning using earthquake rupture duration and P-wave dominant period: the importance of length and depth of faulting, Geophys. J. Int., 185, 283-291, doi: 10.1111/j.1365-246X.2010.04916.x
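    The feature-ensemble idea can be sketched, under strong simplifying assumptions, as a regression of It on a handful of real-time and exposure features; the feature list, the synthetic training data, and the gradient-boosting choice below are placeholders for illustration and do not reflect the actual feature space or learning method evaluated in the study.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic stand-in dataset: each row mixes seismic measures with an
# exposure proxy; the target loosely mimics an impact score like It.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(6.5, 9.5, n),     # moment magnitude
    rng.uniform(5, 60, n),        # source depth (km)
    rng.uniform(20, 300, n),      # rupture duration (s)
    rng.uniform(0, 7, n),         # log10 coastal population (placeholder exposure proxy)
])
y = 1.2 * (X[:, 0] - 6.5) + 0.01 * X[:, 2] - 0.03 * X[:, 1] + 0.3 * X[:, 3]

model = GradientBoostingRegressor().fit(X, y)
print(model.predict([[9.0, 20.0, 250.0, 5.5]]))  # predicted impact score for a large, shallow, slow event
```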

  20. Large magnitude (M > 7.5) offshore earthquakes in 2012: few examples of absent or little tsunamigenesis, with implications for tsunami early warning

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Tinti, Stefano

    2013-04-01

    We consider several examples of offshore earthquakes that occurred worldwide in 2012 and were characterised by a "large" magnitude (Mw equal to or larger than 7.5) but produced no or little tsunami effect. Here, "little" means "lower than expected on the basis of the parent earthquake magnitude". The examples we analyse include three earthquakes that occurred along the Pacific coasts of Central America (20 March, Mw=7.8, Mexico; 5 September, Mw=7.6, Costa Rica; 7 November, Mw=7.5, Mexico), the Mw=7.6 and Mw=7.7 earthquakes that occurred on 31 August and 28 October offshore the Philippines and offshore Alaska, respectively, and the two Indian Ocean earthquakes registered on a single day (11 April) and characterised by Mw=8.6 and Mw=8.2. For each event, we address the problem of its tsunamigenic potential from two different perspectives. The first can be considered purely scientific and coincides with the question: why was the ensuing tsunami so weak? The answer can be related partly to the particular tectonic setting in the source area, partly to the particular position of the source with respect to the coastline, and finally to the focal mechanism of the earthquake and to the slip distribution on the ruptured fault. The first two pieces of information are available soon after the earthquake occurrence, while the third requires time periods in the order of tens of minutes. The second perspective is more "operational" and coincides with the tsunami early warning perspective, for which the question is: will the earthquake generate a significant tsunami and, if so, where will it strike? The Indian Ocean events of 11 April 2012 are perfect examples of the fact that information on the earthquake magnitude and position alone may not be sufficient to produce reliable tsunami warnings. We emphasise that it is of utmost importance that the focal mechanism determination is obtained in the future much more quickly than at present, and that this information must be included in the operational procedures of warning systems. On the other hand, we stress the importance of correct management of false alarms, which are almost impossible to avoid, both in the crisis and in the post-crisis phases. These are topics whose solution represents one of the major efforts of the EU-FP7 TRIDEC Project, in the framework of which this study is conducted.

  1. Toward a tsunami early warning system in Indonesia using rapid rupture duration estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madlazim

    2012-06-20

    Indonesia has had the Indonesian Tsunami Early Warning System (Ina-TEWS) since 2008. Ina-TEWS uses automatic processing of the hypocenter and the magnitudes Mwp, Mw(mB) and Mj. If an earthquake occurs in the ocean with depth < 70 km and magnitude > 7, Ina-TEWS announces an early warning that the earthquake can generate a tsunami. However, these announcements are still not accurate. The purpose of this research is to estimate the rupture duration of large Indonesian earthquakes that occurred in the Indian Ocean, Java, the Timor Sea, the Banda Sea, the Arafura Sea and the Pacific Ocean. We analyzed at least 330 vertical seismograms recorded by the IRIS-DMC network using a direct procedure for rapid assessment of earthquake tsunami potential, based on simple measurements of the high-frequency apparent rupture duration, T_dur, on P-wave vertical velocity seismograms. T_dur can be related to the critical parameters rupture length (L), depth (z) and shear modulus (μ), while T_dur may also be related to width (W), slip (D), z or μ. Our analysis shows that the rupture duration has a stronger influence on tsunami generation than Mw and depth. The rupture duration gives more information on tsunami impact, M0/μ, depth and size than Mw and other currently used discriminants. The longer the rupture duration, the shallower the earthquake source. For rupture durations greater than 50 s, the depth is less than 50 km, Mw is greater than 7, and the rupture length is longer, because T_dur is proportional to L; M0/μ is also larger, since M0/μ is proportional to L. Thus, the rupture duration alone carries information on these four parameters. We also suggest that tsunami potential is not directly related to the faulting type of the source, and that events with rupture durations greater than 50 s generated tsunamis. With real-time seismogram data available, rapid calculation of the rupture duration discriminant can be completed within 4-5 min after an earthquake occurs and can thus aid effective, accurate and reliable tsunami early warning for the Indonesia region.
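    A minimal sketch of measuring the apparent rupture duration T_dur on a single vertical P-wave record is given below; the frequency band, envelope threshold, and windowing are illustrative assumptions rather than the exact procedure used in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Band-pass the vertical trace to high frequencies, take its envelope, and
# measure how long the envelope stays above a fraction of its peak after
# the P arrival. A T_dur above ~50 s would flag elevated tsunami potential.

def rupture_duration(trace, fs, p_index, band=(1.0, 5.0), frac=0.25):
    """trace: velocity samples; fs: sampling rate (Hz); p_index: P-arrival sample."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    hf = filtfilt(b, a, trace)
    env = np.abs(hilbert(hf))[p_index:]
    above = np.nonzero(env > frac * env.max())[0]
    return (above[-1] - above[0]) / fs if above.size else 0.0
```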

  2. Steps Towards the Implementation of a Tsunami Detection, Warning, Mitigation and Preparedness Program for Southwestern Coastal Areas of Mexico

    NASA Astrophysics Data System (ADS)

    Farreras, Salvador; Ortiz, Modesto; Gonzalez, Juan I.

    2007-03-01

    The highly vulnerable Pacific southwest coast of Mexico has been repeatedly affected by local, regional and remote source tsunamis. Mexico presently has no national tsunami warning system in operation. The implementation of key elements of a National Program on Tsunami Detection, Monitoring, Warning and Mitigation is in progress. For the detection and monitoring of local and regional events, a prototype of a robust and low-cost high-frequency sea-level tsunami gauge, sampling every minute and equipped with 24-hour real-time transmission to the Internet, was developed and is currently in operation. Statistics allow identification of low, medium and extreme hazard categories of arriving tsunamis. These categories are used as prototypes for computer simulations of coastal flooding. A finite-difference numerical model is used, with linear wave theory for deep-ocean propagation, nonlinear shallow-water equations for the nearshore region and the interaction with the coast, and non-fixed boundaries for flooding and recession at the coast. For prevention purposes, tsunami inundation maps for several coastal communities are being produced in this way. The case of the heavily industrialized port of Lázaro Cárdenas, located on the sand shoals of a river delta, is illustrated, including a detailed vulnerability assessment study. For public education on preparedness and awareness, printed material for children and adults has been developed and published. It is intended to extend future coverage of this program to the Mexican Caribbean and Gulf of Mexico coastal areas.
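    To illustrate the deep-ocean propagation component of such a model, the following is a minimal 1-D linear shallow-water finite-difference sketch; the grid, bathymetry, and initial condition are arbitrary examples, whereas the operational model described above is two-dimensional, nonlinear near shore, and includes non-fixed flooding boundaries.

```python
import numpy as np

g = 9.81
nx, dx = 400, 2000.0                        # 800 km domain, 2 km cells
h = np.full(nx, 4000.0)                      # uniform 4 km depth (placeholder bathymetry)
dt = 0.5 * dx / np.sqrt(g * h.max())         # CFL-limited time step

eta = np.exp(-((np.arange(nx) * dx - 2.0e5) / 2.0e4) ** 2)  # initial 1 m sea-surface hump
flux = np.zeros(nx + 1)                                      # volume flux at cell edges

for _ in range(500):
    # momentum update at interior cell edges, then continuity update of eta
    flux[1:-1] -= g * 0.5 * (h[1:] + h[:-1]) * dt / dx * (eta[1:] - eta[:-1])
    eta -= dt / dx * (flux[1:] - flux[:-1])

print(float(eta.max()))  # ~0.5 m: half the initial hump travels in each direction
```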

  3. Introduction to "Global Tsunami Science: Past and Future, Volume III"

    NASA Astrophysics Data System (ADS)

    Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.

    2018-04-01

    Twenty papers on the study of tsunamis are included in Volume III of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 and Volume II as PAGEOPH, vol. 174, No. 8, 2017. Two papers in Volume III focus on specific details of the 2009 Samoa and the 1923 northern Kamchatka tsunamis; they are followed by three papers related to tsunami hazard assessment for three different regions of the world oceans: South Africa, Pacific coast of Mexico and the northwestern part of the Indian Ocean. The next six papers are on various aspects of tsunami hydrodynamics and numerical modelling, including tsunami edge waves, resonant behaviour of compressible water layer during tsunamigenic earthquakes, dispersive properties of seismic and volcanically generated tsunami waves, tsunami runup on a vertical wall and influence of earthquake rupture velocity on maximum tsunami runup. Four papers discuss problems of tsunami warning and real-time forecasting for Central America, the Mediterranean coast of France, the coast of Peru, and some general problems regarding the optimum use of the DART buoy network for effective real-time tsunami warning in the Pacific Ocean. Two papers describe historical and paleotsunami studies in the Russian Far East. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: asteroid airburst and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  4. Introduction to “Global tsunami science: Past and future, Volume III”

    USGS Publications Warehouse

    Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.

    2018-01-01

    Twenty papers on the study of tsunamis are included in Volume III of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 and Volume II as PAGEOPH, vol. 174, No. 8, 2017. Two papers in Volume III focus on specific details of the 2009 Samoa and the 1923 northern Kamchatka tsunamis; they are followed by three papers related to tsunami hazard assessment for three different regions of the world oceans: South Africa, Pacific coast of Mexico and the northwestern part of the Indian Ocean. The next six papers are on various aspects of tsunami hydrodynamics and numerical modelling, including tsunami edge waves, resonant behaviour of compressible water layer during tsunamigenic earthquakes, dispersive properties of seismic and volcanically generated tsunami waves, tsunami runup on a vertical wall and influence of earthquake rupture velocity on maximum tsunami runup. Four papers discuss problems of tsunami warning and real-time forecasting for Central America, the Mediterranean coast of France, the coast of Peru, and some general problems regarding the optimum use of the DART buoy network for effective real-time tsunami warning in the Pacific Ocean. Two papers describe historical and paleotsunami studies in the Russian Far East. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: asteroid airburst and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  5. Database of tsunami scenario simulations for Western Iberia: a tool for the TRIDEC Project Decision Support System for tsunami early warning

    NASA Astrophysics Data System (ADS)

    Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano

    2013-04-01

    TRIDEC is an EU-FP7 Project whose main goal is, in general terms, to develop suitable strategies for the management of crises that may arise in the Earth management field. The general paradigms adopted by TRIDEC to develop those strategies include intelligent information management, the capability of managing dynamically increasing volumes and dimensionality of information in complex events, and collaborative decision making in systems that are typically very loosely coupled. The two areas where TRIDEC applies and tests its strategies are tsunami early warning and industrial subsurface development. In the field of tsunami early warning, TRIDEC aims at developing a Decision Support System (DSS) that integrates 1) a set of seismic, geodetic and marine sensors devoted to the detection and characterisation of possible tsunamigenic sources and to monitoring the time and space evolution of the generated tsunami, 2) large-volume databases of pre-computed numerical tsunami scenarios, and 3) a proper overall system architecture. Two test areas are dealt with in TRIDEC: the western Iberian margin and the eastern Mediterranean. In this study, we focus on the western Iberian margin, with special emphasis on the Portuguese coasts. The strategy adopted in TRIDEC is to populate two different databases, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB), both of which deal only with earthquake-generated tsunamis. In the VSDB we numerically simulate a few large-magnitude events generated by the major known tectonic structures in the study area. Heterogeneous slip distributions on the earthquake faults are introduced to simulate events as "realistically" as possible. The members of the VSDB represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis management phase. On the other hand, the MSDB contains a very large number (of the order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes located in the "vicinity" of the virtual scenario earthquake. In the DSS perspective, the members of the MSDB have to be suitably combined based on the information coming from the sensor networks, and the results are used during the crisis evolution phase to forecast the degree of exposure of different coastal areas. We provide examples from both databases, whose members are computed by means of the in-house software UBO-TSUFD, which implements the non-linear shallow-water equations and solves them over a set of nested grids that guarantee a suitable spatial resolution (a few tens of meters) in specific, suitably chosen coastal areas.

  6. Early warning and evacuation from tsunami, volcano, flood and other hazards

    NASA Astrophysics Data System (ADS)

    Sugimoto, M.

    2012-12-01

    In response to the great loss of life, evacuation calls issued through the Japan Meteorological Agency (JMA), local governments and the media have changed drastically since the 2011 Tohoku tsunami in Japan. One example is that JMA changed from forecasting a concrete figure for tsunami height to issuing one of three tsunami height levels. Data show that the border between life and death was just two minutes of earlier evacuation in the case of the 2011 tsunami, which demonstrates how important it is for communities to prompt early evacuation for survival. However, the 2011 Tohoku tsunami revealed that, apart from effective education, there is no reliable trigger to prompt early evacuation when power is lost during a disaster. Warning calls were still in a complicated situation in Japan in July 2012. The 2012 northern Kyushu downpours reached around 110 millimeters per hour at their worst and caused some 30 casualties in Japan. JMA had learned from the tsunami, and this time issued the warning call "Unexpected severe rains" to local governments. However, local governments did not notice the call from JMA, which was delivered in the usual way. One local government said, "We were very busy preparing our staff. We looked at the necessary information on river water levels and flood prevention under the emergency" (NHK 2012). This case shows that JMA's evacuation calls now flow from upstream (JMA) to midstream (local governments) and downstream (communities), but the upstream calls have not yet engaged the midstream and the communities; early warning calls from upstream are still perceived as self-centered by both midstream and downstream. In the end, JMA could not convey a sense of crisis to the local governments. In 2011 the head of Oarai town in Ibaraki prefecture, Japan, independently decided to use a warning call different from other municipalities, "Order townspeople to evacuate immediately", even though no such call existed in any manual in Japan. This risk communication between the local government and the communities succeeded; people said they had never heard such a warning call, so they started evacuating immediately. Meanwhile, the Japanese government has adopted a strategy based on level 1 and level 2 tsunami heights. Japan is still seeking to harmonize evacuation with preventive infrastructure, and it is still not clear how to resolve the warning and prevention issues. This research addresses how Japan is currently struggling with these issues.

  7. Concerns over modeling and warning capabilities in wake of Tohoku Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-04-01

    Improved earthquake models, better tsunami modeling and warning capabilities, and a review of nuclear power plant safety are all greatly needed following the 11 March Tohoku earthquake and tsunami, according to scientists at the European Geosciences Union's (EGU) General Assembly, held 3-8 April in Vienna, Austria. EGU quickly organized a morning session of oral presentations and an afternoon panel discussion less than 1 month after the earthquake and the tsunami and the resulting crisis at Japan's Fukushima nuclear power plant, which has now been identified as having reached the same level of severity as the 1986 Chernobyl disaster. Many of the scientists at the EGU sessions expressed concern about the inability to have anticipated the size of the earthquake and the resulting tsunami, which appears likely to have caused most of the fatalities and damage, including damage to the nuclear plant.

  8. The 17 July 2006 Tsunami earthquake in West Java, Indonesia

    USGS Publications Warehouse

    Mori, J.; Mooney, W.D.; Afnimar; Kurniawan, S.; Anaya, A.I.; Widiyantoro, S.

    2007-01-01

    A tsunami earthquake (Mw = 7.7) occurred south of Java on 17 July 2006. The event produced relatively low levels of high-frequency radiation, and local felt reports indicated only weak shaking in Java. There was no ground motion damage from the earthquake, but there was extensive damage and loss of life from the tsunami along 250 km of the southern coasts of West Java and Central Java. An inspection of the area a few days after the earthquake showed extensive damage to wooden and unreinforced masonry buildings that were located within several hundred meters of the coast. Since there was no tsunami warning system in place, efforts to escape the large waves depended on how people reacted to the earthquake shaking, which was only weakly felt in the coastal areas. This experience emphasizes the need for adequate tsunami warning systems for the Indian Ocean region.

  9. TRIDEC Cloud - a Web-based Platform for Tsunami Early Warning tested with NEAMWave14 Scenarios

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven; Necmioglu, Ocal; Comoglu, Mustafa; Ozer Sozdinler, Ceren; Carrilho, Fernando; Wächter, Joachim

    2015-04-01

    In times of cloud computing and ubiquitous computing, the concepts and paradigms introduced by information and communications technology (ICT) have to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in research projects, new technologies are exploited to implement a cloud-based and web-based platform - the TRIDEC Cloud - to open up new prospects for EWS. The platform in its current version addresses tsunami early warning and mitigation. It merges several complementary external and in-house cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The TRIDEC Cloud can be accessed in two different modes: the monitoring mode and the exercise-and-training mode. The monitoring mode provides important functionality required to act in a real event. So far, the monitoring mode integrates historic and real-time sea level data and the latest earthquake information. The integration of sources is supported by a simple and secure interface. The exercise-and-training mode enables training and exercises with virtual scenarios. This mode disconnects real-world systems and connects with a virtual environment that receives virtual earthquake information and virtual sea level data re-played by a scenario player. Thus operators and other stakeholders are able to train skills and prepare for real events and large exercises. The GFZ German Research Centre for Geosciences (GFZ), the Kandilli Observatory and Earthquake Research Institute (KOERI), and the Portuguese Institute for the Sea and Atmosphere (IPMA) have used the opportunity provided by NEAMWave14 to test the TRIDEC Cloud as a collaborative activity based on previous partnership and commitments at the European scale. The TRIDEC Cloud was not officially involved in Part B of the NEAMWave14 scenarios. However, the scenarios were used by GFZ, KOERI, and IPMA for testing in exercise runs on October 27-28, 2014. Additionally, the Greek NEAMWave14 scenario was tested in an exercise run by GFZ alone on October 29, 2014 (see ICG/NEAMTWS-XI/13). The exercise runs demonstrated that operators in warning centres and stakeholders of other involved parties need just a standard web browser to access a full-fledged TEWS. The integration of GPU-accelerated tsunami simulation computations has been an integral part of fostering early warning with on-demand tsunami predictions based on actual source parameters. Thus tsunami travel times, estimated times of arrival and estimated wave heights are available immediately for visualization and for further analysis and processing. The generation of warning messages is based on internationally agreed message structures and includes static and dynamic information based on earthquake information, instant computations of tsunami simulations, and actual measurements. Generated messages are served for review, modification, and addressing in one simple form for dissemination via Cloud Messages, Shared Maps, e-mail, FTP/GTS, SMS, and FAX. Cloud Messages and Shared Maps are complementary channels and integrate interactive event and simulation data. Thus recipients are enabled to interact dynamically with a map and diagrams beyond traditional text information.

  10. State Emergency Response and Field Observation Activities in California (USA) during the March 11, 2011, Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Miller, K. M.; Wilson, R. I.; Goltz, J.; Fenton, J.; Long, K.; Dengler, L.; Rosinski, A.; California Tsunami Program

    2011-12-01

    This poster will present an overview of successes and challenges observed by the authors during this major tsunami response event. The Tohoku, Japan tsunami was the most costly to affect California since the 1964 Alaskan earthquake and ensuing tsunami. The Tohoku tsunami caused at least $50 million in damage to public facilities in harbors and marinas along the coast of California, and resulted in one fatality. It was generated by a magnitude 9.0 earthquake which occurred at 9:46PM PST on Thursday, March 10, 2011 in the sea off northern Japan. The tsunami was recorded at tide gages monitored by the West Coast/Alaska Tsunami Warning Center (WCATWC), which projected tsunami surges would reach California in approximately 10 hours. At 12:51AM on March 11, 2011, based on forecasted tsunami amplitudes, the WCATWC placed the California coast north of Point Conception (Santa Barbara County) in a Tsunami Warning, and the coast south of Point Conception to the Mexican border in a Tsunami Advisory. The California Emergency Management Agency (CalEMA) activated two Regional Emergency Operation Centers (REOCs) and the State Operation Center (SOC). The California Geological Survey (CGS) deployed a field team which collected data before, during and after the event through an information clearinghouse. Conference calls were conducted hourly between the WCATWC and State Warning Center, as well as with emergency managers in the 20 coastal counties. Coordination focused on local response measures, public information messaging, assistance needs, evacuations, emergency shelters, damage, and recovery issues. In the early morning hours, some communities in low lying areas recommended evacuation for their citizens, and the fishing fleet at Crescent City evacuated to sea. The greatest damage occurred in the harbors of Crescent City and Santa Cruz. As with any emergency, there were lessons learned and important successes in managing this event. Forecasts by the WCATWC were highly accurate. Exercises and workshops have enhanced communications between state and local agencies, and emergency managers are more educated about what to expect. Areas for improvement include keeping people out of the hazard area; educating the non-English speaking community; and reinforcing the long duration and unpredictable peak damaging waves of these events to emergency managers. The Governor proclaimed a state of emergency in six counties and the President declared a major disaster on April 18, 2011, allowing federal assistance to support repairs and economic recovery. Detailed evaluation of local maritime response activities, harbor damage, and measured and observed tsunami current velocity data will help the California Tsunami Program develop improved tsunami hazard maps and guidance for maritime communities. The state program will continue to emphasize the importance of both tsunami warnings and advisories, the unpredictable nature of each tsunami, and encourage public understanding of tsunamis to prepare and protect themselves in the future.

  11. A tsunami early warning system for the coastal area modeling

    NASA Astrophysics Data System (ADS)

    Soebroto, Arief Andy; Sunaryo, Suhartanto, Ery

    2015-04-01

    The tsunami is a potential disaster in the territory of Indonesia, which is an archipelagic country close to deep ocean; a tsunami struck Aceh province in 2004. Early prevention efforts have been carried out, one of them being the "tsunami buoy" developed by BPPT. This system places sensors on the ocean floor near the coast to detect earthquakes, and the detection results are transmitted via satellite by a transmitter floating on the sea surface. Each system costs billions of dollars, and another constraint has been theft of the floating "tsunami buoy" transmitter in the absence of guards. The system in this study uses radio-frequency transmission and is focused on coastal areas, where costs are lower, so that it can be applied at many beaches in Indonesia that are potentially affected by tsunamis. The monitoring system sends the detection results to the warning system using radio frequency with a range of about 3 km. Tests of the sensor sub-module of the monitoring system give an error of 0.63%, well within the 10% tolerance, indicating good-quality sensing. Tests of data transmission from the monitoring-system transceiver to the warning-system receiver show 100% successful delivery and reception of data. Tests of the whole system show that it functions 100% properly.

  12. Challenges for U.S. tsunami preparedness; NASA's Genesis crash blamed on design flaw

    NASA Astrophysics Data System (ADS)

    Zielinski, Sarah

    2006-06-01

    Challenges for U.S. tsunami preparedness: Despite recent improvements in U.S. tsunami preparedness, greater efforts are needed in tsunami hazard assessment, detection, warning, and mitigation, according to a 5 June report from the U.S. Government Accountability Office (GAO). (Eos, 87(21), 2006)

  13. Tsunami early warning system for the western coast of the Black Sea

    NASA Astrophysics Data System (ADS)

    Ionescu, Constantin; Partheniu, Raluca; Cioflan, Carmen; Constantin, Angela; Danet, Anton; Diaconescu, Mihai; Ghica, Daniela; Grecu, Bogdan; Manea, Liviu; Marmureanu, Alexandru; Moldovan, Iren; Neagoe, Cristian; Radulian, Mircea; Raileanu, Victor; Verdes, Ioan

    2014-05-01

    The Black Sea area is liable to tsunami generation, and statistics show that more than twenty tsunamis have been observed there in the past. The last tsunami was observed on 31 March 1901 in the western part of the Black Sea, in the Shabla area: an earthquake generated at a depth of 15 km below sea level triggered tsunami waves of 5 m height and caused material losses as well. The oldest tsunami ever recorded close to the Romanian shoreline dates from the year 104. This paper emphasises the participation of the National Institute for Earth Physics (NIEP) in the development of a tsunami warning system for the western coast of the Black Sea. In collaboration with the National Institute for Marine Geology and Geoecology (GeoEcoMar), the Institute of Oceanology and the Geological Institute, the last two belonging to the Bulgarian Academy of Sciences, NIEP has participated as a partner in the cross-border project "Set-up and implementation of key core components of a regional early-warning system for marine geohazards of risk to the Romanian-Bulgarian Black Sea coastal area - MARINEGEOHAZARDS", coordinated by GeoEcoMar. The main purpose of the project was the implementation of an integrated early-warning system, accompanied by a common decision-support tool, and the enhancement of regional technical capability for the adequate detection, assessment, forecasting and rapid notification of natural marine geohazards in the Romanian-Bulgarian Black Sea cross-border area. In recent years, NIEP has increased its interest in marine-related hazards such as tsunamis and, in collaboration with other institutions in Romania, is acting to strengthen cooperation and data exchange with institutions from the countries surrounding the Black Sea that already have tsunami monitoring infrastructure. In this respect, NIEP has developed a coastal network for marine seismicity by installing three new seismic stations in the coastal area of the Black Sea, together with sea-level sensors, radar and pressure sensors, and meteorological and GNSS stations at every site, providing tide-gauge and seismic data exchange with the Black Sea countries. At the same time, the Tsunami Analysis Tool (TAT) software for inundation modelling, along with its RedPhone application, was installed at the National Data Centre in Magurele and at the Dobrogea Seismic Observatory in the city of Eforie Nord, close to the Black Sea shore.

  14. Method to Determine Appropriate Source Models of Large Earthquakes Including Tsunami Earthquakes for Tsunami Early Warning in Central America

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro; Miranda, Greyving Jose Arguello; Gusman, Aditya Riadi; Fujii, Yushiro

    2017-08-01

    Large earthquakes, such as the Mw 7.7 1992 Nicaragua earthquake, have occurred off the Pacific coasts of El Salvador and Nicaragua in Central America and have generated destructive tsunamis along these coasts. It is necessary to determine appropriate fault models before large tsunamis hit the coast. In this study, fault parameters were first estimated from the W-phase inversion, and then an appropriate fault model was determined from the fault parameters and scaling relationships with a depth-dependent rigidity. The method was tested for four large earthquakes, the 1992 Nicaragua tsunami earthquake (Mw7.7), the 2001 El Salvador earthquake (Mw7.7), the 2004 El Astillero earthquake (Mw7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw7.3), which occurred off El Salvador and Nicaragua in Central America. Tsunami numerical simulations were carried out using the determined fault models. We found that the observed tsunami heights, run-up heights, and inundation areas were reasonably well explained by the computed ones. Therefore, our method should work for tsunami early warning purposes, estimating a fault model that reproduces tsunami heights near the coasts of El Salvador and Nicaragua due to large earthquakes in the subduction zone.

  15. Uncertainties in the 2004 Sumatra–Andaman source through nonlinear stochastic inversion of tsunami waves

    PubMed Central

    Venugopal, M.; Roy, D.; Rajendran, K.; Guillas, S.; Dias, F.

    2017-01-01

    Numerical inversions for earthquake source parameters from tsunami wave data usually incorporate subjective elements to stabilize the search. In addition, noisy and possibly insufficient data result in instability and non-uniqueness in most deterministic inversions, which are barely acknowledged. Here, we employ the satellite altimetry data for the 2004 Sumatra–Andaman tsunami event to invert the source parameters. We also include kinematic parameters that improve the description of tsunami generation and propagation, especially near the source. Using a finite fault model that represents the extent of rupture and the geometry of the trench, we perform a new type of nonlinear joint inversion of the slips, rupture velocities and rise times with minimal a priori constraints. Despite persistently good waveform fits, large uncertainties in the joint parameter distribution constitute a remarkable feature of the inversion. These uncertainties suggest that objective inversion strategies should incorporate more sophisticated physical models of seabed deformation in order to significantly improve the performance of early warning systems. PMID:28989311

  16. Uncertainties in the 2004 Sumatra-Andaman source through nonlinear stochastic inversion of tsunami waves.

    PubMed

    Gopinathan, D; Venugopal, M; Roy, D; Rajendran, K; Guillas, S; Dias, F

    2017-09-01

    Numerical inversions for earthquake source parameters from tsunami wave data usually incorporate subjective elements to stabilize the search. In addition, noisy and possibly insufficient data result in instability and non-uniqueness in most deterministic inversions, which are barely acknowledged. Here, we employ the satellite altimetry data for the 2004 Sumatra-Andaman tsunami event to invert the source parameters. We also include kinematic parameters that improve the description of tsunami generation and propagation, especially near the source. Using a finite fault model that represents the extent of rupture and the geometry of the trench, we perform a new type of nonlinear joint inversion of the slips, rupture velocities and rise times with minimal a priori constraints. Despite persistently good waveform fits, large uncertainties in the joint parameter distribution constitute a remarkable feature of the inversion. These uncertainties suggest that objective inversion strategies should incorporate more sophisticated physical models of seabed deformation in order to significantly improve the performance of early warning systems.

  17. Relationship between the Prediction Accuracy of Tsunami Inundation and Relative Distribution of Tsunami Source and Observation Arrays: A Case Study in Tokyo Bay

    NASA Astrophysics Data System (ADS)

    Takagawa, T.

    2017-12-01

    A rapid and precise tsunami forecast based on offshore monitoring is attracting attention as a way to reduce human losses due to devastating tsunami inundation. We developed a forecast method based on the combination of hierarchical Bayesian inversion with a pre-computed database and rapid post-computation of tsunami inundation. The method was applied to Tokyo Bay to evaluate the efficiency of observation arrays against three tsunamigenic earthquakes: a scenario earthquake at the Nankai Trough and the historic Genroku (1703) and Enpo (1677) earthquakes. In general, a dense observation array near the tsunami source has an advantage in both the accuracy and the rapidity of tsunami forecasts. To examine the effect of observation time length, we used four types of data with lengths of 5, 10, 20 and 45 minutes after the earthquake occurrence. Prediction accuracy of tsunami inundation was evaluated against the simulated tsunami inundation areas around Tokyo Bay for the target earthquakes. The shortest time length giving an accurate prediction varied with the target earthquake; here, an accurate prediction means that the simulated values fall within the 95% credible intervals of the prediction. In the Enpo case, a 5-minute observation is enough for an accurate prediction for Tokyo Bay, but 10 minutes and 45 minutes are needed in the Nankai Trough and Genroku cases, respectively. The difference in the shortest time length for accurate prediction shows a strong relationship with the relative distance between the tsunami source and the observation arrays. In the Enpo case, offshore tsunami observation points are densely distributed even in the source region, so an accurate prediction can be achieved within 5 minutes; such a rapid, precise prediction is useful for early warnings. Even in the worst case of Genroku, where fewer observation points are available near the source, an accurate prediction can be obtained within 45 minutes, which can still be useful for outlining the hazard at an early stage of the response.

  18. Tsunami Simulation Method Assimilating Ocean Bottom Pressure Data Near a Tsunami Source Region

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro

    2018-02-01

    A new method was developed to reproduce the tsunami height distribution in and around the source area, at a certain time, from a large number of ocean bottom pressure sensors, without information on the earthquake source. A dense cabled observation network called S-NET, which consists of 150 ocean bottom pressure sensors, was recently installed along a wide portion of the seafloor off Kanto, Tohoku, and Hokkaido in Japan. However, in the source area, ocean bottom pressure sensors cannot directly observe the initial ocean surface displacement; the new method was developed to address this. The method was tested and functioned well for a synthetic tsunami from a simple rectangular fault with an ocean bottom pressure sensor network at 10 arc-min (about 20 km) intervals. For a more realistic test case, sensors at 15 arc-min intervals along the north-south direction and 30 arc-min intervals along the east-west direction were used. In this test case, the method also functioned well enough to reproduce the tsunami height field in general. These results indicate that the method could be used for tsunami early warning by estimating the tsunami height field just after a great earthquake, without the need for earthquake source information.
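
    The abstract does not give the assimilation equations, so the following is a generic optimal-interpolation (Kalman-style) analysis step, assumed here purely to illustrate how a forecast tsunami height field could be corrected with sea-surface heights inferred from bottom pressure at the sensor locations; the method in the paper may differ.

```python
import numpy as np

def oi_update(eta_forecast, H, obs, R, B):
    """Generic optimal-interpolation analysis step (illustrative only).

    eta_forecast : (n,) forecast tsunami height field on the model grid
    H            : (m, n) observation operator picking grid nodes at sensors
    obs          : (m,) sea-surface heights inferred from bottom pressure
    R, B         : (m, m) and (n, n) observation / background error covariances
    """
    innovation = obs - H @ eta_forecast
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.solve(S, np.eye(S.shape[0]))   # gain matrix
    return eta_forecast + K @ innovation

# Toy example: 5-node grid, sensors at nodes 1 and 3 (all values hypothetical).
n = 5
H = np.zeros((2, n)); H[0, 1] = 1.0; H[1, 3] = 1.0
B = 0.1 * np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 2.0)
R = 0.01 * np.eye(2)
print(oi_update(np.zeros(n), H, np.array([0.5, 0.3]), R, B))
```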

  19. Tsunami early warning in the central Mediterranean: effect of the heterogeneity of the seismic source on the timely detectability of a tsunami

    NASA Astrophysics Data System (ADS)

    Armigliato, A.; Tinti, S.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    The central Mediterranean, and in particular the coasts of southern Italy, is one of the areas with the highest tsunami hazard in Europe. Limiting our attention to earthquake-generated tsunamis, the sources of historical events hitting this region, as well as most of the potential tsunamigenic seismic sources mapped there, are found at very short distances from the closest shorelines, reducing the time available before the tsunami strikes the coast to a few minutes. This by itself represents an issue from the Tsunami Early Warning (TEW) perspective. To make the overall problem even more challenging, it is known that large tsunamigenic earthquakes are generally characterized by highly heterogeneous slip distributions on the fault. This feature has been recognized clearly, for instance, in the giant Sumatra 2004, Chile 2010, and Japan 2011 earthquakes (magnitude 9.3, 8.8 and 9.0, respectively), but it was also a property of smaller-magnitude events that occurred in the region considered in this study, such as the 28 December 1908 Messina Straits tsunamigenic earthquake (M=7.2). In terms of tsunami impact, the heterogeneity of slip on the parent fault usually produces high variability of run-up and inundation on near-field coasts, which further complicates the TEW problem. The information on the details of the seismic source rupture coming from seismic (and possibly geodetic) networks, though of primary importance, is typically available after a time that is comparable to or larger than the interval between the generation and the impact of the tsunami. In the framework of the EU-FP7 TRIDEC Project, we investigate how proper marine sensor coverage, both along the coasts and offshore, can help place constraints on the characteristics of the source in near-real time. Our approach consists of discussing numerical tsunami scenarios in the central Mediterranean involving different slip distributions on the parent fault; the tsunamigenic region we consider is the Hyblaean-Malta escarpment offshore eastern Sicily, where several large historical tsunamigenic earthquakes took place (e.g. 11 January 1693). Starting from different slip configurations on a chosen fault, we compare time series of wave elevation simulated for tide gauges placed along the coast and for virtual deep-sea sensors placed at different distances from the source area. The final goal is to understand whether a properly designed marine sensor network can help determine in real time the slip characteristics along the parent fault and hence forecast the pattern of tsunami impact, especially along the closest coasts.

  20. CARIBE WAVE/LANTEX Caribbean and Western Atlantic Tsunami Exercises

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Whitmore, P.; Aliaga, B.; Huerfano Moreno, V.

    2013-12-01

    Over 75 tsunamis have been documented in the Caribbean and Adjacent Regions over the past 500 years. While most have been generated by local earthquakes, distantly generated tsunamis can also affect the region. For example, waves from the 1755 Lisbon earthquake and tsunami were observed in Cuba, the Dominican Republic, the British Virgin Islands, as well as Antigua, Martinique, Guadeloupe and Barbados in the Lesser Antilles. Since 1500, at least 4484 people are reported to have perished in these killer waves. Although the tsunami generated by the 2010 Haiti earthquake claimed only a few lives, the death tolls in the 1530 El Pilar, Venezuela; 1692 Port Royal, Jamaica; 1918 Puerto Rico; and 1946 Samaná, Dominican Republic tsunamis ranged up to over a thousand. Since then, there has been an explosive increase in residents, visitors, infrastructure, and economic activity along the coastlines, increasing the potential for human and economic loss. It has been estimated that on any given day upwards of 500,000 people could be in harm's way just along the beaches, with hundreds of thousands more working and living in the tsunami hazard zones. Given the relative infrequency of tsunamis, exercises are a valuable tool to test communications, evaluate preparedness and raise awareness. Exercises in the Caribbean are conducted under the framework of the UNESCO IOC Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS) and the US National Tsunami Hazard Mitigation Program. On March 23, 2011, 34 countries and territories participated in the first CARIBE WAVE/LANTEX regional tsunami exercise, while a total of 45 countries and territories participated in the second exercise on March 20, 2013. A total of 481 organizations (almost 200 more than in 2011) also registered to receive the bulletins issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center and/or the Puerto Rico Seismic Network. The CARIBE WAVE/LANTEX 13 scenario simulated a tsunami generated by a magnitude 8.5 earthquake originating north of Oranjestad, Aruba, in the Caribbean Sea. For the first time, earthquake impact was included in addition to expected tsunami impact. The initial message was issued by the warning centers over the established channels, and different mechanisms were then used by participants for further dissemination. The enhanced PTWC tsunami products for the Caribbean were also made available to the participants. To provide feedback on the exercise, an online survey tool with 85 questions was used. The survey demonstrated satisfaction with the exercise, timely receipt of bulletins, and interest in the enhanced PTWC products. It also revealed that while 93% of the countries had an activation and response process, only 59% indicated that they also had an emergency response plan for tsunamis, and even fewer had tsunami evacuation plans and inundation maps. Given that 80% of those surveyed indicated that CARIBE WAVE should be conducted annually, CARIBE EWS decided that the next exercise would be held on March 26, 2014, instead of waiting until 2015.

  1. Kuril Islands tsunami of November 2006: 1. Impact at Crescent City by distant scattering

    NASA Astrophysics Data System (ADS)

    Kowalik, Z.; Horrillo, J.; Knight, W.; Logan, Tom

    2008-01-01

    A numerical model for global tsunami computation constructed by Kowalik et al. (2005, 2007a) is applied to the tsunami of November 15, 2006 in the northern Pacific with a spatial resolution of one minute. Numerical results are compared to sea level data collected by Pacific DART buoys. The tide gauge at Crescent City (CC) recorded an initial tsunami wave of about 20 cm amplitude and a second, larger energy packet arriving 2 hours later. The first energy input into the CC harbor was the primary (direct) wave traveling over the deep waters of the North Pacific. Interactions with submarine ridges and numerous seamounts located in the tsunami path were a larger source of tsunami energy than the direct wave, and the travel time for these amplified energy fluxes is longer than for the direct wave. Prime sources for the larger fluxes at CC are interactions with Koko Guyot and Hess Rise. The tsunami waves next travel over the Mendocino Escarpment, where the tsunami energy flux is concentrated by refraction and directed toward CC. Local tsunami amplification over the shelf break and shelf is important as well. In many locations along the North Pacific coast, the first arriving signal or forerunner has lower amplitude than the main signal, which is often delayed. Understanding this temporal distribution is important for applications to tsunami warning and prediction. As a tsunami hazard mitigation tool, we propose that, along with sea level records (which are often quite noisy), energy flux be used for prediction of the delayed tsunami signals.
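
    As a rough illustration of the proposed energy-flux diagnostic, linear shallow-water theory gives a depth-integrated flux per unit crest length of about one half rho g a^2 sqrt(g h); the sketch below applies this textbook estimate with assumed values and is not the authors' implementation.

```python
import math

RHO = 1025.0   # seawater density, kg/m^3
G = 9.81       # gravity, m/s^2

def shallow_water_energy_flux(amplitude_m, depth_m):
    """Linear shallow-water energy flux per unit crest length (W/m):
    mean energy density (1/2)*rho*g*a^2 advected at phase speed sqrt(g*h)."""
    return 0.5 * RHO * G * amplitude_m**2 * math.sqrt(G * depth_m)

# A 0.2 m wave over 4000 m depth carries more flux per metre of crest
# than a 0.2 m wave over 50 m depth, by a factor of sqrt(4000/50), about 9.
print(shallow_water_energy_flux(0.2, 4000.0))
print(shallow_water_energy_flux(0.2, 50.0))
```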

  2. Tsunami Ready Recognition Program for the Caribbean and Adjacent Regions Launched in 2015

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Hinds, K.; Aliaga, B.; Brome, A.; Lopes, R.

    2015-12-01

    Over 75 tsunamis have been documented in the Caribbean and Adjacent Regions over the past 500 years, with 4,561 associated deaths according to the NOAA Tsunami Database. The most recent devastating tsunami occurred in 1946 in the Dominican Republic, where 1,865 people died. With the explosive increase in residents, tourists, infrastructure, and economic activity along the coasts, the potential for human and economic loss is enormous. It has been estimated that on any given day more than 500,000 people in the Caribbean could be in harm's way just along the beaches, with hundreds of thousands more working and living in the tsunami hazard zones. In 2005 the UNESCO Intergovernmental Oceanographic Commission established the Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (ICG CARIBE EWS) to coordinate tsunami efforts among the 48 participating countries and territories in the region. In addition to monitoring, modeling and communication systems, one of the fundamental components of the warning system is community preparedness, readiness and resilience. Over the past 10 years, 49 coastal communities in the Caribbean have been recognized as TsunamiReady® by the US National Weather Service (NWS) in the case of Puerto Rico and the US Virgin Islands, and jointly by UNESCO and the NWS in the case of the non-US jurisdictions of Anguilla and the British Virgin Islands. In response to the positive feedback from the implementation of TsunamiReady, the ICG CARIBE EWS in 2015 recommended the approval of guidelines for a community performance-based recognition program. It also recommended the adoption of the name "Tsunami Ready", which has been positively consulted with the NWS. Ten requirements were established for recognition, divided among Preparedness, Mitigation and Response elements; they were adapted from the proposed new US TsunamiReady guidelines and align well with emergency management functions. Both a regional ICG CARIBE EWS board and national/territorial "Tsunami Ready" boards will administer the recognition program. With the "Tsunami Ready" program, it will be possible to better track the full implementation of the tsunami warning system in the Caribbean and Adjacent Regions. Member States and donor agencies have been invited to support pilot projects.

  3. Tohoku-Oki Earthquake Tsunami Runup and Inundation Data for Sites Around the Island of Hawaiʻi

    USGS Publications Warehouse

    Trusdell, Frank A.; Chadderton, Amy; Hinchliffe, Graham; Hara, Andrew; Patenge, Brent; Weber, Tom

    2012-01-01

    At 0546 UTC on March 11, 2011, a Mw 9.0 ("great") earthquake occurred near the northeast coast of Honshu Island, Japan, generating a large tsunami that devastated the east coast of Japan and impacted many far-flung coastal sites around the Pacific Basin. After the earthquake, the Pacific Tsunami Warning Center issued a tsunami alert for the State of Hawaii, followed by a tsunami-warning notice from the local State Civil Defense on March 10, 2011 (Japan is 19 hours ahead of Hawaii). After the waves passed the islands, U.S. Geological Survey (USGS) scientists from the Hawaiian Volcano Observatory (HVO) measured inundation (maximum inland distance of flooding) and runup (elevation at the maximum extent of inundation), and took photographs in coastal areas around the Island of Hawaiʻi. Although the damage in West Hawaiʻi is well documented, HVO's mapping revealed that East Hawaiʻi coastlines were also impacted by the tsunami. The intent of this report is to provide runup and inundation data for sites around the Island of Hawaiʻi.

  4. THE TSUNAMI SERVICE BUS, AN INTEGRATION PLATFORM FOR HETEROGENEOUS SENSOR SYSTEMS

    NASA Astrophysics Data System (ADS)

    Fleischer, J.; Häner, R.; Herrnkind, S.; Kriegel, U.; Schwarting, H.; Wächter, J.

    2009-12-01

    The Tsunami Service Bus (TSB) is the sensor integration platform of the German Indonesian Tsunami Early Warning System (GITEWS) [1]. The primary goal of GITEWS is to deliver reliable tsunami warnings as fast as possible. This is achieved on the basis of various sensor systems, such as seismometers, ocean instrumentation, and GPS stations, all providing fundamental data to support the prediction of tsunami wave propagation by the GITEWS warning center. However, all these sensors come with their own proprietary data formats and specific behavior; new sensor types may also be added and old sensors replaced. To keep GITEWS flexible, the TSB was developed to access and control sensors in a uniform way. To meet these requirements, the TSB follows the architectural blueprint of a Service Oriented Architecture (SOA). The integration platform implements dedicated services communicating via a service infrastructure. The functionality required for early warnings is provided by loosely coupled services replacing the "hard-wired" coupling at the data level, so that changes in a sensor specification are confined to the data level without affecting the warning center. Great emphasis was placed on following the Sensor Web Enablement (SWE) standard [2], specified by the Open Geospatial Consortium (OGC) [3]. As a result, the full functionality needed in GITEWS could be achieved by implementing the four SWE services: the Sensor Observation Service for retrieving sensor measurements, the Sensor Alert Service for delivering sensor alerts, the Sensor Planning Service for tasking sensors, and the Web Notification Service for relaying messages to various media channels. Beyond these services, the TSB also follows the SWE Observations & Measurements (O&M) specification for data encoding and the Sensor Model Language (SensorML) for meta information. Moreover, access to sensors via the TSB is not restricted to GITEWS: multiple instances of the TSB can be composed to realize federated warning systems. Besides the TSB already operating at the BMKG warning center [4], two other organizations in Indonesia ([5], [6]) are considering using the TSB, making their data centers available to GITEWS. The presentation looks at the concepts and implementation and reflects on the usefulness of the mentioned standards. REFERENCES [1] GITEWS is a project of the German Federal Government to aid the reconstruction of the tsunami-prone region of the Indian Ocean, http://www.gitews.org/ [2] SWE, www.opengeospatial.org/projects/groups/sensorweb [3] OGC, www.opengeospatial.org [4] Meteorological and Geophysical Agency of Indonesia (BMKG), www.bmg.go.id [5] National Coordinating Agency for Surveys and Mapping (BAKOSURTANAL), www.bakosurtanal.go.id [6] Agency for the Assessment & Application of Technology (BPPT), www.bppt.go.id
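
    For readers unfamiliar with the SWE services, the sketch below assembles a key-value-pair GetObservation request for a Sensor Observation Service; the endpoint, offering, and observed-property identifiers are hypothetical placeholders, not GITEWS identifiers, and an actual TSB deployment may use the XML/POST binding instead.

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint and identifiers, for illustration only;
# a real deployment defines its own offerings and observed properties.
SOS_ENDPOINT = "http://example.org/sos"

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "TIDE_GAUGE_NETWORK",                          # hypothetical
    "observedProperty": "urn:ogc:def:property:OGC::SeaLevel",  # hypothetical
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}

# Build the key-value-pair request URL; fetching it (e.g. with urllib.request)
# would return an Observations & Measurements (O&M) XML document.
request_url = SOS_ENDPOINT + "?" + urlencode(params)
print(request_url)
```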

  5. Fusion of real-time simulation, sensing, and geo-informatics in assessing tsunami impact

    NASA Astrophysics Data System (ADS)

    Koshimura, S.; Inoue, T.; Hino, R.; Ohta, Y.; Kobayashi, H.; Musa, A.; Murashima, Y.; Gokon, H.

    2015-12-01

    Bringing together state-of-the-art high-performance computing, remote sensing and spatial information sciences, we establish a method of real-time tsunami inundation forecasting, damage estimation and mapping to enhance disaster response. Right after a major (near-field) earthquake occurs, we perform real-time tsunami inundation forecasting using a high-performance computing platform (Koshimura et al., 2014). Using Tohoku University's vector supercomputer, we accomplished the "10-10-10 challenge": completing tsunami source determination in 10 minutes and tsunami inundation modeling in 10 minutes at 10 m grid resolution. Given the maximum flow depth distribution, we perform quantitative estimation of the exposed population using census data and mobile phone data, and of the numbers of potential deaths and damaged structures by applying tsunami fragility curves. After the potential tsunami-affected areas are estimated, the analysis becomes focused and moves on to the "detection" phase using remote sensing. Recent advances in remote sensing technologies expand the capability of detecting the spatial extent of the tsunami-affected area and structural damage. In particular, a semi-automated method to estimate building damage in tsunami-affected areas has been developed using pre- and post-event high-resolution SAR (Synthetic Aperture Radar) data. The method is verified through case studies of the 2011 Tohoku event and other potential tsunami scenarios, and prototype system development is now underway in Kochi Prefecture, one of the coastal regions at risk from a Nankai Trough earthquake. In the trial operation, we verify the capability of the method as a new tsunami early warning and response system for stakeholders and responders.
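
    Tsunami fragility curves of the kind mentioned above are commonly expressed as lognormal (or normal) cumulative distribution functions of damage probability versus flow depth. The sketch below assumes a lognormal form with placeholder parameters; it is not the calibrated curve used in the study.

```python
import math

def fragility_lognormal(flow_depth_m, median_m=2.0, beta=0.6):
    """P(structural damage | flow depth), lognormal CDF form.
    median_m and beta are illustrative placeholders, not calibrated values."""
    if flow_depth_m <= 0.0:
        return 0.0
    z = (math.log(flow_depth_m) - math.log(median_m)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_damaged_buildings(depth_by_cell, buildings_by_cell):
    """Sum expected damage counts over grid cells of the inundation model."""
    return sum(fragility_lognormal(d) * n
               for d, n in zip(depth_by_cell, buildings_by_cell))

# Example: three cells with 1 m, 3 m, and 6 m maximum flow depth.
print(expected_damaged_buildings([1.0, 3.0, 6.0], [120, 80, 40]))
```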

  6. Using Interdisciplinary Research Methods to Revise and Strengthen the NWS TsunamiReadyTM Community Recognition Program

    NASA Astrophysics Data System (ADS)

    Scott, C.; Gregg, C. E.; Ritchie, L.; Stephen, M.; Farnham, C.; Fraser, S. A.; Gill, D.; Horan, J.; Houghton, B. F.; Johnson, V.; Johnston, D.

    2013-12-01

    The National Tsunami Hazard Mitigation Program (NTHMP) partnered with the National Weather Service (NWS) in early 2000 to create the TsunamiReadyTM Community Recognition program. TsunamiReadyTM, modeled after the older NWS StormReadyTM program, is designed to help cities, towns, counties, universities and other large sites in coastal areas reduce the potential for disastrous tsunami-related consequences. To achieve TsunamiReadyTM recognition, communities must meet certain criteria aimed at better preparing a community for tsunamis, including specific actions within the following categories: communications and coordination, tsunami warning reception, local warning dissemination, community preparedness, and administration. Using multidisciplinary research methods and strategies from public health; psychology; political, social and physical sciences; and evaluation, our research team is working directly with a purposive sample of community stakeholders in collaboration and feedback focus group sessions. Invitation to participate is based on a variety of factors, including but not limited to an individual's role as a formal or informal community leader (e.g., in business, government, civic organizations) or their organizational or agency affiliation with emergency management and response. Community organizing and qualitative research methods are being used to elicit discussion regarding TsunamiReadyTM requirements and the division of requirements based on aspects of tsunami hazard, vulnerability and risk, such as proximity to active or passive plate margins, or subduction-zone-generated tsunamis versus earthquake-landslide-generated tsunamis. The primary aim of this research is to use social science to revise and refine the NWS TsunamiReadyTM Guidelines in an effort to better prepare communities to reduce their risk from tsunamis.

  7. Tsunami Data and Scientific Data Diplomacy

    NASA Astrophysics Data System (ADS)

    Arcos, N. P.; Dunbar, P. K.; Gusiakov, V. K.; Kong, L. S. L.; Aliaga, B.; Yamamoto, M.; Stroker, K. J.

    2016-12-01

    Free and open access to data and information fosters scientific progress and can build bridges between nations even when political relationships are strained. Data and information held by one stakeholder may be vital for promoting the research of another. As an emerging field of inquiry, data diplomacy explores how data sharing helps create and support positive relationships between countries and enables the use of data for societal and humanitarian benefit. Tsunami has arguably been the only natural hazard addressed so effectively at an international scale, and it illustrates the success of scientific data diplomacy. Tsunami mitigation requires international scientific cooperation in both tsunami science and technology development. This requires not only international agreements but also working-level relationships between scientists from countries that may have different political and economic policies. For example, following the Pacific-wide tsunami of 1960 that killed two thousand people in Chile and then, up to a day later, hundreds in Hawaii, Japan, and the Philippines, delegates from twelve countries met to discuss and draft the requirements for an international tsunami warning system. The Pacific Tsunami Warning System led to the development of local, regional, and global tsunami databases and catalogs. For example, scientists at NOAA/NCEI and the Tsunami Laboratory of the Russian Academy of Sciences have collaborated on tsunami catalogs that are now routinely accessed by scientists and the public around the world. These data support decision-making during tsunami events and are used in developing inundation and evacuation maps and hazard assessments. This presentation will include additional examples of agreements for data sharing between countries, as well as challenges in standardization and consistency within the tsunami research community. Tsunami data and scientific data diplomacy have ultimately improved the understanding of tsunamis and their associated impacts.

  8. Ionospheric Method of Detecting Tsunami-Generating Earthquakes.

    ERIC Educational Resources Information Center

    Najita, Kazutoshi; Yuen, Paul C.

    1978-01-01

    Reviews the earthquake phenomenon and its possible relation to ionospheric disturbances. Discusses the basic physical principles involved and the methods upon which instrumentation is being developed for possible use in a tsunami disaster warning system. (GA)

  9. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPUs), for web mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface, so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) need only a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios replayed by a scenario player. The software system architecture and open interfaces facilitate global coverage, so that the system is applicable to any region in the world, and allow the integration of different sensor systems as well as of other hazard types and use cases beyond tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  10. Japanese Experience with Long-term Recovery from the 2011 Tohoku Earthquake and Tsunami Disaster

    NASA Astrophysics Data System (ADS)

    Hayashi, H.

    2015-12-01

    On March 11, 2011, a huge tsunami disaster hit the Pacific coast of the Tohoku region due to a magnitude 9.0 earthquake, killing almost 20,000 people. It was also the beginning of a long-term recovery to prepare for the next tsunami attack in the future. In this presentation, I review the recovery process in terms of the following five elements: quantification of tsunami hazards, public education, evacuation modeling, land-use planning, and real-time tsunami warning. It should be noted that there are lessons from the 2011 event at two different levels: the national level and the prefecture level. Regarding the quantification of tsunami hazard and real-time tsunami warning, the event was followed by a big change in tsunami policy at the national level, namely the introduction of two levels of tsunami scenarios for preparedness and mitigation: Level 1 (L1) and Level 2 (L2) tsunamis. L1 is the tsunami risk with a 50-year return period, and L2 the risk with a 1,000-year return period. As for public education, evacuation modeling, and land-use planning, there was a big difference between what happened in the northern half of the coast and the southern half. The northern half belongs to Iwate Prefecture, whose geography is a rias coast. People on the rias coast of Iwate Prefecture have been hit by tsunamis many times, on average about every 50 years. With this experience, they succeeded in reducing the number of fatalities to about 4,000, in comparison with 20,000 in the 1896 tsunami disaster. Most of the southern half belongs to Miyagi Prefecture, whose geography is a coastal plain. People on the coastal plain of Miyagi Prefecture had little experience with tsunami disasters and ended up with 14,000 deaths due to the tsunami attack. These differences in past tsunami experience between the two prefectures resulted in big differences in public education, evacuation modeling, and land-use planning.

  11. The FASTER Approach: A New Tool for Calculating Real-Time Tsunami Flood Hazards

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Cross, A.; Johnson, L.; Miller, K.; Nicolini, T.; Whitmore, P.

    2014-12-01

    In the aftermath of the 2010 Chile and 2011 Japan tsunamis that struck the California coastline, emergency managers requested that the state tsunami program provide more detailed information about the flood potential of distant-source tsunamis well ahead of their arrival time. The main issue is that existing tsunami evacuation plans call for evacuation of the predetermined "worst-case" tsunami evacuation zone (typically at a 30- to 50-foot elevation) during any "Warning"-level event; the alternative is to not call an evacuation at all. A solution that provides more detailed information for secondary evacuation zones has been the development of tsunami evacuation "playbooks" to plan for tsunami scenarios of various sizes and source locations. To determine a recommended level of evacuation during a distant-source tsunami, an analytical tool has been developed called the "FASTER" approach, an acronym for the factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and Run-up potential. Within the first couple of hours after a tsunami is generated, the National Tsunami Warning Center provides tsunami forecast amplitudes and arrival times for approximately 60 coastal locations in California. At the same time, the regional NOAA Weather Forecast Offices in the state calculate the forecast coastal storm and tidal conditions that will influence tsunami flooding. To add conservatism to the calculated tsunami flood potential, we include an error factor of 30% on the forecast amplitude, based on observed forecast errors during recent events, and a site-specific run-up factor calculated from the existing state tsunami modeling database. The factors are added together into a cumulative FASTER flood potential value for the first five hours of tsunami activity and used to select the appropriate tsunami phase evacuation "playbook", which is provided to each coastal community shortly after the forecast is issued.
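
    The additive bookkeeping behind the FASTER value can be sketched as follows; the inputs and the treatment of the run-up term here are hypothetical simplifications of the operational procedure described above.

```python
def faster_flood_elevation(forecast_amplitude_m, storm_surge_m, tide_m,
                           runup_factor=1.0, forecast_error_fraction=0.30):
    """Cumulative FASTER flood potential (m above datum) for one time window.

    Follows the additive structure described in the abstract:
    Forecast Amplitude + Storm + Tides + Error (30% of amplitude) + Run-up.
    The run-up term is modeled here as an extra multiplicative factor on the
    amplitude; the operational definition may differ.
    """
    error_m = forecast_error_fraction * forecast_amplitude_m
    runup_m = (runup_factor - 1.0) * forecast_amplitude_m
    return forecast_amplitude_m + storm_surge_m + tide_m + error_m + runup_m

# Hypothetical first five hours after arrival: take the worst combination.
hours = [
    # (forecast amplitude, storm surge, predicted tide), all in metres
    (1.2, 0.3, 0.8),
    (1.6, 0.3, 1.0),
    (1.4, 0.2, 1.1),
    (0.9, 0.2, 0.9),
    (0.7, 0.2, 0.6),
]
print(max(faster_flood_elevation(a, s, t, runup_factor=1.5) for a, s, t in hours))
```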

  12. A Sensitivity Analysis of Tsunami Inversions on the Number of Stations

    NASA Astrophysics Data System (ADS)

    An, Chao; Liu, Philip L.-F.; Meng, Lingsen

    2018-05-01

    Current finite-fault inversions of tsunami recordings generally adopt as many tsunami stations as possible to better constrain earthquake source parameters. In this study, inversions are evaluated by the waveform residual, which measures the difference between model predictions and recordings, and the dependence of inversion quality on the number of tsunami stations is derived. Results for the 2011 Tohoku event show that, if the tsunami stations are optimally located, the waveform residual decreases significantly with the number of stations when the number is 1-4 and remains almost constant when the number is larger than 4, indicating that 2-4 stations are able to recover the main characteristics of the earthquake source. The optimal locations of tsunami stations are explained in the text. A similar analysis is applied to the Manila Trench in the South China Sea using artificially generated earthquakes and virtual tsunami stations. The results confirm that 2-4 stations are necessary and sufficient to constrain the earthquake source parameters, and optimal station sites are recommended in the text. This conclusion is useful for the design of new tsunami warning systems. Current strategies of tsunameter network design mainly focus on the early detection of tsunami waves from potential sources to coastal regions. We therefore recommend that, in addition to the current strategies, the waveform residual also be taken into consideration so as to minimize the error of tsunami wave prediction for warning purposes.
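
    The waveform residual used as the quality metric is, in essence, a normalized misfit between recorded and predicted waveforms; the sketch below assumes a standard normalized L2 form, which may differ in detail from the paper's definition.

```python
import numpy as np

def waveform_residual(observed, predicted):
    """Normalized L2 misfit over all stations.

    observed, predicted : lists of 1-D numpy arrays, one per tsunami station,
    sampled on the same time axis. Returns ||obs - pred|| / ||obs||.
    """
    num = sum(np.sum((o - p) ** 2) for o, p in zip(observed, predicted))
    den = sum(np.sum(o ** 2) for o in observed)
    return np.sqrt(num / den)

# Toy example with two synthetic stations.
t = np.linspace(0.0, 3600.0, 361)
obs = [0.3 * np.sin(2 * np.pi * t / 1200.0), 0.2 * np.sin(2 * np.pi * t / 1500.0)]
pred = [0.9 * o for o in obs]                  # model underestimates by 10%
print(waveform_residual(obs, pred))            # ~0.1
```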

  13. Recent improvements in earthquake and tsunami monitoring in the Caribbean

    NASA Astrophysics Data System (ADS)

    Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.

    2007-12-01

    Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of the vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA's Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations, in Barbuda, Grand Turk, and Jamaica, will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunamis (DART) stations at sites in regions with a history of generating destructive tsunamis. Recently, NOAA completed the deployment of 7 DART stations off the coasts of Montauk Pt, NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New Orleans, LA; and Bermuda as part of the U.S. tsunami warning system expansion. DART systems consist of an anchored seafloor bottom pressure recorder (BPR) and a companion moored surface buoy for real-time communications. The new stations are a second-generation design (DART II) equipped with two-way satellite communications that allow NOAA's Tsunami Warning Centers to set stations in event mode in anticipation of possible tsunamis or to retrieve high-resolution (15-s interval) data in one-hour blocks for detailed analysis. Combined with the development of sophisticated wave propagation and site-specific inundation models, the DART data are being used to forecast wave heights for at-risk coastal communities. NOAA expects to deploy a total of 39 DART II buoy stations by 2008 (32 in the Pacific and 7 in the Atlantic, Caribbean and Gulf regions). The seismic and DART networks are two components of a comprehensive and fully operational global observing system to detect and warn the public of earthquake and tsunami threats. NOAA and the USGS are working together to make important strides in enhancing communication networks so that residents and visitors can receive earthquake and tsunami watches and warnings around the clock.

  14. Coastal Warning Display Program

    Science.gov Websites

    As of February 15, 1989, the National Weather Service retired its Coastal Warning Display Program.

  15. Rapid tsunami models and earthquake source parameters: Far-field and local applications

    USGS Publications Warehouse

    Geist, E.L.

    2005-01-01

    Rapid tsunami models have recently been developed to forecast far-field tsunami amplitudes from initial earthquake information (magnitude and hypocenter). Earthquake source parameters that directly affect tsunami generation as used in rapid tsunami models are examined, with particular attention to local versus far-field application of those models. First, validity of the assumption that the focal mechanism and type of faulting for tsunamigenic earthquakes is similar in a given region can be evaluated by measuring the seismic consistency of past events. Second, the assumption that slip occurs uniformly over an area of rupture will most often underestimate the amplitude and leading-wave steepness of the local tsunami. Third, sometimes large magnitude earthquakes will exhibit a high degree of spatial heterogeneity such that tsunami sources will be composed of distinct sub-events that can cause constructive and destructive interference in the wavefield away from the source. Using a stochastic source model, it is demonstrated that local tsunami amplitudes vary by as much as a factor of two or more, depending on the local bathymetry. If other earthquake source parameters such as focal depth or shear modulus are varied in addition to the slip distribution patterns, even greater uncertainty in local tsunami amplitude is expected for earthquakes of similar magnitude. Because of the short amount of time available to issue local warnings and because of the high degree of uncertainty associated with local, model-based forecasts as suggested by this study, direct wave height observations and a strong public education and preparedness program are critical for those regions near suspected tsunami sources.

  16. Introduction to “Global tsunami science: Past and future, Volume I”

    USGS Publications Warehouse

    Geist, Eric L.; Fritz, Hermann; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-01-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  17. Introduction to "Global Tsunami Science: Past and Future, Volume I"

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Fritz, Hermann M.; Rabinovich, Alexander B.; Tanioka, Yuichiro

    2016-12-01

    Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  18. Tsunami Early Warning Within Five Minutes

    NASA Astrophysics Data System (ADS)

    Lomax, Anthony; Michelini, Alberto

    2013-09-01

    Tsunamis are most destructive at near to regional distances, arriving within 20-30 min after a causative earthquake; effective early warning at these distances requires notification within 15 min or less. The size and impact of a tsunami also depend on sea floor displacement, which is related to the length, L, width, W, mean slip, D, and depth, z, of the earthquake rupture. Currently, the primary seismic discriminant for tsunami potential is the centroid-moment tensor magnitude, Mw(CMT), representing the product LWD and estimated via an indirect inversion procedure. However, the obtained Mw(CMT) and the implied LWD value vary with rupture depth, earth model, and other factors, and are only available 20-30 min or more after an earthquake. The use of more direct discriminants for tsunami potential could avoid these problems and aid in effective early warning, especially at near to regional distances. Previously, we presented a direct procedure for rapid assessment of earthquake tsunami potential using two simple measurements on P-wave seismograms: the predominant period on velocity records, Td, and the likelihood, T50Ex, that the high-frequency, apparent rupture duration, T0, exceeds 50-55 s. We have shown that Td and T0 are related to the critical rupture parameters L, W, D, and z, and that either of the period-duration products Td*T0 or Td*T50Ex gives more information on tsunami impact and size than Mw(CMT), Mwp, and other currently used discriminants. These results imply that tsunami potential is not directly related to the product LWD from the "seismic" faulting model, as is assumed with the use of the Mw(CMT) discriminant. Instead, information on rupture length, L, and depth, z, as provided by Td*T0 or Td*T50Ex, can constrain well the tsunami potential of an earthquake. We introduce here special treatment of the signal around the S arrival at close stations, a modified, real-time Mwpd(RT) magnitude, and other procedures to enable early estimation of event parameters and tsunami discriminants. We show that with real-time data currently available in most regions of tsunami hazard, event locations, mb and Mwp magnitudes, and the direct period-duration discriminant Td*T50Ex can be determined within 5 min after an earthquake occurs, and T0, Td*T0, and Mwpd(RT) within approximately 10 min. This processing is implemented and running continuously in real time within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it). We also show that the difference mb - log10(Td*T0) forms a rapid discriminant for slow, tsunami earthquakes. The rapid availability of these measurements can aid in faster and more reliable tsunami early warning at near to regional distances.
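
    The discriminants above are simple combinations of three rapidly measurable quantities; a minimal sketch of that bookkeeping follows, with threshold values that are illustrative assumptions rather than the operational Early-est settings.

```python
import math

def tsunami_discriminants(td_s, t0_s, mb):
    """Combine predominant period Td, apparent rupture duration T0, and mb.

    Returns the period-duration product Td*T0 and the slow-earthquake
    discriminant mb - log10(Td*T0) described in the abstract. The threshold
    values below are illustrative assumptions, not operational settings.
    """
    td_t0 = td_s * t0_s
    slowness = mb - math.log10(td_t0)
    flags = {
        "long_duration": t0_s > 55.0,                    # T0 exceeds ~50-55 s
        "possible_tsunami_earthquake": slowness < 5.0,   # assumed cut-off
    }
    return td_t0, slowness, flags

# Example with values typical of a slow, shallow rupture (hypothetical).
print(tsunami_discriminants(td_s=15.0, t0_s=120.0, mb=6.9))
```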

  19. Use of Advanced Tsunami Hazard Assessment Techniques and Tsunami Source Characterizations in U.S. and International Nuclear Regulatory Activities

    NASA Astrophysics Data System (ADS)

    Kammerer, A. M.; Godoy, A. R.

    2009-12-01

    In response to the 2004 Indian Ocean Tsunami, as well as in anticipation of the submission of license applications for new nuclear facilities, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear power plants and other coastal facilities in the United States. To undertake this effort, the US NRC organized a collaborative research program, jointly undertaken with researchers at the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA), for the purpose of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. This study identified and modeled both seismic and landslide tsunamigenic sources in the near- and far-field. The results from this work are now being used directly as the basis for the review of tsunami hazard at potential nuclear plant sites. This application once again shows the important role that the earth sciences can play in addressing issues of importance to society. Because the Indian Ocean Tsunami was a global event, a number of cooperative international activities have also been initiated within the nuclear community. The results of US efforts are being incorporated into updated regulatory guidance for both the US NRC and the United Nations' International Atomic Energy Agency (IAEA). Coordinated efforts are underway to integrate state-of-the-art tsunami warning tools developed by NOAA into NRC and IAEA activities. The goal of the warning systems project is to develop automated protocols that allow scientists at these agencies to have up-to-the-minute, user-specific information in hand shortly after a potential tsunami has been identified by the US Tsunami Warning System. Lastly, USGS and NOAA scientists are assisting the NRC and IAEA in a special Extra-Budgetary Program (IAEA EBP) on tsunami being coordinated by the IAEA's International Seismic Safety Center. This IAEA EBP is focused on sharing lessons learned, tsunami hazard assessment techniques, and numerical tools among UN Member States. The complete body of basic and applied research undertaken in these many projects represents the combined effort of a diverse group of marine geologists, geophysicists, geotechnical engineers, seismologists and hydrodynamic modelers at multiple organizations.

  20. Rapid inundation estimates at harbor scale using tsunami wave heights offshore simulation and coastal amplification laws

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Loevenbruck, A.; Hebert, H.

    2013-12-01

    Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the computation, which is exacerbated when detailed grids are required to precisely model the coastline response of an individual harbor. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included in the French Tsunami Warning Center (CENALT), providing rapid estimation of tsunami warning at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these high-sea tsunami forecasting simulations. The method involves an empirical correction based on theoretical amplification laws (either Green's law or Synolakis's law). The main limitation is that its application to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, we use a set of synthetic mareograms calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids of increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). Nonlinear shallow-water tsunami modeling performed on a single coarse 2-arc-minute bathymetric grid is compared to the values given by the time-consuming nested-grid simulations (and to observations when available), in order to check to what extent the simple approach based on the amplification laws can explain the data. The idea is to fit tsunami data with numerical modeling carried out without any refined coastal bathymetry/topography. To this end, several parameters are discussed, namely the bathymetric depth to which model results must be extrapolated (when using Green's law), or the mean bathymetric slope to consider near the studied coast (when using Synolakis's law).
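
    Green's law, one of the amplification laws mentioned above, scales the wave amplitude by the fourth root of the depth ratio; a minimal sketch follows (omitting the Synolakis run-up law and the paper's empirically fitted correction parameters).

```python
def greens_law_amplitude(eta_offshore_m, depth_offshore_m, depth_target_m):
    """Shoal an offshore tsunami amplitude to a shallower target depth using
    Green's law: eta2 = eta1 * (h1 / h2) ** 0.25 (linear, energy-conserving,
    no reflection or harbor resonance)."""
    return eta_offshore_m * (depth_offshore_m / depth_target_m) ** 0.25

# Example: a 0.15 m amplitude computed on the coarse grid at 2000 m depth,
# extrapolated to a nominal 10 m depth outside a harbor.
print(greens_law_amplitude(0.15, 2000.0, 10.0))   # roughly 0.56 m
```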

  1. Evaluation and Numerical Simulation of Tsunami for Coastal Nuclear Power Plants of India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Pavan K.; Singh, R.K.; Ghosh, A.K.

    2006-07-01

    The tsunami generated on December 26, 2004 by the Sumatra earthquake of magnitude 9.3 resulted in inundation at various coastal sites of India. The site selection and design of Indian nuclear power plants demand the evaluation of run-up and of the structural barriers for the coastal plants; it is also desirable to evaluate the early warning system for tsunamigenic earthquakes. Tsunamis originate from submarine faults, underwater volcanic activity, sub-aerial landslides impinging on the sea, and submarine landslides. In the case of a submarine earthquake-induced tsunami, the wave is generated in the fluid domain by displacement of the seabed. There are three phases of a tsunami: generation, propagation, and run-up. The Reactor Safety Division (RSD) of Bhabha Atomic Research Centre (BARC), Trombay, has initiated computational simulation of all three phases (tsunami source generation, propagation, and run-up evaluation) for the protection of public life, property and the various industrial infrastructures located in the coastal regions of India. These studies could be effectively utilized for the design and implementation of an early warning system for the coastal region of the country, apart from catering to the needs of Indian nuclear installations. This paper presents some results on tsunami waves based on different analytical/numerical approaches with shallow water wave theory. (authors)
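
    For the propagation phase, shallow water wave theory gives a phase speed of sqrt(g h), which underlies quick travel-time estimates for early warning; the sketch below integrates this speed along a simplified 1-D bathymetric profile with assumed depths and distances.

```python
import math

G = 9.81  # gravity, m/s^2

def travel_time_along_path(segment_lengths_km, segment_depths_m):
    """Approximate tsunami travel time (s) along a 1-D path by summing
    segment_length / sqrt(g * depth) over piecewise-constant depth segments."""
    return sum((L * 1000.0) / math.sqrt(G * h)
               for L, h in zip(segment_lengths_km, segment_depths_m))

# Hypothetical 1500 km path from a source to the coast: deep ocean, shelf, nearshore.
lengths_km = [1200.0, 250.0, 50.0]
depths_m = [3500.0, 200.0, 30.0]
t = travel_time_along_path(lengths_km, depths_m)
print(f"{t / 3600.0:.2f} hours")   # roughly 4 hours for this profile
```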

  2. On the Development of Multi-Hazard Early Warning Networks: Practical experiences from North and Central America.

    NASA Astrophysics Data System (ADS)

    Mencin, David; Hodgkinson, Kathleen; Braun, John; Meertens, Charles; Mattioli, Glen; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl

    2015-04-01

    The GAGE facility, managed by UNAVCO, maintains and operates about 1300 GNSS stations distributed across North and Central America as part of the EarthScope Plate Boundary Observatory (PBO) and the Continuously Operating Caribbean GPS Observational Network (COCONet). UNAVCO has upgraded about 450 stations in these networks to real-time, high-rate GNSS (RT-GNSS) and included surface meteorological instruments. The majority of these streaming stations are part of the PBO, but they also include approximately 50 RT-GNSS stations in the Caribbean and Central American region as part of the COCONet and TLALOCNet projects. Based on community input, UNAVCO has been exploring ways to increase the capability and utility of these resources to improve our understanding in diverse areas of geophysics, including seismic, volcanic, magmatic and tsunami deformation sources, extreme weather events such as hurricanes and storms, and space weather. The RT-GNSS networks also have the potential to profoundly transform our ability to rapidly characterize geophysical events and provide early warning, as well as to improve hazard mitigation and response. Specific applications currently under development with university, commercial, non-profit and government collaboration at national and international scales include earthquake and tsunami early warning systems, near real-time tropospheric modeling of hurricanes, and assimilation of precipitable water vapor estimates. Using tsunami early warning as an example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of earthquake sources and associated deformation, which informs the initial modeled tsunami. The networks can then also provide direct measurements of tsunami wave heights and propagation by tracking the associated ionospheric disturbance from several hundreds of kilometers away as the waves approach the shoreline. These GNSS-based constraints can refine the tsunami and inundation models and potentially mitigate hazards. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Our operational system has multiple communities that use and depend on a Pan-Pacific real-time open data set. The ability to merge existing data sets and user communities, including seismic and tide gauge observations, with GNSS and meteorological data products has proven complicated because of issues related to metadata, appropriate data formats, data quality assessment in real time, and specific issues related to using these products in operational forecasting. Additional issues related to data access across national borders and to cognizant, government-sanctioned "early warning" agencies, some committed to specific technologies, methodologies and internal structures and further constrained by data policies, make a truly operational system an ongoing work in progress. We present a short history of evolving a very large and expensive RT-GNSS network, originally designed to answer specific long-term scientific questions about the structure and evolution of the North American plate boundaries, into a much-needed national hazard system, while continuing to serve our core community in long-term scientific studies. Our primary focus in this presentation is an analysis of our current goals and of the impediments to achieving these broader objectives.

  3. CoopEUS Case Study: Tsunami Modelling and Early Warning Systems for Near Source Areas (Mediterranean, Juan de Fuca).

    NASA Astrophysics Data System (ADS)

    Beranzoli, Laura; Best, Mairi; Chierici, Francesco; Embriaco, Davide; Galbraith, Nan; Heeseman, Martin; Kelley, Deborah; Pirenne, Benoit; Scofield, Oscar; Weller, Robert

    2015-04-01

    There is a need for tsunami modeling and early warning systems for near-source areas. This is a common public safety threat, for example, in the Mediterranean and along the Juan de Fuca/NE Pacific coast of North America, regions covered by the EMSO, OOI, and ONC ocean observatories. Through the CoopEUS international cooperation project, a number of environmental research infrastructures have come together to coordinate efforts on environmental challenges; this tsunami case study tackles one such challenge. There is a mutual need for tsunami event field data and modeling to deepen our experience in testing methodology and developing real-time data processing. Tsunami field data are already available for past events; part of this use case compares these data for compatibility, gap analysis, and model groundtruthing. It also reviews the sensors needed and harmonizes instrument settings. Sensor metadata and registries are compared, harmonized, and aligned. Data policies and access are also compared and assessed for gaps. Modelling algorithms are compared and tested against archived and real-time data. This case study will then be extended to other related tsunami data and model sources globally with similar geographic and seismic scenarios.

  4. Signals in the ionosphere generated by tsunami earthquakes: observations and modeling support

    NASA Astrophysics Data System (ADS)

    Rolland, L.; Sladen, A.; Mikesell, D.; Larmat, C. S.; Rakoto, V.; Remillieux, M.; Lee, R.; Khelfi, K.; Lognonne, P. H.; Astafyeva, E.

    2017-12-01

    Forecasting systems failed to predict the magnitude of the 2011 great tsunami in Japan due to the difficulty and cost of instrumenting the ocean with high-quality, dense networks. Melgar et al. (2013) showed that using all of the conventional data (inland seismic, geodetic, and tsunami gauges) with the best inversion method still fails to predict the correct height of the tsunami before it breaks onto a coast near the epicenter (< 500 km). On the other hand, in the last decade, scientists have gathered convincing evidence of transient signals in ionospheric Total Electron Content (TEC) observations that are associated with open-ocean tsunami waves. Even though typical tsunami waves are only a few centimeters high, they are powerful enough to create atmospheric vibrations extending all the way to the ionosphere, 300 kilometers up in the atmosphere. We therefore propose to incorporate ionospheric signals into tsunami early-warning systems. We anticipate that the method could be decisive for mitigating "tsunami earthquakes", which trigger tsunamis larger than expected from their short-period magnitude. These events are challenging to characterize as they rupture the near-trench subduction interface, in a distant region less constrained by onshore data. As a couple of devastating tsunami earthquakes happen per decade, they represent a real threat for onshore populations and a challenge for tsunami early-warning systems. We will present TEC observations of the 2006 Java and 2010 Mentawai tsunami earthquakes and base our analysis on acoustic ray tracing, normal-mode summation and the simulation code SPECFEM, which solves the wave equation in coupled acoustic (ocean, atmosphere) and elastic (solid earth) domains. Rupture histories are entered as finite source models, which will allow us to evaluate the effect of a relatively slow rupture on the surrounding ocean and atmosphere.

  5. Establishing an early warning alert and response network following the Solomon Islands tsunami in 2013.

    PubMed

    Bilve, Augustine; Nogareda, Francisco; Joshua, Cynthia; Ross, Lester; Betcha, Christopher; Durski, Kara; Fleischl, Juliet; Nilles, Eric

    2014-11-01

    On 6 February 2013, a magnitude 8.0 earthquake generated a tsunami that struck the Santa Cruz Islands, Solomon Islands, killing 10 people and displacing over 4700. A post-disaster assessment of the risk of epidemic disease transmission recommended the implementation of an early warning alert and response network (EWARN) to rapidly detect, assess and respond to potential outbreaks in the aftermath of the tsunami. Almost 40% of the Santa Cruz Islands' population were displaced by the disaster and were living in cramped temporary camps with poor or absent sanitation facilities and insufficient access to clean water. There was no early warning disease surveillance system. By 25 February, an EWARN was operational in five health facilities that served 90% of the displaced population. Eight priority diseases or syndromes were reported weekly; unexpected health events were reported immediately. Between 25 February and 19 May, 1177 target disease or syndrome cases were reported. Seven alerts were investigated. No sustained transmission or epidemics were identified. Reporting compliance was 85%. The EWARN was then transitioned to the routine four-syndrome early warning disease surveillance system. It was necessary to conduct a detailed assessment to evaluate the risk and potential impact of serious infectious disease outbreaks and to assess whether and how enhanced early warning disease surveillance should be implemented. Local capacities and available resources should be considered in planning EWARN implementation. An EWARN can be an opportunity to establish or strengthen early warning disease surveillance capabilities.

  6. Tsunami Generation Modelling for Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Annunziato, A.; Matias, L.; Ulutas, E.; Baptista, M. A.; Carrilho, F.

    2009-04-01

    In the frame of a collaboration between the European Commission Joint Research Centre and the Institute of Meteorology in Portugal, a complete analytical tool to support Early Warning Systems is being developed. The tool will be part of the Portuguese National Early Warning System and will also be used in the frame of the UNESCO North-East Atlantic section of the Tsunami Early Warning System. The system, called Tsunami Analysis Tool (TAT), includes a worldwide scenario database that has been pre-calculated using the SWAN-JRC code (Annunziato, 2007). This code uses a simplified fault generation mechanism, and its hydraulic model is based on the SWAN code (Mader, 1988). In addition to the pre-defined scenarios, a system of computers is always ready to start a new calculation whenever a new earthquake is detected by the seismic networks (such as USGS or EMSC) and is judged capable of generating a tsunami. The calculation is performed using minimal parameters (the epicentre and magnitude of the earthquake): the programme calculates the rupture length and rupture width using the empirical relationships proposed by Ward (2002). The database calculations, as well as the newly generated calculations for the current conditions, are therefore available to TAT, where the real online analysis is performed. The system also allows sea level measurements available worldwide to be analyzed, in order to compare them and decide whether a tsunami is really occurring or not. Although TAT, connected with the scenario database and the online calculation system, is at the moment the only software that can support tsunami analysis on a global scale, we are convinced that the fault generation mechanism is too simplified to give a correct tsunami prediction. Furthermore, short tsunami arrival times in particular require earthquake source parameters describing the tectonic features of the faults, such as strike, dip, rake and slip, in order to minimize the real-time uncertainty of the rupture parameters. Indeed, the earthquake parameters available right after an earthquake are preliminary and can be inaccurate. Determining which earthquake source parameters affect the initial height and time series of tsunamis will show the sensitivity of the tsunami time series to seismic source details. Therefore a new fault generation model will be adopted, according to the seismotectonic properties of the different regions, and finally included in the calculation scheme. In order to do this, within the collaboration framework with the Portuguese authorities, a new model is being defined, starting from the seismic sources in the North Atlantic, Caribbean and Gulf of Cadiz. As earthquakes occurring in North Atlantic and Caribbean sources may affect mainland Portugal and the Azores and Madeira archipelagos, these sources will also be included in the analysis. We have started by examining the geometries of those sources that spawn tsunamis, to understand the effect of fault geometry and earthquake depth. References: Annunziato, A., 2007. The Tsunami Assessment Modelling System by the Joint Research Centre, Science of Tsunami Hazards, Vol. 26, pp. 70-92. Mader, C.L., 1988. Numerical Modelling of Water Waves, University of California Press, Berkeley, California. Ward, S.N., 2002. Tsunamis, Encyclopedia of Physical Science and Technology, Vol. 17, pp. 175-191, ed. Meyers, R.A., Academic Press.
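    As a hedged illustration of the kind of magnitude-to-rupture scaling step this abstract describes (not the actual Ward (2002) relationship nor the SWAN-JRC implementation), the sketch below converts a moment magnitude to rupture length, width and mean slip using the standard seismic-moment definition plus assumed aspect-ratio and strain-drop values.

    ```python
    import numpy as np

    MU = 3.0e10  # shear modulus (Pa), a typical crustal value (assumption)

    def rupture_from_magnitude(mw, aspect=2.0, strain_drop=2.0e-5):
        """Illustrative scaling: seismic moment from Mw, then self-similar
        rupture dimensions assuming L = aspect * W and mean slip D = strain_drop * L.
        Coefficients are generic textbook assumptions, not Ward (2002) values."""
        m0 = 10.0 ** (1.5 * mw + 9.1)                  # seismic moment (N*m)
        # M0 = MU * L * W * D = MU * L * (L / aspect) * (strain_drop * L)
        length = (m0 * aspect / (MU * strain_drop)) ** (1.0 / 3.0)
        width = length / aspect
        slip = strain_drop * length
        return length / 1e3, width / 1e3, slip         # km, km, m

    L, W, D = rupture_from_magnitude(8.0)
    print(f"Mw 8.0 -> L ~ {L:.0f} km, W ~ {W:.0f} km, slip ~ {D:.1f} m")
    ```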

  7. Contribution to the top-down alert system associated with the upcoming French tsunami warning center (CENALT): tsunami hazard assessment along the French Mediterranean coast for the ALDES project

    NASA Astrophysics Data System (ADS)

    Loevenbruck, A.; Quentel, E.; Hebert, H.

    2011-12-01

    The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional centre, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is being set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining the tsunami risk related to earthquakes and landslides along the French Mediterranean coast. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Tsunamis have already affected the western Mediterranean coast; however, past events are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa margin, the Ligurian Sea and the South Tyrrhenian Sea are considered the main tsunamigenic zones. In order to forecast the largest plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. Models of propagation in the basin and off the French coast allow the potential threat to be evaluated at regional scale in terms of source location and highlight the most exposed areas. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the availability of appropriate DTMs (Digital Terrain Models). Indeed, the accuracy of the propagation models relies on the resolution of the input bathymetry, especially in shallow water areas, and the inundation estimation also depends on the precision of the coastal topographic data. The ALDES project allows the SHOM and the IGN to conduct high-resolution data acquisition in the Litto3D framework for 2 sites, one west of the Gulf of Lion and one west of the French Riviera. DTMs of the third site, centered on the Antibes Cape, are built using pre-existing data sets with lower resolution. Detailed modeling of the tsunami scenarios provides a refined estimation of the potential impacts; it points out the most exposed places and the morphologic features prone to amplify potential waves and to generate significant coastal effects. Expected water heights and currents, inundation distances and run-up elevations are assessed. Our set of simulations gives an evaluation of the expected maximum impact distribution and highlights places, such as specific beaches or harbors, where mitigation measures must be given priority.

  8. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program (PR-NTHMP)

    NASA Astrophysics Data System (ADS)

    Vanacore, E. A.; Huerfano Moreno, V. A.; Lopez, A. M.

    2015-12-01

    The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. Of particular interest is the Puerto Rico - Virgin Islands (PRVI) region, where the proximity of the coast to prominent tectonic faults would result in near-field tsunamis. Tsunami hazard assessment, detection capabilities, warning, education and outreach efforts are common tools intended to reduce loss of life and property. It is for these reasons that the PRSN is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the NTHMP. This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. In order for threatened communities in PR to be recognized as TsunamiReady by the US NWS, the PR component of the NTHMP has identified and modeled sources for local, regional and tele-tsunamis, and the results of the simulations have been used to develop tsunami response plans. The main goal of the PR-NTHMP is to strengthen resilient coastal communities that are prepared for tsunami hazards and to have PR recognized as TsunamiReady. Evacuation maps were generated in three phases: first, hypothetical tsunami scenarios of potential underwater earthquakes were developed; these scenarios were then modeled during the second phase. The third phase consisted of determining the worst-case scenario based on the Maximum of Maximums (MOM). Inundation and evacuation zones were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Maps and related evacuation products, such as evacuation times, can be accessed online via the PR Tsunami Decision Support Tool. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. The existing tsunami protocol and criteria in the PR/VI were also updated. This paper describes recent PR-NTHMP outcomes, including real-time monitoring as well as the protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education and outreach efforts in Puerto Rico.

  9. Performances of the New Real Time Tsunami Detection Algorithm applied to tide gauges data

    NASA Astrophysics Data System (ADS)

    Chierici, F.; Embriaco, D.; Morucci, S.

    2017-12-01

    Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new tsunami detection algorithm (TDA) based on real-time tide removal and real-time band-pass filtering of seabed pressure time series acquired by Bottom Pressure Recorders. The TDA greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability with respect to the most widely used tsunami detection algorithm, while containing the computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures that can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. In this work we present the performance of the TDA applied to tide gauge data. We have adapted the new tsunami detection algorithm and the Monte Carlo test methodology to tide gauges. Sea level data acquired by coastal tide gauges in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event generated by the Tohoku earthquake of March 11th, 2011, using data recorded by several tide gauges scattered across the Pacific area.
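    The abstract does not give the filter design, but the general approach (tide removal plus band-pass filtering and a threshold test on the residual) can be sketched as below; the cut-off periods, threshold, sampling interval and the off-line zero-phase filter are all assumptions, not the published TDA parameterization.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def detect_tsunami(sea_level, fs, band_periods=(600.0, 7200.0), threshold=0.05):
        """Illustrative detector: band-pass the record between assumed tsunami
        periods (10 min to 2 h), which also removes the tide, and flag samples
        exceeding a fixed amplitude threshold (m)."""
        low_f, high_f = 1.0 / band_periods[1], 1.0 / band_periods[0]
        sos = butter(4, [low_f, high_f], btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, sea_level)      # off-line stand-in for the real-time filter
        alarms = np.flatnonzero(np.abs(filtered) > threshold)
        return filtered, alarms

    # Synthetic example: semi-diurnal tide plus a small 20-min-period tsunami packet
    fs = 1.0 / 60.0                                  # one sample per minute
    t = np.arange(0, 3 * 86400, 60.0)                # three days of data
    tide = 1.0 * np.sin(2 * np.pi * t / 44714.0)     # ~12.42 h tidal constituent
    tsunami = 0.1 * np.sin(2 * np.pi * t / 1200.0) * np.exp(-((t - 1.5e5) / 4000.0) ** 2)
    filtered, alarms = detect_tsunami(tide + tsunami, fs)
    print("first alarm at t =", t[alarms[0]] if alarms.size else None, "s")
    ```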

  10. Tsunami Hazards - A National Threat

    USGS Publications Warehouse

    ,

    2006-01-01

    In December 2004, when a tsunami killed more than 200,000 people in 11 countries around the Indian Ocean, the United States was reminded of its own tsunami risks. In fact, devastating tsunamis have struck North America before and are sure to strike again. Especially vulnerable are the five Pacific States--Hawaii, Alaska, Washington, Oregon, and California--and the U.S. Caribbean islands. In the wake of the Indian Ocean disaster, the United States is redoubling its efforts to assess the Nation's tsunami hazards, provide tsunami education, and improve its system for tsunami warning. The U.S. Geological Survey (USGS) is helping to meet these needs, in partnership with the National Oceanic and Atmospheric Administration (NOAA) and with coastal States and counties.

  11. Correlation Equation of Fault Size, Moment Magnitude, and Height of Tsunami Case Study: Historical Tsunami Database in Sulawesi

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Pribadi, Sugeng; Muzli, Muzli

    2018-03-01

    Sulawesi, one of the biggest islands in Indonesia, is located at the convergence of two macro plates, Eurasia and Pacific. The NOAA and Novosibirsk Tsunami Laboratory databases show more than 20 tsunamis recorded in Sulawesi since 1820. Based on these data, a correlation between tsunami and earthquake parameters needs to be determined to validate all past events. The complete data on magnitudes, fault sizes and tsunami heights used in this study are sourced from the NOAA and Novosibirsk tsunami databases, complemented by the Pacific Tsunami Warning Center (PTWC) catalog. This study aims to find the correlation between moment magnitude, fault size and tsunami height by simple regression. The steps of this research are data collection, processing, and regression analysis. Results show that moment magnitude, fault size and tsunami height are strongly correlated. This analysis is sufficient to confirm the accuracy of the historical tsunami database for Sulawesi from NOAA, the Novosibirsk Tsunami Laboratory and PTWC.
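    The abstract does not reproduce the regression itself; a minimal sketch of the kind of fit described (log tsunami height against moment magnitude by ordinary least squares) is shown below, with hypothetical values standing in for the NOAA/Novosibirsk/PTWC catalog entries.

    ```python
    import numpy as np
    from scipy.stats import linregress

    # Hypothetical catalog values (Mw, maximum water height in m) standing in
    # for the NOAA / Novosibirsk / PTWC entries used in the study.
    mw     = np.array([6.3, 6.9, 7.2, 7.5, 7.9, 8.1])
    height = np.array([0.3, 0.8, 1.5, 2.4, 4.0, 6.5])

    # Fit log10(height) = a * Mw + b, a common form for magnitude-height scaling.
    fit = linregress(mw, np.log10(height))
    print(f"log10(H) = {fit.slope:.2f} * Mw + {fit.intercept:.2f},  r = {fit.rvalue:.2f}")

    # Predicted height for a hypothetical Mw 7.7 event
    print(f"predicted H(Mw 7.7) ~ {10 ** (fit.slope * 7.7 + fit.intercept):.1f} m")
    ```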

  12. Building strategies for tsunami scenarios databases to be used in a tsunami early warning decision support system: an application to western Iberia

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    One of the most challenging goals that the geo-scientific community has faced since the catastrophic tsunami that occurred in December 2004 in the Indian Ocean is to develop the so-called "next generation" Tsunami Early Warning Systems (TEWS). Indeed, "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether or not a tsunami has been generated by a given source and, if so, to send proper warnings and/or alerts in a suitable time to all the countries and communities that can be affected by the tsunami. Instead, "next generation" identifies with the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterise the possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios to be suitably combined based on the information coming from the sensor environment and to be used to forecast the degree of exposure of different coastal places both in the near- and in the far-field, and 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies that are being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of a few selected earthquake-generated tsunamis. The cases provided by the members of the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis management phase. The MSDB contains a very large number (of the order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes located in the vicinity of the virtual scenario earthquake. Examples from both databases will be presented.

  13. Rapid inundation estimates using coastal amplification laws in the western Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Gailler, Audrey; Loevenbruck, Anne; Hébert, Hélène

    2014-05-01

    Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the computation, which is aggravated when detailed grids are required for precise modeling of the coastline response of an individual harbor. Thus, only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of tsunami warning at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these high-sea tsunami forecasting simulations. The method involves an empirical correction based on theoretical amplification laws (either Green's law or Synolakis' law). The main limitation is that its application to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, we use a set of synthetic mareograms calculated both for hypothetical events and for well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids of increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). Non-linear shallow water tsunami modeling performed on a single 2' coarse bathymetric grid is compared to the values given by the time-consuming nested-grid simulations (and to observations when available), in order to check to what extent the simple approach based on the amplification laws can explain the data. The idea is to fit tsunami data with numerical modeling carried out without any refined coastal bathymetry/topography. To this end several parameters are discussed, namely the bathymetric depth to which model results must be extrapolated (using Green's law), or the mean bathymetric slope to consider near the studied coast (when using Synolakis' law).
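    Green's law, the simpler of the two amplification laws mentioned, scales an offshore wave amplitude by the fourth root of the depth ratio. The sketch below applies it to a forecast value extracted from a coarse-grid simulation; the depths and amplitudes are illustrative numbers only, not CENALT settings.

    ```python
    def greens_law_amplitude(eta_offshore, depth_offshore, depth_coast):
        """Green's law: wave amplitude scales as (h_offshore / h_coast) ** 0.25
        under shoaling without dissipation or reflection."""
        return eta_offshore * (depth_offshore / depth_coast) ** 0.25

    # Illustrative numbers: 0.3 m forecast at the 100 m isobath of a coarse grid,
    # extrapolated to a nominal 1 m depth near a harbor entrance.
    eta_coast = greens_law_amplitude(0.3, 100.0, 1.0)
    print(f"coastal amplitude ~ {eta_coast:.2f} m")   # ~0.95 m
    ```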

  14. Helping coastal communities at risk from tsunamis: the role of U.S. Geological Survey research

    USGS Publications Warehouse

    Geist, Eric L.; Gelfenbaum, Guy R.; Jaffe, Bruce E.; Reid, Jane A.

    2000-01-01

    In 1946, 1960, and 1964, major tsunamis (giant sea waves usually caused by earthquakes or submarine landslides) struck coastal areas of the Pacific Ocean. In the U.S. alone, these tsunamis killed hundreds of people and caused many tens of millions of dollars in damage. Recent events in Papua New Guinea (1998) and elsewhere are reminders that a catastrophic tsunami could strike U.S. coasts at any time. The USGS, working closely with NOAA and other partners in the National Tsunami Hazard Mitigation Program, is helping to reduce losses from tsunamis through increased hazard assessment and improved real-time warning systems.

  15. A hazard-independent approach for the standardised multi-channel dissemination of warning messages

    NASA Astrophysics Data System (ADS)

    Esbri Palomares, M. A.; Hammitzsch, M.; Lendholt, M.

    2012-04-01

    The tsunami disaster affecting the Indian Ocean region at Christmas 2004 demonstrated very clearly the shortcomings in tsunami detection and public warning processes, as well as in intergovernmental warning message exchange, in the Indian Ocean region. Early warning systems therefore require that the dissemination of early warning messages be executed in a way that ensures that message delivery is timely and that the message content is understandable, usable and accurate. To that end, diverse and multiple dissemination channels must be used to increase the chance of the messages reaching all affected persons in a hazard scenario. In addition, the use of internationally accepted standards for warning dissemination, such as the Common Alerting Protocol (CAP) and the Emergency Data Exchange Language (EDXL) Distribution Element specified by the Organization for the Advancement of Structured Information Standards (OASIS), increases the interoperability among different warning systems, thus enabling the system-of-systems concept proposed by GEOSS. The project Distant Early Warning System (DEWS), co-funded by the European Commission under the 6th Framework Programme, aims at strengthening early warning capacities by building an innovative generation of interoperable tsunami early warning systems based on the above-mentioned concepts, following a Service-Oriented Architecture (SOA) approach. The project focuses on the downstream part of hazard information processing, where customized, user-tailored warning messages and alerts flow from the warning centre to the responsible authorities and/or the public with their different needs and responsibilities. The information logistics services within DEWS generate tailored EDXL-DE/CAP warning messages for each user that must receive the message, according to their preferences, e.g., settings for language, areas of interest, dissemination channels, etc. However, the significant differences in the implementation and capabilities of dissemination channels such as SMS, email and television have a bearing on the information processing required for the delivery and consumption of a DEWS EDXL-DE/CAP message over each channel. These messages may include additional information in the form of maps, graphs, documents, sensor observations, etc. Therefore, the generated messages are pre-processed by channel adaptors in the information dissemination services, converting them into a format that is suitable for end-to-end delivery over the dissemination channels without any semantic distortion. The approach followed by DEWS for disseminating warnings not only relies on traditional communication means used by already established early warning systems, such as the delivery of faxes and phone calls, but also takes into consideration other broadly used communication channels such as SMS, email, narrowcast and broadcast television, instant messaging, Voice over IP, and radio. It also takes advantage of social media channels such as RSS feeds, Facebook, Twitter, etc., enabling a multiplier effect, as in the case of radio and television, and thus allowing mash-ups to be created by aggregating other sources of information with the original message. Finally, status information is also important in order to assess whether the process of disseminating the warning to the message consumers has completed successfully or has failed at some point of the dissemination chain. To that end, CAP-based messages generated within the information dissemination services provide the semantics for those fields that are of interest within the context of reporting the warning dissemination status in DEWS.
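    For illustration of the message layer discussed above, the sketch below assembles a minimal CAP 1.2 alert with Python's standard library; the identifier, sender, timestamps, event wording and area are invented placeholders, and a real DEWS EDXL-DE/CAP payload carries considerably more structure than shown here.

    ```python
    import xml.etree.ElementTree as ET

    CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

    def qn(tag):
        """Clark-notation tag in the CAP 1.2 namespace."""
        return f"{{{CAP_NS}}}{tag}"

    def build_cap_alert():
        """Build a minimal, placeholder CAP 1.2 tsunami warning message."""
        ET.register_namespace("", CAP_NS)           # serialize without a prefix
        alert = ET.Element(qn("alert"))
        for tag, text in [("identifier", "DEWS-DEMO-0001"),
                          ("sender", "warning-centre@example.org"),
                          ("sent", "2012-04-01T12:00:00+00:00"),
                          ("status", "Exercise"),
                          ("msgType", "Alert"),
                          ("scope", "Public")]:
            ET.SubElement(alert, qn(tag)).text = text
        info = ET.SubElement(alert, qn("info"))
        for tag, text in [("category", "Geo"),
                          ("event", "Tsunami Warning"),
                          ("urgency", "Immediate"),
                          ("severity", "Extreme"),
                          ("certainty", "Likely"),
                          ("headline", "Tsunami warning for demonstration area")]:
            ET.SubElement(info, qn(tag)).text = text
        area = ET.SubElement(info, qn("area"))
        ET.SubElement(area, qn("areaDesc")).text = "Demonstration coastal segment"
        return ET.tostring(alert, encoding="unicode")

    print(build_cap_alert())
    ```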

  16. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  17. The potential role of real-time geodetic observations in tsunami early warning

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto

    2016-04-01

    Tsunami warning systems (TWS) have the final goal of issuing a reliable alert of an incoming dangerous tsunami to coastal populations early enough to allow people to flee from the shore and coastal areas according to evacuation plans. In the last decade, especially after the catastrophic 2004 Boxing Day tsunami in the Indian Ocean, much attention has been given to filling gaps in the existing TWSs (which at that time only covered the Pacific Ocean) and to establishing new TWSs in ocean regions that were uncovered. Typically, TWSs operating today handle only earthquake-induced tsunamis. They estimate earthquake location and size quickly by real-time processing of seismic signals; on the basis of pre-defined "static" procedures (either decision matrices or pre-archived tsunami simulations), they assess the tsunami alert level on a large regional scale and issue specific bulletins to a pre-selected audience of recipients. Not infrequently, these procedures result in generic alert messages of little value. What operative TWSs usually do not do is compute the earthquake focal mechanism, calculate the co-seismic sea-floor displacement, assess the initial tsunami conditions, input these data into tsunami simulation models, and compute tsunami propagation up to the threatened coastal districts. This series of steps is considered nowadays too time-consuming to provide the required timely alert. An equivalent series of steps could start from the same premises (earthquake focal parameters) and reach the same result (tsunami height at target coastal areas) by replacing the intermediate steps of real-time tsunami simulation with a proper selection from a large archive of pre-computed tsunami scenarios. The advantage of real-time simulations and of archived scenario selection is that estimates are tailored to the specific occurring tsunami, and the alert can be more detailed (less generic) and appropriate for local needs. Both these procedures are still at an experimental or testing stage and have not yet been implemented in any standard TWS operations. Nonetheless, this is seen as the future and the natural evolutionary enhancement of TWSs. In this context, improvement of the real-time estimates of the tsunamigenic earthquake focal mechanism is of fundamental importance to trigger the appropriate computational chain. Quick discrimination between strike-slip and thrust-fault earthquakes and, equally relevant, quick assessment of the co-seismic on-fault slip distribution are exemplary cases to which a real-time geodetic monitoring system can contribute significantly. Robust inversion of geodetic data can help reconstruct the sea-floor deformation pattern, especially if two conditions are met: the source is not too far from the network stations and is well covered azimuthally. These two conditions are sometimes hard to satisfy fully, but in certain regions, like the Mediterranean and the Caribbean Sea, this is quite possible due to the limited size of the ocean basins. Close cooperation between the Global Geodetic Observing System (GGOS) community, seismologists, tsunami scientists and TWS operators is highly recommended to obtain significant progress in the quick determination of the earthquake source, which can trigger a timely estimation of the ensuing tsunami and a more reliable and detailed assessment of the tsunami size at the coast.
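    The "robust inversion of geodetic data" invoked above is, in its simplest linear form, a damped least-squares problem relating surface displacements to slip on fault patches through pre-computed elastic Green's functions. The sketch below shows only that skeleton, with a random stand-in for the Green's function matrix, since building real Okada-type kernels is beyond a few lines; the damping value and dimensions are arbitrary assumptions.

    ```python
    import numpy as np

    def invert_slip(G, d, damping=0.1):
        """Damped linear least squares: minimize ||G m - d||^2 + damping^2 ||m||^2,
        where G maps slip on fault patches to observed GNSS displacements."""
        n_patches = G.shape[1]
        G_aug = np.vstack([G, damping * np.eye(n_patches)])
        d_aug = np.concatenate([d, np.zeros(n_patches)])
        m, *_ = np.linalg.lstsq(G_aug, d_aug, rcond=None)
        return m

    # Synthetic demonstration: 30 displacement observations, 10 fault patches.
    rng = np.random.default_rng(1)
    G_true = rng.standard_normal((30, 10))          # stand-in for elastic Green's functions
    slip_true = np.abs(rng.standard_normal(10))     # metres of slip per patch
    d_obs = G_true @ slip_true + 0.01 * rng.standard_normal(30)
    slip_est = invert_slip(G_true, d_obs)
    print("max slip misfit:", np.max(np.abs(slip_est - slip_true)))
    ```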

  18. A new real-time tsunami detection algorithm

    NASA Astrophysics Data System (ADS)

    Chierici, F.; Embriaco, D.; Pignagnoli, L.

    2016-12-01

    Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection based on real-time tide removal and real-time band-pass filtering of sea-bed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability, at low computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures that can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. Pressure data sets acquired by Bottom Pressure Recorders in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami event that occurred at Haida Gwaii on October 28th, 2012, using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm was also successfully run for test purposes in year-long missions on board the GEOSTAR stand-alone multidisciplinary abyssal observatory, deployed in the Gulf of Cadiz during the EC project NEAREST, and on the NEMO-SN1 cabled observatory deployed in the western Ionian Sea, an operational node of the European research infrastructure EMSO.
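    The Monte Carlo testing methodology mentioned here can be outlined as repeated injection of synthetic tsunami-like pulses into background noise, then counting detections and delays. The sketch below uses a Gaussian noise model, a sinusoidal pulse and a bare threshold detector, all of which are simplified assumptions rather than the published procedure.

    ```python
    import numpy as np

    def run_detector(signal, threshold):
        """Toy detector: index of the first sample exceeding the threshold, else None."""
        hits = np.flatnonzero(np.abs(signal) > threshold)
        return int(hits[0]) if hits.size else None

    def monte_carlo_detection(amplitude, n_trials=500, noise_std=0.02,
                              threshold=0.06, fs=1.0 / 15.0, period=900.0):
        """Estimate detection probability and mean delay (s) for a synthetic
        tsunami pulse of a given amplitude (m) buried in Gaussian noise."""
        rng = np.random.default_rng(42)
        t = np.arange(0.0, 7200.0, 1.0 / fs)          # two hours at 15 s sampling
        onset_index = t.size // 2
        delays, detections = [], 0
        for _ in range(n_trials):
            x = noise_std * rng.standard_normal(t.size)
            wave = amplitude * np.sin(2 * np.pi * (t - t[onset_index]) / period)
            x[onset_index:] += wave[onset_index:]      # tsunami starts mid-record
            hit = run_detector(x[onset_index:], threshold)
            if hit is not None:
                detections += 1
                delays.append(hit / fs)
        prob = detections / n_trials
        return prob, (np.mean(delays) if delays else np.nan)

    p, delay = monte_carlo_detection(amplitude=0.10)
    print(f"detection probability {p:.2f}, mean delay {delay:.0f} s")
    ```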

  19. Challenges and Alternatives in Tsunami Water Levels Processing in NOAA/NCEI-CO Global Water-Level Data Repository

    NASA Astrophysics Data System (ADS)

    Mungov, G.; Dunbar, P. K.; Stroker, K. J.; Sweeney, A.

    2016-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) is the data repository for high-resolution, integrated water-level data supporting tsunami research, risk assessment and mitigation to protect life and property along the coasts. NCEI responsibilities include, but are not limited to, processing, archiving and distributing coastal water-level data from different sources covering tsunami and storm-surge inundation, sea-level change, climate variability, etc. High-resolution data for global historical tsunami events are collected from the Deep-ocean Assessment and Reporting of Tsunami (DART®) tsunameter network maintained by NOAA's National Data Buoy Center (NDBC), from coastal tide gauges maintained by NOAA's Center for Operational Oceanographic Products and Services (CO-OPS) and the Tsunami Warning Centers, from historic marigrams and images, from bathymetric data, and from other national and international sources. The NCEI-CO water-level database is developed in close collaboration with all data providers along with NOAA's Pacific Marine Environmental Laboratory. We outline here the present state of water-level data processing with regard to the increasing need for high-precision, homogeneous and "clean" tsunami records from different data sources and with different sampling intervals. Two tidal models are compared: Mike Foreman's improved oceanographic model (2009) and the Akaike Bayesian Information Criterion approach applied by Tamura et al. (1991). The effects of filtering and the limits of its application are also discussed, along with the method used for de-spiking the raw time series.
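    De-spiking of raw water-level series, one of the processing steps discussed, is commonly done by comparing each sample to a running median and rejecting outliers. The sketch below is such a generic filter with an assumed window length and threshold, not the NCEI production code.

    ```python
    import numpy as np
    from scipy.signal import medfilt

    def despike(series, window=11, n_mad=6.0):
        """Replace samples deviating from the running median by more than
        n_mad robust standard deviations with the median value itself."""
        baseline = medfilt(series, kernel_size=window)
        residual = series - baseline
        mad = np.median(np.abs(residual - np.median(residual)))
        robust_std = 1.4826 * mad if mad > 0 else np.std(residual)
        spikes = np.abs(residual) > n_mad * robust_std
        cleaned = series.copy()
        cleaned[spikes] = baseline[spikes]
        return cleaned, spikes

    # Example: a smooth tide with two artificial spikes injected
    t = np.arange(0, 86400, 60.0)
    tide = np.sin(2 * np.pi * t / 44714.0)
    raw = tide.copy()
    raw[[200, 900]] += 3.0
    cleaned, spikes = despike(raw)
    print("spikes flagged at indices:", np.flatnonzero(spikes))
    ```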

  20. Estimating Seismic Moment From Broadband P-Waves for Tsunami Warnings.

    NASA Astrophysics Data System (ADS)

    Hirshorn, B. F.

    2006-12-01

    The Richard H. Hagemeyer Pacific Tsunami Warning Center (PTWC), located in Ewa Beach, Oahu, Hawaii, is responsible for issuing local, regional, and distant tsunami warnings to Hawaii, and for issuing regional and distant tsunami warnings to the rest of the Pacific Basin, exclusive of the US West Coast. The PTWC must provide these tsunami warnings as soon as technologically possible, based entirely on estimates of a potentially tsunamigenic earthquake's source parameters. We calculate the broadband P-wave moment magnitude, Mwp, from the P or pP wave velocity seismograms [Tsuboi et al., 1995, 1999]. This method appears to work well for regional and teleseismic events [Tsuboi et al. (1999), Whitmore et al. (2002), Hirshorn et al. (2004)]. Following Tsuboi [1995], we consider the displacement record of the P-wave portion of the broadband seismograms as an approximate source time function and integrate this record to obtain the moment rate function, Mo(t), and the moment magnitude [Hanks and Kanamori, 1972] as a function of time, Mw(t). We present Mwp results for local, regional, and teleseismic broadband recordings of earthquakes in the Mw 5 to 9.3 range. As large Hawaii events are rare, we tested the local case using other Pacific events in the magnitude 5.0 to 7.5 range recorded by nearby stations. Signals were excluded, however, if the epicentral distance was so small (generally less than 1 degree) that there was contamination by the S-wave following too closely on the P-waves. Scatter plots of Mwp against the Harvard Mw for these events show that Mwp predicts Mw well from seismograms recorded at local, regional, and teleseismic distances. For some complex earthquakes, e.g. the Mw 8.4 (HRV) Peru earthquake of June 21, 2001, Mwp underestimates Mw if the first episode of moment release is not the largest. Our Mwp estimates for the Mw 9.3 Sumatra-Andaman Islands earthquake of December 26, 2004 and for the Mw 8.7 (HRV) Sumatra event of March 28, 2005 were Mwp 8.1 and Mwp 8.7, respectively, from P-waves recorded at 15-90 degrees from each hypocenter.
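    A heavily simplified version of the Mwp chain described (integrate the P-wave displacement, scale the peak of the integral to seismic moment, convert moment to magnitude) is sketched below. The density, P-wave speed and radiation-pattern constant are assumed representative values, and real processing includes instrument correction, windowing and pP handling omitted here.

    ```python
    import numpy as np

    RHO = 3400.0        # density near the source (kg/m^3), assumed
    ALPHA = 7900.0      # P-wave speed (m/s), assumed
    F_P = 0.52          # average P radiation-pattern coefficient, assumed

    def mwp_from_displacement(u_z, dt, distance_m):
        """Simplified Mwp in the spirit of Tsuboi et al. (1995): integrate the
        vertical P displacement, scale the peak of the integral to seismic
        moment, then convert moment to magnitude."""
        integral = np.cumsum(u_z) * dt                 # running time integral of displacement
        m0 = 4.0 * np.pi * RHO * ALPHA ** 3 * distance_m * np.max(np.abs(integral)) / F_P
        return (np.log10(m0) - 9.1) / 1.5              # standard Mw definition

    # Synthetic example: a one-sided displacement pulse recorded at 3000 km distance
    dt = 0.05
    t = np.arange(0.0, 60.0, dt)
    u_z = 2.0e-4 * np.exp(-((t - 20.0) / 5.0) ** 2)    # metres, illustrative pulse
    print(f"Mwp ~ {mwp_from_displacement(u_z, dt, 3.0e6):.1f}")
    ```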

  1. Combining historical eyewitness accounts on tsunami-induced waves and numerical simulations for getting insights in uncertainty of source parameters

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; benki, Aalae

    2017-04-01

    Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and much damage to structures. Advances in the numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning for tsunamis. Among the major challenges, several studies have identified uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be done by taking advantage of observations either of tsunami waves (using networks of water-level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like Lisbon 1755. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea-bottom pressure gauges, GPS-mounted buoys), the number of tide gauges can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., the maximum wave height reached at the coast) instead of full observed waveforms. 2) Tsunami phenomena involve a large span of spatial scales (from ocean-basin scales to local coastal wave interactions), which can make the modelling very demanding: the computation time of a tsunami simulation can be prohibitive, often reaching several hours. This limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both of the afore-described difficulties in order to combine historical observations of past tsunami-induced waves with numerical simulations. In order to learn the uncertainty information on the source parameters, we treat the problem within the Bayesian setting, which enables the different uncertainty sources to be incorporated in a flexible manner. We propose to rely on an emerging technique called Approximate Bayesian Computation (ABC), which has been developed to estimate the posterior distribution in modelling scenarios where the likelihood function is either unknown or cannot be explicitly defined. To overcome the computational issue, we combine ABC with statistical emulators (aka meta-models). We apply the proposed approach to the case study of the 1887 Ligurian (northwest Italy) tsunami and discuss the results with special attention paid to the impact of the observational error.
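    A bare-bones version of the ABC-rejection scheme the authors invoke, with a cheap analytic stand-in for the expensive tsunami emulator, might look like the following; the prior range, tolerance, observed value and "simulator" are all illustrative assumptions rather than the study's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def emulator(magnitude):
        """Cheap stand-in for a tsunami-wave-height emulator: returns a
        maximum coastal wave height (m) for a given source magnitude."""
        return 10.0 ** (0.8 * magnitude - 5.5)

    # "Observed" eyewitness wave height (m) with a generous observational error
    observed_height, obs_error = 4.0, 1.0

    # ABC rejection: draw magnitudes from a uniform prior, keep those whose
    # simulated height falls within the tolerance of the observation.
    prior_draws = rng.uniform(6.0, 8.5, size=100_000)
    simulated = emulator(prior_draws)
    accepted = prior_draws[np.abs(simulated - observed_height) < obs_error]

    print(f"accepted {accepted.size} of {prior_draws.size} draws")
    print(f"posterior magnitude ~ {accepted.mean():.2f} +/- {accepted.std():.2f}")
    ```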

  2. Development of a Tsunami Scenario Database for Marmara Sea

    NASA Astrophysics Data System (ADS)

    Ozer Sozdinler, Ceren; Necmioglu, Ocal; Meral Ozel, Nurcan

    2016-04-01

    Due to the very short travel times in the Marmara Sea, a Tsunami Early Warning System (TEWS) has to be strongly coupled with the earthquake early warning system and should be supported by a pre-computed tsunami scenario database that can be queried in near real time based on the initial earthquake parameters. To address this problem, 30 different composite earthquake scenarios with maximum credible Mw values, based on 32 fault segments, have been identified in order to produce a detailed scenario database for all possible earthquakes in the Marmara Sea with tsunamigenic potential. The bathymetry/topography data for the Marmara Sea were prepared using GEBCO and ASTER data, bathymetric measurements along the Bosphorus (Istanbul) and the Dardanelles (Canakkale), and a coastline digitized from satellite images. The coarser domain, with 90 m grid size, was divided into 11 sub-regions with 30 m grid size in order to increase the data resolution and the precision of the calculation results. The analyses were performed in nested domains with the numerical model NAMIDANCE using non-linear shallow water equations. In order to cover all residential areas, industrial facilities and touristic locations, more than 1000 numerical gauge points were selected along the coasts of the Marmara Sea, located at water depths of 5 to 10 m in the finer domains. The distributions of tsunami hydrodynamic parameters were investigated together with the change of water surface elevations, current velocities, momentum fluxes and other important parameters at the gauge points. This work is funded by the project MARsite - New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite (FP7-ENV.2012 6.4-2, Grant 308417 - see NH2.3/GMPV7.4/SM7.7) and supported by the SATREPS-MarDim Project (Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey) and JICA (Japan International Cooperation Agency). The authors would like to acknowledge Ms. Basak Firat for her assistance in the preparation of the database.

  3. Quantifying 10 years of improved earthquake-monitoring performance in the Caribbean region

    USGS Publications Warehouse

    McNamara, Daniel E.; Hillebrandt-Andrade, Christa; Saurel, Jean-Marie; Huerfano-Moreno, V.; Lynch, Lloyd

    2015-01-01

    Over 75 tsunamis have been documented in the Caribbean and adjacent regions during the past 500 years. Since 1500, at least 4484 people are reported to have perished in these killer waves. Hundreds of thousands are currently threatened along the Caribbean coastlines. Were a great tsunamigenic earthquake to occur in the Caribbean region today, the effects would potentially be catastrophic due to an increasingly vulnerable region that has seen significant population increases in the past 40–50 years and currently hosts an estimated 500,000 daily beach visitors from North America and Europe, a majority of whom are not likely aware of tsunami and earthquake hazards. Following the magnitude 9.1 Sumatra–Andaman Islands earthquake of 26 December 2004, the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (CARIBE‐EWS) was established and developed minimum performance standards for the detection and analysis of earthquakes. In this study, we model earthquake‐magnitude detection threshold and P‐wave detection time and demonstrate that the requirements established by the UNESCO ICG CARIBE‐EWS are met with 100% of the network operating. We demonstrate that earthquake‐monitoring performance in the Caribbean Sea region has improved significantly in the past decade as the number of real‐time seismic stations available to the National Oceanic and Atmospheric Administration tsunami warning centers has increased. We also identify weaknesses in the current international network and provide guidance for selecting the optimal distribution of seismic stations contributed from existing real‐time broadband national networks in the region.

  4. Using GPS to Detect Imminent Tsunamis

    NASA Technical Reports Server (NTRS)

    Song, Y. Tony

    2009-01-01

    A promising method of detecting imminent tsunamis and estimating their destructive potential involves the use of Global Positioning System (GPS) data in addition to seismic data. Application of the method is expected to increase the reliability of global tsunami-warning systems, making it possible to save lives while reducing the incidence of false alarms. Tsunamis kill people every year. The 2004 Indian Ocean tsunami killed about 230,000 people. The magnitude of an earthquake is not always a reliable indication of the destructive potential of a tsunami. The 2004 Indian Ocean quake generated a huge tsunami, while the 2005 Nias (Indonesia) quake did not, even though both were initially estimated to be of similar magnitude. Between 2005 and 2007, five false tsunami alarms were issued worldwide. Such alarms have negative societal and economic effects. GPS stations can detect ground motions of earthquakes in real time, as frequently as every few seconds. In the present method, the epicenter of an earthquake is located by use of data from seismometers, then data from coastal GPS stations near the epicenter are used to infer the sea-floor displacements that precede a tsunami. The displacement data are used in conjunction with local topographical data and an advanced theory to quantify the destructive potential of a tsunami on a new tsunami scale, based on the GPS-derived tsunami energy, much like the Richter scale used for earthquakes. An important element of the derivation of the advanced theory was the recognition that horizontal sea-floor motions contribute much more to the generation of tsunamis than previously believed. The method typically produces a reliable estimate of the destructive potential of a tsunami within minutes, well before the tsunami reaches coastal areas. The viability of the method was demonstrated in computational tests in which it yielded accurate representations of three historical tsunamis for which well-documented ground-motion measurements were available. Development of a global tsunami-warning system utilizing an expanded network of coastal GPS stations was under consideration at the time of reporting the information for this article.

  5. How prepared individuals and communities are for evacuation in tsunami-prone areas in Europe? Findings from the ASTARTE EU Programme

    NASA Astrophysics Data System (ADS)

    Lavigne, Franck; Grancher, Delphine; Goeldner-Gianella, Lydie; Karanci, Nuray; Dogulu, Nilay; Kanoglu, Utku; Zaniboni, Filippo; Tinti, Stefano; Papageorgiou, Antonia; Papadopoulos, Gerassimos; Constantin, Angela; Moldovan, Iren; El Mouraouah, Azelarab; Benchekroun, Sabah; Birouk, Abdelouahad

    2016-04-01

    Understanding social vulnerability to tsunamis provides risk managers with the information required to determine whether individuals have the capacity to evacuate, and therefore to take mitigation measures to protect their communities. In the frame of the EU programme ASTARTE (Assessment, STrategy And Risk reduction for Tsunamis in Europe), we conducted a questionnaire-based survey among 1,661 people from 41 nationalities living in, working in, or visiting 10 test sites in 9 different countries. The questions, which were translated into 11 languages, focused on tsunami hazard awareness, risk perception, and knowledge of the existing warning systems. Our results confirm our initial hypothesis that little attention is paid in Europe to tsunami risk. Among all types of hazards, natural or not, tsunamis rank first in only one site (Lyngen fjord in Norway), third in 3 other sites (Eforie Nord in Romania, Nice and Istanbul), fourth in Gulluk Bay, fifth in Sines and Heraklion, and tenth in Siracusa (Sicily) and San Jordi (Balearic Islands). Whatever the respondents' status (i.e. local population, local authorities, or tourists), earthquakes and drawdown of the sea are cited as tsunami warning signs by 43% and 39% of the respondents, respectively. Therefore self-evacuation cannot be expected for more than half of the population. Considering that most European countries have no early warning system for tsunamis, a disaster is likely to happen in any coastal area exposed to this specific hazard. Furthermore, knowledge of past tsunami events is also very limited: only 22% of people stated that a tsunami has occurred in the past, whereas a deadly tsunami occurs every century in the Mediterranean Sea (e.g. in AD 365, 1660, 1672 or 1956 in the eastern part, and 1908, 1979 or 2003 in the western part), and high tsunami waves devastated the Portuguese and Moroccan coasts in 1755. Despite this lack of knowledge and awareness of past events, 62% of the respondents think that the site of the interview could be affected by a tsunami in the future. Respondents were strongly influenced by the images of the catastrophic tsunamis they saw in 2004 and 2011, leading them to expect local wave heights greater than 10 or 15 m, even in low-exposure areas such as Nice or the Balearic Islands. Such overestimation of the wave heights could lead to confusion during an evacuation. This survey at the European scale underlines the need for better mitigation strategies, including but not limited to informing residents, local workers and tourists at each site about: (1) the reality of the tsunami risk; (2) the maximum wave height that has been modelled for the worst case; and (3) where to evacuate in case of a future tsunami. Key words: tsunami, coastal risk, hazard knowledge, risk perception, vulnerability, resilience, evacuation, Europe

  6. Real-time determination of the worst tsunami scenario based on Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Furuya, Takashi; Koshimura, Shunichi; Hino, Ryota; Ohta, Yusaku; Inoue, Takuya

    2016-04-01

    In recent years, real-time tsunami inundation forecasting has been developed together with the advances of dense seismic monitoring, GPS Earth observation, offshore tsunami observation networks, and high-performance computing infrastructure (Koshimura et al., 2014). Several uncertainties are involved in tsunami inundation modeling, and the tsunami generation model is believed to be one of the greatest sources of uncertainty. An uncertain tsunami source model risks underestimating tsunami height, the extent of the inundation zone, and damage. Tsunami source inversion using observed seismic, geodetic and tsunami data is the most effective way to avoid underestimating a tsunami, but it requires additional time to acquire the observed data, and this limitation makes it difficult to complete real-time tsunami inundation forecasting within sufficient time. Rather than waiting for precise tsunami observations, we aim, from a disaster management point of view, to determine the worst tsunami source scenario for use in real-time tsunami inundation forecasting and mapping, using the seismic information of Earthquake Early Warning (EEW) that can be obtained immediately after the event is triggered. After an earthquake occurs, JMA's EEW estimates the magnitude and hypocenter. With the constraints of earthquake magnitude, hypocenter and a scaling law, we determine multiple possible tsunami source scenarios and start searching for the worst one by superposition of pre-computed tsunami Green's functions, i.e. time series of tsunami height at offshore points corresponding to 2-dimensional Gaussian unit sources (e.g. Tsushima et al., 2014). The scenario analysis of our method consists of the following 2 steps. (1) Searching the range of the worst scenario by calculating 90 scenarios with various strikes and fault positions; from the maximum tsunami heights of these 90 scenarios, we determine a narrower strike range that causes high tsunami heights in the area of concern. (2) Calculating 900 scenarios with different strike, dip, length, width, depth and fault position, where the strike is limited to the range obtained from the 90-scenario calculation; from these 900 scenarios, we determine the worst tsunami scenarios from a disaster management point of view, such as the one with the shortest travel time and the one with the highest water level. The method was applied to a hypothetical earthquake and verified as to whether it can effectively search for the worst tsunami source scenario in real time, to be used as an input to real-time tsunami inundation forecasting.
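    The superposition step at the core of this search (combining pre-computed unit-source waveforms with scenario weights and scoring each scenario by its peak coastal height and arrival time) can be sketched as below; the number of unit sources, the weights and the waveforms are synthetic placeholders rather than the authors' Gaussian unit-source database.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Pre-computed "Green's functions": tsunami height time series at one
    # forecast point for each of 20 unit sources (synthetic placeholders).
    n_sources, n_samples = 20, 720          # 6 h at 30 s sampling
    t = np.arange(n_samples) * 30.0
    greens = np.array([0.05 * np.sin(2 * np.pi * (t - 600 * k) / 1800.0)
                       * (t > 600 * k) for k in range(n_sources)])

    def scenario_waveform(weights):
        """Linear superposition of unit-source waveforms with scenario weights
        (proportional to the slip assigned to each unit source)."""
        return weights @ greens

    # Score 900 random scenarios by maximum height and first-arrival time.
    best = None
    for _ in range(900):
        w = rng.uniform(0.0, 3.0, size=n_sources)
        eta = scenario_waveform(w)
        peak = eta.max()
        arrival = t[np.argmax(eta > 0.1)] if (eta > 0.1).any() else np.inf
        if best is None or peak > best[0]:
            best = (peak, arrival, w)

    print(f"worst scenario: peak height {best[0]:.2f} m, first arrival {best[1]:.0f} s")
    ```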

  7. Establishing an early warning alert and response network following the Solomon Islands tsunami in 2013

    PubMed Central

    Bilve, Augustine; Nogareda, Francisco; Joshua, Cynthia; Ross, Lester; Betcha, Christopher; Durski, Kara; Fleischl, Juliet

    2014-01-01

    Problem: On 6 February 2013, a magnitude 8.0 earthquake generated a tsunami that struck the Santa Cruz Islands, Solomon Islands, killing 10 people and displacing over 4700. Approach: A post-disaster assessment of the risk of epidemic disease transmission recommended the implementation of an early warning alert and response network (EWARN) to rapidly detect, assess and respond to potential outbreaks in the aftermath of the tsunami. Local setting: Almost 40% of the Santa Cruz Islands' population were displaced by the disaster and were living in cramped temporary camps with poor or absent sanitation facilities and insufficient access to clean water. There was no early warning disease surveillance system. Relevant changes: By 25 February, an EWARN was operational in five health facilities that served 90% of the displaced population. Eight priority diseases or syndromes were reported weekly; unexpected health events were reported immediately. Between 25 February and 19 May, 1177 target disease or syndrome cases were reported. Seven alerts were investigated. No sustained transmission or epidemics were identified. Reporting compliance was 85%. The EWARN was then transitioned to the routine four-syndrome early warning disease surveillance system. Lessons learnt: It was necessary to conduct a detailed assessment to evaluate the risk and potential impact of serious infectious disease outbreaks and to assess whether and how enhanced early warning disease surveillance should be implemented. Local capacities and available resources should be considered in planning EWARN implementation. An EWARN can be an opportunity to establish or strengthen early warning disease surveillance capabilities. PMID:25378746

  8. Prioritizing earthquake and tsunami alerting efforts

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Allen, S.; Aranha, M. A.; Chung, A. I.; Hellweg, M.; Henson, I. H.; Melgar, D.; Neuhauser, D. S.; Nof, R. N.; Strauss, J. A.

    2015-12-01

    The timeline of hazards associated with earthquakes ranges from seconds for the strong shaking at the epicenter, to minutes for strong shaking at more distant locations in big quakes, to tens of minutes for a local tsunami. Earthquake and tsunami warning systems must therefore include very fast initial alerts, while also taking advantage of the available time in bigger and tsunami-generating quakes. At the UC Berkeley Seismological Laboratory we are developing a suite of algorithms to provide the fullest possible information about earthquake shaking and tsunami inundation from seconds to minutes after a quake. The E-larmS algorithm uses the P-wave to rapidly detect an earthquake and issue a warning. It is currently issuing alerts to test users in as little as 3 sec after the origin time. Development of a new waveform detector may lead to even faster alerts. G-larmS uses permanent deformation estimates from GNSS stations to estimate the geometry and extent of the rupture underway, providing more accurate ground shaking estimates in big (M>~7) earthquakes. It performed well in the M6.0 2014 Napa earthquake. T-larmS is a new algorithm designed to extend alert capabilities to tsunami inundation. Rapid estimates of source characteristics for subduction zone events can be used to warn not only of the shaking hazard, but also of the local tsunami inundation hazard. These algorithms are being developed, implemented and tested with a focus on the western US, but are also now being tested in other parts of the world including Israel, Turkey, Korea and Chile. Beta users in the Bay Area are receiving the alerts and beginning to implement automated actions. They also provide feedback on user needs, which has led to the development of the MyEEW smartphone app. This app allows beta users to receive the alerts on their cell phones. All these efforts feed into our ongoing assessment of directions and priorities for future development and implementation efforts.
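    For illustration only, below is a generic short-term/long-term average (STA/LTA) trigger of the kind commonly used as a first step for rapid P-wave detection; this is not the actual E-larmS detector, and the window lengths and threshold are placeholder values:

```python
import numpy as np

# Generic STA/LTA trigger, a common first step for P-wave detection in early
# warning pipelines. Illustrative sketch only; not the E-larmS detector.
def sta_lta_trigger(waveform, dt, sta_win=0.5, lta_win=10.0, threshold=4.0):
    """Return the first sample index where STA/LTA exceeds the threshold, or None."""
    n_sta = max(1, int(sta_win / dt))
    n_lta = max(1, int(lta_win / dt))
    energy = waveform.astype(float) ** 2
    csum = np.cumsum(energy)
    for i in range(n_lta, len(waveform) - n_sta):
        sta = (csum[i + n_sta] - csum[i]) / n_sta      # short-term average energy
        lta = (csum[i] - csum[i - n_lta]) / n_lta      # long-term average energy
        if lta > 0 and sta / lta >= threshold:
            return i
    return None

# Synthetic example: noise followed by a stronger arrival at t = 20 s
dt = 0.01
t = np.arange(0, 40, dt)
rng = np.random.default_rng(1)
trace = rng.normal(0, 1, t.size)
trace[t >= 20.0] += 8.0 * np.sin(2 * np.pi * 5.0 * (t[t >= 20.0] - 20.0))
pick = sta_lta_trigger(trace, dt)
print("trigger at t = %.2f s" % (pick * dt) if pick is not None else "no trigger")
```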

  9. Sensitivities of Near-field Tsunami Forecasts to Megathrust Deformation Predictions

    NASA Astrophysics Data System (ADS)

    Tung, S.; Masterlark, T.

    2018-02-01

    This study reveals how modeling configurations of forward and inverse analyses of coseismic deformation data influence the estimation of seismic and tsunami sources. We illuminate how near-field tsunami predictions change when (1) a heterogeneous (HET) distribution of crustal material is introduced to the elastic dislocation model, and (2) near-trench rupture is either encouraged or suppressed when inverting synthetic coseismic displacements. Hypothetical scenarios of megathrust earthquakes are studied with synthetic Global Positioning System displacements in Cascadia. Finite-element models are designed to mimic the subsurface heterogeneity across the curved subduction margin. The HET lithospheric domain modifies the seafloor displacement field and alters tsunami predictions from those of a homogeneous (HOM) crust. Uncertainties persist as the inverse analyses of geodetic data produce nonrealistic slip artifacts over the HOM domain, which propagate into errors in the predicted tsunami arrival times and amplitudes. A stochastic analysis further shows that the uncertainties of seismic tomography models do not degrade the accuracy advantage of HET solutions over HOM ones. Whether the source ruptures near the trench also controls the details of the seafloor disturbance. Deeper subsurface slip induces more seafloor uplift near the coast and causes an earlier arrival of tsunami waves than surface-slipping events. We suggest using the solutions of zero-updip-slip and zero-updip-slip-gradient rupture boundary conditions as end-members to constrain the tsunami behavior for forecasting purposes. The findings are important for near-field tsunami warning, which relies primarily on near-real-time geodetic or seismic data for source calibration before large waves hit the nearest shore during tsunamigenic events.
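    As a rough illustration of how geodetic offsets constrain fault slip, the sketch below runs a generic damped least-squares slip inversion on synthetic Green's functions and offsets; it is not the finite-element inversion used in this study, and all values are placeholders:

```python
import numpy as np

# Generic damped least-squares slip inversion of the kind used to map geodetic
# (GPS) coseismic offsets onto fault-patch slip. Illustrative sketch with
# synthetic, hypothetical Green's functions.
rng = np.random.default_rng(2)
n_obs, n_patches = 60, 25                    # 20 stations x 3 components, 25 fault patches

G = rng.normal(size=(n_obs, n_patches))      # elastic Green's functions (placeholder)
true_slip = np.zeros(n_patches)
true_slip[8:14] = 5.0                        # a compact 5 m slip asperity
d = G @ true_slip + rng.normal(0, 0.01, n_obs)   # synthetic offsets with 1 cm noise

lam = 1.0                                    # damping (regularization) weight
A = np.vstack([G, lam * np.eye(n_patches)])  # augmented system penalizing large slip
b = np.concatenate([d, np.zeros(n_patches)])
slip_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

print("recovered peak slip: %.2f m (true 5.00 m)" % slip_hat.max())
```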

  10. Effect of Variable Manning Coefficients on Tsunami Inundation

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Rees, D.

    2017-12-01

    Numerical simulations are commonly used to help estimate tsunami hazard, improve evacuation plans, issue or cancel tsunami warnings, and inform forecasting and hazard assessments; they have therefore become an integral part of hazard mitigation among the tsunami community. Many numerical codes exist for simulating tsunamis, most of which have undergone extensive benchmarking and testing. Tsunami hazard or risk assessments employ these codes following a deterministic or probabilistic approach. Depending on their scope, these studies may or may not consider uncertainty in the numerical simulations, the effects of tides or variable friction, or estimate financial losses, none of which are necessarily trivial. Distributed Manning coefficients, the roughness coefficients used in hydraulic modeling, are commonly used in simulating both riverine and pluvial flood events; however, their use in tsunami hazard assessments is primarily limited to studies of narrow scope and is, for the most part, not standard practice. For this work, we investigate variations in Manning coefficients and their effects on tsunami inundation extent, pattern and financial loss. To assign Manning coefficients we use land use maps from the New Zealand Land Cover Database (LCDB) and more recent data from the Ministry for the Environment. More than 40 classes covering different types of land use are combined into major classes such as cropland, grassland and wetland representing common types of land use in New Zealand, each of which is assigned a unique Manning coefficient. By utilizing different data sources for variable Manning coefficients, we examine the impact of data sources and classification methodology on the accuracy of model outputs.
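    A small sketch of how land-cover classes can be mapped onto a grid of spatially variable Manning coefficients; the class names and n values are illustrative placeholders, not the LCDB classes or the coefficients used in this work:

```python
import numpy as np

# Illustrative sketch of assigning spatially variable Manning coefficients from a
# land-cover raster. Class names and n values are hypothetical placeholders.
MANNING_BY_CLASS = {      # grouped land-use classes -> Manning n (placeholder values)
    "water": 0.010,
    "cropland": 0.035,
    "grassland": 0.030,
    "wetland": 0.050,
    "forest": 0.070,
    "urban": 0.080,
}

def manning_grid(landcover, table, default=0.025):
    """Map a 2-D array of land-cover class names to a grid of Manning n values."""
    vectorized = np.vectorize(lambda c: table.get(c, default))
    return vectorized(landcover)

landcover = np.array([["water", "grassland"],
                      ["urban", "cropland"]])
print(manning_grid(landcover, MANNING_BY_CLASS))
```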

  11. Tsunami Preparedness in Washington (video)

    USGS Publications Warehouse

    Loeffler, Kurt; Gesell, Justine

    2010-01-01

    Tsunamis are a constant threat to the coasts of our world. Although tsunamis are infrequent along the West coast of the United States, it is possible and necessary to prepare for potential tsunami hazards to minimize loss of life and property. Community awareness programs are important, as they strive to create an informed society by providing education and training. This video about tsunami preparedness in Washington distinguishes between a local tsunami and a distant event and focuses on the specific needs of this region. It offers guidelines for correct tsunami response and community preparedness from local emergency managers, first-responders, and leading experts on tsunami hazards and warnings, who have been working on ways of making the tsunami-affected regions safer for the people and communities on a long-term basis. This video was produced by the U.S. Geological Survey (USGS) in cooperation with the Washington Emergency Management Division (EMD) and with funding from the National Tsunami Hazard Mitigation Program.

  12. Tsunami Early Warning for the Indian Ocean Region - Status and Outlook

    NASA Astrophysics Data System (ADS)

    Lauterjung, Joern; Rudloff, Alexander; Muench, Ute; Gitews Project Team

    2010-05-01

    The German-Indonesian Tsunami Early Warning System (GITEWS) for the Indian Ocean region went into operation in Indonesia in November 2008. The system includes a seismological network, together with GPS stations, a network of GPS buoys additionally equipped with ocean bottom pressure sensors, and a tide gauge network. The different sensor systems have, for the most part, been installed and now deliver their data either online or interactively upon request to the Warning Centre in Jakarta. Before 2011, however, the different components require further optimization and fine tuning, local personnel need to be trained, and any problems in daily operation have to be dealt with. Furthermore, a company will be founded in the near future to guarantee sustainable maintenance and operation of the system. This completes the transition from a temporary project into a permanent service. The system established in Indonesia differs from other tsunami warning systems through its application of modern scientific methods and technologies. New procedures for the fast and reliable determination of strong earthquakes, deformation monitoring by GPS, the modeling of tsunamis and the assessment of the situation have been implemented in the warning system architecture. In particular, the direct incorporation of different sensors provides broad information already at the early stages of early warning, resulting in a stable system and minimizing breakdowns and false alarms. The warning system is designed in an open and modular structure based on the most recent developments and standards of information technology. Therefore, the system can easily integrate additional sensor components to be used for other multi-hazard purposes, e.g. meteorological and hydrological events. Up to now the German project group has been cooperating in the Indian Ocean region with Sri Lanka, the Maldives, Iran, Yemen, Tanzania and Kenya to set up equipment primarily for seismological monitoring and data analysis. The automatic seismic data processing software SeisComP3 is not only operational in the warning centre in Jakarta and successfully used for rapid earthquake information, but also in different Indian Ocean rim countries such as those mentioned before, as well as in India, Thailand and Pakistan. Close cooperation has been established with Australia, South Africa and India for the real-time exchange mainly of seismological and sea level data.

  13. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional and tele-tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes. Secondly, each of these scenarios was simulated. The third step was to determine the worst-case scenario (MOM, the maximum of maximums). The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico as TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocol and criteria in the PR/VI were updated. This paper describes the PR-NTHMP project, including the real-time earthquake and tsunami monitoring as well as the specific protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education and outreach in Puerto Rico.
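    The maximum-of-maximums composite can be illustrated with a few lines of array arithmetic; the scenario grids below are random placeholders standing in for per-scenario inundation results:

```python
import numpy as np

# Sketch of a "maximum of maximums" (MOM) composite: take, at every grid cell,
# the largest flow depth produced by any of the simulated scenarios.
def maximum_of_maximums(scenario_grids):
    """scenario_grids: array (n_scenarios, ny, nx) of maximum flow depth per scenario."""
    return np.max(scenario_grids, axis=0)

rng = np.random.default_rng(3)
grids = rng.random((12, 100, 100)) * 4.0     # 12 hypothetical scenarios, depths up to ~4 m
mom = maximum_of_maximums(grids)
print("MOM grid shape:", mom.shape, "peak depth %.2f m" % mom.max())
```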

  14. The GNSS-based Ground Tracking System (GTS) of GFZ; from GITEWS to PROTECTS and beyond

    NASA Astrophysics Data System (ADS)

    Falck, Carsten; Merx, Alexander; Ramatschi, Markus

    2013-04-01

    Introduction An automatic system for the near real-time determination and visualization of ground motions, i.e. co-seismic deformations of the Earth's surface, was developed by GFZ (German Research Centre for Geosciences) within the project GITEWS (German Indonesian Tsunami Early Warning System). The system is capable of delivering 3D-displacement vectors for locations with appropriate GPS equipment in the vicinity of an earthquake's epicenter with a delay of only a few minutes. These vectors can help to assess the tectonic movements caused by the earthquake, which must be known to make reliable early warning predictions, e.g. concerning the generation of tsunami waves. The GTS (Ground Tracking System) has been integrated into InaTEWS (Indonesian Tsunami Early Warning System) and has been in operation at the national warning center in Jakarta since November 2008. After the end of the GITEWS project, GFZ continues to support the GTS in Indonesia within the frame of PROTECTS (Project for Training, Education and Consulting for Tsunami Early Warning Systems), and recently some new developments have been introduced. We now aim to make further use of the achievements made, e.g. by developing a license model for the GTS software package. Motivation After the tsunami of 26 December 2004, the German government initiated the GITEWS project to develop the main components for a tsunami early warning system in Indonesia. GFZ, as the consortium leader of GITEWS, had several work packages, most of them related to sensor systems. The geodetic branch (Department 1) of GFZ was assigned to develop a GNSS-based component, which has since been known as the GTS (Ground Tracking System). System benefit The ground motion information delivered by the GTS is a valuable source for a fast understanding of an earthquake's mechanism, with high relevance for assessing the probability and magnitude of a potentially following tsunami. The system can detect the largest displacement vector values close to an earthquake's epicenter, where seismic systems may have problems determining earthquake magnitudes. By considering displacement vectors, the GTS can significantly support the decision-making process on whether a tsunami has been generated. Brief system description The GTS may be divided into three main components: 1) The data acquisition component receives and manages data from GNSS stations, transferred either in real time, file based, or both in parallel, including, e.g., format conversions and real-time spreading to other services. It also acquires the most recent auxiliary data needed for data processing, e.g. GNSS satellite orbit data, or, in case of internet problems, generates them from broadcast ephemeris transmissions received by the connected GNSS network stations. 2) The automatic GNSS data processing unit calculates coordinate time series for all GNSS stations providing data. The processing kernel is the robust and well-supported »Bernese GPS Software«, wrapped in adaptations for fully automatic near real-time processing. The final products of this unit are 3D-displacement vectors, which are calculated as differences from the mean coordinates over the latest time span prior to an earthquake. 3) The graphical user interface (GUI) of the GTS supports both a quick view for all staff members at the warning centre (24h/7d shifts) and deeper analysis by experts. The states of the connected GNSS networks and of the automatic data processing system are displayed. Other views are available, e.g., to check intermediate processing steps or historic data. The GTS final products, the 3D-displacement vectors, are displayed as arrows and bars on a map view. The GUI system is implemented as a web-based application and allows all views to be displayed on many screens at the same time, even at remote locations. Acknowledgements The projects GITEWS (German Indonesian Tsunami Early Warning System) and PROTECTS (Project for Training, Education and Consulting for Tsunami Early Warning Systems) are carried out by a large group of scientists and engineers from GFZ (German Research Centre for Geosciences) and its partners from the German Aerospace Centre (DLR), the Alfred Wegener Institute for Polar and Marine Research (AWI), the GKSS Research Centre, the Konsortium Deutsche Meeresforschung (KDM), the Leibniz Institute for Marine Sciences (IFM-GEOMAR), the United Nations University (UNU), the Federal Institute for Geosciences and Natural Resources (BGR), the German Agency for Technical Cooperation (GTZ) and other international partners. Funding is provided by the German Federal Ministry for Education and Research (BMBF), Grants 03TSU01 and 03TSU07.
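    A minimal sketch of the displacement-vector calculation described above (post-event position minus the pre-event mean), using synthetic coordinate time series rather than GTS output; the window lengths are arbitrary placeholder choices:

```python
import numpy as np

# Sketch of deriving a 3-D co-seismic displacement vector from a GNSS coordinate
# time series: difference between the post-event position and the mean position
# over a window just before the event. All data are synthetic placeholders.
def coseismic_displacement(times, enu, t_event, pre_window=600.0, post_skip=120.0):
    """times: seconds; enu: array (n, 3) of east/north/up coordinates in metres."""
    pre = enu[(times >= t_event - pre_window) & (times < t_event)]
    post = enu[times >= t_event + post_skip]
    return post.mean(axis=0) - pre.mean(axis=0)     # 3-D displacement vector

times = np.arange(0.0, 3600.0, 1.0)                 # 1 Hz positions, one hour
rng = np.random.default_rng(4)
enu = rng.normal(0, 0.005, (times.size, 3))         # 5 mm position noise
enu[times >= 1800.0] += np.array([0.40, -0.15, -0.05])   # 40 cm east step at t = 1800 s
print("displacement [m]:", np.round(coseismic_displacement(times, enu, 1800.0), 3))
```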

  15. Forecasting database for the tsunami warning regional center for the western Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Hebert, H.; Loevenbruck, A.; Hernandez, B.

    2010-12-01

    Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed, but they present a challenge to run in real time, partly due to computational limitations and also due to a lack of detailed knowledge of the earthquake rupture parameters. Through the establishment of the tsunami warning regional center for the NE Atlantic and western Mediterranean Sea, the CEA is in particular in charge of rapidly providing a map, with uncertainties, showing the zones in the main axis of tsunami energy at the Mediterranean scale. The strategy is based initially on a pre-computed tsunami scenario database, as the source parameters available a short time after an earthquake occurs are preliminary and may be somewhat inaccurate. Existing numerical models are good enough to provide useful guidance for warning structures to quickly disseminate. When an event occurs, an appropriate variety of offshore tsunami propagation scenarios, obtained by combining pre-computed propagation solutions (single or multiple sources), can be recalled through an automatic interface. This approach provides quick estimates of offshore tsunami propagation, and aids hazard assessment and evacuation decision-making. As numerical model accuracy is inherently limited by errors in bathymetry and topography, and as inundation map calculation is more complex and computationally expensive, only offshore tsunami propagation modeling is included in the forecasting database, using a single sparse bathymetric computation grid for the numerical modeling. Because of the variability in the mechanisms of tsunamigenic earthquakes, not all possible magnitudes can be represented in the scenario database. In principle, however, an infinite number of tsunami propagation scenarios can be constructed by linear combinations of a finite number of pre-computed unit scenarios. The whole notion of a pre-computed forecasting database also requires a historical earthquake and tsunami database, as well as an up-to-date seismotectonic database including fault geometries and a zonation based on a seismotectonic synthesis of source zones and tsunamigenic faults. Our forecast strategy is thus based on a unit source function methodology, whereby the model runs are combined and scaled linearly to produce any composite tsunami propagation solution. Each unit source function is equivalent to a tsunami generated by an M0 = 1.75E+19 N.m earthquake (Mw ~6.8) with a rectangular fault 25 km by 20 km in size and 1 m in slip. The faults of the unit functions are placed adjacent to each other, following the discretization of the main seismogenic faults bounding the western Mediterranean basin. The number of unit functions involved varies with the magnitude of the desired composite solution, and the combined wave heights are multiplied by a given scaling factor to produce the new arbitrary scenario. Some test cases are presented (e.g., Boumerdès 2003 [Algeria, Mw 6.8], Djijel 1856 [Algeria, Mw 7.2], Ligure 1887 [Italy, Mw 6.5-6.7]).
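    The unit-source combination can be sketched as follows; the scaling rule (ratio of the target moment to the summed unit moments) and the unit waveforms are illustrative assumptions, not the actual CEA implementation:

```python
import numpy as np

# Sketch of building a composite offshore-propagation scenario from pre-computed
# unit source functions: selected adjacent unit waveforms are summed and scaled.
# The scaling rule below is an illustrative assumption; waveforms are placeholders.
UNIT_M0 = 1.75e19                          # seismic moment of one unit source (N·m)

def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.1)          # standard Mw <-> M0 relation (N·m)

def composite_scenario(unit_waveforms, selected, target_mw):
    """unit_waveforms: (n_units, n_points, n_time); selected: indices of adjacent units."""
    combined = unit_waveforms[selected].sum(axis=0)
    scale = moment_from_mw(target_mw) / (len(selected) * UNIT_M0)
    return scale * combined

rng = np.random.default_rng(5)
units = rng.normal(0, 0.05, (30, 40, 720))         # 30 unit sources, 40 coastal points
eta = composite_scenario(units, selected=[11, 12, 13], target_mw=7.2)
print("composite waveform array:", eta.shape)
```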

  16. Tsunami evacuation buildings and evacuation planning in Banda Aceh, Indonesia.

    PubMed

    Yuzal, Hendri; Kim, Karl; Pant, Pradip; Yamashita, Eric

    Indonesia, a country of more than 17,000 islands, is exposed to many hazards. A magnitude 9.1 earthquake struck off the coast of Sumatra, Indonesia, on December 26, 2004. It triggered a series of tsunami waves that spread across the Indian Ocean, causing damage in 11 countries. Banda Aceh, the capital city of Aceh Province, was among the most damaged. More than 31,000 people were killed. At the time, there were no early warning systems or evacuation buildings that could provide safe refuge for residents. Since then, four tsunami evacuation buildings (TEBs) have been constructed in the Meuraxa subdistrict of Banda Aceh. Based on an analysis of evacuation routes and travel times, the capacity of the existing TEBs is examined. The existing TEBs would not be able to shelter all of the at-risk population. In this study, additional buildings and locations for TEBs are proposed and residents are assigned to the closest TEBs. While TEBs may be part of a larger system of tsunami mitigation efforts, other strategies and approaches need to be considered. In addition to TEBs, robust detection, warning and alert systems, land use planning, training, exercises, and other preparedness strategies are essential to tsunami risk reduction.

  17. Development of a new real-time GNSS data analysis system in GEONET for rapid Mw estimates in Japan

    NASA Astrophysics Data System (ADS)

    Kawamoto, S.; Miyagawa, K.; Yahagi, T.; Yamaguchi, K.; Tsuji, H.; Nishimura, T.; Ohta, Y.; Hino, R.; Miura, S.

    2013-12-01

    The 2011 off the Pacific Coast of Tohoku Earthquake (Mw 9.0) occurred on March 11, 2011. The earthquake and the following tsunami caused serious damage to a broad coastal area of eastern Japan. The Japan Meteorological Agency (JMA) operates the Tsunami Warning system, which is designed to forecast tsunami heights and arrival times around 3 minutes after a large event. However, the first estimated magnitude Mj, which was used for Tsunami Warning issuance, was far below the real value for the Tohoku event because of a saturation problem. Like most other magnitude scales, Mj saturates at values around 8.0. On the other hand, Mw represents the earthquake energy itself and can be calculated directly from permanent displacements derived from geodetic measurements, without the saturation problem. The GNSS Earth Observation Network System (GEONET), operated by the Geospatial Information Authority of Japan (GSI), is one of the densest real-time GNSS networks in the world. The GEONET data and the recent rapid advancement of GNSS analysis techniques motivated us to develop a new system for tackling tsunami disasters. In order to provide a more reliable magnitude for Tsunami Warning, GSI and Tohoku University have jointly developed a new real-time analysis system in GEONET for quasi-real-time Mw estimation. Its targets are large earthquakes, especially those of Mw > 8.0, for which the Tsunami Warning system would saturate. The real-time analysis system in GEONET mainly consists of three parts: (1) real-time GNSS positioning, (2) automated extraction of the displacement field due to the large earthquake, and (3) automated estimation of Mw using an approximated single rectangular fault. The positions of each station are calculated using RTKLIB 2.4.1 (Takasu, 2011) in baseline mode with the predicted part of the IGS Ultra-Rapid precise orbit. For event detection, we adopt the 'RAPiD' algorithm (Ohta et al., 2012) or the Earthquake Early Warning issued by JMA. This whole process is completed within 10 seconds at most, and the estimated results are immediately announced to GSI staff by e-mail. We examined the system using recorded 1 Hz GEONET data from several past large earthquakes in Japan. The results showed that it could estimate a reliable Mw within a few minutes: Mw 8.9 for the 2011 Tohoku earthquake (Mw 9.0) after 172 seconds, Mw 7.6 for the 2011 off-Ibaraki earthquake (Mw 7.7) after 107 seconds, and Mw 8.0 for the 2003 Tokachi-oki earthquake (Mw 8.0) after 93 seconds. GSI launched a prototype in April 2012 with 146 GEONET stations covering mainly the Tohoku district and is now planning to extend it to the whole of Japan. We believe this system will become a powerful tool for supporting Tsunami Warning, in order to prevent or mitigate the severe damage of future disastrous tsunamis.
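    The magnitude step can be illustrated with the standard moment-magnitude relation for a uniform-slip rectangular fault; the fault dimensions, slip and rigidity below are hypothetical inputs, not output of the GEONET system:

```python
import math

# Sketch of a geodetic moment-magnitude calculation behind a "single rectangular
# fault" estimate: M0 = mu * A * D, Mw = (2/3) * (log10(M0) - 9.1).
def moment_magnitude(length_km, width_km, slip_m, rigidity_pa=3.0e10):
    """Mw from a uniform-slip rectangular fault."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    m0 = rigidity_pa * area_m2 * slip_m            # seismic moment in N·m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Example: a hypothetical 450 km x 150 km fault with 15 m of average slip
print("Mw = %.1f" % moment_magnitude(450.0, 150.0, 15.0))
```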

  18. The Pacific tsunami warning system

    USGS Publications Warehouse

    Pararas-Carayannis, G.

    1986-01-01

    The impact of tsunamis on human societies can be traced back in written history to 1480 B.C., when the Minoan civilization in the Eastern Mediterranean was wiped out by great tsunami waves generated by the volcanic explosion of the island of Santorin. In the Pacific Ocean, where the majority of these waves have been generated, the historical record, although brief, shows tremendous destruction. In Japan, which has one of the most populated coastal regions in the world and a long history of earthquake activity, tsunamis have destroyed entire coastal communities. There is also a history of tsunami destruction in Alaska, in the Hawaiian Islands, and in South America.

  19. Discrimination of tsunamigenic earthquakes by ionospheric sounding using GNSS observations of total electron content from the Sumatran GPS Array

    NASA Astrophysics Data System (ADS)

    Manta, F.; Feng, L.; Occhipinti, G.; Taisne, B.; Hill, E.

    2017-12-01

    Tsunami earthquakes generate tsunamis larger than expected for their seismic magnitude. They rupture the shallow megathrust, which is usually at a significant distance from land-based monitoring networks. This distance presents a challenge in accurately estimating the magnitude and source extent of tsunami earthquakes. Whether these parameters can be estimated reliably is critical to the success of tsunami early warning systems. In this work, we investigate the potential role of GNSS-observed ionospheric total electron content (TEC) in discriminating tsunami earthquakes, by introducing for the first time the TEC Intensity Index (TECII) for rapidly identifying tsunamigenic earthquakes. We examine two Mw 7.8 megathrust events along the Sumatran subduction zone with data from the Sumatran GPS Array (SuGAr). Both events triggered a tsunami alert that was later canceled. The Banyaks event (April 6th, 2010) did not generate a tsunami and caused only minor earthquake-related damage to infrastructure. On the contrary, the Mentawai event (October 25th, 2010) produced a large tsunami with run-up heights of >16 m along the southwestern coasts of the Pagai Islands. The tsunami claimed more than 400 lives. The primary difference between the two events was the depth of rupture: the Mentawai event ruptured a very shallow (<6 km) portion of the Sunda megathrust, while the Banyaks event ruptured a deeper portion (20-30 km). While we identify only a minor ionospheric signature of the Banyaks event (TECII = 1.05), we identify a strong characteristic acoustic-gravity wave only 8 minutes after the Mentawai earthquake (TECII = 1.14) and a characteristic signature of a tsunami 40 minutes after the event. These two signals reveal the large surface displacement at the rupture and the consequent destructive tsunami. This comparative study of two earthquakes with the same magnitude but different rupture depths highlights the potential role of ionospheric monitoring by GNSS for tsunami early warning systems.
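    For context, slant TEC is commonly derived from dual-frequency GPS observations with the geometry-free combination, as sketched below; interfrequency biases, cycle slips and leveling are ignored, and the pseudorange values are made up:

```python
# Sketch of slant TEC from dual-frequency GPS pseudoranges via the standard
# geometry-free combination. Biases and carrier-phase leveling are ignored.
F1 = 1575.42e6      # GPS L1 frequency (Hz)
F2 = 1227.60e6      # GPS L2 frequency (Hz)
K = 40.3            # ionospheric refraction constant (m^3 s^-2 per electron)

def slant_tec_tecu(p1_m, p2_m):
    """Slant TEC in TEC units (1 TECU = 1e16 electrons/m^2) from P1/P2 pseudoranges."""
    factor = (F1**2 * F2**2) / (K * (F1**2 - F2**2))
    return factor * (p2_m - p1_m) / 1e16

# Example: a 5.2 m difference between the P2 and P1 ionospheric delays
print("STEC = %.1f TECU" % slant_tec_tecu(p1_m=21_000_000.0, p2_m=21_000_005.2))
```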

  20. Comparison of Human Response against Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Arikawa, T.; Güler, H. G.; Yalciner, A. C.

    2017-12-01

    Evacuation response after earthquakes and tsunamis is very important for reducing human losses from tsunamis, but it is very difficult to predict human behavior after earthquake shaking. The purpose of this research is to clarify the differences in human response after an earthquake shock in different countries and to consider the relation between the response and the feeling of safety, knowledge and education. For this purpose, a questionnaire survey was conducted after the 21 July 2017 Gokova earthquake and tsunami. We then consider the differences in human behavior by comparison with the 2015 Chilean earthquake and tsunami and the 2011 Japan earthquake and tsunami. The seismic intensity at the survey points was about 6 to 7. The questions covered the feeling of shaking, recollection of the tsunami, behavior after the shock, and so on. The questionnaire was conducted with more than 20 people in 10 areas. The results are the following: 1) Most people felt that the shaking was so strong that they could not stand. 2) None of the respondents recalled the tsunami. 3) Depending on the area, some felt that after the earthquake the beach was safer than being at home. 4) After they saw the sea drawing back, they thought that a tsunami would come and ran away. Fig. 1 shows the comparison of the evacuation rate within 10 minutes in 2011 Japan, 2015 Chile and 2017 Turkey. From the education point of view, tsunami education is not widespread in Turkey. From the protection facilities point of view, high sea walls are constructed only in Japan. From the warning alert point of view, there is no warning system against tsunamis in the Mediterranean Sea. As a result of this survey, the importance of tsunami education is shown, and evacuation tends to be delayed if dependency on facilities and alarms is too high.

  1. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis, hurricanes, etc., and their research is aimed not only at a better understanding of the physical processes, but also at providing assessments of the spatial and temporal evolution of a given individual event (i.e. short-term prediction) and of the expected evolution of a group of events (i.e. statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be if the scientific approach is sound, and the smaller the associated uncertainties are. However, there are several important cases where the assessment has to be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is the case of warning for a tsunami generated by a near-coast earthquake, which is an issue at the focus of the European funded project NearToWarn. The warning has to be issued before the tsunami hits the coast, that is, within a few minutes after its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (for instance, implementing a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention from tsunami strikes. Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistically sufficient number of large tsunamis, which entails that tsunami hazard has to be estimated by means of hypothesized worst-case scenarios, whose consequences are evaluated accordingly and are usually associated with large uncertainty bands. In the case of large uncertainties, the main issues for geoscientists are how to communicate the information (prediction and uncertainties) to stakeholders and citizens and how to build and implement together responsive procedures that are adequate. Usually there is a tradeoff between the cost of the countermeasure (warning and prevention) and its efficacy (i.e. its capability of minimizing the damage). The level of acceptable tradeoff is an issue pertaining to decision makers and to the local threatened communities. This paper, which represents a contribution from the European project TRIDEC on the management of emergency crises, discusses the role of geoscientists in providing predictions and the related uncertainties. It is stressed that through academic education geoscientists are trained mainly to improve their understanding of processes and their quantification of uncertainties, but are often unprepared to communicate their results in a way appropriate for society. Filling this gap is crucial for improving the way geoscience and society handle natural hazards and devise proper defense means.

  2. Introduction to “Global tsunami science: Past and future, Volume II”

    USGS Publications Warehouse

    Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.

    2017-01-01

    Twenty-two papers on the study of tsunamis are included in Volume II of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 (Eds., E. L. Geist, H. M. Fritz, A. B. Rabinovich, and Y. Tanioka). Three papers in Volume II focus on details of the 2011 and 2016 tsunami-generating earthquakes offshore of Tohoku, Japan. The next six papers describe important case studies and observations of recent and historical events. Four papers related to tsunami hazard assessment are followed by three papers on tsunami hydrodynamics and numerical modelling. Three papers discuss problems of tsunami warning and real-time forecasting. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: volcanic explosions, landslides, and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  3. Tsunami Preparedness in California (videos)

    USGS Publications Warehouse

    Filmed and edited by: Loeffler, Kurt; Gesell, Justine

    2010-01-01

    Tsunamis are a constant threat to the coasts of our world. Although tsunamis are infrequent along the West coast of the United States, it is possible and necessary to prepare for potential tsunami hazards to minimize loss of life and property. Community awareness programs are important, as they strive to create an informed society by providing education and training. These videos about tsunami preparedness in California distinguish between a local tsunami and a distant event and focus on the specific needs of each region. They offer guidelines for correct tsunami response and community preparedness from local emergency managers, first-responders, and leading experts on tsunami hazards and warnings, who have been working on ways of making the tsunami affected regions safer for the people and communities on a long-term basis. These videos were produced by the U.S. Geological Survey (USGS) in cooperation with the California Emergency Management Agency (CalEMA) and Pacific Gas and Electric Company (PG&E).

  4. Tsunami Preparedness in Oregon (video)

    USGS Publications Warehouse

    Filmed and edited by: Loeffler, Kurt; Gesell, Justine

    2010-01-01

    Tsunamis are a constant threat to the coasts of our world. Although tsunamis are infrequent along the West coast of the United States, it is possible and necessary to prepare for potential tsunami hazards to minimize loss of life and property. Community awareness programs are important, as they strive to create an informed society by providing education and training. This video about tsunami preparedness in Oregon distinguishes between a local tsunami and a distant event and focuses on the specific needs of this region. It offers guidelines for correct tsunami response and community preparedness from local emergency managers, first-responders, and leading experts on tsunami hazards and warnings, who have been working on ways of making the tsunami-affected regions safer for the people and communities on a long-term basis. This video was produced by the U.S. Geological Survey (USGS) in cooperation with the Oregon Department of Geology and Mineral Industries (DOGAMI).

  5. Introduction to "Global Tsunami Science: Past and Future, Volume II"

    NASA Astrophysics Data System (ADS)

    Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.

    2017-08-01

    Twenty-two papers on the study of tsunamis are included in Volume II of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 (Eds., E. L. Geist, H. M. Fritz, A. B. Rabinovich, and Y. Tanioka). Three papers in Volume II focus on details of the 2011 and 2016 tsunami-generating earthquakes offshore of Tohoku, Japan. The next six papers describe important case studies and observations of recent and historical events. Four papers related to tsunami hazard assessment are followed by three papers on tsunami hydrodynamics and numerical modelling. Three papers discuss problems of tsunami warning and real-time forecasting. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: volcanic explosions, landslides, and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.

  6. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States' tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings, and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method to assess the tsunami hazard involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines, extending the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.

  7. On the Potential Uses of Static Offsets Derived From Low-Cost Community Instruments and Crowd-Sourcing for Earthquake Monitoring and Rapid Response

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Murray, J. R.; Iannucci, R. A.

    2013-12-01

    We explore the efficacy of low-cost community instruments (LCCIs) and crowd-sourcing to produce rapid estimates of earthquake magnitude and rupture characteristics which can be used for earthquake loss reduction, such as issuing tsunami warnings and guiding rapid response efforts. Real-time high-rate GPS data are just beginning to be incorporated into earthquake early warning (EEW) systems. These data are showing promising utility, including producing moment magnitude estimates which do not saturate for the largest earthquakes and determining the geometry and slip distribution of the earthquake rupture in real time. However, building a network of scientific-quality real-time high-rate GPS stations requires substantial infrastructure investment which is not practicable in many parts of the world. To expand the benefits of real-time geodetic monitoring globally, we consider the potential of pseudorange-based GPS locations such as the real-time positioning done onboard cell phones or on LCCIs that could be distributed in the same way accelerometers are distributed as part of the Quake Catcher Network (QCN). While location information from LCCIs often has large uncertainties, their low cost means that large numbers of instruments can be deployed. A monitoring network that includes smartphones could collect data from potentially millions of instruments. These observations could be averaged together to substantially decrease the errors associated with estimated earthquake source parameters. While these data will be inferior to data recorded by scientific-grade seismometers and GPS instruments, there are features of community-based data collection (and possibly analysis) that are very attractive. This approach creates a system where every user can host an instrument or download an application to their smartphone that provides them with earthquake and tsunami warnings while also providing the data on which the warning system operates. This symbiosis encourages people both to become users of the warning system and to contribute data to it. Further, there is some potential to take advantage of the LCCI hosts' computing and communications resources to do some of the analysis required for the warning system. We will present examples of the type of data which might be observed by pseudorange-based positioning for both actual earthquakes and laboratory tests, as well as performance tests of potential earthquake source modeling derived from pseudorange data. A highlight of these performance tests is a case study of the 2011 Mw 9 Tohoku-oki earthquake.
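    A toy illustration of why averaging many low-quality positions is attractive: the error of the mean shrinks roughly as 1/sqrt(N); the 3 m per-instrument noise level is a made-up placeholder:

```python
import numpy as np

# Averaging N independent noisy offset estimates shrinks the error of the mean
# roughly as 1/sqrt(N). Numbers below are illustrative placeholders.
rng = np.random.default_rng(6)
true_offset_east = 2.0                         # metres of true coseismic motion
for n in (1, 100, 10_000):
    estimates = true_offset_east + rng.normal(0.0, 3.0, n)   # noisy crowd-sourced offsets
    print(f"N={n:6d}: mean = {estimates.mean():+.2f} m, "
          f"std of mean ~ {3.0 / np.sqrt(n):.2f} m")
```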

  8. The quest for wisdom: lessons from 17 tsunamis, 2004-2014.

    PubMed

    Okal, Emile A

    2015-10-28

    Since the catastrophic Sumatra-Andaman tsunami took place in 2004, 16 other tsunamis have resulted in significant damage and 14 in casualties. We review the fundamental changes that have affected our command of tsunami issues as scientists, engineers and decision-makers, in the quest for improved wisdom in this respect. While several scientific paradigms have had to be altered or abandoned, new algorithms, e.g. the W seismic phase and real-time processing of fast-arriving seismic P waves, give us more powerful tools to estimate in real time the tsunamigenic character of an earthquake. We assign to each event a 'wisdom index' based on the warning issued (or not) during the event, and on the response of the population. While this approach is admittedly subjective, it clearly shows several robust trends: (i) we have made significant progress in our command of far-field warning, with only three casualties in the past 10 years; (ii) self-evacuation by educated populations in the near field is a key element of successful tsunami mitigation; (iii) there remains a significant cacophony between the scientific community and decision-makers in industry and government as documented during the 2010 Maule and 2011 Tohoku events; and (iv) the so-called 'tsunami earthquakes' generating larger tsunamis than expected from the size of their seismic source persist as a fundamental challenge, despite scientific progress towards characterizing these events in real time. © 2015 The Author(s).

  9. A Case Study of Array-based Early Warning System for Tsunami Offshore Ventura, California

    NASA Astrophysics Data System (ADS)

    Xie, Y.; Meng, L.

    2017-12-01

    Extreme scenarios of M 7.5+ earthquakes on the Red Mountain and Pitas Point faults can potentially generate significant local tsunamis in southern California. The maximum water elevation could be as large as 10 m in the nearshore region of Oxnard and Santa Barbara. Recent developments in seismic array processing enable rapid tsunami prediction and early warning based on the back-projection (BP) approach. The idea is to estimate the rupture size by back-tracing the seismic body waves recorded by stations at local and regional distances. A simplified source model of uniform slip is constructed and used as an input for tsunami simulations that predict the tsunami wave height and arrival time. We demonstrate the feasibility of this approach in southern California by implementing it in a simulated real-time environment and applying it to a hypothetical M 7.7 dip-slip earthquake scenario on the Pitas Point fault. Synthetic seismograms are produced using the SCEC Broadband Platform based on the 3D SoCal community velocity model. We use the S-wave instead of the P-wave to avoid S-minus-P travel times shorter than the rupture duration. Two clusters of strong-motion stations near Bakersfield and Palmdale are selected to determine the back-azimuth of the strongest high-frequency radiation (0.5-1 Hz). The back-azimuths of the two clusters are then intersected to locate the source positions. The rupture area is then approximated by enclosing these BP radiators with an ellipse or a polygon. Our preliminary results show that the rupture area of 1294 square kilometers and magnitude of 7.6 obtained by this approach are reasonably close to the 1849 square kilometers and 7.7 of the input model. The average slip of 7.3 m is then estimated according to the scaling relation between slip and rupture area, which is close to the actual average dislocation of 8.3 m. Finally, a tsunami simulation is conducted to assess the wave height and arrival time. The errors of -3 to +9 s in arrival time and 0.4 m in wave amplitude are reasonably small for early warning purposes. The blind zone for early warning is the region north of the outcrop of the Pitas Point faults and has a scale close to the length of the fault, 43 km. The warning time is above 15 min in the nearshore region west of Cojo Bay Beach and south of Oxnard.
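    The slip-from-area step can be roughly cross-checked with the standard relation M0 = mu * A * D, as sketched below; the 30 GPa rigidity is an assumption, and this is not necessarily the scaling relation used by the authors:

```python
import math

# Rough consistency check relating magnitude, rupture area and average slip via
# M0 = mu * A * D, with an assumed rigidity of 30 GPa.
def average_slip(mw, area_km2, rigidity_pa=3.0e10):
    m0 = 10 ** (1.5 * mw + 9.1)                    # seismic moment (N·m)
    return m0 / (rigidity_pa * area_km2 * 1e6)     # average slip (m)

# Input-model values quoted above: Mw 7.7 over 1849 km^2
print("average slip ~ %.1f m" % average_slip(7.7, 1849.0))   # about 8 m, near the 8.3 m quoted
```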

  10. Sources of information for tsunami forecasting in New Zealand

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Ristau, J. P.; D'Anastasio, E.; Wang, X.

    2013-12-01

    Tsunami science has evolved considerably in the last two decades due to technological advancements, which have also helped push for better numerical modelling of the tsunami phases (generation to inundation). The deployment of DART buoys has also been a considerable milestone in tsunami forecasting. Tsunami forecasting is one of the activities that tsunami modelling feeds into and is related to response, preparedness and planning. Usually tsunami forecasting refers to short-term forecasting that takes place in real time after a tsunami has, or appears to have, been generated. In this report we refer to all types of forecasting (short-term or long-term) related to work in advance of a tsunami impacting a coastline that would help in response, planning or preparedness. We look at the standard types of data (seismic, GPS, water level) that are available in New Zealand for tsunami forecasting, how they are currently being used and other ways to use these data, and provide recommendations for better utilisation. The main findings are:
    - Current investigations of the use of seismic parameters quickly obtained after an earthquake have the potential to provide critical information about the tsunamigenic potential of earthquakes. Further analysis of the most promising methods should be undertaken to determine a path to full implementation.
    - Network communication of the largest part of the GPS network is not currently at a stage that can provide sufficient data early enough for tsunami warning. It is believed to have potential, but changes, including data transmission improvements, may have to happen before real-time processing oriented to tsunami early warning is implemented on the data that are currently provided.
    - Tide gauge data are currently under-utilised for tsunami forecasting. Spectral analysis, modal analysis based on identified modes, and arrival times extracted from the records can be useful in forecasting.
    - The current study is by no means exhaustive of the ways the different types of data can be used; we are only presenting an overview of what can be done. More extensive studies with each of the types of data collected by GeoNet and other relevant networks will help improve tsunami forecasting in New Zealand.

  11. The November 17, 2015 Lefkada offshore (non-?)tsunamigenic earthquake: preliminary considerations and implications for tsunami hazard and warning in the Ionian Sea

    NASA Astrophysics Data System (ADS)

    Armigliato, Alberto; Tinti, Stefano; Pagnoni, Gianluca; Ausilia Paparo, Maria; Zaniboni, Filippo

    2016-04-01

    A Mw = 6.5 earthquake occurred on November 17, 2015 just offshore the western coast of the Ionian island of Lefkada (western Greece). The earthquake caused two fatalities and severe damage, especially on the island of Lefkada. Several landslides were set in motion by the earthquake, some of which occurred along the coastal cliffs. The earthquake was also clearly felt along the eastern coasts of Apulia, Calabria and Sicily (Italy). The computed focal mechanisms indicate that the rupture occurred along a dextral strike-slip, sub-vertical fault, compatible with the well-known transcurrent tectonics of the Lefkada-Cephalonia area. At the time of drafting this abstract no heterogeneous slip distribution had been proposed. No clear evidence of tsunami effects is available, with the only exception of the signal recorded by the tide gauge in Crotone (eastern Calabria, Italy), where a clear disturbance (still to be fully characterised and explained) emerges from the background approximately 1 hour after the earthquake origin time. From the tsunami research point of view, the November 17 Lefkada earthquake poses at least two problems, which we try to address in this paper. The first consists of studying the tsunami generation based on the available seismic information and on the tectonic setting of the area. We present results of numerical simulations of the tsunami generation and propagation aimed at casting light on the reasons why the generated tsunami was so weak (or even absent). Starting from the official fault parameters provided by the seismic agencies, we vary a number of them, including the length and width, calculated on the basis of different regression formulas, and the depth. For each configuration we perform tsunami simulations by means of the in-house finite-difference code UBO-TSUFD. In parallel, we analyse the Crotone tide-gauge record in order to understand whether the observed "anomalous" signal can be attributed to a tsunami or not. In the first case we will try at least to reproduce the observed signal; otherwise we will try to understand whether the non-tsunamigenic nature of the event is confirmed by the tsunami simulations. The second problem is more related to tsunami early warning issues, in particular to the performance of the Tsunami Decision Matrix for the Mediterranean, presently adopted for example by the candidate Tsunami Service Providers at NOA (Greece) and INGV (Italy). We will briefly discuss whether the present form of the matrix, which does not include any information on the focal mechanism, is well suited to a peculiar event like the November 17 earthquake, which was of strike-slip nature and had a magnitude lying just at the border between two distinct classes of tsunami potential forecast. This study is funded in the frame of the EU project ASTARTE - "Assessment, STrategy And Risk Reduction for Tsunamis in Europe", Grant 603839, 7th FP (ENV.2013.6.4-3), and of the Italian Flagship Project RITMARE ("La Ricerca ITaliana per il MARE").

  12. Tsunami forecast by joint inversion of real-time tsunami waveforms and seismic or GPS data: application to the Tohoku 2011 tsunami

    USGS Publications Warehouse

    Yong, Wei; Newman, Andrew V.; Hayes, Gavin P.; Titov, Vasily V.; Tang, Liujuan

    2014-01-01

    Correctly characterizing tsunami source generation is the most critical component of modern tsunami forecasting. Although difficult to quantify directly, a tsunami source can be modeled via different methods using a variety of measurements from deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some of which are available in or near real time. Here we assess the performance of different source models for the destructive 11 March 2011 Japan tsunami using model–data comparison for the generation, propagation, and inundation in the near field of Japan. This comparative study of tsunami source models addresses the advantages and limitations of different real-time measurements with potential use in early tsunami warning in the near and far field. The study highlights the critical role of deep-ocean tsunami measurements and rapid validation of the approximate tsunami source for high-quality forecasting. We show that these tsunami measurements are compatible with other real-time geodetic data, and may provide a more insightful understanding of tsunami generation from earthquakes, as well as from nonseismic processes such as submarine landslide failures.

  13. Recent Advances in Remote Sensing of Natural Hazards-Induced Atmospheric and Ionospheric Perturbations

    NASA Astrophysics Data System (ADS)

    Yang, Y. M.; Komjathy, A.; Meng, X.; Verkhoglyadova, O. P.; Langley, R. B.; Mannucci, A. J.

    2015-12-01

    Traveling ionospheric disturbances (TIDs) induced by acoustic-gravity waves in the neutral atmosphere have a significant impact on trans-ionospheric radio waves such as Global Navigation Satellite System (GNSS, including Global Positioning System (GPS)) measurements. Natural hazards and solid Earth events such as earthquakes, tsunamis and volcanic eruptions are sources that may trigger acoustic and gravity waves resulting in TIDs in the upper atmosphere. Trans-ionospheric radio wave measurements sense the total electron content (TEC) along the signal propagation path. In this research, we introduce a novel GPS-based detection and estimation technique for remote sensing of atmospheric wave-induced TIDs, including space weather phenomena induced by major natural hazard events, using TEC time series collected from worldwide ground-based dual-frequency GNSS (including GPS) receiver networks. We demonstrate the ability of ground- and space-based dual-frequency GPS measurements to detect and monitor tsunami wave propagation from the 2011 Tohoku-Oki earthquake and tsunami. Major wave trains with different propagation speeds and wavelengths were identified through analysis of the GPS remote sensing observations. Dominant physical characteristics of atmospheric wave-induced TIDs are found to be associated with specific tsunami propagations and oceanic Rayleigh waves. In this research, we compared GPS-based observations, corresponding model simulations and tsunami wave propagation. The results lead to a better understanding of the tsunami-induced ionospheric response. Based on the current distribution of Plate Boundary Observatory GPS stations, the results indicate that tsunami-induced TIDs may be detected about 60 minutes before tsunamis arrive at the U.S. west coast. It is expected that this GNSS-based technology will become an integral part of future early-warning systems.
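    A minimal sketch of isolating TID-band perturbations from a TEC time series with a zero-phase band-pass filter; the chosen band, sampling rate and synthetic series are illustrative assumptions, not the detection technique of this study:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Zero-phase Butterworth band-pass to pull TID-band perturbations out of a TEC
# time series. The 0.5-5 mHz band (periods of roughly 3-30 minutes) and the
# synthetic series are placeholder choices.
def tid_bandpass(tec, fs_hz, f_lo=0.5e-3, f_hi=5e-3, order=4):
    b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs_hz)
    return filtfilt(b, a, tec)

fs = 1.0 / 30.0                                   # one TEC sample every 30 s
t = np.arange(0, 2 * 3600, 30.0)                  # two hours of data
trend = 20 + 0.001 * t                            # slowly varying background TEC (TECU)
tid = 0.3 * np.sin(2 * np.pi * t / 900.0)         # 15-minute-period perturbation (TECU)
tec = trend + tid + np.random.default_rng(7).normal(0, 0.05, t.size)
print("peak filtered perturbation: %.2f TECU" % np.abs(tid_bandpass(tec, fs)).max())
```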

  14. Assessment of tsunami hazard for coastal areas of Shandong Province, China

    NASA Astrophysics Data System (ADS)

    Feng, Xingru; Yin, Baoshu

    2017-04-01

    Shandong Province is located on the east coast of China and has a coastline of about 3100 km. There are only a few tsunami events recorded in the history of Shandong Province, but a tsunami hazard assessment is still necessary because of the rapid economic development and increasing population of this area. The objective of this study was to evaluate the potential danger posed by tsunamis for Shandong Province. A numerical simulation method was adopted to assess the tsunami hazard for the coastal areas of Shandong Province. The Cornell multi-grid coupled tsunami numerical model (COMCOT) was used, and its efficacy was verified by comparison with three historical tsunami events. The simulated maximum tsunami wave height agreed well with the observational data. Based on previous studies and statistical analyses, multiple earthquake scenarios in eight seismic zones were designed, the magnitudes of which were set to the potential maximum values. The tsunamis they induced were then simulated using the COMCOT model to investigate their impact on the coastal areas of Shandong Province. The numerical results showed that the maximum tsunami wave height, caused by the earthquake scenario located in the sea area of the Mariana Islands, could reach up to 1.39 m off the eastern coast of Weihai city. The tsunamis from the seismic zones of the Bohai Sea, Okinawa Trough, and Manila Trench could also reach heights of >1 m in some areas, meaning that earthquakes in these zones should not be ignored. The inundation hazard was distributed primarily in some northern coastal areas near Yantai and the southeastern coastal areas of the Shandong Peninsula. When considering both the magnitude and arrival time of tsunamis, it is suggested that greater attention be paid to earthquakes that occur in the Bohai Sea. In conclusion, the tsunami hazard facing the coastal area of Shandong Province is not very serious; however, disasters could occur if such events coincided with spring tides or other extreme oceanic conditions. The results of this study will be useful for the design of coastal engineering projects and the establishment of a tsunami warning system for Shandong Province.

  15. Toward the Real-Time Tsunami Parameters Prediction

    NASA Astrophysics Data System (ADS)

    Lavrentyev, Mikhail; Romanenko, Alexey; Marchuk, Andrey

    2013-04-01

    Today, a wide, well-developed system of deep-ocean tsunami detectors operates over the Pacific, and direct measurements of tsunami-wave time series are available. However, tsunami warning systems still fail to predict the basic parameters of tsunami waves in time; dozens of examples could be cited. In our view, the lack of computational power is the main reason for these failures. At the same time, modern computer technologies such as GPUs (graphics processing units) and FPGAs (field-programmable gate arrays) can dramatically improve data-processing performance, which may enable timely tsunami-warning prediction. Thus, it is possible to address the challenge of real-time tsunami forecasting for selected geographic regions. We propose to use three new techniques in existing tsunami warning systems to achieve real-time calculation of tsunami wave parameters. First, the measurement system (e.g., the DART buoy locations) should be optimized, both in terms of wave arrival time and the amplitude parameter; the corresponding software application exists today and is ready for use [1]. We consider the example of the coastline of Japan. Numerical tests show that optimal installation of only 4 DART buoys (taking into account the existing seabed cable) reduces the tsunami-wave detection time to only 10 min after an underwater earthquake. Second, as was shown by the authors of this paper, the use of GPU/FPGA technologies accelerates the execution of the MOST (Method of Splitting Tsunami) code by a factor of 100 [2]. Therefore, tsunami wave propagation over an ocean area of 2000 x 2000 km (wave propagation simulation with a time step of 10 s, recording every 4th spatial point and every 4th time step) can be calculated in 3 s on a 4' mesh, 50 s on a 1' mesh, and 5 min on a 0.5' mesh. An algorithm to switch from the coarse mesh to a fine-grained one is also available. Finally, we propose a new algorithm for determining tsunami source parameters by real-time processing of the time series obtained at DART buoys. The measured time series is approximated by a linear combination of synthetic marigrams, whose coefficients are calculated with the help of an orthogonal decomposition. The algorithm is very fast and demonstrates good accuracy. Summing up, for the example of the coastline of Japan, the wave-height evaluation will be available 12-14 minutes after the earthquake, before the wave approaches the nearest shore point (which usually takes about 20 minutes). [1] Astrakova, A. S., D. V. Bannikov, S. G. Cherny, and M. M. Lavrentiev: The determination of the optimal sensors' location using genetic algorithm, 3rd Nordic EMW Summer School, Turku, Finland, June 2009: proceedings, TUSC General Publications, N 53, pp. 5-22, 2009. [2] Lavrentiev, M., Jr., and A. Romanenko: Modern Hardware Solutions to Speed Up Tsunami Simulation Codes, Geophysical Research Abstracts, Vol. 12, EGU2010-3835, 2010.
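    The third technique above, fitting the measured DART record with a linear combination of precomputed synthetic marigrams, amounts to a small least-squares problem. The sketch below is an illustration under that reading, not the authors' code; it solves the fit with an orthogonal-decomposition-based solver and reuses the coefficients to forecast the waveform at a coastal point.

```python
import numpy as np

def fit_source_coefficients(observed, synthetic_marigrams):
    """Approximate an observed DART record by a linear combination of
    precomputed synthetic marigrams (unit sources).

    observed            : (n_samples,) measured sea-level anomaly at the buoy
    synthetic_marigrams : (n_samples, n_sources) one column per unit source

    Returns the coefficient vector c minimizing ||G c - d||_2, computed with
    numpy's least-squares solver (an orthogonal decomposition internally).
    """
    c, *_ = np.linalg.lstsq(synthetic_marigrams, observed, rcond=None)
    return c

def forecast_waveform(coefficients, synthetic_at_forecast_point):
    """Forecast at a coastal point as the same linear combination of the
    unit-source responses precomputed for that point."""
    return synthetic_at_forecast_point @ coefficients
```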

  16. The Catalog of Event Data of the Operational Deep-ocean Assessment and Reporting of Tsunamis (DART) Stations at the National Data Buoy Center

    NASA Astrophysics Data System (ADS)

    Bouchard, R.; Locke, L.; Hansen, W.; Collins, S.; McArthur, S.

    2007-12-01

    DART systems are a critical component of the tsunami warning system, as they provide the only real-time, in situ tsunami detection before landfall. DART systems consist of a surface buoy that serves as a position locator and communications transceiver and a Bottom Pressure Recorder (BPR) on the seafloor. The BPR records temperature and pressure at 15-second intervals to a memory card for later retrieval, analysis, and use by tsunami researchers, but the BPRs are normally recovered only once every two years. The DART systems also transmit subsets of the data, converted to an estimate of the sea surface height, in near real time for use by the tsunami warning community. These data are available on NDBC's webpages, http://www.ndbc.noaa.gov/dart.shtml. Although not of the resolution of the data recorded to the BPR memory card, the near real-time data have proven to be of value in research applications [1]. Of particular interest are the DART data associated with geophysical events. The DART BPR continuously compares the measured sea height with a predicted sea height, and when the difference exceeds a threshold value, the BPR goes into Event Mode. Event Mode provides extended, more frequent near real-time reporting of sea surface heights for tsunami detection. The BPR can go into Event Mode because of geophysical triggers, such as tsunamis or seismic activity, which may or may not be tsunamigenic. The BPR can also go into Event Mode during recovery of the BPR as it leaves the seafloor, or when manually triggered by the Tsunami Warning Centers in advance of an expected tsunami. On occasion, the BPR will go into Event Mode without any associated tsunami, seismic activity, or human intervention; these are considered "False" Events. Approximately one-third of all Events can be classified as "False". NDBC is responsible for the operations, maintenance, and data management of the DART stations. Each DART station has a webpage with a drop-down list of all Events. NDBC retains the non-geophysical Events in order to maintain the continuity of the time series records. In 2007, NDBC compiled all DART Events that occurred while under NDBC's operational control and made an assessment of their validity. The NDBC analysts performed the assessment using the characteristics of the data time series, the triggering criteria, and associated seismic events. The compilation and assessments are catalogued in an NDBC technical document. The Catalog also includes a listing of the one-hour, high-resolution data, retrieved remotely from the BPRs, that are not available on the web pages. The Events are classified by their triggering mechanism and listed by station location and, for those Events associated with geophysical triggers, by their associated seismic events. The Catalog provides researchers with a valuable tool for locating, assessing, and applying near real-time DART data to tsunami research and will be updated following DART Events. A link to the published Catalog can be found on the NDBC DART website, http://www.ndbc.noaa.gov/dart.shtml. Reference: [1] Gower, J. and F. González (2006), U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10), 105-112.
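    The Event Mode trigger described above is, at its core, a residual test between measured and predicted sea height. The sketch below is a simplified stand-in for that logic; the polynomial extrapolation and the 0.03 m threshold are illustrative choices, not the operational BPR algorithm.

```python
import numpy as np

def predicted_height(history, dt, horizon):
    """Extrapolate the recent sea-surface-height history [m] with a cubic
    polynomial fit, as a stand-in for the prediction the BPR uses."""
    t = np.arange(len(history)) * dt
    coeffs = np.polyfit(t, history, deg=3)
    return np.polyval(coeffs, t[-1] + horizon)

def event_triggered(history, new_sample, dt=15.0, threshold=0.03):
    """Return True when the new 15-s sample departs from the prediction by
    more than the threshold (0.03 m is illustrative, not the operational
    value)."""
    return abs(new_sample - predicted_height(history, dt, dt)) > threshold
```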

  17. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aimed at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of P-wave magnitude, which generally contain large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, the JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following magnitude updates even decreased to M6.3-6.6. The magnitude estimate finally stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we present an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would theoretically have been possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help reduce false and missed warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  18. Development of new tsunami detection algorithms for high frequency radars and application to tsunami warning in British Columbia, Canada

    NASA Astrophysics Data System (ADS)

    Grilli, S. T.; Guérin, C. A.; Shelby, M. R.; Grilli, A. R.; Insua, T. L.; Moran, P., Jr.

    2016-12-01

    A High-Frequency (HF) radar was installed by Ocean Networks Canada in Tofino, BC, to detect tsunamis from far- and near-field seismic sources, in particular from the Cascadia Subduction Zone. This HF radar can measure ocean surface currents up to a 70-85 km range, depending on atmospheric conditions, based on the Doppler shift they cause in ocean waves at the Bragg frequency. In earlier work, we showed that tsunami currents must be at least 0.15 m/s to be directly detectable by an HF radar, when considering environmental noise and background currents (from tide/mesoscale circulation). This limits direct tsunami detection to shallow-water areas where currents are sufficiently strong due to wave shoaling and, hence, to the continental shelf. It follows that, in locations with a narrow shelf, warning times using a direct inversion method will be small. To detect tsunamis in deeper water, beyond the continental shelf, we proposed a new algorithm that does not require directly inverting currents, but instead is based on observing changes in patterns of spatial correlations of the raw radar signal between two radar cells located along the same wave ray, after time is shifted by the tsunami propagation time along the ray. A pattern change indicates the presence of a tsunami. We validated this new algorithm for idealized tsunami wave trains propagating over a simple seafloor geometry in a direction normally incident to shore. Here, we further develop, extend, and validate the algorithm for realistic case studies of seismic tsunami sources impacting Vancouver Island, BC. Tsunami currents, computed with a state-of-the-art long-wave model, are spatially averaged over cells aligned along individual wave rays located within the radar sweep area, obtained by solving the geometric-optics wave equation; for long waves, such rays and the tsunami propagation times along them are functions only of the seafloor bathymetry, and hence can be precalculated for different incident tsunami directions. A model simulating the radar backscattered signal in space and time as a function of the simulated tsunami currents is applied to the sweep area. Numerical experiments show that the new algorithm can detect a realistic tsunami further offshore than a direct detection method. Correlation thresholds for tsunami detection will be derived from the results.
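    The correlation-based detection idea described above can be reduced to a time-shifted correlation between two radar cells on the same wave ray, monitored in sliding windows. The sketch below illustrates that reduction only; the actual algorithm works on the raw backscattered radar signal and uses thresholds calibrated against noise statistics, both of which are assumptions here.

```python
import numpy as np

def lagged_correlation(signal_a, signal_b, lag_samples):
    """Normalized correlation of two radar-cell time series after shifting
    cell B by the tsunami travel time between the cells (in samples)."""
    a = signal_a[:-lag_samples] if lag_samples else signal_a
    b = signal_b[lag_samples:]
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def correlation_change_detector(sig_a, sig_b, lag_samples, window, step):
    """Slide a window along the two series and report the lagged correlation
    per window; a sustained departure from the background level would flag
    a possible tsunami (threshold to be calibrated on noise statistics)."""
    scores = []
    for start in range(0, len(sig_a) - window - lag_samples, step):
        wa = sig_a[start:start + window + lag_samples]
        wb = sig_b[start:start + window + lag_samples]
        scores.append(lagged_correlation(wa, wb, lag_samples))
    return np.array(scores)
```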

  19. Long-term perspectives on giant earthquakes and tsunamis at subduction zones

    USGS Publications Warehouse

    Satake, K.; Atwater, B.F.; ,

    2007-01-01

    Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright © 2007 by Annual Reviews. All rights reserved.

  20. Towards an Earthquake and Tsunami Early Warning in the Caribbean

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Vanacore, E. A.

    2017-12-01

    The Caribbean region (CR) has a documented history of large damaging earthquakes and tsunamis that have affected coastal areas, including the events of Jamaica in 1692, the Virgin Islands in 1867, Puerto Rico in 1918, the Dominican Republic in 1946, and Haiti in 2010. There is clear evidence that tsunamis have been triggered by large earthquakes that deformed the ocean floor around the Caribbean Plate boundary. The CR is monitored jointly by national/regional/local seismic, geodetic, and sea-level networks. All monitoring institutions participate in the UNESCO ICG/Caribe EWS; the purpose of this initiative is to minimize loss of life and destruction of property and to mitigate catastrophic economic impacts by promoting local research, real-time (RT) earthquake, geodetic, and sea-level data sharing, improving warning capabilities, and enhancing education and outreach strategies. Currently, more than 100 broadband seismic, 65 sea-level, and 50 high-rate GPS stations are available in real or near real time. These real-time streams are used by local, regional, or worldwide detection and warning institutions to provide earthquake source parameters in a timely manner. Currently, any Caribbean event detected to have a magnitude greater than 4.5 is evaluated for tsunamigenic potential by the TWC, and sea level is measured. The regional cooperation is motivated both by research interests and by geodetic, seismic, and tsunami hazard monitoring and warning. It will allow imaging of the tectonic structure of the Caribbean region at high resolution, which will consequently permit a better understanding of the seismic source properties of moderate and large events and the application of this knowledge to civil protection procedures. To reach its goals, the virtual network has been designed following the highest technical standards: broadband sensors, 24-bit A/D converters with 140 dB dynamic range, and real-time telemetry. Here we discuss the state of the Puerto Rico component of this virtual network as well as current advances in imaging the Puerto Rico tectonic structure. The goal of this presentation is to describe the Puerto Rico Seismic Network (PRSN) system, including real-time earthquake and tsunami monitoring as well as the specific protocols used to broadcast earthquake/tsunami messages locally.

  1. Detecting Tsunami Genesis and Scales Directly from Coastal GPS Stations

    NASA Astrophysics Data System (ADS)

    Song, Y. Tony

    2013-04-01

    In contrast to the conventional approach to tsunami warnings, which relies on earthquake magnitude estimates, we have found that coastal GPS stations are able to detect continental-slope displacements caused by faulting in large earthquakes, and that the detected seafloor displacements can determine tsunami source energy and scales instantaneously. This method has successfully replicated several historical tsunamis caused by the 2004 Sumatra earthquake, the 2005 Nias earthquake, the 2010 Chilean earthquake, and the 2011 Tohoku-Oki earthquake, respectively, and compares favorably with the conventional seismic solutions that usually take hours or days to obtain by inverting seismographs (references listed below). Because many coastal GPS stations are already in operation, measuring ground motions in real time as often as once every few seconds, this study suggests a practical way of identifying tsunamigenic earthquakes for early warnings and reducing false alarms. References: Song, Y. T., 2007: Detecting tsunami genesis and scales directly from coastal GPS stations, Geophys. Res. Lett., 34, L19602, doi:10.1029/2007GL031681. Song, Y. T., L.-L. Fu, V. Zlotnicki, C. Ji, V. Hjorleifsdottir, C. K. Shum, and Y. Yi, 2008: The role of horizontal impulses of the faulting continental slope in generating the 26 December 2004 tsunami, Ocean Modelling, doi:10.1016/j.ocemod.2007.10.007. Song, Y. T. and S. C. Han, 2011: Satellite observations defying the long-held tsunami genesis theory, in D. L. Tang (ed.), Remote Sensing of the Changing Oceans, DOI 10.1007/978-3-642-16541-2, Springer-Verlag Berlin Heidelberg. Song, Y. T., I. Fukumori, C. K. Shum, and Y. Yi, 2012: Merging tsunamis of the 2011 Tohoku-Oki earthquake detected over the open ocean, Geophys. Res. Lett., doi:10.1029/2011GL050767 (Nature Highlights, March 8, 2012).
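    The record above relies on coastal GPS stations resolving coseismic displacements in real time. As a minimal, hypothetical sketch of that first step (it is not Song's source-energy method, which goes on to map the detected displacements onto seafloor motion and tsunami energy scales), the block below pulls a static offset out of a 1-Hz position series by differencing pre- and post-event means; window lengths are illustrative.

```python
import numpy as np

def coseismic_offset(positions, event_index, pre_window=60, post_window=60, settle=30):
    """Estimate a static (coseismic) offset from a 1-Hz GPS position series [m]
    as the difference between the mean position after the shaking settles and
    the mean position before the event. Window lengths (in samples) are
    illustrative choices, not values from the cited papers."""
    pre = positions[event_index - pre_window:event_index]
    post = positions[event_index + settle:event_index + settle + post_window]
    return float(np.mean(post) - np.mean(pre))

def offsets_for_network(series_by_station, event_index):
    """Apply the offset estimate to every station component in a network,
    e.g. {'STN1_E': east_series, 'STN1_N': north_series, ...}."""
    return {name: coseismic_offset(ts, event_index)
            for name, ts in series_by_station.items()}
```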

  2. Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods

    USGS Publications Warehouse

    Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.

    2011-01-01

    The tsunami source is the origin of the subsequent transoceanic water waves and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time and others in post-real time. Here we assess these different sources of the devastating March 11, 2011 Japan tsunami through model-data comparison for generation, propagation, and inundation in the near field of Japan. This comparative study furthers understanding of the advantages and shortcomings of different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts, and their combination with land GPS measurements may lead to a better understanding of both the earthquake mechanisms and the tsunami generation process. © 2011 MTS.

  3. U.S. States and Territories National Tsunami Hazard Assessment: Historical record and sources for waves – Update

    USGS Publications Warehouse

    Dunbar, Paula K.; Weaver, Craig S.

    2015-01-01

    The first U.S. Tsunami Hazard Assessment (Dunbar and Weaver, 2008) was prepared at the request of the National Tsunami Hazard Mitigation Program (NTHMP). The NTHMP is a partnership formed between federal and state agencies to reduce the impact of tsunamis through hazard assessment, warning guidance, and mitigation. The assessment was conducted in response to a 2005 joint report by the Sub-Committee on Disaster Reduction and the U.S. Group on Earth Observations entitled Tsunami Risk Reduction for the United States: A Framework for Action. The first specific action called for in the Framework was to “develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories.” Since the first assessment, there have been a number of very significant tsunamis, including the 2009 Samoa, 2010 Chile, and 2011 Japan tsunamis. As a result, the NTHMP requested an update of the U.S. tsunami hazard assessment.

  4. Widespread tsunami-like waves of 23-27 June in the Mediterranean and Black Seas generated by high-altitude atmospheric forcing.

    PubMed

    Šepić, Jadranka; Vilibić, Ivica; Rabinovich, Alexander B; Monserrat, Sebastian

    2015-06-29

    A series of tsunami-like waves of non-seismic origin struck several southern European countries during the period of 23 to 27 June 2014. The event caused considerable damage from Spain to Ukraine. Here, we show that these waves were long-period ocean oscillations known as meteorological tsunamis, which are generated by intense small-scale air pressure disturbances. A unique atmospheric synoptic pattern was tracked propagating eastward over the Mediterranean and the Black seas in synchrony with the onset times of the observed tsunami waves. This pattern favoured the generation and propagation of atmospheric gravity waves that induced pronounced tsunami-like waves through the Proudman resonance mechanism. This is the first documented case of a chain of destructive meteorological tsunamis occurring over a distance of thousands of kilometres. Our findings further demonstrate that these events represent potentially dangerous regional phenomena and should be included in tsunami warning systems.
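    The Proudman resonance mechanism invoked above occurs when a travelling air-pressure disturbance moves at the local long-wave speed sqrt(g h). A small illustrative sketch of the critical depth and the classical amplification factor follows; the numerical values are examples, not taken from the study.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def long_wave_speed(depth_m):
    """Shallow-water (long-wave) phase speed c = sqrt(g h)."""
    return math.sqrt(G * depth_m)

def proudman_critical_depth(disturbance_speed_ms):
    """Depth at which a travelling air-pressure disturbance moves at the
    long-wave speed, h = U^2 / g, i.e. where Proudman resonance occurs."""
    return disturbance_speed_ms ** 2 / G

def proudman_amplification(disturbance_speed_ms, depth_m):
    """Classical amplification factor 1 / |1 - Fr^2| with Froude number
    Fr = U / sqrt(g h); it diverges near resonance (friction and finite
    fetch limit it in reality)."""
    fr = disturbance_speed_ms / long_wave_speed(depth_m)
    return 1.0 / abs(1.0 - fr ** 2)

# Example: a 25 m/s pressure disturbance resonates over ~64 m deep shelves.
print(round(proudman_critical_depth(25.0), 1))
```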

  5. Widespread tsunami-like waves of 23-27 June in the Mediterranean and Black Seas generated by high-altitude atmospheric forcing

    PubMed Central

    Šepić, Jadranka; Vilibić, Ivica; Rabinovich, Alexander B.; Monserrat, Sebastian

    2015-01-01

    A series of tsunami-like waves of non-seismic origin struck several southern European countries during the period of 23 to 27 June 2014. The event caused considerable damage from Spain to Ukraine. Here, we show that these waves were long-period ocean oscillations known as meteorological tsunamis, which are generated by intense small-scale air pressure disturbances. A unique atmospheric synoptic pattern was tracked propagating eastward over the Mediterranean and the Black seas in synchrony with the onset times of the observed tsunami waves. This pattern favoured the generation and propagation of atmospheric gravity waves that induced pronounced tsunami-like waves through the Proudman resonance mechanism. This is the first documented case of a chain of destructive meteorological tsunamis occurring over a distance of thousands of kilometres. Our findings further demonstrate that these events represent potentially dangerous regional phenomena and should be included in tsunami warning systems. PMID:26119833

  6. Tsunami: ocean dynamo generator.

    PubMed

    Sugioka, Hiroko; Hamano, Yozo; Baba, Kiyoshi; Kasaya, Takafumi; Tada, Noriko; Suetsugu, Daisuke

    2014-01-08

    Secondary magnetic fields are induced by the flow of electrically conducting seawater through the Earth's primary magnetic field ('ocean dynamo effect'), and hence it has long been speculated that tsunami flows should produce measurable magnetic field perturbations, although the signal-to-noise ratio would be small because of the influence of the solar magnetic fields. Here, we report on the detection of deep-seafloor electromagnetic perturbations of 10-micron-order induced by a tsunami, which propagated through a seafloor electromagnetometer array network. The observed data yielded tsunami characteristics, including the direction and velocity of propagation as well as the sea-level change, providing the first verification of the induction theory. Presently, offshore observation systems for the early forecasting of tsunamis are based on sea-level measurements by seafloor pressure gauges. In terms of tsunami forecasting accuracy, the integration of vector electromagnetic measurements into existing scalar observation systems would represent a substantial improvement in the performance of tsunami early-warning systems.

  7. Dynamic Tsunami Data Assimilation (DTDA) Based on Green's Function: Theory and Application

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Satake, K.; Gusman, A. R.; Maeda, T.

    2017-12-01

    Tsunami data assimilation estimates the tsunami arrival time and height at Points of Interest (PoIs) by assimilating tsunami data observed offshore into a numerical simulation, without the need to calculate the initial sea-surface height at the source (Maeda et al., 2015). The previous tsunami data assimilation approach has two main problems: it requires a large computation time, because the tsunami wavefield of the entire region of interest is computed continuously, and it relies on a dense observation network such as the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET) in Japan or the Cascadia Initiative (CI) in North America (Gusman et al., 2016), which is not practical in some areas. Here we propose a new approach based on Green's functions to speed up the tsunami data assimilation process and to solve the problem of sparse observations: Dynamic Tsunami Data Assimilation (DTDA). If the residual between the observed and calculated tsunami height is not zero, there will be an assimilation response around the station, usually a Gaussian-distributed sea-surface displacement. The Green's function G_{i,j} is defined as the tsunami waveform at the j-th grid point caused by the propagation of the assimilation response at the i-th station. Hence, the forecasted waveforms at the PoIs are calculated as the superposition of the Green's functions. In the case of sparse observations, aircraft and satellite observations can be used. The previous assimilation approach is impractical here, because assimilating moving observations and computing the tsunami wavefield of the region of interest are time-consuming. In contrast, DTDA synthesizes the waveforms quickly, as long as the Green's functions are calculated in advance. We apply our method to a hypothetical earthquake off the west coast of Sumatra Island similar to the 2004 Indian Ocean earthquake. Currently there is no dense observation network in that area, making it difficult for the previous assimilation approach. We used DTDA with aircraft and satellite observations above the Indian Ocean to forecast the tsunami in Sri Lanka, India, and Thailand. The results show that DTDA provides reliable tsunami forecasting for these countries, and that the tsunami early warning can be issued half an hour before the tsunami arrives, reducing the damage along the coast.
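    The superposition step described above lends itself to a compact sketch: each station residual launches a scaled, time-shifted copy of that station's precomputed Green's function at the forecast point. This is an illustration of the stated idea under the assumption of a common time step; the optimal-interpolation weighting used in the actual assimilation is omitted.

```python
import numpy as np

def forecast_at_poi(residuals, green_functions):
    """Forecast waveform at a Point of Interest as a superposition of
    precomputed Green's functions.

    residuals       : (n_steps, n_stations) observed-minus-forecast tsunami
                      height at each station and assimilation step
    green_functions : (n_stations, n_lags) row i = waveform at the PoI caused
                      by a unit assimilation response at station i

    Each nonzero residual launches a scaled copy of the corresponding
    Green's function, delayed by the step at which it was assimilated.
    """
    n_steps, n_stations = residuals.shape
    n_lags = green_functions.shape[1]
    forecast = np.zeros(n_steps + n_lags)
    for k in range(n_steps):
        for i in range(n_stations):
            r = residuals[k, i]
            if r != 0.0:
                forecast[k:k + n_lags] += r * green_functions[i]
    return forecast
```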

  8. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes, may have only minutes of warning time, and today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real-time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoy, and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation, and runup. This will require cooperation with other real-time efforts around the Pacific Rim in terms of data sharing, analysis centers, and advisory bulletins to the responsible government agencies. The IAG's Global Geodetic Observing System (GGOS), in particular its natural hazards theme, provides a natural umbrella for achieving this objective.
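    The seismogeodetic combination mentioned above, collocated GNSS and MEMS accelerometers, is commonly framed as a Kalman filter in which acceleration drives the state prediction and GPS displacements correct it. The toy sketch below illustrates that framing for a single component; the loosely coupled structure and the noise variances are assumptions, not the READI analysis software.

```python
import numpy as np

def seismogeodetic_kf(gps_disp, accel, dt, q=1e-4, r_gps=1e-4):
    """Very simplified loosely coupled Kalman filter for one component:
    accelerometer samples drive the prediction of a [displacement, velocity]
    state, and collocated GPS displacements correct it.

    gps_disp : array of GPS displacements [m]; np.nan where no GPS sample
    accel    : array of accelerations [m/s^2], same sampling as the filter
    dt       : sample interval [s]
    q, r_gps : illustrative process/measurement noise variances, not tuned
               to any particular instrument pair.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
    B = np.array([0.5 * dt**2, dt])             # acceleration (control) input
    H = np.array([[1.0, 0.0]])                  # GPS observes displacement
    Q = q * np.eye(2)
    R = np.array([[r_gps]])

    x = np.zeros(2)                             # [displacement, velocity]
    P = np.eye(2)
    out = np.zeros(len(accel))

    for k in range(len(accel)):
        # Predict with the accelerometer sample
        x = F @ x + B * accel[k]
        P = F @ P @ F.T + Q
        # Update when a GPS displacement is available
        if not np.isnan(gps_disp[k]):
            y = gps_disp[k] - (H @ x)[0]
            S = H @ P @ H.T + R
            K = (P @ H.T) / S[0, 0]
            x = x + K[:, 0] * y
            P = (np.eye(2) - K @ H) @ P
        out[k] = x[0]
    return out
```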

  9. Incorporating Geodetic Processing Modules into a Real-Time Earthworm Environment to Enhance NOAA's Tsunami Warning Capability

    NASA Astrophysics Data System (ADS)

    Macpherson, K. A.

    2017-12-01

    The National Oceanic and Atmospheric Administration's National and Pacific Tsunami Warning Centers currently rely on traditional seismic data in order to detect and evaluate potential tsunamigenic earthquakes anywhere on the globe. The first information products disseminated by the centers following a significant seismic event are based solely on seismically derived earthquake locations and magnitudes, and are issued within minutes of the earthquake origin time. Thus, the rapid and reliable determination of the earthquake magnitude is a critical piece of information needed by the centers to generate the appropriate alert levels. However, seismically derived magnitudes of large events are plagued by well-known problems, particularly during the first few minutes following the origin time; near-source broadband instruments may go off scale, and magnitudes tend to saturate until sufficient teleseismic data arrive to represent the long-period signal that characterizes large events. In contrast, geodetic data such as high-rate Global Positioning System (hGPS) displacements and seismogeodetic data, a combination of collocated hGPS and accelerometer data, do not suffer from these limitations. These sensors stay on scale, even for large events, and they record both dynamic and static displacements that may be used to estimate magnitude without saturation. Therefore, there is an ongoing effort to incorporate these data streams into the operations of the tsunami warning centers to enhance current magnitude determination capabilities and, eventually, to invert the geodetic displacements for mechanism and finite-fault information. These latter quantities will be useful for tsunami modeling and forecasting. The tsunami warning centers rely on the Earthworm system for real-time data acquisition, so we have developed Earthworm modules for the Magnitude from Peak Ground Displacement (MPGD) algorithm, developed at the University of Washington and the University of California, Berkeley, and a module for a Static Offset Estimator algorithm that was developed by the NASA Jet Propulsion Laboratory. In this presentation we discuss the module architecture and show output computed by replaying both synthetic and historical scenarios in a simulated real-time Earthworm environment.
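    Magnitude-from-PGD algorithms of the kind named above typically invert an empirical scaling law of the form log10(PGD) = A + B*Mw + C*Mw*log10(R). The sketch below shows that inversion with placeholder coefficients of a plausible order of magnitude; it is not the MPGD module itself, and the operational coefficients may differ.

```python
import math

# Illustrative PGD scaling coefficients for
#   log10(PGD[cm]) = A + B*Mw + C*Mw*log10(R[km])
# (placeholders of the right order of magnitude, not operational values).
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm, hypocentral_distance_km):
    """Invert the scaling law for moment magnitude given one station's
    peak ground displacement and hypocentral distance."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypocentral_distance_km))

def network_magnitude(observations):
    """Average single-station estimates over a list of
    (pgd_cm, distance_km) pairs, as a crude network solution."""
    estimates = [magnitude_from_pgd(p, r) for p, r in observations]
    return sum(estimates) / len(estimates)
```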

  10. Source mechanisms of volcanic tsunamis.

    PubMed

    Paris, Raphaël

    2015-10-28

    Volcanic tsunamis are generated by a variety of mechanisms, including volcano-tectonic earthquakes, slope instabilities, pyroclastic flows, underwater explosions, shock waves and caldera collapse. In this review, we focus on the lessons that can be learnt from past events and address the influence of parameters such as volume flux of mass flows, explosion energy or duration of caldera collapse on tsunami generation. The diversity of waves in terms of amplitude, period, form, dispersion, etc. poses difficulties for integration and harmonization of sources to be used for numerical models and probabilistic tsunami hazard maps. In many cases, monitoring and warning of volcanic tsunamis remain challenging (further technical and scientific developments being necessary) and must be coupled with policies of population preparedness. © 2015 The Author(s).

  11. Tsunami evacuation mathematical model for the city of Padang

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kusdiantara, R.; Hadianti, R.; Badri Kusuma, M. S.

    2012-05-22

    A tsunami is a series of wave trains that travels at high speed across the sea surface. This traveling wave is caused by the displacement of a large volume of water after an underwater earthquake or volcanic eruption. The speed of a tsunami decreases as it approaches the shore, while its amplitude increases. Two large tsunamis occurred in Indonesia in recent decades, with huge casualties and severe damage. An Indonesian Tsunami Early Warning System has been installed along the west coast of Sumatra. This early warning system gives about 10-15 minutes to evacuate people from high-risk regions to safe areas. In this paper, a mathematical model for tsunami evacuation is presented with the city of Padang as a case study. In the model, the safe areas are chosen from selected existing high-rise buildings, low-risk regions with relatively high altitude, and a (proposed) flyover ring road. Each gathering point is located within a radius of approximately 1 km of the ring road. The model is formulated as an optimization problem with the total normalized evacuation time as the objective function. The constraints consist of the maximum allowable evacuation time on each route, the maximum capacity of each safe area, and the number of people to be evacuated. The optimization problem is solved numerically using the linear programming method with Matlab. Numerical results are shown for various evacuation scenarios for the city of Padang.
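    The optimization just described has the shape of a transportation-style linear program. The sketch below poses a simplified version of it (an illustration, not the paper's Matlab formulation): x[i, j] people from zone i are assigned to safe area j, total normalized evacuation time is minimized, capacities bound each safe area, every zone is fully evacuated, and routes exceeding the allowable time are penalized out of the solution.

```python
import numpy as np
from scipy.optimize import linprog

def evacuation_assignment(times, populations, capacities, t_max):
    """times[i, j]   : normalized evacuation time from zone i to safe area j
    populations[i]   : people to evacuate from zone i
    capacities[j]    : capacity of safe area j
    t_max            : maximum allowable evacuation time on any used route
    Returns the optimal assignment matrix x[i, j] (people)."""
    n_zones, n_safe = times.shape
    c = times.flatten()
    # Forbid routes longer than the allowable time with a large penalty cost.
    c = np.where(c > t_max, 1e6, c)

    # Equality constraints: each zone fully evacuated.
    A_eq = np.zeros((n_zones, n_zones * n_safe))
    for i in range(n_zones):
        A_eq[i, i * n_safe:(i + 1) * n_safe] = 1.0
    b_eq = populations

    # Inequality constraints: safe-area capacities not exceeded.
    A_ub = np.zeros((n_safe, n_zones * n_safe))
    for j in range(n_safe):
        A_ub[j, j::n_safe] = 1.0
    b_ub = capacities

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.x.reshape(n_zones, n_safe)
```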

  12. The GNSS data processing component within the Indonesian tsunami early warning centre provided by GITEWS

    NASA Astrophysics Data System (ADS)

    Bartsch, M.; Merx, A.; Falck, C.; Ramatschi, M.

    2010-05-01

    Introduction: Within the GITEWS (German Indonesian Tsunami Early Warning System) project, a near real-time GNSS processing system has been developed, which analyzes on- and offshore GNSS measurements. It is the first system of its kind to be integrated into an operational tsunami early warning system (the Indonesian Tsunami Early Warning Centre, INATEWS, inaugurated at BMKG Jakarta on 11 November 2008). Brief system description: The GNSS data to be processed are received from sensors (GNSS antenna and receiver) installed on buoys, at tide gauges, and as real-time reference stations (RTR stations), either stand-alone or co-located with seismic sensors. The GNSS data are transmitted to the warning centre in real time as a stream (RTR stations) or file-based and are processed in a near real-time data processing chain. The fully automated system uses the BERNESE GPS software as its processing core. Kinematic coordinate time series with a resolution of 1 Hz (land-based stations) and 1/3 Hz (buoys) are estimated every five minutes. In case of a recently occurred earthquake, the processing interval decreases from five to two minutes. All stations are processed with the relative technique (baseline technique) using GITEWS stations and stations available via the IGS as references. The most suitable reference stations are chosen by querying a database in which continuously monitored quality data of the GNSS observations are stored. In case of an earthquake, at least one reference station should be located on a different tectonic plate to ensure that relative movements can be detected. The primary source for satellite orbit information is the IGS IGU product. If this source is not available for any reason, the system switches automatically to other orbit sources such as CODE products or broadcast ephemeris data. For sensors on land, the kinematic coordinates are used to detect deviations from their normal, mean coordinates. These deviations, or so-called displacements, are indicators of land mass movements which can occur, e.g., due to strong earthquakes. The ground motion information is a valuable source for a fast understanding of an earthquake's mechanism and consequences, with possible relevance for a potentially following tsunami. Regarding the kinematic coordinates of a buoy, only the vertical component is of interest, as it corresponds to the instantaneous sea level. The kinematic coordinates are delivered to an oceanographic post-processing unit which applies dipping, tilting, and tidal corrections to the data. Deviations from the mean sea level are an indicator of a possibly passing tsunami wave. In this way the GNSS system supports the decision-making process as to whether a tsunami has been triggered or not. A graphical user interface (GUI) was developed which monitors the whole processing chain, from data transmission and GNSS data processing to the display of the kinematic coordinate time series. It supports both a quick overview for all staff members at the warning centre (24h/7d shifts) and deeper analysis by GNSS experts. The GNSS GUI system is web-based and allows all views to be displayed on different screens at the same time, even at remote locations. This is part of the concept, as it can support the dialogue between warning centre staff on duty or on standby and sensor station maintenance staff.
Acknowledgements: The GITEWS project (German Indonesian Tsunami Early Warning System) is carried out by a large group of scientists and engineers from the German Research Centre for Geosciences (GFZ) and its partners from the German Aerospace Centre (DLR), the Alfred Wegener Institute for Polar and Marine Research (AWI), the GKSS Research Centre, the Konsortium Deutsche Meeresforschung (KDM), the Leibniz Institute for Marine Sciences (IFM-GEOMAR), the United Nations University (UNU), the Federal Institute for Geosciences and Natural Resources (BGR), the German Agency for Technical Cooperation (GTZ) and other international partners. The most relevant partners in Indonesia with respect to the GNSS component of GITEWS are the National Coordinating Agency for Surveys and Mapping (BAKOSURTANAL), the National Meteorology and Geophysics Agency (BMG) and the National Agency for the Assessment and Application of Technology (BPPT). Funding is provided by the German Federal Ministry for Education and Research (BMBF), Grant 03TSU01.

  13. Augmenting Onshore GPS Displacements with Offshore Observations to Improve Slip Characterization for Cascadia Subduction Earthquakes

    NASA Astrophysics Data System (ADS)

    Saunders, J. K.; Haase, J. S.

    2017-12-01

    The rupture location of a Mw 8 megathrust earthquake can dramatically change the near-source tsunami impact, where a shallow earthquake can produce a disproportionately large tsunami for its magnitude. Because the locking pattern of the shallow Cascadia megathrust is unconstrained due to the lack of widespread seafloor geodetic observations, near-source tsunami early warning systems need to be able to identify shallow, near-trench earthquakes. Onshore GPS displacements provide low-frequency ground motions and coseismic offsets for characterizing tsunamigenic earthquakes; however, the one-sided distribution of data may not uniquely determine the rupture region. We examine how augmenting the current real-time GPS network in Cascadia with different offshore station configurations improves static slip inversion solutions for Mw 8 earthquakes at different rupture depths. Two offshore coseismic data types are tested in this study: vertical-only, which would be available using existing technology for bottom pressure sensors, and all-component, which could be achieved by combining pressure sensors with real-time GPS-Acoustic observations. We find that both types of offshore data better constrain the rupture region for a shallow earthquake compared to onshore data alone when offshore stations are located above the rupture. However, inversions using vertical-only offshore data tend to underestimate the amount of slip for a shallow rupture, which we show underestimates the tsunami impact. Including offshore horizontal coseismic data in the inversions improves the slip solutions for a given offshore station configuration, especially in terms of maximum slip. This suggests that while real-time GPS-Acoustic sensors may have a long development timeline, they will have more impact for inversion-based tsunami early warning systems than bottom pressure sensors. We also conduct sensitivity studies using kinematic models with varying rupture speeds and rise times as a proxy for expected rigidity changes with depth along the megathrust. We find distinguishing features in displacement waveforms that can be used to infer the primary rupture region. We discuss how kinematic inversion methods that use these characteristics in high-rate GPS data could be applied to the Cascadia subduction zone.
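    Static slip inversions of the kind compared above are commonly solved as regularized linear least-squares problems once elastic Green's functions have been computed for the fault discretization. The block below is a generic sketch of that step under those assumptions; it is not the study's inversion code, and positivity constraints, data weighting, and fault geometry handling are left out.

```python
import numpy as np

def invert_static_slip(G, d, smoothing, L=None):
    """Damped/smoothed least-squares slip inversion.

    G         : (n_obs, n_patches) elastic Green's functions mapping unit slip
                on each fault patch to the observed displacement components
    d         : (n_obs,) observed coseismic offsets (onshore GPS and, when
                available, offshore vertical or three-component data)
    smoothing : scalar weight on the regularization term
    L         : optional (m, n_patches) roughening operator (identity if None)

    Solves min ||G s - d||^2 + smoothing^2 ||L s||^2 by stacking the
    regularization rows under the data rows.
    """
    n_patches = G.shape[1]
    if L is None:
        L = np.eye(n_patches)
    G_aug = np.vstack([G, smoothing * L])
    d_aug = np.concatenate([d, np.zeros(L.shape[0])])
    slip, *_ = np.linalg.lstsq(G_aug, d_aug, rcond=None)
    return slip
```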

  14. Ionospheric detection of tsunami earthquakes: observation, modeling and ideas for future early warning

    NASA Astrophysics Data System (ADS)

    Occhipinti, G.; Manta, F.; Rolland, L.; Watada, S.; Makela, J. J.; Hill, E.; Astafieva, E.; Lognonne, P. H.

    2017-12-01

    Detection of ionospheric anomalies following the Sumatra and Tohoku earthquakes (e.g., Occhipinti 2015) demonstrated that the ionosphere is sensitive to earthquake and tsunami propagation: ground and oceanic vertical displacement induces acoustic-gravity waves that propagate within the neutral atmosphere and are detectable in the ionosphere. Observations supported by modelling proved that ionospheric anomalies related to tsunamis are deterministic and reproducible by numerical modeling via the ocean/neutral-atmosphere/ionosphere coupling mechanism (Occhipinti et al., 2008). To show that the tsunami signature in the ionosphere is routinely detected, we present perturbations of total electron content (TEC) measured by GPS following tsunamigenic earthquakes from 2004 to 2011 (Rolland et al. 2010, Occhipinti et al., 2013), namely Sumatra (26 December 2004 and 12 September 2007), Chile (14 November 2007), Samoa (29 September 2009) and the recent Tohoku-Oki event (11 March 2011). Based on the observations close to the epicenter, mainly performed by GPS networks located in Sumatra, Chile and Japan, we highlight the TEC perturbation observed within the first 8 min after the seismic rupture. This perturbation contains information about the ground displacement, as well as the consequent sea-surface displacement resulting in the tsunami. In addition to GNSS-TEC observations close to the epicenter, new far-field measurements performed by airglow imaging in Hawaii show the propagation of the internal gravity waves induced by the Tohoku tsunami (Occhipinti et al., 2011). This revolutionary imaging technique is today supported by two new observations of moderate tsunamis: Queen Charlotte (M: 7.7, 27 October, 2013) and Chile (M: 8.2, 16 September 2015). We finally detail our recent work (Manta et al., 2017) on the tsunami alert failure following the Mw7.8 Mentawai event (25 October, 2010) and the twin tsunami alert response following the Mw7.8 Benyak event (2010). In this talk we present all these new tsunami observations in the ionosphere and discuss, in the light of modelling, the potential role of ionospheric sounding by GNSS-TEC and airglow cameras in oceanic monitoring and future tsunami warning systems. All ref. here @ www.ipgp.fr/ ninto

  15. A short history of tsunami research and countermeasures in Japan.

    PubMed

    Shuto, Nobuo; Fujima, Koji

    2009-01-01

    Tsunami science and engineering began in Japan, the country most frequently hit by local and distant tsunamis. The gate to tsunami science was opened in 1896 by a giant local tsunami with a maximum run-up height of 38 m that claimed 22,000 lives. The crucial key was a tide record, which led to the conclusion that this tsunami was generated by a "tsunami earthquake". In 1933, the same area was hit again by another giant tsunami. A total system of tsunami disaster mitigation including 10 "hard" and "soft" countermeasures was proposed; relocation of dwelling houses to high ground was the major countermeasure. Tsunami forecasting began in 1941. In 1960, the Chilean Tsunami damaged the whole Japanese Pacific coast. The height of this tsunami was 5-6 m at most. The countermeasures were the construction of structures, including the tsunami breakwater, which was the first of its kind in the world. Since the late 1970s, tsunami numerical simulation was developed in Japan and refined to become the UNESCO standard scheme that was transferred to 22 different countries. In 1983, photos and videos of a tsunami in the Japan Sea revealed many faces of tsunamis, such as soliton fission and edge bores. The 1993 tsunami devastated a town protected by seawalls 4.5 m high. This experience reintroduced the idea of comprehensive countermeasures, consisting of defense structures, tsunami-resistant town development, and evacuation based on warnings.

  16. Improvements in NOAA's Operational Tsunameter Network since December 2004

    NASA Astrophysics Data System (ADS)

    Bouchard, R.; Kohler, C.; McArthur, S.; Burnett, W. H.; Wells, W. I.; Luke, R.

    2009-12-01

    In December 2004, during the devastating Sumatran Tsunami, the National Oceanic and Atmospheric Administration (NOAA) had five tsunameter stations established in the North Pacific Ocean and one in the South Pacific Ocean, operated and maintained by NOAA's National Data Buoy Center (NDBC). The original six tsunameters employed the technology of the first-generation Deep-ocean Assessment and Reporting of Tsunamis (DART I) systems developed by NOAA's Pacific Marine Environmental Laboratory (PMEL) and successfully transitioned to NDBC in 2003. The technology consists of a Bottom Pressure Recorder (BPR) that makes pressure measurements near the seafloor and a surface buoy. It takes less than three minutes for data to get from the BPR, which can reside at depths of up to 6000 m, to users. The BPR contains a tsunami detection algorithm that places the BPR in rapid reporting mode (also known as Event Mode). The two most profound improvements to the network were its expansion to 39 stations and the transition and upgrade to the second-generation DART II systems. In the aftermath of the Sumatran Tsunami, NOAA expanded the network to 39 stations to bolster the US tsunami warning system by providing coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico with faster and more accurate tsunami warnings. Cooperating NOAA offices selected the sites in consultation with the US Geological Survey and other interested parties. Since their initial establishment, NDBC has relocated some stations to improve data availability by reducing the risks of vessel collision, extreme winds, seas, and currents. NDBC completed the network in March 2008. During the expansion of the NOAA network, NDBC assisted several countries in deploying, and distributing data from, their own DART II tsunameters. NDBC completed the upgrade of all stations to the DART II systems by the end of 2007. The significant capability fielded by the DART II technology was the bi-directional communication between the BPR and the TWCs via the surface buoy. This capability allows the TWCs to set the tsunameters in Event Mode (known as a Manual Trigger) in anticipation of the passage of a tsunami. The TWCs can also retrieve the full-resolution data in one-hour increments directly from the BPR and set the tsunami detection threshold. A further upgrade being fielded in 2009 and 2010 allows the TWCs to interrupt an ongoing Event Mode and restart the Event Mode to extend the rapid reporting period, and to identify Manual Triggers from the message identifier rather than from an artificial 100 mm addition to water-column heights. Improvements also occurred in data distribution. In 2008, in response to greater participation by the international community in the application of tsunameter data, NDBC moved towards standardizing its tsunameter data messages and bulletins. Further improvements include evaluation of the use of Easy-To-Deploy DART systems and autonomous vehicles for rapid mitigation response for failed stations, and the development of international standardization of data message formats. Emphasis has been placed on mooring failure analysis and investigations into improving mooring reliability and subsequent data availability.

  17. Tsunami Casualty Model

    NASA Astrophysics Data System (ADS)

    Yeh, H.

    2007-12-01

    More than 4,500 deaths caused by tsunamis were recorded in the 1990s. For example, the 1992 Flores Tsunami in Indonesia claimed at least 1,712 lives, and more than 2,182 people were killed by the 1998 Papua New Guinea Tsunami. Such staggering death tolls have been totally overshadowed by the 2004 Indian Ocean Tsunami, which claimed more than 220,000 lives. Unlike hurricanes, which are often evaluated by economic losses, the death count is the primary measure of tsunami hazard. This is partly because tsunamis kill more people owing to the short lead time for warning. Although exact death tallies are not available for most tsunami events, there are gender and age disparities in tsunami casualties. The significant gender difference among the victims of the 2004 Indian Ocean Tsunami was attributed to women's social norms and role behavior, as well as cultural bias toward women's inability to swim. Here we develop a rational casualty model based on the human limit to withstand tsunami flows. Its application to simple tsunami runup cases demonstrates that biological and physiological disadvantages also make a significant difference in casualty rate. It further demonstrates that the gender and age disparities in casualties become most pronounced when the tsunami is marginally strong, and that the difference tends to diminish as tsunami strength increases.
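    The abstract does not state the model's equations, so the following is a purely hypothetical sketch of the kind of stability criterion such a casualty model might rest on: a person is assumed swept away when hydrodynamic drag exceeds the frictional resistance of their buoyancy-reduced weight. Every parameter value below is an assumption for illustration, not taken from the study.

```python
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravity, m/s^2

def drag_force(velocity, depth, body_width=0.4, drag_coeff=1.1):
    """Hydrodynamic drag on the submerged part of a standing person,
    F = 0.5 * rho * Cd * A * u^2, with frontal area = width * flow depth."""
    area = body_width * depth
    return 0.5 * RHO * drag_coeff * area * velocity ** 2

def is_swept_away(velocity, depth, mass=60.0, height=1.65, friction=0.4):
    """Crude stability criterion: the person is swept away when drag exceeds
    friction from their effective (buoyancy-reduced) weight. Body volume is
    approximated from mass assuming near-neutral density."""
    submerged_fraction = min(depth / height, 1.0)
    buoyancy = RHO * G * (mass / 1000.0) * submerged_fraction
    effective_weight = max(mass * G - buoyancy, 0.0)
    return drag_force(velocity, depth) > friction * effective_weight

# Example: a 1 m deep flow at 2 m/s would sweep away a 60-kg adult.
print(is_swept_away(velocity=2.0, depth=1.0))
```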

  18. The Mexican Seismic Network (Red Sísmica Mexicana)

    NASA Astrophysics Data System (ADS)

    Valdes-Gonzales, C. M.; Arreola-Manzano, J.; Castelan-Pescina, G.; Alonso-Rivera, P.; Saldivar-Rangel, M. A.; Rodriguez-Arteaga, O. O.; Lopez-Lena-Villasana, R.

    2014-12-01

    The Mexican Seismic Network (Red Sísmica Mexicana) was created to provide sufficient and timely information for decision-making aimed at mitigating seismic and tsunami risk. It was a Mexican government initiative headed by CENAPRED (National Disaster Prevention Center), which brought academic institutions and civil agencies together to work through a collaboration agreement. The network is supported by the Universidad Nacional Autónoma de México (UNAM) and its seismic networks (broadband and strong motion), the Centro de Instrumentación y Registro Sísmico (CIRES) with its Earthquake Early Warning System covering the Guerrero Gap and Oaxaca earthquakes, the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE) with its expertise in tsunami observation, and the Secretaría de Marina (SEMAR), which monitors sea level and operates the Mexican Tsunami Warning Center. The participating institutions are committed to exchanging and sharing data and to advising the Civil Protection authorities.

  19. Tsunami Preparedness Along the U.S. West Coast (video)

    USGS Publications Warehouse

    Filmed and edited by: Loeffler, Kurt; Gesell, Justine

    2010-01-01

    Tsunamis are a constant threat to the coasts of our world. Although tsunamis are infrequent along the West coast of the United States, it is possible and necessary to prepare for potential tsunami hazards to minimize loss of life and property. Community awareness programs are important, as they strive to create an informed society by providing education and training. This video about tsunami preparedness along the West coast distinguishes between a local tsunami and a distant event and focuses on the specific needs of each region. It offers guidelines for correct tsunami response and community preparedness from local emergency managers, first-responders, and leading experts on tsunami hazards and warnings, who have been working on ways of making the tsunami affected regions safer for the people and communities on a long-term basis. This video was produced by the US Geological Survey (USGS) in cooperation with the California Emergency Management Agency (CalEMA), Oregon Department of Geology and Mineral Industries (DOGAMI), Washington Emergency Management Division (EMD), Marin Office of Emergency Services, and Pacific Gas and Electric (PG&E).

  20. Parallelization of the Coupled Earthquake Model

    NASA Technical Reports Server (NTRS)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis over the Internet had never been done before. The new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  1. Brief Communication: Understanding disasters and early-warning systems

    NASA Astrophysics Data System (ADS)

    Castaños, H.; Lomnitz, C.

    2014-08-01

    This paper discusses some methodological questions on understanding disasters. Destructive earthquakes continue to claim thousands of lives. Tsunamis may be caused by recoil of the upper plate. Darwin's twin-epicenter hypothesis is applied to a theory of tsunamis. The ergodicity hypothesis may help to estimate the return periods of extremely rare events. A social science outline on the causation of the Tôhoku nuclear disaster is provided.

  2. Brief Communication: Understanding disasters and early-warning systems

    NASA Astrophysics Data System (ADS)

    Castaños, H.; Lomnitz, C.

    2014-12-01

    This paper discusses some methodological questions on understanding disasters. Destructive earthquakes continue to claim thousands of lives. Tsunamis may be caused by recoil of the upper plate. Darwin's twin-epicenter hypothesis is applied to a theory of tsunamis. The ergodicity hypothesis may help to estimate the return periods of extremely rare events. A social science outline on the causation of the Tôhoku nuclear disaster is provided.

  3. An Evaluation of Infrastructure for Tsunami Evacuation in Padang, West Sumatra, Indonesia (Invited)

    NASA Astrophysics Data System (ADS)

    Cedillos, V.; Canney, N.; Deierlein, G.; Diposaptono, S.; Geist, E. L.; Henderson, S.; Ismail, F.; Jachowski, N.; McAdoo, B. G.; Muhari, A.; Natawidjaja, D. H.; Sieh, K. E.; Toth, J.; Tucker, B. E.; Wood, K.

    2009-12-01

Padang has one of the world’s highest tsunami risks due to its high hazard, vulnerable terrain and population density. The current strategy to prepare for tsunamis in Padang is focused on developing early warning systems, planning evacuation routes, conducting evacuation drills, and raising local awareness. Although these are all necessary, they are insufficient. Padang’s proximity to the Sunda Trench and flat terrain make reaching safe ground impossible for much of the population. The natural warning in Padang - a strong earthquake that lasts over a minute - will be the first indicator of a potential tsunami. People will have about 30 minutes after the earthquake to reach safe ground. It is estimated that roughly 50,000 people in Padang will be unable to evacuate in that time. Given these conditions, other means to prepare for the expected tsunami must be developed. With this motivation, GeoHazards International and Stanford University’s Chapter of Engineers for a Sustainable World partnered with Indonesian organizations - Andalas University and Tsunami Alert Community in Padang, Laboratory for Earth Hazards, and the Ministry of Marine Affairs and Fisheries - in an effort to evaluate the need for and feasibility of tsunami evacuation infrastructure in Padang. Tsunami evacuation infrastructure can include earthquake-resistant bridges and evacuation structures that rise above the maximum tsunami water level, and can withstand the expected earthquake and tsunami forces. The choices for evacuation structures vary widely - new and existing buildings, evacuation towers, soil berms, elevated highways and pedestrian overpasses. This interdisciplinary project conducted a course at Stanford University, undertook several field investigations, and concluded that: (1) tsunami evacuation structures and bridges are essential to protect the people in Padang, (2) there is a need for a more thorough engineering-based evaluation than has been conducted to date of the suitability of existing buildings to serve as evacuation structures, and of existing bridges to serve as elements of evacuation routes, and (3) additions to Padang’s tsunami evacuation infrastructure must carefully take into account technical matters (e.g. expected wave height, debris impact forces), social considerations (e.g. cultural acceptability, public’s confidence in the structure’s integrity), and political issues (e.g. land availability, cost, maintenance). Future plans include collaboration between U.S. and Indonesian engineers in developing designs for new tsunami evacuation structures, as well as providing training for Indonesian authorities on: (1) siting, designing, and constructing tsunami evacuation structures, and (2) evaluating the suitability of existing buildings to serve as tsunami evacuation shelters.

  4. Tsunami impact to Washington and northern Oregon from segment ruptures on the southern Cascadia subduction zone

    USGS Publications Warehouse

    Priest, George R.; Zhang, Yinglong; Witter, Robert C.; Wang, Kelin; Goldfinger, Chris; Stimely, Laura

    2014-01-01

    This paper explores the size and arrival of tsunamis in Oregon and Washington from the most likely partial ruptures of the Cascadia subduction zone (CSZ) in order to determine (1) how quickly tsunami height declines away from sources, (2) evacuation time before significant inundation, and (3) extent of felt shaking that would trigger evacuation. According to interpretations of offshore turbidite deposits, the most frequent partial ruptures are of the southern CSZ. Combined recurrence of ruptures extending ~490 km from Cape Mendocino, California, to Waldport, Oregon (segment C) and ~320 km from Cape Mendocino to Cape Blanco, Oregon (segment D), is ~530 years. This recurrence is similar to frequency of full-margin ruptures on the CSZ inferred from paleoseismic data and to frequency of the largest distant tsunami sources threatening Washington and Oregon, ~Mw 9.2 earthquakes from the Gulf of Alaska. Simulated segment C and D ruptures produce relatively low-amplitude tsunamis north of source areas, even for extreme (20 m) peak slip on segment C. More than ~70 km north of segments C and D, the first tsunami arrival at the 10-m water depth has an amplitude of <1.9 m. The largest waves are trapped edge waves with amplitude ≤4.2 m that arrive ≥2 h after the earthquake. MM V–VI shaking could trigger evacuation of educated populaces as far north as Newport, Oregon for segment D events and Grays Harbor, Washington for segment C events. The NOAA and local warning systems will be the only warning at greater distances from sources.

  5. The Application of Speaker Recognition Techniques in the Detection of Tsunamigenic Earthquakes

    NASA Astrophysics Data System (ADS)

    Gorbatov, A.; O'Connell, J.; Paliwal, K.

    2015-12-01

Tsunami warning procedures adopted by national tsunami warning centres largely rely on the classical approach of earthquake location, magnitude determination, and the consequent modelling of tsunami waves. Although this approach is based on known physical theories of earthquake and tsunami generation, it has an inherent shortcoming: minimum seismic data requirements must be satisfied before those physical parameters can be estimated. At least four seismic stations are necessary to locate the earthquake, and approximately 10 minutes of seismic waveform observation are needed to reliably estimate the magnitude of a large earthquake similar to the 2004 Indian Ocean Tsunami Earthquake of M9.2. Consequently, the total time to tsunami warning can exceed half an hour. In an attempt to reduce the time to tsunami alert, a new approach is proposed based on the classification of tsunamigenic and non-tsunamigenic earthquakes using speaker recognition techniques. A Tsunamigenic Dataset (TGDS) was compiled to promote the development of machine learning techniques for seismic trace analysis and, in particular, tsunamigenic event detection, and to compare them to existing seismological methods. The TGDS contains 227 offshore events (87 tsunamigenic and 140 non-tsunamigenic earthquakes with M≥6) from Jan 2000 to Dec 2011, inclusive. A Support Vector Machine classifier using a radial-basis function kernel was applied to spectral features derived from 400 s frames of three-component, 1-Hz broadband seismometer data. Ten-fold cross-validation was used during training to choose classifier parameters. Voting was applied to the classifier predictions from each station to form an overall prediction for an event. The F1 score (harmonic mean of precision and recall) was chosen to rate each classifier because it provides a compromise between type-I and type-II errors and accounts for the imbalance between the number of events in the tsunamigenic and non-tsunamigenic classes. The described classifier achieved an F1 score of 0.923, with tsunamigenic classification precision and recall/sensitivity of 0.928 and 0.919, respectively. The system requires a minimum of 3 stations with ~400 seconds of data each to make a prediction. The accuracy improves as further stations and data become available.
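
    The classification stage described above can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the spectral feature extraction, parameter grid and threshold are assumptions, and only the ingredients named in the abstract (RBF-kernel SVM, 10-fold cross-validation, per-station voting) are reproduced.

```python
# Minimal sketch: RBF-kernel SVM on spectral features of 400 s, three-component
# seismic frames, with 10-fold cross-validation for parameter selection and
# per-station majority voting. Feature extraction and data layout are
# placeholder assumptions, not the TGDS pipeline itself.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV


def spectral_features(frame_3c, fs=1.0, n_bins=32):
    """Average log power-spectral-density bins over the three components."""
    psds = [welch(tr, fs=fs, nperseg=256)[1] for tr in frame_3c]
    psd = np.mean(psds, axis=0)
    segments = np.array_split(np.log10(psd + 1e-12), n_bins)
    return np.array([seg.mean() for seg in segments])


def train_classifier(frames, labels):
    """frames: list of (3, n_samples) arrays; labels: 1 tsunamigenic, 0 otherwise."""
    X = np.vstack([spectral_features(f) for f in frames])
    grid = GridSearchCV(SVC(kernel="rbf"),
                        {"C": [1, 10, 100], "gamma": ["scale", 0.01, 0.1]},
                        cv=10, scoring="f1")
    grid.fit(X, labels)
    return grid.best_estimator_


def classify_event(model, station_frames):
    """Majority vote over per-station predictions for one event."""
    votes = [model.predict(spectral_features(f)[None, :])[0] for f in station_frames]
    return int(np.mean(votes) >= 0.5)
```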

  6. Development of algorithms for tsunami detection by High Frequency Radar based on modeling tsunami case studies in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Grilli, Stéphan; Guérin, Charles-Antoine; Grosdidier, Samuel

    2015-04-01

Where coastal tsunami hazard is governed by near-field sources, Submarine Mass Failures (SMFs) or earthquakes, tsunami propagation times may be too short for detection based on deep or shallow water buoys. To offer sufficient warning time, it has been proposed by others to implement early warning systems relying on High Frequency Surface Wave Radar (HFSWR) remote sensing, which provides dense spatial coverage far offshore. A new HFSWR, referred to as STRADIVARIUS, has recently been deployed by Diginext Inc. to cover the "Golfe du Lion" (GDL) in the Western Mediterranean Sea. This radar, which operates at 4.5 MHz, uses a proprietary phase-coding technology that allows detection up to 300 km in a bistatic configuration (with a baseline of about 100 km). Although the primary purpose of the radar is vessel detection in relation to homeland security, it can also be used for ocean current monitoring. The current caused by an arriving tsunami shifts the Bragg frequency by a value proportional to a component of its velocity, which can easily be obtained from the Doppler spectrum of the HFSWR signal. Using state-of-the-art tsunami generation and propagation models, we modeled tsunami case studies in the western Mediterranean basin (both seismic and SMFs) and simulated the HFSWR backscattered signal that would be detected for the entire GDL and beyond. Based on the simulated HFSWR signal, we developed two types of tsunami detection algorithms: (i) one based on standard Doppler spectra, for which we found that, to be detectable within the environmental and background current noise, the Doppler shift requires tsunami currents of at least 10-15 cm/s, which typically occur only on the continental shelf in fairly shallow water; (ii) to allow earlier detection, a second algorithm computes correlations of the HFSWR signals at two distant locations, shifted in time by the tsunami propagation time between these locations (easily computed from bathymetry). We found that this second method allowed detection for currents as low as 5 cm/s, i.e., in deeper water beyond the shelf and further from the coast, thus allowing earlier detection.
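
    A rough sketch of the second detection algorithm (method ii above) is given below: radial-current series from two radar cells are correlated after one is delayed by the tsunami travel time between them, estimated from the shallow-water speed sqrt(g*h). The sampling interval, threshold and path discretization are illustrative assumptions, not values from the study.

```python
# Correlation-based detector sketch: correlate currents at cell A with currents
# at cell B delayed by the tsunami travel time along the path between them.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2


def travel_time(path_depths_m, path_step_m):
    """Tsunami travel time (s) along a path discretized at a constant step."""
    speeds = np.sqrt(G * np.asarray(path_depths_m, dtype=float))
    return float(np.sum(path_step_m / speeds))


def lagged_correlation(current_a, current_b, dt_s, lag_s):
    """Normalized correlation of series A with series B delayed by lag_s."""
    shift = int(round(lag_s / dt_s))
    a = np.asarray(current_a, dtype=float)[: len(current_a) - shift]
    b = np.asarray(current_b, dtype=float)[shift: shift + len(a)]
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))


def detect(current_a, current_b, dt_s, lag_s, threshold=0.6):
    """Flag a possible tsunami when the lagged correlation exceeds a threshold."""
    return lagged_correlation(current_a, current_b, dt_s, lag_s) > threshold
```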

  7. Improvement of real-time seismic magnitude estimation by combining seismic and geodetic instrumentation

    NASA Astrophysics Data System (ADS)

    Goldberg, D.; Bock, Y.; Melgar, D.

    2017-12-01

Rapid seismic magnitude assessment is a top priority for earthquake and tsunami early warning systems. For the largest earthquakes, seismic instrumentation tends to underestimate the magnitude, leading to insufficient early warning, particularly in the case of tsunami evacuation orders. GPS instrumentation provides more accurate magnitude estimates from near-field stations, but it is not sensitive enough to detect the first seismic wave arrivals, which limits solution speed. By optimally combining collocated seismic and GPS instruments, we demonstrate improved solution speed of earthquake magnitude for the largest seismic events. We present a real-time implementation of magnitude-scaling relations that adapts to the length of the recording, reflecting the observed evolution of ground motion with time.
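
    The abstract does not give the scaling relations themselves, so the sketch below only illustrates the general idea under stated assumptions: a peak-ground-displacement (PGD) relation of the form Mw = (log10(PGD) - A - C*log10(R)) / B whose coefficients are switched according to how much of the record is available. The coefficient values and table structure are placeholders, not the calibrated relations of the study.

```python
# Record-length-adaptive magnitude sketch from peak ground displacement (PGD).
# Coefficients below are placeholders for illustration only.
import numpy as np

# (elapsed_seconds, A, B, C) -- hypothetical coefficient table
COEFF_TABLE = [
    (30.0,  -5.0, 1.10, -0.15),
    (60.0,  -4.9, 1.15, -0.16),
    (120.0, -4.8, 1.20, -0.17),
]


def coefficients(elapsed_s):
    """Pick the coefficient row for the longest elapsed time reached so far."""
    row = COEFF_TABLE[0]
    for candidate in COEFF_TABLE:
        if elapsed_s >= candidate[0]:
            row = candidate
    return row[1:]


def magnitude_from_pgd(pgd_cm, hypo_dist_km, elapsed_s):
    """Median Mw over stations from PGD scaling, adapted to record length."""
    a, b, c = coefficients(elapsed_s)
    pgd = np.asarray(pgd_cm, dtype=float)
    r = np.asarray(hypo_dist_km, dtype=float)
    mw = (np.log10(pgd) - a - c * np.log10(r)) / b
    return float(np.median(mw))
```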

  8. A short history of tsunami research and countermeasures in Japan

    PubMed Central

    Shuto, Nobuo; Fujima, Koji

    2009-01-01

Tsunami science and engineering began in Japan, the country most frequently hit by local and distant tsunamis. The gate to tsunami science was opened in 1896 by a giant local tsunami with a maximum run-up height of 38 m that claimed 22,000 lives. The crucial key was a tide record, which led to the conclusion that this tsunami was generated by a “tsunami earthquake”. In 1933, the same area was hit again by another giant tsunami. A total system of tsunami disaster mitigation including 10 “hard” and “soft” countermeasures was proposed. Relocation of dwelling houses to high ground was the major countermeasure. Tsunami forecasting began in 1941. In 1960, the Chilean Tsunami damaged the whole Japanese Pacific coast. The height of this tsunami was 5–6 m at most. The countermeasures were the construction of structures, including the tsunami breakwater, the first of its kind in the world. Since the late 1970s, tsunami numerical simulation has been developed in Japan and refined to become the UNESCO standard scheme that was transferred to 22 different countries. In 1983, photos and videos of a tsunami in the Japan Sea revealed many faces of tsunamis such as soliton fission and edge bores. The 1993 tsunami devastated a town protected by seawalls 4.5 m high. This experience reintroduced the idea of comprehensive countermeasures, consisting of defense structures, tsunami-resistant town development and evacuation based on warning. PMID:19838008

  9. Tsunami hazard assessment along the French Mediterranean coast : detailed modeling of tsunami impacts for the ALDES project

    NASA Astrophysics Data System (ADS)

    Quentel, E.; Loevenbruck, A.; Hébert, H.

    2012-04-01

The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional center, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims to examine the tsunami risk related to earthquakes and landslides along the French Mediterranean coast. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them: the French Riviera. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the west Mediterranean coast but are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian and the South Tyrrhenian Seas are considered the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the appropriate DTMs (Digital Terrain Models). The ALDES project allows the SHOM and the IGN to conduct high-resolution data acquisition in the Litto3D framework for 2 sites, one west of the Gulf of Lion (3 m) and one west of the French Riviera (3 m). DTMs of the third site, centered on the Antibes Cape, are built using pre-existing data sets with lower resolution (10 m). Detailed models for the selected sites are then performed based on high-resolution bathymetric and topographic data; they provide estimates of water heights and currents, inundation distances and run-up elevations. This analysis points out the most exposed places and the morphologic features prone to amplify potential waves and to generate significant coastal effects. Our set of simulations gives an evaluation of the expected maximum impact distribution and highlights places, such as specific beaches or harbors, where mitigation measures must be given priority.

  10. Regional Impact of the 29 September 2009 North Tonga Tsunami on the Futuna and Alofi Islands (Wallis & Futuna)

    NASA Astrophysics Data System (ADS)

    Lamarche, G.; Pelletier, B.; Goff, J. R.

    2009-12-01

The north Tonga earthquake occurred at 5:48 am on 30 September, Futuna local time; Futuna lies ~650 km west of the epicentre. The PTWC issued a warning at 6:04 am for tsunami arrival in Wallis (Wallis & Futuna) at 6:35 am. No warning was issued by the territorial authorities for Wallis or for Futuna, located 230 km to the south-west. There was no reported tsunami on Wallis. However, a tsunami hit the archipelago of Futuna (islands of Futuna and Alofi) between 7:00 and 7:20 am on 30 September. The tide was approximately three-quarters out. We took advantage of an 8-day survey funded by the French Ministry of Foreign Affairs, previously planned for investigating palaeotsunamis on Futuna and Alofi. We measured run-up and inundation from the mid- to low-tide mark, as well as flow depths and sediments associated with the 30 September tsunami, at 41 sites around the islands. Run-ups were estimated based on visual evidence of recent coastal impact - burnt grasses and plants, sand and other displaced debris (e.g., on the road). We interviewed the population on multiple occasions. The maximum run-up of 4.5 m was observed on the eastern beach of Alofitai on Alofi, associated with an inundation of 85 m and a flow depth of 3 m at the coast. On Futuna, we measured maximum run-ups of 4.4 m on the eastern tip and 4.3 m on the NW tip of the island, with maximum inundations of 95 and 72 m, respectively. A flow depth of 2 m was inferred on the NE tip. Overall, the tsunami impact was more severe on the northern coast of Futuna, with run-ups ranging from 2.1 to 4.3 m. Very small run-ups and inundations were observed along the southern coast, with a 1.0 m run-up and 10 m inundation measured in Léava, the capital of Futuna. Most witnesses report two main waves of equivalent amplitude, the second sometimes described as the larger. All witnesses indicate that the sea withdrew first. A video suggests only a few minutes between successive waves (likely not the first) in Léava. The video shows the reef exposed well below the lowest tides. There were no casualties. One inhabitant was warned by LCI television at 6:30 am and was able to witness the tsunami. There were unconfirmed reports of two women taken by surprise by the arrival of the tsunami on the reef near the eastern end of Futuna, but who managed to hold on to trees to avoid being taken out to sea by the backwash. A significant disaster was avoided essentially because the tsunami arrived early in the day and the tide was low when it hit. Such an event at high tide would have added about 0.8-1 m to the wave height and would undoubtedly have resulted in severe damage, injuries and possibly deaths. This event, together with a small tsunami triggered by a Mw 6.4 local earthquake in March 1993 and an oral legend about a deadly and destructive wave, indicates that the tsunami risk for Futuna is high for the >4000 inhabitants who live almost exclusively on a 50-400 m-wide coastal strip between a narrow reef and landward coastal cliffs. However, the hour and 10 minutes that the 30 September tsunami took to reach the island provided sufficient time to issue a warning to the population, who can rapidly reach safety in this mountainous landscape.

  11. Elders recall an earlier tsunami on Indian Ocean shores

    USGS Publications Warehouse

    Kakar, Din Mohammad; Naeem, Ghazala; Usman, Abdullah; Hasan, Haider; Lohdi, Hira; Srinivasalu, Seshachalam; Andrade, Vanessa; Rajendran, C.P.; Naderi Beni, Abdolmajid; Hamzeh, Mohammad Ali; Hoffmann, Goesta; Al Balushi, Noora; Gale, Nora; Kodijat, Ardito; Fritz, Hermann M.; Atwater, Brian F.

    2014-01-01

    Ten years on, the Indian Ocean tsunami of 26 December 2004 still looms large in efforts to reduce coastal risk. The disaster has spurred worldwide advances in tsunami detection and warning, tsunami-risk assessment, and tsunami awareness [Satake, 2014]. Nearly a lifetime has passed since the northwestern Indian Ocean last produced a devastating tsunami. Documentation of this tsunami, in November 1945, was hindered by international instability in the wake of the Second World War and, in British India, by the approach of independence and partition. The parent earthquake, of magnitude 8.1, was widely recorded, and the tsunami registered on tide gauges, but intelligence reports and newspaper articles say little about inundation limits while permitting a broad range of catalogued death tolls. What has been established about the 1945 tsunami falls short of what's needed today for ground-truthing inundation models, estimating risk to enlarged populations, and anchoring awareness campaigns in local facts. Recent efforts to reduce coastal risk around the Arabian Sea include a project in which eyewitnesses to the 1945 tsunami were found and interviewed (Fig. 1), and related archives were gathered. Results are being made available through UNESCO's Indian Ocean Tsunami Information Center in hopes of increasing scientific understanding and public awareness of the region's tsunami hazards.

  12. Tsunami risk mapping simulation for Malaysia

    USGS Publications Warehouse

    Teh, S.Y.; Koh, H. L.; Moh, Y.T.; De Angelis, D. L.; Jiang, J.

    2011-01-01

The 26 December 2004 Andaman mega tsunami killed about a quarter of a million people worldwide. Since then several significant tsunamis have recurred in this region, including the most recent 25 October 2010 Mentawai tsunami. These tsunamis grimly remind us of the devastating destruction that a tsunami might inflict on the affected coastal communities. There is evidence that tsunamis of similar or higher magnitudes might occur again in the near future in this region. Of particular concern to Malaysia are tsunamigenic earthquakes occurring along the northern part of the Sunda Trench. Further, the Manila Trench in the South China Sea has been identified as another source of potential tsunamigenic earthquakes that might trigger large tsunamis. To protect coastal communities that might be affected by future tsunamis, an effective early warning system must be properly installed and maintained to provide adequate time for residents to be evacuated from risk zones. Affected communities must be prepared and educated in advance regarding tsunami risk zones, evacuation routes as well as an effective evacuation procedure that must be taken during a tsunami occurrence. For these purposes, tsunami risk zones must be identified and classified according to the levels of risk simulated. This paper presents an analysis of tsunami simulations for the South China Sea and the Andaman Sea for the purpose of developing a tsunami risk zone classification map for Malaysia based upon simulated maximum wave heights. © 2011 WIT Press.

  13. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    NASA Technical Reports Server (NTRS)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
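
    The band-pass step described above can be illustrated with a short sketch. The filter band (periods of roughly 10-30 min), sampling interval and detection threshold are assumptions for demonstration; this is not JPL's Global Ionospheric Mapping software.

```python
# Sketch: isolate TEC fluctuations with periods typical of tsunami-driven
# internal gravity waves from a slant-TEC time series, then flag disturbances.
import numpy as np
from scipy.signal import butter, filtfilt


def bandpass_tec(tec_tecu, dt_s, t_min_s=600.0, t_max_s=1800.0, order=4):
    """Zero-phase Butterworth band-pass of a TEC series (TECU, uniform dt)."""
    nyq = 0.5 / dt_s
    low = (1.0 / t_max_s) / nyq   # lowest frequency kept
    high = (1.0 / t_min_s) / nyq  # highest frequency kept
    b, a = butter(order, [low, high], btype="bandpass")
    return filtfilt(b, a, np.asarray(tec_tecu, dtype=float))


def has_disturbance(tec_tecu, dt_s=30.0, threshold_tecu=0.1):
    """Flag a series whose band-passed amplitude exceeds an assumed threshold."""
    return float(np.max(np.abs(bandpass_tec(tec_tecu, dt_s)))) > threshold_tecu
```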

  14. Tsunami evacuation modelling as a tool for risk reduction: application to the coastal area of El Salvador

    NASA Astrophysics Data System (ADS)

    González-Riancho, P.; Aguirre-Ayerbe, I.; Aniel-Quiroga, I.; Abad, S.; González, M.; Larreynaga, J.; Gavidia, F.; Gutiérrez, O. Q.; Álvarez-Gómez, J. A.; Medina, R.

    2013-12-01

Advances in the understanding and prediction of tsunami impacts allow the development of risk reduction strategies for tsunami-prone areas. This paper presents an integral framework for the formulation of tsunami evacuation plans based on tsunami vulnerability assessment and evacuation modelling. This framework considers (i) the hazard aspects (tsunami flooding characteristics and arrival time), (ii) the characteristics of the exposed area (people, shelters and road network), (iii) the current tsunami warning procedures and timing, (iv) the time needed to evacuate the population, and (v) the identification of measures to improve the evacuation process. The proposed methodological framework aims to bridge risk assessment and risk management in terms of tsunami evacuation, as it allows for an estimation of the degree of evacuation success of specific management options, as well as for the classification and prioritization of the gathered information, in order to formulate an optimal evacuation plan. The framework has been applied to the El Salvador case study, demonstrating its applicability to site-specific response times and population characteristics.
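
    The timing comparison at the heart of such a framework can be illustrated with a toy check of whether evacuation can be completed before the estimated tsunami arrival. The walking speed, reaction time and warning-issuance delay below are assumed values, not results from the El Salvador study.

```python
# Toy evacuation-feasibility check: warning delay + reaction time + walking
# time to the nearest shelter versus the estimated tsunami arrival time.
def evacuation_is_feasible(distance_to_shelter_m, tsunami_arrival_min,
                           walking_speed_m_s=1.2, reaction_time_min=10.0,
                           warning_issue_min=5.0):
    """Return True if evacuation can plausibly finish before tsunami arrival."""
    walking_min = distance_to_shelter_m / walking_speed_m_s / 60.0
    total_min = warning_issue_min + reaction_time_min + walking_min
    return total_min < tsunami_arrival_min


# Example: 1.5 km to the nearest shelter, 40 min estimated arrival time
print(evacuation_is_feasible(1500.0, 40.0))
```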

  15. Origin of the ahead of tsunami traveling ionospheric disturbances during Sumatra tsunami and offshore forecasting

    NASA Astrophysics Data System (ADS)

    Bagiya, Mala S.; Kherani, E. A.; Sunil, P. S.; Sunil, A. S.; Sunda, S.; Ramesh, D. S.

    2017-07-01

The presence of ionospheric disturbances associated with the 2004 Sumatra tsunami that propagated ahead of the tsunami itself has previously been identified. However, their origin has remained unresolved to date. Focusing on their origin mechanism, we document these ionospheric disturbances, referred to as Ahead of tsunami Traveling Ionospheric Disturbances (ATIDs). Using total electron content (TEC) data from GPS Aided GEO Augmented Navigation GPS receivers located near the Indian east coast, we first confirm the presence of ATIDs in TEC, appearing 90 min ahead of the arrival of the tsunami at the Indian east coast. We propose a simulation study based on tsunami-atmosphere-ionosphere coupling that considers tsunamigenic acoustic gravity waves (AGWs) to excite these disturbances. We explain the ATID generation by the dissipation of the transverse mode of the primary AGWs. The simulation corroborates the excitation of ATIDs with characteristics similar to the observations. We therefore offer an alternative theoretical tool to monitor offshore ATIDs where observations are rare or unavailable, which could be potentially important for tsunami early warning.

  16. The TRIDEC Virtual Tsunami Atlas - customized value-added simulation data products for Tsunami Early Warning generated on compute clusters

    NASA Astrophysics Data System (ADS)

    Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.

    2012-04-01

The development of new Tsunami Early Warning Systems (TEWS) requires the modelling of the spatio-temporal spreading of tsunami waves, both for recorded past events and for hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. The simulation results must therefore be absolutely trustworthy, in the sense that the quality of these datasets is assured. This is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data format validity, but even more the integrity and information value of the content, which is a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both temporal and spatial spreading characteristics for each simulation remains important. The eye of the human observer remains an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes in many variables, including simulation end-parameters. Whenever new improved iterations of the general models or of the underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each one having distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within a model iteration in little time. This is a significant improvement over linear processing on dedicated desktop machines or servers, and it allows for accelerated visual quality-checking iterations, which in turn provide positive feedback into the overall model improvement. An approach to set up and utilize the CCE has been implemented by the project Collaborative, Complex, and Critical Decision Processes in Evolving Crises (TRIDEC), funded under the European Union's FP7. TRIDEC focuses on real-time intelligent information management in Earth management. The challenges addressed include the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulations and data fusion tools. Additionally, TRIDEC adopts enhancements of Service Oriented Architecture (SOA) principles in terms of Event Driven Architecture (EDA) design. As a next step, the implemented CCE's services for generating derived and customized simulation products are foreseen to be provided via an EDA service for on-demand processing for specific threat parameters and to accommodate model improvements.

  17. Learning from the victims: New physical and social science information about tsunamis from victims of the September 29, 2009 event in Samoa and American Samoa

    NASA Astrophysics Data System (ADS)

    Dudley, Walter C.; Whitney, Rosy; Faasisila, Jackie; Fonolua, Sharon; Jowitt, Angela; Chan-Kau, Marie

    2011-07-01

    Thirty-one video interviews were carried out on the islands of Tutuila, American Samoa and Upolu, Samoa with survivors of, and responders to, the September 29, 2009 tsunami event. Those interviewed included local residents caught by the waves while attempting to flee to higher ground, those who intentionally ran into the water to save others, individuals who recognized the potential tsunami hazard due to the severity of the earthquake and attempted to warn others, aid workers, tourism managers, and others. The frank, often emotional, responses provide unfiltered insight into their level of understanding of the tsunami phenomenon, the level of preparedness of local residents, and challenges faced by aid workers.

  18. The Hellenic National Tsunami Warning Centre (HL-NTWC): Recent updates and future developments

    NASA Astrophysics Data System (ADS)

    Melis, Nikolaos S.; Charalampakis, Marinos

    2014-05-01

The Hellenic NTWC (HL-NTWC) was officially established by Greek law in September 2010. HL-NTWC is hosted at the National Observatory of Athens, Institute of Geodynamics (NOA-IG), which also operates a 24/7 earthquake monitoring service in Greece and coordinates the newly established Hellenic Unified National Seismic Network. The NOA-IG and HL-NTWC Operational Centre is linked to the Civil Protection Operational Centre and serves as the official alerting agency to the General Secretariat for Civil Protection in Greece regarding earthquake events and tsunami watch. Since August 2012, HL-NTWC has acted as a Candidate Tsunami Watch Provider (CTWP) under the UNESCO IOC - ICG NEAMTWS tsunami warning system (NEAM: North-Eastern Atlantic, the Mediterranean and connected seas) and offers its services to the NEAMTWS system. HL-NTWC has participated in all Communication Test Exercises (CTE) under NEAMTWS and has also provided tsunami scenarios for extended system testing exercises such as NEAMWAVE12. Recent developments at HL-NTWC include: deployment of new tide gauge stations for tsunami watch purposes, computation of tsunami scenarios and extension of the database in use, improvement of alerting response times and earthquake magnitude estimation, and testing of newly established software modules for tsunami and earthquake alerting (i.e. Early-Est, SeisComP3, etc.) in Greece and the Eastern Mediterranean. Although funding is currently limited, participation in important EC-funded research projects (i.e. NERIES, NERA, TRANSFER, NEAMTIC and ASTARTE) demonstrates that collaboration among top-class research institutions producing important and useful results at the research front in Europe can facilitate the development and operation of top-class operational centres useful for civil protection purposes in regions in need. Finally, it is demonstrated that HL-NTWC collaboration with key research centres on security and safety issues (e.g. JRC-IPSC) at the operational front can further facilitate and secure everyday operation in a collaborative, experience-exchanging manner. This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Grant 603839, 7th FP (ENV.2013.6.4-3).

  19. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by the NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 km × 100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and lengths of real-time DART buoy data can generate a wide range of results for the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine whether the simulated tide gauge tsunami time series from a specific tsunami source solution lies within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
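
    The comparison metrics mentioned above (maximum wave amplitude and root mean square error between observed and simulated tide-gauge series) can be sketched as follows; the array layout and the ranking rule are assumptions for illustration, and SIFT itself is not reproduced.

```python
# Compare candidate source solutions against an observed tide-gauge record
# using maximum-amplitude difference and RMSE.
import numpy as np


def compare_solution(observed, simulated):
    """Return (max-amplitude difference, RMSE) for one candidate solution."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    amp_diff = float(np.max(np.abs(sim)) - np.max(np.abs(obs)))
    rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
    return amp_diff, rmse


def rank_solutions(observed, solutions):
    """Order candidate solutions (dict name -> series) by RMSE."""
    scored = [(name, *compare_solution(observed, sim)) for name, sim in solutions.items()]
    return sorted(scored, key=lambda item: item[2])
```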

  20. New Offshore Approach to Reduce Impact of Tsunami Waves

    NASA Astrophysics Data System (ADS)

    Anant Chatorikar, Kaustubh

    2016-07-01

The world is facing an increasing frequency and intensity of natural disasters that have devastating impacts on society. According to the International Strategy for Disaster Reduction (ISDR), over five million people were killed or affected in the last 10 years, and huge economic losses have occurred due to natural disasters. The 2011 tsunami in Japan was a tremendous setback for existing tsunami protection technology: more than 25,000 lives were lost, and the damage to the nuclear power stations severely affected the nearby populace and marine life. After the 2004 tsunami, the world's efforts were concentrated on early warning and effective mitigation plans to defend against tsunamis. It is anybody's guess what would happen if such a natural calamity, specifically a tsunami of such magnitude, were to strike our nation again; the country already suffered its disastrous effects in 2004. If such a calamity struck mega-cities like Chennai, Mumbai or Kolkata, where there is extensive human habitation, conventional warning systems and mitigation methods would not be effective for such huge populations, and the destruction, with a very high possibility of deaths due to stampede, could be worse than a nuclear weapon strike. This paper describes an idea inspired by daily routine and its relation to fundamental physics, and discusses a method for its deployment. According to this idea, when a wave strikes the coast, the aim is not to stop it but to reduce its impact to within the permissible limits of existing infrastructure by converting it into a foam wave with the help of surfactants, thereby saving human lives and reducing the complications of mitigation.

  1. Probabilistic tsunami hazard assessment for Makran considering recently suggested larger maximum magnitudes and sensitivity analysis for GNSS-based early warning

    NASA Astrophysics Data System (ADS)

    Zamora, N.; Hoechner, A.; Babeyko, A. Y.

    2014-12-01

Iran and Pakistan are countries frequently affected by destructive earthquakes: for instance, the magnitude 6.6 Bam earthquake in 2003 in Iran, with about 30,000 casualties, or the magnitude 7.6 Kashmir earthquake in 2005 in Pakistan, with about 80,000 casualties. Both events took place inland, but in terms of magnitude, even significantly larger events can be expected to happen offshore, at the Makran subduction zone. This small subduction zone is seismically rather quiescent; nevertheless, a tsunami caused by a thrust event in 1945 (the Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss the possibility of rather rare, huge magnitude 9 events at the Makran subduction zone. We analyze the seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 100,000 years. All events are projected onto the plate interface using scaling relations, and a tsunami model is run for every scenario. The tsunami hazard along the coast is computed and presented in the form of annual probability of exceedance, probabilistic tsunami height for different time periods, and other measures. We show how the hazard responds to variation of the Gutenberg-Richter parameters and maximum magnitudes. We model the historic Balochistan event and its effect in terms of coastal wave heights. Finally, we show how effective tsunami early warning could be achieved by using an array of high-precision real-time GNSS (Global Navigation Satellite System) receivers along the coast, by applying it to the 1945 event and by performing a sensitivity analysis.
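
    One ingredient of this workflow, drawing a synthetic catalog from a truncated Gutenberg-Richter distribution over a long time span, can be sketched as below. The a- and b-values and magnitude bounds are placeholders; the study's own parameters and the projection of events onto the plate interface are not reproduced.

```python
# Sketch: sample a synthetic earthquake catalog from a truncated
# Gutenberg-Richter magnitude distribution over a long time span.
import numpy as np


def synthetic_catalog(years, a_value, b_value, m_min=6.0, m_max=9.0, seed=0):
    """Return an array of magnitudes for one synthetic catalog realization."""
    rng = np.random.default_rng(seed)
    # Expected number of events with M >= m_min per year: 10**(a - b*m_min)
    n_events = rng.poisson(years * 10 ** (a_value - b_value * m_min))
    # Inverse-CDF sampling of the truncated Gutenberg-Richter distribution
    u = rng.random(n_events)
    beta = b_value * np.log(10.0)
    f_max = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * f_max) / beta


# Example: a 100,000-year catalog for placeholder values a=5.5, b=1.0
mags = synthetic_catalog(100_000, a_value=5.5, b_value=1.0)
```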

  2. SeisComP 3 - Where are we now?

    NASA Astrophysics Data System (ADS)

    Saul, Joachim; Becker, Jan; Hanka, Winfried; Heinloo, Andres; Weber, Bernd

    2010-05-01

The seismological software SeisComP has evolved over approximately the last 10 years from a pure acquisition module into fully featured real-time earthquake monitoring software. The now very popular SeedLink protocol for seismic data transmission has been the core of SeisComP from the very beginning. Later additions included simple, purely automatic event detection, location and magnitude determination capabilities. Especially within the development of the 3rd-generation SeisComP, also known as "SeisComP 3", automatic processing capabilities have been augmented with graphical user interfaces for visualization, rapid event review and quality control. Communication between the modules is achieved using a TCP/IP infrastructure that allows distributed computing and remote review. For seismological metadata exchange, export/import to/from QuakeML is available, which also provides a convenient interface with 3rd-party software. SeisComP is the primary seismological processing software at the GFZ Potsdam. It has also been in use for years in numerous seismic networks in Europe and, more recently, has been adopted as the primary monitoring software by several tsunami warning centers around the Indian Ocean. In our presentation we describe the current status of development as well as future plans. We illustrate its possibilities by discussing different use cases for global and regional real-time earthquake monitoring and tsunami warning.

  3. Tsunami Early Warning System in Italy and involvement of local communities

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto; Zaniboni, Filippo

    2010-05-01

Italy is characterized by a long coastline and by a series of possible tsunamigenic sources: many active faults, onshore and offshore, some near the shoreline and in shallow water; active volcanoes (Etna, Stromboli and Campi Flegrei, for example); and continental margins where landslides can occur. All of these threats justify the establishment of a tsunami early warning system (TEWS), especially in Southern Italy, where most of the sources capable of large, disastrous tsunamis are located. One of the main characteristics of such sources, which is however common to other countries and not only in the Mediterranean, is their proximity to the coast, which means that the tsunami lead time before the coast is struck is expected to be within 10-15 minutes in several cases. This time constraint requires conceiving and adopting specific plans aimed at quick tsunami detection and alert dissemination for the TEWS, since the TEWS alert must obviously precede and not follow the first tsunami arrival. The need for speed introduces the specific problem of uncertainty, which is inherent to any forecast system but becomes a very big issue when the available time is short, since crucial decisions have to be taken with incomplete data and incomplete processing. This is precisely the problem that a system like a TEWS in Italy has to face. Uncertainties can be reduced by increasing the capabilities of the tsunami monitoring system, densifying the traditional instrumental networks (e.g. by strengthening seismic and especially coastal and offshore sea-level observation systems) in the identified tsunamigenic source areas. However, uncertainties, though expected to decrease as time passes after the tsunami initiation, cannot be eliminated and have to be dealt with appropriately: uncertainties lead to under- and overestimation of the tsunami size and arrival times, and to missed or false alerts; in other words, they degrade the performance of the tsunami predictors. The role of local communities in defining the strategies to follow in case of uncertain data is essential: only the involvement of such communities from the beginning of the planning and implementation phase of the TEWS, as well as in the definition of a decision-making matrix, can ensure an appropriate response in case of emergency and, most importantly, the acceptance of the system in the long run. Efforts to implement the Tsunami Warning System in Italy should take the above aspects into proper account. Involvement of local communities should be realized primarily through the involvement of the local components of the Civil Protection Agency, which is responsible for the implementation of the system over the Italian territory. A pilot project is being conducted in cooperation between the Civil Protection Service of Sicily and the University of Bologna (UNIBO) that contemplates the strengthening of the local sea-level monitoring system (TSUNET) and specific vulnerability and risk analyses, also exploiting the results of national and European research projects (e.g. TRANSFER and SCHEMA) in which UNIBO had a primary role.

  4. The Evaluation of Cone Capsule as an Alternative Hull form for Portable Tsunami Lifeboat to Support Evacuation System in the Coastal Regions and Small Islands

    NASA Astrophysics Data System (ADS)

    Fauzan Zakki, Ahmad; Suharto; Windyandari, Aulia

    2018-03-01

Several attempts have been made to reduce the risk of tsunami disasters, such as the development of early warning systems, evacuation procedure training, coastal protection and coastal spatial planning. Although many efforts have been made to mitigate the impact of tsunamis in Indonesia, no one has developed a portable disaster rescue vehicle/shelter, analogous to a lifeboat on ships and offshore structures, that is always available when a disaster occurs. The aim of this paper is to evaluate the performance of a cone-capsule-shaped hull form that would be used for a portable tsunami lifeboat. The boat's resistance, intact stability and seakeeping characteristics were investigated. The numerical analysis results indicate that the cone capsule is reliable as an alternative hull form for the portable tsunami lifeboat.

  5. National Weather Service

    MedlinePlus

  6. Tsunamis

    MedlinePlus

  7. Meteorological tsunamis along the U.S. coastline

    NASA Astrophysics Data System (ADS)

    Vilibic, I.; Monserrat, S.; Amores, A.; Dadic, V.; Fine, I.; Horvath, K.; Ivankovic, D.; Marcos, M.; Mihanovic, H.; Pasquet, S.; Rabinovich, A. B.; Sepic, J.; Strelec Mahovic, N.; Whitmore, P.

    2012-04-01

Meteotsunamis, or meteorological tsunamis, are atmospherically induced ocean waves in the tsunami frequency band that are found to affect coasts in a destructive way in a number of places in the World Ocean, including the U.S. coastline. Boothbay Harbor, Maine, in October 2008 and Daytona Beach, Florida, in July 1992 were hit by waves several meters high appearing from "nowhere", and preliminary assessments pointed to the atmosphere as a possible source of the events. As the need emerged for in-depth analyses and proper qualification of these and other events, the National Oceanic and Atmospheric Administration (NOAA) decided to fund the research, currently carried out within the TMEWS project (Towards a MEteotsunami Warning System along the U.S. coastline). The project structure, planned research activities and first results are presented here. The first objective of the project is the creation of a list of potential meteotsunami events, from catalogues, news and high-resolution sea level data, and their proper assessment with regard to source, generation and dynamics. The assessment will be based on research into various types of ocean (tide gauges, buoys), atmospheric (ground stations, buoys, vertical soundings, reanalyses) and remote sensing (satellite) data and products, supported by atmospheric and ocean modelling efforts. Based on the knowledge gained, the basis for a meteotsunami warning system, i.e. the observational systems and communication needs for early detection of a meteotsunami, will be defined. Finally, meteotsunami warning protocols, procedures and a decision matrix will be developed and tested on historical meteotsunami events. These deliverables are also expected to boost meteotsunami research in other parts of the World Ocean and to contribute to the creation of efficient meteotsunami warning systems in different regions of interest, such as the Mediterranean Sea, western Japan, Western Australia and elsewhere.

  8. Rescue, Archival and Discovery of Tsunami Events on Marigrams

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; Wright, L. M.; Stroker, K. J.; Sweeney, A.; Lancaster, M.

    2017-12-01

The Big Earth Data Initiative made possible the reformatting of paper marigram records on which measurements of the 1946, 1952, 1960, and 1964 tsunamis generated in the Pacific Ocean were recorded. The data contained within each record were determined to be invaluable for tsunami researchers and for operational agencies responsible for issuing warnings during a tsunami event. All marigrams were carefully digitized and metadata were generated to form numerical datasets in order to provide the tsunami and other research and application-driven communities with quality data. Data were then packaged as CF-compliant netCDF data files and submitted to the NOAA Centers for Environmental Information for long-term stewardship, archival, and public discovery of both the original scanned images and the data in digital netCDF and CSC formats. PNG plots of each time series were generated and included with the data packages to provide a visual representation of the numerical data sets. ISO-compliant metadata were compiled for the collection at the event level, and individual DOIs were minted for each of the four events included in this project. The procedure followed to reformat each record in this four-event subset of the larger NCEI scanned marigram inventory is presented and discussed. The practical use of these data is presented to highlight that even infrequent measurements of tsunamis hold information that may help constrain earthquake rupture area, provide estimates of earthquake co-seismic slip distribution, identify subsidence or uplift, and significantly increase the holdings of in situ data available for tsunami model validation. These same data may also prove valuable to the broader global tide community for validation and further development of tide models and for investigation into the stability of tidal harmonic constants. Data reformatted as part of this project are PARR compliant and meet the requirements for Data Management, Discoverability, Accessibility, Documentation, Readability, and Data Preservation and Stewardship under the Big Earth Data Initiative.
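
    Packaging a digitized marigram as a CF-style netCDF file can be sketched roughly as follows. The variable names, attributes and units are illustrative assumptions and do not reproduce the project's actual metadata conventions.

```python
# Sketch: write a digitized marigram time series to a CF-style netCDF file.
import numpy as np
import xarray as xr


def write_marigram(times_utc, sea_level_m, station, outfile):
    """times_utc: array of numpy datetime64; sea_level_m: water level in metres."""
    ds = xr.Dataset(
        {
            "sea_surface_height": (
                "time",
                np.asarray(sea_level_m, dtype="float32"),
                {"standard_name": "sea_surface_height_above_mean_sea_level",
                 "units": "m"},
            )
        },
        coords={"time": ("time", np.asarray(times_utc), {"axis": "T"})},
        attrs={"Conventions": "CF-1.6", "station_name": station,
               "title": "Digitized marigram record (illustrative example)"},
    )
    ds.to_netcdf(outfile)
```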

  9. Non-Poissonian Distribution of Tsunami Waiting Times

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2007-12-01

Analysis of the global tsunami catalog indicates that tsunami waiting times deviate from the exponential distribution one would expect from a Poisson process. Empirical density distributions of tsunami waiting times were determined using both global tsunami origin times and tsunami arrival times at a particular site with a sufficient catalog: Hilo, Hawai'i. Most sources for the tsunamis in the catalog are earthquakes; other sources include landslides and volcanogenic processes. Both datasets indicate an over-abundance of short waiting times in comparison to an exponential distribution. Two types of probability models are investigated to explain this observation. Model (1) is a universal scaling law that describes long-term clustering of sources with a gamma distribution. The shape parameter (γ) for the global tsunami distribution is similar to that of the global earthquake catalog γ=0.63-0.67 [Corral, 2004]. For the Hilo catalog, γ is slightly greater (0.75-0.82) and closer to an exponential distribution. This is explained by the fact that tsunamis from smaller triggered earthquakes or landslides are less likely to be recorded at a far-field station such as Hilo in comparison to the global catalog, which includes a greater proportion of local tsunamis. Model (2) is based on two distributions derived from Omori's law for the temporal decay of triggered sources (aftershocks). The first is the ETAS distribution derived by Saichev and Sornette [2007], which is shown to fit the distribution of observed tsunami waiting times. The second is a simpler two-parameter distribution that is the exponential distribution augmented by a linear decay in aftershocks multiplied by a time constant Ta. Examination of the sources associated with short tsunami waiting times indicates that triggered events include both earthquake and landslide tsunamis that begin in the vicinity of the primary source. Triggered seismogenic tsunamis do not necessarily originate from the same fault zone, however. For example, subduction-thrust and outer-rise earthquake pairs are evident, such as the November 2006 and January 2007 Kuril Islands tsunamigenic pair. Because of variations in tsunami source parameters, such as water depth above the source, triggered tsunami events with short waiting times are not systematically smaller than the primary tsunami.
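
    The model (1) comparison can be sketched by fitting a gamma distribution to observed waiting times and comparing its fit with an exponential (Poisson) model; the waiting-time array is a placeholder and no catalog data are included here.

```python
# Sketch: fit gamma and exponential distributions to waiting times and compare
# their log-likelihoods. A fitted gamma shape below 1 indicates an excess of
# short waiting times relative to a Poisson process.
import numpy as np
from scipy import stats


def compare_waiting_time_models(waiting_times_days):
    """Return the fitted gamma shape and the gamma-minus-exponential log-likelihood."""
    w = np.asarray(waiting_times_days, dtype=float)
    # Gamma fit with location fixed at zero (waiting times are positive)
    shape, loc, scale = stats.gamma.fit(w, floc=0.0)
    ll_gamma = np.sum(stats.gamma.logpdf(w, shape, loc=loc, scale=scale))
    # Exponential (Poisson process) fit, also with zero location
    eloc, escale = stats.expon.fit(w, floc=0.0)
    ll_expon = np.sum(stats.expon.logpdf(w, loc=eloc, scale=escale))
    return shape, ll_gamma - ll_expon
```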

  10. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first, enabled through a UCMexus collaboration, is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. We present MEMS-based seismogeodetic observations of strong ground motions in the near field of the San Jacinto fault from the 10 June 2016 Mw 5.2 Borrego Springs earthquake, as well as observations that show the response of the three-story parking garage. This recent earthquake provided a useful demonstration of structural monitoring applications with seismogeodesy.
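
    A greatly simplified, one-component sketch of the seismogeodetic combination is shown below: accelerometer samples drive the state prediction and GPS displacement provides the measurement update of a small Kalman filter. The tightly coupled filter referenced above is more elaborate; the noise settings here are illustrative assumptions.

```python
# Simplified 1-D Kalman filter combining accelerometer input with GPS
# displacement observations to estimate displacement and velocity.
import numpy as np


def seismogeodetic_filter(accel, gps_disp, dt, q_acc=1e-4, r_gps=1e-4):
    """accel, gps_disp: equal-length 1-D arrays (m/s^2, m); returns displacement (m)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [disp, vel]
    B = np.array([0.5 * dt ** 2, dt])       # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GPS observes displacement
    Q = q_acc * np.outer(B, B)              # process noise driven by acceleration
    R = np.array([[r_gps]])                 # GPS measurement noise
    x = np.zeros(2)
    P = np.eye(2)
    out = np.empty(len(accel))
    for k in range(len(accel)):
        # Predict with the accelerometer sample
        x = F @ x + B * accel[k]
        P = F @ P @ F.T + Q
        # Update with the GPS displacement sample
        y = gps_disp[k] - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out[k] = x[0]
    return out
```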

  11. Integrating Caribbean Seismic and Tsunami Hazard into Public Policy and Action

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.

    2012-12-01

The Caribbean has a long history of tsunamis and earthquakes. Over the past 500 years, more than 80 tsunamis have been documented in the region by the NOAA National Geophysical Data Center. Almost 90% of these historical tsunamis have been associated with earthquakes. Since 1842 alone, 3510 lives have been lost to tsunamis; this is more than in the Northeastern Pacific over the same period. With a population of almost 160 million and a heavy concentration of residents, tourists, businesses and critical infrastructure along the Caribbean shores (especially in the northern and eastern Caribbean), the risk to lives and livelihoods is greater than ever before. Most of the countries also have very high exposure to earthquakes. Given this elevated vulnerability, it is imperative that government officials take steps to mitigate the potentially devastating effects of these events. Nevertheless, given the low frequency of high-impact earthquakes and tsunamis in comparison to hurricanes, combined with social and economic considerations, the needed investments are not made, and disasters like the 2010 Haiti earthquake occur. In the absence of frequent significant events, an important driving force for public officials to take action is the dissemination of scientific studies. When papers of this nature have been published and media advisories issued, public officials demonstrate heightened interest in the topic, which in turn can lead to increased legislation and funding efforts. This is especially the case if the material can be easily understood by the stakeholders and there is a local contact. In addition, given the close link between earthquakes and tsunamis (in Puerto Rico alone, 50% of the high-impact earthquakes have also generated destructive tsunamis), it is very important that earthquake and tsunami hazard studies demonstrate consistency. Traditionally in the region, earthquake and tsunami impacts have been considered independently in emergency planning processes. For example, earthquake and tsunami exercises are conducted separately, without taking into consideration the compounding effects. Recognizing this deficiency, the UNESCO IOC Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS), which was established in 2005, decided to include both tsunami and earthquake impacts in the upcoming March 20, 2013 regional CARIBE WAVE/LANTEX tsunami exercise. In addition to the tsunami wave heights predicted by the National Weather Service Tsunami Warning Centers in Alaska and Hawaii, the USGS PAGER and ShakeMap results for the M8.5 scenario earthquake in the southern Caribbean were also integrated into the exercise manual. Additionally, in recent catastrophic planning for Puerto Rico, FEMA requested that local researchers determine both the earthquake and tsunami impacts for the same source. In the US, despite the fact that the lead for earthquakes and tsunamis lies with two different agencies, USGS and NOAA/NWS, it has been very beneficial that the National Tsunami Hazard Mitigation Program partnership includes both agencies. By working together, the seismic and tsunami communities can achieve an even better understanding of the hazards and also foster more action on behalf of government officials and the populations at risk.

  12. PTWC Creating a New Catalog of Historic Tsunami Animations for NOAA Science-on-a-Sphere Exhibits

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Geschwind, L. R.; Wang, D.

    2016-12-01

    Throughout 2016 the Pacific Tsunami Warning Center (PTWC) has been developing a catalog of tsunami animations for NOAA's Science on a Sphere (SOS) display system. The SOS consists of a six-foot (1.8 m) diameter sphere that serves as a projection screen for four high-definition video projectors that can show any global dataset. SOS systems have been installed in over 100 locations around the world, primarily in venues such as science museums. Education and outreach are a vital part of PTWC's mission, and SOS can show the global impacts of tsunami hazards in an intuitive and engaging presentation analogous to a planetarium. PTWC has been releasing these animations for the anniversaries of significant tsunamis throughout the year and has so far produced them for Cascadia 1700, Chile 2010, Japan 2011, Aleutian Islands 1946, Alaska 1964, and Chile 1960; before the end of the year the library will include Samoa 2009 and Sumatra 2004. PTWC created these animations at 8K video resolution to future-proof them against SOS upgrades such as higher-definition projectors and larger spheres. Though not the first SOS tsunami animations, these are the first to show impacts to coastlines, the criterion that PTWC uses to determine the tsunami hazard guidance it issues to the coastal populations it serves. These animations also all use a common color scheme based on PTWC's alert criteria, so that they are consistent with each other as well as with PTWC's tsunami messages. PTWC created these animations using the same tsunami forecast model it routinely uses in its warning operations, and it has even demonstrated that it can produce an SOS tsunami animation while a tsunami is still crossing the Pacific Ocean; this library of animations can therefore also be used to prepare docents and audiences to interpret such a real-time animation should it become available for the next major tsunami. One does not need access to an SOS exhibit, however, to view these animations. NOAA also maintains a website where they can be viewed in a web browser. The site also allows users to download the data along with software so that the animations may be viewed on a personal computer. PTWC also maintains a YouTube channel with Mercator-projected versions of these animations in the same style and color scheme as their SOS counterparts.

  13. Tsunami Ionospheric warning and Ionospheric seismology

    NASA Astrophysics Data System (ADS)

    Lognonne, Philippe; Rolland, Lucie; Rakoto, Virgile; Coisson, Pierdavide; Occhipinti, Giovanni; Larmat, Carene; Walwer, Damien; Astafyeva, Elvira; Hebert, Helene; Okal, Emile; Makela, Jonathan

    2014-05-01

    The last decade demonstrated that seismic waves and tsunamis are coupled to the ionosphere. Observations of Total Electron Content (TEC) and airglow perturbations of unique quality and amplitude were made during the giant 2011 Tohoku earthquake in Japan, and observations of much smaller tsunamis, down to a few cm of sea uplift, are now routinely made, including for the Kuril 2006, Samoa 2009, Chile 2010, and Haida Gwaii 2012 tsunamis. This new branch of seismology is now mature enough to tackle the challenge of inverting these data, with the goal of providing either maps or profiles of the vertical displacement of the Earth's surface (and therefore crucial information for tsunami warning systems) or, with combined ground and ionospheric data sets, the various parameters (atmospheric sound speed, viscosity, collision frequencies) controlling the coupling between the surface, the lower atmosphere and the ionosphere. We first present the state of the art in the modeling of the tsunami-atmospheric coupling, including the slight perturbations in the tsunami phase and group velocities and the dependence of the coupling strength on local time, ocean depth and season. We then compare modelled signals with observations. For tsunamis, this is done with the different types of measurements that have proven ionospheric tsunami detection over the last five years (ground- and space-based GPS, airglow), while we focus on GPS and GOCE observations for seismic waves. These observation systems allow tracking of the signal's propagation from the ground (with GPS and seismometers) to the neutral atmosphere (with infrasound sensors and GOCE drag measurements) to the ionosphere (with GPS TEC and airglow among other ionospheric sounding techniques). Modelling with different techniques (normal modes, spectral element methods, finite differences) is used and shown. While the waveform fits are generally very good, we analyse the differences and draw directions for future studies and improvements, enabling the integration of lateral variations of the solid Earth, bathymetry or atmosphere, finite source models, non-linearity of the waves, and better attenuation and coupling processes. All these effects are revealed by phase or amplitude discrepancies in selected observations. We then present goals and first results of source inversions, with a focus on estimates of the location and amplitude of the sea-level uplift, either by using GPS networks close to the epicentre or, for tsunamis, GPS networks of the Hawaiian Islands.

  14. Development of algorithms for tsunami detection by High Frequency Radar based on modeling tsunami case studies in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Grilli, S. T.; Guérin, C. A.; Grosdidier, S.

    2014-12-01

    Where coastal tsunami hazard is governed by near-field sources, Submarine Mass Failures (SMFs) or earthquakes, tsunami propagation times may be too short for detection based on deep- or shallow-water buoys. To offer sufficient warning time, it has been proposed by others to implement early warning systems relying on High Frequency Radar (HFR) remote sensing, which has dense spatial coverage far offshore. A new HFR, referred to as STRADIVARIUS, is being deployed by Diginext Inc. in fall 2014 to cover the "Golfe du Lion" (GDL) in the Western Mediterranean Sea. This radar uses a proprietary phase-coding technology that allows detection up to 300 km, in a bistatic configuration (in which the transmitter and receiver are separated by about 100 km). Although the primary purpose of the radar is vessel detection in relation to homeland security, the 4.5 MHz HFR will provide a strong backscattered signal from ocean surface waves at the so-called Bragg frequency (here, a wavelength of about 30 m). The current caused by an arriving tsunami will shift the Bragg frequency by a value proportional to the current magnitude (projected onto the local radar ray direction), which can be easily obtained from the Doppler spectrum of the HFR signal. Using state-of-the-art tsunami generation and propagation models, we modeled tsunami case studies in the western Mediterranean basin (both seismic and SMF sources) and simulated the HFR backscattered signal that would be detected for the entire GDL and beyond. Based on the simulated HFR signal, we developed two types of tsunami detection algorithms: (i) one based on standard Doppler spectra, for which we found that, to be detectable within the environmental and background-current noise, tsunami currents must be at least 10-15 cm/s, which typically only occurs on the continental shelf in fairly shallow water; and (ii) to allow earlier detection, a second algorithm that computes correlations of the HFR signals at two distant locations, shifted in time by the tsunami propagation time between these locations (easily computed from the bathymetry). We found that this second method allowed detection for currents as low as 5 cm/s, i.e., in deeper water, beyond the shelf and further away from the coast, thus allowing earlier detection.
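
    A minimal sketch of the second (correlation-based) detection idea described above might look like the following. The station spacing, mean depth, sampling interval, and detection threshold are illustrative assumptions; the operational algorithm in the abstract is more elaborate.

    ```python
    import numpy as np

    G = 9.81  # gravity, m/s^2

    def expected_lag_s(distance_m, mean_depth_m):
        """Tsunami travel time between two radar cells from the shallow-water
        (long-wave) speed c = sqrt(g*h); the mean depth along the path is assumed."""
        return distance_m / np.sqrt(G * mean_depth_m)

    def lagged_correlation(u1, u2, lag_samples):
        """Normalized correlation of radial-current series u2 shifted by the
        expected tsunami propagation lag relative to u1."""
        if lag_samples >= len(u1):
            raise ValueError("lag exceeds record length")
        a = np.asarray(u1[: len(u1) - lag_samples], dtype=float)
        b = np.asarray(u2[lag_samples:], dtype=float)
        a = a - a.mean()
        b = b - b.mean()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    # Illustrative use: two cells 20 km apart over ~100 m mean depth, radial
    # currents sampled every 10 s; flag a detection if the lagged correlation
    # exceeds an (assumed) threshold of 0.5.
    dt = 10.0
    lag = int(round(expected_lag_s(20e3, 100.0) / dt))
    rng = np.random.default_rng(0)
    u1 = rng.normal(0, 0.02, 2000)
    u2 = np.roll(u1, lag) + rng.normal(0, 0.02, 2000)   # synthetic delayed signal
    detected = lagged_correlation(u1, u2, lag) > 0.5
    ```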

  15. Reconstruction of far-field tsunami amplitude distributions from earthquake sources

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2016-01-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distributions from all subduction zones into an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparison to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.
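
    A sketch of steps (1)-(3) above under simplifying assumptions: seismic moments are drawn from a tapered Pareto (tapered Gutenberg-Richter) distribution and mapped to tsunami amplitudes through an assumed power-law source-station scaling. The distribution parameters and regression coefficients below are placeholders, not the values estimated in the paper.

    ```python
    import numpy as np

    def sample_tapered_pareto(n, m_t, beta, m_c, rng):
        """Sample a tapered Pareto with survival function
        S(m) = (m_t/m)**beta * exp((m_t - m)/m_c): the minimum of a pure Pareto
        variate and an exponential variate shifted to the threshold m_t has
        exactly this survival function."""
        u1 = 1.0 - rng.random(n)
        u2 = 1.0 - rng.random(n)
        pareto = m_t * u1 ** (-1.0 / beta)
        expo = m_t - m_c * np.log(u2)
        return np.minimum(pareto, expo)

    rng = np.random.default_rng(1)

    # (1) seismic moment distribution for one subduction zone (placeholder values)
    moments = sample_tapered_pareto(100000, m_t=1e19, beta=0.66, m_c=1e22, rng=rng)  # N*m

    # (2)-(3) assumed source-station scaling log10(A) = a + b*log10(M0); the
    # transformed amplitude distribution is again of tapered Pareto form
    a, b = -14.0, 0.7          # placeholder regression coefficients
    amps = 10 ** (a + b * np.log10(moments))

    # empirical survival function of the transformed amplitudes
    amps_sorted = np.sort(amps)
    survival = 1.0 - np.arange(1, len(amps_sorted) + 1) / len(amps_sorted)
    ```

    Step (4) would simply mix such transformed distributions over all contributing subduction zones, weighted by their relative event rates.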

  16. Puerto Rico Seismic Network Operations During and After the Hurricane Maria: Response, Continuity of Operations, and Experiences

    NASA Astrophysics Data System (ADS)

    Vanacore, E. A.; Baez-Sanchez, G.; Huerfano, V.; Lopez, A. M.; Lugo, J.

    2017-12-01

    The Puerto Rico Seismic Network (PRSN) is an integral part of earthquake and tsunami monitoring in Puerto Rico and the Virgin Islands. The PRSN conducts scientific research as part of the University of Puerto Rico Mayaguez, conducts earthquake monitoring for the region, runs extensive earthquake and tsunami education and outreach programs, and acts as an alternate Tsunami Warning Focal Point for Puerto Rico. During and in the immediate aftermath of Hurricane Maria, the PRSN's duties and responsibilities evolved from those of a seismic network to those of a major information and communications center for the western side of Puerto Rico. Hurricane Maria effectively destroyed most communications on the island, critically between the eastern side of the island, where the Puerto Rico Emergency Management Agency (PREMA) main office and the National Weather Service (NWS) are based, and the western side of the island. Additionally, many local emergency management agencies on the western side of the island lost a satellite-based emergency management information system called EMWIN, which provides critical tsunami and weather information. PRSN's EMWIN system remained functional, and consequently, via this system and radio communications, PRSN became the only source of NWS warnings and bulletins, tsunami alerts, and earthquake information for western Puerto Rico. Additionally, given the PRSN's functional radio communications and geographic location, the network became a critical communications relay for local emergency management. Here we present the PRSN response to Hurricane Maria, including the activation of the PRSN devolution plan, the adoption of duties, and experiences and lessons learned for continuity of operations and the adoption of responsibilities during future catastrophic events.

  17. Long-term statistics of extreme tsunami height at Crescent City

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Zhai, Jinjin; Tao, Shanshan

    2017-06-01

    Historically, Crescent City has been one of the communities most vulnerable to tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-oceanic tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and loss of life. Determining return values of tsunami height from relatively short-term observational data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely, the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all of the above distributions. Both the Kolmogorov-Smirnov test and the root-mean-square error method are used to assess goodness of fit, and the best-fitting distribution is selected. Assuming that the number of tsunamis occurring each year follows a Poisson distribution, a compound Poisson extreme value distribution can be used to fit the annual maximum tsunami amplitude, and point and interval estimates of the return tsunami heights are then calculated for structural design. The results show that the compound Poisson extreme value distribution fits the tsunami heights very well and is suitable for determining the return tsunami heights for coastal disaster prevention.
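
    The compound Poisson construction mentioned above combines a per-event height distribution F(h) with a Poisson rate of events per year, giving an annual-maximum CDF of exp(-lambda*(1 - F(h))), from which return levels follow. The sketch below illustrates this with a Gumbel per-event fit and a small made-up catalog; the heights, record length, and choice of Gumbel are assumptions, not the study's data or selected distribution.

    ```python
    import numpy as np
    from scipy import stats, optimize

    # hypothetical catalog of per-event extreme tsunami heights (m) observed
    # over an assumed span of years
    heights = np.array([0.3, 0.4, 0.5, 0.6, 0.8, 0.9, 1.1, 1.4, 1.7, 2.2, 2.8])
    span_years = 78.0
    lam = len(heights) / span_years          # Poisson rate of events per year

    # fit one candidate per-event distribution by maximum likelihood (Gumbel here)
    loc, scale = stats.gumbel_r.fit(heights)

    def annual_cdf(h):
        """Compound Poisson annual-maximum CDF:
        P(max height in a year <= h) = exp(-lambda * (1 - F(h)))."""
        return np.exp(-lam * (1.0 - stats.gumbel_r.cdf(h, loc, scale)))

    def return_level(T_years):
        """Height exceeded on average once every T_years."""
        target = 1.0 - 1.0 / T_years
        return optimize.brentq(lambda h: annual_cdf(h) - target, 0.0, 100.0)

    print(return_level(100.0))   # e.g. the 100-year return tsunami height
    ```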

  18. Lessons on vulnerability from the 2011 Tohoku earthquake for Indonesia and the United States

    NASA Astrophysics Data System (ADS)

    Sugimoto, M.; Dengler, L.

    2011-12-01

    The 2011 Tohoku earthquake and tsunami shocked people involved in tsunami disaster risk reduction all over the world, because Tohoku had often been attacked by tsunamis and had been considered one of the best-prepared areas for tsunamis in the world. Each author has separately promoted tsunami education to communities, in Indonesia for 7 years after the 2004 Indian Ocean tsunami and in California, US, for 19 years after the 1992 M7.2 Cape Mendocino earthquake. In order to learn the lessons of the 2011 Tohoku earthquake and tsunami and feed them back to Indonesia, the US and international society, we examined some of the factors that contributed to the impacts in Tohoku based on field reconnaissance and reports from other organizations. The biggest factors exacerbating losses were the underestimation (M8) of the real tsunami size (M9) in the design of prevention structures and in evacuation planning, coupled with a perception among individuals that they were not at risk. Approximately 86% of the tsunami victims in Unosumai town, Iwate, were in areas outside the mapped tsunami hazard zone. At least 100 designated tsunami evacuation buildings were either overtopped or structurally toppled by the tsunami. More than 200 people died in the first-story gymnasium of an elementary school beside a river and canal, outside the mapped tsunami hazard zone, in Higashi-Matsushima city, Miyagi. Around 80 students died at Okawa Elementary School in Ishinomaki city, Miyagi. Additional factors affecting vulnerability included people who were in safe areas at the time of the earthquake returning to hazard zones to rescue relatives or possessions, and reliance on cars for evacuation. Factors that enhanced resilience include the good performance of most structures under earthquake ground shaking and the performance of the tsunami early warning system in stopping trains and shutting down other critical systems. Although power was out in most of the affected region, some cell phones and car radios worked in many areas and were able to provide some warning guidance. Individuals who were able to improvise and change their evacuation plans and routes may have been more likely to survive. For the US, while the event has not changed the maximum magnitude estimate for an earthquake on the Cascadia subduction zone, it has triggered a re-examination of how slip and secondary fault rupture may affect the size of the tsunami and engendered debate about how to treat uncertainty in model results. It has also raised the priority of FEMA's catastrophic response planning efforts for a great Cascadia earthquake and has invigorated state and local coastal jurisdictions' planning, education, and outreach efforts. Indonesia had been preparing for tsunamis using the Tohoku model since the 2004 Indian Ocean tsunami. We stopped a plan to install signboards showing numerical tsunami heights in Padang, Indonesia, because such signboards were not effective in Tohoku this time. We introduce new plans in this presentation.

  19. Landquake dynamics inferred from seismic source inversion: Greenland and Sichuan events of 2017

    NASA Astrophysics Data System (ADS)

    Chao, W. A.

    2017-12-01

    In June 2017 two catastrophic landquake events occurred, in Greenland and in Sichuan. The Greenland event led to a tsunami hazard in the small town of Nuugaatsiaq. The landquake in Sichuan hit a town, resulting in over 100 deaths. Both events generated strong seismic signals recorded by the real-time global seismic network. I adopt an inversion algorithm to derive the landquake force time history (LFH) from the long-period waveforms, from which the landslide volume (~76 million m3) can be rapidly estimated, facilitating tsunami-wave modeling for early warning purposes. Based on an integrated approach involving tsunami forward simulation and seismic waveform inversion, this study has significant implications for issuing actionable warnings before hazardous tsunami waves strike populated areas. A two single-force (SF) mechanism (two-block model) yields the best explanation for the Sichuan event, which suggests that a secondary event (seismically inferred volume: 8.2 million m3) may have been mobilized by collapsing mass from the initial rock avalanche (~5.8 million m3), likely causing a catastrophic disaster. The latter source, with a force magnitude of 0.9967×10^11 N, occurred 70 seconds after the first mass movement; in contrast, the first event has a smaller force magnitude of 0.8116×10^11 N. In conclusion, seismically inferred physical parameters will substantially contribute to improving our understanding of landquake source mechanisms and to mitigating similar hazards in other parts of the world.

  20. Human Response to Emergency Warning

    NASA Astrophysics Data System (ADS)

    Sorensen, J.

    2009-12-01

    Almost every day people evacuate from their homes, businesses or other sites, even ships, in response to actual or predicted threats or hazards. Evacuation is the primary protective action utilized in large-scale emergencies such as hurricanes, floods, tornados, tsunamis, volcanic eruptions, or wildfires. Although often precautionary, protecting human lives by temporarily relocating populations before or during times of threat remains a major emergency management strategy. One of the most formidable challenges facing emergency officials is evacuating residents for a fast-moving and largely unpredictable event such as a wildfire or a local tsunami. How to issue effective warnings to those at risk in time for residents to take appropriate action is an ongoing problem. To do so, some communities have instituted advanced communications systems that include reverse telephone call-down systems or other alerting systems to notify at-risk residents of imminent threats. This presentation examines the effectiveness of using reverse telephone call-down systems for warning San Diego residents of wildfires in October 2007. This is the first systematic study conducted on this topic and is based on interviews with 1,200 households in the evacuation areas.

  1. Tsunami Warning and Education Reauthorization Act of 2014

    THOMAS, 113th Congress

    Sen. Begich, Mark [D-AK]

    2014-03-27

    Senate - 03/27/2014: Read twice and referred to the Committee on Commerce, Science, and Transportation.

  2. Health Effects of Tsunamis

    MedlinePlus


  3. Response to the 2011 Great East Japan Earthquake and Tsunami disaster.

    PubMed

    Koshimura, Shunichi; Shuto, Nobuo

    2015-10-28

    We revisited the lessons of the 2011 Great East Japan Earthquake and Tsunami disaster, specifically its response and impact, and discussed the paradigm shift in Japan's tsunami disaster management policies and the perspectives for reconstruction. Revisiting the modern histories of Tohoku tsunami disasters and pre-2011 tsunami countermeasures, we clarified how Japan's coastal communities have prepared for tsunamis. The discussion mainly focuses on structural measures such as seawalls and breakwaters and non-structural measures such as hazard maps and evacuation. The responses to the 2011 event are discussed with a focus on the tsunami warning system and efforts to identify the tsunami impacts. The nation-wide post-tsunami survey results shed light on the mechanisms of structural destruction, tsunami loads and structural vulnerability to inform structural rehabilitation measures and land-use planning. Remarkable paradigm shifts in designing coastal protection and disaster mitigation measures were introduced, led by a new concept of potential tsunami levels: Prevention (Level 1) and Mitigation (Level 2), according to the level of 'protection'. Seawalls are designed with reference to the Level 1 tsunami scenario, while comprehensive disaster management measures should refer to the Level 2 tsunami for the protection of human lives and the reduction of potential losses and damage. Through a case study in Sendai city, the proposed reconstruction plan was evaluated from the tsunami engineering point of view to discuss how the post-2011 paradigm was implemented in coastal communities for future disaster mitigation. The analysis revealed that Sendai city's multiple protection measures for the Level 2 tsunami, combined with an effective tsunami evacuation plan, will contribute to a substantial reduction of the tsunami inundation zone and potential losses. © 2015 The Author(s).

  4. Rapid magnitude estimation from time-dependent displacement amplitude measured with seismogeodetic instrumentation

    NASA Astrophysics Data System (ADS)

    Goldberg, D.; Bock, Y.; Melgar, D.

    2017-12-01

    Earthquake magnitude is a concise metric that illuminates the destructive potential of a seismic event. Rapid determination of earthquake magnitude is currently the main prerequisite for dissemination of a tsunami early warning, thus timely and automated calculation is of high importance. Seismic instrumentation experiences well-documented complications at long periods, making the accurate measurement of ground displacement in the near field unreliable. As a result, the relation between ground motion measured with seismic instrumentation and magnitude saturates, causing underestimation of the size of very large events. In the case of tsunamigenic earthquakes, magnitude underestimation in turn leads to a flawed tsunami inundation assessment, which limits the effectiveness of an early warning, in particular for local tsunamis. Global Navigation Satellite System (GNSS) instrumentation measures the displacement field directly, leading to more accurate magnitude estimates with near-field data. Unlike seismic-only instrumentation, near-field GNSS has been shown to provide an accurate magnitude estimate using the peak ground displacement (PGD) after just 2 minutes [Melgar et al., 2015]. However, GNSS alone is too noisy to detect the first seismic wave arrivals (P-waves), thus it cannot be as timely as a seismic system on its own. Using collocated seismic and geodetic instrumentation, we refine magnitude scaling relations by incorporating a large dataset of earthquakes in Japan. We demonstrate that consideration of the time-dependence of displacement amplitude with respect to P-wave arrival time reduces the time to convergence of the magnitude estimate. We present findings on the growth of events of large magnitude, and demonstrate time-dependent scaling relations that adapt to the amount of recorded data, starting with the P-wave arrival and continuing through PGD. We illustrate real-time, automated implementation of this method, and consider network improvements to advance rapid characterization of large events. Improvement of initial magnitude estimates through integration of geodetic and seismogeodetic observations is a top priority of an ongoing collaboration with NASA and NOAA's National and Pacific Tsunami Warning Centers (NOAA/NASA GNSS Tsunami Team).
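
    The PGD-based magnitude estimate referenced above [Melgar et al., 2015] is a regression of peak ground displacement against magnitude and hypocentral distance of the form log10(PGD) = A + B·Mw + C·Mw·log10(R), which can be inverted for Mw once PGD and R are known. The sketch below uses coefficients close to those commonly cited for that scaling law, but they should be treated as illustrative placeholders; operational values come from the published regression.

    ```python
    import numpy as np

    # Illustrative coefficients of the PGD scaling law
    #   log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km)
    # (values approximate those reported by Melgar et al. [2015]; placeholders here)
    A, B, C = -4.434, 1.047, -0.138

    def mw_from_pgd(pgd_cm, r_km):
        """Invert the scaling law for moment magnitude given peak ground
        displacement (cm) and hypocentral distance (km)."""
        return (np.log10(pgd_cm) - A) / (B + C * np.log10(r_km))

    def network_mw(pgd_cm, r_km):
        """Average the single-station estimates over a GNSS network; a real-time
        system would update this as PGD grows following the P-wave arrival."""
        pgd_cm = np.asarray(pgd_cm, dtype=float)
        r_km = np.asarray(r_km, dtype=float)
        return float(np.mean(mw_from_pgd(pgd_cm, r_km)))

    # hypothetical example: three stations at 60-150 km with decimeter-level PGD
    print(network_mw([35.0, 22.0, 14.0], [60.0, 95.0, 150.0]))
    ```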

  5. Towards Operational Meteotsunami Early Warning System: the Adriatic Project MESSI

    NASA Astrophysics Data System (ADS)

    Vilibic, I.; Sepic, J.; Denamiel, C. L.; Mihanovic, H.; Muslim, S.; Tudor, M.; Ivankovic, D.; Jelavic, D.; Kovacevic, V.; Masce, T.; Dadic, V.; Gacic, M.; Horvath, K.; Monserrat, S.; Rabinovich, A.; Telisman-Prtenjak, M.

    2017-12-01

    A number of destructive meteotsunamis - atmospherically driven long ocean waves in the tsunami frequency band - have occurred during the last decade throughout the world's oceans. Owing to the significant damage caused by these meteotsunamis, several scientific groups (occasionally in collaboration with public offices) have started developing meteotsunami warning systems. The creation of one such system was initiated in late 2015 within the MESSI (Meteotsunamis, destructive long ocean waves in the tsunami frequency band: from observations and simulations towards a warning system) project. The main goal of this project is to build a prototype meteotsunami warning system for the eastern Adriatic coast. The system will be based on real-time measurements, operational atmosphere and ocean modeling, and a real-time decision-making process. The envisioned MESSI meteotsunami warning system consists of three modules: (1) a synoptic warning module, which will use established correlations between forecasted synoptic fields and high-frequency sea level oscillations to provide qualitative meteotsunami forecasts up to a week in advance; (2) a probabilistic pre-modeling prediction module, which will use an operational WRF-ROMS-ADCIRC modeling system and compare the forecast with an atlas of pre-simulations to obtain a probabilistic meteotsunami forecast up to three days in advance; and (3) a real-time module, based on real-time tracking of the properties of air pressure disturbances (amplitude, speed, direction, period, ...) and their real-time comparison with the atlas of meteotsunami simulations, as sketched below. The system will be tested on recent meteotsunami events recorded in the MESSI area shortly after the installation of the operational meteotsunami network. Albeit complex, such a multilevel warning system has the potential to be adapted to most meteotsunami hot spots simply by tuning the system parameters to the available atmospheric and ocean data.
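
    As an illustration of the kind of air-pressure screening the real-time module above might perform, the following sketch flags pressure changes whose magnitude over a short window exceeds a threshold, and estimates a disturbance propagation speed from arrival times at two stations. The window length, threshold, and station geometry are assumptions for illustration, not the MESSI project's operational criteria.

    ```python
    import numpy as np

    def pressure_disturbance_flags(p_hpa, dt_s=60.0, window_s=600.0, thresh_hpa=1.5):
        """Flag samples where the air-pressure change across a sliding window
        exceeds `thresh_hpa` (window and threshold are illustrative).
        `p_hpa` is a 1-D array of station pressure sampled every `dt_s` seconds."""
        p = np.asarray(p_hpa, dtype=float)
        w = int(round(window_s / dt_s))
        if w < 1 or w >= len(p):
            raise ValueError("window too short or longer than the record")
        dp = np.abs(p[w:] - p[:-w])          # pressure change across the window
        flags = np.zeros(len(p), dtype=bool)
        flags[w:] = dp > thresh_hpa
        return flags

    def disturbance_speed(t1_s, t2_s, station_distance_m):
        """Rough propagation speed of a disturbance from its arrival times at two
        stations separated by `station_distance_m`, for comparison with the
        pre-computed simulation atlas."""
        return station_distance_m / max(t2_s - t1_s, 1e-6)
    ```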

  6. In Search of the Largest Possible Tsunami: An Example Following the 2011 Japan Tsunami

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2012-12-01

    Many tsunami hazard assessments focus on estimating the largest possible tsunami: i.e., the worst-case scenario. This is typically performed by examining historic and prehistoric tsunami data or by estimating the largest source that can produce a tsunami. We demonstrate that worst-case assessments derived from tsunami and tsunami-source catalogs are greatly affected by sampling bias. Both tsunamis and tsunami sources are well represented by a Pareto distribution. It is intuitive to assume that there is some limiting size (i.e., runup or seismic moment) at which a Pareto distribution is truncated or tapered. Likelihood methods are used to determine whether a limiting size can be determined from existing catalogs. Results from synthetic catalogs indicate that several observations near the limiting size are needed for accurate parameter estimation. Accordingly, the catalog length needed to empirically determine the limiting size depends on the difference between the limiting size and the observation threshold, with longer catalogs needed for larger limiting-threshold size differences. Most, if not all, tsunami catalogs and regional tsunami source catalogs are of insufficient length to determine the upper bound on tsunami runup. As an example, estimates of the empirical tsunami runup distribution are obtained from the Miyako tide gauge station in Japan, which recorded the 2011 Tohoku-oki tsunami as the largest tsunami among 51 other events. Parameter estimation using a tapered Pareto distribution is performed both with and without the Tohoku-oki event. The catalog without the 2011 event appears to have a low limiting tsunami runup. However, this is an artifact of undersampling. Including the 2011 event, the catalog conforms more to a pure Pareto distribution, with no confidence in estimating a limiting runup. Estimating the size distribution of regional tsunami sources is subject to the same sampling bias. Physical attenuation mechanisms such as wave breaking likely limit the maximum tsunami runup at a particular site. However, historic and prehistoric data alone cannot determine the upper bound on tsunami runup. Because of problems endemic to sampling Pareto distributions of tsunamis and their sources, we recommend that tsunami hazard assessment be based on a specific design probability of exceedance following a pure Pareto distribution, rather than attempting to determine the worst-case scenario.
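
    The recommendation above, to work with a design probability of exceedance under a pure Pareto model rather than a worst-case bound, can be written down directly: if runup above a threshold x_t follows a pure Pareto with exponent beta, the per-event runup exceeded with probability p is x_p = x_t * p^(-1/beta). A small sketch with placeholder parameters (real values would come from maximum likelihood fits to the local catalog):

    ```python
    def pareto_design_runup(p_exceed, x_threshold=0.5, beta=1.0):
        """Runup (same units as x_threshold) exceeded with probability p_exceed
        per event, given the survival function S(x) = (x_threshold / x)**beta.
        The threshold and exponent defaults are placeholders."""
        if not 0.0 < p_exceed < 1.0:
            raise ValueError("p_exceed must lie in (0, 1)")
        return x_threshold * p_exceed ** (-1.0 / beta)

    # e.g. the runup exceeded by 1 in 100 events for the placeholder parameters
    print(pareto_design_runup(0.01))
    ```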

  7. The Redwood Coast Tsunami Work Group: a unique organization promoting earthquake and tsunami resilience on California's North Coast

    NASA Astrophysics Data System (ADS)

    Dengler, L.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2012-12-01

    The Northern California counties of Del Norte, Humboldt, and Mendocino account for over 30% of California's coastline and form one of the most seismically active areas of the contiguous 48 states. The region is at risk from earthquakes located on- and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ) and from distant sources elsewhere in the Pacific. In 1995 the California Geological Survey (CGS) published a scenario for a CSZ earthquake that included both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of government agencies, tribes, service groups, academia and the private sector, was formed to coordinate and promote earthquake and tsunami hazard awareness and mitigation in the three-county region. The RCTWG and its member agencies' projects include education/outreach products and programs, tsunami hazard mapping, and signage and siren planning. Since 2008, RCTWG has worked with the California Emergency Management Agency (Cal EMA) in conducting tsunami warning communications tests on the North Coast. In 2007, RCTWG members helped develop and carry out the first tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD. The RCTWG has facilitated numerous multi-agency, multi-discipline coordinated exercises, and RCTWG county tsunami response plans have been a model for other regions of the state and country. Eight North Coast communities have been recognized as TsunamiReady by the National Weather Service, including the first National Park, the first State Park, and the only tribe in California to be so recognized. Over 500 tsunami hazard zone signs have been posted in the RCTWG region since 2008. Eight assessment surveys from 1993 to 2010 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the seventeen-year period covered by the surveys, the percentage of houses secured to foundations increased from 58 to 84 percent, the percentage of respondents aware of a local tsunami hazard increased from 51 to 89 percent, and the percentage knowing what the Cascadia subduction zone is increased from 16 to 57 percent. In 2009, the RCTWG was recognized by the Western States Seismic Policy Council (WSSPC) with an award for innovation, and in 2010 the RCTWG-sponsored class "Living on Shaky Ground" was awarded WSSPC's overall Award in Excellence. The RCTWG works closely with CGS and Cal EMA on a number of projects including tsunami mapping, evacuation zone planning, siren policy, tsunami safety for boaters, and public education messaging. Current projects include working with CGS to develop a "playbook" tsunami mapping product to illustrate the expected effects from a range of tsunami source events and to assist local governments in focusing future response actions to reflect the range of expected impacts from distant source events. Preparedness efforts paid off on March 11, 2011, when a tsunami warning was issued for the region and significant damage occurred in harbor regions of Del Norte County and Mendocino County. Full-scale evacuations were carried out in a coordinated manner, and the majority of the commercial fishing fleet in Crescent City was able to exit the harbor before the tsunami arrived.

  8. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Currently, an early tsunami warning can be issued upon the detection of a seismic event at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best-matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains challenging to obtain the rupture parameters of tsunamigenic earthquakes in real time and to simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of the relatively low amplitude of a tsunami signal in deep water and the frequent occurrence of background signals and noise results in a generally low signal-to-noise ratio for the tsunami signal, which in turn makes its detection difficult. In order to improve the accuracy and confidence of detection, we apply a re-identification framework in which a tsunamigenic signal is detected by scanning a network of hydrodynamic stations with water-level sensing. The aim is to re-identify the same signatures as the tsunami wave propagates spatially through the hydrodynamic sensing network. Re-identification of the tsunamigenic signal is technically possible because the tsunami signal in the open ocean conserves the birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and can thereby be used to help discriminate against certain background signals, such as wind waves, which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been demonstrated using open NOAA data for a recorded tsunami event in the Pacific Ocean. The new approach will be tested in the future on other oceanic regions, including the Mediterranean Sea and Northeast Atlantic Ocean. Both authors acknowledge that this research is conducted under the TRIDEC IP FP7 project [1], which involves the development of a system of systems for collaborative, complex and critical decision-support in evolving crises. [1] TRIDEC IP ICT-2009.4.3 Intelligent Information Management Project Reference: 258723. http://www.tridec-online.eu/home
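
    A very rough sketch of the re-identification idea described above: scan water-level records from several stations for the signature first detected at a reference station, and count how many stations show that signature near its predicted arrival lag. The lag tolerance, the use of plain cross-correlation, and the station set are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def best_lag(ref, sig):
        """Lag (in samples) at which `sig` best matches the reference waveform,
        found from the full cross-correlation."""
        ref = np.asarray(ref, dtype=float)
        sig = np.asarray(sig, dtype=float)
        xc = np.correlate(sig - sig.mean(), ref - ref.mean(), mode="full")
        return int(np.argmax(xc)) - (len(ref) - 1)

    def reidentify(ref, station_records, predicted_lags, tol=5):
        """Count how many station records show the reference signature within
        `tol` samples of its predicted arrival lag (tolerance is assumed)."""
        hits = 0
        for rec, lag_pred in zip(station_records, predicted_lags):
            if abs(best_lag(ref, rec) - lag_pred) <= tol:
                hits += 1
        return hits
    ```

    A detection flagged at one station would gain confidence as `hits` grows, while a locally generated disturbance (e.g. wind waves) would fail to re-identify at distant stations.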

  9. Scenario-based tsunami risk assessment using a static flooding approach and high-resolution digital elevation data: An example from Muscat in Oman

    NASA Astrophysics Data System (ADS)

    Schneider, Bastian; Hoffmann, Gösta; Reicherter, Klaus

    2016-04-01

    Knowledge of tsunami risk and vulnerability is essential for establishing a well-adapted multi-hazard early warning system, land-use planning and emergency management. As the tsunami risk for the coastline of Oman is still under discussion and remains enigmatic, various scenarios based on historical tsunamis were created. The suggested inundation and run-up heights were projected onto the modern infrastructural setting of the Muscat Capital Area. Furthermore, possible impacts of the worst-case tsunami event for Muscat are discussed. The established Papathoma Tsunami Vulnerability Assessment model was used to assess the structural vulnerability of the infrastructure for a 2 m tsunami scenario, representing the 1945 tsunami, and for a 5 m tsunami scenario in Muscat. Considering structural vulnerability, the results suggest a minor tsunami risk for the 2 m tsunami scenario, as the flooding is mainly confined to beaches and wadis. Traditional brick buildings in particular, still predominant in numerous rural suburbs, and a predominantly coast-parallel road network lead to an increased tsunami risk. In contrast, the 5 m tsunami scenario reveals extensively inundated areas, with up to 48% of the buildings flooded, and consequently a significantly higher tsunami risk. We expect up to 60,000 damaged buildings and up to 380,000 residents directly affected in the Muscat Capital Area, accompanied by significant loss of life and damage to vital infrastructure. The rapid urbanization processes in the Muscat Capital Area, predominantly in areas along the coast, in combination with infrastructural, demographic and economic growth, will further increase the tsunami risk and therefore emphasize the importance of tsunami risk assessment in Oman.
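
    The static ("bathtub") flooding approach named in the title can be sketched as follows: cells of a digital elevation model are flooded if they lie below the assumed run-up height and are hydraulically connected to the sea. The toy grid, run-up value and 4-connectivity rule here are illustrative assumptions; the study itself works on high-resolution DEM data.

    ```python
    import numpy as np
    from collections import deque

    def static_flood_mask(dem, runup_m, sea_mask):
        """Return a boolean mask of cells flooded under a static ('bathtub')
        scenario: elevation at or below `runup_m` and 4-connected to a sea cell.
        `dem` is a 2-D elevation array (m); `sea_mask` marks open-water cells."""
        rows, cols = dem.shape
        below = dem <= runup_m
        flooded = np.zeros_like(below)
        queue = deque(zip(*np.nonzero(sea_mask)))
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < rows and 0 <= cc < cols and below[rr, cc]
                        and not flooded[rr, cc] and not sea_mask[rr, cc]):
                    flooded[rr, cc] = True
                    queue.append((rr, cc))
        return flooded

    # illustrative 5 m scenario on a toy DEM with the sea along the first column
    dem = np.array([[0.0, 1.0, 3.0, 6.0],
                    [0.0, 2.0, 4.0, 7.0],
                    [0.0, 1.5, 8.0, 2.0]])   # the 2.0 m cell is shielded by 8.0 m
    sea = np.zeros_like(dem, dtype=bool)
    sea[:, 0] = True
    print(static_flood_mask(dem, 5.0, sea))
    ```

    The connectivity test is what distinguishes this from a pure elevation threshold: low-lying cells shielded by higher ground are not flooded.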

  10. Modeling of Grain Size Distribution of Tsunami Sand Deposits in V-shaped Valley of Numanohama During the 2011 Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Gusman, A. R.; Satake, K.; Goto, T.; Takahashi, T.

    2016-12-01

    Estimating tsunami amplitude from tsunami sand deposits has been a challenge. The grain size distribution of a tsunami sand deposit may correlate with the tsunami inundation process, and further with its source characteristics. In order to test this hypothesis, we need a tsunami sediment transport model that can accurately estimate the grain size distribution of tsunami deposits. Here, we built and validated a tsunami sediment transport model that can simulate grain size distributions. Our numerical model has three layers: a suspended-load layer, an active bed layer, and a parent bed layer. The two bed layers contain information about the grain size distribution. The numerical model can handle a wide range of grain sizes, from 0.063 mm (4 ϕ) to 5.657 mm (-2.5 ϕ). We apply the numerical model to simulate the sedimentation process during the 2011 Tohoku earthquake in Numanohama, Iwate prefecture, Japan. The grain size distributions at 15 sample points along a 900 m transect from the beach are used to validate the tsunami sediment transport model. The tsunami deposits are dominated by coarse sand with diameters of 0.5-1 mm, and their thicknesses are up to 25 cm. Our tsunami model reproduces well the observed tsunami run-ups, which range from 16 to 34 m along the steep valley in Numanohama. The shapes of the simulated grain size distributions at many sample points located within 300 m of the shoreline are similar to the observations. The differences between the observed and simulated peaks of the grain size distributions are less than 1 ϕ. Our results also show that the simulated sand thickness distribution along the transect is consistent with the observations.
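
    The grain-size classes above are expressed on the phi scale, φ = -log2(d / 1 mm), so 0.063 mm corresponds to about 4 φ and 5.657 mm to -2.5 φ. A small sketch of converting diameters and binning a measured sample into phi classes; the bin width and sample diameters are made up for illustration.

    ```python
    import numpy as np

    def to_phi(d_mm):
        """Krumbein phi scale: phi = -log2(d / 1 mm)."""
        return -np.log2(np.asarray(d_mm, dtype=float))

    def phi_histogram(d_mm, phi_min=-2.5, phi_max=4.0, step=0.5):
        """Weight fractions of a sample binned into phi classes (illustrative bins)."""
        phi = to_phi(d_mm)
        edges = np.arange(phi_min, phi_max + step, step)
        counts, _ = np.histogram(phi, bins=edges)
        return edges, counts / counts.sum()

    # e.g. 0.063 mm -> ~4 phi and 5.657 mm -> -2.5 phi, matching the model's range
    print(to_phi([0.063, 0.5, 1.0, 5.657]))
    ```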

  11. Variety of Sedimentary Process and Distribution of Tsunami Deposits in Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Yamaguchi, N.; Sekiguchi, T.

    2017-12-01

    As an indicator of the history and magnitude of paleotsunami events, tsunami deposits have received considerable attention. To improve the identification and interpretation of paleotsunami deposits, an understanding of the sedimentary processes and distribution of tsunami deposits is crucial. Recent detailed surveys of onshore tsunami deposits, including those of the 2004 Indian Ocean tsunami and the 2011 Tohoku-oki tsunami, have revealed that terrestrial topography produces a variety of deposit features and distributions. Therefore, a better understanding of the possible sedimentary processes and distributions on such influential topographies is required. Flume experiments, in which sedimentary conditions can be easily controlled, can provide insights into the effects of terrestrial topography as well as tsunami magnitude on the features of tsunami deposits. In this presentation, we report laboratory experiments that focused on terrestrial topography including a water body (e.g., a coastal lake) on a coastal lowland and a cliff. In both cases, the results suggested a relationship between the distribution of tsunami deposits and the hydraulic conditions of the tsunami flow associated with the terrestrial topography. These experiments suggest that influential topography would enhance the variability in thickness of tsunami deposits, and thus, in reconstructions of paleotsunami events using sedimentary records, we should take into account such anomalous distributions of tsunami deposits. Further examination of the temporal sequence of sedimentary processes in laboratory tsunamis may improve the interpretation and estimation of paleotsunami events.

  12. PEER - Earthquake Reconnaissance Reports

    Science.gov Websites


  13. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple-hazard event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, 206 of them in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; therefore, if a suitable early warning system, a proper means of communicating the warning, and shelters for the population had existed, then, while this would not have prevented the destruction of infrastructure, several thousand human lives could have been saved. India has over forty years of experience in the construction of cyclone shelters. With additional efforts and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, as could newly constructed multi-hazard cyclone shelters (MPCS). It would therefore be possible to mitigate one hazard, such as cyclones, by constructing a network of shelters while, with some additional investment, adapting these shelters to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple-hazard mitigation measures and how the presence of behavioral and cognitive biases influenced national decision-makers' perceptions of the probabilities of multiple hazards and the choices made for their mitigation. Our methodology was based on the analysis of existing reports from national and international organizations as well as the available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the construction of cyclone shelters was being undertaken. The availability heuristic caused a perception of low tsunami probability following an earthquake, as the last large similar event had happened over a hundred years earlier. Another bias led to a situation in which decisions were taken on the basis of experience rather than statistical evidence; namely, experience showed that the so-called "Ring of Fire" generates undersea earthquakes and tsunamis in the Pacific Ocean. This led decision-makers to neglect numerical estimates of the probability of an undersea earthquake in the Indian Ocean, even though seismologists were warning of the probability of such a large event. The bounded-rationality bias led to misperception of signals from the early warning center in the Pacific Ocean. The resulting limited concern produced risk mitigation measures that considered cyclone risks but paid much less attention to tsunamis. Under loss aversion, the decision-makers perceived the losses connected with the necessary additional investment as greater than the benefits of mitigating a less probable hazard.

  14. Comparison of maximum runup through analytical and numerical approaches for different fault parameters estimates

    NASA Astrophysics Data System (ADS)

    Kanoglu, U.; Wronna, M.; Baptista, M. A.; Miranda, J. M. A.

    2017-12-01

    The one-dimensional analytical runup theory, in combination with nearshore synthetic waveforms, is a promising tool for rapid tsunami early warning systems. Its application to realistic cases with complex bathymetry and initial wave conditions from inverse modelling has shown that maximum runup values can be estimated reasonably well. In this study we generate simplified bathymetric domains that resemble realistic nearshore features. We investigate the sensitivity of the analytical runup formulae to variations in fault source parameters and nearshore bathymetric features. To do this, we systematically vary the fault plane parameters to compute the initial tsunami wave condition. Subsequently, we use the initial conditions to run a numerical tsunami model on a coupled system of four nested grids and compare the results to the analytical estimates. Varying the dip angle of the fault plane showed that the analytical estimates differ by less than 10% for angles of 5-45 degrees in a simple bathymetric domain. These results show that the use of analytical formulae for fast runup estimates constitutes a very promising approach in a simple bathymetric domain and might be implemented in hazard mapping and early warning.
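
    As one classical example of the kind of analytical runup formula discussed above, the Synolakis (1987) runup law for a non-breaking solitary wave on a plane beach relates runup R to offshore wave height H, depth d and beach slope β as R/d ≈ 2.831 √(cot β) (H/d)^(5/4). The helper below evaluates it; whether this particular law is the formula used in the study is an assumption, and the input numbers are purely illustrative.

    ```python
    import math

    def solitary_wave_runup(H, d, beach_slope_deg):
        """Synolakis (1987) runup law for a non-breaking solitary wave on a
        plane beach: R = 2.831 * d * sqrt(cot(beta)) * (H/d)**1.25.
        H and d in meters, beach slope in degrees; returns runup in meters.
        Valid only while the wave remains non-breaking (small H/d on mild slopes)."""
        beta = math.radians(beach_slope_deg)
        return 2.831 * d * math.sqrt(1.0 / math.tan(beta)) * (H / d) ** 1.25

    # purely illustrative: a 0.5 m offshore wave in 100 m depth on a 1-degree slope
    print(solitary_wave_runup(0.5, 100.0, 1.0))
    ```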

  15. An observation on the main factor for the high fatalities by the March 11 earthquake

    NASA Astrophysics Data System (ADS)

    Ishida, M.; Baba, T.; Ando, M.

    2011-12-01

    On 11 March 2011, an Mw 9.0 earthquake occurred in the Tohoku district of northeastern Japan and caused a large tsunami that affected the greater part of the area. During the 115 years prior to this event, large tsunamis had struck the Tohoku region in 1960, 1933 and 1896. Therefore, disaster mitigation efforts had been undertaken in the Tohoku region, such as the construction of exceptionally strong breakwaters, annual tsunami evacuation drills, the preparation of hazard maps, etc. Despite these long-term efforts, ca. 25,000 deaths and missing persons were reported by the National Police Headquarters, Japan. In order to clarify the causes of such a high number of fatalities, we interviewed 120 tsunami survivors in 7 cities, mainly in Iwate prefecture, in several periods after the earthquake. Since the tsunami arrived 20-30 min or more after the strong ground shaking stopped, and high ground was within about 10 to 20 minutes on foot, residents could have been saved if they had taken immediate action. We found several major reasons why residents delayed their evacuation, as follows: 1. The earthquakes that had been forecast for offshore Tohoku by the governmental committee were much smaller than the March 11 event; accordingly, evacuation shelters were located at lower levels than required for the incoming tsunami. 2. The earthquake magnitude and tsunami height in the first warning issued by the Japan Meteorological Agency (JMA) were significantly smaller than those of the actual event, and the majority of local residents thought that breakwaters would protect them. The JMA updated the earthquake magnitude and tsunami height step by step, but the corrected information did not reach the local residents because of the electric power blackout; consequently, residents were unable to get the updated information through TV or radio. 3. Fifty percent of the local residents had experienced the 1960 Chile tsunami, which was significantly smaller than the March 11 tsunami, and most of them estimated the height and inundation area of the incoming tsunami based on that experience. 4. People had believed that breakwaters would protect the city from the tsunami, but the March 11 tsunami overtopped and destroyed most breakwaters. Focusing on the reliance on breakwaters that delayed the evacuation of residents, we numerically simulated the tsunami height caused by the March 11 event in Kamaishi city for three cases: 1. with breakwaters, 2. without breakwaters, and 3. with partially collapsed breakwaters. Our preliminary results showed that the tsunami height does not differ much among the three cases during about the first 20 min. Details of the results will be shown in the poster. It is notable that overconfidence in breakwaters delayed the residents' evacuation, although other reasons also influenced their behavior. Finally, we emphasize that educating children at a young age is important and essential for understanding the basic mechanism of tsunami generation, given that technology may underestimate tsunami heights, warning systems may fail, and breakwaters may not be sturdy enough.

  16. The Human Impact of Tsunamis: a Historical Review of Events 1900-2009 and Systematic Literature Review

    PubMed Central

    Doocy, Shannon; Daniels, Amy; Dick, Anna; Kirsch, Thomas D.

    2013-01-01

    Introduction. Although rare, tsunamis have the potential to cause considerable loss of life and injury as well as widespread damage to the natural and built environments. The objectives of this review were to describe the impact of tsunamis on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of tsunamis were compiled using two methods: a historical review from 1900 to mid-2009 of tsunami events from multiple databases and a systematic literature review of publications to October 2012. Analysis included descriptive statistics and bivariate tests for associations between tsunami mortality and characteristics using STATA 11. Findings. There were 255,195 deaths (range 252,619-275,784) and 48,462 injuries (range 45,466-51,457) as a result of tsunamis from 1900 to 2009. The majority of deaths (89%) and injuries reported during this time period were attributed to a single event, the 2004 Indian Ocean tsunami. Findings from the systematic literature review indicate that the primary cause of tsunami-related mortality is drowning, and that females, children and the elderly are at increased mortality risk. The few studies that reported on tsunami-related injury suggest that males and young adults are at increased injury risk. Conclusions. Early warning systems may help mitigate tsunami-related loss of life. PMID:23857277

  17. The human impact of tsunamis: a historical review of events 1900-2009 and systematic literature review.

    PubMed

    Doocy, Shannon; Daniels, Amy; Dick, Anna; Kirsch, Thomas D

    2013-04-16

    Introduction. Although rare, tsunamis have the potential to cause considerable loss of life and injury as well as widespread damage to the natural and built environments. The objectives of this review were to describe the impact of tsunamis on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of tsunamis were compiled using two methods: a historical review from 1900 to mid-2009 of tsunami events from multiple databases and a systematic literature review of publications to October 2012. Analysis included descriptive statistics and bivariate tests for associations between tsunami mortality and characteristics using STATA 11. Findings. There were 255,195 deaths (range 252,619-275,784) and 48,462 injuries (range 45,466-51,457) as a result of tsunamis from 1900 to 2009. The majority of deaths (89%) and injuries reported during this time period were attributed to a single event, the 2004 Indian Ocean tsunami. Findings from the systematic literature review indicate that the primary cause of tsunami-related mortality is drowning, and that females, children and the elderly are at increased mortality risk. The few studies that reported on tsunami-related injury suggest that males and young adults are at increased injury risk. Conclusions. Early warning systems may help mitigate tsunami-related loss of life.

  18. U.S. Geological Survey Global Seismographic Network - Five-Year Plan 2006-2010

    USGS Publications Warehouse

    Leith, William S.; Gee, Lind S.; Hutt, Charles R.

    2009-01-01

    The Global Seismographic Network provides data for earthquake alerting, tsunami warning, nuclear treaty verification, and Earth science research. The system consists of nearly 150 permanent digital stations, distributed across the globe, connected by a modern telecommunications network. It serves as a multi-use scientific facility and societal resource for monitoring, research, and education, by providing nearly uniform, worldwide monitoring of the Earth. The network was developed and is operated through a partnership among the National Science Foundation (http://www.nsf.gov), the Incorporated Research Institutions for Seismology (http://www.iris.edu/hq/programs/gsn), and the U.S. Geological Survey (http://earthquake.usgs.gov/gsn).

  19. Numerical tool for tsunami risk assessment in the southern coast of Dominican Republic

    NASA Astrophysics Data System (ADS)

    Macias Sanchez, J.; Llorente Isidro, M.; Ortega, S.; Gonzalez Vida, J. M., Sr.; Castro, M. J.

    2016-12-01

    The southern coast of the Dominican Republic is a very populated region, with several important cities including Santo Domingo, its capital. Important activities are rooted in the southern coast, including tourism, industry, commercial ports, and energy facilities, among others. According to historical reports, it has been impacted by big earthquakes accompanied by tsunamis, as in Azua in 1751 and recently Pedernales in 2010, but their sources are not clearly identified. The aim of the present work is to develop a numerical tool to simulate the impact on the southern coast of the Dominican Republic of tsunamis generated in the Caribbean Sea. This tool, based on the Tsunami-HySEA model from the EDANYA group (University of Malaga, Spain), could be used in the framework of a tsunami early warning system, owing to the very short computing times when only propagation is computed, or it could be used to assess inundation impact, computing inundation at an initial 5-meter resolution. Numerical results corresponding to three theoretical sources are used to test the numerical tool.

  20. Operational tsunami modeling with TsunAWI - Examples for Indonesia and Chile

    NASA Astrophysics Data System (ADS)

    Rakowsky, Natalja; Androsov, Alexey; Harig, Sven; Immerz, Antonia; Fuchs, Annika; Behrens, Jörn; Danilov, Sergey; Hiller, Wolfgang; Schröter, Jens

    2014-05-01

    The numerical simulation code TsunAWI was developed in the framework of the German-Indonesian Tsunami Early Warning System (GITEWS). The numerical simulation of prototypical tsunami scenarios plays a decisive role in the a priori risk assessment for coastal regions and in the early warning process itself. TsunAWI is based on a finite element discretization, employs unstructured grids with high resolution along the coast, and includes inundation. This contribution gives an overview of the model itself and presents two applications. For GITEWS, the existing scenario database covering 528 epicenters / 3450 scenarios from Sumatra to Bali was extended by 187 epicenters / 1100 scenarios in the Eastern Sunda Arc. Furthermore, about 1100 scenarios for the Western Sunda Arc were recomputed on the new model domain covering the whole Indonesian Seas. These computations would not have been feasible at the beginning of the project. The unstructured computational grid contains 7 million nodes and resolves all coastal regions with a 150 m edge length, some project regions and the surroundings of tide gauges with 50 m, and the deep ocean with 12 km. While in the Western Sunda Arc the large islands of Sumatra and Java shield the northern Indonesian Archipelago, tsunamis in the Eastern Sunda Arc can propagate to the north. The unstructured grid approach allows TsunAWI to easily simulate the complex propagation patterns, with the self-interactions and the reflections at the coastal regions of myriads of islands. For the Hydrographic and Oceanographic Service of the Chilean Navy (SHOA), we calculated a small scenario database of 100 scenarios (sources by Universidad de Chile) to provide data for a lightweight decision support system prototype (built by DLR). This work is part of the initiation project "Multi hazard information and early warning system in cooperation with Chile" and aims at sharing our experience from GITEWS with the Chilean partners.

  1. Sea level hazards: Altimetric monitoring of tsunamis and sea level rise

    NASA Astrophysics Data System (ADS)

    Hamlington, Benjamin Dillon

    Whether on the short timescale of an impending tsunami or the much longer timescale of climate change-driven sea level rise, the threat stemming from rising and inundating ocean waters is a great concern to coastal populations. Timely and accurate observations of potentially dangerous changes in sea level are vital in determining the precautionary steps that need to be taken in order to protect coastal communities. While instruments from the past have provided in situ measurements of sea level at specific locations across the globe, satellites can be used to provide improved spatial and temporal sampling of the ocean in addition to producing more accurate measurements. Since 1993, satellite altimetry has provided accurate measurements of sea surface height (SSH) with near-global coverage. Not only have these measurements led to the first definitive estimates of global mean sea level rise, satellite altimetry observations have also been used to detect tsunami waves in the open ocean where wave amplitudes are relatively small, a vital step in providing early warning to those potentially affected by the impending tsunami. The use of satellite altimetry to monitor two specific sea level hazards is examined in this thesis. The first section will focus on the detection of tsunamis in the open ocean for the purpose of providing early warning to coastal inhabitants. The second section will focus on estimating secular trends using satellite altimetry data with the hope of improving our understanding of future sea level change. Results presented here will show the utility of satellite altimetry for sea level monitoring and will lay the foundation for further advancement in the detection of the two sea level hazards considered.

  2. Impact of earthquake-induced tsunamis on public health

    NASA Astrophysics Data System (ADS)

    Mavroulis, Spyridon; Mavrouli, Maria; Lekkas, Efthymios; Tsakris, Athanassios

    2017-04-01

    Tsunamis are caused by rapid sea floor displacement during earthquakes, landslides and large explosive eruptions in marine environments. Massive amounts of sea water in the form of devastating surface waves travelling hundreds of kilometers per hour have the potential to cause extensive damage to coastal infrastructure, considerable loss of life and injury, and the emergence of infectious diseases (ID). This study involved an extensive and systematic literature review of 50 research publications related to the public health impact of the three most devastating tsunamis of the last 12 years induced by great earthquakes, namely the 2004 Sumatra-Andaman earthquake (moment magnitude Mw 9.2), the 2009 Samoa earthquake (Mw 8.1) and the 2011 Tōhoku (Japan) earthquake (Mw 9.0) in the Indian, South Pacific and Western Pacific Oceans respectively. The inclusion criteria were literature type comprising journal articles and official reports, natural disaster type including tsunamis induced only by earthquakes, population type including humans, and outcome measure characterized by disease incidence increase. The potential post-tsunami ID are classified into 11 groups comprising respiratory, pulmonary, wound-related, water-borne, skin, vector-borne, eye, fecal-oral, food-borne, fungal and mite-borne ID. Respiratory infections were detected after all of the above-mentioned tsunamis. Wound-related, skin and water-borne ID were observed after the 2004 and 2011 tsunamis, while vector-borne, fecal-oral and eye ID were observed only after the 2004 tsunami, and pulmonary, food-borne and mite-borne ID were diagnosed only after the 2011 tsunami. Based on available age and gender data, it is concluded that the most vulnerable population groups are males, children (age ≤ 15 years) and older adults (age ≥ 65 years). Tetanus and pneumonia are the deadliest post-tsunami ID. The detected risk factors include (1) low socioeconomic conditions, poorly constructed buildings and lack of prevention measures, (2) lack of awareness and prior warning resulting in little time for preparedness or evacuation, (3) severely injured tsunami survivors exposed to high pathogen densities in soil and water, (4) destruction of critical infrastructure including health care systems, causing delayed management and treatment of severe cases, (5) aggravating post-tsunami weather conditions, (6) formation of extensive potential vector breeding sites due to flooding, (7) overcrowded conditions in evacuation shelters characterized by small spaces, inadequate air ventilation, poor hand hygiene and dysfunction of the public health system, (8) low vaccination coverage, (9) poor personal hygiene, (10) minimal precautions against food contamination and (11) dependency of young children and the weaker physical strength and resilience of the elderly needing assistance with daily activities. In conclusion, our study reviewed potential ID following tsunamis induced by great earthquakes during the last 12 years. The establishment of strong disaster preparedness plans characterized by adequate environmental planning, resistant infrastructure and resilient health facilities is essential for the early detection, surveillance and control of emerging ID. Moreover, the establishment and uninterrupted operation of reliable early warning systems may help mitigate the tsunami-related impact on public health.

  3. Focus Upon Implementing the GGOS Decadal Vision for Geohazards Monitoring

    NASA Astrophysics Data System (ADS)

    LaBrecque, John; Stangl, Gunter

    2017-04-01

    The Global Geodetic Observing System of the IAG has identified present and future roles for geodesy in the development and well-being of global society. GGOS is focused upon the development of infrastructure, information, analysis, and educational systems to advance the International Terrestrial Reference Frame, the International Celestial Reference System, the International Height Reference System, atmospheric dynamics, sea level change and geohazards monitoring. The geohazards initiative is guided by an eleven-nation working group initially focused upon the development and integration of regional multi-GNSS networks and analysis systems for earthquake and tsunami early warning. The opportunities and challenges being addressed by the Geohazards working group include regional network design, algorithm development and implementation, communications, funding, and international agreements on data access. This presentation will discuss in further detail these opportunities and challenges for the GGOS focus upon earthquake and tsunami early warning.

  4. Tsunami magnetic signals in the Northwestern Pacific seafloor magnetic measurements

    NASA Astrophysics Data System (ADS)

    Schnepf, N. R.; An, C.; Nair, M. C.; Maus, S.

    2013-12-01

    In the past two decades, underwater cables and seafloor magnetometers have observed motional induction from ocean tsunamis. This study aimed to characterize the electromagnetic signatures of tsunamis at seafloor stations to assist in the long-term goal of real-time tsunami detection and warning systems. Four ocean seafloor stations (T13, T14, T15, T18) in the Northeastern Philippine Sea collected vector measurements of the electric and magnetic fields every minute during the period 10/05/2005 to 11/30/2007 (Baba et al., 2010 PEPI). During this time, four major tsunamis occurred as a result of moment magnitude 8.0-8.1 earthquakes: the 05/03/2006 Tonga event, the 01/13/2007 Kuril Islands event, the 04/01/2007 Solomon Islands event, and the 08/15/2007 Peru event. The Cornell Multi-grid Coupled Tsunami model (COMCOT) was used to predict the arrival time of the tsunamis at each of the seafloor stations. The stations' raw magnetic field signals were high-pass filtered and then examined for signals of the tsunami arrival. The high-pass filtering showed clear tsunami signals for the Tonga event, but a clear signal was not seen for the other events. This may be due to signals from near-Earth space with periods similar to those of tsunamis. To remove extraneous atmospheric magnetic signals, a cross-wavelet analysis was conducted using the horizontal field components from three INTERMAGNET land stations and the vertical component from the seafloor stations. The cross-wavelet analysis showed that for three of the six station-event pairs (two of the four tsunami events) the peak in wavelet amplitude matched the arrival of the tsunami. We discuss the implications of our findings for magnetic monitoring of tsunamis.
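
    Cross-checking a predicted arrival against a high-passed seafloor record can be prototyped with standard signal-processing tools. The sketch below is illustrative only (synthetic data, an assumed 3-hour cutoff), not the processing used in the study; it shows the kind of zero-phase high-pass filter one might apply to a 1-minute magnetic time series before inspecting the arrival window.

```python
# Illustrative sketch, not the authors' code: high-pass filter a 1-minute seafloor
# magnetometer record to isolate tsunami-band signals. The cutoff period and the
# synthetic series are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 60.0                    # sampling frequency in Hz (one sample per minute)
cutoff_hz = 1.0 / (3 * 3600.0)     # remove variations slower than ~3 h (assumed cutoff)
b, a = butter(4, cutoff_hz / (0.5 * fs), btype="highpass")

rng = np.random.default_rng(0)
t = np.arange(0, 3 * 86400, 60.0)                      # three days sampled every minute (s)
bz = 5.0 * np.sin(2 * np.pi * t / 43200.0)             # semidiurnal-like background (nT)
bz += 0.3 * np.sin(2 * np.pi * t / 1200.0) * np.exp(-((t - 1.5e5) / 6e3) ** 2)  # toy tsunami pulse
bz += 0.05 * rng.standard_normal(t.size)               # instrument noise

bz_hp = filtfilt(b, a, bz)         # zero-phase high-passed series, examined around the arrival
print("peak high-passed amplitude (nT):", round(float(np.abs(bz_hp).max()), 3))
```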

  5. The UBO-TSUFD tsunami inundation model: validation and application to a tsunami case study focused on the city of Catania, Italy

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Tonini, R.

    2013-07-01

    Nowadays numerical models are a powerful tool in tsunami research since they can be used (i) to reconstruct modern and historical events, (ii) to cast new light on tsunami sources by inverting tsunami data and observations, (iii) to build scenarios in the frame of tsunami mitigation plans, and (iv) to produce forecasts of tsunami impact and inundation in early warning systems. In parallel with the general recognition of the importance of numerical tsunami simulations, the demand has grown for reliable tsunami codes, validated through tests agreed upon by the tsunami community. This paper presents the tsunami code UBO-TSUFD, which has been developed at the University of Bologna, Italy, and which solves the non-linear shallow water (NSW) equations in a Cartesian frame, with inclusion of bottom friction and exclusion of the Coriolis force, by means of a leapfrog (LF) finite-difference scheme on a staggered grid, and which accounts for moving boundaries to compute sea inundation and withdrawal at the coast. Results of UBO-TSUFD applied to four classical benchmark problems are shown: two benchmarks are based on analytical solutions, one on a plane wave propagating in a flat channel with a constant-slope beach, and one on a laboratory experiment. The code is proven to perform very satisfactorily since it reproduces quite well the benchmark theoretical and experimental data. Further, the code is applied to a realistic tsunami case: a scenario of a tsunami threatening the coasts of eastern Sicily, Italy, is defined and discussed based on the historical tsunami of 11 January 1693, one of the most severe events in Italian history.
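
    For readers unfamiliar with the discretization named above, the following minimal sketch shows a staggered-grid leapfrog update for the linear one-dimensional shallow-water equations over a flat bottom. It is a toy illustration of the scheme's structure only; UBO-TSUFD solves the full nonlinear 2-D equations with bottom friction and moving boundaries, and all grid and depth values below are assumptions.

```python
# Minimal sketch of a staggered-grid leapfrog step for the linear 1-D shallow-water
# equations (flat bottom, no friction). Toy parameters; not the UBO-TSUFD code.
import numpy as np

g, h = 9.81, 4000.0             # gravity (m/s^2), uniform ocean depth (m)
nx, dx = 400, 2000.0            # number of cells and grid spacing (m)
dt = 0.5 * dx / np.sqrt(g * h)  # time step satisfying the CFL condition

x = np.arange(nx) * dx
eta = np.exp(-((x - 200e3) / 20e3) ** 2)   # initial sea-surface hump (m) at cell centers
u = np.zeros(nx + 1)                       # velocities staggered at cell faces (m/s)

for _ in range(2000):
    # momentum equation on faces: du/dt = -g * d(eta)/dx
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity equation at centers: d(eta)/dt = -h * du/dx
    eta -= h * dt / dx * (u[1:] - u[:-1])

print("max surface elevation after propagation (m):", round(float(eta.max()), 3))
```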

  6. Public Perceptions of Tsunamis and the NOAA TsunamiReady Program in Los Angeles

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2010-12-01

    After the devastating December 2004 Indian Ocean tsunami, California and other coastal states began installing "Tsunami Warning Zone" and "Evacuation Route" signs at beaches and major access roads. The geography of the Los Angeles area may not be conducive to signage alone for communicating the tsunami risk and safety precautions. Over a year after installation, most people surveyed did not know about or recognize the tsunami signs. More alarming is that many did not believe a tsunami could occur in the area, even though earthquake-generated waves reached nearby beaches as recently as September 2009. UPDATE (Feb. 2010): Fifty-two percent of the 147 people surveyed did not believe they would survive a natural disaster in Los Angeles. Given the unique geography of Los Angeles, how can the city and county improve the mental health of their citizens before and after a natural disaster? This poster begins to address the issues of community self-efficacy and resiliency in the face of tsunamis. Of note for future research, the data from this survey showed that most people believed climate change would increase the occurrence of tsunamis. Also, the public understanding of water inundation was disturbingly low. As scientists, it is important to understand the big picture of our research - how it is ultimately communicated, understood, and used by the public.

  7. An Experimental Seismic Data and Parameter Exchange System for Interim NEAMTWS

    NASA Astrophysics Data System (ADS)

    Hanka, W.; Hoffmann, T.; Weber, B.; Heinloo, A.; Hoffmann, M.; Müller-Wrana, T.; Saul, J.

    2009-04-01

    In 2008, GFZ Potsdam started to operate its global earthquake monitoring system as an experimental seismic background data centre for the interim NEAMTWS (NE Atlantic and Mediterranean Tsunami Warning System). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project, was extended to test the export and import of individual processing results within a cluster of SC3 systems. The initiated NEAMTWS SC3 cluster presently consists of the 24/7 seismic services at IMP, IGN, LDG/EMSC and KOERI, whereas INGV and NOA are still pending. The GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) was substantially extended by many stations from Western European countries, optimizing the station distribution for NEAMTWS purposes. To supplement the public seismic network (VEBSN - Virtual European Broadband Seismic Network), some attached centres provided additional private stations for NEAMTWS usage. In parallel to the data collection by Internet, the GFZ VSAT hub for the secured data collection of the EuroMED GEOFON and NEAMTWS backbone network stations became operational and the first data links were established. In 2008 the experimental system could already prove its performance, since a number of relevant earthquakes happened in the NEAMTWS area. The results are very promising in terms of speed, as the automatic alerts (reliable solutions based on a minimum of 25 stations and disseminated by emails and SMS) were issued within 2.5 to 4 minutes for Greece and 5 minutes for Iceland. They are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning, usually did not differ substantially from the final solutions, and provided a good starting point for the operations of the interim NEAMTWS. However, although an automatic seismic system is a good first step, 24/7 manned RTWCs are mandatory for regular manual verification of the automatic seismic results and the estimation of the tsunami potential for a given event.

  8. Parallel Processing of Numerical Tsunami Simulations on a High Performance Cluster based on the GDAL Library

    NASA Astrophysics Data System (ADS)

    Schroeder, Matthias; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2014-05-01

    Thousands of numerical tsunami simulations allow the computation of inundation and run-up along the coast for vulnerable areas over time. A so-called Matching Scenario Database (MSDB) [1] contains this large number of simulations in text file format. In order to visualize these wave propagations, the scenarios have to be reprocessed automatically. In the TRIDEC project, funded by the Seventh Framework Programme of the European Union, a Virtual Scenario Database (VSDB) and a Matching Scenario Database (MSDB) were established, amongst others, by the working group of the University of Bologna (UniBo) [1]. One part of TRIDEC was the development of a new generation of Decision Support System (DSS) for tsunami Early Warning Systems (TEWS) [2]. A working group of the GFZ German Research Centre for Geosciences was responsible for developing the Command and Control User Interface (CCUI), the central software application that supports operator activities, incident management and message dissemination. For integration and visualization in the CCUI, the numerical tsunami simulations from the MSDB must be converted into shapefile format. The use of shapefiles enables a much easier integration into standard Geographic Information Systems (GIS); the CCUI itself is based on two widely used open-source products (the GeoTools library and uDig), which provide shapefile integration out of the box. In this case, several thousand tsunami variations were processed for an example area around the Western Iberian margin. Due to the volume of data, only a program-controlled process was feasible. In order to optimize the computing effort and operating time, an existing GFZ High Performance Computing Cluster (HPC) was used. Thus, geospatial software capable of parallel processing was sought. The FOSS tool Geospatial Data Abstraction Library (GDAL/OGR) was used to match the coordinates with the wave heights and to generate the different shapefiles for certain time steps. The shapefiles then contain lines for visualizing the isochrones of the wave propagation and, moreover, data about the maximum wave height and the Estimated Time of Arrival (ETA) at the coast. Our contribution shows the entire workflow and the visualization results of the processing for the example region of the Western Iberian ocean margin. [1] Armigliato A., Pagnoni G., Zaniboni F., Tinti S. (2013), Database of tsunami scenario simulations for Western Iberia: a tool for the TRIDEC Project Decision Support System for tsunami early warning, Vol. 15, EGU2013-5567, EGU General Assembly 2013, Vienna (Austria). [2] Löwe, P., Wächter, J., Hammitzsch, M., Lendholt, M., Häner, R. (2013): The Evolution of Service-oriented Disaster Early Warning Systems in the TRIDEC Project, 23rd International Ocean and Polar Engineering Conference - ISOPE-2013, Anchorage (USA).
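
    As a rough illustration of the conversion step described above (not the TRIDEC implementation; the file layout, field names and output directory are assumptions), a GDAL/OGR-based script can write one point shapefile per scenario or time step and be fanned out across cluster cores with a process pool:

```python
# Hedged sketch: convert text-file scenario output (lon, lat, max wave height, ETA)
# into point shapefiles with GDAL/OGR, one worker per file. Paths and the column
# layout are assumptions, not the TRIDEC/MSDB format.
import glob
from multiprocessing import Pool

import numpy as np
from osgeo import ogr, osr

def scenario_to_shapefile(txt_path):
    data = np.loadtxt(txt_path)                     # assumed columns: lon, lat, hmax, eta
    shp_path = txt_path.replace(".txt", ".shp")

    driver = ogr.GetDriverByName("ESRI Shapefile")
    ds = driver.CreateDataSource(shp_path)
    srs = osr.SpatialReference()
    srs.ImportFromEPSG(4326)                        # WGS84 geographic coordinates
    layer = ds.CreateLayer("tsunami", srs, ogr.wkbPoint)
    layer.CreateField(ogr.FieldDefn("HMAX", ogr.OFTReal))
    layer.CreateField(ogr.FieldDefn("ETA", ogr.OFTReal))

    for lon, lat, hmax, eta in data:
        feat = ogr.Feature(layer.GetLayerDefn())
        feat.SetField("HMAX", float(hmax))
        feat.SetField("ETA", float(eta))
        point = ogr.Geometry(ogr.wkbPoint)
        point.AddPoint(float(lon), float(lat))
        feat.SetGeometry(point)
        layer.CreateFeature(feat)
        feat = None                                 # release the feature
    ds = None                                       # close and flush the data source
    return shp_path

if __name__ == "__main__":
    files = glob.glob("msdb_scenarios/*.txt")       # hypothetical export directory
    with Pool(processes=8) as pool:                 # one conversion per worker process
        pool.map(scenario_to_shapefile, files)
```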

  9. Tsunami Size Distributions at Far-Field Locations from Aggregated Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2015-12-01

    The distribution of tsunami amplitudes at far-field tide gauge stations is explained by aggregating the probability of tsunamis derived from individual subduction zones and scaled by their seismic moment. The observed tsunami amplitude distributions of both continental (e.g., San Francisco) and island (e.g., Hilo) stations distant from subduction zones are examined. Although the observed probability distributions nominally follow a Pareto (power-law) distribution, there are significant deviations. Some stations exhibit varying degrees of tapering of the distribution at high amplitudes and, in the case of the Hilo station, there is a prominent break in slope on log-log probability plots. There are also differences in the slopes of the observed distributions among stations that can be significant. To explain these differences we first estimate seismic moment distributions of observed earthquakes for major subduction zones. Second, regression models are developed that relate the tsunami amplitude at a station to seismic moment at a subduction zone, correcting for epicentral distance. The seismic moment distribution is then transformed to a site-specific tsunami amplitude distribution using the regression model. Finally, a mixture distribution is developed, aggregating the transformed tsunami distributions from all relevant subduction zones. This mixture distribution is compared to the observed distribution to assess the performance of the method described above. This method allows us to estimate the largest tsunami that can be expected in a given time period at a station.
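
    The aggregation idea can be mimicked in a few lines of Monte Carlo code. The sketch below is conceptual only: the zone rates, Pareto exponents, regression coefficients and scatter are invented placeholders, not the values fitted in the study.

```python
# Conceptual sketch of aggregating zone-specific tsunami amplitudes into a mixture
# at one station. All numerical values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)

# hypothetical zones: annual rate of large events, Pareto exponent, distance correction
zones = {"zone_A": (0.8, 0.66, -1.2), "zone_B": (0.4, 0.62, -0.8)}
M0_MIN = 4.0e19                    # N*m, lower moment cutoff for the Pareto draws

def draw_amplitudes(rate, beta, dist_term, n_years=10_000):
    n_events = rng.poisson(rate * n_years)
    m0 = M0_MIN * (1.0 - rng.random(n_events)) ** (-1.0 / beta)     # Pareto-distributed moments
    # assumed regression: log10(amplitude) = a + b*log10(M0) + distance correction
    log_amp = -14.0 + 0.66 * np.log10(m0) + dist_term
    return 10.0 ** (log_amp + 0.3 * rng.standard_normal(n_events))  # lognormal scatter

# mixture distribution: pool the transformed samples from all relevant zones
amps = np.concatenate([draw_amplitudes(*params) for params in zones.values()])
for level in (0.1, 0.5, 1.0, 2.0):
    print(f"fraction of events exceeding {level} m at the station: {(amps > level).mean():.3f}")
```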

  10. Cloud manifestations of atmospheric gravity waves over the water area of the Kuril Islands during the propagation of powerful transoceanic tsunamis

    NASA Astrophysics Data System (ADS)

    Skorokhodov, A. V.; Shevchenko, G. V.; Astafurov, V. G.

    2017-11-01

    The investigation results of atmospheric gravity waves cloudy manifestations observed over the water area of the Kuril Island ridge during the propagation of powerful transoceanic tsunami 2009-2010 are shown. The description of tsunami characteristics is based on the use of information from autonomous deep-water stations of the Institute of Marine Geology and Geophysics FEB RAS in the Southern Kuril Islands and the Tsunami Warning Service telemetering recorder located in one of the ports on Paramushir Island. The environment condition information was extracted from the results of remote sensing of the Earth from space by the MODIS sensor and aerological measurements at the meteorological station of Severo-Kurilsk. The results of analyzing the characteristics of wave processes in the atmosphere and the ocean are discussed and their comparison is carried out.

  11. SEQUENCING of TSUNAMI WAVES: Why the first wave is not always the largest?

    NASA Astrophysics Data System (ADS)

    Synolakis, C.; Okal, E.

    2016-12-01

    We discuss what contributes to the `sequencing' of tsunami waves in the far field, that is, to the distribution of the maximum sea surface amplitude inside the dominant wave packet constituting the primary arrival at a distant harbour. Based on simple models of sources for which analytical solutions are available, we show that, as range is increased, the wave pattern evolves from a regime of maximum amplitude in the first oscillation to one of delayed maximum, where the largest amplitude takes place during a subsequent oscillation. In the case of the simple, instantaneous uplift of a circular disk at the surface of an ocean of constant depth, the critical distance for transition between those patterns scales as r_0^3 / h^2, where r_0 is the radius of the disk and h the depth of the ocean. This behaviour is explained from simple arguments based on a model where sequencing results from frequency dispersion in the primary wave packet, as the width of its spectrum around its dominant period T0 becomes dispersed in time in an amount comparable to T0, the latter being controlled by a combination of source size and ocean depth. The general concepts in this model are confirmed in the case of more realistic sources for tsunami excitation by a finite-time deformation of the ocean floor, as well as in real-life simulations of tsunamis excited by large subduction events, for which we find that the influence of fault width on the distribution of sequencing is more important than that of fault length. Finally, simulation of the major events of Chile (2010) and Japan (2011) at large arrays of virtual gauges in the Pacific Basin correctly predicts the majority of the sequencing patterns observed on DART buoys during these events. By providing insight into the evolution with time of wave amplitudes inside primary wave packets for far field tsunamis generated by large earthquakes, our results stress the importance, for civil defense authorities, of issuing warning and evacuation orders of sufficient duration to avoid the hazard inherent in premature calls for all-clear.

  12. Sequencing of tsunami waves: why the first wave is not always the largest

    NASA Astrophysics Data System (ADS)

    Okal, Emile A.; Synolakis, Costas E.

    2016-02-01

    This paper examines the factors contributing to the `sequencing' of tsunami waves in the far field, that is, to the distribution of the maximum sea surface amplitude inside the dominant wave packet constituting the primary arrival at a distant harbour. Based on simple models of sources for which analytical solutions are available, we show that, as range is increased, the wave pattern evolves from a regime of maximum amplitude in the first oscillation to one of delayed maximum, where the largest amplitude takes place during a subsequent oscillation. In the case of the simple, instantaneous uplift of a circular disk at the surface of an ocean of constant depth, the critical distance for transition between those patterns scales as r_0^3 / h^2 where r0 is the radius of the disk and h the depth of the ocean. This behaviour is explained from simple arguments based on a model where sequencing results from frequency dispersion in the primary wave packet, as the width of its spectrum around its dominant period T0 becomes dispersed in time in an amount comparable to T0, the latter being controlled by a combination of source size and ocean depth. The general concepts in this model are confirmed in the case of more realistic sources for tsunami excitation by a finite-time deformation of the ocean floor, as well as in real-life simulations of tsunamis excited by large subduction events, for which we find that the influence of fault width on the distribution of sequencing is more important than that of fault length. Finally, simulation of the major events of Chile (2010) and Japan (2011) at large arrays of virtual gauges in the Pacific Basin correctly predicts the majority of the sequencing patterns observed on DART buoys during these events. By providing insight into the evolution with time of wave amplitudes inside primary wave packets for far field tsunamis generated by large earthquakes, our results stress the importance, for civil defense authorities, of issuing warning and evacuation orders of sufficient duration to avoid the hazard inherent in premature calls for all-clear.
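
    As a hedged, order-of-magnitude illustration of the scaling quoted above (the dimensionless prefactor is not given in the abstract, so only relative magnitudes are meaningful), representative values can be substituted directly:

```latex
% Order-of-magnitude illustration of the critical-distance scaling; the prefactor
% is unspecified, and the numerical values are assumed for illustration only.
\[
  r_c \;\propto\; \frac{r_0^{3}}{h^{2}},
  \qquad
  r_0 = 50~\mathrm{km},\; h = 4~\mathrm{km}
  \;\Longrightarrow\;
  \frac{r_0^{3}}{h^{2}} = \frac{(50~\mathrm{km})^{3}}{(4~\mathrm{km})^{2}}
  \approx 7.8\times 10^{3}~\mathrm{km}.
\]
```

    Because the dependence on source radius is cubic, doubling r_0 pushes the transition range roughly eight times farther out, while halving the ocean depth quadruples it.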

  13. Tsunami Warning, Education, and Research Act of 2014

    THOMAS, 113th Congress

    Rep. Bonamici, Suzanne [D-OR-1]

    2014-07-31

    Latest action: Senate - 09/09/2014 - Received in the Senate, read twice, and referred to the Committee on Commerce, Science, and Transportation. Status: Passed House.

  14. Role of State Tsunami Geoscientists during Emergency Response Activities: Example from the State of California (USA) during September 29, 2009, Samoa Tsunami Event

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Dengler, L. A.; Goltz, J. D.; Legg, M.; Miller, K. M.; Parrish, J. G.; Whitmore, P.

    2009-12-01

    California tsunami geoscientists work closely with federal, state and local government emergency managers to help prepare coastal communities for potential impacts from a tsunami before, during, and after an event. For teletsunamis, as scientific information (forecast model wave heights, first-wave arrival times, etc.) from NOAA’s West Coast and Alaska Tsunami Warning Center is made available, state-level emergency managers must help convey this information in a concise and comprehensible manner to local officials, who ultimately determine the appropriate response activities for their jurisdictions. During the Samoa Tsunami Advisory for California on September 29, 2009, geoscientists from the California Geological Survey and Humboldt State University assisted the California Emergency Management Agency in this information transfer by providing technical assistance during teleconference meetings with NOAA and other state and local emergency managers prior to the arrival of the tsunami. State geoscientists gathered additional background information on anticipated tidal conditions and wave heights for areas not covered by NOAA’s forecast models. The participation of the state geoscientists in the emergency response process helped clarify which regions were potentially at risk, as well as those having a low risk from the tsunami. Future tsunami response activities for state geoscientists include: 1) working closely with NOAA to simplify their tsunami alert messaging and expand their forecast modeling coverage, 2) creation of “playbooks” containing information from existing tsunami scenarios for local emergency managers to reference during an event, and 3) development of a state-level information “clearinghouse” and pre-tsunami field response team to assist local officials as well as observe and report tsunami effects.

  15. Assessing historical rate changes in global tsunami occurrence

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2011-01-01

    The global catalogue of tsunami events is examined to determine if transient variations in tsunami rates are consistent with a Poisson process commonly assumed for tsunami hazard assessments. The primary data analyzed are tsunamis with maximum sizes >1m. The record of these tsunamis appears to be complete since approximately 1890. A secondary data set of tsunamis >0.1m is also analyzed that appears to be complete since approximately 1960. Various kernel density estimates used to determine the rate distribution with time indicate a prominent rate change in global tsunamis during the mid-1990s. Less prominent rate changes occur in the early- and mid-20th century. To determine whether these rate fluctuations are anomalous, the distribution of annual event numbers for the tsunami catalogue is compared to Poisson and negative binomial distributions, the latter of which includes the effects of temporal clustering. Compared to a Poisson distribution, the negative binomial distribution model provides a consistent fit to tsunami event numbers for the >1m data set, but the Poisson null hypothesis cannot be falsified for the shorter duration >0.1m data set. Temporal clustering of tsunami sources is also indicated by the distribution of interevent times for both data sets. Tsunami event clusters consist only of two to four events, in contrast to protracted sequences of earthquakes that make up foreshock-main shock-aftershock sequences. From past studies of seismicity, it is likely that there is a physical triggering mechanism responsible for events within the tsunami source 'mini-clusters'. In conclusion, prominent transient rate increases in the occurrence of global tsunamis appear to be caused by temporal grouping of geographically distinct mini-clusters, in addition to the random preferential location of global M >7 earthquakes along offshore fault zones.
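
    The overdispersion comparison at the heart of this analysis is straightforward to reproduce on any annual-count series. The sketch below uses synthetic counts rather than the catalogue analyzed in the paper, and fits the negative binomial by the method of moments simply to illustrate the comparison.

```python
# Sketch with synthetic annual counts (not the USGS tsunami catalogue): compare a
# Poisson fit with a method-of-moments negative binomial fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
counts = rng.negative_binomial(n=3, p=0.5, size=120)   # stand-in for ~120 years of event counts

mean, var = counts.mean(), counts.var(ddof=1)
ll_poisson = stats.poisson.logpmf(counts, mu=mean).sum()

if var > mean:                                         # overdispersion suggests temporal clustering
    r = mean ** 2 / (var - mean)                       # from var = mean + mean^2 / r
    p = r / (r + mean)
    ll_negbin = stats.nbinom.logpmf(counts, r, p).sum()
    print(f"var/mean = {var / mean:.2f}")
    print(f"log-likelihood: Poisson {ll_poisson:.1f}, negative binomial {ll_negbin:.1f}")
else:
    print("No overdispersion detected; the Poisson null cannot be rejected on this basis.")
```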

  16. Tide gauge observations of the Indian Ocean tsunami, December 26, 2004

    NASA Astrophysics Data System (ADS)

    Merrifield, M. A.; Firing, Y. L.; Aarup, T.; Agricole, W.; Brundrit, G.; Chang-Seng, D.; Farre, R.; Kilonsky, B.; Knight, W.; Kong, L.; Magori, C.; Manurung, P.; McCreery, C.; Mitchell, W.; Pillay, S.; Schindele, F.; Shillington, F.; Testut, L.; Wijeratne, E. M. S.; Caldwell, P.; Jardin, J.; Nakahara, S.; Porter, F.-Y.; Turetsky, N.

    2005-05-01

    The magnitude 9.0 earthquake centered off the west coast of northern Sumatra (3.307°N, 95.947°E) on December 26, 2004 at 00:59 UTC (United States Geological Survey (USGS) (2005), USGS Earthquake Hazards Program-Latest Earthquakes, Earthquake Hazards Program, http://earthquake.usgs.gov/eqinthenews/2004/usslav/, 2005) generated a series of tsunami waves that devastated coastal areas throughout the Indian Ocean. Tide gauges operated on behalf of national and international organizations recorded the wave form at a number of island and continental locations. This report summarizes the tide gauge observations of the tsunami in the Indian Ocean (available as of January 2005) and provides a recommendation for the use of the basin-wide tide gauge network for future warnings.

  17. Learning from the Survivors and Teaching the Community (Invited)

    NASA Astrophysics Data System (ADS)

    Dudley, W. C.

    2009-12-01

    Interviewing tsunami survivors is an effective way to collect data that have both educational and scientific value. Critical eye witness insight can be gained into the human perception of tsunami events, as well as the reaction and response of victims. Furthermore, the survivors’ assessment of rescue and recovery efforts following the event, and of current warning, mitigation and education measures can be an important tool in evaluating the potential effectiveness of such efforts. Video interviews of tsunami survivors telling their stories can in themselves be a powerful education product for use in the local community and beyond. Mistakes made, lessons learned, and current challenges facing communities in Indonesia, Thailand, India, Sri Lanka, and the Maldives, as well as in Alaska and Hawaii, will be presented.

  18. Numerical experiment on tsunami deposit distribution process by using tsunami sediment transport model in historical tsunami event of megathrust Nankai trough earthquake

    NASA Astrophysics Data System (ADS)

    Imai, K.; Sugawara, D.; Takahashi, T.

    2017-12-01

    The large flows caused by a tsunami transport sediment from the beach and form tsunami deposits on land and in coastal lakes. Tsunami deposits are found undisturbed in coastal lakes in particular. Okamura & Matsuoka (2012) found tsunami deposits in a field survey of coastal lakes facing the Nankai Trough and identified deposits attributable to the past eight Nankai Trough megathrust earthquakes. The environment in coastal lakes is stably calm and more suitable for the preservation of tsunami deposits than other topographic settings such as plains. Therefore, there is a possibility that the recurrence interval of megathrust earthquakes and tsunamis can be discussed with high resolution. In addition, it has been pointed out that small events that cannot be detected on plains could be distinguished in lakes (Sawai, 2012). Various aspects of past tsunamis are expected to be elucidated by taking into account the topographic conditions of coastal lakes and using the relationship between the erosion and sedimentation processes at the lake bottom and the tsunami forcing. In this research, a numerical examination based on a tsunami sediment transport model (Takahashi et al., 1999) was carried out for the Ryujin-ike pond in Oita, Japan, where tsunami deposits have been identified, and a deposit migration analysis was conducted for the tsunami deposit distribution process of historical Nankai Trough earthquakes. Furthermore, tsunami source conditions can be investigated by comparing the observed data with the computed tsunami deposit distribution. It is difficult to clarify the details of a tsunami source from indistinct paleogeographic information; however, this result shows that the distribution of tsunami deposits in lakes, combined with computational data, can be used as a constraint on the scale of the tsunami source.

  19. Analysis of Tsunami Evacuation Issues Using Agent Based Modeling. A Case Study of the 2011 Tohoku Tsunami in Yuriage, Natori.

    NASA Astrophysics Data System (ADS)

    Mas, E.; Takagi, H.; Adriano, B.; Hayashi, S.; Koshimura, S.

    2014-12-01

    The 2011 Great East Japan earthquake and tsunami reminded us that nature can exceed structural countermeasures like seawalls, breakwaters or tsunami gates. In such situations it is a challenging task for people to find a nearby haven. This event, like many others before it, confirmed the importance of early evacuation, tsunami awareness and the need to develop much more resilient communities with effective evacuation plans. To support reconstruction activities and efforts to develop resilient communities in areas at risk, tsunami evacuation simulation can be applied to tsunami mitigation and evacuation planning. In this study, using the compiled information related to evacuation behavior at Yuriage in Natori during the 2011 tsunami, we simulated the evacuation process and explored the reasons for the large number of fatalities in the area. It was found that residents did evacuate to nearby shelter areas; however, after the tsunami warning was upgraded, some evacuees decided to conduct a second-step evacuation to a shelter farther inland. Simulation results show the consequences of this decision and the outcomes had a second evacuation not been performed. The actual reported number of fatalities in the event and the results from the simulation are compared to verify the model. The case study shows the contribution of tsunami evacuation models as tools for analyzing evacuees' decisions and the related outcomes. In addition, future evacuation plans and activities for the reconstruction process and urban planning can be supported by the results provided by this kind of tsunami evacuation model.

  20. Compilation and Analysis of a Database of Local Tsunami Bulletins issued by the Pacific Tsunami Warning Center (PTWC) to the Hawaii Emergency Management Agency (HI-EMA) between September 2003 and July, 2015

    NASA Astrophysics Data System (ADS)

    Sardina, V.; Koyanagi, K. K.; Walsh, D.; Becker, N. C.; McCreery, C.

    2015-12-01

    The PTWC functions not only as the official international tsunami warning center (TWC) for nations with coasts around the Pacific rim, the Caribbean, and other regions of the world, but also as the local TWC for the State of Hawaii. The PTWC began sending local tsunami messages to HI-EMA in September 2003. As part of its routine operations, the PTWC strives to send a local tsunami message product for any Hawaii earthquake with a magnitude of 4.0 or larger within five minutes of origin time. To evaluate PTWC's performance in that regard, however, we must first compile a suitable database of local tsunami bulletins. For this purpose, we scanned all the available logs for the Federal Aviation Administration (FAA) communications circuit between 2003 and 2015 and retrieved 104 local bulletins. We parsed these bulletins and extracted the parametric data needed to evaluate PTWC's performance in terms of essential statistics such as message delay time, epicenter offsets, and magnitude residuals as compared with more authoritative earthquake source parametrizations. To that end, we cross-validated 88 of these seismic events, with magnitudes between 2.8 and 6.7, against the corresponding source parameters obtained from the USGS Hawaiian Volcano Observatory (HVO) and the National Earthquake Information Center's (NEIC) online catalog. Analysis of events with magnitude 4.0 or larger gives a median message delay time of 3 minutes and 33 seconds, a median epicentral offset of 3.2 km, and a median magnitude residual of 0.2 units. Several message delay outliers exist because the PTWC sent local tsunami information statements (TIS) for felt events with magnitudes as small as 2.8 located west of the Big Island. Routine use of a synthetic Wood-Anderson magnitude since the end of 2012 appears to have brought consistency to PTWC's local magnitude estimates and a reduction in message delays. Station site corrections, a refined attenuation model, and optimization of the peak-amplitude search window may improve magnitude estimates. Improved dissemination software and regional moment tensor determinations for rapid magnitude estimation, in addition to ML and Mwp, can result in yet faster and more accurate tsunami message products.
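
    The per-event comparison behind those statistics reduces to three simple quantities. The sketch below uses invented bulletin and catalog values purely to show the computation; it is not PTWC software, and the field names are assumptions.

```python
# Hedged sketch: message delay, epicentral offset and magnitude residual for one
# cross-validated event. All values are invented for illustration.
from datetime import datetime, timezone
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = p2 - p1, radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2.0 * 6371.0 * asin(sqrt(a))

bulletin = {"origin": datetime(2012, 5, 1, 3, 10, 0, tzinfo=timezone.utc),   # hypothetical bulletin
            "sent":   datetime(2012, 5, 1, 3, 13, 40, tzinfo=timezone.utc),
            "lat": 19.40, "lon": -155.28, "mag": 4.6}
catalog = {"lat": 19.42, "lon": -155.25, "mag": 4.4}                         # hypothetical HVO/NEIC solution

delay_min = (bulletin["sent"] - bulletin["origin"]).total_seconds() / 60.0
offset_km = haversine_km(bulletin["lat"], bulletin["lon"], catalog["lat"], catalog["lon"])
residual = bulletin["mag"] - catalog["mag"]
print(f"delay {delay_min:.1f} min, offset {offset_km:.1f} km, magnitude residual {residual:+.1f}")
```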

  1. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that the occurrence of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wooden building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation height, and vulnerability. Based on these distributions, we conducted Monte Carlo simulations spanning 1,000,000 years. We found that the expected damage probability of the studied wooden building is 22.5%, given that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation without conducting multiple tsunami numerical simulations.
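
    The chain described above (Poisson occurrence, response-surface inundation depth, log-normal fragility) is easy to sketch. The coefficients below are the ones quoted in the abstract; the earthquake rate, the distributions assumed for fault depth and slip, and the fragility parameters are placeholders, so the resulting number will not reproduce the reported 22.5%.

```python
# Sketch of the Monte Carlo chain: Poisson occurrence -> response surface ->
# log-normal fragility. Response-surface coefficients from the abstract; the rate,
# input distributions and fragility parameters are placeholders.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
a, b, c = 0.2615, 3.1763, -1.1802          # response surface: y = a*x1 + b*x2 + c
annual_rate = 0.005                         # assumed Poisson rate of the scenario earthquake
n_years = 1_000_000

n_events = rng.poisson(annual_rate * n_years)                       # events in the simulated period
x1 = rng.normal(5.0, 2.0, n_events)                                 # fault depth x1 (km), assumed
x2 = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n_events)      # slip x2 (m), assumed
depth = np.clip(a * x1 + b * x2 + c, 0.0, None)                     # inundation depth y (m)

median, beta = 2.0, 0.5                                             # assumed log-normal fragility curve
p_damage = norm.cdf(np.log(np.maximum(depth, 1e-6) / median) / beta)

print("events generated:", n_events)
print(f"expected damage probability given an earthquake: {p_damage.mean():.1%}")
```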

  2. Seismic and tsunami hazard in Puerto Rico and the Virgin Islands

    USGS Publications Warehouse

    Dillon, William P.; Frankel, Arthur D.; Mueller, Charles S.; Rodriguez, Rafael W.; ten Brink, Uri S.

    1999-01-01

    Executive SummaryPuerto Rico and the Virgin Islands are located at an active plate boundary between the North American plate and the northeast corner of the Caribbean plate. The region was subject in historical times to large magnitude earthquakes and devastating tsunamis. A major downward tilt of the sea floor north of Puerto Rico and the Virgin Islands, large submarine rockslides, and an unusually large negative gravity anomaly are also indicative of a tectonically active region. Scientists have so far failed to explain the deformation of this region in a coherent and predictable picture, such as in California, and this has hampered their ability to assess seismic and tsunami hazards in the region. The NE corner of the Caribbean is unique among the seismically-active regions of the United States in that it is mostly covered by water. This fact presents an additional challenge for seismic and tsunami hazard assessment and mitigation.The workshop, convened in San Juan on March 23-24, 1999, was "historic" in that it brought together for the first time a broad spectrum of scientists, engineers, and public and private sector officials who deal with such diverse questions as tectonic models, probabilistic assessment of seismic hazard, prediction of tsunami runup, strong ground motion, building codes, stability of man-made structures, and the public’s preparedness for natural disasters. It was an opportunity for all the participants to find out how their own activity fit into the broad picture of science and how it aids society in hazard assessment and mitigation. In addition, the workshop was offered as a continuing education course at the Colegio de Ingenieros y Agrimensores de Puerto Rico, which assured a rapid dissemination of the results to the local community. A news conference which took place during the workshop alerted the public to the efforts of the USGS, other Federal agencies, the Commonwealth of Puerto Rico, universities and the private sector.During the first day of the workshop, participants from universities, federal institutions, and consulting firms in Puerto Rico, the Virgin Islands, the continental U.S., Dominican Republic, and Europe reviewed the present state of knowledge including a review and discussion of present plate models, recent GPS and seismic reflection data, seismicity, paleoseismology, and tsunamis. The state of earthquake/tsunami studies in Puerto Rico was presented by several faculty members from the University of Puerto Rico at Mayaguez. A preliminary seismic hazard map was presented by the USGS and previous hazard maps and economic loss assessments were considered. During the second day, the participants divided into working groups and prepared specific recommendations for future activities in the region along the six following topics below. Highlights of these recommended activities are:Marine geology and geophysics – Acquire deep-penetration seismic reflection and refraction data, deploy temporary ocean bottom seismometer arrays to record earthquakes, collect high-resolution multibeam bathymetry and side scan sonar data of the region, and in particular, the near shore region, and conduct focussed high-resolution seismic studies around faults. Determine slip rates of specific offshore faults. 
    Assemble a GIS database for available marine geological and geophysical data.
    Paleoseismology and active faults – Field reconnaissance aimed at identifying Quaternary faults and determining their paleoseismic chronology and slip rates, as well as identifying and dating paleoliquefaction features from large earthquakes. Quaternary mapping of marine terraces, fluvial terraces and basins, beach ridges, etc., to establish a framework for understanding neotectonic deformation of the island. Interpretation of aerial photography to identify possible Quaternary faults.
    Earthquake seismology – Determine an empirical seismic attenuation function using observations from local seismic networks and recently-installed broad-band stations. Evaluate existing earthquake catalogs from local networks and regional stations, and complete the catalogs. Transcribe the pre-1991 network data from 9-track tape onto more stable archival media. Calibrate instruments of local networks. Use GPS measurements to constrain deformation rates used in seismic-hazard maps.
    Engineering – Prepare liquefaction susceptibility maps for the urban areas. Update and improve databases for types of site conditions. Collect site effect observations and near-surface geophysical measurements for future local (urban-area) hazard maps. Expand the number of instruments in the strong motion program. Develop fragility curves for Puerto Rico construction types and details, and carry out laboratory testing on selected types of mass-produced construction. Consider tsunami design in shoreline construction projects.
    Tsunami hazard – Extract tsunami observations from archives and develop a Caribbean historical tsunami database. Analyze prehistoric tsunami deposits. Collect accurate, up-to-date, near-shore topography and bathymetry for accurate inundation models. Prepare tsunami flooding and evacuation maps. Establish a Caribbean Tsunami Warning System for Puerto Rico and the Virgin Islands. Evaluate local, regional, national, and global seismic networks and equipment, and their role in a tsunami warning system.
    Societal concerns – Prepare warning messages, protocols, and evacuation routes for earthquake, tsunami, and landslide hazards for Puerto Rico and the U.S. Virgin Islands. Advocate enforcement of existing building codes. Prepare non-technical hazard assessment maps for political and educational uses. Raise the awareness of potentially affected populations by presentations at elementary schools, by the production of a tsunami video, and by distribution of earthquake preparedness manuals in newspaper supplements. Promote partnerships at the state and federal level for long-term earthquake and tsunami hazard mitigation. This partnership should also include the private sector, such as the insurance industry, telecommunication companies, and the engineering community.
    The following reports of the various working groups are the cumulative recommendations of the community of scientists, engineers, and public officials who participated in the workshop. The list of participants and the workshop’s agenda are given in the appendix.
    Working group reports: Marine Geology and Geophysics Working Group; Paleoseismology and Active Faults Working Group; Joint Working Group for Earthquake Seismology and Engineering; Tsunami Working Group; Societal Concerns Working Group.

  3. Development, testing, and applications of site-specific tsunami inundation models for real-time forecasting

    NASA Astrophysics Data System (ADS)

    Tang, L.; Titov, V. V.; Chamberlin, C. D.

    2009-12-01

    The study describes the development, testing and applications of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation, for a range of model grid setups, resolutions and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7 and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment study indicates that use of a seismic magnitude alone for a tsunami source assessment is inadequate to achieve such accuracy for tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunami-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.

  4. Improvement of tsunami detection in timeseries data of GPS buoys with the Continuous Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Chida, Y.; Takagawa, T.

    2017-12-01

    Observation data from GPS buoys installed offshore of Japan are used to monitor not only waves but also tsunamis. The real-time data were successfully used to upgrade the tsunami warnings just after the 2011 Tohoku earthquake. Huge tsunamis can be detected easily because the signal-to-noise ratio is high enough, but moderate tsunamis cannot. GPS data sometimes include tsunami-like error waveforms because the positioning accuracy changes with the number and positions of the GPS satellites. Distinguishing true tsunami waveforms from pseudo-tsunami ones is therefore important for tsunami detection. In this research, a method was developed to reduce misdetections of tsunamis in the observation data of GPS buoys and to increase the efficiency of tsunami detection. Firstly, the error waveforms were extracted using the indexes of position dilution of precision, the reliability of GPS satellite positioning, and the number of satellites used in the calculation. Then, the output from this procedure was used in a Continuous Wavelet Transform (CWT) to analyze the time-frequency characteristics of the error waveforms and real tsunami waveforms. We found that the error waveforms tended to appear when the accuracy of the GPS buoy positioning was low. By extracting these waveforms, it was possible to remove about 43% of the error waveforms without reducing the tsunami detection rate. Moreover, we found that the amplitudes of the power spectra obtained from the error waveforms and real tsunamis were similar in the long-period component (4-65 minutes); on the other hand, the amplitude in the short-period component (< 1 minute) obtained from the error waveforms was significantly larger than that of the real tsunami waveforms. By thresholding the short-period component, further extraction of error waveforms became possible without a significant reduction of the tsunami detection rate.
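
    The band comparison described above can be prototyped with an off-the-shelf CWT. The sketch below uses a synthetic record and the PyWavelets package rather than GPS buoy data, and the sampling interval and wavelet choice are assumptions; it only illustrates contrasting the 4-65 minute tsunami band with the sub-minute band where the error waveforms carry more energy.

```python
# Hedged sketch: continuous wavelet transform of a synthetic sea-surface series,
# comparing mean power in the 4-65 min tsunami band with the <1 min band.
import numpy as np
import pywt

dt = 5.0                                    # sampling interval (s), assumed
t = np.arange(0, 6 * 3600, dt)              # six hours of record
rng = np.random.default_rng(5)
eta = 0.15 * np.sin(2 * np.pi * t / 1800.0) * np.exp(-((t - 9000.0) / 3000.0) ** 2)  # toy ~30 min tsunami
eta += 0.05 * rng.standard_normal(t.size)   # noise, including short-period error energy

scales = np.arange(2, 600)
coefs, freqs = pywt.cwt(eta, scales, "morl", sampling_period=dt)
periods_min = 1.0 / freqs / 60.0            # wavelet frequencies converted to periods (minutes)

tsunami_band = (periods_min >= 4.0) & (periods_min <= 65.0)
short_band = periods_min < 1.0
power = np.abs(coefs) ** 2
print("mean power, 4-65 min band:", float(power[tsunami_band].mean()))
print("mean power, < 1 min band: ", float(power[short_band].mean()))
```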

  5. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    NASA Astrophysics Data System (ADS)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is by now commonly accepted. But it was only in the last decade that it started to be applied to the Mediterranean region, taking special impulse from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained regarding two major topics, namely the strategies applicable to building the tsunami scenario database, and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve 1) the choice of the main tectonic tsunamigenic sources (or areas), 2) their tessellation with matrices of elementary faults whose dimensions depend heavily on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, 3) the computation of the scenarios themselves, and 4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and at all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the different contributions of a number of factors, such as efficient code development and the availability of cutting-edge hardware to run the code itself, the wise selection of the MSDB outputs to be combined, the choice of the forecast points where water elevation time series must be taken into account, and a few others.
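
    In the simplest terms, the matching step selects the elementary fault cells consistent with the first hypocentre/magnitude estimate and sums their pre-computed unit-slip responses. The sketch below is purely conceptual: the database layout, the rupture-length scaling and the waveforms are invented, and the real algorithm described above works with full simulation outputs rather than a single synthetic gauge record.

```python
# Conceptual sketch of combining pre-computed unit-slip scenarios. Database layout,
# scaling relation and waveforms are invented placeholders.
import numpy as np

N_T, DT = 720, 5.0                           # one hour of 5 s samples, assumed
# hypothetical MSDB: cell id -> (lat, lon, unit-slip waveform at one tide gauge)
msdb = {i: (36.0 + 0.2 * i, 27.0,
            0.01 * np.sin(2 * np.pi * np.arange(N_T) * DT / 1200.0))
        for i in range(20)}

def forecast(epi_lat, epi_lon, magnitude, slip_m=1.0):
    rupture_len_deg = 10 ** (0.6 * magnitude - 2.5) / 111.0   # generic empirical scaling, assumed
    selected = [cid for cid, (lat, lon, _) in msdb.items()
                if abs(lat - epi_lat) <= rupture_len_deg / 2 and abs(lon - epi_lon) <= 1.0]
    if not selected:
        return None
    combined = slip_m * sum(msdb[cid][2] for cid in selected)          # linear combination
    arrival_idx = int(np.argmax(np.abs(combined) > 0.1 * np.abs(combined).max()))
    return float(combined.max()), arrival_idx * DT, selected           # amplitude (m), arrival (s), cells

print(forecast(epi_lat=37.0, epi_lon=27.2, magnitude=7.4))
```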

  6. How Perturbing Ocean Floor Disturbs Tsunami Waves

    NASA Astrophysics Data System (ADS)

    Salaree, A.; Okal, E.

    2017-12-01

    Bathymetry maps play perhaps the most crucial role in tsunami simulations. Regardless of the simulation method, it is desirable on one hand to include every bathymetric detail in the simulation grids in order to predict tsunami amplitudes as accurately as possible, while on the other hand large grids result in long simulation times. It is therefore of interest to investigate whether there is a "sufficiency" level for the amount of detail in bathymetry grids needed to reconstruct the most important features of tsunami simulations obtained from the actual bathymetry. In this context, we use a spherical-harmonics approach to decompose the bathymetry of the Pacific Ocean into its components down to a resolution of about 4 degrees (l=100) and create bathymetry grids by accumulating the resulting terms. We then use these grids to simulate the tsunami behavior from pure thrust events around the Pacific with the MOST algorithm (e.g. Titov & Synolakis, 1995; Titov & Synolakis, 1998). Our preliminary results reveal that one would only need to consider the sum of the first 40 coefficients (equivalent to a resolution of 1000 km) to reproduce the main components of the "real" results. This would allow simpler simulations and potentially more efficient tsunami warning algorithms.
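    A minimal sketch of the truncation idea, assuming a regular latitude-longitude grid and using a direct (slow but transparent) projection onto spherical harmonics; the authors' actual decomposition tools are not specified here, and l_max=40 is simply the degree highlighted in the abstract (degree l roughly corresponds to a wavelength of 40,000 km / l, i.e. about 1000 km at l=40).

```python
# Hedged sketch: band-limiting a global bathymetry grid by truncating its
# spherical-harmonic expansion at degree l_max. Brute-force projection is used
# for clarity; it is slow for fine grids.
import numpy as np
from scipy.special import sph_harm

def truncate_bathymetry(bathy, l_max=40):
    """bathy: 2-D array on a regular grid (rows = colatitude 0..pi,
    cols = longitude 0..2*pi). Returns the degree-l_max reconstruction."""
    n_lat, n_lon = bathy.shape
    colat = (np.arange(n_lat) + 0.5) * np.pi / n_lat                 # avoid the poles
    lon = np.arange(n_lon) * 2 * np.pi / n_lon
    LON, COLAT = np.meshgrid(lon, colat)
    d_area = (np.pi / n_lat) * (2 * np.pi / n_lon) * np.sin(COLAT)   # quadrature weight

    recon = np.zeros_like(bathy, dtype=float)
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, LON, COLAT)               # scipy order: (m, l, azimuth, colat)
            coeff = np.sum(bathy * np.conj(Y) * d_area)  # projection onto Y_lm
            recon += np.real(coeff * Y)                  # accumulate the truncated series
    return recon
```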

  7. Observing Traveling Ionospheric Disturbances Caused by Tsunamis Using GPS TEC Measurements

    NASA Technical Reports Server (NTRS)

    Galvan, David A.; Komjathy, Attila; Hickey, Michael; Foster, James; Mannucci, Anthony J.

    2010-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following two recent seismic events: the American Samoa earthquake of September 29, 2009, and the Chile earthquake of February 27, 2010. Fluctuations in TEC correlated in time, space, and wave properties with these tsunamis were observed in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with wavelengths and periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the tsunamis in certain locations, but not in others. Where variations are observed, the typical amplitude tends to be on the order of 1% of the background TEC value. Variations with amplitudes of 0.1-0.2 TECU are observable, with periods and timing consistent with the tsunamis. These observations are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement in some locations, though there are cases when the model predicts an observable tsunami-driven signature and none is observed. These TEC variations are not always seen when a tsunami is present, but in these two events the regions where a strong ocean tsunami was observed did coincide with clear TEC observations, while a lack of clear TEC observations coincided with smaller tsunami amplitudes. There is potential to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for early warning systems.
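    The band-pass step can be illustrated with a simple zero-phase Butterworth filter; the 10-40 minute pass band and 30 s sampling below are assumptions for illustration rather than the study's processing parameters, and a causal filter would be required for real-time use.

```python
# Hedged sketch: zero-phase Butterworth band-pass isolating periods typical of
# tsunami-driven internal gravity waves in a TEC series. The 10-40 min pass band
# and 30 s sampling are illustrative, not the study's parameters.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_tec(tec, dt_s=30.0, t_short_min=10.0, t_long_min=40.0, order=4):
    """tec: 1-D TEC series [TECU] sampled every dt_s seconds."""
    nyquist = 0.5 / dt_s
    low = 1.0 / (t_long_min * 60.0) / nyquist    # longest period -> lowest frequency
    high = 1.0 / (t_short_min * 60.0) / nyquist  # shortest period -> highest frequency
    sos = butter(order, [low, high], btype="band", output="sos")
    # Zero-phase filtering keeps the timing of the disturbance intact; for a
    # real-time system a causal filter would be used instead.
    return sosfiltfilt(sos, np.asarray(tec, dtype=float))
```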

  8. Making Multi-Level Tsunami Evacuation Playbooks Operational in California and Hawaii

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Peterson, D.; Fryer, G. J.; Miller, K.; Nicolini, T.; Popham, C.; Richards, K.; Whitmore, P.; Wood, N. J.

    2016-12-01

    In the aftermath of the 2010 Chile, 2011 Japan, and 2012 Haida Gwaii tsunamis in California and Hawaii, coastal emergency managers requested that state and federal tsunami programs investigate providing more detailed information about the flood potential and recommended evacuation for distant-source tsunamis well ahead of their arrival time. Evacuation "Playbooks" for tsunamis of variable sizes and source locations have been developed for some communities in the two states, providing secondary options to an all-or-nothing approach to evacuation. Playbooks have been finalized for nearly 70% of the coastal communities in California, and have been drafted for evaluation by the communities of Honolulu and Hilo in Hawaii. A key component in determining a recommended level of evacuation during a distant-source tsunami and making the Playbooks operational has been the development of the "FASTER" approach, an acronym for the factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and the Run-up potential. Within the first couple of hours after a tsunami is generated, the FASTER flood elevation value will be computed and used to select the appropriate minimum tsunami phase evacuation "Playbook" for use by the coastal communities. The states of California and Hawaii, the tsunami warning centers, and local weather service offices are working together to deliver recommendations on the appropriate evacuation Playbook plans for communities to use prior to the arrival of a distant-source tsunami. These partners are working closely with individual communities on developing conservative and consistent protocols for the use of the Playbooks. Playbooks help provide a scientifically based, minimum response for small- to moderate-size tsunamis, which could reduce the potential for over-evacuation of hundreds of thousands of people and save hundreds of millions of dollars in evacuation costs for communities and businesses.
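    A minimal sketch of how a FASTER-style flood elevation might be assembled, assuming the named factors combine additively and using purely illustrative Playbook thresholds; the operational California and Hawaii procedures may weight or bound these terms differently.

```python
# Hedged sketch of assembling a FASTER-style flood elevation from its named
# components, assuming they combine additively; thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class FasterInputs:
    forecast_amplitude_m: float   # F/A: forecast tsunami amplitude at the coast
    storm_setup_m: float          # S:   concurrent storm/wave setup
    tide_m: float                 # T:   predicted tide during the arrival window
    forecast_error_m: float       # E:   allowance for forecast uncertainty
    runup_potential_m: float      # R:   site-specific run-up allowance

def faster_flood_elevation(x: FasterInputs) -> float:
    # Assumed additive combination of the FASTER terms (illustration only).
    return (x.forecast_amplitude_m + x.storm_setup_m + x.tide_m
            + x.forecast_error_m + x.runup_potential_m)

def select_playbook(flood_elev_m: float) -> str:
    """Illustrative thresholds only; real Playbook levels are community-specific."""
    if flood_elev_m < 1.0:
        return "beach/harbor restrictions only"
    if flood_elev_m < 3.0:
        return "evacuate low-lying Playbook zone"
    return "evacuate full tsunami hazard zone"
```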

  9. Field Survey in French Polynesia and Numerical Modeling of the 11 March 2011 Japan Tsunami

    NASA Astrophysics Data System (ADS)

    Hyvernaud, O.; Reymond, D.; Okal, E.; Hebert, H.; Clément, J.; Wong, K.

    2011-12-01

    We present the field survey and observations of the March 2011 Japan tsunami in the Society and Marquesas Islands. Although not catastrophic, the tsunami produced some damage in the Marquesas, which are always the most prone to tsunami amplification in French Polynesia: 8 houses were destroyed or inundated, with run-up of up to 4.5 m measured. Surprisingly, the maximum run-up was observed on the south-west coast of Nuku Hiva Island, in a bay open in the direction opposite to the wave front. In Tahiti the tsunami was much more moderate, with a maximum height observed on the north coast of about 3 m of run-up, corresponding to the highest level of the seasonal oceanic swell, and no damage apart from inundation of the main road. These observations are well explained and reproduced by numerical modeling of the tsunami, and the results confirm the exceptional source dimensions. Concerning the real-time aspect, the tsunami height was also rapidly predicted in the context of tsunami warning with two methods: the first uses a database of pre-computed numerical simulations, and the second uses a formula giving the deep-ocean tsunami amplitude as a function of the source parameters (source coordinates, scalar moment and fault azimuth) and of the coordinates of the receiver. The population responded responsibly to the evacuation order on the 19 islands involved, helped in part by the favourable arrival time of the wave (7:30 a.m. local time).

  10. Documentation for the Southeast Asia seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth

    2007-01-01

    The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1), causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing advance warning of tsunamis and by introducing seismic hazard provisions in building codes that allow buildings and structures to withstand the strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID) Indian Ocean Tsunami Warning System program to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University, in Appendix A).

  11. Rapid Tsunami Inundation Forecast from Near-field or Far-field Earthquakes using Pre-computed Tsunami Database: Pelabuhan Ratu, Indonesia

    NASA Astrophysics Data System (ADS)

    Gusman, A. R.; Setiyono, U.; Satake, K.; Fujii, Y.

    2017-12-01

    We built a pre-computed tsunami inundation database for Pelabuhan Ratu, one of the tsunami-prone areas on the southern coast of Java, Indonesia. The database can be employed for rapid estimation of tsunami inundation during an event. The pre-computed tsunami waveforms and inundations come from a total of 340 scenarios ranging from 7.5 to 9.2 in moment magnitude (Mw), including simple fault models of 208 thrust faults and 44 tsunami earthquakes on the plate interface, as well as 44 normal faults and 44 reverse faults in the outer-rise region. Using our tsunami inundation forecasting algorithm (NearTIF), we could rapidly estimate the tsunami inundation in Pelabuhan Ratu for three different hypothetical earthquakes. The first hypothetical earthquake is a megathrust event (Mw 9.0) offshore Sumatra, about 600 km from Pelabuhan Ratu, representing a worst-case event in the far field. The second hypothetical earthquake (Mw 8.5) is based on a slip-deficit rate estimated from geodetic measurements and represents the most likely large event near Pelabuhan Ratu. The third hypothetical earthquake is a tsunami-earthquake type (Mw 8.1), which often occurs south of Java. We compared the tsunami inundation maps produced by the NearTIF algorithm with results of direct forward inundation modeling for the hypothetical earthquakes. The tsunami inundation maps produced by the two methods are similar for all three cases. However, the inundation map from the pre-computed database can be obtained in a much shorter time (1 min) than one from forward inundation modeling (40 min). These results indicate that the NearTIF algorithm based on a pre-computed inundation database is reliable and useful for tsunami warning purposes. This study also demonstrates that the NearTIF algorithm can work well even when the earthquake source is located outside the area of the fault-model database, because it uses a time-shifting procedure to search for the best-fit scenario.
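    The time-shifting search at the core of this kind of database lookup can be sketched as follows, assuming pre-computed waveforms are stored at a common offshore reference point with the same sampling as the event estimate; variable names and the RMS misfit criterion are illustrative, not the NearTIF implementation.

```python
# Hedged sketch of a pre-computed-database lookup with time shifting: pick the
# stored scenario whose reference-point waveform best matches the event waveform
# after an optimal shift. A circular shift (np.roll) is used for brevity.
import numpy as np

def best_fit_scenario(event_wave, db_waves, max_shift=120):
    """event_wave: 1-D array at a reference offshore point for the current event.
    db_waves: dict {scenario_id: 1-D array at the same point, same sampling}.
    Returns (scenario_id, shift_samples, rms)."""
    best = (None, 0, np.inf)
    n = len(event_wave)
    for sid, w in db_waves.items():
        for shift in range(-max_shift, max_shift + 1):
            shifted = np.roll(w[:n], shift)
            rms = np.sqrt(np.mean((event_wave - shifted) ** 2))
            if rms < best[2]:
                best = (sid, shift, rms)
    return best

# Usage idea: the returned scenario_id indexes the pre-computed inundation map,
# and shift_samples re-times its arrival, so a coastal forecast is available in
# seconds instead of requiring a full inundation simulation.
```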

  12. Real-time correction of tsunami site effect by frequency-dependent tsunami-amplification factor

    NASA Astrophysics Data System (ADS)

    Tsushima, H.

    2017-12-01

    For tsunami early warning, I developed a frequency-dependent tsunami-amplification factor and used it to design a recursive digital filter applicable to real-time correction of the tsunami site response. In this study, I assumed that a tsunami waveform at an observing point can be modeled in the time domain as the convolution of source, path and site effects. Under this assumption, the spectral ratio between an offshore point and the nearby coast can be regarded as the site response (i.e. a frequency-dependent amplification factor). If the amplification factor is prepared before a tsunamigenic earthquake occurs, its temporal convolution with the offshore tsunami waveform provides a real-time tsunami prediction at the coast. Tsunami waveforms calculated by numerical simulations were used to develop the frequency-dependent amplification factor. First, I performed tsunami simulations based on nonlinear shallow-water theory for many tsunamigenic-earthquake scenarios, varying the seismic magnitudes and locations. The resulting tsunami waveforms at offshore and nearby coastal observing points were then used in a spectral-ratio analysis, and the average of the spectral ratios over the scenarios was taken as the frequency-dependent amplification factor. Finally, the estimated amplification factor was used to design a recursive digital filter that can be applied in the time domain. The procedure was applied to Miyako Bay on the Pacific coast of northeastern Japan. The averaged tsunami-height spectral ratio (i.e. the amplification factor) between the center of the bay and the outside shows a peak at a wave period of 20 min. A recursive digital filter based on the estimated amplification factor shows good performance in real-time correction of the tsunami-height amplification due to the site effect. This study is supported by Japan Society for the Promotion of Science (JSPS) KAKENHI grant 15K16309.
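    One simple way to realise a recursive filter with a single amplification peak, in the spirit of the approach described above, is to place a second-order resonator at the 20 min peak period in parallel with a unity path; the peak gain, period, Q and sampling interval below are assumptions for illustration, not the filter actually derived from the Miyako Bay spectral ratios.

```python
# Hedged sketch: a recursive (IIR) stand-in for a site-amplification factor with
# a single peak (~20 min period, gain a_peak), applied causally so it could run
# on streaming offshore data.
import numpy as np
from scipy.signal import iirpeak, lfilter

def site_amplification_filter(dt_s=15.0, peak_period_s=20 * 60.0, a_peak=3.0, q=2.0):
    """Return (b, a) of a parallel 'unity + resonant band-pass' filter whose
    gain is ~1 off-peak and ~a_peak at the peak period."""
    fs = 1.0 / dt_s
    w0 = (1.0 / peak_period_s) / (fs / 2.0)       # normalized peak frequency
    b_bp, a_bp = iirpeak(w0, q)                   # unit-gain resonator at w0
    # H(z) = 1 + (a_peak - 1) * B(z)/A(z) = [A(z) + (a_peak - 1) B(z)] / A(z)
    b = np.polyadd(a_bp, (a_peak - 1.0) * b_bp)
    return b, a_bp

def predict_coastal_wave(offshore_eta, dt_s=15.0):
    b, a = site_amplification_filter(dt_s=dt_s)
    return lfilter(b, a, offshore_eta)            # causal: usable in real time
```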

  13. Numerical Simulation of Several Tectonic Tsunami Sources at the Caribbean Basin

    NASA Astrophysics Data System (ADS)

    Chacon-Barrantes, S. E.; Lopez, A. M.; Macias, J.; Zamora, N.; Moore, C. W.; Llorente Isidro, M.

    2016-12-01

    The Tsunami Hazard Assessment Working Group (WG2) of the Intergovernmental Coordination Group for the Tsunami and Other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (ICG/CARIBE-EWS) has been tasked with identifying tsunami sources for the Caribbean region and evaluating their effects along Caribbean coasts. A list of tectonic sources was developed and presented at the Fall 2015 AGU meeting, and WG2 is currently working on a list of non-tectonic sources. In addition, three expert meetings were held in 2016 to define worst-case, most credible scenarios for southern Hispaniola and Central America. WG2 has been tasked with simulating these scenarios to provide an estimate of the resulting effects on coastal areas within the Caribbean. In this study we simulated tsunamis with two leading numerical models (NEOWAVE and Tsunami-HySEA) to compare the results and to report the consequences for the Caribbean region should a tectonically induced tsunami occur at any of these postulated sources. The sources considered are located offshore Central America, at the North Panamá Deformed Belt (NPDB), at the South Caribbean Deformed Belt (SCDB) and around La Hispaniola Island. The results obtained in this study are critical for developing a catalog of scenarios that can be used in future CaribeWave exercises and by ICG/CARIBE-EWS member states as input for modeling tsunami inundation at their coastal locations. Inundation parameters are a further step toward producing tsunami evacuation maps and developing plans and procedures to increase tsunami awareness and preparedness within the Caribbean.

  14. Hydraulic experiment on tsunami sand deposits relating with grain size distribution and magnitude of incident waves

    NASA Astrophysics Data System (ADS)

    Yamamoto, A.; Takahashi, T.; Harada, K.; Nojima, K.

    2016-12-01

    A huge earthquake occurred off the Tohoku district of Japan on March 11th, 2011. The massive tsunami generated by the earthquake struck coastal areas and caused serious damage. This disaster requires a reconsideration of tsunami countermeasures for the Nankai Trough. Many such measures are based on the history of large earthquakes and tsunamis, but because these are low-frequency disasters and the historical documents are limited, tsunami sand deposits are expected to help analyze paleotsunamis. At present, however, tsunami sand deposits are only used to confirm that tsunamis occurred and to estimate their relative magnitudes. The thickness of the sand layer and the grain size may provide clues to the tsunami force and could even help reveal the tsunami source; such results would also be useful for improving present tsunami countermeasures. The objective of this study is to investigate the formation mechanism of tsunami sand deposits by hydraulic experiment. A two-dimensional water channel consisted of a wave maker, a flat section and a slope section. A movable-bed section with various grain sizes and sand distributions was set at the end of the flat section. Bore waves of several heights transported the sand onto the slope section during run-up. Water-surface elevation and flow velocity were measured at several points, and the distribution of the tsunami sand deposit was measured along the slope section. The experimental results showed that the amount of tsunami sand deposit was related to the grain-size distribution and the magnitude of the incident waves. Furthermore, the number of incident waves affected the profile of the tsunami sand deposits.

  15. The July 17, 2006 Java Tsunami: Tsunami Modeling and the Probable Causes of the Extreme Run-up

    NASA Astrophysics Data System (ADS)

    Kongko, W.; Schlurmann, T.

    2009-04-01

    On 17 July 2006, an earthquake of magnitude Mw 7.8 off the south coast of west Java, Indonesia, generated a tsunami that affected over 300 km of the south Java coastline and killed more than 600 people. Observed tsunami heights and measured run-up distributions were fairly uniform, approximately 5 to 7 m along a 200 km coastal stretch; remarkably, a locally focused tsunami run-up exceeding 20 m was observed at Nusakambangan Island. Within the framework of the German-Indonesian Tsunami Early Warning System (GITEWS) project, a high-resolution nearshore bathymetric survey using a multi-beam echo-sounder was recently conducted. Additional geodata were collected with the Intermap Technologies STAR-4 airborne interferometric SAR system at a 5 m ground sample distance in order to establish a detailed Digital Terrain Model (DTM). This paper describes the outcome of tsunami modelling approaches using high-resolution bathymetry and topography data as part of a case study in Cilacap, Indonesia, and medium-resolution data for other areas along the south Java coastline. Using two different seismic deformation models to represent the tsunami source, a numerical code based on the 2D nonlinear shallow-water equations is used to simulate probable tsunami run-up scenarios. Several model tests are performed, and virtual gauges offshore, nearshore, at the coastline, and in the run-up zone are recorded. For validation, the model results are compared with field observations and sea-level data observed at several tide-gauge stations. The performance of the numerical simulations and their correlation with the observed field data are highlighted, and probable causes for the extreme wave heights and run-ups are outlined. References: Ammon, C.J., Kanamori, K., Lay, T., and Velasco, A., 2006. The July 2006 Java Tsunami Earthquake, Geophysical Research Letters, 33(L24308). Fritz, H.M., Kongko, W., Moore, A., McAdoo, B., Goff, J., Harbitz, C., Uslu, B., Kalligeris, N., Suteja, D., Kalsum, K., Titov, V., Gusman, A., Latief, H., Santoso, E., Sujoko, S., Djulkarnaen, D., Sunendar, H., and Synolakis, C., 2007. Extreme Run-up from the 17 July 2006 Java Tsunami. Geophysical Research Letters, 34(L12602). Fujii, Y., and Satake, K., 2006. Source of the July 2006 Java Tsunami Estimated from Tide Gauge Records. Geophysical Research Letters, 33(L23417). Intermap Federal Services Inc., 2007. Digital Terrain Model Cilacap, version 1. Project of GITEWS, DLR Germany. Kongko, W., and Leschka, S., 2008. Nearshore Bathymetry Measurements in Indonesia: Part 1. Cilacap, Technical Report, DHI-WASY GmbH, Syke, Germany. Kongko, W., Suranto, Chaeroni, Aprijanto, Zikra, and Sujantoko, 2006. Rapid Survey on Tsunami Jawa 17 July 2006, http://nctr.pmel.noaa.gov/java20060717/tsunami-java170706_e.pdf. Lavigne, F., Gomes, C., Giffo, M., Wassmer, P., Hoebreck, C., Mardiatno, D., Prioyono, J., and Paris, R., 2007. Field Observation of the 17 July 2006 Tsunami in Java. Natural Hazards and Earth System Sciences, 7: 177-183.
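    For readers unfamiliar with the governing equations, the sketch below solves a one-dimensional analogue of the nonlinear shallow-water equations used in the study (flat bottom, Lax-Friedrichs scheme, reflective ends). It is only an illustration of the equation type and omits bathymetry source terms, wetting and drying, and grid nesting.

```python
# Hedged sketch: minimal 1-D nonlinear shallow-water solver (Lax-Friedrichs,
# flat bottom, reflective walls), far simpler than the 2-D nested-grid code
# used in the study.
import numpy as np

def nswe_1d(h0, u0, dx, t_end, g=9.81, cfl=0.45):
    """h0, u0: initial depth [m] and velocity [m/s] on a uniform grid."""
    h, hu = h0.copy(), h0 * u0
    t = 0.0
    while t < t_end:
        c = np.abs(hu / h) + np.sqrt(g * h)                # local wave speeds
        dt = min(cfl * dx / c.max(), t_end - t)
        f1 = hu                                            # flux of h
        f2 = hu ** 2 / h + 0.5 * g * h ** 2                # flux of hu
        h_new, hu_new = h.copy(), hu.copy()
        h_new[1:-1] = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (f1[2:] - f1[:-2])
        hu_new[1:-1] = 0.5 * (hu[2:] + hu[:-2]) - dt / (2 * dx) * (f2[2:] - f2[:-2])
        h_new[0], h_new[-1] = h_new[1], h_new[-2]          # reflective boundaries
        hu_new[0], hu_new[-1] = -hu_new[1], -hu_new[-2]
        h, hu, t = h_new, hu_new, t + dt
    return h, hu / h

# Example: a 1 m Gaussian hump on 4000 m of water over a 400 km domain.
x = np.linspace(0, 400e3, 2001)
h_init = 4000.0 + 1.0 * np.exp(-((x - 200e3) / 20e3) ** 2)
h_final, u_final = nswe_1d(h_init, np.zeros_like(x), dx=x[1] - x[0], t_end=600.0)
```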

  16. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
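    Under the Poisson assumption described above, probabilities from independent sources aggregate through their summed rates; the sketch below shows the arithmetic with placeholder rates, not values from this study.

```python
# Hedged sketch: aggregating tsunami probability from independent Poissonian
# sources, each with a mean recurrence rate. Rates are placeholders, not
# values from Geist & Parsons (2009).
import math

def poisson_exceedance_probability(rates_per_yr, exposure_yr=50.0):
    """P(at least one tsunami-generating event in exposure_yr) when each source
    is an independent Poisson process: P = 1 - exp(-T * sum(rate_i))."""
    total_rate = sum(rates_per_yr)
    return 1.0 - math.exp(-total_rate * exposure_yr)

# Example with illustrative rates for three hypothetical sources (per year):
print(poisson_exceedance_probability([1e-3, 5e-4, 2e-4]))   # ~0.08 over 50 years
```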

  17. Observing Natural Hazards: Tsunami, Hurricane, and El Niño Observations from the NDBC Ocean Observing System of Systems

    NASA Astrophysics Data System (ADS)

    O'Neil, K.; Bouchard, R.; Burnett, W. H.; Aldrich, C.

    2009-12-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) operates and maintains the NDBC Ocean Observing System of Systems (NOOSS), comprising three networks that provide critical information before, during and after extreme hazard events such as tsunamis, hurricanes, and El Niños. While each system has its own mission, they have in common the requirement to remain on station in remote areas of the ocean to provide reliable and accurate observations. After the 2004 Sumatran Tsunami, NOAA expanded its network of tsunameters from six in the Pacific Ocean to a network of 39 stations providing information to Tsunami Warning Centers to enable faster and more accurate tsunami warnings for coastal communities in the Pacific, Atlantic, Caribbean and the Gulf of Mexico. The tsunameter measurements are used to detect the amplitude and period of tsunamis, and the data can be assimilated into models to predict the impact of tsunamis on coastal communities. The network has been used to detect tsunamis generated by earthquakes, including the 2006 and 2007 Kuril Islands, 2007 Peru, and 2007 Solomon Islands events, and most recently the 2009 Dusky Sound, New Zealand earthquake. In August 2009, NOAA adjusted its 2009 Atlantic Hurricane Seasonal Outlook from above-normal to near- or below-normal activity, primarily due to a strengthening El Niño. A key component in the detection of that El Niño was the Tropical Atmosphere Ocean (TAO) array operated by NDBC. TAO provides real-time data for improved detection, understanding, and prediction of El Niño and La Niña. The 55-buoy TAO array spans the central and eastern equatorial Pacific, providing real-time and post-deployment-recovery data to support climate analysis and forecasts. Although in this case the El Niño benefits the tropical Atlantic, its alternate manifestation, La Niña, typically enhances hurricane activity in the Atlantic, and the various phases of the El Niño-Southern Oscillation can result in extreme hazards such as floods and landslides, droughts and wildfires, and fish kills and other biological impacts. For almost 40 years, NDBC has operated and maintained a network of buoys and coastal automated stations for meteorological and oceanographic observations that support real-time weather analysis, forecasting, and warnings. The US National Hurricane Center (NHC) uses the observations from the buoys to determine the position and intensity of tropical cyclones and the extent of their extreme winds and seas. Since 2006, NHC has cited over 100 instances of using buoy data in its Forecast Discussions or Public Advisories. Data are also used in reconstructing and analyzing the extent of devastation from land-falling hurricanes. The unprecedented devastation caused by the rising waters of 2005's Hurricane Katrina was attributed to waves, generated and reported by the NDBC buoys in the Gulf of Mexico, superimposed upon the storm surge at landfall. The three constituent systems of the NOOSS comprise a network of more than 250 observing stations providing real-time and archived data for forecasters, scientists, and disaster management officials.

  18. Real-Time Detection of Tsunami Ionospheric Disturbances with a Stand-Alone GNSS Receiver: An Integration of GPS and Galileo Systems

    NASA Astrophysics Data System (ADS)

    Savastano, Giorgio; Komjathy, Attila; Verkhoglyadova, Olga; Wei, Yong; Mazzoni, Augusto; Crespi, Mattia

    2017-04-01

    Tsunamis can produce gravity waves that propagate up to the ionosphere generating disturbed electron densities in the E and F regions. These ionospheric disturbances are studied in detail using ionospheric total electron content (TEC) measurements collected by continuously operating ground-based receivers from the Global Navigation Satellite Systems (GNSS). Here, we present results using a new approach, named VARION (Variometric Approach for Real-Time Ionosphere Observation), and for the first time, we estimate slant TEC (sTEC) variations in a real-time scenario from GPS and Galileo constellations. Specifically, we study the 2016 New Zealand tsunami event using GNSS receivers with multi-constellation tracking capabilities located in the Pacific region. We compare sTEC estimates obtained using GPS and Galileo constellations. The efficiency of the real-time sTEC estimation using the VARION algorithm has been demonstrated for the 2012 Haida Gwaii tsunami event. TEC variations induced by the tsunami event are computed using 56 GPS receivers in Hawai'i. We observe TEC perturbations with amplitudes up to 0.25 TEC units and traveling ionospheric disturbances moving away from the epicenter at a speed of about 316 m/s. We present comparisons with the real-time tsunami model MOST (Method of Splitting Tsunami) provided by the NOAA Center for Tsunami Research. We observe variations in TEC that correlate well in time and space with the propagating tsunami waves. We conclude that the integration of different satellite constellations is a crucial step forward to increasing the reliability of real-time tsunami detection systems using ground-based GNSS receivers as an augmentation to existing tsunami early warning systems.
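    The variometric idea can be sketched as follows: time-differencing the geometry-free carrier-phase combination cancels the (constant) ambiguities and leaves the epoch-to-epoch slant-TEC change. The sketch assumes GPS L1/L2 phases already expressed in metres and free of cycle slips; it illustrates the principle rather than the VARION software itself.

```python
# Hedged sketch of a variometric slant-TEC rate: differencing the geometry-free
# (L1 - L2) carrier-phase combination between epochs removes constant ambiguities
# and leaves the TEC change. Inputs assumed in metres and cycle-slip free.
import numpy as np

F1 = 1575.42e6          # GPS L1 carrier frequency [Hz]
F2 = 1227.60e6          # GPS L2 carrier frequency [Hz]
K = 40.308e16           # ionospheric delay constant per TECU [m * Hz^2 / TECU]

def stec_rate(l1_m, l2_m, dt_s=30.0):
    """l1_m, l2_m: carrier-phase series [m] for one satellite-receiver arc.
    Returns d(sTEC)/dt in TECU per second."""
    l_gf = np.asarray(l1_m) - np.asarray(l2_m)              # geometry-free combination
    factor = (F1 ** 2 * F2 ** 2) / (K * (F1 ** 2 - F2 ** 2))  # ~9.5 TECU per metre of L_GF
    d_stec = factor * np.diff(l_gf)                          # TECU change per epoch
    return d_stec / dt_s

# np.cumsum(stec_rate(l1, l2)) * dt_s gives the detrended sTEC variation used to
# look for tsunami-driven travelling ionospheric disturbances.
```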

  19. The 1887 earthquake and tsunami in the Ligurian Sea: analysis of coastal effects studied by numerical modeling and prototype for real-time computing

    NASA Astrophysics Data System (ADS)

    Monnier, Angélique; Gailler, Audrey; Loevenbruck, Anne; Heinrich, Philippe; Hébert, Hélène

    2017-04-01

    The February 1887 earthquake in Italy (Imperia) triggered a tsunami that was well observed on the French and Italian coastlines. Tsunami waves were recorded on a tide gauge in the Genoa harbour with a small, recently reappraised maximum amplitude of about 10-12 cm (crest-to-trough). The magnitude of the earthquake is still debated in the recent literature and is discussed in light of the available macroseismic, tectonic and tsunami data. While the tsunami waveform observed in the Genoa harbour may be well explained with a magnitude smaller than 6.5 (Hébert et al., EGU 2015), in this study we investigate whether such source models are consistent with the tsunami effects reported elsewhere along the coastline. The idea is to take advantage of the fine bathymetric data recently synthesized for the French Tsunami Warning Center (CENALT) to test the 1887 source parameters using refined, nested-grid tsunami numerical modeling down to the harbour scale. Several source parameters are investigated to provide a series of models accounting for various magnitudes and mechanisms. This allows us to compute the tsunami effects for several coastal sites in France (Nice, Villefranche, Antibes, Mandelieu, Cannes) and to compare them with observations. We also check the computing time of the chosen scenarios to assess whether running nested-grid simulations in real time is feasible, in terms of computational cost, in an operational context for these Ligurian scenarios. This work is supported by the FP7 ASTARTE project (Assessment Strategy and Risk Reduction for Tsunamis in Europe, grant 603839 FP7) and by the French PIA TANDEM (Tsunamis in the Atlantic and English ChaNnel: Definition of the Effects through Modeling) project (grant ANR-11-RSNR-00023).

  20. Earthquake-triggered landslides along the Hyblean-Malta Escarpment (off Augusta, eastern Sicily, Italy) - assessment of the related tsunamigenic potential

    NASA Astrophysics Data System (ADS)

    Ausilia Paparo, Maria; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano

    2017-02-01

    Eastern Sicily is affected by earthquakes and tsunamis of local and remote origin, as documented in numerous historical chronicles. Recent studies have emphasised the role of submarine landslides as the direct cause of the main local tsunamis, envisaging that the 1693 and 1908 earthquakes produced tsunamis themselves but also triggered mass failures capable of generating even larger tsunamis. The debate is still open and no general consensus has been reached among scientists so far, but this research has had the merit of attracting attention to the possible generation of tsunamis by landslides off Sicily. In this paper we investigate the tsunami potential of mass failures along the sector of the Hyblean-Malta Escarpment (HME) facing Augusta. The HME is the main offshore geological structure of the region, running almost parallel to the coast off eastern Sicily; here, the bottom morphology and slope steepness favour soil failures. In our work we study slope stability under seismic load along a number of HME transects using the Minimum Lithostatic Deviation (MLD) method, which is based on limit-equilibrium theory. The main goal is to identify sectors of the HME that could be unstable under the effect of realistic earthquakes. We estimate the possible landslide volumes and use them as input for numerical codes that simulate the landslide motion and the consequent tsunami. This is an important step for the assessment of tsunami hazard in eastern Sicily and for local tsunami mitigation policies. It is also important for tsunami warning systems, since it can help identify the minimum earthquake magnitude capable of triggering destructive landslide-induced tsunamis and therefore support knowledge-based criteria for alerting the population.

  1. Emergency response and field observation activities of geoscientists in California (USA) during the September 29, 2009, Samoa Tsunami

    NASA Astrophysics Data System (ADS)

    Wilson, Rick I.; Dengler, Lori A.; Goltz, James D.; Legg, Mark R.; Miller, Kevin M.; Ritchie, Andy; Whitmore, Paul M.

    2011-07-01

    State geoscientists (geologists, geophysicists, seismologists, and engineers) in California work closely with federal, state and local government emergency managers to help prepare coastal communities for potential impacts from a tsunami before, during, and after an event. For teletsunamis, as scientific information (forecast model wave heights, first-wave arrival times, etc.) from NOAA's West Coast and Alaska Tsunami Warning Center is made available, federal- and state-level emergency managers must help convey this information in a concise, comprehensible and timely manner to local officials who ultimately determine the appropriate response activities for their jurisdictions. During the September 29, 2009 Tsunami Advisory for California, government geoscientists assisted the California Emergency Management Agency by providing technical assistance during teleconference meetings with NOAA and other state and local emergency managers prior to the arrival of the tsunami. This technical assistance included background information on anticipated tidal conditions when the tsunami was set to arrive, wave height estimates from state-modeled scenarios for areas not covered by NOAA's forecast models, and clarifying which regions of the state were at greatest risk. Over the last year, state geoscientists have started to provide additional assistance: 1) working closely with NOAA to simplify their tsunami alert messaging and expand their forecast modeling coverage; 2) creating "playbooks" containing information from existing tsunami scenarios for local emergency managers to reference during an event; and, 3) developing a state-level information "clearinghouse" and pre-tsunami field response team to assist local officials as well as observe and report tsunami effects. Activities of geoscientists were expanded during the more recent Tsunami Advisory on February 27, 2010, including deploying a geologist from the California Geological Survey as a field observer who provided information back to emergency managers.

  2. Design Principles for resilient cyber-physical Early Warning Systems - Challenges, Experiences, Design Patterns, and Best Practices

    NASA Astrophysics Data System (ADS)

    Gensch, S.; Wächter, J.; Schnor, B.

    2014-12-01

    Early warning systems (EWS) are safety-critical IT infrastructures that serve the purpose of potentially saving lives or assets by observing real-world phenomena and issuing timely warning products to authorities and communities. An EWS consists of sensors, communication networks, data centers, simulation platforms, and dissemination channels. The components of this cyber-physical system may be affected by natural hazards and by malfunctions of the components themselves. Resilience engineering has so far mostly been applied to safety-critical systems and processes in transportation (aviation, automotive), construction and medicine. Early warning systems need equivalent techniques to compensate for failures, as well as means to adapt to changing threats, emerging technology and new research findings. We present threats and pitfalls from our experience with the German-Indonesian tsunami early warning system, together with architectural, technological and organizational concepts employed that can enhance an EWS's resilience. The current EWS comprises a multi-type sensor-data upstream part, different processing and analysis engines, a decision support system, and various warning dissemination channels. Each subsystem requires a set of approaches for ensuring stable functionality across system-layer boundaries, including institutional borders. Services must not only be available but also produce correct results. Most sensors are distributed components with restricted resources, communication channels and power supply. An example of successful resilience engineering is the power-capacity-based functional management for buoy and tide-gauge stations. We discuss various fault models, such as cause-and-effect models on linear pathways, the interaction of multiple events, and complex, non-linear interactions of supposedly reliable subsystems, together with the fault-tolerance means implemented to tackle these threats.

  3. Field Survey of the 2015 Illapel Tsunami in North Central Chile

    NASA Astrophysics Data System (ADS)

    Lagos, M.; Fritz, H. M.

    2016-12-01

    The magnitude Mw 8.3 earthquake in north-central Chile on September 16, 2015 generated a tsunami that rapidly flooded coastal areas. The tsunami impact was concentrated in the Coquimbo region, while the Valparaíso and Atacama regions were also affected. Fortunately, ancestral knowledge from past tsunamis in the region, as well as tsunami education and evacuation exercises, prompted most coastal residents to spontaneously evacuate to high ground after the earthquake. The event caused 11 fatalities: 8 were associated with the tsunami, while 3 were attributed to building collapses caused by the earthquake. International scientists joined the local effort from September 20 to 26, 2015. The international tsunami survey team (ITST) interviewed numerous eyewitnesses and documented flow depths, runup heights, inundation distances, sediment deposition, damage patterns, performance of the navigation infrastructure and impact on the natural environment. The ITST covered a 500 km stretch of coastline from Caleta Chañaral de Aceituno (28.8° S), south of Huasco, down to Llolleo near San Antonio (33.6° S). We surveyed more than 40 locations and recorded more than 100 tsunami and runup heights with differential GPS and integrated laser range finders. The tsunami impact peaked at Caleta Totoral near Punta Aldea, with both tsunami and runup heights exceeding 10 m as surveyed on September 22. Runup also exceeded 10 m at a second, uninhabited location some 15 km south of Caleta Totoral. A significant variation in tsunami impact was observed along the coastlines of central Chile at local and regional scales. The tsunami occurred in the evening hours, limiting the availability of eyewitness video footage. Observations from the 2015 Chile tsunami are compared with recent Chilean tsunamis. The tsunami was characterized by rapid arrival within minutes in the near field, requiring spontaneous self-evacuation, as warning messages did not reach some of the hardest-hit fishing villages prior to tsunami arrival. The absence of a massive tsunami outside of the Coquimbo region may mislead evacuated residents in the adjacent Atacama and Valparaíso regions of Chile in potential future events. This event poses significant challenges to community-based education raising tsunami awareness.

  4. New Measurements and Modeling Capability to Improve Real-time Forecast of Cascadia Tsunamis along U.S. West Coast

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Titov, V. V.; Bernard, E. N.; Spillane, M. C.

    2014-12-01

    The tragedies of the 2004 Sumatra and 2011 Tohoku tsunamis exposed the limits of our knowledge in preparing for devastating tsunamis, especially in the near field. The 1,100-km Pacific coastline of North America has tectonic and geological settings similar to Sumatra and Japan. The geological record unambiguously shows that the Cascadia fault has caused devastating tsunamis in the past, and this geological process will cause tsunamis in the future. Existing observational instruments along the Cascadia Subduction Zone are capable of providing tsunami data within minutes of tsunami generation. However, this strategy requires separating the tsunami signal from the overwhelming high-frequency seismic waves produced during a strong earthquake, a real technical challenge for the existing operational tsunami observational network. A new generation of nano-resolution pressure sensors can provide high temporal resolution of the earthquake and tsunami signals without losing precision. The nano-resolution pressure sensor offers a state-of-the-science ability to separate earthquake vibrations and other oceanic noise from tsunami waveforms, paving the way for accurate, early warnings of local tsunamis. This breakthrough underwater technology has been tested and verified for a couple of micro-tsunami events (Paros et al., 2011). Real-time forecasting of Cascadia tsunamis is becoming possible with the development of nano-tsunameter technology. The present study investigates how to optimize the placement of these new sensors so that the forecast time can be shortened. The presentation will cover the optimization of an observational array to quickly detect and forecast a tsunami generated by a strong Cascadia earthquake, including short and long rupture scenarios. Lessons learned from the 2011 Tohoku tsunami will be examined to demonstrate how the local forecast can be improved using the new technology. We expect this study to provide useful guidelines for future siting and deployment of the new-generation tsunameters. Driven by the new technology, we demonstrate scenarios of real-time forecasting of Cascadia tsunami impact along the Pacific Northwest, as well as in Puget Sound.
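    The separation of the long-period tsunami signal from seismic and acoustic energy in a bottom-pressure record can be illustrated with a simple low-pass filter and a hydrostatic conversion to equivalent sea-surface height; the cutoff period, sampling rate and density below are illustrative assumptions, not the study's processing.

```python
# Hedged sketch: low-pass filtering a detided bottom-pressure record to suppress
# seismic surface waves and acoustic noise, then converting the residual to an
# equivalent sea-surface-height anomaly.
import numpy as np
from scipy.signal import butter, sosfiltfilt

RHO = 1030.0   # sea-water density [kg/m^3]
G = 9.81       # gravity [m/s^2]

def tsunami_component(pressure_pa, dt_s=1.0, cutoff_period_s=120.0, order=4):
    """pressure_pa: detided bottom-pressure anomaly [Pa], sampled every dt_s seconds."""
    wn = (1.0 / cutoff_period_s) / (0.5 / dt_s)             # normalized cutoff frequency
    sos = butter(order, wn, btype="low", output="sos")
    p_lp = sosfiltfilt(sos, np.asarray(pressure_pa, float))  # zero-phase for clarity;
                                                             # real-time use needs a causal filter
    return p_lp / (RHO * G)                                  # hydrostatic conversion to metres
```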

  5. Earthquake Early Warning Management based on Client-Server using Primary Wave data from Vibrating Sensor

    NASA Astrophysics Data System (ADS)

    Laumal, F. E.; Nope, K. B. N.; Peli, Y. S.

    2018-01-01

    Early warning is a mechanism for issuing a warning before an actual incident occurs and can be applied to natural events such as tsunamis or earthquakes. Earthquakes are classified as tectonic or volcanic depending on their source and nature. The released energy propagates in all directions as primary and secondary waves. The primary wave, the initial vibration of the earthquake, propagates longitudinally, while the secondary wave follows it as a transverse motion that constitutes the destructive shaking of the actual earthquake. To process the primary-wave data captured by the earthquake sensors, the network requires a client computer that receives the primary data from the sensors, authenticates it, and forwards it to a server computer that forms the early warning system. Following a propagation concept analogous to water waves, an early warning method was defined in which several sensors located along the same line send their initial vibrations as primary data on the same scale, and the server triggers an alarm sound as an early warning.
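    A minimal sketch of the client-server flow described above is given below: clients forward authenticated primary-wave readings and the server raises an alert once enough sensors agree. The message format, shared-token check, thresholds and single-connection handling are illustrative simplifications, not the system described in the paper.

```python
# Hedged sketch of a client-server early-warning flow: a client forwards one
# authenticated P-wave reading; the server alarms when several sensors report
# amplitudes above a trigger level. All values and formats are illustrative.
import json
import socket

TOKEN = "demo-shared-secret"         # illustrative; a real system needs proper auth
SERVER = ("127.0.0.1", 5140)
ALERT_THRESHOLD = 3                  # number of agreeing sensors before alarming

def client_forward(sensor_id, p_wave_amplitude):
    """Send one authenticated P-wave reading to the server."""
    msg = json.dumps({"token": TOKEN, "sensor": sensor_id,
                      "p_amp": p_wave_amplitude}).encode()
    with socket.create_connection(SERVER) as sock:
        sock.sendall(msg + b"\n")

def server_loop(trigger_amp=0.01):
    """Accept readings one connection at a time; 'alarm' once enough sensors agree."""
    triggered = set()
    with socket.create_server(SERVER) as srv:
        while True:
            conn, _ = srv.accept()
            with conn, conn.makefile() as f:
                for line in f:
                    data = json.loads(line)
                    if data.get("token") != TOKEN:
                        continue                 # reject unauthenticated clients
                    if data["p_amp"] >= trigger_amp:
                        triggered.add(data["sensor"])
                    if len(triggered) >= ALERT_THRESHOLD:
                        print("EARLY WARNING: P-wave detected by", sorted(triggered))
```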

  6. Global early warning systems for natural hazards: systematic and people-centred.

    PubMed

    Basher, Reid

    2006-08-15

    To be effective, early warning systems for natural hazards need not only a sound scientific and technical basis, but also a strong focus on the people exposed to risk and a systems approach that incorporates all of the relevant factors in that risk, whether arising from natural hazards or social vulnerabilities, and from short-term or long-term processes. Disasters are increasing in number and severity, and the international institutional frameworks to reduce disasters are being strengthened under United Nations oversight. Since the Indian Ocean tsunami of 26 December 2004, there has been a surge of interest in developing early warning systems that cater to the needs of all countries and all hazards.

  7. Hydraulic experimental investigation on spatial distribution and formation process of tsunami deposit on a slope

    NASA Astrophysics Data System (ADS)

    Harada, K.; Takahashi, T.; Yamamoto, A.; Sakuraba, M.; Nojima, K.

    2017-12-01

    An important aim of studying tsunami deposits is to estimate the characteristics of past tsunamis from the deposits found locally. Based on the tsunami characteristics estimated from the deposits, it is possible to carry out tsunami risk assessment in coastal areas. Tsunami deposits are thought to form through the dynamic interplay between the tsunami's hydraulic parameters, the sediment grain size, the topography, and other factors, but it is not yet possible to adequately infer tsunami characteristics from the deposits. One reason is that the formation process of tsunami deposits is not sufficiently understood. In this study, we analyze the measurements of the hydraulic experiment of Yamamoto et al. (2016) and focus on the formation process and distribution of tsunami deposits. The hydraulic experiment was conducted in a two-dimensional water channel with a slope, and the tsunami was input as a bore. A movable-bed section with several grain-size distributions was installed as a seabed slope connecting to the shoreline. The water level was measured with ultrasonic displacement gauges, and the flow velocity with propeller current meters and an electromagnetic current meter, at several points. The distribution of the tsunami deposit was measured from the shoreline to the run-up limit on the slope. Yamamoto et al. (2016) reported how the deposit distribution varied with wave height and sand grain size; in this study, the formation process of the deposit is examined hydraulically on the basis of those measurement data. Time series of hydraulic parameters such as the Froude number, Shields number and Rouse number were calculated to clarify the formation process of the deposit. Near the tsunami front, strong flow occurred from the shoreline to around the middle of the slope, and the measurements suggest that the dominant mode of deposit formation there is suspended load. Near the run-up limit, where the flow velocity decreases, the sediment moves as bedload. As a result, the amount of sediment transported near the run-up limit changes under the influence of grain size.
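    The non-dimensional numbers mentioned above can be computed directly from the measured depth and velocity series, as in the sketch below; the drag coefficient, grain density and settling velocity are illustrative inputs rather than the experiment's values.

```python
# Hedged sketch: Froude, Shields and Rouse numbers from a measured depth and
# velocity time series, using an assumed drag law and illustrative sediment values.
import numpy as np

G = 9.81          # gravity [m/s^2]
KAPPA = 0.4       # von Karman constant
S = 2.65          # specific gravity of quartz sand

def flow_numbers(h, u, d50, w_s, cf=3e-3):
    """h, u: arrays of water depth [m] and depth-averaged velocity [m/s].
    d50: median grain diameter [m]; w_s: settling velocity [m/s]."""
    h = np.maximum(np.asarray(h, float), 1e-6)       # guard against dry instants
    u = np.asarray(u, float)
    froude = np.abs(u) / np.sqrt(G * h)
    u_star = np.sqrt(cf) * np.abs(u)                 # shear velocity from a drag law
    shields = u_star ** 2 / ((S - 1.0) * G * d50)    # bed mobility
    rouse = w_s / (KAPPA * np.maximum(u_star, 1e-6))
    return froude, shields, rouse

# Small Rouse numbers (roughly < 1) indicate suspension-dominated transport and
# large values (roughly > 2.5) indicate bedload, consistent with the front versus
# run-up-limit contrast described above.
```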

  8. U.S. Tsunami Information technology (TIM) Modernization: Developing a Maintainable and Extensible Open Source Earthquake and Tsunami Warning System

    NASA Astrophysics Data System (ADS)

    Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.

    2015-12-01

    Tsunami Information technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software that has been funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), w-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.

  9. Tsunami hazard assessment in the Colombian Caribbean Coast with a deterministic approach

    NASA Astrophysics Data System (ADS)

    Otero Diaz, L.; Correa, R.; Ortiz R, J. C.; Restrepo L, J. C.

    2014-12-01

    For the Caribbean Sea, we propose six potential tectonic tsunami sources, defining for each source the worst credible earthquake from an analysis of historical seismicity, tectonics and past tsunamis, and from a review of the IRIS, PDE, NOAA and CMT catalogs. The generation and propagation of tsunami waves from the selected sources were simulated with COMCOT 1.7, a numerical model that solves the linear and nonlinear long-wave equations by finite differences in both Cartesian and spherical coordinates. The modeling results are presented as maps of the maximum displacement of the free surface for the Colombian Caribbean coast and the island areas. They show that the event that would produce the greatest impact is generated at the North Panama Deformed Belt (NPDB) source, where the first wave train reaches the central Colombian coast in 40 minutes, generating wave heights of up to 3.7 m. At the islands of San Andrés and Providencia, tsunami waves reach more than 4.5 m due to edge-wave effects caused by the interaction between the waves and the barrier coral reef surrounding each island. The results obtained in this work are useful for planning future regional and local warning systems and for identifying priority areas in which to conduct detailed research on the tsunami threat.

  10. Simulation of landslide and tsunami of the 1741 Oshima-Oshima eruption in Hokkaido, Japan

    NASA Astrophysics Data System (ADS)

    Ioki, K.; Yanagisawa, H.; Tanioka, Y.; Kawakami, G.; Kase, Y.; Nishina, K.; Hirose, W.; Ishimaru, S.

    2017-12-01

    The 1741 tsunami was generated by the Oshima-Oshima sector collapse in southwestern Hokkaido, Japan. The tsunami caused great damage along the Japan Sea coast of the Oshima and Tsugaru peninsulas and was the largest tsunami generated in the Japan Sea. Deposits from this tsunami have been found by surveys along the coasts of Okushiri Island and Hiyama in Hokkaido. In this study, the landslide and tsunami of the Oshima-Oshima eruption were modeled to explain the distribution of debris deposits, the tsunami heights from historical records, and the distribution of tsunami deposits. First, the landslide region and the debris deposits were delineated from bathymetry survey data on the northern slope of Oshima-Oshima (Satake and Kato, 2001). In addition, the topography before the sector collapse and the landslide volume were re-estimated; the landslide volume was estimated at 2.2 km3. Based on these data, the landslide and tsunami were simulated using a two-layer model that treats the soil mass and the water mass, improving on the integrated landslide-tsunami model of Yanagisawa et al. (2014). Four values of the angle of internal friction and five values of Manning's roughness coefficient, both of which enter the bottom-friction term of the soil mass and affect the landslide motion and the tsunami generation respectively, were tested, and optimal values were found through this parameter study. In the optimal case, the soil mass slid slowly down the submarine slope and stopped after about 15 minutes, and the distribution of computed debris deposits agrees relatively well with the region of debris deposits delineated from the bathymetry. The first tsunami wave was generated within about 1 minute, while the soil mass was still sliding. The calculated tsunami heights match the historical records along the Okushiri and Hiyama coasts of Hokkaido, and the calculated inundation area covers the distribution of tsunami deposits found by the deposit surveys along those coasts.

  11. ALASKA MARINE VHF VOICE

    Science.gov Websites

    National Weather Service products via Alaska marine VHF voice: NOAA broadcasts offshore forecasts, nearshore forecasts and storm warnings on marine VHF channels.

  12. Tsunami Preparedness, Response, Mitigation, and Recovery Planning in California

    NASA Astrophysics Data System (ADS)

    Miller, K.; Wilson, R. I.; Johnson, L. A.; Mccrink, T. P.; Schaffer, E.; Bower, D.; Davis, M.

    2016-12-01

    In California, officials of state, federal, and local governments have coordinated to implement a Tsunami Preparedness and Mitigation Program. Building upon past preparedness efforts carried out year-round, this group has leveraged government support at all levels. A primary goal is for everyone who lives at or visits the coast to understand basic life-safety measures when responding to official tsunami alerts or natural warnings. Preparedness actions include observation of National Tsunami Preparedness Week, local "tsunami walk" drills, scenario-based exercises, testing of notification systems for public alert messaging, outreach materials, workshops, presentations, and media events. Program partners have worked together to develop emergency operations plans, evacuation plans, and tsunami annexes to plans for counties, cities, communities, and harbors in 20 counties along the coast. Working with state and federal partner agencies, coastal communities have begun to incorporate sophisticated tsunami "Playbook" scenario information into their planning. These innovative tsunami evacuation and response tools provide detailed evacuation maps and associated real-time response information for identifying areas where flooding could occur, which is critical information for evacuating populations on land near the shoreline. Acting on recommendations from the recent USGS-led, multi-disciplinary Science Application for Risk Reduction Tsunami Scenario report on impacts to California, and on American Society of Civil Engineers adoption proposals for the International Building Code, the state has begun to develop a strategy to incorporate probabilistic tsunami findings into state-level policy recommendations for building-code adoption, as well as an approach to land-use planning and building-code implementation in local jurisdictions. Additional efforts, in the context of sustained community resiliency, include developing recovery planning guidance for local communities.

  13. Tsunami Preparedness: Building On Past Efforts to Reach More People… California and Beyond!

    NASA Astrophysics Data System (ADS)

    Miller, K.; Siegel, J.; Pridmore, C. L.; Benthien, M. L.; Wilson, R. I.; Long, K.; Ross, S.

    2014-12-01

    The California Tsunami Program has continued to build upon past preparedness efforts, carried out year-round, while leveraging government support at all levels during National Tsunami Preparedness Week, the last week of March. A primary goal is for everyone who lives at or visits the coast to understand basic safety measures when responding to official tsunami alerts or natural warnings. In 2014, more so than ever before, many local, coastal jurisdictions conducted grass-roots activities in their areas. When requested, state and federal programs stepped in to contribute subject matter expertise, lessons learned, and support. And, this year, the new website, www.TsunamiZone.org, was developed. With a goal of establishing a baseline for future years, this website builds on the successes of the Great Shakeout Earthquake Drills (www.ShakeOut.org) by allowing people to locate and register for tsunami preparedness events in their area. Additionally, it provides a central location for basic tsunami preparedness information, and links to find out more. The idea is not only to empower people with the best available, vetted, scientifically-based public safety information, but also to provide ways in which individuals can take physical action to educate themselves and others. Several broad categories of preparedness actions include: official acknowledgement of National Tsunami Preparedness Week, local "tsunami walk" drills, simulated tsunami-based exercises, testing of sirens and notification systems, outreach materials (brochures, videos, maps), workshops, presentations, media events, and websites. Next steps include building on the foundation established in 2014 by leveraging ShakeOut audiences, providing people with more information about how they can participate in 2015, and carrying the effort forward to other states and territories.

  14. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, to coastal areas along the Gulf of Mexico, and to the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since that time, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute, 2) a minimum magnitude threshold of M4.5, and 3) an initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network to be contributed from existing real-time broadband national networks in the region. Sea-level monitoring improvements both offshore and along the coast will also be addressed. With the support of Member States and other countries and organizations, it has been possible to significantly expand the sea-level network, thus reducing the time it now takes to verify tsunamis.
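
    As a toy illustration of the second capability measure above (P-wave detection time), the sketch below computes how long it would take a fixed number of stations to record the direct P arrival from a hypothetical epicenter, assuming a single constant P velocity and straight ray paths. The station coordinates, trigger count, and velocity are invented placeholders, not ICG-Caribe EWS values.

```python
# Hypothetical sketch: time until enough stations detect the P wave to locate an event.
# Assumes a uniform P velocity and a fixed number of required triggers; real capability
# models use travel-time tables, station noise levels, and magnitude-dependent amplitudes.
import math

EARTH_RADIUS_KM = 6371.0
P_VELOCITY_KM_S = 7.5          # assumed average crust/upper-mantle P speed
REQUIRED_TRIGGERS = 4          # stations needed for an automatic location

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres (haversine)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def p_detection_time(epicenter, stations, depth_km=10.0):
    """Seconds until REQUIRED_TRIGGERS stations have seen the direct P arrival."""
    times = []
    for lat, lon in stations:
        epi = great_circle_km(epicenter[0], epicenter[1], lat, lon)
        hypo = math.hypot(epi, depth_km)          # straight-ray approximation
        times.append(hypo / P_VELOCITY_KM_S)
    return sorted(times)[REQUIRED_TRIGGERS - 1]

# Example: a hypothetical event north of Puerto Rico and a few illustrative stations.
stations = [(18.4, -66.1), (18.2, -67.1), (17.9, -66.6), (19.7, -70.7), (16.3, -61.1)]
print(round(p_detection_time((19.5, -66.0), stations), 1), "s to 4th P detection")
```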

  15. Implementation of the TsunamiReady Supporter Program in Puerto Rico

    NASA Astrophysics Data System (ADS)

    Flores Hots, V. E.; Vanacore, E. A.; Gonzalez Ruiz, W.; Gomez, G.

    2016-12-01

    The Puerto Rico Seismic Network (PRSN) manages the Puerto Rico Tsunami Program under the National Tsunami Hazard Mitigation Program (NTHMP), including the TsunamiReady Supporter Program. Through this program the PRSN helps private organizations, businesses, facilities, and local government entities to voluntarily engage in tsunami planning and preparedness that meet requirements established by the National Weather Service (NWS). TsunamiReady Supporter organizations are better prepared to respond to a tsunami emergency, developing a response plan (using a template that PRSN developed and provides) and reinforcing their communication systems, including NOAA radio, RSS, and loudspeakers, to receive and disseminate the alerts issued by the NWS and the Tsunami Warning Centers (TWC). The planning and communication systems, together with the training that PRSN provides to staff and employees, are intended to help visitors and employees evacuate the tsunami hazard zone to the nearest assembly point, minimizing loss of life. Potential TsunamiReady Supporters include, but are not limited to, businesses, schools, churches, hospitals, malls, utilities, museums, beaches, and harbors. However, the traditional targets for such programs are primarily tourism sites and hotels, where people unaware of the tsunami hazard may be present. In 2016 the TsunamiReady Program guided four businesses to achieve TsunamiReady Supporter recognition. Two of the facilities were hotels near or inside the evacuation zone. The other facilities were the first and only health center and supermarket to be recognized in the United States and its territories. Based on the experience of preparing the health center and supermarket sites, we present two case studies of how the TsunamiReady Supporter Program can be applied to non-traditional facilities and how applying the program to such facilities can improve tsunami hazard mitigation. Currently, we are expanding the application of this program to non-traditional facilities by working with a banking facility located in a tsunami evacuation zone, increasing its capacity to manage a tsunami event and reinforcing the entity's involvement in developing a plan for its clients and employees to evacuate the area and reach a safe place.

  16. Chapter two: Phenomenology of tsunamis II: scaling, event statistics, and inter-event triggering

    USGS Publications Warehouse

    Geist, Eric L.

    2012-01-01

    Observations related to tsunami catalogs are reviewed and described in a phenomenological framework. An examination of scaling relationships between earthquake size (as expressed by scalar seismic moment and mean slip) and tsunami size (as expressed by mean and maximum local run-up and maximum far-field amplitude) indicates that scaling is significant at the 95% confidence level, although there is uncertainty in how well earthquake size can predict tsunami size (R2 ~ 0.4-0.6). In examining tsunami event statistics, current methods used to estimate the size distribution of earthquakes and landslides and the inter-event time distribution of earthquakes are first reviewed. These methods are adapted to estimate the size and inter-event distribution of tsunamis at a particular recording station. Using a modified Pareto size distribution, the best-fit power-law exponents of tsunamis recorded at nine Pacific tide-gauge stations exhibit marked variation, in contrast to the approximately constant power-law exponent for inter-plate thrust earthquakes. With regard to the inter-event time distribution, significant temporal clustering of tsunami sources is demonstrated. For tsunami sources occurring in close proximity to other sources in both space and time, a physical triggering mechanism, such as static stress transfer, is a likely cause for the anomalous clustering. Mechanisms of earthquake-to-earthquake and earthquake-to-landslide triggering are reviewed. Finally, a modification of statistical branching models developed for earthquake triggering is introduced to describe triggering among tsunami sources.
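
    The chapter's size-distribution analysis uses a modified Pareto distribution; as a simplified, hedged illustration of the general idea, the sketch below fits a plain Pareto (power-law) tail exponent to amplitudes above a threshold by maximum likelihood. The amplitude values are invented.

```python
# Minimal sketch: maximum-likelihood (Hill-type) estimate of a power-law tail exponent
# for tsunami amplitudes above a chosen threshold. The chapter uses a *modified* Pareto
# distribution; this plain Pareto fit is only an illustrative simplification.
import math

def pareto_mle_exponent(amplitudes, threshold):
    """Return the MLE of the Pareto exponent for values exceeding `threshold`."""
    tail = [a for a in amplitudes if a > threshold]
    if len(tail) < 2:
        raise ValueError("need at least two exceedances above the threshold")
    return len(tail) / sum(math.log(a / threshold) for a in tail)

# Hypothetical tide-gauge maximum amplitudes (metres); values are illustrative only.
amps = [0.05, 0.07, 0.11, 0.12, 0.2, 0.25, 0.4, 0.6, 1.1, 2.3]
print("tail exponent ~", round(pareto_mle_exponent(amps, 0.1), 2))
```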

  17. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  18. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  19. Distribution of tsunami interevent times

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Parsons, Tom

    2008-01-01

    The distribution of tsunami interevent times is analyzed using global and site-specific (Hilo, Hawaii) tsunami catalogs. An empirical probability density distribution is determined by binning the observed interevent times during a period in which the observation rate is approximately constant. The empirical distributions for both catalogs exhibit non-Poissonian behavior in which there is an abundance of short interevent times compared to an exponential distribution. Two types of statistical distributions are used to model this clustering behavior: (1) long-term clustering described by a universal scaling law, and (2) Omori law decay of aftershocks and triggered sources. The empirical and theoretical distributions all imply an increased hazard rate after a tsunami, followed by a gradual decrease with time approaching a constant hazard rate. Examination of tsunami sources suggests that many of the short interevent times are caused by triggered earthquakes, though the triggered events are not necessarily on the same fault.
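
    A minimal sketch of the kind of comparison described above: empirical exceedance probabilities of interevent times against the exponential (Poisson) survival function with the same mean rate. The event times below are invented, and a real analysis would restrict the catalog to a period of roughly constant observation rate, as the abstract notes.

```python
# Sketch: compare the empirical distribution of tsunami interevent times with the
# exponential distribution implied by a Poisson process of the same mean rate.
import math

def interevent_times(event_times):
    t = sorted(event_times)
    return [b - a for a, b in zip(t, t[1:])]

def empirical_exceedance(samples, x):
    """Fraction of interevent times longer than x."""
    return sum(1 for s in samples if s > x) / len(samples)

# Hypothetical event times (years), clustered on purpose.
events = [1900.1, 1900.4, 1900.5, 1903.2, 1903.3, 1909.8, 1910.0, 1917.5, 1922.9, 1923.0]
gaps = interevent_times(events)
mean_gap = sum(gaps) / len(gaps)

for x in (0.1, 0.5, 1.0, 3.0):
    empirical = empirical_exceedance(gaps, x)
    poisson = math.exp(-x / mean_gap)          # exponential survival function
    print(f"gap > {x:>3} yr: empirical {empirical:.2f} vs Poisson {poisson:.2f}")
```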

  20. Lattice Boltzmann approach for hydro-acoustic waves generated by tsunamigenic sea bottom displacement

    NASA Astrophysics Data System (ADS)

    Prestininzi, P.; Abdolali, A.; Montessori, A.; Kirby, J. T.; La Rocca, Michele

    2016-11-01

    Tsunami waves are generated by sea bottom failures, landslides and faults. The concurrent generation of hydro-acoustic waves (HAW), which travel much faster than the tsunami, has received much attention, motivated by their possible exploitation as precursors of tsunamis. This feature makes the detection of HAW particularly well-suited for building an early-warning system. Accuracy and efficiency of the modeling approaches for HAW thus play a pivotal role in the design of such systems. Here, we present a Lattice Boltzmann Method (LBM) for the generation and propagation of HAW resulting from tsunamigenic ground motions and verify it against commonly employed modeling solutions. LBM is well known for providing fast and accurate solutions to both hydrodynamics and acoustics problems, thus it naturally becomes a candidate as a comprehensive computational tool for modeling generation and propagation of HAW.
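
    For orientation, the following is a generic D2Q9 BGK lattice Boltzmann sketch that propagates a small density (pressure) pulse in a periodic box. It is meant only to show the streaming-collision structure of an LBM; the authors' scheme additionally handles tsunamigenic bottom motion, realistic geometry, and coupling to gravity waves, none of which is attempted here.

```python
# Minimal D2Q9 BGK lattice Boltzmann sketch: propagation of a small pressure (density)
# pulse in a periodic 2-D box. A generic acoustics toy, not the model described above.
import numpy as np

NX, NY, STEPS = 200, 200, 400
TAU = 0.6                                   # BGK relaxation time (assumed)
# D2Q9 lattice velocities and weights
C = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
W = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    cu = C[:, 0, None, None]*ux + C[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return W[:, None, None]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*usq)

rho = np.ones((NX, NY))
x, y = np.meshgrid(np.arange(NX), np.arange(NY), indexing="ij")
rho += 0.01*np.exp(-((x-NX/2)**2 + (y-NY/2)**2)/50.0)   # small initial pulse
ux = uy = np.zeros((NX, NY))
f = equilibrium(rho, ux, uy)

for _ in range(STEPS):
    # streaming step (periodic boundaries)
    for i, (cx, cy) in enumerate(C):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    # macroscopic moments
    rho = f.sum(axis=0)
    ux = (f * C[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * C[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision step
    f += (equilibrium(rho, ux, uy) - f) / TAU

print("pressure-like density range after propagation:", rho.min(), rho.max())
```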

  1. Global Environmental Alert Service

    NASA Astrophysics Data System (ADS)

    Grasso, V. F.; Cervone, G.; Singh, A.; Kafatos, M.

    2006-12-01

    Every year natural disasters such as earthquakes, floods, hurricanes, and tsunamis occur around the world, causing hundreds of thousands of deaths and injuries, billions of dollars in economic losses, destroying natural landmarks and adversely affecting ecosystems. Due to increasing urbanization, the increasingly high percentage of the world's population living in megacities, and the existence of nuclear power plants and other facilities whose potential destruction poses unacceptably high risks, natural hazards represent a growing threat of economic loss, as well as risk to people and property. Warning systems represent an innovative and effective approach to mitigating the risks associated with natural hazards. Several state-of-the-art analyses show that early warning technologies are now available for most natural hazards, and systems are already in operation in some parts of the world. Nevertheless, recent disasters such as the 2004 Indian Ocean tsunami, the 2005 Kashmir earthquake and Hurricane Katrina in 2005 highlighted inadequacies in early warning system technologies. Furthermore, not all available technologies are deployed in every part of the world, due to the lack of awareness and resources in poorer countries, leaving very large and densely populated areas at risk. Efforts towards the development of a global warning system are necessary to fill the gaps of existing technologies. A globally comprehensive early warning system based on existing technologies would be a means to consolidate scientific knowledge, package it in a form usable by international and national decision makers, and actively disseminate this information to protect people and property. There is no single information broker who searches for and packages the policy-relevant material and delivers it in an understandable format to the public and decision makers. A critical review of existing systems reveals the need for such an innovative service. We propose here a Global Environmental Alert Service (GEAS) that could provide information from monitoring, Earth observing and early warning systems to users in near real time and bridge the gap between the scientific community and policy makers. Characteristics and operational aspects of GEAS are discussed.

  2. Tsunami Forecasting in the Atlantic Basin

    NASA Astrophysics Data System (ADS)

    Knight, W. R.; Whitmore, P.; Sterling, K.; Hale, D. A.; Bahng, B.

    2012-12-01

    The mission of the West Coast and Alaska Tsunami Warning Center (WCATWC) is to provide advance tsunami warning and guidance to coastal communities within its Area-of-Responsibility (AOR). Predictive tsunami models, based on the shallow-water wave equations, are an important part of the Center's guidance support. An Atlantic-based counterpart to the long-standing forecasting capability in the Pacific, known as the Alaska Tsunami Forecast Model (ATFM), has now been developed. The Atlantic forecasting method is based on ATFM version 2, which contains advanced capabilities over the original model, including better handling of the dynamic interactions between grids, inundation over dry land, new forecast model products, an optional non-hydrostatic approach, and the ability to pre-compute larger and more finely gridded regions using parallel computational techniques. The wide and nearly continuous Atlantic shelf region presents a challenge for forecast models. Our solution to this problem has been to develop a single unbroken high-resolution sub-mesh (currently 30 arc-seconds), trimmed to the shelf break. This allows for edge-wave propagation and for kilometer-scale bathymetric feature resolution. Terminating the fine mesh at the 2000 m isobath keeps the number of grid points manageable while allowing a coarse (4 arc-minute) mesh to adequately resolve deep-water tsunami dynamics. Higher-resolution sub-meshes are then included around coastal forecast points of interest. The WCATWC Atlantic AOR includes the eastern U.S. and Canada, the U.S. Gulf of Mexico, Puerto Rico, and the Virgin Islands. Puerto Rico and the Virgin Islands are in very close proximity to well-known tsunami sources. Because travel times are under an hour and response must be immediate, our focus is on pre-computing many tsunami source "scenarios" and compiling the results into a database that is accessible during an event and calibrated with observations. Seismic source evaluation determines the order of model pre-computation, starting with the sources that carry the highest risk. Model computation zones are confined to regions at risk to save computation time. For example, Atlantic sources have been shown not to propagate into the Gulf of Mexico; therefore, fine-grid computations are not performed in the Gulf for Atlantic sources. Outputs from the Atlantic model include forecast marigrams at selected sites, and maximum amplitudes, drawdowns, and currents for all coastal points. The maximum amplitude maps will be supplemented with contoured energy-flux maps, which show more clearly the effects of bathymetric features on tsunami wave propagation. During an event, forecast marigrams will be compared to observations to adjust the model results. The modified forecasts will then be used to set alert levels between coastal breakpoints and provided to emergency management.
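
    The database-plus-calibration idea can be illustrated with a deliberately simple sketch: pre-computed forecast amplitudes for a few hypothetical scenarios are rescaled by the ratio of an observed to a predicted amplitude at one gauge. Scenario names, sites, and numbers are invented and do not come from the WCATWC database.

```python
# Hypothetical sketch of the "pre-computed scenario database" idea: forecast amplitudes
# for a handful of scenarios are stored ahead of time, and during an event the selected
# scenario is rescaled by comparing its prediction with an observed marigram amplitude.
PRECOMPUTED = {
    # scenario_id: {forecast point: maximum amplitude in metres for a reference source}
    "PuertoRicoTrench_Mw8.0": {"SanJuan": 1.8, "Bermuda": 0.3, "AtlanticCity": 0.2},
    "LisbonSource_Mw8.5":     {"SanJuan": 0.4, "Bermuda": 0.6, "AtlanticCity": 0.5},
}

def calibrated_forecast(scenario_id, observed_site, observed_amplitude_m):
    """Scale a stored scenario by the observed/predicted ratio at one gauge."""
    scenario = PRECOMPUTED[scenario_id]
    scale = observed_amplitude_m / scenario[observed_site]
    return {site: round(amp * scale, 2) for site, amp in scenario.items()}

# A tide-gauge observation of 0.9 m at San Juan would adjust the whole forecast:
print(calibrated_forecast("PuertoRicoTrench_Mw8.0", "SanJuan", 0.9))
```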

  3. Recent Progress of Seismic Observation Networks in Japan

    NASA Astrophysics Data System (ADS)

    Okada, Y.

    2013-04-01

    Before the occurrence of the disastrous Kobe earthquake in 1995, the number of high-sensitivity seismograph stations operated in Japan was nearly 550, and they were concentrated in the Kanto and Tokai districts of central Japan. In the wake of the Kobe earthquake, the Japanese government established the Headquarters for Earthquake Research Promotion and started the reconstruction of seismic networks to cover the whole of Japan evenly. The basic network is composed of three types of seismographs: high-sensitivity seismographs (Hi-net), broadband seismographs (F-net), and strong-motion seismographs (K-NET). A large majority of Hi-net stations are also equipped with a pair of strong-motion sensors, one at the bottom of the borehole and one at the ground surface (KiK-net). The wealth of high-quality data obtained from these networks is circulated immediately and is producing several new seismological findings, as well as providing the basis for the Earthquake Early Warning system. On March 11, 2011, the "Off the Pacific Coast of Tohoku Earthquake" occurred with magnitude 9.0, the largest in the history of seismic observation in Japan. The greatest disaster on record was brought by the huge tsunami, with nearly 20 thousand people killed or missing. We were again reminded that the seismic observation system is quite poor in the oceanic region compared with its richness in the inland region. In 2012, NIED started the construction of an ocean-bottom seismic and tsunami observation network along the Japan Trench. It is planned to deploy 154 stations with an average spacing of 30 km, each equipped with an accelerometer for seismic observation and a water pressure gauge for tsunami observation. We expect that more rapid and accurate warning of earthquakes and tsunamis will become possible with this observing network.

  4. Ocean-bottom pressure changes above a fault area for tsunami excitation and propagation observed by a submarine dense network

    NASA Astrophysics Data System (ADS)

    Yomogida, K.; Saito, T.

    2017-12-01

    Conventional tsunami excitation and propagation have been formulated for an incompressible fluid in terms of velocity components. This approach is valid in most cases because we usually analyze tsunamis as "long gravity waves" excited by submarine earthquakes. Newly developed ocean-bottom tsunami networks such as S-net and DONET have dramatically changed this situation for two reasons: (1) tsunami propagation is now directly observed in a 2-D array manner, without suffering from the complex "site effects" of the seashore, and (2) initial tsunami features can be directly detected just above a fault area. Removing the incompressibility assumption for sea water, we have formulated a new representation of tsunami excitation based not on velocity but on displacement components. As a result, not only the dynamic terms but also the static term (i.e., the component of zero frequency) can be introduced naturally, which is important for the pressure that ocean-bottom tsunami stations record on the ocean floor. In the measurement of ocean-bottom pressure, the acceleration of the ocean floor should be combined with the conventional tsunami height (that is, the deformation of the sea level above a given station), although the acceleration exists only during fault motion. The M7.2 Off-Fukushima earthquake of 22 November 2016 was the first event that excited large tsunamis within the footprint of the S-net stations. The propagation of the tsunamis was found to be highly non-uniform because of the strong velocity (i.e., sea depth) gradient perpendicular to the axis of the Japan Trench. The earthquake was located in a shallow sea close to the coast, so that all the tsunami energy was reflected by the high-velocity trench region. Pressure gauges within the fault area recorded clear slow tsunami motions (i.e., sea-level changes) but also large high-frequency signals, as predicted by our theoretical result. That is, it may be difficult to extract tsunami motions from near-fault pressure-gauge data immediately after an earthquake occurs, which matters for tsunami early warning systems.
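
    As a rough aid to the point about combining seafloor acceleration with tsunami height, one approximation commonly quoted in the ocean-bottom-pressure literature (simplified here, and not taken verbatim from this abstract) writes the bottom-pressure change as

    $$ p_b(t) \;\approx\; \rho g\,\bigl[\eta(t) - u_z(t)\bigr] \;+\; \rho h\,\ddot{u}_z(t), $$

    where $\eta$ is the sea-surface elevation, $u_z$ the vertical seafloor displacement at the gauge (the sensor rides on the deforming seafloor, so permanent uplift largely cancels from the hydrostatic term), $h$ the water depth, and the last term the dynamic contribution of seafloor acceleration, present only while the fault is slipping.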

  5. New Science Applications Within the U.S. National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.; Forson, C. K.; Horrillo, J. J.; Nicolsky, D.

    2017-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is a collaborative state and federal program that supports consistent and cost-effective tsunami preparedness and mitigation activities at the community level. The NTHMP is developing a new five-year Strategic Plan based on the 2017 Tsunami Warning, Education, and Research Act as well as recommendations from the 2017 NTHMP External Review Panel. Many NTHMP activities are based on the best available scientific methods through the NTHMP Mapping and Modeling Subcommittee (MMS). The primary activities for the MMS member states are to characterize significant tsunami sources, numerically model those sources, and create tsunami inundation maps for evacuation planning. This work remains a focus for many unmapped coastlines. With the lessons learned from the 2004 Indian Ocean and 2011 Tohoku, Japan, tsunamis, where both immediate risks and long-term recovery issues were recognized, the NTHMP MMS is expanding its efforts into other areas that address community resilience. Tsunami evacuation modeling based on both pedestrian and vehicular modes of transportation is being developed by NTHMP states. Products include tools for the public to create personal evacuation maps. New tsunami response planning tools are being developed for both maritime and coastal communities. Maritime planning includes tsunami current-hazard maps for in-harbor and offshore response activities. Multi-tiered tsunami evacuation plans are being developed in some states to address local- versus distant-source tsunamis, as well as real-time evacuation plans, or "playbooks," for distant-source tsunamis forecast to be less than the worst-case flood event. Products to assist community mitigation and recovery are being developed at the state level. Harbor Improvement Reports, which evaluate the impacts of currents, sediment, and debris on harbor infrastructure, include direct mitigation activities for Local Hazard Mitigation Plans. Building code updates in the five Pacific states will include new sections on tsunami load analysis of structures and will require Tsunami Design Zones based on probabilistic analyses. Guidance for community recovery planning has also been initiated. These new projects are being piloted by some states and will help create guidance for other states in the future.

  6. Tsunami simulation method initiated from waveforms observed by ocean bottom pressure sensors for real-time tsunami forecast; Applied for 2011 Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro

    2017-04-01

    After the tsunami disaster caused by the 2011 Tohoku-oki great earthquake, improvement of tsunami forecasting has been an urgent issue in Japan. The National Institute of Disaster Prevention is installing a cabled network system for earthquake and tsunami observation (S-net) on the ocean bottom along the Japan and Kurile trenches. This cabled system includes 125 pressure sensors (tsunami meters) separated by about 30 km. Along the Nankai trough, JAMSTEC has already installed and operates cabled network systems of seismometers and pressure sensors (DONET and DONET2). These are the densest observation network systems in the world located on top of the source areas of great underthrust earthquakes. Real-time tsunami forecasting has depended on estimation of earthquake parameters, such as the epicenter, depth, and magnitude. Recently, a tsunami forecast method has been developed that estimates the tsunami source from tsunami waveforms observed at ocean-bottom pressure sensors. However, when many pressure sensors separated by 30 km are available on top of the source area, we do not need to estimate the tsunami source or the earthquake source to compute the tsunami. Instead, we can initiate a tsunami simulation directly from those dense observations. Tsunami height differences observed over a time interval at the ocean-bottom pressure sensors separated by 30 km were used to estimate the tsunami height distribution at a particular time. In our new method, the tsunami numerical simulation is initiated from that estimated tsunami height distribution. In this paper, the above method is improved and applied to the tsunami generated by the 2011 Tohoku-oki great earthquake. The tsunami source model of the 2011 Tohoku-oki great earthquake estimated by Gusman et al. (2012) from observed tsunami waveforms and from coseismic deformation observed by GPS and ocean-bottom sensors is used in this study. The ocean-surface deformation is computed from the source model and used as the initial condition of a tsunami simulation. By treating this computed tsunami as a real tsunami observed at the ocean-bottom sensors, a new tsunami simulation is carried out using the above method. The assumed station distribution (stations separated by 15 arc minutes, about 30 km) provided "observed" tsunami waveforms that were actually computed from the source model. Tsunami height distributions are estimated with the above method at 40, 80, and 120 seconds after the origin time of the earthquake. The near-field tsunami inundation forecast method (Gusman et al., 2014) was used to estimate the tsunami inundation along the Sanriku coast. The result shows that the observed tsunami inundation is well explained by the estimated inundation. It also shows that it takes about 10 minutes from the origin time of the earthquake to estimate the tsunami inundation. The new method developed in this paper is very effective for real-time tsunami forecasting.
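
    A minimal sketch of the core idea, under strong simplifications (one horizontal dimension, uniform depth, linear shallow-water dynamics, invented gauge values): interpolate sea-surface heights between dense gauges and march the wave forward without any source estimate.

```python
# Sketch: skip source estimation and start a (here 1-D, linear) shallow-water simulation
# directly from sea-surface heights interpolated between dense ocean-bottom gauges.
# Grid size, spacing, depth, and gauge values are invented placeholders.
import numpy as np

G = 9.81
DX = 2000.0                      # grid spacing (m); gauges are ~30 km apart in reality
DEPTH = 3000.0                   # uniform depth (m), an obvious simplification
N = 400
x = np.arange(N) * DX

# Hypothetical gauge positions (m) and tsunami heights (m) at one instant.
gauge_x = np.array([100e3, 130e3, 160e3, 190e3, 220e3])
gauge_eta = np.array([0.0, 0.4, 1.2, 0.5, 0.0])

eta = np.interp(x, gauge_x, gauge_eta, left=0.0, right=0.0)   # initial sea surface
u = np.zeros(N)                                               # initial velocity

dt = 0.5 * DX / np.sqrt(G * DEPTH)            # CFL-limited time step
for _ in range(200):                          # march the linear shallow-water equations
    u[1:] -= G * dt / DX * (eta[1:] - eta[:-1])
    eta[:-1] -= DEPTH * dt / DX * (u[1:] - u[:-1])

print("max simulated height after %.0f s: %.2f m" % (200 * dt, eta.max()))
```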

  7. Ofu and Ologesa survey of the 29 September 2009 tsunami

    NASA Astrophysics Data System (ADS)

    Foteinis, S.; Synolakis, C.; Titov, V. V.

    2009-12-01

    On 29 September 2009 an Mw~8.0 earthquake struck the Samoan Islands, generating a tsunami that caused at least 189 deaths and substantial damage to coastal infrastructure. An incarnation of the ITST surveyed the impacted region between 4 Oct and 11 Oct, measuring inundation per the protocol discussed in Synolakis and Okal (2005). We report here survey results from Ofu and Ologesa, two sparsely populated adjacent islands connected by a bridge. No human casualties were reported there. Buildings did not sustain substantial damage, due to light construction materials and open wood-frame construction. The strongest effects of the tsunami were recorded in the northern part of Ofu, with runup reaching 6.1 m and inundation of 50 m. The longest inundation distance was 74 m (3 m runup), in Ofu village. The runup at the airport was 3.9 m and the inundation 27 m. Near the bridge there is a motel where runup reached 5.1 m with 50 m of inundation. On the north of Ologesa, at Sili village, runup ranged up to 4 m with inundation of less than 25 m. In Ologesa village, runup ranged from 2.7 m to 4.4 m and inundation from 5 to 55 m. By serendipity, the team of surveyors experienced a tsunami warning while working in a fairly vulnerable locale. The warning resulted from the 7 October 2009 Mw ~7.6 earthquake off Vanuatu. The evacuation message was broadcast by a passing police vehicle on the sole road connecting Ofu and Ologesa. There was no information about where to evacuate. With the exception of a school bus that drove children from the sole school of the island, evacuations were orderly, with care for the elderly and special-needs neighbors, although the latter were delayed for tens of minutes in some neighborhoods. In this regard, had there been a real local tsunami, the school bus would have been swept away, as allegedly happened in Poloa. For over three hours, no further information was provided, and residents relied on unofficial reports from radio stations in Samoa relating that no tsunami had been generated, and then returned to their homes. The tsunami was a no-show; nonetheless, it was clear that substantial outreach work is needed to reinforce the concept that people should wait for the official all-clear notification and that evacuations cannot take place by car, particularly where there are hills within hundreds of feet of the shoreline. The airport was not closed, and airline employees reported that there was no official information there either. Some eyewitnesses questioned the false alarm and, despite the catastrophe only a week earlier, doubted they would keep self-evacuating if there were more of the same. We conclude that education saves lives and should be continuously reinforced through lectures discussing lessons learned and explaining that, for near-field events, occasional false alarms are inevitable. Local authorities should begin paying attention to broadcasting information during the warning and, eventually, all-clear messages as soon as practical. Reference: Synolakis, C.E., and E.A. Okal: 1992-2002: Perspective on a decade of post-tsunami surveys, Adv. Natur. Technol. Hazards, 23, 1-30, 2005.

  8. Tsunami hazard assessment for the island of Rhodes, Greece

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Zaniboni, Filippo; Tinti, Stefano

    2013-04-01

    The island of Rhodes is part of the Dodecanese archipelago and is one of the many islands found in the Aegean Sea. The tectonics of the Rhodes area is rather complex, involving both strike-slip and dip-slip (mainly thrust) processes. Tsunami catalogues (e.g. Papadopoulos et al., 2007) show the relatively high frequency of occurrence of tsunamis in this area, some also destructive, in particular between the coasts of Rhodes and Turkey. The town of Rhodes, the capital and also the largest and most populated city, is located in this part of the island. Rhodes is historically famous for the Colossus of Rhodes, which collapsed following an earthquake, and nowadays it is a popular tourist destination. This work focuses on tsunami hazard assessment, with research performed in the frame of the European project NearToWarn. The hazard is assessed by using the worst credible case scenario, a method introduced and used to study local tsunami hazard in coastal towns such as Catania, Italy, and Alexandria, Egypt (Tinti et al., 2012). Three tsunami sources were chosen for building scenarios: two local sources located in the sea area in front of the Turkish coast, where events are more frequent, selected in the frame of the NearToWarn project, and one distant source. The first source is taken from Ebeling et al. (2012), modified by UNIBO, and models the earthquake and small tsunami that occurred on 25 April 1957. The second source is a landslide, derived from the TRANSFER Project "Database of Tsunamigenic Non-Seismic Sources", and coincides with the so-called "Northern Rhodes Slide", possibly responsible for the 24 March 2002 tsunami. The last source is the fault located close to the island of Crete believed to be responsible for the tsunami event of 1303, which was reported to have caused damage in the city of Rhodes. The simulations are carried out using the finite-difference code UBO-TSUFD, which solves the Navier-Stokes equations in the shallow-water approximation. To cover the entire basin, two nested grids (a coarse one with 30 arc-sec resolution and a finer one with 200 m resolution) are used, constructed on bathymetry data provided by the TRANSFER database. The results, as fields of highest wave elevation, maximum flooding, maximum speed, arrival times and synthetic tide-gauges, are provided and discussed both individually (i.e. separately for each source) and in the form of a single, aggregate result, as required by the worst-case scenario technique. References: Ebeling, C.W., Okal, E.A., Kalligeris, N., and Synolakis, C.E.: Modern seismological reassessment and tsunami simulation of historical Hellenic Arc earthquakes, Tectonophysics, 530-531, 225-239, 2012. Papadopoulos, G. A., Daskalaki, E., Fokaefs, A., and Giraleas, N.: Tsunami hazards in the Eastern Mediterranean: strong earthquakes and tsunamis in the East Hellenic Arc and Trench system, Nat. Hazards Earth Syst. Sci., 7, 57-64, doi:10.5194/nhess-7-57-2007, 2007. Tinti, S., Pagnoni, G., Armigliato, A., and Tonini, R.: Tsunami inundation scenarios and tsunami vulnerability assessment for the town of Alexandria, Egypt, Geophysical Research Abstracts Vol. 14, EGU2012-10325, EGU General Assembly 2012.
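
    The aggregation step of the worst-case credible scenario technique reduces, in essence, to taking a cell-by-cell maximum over the individual scenario fields, as in the toy sketch below; the tiny arrays stand in for the UBO-TSUFD output grids and are invented.

```python
# Sketch of the aggregation step of a worst-case credible scenario assessment: the
# hazard value kept at each grid cell is the maximum over all individual scenarios.
import numpy as np

scenario_max_elevation = {
    "1957_earthquake":      np.array([[0.3, 0.5], [0.2, 0.4]]),
    "NorthernRhodesSlide":  np.array([[0.8, 0.2], [0.6, 0.1]]),
    "1303_Crete_fault":     np.array([[0.4, 0.7], [0.3, 0.9]]),
}

# Cell-by-cell maximum elevation over all scenarios.
aggregate = np.maximum.reduce(list(scenario_max_elevation.values()))
# Which scenario controls each cell (useful when interpreting the aggregate map).
worst_source = {
    (i, j): max(scenario_max_elevation, key=lambda s: scenario_max_elevation[s][i, j])
    for i in range(2) for j in range(2)
}
print(aggregate)
print(worst_source)
```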

  9. Tsunami Modeling and Prediction Using a Data Assimilation Technique with Kalman Filters

    NASA Astrophysics Data System (ADS)

    Barnier, G.; Dunham, E. M.

    2016-12-01

    Earthquake-induced tsunamis cause dramatic damage along densely populated coastlines. It is difficult to predict and anticipate tsunami waves in advance, but if the earthquake occurs far enough from the coast, there may be enough time to evacuate the zones at risk. Therefore, any real-time information on the tsunami wavefield (as it propagates towards the coast) is extremely valuable for early warning systems. After the 2011 Tohoku earthquake, a dense tsunami-monitoring network (S-net) based on cabled ocean-bottom pressure sensors has been deployed along the Pacific coast of northeastern Japan. Maeda et al. (GRL, 2015) introduced a data assimilation technique to reconstruct the tsunami wavefield in real time by combining the numerical solution of the shallow-water wave equations with additional terms penalizing the numerical solution for not matching observations. The penalty or gain matrix is determined through optimal interpolation and is independent of time. Here we explore a related data assimilation approach using the Kalman filter method to evolve the gain matrix. While more computationally expensive, the Kalman filter approach potentially provides more accurate reconstructions. We test our method on a 1D tsunami model derived from the Kozdon and Dunham (EPSL, 2014) dynamic rupture simulations of the 2011 Tohoku earthquake. For appropriate choices of model and data covariance matrices, the method reconstructs the tsunami wavefield prior to wave arrival at the coast. We plan to compare the Kalman filter method to the optimal interpolation method developed by Maeda et al. (GRL, 2015) and then to implement the method in 2D.
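
    As a hedged illustration of the analysis step at the heart of such a scheme (not the authors' implementation), the sketch below applies one textbook Kalman update to a 1-D sea-surface-height state observed at a few gauge locations; the covariances, gauge values, and grid are placeholders, and a full system would cycle this update with a shallow-water forecast step.

```python
# Minimal sketch of one Kalman-filter analysis step for tsunami data assimilation: the
# state is sea-surface height on a coarse 1-D grid, observed at a few pressure gauges.
import numpy as np

n, gauges = 50, [10, 20, 30]                 # state size and observed grid indices
H = np.zeros((len(gauges), n))
H[np.arange(len(gauges)), gauges] = 1.0      # observation operator: pick gauge cells

# Forecast state and its error covariance (spatially correlated, Gaussian shaped).
xf = np.zeros(n)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Pf = 0.25 * np.exp(-(dist / 5.0) ** 2)       # assumed background covariance
R = 0.01 * np.eye(len(gauges))               # assumed observation-error covariance

y = np.array([0.3, 1.1, 0.4])                # hypothetical gauge heights (m)

# Kalman gain, analysis state, and updated covariance.
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
xa = xf + K @ (y - H @ xf)
Pa = (np.eye(n) - K @ H) @ Pf

print("analysed height at the middle gauge: %.2f m" % xa[20])
print("variance there, prior -> posterior: %.3f -> %.3f" % (Pf[20, 20], Pa[20, 20]))
```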

  10. SAFRR Tsunami Scenarios and USGS-NTHMP Collaboration

    NASA Astrophysics Data System (ADS)

    Ross, S.; Wood, N. J.; Cox, D. A.; Jones, L.; Cheung, K. F.; Chock, G.; Gately, K.; Jones, J. L.; Lynett, P. J.; Miller, K.; Nicolsky, D.; Richards, K.; Wein, A. M.; Wilson, R. I.

    2015-12-01

    Hazard scenarios provide emergency managers and others with information to help them prepare for future disasters. The SAFRR Tsunami Scenario, published in 2013, modeled a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. It presented the modeled inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the scenario tsunami. The intended users were those responsible for making mitigation decisions before and those who need to make rapid decisions during future tsunamis. It provided the basis for many exercises involving, among others, NOAA, the State of Washington, several counties in California, and the National Institutes of Health. The scenario led to improvements in the warning protocol for southern California and highlighted issues that led to ongoing work on harbor and marina safety. Building on the lessons learned in the SAFRR Tsunami Scenario, another tsunami scenario is being developed with impacts to Hawaii and to the source region in Alaska, focusing on the evacuation issues of remote communities with primarily shore parallel roads, and also on the effects of port closures. Community exposure studies in Hawaii (Ratliff et al., USGS-SIR, 2015) provided background for selecting these foci. One complicated and important aspect of any hazard scenario is defining the source event. The USGS is building collaborations with the National Tsunami Hazard Mitigation Program (NTHMP) to consider issues involved in developing a standardized set of tsunami sources to support hazard mitigation work. Other key USGS-NTHMP collaborations involve population vulnerability and evacuation modeling.

  11. Determination of broadband moment magnitude (Mwp) for August 11, 2009 Suruga-Bay earthquake (MJMA=6.5)

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Hirshorn, B. F.

    2009-12-01

    We have determined Mwp for the August 11, 2009 Suruga-Bay earthquake (MJMA=6.5) using broadband seismograms recorded at stations at close epicentral distances. We used two broadband seismograph stations: JHJ2 (epicentral distance 1.9 degrees) and FUJ (epicentral distance 0.44 degrees). Because of the close epicentral distance of FUJ, the seismogram is clipped at about 10 seconds after the P-wave arrival. However, it was possible to use the first 10 seconds of this seismogram to compute Mwp. We obtain Mwp=6.4 for JHJ2 and 6.8 for FUJ (figure 1). After applying the correction of Whitmore et al. (2000) and averaging these two stations, we obtain Mwp=6.6 for this event. An epicentral distance of 0.44 degrees for a magnitude 6.5 earthquake is marginal for treating the seismogram as far-field. However, considering the aftershock distribution, the fault area seems to be limited to within Suruga Bay, which may confirm that Mwp can be successfully computed at FUJ based on the far-field approximation. This result is significant for using Mwp from close-epicentral-distance seismograms to issue early tsunami warnings. A large earthquake with Mw=7.5 (GCMT) occurred in the Andaman Islands, India, 10 minutes before this Suruga-Bay event. This made it very difficult to estimate Mwp for the Suruga-Bay event from broadband seismograms at teleseismic distances because of the large amplitude of the Mw 7.5 Andaman Islands earthquake. In such cases it is therefore difficult to issue accurate tsunami warnings based on teleseismic stations. We used broadband seismograms recorded by F-net, operated by the National Research Institute for Earth Science and Disaster Prevention.
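
    For orientation, the sketch below follows the commonly summarized Mwp recipe of Tsuboi et al. (1995): integrate the vertical P-wave displacement, scale the peak of that integral to a seismic moment, and convert to magnitude. The density, P velocity, and radiation-pattern coefficient are assumed "typical" values, no instrument response or Whitmore-type station correction is applied, and the waveform is synthetic, so this is an illustration of the idea rather than an operational implementation.

```python
# Rough sketch of the Mwp recipe (after Tsuboi et al., 1995, as commonly summarized).
# All constants are assumed values; treat the result as illustrative only.
import math
import numpy as np

RHO = 3400.0          # density near the source (kg/m^3), assumed
ALPHA = 7900.0        # P-wave speed (m/s), assumed
FP = 0.52             # average P radiation-pattern coefficient, assumed

def mwp(displacement, dt, distance_m):
    """Mwp from a vertical-component P-wave displacement window (m) sampled every dt s."""
    moment_proxy = np.abs(np.cumsum(displacement) * dt).max()    # max |integral of u_z dt|
    m0 = 4.0 * math.pi * RHO * ALPHA**3 * distance_m * moment_proxy / FP
    return (math.log10(m0) - 9.1) / 1.5

# Synthetic one-sided displacement pulse (m) standing in for a real P waveform.
dt = 0.05
t = np.arange(0, 20, dt)
disp = 2e-4 * np.exp(-((t - 5.0) / 2.0) ** 2)
print("Mwp ~", round(mwp(disp, dt, distance_m=50e3), 1))
```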

  12. Numerical Procedure to Forecast the Tsunami Parameters from a Database of Pre-Simulated Seismic Unit Sources

    NASA Astrophysics Data System (ADS)

    Jiménez, César; Carbonel, Carlos; Rojas, Joel

    2018-04-01

    We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height, at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green's functions) was obtained from numerical simulation of seismic unit sources (dimension: 50 × 50 km2) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of the synthetic waveforms corresponding to the seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station an error (relative to the maximum height of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error was at Chimbote, with 53.5%. However, due to the low amplitude of the Chimbote wave (<1 m), the overestimation in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
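
    The superposition step itself is simple enough to sketch: the forecast marigram at a station is the slip-weighted sum of pre-computed unit-source waveforms. The waveforms, slips, and the 1 cm arrival criterion below are invented for illustration.

```python
# Sketch of the superposition step: the forecast marigram at a tide station is the
# slip-weighted sum of pre-computed unit-source waveforms (Green's functions) for the
# unit sources covered by the rupture. Waveforms and weights are invented placeholders.
import numpy as np

t = np.arange(0, 3600, 60.0)                         # one hour, 1-min sampling

# Pre-computed waveforms at one station for three hypothetical 50 km x 50 km unit
# sources, each for 1 m of slip (metres of sea level vs time).
green = {
    "unit_A": 0.30 * np.sin(2 * np.pi * (t - 1200) / 1800) * (t > 1200),
    "unit_B": 0.20 * np.sin(2 * np.pi * (t - 1500) / 1800) * (t > 1500),
    "unit_C": 0.10 * np.sin(2 * np.pi * (t - 1800) / 1800) * (t > 1800),
}
slip_m = {"unit_A": 2.5, "unit_B": 4.0, "unit_C": 1.0}   # slip on each unit source

forecast = sum(slip_m[k] * g for k, g in green.items())
arrival_idx = np.argmax(np.abs(forecast) > 0.01)          # first exceedance of 1 cm
print("arrival ~%d min, max height %.2f m" % (t[arrival_idx] / 60, forecast.max()))
```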

  14. Field survey of the 16 September 2015 Chile tsunami

    NASA Astrophysics Data System (ADS)

    Lagos, Marcelo; Fritz, Hermann M.

    2016-04-01

    On the evening of 16 September 2015, a magnitude Mw 8.3 earthquake occurred off the coast of central Chile's Coquimbo region. The ensuing tsunami caused significant inundation and damage in the Coquimbo (4th) region and mostly minor effects in the neighbouring 3rd and 5th regions. Fortunately, ancestral knowledge from the past 1922 and 1943 tsunamis in the region, along with the catastrophic 2010 Maule and recent 2014 tsunamis, as well as tsunami education and evacuation exercises, prompted most coastal residents to spontaneously evacuate to high ground after the earthquake. There were a few tsunami victims, while a handful of fatalities were associated with earthquake-induced building collapses and the physical stress of tsunami evacuation. International scientists joined the local effort from September 20 to 26, 2015. The international tsunami survey team (ITST) interviewed numerous eyewitnesses and documented flow depths, runup heights, inundation distances, sediment deposition, damage patterns, performance of the navigation infrastructure and impact on the natural environment. The ITST covered a 500 km stretch of coastline from Caleta Chañaral de Aceituno (28.8° S), south of Huasco, down to Llolleo near San Antonio (33.6° S). We surveyed more than 40 locations and recorded more than 100 tsunami and runup heights with differential GPS and integrated laser range finders. The tsunami impact peaked at Caleta Totoral near Punta Aldea, with both tsunami and runup heights exceeding 10 m, as surveyed on September 22 and broadcast nationwide that evening. Runup exceeded 10 m at a second, uninhabited location some 15 km south of Caleta Totoral. A significant variation in tsunami impact was observed along the coastlines of central Chile at local and regional scales. The tsunami occurred in the evening hours, limiting the availability of eyewitness video footage. Observations from the 2015 Chile tsunami are compared against the 1922, 1943, 2010 and 2014 Chile tsunamis. The tsunami was characterized by rapid arrival within minutes in the near field, requiring spontaneous self-evacuation, as warning messages did not reach some of the hardest-hit fishing villages prior to tsunami arrival. The absence of a massive tsunami outside of the 4th region may mislead evacuated residents in the adjacent 3rd and 5th regions of Chile in potential future events. This event poses significant challenges to community-based education aimed at raising tsunami awareness. The team educated residents about tsunami hazards, since awareness programs are essential to save lives in locales at risk from near-field tsunamis.

  15. Tsunami Waves Joint Inversion Using Tsunami Inundation, Tsunami Deposits Distribution and Marine-Terrestrial Sediment Signal in Tsunami Deposit

    NASA Astrophysics Data System (ADS)

    Tang, H.; WANG, J.

    2017-12-01

    The population living close to coastlines is increasing, which creates higher risks from coastal hazards such as tsunamis. However, the generation of a tsunami is not yet fully understood, especially for paleo-tsunamis. Tsunami deposits are one of the few concrete lines of evidence in the geological record that can be applied to the study of paleo-tsunamis. The understanding of tsunami deposits has significantly improved over the last decades. There are many inversion models (e.g. TsuSedMod, TSUFLIND, and TSUFLIND-EnKF) for studying overland-flow characteristics based on tsunami deposits. However, none of them tries to reconstruct offshore tsunami wave characteristics (waveform, wave height, and wavelength) from tsunami deposits. Here we present a state-of-the-art inverse approach to reconstruct the offshore tsunami wave based on tsunami inundation data, the spatial distribution of tsunami deposits, and the marine-terrestrial sediment signal in the tsunami deposits. An Ensemble Kalman Filter (EnKF) method is used to assimilate both sediment-transport simulations and field observation data. While more computationally expensive, the EnKF approach potentially provides more accurate reconstructions of the tsunami waveform. In addition to improving the inversion results, the ensemble-based method can also quantify the uncertainties of the results. Meanwhile, joint inversion improves the resolution of the tsunami waves compared with inversions using any single data type. The method will be tested with field survey data and gauge data from the 2011 Tohoku tsunami on the Sendai plain.
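
    As a compact, hedged illustration of the ensemble update the approach relies on (not the authors' code), the sketch below performs one stochastic EnKF analysis step for a three-parameter offshore-wave state, with a toy linear forward model standing in for the sediment-transport simulation; all numbers are invented.

```python
# One stochastic EnKF analysis step: state = (wave height, period, position), toy linear
# forward model maps state to two deposit-derived "observations". Everything numeric is
# an illustrative placeholder.
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_state, n_obs = 50, 3, 2

ensemble = rng.normal([5.0, 600.0, 10.0], [1.0, 100.0, 2.0], size=(n_ens, n_state)).T
Hop = np.array([[0.02, 0.0, 0.0],        # toy model: deposit thickness ~ wave height
                [0.0, 0.001, 0.01]])     # inundation proxy ~ period and position
obs = np.array([0.14, 0.95])
obs_err = np.array([0.01, 0.05])

pred = Hop @ ensemble                                    # predicted observations (n_obs, n_ens)
A = ensemble - ensemble.mean(axis=1, keepdims=True)      # state anomalies
Yp = pred - pred.mean(axis=1, keepdims=True)             # predicted-observation anomalies
Pxy = A @ Yp.T / (n_ens - 1)
Pyy = Yp @ Yp.T / (n_ens - 1) + np.diag(obs_err**2)
K = Pxy @ np.linalg.inv(Pyy)

# Perturbed-observation update of every ensemble member.
perturbed = obs[:, None] + rng.normal(0, obs_err[:, None], size=(n_obs, n_ens))
analysis = ensemble + K @ (perturbed - pred)
print("posterior mean wave height ~ %.2f m" % analysis[0].mean())
```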

  16. The Promise and Challenges of High Rate GNSS for Environmental Monitoring and Response

    NASA Astrophysics Data System (ADS)

    LaBrecque, John

    2017-04-01

    The decadal vision of the Global Geodetic Observing System (GGOS) recognizes the potential of high-rate, real-time GNSS for environmental monitoring. GGOS initiated a program to advance real-time, high-rate GNSS measurements to augment seismic and other sensor systems for earthquake and tsunami early warning. High-rate multi-GNSS networks can provide ionospheric tomography for the detection and tracking of land, ocean and atmospheric gravity waves, which can provide coastal warning of tsunamis induced by earthquakes, volcanic eruptions, severe weather and other catastrophic events. NASA has collaborated on a microsatellite constellation of GPS receivers to measure ocean surface roughness to improve severe storm tracking and an equatorial system of GPS occultation receivers to measure ionospheric and atmospheric dynamics. Systems such as these will be significantly enhanced by the availability of a fourfold increase in GNSS satellite systems with new and enhanced signal structures and by the densification of regional multi-GNSS networks. These new GNSS capabilities will rely upon improved and cost-effective communications infrastructure for a network of coordinated real-time analysis centers with input to national warning systems. Most importantly, the implementation of these new real-time GNSS capabilities will rely upon broad international support for the sharing of real-time GNSS data, much as is done in weather and seismic observing systems and as supported by the Committee of Experts on UN Global Geodetic Information Management (UNGGIM).

  17. 76 FR 68332 - International Fisheries; Pacific Tuna Fisheries; Fishing Restrictions in the Eastern Pacific Ocean

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-04

    ...://www.iattc.org/ResolutionsActiveENG.htm . Changes to Tuna Conservation Measures for 2011-2013... fishing vessels that often leads to loss of data critical to weather forecasting, tsunami warnings, search... of Climate Observations at http://osmc.noaa.gov/Monitor/OSMC/OSMC.html , also provides information...

  18. 76 FR 60790 - International Fisheries; Pacific Tuna Fisheries; Fishing Restrictions in the Eastern Pacific Ocean

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... without change. All personal identifying information (e.g., name, address, etc.) submitted voluntarily by...: (1) A change to the duration of the purse seine closure of the Convention Area in 2011 and... weather forecasting, tsunami warnings, search and rescue efforts, and research of the marine environment...

  19. Tsunami Hazard in La Réunion Island (SW Indian Ocean): Scenario-Based Numerical Modelling on Vulnerable Coastal Sites

    NASA Astrophysics Data System (ADS)

    Allgeyer, S.; Quentel, É.; Hébert, H.; Gailler, A.; Loevenbruck, A.

    2017-08-01

    Several major tsunamis have affected the southwest Indian Ocean since the 2004 Sumatra event, and some of them (2005, 2006, 2007 and 2010) have hit La Réunion Island. However, tsunami hazard is not well defined for La Réunion Island, where vulnerable coastlines can be exposed. This study offers a first tsunami hazard assessment for La Réunion Island. We first review the historical tsunami observations made on the coastlines, where high tsunami waves (2-3 m) have been reported on the western coast, especially during the 2004 Indian Ocean tsunami. Numerical models of historical scenarios yield results consistent with the available observations at the coastal sites (the harbours of La Pointe des Galets and Saint-Paul). The 1833 Pagai earthquake and tsunami can be considered the worst-case historical scenario for this area. In a second step, we assess the tsunami exposure by covering the major subduction zones with synthetic events of constant magnitude (8.7, 9.0 and 9.3). The magnitude 8.7 scenarios all generate strong currents in the harbours (3-7 m s^{-1}) and a maximum tsunami height of about 2 m, without significant inundation. The analysis of the magnitude 9.0 events confirms that the main commercial harbour (Port Est) is more vulnerable than Port Ouest and that flooding in Saint-Paul is limited to the beach area and the river mouth. Finally, the magnitude 9.3 scenarios show limited inundation close to the beach and in the riverbed in Saint-Paul. More generally, the results confirm that, for La Réunion, the Sumatra subduction zone is the most threatening non-local source area for tsunami generation. This study also shows that far-field coastal sites should be prepared for tsunami hazard and that further work is needed to improve operational warning procedures. Forecast methods should be developed to provide tools that enable the authorities to anticipate the local effects of tsunamis and to evacuate the harbours in sufficient time when such an earthquake occurs.

  20. Space geodetic tools provide early warnings for earthquakes and volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Aoki, Yosuke

    2017-04-01

    The development of space geodetic techniques such as Global Navigation Satellite Systems and Synthetic Aperture Radar over the last few decades allows us to monitor deformation of the Earth's surface with unprecedented spatial and temporal resolution. These observations, combined with fast data transmission and quick data processing, enable us to rapidly detect and locate earthquakes and volcanic eruptions and to assess potential hazards such as strong earthquake shaking, tsunamis, and volcanic eruptions. These techniques are thus key parts of early warning systems, help identify some hazards before a cataclysmic event, and improve the response to the consequent damage.

  1. Crowdsourced earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  2. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/

  3. Real-time and rapid GNSS solutions from the M8.2 September 2017 Tehuantepec Earthquake and implications for Earthquake and Tsunami Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Mencin, D.; Hodgkinson, K. M.; Mattioli, G. S.

    2017-12-01

    In support of hazard research and Earthquake Early Warning (EEW) systems, UNAVCO operates approximately 800 RT-GNSS stations throughout western North America and Alaska (EarthScope Plate Boundary Observatory), Mexico (TLALOCNet), and the pan-Caribbean region (COCONet). Our system produces and distributes raw data (BINEX and RTCM3) and real-time Precise Point Positions via the Trimble PIVOT Platform (RTX). The M8.2 earthquake of 2017-09-08, located 98 km SSW of Tres Picos, Mexico, is the first great earthquake to occur within the UNAVCO RT-GNSS footprint, which allows for a rigorous analysis of our dynamic and static processing methods. The need for rapid geodetic solutions ranges from seconds (EEW systems) to several minutes (Tsunami Warning and NEIC moment tensor and finite fault models). Here, we compare and quantify the relative processing strategies for producing static offsets, moment tensors and geodetically determined finite fault models using data recorded during this event. We also compare the geodetic solutions with the USGS NEIC seismically derived moment tensors and finite fault models, including displacement waveforms generated from these models. We define kinematic post-processed solutions from GIPSY-OASISII (v6.4) with final orbits and clocks as a "best" case reference to evaluate the performance of our different processing strategies. We find that static displacements of a few centimeters or less are difficult to resolve in the real-time GNSS position estimates. The standard daily 24-hour solutions provide the highest-quality data set to determine coseismic offsets, but these solutions are delayed by at least 48 hours after the event. Dynamic displacements estimated in real time, however, show reasonable agreement with final, post-processed position estimates, and while individual position estimates have large errors, the real-time solutions offer an excellent operational option for EEW systems, including the use of estimated peak ground displacements or directly inverting for finite-fault solutions. In the near field, we find that the geodetically derived moment tensors and finite fault models differ significantly from the seismically derived models, highlighting the utility of using geodetic data in hazard applications.
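
    The following hypothetical sketch illustrates one common way to estimate a static coseismic offset from a single GNSS position component, by differencing mean positions in quiet windows before and after the event; it is not UNAVCO's processing chain, and the window lengths are placeholders.

```python
# Hypothetical illustration of extracting a static coseismic offset from a
# GNSS position time series: difference the mean positions in windows before
# and after the event. Not the operational processing described above.
import numpy as np

def static_offset(t, pos, t_event, pre=(300.0, 60.0), post=(120.0, 360.0)):
    """t: epochs in seconds; pos: one position component (e.g. east, in metres).
    pre/post: (start, end) offsets in seconds relative to the event time.
    Returns the estimated offset and a crude standard error."""
    pre_mask = (t >= t_event - pre[0]) & (t <= t_event - pre[1])
    post_mask = (t >= t_event + post[0]) & (t <= t_event + post[1])
    offset = pos[post_mask].mean() - pos[pre_mask].mean()
    err = np.hypot(pos[pre_mask].std(ddof=1) / np.sqrt(pre_mask.sum()),
                   pos[post_mask].std(ddof=1) / np.sqrt(post_mask.sum()))
    return offset, err
```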

  4. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets, and thus the ability to observe correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the Forecast Points of the tsunami scenario. We also intend to use Twitter analysis for situation picture assessment, e.g. for planning relief actions. At present, a multilingual corpus of Twitter messages related to crises is being assembled, and domain-specific language resources such as multilingual terminology lists and language-specific Natural Language Processing (NLP) tools are being built up to help cross the language barrier. The final goal is to extend this work to the main languages spoken around the Mediterranean and to classify and extract relevant information from tweets, translating the main keywords into English.
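
    A toy illustration of the kind of language-specific filtering described above is sketched below; the keyword lists and the is_relevant helper are invented for illustration and are far simpler than the classifiers developed in the project.

```python
# Toy sketch of language-specific keyword filtering of tweets; keyword lists
# and helper names are invented for illustration only.
KEYWORDS = {
    "en": {"earthquake", "tsunami", "aftershock"},
    "tr": {"deprem", "tsunami", "artçı"},
    "el": {"σεισμός", "τσουνάμι"},
    "ro": {"cutremur", "tsunami"},
}

def is_relevant(text: str, lang: str) -> bool:
    """Return True if the tweet mentions any crisis keyword for its language."""
    words = set(text.lower().split())
    return bool(words & KEYWORDS.get(lang, set()))

# Example: filter a stream of (lang, text) pairs down to candidate event reports.
tweets = [("en", "Strong earthquake felt in Izmir"), ("en", "great concert tonight")]
print([t for t in tweets if is_relevant(t[1], t[0])])
```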

  5. Differences in tsunami generation between the December 26, 2004 and March 28, 2005 Sumatra earthquakes

    USGS Publications Warehouse

    Geist, E.L.; Bilek, S.L.; Arcas, D.; Titov, V.V.

    2006-01-01

    Source parameters affecting tsunami generation and propagation for the Mw > 9.0 December 26, 2004 and the Mw = 8.6 March 28, 2005 earthquakes are examined to explain the dramatic difference in tsunami observations. We evaluate both scalar measures (seismic moment, maximum slip, potential energy) and finite-source representations (distributed slip and far-field beaming from finite source dimensions) of tsunami generation potential. There exists significant variability in local tsunami runup with respect to the most readily available measure, seismic moment. The local tsunami intensity for the December 2004 earthquake is similar to other tsunamigenic earthquakes of comparable magnitude. In contrast, the March 2005 local tsunami was deficient relative to its earthquake magnitude. Tsunami potential energy calculations more accurately reflect the difference in tsunami severity, although these calculations are dependent on knowledge of the slip distribution and therefore difficult to implement in a real-time system. A significant factor affecting tsunami generation unaccounted for in these scalar measures is the location of regions of seafloor displacement relative to the overlying water depth. The deficiency of the March 2005 tsunami seems to be related to the concentration of slip in the down-dip part of the rupture zone and the fact that a substantial portion of the vertical displacement field occurred in shallow water or on land. The comparison of the December 2004 and March 2005 Sumatra earthquakes presented in this study is analogous to previous studies comparing the 1952 and 2003 Tokachi-Oki earthquakes and tsunamis, in terms of the effect slip distribution has on local tsunamis. Results from these studies indicate the difficulty of rapidly assessing local tsunami runup from magnitude and epicentral location information alone.
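
    The scalar tsunami potential-energy measure discussed above can be written as E = 1/2 * rho * g * sum(eta^2) * dA over the initial sea-surface displacement. The sketch below evaluates this on a gridded displacement field; the grid and displacement values are placeholders.

```python
# Minimal sketch of the scalar tsunami potential-energy measure,
# E = 1/2 * rho * g * sum(eta^2) * dA, for a gridded initial displacement.
import numpy as np

RHO = 1025.0   # seawater density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def potential_energy(eta, dx, dy):
    """eta: 2-D array of initial sea-surface displacement (m);
    dx, dy: grid spacing (m). Returns energy in joules."""
    return 0.5 * RHO * G * np.sum(eta ** 2) * dx * dy

# Example with a synthetic 1-m uplift patch of 100 km x 50 km on a 1-km grid.
eta = np.zeros((200, 200))
eta[50:150, 50:100] = 1.0
print(f"{potential_energy(eta, 1000.0, 1000.0):.3e} J")
```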

  6. Tsunami vulnerability of buildings and people in South Java - field observations after the July 2006 Java tsunami

    NASA Astrophysics Data System (ADS)

    Reese, S.; Cousins, W. J.; Power, W. L.; Palmer, N. G.; Tejakusuma, I. G.; Nugrahadi, S.

    2007-10-01

    A team of scientists from New Zealand and Indonesia undertook a reconnaissance mission to the South Java area affected by the tsunami of 17 July 2006. The team used GPS-based surveying equipment to measure ground profiles and inundation depths along 17 transects across affected areas near the port city of Cilacap and the resort town of Pangandaran. The purpose of the work was to acquire data for calibration of models used to estimate tsunami inundations, casualty rates and damage levels. Additional information was gathered from interviews with eyewitnesses. The degree of damage observed was diverse, being primarily dependent on water depth and the building construction type. Water depths were typically 2 to 4 m where housing was seriously damaged. Damage levels ranged from total for older brick houses, to about 50% for newer buildings with rudimentary reinforced-concrete beams and columns, to 5-20% for engineered residential houses and multi-storey hotels with heavier RC columns. "Punchout" of weak brick walls was widespread. Despite various natural warning signs, very few people were alerted to the impending tsunami. Hence, the death toll was significant, with average death and injury rates both being about 10% of the people exposed, for water depths of about 3 m.

  7. The 1945 Balochistan earthquake and probabilistic tsunami hazard assessment for the Makran subduction zone

    NASA Astrophysics Data System (ADS)

    Höchner, Andreas; Babeyko, Andrey; Zamora, Natalia

    2014-05-01

    Iran and Pakistan are countries quite frequently affected by destructive earthquakes: for instance, the magnitude 6.6 Bam earthquake in 2003 in Iran, with about 30,000 casualties, or the magnitude 7.6 Kashmir earthquake in 2005 in Pakistan, with about 80,000 casualties. Both events took place inland, but in terms of magnitude, even significantly larger events can be expected to happen offshore, at the Makran subduction zone. This small subduction zone is seismically rather quiescent, but a tsunami caused by a thrust event in 1945 (the Balochistan earthquake) led to about 4,000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Additionally, some recent publications raise the question of the possibility of rare but huge magnitude 9 events at the Makran subduction zone. We first model the historic Balochistan event and its effect in terms of coastal wave heights, and then generate various synthetic earthquake and tsunami catalogs, including the possibility of large events, in order to assess the tsunami hazard at the affected coastal regions. Finally, we show how effective tsunami early warning could be achieved by the use of an array of high-precision real-time GNSS (Global Navigation Satellite System) receivers along the coast.
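
    The following sketch shows, under simple assumptions, how a synthetic catalog of magnitudes can be drawn from a doubly truncated Gutenberg-Richter law, in the spirit of the synthetic catalogs mentioned above; the b-value, magnitude bounds, and sample size are illustrative only.

```python
# Hedged sketch: sample magnitudes from a doubly truncated Gutenberg-Richter
# distribution by inverse-transform sampling. Parameters are illustrative.
import numpy as np

def sample_magnitudes(n, b=1.0, m_min=7.0, m_max=9.0, rng=None):
    """Draw n magnitudes from a doubly truncated Gutenberg-Richter law."""
    rng = rng or np.random.default_rng(0)
    u = rng.random(n)
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))   # normalisation of the truncation
    return m_min - np.log(1.0 - u * c) / beta

catalog = sample_magnitudes(1000)
print(catalog.min(), catalog.max(), (catalog >= 8.5).sum())
```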

  8. Role of Compressibility on Tsunami Propagation

    NASA Astrophysics Data System (ADS)

    Abdolali, Ali; Kirby, James T.

    2017-12-01

    In the present paper, we aim to reduce the discrepancies between tsunami arrival times evaluated from tsunami models and real measurements by considering the role of ocean compressibility. We perform qualitative studies to reveal the phase speed reduction rate via a modified version of the Mild Slope Equation for Weakly Compressible fluid (MSEWC) proposed by Sammarco et al. (2013). The model is validated against a 3-D computational model. Physical properties of surface gravity waves are studied and compared with those for waves evaluated from an incompressible flow solver over realistic geometry for the 2011 Tohoku-oki event, revealing a reduction in phase speed. Plain Language Summary: Submarine earthquakes and submarine mass failures (SMFs) can generate long gravitational waves (or tsunamis) that propagate at the free surface. Tsunami waves can travel long distances and are known for their dramatic effects on coastal areas. Nowadays, numerical models are used to reconstruct tsunamigenic events for many scientific and socioeconomic purposes, e.g. Tsunami Early Warning Systems, inundation mapping, risk and hazard analysis, etc. A number of typically neglected parameters in these models cause discrepancies between model outputs and observations. Most tsunami models predict tsunami arrival times at distant stations slightly early in comparison to observations. In this study, we show how ocean compressibility affects the tsunami wave propagation speed. In this framework, an efficient two-dimensional model equation for the weakly compressible ocean has been developed, validated and tested for simplified and real cases against three-dimensional and incompressible solvers. Taking compressibility into account, the phase speed of surface gravity waves is reduced compared to that of an incompressible fluid. We then used the model for the case of the devastating 2011 Tohoku-oki tsunami event, improving the model accuracy. This study sheds light on future model development to include ocean compressibility among other typically neglected parameters.

  9. Web-based Tsunami Early Warning System: a case study of the 2010 Kepulaunan Mentawai Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Ulutas, E.; Inan, A.; Annunziato, A.

    2012-06-01

    This study analyzes the response of the Global Disasters Alerts and Coordination System (GDACS) in relation to a case study: the Kepulaunan Mentawai earthquake and related tsunami, which occurred on 25 October 2010. The GDACS, developed by the European Commission Joint Research Center, combines existing web-based disaster information management systems with the aim of alerting the international community in case of major disasters. The tsunami simulation system is an integral part of the GDACS. In more detail, the study aims to assess the tsunami hazard on the Mentawai and Sumatra coasts: the tsunami heights and arrival times have been estimated employing three propagation models based on long wave theory.
    The analysis was performed in three stages: (1) pre-calculated simulations using the tsunami scenario database for that region, which the GDACS system uses to estimate the alert level; (2) near-real-time simulated tsunami forecasts, automatically performed by the GDACS system whenever a new earthquake is detected by the seismological data providers; and (3) post-event tsunami calculations using GCMT (Global Centroid Moment Tensor) fault mechanism solutions proposed by the US Geological Survey (USGS) for this event. The GDACS system estimates the alert level based on the first type of calculations and on that basis sends alert messages to its users; the second type of calculations is available within 30-40 min after the notification of the event but does not change the estimated alert level. The third type of calculations is performed to improve the initial estimations and to have a better understanding of the extent of the possible damage. The automatic alert level for the earthquake was given between Green and Orange Alert, which, in the logic of GDACS, means no or moderate need of international humanitarian assistance; however, the earthquake generated 3 to 9 m tsunami run-up along the southwestern coasts of the Pagai Islands, where 431 people died. The post-event calculations indicated medium-high humanitarian impacts.

  10. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    NASA Astrophysics Data System (ADS)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-05-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as the "time-correlation algorithm" (TCA; Grilli et al., Pure Appl. Geophys. 173(12):3895-3934, 2016a; 174(1):3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.
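
    A highly simplified, hypothetical sketch of the idea behind the time-correlation algorithm is given below: compare the lagged correlation between two radar cells in a recent window against a reference correlation from the preceding interval. The window lengths, lag handling, and contrast definition are placeholders, not the published algorithm.

```python
# Simplified, hypothetical sketch of the time-correlation idea: a jump in the
# lagged correlation between two radar cells, relative to a reference window,
# suggests a tsunami arrival. Not the published TCA.
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation between x(t) and y(t + lag) over their overlap."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

def correlation_contrast(sig_a, sig_b, lag, win=600, ref_win=3600):
    """Contrast between the most recent window and a longer reference window."""
    recent = lagged_corr(sig_a[-win:], sig_b[-win:], lag)
    reference = lagged_corr(sig_a[-ref_win:-win], sig_b[-ref_win:-win], lag)
    return abs(recent) - abs(reference)   # a jump suggests a tsunami arrival
```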
  11. Tsunami Risk in the NE Atlantic: Pilot Study for Algarve Portugal and Applications for future TWS

    NASA Astrophysics Data System (ADS)

    Omira, R.; Baptista, M. A.; Catita, C.; Carrilho, F.; Matias, L.

    2012-04-01

    Tsunami risk assessment is an essential component of any Tsunami Early Warning System due to its significant contribution to disaster reduction by providing valuable information that serves as a basis for mitigation preparedness and strategies. Generally, risk assessment combines the outputs of the hazard and the vulnerability assessment for the considered exposed elements. In the NE Atlantic region, the tsunami hazard is relatively well established through compilation of historical tsunami events, evaluation of tsunamigenic sources and impact computations for site-specific coastal areas. Tsunami vulnerability, in contrast, remains poorly investigated in spite of a few studies that focused on limited coastal areas of the Gulf of Cadiz region. This work presents a pilot study for tsunami risk assessment that covers about 170 km of the coast of the Algarve region, south of Portugal. This area of high coastal occupation and touristic activity was strongly impacted by the 1755 tsunami event, as reported in various historical documents. An approach based upon a combination of tsunami hazard and vulnerability is developed in order to take into account the dynamic aspect of tsunami risk in the region, which depends on the variation of hazard and vulnerability of exposed elements from one coastal point to another. The hazard study is based upon the consideration of the most credible earthquake scenarios and the derivation of hazard maps through hydrodynamic modeling of inundation and tsunami arrival time. The vulnerability assessment is performed by: i) the analysis of the occupation and the population density, ii) derivation of evacuation maps and safe shelters, and iii) the analysis of population response and evacuation times. Different risk levels ranging from "low" to "high" are assigned to the coasts of the studied area. Variation of human tsunami risk between the high and low touristic seasons is also considered in this study and aims to produce different tsunami risk-related scenarios. Results are presented in terms of thematic maps and GIS layers highlighting information on inundation depths and limits, evacuation plans and safe shelters, tsunami vulnerability, evacuation times and tsunami risk levels. Results can be used for national and regional tsunami disaster management and planning. This work is funded by the TRIDEC (Collaborative, Complex and Critical Decision-Support in Evolving Crises) FP7 EU project and by the MAREMOTI (Mareograph and field tsunami observations, modeling and vulnerability studies for Northeast Atlantic and western Mediterranean) French project.
    Keywords: Tsunami, Algarve-Portugal, Evacuation, Vulnerability, Risk

  12. Numerical Aspects of Nonhydrostatic Implementations Applied to a Parallel Finite Element Tsunami Model

    NASA Astrophysics Data System (ADS)

    Fuchs, A.; Androsov, A.; Harig, S.; Hiller, W.; Rakowsky, N.

    2012-04-01

    Given the threat of devastating tsunamis and the unpredictability of such events, tsunami modelling as part of warning systems remains a contemporary topic. The tsunami group of the Alfred Wegener Institute developed the simulation tool TsunAWI as a contribution to the Early Warning System in Indonesia. Although the precomputed scenarios for this purpose qualify as satisfying deliverables, the study of further improvements continues. While TsunAWI is governed by the Shallow Water Equations, an extension of the model is based on a nonhydrostatic approach. At the arrival of a tsunami wave in coastal regions with rough bathymetry, the term containing the nonhydrostatic part of pressure, which is neglected in the original hydrostatic model, gains in importance. Taking this term into account, a better approximation of the wave is expected. Differences between hydrostatic and nonhydrostatic model results are contrasted in the standard benchmark problem of a solitary wave runup on a plane beach. The observation data provided by Titov and Synolakis (1995) serve as reference. The nonhydrostatic approach implies a set of equations that are similar to the Shallow Water Equations, so the variation of the code can be implemented on top. However, these additional routines introduce a number of issues that must be addressed. So far the computations of the model were purely explicit. In the nonhydrostatic version the determination of an additional unknown and the solution of a large sparse system of linear equations are necessary. The latter constitutes the lion's share of computing time and memory requirements. Since the corresponding matrix is only symmetric in structure and not in values, an iterative Krylov subspace method is used, in particular the restarted Generalized Minimal Residual algorithm GMRES(m). With regard to optimization, we present a comparison of several combinations of sequential and parallel preconditioning techniques with respect to the number of iterations and setup/application time. Since the software package pARMS 3.2, which provides the solver and preconditioning techniques, works via MPI parallelism, we adapted TsunAWI in an auxiliary branch and switched from OpenMP to MPI, with particular attention to internal partition management.
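
    The sparse solve described above can be illustrated with a hedged sketch using SciPy's restarted GMRES(m) and an ILU preconditioner on a toy matrix that is symmetric in structure but not in values; this is not TsunAWI's actual solver configuration.

```python
# Hedged illustration only: solve a structurally symmetric, non-symmetric
# sparse system with restarted GMRES(m) and an ILU preconditioner.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2000
# Toy non-symmetric sparse matrix with a symmetric sparsity pattern.
main = 4.0 * np.ones(n)
upper = -1.0 * np.ones(n - 1)
lower = -1.5 * np.ones(n - 1)
A = sp.diags([lower, main, upper], offsets=[-1, 0, 1], format="csc")
b = np.ones(n)

ilu = spla.spilu(A, drop_tol=1e-5, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)   # preconditioner as an operator

x, info = spla.gmres(A, b, M=M, restart=30, maxiter=500)
print("converged" if info == 0 else f"info = {info}", np.linalg.norm(A @ x - b))
```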
  13. The U.S. East Coast Meteotsunami of June 13, 2013

    NASA Astrophysics Data System (ADS)

    Knight, W. R.; Whitmore, P.; Kim, Y.; Wang, D.; Becker, N. C.; Weinstein, S.; Walker, K.

    2013-12-01

    NOAA's two Tsunami Warning Centers (TWCs) provide advance notification to coastal communities concerning tsunami hazards. While the focus is primarily on seismic sources, the U.S. East Coast event of June 13, 2013 indicates the importance of understanding and forecasting atmospherically-driven tsunamis, or meteotsunamis, as well. Here we describe an approach which explains the generation of this event by atmospheric processes, and suggests that the causative forces can be monitored and used to forecast meteotsunami occurrence. The U.S. East Coast tsunami of June 13, 2013 was well recorded at tide gauges from North Carolina to Massachusetts as well as at Bermuda and Puerto Rico. It also triggered DART 44402, just east of the Atlantic shelf break at 39.4N. As there was no seismic energy release associated with the tsunami and an eastward propagating major weather system crossed the Atlantic coast just before the tsunami, the focus turned to atmospheric forcing. Tsunami forecast models used at the two U.S. TWCs were modified to introduce moving atmospheric pressure distributions as sources. In a simple case, a north-south oriented line air pressure jump of width 50 km and pressure of 4 mb at sea level was moved eastward at 20 m/s. The speed matched both the storm speed at the coast and the long wave speed for 40 m deep water, thus allowing for resonant coupling of atmosphere to ocean in the shelf region (Proudman resonance). Considering the simplicity of the source, a reasonable comparison between the modeled and observed tsunami was obtained with regard to arrival time and height. The proposed source also offers an explanation of the later wave arrivals at US tide gauges. These typically lagged the arrival at Bermuda, a location much further east. This pattern can be explained within the context of Proudman resonance if the waves arriving at coastal stations originated at the shelf break as reflected waves. Model animations of wave dynamics corroborate this phenomenon. The contribution of edge waves generated as the system moves over the coast is also examined. Remaining questions include the importance of shelf parameters in setting the wave fetch and the 'Q' of Proudman resonance along the Atlantic coastline. In other words, are some stretches of shelf more conducive to tsunami formation than others? Wind stress was disregarded in the initial modeling work, leaving its possible importance as another unanswered question. Operational questions include how to detect likely meteotsunami conditions with real-time meteorological measurements, and what form alerts should take. The minimum necessary temporal resolution of the pressure sensors, along with their density and siting, needs to be determined. Because details of the source, such as direction and speed of propagation, will likely subject unique sections of coastline to tsunami attack, the detailed analysis of data from sensor arrays to be used in forecasting will be important.
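
    A back-of-the-envelope check of the Proudman resonance condition quoted above: the disturbance couples resonantly where its translation speed matches the shallow-water long-wave speed sqrt(g*h), which is about 20 m/s in 40 m of water.

```python
# Back-of-the-envelope check of the Proudman resonance condition: the pressure
# disturbance couples resonantly where its speed matches sqrt(g*h).
from math import sqrt

G = 9.81  # m/s^2

def long_wave_speed(depth_m):
    return sqrt(G * depth_m)

def resonant_depth(storm_speed_ms):
    """Water depth at which a disturbance moving at storm_speed_ms is resonant."""
    return storm_speed_ms ** 2 / G

print(long_wave_speed(40.0))     # ~19.8 m/s, close to the 20 m/s storm speed
print(resonant_depth(20.0))      # ~40.8 m
```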
  14. Development of Physics and Control of Multiple Forcing Mechanisms for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Whitmore, P.; Macpherson, K. A.; Knight, W. R.

    2016-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes or other mechanisms in either the Pacific Ocean, Atlantic Ocean or Gulf of Mexico. At the U.S. National Tsunami Warning Center (NTWC), the use of the model has been mainly for tsunami pre-computation due to earthquakes. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. The model has also been used for tsunami hindcasting due to submarine landslides and due to atmospheric pressure jumps, but in a very case-specific and somewhat limited manner. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communication between the domains of each parent-child pair as waves approach coastal waters. The shallow-water wave physics is readily applicable to all of the above tsunamis as well as to tides. Recently, the model has been expanded to include multiple forcing mechanisms in a systematic fashion, and to enhance the model physics for non-earthquake events. ATFM is now able to handle multiple source mechanisms, either individually or jointly, which include earthquake, submarine landslide, meteo-tsunami and tidal forcing. As for earthquakes, the source can be a single unit source or multiple, interacting source blocks. A horizontal slip contribution can be added to the sea-floor displacement. The model now includes submarine landslide physics, modeling the source either as a rigid slump or as a viscous fluid. Additional shallow-water physics have been implemented for the viscous submarine landslides. With rigid slumping, any trajectory can be followed. As for meteo-tsunamis, the forcing mechanism is capable of following any trajectory shape. Wind stress physics has also been implemented for the meteo-tsunami case, if required. As an example of multiple sources, a near-field model of the tsunami produced by a combination of earthquake and submarine landslide forcing, which happened in Papua New Guinea on July 17, 1998, is provided.
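
    For orientation only, the sketch below steps a deliberately minimal 1-D, linear, flat-bottom version of the depth-averaged shallow-water equations on a staggered grid; the operational ATFM is nonlinear, two-dimensional, and multiply nested, so this illustrates the underlying long-wave physics rather than the model itself, and all parameters are illustrative.

```python
# Minimal 1-D, linear, flat-bottom shallow-water stepper on a staggered grid.
# Illustration of depth-averaged long-wave physics only; parameters are toy values.
import numpy as np

g, h = 9.81, 4000.0               # gravity (m/s^2) and uniform depth (m)
nx, dx = 400, 5000.0              # grid points and spacing (m)
dt = 0.5 * dx / np.sqrt(g * h)    # time step satisfying the CFL condition

eta = np.exp(-((np.arange(nx) - nx / 2) * dx / 5.0e4) ** 2)  # initial hump (m)
u = np.zeros(nx + 1)              # velocities on cell faces (staggered)

for _ in range(300):
    # momentum: du/dt = -g * d(eta)/dx  (interior faces only)
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    # continuity: d(eta)/dt = -h * du/dx
    eta -= dt * h * (u[1:] - u[:-1]) / dx

print(float(eta.max()))  # the hump splits into two waves travelling at sqrt(g*h)
```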
  15. The SAFRR Tsunami Scenario: from Publication to Implementation

    NASA Astrophysics Data System (ADS)

    Ross, S.; Jones, L.; Miller, K.; Wilson, R. I.; Burkett, E. R.; Bwarie, J.; Campbell, N. M.; Johnson, L. A.; Long, K.; Lynett, P. J.; Perry, S. C.; Plumlee, G. S.; Porter, K.; Real, C. R.; Ritchie, L. A.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2014-12-01

    The SAFRR Tsunami Scenario modeled a hypothetical but plausible tsunami, created by an Mw 9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We presented the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the scenario tsunami. The intended users were those responsible for making mitigation decisions before and those who need to make rapid decisions during future tsunamis. The Tsunami Scenario process is being evaluated by the University of Colorado's Natural Hazards Center; this is the first time that a USGS scenario of this scale has been formally and systematically evaluated by an external party. The SAFRR Tsunami Scenario was publicly introduced in September 2013 through a series of regional workshops in California that brought together emergency managers, maritime authorities, first responders, elected officials and staffers, the business sector, state agencies, local media, scientific partners, and special districts such as utilities (http://pubs.usgs.gov/of/2013/1170/). In March 2014, NOAA's annual tsunami warning exercise, PACIFEX, was based on the SAFRR Tsunami Scenario. Many groups conducted exercises associated with PACIFEX, including the State of Washington and several counties in California. San Francisco had the most comprehensive exercise, with a 3-day functional exercise based on the SAFRR Tsunami Scenario. In addition, the National Institutes of Health ran an exercise at the Ports of Los Angeles and Long Beach in April 2014, building on the Tsunami Scenario, focusing on the recovery phase and adding a refinery fire. The benefits and lessons learned include: 1) stimulating dialogue among practitioners to solve problems; 2) seeing groups add extra components to their exercises that best address their specific concerns; 3) providing groups with information packaged specifically for them; 4) recognizing the value of having scenario developers personally present the scenario to user groups; and 5) having the SAFRR work applied to support ongoing activities by, and future directions of, the California state tsunami program.
  16. Tsunami evacuation analysis, modelling and planning: application to the coastal area of El Salvador

    NASA Astrophysics Data System (ADS)

    Gonzalez-Riancho, Pino; Aguirre-Ayerbe, Ignacio; Aniel-Quiroga, Iñigo; Abad Herrero, Sheila; González Rodriguez, Mauricio; Larreynaga, Jeniffer; Gavidia, Francisco; Quetzalcoalt Gutiérrez, Omar; Álvarez-Gómez, Jose Antonio; Medina Santamaría, Raúl

    2014-05-01

    Advances in the understanding and prediction of tsunami impacts allow the development of risk reduction strategies for tsunami-prone areas. Conducting adequate tsunami risk assessments is essential, as the hazard, vulnerability and risk assessment results allow the identification of adequate, site-specific and vulnerability-oriented risk management options, with the formulation of a tsunami evacuation plan being one of the main expected results. An evacuation plan requires the analysis of the territory and an evaluation of the relevant elements (hazard, population, evacuation routes, and shelters), the modelling of the evacuation, and the proposal of alternatives for those communities located in areas with limited opportunities for evacuation. Evacuation plans, which are developed by the responsible authorities and decision makers, would benefit from a clear and straightforward connection between the scientific and technical information from tsunami risk assessments and the subsequent risk reduction options. Scientifically based evacuation plans would translate into benefits for society in terms of mortality reduction. This work presents a comprehensive framework for the formulation of tsunami evacuation plans based on tsunami vulnerability assessment and evacuation modelling. This framework considers (i) the hazard aspects (tsunami flooding characteristics and arrival time), (ii) the characteristics of the exposed area (people, shelters and road network), (iii) the current tsunami warning procedures and timing, (iv) the time needed to evacuate the population, and (v) the identification of measures to improve the evacuation process, such as the potential location of vertical evacuation shelters and alternative routes. The proposed methodological framework aims to bridge the gap between risk assessment and risk management in terms of tsunami evacuation, as it allows for an estimation of the degree of evacuation success of specific management options, as well as for the classification and prioritization of the gathered information, in order to formulate an optimal evacuation plan. The framework has been applied to the El Salvador case study through the project "Tsunami Hazard and Risk Assessment in El Salvador", funded by AECID during the period 2009-12, demonstrating its applicability to site-specific response times and population characteristics.
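
    The core timing comparison in such evacuation modelling can be illustrated with a toy check of whether reaction time plus walking time beats the tsunami arrival time; the speeds and times below are placeholders, not values from the El Salvador study.

```python
# Toy, hypothetical illustration of an evacuation-timing check: reaction time
# plus walking time versus tsunami arrival time. All values are placeholders.
def evacuation_ok(distance_m, arrival_min, walking_speed_ms=1.1, reaction_min=10.0):
    """Return True if a pedestrian can reach the shelter before the tsunami."""
    travel_min = distance_m / walking_speed_ms / 60.0
    return reaction_min + travel_min <= arrival_min

print(evacuation_ok(distance_m=1500, arrival_min=35))  # True: ~32.7 min needed
print(evacuation_ok(distance_m=2500, arrival_min=35))  # False: ~47.9 min needed
```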
  17. Complex earthquake rupture and local tsunamis

    USGS Publications Warehouse

    Geist, E.L.

    2002-01-01

    In contrast to far-field tsunami amplitudes that are fairly well predicted by the seismic moment of subduction zone earthquakes, there exists significant variation in the scaling of local tsunami amplitude with respect to seismic moment. From a global catalog of tsunami runup observations, this variability is greatest for the most frequently occurring tsunamigenic subduction zone earthquakes in the magnitude range of 7 < Mw < 8.5. Variability in local tsunami runup scaling can be ascribed to tsunami source parameters that are independent of seismic moment: variations in the water depth in the source region, the combination of higher slip and lower shear modulus at shallow depth, and rupture complexity in the form of heterogeneous slip distribution patterns. The focus of this study is on the effect that rupture complexity has on the local tsunami wave field. A wide range of slip distribution patterns are generated using a stochastic, self-affine source model that is consistent with the falloff of far-field seismic displacement spectra at high frequencies. The synthetic slip distributions generated by the stochastic source model are discretized, and the vertical displacement fields from point source elastic dislocation expressions are superimposed to compute the coseismic vertical displacement field. For shallow subduction zone earthquakes it is demonstrated that self-affine irregularities of the slip distribution result in significant variations in local tsunami amplitude. The effects of rupture complexity are less pronounced for earthquakes at greater depth or along faults with steep dip angles. For a test region along the Pacific coast of central Mexico, peak nearshore tsunami amplitude is calculated for a large number (N = 100) of synthetic slip distribution patterns, all with identical seismic moment (Mw = 8.1). Analysis of the results indicates that for earthquakes of a fixed location, geometry, and seismic moment, peak nearshore tsunami amplitude can vary by a factor of 3 or more. These results indicate that there is substantially more variation in the local tsunami wave field derived from the inherent complexity of subduction zone earthquakes than predicted by a simple elastic dislocation model. Probabilistic methods that take into account variability in earthquake rupture processes are likely to yield more accurate assessments of tsunami hazards.
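
    One simple, hypothetical way to generate a random slip profile with a self-affine spectral falloff, loosely in the spirit of the stochastic source model described above, is spectral synthesis with random phases, as sketched below; the corner wavenumber, decay exponent, and scaling are illustrative and not taken from the study.

```python
# Illustrative sketch (not Geist's implementation): random 1-D slip profile
# with a power-law (approximately k^-2) spectral falloff via random phases.
import numpy as np

def stochastic_slip(n=256, mean_slip=5.0, corner=3, decay=2.0, seed=0):
    """Return a length-n slip profile (m) with power-law spectral falloff."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n) * n                     # wavenumber index 0..n/2
    amp = np.ones_like(k)
    amp[k > corner] = (k[k > corner] / corner) ** (-decay)
    spec = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, k.size))
    spec[0] = 0.0                                  # zero-mean field before shifting
    slip = np.fft.irfft(spec, n)
    slip = slip - slip.min()                       # shift to non-negative slip
    return slip * mean_slip / slip.mean()          # rescale to the target mean

profile = stochastic_slip()
print(profile.mean(), profile.max())
```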
  18. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    NASA Astrophysics Data System (ADS)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw 9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before and rapid decisions during future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and short warning time, caused by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million of damage to cargo and additional downtime. The direct exposure of port trade value totals over $1.2 billion, while associated business interruption losses in the California economy could more than triple that value. Other estimated damages include $1.8 billion of property damage and $85 million for highway and railroad repairs. In total, we have estimated repair and replacement costs of almost $3 billion to California marinas, coastal properties and the POLA/LB. These damages could cause $6 billion of business interruption losses in the California economy, but that could be reduced by 80-90% with the implementation of business continuity or resilience strategies. This scenario provides the basis for improving preparedness, mitigation, and continuity planning for tsunamis, which can reduce damage and economic impacts and enhance recovery efforts. Two positive outcomes have already resulted from the SAFRR Tsunami Scenario. Emergency managers in areas where the scenario inundation exceeds the State's maximum inundation zone have been notified, and evacuation plans have been updated appropriately. The State has also worked with NOAA's West Coast and Alaska Tsunami Warning Center to modify future message protocols to facilitate effective evacuations in California. While our specific results pertain to California, the lessons learned and our scenario approach can be applied to other regions.
  19. Tsunami Detection by High Frequency Radar Beyond the Continental Shelf: II. Extension of Time Correlation Algorithm and Validation on Realistic Case Studies

    NASA Astrophysics Data System (ADS)

    Grilli, Stéphan T.; Guérin, Charles-Antoine; Shelby, Michael; Grilli, Annette R.; Moran, Patrick; Grosdidier, Samuel; Insua, Tania L.

    2017-08-01

    In past work, tsunami detection algorithms (TDAs) have been proposed, and successfully applied to offline tsunami detection, based on analyzing tsunami currents inverted from high-frequency (HF) radar Doppler spectra. With this method, however, the detection of small and short-lived tsunami currents in the most distant radar ranges is challenging due to conflicting requirements on the Doppler spectra integration time and resolution. To circumvent this issue, in Part I of this work, we proposed an alternative TDA, referred to as the time correlation (TC) TDA, that does not require inverting currents but instead detects changes in patterns of correlations of radar signal time series measured in pairs of cells located along the main directions of tsunami propagation (predicted by geometric optics theory); such correlations can be maximized when one signal is time-shifted by the pre-computed long wave propagation time. We initially validated the TC-TDA based on numerical simulations of idealized tsunamis in a simplified geometry. Here, we further develop, extend, and apply the TC algorithm to more realistic tsunami case studies. These are performed in the area west of Vancouver Island, BC, where Ocean Networks Canada recently deployed a HF radar (in Tofino, BC) to detect tsunamis from far- and near-field sources, up to a 110 km range. Two case studies are considered, both simulated using long wave models: (1) a far-field seismic tsunami, and (2) a near-field landslide tsunami. Pending the availability of radar data, a radar signal simulator is parameterized for the Tofino HF radar characteristics, in particular its signal-to-noise ratio with range, and combined with the simulated tsunami currents to produce realistic time series of backscattered radar signal from a dense grid of cells. Numerical experiments show that the arrival of a tsunami causes a clear change in radar signal correlation patterns, even at the most distant ranges beyond the continental shelf, thus making an early tsunami detection possible with the TC-TDA. Based on these results, we discuss how the new algorithm could be combined with the standard methods proposed earlier, based on a Doppler analysis, to develop a new tsunami detection system based on HF radar data that could increase warning time. This will be the object of future work, which will be based on actual, rather than simulated, radar data.
  20. New Perspective of Tsunami Deposit Investigations: Insight from the 1755 Lisbon Tsunami in Martinique, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Roger, J.; Clouard, V.; Moizan, E.

    2014-12-01

    The devastating tsunamis of recent decades have highlighted the essential necessity of deploying operational warning systems and educating coastal populations. This cannot be prepared correctly without a minimum knowledge of tsunami history. That is the case of the Lesser Antilles islands, where a few handfuls of tsunamis have been reported over the past five centuries, some of them leading to notable destruction and inundation. But the lack of accurate details for most of the historical tsunamis, and the limited period for which written information is available, represents an important problem for tsunami hazard assessment in this region. Thus, it is of major necessity to try to find other evidence of past tsunamis by looking for sedimentary deposits. Unfortunately, tropical island environments do not seem to be the best places for such deposits to remain buried. In fact, heavy rainfall, storms, and all other phenomena leading to coastal erosion, together with human activities such as intensive sugarcane cultivation in coastal flat lands, could cause the loss of potential tsunami deposits. Many places within the Lesser Antilles (from Sainte-Lucia to the British Virgin Islands) have been carefully investigated over the last three years and nothing convincing has been found. It was when archaeological investigations excavated an 8-cm-thick sandy and shelly layer in downtown Fort-de-France (Martinique), wedged between two well-identified layers of human origin (Fig. 1), that we found new hope: this sandy layer has been quickly attributed, without any doubt, to the 1755 tsunami, using on one hand the information provided by historical reports of the construction sites, and on the other hand numerical modeling of the tsunami (wave heights, velocity fields, etc.) showing the ability of this transoceanic tsunami to wrap around the island after ~7 hours of propagation, enter Fort-de-France's bay with enough energy to carry sediments, and inundate it. In light of this discovery, we conclude that tsunami markers could have been simply buried and preserved by human earthmoving, leveling and other building activities. It also shows how collaborative research involving geology and archaeology could chart a new course to greatly improve our tsunami databases.
  21. Comparable Analysis of the Distribution Functions of Runup Heights of the 1896, 1933 and 2011 Japanese Tsunamis in the Sanriku Area

    NASA Astrophysics Data System (ADS)

    Choi, B. H.; Min, B. I.; Yoshinobu, T.; Kim, K. O.; Pelinovsky, E.

    2012-04-01

    Data from a field survey of the 2011 tsunami in the Sanriku area of Japan are presented and used to plot the distribution function of runup heights along the coast. It is shown that the distribution function can be approximated using a theoretical log-normal curve [Choi et al., 2002]. The characteristics of the distribution functions derived from the runup-height data obtained during the 2011 event are compared with data from two previous gigantic tsunamis (1896 and 1933) that occurred in almost the same region. The number of observations for the last tsunami is very large (more than 5,247), which provides an opportunity to revise the conception of the distribution of tsunami wave heights and the relationship between statistical characteristics and the number of observations suggested by Kajiura [1983]. The distribution function of the 2011 event demonstrates sensitivity to the number of observation points (many of which cannot be considered independent measurements) and can be used to determine the characteristic scale of the coast, which corresponds to the statistical independence of observed wave heights.
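
    A hedged sketch of fitting a log-normal distribution to surveyed runup heights, as in the comparison described above, is given below; the sample heights are invented and the fit is the simple moment estimate on log-transformed values.

```python
# Hedged sketch: fit a log-normal distribution to runup heights via moments of
# the log-transformed sample. The heights below are invented example values.
from math import erf, sqrt
import numpy as np

heights = np.array([2.1, 3.4, 5.0, 7.8, 10.2, 12.5, 15.0, 18.3, 22.4, 30.1])  # m

log_h = np.log(heights)
mu, sigma = log_h.mean(), log_h.std(ddof=1)   # parameters of the log-normal fit

def exceedance(h):
    """Probability of exceeding runup h under the fitted log-normal."""
    z = (np.log(h) - mu) / (sigma * sqrt(2.0))
    return 0.5 * (1.0 - erf(z))

print(mu, sigma, exceedance(20.0))
```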
  22. A SDMS Model: Early Warning Coordination Centres

    NASA Astrophysics Data System (ADS)

    Santos-Reyes, Jaime

    2010-05-01

    Following the tsunami disaster in 2004, the Secretary-General of the United Nations (UN), Kofi Annan, called for a global early warning system for all hazards and for all communities. He also requested the ISDR (International Strategy for Disaster Reduction) and its UN partners to conduct a global survey of capacities, gaps and opportunities in relation to early warning systems. The resulting report, "Global Survey of Early Warning Systems", concluded that while much progress has been made on early warning systems and great capabilities are available around the world, many gaps and shortcomings remain. However, it may be argued that an early warning system (EWS) may not be enough to prevent fatalities due to a natural hazard; i.e., it should be seen as part of a 'wider' or total system. Furthermore, an EWS may work very well when assessed individually, but it is not clear whether it will contribute to accomplishing the purpose of the 'total disaster management system', i.e., to prevent fatalities. For instance, a regional EWS may only work if it is well coordinated with the local warning and emergency response systems that ensure that the warning is received, communicated and acted upon by the potentially affected communities. It may be argued that without these local measures being in place, a regional EWS will have little impact in saving lives. Researchers argue that unless people are warned in remote areas, the technology is useless; for instance, McGuire [5] argues that: "I have no doubt that the technical element of the warning system will work very well," … "But there has to be an effective and efficient communications cascade from the warning centre to the fisherman on the beach and his family and the bar owners." Similarly, McFadden [6] states that: "There's no point in spending all the money on a fancy monitoring and a fancy analysis system unless we can make sure the infrastructure for the broadcast system is there," … "That's going to require a lot of work. If it's a tsunami, you've got to get it down to the last Joe on the beach. This is the stuff that is really very hard." Given the above, this paper argues that there is a need for a systemic approach to early warning centres. Systemic means looking upon things as a system; systemic means seeing pattern and inter-relationship within a complex whole, i.e., seeing events as products of the working of a system. A system may be defined as a whole which is made of parts and relationships. Given this, 'failure' may be seen as the product of a system and, within that, death, injury, property loss, etc. as results of the working of systems. This paper proposes a preliminary model of 'early warning coordination centres' (EWCC); it should be highlighted that an EWCC is a subsystem of the Systemic Disaster Management System (SDMS) model.
In 2009 the California Emergency Management Agency (CalEMA) and the California Geological Survey (CGS) completed tsunami inundation modeling and mapping for all low-lying, populated coastal areas of California to assist local jurisdictions on the coast in identifying areas that could be inundated by a tsunami. "Tsunami Inundation Maps for Emergency Planning" have provided the basis for some of the following preparedness, planning, and education activities in California: Improved evacuation and emergency response plans; Production of multi-language brochures: statewide, community, and boating; Development and support of tsunami scenario-driven exercises and drills; Development of workshops to educate both emergency managers and the public; and Establishment of a comprehensive information website, www.tsunami.ca.gov, and a preparedness website, myhazards.calema.ca.gov. In addition, the California Tsunami Program has a number of initiatives underway through existing work plans to continue to apply scientifically vetted information toward comprehensive public understanding of the threat from future tsunamis to constituents on the coast. These include projects to: Complete tsunami land-use planning maps for California communities, Develop in-harbor tsunami hazard maps statewide, Complete modeling of offshore safety zones for the maritime community, Complete a preliminary tsunami risk analysis for the state utilizing the new HAZUS tsunami module and probabilistic analysis results, and Develop a post-tsunami recovery and resiliency plan for the state.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNH52A..07C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNH52A..07C"><span>Lessons for tsunami risk mitigation from recent events occurred in Chile: research findings for alerting and evacuation from interdisciplinary perspectives</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cienfuegos, R.; Catalan, P. A.; Leon, J.; Gonzalez, G.; Repetto, P.; Urrutia, A.; Tomita, T.; Orellana, V.</p> <p>2016-12-01</p> <p>In the wake of the 2010 tsunami that hit Chile, a major public effort to promote interdisciplinary disaster research was undertaken by the Comisión Nacional de Investigación Científica y Tecnológica (Conicyt), allocating funds to create the Center for Integrated Research on Natural Risks Management (CIGIDEN). This effort has been key in promoting collaboration between national and international research teams in order to transform the frequent occurrence of extreme events that affect Chile into an opportunity for interdisciplinary research. In this presentation we will summarize some of the fundamental research findings regarding tsunami forecasting, alerting, and evacuation processes based on interdisciplinary field work campaigns and modeling efforts conducted in the wake of the three most recent destructive events that hit Chile in 2010, 2014, and 2015. One of the main results that we shall emphasize from these findings is that, while research and operational efforts to model and forecast tsunamis are important, technological positivism should not undermine educational efforts, which have proved to be effective in reducing casualties due to tsunamis in the near field.
Indeed, in recent events that hit Chile, the first tsunami waves reached the coasts adjacent to the generation zones on time scales comparable with the time required for data gathering and modeling, even for the most sophisticated tsunami early warning algorithms currently available. The latter emphasizes the importance of self-evacuation from coastal areas, while forecasting and monitoring of tsunami hazards remain very important for alerting more distant areas and are essential for alert cancellation, especially when shelf and embayment resonance and edge-wave propagation may produce destructive late tsunami arrivals several hours after the nucleation of the earthquake. By combining some of the recent evidence we have gathered in Chile on seismic source uncertainties (both epistemic and aleatoric), tsunami hydrodynamics, the response of official national institutions in charge of emergency management, and the evacuation processes observed, we will attempt to bring some elements to the discussion of the complex balance between technological positivism and risk awareness and education programs, which may help prioritize funding efforts in tsunami-prone regions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AGUFMNH14A..05C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AGUFMNH14A..05C"><span>Modeling of the 2011 Tohoku-oki Tsunami and its Impacts on Hawaii</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cheung, K.; Yamazaki, Y.; Roeber, V.; Lay, T.</p> <p>2011-12-01</p> <p>The 2011 Tohoku-oki great earthquake (Mw 9.0) generated a destructive tsunami along the entire Pacific coast of northeastern Japan. The tsunami, which registered 6.7 m amplitude at a coastal GPS gauge and 1.75 m at an open-ocean DART buoy, triggered warnings across the Pacific. The waves reached Hawaii 7 hours after the earthquake and caused localized damage and persistent coastal oscillations along the island chain. Several tide gauges and a DART buoy west of Hawaii Island recorded clear signals of the tsunami. The Tsunami Observer Program of Hawaii State Civil Defense immediately conducted field surveys to gather runup and inundation data on Kauai, Oahu, Maui, and Hawaii Island. The extensive global seismic networks and geodetic instruments allow evaluation and validation of finite fault solutions for tsunami modeling. We reconstruct the 2011 Tohoku-oki tsunami using the long-wave model NEOWAVE (Non-hydrostatic Evolution of Ocean WAVEs) and a finite fault solution based on inversion of teleseismic P waves. The depth-integrated model describes dispersive waves through the non-hydrostatic pressure and vertical velocity, which also account for tsunami generation from time histories of seafloor deformation. The semi-implicit, staggered finite difference model captures flow discontinuities associated with bores or hydraulic jumps through the momentum-conserved advection scheme. Four levels of two-way nested grids in spherical coordinates allow description of tsunami evolution processes of different time and spatial scales for investigation of the impacts around the Hawaiian Islands. The model results are validated with DART data across the Pacific as well as tide gauge and runup measurements in Hawaii.
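As a rough, editorial illustration of the validation step described in the record above (scoring model output against DART and tide gauge records), here is a minimal Python sketch. The series, sampling interval, and misfit choices are hypothetical placeholders, not data or methods from the NEOWAVE study.

```python
# Minimal sketch: score a modeled tsunami waveform against an observed record.
# The arrays below are synthetic stand-ins, not data from the study above.
import numpy as np

def waveform_misfit(observed, modeled):
    """Return RMS misfit and zero-lag correlation of two equal-length series."""
    observed = np.asarray(observed, dtype=float)
    modeled = np.asarray(modeled, dtype=float)
    rms = np.sqrt(np.mean((observed - modeled) ** 2))
    corr = np.corrcoef(observed, modeled)[0, 1]
    return rms, corr

if __name__ == "__main__":
    t = np.arange(0, 3600, 60.0)                       # one hour, 1-minute samples
    obs = 0.50 * np.sin(2 * np.pi * t / 1200)          # synthetic "gauge" record (m)
    mod = 0.45 * np.sin(2 * np.pi * (t - 60) / 1200)   # slightly shifted "model" output
    rms, corr = waveform_misfit(obs, mod)
    print(f"RMS misfit: {rms:.3f} m, correlation: {corr:.2f}")
```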
Spectral analysis of the computed surface elevation reveals a series of resonance modes over the insular shelf and slope complex along the archipelago. Resonance oscillations provide an explanation for the localized impacts and the persistent wave activities in the aftermath. The model results provide insights into effects of fringing reefs, which are present along 70% of Hawaii's coastlines, on tsunami transformation and runup processes. This case study improves our understanding of tsunamis in tropical island environment and validates the modeling capability to predict their impacts for hazard mitigation and emergency management.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16844643','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16844643"><span>Sumatran megathrust earthquakes: from science to saving lives.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Sieh, Kerry</p> <p>2006-08-15</p> <p>Most of the loss of life, property and well-being stemming from the great Sumatran earthquake and tsunami of 2004 could have been avoided and losses from similar future events can be largely prevented. However, achieving this goal requires forging a chain linking basic science-the study of why, when and where these events occur-to people's everyday lives. The intermediate links in this chain are emergency response preparedness, warning capability, education and infrastructural changes. In this article, I first describe our research on the Sumatran subduction zone. This research has allowed us to understand the basis of the earthquake cycle on the Sumatran megathrust and to reconstruct the sequence of great earthquakes that have occurred there in historic and prehistoric times. On the basis of our findings, we expect that one or two more great earthquakes and tsunamis, nearly as devastating as the 2004 event, are to be expected within the next few decades in a region of coastal Sumatra to the south of the zone affected in 2004. I go on to argue that preventing future tragedies does not necessarily involve hugely expensive or high-tech solutions such as the construction of coastal defences or sensor-based tsunami warning systems. More valuable and practical steps include extending the scientific research, educating the at-risk populations as to what to do in the event of a long-lasting earthquake (i.e. one that might be followed by a tsunami), taking simple measures to strengthen buildings against shaking, providing adequate escape routes and helping the residents of the vulnerable low-lying coastal strips to relocate their homes and businesses to land that is higher or farther from the coast. 
Such steps could save hundreds of thousands of lives in the coastal cities and offshore islands of western Sumatra, and have general applicability to strategies for helping the developing nations to deal with natural hazards.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMNH33A0237C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMNH33A0237C"><span>Rapid kinematic finite source inversion for Tsunami Early Warning using high rate GNSS data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chen, K.; Liu, Z.; Song, Y. T.</p> <p>2017-12-01</p> <p>Recently, the Global Navigation Satellite System (GNSS) has been used for rapid earthquake source inversion towards tsunami early warning. In practice, two approaches, i.e., static finite source inversion based on permanent co-seismic offsets and kinematic finite source inversion using high-rate (>= 1 Hz) co-seismic displacement waveforms, are often employed to fulfill the task. The static inversion is relatively easy to implement and does not require additional constraints on rupture velocity, duration, and temporal variation. However, since most GNSS receivers are deployed onshore, on one side of the subduction fault, static finite source inversion has very limited resolution of near-trench fault slip. On the other hand, the high-rate GNSS displacement waveforms, which contain the timing information of earthquake rupture explicitly and static offsets implicitly, have the potential to improve near-trench resolution by reconciling with the depth-dependent megathrust rupture behaviors. In this contribution, we assess the performance of rapid kinematic finite source inversion using high-rate GNSS for three selected historical tsunamigenic cases: the 2010 Mentawai, 2011 Tohoku and 2015 Illapel events. The 2010 Mentawai case is a typical tsunami earthquake with most slip concentrated near the trench. The static inversion has little resolution there and incorrectly puts slip at greater depth (>10 km). In contrast, although the recorded GNSS displacement waveforms are deficient in high-frequency energy, the kinematic source inversion recovers a shallow slip patch (depth less than 6 km), and tsunami runups are predicted quite reasonably. For the other two events, slip from the kinematic and static inversions shows similar characteristics and comparable tsunami scenarios, which may be related to the dense GNSS networks and the behavior of the rupture. Acknowledging the complexity of kinematic source inversion in real time, we adopt the back-projection approach to provide constraints on rupture velocity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2006GeoRL..3313601C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2006GeoRL..3313601C"><span>Distribution of runup heights of the December 26, 2004 tsunami in the Indian Ocean</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Choi, Byung Ho; Hong, Sung Jin; Pelinovsky, Efim</p> <p>2006-07-01</p> <p>A massive earthquake of magnitude 9.3, which occurred on December 26, 2004 off northern Sumatra, generated huge tsunami waves that affected many coastal countries in the Indian Ocean.
A number of field surveys have been performed after this tsunami event; in particular, several surveys on the south/east coast of India, the Andaman and Nicobar Islands, Sri Lanka, Sumatra, Malaysia, and Thailand were organized by the Korean Society of Coastal and Ocean Engineers from January to August 2005. The spatial distribution of the tsunami runup is used to analyze the distribution function of the wave heights on different coasts. A theoretical interpretation of this distribution, which associates the randomness of the coastal bathymetry and coastline with the wave-height statistics, leads to log-normal functions. The observed data are also in very good agreement with the log-normal distribution, confirming the important role of variable ocean bathymetry in the formation of the irregular wave-height distribution along the coasts.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22953236','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22953236"><span>Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Hunter, Jennifer C; Crawley, Adam W; Petrie, Michael; Yang, Jane E; Aragón, Tomás J</p> <p>2012-07-16</p> <p>Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami's impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders' ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels.
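As an aside to the two runup-distribution records above (Choi et al.), the following minimal Python sketch shows one way to fit a log-normal curve to surveyed runup heights and compare empirical and fitted exceedance probabilities. The height values are invented for illustration and are not survey data from those studies.

```python
# Minimal sketch: fit a log-normal distribution to (hypothetical) runup heights
# and compare empirical and fitted exceedance probabilities.
import numpy as np
from math import erf

heights = np.array([1.2, 2.5, 3.1, 4.8, 0.9, 6.2, 2.2, 3.7, 5.4, 1.8])  # m, made up

log_h = np.log(heights)
mu, sigma = log_h.mean(), log_h.std(ddof=1)   # parameters of the fitted log-normal

def lognormal_exceedance(h, mu, sigma):
    """P(H > h) for a log-normal distribution with parameters mu, sigma."""
    return 0.5 * (1.0 - erf((np.log(h) - mu) / (sigma * np.sqrt(2.0))))

sorted_h = np.sort(heights)
empirical = 1.0 - (np.arange(1, sorted_h.size + 1) - 0.5) / sorted_h.size

for h, p_emp in zip(sorted_h, empirical):
    print(f"h={h:4.1f} m  empirical={p_emp:.2f}  log-normal={lognormal_exceedance(h, mu, sigma):.2f}")
```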
In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public. Hunter JC, Crawley AW, Petrie M, Yang JE, Aragón TJ. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake. PLoS Currents Disasters. 2012 Jul 16.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EP%26S...68..139A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EP%26S...68..139A"><span>Assessment of tsunami resilience of Haydarpaşa Port in the Sea of Marmara by high-resolution numerical modeling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Aytore, Betul; Yalciner, Ahmet Cevdet; Zaytsev, Andrey; Cankaya, Zeynep Ceren; Suzen, Mehmet Lütfi</p> <p>2016-08-01</p> <p>Turkey is highly prone to earthquakes because of active fault zones in the region. The Marmara region located at the western extension of the North Anatolian Fault Zone (NAFZ) is one of the most tectonically active zones in Turkey. Numerous catastrophic events such as earthquakes or earthquake/landslide-induced tsunamis have occurred in the Marmara Sea basin. According to studies on the past tsunami records, the Marmara coasts have been hit by 35 different tsunami events in the last 2000 years. The recent occurrences of catastrophic tsunamis in the world's oceans have also raised awareness about tsunamis that might take place around the Marmara coasts. 
Similarly, comprehensive studies on tsunamis, such as the preparation of tsunami databases, tsunami hazard analyses and assessments, risk evaluations for potential tsunami-prone regions, and the establishment of warning systems, have accelerated. However, a complete tsunami inundation analysis in high resolution will provide a better understanding of the effects of tsunamis on a specific critical structure located in the Marmara Sea. Ports are one of those critical structures that are susceptible to marine disasters. Resilience of ports and harbors against tsunamis is essential for proper, efficient, and successful rescue operations to reduce loss of life and property. Considering this, high-resolution simulations have been carried out in the Marmara Sea by focusing on Haydarpaşa Port of the megacity Istanbul. In the first stage of simulations, the most critical tsunami sources potentially affecting Haydarpaşa Port were used as input, and the computed tsunami parameters at the port were compared to determine the most critical tsunami scenario. In the second stage of simulations, nested domains from 90 m grid size to 10 m grid size (in the port region) were used, and the most critical tsunami scenario was modeled. In the third stage of simulations, the topography of the port and its surroundings was used in two nested domains at 3-m and 1-m resolutions, and the water elevations computed from the previous simulations were input at the border of the larger domain. A tsunami numerical code, NAMI DANCE, was used in the simulations. The tsunami parameters in the highest resolution were computed in and around the port. The effect of the data resolution on the computed results has been presented. The performance of the port structures and possible effects of the tsunami on port operations have been discussed. Since the harbor protection structures were not designed to withstand tsunamis, the breakwaters' stability becomes one of the major concerns for limiting agitation and inundation in Haydarpaşa Port under tsunami attack and for its resilience. Flow depth, momentum flux, and current patterns are the other concerns, as they cause unexpected circulations and uncontrolled movements of objects on land and vessels in the sea.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA478442','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA478442"><span>The Indonesian Imperative</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2008-02-29</p> <p>2002 Bali bombing that killed 202 and injured around 300 people.
The Indonesia government, TNI and police have cracked down on JI, but this terrorist...Hawaii's Department of Business, Economic Development and Tourism; the Department of Health; the Pacific Tsunami Warning Center; University of Hawaii's...associations, industry, Department of Defense (DOD), Congressional National Guard Caucuses, adjutants general, governors, veterans associations, and others to</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA573151','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA573151"><span>System Design of an Unmanned Aerial Vehicle (UAV) for Marine Environmental Sensing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2013-02-01</p> <p>Malaysia to the north. Sea trials have been located through the green band. ...light of recent disasters, pressure monitoring nodes mounted to the seafloor now provide advanced tsunami warning in countries including Malaysia ...organisms in huge number. Human health can also be impacted through the consumption of shellfish or other seafood contaminated with bloom-related</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012NHESS..12..555H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012NHESS..12..555H"><span>User interface prototype for geospatial early warning systems - a tsunami showcase</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hammitzsch, M.; Lendholt, M.; Esbrí, M. Á.</p> <p>2012-03-01</p> <p>The command and control unit's graphical user interface (GUI) is a central part of early warning systems (EWS) for man-made and natural hazards. The GUI combines and concentrates the relevant information of the system and offers it to human operators. It has to support operators successfully performing their tasks in complex workflows. Most notably in critical situations, when operators make important decisions in a limited amount of time, the command and control unit's GUI has to work reliably and stably, providing the relevant information and functionality with the required quality and in time. The design of the GUI application is essential in the development of any EWS to manage hazards effectively. The design and development of such GUIs is performed repeatedly for each EWS by various software architects and developers. Implementations differ based on their application in different domains. But similar designs and common implementation approaches for EWS GUIs are not well harmonized across related activities and do not exploit possible synergy effects.
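Purely as an illustration of the kind of reusable building block the GUI record above argues for, here is a small, hypothetical Python sketch of an alert data structure plus a dispatcher that fans alerts out to GUI views or channels. The class and field names are invented and are not part of any of the systems described here.

```python
# Illustrative only: a generic alert message and a simple publish/subscribe
# dispatcher of the kind a command-and-control GUI might sit on top of.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class TsunamiAlert:
    level: str                 # e.g. "information", "advisory", "watch", "warning"
    region: str
    message: str = ""
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class AlertDispatcher:
    """Fan an incoming alert out to all registered handlers (GUI views, loggers...)."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[TsunamiAlert], None]] = []

    def subscribe(self, handler: Callable[[TsunamiAlert], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, alert: TsunamiAlert) -> None:
        for handler in self._subscribers:
            handler(alert)

if __name__ == "__main__":
    dispatcher = AlertDispatcher()
    dispatcher.subscribe(lambda a: print(f"[{a.level.upper()}] {a.region}: {a.message}"))
    dispatcher.publish(TsunamiAlert(level="watch", region="Example coast",
                                    message="Hypothetical test message"))
```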
Thus, the GUI's implementation of an EWS for tsunamis is successively introduced, providing a generic approach to be applied in each EWS for man-made and natural hazards.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010CRGeo.342..434S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010CRGeo.342..434S"><span>A catalog of tsunamis in New Caledonia from 28 March 1875 to 30 September 2009</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sahal, Alexandre; Pelletier, Bernard; Chatelier, Jean; Lavigne, Franck; Schindelé, François</p> <p>2010-06-01</p> <p>In order to establish a tsunami alert system in New Caledonia in April 2008, the French Secretary of State for Overseas Affairs, with the aid of the UNESCO French Commission, mandated an investigation to build a more complete record of the most recent tsunamis. To complete this task, a call for witnesses was broadcast through various media and in public locations. These witnesses were then interviewed onsite about the phenomenon they had observed. Previous witness reports that had been obtained in the last few years were also used. For the most recent events, various archives were consulted. In total, 18 events were documented, of which 12 had not been previously mentioned in past work. These results confirm an exposure to a hazard of: (1) local origin (the southern part of the Vanuatu arc) with a very short post-seismic delay (< 30 min) before the arrival of wave trains; (2) regional origin (Solomon Islands arc, northern part of the Vanuatu arc) with a delay of several hours; and (3) an exposure to trans-oceanic tsunamis (Kamchatka 1952, South Chile 1960, Kuril Islands 2006, North Tonga 2009), unknown until today. These results highlight the necessity for New Caledonia to adopt an alert system, coupled with ocean tide gauges, that liaises with the main alert system for the Pacific (Pacific Tsunami Warning Center), and brings to light the importance of establishing a prevention campaign.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMNH21D..01B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMNH21D..01B"><span>The Making of a Tsunami Hazard Map: Lessons Learned from the TSUMAPS-NEAM Project</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Basili, R.</p> <p>2017-12-01</p> <p>Following the worldwide surge of awareness toward tsunami hazard and risk in the last decade, Europe has promoted a better understanding of the tsunami phenomenon through research projects (e.g. TRANSFER, ASTARTE) and started programs for preventing the tsunami impact along the coastlines of the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region (e.g. the Tsunami Early Warning and Mitigation System, NEAMTWS, coordinated by IOC/UNESCO). An indispensable tool toward long-term coastal planning and an effective design and subsequent use of TWS is the availability of a comprehensive Probabilistic Tsunami Hazard Assessment (PTHA). The TSUMAPS-NEAM project took the pledge of producing the first region-wide long-term homogenous PTHA map from earthquake sources. 
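As background to the probabilistic tsunami hazard assessment described in the record above, here is a generic Python sketch (not the project's actual workflow) of the Poissonian aggregation step that turns scenario annual rates and modeled wave heights at a site into probabilities of exceedance over an exposure time. All numbers are invented for illustration.

```python
# Generic PTHA-style aggregation sketch with invented scenario rates/heights.
import numpy as np

annual_rate = np.array([1e-3, 4e-4, 2e-4, 5e-5])   # per-scenario annual rates (1/yr)
height_at_site = np.array([0.4, 1.2, 2.5, 6.0])    # modeled wave height at one site (m)
exposure_years = 50.0
thresholds = np.array([0.5, 1.0, 2.0, 5.0])        # hazard-curve levels (m)

for h in thresholds:
    # total annual rate of scenarios exceeding this height, assumed Poissonian
    rate_exceed = annual_rate[height_at_site > h].sum()
    prob = 1.0 - np.exp(-rate_exceed * exposure_years)
    print(f"P(height > {h:3.1f} m in {exposure_years:.0f} yr) = {prob:.3f}")
```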
The hazard assessment was built upon state-of-the-art procedures and standards, enriched by some rather innovative/experimental approaches such as: (1) the statistical treatment of potential seismic sources, combining all the available information (seismicity, moment tensors, tectonics), and considering earthquakes occurring on major crustal faults and subduction interfaces; (2) an intensive computational approach to tsunami generation and linear propagation across the sea up to an offshore fixed depth; (3) the use of approximations for shoaling and inundation, based on local bathymetry, and for tidal stages; and (4) the exploration of several alternatives for the basic input data and their parameters, which produces a number of models that are treated through an ensemble uncertainty quantification. This presentation will summarize the TSUMAPS-NEAM project goals, implementation, and achieved results, as well as the humps and bumps we ran into during its development. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNH51A1923R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNH51A1923R"><span>Contribution of ionospheric monitoring to tsunami warning: results from a benchmark exercise</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rolland, L.; Makela, J. J.; Drob, D. P.; Occhipinti, G.; Lognonne, P. H.; Kherani, E. A.; Sladen, A.; Rakoto, V.; Grawe, M.; Meng, X.; Komjathy, A.; Liu, T. J. Y.; Astafyeva, E.; Coisson, P.; Budzien, S. A.</p> <p>2016-12-01</p> <p>Deep ocean pressure sensors have proven very effective at quantifying tsunami waves in real time. Yet the cost of these sensors and their maintenance strongly limits the extensive deployment of dense networks. Thus a complete observation of the tsunami wave-field has not been possible so far. In the last decade, imprints of moderate to large transpacific tsunami wave-fields have been registered in the ionosphere through the atmospheric internal gravity wave coupled with the tsunami during its propagation. Those ionospheric observations could provide an additional description of the phenomenon with a high spatial coverage. Ionospheric observations have been supported by numerical modeling of the ocean-atmosphere-ionosphere coupling, developed by different groups. We present here the first results of a cross-validation exercise aimed at testing various forward simulation techniques. In particular, we compare different approaches for modeling tsunami-induced gravity waves including a pseudo-spectral method, finite difference schemes, a fully coupled normal modes modeling approach, a Fourier-Laplace compressible ray-tracing solution, and a self-consistent, three-dimensional physics-based wave perturbation (WP) model based on the augmented Global Thermosphere-Ionosphere Model (WP-GITM). These models and other existing models use either a realistic sea-surface motion input model or a simple analytic model. We discuss the advantages and drawbacks of the different methods and set up common inputs to the models so that meaningful comparisons of model outputs can be made to highlight physical conclusions and understanding.
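A minimal sketch of the kind of like-for-like comparison such a benchmark exercise implies: two model outputs computed from common inputs are reduced to a single normalized misfit number. The "model" series below are synthetic stand-ins, not output of the codes named in the abstract.

```python
# Reduce two forward-model outputs (computed from common inputs) to one
# normalized misfit value. Both series below are synthetic placeholders.
import numpy as np

def normalized_rms(a, b):
    """RMS difference between two series, normalized by the RMS of the first."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))

t = np.linspace(0, 1800, 301)                          # 30 minutes, 6 s sampling
model_a = 0.20 * np.sin(2 * np.pi * t / 600.0)         # stand-in TEC perturbation, model A
model_b = 0.18 * np.sin(2 * np.pi * (t - 30) / 600.0)  # stand-in TEC perturbation, model B

print(f"Normalized RMS difference: {normalized_rms(model_a, model_b):.2f}")
```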
Nominally, we highlight how the different models reproduce or disagree for two study cases: the ionospheric observations related to the 2012 Mw7.7 Haida Gwaii, Canada, and 2015 Mw8.3 Illapel, Chile, events. Ultimately, we explore the possibility of computing a transfer function in order to convert ionospheric perturbations directly into tsunami height estimates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFMNH13A3716B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFMNH13A3716B"><span>Development of Parallel Code for the Alaska Tsunami Forecast Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bahng, B.; Knight, W. R.; Whitmore, P.</p> <p>2014-12-01</p> <p>The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communications between domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results with the long term aim of tsunami forecasts from source to high resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs; and, will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011NHESS..11..227A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011NHESS..11..227A"><span>Tsunami hazard at the Western Mediterranean Spanish coast from seismic sources</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; González, M.; Otero, L.</p> <p>2011-01-01</p> <p>Spain represents an important part of the tourism sector in the Western Mediterranean, which has been affected in the past by tsunamis. Although the tsunami risk at the Spanish coasts is not the highest of the Mediterranean, the necessity of tsunami risk mitigation measures should not be neglected. 
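As an illustration of the pre-computed-database idea in the Alaska Tsunami Forecast Model record above, this sketch stores unit-source waveforms in advance and calibrates their weights against an observed gauge record by least squares when an event occurs. Everything here is synthetic and hypothetical, not the NTWC implementation.

```python
# Pre-computed database sketch: fit an observation as a weighted sum of
# unit-source waveforms stored in advance. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 7200, 721)                 # 2 hours at 10 s sampling

# Two hypothetical pre-computed unit-source waveforms at one gauge (metres).
unit_sources = np.vstack([
    0.3 * np.exp(-((t - 2400) / 400.0) ** 2),
    0.2 * np.exp(-((t - 3600) / 600.0) ** 2),
])

true_weights = np.array([2.0, 0.5])           # the "real" event as a combination of units
observed = true_weights @ unit_sources + 0.01 * rng.standard_normal(t.size)

# Least-squares calibration of the database against the observed record.
weights, *_ = np.linalg.lstsq(unit_sources.T, observed, rcond=None)
print("Recovered weights:", np.round(weights, 2))
```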
In the Mediterranean area, Spain is exposed to two different tectonic environments with contrasting characteristics. On the one hand, there is the Alboran Basin, characterised by transcurrent and transpressive tectonics; on the other, the North Algerian fold and thrust belt, characterised by compressive tectonics. A set of 22 seismic tsunamigenic sources has been used to estimate the tsunami threat over the Spanish Mediterranean coast of the Iberian peninsula and the Balearic Islands. Maximum wave elevation maps and tsunami travel times have been computed by means of numerical modelling, and we have obtained estimations of threat levels for each source over the Spanish coast. The sources on the Western edge of North Algeria are the most dangerous, due to their threat to the South-Eastern coast of the Iberian Peninsula and to the Western Balearic Islands. In general, the Northern Algerian sources pose a greater risk to the Spanish coast than the Alboran Sea sources, which only threaten the peninsular coast. In the Iberian Peninsula, the Spanish provinces of Almeria and Murcia are the most exposed, while all the Balearic Islands can be affected by the North Algerian sources with probable severe damage, especially the islands of Ibiza and Minorca. The results obtained in this work are useful for planning future regional and local warning systems, as well as for setting the priority areas for detailed tsunami risk research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.8860T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.8860T"><span>Evaluation of tsunami risk in Heraklion city, Crete, Greece, by using GIS methods</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Triantafyllou, Ioanna; Fokaefs, Anna; Novikova, Tatyana; Papadopoulos, Gerasimos A.; Vaitis, Michalis</p> <p>2016-04-01</p> <p>The Hellenic Arc is the most active seismotectonic structure in the Mediterranean region. The island of Crete occupies the central segment of the arc, which is characterized by high seismic and tsunami activity. Several tsunamis generated by large earthquakes, volcanic eruptions and landslides were reported to have hit the capital city of Heraklion in the historical past. We focus our tsunami risk study on the northern coastal area of Crete (ca. 6 km in length and 1 km in maximum width), which includes the western part of the city of Heraklion and a large part of the neighboring municipality of Gazi. The evaluation of tsunami risk included calculations and mapping with QGIS of (1) the cost of repairing buildings after tsunami damage, (2) the population exposed to tsunami attack, and (3) optimum routes and times for evacuation. To calculate the cost of building repair after a tsunami attack, we determined the tsunami inundation zone in the study area from numerical simulations of extreme tsunami scenarios. The geographical distribution of buildings per building block, obtained from the 2011 census data of the Hellenic Statistical Authority (EL.STAT) and satellite data, was mapped. By applying the SCHEMA Damage Tool we assessed the building vulnerability to tsunamis according to the types of buildings and their expected damage from the hydrodynamic impact.
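A minimal sketch of the repair-cost bookkeeping described in the Heraklion record above and continued below: sum a cost rate per building type and damage level over a building inventory inside the inundation zone. The rates and the inventory are hypothetical, not the official Greek figures used in the study.

```python
# Hypothetical cost aggregation: cost rate per (building type, damage level)
# multiplied by floor area, summed over buildings in the inundation zone.
COST_RATE_EUR_PER_M2 = {
    ("masonry", "moderate"): 150.0,
    ("masonry", "heavy"): 400.0,
    ("reinforced_concrete", "moderate"): 100.0,
    ("reinforced_concrete", "heavy"): 300.0,
}

buildings = [   # invented inventory within the inundation zone
    {"type": "masonry", "damage": "heavy", "floor_area_m2": 120.0},
    {"type": "reinforced_concrete", "damage": "moderate", "floor_area_m2": 300.0},
    {"type": "masonry", "damage": "moderate", "floor_area_m2": 80.0},
]

total = sum(COST_RATE_EUR_PER_M2[(b["type"], b["damage"])] * b["floor_area_m2"]
            for b in buildings)
print(f"Estimated repair cost: {total:,.0f} EUR")
```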
A set of official cost rates varying with the building types and the damage levels, following standards set by the state after the strong damaging earthquakes in Greece in 2014, was applied to calculate the cost of rebuilding or repairing buildings damaged by the tsunami. In the investigation of the population exposed to tsunami inundation we used an interpolation method to smooth the geographical distribution of population per building block within the inundation zone. Then, the population distribution was correlated with the tsunami hydrodynamic parameters in the inundation zone. The last approach of tsunami risk assessment refers to the selection of optimal routes and times needed for evacuation from certain points within the inundation zone to a number of shelters outside the zone. The three different approaches were evaluated for their overall contribution to the development of a tsunami risk mitigation plan. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..1215694Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..1215694Z"><span>February 27, 2010 Chilean Tsunami in Pacific and its Arrival to North East Asia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zaytsev, Andrey; Pelinovsky, EfiM.; Yalciner, Ahmet C.; Ozer, Ceren; Chernov, Anton; Kostenko, Irina; Shevchenko, Georgy</p> <p>2010-05-01</p> <p>The fault plane ruptured by the strong earthquake of February 27, 2010 in Chile, with a magnitude of 8.8 at 35 km depth and coordinates 35.909°S, 72.733°W, generated a moderate-size tsunami. The initial amplitude of the tsunami source was not very high because a major part of the fault plane was on land. The tsunami waves propagated over long distances to the south and north, towards the East Asian and West American coasts. The waves were also recorded by several gauges in the Pacific during their propagation and arrival at coastal areas. The recorded and observed tsunami amplitudes are important for assessing the potential effects of waves with threatening amplitudes. The event also showed that a moderate-size tsunami can be effective even if it propagates over long distances in any ocean or marginal sea. The Far East coasts of Russia in North East Asia (Sakhalin, the Kuriles, Kamchatka) are among the important source (e.g., the November 15, 2006 Kuril Islands tsunami) and target (e.g., the February 27, 2010 Chilean tsunami) areas of Pacific tsunamis. Much effort has been spent on establishing the monitoring system, assessing tsunamis, and developing mitigation strategies against tsunamis and other hazards in the region. The development of computer technologies has provided advances in data collection, transfer, and processing. Furthermore, it has also contributed new computational tools and made computer modeling an efficient component of tsunami warning systems. In this study the nested version of the tsunami numerical model NAMI DANCE is used. NAMI DANCE solves the nonlinear form of the long wave (shallow water) equations (with or without dispersion) using a finite difference model in nested grid domains from the source to the target areas in a multiprocessor hardware environment.
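For orientation only, here is a drastically simplified 1-D linear shallow-water solver on a staggered finite-difference grid, a toy cousin of the nested-grid long-wave models (NAMI DANCE, NEOWAVE, ATFM) mentioned in these records. It is not their code; it assumes constant depth, linear dynamics, and closed boundaries.

```python
# Toy 1-D linear shallow-water model: eta at cell centers, u at cell faces,
# forward-backward time stepping, reflective (closed) ends.
import numpy as np

g, depth = 9.81, 4000.0          # gravity (m/s^2), uniform ocean depth (m)
L, nx = 400e3, 400               # 400 km domain split into 1 km cells
dx = L / nx
c = np.sqrt(g * depth)           # long-wave speed
dt = 0.5 * dx / c                # CFL-limited time step

x = (np.arange(nx) + 0.5) * dx
eta = 1.0 * np.exp(-((x - L / 2) / 20e3) ** 2)   # initial 1 m hump of water
u = np.zeros(nx + 1)                             # velocities on cell faces (ends stay 0)

for _ in range(600):
    # momentum: each interior face is accelerated by the surface slope across it
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity: flux divergence raises or lowers the free surface
    eta -= depth * dt / dx * (u[1:] - u[:-1])

print(f"max surface elevation after {600 * dt:.0f} s: {eta.max():.2f} m")
```

The real models add nonlinearity, dispersion, variable bathymetry, and two-way nested grids on spherical coordinates; this sketch only shows the staggered-grid update at the core of such schemes.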
It is applied to the 2010 Chilean tsunami and its propagation and coastal behavior at far distances near the Sakhalin, Kuril and Kamchatka coasts. The main tide gauge records used in this study are from Petropavlovsk (Kamchatka), Severo-Kurilsk (Paramushir), Kurilsk (Iturup, coast of the Okhotsk Sea), Malokurilskoe (Shikotan), Korsakov, Kholmsk and Aniva Bay (Sakhalin). These records and also other offshore DART records are analyzed and used for comparison of the modeling results with offshore and nearshore records. The transmission of tsunami waves through the Sakhalin and Kuril straits and their propagation to nearby coasts are investigated. Spectral analysis of the records in settlements of Sakhalin and the Kurile Islands is also carried out. The performance and capabilities of NAMI DANCE are presented, together with comparisons between the model and observations, and discussion.</p> </li> </ol> <ol class="result-class" start="461"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMNH13B..05Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMNH13B..05Y"><span>Geological Evidences for a Large Tsunami Generated by the 7.3 ka Kikai Caldera Eruption, Southern Japan</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yamada, M.; Fujino, S.; Satake, K.</p> <p>2017-12-01</p> <p>The 7.3 ka eruption of Kikai volcano, southern Kyushu, Japan, is one of the largest caldera-forming eruptions in the world. Given that a huge caldera was formed in a shallow sea area during the eruption, a tsunami must have been generated by the associated sea-level change. Pyroclastic flow and tsunami deposits from the eruption have been studied around the caldera, but they are not sufficient to evaluate the tsunami size. The goal of this study is to unravel the sizes of the tsunami and the triggering caldera collapse by numerical simulations based on a widely distributed tsunami deposit associated with the eruption. In this presentation, we will provide initial data on the distribution of the 7.3 ka tsunami deposit contained in sediment cores taken at three coastal lowlands in Wakayama, Tokushima, and Oita prefectures (560 km, 520 km, and 310 km north-east from the caldera, respectively).
A volcanic ash from the eruption (Kikai Akahoya tephra: K-Ah) is evident in the organic-rich muddy sedimentary sequence of all sediment cores. An up to 6-cm-thick sand layer, characterized by a grading structure and a sharp bed boundary with the underlying mud, is observed immediately beneath the K-Ah tephra at all study sites. These sedimentary characteristics and the broad distribution indicate that the sand layer was most likely deposited by a tsunami, which can propagate to a wide area, and not by a local storm surge. Furthermore, the stratigraphic relationship implies that the study sites must have been inundated by the tsunami prior to the ash fall. A sand layer is also evident within the K-Ah tephra layer, suggesting that it was probably formed by a subsequent tsunami wave during the ash fall. These geological evidences for the 7.3 ka tsunami inundation will contribute to a better understanding not only of the caldera collapse and the resultant tsunami, but also of the tsunami-generating system in the eruptive process.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3426142','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3426142"><span>Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.</p> <p>2012-01-01</p> <p>Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders’ ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million.
Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public. Citation: Hunter JC, Crawley AW, Petrie M, Yang JE, Aragón TJ. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake. PLoS Currents Disasters. 2012 Jul 16 PMID:22953236</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2007AGUFM.S13A1051M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2007AGUFM.S13A1051M"><span>Near-Field Population Response During the 2 April 2007 Solomon Islands Tsunami</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>McAdoo, B. G.; Moore, A. L.; Baumwoll, J.</p> <p>2007-12-01</p> <p>When the magnitude 8.1 earthquake and subsequent tsunami hit the Solomon Islands on 2 April 2007 it killed 52 people. On Ghizo Island, home of the capital of the Western Province, Gizo, waves approaching 4 m in height inundated the south coast villages. Eyewitness accounts supported by geologic data from the offshore coral reef and sediment deposited on land suggest a wave that came in as the shaking stopped as a rapidly-rising tide rather than a turbulent bore- vehicles and houses were floated inland with very little damage. 
Those that survived in villages affected by the tsunami had indigenous knowledge of prior events, whereas immigrant populations died in higher proportions. While buoy-based early warning systems are necessary to mitigate the effects of teletsunamis, they would have done little good in this near-field environment. In Pailongge, a village of 76 indigenous Solomon Islanders on Ghizo's south coast, there were no deaths. Village elders directed the people inland following the shaking and the almost immediate withdrawal of water from the lagoon, and heads of household made sure that children were accounted for and evacuated. Of the 366 Gilbertese living in Titiana, however, 13 people died, 8 of whom were children who were exploring the emptied lagoon. A large proportion of the dead were children (24), as they were likely too weak to swim against the non-bore flow. The Gilbertese migrated from Kiribati in the 1950s, had not experienced a major earthquake and tsunami, and hence had no cultural memory. In the case of the Solomon Islands tsunami, as was the case in the 2004 Indian Ocean tsunami, indigenous knowledge served the people in the near field well. In the case of the Indian Ocean, where there was a 10-20 minute separation between the time the shaking began and the arrival of the waves, the combination of an in-place plan and a suitable physical geography allowed the population of Simeulue Island and the Moken people of Thailand to escape before the waves hit. In the Solomons, there was less than 3 minutes of separation time, and the populations with indigenous knowledge were able to save themselves. Mitigation strategies for those that live adjacent to tsunamigenic subduction zones must include a community-based disaster management plan to educate a variety of populations with different cultural knowledge. This education can be in concert with the development of a basin-wide early warning system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNH52A..03O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNH52A..03O"><span>From Sumatra 2004 to Today, through Tohoku-Oki 2011: what we learn about Tsunami detection by ionospheric sounding.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Occhipinti, G.; Rolland, L.; Watada, S.; Makela, J. J.; Bablet, A.; Coisson, P.; Lognonne, P. H.; Hebert, H.</p> <p>2016-12-01</p> <p>The tsunamigenic Tohoku earthquake (2011) strongly affirms, after the 26 December 2004 event, the necessity of opening new paradigms in oceanic monitoring. Detection of ionospheric anomalies following the Sumatra earthquake tsunami (Occhipinti et al. 2006) demonstrated that the ionosphere is sensitive to earthquake and tsunami propagation: ground and oceanic vertical displacement induces acoustic-gravity waves propagating within the neutral atmosphere and detectable in the ionosphere. Observations supported by modelling proved that tsunamigenic ionospheric anomalies are deterministic and reproducible by numerical modeling (Occhipinti et al., 2008). To prove that the tsunami signature in the ionosphere is routinely detected, we show perturbations of total electron content (TEC) measured by GPS following tsunamigenic earthquakes from 2004 to 2011 (Rolland et al.
2010; Occhipinti et al., 2013), namely Sumatra (26 December 2004 and 12 September 2007), Chile (14 November 2007), Samoa (29 September 2009) and Tohoku-Oki (11 March 2011). Additionally, new far-field measurements were performed by airglow imaging in Hawaii: those measurements show the propagation of the internal gravity waves (IGWs) induced by the Tohoku tsunami in the Pacific Ocean (Occhipinti et al., 2011), as well as by two more recent tsunamis: Queen Charlotte (27 October, 2013, Mw 7.7) and Chile (16 September, 2015, Mw 8.2). The detection of these two events strongly confirms the potential and the promise of tsunami monitoring by airglow cameras, whether ground-based or potentially onboard satellites. Based on the observations close to the epicenter, mainly performed by GPS networks located in Sumatra, Chile and Japan, we highlight the TEC perturbation observed within the first hour after the seismic rupture (Occhipinti et al., 2013). This perturbation contains information about the ground displacement, as well as the consequent sea surface displacement resulting in the tsunami. In this talk we present these new tsunami observations in the ionosphere and discuss, in the light of modelling, the potential role of ionospheric sounding in ocean monitoring and future tsunami warning systems (Occhipinti, 2015). All ref. here @ www.ipgp.fr/ ninto

Characterization of the Spatio-temporal Evolution of the Energy of Recent Tsunamis in Chile and its Connection with the Seismic Source and Geomorphological Conditions

NASA Astrophysics Data System (ADS)

Quiroz, M.; Cienfuegos, R.

2017-12-01

At present, the scientific community has good knowledge of how tsunami energy evolves at ocean and shelf scales; the investigations of Rabinovich (2013) and Yamazaki (2011), for instance, represent important advances in this subject. In the present paper we focus instead on tsunami energy evolution, and ultimately its decay, in coastal areas, because the characteristic time scales of this process have implications for early warning, evacuation initiation, and warning cancellation. We address the tsunami energy evolution at three spatial scales: a global scale at the ocean basin level (in particular the Pacific Ocean basin), a regional scale comprising processes that occur at the continental shelf level, and a local scale comprising coastal areas or bays. These scales were selected to understand how the coastal response is associated with a tsunami and how the energy evolves until it is completely dissipated. Through signal processing methods, such as discrete and wavelet analyses, we analyze time series of recent tsunamigenic events recorded in the main Chilean coastal cities. Based on this analysis, we propose a conceptual model of the influence of geomorphological variables on the evolution and decay of tsunami energy. This model acts as a filter from the seismic source to the observed response in coastal zones.
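As a minimal illustration of the kind of coastal energy-decay estimate discussed here (not the authors' actual processing chain), the following Python sketch builds a synthetic de-tided tide-gauge record, extracts its amplitude envelope with a Hilbert transform, and fits an exponential decay to obtain an e-folding time; the sampling interval, wave period, and decay scale are arbitrary assumptions for the example.

    import numpy as np
    from scipy.signal import hilbert

    # Synthetic de-tided tide-gauge record: a 30-min-period oscillation ringing
    # down after the main tsunami arrival (all parameters are illustrative).
    dt = 60.0                                   # sampling interval, s
    t = np.arange(0.0, 24 * 3600.0, dt)         # 24 h of record
    eta = np.exp(-t / 6.0e3) * np.sin(2.0 * np.pi * t / 1800.0)

    envelope = np.abs(hilbert(eta))             # amplitude envelope
    energy = envelope ** 2                      # energy proxy (amplitude squared)

    # A least-squares fit of log(energy) against time gives the decay rate.
    mask = energy > 1e-6 * energy.max()
    slope, _ = np.polyfit(t[mask], np.log(energy[mask]), 1)
    print(f"estimated energy e-folding time: {-1.0 / slope / 3600.0:.2f} h")

In a real application the record would first be de-tided and band-passed around tsunami periods before the envelope and decay fit are computed.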
Finally, we aim to provide practical tools that establish patterns of behavior and scalings of energy evolution, linking seismic source variables and geomorphological factors, in order to understand the response and predict the behavior at a given site.

Fault Slip Distribution and Optimum Sea Surface Displacement of the 2017 Tehuantepec Earthquake in Mexico (Mw 8.2) Estimated from Tsunami Waveforms

NASA Astrophysics Data System (ADS)

Gusman, A. R.; Satake, K.; Mulia, I. E.

2017-12-01

An intraplate normal-fault earthquake (Mw 8.2) occurred on 8 September 2017 in the Tehuantepec seismic gap of the Middle America Trench. The submarine earthquake generated a tsunami which was recorded by coastal tide gauges and offshore DART buoys. We used the tsunami waveforms recorded at 16 stations to estimate the fault slip distribution and an optimum sea surface displacement for the earthquake. A steep fault dipping to the northeast, with strike of 315°, dip of 73°, and rake of -96° based on the USGS W-phase moment tensor solution, was assumed for the slip inversion. To independently estimate the sea surface displacement without assuming earthquake fault parameters, we used B-spline functions as unit sources. The distribution of the unit sources was optimized by a Genetic Algorithm - Pattern Search (GA-PS) method. Tsunami waveform inversion resolves a spatially compact region of large slip (4-10 m) with a dimension of 100 km along strike and 80 km along dip, in the depth range between 40 km and 110 km. The seismic moment calculated from the fault slip distribution, with an assumed rigidity of 6 × 10¹⁰ N/m², is 2.46 × 10²¹ Nm (Mw 8.2). The optimum displacement model suggests that the sea surface was uplifted by up to 0.5 m and subsided by up to 0.8 m. The deep location of large fault slip may be the cause of such small sea surface displacements. The simulated tsunami waveforms from the optimum sea surface displacement reproduce the observations better than those from the fault slip distribution; the normalized root mean square misfit for the sea surface displacement is 0.89, while that for the fault slip distribution is 1.04. We simulated the tsunami propagation using the optimum sea surface displacement model. Large tsunami amplitudes of up to 2.5 m were predicted to occur inside and around a lagoon located between Salina Cruz and Puerto Chiapas. Figure 1. a) Sea surface displacement for the 2017 Tehuantepec earthquake estimated from tsunami waveforms.
b) Map of simulated maximum tsunami amplitude and comparison between observed (blue circles) and simulated (red circles) maximum tsunami amplitudes along the coast.

Distribution and sedimentary characteristics of tsunami deposits along the Cascadia margin of western North America

USGS Publications Warehouse

Peters, R.; Jaffe, B.; Gelfenbaum, G.

2007-01-01

Tsunami deposits have been found at more than 60 sites along the Cascadia margin of western North America, and here we review and synthesize their distribution and sedimentary characteristics based on the published record. Cascadia tsunami deposits are best preserved, and most easily identified, in low-energy coastal environments such as tidal marshes, back-barrier marshes and coastal lakes, where they occur as anomalous layers of sand within peat and mud. They extend up to a kilometer inland in open coastal settings and several kilometers up river valleys. They are distinguished from other sediments by a combination of sedimentary character and stratigraphic context. Recurrence intervals range from 300-1000 years, with an average of 500-600 years. The tsunami deposits have been used to help evaluate and mitigate tsunami hazards in Cascadia. They show that the Cascadia subduction zone is prone to great earthquakes that generate large tsunamis. The inclusion of tsunami deposits on inundation maps, used in conjunction with results from inundation models, allows a more accurate assessment of areas subject to tsunami inundation. The application of sediment transport models can help estimate tsunami flow velocity and wave height, parameters which are necessary to help establish evacuation routes and plan development in tsunami-prone areas. © 2007.

Chapter 3 – Phenomenology of Tsunamis: Statistical Properties from Generation to Runup

USGS Publications Warehouse

Geist, Eric L.

2015-01-01

Observations related to tsunami generation, propagation, and runup are reviewed and described in a phenomenological framework. In the three coastal regimes considered (near-field broadside, near-field oblique, and far field), the observed maximum wave amplitude is associated with different parts of the tsunami wavefield. The maximum amplitude in the near-field broadside regime is most often associated with the direct arrival from the source, whereas in the near-field oblique regime, the maximum amplitude is most often associated with the propagation of edge waves. In the far field, the maximum amplitude is most often caused by the interaction of the tsunami coda that develops during basin-wide propagation and the nearshore response, including the excitation of edge waves, shelf modes, and resonance.
Statistical distributions that describe tsunami observations are also reviewed, both in terms of spatial distributions, such as coseismic slip on the fault plane and near-field runup, and temporal distributions, such as wave amplitudes in the far field. In each case, fundamental theories of tsunami physics are used heuristically to explain the observations.

Are inundation limit and maximum extent of sand useful for differentiating tsunamis and storms? An example from sediment transport simulations on the Sendai Plain, Japan

NASA Astrophysics Data System (ADS)

Watanabe, Masashi; Goto, Kazuhisa; Bricker, Jeremy D.; Imamura, Fumihiko

2018-02-01

We examined the quantitative difference in the distribution of tsunami and storm deposits based on numerical simulations of inundation and sediment transport due to tsunami and storm events on the Sendai Plain, Japan. The calculated distance from the shoreline inundated by the 2011 Tohoku-oki tsunami was smaller than that inundated by storm surges from hypothetical typhoon events. Previous studies have assumed that deposits observed farther inland than the possible inundation limit of storm waves and storm surge were tsunami deposits. However, confirming only the extent of inundation is insufficient to distinguish tsunami and storm deposits, because the inundation limit of storm surges may lie farther inland than that of tsunamis in the case of gently sloping coastal topography such as on the Sendai Plain. In other locations, where coastal topography is steep, the maximum inland inundation extent of storm surges may be only several hundred meters, so marine-sourced deposits that are distributed several kilometers inland can be identified as tsunami deposits by default. Over both gentle and steep slopes, another difference between tsunami and storm deposits is the total volume deposited, as flow speed over land during a tsunami is faster than during a storm surge. Therefore, the total deposit volume could also be a useful proxy to differentiate tsunami and storm deposits.

Tsunami Evacuation Exercises: the Case of Heraklion, Crete Isl., Greece

NASA Astrophysics Data System (ADS)

Triantafyllou, I.; Charalampakis, M.; Bocchini, G. M.; Novikova, T.; Papadopoulos, G. A.

2016-12-01

Effective tsunami evacuation requires appropriate awareness regarding the selection of good shelters, and field exercises may improve public awareness. A field exercise was organized in Heraklion, Crete Isl., in 2016. The area is part of the Hellenic Arc, which is the most active structure in the Mediterranean. Large earthquakes have triggered tsunamis that hit Heraklion in the past, such as in AD 1303. After examining various fault models, simulation of the 1303 tsunami showed an important inundation zone in Heraklion.
For the needs of the exercise, a team of 30 volunteers was divided into 3 groups of 10 people each. Everyone was equipped with a mobile phone and a GPS device. The 3 groups were gathered at 3 coastal spots in Heraklion situated 400 m apart from each other. The scenario was that, immediately after receiving a tsunami warning message on their mobile phones, they would switch on their personal GPS devices and start evacuating inland along whatever route they believed best. In each group, only 5 out of 10 volunteers were notified beforehand that Eleftherias Square, located inland at a distance satisfying evacuation needs in case of a repeat of the 1303 tsunami, would be a good shelter to go to. Using the Road Graph Plugin of QGIS, we calculated the shortest-path distances, which were found to equal 800, 700 and 680 m. Adopting an average velocity of 3 km/h, we found that these distances can be covered within 18, 16 and 15 min, respectively. The routes towards the shelter spots, as well as the times each of the 30 volunteers needed to arrive there, were recorded by their personal GPS devices. The processing of the GPS tracks and their comparison with the theoretical routes and times showed good evacuation performance, which is encouraging for the next phases of the Heraklion tsunami hazard mitigation program. This is a contribution to the EU-FP7 projects ZIP (Zooming In between Plates, grant no. 604713, 2013) and ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe, grant no. 603839, 2013).

Leading Wave Amplitude of a Tsunami

NASA Astrophysics Data System (ADS)

Kanoglu, U.

2015-12-01

Okal and Synolakis (EGU General Assembly 2015, Geophysical Research Abstracts, Vol. 17, 7622) recently discussed why the maximum amplitude of a tsunami might not occur for the first wave. They list observations from the 2011 Japan tsunami, which reached Papeete, Tahiti, with the fourth wave being the largest, arriving 72 min after the first wave, and from the 1960 Chilean tsunami, which reached Hilo, Hawaii, with the maximum wave arriving 1 hour later with a height of 5 m, the first wave being only 1.2 m. Large later waves are a problem not only for local authorities, in terms of both public warning and rescue efforts, but can also mislead the public into thinking that it is safe to return to the shoreline or to evacuated sites after the arrival of the first wave. Okal and Synolakis considered Hammack's (1972, Ph.D. Dissertation, Calif. Inst. Tech., 261 pp., Pasadena) linear dispersive analytical solution, with tsunami generation through the uplift of a circular plug on the ocean floor. They performed a parametric study over the radius of the plug and the depth of the ocean, since these are the independent length scales in the problem. They identified a transition distance, beyond which the second wave becomes larger, in terms of the parameters of the problem. Here, we extend their analysis to an initial wave field with a finite crest length and, in addition, to the most common tsunami initial waveform, the N-wave, as presented by Tadepalli and Synolakis (1994, Proc. R. Soc. A: Math. Phys. Eng. Sci., 445, 99-112). We compare our results with non-dispersive linear shallow water wave results as presented by Kanoglu et al.
(2013, Proc. R. Soc. A: Math. Phys. Eng. Sci., 469, 20130015), investigating the focusing feature. We discuss the results both in terms of leading-wave amplitude and tsunami focusing. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).

Display of historical and hypothetical tsunami on the coast of Sakhalin Island

NASA Astrophysics Data System (ADS)

Kostenko, Irina; Zaytsev, Andrey; Kurkin, Andrey; Yalciner, Ahmet

2014-05-01

Tsunami waves reach the coast of Sakhalin Island from sources located in the Japan Sea, the Okhotsk Sea, the Kuril Islands region and the Pacific Ocean. Studying tsunami generation characteristics and propagation makes it possible to examine how tsunamis manifest themselves on various parts of the island coast. For this purpose, a series of computational experiments on historical tsunamis with sources in the Japan Sea and the Kuril Islands region was carried out, and the simulation results were compared with observations. An analysis of all recorded historical tsunamis on the coast of Sakhalin Island was performed. To identify the possible manifestation of tsunamis on the coast of Sakhalin Island, a further series of computational experiments on hypothetical tsunamis with sources in the Japan Sea and the Okhotsk Sea was carried out. Hydrodynamic sources were used, with different source parameters (length, width, height, raising and lowering of sea level) corresponding to earthquakes of various magnitudes. The results were analyzed, and maps of the distribution of maximum amplitudes from each tsunami were produced. Areas of the Okhotsk Sea, the Japan Sea and the offshore strip of Sakhalin Island with maximum tsunami amplitudes were identified, and graphs of the distribution of maximum tsunami wave heights along the coast of Sakhalin Island were plotted. The tsunami numerical code NAMI DANCE, based on the shallow-water equations, was used for the simulations. This work was supported by the ASTARTE project.

Tsunami Modeling to Validate Slip Models of the 2007 Mw 8.0 Pisco Earthquake, Central Peru

NASA Astrophysics Data System (ADS)

Ioualalen, M.; Perfettini, H.; Condo, S. Yauri; Jimenez, C.; Tavera, H.

2013-03-01

Following the 15 August 2007 Mw 8.0 Pisco earthquake in central Peru, Sladen et al. (J Geophys Res 115: B02405, 2010) derived several slip models of this event. They inverted teleseismic data together with geodetic (InSAR) measurements to recover the co-seismic slip distribution on the fault plane, considering those data sets separately or jointly.
But how close to the real slip distribution are those inverted slip models? To answer this crucial question, the authors generated synthetic tsunami records based on their slip models and compared them with DART buoy records, coastal tsunami records, and available runup data. Such an approach requires a robust and accurate tsunami model (non-linear, dispersive, with accurate bathymetry and topography, etc.); otherwise, differences between the data and the model may be attributed to the slip models themselves even though they arise from an incomplete tsunami simulation. The accuracy of a numerical tsunami simulation strongly depends, among other factors, on two important constraints: (i) a fine computational grid (and thus the bathymetry and topography data sets used), which unfortunately is not always available, and (ii) a realistic tsunami propagation model including dispersion. Here, we extend Sladen's work using newly available data, namely a tide gauge record at Callao (Lima harbor) and the Chilean DART buoy record, while considering a complete set of runup data along with a more realistic tsunami numerical model that accounts for dispersion, and a fine-resolution computational grid, which is essential. Through these accurate numerical simulations we infer that the InSAR-based model is in better agreement with the tsunami data; the case of the Pisco earthquake thus indicates that geodetic data seem essential to recover the final co-seismic slip distribution on the rupture plane. Slip models based on teleseismic data are unable to describe the observed tsunami, suggesting that a significant amount of co-seismic slip may have been aseismic. Finally, we compute the runup distribution along the central part of the Peruvian coast to better understand the wave amplification/attenuation processes of the tsunami generated by the Pisco earthquake.

The "Tsunami Earthquake" of 13 April 1923 in Northern Kamchatka: Seismological and Hydrodynamic Investigations

NASA Astrophysics Data System (ADS)

Salaree, Amir; Okal, Emile A.

2018-04-01

We present a seismological and hydrodynamic investigation of the earthquake of 13 April 1923 at Ust'-Kamchatsk, Northern Kamchatka, which generated a more powerful and damaging tsunami than the larger event of 3 February 1923, thus qualifying as a so-called "tsunami earthquake". On the basis of modern relocations, we suggest that it took place outside the fault area of the mainshock, across the oblique Pacific-North America plate boundary, a model supported by a limited dataset of mantle waves, which also confirms the slow nature of the source, characteristic of tsunami earthquakes. However, numerical simulations for a number of legitimate seismic models fail to reproduce the sharply peaked distribution of tsunami wave amplitudes reported in the literature.
By contrast, we can reproduce the distribution of reported wave amplitudes using an underwater landslide as the source of the tsunami, itself triggered by the earthquake inside the Kamchatskiy Bight.

Tsunami Field Survey for the Solomon Islands Earthquake of April 1, 2007

NASA Astrophysics Data System (ADS)

Nishimura, Y.; Tanioka, Y.; Nakamura, Y.; Tsuji, Y.; Namegaya, Y.; Murata, M.; Woodward, S.

2007-12-01

Two weeks after the 2007 off-Solomon earthquake, an international tsunami survey team (ITST) of Japanese and US researchers performed a post-tsunami survey on Ghizo and adjacent islands. The main purpose of the team was to provide information on the earthquake and tsunami to the national disaster council of the Solomon Islands, which was responsible for disaster management at that time. The ITST interviewed affected people and conducted reconnaissance mapping of tsunami heights and flow directions. Tsunami flow heights at the beach and inland were evaluated from watermarks on buildings and from the position of broken branches and material stuck in trees. These tsunami heights along the southern to western coasts of Ghizo Island were ca. 5 m (a.s.l.). Tsunami run-up was traced from the distribution of floating debris carried up by the tsunami and deposited at its inundation limit. The maximum run-up, ca. 9 m, was measured at Tapurai on Simbo Island. Most of the inundation area was covered by a 0-10 cm thick tsunami deposit consisting of beach sand, coral pieces and eroded soil. Coseismic uplift and subsidence were clearly identified from changes in sea level before and after the earthquake, inferred from eyewitness accounts and evidence such as dried-up coral reefs. These deformation patterns, as well as the tsunami height distribution, can constrain the earthquake fault geometry and motion. It is worth mentioning that tsunami damage in villages on Ranongga Island was significantly reduced by 2-3 m of uplift occurring before the tsunami arrived.

Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes

PubMed Central

Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.

2016-01-01

The 2011 Tohoku earthquake produced an unexpectedly large amount of shallow slip, greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features.
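For readers unfamiliar with stochastic slip models of the kind mentioned above, the sketch below is a generic construction (not the authors' method): it draws a one-dimensional random slip profile by assigning random phases to a spectrum with a k-squared-style roll-off and rescales it to a target mean slip; the corner wavenumber and the 10 m target are arbitrary choices for the example.

    import numpy as np

    rng = np.random.default_rng(0)
    n, dx = 256, 1.0                       # subfaults along dip, spacing in km
    k = np.fft.rfftfreq(n, d=dx)           # wavenumber (1/km)

    # Power-law roll-off beyond an assumed corner wavenumber of 0.05 1/km.
    amplitude = 1.0 / (1.0 + (k / 0.05) ** 2)
    spectrum = amplitude * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, k.size))
    spectrum[0] = amplitude[0]             # keep the mean component real

    slip = np.fft.irfft(spectrum, n)
    slip -= slip.min()                     # enforce non-negative slip
    slip *= 10.0 / slip.mean()             # rescale to a 10 m mean slip (arbitrary)
    print(f"mean slip: {slip.mean():.1f} m, peak slip: {slip.max():.1f} m")

Ensembles of such profiles (in two dimensions, and conditioned on depth-dependent statistics like those derived in the simulations described next) are what a probabilistic tsunami hazard calculation would then propagate to the coast.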
We study the systematic depth dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies with depth, earthquake size, and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style. PMID:27725733

Preliminary Hazard Assessment for Tectonic Tsunamis in the Eastern Mediterranean

NASA Astrophysics Data System (ADS)

Aydin, B.; Bayazitoglu, O.; Sharghi vand, N.; Kanoglu, U.

2017-12-01

There are many critical industrial facilities, such as energy production units and energy transmission lines, along the southeast coast of Turkey. This region is also active in tourism, agriculture, and aquaculture production. There are active faults in the region, namely the Cyprus Fault, which extends along the Mediterranean basin in the east-west direction and connects to the Hellenic Arc. Both the Cyprus Fault and the Hellenic Arc are seismologically active and capable of generating earthquakes with tsunamigenic potential. Even a small tsunami in the region could cause confusion, as shown by the recent 21 July 2017 Mw 6.6 earthquake in the Aegean Sea between Bodrum, Turkey and Kos Island, Greece, since the region is not prepared for such an event. Moreover, the Mediterranean Sea is one of the regions most vulnerable to sea level rise due to global warming, according to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. For these reasons, a marine hazard such as a tsunami could cause much worse damage than expected in the region (Kanoglu et al., Phil. Trans. R. Soc. A 373, 2015). Hence, a tsunami hazard assessment is required for the region. In this study, we first characterize earthquakes which have the potential to generate a tsunami in the Eastern Mediterranean; such a study is a prerequisite for regional tsunami mitigation efforts. For fast and timely predictions, tsunami warning systems usually employ databases that store pre-computed tsunami propagation results for hypothetical earthquakes with pre-defined parameters. These pre-defined sources are called tsunami unit sources, and they are linearly superposed to mimic a real event, since wave propagation is linear offshore. After investigating historical earthquakes along the Cyprus Fault and the Hellenic Arc, we identified tsunamigenic earthquakes in the Eastern Mediterranean and proposed tsunami unit sources for the region.
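To make the unit-source idea concrete, here is a toy Python sketch (entirely synthetic waveforms and slip values, not the proposed Eastern Mediterranean sources) in which the forecast at an offshore point is the slip-weighted linear superposition of pre-computed unit-source responses.

    import numpy as np

    n_sources, n_times = 4, 720
    t = np.arange(n_times) * 15.0          # 15 s sampling, 3 h of propagation

    # Stand-in unit-source responses at one offshore point: the wave height
    # produced by 1 m of slip on each unit source (synthetic wavelets here).
    unit_waveforms = np.array([
        np.exp(-((t - 1800.0 - 600.0 * i) / 500.0) ** 2)
        * np.sin(2.0 * np.pi * (t - 1800.0 - 600.0 * i) / 900.0)
        for i in range(n_sources)
    ])

    # Hypothetical slip (m) assigned to each unit source for one scenario.
    slip_per_source = np.array([0.0, 2.5, 4.0, 1.5])

    # Offshore propagation is treated as linear, so responses simply superpose.
    forecast = slip_per_source @ unit_waveforms
    print(f"peak forecast amplitude: {forecast.max():.2f} m")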
We used the tsunami numerical model MOST (Titov et al., J. Waterw. Port Coastal Ocean Eng. 142(6), 2016) to numerically solve the nonlinear shallow-water wave equations for calculating offshore wave propagation, shoreline wave heights, and inundation depths through its interface ComMIT (Titov et al., Pure Appl. Geophys. 168(11), 2011), and we generated an inundation map for one town on the southeast coast of Turkey.

Evaluation of W Phase CMT Based PTWC Real-Time Tsunami Forecast Model Using DART Observations: Events of the Last Decade

NASA Astrophysics Data System (ADS)

Wang, D.; Becker, N. C.; Weinstein, S.; Duputel, Z.; Rivera, L. A.; Hayes, G. P.; Hirshorn, B. F.; Bouchard, R. H.; Mungov, G.

2017-12-01

The Pacific Tsunami Warning Center (PTWC) began forecasting tsunamis in real time using source parameters derived from real-time Centroid Moment Tensor (CMT) solutions in 2009. Both the USGS and PTWC typically obtain W-phase CMT solutions for large earthquakes less than 30 minutes after the earthquake origin time. Within seconds, and often before waves reach the nearest deep-ocean bottom pressure sensors (DARTs), PTWC then generates a regional tsunami propagation forecast using its linear shallow-water model. The model is initialized by a sea surface deformation that mimics the seafloor deformation based on Okada's (1985) dislocation model of a rectangular fault with uniform slip; the fault length and width are empirical functions of the seismic moment. How well did this simple model perform? The DART records provide a very valuable dataset for model validation. We examine tsunami events of the last decade with earthquake magnitudes ranging from 6.5 to 9.0, including some deep events for which tsunamis were not expected. Most of the forecast results were obtained during the events. We also include events from before the implementation of the WCMT method at USGS and PTWC (2006-2009); for these events, WCMTs were computed retrospectively (Duputel et al. 2012). For some events we also re-ran the model with a larger domain, using the identical source parameters employed during the events, to include far-field DARTs that recorded a tsunami. We conclude that our model results, in terms of maximum wave amplitude, are mostly within a factor of two of those observed at DART stations, with an average error of less than 40% for most events, including the 2010 Maule and the 2011 Tohoku tsunamis. However, the simple fault model with uniform slip is too simplistic for the Tohoku tsunami. We note that model results are sensitive to centroid location and depth, especially if the earthquake is close to land or inland. For the 2016 M7.8 New Zealand earthquake, the initial forecast underestimated the tsunami because the initial WCMT centroid was on land (the epicenter was inland, but most of the slip occurred offshore); later WCMTs did provide a better forecast. The model also failed to reproduce the observed tsunamis from earthquake-generated landslides.
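The headline numbers of such an evaluation (the factor-of-two criterion and the average error quoted above) amount to simple bookkeeping over station maxima; the sketch below uses invented observed and modeled amplitudes purely to show the calculation, not values from this study.

    import numpy as np

    # Invented maximum amplitudes (cm) at five DART stations, for illustration only.
    observed = np.array([12.0, 4.5, 30.0, 8.0, 2.1])
    modeled = np.array([15.0, 3.1, 22.0, 10.5, 2.8])

    ratio = modeled / observed
    within_factor_two = np.mean((ratio > 0.5) & (ratio < 2.0))
    mean_abs_pct_error = 100.0 * np.mean(np.abs(modeled - observed) / observed)

    print(f"fraction within a factor of two: {within_factor_two:.2f}")
    print(f"average absolute error: {mean_abs_pct_error:.0f}%")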
Sea level observations during the events are crucial in determining whether or not a forecast needs to be adjusted.

A real-time cabled observatory on the Cascadia subduction zone

NASA Astrophysics Data System (ADS)

Vidale, J. E.; Delaney, J. R.; Toomey, D. R.; Bodin, P.; Roland, E. C.; Wilcock, W. S. D.; Houston, H.; Schmidt, D. A.; Allen, R. M.

2015-12-01

Subduction zones are replete with mystery and rife with hazard. Along most of the Pacific Northwest margin, the traditional methods of monitoring offshore geophysical activity use onshore sensors or involve conducting infrequent oceanographic expeditions. This results in a limited capacity for detecting and monitoring subduction processes offshore. We propose that the next step in geophysical observations of Cascadia should include real-time data delivered by a seafloor cable with seismic, geodetic, and pressure-sensing instruments. Along the Cascadia subduction zone, we need to monitor deformation, earthquakes, and fluid fluxes on short time scales. High-quality long-term time series are needed to establish baseline observations and evaluate secular changes in the subduction environment. Currently, we lack basic knowledge of the plate convergence rate and direction, their variations along strike, and how convergence is accommodated across the plate boundary. We would also like to investigate cycles of microseismicity, how far locking extends up-dip, and the transient processes (e.g., fluid pulsing, tremor, and slow slip) that occur near the trench. For reducing risk to society, real-time monitoring has great benefit for immediate and accurate assessment through earthquake early warning systems. Specifically, the improvement to early warning would lie in assessing the location, geometry, and progression of ongoing faulting and in obtaining an accurate tsunami warning, as well as in simply speeding up the early warning. It would also be valuable to detect strain transients, map the locked portion of the megathrust, and detect changes in locking over the earthquake cycle. Development of the US portion of a real-time cabled seismic and geodetic observatory should build upon the Ocean Observatories Initiative's cabled array, which was recently completed and is currently delivering continuous seismic and pressure data from the seafloor.
Its implementation would require substantial initial and ongoing investments from federal and state governments, private partners and the academic community, but it would constitute a critical resource in mitigating the hazard, both through improved earthquake and tsunami warning and through an enhanced scientific understanding of subduction processes in Cascadia.

An Offshore Geophysical Network in the Pacific Northwest for Earthquake and Tsunami Early Warning and Hazard Research

NASA Astrophysics Data System (ADS)

Wilcock, W. S. D.; Schmidt, D. A.; Vidale, J. E.; Harrington, M.; Bodin, P.; Cram, G.; Delaney, J. R.; Gonzalez, F. I.; Kelley, D. S.; LeVeque, R. J.; Manalang, D.; McGuire, C.; Roland, E. C.; Tilley, J.; Vogl, C. J.; Stoermer, M.

2016-12-01

The Cascadia subduction zone hosts catastrophic earthquakes every few hundred years. On land, there are extensive geophysical networks available to monitor the subduction zone, but since the locked portion of the plate boundary lies mostly offshore, these networks are ideally complemented by seafloor observations. Such considerations helped motivate the development of scientific cabled observatories that cross the subduction zone at two sites off Vancouver Island and one off central Oregon, but these have a limited spatial footprint along the strike of the subduction zone. The Pacific Northwest Seismic Network is leading a collaborative effort to implement an earthquake early warning system in Washington and Oregon using data streams from land networks as well as the few existing offshore instruments. For subduction zone earthquakes that initiate offshore, this system will provide a warning. However, the availability of real-time offshore instrumentation along the entire subduction zone would improve its reliability and accuracy, add up to 15 s to the warning time, and ensure an early warning for coastal communities near the epicenter. Furthermore, real-time networks of seafloor pressure sensors above the subduction zone would enable monitoring and contribute to accurate predictions of the incoming tsunami. There is also strong scientific motivation for offshore monitoring. We lack a complete knowledge of the plate convergence rate and direction. Measurements of steady deformation and observations of transient processes such as fluid pulsing, microseismic cycles, tremor and slow slip are necessary for assessing the dimensions of the locked zone and its along-strike segmentation. Long-term monitoring will also provide baseline observations that can be used to detect and evaluate changes in the subduction environment. There are significant engineering challenges to be solved to ensure the system is sufficiently reliable and maintainable. It must provide continuous monitoring over its operational life in the harsh ocean environment, and at least parts of the system must continue to operate following a megathrust event.
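A back-of-the-envelope check on the extra warning time that offshore stations can buy, as mentioned above, compares P-wave travel times to the nearest detecting station; all numbers below are assumptions chosen for illustration, not values from this abstract.

    # Offshore sensors detect the P wave earlier because they sit closer to an
    # offshore epicenter; the gain is roughly the difference in P travel times
    # (assumed values; processing and telemetry latency are ignored).
    vp_km_s = 7.0             # assumed average P-wave speed
    d_onshore_km = 120.0      # epicenter to nearest land station (assumed)
    d_offshore_km = 25.0      # epicenter to nearest seafloor station (assumed)

    extra_warning_s = (d_onshore_km - d_offshore_km) / vp_km_s
    print(f"approximate extra warning time: {extra_warning_s:.0f} s")

With these assumed distances the gain is on the order of 10-15 s, consistent with the figure quoted above.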
These requirements for robustness must be balanced with the desire for a flexible design that can accommodate new scientific instrumentation over the life of the project.

Crisis management of Tohoku; Japan earthquake and tsunami, 11 March 2011

PubMed

Zaré, M; Afrouz, S Ghaychi

2012-01-01

The huge earthquake on 11 March 2011, which was followed by a destructive tsunami in Japan, was the largest earthquake recorded in Japan's history. Japan is a pioneer in disaster management, especially for earthquakes. How did this developed country face this disaster, which had significant worldwide effects? The humanitarian behavior of the Japanese people amazed the world's media, while the management by the government and authorities showed some deficiencies. The impact of the disaster was followed up after the event, and its different impacts are analyzed across different sectors. The situation one year after the 2011 Japan earthquake and tsunami is reviewed. The reason for the failure of the Japanese plans was the scale of the tsunami, with higher waves than had been assumed, especially in the design of the nuclear power plant. Japanese authorities considered economic benefits more than safety and moral factors, which exacerbated the situation.
Major lessons to be learned are: 1) the effectiveness of disaster management should be re-examined in all hazard-prone countries; 2) high-tech early-warning systems are important in reducing risk; 3) expected/possible extreme hazard and risk levels need to be reconsidered; 4) morality might be taken as an important factor in disaster management; and 5) sustainable development should be taken as the basis for reconstruction after a disaster.

Complementary methods to plan pedestrian evacuation of the French Riviera's beaches in case of tsunami threat: graph- and multi-agent-based modelling

NASA Astrophysics Data System (ADS)

Sahal, A.; Leone, F.; Péroche, M.

2013-07-01

Small-amplitude tsunamis have impacted the French Mediterranean shore (French Riviera) in past centuries. Some caused casualties; others only generated economic losses. While the North Atlantic and Mediterranean tsunami warning system is being tested and is almost operational, no awareness or preparedness measures are being implemented at the local scale. Evacuation needs to be considered along the French Riviera, but no plan exists within communities. Through a case study of the Cannes-Antibes region, we show that various approaches can provide local stakeholders with evacuation capacity assessments for developing adapted evacuation plans. The complementarity between large- and small-scale approaches is demonstrated with the use of macro-simulators (graph-based) and micro-simulators (multi-agent-based) to select shelter points and choose evacuation routes for pedestrians located on the beach. The first allows shelter points to be selected automatically and their accessibility to be measured and mapped. The second reveals potential congestion issues during pedestrian evacuations and provides leads for improving the urban environment. Temporal accessibility to shelters is compared with potential local and distal tsunami travel times, showing a 40 min deficit for adequate crisis management in the first scenario and a 30 min surplus in the second.

The September 29, 2009 Earthquake and Tsunami in American Samoa: A Case Study of Household Evacuation Behavior and the Protective Action Decision Model

NASA Astrophysics Data System (ADS)

Apatu, E. J. I.; Gregg, C. E.; Lindell, M. K.; Sorensen, J.; Hillhouse, J.; Sorensen, B.

2012-04-01

In 2009, the islands of Samoa, American Samoa, and Tonga were struck by an 8.1 magnitude earthquake that triggered a tsunami. The latter claimed an estimated 149, 34, and nine lives, respectively. Preparing persons to take protective action during an earthquake and tsunami is important to help save lives, but evacuation behavior is a dynamic process that involves many factors, such as recognition and interpretation of environmental cues, characteristics of the receiver, characteristics of official and informal warnings, and a person's social context during the event. Compared to individualistic cultures like that of the USA, little is known about what factors affect household evacuation behavior in collectivist cultures. The Protective Action Decision Model (PADM) of Lindell and Perry (2004) is a theoretical framework that purports to explain human response to natural hazards. This broad behavioral hazard model has been tested in several settings in the United States. However, to date, the PADM has never been tested in a collectivist culture. Thus, this study will summarize interview findings from 300 American Samoan survivors to understand household evacuation behavior in response to the 2009 earthquake and tsunami that hit American Samoa. In addition, an investigation of how well the PADM explains evacuation behavior will be reported.
Findings from this study will be useful for public health emergency professionals in planning efforts for local tsunamis in coastal communities in the Pacific and around the world.

A game theory approach for assessing risk value and deploying search-and-rescue resources after devastating tsunamis.

PubMed

Wu, Cheng-Kuang

2018-04-01

The current early-warning systems and tsunami protection measures tend to fall short because they always underestimate the level of destruction, and it is difficult to predict the level of damage a devastating tsunami will inflict on uncertain targets. The key to minimizing the total number of fatalities after a disaster is the deployment of search-and-rescue efforts in the first few hours; however, the resources available to the affected districts for emergency response may be limited. This study proposes two game-theoretic models designed for search-and-rescue resource allocation. First, the interactions between a compounded disaster and a response agent in the affected district are modelled as a non-cooperative game, after which the risk value is derived for each district from the Nash equilibrium. The risk value represents the threat, vulnerability, and consequence of a specific disaster for the affected district. Second, the risk values for fifteen districts are collected for the calculation of each district's Shapley value. An acceptable plan for resource deployment among all districts is then made based on their expected marginal contributions. The model is verified in a simulation based upon 2011 tsunami data. The experimental results show that the proposed approach is more efficient than a proportional division of rescue resources for dealing with a compounded disaster, and that it is feasible as a method for planning the mobilization of resources and improving relief efforts against devastating tsunamis. Copyright © 2017 Elsevier Inc. All rights reserved.

Impact of huge tsunami in March 2011 on seaweed bed distributions in Shizugawa Bay, Sanriku Coast, revealed by remote sensing

NASA Astrophysics Data System (ADS)

Sakamoto, Shingo X.; Sasa, Shuji; Sawayama, Shuhei; Tsujimoto, Ryo; Terauchi, Genki; Yagi, Hiroshi; Komatsu, Teruhisa

2012-10-01

Seaweed beds are very important as habitat for abalones and sea urchins. On the Sanriku Coast, these animals are target species of coastal fisheries. The huge tsunami hit the Sanriku Coast, which faces the Pacific Ocean, on 11 March 2011. To recover coastal fisheries, fishermen need to know the present situation of the seaweed beds and to understand the damage the huge tsunami caused to the natural environment. We selected Shizugawa Bay as a study site because its abalone catch ranked first on the Sanriku Coast.
To evaluate the impact of the tsunami on seaweed beds, we compared a high-spatial-resolution satellite image of Shizugawa Bay acquired before the tsunami with one acquired after it, using remote sensing together with ground surveys. We used two multi-band images from the commercial high-resolution satellite GeoEye-1, taken on 4 November 2009, before the tsunami, and on 22 February 2012, after the tsunami. Although divers observed that the tsunami damaged a very small part of the Eisenia bicyclis distributions on rock substrates at the bay head, this was not clearly observed in the satellite image analysis. On the other hand, the image analysis revealed an increase in seaweed beds after the tsunami. The tsunami broke concrete breakwaters, entrained a large amount of rocks and pebbles from land into the sea, and disseminated them in the bay. Thus, hard substrates suitable for the attachment of seaweeds increased. Ground surveys revealed that seaweeds consisting of E. bicyclis, Sargassum and Laminaria species grew on these hard substrates on the sandy bottom.

Towards a robust framework for Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami in New Zealand

NASA Astrophysics Data System (ADS)

Mueller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming

2013-04-01

Probabilistic Tsunami Hazard Assessment (PTHA) is conceptually closely related to Probabilistic Seismic Hazard Assessment (PSHA). The main difference is that PTHA needs to simulate the propagation of tsunami waves through the ocean and cannot rely on attenuation relationships, which makes PTHA computationally more expensive. The wave propagation process can be assumed to be linear as long as the water depth is much larger than the wave amplitude of the tsunami. Beyond this limit a non-linear scheme has to be employed, with significantly longer run times. PTHA considering far-field tsunami sources typically uses unit-source simulations and relies on the linearity of the process by later scaling and combining the wave fields of individual simulations to represent the intended earthquake magnitude and rupture area. Probabilistic assessments are typically made for locations offshore but close to the coast, and inundation is calculated only for significantly contributing events (de-aggregation). For local and regional tsunami it has been demonstrated that earthquake rupture complexity has a significant effect on the tsunami amplitude distribution offshore and also on inundation. In this case PTHA has to take variable slip distributions and non-linearity into account, and a unit-source approach cannot easily be applied. Rupture complexity is treated as an aleatory uncertainty and can be incorporated directly into the rate calculation. We have developed a framework that manages the large number of simulations required for local PTHA. As an initial case study, the effect of rupture complexity on tsunami inundation and the statistics of the distribution of wave heights have been investigated for plate-interface earthquakes in the Hawke's Bay region of New Zealand.
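As a minimal illustration of the exceedance statistics such a framework ultimately produces (the wave heights below are invented, not Hawke's Bay results), an empirical exceedance curve can be built directly from an ensemble of simulated maxima:

    import numpy as np

    # Invented maximum wave heights (m) at one coastal point from an ensemble
    # of variable-slip rupture scenarios.
    heights = np.array([0.4, 0.7, 1.1, 1.3, 1.8, 2.2, 2.9, 3.5, 4.1, 5.6])

    h = np.sort(heights)
    ecdf = np.arange(1, h.size + 1) / h.size   # empirical CDF at the sorted heights
    exceedance = 1.0 - ecdf                    # P(height > h), conditional on an event

    for threshold in (1.0, 2.0, 4.0):
        print(f"P(max height > {threshold:.0f} m) = {np.mean(heights > threshold):.2f}")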
Assessing the probability that water levels will exceed a certain threshold requires the calculation of empirical cumulative distribution functions (ECDFs). We compare our results with traditional estimates from tsunami inundation simulations that do not consider rupture complexity. De-aggregation based on moment magnitude alone might not be appropriate, because the hazard posed by any individual event can be underestimated locally if rupture complexity is ignored.

Tipping point analysis of seismological data

NASA Astrophysics Data System (ADS)

Livina, Valerie N.; Tolkova, Elena

2014-05-01

We apply the tipping point toolbox [1-7] to study sensor data of pressure variations and vertical velocity of the sea floor after two seismic events: 21 October 2010, M6.9, depth 10 km (California) and 11 March 2011, M9.0, depth 30 km (Japan). One type of dataset was measured by a nano-resolution pressure sensor [8], while the other, for comparison, was measured by a co-located ocean-bottom seismometer. Both sensors registered the seismic wave, and we investigated the early warning and detection signals of the wave arrival for possible application with remote and cabled tsunami warning detector networks (the NOAA DART system and the Japan Trench Tsunami Observation System). We study the early warning and detection signals of the wave arrival using a methodology that combines degenerate fingerprinting and potential analysis techniques for the anticipation, detection and forecast of tipping points in a dynamical system. The degenerate fingerprinting indicator is a dynamically derived lag-1 autocorrelation, ACF (or, alternatively, the short-range scaling exponent of Detrended Fluctuation Analysis, DFA [1]), which captures short-term memory in a series. When such values rise monotonically, this indicates an upcoming transition or bifurcation in the series and can be used for early-warning signal analysis. The potential analysis detects a transition or bifurcation in a series at the time when it happens, which is illustrated in a special contour plot mapping the potential dynamics of the system [2-6]. The methodology has been extensively tested on artificial data and on various geophysical, ecological and industrial sensor datasets [2-5, 7], and has proved to be applicable to trajectories of dynamical systems of arbitrary origin [9]. In this seismological application, we obtained early warning signals in the described series using the ACF- and DFA-indicators and detected the Rayleigh wave arrival in the potential contour plots. In the case of the 2010 event, the early warning signal starts appearing about 2 min before the first peak of the Rayleigh train is detected by the sensor, whereas in the case of the 2011 event, the early warning signal appears closer to the peak arrival, within 1 min. The different strength of the early warning signals of the Rayleigh trains may be due to the different depths of the events (10 and 30 km), which we plan to test in further analysis.
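A bare-bones version of the degenerate-fingerprinting indicator described above is a lag-1 autocorrelation computed in a sliding window, with a sustained rise read as an early-warning signal; the sketch applies it to a synthetic record with slowly increasing memory, not to the sensor data of this study.

    import numpy as np

    def lag1_autocorrelation(x):
        """Lag-1 autocorrelation of a 1-D array (mean removed)."""
        x = x - x.mean()
        return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

    def sliding_indicator(series, window):
        """Lag-1 autocorrelation in a sliding window over the series."""
        return np.array([lag1_autocorrelation(series[i:i + window])
                         for i in range(len(series) - window + 1)])

    # Synthetic AR(1) record whose coefficient slowly increases, mimicking the
    # rise in short-term memory that the indicator is designed to detect.
    rng = np.random.default_rng(1)
    n = 4000
    x = np.zeros(n)
    for i in range(1, n):
        a = 0.1 + 0.8 * i / n
        x[i] = a * x[i - 1] + rng.standard_normal()

    indicator = sliding_indicator(x, window=500)
    print(f"indicator rises from {indicator[0]:.2f} to {indicator[-1]:.2f}")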
<li> <p><a href="http://adsabs.harvard.edu/abs/2015AGUFMNH23A1856G"><span>First application of tsunami back-projection and source inversion for the 2012 Haida Gwaii earthquake using tsunami data recorded on a dense array of seafloor pressure gauges</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gusman, A. R.; Satake, K.; Sheehan, A. F.; Mulia, I. E.; Heidarzadeh, M.; Maeda, T.</p> <p>2015-12-01</p> <p>The adaptation of absolute or differential pressure gauges (APG or DPG) to ocean-bottom seismometers has provided the opportunity to study tsunamis. Recently we extracted tsunami waveforms of the 28 October 2012 Haida Gwaii earthquake recorded by the APGs and DPGs of the Cascadia Initiative program (Sheehan et al., 2015, SRL). We used these dense tsunami observations (48 stations) together with records from DARTs (9 stations) to characterize the tsunami source. This is the first study to use such a large number of offshore tsunami records for earthquake source analysis. Conventionally, curves of tsunami travel time are drawn backward from station locations to estimate the tsunami source region. Here we propose a more advanced technique, tsunami back-projection, to estimate the source region. The image produced by tsunami back-projection has its largest value (the tsunami centroid) very close to the epicenter and above the Queen Charlotte transform fault (QCF), whereas the negative values are mostly located east of Haida Gwaii in the Hecate Strait. By using tsunami back-projection we avoid picking the initial tsunami phase, a necessary but rather subjective step in the conventional method. The slip distribution of the 2012 Haida Gwaii earthquake estimated by tsunami waveform inversion shows large slip near the trench (4-5 m) and also on the plate interface southeast of the epicenter (3-4 m) below the QCF. From the slip distribution, the calculated seismic moment is 5.4 × 10^20 N m (Mw 7.8). The steep bathymetry offshore Haida Gwaii and the horizontal movement caused by the earthquake possibly affect the sea-surface deformation. The potential tsunami energy calculated from the sea-surface deformation of pure faulting is 2.20 × 10^13 J, while that from the bathymetry effect is 0.12 × 10^13 J, or about 5% of the total potential energy. The significant deformation above the steep slope is confirmed by another tsunami inversion that disregards fault parameters.</p> </li>
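<p>A heavily simplified sketch of the back-projection idea (shifting each station record back by the tsunami travel time to every trial source node and stacking the amplitudes) is given below. The flat-bathymetry travel-time approximation, station geometry and waveforms are all invented for illustration and are not the data or the code used in the study:</p>
<pre><code># Minimal tsunami back-projection sketch: for each trial source node, sample
# each station record at the predicted travel time and stack the amplitudes.
# Geometry, bathymetry and waveforms are synthetic placeholders.
import numpy as np

g = 9.81
depth = 2500.0                       # uniform ocean depth (m), so constant speed
c = np.sqrt(g * depth)               # long-wave phase speed (m/s)
dt = 15.0                            # sampling interval (s)

# Hypothetical station positions (x, y in km) and a trial source grid.
stations = np.array([[150.0, 0.0], [0.0, 180.0], [-120.0, 90.0], [80.0, -140.0]])
grid_x, grid_y = np.meshgrid(np.arange(-200, 201, 20.0),
                             np.arange(-200, 201, 20.0))

# Synthetic records: a Gaussian pulse arriving at each station at the
# travel time from a "true" source placed at the origin.
t = np.arange(0, 4 * 3600, dt)
true_tt = np.hypot(stations[:, 0], stations[:, 1]) * 1e3 / c
records = np.exp(-0.5 * ((t[None, :] - true_tt[:, None]) / 120.0) ** 2)

stack = np.zeros_like(grid_x)
for k, (sx, sy) in enumerate(stations):
    # Travel time from every grid node to this station (flat bathymetry).
    tt = np.hypot(grid_x - sx, grid_y - sy) * 1e3 / c
    idx = np.clip(np.round(tt / dt).astype(int), 0, t.size - 1)
    stack += records[k, idx]         # sample the record at its predicted arrival

iy, ix = np.unravel_index(np.argmax(stack), stack.shape)
print("Back-projected centroid (km):", grid_x[iy, ix], grid_y[iy, ix])
</code></pre>
<p>Running this toy version recovers the node nearest the true source; the real application additionally handles variable bathymetry, polarity and noise, which are omitted here.</p>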
<li> <p><a href="http://adsabs.harvard.edu/abs/1991JAtOT...8..879B"><span>Improved satellite-based emergency alerting system</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bernard, E. N.; Milburn, H. B.</p> <p>1991-12-01</p> <p>Rapid-onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornadoes, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. A general approach to mitigating the effects of these disasters was demonstrated in 1988 that included pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links. This article reports on improvements to this satellite-based emergency alerting communication system that have reduced the response time from 87 to 17 sec and expanded the broadcast coverage from 40 percent to 62 percent of the earth's surface.</p> </li> <li> <p><a href="http://adsabs.harvard.edu/abs/2017PApGe.174.2925G"><span>Fault Slip Distribution of the 2016 Fukushima Earthquake Estimated from Tsunami Waveforms</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gusman, Aditya Riadi; Satake, Kenji; Shinohara, Masanao; Sakai, Shin'ichi; Tanioka, Yuichiro</p> <p>2017-08-01</p> <p>The 2016 Fukushima normal-faulting earthquake (Mjma 7.4) occurred 40 km off the coast of Fukushima within the upper crust. The earthquake generated a moderate tsunami which was recorded by coastal tide gauges and offshore pressure gauges. First, the sensitivity of tsunami waveforms to fault dimensions and depths was examined and the best size and depth were determined. Tsunami waveforms computed from four available focal mechanisms showed that a simple fault striking northeast-southwest and dipping southeast (strike = 45°, dip = 41°, rake = -95°) yielded the best fit to the observed waveforms. This fault geometry was then used in a tsunami waveform inversion to estimate the fault slip distribution. A large slip of 3.5 m was located near the surface, and the major slip region covered an area of 20 km × 20 km. The seismic moment, calculated assuming a rigidity of 2.7 × 10^10 N/m^2, was 3.70 × 10^19 N m, equivalent to Mw = 7.0. This is slightly larger than the moments from the moment tensor solutions (Mw 6.9). Large secondary tsunami peaks arrived approximately an hour after the clear initial peaks recorded by the offshore pressure gauges and the Sendai and Ofunato tide gauges. Our tsunami propagation model suggests that the large secondary tsunami signals were waves reflected off the Fukushima coast. A rather large tsunami amplitude of 75 cm at Kuji, about 300 km north of the source, was comparable to those recorded at stations located much closer to the epicenter, such as Soma and Onahama. Tsunami simulations and ray tracing for both real and artificial bathymetry indicate that a significant portion of the tsunami wave was refracted toward the coast around Kuji and Miyako due to bathymetry effects.</p> </li>
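<p>The moment bookkeeping quoted in the abstract above (rigidity × rupture area × average slip, converted to moment magnitude) is easy to reproduce. The sketch below uses the rounded values given in the abstract together with the standard Hanks-Kanamori conversion, so the result is only approximate:</p>
<pre><code># Reproducing the seismic-moment arithmetic with the rounded values quoted
# in the abstract (rigidity 2.7e10 N/m^2, ~3.5 m slip over ~20 km x 20 km).
import math

mu = 2.7e10            # rigidity, N/m^2
slip = 3.5             # average slip, m
area = 20e3 * 20e3     # rupture area, m^2

m0 = mu * area * slip                       # seismic moment, N m  (~3.8e19)
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)   # moment magnitude (Hanks and Kanamori)

print(f"M0 = {m0:.2e} N m, Mw = {mw:.2f}")  # close to the reported 3.70e19 N m, Mw 7.0
</code></pre>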
<li> <p><a href="http://adsabs.harvard.edu/abs/2018E%26ES..148a2005S"><span>"Smong" as local wisdom for disaster risk reduction</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Suciani, A.; Islami, Z. R.; Zainal, S.; Sofiyan; Bukhari</p> <p>2018-04-01</p> <p>The province of Aceh, located at the northern tip of Sumatera Island, Indonesia, is highly vulnerable to disasters, namely earthquakes and tsunamis. This is due to the geological setting of Aceh, which lies where the Indo-Australian and Eurasian plates meet. Many people learned this only after the devastating earthquake and tsunami of December 26, 2004, which killed thousands of people and also caused countless material losses. Before 2004, people in Aceh did not even know what a tsunami was. Since the magnitude 9.2 earthquake of 2004, Aceh has continued to experience significant earthquakes, as it did in Pidie Jaya in December 2016. Given these conditions, the people of Aceh need to be informed of the real and serious threats that these disasters pose in order to reduce the impact of potential tragedies. Local wisdom can serve as an early warning for reducing disaster risk, because it is easy for society to understand, adapt, and use. The purpose of this paper is to present "Smong" as local wisdom for reducing the risk of potential earthquake and tsunami disasters. The word, which refers to a tsunami, was adopted from the Devayan language. It is part of the Simeulue indigenous culture, transmitted through songs, short poems, lullabies, and stories. It is fascinating to note that the earthquake and tsunami catastrophe of 2004 resulted in only seven casualties in Simeulue, which has approximately 86,735 inhabitants. Smong is a key word understood by the entire population of Simeulue that describes the occurrence of giant waves after a major earthquake. During the terrible event that struck Aceh on December 26, 2004, there was a massive evacuation of the entire Simeulue beach area within a few minutes of the earthquake. Therefore, "Smong" is an appropriate concept to use to reduce the impact of disasters, viz. earthquakes and tsunamis, in high-risk areas.</p> </li> <li> <p><a href="https://pubs.er.usgs.gov/publication/70025729"><span>Slip distribution of the 1952 Tokachi-Oki earthquake (M 8.1) along the Kuril Trench deduced from tsunami waveform inversion</span></a></p> <p><a href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Hirata, K.; Geist, E.; Satake, K.; Tanioka, Y.; Yamaki, S.</p> <p>2003-01-01</p> <p>We inverted 13 tsunami waveforms recorded in Japan to estimate the slip distribution of the 1952 Tokachi-Oki earthquake (M 8.1), which occurred off southeastern Hokkaido along the southern Kuril subduction zone. The previously estimated source area determined from tsunami travel times [Hatori, 1973] did not coincide with the observed aftershock distribution. Our results show that a large amount of slip occurred in the aftershock area east of Hatori's tsunami source area, suggesting that a portion of the interplate thrust near the trench was ruptured by the main shock. We also found more than 5 m of slip along the deeper part of the seismogenic interface, just below the central part of Hatori's tsunami source area. This region, which also has the largest stress drop during the main shock, had few aftershocks. Large tsunami heights on the eastern Hokkaido coast are better explained by the heterogeneous slip model than by previous uniform-slip fault models. The total seismic moment is estimated to be 1.87 × 10^21 N m, giving a moment magnitude of Mw = 8.1. The revised tsunami source area is estimated to be 25.2 × 10^3 km^2, about 3 times larger than the previous tsunami source area. Of the four large earthquakes with M ≥ 7 that subsequently occurred in and around the rupture area of the 1952 event, three were at the edges of regions with relatively small amounts of slip. We also found that a subducted seamount near the edge of the rupture area possibly impeded slip along the plate interface.</p> </li>
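<p>The waveform-inversion step used in studies like the one above is, at its core, a linear problem: observed tsunami records are modelled as a combination of precomputed subfault Green's functions, with slip constrained to be non-negative. The sketch below is a generic illustration of that idea, not the authors' code; the Green's functions, "observations", rigidity and subfault size are synthetic stand-ins, and the smoothing, weighting and time alignment used in real inversions are omitted:</p>
<pre><code># Sketch of linear tsunami-waveform inversion for subfault slip:
#   d = G s, with s constrained non-negative, solved by NNLS.
# Green's functions and "observed" waveforms are synthetic placeholders.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

n_subfaults = 6
n_samples = 400            # stacked samples from all stations

# Precomputed Green's functions: waveform at the gauges for 1 m of slip on
# each subfault (here just smooth random curves for illustration).
G = rng.normal(size=(n_samples, n_subfaults))
G = np.cumsum(G, axis=0) / 50.0

true_slip = np.array([0.0, 1.5, 4.0, 2.0, 0.5, 0.0])    # metres
d = G @ true_slip + 0.05 * rng.normal(size=n_samples)   # noisy observations

slip, residual = nnls(G, d)
print("recovered slip (m):", np.round(slip, 2))

# Convert to a moment estimate, assuming a rigidity and subfault size
# (both hypothetical here).
mu = 4.0e10                        # assumed rigidity, N/m^2
subfault_area = 50e3 * 50e3        # assumed subfault size, m^2
m0 = mu * subfault_area * slip.sum()
mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)
print(f"M0 = {m0:.2e} N m, Mw = {mw:.2f}")
</code></pre>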
<li> <p><a href="http://adsabs.harvard.edu/abs/2018PApGe.175...35A"><span>Tsunami Source Inversion Using Tide Gauge and DART Tsunami Waveforms of the 2017 Mw8.2 Mexico Earthquake</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Adriano, Bruno; Fujii, Yushiro; Koshimura, Shunichi; Mas, Erick; Ruiz-Angulo, Angel; Estrada, Miguel</p> <p>2018-01-01</p> <p>On September 8, 2017 (UTC), a normal-fault earthquake occurred 87 km off the southeast coast of Mexico. This earthquake generated a tsunami that was recorded at coastal tide gauge and offshore buoy stations. First, we conducted a numerical tsunami simulation using a single-fault model to understand the tsunami characteristics near the rupture area, focusing on the nearby tide gauge stations. Second, the tsunami source of this event was estimated from inversion of tsunami waveforms recorded at six coastal stations and three buoys located in the deep ocean. Based on the aftershock distribution within 1 day of the main shock, the fault plane was taken to dip northeast (strike = 320°, dip = 77°, and rake = -92°). The results of the tsunami waveform inversion revealed that the fault area was 240 km × 90 km in size, with most of the largest slip occurring on the middle and deepest segments of the fault. The maximum slip was 6.03 m on a 30 × 30 km^2 segment that was 64.82 km deep at the center of the fault area. The estimated slip distribution showed that the main asperity was at the center of the fault area; a second asperity with an average slip of 5.5 m was found on the northwestern-most segments. The estimated slip distribution yielded a seismic moment of 2.9 × 10^21 N m (Mw = 8.24), calculated assuming an average rigidity of 7 × 10^10 N/m^2.</p> </li> <li> <p><a href="http://adsabs.harvard.edu/abs/2017NHESS..17..335H"><span>Development of a decision support system for tsunami evacuation: application to the Jiyang District of Sanya city in China</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hou, Jingming; Yuan, Ye; Wang, Peitao; Ren, Zhiyuan; Li, Xiaojuan</p> <p>2017-03-01</p> <p>Major tsunami disasters often cause great damage in the first few hours following an earthquake. The possible severity of such events requires preparations to prevent tsunami disasters or mitigate them. This paper is an attempt to develop a decision support system for rapid tsunami evacuation for local decision makers. Based on a database of precomputed numerical tsunami results, the system can quickly obtain the tsunami inundation and travel time. Because the numerical models are calculated in advance, the system reduces decision-making time. Population distribution, as a vulnerability factor, was analyzed to identify areas of high risk for tsunami disasters. Combined with spatial data, the system can comprehensively analyze the dynamic and static evacuation process and identify problems that negatively impact evacuation, thus supporting decision-making for tsunami evacuation in high-risk areas. When an earthquake and tsunami occur, the system can rapidly retrieve the tsunami inundation and travel time and provide information to assist tsunami evacuation operations.</p> </li>
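<p>The central design choice of such a decision support system, querying a database of precomputed scenarios instead of running a model after the earthquake, can be sketched as follows. The scenario table, its fields and the matching rule are hypothetical and much simpler than the real system:</p>
<pre><code># Sketch of a precomputed-scenario lookup, the key idea behind fast tsunami
# decision support: pick the closest stored scenario rather than running a
# simulation after the earthquake. All entries below are hypothetical.
from dataclasses import dataclass

@dataclass
class Scenario:
    lat: float
    lon: float
    mw: float
    max_inundation_m: float      # precomputed coastal inundation depth
    travel_time_min: float       # precomputed first-arrival time

DATABASE = [
    Scenario(17.5, 109.5, 8.0, 1.2, 95.0),
    Scenario(17.5, 109.5, 8.5, 2.8, 92.0),
    Scenario(16.0, 111.0, 8.5, 1.9, 110.0),
    Scenario(15.0, 112.5, 9.0, 3.6, 128.0),
]

def closest_scenario(lat, lon, mw):
    """Nearest precomputed scenario by a simple weighted parameter distance."""
    def dist(s):
        return (s.lat - lat) ** 2 + (s.lon - lon) ** 2 + 4.0 * (s.mw - mw) ** 2
    return min(DATABASE, key=dist)

# Example query for a hypothetical event.
s = closest_scenario(lat=17.2, lon=110.0, mw=8.4)
print(f"Expected inundation ~{s.max_inundation_m} m, "
      f"first wave in ~{s.travel_time_min:.0f} min")
</code></pre>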
<li> <p><a href="http://adsabs.harvard.edu/abs/2017E%26ES...71a2001R"><span>Tsunami Evidence in South Coast Java, Case Study: Tsunami Deposit along South Coast of Cilacap</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rizal, Yan; Aswan; Zaim, Yahdi; Dwijo Santoso, Wahyu; Rochim, Nur; Daryono; Dewi Anugrah, Suci; Wijayanto; Gunawan, Indra; Yatimantoro, Tatok; Hidayanti; Herdiyani Rahayu, Resti; Priyobudi</p> <p>2017-06-01</p> <p>The Cilacap area is situated on the coast of southern Java and was directly affected by the tsunami of 2006, an event triggered by the active subduction along the Java Trench. To detect past tsunamis and active tectonics in southern Java, a paleo-tsunami study was performed, targeting paleo-tsunami deposits older than fifty years. During 2011-2016, 16 suspected paleo-tsunami locations were visited and test pits were dug to obtain the characteristics and stratigraphy of the paleo-tsunami layers. Paleo-tsunami layers were identified by the presence of light sand in the upper part of the paleo-soil, liquefied fine-grained sandstone, and abundant mudstone rip-up clasts. Samples were taken systematically and analyzed (micro-fauna, grain-size and dating analyses). The micro-fauna results show that the paleo-tsunami layers consist of benthic foraminifera assemblages from different bathymetric zones mixed in a single layer. Moreover, the grain-size analysis shows a random grain distribution characteristic of turbulent, strong-wave deposits. The paleo-tsunami layers in the Cilacap area are correlated using the paleo-soil as a marker. Three paleo-tsunami layers can be identified, labelled PS-A, PS-B and PS-C. Samples taken from the Glempang Pasir layers were dated using the Pb-Zn (lead-zinc) method; the results show that PS-A was deposited about 139 years ago, PS-B about 21 years ago, and PS-C about 10 years ago. This indicates that PS-A corresponds to the earthquake activity of 1883, PS-B to the 1982 earthquake, and PS-C to the 2006 earthquake. In ongoing research, older paleo-tsunami layers have been examined at Gua Nagaraja, close to the Selok location, where six suspected paleo-tsunami layers with characteristics similar to those at the other locations were found. The three deeper layers are presumably older than those at the other locations in Cilacap.</p> </li>
<li> <p><a href="http://adsabs.harvard.edu/abs/2016EGUGA..1811186A"><span>Uncertainty Quantification in Tsunami Early Warning Calculations</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Anunziato, Alessandro</p> <p>2016-04-01</p> <p>The objective of tsunami calculations is the estimation of the impact of waves caused by large seismic events on the coasts and the determination of potential inundation areas. In the case of early warning systems, i.e. systems that should make it possible to anticipate the possible effects and to react accordingly (for instance, by issuing evacuation orders for areas at risk), this must be done within a very short time (minutes) to be effective. In reality, the estimation involves several sources of uncertainty which make the prediction extremely difficult. The very first estimates of the seismic parameters are not very precise: the uncertainty in the seismic components (location, magnitude and depth) decreases with time, because as time passes more and more seismic signals can be used and the event characterization becomes more precise. On the other hand, other parameters that must be established to perform a calculation (e.g. the fault mechanism) are difficult to estimate accurately even after hours (and in some cases remain unknown), so this uncertainty persists in the estimated impact evaluations; when a quick tsunami calculation is necessary (early warning systems), the ability to account for possible future variations of the conditions and so establish a "worst case scenario" is particularly important. The consequence is that the number of uncertain parameters is so large that it is not easy to assess the relative importance of each of them and their effect on the predicted results. In general, the complexity of system computer codes arises from the multitude of different models which are assembled into a single program to give the global response for a particular phenomenon. Each of these models has an associated uncertainty derived from the application of that model to single cases and/or separate-effect test cases. The difficulty in predicting the response of a tsunami calculation is further increased by imperfect knowledge of the initial and boundary conditions, so that the response can change even with small variations of the input. The paper analyses a number of potential events in the Mediterranean Sea and in the Atlantic Ocean; for each of them a large number of calculations is performed (Monte Carlo simulation) in order to identify the relative importance of each of the uncertain parameters adopted. It is shown that although the variation in the estimate is reduced after several hours, it still remains, and in some cases it can lead to different conclusions if this information is used for alerting. The cases considered are: a mild event in the Hellenic arc (Mag. 6.9), a medium event in Algeria (Mag. 7.2) and a quite significant event in the Gulf of Cadiz (Mag. 8.2).</p> </li>
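<p>The Monte Carlo exercise described above can be illustrated with a toy version: sample the uncertain source parameters from assumed distributions and examine the spread of a crude coastal-amplitude proxy. The distributions, the scaling relation and the calibration constant below are placeholders invented for this sketch, not the models used in the paper:</p>
<pre><code># Toy Monte Carlo propagation of source-parameter uncertainty: sample
# magnitude, depth and distance from assumed distributions and look at the
# spread of a crude coastal-amplitude proxy. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

mw = rng.normal(7.2, 0.2, n)             # magnitude uncertainty
depth_km = rng.uniform(5.0, 40.0, n)     # poorly known depth
dist_km = rng.normal(250.0, 30.0, n)     # distance from source to the coast

def coastal_amplitude(mw, depth_km, dist_km):
    """Crude proxy: grows with seismic moment, decays with depth and distance.
    The 1e-16 factor is an arbitrary calibration constant for this sketch."""
    m0 = 10.0 ** (1.5 * mw + 9.1)
    return 1e-16 * m0 / (np.maximum(depth_km, 5.0) * np.maximum(dist_km, 50.0))

amp = coastal_amplitude(mw, depth_km, dist_km)

p16, p50, p84 = np.percentile(amp, [16, 50, 84])
print(f"median ~{p50:.2f} m, 16-84% range {p16:.2f}-{p84:.2f} m")
print("fraction of samples above 0.5 m:", np.mean(amp >= 0.5).round(2))
</code></pre>
<p>Even in this toy setting, the spread of the proxy shows how a single deterministic run can misrepresent the range of outcomes, which is the point the study makes with full simulations.</p>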
<li> <p><a href="http://adsabs.harvard.edu/abs/2008AGUFMOS43D1332T"><span>Noise Reduction of Ocean-Bottom Pressure Data Toward Real-Time Tsunami Forecasting</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tsushima, H.; Hino, R.</p> <p>2008-12-01</p> <p>We discuss a method of noise reduction for ocean-bottom pressure data to be fed into the near-field tsunami forecasting scheme proposed by Tsushima et al. [2008a]. In their scheme, the pressure data are processed in real time as follows: (1) removing ocean tide components by subtracting the sea-level variation computed from a theoretical tide model, (2) applying a low-pass digital filter to remove high-frequency fluctuations due to seismic waves, and (3) removing the DC offset and linear-trend component to determine a baseline of relative sea level. However, it turns out that this simple method is not always successful in extracting tsunami waveforms from the data when the observed amplitude is ~1 cm. For disaster mitigation, accurate forecasting of small tsunamis is as important as that of large tsunamis. Since small tsunami events occur frequently, successful forecasting of those events is critical to gaining public confidence in tsunami warnings. As a test case, we applied the data processing described above to bottom pressure records containing a tsunami with amplitude less than 1 cm, generated by the 2003 off-Fukushima earthquake in the Japan Trench subduction zone. The observed pressure variation due to the ocean tide is well explained by the tide signals calculated from the NAO99Jb model [Matsumoto et al., 2000]. However, the tide components estimated from the pressure data by BAYTAP-G [Tamura et al., 1991] are more appropriate for predicting and removing the ocean tide signals. In the pressure data after removal of the tide variations, there remain pressure fluctuations with frequencies ranging from about 0.1 to 1 mHz and amplitudes of around ~10 cm. These fluctuations distort the estimation of the zero level and linear trend used to define the relative sea-level variation, which is treated as the tsunami waveform in the subsequent analysis. Since the linear trend is estimated from the data prior to the origin time of the earthquake, an artificial linear trend is produced in the processed waveform. This artificial linear trend degrades the accuracy of the tsunami forecast, although the forecast result is expected to be robust against short-period noise [Tsushima et al., 2008a]. Since the bottom pressure shows a gradual increase (or decrease) in the tsunami source region [Tsushima et al., 2008b], it is important to remove any linear trend not related to tsunami generation from the data before it is fed into the analysis. Therefore, reduction of the noise in the sub-mHz band is critical for forecasting small tsunamis. Applying frequency filters to eliminate this noise is not a solution, because actual tsunami signals may also contain components in this frequency band. We investigate whether statistical modeling of the noise is effective for reducing the sub-mHz noise.</p> </li>
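<p>The three-step real-time processing described in the abstract above (subtract a predicted tide, low-pass filter out seismic-frequency oscillations, then remove the offset and the pre-event linear trend) can be sketched roughly as follows. The synthetic record, the filter settings and the "perfect" tide prediction are simple stand-ins, not the actual scheme of Tsushima et al.:</p>
<pre><code># Rough sketch of the ocean-bottom pressure processing chain described above:
# (1) subtract a predicted tide, (2) low-pass filter to suppress seismic
# waves, (3) remove the offset / linear trend fitted before the origin time.
# The synthetic data and filter settings are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt

dt = 1.0                                    # sampling interval, s
t = np.arange(0, 6 * 3600, dt)              # 6 hours of data
origin = 3 * 3600                           # earthquake origin time, s

# Synthetic record: tide + slow drift + seismic ringing + a ~1 cm tsunami.
tide = 50.0 * np.sin(2 * np.pi * t / 44714.0)            # cm, ~M2 period
drift = 2.0e-4 * t                                        # instrument drift, cm
seismic = 0.5 * np.sin(2 * np.pi * t / 20.0) * (t > origin)
tsunami = 1.0 * np.exp(-0.5 * ((t - origin - 1800) / 600.0) ** 2)
record = tide + drift + seismic + tsunami

# (1) remove the predicted tide (here the prediction is perfect by design).
detided = record - tide

# (2) low-pass filter (cutoff ~2 mHz) to remove seismic-band energy.
b, a = butter(4, 0.002, btype="low", fs=1.0 / dt)
smooth = filtfilt(b, a, detided)

# (3) fit offset + linear trend on the pre-event window and subtract it.
pre = t < origin
coef = np.polyfit(t[pre], smooth[pre], 1)
waveform = smooth - np.polyval(coef, t)

print("recovered tsunami peak ~%.2f cm" % waveform.max())
</code></pre>
<p>The failure mode the authors describe corresponds to replacing the clean drift above with sub-mHz noise: the pre-event fit then produces an artificial trend of the same order as a 1 cm tsunami, which simple filtering cannot remove.</p>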
<li> <p><a href="http://adsabs.harvard.edu/abs/2009AGUFM.U23F..03D"><span>Preliminary assessment of the impacts and effects of the South Pacific tsunami of September 2009 in Samoa</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dominey-Howes, D.</p> <p>2009-12-01</p> <p>The September 2009 tsunami was a regional South Pacific event of enormous significance. Our UNESCO-IOC ITST Samoa survey used a simplified version of a 'coupled human-environment systems framework' (Turner et al., 2003) to investigate the impacts and effects of the tsunami in Samoa. Further, the framework allowed us to identify those factors that affected the vulnerability and resilience of the human-environment system before, during and after the tsunami - a global first.
Key findings (unprocessed) include the following. Maximum run-up exceeded 14 metres above sea level. Maximum inundation at right angles to the shore was approximately 400 metres, and maximum inundation with the wave running parallel to the shore (but inland) exceeded 700 metres. Buildings sustained varying degrees of damage; damage was correlated with the depth and velocity of tsunami flow, the condition of foundations, the quality of building materials and workmanship, adherence to the building code, and so on. Buildings raised even one metre above the surrounding land surface suffered much less damage. Plants, trees and mangroves reduced flow velocity and flow depth, leading to greater chances of human survival and lower levels of building damage. The tsunami has left a clear and distinguishable geological record in the form of sediments deposited in the coastal landscape; this clear sediment layer suggests that older (and prehistoric) tsunamis can be identified, helping to answer questions about the frequency and magnitude of tsunamis. The tsunami caused widespread erosion of the coastal and beach zones, but this damage will repair itself naturally and quickly. The tsunami has had clear impacts on ecosystems and these are highly variable; ecosystems will repair themselves naturally and are unlikely to preserve long-term impacts. It is clear that some plant (tree) species are highly resilient and provided immediate places of safety during the tsunami and resources afterwards. The people of Samoa are forgetting their knowledge of the value and uses of indigenous plant and animal species, and efforts are needed to increase understanding of the value of these plants and animals (thus increasing community resilience). Video recording of survivor stories is important: sadly, there is no tradition of storytelling or memory of past tsunamis, so capturing survivor accounts means that such stories can be introduced into the cultural memory. Permitting survivors to tell their stories allows them to heal emotionally and also provides valuable information for future education and community outreach. The people of Samoa are hurting after the tsunami. Impacts and effects are highly variable socially and spatially; where lives have been lost, the impacts and associated fear are much higher, and communities require practical and long-term emotional care. A complex picture is emerging about community experiences of warning and response behaviour that presents challenges to the Government of Samoa in terms of education and outreach for hazard reduction.</p> </li> <li> <p><a href="http://adsabs.harvard.edu/abs/2010AGUFM.G33A0832T"><span>Comparison of Tsunami Height Distributions of the 1960 and the 2010 Chilean Earthquakes on the Coasts of the Japanese Islands</span></a></p> <p><a href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tsuji, Y.; Takahashi, T.; Imai, K.</p> <p>2010-12-01</p> <p>The tsunami of the Chilean earthquake (Mw 8.8) of February 27, 2010 was detected on the coasts of the Japanese Islands about 23 hours after the occurrence of the main shock. It caused no human casualties. There was slight house damage, mainly in Miyagi Prefecture in the southern part of the Sanriku coast; six houses were flooded above floor level and fifty-one below floor level.
It also caused substantial fishery losses of about 75 million US$, mainly due to the breaking of cultivation rafts. The tsunami of the 1960 Chilean earthquake (Mw 9.5) hit the Japanese coasts far more severely and caused much greater damage than the 2010 tsunami: 142 people were killed, 1,581 houses were entirely destroyed, and 1,256 houses were swept away. Most of the damage occurred in the districts of the Sanriku coast, where inundation heights exceeded six meters at several points. We made a field survey along the Japanese coast, visited the offices of fishermen's cooperatives at over 300 fishing ports, gathered eyewitness accounts, and obtained information on the inundation limit, arrival time, and building and fishery damage. On the basis of the inundation information, we measured tsunami heights and obtained data at more than two hundred points (Tsuji et al., 2010). The distributions of the two tsunamis of the 1960 and the 2010 Chilean earthquakes along the coasts of the Japanese Islands are shown in Fig. 1. The maximum height of 2.2 meters was recorded at Kesennuma City, Miyagi Prefecture. The heights of the 2010 tsunami were generally one third of those of the 1960 tsunami. A prominent peak appears on the Sanriku coast for both tsunamis. In addition, smaller peaks also appear for both tsunamis on the coasts of eastern Hokkaido, near the tip of the Boso Peninsula, near the tip of the Izu Peninsula, on the east coast of the Kii Peninsula, in Tokushima Prefecture in eastern Shikoku, and near Cape Ashizuri in western Shikoku. Fig. 1: Trace height distributions of the tsunamis of the 1960 (red) and the 2010 (blue) Chilean earthquakes along the coasts of the Japanese Islands.</p> </li> </ol>