Science and Engineering of an Operational Tsunami Forecasting System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez, Frank
2009-04-06
After a review of tsunami statistics and the destruction caused by tsunamis, a means of forecasting tsunamis is discussed as part of an overall program of reducing fatalities through hazard assessment, education, training, mitigation, and a tsunami warning system. The forecast is accomplished via a concept called Deep Ocean Assessment and Reporting of Tsunamis (DART). Small changes of pressure at the sea floor are measured and relayed to warning centers. Under development is an international modeling network to transfer, maintain, and improve tsunami forecast models.
Tsunami Forecast Progress Five Years After Indonesian Disaster
NASA Astrophysics Data System (ADS)
Titov, Vasily V.; Bernard, Eddie N.; Weinstein, Stuart A.; Kanoglu, Utku; Synolakis, Costas E.
2010-05-01
Almost five years after the 26 December 2004 Indian Ocean tragedy, tsunami warnings are finally benefiting from decades of research toward effective model-based forecasts. Since the 2004 tsunami, two seminal advances have been (i) deep-ocean tsunami measurements with tsunameters and (ii) their use in accurately forecasting tsunamis after the tsunami has been generated. Using direct measurements of deep-ocean tsunami heights, assimilated into numerical models for specific locations, greatly improves real-time forecast accuracy over earthquake-derived magnitude estimates of tsunami impact. Since 2003, this method has been used to forecast tsunamis at specific harbors for different events in the Pacific and Indian Oceans. Recent tsunamis illustrated how this technology is being adopted in global tsunami warning operations. The U.S. forecasting system was used by both research and operations to evaluate the tsunami hazard. Tests demonstrated the effectiveness of operational tsunami forecasting using real-time deep-ocean data assimilated into forecast models. Several examples also showed the potential of distributed forecast tools. With IOC and USAID funding, NOAA researchers at PMEL developed the Community Model Interface for Tsunami (ComMIT) tool and distributed it through extensive capacity-building sessions in the Indian Ocean. Over a hundred scientists have been trained in tsunami inundation mapping, leading to the first generation of inundation models for many Indian Ocean shorelines. These same inundation models can also be used for real-time tsunami forecasts, as was demonstrated during several events.
Measuring and forecasting great tsunamis by GNSS-based vertical positioning of multiple ships
NASA Astrophysics Data System (ADS)
Inazu, D.; Waseda, T.; Hibiya, T.; Ohta, Y.
2016-12-01
Vertical ship positioning by the Global Navigation Satellite System (GNSS) was investigated for measuring and forecasting great tsunamis. We first examined existing GNSS vertical position data of a navigating vessel. The result indicated that, using the kinematic Precise Point Positioning (PPP) method, tsunamis greater than 10^-1 m can be detected from the vertical position of the ship. Based on Automatic Identification System (AIS) data, tens of cargo ships and tankers are regularly identified navigating over the Nankai Trough, southwest of Japan. We then assumed that a future Nankai Trough great earthquake tsunami would be observed by ships at locations based on AIS data. The tsunami forecast capability of these virtual offshore tsunami measurements was examined. A conventional Green's function based inversion was used to determine the initial tsunami height distribution. Tsunami forecast tests over the Nankai Trough were carried out using simulated tsunami data from the vertical positions of multiple cargo ships/tankers on a certain day, and from the currently operating observations by deep-sea pressure gauges and Global Positioning System (GPS) buoys. The forecast capability of ship-based tsunami height measurements alone was shown to be comparable to or better than that using the existing offshore observations.
Assessment of GNSS-based height data of multiple ships for measuring and forecasting great tsunamis
NASA Astrophysics Data System (ADS)
Inazu, Daisuke; Waseda, Takuji; Hibiya, Toshiyuki; Ohta, Yusaku
2016-12-01
Ship height positioning by the Global Navigation Satellite System (GNSS) was investigated for measuring and forecasting great tsunamis. We first examined GNSS height-positioning data of a navigating vessel. Using the kinematic precise point positioning (PPP) method, tsunamis greater than 10^-1 m can be detected by ship height positioning. Based on Automatic Identification System (AIS) data, we found that tens of cargo ships and tankers can usually be identified navigating over the Nankai Trough, southwest Japan. We assumed that a future Nankai Trough great earthquake tsunami would be observed by the kinematic PPP height positioning of an AIS-derived ship distribution, and examined the tsunami forecast capability of offshore tsunami measurements based on the PPP-derived ship heights. A method that estimates the initial tsunami height distribution from offshore tsunami observations was used for forecasting. Tsunami forecast tests were carried out using tsunami data simulated for the PPP-based heights of 92 cargo ships/tankers, and for the currently operating deep-sea pressure and Global Positioning System (GPS) buoy observations at 71 stations over the Nankai Trough. The forecast capability using the PPP-based heights of the 92 ships was shown to be comparable to or better than that using the operating offshore observatories at the 71 stations. We suppose that, immediately after the occurrence of a great earthquake, shore stations receiving successive ship information (AIS data) along certain stretches of the coast would fail to acquire ship data due to strong ground shaking, especially near the epicenter. Such a situation would significantly degrade the tsunami forecast capability using ship data. On the other hand, operational real-time analysis of seismic/geodetic data would be carried out to estimate a tsunamigenic fault model. Incorporating the seismic/geodetic fault model estimation into the tsunami forecast above could compensate for the degraded forecast capability.
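The Green's-function-based estimation of the initial tsunami height distribution described in the two entries above can be illustrated with a small least-squares sketch. Everything here is hypothetical: the array shapes, the synthetic Green's functions G, the damping value, and the fabricated "observations" stand in for the precomputed waveform library and the PPP-based ship heights used in the study.

```python
import numpy as np

# Hypothetical dimensions: n_obs stacked observation samples (all ships/stations),
# n_src unit patches describing the initial sea-surface displacement field.
n_obs, n_src = 600, 40

# G[i, j]: waveform sample i produced by unit source j, normally precomputed
# with a tsunami propagation model (random numbers here, for illustration only).
rng = np.random.default_rng(0)
G = rng.normal(size=(n_obs, n_src))

# d: stacked "observed" offshore waveforms (ship GNSS heights or pressure gauges).
true_amp = np.zeros(n_src)
true_amp[10:15] = 1.0
d = G @ true_amp + 0.05 * rng.normal(size=n_obs)

# Damped least-squares inversion for the unit-source amplitudes.
damping = 0.1
A = np.vstack([G, damping * np.eye(n_src)])
b = np.concatenate([d, np.zeros(n_src)])
amp, *_ = np.linalg.lstsq(A, b, rcond=None)

# The estimated amplitudes scale the unit initial-height patches; their sum
# gives the initial tsunami height distribution used to run the forecast.
print("recovered amplitudes (sources 10-14):", np.round(amp[10:15], 2))
```

The same structure applies whether the rows of d come from ship heights, pressure gauges, or GPS buoys; only the Green's functions change.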
NASA Astrophysics Data System (ADS)
Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.
2015-12-01
In the 2011 Tohoku earthquake, in which the huge tsunami claimed many lives, the initial tsunami forecast, based on hypocenter information estimated from seismic data on land, was greatly underestimated. From this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean-bottom observatories with seismometers and pressure gauges (tsunami meters) linked by fiber-optic cables. To take full advantage of S-net, we are developing a new methodology for real-time tsunami inundation forecasting using ocean-bottom observation data and constructing a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation cost is rather heavy. We prepare a tsunami scenario bank in advance by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights, and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we construct a 10-m-mesh precise elevation model including coastal structures. Based on sensitivity analyses, we construct a tsunami scenario bank that efficiently covers possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the scenario bank several possible scenarios that can well explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data, without any source information, which may have large estimation errors. In addition to the forecast system, we are developing Web services, APIs, and smartphone applications, refining them through social experiments, to provide real-time tsunami observation and forecast information in an easy-to-understand way that urges people to evacuate.
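A minimal sketch of the database-driven selection step described above, assuming a hypothetical scenario bank held in memory as a dictionary of precomputed S-net waveforms and a simple normalized RMS misfit; the actual system uses the multi-index criteria of Yamamoto et al. (2016), and the scenario IDs and thresholds below are invented for illustration.

```python
import numpy as np

def select_scenarios(observed, scenario_bank, max_misfit=0.2, n_keep=5):
    """Return scenario IDs whose precomputed S-net waveforms best explain
    the observed records (normalized RMS misfit below max_misfit).

    observed:      array (n_stations, n_samples) of filtered pressure data
    scenario_bank: dict {scenario_id: array of the same shape, precomputed}
    """
    scores = []
    norm = np.sqrt(np.mean(observed ** 2)) + 1e-12
    for sid, synth in scenario_bank.items():
        misfit = np.sqrt(np.mean((observed - synth) ** 2)) / norm
        if misfit <= max_misfit:
            scores.append((misfit, sid))
    # The precomputed coastal heights and inundation attached to the selected
    # scenarios are then looked up instead of being simulated in real time.
    return [sid for _, sid in sorted(scores)[:n_keep]]

# Toy usage with two fabricated scenarios
obs = np.sin(np.linspace(0, 6, 200))[None, :].repeat(3, axis=0)
bank = {"Mw8.2_off_Chiba": obs * 1.05, "Mw7.6_outer_rise": obs * 0.3}
print(select_scenarios(obs, bank))
```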
Tsunami Forecasting and Monitoring in New Zealand
NASA Astrophysics Data System (ADS)
Power, William; Gale, Nora
2011-06-01
New Zealand is exposed to tsunami threats from several sources that vary significantly in their potential impact and travel time. One route to reducing the risk from these tsunami sources is to provide advance warning based on forecasting and monitoring of events in progress. In this paper the National Tsunami Warning System framework, including the responsibilities of key organisations and the procedures that they follow in the event of a tsunami threatening New Zealand, is summarised. A method for forecasting threat levels based on tsunami models is presented, similar in many respects to that developed for Australia by Allen and Greenslade (Nat Hazards 46:35-52, 2008), along with a simple system for easy access to the threat-level forecasts using a clickable pdf file. Once a tsunami enters or initiates within New Zealand waters, its progress and evolution can be monitored in real time using a newly established network of online tsunami gauge sensors placed at strategic locations around the New Zealand coast and offshore islands. Information from these gauges can be used to validate and revise forecasts, and to assist in making the all-clear decision.
Test operation of a real-time tsunami inundation forecast system using actual data observed by S-net
NASA Astrophysics Data System (ADS)
Suzuki, W.; Yamamoto, N.; Miyoshi, T.; Aoi, S.
2017-12-01
If tsunami inundation information can be forecast rapidly and reliably before a large tsunami strikes, it would effectively help people realize the impending danger and the necessity of evacuation. Toward that goal, we have developed a prototype system to perform real-time tsunami inundation forecasts for Chiba prefecture, eastern Japan, using offshore ocean-bottom pressure data observed by the Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench (S-net) (Aoi et al., 2015, AGU). Because tsunami inundation simulation requires a large computational cost, we employ a database approach, searching for pre-calculated tsunami scenarios that reasonably explain the observed S-net pressure data based on the multi-index method (Yamamoto et al., 2016, EPS). The scenario search is repeated regularly, not triggered by the occurrence of a tsunami event, and the forecast information is generated from the selected scenarios that meet the criterion. Test operation of the prototype system using actual observation data started in April 2017, and the performance and behavior of the system during non-tsunami periods have been examined. We find that the treatment of noise affecting the observed data is the main issue to be solved to improve the system. Even when the observed pressure data are filtered to extract tsunami signals, ordinary background noise or unusually large noise, such as high ocean waves due to storms, affects the comparison between the observed and scenario data. Because of this noise, tsunami scenarios are sometimes selected and a tsunami is forecast even though no tsunami event has actually occurred. In most cases, the scenarios selected because of noise have fault models in the regions along the Kurile or Izu-Bonin Trenches, far from the S-net region, or fault models below land. Based on parallel operation of the forecast system with different scenario-search conditions and examination of the fault models, we improve the stability and performance of the forecast system. This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
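As a rough illustration of the noise problem discussed above, the sketch below band-passes a bottom-pressure record to the tsunami band and requires a minimum amplitude on several stations before scenario selection is allowed. The corner periods, amplitude threshold, and station count are placeholders, not the operational settings of the prototype system.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def tsunami_band(pressure_m, fs=1.0, t_short=120.0, t_long=3 * 3600.0):
    """Band-pass a bottom-pressure record (converted to metres of water) to
    the tsunami band, periods between roughly 2 minutes and 3 hours, so that
    tides and shorter-period acoustic/seismic disturbances are suppressed."""
    sos = butter(4, [1.0 / t_long, 1.0 / t_short], btype="band", fs=fs,
                 output="sos")
    return sosfiltfilt(sos, pressure_m)

def is_trigger(filtered, threshold_m=0.02, n_stations_required=3):
    """Allow scenario selection only when several stations exceed a minimum
    filtered amplitude at the same time, to reduce false forecasts caused by
    storm noise or sources far from the S-net region (values illustrative)."""
    exceed = (np.abs(filtered) > threshold_m).any(axis=1)
    return int(exceed.sum()) >= n_stations_required

# Toy check: 6 stations of small random noise only, so no trigger is expected
rng = np.random.default_rng(1)
records = 0.005 * rng.normal(size=(6, 6 * 3600))   # 6 hours at 1 Hz
filtered = np.vstack([tsunami_band(r) for r in records])
print(is_trigger(filtered))
```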
NASA Astrophysics Data System (ADS)
Tanioka, Yuichiro
2017-04-01
After the tsunami disaster caused by the 2011 Tohoku-oki great earthquake, improvement of tsunami forecasting has been an urgent issue in Japan. The National Research Institute for Earth Science and Disaster Prevention is installing a cabled network system for earthquake and tsunami observation (S-net) on the ocean bottom along the Japan and Kurile trenches. This cable system includes 125 pressure sensors (tsunami meters) separated by about 30 km. Along the Nankai trough, JAMSTEC has already installed and operates cabled networks of seismometers and pressure sensors (DONET and DONET2). These are the densest observation networks deployed on top of the source areas of great underthrust earthquakes anywhere in the world. Real-time tsunami forecasting has traditionally depended on estimates of earthquake parameters, such as the epicenter, depth, and magnitude of the earthquake. Recently, forecast methods have been developed that estimate the tsunami source from tsunami waveforms observed at ocean-bottom pressure sensors. However, when many pressure sensors separated by about 30 km sit on top of the source area, we do not need to estimate the tsunami source or the earthquake source to compute the tsunami. Instead, we can initiate a tsunami simulation directly from those dense tsunami observations. Observed tsunami height differences over a time interval at ocean-bottom pressure sensors separated by 30 km are used to estimate the tsunami height distribution at a particular time. In our new method, a tsunami numerical simulation is initiated from this estimated tsunami height distribution. In this paper, the above method is improved and applied to the tsunami generated by the 2011 Tohoku-oki great earthquake. The tsunami source model of the 2011 Tohoku-oki great earthquake estimated by Gusman et al. (2012) from observed tsunami waveforms, coseismic deformation observed by GPS, and ocean-bottom sensors is used in this study. The ocean-surface deformation is computed from the source model and used as the initial condition of a tsunami simulation. Assuming that this computed tsunami is a real tsunami observed at ocean-bottom sensors, a new tsunami simulation is carried out using the above method. Stations in the assumed distribution (each separated by 15 arc-minutes, about 30 km) record tsunami waveforms that were actually computed from the source model. Tsunami height distributions are estimated with the above method at 40, 80, and 120 seconds after the origin time of the earthquake. The near-field tsunami inundation forecast method (Gusman et al. 2014) is then used to estimate the tsunami inundation along the Sanriku coast. The results show that the observed tsunami inundation is well explained by the estimated inundation, and that it takes about 10 minutes from the origin time of the earthquake to estimate the tsunami inundation. The new method developed in this paper is very effective for real-time tsunami forecasting.
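A simplified sketch of the initialization idea described above: once tsunami heights have been estimated at the dense ocean-bottom sensor locations, they are interpolated onto the model grid and used directly as the initial sea-surface elevation, with no earthquake source estimation. The function name, grid spacing, and toy sensor array are illustrative only.

```python
import numpy as np
from scipy.interpolate import griddata

def initial_surface_from_sensors(sensor_lon, sensor_lat, eta_obs,
                                 grid_lon, grid_lat):
    """Interpolate tsunami heights estimated at dense ocean-bottom sensors
    (roughly 30 km apart) onto the model grid, to serve directly as the
    initial sea-surface elevation of a tsunami simulation."""
    pts = np.column_stack([sensor_lon, sensor_lat])
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    eta0 = griddata(pts, eta_obs, (glon, glat), method="cubic")
    return np.nan_to_num(eta0)   # zero outside the sensor array

# Toy usage: 5 x 5 sensor array, 0.25-degree model grid
slon, slat = np.meshgrid(np.linspace(142, 144, 5), np.linspace(37, 39, 5))
eta = np.exp(-((slon - 143) ** 2 + (slat - 38) ** 2))      # ~1 m bump
eta0 = initial_surface_from_sensors(slon.ravel(), slat.ravel(), eta.ravel(),
                                    np.arange(141.5, 144.5, 0.25),
                                    np.arange(36.5, 39.5, 0.25))
print(eta0.shape, round(float(eta0.max()), 2))
```

The simulation is then stepped forward from eta0 (with zero initial velocity), which is the sense in which the forecast needs no source model at all.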
NASA Astrophysics Data System (ADS)
Bernard, Eddie; Wei, Yong; Tang, Liujuan; Titov, Vasily
2014-12-01
Following the devastating 11 March 2011 tsunami, two Deep-ocean Assessment and Reporting of Tsunamis (DART®) stations (DART® and the DART® logo are registered trademarks of the National Oceanic and Atmospheric Administration, used with permission) were deployed in Japanese waters by the Japan Meteorological Agency. Two weeks after deployment, on 7 December 2012, a Mw 7.3 earthquake off Japan's Pacific coastline generated a tsunami. The tsunami was recorded at the two Japanese DARTs as early as 11 minutes after the earthquake origin time, the fastest tsunami detection time at a DART station to date. These data, along with those recorded at other DARTs, were used to derive a tsunami source using the National Oceanic and Atmospheric Administration tsunami forecast system. The results of our analysis show that data provided by the two near-field Japanese DARTs improve not only the forecast speed but also the forecast accuracy at Japanese tide gauge stations. This study provides important guidelines for early detection and forecasting of local tsunamis.
Non-seismic tsunamis: filling the forecast gap
NASA Astrophysics Data System (ADS)
Moore, C. W.; Titov, V. V.; Spillane, M. C.
2015-12-01
Earthquakes are the generation mechanism of over 85% of tsunamis. However, non-seismic tsunamis, including those generated by meteorological events, landslides, volcanoes, and asteroid impacts, can inundate significant areas and have large far-field effects. The current National Oceanic and Atmospheric Administration (NOAA) tsunami forecast system falls short in detecting these phenomena. This study attempts to classify the range of effects possible from these non-seismic threats and to investigate detection methods appropriate for use in a forecast system. Typical observation platforms are assessed, including DART bottom pressure recorders and tide gauges. Other detection paths include atmospheric pressure anomaly algorithms for detecting meteotsunamis and the early identification of asteroids large enough to produce a regional hazard. Real-time assessment of observations for forecast use can provide guidance to mitigate the effects of a non-seismic tsunami.
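One of the detection paths mentioned above, an atmospheric pressure-anomaly check for meteotsunamis, can be sketched as a simple sliding-window pressure-jump detector. The window length and the 1.5 hPa threshold are placeholders chosen for illustration, not values from the NOAA system.

```python
import numpy as np

def pressure_jump_alerts(p_hpa, t_s, window_s=600.0, threshold_hpa=1.5):
    """Flag times at which surface pressure changes by more than
    threshold_hpa within a sliding window, a crude meteotsunami-precursor
    check. p_hpa is a barometer record in hPa, t_s the sample times in s."""
    alerts = []
    j = 0
    for i in range(len(t_s)):
        while t_s[i] - t_s[j] > window_s:
            j += 1
        seg = p_hpa[j:i + 1]
        if seg.max() - seg.min() >= threshold_hpa:
            alerts.append(t_s[i])
    return alerts

# Toy record: a 2 hPa jump over 5 minutes embedded in a smooth trend
t = np.arange(0, 7200, 60.0)
p = 1013 + 0.0005 * t
p[60:65] += np.linspace(0, 2.0, 5)
print(len(pressure_jump_alerts(p, t)) > 0)
```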
NASA Astrophysics Data System (ADS)
Tang, L.; Titov, V. V.; Chamberlin, C. D.
2009-12-01
The study describes the development, testing, and application of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of nearshore tsunami wave characteristics and inundation for a range of model grid setups, resolutions, and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili, are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m, the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7, and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment study indicates that use of a seismic magnitude alone for tsunami source assessment is inadequate to achieve such accuracy in tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a tsunami magnitude constrained by Deep-ocean Assessment and Reporting of Tsunamis (DART) data with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep-ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests that the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.
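The accuracy criteria quoted above translate directly into simple skill checks. The sketch below is one possible reading of those criteria (relative error for observations above 0.5 m, absolute error below, arrival-time error as a fraction of travel time); the function names are ours, not part of the forecast system.

```python
def amplitude_skill(obs_max_m, model_max_m):
    """Return (metric_name, value, passes) following the criteria quoted
    above: relative error for observations above 0.5 m (accuracy > 80%
    means relative error < 20%), absolute error below 0.3 m otherwise."""
    if obs_max_m > 0.5:
        rel_err = abs(model_max_m - obs_max_m) / obs_max_m
        return "relative_error", rel_err, rel_err < 0.2
    abs_err = abs(model_max_m - obs_max_m)
    return "absolute_error", abs_err, abs_err < 0.3

def arrival_skill(obs_arrival_s, model_arrival_s, travel_time_s):
    """First-peak arrival-time error expressed as a fraction of travel time
    (the abstract cites errors within 3%)."""
    frac = abs(model_arrival_s - obs_arrival_s) / travel_time_s
    return frac, frac < 0.03

print(amplitude_skill(1.2, 1.05))          # ('relative_error', 0.125, True)
print(arrival_skill(21600, 21950, 21600))  # (~0.016, True)
```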
A Comparison Study of Two Numerical Tsunami Forecasting Systems
NASA Astrophysics Data System (ADS)
Greenslade, Diana J. M.; Titov, Vasily V.
2008-12-01
This paper presents a comparison of two tsunami forecasting systems: the NOAA/PMEL system (SIFT) and the Australian Bureau of Meteorology system (T1). Both of these systems are based on a tsunami scenario database and both use the same numerical model. However, there are some major differences in the way in which the scenarios are constructed and in the implementation of the systems. Two tsunami events are considered here: Tonga 2006 and Sumatra 2007. The results show that there are some differences in the distribution of maximum wave amplitude, particularly for the Tonga event; however, both systems compare well with the available tsunameter observations. To assess differences in coastal amplitude predictions, the offshore forecast results from both systems were used as boundary conditions for a high-resolution model of Hilo, Hawaii. The minor differences seen between the two systems in deep water become considerably smaller at the tide gauge, and both systems compare very well with the observations.
NASA Astrophysics Data System (ADS)
Yamamoto, N.; Aoi, S.; Suzuki, W.; Hirata, K.; Takahashi, N.; Kunugi, T.; Nakamura, H.
2016-12-01
We have launched a new project to develop a real-time tsunami inundation forecast system for the Pacific coast of Chiba prefecture (Kujukuri-Sotobo region), Japan (Aoi et al., 2015, AGU). In this study, we design a database-driven real-time tsunami forecast system using the multi-index method (Yamamoto et al., 2016, EPS) and implement a prototype system. In the previous study (Yamamoto et al., 2015, AGU), we assumed that the origin time of the tsunami was known before a forecast based on comparing observed and calculated ocean-bottom pressure waveforms stored in the Tsunami Scenario Bank (TSB). As shown in the figure, we assume the scenario origin times by defining the scenario elapsed time τp used to compare observed and calculated waveforms. In this design, when several appropriate tsunami scenarios are selected by multiple indices (two variance reductions and a correlation coefficient), the system can make a tsunami forecast for the target coastal region using the selected tsunami scenarios, without any triggering information derived from observed seismic and/or tsunami data. In addition, we define the time range Tq shown in the figure for masking perturbations contaminated by ocean-acoustic and seismic waves in the observed pressure records (Saito, 2015, JpGU). Following the proposed design, we implement a prototype real-time tsunami inundation forecast system dedicated to the target coastal region, using ocean-bottom pressure data from the Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench (S-net) (Kanazawa et al., 2012, JpGU; Uehira et al., 2015, IUGG), which is constructed by the National Research Institute for Earth Science and Disaster Resilience (NIED). For the prototype system, we construct a prototype TSB using interplate earthquake fault models located along the Japan Trench (Mw 7.6-9.8), the Sagami Trough (Mw 7.6-8.6), and the Nankai Trough (Mw 7.6-8.6), as well as intraplate earthquake fault models (Mw 7.6-8.6) within the subducting Pacific plate, which could affect the target coastal region. This work was partially supported by the Council for Science, Technology and Innovation (CSTI) through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
Tsunami: ocean dynamo generator.
Sugioka, Hiroko; Hamano, Yozo; Baba, Kiyoshi; Kasaya, Takafumi; Tada, Noriko; Suetsugu, Daisuke
2014-01-08
Secondary magnetic fields are induced by the flow of electrically conducting seawater through the Earth's primary magnetic field (the 'ocean dynamo effect'), and hence it has long been speculated that tsunami flows should produce measurable magnetic field perturbations, although the signal-to-noise ratio would be small because of the influence of solar magnetic fields. Here, we report on the detection of deep-seafloor electromagnetic perturbations of 10-micron order induced by a tsunami that propagated through a seafloor electromagnetometer array network. Tsunami characteristics extracted from the observed data, including the direction and velocity of propagation as well as the sea-level change, verify the induction theory for the first time. At present, offshore observation systems for early tsunami forecasting are based on sea-level measurements by seafloor pressure gauges. In terms of tsunami forecasting accuracy, the integration of vectored electromagnetic measurements into existing scalar observation systems would represent a substantial improvement in the performance of tsunami early-warning systems.
NASA Astrophysics Data System (ADS)
Jamelot, Anthony; Reymond, Dominique; Savigny, Jonathan; Hyvernaud, Olivier
2016-04-01
The tsunami generated by the earthquake of magnitude Mw = 8.2 near the coast of central Chile on 16 September 2015 was observed on 7 tide gauges distributed over the five archipelagoes composing French Polynesia, a territory as large as Europe. We summarize the observations of the tsunami and the field surveys done in Tahiti (Society Islands) and Hiva-Oa (Marquesas Islands) to evaluate the preliminary tsunami forecast tool (MERIT) and the detailed tsunami forecast tool (COASTER) of the French Polynesian Tsunami Warning Center. The preliminary tool forecast a maximum tsunami height between 0.5 m and 2.3 m over the Marquesas Islands, but only the island of Hiva-Oa had a tsunami forecast greater than 1 m, especially in Tahauku Bay, well known for its local response due to its resonance properties. In Tahauku Bay, the tide gauge located at the entrance of the bay recorded a maximum tsunami height above mean sea level of about 1.7 m; we measured at the head of the bay a run-up of about 2.8 m at 388 m inland from the shoreline in the river bed, and a run-up of 2.5 m located 155 m inland. The multi-grid simulation over Tahiti was run one hour after the origin time of the earthquake and gave a very localized tsunami impact on the north shore. Our forecast indicated inundation of about 10 m inland, which led the civil authorities to evacuate 6 houses. This was the first operational use of this new fine grid covering the northern part of Tahiti, which is not protected by a coral reef, so we paid close attention to the feedback from the alert, which confirmed the forecast that the maximum height would arrive 1 hour after the first arrival. The tsunami warning system forecasts strong as well as low impacts well, as long as we have an early, robust description of the seismic parameters and fine grids of about 10 m spatial resolution to simulate the tsunami impact. As of January 2016, we are able to forecast tsunami heights for 72 points located over 35 islands of French Polynesia.
U.S. Tsunami Warning System: Advancements since the 2004 Indian Ocean Tsunami (Invited)
NASA Astrophysics Data System (ADS)
Whitmore, P.
2009-12-01
The U.S. government embarked on a strengthening program for the U.S. Tsunami Warning System (TWS) in the aftermath of the disastrous 2004 Indian Ocean tsunami. The program was designed to improve several facets of the U.S. TWS, including: upgrade of the coastal sea level network - 16 new stations plus higher transmission rates; expansion of the deep ocean tsunameter network - 7 sites increased to 39; upgrade of seismic networks - both USGS and Tsunami Warning Center (TWC); increase of TWC staff to allow 24x7 coverage at two centers; development of an improved tsunami forecast system; increased preparedness in coastal communities; expansion of the Pacific Tsunami Warning Center facility; and improvement of the tsunami data archive effort at the National Geophysical Data Center. The strengthening program has been completed and has contributed to the many improvements attained in the U.S. TWS since 2004. Some of the more significant enhancements to the program are: the number of sea level and seismic sites worldwide available to the TWCs has more than doubled; the TWC areas-of-responsibility expanded to include the U.S./Canadian Atlantic coasts, Indian Ocean, Caribbean Sea, Gulf of Mexico, and U.S. Arctic coast; event response time decreased by approximately one-half; product accuracy has improved; a tsunami forecast system developed by NOAA capable of forecasting inundation during an event has been delivered to the TWCs; warning areas are now defined by pre-computed or forecasted threat versus distance or travel time, significantly reducing the amount of coast put in a warning; new warning dissemination techniques have been implemented to reach a broader audience in less time; tsunami product content better reflects the expected impact level; the number of TsunamiReady communities has quadrupled; and the historical data archive has increased in quantity and accuracy. In addition to the strengthening program, the U.S. National Tsunami Hazard Mitigation Program (NTHMP) has expanded its efforts since 2004 and improved tsunami preparedness throughout U.S. coastal communities. The NTHMP is a partnership of federal agencies and state tsunami response agencies whose efforts include: development of inundation and evacuation maps for most highly threatened communities; tsunami evacuation and educational signage for coastal communities; support for tsunami educational, awareness and planning seminars; increased number of local tsunami warning dissemination devices such as sirens; and support for regional tsunami exercises. These activities are major factors that have contributed to the increase of TsunamiReady communities throughout the country.
A Pilot Tsunami Inundation Forecast System for Australia
NASA Astrophysics Data System (ADS)
Allen, Stewart C. R.; Greenslade, Diana J. M.
2016-12-01
The Joint Australian Tsunami Warning Centre (JATWC) provides a tsunami warning service for Australia. Warnings are currently issued according to a technique that does not include explicit modelling at the coastline, including any potential coastal inundation. This paper investigates the feasibility of developing and implementing tsunami inundation modelling as part of the JATWC warning system. An inundation model was developed for a site in Southeast Australia, on the basis of the availability of bathymetric and topographic data and observations of past tsunamis. The model was forced using data from T2, the operational deep-water tsunami scenario database currently used for generating warnings. The model was evaluated not only for its accuracy but also for its computational speed, particularly with respect to operational applications. Limitations of the proposed forecast processes in the Australian context and areas requiring future improvement are discussed.
The FASTER Approach: A New Tool for Calculating Real-Time Tsunami Flood Hazards
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Cross, A.; Johnson, L.; Miller, K.; Nicolini, T.; Whitmore, P.
2014-12-01
In the aftermath of the 2010 Chile and 2011 Japan tsunamis that struck the California coastline, emergency managers requested that the state tsunami program provide more detailed information about the flood potential of distant-source tsunamis well ahead of their arrival time. The main issue is that existing tsunami evacuation plans call for evacuation of the predetermined "worst-case" tsunami evacuation zone (typically at a 30- to 50-foot elevation) during any "Warning"-level event; the alternative is to not call an evacuation at all. A solution providing more detailed information for secondary evacuation zones has been the development of tsunami evacuation "playbooks" that plan for tsunami scenarios of various sizes and source locations. To determine a recommended level of evacuation during a distant-source tsunami, an analytical tool has been developed called the "FASTER" approach, an acronym for the factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and the Run-up potential. Within the first couple of hours after a tsunami is generated, the National Tsunami Warning Center provides tsunami forecast amplitudes and arrival times for approximately 60 coastal locations in California. At the same time, the regional NOAA Weather Forecast Offices in the state calculate the forecast coastal storm and tidal conditions that will influence tsunami flooding. To provide added conservatism in calculating tsunami flood potential, we include an error factor of 30% of the forecast amplitude, based on observed forecast errors during recent events, and a site-specific run-up factor calculated from the existing state tsunami modeling database. The factors are added together into a cumulative FASTER flood-potential value for the first five hours of tsunami activity and used to select the appropriate tsunami-phase evacuation "playbook", which is provided to each coastal community shortly after the forecast is issued.
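A hedged sketch of how the FASTER components might be combined, assuming the error term is 30% of the forecast amplitude and the run-up term is a site-specific multiple of it; the exact operational formula, the hourly inputs, and the run-up factor below are illustrative, not the values used by the state program.

```python
def faster_flood_potential(forecast_amp_m, storm_surge_m, tide_m,
                           runup_factor=1.0, error_frac=0.30):
    """Combine the FASTER components into a flood-potential elevation:
    Forecast Amplitude + Storm + Tide + Error (30% of amplitude) + Run-up.
    runup_factor scales the forecast amplitude using site-specific modeling;
    the operational formulation may differ from this illustration."""
    error_m = error_frac * forecast_amp_m
    runup_m = runup_factor * forecast_amp_m
    return forecast_amp_m + storm_surge_m + tide_m + error_m + runup_m

# Hourly inputs for the first five hours of tsunami activity (illustrative)
amps  = [0.8, 1.1, 0.9, 0.7, 0.6]   # forecast amplitudes (m)
storm = [0.2] * 5                   # coastal storm setup (m)
tides = [0.5, 0.9, 1.2, 0.9, 0.4]   # predicted tide above MLLW (m)

hourly = [faster_flood_potential(a, s, t, runup_factor=0.5)
          for a, s, t in zip(amps, storm, tides)]
print("max FASTER flood potential (m):", round(max(hourly), 2))
```

The resulting elevation is then compared against the playbook tiers to pick the evacuation zone, which is the decision step the abstract describes.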
Real-time forecasting of the April 11, 2012 Sumatra tsunami
Wang, Dailin; Becker, Nathan C.; Walsh, David; Fryer, Gerard J.; Weinstein, Stuart A.; McCreery, Charles S.
2012-01-01
The April 11, 2012, magnitude 8.6 earthquake off the northern coast of Sumatra generated a tsunami that was recorded at sea-level stations as far as 4800 km from the epicenter and at four ocean-bottom pressure sensors (DARTs) in the Indian Ocean. The governments of India, Indonesia, Sri Lanka, Thailand, and the Maldives issued tsunami warnings for their coastlines. The United States' Pacific Tsunami Warning Center (PTWC) issued an Indian Ocean-wide Tsunami Watch Bulletin in its role as an Interim Service Provider for the region. Using an experimental real-time tsunami forecast model (RIFT), PTWC produced a series of tsunami forecasts during the event that were based on rapidly derived earthquake parameters, including initial location and Mwp magnitude estimates and the W-phase centroid moment tensor solutions (W-phase CMTs) obtained at PTWC and at the U.S. Geological Survey (USGS). We discuss the real-time forecast methodology and how successive real-time tsunami forecasts using the latest W-phase CMT solutions improved the accuracy of the forecast.
Sources of information for tsunami forecasting in New Zealand
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Ristau, J. P.; D'Anastasio, E.; Wang, X.
2013-12-01
Tsunami science has evolved considerably in the last two decades due to technological advancements, which have also helped push for better numerical modelling of the tsunami phases (generation to inundation). The deployment of DART buoys has also been a considerable milestone in tsunami forecasting. Tsunami forecasting is one of the areas that tsunami modelling feeds into and is related to response, preparedness and planning. Usually tsunami forecasting refers to short-term forecasting that takes place in real time after a tsunami has, or appears to have, been generated. In this report we refer to all types of forecasting (short-term or long-term) related to work done in advance of a tsunami impacting a coastline that would help in response, planning or preparedness. We look at the standard types of data (seismic, GPS, water level) that are available in New Zealand for tsunami forecasting, how they are currently being used and other ways to use these data, and provide recommendations for better utilisation. The main findings are: - Current investigations of the use of seismic parameters quickly obtained after an earthquake have the potential to provide critical information about the tsunamigenic potential of earthquakes; further analysis of the most promising methods should be undertaken to determine a path to full implementation. - Network communication of the largest part of the GPS network is not currently at a stage that can provide sufficient data early enough for tsunami warning; it is believed to have potential, but changes, including data transmission improvements, may have to happen before real-time processing oriented to tsunami early warning is implemented on the data currently provided. - Tide gauge data are currently under-utilised for tsunami forecasting; spectral analysis, modal analysis based on identified modes, and arrival times extracted from the records can be useful in forecasting. - The current study is by no means exhaustive of the ways the different types of data can be used; we present only an overview of what can be done. More extensive studies with each of the types of data collected by GeoNet and other relevant networks will help improve tsunami forecasting in New Zealand.
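As a small example of the under-utilised tide-gauge analysis recommended above, the sketch below estimates the dominant oscillation periods of a gauge record from a windowed power spectrum; the sampling interval, record length, and function name are assumptions for illustration.

```python
import numpy as np

def dominant_periods(eta_m, dt_s=60.0, n_peaks=3):
    """Return the n_peaks highest-energy periods (in minutes) of a de-meaned
    tide-gauge record, a crude stand-in for the spectral/modal analysis
    suggested above. eta_m is sampled every dt_s seconds."""
    eta = np.asarray(eta_m) - np.mean(eta_m)
    spec = np.abs(np.fft.rfft(eta * np.hanning(len(eta)))) ** 2
    freq = np.fft.rfftfreq(len(eta), d=dt_s)
    order = np.argsort(spec[1:])[::-1] + 1     # skip the zero frequency
    return [1.0 / freq[i] / 60.0 for i in order[:n_peaks]]

# Toy record: a 20-minute harbor mode plus noise, 1-minute sampling for 12 h
t = np.arange(0, 12 * 3600, 60.0)
eta = 0.3 * np.sin(2 * np.pi * t / (20 * 60)) + 0.05 * np.random.randn(t.size)
print([round(p, 1) for p in dominant_periods(eta)])
```

The identified modes (here roughly 20 minutes) are what a forecaster would compare against expected tsunami periods when judging likely amplification at that gauge.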
NASA Astrophysics Data System (ADS)
Sterling, K.; Denbo, D. W.; Eble, M. C.
2016-12-01
Short-term Inundation Forecasting for Tsunamis (SIFT) software was developed by NOAA's Pacific Marine Environmental Laboratory (PMEL) for use in tsunami forecasting and has been used by both U.S. Tsunami Warning Centers (TWCs) since 2012, when SIFTv3.1 was operationally accepted. Since then, advancements in research and modeling have resulted in several new features being incorporated into SIFT forecasting. Following the priorities and needs of the TWCs, upgrades were implemented in SIFTv4.0, scheduled to become operational in October 2016. Because every minute counts in the early warning process, two major time-saving features were implemented in SIFTv4.0. To increase processing speeds and generate high-resolution flooding forecasts more quickly, the tsunami propagation and inundation codes were modified to run on Graphics Processing Units (GPUs). To reduce the time demand on duty scientists during an event, an automated DART inversion (or fitting) process was implemented. To increase forecasting accuracy, the forecast amplitudes and inundations were adjusted to include dynamic tidal oscillations, thereby reducing the over-estimates of flooding common in SIFTv3.1 due to the static tide stage conservatively set at Mean High Water. Further improvements to forecasts were gained through the assimilation of additional real-time observations. Cabled-array measurements from Bottom Pressure Recorders (BPRs) in the Ocean Networks Canada NEPTUNE network are now available to SIFT for use in the inversion process. To better meet the needs of harbor masters and emergency managers, SIFTv4.0 adds a tsunami-currents graphical product to the suite of disseminated forecast results. When delivered, these new features in SIFTv4.0 will improve operational tsunami forecasting speed, accuracy, and capabilities at NOAA's Tsunami Warning Centers.
Modeling influence of tide stages on forecasts of the 2010 Chilean tsunami
NASA Astrophysics Data System (ADS)
Uslu, B. U.; Chamberlin, C.; Walsh, D.; Eble, M. C.
2010-12-01
The impact of the 2010 Chilean tsunami is studied using the NOAA high-resolution tsunami forecast model, augmented to include modeled tide heights in addition to deep-water tsunami propagation as boundary-condition input. The Chilean tsunami was observed at the Los Angeles tide station at mean low water, at Hilo at low tide, at Pago Pago at mid tide, and at Wake Island near high tide. Because the tsunami arrived at coastal communities at a representative variety of tide stages, the 2010 Chile tsunami provides an opportunity to study tsunami impacts on different communities at different tide levels. The current forecast models are computed with a constant tidal stage, and this study evaluates techniques for adding a varying predicted tidal component in a forecasting context. Computed wave amplitudes, currents, and flooding are compared at locations around the Pacific, and the difference in tsunami impact due to tidal stage is studied. This study focuses on how tsunami impacts vary with tide level, and helps us understand how the inclusion of tidal components can improve real-time forecast accuracy.
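A minimal sketch of the tide-stage question studied above: superpose a forecast tsunami time series on a predicted tide at different arrival phases and compare the maximum total water level with a static-tide assumption. The simple linear superposition, the M2-like tide, and the synthetic tsunami series are illustrative simplifications of what the forecast model actually does.

```python
import numpy as np

def max_total_level(tsunami_eta, tide_eta):
    """Maximum water level when the forecast tsunami series is simply
    superposed on the predicted tide (neglecting nonlinear tide-tsunami
    interaction, which a full forecast model would capture)."""
    return float(np.max(tsunami_eta + tide_eta))

t = np.arange(0, 12 * 3600, 60.0)                      # 12 h, 1-min step
tsunami = 0.8 * np.exp(-t / 1.5e4) * np.sin(2 * np.pi * t / 1800.0)
mhw = 1.0                                              # assumed MHW height (m)

# Emulate arrival at low, mid and high tide by shifting the tide phase
for label, phase_h in [("low", -3.1), ("mid", 0.0), ("high", 3.1)]:
    tide = mhw * np.sin(2 * np.pi * (t / 3600.0 + phase_h) / 12.42)
    print(label, "tide arrival:", round(max_total_level(tsunami, tide), 2), "m")

print("static MHW assumption:", round(float(tsunami.max()) + mhw, 2), "m")
```

The gap between the phase-dependent maxima and the static Mean High Water value is the over-estimate that a dynamic-tide forecast is meant to remove.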
Development of Parallel Code for the Alaska Tsunami Forecast Model
NASA Astrophysics Data System (ADS)
Bahng, B.; Knight, W. R.; Whitmore, P.
2014-12-01
The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed before alerts, and are accessed and calibrated with observations during tsunamis to immediately produce forecasts. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communications between domains of each parent-child pair as waves get closer to coastal waters. Even with the pre-computation the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest resolution Digital Elevation Models (DEM) used by ATFM are 1/3 arc-seconds. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results with the long term aim of tsunami forecasts from source to high resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs; and, will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.
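ATFM itself solves the non-linear shallow-water equations on nested grids; as a much-simplified picture of the kind of stencil update that a parallelization effort targets, the sketch below advances the 1-D linear shallow-water equations on a staggered grid. The vectorized slice operations are exactly the loops that a parallel code would split across processors with halo exchanges; all names and grid parameters are illustrative.

```python
import numpy as np

def step(eta, u, h, g, dt, dx):
    """One staggered-grid time step of the 1-D linear shallow-water
    equations (eta at cell centres, u at cell faces). A parallel version
    would split these slice updates across processors, exchanging one halo
    cell per boundary between neighbouring subdomains."""
    # momentum: du/dt = -g * d(eta)/dx  (interior faces only)
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity: d(eta)/dt = -h * du/dx
    eta -= h * dt / dx * (u[1:] - u[:-1])
    return eta, u

# Toy basin: 4 km depth, 1000 km long, Gaussian initial hump
n, dx, h, g = 500, 2000.0, 4000.0, 9.81
dt = 0.5 * dx / np.sqrt(g * h)              # CFL-limited time step
x = np.arange(n) * dx
eta = np.exp(-((x - x.mean()) / 2e4) ** 2)
u = np.zeros(n + 1)                         # closed (reflective) ends
for _ in range(200):
    eta, u = step(eta, u, h, g, dt, dx)
print("max eta after 200 steps:", round(float(eta.max()), 3))
```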
NASA Astrophysics Data System (ADS)
Gica, E.
2016-12-01
The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by the NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 km × 100 km unit sources, each simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and lengths of real-time DART buoy data can generate a wide range of results for the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root-mean-square error are used to compare the tide gauge data and the simulated tsunami time series. Results of this study will help SIFT users determine whether the simulated tide gauge tsunami time series from a specific tsunami source solution lies within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
Real time assessment of the 15 July 2009 New Zealand tsunami
NASA Astrophysics Data System (ADS)
Uslu, Burak; Power, William; Greenslade, Diana; Titov, Vasily
2010-05-01
On 15 July 2009 a Mw 7.6 earthquake occurred off the coast of Fiordland in the South Island of New Zealand, about 1200 km from Auckland, New Zealand, 1500 km from Hobart, Tasmania, and 1800 km from Sydney, Australia. A tsunami was generated and an initial warning was issued by the PTWC. The Centre for Australian Weather and Climate Research issued its first tsunami warning for coastal regions of eastern Australia and New Zealand 24 minutes after the earthquake. By serendipitous coincidence, the earthquake struck while the International Tsunami Symposium was in session in Novosibirsk, Russia. This provided the opportunity to test, in real time, several tsunami warning systems in front of attending scientists (Schiermeier, 2009). NOAA Center for Tsunami Research, Pacific Tsunami Warning Center, GNS Science, and Centre for Australian Weather and Climate Research scientists were present at the symposium and worked together. Vasily Titov demonstrated "live" NOAA's methodology (Bernard et al., 2006) to assess the tsunami potential and, in consultation with colleagues, provided warning guidance; the warning was eventually canceled. We discuss how the forecast was done and how accurate the initial determination was. References: Bernard, E.N., et al., 2006, Tsunami: scientific frontiers, mitigation, forecasting and policy implications, Phil. Trans. R. Soc. A, 364:1989-2007, doi:10.1098/rsta.2006.1809. Schiermeier, Q., 2009, Tsunami forecast in real time, published online 16 July 2009, Nature, doi:10.1038/news.2009.702.
NASA Astrophysics Data System (ADS)
Yamamoto, N.; Aoi, S.; Hirata, K.; Suzuki, W.; Kunugi, T.; Nakamura, H.
2015-12-01
We have started to develop a new methodology for a real-time tsunami inundation forecast system (Aoi et al., 2015, this meeting) using dense offshore tsunami observations from the Seafloor Observation Network for Earthquakes and Tsunamis (S-net), which is under construction along the Japan Trench (Kanazawa et al., 2012, JpGU; Uehira et al., 2015, IUGG). The most important concept in our method is allowing for uncertainties of any type and/or form in the tsunami forecast, which cannot be handled by standard linear/nonlinear least-squares approaches. We first prepare a Tsunami Scenario Bank (TSB), which contains offshore tsunami waveforms at the S-net stations and tsunami inundation information calculated from any possible tsunami source. After a tsunami occurs, we quickly select several acceptable tsunami scenarios that can explain the offshore observations, using multiple indices and appropriate thresholds. At that point, the possible tsunami inundations coupled with the selected scenarios are forecast (Yamamoto et al., 2014, AGU). Currently, we define three indices: a correlation coefficient and two variance reductions, whose L2-norm part is normalized either by the observations or by the calculations (Suzuki et al., 2015, JpGU; Yamamoto et al., 2015, IUGG). In this study, we construct the TSB using various tsunami source models prepared for the probabilistic tsunami hazard assessment in the Japan Trench region (Hirata et al., 2014, AGU). To evaluate the suitability of our method, we adopt a fault model based on the 2011 Tohoku earthquake as a pseudo "observation". We also calculate the three indices between observed and calculated coastal maximum tsunami height distributions, and then examine the correlation between the coastal and offshore indices. We find that the index values for the coastal maximum tsunami heights are closer to 1 than the index values for the offshore waveforms, i.e., the coastal maximum tsunami height may be predictable within appropriate thresholds defined for the offshore indices. We also investigate the effect of rise time. This work was partially supported by the Council for Science, Technology and Innovation (CSTI) through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
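A hedged sketch of the three indices named above (a correlation coefficient and two variance reductions, normalized by the observations or by the calculations); the formulas follow the description in the abstract, and the thresholds and synthetic waveforms are placeholders rather than the operational settings.

```python
import numpy as np

def multi_indices(obs, calc):
    """Correlation coefficient and two variance reductions between observed
    and scenario (calculated) offshore waveforms, with the residual norm
    normalized either by the observations or by the calculations. Exact
    operational definitions may differ; this follows the description above."""
    obs, calc = np.ravel(obs), np.ravel(calc)
    resid = np.sum((obs - calc) ** 2)
    vr_obs = 1.0 - resid / np.sum(obs ** 2)
    vr_calc = 1.0 - resid / np.sum(calc ** 2)
    corr = np.corrcoef(obs, calc)[0, 1]
    return corr, vr_obs, vr_calc

def acceptable(obs, calc, min_corr=0.8, min_vr=0.6):
    """A scenario is kept only if all three indices pass their thresholds
    (the threshold values here are placeholders)."""
    corr, vr_obs, vr_calc = multi_indices(obs, calc)
    return corr >= min_corr and vr_obs >= min_vr and vr_calc >= min_vr

t = np.linspace(0, 1800, 300)
obs = np.sin(2 * np.pi * t / 600.0)
print(acceptable(obs, 0.9 * obs), acceptable(obs, 0.3 * obs))
```

Requiring all three indices to pass, rather than minimizing a single misfit, is what lets the method carry a spread of acceptable scenarios (and hence forecast uncertainty) forward to the coastal inundation estimate.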
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.
Geist, Eric L.; Titov, Vasily V.; Arcas, Diego; Pollitz, Fred F.; Bilek, Susan L.
2007-01-01
Results from different tsunami forecasting and hazard assessment models are compared with observed tsunami wave heights from the 26 December 2004 Indian Ocean tsunami. Forecast models are based on initial earthquake information and are used to estimate tsunami wave heights during propagation. An empirical forecast relationship based only on seismic moment provides a close estimate to the observed mean regional and maximum local tsunami runup heights for the 2004 Indian Ocean tsunami but underestimates mean regional tsunami heights at azimuths in line with the tsunami beaming pattern (e.g., Sri Lanka, Thailand). Standard forecast models developed from subfault discretization of earthquake rupture, in which deep-ocean sea level observations are used to constrain slip, are also tested. Forecast models of this type use tsunami time-series measurements at points in the deep ocean. As a proxy for the 2004 Indian Ocean tsunami, a transect of deep-ocean tsunami amplitudes recorded by satellite altimetry is used to constrain slip along four subfaults of the M > 9 Sumatra–Andaman earthquake. This proxy model performs well in comparison to observed tsunami wave heights, travel times, and inundation patterns at Banda Aceh. Hypothetical tsunami hazard assessment models based on end-member estimates for average slip and rupture length (Mw 9.0–9.3) are compared with tsunami observations. Using average slip (low end member) and rupture length (high end member) (Mw 9.14) consistent with many seismic, geodetic, and tsunami inversions adequately estimates tsunami runup in most regions, except the extreme runup in the western Aceh province. The high slip that occurred in the southern part of the rupture zone linked to runup in this location is a larger fluctuation than expected from standard stochastic slip models. In addition, excess moment release (~9%) deduced from geodetic studies in comparison to seismic moment estimates may generate additional tsunami energy, if the exponential time constant of slip is less than approximately 1 hr. Overall, there is significant variation in assessed runup heights caused by quantifiable uncertainty in both first-order source parameters (e.g., rupture length, slip-length scaling) and spatiotemporal complexity of earthquake rupture.
NASA Astrophysics Data System (ADS)
Wei, Y.; Titov, V. V.; Bernard, E. N.; Spillane, M. C.
2014-12-01
The tragedies of the 2004 Sumatra and 2011 Tohoku tsunamis exposed the limits of our knowledge in preparing for devastating tsunamis, especially in the near field. The 1,100-km Pacific coastline of North America has tectonic and geological settings similar to Sumatra and Japan. The geological record unambiguously shows that the Cascadia fault has caused devastating tsunamis in the past, and this geological process will cause tsunamis in the future. Existing observational instruments along the Cascadia Subduction Zone are capable of providing tsunami data within minutes of tsunami generation. However, this strategy requires separating the tsunami signals from the overwhelming high-frequency seismic waves produced during a strong earthquake, a real technical challenge for the existing operational tsunami observational network. A new generation of nano-resolution pressure sensors can provide high temporal resolution of the earthquake and tsunami signals without losing precision. The nano-resolution pressure sensor offers a state-of-the-science ability to separate earthquake vibrations and other oceanic noise from tsunami waveforms, paving the way for accurate, early warnings of local tsunamis. This breakthrough underwater technology has been tested and verified for a couple of micro-tsunami events (Paros et al., 2011). Real-time forecasting of Cascadia tsunamis is becoming a possibility with the development of nano-tsunameter technology. The present study investigates how to optimize the placement of these new sensors so that the forecast time can be shortened. The presentation will cover the optimization of an observational array to quickly detect and forecast a tsunami generated by a strong Cascadia earthquake, including short and long rupture scenarios. Lessons learned from the 2011 Tohoku tsunami will be examined to demonstrate how we can improve local forecasts using the new technology. We expect this study to provide useful guidelines for future siting and deployment of the new-generation tsunameters. Driven by the new technology, we demonstrate scenarios of real-time forecasting of Cascadia tsunami impacts along the Pacific Northwest, as well as in Puget Sound.
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Ramirez-Herrera, M. T.; Dengler, L. A.; Miller, K.; LaDuke, Y.
2017-12-01
The preliminary tsunami impacts from the September 7, 2017, M8.1 Tehuantepec Earthquake have been summarized in the following report: https://www.eeri.org/wp-content/uploads/EERI-Recon-Rpt-090717-Mexico-tsunami_fn.pdf. Although the tsunami impacts were not as significant as those from the earthquake itself (98 fatalities and 41,000 homes damaged), the following are highlights and lessons learned: The Tehuantepec earthquake was one of the largest down-slab normal faulting events ever recorded. This situation complicated the tsunami forecast since forecast methods and pre-event modeling are primarily associated with megathrust earthquakes where the most significant tsunamis are generated. Adding non-megathrust source modeling to the tsunami forecast databases of conventional warning systems should be considered. Offshore seismic and tsunami hazard analyses using past events should incorporate the potential for large earthquakes occurring along sources other than the megathrust boundary. From an engineering perspective, initial reports indicate there was only minor tsunami damage along the Mexico coast. There was damage to Marina Chiapas where floating docks overtopped their piles. Increasing pile heights could reduce the potential for damage to floating docks. Tsunami warning notifications did not get to the public in time to assist with evacuation. Streamlining the messaging in Mexico from the warning system directly to the public should be considered. And, for local events, preparedness efforts should place emphasis on responding to feeling the earthquake and not waiting to be notified. Although the U.S. tsunami warning centers were timely with their international and domestic messaging, there were some issues with how those messages were presented and interpreted. The use of a "Tsunami Threat" banner on the new main warning center website created confusion with emergency managers in the U.S. where no tsunami threat was expected to exist. Also, some U.S. states and territories in the Pacific were listed in both domestic and international messages, which caused confusion for American Samoa where these messages contained somewhat conflicting information. These issues are being addressed by the warning centers and the U.S. National Tsunami Hazard Mitigation Program.
Wei, Yong; Newman, Andrew V.; Hayes, Gavin P.; Titov, Vasily V.; Tang, Liujuan
2014-01-01
Correctly characterizing tsunami source generation is the most critical component of modern tsunami forecasting. Although difficult to quantify directly, a tsunami source can be modeled via different methods using a variety of measurements from deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some of which are available in or near real time. Here we assess the performance of different source models for the destructive 11 March 2011 Japan tsunami using model-data comparison for the generation, propagation, and inundation in the near field of Japan. This comparative study of tsunami source models addresses the advantages and limitations of different real-time measurements with potential use in early tsunami warning in the near and far field. The study highlights the critical role of deep-ocean tsunami measurements and rapid validation of the approximate tsunami source for high-quality forecasting. We show that these tsunami measurements are compatible with other real-time geodetic data, and may provide a more insightful understanding of tsunami generation from earthquakes, as well as from nonseismic processes such as submarine landslide failures.
NOAA Propagation Database Value in Tsunami Forecast Guidance
NASA Astrophysics Data System (ADS)
Eble, M. C.; Wright, L. M.
2016-02-01
The National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research (NCTR) has developed a tsunami forecasting capability that combines a graphical user interface with data ingestion and numerical models to produce estimates of tsunami wave arrival times, amplitudes, currents or water flow rates, and flooding at specific coastal communities. The capability integrates several key components: deep-ocean observations of tsunamis in real time, a basin-wide pre-computed propagation database of water levels and flow velocities based on potential pre-defined seismic unit sources, an inversion or fitting algorithm to refine the tsunami source based on the observations during an event, and tsunami forecast models. As tsunami waves propagate across the ocean, observations from the deep ocean are automatically ingested into the application in real time to better define the source of the tsunami itself. Since the passage of tsunami waves over a deep-ocean reporting site is not immediate, we explore the value of the NOAA propagation database in providing placeholder forecasts in advance of deep-ocean observations. The propagation database consists of water elevations and flow velocities pre-computed for 50 × 100 km unit sources in a continuous series along all known ocean subduction zones. The 2011 Japan Tohoku tsunami is presented as the case study.
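To make the role of the propagation database concrete, the sketch below shows, under simplified assumptions, how pre-computed unit-source responses might be fit to a deep-ocean observation and then reused at a coastal forecast point. The arrays G, G_coast, and the unit-source waveforms are toy stand-ins, not NCTR data or code.

```python
import numpy as np

# Illustrative sketch only (not NCTR's operational code): each column of G holds
# a pre-computed water-level time series at a DART location for one 50 x 100 km
# unit source; d is the observed (de-tided) DART record over the same window.
rng = np.random.default_rng(0)
t = np.linspace(0, 3600, 721)                       # 1-hour window, 5 s sampling
G = np.column_stack([np.sin(2 * np.pi * t / 1800 + p) * np.exp(-t / 5400)
                     for p in (0.0, 0.4, 0.8)])     # 3 toy unit-source responses
true_w = np.array([1.8, 0.0, 0.6])                  # "true" source weights
d = G @ true_w + 0.005 * rng.standard_normal(t.size)

# Refine the source by fitting unit-source weights to the deep-ocean observation.
w, *_ = np.linalg.lstsq(G, d, rcond=None)

# The same weights then scale pre-computed responses at a coastal forecast point
# (G_coast, also drawn from the propagation database) to give the forecast there.
G_coast = np.column_stack([1.5 * np.sin(2 * np.pi * t / 1500 + p)
                           for p in (0.2, 0.6, 1.0)])
forecast = G_coast @ w
print("recovered weights:", np.round(w, 2))
```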
NASA Astrophysics Data System (ADS)
Koshimura, S.; Hino, R.; Ohta, Y.; Kobayashi, H.; Musa, A.; Murashima, Y.
2014-12-01
With the use of modern computing power and advanced sensor networks, a project is underway to establish a new system of real-time tsunami inundation forecasting, damage estimation, and mapping to enhance society's resilience in the aftermath of a major tsunami disaster. The system consists of the fusion of real-time crustal deformation monitoring and fault model estimation by Ohta et al. (2012), high-performance real-time tsunami propagation/inundation modeling on NEC's vector supercomputer SX-ACE, damage/loss estimation models (Koshimura et al., 2013), and geo-informatics. After a major (near-field) earthquake occurs, the first response of the system is to identify the tsunami source model by applying the RAPiD algorithm (Ohta et al., 2012) to observed RTK-GPS time series at GEONET sites in Japan. As demonstrated with the data obtained during the 2011 Tohoku event, we assume the source model can be acquired in less than 10 minutes. Given the tsunami source, the system moves on to running the tsunami propagation and inundation model, which was optimized on the vector supercomputer SX-ACE, to estimate time series of tsunami height at offshore and coastal tide gauges, tsunami travel and arrival times, the extent of the inundation zone, and the maximum flow depth distribution. The implemented tsunami numerical model is based on the non-linear shallow-water equations discretized by the finite difference method. The merged bathymetry and topography grids are prepared with 10 m resolution to better estimate tsunami inland penetration. Given the maximum flow depth distribution, the system performs GIS analysis to determine the number of exposed people and structures using census data, then estimates the number of potential deaths and damaged structures by applying a tsunami fragility curve (Koshimura et al., 2013). Once the tsunami source model is determined, the system is designed to complete the estimation within 10 minutes. The results are disseminated as mapping products to responders and stakeholders, e.g., national and regional municipalities, to be utilized in their emergency response activities. In 2014, the system was verified through case studies of the 2011 Tohoku event and potential earthquake scenarios along the Nankai Trough with regard to its capability and robustness.
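The propagation/inundation model described above is based on the nonlinear shallow-water equations solved by finite differences. As a rough illustration of that discretization, the sketch below advances a 1-D shallow-water wave on a staggered grid; the operational system is 2-D, nested, and vector-optimized, so this is only a conceptual analogue.

```python
import numpy as np

# Minimal 1-D shallow-water step on a staggered grid (illustrative sketch only).
g, dx, dt, n = 9.81, 1000.0, 1.0, 400          # grid spacing [m], time step [s]
depth = np.full(n, 100.0)                      # still-water depth [m]
eta = 2.0 * np.exp(-((np.arange(n) - 200) ** 2) / 200.0)   # initial hump [m]
M = np.zeros(n + 1)                            # volume flux at cell edges [m^2/s]

for _ in range(600):
    # continuity: d(eta)/dt + dM/dx = 0
    eta -= dt / dx * (M[1:] - M[:-1])
    # momentum (advection term omitted for brevity): dM/dt + g*D*d(eta)/dx = 0
    D = depth[:-1] + 0.5 * (eta[:-1] + eta[1:])          # total depth near edges
    M[1:-1] -= dt / dx * g * D * (eta[1:] - eta[:-1])

print("max surface elevation after 10 min:", round(float(eta.max()), 3), "m")
```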
Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods
Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.
2011-01-01
The tsunami source is the origin of the subsequent transoceanic water waves and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time and some after the event. Here we assess these different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation, and inundation in the near field of Japan. This comparative study furthers understanding of the advantages and shortcomings of different methods that may potentially be used in real-time warning and forecasting of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts; their combination with land GPS measurements may lead to a better understanding of both the earthquake mechanism and the tsunami generation process. © 2011 MTS.
NASA Astrophysics Data System (ADS)
Weinstein, S.; Becker, N. C.; Wang, D.; Fryer, G. J.
2013-12-01
An important issue that vexes tsunami warning centers (TWCs) is when to cancel a tsunami warning once it is in effect. Emergency managers often face a variety of pressures to allow the public to resume their normal activities, but allowing coastal populations to return too quickly can put them at risk. A TWC must, therefore, exercise caution when cancelling a warning. Kim and Whitmore (2013) show that in many cases a TWC can use the decay of tsunami oscillations in a harbor to forecast when their amplitudes will fall to safe levels. This technique should prove reasonably robust for local tsunamis (those that are potentially dangerous within only 100 km of their source region) and for regional tsunamis (whose danger is limited to within 1,000 km of the source region) as well. For ocean-crossing destructive tsunamis such as the 11 March 2011 Tohoku tsunami, however, this technique may be inadequate. When a tsunami propagates across the ocean basin, it will encounter topographic obstacles such as seamount chains or coastlines, resulting in coherent reflections that can propagate great distances. When these reflections reach previously impacted coastlines, they can recharge decaying tsunami oscillations and make them hazardous again. Warning center scientists should forecast sea-level records for 24 hours beyond the initial tsunami arrival in order to observe any potential reflections that may pose a hazard. Animations are a convenient way to visualize reflections and gain a broad geographic overview of their impacts. The Pacific Tsunami Warning Center has developed tools based on tsunami simulations using the RIFT tsunami forecast model. RIFT is a linear, parallelized numerical tsunami propagation model that runs very efficiently on a multi-CPU system (Wang et al., 2012). It can simulate 30 hours of tsunami wave propagation in the Pacific Ocean at 4-arc-minute resolution in approximately 6 minutes of real time on a 12-CPU system. Constructing a 30-hour animation using 1-minute simulated time steps takes approximately 50 minutes on the same system. These animations are generated quickly enough to provide decision support for emergency managers whose coastlines may be impacted by the tsunami several hours later. Tsunami reflections can also aid in determining the source region for tsunamis generated by non-seismic mechanisms without a clear source, such as meteotsunamis (tsunamis generated by meteorological phenomena). A derecho that crossed the New Jersey coast and entered the Atlantic Ocean at approximately 1500 UTC on June 13, 2013 generated a meteotsunami that struck the northeast coast of the US, causing several injuries. A DART sensor off Montauk, NY, recorded tsunami waves approximately 200 minutes apart. We show how the arrival times of the tsunamis recorded by this DART can help to constrain the source region of the meteotsunami. We also examine other reflections produced by the Haida Gwaii 2012, Tohoku 2011, and other tsunamis.
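The decay-based cancellation idea attributed above to Kim and Whitmore (2013) can be illustrated, very loosely, by fitting an exponential envelope to observed harbor peaks and solving for the time at which the envelope drops below a chosen safety threshold. The peak values and threshold below are hypothetical.

```python
import numpy as np

# Hedged sketch: fit an exponential envelope A0*exp(-t/tau) to observed tsunami
# maxima in a harbor, then solve for when the envelope falls below a safe level.
t_hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # hours after first arrival
peaks_m = np.array([1.2, 0.85, 0.62, 0.45, 0.33])  # hypothetical peak amplitudes

# Linear fit in log space: log(A) = log(A0) - t/tau
slope, intercept = np.polyfit(t_hours, np.log(peaks_m), 1)
A0, tau = np.exp(intercept), -1.0 / slope

safe_level = 0.3                                   # metres, an assumed threshold
t_safe = tau * np.log(A0 / safe_level)
print(f"A0={A0:.2f} m, tau={tau:.1f} h, amplitudes < {safe_level} m after ~{t_safe:.1f} h")
```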
Improving tsunami warning systems with remote sensing and geographical information system input.
Wang, Jin-Feng; Li, Lian-Fa
2008-12-01
An optimal and integrative tsunami warning system is introduced that takes full advantage of remote sensing and geographical information systems (GIS) in monitoring, forecasting, detection, loss evaluation, and relief management for tsunamis. Using the primary impact zone in Banda Aceh, Indonesia as the pilot area, we conducted three simulations that showed that while the December 26, 2004 Indian Ocean tsunami claimed about 300,000 lives because there was no tsunami warning system at all, it is possible that only about 15,000 lives could have been lost if the area had used a tsunami warning system like that currently in use in the Pacific Ocean. The simulations further calculated that the death toll could have been about 3,000 deaths if there had been a disaster system further optimized with full use of remote sensing and GIS, although the number of badly damaged or destroyed houses (29,545) could have likely remained unchanged.
Fusion of real-time simulation, sensing, and geo-informatics in assessing tsunami impact
NASA Astrophysics Data System (ADS)
Koshimura, S.; Inoue, T.; Hino, R.; Ohta, Y.; Kobayashi, H.; Musa, A.; Murashima, Y.; Gokon, H.
2015-12-01
Bringing together state-of-the-art high-performance computing, remote sensing, and spatial information sciences, we establish a method of real-time tsunami inundation forecasting, damage estimation, and mapping to enhance disaster response. Right after a major (near-field) earthquake occurs, we perform real-time tsunami inundation forecasting using a high-performance computing platform (Koshimura et al., 2014). Using Tohoku University's vector supercomputer, we accomplished the "10-10-10 challenge": completing tsunami source determination in 10 minutes and tsunami inundation modeling in 10 minutes with 10 m grid resolution. Given the maximum flow depth distribution, we perform quantitative estimation of the exposed population using census data and mobile phone data, and of the numbers of potential deaths and damaged structures by applying a tsunami fragility curve. After the potential tsunami-affected areas are estimated, the analysis moves on to the "detection" phase using remote sensing. Recent advances in remote sensing technologies expand capabilities for detecting the spatial extent of the tsunami-affected area and structural damage. In particular, a semi-automated method to estimate building damage in tsunami-affected areas has been developed using pre- and post-event high-resolution SAR (Synthetic Aperture Radar) data. The method is verified through case studies of the 2011 Tohoku event and other potential tsunami scenarios, and prototype system development is now underway in Kochi Prefecture, one of the at-risk coastal regions facing a Nankai Trough earthquake. In the trial operation, we verify the capability of the method as a new tsunami early warning and response system for stakeholders and responders.
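The damage-estimation step mentioned above applies a tsunami fragility curve to the modeled flow depth. A common functional form is a lognormal cumulative distribution; the sketch below uses hypothetical parameters rather than the Koshimura et al. (2013) coefficients.

```python
import numpy as np
from math import erf

# Hedged sketch of applying a tsunami fragility curve: the probability of
# structural damage is modeled here as a lognormal CDF of flow depth. The
# median depth and log-standard deviation are hypothetical values.
def damage_probability(flow_depth_m, median_m=2.0, beta=0.6):
    """Lognormal fragility: P(damage | flow depth)."""
    z = (np.log(flow_depth_m) - np.log(median_m)) / beta
    return 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))

# Example: expected damaged buildings per grid cell from the modeled maximum
# flow depth and a (hypothetical) exposed-building count from census/GIS data.
flow_depth = np.array([0.5, 1.5, 3.0, 6.0])        # metres, from inundation model
buildings = np.array([120, 80, 60, 40])            # exposed structures per cell
expected_damaged = damage_probability(flow_depth) * buildings
print(np.round(expected_damaged, 1), "-> total ~", int(expected_damaged.sum()))
```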
Modeling Extra-Long Tsunami Propagation: Assessing Data, Model Accuracy and Forecast Implications
NASA Astrophysics Data System (ADS)
Titov, V. V.; Moore, C. W.; Rabinovich, A.
2017-12-01
Detecting and modeling tsunamis propagating tens of thousands of kilometers from the source is a formidable scientific challenge and may seem to satisfy only scientific curiosity. However, such analyses provide valuable insight into tsunami propagation dynamics and model accuracy, and carry important implications for tsunami forecasting. The Mw = 9.3 megathrust earthquake of December 26, 2004 off the coast of Sumatra generated a tsunami that devastated Indian Ocean coastlines and spread into the Pacific and Atlantic oceans. The tsunami was recorded by a great number of coastal tide gauges, including some located 15-25 thousand kilometers from the source area. To date, it is still the farthest instrumentally detected tsunami. The data from these instruments throughout the world oceans enabled estimation of various statistical parameters and the energy decay of this event. High-resolution records of this tsunami from DARTs 32401 (offshore of northern Chile), 46405 and NeMO (both offshore of the US West Coast), combined with mainland tide gauge measurements, enabled us to examine far-field characteristics of the 2004 tsunami in the Pacific Ocean and to compare the results of global numerical simulations with the observations. Despite their small heights (less than 2 cm at deep-ocean locations), the records demonstrated consistent spatial and temporal structure. The numerical model described well the frequency content, amplitudes, and general structure of the observed waves at deep-ocean and coastal gauges. We present analysis of the measurements and comparison with model data to discuss implications for tsunami forecast accuracy. Model studies at such extreme distances from the tsunami source and at extra-long times after the event are an attempt to find accuracy bounds for tsunami models and the accuracy limitations of model use for forecasting. We discuss the results in application to tsunami model forecasting and tsunami modeling in general.
Introduction to "Tsunami Science: Ten Years After the 2004 Indian Ocean Tsunami. Volume I"
NASA Astrophysics Data System (ADS)
Rabinovich, Alexander B.; Geist, Eric L.; Fritz, Hermann M.; Borrero, Jose C.
2015-03-01
Twenty-two papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Tsunami Science: Ten Years after the 2004 Indian Ocean Tsunami." Eight papers examine various aspects of past events with an emphasis on case and regional studies. Five papers are on tsunami warning and forecast, including the improvement of existing tsunami warning systems and the development of new warning systems in the northeast Atlantic and Mediterranean region. Three more papers present the results of analytical studies and discuss benchmark problems. Four papers report the impacts of tsunamis, including the detailed calculation of inundation onshore and into rivers and probabilistic analysis for engineering purposes. The final two papers relate to important investigations of the source and tsunami generation. Overall, the volume not only addresses the pivotal 2004 Indian Ocean (Sumatra) and 2011 Japan (Tohoku) tsunamis, but also examines the tsunami hazard posed to other critical coasts in the world.
NASA Astrophysics Data System (ADS)
Wang, D.; Becker, N. C.; Weinstein, S.; Duputel, Z.; Rivera, L. A.; Hayes, G. P.; Hirshorn, B. F.; Bouchard, R. H.; Mungov, G.
2017-12-01
The Pacific Tsunami Warning Center (PTWC) began forecasting tsunamis in real time using source parameters derived from real-time Centroid Moment Tensor (CMT) solutions in 2009. Both the USGS and PTWC typically obtain W-phase CMT solutions for large earthquakes less than 30 minutes after earthquake origin time. Within seconds, and often before waves reach the nearest deep-ocean bottom pressure sensors (DARTs), PTWC then generates a regional tsunami propagation forecast using its linear shallow-water model. The model is initialized by a sea surface deformation that mimics the seafloor deformation based on Okada's (1985) dislocation model of a rectangular fault with uniform slip. The fault length and width are empirical functions of the seismic moment. How well did this simple model perform? The DART records provide a very valuable dataset for model validation. We examine tsunami events of the last decade with earthquake magnitudes ranging from 6.5 to 9.0, including some deep events for which tsunamis were not expected. Most of the forecast results were obtained during the events. We also include events from before the implementation of the WCMT method at USGS and PTWC (2006-2009); for these events, WCMTs were computed retrospectively (Duputel et al. 2012). For some events we also re-ran the model with a larger domain, using the same source parameters as during the events, to include far-field DARTs that recorded a tsunami. We conclude that our model results, in terms of maximum wave amplitude, are mostly within a factor of two of the observations at DART stations, with an average error of less than 40% for most events, including the 2010 Maule and 2011 Tohoku tsunamis. However, the simple fault model with uniform slip is too simplistic for the Tohoku tsunami. We note that model results are sensitive to centroid location and depth, especially if the earthquake is close to land or inland. For the 2016 M7.8 New Zealand earthquake, the initial forecast underestimated the tsunami because the initial WCMT centroid was on land (the epicenter was inland but most of the slip occurred offshore); later WCMTs did provide better forecasts. The model also failed to reproduce the observed tsunamis from earthquake-generated landslides. Sea level observations during events are crucial in determining whether or not a forecast needs to be adjusted.
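As noted above, the fault length and width are empirical functions of the seismic moment. The sketch below shows one way such a uniform-slip rectangular source might be parameterized from magnitude alone; the coefficients follow the general form of the Wells and Coppersmith (1994) subsurface-rupture regressions with an assumed rigidity, and may differ from PTWC's operational relations.

```python
import numpy as np

# Hedged sketch: derive a uniform-slip rectangular fault from magnitude alone.
# Scaling coefficients are in the style of Wells & Coppersmith (1994), all slip
# types; the operational relations may differ. Rigidity is an assumed 4.0e10 Pa.
def fault_from_magnitude(Mw, rigidity=4.0e10):
    M0 = 10 ** (1.5 * Mw + 9.1)                 # seismic moment [N m]
    length_km = 10 ** (-2.44 + 0.59 * Mw)       # subsurface rupture length [km]
    width_km = 10 ** (-1.01 + 0.32 * Mw)        # downdip rupture width [km]
    area_m2 = length_km * width_km * 1e6
    slip_m = M0 / (rigidity * area_m2)          # uniform slip from M0 = mu*A*D
    return length_km, width_km, slip_m

for Mw in (7.5, 8.0, 9.0):
    L, W, D = fault_from_magnitude(Mw)
    print(f"Mw {Mw}: L ~ {L:.0f} km, W ~ {W:.0f} km, slip ~ {D:.1f} m")
```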
Dynamic Tsunami Data Assimilation (DTDA) Based on Green's Function: Theory and Application
NASA Astrophysics Data System (ADS)
Wang, Y.; Satake, K.; Gusman, A. R.; Maeda, T.
2017-12-01
Tsunami data assimilation estimates the tsunami arrival time and height at Points of Interest (PoIs) by assimilating tsunami data observed offshore into a numerical simulation, without the need to calculate the initial sea surface height at the source (Maeda et al., 2015). The previous tsunami data assimilation approach has two main problems: it requires considerable computation time, because the tsunami wavefield of the whole region of interest is computed continuously, and it relies on a dense observation network, such as the Dense Oceanfloor Network system for Earthquakes and Tsunamis (DONET) in Japan or the Cascadia Initiative (CI) in North America (Gusman et al., 2016), which is not practical for some areas. Here we propose a new approach based on Green's functions to speed up the tsunami data assimilation process and to address the problem of sparse observations: Dynamic Tsunami Data Assimilation (DTDA). If the residual between the observed and calculated tsunami height is not zero, there will be an assimilation response around the station, usually a Gaussian-distributed sea surface displacement. The Green's function G(i,j) is defined as the tsunami waveform at the j-th grid point caused by the propagation of the assimilation response at the i-th station. Hence, the forecasted waveforms at PoIs are calculated as the superposition of the Green's functions. In the case of sparse observations, we can use aircraft and satellite observations. The previous assimilation approach is not practical in this case because it takes too much time to assimilate moving observations and to compute the tsunami wavefield of the region of interest. In contrast, DTDA synthesizes the waveforms quickly as long as the Green's functions are calculated in advance. We apply our method to a hypothetical earthquake off the west coast of Sumatra Island similar to the 2004 Indian Ocean earthquake. Currently there is no dense observation network in that area, making it difficult for the previous assimilation approach. We used DTDA with aircraft and satellite observations above the Indian Ocean to forecast the tsunami in Sri Lanka, India, and Thailand. The results show that DTDA provides reliable tsunami forecasting for these countries, and that a tsunami early warning can be issued half an hour before the tsunami arrives to reduce damage along the coast.
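One plausible reading of the assimilation-response step described above is sketched below: when the observed height at a station differs from the computed wavefield, a Gaussian-shaped sea-surface correction centered on that station, scaled by the residual, is added to the model state. The grid size, station location, and length scale are hypothetical.

```python
import numpy as np

# Hedged sketch of a Gaussian "assimilation response": the residual (observed
# minus computed) at a station drives a Gaussian correction to the wavefield.
nx, ny, dx = 200, 200, 2.0                      # grid cells and spacing [km]
x, y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx)
eta_model = np.zeros((ny, nx))                  # computed wavefield [m]

station_xy = (180.0, 220.0)                     # station position [km], assumed
eta_observed = 0.12                             # observed height at station [m]
L = 20.0                                        # correction length scale [km]

ix, iy = int(station_xy[0] / dx), int(station_xy[1] / dx)
residual = eta_observed - eta_model[iy, ix]
response = residual * np.exp(-((x - station_xy[0]) ** 2
                               + (y - station_xy[1]) ** 2) / (2 * L ** 2))
eta_model += response                           # assimilated wavefield
print("correction applied, peak:", round(float(response.max()), 3), "m")
```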
Real-time determination of the worst tsunami scenario based on Earthquake Early Warning
NASA Astrophysics Data System (ADS)
Furuya, Takashi; Koshimura, Shunichi; Hino, Ryota; Ohta, Yusaku; Inoue, Takuya
2016-04-01
In recent years, real-time tsunami inundation forecasting has been developed with advances in dense seismic monitoring, GPS Earth observation, offshore tsunami observation networks, and high-performance computing infrastructure (Koshimura et al., 2014). Several uncertainties are involved in tsunami inundation modeling, and the tsunami generation model is believed to be one of the largest sources of uncertainty. An uncertain tsunami source model risks underestimating tsunami height, the extent of the inundation zone, and damage. Tsunami source inversion using observed seismic, geodetic, and tsunami data is the most effective way to avoid underestimating the tsunami, but it requires additional time to acquire the observations, and this limitation makes it difficult to complete real-time tsunami inundation forecasting quickly enough. Rather than waiting for precise tsunami observations, we aim, from a disaster management point of view, to determine the worst tsunami source scenario, for use in real-time tsunami inundation forecasting and mapping, using the seismic information from Earthquake Early Warning (EEW) that can be obtained immediately after the event is triggered. After an earthquake occurs, JMA's EEW estimates the magnitude and hypocenter. With the constraints of earthquake magnitude, hypocenter, and a scaling law, we determine possible multiple tsunami source scenarios and search for the worst one by superposition of pre-computed tsunami Green's functions, i.e., time series of tsunami height at offshore points corresponding to 2-dimensional Gaussian unit sources (e.g., Tsushima et al., 2014). The scenario analysis of our method consists of the following two steps (see the sketch after this abstract). (1) Search for the worst-scenario range by calculating 90 scenarios with various strikes and fault positions; from the maximum tsunami heights of these 90 scenarios, we determine a narrower strike range that causes high tsunami heights in the area of concern. (2) Calculate 900 scenarios with different strike, dip, length, width, depth, and fault position, with the strike limited to the range obtained from the 90-scenario calculation; from these 900 scenarios, we determine the worst tsunami scenarios from a disaster management point of view, such as the one with the shortest travel time and the one with the highest water level. The method was applied to a hypothetical earthquake and verified to determine whether it can effectively identify the worst tsunami source scenario in real time, to be used as an input for real-time tsunami inundation forecasting.
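The two-stage scenario search is sketched below under toy assumptions: a coarse sweep over strike and fault position narrows the strike range, and a finer sweep within that range selects the worst case. The function max_coastal_height is a hypothetical stand-in for superposing the pre-computed Green's functions of a scenario.

```python
import numpy as np

# Hedged sketch of the two-stage worst-scenario search described above.
def max_coastal_height(strike_deg, position_km, width_km=100.0, dip_deg=15.0):
    # Toy response peaked near strike 30 deg and position 50 km (illustrative
    # only; dip_deg is accepted but does not affect this toy response).
    return (np.exp(-((strike_deg - 30.0) / 40.0) ** 2)
            * np.exp(-((position_km - 50.0) / 80.0) ** 2) * width_km / 100.0)

# Stage 1: 90 coarse scenarios over strike and along-trench position.
strikes = np.linspace(0, 180, 10)
positions = np.linspace(-100, 200, 9)
coarse = [(max_coastal_height(s, p), s, p) for s in strikes for p in positions]
best_h, best_s, best_p = max(coarse)
strike_window = (best_s - 20.0, best_s + 20.0)   # narrowed strike range

# Stage 2: 900 finer scenarios (10 strikes x 10 positions x 3 widths x 3 dips)
# within the narrowed strike range; the worst case maximizes coastal height.
fine = [(max_coastal_height(s, p, w, d), s, p, w, d)
        for s in np.linspace(*strike_window, 10)
        for p in np.linspace(best_p - 50, best_p + 50, 10)
        for w in (50.0, 100.0, 150.0)
        for d in (10.0, 15.0, 20.0)]
worst = max(fine)
print("worst-case height ~", round(worst[0], 2), "at strike", round(worst[1], 1))
```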
Noise Reduction of Ocean-Bottom Pressure Data Toward Real-Time Tsunami Forecasting
NASA Astrophysics Data System (ADS)
Tsushima, H.; Hino, R.
2008-12-01
We discuss a method of noise reduction for ocean-bottom pressure data to be fed into the near-field tsunami forecasting scheme proposed by Tsushima et al. [2008a]. In their scheme, the pressure data are processed in real time as follows: (1) removing ocean tide components by subtracting the sea-level variation computed from a theoretical tide model, (2) applying a low-pass digital filter to remove high-frequency fluctuations due to seismic waves, and (3) removing the DC offset and linear-trend component to determine a baseline of relative sea level. However, it turns out this simple method is not always successful in extracting tsunami waveforms from the data when the observed amplitude is ~1 cm. For disaster mitigation, accurate forecasting of small tsunamis is as important as that of large tsunamis. Since small tsunami events occur frequently, successful forecasting of those events is critical to maintaining public confidence in tsunami warnings. As a test case, we applied the data processing described above to bottom pressure records containing a tsunami with amplitude less than 1 cm generated by the 2003 off-Fukushima earthquake in the Japan Trench subduction zone. The observed pressure variation due to the ocean tide is well explained by the tide signals calculated from the NAO99Jb model [Matsumoto et al., 2000]. However, the tide components estimated by BAYTAP-G [Tamura et al., 1991] from the pressure data are more appropriate for predicting and removing the ocean tide signals. In the pressure data after removing the tide variations, there remain pressure fluctuations with frequencies ranging from about 0.1 to 1 mHz and with amplitudes around ~10 cm. These fluctuations distort the estimation of the zero level and linear trend that define the relative sea-level variation, which is treated as the tsunami waveform in the subsequent analysis. Since the linear trend is estimated from the data prior to the origin time of the earthquake, an artificial linear trend is produced in the processed waveform. This artificial linear trend degrades the accuracy of the tsunami forecast, although the forecasting result is expected to be robust against short-period noise [Tsushima et al., 2008a]. Since the bottom pressure shows a gradual increase (or decrease) in the tsunami source region [Tsushima et al., 2008b], it is important to remove the linear trend not related to tsunami generation from the data before it is fed into the analysis. Therefore, the reduction of noise in the sub-mHz band is critical for forecasting small tsunamis. Applying frequency filters to eliminate this noise is not a solution, because actual tsunami signals may also contain components in this frequency band. We investigate whether statistical modeling of the noise is effective for reducing the sub-mHz noise. A minimal sketch of the three-step real-time processing appears below.
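The sketch below applies the three-step processing described above to a synthetic bottom-pressure record: tide removal (with the known synthetic tide standing in for a theoretical tide-model prediction such as NAO99Jb), low-pass filtering of seismic-band fluctuations, and removal of the DC offset and a linear trend estimated from the pre-event window. The sampling rate and cutoff frequency are assumed values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hedged sketch of the three-step real-time processing on a synthetic record
# (expressed as equivalent water height).
fs = 1.0 / 15.0                                  # 15 s sampling [Hz], assumed
t = np.arange(0, 6 * 3600, 15.0)                 # 6-hour window [s]
rng = np.random.default_rng(2)

tide = 0.8 * np.sin(2 * np.pi * t / (12.42 * 3600))              # semidiurnal tide [m]
seismic = 0.05 * np.sin(2 * np.pi * t / 60.0) * (t > 3 * 3600)   # seismic-band noise
tsunami = 0.02 * np.exp(-((t - 4 * 3600) / 600.0) ** 2)          # ~2 cm tsunami
record = tide + seismic + tsunami + 0.002 * rng.standard_normal(t.size)

# (1) Remove the predicted tide.
x = record - tide
# (2) Low-pass filter to suppress seismic-wave fluctuations (cutoff assumed).
b, a = butter(4, 0.002, btype="low", fs=fs)      # 0.002 Hz ~ 500 s cutoff
x = filtfilt(b, a, x)
# (3) Remove DC offset and linear trend estimated from the pre-event window.
pre_event = t < 3 * 3600
coef = np.polyfit(t[pre_event], x[pre_event], 1)
x = x - np.polyval(coef, t)
print("recovered tsunami peak ~", round(float(x[t > 3.5 * 3600].max()), 3), "m")
```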
NASA Astrophysics Data System (ADS)
Melgar, D.; Bock, Y.; Crowell, B. W.; Haase, J. S.
2013-12-01
Computation of predicted tsunami wave heights and runup in the regions adjacent to large earthquakes immediately after rupture initiation remains a challenging problem. Limitations of traditional seismological instrumentation in the near field, which cannot be objectively employed for real-time inversions, and the non-uniqueness of source inversion results are major concerns for tsunami modelers. Employing near-field seismic, GPS, and wave gauge data from the Mw 9.0 2011 Tohoku-oki earthquake, we test the capacity of static finite fault slip models obtained from newly developed algorithms to produce reliable tsunami forecasts. First we demonstrate the ability of seismogeodetic source models, determined from combined land-based GPS and strong motion seismometers, to forecast near-source tsunamis within ~3 minutes of earthquake origin time (OT). We show that these models, based on land-based sensors only, tend to underestimate the tsunami but are good enough to provide a realistic first warning. We then demonstrate that rapid ingestion of offshore shallow water (100 - 1000 m) wave gauge data significantly improves the model forecasts and possible warnings. We ingest data from 2 near-source ocean-bottom pressure sensors and 6 GPS buoys into the earthquake source inversion process. Tsunami Green's functions (tGFs) are generated using the GeoClaw package, a benchmarked finite volume code with adaptive mesh refinement. These tGFs are used for a joint inversion with the land-based data and substantially improve the earthquake source and tsunami forecast. Model skill is assessed by detailed comparisons of the simulation output to 2000+ tsunami runup survey measurements collected after the event. We update the source model and tsunami forecast and warning at 10 min intervals. We show that by 20 min after OT the tsunami is well predicted, with a high variance reduction relative to the survey data, and that by ~30 minutes a model that can be considered final is achieved, since little change is observed afterwards. This is an indirect approach to tsunami warning: it relies on automatic determination of the earthquake source prior to tsunami simulation. It is more robust than ad-hoc approaches because it relies on computation of a finite-extent centroid moment tensor to objectively determine the style of faulting and the fault plane geometry on which to launch the heterogeneous static slip inversion. Operator interaction and physical assumptions are minimal. Thus, the approach can provide the initial conditions for tsunami simulation (seafloor motion) irrespective of the type of earthquake source, and it relies heavily on oceanic wave gauge measurements for source determination. It reliably distinguishes among strike-slip, normal, and thrust faulting events, all of which have been observed recently to occur in subduction zones and pose distinct tsunami hazards.
Tsunami Forecasting in the Atlantic Basin
NASA Astrophysics Data System (ADS)
Knight, W. R.; Whitmore, P.; Sterling, K.; Hale, D. A.; Bahng, B.
2012-12-01
The mission of the West Coast and Alaska Tsunami Warning Center (WCATWC) is to provide advance tsunami warning and guidance to coastal communities within its Area-of-Responsibility (AOR). Predictive tsunami models, based on the shallow water wave equations, are an important part of the Center's guidance support. An Atlantic-based counterpart to the long-standing forecasting ability in the Pacific, known as the Alaska Tsunami Forecast Model (ATFM), has now been developed. The Atlantic forecasting method is based on ATFM version 2, which contains advanced capabilities over the original model, including better handling of the dynamic interactions between grids, inundation over dry land, new forecast model products, an optional non-hydrostatic approach, and the ability to pre-compute larger and more finely gridded regions using parallel computational techniques. The wide and nearly continuous Atlantic shelf region presents a challenge for forecast models. Our solution to this problem has been to develop a single unbroken high-resolution sub-mesh (currently 30 arc-seconds), trimmed to the shelf break. This allows for edge wave propagation and for kilometer-scale bathymetric feature resolution. Terminating the fine mesh at the 2,000 m isobath keeps the number of grid points manageable while allowing a coarse (4-minute) mesh to adequately resolve deep-water tsunami dynamics. Higher-resolution sub-meshes are then included around coastal forecast points of interest. The WCATWC Atlantic AOR includes the eastern U.S. and Canada, the U.S. Gulf of Mexico, Puerto Rico, and the Virgin Islands. Puerto Rico and the Virgin Islands are in very close proximity to well-known tsunami sources. Because travel times are under an hour and response must be immediate, our focus is on pre-computing many tsunami source "scenarios" and compiling those results into a database that can be accessed and calibrated with observations during an event. Seismic source evaluation determines the order of model pre-computation - starting with those sources that carry the highest risk. Model computation zones are confined to regions at risk to save computation time. For example, Atlantic sources have been shown not to propagate into the Gulf of Mexico; therefore, fine grid computations are not performed in the Gulf for Atlantic sources. Outputs from the Atlantic model include forecast marigrams at selected sites, maximum amplitudes, drawdowns, and currents for all coastal points. The maximum amplitude maps will be supplemented with contoured energy flux maps, which show more clearly the effects of bathymetric features on tsunami wave propagation. During an event, forecast marigrams will be compared to observations to adjust the model results. The modified forecasts will then be used to set alert levels between coastal breakpoints and provided to emergency management.
The First Real-Time Tsunami Animation
NASA Astrophysics Data System (ADS)
Becker, N. C.; Wang, D.; McCreery, C.; Weinstein, S.; Ward, B.
2014-12-01
For the first time, a U.S. tsunami warning center created and issued a tsunami forecast model animation while the tsunami was still crossing an ocean. Pacific Tsunami Warning Center (PTWC) scientists had predicted they would have this ability (Becker et al., 2012) with their RIFT forecast model (Wang et al., 2009) by using rapidly determined W-phase centroid-moment tensor earthquake focal mechanisms as tsunami sources in the RIFT model (Wang et al., 2012). PTWC then acquired its own YouTube channel in 2013 for its outreach efforts, which showed animations of historic tsunamis (Becker et al., 2013) and could also serve as a platform for sharing future tsunami animations. The Mw 8.2 earthquake of 1 April 2014 prompted PTWC to issue official warnings for a dangerous tsunami in Chile, Peru, and Ecuador. PTWC ended these warnings five hours later, then issued its new tsunami marine hazard product (i.e., no coastal evacuations) for the State of Hawaii. With the international warning canceled but a domestic hazard still present, PTWC generated a forecast model animation and uploaded it to its YouTube channel six hours before the arrival of the first waves in Hawaii. PTWC also gave copies of this animation to television reporters, who in turn passed it on to their national broadcast networks. PTWC then created a version for NOAA's Science on a Sphere system so it could be shown on these exhibits while the tsunami was still crossing the Pacific Ocean. While it is difficult to determine how many people saw this animation, since local, national, and international news networks showed it in their broadcasts, PTWC's YouTube channel provides some statistics. As of 1 August 2014, this animation has garnered more than 650,000 views. Previous animations, typically released during significant anniversaries, rarely get more than 10,000 views, and even then only when external websites share them. Clearly there is a high demand for a tsunami graphic that shows both the speed and the severity of a tsunami before it reaches impacted coastlines, similar to how radar and satellite images show the advancement of storms. Though this animation showed that most of the tsunami waves would not be dangerous, future publication of these animations will require additional outreach and education to avoid any unnecessary alarm. https://www.youtube.com/user/PacificTWC
Real Time Earthquake Information System in Japan
NASA Astrophysics Data System (ADS)
Doi, K.; Kato, T.
2003-12-01
An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of tsunami forecasts to the public, locating an earthquake and estimating its magnitude as quickly as possible. Some years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that concerned governmental organizations can decide whether they need to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, in about two minutes after the earthquake occurrence; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; and 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue this information, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example automatic calculation of earthquake location and magnitude and a database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally monitors earthquake data and analyzes earthquake activity and tsunami occurrence around the clock on a real-time basis. In addition, for the past decade JMA has been developing a Nowcast Earthquake Information system that can notify users of the occurrence of an earthquake prior to the arrival of strong ground motion. The Earthquake Research Institute of the University of Tokyo is preparing a demonstrative experiment, in collaboration with JMA, for better utilization of Nowcast Earthquake Information to apply practical measures to reduce earthquake disasters caused by strong ground motion.
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Dengler, L. A.; Goltz, J. D.; Legg, M.; Miller, K. M.; Parrish, J. G.; Whitmore, P.
2009-12-01
California tsunami geoscientists work closely with federal, state, and local government emergency managers to help prepare coastal communities for potential impacts from a tsunami before, during, and after an event. For teletsunamis, as scientific information (forecast model wave heights, first-wave arrival times, etc.) from NOAA’s West Coast and Alaska Tsunami Warning Center is made available, state-level emergency managers must help convey this information in a concise and comprehensible manner to local officials who ultimately determine the appropriate response activities for their jurisdictions. During the Samoa Tsunami Advisory for California on September 29, 2009, geoscientists from the California Geological Survey and Humboldt State University assisted the California Emergency Management Agency in this information transfer by providing technical assistance during teleconference meetings with NOAA and other state and local emergency managers prior to the arrival of the tsunami. State geoscientists gathered additional background information on anticipated tidal conditions and wave heights for areas not covered by NOAA’s forecast models. The participation of the state geoscientists in the emergency response process helped clarify which regions were potentially at risk, as well as those having a low risk from the tsunami. Future tsunami response activities for state geoscientists include: 1) working closely with NOAA to simplify their tsunami alert messaging and expand their forecast modeling coverage, 2) creating “playbooks” containing information from existing tsunami scenarios for local emergency managers to reference during an event, and 3) developing a state-level information “clearinghouse” and pre-tsunami field response team to assist local officials as well as observe and report tsunami effects.
Evaluating the Effectiveness of DART® Buoy Networks Based on Forecast Accuracy
NASA Astrophysics Data System (ADS)
Percival, Donald B.; Denbo, Donald W.; Gica, Edison; Huang, Paul Y.; Mofjeld, Harold O.; Spillane, Michael C.; Titov, Vasily V.
2018-04-01
A performance measure for a DART® tsunami buoy network has been developed. DART® buoys are used to detect tsunamis, but the full potential of the data they collect is realized through accurate forecasts of inundations caused by the tsunamis. The performance measure assesses how well the network achieves its full potential through a statistical analysis of simulated forecasts of wave amplitudes outside an impact site and a consideration of how much the forecasts are degraded in accuracy when one or more buoys are inoperative. The analysis uses simulated tsunami amplitude time series collected at each buoy from selected source segments in the Short-term Inundation Forecast for Tsunamis database and involves a set of 1000 forecasts for each buoy/segment pair at sites just offshore of selected impact communities. Random error-producing scatter in the time series is induced by uncertainties in the source location, addition of real oceanic noise, and imperfect tidal removal. Comparison with an error-free standard leads to root-mean-square errors (RMSEs) for DART® buoys located near a subduction zone. The RMSEs indicate which buoy provides the best forecast (lowest RMSE) for sections of the zone, under a warning-time constraint for the forecasts of 3 h. The analysis also shows how the forecasts are degraded (larger minimum RMSE among the remaining buoys) when one or more buoys become inoperative. The RMSEs provide a way to assess array augmentation or redesign, such as moving buoys to more optimal locations. Examples are shown for buoys off the Aleutian Islands and off the West Coast of South America for impact sites at Hilo, HI, and along the US West Coast (Crescent City, CA, and Port San Luis, CA). A simple measure (coded green, yellow, or red) of the current status of the network's ability to deliver accurate forecasts is proposed to flag the urgency of buoy repair.
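The RMSE-based performance measure can be caricatured as below: for each buoy, an ensemble of perturbed forecasts of offshore amplitude is compared with an error-free standard, the best buoy is the one with the lowest RMSE, and the network's degradation when a buoy is inoperative is the minimum RMSE among the remaining buoys. Buoy labels, ensemble spreads, and the standard value are hypothetical.

```python
import numpy as np

# Hedged sketch of the network performance measure described above.
rng = np.random.default_rng(3)
buoys = ["buoy_A", "buoy_B", "buoy_C"]           # hypothetical buoy labels
true_amp = 0.85                                  # error-free standard [m]

# Simulated forecast ensembles (1000 per buoy); the spread stands in for source
# location uncertainty, oceanic noise, and imperfect tide removal.
sigmas = {"buoy_A": 0.08, "buoy_B": 0.15, "buoy_C": 0.25}
forecasts = {b: true_amp + sigmas[b] * rng.standard_normal(1000) for b in buoys}

rmse = {b: float(np.sqrt(np.mean((forecasts[b] - true_amp) ** 2))) for b in buoys}
best = min(rmse, key=rmse.get)
print("RMSE per buoy:", {b: round(v, 3) for b, v in rmse.items()}, "| best:", best)

# Degradation when the best buoy is inoperative: fall back on the lowest RMSE
# among the remaining buoys.
remaining = {b: v for b, v in rmse.items() if b != best}
print("min RMSE with", best, "down:", round(min(remaining.values()), 3))
```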
A Response Function Approach for Rapid Far-Field Tsunami Forecasting
NASA Astrophysics Data System (ADS)
Tolkova, Elena; Nicolsky, Dmitry; Wang, Dailin
2017-08-01
Predicting tsunami impacts at remote coasts largely relies on en-route tsunami measurements in the open ocean. In this work, these measurements are used to generate instant tsunami predictions in deep water and near the coast. The predictions are generated as a response or a combination of responses to one or more tsunameters, with each response obtained as a convolution of real-time tsunameter measurements and a pre-computed pulse response function (PRF). Practical implementation of this method requires tables of PRFs in a 3D parameter space: earthquake location-tsunameter-forecasted site. Examples of hindcasting the 2010 Chilean and the 2011 Tohoku-Oki tsunamis along the US West Coast and beyond demonstrate the high accuracy of the suggested technology when applied to trans-Pacific seismically generated tsunamis.
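The response-function forecast described above reduces, in discrete form, to a convolution of the real-time tsunameter record with a pre-computed pulse response function for the (source region, tsunameter, site) triplet. The sketch below uses a toy incident wave and a toy PRF; it is not the published implementation.

```python
import numpy as np

# Hedged sketch: forecast at a site = convolution of the tsunameter record with
# a pre-computed pulse response function (PRF). All values are illustrative.
dt = 60.0                                        # 1-minute sampling [s]
t = np.arange(0, 6 * 3600, dt)

# Incoming tsunameter record (toy leading wave packet).
incident = (0.10 * np.exp(-((t - 3600) / 900.0) ** 2)
            * np.sin(2 * np.pi * (t - 3600) / 1800))

# Toy PRF: site response to a unit pulse at the tsunameter, with a propagation
# delay and shelf amplification.
delay_steps = int(2.5 * 3600 / dt)
prf = np.zeros(t.size)
prf[delay_steps:delay_steps + 60] = (3.0 * np.hanning(60)
                                     * np.sin(np.linspace(0, 4 * np.pi, 60)))

forecast = np.convolve(incident, prf)[:t.size]   # discrete convolution
print("forecast peak at site:", round(float(np.abs(forecast).max()), 3), "m")
```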
NASA Astrophysics Data System (ADS)
Eble, M. C.; Uslu, B. U.; Wright, L.
2013-12-01
Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and wave amplitude at each location. These models have been used to conduct tsunami hazard assessments of maximum impact and tsunami inundation for use by local communities in education and evacuation map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana Trench regions impact Apra Harbor, Guam. Hawaii appears to be impacted roughly equally by South America, Alaska, and the Kuril Islands.
NASA Astrophysics Data System (ADS)
Bahng, B.; Whitmore, P.; Macpherson, K. A.; Knight, W. R.
2016-12-01
The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes or other mechanisms in the Pacific Ocean, Atlantic Ocean, or Gulf of Mexico. At the U.S. National Tsunami Warning Center (NTWC), the model has mainly been used for tsunami pre-computation for earthquakes; that is, results for hundreds of hypothetical events are computed before alerts, then accessed and calibrated with observations during tsunamis to immediately produce forecasts. The model has also been used for hindcasting tsunamis due to submarine landslides and atmospheric pressure jumps, but in a case-specific and somewhat limited manner. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communication between the domains of each parent-child pair as waves approach coastal waters. The shallow-water wave physics is readily applicable to all of the above tsunamis as well as to tides. Recently, the model has been expanded to include multiple forcing mechanisms in a systematic fashion and to enhance the model physics for non-earthquake events. ATFM is now able to handle multiple source mechanisms, either individually or jointly, including earthquake, submarine landslide, meteotsunami, and tidal forcing. For earthquakes, the source can be a single unit source or multiple interacting source blocks, and a horizontal slip contribution can be added to the sea-floor displacement. The model now includes submarine landslide physics, modeling the source either as a rigid slump or as a viscous fluid, with additional shallow-water physics implemented for viscous submarine landslides; with rigid slumping, any trajectory can be followed. For meteotsunamis, the forcing mechanism is capable of following any trajectory shape, and wind stress physics has also been implemented for the meteotsunami case if required. As an example of multiple sources, a near-field model of the tsunami produced by a combination of earthquake and submarine landslide forcing, which occurred in Papua New Guinea on July 17, 1998, is provided.
NASA Astrophysics Data System (ADS)
O'Neil, K.; Bouchard, R.; Burnett, W. H.; Aldrich, C.
2009-12-01
The National Oceanic and Atmospheric Administration’s (NOAA) National Data Buoy Center (NDBC) operates and maintains the NDBC Ocean Observing Systems of Systems (NOOSS), comprised of three networks that provide critical information before, during, and after extreme hazard events such as tsunamis, hurricanes, and El Niños. While each system has its own mission, they share the requirement to remain on station in remote areas of the ocean and provide reliable and accurate observations. After the 2004 Sumatran tsunami, NOAA expanded its network of tsunameters from six in the Pacific Ocean to a vast network of 39 stations providing information to Tsunami Warning Centers, enabling faster and more accurate tsunami warnings for coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico. The tsunameter measurements are used to detect the amplitude and period of tsunamis, and the data can be assimilated into models for the prediction and impact of tsunamis on coastal communities. The network has been used to detect tsunamis generated by earthquakes, including the 2006 and 2007 Kuril Islands, 2007 Peru, and Solomon Islands events, and most recently the 2009 Dusky Sound, New Zealand earthquake. In August 2009, NOAA adjusted its 2009 Atlantic hurricane seasonal outlook from above-normal to near- or below-normal activity, primarily due to a strengthening El Niño. A key component in the detection of that El Niño was the Tropical Atmosphere Ocean (TAO) array operated by NDBC. TAO provides real-time data for improved detection, understanding, and prediction of El Niño and La Niña. The 55-buoy TAO array spans the central and eastern equatorial Pacific, providing real-time and post-deployment recovery data to support climate analysis and forecasts. Although in this case the El Niño benefits the tropical Atlantic, the alternate manifestation, La Niña, typically enhances hurricane activity in the Atlantic. The various phases of the El Niño-Southern Oscillation can result in extreme hazards such as floods and landslides, droughts and wildfires, and fish kills and other biological impacts. For almost 40 years, NDBC has operated and maintained a network of buoys and coastal automated stations for meteorological and oceanographic observations that support real-time weather analysis, forecasting, and warnings. The US National Hurricane Center (NHC) uses observations from the buoys to detect the position and intensity of tropical cyclones and the extent of their extreme winds and seas. Since 2006, NHC has cited over 100 instances of using buoy data in its Forecast Discussions or Public Advisories. The data are also used in reconstructing and analyzing the extent of devastation from land-falling hurricanes. The unprecedented devastation caused by the rising waters of 2005’s Hurricane Katrina was attributed to the waves generated and reported by the NDBC buoys in the Gulf of Mexico, superimposed upon the storm surge at landfall. The three constituent systems of the NOOSS comprise a network of more than 250 observing stations providing real-time and archived data for forecasters, scientists, and disaster management officials.
The U.S. East Coast Meteotsunami of June 13, 2013
NASA Astrophysics Data System (ADS)
Knight, W. R.; Whitmore, P.; Kim, Y.; Wang, D.; Becker, N. C.; Weinstein, S.; Walker, K.
2013-12-01
NOAA's two Tsunami Warning Centers (TWCs) provide advance notification to coastal communities concerning tsunami hazards. While the focus is primarily on seismic sources, the U.S. East Coast event of June 13, 2013 indicates the importance of understanding and forecasting atmospherically driven tsunamis, or meteotsunamis, as well. Here we describe an approach that explains the generation of this event by atmospheric processes and suggests that the causative forces can be monitored and used to forecast meteotsunami occurrence. The U.S. East Coast tsunami of June 13, 2013 was well recorded at tide gauges from North Carolina to Massachusetts as well as at Bermuda and Puerto Rico. It also triggered DART 44402, just east of the Atlantic shelf break at 39.4N. As there was no seismic energy release associated with the tsunami and an eastward-propagating major weather system crossed the Atlantic coast just before the tsunami, the focus turned to atmospheric forcing. Tsunami forecast models used at the two U.S. TWCs were modified to introduce moving atmospheric pressure distributions as sources. In a simple case, a north-south oriented line air pressure jump of width 50 km and pressure of 4 mb at sea level was moved eastward at 20 m/s. The speed matched both the storm speed at the coast and the long-wave speed for 40 m deep water, thus allowing for resonant coupling of atmosphere to ocean in the shelf region (Proudman resonance). Considering the simplicity of the source, a reasonable comparison between the modeled and observed tsunami was obtained with regard to arrival time and height. The proposed source also offers an explanation of the later wave arrivals at US tide gauges. These typically lagged the arrival at Bermuda - a location much further east. This pattern can be explained within the context of Proudman resonance if the waves arriving at coastal stations originated at the shelf break as reflected waves. Model animations of wave dynamics corroborate this phenomenon. The contribution of edge waves generated as the system moves over the coast is also examined. Remaining questions include the importance of shelf parameters in setting the wave fetch and the 'Q' of Proudman resonance along the Atlantic coastline. In other words, are some stretches of shelf more conducive to tsunami formation than others? Wind stress was disregarded in the initial modeling work, leaving its possible importance as another unanswered question. Operational questions include how to detect likely meteotsunami conditions with real-time meteorological measurements, and what form alerts should take. The minimum necessary temporal resolution of the pressure sensors, along with their density and siting, needs to be determined. Because details of the source, such as direction and speed of propagation, will likely subject unique sections of coastline to tsunami attack, the detailed analysis of data from sensor arrays to be used in forecasting will be important.
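The Proudman resonance condition quoted above can be checked directly: the shallow-water long-wave speed c = sqrt(g h) over a roughly 40 m deep shelf is close to the 20 m/s translation speed of the pressure disturbance.

```python
import math

# Quick check of the resonance condition: the long-wave speed over the shelf
# matches the ~20 m/s translation speed of the pressure disturbance near h = 40 m.
g = 9.81
for h in (20.0, 40.0, 80.0):
    c = math.sqrt(g * h)
    froude = 20.0 / c            # disturbance speed over wave speed; ~1 at resonance
    print(f"depth {h:>4.0f} m: c = {c:4.1f} m/s, U/c = {froude:.2f}")
```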
NASA Astrophysics Data System (ADS)
Wilson, Rick I.; Dengler, Lori A.; Goltz, James D.; Legg, Mark R.; Miller, Kevin M.; Ritchie, Andy; Whitmore, Paul M.
2011-07-01
State geoscientists (geologists, geophysicists, seismologists, and engineers) in California work closely with federal, state and local government emergency managers to help prepare coastal communities for potential impacts from a tsunami before, during, and after an event. For teletsunamis, as scientific information (forecast model wave heights, first-wave arrival times, etc.) from NOAA's West Coast and Alaska Tsunami Warning Center is made available, federal- and state-level emergency managers must help convey this information in a concise, comprehensible and timely manner to local officials who ultimately determine the appropriate response activities for their jurisdictions. During the September 29, 2009 Tsunami Advisory for California, government geoscientists assisted the California Emergency Management Agency by providing technical assistance during teleconference meetings with NOAA and other state and local emergency managers prior to the arrival of the tsunami. This technical assistance included background information on anticipated tidal conditions when the tsunami was set to arrive, wave height estimates from state-modeled scenarios for areas not covered by NOAA's forecast models, and clarifying which regions of the state were at greatest risk. Over the last year, state geoscientists have started to provide additional assistance: 1) working closely with NOAA to simplify their tsunami alert messaging and expand their forecast modeling coverage; 2) creating "playbooks" containing information from existing tsunami scenarios for local emergency managers to reference during an event; and, 3) developing a state-level information "clearinghouse" and pre-tsunami field response team to assist local officials as well as observe and report tsunami effects. Activities of geoscientists were expanded during the more recent Tsunami Advisory on February 27, 2010, including deploying a geologist from the California Geological Survey as a field observer who provided information back to emergency managers.
Sensitivities of Near-field Tsunami Forecasts to Megathrust Deformation Predictions
NASA Astrophysics Data System (ADS)
Tung, S.; Masterlark, T.
2018-02-01
This study reveals how modeling configurations of forward and inverse analyses of coseismic deformation data influence the estimations of seismic and tsunami sources. We illuminate how near-field tsunami predictions change when (1) a heterogeneous (HET) distribution of crustal material is introduced to the elastic dislocation model, and (2) the near-trench rupture is either encouraged or suppressed to invert synthetic coseismic displacements. Hypothetical scenarios of megathrust earthquakes are studied with synthetic Global Positioning System displacements in Cascadia. Finite-element models are designed to mimic the subsurface heterogeneity across the curved subduction margin. The HET lithospheric domain modifies the seafloor displacement field and alters tsunami predictions from those of a homogeneous (HOM) crust. Uncertainties persist as the inverse analyses of geodetic data produce nonrealistic slip artifacts over the HOM domain, which propagate into the prediction errors of the subsequent tsunami arrival times and amplitudes. A stochastic analysis further shows that the uncertainties of seismic tomography models do not degrade the solution accuracy of HET over HOM. Whether the source ruptures near the trench also controls the details of the seafloor disturbance. Deeper subsurface slips induce more seafloor uplift near the coast and cause an earlier arrival of tsunami waves than surface-slipping events. We suggest using the solutions of zero-updip-slip and zero-updip-slip-gradient rupture boundary conditions as end-members to constrain the tsunami behavior for forecasting purposes. The findings are important for near-field tsunami warning, which relies primarily on near-real-time geodetic or seismic data for source calibration before megawaves hit the nearest shore during tsunamigenic events.
Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network
NASA Astrophysics Data System (ADS)
Bock, Y.
2014-12-01
Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. This will require cooperation with other real-time efforts around the Pacific Rim in terms of sharing, analysis centers, and advisory bulletins to the responsible government agencies. The IAG's Global Geodetic Observing System (GGOS), in particular its natural hazards theme, provides a natural umbrella for achieving this objective.
Tsunami Warning Services for the U.S. and Canadian Atlantic Coasts
NASA Astrophysics Data System (ADS)
Whitmore, P. M.; Knight, W.
2008-12-01
In January 2005, the National Oceanic and Atmospheric Administration (NOAA) developed a tsunami warning program for the U.S. Atlantic and Gulf of Mexico coasts. Within a year, this program extended further to the Atlantic coast of Canada and the Caribbean Sea. Warning services are provided to U.S. and Canadian coasts (including Puerto Rico and the Virgin Islands) by the NOAA/West Coast and Alaska Tsunami Warning Center (WCATWC), while the NOAA/Pacific Tsunami Warning Center (PTWC) provides services for non-U.S. entities in the Caribbean Basin. The Puerto Rico Seismic Network (PRSN) is also an active partner in the Caribbean Basin warning system. While the nature of the tsunami threat in the Atlantic Basin is different from that in the Pacific, the warning system philosophy is similar. That is, initial messages are based strictly on seismic data so that information is provided to those at greatest risk as fast as possible, while supplementary messages are refined with sea level observations and forecasts when possible. The Tsunami Warning Centers (TWCs) acquire regional seismic data through many agencies, such as the United States Geological Survey, Earthquakes Canada, regional seismic networks, and the PRSN. Seismic data quantity and quality are generally sufficient throughout most of the Atlantic area-of-responsibility to issue initial information within five minutes of origin time. Sea level data are mainly provided by the NOAA/National Ocean Service. Coastal tide gage coverage is generally denser along the Atlantic coast than in the Pacific. Seven deep ocean pressure sensors (DARTs), operated by the National Weather Service (NWS) National Data Buoy Center, are located in the Atlantic Basin (5 in the Atlantic Ocean, 1 in the Caribbean, and 1 in the Gulf of Mexico). The DARTs provide TWCs with the means to verify tsunami generation in the Atlantic and provide critical data with which to calibrate forecast models. Tsunami warning response criteria in the Atlantic Basin pose a challenge due to the lack of historical events. The probability and nature of potential sources along the offshore U.S./Canada region are not well understood. Warning/watch/advisory criteria are under review to improve TWC response. Primary tsunami warning contact points consist of NWS Weather Forecast Offices, state warning points, U.S. Coast Guard, and the military. These entities each have responsibility to propagate the message through specific channels. To help communities prepare for a tsunami warning, the NWS established the TsunamiReady program. TsunamiReady sets criteria for communities, which include: reliable methods to receive TWC warnings, reliable methods to disseminate messages locally, pre-event planning, defined hazard/safe zones, and public education. Once the criteria are met, the community can be recognized as TsunamiReady. A hypothetical event off the east coast is examined and a timeline given for TWC analysis and product issuance.
Anatomy of Historical Tsunamis: Lessons Learned for Tsunami Warning
NASA Astrophysics Data System (ADS)
Igarashi, Y.; Kong, L.; Yamamoto, M.; McCreery, C. S.
2011-11-01
Tsunamis are high-impact disasters that can cause death and destruction locally within a few minutes of their occurrence and across oceans hours, even up to a day, afterward. Efforts to establish tsunami warning systems to protect life and property began in the Pacific after the 1946 Aleutian Islands tsunami caused casualties in Hawaii. Seismic and sea level data were used by a central control center to evaluate tsunamigenic potential and then issue alerts and warnings. The ensuing events of 1952, 1957, and 1960 tested the new system, which continued to expand and evolve from a United States system to an international system in 1965. The Tsunami Warning System in the Pacific (ITSU) steadily improved through the decades as more stations became available in real and near-real time through better communications technology and greater bandwidth. New analysis techniques, coupled with more data of higher quality, resulted in better detection, greater solution accuracy, and more reliable warnings, but limitations still exist in constraining the source and in accurately predicting propagation of the wave from source to shore. Tsunami event data collected over the last two decades through international tsunami science surveys have led to more realistic models for source generation and inundation, and within the warning centers, real-time tsunami wave forecasting will become a reality in the near future. The tsunami warning system is an international cooperative effort amongst countries supported by global and national monitoring networks and dedicated tsunami warning centers; the research community has contributed to the system by advancing and improving its analysis tools. Lessons learned from the earliest tsunamis provided the backbone for the present system, but despite 45 years of experience, the 2004 Indian Ocean tsunami reminded us that tsunamis strike and kill everywhere, not just in the Pacific. Today, a global intergovernmental tsunami warning system is coordinated under the United Nations. This paper reviews historical tsunamis, their warning activities, and their sea level records to highlight lessons learned with the focus on how these insights have helped to drive further development of tsunami warning systems and their tsunami warning centers. While the international systems do well for teletsunamis, faster detection, more accurate evaluations, and widespread timely alerts are still the goals, and challenges still remain to achieving early warning against the more frequent and destructive local tsunamis.
NASA Astrophysics Data System (ADS)
Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano
2014-05-01
The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is commonly accepted by now. But it was only in the last decade that it started to be applied to the Mediterranean region, gaining particular impetus from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss some results we obtained regarding two major topics, namely the strategies applicable to the tsunami scenario database building and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real-time. As for the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve 1) the choice of the main tectonic tsunamigenic sources (or areas), 2) their tessellation with matrices of elementary faults whose dimensions depend heavily on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, 3) the computation of the scenarios themselves, 4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and at all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. In particular, we analyse the different contributions of a number of factors such as the efficient code development and availability of cutting-edge hardware to run the code itself, the wise selection of the MSDB outputs to be combined, the choice of the forecast points where water elevation time series must be taken into account, and a few others.
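A minimal sketch of the kind of matching/combination step described above, assuming (hypothetically) that each Matching Scenario Database entry stores pre-computed waveforms at forecast points for a unit-magnitude elementary fault, and that a composite forecast is built by selecting the elementary faults nearest the preliminary epicenter and scaling their waveforms by seismic moment. The names and the even moment split are illustrative choices, not the algorithm used in TRIDEC, NearToWarn, or ASTARTE.

    import numpy as np

    def select_and_combine(msdb, epicenter, magnitude, n_faults=4, unit_mw=6.8):
        """Combine pre-computed elementary-scenario waveforms into a composite forecast.

        msdb: list of dicts with 'lonlat' (fault centroid) and 'waveforms'
              (dict: forecast point -> numpy array of water elevation vs time).
        The moment ratio between the estimated event and the unit scenarios is
        split evenly over the selected faults (an illustrative simplification).
        """
        moment_ratio = 10 ** (1.5 * (magnitude - unit_mw))   # moment scales as 10**(1.5*Mw)
        scale = moment_ratio / n_faults

        # Pick the elementary faults closest to the preliminary epicenter.
        dist = lambda ll: np.hypot(ll[0] - epicenter[0], ll[1] - epicenter[1])
        selected = sorted(msdb, key=lambda entry: dist(entry["lonlat"]))[:n_faults]

        # Linearly superpose the scaled waveforms at every forecast point.
        composite = {}
        for entry in selected:
            for point, series in entry["waveforms"].items():
                composite[point] = composite.get(point, 0.0) + scale * np.asarray(series)
        return composite

    # Example with two dummy database entries and one forecast point:
    msdb = [
        {"lonlat": (28.2, 36.4), "waveforms": {"Rhodes_gauge": np.sin(np.linspace(0, 6, 60))}},
        {"lonlat": (28.8, 36.1), "waveforms": {"Rhodes_gauge": 0.5 * np.sin(np.linspace(0, 6, 60))}},
    ]
    forecast = select_and_combine(msdb, epicenter=(28.4, 36.3), magnitude=7.4, n_faults=2)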
A short history of tsunami research and countermeasures in Japan.
Shuto, Nobuo; Fujima, Koji
2009-01-01
Tsunami science and engineering began in Japan, the country most frequently hit by local and distant tsunamis. The gate to tsunami science was opened in 1896 by a giant local tsunami with a maximum run-up height of 38 m that claimed 22,000 lives. The crucial key was a tide record, which led to the conclusion that this tsunami was generated by a "tsunami earthquake". In 1933, the same area was hit again by another giant tsunami. A total system of tsunami disaster mitigation including 10 "hard" and "soft" countermeasures was proposed. Relocation of dwelling houses to high ground was the major countermeasure. Tsunami forecasting began in 1941. In 1960, the Chilean Tsunami damaged the whole Japanese Pacific coast. The height of this tsunami was 5-6 m at most. The countermeasures were the construction of structures, including the first tsunami breakwater in the world. Since the late 1970s, tsunami numerical simulation has been developed in Japan and refined to become the UNESCO standard scheme, which was transferred to 22 different countries. In 1983, photos and videos of a tsunami in the Japan Sea revealed many faces of tsunamis, such as soliton fission and edge bores. The 1993 tsunami devastated a town protected by seawalls 4.5 m high. This experience reintroduced the idea of comprehensive countermeasures, consisting of defense structures, tsunami-resistant town development, and evacuation based on warning.
Forecasting tsunamis in Poverty Bay, New Zealand, with deep-ocean gauges
NASA Astrophysics Data System (ADS)
Power, William; Tolkova, Elena
2013-12-01
The response/transfer function of a coastal site to a remote open-ocean point is introduced, with the intent to directly convert open-ocean measurements into the wave time history at the site. We show that the tsunami wave at the site can be predicted as the wave is measured in the open ocean as far as 1,000+ km away from the site, with a straightforward computation which can be performed almost instantaneously. The suggested formalism is demonstrated for the purpose of tsunami forecasting in Poverty Bay, in the Gisborne region of New Zealand. Directional sensitivity of the site response due to different conditions for the excitation of the shelf and the bay's normal modes is investigated and used to explain tsunami observations. The suggested response function formalism is validated with available records of the 2010 Chilean tsunami at Gisborne tide gauge and at the nearby deep-ocean assessment and reporting of tsunamis (DART) station 54401. The suggested technique is also demonstrated by hindcasting the 2011 Tohoku tsunami and 2012 Haida Gwaii tsunami at Monterey Bay, CA, using an offshore record of each tsunami at DART station 46411.
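A minimal illustration of the response-function idea described above, assuming the site response is available as a frequency-domain transfer function H(f) relating the open-ocean record to the coastal gauge. The actual construction of H for Poverty Bay is in the paper; the transfer function and signal below are made-up placeholders.

    import numpy as np

    def forecast_at_site(open_ocean_series, transfer_fn, dt):
        """Convert an open-ocean (e.g. DART) tsunami record into a coastal time history.

        The coastal wave is obtained by multiplying the spectrum of the open-ocean
        record by the site's complex transfer function and transforming back.
        """
        spectrum = np.fft.rfft(open_ocean_series)
        freqs = np.fft.rfftfreq(len(open_ocean_series), d=dt)
        site_spectrum = spectrum * transfer_fn(freqs)
        return np.fft.irfft(site_spectrum, n=len(open_ocean_series))

    # Placeholder transfer function: amplifies ~30-minute waves and adds a delay.
    def toy_transfer(freqs, gain=3.0, peak_period_s=1800.0, delay_s=3600.0):
        resonance = gain * np.exp(-((freqs - 1.0 / peak_period_s) * peak_period_s) ** 2)
        return (1.0 + resonance) * np.exp(-2j * np.pi * freqs * delay_s)

    dt = 60.0                                      # 1-minute sampling
    t = np.arange(0, 6 * 3600, dt)                 # six hours of record
    dart = 0.05 * np.sin(2 * np.pi * t / 1800.0)   # 5 cm open-ocean wave (synthetic)
    coastal = forecast_at_site(dart, toy_transfer, dt)
    print(f"peak open-ocean amplitude: {dart.max():.3f} m, predicted at site: {coastal.max():.3f} m")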
ASTARTE: Assessment Strategy and Risk Reduction for Tsunamis in Europe
NASA Astrophysics Data System (ADS)
Baptista, M. A.; Yalciner, A. C.; Canals, M.
2014-12-01
Tsunamis are low-frequency but high-impact natural disasters. In 2004, the Boxing Day tsunami killed hundreds of thousands of people from many nations along the coastlines of the Indian Ocean. Tsunami run-up exceeded 35 m. Seven years later, and in spite of some of the best warning technologies and levels of preparedness in the world, the Tohoku-Oki tsunami in Japan dramatically showed the limitations of scientific knowledge on tsunami sources, coastal impacts and mitigation measures. The experience from Japan raised serious questions on how to improve the resilience of coastal communities, to upgrade the performance of coastal defenses, to adopt better risk management, and also on the strategies and priorities for the reconstruction of damaged coastal areas. Societal resilience requires the reinforcement of capabilities to manage and reduce risk at national and local scales. ASTARTE (Assessment STrategy And Risk for Tsunami in Europe), a 36-month FP7 project, aims to develop a comprehensive strategy to mitigate tsunami impact in this region. To achieve this goal, an interdisciplinary consortium has been assembled. It includes all CTWPs of NEAM and expert institutions across Europe and worldwide. ASTARTE will improve i) basic knowledge of tsunami generation and recurrence going beyond simple catalogues, with novel empirical data and new statistical analyses for assessing long-term recurrence and hazards of large events in sensitive areas of NEAM, ii) numerical techniques for tsunami simulation, with a focus on real-time codes and novel statistical emulation approaches, and iii) methods for assessment of hazard, vulnerability, and risk. ASTARTE will also provide i) guidelines for tsunami Eurocodes, ii) better tools for forecast and warning for CTWPs and NTWCs, and iii) guidelines for decision makers to increase sustainability and resilience of coastal communities. In summary, ASTARTE will develop basic scientific and technical elements allowing for a significant enhancement of the Tsunami Warning System in the NEAM region in terms of monitoring, early warning and forecast, governance and resilience. This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3)
NASA Astrophysics Data System (ADS)
Takagawa, T.
2017-12-01
A rapid and precise tsunami forecast based on offshore monitoring is attracting attention as a way to reduce human losses due to devastating tsunami inundation. We developed a forecast method based on the combination of hierarchical Bayesian inversion with a pre-computed database and rapid post-computing of tsunami inundation. The method was applied to Tokyo Bay to evaluate the efficiency of observation arrays against three tsunamigenic earthquakes: a scenario earthquake at the Nankai trough and the historical Genroku (1703) and Enpo (1677) earthquakes. In general, a rich observation array near the tsunami source has an advantage in both the accuracy and the rapidity of a tsunami forecast. To examine the effect of observation time length, we used four types of data with lengths of 5, 10, 20 and 45 minutes after the earthquake occurrences. Prediction accuracy of tsunami inundation was evaluated against the simulated tsunami inundation areas around Tokyo Bay due to the target earthquakes. The shortest time length needed for accurate prediction varied with the target earthquake; here, accurate prediction means the simulated values fall within the 95% credible intervals of the prediction. In the Enpo case, a 5-minute observation is enough for accurate prediction for Tokyo Bay, but 10 minutes and 45 minutes are needed in the Nankai trough and Genroku cases, respectively. The shortest time length for accurate prediction shows a strong relationship with the relative distance between the tsunami source and the observation arrays. In the Enpo case, offshore tsunami observation points are densely distributed even in the source region, so accurate prediction can be achieved within 5 minutes; this precise prediction is useful for early warnings. Even in the worst case of Genroku, where fewer observation points are available near the source, accurate prediction can be obtained within 45 minutes. This information can be useful for outlining the hazard at an early stage of the response.
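A much-simplified sketch of the inversion idea described above (not the hierarchical Bayesian scheme itself): with a pre-computed database of Green's functions G mapping unit sources to offshore observation samples, a Gaussian prior on source weights gives a closed-form posterior, from which 95% credible intervals for the predicted waveforms follow. All matrices below are random placeholders.

    import numpy as np

    def gaussian_posterior(G, d, obs_sigma=0.05, prior_sigma=1.0):
        """Posterior mean and covariance of source weights m for d = G m + noise,
        with independent Gaussian noise and a zero-mean Gaussian prior on m."""
        n = G.shape[1]
        precision = G.T @ G / obs_sigma**2 + np.eye(n) / prior_sigma**2
        cov = np.linalg.inv(precision)
        mean = cov @ G.T @ d / obs_sigma**2
        return mean, cov

    rng = np.random.default_rng(0)
    G = rng.normal(size=(30, 5))          # 30 offshore samples, 5 unit sources (placeholder)
    m_true = np.array([1.2, 0.0, 0.8, 0.0, 0.3])
    d = G @ m_true + rng.normal(scale=0.05, size=30)

    mean, cov = gaussian_posterior(G, d)
    pred = G @ mean
    pred_sd = np.sqrt(np.einsum("ij,jk,ik->i", G, cov, G))   # predictive std (model part only)
    lower, upper = pred - 1.96 * pred_sd, pred + 1.96 * pred_sd
    print("estimated weights:", np.round(mean, 2))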
Signals in the ionosphere generated by tsunami earthquakes: observations and modeling support
NASA Astrophysics Data System (ADS)
Rolland, L.; Sladen, A.; Mikesell, D.; Larmat, C. S.; Rakoto, V.; Remillieux, M.; Lee, R.; Khelfi, K.; Lognonne, P. H.; Astafyeva, E.
2017-12-01
Forecasting systems failed to predict the magnitude of the 2011 great tsunami in Japan due to the difficulty and cost of instrumenting the ocean with high-quality and dense networks. Melgar et al. (2013) show that using all of the conventional data (inland seismic, geodetic, and tsunami gauges) with the best inversion method still fails to predict the correct height of the tsunami before it breaks onto a coast near the epicenter (< 500 km). On the other hand, in the last decade, scientists have gathered convincing evidence of transient signals in ionospheric Total Electron Content (TEC) observations that are associated with open-ocean tsunami waves. Even though typical tsunami waves are only a few centimeters high, they are powerful enough to create atmospheric vibrations extending all the way to the ionosphere, 300 kilometers up in the atmosphere. Therefore, we are proposing to incorporate the ionospheric signals into tsunami early-warning systems. We anticipate that the method could be decisive for mitigating "tsunami earthquakes", which trigger tsunamis larger than expected from their short-period magnitude. These events are challenging to characterize as they rupture the near-trench subduction interface, in a distant region less constrained by onshore data. As a couple of devastating tsunami earthquakes happen per decade, they represent a real threat for onshore populations and a challenge for tsunami early-warning systems. We will present the TEC observations of the recent Java 2006 and Mentawai 2010 tsunami earthquakes and base our analysis on acoustic ray tracing, normal-mode summation and the simulation code SPECFEM, which solves the wave equation in coupled acoustic (ocean, atmosphere) and elastic (solid earth) domains. Rupture histories are entered as finite source models, which will allow us to evaluate the effect of a relatively slow rupture on the surrounding ocean and atmosphere.
Defining Tsunami Magnitude as Measure of Potential Impact
NASA Astrophysics Data System (ADS)
Titov, V. V.; Tang, L.
2016-12-01
The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from the available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of tsunami magnitude estimation, including collection of the vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations. Uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since real-time seismic data are available for real-time processing and an ample amount of seismic data is available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake energy and the generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful for tsunami warning as a quick estimate of tsunami impact and for post-event analysis as a universal scale for tsunami inter-comparison. We present a method for estimating tsunami magnitude based on tsunami energy and present an application of the magnitude analysis to several historical events for inter-comparison with existing methods.
NASA Astrophysics Data System (ADS)
Gailler, A.; Hébert, H.; Schindelé, F.; Reymond, D.
2017-11-01
Tsunami modeling tools in the French tsunami Warning Center operational context provide rapidly derived warning levels with a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test-site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep ocean tsunami amplitude and using a transfer function derived from the Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here defined regarding high resolution nested grids simulations. The preliminary results for the Nice test site on the basis of nine historical and synthetic sources show a good agreement with the time-consuming high resolution modeling: the linear approximation is obtained within 1 min in general and provides estimates within a factor of two in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude is something that cannot be really assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat forecast.
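The Green's law scaling behind the transfer function mentioned above can be written down directly. A minimal sketch follows, with the amplification coefficient treated as a tunable site parameter (the abstract indicates it is calibrated against high-resolution nested-grid simulations) and the numerical values purely illustrative.

    def greens_law_coastal_height(offshore_amplitude_m, offshore_depth_m,
                                  coastal_depth_m=1.0, site_coefficient=1.0):
        """Green's law: amplitude grows as (h_offshore / h_coastal)**(1/4) as depth shoals.
        site_coefficient is an empirical amplification factor calibrated per coastal point."""
        return site_coefficient * offshore_amplitude_m * (offshore_depth_m / coastal_depth_m) ** 0.25

    # Illustrative numbers only: a 0.2 m offshore amplitude over 2500 m of water,
    # extrapolated to a nominal 1 m coastal depth with a site coefficient of 1.0.
    print(f"{greens_law_coastal_height(0.2, 2500.0):.2f} m")   # about 1.41 m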
NASA Astrophysics Data System (ADS)
Gailler, A.; Schindelé, F.; Hebert, H.; Reymond, D.
2017-12-01
Tsunami modeling tools in the French tsunami Warning Center operational context currently provide warning levels as a dimensionless variable at basin scale. A new forecast method based on coastal amplification laws has been tested to estimate the tsunami onshore height, with a focus on the French Riviera test-site (Nice area). This fast prediction tool provides a coastal tsunami height distribution, calculated from the numerical simulation of the deep ocean tsunami amplitude and using a transfer function derived from the Green's law. Due to a lack of tsunami observations in the western Mediterranean basin, coastal amplification parameters are here defined with respect to high resolution nested grid simulations. The first encouraging results for the Nice test site, on the basis of nine historical and synthetic sources, show a good agreement with the time-consuming high resolution modeling: the linear approximation is generally obtained within 1 minute and provides estimates within a factor of two in amplitude, although the resonance effects in harbors and bays are not reproduced. In Nice harbor especially, variation in tsunami amplitude cannot be fully assessed because of the magnitude range and maximum energy azimuth of possible events to account for. However, this method is well suited for a fast first estimate of the coastal tsunami threat forecast.
NASA Astrophysics Data System (ADS)
Ulutas, E.; Inan, A.; Annunziato, A.
2012-06-01
This study analyzes the response of the Global Disaster Alert and Coordination System (GDACS) in relation to a case study: the Kepulauan Mentawai earthquake and related tsunami, which occurred on 25 October 2010. The GDACS, developed by the European Commission Joint Research Center, combines existing web-based disaster information management systems with the aim of alerting the international community in case of major disasters. The tsunami simulation system is an integral part of the GDACS. In more detail, the study aims to assess the tsunami hazard on the Mentawai and Sumatra coasts: the tsunami heights and arrival times have been estimated employing three propagation models based on long-wave theory. The analysis was performed in three stages: (1) pre-calculated simulations by using the tsunami scenario database for that region, used by the GDACS system to estimate the alert level; (2) near-real-time simulated tsunami forecasts, automatically performed by the GDACS system whenever a new earthquake is detected by the seismological data providers; and (3) post-event tsunami calculations using GCMT (Global Centroid Moment Tensor) fault mechanism solutions proposed by the US Geological Survey (USGS) for this event. The GDACS system estimates the alert level based on the first type of calculations and on that basis sends alert messages to its users; the second type of calculations is available within 30-40 min after the notification of the event but does not change the estimated alert level. The third type of calculations is performed to improve the initial estimations and to have a better understanding of the extent of the possible damage. The automatic alert level for the earthquake was given between Green and Orange Alert, which, in the logic of GDACS, means no need or moderate need of international humanitarian assistance; however, the earthquake generated 3 to 9 m of tsunami run-up along the southwestern coasts of the Pagai Islands, where 431 people died. The post-event calculations indicated medium-high humanitarian impacts.
NASA Astrophysics Data System (ADS)
Stroker, Kelly; Dunbar, Paula; Mungov, George; Sweeney, Aaron; McCullough, Heather; Carignan, Kelly
2015-04-01
The National Oceanic and Atmospheric Administration (NOAA) has primary responsibility in the United States for tsunami forecast, warning, research, and supports community resiliency. NOAA's National Geophysical Data Center (NGDC) and co-located World Data Service for Geophysics provide a unique collection of data enabling communities to ensure preparedness and resilience to tsunami hazards. Immediately following a damaging or fatal tsunami event there is a need for authoritative data and information. The NGDC Global Historical Tsunami Database (http://www.ngdc.noaa.gov/hazard/) includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. The long-term data from these events, including photographs of damage, provide clues to what might happen in the future. NGDC catalogs the information on global historical tsunamis and uses these data to produce qualitative tsunami hazard assessments at regional levels. In addition to the socioeconomic effects of a tsunami, NGDC also obtains water level data from the coasts and the deep-ocean at stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services, the NOAA Tsunami Warning Centers, and the National Data Buoy Center (NDBC) and produces research-quality data to isolate seismic waves (in the case of the deep-ocean sites) and the tsunami signal. These water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC is also building high-resolution digital elevation models (DEMs) to support real-time forecasts, implemented at 75 US coastal communities. After a damaging or fatal event NGDC begins to collect and integrate data and information from many organizations into the hazards databases. Sources of data include our NOAA partners, the U.S. Geological Survey, the UNESCO Intergovernmental Oceanographic Commission (IOC) and International Tsunami Information Center, Smithsonian Institution's Global Volcanism Program, news organizations, etc. NGDC assesses the data and then works to promptly distribute the data and information. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource, posted in an online report, which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent data and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public. This paper will demonstrate the extensive collection of data and how it is used.
NASA Astrophysics Data System (ADS)
Cienfuegos, R.; Catalan, P. A.; Leon, J.; Gonzalez, G.; Repetto, P.; Urrutia, A.; Tomita, T.; Orellana, V.
2016-12-01
In the wake of the 2010 tsunami that hit Chile, a major public effort to promote interdisciplinary disaster research was undertaken by the Comisión Nacional de Investigación Científica y Tecnológica (Conicyt), allocating funds to create the Center for Integrated Research on Natural Risks Management (CIGIDEN). This effort has been key in promoting collaboration between national and international research teams in order to transform the frequent occurrence of extreme events that affect Chile into an opportunity for interdisciplinary research. In this presentation we will summarize some of the fundamental research findings regarding tsunami forecasting, alerting, and evacuation processes based on interdisciplinary field work campaigns and modeling efforts conducted in the wake of the three most recent destructive events that hit Chile in 2010, 2014, and 2015. One of the main results that we shall emphasize from these findings is that while research and operational efforts to model and forecast tsunamis are important, technological positivism should not undermine educational efforts that have proved to be effective in reducing casualties due to tsunamis in the near field. Indeed, in recent events that hit Chile, the first tsunami waves reached the coasts adjacent to the generation zones in time scales comparable with the required time for data gathering and modeling even for the most sophisticated early warning tsunami algorithms currently available. The latter emphasizes self-evacuation from coastal areas, while forecasting and monitoring tsunami hazards remain very important for alerting more distant areas, and are essential for alert cancellation, especially when shelf and embayment resonance and edge wave propagation may produce destructive late tsunami arrivals several hours after the nucleation of the earthquake. By combining some of the recent evidence we have gathered in Chile on seismic source uncertainties (both epistemic and aleatoric), tsunami hydrodynamics, the response of official national institutions in charge of emergency management, and the evacuation processes observed, we will attempt to bring some elements to the discussion of the complex balance between technological positivism and risk awareness and education programs that may help prioritize funding efforts in tsunami-prone regions.
Real-time Inversion of Tsunami Source from GNSS Ground Deformation Observations and Tide Gauges.
NASA Astrophysics Data System (ADS)
Arcas, D.; Wei, Y.
2017-12-01
Over the last decade, the NOAA Center for Tsunami Research (NCTR) has developed an inversion technique to constrain tsunami sources based on the use of Green's functions in combination with data reported by NOAA's Deep-ocean Assessment and Reporting of Tsunamis (DART®) systems. The system has consistently proven effective in providing highly accurate tsunami forecasts of wave amplitude throughout an entire basin. However, improvement is necessary in two critical areas: reduction of data latency for near-field tsunami predictions and reduction of maintenance cost of the network. Two types of sensors have been proposed as supplementary to the existing network of DART® systems: Global Navigation Satellite System (GNSS) stations and coastal tide gauges. The use of GNSS stations to provide autonomous geo-spatial positioning at specific sites during an earthquake has been proposed in recent years to supplement the DART® array in tsunami source inversion. GNSS technology has the potential to provide substantial contributions in the two critical areas of DART® technology where improvement is most necessary. The present study uses GNSS ground displacement observations of the 2011 Tohoku-Oki earthquake in combination with the NCTR operational database of Green's functions to produce a rapid estimate of the tsunami source based on GNSS observations alone. The solution is then compared with that obtained via DART® data inversion, and the difficulties in obtaining an accurate GNSS-based solution are underlined. The study also identifies the set of conditions required for source inversion from coastal tide gauges, using the degree of nonlinearity of the signal as a primary criterion. We then proceed to identify the conditions and scenarios under which a particular gauge could be used to invert a tsunami source.
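A schematic of the Green's-function inversion referred to above, assuming pre-computed unit-source waveforms at the observation points and solving for non-negative source weights by least squares; the operational NCTR system and its GNSS extension are more elaborate, and the arrays here are synthetic placeholders.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)

    # Columns of G: pre-computed unit-source time series stacked over all
    # observation points (DART records and/or GNSS-derived offsets). Placeholder data.
    n_samples, n_unit_sources = 200, 6
    G = rng.normal(size=(n_samples, n_unit_sources))
    true_slip = np.array([0.0, 2.5, 1.0, 0.0, 0.0, 0.5])      # meters of slip per unit source
    observations = G @ true_slip + rng.normal(scale=0.1, size=n_samples)

    # Non-negative least squares keeps the slip physically one-signed on each unit fault.
    slip, residual_norm = nnls(G, observations)
    print("recovered slip per unit source:", np.round(slip, 2))
    print("residual norm:", round(residual_norm, 3))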
Forecasting database for the tsunami warning regional center for the western Mediterranean Sea
NASA Astrophysics Data System (ADS)
Gailler, A.; Hebert, H.; Loevenbruck, A.; Hernandez, B.
2010-12-01
Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed, but they present a challenge to run in real time, partly due to computational limitations and also to a lack of detailed knowledge of the earthquake rupture parameters. Through the establishment of the tsunami warning regional center for the NE Atlantic and the western Mediterranean Sea, the CEA is in particular in charge of rapidly providing a map, with uncertainties, showing the zones in the main axis of energy at the Mediterranean scale. The strategy is based initially on a pre-computed tsunami scenario database, as source parameters available a short time after an earthquake occurs are preliminary and may be somewhat inaccurate. Existing numerical models are good enough to provide useful guidance for warning centers to disseminate quickly. When an event occurs, an appropriate variety of offshore tsunami propagation scenarios may be recalled through an automatic interface by combining pre-computed propagation solutions (single or multiple sources). This approach would provide quick estimates of tsunami offshore propagation, and aid hazard assessment and evacuation decision-making. As numerical model accuracy is inherently limited by errors in bathymetry and topography, and as inundation map calculation is more complex and expensive in terms of computational time, only tsunami offshore propagation modeling will be included in the forecasting database, using a single sparse bathymetric computation grid for the numerical modeling. Because of the large variability in the mechanisms of tsunamigenic earthquakes, not all possible magnitudes can be represented in the scenario database. In principle, an infinite number of tsunami propagation scenarios can be constructed by linear combinations of a finite number of pre-computed unit scenarios. The whole notion of a pre-computed forecasting database also requires a historical earthquake and tsunami database, as well as an up-to-date seismotectonic database including fault geometry and a zonation based on a seismotectonic synthesis of source zones and tsunamigenic faults. Our forecast strategy is thus based on a unit source function methodology, whereby the model runs are combined and scaled linearly to produce any composite tsunami propagation solution. Each unit source function is equivalent to a tsunami generated by a Mo 1.75E+19 N.m earthquake (Mw ~6.8) with a rectangular fault 25 km by 20 km in size and 1 m in slip. The faults of the unit functions are placed adjacent to each other, following the discretization of the main seismogenic faults bounding the western Mediterranean basin. The number of unit functions involved varies with the magnitude of the desired composite solution, and the combined waveheights are multiplied by a given scaling factor to produce the new arbitrary scenario. Some test-case examples are presented (e.g., Boumerdès 2003 [Algeria, Mw 6.8], Djijel 1856 [Algeria, Mw 7.2], Ligure 1887 [Italy, Mw 6.5-6.7]).
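To make the unit-source scaling concrete, here is a small sketch that uses the stated unit-source moment of 1.75E+19 N.m to compute the factor by which the summed unit waveheights would be multiplied for a target magnitude; the even split of moment across the selected unit functions is an illustrative simplification, not the operational rule.

    import math

    UNIT_MOMENT = 1.75e19   # N.m, the Mw ~6.8 unit source described in the abstract

    def moment_from_mw(mw):
        """Standard moment-magnitude relation, Mo in N.m: Mw = (2/3) * (log10 Mo - 9.1)."""
        return 10 ** (1.5 * mw + 9.1)

    def combination_scale(target_mw, n_units):
        """Scaling factor applied to the summed unit waveheights so that the composite
        carries the target seismic moment, split evenly over n_units adjacent faults."""
        return moment_from_mw(target_mw) / (n_units * UNIT_MOMENT)

    # Example: approximate a Mw 7.2 event (Djijel 1856-like) with 4 adjacent unit faults.
    scale = combination_scale(7.2, n_units=4)
    print(f"multiply each unit waveheight by {scale:.2f}")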
Tsunamis: stochastic models of occurrence and generation mechanisms
Geist, Eric L.; Oglesby, David D.
2014-01-01
The devastating consequences of the 2004 Indian Ocean and 2011 Japan tsunamis have led to increased research into many different aspects of the tsunami phenomenon. In this entry, we review research related to the observed complexity and uncertainty associated with tsunami generation, propagation, and occurrence described and analyzed using a variety of stochastic methods. In each case, seismogenic tsunamis are primarily considered. Stochastic models are developed from the physical theories that govern tsunami evolution combined with empirical models fitted to seismic and tsunami observations, as well as tsunami catalogs. These stochastic methods are key to providing probabilistic forecasts and hazard assessments for tsunamis. The stochastic methods described here are similar to those described for earthquakes (Vere-Jones 2013) and volcanoes (Bebbington 2013) in this encyclopedia.
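As one concrete instance of the stochastic occurrence models surveyed above, tsunami (or tsunamigenic earthquake) occurrence is often treated as a stationary Poisson process in probabilistic hazard work; the rate used below is a made-up placeholder, not a value from the entry.

    import math

    def poisson_exceedance_probability(annual_rate, exposure_years):
        """Probability of at least one event in the exposure window, assuming
        occurrences follow a stationary Poisson process with the given annual rate."""
        return 1.0 - math.exp(-annual_rate * exposure_years)

    # Placeholder rate: one damaging tsunami per 200 years at a hypothetical site.
    rate = 1.0 / 200.0
    for window in (30, 50, 100):
        p = poisson_exceedance_probability(rate, window)
        print(f"P(at least one event in {window:3d} yr) = {p:.2f}")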
Solomon Islands 2007 Tsunami Near-Field Modeling and Source Earthquake Deformation
NASA Astrophysics Data System (ADS)
Uslu, B.; Wei, Y.; Fritz, H.; Titov, V.; Chamberlin, C.
2008-12-01
The earthquake of 1 April 2007 left behind momentous footage of crustal rupture and tsunami impact along the coastline of the Solomon Islands (Fritz and Kalligeris, 2008; Taylor et al., 2008; McAdoo et al., 2008; PARI, 2008), while the undisturbed tsunami signals were also recorded at nearby deep-ocean tsunameters and coastal tide stations. These multi-dimensional measurements provide valuable datasets to tackle the challenging aspects of the tsunami source directly by inversion from tsunameter records in real time (available in a time frame of minutes), and its relationship with the seismic source derived either from the seismometer records (available in a time frame of hours or days) or from the crustal rupture measurements (available in a time frame of months or years). The tsunami measurements in the near field, including the complex vertical crust motion and tsunami runup, are particularly critical to help interpret the tsunami source. This study develops high-resolution inundation models for the Solomon Islands to compute the near-field tsunami impact. Using these models, this research compares the tsunameter-derived tsunami source with the seismically derived earthquake sources from comprehensive perspectives, including vertical uplift and subsidence, tsunami runup heights and their distributional pattern among the islands, deep-ocean tsunameter measurements, and near- and far-field tide gauge records. The present study stresses the significance of the tsunami magnitude, source location, bathymetry and topography in accurately modeling the generation, propagation and inundation of the tsunami waves. This study highlights the accuracy and efficiency of the tsunameter-derived tsunami source in modeling the near-field tsunami impact. As the high-resolution models developed in this study will become part of NOAA's tsunami forecast system, these results also suggest expanding the system for potential applications in tsunami hazard assessment, search and rescue operations, as well as event and post-event planning in the Solomon Islands.
Towards Operational Meteotsunami Early Warning System: the Adriatic Project MESSI
NASA Astrophysics Data System (ADS)
Vilibic, I.; Sepic, J.; Denamiel, C. L.; Mihanovic, H.; Muslim, S.; Tudor, M.; Ivankovic, D.; Jelavic, D.; Kovacevic, V.; Masce, T.; Dadic, V.; Gacic, M.; Horvath, K.; Monserrat, S.; Rabinovich, A.; Telisman-Prtenjak, M.
2017-12-01
A number of destructive meteotsunamis - atmospherically driven long ocean waves in a tsunami frequency band - occurred during the last decade throughout the world's oceans. Owing to the significant damage caused by these meteotsunamis, several scientific groups (occasionally in collaboration with public offices) have started developing meteotsunami warning systems. Creation of one such system was initiated in late 2015 within the MESSI (Meteotsunamis, destructive long ocean waves in the tsunami frequency band: from observations and simulations towards a warning system) project. The main goal of this project is to build a prototype of a meteotsunami warning system for the eastern Adriatic coast. The system will be based on real-time measurements, operational atmosphere and ocean modeling and a real-time decision-making process. The envisioned MESSI meteotsunami warning system consists of three modules: (1) a synoptic warning module, which will use established correlations between forecasted synoptic fields and high-frequency sea level oscillations to provide qualitative meteotsunami forecasts for up to a week in advance, (2) a probabilistic pre-modeling prediction module, which will use an operational WRF-ROMS-ADCIRC modeling system and compare the forecast with an atlas of presimulations to produce a probabilistic meteotsunami forecast for up to three days in advance, and (3) a real-time module, based on real-time tracking of the properties of air-pressure disturbances (amplitude, speed, direction, period, ...) and their real-time comparison with the atlas of meteotsunami simulations. The system will be tested on recent meteotsunami events which were recorded in the MESSI area shortly after the operational meteotsunami network installation. Albeit complex, such a multilevel warning system has the potential to be adapted to most meteotsunami hot spots, simply by tuning the system parameters to the available atmospheric and ocean data.
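A toy version of the first step of the real-time module described above (tracking air-pressure disturbance properties): flag a meteotsunami-relevant disturbance when the pressure change over a short sliding window exceeds a threshold. The 2 hPa per 10 minutes threshold and the synthetic record are illustrative assumptions, not MESSI's operational settings.

    import numpy as np

    def detect_pressure_jumps(pressure_hpa, dt_s=60.0, window_s=600.0, threshold_hpa=2.0):
        """Return sample indices where the pressure change across a sliding window
        exceeds the threshold (a simple proxy for a meteotsunami-generating disturbance)."""
        window = int(window_s / dt_s)
        jumps = pressure_hpa[window:] - pressure_hpa[:-window]
        return np.flatnonzero(np.abs(jumps) >= threshold_hpa) + window

    # Synthetic 1-minute record: flat pressure with a ~3 hPa jump over about 8 minutes.
    t = np.arange(0, 3 * 3600, 60.0)
    p = 1013.0 + 3.0 / (1.0 + np.exp(-(t - 5400.0) / 120.0))
    hits = detect_pressure_jumps(p)
    print("disturbance flagged" if hits.size else "no disturbance",
          f"(first at t = {t[hits[0]]/60:.0f} min)" if hits.size else "")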
NASA Astrophysics Data System (ADS)
Williamson, Amy L.; Newman, Andrew V.
2018-05-01
Over the past decade, the number of open-ocean gauges capable of parsing information about a passing tsunami has steadily increased, particularly through national cable networks and international buoyed efforts such as the Deep-ocean Assessment and Reporting of Tsunami (DART). This information is analyzed to disseminate tsunami warnings to affected regions. However, most current warnings that incorporate tsunami observations are directed at mid- and far-field localities. In this study, we analyze the regions surrounding four seismically active subduction zones, Cascadia, Japan, Chile, and Java, for their potential to facilitate local tsunami early warning using such systems. We assess which regions currently have instrumentation in the right locations for direct tsunami observations with enough time to provide useful warning to the nearest affected coastline—and which are poorly suited for such systems. Our primary findings are that while some regions are ill-suited for this type of early warning, such as the coastlines of Chile, other localities, like Java, Indonesia, could incorporate direct tsunami observations into their hazard forecasts with enough lead time to be effective for coastal community emergency response. We take into account the effect of tsunami propagation with regard to shallow bathymetry on the fore-arc as well as the effect of earthquake source placement. While it is impossible to account for every type of off-shore tsunamigenic event in these locales, this study aims to characterize a typical large tsunamigenic event occurring in the shallow part of the megathrust as a guide to what is feasible with early tsunami warning.
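A back-of-the-envelope version of the feasibility test described above, comparing how long a tsunami takes to reach an offshore gauge versus the nearest coastline using the long-wave speed sqrt(g*h) along each path; the depths, distances, and processing delay are invented, and real assessments use full propagation modeling.

    import math

    def travel_time_minutes(segments):
        """Sum travel time over (distance_km, mean_depth_m) segments at speed sqrt(g*h)."""
        g = 9.81
        return sum(1000.0 * d_km / math.sqrt(g * h_m) for d_km, h_m in segments) / 60.0

    # Invented geometry: gauge 100 km seaward of the source over ~4000 m of water;
    # coast 80 km landward of the source across a shallowing fore-arc and shelf.
    t_gauge = travel_time_minutes([(100.0, 4000.0)])
    t_coast = travel_time_minutes([(50.0, 2000.0), (30.0, 100.0)])
    processing = 5.0   # assumed minutes to detect, invert, and issue a local alert

    lead_time = t_coast - (t_gauge + processing)
    print(f"gauge detection at ~{t_gauge:.0f} min, coast arrival at ~{t_coast:.0f} min")
    print(f"usable lead time: {lead_time:.0f} min" if lead_time > 0 else "no usable lead time")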
NASA Astrophysics Data System (ADS)
Wood, N. J.; Schmidtlein, M.; Schelling, J.; Jones, J.; Ng, P.
2012-12-01
Recent tsunami disasters, such as the 2010 Chilean and 2011 Tohoku events, demonstrate the significant life loss that can occur from tsunamis. Many coastal communities in the world are threatened by near-field tsunami hazards that may inundate low-lying areas only minutes after a tsunami begins. Geospatial integration of demographic data and hazard zones has identified potential impacts on populations in communities susceptible to near-field tsunami threats. Pedestrian-evacuation models build on these geospatial analyses to determine if individuals in tsunami-prone areas will have sufficient time to reach high ground before tsunami-wave arrival. Areas where successful evacuations are unlikely may warrant vertical-evacuation (VE) strategies, such as berms or structures designed to aid evacuation. The decision of whether and where VE strategies are warranted is complex. Such decisions require an interdisciplinary understanding of tsunami hazards, land cover conditions, demography, community vulnerability, pedestrian-evacuation models, land-use and emergency-management policy, and decision science. Engagement with the at-risk population and local emergency managers in VE planning discussions is critical because resulting strategies include permanent structures within a community and their local ownership helps ensure long-term success. We present a summary of an interdisciplinary approach to assess VE options in communities along the southwest Washington coast (U.S.A.) that are threatened by near-field tsunami hazards generated by Cascadia subduction zone earthquakes. Pedestrian-evacuation models based on an anisotropic approach that uses path-distance algorithms were merged with population data to forecast the distribution of at-risk individuals within several communities as a function of travel time to safe locations. A series of community-based workshops helped identify potential VE options in these communities, collectively known as "Project Safe Haven" at the State of Washington Emergency Management Division. Models of the influence of stakeholder-driven VE options identified changes in the type and distribution of at-risk individuals. Insights from VE use and performance as an aid to evacuations from the 2011 Tohoku tsunami helped to inform the meetings and the analysis. We developed geospatial tools to automate parts of the pedestrian-evacuation models to support the iterative process of developing VE options and forecasting changes in population exposure. Our summary presents the interdisciplinary effort to forecast population impacts from near-field tsunami threats and to develop effective VE strategies to minimize fatalities in future events.
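A stripped-down version of the pedestrian-evacuation check described above: convert a path to safety into a travel time with a slope-dependent (anisotropic) walking speed and compare it with the forecast wave-arrival time. Tobler's hiking function is used here only as a stand-in for the anisotropic speed surface; the actual models also account for land cover and apply path-distance algorithms over the full terrain, and the route and arrival time below are invented.

    import math

    def tobler_speed_m_per_s(slope):
        """Tobler's hiking function: walking speed as a function of path slope (rise/run)."""
        kmh = 6.0 * math.exp(-3.5 * abs(slope + 0.05))
        return kmh * 1000.0 / 3600.0

    def can_evacuate(path_segments, arrival_time_min, depart_delay_min=5.0):
        """path_segments: list of (length_m, slope) pairs along the route to high ground."""
        walk_s = sum(length / tobler_speed_m_per_s(slope) for length, slope in path_segments)
        total_min = depart_delay_min + walk_s / 60.0
        return total_min <= arrival_time_min, total_min

    # Invented route: 800 m of flat ground then a 200 m climb at a 10% grade,
    # against a 25-minute forecast arrival time for a near-field tsunami.
    ok, minutes = can_evacuate([(800.0, 0.0), (200.0, 0.10)], arrival_time_min=25.0)
    print(f"travel + delay: {minutes:.1f} min -> "
          f"{'high ground reachable' if ok else 'vertical evacuation may be needed'}")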
Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System
NASA Astrophysics Data System (ADS)
Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.
2014-12-01
The Tsunami Information Technology Modernization (TIM) is a project spearheaded by the National Oceanic and Atmospheric Administration to update the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Ewa Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near real-time seismic processing aspect of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide for better tsunami threat assessments, we are developing open source tools for magnitude estimations (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g. W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic community standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or could be easily integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that the developer or scientist can efficiently create upgrades, replacements, and/or new metrics to the seismic data processing with only a cursory knowledge of the underlying SC3.
Making Multi-Level Tsunami Evacuation Playbooks Operational in California and Hawaii
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Peterson, D.; Fryer, G. J.; Miller, K.; Nicolini, T.; Popham, C.; Richards, K.; Whitmore, P.; Wood, N. J.
2016-12-01
In the aftermath of the 2010 Chile, 2011 Japan, and 2012 Haida Gwaii tsunamis in California and Hawaii, coastal emergency managers requested that state and federal tsunami programs investigate providing more detailed information about the flood potential and recommended evacuation for distant-source tsunamis well ahead of their arrival time. Evacuation "Playbooks" for tsunamis of variable sizes and source locations have been developed for some communities in the two states, providing secondary options to an all or nothing approach for evacuation. Playbooks have been finalized for nearly 70% of the coastal communities in California, and have been drafted for evaluation by the communities of Honolulu and Hilo in Hawaii. A key component to determining a recommended level of evacuation during a distant-source tsunami and making the Playbooks operational has been the development of the "FASTER" approach, an acronym for factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and the Run-up potential. Within the first couple hours after a tsunami is generated, the FASTER flood elevation value will be computed and used to select the appropriate minimum tsunami phase evacuation "Playbook" for use by the coastal communities. The states of California and Hawaii, the tsunami warning centers, and local weather service offices are working together to deliver recommendations on the appropriate evacuation Playbook plans for communities to use prior to the arrival of a distant-source tsunami. These partners are working closely with individual communities on developing conservative and consistent protocols on the use of the Playbooks. Playbooks help provide a scientifically-based, minimum response for small- to moderate-size tsunamis which could reduce the potential for over-evacuation of hundreds of thousands of people and save hundreds of millions of dollars in evacuation costs for communities and businesses.
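The abstract above lists the FASTER factors but not their exact combination; a plausible reading, sketched below as an assumption, is that the factors combine into a conservative flood elevation that is then compared against pre-set Playbook evacuation levels. The additive rule, the run-up multiplier, and the tier thresholds are invented for illustration, not the California/Hawaii operational values.

    def faster_flood_elevation(forecast_amplitude, storm_surge, tide, forecast_error,
                               runup_factor=1.0):
        """Combine the FASTER factors into a conservative flood elevation (meters above datum).
        Here the run-up potential is treated as a multiplier on the forecast amplitude;
        the actual operational formulation may differ."""
        return forecast_amplitude * runup_factor + storm_surge + tide + forecast_error

    def select_playbook(flood_elevation, tiers=((1.0, "beach/harbor phase"),
                                                (3.0, "low-lying area phase"),
                                                (float("inf"), "full evacuation"))):
        """Pick the smallest Playbook phase whose threshold covers the flood elevation
        (thresholds are illustrative placeholders)."""
        for threshold, phase in tiers:
            if flood_elevation <= threshold:
                return phase

    level = faster_flood_elevation(forecast_amplitude=0.8, storm_surge=0.2,
                                   tide=0.6, forecast_error=0.3, runup_factor=1.5)
    print(f"FASTER flood elevation: {level:.1f} m -> {select_playbook(level)}")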
Water level ingest, archive and processing system - an integral part of NOAA's tsunami database
NASA Astrophysics Data System (ADS)
McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access for national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing, and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunamis (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation, and education efforts of NOAA and the Nation. Because of the variety of the water level data, the automatic ingest system was redesigned, and the inventory, archive, and delivery capabilities were upgraded based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing, and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.
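As a small illustration of the kind of operational quality check mentioned above, the sketch below flags spikes in a water-level series against a running median. It is a generic despiking test on synthetic data, not the actual NGDC/WDS processing chain; window length and threshold are assumed values.

```python
import numpy as np

def flag_spikes(water_level_m: np.ndarray, window: int = 11, n_sigma: float = 5.0) -> np.ndarray:
    """Return a boolean mask of samples deviating strongly from a running median.

    A generic quality-control check; operational processing applies additional
    steps (gap handling, detiding, calibration) not shown here.
    """
    n = len(water_level_m)
    half = window // 2
    padded = np.pad(water_level_m, half, mode="edge")
    running_median = np.array([np.median(padded[i:i + window]) for i in range(n)])
    residual = water_level_m - running_median
    sigma = np.std(residual)
    return np.abs(residual) > n_sigma * sigma

if __name__ == "__main__":
    t = np.arange(0, 6 * 3600, 60.0)                    # six hours of 1-minute samples
    series = 0.5 * np.sin(2 * np.pi * t / 44712.0)      # synthetic semidiurnal tide
    series[100] += 1.0                                  # an artificial 1 m spike
    print(np.where(flag_spikes(series))[0])             # expected to flag index 100
```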
Developing Tsunami Evacuation Plans, Maps, And Procedures: Pilot Project in Central America
NASA Astrophysics Data System (ADS)
Arcos, N. P.; Kong, L. S. L.; Arcas, D.; Aliaga, B.; Coetzee, D.; Leonard, J.
2015-12-01
In the end-to-end tsunami warning chain, once a forecast is provided and a warning alert issued, communities must know what to do and where to go. The 'where to go' answer should come from reliable and practical community-level tsunami evacuation maps. Following Exercise Pacific Wave 2011, a questionnaire was sent to the 46 Member States of the Pacific Tsunami Warning System (PTWS). The results revealed that over 42 percent of Member States lacked tsunami mass coastal evacuation plans. Additionally, a significant gap in mapping was exposed, as over 55 percent of Member States lacked tsunami evacuation maps, routes, signs, and assembly points. Thus, a significant portion of countries in the Pacific lack appropriate tsunami planning and mapping for their at-risk coastal communities. While a variety of tools exist to establish tsunami inundation areas, they are applied inconsistently, and no standard methodology has been developed to assist countries in developing tsunami evacuation maps, plans, and procedures. The International Tsunami Information Center (ITIC) and partners are leading a Pilot Project in Honduras demonstrating that globally standardized tools and methodologies can be applied by a country with minimal tsunami warning and mitigation resources toward the determination of tsunami inundation areas and, subsequently, community-owned tsunami evacuation maps and plans for at-risk communities. The Pilot involves a 1- to 2-year process centered on a series of linked tsunami training workshops on evacuation planning, evacuation map development, inundation modeling and map creation, tsunami warning and emergency response Standard Operating Procedures (SOPs), and the conduct of tsunami exercises (including evacuation). The Pilot's completion is capped with a UNESCO/IOC document so that other countries can replicate the process in their tsunami-prone communities.
NASA Astrophysics Data System (ADS)
Gusman, A. R.; Setiyono, U.; Satake, K.; Fujii, Y.
2017-12-01
We built a pre-computed tsunami inundation database for Pelabuhan Ratu, one of the tsunami-prone areas on the southern coast of Java, Indonesia. The database can be employed for rapid estimation of tsunami inundation during an event. The pre-computed tsunami waveforms and inundations come from a total of 340 scenarios ranging from Mw 7.5 to 9.2, including simple fault models of 208 thrust faults and 44 tsunami earthquakes on the plate interface, as well as 44 normal faults and 44 reverse faults in the outer-rise region. Using our tsunami inundation forecasting algorithm (NearTIF), we could rapidly estimate the tsunami inundation in Pelabuhan Ratu for three different hypothetical earthquakes. The first hypothetical earthquake is a megathrust event (Mw 9.0) offshore Sumatra, about 600 km from Pelabuhan Ratu, representing a worst-case event in the far field. The second hypothetical earthquake (Mw 8.5) is based on a slip deficit rate estimated from geodetic measurements and represents the most likely large event near Pelabuhan Ratu. The third hypothetical earthquake is a tsunami-earthquake type (Mw 8.1) of the kind that often occurs south of Java. We compared the tsunami inundation maps produced by the NearTIF algorithm with the results of direct forward inundation modeling for the hypothetical earthquakes. The tsunami inundation maps produced by the two methods are similar for all three cases. However, the inundation map from the pre-computed database can be obtained in a much shorter time (1 min) than one from forward inundation modeling (40 min). These results indicate that the NearTIF algorithm based on a pre-computed inundation database is reliable and useful for tsunami warning purposes. This study also demonstrates that the NearTIF algorithm can work well even when the earthquake source is located outside the area covered by the fault model database, because it uses a time-shifting procedure in the search for the best-fit scenario.
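The core idea of the best-fit scenario search with a time shift can be sketched as below: slide each pre-computed waveform in time against a reference nearshore waveform and keep the scenario and shift with the smallest RMS misfit. This is a schematic stand-in, not the NearTIF implementation; the scenario names and synthetic waveforms are hypothetical.

```python
import numpy as np

def best_fit_scenario(observed: np.ndarray,
                      precomputed: dict[str, np.ndarray],
                      max_shift: int = 120) -> tuple[str, int, float]:
    """Find the database scenario and time shift that minimise the RMS misfit
    to a reference waveform. The real algorithm also handles amplitude scaling
    and links each matched scenario to a pre-computed inundation map."""
    best = ("", 0, np.inf)
    n = len(observed)
    for name, synth in precomputed.items():
        for shift in range(-max_shift, max_shift + 1):
            shifted = np.roll(synth[:n], shift)
            misfit = float(np.sqrt(np.mean((observed - shifted) ** 2)))
            if misfit < best[2]:
                best = (name, shift, misfit)
    return best

if __name__ == "__main__":
    t = np.linspace(0, 3600, 600)                      # 1 h of 6-s samples
    scenario_a = 0.8 * np.sin(2 * np.pi * t / 900.0)   # two hypothetical database entries
    scenario_b = 1.5 * np.sin(2 * np.pi * t / 1400.0)
    observed = np.roll(scenario_b, 40) + 0.05 * np.random.randn(600)
    print(best_fit_scenario(observed, {"thrust_Mw8.5": scenario_a, "outer_rise_Mw8.1": scenario_b}))
```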
Challenges in Defining Tsunami Wave Height
NASA Astrophysics Data System (ADS)
Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.
2017-12-01
The NOAA National Centers for Environmental Information (NCEI) and the co-located World Data Service for Geophysics maintain the global tsunami archive, consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and on the definition of maximum wave height. On September 16, 2015, an Mw 8.3 earthquake located 48 km west of Illapel, Chile, generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field, different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.
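The two definitions contrasted in this abstract can be made explicit with a short sketch: the maximum peak (largest positive excursion) versus the maximum amplitude taken as half the peak-to-trough range. The synthetic record and the use of the global peak-to-trough range (rather than successive crest-trough pairs) are simplifying assumptions for illustration only.

```python
import numpy as np

def tsunami_heights(detided_m: np.ndarray) -> tuple[float, float]:
    """Return (maximum peak, maximum amplitude) from a detided water-level series.

    Maximum peak: largest positive excursion above the pre-event level.
    Maximum amplitude: half the largest peak-to-trough range (a simplification of
    the crest-to-trough convention discussed for gauge-reported wave heights).
    """
    max_peak = float(np.max(detided_m))
    half_peak_to_trough = 0.5 * float(np.max(detided_m) - np.min(detided_m))
    return max_peak, half_peak_to_trough

if __name__ == "__main__":
    t = np.linspace(0, 7200, 1200)
    # Synthetic record: a large leading elevation wave followed by a shallower trough
    series = 1.4 * np.exp(-((t - 1800) / 400) ** 2) - 0.8 * np.exp(-((t - 2600) / 500) ** 2)
    peak, amp = tsunami_heights(series)
    print(f"max peak = {peak:.2f} m, max amplitude = {amp:.2f} m")  # peak exceeds amplitude here
```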
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Eble, M. C.
2013-12-01
The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but whose spirit is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning; 2) develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources; 3) generate guidance and protocols for the production and use of new tsunami hazard analysis products; and 4) identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.
Challenges in Defining Tsunami Wave Heights
NASA Astrophysics Data System (ADS)
Dunbar, Paula; Mungov, George; Sweeney, Aaron; Stroker, Kelly; Arcos, Nicolas
2017-08-01
The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive, consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and on the definition of maximum wave height. On September 16, 2015, an Mw 8.3 earthquake located 48 km west of Illapel, Chile, generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 coastal tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field, different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height for each tide gauge and deep-ocean buoy, NCEI will consider adding an additional field for the maximum peak measurement.
Seismically generated tsunamis.
Arcas, Diego; Segur, Harvey
2012-04-13
People around the world know more about tsunamis than they did 10 years ago, primarily because of two events: a tsunami on 26 December 2004 that killed more than 200,000 people around the shores of the Indian Ocean; and an earthquake and tsunami off the coast of Japan on 11 March 2011 that killed nearly 15,000 more and triggered a nuclear accident, with consequences that are still unfolding. This paper has three objectives: (i) to summarize our current knowledge of the dynamics of tsunamis; (ii) to describe how that knowledge is now being used to forecast tsunamis; and (iii) to suggest some policy changes that might protect people better from the dangers of future tsunamis.
Rapid tsunami models and earthquake source parameters: Far-field and local applications
Geist, E.L.
2005-01-01
Rapid tsunami models have recently been developed to forecast far-field tsunami amplitudes from initial earthquake information (magnitude and hypocenter). Earthquake source parameters that directly affect tsunami generation as used in rapid tsunami models are examined, with particular attention to local versus far-field application of those models. First, the validity of the assumption that the focal mechanism and type of faulting for tsunamigenic earthquakes are similar in a given region can be evaluated by measuring the seismic consistency of past events. Second, the assumption that slip occurs uniformly over an area of rupture will most often underestimate the amplitude and leading-wave steepness of the local tsunami. Third, large-magnitude earthquakes will sometimes exhibit a high degree of spatial heterogeneity, such that tsunami sources will be composed of distinct sub-events that can cause constructive and destructive interference in the wavefield away from the source. Using a stochastic source model, it is demonstrated that local tsunami amplitudes vary by as much as a factor of two or more, depending on the local bathymetry. If other earthquake source parameters such as focal depth or shear modulus are varied in addition to the slip distribution patterns, even greater uncertainty in local tsunami amplitude is expected for earthquakes of similar magnitude. Because of the short amount of time available to issue local warnings, and because of the high degree of uncertainty associated with local, model-based forecasts as suggested by this study, direct wave height observations and a strong public education and preparedness program are critical for those regions near suspected tsunami sources.
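To illustrate the idea of a stochastic source model, the toy generator below draws random-phase slip profiles with an assumed k^-2 spectral falloff and rescales them to a prescribed mean slip, so that peak slip varies from realization to realization. It is a 1-D sketch with illustrative parameters, not the 2-D stochastic model used in the paper.

```python
import numpy as np

def stochastic_slip_1d(n: int = 256, length_km: float = 200.0,
                       mean_slip_m: float = 5.0, corner_wavenumber: float = 0.05,
                       seed: int = 0) -> np.ndarray:
    """One realization of a heterogeneous along-strike slip profile.

    Random-phase spectrum with a k^-2 falloff beyond a corner wavenumber,
    rescaled to a prescribed mean slip. A toy 1-D stand-in only.
    """
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=length_km / n)            # wavenumber in cycles per km
    amp = 1.0 / (1.0 + (k / corner_wavenumber) ** 2)   # k^-2 falloff at high k
    phase = rng.uniform(0, 2 * np.pi, size=k.size)
    spectrum = amp * np.exp(1j * phase)
    spectrum[0] = 0.0                                   # drop the mean; re-imposed below
    slip = np.fft.irfft(spectrum, n=n)
    slip = slip - slip.min()                            # keep slip non-negative
    return slip * (mean_slip_m / slip.mean())

if __name__ == "__main__":
    peaks = [stochastic_slip_1d(seed=s).max() for s in range(10)]
    print([round(p, 1) for p in peaks])    # peak slip varies realization to realization
```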
Global Tsunami Warning System Development Since 2004
NASA Astrophysics Data System (ADS)
Weinstein, S.; Becker, N. C.; Wang, D.; Fryer, G. J.; McCreery, C.; Hirshorn, B. F.
2014-12-01
The Mw 9.1 Great Sumatra Earthquake of December 26, 2004, generated the most destructive tsunami in history, killing 227,000 people along Indian Ocean coastlines, and was recorded by sea-level instruments worldwide. This tragedy showed that the Indian Ocean needed a tsunami warning system to prevent another disaster on this scale. The Great Sumatra Earthquake also highlighted the need for tsunami warning systems in other ocean basins. Instruments for recording earthquakes and sea-level data useful for tsunami monitoring were scarce outside the Pacific Ocean in 2004. Seismometers were few in number, and even fewer were high-quality long-period broadband instruments. Nor was much of their data made available to the US tsunami warning centers (TWCs). In 2004 the US TWCs relied exclusively on instrumentation provided and maintained by IRIS and the USGS for areas outside of the Pacific. Since 2004, the US TWCs and their partners have made substantial improvements to seismic and sea-level monitoring networks with the addition of new and better instruments, densification of existing networks, better communications infrastructure, and improved data sharing among tsunami warning centers. In particular, the number of sea-level stations transmitting data in near real time and the amount of seismic data available to the tsunami warning centers have more than tripled. The DART network, which consisted of a half-dozen Pacific stations in 2004, now totals nearly 60 stations worldwide. Earthquake and tsunami science has progressed as well. It took nearly three weeks to obtain the first reliable estimates of the 2004 Sumatra Earthquake's magnitude. Today, thanks to improved seismic networks and modern computing power, TWCs use the W-phase seismic moment method to determine accurate earthquake magnitudes and focal mechanisms for great earthquakes within 25 minutes. TWC scientists have also leveraged these modern computers to generate tsunami forecasts in a matter of minutes. Progress towards a global tsunami warning system has been substantial, and today fully functioning TWCs protect most of the world's coastlines. These improvements have also led to a substantial reduction in the time required by the TWCs to detect, locate, and assess the tsunami threat from earthquakes occurring worldwide.
NASA Astrophysics Data System (ADS)
Gica, E.; Reynolds, M.
2012-12-01
Recent global models predict a rise of approximately one meter in global sea level by 2100, with potentially larger increases in areas of the Pacific Ocean. If current climate change trends continue, low-lying islands across the globe may become inundated over the next century, placing island biodiversity at risk. Adding to the risk of inundation due to sea level rise is the occurrence of cyclones and tsunamis. This combined trend will affect the low-lying islands of the Northwestern Hawaiian Islands and it is therefore important to assess its impact since these islands are critical habitats to many endangered endemic species and support the largest tropical seabird rookery in the world. The 11 March 2011 Tohoku (Mw=8.8) earthquake-tsunami affected the habitat of many endangered endemic species in Midway Atoll National Wildlife Refuge because all three islands (Sand, Eastern and Spit) were inundated by tsunami waves. At present sea level, some tsunamis from certain source regions would not affect Midway Atoll. For example, the previous earthquake-tsunamis such as the 15 November 2006 Kuril (Mw=8.1) and 13 February 2007 Kuril (Mw=7.9) were not significant enough to affect Midway Atoll. But at higher sea levels, tsunamis with similar characteristics could pose a threat to such terrestrial habitats and wildlife. To visualize projected impacts to vegetation composition, wildlife habitat, and wildlife populations, we explored and analyzed inundation vulnerability for a range of possible sea level rise and tsunami scenarios at Midway Atoll National Wildlife Refuge. Studying the combined threat of tsunamis and sea level rise can provide more accurate and comprehensive assessments of the vulnerability of the unique natural resources on low-lying islands. A passive sea level rise model was used to determine how much inundation will occur at different sea level rise values for the three islands of Midway Atoll and each scenario was coupled with NOAA Center for Tsunami Research's tsunami forecasting tool. The tsunami forecasting tool was used to generate tsunami scenarios from different source regions and served as boundary conditions for inundation models to project the coastal impact at Midway Atoll. Underlying the tsunami forecast tool is a database of pre-computed tsunami propagation runs for discrete sections of the earth's subduction zones that are the principal locus of tsunami-generating activity. The new LiDAR topographic data, which is the first high resolution elevation data for three individual islands of Midway Atoll, was used for both the passive sea level rise model and inundation model for Midway Atoll. Results of the study will indicate how the combined climate change and tsunami occurrence will affect Midway Atoll and can therefore be used for early climate change adaptation and mitigation planning, especially for vulnerable species and areas of the Atoll.
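A passive ("bathtub") sea level rise model of the kind described above can be sketched very simply: mark DEM cells at or below a combined still-water level as inundated. The sketch below uses a toy random topography and ignores hydraulic connectivity and tsunami dynamics, which the study's coupled forecast-tool runs handle.

```python
import numpy as np

def passive_inundation(dem_m: np.ndarray, sea_level_rise_m: float,
                       tsunami_flood_m: float = 0.0) -> np.ndarray:
    """Mark cells whose elevation is at or below the combined still-water level.

    A simple bathtub treatment: no connectivity check and no dynamic tsunami
    modelling, unlike the forecast-tool simulations described in the abstract.
    """
    water_level = sea_level_rise_m + tsunami_flood_m
    return dem_m <= water_level

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dem = np.clip(rng.normal(loc=2.0, scale=1.5, size=(50, 50)), 0.0, None)  # toy island topography (m)
    for slr in (0.0, 0.5, 1.0):
        wet = passive_inundation(dem, sea_level_rise_m=slr, tsunami_flood_m=1.0)
        print(f"SLR {slr:.1f} m + 1.0 m tsunami: {100 * wet.mean():.1f}% of cells inundated")
```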
NASA Astrophysics Data System (ADS)
Tinti, S.; Tonini, R.
2013-07-01
Nowadays numerical models are a powerful tool in tsunami research, since they can be used (i) to reconstruct modern and historical events, (ii) to cast new light on tsunami sources by inverting tsunami data and observations, (iii) to build scenarios in the frame of tsunami mitigation plans, and (iv) to produce forecasts of tsunami impact and inundation in early warning systems. In parallel with the general recognition of the importance of numerical tsunami simulations, the demand has grown for reliable tsunami codes, validated through tests agreed upon by the tsunami community. This paper presents the tsunami code UBO-TSUFD, developed at the University of Bologna, Italy, which solves the non-linear shallow water (NSW) equations in a Cartesian frame, including bottom friction and excluding the Coriolis force, by means of a leapfrog (LF) finite-difference scheme on a staggered grid, and which accounts for moving boundaries to compute sea inundation and withdrawal at the coast. Results of UBO-TSUFD applied to four classical benchmark problems are shown: two benchmarks are based on analytical solutions, one on a plane wave propagating in a flat channel with a constant-slope beach, and one on a laboratory experiment. The code performs very satisfactorily, reproducing the benchmark theoretical and experimental data quite well. Further, the code is applied to a realistic tsunami case: a scenario of a tsunami threatening the coasts of eastern Sicily, Italy, is defined and discussed based on the historical tsunami of 11 January 1693, one of the most severe events in Italian history.
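The staggered-grid leapfrog scheme named in this abstract can be illustrated with a heavily simplified, linearized 1-D sketch: surface elevation lives at cell centres, volume flux at cell faces, and the two are updated alternately. This is an illustration of the numerical idea only, with assumed constant depth, closed boundaries, and no friction; it is not the nonlinear 2-D UBO-TSUFD code with moving shoreline boundaries.

```python
import numpy as np

def leapfrog_1d(eta0: np.ndarray, depth_m: float, dx_m: float,
                n_steps: int, g: float = 9.81) -> np.ndarray:
    """Linearized 1-D shallow-water solver: staggered grid, alternating (leapfrog) updates.

    eta (surface elevation) at cell centres, M (volume flux) at cell faces.
    Constant depth, reflective ends, no friction: a toy illustration only.
    """
    eta = eta0.copy()
    flux = np.zeros(len(eta) + 1)                      # M at faces, closed boundaries
    dt = 0.5 * dx_m / np.sqrt(g * depth_m)             # CFL-limited time step
    for _ in range(n_steps):
        # continuity: d(eta)/dt = -dM/dx
        eta -= (dt / dx_m) * (flux[1:] - flux[:-1])
        # linear momentum: dM/dt = -g * h * d(eta)/dx
        flux[1:-1] -= g * depth_m * (dt / dx_m) * (eta[1:] - eta[:-1])
    return eta

if __name__ == "__main__":
    x = np.linspace(0, 200e3, 401)                     # 200 km channel, 500 m grid
    eta0 = 1.0 * np.exp(-((x - 100e3) / 10e3) ** 2)    # initial 1 m hump of water
    result = leapfrog_1d(eta0, depth_m=4000.0, dx_m=500.0, n_steps=200)
    print(f"max eta after propagation: {result.max():.2f} m")  # hump splits into two waves
```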
Tsunami propagation modelling - a sensitivity study
NASA Astrophysics Data System (ADS)
Dao, M. H.; Tkalich, P.
2007-12-01
The 2004 Indian Ocean tsunami and its tragic consequences demonstrated a lack of relevant experience and preparedness among the affected coastal nations. After the event, scientific and forecasting circles of the affected countries began building capacity to tackle similar problems in the future. Different approaches have been used for tsunami propagation, such as the Boussinesq and Nonlinear Shallow Water Equations (NSWE). These approximations were obtained by assuming different relative importance of the nonlinear, dispersive, and spatial gradient variation phenomena and terms. The paper describes further development of the original TUNAMI-N2 model to take into account additional phenomena: astronomic tide, sea bottom friction, dispersion, Coriolis force, and spherical curvature. The code is modified to be suitable for operational forecasting, and the resulting version (TUNAMI-N2-NUS) is verified using test cases, results of other models, and real case scenarios. Using the 2004 tsunami event as one of the scenarios, the paper examines the sensitivity of numerical solutions to variation of different phenomena and parameters, and the results are analyzed and ranked accordingly.
Introduction to “Global tsunami science: Past and future, Volume II”
Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.
2017-01-01
Twenty-two papers on the study of tsunamis are included in Volume II of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 (Eds., E. L. Geist, H. M. Fritz, A. B. Rabinovich, and Y. Tanioka). Three papers in Volume II focus on details of the 2011 and 2016 tsunami-generating earthquakes offshore of Tohoku, Japan. The next six papers describe important case studies and observations of recent and historical events. Four papers related to tsunami hazard assessment are followed by three papers on tsunami hydrodynamics and numerical modelling. Three papers discuss problems of tsunami warning and real-time forecasting. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: volcanic explosions, landslides, and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.
Introduction to "Global Tsunami Science: Past and Future, Volume II"
NASA Astrophysics Data System (ADS)
Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.
2017-08-01
Twenty-two papers on the study of tsunamis are included in Volume II of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016 (Eds., E. L. Geist, H. M. Fritz, A. B. Rabinovich, and Y. Tanioka). Three papers in Volume II focus on details of the 2011 and 2016 tsunami-generating earthquakes offshore of Tohoku, Japan. The next six papers describe important case studies and observations of recent and historical events. Four papers related to tsunami hazard assessment are followed by three papers on tsunami hydrodynamics and numerical modelling. Three papers discuss problems of tsunami warning and real-time forecasting. The final set of three papers importantly investigates tsunamis generated by non-seismic sources: volcanic explosions, landslides, and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.
Tsunami Early Warning via a Physics-Based Simulation Pipeline
NASA Astrophysics Data System (ADS)
Wilson, J. M.; Rundle, J. B.; Donnellan, A.; Ward, S. N.; Komjathy, A.
2017-12-01
Through independent efforts, physics-based simulations of earthquakes, tsunamis, and the atmospheric signatures of these phenomena have been developed. With the goal of producing tsunami forecasts and early warning tools for at-risk regions, we join these three spheres to create a simulation pipeline. The Virtual Quake simulator can produce thousands of years of synthetic seismicity on large, complex fault geometries, as well as the expected surface displacement in tsunamigenic regions. These displacements are used as initial conditions for tsunami simulators, such as Tsunami Squares, to produce catalogs of potential tsunami scenarios with probabilities. Finally, these tsunami scenarios can act as input for simulations of the associated ionospheric total electron content, signals that can be detected by GNSS satellites for purposes of early warning in the event of a real tsunami. We present the most recent developments in this project.
NASA Astrophysics Data System (ADS)
González-Carrasco, J. F.; Benavente, R. F.; Zelaya, C.; Núñez, C.; Gonzalez, G.
2017-12-01
The 2017 Mw 8.1 Tehuantepec earthquake generated a moderate tsunami, which was registered by the near-field tide gauge network, and a tsunami threat state for Mexico was issued by the PTWC. In the case of Chile, the forecast of tsunami waves indicated amplitudes of less than 0.3 meters above the tide level, supporting an informative state of threat without activation of evacuation procedures. Nevertheless, during sea level monitoring of the network we detected wave amplitudes (>0.3 m) indicating a possible change of threat state. Finally, the NTWS maintained the informative level of threat based on mathematical filtering analysis of the sea level records. After the 2010 Mw 8.8 Maule earthquake, the Chilean National Tsunami Warning System (NTWS) increased its observational capabilities to improve early response. The most important operational efforts have focused on strengthening the tide gauge network for the national area of responsibility. Furthermore, technological initiatives such as the Integrated Tsunami Prediction and Warning System (SIPAT) have segmented the area of responsibility into blocks to focus early warning and evacuation procedures on the most affected coastal areas, while maintaining an informative state for areas distant from the near-field earthquake. In the case of far-field events, the NTWS follows the recommendations proposed by the Pacific Tsunami Warning Center (PTWC), including comprehensive monitoring of sea level records, such as tide gauges and DART (Deep-ocean Assessment and Reporting of Tsunamis) buoys, to evaluate the state of tsunami threat in the area of responsibility. The main objective of this work is to analyze the first-order physical processes involved in the far-field propagation and coastal impact of the tsunami, including implications for decision-making by the NTWS. To explore our main question, we construct a finite-fault model of the 2017 Mw 8.1 Tehuantepec earthquake. We employ the rupture model to simulate a transoceanic tsunami with Neowave2D. We generate synthetic time series at tide gauge stations and compare them with recorded sea level data to rule out meteorological processes such as storms and surges. Resonance analysis is performed with a wavelet technique.
Geoethical issues involved in Tsunami Warning System concepts and operations
NASA Astrophysics Data System (ADS)
Charalampakis, Marinos; Papadopoulos, Gerassimos A.; Tinti, Stefano
2016-04-01
The main goal of a Tsunami Warning System (TWS) is to mitigate the effect of an incoming tsunami by alerting the coastal population early enough to allow people to evacuate safely from inundation zones. Though this representation might seem oversimplified, achieving this goal successfully requires a positive synergy of geoscience, communication, emergency management, technology, education, social sciences, and politics. Geoethical issues always arise when there is an interaction between geoscience and society, and a TWS is a paradigmatic case where this interaction is very strong and is made critical because a) the formulation of the tsunami alert has to be made in as short a time as possible and therefore on uncertain data, and b) any evaluation error (underestimation or overestimation) can lead to serious (and sometimes catastrophic) consequences involving wide areas and a large part of the population. From the geoethical point of view three issues are critical: how to (i) combine forecasts and uncertainties reasonably and usefully, (ii) cope with, and possibly resolve, the dilemma of whether it is better to over-alert or under-alert the population, and (iii) deal with the responsibility and liability of geoscientists, TWS operators, emergency operators, and the coastal population. The discussion will be based on the experience of the Hellenic National Tsunami Warning Center (HL-NTWC, Greece), which operates on a 24/7 basis as a special unit of the Institute of Geodynamics, National Observatory of Athens, and acts also as a Candidate Tsunami Service Provider (CTSP) in the framework of the North-Eastern Atlantic, the Mediterranean and connected seas Tsunami Warning System (NEAMTWS) of the IOC/UNESCO. Since August 2012, when HL-NTWC was officially declared operational, 14 tsunami warning messages have been disseminated to a large number of subscribers after strong submarine earthquakes occurring in Greece and elsewhere in the eastern Mediterranean. It is recognized that the alerting process and procedures are quite complex and deserve an open and wide debate, which at the moment seems to be absent from the media, the scientific community, and society, very likely until the next tsunami disaster.
NASA Astrophysics Data System (ADS)
Jiménez, César; Carbonel, Carlos; Rojas, Joel
2018-04-01
We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green's functions) was obtained from numerical simulation of seismic unit sources (dimension: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of the synthetic waveforms corresponding to the seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station an error (measured from the maximum height of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error was at Chimbote with 53.5%; however, due to the low amplitude of the Chimbote wave (<1 m), the overestimation in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
NASA Astrophysics Data System (ADS)
Jiménez, César; Carbonel, Carlos; Rojas, Joel
2017-09-01
We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green's functions) was obtained from numerical simulation of seismic unit sources (dimension: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of the synthetic waveforms corresponding to the seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station an error (measured from the maximum height of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error was at Chimbote with 53.5%; however, due to the low amplitude of the Chimbote wave (<1 m), the overestimation in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
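The superposition step described in the two records above can be written as a simple linear combination of unit-source waveforms scaled by slip. The sketch below shows only that combination on hypothetical unit-source responses; the actual system draws its Green's functions from the pre-computed 50 km × 50 km unit-source database.

```python
import numpy as np

def superpose_waveform(green_functions: np.ndarray, slips_m: np.ndarray) -> np.ndarray:
    """Linear superposition of unit-source synthetic waveforms (Green's functions).

    green_functions: array of shape (n_sources, n_samples), each row the tide-station
    response to 1 m of slip on one unit source.
    slips_m: slip assigned to each unit source inside the rupture geometry.
    """
    return slips_m @ green_functions

if __name__ == "__main__":
    t = np.linspace(0, 7200, 720)                  # 2 h at 10-s sampling
    # Two hypothetical unit-source responses at a tide station
    g1 = 0.10 * np.sin(2 * np.pi * (t - 1800) / 1200) * (t > 1800)
    g2 = 0.07 * np.sin(2 * np.pi * (t - 2100) / 1500) * (t > 2100)
    forecast = superpose_waveform(np.vstack([g1, g2]), np.array([4.0, 6.0]))
    print(f"forecast maximum wave height: {forecast.max():.2f} m")
```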
Experiences integrating autonomous components and legacy systems into tsunami early warning systems
NASA Astrophysics Data System (ADS)
Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.
2012-04-01
Fostered by and embedded in the general development of Information and Communication Technology (ICT), the evolution of Tsunami Early Warning Systems (TEWS) shows a significant shift from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data covers not only sensors but also other components and systems offering services, such as the delivery of feasible simulations used for forecasting during an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. providing seismic and sea level data, and the utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-Thygeson, K.; Løvholt, F.; Gruenburg, L.; McAdoo, B. G.
2015-12-01
The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally, a certain tsunami scenario with a corresponding return period is applied for the vulnerability and mortality risk analysis. The model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as identify the main risk drivers. The research leading to these results has received funding from the CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of Prevention Structures fOr enhanced tsunami DIsaster resilience, http://www.ngi.no/en/Project-pages/RAPSODI/), and from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 603839 (Project ASTARTE - Assessment, STrategy And Risk reduction for Tsunamis in Europe, http://www.astarte-project.eu/).
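The convolution of hazard, exposure, and vulnerability described above can be sketched as a sum of population times a depth-dependent fatality probability over grid cells. The fragility-curve form and all parameter values below are illustrative placeholders, not the calibrated values of the NGI GIS model.

```python
import math

def fatality_probability(flow_depth_m: float, building_class: str) -> float:
    """Illustrative depth-fatality fragility curve (logistic in flow depth).

    Placeholder parameters only; weaker building classes reach high fatality
    rates at lower flow depths.
    """
    midpoint_m = {"light_timber": 2.0, "masonry": 3.5, "reinforced_concrete": 6.0}[building_class]
    return 1.0 / (1.0 + math.exp(-1.5 * (flow_depth_m - midpoint_m)))

def expected_fatalities(cells: list[dict]) -> float:
    """Sum population x P(death | flow depth, building class) over grid cells."""
    return sum(c["population"] * fatality_probability(c["flow_depth_m"], c["building_class"])
               for c in cells)

if __name__ == "__main__":
    cells = [
        {"population": 120, "flow_depth_m": 4.0, "building_class": "light_timber"},
        {"population": 300, "flow_depth_m": 2.0, "building_class": "reinforced_concrete"},
        {"population": 80,  "flow_depth_m": 6.0, "building_class": "masonry"},
    ]
    print(f"expected fatalities: {expected_fatalities(cells):.0f}")
```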
Short-term Inundation Forecasting for Tsunamis in the Caribbean Sea Region
NASA Astrophysics Data System (ADS)
Mercado-Irizarry, A.; Schmidt, W.
2007-05-01
After the 2004 Indian Ocean tsunami, the U.S. Congress gave a mandate to the National Oceanic and Atmospheric Administration (NOAA) to assess the tsunami threat for all U.S. interests and to adapt to them the Short-term Inundation Forecasting for Tsunamis (SIFT) methodology first developed for the U.S. Pacific seaboard states. This methodology would be used with the DART buoys deployed in the Atlantic Ocean and Caribbean Sea. The first step involved the evaluation and characterization of the major tsunamigenic regions in both basins, work done by the U.S. Geological Survey (USGS). This was followed by modeling of the generation and propagation of tsunamis due to unit-slip tsunamigenic earthquakes located at different points along the tsunamigenic zones identified by the USGS. These pre-computed results are stored and used as sources (in an inverse modeling approach using the DART buoys) for so-called Standby Inundation Models (SIMs) being developed for selected coastal cities in Puerto Rico, the US Virgin Islands, and others along the Atlantic seaboard of the USA. It is the purpose of this presentation to describe the work being carried out in the Caribbean Sea region, where two SIMs for Puerto Rico have already been prepared, allowing for near real-time assessment (less than 10 minutes after detection by the DART buoys) of the expected tsunami impact for two major coastal cities.
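The inverse-modeling step mentioned above fits weights for the pre-computed unit sources to a DART record; those weights then scale the corresponding pre-computed coastal solutions. The sketch below shows only a generic unconstrained least-squares fit on synthetic data; the operational SIFT inversion adds source selection and constraints not shown here.

```python
import numpy as np

def invert_source_coefficients(unit_responses: np.ndarray,
                               dart_observation: np.ndarray) -> np.ndarray:
    """Least-squares fit of unit-source weights to a DART record.

    unit_responses: (n_samples, n_unit_sources) pre-computed waveforms at the buoy.
    Returns weights that would scale the corresponding pre-computed coastal models.
    """
    weights, *_ = np.linalg.lstsq(unit_responses, dart_observation, rcond=None)
    return weights

if __name__ == "__main__":
    t = np.linspace(0, 3600, 360)
    u1 = 0.02 * np.sin(2 * np.pi * t / 900.0)          # hypothetical unit-source responses
    u2 = 0.03 * np.sin(2 * np.pi * t / 1300.0)
    G = np.column_stack([u1, u2])
    observed = G @ np.array([2.5, 1.0]) + 0.001 * np.random.randn(360)
    print(invert_source_coefficients(G, observed))     # approximately [2.5, 1.0]
```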
NASA Astrophysics Data System (ADS)
Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.
2012-04-01
One of the most challenging goals that the geo-scientific community has faced since the catastrophic tsunami that occurred in December 2004 in the Indian Ocean is to develop so-called "next generation" Tsunami Early Warning Systems (TEWS). The meaning of "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether or not a tsunami has been generated by a given source and, if so, to send proper warnings and/or alerts in a suitable time to all the countries and communities that can be affected by the tsunami. Instead, "next generation" refers to the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterise possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios to be suitably combined based on the information coming from the sensor environment and used to forecast the degree of exposure of different coastal places both in the near and far field, and 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC Project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies that are being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of a few selected earthquake-generated tsunamis. The cases provided by the members of the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis-management phase. The MSDB contains a very large number (of the order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes located in the "vicinity" of the virtual scenario earthquake. Examples from both databases will be presented.
Advanced Planning for Tsunamis in California
NASA Astrophysics Data System (ADS)
Miller, K.; Wilson, R. I.; Larkin, D.; Reade, S.; Carnathan, D.; Davis, M.; Nicolini, T.; Johnson, L.; Boldt, E.; Tardy, A.
2013-12-01
The California Tsunami Program comprises the California Governor's Office of Emergency Services (CalOES) and the California Geological Survey (CGS) and is funded through the National Tsunami Hazard Mitigation Program (NTHMP) and the Federal Emergency Management Agency (FEMA). The program works closely with the 20 coastal counties in California, as well as academic and industry experts, to improve tsunami preparedness and mitigation in shoreline communities. Inundation maps depicting 'worst case' inundation modeled from plausible sources around the Pacific were released in 2009 and have provided a foundation for public evacuation and emergency response planning in California. Experience during recent tsunamis impacting the state (Japan 2011, Chile 2010, Samoa 2009) has brought to light the desire of emergency managers and decision makers for even more detailed information ahead of future tsunamis. A solution to provide enhanced information has been the development of 'playbooks' to plan for a variety of expected tsunami scenarios. Elevation 'playbook' lines can be useful for partial tsunami evacuations when enough information about forecast amplitude and arrival times is available to coastal communities and there is sufficient time to make more educated decisions about whom to evacuate for a given scenario or actual event. NOAA-issued Tsunami Alert Bulletins received in advance of a distant event will contain an expected wave height (a number) for each given section of coast. Provision of four elevation lines for possible inundation enables planning for different evacuation scenarios based on that number, potentially alleviating the need for an 'all-or-nothing' decision with regard to evacuation. Additionally, an analytical tool called FASTER is being developed to integrate storm, tides, modeling errors, and local tsunami run-up potential with the forecast tsunami amplitudes in real time when a tsunami Alert is sent out. Both of these products will help communities better implement evacuations and response activities for minor to moderate (less than maximum) tsunami events. A working group composed of federal, state, and local government scientists, emergency managers, first responders, and community planners has explored the details and delivery of the above tools for incorporation into emergency management protocols. The eventual outcome will be inclusion in plans, testing of protocols and methods via drills and exercises, and application, as appropriate, during an impending tsunami event.
The One-Meter Criterion for Tsunami Warning: Time for a Reevaluation?
NASA Astrophysics Data System (ADS)
Fryer, G. J.; Weinstein, S.
2013-12-01
The U.S. tsunami warning centers issue warnings when runup is anticipated to exceed one meter. The origins of the one-meter criterion are unclear, though Whitmore et al. (2008) showed from tsunami history that one meter is roughly the threshold above which damage occurs. Recent experiences in Hawaii, however, suggest that the threshold could be raised. Tsunami Warnings were issued for the 2010 Chile, 2011 Tohoku, and 2012 Haida Gwaii tsunamis; each exceeded one meter of runup somewhere in the State. Evacuation, however, was necessary only in 2011, and even then onshore damage (as opposed to damage from currents) occurred only where runup exceeded 1.5 m. During both the Chile and Haida Gwaii tsunamis the existing criteria led to unnecessary evacuation. Maximum runup during the Chile tsunami was 1.1 m at Hilo's Wailoa Boat Harbor, while the Haida Gwaii tsunami peaked at 1.2 m at Honouliwai Bay on Molokai. Both tsunamis caused only minor damage and minimal flooding; in both cases a Tsunami Advisory (i.e., there is no need to evacuate, but stay off the beach and out of the water) would have been adequate. The Advisory was originally developed as an ad hoc response to the mildly threatening 2006 Kuril tsunami and has since been formalized as the product we issue when maximum runup is expected to be 0.3-1.0 m. At the time it was introduced, however, there was no discussion that this new low-level warning might allow the criterion for a Tsunami Warning itself to be adjusted. We now suggest that the divide between Advisory and Warning be raised from 1.0 m to something greater, possibly 1.2 m. If the warning threshold were raised to 1.2 m, the over-warning for the Chile tsunami still could not have been avoided: models calibrated against DART data consistently forecast runup just over 1.2 m for that event. For Haida Gwaii, adjusting the models to match the DART data increased the forecast runup to almost 2 m, which again meant a warning, though in retrospect we should have been skeptical. The nearest DART to Haida Gwaii was off the Washington coast, in line with the long axis (strike direction) of the rupture, and so provided little constraint on the tsunami directed towards Hawaii (the dip direction). The finite fault model obtained by inverting the DART data extended the rupture too far along strike and pushed the rupture to the wrong (east) side of Haida Gwaii, in conflict with the W-phase CMT. The inferred wave height at the Langara Point tide gauge, just outside the epicentral region, was also too large by a factor of two. Forcing the tsunami inversion to be consistent with the CMT would have rendered the inferred rupture much closer to reality, matched the Langara Point record well, and forecast a maximum runup at Kahului of only 1.0 m (the actual runup there was 0.8 m). If the warning criterion had been 1.2 m, the unnecessary coastal evacuation for the Haida Gwaii tsunami could have been avoided. So increasing the warning threshold by only 20 cm would eliminate one of the two recent unnecessary evacuations. Can the threshold be raised even more? We are considering that possibility, though the uncertainties and time constraints of an actual warning demand that we remain very conservative.
Rapid estimate of earthquake source duration: application to tsunami warning.
NASA Astrophysics Data System (ADS)
Reymond, Dominique; Jamelot, Anthony; Hyvernaud, Olivier
2016-04-01
We present a method for estimating the source duration of the fault rupture, based on the high-frequency envelope of teleseismic P waves, inspired by the original work of Ni et al. (2005). The main interest of knowing this seismic parameter is to detect abnormally low-velocity ruptures that are characteristic of so-called 'tsunami earthquakes' (Kanamori, 1972). The source durations estimated by this method are validated against two other independent methods: the duration obtained by W-phase inversion (Kanamori and Rivera, 2008; Duputel et al., 2012) and the duration calculated by the SCARDEC process, which determines the source time function (Vallée et al., 2011). The estimated source duration is also compared with the slowness discriminant defined by Newman and Okal (1998), which is calculated routinely for all earthquakes detected by our tsunami warning process (named PDFM2, Preliminary Determination of Focal Mechanism; Clément and Reymond, 2014). From the point of view of operational tsunami warning, numerical simulations of tsunamis depend strongly on the source estimation: the better the source estimation, the better the tsunami forecast. The source duration is not injected directly into the numerical tsunami simulations, because the kinematics of the source is presently ignored (Jamelot and Reymond, 2015). But in the case of a tsunami earthquake occurring in the shallower part of the subduction zone, we have to consider a source in a medium of low rigidity modulus; consequently, for a given seismic moment, the source dimensions decrease while the slip increases, like a 'compact' source (Okal and Hébert, 2007). Conversely, a rapid 'snappy' earthquake with poor tsunami excitation power will be characterized by a higher rigidity modulus and will produce weaker displacement and smaller source dimensions than a 'normal' earthquake. References: Clément, J. and Reymond, D. (2014). New tsunami forecast tools for the French Polynesia tsunami warning system. Pure Appl. Geophys. 171. Duputel, Z., Rivera, L., Kanamori, H. and Hayes, G. (2012). W phase source inversion for moderate to large earthquakes. Geophys. J. Int. 189, 1125-1147. Kanamori, H. (1972). Mechanism of tsunami earthquakes. Phys. Earth Planet. Inter. 6, 246-259. Kanamori, H. and Rivera, L. (2008). Source inversion of W phase: speeding up seismic tsunami warning. Geophys. J. Int. 175, 222-238. Newman, A. and Okal, E. (1998). Teleseismic estimates of radiated seismic energy: the E/M0 discriminant for tsunami earthquakes. J. Geophys. Res. 103, 26885-26898. Ni, S., Kanamori, H. and Helmberger, D. (2005). Energy radiation from the Sumatra earthquake. Nature 434, 582. Okal, E. A. and Hébert, H. (2007). Far-field modeling of the 1946 Aleutian tsunami. Geophys. J. Int. 169, 1229-1238. Vallée, M., Charléty, J., Ferreira, A. M. G., Delouis, B. and Vergoz, J. (2011). SCARDEC: a new technique for the rapid determination of seismic moment magnitude, focal mechanism and source time functions for large earthquakes using body wave deconvolution. Geophys. J. Int. 184, 338-358.
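The slowness discriminant cited above is Theta = log10(E/M0), the energy-to-moment ratio of Newman and Okal (1998). The sketch below computes it and flags energy-deficient ruptures; the -5.5 screening level is an assumed, commonly quoted value rather than the operational PDFM2 threshold.

```python
import math

def slowness_discriminant(radiated_energy_joules: float, seismic_moment_nm: float) -> float:
    """Theta = log10(E / M0), the Newman & Okal (1998) energy-to-moment discriminant."""
    return math.log10(radiated_energy_joules / seismic_moment_nm)

def is_slow_rupture(theta: float, threshold: float = -5.5) -> bool:
    """Flag energy-deficient ('slow') ruptures; the threshold is an assumed
    screening level, roughly where tsunami earthquakes fall below ordinary events."""
    return theta < threshold

if __name__ == "__main__":
    # Ordinary event: E ~ M0 * 10^-4.9 ; slow (tsunami-earthquake-like) event: E ~ M0 * 10^-6
    m0 = 1.0e21
    for label, energy in [("ordinary", m0 * 10 ** -4.9), ("slow", m0 * 10 ** -6.0)]:
        theta = slowness_discriminant(energy, m0)
        print(f"{label}: Theta = {theta:.2f}, slow rupture: {is_slow_rupture(theta)}")
```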
NOAA Operational Tsunameter Support for Research
NASA Astrophysics Data System (ADS)
Bouchard, R.; Stroker, K.
2008-12-01
In March 2008, the National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) completed the deployment of the last station of its 39-station network of deep-sea tsunameters. As part of NOAA's effort to strengthen tsunami warning capabilities, NDBC expanded the network from 6 to 39 stations and upgraded all stations to the second-generation Deep-ocean Assessment and Reporting of Tsunamis technology (DART II). Consisting of a bottom pressure recorder (BPR) and a surface buoy, the tsunameters deliver water-column heights, estimated from pressure measurements at the sea floor, to Tsunami Warning Centers in less than 3 minutes. This network provides coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico with faster and more accurate tsunami warnings. In addition, both the coarse-resolution real-time data and the high-resolution (15-second) recorded data provide invaluable contributions to research, such as the detection of the 2004 Sumatran tsunami in the Northeast Pacific (Gower and González, 2006) and the experimental tsunami forecast system (Bernard et al., 2007). NDBC normally recovers the BPRs every 24 months and sends the recovered high-resolution data to NOAA's National Geophysical Data Center (NGDC) for archive and distribution. NGDC edits and processes this raw binary format to obtain research-quality data. NGDC provides access to retrospective BPR data from 1986 to the present. The DART database includes pressure and temperature data from the ocean floor, stored in a relational database, enabling data integration with the global tsunami and significant earthquake databases. All data are accessible via the Web as tables, reports, interactive maps, OGC Web Map Services (WMS), and Web Feature Services (WFS) to researchers around the world. References: Gower, J. and F. González, 2006. U.S. Warning System Detected the Sumatra Tsunami. Eos Trans. AGU, 87(10). Bernard, E. N., C. Meinig, and A. Hilton, 2007. Deep Ocean Tsunami Detection: Third Generation DART. Eos Trans. AGU, 88(52), Fall Meet. Suppl., Abstract S51C-03.
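The water-column heights mentioned above are estimated from bottom pressure; to first order this is the hydrostatic relation h = p / (rho g). The sketch below shows only that conversion with assumed density; operational DART processing also applies tide removal and sensor/temperature corrections before a tsunami signal is extracted.

```python
def water_column_height_m(pressure_pa: float, rho_kg_m3: float = 1030.0, g: float = 9.81) -> float:
    """Hydrostatic conversion of bottom pressure to equivalent water-column height.

    First-order sketch only: assumes a constant seawater density and omits the
    tide removal and instrument corrections used in operational DART processing.
    """
    return pressure_pa / (rho_kg_m3 * g)

if __name__ == "__main__":
    background_pa = 4000.0 * 1030.0 * 9.81    # background pressure at roughly 4000 m depth
    tsunami_pa = 0.02 * 1030.0 * 9.81         # pressure change from a 2 cm tsunami signal
    delta = water_column_height_m(background_pa + tsunami_pa) - water_column_height_m(background_pa)
    print(f"detected height change: {delta * 100:.1f} cm")
```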
Tsunami early warning system for the western coast of the Black Sea
NASA Astrophysics Data System (ADS)
Ionescu, Constantin; Partheniu, Raluca; Cioflan, Carmen; Constantin, Angela; Danet, Anton; Diaconescu, Mihai; Ghica, Daniela; Grecu, Bogdan; Manea, Liviu; Marmureanu, Alexandru; Moldovan, Iren; Neagoe, Cristian; Radulian, Mircea; Raileanu, Victor; Verdes, Ioan
2014-05-01
The Black Sea area is prone to tsunami generation, and statistics show that more than twenty tsunamis have been observed there in the past. The last tsunami was observed on 31 March 1901 in the western part of the Black Sea, in the Shabla area, when an earthquake at a depth of 15 km below sea level triggered tsunami waves of 5 m height and caused material losses. The oldest tsunami ever recorded close to the Romanian shoreline dates from the year 104. This paper emphasises the participation of the National Institute for Earth Physics (NIEP) in the development of a tsunami warning system for the western coast of the Black Sea. In collaboration with the National Institute for Marine Geology and Geoecology (GeoEcoMar), the Institute of Oceanology and the Geological Institute, the last two belonging to the Bulgarian Academy of Science, NIEP participated as a partner in the cross-border project "Set-up and implementation of key core components of a regional early-warning system for marine geohazards of risk to the Romanian-Bulgarian Black Sea coastal area - MARINEGEOHAZARDS", coordinated by GeoEcoMar. The main purpose of the project was the implementation of an integrated early-warning system accompanied by a common decision-support tool, and the enhancement of regional technical capability, for the adequate detection, assessment, forecasting and rapid notification of natural marine geohazards in the Romanian-Bulgarian Black Sea cross-border area. In recent years, NIEP has increased its interest in marine-related hazards such as tsunamis and, in collaboration with other Romanian institutions, is acting to strengthen cooperation and data exchange with institutions from the Black Sea countries that already have tsunami monitoring infrastructure. In this respect, NIEP has developed a coastal network for marine seismicity, installing three new seismic stations in the coastal area of the Black Sea, together with sea-level sensors, radar and pressure sensors, and meteorological and GNSS stations at every site, and providing tide-gauge and seismic data exchange with the Black Sea countries. At the same time, the Tsunami Analysis Tool (TAT) software for inundation modelling, along with its RedPhone application, was installed at the National Data Centre in Magurele and at the Dobrogea Seismic Observatory in Eforie Nord, close to the Black Sea shore.
NASA Astrophysics Data System (ADS)
Bagiya, Mala S.; Kherani, E. A.; Sunil, P. S.; Sunil, A. S.; Sunda, S.; Ramesh, D. S.
2017-07-01
The presence of ionospheric disturbances associated with the 2004 Sumatra tsunami that propagated ahead of the tsunami itself has previously been identified; however, their origin has remained unresolved to date. Focusing on their origin mechanism, we document these ionospheric disturbances, referred to as Ahead of tsunami Traveling Ionospheric Disturbances (ATIDs). Using total electron content (TEC) data from GPS Aided GEO Augmented Navigation (GAGAN) receivers located near the Indian east coast, we first confirm the presence of ATIDs in TEC, appearing 90 min ahead of the tsunami arrival at the Indian east coast. We then propose a simulation study based on tsunami-atmosphere-ionosphere coupling in which tsunamigenic acoustic gravity waves (AGWs) excite these disturbances. We explain the ATID generation through the dissipation of the transverse mode of the primary AGWs. The simulation reproduces the excitation of ATIDs with characteristics similar to the observations. We therefore offer an alternative theoretical tool to monitor offshore ATIDs where observations are rare or unavailable, which could be potentially important for tsunami early warning.
Tsunami Hazard Assessment in Guam
NASA Astrophysics Data System (ADS)
Arcas, D.; Uslu, B.; Titov, V.; Chamberlin, C.
2008-12-01
The island of Guam is located approximately 1500 miles south of Japan, in the vicinity of the Mariana Trench. It is surrounded in close proximity by three subduction zones, Nankai-Taiwan, East Philippines and the Mariana Trench, which pose a considerable near- to intermediate-field tsunami threat. Tsunami catalogues list 14 tsunamigenic earthquakes with Mw≥8.0 in this region since 1900 alone (Soloviev and Go, 1974; Lander, 1993; Iida, 1984; Lander and Lowell, 2002); however, the island has not been significantly affected by some of the largest far-field events of the past century, such as the 1952 Kamchatka, 1960 Chile, and 1964 Great Alaska earthquakes. An assessment of the tsunami threat to the island from both near- and far-field sources, using forecast tools originally developed at NOAA's Pacific Marine Environmental Laboratory (PMEL) for real-time forecasting of tsunamis, is presented here. Tide gauge records at Apra Harbor from the 1952 Kamchatka, 1964 Alaska, and 1960 Chile earthquakes are used to validate our model setup and to explain the limited impact of these historical events on Guam. Identification of worst-case scenarios and determination of tsunamigenically effective source regions are presented for five vulnerable locations on the island via a tsunami sensitivity study. Apra Harbor is the site of a National Ocean Service (NOS) tide gauge and the biggest harbor on the island; Tumon Bay, Pago Bay, Agana Bay and Inarajan Bay are densely populated areas that require careful investigation. The sensitivity study shows that earthquakes from the Eastern Philippines present a major threat to west-facing sites, whereas the Mariana Trench poses the biggest concern for east-facing sites.
New Tsunami Forecast Tools for the French Polynesia Tsunami Warning System
NASA Astrophysics Data System (ADS)
Clément, Joël; Reymond, Dominique
2015-03-01
This paper presents the tsunami warning tools used for the estimation of the seismic source parameters. These tools are grouped under a method called Preliminary Determination of Focal Mechanism 2 (PDFM2), developed at the French Polynesia Warning Center within the framework of the existing warning system as a plug-in concept. The first tool determines the seismic moment and the focal geometry (strike, dip, and slip), and the second tool identifies "tsunami earthquakes" (earthquakes that cause much bigger tsunamis than their magnitude would imply). In a tsunami warning operation, the initial assessment of the tsunami potential is based on location and magnitude. The usual quick magnitude methods work fine for smaller earthquakes, but for major earthquakes they drastically underestimate the magnitude and the tsunami potential because the radiated energy shifts to longer-period waves. Since French Polynesia is located far from the subduction zones of the Pacific rim, the tsunami threat is not imminent, and this luxury of time allows the use of long-period surface-wave data to determine the true size of a major earthquake. The source inversion method presented in this paper uses a combination of surface-wave amplitude spectra and P-wave first motions. The advantage of using long-period surface-wave data is a much more accurate determination of earthquake size, and the advantage of using P-wave first motions is a better constraint on the focal geometry than surface waves alone can provide. The method routinely gives stable results within minutes of the origin time of an earthquake. Our results are then compared to the Global Centroid Moment Tensor catalog to validate both the seismic moment and the source geometry. The second tool discussed in this paper is the slowness parameter, the energy-to-moment ratio. It has been used to identify tsunami earthquakes, which are characterized by unusually slow rupture velocities and by seismic energy shifted to longer periods, and which therefore have low values of this parameter. The slow rupture velocity indicates weaker material, larger uplift and, thus, greater tsunami potential. The slowness parameter is an efficient tool for the near-real-time identification of tsunami earthquakes.
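The slowness (energy-to-moment) parameter described above can be illustrated with a short sketch; the "regular" value and deficiency threshold below are illustrative assumptions, not the operational PDFM2 settings:

    import math

    # Hedged sketch of the energy-to-moment "slowness" discriminant
    # Theta = log10(E / M0) (Newman and Okal, 1998).

    def slowness_theta(radiated_energy_j, seismic_moment_nm):
        return math.log10(radiated_energy_j / seismic_moment_nm)

    def looks_slow(theta, regular_value=-4.9, deficiency=1.0):
        """Flag events whose Theta falls well below a 'regular' earthquake value."""
        return theta <= regular_value - deficiency

    # Example with assumed numbers: E = 5e13 J, M0 = 5e20 N*m -> Theta = -7.0
    theta = slowness_theta(5.0e13, 5.0e20)
    print(theta, looks_slow(theta))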
NASA Astrophysics Data System (ADS)
Voronina, Tatyana; Romanenko, Alexey; Loskutov, Artem
2017-04-01
A key point in state-of-the-art tsunami forecasting is constructing a reliable tsunami source. In this study, we present an application of an original numerical inversion technique to modeling the source of the 16 September 2015 Chile tsunami. The problem of recovering a tsunami source from remote measurements of the incoming wave at deep-water tsunameters is treated as an inverse problem of mathematical physics in the class of ill-posed problems. The approach is based on least squares and truncated singular value decomposition techniques, and tsunami wave propagation is considered within the scope of linear shallow-water theory. As in the seismic inverse problem, numerical solutions obtained by mathematical methods become unstable in the presence of noise in real data. The method of r-solutions makes it possible to avoid instability in the solution of the ill-posed problem under study. This method is attractive from a computational point of view, since the main effort is required only once, for calculating the matrix whose columns consist of computed waveforms for each harmonic taken as a source (the unknown tsunami source is represented as a part of a spatial-harmonics series in the source area). Furthermore, by analyzing the singular spectrum of the matrix obtained in the course of the numerical calculations, one can estimate in advance the quality of inversion achievable with a given observational system, which allows a more effective disposition of the tsunameters to be proposed with the help of precomputations. In other words, the results obtained allow finding a way to improve the inversion by selecting the most informative set of available recording stations. The case study of the 6 February 2013 Solomon Islands tsunami highlights the critical role of the arrangement of deep-water tsunameters in the inversion results. Application of the proposed methodology to the 16 September 2015 Chile tsunami has successfully produced a tsunami source model. The function recovered by the proposed method can find practical applications both as an initial condition for various optimization approaches and for computation of tsunami wave propagation.
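A schematic version of the truncated-SVD (r-solution) step described above, with placeholder matrices standing in for waveforms that would actually come from shallow-water runs, could look like this:

    import numpy as np

    # Schematic r-solution (truncated SVD) inversion: the unknown source is a
    # linear combination of spatial harmonics, and each column of G holds the
    # synthetic waveform computed for one harmonic at the tsunameter locations.
    # G, data and r below are placeholders.

    def r_solution(G, data, r):
        """Least-squares coefficients using only the r largest singular values."""
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        s_inv = np.zeros_like(s)
        s_inv[:r] = 1.0 / s[:r]          # truncate to suppress noise-dominated modes
        return Vt.T @ (s_inv * (U.T @ data))

    # Toy usage with random placeholders (real G comes from shallow-water runs).
    rng = np.random.default_rng(0)
    G = rng.normal(size=(500, 40))       # 500 waveform samples x 40 harmonics
    true_c = rng.normal(size=40)
    data = G @ true_c + 0.1 * rng.normal(size=500)
    coeffs = r_solution(G, data, r=20)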
Toward the Real-Time Tsunami Parameters Prediction
NASA Astrophysics Data System (ADS)
Lavrentyev, Mikhail; Romanenko, Alexey; Marchuk, Andrey
2013-04-01
Today, a wide, well-developed system of deep-ocean tsunami detectors operates over the Pacific, and direct measurements of tsunami-wave time series are available. However, tsunami-warning systems still fail to predict the basic parameters of tsunami waves in time; dozens of examples could be provided. In our view, the lack of computational power is the main reason for these failures. At the same time, modern computer technologies such as GPUs (graphics processing units) and FPGAs (field-programmable gate arrays) can dramatically improve data-processing performance, which may enable timely tsunami-warning prediction. Thus, it is possible to address the challenge of real-time tsunami forecasting for selected geographic regions. We propose to use three new techniques in existing tsunami warning systems to achieve real-time calculation of tsunami wave parameters. First, the measurement system (e.g., DART buoy locations) should be optimized, both in terms of wave arrival time and amplitude; the corresponding software application exists today and is ready for use [1]. Considering the example of the coastal line of Japan, numerical tests show that optimal installation of only 4 DART buoys (taking into account the existing seabed cable) would reduce the tsunami wave detection time to only 10 min after an underwater earthquake. Second, as was shown by the authors, the use of GPU/FPGA technologies accelerates the execution of the MOST (Method Of Splitting Tsunami) code by a factor of 100 [2]. Therefore, tsunami wave propagation over an ocean area of 2000 x 2000 km (time step 10 s, recording every 4th spatial point and every 4th time step) can be calculated in 3 s on a 4-arcmin mesh, 50 s on a 1-arcmin mesh, and 5 min on a 0.5-arcmin mesh. An algorithm to switch from the coarse mesh to the fine-grained one is also available. Finally, we propose a new algorithm for determining tsunami source parameters by real-time processing of the time series obtained at DART buoys. It is possible to approximate the measured time series by a linear combination of synthetic marigrams, with the coefficients of the linear combination calculated with the help of an orthogonal decomposition; the algorithm is very fast and demonstrates good accuracy. Summing up, using the example of the coastal line of Japan, wave height evaluation will be available 12-14 minutes after the earthquake, even before the wave approaches the nearest shore point (which usually takes about 20 minutes). References: [1] Astrakova, A. S., Bannikov, D. V., Cherny, S. G. and Lavrentiev, M. M. (2009). The determination of the optimal sensors' location using genetic algorithm. 3rd Nordic EMW Summer School, Turku, Finland, June 2009: proceedings. TUSC General Publications, No. 53, 5-22. [2] Lavrentiev, M. Jr. and Romanenko, A. (2010). Modern Hardware Solutions to Speed Up Tsunami Simulation Codes. Geophysical Research Abstracts, Vol. 12, EGU2010-3835.
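The final step described above, approximating a measured DART time series by a linear combination of synthetic marigrams via an orthogonal decomposition, can be sketched as follows; the synthetic set and the observation are random placeholders, not MOST output:

    import numpy as np

    # Sketch of the source-refinement idea: fit a measured DART time series as a
    # linear combination of precomputed synthetic marigrams. Coefficients come
    # from a QR-based least-squares solve.

    def fit_source_coefficients(synthetics, measured):
        """synthetics: (n_samples, n_unit_sources); measured: (n_samples,)."""
        Q, R = np.linalg.qr(synthetics)          # orthogonal decomposition
        return np.linalg.solve(R, Q.T @ measured)

    def predicted_marigram(synthetics, coeffs):
        return synthetics @ coeffs

    # Toy example with synthetic data standing in for real unit-source runs.
    rng = np.random.default_rng(1)
    S = rng.normal(size=(600, 6))                # 6 hypothetical unit sources
    truth = np.array([0.0, 1.2, 0.0, 0.4, 0.0, 0.0])
    obs = S @ truth + 0.05 * rng.normal(size=600)
    print(np.round(fit_source_coefficients(S, obs), 2))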
Introduction to "Global Tsunami Science: Past and Future, Volume III"
NASA Astrophysics Data System (ADS)
Rabinovich, Alexander B.; Fritz, Hermann M.; Tanioka, Yuichiro; Geist, Eric L.
2018-04-01
Twenty papers on the study of tsunamis are included in Volume III of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Volume I of this topical issue was published as PAGEOPH, vol. 173, No. 12, 2016, and Volume II as PAGEOPH, vol. 174, No. 8, 2017. Two papers in Volume III focus on specific details of the 2009 Samoa and the 1923 northern Kamchatka tsunamis; they are followed by three papers related to tsunami hazard assessment for three different regions of the world oceans: South Africa, the Pacific coast of Mexico and the northwestern part of the Indian Ocean. The next six papers address various aspects of tsunami hydrodynamics and numerical modelling, including tsunami edge waves, resonant behaviour of a compressible water layer during tsunamigenic earthquakes, dispersive properties of seismic and volcanically generated tsunami waves, tsunami runup on a vertical wall, and the influence of earthquake rupture velocity on maximum tsunami runup. Four papers discuss problems of tsunami warning and real-time forecasting for Central America, the Mediterranean coast of France, and the coast of Peru, as well as some general problems regarding the optimum use of the DART buoy network for effective real-time tsunami warning in the Pacific Ocean. Two papers describe historical and paleotsunami studies in the Russian Far East. The final set of three papers investigates tsunamis generated by non-seismic sources: asteroid airbursts and meteorological disturbances. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.
Fast Simulation of Tsunamis in Real Time
NASA Astrophysics Data System (ADS)
Fryer, G. J.; Wang, D.; Becker, N. C.; Weinstein, S. A.; Walsh, D.
2011-12-01
The U.S. Tsunami Warning Centers primarily base their wave-height forecasts on precomputed tsunami scenarios, such as the SIFT model (Short-term Inundation Forecasting for Tsunamis) developed by NOAA's Center for Tsunami Research. In SIFT, tsunami simulations for about 1600 individual earthquake sources, each 100 x 50 km, cover shallow subduction zones worldwide; these simulations are stored in a database and combined linearly to make up the tsunami from any great earthquake. Precomputation is necessary because the nonlinear shallow-water wave equations are too time-consuming to compute during an event. While such scenario-based models are valuable, they tacitly assume all energy in a tsunami comes from thrust at the décollement. The thrust assumption is often violated (e.g., 1933 Sanriku, 2007 Kurils, 2009 Samoa), while a significant number of tsunamigenic earthquakes are completely unrelated to subduction (e.g., 1812 Santa Barbara, 1939 Accra, 1975 Kalapana). Finally, parts of some subduction zones are so poorly defined that precomputations may be of little value (e.g., 1762 Arakan, 1755 Lisbon). For all such sources, a fast means of estimating tsunami size is essential. At the Pacific Tsunami Warning Center, we have been using our model RIFT (Real-time Inundation Forecasting of Tsunamis) experimentally for two years. RIFT is fast by design: it solves only the linearized form of the equations. At 4 arc-minutes resolution, calculations for the entire Pacific take just a few minutes on an 8-processor Linux box. Part of the rationale for developing RIFT was earthquakes of M 7.8 or smaller, which approach the lower limit of the more complex SIFT's abilities. For such events we currently issue a fixed warning to areas within 1,000 km of the source, which typically means a lot of over-warning. With sources defined by W-phase CMTs, exhaustive comparison with runup data shows that we can reduce the warning area significantly. Even before CMTs are available, we routinely run models based on the local tectonics, which provide a useful first estimate of the tsunami. Our runup comparisons show that Green's Law (i.e., 1-D runup estimates) works very well indeed, especially if computations are run at 2 arc-minutes. We are developing an experimental RIFT-based product showing expected runups on open coasts. While these will necessarily be rather crude, they will be a great help to emergency managers trying to assess the hazard. RIFT is typically run using a single source, but it can already handle multiple sources. In particular, it can handle multiple sources of different orientations, such as 1993 Okushiri, or the décollement-splay combinations to be expected during major earthquakes in accretionary margins such as Nankai, Cascadia, and Middle America. As computers get faster and the number-crunching burden is off-loaded to GPUs, we are convinced there will still be a use for a fast, linearized modeling capability. Rather than applying scaling laws to a CMT, or distributing slip over 100 x 50 km sub-faults, for example, it would be preferable to model tsunamis using the output from a finite-fault analysis. To accomplish such a compute-bound task fast enough for warning purposes will demand a rapid, approximate technique like RIFT.
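Green's Law, mentioned above as the basis for the 1-D runup estimates, is a one-line amplitude scaling; a quick sketch with illustrative depths:

    # Quick sketch of Green's Law shoaling: A2 = A1 * (h1 / h2) ** 0.25.
    # Values below are illustrative only.

    def greens_law(amplitude_m, depth_from_m, depth_to_m):
        """Amplitude after shoaling from depth_from to depth_to (linear theory)."""
        return amplitude_m * (depth_from_m / depth_to_m) ** 0.25

    # A 0.2 m offshore amplitude at 4000 m depth shoaling to 10 m depth:
    print(f"{greens_law(0.2, 4000.0, 10.0):.2f} m")  # ~0.89 m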
TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.
2012-12-01
A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave-height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response, PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context, and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data, as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time-series data in the GUI as well. The GUI also includes mouse-clickable functions such as zooming or expanding the time-series display, measuring tsunami signal characteristics (arrival time, wave period and amplitude, etc.), and removing the tide signal from the time-series data. De-tiding of the time series is necessary to obtain accurate measurements of tsunami wave parameters and to maintain accurate historical tsunami databases. With TIDE TOOL, de-tiding is accomplished with a set of tide harmonic coefficients routinely computed and updated at PTWC for many of the stations in PTWC's inventory (~570). PTWC also uses the decoded time-series files (the previous 3-5 days' worth) to compute on-the-fly tide coefficients. The latter is useful in cases where a station is new and a long-term stable set of tide coefficients is not available or cannot be easily obtained due to various non-astronomical effects. The international tsunami warning system is coordinated globally by the UNESCO IOC, and a number of countries in the Pacific Ocean, Indian Ocean, and Caribbean depend on TIDE TOOL to monitor tsunamis in real time.
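The on-the-fly de-tiding described above is essentially a least-squares harmonic fit; the following is a hedged sketch of that idea (not TIDE TOOL's Tcl/Tk implementation), using standard nominal constituent periods:

    import numpy as np

    # Fit mean + a few tidal constituents and subtract, leaving the residual
    # (tsunami) signal. Constituent periods (hours) are standard nominal values.

    CONSTITUENT_PERIODS_H = {"M2": 12.4206, "S2": 12.0000, "K1": 23.9345, "O1": 25.8193}

    def detide(t_hours, sea_level_m):
        """Least-squares harmonic fit; returns the de-tided residual."""
        cols = [np.ones_like(t_hours)]
        for period in CONSTITUENT_PERIODS_H.values():
            w = 2.0 * np.pi / period
            cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
        A = np.column_stack(cols)
        coeffs, *_ = np.linalg.lstsq(A, sea_level_m, rcond=None)
        return sea_level_m - A @ coeffs

    # Usage: residual = detide(t, eta) on a few days of 1-minute tide-gauge data.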
Tsunami Early Warning System in Italy and involvement of local communities
NASA Astrophysics Data System (ADS)
Tinti, Stefano; Armigliato, Alberto; Zaniboni, Filippo
2010-05-01
Italy is characterized by a long coastline and by a series of possible tsunamigenic sources: many active faults, onshore and offshore, some near the shoreline and in shallow water; active volcanoes (Etna, Stromboli and Campi Flegrei, for example); and continental margins where landslides can occur. All these threats justify the establishment of a tsunami early warning system (TEWS), especially in Southern Italy, where most of the sources capable of large, disastrous tsunamis are located. One of the main characteristics of such sources, which is however common to other countries and not only in the Mediterranean, is their vicinity to the coast, which means that in several cases the tsunami lead time before it strikes the coastal system is expected to be within 10-15 minutes. This time constraint requires conceiving and adopting specific plans for quick tsunami detection and alert dissemination in the TEWS, since the TEWS alert must obviously precede, not follow, the first tsunami arrival. The need for speed raises the specific problem of uncertainty, which is inherent to any forecast system but becomes a very big issue when the available time is short, since crucial decisions have to be taken with incomplete data and incomplete processing. This is precisely the main problem faced by a system like a TEWS in Italy. Uncertainties can be reduced by increasing the capabilities of the tsunami monitoring system, densifying the traditional instrumental networks (e.g. by strengthening seismic and especially coastal and offshore sea-level observation systems) in the identified tsunamigenic source areas. However, uncertainties, though expected to decrease as time passes after tsunami initiation, cannot be eliminated and have to be dealt with appropriately: they lead to under- and overestimation of tsunami size and arrival times, and to missed or false alerts; in other words, they degrade the performance of the tsunami predictors. The role of local communities in defining strategies in case of uncertain data is essential: only the involvement of such communities from the beginning of the planning and implementation phase of the TEWS, as well as in the definition of a decision-making matrix, can ensure appropriate response in case of emergency and, most importantly, the acceptance of the system in the long run. The efforts to implement the tsunami warning system in Italy should take the above-mentioned aspects properly into account. Involvement of local communities should primarily be realized through the involvement of the local components of the Civil Protection Agency, which is responsible for the implementation of the system over the Italian territory. A pilot project is being conducted in cooperation between the Civil Protection Service of Sicily and the University of Bologna (UNIBO) that includes the strengthening of the local sea-level monitoring system (TSUNET) and specific vulnerability and risk analyses, also exploiting results of national and European research projects (e.g. TRANSFER and SCHEMA) in which UNIBO had a primary role.
NASA Astrophysics Data System (ADS)
Miller, K. M.; Wilson, R. I.; Goltz, J.; Fenton, J.; Long, K.; Dengler, L.; Rosinski, A.; California Tsunami Program
2011-12-01
This poster will present an overview of successes and challenges observed by the authors during this major tsunami response event. The Tohoku, Japan tsunami was the most costly to affect California since the 1964 Alaskan earthquake and ensuing tsunami. The Tohoku tsunami caused at least $50 million in damage to public facilities in harbors and marinas along the coast of California, and resulted in one fatality. It was generated by a magnitude 9.0 earthquake which occurred at 9:46PM PST on Thursday, March 10, 2011 in the sea off northern Japan. The tsunami was recorded at tide gages monitored by the West Coast/Alaska Tsunami Warning Center (WCATWC), which projected tsunami surges would reach California in approximately 10 hours. At 12:51AM on March 11, 2011, based on forecasted tsunami amplitudes, the WCATWC placed the California coast north of Point Conception (Santa Barbara County) in a Tsunami Warning, and the coast south of Point Conception to the Mexican border in a Tsunami Advisory. The California Emergency Management Agency (CalEMA) activated two Regional Emergency Operation Centers (REOCs) and the State Operation Center (SOC). The California Geological Survey (CGS) deployed a field team which collected data before, during and after the event through an information clearinghouse. Conference calls were conducted hourly between the WCATWC and State Warning Center, as well as with emergency managers in the 20 coastal counties. Coordination focused on local response measures, public information messaging, assistance needs, evacuations, emergency shelters, damage, and recovery issues. In the early morning hours, some communities in low lying areas recommended evacuation for their citizens, and the fishing fleet at Crescent City evacuated to sea. The greatest damage occurred in the harbors of Crescent City and Santa Cruz. As with any emergency, there were lessons learned and important successes in managing this event. Forecasts by the WCATWC were highly accurate. Exercises and workshops have enhanced communications between state and local agencies, and emergency managers are more educated about what to expect. Areas for improvement include keeping people out of the hazard area; educating the non-English speaking community; and reinforcing the long duration and unpredictable peak damaging waves of these events to emergency managers. The Governor proclaimed a state of emergency in six counties and the President declared a major disaster on April 18, 2011, allowing federal assistance to support repairs and economic recovery. Detailed evaluation of local maritime response activities, harbor damage, and measured and observed tsunami current velocity data will help the California Tsunami Program develop improved tsunami hazard maps and guidance for maritime communities. The state program will continue to emphasize the importance of both tsunami warnings and advisories, the unpredictable nature of each tsunami, and encourage public understanding of tsunamis to prepare and protect themselves in the future.
Improving global flood risk awareness through collaborative research: Id-Lab
NASA Astrophysics Data System (ADS)
Weerts, A.; Zijderveld, A.; Cumiskey, L.; Buckman, L.; Verlaan, M.; Baart, F.
2015-12-01
Scientific and end-user collaboration on operational flood risk modelling and forecasting requires an environment where scientists and end-users can physically work together and demonstrate, enhance and learn about new tools, methods and models for forecasting and warning purposes. Therefore, Deltares has built a real-time demonstration, training and research infrastructure (an 'operational' room and ICT backend). This research infrastructure supports several functions: (1) real-time response and disaster management, (2) training, (3) collaborative research, and (4) demonstration. The infrastructure will be used for a mixture of these functions on a regular basis by Deltares and by scientists as well as end users such as universities, research institutes, consultants, governments and aid agencies. It facilitates emergency advice and support during international and national disasters caused by rainfall, tropical cyclones or tsunamis. It hosts research flood and storm-surge forecasting systems at global, continental and regional scales. It facilitates training for emergency and disaster management (along with hosting forecasting-system user trainings, for instance in the forecasting platform Delft-FEWS), both internally and externally. The facility is expected to inspire and initiate creative innovations by bringing together different experts from various organizations. The room hosts interactive modelling developments, participatory workshops and stakeholder meetings. State-of-the-art tools, models and software being applied across the globe are available and on display within the facility. We will present the Id-Lab in detail, with particular focus on the global operational forecasting systems GLOFFIS (Global Flood Forecasting Information System) and GLOSSIS (Global Storm Surge Information System).
PTWC Creating a New Catalog of Historic Tsunami Animations for NOAA Science-on-a-Sphere Exhibits
NASA Astrophysics Data System (ADS)
Becker, N. C.; Geschwind, L. R.; Wang, D.
2016-12-01
Throughout 2016 the Pacific Tsunami Warning Center (PTWC) has been developing a catalog of tsunami animations for NOAA's Science on a Sphere (SOS) display system. The SOS consists of a six-foot (1.8 m) diameter sphere that serves as a projection screen for four high-definition video projectors that can show any global dataset. SOS systems have been installed in over 100 locations around the world, primarily in venues such as science museums. Education and outreach are a vital part of PTWC's mission, and SOS can show the global impacts of tsunami hazards in an intuitive and engaging presentation analogous to a planetarium. PTWC has been releasing these animations for the anniversaries of significant tsunamis throughout the year and has so far produced them for Cascadia 1700, Chile 2010, Japan 2011, Aleutian Islands 1946, Alaska 1964, and Chile 1960; before the end of the year the library will also include Samoa 2009 and Sumatra 2004. PTWC created these animations at 8k video resolution to future-proof them against SOS upgrades such as higher-definition projectors and larger spheres. Though not the first SOS tsunami animations, these are the first to show impacts to coastlines, the criterion that PTWC uses to determine the tsunami hazard guidance it issues to the coastal populations it serves. These animations also use a common color scheme based on PTWC's alert criteria, so that they are consistent with each other as well as with PTWC's tsunami messages. PTWC created these animations using the same tsunami forecast model it routinely uses in its warning operations, and PTWC has even demonstrated that it can produce an SOS tsunami animation while a tsunami is still crossing the Pacific Ocean, so this library of animations can also be used to prepare docents and audiences to interpret such a real-time animation should it become available for the next major tsunami. One does not need access to an SOS exhibit, however, to view these animations. NOAA also maintains a website where the animations can be viewed in a web browser; the site also allows users to download the data along with software so that they may be viewed on a personal computer. PTWC also maintains a YouTube channel with Mercator-projected versions of these animations in the same style and color scheme as their SOS counterparts.
Multilingual Analysis of Twitter News in Support of Mass Emergency Events
NASA Astrophysics Data System (ADS)
Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.
2012-04-01
Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as the "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus for observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use Twitter analysis for situation-picture assessment, e.g. for planning relief actions. At present, a multilingual corpus of Twitter messages related to crises is being assembled, and domain-specific language resources such as multilingual terminology lists and language-specific Natural Language Processing (NLP) tools are being built up to help cross the language barrier. The final goal is to extend this work to the main languages spoken around the Mediterranean and to classify and extract relevant information from tweets, translating the main keywords into English.
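A deliberately simplified stand-in for the language-specific relevance filters described above might look like the following; the keyword lists are invented examples, not the TRIDEC classifiers:

    # Keyword-based relevance filter (illustrative only).

    KEYWORDS = {
        "en": {"earthquake", "tsunami", "aftershock", "tremor"},
        "tr": {"deprem", "tsunami", "artçı"},
        "el": {"σεισμός", "τσουνάμι"},
        "ro": {"cutremur", "tsunami"},
    }

    def is_relevant(tweet_text, lang):
        """Keep a tweet only if it mentions at least one event keyword."""
        tokens = {tok.strip(".,!?#@").lower() for tok in tweet_text.split()}
        return bool(tokens & KEYWORDS.get(lang, set()))

    print(is_relevant("Strong earthquake felt in Izmir, very scary", "en"))  # True
    print(is_relevant("Great coffee this morning", "en"))                    # False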
Holocene Tsunamis in Avachinsky Bay, Kamchatka, Russia
NASA Astrophysics Data System (ADS)
Pinegina, Tatiana K.; Bazanova, Lilya I.; Zelenin, Egor A.; Bourgeois, Joanne; Kozhurin, Andrey I.; Medvedev, Igor P.; Vydrin, Danil S.
2018-04-01
This article presents results of a study of tsunami deposits on the Avachinsky Bay coast, Kurile-Kamchatka island arc, NW Pacific. We used tephrochronology to assign ages to the tsunami deposits, to correlate them between excavations, and to restore paleo-shoreline positions. In addition to using established regional marker tephra, we establish a detailed tephrochronology for more local tephra from Avachinsky volcano. For the first time in this area, proximal to Kamchatka's primary population, we reconstruct the vertical runup and horizontal inundation for 33 tsunamis recorded over the past 4200 years, 5 of which are historical events - 1737, 1792, 1841, 1923 (Feb) and 1952. The runup heights for all 33 tsunamis range from 1.9 to 5.7 m, and inundation distances from 40 to 460 m. The average recurrence for historical events is 56 years, and for the entire study period 133 years. The data obtained make it possible to calculate tsunami frequencies by size, using the reconstructed runup and inundation, which is crucial for tsunami hazard assessment and long-term tsunami forecasting. Considering all available data on the distribution of historical and paleo-tsunami heights along eastern Kamchatka, we conclude that the southern part of the Kamchatka subduction zone generates stronger tsunamis than its northern part. The observed differences could be associated with variations in the relative velocity and/or coupling between the downgoing Pacific Plate and Kamchatka.
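Turning such a runup catalogue into size-frequency estimates is straightforward in principle; the sketch below uses invented placeholder runups, not the Avachinsky Bay data:

    import numpy as np

    # Back-of-the-envelope size-frequency estimate from a paleotsunami catalogue.
    # Runup values are placeholders for illustration only.

    runups_m = np.array([2.1, 5.7, 3.0, 1.9, 4.4, 2.6, 3.8])
    record_span_yr = 4200.0

    mean_recurrence = record_span_yr / runups_m.size
    print(f"mean recurrence ~ {mean_recurrence:.0f} yr")

    for threshold in (2.0, 3.0, 5.0):
        rate = np.sum(runups_m >= threshold) / record_span_yr   # events per year
        print(f"runup >= {threshold} m: ~1 per {1.0 / rate:.0f} yr")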
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; Mccullough, H. L.; Mungov, G.; Harris, E.
2012-12-01
The U.S. National Oceanic and Atmospheric Administration (NOAA) has primary responsibility for providing tsunami warnings to the Nation, and a leadership role in tsunami observations and research. A key component of this effort is easy access to authoritative data on past tsunamis, a responsibility of the National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics. Archive responsibilities include the global historical tsunami database, coastal tide-gauge data from US/NOAA-operated stations, the Deep-ocean Assessment and Reporting of Tsunamis (DART®) data, damage photos, and other related hazards data. Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Understanding the severity and timing of tsunami effects is important for tsunami hazard mitigation and warning. The global historical tsunami database includes the date, time, and location of the source event, magnitude of the source, event validity, maximum wave height, the total number of fatalities and dollar damage. The database contains additional information on run-ups (locations where tsunami waves were observed by eyewitnesses, field reconnaissance surveys, tide gauges, or deep-ocean sensors). The run-up table includes arrival times, distance from the source, measurement type, maximum wave height, and the number of fatalities and damage for the specific run-up location. Tide-gauge data are required for modeling the interaction of tsunami waves with the coast and for verifying propagation and inundation models. NGDC is the long-term archive for all NOAA coastal tide-gauge data and is currently archiving 15-second to 1-minute water-level data from the NOAA Center for Operational Oceanographic Products and Services (CO-OPS) and the NOAA Tsunami Warning Centers. DART® buoys, which are essential components of tsunami warning systems, are now deployed in all oceans, giving coastal communities faster and more accurate tsunami warnings. NOAA's National Data Buoy Center disseminates real-time DART® data, and NGDC processes and archives post-event 15-second high-resolution bottom-pressure time series data. An event-specific archive of DART® observations recorded during recent significant tsunamis, including the March 2011 Tohoku, Japan event, is now available through new tsunami event pages integrated with the NGDC global historical tsunami database. These pages are designed to deliver comprehensive summaries of each tsunami event, including socio-economic impacts, tsunami travel time maps, raw observations, de-tided residuals, spectra of the tsunami signal compared to the energy of the background noise, and wavelets. These data are invaluable to tsunami researchers and educators, as they are essential to providing a more thorough understanding of tsunamis, their propagation in the open ocean and their subsequent inundation of coastal communities. NGDC has collected 289 tide-gauge observations, 34 DART® and bottom pressure recorder (BPR) station observations, and over 5,000 eyewitness reports and post-tsunami field survey measurements for the 2011 Tohoku event.
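The "spectra of the tsunami signal compared to the energy of the background noise" mentioned above can be approximated with a standard spectral estimate; window settings below are assumptions, not NGDC's processing parameters:

    import numpy as np
    from scipy.signal import welch

    # Welch spectra of de-tided residuals: compare the event segment against a
    # quiet pre-event segment to see where the tsunami band stands above noise.

    def spectra(pre_event_residual, event_residual, dt_seconds=60.0):
        """Return frequencies plus background and event power spectra."""
        fs = 1.0 / dt_seconds                         # samples per second
        freqs, p_background = welch(pre_event_residual, fs=fs, nperseg=512)
        _, p_event = welch(event_residual, fs=fs, nperseg=512)
        return freqs, p_background, p_event

    # Bands where p_event stands well above p_background indicate the tsunami
    # signal; periods in minutes are 1.0 / (freqs * 60.0) for freqs > 0.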
Develop Probabilistic Tsunami Design Maps for ASCE 7
NASA Astrophysics Data System (ADS)
Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.
2014-12-01
A national standard for engineering design for tsunami effects has not existed before, and this significant risk has mostly been ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure, and the standard for tsunami loads and effects will apply to designs as part of tsunami preparedness. The provisions will also have significance as a post-tsunami recovery tool, for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models, ensuring that the probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach based on a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights. The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.
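For readers unused to return-period language, the probabilistic meaning of a "2,500-year" map can be sketched under the common Poisson-occurrence assumption (stated here as an assumption, not as the ASCE 7 definition):

    import math

    # Probability of at least one exceedance in an exposure window, assuming
    # Poissonian occurrence with rate 1 / return_period.

    def exceedance_probability(return_period_yr, exposure_yr):
        return 1.0 - math.exp(-exposure_yr / return_period_yr)

    # A 2,500-year hazard level corresponds to roughly a 2% chance in 50 years.
    print(f"{exceedance_probability(2500.0, 50.0) * 100:.1f}% in 50 yr")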
Annual Tropical Cyclone Report 2011
2012-05-24
... nuclear plant, still reeling after the tsunami disaster just a few months earlier. Operations at Kadena Air Base were put on hold with major ... conditions. Several of these early- to mid-season forming TCs exhibited "S"-shaped, looping, or generally erratic tracks, with numerous passages near or over ... track errors after-the-fact to extend the data base ... Mean forecast errors for all warned systems in the Northwest Pacific (120-hour along/cross track).
NASA Astrophysics Data System (ADS)
Klausner, V.; Mendes, Odim; Domingues, Margarete O.; Papa, Andres R. R.; Tyler, Robert H.; Frick, Peter; Kherani, Esfhan A.
2014-04-01
The vertical component (Z) of the geomagnetic field observed by ground-based observatories of the International Real-Time Magnetic Observatory Network has been used to analyze the magnetic fields induced by the movement of a tsunami, i.e., of electrically conducting seawater, through the geomagnetic field. We focus on the survey of minute-sampled geomagnetic variations induced by the tsunami of 27 February 2010 at the Easter Island (IPM) and Papeete (PPT) observatories. In order to detect the tsunami disturbances in the geomagnetic data, we used wavelet techniques. We observed an 85% correlation between the Z-component variation and the tide gauge measurements in the period range of 10 to 30 min, which may be due to two physical mechanisms: gravity waves and electric currents in the sea. As an auxiliary tool to verify the disturbed magnetic fields, we used maximum variance analysis (MVA). At PPT, the analyses show local magnetic variations associated with the tsunami arriving in advance of the sea-surface fluctuations by about 2 h. A first interpretation of the results suggests that wavelet techniques and MVA can be effectively used to characterize the tsunami contribution to the geomagnetic field, and could further be used to calibrate tsunami models and be implemented in real-time analysis for tsunami forecast scenarios.
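A crude alternative to the wavelet analysis described above is to band-pass both minute-sampled series in the 10-30 min period band and correlate them; the filter order and exact band edges below are assumptions, not the paper's processing:

    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass_10_30_min(x, dt_seconds=60.0, order=4):
        """Zero-phase band-pass between 10 and 30 min periods."""
        nyquist = 0.5 / dt_seconds                      # Hz
        low, high = 1.0 / 1800.0, 1.0 / 600.0           # 30-min and 10-min periods
        b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
        return filtfilt(b, a, x)

    def band_correlation(z_component_nt, tide_gauge_m, dt_seconds=60.0):
        """Correlation of the two series within the 10-30 min band."""
        zb = bandpass_10_30_min(z_component_nt, dt_seconds)
        tb = bandpass_10_30_min(tide_gauge_m, dt_seconds)
        return np.corrcoef(zb, tb)[0, 1]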
USGS: Science to understand and forecast change in coastal ecosystems
Myers, M.
2007-01-01
The multidisciplinary approach of the US Geological Survey (USGS), a principal science agency of the US Department of the Interior (DOI), to addressing the complex and cumulative impacts of human activities and natural events on US coastal ecosystems has proved remarkable for understanding and forecasting these changes. The USGS helps explain geologic, hydrologic, and biologic systems and their connectivity across landscapes and seascapes along the coastline. The USGS coastal science programs effectively deliver science and information to other scientists, managers, policy makers, and the public. The USGS provides scientific expertise, capabilities, and services to collaborative federal, regional, and state-led efforts, which are in line with the goals of the Ocean Action Plan (OAP) and Ocean Research Priorities Plan (ORPP). The organization is a leader in understanding terrestrial and marine environmental hazards such as earthquakes, tsunamis, floods, and landslides, and in assessing and forecasting coastal impacts using various specialized visualization techniques.
NASA Astrophysics Data System (ADS)
Tinti, S.; Tonini, R.; Armigliato, A.; Zaniboni, F.; Pagnoni, G.; Gallazzi, Sara; Bressan, Lidia
2010-05-01
The tsunamigenic earthquake (M 8.8) that occurred offshore central Chile on 27 February 2010 can be classified as a typical subduction-zone earthquake. The effects of the ensuing tsunami were devastating along the Chilean coast, especially between the cities of Valparaiso and Talcahuano and in the Juan Fernandez islands. The tsunami propagated across the entire Pacific Ocean, hitting with variable intensity almost all the coasts facing the basin. While the far-field propagation was quite well tracked, almost in real time, by the warning centres and reasonably well reproduced by the forecast models, the toll of lives and the severity of the damage caused by the tsunami in the near field occurred with no local alert or warning, sadly confirming that the protection of communities located close to tsunami sources is still an unresolved problem in the tsunami early warning field. The purpose of this study is two-fold. On the one hand, we perform numerical simulations of the tsunami starting from different earthquake models built on the basis of the preliminary seismic parameters (location, magnitude and focal mechanism) made available by the seismological agencies immediately after the event, or retrieved from more detailed and refined studies published online in the following days and weeks. The comparison with the available records of both offshore DART buoys and coastal tide gauges is used to place some preliminary constraints on the best-fitting fault model. The numerical simulations are performed by means of the finite-difference code UBO-TSUFD, developed and maintained by the Tsunami Research Team of the University of Bologna, Italy, which can solve both the linear and non-linear versions of the shallow-water equations on nested grids. On the other hand, we use the conclusions drawn in the first part in a tsunami early warning perspective. In the framework of the EU-funded project DEWS (Distant Early Warning System), we offer some points for discussion on the deficiencies of existing tsunami early warning concepts as regards warning the areas located close to the tsunami source, and on the strategies that should be followed in the near future in order to make significant progress in the protection and safeguarding of local communities.
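For orientation, the linear shallow-water option mentioned above can be illustrated by a minimal 1-D, flat-bottom finite-difference sketch; it is an illustration of the governing equations, not the UBO-TSUFD scheme, and all parameters are arbitrary:

    import numpy as np

    # 1-D linear shallow water on a staggered grid with reflective ends:
    #   d(eta)/dt = -d(h*u)/dx,   d(u)/dt = -g * d(eta)/dx

    def simulate(eta0, depth_m=4000.0, dx_m=2000.0, n_steps=2000, g=9.81):
        eta = eta0.copy()                       # surface elevation at cell centres
        u = np.zeros(eta.size - 1)              # velocity at cell interfaces
        dt = 0.5 * dx_m / np.sqrt(g * depth_m)  # satisfy the CFL condition
        for _ in range(n_steps):
            u -= g * dt / dx_m * np.diff(eta)           # momentum update
            flux = depth_m * u
            eta[1:-1] -= dt / dx_m * np.diff(flux)      # continuity (interior)
            eta[0] -= dt / dx_m * flux[0]               # reflective left wall
            eta[-1] += dt / dx_m * flux[-1]             # reflective right wall
        return eta

    # Usage: a 1 m Gaussian hump relaxes into two outward-travelling waves.
    x = np.arange(500) * 2000.0
    eta_final = simulate(np.exp(-((x - x.mean()) / 20000.0) ** 2))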
Warnings and reactions to the Tohoku tsunami in Hawaii
NASA Astrophysics Data System (ADS)
Houghton, B. F.; Gregg, C. E.
2012-12-01
The 2011 Tohoku tsunami was the first chance within the USA to document and interpret large-scale response and protective-action behavior with regard to a large, destructive tsunami since 1964. The 2011 tsunami offered a unique, short-lived opportunity to transform our understanding of individual and collective behavior in the US in response to a well-publicized tsunami warning and, in particular, to look at the complex interplay of official information sources, informal warnings and information-seeking in communities with significant physical impact from the 2011 tsunami. This study is focused on Hawaii, which suffered significant ($30 M) but localized damage from the 2011 Tohoku tsunami and underwent a full-scale tsunami evacuation. The survey contrasts three Hawaiian communities which either experienced significant tsunami damage (Kona) or little physical impact (Hilo, Honolulu). It also contrasts a long-established local community with experience of evacuation, destruction and loss of life in two tsunamis (Hilo) with a metropolitan population with a large visitor presence (Honolulu) that has not experienced a damaging tsunami in decades. Many factors affected behavior, such as personal perceptions of risk, beliefs, past exposure to the hazard, forecast uncertainty, trust in information sources, channels of transmission of information, the need for message confirmation, responsibilities, obligations, mobility, the ability to prepare, the availability of transportation and transport routes, and an acceptable evacuation center. We provide new information on how people reacted to warnings and tsunamis, especially with regard to the social integration of official warnings and social media. The results of this study will strengthen community resilience to tsunamis, working with emergency managers to integrate the strengths and weaknesses of the public responses into official response plans.
NASA Astrophysics Data System (ADS)
Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos
2017-04-01
Greece and adjacent coastal areas are characterized by high population exposure to tsunami hazard, and the Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis in the region. We performed a probabilistic tsunami hazard assessment for selected locations along the Greek coastline, namely the forecast points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog as long as 10,000 years for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, the real events being included in this catalog. For each event included in the synthetic catalog a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence of each event was determined by a Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. The hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In these forms our results can be easily compared to those obtained in other studies and further employed in the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no. 603839, 2013-10-30.
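The synthetic-catalogue idea described above can be sketched generically as follows; the rate, b-value and magnitude-to-amplitude scaling are invented placeholders, not the values used in this study:

    import numpy as np

    # Draw a long synthetic catalogue from a truncated Gutenberg-Richter
    # distribution and count amplitude exceedances at a "forecast point".

    rng = np.random.default_rng(42)
    years, annual_rate_m6plus, b = 10_000, 0.3, 1.0
    mmin, mmax = 6.0, 8.5

    n_events = rng.poisson(annual_rate_m6plus * years)
    u = rng.random(n_events)                      # truncated G-R sampling by inversion
    mags = mmin - np.log10(1.0 - u * (1.0 - 10.0 ** (-b * (mmax - mmin)))) / b

    # Toy magnitude-to-amplitude model with lognormal scatter (placeholder).
    amps = 10.0 ** (0.8 * (mags - 7.5)) * rng.lognormal(sigma=0.5, size=n_events)

    for a in (0.5, 1.0, 2.0):
        rate = np.sum(amps >= a) / years
        print(f"amplitude >= {a} m: annual exceedance rate ~ {rate:.4f}")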
NASA Astrophysics Data System (ADS)
Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano
2013-04-01
TRIDEC is an EU-FP7 project whose main goal is, in general terms, to develop suitable strategies for the management of crises that may arise in the Earth management field. The general paradigms adopted by TRIDEC to develop those strategies include intelligent information management, the capability of managing dynamically increasing volumes and dimensionality of information in complex events, and collaborative decision making in systems that are typically very loosely coupled. The two areas where TRIDEC applies and tests its strategies are tsunami early warning and industrial subsurface development. In the field of tsunami early warning, TRIDEC aims at developing a Decision Support System (DSS) that integrates 1) a set of seismic, geodetic and marine sensors devoted to the detection and characterisation of possible tsunamigenic sources and to monitoring the time and space evolution of the generated tsunami, 2) large-volume databases of pre-computed numerical tsunami scenarios, and 3) a proper overall system architecture. Two test areas are dealt with in TRIDEC: the western Iberian margin and the eastern Mediterranean. In this study, we focus on the western Iberian margin with special emphasis on the Portuguese coast. The strategy adopted in TRIDEC is to populate two different databases, called the "Virtual Scenario Database" (VSDB) and the "Matching Scenario Database" (MSDB), both of which deal only with earthquake-generated tsunamis. In the VSDB we numerically simulate a few large-magnitude events generated by the major known tectonic structures in the study area. Heterogeneous slip distributions on the earthquake faults are introduced to simulate events as "realistically" as possible. The members of the VSDB represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis management phase. The MSDB, on the other hand, contains a very large number (of the order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes located in the "vicinity" of the virtual scenario earthquake. In the DSS perspective, the members of the MSDB have to be suitably combined based on the information coming from the sensor networks, and the results are used during the crisis evolution phase to forecast the degree of exposure of different coastal areas. We provide examples from both databases, whose members are computed by means of the in-house software UBO-TSUFD, which implements the non-linear shallow-water equations and solves them over a set of nested grids that guarantee a suitable spatial resolution (a few tens of meters) in specific, suitably chosen coastal areas.
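One simple way to realize the "suitable combination" of MSDB members is a non-negative least-squares fit of pre-computed gauge waveforms to incoming sensor data. The sketch below illustrates the idea with synthetic placeholder waveforms; the matrix G, the scenario count and the noise level are assumptions, not TRIDEC's actual matching algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# Minimal sketch: columns of G are pre-computed gauge waveforms for K simple
# "matching" scenarios (MSDB members); `observed` is the incoming sensor signal.
n_samples, n_scenarios = 600, 8
G = np.random.default_rng(1).normal(size=(n_samples, n_scenarios))  # placeholder waveforms
true_w = np.array([0.0, 0.7, 0.0, 0.3, 0.0, 0.0, 0.0, 0.0])
observed = G @ true_w + 0.01 * np.random.default_rng(2).normal(size=n_samples)

# Non-negative least squares gives combination weights for the MSDB members.
weights, _ = nnls(G, observed)
forecast_at_coast = G @ weights  # combined scenario used to forecast coastal exposure
print(np.round(weights, 2))
```

The same weights can then be applied to the pre-computed coastal fields of each scenario to estimate exposure without re-running the tsunami model during the crisis.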
A probabilistic tsunami hazard assessment for Indonesia
NASA Astrophysics Data System (ADS)
Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.
2014-11-01
Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
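The Monte Carlo idea can be illustrated in a few lines: sample magnitudes from a truncated Gutenberg-Richter distribution, map each event to a coastal height through a placeholder source-to-coast relation with lognormal aleatory scatter, and count exceedances. All numerical values below (b-value, rates, the height relation) are illustrative assumptions, not parameters of this assessment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Truncated Gutenberg-Richter sampling (b-value, bounds and rate are assumptions).
b, m_min, m_max = 1.0, 6.0, 8.5
annual_rate_above_min = 0.5          # assumed rate of M >= m_min events on a source
n_years = 100_000                    # length of the simulated catalog

n_events = rng.poisson(annual_rate_above_min * n_years)
u = rng.random(size=n_events)
beta = b * np.log(10.0)
m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

# Placeholder "source-to-coast" relation standing in for the tsunami model,
# with lognormal aleatory scatter.
height = 10 ** (0.8 * (m - 7.5)) * rng.lognormal(0.0, 0.5, size=m.size)

for h in (0.5, 3.0):
    rate = (height > h).sum() / n_years
    print(f"annual P(height > {h} m) ~ {1 - np.exp(-rate):.4f}")
```

Epistemic uncertainty would be handled by repeating this loop for each logic-tree branch and weighting the resulting curves.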
NASA Astrophysics Data System (ADS)
Reymond, Dominique
2017-04-01
We present a tool for computing the complete arrival times of the dispersed wave-train of a tsunami. The calculation uses the exact formulation of tsunami dispersion (without approximations), at any desired period from one hour or more (for gravity-wave propagation) down to 10 s (the highly dispersed mode). The computation of the travel times is based on a summation of the time needed for the tsunami to cross each elementary block of a bathymetry grid along a path between the source and receiver at a given period. In addition, the source dimensions and the focal mechanism are taken into account to adjust the minimum travel time to the different possible points of emission of the source. A possible application of this tool is to forecast the arrival time of late tsunami waves that could produce the resonance of some bays and sites at higher frequencies than the gravity mode. The theoretical arrival times are compared to the observed ones, to the results obtained by TTT (P. Wessel, 2009) and to those obtained by numerical simulations. References: Wessel, P. (2009). Analysis of observed and predicted tsunami travel times for the Pacific and Indian oceans. Pure Appl. Geophys., 166:301-324.
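A minimal sketch of this cell-by-cell summation is given below: for each bathymetry cell along the path, the exact dispersion relation omega^2 = g*k*tanh(k*h) is solved for k at the chosen period and the crossing time is accumulated. Using the phase speed omega/k (rather than the group speed) and a uniform-depth path are simplifying assumptions made here for illustration; they are not necessarily the choices made by the tool described above.

```python
import numpy as np

GRAV = 9.81

def wavenumber(omega, depth, tol=1e-12):
    """Solve the exact dispersion relation omega^2 = g*k*tanh(k*h) for k
    by Newton iteration, starting from the deep-water guess k = omega^2/g."""
    k = omega**2 / GRAV
    for _ in range(100):
        f = GRAV * k * np.tanh(k * depth) - omega**2
        df = GRAV * (np.tanh(k * depth) + k * depth / np.cosh(k * depth) ** 2)
        k_new = k - f / df
        if abs(k_new - k) < tol:
            return k_new
        k = k_new
    return k

def travel_time(depths_m, step_m, period_s):
    """Sum the crossing time of each bathymetry cell along a path, using the
    phase speed c = omega/k at the chosen period."""
    omega = 2.0 * np.pi / period_s
    total = 0.0
    for h in depths_m:
        k = wavenumber(omega, h)
        total += step_m / (omega / k)
    return total

# Illustrative path: 2000 km of 4000 m deep ocean sampled every 10 km.
path = np.full(200, 4000.0)
print(travel_time(path, 10_000.0, period_s=1800.0) / 3600.0, "hours")  # close to the long-wave arrival
print(travel_time(path, 10_000.0, period_s=60.0) / 3600.0, "hours")    # strongly dispersed, arrives later
```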
Development of a new real-time GNSS data analysis system in GEONET for rapid Mw estimates in Japan
NASA Astrophysics Data System (ADS)
Kawamoto, S.; Miyagawa, K.; Yahagi, T.; Yamaguchi, K.; Tsuji, H.; Nishimura, T.; Ohta, Y.; Hino, R.; Miura, S.
2013-12-01
The 2011 off the Pacific Coast of Tohoku Earthquake (Mw 9.0) occurred on March 11, 2011. The earthquake and the following tsunami caused serious damage to a broad coastal area of east Japan. The Japan Meteorological Agency (JMA) operates the Tsunami Warning system, which is designed to forecast the tsunami height and its arrival time around 3 minutes after a large event. However, the first estimated magnitude Mj, which was used for Tsunami Warning issuance, was far below the real magnitude of the Tohoku event because of a saturation problem. Like most other magnitude scales, Mj saturates at values around 8.0. On the other hand, Mw represents the earthquake energy itself and can be calculated directly from the permanent displacements derived from geodetic measurements, without the saturation problem. The GNSS Earth Observation Network System (GEONET), operated by the Geospatial Information Authority of Japan (GSI), is one of the densest real-time GNSS networks in the world. The GEONET data and the recent rapid advancement of GNSS analysis techniques motivated us to develop a new system for tackling tsunami disasters. In order to provide a more reliable magnitude for Tsunami Warning, GSI and Tohoku University have jointly developed a new real-time analysis system in GEONET for quasi-real-time Mw estimation. Its targets are large earthquakes, especially those of Mw > 8.0, which would be saturated in the Tsunami Warning system. The real-time analysis system in GEONET mainly consists of three parts: (1) real-time GNSS positioning, (2) automated extraction of the displacement field due to the large earthquake, and (3) automated estimation of Mw with an approximated single rectangular fault. The positions of the stations are calculated using RTKLIB 2.4.1 (Takasu, 2011) in baseline mode with the predicted part of the IGS Ultra-Rapid precise orbit. For event detection, we adopt the 'RAPiD' algorithm (Ohta et al., 2012) or the Earthquake Early Warning issued by JMA. This whole process is completed within 10 seconds at most, and the estimated results are immediately announced to GSI staff by e-mail. We examined the system using recorded 1 Hz GEONET data from several past large earthquakes in Japan. The results showed that it could estimate a reliable Mw within a few minutes: Mw 8.9 for the 2011 Tohoku earthquake (Mw 9.0) after 172 seconds, Mw 7.6 for the 2011 off-Ibaraki earthquake (Mw 7.7) after 107 seconds, and Mw 8.0 for the 2003 Tokachi-oki earthquake (Mw 8.0) after 93 seconds. GSI launched a prototype in April 2012 with 146 GEONET stations covering mainly the Tohoku district and is now planning to extend it to the whole of Japan. We believe that this system will become one of the powerful tools supporting Tsunami Warning in order to prevent or mitigate the severe damage of future disastrous tsunamis.
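The final step of such a system reduces to the standard moment-magnitude relation applied to a single rectangular fault: M0 = mu*L*W*D and Mw = (2/3)(log10 M0 - 9.1). The sketch below uses round-number, Tohoku-scale fault dimensions as assumptions; they are not the parameters actually inverted by the GEONET system.

```python
import math

def moment_magnitude(length_m, width_m, slip_m, rigidity_pa=4.0e10):
    """Mw from a single rectangular fault: M0 = mu * L * W * D (in N*m),
    then Mw = (2/3) * (log10(M0) - 9.1)."""
    m0 = rigidity_pa * length_m * width_m * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Illustrative Tohoku-scale fault: 400 km x 200 km with 13 m of average slip.
print(round(moment_magnitude(400e3, 200e3, 13.0), 1))  # prints 9.0
```

Because the permanent displacement field scales with moment rather than with any band-limited amplitude, this estimate does not saturate the way Mj does.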
The November 15, 2006 Kuril Islands-Generated Tsunami in Crescent City, California
NASA Astrophysics Data System (ADS)
Dengler, L.; Uslu, B.; Barberopoulou, A.; Yim, S. C.; Kelly, A.
2009-02-01
On November 15, 2006, Crescent City in Del Norte County, California, was hit by a tsunami generated by a Mw 8.3 earthquake in the central Kuril Islands. Strong currents that persisted over an eight-hour period damaged floating docks and several boats and caused an estimated $9.2 million in losses. Initial tsunami alert bulletins issued by the West Coast and Alaska Tsunami Warning Center (WCATWC) in Palmer, Alaska, were cancelled about three and a half hours after the earthquake, nearly five hours before the first surges reached Crescent City. The largest-amplitude wave, 1.76 meters peak to trough, was the sixth cycle and arrived over two hours after the first wave. Strong currents, estimated at over 10 knots, damaged or destroyed three docks and caused cracks in most of the remaining docks. As a result of the November 15 event, WCATWC changed the definition of Advisory from a region-wide alert bulletin meaning that a potential tsunami is 6 hours or further away to a localized alert that tsunami water heights may approach warning-level thresholds in specific, vulnerable locations like Crescent City. On January 13, 2007, a similar Kuril event occurred, and hourly conferences between the warning center and regional weather forecast offices were held, with a considerable improvement in the flow of information to local coastal jurisdictions. The event highlighted the vulnerability of harbors to a relatively modest tsunami and underscored the need to improve public education regarding the duration of tsunami hazards, improve dialog between tsunami warning centers and local jurisdictions, and better understand the currents produced by tsunamis in harbors.
Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake
NASA Astrophysics Data System (ADS)
Satake, K.
2012-12-01
The March 11, 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history and the best-recorded subduction-zone earthquake in the world. In particular, various offshore geophysical observations revealed large horizontal and vertical seafloor movements, and the tsunami was recorded on high-quality, high-sampling gauges. Analysis of these tsunami waveforms yields a temporal and spatial slip distribution for the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both deep and shallow parts of the plate interface. Very large slip, ~25 m, off Miyagi on the deep part of the plate interface corresponds to an interplate earthquake of M 8.8, similar in location and size to the 869 Jogan earthquake model, and was responsible for the large tsunami inundation of the Sendai and Ishinomaki plains. Huge slip, more than 50 m, occurred on the shallow part near the trench axis ~3 min after the earthquake origin time. This delayed shallow rupture (M 8.8) was similar to the 1896 "tsunami earthquake" and was responsible for the large tsunami on the northern Sanriku coast, measured ~100 km north of the largest slip. Thus the Tohoku earthquake can be decomposed into an interplate earthquake and a triggered "tsunami earthquake." The Japan Meteorological Agency issued a tsunami warning 3 minutes after the earthquake and saved many lives. However, its initial estimate of tsunami height was too low, because the earthquake magnitude was initially estimated as M 7.9, hence the computed tsunami heights were lower. The JMA is attempting to improve the tsunami warning system, including technical developments to estimate the earthquake size within a few minutes by using various and redundant information, to deploy and utilize offshore tsunami observations, and to issue a warning based on the worst-case scenario if the possibility of a giant earthquake exists. Predicting the triggering of another large earthquake would still be a challenge. Tsunami hazard assessments and long-term forecasts of earthquakes have not considered such triggering or the simultaneous occurrence of different types of earthquakes. The large tsunami at the Fukushima nuclear power station was due to the combination of the deep and shallow slip. Disaster prevention for low-frequency but large-scale hazards must be considered. The Japanese government established a general policy for two levels: L1 and L2. The L2 tsunamis are the largest possible tsunamis, with a low frequency of occurrence, but cause devastating disaster once they occur. For such events, saving people's lives is the first priority, and soft measures such as tsunami hazard maps, evacuation facilities and disaster education will be prepared. The L1 tsunamis are expected to occur more frequently, typically once in a few decades, for which hard countermeasures such as breakwaters must be prepared to protect the lives and property of residents as well as economic and industrial activities.
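The decomposition into two ~M 8.8 subevents is consistent with simple moment addition, since seismic moments, not magnitudes, add. A quick check using the standard Mw-M0 relation (nothing specific to this study):

```python
import math

def m0(mw):
    """Seismic moment in N*m from moment magnitude."""
    return 10 ** (1.5 * mw + 9.1)

def mw(m0_val):
    """Moment magnitude from seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_val) - 9.1)

# Two ~M 8.8 subevents (deep interplate slip plus the shallow "tsunami
# earthquake") combine, by moment addition, to roughly the observed M 9.0.
print(round(mw(m0(8.8) + m0(8.8)), 1))  # prints 9.0
```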
NASA Astrophysics Data System (ADS)
Kato, Teruyuki; Terada, Yukihiro; Nagai, Toshihiko; Koshimura, Shun'ichi
2010-05-01
We have developed a GPS buoy system for monitoring tsunamis for over 12 years. The idea is that a buoy equipped with a GPS antenna and placed offshore can be an effective way of monitoring a tsunami before its arrival at the coast and of giving warning to coastal residents. The key technology for the system is real-time kinematic (RTK) GPS. We have successfully developed the system and have detected tsunamis of about 10 cm in height for three large earthquakes, namely the 23 June 2001 Peru earthquake (Mw 8.4), the 26 September 2003 Tokachi earthquake (Mw 8.3) and the 5 September 2004 earthquake (Mw 7.4). The GPS buoy system is also capable of monitoring sea waves, which are mainly caused by winds. The only difference between a tsunami and wind waves is their frequency range, so they can be separated from each other by a simple filtering technique. Given the success of the GPS buoy experiments, the system has been adopted as a part of the Nationwide Ocean Wave information system for Port and HArborS (NOWPHAS) by the Ministry of Land, Infrastructure, Transport and Tourism of Japan. More than eight GPS buoys have been established along the Japanese coasts, and the system is operated by the Port and Airport Research Institute. As a future scope, we are now planning to implement some additional facilities for the GPS buoy system. The first application is a so-called GPS/Acoustic system for monitoring ocean-bottom crustal deformation. The system uses acoustic waves to detect the ocean-bottom reference position, which is the geometrical center of an array of transponders, by measuring distances between a position at the sea surface (vessel) and the ocean-bottom equipment that returns the received sonic wave. The position of the vessel is measured using GPS. The system was first proposed by a research group at the Scripps Institution of Oceanography in the early 1980s. It was extensively developed by Japanese researchers and is now capable of determining ocean-bottom positions with an accuracy of a few centimeters. The system is now operational at more than ten sites along the Japanese coasts. Currently, however, the measurements are not continuous but are made once to several times a year using a boat. If a GPS/Acoustic system were placed on a buoy, the ocean-bottom position could be monitored in a near-real-time and continuous manner. This would allow us to monitor more detailed and short-term crustal deformation at the sea bottom. Another planned application is atmospheric research. Previous researchers have shown that GPS is capable of measuring atmospheric water vapor through estimates of the tropospheric zenith delay at the sea surface. Information on water vapor content and its temporal variation over the sea surface would contribute much to weather forecasting on land, which has mostly been based on land observations alone. Considering that atmospheric masses generally move from west to east in and around the Japanese islands, water vapor information, together with other atmospheric data from an array of GPS buoys placed to the west of the Japanese Islands, would much improve weather forecasts. We will examine whether this is also feasible. As a conclusion of the series of GPS buoy experiments, we assert that the GPS buoy system will be a powerful tool for monitoring the ocean surface and will contribute much to the safe and secure life of coastal populations.
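The frequency-based separation mentioned above can be illustrated with a simple low-pass filter applied to synthetic 1 Hz buoy heights; the cutoff period, wave amplitudes and noise level below are illustrative assumptions, not the NOWPHAS processing chain.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic 1 Hz GPS-buoy sea-surface heights: 10 s wind waves (~0.5 m)
# plus a small long-period "tsunami" signal (10 cm, ~20 min period).
fs = 1.0
t = np.arange(0, 3 * 3600, 1 / fs)
wind_waves = 0.5 * np.sin(2 * np.pi * t / 10.0)
tsunami = 0.10 * np.sin(2 * np.pi * t / 1200.0)
ssh = wind_waves + tsunami + 0.02 * np.random.default_rng(0).normal(size=t.size)

# Low-pass filter with a cutoff period (~100 s) chosen between the wind-wave
# and tsunami bands; the cutoff is an illustrative choice.
b, a = butter(4, (1 / 100.0) / (fs / 2.0), btype="low")
tsunami_estimate = filtfilt(b, a, ssh)

print(np.max(np.abs(tsunami_estimate)))  # ~0.1 m: the tsunami component recovered
```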
NASA Astrophysics Data System (ADS)
Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo
2013-04-01
Geoscientists often deal with hazardous processes such as earthquakes, volcanic eruptions, tsunamis and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at providing assessments of the space and time evolution of a given individual event (i.e., short-term prediction) and of the expected evolution of a group of events (i.e., statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasting ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be if the scientific approach is sound, and the smaller the associated uncertainties. However, there are several important cases where an assessment has to be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. The first is the case of warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European funded project NearToWarn. Warning has to be issued before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (e.g., because a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention of tsunami strikes. Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistically sufficient number of large tsunamis, which entails that tsunami hazard has to be estimated by means of hypothesized worst-case scenarios, whose consequences are evaluated accordingly and are usually associated with large uncertainty bands. In the case of large uncertainties, the main issues for geoscientists are how to communicate the information (predictions and uncertainties) to stakeholders and citizens and how to build and implement together responsive procedures that are adequate. Usually there is a tradeoff between the cost of the countermeasure (warning and prevention) and its efficacy (i.e., its capability of minimizing the damage). The level of the acceptable tradeoff is an issue pertaining to decision makers and to the local threatened communities. This paper, which represents a contribution from the European project TRIDEC on the management of emergency crises, discusses the role of geoscientists in providing predictions and the related uncertainties. It is stressed that through academic education geoscientists are trained mainly to improve their understanding of processes and the quantification of uncertainties, but are often unprepared to communicate their results in a way appropriate for society. Filling this gap is crucial for improving the way geoscience and society handle natural hazards and devise proper defense measures.
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; Weaver, C.
2007-12-01
In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method to assess tsunami hazard involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines in order to extend the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.
NASA Astrophysics Data System (ADS)
Loevenbruck, A.; Quentel, E.; Hebert, H.
2011-12-01
The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional center, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is being set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining, along the French Mediterranean coast, the tsunami risk related to earthquakes and landslides. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Tsunamis have already affected the western Mediterranean coast; however, past events are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian Sea and the South Tyrrhenian Sea are considered the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by somewhat enhancing the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. Models of propagation in the basin and off the French coast allow us to evaluate the potential threat at regional scale in terms of source location and highlight the most exposed areas. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the availability of appropriate DTMs (Digital Terrain Models). Indeed, the propagation models' accuracy relies on the resolution of the input bathymetry, especially in shallow-water areas, and the inundation estimation also depends on the precision of the coastal topographic data. The ALDES project allows the SHOM and the IGN to conduct high-resolution data acquisition in the Litto3D framework for 2 sites, one west of the Gulf of Lion and one west of the French Riviera. DTMs of the third site, centered on the Antibes Cape, are built using pre-existing data sets with lower resolution. Detailed modeling of the tsunami scenarios provides refined estimates of the potential impacts; it points out the most exposed places and the morphologic features prone to amplify potential waves and generate significant coastal effects. Expected water heights and currents, inundation distances and run-up elevations are assessed. Our set of simulations gives an evaluation of the expected maximum impact distribution and highlights places, such as specific beaches or harbors, where mitigation measures must be given priority.
NASA Astrophysics Data System (ADS)
Dilmen, Derya I.; Titov, Vasily V.; Roe, Gerard H.
2015-12-01
On September 29, 2009, an Mw = 8.1 earthquake at 17:48 UTC in the Tonga Trench generated a tsunami that caused heavy damage across the Samoa, American Samoa, and Tonga islands. Tutuila island, located 250 km from the earthquake epicenter, experienced tsunami flooding and strong currents on its north and east coasts, causing 34 fatalities (out of 192 total deaths from this tsunami) and widespread structural and ecological damage. The surrounding coral reefs also suffered heavy damage, which was formally evaluated based on detailed surveys before and immediately after the tsunami. This setting thus provides a unique opportunity to evaluate the relationship between tsunami dynamics and coral damage. In this study, estimates of the maximum wave amplitudes and coastal inundation of the tsunami are obtained with the MOST model (Titov and Synolakis, J. Waterway Port Coast Ocean Eng: pp 171, 1998; Titov and Gonzalez, NOAA Tech. Memo. ERL PMEL 112:11, 1997), which is now the operational tsunami forecast tool used by the National Oceanic and Atmospheric Administration (NOAA). The earthquake source function was constrained using real-time deep-ocean tsunami data from three DART® (Deep-ocean Assessment and Reporting of Tsunamis) systems in the far field, and by tide-gauge observations in the near field. We compare the simulated run-up with observations to evaluate the simulation performance. We present an overall synthesis of the tide-gauge data, survey results of the run-up, inundation measurements, and the datasets of coral damage around the island. These data are used to assess the overall accuracy of the model run-up prediction for Tutuila and to evaluate the model accuracy over the coral reef environment during the tsunami event. Our primary findings are that: (1) MOST-simulated run-up correlates well with observed run-up for this event (r = 0.8), but it tends to underestimate amplitudes over the coral reef environment around Tutuila (for 15 of 31 villages, run-up is underestimated by more than 10%; in only 5 is run-up overestimated by more than 10%), and (2) the locations where the model underestimates run-up also tend to have experienced heavy or very heavy coral damage (8 of the 15 villages), whereas well-estimated run-up locations characteristically experienced low or very low damage (7 of 11 villages). These findings imply that the numerical model may overestimate the energy loss of the tsunami waves during their interaction with the coral reef. We plan future studies to quantify this energy loss and to explore what improvements can be made in simulations of tsunami run-up in coastal environments with fringing coral reefs.
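The comparison reported in finding (1) boils down to a correlation coefficient and an over/under-prediction count per village. The sketch below uses made-up run-up values purely to show the bookkeeping, not the paper's data.

```python
import numpy as np

# Illustrative (made-up) run-up values in meters for a handful of villages.
observed = np.array([3.2, 4.1, 1.8, 5.5, 2.4, 6.0, 3.9])
simulated = np.array([2.7, 3.6, 1.9, 4.4, 2.5, 5.1, 3.8])

r = np.corrcoef(observed, simulated)[0, 1]
under = (simulated < 0.9 * observed).sum()   # underestimated by more than 10%
over = (simulated > 1.1 * observed).sum()    # overestimated by more than 10%
print(f"r = {r:.2f}, underestimated: {under}/{len(observed)}, overestimated: {over}")
```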
Recent improvements in earthquake and tsunami monitoring in the Caribbean
NASA Astrophysics Data System (ADS)
Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.
2007-12-01
Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA's Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations in Barbuda, Grand Turk, and Jamaica will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunamis (DART) stations at sites in regions with a history of generating destructive tsunamis. Recently, NOAA completed deployment of 7 DART stations off the coasts of Montauk Pt., NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New Orleans, LA; and Bermuda as part of the U.S. tsunami warning system expansion. DART systems consist of an anchored seafloor bottom pressure recorder (BPR) and a companion moored surface buoy for real-time communications. The new stations are a second-generation design (DART II) equipped with two-way satellite communications that allow NOAA's Tsunami Warning Centers to set stations in event mode in anticipation of possible tsunamis or to retrieve the high-resolution (15-s interval) data in one-hour blocks for detailed analysis. Combined with the development of sophisticated wave propagation and site-specific inundation models, the DART data are being used to forecast wave heights for at-risk coastal communities. NOAA expects to deploy a total of 39 DART II buoy stations by 2008 (32 in the Pacific and 7 in the Atlantic, Caribbean and Gulf regions). The seismic and DART networks are two components of a comprehensive and fully operational global observing system to detect and warn the public of earthquake and tsunami threats. NOAA and USGS are working together to make important strides in enhancing communication networks so that residents and visitors can receive earthquake and tsunami watches and warnings around the clock.
NASA Astrophysics Data System (ADS)
Quentel, E.; Loevenbruck, A.; Hébert, H.
2012-04-01
The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis have occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional center, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is being set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system, including local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project, to which the CEA also contributes. It aims at examining, along the French Mediterranean coast, the tsunami risk related to earthquakes and landslides. In addition to the evaluation at regional scale, it includes detailed studies of 3 selected sites; the local alert system will be designed for one of them: the French Riviera. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the western Mediterranean coast but are too few and too poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact on the French coast is performed based on historical data, seismotectonics and first-order models. The North Africa Margin, the Ligurian Sea and the South Tyrrhenian Sea are considered the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by somewhat enhancing the largest known values. Our hazard estimation is based on the simulation of the induced tsunami scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the availability of appropriate DTMs (Digital Terrain Models). The ALDES project allows the SHOM and the IGN to conduct high-resolution data acquisition in the Litto3D framework for 2 sites, one west of the Gulf of Lion (3 m) and one west of the French Riviera (3 m). DTMs of the third site, centered on the Antibes Cape, are built using pre-existing data sets with lower resolution (10 m). Detailed models for the selected sites are then run using the high-resolution bathymetric and topographic data; they provide estimates of water heights and currents, inundation distances and run-up elevations. They point out the most exposed places and the morphologic features prone to amplify potential waves and generate significant coastal effects. Our set of simulations gives an evaluation of the expected maximum impact distribution and highlights places, such as specific beaches or harbors, where mitigation measures must be given priority.
NASA Astrophysics Data System (ADS)
Gregg, C. E.; Sorensen, J. H.; Vogt Sorensen, B.; Whitmore, P.; Johnston, D. M.
2016-12-01
Spurred in part by worldwide interest in improving warning messaging for and response to tsunamis in the wake of several catastrophic tsunamis since 2004, and by growing interest at the US National Weather Service (NWS) in integrating social science into its Tsunami Program, the NWS Tsunami Warning Centers in Alaska and Hawaii have made great progress toward enhancing tsunami messages. These include numerous products, among them Tsunami Warnings, Tsunami Advisories and Tsunami Watches. Beginning in 2010, we worked with US National Tsunami Hazard Mitigation Program (NTHMP) Warning Coordination and Mitigation and Education Subcommittee members, Tsunami Program administrators, and NWS Weather Forecast Offices to conduct a series of focus group meetings with stakeholders in coastal areas of Alaska, American Samoa, California, Hawaii, North Carolina, Oregon, the US Virgin Islands and Washington to understand end-user perceptions of existing messages and their needs in message products. We also reviewed the research literature on behavioral response to warnings to develop a Tsunami Warning Message Metric that could be used to guide revisions to the tsunami warning messages of both warning centers. The message metric is divided into the categories of Message Content, Style, Order, Formatting, and Receiver Characteristics. A sample message is evaluated by cross-referencing the message with the operational definitions of the metric factors. Findings are then used to guide revisions of the message until the characteristics of each factor are met, whether the message is a full-length or a short message. Incrementally, this work contributed to revisions in the format, content and style of message products issued by the National Tsunami Warning Center (NTWC). Since that time, interest in short warning messages has continued to increase, and in May 2016 the NTWC began efforts to revise message products to take advantage of recent NWS policy changes allowing the use of mixed-case text and expanded punctuation, a practice which the NWS first started in 2010. Here we describe our application of a modified warning message metric to develop new streamlined messages using mixed-case text. These messages reflect current state-of-the-art knowledge on warning message effectiveness.
New Science Applications Within the U.S. National Tsunami Hazard Mitigation Program
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Eble, M. C.; Forson, C. K.; Horrillo, J. J.; Nicolsky, D.
2017-12-01
The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is a collaborative State and Federal program which supports consistent and cost-effective tsunami preparedness and mitigation activities at the community level. The NTHMP is developing a new five-year Strategic Plan based on the 2017 Tsunami Warning, Education, and Research Act as well as the recommendations of the 2017 NTHMP External Review Panel. Many NTHMP activities are based on the best available scientific methods through the NTHMP Mapping and Modeling Subcommittee (MMS). The primary activities of the MMS member States are to characterize significant tsunami sources, numerically model those sources, and create tsunami inundation maps for evacuation planning. This work remains a focus for many unmapped coastlines. With the lessons learned from the 2004 Indian Ocean and 2011 Tohoku, Japan, tsunamis, where both immediate risks and long-term recovery issues were recognized, the NTHMP MMS is expanding efforts into other areas that address community resilience. Tsunami evacuation modeling based on both pedestrian and vehicular modes of transportation is being developed by NTHMP States. Products include tools for the public to create personal evacuation maps. New tsunami response planning tools are being developed for both maritime and coastal communities. Maritime planning includes tsunami current-hazard maps for in-harbor and offshore response activities. Multi-tiered tsunami evacuation plans are being developed in some states to address local- versus distant-source tsunamis, as well as real-time evacuation plans, or "playbooks," for distant-source tsunamis forecast to be less than the worst-case flood event. Products to assist community mitigation and recovery are being developed at the State level. Harbor Improvement Reports, which evaluate the impacts of currents, sediment, and debris on harbor infrastructure, include direct mitigation activities for Local Hazard Mitigation Plans. Building code updates in the five Pacific states will include new sections on tsunami load analysis of structures and will require Tsunami Design Zones based on probabilistic analyses. Guidance for community recovery planning has also been initiated. These new projects are being piloted by some States and will help create guidance for other States in the future.
A probabilistic tsunami hazard assessment for Indonesia
NASA Astrophysics Data System (ADS)
Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.
2014-05-01
Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling of probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
NASA Astrophysics Data System (ADS)
Arnaud, G.; Krien, Y.; Zahibo, N.; Dudon, B.
2017-12-01
Coastal hazards are among the most worrying threats of our time. In a context of climate change coupled with a large population increase, tropical areas could be the most exposed zones of the globe. In such circumstances, understanding the underlying processes can help to better predict storm surges and the associated global risks. Here we present partial preliminary results of a multidisciplinary project focused on the effects of climate change on coastal threats in the French West Indies, funded by the European Regional Development Fund. The study aims to provide a coastal hazard assessment based on hurricane surge and tsunami modeling, including several aspects of climate change that can affect hazards, such as sea-level rise, crustal subsidence/uplift, and coastline changes. Several tsunami scenarios have been simulated, including tele-tsunamis, to cover a large range of tsunami hazards. Hurricane surge levels have been calculated using a large number of synthetic hurricanes covering the current and projected climate over the tropical Atlantic Ocean. This hazard assessment will later be coupled with an assessment of the elements at risk across the territory to provide risk maps.
An observation on the main factor for the high fatalities by the March 11 earthquake
NASA Astrophysics Data System (ADS)
Ishida, M.; Baba, T.; Ando, M.
2011-12-01
On 11 March 2011, an Mw 9.0 earthquake occurred in the Tohoku district of northeastern Japan and caused a large tsunami which affected the greater part of the area. In the 115 years prior to this event, large tsunamis had struck the Tohoku region in 1960, 1933 and 1896. Therefore, disaster mitigation efforts had been undertaken in the Tohoku region, such as the construction of exceptionally strong breakwaters, annual tsunami evacuation drills, and the preparation of hazard maps. Despite these long-term efforts, ca. 25,000 deaths and missing persons were reported by the National Police Headquarters, Japan. In order to clarify the causes of such a high number of fatalities, we interviewed 120 tsunami survivors in 7 cities, mainly in Iwate prefecture, in several periods after the earthquake. Since the tsunami arrived 20-30 min or more after the strong ground shaking stopped and high ground is within about 10 to 20 minutes on foot, residents would have been saved if they had taken immediate action. We found several major reasons why the residents delayed their evacuation, as follows: 1. The earthquakes that had been forecast for offshore Tohoku by the governmental committee were much smaller than the March 11 event. Accordingly, evacuation shelters were located at levels lower than that required for the incoming tsunami; 2. The earthquake magnitude and tsunami height of the first warning issued by the Japan Meteorological Agency (JMA) were significantly smaller than those of the actual event. The majority of local residents thought that the breakwaters would protect them. The JMA revised the earthquake magnitude and tsunami height step by step, but the corrected information did not reach the local residents because of the electric power blackout. Consequently, the residents were unable to get the updated information through TV or radio; 3. Fifty percent of the local residents had experienced the 1960 Chile tsunami, which was significantly smaller than the March 11 tsunami. Most of them had estimated the height and inundation area of the incoming tsunami based on that experience; 4. People had believed that the breakwaters would protect the city from the tsunami, but the March 11 tsunami climbed over and destroyed most breakwaters. Focusing on the reliance on breakwaters that delayed the evacuation of residents, we numerically simulated the tsunami height caused by the March 11 event in Kamaishi city for three cases: 1. with breakwaters, 2. without breakwaters, 3. with partially collapsed breakwaters. Our preliminary results showed that the tsunami height does not differ much among the three cases during the first 20 min or so. Details of the results will be shown in the poster. It is notable that overconfidence in the breakwaters delayed the evacuation of local residents, although other reasons also influenced their behavior. Finally, we emphasize that educating children at a young age to understand the basic mechanism of tsunami generation is important and essential, even if technology underestimates tsunami heights, warning systems fail, and breakwaters are not sturdy enough.
Effect of Variable Manning Coefficients on Tsunami Inundation
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Rees, D.
2017-12-01
Numerical simulations are commonly used to help estimate tsunami hazard, improve evacuation plans, issue or cancel tsunami warnings, and inform forecasting and hazard assessments, and have therefore become an integral part of hazard mitigation in the tsunami community. Many numerical codes exist for simulating tsunamis, most of which have undergone extensive benchmarking and testing. Tsunami hazard or risk assessments employ these codes following a deterministic or probabilistic approach. Depending on the scope, these studies may or may not consider uncertainty in the numerical simulations, the effects of tides, variable friction, or estimates of financial losses, none of which are necessarily trivial. Distributed Manning coefficients, the roughness coefficients used in hydraulic modeling, are commonly used in simulating both riverine and pluvial flood events; however, their use in tsunami hazard assessments has been largely limited to small-scope studies and is, for the most part, not standard practice. For this work, we investigate variations in Manning coefficients and their effects on tsunami inundation extent, pattern and financial loss. To assign Manning coefficients we use land-use maps from the New Zealand Land Cover Database (LCDB) and more recent data from the Ministry for the Environment. More than 40 classes covering different types of land use are combined into major classes, such as cropland, grassland and wetland, representing common types of land use in New Zealand, each of which is assigned a unique Manning coefficient. By utilizing different data sources for variable Manning coefficients, we examine the impact of data sources and classification methodology on the accuracy of model outputs.
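A minimal sketch of the class-to-roughness assignment is shown below; the class names and Manning's n values are illustrative assumptions rather than the LCDB-derived lookup used in the study.

```python
import numpy as np

# Illustrative mapping from aggregated land-cover classes to Manning's n.
MANNING_BY_CLASS = {
    "open water": 0.025,
    "cropland": 0.035,
    "grassland": 0.030,
    "wetland": 0.050,
    "forest": 0.070,
    "built-up": 0.080,
}

def manning_grid(landcover, default_n=0.025):
    """Map a 2-D array of land-cover labels to a grid of Manning's n values."""
    lookup = np.vectorize(lambda c: MANNING_BY_CLASS.get(c, default_n))
    return lookup(landcover).astype(float)

landcover = np.array([["open water", "grassland"], ["built-up", "forest"]])
print(manning_grid(landcover))
```

The resulting grid of n values is what a friction-aware tsunami code consumes in place of a single uniform roughness coefficient.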
Coastal Warning Display Program
As of February 15, 1989, the National Weather Service retired its Coastal Warning Display Program.
NASA Astrophysics Data System (ADS)
Allgeyer, S.; Quentel, É.; Hébert, H.; Gailler, A.; Loevenbruck, A.
2017-08-01
Several major tsunamis have affected the southwest Indian Ocean area since the 2004 Sumatra event, and some of them (2005, 2006, 2007 and 2010) have hit La Réunion Island in the southwest Indian Ocean. However, tsunami hazard is not well defined for La Réunion Island, where vulnerable coastlines can be exposed. This study offers a first tsunami hazard assessment for La Réunion Island. We first review the historical tsunami observations made on the coastlines, where high tsunami waves (2-3 m) have been reported on the western coast, especially during the 2004 Indian Ocean tsunami. Numerical models of historical scenarios yield results consistent with the available observations at the coastal sites (the harbours of La Pointe des Galets and Saint-Paul). The 1833 Pagai earthquake and tsunami can be considered the worst-case historical scenario for this area. In a second step, we assess the tsunami exposure by covering the major subduction zones with synthetic events of constant magnitude (8.7, 9.0 and 9.3). The magnitude 8.7 scenarios all generate strong currents in the harbours (3-7 m s^{-1}) and about 2 m of maximum tsunami height, without significant inundation. The analysis of the magnitude 9.0 events confirms that the main commercial harbour (Port Est) is more vulnerable than Port Ouest and that flooding in Saint-Paul is limited to the beach area and the river mouth. Finally, the magnitude 9.3 scenarios show limited inundation close to the beach and in the riverbed in Saint-Paul. More generally, the results confirm that for La Réunion, the Sumatra subduction zone is the most threatening non-local source area for tsunami generation. This study also shows that far-field coastal sites should be prepared for tsunami hazard and that further work is needed to improve operational warning procedures. Forecast methods should be developed to provide tools that enable the authorities to anticipate the local effects of tsunamis and to evacuate the harbours in sufficient time when such an earthquake occurs.
Tsunami mitigation - redistribution of energy
NASA Astrophysics Data System (ADS)
Kadri, Usama
2017-04-01
Tsunamis are water waves caused by the displacement of a large volume of water, in the deep ocean or a large lake, following an earthquake, landslide, underwater explosion, meteorite impact, or other violent geological event. At the coastline, the resulting waves evolve from unnoticeable to devastating, reaching heights of tens of meters and causing destruction of property and loss of life. Over 225,000 people were killed in the 2004 Indian Ocean tsunami alone. For many decades, scientists have been studying tsunamis, and progress has been widely reported in connection with their causes (1), forecasting (2), and recovery (3). However, none of these studies addresses the direct mitigation of tsunamis, with the exception of mitigation using submarine barriers (e.g., see Ref. (4)). In an attempt to open a discussion on direct mitigation, I examine the feasibility of redistributing the total energy of a very long surface ocean (gravity) wave over a larger space through nonlinear resonant interaction with two finely tuned acoustic-gravity waves (see Refs. (5-8)). Theoretically, while the energy input in the acoustic-gravity waves required for an effective interaction is comparable to that in a tsunami (i.e., impractically large), employing the proposed mitigation technique the initial tsunami amplitude could be reduced substantially, resulting in a much milder impact at the coastline. Moreover, such a technique would allow for the harnessing of the tsunami's own energy. Practically, this mitigation technique requires the design of highly accurate acoustic-gravity wave frequency transmitters or modulators, which is a rather challenging ongoing engineering problem. References: 1. E. Bryant, 2014. Tsunami: the underrated hazard. Springer, doi:10.1007/978-3-319-06133-7. 2. V. V. Titov, F. I. González, E. N. Bernard, M. C. Eble, H. O. Mofjeld, J. C. Newman, A. J. Venturato, 2005. Real-Time Tsunami Forecasting: Challenges and Solutions. Nat. Hazards 35:41-58, doi:10.1007/1-4020-3607-8_3. 3. E. Check, 2005. Natural disasters: Roots of recovery. Nature 438, 910-911, doi:10.1038/438910a. 4. A. M. Fridman, L. S. Alperovich, L. Shemer, L. Pustil'nik, D. Shtivelman, A. G. Marchuk, D. Liberzon, 2010. Tsunami wave suppression using submarine barriers. Phys. Usp. 53:809-816, doi:10.3367/UFNe.0180.201008d.0843. 5. U. Kadri, M. Stiassnie, 2013. Generation of an acoustic-gravity wave by two gravity waves, and their mutual interaction. J. Fluid Mech. 735, R6, doi:10.1017/jfm.2013.539. 6. U. Kadri, 2015. Wave motion in a heavy compressible fluid: revisited. European Journal of Mechanics - B/Fluids, 49(A), 50-57, doi:10.1016/j.euromechflu.2014.07.008. 7. U. Kadri, T. R. Akylas, 2016. On resonant triad interactions of acoustic-gravity waves. J. Fluid Mech., 788, R1 (12 pages), doi:10.1017/jfm.2015.721. 8. U. Kadri, 2016. Triad resonance between a surface-gravity wave and two high-frequency hydro-acoustic waves. Eur. J. Mech. B/Fluids, 55(1), 157-161, doi:10.1016/j.euromechflu.2015.09.008.
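For readers unfamiliar with the mechanism, the generic triad-resonance conditions behind this idea are written below schematically; this is the standard textbook form with notation assumed here, not this paper's specific derivation. Subscripts 1 and 2 denote the two acoustic-gravity waves and subscript g the surface gravity (tsunami) wave.

```latex
% Schematic triad resonance conditions (generic form; notation assumed here)
\omega_1 \pm \omega_2 = \omega_g, \qquad \mathbf{k}_1 \pm \mathbf{k}_2 = \mathbf{k}_g
```

Each pair (omega_i, k_i) must additionally satisfy its own dispersion relation, which is what makes "finely tuned" transmitters necessary in practice.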
NASA Astrophysics Data System (ADS)
Kontar, Y. A.; Gusiakov, V. K.; Izbekov, P. E.; Gordeev, E.; Titov, V. V.; Verstraeten, I. M.; Pinegina, T. K.; Tsadikovsky, E. I.; Heilweil, V. M.; Gingerich, S. B.
2012-12-01
During the US-Russia Geohazards Workshop held July 17-19, 2012, in Moscow, Russia, the international research community was asked to identify cooperative actions for disaster risk reduction, focusing on extreme geophysical events. As part of this recommendation, the PIRE project was developed to understand, quantify, forecast and protect the coastal-zone aquifers and inland water resources of Kamchatka (Russia) and its ecosystems affected by the November 4, 1952 Kamchatka tsunami (Khalatyrka Beach near Petropavlovsk-Kamchatskiy) and the January 2, 1996 Karymskiy volcano eruption and lake tsunami. This project brings together teams from U.S. universities and from research institutions located in Russia. The research consortium was briefed on recent technical developments and will utilize samples secured via major international volcanic and tsunami programs for the purpose of advancing the study of submarine groundwater discharge (SGD) in the volcanic-eruption- and tsunami-affected coastal areas and inland lakes of Kamchatka. We plan to accomplish this project by developing and applying the next generation of field sampling, remote sensing, laboratory techniques and mathematical tools to study groundwater-surface water interaction processes and SGD. We will develop a field and modeling approach to define the SGD environment, its key controls, and the influence of volcanic eruptions and tsunamis, which will provide a framework for making recommendations to combat contamination. This is valuable for politicians, water resource managers and decision-makers, and for the water supply and water quality of the volcanic-eruption- and tsunami-affected region of Kamchatka. Data mining and the results of our field work will be compiled for spatial modeling in a Geographic Information System (GIS) using a 3-D Earth Systems Visualization Lab. The field and model results will be communicated to interested stakeholders via an interactive web site. This will allow computation of SGD spatial patterns. In addition, thanks to the conceptual integrated approach, the mathematical tool will be transportable to other regions affected by volcanic eruptions and tsunamis. We will involve students in the work, incorporate the results into our teaching portfolio, and work closely with the IUGG GeoRisk Commission and the AGU Natural Hazards Focus Group to communicate our findings to the broader public, specifically the local communities that will be most impacted. Under the PIRE education component, a cohort of U.S. and Russian post-doctoral researchers and students will receive training and contribute to the overall natural hazards SGD science agenda in cooperation with senior U.S. researchers and leading investigators from the Russian institutions. Overall, the extensive team of researchers, students and institutions is poised to deliver an innovative and broad spectrum of science associated with the study of SGD in volcanic-eruption- and tsunami-affected areas, in a way not possible to achieve in isolation.
Evaluation of Tsunami-HySEA for tsunami forecasting at selected locations in U.S.
NASA Astrophysics Data System (ADS)
Gonzalez Vida, J. M., Sr.; Ortega, S.; Castro, M. J.; de la Asuncion, M.; Arcas, D.
2017-12-01
The GPU-based Tsunami-HySEA model (Macias, J. et al., Pure and Applied Geophysics, 1-37, 2017; Lynett, P. et al., Ocean Modelling, 114, 2017) is used to test four tsunami events: the January 13, 2007 earthquake in the Kuril Islands (Mw 8.1), the September 29, 2009 earthquake in Samoa (Mw 8.3), the February 27, 2010 earthquake in Chile (Mw 8.8), and the March 11, 2011 earthquake in Tohoku (Mw 9.0). Initial conditions have been provided by the NOAA Center for Tsunami Research (NCTR), obtained from DART inversion results. All simulations have been performed using a global 4 arc-min grid of the Pacific Ocean and three nested-mesh levels around the selected locations. Wave amplitude time series have been computed at selected tide gauges at each location, and maximum amplitudes have been compared with both MOST model results and observations where available. In addition, inundation has also been computed at selected U.S. locations for the 2011 Tohoku and 2009 Samoa events under the assumption of a steady mean high water level. Finally, computational time is also evaluated in order to study the operational capabilities of Tsunami-HySEA for this kind of event. Acknowledgements: This work has been funded by the WE133R16SE1418 contract between PMEL (NOAA) and the Universidad de Málaga (Spain).
Societal acceptance of unnecessary evacuation
NASA Astrophysics Data System (ADS)
McCaughey, Jamie W.; Mundzir, Ibnu; Patt, Anthony; Rosemary, Rizanna; Safrina, Lely; Mahdi, Saiful; Daly, Patrick
2017-04-01
Uncertainties in forecasting extreme events force an unavoidable tradeoff between false alarms and misses. The appropriate balance depends on the level of societal acceptance of unnecessary evacuations, but there has been little empirical research on this. Intuitively it may seem that an unnecessary evacuation would make people less likely to evacuate again in the future, but our study finds no support for this intuition. Using new quantitative (n=800) and qualitative evidence, we examine individual- and household-level evacuation decisions in response to the strong 11-Apr-2012 earthquake in Aceh, Indonesia. This earthquake did not produce a tsunami, but the population had previously experienced the devastating 2004 tsunami. In our sample, the vast majority of people (86%) evacuated in the 2012 earthquake, and nearly all (94%) say they would evacuate again if a similar earthquake happened in the future. Self-reported level of fear at the moment of the 2012 earthquake explains more of the variance in evacuation decisions and intentions than does a combination of perceived tsunami risk and perceived efficacy of evacuation modeled on protection motivation theory. These findings suggest that the appropriate balance between false alarms and misses may be highly context-specific. Investigating this in each context would make an important contribution to the effectiveness of early-warning systems.
Simulation of Earthquake-Generated Sea-Surface Deformation
NASA Astrophysics Data System (ADS)
Vogl, Chris; Leveque, Randy
2016-11-01
Earthquake-generated tsunamis can carry with them a powerful, destructive force. One of the most well-known recent examples is the tsunami generated by the Tohoku earthquake, which was responsible for the nuclear disaster in Fukushima. Tsunami simulation and forecasting, a necessary element of emergency procedure planning and execution, is typically done using the shallow-water equations. A typical initial condition is one obtained from the Okada solution for a homogeneous, elastic half-space. This work focuses on simulating earthquake-generated sea-surface deformations that are more true to the physics of the materials involved. In particular, a water layer is added on top of the half-space that models the seabed. Sea-surface deformations are then simulated using the Clawpack hyperbolic PDE package. Results from treating the water layer both as linearly elastic and as "nearly incompressible" are compared with those of the Okada solution.
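To make the contrast concrete, here is a minimal 1-D NumPy sketch (not the authors' Clawpack setup) comparing the common rigid transfer of seafloor uplift to the sea surface, as with an Okada-type initial condition, against a classic Kajiura-type 1/cosh(kh) filter that accounts for the finite water depth. The domain, depth, and uplift shape are illustrative placeholders.

```python
import numpy as np

# Minimal 1-D illustration: transfer a Gaussian seafloor uplift to the sea
# surface (a) directly, as is commonly done with Okada-type initial
# conditions, and (b) through a Kajiura-type 1/cosh(kh) low-pass filter that
# accounts for the finite water depth h.
L = 400e3                      # domain length [m]
n = 4096                       # grid points
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
h = 4000.0                     # water depth [m]

uplift = 1.0 * np.exp(-(x / 20e3) ** 2)   # hypothetical 1 m seafloor uplift

k = 2 * np.pi * np.fft.fftfreq(n, d=x[1] - x[0])   # wavenumbers [rad/m]
eta_direct = uplift                                 # rigid "pass-through"
eta_filtered = np.real(np.fft.ifft(np.fft.fft(uplift) / np.cosh(k * h)))

print("max surface deformation, direct transfer :", eta_direct.max())
print("max surface deformation, Kajiura filter  :", eta_filtered.max())
```

For a broad source in deep water the filter changes the surface deformation only slightly; for short-wavelength seabed features the attenuation becomes substantial, which is one reason the water-layer treatment matters.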
NASA Astrophysics Data System (ADS)
Hou, Jingming; Yuan, Ye; Wang, Peitao; Ren, Zhiyuan; Li, Xiaojuan
2017-03-01
Major tsunami disasters often cause great damage in the first few hours following an earthquake. The possible severity of such events requires preparations to prevent tsunami disasters or mitigate them. This paper is an attempt to develop a decision support system for rapid tsunami evacuation for local decision makers. Based on the numerical results database of tsunami disasters, this system can quickly obtain the tsunami inundation and travel time. Because numerical models are calculated in advance, this system can reduce decision-making time. Population distribution, as a vulnerability factor, was analyzed to identify areas of high risk for tsunami disasters. Combined with spatial data, this system can comprehensively analyze the dynamic and static evacuation process and identify problems that negatively impact evacuation, thus supporting the decision-making for tsunami evacuation in high-risk areas. When an earthquake and tsunami occur, this system can rapidly obtain the tsunami inundation and travel time and provide information to assist with tsunami evacuation operations.
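A minimal sketch of the precomputed-database idea described above (names and values are hypothetical, not the authors' system): given a reported epicenter and magnitude, select the nearest precomputed scenario and return its stored inundation and travel-time products so that no simulation has to run at decision time.

```python
import math

# Hypothetical precomputed scenario database: each entry stores the source
# parameters used for the offline run and the resulting products (file names
# here are placeholders).
SCENARIOS = [
    {"lat": 38.0, "lon": 142.5, "mw": 9.0, "inundation": "sc001_inund.nc", "ttt": "sc001_ttt.nc"},
    {"lat": 38.5, "lon": 143.0, "mw": 8.5, "inundation": "sc002_inund.nc", "ttt": "sc002_ttt.nc"},
    {"lat": 24.0, "lon": 122.5, "mw": 8.0, "inundation": "sc003_inund.nc", "ttt": "sc003_ttt.nc"},
]

def nearest_scenario(lat, lon, mw, mw_weight=100.0):
    """Return the precomputed scenario closest to the reported source.

    Distance is a simple weighted combination of epicentral separation (km)
    and magnitude difference; a real system would use a more careful metric.
    """
    def cost(s):
        d_km = 111.0 * math.hypot(s["lat"] - lat,
                                  (s["lon"] - lon) * math.cos(math.radians(lat)))
        return d_km + mw_weight * abs(s["mw"] - mw)
    return min(SCENARIOS, key=cost)

# Example: an event reported at 38.2N, 142.8E, Mw 8.9
best = nearest_scenario(38.2, 142.8, 8.9)
print("use precomputed products:", best["inundation"], best["ttt"])
```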
Global mapping of nonseismic sea level oscillations at tsunami timescales.
Vilibić, Ivica; Šepić, Jadranka
2017-01-18
Present investigations of sea level extremes are based on hourly data measured at coastal tide gauges. The use of hourly data restricts existing global and regional analyses to periods larger than 2 h. However, a number of processes occur at minute timescales, of which the most ruinous are tsunamis. Meteotsunamis, hazardous nonseismic waves that occur at tsunami timescales over limited regions, may also locally dominate sea level extremes. Here, we show that nonseismic sea level oscillations at tsunami timescales (<2 h) may substantially contribute to global sea level extremes, up to 50% in low-tidal basins. The intensity of these oscillations is zonally correlated with mid-tropospheric winds at the 99% significance level, with the variance doubling from the tropics and subtropics to the mid-latitudes. Specific atmospheric patterns are found during strong events at selected locations in the World Ocean, indicating a globally predominant generation mechanism. Our analysis suggests that these oscillations should be considered in sea level hazard assessment studies. Establishing a strong correlation between nonseismic sea level oscillations at tsunami timescales and atmospheric synoptic patterns would allow for forecasting of nonseismic sea level oscillations for operational use, as well as hindcasting and projection of their effects under past, present and future climates.
Exploring Options for an Integrated Water Level Observation Network in Alaska
NASA Astrophysics Data System (ADS)
McCammon, M.
2016-02-01
Portions of Alaska's remote coastlines are among the Nation's most vulnerable to geohazards such as tsunamis, extra-tropical storm surge, and erosion, and the observations of water levels, ocean waves, and river discharge needed to support water level warnings and forecasts are severely lacking. Alaska is experiencing dramatic reductions in sea ice cover, changes in extra-tropical storm surge patterns, and thawing permafrost. These conditions are endangering coastal populations throughout the State. Gaps in the ocean observing system limit the State's ability to provide useful marine and sea ice forecasts, especially in the Arctic. A spectrum of observation platforms may provide an optimal solution for filling the most critical gaps in these coastal and ocean areas. The collaborations described in this talk, and better leveraging of resources and capabilities across federal, state, and academic partners, will provide the best opportunity for advancing our science capacity and capabilities in this remote region.
Navigating Declining Budgets, Political Hurdles: A New Vision for the Future of Geoscience
NASA Astrophysics Data System (ADS)
Gagosian, Robert B.
2013-06-01
The Oklahoma tornadoes, Superstorm Sandy, the Tohoku tsunami, and the Deepwater Horizon oil spill are just a few examples of oceanic, atmospheric, and other Earth system disasters in the past 3 years that together claimed thousands of lives and caused hundreds of billions of dollars of damage. Basic and applied research in the geosciences were essential in supporting early warnings and forecasts that were used not only to protect lives when these natural disasters struck but also to assess risks and help society to be better able to adapt and recover after disaster struck.
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; benki, Aalae
2017-04-01
Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and damage to structures. Advances in the numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be done by taking advantage of observations either of the tsunami waves (using networks of water level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like 1755 Lisbon. (1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea-bottom pressure gauges, GPS-mounted buoys), for past tsunami events the number of tide gauges can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise. These observations are often restricted to eyewitness accounts of wave heights (e.g., the maximum wave height reached at the coast) instead of the full observed waveforms. (2) Tsunami phenomena involve a large span of spatial scales (from ocean-basin scales to local coastal wave interactions), which can make the modelling very demanding: the computational cost of a tsunami simulation can be prohibitive, often reaching several hours. This limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both of the afore-described difficulties with a view to combining historical observations of past tsunami-induced waves and numerical simulations. In order to learn the uncertainty information on the source parameters, we treat the problem within the Bayesian setting, which enables the different uncertainty sources to be incorporated in a flexible manner. We propose to rely on an emerging technique called Approximate Bayesian Computation (ABC), which has been developed to estimate the posterior distribution in modelling scenarios where the likelihood function is either unknown or cannot be explicitly defined. To overcome the computational issue, we combine ABC with statistical emulators (aka meta-models). We apply the proposed approach to the case study of the 1887 Ligurian tsunami (northwest Italy) and discuss the results, with special attention paid to the impact of observational error.
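A minimal sketch of the ABC-rejection-plus-emulator idea described above (the emulator, prior ranges, tolerance and "observations" are illustrative placeholders, not the authors' setup): draw source parameters from the prior, predict coastal wave heights with a cheap surrogate, and keep only the draws whose predictions fall within a tolerance of the sparse observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "emulator": a cheap stand-in for the expensive tsunami code,
# mapping source parameters (a strength-like scale and an along-strike
# position) to predicted maximum wave heights at a few coastal sites.
def emulator(scale, position, sites=np.array([0.0, 0.3, 0.7, 1.0])):
    return scale * np.exp(-((sites - position) / 0.4) ** 2)

# Sparse, imprecise "historical" observations (placeholder values).
obs = np.array([1.8, 2.4, 1.1, 0.6])

# ABC rejection sampling: draw from the prior, simulate with the emulator,
# keep draws whose predictions are within a tolerance of the observations.
n_draws, tol = 100_000, 0.8
scale = rng.uniform(0.5, 5.0, n_draws)       # prior on source strength
position = rng.uniform(0.0, 1.0, n_draws)    # prior on source location
pred = emulator(scale[:, None], position[:, None])
dist = np.sqrt(((pred - obs) ** 2).mean(axis=1))
keep = dist < tol

print(f"accepted {keep.sum()} of {n_draws} draws")
print("posterior mean scale   :", scale[keep].mean())
print("posterior mean position:", position[keep].mean())
```

The accepted draws approximate the posterior without ever evaluating a likelihood; in practice the tolerance and the summary statistics (here a simple RMS mismatch) control the trade-off between accuracy and acceptance rate.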
NASA Astrophysics Data System (ADS)
Mattioli, Glen; Mencin, David; Hodgkinson, Kathleen; Meertens, Charles; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl
2016-04-01
The NSF-funded GAGE Facility, managed by UNAVCO, operates approximately 1300 GNSS stations distributed across North and Central America and in the circum-Caribbean. Following community input starting in 2011 from several workshops and associated reports, UNAVCO has been exploring ways to increase the capability and utility of the geodetic resources under its management to improve our understanding in diverse areas of geophysics, including the properties of seismic, volcanic, magmatic and tsunami deformation sources. Networks operated by UNAVCO for the NSF have the potential to profoundly transform our ability to rapidly characterize events and provide warnings, as well as to improve hazard mitigation and response. Specific applications currently under development include earthquake early warning, tsunami early warning, and tropospheric modeling, with university, commercial, non-profit and government partners on national and international scales. In the case of tsunami early warning, for example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of the earthquake source and associated deformation, which leads to the initial model of ocean forcing and tsunami generation. In addition, terrestrial GNSS can provide direct measurements of the tsunami through the associated traveling ionospheric disturbance from several hundreds of kilometers away as the waves approach the shoreline, which can be used to refine tsunami inundation models. Any operational system like this has multiple communities that rely on a pan-Pacific real-time open data set. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Combining existing data sets and user communities, for example seismic data and tide gauge observations, with GNSS and Met data products has proven complicated because of issues related to metadata, appropriate data formats, data quality assessment in real time, and other issues related to using these products in operational forecasting. While progress has been made toward more open and free data access across national borders and toward more cooperation among cognizant government-sanctioned "early warning" agencies, some impediments remain, making a truly operational system a work in progress. Accordingly, UNAVCO has embarked on significant improvements to the original infrastructure and scope of the PBO. We anticipate that PBO and related networks will form a backbone for these disparate efforts, providing high-quality, low-latency raw and processed GNSS data. This requires substantial upgrades to the entire system, from the basic GNSS receiver, through robust data collection, archiving and open distribution mechanisms, to efficient data-processing strategies. UNAVCO is currently in a partnership with commercial and scientific stakeholders to define, develop and deploy all segments of this improved geodetic network. We present the overarching goals, and the current and planned future state of this international resource.
Earthquake and submarine landslide tsunamis: how can we tell the difference? (Invited)
NASA Astrophysics Data System (ADS)
Tappin, D. R.; Grilli, S. T.; Harris, J.; Geller, R. J.; Masterlark, T.; Kirby, J. T.; Ma, G.; Shi, F.
2013-12-01
Several major recent events have shown the tsunami hazard from submarine mass failures (SMFs), i.e., submarine landslides. In 1992, a landslide triggered by a small earthquake generated a tsunami over 25 meters high on Flores Island. In 1998, a sediment slump triggered by another small earthquake generated a tsunami up to 15 meters high that devastated the local coast of Papua New Guinea, killing 2,200 people. It was this event that led to the recognition of the importance of marine geophysical data in mapping the architecture of seabed sediment failures, which can then be used in modeling and validating the tsunami-generating mechanism. Seabed mapping of the 2004 Indian Ocean earthquake rupture zone demonstrated, however, that large, if not great, earthquakes do not necessarily cause major seabed failures, and that along some convergent margins frequent earthquakes result in smaller sediment failures that are not tsunamigenic. Older events, such as Messina, 1908, Makran, 1945, Alaska, 1946, and Java, 2006, all have the characteristics of SMF tsunamis, but for these an SMF source has not been proven. When the 2011 tsunami struck Japan, it was generally assumed that it was directly generated by the earthquake. The earthquake has some unusual characteristics, such as a shallow rupture that is somewhat slow, but it is not a 'tsunami earthquake.' A number of simulations of the tsunami based on an earthquake source have been published, but in general the best results are obtained by adjusting fault rupture models with tsunami wave gauge or other data; to the extent that they can then reproduce the recorded tsunami data, this demonstrates self-consistency rather than validation. Here we consider some of the existing source models of the 2011 Japan event and present new tsunami simulations based on a combination of an earthquake source and an SMF mapped from offshore data. We show that the multi-source tsunami agrees well with available tide gauge data, field observations and the wave data from offshore buoys, and that the SMF generated the large runups in the Sanriku region (northern Tohoku). Our new results for the 2011 Tohoku event suggest that care is required in using tsunami wave and tide gauge data to both model and validate earthquake tsunami sources. They also suggest a potential pitfall in the use of tsunami waveform inversion from tide gauges and buoys to estimate the size and spatial characteristics of earthquake rupture: if the tsunami source has a significant SMF component, such studies may overestimate earthquake magnitude. Our seabed mapping identifies other large SMFs off Sanriku that have the potential to generate significant tsunamis and which should be considered in future analyses of the tsunami hazard in Japan. The identification of two major SMF-generated tsunamis (PNG and Tohoku), especially one associated with an M9 earthquake, is important in guiding future efforts at forecasting and mitigating the tsunami hazard from large megathrust-plus-SMF events both in Japan and globally.
NASA Astrophysics Data System (ADS)
Macpherson, K. A.
2017-12-01
The National Oceanic and Atmospheric Administration's National and Pacific Tsunami Warning Centers currently rely on traditional seismic data in order to detect and evaluate potential tsunamigenic earthquakes anywhere on the globe. The first information products disseminated by the centers following a significant seismic event are based solely on seismically derived earthquake locations and magnitudes, and are issued within minutes of the earthquake origin time. Thus, the rapid and reliable determination of the earthquake magnitude is a critical piece of information needed by the centers to generate the appropriate alert levels. However, seismically derived magnitudes of large events are plagued by well-known problems, particularly during the first few minutes following the origin time: near-source broad-band instruments may go off scale, and magnitudes tend to saturate until sufficient teleseismic data arrive to represent the long-period signal that characterizes large events. Geodetic data such as high-rate Global Positioning System (hGPS) displacements, and seismogeodetic data combining collocated hGPS and accelerometer records, do not suffer from these limitations. These sensors stay on scale, even for large events, and they record both dynamic and static displacements that may be used to estimate magnitude without saturation. Therefore, there is an ongoing effort to incorporate these data streams into the operations of the tsunami warning centers to enhance current magnitude determination capabilities and, eventually, to invert the geodetic displacements for mechanism and finite-fault information. These latter quantities will be useful for tsunami modeling and forecasting. The tsunami warning centers rely on the Earthworm system for real-time data acquisition, so we have developed Earthworm modules for the Magnitude from Peak Ground Displacement (MPGD) algorithm, developed at the University of Washington and the University of California, Berkeley, and a module for a Static Offset Estimator algorithm that was developed by the NASA Jet Propulsion Laboratory. In this presentation we will discuss module architecture and show output computed by replaying both synthetic and historical scenarios in a simulated real-time Earthworm environment.
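A minimal sketch of the PGD-based magnitude idea behind an MPGD-type module (the scaling coefficients and station observations below are illustrative placeholders, not the operational regression or real data): invert a scaling law of the form log10(PGD) = A + B*Mw + C*Mw*log10(R) at each station and average across the network.

```python
import numpy as np

# Illustrative sketch of magnitude estimation from peak ground displacement
# (PGD): PGD scales with magnitude and hypocentral distance R as
#   log10(PGD) = A + B*Mw + C*Mw*log10(R).
# The coefficients below are assumed for illustration only (PGD in cm, R in
# km); an operational module would use the published regression values.
A, B, C = -4.434, 1.047, -0.138

def mw_from_pgd(pgd_cm, r_km):
    """Invert the scaling law for Mw at a single station."""
    return (np.log10(pgd_cm) - A) / (B + C * np.log10(r_km))

# Hypothetical observations from three GNSS stations.
pgd_cm = np.array([45.0, 20.0, 9.0])
r_km = np.array([120.0, 250.0, 500.0])

estimates = mw_from_pgd(pgd_cm, r_km)
print("per-station Mw:", np.round(estimates, 2))
print("network Mw    :", round(float(estimates.mean()), 2))
```

Because peak displacement does not saturate the way short-period seismic amplitudes do, the network estimate stays meaningful even for very large events.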
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
As with seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis. Once a design-basis tsunami height is set, the possibility remains that the actual tsunami height will exceed it, owing to uncertainties in the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and of executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5th-, 16th-, 50th-, 84th- and 95th-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analyses. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
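A minimal sketch of the logic-tree aggregation step of a PTHA (all branch curves, rates and weights are illustrative placeholders): combine per-branch exceedance-rate curves into a weighted mean curve and weighted fractile (percentile) curves of the kind reported above.

```python
import numpy as np

# Each logic-tree branch supplies an annual exceedance-rate curve for tsunami
# height at a site; branch weights (summing to 1) express epistemic
# uncertainty.  All numbers here are toy values for illustration.
heights = np.linspace(0.5, 15.0, 30)            # tsunami height [m]

def branch_curve(h0):
    """Toy exceedance-rate curve: rate decays exponentially with height."""
    return 0.1 * np.exp(-heights / h0)

branches = np.array([branch_curve(h0) for h0 in (1.5, 2.5, 4.0)])
weights = np.array([0.3, 0.5, 0.2])

mean_curve = weights @ branches                  # weighted mean hazard curve

# Weighted fractile curves (e.g. 16th/50th/84th percentile across branches).
order = np.argsort(branches, axis=0)
sorted_rates = np.take_along_axis(branches, order, axis=0)
cum_w = np.cumsum(weights[order], axis=0)

def fractile(p):
    idx = (cum_w >= p).argmax(axis=0)
    return np.take_along_axis(sorted_rates, idx[None, :], axis=0)[0]

for p in (0.16, 0.50, 0.84):
    print(f"{int(p*100)}th percentile rate at 5 m:",
          float(np.interp(5.0, heights, fractile(p))))
print("mean rate at 5 m:", float(np.interp(5.0, heights, mean_curve)))
```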
June 13, 2013 U.S. East Coast Meteotsunami: Comparing a Numerical Model With Observations
NASA Astrophysics Data System (ADS)
Wang, D.; Becker, N. C.; Weinstein, S.; Whitmore, P.; Knight, W.; Kim, Y.; Bouchard, R. H.; Grissom, K.
2013-12-01
On June 13, 2013, a tsunami struck the U.S. East Coast and caused several reported injuries. This tsunami occurred after a derecho moved offshore from North America into the Atlantic Ocean. The presence of this storm, the lack of a seismic source, and the fact that tsunami arrival times at tide stations and deep ocean-bottom pressure sensors cannot be attributed to a point source suggest this tsunami was caused by atmospheric forcing, i.e., it was a meteotsunami. In this study we attempt to reproduce the observed phenomenon using a numerical model with idealized atmospheric pressure forcing resembling the propagation of the observed barometric anomaly. The numerical model was able to capture some observed features of the tsunami at some tide stations, including the time lag between the pressure jump and the tsunami arrival. The model also captures the response at a deep ocean-bottom pressure gauge (DART 44402), including the primary wave and the reflected wave. There are two components of the oceanic response to the propagating pressure anomaly: the inverted barometer response and the dynamic response. We find that the dynamic response over the deep ocean is much smaller than the inverted barometer response. The time lag between the pressure jump and the tsunami arrival at tide stations is due to the dynamic response: waves generated and/or reflected at the shelf break propagate shoreward and amplify due to the shoaling effect. However, the evolution of the derecho over the deep ocean (propagation direction and intensity) is not well defined because of the lack of data, so the forcing used in this study is somewhat speculative. Better definition of the pressure anomaly through increased observation or high-resolution atmospheric models would improve meteotsunami forecast capabilities.
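For reference (a standard relation rather than a result of this study), the inverted-barometer component of the response is simply the hydrostatic adjustment of sea level to the surface pressure anomaly:

```latex
\eta_{\mathrm{IB}} = -\frac{\Delta p}{\rho g} \;\approx\; -1\ \mathrm{cm\ per\ hPa},
```

so a pressure jump of a few hPa accounts for only a few centimetres of static signal; the larger amplitudes recorded at the tide stations must come from the dynamic response, i.e., waves generated and amplified over the shelf.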
Tsunami.gov: NOAA's Tsunami Information Portal
NASA Astrophysics Data System (ADS)
Shiro, B.; Carrick, J.; Hellman, S. B.; Bernard, M.; Dildine, W. P.
2014-12-01
We present the new Tsunami.gov website, which delivers a single authoritative source of tsunami information for the public and emergency management communities. The site efficiently merges information from NOAA's Tsunami Warning Centers (TWCs) by way of a comprehensive XML feed called Tsunami Event XML (TEX). The resulting unified view allows users to quickly see the latest tsunami alert status in geographic context without having to understand complex TWC areas of responsibility. The new site provides for the creation of a wide range of products beyond the traditional ASCII-based tsunami messages. The publication of modern formats such as the Common Alerting Protocol (CAP) can drive geographically aware emergency alert systems like FEMA's Integrated Public Alert and Warning System (IPAWS). Other popular information delivery systems are supported, including email, text messaging, and social media updates. The Tsunami.gov portal allows NOAA staff to easily edit content and provides the facility for users to customize their viewing experience. In addition to access by the public, emergency managers and government officials may be offered the capability to log into the portal for special access rights to decision-making and administrative resources relevant to their respective tsunami warning systems. The site follows modern HTML5 responsive design practices for optimized use on mobile as well as non-mobile platforms. It meets all federal security and accessibility standards. Moving forward, we hope to expand Tsunami.gov to encompass tsunami-related content currently offered on separate websites, including the NOAA Tsunami Website, the National Tsunami Hazard Mitigation Program, the NOAA Center for Tsunami Research, the National Geophysical Data Center's Tsunami Database, and the National Data Buoy Center's DART Program. This project is part of the larger Tsunami Information Technology Modernization Project, which is consolidating the software architectures of NOAA's existing TWCs into a single system. We welcome your feedback to help Tsunami.gov become an effective public resource for tsunami information and a medium to enable better global tsunami warning coordination.
NASA Astrophysics Data System (ADS)
Gailler, A.; Loevenbruck, A.; Hebert, H.
2013-12-01
Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are mostly amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the numerical computation, which is exacerbated when detailed grids are required for the precise modeling of the coastline response of an individual harbor. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of the tsunami warning level at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these high-sea tsunami forecast simulations. The method involves an empirical correction based on theoretical amplification laws (either Green's or Synolakis' laws). The main limitation is that its application to a given coastal area would require a large database of previous observations, in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gage records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, we use a set of synthetic mareograms, calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids of increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). Nonlinear shallow-water tsunami modeling performed on a single 2' coarse bathymetric grid is compared to the values given by the time-consuming nested-grid simulations (and observations when available), in order to check to what extent the simple approach based on the amplification laws can explain the data. The idea is to fit tsunami data with numerical modeling carried out without any refined coastal bathymetry/topography. To this end several parameters are discussed, namely the bathymetric depth to which model results must be extrapolated (using Green's law), or the mean bathymetric slope to consider near the studied coast (when using Synolakis' law).
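A minimal sketch of the amplification-law correction (illustrative values only; in the approach described above the empirical parameters would be fitted to the synthetic mareogram database): apply Green's law to extrapolate the coarse-grid offshore amplitude to a shallow reference depth near the harbor.

```python
# Green's law for shoaling long waves: A2 = A1 * (h1 / h2) ** 0.25.
# All numbers are illustrative placeholders.
def greens_law(amplitude_offshore, depth_offshore, depth_coastal):
    """Extrapolate an offshore tsunami amplitude to a shallower depth."""
    return amplitude_offshore * (depth_offshore / depth_coastal) ** 0.25

a_offshore = 0.12          # m, amplitude from the 2' coarse-grid simulation
h_offshore = 2500.0        # m, depth at the coarse-grid output point
h_coastal = 5.0            # m, assumed extrapolation depth near the harbor

print("coastal estimate:",
      round(greens_law(a_offshore, h_offshore, h_coastal), 2), "m")
```

The choice of the extrapolation depth (here 5 m) is exactly one of the parameters the abstract says must be tuned against the nested-grid simulations.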
Evolution of tsunami warning systems and products.
Bernard, Eddie; Titov, Vasily
2015-10-28
Each year, about 60 000 people and $4 billion (US$) in assets are exposed to the global tsunami hazard. Accurate and reliable tsunami warning systems have been shown to provide a significant defence for this flooding hazard. However, the evolution of warning systems has been influenced by two processes: deadly tsunamis and available technology. In this paper, we explore the evolution of science and technology used in tsunami warning systems, the evolution of their products using warning technologies, and offer suggestions for a new generation of warning products, aimed at the flooding nature of the hazard, to reduce future tsunami impacts on society. We conclude that coastal communities would be well served by receiving three standardized, accurate, real-time tsunami warning products, namely (i) tsunami energy estimate, (ii) flooding maps and (iii) tsunami-induced harbour current maps to minimize the impact of tsunamis. Such information would arm communities with vital flooding guidance for evacuations and port operations. The advantage of global standardized flooding products delivered in a common format is efficiency and accuracy, which leads to effectiveness in promoting tsunami resilience at the community level. © 2015 The Authors.
The Development of Storm Surge Ensemble Prediction System and Case Study of Typhoon Meranti in 2016
NASA Astrophysics Data System (ADS)
Tsai, Y. L.; Wu, T. R.; Terng, C. T.; Chu, C. H.
2017-12-01
Taiwan, located in a zone where severe storms are generated, is under the threat of storm surge and associated inundation. The use of ensemble prediction can help forecasters understand the characteristics of storm surge under uncertainty in track and intensity, and it can also complement deterministic forecasting. In this study, the kernel of the ensemble prediction system is based on COMCOT-SURGE (COrnell Multi-grid COupled Tsunami Model - Storm Surge). COMCOT-SURGE solves the nonlinear shallow-water equations in open-ocean and coastal regions with a nested-grid scheme and adopts a wet-dry-cell treatment to calculate the potential inundation area. In order to account for tide-surge interaction, the global TPXO 7.1 tide model provides the tidal boundary conditions. After a series of validations and case studies, COMCOT-SURGE has become an official operational system of the Central Weather Bureau (CWB) in Taiwan. In this study, the strongest typhoon of 2016, Typhoon Meranti, is chosen as a case study. We adopt twenty ensemble members from the CWB WRF Ensemble Prediction System (CWB WEPS), which differ in their microphysics, boundary-layer, cumulus, and surface parameterizations. From the box-and-whisker results, the maximum observed storm surges were located between the first and third quartiles at more than 70% of gauge locations, e.g. Toucheng, Chengkung, and Jiangjyun. In conclusion, ensemble prediction can effectively help forecasters to predict storm surge, especially under uncertainty in storm track and intensity.
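A minimal sketch of the box-and-whisker check described above (the surge values are random placeholders, not CWB WEPS output): for each gauge, compare the observed maximum surge with the interquartile range of the maxima across the ensemble members.

```python
import numpy as np

# Placeholder ensemble of maximum surge heights at three gauges (20 members),
# and placeholder observed maxima; real values would come from CWB WEPS runs
# and tide-gauge records.
rng = np.random.default_rng(1)
gauges = ["Toucheng", "Chengkung", "Jiangjyun"]
ensemble_max = rng.normal(loc=[1.2, 0.9, 1.5], scale=0.25, size=(20, 3))
observed_max = np.array([1.30, 0.85, 1.45])

q1, q3 = np.percentile(ensemble_max, [25, 75], axis=0)
for g, obs, lo, hi in zip(gauges, observed_max, q1, q3):
    inside = lo <= obs <= hi
    print(f"{g}: obs={obs:.2f} m, IQR=[{lo:.2f}, {hi:.2f}] m, within IQR: {inside}")
```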
Rapid inundation estimates using coastal amplification laws in the western Mediterranean basin
NASA Astrophysics Data System (ADS)
Gailler, Audrey; Loevenbruck, Anne; Hébert, Hélène
2014-05-01
Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are mostly amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the numerical computation, which is exacerbated when detailed grids are required for the precise modeling of the coastline response of an individual harbor. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of the tsunami warning level at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these high-sea tsunami forecast simulations. The method involves an empirical correction based on theoretical amplification laws (either Green's or Synolakis' laws). The main limitation is that its application to a given coastal area would require a large database of previous observations, in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gage records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, we use a set of synthetic mareograms, calculated for both hypothetical events and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids of increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). Nonlinear shallow-water tsunami modeling performed on a single 2' coarse bathymetric grid is compared to the values given by the time-consuming nested-grid simulations (and observations when available), in order to check to what extent the simple approach based on the amplification laws can explain the data. The idea is to fit tsunami data with numerical modeling carried out without any refined coastal bathymetry/topography. To this end several parameters are discussed, namely the bathymetric depth to which model results must be extrapolated (using Green's law), or the mean bathymetric slope to consider near the studied coast (when using Synolakis' law).
Lessons Learned and Unlearned from the 2004 Great Sumatran Tsunami.
NASA Astrophysics Data System (ADS)
Synolakis, C.; Kanoglu, U.
2014-12-01
Huppert & Sparks (2006, Phil Trans Math Phys Eng Sci) wrote that "it is likely that in the future, we will experience several disasters per year that kill more than 10,000 people." The 2011 Great East Japan Earthquake Disaster alone resulted in more than 20,000 casualties. Synolakis & Bernard (2006, Phil Trans Math Phys Eng Sci) concluded that "before the next Sumatra-type tsunami strikes, we must resolve to create a world that can coexist with the tsunami hazard." The 2011 Japan tsunami dramatically showed that we are not there yet. Despite substantial advances after the 2004 Boxing Day tsunami, substantial challenges remain for improving tsunami hazard mitigation. If the tsunami community appeared at first perplexed in the aftermath of the 2004 tsunami, it was not due to the failure of recognized hydrodynamic paradigms, much as certain geophysical paradigms and scaling laws failed, but to the worst surprise: the lack of preparedness and education. Synolakis et al. (2008, Pure Appl Geophys) presented standards for tsunami modeling, for both warnings and inundation maps (IMs). Although at least one forecasting methodology has gone through extensive testing, and is now officially in use by the warning centers (WCs), standards urgently need to be formalized for warnings. In Europe, several WCs have been established, but none has yet issued an operational warning for a hazardous event. If one does, there might be confusion from possibly contradictory or competing warnings. Never again should there be a repeat of the TEPCO analysis for the safety of the Fukushima NPP. That analysis failed primarily because of a lack of familiarity with the context of numerical predictions and a lack of experience with real tsunamis. The accident was the result of a cascade of stupid errors, almost impossible to ignore by anyone in the field (Synolakis, 26.03.2011, The New York Times). Current practices in tsunami studies for US NPPs and for IMs do not provide us with optimism that the Fukushima lessons have been absorbed, and bagatellomania is still rabid. What saves human lives is ancestral knowledge and community preparedness, as demonstrated repeatedly. Efforts need to be focused on improving education worldwide in the simple steps people can take. We acknowledge the partial support of the 7th FP (ASTARTE, Grant 603839), TUBITAK, TR (109Y387) and GSRT, GR (10TUR/1-50-1) projects.
How to learn and develop from both good and bad lessons - the 2011 Tohoku tsunami case
NASA Astrophysics Data System (ADS)
Sugimoto, Megumi; Okazumi, Toshio
2013-04-01
The 2011 Tohoku tsunami revealed that Japan has repeated the same mistakes throughout its long history of tsunami disasters. After the disaster, Japanese people remembered many old lessons and materials: the oral evacuation tradition 'Tsunami TENDENKO' (immediate, independent, individual evacuation); a tsunami memorial stone in Aneyoshi, Iwate prefecture, reading "Do not build houses below this stone toward the sea"; the Namiwake shrine in Sendai city, named after the story of people being protected from a tsunami; and so on. The Tohoku area has created a rich tsunami-related historical culture to pass on to its descendants. However, Tohoku had not experienced a tsunami disaster for 50 years after the 1960 Chilean tsunami, and the 2010 Chilean tsunami did little damage to the fishing industry, so people gradually lost their tsunami awareness. At just this bad moment, the magnitude (M) 9 earthquake struck Tohoku. It was a disaster on a scale our generation had never experienced, and people did not make use of their ancestors' lessons to survive. The 2004 Sumatra tsunami had struck only seven years earlier, with almost the same magnitude (M 9). Why did Tohoku residents and Japanese tsunami experts not make use of those lessons? For many Japanese, events outside Japan feel remote; this shows how difficult it is for people to learn from other countries. The Three Mile Island accident in the US was a similar case for Japan. In addition, similar living lessons exist across different hazards. For example, nuclear power plant problems occurred both during the 2012 Hurricane Sandy in the US and during the 2011 Tohoku tsunami; in both cases local people were not informed about the troubles, although the Oyster Creek nuclear power station case in the US did not become serious. Tsunamis and hurricanes are different hazards, and the experts of each tend to stay within their own field. (1) It is difficult for human beings to transfer living lessons to the next generation over decades. (2) It is difficult for human beings to foresee events they have not experienced. (3) The danger is usually underestimated, because human beings tend to judge based on their own experience. (4) It is difficult for human beings to make use of lessons from other countries, because a self-preservation instinct keeps them from imagining themselves as victims. (5) Experts usually do not pay attention to other fields, even when similar cases occur there. Before the 2011 Tohoku tsunami, we started collecting such historical living lessons for 18 hazards from all over the world. We have adapted this project to collect lessons from the Tohoku tsunami and will publish them for small children in developing countries in March 2013, translated into at least 10 languages. These disaster-lesson guide books are free. We will introduce some of the lessons in the presentation. We believe education is one of the most useful countermeasures to avoid repeating the same mistakes and to transfer living lessons directly to new generations.
Tsunami response system for ports in Korea
NASA Astrophysics Data System (ADS)
Cho, H.-R.; Cho, J.-S.; Cho, Y.-S.
2015-09-01
The tsunamis that have occurred in many places around the world over the past decade have taken a heavy toll on human lives and property. The eastern coast of the Korean Peninsula is not safe from tsunamis: its coastal areas have long sustained tsunami damage and were struck by tsunami events in 1983 and 1993. The aim of this study was to mitigate casualties and property damage from unexpected tsunami attacks along the eastern coast of the Korean Peninsula by developing a proper tsunami response system for important ports and harbors with high population densities and high concentrations of key national industries. The system is based on numerical and physical modeling of 3 historical and 11 virtual tsunami events, field surveys, and extensive interviews with related personnel.
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.; Crespo Jones, H.
2012-12-01
Over the past 500 years almost 100 tsunamis have been observed in the Caribbean and Western Atlantic, and at least 3510 people have lost their lives to this hazard since 1842. Furthermore, with the dramatic increase in population and infrastructure along the Caribbean coasts, millions of coastal residents, workers and visitors are today vulnerable to tsunamis. The UNESCO IOC Intergovernmental Coordination Group for Tsunamis and other Coastal Hazards for the Caribbean and Adjacent Regions (CARIBE EWS) was established in 2005 to coordinate and advance the regional tsunami warning system. The CARIBE EWS focuses on four areas/working groups: (1) Monitoring and Warning, (2) Hazard and Risk Assessment, (3) Communication and (4) Education, Preparedness and Readiness. The sea level monitoring component falls under Working Group 1. Although in the current system it is the seismic data and information that generate the initial tsunami bulletins, it is the data from deep-ocean buoys (DARTs) and coastal sea level gauges that are critical for the actual detection and forecasting of tsunami impact. Despite multiple efforts and investments in the installation of sea level stations, in 2004 there were only a handful operational in the region (Puerto Rico, US Virgin Islands, Bermuda, Bahamas). Over the past 5 years there has been a steady increase in the number of stations operating in the Caribbean region. As of mid-2012 there were 7 DARTs and 37 coastal gauges, with additional ones being installed or funded. In order to reach the goal of 100 operational coastal sea level stations in the Caribbean, the CARIBE EWS also recognizes the importance of maintaining the current stations. For this, a trained regional workforce for installation, operation, data analysis and quality control is considered critical. Since 2008, three training courses have been offered to sea level station operators and data analysts. Other requirements and factors have been considered for the sustainability of the stations. The sea level stations must withstand the very aggressive conditions not only of tsunamis but, on a more regular basis, of hurricanes. Given the requirement that the data be available in near real time for tsunami and other coastal hazard applications, robust communication systems are also essential. For the local operator, the ability to visualize the data is critical, and tools like the IOC Sea Level Monitoring Facility and the Tide Tool program are very useful. The need for these stations to serve multiple purposes has also been emphasized. For climate and other research applications the data need to be archived, quality-controlled and analyzed. Increasing the user base for the sea level data is also seen as an important goal to gain local buy-in; local weather and meteorological offices are considered key stakeholders, but applications for them still need to be developed. The CARIBE EWS continues to look forward to working with other IOC partners, including the Global Sea Level Observing System (GLOSS) and the Sub-Commission for the Caribbean and Adjacent Regions (IOCARIBE)/GOOS, as well as with local, national and global sea level station operators and agencies, on the development of a sustainable sea level network.
NASA Astrophysics Data System (ADS)
Shannon, R.; McCloskey, J.; McDowell, S.
2009-12-01
Forecasts of the next likely megathrust earthquake off the western coast of Sumatra, which may occur in the near future, indicate that it will likely be tsunamigenic and could be more devastating than the 2004 event. Hundreds of simulations of potential earthquakes and their tsunamis show that, while the earthquake is fundamentally unpredictable, many scenarios would see dangerous inundation of low-lying areas along the west coast of Sumatra; the cities of Padang and Bengkulu, broadside-on to the areas of highest seismic potential, have a combined population of over one million. Understanding how the science of unpredictable, high-probability events is absorbed by society is essential for the development of effective mitigation and preparedness campaigns. A five-month field investigation conducted in Padang and Bengkulu aimed to conceptualise the main issues driving risk perception of the tsunami hazard, and to explore its influence upon preparedness. Of specific interest was the role of scientifically quantified hazard information in risk perception and hazard preparedness. Target populations were adult community members (n=270) and senior high school students (n=90). Preliminary findings indicate that scientific knowledge of the earthquake and tsunami threat amongst respondents in both cities is good. However, the relationship between respondents' hazard knowledge, desired risk perception, and the adoption of preparedness measures was often non-linear and is susceptible to the negative effects of unscientific forecasts disseminated by government and mass media. Evidence suggests that 'mystic' predictions, often portrayed in the media as being scientific, have been readily absorbed by the public; when these fail to materialise, the credibility of authentic science and scientists plummets. As a result, the levels of sustainable earthquake and tsunami preparedness measures adopted by those living in tsunami-threatened areas can be detrimentally impacted. It is imperative that the internationally accredited science of high-probability, unpredictable natural hazards prevails within public consciousness in western Sumatra, despite the frequent circulation of unsubstantiated predictions and claims relating to these events. While the management of this information ultimately lies with government, the recent past has dictated a need for scientists to become more proactive in ensuring their work is accepted as a foremost source of knowledge used to guide accurate risk perceptions and stimulate the adoption of appropriate preparedness measures.
GPS water level measurements for Indonesia's Tsunami Early Warning System
NASA Astrophysics Data System (ADS)
Schöne, T.; Pandoe, W.; Mudita, I.; Roemer, S.; Illigner, J.; Zech, C.; Galas, R.
2011-03-01
On Boxing Day 2004, a severe tsunami was generated by a strong earthquake off Northern Sumatra, causing a large number of casualties. At that time, neither an offshore buoy network to measure tsunami waves nor a system to disseminate tsunami warnings to local governmental entities was in place. Since then, buoys have been developed by Indonesia and Germany, complemented by NOAA's Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys, and have been moored offshore of Sumatra and Java. The suite of sensors for offshore tsunami detection in Indonesia has been advanced by adding GPS technology for water level measurements. The use of GPS buoys in tsunami warning systems is a relatively new approach. The concept of the German-Indonesian Tsunami Early Warning System (GITEWS) (Rudloff et al., 2009) combines GPS technology and ocean bottom pressure (OBP) measurements. Especially for near-field installations, where seismic noise may deteriorate the OBP data, GPS-derived sea level heights provide additional information. The GPS buoy technology is precise enough to detect medium to large tsunamis with amplitudes larger than 10 cm. The analysis presented here suggests that for about 68% of the time, tsunamis larger than 5 cm may be detectable.
Integration of WERA Ocean Radar into Tsunami Early Warning Systems
NASA Astrophysics Data System (ADS)
Dzvonkovskaya, Anna; Helzel, Thomas; Kniephoff, Matthias; Petersen, Leif; Weber, Bernd
2016-04-01
High-frequency (HF) ocean radars provide a unique capability to deliver simultaneous wide-area measurements of ocean surface current fields and sea state parameters far beyond the horizon. The WERA® ocean radar system is a shore-based remote sensing system for monitoring the ocean surface in near real time and in all weather conditions up to 300 km offshore. Tsunami-induced surface currents increase orbital velocities compared with the normal oceanographic situation and affect the measured radar spectra. Theoretical work on the tsunami influence on radar spectra showed that a tsunami wave train generates a specific, unusual pattern in the HF radar spectra. While the tsunami wave is approaching the beach, the surface current pattern changes slightly in deep water and significantly in the shelf area, as shown in theoretical considerations and later confirmed during the 2011 Japan tsunami. The observed tsunami signatures showed that the velocity of the tsunami currents depends on the tsunami wave height and the bathymetry. The HF ocean radar does not measure the approaching wave height of a tsunami; however, it can resolve the surface current velocity signature that is generated when the tsunami reaches the shelf edge. This strong change in the surface current can be detected by a phased-array WERA system in real time; thus the WERA ocean radar is a valuable tool to support Tsunami Early Warning Systems (TEWS). Based on real tsunami measurements, requirements for the integration of ocean radar systems into TEWS have already been defined. The requirements include a high range resolution, narrow beam directivity of the phased-array antennas, and an accelerated data update mode to allow offshore tsunami detection in real time. The developed software package allows an ocean surface current map of the area observed by the HF radar to be reconstructed from the radar power spectra, which makes it possible for the WERA radars to issue an automated tsunami identification message to the TEWS. The radar measurements can be used to confirm a pre-warning and raise a tsunami alert. The output data of the WERA processing software can be easily integrated into existing TEWS thanks to a flexible data format, fast update rate and quality control of measurements. The archived radar data can be used for further hazard analysis and research purposes. The newly launched Tsunami Warning Center in Oman is one of the most sophisticated tsunami warning systems worldwide, applying a mix of well-proven, state-of-the-art subsystems. It allows the acquisition of data from many different sensor systems, including seismic stations, GNSS, tide gauges, and WERA ocean radars, in one acquisition system providing access to all sensor data via a common interface. The TEWS in Oman also integrates measurements from a modern network of HF ocean radars to verify tsunami simulations, which provides additional scenario quality information and confirmation to the decision support.
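As a rough guide to why the current signature becomes detectable on the shelf (a standard linear long-wave relation, not a WERA-specific result), the depth-averaged tsunami current scales with the surface elevation as

```latex
u \;\approx\; \eta \sqrt{\frac{g}{h}},
```

so a 0.5 m tsunami produces only a few centimetres per second of current in 4000 m of water but roughly 0.2 m/s over a 50 m deep shelf, which is why the signature appears once the wave reaches the shelf edge and why radar range and update rate requirements are framed around the shelf region.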
Development of a GNSS-Enhanced Tsunami Early Warning System
NASA Astrophysics Data System (ADS)
Bawden, G. W.; Melbourne, T. I.; Bock, Y.; Song, Y. T.; Komjathy, A.
2015-12-01
The past decade has witnessed a terrible loss of life and economic disruption caused by large earthquakes and the resulting tsunamis impacting coastal communities and infrastructure across the Indo-Pacific region. NASA has funded the early development of a prototype real-time Global Navigation Satellite System (RT-GNSS) based rapid earthquake and tsunami early warning (GNSS-TEW) system that may be used to enhance seismic tsunami early warning systems for large earthquakes. This prototype GNSS-TEW system geodetically estimates fault parameters (earthquake magnitude, location, strike, dip, and slip magnitude/direction on a gridded fault plane, both along strike and at depth) and tsunami source parameters (seafloor displacement, tsunami energy scale, and 3-D tsunami initial conditions) within minutes after the mainshock, based on dynamic numerical inversions/regressions of the displacements measured in real time by spatially distributed real-time GNSS networks spanning the epicentral region. It is also possible to measure fluctuations in the ionosphere's total electron content (TEC) in the RT-GNSS data caused by the pressure wave from the tsunami. This TEC approach can detect whether a tsunami has been triggered by an earthquake, track its waves as they propagate through the ocean basins, and provide upwards of 45 minutes of early warning. These combined real-time geodetic approaches can very quickly address a number of important questions in the minutes immediately following a major earthquake: How big was the earthquake and what are its fault parameters? Could the earthquake have produced a tsunami, and was a tsunami generated?
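To make the inversion step concrete, the following is a minimal, hedged sketch of the kind of regularized least-squares slip inversion such a system could run on real-time GNSS offsets. It assumes a precomputed Green's function matrix (e.g., from an Okada dislocation model) relating unit slip on each fault patch to displacement at each GNSS component; the function names, the damping scheme, and the file names in the usage comment are illustrative, not part of the GNSS-TEW prototype.

```python
import numpy as np

def invert_slip(G, d, damping=0.1):
    """Least-squares slip inversion with simple (Tikhonov) damping.

    G : (n_obs, n_patches) Green's function matrix -- displacement at each
        GNSS component per unit slip on each fault patch (assumed precomputed,
        e.g., from an Okada dislocation model).
    d : (n_obs,) observed coseismic displacements from real-time GNSS.
    Returns the estimated slip on each patch.
    """
    n = G.shape[1]
    # Augment the system with a damping term to stabilize the inversion.
    A = np.vstack([G, damping * np.eye(n)])
    b = np.concatenate([d, np.zeros(n)])
    slip, *_ = np.linalg.lstsq(A, b, rcond=None)
    return slip

def moment_magnitude(slip, patch_area, rigidity=3.0e10):
    """Moment magnitude from the slip distribution (M0 in N*m)."""
    m0 = rigidity * patch_area * np.sum(np.abs(slip))
    return (2.0 / 3.0) * (np.log10(m0) - 9.1)

# Illustrative usage with hypothetical file names (not real data):
# G = np.load("greens_functions.npy"); d = np.load("gnss_offsets.npy")
# slip = invert_slip(G, d)
# print("Mw estimate:", moment_magnitude(slip, patch_area=20e3 * 20e3))
```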
Kamogawa, Masashi; Orihara, Yoshiaki; Tsurudome, Chiaki; Tomida, Yuto; Kanaya, Tatsuya; Ikeda, Daiki; Gusman, Aditya Riadi; Kakinami, Yoshihiro; Liu, Jann-Yenq; Toyoda, Atsushi
2016-12-01
Ionospheric plasma disturbances after a large tsunami can be detected by measuring the total electron content (TEC) between a Global Positioning System (GPS) satellite and its ground-based receivers. A TEC depression lasting from a few minutes to tens of minutes, termed a tsunami ionospheric hole (TIH), forms above the tsunami source area. Here we describe the quantitative relationship between initial tsunami height and the TEC depression rate caused by a TIH for seven tsunamigenic earthquakes in Japan and Chile. We found that the percentage of TEC depression and the initial tsunami height are correlated, and that the largest TEC depressions appear 10 to 20 minutes after the main shocks. Our findings imply that ionospheric TEC measurements using the existing ground receiver networks could be used in an early warning system for near-field tsunamis that take more than 20 minutes to reach coastal areas.
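As a rough illustration of how such a correlation could be turned into an early estimate of initial tsunami height, the sketch below fits a simple linear relation between TIH depression percentage and initial height. The paired values are placeholders, not the data of this study, and a real system would also propagate the regression uncertainty.

```python
import numpy as np

# Hypothetical paired observations (placeholders, not the paper's data):
# percentage TEC depression of the TIH and initial tsunami height (m).
tec_depression_pct = np.array([2.0, 3.5, 5.0, 8.0, 12.0, 15.0, 20.0])
initial_height_m   = np.array([0.4, 0.7, 1.0, 1.8, 2.9, 3.6, 5.1])

# Fit a simple linear relation: height ~ a * depression + b.
a, b = np.polyfit(tec_depression_pct, initial_height_m, 1)

def height_from_tec(depression_pct):
    """Rough early estimate of initial tsunami height from a TIH observation."""
    return a * depression_pct + b

print(f"Estimated initial height for a 10% TEC depression: {height_from_tec(10.0):.2f} m")
```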
NASA Astrophysics Data System (ADS)
Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.
2013-12-01
Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to tsunami warning operations, such as those about earthquake magnitudes, how earthquakes are located, where and how often earthquakes occur, and fault rupture length. The second group uses the PTWC-developed tsunami forecast model, RIFT (Wang et al., 2012), to show how various historic tsunamis propagated through the world's oceans. These animations illustrate important concepts about tsunami behavior such as their speed, how they bend around and bounce off of seafloor features, how their wave heights vary from place to place and in time, and how their behavior is strongly influenced by the type of earthquake that generated them. PTWC's YouTube channel also includes an animation that simulates both seismic and tsunami phenomena together as they occurred for the 2011 Japan tsunami including actual sea-level measurements and proper timing for tsunami alert status, thus serving as a video 'time line' for that event and showing the time scales involved in tsunami warning operations. Finally, PTWC's scientists can use their YouTube channel to communicate with their colleagues in the research community by supplementing their peer-reviewed papers with video 'figures' (e.g., Wang et al., 2012).
Tsunami Detection Systems for International Requirements
NASA Astrophysics Data System (ADS)
Lawson, R. A.
2007-12-01
Results are presented regarding the first commercially available, fully operational tsunami detection system to have passed stringent U.S. government testing requirements and to have successfully demonstrated its ability to detect an actual tsunami at sea. Spurred by the devastation of the December 26, 2004, Indian Ocean tsunami that killed more than 230,000 people, the private sector actively supported the Intergovernmental Oceanographic Commission's (IOC's) efforts to develop a tsunami warning system and mitigation plan for the Indian Ocean region. As each country in the region developed its requirements, SAIC recognized that many of these underdeveloped countries would need significant technical assistance to fully execute their plans. Although the original focus was on data fusion, consequence assessment tools, and warning center architecture, it was quickly realized that the cornerstone of any tsunami warning system would be reliable tsunami detection buoys that could meet very stringent operational standards. Our goal was to leverage extensive experience in underwater surveillance and oceanographic sensing to produce an enhanced and reliable deep-water sensor that could meet emerging international requirements. Like the NOAA Deep-ocean Assessment and Reporting of Tsunamis (DART™) buoy, the SAIC Tsunami Buoy (STB) system consists of three subsystems: a surface communications buoy subsystem, a bottom pressure recorder subsystem, and a buoy mooring subsystem. With the operational success that DART has demonstrated, SAIC decided to build and test to the same high standards. The tsunami detection buoy system measures small changes in the depth of the deep ocean caused by tsunami waves as they propagate past the sensor. This is accomplished by using an extremely sensitive bottom pressure sensor/recorder to measure very small changes in pressure as the waves move past the buoy system. The bottom pressure recorder includes a processor with algorithms that recognize these characteristics and immediately alerts a tsunami warning center through the communications buoy when the processor senses one of these waves. In addition to the tsunami detection buoy system, an end-to-end tsunami warning system was developed that builds upon the country's existing disaster warning infrastructure. This warning system includes 1) components that receive, process, and analyze buoy, seismic, and tide gauge data; 2) predictive tools and a consequence assessment tool set to provide decision support; 3) operation center design and implementation; and 4) tsunami buoy operations and maintenance support. The first buoy was deployed Oct. 25, 2006, approximately 200 nautical miles west of San Diego in 3,800 meters of water. Just three weeks later, it was put to the test during an actual tsunami event. On Nov. 15, 2006, an 8.3 magnitude earthquake rocked the Kuril Islands, located between Japan and the Kamchatka Peninsula of Russia. That quake generated a small tsunami. Waves from the tsunami propagated approximately 4,000 nautical miles across the Pacific Ocean in about nine hours, a speed of about 445 nautical miles per hour, by the time this commercial buoy first detected them. Throughout that event, the tsunami buoy system showed excellent correlation with data collected by a NOAA DART buoy located 28 nautical miles north of it. Subsequent analysis revealed that the STB matched DART operational capabilities and performed flawlessly. The buoy proved its capabilities again on Jan. 13, 2007, when an 8.1 magnitude earthquake occurred in the same region and the STB detected the seismic event. As a result of the successes of this project, SAIC recently applied for and received a license from NOAA to build DART systems.
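The detection logic described above can be illustrated with a minimal sketch: remove a predicted tide/background signal from the bottom-pressure series and trip an alert when the residual exceeds a threshold. This is an assumption-laden stand-in (a trailing polynomial fit and a fixed threshold), not the actual DART or STB algorithm.

```python
import numpy as np

def detect_tsunami(pressure_mm, window=240, threshold_mm=30.0):
    """Flag a possible tsunami in a bottom-pressure series (mm of water).

    A cubic fit over the trailing `window` samples serves as a stand-in for
    the tide/background prediction; if the newest sample departs from that
    prediction by more than `threshold_mm`, the detector trips.
    (Illustrative only -- not the operational DART/STB algorithm.)
    """
    if len(pressure_mm) <= window:
        return False
    t = np.arange(window)
    recent = np.asarray(pressure_mm[-(window + 1):-1])
    coeffs = np.polyfit(t, recent, 3)         # background/tide model
    predicted = np.polyval(coeffs, window)    # extrapolate one step ahead
    return abs(pressure_mm[-1] - predicted) > threshold_mm
```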
NASA Astrophysics Data System (ADS)
Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert
2017-04-01
Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include information on return period. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design, and insurance. In this study, we present a method to develop probabilistic tsunami inundation maps using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city on the South China Sea, as an example. Two major advances of this method are that it incorporates the most up-to-date information on seismic tsunamigenic sources along the Manila megathrust, and that it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns is generated for large numbers of synthetic earthquake events. Aggregating the large number of inundation simulation results, we analyze the uncertainties associated with the variability of earthquake rupture location and slip distribution. We also explore how the tsunami hazard in Macau evolves in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population, and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island, and the Cotai Strip. Of these, Macau Peninsula is the most vulnerable to tsunami because of its low elevation and its exposure to direct and refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw 8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes of the same magnitude; this diversity is caused by both random rupture locations and heterogeneous slip distributions. Adding the sea level rise component, the inundation depth caused by 1 m of sea level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.
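A minimal sketch of the Monte Carlo step is given below, assuming a fixed set of fault patches and a simple lognormal random slip pattern rescaled to a target seismic moment; the real stochastic source generator constrains slip spatially (e.g., with a wavenumber spectrum), and each ensemble member would drive a tsunami/inundation run whose results are aggregated into exceedance probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

def mw_to_m0(mw):
    """Seismic moment (N*m) from moment magnitude."""
    return 10 ** (1.5 * mw + 9.1)

def random_slip(n_patches, target_m0, patch_area, rigidity=3.0e10):
    """One synthetic heterogeneous slip pattern rescaled to a target moment."""
    pattern = rng.lognormal(mean=0.0, sigma=1.0, size=n_patches)
    slip = pattern / pattern.sum() * target_m0 / (rigidity * patch_area)
    return slip  # meters of slip per patch

# Build an ensemble of synthetic Mw 8.4 events with different slip patterns;
# each member would then feed a tsunami/inundation simulation.
ensemble = [random_slip(n_patches=200, target_m0=mw_to_m0(8.4),
                        patch_area=25e3 * 25e3) for _ in range(1000)]
```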
Modeling of influence from remote tsunami at the coast of Sakhalin and Kuriles islands.
NASA Astrophysics Data System (ADS)
Zaytsev, Andrey; Pelinovsky, Efim; Yalciner, Ahmet; Chernov, Anton; Kostenko, Irina
2010-05-01
The Russian Far East coast (the Kuril Islands, Sakhalin, Kamchatka) is an area exposed to dangerous natural phenomena such as tsunamis. Much work has been done to reduce tsunami impacts: tsunami mapping and mitigation strategies have been developed for some regions, Tsunami Warning System centers have been opened, and a substantial number of tsunami records have been collected. The properties of local tsunamis are well studied. At the same time, the catastrophic Indonesian tsunami of December 2004, whose waves reached the coasts of Africa and South America, showed that coasts far from an earthquake epicenter can also suffer catastrophic impacts. Moreover, it is practically a unique type of case in which the use of a Tsunami Warning System can reduce the number of human victims to zero. Advances in computer technology and in numerical methods for solving systems of nonlinear differential equations have made computer modeling of real and hypothetical tsunamis the basic method for studying how waves propagate across ocean basins and affect the coast. Numerical modeling of the propagation of historical tsunamis from seismic sources in the Pacific Ocean was carried out, considering events with epicenters remote from the Russian Far East coast. The propagation of remote tsunami waves was estimated, the impact force of the tsunamis was assessed, and the passage of tsunamis through the Kuril straits was examined. A spectral analysis of records at settlements on Sakhalin and the Kuril Islands was performed. The NAMI-DANCE program was used for the numerical modeling of tsunami propagation; it employs finite element numerical schemes for the shallow water equations and the nonlinear-dispersive equations, with nested grids.
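For reference, the one-dimensional, non-dispersive, frictionless form of the nonlinear shallow water equations that such propagation codes solve can be written as

```latex
\frac{\partial \eta}{\partial t} + \frac{\partial}{\partial x}\left[(h+\eta)\,u\right] = 0,
\qquad
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + g\,\frac{\partial \eta}{\partial x} = 0,
```

where η is the free-surface elevation, h the still-water depth, u the depth-averaged velocity, and g the gravitational acceleration; nonlinear-dispersive (Boussinesq-type) formulations add higher-order dispersive terms to the momentum equation.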
Observations and modelling of a meteotsunami across the English Channel on 23rd June 2016
NASA Astrophysics Data System (ADS)
Williams, David; Horsburgh, Kevin; Schultz, David; Hughes, Chris
2017-04-01
Meteotsunamis are shallow-water waves in the tsunami frequency band that are generated by sub-mesoscale pressure and wind velocity fluctuations. While documented meteotsunamis on the north-western European shelf have not been hazardous, around the world they have caused fatalities and significant economic losses. Previous observational studies suggest that, across Western Europe, strongly convective storms are meteotsunami-generating. We give evidence for a meteotsunami on 23rd June 2016 along the northern coastline of France, following strongly convective storms. The evidence includes 1-minute temporal resolution tide gauge data, in situ pressure and wind velocities, and infrared satellite images. With an estimated wave height of 0.8 m at Boulogne, this meteotsunami is particularly large compared with previous observations in Western Europe. Tsunami travel times have been estimated using the wavefront method, showing that a single, instantaneous source for the waves is highly unlikely. Using the ocean model Telemac2D, idealised models of pressure and wind have been used to simulate the meteotsunami. The model supports the conclusion that, across the English Channel, thunderstorms with north-easterly tracks moving at the shallow-water wave speed can generate wave amplification through Proudman resonance. The Weather Research and Forecasting (WRF) model has been used to produce numerically simulated thunderstorms, which have been used to force the Telemac2D ocean model with idealised bathymetries. The WRF-Telemac2D model results also support meteotsunami generation by thunderstorms. To the authors' knowledge this is the first time a thunderstorm simulation has been used to produce a meteotsunami-like wave, and it indicates that non-hydrostatic, convective atmospheric processes are important for meteotsunami generation. This suggests that, with combined high-resolution observations and modelling, a meteotsunami forecasting system may become possible in Western Europe.
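The Proudman resonance condition mentioned above can be made concrete with a short sketch: amplification is strongest where the storm's translation speed matches the local shallow-water wave speed, i.e., where the water depth is roughly U²/g. The storm speed used below is an illustrative value, not one taken from this study.

```python
G = 9.81  # gravitational acceleration, m/s^2

def resonant_depth(storm_speed_ms):
    """Water depth (m) at which Proudman resonance occurs.

    Resonance requires the pressure disturbance to translate at the
    shallow-water wave speed c = sqrt(g*h), i.e., Froude number U/c ~ 1,
    which gives h = U^2 / g.
    """
    return storm_speed_ms ** 2 / G

# A convective storm crossing the Channel at ~22 m/s (illustrative value)
# resonates with long waves where the water is ~50 m deep -- typical of
# shelf-sea depths, which is why such storms can amplify meteotsunamis.
print(f"Resonant depth: {resonant_depth(22.0):.0f} m")
```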
A numerical study of tsunami wave impact and run-up on coastal cliffs using a CIP-based model
NASA Astrophysics Data System (ADS)
Zhao, Xizeng; Chen, Yong; Huang, Zhenhua; Hu, Zijun; Gao, Yangyang
2017-05-01
There is a general lack of understanding of tsunami wave interaction with complex geographies, especially the process of inundation. Numerical simulations are performed to understand the effects of several factors on tsunami wave impact and run-up in the presence of gentle submarine slopes and coastal cliffs, using an in-house code, a constrained interpolation profile (CIP)-based model. The model employs a high-order finite difference method, the CIP method, as the flow solver; utilizes a VOF-type method, the tangent of hyperbola for interface capturing/slope weighting (THINC/SW) scheme, to capture the free surface; and treats the solid boundary with an immersed boundary method. A series of incident waves are arranged to interact with varying coastal geographies. Numerical results are compared with experimental data and good agreement is obtained. The influences of the gentle submarine slope, the coastal cliff, and the incident wave height are discussed. It is found that the variation of the tsunami amplification factor with incident wave is affected by the gradient of the cliff slope, with a critical value of about 45°. The run-up on a toe-erosion cliff is smaller than that on a normal cliff. The run-up is also related to the length of the gentle submarine slope, with a critical value of about 2.292 m in the present model for most cases. The impact pressure on the cliff is extremely large and concentrated, and the backflow effect is non-negligible. The results are accurate and helpful for inverting tsunami sources and forecasting disasters.
NASA Astrophysics Data System (ADS)
Gailler, Audrey; Hébert, Hélène; Loevenbruck, Anne
2013-04-01
Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the numerical computation, which is compounded when detailed grids are required for precise modeling of the coastline response at the scale of an individual harbor. In fact, when facing the problem of the interaction of the tsunami wavefield with a shoreline, any numerical simulation must be performed over an increasingly fine grid, which in turn mandates a reduced time step and the use of a fully non-linear code. Such calculations then become prohibitively time-consuming, which is clearly unacceptable in the framework of real-time warning. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of tsunami wave heights on the high seas and tsunami warning maps at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these deep-water wave height simulations. The method involves an empirical correction relation derived from Green's law, expressing conservation of wave energy flux, to extend the gridded wave field into the harbor with respect to the nearby deep-water grid node. The main limitation of this method is that its application to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, a set of synthetic mareograms is calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids characterized by a coarse resolution over deep-water regions and an increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). The synthetic dataset is then used to approximate the empirical parameters of the correction equation. Results of inundation estimates in several French Mediterranean harbors obtained with the fast Green's-law-derived method are presented and compared with values given by time-consuming nested-grid simulations.
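A minimal sketch of a Green's-law-type correction of the kind described is shown below, assuming conservation of wave energy flux between the deep-water grid node and the coastal point; the factor `alpha` is a placeholder for the site-specific empirical parameters that the synthetic mareogram database is used to calibrate.

```python
def greens_law_height(h_deep_m, depth_deep_m, depth_coast_m, alpha=1.0):
    """Estimate the coastal tsunami height from an offshore model node.

    Green's law (conservation of wave energy flux for a shoaling long wave)
    gives H_coast = H_deep * (depth_deep / depth_coast) ** 0.25.
    `alpha` stands in for the empirically calibrated, site-specific
    correction discussed in the abstract.
    """
    return alpha * h_deep_m * (depth_deep_m / depth_coast_m) ** 0.25

# Example: a 0.2 m wave at a 2500 m deep node, extrapolated to a 10 m deep
# harbor entrance, shoals to roughly 0.8 m before any empirical correction.
print(f"{greens_law_height(0.2, 2500.0, 10.0):.2f} m")
```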
Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture
Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor
2014-01-01
The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
NASA Astrophysics Data System (ADS)
Mencin, David; Hodgkinson, Kathleen; Braun, John; Meertens, Charles; Mattioli, Glen; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl
2015-04-01
The GAGE facility, managed by UNAVCO, maintains and operates about 1300 GNSS stations distributed across North and Central America as part of the EarthScope Plate Boundary Observatory (PBO) and the Continuously Operating Caribbean GPS Observational Network (COCONet). UNAVCO has upgraded about 450 stations in these networks to real-time, high-rate GNSS (RT-GNSS) and added surface meteorological instruments. The majority of these streaming stations are part of the PBO, but they also include approximately 50 RT-GNSS stations in the Caribbean and Central American region belonging to the COCONet and TLALOCNet projects. Based on community input, UNAVCO has been exploring ways to increase the capability and utility of these resources to improve our understanding in diverse areas of geophysics, including seismic, volcanic, magmatic, and tsunami deformation sources, extreme weather events such as hurricanes and storms, and space weather. The RT-GNSS networks also have the potential to profoundly transform our ability to rapidly characterize geophysical events, provide early warning, and improve hazard mitigation and response. Specific applications currently under development through university, commercial, non-profit, and government collaboration at national and international scales include earthquake and tsunami early warning systems and near-real-time tropospheric modeling of hurricanes and assimilation of precipitable water vapor estimates. Using tsunami early warning as an example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of earthquake sources and the associated deformation, which informs the initial modeled tsunami. The networks can then also provide direct measurements of tsunami wave heights and propagation by tracking the associated ionospheric disturbance from several hundreds of kilometers away as the waves approach the shoreline. These GNSS-based constraints can refine the tsunami and inundation models and potentially mitigate hazards. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Our operational system has multiple communities that use and depend on a Pan-Pacific real-time open data set. The ability to merge existing data sets and user communities (seismic and tide gauge observations) with GNSS and Met data products has proven complicated because of issues related to metadata, appropriate data formats, data quality assessment in real time, and specific issues related to using these products in operational forecasting. Additional issues related to data access across national borders and to cognizant government-sanctioned "early warning" agencies, some committed to specific technologies, methodologies, and internal structures and further constrained by data policies, make a truly operational system an ongoing work in progress. We present a short history of evolving a very large and expensive RT-GNSS network, originally designed to answer specific long-term scientific questions about the structure and evolution of the North American plate boundaries, into a much-needed national hazard system while continuing to serve our core community in long-term scientific studies. Our primary focus in this presentation is an analysis of our current goals and of the impediments to achieving these broader objectives.
NASA Astrophysics Data System (ADS)
Najihah, R.; Effendi, D. M.; Hairunnisa, M. A.; Masiri, K.
2014-02-01
The catastrophic Indian Ocean tsunami of 26 December 2004 raised a number of questions for scientists and politicians on how to deal with tsunami risk and assessment in coastal regions. This paper discusses the challenges in tsunami vulnerability assessment and presents the results of a tsunami disaster mapping and vulnerability assessment study for the West Coast of Peninsular Malaysia. The spatial analysis was carried out using Geographical Information System (GIS) technology to spatially demarcate the boundaries of tsunami-affected villages so that a suitable disaster management program can be quickly and easily developed. In combination with other thematic maps, such as road maps, rail maps, school maps, and topographic map sheets, it was possible to plan access and shelter for the affected people. The tsunami vulnerability map was used to identify the vulnerability of villages and village populations to tsunami. In the vulnerability map, the intensity of the tsunami was classified into hazard zones based on the inundation level in meters (contours). The approach to producing the tsunami vulnerability assessment map consists of considering scenarios of plausible extreme, tsunami-generating events, computing the tsunami inundation levels caused by the different events and scenarios, and estimating the possible range of casualties for the computed inundation levels. The study provides an interactive means of identifying tsunami-affected areas after a disaster; mapping tsunami-vulnerable villages beforehand for planning purposes is an essential exercise in managing future disasters.
NASA Astrophysics Data System (ADS)
Oetjen, Jan; Engel, Max; Prasad Pudasaini, Shiva; Schüttrumpf, Holger; Brückner, Helmut
2017-04-01
Coasts around the world are affected by high-energy wave events like storm surges or tsunamis, depending on their regional climatological and geological settings. Focusing on tsunami impacts, we combine the abilities and experience of different scientific fields, aiming at improved insight into near- and onshore tsunami hydrodynamics. We investigate the transport of coarse clasts (so-called boulders) by tsunami impacts through a multi-methodology approach of numerical modelling, laboratory experiments, and sedimentary field records. Coupled numerical hydrodynamic and boulder transport models (BTM) are widely applied for analysing the characteristics of transport by tsunami, such as wave height and flow velocity. Numerical models able to simulate past tsunami events and the corresponding boulder transport patterns with high accuracy and acceptable computational effort can be utilized as powerful forecasting models predicting the impact of a tsunami approaching a coast. We have conducted small-scale physical experiments in a tilting flume with realistically shaped boulder models. Utilizing the structure-from-motion technique (Westoby et al., 2012), we reconstructed real boulders from a field study on the island of Bonaire (Lesser Antilles, Caribbean Sea; Engel & May, 2012). The obtained three-dimensional boulder meshes are used to create downscaled replicas of the real boulders for the physical experiments. The results for the irregularly shaped boulders are compared with experiments using regularly shaped boulder models to achieve better insight into the shape-related influence on transport patterns. The numerical model is based on the general two-phase mass flow model by Pudasaini (2012), enhanced for boulder transport simulations. The boulder is implemented using the immersed boundary technique (Peskin, 2002) and the direct forcing approach. In this method, Cartesian grids (fluid and particle phases) and Lagrangian meshes (boulder) are combined. By applying the immersed boundary method we can compute the interactions between the fluid, the particles, and an arbitrary boulder shape. We are able to reproduce the exact physical experiment for calibration and verification of the tsunami boulder transport phenomena. First results of the study will be presented. Engel, M., May, S.M.: Bonaire's boulder fields revisited: evidence for Holocene tsunami impact on the Leeward Antilles. Quaternary Science Reviews 54, 126-141, 2012. Peskin, C.S.: The immersed boundary method. Acta Numerica, 479-517, 2002. Pudasaini, S.P.: A general two-phase debris flow model. J. Geophys. Res. Earth Surf. 117, F03010, 2012. Westoby, M.J., Brasington, J., Glasser, N.F., Hambrey, M.J., Reynolds, J.M.: 'Structure-from-Motion' photogrammetry: a low-cost, effective tool for geoscience applications. Geomorphology 179, 300-314, 2012.
Tsunamis generated by eruptions from Mount St. Augustine volcano, Alaska.
Kienle, J; Kowalik, Z; Murty, T S
1987-06-12
During an eruption of the Alaskan volcano Mount St. Augustine in the spring of 1986, there was concern about the possibility that a tsunami might be generated by the collapse of a portion of the volcano into the shallow water of Cook Inlet. A similar edifice collapse of the volcano and ensuing sea wave occurred during an eruption in 1883. Other sea waves resulting in great loss of life and property have been generated by the eruption of coastal volcanos around the world. Although Mount St. Augustine remained intact during this eruptive cycle, a possible recurrence of the 1883 events spurred a numerical simulation of the 1883 sea wave. This simulation, which yielded a forecast of potential wave heights and travel times, was based on a method that could be applied generally to other coastal volcanos.
Computerized Workstation for Tsunami Hazard Monitoring
NASA Astrophysics Data System (ADS)
Lavrentiev-Jr, Mikhail; Marchuk, Andrey; Romanenko, Alexey; Simonov, Konstantin; Titov, Vasiliy
2010-05-01
We present the general structure and functionality of the proposed Computerized Workstation for Tsunami Hazard Monitoring (CWTHM). The tool allows interactive monitoring of the hazard, tsunami risk assessment, and mitigation at all stages, from the period of strong tsunamigenic earthquake preparation to inundation of the defended coastal areas. CWTHM is a software-hardware complex with a set of software applications optimized to achieve the best performance on the hardware platforms in use. The complex is calibrated for selected tsunami source zone(s) and coastal zone(s) to be defended; the number of zones (both source and coastal) is determined, or restricted, by the available hardware resources. The complex monitors the selected tsunami source zones via the Internet. The authors developed original algorithms that enable automatic detection of the preparation zone of a strong underwater earthquake. For the detected zone, the event time, magnitude, and spatial location of the tsunami source are evaluated by analyzing the energy of the seismic precursors (foreshocks), and all of these parameters are updated after each foreshock. Once a preparing event is detected, several scenarios are forecast for the wave amplitude parameters as well as for the inundation zone. The estimates include the lowest and highest wave amplitudes and the smallest and largest inundation zones; in addition, the most probable case is calculated. For multiple defended coastal zones, forecasts and estimates can be made in parallel. Each time the simulated model wave reaches deep-ocean buoys or a tide gauge, the expected wave parameters and inundation zones are updated using historical event information and pre-calculated scenarios. The Method of Splitting Tsunami (MOST) software package is used for the mathematical simulation. The authors propose code acceleration for deep-water wave propagation; as a result, performance is 15 times faster than the original version of MOST. The performance gain is achieved through compiler options, optimized libraries, and OpenMP parallelization; moreover, roughly 100-fold code acceleration is possible using modern Graphics Processing Units (GPUs). Parallel evaluation of inundation zones for multiple coastal zones is also available. All computer codes can be easily built under MS Windows and the Unix OS family. Although the software is virtually platform independent, the greatest performance gain is achieved with the recommended hardware components. When a seismic event occurs, all relevant parameters are updated with seismic data and wave propagation monitoring is enabled. As soon as the wave passes each deep-ocean tsunameter, the parameters of the initial displacement at the source are updated from direct calculations based on original algorithms. For better source reconstruction, a combination of two methods is used: an optimal linear combination of unit sources from a pre-calculated database, and direct numerical inversion along the wave ray between the real source and a particular measurement buoy. A specific dissipation parameter along the wave ray is also taken into account. Throughout the wave propagation process, the expected wave parameters and the characteristics of the inundation zone(s) are updated with all available information. If the recommended hardware components are used, monitoring results are available in real time. The suggested version of CWTHM has been tested by analyzing the seismic precursors (foreshocks) and the measured tsunami waves in the North Pacific for the central Kuril Islands tsunamigenic earthquake of November 15, 2006.
The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness
NASA Astrophysics Data System (ADS)
Whitmore, P.; Wilson, R. I.
2012-12-01
Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in most of the highly threatened coastal communities throughout the country through detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure that models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and - Providing guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.
Using GPS to Detect Imminent Tsunamis
NASA Technical Reports Server (NTRS)
Song, Y. Tony
2009-01-01
A promising method of detecting imminent tsunamis and estimating their destructive potential involves the use of Global Positioning System (GPS) data in addition to seismic data. Application of the method is expected to increase the reliability of global tsunami-warning systems, making it possible to save lives while reducing the incidence of false alarms. Tsunamis kill people every year; the 2004 Indian Ocean tsunami killed about 230,000 people. The magnitude of an earthquake is not always a reliable indication of the destructive potential of a tsunami. The 2004 Indian Ocean quake generated a huge tsunami, while the 2005 Nias (Indonesia) quake did not, even though both were initially estimated to be of similar magnitude. Between 2005 and 2007, five false tsunami alarms were issued worldwide. Such alarms have negative societal and economic effects. GPS stations can detect the ground motions of earthquakes in real time, as frequently as every few seconds. In the present method, the epicenter of an earthquake is located by use of data from seismometers, and then data from coastal GPS stations near the epicenter are used to infer the sea-floor displacements that precede a tsunami. The displacement data are used in conjunction with local topographical data and an advanced theory to quantify the destructive potential of a tsunami on a new tsunami scale based on the GPS-derived tsunami energy, much like the Richter scale used for earthquakes. An important element of the derivation of the advanced theory was the recognition that horizontal sea-floor motions contribute much more to the generation of tsunamis than previously believed. The method produces a reliable estimate of the destructive potential of a tsunami typically within minutes, well before the tsunami reaches coastal areas. The viability of the method was demonstrated in computational tests in which it yielded accurate representations of three historical tsunamis for which well-documented ground-motion measurements were available. Development of a global tsunami-warning system utilizing an expanded network of coastal GPS stations was under consideration at the time of reporting the information for this article.
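As a hedged illustration of how a GPS-derived tsunami energy scale might be computed once the inferred seafloor displacement has been mapped to an initial sea-surface deformation, the sketch below evaluates only the available potential energy of that deformation; the actual method also accounts for the kinetic energy contributed by horizontal sea-floor motion, which the abstract notes is important.

```python
import numpy as np

RHO = 1030.0   # sea-water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def tsunami_potential_energy(eta, dx, dy):
    """Available potential energy (J) of an initial sea-surface displacement.

    eta : 2-D array of initial sea-surface elevation (m), e.g., mapped from
          GPS-inferred seafloor displacement; dx, dy: grid spacing (m).
    E = 0.5 * rho * g * integral(eta^2) dA
    """
    return 0.5 * RHO * G * np.sum(eta ** 2) * dx * dy

# Illustrative only: a Gaussian uplift patch ~1 m high over ~100 km x 50 km.
x, y = np.meshgrid(np.linspace(-150e3, 150e3, 301), np.linspace(-100e3, 100e3, 201))
eta = 1.0 * np.exp(-(x / 50e3) ** 2 - (y / 25e3) ** 2)
print(f"{tsunami_potential_energy(eta, dx=1e3, dy=1e3):.3e} J")
```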
NASA Astrophysics Data System (ADS)
Savastano, Giorgio; Komjathy, Attila; Verkhoglyadova, Olga; Wei, Yong; Mazzoni, Augusto; Crespi, Mattia
2017-04-01
Tsunamis can produce gravity waves that propagate up to the ionosphere generating disturbed electron densities in the E and F regions. These ionospheric disturbances are studied in detail using ionospheric total electron content (TEC) measurements collected by continuously operating ground-based receivers from the Global Navigation Satellite Systems (GNSS). Here, we present results using a new approach, named VARION (Variometric Approach for Real-Time Ionosphere Observation), and for the first time, we estimate slant TEC (sTEC) variations in a real-time scenario from GPS and Galileo constellations. Specifically, we study the 2016 New Zealand tsunami event using GNSS receivers with multi-constellation tracking capabilities located in the Pacific region. We compare sTEC estimates obtained using GPS and Galileo constellations. The efficiency of the real-time sTEC estimation using the VARION algorithm has been demonstrated for the 2012 Haida Gwaii tsunami event. TEC variations induced by the tsunami event are computed using 56 GPS receivers in Hawai'i. We observe TEC perturbations with amplitudes up to 0.25 TEC units and traveling ionospheric disturbances moving away from the epicenter at a speed of about 316 m/s. We present comparisons with the real-time tsunami model MOST (Method of Splitting Tsunami) provided by the NOAA Center for Tsunami Research. We observe variations in TEC that correlate well in time and space with the propagating tsunami waves. We conclude that the integration of different satellite constellations is a crucial step forward to increasing the reliability of real-time tsunami detection systems using ground-based GNSS receivers as an augmentation to existing tsunami early warning systems.
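The core of a VARION-style estimate can be sketched as follows: epoch-to-epoch differences of the geometry-free carrier-phase combination cancel the (constant) ambiguities and, after a frequency-dependent scaling, yield slant TEC variations. The sketch assumes dual-frequency GPS phases already expressed in meters and omits the cycle-slip handling, elevation masking, and Galileo frequencies that a real implementation needs.

```python
import numpy as np

# GPS L1/L2 carrier frequencies (Hz); the same idea applies to Galileo signals.
F1, F2 = 1575.42e6, 1227.60e6
K = 40.308e16   # conversion constant: 40.308 m Hz^2 per TECU (1 TECU = 1e16 el/m^2)

def stec_rate(l1_m, l2_m):
    """Epoch-to-epoch slant TEC variation (TECU) from carrier phases in meters.

    Time differences of the geometry-free combination L4 = L1 - L2 remove the
    constant phase ambiguities; the frequency factor converts the differenced
    ionospheric delay to TEC units.
    """
    l4 = np.asarray(l1_m) - np.asarray(l2_m)   # geometry-free combination
    dl4 = np.diff(l4)                          # cancels ambiguities
    factor = (F1 ** 2 * F2 ** 2) / (K * (F1 ** 2 - F2 ** 2))  # ~9.5 TECU per m
    return dl4 * factor                        # delta sTEC per epoch, TECU
```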
Implementation of the NEAMTWS in Portugal
NASA Astrophysics Data System (ADS)
Matias, L. M.; Annunziato, A.; Carrilho, F.; Baptista, M.
2008-12-01
In this paper we present the ongoing implementation of a national tsunami warning system in Portugal. After the Sumatra event of December 2004, UNESCO, through its Intergovernmental Oceanographic Commission, recognized the need for an end-to-end global tsunami warning system, and International Coordination Groups have been established for different areas around the globe: the Indian, Caribbean, and Atlantic and Mediterranean ocean basins. This system is the natural response to the historical and recent instrumental events generated along the western segment of the Eurasia-Nubia plate boundary, whose eastern end corresponds to the Gulf of Cadiz. The TWS includes three main components: seismic detection, tsunami detection, and the issuing of warnings/alerts. In Portugal, automatic earthquake processing is installed at IM (Instituto de Meteorologia), which is the only national institution operating on a 24x7 basis; this makes IM the natural candidate to host the Portuguese tsunami warning system. The TWS under implementation has several key points: definition of the tsunami scenarios, tsunami detection, and tsunami protocol messages. The system will also be able to predict the potential tsunami impact along the coast, and wave heights and arrival times at pre-defined locations along the coast. In this study we present recent results on the definition of tsunami scenarios, the establishment of the scenario database, and the tsunami analysis tool. This work is a joint effort between the Instituto de Meteorologia (Portugal), the Joint Research Centre (JRC-Ispra, Italy), and the Portuguese group coordinating the implementation of the NEAMTWS in the area. This work has been financed by different European projects, such as NEAREST and TRANSFER, and also by the JRC, IM, and CGUL/IDL institutions.
NASA Astrophysics Data System (ADS)
LaBrecque, John
2016-04-01
The Global Geodetic Observing System has issued a Call for Participation to research scientists, geodetic research groups, and national agencies in support of the implementation of the IUGG recommendation for a Global Navigation Satellite System (GNSS) Augmentation to Tsunami Early Warning Systems. The call seeks to establish a working group to be a catalyst and motivating force for the definition of requirements, the identification of resources, and the encouragement of international cooperation in the establishment, advancement, and utilization of GNSS for tsunami early warning. During the past fifteen years the populations of the Indo-Pacific region experienced a series of mega-thrust earthquakes followed by devastating tsunamis that claimed nearly 300,000 lives. The future resiliency of the region will depend upon improvements to infrastructure and emergency response that will require very significant investments from the Indo-Pacific economies. The estimation of earthquake moment magnitude, source mechanism, and the distribution of crustal deformation is critical to rapid tsunami warning. Geodetic research groups have demonstrated the use of GNSS data to estimate earthquake moment magnitude, source mechanism, and the distribution of crustal deformation well enough for the accurate and timely prediction of tsunamis generated by mega-thrust earthquakes. GNSS data have also been used to measure the formation and propagation of tsunamis via ionospheric disturbances acoustically coupled to the propagating surface waves, thereby providing a new technique to track tsunami propagation across ocean basins, opening the way to improved tsunami propagation models, and providing accurate warning to communities in the far field. These two advances can deliver timely and accurate tsunami warnings to coastal communities in both the near and far field of mega-thrust earthquakes. This presentation provides the justification for and the details of the GGOS Call for Participation.
NASA Astrophysics Data System (ADS)
Armigliato, Alberto; Tinti, Stefano; Pagnoni, Gianluca; Ausilia Paparo, Maria; Zaniboni, Filippo
2016-04-01
A Mw = 6.5 earthquake occurred on November 17, 2015 just offshore of the western coast of the Ionian island of Lefkada (western Greece). The earthquake caused two fatalities and severe damage, especially on the island of Lefkada. Several landslides were set in motion by the earthquake, some of which occurred along the coastal cliffs. The earthquake was also clearly felt along the eastern coasts of Apulia, Calabria, and Sicily (Italy). The computed focal mechanisms indicate that the rupture occurred along a dextral strike-slip, sub-vertical fault, compatible with the well-known transcurrent tectonics of the Lefkada-Cephalonia area. At the time of drafting this abstract, no heterogeneous slip distribution had been proposed. No clear evidence of tsunami effects is available, with the sole exception of the signal recorded by the tide gauge in Crotone (eastern Calabria, Italy), where a clear disturbance (still to be fully characterised and explained) emerges from the background approximately 1 hour after the earthquake origin time. From the tsunami research point of view, the November 17 Lefkada earthquake poses at least two problems, which we try to address in this paper. The first consists of studying the tsunami generation based on the available seismic information and on the tectonic setting of the area. We present results of numerical simulations of the tsunami generation and propagation aimed at shedding light on why the generated tsunami was so weak (or even absent). Starting from the official fault parameters provided by the seismic agencies, we vary a number of them, including the length and width calculated from different regression formulas, and the depth. For each configuration we perform tsunami simulations with the in-house finite-difference code UBO-TSUFD. In parallel, we analyse the Crotone tide-gauge record to understand whether the observed "anomalous" signal can be attributed to a tsunami. If it can, we will try at least to reproduce the observed signal; otherwise, we will check whether the non-tsunamigenic nature of the event is confirmed by the tsunami simulations. The second problem relates to tsunami early warning, in particular the performance of the Tsunami Decision Matrix for the Mediterranean, presently adopted, for example, by the candidate Tsunami Service Providers at NOA (Greece) and INGV (Italy). We briefly discuss whether the present form of the matrix, which does not include any information on the focal mechanism, is well suited to a peculiar event like the November 17 earthquake, which was of strike-slip nature and had a magnitude lying just at the border between two distinct classes of tsunami potential forecast. This study is funded in the frame of the EU project ASTARTE ("Assessment, STrategy And Risk Reduction for Tsunamis in Europe", Grant 603839, 7th FP, ENV.2013.6.4-3) and of the Italian Flagship Project RITMARE ("La Ricerca ITaliana per il MARE").
Seaside, Oregon, Tsunami Vulnerability Assessment Pilot Study
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; Dominey-Howes, D.; Varner, J.
2006-12-01
The results of a pilot study to assess the risk from tsunamis for the Seaside-Gearhart, Oregon region will be presented. To determine the risk from tsunamis, it is first necessary to establish the hazard or probability that a tsunami of a particular magnitude will occur within a certain period of time. Tsunami inundation maps that provide 100-year and 500-year probabilistic tsunami wave height contours for the Seaside-Gearhart, Oregon, region were developed as part of an interagency Tsunami Pilot Study (1). These maps provided the probability of the tsunami hazard. The next step in determining risk is to determine the vulnerability or degree of loss resulting from the occurrence of tsunamis due to exposure and fragility. The tsunami vulnerability assessment methodology used in this study was developed by M. Papathoma and others (2). This model incorporates multiple factors (e.g., parameters related to the natural and built environments and socio-demographics) that contribute to tsunami vulnerability. Data provided with FEMA's HAZUS loss estimation software and Clatsop County, Oregon, tax assessment data were used as input to the model. The results, presented within a geographic information system, reveal the percentage of buildings in need of reinforcement and the population density in different inundation depth zones. These results can be used for tsunami mitigation, local planning, and for determining post-tsunami disaster response by emergency services. (1) Tsunami Pilot Study Working Group, Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps, Joint NOAA/USGS/FEMA Special Report, U.S. National Oceanic and Atmospheric Administration, U.S. Geological Survey, U.S. Federal Emergency Management Agency, 2006, Final Draft. (2) Papathoma, M., D. Dominey-Howes, Y. Zong, D. Smith, Assessing Tsunami Vulnerability, an example from Herakleio, Crete, Natural Hazards and Earth System Sciences, Vol. 3, 2003, p. 377-389.
Educating and Preparing for Tsunamis in the Caribbean
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.; Aliaga, B.; Edwards, S.
2013-12-01
The Caribbean and adjacent regions have a long history of tsunamis and earthquakes. Over the past 500 years, more than 75 tsunamis have been documented in the region by the NOAA National Geophysical Data Center. Since 1842 alone, 3,446 lives have been lost to tsunamis; this is more than in the Northeastern Pacific over the same period. With a population of almost 160 million, over 40 million visitors a year, and a heavy concentration of residents, tourists, businesses, and critical infrastructure along its shores (especially in the northern and eastern Caribbean), the risk to lives and livelihoods is greater than ever before. The only way to survive a tsunami is to get out of harm's way before the waves strike. In the Caribbean, given the relatively short distances from faults, potential submarine landslides, and volcanoes to some of the coastlines, tsunamis are likely to be short-fused, so it is imperative that tsunami warnings be issued extremely quickly and that people be educated on how to recognize and respond. Nevertheless, because tsunamis occur infrequently compared with hurricanes, it is a challenge for them to receive the priority they require in order to save lives when the next one strikes the region. Close cooperation among countries and territories is required for warning, but also for education and public awareness. Geographical vicinity and spoken languages need to be factored in when developing tsunami preparedness in the Caribbean, to make sure citizens receive a clear, reliable, and sound science-based message about the hazard and the risk. In 2006, in the wake of the Indian Ocean tsunami and after advocating without success for a Caribbean tsunami warning system since the mid-1990s, the Intergovernmental Oceanographic Commission of UNESCO established the Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). Its purpose is to advance an end-to-end tsunami warning system that serves regionally and delivers locally, saving lives and livelihoods not only from tsunamis but from all coastal hazards. Through this and other platforms, physical and social scientists, emergency managers, and elected officials have been working together via different mechanisms. Community-based recognition programs like the TsunamiReady™ Program, regional tsunami exercises, sub-regional public education activities such as the Tsunami Smart campaigns, internet technologies, social media, meetings and conferences, identification of local and national champions, capitalizing on news-breaking tsunamis and earthquakes, and economic resources for equipment and training have all been key to developing a tsunami-safer Caribbean. Thanks to these efforts, according to a 2013 survey, 93% of the countries covered by the CARIBE EWS have tsunami response protocols in place, although much more work is required. In 2010 the US National Weather Service established the Caribbean Tsunami Warning Program as the first step towards a Caribbean Tsunami Warning Center in the region, and in 2013 the Caribbean Tsunami Information Center was established in Barbados. Both institutions serve the region and play a key role in promoting both the warning and the educational components of the warning system.
NASA Astrophysics Data System (ADS)
Griffis, A. M.; Jessica, P.; Reinhardt, E. G.; Kosciuch, T. J.; Kovacs, S. E.; Hoffmann, G.
2017-12-01
Coastlines along the Arabian Sea are susceptible to tsunami-related inundation due to their proximity to the Makran Subduction Zone (MSZ). This subduction zone has seen decades of low-intensity events, but has historically produced large tsunamigenic earthquakes that have impacted the 100 million people living along the Arabian Sea. One major problem in assessing the seismic risk of the MSZ is that the historical record of events is spatially and temporally limited and relies heavily on eyewitness accounts. This hinders our ability to forecast the potential magnitude and recurrence intervals of earthquakes and tsunamis that can be expected in the future. Sediments deposited by paleotsunamis are useful as they expand the decadal record of events to millennial timescales that more accurately capture the full range of magnitudes and recurrence intervals. On November 28, 1945, an Mw 8.1 earthquake originating from the MSZ generated a tsunami that inundated coastlines along the Arabian Sea with wave heights up to 13 m. At Sur, a small village on the northeastern coastline of Oman, the tsunami deposited a laterally continuous shell-rich layer within a 12 km2 lagoon. This layer contained distinctive taphonomic assemblages of foraminifera and bivalves. Below the 1945 shelly deposit at Sur Lagoon, seven anomalous sand layers were found preserved within fine-grained lagoonal sediment. These layers of medium-coarse sand range in thickness from 5 to 35 cm and are separated by sandy-mud sediment. Grain-size analysis shows that these anomalously coarse layers are followed by an abrupt return to lagoonal mud. The sand layers have features consistent with the 1945 tsunami deposit, such as fining-upward trends, sharp basal contacts, and marine foraminifera (e.g., Amphistegina sp., planktics). In contrast, the surrounding lagoon deposits are generally massive, finer in grain size, and contain foraminiferal species typically found in shallow, quiescent coastal environments (e.g., Ammonia tepida, miliolids). We have attributed these seven marine sand layers to tsunami overwash deposition. Preliminary radiocarbon dating of articulated bivalve shells establishes a late Holocene age range for the tsunami sand layers, with ongoing analyses further constraining their timing and supporting assessment of the relative magnitudes of past events.
NASA Astrophysics Data System (ADS)
Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.
2015-12-01
Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes, examine the accuracy of the various discrimination methods, and discuss issues related to their successful real-time application.
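One of the listed discriminants, the slowness parameter Theta, compares radiated seismic energy to seismic moment; tsunami earthquakes are energy-deficient and plot low. The sketch below shows only that comparison in its simplest form; the input numbers and the cutoff value are illustrative assumptions, not operationally tuned thresholds.

```python
import math

# Sketch of an energy-to-moment "slowness" discriminant, Theta = log10(E/M0).
# The cutoff below is an assumed illustrative value; real warning systems tune
# such thresholds against a catalogue of past events.

def theta(radiated_energy_j: float, seismic_moment_nm: float) -> float:
    return math.log10(radiated_energy_j / seismic_moment_nm)

def looks_slow(radiated_energy_j: float, seismic_moment_nm: float,
               cutoff: float = -5.7) -> bool:
    """Flag an event as anomalously slow (tsunami-earthquake-like) if Theta is low."""
    return theta(radiated_energy_j, seismic_moment_nm) <= cutoff

# Hypothetical numbers: an energy-deficient event versus an ordinary thrust event.
print(round(theta(3.0e13, 3.0e20), 2), looks_slow(3.0e13, 3.0e20))   # low Theta
print(round(theta(5.0e15, 3.0e20), 2), looks_slow(5.0e15, 3.0e20))   # ordinary Theta
```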
Early warning and evacuation from tsunami, volcano, flood and other hazards
NASA Astrophysics Data System (ADS)
Sugimoto, M.
2012-12-01
In response to the great loss of life, evacuation calls issued through the Japan Meteorological Agency (JMA), local governments and the media have changed drastically since the 2011 Tohoku tsunami in Japan. One example is that JMA switched from forecasting a concrete figure for tsunami height to issuing one of three tsunami height levels. Data show that the border between life and death was just two minutes of earlier evacuation in the case of the 2011 tsunami, which shows how important prompt early evacuation is for community survival. However, the 2011 Tohoku tsunami also revealed that, apart from effective education, there is no reliable trigger to prompt early evacuation when power is lost during a disaster. Warning calls were still in a complicated state in Japan in July 2012. The 2012 Northern Kyushu downpours reached about 110 millimeters an hour at their worst and caused 30 casualties in Japan. JMA had learned from the tsunami, and this time it warned local governments of "unexpected severe rains"; however, the local governments did not notice the call because it was delivered in the same way as usual. One local government said, "We were very busy preparing our staff. We were looking at the necessary information on river water levels and flood prevention under the emergency" (NHK 2012). This case shows that JMA's chain of evacuation calls, from upstream (JMA) to midstream (local governments) to downstream (communities), has started to operate, but the upstream calls are not yet engaged with the midstream and the communities. Early warning from upstream is still a self-centered idea for both midstream and downstream, and in the end JMA could not convey a sense of crisis to the local governments. By contrast, the head of Oarai town in Ibaraki prefecture independently decided in 2011 to use a different warning call, "Order townspersons to evacuate immediately", unlike the other municipalities, even though no such call existed in the manuals. This risk communication between the local government and the community succeeded; people said they had never heard such a warning call, so they started to evacuate right away. Meanwhile, the Japanese government is adopting a strategy based on level 1 and level 2 tsunami heights, and Japan is still seeking to harmonize evacuation with protective infrastructure. How to resolve the warning and prevention issues remains unclear, and this research addresses how Japan is struggling with these issues now.
Tsunamis and splay fault dynamics
Wendt, J.; Oglesby, D.D.; Geist, E.L.
2009-01-01
The geometry of a fault system can have significant effects on tsunami generation, but most tsunami models to date have not investigated the dynamic processes that determine which path rupture will take in a complex fault system. To gain insight into this problem, we use the 3D finite element method to model the dynamics of a plate boundary/splay fault system. We use the resulting ground deformation as a time-dependent boundary condition for a 2D shallow-water hydrodynamic tsunami calculation. We find that if the stress distribution is homogeneous, rupture remains on the plate boundary thrust. When a barrier is introduced along the strike of the plate boundary thrust, rupture propagates to the splay faults, and produces a significantly larger tsunami than in the homogeneous case. The results have implications for the dynamics of megathrust earthquakes, and also suggest that dynamic earthquake modeling may be a useful tool in tsunami research. Copyright 2009 by the American Geophysical Union.
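To illustrate the downstream step, feeding a prescribed, time-dependent seafloor deformation into a shallow-water tsunami calculation, here is a deliberately simplified one-dimensional sketch. The grid, uniform depth, uplift history, reflecting boundaries, and forward-backward scheme are all invented for illustration and bear no relation to the 2D model used in the study.

```python
import numpy as np

# Toy 1-D linear shallow-water propagation forced by a time-dependent seafloor
# uplift added to the free surface. Everything here is a hypothetical example.

g, h = 9.81, 4000.0            # gravity (m/s^2), uniform ocean depth (m)
L, nx = 400e3, 400             # domain length (m), grid cells
dx = L / nx
c = np.sqrt(g * h)
dt = 0.5 * dx / c              # CFL-limited time step
x = np.linspace(0.0, L, nx)

eta = np.zeros(nx)             # free-surface elevation at cell centres (m)
u = np.zeros(nx + 1)           # depth-averaged velocity at staggered faces (m/s)

def seafloor_uplift_rate(t):
    """Hypothetical source: ~1 m of uplift rising over 10 s, centred in the domain."""
    if t >= 10.0:
        return np.zeros(nx)
    return 0.1 * np.exp(-((x - L / 2) / 20e3) ** 2)

t = 0.0
for _ in range(2000):
    # momentum update on interior faces (closed, reflecting ends)
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    # continuity update, plus the source term from the moving seafloor
    eta -= dt * h * (u[1:] - u[:-1]) / dx
    eta += dt * seafloor_uplift_rate(t)
    t += dt

print("max offshore amplitude (m):", round(float(eta.max()), 3))
```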
NASA Astrophysics Data System (ADS)
Heidarzadeh, Mohammad; Necmioglu, Ocal; Ishibe, Takeo; Yalciner, Ahmet C.
2017-12-01
Various Tsunami Service Providers (TSPs) within the Mediterranean Basin supply tsunami warnings, including CAT-INGV (Italy), KOERI-RETMC (Turkey), and NOA/HL-NTWC (Greece). The 20 July 2017 Bodrum-Kos (Turkey-Greece) earthquake (Mw 6.6) and tsunami provided an opportunity to assess the response from these TSPs. Although the Bodrum-Kos tsunami was moderate (e.g., runup of 1.9 m) with little damage to properties, it was the first noticeable tsunami in the Mediterranean Basin since the 21 May 2003 western Mediterranean tsunami. Tsunami waveform analysis revealed that the trough-to-crest height was 34.1 cm at the near-field tide gauge station of Bodrum (Turkey). The tsunami period band was 2-30 min, with peak periods at 7-13 min. We proposed a source fault model for this tsunami with a length of 25 km, a width of 15 km, and a uniform slip of 0.4 m. Tsunami simulations using both nodal planes produced almost the same results in terms of agreement between tsunami observations and simulations. The different TSPs provided tsunami warnings at 10 min (CAT-INGV), 19 min (KOERI-RETMC), and 18 min (NOA/HL-NTWC) after the earthquake origin time. Apart from CAT-INGV, whose initial Mw estimate differed by 0.2 units from the final value, the response from the other two TSPs came relatively late compared to the desired warning time of 10 min, given the difficulties of timely and accurate calculation of earthquake magnitude and tsunami impact assessment. It is argued that even if a warning time of 10 min had been achieved, it might not have been sufficient for addressing near-field tsunami hazards. Despite considerable progress and achievements within the upstream components of NEAMTWS (North East Atlantic, Mediterranean and Connected Seas Tsunami Warning System), the experience from this moderate tsunami highlights the need to improve the operational capabilities of TSPs, but more importantly to effectively integrate civil protection authorities into NEAMTWS and to strengthen tsunami education programs.
Natural disasters: forecasting economic and life losses
Nishenko, Stuart P.; Barton, Christopher C.
1997-01-01
Events such as hurricanes, earthquakes, floods, tsunamis, volcanic eruptions, and tornadoes are natural disasters because they negatively impact society, and so they must be measured and understood in human-related terms. At the U.S. Geological Survey, we have developed a new method to examine fatality and dollar-loss data, and to make probabilistic estimates of the frequency and magnitude of future events. This information is vital to large sectors of society including disaster relief agencies and insurance companies.
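As a toy illustration of turning a loss catalogue into probabilistic statements of this kind, the sketch below computes empirical annual exceedance rates from an invented list of losses and an assumed record length. It is only the generic starting point for such an analysis, not the USGS method described in the abstract.

```python
import numpy as np

# Empirical exceedance-frequency estimate from a loss catalogue. The catalogue
# values and the record length are made up for illustration.

losses = np.array([0.1, 0.3, 0.5, 1.2, 2.0, 4.5, 8.0, 15.0, 40.0])  # $ billions
record_years = 50.0

def annual_exceedance_rate(threshold: float) -> float:
    """Events per year with loss >= threshold, from the empirical catalogue."""
    return np.count_nonzero(losses >= threshold) / record_years

for x in (1.0, 10.0):
    rate = annual_exceedance_rate(x)
    print(f">= ${x:.0f}B: {rate:.2f}/yr, ~{1 / rate:.0f}-yr return period")
```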
Tsunami Ready Recognition Program for the Caribbean and Adjacent Regions Launched in 2015
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.; Hinds, K.; Aliaga, B.; Brome, A.; Lopes, R.
2015-12-01
Over 75 tsunamis have been documented in the Caribbean and Adjacent Regions over the past 500 years, with 4,561 associated deaths according to the NOAA Tsunami Database. The most recent devastating tsunami occurred in 1946 in the Dominican Republic, where 1,865 people died. With the explosive increase in residents, tourists, infrastructure, and economic activity along the coasts, the potential for human and economic loss is enormous. It has been estimated that on any day, more than 500,000 people in the Caribbean could be in harm's way just along the beaches, with hundreds of thousands more working and living in the tsunami hazard zones. In 2005 the UNESCO Intergovernmental Oceanographic Commission established the Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (ICG CARIBE EWS) to coordinate tsunami efforts among the 48 participating countries and territories in the region. In addition to monitoring, modeling and communication systems, one of the fundamental components of the warning system is community preparedness, readiness and resilience. Over the past 10 years, 49 coastal communities in the Caribbean have been recognized as TsunamiReady® by the US National Weather Service (NWS) in the case of Puerto Rico and the US Virgin Islands, and jointly by UNESCO and the NWS in the case of the non-US jurisdictions of Anguilla and the British Virgin Islands. In response to the positive feedback from the implementation of TsunamiReady, the ICG CARIBE EWS in 2015 recommended the approval of guidelines for a community performance-based recognition program. It also recommended the adoption of the name "Tsunami Ready", which was positively received in consultation with the NWS. Ten requirements were established for recognition, divided among Preparedness, Mitigation and Response elements, which were adapted from the proposed new US TsunamiReady guidelines and align well with emergency management functions. Both a regional ICG CARIBE EWS board and national/territorial "Tsunami Ready" boards will administer the recognition program. With the "Tsunami Ready" program, it will be possible to better track the full implementation of the tsunami warning system in the Caribbean and Adjacent Regions. Member States and donor agencies have been invited to support pilot projects.
OceanNOMADS: Real-time and retrospective access to operational U.S. ocean prediction products
NASA Astrophysics Data System (ADS)
Harding, J. M.; Cross, S. L.; Bub, F.; Ji, M.
2011-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Operational Model Archive and Distribution System (NOMADS) provides both real-time and archived atmospheric model output from servers at the National Centers for Environmental Prediction (NCEP) and the National Climatic Data Center (NCDC), respectively (http://nomads.ncep.noaa.gov/txt_descriptions/marRutledge-1.pdf). The NOAA National Ocean Data Center (NODC), with NCEP, is developing a complementary capability called OceanNOMADS for operational ocean prediction models. An NCEP ftp server currently provides real-time ocean forecast output (http://www.opc.ncep.noaa.gov/newNCOM/NCOM_currents.shtml), with retrospective access through NODC. A joint effort between the Northern Gulf Institute (NGI; a NOAA Cooperative Institute) and the NOAA National Coastal Data Development Center (NCDDC; a division of NODC) created the developmental version of the retrospective OceanNOMADS capability (http://www.northerngulfinstitute.org/edac/ocean_nomads.php) under the NGI Ecosystem Data Assembly Center (EDAC) project (http://www.northerngulfinstitute.org/edac/). Complementary funding support for the developmental OceanNOMADS from the U.S. Integrated Ocean Observing System (IOOS), through the Southeastern University Research Association (SURA) Model Testbed (http://testbed.sura.org/), this past year enabled NODC to create a production version of OceanNOMADS at NCDDC (http://www.ncddc.noaa.gov/ocean-nomads/). Access tool development and storage of initial archival data sets occur on the NGI/NCDDC developmental servers, with transition to NODC/NCDDC production servers as the model archives mature and operational space and distribution capability grow. Navy operational global ocean forecast subsets for U.S. waters comprise the initial ocean prediction fields resident on the NCDDC production server. The NGI/NCDDC developmental server currently includes the Naval Research Laboratory Intra-Americas Sea Nowcast/Forecast System over the Gulf of Mexico from 2004 to March 2011, the operational Naval Oceanographic Office (NAVOCEANO) regional USEast ocean nowcast/forecast system from early 2009 to present, and the NAVOCEANO operational regional AMSEAS (Gulf of Mexico/Caribbean) ocean nowcast/forecast system from its inception on 25 June 2010 to present. AMSEAS provided one of the real-time ocean forecast products accessed by NOAA's Office of Response and Restoration from the NGI/NCDDC developmental OceanNOMADS during the Deepwater Horizon oil spill last year. The developmental server also includes archived, real-time Navy coastal forecast products off coastal Japan in support of U.S./Japanese joint efforts following the 2011 tsunami. Real-time NAVOCEANO output from regional prediction systems off Southern California and around Hawaii, currently available on the NCEP ftp server, is scheduled for archival on the developmental OceanNOMADS by late 2011, along with the next-generation Navy/NOAA global ocean prediction output. Accession and archival of additional regions are planned as server capacities increase.
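For users, the practical value of a THREDDS/OPeNDAP-style archive is that model fields can be subset remotely before download. The snippet below is a minimal sketch of that access pattern only; the URL, dataset name, and variable names are hypothetical placeholders, not actual OceanNOMADS endpoints.

```python
# Sketch of programmatic access to a model archive served via THREDDS/OPeNDAP.
# The URL and variable names below are placeholders, not real endpoints.
import xarray as xr

url = "https://example.gov/thredds/dodsC/amseas/forecast_latest"  # hypothetical
ds = xr.open_dataset(url)                     # lazy open; subset before download
sst = ds["water_temp"].isel(time=0, depth=0)  # assumed variable name: surface field
box = sst.sel(lat=slice(18, 31), lon=slice(-98, -80))  # Gulf of Mexico window
print(float(box.mean()))
```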
Tsunami Data and Scientific Data Diplomacy
NASA Astrophysics Data System (ADS)
Arcos, N. P.; Dunbar, P. K.; Gusiakov, V. K.; Kong, L. S. L.; Aliaga, B.; Yamamoto, M.; Stroker, K. J.
2016-12-01
Free and open access to data and information fosters scientific progress and can build bridges between nations even when political relationships are strained. Data and information held by one stakeholder may be vital for promoting the research of another. As an emerging field of inquiry, data diplomacy explores how data-sharing helps create and support positive relationships between countries to enable the use of data for societal and humanitarian benefit. Tsunamis are arguably the only natural hazard that has been addressed so effectively at an international scale, and they illustrate the success of scientific data diplomacy. Tsunami mitigation requires international scientific cooperation in both tsunami science and technology development. This requires not only international agreements, but working-level relationships between scientists from countries that may have different political and economic policies. For example, following the Pacific-wide tsunami of 1960, which killed two thousand people in Chile and then, up to a day later, hundreds in Hawaii, Japan, and the Philippines, delegates from twelve countries met to discuss and draft the requirements for an international tsunami warning system. The Pacific Tsunami Warning System led to the development of local, regional, and global tsunami databases and catalogs. For example, scientists at NOAA/NCEI and the Tsunami Laboratory of the Russian Academy of Sciences have collaborated on their tsunami catalogs, which are now routinely accessed by scientists and the public around the world. These data support decision-making during tsunami events and are used in developing inundation and evacuation maps and hazard assessments. This presentation will include additional examples of agreements for data-sharing between countries, as well as challenges in standardization and consistency among the tsunami research community. Tsunami data and scientific data diplomacy have ultimately improved understanding of tsunamis and their associated impacts.
NASA Astrophysics Data System (ADS)
Armigliato, A.; Tinti, S.; Pagnoni, G.; Zaniboni, F.
2012-04-01
The central Mediterranean, and in particular the coasts of southern Italy, is one of the areas with the highest tsunami hazard in Europe. Limiting our attention to earthquake-generated tsunamis, the sources of historical events hitting this region, as well as most of the potential tsunamigenic seismic sources mapped there, are found at very short distances from the closest shorelines, reducing the time needed for the tsunami to reach the coasts to a few minutes. This in itself is an issue from the Tsunami Early Warning (TEW) perspective. To make the overall problem even more challenging, it is known that large tsunamigenic earthquakes are generally characterized by highly heterogeneous distributions of slip on the fault. This feature has been recognized clearly, for instance, in the giant Sumatra 2004, Chile 2010, and Japan 2011 earthquakes (magnitude 9.3, 8.8 and 9.0, respectively), but it was also a property of smaller-magnitude events that occurred in the region considered in this study, such as the 28 December 1908 Messina Straits tsunamigenic earthquake (M=7.2). In terms of tsunami impact, the parent-fault slip heterogeneity usually produces high variability of run-up and inundation on the near-field coasts, which further complicates the TEW problem. The information on the details of the seismic source rupture coming from the seismic (and possibly geodetic) networks, though of primary importance, is typically available after a time that is comparable to or larger than the time between the generation and the impact of the tsunami. In the framework of the EU-FP7 TRIDEC Project, we investigate how proper marine sensor coverage, both along the coasts and offshore, can help place constraints on the characteristics of the source in near-real time. Our approach consists of discussing numerical tsunami scenarios in the central Mediterranean involving different slip distributions on the parent fault; the tsunamigenic region we consider is the Hyblaean-Malta escarpment located offshore eastern Sicily, where several large historical tsunamigenic earthquakes took place (e.g., 11 January 1693). Starting from different slip configurations on a chosen fault, we compare the time series of wave elevation simulated for tide gauges placed along the coast and for virtual deep-sea sensors placed at different distances from the source area. The final goal is to understand whether a properly designed marine sensor network can help determine in real time the slip characteristics along the parent fault, and hence forecast the pattern of impact of the tsunami, especially along the closest coasts.
Implementation and Challenges of the Tsunami Warning System in the Western Mediterranean
NASA Astrophysics Data System (ADS)
Schindelé, F.; Gailler, A.; Hébert, H.; Loevenbruck, A.; Gutierrez, E.; Monnier, A.; Roudil, P.; Reymond, D.; Rivera, L.
2015-03-01
The French Tsunami Warning Center (CENALT) has been in operation since 2012. It contributes to the North-East Atlantic and Mediterranean (NEAM) tsunami warning and mitigation system coordinated by the United Nations Educational, Scientific, and Cultural Organization, and benefits from data exchange with several foreign institutes. This center is supported by the French Government and provides French civil-protection authorities and member states of the NEAM region with relevant messages for assessing potential tsunami risk when an earthquake has occurred in the Western Mediterranean Sea or the Northeastern Atlantic Ocean. To achieve its objectives, CENALT has developed a series of innovative techniques based on recent research results in seismology for early tsunami warning, in the monitoring of sea level variations and detection capability, and in effective numerical computation of ongoing tsunamis.
The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program (PR-NTHMP)
NASA Astrophysics Data System (ADS)
Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.
2014-12-01
Tsunami hazard assessment, detection, warning, education and outreach efforts are intended to reduce losses of life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk-reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. Seismic water waves originating in the prominent fault systems around PR are considered a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional and tele-tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes; second, each of these scenarios was simulated; third, the worst-case scenario (MOM) was determined. The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico as TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocols and criteria in the PR/VI were updated. This paper describes the PR-NTHMP project, including real-time earthquake and tsunami monitoring as well as the specific protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education and outreach in Puerto Rico.
REWSET: A prototype seismic and tsunami early warning system in Rhodes island, Greece
NASA Astrophysics Data System (ADS)
Papadopoulos, Gerasimos; Argyris, Ilias; Aggelou, Savvas; Karastathis, Vasilis
2014-05-01
Tsunami warning under near-field conditions is a critical issue in the Mediterranean Sea, since the most important tsunami sources lie so close to the coasts that tsunami travel times start from about five minutes. The project NEARTOWARN (2012-2013), supported by the EU-DG ECHO, contributed substantially to the development of new tools for near-field tsunami early warning in the Mediterranean. One of the main achievements is the development of a local warning system at the test site of Rhodes island (Rhodes Early Warning System for Earthquakes and Tsunamis - REWSET). The system is composed of three main subsystems: (1) a network of eight seismic early warning devices installed in four different localities on the island, one at the civil protection office, another at the Fire Brigade, and two others in municipality buildings; (2) two radar-type (ultrasonic) tide gauges installed in the eastern coastal zone of the island, which was selected because research on historical earthquake and tsunami activity has indicated that the most important near-field tsunami sources are situated offshore to the east of Rhodes; (3) a crisis Geographic Management System (GMS), which is a web-based and GIS-based application incorporating a variety of thematic maps and other information types. The seismic early warning devices are activated by strong (magnitude around 6 or more) earthquakes occurring at distances up to about 100 km from Rhodes, thus enabling immediate mobilization of the civil protection. The tide gauges transmit sea-level data, while during a crisis the GMS supports the decisions to be made by civil protection. In the near future it is planned that REWSET will be integrated with national and international systems. REWSET is a prototype that could certainly be developed in other coastal areas of the Mediterranean and beyond.
Spatial Analysis of Geohazards using ArcGIS--A web-based Course.
NASA Astrophysics Data System (ADS)
Harbert, W.; Davis, D.
2003-12-01
As part of the Environmental Systems Research Incorporated (ESRI) Virtual Campus program, a course was designed to present the benefits of Geographical Information Systems (GIS) based spatial analysis as applied towards a variety of geohazards. We created this on-line ArcGIS 8.x-based course to aid the motivated student or professional in his or her efforts to use GIS in determining where geohazards are likely to occur and for assessing their potential impact on the human community. Our course is broadly designed for earth scientists, public sector professionals, students, and others who want to apply GIS to the study of geohazards. Participants work with ArcGIS software and diverse datasets to display, visualize and analyze a wide variety of data sets and map a variety of geohazards including earthquakes, volcanoes, landslides, tsunamis, and floods. Following the GIS-based methodology of posing a question, decomposing the question into specific criteria, applying the criteria to spatial or tabular geodatasets and then analyzing feature relationships, from the beginning the course content was designed in order to enable the motivated student to answer questions. For example, to explain the relationship between earth quake location, earthquake depth, and plate boundaries; use a seismic hazard map to identify population and features at risk from an earthquake; import data from an earthquake catalog and visualize these data in 3D; explain the relationship between earthquake damage and local geology; use a flood scenario map to identify features at risk for forecast river discharges; use a tsunami inundation map to identify population and features at risk from tsunami; use a hurricane inundation map to identify the population at risk for any given category hurricane; estimate accumulated precipitation by integrating time-series Doppler radar data; and model a real-life landslide event. The six on-line modules for our course are Earthquakes I, Earthquakes II, Volcanoes, Floods, Coastal Geohazards and Landslides. Earthquake I can be viewed and accessed for no cost at http://campus.esri.com.
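The core pattern behind exercises like "use a tsunami inundation map to identify features at risk" is a spatial join between a hazard polygon layer and an asset layer. The sketch below shows that pattern in open-source form rather than in ArcGIS; file names and column names are hypothetical, and any polygon/point layers in a matching coordinate system would work the same way.

```python
# Sketch of a GIS selection: find features inside a tsunami inundation zone.
# File and column names are hypothetical placeholders.
import geopandas as gpd

zones = gpd.read_file("inundation_zones.shp")     # polygons with a 'depth_m' field
buildings = gpd.read_file("buildings.shp")        # point or footprint layer
buildings = buildings.to_crs(zones.crs)           # make coordinate systems agree

at_risk = gpd.sjoin(buildings, zones, predicate="within")
print(len(at_risk), "buildings inside the mapped inundation zone")
print(at_risk.groupby("depth_m").size())          # count by inundation depth class
```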
Reducing risk where tectonic plates collide—U.S. Geological Survey subduction zone science plan
Gomberg, Joan S.; Ludwig, Kristin A.; Bekins, Barbara; Brocher, Thomas M.; Brock, John C.; Brothers, Daniel; Chaytor, Jason D.; Frankel, Arthur; Geist, Eric L.; Haney, Matt; Hickman, Stephen H.; Leith, William S.; Roeloffs, Evelyn A.; Schulz, William H.; Sisson, Thomas W.; Wallace, Kristi; Watt, Janet; Wein, Anne M.
2017-06-19
The U.S. Geological Survey (USGS) serves the Nation by providing reliable scientific information and tools to build resilience in communities exposed to subduction zone earthquakes, tsunamis, landslides, and volcanic eruptions. Improving the application of USGS science to successfully reduce risk from these events relies on whole community efforts, with continuing partnerships among scientists and stakeholders, including researchers from universities, other government labs and private industry, land-use planners, engineers, policy-makers, emergency managers and responders, business owners, insurance providers, the media, and the general public. Motivated by recent technological advances and increased awareness of our growing vulnerability to subduction-zone hazards, the USGS is uniquely positioned to take a major step forward in the science it conducts and the products it provides, building on its tradition of using long-term monitoring and research to develop effective products for hazard mitigation. This science plan provides a blueprint both for prioritizing USGS science activities and for delineating USGS interests and potential participation in subduction zone science supported by its partners. The activities in this plan address many USGS stakeholder needs: high-fidelity tools and user-tailored information that facilitate increasingly more targeted, neighborhood-scale decisions to mitigate risks more cost-effectively and ensure post-event operability (such tools may include maps, tables, and simulated earthquake ground-motion records conveying shaking intensity and frequency, which facilitate the prioritization of retrofitting of vulnerable infrastructure); information to guide local land-use and response planning to minimize development in likely hazardous zones (for example, databases, maps, and scenario documents to guide evacuation route planning in communities near volcanoes, along coastlines vulnerable to tsunamis, and built on landslide-prone terrain); new tools to assess the potential for cascading hazards, such as landslides, tsunamis, coastal changes, and flooding caused by earthquakes or volcanic eruptions; geospatial models of permanent, widespread land- and sea-level changes that may occur in the immediate aftermath of great (M ≥8.0) subduction zone earthquakes; strong partnerships between scientists and public safety providers for effective decision making during periods of elevated hazard and risk; accurate forecasts of far-reaching hazards (for example, ash clouds, tsunamis) to avert catastrophes and unnecessary disruptions in air and sea transportation; and aftershock forecasts to guide decisions about when and where to re-enter, repair, or rebuild buildings and infrastructure, for all types of subduction zone earthquakes.
Tsunami Generation Modelling for Early Warning Systems
NASA Astrophysics Data System (ADS)
Annunziato, A.; Matias, L.; Ulutas, E.; Baptista, M. A.; Carrilho, F.
2009-04-01
In the frame of a collaboration between the European Commission Joint Research Centre and the Institute of Meteorology in Portugal, a complete analytical tool to support Early Warning Systems is being developed. The tool will be part of the Portuguese National Early Warning System and will also be used in the frame of the UNESCO North Atlantic section of the Tsunami Early Warning System. The system, called Tsunami Analysis Tool (TAT), includes a worldwide scenario database that has been pre-calculated using the SWAN-JRC code (Annunziato, 2007). This code uses a simplified fault generation mechanism, and the hydraulic model is based on the SWAN code (Mader, 1988). In addition to the pre-defined scenarios, a system of computers is always ready to start a new calculation whenever a new earthquake is detected by the seismic networks (such as USGS or EMSC) and is judged capable of generating a tsunami. The calculation is performed using minimal parameters (the epicentre and the magnitude of the earthquake): the programme calculates the rupture length and rupture width using empirical relationships proposed by Ward (2002). The database calculations, as well as the newly generated calculations for the current conditions, are therefore available to TAT, where the real online analysis is performed. The system also allows analysis of sea-level measurements available worldwide, in order to compare them with the calculations and decide whether a tsunami is really occurring. Although TAT, connected with the scenario database and the online calculation system, is at the moment the only software that can support tsunami analysis on a global scale, we are convinced that the fault generation mechanism is too simplified to give a correct tsunami prediction. Furthermore, short tsunami arrival times in particular require data on the tectonic features of the faults, such as strike, dip, rake and slip, in order to minimize the real-time uncertainty of the rupture parameters. Indeed, the earthquake parameters available right after an earthquake are preliminary and could be inaccurate. Determining which earthquake source parameters affect the initial height and time series of tsunamis will show the sensitivity of the tsunami time series to seismic source details. Therefore a new fault generation model will be adopted, according to the seismotectonic properties of the different regions, and finally included in the calculation scheme. In order to do this, within the collaboration framework with the Portuguese authorities, a new model is being defined, starting from the seismic sources in the North Atlantic, the Caribbean and the Gulf of Cadiz. As earthquakes occurring in North Atlantic and Caribbean sources may affect mainland Portugal and the Azores and Madeira archipelagos, these sources will also be included in the analysis. We have started by examining the geometries of those sources that spawn tsunamis, to understand the effect of fault geometry and earthquake depth. References: Annunziato, A., 2007. The Tsunami Assessment Modelling System by the Joint Research Center, Science of Tsunami Hazards, Vol. 26, pp. 70-92. Mader, C.L., 1988. Numerical Modelling of Water Waves, University of California Press, Berkeley, California. Ward, S.N., 2002. Tsunamis, Encyclopedia of Physical Science and Technology, Vol. 17, pp. 175-191, ed. Meyers, R.A., Academic Press.
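The minimal-parameter initialization described above, deriving a fault from epicentre and magnitude alone, boils down to scaling relations plus the moment-slip identity. The sketch below uses Wells & Coppersmith (1994)-style all-event coefficients, quoted from memory, purely as an illustrative stand-in for the Ward (2002) relations the system actually employs; the rigidity value is also an assumption.

```python
# Sketch: fault dimensions and mean slip from magnitude alone.
# Coefficients and rigidity are illustrative assumptions, not the TAT values.

RIGIDITY = 3.0e10  # Pa, assumed crustal rigidity

def fault_dimensions_km(mw):
    length = 10 ** (-2.44 + 0.59 * mw)   # subsurface rupture length (km), assumed regression
    width = 10 ** (-1.01 + 0.32 * mw)    # downdip rupture width (km), assumed regression
    return length, width

def mean_slip_m(mw):
    m0 = 10 ** (1.5 * mw + 9.1)          # seismic moment (N*m) from Mw
    length, width = fault_dimensions_km(mw)
    return m0 / (RIGIDITY * length * 1e3 * width * 1e3)

for mw in (7.0, 8.0, 9.0):
    l, w = fault_dimensions_km(mw)
    print(f"Mw {mw}: L~{l:.0f} km, W~{w:.0f} km, slip~{mean_slip_m(mw):.1f} m")
```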
Optimization of the Number and Location of Tsunami Stations in a Tsunami Warning System
NASA Astrophysics Data System (ADS)
An, C.; Liu, P. L. F.; Pritchard, M. E.
2014-12-01
Optimizing the number and location of tsunami stations in designing a tsunami warning system is an important and practical problem. It is desirable to maximize the capability of the data obtained from the stations to constrain the earthquake source parameters, and at the same time to minimize the number of stations. During the 2011 Tohoku tsunami event, 28 coastal gauges and DART buoys in the near field recorded tsunami waves, providing an opportunity to assess the effectiveness of those stations in identifying the earthquake source parameters. Assuming a single-plane fault geometry, inversions of tsunami data from combinations of various numbers (1-28) of stations and locations are conducted, and their effectiveness is evaluated according to the residuals of the inverse method. Results show that the optimized locations of stations depend on the number of stations used. If the stations are optimally located, 2-4 stations are sufficient to constrain the source parameters. Regarding the optimized locations, stations must be spread uniformly in all directions, which is not surprising. It is also found that stations within the source region generally give a worse constraint on the earthquake source than stations farther from the source, owing to the exaggeration of model error when matching large-amplitude waves at near-source stations. Quantitative discussions of these findings will be given in the presentation. Applying a similar analysis to the Manila Trench, based on artificial scenarios of earthquakes and tsunamis, the optimal locations of tsunami stations are obtained, which provides guidance for deploying a tsunami warning system in this region.
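The station-selection test rests on a linear Green's-function inversion: observed waveforms are fit by a combination of precomputed unit-source waveforms, and the residual norm for a given station subset measures how well that subset constrains the source. The sketch below reproduces only that linear-algebra core with synthetic numbers; real studies use simulated unit-source waveforms and the recorded 2011 Tohoku gauge data.

```python
import numpy as np

# Synthetic Green's-function inversion: fit "observations" with unit sources,
# compare residuals for different station subsets. All numbers are invented.

rng = np.random.default_rng(0)
n_subfaults, n_stations, n_times = 6, 4, 200

G = rng.normal(size=(n_stations * n_times, n_subfaults))   # unit-source waveforms
true_slip = np.array([0.0, 2.0, 5.0, 3.0, 0.5, 0.0])
obs = G @ true_slip + 0.05 * rng.normal(size=G.shape[0])   # noisy observations

def invert(station_subset):
    """Least-squares slip estimate and residual norm using only some stations."""
    rows = np.concatenate([np.arange(s * n_times, (s + 1) * n_times)
                           for s in station_subset])
    slip, *_ = np.linalg.lstsq(G[rows], obs[rows], rcond=None)
    residual = np.linalg.norm(G[rows] @ slip - obs[rows])
    return slip, residual

slip_all, res_all = invert(range(n_stations))
slip_two, res_two = invert([0, 3])
print("all stations:", np.round(slip_all, 2), "residual", round(res_all, 2))
print("two stations:", np.round(slip_two, 2), "residual", round(res_two, 2))
```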
A Multi-Disciplinary Approach to Tsunami Disaster Prevention in Java, Indonesia
NASA Astrophysics Data System (ADS)
Horns, D. M.; Hall, S.; Harris, R. A.
2016-12-01
The island of Java in Indonesia is the most densely populated island on earth, and is situated within one of the most tectonically active regions on the planet. Deadly tsunamis struck Java in 1994 and 2006. We conducted an assessment of tsunami hazards on the south coast of Java using a team of geologists, public health professionals, and disaster education specialists. The social science component included tsunami awareness surveys, education in communities and schools, evacuation drills, and evaluation. We found that the evacuation routes were generally appropriate for the local hazard, and that most people were aware of the routes and knew how to use them. However, functional tsunami warning systems were lacking in most areas and knowledge of natural warning signs was incomplete. We found that while knowledge of when to evacuate improved after our educational lesson, some incorrect beliefs persisted (e.g. misconceptions about types of earthquakes able to generate tsunamis and how far inland tsunamis can reach). There was a general over-reliance on government to alert when evacuation is needed as well as reluctance on the part of local leaders to take initiative to sound tsunami alerts. Many people on earth who are at risk of tsunamis live in areas where the government lacks resources to maintain a functional tsunami warning system. The best hope for protecting those people is direct education working within the local cultural belief system. Further collaboration is needed with government agencies to design consistent and repeated messages challenging misperceptions about when to evacuate and to encourage individuals to take personal responsibility based on natural warning signs.
Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements
NASA Technical Reports Server (NTRS)
Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.
2011-01-01
Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
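The band-pass step described above can be illustrated with a few lines of filtering. The sketch below keeps only fluctuations in an assumed 10-30 minute period band on a synthetic TEC series; the pass band, sample rate, and signal are illustrative assumptions, not the JPL processing chain.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Band-pass a synthetic TEC series to isolate gravity-wave-period fluctuations.
# Pass band and test signal are assumptions for illustration only.

fs = 1.0 / 30.0                                   # one TEC sample every 30 s (Hz)
low_period, high_period = 30 * 60.0, 10 * 60.0    # assumed 10-30 minute band (s)
low_f, high_f = 1.0 / low_period, 1.0 / high_period

t = np.arange(0, 6 * 3600, 30.0)                  # six hours of samples
tec = (20.0 + 0.002 * t                           # background TEC plus slow trend (TECU)
       + 0.3 * np.sin(2 * np.pi * t / (15 * 60))) # 15-min wave, a few percent of background

b, a = butter(4, [low_f, high_f], btype="bandpass", fs=fs)
tec_filtered = filtfilt(b, a, tec)

print("peak filtered amplitude (TECU):", round(float(np.abs(tec_filtered).max()), 2))
```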
NASA Astrophysics Data System (ADS)
Tanioka, Y.; Miranda, G. J. A.; Gusman, A. R.
2017-12-01
Recently, tsunami early warning techniques have been improved using tsunami waveforms observed at ocean-bottom pressure gauges such as the NOAA DART system and the DONET and S-net systems in Japan. However, for tsunami early warning of near-field tsunamis it is essential to determine appropriate source models using seismological analysis before large tsunamis hit the coast, especially for tsunami earthquakes, which generate unusually large tsunamis. In this paper, we develop a technique to determine appropriate source models from which tsunami inundation along the coast can be numerically computed. The technique is tested on four large earthquakes that occurred off Central America: the 1992 Nicaragua tsunami earthquake (Mw 7.7), the 2001 El Salvador earthquake (Mw 7.7), the 2004 El Astillero earthquake (Mw 7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw 7.3). In this study, fault parameters were estimated from the W-phase inversion, and the fault length and width were then determined from scaling relationships. At first, the slip amount was calculated from the seismic moment with a constant rigidity of 3.5 x 10^10 N/m2. The tsunami numerical simulation was carried out and compared with the observed tsunami. For the 1992 Nicaragua tsunami earthquake, the computed tsunami was much smaller than the observed one. For the 2004 El Astillero earthquake, the computed tsunami was overestimated. In order to solve this problem, we constructed a depth-dependent rigidity curve, similar to that suggested by Bilek and Lay (1999). The rigidity from this curve at the central depth estimated by the W-phase inversion was used to calculate the slip amount of the fault model. Using these new slip amounts, the tsunami numerical simulations were carried out again. The observed tsunami heights, run-up heights, and inundation areas for the 1992 Nicaragua tsunami earthquake were then well explained by the computed ones, and the tsunamis from the other three earthquakes were also reasonably well explained. Therefore, our technique using a depth-dependent rigidity curve works to estimate an appropriate fault model that reproduces tsunami heights near the coast in Central America. The technique may also work in other subduction zones, provided a depth-dependent rigidity curve is determined for that particular subduction zone.
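The key scaling step is the moment-slip identity: for a fixed seismic moment and fault area, a lower rigidity implies a proportionally larger slip, which is why a shallow, compliant rupture produces a bigger tsunami. The sketch below shows only that arithmetic; the fault dimensions and the depth-rigidity table are invented for illustration and are not the Bilek and Lay curve used in the study.

```python
# Slip from seismic moment for different assumed rigidities: slip = M0 / (mu * L * W).
# All numbers below are hypothetical placeholders.

def slip_from_moment(m0_nm: float, length_m: float, width_m: float,
                     rigidity_pa: float) -> float:
    return m0_nm / (rigidity_pa * length_m * width_m)

m0 = 3.4e20                     # roughly an Mw 7.6-7.7 seismic moment (N*m)
length, width = 100e3, 40e3     # assumed fault dimensions (m)

# hypothetical depth-dependent rigidity table (centroid depth in km -> Pa)
rigidity_by_depth = {5: 1.0e10, 10: 2.0e10, 20: 3.5e10, 40: 5.0e10}

for depth_km, mu in sorted(rigidity_by_depth.items()):
    print(f"depth {depth_km:>2} km, mu {mu:.1e} Pa -> "
          f"slip {slip_from_moment(m0, length, width, mu):.1f} m")
```

For the same moment, the shallow low-rigidity case yields several times more slip than the constant-rigidity case, which is the qualitative effect the depth-dependent curve is meant to capture.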
A tsunami early warning system for the coastal area modeling
NASA Astrophysics Data System (ADS)
Soebroto, Arief Andy; Sunaryo, Suhartanto, Ery
2015-04-01
Tsunamis are a potential disaster throughout the territory of Indonesia, an archipelagic country close to deep ocean; a tsunami struck Aceh province in 2004. Early prevention efforts have been carried out. One of them is the "tsunami buoy" developed by BPPT, which places sensors on the ocean floor near the coast to detect earthquakes on the sea bed; the detection results are transmitted via satellite by a transmitter floating on the sea surface. Each such system costs billions of dollars, and another constraint has been theft of the floating "tsunami buoy" transmitter in the absence of a guard. The system in this study uses radio-frequency transmission and focuses on coastal areas, where costs are lower, so that it can be applied at the many beaches in Indonesia that are potentially affected by tsunamis. The monitoring system sends the detection results to the warning system over a radio-frequency link with a range of about 3 km. Test results on the sensor sub-module of the monitoring system show an error of 0.63%, within the 10% tolerance, indicating good sensing quality. Tests of data transmission from the monitoring-system transceiver to the warning-system receiver showed 100% successful delivery and reception of data, and tests of the whole system showed it functioning 100% properly.
A User's Guide to the Tsunami Datasets at NOAA's National Data Buoy Center
NASA Astrophysics Data System (ADS)
Bouchard, R. H.; O'Neil, K.; Grissom, K.; Garcia, M.; Bernard, L. J.; Kern, K. J.
2013-12-01
The National Data Buoy Center (NDBC) has maintained and operated the National Oceanic and Atmospheric Administration's (NOAA) tsunameter network since 2003. The tsunameters employ the NOAA-developed Deep-ocean Assessment and Reporting of Tsunamis (DART) technology. The technology measures the pressure and temperature every 15 seconds on the ocean floor and transforms them into equivalent water-column height observations. A complex series of subsampled observations is transmitted acoustically in real-time to a moored buoy or marine autonomous vehicle (MAV) at the ocean surface. The surface platform uses its satellite communications to relay the observations to NDBC. NDBC places the observations onto the Global Telecommunication System (GTS) for relay to NOAA's Tsunami Warning Centers (TWC) in Hawai'i and Alaska and to the international community. It takes less than three minutes to speed the observations from the ocean floor to the TWCs. NDBC can retrieve limited amounts of the 15-s measurements from the instrumentation on the ocean floor using the technology's two-way communications. NDBC recovers the full-resolution 15-s measurements about every 2 years and forwards the datasets and metadata to the National Geophysical Data Center for permanent archive. Meanwhile, NDBC retains the real-time observations on its website. The type of real-time observation depends on the operating mode of the tsunameter. NDBC provides the observations in a variety of traditional and innovative methods and formats that include descriptors of the operating mode. Datasets, organized by station, are available from the NDBC website as text files and from the NDBC THREDDS server in netCDF format. The website provides alerts and lists of events that allow users to focus on the information relevant for tsunami hazard analysis. In addition, NDBC developed a basic web service to query station information and observations to support the Short-term Inundation Forecasting for Tsunamis (SIFT) model. NDBC and NOAA's Integrated Ocean Observing System have fielded the innovative Sensor Observation Service (SOS) that allows users access to observations by station, or by groups of stations that have been organized into Features of Interest, such as the 2011 Honshu Tsunami. The user can elect to receive the SOS observations in several different formats, such as Sensor Web Enablement (SWE) or delimiter-separated values. Recently, NDBC's Coastal and Offshore Buoys provided meteorological observations used in analyzing possible meteotsunamis on the U.S. East Coast. However, many of these observations are some distance away from the tsunameters. In a demonstration project, NDBC has added sensors to a tsunameter's surface buoy and a MAV to support program requirements for meteorological observations. All these observations are available from NDBC's website in text files, netCDF, and SOS. To aid users in obtaining information relevant to their applications, the presentation documents, in detail, the characteristics of the different types of real-time observations and the availability and organization of the resulting datasets at NDBC.
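As a usage illustration, the sketch below reads a per-station text product of the kind described above into a time series and crudely de-tides it. The file name, column layout, and missing-value code are assumptions for illustration; the actual layouts are documented on the NDBC station pages.

```python
import pandas as pd

# Sketch: load a whitespace-delimited station text file (assumed layout) and
# remove a long-window rolling mean so that a small tsunami signal would stand
# out against the tide on the deep-ocean water-column height record.

columns = ["YY", "MM", "DD", "hh", "mm", "ss", "T", "HEIGHT"]   # assumed layout
df = pd.read_csv("46404.dart.txt", sep=r"\s+", comment="#",
                 names=columns, na_values=[9999.0])             # assumed missing code

df["time"] = pd.to_datetime(df[["YY", "MM", "DD", "hh", "mm", "ss"]]
                            .rename(columns={"YY": "year", "MM": "month",
                                             "DD": "day", "hh": "hour",
                                             "mm": "minute", "ss": "second"}))
df = df.sort_values("time").set_index("time")
df["residual_m"] = df["HEIGHT"] - df["HEIGHT"].rolling("6h").mean()
print(df["residual_m"].abs().max())
```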
MARSite-MARMARA SUPERSITE: Accomplishments and Outlook
NASA Astrophysics Data System (ADS)
Meral Ozel, Nurcan; Necmioglu, Ocal; Ergintav, Semih; Oguz Ozel, Asım; Italiano, Franco; Favali, Paolo; Bigarre, Pascal; Cakir, Ziyadin; Geli, Louis; Aochi, Hideo; Bossu, Remy; Zulfikar, Can; Sesetyan, Karin
2017-04-01
The MARsite Project, funded under FP7-ENV.2012 6.4-2 (Grant 308417) and successfully implemented in the Marmara Region during 2014-2016, indicated that focusing on the monitoring of the region, integrating data from land, sea and space, and processing these combined data on the basis of sound earth-science research is an effective tool for mitigating damage from future earthquakes. This was achieved by monitoring the earthquake hazard through ground-shaking and forecast maps, short- and long-term earthquake rate forecasting, and time-dependent seismic hazard maps, in order to support important risk-mitigation decisions regarding building design, insurance rates, land-use planning, and public-policy issues that need to balance safety with economic and social interests. MARsite has demonstrated the power of using different sensors in the assessment of earthquake hazard. In addition to the more than 30 scientific publications within the MARsite Project framework, an innovative multidisciplinary borehole seismic observatory and a dilatometer have been installed within MARsite, whose data can be used for a range of seismic studies. Due to the encouraging results obtained from this experiment, it was determined that a smaller number of stations will likely be required in the future, reducing the cost of national seismic networks. The technical infrastructure of the continuous GPS stations of the MAGNET network has been updated within MARsite. Tsunami hazard studies in the Marmara Sea within MARsite showed that the tsunami hazard in the Marmara Region is primarily due to submarine landslides triggered by an earthquake, and a conceptual Tsunami Early Warning System for the Marmara region, strongly coupled with the strong ground motion and existing Earthquake Early Warning System, was developed. The existing Earthquake Early Warning and Rapid Response system in the Marmara Region was improved, and the installation and testing of a pilot seismic landslide monitoring system took place in the Avcilar-Beylikdüzü Peninsula, a large landslide-prone area located in the western part of Istanbul and facing the North Anatolian Fault Zone (NAFZ). An integrated approach based on multi-parameter seafloor observatories was implemented to continuously monitor the micro-seismicity along with the fluid expulsion activity within the submerged fault zone. During MARsite, strong integration and links were established with major European initiatives focused on the collection of multidisciplinary data, their dissemination, interpretation and fusion to produce consistent theoretical and practical models, the implementation of good practices so as to provide the necessary information to end users, and the updating of seismic hazard and risk evaluations in the Marmara region. In this perspective, to continue improving the understanding of and preparedness for geological disasters, the existing monitoring infrastructure of MARsite requires the continuation of a strong European initiative. This presentation will provide a venue for information exchange towards the establishment of such an initiative.
Physical Observations of the Tsunami during the September 8th 2017 Tehuantepec, Mexico Earthquake
NASA Astrophysics Data System (ADS)
Ramirez-Herrera, M. T.; Corona, N.; Ruiz-Angulo, A.; Melgar, D.; Zavala-Hidalgo, J.
2017-12-01
The September 8th 2017, Mw 8.2 earthquake offshore Chiapas, Mexico, is the largest earthquake in the recorded history of Chiapas since 1902. It caused damage in the states of Oaxaca, Chiapas and Tabasco; it caused more than 100 fatalities, over 1.5 million people were affected, and 41,000 homes were damaged in the state of Chiapas alone. This earthquake, a deep intraplate event on a normal fault within the subducting oceanic plate, generated a tsunami recorded at several tide gauge stations in Mexico and across the Pacific Ocean. Here we report the physical effects of the tsunami on the Chiapas coast and analyze the societal implications of this tsunami on the basis of our field observations. Tide gauge data indicate 11.3 and 8.2 cm of coastal subsidence at the Salina Cruz and Puerto Chiapas stations, respectively. The associated tsunami waves were recorded first at the Salina Cruz tide gauge station at 5:13 (GMT). We covered ground observations along 41 km of the coast of Chiapas, encompassing the sites with the highest projected wave heights based on the preliminary tsunami model (maximum tsunami amplitudes between longitudes 94.5 W and 93.0 W). Runup and inundation distances were measured with an RTK GPS and a Sokkia B40 level at 8 sites. We corrected the runup data with estimated astronomical tide levels at the time of the tsunami; the tsunami occurred at low tide. The maximum runup was 3 m at Boca del Cielo, and the maximum inundation distance was 190 m at Puerto Arista, corresponding to the coast directly opposite the epicenter in the central sector of the Gulf of Tehuantepec. In general, our field data agree with the predicted results from the preliminary tsunami model. Tsunami scour and erosion were evident on the Chiapas coast. Tsunami deposits, mainly sand, reached up to 32 cm in thickness, thinning landwards up to a distance of 172 m. Even though the Mexican tsunami early warning system (CAT) issued several warnings, the tsunami struck the Chiapas coast before official warnings reached the residents of small coastal towns, owing to the multi-ranked notification system. Thus, a tsunami early warning system with a direct warning to all coastal communities is needed. Some people evacuated on their own initiative, but others did not. Therefore, community-based education and awareness programs are needed.
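The tide correction mentioned above amounts to referring a runup surveyed against the instantaneous sea surface back to the sea level at tsunami arrival. The sketch below shows one plausible sign convention with invented numbers; it is not the authors' processing, only an illustration of the bookkeeping.

```python
# Toy tide correction for a surveyed runup. Numbers and sign convention are
# hypothetical illustrations, not the survey's actual values or procedure.

def corrected_runup(surveyed_runup_m: float,
                    tide_at_survey_m: float,
                    tide_at_tsunami_m: float) -> float:
    """Refer the surveyed elevation to the sea level at the time of the tsunami."""
    return surveyed_runup_m + (tide_at_survey_m - tide_at_tsunami_m)

# e.g. 2.7 m measured above the sea surface at survey time, with the tide 0.4 m
# higher at survey time than when the tsunami arrived (it struck at low tide):
print(corrected_runup(2.7, 0.1, -0.3))   # -> 3.1 m above tsunami-time sea level
```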
A Tsunami Model for Chile for (Re) Insurance Purposes
NASA Astrophysics Data System (ADS)
Arango, Cristina; Rara, Vaclav; Puncochar, Petr; Trendafiloski, Goran; Ewing, Chris; Podlaha, Adam; Vatvani, Deepak; van Ormondt, Maarten; Chandler, Adrian
2014-05-01
Catastrophe models help (re)insurers to understand the financial implications of catastrophic events such as earthquakes and tsunamis. In earthquake-prone regions such as Chile, (re)insurers need more sophisticated tools to quantify the risks facing their businesses, including models with the ability to estimate secondary losses. The 2010 (M8.8) Maule (Chile) earthquake highlighted the need for quantifying losses from secondary perils such as tsunamis, which can contribute to the overall event losses but are not often modelled. This paper presents some key modelling aspects of a new earthquake catastrophe model for Chile developed by Impact Forecasting in collaboration with Aon Benfield Research partners, focusing on the tsunami component. The model has the capability to model tsunami as a secondary peril: losses due to earthquake ground-shaking and induced tsunamis along the Chilean coast are quantified in a probabilistic manner, and also for historical scenarios. The model is implemented in the IF catastrophe modelling platform, ELEMENTS. The probabilistic modelling of earthquake-induced tsunamis uses a stochastic event set that is consistent with the seismic (ground-shaking) hazard developed for Chile, representing simulations of earthquake occurrence patterns for the region. Criteria for selecting tsunamigenic events from the stochastic event set are proposed which take into consideration earthquake location, depth, and the resulting seabed vertical displacement and tsunami inundation depths at the coast. The source modelling software RuptGen by Babeyko (2007) was used to calculate the static seabed vertical displacement resulting from earthquake slip. More than 3,600 events were selected for tsunami simulations. Deep and shallow water wave propagation is modelled using the Delft3D modelling suite, state-of-the-art software developed by Deltares. The Delft3D-FLOW module is used in 2-dimensional hydrodynamic simulation settings with non-steady flow. Earthquake-induced static seabed vertical displacement is used as an input boundary condition to the model. The model is hierarchically set up with three nested domain levels, with 250 domains in total covering the entire Chilean coast. Spatial grid-cell resolution is equal to the native SRTM resolution of approximately 90 m. In addition to the stochastic events, the 1960 (M9.5) Valdivia and 2010 (M8.8) Maule earthquakes are modelled. The modelled tsunami inundation map for the 2010 Maule event is validated through comparison with real observations. The vulnerability component consists of an extensive damage-curve database, including curves for buildings, contents and business interruption for 21 occupancies, 24 structural types and two secondary modifiers, building height and period of construction. The building damage curves are developed using a load-based method in which the building's capacity to resist tsunami loads is treated as equivalent to its design earthquake load capacity. The contents damage and business interruption curves are developed using a deductive approach, i.e., HAZUS flood vulnerability and business function restoration models are adapted for detailed occupancies and then assigned to the dominant structural types in Chile. The vulnerability component is validated through overall back-testing of the model using observed aggregated earthquake and tsunami losses for client portfolios from the 2010 Maule earthquake.
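At its core, the vulnerability step in a loss model of this kind interpolates a depth-damage curve at the modelled inundation depth and multiplies by the exposed value. The sketch below shows only that mechanic; the curve points, portfolio, and values are invented placeholders, not the Impact Forecasting curves described above.

```python
import numpy as np

# Apply a hypothetical depth-damage curve to a small hypothetical portfolio.

# mean damage ratio versus inundation depth (m) for one assumed occupancy class
depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
mdr_pts   = np.array([0.0, 0.05, 0.20, 0.45, 0.80, 1.00])

def loss(inundation_depth_m: float, replacement_value: float) -> float:
    mdr = np.interp(inundation_depth_m, depth_pts, mdr_pts)   # mean damage ratio
    return mdr * replacement_value

portfolio = [(1.5, 300_000.0), (0.3, 250_000.0), (5.0, 1_200_000.0)]  # (depth, value)
print(sum(loss(d, v) for d, v in portfolio))
```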
Community participation in tsunami early warning system in Pangandaran town
NASA Astrophysics Data System (ADS)
Hadian, Sapari D.; Khadijah, Ute Lies Siti; Saepudin, Encang; Budiono, Agung; Yuliawati, Ayu Krishna
2017-07-01
Disaster-resilient communities are communities capable of anticipating and minimizing destructive forces through adaptation. Disaster is an event very close to the people of Indonesia, especially in the small tourism town of Pangandaran, located in West Java, Indonesia. On July 17, 2006, the town was hit by a Mw 7.8 earthquake and tsunami that affected over 300 km of coastline; the community suffered losses in which more than 600 people were killed, with run-up heights exceeding 20 m. The devastation of the tsunami has made the community more alert, and together with the local government and other stakeholders it developed an Early Warning System for tsunamis. The study is intended to discover issues in the tsunami Early Warning System (EWS), the disaster risk reduction measures taken and community participation. The research method used is descriptive and explanatory research. The study describes the tsunami EWS and community-based Disaster Risk Reduction in Pangandaran, the implementation of the tsunami alert/EWS in disaster preparedness, and observations of community participation in the EWS. Data were gathered by secondary data collection, as well as primary data through interviews, focus group discussions and field observations. The research resulted in a description of EWS implementation, community participation and recommendations to reduce disaster risk in Pangandaran.
NASA Astrophysics Data System (ADS)
Strauch, W.; Talavera, E.; Acosta, N.; Sanchez, M.; Mejia, E.
2007-05-01
The Nicaraguan Pacific coast presents considerable tsunami risk. On September 1, 1992, a tsunami caused enormous damage to infrastructure and killed more than 170 people. A pilot project was conducted between 2006 and 2007 in the municipality of San Rafael del Sur, in the area of Masachapa. The project covered multiple topics of tsunami prevention and considered the direct participation of the local population, such as: general education on disaster prevention and participative events; investigation of the awareness level and information needs of different population groups; specific educational measures in the schools; publication of brochures, calendars, newspaper articles, radio programs and TV spots; development of local tsunami hazard maps at 1:5,000 scale (based on previous regional tsunami hazard mapping projects and local participation); development of a tsunami warning plan; improvements to the national tsunami warning system; installation of sirens for tsunami warning; installation of tsunami signs indicating hazardous areas, evacuation routes and safe places; and realization of evacuation drills in schools. Based on the experience gained in Masachapa, it is planned to run similar projects in other areas along the Nicaraguan Pacific coast. Project participants included the local municipality and local stakeholders of San Rafael del Sur, the Ministry of Education, the National Police, the Nicaraguan Red Cross, the Ministry of Health, the Ministry of Tourism, the Nicaraguan Geosciences Institute (INETER), the National System for Disaster Prevention (SINAPRED) and the Swiss Agency for Development and Cooperation (SDC). It was financed by SDC and INETER.
Tsunami Hazards - A National Threat
2006-01-01
In December 2004, when a tsunami killed more than 200,000 people in 11 countries around the Indian Ocean, the United States was reminded of its own tsunami risks. In fact, devastating tsunamis have struck North America before and are sure to strike again. Especially vulnerable are the five Pacific States--Hawaii, Alaska, Washington, Oregon, and California--and the U.S. Caribbean islands. In the wake of the Indian Ocean disaster, the United States is redoubling its efforts to assess the Nation's tsunami hazards, provide tsunami education, and improve its system for tsunami warning. The U.S. Geological Survey (USGS) is helping to meet these needs, in partnership with the National Oceanic and Atmospheric Administration (NOAA) and with coastal States and counties.
Complexity in Size, Recurrence and Source of Historical Earthquakes and Tsunamis in Central Chile
NASA Astrophysics Data System (ADS)
Cisternas, M.
2013-05-01
Central Chile has a 470-year-long written earthquake history, the longest of any part of the country. Thanks to the early and continuous Spanish settlement of this part of Chile (32°- 35° S), records document destructive earthquakes and tsunamis in 1575, 1647, 1730, 1822, 1906 and 1985. This sequence has promoted the idea that central Chile's large subduction inter-plate earthquakes recur at regular intervals of about 80 years. The last of these earthquakes, in 1985, was even forecast as filling a seismic gap on the thrust boundary between the subducting Nazca Plate and the overriding South America Plate. Following this logic, the next large earthquake in metropolitan Chile will not occur until late in the 21st century. However, here I challenge this conclusion by reporting recently discovered historical evidence in Spain, Japan, Peru, and Chile. This new evidence augments the historical catalog in central Chile, strongly suggests that one of these earthquakes previously assumed to occur on the inter-plate interface in fact occurred elsewhere, and forces the conclusion that another of these earthquakes (and its accompanying tsunami) dwarfed the others. These findings complicate the task of assessing the hazard of future earthquakes in Chile's most populated region.
First tsunami gravity wave detection in ionospheric radio occultation data
Coïsson, Pierdavide; Lognonné, Philippe; Walwer, Damian; ...
2015-05-09
After the 11 March 2011 earthquake and tsunami off the coast of Tohoku, the ionospheric signature of the displacements induced in the overlying atmosphere has been observed by ground stations in various regions of the Pacific Ocean. We analyze here the data of radio occultation satellites, detecting the tsunami-driven gravity wave for the first time using a fully space-based ionospheric observation system. One satellite of the Constellation Observing System for Meteorology, Ionosphere and Climate (COSMIC) recorded an occultation in the region above the tsunami 2.5 h after the earthquake. The ionosphere was sounded from top to bottom, thus providing the vertical structure of the gravity wave excited by the tsunami propagation, observed as oscillations of the ionospheric Total Electron Content (TEC). The observed vertical wavelength was about 50 km, with maximum amplitude exceeding 1 total electron content unit when the occultation reached 200 km height. We compared the observations with synthetic data obtained by summation of the tsunami-coupled gravity normal modes of the Earth/Ocean/atmosphere system, which models the associated motion of the ionosphere plasma. These results provide experimental constraints on the attenuation of the gravity wave with altitude due to atmosphere viscosity, improving the understanding of the propagation of tsunami-driven gravity waves in the upper atmosphere. They demonstrate that the amplitude of the tsunami can be estimated to within 20% by the recorded ionospheric data.
NASA Astrophysics Data System (ADS)
Nelson, A. R.; Briggs, R. W.; Kemp, A.; Haeussler, P. J.; Engelhart, S. E.; Dura, T.; Angster, S. J.; Bradley, L.
2012-12-01
Uncertainty in the earthquake and tsunami prehistory of the Aleutian-Alaska megathrust westward of central Kodiak Island limits assessments of southern Alaska's earthquake hazard and forecasts of potentially damaging tsunamis along much of North America's west coast. Sitkinak Island, one of the Trinity Islands off the southwest tip of Kodiak Island, lies at the western end of the rupture zone of the 1964 Mw9.2 earthquake. Plafker reports that a rancher on the north coast of Sitkinak Island observed ~0.6 m of shoreline uplift immediately following the 1964 earthquake, and the island is now subsiding at about 3 mm/yr (PBO GPS). Although a high tsunami in 1788 caused the relocation of the first Russian settlement on southwestern Kodiak Island, the eastern extent of the megathrust rupture accompanying the tsunami is uncertain. Interpretation of GPS observations from the Shumagin Islands, 380 km southwest of Kodiak Island, suggests an entirely to partially creeping megathrust in that region. Here we report the first stratigraphic evidence of tsunami inundation and land-level change during prehistoric earthquakes west of central Kodiak Island. Beneath tidal and freshwater marshes around a lagoon on the south coast of Sitkinak Island, 27 cores and tidal outcrops reveal the deposits of four to six tsunamis in 2200 years and two to four abrupt changes in lithology that may correspond with coseismic uplift and subsidence over the past millennia. A 2- to 45-mm-thick bed of clean to peaty sand in sequences of tidal sediment and freshwater peat, identified in more than one-half the cores as far inland as 1.5 km, was probably deposited by the 1788 tsunami. A 14C age on Scirpus seeds, double 137Cs peaks at 2 cm and 7 cm depths (Chernobyl and 1963?), a consistent decline in 210Pb values, and our assumption of an exponential compaction rate for freshwater peat point to a late 18th century age for the sand bed. Initial 14C ages suggest that two similar extensive sandy beds, identified in eight cores at higher tidal and freshwater sites, date from about 1.5 ka and 2.0 ka, respectively. A younger silty sand bed, <10 cm beneath the now-eroding low marsh around the lagoon, may record the 1964 tsunami. Correlations of two to three other sandy beds are too uncertain to infer their deposition by tsunamis. Stratigraphic contacts found only in cores and outcrops of the <0.8- to 1-ka tidal section fringing the lagoon may mark coseismic uplift (peat over tidal mud, sometimes with intervening sand) or subsidence (tidal mud over peat, sometimes with intervening sand). We collected samples of modern tidal foraminifera along three elevational transects for the baseline dataset needed to use fossil assemblages to measure the amount of uplift or subsidence recorded by contacts. Foraminiferal assemblages above and below one contact confirm rapid uplift a few hundred years before the 1788 tsunami, but cores are too few to correlate this contact with any of the sandy beds that we infer were deposited by tsunamis farther inland. These initial results demonstrate the promise of this previously unexplored island and similar sites for using stratigraphic evidence of sudden land-level changes and high tsunamis to map prehistoric ruptures of the Aleutian-Alaska megathrust.
Implementation of the TsunamiReady Supporter Program in Puerto Rico
NASA Astrophysics Data System (ADS)
Flores Hots, V. E.; Vanacore, E. A.; Gonzalez Ruiz, W.; Gomez, G.
2016-12-01
The Puerto Rico Seismic Network (PRSN) manages the PR Tsunami Program (NTHMP), including the TsunamiReady Supporter Program. Through this program the PRSN helps private organizations, businesses, facilities or local government entities to willingly engage in tsunami planning and preparedness that meet requirements established by the National Weather Service. TsunamiReady Supporter organizations are better prepared to respond to a tsunami emergency, developing a response plan (using a template that PRSN developed and provides) and reinforcing their communication systems, including NOAA radio, RSS and loudspeakers, to receive and disseminate the alerts issued by the NWS and the Tsunami Warning Centers (TWC). The planning and communication systems, together with the training that PRSN provides to staff and employees, are intended to help visitors and employees evacuate the tsunami hazard zone to the nearest assembly point, minimizing loss of life. Potential TsunamiReady Supporters include, but are not limited to, businesses, schools, churches, hospitals, malls, utilities, museums, beaches and harbors. However, the traditional targets for such programs are primarily tourism sites and hotels where people unaware of the tsunami hazard may be present. In 2016 the TsunamiReady Program guided four businesses to achieve the TsunamiReady Supporter recognition. Two facilities were hotels near or inside the evacuation zone. The other facilities were the first and only health center and supermarket to be recognized in the United States and US territories. Based on the experience of preparing the health center and supermarket sites, here we present two case studies of how the TsunamiReady Supporter Program can be applied to non-traditional facilities, as well as how the application of this program to such facilities can improve tsunami hazard mitigation. Currently, we are working on expanding the application of this program to non-traditional facilities by working with a banking facility located in a tsunami evacuation zone, increasing its capacity to manage a tsunami event and reinforcing the entity's involvement in developing a plan for its clients and employees to evacuate the area and head to a safe place.
Observing Traveling Ionospheric Disturbances Caused by Tsunamis Using GPS TEC Measurements
NASA Technical Reports Server (NTRS)
Galvan, David A.; Komjathy, Attila; Hickey, Michael; Foster, James; Mannucci, Anthony J.
2010-01-01
Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following two recent seismic events: the American Samoa earthquake of September 29, 2009, and the Chile earthquake of February 27, 2010. Fluctuations in TEC correlated in time, space, and wave properties with these tsunamis were observed in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with wavelengths and periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the tsunamis in certain locations, but not in others. Where variations are observed, the typical amplitude tends to be on the order of 1% of the background TEC value. Variations with amplitudes 0.1 - 0.2 TECU are observable with periods and timing affiliated with the tsunami. These observations are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement in some locations, though there are cases when the model predicts an observable tsunami-driven signature and none is observed. These TEC variations are not always seen when a tsunami is present, but in these two events the regions where a strong ocean tsunami was observed did coincide with clear TEC observations, while a lack of clear TEC observations coincided with smaller tsunami amplitudes. There exists the potential to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for early warning systems.
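As a minimal sketch of the kind of band-pass filtering described above (not JPL's Global Ionospheric Mapping software), the Python snippet below removes TEC variations with periods outside an assumed 10-30 minute band typical of tsunami-driven internal gravity waves; the cutoff periods, sample interval and filter order are illustrative assumptions.

# Hedged sketch: zero-phase band-pass of a TEC series to isolate periods
# assumed typical of tsunami-driven gravity waves (10-30 min here).
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_tec(tec, dt_s=30.0, t_short_s=600.0, t_long_s=1800.0, order=4):
    """Band-pass a TEC series sampled every dt_s seconds between two periods."""
    nyq = 0.5 / dt_s                                  # Nyquist frequency in Hz
    low, high = 1.0 / t_long_s, 1.0 / t_short_s
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, tec)

if __name__ == "__main__":
    t = np.arange(0, 6 * 3600, 30.0)                                  # six hours of 30 s samples
    background = 20.0 + 2.0 * np.sin(2 * np.pi * t / (8 * 3600))      # slow background trend
    wave = 0.15 * np.sin(2 * np.pi * t / 1200.0)                      # 20 min "tsunami" signature
    filtered = bandpass_tec(background + wave)
    print(f"band-passed amplitude ~ {filtered.std() * np.sqrt(2):.2f} TECU")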
SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2015-01-01
The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients, such as flux responses or reaction rate ratios, in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.
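For orientation only, the toy Python snippet below illustrates what a relative sensitivity coefficient S = (dk/k)/(dSigma/Sigma) means, using direct perturbation of a one-group infinite-medium eigenvalue; this is a conceptual sketch, not the adjoint-weighted IFP or CLUTCH estimators implemented in CE TSUNAMI-3D.

# Conceptual sketch only: TSUNAMI obtains sensitivity coefficients with
# adjoint-weighted Monte Carlo, but the quantity itself can be illustrated by
# direct perturbation of a toy eigenvalue k = nu_sigma_f / sigma_a.
def k_inf(nu_sigma_f, sigma_a):
    return nu_sigma_f / sigma_a

def sensitivity(nu_sigma_f, sigma_a, rel_perturbation=0.01):
    """Relative change in k per relative change in the absorption cross section."""
    k0 = k_inf(nu_sigma_f, sigma_a)
    k1 = k_inf(nu_sigma_f, sigma_a * (1.0 + rel_perturbation))
    return ((k1 - k0) / k0) / rel_perturbation

if __name__ == "__main__":
    # For k = nu_sigma_f / sigma_a the analytic sensitivity to sigma_a is -1.
    print(round(sensitivity(nu_sigma_f=0.0035, sigma_a=0.0030), 3))  # approx -0.99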
The 2017 México Tsunami Record, Numerical Modeling and Threat Assessment in Costa Rica
NASA Astrophysics Data System (ADS)
Chacón-Barrantes, Silvia
2018-03-01
An Mw 8.2 earthquake and tsunami occurred offshore the Pacific coast of México on 2017-09-08, at 04:49 UTC. Costa Rican tide gauges have registered a total of 21 local, regional and far-field tsunamis. The Quepos gauge registered 12 tsunamis between 1960 and 2014 before it was relocated inside a harbor by late 2014, where it registered two more tsunamis. This paper analyzes the 2017 México tsunami as recorded by the Quepos gauge. It took 2 h for the tsunami to arrive at Quepos, with a first peak height of 9.35 cm and a maximum amplitude of 18.8 cm occurring about 6 h later. As a decision support tool, this tsunami was modeled for Quepos in real time using ComMIT (Community Model Interface for Tsunami), with the finest grid having a resolution of 1 arcsec (approximately 30 m). However, the model did not replicate the tsunami record well, probably due to the lack of a finer and more accurate bathymetry. In 2014, the National Tsunami Monitoring System of Costa Rica (SINAMOT) was created, acting as a national tsunami warning center. The occurrence of the 2017 México tsunami raised concerns about warning dissemination mechanisms for most coastal communities in Costa Rica, due to its short travel time.
Helping coastal communities at risk from tsunamis: the role of U.S. Geological Survey research
Geist, Eric L.; Gelfenbaum, Guy R.; Jaffe, Bruce E.; Reid, Jane A.
2000-01-01
In 1946, 1960, and 1964, major tsunamis (giant sea waves usually caused by earthquakes or submarine landslides) struck coastal areas of the Pacific Ocean. In the U.S. alone, these tsunamis killed hundreds of people and caused many tens of millions of dollars in damage. Recent events in Papua New Guinea (1998) and elsewhere are reminders that a catastrophic tsunami could strike U.S. coasts at any time. The USGS, working closely with NOAA and other partners in the National Tsunami Hazard Mitigation Program, is helping to reduce losses from tsunamis through increased hazard assessment and improved real-time warning systems.
Tsunami risk mapping simulation for Malaysia
Teh, S.Y.; Koh, H. L.; Moh, Y.T.; De Angelis, D. L.; Jiang, J.
2011-01-01
The 26 December 2004 Andaman mega tsunami killed about a quarter of a million people worldwide. Since then several significant tsunamis have recurred in this region, including the most recent 25 October 2010 Mentawai tsunami. These tsunamis grimly remind us of the devastating destruction that a tsunami might inflict on the affected coastal communities. There is evidence that tsunamis of similar or higher magnitudes might occur again in the near future in this region. Of particular concern to Malaysia are tsunamigenic earthquakes occurring along the northern part of the Sunda Trench. Further, the Manila Trench in the South China Sea has been identified as another source of potential tsunamigenic earthquakes that might trigger large tsunamis. To protect coastal communities that might be affected by future tsunamis, an effective early warning system must be properly installed and maintained to provide adequate time for residents to be evacuated from risk zones. Affected communities must be prepared and educated in advance regarding tsunami risk zones, evacuation routes as well as an effective evacuation procedure that must be taken during a tsunami occurrence. For these purposes, tsunami risk zones must be identified and classified according to the levels of risk simulated. This paper presents an analysis of tsunami simulations for the South China Sea and the Andaman Sea for the purpose of developing a tsunami risk zone classification map for Malaysia based upon simulated maximum wave heights. © 2011 WIT Press.
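A minimal sketch of how simulated maximum wave heights could be binned into risk classes is given below; the class boundaries and labels are illustrative assumptions, not the classification actually used in the paper.

# Illustrative sketch only: classify coastal cells into tsunami risk zones from
# simulated maximum wave heights (boundaries below are assumptions).
def risk_zone(max_wave_height_m):
    if max_wave_height_m < 0.5:
        return "low"
    if max_wave_height_m < 2.0:
        return "moderate"
    if max_wave_height_m < 5.0:
        return "high"
    return "extreme"

if __name__ == "__main__":
    for h in (0.2, 1.4, 3.7, 6.5):
        print(f"{h} m -> {risk_zone(h)}")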
Identification of elements at risk for a credible tsunami event for Istanbul
NASA Astrophysics Data System (ADS)
Hancilar, U.
2012-01-01
Physical and social elements at risk are identified for a credible tsunami event for Istanbul. For this purpose, inundation maps resulting from probabilistic tsunami hazard analysis for a 10% probability of exceedance in 50 yr are utilised in combination with the geo-coded inventories of building stock, lifeline systems and demographic data. The built environment on Istanbul's shorelines that is exposed to tsunami inundation comprises residential, commercial, industrial, public (governmental/municipal, schools, hospitals, sports and religious), infrastructure (car parks, garages, fuel stations, electricity transformer buildings) and military buildings, as well as piers and ports, gas tanks and stations and other urban elements (e.g., recreational facilities). Along the Marmara Sea shore, the Tuzla shipyards and important port and petrochemical facilities at Ambarlı are expected to be exposed to tsunami hazard. Significant lifeline systems of the city of Istanbul, such as natural gas, electricity, telecommunication and sanitary and waste-water transmission, are also under the threat of tsunamis. In terms of social risk, it is estimated that there are about 32 000 inhabitants exposed to tsunami hazard.
NASA Astrophysics Data System (ADS)
Rakowsky, N.; Harig, S.; Androsov, A.; Fuchs, A.; Immerz, A.; Schröter, J.; Hiller, W.
2012-04-01
Starting in 2005, the GITEWS project (German-Indonesian Tsunami Early Warning System) established from scratch a fully operational tsunami warning system at BMKG in Jakarta. Numerical simulations of prototypic tsunami scenarios play a decisive role in a priori risk assessment for coastal regions and in the early warning process itself. Repositories with currently 3470 regional tsunami scenarios for GITEWS and 1780 Indian-Ocean-wide scenarios in support of Indonesia as a Regional Tsunami Service Provider (RTSP) were computed with the non-linear shallow water model TsunAWI. It is based on a finite element discretisation, employs unstructured grids with high resolution along the coast and includes inundation. This contribution gives an overview of the model itself, the enhancement of the model physics, and the experience gained during the process of establishing an operational code suited for thousands of model runs. Technical aspects like computation time, disk space needed for each scenario in the repository, or post-processing techniques have a much larger impact than they had in the beginning, when TsunAWI started as a research code. Of course, careful testing on artificial benchmarks and real events remains essential, but in addition, quality control for the large number of scenarios becomes an important issue.
Insights from interviews regarding high fatality rate caused by the 2011 Tohoku-Oki earthquake
NASA Astrophysics Data System (ADS)
Ando, M.; Ishida, M.
2012-12-01
The 11 March 2011 Tohoku-Oki earthquake (Mw 9.0) caused approximately 19,000 casualties, including missing persons, along the entire coast of the Tohoku region. Three historical tsunamis occurred in the 115 years preceding this one. Since those tsunamis, numerous countermeasures against future tsunamis, such as breakwaters, early tsunami warning systems and tsunami evacuation drills, were implemented. Despite the preparedness, a large number of deaths and missing persons occurred. Although this death toll is approximately 4% of the population in severely inundated areas, 96% safely evacuated or managed to survive the tsunami. To understand why some people evacuated immediately while others delayed, survivors were interviewed in the northern part of the Tohoku region. Our interviews revealed that many residents received no appropriate warnings and many chose to remain in dangerous locations, partly because they had a wrong idea of the risks. In addition, our interviews also indicated that the resultant high casualties were due to malfunction of current technology, underestimated earthquake size and tsunami heights, and failure of warning systems. Furthermore, the existing breakwaters gave the local community a false sense of security. The advanced technology did not work properly, especially at the time of the severe disaster. If residents had taken immediate action after the major shaking stopped, most local residents might have survived, considering that safer highlands are within 5 to 20 minutes' walking distance from the interviewed areas. However, the elderly and physically disabled people would still have had a much more difficult time covering such a distance to safety. Nevertheless, even if these problems occur in future earthquakes, better knowledge regarding earthquake and tsunami hazards could save more lives. People must take immediate action without waiting for official warnings or help. To avoid similarly high tsunami death ratios in the future, residents, including young children, should be taught the basic mechanism of tsunami generation. Such basic knowledge can lead local residents to evacuate sooner, enabling more people to survive a tsunami even if warning systems or other technology fail to function.
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Reißland, Sven; Schulz, Jana
2013-04-01
On November 27-28, 2012, the Kandilli Observatory and Earthquake Research Institute (KOERI) and the Portuguese Institute for the Sea and Atmosphere (IPMA) joined other countries in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region as participants in an international tsunami response exercise. The exercise, titled NEAMWave12, simulated widespread Tsunami Watch situations throughout the NEAM region. It was the first international exercise of its kind in this region, in which the UNESCO-IOC ICG/NEAMTWS tsunami warning chain was tested at full scale with different systems. One of these systems is being developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) and has been validated in this exercise, among others, by KOERI and IPMA. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. Successively, the demonstrators address related challenges. The first- and second-phase system demonstrators, deployed at KOERI's crisis management room and at IPMA, have been designed and implemented, firstly, to support plausible scenarios for the Turkish NTWC and for the Portuguese NTWC and to demonstrate the treatment of simulated tsunami threats with an essential subset of an NTWC. Secondly, the feasibility and the potential of the implemented approach are demonstrated, covering ICG/NEAMTWS standard operations as well as tsunami detection and alerting functions beyond ICG/NEAMTWS requirements. The demonstrator presented addresses information management and decision-support processes for hypothetical tsunami-related crisis situations in the context of the ICG/NEAMTWS NEAMWave12 exercise for the Turkish and Portuguese tsunami exercise scenarios. Impressions gained with the standards-compliant TRIDEC system during the exercise will be reported. The system version presented is based on event-driven architecture (EDA) and service-oriented architecture (SOA) concepts and makes use of relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using the OGC Web Map Service (WMS) and Web Feature Service (WFS), spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP), together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE).
This demonstration is linked with the talk 'Experiences with TRIDEC's Crisis Management Demonstrator in the Turkish NEAMWave12 exercise tsunami scenario' (EGU2013-2833) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.6).
NASA Astrophysics Data System (ADS)
Shinohara, M.; Yamada, T.; Sakai, S.; Shiobara, H.; Kanazawa, T.
2014-12-01
A seismic and tsunami observation system using a seafloor optical fiber cable was installed off Sanriku, northeastern Japan, in 1996. The objectives of the system are to obtain exact seismic activity related to plate subduction and to observe tsunamis on the seafloor. Continuous real-time observation has been carried out since the installation. In March 2011, the Tohoku earthquake occurred at the plate boundary near the Japan Trench, and the system recorded the seismic waves and tsunamis generated by the mainshock. These data are useful for obtaining the accurate position of the source faults and the source region of the tsunami generated by the event. However, the landing station of the system was damaged by the huge tsunami, and the observation was suspended. Because real-time seafloor observation by a cabled system is important in this region, we decided to reconstruct a landing station and install the newly developed Ocean Bottom Cabled Seismic and Tsunami (OBCST) observation system for additional observation and/or replacement of the existing system. Since 2005, we have been developing a new compact Ocean Bottom Cabled Seismometer (OBCS) system using Information and Communication Technology (ICT). Our system is characterized by reliability secured through TCP/IP technology and by down-sized observation nodes using up-to-date electronics technology. In 2010, the first OBCS was installed near Awashima Island in the Japan Sea and is being operated continuously. The new OBCST system is positioned as the second generation of our system and has two types of observation nodes. Both types have accelerometers as seismic sensors. One type of observation node is equipped with a crystal-oscillator-type pressure gauge as a tsunami sensor. The other type has an external port for an additional observation sensor using Power over Ethernet technology. Clocks in the observation nodes can be synchronized through the TCP/IP protocol with an accuracy of 300 ns (IEEE 1588). A simple canister designed for telecommunication seafloor cables is adopted for the observation node; it has a diameter of 26 cm and a length of about 1.3 m. At present, we are producing a practical OBCST system with a total length of approximately 100 km and three observation nodes. We plan to install the practical system in 2015.
Response to the 2011 Great East Japan Earthquake and Tsunami disaster.
Koshimura, Shunichi; Shuto, Nobuo
2015-10-28
We revisited the lessons of the 2011 Great East Japan Earthquake and Tsunami disaster, specifically regarding the response and impact, and discussed the paradigm shift in Japan's tsunami disaster management policies and the perspectives for reconstruction. Revisiting the modern history of Tohoku tsunami disasters and pre-2011 tsunami countermeasures, we clarified how Japan's coastal communities have prepared for tsunamis. The discussion mainly focuses on structural measures such as seawalls and breakwaters and non-structural measures such as hazard maps and evacuation. The responses to the 2011 event are discussed specifically with regard to the tsunami warning system and efforts to identify the tsunami impacts. The nation-wide post-tsunami survey results shed light on the mechanisms of structural destruction, tsunami loads and structural vulnerability to inform structural rehabilitation measures and land-use planning. Remarkable paradigm shifts in designing coastal protection and disaster mitigation measures were introduced, led by a new concept of potential tsunami levels: Prevention (Level 1) and Mitigation (Level 2) levels, according to the level of 'protection'. The seawall is designed with reference to the Level 1 tsunami scenario, while comprehensive disaster management measures should refer to the Level 2 tsunami for protection of human lives and reduction of potential losses and damage. Through a case study in Sendai city, the proposed reconstruction plan was evaluated from the tsunami engineering point of view to discuss how the post-2011 paradigm was implemented in coastal communities for future disaster mitigation. The analysis revealed that Sendai city's multiple protection measures for the Level 2 tsunami will contribute to a substantial reduction of the tsunami inundation zone and potential losses, combined with an effective tsunami evacuation plan. © 2015 The Author(s).
How did the AD 1755 tsunami impact on sand barriers across the southern coast of Portugal?
NASA Astrophysics Data System (ADS)
Costa, Pedro J. M.; Costas, Susana; González-Villanueva, R.; Oliveira, M. A.; Roelvink, D.; Andrade, C.; Freitas, M. C.; Cunha, P. P.; Martins, A.; Buylaert, J.-P.; Murray, A.
2016-09-01
Tsunamis are highly energetic events that may destructively impact the coast. Resolving the degree of coastal resilience to tsunamis is extremely difficult and sometimes impossible. In part, our understanding is constrained by the limited number of contemporaneous examples and by the high dynamism of coastal systems. In fact, long-term changes of coastal systems can mask the evidence of past tsunamis, leaving us a short or incomplete sedimentary archive. Here, we present a multidisciplinary approach involving sedimentological, geomorphological and geophysical analyses and numerical modelling of the AD 1755 tsunami flood on a coastal segment located within the southern coast of Portugal. In particular, the work focuses on deciphering the impact of the tsunami waves over a coastal sand barrier enclosing two lowlands largely inundated by the tsunami flood. Erosional features documented by geophysical data were assigned to the AD 1755 event with the support of sedimentological and age estimation results. Furthermore, these features allowed the calibration of the simulation settings to reconstruct the local conditions and establish the run-up range of the AD 1755 tsunami when it hit this coast (6-8 m above mean sea level). Our work highlights the usefulness of erosional imprints preserved in the sediment record to interpret the impact of extreme events on sand barriers.
Nowcasting Earthquakes and Tsunamis
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Turcotte, D. L.
2017-12-01
The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk. As another application, we can define large rectangular regions of subduction zones and shallow depths to compute the progress of the fault zone towards the next major tsunami-genic earthquake. We can then rank the relative progress of the major subduction zones of the world through their cycles of large earthquakes using this method to determine which zones are most at risk.
On-line applications of numerical models in the Black Sea GIS
NASA Astrophysics Data System (ADS)
Zhuk, E.; Khaliulin, A.; Zodiatis, G.; Nikolaidis, A.; Nikolaidis, M.; Stylianou, Stavros
2017-09-01
The Black Sea Geographical Information System (GIS) is developed based on cutting-edge information technologies and provides automated data processing and visualization on-line. Mapserver is used as the mapping service; the data are stored in a MySQL DBMS; PHP and Python modules are utilized for data access, processing, and exchange. New numerical models can be incorporated in the GIS environment as individual software modules, compiled for a server-based operating system, providing interaction with the GIS. A common interface allows setting the input parameters; the model then performs the calculation of the output data in specifically predefined files and formats. The calculation results are then passed to the GIS for visualization. Initially, a test scenario of integration of a numerical model into the GIS was performed, using software developed to describe two-dimensional tsunami propagation in a basin of variable depth, based on a linear long surface wave model which is valid for depths greater than 5 m. Furthermore, the well-established oil spill and trajectory 3-D model MEDSLIK (http://www.oceanography.ucy.ac.cy/medslik/) was integrated into the GIS with more advanced GIS functionality and capabilities. MEDSLIK is able to forecast and hindcast the trajectories of oil pollution and floating objects by using meteo-ocean data and the state of the oil spill. The MEDSLIK module interface allows a user to enter all the necessary oil spill parameters, i.e. date and time, rate of spill or spill volume, forecasting time, coordinates, oil spill type, currents, wind, and waves, as well as the specification of the output parameters. The entered data are passed on to MEDSLIK; the oil pollution characteristics are then calculated for pre-defined time steps. The results of the forecast or hindcast are then visualized on a map.
Performances of the New Real Time Tsunami Detection Algorithm applied to tide gauges data
NASA Astrophysics Data System (ADS)
Chierici, F.; Embriaco, D.; Morucci, S.
2017-12-01
Real-time tsunami detection algorithms play a key role in any Tsunami Early Warning System. We have developed a new algorithm for tsunami detection (TDA) based on real-time tide removal and real-time band-pass filtering of seabed pressure time series acquired by Bottom Pressure Recorders. The TDA algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability with respect to the most widely used tsunami detection algorithm, while containing the computational cost. The algorithm is designed to be used also in autonomous early warning systems, with a set of input parameters and procedures which can be reconfigured in real time. We have also developed a methodology based on Monte Carlo simulations to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. In this work we present the performance of the TDA algorithm applied to tide gauge data. We have adapted the new tsunami detection algorithm and the Monte Carlo test methodology to tide gauges. Sea level data acquired by coastal tide gauges in different locations and environmental conditions have been used in order to consider real working scenarios in the test. We also present an application of the algorithm to the tsunami generated by the Tohoku earthquake on 11 March 2011, using data recorded by several tide gauges scattered all over the Pacific area.
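As a minimal sketch of the general detection idea (real-time tide removal followed by thresholding), the Python snippet below subtracts a running-mean tide estimate from a sea level record and flags samples whose residual exceeds a threshold; the window length, threshold and synthetic record are illustrative assumptions and this is not the TDA implementation itself.

# Hedged sketch: crude tide removal by running mean, then threshold detection.
import numpy as np

def detect_tsunami(level_m, dt_s=15.0, tide_window_s=3600.0, threshold_m=0.05):
    """Return sample indices where |level - running tide estimate| exceeds threshold_m."""
    n = max(1, int(tide_window_s / dt_s))
    kernel = np.ones(n) / n
    tide = np.convolve(level_m, kernel, mode="same")   # crude tide proxy
    residual = level_m - tide
    half = n // 2
    if half:
        residual[:half] = 0.0          # ignore filter spin-up at the record edges
        residual[-half:] = 0.0
    return np.flatnonzero(np.abs(residual) > threshold_m)

if __name__ == "__main__":
    t = np.arange(0, 12 * 3600, 15.0)
    tide = 0.8 * np.sin(2 * np.pi * t / (12.42 * 3600))        # semidiurnal tide, 0.8 m amplitude
    tsunami = 0.10 * np.exp(-((t - 6 * 3600) / 600.0) ** 2)    # 10 cm pulse arriving at hour 6
    hits = detect_tsunami(tide + tsunami)
    if hits.size:
        print(f"first detection at t = {t[hits[0]] / 3600.0:.2f} h")
    else:
        print("no detection")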
Development of real-time mobile-buoy observation system for tsunami and crustal movement
NASA Astrophysics Data System (ADS)
Takahashi, N.; Ishihara, Y.; Fukuda, T.; Tahara, J.; Ochi, H.; Mori, T.; Deguchi, M.; Kido, M.; Ohta, Y.; Hino, R.; Mutoh, K.; Hashimoto, G.; Motohashi, O.; Kaneda, Y.
2014-12-01
We have been developing a real-time buoy system for tsunami and crustal movement observation since 2012. Our motivations are the need for crustal movement data in the horizontal as well as the vertical component, real-time data transmission for forecasting the next large earthquake, and the need for a relatively simple system compared with seafloor cabled networks. We are therefore developing such a system based on a buoy, designed for long-term observation of approximately two years. Our system's characteristics are real-time observation, separation between tsunami and crustal movement, mobility, and environmental compatibility. Tsunami and crustal movement data are sent at intervals of an hour and a week, respectively, in real time, and we can also obtain them on demand via satellite transmission from the land station. We observe tsunamis using a pressure sensor and a PPP (precise point positioning) navigation system on the buoy; therefore, tsunami and vertical crustal deformation are separated in real time. The horizontal component of the crustal deformation is measured by acoustic ranging between the buoy and six seafloor transponders. Our system can be used under strong sea currents with speeds of up to 5.5 knots owing to the adoption of slack mooring. Therefore, we can deploy it without consideration of the sea current. In addition, the geometry, including the size of the buoy, the lengths of the mooring ropes, and the capacity of the electric battery, is tuned to the environment of the deployment location. Through two sea trials, we have been confirming each function. In this presentation, we introduce the outline and results of the sea trials.
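A hedged sketch of the separation idea described above is given below: the seafloor pressure record constrains the water-column thickness, the buoy's GNSS-PPP height gives the sea-surface height, and their difference isolates vertical seafloor motion while the GNSS height carries the tsunami signal. The constants and the synthetic example are illustrative assumptions, not the actual buoy processing.

# Hedged sketch of separating tsunami from vertical crustal motion.
RHO_SEAWATER = 1025.0   # kg/m^3 (assumed)
G = 9.81                # m/s^2

def separate(pressure_pa, gnss_ssh_m, p_ref_pa, ssh_ref_m):
    """Return (tsunami_m, seafloor_uplift_m) relative to a pre-event reference."""
    water_column_m = (pressure_pa - p_ref_pa) / (RHO_SEAWATER * G)  # change in column thickness
    tsunami_m = gnss_ssh_m - ssh_ref_m                              # sea-surface change
    seafloor_uplift_m = tsunami_m - water_column_m                  # part not explained by the column change
    return tsunami_m, seafloor_uplift_m

if __name__ == "__main__":
    # 0.3 m tsunami arriving over a seafloor uplifted 0.5 m coseismically.
    p_ref = RHO_SEAWATER * G * 4000.0
    pressure = p_ref + RHO_SEAWATER * G * (0.3 - 0.5)   # column thins by uplift, thickens by tsunami
    tsunami, uplift = separate(pressure, gnss_ssh_m=0.3, p_ref_pa=p_ref, ssh_ref_m=0.0)
    print(f"tsunami ~ {tsunami:.2f} m, seafloor uplift ~ {uplift:.2f} m")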
NASA Astrophysics Data System (ADS)
Schneider, Bastian; Hoffmann, Gösta; Reicherter, Klaus
2016-04-01
Knowledge of tsunami risk and vulnerability is essential to establish a well-adapted Multi-Hazard Early Warning System, land-use planning and emergency management. As the tsunami risk for the coastline of Oman is still under discussion and remains enigmatic, various scenarios based on historical tsunamis were created. The suggested inundation and run-up heights were projected onto the modern infrastructural setting of the Muscat Capital Area. Furthermore, possible impacts of the worst-case tsunami event for Muscat are discussed. The approved Papathoma Tsunami Vulnerability Assessment Model was used to model the structural vulnerability of the infrastructure for a 2 m tsunami scenario, representing the 1945 tsunami, and for a 5 m tsunami in Muscat. Considering structural vulnerability, the results suggest a minor tsunami risk for the 2 m tsunami scenario, as the flooding is mainly confined to beaches and wadis. Traditional brick buildings in particular, still predominant in numerous rural suburbs, and a predominantly coast-parallel road network lead to an increased tsunami risk. In contrast, the 5 m tsunami scenario reveals extensively inundated areas, with up to 48% of the buildings flooded, and consequently a significantly higher tsunami risk. We expect up to 60,000 damaged buildings and up to 380,000 residents directly affected in the Muscat Capital Area, accompanied by a significant loss of life and damage to vital infrastructure. The rapid urbanization processes in the Muscat Capital Area, predominantly in areas along the coast, in combination with infrastructural, demographic and economic growth, will further increase the tsunami risk and therefore emphasize the importance of tsunami risk assessment in Oman.
NASA Astrophysics Data System (ADS)
Manta, F.; Feng, L.; Occhipinti, G.; Taisne, B.; Hill, E.
2017-12-01
Tsunami earthquakes generate tsunamis larger than expected for their seismic magnitude. They rupture the shallow megathrust, which is usually at a significant distance from land-based monitoring networks. This distance presents a challenge in accurately estimating the magnitude and source extent of tsunami earthquakes. Whether these parameters can be estimated reliably is critical to the success of tsunami early warning systems. In this work, we investigate the potential role of GNSS-observed ionospheric total electron content (TEC) in discriminating tsunami earthquakes, by introducing for the first time the TEC Intensity Index (TECII) for rapidly identifying tsunamigenic earthquakes. We examine two Mw 7.8 megathrust events along the Sumatran subduction zone with data from the Sumatran GPS Array (SuGAr). Both events triggered a tsunami alert that was later canceled. The Banyaks event (April 6th, 2010) did not generate a tsunami and caused only minor earthquake-related damage to infrastructure. On the contrary, the Mentawai event (October 25th, 2010) produced a large tsunami with run-up heights of >16 m along the southwestern coasts of the Pagai Islands. The tsunami claimed more than 400 lives. The primary difference between the two events was the depth of rupture: the Mentawai event ruptured a very shallow (<6 km) portion of the Sunda megathrust, while the Banyaks event ruptured a deeper portion (20-30 km). While we identify only a minor ionospheric signature of the Banyaks event (TECII = 1.05), we identify a strong characteristic acoustic-gravity wave only 8 minutes after the Mentawai earthquake (TECII = 1.14) and a characteristic signature of a tsunami 40 minutes after the event. These two signals reveal the large surface displacement at the rupture and the consequent destructive tsunami. This comparative study of two earthquakes with the same magnitude at different depths highlights the potential role of ionospheric monitoring by GNSS for tsunami early warning systems.
Stand-alone tsunami alarm equipment
NASA Astrophysics Data System (ADS)
Katsumata, Akio; Hayashi, Yutaka; Miyaoka, Kazuki; Tsushima, Hiroaki; Baba, Toshitaka; Catalán, Patricio A.; Zelaya, Cecilia; Riquelme Vasquez, Felipe; Sanchez-Olavarria, Rodrigo; Barrientos, Sergio
2017-05-01
One of the quickest means of tsunami evacuation is transfer to higher ground soon after strong and long ground shaking. Ground shaking itself is a good initiator of evacuation from a disastrous tsunami. Longer-period seismic waves are considered to be more correlated with the earthquake magnitude. We investigated the possible application of this to a tsunami hazard alarm using single-site ground motion observation. Information from the mass media is sometimes unavailable due to power failure soon after a large earthquake. Even when an official alarm is available, multiple information sources for tsunami alerts would help people become aware of the coming risk of a tsunami. Thus, a device that indicates the risk of a tsunami without requiring other data would be helpful to those who should evacuate. Since the sensitivity of a low-cost MEMS (microelectromechanical systems) accelerometer is sufficient for this purpose, tsunami alarm equipment for home use may be easily realized. The amplitude of long-period (20 s cutoff) displacement was proposed as the threshold for the alarm, based on empirical relationships among magnitude, tsunami height, hypocentral distance, and peak ground displacement of seismic waves. Application of this method to recent major earthquakes indicated that such equipment could effectively alert people to the possibility of a tsunami.
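An illustrative Python sketch of the single-site threshold idea is given below: it double-integrates a MEMS acceleration record to displacement, low-passes it at the 20 s corner mentioned above, and raises an alarm when the peak long-period displacement exceeds a threshold. The drift handling, threshold value and synthetic record are assumptions, not the authors' implementation.

# Hedged sketch of a single-site long-period displacement alarm.
import numpy as np
from scipy.signal import butter, detrend, filtfilt

def long_period_displacement(acc_ms2, dt_s=0.01, cutoff_hz=0.05, order=2):
    """Double-integrate acceleration and keep periods longer than 1 / cutoff_hz seconds."""
    vel = np.cumsum(detrend(acc_ms2)) * dt_s    # crude drift control before each integration
    disp = np.cumsum(detrend(vel)) * dt_s
    b, a = butter(order, cutoff_hz, btype="low", fs=1.0 / dt_s)
    return filtfilt(b, a, disp)

def tsunami_alarm(acc_ms2, dt_s=0.01, threshold_m=0.10):
    """True if the peak long-period displacement exceeds the (assumed) threshold."""
    return bool(np.max(np.abs(long_period_displacement(acc_ms2, dt_s))) > threshold_m)

if __name__ == "__main__":
    t = np.arange(0, 300, 0.01)
    # Toy record: a long-period pulse (as from a great earthquake) plus short-period shaking.
    acc = 0.02 * np.sin(2 * np.pi * t / 40.0) + 0.5 * np.sin(2 * np.pi * 2.0 * t)
    print("alarm:", tsunami_alarm(acc))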
Near Field Modeling for the Maule Tsunami from DART, GPS and Finite Fault Solutions (Invited)
NASA Astrophysics Data System (ADS)
Arcas, D.; Chamberlin, C.; Lagos, M.; Ramirez-Herrera, M.; Tang, L.; Wei, Y.
2010-12-01
The earthquake and tsunami of February 27, 2010 in central Chile have rekindled interest in developing techniques to predict the impact of near-field tsunamis along the Chilean coastline. Following the earthquake, several initiatives were proposed to increase the density of seismic, pressure and motion sensors along the South American trench, in order to provide field data that could be used to estimate tsunami impact on the coast. However, the precise use of those data in the elaboration of a quantitative assessment of coastal tsunami damage has not been clarified. The present work makes use of seismic data, Deep-ocean Assessment and Reporting of Tsunamis (DART®) systems, and GPS measurements obtained during the Maule earthquake to initialize a number of tsunami inundation models along the rupture area by expressing different versions of the seismic crustal deformation in terms of NOAA's tsunami unit source functions. Translation of all available real-time data into a feasible tsunami source is essential in near-field tsunami impact prediction, in which an impact assessment must be generated under very stringent time constraints. Inundation results from each source are then contrasted with field and tide gauge data by comparing arrival time, maximum wave height, maximum inundation and tsunami decay rate, using field data collected by the authors.
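A minimal sketch of the unit-source combination idea is given below, assuming precomputed deep-ocean waveforms for a few unit sources at a DART site and a non-negative least-squares fit to the observed record; this is only an illustration of the concept, not NOAA's operational inversion.

# Hedged sketch: fit a non-negative combination of unit-source waveforms to a
# synthetic observed record; the same weights would then scale the unit
# sources' coastal predictions.
import numpy as np
from scipy.optimize import nnls

def invert_source_weights(unit_waveforms, observed):
    """unit_waveforms: (n_sources, n_samples) waveforms precomputed at the observation site."""
    weights, _ = nnls(np.asarray(unit_waveforms).T, np.asarray(observed))
    return weights

if __name__ == "__main__":
    t = np.linspace(0, 4 * 3600, 400)
    g1 = 0.05 * np.exp(-((t - 3600) / 900) ** 2)       # unit source 1 seen at the gauge
    g2 = 0.05 * np.exp(-((t - 5400) / 900) ** 2)       # unit source 2 seen at the gauge
    observed = 2.0 * g1 + 0.5 * g2                     # synthetic "observed" record
    print(np.round(invert_source_weights([g1, g2], observed), 2))   # approx [2.0, 0.5]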
Tsunami risk zoning in south-central Chile
NASA Astrophysics Data System (ADS)
Lagos, M.
2010-12-01
The recent 2010 Chilean tsunami revealed the need to optimize methodologies for assessing the risk of disaster. In this context, modern techniques and criteria for the evaluation of the tsunami phenomenon were applied in the coastal zone of south-central Chile as a specific methodology for the zoning of tsunami risk. This methodology allows the identification and validation of a tsunami hazard scenario, the spatialization of factors that have an impact on the risk, and the zoning of the tsunami risk. For the hazard evaluation, different scenarios were modeled by means of numerical simulation techniques, selecting and validating the results that best fit the observed tsunami data. Hydrodynamic parameters of the inundation as well as physical and socioeconomic vulnerability aspects were considered for the spatialization of the factors that affect the tsunami risk. The tsunami risk zoning was integrated into a Geographic Information System (GIS) by means of multicriteria evaluation (MCE). The results of the tsunami risk zoning show that the local characteristics and their location, together with the concentration of poverty levels, establish spatially differentiated risk levels. This information forms the basis for future applied studies in land-use planning that tend to minimize the risk levels associated with the tsunami hazard. This research is supported by Fondecyt 11090210.
Errors in Tsunami Source Estimation from Tide Gauges
NASA Astrophysics Data System (ADS)
Arcas, D.
2012-12-01
Linearity of tsunami waves in deep water can be assessed by comparing the flow speed u to the wave propagation speed √(gh). In real tsunami scenarios this evaluation becomes impractical due to the absence of observational data on tsunami flow velocities in shallow water. Consequently, the extent of validity of the linear regime in the ocean is unclear. Linearity is the fundamental assumption behind tsunami source inversion processes based on linear combinations of unit propagation runs from a deep-water propagation database (Gica et al., 2008). The primary tsunami elevation data for such inversions are usually provided by National Oceanic and Atmospheric Administration (NOAA) deep-water tsunami detection systems known as DART. The use of tide gauge data for such inversions is more controversial due to the uncertainty of wave linearity at the depth of the tide gauge site. This study demonstrates the inaccuracies incurred in source estimation when tide gauge data are used in conjunction with a linear combination procedure for tsunami source estimation.
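A short sketch of the linearity check is given below: for a linear long wave of height η in depth h, the flow speed is approximately η√(g/h), so the ratio u/√(gh) reduces to η/h. The example depths and wave height are illustrative.

# Hedged sketch: the u / sqrt(g*h) ratio for a linear long wave reduces to eta/h.
import math

def linearity_ratio(eta_m, depth_m, g=9.81):
    """Ratio of particle speed to long-wave celerity for a linear long wave of height eta_m."""
    u = eta_m * math.sqrt(g / depth_m)
    return u / math.sqrt(g * depth_m)      # equals eta_m / depth_m

if __name__ == "__main__":
    for depth in (4000.0, 100.0, 10.0):
        print(f"eta = 0.5 m, h = {depth:6.0f} m -> u/sqrt(gh) = {linearity_ratio(0.5, depth):.4f}")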
The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program (PR-NTHMP)
NASA Astrophysics Data System (ADS)
Vanacore, E. A.; Huerfano Moreno, V. A.; Lopez, A. M.
2015-12-01
The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. Of particular interest is the Puerto Rico - Virgin Islands (PRVI) region, where the proximity of the coast to prominent tectonic faults would result in near-field tsunamis. Tsunami hazard assessment, detection capabilities, warning, education and outreach efforts are common tools intended to reduce loss of life and property. It is for these reasons that the PRSN is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the NTHMP. This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. In order for threatened communities in PR to be recognized as TsunamiReady by the US NWS, the PR component of the NTHMP has identified and modeled sources for local, regional and tele-tsunamis, and the results of the simulations have been used to develop tsunami response plans. The main goal of the PR-NTHMP is to strengthen resilient coastal communities that are prepared for tsunami hazards and to have PR recognized as TsunamiReady. Evacuation maps were generated in three phases: first, hypothetical tsunami scenarios of potential underwater earthquakes were developed; these scenarios were then modeled during the second phase. The third phase consisted of determining the worst-case scenario based on the Maximum of Maximums (MOM). Inundation and evacuation zones were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Maps and related evacuation products, like evacuation times, can be accessed online via the PR Tsunami Decision Support Tool. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocols and criteria in the PR/VI were updated. This paper describes the PR-NTHMP's recent outcomes, including real-time monitoring as well as the protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education and outreach efforts in Puerto Rico.
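A minimal sketch of the Maximum of Maximums step is given below: cell by cell, the largest simulated inundation depth across all scenarios defines the worst-case surface. The array shapes and values are illustrative assumptions.

# Hedged sketch: cell-wise maximum over per-scenario maximum inundation grids.
import numpy as np

def maximum_of_maximums(scenario_grids):
    """scenario_grids: iterable of 2-D arrays of maximum inundation depth per scenario."""
    return np.maximum.reduce([np.asarray(g) for g in scenario_grids])

if __name__ == "__main__":
    s1 = np.array([[0.0, 1.2], [0.4, 2.5]])
    s2 = np.array([[0.3, 0.8], [1.9, 2.1]])
    print(maximum_of_maximums([s1, s2]))   # [[0.3, 1.2], [1.9, 2.5]]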
New method to determine initial surface water displacement at tsunami source
NASA Astrophysics Data System (ADS)
Lavrentyev, Mikhail; Romanenko, Alexey; Tatarintsev, Pavel
2013-04-01
On Friday, March 11, 2011, at 05:46:23 UTC, Japan was struck by a magnitude 8.9 earthquake near its northeastern coast. This is one of the largest earthquakes that Japan has ever experienced. Tsunami waves swept away houses and cars and caused massive human losses. To predict tsunami wave parameters better and faster, we propose to improve the data inversion scheme and to speed up data processing. One of the reasons for inaccurate predictions of tsunami parameters is that very little information is available about the initial disturbance of the sea bed at the tsunami source. In this paper, we suggest a new way of improving the quality of tsunami source parameter prediction. Modern computational technologies can accurately calculate tsunami wave propagation over the deep ocean provided that the initial displacement (the perturbation of the sea bed at the tsunami source) is known [4]. Direct geophysical measurements provide the location of an earthquake hypocenter and its magnitude (an evaluation of the released energy). Among the methods for determining the initial displacement, the following should be considered. Calculation through the known fault structure and available seismic information: this method is widely used and provides useful information; however, even if the rock block shifts are known exactly, they must still be recalculated in terms of sea bed displacement, which introduces errors. GPS data analysis: this method was developed after the December 2004 event in the Indian Ocean; a good correlation between land-based GPS sensors and tsunami wave parameters was observed in the particular case of the west coast of Sumatra, Indonesia, but the approach is rather unique and can hardly be used in other locations. Satellite image analysis: the resolution of modern satellite images has dramatically improved, and in the future accurate data on sea surface displacement will probably be available in real time, right after a tsunamigenic earthquake; however, this is not yet possible today. Ground-based sea radars: these are an effective tool for direct measurement of the tsunami wave, but the wave is measured only over a rather narrow area in front of the radar, without information about neighboring parts of the wave. Direct measurement of the tsunami wave in deep water [2]: today, this technology is certainly among the most useful and promising. The DART II® system consists of a seafloor bottom pressure recording (BPR) system, capable of detecting tsunamis as small as 1 cm, and a moored surface buoy for real-time communications. We focus our research on improving the latter method, direct measurement of the tsunami wave in deep water. We suggest a new way to analyze DART data, modifying the methodology originally proposed by V. Titov. A smaller system of unit sources [3] should be considered, approximating all typical shapes of the initial disturbance by several suitable basis functions. To implement this successfully, the performance of the data analysis should be dramatically improved. This can be done by applying a signal orthogonalization procedure to the considered system of unit sources and calculating Fourier coefficients of the measured time series with respect to the orthogonal basis. The suggested approach was used as part of a computerized workstation for tsunami hazard monitoring [5-6].
References:
[1] National Oceanic and Atmospheric Administration Center for Tsunami Research. URL: http://nctr.pmel.noaa.gov/honshu20110311/
[2] National Data Buoy Center. URL: http://www.ndbc.noaa.gov/dart.shtml
[3] National Oceanic and Atmospheric Administration Center for Tsunami Research. URL: http://sift.pmel.noaa.gov/thredds/dodsC/uncompressed/
[4] National Oceanic and Atmospheric Administration Center for Tsunami Research. URL: http://nctr.pmel.noaa.gov/model.html
[5] Alexey Romanenko, Mikhail Lavrentiev-jr, Vasily Titov, "Modern Architecture for Tsunami Hazard Mitigation", Asia Oceania Geosciences Society (AOGS-2012), ISBN 978-981-07-2049-0.
[6] Mikhail Lavrentiev-jr, Andrey Marchuk, Alexey Romanenko, Konstantin Simonov, and Vasiliy Titov, "Computerized Workstation for Tsunami Hazard Monitoring", Geophysical Research Abstracts, Vol. 12, EGU2010-3021-1, 2010.
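The orthogonalization-and-projection step described in this abstract can be sketched as follows: orthogonalize a small set of unit-source waveforms (here via QR, i.e., Gram-Schmidt) and project the measured time series onto the resulting basis to obtain expansion coefficients. All arrays and sizes are illustrative placeholders, not the authors' operational implementation.

```python
# Hedged sketch: orthogonalize unit-source waveforms and project a DART record
# onto the orthogonal basis; recover unit-source amplitudes from the projections.
import numpy as np

rng = np.random.default_rng(2)
n_t, n_src = 240, 3
U = rng.standard_normal((n_t, n_src))      # unit-source waveforms (columns)
record = U @ np.array([1.2, -0.3, 0.5])    # synthetic DART time series

Q, R = np.linalg.qr(U)                     # Q holds the orthonormalized basis
coeffs_orth = Q.T @ record                 # cheap projections, suitable for real time
weights = np.linalg.solve(R, coeffs_orth)  # back to unit-source amplitudes
print("recovered unit-source amplitudes:", np.round(weights, 2))
```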
Evaluation and Numerical Simulation of Tsunami for Coastal Nuclear Power Plants of India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Pavan K.; Singh, R.K.; Ghosh, A.K.
2006-07-01
The tsunami generated on December 26, 2004 by the Sumatra earthquake of magnitude 9.3 resulted in inundation at various coastal sites of India. The site selection and design of Indian nuclear power plants demand the evaluation of run-up and of structural barriers for the coastal plants. Besides, it is also desirable to evaluate the early warning system for tsunamigenic earthquakes. Tsunamis originate from submarine faults, underwater volcanic activity, sub-aerial landslides impinging on the sea, and submarine landslides. In the case of a submarine earthquake-induced tsunami, the wave is generated in the fluid domain due to displacement of the seabed. There are three phases of a tsunami: generation, propagation, and run-up. The Reactor Safety Division (RSD) of Bhabha Atomic Research Centre (BARC), Trombay has initiated computational simulation of all three phases of a tsunami: source generation, propagation, and finally run-up evaluation, for the protection of public life, property and the various industrial infrastructures located in the coastal regions of India. These studies could be effectively utilized for the design and implementation of an early warning system for the coastal regions of the country, apart from catering to the needs of Indian nuclear installations. This paper presents some results for tsunami waves based on different analytical/numerical approaches with shallow water wave theory.
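As a reference point for the propagation phase mentioned above, here is a minimal 1D linear shallow-water sketch on a staggered leapfrog grid. The depth, grid spacing, and initial hump are arbitrary illustrative values, not the BARC model setup.

```python
# Minimal 1D linear shallow-water propagation sketch (staggered leapfrog),
# illustrating the propagation phase only.
import numpy as np

g, h = 9.81, 4000.0                  # gravity, uniform ocean depth (m)
dx = 2000.0                          # grid spacing (m)
nx, nt = 500, 600
dt = 0.5 * dx / np.sqrt(g * h)       # satisfies the CFL condition

x = np.arange(nx) * dx
eta = np.exp(-((x - x.mean()) / 20e3) ** 2)   # initial sea-surface hump (m)
u = np.zeros(nx + 1)                          # velocities on cell faces

for _ in range(nt):
    # momentum: du/dt = -g d(eta)/dx ; continuity: d(eta)/dt = -h du/dx
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    eta -= h * dt / dx * (u[1:] - u[:-1])

print("max surface elevation after propagation:", round(float(eta.max()), 3), "m")
```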
NASA Astrophysics Data System (ADS)
Mulia, Iyan E.; Inazu, Daisuke; Waseda, Takuji; Gusman, Aditya Riadi
2017-10-01
The future Nankai Trough tsunami is one of the imminent threats to Japanese coastal communities and could potentially cause a catastrophic event. As part of the countermeasure efforts for such an occurrence, this study analyzes the efficacy of combining tsunami data assimilation (DA) and waveform inversion (WI). The DA is used to continuously refine a wavefield model, whereas the WI is used to estimate the tsunami source. We consider a future scenario of the Nankai Trough tsunami recorded at various observational systems, including ocean bottom pressure (OBP) gauges, global positioning system (GPS) buoys, and ship height positioning data. Since most of the OBP gauges are located inside the source region, the recorded tsunami signals exhibit significant offsets relative to the actual sea-surface change because of coseismic seafloor deformation effects. Such biased data are not applicable to the standard DA, but they can be taken into account in the WI. On the other hand, the use of WI for the ship data may not be practical because a considerably large precomputed tsunami database would be needed to cope with the arbitrary ship locations. The DA is more suitable for such an observational system, as it can be executed sequentially in time and does not require precomputed scenarios. Therefore, the combined approach of DA and WI allows us to make concurrent use of all observational resources. Additionally, we introduce a bias correction scheme for the OBP data to improve the accuracy, and an adaptive thinning of observations to determine an efficient number of observations.
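The sequential nature of DA referred to above can be illustrated with a single optimal-interpolation style update step, in which sparse observations correct a forecast wavefield. The state size, covariances, and observation operator below are toy values, not the configuration used in the study.

```python
# Hedged sketch of one sequential assimilation step (optimal-interpolation form):
# x_a = x_f + K (y - H x_f), with K = P H^T (H P H^T + R)^-1.
import numpy as np

n_state, n_obs = 50, 5
rng = np.random.default_rng(3)

x_f = rng.standard_normal(n_state)              # forecast wavefield (flattened)
H = np.zeros((n_obs, n_state))                  # picks out observed grid points
H[np.arange(n_obs), rng.choice(n_state, n_obs, replace=False)] = 1.0
y = H @ x_f + 0.05 * rng.standard_normal(n_obs) # ship / OBP / GPS-buoy heights

P = np.eye(n_state) * 0.1                       # assumed background covariance
R = np.eye(n_obs) * 0.05 ** 2                   # observation-error covariance

K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # gain
x_a = x_f + K @ (y - H @ x_f)                   # analysis wavefield
print("mean absolute update:", float(np.abs(x_a - x_f).mean()))
```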
Tsunami Warning Center in Turkey : Status Update 2012
NASA Astrophysics Data System (ADS)
Meral Ozel, N.; Necmioglu, O.; Yalciner, A. C.; Kalafat, D.; Yilmazer, M.; Comoglu, M.; Sanli, U.; Gurbuz, C.; Erdik, M.
2012-04-01
This is an update to EGU2011-3094 informing on the progress of the establishment of a National Tsunami Warning Center in Turkey (NTWC-TR) under the UNESCO Intergovernmental Oceanographic Commission - Intergovernmental Coordination Group for the Tsunami Early Warning and Mitigation System in the North-eastern Atlantic, the Mediterranean and connected seas (IOC-ICG/NEAMTWS) initiative. NTWC-TR is integrated into the 24/7 operational National Earthquake Monitoring Center (NEMC) of KOERI, comprising 129 BB and 61 strong motion sensors. Based on an agreement with the Disaster and Emergency Management Presidency (DEMP), data from 10 BB stations located along the Aegean and Mediterranean coasts are now transmitted in real time to KOERI. Real-time data transmission from 6 primary and 10 auxiliary stations of the International Monitoring System will be in place in the very near future based on an agreement concluded with the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO) in 2011. Under an agreement with a major Turkish GSM company, KOERI is enlarging its strong-motion network to promote real-time seismology and to extend the Earthquake Early Warning system countrywide; 25 accelerometers (included in the number given above) have been purchased and installed at Base Transceiver Station sites in coastal regions within the scope of this initiative. Data from 3 tide gauge stations operated by the General Command of Mapping (GCM) are being transmitted to KOERI via satellite connection, and the aim is to integrate all tide-gauge stations operated by GCM into NTWC-TR. A collaborative agreement has been signed with the European Commission - Joint Research Centre (EC-JRC); the MOD1 Tsunami Scenario Database and TAT (Tsunami Analysis Tool) have been received by KOERI, and user training was provided. The database and the tool are linked to SeisComp3 and are currently operational. In addition, KOERI is continuing to contribute to JRC work on developing an improved database (MOD2), and is also continuing the development of its own scenario database using the NAMI DANCE Tsunami Simulation and Visualization Software. Further improvement of the Tsunami Warning System at the NTWC-TR will be accomplished through KOERI's participation in the FP-7 project TRIDEC, which focuses on new technologies for real-time intelligent earth information management to be used in Tsunami Early Warning Systems. In cooperation with the Turkish State Meteorological Service (TSMS), KOERI now has its own GTS system and is connected to GTS via its own satellite hub. The system was successfully utilized during the First Enlarged Communication Test Exercise (NEAMTWS/ECTE1), where KOERI acted as the message provider. KOERI is providing guidance and assistance to a working group established within the DEMP on issues such as communication and tsunami exercises, national procedures, and a National Tsunami Response Plan. KOERI is also participating in the NEAMTIC (North-Eastern Atlantic and Mediterranean Tsunami Information Centre) project. Finally, during the 8th Session of NEAMTWS in November 2011, KOERI announced that NTWC-TR would be operational as of January 2012, covering the Eastern Mediterranean, Aegean, Marmara and Black Seas, and that KOERI is also ready to operate as an Interim Candidate Tsunami Watch Provider.
Šepić, Jadranka; Vilibić, Ivica; Rabinovich, Alexander B; Monserrat, Sebastian
2015-06-29
A series of tsunami-like waves of non-seismic origin struck several southern European countries during the period of 23 to 27 June 2014. The event caused considerable damage from Spain to Ukraine. Here, we show that these waves were long-period ocean oscillations known as meteorological tsunamis, which are generated by intense small-scale air pressure disturbances. A unique atmospheric synoptic pattern was tracked propagating eastward over the Mediterranean and the Black seas in synchrony with the onset times of the observed tsunami waves. This pattern favoured the generation and propagation of atmospheric gravity waves that induced pronounced tsunami-like waves through the Proudman resonance mechanism. This is the first documented case of a chain of destructive meteorological tsunamis occurring over a distance of thousands of kilometres. Our findings further demonstrate that these events represent potentially dangerous regional phenomena and should be included in tsunami warning systems.
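The Proudman resonance mentioned here occurs where the long-wave speed √(gh) matches the translation speed U of the air-pressure disturbance, i.e. at a depth h = U²/g. The short check below uses an assumed, illustrative disturbance speed.

```python
# Quick check of the Proudman resonance condition: sqrt(g*h) == U at h = U**2/g.
import math

g = 9.81
U = 22.0                        # assumed disturbance speed over the shelf (m/s)
h_resonant = U ** 2 / g         # depth at which the ocean long wave keeps pace
print(f"resonant depth for U = {U} m/s: {h_resonant:.0f} m")  # roughly 49 m
```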
Advanced Simulation of Coupled Earthquake and Tsunami Events
NASA Astrophysics Data System (ADS)
Behrens, Joern
2013-04-01
Earthquakes and tsunamis represent natural catastrophes that threaten the lives and well-being of societies in a single, unexpected extreme event, as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), and Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - gives us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.
Far-field tsunami magnitude determined from ocean-bottom pressure gauge data around Japan
NASA Astrophysics Data System (ADS)
Baba, T.; Hirata, K.; Kaneda, Y.
2003-12-01
Tsunami magnitude is the most fundamental parameter for scaling tsunamigenic earthquakes. According to Abe (1979), the tsunami magnitude, Mt, is empirically related to the crest-to-trough amplitude, H (in meters), of the far-field tsunami wave: Mt = log H + 9.1. Here we investigate the far-field tsunami magnitude using ocean-bottom pressure gauge data. Recent ocean-bottom pressure measurements provide more precise tsunami data with a high signal-to-noise ratio. The Japan Marine Science and Technology Center is monitoring ocean-bottom pressure fluctuations using two submarine cables at depths of 1500-2400 m. These geophysical observatory systems are located off Cape Muroto, Southwest Japan, and off Hokkaido, Northern Japan. The ocean-bottom pressure data recorded with the Muroto and Hokkaido systems have been collected continuously since March 1997 and October 1999, respectively. Over the period from March 1997 to June 2003, we observed four far-field tsunami signals, generated by earthquakes, on the ocean-bottom pressure records. These far-field tsunamis were generated by the 1998 Papua New Guinea eq. (Mw 7.0), the 1999 Vanuatu eq. (Mw 7.2), the 2001 Peru eq. (Mw 8.4) and the 2002 Papua New Guinea eq. (Mw 7.6). A maximum amplitude of about 30 mm was recorded for the tsunami from the 2001 Peru earthquake. Direct application of Abe's empirical relation to ocean-bottom pressure gauge data underestimates tsunami magnitudes by about an order of magnitude. This is because Abe's empirical relation was derived only from tsunami amplitudes at coastal tide gauges, where the tsunami is amplified by the shoaling of the topography and by reflection at the coastline. These effects do not apply to offshore tsunamis in the deep ocean. In general, amplification due to shoaling near the coastline is governed by Green's Law, in which the tsunami amplitude is proportional to h^(-1/4), where h is the water depth. The wave amplitude is also doubled by reflection at the fixed edge (the coastline). Hence, we introduce a water-depth term and a reflection coefficient of 2 into the original Abe empirical relation to correct the tsunami amplitude for the open ocean, and obtain Mt = log(2H/h^(-1/4)) + 9.1, where h is the depth of the ocean-bottom pressure gauge. The modified empirical relation produces tsunami magnitudes close to those determined using tide gauges.
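A worked example of the relation quoted above, Mt = log(2H/h^(-1/4)) + 9.1 (equivalently log(2H·h^(1/4)) + 9.1), using an offshore amplitude and gauge depth chosen for illustration, roughly comparable to the values mentioned in the abstract:

```python
# Depth-corrected tsunami magnitude from an ocean-bottom pressure gauge amplitude.
# The amplitude H and gauge depth h below are illustrative, not the exact records.
import math

H = 0.030          # crest-to-trough amplitude at the ocean-bottom gauge (m)
h = 2000.0         # gauge depth (m)

Mt_raw      = math.log10(H) + 9.1                    # original Abe relation
Mt_modified = math.log10(2.0 * H * h ** 0.25) + 9.1  # with Green's-law and reflection terms
print(f"Mt (uncorrected): {Mt_raw:.1f}")             # ~7.6
print(f"Mt (modified):    {Mt_modified:.1f}")        # ~8.7
```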
Tsunami focusing and leading wave height
NASA Astrophysics Data System (ADS)
Kanoglu, Utku
2016-04-01
Field observations from tsunami events show that the maximum tsunami amplitude sometimes does not occur in the first wave; for example, the maximum wave from the 2011 Japan tsunami reached Papeete, Tahiti, as the fourth wave, 72 min after the first. This can mislead local authorities and give the public a false sense of security. Recently, Okal and Synolakis (2016, Geophys. J. Int. 204, 719-735) discussed "the factors contributing to the sequencing of tsunami waves in the far field." They consider two different generation mechanisms through an axially symmetric source (a circular plug): one, Le Mehaute and Wang's (1995, World Scientific, 367 pp.) formalism, in which irrotational wave propagation is formulated in the framework of investigating tsunamis generated by underwater explosions; and two, Hammack's (1972, Ph.D. Dissertation, Calif. Inst. Tech., 261 pp., Pasadena) formulation, which introduces deformation at the ocean bottom and thus does not represent an immediate deformation of the ocean surface, i.e., a time-dependent ocean surface deformation. They identify the critical distance for the transition from the first wave being largest to the second wave being largest. To verify sequencing for a finite-length source, Okal and Synolakis (2016) then used NOAA's validated and verified real-time forecasting numerical model MOST (Titov and Synolakis, 1998, J. Waterw. Port Coast. Ocean Eng., 124, 157-171) through Synolakis et al. (2008, Pure Appl. Geophys. 165, 2197-2228). As a reference, they used the parameters of the 1 April 2014 Iquique, Chile earthquake over real bathymetry, variants of this source (small, big, wide, thin, and long) over flat bathymetry, and the 2010 Chile and 2011 Japan tsunamis over both real and flat bathymetries to explore the influence of the fault parameters on sequencing. They found that sequencing is influenced more by the source width than by its length. We extend the analysis of Okal and Synolakis (2016) to an initial N-wave form (Tadepalli and Synolakis, 1994, Proc. R. Soc. A: Math. Phys. Eng. Sci., 445, 99-112) with a finite crest length, which is the most common initial tsunami waveform. We fit the earthquake-generated initial waveform calculated through Okada (1985, Bull. Seismol. Soc. Am. 75, 1135-1154) to the N-wave form presented by Tadepalli and Synolakis (1994). First, we investigate the focusing phenomenon as presented by Kanoglu et al. (2013, Proc. R. Soc. A: Math. Phys. Eng. Sci., 469, 20130015) and compare our results with their non-dispersive and dispersive linear analytical solutions. We confirm the focusing phenomenon, which amplifies the wave height on the leading-depression side. We then study the sequencing of an N-wave profile with a finite crest length. Our preliminary results show that sequencing is more pronounced on the leading-depression side. We perform a parametric study to understand sequencing in terms of the N-wave, and hence earthquake, parameters. We then discuss the results both in terms of tsunami focusing and leading wave amplitude. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).
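An N-wave initial condition of the kind referenced above can be sketched as eta(x) = A·(x − x2)·sech²(γ·(x − x1)); the amplitude A, steepness γ, and offsets below are arbitrary demo values and do not reproduce the exact normalization of Tadepalli and Synolakis (1994).

```python
# Illustrative generalized N-wave surface profile (asymmetric crest and trough).
import numpy as np

def n_wave(x, A=3e-4, gamma=2e-4, x1=0.0, x2=2e3):
    """Illustrative N-wave surface elevation (m) along x (m); parameters are assumed."""
    return A * (x - x2) / np.cosh(gamma * (x - x1)) ** 2

x = np.linspace(-100e3, 100e3, 2001)
eta = n_wave(x)
print(f"crest: {eta.max():.2f} m, trough: {eta.min():.2f} m")
```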
Towards a certification process for tsunami early warning systems
NASA Astrophysics Data System (ADS)
Löwe, Peter; Wächter, Jochen; Hammitzsch, Martin
2013-04-01
The natural disaster of the Boxing Day Tsunami of 2004 was followed by an information catastrophe. Crucial early warning information could not be delivered to the communities under imminent threat, resulting in over 240,000 casualties in 14 countries. This tragedy sparked the development of a new generation of integrated modular Tsunami Early Warning Systems (TEWS). While significant advances have been accomplished in the past years, recent events, like the Chile 2010 and the Tohoku 2011 tsunamis, demonstrate that the key technical challenge for Tsunami Early Warning research on the supranational scale still lies in the timely issuing of status information and reliable early warning messages in a proven workflow. A second challenge stems from the main objective of the Intergovernmental Oceanographic Commission of UNESCO (IOC) Tsunami Programme, the integration of national TEWS into ocean-wide networks: each of the increasing number of integrated Tsunami Early Warning Centres has to cope with the continuing evolution of sensors, hardware and software while having to maintain reliable inter-center information exchange services. To avoid future information catastrophes, the performance of all components, ranging from individual sensors, to Warning Centers within their particular end-to-end Warning System environments, and up to federated Systems of Tsunami Warning Systems, has to be regularly validated against defined criteria. Since 2004, the GFZ German Research Centre for Geosciences (GFZ) has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CeGIT) has focused its work on the geoinformatics aspects of TEWS in two projects already: the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS). This activity is continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises) funded under the European Union's Seventh Framework Programme (FP7). TRIDEC focuses on real-time intelligent information management in Earth management and its long-term application: the technical development is based on mature system architecture models and industry standards. The use of standards already applies to the operation of individual TRIDEC reference installations and their interlinking into an integrated service infrastructure for supranational warning services. This is a first step towards best practices and service lifecycles for Early Warning Centre IT service management, including Service Level Agreements (SLA) and Service Certification. While on a global scale the integration of TEWS progresses towards Systems of Systems (SoS), there is still an absence of accredited and reliable certifications for national TEWS or regional Tsunami Early Warning Systems of Systems (TEWSoS). Concepts for TEWS operations have already been published under the guidance of the IOC, and can now be complemented by recent research advances concerning SoS architecture. Combined with feedback from the real world, such as the NEAMwave 2012 tsunami exercise in the Mediterranean, this can serve as a starting point to formulate initial requirements for TEWS and TEWSoS certification: certification activities will cover the establishment of new TEWS and TEWSoS, as well as the maintenance and enhancement of existing TEWS/TEWSoS.
While the IOC is expected to take a central role in the development of the certification strategy, it remains to be defined which bodies will actually conduct the certification process. Certification requirements and results are likely to become a valuable information source for various target groups, ranging from national policy decision makers, government agency planners, national and local government preparedness officials, TWC staff members, and disaster responders to the media and the insurance industry.
The potential role of real-time geodetic observations in tsunami early warning
NASA Astrophysics Data System (ADS)
Tinti, Stefano; Armigliato, Alberto
2016-04-01
Tsunami warning systems (TWS) have the ultimate goal of issuing a reliable alert of an incoming dangerous tsunami to the coastal population early enough to allow people to flee from the shore and coastal areas according to some evacuation plan. In the last decade, especially after the catastrophic 2004 Boxing Day tsunami in the Indian Ocean, much attention has been given to filling gaps in the existing TWSs (which only covered the Pacific Ocean at that time) and to establishing new TWSs in ocean regions that were uncovered. Typically, TWSs operating today address only earthquake-induced tsunamis. They quickly estimate earthquake location and size by processing seismic signals in real time; on the basis of pre-defined "static" procedures (either decision matrices or pre-archived tsunami simulations), they assess the tsunami alert level on a large regional scale and issue specific bulletins to a pre-selected audience of recipients. Not infrequently, these procedures result in generic alert messages of little value. What operational TWSs usually do not do is compute the earthquake focal mechanism, calculate the co-seismic sea-floor displacement, assess the initial tsunami conditions, feed these data into tsunami simulation models, and compute tsunami propagation up to the threatened coastal districts. This series of steps is considered nowadays too time-consuming to provide the required timely alert. An equivalent series of steps could start from the same premises (earthquake focal parameters) and reach the same result (tsunami height at target coastal areas) by replacing the intermediate steps of real-time tsunami simulation with a proper selection from a large archive of pre-computed tsunami scenarios. The advantage of real-time simulations and of archived scenario selection is that estimates are tailored to the specific occurring tsunami, and the alert can be more detailed (less generic) and appropriate for local needs. Both these procedures are still at an experimental or testing stage and have not yet been implemented in any standard TWS operations. Nonetheless, this is seen as the future and the natural evolution of TWS enhancement. In this context, improvement of the real-time estimates of the tsunamigenic earthquake focal mechanism is of fundamental importance to trigger the appropriate computational chain. Quick discrimination between strike-slip and thrust-fault earthquakes and, equally relevant, quick assessment of the co-seismic on-fault slip distribution are exemplary cases to which a real-time geodetic monitoring system can contribute significantly. Robust inversion of geodetic data can help to reconstruct the sea-floor deformation pattern, especially if two conditions are met: the source is not too far from the network stations and is well covered azimuthally. These two conditions are sometimes hard to satisfy fully, but in certain regions, like the Mediterranean and the Caribbean seas, this is quite possible due to the limited size of the ocean basins. Close cooperation between the Global Geodetic Observing System (GGOS) community, seismologists, tsunami scientists and TWS operators is highly recommended to obtain significant progress in the quick determination of the earthquake source, which can trigger a timely estimation of the ensuing tsunami and a more reliable and detailed assessment of the tsunami size at the coast.
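The geodetic inversion alluded to above is, in its simplest form, a damped least-squares fit of fault-patch slip to co-seismic GNSS offsets through precomputed elastic Green's functions. The sketch below uses synthetic placeholders for G, the data, and the damping, not an operational configuration.

```python
# Hedged sketch of a rapid geodetic slip inversion: d = G m, solved with damping.
import numpy as np

rng = np.random.default_rng(4)
n_stations, n_patches = 30, 12
G = rng.standard_normal((3 * n_stations, n_patches))   # 3 components per station
m_true = np.maximum(rng.standard_normal(n_patches), 0) # slip on each fault patch (m)
d = G @ m_true + 0.01 * rng.standard_normal(3 * n_stations)

lam = 0.1                                              # damping (regularization), assumed
A = np.vstack([G, lam * np.eye(n_patches)])
b = np.concatenate([d, np.zeros(n_patches)])
m_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("recovered slip (m):", np.round(m_hat, 2))
```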
NASA Astrophysics Data System (ADS)
Fauzan Zakki, Ahmad; Suharto; Windyandari, Aulia
2018-03-01
Several attempts have been made to reduce the risk of tsunami disasters, such as the development of early warning systems, evacuation procedure training, coastal protection, and coastal spatial planning. Although many efforts have been made to mitigate the impact of tsunamis in Indonesia, no one has developed a portable disaster rescue vehicle/shelter, analogous to a lifeboat on ships and offshore structures, that is always available when a disaster occurs. The aim of the paper is to evaluate the performance of a cone-capsule-shaped hull form that would be used for the portable tsunami lifeboat. The boat's resistance, intact stability, and seakeeping characteristics were investigated. The numerical analysis results indicate that the cone capsule is reliable as an alternative hull form for the portable tsunami lifeboat.
The 17 July 2006 Tsunami earthquake in West Java, Indonesia
Mori, J.; Mooney, W.D.; Afnimar; Kurniawan, S.; Anaya, A.I.; Widiyantoro, S.
2007-01-01
A tsunami earthquake (Mw = 7.7) occurred south of Java on 17 July 2006. The event produced relatively low levels of high-frequency radiation, and local felt reports indicated only weak shaking in Java. There was no ground motion damage from the earthquake, but there was extensive damage and loss of life from the tsunami along 250 km of the southern coasts of West Java and Central Java. An inspection of the area a few days after the earthquake showed extensive damage to wooden and unreinforced masonry buildings that were located within several hundred meters of the coast. Since there was no tsunami warning system in place, efforts to escape the large waves depended on how people reacted to the earthquake shaking, which was only weakly felt in the coastal areas. This experience emphasizes the need for adequate tsunami warning systems for the Indian Ocean region.
NASA Astrophysics Data System (ADS)
Cienfuegos, R.; Gonzalez, G.; Repetto, P.; Cipriano, A.; Moris, R.; Catalan, P. A.; Guic, E.; Martin, J. C. D. L. L.
2016-12-01
The Research Center for Integrated Natural Hazards Management (CIGIDEN), supported by the Fondap/Conicyt program of excellence research centers, has in recent years developed active efforts to connect science with the public institutions in charge of disaster management in Chile. We have been able to reach, in particular, the National Emergency Office (ONEMI) and the National Hydrographic and Oceanographic Naval Service (SHOA), and to develop specific joint programs that have been mutually beneficial both for research enrichment and for the operation of the emergency response system. Through these efforts, also supplemented by other Chilean and international research institutions, we jointly analyzed issues and challenges arising from the systemic failure experienced by the emergency system in Chile after the 2010 earthquake and tsunami. In this talk we will review some of the main collaboration actions and their outcomes, connecting them to the extreme events that impacted Chile in 2015 (earthquakes, tsunamis, storm waves, and flash floods). In particular, we will describe the efforts that CIGIDEN has developed (i) with ONEMI in developing instruments to assess community preparedness and awareness and to understand tsunami evacuation behaviors; and (ii) with SHOA to develop a new Integrated Decision Support System for tsunami alerting, which was transferred to SHOA in September 2015 and was successfully tested offline during the September 16th, 2015, tsunami.
Real-time Tsunami Inundation Prediction Using High Performance Computers
NASA Astrophysics Data System (ADS)
Oishi, Y.; Imamura, F.; Sugawara, D.
2014-12-01
Recently, offshore tsunami observation stations based on cabled ocean-bottom pressure gauges have been actively deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines for disaster mitigation purposes. To obtain real benefits from these observations, real-time analysis techniques that make effective use of these data are necessary. A representative study was made by Tsushima et al. (2009), who proposed a method to provide instant tsunami source prediction based on the acquired tsunami waveform data. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green's functions of the linear long-wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation and improved the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although solving the non-linear shallow water equations for inundation prediction is computationally demanding, it has become feasible through recent developments in high-performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolutions range from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction for up to 2 hours after the earthquake took about 2 minutes of computation, which would be sufficient for practical tsunami inundation prediction. In the presentation, the computational performance of our faster-than-real-time tsunami inundation model will be shown, and the preferred tsunami source analysis for an accurate inundation prediction will also be discussed.
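The quoted time step of 0.1 s can be sanity-checked against the CFL condition for each nest (grid spacings 405 m down to 5 m, ratio 1/3). The representative maximum depths per nest below are assumed values for illustration, not the actual model bathymetry.

```python
# Rough CFL check for a nested-grid shallow-water setup with dt = 0.1 s.
import math

g = 9.81
nests = {405.0: 4000.0, 135.0: 1000.0, 45.0: 200.0, 15.0: 50.0, 5.0: 10.0}  # dx (m): assumed max depth (m)
dt = 0.1

for dx, h in nests.items():
    c = math.sqrt(g * h)                  # long-wave speed
    dt_max = dx / c                       # 1D CFL limit (nonlinear terms ignored)
    print(f"dx={dx:6.1f} m, depth={h:6.0f} m -> dt_max={dt_max:5.2f} s, dt=0.1 s OK: {dt <= dt_max}")
```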
The Pacific tsunami warning system
Pararas-Carayannis, G.
1986-01-01
The impact of tsunamis on human societies can be traced back in written history to 480 BC, when the Minoan civilization in the Eastern Mediterranean was wiped out by great tsunami waves generated by the volcanic explosion of the island of Santorin. In the Pacific Ocean, where the majority of these waves have been generated, the historical record, although brief, shows tremendous destruction. In Japan, which has one of the most populated coastal regions in the world and a long history of earthquake activity, tsunamis have destroyed entire coastal communities. There is also a history of tsunami destruction in Alaska, in the Hawaiian Islands, and in South America.
An improvement of the GPS buoy system for detecting tsunami at far offshore
NASA Astrophysics Data System (ADS)
Kato, T.; Terada, Y.; Nagai, T.; Kawaguchi, K.; Koshimura, S.; Matsushita, Y.
2012-12-01
We have developed a GPS buoy system for detecting a tsunami before its arrival at the coast and thereby mitigating tsunami disasters. The system was first deployed in 1997 for a short period in Sagami Bay, south of Tokyo, for basic experiments, and then deployed off Ofunato city, in the northeastern part of Japan, for the period 2001-2004. The system has been established about 13 km south of Cape Muroto, in the southwestern part of Japan, since 2004. Five tsunamis of about 10 cm have been observed by these systems, including those from the 2001 Peru earthquake (Mw 8.3), the 2003 Tokachi-oki earthquake (Mw 8.3), the 2004 Off Kii Peninsula earthquake (Mw 7.4), the 2010 Chile earthquake (Mw 8.8), and the 2011 Tohoku-Oki earthquake (Mw 9.0). These experiments clearly showed that a GPS buoy is capable of detecting a tsunami with an accuracy of a few centimeters and that it can be monitored in near real time by applying an appropriate filter, transmitting the data in real time by radio, and disseminating the obtained records of sea-surface height changes through the internet. Considering that the system is a powerful tool for monitoring sea-surface variations due to wind as well as tsunamis, the Ministry of Land, Infrastructure, Transport and Tourism implemented the system as part of the Nationwide Ocean Wave information network for Ports and HArbourS (NOWPHAS) and deployed it at 15 sites along the coasts of the Japanese Islands. The system detected the tsunami from the 11 March 2011 Tohoku-Oki earthquake with a tsunami height of more than 6 m at the Off South Iwate (Kamaishi) site. The Japan Meteorological Agency, which was monitoring the record, upgraded the tsunami warning to the highest level as a result. Currently, the GPS buoy system uses RTK-GPS, which requires a land base station to obtain the precise location of the buoy by baseline analysis. This limits the distance of the buoy to at most 20 km from the coast, as positioning accuracy degrades considerably when the baseline becomes longer than 20 km. This in turn limits the lead time for coastal residents to evacuate to only about 10 minutes after detection of the tsunami at a GPS buoy, and it requires us to improve the system so that the buoy can be placed much farther from the coast. To solve this problem, we have introduced a new algorithm based on the precise point positioning with ambiguity resolution (PPP-AR) method and the point precise variance detection (PVD) method for estimating the precise location of the buoy. As these methods do not require a land base station, they may allow us to deploy a buoy farther than 100 km offshore. Also, an open-source program package (RTKLIB) is used for kinematic analysis over a long baseline. A new experiment using this system started about 40 km south of Cape Muroto in April 2012. One of the buoys, called "Kuroshio Bokujo", which is used as a fish bed by Kochi Prefecture, serves this purpose. The positioning results are displayed in real time on the internet.
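The motivation for moving buoys farther offshore can be illustrated with a back-of-the-envelope lead-time estimate: the tsunami travel time from the buoy to the coast at long-wave speed √(gh). The mean depths along the path below are assumed illustrative values, not surveyed bathymetry.

```python
# Rough warning lead time from buoy distance and an assumed mean depth.
import math

g = 9.81

def travel_time_minutes(distance_km, mean_depth_m):
    speed = math.sqrt(g * mean_depth_m)          # long-wave speed (m/s)
    return distance_km * 1000.0 / speed / 60.0

for dist, depth in [(20, 200), (100, 1000)]:     # near-coast RTK buoy vs. offshore PPP-AR buoy
    print(f"buoy at {dist:3d} km (assumed mean depth {depth} m): "
          f"~{travel_time_minutes(dist, depth):.0f} min of warning lead time")
```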
Development of jacket platform tsunami risk rating system in waters offshore North Borneo
NASA Astrophysics Data System (ADS)
Lee, H. E.; Liew, M. S.; Mardi, N. H.; Na, K. L.; Toloue, Iraj; Wong, S. K.
2016-09-01
This work details the simulation of tsunami waves generated by seaquakes in the Manila Trench and their effect on fixed oil and gas jacket platforms in waters offshore North Borneo. For this study, a four-leg living-quarter jacket platform located in a water depth of 63 m is modelled in SACS v5.3. Malaysia has traditionally been perceived to be safe from the hazards of earthquakes and tsunamis, and local design practices tend to neglect tsunami waves and include no such provisions. In 2004, a 9.3 Mw seaquake occurred off the northwest coast of Aceh, which generated tsunami waves that caused destruction in Malaysia totalling US$25 million and 68 deaths. This event prompted an awareness of the need to study the reliability of the fixed offshore platforms scattered throughout Malaysian waters. In this paper, we present a review of research on the seismicity of the Manila Trench, which is perceived to pose a high risk to Southeast Asia. From the tsunami numerical model TUNA-M2, we extract computer-simulated tsunami waves at prescribed grid points in the vicinity of the platforms in the region. Using the wave heights as input, we simulate the tsunami loading on the platform using SACS v5.3, a structural analysis software package for offshore platforms that is widely accepted by the industry. We employ nonlinear solitary wave theory in our tsunami loading calculations for the platforms and formulate a platform-specific risk quantification system. We then perform an intensive structural sensitivity analysis and derive a corresponding platform-specific risk rating model.
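The solitary-wave description used as loading input can be sketched with the standard first-order expressions: celerity c = √(g(d + H)) and free surface η(x, t) = H·sech²(k(x − ct)) with k = √(3H/(4d³)). The platform water depth of 63 m is taken from the abstract; the wave height is an assumed example value.

```python
# Hedged sketch of first-order solitary wave kinematics for tsunami loading input.
import math

g = 9.81
d = 63.0          # water depth at the platform (m), as quoted in the abstract
H = 3.0           # assumed tsunami wave height at the site (m)

c = math.sqrt(g * (d + H))
k = math.sqrt(3.0 * H / (4.0 * d ** 3))

def eta(x, t):
    """Free-surface elevation (m) of the solitary wave."""
    return H / math.cosh(k * (x - c * t)) ** 2

print(f"celerity: {c:.1f} m/s, horizontal decay scale 1/k: {1.0 / k:.0f} m")
print(f"surface elevation at the crest: {eta(0.0, 0.0):.2f} m")
```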
NASA Astrophysics Data System (ADS)
Liu, Jiaqi; Tokunaga, Tomochika
2016-04-01
Groundwater is vulnerable to many natural hazards, including tsunamis. As reported after the 2004 Indian Ocean earthquake and the 2011 Great East Japan earthquake, the massive tsunami inundations they generated resulted in unexpected groundwater salinization in coastal areas. Water supply was strongly disturbed by the significantly elevated salinity of the groundwater. Supplying fresh water is one of the prioritized concerns in the immediate aftermath of a disaster, and during long-term post-disaster reconstruction as well. The aim of this study is to assess the impact of a tsunami on a coastal groundwater system and to provide guidelines for managing water resources in the post-tsunami period. We selected Niijima Island, a tsunami-prone area in Japan that is at risk of being struck by a devastating tsunami with a wave height of up to 30 m, as the study area. A three-dimensional (3-D) numerical model of the groundwater system on Niijima Island was developed using the simulation code FEFLOW, which can handle both density-dependent groundwater flow and saturated-unsaturated flow processes. The model was constrained by the water table data measured during field work in July 2015. Using this model, we investigated saltwater intrusion and the aquifer recovery process under different tsunami scenarios. Modelling results showed that saltwater could fully saturate the vadose zone and come into contact with the groundwater table in just 10 minutes. The 0.6 km² inundation area introduced a salt mass equivalent to approximately 9×10⁴ t of NaCl into the vadose zone. After the retreat of the tsunami waves, the saltwater remaining in the vadose zone continued to intrude into the groundwater and dramatically salinized the aquifer, up to about 10,000 mg/L. In the worst tsunami scenario, it took more than 10 years for the polluted aquifer to be entirely recovered by natural rainfall. Given that groundwater is the only freshwater source on Niijima Island, we can provide suggestions on tsunami disaster preparedness and guidelines for supplying water in the post-tsunami period based on these numerical modelling results. This approach has implications for disaster prevention and better preparation for tsunami and tsunami-like events, such as storm surges, in other coastal areas.
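As a rough consistency check of the quoted figures (0.6 km² inundated, roughly 9×10⁴ t of NaCl delivered), one can back-calculate the implied average ponded-water depth, assuming a typical seawater salt content of about 35 kg/m³ (an assumption, not a value from the study).

```python
# Back-of-the-envelope check of the salt-mass figure quoted in the abstract.
area_m2 = 0.6e6              # inundation area (m^2)
salt_mass_kg = 9.0e7         # ~9 x 10^4 t of NaCl
salinity_kg_per_m3 = 35.0    # assumed seawater salt concentration

seawater_volume_m3 = salt_mass_kg / salinity_kg_per_m3
mean_depth_m = seawater_volume_m3 / area_m2
print(f"implied seawater volume: {seawater_volume_m3:.2e} m^3")
print(f"implied mean inundation depth: {mean_depth_m:.1f} m")   # roughly 4 m
```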
Application of Seismic Array Processing to Tsunami Early Warning
NASA Astrophysics Data System (ADS)
An, C.; Meng, L.
2015-12-01
Tsunami wave predictions of the current tsunami warning systems rely on accurate earthquake source inversions of wave height data. They are of limited effectiveness for the near-field areas since the tsunami waves arrive before data are collected. Recent seismic and tsunami disasters have revealed the need for early warning to protect near-source coastal populations. In this work we developed the basis for a tsunami warning system based on rapid earthquake source characterisation through regional seismic array back-projections. We explored rapid earthquake source imaging using onshore dense seismic arrays located at regional distances on the order of 1000 km, which provides faster source images than conventional teleseismic back-projections. We implement this method in a simulated real-time environment, and analysed the 2011 Tohoku earthquake rupture with two clusters of Hi-net stations in Kyushu and Northern Hokkaido, and the 2014 Iquique event with the Earthscope USArray Transportable Array. The results yield reasonable estimates of rupture area, which is approximated by an ellipse and leads to the construction of simple slip models based on empirical scaling of the rupture area, seismic moment and average slip. The slip model is then used as the input of the tsunami simulation package COMCOT to predict the tsunami waves. In the example of the Tohoku event, the earthquake source model can be acquired within 6 minutes from the start of rupture and the simulation of tsunami waves takes less than 2 min, which could facilitate a timely tsunami warning. The predicted arrival time and wave amplitude reasonably fit observations. Based on this method, we propose to develop an automatic warning mechanism that provides rapid near-field warning for areas of high tsunami risk. The initial focus will be Japan, Pacific Northwest and Alaska, where dense seismic networks with the capability of real-time data telemetry and open data accessibility, such as the Japanese HiNet (>800 instruments) and the Earthscope USArray Transportable Array (~400 instruments), are established.
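The "simple slip model from empirical scaling" step described in this abstract can be sketched as follows: take the back-projected rupture area (an ellipse), assume an average slip from a moment-area scaling, and convert to Mw. The ellipse dimensions, rigidity, and scaling constant are generic assumptions for illustration, not the values used by the authors.

```python
# Hedged sketch: rupture area -> assumed average slip -> seismic moment -> Mw.
import math

a_km, b_km = 220.0, 90.0                 # ellipse semi-axes from back-projection (assumed)
area_m2 = math.pi * a_km * b_km * 1e6    # rupture area (m^2)

mu = 3.0e10                              # rigidity (Pa), typical crustal value
avg_slip_m = 2.0e-5 * math.sqrt(area_m2) # assumed slip ~ sqrt(area) scaling

M0 = mu * area_m2 * avg_slip_m           # seismic moment (N*m)
Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)
print(f"area: {area_m2/1e6:.0f} km^2, slip: {avg_slip_m:.1f} m, Mw ~ {Mw:.1f}")
```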
Numerical reconstruction of tsunami source using combined seismic, satellite and DART data
NASA Astrophysics Data System (ADS)
Krivorotko, Olga; Kabanikhin, Sergey; Marinin, Igor
2014-05-01
Recent tsunamis, for instance in Japan (2011), in Sumatra (2004), and at the Indian coast (2004), showed that a system for producing exact and timely information about tsunamis is of vital importance. Numerical simulation is an effective instrument for providing such information. Bottom relief characteristics and the initial perturbation data (a tsunami source) are required for the direct simulation of tsunamis. The seismic data about the source are usually obtained within a few tens of minutes after an event has occurred (the seismic wave velocity being about five hundred kilometres per minute, while the velocity of tsunami waves is less than twelve kilometres per minute). The difference in the arrival times of seismic and tsunami waves can be used to operationally refine the tsunami source parameters and to model the expected tsunami wave height at the shore. The most suitable physical models for tsunami simulation are based on the shallow water equations. The problem of identifying the parameters of a tsunami source using additional measurements of a passing wave is called the inverse tsunami problem. We investigate three different inverse problems of determining a tsunami source using three different types of additional data: Deep-ocean Assessment and Reporting of Tsunamis (DART) measurements, satellite wave-form images, and seismic data. These problems are severely ill-posed. We apply regularization techniques to control the degree of ill-posedness, such as Fourier expansion, truncated singular value decomposition, and numerical regularization. An algorithm for selecting the number of retained singular values of the inverse problem operator, consistent with the error level in the measured data, is described and analyzed. In numerical experiments we used gradient methods (Landweber iteration and the conjugate gradient method) for solving inverse tsunami problems. Gradient methods are based on minimizing the corresponding misfit function. To calculate the gradient of the misfit function, the adjoint problem is solved. Conservative finite-difference schemes for solving the direct and adjoint problems in the shallow water approximation are constructed. Results of numerical experiments on tsunami source reconstruction are presented and discussed. We show that using a combination of three different types of data allows one to increase the stability and efficiency of tsunami source reconstruction. The non-profit organization WAPMERR (World Agency of Planetary Monitoring and Earthquake Risk Reduction), in collaboration with the Informap software development department, developed the Integrated Tsunami Research and Information System (ITRIS) to simulate tsunami waves and earthquakes, river course changes, and coastal zone floods, and to estimate risk for coastal structures under wave run-up and earthquakes. The special scientific plug-in components are embedded in a specially developed GIS-type graphic shell for easy data retrieval, visualization and processing. This work was supported by the Russian Foundation for Basic Research (project No. 12-01-00773 'Theory and Numerical Methods for Solving Combined Inverse Problems of Mathematical Physics') and by interdisciplinary project No. 14 of SB RAS, 'Inverse Problems and Applications: Theory, Algorithms, Software'.
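The truncated-SVD regularization mentioned above can be illustrated with a small synthetic ill-posed problem: singular values below an assumed noise-based threshold are discarded before forming the pseudo-inverse solution. The operator, data, and truncation rule are stand-ins, not the authors' implementation.

```python
# Hedged sketch of truncated-SVD regularization for an ill-posed inversion d = G m.
import numpy as np

rng = np.random.default_rng(5)
G = rng.standard_normal((80, 40)) @ np.diag(1.0 / np.arange(1, 41))  # ill-conditioned operator
m_true = rng.standard_normal(40)
d = G @ m_true + 0.01 * rng.standard_normal(80)

U, s, Vt = np.linalg.svd(G, full_matrices=False)
noise_level = 0.01
k = int(np.sum(s > noise_level * s[0]))          # simple truncation rule (assumed)
m_tsvd = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])   # truncated pseudo-inverse solution
print(f"retained {k} of {len(s)} singular values")
```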
NASA Astrophysics Data System (ADS)
Yang, Y. M.; Komjathy, A.; Meng, X.; Verkhoglyadova, O. P.; Langley, R. B.; Mannucci, A. J.
2015-12-01
Traveling ionospheric disturbances (TIDs) induced by acoustic-gravity waves in the neutral atmosphere have a significant impact on trans-ionospheric radio waves such as Global Navigation Satellite System (GNSS, including Global Positioning System (GPS)) measurements. Natural hazards and solid Earth events, such as earthquakes, tsunamis and volcanic eruptions, are actual sources that may trigger acoustic and gravity waves resulting in TIDs in the upper atmosphere. Trans-ionospheric radio wave measurements sense the total electron content (TEC) along the signal propagation path. In this research, we introduce a novel GPS-based detection and estimation technique for remote sensing of atmospheric wave-induced TIDs, including space weather phenomena induced by major natural hazard events, using TEC time series collected from worldwide ground-based dual-frequency GNSS (including GPS) receiver networks. We demonstrate the ability of ground- and space-based dual-frequency GPS measurements to detect and monitor tsunami wave propagation from the 2011 Tohoku-Oki earthquake and tsunami. Major wave trains with different propagation speeds and wavelengths were identified through analysis of the GPS remote sensing observations. The dominant physical characteristics of atmospheric wave-induced TIDs are found to be associated with specific tsunami propagation and oceanic Rayleigh waves. In this research, we compared GPS-based observations, corresponding model simulations, and tsunami wave propagation. The results lead to a better understanding of tsunami-induced ionospheric responses. Based on the current distribution of Plate Boundary Observatory GPS stations, the results indicate that tsunami-induced TIDs may be detected about 60 minutes before the tsunami arrives at the U.S. west coast. It is expected that this GNSS-based technology will become an integral part of future early-warning systems.
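The dual-frequency TEC estimate underlying such studies comes from the geometry-free combination, STEC = f1²f2² / (40.3 (f1² − f2²)) · (P2 − P1), in electrons/m² (1 TECU = 10¹⁶ el/m²). The pseudorange difference below is an illustrative value; real processing also handles biases, cycle slips, and carrier-phase smoothing.

```python
# Hedged sketch of slant TEC from the dual-frequency pseudorange combination.
f1 = 1575.42e6        # GPS L1 frequency (Hz)
f2 = 1227.60e6        # GPS L2 frequency (Hz)
dP = 3.5              # P2 - P1 pseudorange difference (m), assumed example

stec = (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) * dP   # electrons / m^2
print(f"slant TEC: {stec / 1e16:.1f} TECU")
```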
Long-term perspectives on giant earthquakes and tsunamis at subduction zones
Satake, K.; Atwater, B.F.
2007-01-01
Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright © 2007 by Annual Reviews. All rights reserved.
Tsunami magnetic signals in the Northwestern Pacific seafloor magnetic measurements
NASA Astrophysics Data System (ADS)
Schnepf, N. R.; An, C.; Nair, M. C.; Maus, S.
2013-12-01
In the past two decades, underwater cables and seafloor magnetometers have observed motional induction from ocean tsunamis. This study aimed to characterize the electromagnetic signatures of tsunamis at seafloor stations to assist in the long-term goal of real-time tsunami detection and warning systems. Four ocean seafloor stations (T13, T14, T15, T18) in the Northeastern Philippine Sea collected vector measurements of the electric and magnetic fields every minute during the period of 10/05/2005 to 11/30/2007 (Baba et al., 2010 PEPI). During this time, four major tsunamis occurred as a result of moment magnitude 8.0-8.1 earthquakes. These tsunamis include the 05/03/2006 Tonga event, the 01/13/2007 Kuril Islands event, the 04/01/2007 Solomon Islands event, and the 08/15/2007 Peru event. The Cornell Multi-grid Coupled Tsunami model (COMCOT) was used to predict the arrival time of the tsunamis at each of the seafloor stations. The stations' raw magnetic field signals were high-pass filtered and then examined for signatures of the tsunami arrival. The high-pass filtering showed clear tsunami signals for the Tonga event, but a clear signal was not seen for the other events. This may be due to signals from near-Earth space with periods similar to those of tsunamis. To remove extraneous atmospheric magnetic signals, a cross-wavelet analysis was conducted using the horizontal field components from three INTERMAGNET land stations and the vertical component from the seafloor stations. The cross-wavelet analysis showed that for three of the six stations (two of the four tsunami events) the peak in wavelet amplitude matched the arrival of the tsunami. We discuss the implications of our findings for magnetic monitoring of tsunamis.
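The high-pass filtering step described above can be sketched as a standard Butterworth filter applied to minute-resolution magnetometer data. The cutoff period, synthetic series, and amplitudes below are illustrative choices only, not the processing parameters used in the study.

```python
# Hedged sketch: high-pass filter a 1-minute magnetic record to isolate tsunami-band signals.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 60.0                      # sampling rate: one sample per minute (Hz)
cutoff_period_s = 3.0 * 3600.0       # remove variations slower than ~3 hours (assumed)
cutoff_hz = 1.0 / cutoff_period_s

t = np.arange(0, 2 * 86400, 60.0)    # two days of minute data
daily = 5.0 * np.sin(2 * np.pi * t / 86400.0)          # slow geomagnetic variation (nT)
tsunami = 0.5 * np.sin(2 * np.pi * t / 1200.0)         # ~20-min tsunami-band signal (nT)
record = daily + tsunami + 0.1 * np.random.default_rng(6).standard_normal(t.size)

b, a = butter(4, cutoff_hz / (fs / 2.0), btype="highpass")
filtered = filtfilt(b, a, record)
print("std before/after filtering (nT):", round(record.std(), 2), round(filtered.std(), 2))
```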
Geist, E.L.; Bilek, S.L.; Arcas, D.; Titov, V.V.
2006-01-01
Source parameters affecting tsunami generation and propagation for the Mw > 9.0 December 26, 2004 and the Mw = 8.6 March 28, 2005 earthquakes are examined to explain the dramatic difference in tsunami observations. We evaluate both scalar measures (seismic moment, maximum slip, potential energy) and finite-source representations (distributed slip and far-field beaming from finite source dimensions) of tsunami generation potential. There exists significant variability in local tsunami runup with respect to the most readily available measure, seismic moment. The local tsunami intensity for the December 2004 earthquake is similar to other tsunamigenic earthquakes of comparable magnitude. In contrast, the March 2005 local tsunami was deficient relative to its earthquake magnitude. Tsunami potential energy calculations more accurately reflect the difference in tsunami severity, although these calculations are dependent on knowledge of the slip distribution and therefore difficult to implement in a real-time system. A significant factor affecting tsunami generation unaccounted for in these scalar measures is the location of regions of seafloor displacement relative to the overlying water depth. The deficiency of the March 2005 tsunami seems to be related to concentration of slip in the down-dip part of the rupture zone and the fact that a substantial portion of the vertical displacement field occurred in shallow water or on land. The comparison of the December 2004 and March 2005 Sumatra earthquakes presented in this study is analogous to previous studies comparing the 1952 and 2003 Tokachi-Oki earthquakes and tsunamis, in terms of the effect slip distribution has on local tsunamis.
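The tsunami potential-energy measure discussed here is commonly computed as E = ½ ρ g ∫ η² dA over the initial sea-surface displacement field. The sketch below evaluates that integral numerically for a synthetic Gaussian uplift patch; the displacement field and grid are illustrative, not the 2004 or 2005 source models.

```python
# Hedged sketch of a tsunami potential-energy estimate from an initial displacement field.
import numpy as np

rho, g = 1027.0, 9.81                  # seawater density (kg/m^3), gravity
dx = dy = 2000.0                       # grid spacing (m)
x = np.arange(-200e3, 200e3, dx)
y = np.arange(-100e3, 100e3, dy)
X, Y = np.meshgrid(x, y)

eta = 2.0 * np.exp(-((X / 60e3) ** 2 + (Y / 30e3) ** 2))  # assumed initial uplift (m)

E = 0.5 * rho * g * np.sum(eta ** 2) * dx * dy             # potential energy (J)
print(f"tsunami potential energy: {E:.2e} J")
```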
Allan, Tom
2006-01-01
GANDER – for Global Altimeter Network Designed to Evaluate Risk – was an idea that was probably ahead of its time. Conceived at a time when ocean observing satellites were sometimes 10 years in the planning stage, the concept of affordable faster sampling through the use of altimeter-carrying microsats was primarily advanced as a way of detecting and tracking storms at sea on a daily basis. But, of course, a radar altimeter monitors changes in sea-level as well as surface wave height and wind speed. Here then is a system which, flown with more precise missions such as JASON 2, could meet the needs of ocean modellers by providing the greater detail required for tracking mesoscale eddies, whilst servicing forecasting centres and units at sea with near real-time sea state information. A tsunami mode, instantly activated when an undersea earthquake is detected by the global network of seismic stations, could also be incorporated.
A rapid calculation system for tsunami propagation in Japan by using the AQUA-MT/CMT solutions
NASA Astrophysics Data System (ADS)
Nakamura, T.; Suzuki, W.; Yamamoto, N.; Kimura, H.; Takahashi, N.
2017-12-01
We developed a rapid calculation system for geodetic deformation and tsunami propagation in and around Japan. The system automatically conducts forward calculations using point source parameters estimated by the AQUA system (Matsumura et al., 2006), which analyzes the magnitude, hypocenter, and moment tensor of an event occurring in Japan within 3 minutes of the origin time at the earliest. An optimized calculation code developed by Nakamura and Baba (2016) is employed for the calculations on our computer server with 12 Intel Xeon 2.60 GHz processor cores. Assuming homogeneous slip on a single fault plane as the source fault, the developed system calculates geodetic deformation and tsunami propagation by numerically solving the 2D linear long-wave equations at a grid interval of 1 arc-min for two fault orientations simultaneously, i.e., one fault plane and its conjugate. Because fault models based on moment tensor analyses of event data are used, the system appropriately evaluates tsunami propagation even for unexpected events such as normal faulting in the subduction zone, which differs from evaluating tsunami arrivals and heights from a pre-calculated database built with fault models that assume typical types of faulting in anticipated source areas (e.g., Tatehata, 1998; Titov et al., 2005; Yamamoto et al., 2016). Through complete automation from event detection to the output of graphical figures, the calculation results can be made available via e-mail and a web site within 4 minutes of the origin time at the earliest. For moderate-sized events, such as M5 to 6 events, the system helps us rapidly investigate whether tsunami amplitudes at nearshore and offshore stations exceed the noise level, and easily identify actual tsunamis at the stations by comparison with the obtained synthetic waveforms. When source models are derived from GNSS data, such evaluations may be difficult because of the low source resolution resulting from the low signal-to-noise ratio at land stations. For large to huge offshore events, the developed system may be useful for deciding whether to start or stop preparations and precautions against tsunami arrivals, because calculation results, including arrival times and heights of the initial and maximum waves, can be made available rapidly before the waves arrive at coastal areas.
Landslide tsunami hazard in New South Wales, Australia: novel observations from 3D modelling
NASA Astrophysics Data System (ADS)
Power, Hannah; Clarke, Samantha; Hubble, Tom
2015-04-01
This paper examines the potential tsunami inundation generated by two case-study submarine mass failures on the New South Wales coast of Australia. Two submarine mass failure events are investigated: the Bulli Slide and the Shovel Slide. Both slides are located approximately 65 km southeast of Sydney and 60 km east of the township of Wollongong. The Bulli Slide (~20 km3) and the Shovel Slide (7.97 km3) correspond to the two largest identified erosional submarine landslide scars on the NSW continental margin (Glenn et al. 2008; Clarke 2014) and represent examples of large to very large submarine landslide scars. The Shovel Slide is a moderately thick (80-165 m), moderately wide to wide (4.4 km) slide located in 880 m water depth, and the Bulli Slide is an extremely thick (200-425 m), very wide (8.9 km) slide located in 1500 m water depth. Previous work on the east Australian margin (Clarke et al., 2014) and elsewhere (Harbitz et al., 2013) suggests that submarine landslides similar to the Bulli Slide or the Shovel Slide are volumetrically large enough, and occur at shallow enough water depths (400-2500 m), to generate substantial tsunamis that could cause widespread damage on the east Australian coast and threaten coastal communities (Burbidge et al. 2008; Clarke 2014; Talukder and Volker 2014). To date, the tsunamigenic potential of these two slides has only been investigated using 2D modelling (Clarke 2014), and it has been difficult to establish the onshore tsunami surge characteristics of these submarine landslides with certainty. To address this knowledge gap, the forecast inundation resulting from these two mass failure events was investigated using the ANUGA hydrodynamic model, which predicts water flow resulting from natural hazard events such as tsunami (Nielsen et al., 2005). The ANUGA model solves the two-dimensional shallow water wave equations and accurately models the process of wetting and drying, making it well suited to simulating tsunami inundation. The model generates a surface wave profile based on the dimensions of the submarine mass failure event using the method of Ward et al. (2005). Inundation maps are shown for these two slides, and a sensitivity analysis is conducted to identify the slide characteristics that most influence inundation areas and depths.
New Coastal Tsunami Gauges: Application at Augustine Volcano, Cook Inlet, Alaska
NASA Astrophysics Data System (ADS)
Burgy, M.; Bolton, D. K.
2006-12-01
Recent eruptive activity at Augustine Volcano and its associated tsunami threat to lower Cook Inlet pointed out the need for a quickly deployable tsunami detector that could be installed on Augustine Island's coast. The detector's purpose would be to verify tsunami generation by direct observation of the wave at the source, to support tsunami warning decisions along populated coastlines. To fill this need, the Tsunami Mobile Alert Real-Time (TSMART) system was developed at NOAA's West Coast/Alaska Tsunami Warning Center with support from the University of Alaska Tsunami Warning and Environmental Observatory for Alaska (TWEAK) program and the Alaska Volcano Observatory (AVO). The TSMART system consists of a pressure sensor installed as near as possible to the low-tide line. The sensor is enclosed in a water-tight Hypalon bag filled with propylene glycol to protect the sensor from silt damage and freezing. The bag is enclosed in a perforated, strong plastic pipe, about 16 inches long and 8 inches in diameter and capped at both ends, for protection. The sensor is cabled to a data logger/radio/power station up to 300 feet away. Data are transmitted to a base station and made available to the warning center in real time through the internet. This data telemetry system can be incorporated within existing AVO and Plate Boundary Observatory networks, which makes it well suited to volcano-tsunami monitoring. A TSMART network can be used anywhere in the world within 120 miles of an internet connection. At Augustine, two test stations were installed on the east side of the island in August 2006. The sensors were located very near the low-tide limit and covered with rock, and the cable was buried to the data logger station, which was located well above the high-tide mark. The data logger, radio, battery and other electronics are housed in an enclosure mounted on a pole, which also supports an antenna and solar panel. The radio signal is transmitted to a repeater station higher up on the island, which then transmits the data to a base station in Homer, Alaska. Sea-level data values are transmitted every 15 seconds and displayed at the tsunami warning center in Palmer, Alaska.
Tsunami Risk for the Caribbean Coast
NASA Astrophysics Data System (ADS)
Kozelkov, A. S.; Kurkin, A. A.; Pelinovsky, E. N.; Zahibo, N.
2004-12-01
The tsunami problem for the coast of the Caribbean basin is discussed. Historical data on tsunamis in the Caribbean Sea are briefly presented. Numerical simulation of potential tsunamis in the Caribbean Sea is performed in the framework of nonlinear shallow-water theory. The tsunami wave height distribution along the Caribbean coast is computed, and these results are used to estimate the far-field tsunami potential of various coastal locations in the Caribbean Sea. Five low-tsunami-risk zones are identified on the basis of the prognostic computations: the Golfo de Batabano bay and the coast of Ciego de Avila province in Cuba, the Nicaraguan coast (between Bluefields and Puerto Cabezas), the border between Mexico and Belize, and the Golfo de Venezuela bay in Venezuela. The analysis of historical data confirms that no tsunami has been reported in these zones. Wave attenuation in the Caribbean Sea is also investigated; wave amplitude decreases by about an order of magnitude when the tsunami source is located up to 1000 km from the coastal location. Both factors, wave attenuation and wave height distribution, should be taken into account in the planned warning system for the Caribbean Sea.
Predictability of extremes in non-linear hierarchically organized systems
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.; Soloviev, A.
2011-12-01
Understanding the complexity of the non-linear dynamics of hierarchically organized systems is leading to new approaches in assessing the hazard and risk of extreme catastrophic events. In particular, a series of interrelated step-by-step studies of the seismic process, along with its non-stationary though self-organized behavior, has already led to a reproducible intermediate-term, middle-range earthquake forecast/prediction technique that has passed control tests in forward real-time applications during the last two decades. The observed seismic dynamics prior to and after many mega, great, major, and strong earthquakes demonstrate common features of predictability and diverse behavior in the course of durable phase transitions in the complex hierarchical non-linear blocks-and-faults system of the Earth's lithosphere. The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable analytical models, which leads to a widespread practice of their deceptive application. The consequences of underestimating seismic hazard propagate non-linearly into underestimation of risk and, eventually, into unexpected societal losses due to earthquakes and associated phenomena (e.g., collapse of buildings, landslides, tsunamis, liquefaction). The studies aimed at forecast/prediction of extreme events (interpreted as critical transitions) in geophysical and socio-economic systems include: (i) large earthquakes in the geophysical system of the lithosphere's blocks and faults, (ii) starts and ends of economic recessions, (iii) episodes of a sharp increase in the unemployment rate, and (iv) surges of homicides in socio-economic systems. These studies are based on a heuristic search for phenomena preceding critical transitions and on the application of pattern-recognition methodologies for infrequent events. Any study of rare phenomena of highly complex origin, by its nature, implies using problem-oriented methods whose design breaks the limits of classical statistical or econometric applications. The unambiguously designed forecast/prediction algorithms of the "yes or no" variety analyze the observable quantitative integrals and indicators available up to a given date, and then provide an unambiguous answer to the question of whether a critical transition should be expected or not in the next time interval. Since the predictability of the underlying non-linear dynamical system is limited in principle, the probabilistic component of the forecast/prediction algorithms is represented by the empirical probabilities of alarms, on one side, and failures-to-predict, on the other, estimated on control sets in retrospective and prospective experiments. Prediction in advance is the only decisive test of forecasts/predictions, and the relevant ongoing experiments are being conducted for seismic extremes, recessions, and increases of the unemployment rate. The results achieved in real-time testing continue to be encouraging and confirm the predictability of the extremes.
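As an illustration of how such "yes or no" forecasts are typically scored, the sketch below computes the two empirical error rates mentioned above: the failure-to-predict rate and the fraction of time occupied by alarms. The interval layout and event times are invented for the example and are not the authors' data.

```python
from typing import List, Tuple

def score_alarms(alarms: List[Tuple[float, float]],
                 events: List[float],
                 t_total: float) -> Tuple[float, float]:
    """Score a 'yes or no' alarm-based prediction experiment.

    alarms  -- list of (start, end) times during which an alarm was declared
    events  -- times of the extreme events (e.g., large earthquakes)
    t_total -- total duration of the monitored period

    Returns (failure_to_predict_rate, alarm_time_fraction).
    """
    missed = sum(1 for t in events
                 if not any(start <= t <= end for start, end in alarms))
    failure_rate = missed / len(events) if events else 0.0
    alarm_fraction = sum(end - start for start, end in alarms) / t_total
    return failure_rate, alarm_fraction

# Hypothetical experiment: three alarms over a 20-year window, four events.
alarms = [(2.0, 3.5), (8.0, 9.0), (14.0, 16.0)]
events = [2.7, 9.5, 15.1, 19.0]
n, tau = score_alarms(alarms, events, t_total=20.0)
print(f"failure-to-predict rate = {n:.2f}, alarm time fraction = {tau:.2f}")
```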
February 27, 2010 Chilean Tsunami in Pacific and its Arrival to North East Asia
NASA Astrophysics Data System (ADS)
Zaytsev, Andrey; Pelinovsky, Efim; Yalciner, Ahmet C.; Ozer, Ceren; Chernov, Anton; Kostenko, Irina; Shevchenko, Georgy
2010-05-01
The fault plane ruptured by the strong earthquake of February 27, 2010 in Chile, with magnitude 8.8 at 35 km depth and epicenter at 35.909°S, 72.733°W, generated a moderate-size tsunami. The initial amplitude at the tsunami source was not very high because a major part of the fault plane lay beneath land. The tsunami waves propagated over long distances southward and northward, toward the coasts of East Asia and western America. The waves were also recorded by several gauges in the Pacific during their propagation and arrival at coastal areas. The recorded and observed amplitudes are important for assessing where the tsunami reached threatening levels. The event also showed that a moderate-size tsunami can be damaging even after propagating long distances across an ocean or into a marginal sea. The far-eastern coasts of Russia in Northeast Asia (Sakhalin, the Kurils, Kamchatka) are both an important source area (e.g., the November 15, 2006 Kuril Islands tsunami) and a target area (e.g., the February 27, 2010 Chilean tsunami) for Pacific tsunamis. Many efforts have been devoted to establishing monitoring systems, assessing tsunamis, and developing mitigation strategies against tsunamis and other hazards in the region. Advances in computer technologies have improved data collection, transfer, and processing, and have contributed new computational tools, making computer modeling an efficient component of tsunami warning systems. In this study the nested version of the tsunami numerical model NAMI DANCE is used. NAMI DANCE solves the nonlinear form of the long-wave (shallow-water) equations, with or without dispersion, using a finite-difference scheme on nested grid domains from the source to the target areas in a multiprocessor hardware environment. It is applied to the 2010 Chilean tsunami and its propagation and coastal behavior at far distances near the Sakhalin, Kuril and Kamchatka coasts. The main tide gauge records used in this study are from Petropavlovsk (Kamchatka), Severo-Kurilsk (Paramushir), Kurilsk (Iturup, coast of the Sea of Okhotsk), Malokurilskoe (Shikotan), and Korsakov, Kholmsk and Aniva Bay (Sakhalin). These records, together with offshore DART records, are analyzed and used to compare the modeling results with offshore and nearshore observations. The transmission of tsunami waves through the Sakhalin and Kuril straits and their propagation to nearby coasts are investigated, and spectral analysis of the records at settlements on Sakhalin and the Kuril Islands is performed. The performance and capabilities of NAMI DANCE are also presented, together with comparisons between the model, the observations, and the related discussion.
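The spectral analysis mentioned above is commonly done by estimating power spectra of de-tided gauge records. The sketch below is a minimal version using Welch's method; the sampling interval, the synthetic record standing in for a gauge file, and the chosen tsunami band are illustrative assumptions.

```python
import numpy as np
from scipy import signal

# Assumed: a de-tided sea-level record sampled once per minute (illustrative).
dt = 60.0                 # sampling interval in seconds
fs = 1.0 / dt             # sampling frequency in Hz
np.random.seed(0)
t = np.arange(0, 2 * 24 * 3600, dt)
# Synthetic record: a 25-minute oscillation plus noise, standing in for a gauge record.
record = 0.05 * np.sin(2 * np.pi * t / (25 * 60)) + 0.01 * np.random.randn(t.size)

# Welch power spectral density; nperseg sets the frequency resolution.
freqs, psd = signal.welch(record, fs=fs, nperseg=512)

# Report the dominant period in an assumed tsunami band (periods of 2 min to 2 h).
band = (freqs > 1 / (2 * 3600)) & (freqs < 1 / 120)
peak = freqs[band][np.argmax(psd[band])]
print(f"dominant period ~ {1 / peak / 60:.1f} minutes")
```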
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Spazier, J.; Reißland, S.
2014-12-01
Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server infrastructure. Most systems include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) have to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype that opens up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPUs), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype, to support early warning with on-demand tsunami predictions based on actual source parameters. Moreover, the platform is meant to let researchers around the world make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards, and to react to the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes, intended to let users try out the concept and to help shape future developments. Further functionality, improvements and possibly profound changes will have to be implemented successively, based on the users' evolving needs.
The July 17, 2006 Java Tsunami: Tsunami Modeling and the Probable Causes of the Extreme Run-up
NASA Astrophysics Data System (ADS)
Kongko, W.; Schlurmann, T.
2009-04-01
On 17 July 2006, an earthquake of magnitude Mw 7.8 off the south coast of west Java, Indonesia generated a tsunami that affected over 300 km of the south Java coastline and killed more than 600 people. Observed tsunami heights and surveyed run-up distributions were fairly uniform, approximately 5 to 7 m along a 200 km coastal stretch; remarkably, a locally focused tsunami run-up exceeding 20 m was observed at Nusakambangan Island. Within the framework of the German-Indonesian Tsunami Early Warning System (GITEWS) project, a high-resolution nearshore bathymetric survey using a multi-beam echo-sounder was recently conducted. Additional geodata have been collected using the Intermap Technologies STAR-4 airborne interferometric SAR data acquisition system at a 5 m ground sample distance in order to establish a state-of-the-art Digital Terrain Model (DTM). This paper describes the outcome of tsunami modelling approaches using high-resolution bathymetry and topography data for a case study in Cilacap, Indonesia, and medium-resolution data for other areas along the south Java coastline. Using two different seismic deformation models to represent the tsunami source, a numerical code based on the 2D nonlinear shallow-water equations is used to simulate probable tsunami run-up scenarios. Several model tests are performed, and virtual gauge points offshore, nearshore, at the coastline, and for run-up on the coast are recorded. For validation, the model results are compared with field observations and sea-level data observed at several tide gauge stations. The performance of the numerical simulations and their correlation with observed field data are highlighted, and probable causes of the extreme wave heights and run-ups are outlined. References: Ammon, C.J., Kanamori, K., Lay, T., and Velasco, A., 2006. The July 2006 Java Tsunami Earthquake, Geophysical Research Letters, 33(L24308). Fritz, H.M., Kongko, W., Moore, A., McAdoo, B., Goff, J., Harbitz, C., Uslu, B., Kalligeris, N., Suteja, D., Kalsum, K., Titov, V., Gusman, A., Latief, H., Santoso, E., Sujoko, S., Djulkarnaen, D., Sunendar, H., and Synolakis, C., 2007. Extreme Run-up from the 17 July 2006 Java Tsunami. Geophysical Research Letters, 34(L12602). Fujii, Y., and Satake, K., 2006. Source of the July 2006 Java Tsunami Estimated from Tide Gauge Records. Geophysical Research Letters, 33(L23417). Intermap Federal Services Inc., 2007. Digital Terrain Model Cilacap, version 1. Project of GITEWS, DLR Germany. Kongko, W., and Leschka, S., 2008. Nearshore Bathymetry Measurements in Indonesia: Part 1. Cilacap, Technical Report, DHI-WASY GmbH, Syke, Germany. Kongko, W., Suranto, Chaeroni, Aprijanto, Zikra, and Sujantoko, 2006. Rapid Survey on Tsunami Jawa 17 July 2006, http://nctr.pmel.noaa.gov/java20060717/tsunami-java170706_e.pdf. Lavigne, F., Gomes, C., Giffo, M., Wassmer, P., Hoebreck, C., Mardiatno, D., Prioyono, J., and Paris, R., 2007. Field Observation of the 17 July 2006 Tsunami in Java. Natural Hazards and Earth Systems Sciences, 7: 177-183.
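The validation step described above, comparing simulated and observed sea-level records at tide gauge stations, usually reduces to a few simple skill measures. The sketch below computes some common ones for two time-aligned series; the series, the function name, and the choice of measures are illustrative assumptions, not the paper's actual validation procedure.

```python
import numpy as np

def validation_stats(observed: np.ndarray, modeled: np.ndarray) -> dict:
    """Simple skill measures for a modeled vs. observed gauge time series (in metres)."""
    err = modeled - observed
    rmse = float(np.sqrt(np.mean(err ** 2)))
    corr = float(np.corrcoef(observed, modeled)[0, 1])
    amp_ratio = float(np.max(np.abs(modeled)) / np.max(np.abs(observed)))
    return {"rmse_m": rmse, "correlation": corr, "amplitude_ratio": amp_ratio}

# Hypothetical, time-aligned series standing in for a tide gauge comparison.
t = np.linspace(0, 3600, 600)
observed = 0.4 * np.sin(2 * np.pi * t / 1200)
modeled = 0.35 * np.sin(2 * np.pi * (t - 60) / 1200)
print(validation_stats(observed, modeled))
```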
A new real-time tsunami detection algorithm
NASA Astrophysics Data System (ADS)
Chierici, F.; Embriaco, D.; Pignagnoli, L.
2016-12-01
Real-time tsunami detection algorithms play a key role in any tsunami early warning system. We have developed a new algorithm for tsunami detection based on real-time tide removal and real-time band-pass filtering of sea-bed pressure recordings. The algorithm greatly increases the tsunami detection probability, shortens the detection delay and enhances detection reliability, at low computational cost. The algorithm is designed to be usable also in autonomous early warning systems, with a set of input parameters and procedures that can be reconfigured in real time. We have also developed a methodology, based on Monte Carlo simulations, to test tsunami detection algorithms. The algorithm performance is estimated by defining and evaluating statistical parameters, namely the detection probability and the detection delay, which are functions of the tsunami amplitude and wavelength, and the rate of false alarms. Pressure data sets acquired by Bottom Pressure Recorders at different locations and in different environmental conditions have been used in order to consider realistic working scenarios in the tests. We also present an application of the algorithm to the tsunami event that occurred at Haida Gwaii on October 28th, 2012, using data recorded by the Bullseye underwater node of Ocean Networks Canada. The algorithm also ran successfully, for test purposes, during year-long missions onboard the GEOSTAR stand-alone multidisciplinary abyssal observatory, deployed in the Gulf of Cadiz during the EC project NEAREST, and on the NEMO-SN1 cabled observatory deployed in the Western Ionian Sea, an operational node of the European research infrastructure EMSO.
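The abstract does not give the filter design, so the sketch below is only a generic illustration of the two stages it names, tide removal followed by band-pass filtering and threshold detection, applied offline to a synthetic record. The sampling rate, cutoff periods, detrending scheme, and threshold are assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy import signal

fs = 1 / 15.0                       # one sample every 15 s (assumed)
t = np.arange(0, 12 * 3600, 15.0)   # 12 hours of bottom-pressure data, in metres of water

# Synthetic record: tide + small tsunami-like pulse + noise (illustrative only)
tide = 1.0 * np.sin(2 * np.pi * t / (12.42 * 3600))
tsunami = 0.03 * np.exp(-((t - 8 * 3600) / 600.0) ** 2) * np.sin(2 * np.pi * t / 900.0)
record = 4000.0 + tide + tsunami + 0.002 * np.random.randn(t.size)

# Stage 1: tide removal via a low-order polynomial fit
# (a causal real-time predictor would be used operationally; this is the offline analogue).
tn = (t - t.mean()) / t.std()
trend = np.polyval(np.polyfit(tn, record, deg=4), tn)
residual = record - trend

# Stage 2: band-pass in an assumed tsunami band (periods of ~2 min to ~2 h)
low, high = 1 / (2 * 3600), 1 / 120
b, a = signal.butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
filtered = signal.filtfilt(b, a, residual)

# Detection: simple amplitude threshold (assumed value)
threshold = 0.01
alarm = np.abs(filtered) > threshold
if alarm.any():
    print("first detection at t =", t[alarm.argmax()] / 3600, "hours")
```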
Implications Of The 11 March Tohoku Tsunami On Warning Systems And Vertical Evacuation Strategies
NASA Astrophysics Data System (ADS)
Fraser, S.; Leonard, G.; Johnston, D.
2011-12-01
The Mw 9.0 Tohoku earthquake and tsunami of March 11th 2011 claimed over 20,000 lives in an event which inundated over 500 km2 of land on the north-east coast of Japan. Successful execution of tsunami warning procedures and evacuation strategies undoubtedly saved thousands of lives, and there is evidence that vertical evacuation facilities were a key part of reducing the fatality rate in several municipalities in the Sendai Plains. As with all major disasters, however, post-event observations show that there are lessons to be learned in minimising life loss in future events. This event has raised or reinforced several key points that should be considered for implementation in all areas at risk from tsunami around the world. Primary areas for discussion are the need for redundant power supplies in tsunami warning systems; considerations of natural warnings when official warnings may not come; adequate understanding and estimation of the tsunami hazard; thorough site assessments for critical infrastructure, including emergency management facilities and tsunami refuges; and adequate signage of evacuation routes and refuges. This paper will present observations made on two field visits to the Tohoku region during 2011, drawing conclusions from field observations and discussions with local emergency officials. These observations will inform the enhancement of current tsunami evacuation strategies in New Zealand; it is believed discussion of these observations can also benefit continuing development of warning and evacuation strategies existing in the United States and elsewhere.
NASA Astrophysics Data System (ADS)
Gebert, Niklas; Post, Joachim
2010-05-01
The development of early warning systems is one of the key domains of adaptation to global environmental change and contributes greatly to the development of societal reaction and adaptive capacities to deal with extreme events. Indonesia in particular is highly exposed to tsunamis: on average, small and medium-size tsunamis occur in the region every three years, causing damage and death. In the aftermath of the 2004 Indian Ocean tsunami, the German and Indonesian governments agreed on a joint cooperation to develop a people-centered, end-to-end early warning system (GITEWS). The analysis of risk and vulnerability, as an important step in risk (and early warning) governance, is a precondition for the design of effective early warning structures: it delivers the knowledge base for developing institutionalized quick-response mechanisms for the organizations involved in issuing a tsunami warning, and for the exposed population to react to warnings and to manage evacuation before the first tsunami wave hits. A special challenge for developing countries is therefore the governance of the complex cross-sectoral and cross-scale institutional, social and spatial processes and requirements involved in conceptualizing, implementing and optimizing a people-centered tsunami early warning system. In support of this, the risk and vulnerability assessment of the case study aims at identifying the factors that constitute the causal structure of the (dis)functionality between the technological warning system and the social response system that causes loss of life during an emergency: Which social groups are likely to be less able to receive and respond to an early warning alert? And are people able to evacuate in due time? Only an interdisciplinary research approach is capable of analyzing the socio-spatial and environmental conditions of vulnerability and risk and of producing valuable results for decision makers and civil society to manage tsunami risk in the early warning context. This requires the integration of natural/spatial and social science concepts, methods and data. For example, a scenario-based approach to tsunami inundation modeling was developed to provide decision makers with options for deciding up to what level they aim to protect their people and territory; household surveys were conducted for the spatial analysis of the evacuation preparedness of the population as a function of place-specific hazard, risk, warning and evacuation perception; remote sensing was applied for the spatial (land-use) analysis of the socio-physical conditions of a city and region for evacuation; and existing social and population statistics were combined with land-use data for precise spatial mapping of the population exposed to tsunami risk. Only by utilizing such a comprehensive assessment approach can valuable information for risk governance be generated. The results are mapped using GIS and designed according to the specific needs of different end users, such as public authorities involved in the design of warning dissemination strategies, land-use planners (shelter planning, road network configuration) and NGOs mandated to educate the general public about tsunami risk and evacuation behavior.
The case study of the city of Padang, Indonesia (one of the pilot areas of GITEWS) clearly shows that only by intersecting social (vulnerability) and natural-hazards research can a comprehensive picture of tsunami risk be provided, with which risk governance in the early warning context can be conducted in a comprehensive, systemic and sustainable manner.
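As a minimal illustration of the exposure-mapping step described above (combining population statistics with land-use or hazard-zone data), the sketch below overlays a hazard mask on a gridded population field. The grid values and zone shape are invented for the example and are not GITEWS data.

```python
import numpy as np

# Hypothetical 1-km population grid (people per cell) and tsunami inundation mask.
np.random.seed(1)
population = np.random.poisson(lam=300, size=(50, 50)).astype(float)

# Assume the western 10 columns of the grid represent the modeled inundation zone.
inundation_mask = np.zeros_like(population, dtype=bool)
inundation_mask[:, :10] = True

exposed = population[inundation_mask].sum()
share = exposed / population.sum()
print(f"exposed population: {exposed:.0f} ({share:.1%} of total)")
```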
An Integrated Crustal Dynamics Simulator
NASA Astrophysics Data System (ADS)
Xing, H. L.; Mora, P.
2007-12-01
Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term and ongoing effort in finite-element-based computational model and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum-strategy-based finite-element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It provides a virtual laboratory for simulating interacting fault systems, including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large-scale computation of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the Southern California fault model and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.
NASA Astrophysics Data System (ADS)
Bouchard, R. H.; Wang, D.; Branski, F.
2008-05-01
The National Oceanic and Atmospheric Administration (NOAA) operates two tsunami warning centers (TWCs): the West Coast/Alaska Tsunami Warning Center (ATWC) and the Pacific Tsunami Warning Center (PTWC). ATWC provides tsunami alerts to Canadian coastal regions, the Virgin Islands, Puerto Rico, and the coasts of the continental US and Alaska. PTWC provides local/regional tsunami alerts and advisories to the state of Hawaii. As an operational center of the Tsunami Warning System of the Pacific, it provides tsunami alerts to most countries of the Pacific Rim; PTWC also provides tsunami alerts for the Caribbean and Indian Ocean countries on an interim basis. The TWCs aim to issue their first tsunami bulletins within 10-15 minutes of the earthquake for tele-tsunamis and within a few minutes for local tsunamis. The TWCs require offshore tsunami detection in real time with a data latency of 1 minute or less. Offshore detection of tsunamis is the purpose of NOAA's recently completed 39-station array of deep-sea tsunameters. The tsunameters, employing the second-generation DART (Deep-ocean Assessment and Reporting of Tsunamis) technology, can deliver tsunami detection information to the TWCs in less than 3 minutes from depths of 6000 meters in the Pacific and Western Atlantic oceans. Each tsunameter consists of a Bottom Pressure Recorder (BPR) and a surface buoy. Communication from the BPR to the buoy is via underwater acoustic transmission, and satellite communications carry the data from the buoy to NOAA's National Data Buoy Center (NDBC), which operates the tsunameters. The BPRs make pressure measurements, convert them to an equivalent water-column height, and pass them through a tsunami detection algorithm. If the algorithm detects a sufficient change in the height, the tsunameter goes into a rapid-reporting mode, or Event Mode. The acoustic modem and satellite telecommunications path takes approximately 50 seconds to reach the NDBC server. Within a few seconds, NDBC reformats the data and pushes them as messages to the National Weather Service Telecommunications Gateway, also known as the World Meteorological Organization (WMO) Regional Telecommunication Hub (RTH) Washington. RTH Washington can route more than 50 routine messages per second with 99.9 percent reliability of dissemination to all of its users. It provides a latency of 10 seconds or less for high-priority traffic and routinely handles 1.2 TB of information per day. Its switching centers are on the Main Trunk Network of the WMO's Global Telecommunication System (GTS), which provides international distribution of the tsunameter data. The GTS is required to deliver tsunami data and warnings to any connected center anywhere in the world within two minutes. The TWCs receive the tsunameter data from RTH Washington via GTS circuits, or download the data from servers at the RTH in the event the GTS circuits fail. The TWCs display the data in real time in their operations, and when a tsunameter goes into Event Mode the TWCs receive alerts. After subtracting the tide, tsunameter signals can resolve tsunamis as small as a few millimeters. The usefulness of the tsunameter data at the TWCs was demonstrated in several recent events in the Pacific Ocean (the Kuril tsunamis of November 2006 and January 2007, and the Peru tsunamis of August 2007 and September 2007) and the Indian Ocean (the Southern Sumatra tsunami of September 2007).
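The conversion the BPRs perform, from bottom pressure to an equivalent water-column height, is essentially hydrostatic. The sketch below shows that conversion together with a simple deviation-from-prediction check of the kind that could trigger a rapid-reporting mode; the seawater density, threshold value, and sample numbers are illustrative assumptions, not the DART specification.

```python
RHO_SEAWATER = 1025.0   # kg/m^3 (assumed representative value)
G = 9.81                # m/s^2

def pressure_to_height(pressure_pa: float) -> float:
    """Convert absolute bottom pressure (Pa) to an equivalent water-column height (m)."""
    return pressure_pa / (RHO_SEAWATER * G)

def exceeds_event_threshold(measured_pa: float, predicted_height_m: float,
                            threshold_m: float = 0.03) -> bool:
    """Return True if the de-tided height deviation exceeds an (assumed) event threshold."""
    deviation = pressure_to_height(measured_pa) - predicted_height_m
    return abs(deviation) > threshold_m

# Example: ~6000 m of water plus a 4 cm tsunami signal on top of the predicted tide.
predicted = 6000.00
measured_pa = (predicted + 0.04) * RHO_SEAWATER * G
print(exceeds_event_threshold(measured_pa, predicted))   # True
```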
NASA Astrophysics Data System (ADS)
Chacón-Barrantes, Silvia; López-Venegas, Alberto; Sánchez-Escobar, Rónald; Luque-Vergara, Néstor
2018-04-01
Historical records show that tsunamis have affected the Caribbean region in the past. Although infrequent, recent studies have demonstrated that they pose a latent hazard for countries within this basin. The Hazard Assessment Working Group of the ICG/CARIBE-EWS (Intergovernmental Coordination Group of the Early Warning System for Tsunamis and Other Coastal Threats for the Caribbean Sea and Adjacent Regions) of IOC/UNESCO has a modeling subgroup, which seeks to develop a modeling platform to assess the effects of possible tsunami sources within the basin. The CaribeWave tsunami exercise is carried out annually in the Caribbean region to increase awareness and test tsunami preparedness of countries within the basin. In this study we present results of tsunami inundation modeling using the CaribeWave15 exercise scenario for four selected locations within the Caribbean basin (Colombia, Costa Rica, Panamá and Puerto Rico), performed by tsunami modeling researchers from those countries. The purpose of this study was to provide the states with additional results for the exercise. The results obtained here were compared to the co-seismic deformation and tsunami heights within the basin (energy plots) provided for the exercise, to assess the performance of the decision-support tools distributed by the PTWC (Pacific Tsunami Warning Center), the tsunami service provider for the Caribbean basin. However, comparison of coastal tsunami heights was not possible due to inconsistencies between the provided fault parameters and the modeling results within the provided exercise products. Still, the modeling performed here allowed us to analyze tsunami characteristics at the mentioned states from sources within the North Panamá Deformed Belt. A tsunami in the Caribbean may affect several countries because many of them share coastal zones in this basin. Therefore, collaborative efforts similar to the one presented in this study, particularly between neighboring countries, are critical to assess tsunami hazard and increase preparedness within the countries.
Tsunami disaster risk management capabilities in Greece
NASA Astrophysics Data System (ADS)
Marios Karagiannis, Georgios; Synolakis, Costas
2015-04-01
Greece is vulnerable to tsunamis, due to the length of its coastline, its islands and its geographical proximity to the Hellenic Arc, an active subduction zone. Historically, about 10% of all world tsunamis occur in the Mediterranean region. Here we review existing tsunami disaster risk management capabilities in Greece. We analyze capabilities across the disaster management continuum, including prevention, preparedness, response and recovery. Specifically, we focus on issues such as legal requirements, stakeholders, hazard mitigation practices, emergency operations plans, public awareness and education, community-based approaches and early-warning systems. Our research is based on a review of existing literature and official documentation, on previous projects, as well as on interviews with civil protection officials in Greece. In terms of tsunami disaster prevention and hazard mitigation, the lack of tsunami inundation maps, except for some areas in Crete, makes it quite difficult to get public support for hazard mitigation practices. Urban and spatial planning tools in Greece allow the planner to take hazards into account and establish buffer zones near hazard areas. However, the application of such ordinances at the local and regional levels is often difficult. Eminent domain is not supported by law and there are no regulatory provisions regarding tax abatement as a disaster prevention tool. Building codes require buildings and other structures to withstand lateral dynamic earthquake loads, but there are no provisions for resistance to impact loading from waterborne debris. Public education about tsunamis has increased during the last half-decade but remains sporadic. In terms of disaster preparedness, Greece does have a National Tsunami Warning Center (NTWC) and is a member of UNESCO's Tsunami Programme for the North-eastern Atlantic, the Mediterranean and connected seas (NEAM) region. Several exercises have been organized in the framework of the NEAM Tsunami Warning System, with the Greek NTWC actively participating as a Candidate Tsunami Watch Provider. In addition, Greece designed and conducted the first tsunami exercise program in the Union Civil Protection Mechanism in 2011, which also considered the attrition of response capabilities by the earthquake generating the tsunami. These exercises have demonstrated the capability of the Greek NTWC to provide early warning to local civil protection authorities, but warning dissemination to the population remains an issue, especially during the summer season. However, there is no national earthquake or tsunami emergency operations plan, and we found that tsunami disaster planning and preparedness activities are rather limited at the local level. We acknowledge partial support by the project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe) FP7-ENV2013 6.4-3, Grant 603839, to the Technical University of Crete.
NASA Astrophysics Data System (ADS)
Sugimoto, M.
2015-12-01
The 2004 Indian Ocean tsunami killed around 220,000 people and startled the world. North of Chennai (Madras), an Indian nuclear plant was nearly affected by the tsunami in 2004, and the local residents did not receive any warning. "On December 26, the Madras Atomic Power Station looked like a desolate place with no power, no phones, no water, no security arrangement and no hindrance whatsoever for outsiders to enter any part of the plant," said S.P. Udaykumar of SACCER. Nuclear issues were hidden behind such large tsunami damage, and few media outlets reported them outside India. In the US, the San Francisco Chronicle reported on 11 July 2005 that scientists had to rethink nuclear power plants because of the 2004 tsunami, yet few US tsunami scientists paid attention to nuclear power plants nearly affected by tsunamis. The US government, on the other hand, noticed that the Indian plant had nearly been affected in 2004 and supported nuclear disaster management in several countries. The Japanese government concentrated mainly on reconstruction in the affected areas and on the tsunami early warning system. I worked in the Japanese embassy in Jakarta, Indonesia at that time, and I did not receive information about the Indian plant nearly affected by the tsunami or about US support for nuclear safety in other countries. The 2011 Tohoku earthquake and tsunami damaged society and nuclear power stations. The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in the largest release of radioactive material since the 1986 Chernobyl accident. Why did Japanese tsunami scientists not carry the warning signs from the nuclear plant in India during the 2004 Indian Ocean tsunami forward to the 2011 Fukushima accident? In my presentation I would like to clarify why few tsunami scientists noticed this point.
Installation of seafloor cabled seismic and tsunami observation system developed by using ICT
NASA Astrophysics Data System (ADS)
Shinohara, Masanao; Yamada, Tomoaki; Sakai, Shin'ichi; Shiobara, Hajime; Kanazawa, Toshihiko
2017-04-01
A seafloor cabled system is useful for earth science research and disaster mitigation because it enables real-time, long-term observation. Seafloor cabled systems with seismometers and tsunami-meters have therefore been used around Japan for the past 25 years. Because an increase in the number of sensors is needed, a new system with low costs for production, deployment and operation is desirable. In addition, the new system should have sufficient flexibility of measurement after installation. To meet these demands, we started development of a new system that uses Information and Communication Technology (ICT) for data transmission and system control. The new system can be made compact because various measurements are handled in software, and reliability is maintained by a redundant architecture that is easily constructed with ICT. The first system based on this concept was developed as the Ocean Bottom Cabled Seismometer (OBCS) system and deployed in the Japan Sea. Development of the second system started in 2012. The Ocean Bottom Cabled Seismometer and Tsunami-meter (OBCST) system has both seismometers and tsunami-meters, and each observation node has a CPU and FPGAs. The OBCST system uses the standard TCP/IP protocol at 1 Gbps for data transmission, system control and monitoring. IEEE-1588 (PTP) is implemented to synchronize the real-time clocks, with an accuracy better than 300 ns. We developed two types of observation node: one is equipped with a pressure gauge as a tsunami sensor, and the other has an external port for an additional observation sensor using Power over Ethernet (PoE). Deployment of the OBCST system was carried out in September 2015 using a commercial telecommunication cable ship. The noise levels of the OBCST system are comparable to those of the existing cabled system off Sanriku; they are low at frequencies greater than 2 Hz and smaller than 0.1 Hz, and this level of ambient seismic noise is close to the typical system noise. The pressure gauges have a resolution of less than 1 hPa, which corresponds to a change in water height of less than 1 cm, and the data from all the pressure gauges are consistent. Since deployment, the system has been collecting data on the seafloor to the present. The tsunami waves of November 22nd, 2016, generated by a magnitude 7.4 earthquake off Fukushima, were clearly observed by all tsunami sensors in the system.
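As a quick check of the stated equivalence between the 1 hPa pressure resolution and roughly 1 cm of water height, the hydrostatic relation Δh = ΔP/(ρg) can be evaluated directly; the seawater density below is an assumed representative value.

```python
RHO_SEAWATER = 1025.0        # kg/m^3 (assumed)
G = 9.81                     # m/s^2
delta_p = 100.0              # 1 hPa expressed in Pa

delta_h = delta_p / (RHO_SEAWATER * G)
print(f"1 hPa corresponds to about {delta_h * 100:.2f} cm of water")   # ~0.99 cm
```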
Doocy, Shannon; Daniels, Amy; Dick, Anna; Kirsch, Thomas D.
2013-01-01
Introduction. Although rare, tsunamis have the potential to cause considerable loss of life and injury as well as widespread damage to the natural and built environments. The objectives of this review were to describe the impact of tsunamis on human populations in terms of mortality, injury, and displacement and, to the extent possible, identify risk factors associated with these outcomes. This is one of five reviews on the human impact of natural disasters. Methods. Data on the impact of tsunamis were compiled using two methods: a historical review of tsunami events from 1900 to mid-2009 using multiple databases, and a systematic review of the literature published up to October 2012. Analysis included descriptive statistics and bivariate tests for associations between tsunami mortality and event characteristics using STATA 11. Findings. There were 255,195 deaths (range 252,619-275,784) and 48,462 injuries (range 45,466-51,457) as a result of tsunamis from 1900 to 2009. The majority of deaths (89%) and injuries reported during this time period were attributed to a single event, the 2004 Indian Ocean tsunami. Findings from the systematic literature review indicate that the primary cause of tsunami-related mortality is drowning, and that females, children and the elderly are at increased mortality risk. The few studies that reported on tsunami-related injury suggest that males and young adults are at increased injury risk. Conclusions. Early warning systems may help mitigate tsunami-related loss of life. PMID:23857277
Lessons on vulnerability from the 2011 Tohoku earthquake for Indonesia and the United States
NASA Astrophysics Data System (ADS)
Sugimoto, M.; Dengler, L.
2011-12-01
The 2011 Tohoku earthquake and tsunami shocked people working in tsunami disaster risk reduction all over the world, because Tohoku had often been attacked by tsunamis and had been regarded as one of the best-prepared areas in the world for tsunami. The authors have separately promoted community tsunami education, in Indonesia for 7 years after the 2004 Indian Ocean tsunami and in California, US, for 19 years after the 1992 M7.2 Cape Mendocino earthquake. In order to learn lessons from the 2011 Tohoku earthquake and tsunami and feed them back to Indonesia, the US and the international community, we examined some of the factors that contributed to the impacts in Tohoku, based on field reconnaissance and reports from other organizations. The biggest factor exacerbating losses was the underestimation of the tsunami size (a design assumption of about M8 versus the actual M9 event) in prevention structures and evacuation planning, coupled with individuals' perception that they were not at risk. Approximately 86% of the tsunami victims in Unosumai town, Iwate, were in areas outside the mapped tsunami hazard zone. At least 100 designated tsunami evacuation buildings were either overtopped or structurally toppled by the tsunami. More than 200 people died in the first-story gymnasium of an elementary school beside a river and canal, outside the mapped tsunami hazard zone, in Higashi-Matsushima city, Miyagi, and around 80 students died at Okawa Elementary School in Ishinomaki city, Miyagi. Additional factors affecting vulnerability included people who were in safe areas at the time of the earthquake returning to hazard zones to rescue relatives or possessions, and reliance on cars for evacuation. Factors that enhanced resilience include the good performance of most structures under earthquake ground shaking and the performance of the tsunami early warning system in stopping trains and shutting down other critical systems. Although power was out in most of the affected region, some cell phones and car radios worked in many areas and were able to provide some warning guidance. Individuals who were able to improvise and change their evacuation plans and routes may have been more likely to survive. For the US, although the event has not changed the maximum magnitude estimate for an earthquake on the Cascadia subduction zone, it has triggered a re-examination of how slip and secondary fault rupture may affect the size of the tsunami and engendered debate about how to treat uncertainty in model results. It has also raised the priority of FEMA's catastrophic response planning for a great Cascadia earthquake and has invigorated planning, education and outreach efforts by states and local coastal jurisdictions. Indonesia had been preparing for tsunami using Tohoku as a model since the 2004 Indian Ocean tsunami. We stopped a plan to make signboards showing numerical tsunami heights in Padang, Indonesia, because such signboards were not effective in Tohoku this time. We introduce new plans in this presentation.
Impact of earthquake-induced tsunamis on public health
NASA Astrophysics Data System (ADS)
Mavroulis, Spyridon; Mavrouli, Maria; Lekkas, Efthymios; Tsakris, Athanassios
2017-04-01
Tsunamis are caused by rapid sea-floor displacement during earthquakes, landslides and large explosive eruptions in marine settings. Massive amounts of sea water, in the form of devastating surface waves travelling at hundreds of kilometers per hour, have the potential to cause extensive damage to coastal infrastructure, considerable loss of life and injury, and the emergence of infectious diseases (ID). This study involved an extensive and systematic literature review of 50 research publications related to the public health impact of the three most devastating earthquake-induced tsunamis of the last 12 years, namely those of the 2004 Sumatra-Andaman earthquake (moment magnitude Mw 9.2), the 2009 Samoa earthquake (Mw 8.1) and the 2011 Tōhoku (Japan) earthquake (Mw 9.0), in the Indian, Western Pacific and South Pacific Oceans respectively. The inclusion criteria were literature type comprising journal articles and official reports, natural disaster type including only earthquake-induced tsunamis, population type including humans, and an outcome measure characterized by an increase in disease incidence. The potential post-tsunami ID are classified into 11 groups including respiratory, pulmonary, wound-related, water-borne, skin, vector-borne, eye, fecal-oral, food-borne, fungal and mite-borne ID. Respiratory infections were detected after all of the above-mentioned tsunamis. Wound-related, skin and water-borne ID were observed after the 2004 and 2011 tsunamis, while vector-borne, fecal-oral and eye ID were observed only after the 2004 tsunami, and pulmonary, food-borne and mite-borne ID were diagnosed only after the 2011 tsunami. Based on available age and gender data, it is concluded that the most vulnerable population groups are males, children (age ≤ 15 years) and adults aged ≥ 65 years. Tetanus and pneumonia are the deadliest post-tsunami ID. The detected risk factors include (1) low socioeconomic conditions, poorly constructed buildings and lack of prevention measures, (2) lack of awareness and prior warning resulting in little time for preparedness or evacuation, (3) severely injured tsunami survivors exposed to high pathogen densities in soil and water, (4) destruction of critical infrastructure, including health care systems, causing delayed management and treatment of severe cases, (5) aggravating post-tsunami weather conditions, (6) formation of extensive potential vector breeding sites due to flooding, (7) overcrowded conditions in evacuation shelters characterized by small spaces, inadequate air ventilation, poor hand hygiene and dysfunction of the public health system, (8) low vaccination coverage, (9) poor personal hygiene, (10) minimal precautions against food contamination and (11) the dependency of young children, and the weaker physical strength and resilience of elders needing assistance with daily activities. In conclusion, our study reviewed potential ID following tsunamis induced by great earthquakes during the last 12 years. The establishment of strong disaster preparedness plans characterized by adequate environmental planning, resistant infrastructure and resilient health facilities is important for the early detection, surveillance and control of emerging ID. Moreover, the establishment and continuous operation of reliable early warning systems may help mitigate tsunami-related impacts on public health.
A comparison between two inundation models for the 25 October 2010 Mentawai Islands Tsunami
NASA Astrophysics Data System (ADS)
Huang, Z.; Borrero, J. C.; Qiu, Q.; Hill, E. M.; Li, L.; Sieh, K. E.
2011-12-01
On 25 October 2010, an Mw~7.8 earthquake occurred on the Sumatra megathrust seaward of the Mentawai Islands, Indonesia, generating a tsunami which killed approximately 500 people. Following the event, the Earth Observatory of Singapore (EOS) initiated a post-tsunami field survey, collecting tsunami run-up data from more than 30 sites on Pagai Selatan, Pagai Utara and Sipora. The strongest tsunami effects were observed on several small islands offshore of Pagai Selatan, where runup exceeded 16 m. This presentation will focus on a detailed comparison between two tsunami propagation and inundation models: COMCOT (Cornell Multi-grid Coupled Tsunami model) and MOST (Method of Splitting Tsunami). Simulations are initialized using fault models based on data from a 1-hz GPS system that measured co-seismic deformation throughout the region. Preliminary simulations suggest that 2-m vertical seafloor deformation over a reasonably large area is required to recreate most of the observed tsunami effects. Since the GPS data suggest that subsidence of the islands is small, this implies that the tsunami source region is somewhat narrower and located further offshore than described in recently published earthquake source models based on teleseismic inversions alone. We will also discuss issues such as bathymetric and topographic data preparation and the uncertainty in the modeling results due to the lack of high resolution bathymetry and topography in the study area.
Modeling Tsunami Wave Generation Using a Two-layer Granular Landslide Model
NASA Astrophysics Data System (ADS)
Ma, G.; Kirby, J. T., Jr.; Shi, F.; Grilli, S. T.; Hsu, T. J.
2016-12-01
Tsunamis can be generated by subaerial or submarine landslides in reservoirs, lakes, fjords, bays and oceans. Compared to seismogenic tsunamis, landslide or submarine mass failure (SMF) tsunamis are normally characterized by relatively shorter wavelengths and stronger wave dispersion, and may potentially generate large wave amplitudes locally and high run-up along adjacent coastlines. Due to the complex interplay between the landslide and the tsunami waves, accurate simulation of landslide motion as well as tsunami generation is a challenging task. We develop and test a new two-layer model for granular landslide motion and tsunami wave generation. The landslide is described as a saturated granular flow, accounting for intergranular stresses governed by Coulomb friction. Tsunami wave generation is simulated by the three-dimensional non-hydrostatic wave model NHWAVE, which is capable of capturing wave dispersion efficiently using a small number of discretized vertical levels. Depth-averaged governing equations for the granular landslide are derived in a slope-oriented coordinate system, taking into account the dynamic interaction between the lower-layer granular landslide and the upper-layer water motion. The model is tested against laboratory experiments on impulsive wave generation by subaerial granular landslides. Model results illustrate a complex interplay between the granular landslide and tsunami waves, and they reasonably predict not only the tsunami wave generation but also the granular landslide motion from initiation to deposition.
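As a pointer to the kind of closure such a depth-averaged granular model includes, the sketch below evaluates a Coulomb basal friction resistance for a granular layer on a slope and compares it with the gravitational driving stress. The density, thickness, slope and friction angle are illustrative values, and the expression is the generic Coulomb closure rather than the specific formulation of the model described above.

```python
import numpy as np

def coulomb_basal_resistance(rho: float, h: float, slope_deg: float,
                             phi_deg: float, g: float = 9.81) -> float:
    """Coulomb basal shear resistance (Pa) of a granular layer of thickness h:
    tau = rho * g * h * cos(theta) * tan(phi), with theta the bed slope and
    phi the basal friction angle."""
    theta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    return rho * g * h * np.cos(theta) * np.tan(phi)

def driving_stress(rho: float, h: float, slope_deg: float, g: float = 9.81) -> float:
    """Gravitational driving shear stress (Pa) along the slope."""
    return rho * g * h * np.sin(np.radians(slope_deg))

# Illustrative values: a 20 m thick saturated granular layer on a 30 degree slope.
rho, h, slope, phi = 1900.0, 20.0, 30.0, 25.0
print("driving stress   :", driving_stress(rho, h, slope))
print("Coulomb resistance:", coulomb_basal_resistance(rho, h, slope, phi))
# The layer accelerates while the driving stress exceeds the Coulomb resistance.
```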
NASA Astrophysics Data System (ADS)
Anisya; Yoga Swara, Ganda
2017-12-01
Padang is one of the cities prone to earthquake and tsunami disasters because of its position near the meeting of two active plates, that is, a source of potentially powerful earthquakes and tsunamis. The central government offices and most other offices are located in the red zone (vulnerable areas), which will also affect the evacuation of the population during an earthquake and tsunami disaster. In this study, the researchers produced a nearest-shelter search system using the best-first-search method. The method uses a heuristic function combining the cost incurred so far with an estimate based on travel time, path length and population density. To calculate the path length, the researchers used the haversine formula. The values obtained from the calculation process are implemented in a web-based system, which displays several alternative paths and some of the closest shelters.
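A minimal sketch of the two ingredients named above, the haversine distance and a greedy best-first search over a road graph, is given below. The graph, coordinates, and the distance-only heuristic are invented for the example and are not the system described in the paper, which also weights travel time and population density.

```python
import heapq
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    R = 6371.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def best_first_shelter(graph, coords, start, shelters):
    """Greedy best-first search: always expand the node whose straight-line
    (haversine) distance to the nearest shelter is smallest, until a shelter is reached."""
    def h(node):
        return min(haversine_km(*coords[node], *coords[s]) for s in shelters)

    frontier = [(h(start), start, [start])]
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node in shelters:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(frontier, (h(nxt), nxt, path + [nxt]))
    return None

# Hypothetical road graph with Padang-like coordinates (illustrative only).
coords = {"A": (-0.950, 100.360), "B": (-0.945, 100.370),
          "C": (-0.940, 100.380), "S1": (-0.935, 100.390)}
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B", "S1"], "S1": ["C"]}
print(best_first_shelter(graph, coords, start="A", shelters={"S1"}))
```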
DOE Office of Scientific and Technical Information (OSTI.GOV)
Idris, Nasrullah, E-mail: nasrullah.idris@unsyiah.ac.id; Ramli, Muliadi; Hedwig, Rinda
This work is intended to assess the capability of LIBS for detecting tsunami sediment contamination in soil. The LIBS apparatus used in this work consists of a laser system and an optical multichannel analyzer (OMA) system. The soil sample was collected in Banda Aceh City, Aceh, Indonesia, the region most affected by the giant 2004 Indian Ocean tsunami. The laser beam was focused onto the surface of a soil pellet using a focusing lens to produce a luminous plasma. The experiment was conducted in air at 1 atmosphere as the surrounding gas. The emission spectral lines from the plasma were detected by the OMA system. It was found that metals, including heavy metals, can readily be detected, implying the potential of the LIBS technique as a fast screening tool for tsunami sediment contamination.
Preliminary tsunami hazard assessment in British Columbia, Canada
NASA Astrophysics Data System (ADS)
Insua, T. L.; Grilli, A. R.; Grilli, S. T.; Shelby, M. R.; Wang, K.; Gao, D.; Cherniawsky, J. Y.; Harris, J. C.; Heesemann, M.; McLean, S.; Moran, K.
2015-12-01
Ocean Networks Canada (ONC), a not-for-profit initiative of the University of Victoria that operates several cabled ocean observatories, is developing a new generation of ocean observing systems (referred to as Smart Ocean Systems™), involving advanced undersea observation technologies, data networks and analytics. The ONC Tsunami project is a Smart Ocean Systems™ project that addresses the need for a near-field tsunami detection system for the coastal areas of British Columbia. Recent studies indicate that there is a 40-80% probability over the next 50 years of a significant tsunami impacting the British Columbia (BC) coast with runups higher than 1.5 m. The NEPTUNE cabled ocean observatory, operated by ONC off the west coast of British Columbia, could be used to detect near-field tsunami events with existing instrumentation, including seismometers and bottom pressure recorders. As part of this project, new tsunami simulations are underway for the BC coast. Tsunami propagation is being simulated with the FUNWAVE-TVD model for a suite of new source models representing Cascadia megathrust rupture scenarios. Simulations are performed by one-way coupling in a series of nested model grids (from the source to the BC coast), whose bathymetry was developed from digital elevation models (DEMs) of the area, to estimate both tsunami arrival time and coastal runup/inundation at different locations. Besides inundation, maps of additional parameters such as maximum current are being developed, which will aid in tsunami hazard assessment and risk mitigation as well as in developing evacuation plans. We will present initial results of this work for the Port Alberni inlet, in particular Ucluelet, based on new source models developed using the best available data. We will also present a model validation using measurements of the 2011 transpacific Tohoku-oki tsunami recorded in coastal BC by several instruments from various US and Canadian agencies.
NASA Astrophysics Data System (ADS)
Beranzoli, Laura; Best, Mairi; Chierici, Francesco; Embriaco, Davide; Galbraith, Nan; Heeseman, Martin; Kelley, Deborah; Pirenne, Benoit; Scofield, Oscar; Weller, Robert
2015-04-01
There is a need for tsunami modeling and early warning systems for near-source areas. This is a common public safety threat, for example, in the Mediterranean and along the Juan de Fuca/NE Pacific coast of North America, regions covered by the EMSO, OOI, and ONC ocean observatories. Through the CoopEUS international cooperation project, a number of environmental research infrastructures have come together to coordinate efforts on environmental challenges; this tsunami case study tackles one such challenge. There is a mutual need for tsunami event field data and modeling to deepen our experience in testing methodology and developing real-time data processing. Tsunami field data are already available for past events; part of this use case compares these for compatibility, gap analysis, and model groundtruthing. It also reviews the sensors needed and harmonizes instrument settings. Sensor metadata and registries are compared, harmonized, and aligned. Data policies and access are also compared and assessed for gaps. Modelling algorithms are compared and tested against archived and real-time data. This case study will then be extended to other related tsunami data and model sources globally with similar geographic and seismic scenarios.
NASA Astrophysics Data System (ADS)
Lavigne, Franck; Grancher, Delphine; Goeldner-Gianella, Lydie; Karanci, Nuray; Dogulu, Nilay; Kanoglu, Utku; Zaniboni, Filippo; Tinti, Stefano; Papageorgiou, Antonia; Papadopoulos, Gerassimos; Constantin, Angela; Moldovan, Iren; El Mouraouah, Azelarab; Benchekroun, Sabah; Birouk, Abdelouahad
2016-04-01
Understanding social vulnerability to tsunamis provides risk managers with the information required to determine whether individuals have the capacity to evacuate, and therefore to take mitigation measures to protect their communities. In the frame of the EU programme ASTARTE (Assessment, STrategy And Risk reduction for Tsunamis in Europe), we conducted a questionnaire-based survey among 1,661 people of 41 nationalities living in, working in, or visiting 10 test sites in 9 different countries. The questions, which were translated into 11 languages, focused on tsunami hazard awareness, risk perception, and knowledge of the existing warning systems. Our results confirm our initial hypothesis that little attention is paid in Europe to tsunami risk. Among all types of hazards, natural or not, tsunamis rank first at only one site (Lyngen fjord in Norway), third at 3 other sites (Eforie Nord in Romania, Nice and Istanbul), 4th in Gulluk Bay, 5th in Sines and Heraklion, and 10th in Siracusa (Sicily) and San Jordi (Balearic Islands). Whatever the respondent's status (i.e. local population, local authorities, or tourists), earthquakes and drawdown of the sea are cited as tsunami warning signs by 43% and 39% of the respondents, respectively. Self-evacuation is therefore not expected for more than half of the population. Considering that most European countries have no early warning system for tsunamis, a disaster is likely to happen in any coastal area exposed to this specific hazard. Furthermore, knowledge of past tsunami events is also very limited: only 22% of people stated that a tsunami has occurred in the past, whereas a deadly tsunami occurs every century in the Mediterranean Sea (e.g. in AD 365, 1660, 1672 or 1956 in the eastern part, and 1908, 1979 or 2003 in the western part), and high tsunami waves devastated the Portuguese and Moroccan coasts in 1755. Despite this lack of knowledge and awareness of past events, 62% of the respondents think that the site of the interview could be affected by a tsunami in the future. Respondents were strongly influenced by the images of the catastrophic tsunamis of 2004 and 2011, leading them to expect local wave heights of more than 10-15 m, even in low-exposure areas such as Nice or the Balearic Islands. Such overestimation of wave heights could lead to confusion during an evacuation. This European-scale survey underlines the need for better mitigation strategies, including but not limited to informing residents, local workers and tourists at each site about: (1) the reality of the tsunami risk; (2) the maximum wave height modelled for the worst case; and (3) where to evacuate in case of a future tsunami. Key words: tsunami, coastal risk, hazard knowledge, risk perception, vulnerability, resilience, evacuation, Europe
Introduction to the High-Rate GPS Network of Puerto Rico and the U.S. Virgin Islands
NASA Astrophysics Data System (ADS)
Wang, G.; Hillebrandt, C. V.; Martinez, J. M.; Huerfano, V.; Schellekens, J.
2008-12-01
The Puerto Rico Seismic Network (PRSN) at the University of Puerto Rico at Mayagüez is a regional earthquake and tsunami monitoring institute. One of its primary objectives is to provide timely and reliable earthquake and tsunami information and warnings to the state (Puerto Rico) and local governments, the US and British Virgin Islands, and the general public. In the past five years, it has been expanding its operations toward the establishment of a Caribbean Tsunami Warning Center. With funding from the Puerto Rico government and NOAA, it operates 24 hours per day, 7 days per week. Broadband seismometers are generally unable to capture the full bandwidth of long-period ground motions following very large earthquakes. As a result, it is difficult to rapidly estimate the true magnitudes of large earthquakes using only seismic data. High-rate GPS has proven to be a very useful tool for recording long-period and permanent earthquake ground motions. The true magnitude (and therefore tsunami potential) of large earthquakes may be determined more accurately and in a timely manner (minutes after the quake) using high-rate GPS observations. With the major aim of improving the ability of the PRSN to rapidly and precisely monitor large earthquakes, NSF funded a Major Research Instrumentation (MRI) project, Acquisition of 9 High-rate GPS Units for Developing a Broadband Earthquake Observation System in Puerto Rico and the U.S. Virgin Islands (EAR-0722540, August 1, 2007-July 31, 2009). The major purpose of this project is to build a high-rate GPS network in Puerto Rico and the U.S. Virgin Islands. The network includes 3 campaign and 6 permanent GPS stations. The campaign stations are designed to be used in emergency response after large earthquakes to measure co-seismic and post-seismic displacement. The six permanent stations are designed to complement the current seismic observation system of Puerto Rico and the U.S. Virgin Islands. We installed three permanent GPS stations in May 2008, located at the Arecibo Observatory, Bayamon Science Park, and Caja de Muertos Island. We will install the other three stations in October 2008, on Mona, Culebra, and St. Thomas islands. All of these permanent GPS stations are colocated with seismic stations operated by the Puerto Rico Seismic Network and the Puerto Rico Strong Motion Program. They are also located very close to the tide gauge stations operated by PRSN and NOAA, and will therefore also complement the tide gauge sea-level observation system to obtain accurate absolute sea-level changes after large earthquakes. The integrated velocimeter-accelerometer-GPS earthquake observation system will advance knowledge of seismic wave propagation, the kinematics and dynamics of the fault rupture process, and pre-seismic, co-seismic and post-seismic deformation, and is also likely to be useful for improving building and critical structure designs. It will support earthquake and tsunami hazards research and mitigation in Puerto Rico and the surrounding region. High-rate GPS observations can also be used for real-time tropospheric water vapor tomography, which is useful for weather prediction, including improved hurricane track forecasting. Raw GPS data are freely available through the UNAVCO archive. As a result, a large number of researchers can potentially benefit from the data for research and applications ranging from neotectonics to atmospheric science to civil engineering.
Prehospital care of tsunami victims in Thailand: description and analysis.
Schwartz, Dagan; Goldberg, Avishay; Ashkenasi, Issac; Nakash, Guy; Pelts, Rami; Leiba, Adi; Levi, Yeheskel; Bar-Dayan, Yaron
2006-01-01
On 26 December 2004 at 09:00 h, an earthquake of magnitude 9.0 (Richter scale) struck the area off the western coast of northern Sumatra, Indonesia, triggering a tsunami. As of 25 January 2005, 5,388 fatalities were confirmed, 3,120 people were reported missing, and 8,457 people were wounded in Thailand alone. Little information is available in the medical literature regarding the response and restructuring of the prehospital healthcare system in dealing with major natural disasters. The objective of the study was to analyze the prehospital medical response to the tsunami in Thailand, and to identify possible ways of improving future preparedness and response. The Israeli Defense Forces (IDF) Home Front Command Medical Department sent a research delegation to study the response of the Thai medical system to the 2004 earthquake and tsunami disaster. The delegation met with Thai healthcare and military personnel who provided medical care for and evacuated the tsunami victims. The research instruments included questionnaires (open and closed questions), interviews, and a review of reports from debriefing sessions held in the days following the tsunami. Beginning the day after the event, primary health care in the affected provinces was expanded and extended. This included: (1) strengthening existing primary care facilities with personnel and equipment; (2) enhancing communication and transportation capabilities; (3) erecting healthcare facilities in newly constructed evacuation centers; (4) deploying mobile medical teams to make house calls to flood refugees in affected areas; and (5) deploying ambulance crews to the affected areas to search for survivors and provide primary care triage and transportation. The restructuring of the prehospital healthcare system was crucial for optimal management of the healthcare needs of tsunami victims and for reducing the patient load on secondary medical facilities. The disaster plan of a national healthcare system should include special consideration for the restructuring and reinforcement of the prehospital system.
Tsunami Amplitude Estimation from Real-Time GNSS.
NASA Astrophysics Data System (ADS)
Jeffries, C.; MacInnes, B. T.; Melbourne, T. I.
2017-12-01
Tsunami early warning systems currently comprise modeling of observations from the global seismic network, deep-ocean DART buoys, and a global distribution of tide gauges. While these tools work well for tsunamis traveling teleseismic distances, saturation of seismic magnitude estimation in the near field can result in significant underestimation of tsunami excitation for local warning. Moreover, DART buoy and tide gauge observations cannot be used to rectify the underestimation in the available time, typically 10-20 minutes, before local runup occurs. Real-time GNSS measurements of coseismic offsets may be used to estimate finite faulting within 1-2 minutes and, in turn, tsunami excitation for local warning purposes. We describe here a tsunami amplitude estimation algorithm, implemented for the Cascadia subduction zone, that uses continuous GNSS position streams to estimate finite faulting. The system is based on a time-domain convolution of fault slip that uses a pre-computed catalog of hydrodynamic Green's functions generated with the GeoClaw shallow-water wave simulation software and maps seismic slip along each section of the fault to points located off the Cascadia coast in 20 m of water depth, relying on the principle of linearity in tsunami wave propagation. The system draws continuous slip estimates from a message broker and convolves the slip with the appropriate Green's functions, which are then superimposed to produce the wave amplitude at each coastal location. The maximum amplitude and its arrival time are then passed into a database for subsequent monitoring and display. We plan on testing this system using a suite of synthetic earthquakes calculated for Cascadia whose ground motions are simulated at 500 existing Cascadia GPS sites, as well as real earthquakes for which we have continuous GNSS time series and surveyed runup heights, including Maule, Chile 2010 and Tohoku, Japan 2011. This system has been implemented in the CWU Geodesy Lab for the Cascadia subduction zone but will be expanded to the circum-Pacific as real-time processing of international GNSS data streams becomes available.
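A minimal sketch of the convolution-and-superposition idea described here is given below. The array shapes, sampling, and random placeholder kernels are assumptions for illustration; they stand in for the pre-computed GeoClaw Green's functions and the operational message-broker plumbing.

import numpy as np

# Sketch of linear superposition of pre-computed tsunami Green's functions.
# Shapes, sampling and the random kernels are placeholders (assumptions).
# green[i, j, :]: sea-surface response (m) at coastal point j to 1 m of slip
# on subfault i, sampled every dt seconds.
n_sub, n_coast, n_t = 4, 3, 720
dt = 10.0  # seconds
rng = np.random.default_rng(0)
green = rng.normal(0.0, 0.01, size=(n_sub, n_coast, n_t))

def forecast_amplitude(slip_rate, green, dt):
    # Convolve each subfault's slip-rate history (m/s) with its Green's
    # functions and superimpose (linearity of tsunami propagation).
    n_sub, n_coast, n_t = green.shape
    eta = np.zeros((n_coast, n_t))
    for i in range(n_sub):
        for j in range(n_coast):
            eta[j] += np.convolve(slip_rate[i], green[i, j])[:n_t] * dt
    return eta

# Hypothetical event: 1 m of slip released uniformly over 60 s on every subfault.
slip_rate = np.zeros((n_sub, n_t))
slip_rate[:, :6] = 1.0 / 60.0

eta = forecast_amplitude(slip_rate, green, dt)
peak = eta.max(axis=1)            # maximum amplitude at each coastal point (m)
t_peak = eta.argmax(axis=1) * dt  # and its arrival time (s)
print(peak, t_peak)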
Spatial Modeling of Tsunami Impact in Manado City using Geographic Information System
NASA Astrophysics Data System (ADS)
Kumaat, J. C.; Kandoli, S. T. B.; Laeloma, F.
2018-02-01
Manado City is a coastal area in the shape of a bay. Manado Bay is a body of water that protrudes into the area of Manado City, and this region is likely to face a tsunami threat. Manado Bay receives several rivers, such as the Tondano River, and has a geological history involving both land and sea. There are several active faults, including offshore faults, the subduction of a subplate north of the island, the Mayu mountain plate, and the Sangihe plate east of North Sulawesi. The purpose of this study is divided into two parts. The general purpose is to describe GIS-based disaster mitigation that can be carried out to minimize disaster risk if a tsunami occurs in the coastal area of Manado Bay. The specific purpose consists of three parts: 1. mapping the tsunami vulnerability zones of Manado Bay; 2. mapping the distance and time of the Manado Bay tsunami evacuation-route scenario; 3. mapping the number of buildings and roads exposed to a Manado Bay tsunami. Data collection relies on secondary data from related institutions, libraries, or individual archives, supplemented by direct observation using a checklist to verify the secondary data, followed by the determination of coordinate points with the Global Positioning System (GPS) at selected tsunami locations.
Parallelization of the Coupled Earthquake Model
NASA Technical Reports Server (NTRS)
Block, Gary; Li, P. Peggy; Song, Yuhe T.
2007-01-01
This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis over the Internet had not been done before. The new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.
A new physics-based modeling approach for tsunami-ionosphere coupling
NASA Astrophysics Data System (ADS)
Meng, X.; Komjathy, A.; Verkhoglyadova, O. P.; Yang, Y.-M.; Deng, Y.; Mannucci, A. J.
2015-06-01
Tsunamis can generate gravity waves propagating upward through the atmosphere, inducing total electron content (TEC) disturbances in the ionosphere. To capture this process, we have implemented tsunami-generated gravity waves into the Global Ionosphere-Thermosphere Model (GITM) to construct a three-dimensional physics-based model, WP (Wave Perturbation)-GITM. WP-GITM takes tsunami wave properties, including the wave height, wave period, wavelength, and propagation direction, as inputs and time-dependently characterizes the responses of the upper atmosphere between 100 and 600 km altitude. We apply WP-GITM to simulate the ionosphere above the West Coast of the United States around the time when the tsunami associated with the March 2011 Tohoku-Oki earthquake arrived. The simulated TEC perturbations agree with Global Positioning System observations reasonably well. For the first time, a fully self-consistent and physics-based model has reproduced the GPS-observed traveling ionospheric signatures of an actual tsunami event.
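The last step of the chain described here, integrating a perturbed electron density along a satellite-receiver line of sight to obtain slant TEC, can be sketched as follows. The Chapman-like background profile, the assumed wave-like perturbation, and the straight flat-Earth ray are placeholders, not WP-GITM physics.

import numpy as np

# Integrate a perturbed electron density along a straight satellite-receiver
# ray between 100 and 600 km altitude to obtain slant TEC (in TECU). The
# background profile, wave perturbation and flat-Earth ray are assumptions.
def electron_density(alt_km):
    # Background Chapman-like profile (electrons/m^3), peak near 300 km.
    z = (alt_km - 300.0) / 75.0
    return 1.0e12 * np.exp(1.0 - z - np.exp(-z))

def perturbed_density(alt_km, horiz_km, t):
    # Background plus a gravity-wave-like fractional perturbation (assumed form).
    wave = 0.02 * np.sin(2 * np.pi * (horiz_km / 200.0 - t / 900.0))
    return electron_density(alt_km) * (1.0 + wave)

def slant_tec(elev_deg, t, n_steps=500):
    elev = np.deg2rad(elev_deg)
    alts = np.linspace(100.0, 600.0, n_steps)
    path = (alts - 100.0) / np.sin(elev)    # distance along the ray (km)
    horiz = path * np.cos(elev)             # horizontal offset of each point (km)
    dens = perturbed_density(alts, horiz, t)
    ds = np.gradient(path) * 1e3            # path element (m)
    return np.sum(dens * ds) / 1e16         # 1 TECU = 1e16 electrons/m^2

print("slant TEC at t=0 s: %.2f TECU, at t=450 s: %.2f TECU"
      % (slant_tec(45.0, 0.0), slant_tec(45.0, 450.0)))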
Use of SCALE Continuous-Energy Monte Carlo Tools for Eigenvalue Sensitivity Coefficient Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2013-01-01
The TSUNAMI code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The CLUTCH and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE KENO framework to generate the capability for TSUNAMI-3D to perform eigenvalue sensitivity calculations in continuous-energy applications. This work explores the improvements in accuracy that can be gained in eigenvalue and eigenvalue sensitivity calculations through the use of the SCALE CE KENO and CE TSUNAMI continuous-energy Monte Carlo tools as compared to multigroup tools. The CE KENO and CE TSUNAMI tools were used to analyze two difficult models of critical benchmarks, and produced eigenvalue and eigenvalue sensitivity coefficient results that showed a marked improvement in accuracy. The CLUTCH sensitivity method in particular excelled in terms of efficiency and computational memory requirements.
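The way eigenvalue sensitivity coefficients are used to quantify data-induced uncertainty (and system similarity) can be illustrated with the standard first-order "sandwich" rule; the snippet below is a generic sketch with placeholder numbers, not SCALE/TSUNAMI code or output.

import numpy as np

# First-order ("sandwich") propagation of a nuclear-data covariance through
# eigenvalue sensitivity coefficients. All numbers are placeholders.
S = np.array([0.35, -0.12, 0.08])            # relative sensitivities d(k)/k per d(sigma)/sigma
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])     # relative covariance of the data

var_k = S @ C @ S                            # relative variance of k-effective
print("relative k uncertainty: %.3e" % np.sqrt(var_k))

# Data-induced similarity between two systems (in the spirit of a correlation
# coefficient between their sensitivity profiles; hypothetical second system).
S2 = np.array([0.30, -0.10, 0.05])
ck = (S @ C @ S2) / np.sqrt((S @ C @ S) * (S2 @ C @ S2))
print("similarity index: %.3f" % ck)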
NASA Astrophysics Data System (ADS)
Rakoto, V.; Lognonne, P. H.; Rolland, L. M.
2015-12-01
Large earthquakes (i.e., M>6) and the associated tsunamis are responsible for ionospheric perturbations. These perturbations can be observed in the total electron content (TEC) measured from multi-frequency Global Navigation Satellite System (GNSS) data (e.g., GPS). We focus on the Haida Gwaii earthquake and tsunami, which occurred on 28 October 2012 along the Queen Charlotte fault off the western coast of Canada. First, we compare the GPS-derived TEC perturbation data with our model. We model the TEC perturbation in several steps: (1) we compute the tsunami normal modes in the atmosphere using the PREM model with a 4.7 km oceanic layer; (2) we sum all the tsunami modes to obtain the neutral displacement; (3) we couple the ionosphere with the neutral atmosphere; (4) we integrate the perturbed electron density along each satellite-station line of sight. Finally, we present first results of a TEC inversion to retrieve the waveform of the tsunami. This inversion has been performed on synthetic data, assuming that the Queen Charlotte earthquake and tsunami can be treated as a point source in the far field.
Samarakoon, M B; Tanaka, Norio; Iimura, Kosuke
2013-01-15
Coastal vegetation can play a significant role in reducing the severity of a tsunami because the energy associated with the tsunami is dissipated when it passes through coastal vegetation. Field surveys were conducted on the eastern coastline of Sri Lanka to investigate which vegetation species are effective against a tsunami and to evaluate the effectiveness of existing Casuarina equisetifolia forests in tsunami mitigation. Open gaps in C. equisetifolia forests were identified as a disadvantage, and introduction of a new vegetation belt in front or back of the existing C. equisetifolia forest is proposed to reduce the disadvantages of the open gap. Among the many plant species encountered during the field survey, ten species were selected as effective for tsunami disaster mitigation. The selection of appropriate vegetation for the front or back vegetation layer was based on the vegetation thickness per unit area (dN(u)) and breaking moment of each species. A numerical model based on two-dimensional nonlinear long-wave equations was applied to explain the present situation of open gaps in C. equisetifolia forests, and to evaluate the effectiveness of combined vegetation systems. The results of the numerical simulation for existing conditions of C. equisetifolia forests revealed that the tsunami force ratio (R = tsunami force with vegetation/tsunami force without vegetation) was 1.4 at the gap exit. The species selected for the front and back vegetation layers were Pandanus odoratissimus and Manilkara hexandra, respectively. A numerical simulation of the modified system revealed that R was reduced to 0.7 in the combined P. odoratissimus and C. equisetifolia system. However, the combination of C. equisetifolia and M. hexandra did not effectively reduce R at the gap exit. Therefore, P. odoratissimus as the front vegetation layer is proposed to reduce the disadvantages of the open gaps in existing C. equisetifolia forests. The optimal width of P. odoratissimus (W(1)) calculated from the numerical simulation was W(1) = 10 m. R at the exit of a 15-m-wide open gap was 0.8, and therefore the proposed system was appropriate for cases with the highest velocity at the gap exit as well. Establishment of a new front vegetation layer except for open gaps that are essential, such as access roads to the beach, is proposed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hunter, Jennifer C; Crawley, Adam W; Petrie, Michael; Yang, Jane E; Aragón, Tomás J
2012-07-16
Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami's impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders' ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. 
Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public. Hunter JC, Crawley AW, Petrie M, Yang JE, Aragón TJ. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake. PLoS Currents Disasters. 2012 Jul 16.
Tsunami evacuation buildings and evacuation planning in Banda Aceh, Indonesia.
Yuzal, Hendri; Kim, Karl; Pant, Pradip; Yamashita, Eric
Indonesia, a country of more than 17,000 islands, is exposed to many hazards. A magnitude 9.1 earthquake struck off the coast of Sumatra, Indonesia, on December 26, 2004. It triggered a series of tsunami waves that spread across the Indian Ocean causing damage in 11 countries. Banda Aceh, the capital city of Aceh Province, was among the most damaged. More than 31,000 people were killed. At the time, there were no early warning systems nor evacuation buildings that could provide safe refuge for residents. Since then, four tsunami evacuation buildings (TEBs) have been constructed in the Meuraxa subdistrict of Banda Aceh. Based on analysis of evacuation routes and travel times, the capacity of existing TEBs is examined. Existing TEBs would not be able to shelter all of the at-risk population. In this study, additional buildings and locations for TEBs are proposed and residents are assigned to the closest TEBs. While TEBs may be part of a larger system of tsunami mitigation efforts, other strategies and approaches need to be considered. In addition to TEBs, robust detection, warning and alert systems, land use planning, training, exercises, and other preparedness strategies are essential to tsunami risk reduction.
Memory in coastal systems: Post-tsunami beach recovery within a decade on the Thai coast.
NASA Astrophysics Data System (ADS)
Switzer, A.; Gouramanis, C.; Bristow, C. S.; Jankaew, K.; Rubin, C. M.; Lee, Y.; Carson, S.; Pham, D. T.; Ildefonso, S.
2015-12-01
Do coastlines have memory? In this study we used a combination of remote sensing, field surveys and Ground Penetrating Radar (GPR) to reconstruct the recovery of beaches at Phra Thong Island, Thailand. The study site was severely impacted by the 2004 Indian Ocean Tsunami. Here we show that within a decade the beaches have completely recovered without any human intervention. We apply GPR to image periods of aggradation, progradation and washover sedimentation and match these with local events including a storm in 2007. At one location the beach has locally prograded at least 10m after partially blocking the mouth of a creek that was reamed out by the retreating tsunami. Here we also used GPR to image the scour and recovery of the coastal system (see figure). The rapid recovery of the barrier beach and local progradation indicate that sediment scoured by the tsunami was not transported far offshore but remained in the littoral zone within reach of fair-weather waves that returned to the beach naturally. In both cases coastal processes have reconstructed the beach-dune system to an almost identical pre-tsunami state in under a decade.
The Hellenic National Tsunami Warning Centre (HL-NTWC): Recent updates and future developments
NASA Astrophysics Data System (ADS)
Melis, Nikolaos S.; Charalampakis, Marinos
2014-05-01
The Hellenic NTWC (HL-NTWC) was established officially by Greek Law in September 2010. HL-NTWC is hosted at the National Observatory of Athens, Institute of Geodynamics (NOA-IG), which also operates a 24/7 earthquake monitoring service in Greece and coordinates the newly established Hellenic Unified National Seismic Network. The NOA-IG and HL-NTWC Operational Centre is linked to the Civil Protection Operational Centre and serves as the official alerting agency to the General Secretariat for Civil Protection in Greece regarding earthquake events and tsunami watch. Since August 2012, HL-NTWC has acted as a Candidate Tsunami Watch Provider (CTWP) under the UNESCO IOC - ICG NEAMTWS tsunami warning system (NEAM: North-Eastern Atlantic, the Mediterranean and connected seas) and offers its services to the NEAMTWS system. HL-NTWC has participated in all Communication Test Exercises (CTE) under NEAMTWS and has also provided tsunami scenarios for extended system testing exercises such as NEAMWAVE12. Some of the recent developments at HL-NTWC in Greece include: deployment of new tide gauge stations for tsunami watch purposes, computation of tsunami scenarios and extension of the database in use, improvement of alerting response times and earthquake magnitude estimation, and testing of newly established software modules for tsunami and earthquake alerting (i.e. Early-Est, SeisComP3, etc.) in Greece and the Eastern Mediterranean. Although funding is currently limited, participation in important EC-funded research projects (i.e. NERIES, NERA, TRANSFER, NEAMTIC and ASTARTE) demonstrates that collaboration among top-class research institutions producing important and useful results at the research front in Europe can facilitate the development and operation of top-class operational centres useful for civil protection purposes in regions in need. Finally, it is demonstrated that HL-NTWC collaboration with key research centres on security and safety issues (e.g. JRC-IPSC) at the operational front can further facilitate and secure everyday operations in a collaborative, experience-exchanging manner. This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Grant 603839, 7th FP (ENV.2013.6.4-3).
NASA Astrophysics Data System (ADS)
Kammerer, A. M.; Godoy, A. R.
2009-12-01
In response to the 2004 Indian Ocean Tsunami, as well as the anticipation of the submission of license applications for new nuclear facilities, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear power plants and other coastal facilities in the United States. To undertake this effort, the US NRC organized a collaborative research program jointly undertaken with researchers at the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) for the purpose of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. This study identified and modeled both seismic and landslide tsunamigenic sources in the near- and far-field. The results from this work are now being used directly as the basis for the review of tsunami hazard at potential nuclear plant sites. This application once again shows the importance that the earth sciences can play in addressing issues of importance to society. Because the Indian Ocean Tsunami was a global event, a number of cooperative international activities have also been initiated within the nuclear community. The results of US efforts are being incorporated into updated regulatory guidance for both the U.S. Nuclear Regulatory Commission and the United Nation’s International Atomic Energy Agency (IAEA). Coordinated efforts are underway to integrate state-of-the art tsunami warning tools developed by NOAA into NRC and IAEA activities. The goal of the warning systems project is to develop automated protocols that allow scientists at these agencies to have up-to-the minute user-specific information in hand shortly after a potential tsunami has been identified by the US Tsunami Warning System. Lastly, USGS and NOAA scientists are assisting the NRC and IAEA in a special Extra-Budgetary Program (IAEA EBP) on tsunami being coordinated by the IAEA’s International Seismic Safety Center. This IAEA EBP is focused on sharing lessons learned, tsunami hazard assessment techniques, and numerical tools among UN Member States. The complete body of basic and applied research undertaken in these many projects represents the combined effort of a diverse group of marine geologists, geophysicists, geotechnical engineers, seismologists and hydrodynamic modelers at multiple organizations.
Vulnerability of the Built Environment to Tsunamis - an Overview of Where We Are in 2012
NASA Astrophysics Data System (ADS)
Petroff, C. M.
2012-12-01
The last twenty years have seen great strides in the understanding and prediction of tsunami behavior. Though study of these disasters has always been motivated by the need to reduce casualties and damage, early work focused primarily on predicting magnitude, propagation and inundation from tsunami waves. Investigations have expanded to include a burgeoning field concentrated on the landward effects of tsunamis on communities: examining building and infrastructure vulnerability, assessing the probabilities of varying levels of damage and applying these findings to planning of land-use, development, evacuation and response. Catastrophic events of the last decade in the Indian Ocean and Japan have brought these issues to the fore and raise the question: Where are we in our understanding of vulnerability to tsunamis? What have we learned? What are the lessons that the most recent events teach us? This overview summarizes recent investigations of the vulnerability of engineered structures to damage from tsunamis - from individual buildings of various uses to larger facilities and structural systems. Examples are provided of both successes and failures in design for tsunami resistance. Vulnerability of critical infrastructure and lifelines is discussed in the context of tsunamis in Sumatra, Chile and Japan. This includes the ability of critical systems to function during and immediately after a disaster as well as the short and long term resilience of utilities, services and coastal facilities after tsunamis. Recent work on probabilistic prediction of damage and development of fragility functions is summarized for the Chile 2010 and Japan 2011 tsunamis. Finally, a commentary is presented on building vulnerability issues as they relate to land use planning, building design and codes and vertical evacuation planning.
Figure caption: Three views of the Oya Train Station in Miyagi Prefecture: prior to (top), two months after (middle), and one year after (bottom) the March 11, 2011 Tohoku Japan tsunami. The top view shows the rail line, shops, residences, coastal vegetation, tourist beach and coastal slope protection. All these were damaged or destroyed in the tsunami. One year after, a sand bag barrier had been installed inland of remaining low profile shore protection at Oya Kaigan. Rail lines had not been replaced and the station building remained closed. The area remained evacuated. Power line installation and road repairs were complete. (top photo courtesy F. Imamura)
Wastewater treatment in tsunami affected areas of Thailand by constructed wetlands.
Brix, H; Koottatep, T; Laugesen, C H
2007-01-01
The tsunami of December 2004 destroyed infrastructure in many coastal areas in South-East Asia. In January 2005, the Danish Government gave a tsunami relief grant to Thailand to re-establish the wastewater management services in some of the areas affected by the tsunami. This paper describes the systems which have been built at three locations: (a) Baan Pru Teau: A newly-built township for tsunami victims which was constructed with the contribution of the Thai Red Cross. Conventional septic tanks were installed for the treatment of blackwater from each household and its effluent and grey water (40 m3/day) are collected and treated at a 220 m2 subsurface flow constructed wetland. (b) Koh Phi Phi Don island: A wastewater collection system for the main business and hotel area of the island, a pumping station and a pressure pipe to the treatment facility, a multi-stage constructed wetland system and a system for reuse of treated wastewater. The constructed wetland system (capacity 400 m3/day) consists of vertical flow, horizontal subsurface flow, free water surface flow and pond units. Because the treatment plant is surrounded by resorts, restaurants and shops, the constructed wetland systems are designed with terrains as scenic landscaping. (c) Patong: A 5,000 m2 constructed wetland system has been established to treat polluted water from drainage canals which collect overflow from septic tanks and grey water from residential areas. It is envisaged that these three systems will serve as prototype demonstration systems for appropriate wastewater management in Thailand and other tropical countries.
An automatic tsunami warning system: TREMORS application in Europe
NASA Astrophysics Data System (ADS)
Reymond, D.; Robert, S.; Thomas, Y.; Schindelé, F.
1996-03-01
An integrated system named TREMORS (Tsunami Risk Evaluation through seismic Moment of a Real-time System) has been installed at the EVORA station in Portugal, a country which has been affected by historical tsunamis. The system is based on a three-component long-period seismic station linked to an IBM-PC-compatible computer running specific software. The goals of the system are the following: detect earthquakes, locate them, compute their seismic moment, and issue a seismic warning. The warnings are based on the seismic moment estimate, and all processing is performed automatically. The aim of this study is to check the quality of the estimation of the main parameters of interest for tsunami warning: the location, which depends on azimuth and distance, and finally the seismic moment M0, which controls the earthquake size. The sine qua non condition for obtaining an automatic location is that the three main seismic phases P, S and R must be visible. This study gives satisfying results (automatic analysis): ±5° errors in azimuth and epicentral distance, and a standard deviation of less than a factor of 2 for the seismic moment M0.
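Two of the location ingredients named here, a back-azimuth from three-component polarization and an epicentral distance from the S-P time, can be sketched as follows under simple assumptions. The synthetic waveform, the constant P and S velocities, and the covariance-based polarization estimate are illustrative stand-ins, not the TREMORS implementation.

import numpy as np

# Back-azimuth from the horizontal polarization of the P wave on a 3-component
# station, and epicentral distance from the S-P time with constant velocities.
# The synthetic waveform and velocity values are illustrative assumptions.
def back_azimuth(north, east):
    # Principal horizontal polarization direction of a P-wave window (degrees).
    # The 180-degree ambiguity would be resolved with the vertical first motion.
    cov = np.cov(np.vstack([north, east]))
    _, vecs = np.linalg.eigh(cov)
    n, e = vecs[:, -1]                      # eigenvector of the largest eigenvalue
    return np.degrees(np.arctan2(e, n)) % 180.0

def epicentral_distance_km(ts_minus_tp, vp=8.0, vs=4.5):
    # Distance from the S-P travel-time difference, assuming constant velocities.
    return ts_minus_tp / (1.0 / vs - 1.0 / vp)

# Synthetic P-wave window arriving from a back-azimuth of ~60 degrees.
t = np.linspace(0.0, 5.0, 500)
signal = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t)
north = signal * np.cos(np.radians(60.0))
east = signal * np.sin(np.radians(60.0))
print("back-azimuth: ~%.1f deg" % back_azimuth(north, east))
print("distance for S-P = 120 s: %.0f km" % epicentral_distance_km(120.0))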
Speeding up tsunami wave propagation modeling
NASA Astrophysics Data System (ADS)
Lavrentyev, Mikhail; Romanenko, Alexey
2014-05-01
Trans-oceanic wave propagation is one of the most time/CPU-consuming parts of the tsunami modeling process. The Method Of Splitting Tsunami (MOST) software package, developed at PMEL NOAA (Pacific Marine Environmental Laboratory of the National Oceanic and Atmospheric Administration, USA), is widely used to evaluate tsunami parameters. However, it takes time to simulate trans-oceanic wave propagation: up to 5 hours of CPU time to "drive" the wave from Chile (epicenter) to the coast of Japan, even using a rather coarse computational mesh. Accurate wave height prediction requires fine meshes, which leads to a dramatic increase in simulation time. Computation time is among the critical parameters, since it takes only about 20 minutes for a tsunami wave to approach the coast of Japan after an earthquake at the Japan trench or Sagami trench (as it was after the Great East Japan Earthquake on March 11, 2011). MOST numerically solves the hyperbolic system for three unknown functions, namely the velocity vector and the wave height (shallow water approximation). The system can be split into two independent systems along orthogonal directions (splitting method), and each system can be treated independently. This calculation scheme is well suited to SIMD architectures and GPUs. We adapted the MOST package to the GPU. Several numerical tests showed a 40x performance gain for an NVIDIA Tesla C2050 GPU versus a single core of an Intel i7 processor. Results of the numerical experiments were compared with other available simulation data: calculation results obtained on the GPU differ from the reference results by about 10^-3 cm in wave height when simulating 24 hours of wave propagation. This allows us to speak about the possibility of developing a real-time system for evaluating tsunami danger.
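The splitting idea exploited by MOST-type solvers, advancing the 2-D shallow-water system by alternating independent 1-D sweeps, can be sketched as below. The flat bathymetry, periodic boundaries and simple forward-backward finite differences are assumptions for illustration and are not the MOST numerics or its GPU port.

import numpy as np

# Dimensional splitting of the 2-D linear shallow-water equations: each time
# step is an independent 1-D sweep in x followed by one in y. Flat bathymetry,
# periodic boundaries and a simple forward-backward scheme are assumptions.
g = 9.81
nx, ny = 200, 200
dx = dy = 2000.0                    # m
h = np.full((nx, ny), 4000.0)       # water depth (m), placeholder bathymetry
eta = np.zeros((nx, ny))            # surface elevation (m)
u = np.zeros((nx, ny))              # depth-averaged x velocity (m/s)
v = np.zeros((nx, ny))              # depth-averaged y velocity (m/s)
eta[90:110, 90:110] = 1.0           # idealized initial sea-surface uplift

c = np.sqrt(g * h.max())
dt = 0.4 * min(dx, dy) / c          # CFL-limited time step

def sweep_x(eta, u, h, dt, dx):
    # 1-D update in x: du/dt = -g d(eta)/dx, then d(eta)/dt = -d(h u)/dx
    # using the updated velocity (forward-backward, stable at this CFL number).
    detadx = (np.roll(eta, -1, axis=0) - np.roll(eta, 1, axis=0)) / (2 * dx)
    u_new = u - dt * g * detadx
    dhudx = (np.roll(h * u_new, -1, axis=0) - np.roll(h * u_new, 1, axis=0)) / (2 * dx)
    return eta - dt * dhudx, u_new

def sweep_y(eta, v, h, dt, dy):
    # Same 1-D update applied along y; the two sweeps are independent.
    detady = (np.roll(eta, -1, axis=1) - np.roll(eta, 1, axis=1)) / (2 * dy)
    v_new = v - dt * g * detady
    dhvdy = (np.roll(h * v_new, -1, axis=1) - np.roll(h * v_new, 1, axis=1)) / (2 * dy)
    return eta - dt * dhvdy, v_new

for _ in range(200):                # each step: x sweep, then y sweep
    eta, u = sweep_x(eta, u, h, dt, dx)
    eta, v = sweep_y(eta, v, h, dt, dy)

print("max elevation after 200 steps: %.3f m" % eta.max())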
NASA Astrophysics Data System (ADS)
Krivorot'ko, Olga; Kabanikhin, Sergey; Marinin, Igor; Karas, Adel; Khidasheli, David
2013-04-01
One of the most important problems in tsunami investigation is the reconstruction of the seismic tsunami source. The non-profit organization WAPMERR (http://wapmerr.org) has provided a historical database of presumed tsunami sources around the world, obtained with the help of information about seaquakes, as well as a database of observations of tsunami waves in coastal areas. The main idea of the presentation is to determine the tsunami source parameters using seismic data and observations of the tsunami waves on the shore, and to expand and refine the database of presumed tsunami sources for rapid and accurate prediction of hazards and assessment of risks and consequences. We also present 3D visualization of real-time tsunami wave propagation and loss assessment, characterizing the nature of the building stock in cities at risk, and monitoring by satellite images using the modern GIS technology ITRIS (Integrated Tsunami Research and Information System) developed by WAPMERR and Informap Ltd. Special scientific plug-in components are embedded in a specially developed GIS-type graphic shell for easy data retrieval, visualization and processing. The most suitable physical models for simulating tsunamis are based on the shallow water equations. We consider the initial-boundary value problem in Ω := {(x,y) ∈ R²: x ∈ (0, Lx), y ∈ (0, Ly), Lx, Ly > 0} for the well-known linear shallow water equations in Cartesian coordinates, written in terms of the liquid flow components in dimensional form: η_t + (H(x,y) u)_x + (H(x,y) v)_y = 0, u_t + g η_x = 0, v_t + g η_y = 0, with η(x,y,0) = q(x,y), u(x,y,0) = v(x,y,0) = 0. (1) Here η(x,y,t) is the vertical displacement of the free water surface, i.e. the amplitude of the tsunami wave, and q(x,y) is the initial amplitude of the tsunami wave. The lateral boundary is assumed to be non-reflecting, i.e. it allows free passage of the propagating waves. Assume that the free-surface oscillation data at points (xm, ym) are given as measured output data from tsunami records: fm(t) := η(xm, ym, t), (xm, ym) ∈ Ω, t ∈ (Tm1, Tm2), m = 1, 2, ..., M, M ∈ N. (2) The problem of tsunami source reconstruction (inverse tsunami problem) consists of determining the unknown initial perturbation q(x,y) of the free surface defined in (1) from knowledge of the free-surface oscillation data fm(t) given by (2). We present a numerical method to determine the tsunami source using measurements of the height of a passing tsunami wave. The proposed approach is based on the weak solution theory for hyperbolic PDEs and the adjoint problem method for minimization of the corresponding cost functional J(q) = ||Aq − F||², F = (f1, ..., fM). (3) The adjoint problem is defined to obtain an explicit gradient formula for the cost functional (3). Different numerical algorithms (a finite-difference approach and the finite volume method) are proposed for the direct as well as the adjoint problem. A conjugate gradient algorithm based on the explicit gradient formula is used for the numerical solution of the inverse problem (1)-(2). This work was partially supported by the Russian Foundation for Basic Research (project No. 12-01-00773) and by SB RAS interdisciplinary project 14 "Inverse Problems and Applications: Theory, Algorithms, Software".
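A minimal sketch of the gradient-based minimization of the cost functional (3) is given below. A random matrix stands in for the discretized forward map A (and its adjoint, i.e. its transpose), and plain steepest descent stands in for the conjugate-gradient iteration used in the study; the synthetic source and records are placeholders.

import numpy as np

# Least-squares recovery of the initial elevation q from gauge records F:
# the gradient of J(q) = ||A q - F||^2 is 2 A^T (A q - F), i.e. the adjoint
# applied to the residual. A random matrix stands in for the shallow-water
# forward/adjoint solvers; steepest descent stands in for conjugate gradients.
rng = np.random.default_rng(1)
n_source, n_data = 50, 400
A = rng.normal(size=(n_data, n_source)) / np.sqrt(n_data)        # placeholder forward map
q_true = np.exp(-0.5 * ((np.arange(n_source) - 25) / 5.0) ** 2)  # synthetic source
F = A @ q_true                                                   # synthetic gauge records

def minimize_cost(A, F, n_iter=500, step=0.2):
    # Gradient descent on J(q) = ||A q - F||^2 using the adjoint-based gradient.
    q = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ q - F)   # adjoint of A applied to the residual
        q -= step * grad
    return q

q_rec = minimize_cost(A, F)
print("relative error: %.3e" % (np.linalg.norm(q_rec - q_true) / np.linalg.norm(q_true)))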
A novel new tsunami detection network using GNSS on commercial ships
NASA Astrophysics Data System (ADS)
Foster, J. H.; Ericksen, T.; Avery, J.
2015-12-01
Accurate and rapid detection and assessment of tsunamis in the open ocean is critical for predicting how they will impact distant coastlines, enabling appropriate mitigation efforts. The unexpectedly huge fault slip for the 2011 Tohoku, Japan earthquake, and the unanticipated type of slip for the 2012 event at Queen Charlotte Islands, Canada highlighted weaknesses in our understanding of earthquake and tsunami hazards, and emphasized the need for more densely-spaced observing capabilities. Crucially, when each sensor is extremely expensive to build, deploy, and maintain, only a limited network of them can be installed. Gaps in the coverage of the network as well as routine outages of instruments, limit the ability of the detection system to accurately characterize events. Ship-based geodetic GNSS has been demonstrated to be able to detect and measure the properties of tsunamis in the open ocean. Based on this approach, we have used commercial ships operating in the North Pacific to construct a pilot network of low-cost, tsunami sensors to augment the existing detection systems. Partnering with NOAA, Maersk and Matson Navigation, we have equipped 10 ships with high-accuracy GNSS systems running the Trimble RTX high-accuracy real-time positioning service. Satellite communications transmit the position data streams to our shore-side server for processing and analysis. We present preliminary analyses of this novel network, assessing the robustness of the system, the quality of the time-series and the effectiveness of various processing and filtering strategies for retrieving accurate estimates of sea surface height variations for triggering detection and characterization of tsunami in the open ocean.
Scedosporium aurantiacum brain abscess after near-drowning in a survivor of a tsunami in Japan.
Nakamura, Yutaka; Suzuki, Naomi; Nakajima, Yoshio; Utsumi, Yu; Murata, Okinori; Nagashima, Hiromi; Saito, Heisuke; Sasaki, Nobuhito; Fujimura, Itaru; Ogino, Yoshinobu; Kato, Kanako; Terayama, Yasuo; Miyamoto, Shinya; Yarita, Kyoko; Kamei, Katsuhiko; Nakadate, Toshihide; Endo, Shigeatsu; Shibuya, Kazutoshi; Yamauchi, Kohei
2013-12-01
Many victims of the tsunami that occurred following the Great East Japan Earthquake on March 11, 2011 developed systemic disorders owing to aspiration pneumonia. Herein, we report a case of tsunami lung wherein Scedosporium aurantiacum was detected in the respiratory tract. A magnetic resonance image of the patient's head confirmed multiple brain abscesses and lateral right ventricle enlargement. In this case report, we describe a potential refractory multidrug-resistant infection following a tsunami disaster. Copyright © 2013 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
A Sensitivity Analysis of Tsunami Inversions on the Number of Stations
NASA Astrophysics Data System (ADS)
An, Chao; Liu, Philip L.-F.; Meng, Lingsen
2018-05-01
Current finite-fault inversions of tsunami recordings generally adopt as many tsunami stations as possible to better constrain earthquake source parameters. In this study, inversions are evaluated by the waveform residual that measures the difference between model predictions and recordings, and the dependence of the quality of inversions on the number of tsunami stations is derived. Results for the 2011 Tohoku event show that, if the tsunami stations are optimally located, the waveform residual decreases significantly with the number of stations when the number is 1-4 and remains almost constant when the number is larger than 4, indicating that 2-4 stations are able to recover the main characteristics of the earthquake source. The optimal location of tsunami stations is explained in the text. A similar analysis is applied to the Manila Trench in the South China Sea using artificially generated earthquakes and virtual tsunami stations. Results confirm that 2-4 stations are necessary and sufficient to constrain the earthquake source parameters, and the optimal sites of stations are recommended in the text. The conclusion is useful for the design of new tsunami warning systems. Current strategies of tsunameter network design mainly focus on the early detection of tsunami waves from potential sources to coastal regions. We therefore recommend that, in addition to the current strategies, the waveform residual could also be taken into consideration so as to minimize the error of tsunami wave prediction for warning purposes.
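The waveform-residual metric discussed here can be illustrated with a toy linear inversion in which subfault slips are estimated from a subset of stations and the misfit is evaluated at all stations. The random Green's functions, very short records and noise level are artificial choices made only so that using a single station leaves the problem underdetermined; the sketch shows how the metric is assembled, not the Tohoku or Manila Trench results.

import numpy as np

# Toy illustration of the waveform residual: estimate subfault slips from the
# first n stations by least squares, then evaluate the normalized misfit
# ||d - G m|| / ||d|| over all stations. Green's functions, "true" slip and
# noise are synthetic placeholders; records are kept very short on purpose so
# that using only one station leaves the problem underdetermined.
rng = np.random.default_rng(2)
n_sub, n_sta, n_t = 6, 8, 4
G = rng.normal(size=(n_sta, n_t, n_sub)) * 0.05        # placeholder Green's functions
m_true = rng.uniform(0.5, 3.0, size=n_sub)             # "true" slip per subfault (m)
d = np.einsum("stk,k->st", G, m_true)                  # noise-free synthetic waveforms
d_obs = d + rng.normal(scale=0.02, size=d.shape)       # "observed" waveforms

def residual_with_n_stations(n_use):
    Gs = G[:n_use].reshape(-1, n_sub)                  # stack the used waveforms
    ds = d_obs[:n_use].reshape(-1)
    m, *_ = np.linalg.lstsq(Gs, ds, rcond=None)        # least-squares slip estimate
    pred = np.einsum("stk,k->st", G, m)
    return np.linalg.norm(d_obs - pred) / np.linalg.norm(d_obs)

for n_use in range(1, n_sta + 1):
    print("%d stations -> residual %.3f" % (n_use, residual_with_n_stations(n_use)))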
Towards to Resilience Science -Research on the Nankai trough seismogenic zone-
NASA Astrophysics Data System (ADS)
Kaneda, Yoshiyuki; Shiraki, Wataru; Fujisawa, Kazuhito; Tokozakura, Eiji
2017-04-01
For the last few decades, many destructive earthquakes and tsunamis have occurred around the world. Based on lessons learnt from the 2004 Sumatra earthquake/tsunami, the 2010 Chilean earthquake/tsunami and the 2011 East Japan earthquake/tsunami, we recognized the importance of real-time monitoring of earthquakes and tsunamis for disaster mitigation. More recently, the Kumamoto earthquake occurred in 2016. This destructive earthquake showed that multiple strong motions, including the foreshock and the main shock, severely damaged buildings. Furthermore, we recognize that recovery and revival are very important and difficult. In the Tohoku area damaged by large tsunamis, recovery and revival have been in progress for more than 5 years since the 2011 Tohoku earthquake. Therefore, we have to prepare plans before the next destructive disaster, such as a Nankai trough megathrust earthquake. As one of the disaster countermeasures, we would like to propose Disaster Mitigation Science. This disaster mitigation science includes engineering, science, medicine and social sciences such as sociology, informatics, law, literature, art, psychology, etc. For urgent evacuations, there are several kinds of real-time monitoring systems, such as DART buoys and ocean-floor networks. In particular, real-time monitoring systems using multiple kinds of sensors, such as accelerometers, broadband seismometers, pressure gauges, differential pressure gauges, hydrophones and thermometers, are indispensable for earthquake/tsunami monitoring. Furthermore, using multiple kinds of sensors, we can analyze and estimate broadband crustal activity around megathrust earthquake seismogenic zones. Therefore, we deployed DONET1 and DONET2, dense ocean-floor networks around the Nankai trough, southwestern Japan. We will explain Resilience Science and real-time monitoring systems around the Nankai trough seismogenic zone.
Tsunami Early Warning in Europe: NEAMWave Exercise 2012 - the Portuguese Scenario
NASA Astrophysics Data System (ADS)
Lendholt, Matthias; Hammmitzsch, Martin; Schulz, Jana; Reißland, Sven
2013-04-01
On 27th and 28th November 2012, the first Europe-wide tsunami exercise took place under the auspices of the UNESCO Intergovernmental Coordination Group for the Tsunami Early Warning and Mitigation System in the North-eastern Atlantic, the Mediterranean and connected seas (ICG/NEAMTWS). Four international scenarios were performed, one for each candidate tsunami watch provider: France, Greece, Portugal and Turkey. Their task was to generate and disseminate tsunami warning bulletins on time and in compliance with the official NEAMTWS specifications. The Instituto Português do Mar e da Atmosfera (IPMA, [1]) in Lisbon and the Kandilli Observatory and Earthquake Research Institute (KOERI [2]) in Istanbul are the national agencies of Portugal and Turkey responsible for tsunami early warning. Both institutes are partners in the TRIDEC [3] project and used the TRIDEC Natural Crisis Management (NCM) system during the NEAMWave exercise. The software demonstrated the seamless integration of diverse components including sensor systems, simulation data, and dissemination hardware. The functionalities that were showcased significantly exceeded the internationally agreed range of capabilities. Special attention was given to the Command and Control User Interface (CCUI), serving as the central application for the operator. Its origins lie in the DEWS project [4], but numerous new functionalities were added to meet all requirements defined by the complex NEAMTWS workflows. It was of utmost importance to develop an application that handles the complexity of tsunami science while providing a clearly arranged and comprehensible interface that relieves the operator during time-critical hazard situations. [1] IPMA: www.ipma.pt/ [2] KOERI: www.koeri.boun.edu.tr/ [3] TRIDEC: www.tridec-online.eu [4] DEWS: www.dews-online.org
NASA Astrophysics Data System (ADS)
Yomogida, K.; Saito, T.
2017-12-01
Conventional tsunami excitation and propagation have been formulated for an incompressible fluid in terms of velocity components. This approach is valid in most cases because we usually analyze tsunamis as "long gravity waves" excited by submarine earthquakes. Newly developed ocean-bottom tsunami networks such as S-net and DONET have dramatically changed this situation for two reasons: (1) tsunami propagation is now directly observed in a 2-D array manner without suffering from the complex "site effects" of the seashore, and (2) initial tsunami features can be directly detected just above a fault area. Removing the incompressibility assumption for sea water, we have formulated a new representation of tsunami excitation based on displacement rather than velocity components. As a result, not only the dynamic terms but also the static term (i.e., the component of zero frequency) can be naturally introduced, which is important for the pressure on the ocean floor that ocean-bottom tsunami stations record. The acceleration of the ocean floor should be combined with the conventional tsunami height (that is, the deformation of the sea level above a given station) in the measurement of ocean-bottom pressure, although the acceleration exists only during fault motion. The M7.2 Off Fukushima earthquake on 22 November 2016 was the first event that excited large tsunamis within the footprint of the S-net stations. The propagation of the tsunamis is found to be highly non-uniform because of the strong velocity (i.e., sea depth) gradient perpendicular to the axis of the Japan Trench. The earthquake was located in a shallow sea close to the coast, so that all the tsunami energy is reflected by the high-velocity trench region. Pressure gauges within the fault area recorded not only clear slow tsunami motions (i.e., sea-level changes) but also large high-frequency signals, as predicted by our theoretical results. That is, it may be difficult to extract tsunami motions from near-fault pressure-gauge data immediately after the earthquake occurs, which matters for tsunami early warning systems.
Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation
NASA Astrophysics Data System (ADS)
Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.
2006-12-01
An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece, and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate numerically the evolution of the tsunami wave from the deep ocean to its target site. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with laboratory and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have to first be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2006.
NASA Astrophysics Data System (ADS)
Yamada, M.; Fujino, S.; Satake, K.
2017-12-01
The 7.3 ka eruption of Kikai volcano, southern Kyushu, Japan, is one of the largest caldera-forming eruptions in the world. Given that a huge caldera was formed in a shallow sea area during the eruption, a tsunami must have been generated by the associated sea-level change. Pyroclastic-flow and tsunami deposits from the eruption have been studied around the caldera, but they are not sufficient to evaluate the size of the tsunami. The goal of this study is to unravel the sizes of the tsunami and the triggering caldera collapse by numerical simulations based on a widely distributed tsunami deposit associated with the eruption. In this presentation, we provide initial data on the distribution of the 7.3 ka tsunami deposit contained in sediment cores taken at three coastal lowlands in Wakayama, Tokushima, and Oita prefectures (560 km, 520 km, and 310 km northeast of the caldera, respectively). Volcanic ash from the eruption (Kikai Akahoya tephra: K-Ah) is evident in the organic-rich muddy sedimentary sequence in all sediment cores. An up to 6-cm-thick sand layer, characterized by a grading structure and a sharp bed boundary with the underlying mud, is observed immediately beneath the K-Ah tephra at all study sites. These sedimentary characteristics and the broad distribution indicate that the sand layer was most likely deposited by a tsunami, which can propagate over a wide area, rather than by a local storm surge. Furthermore, the stratigraphic relationship implies that the study sites must have been inundated by the tsunami prior to the ash fall. A sand layer is also evident within the K-Ah tephra layer, suggesting that it was probably formed by a subsequent tsunami wave during the ash fall. This geological evidence for the 7.3 ka tsunami inundation will contribute to a better understanding not only of the caldera collapse and the resultant tsunami but also of the tsunami-generating processes during the eruption.
Numerical Simulations of the 1991 Limón Tsunami, Costa Rica Caribbean Coast
NASA Astrophysics Data System (ADS)
Chacón-Barrantes, Silvia; Zamora, Natalia
2017-08-01
The second largest recorded tsunami along the Caribbean margin of Central America occurred 25 years ago. On April 22nd, 1991, an earthquake with magnitude Mw 7.6 ruptured along the thrust faults that form the North Panamá Deformed Belt (NPDB). The earthquake triggered a tsunami that affected the Caribbean coast of Costa Rica and Panamá within a few minutes, causing two casualties. These are the only deaths caused by a tsunami in Costa Rica. Coseismic uplift of up to 1.6 m and runup values larger than 2 m were measured at some coastal sites. Here, we consider three solutions for the seismic source as initial conditions to model the tsunami, each considering a single rupture plane. We performed numerical modeling of the tsunami propagation and runup using the NEOWAVE numerical model (Yamazaki et al. in Int J Numer Methods Fluids 67:2081-2107, 2010, doi: 10.1002/fld.2485 ) on a system of nested grids from the entire Caribbean Sea to Limón city. The modeled surface deformation and tsunami runup agree with the measured data at most coastal sites, with one preferred model that best fits the field data. The model results are useful to determine how the 1991 tsunami could have affected regions where tsunami records were not preserved and to simulate the effect of the coseismic coastal deformation as a buffer against the tsunami. We also performed tsunami modeling to simulate the consequences if a similar event with a larger magnitude, Mw 7.9, were to occur offshore the southern Costa Rican Caribbean coast. Such an event would generate maximum wave heights of more than 5 m, showing that the Limón and northwestern Panamá coastal areas are exposed to moderate-to-large tsunamis. These simulations, considering historical events and maximum credible scenarios, can be useful for hazard assessment and as part of studies leading to tsunami evacuation maps and mitigation plans, even though that is not the scope of this paper.
Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data
Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.
2006-01-01
Introduction: Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities along the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, the U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geology and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.
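To make the PTHA idea above concrete, the sketch below combines hypothetical per-source annual rates and modeled wave heights into a Poissonian probability of exceedance over an exposure period; the source names, rates, and heights are illustrative assumptions, not values from the Seaside study.

```python
import math

# Hypothetical per-source annual rates and modeled wave heights at one shoreline point.
# (Illustrative numbers only; real PTHA uses many scenarios with uncertainty terms.)
sources = [
    {"name": "local_CSZ", "annual_rate": 1 / 500.0, "wave_height_m": 8.0},
    {"name": "far_field_AK", "annual_rate": 1 / 150.0, "wave_height_m": 2.5},
    {"name": "far_field_KUR", "annual_rate": 1 / 100.0, "wave_height_m": 1.2},
]

def exceedance_probability(threshold_m, exposure_years):
    """Poissonian probability that the threshold height is exceeded at least once."""
    # Annual rate of exceedance = sum of rates of all sources whose modeled
    # height at the site reaches or exceeds the threshold.
    rate = sum(s["annual_rate"] for s in sources if s["wave_height_m"] >= threshold_m)
    return 1.0 - math.exp(-rate * exposure_years)

# Probability of exceeding a 2 m wave in a 50-year window (FEMA-style exposure period).
print(f"P(>2 m in 50 yr)  = {exceedance_probability(2.0, 50):.3f}")
print(f"P(>5 m in 100 yr) = {exceedance_probability(5.0, 100):.3f}")
```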
Tsunami warning from space: Ionosphere seismology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larmat, Carene
2012-09-04
The ionosphere is the layer of the atmosphere from about 85 to 600 km altitude that contains electrons and electrically charged atoms produced by solar radiation. It is perturbed by changes in layering between day and night, by X-rays and high-energy protons from solar flares, by geomagnetic storms and lightning, and by drivers from below, and it is strategic for radio-wave transmission. This project discusses the inversion of ionospheric signals for tsunami wave amplitude and coupling parameters, which improves tsunami warning systems.
Potential coping capacities to avoid tsunamis in Mentawai
NASA Astrophysics Data System (ADS)
Panjaitan, Berton; Gomez, Christopher; Pawson, Eric
2017-07-01
In 2010 a tsunamigenic earthquake triggered tsunami waves that reached the Mentawai archipelago in less than ten minutes. Similar events can occur at any time, as seismologists predict that enormous energy remains trapped on the Sunda Megathrust, approximately 30 km offshore from the archipelago. The local community of Mentawai is therefore vulnerable to tsunami hazards. In the absence of modern technology to monitor the sea surface, existing strategies need to be improved. This study was based on qualitative research and a literature review about developing coping capacity for tsunami hazards in Mentawai. A community early-warning system is the main strategy to develop coping capacity at the community level. It consists of risk knowledge, monitoring, warning dissemination, and response capability, which are interlocking elements of an end-to-end effort. The study found that risk assessments and risk maps, which are effective in increasing tsunami risk knowledge, were mostly not available at the dusun (hamlet) level. The monitoring of tsunami waves can be improved by strengthening and expanding community systems that help people avoid the waves. Moreover, traditional tools have the potential to deliver warnings. Lastly, although the local government has provided a few public facilities to increase response capability, people often ignore them; their traditional values should therefore be revitalized.
Tsunami hazard assessment in the Colombian Caribbean Coast with a deterministic approach
NASA Astrophysics Data System (ADS)
Otero Diaz, L.; Correa, R.; Ortiz R, J. C.; Restrepo L, J. C.
2014-12-01
For the Caribbean Sea, we propose six potential tectonic tsunami sources, defining for each source the worst credible earthquake from analysis of historical seismicity, tectonics, past tsunamis, and a review of the IRIS, PDE, NOAA, and CMT catalogs. The generation and propagation of tsunami waves from the selected sources were simulated with COMCOT 1.7, a numerical model that solves the linear and nonlinear long-wave equations with finite differences in both Cartesian and spherical coordinates. The results of the modeling are presented as maps of maximum displacement of the free surface for the Colombian Caribbean coast and the island areas. They show that the event that would produce the greatest impact is generated at the North Panama Deformed Belt (NPDB) source, where the first wave train reaches the central Colombian coast in 40 minutes, generating wave heights of up to 3.7 m. At the San Andrés and Providencia islands, tsunami waves reach more than 4.5 m due to the effects of edge waves caused by interactions between the waves and the barrier coral reef around each island. The results obtained in this work are useful for planning future regional and local warning systems and for identifying priority areas in which to conduct detailed research on the tsunami threat.
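As an illustration of the long-wave equations that COMCOT and similar codes solve by finite differences, the sketch below integrates the 1-D linear shallow-water equations on a staggered grid; the flat bathymetry, grid spacing, and initial Gaussian hump are hypothetical, and the scheme is a textbook leapfrog, not the COMCOT implementation.

```python
import numpy as np

# Minimal 1-D linear long-wave (shallow-water) solver on a staggered grid,
# in the spirit of the finite-difference schemes used by COMCOT-type models.
# All parameters below are illustrative, not taken from the study.
g = 9.81
nx, dx = 400, 2000.0                   # 800 km domain, 2 km cells
h = np.full(nx, 4000.0)                # flat 4000 m deep ocean (hypothetical)
dt = 0.5 * dx / np.sqrt(g * h.max())   # CFL-limited time step

eta = np.exp(-((np.arange(nx) * dx - 200e3) / 30e3) ** 2)  # 1 m Gaussian hump
flux = np.zeros(nx + 1)                # volume flux M = h*u at cell faces

for _ in range(2000):
    # Momentum: dM/dt = -g h d(eta)/dx  (interior faces; closed, reflective ends)
    h_face = 0.5 * (h[:-1] + h[1:])
    flux[1:-1] -= g * h_face * dt / dx * (eta[1:] - eta[:-1])
    # Continuity: d(eta)/dt = -dM/dx
    eta -= dt / dx * (flux[1:] - flux[:-1])

print("max surface elevation after integration: %.3f m" % eta.max())
```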
A Walk through TRIDEC's intermediate Tsunami Early Warning System
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Reißland, S.; Lendholt, M.
2012-04-01
The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC is based on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS) providing a service platform for both sensor integration and warning dissemination. In TRIDEC new developments in Information and Communication Technology (ICT) are used to extend the existing platform realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. Successively, the demonstrators are addressing challenges, such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors also unconventional sensors and sensor networks play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalized warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using OGC Web Map Service (WMS) and Web Feature Service (WFS) spatial data are utilized to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP) together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and the potentials of the implemented approach are demonstrated covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. Developments of the system are based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been and will be made on leveraging open source technologies that support mature system architecture models wherever appropriate. 
All open source software produced is foreseen to be published on a publicly available software repository thus allowing others to reuse results achieved and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
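Since TRIDEC disseminates warnings in the OASIS Common Alerting Protocol, the sketch below assembles a minimal CAP 1.2-style alert with the required alert and info elements; the identifier, sender, and area-free content are placeholders and the message is marked as an exercise, so this is only a structural illustration, not an actual TRIDEC or NEAMTWS product.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

# Minimal sketch of a CAP 1.2-style tsunami alert. Identifier, sender, and
# event values are placeholders, not actual TRIDEC/NEAMTWS content.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
ET.register_namespace("", CAP_NS)

def add(parent, tag, text):
    el = ET.SubElement(parent, f"{{{CAP_NS}}}{tag}")
    el.text = text
    return el

alert = ET.Element(f"{{{CAP_NS}}}alert")
add(alert, "identifier", "TEST-TSUNAMI-0001")     # placeholder identifier
add(alert, "sender", "ntwc@example.org")          # placeholder sender
add(alert, "sent", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00"))
add(alert, "status", "Exercise")                  # exercise message, as in NEAMWave
add(alert, "msgType", "Alert")
add(alert, "scope", "Public")

info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
add(info, "category", "Geo")
add(info, "event", "Tsunami Warning")
add(info, "urgency", "Immediate")
add(info, "severity", "Extreme")
add(info, "certainty", "Likely")

print(ET.tostring(alert, encoding="unicode"))
```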
NASA Astrophysics Data System (ADS)
Aytore, Betul; Yalciner, Ahmet Cevdet; Zaytsev, Andrey; Cankaya, Zeynep Ceren; Suzen, Mehmet Lütfi
2016-08-01
Turkey is highly prone to earthquakes because of active fault zones in the region. The Marmara region, located at the western extension of the North Anatolian Fault Zone (NAFZ), is one of the most tectonically active zones in Turkey. Numerous catastrophic events such as earthquakes or earthquake/landslide-induced tsunamis have occurred in the Marmara Sea basin. According to studies of past tsunami records, the Marmara coasts have been hit by 35 different tsunami events in the last 2000 years. The recent occurrences of catastrophic tsunamis in the world's oceans have also raised awareness about tsunamis that might take place along the Marmara coasts. Accordingly, comprehensive studies on tsunamis, such as the preparation of tsunami databases, tsunami hazard analyses and assessments, risk evaluations for potential tsunami-prone regions, and the establishment of warning systems, have accelerated. However, a complete high-resolution tsunami inundation analysis provides a better understanding of the effects of tsunamis on a specific critical structure located in the Marmara Sea. Ports are among those critical structures that are susceptible to marine disasters. The resilience of ports and harbors against tsunamis is essential for proper, efficient, and successful rescue operations to reduce loss of life and property. Considering this, high-resolution simulations have been carried out in the Marmara Sea focusing on Haydarpaşa Port of the megacity Istanbul. In the first stage of the simulations, the most critical tsunami sources potentially affecting Haydarpaşa Port were used as input, and the computed tsunami parameters at the port were compared to determine the most critical tsunami scenario. In the second stage, nested domains from 90 m grid size down to 10 m grid size (in the port region) were used, and the most critical tsunami scenario was modeled. In the third stage, the topography of the port and its surroundings was used in two nested domains at 3 m and 1 m resolution, and the water elevations computed from the previous simulations were input at the border of the larger domain. The tsunami numerical code NAMI DANCE was used in the simulations. The tsunami parameters at the highest resolution were computed in and around the port. The effect of data resolution on the computed results is presented, and the performance of the port structures and the possible effects of a tsunami on port operations are discussed. Since the harbor protection structures have not been designed to withstand tsunamis, the stability of the breakwaters becomes one of the major concerns for limiting agitation and inundation in Haydarpaşa Port under tsunami attack and for resilience. Flow depth, momentum flux, and current patterns are further concerns, as they cause unexpected circulation and uncontrolled movement of objects on land and of vessels in the sea.
Tsunami Risk in the NE Atlantic: Pilot Study for Algarve Portugal and Applications for future TWS
NASA Astrophysics Data System (ADS)
Omira, R.; Baptista, M. A.; Catita, C.; Carrilho, F.; Matias, L.
2012-04-01
Tsunami risk assessment is an essential component of any tsunami early warning system because of its significant contribution to disaster reduction: it provides valuable information that serves as the basis for mitigation preparedness and strategies. Generally, risk assessment combines the outputs of hazard and vulnerability assessments for the exposed elements considered. In the NE Atlantic region, the tsunami hazard is relatively well established through the compilation of historical tsunami events, the evaluation of tsunamigenic sources, and impact computations for site-specific coastal areas. By contrast, tsunami vulnerability remains poorly investigated, apart from a few studies focused on limited coastal areas of the Gulf of Cadiz region. This work presents a pilot study for tsunami risk assessment covering about 170 km of the coast of the Algarve region, southern Portugal. This area of high coastal occupation and touristic activity was strongly impacted by the 1755 tsunami, as reported in various historical documents. An approach based on a combination of tsunami hazard and vulnerability is developed in order to take into account the dynamic aspect of tsunami risk in the region, which depends on the variation of hazard and of the vulnerability of exposed elements from one coastal point to another. The hazard study is based on the most credible earthquake scenarios and the derivation of hazard maps through hydrodynamic modeling of inundation and tsunami arrival time. The vulnerability assessment is performed by: i) analysis of occupation and population density, ii) derivation of evacuation maps and safe shelters, and iii) analysis of population response and evacuation times. Different risk levels ranging from "low" to "high" are assigned to the coasts of the studied area. The variation of human tsunami risk between the high and low touristic seasons is also considered in this study, producing different tsunami risk scenarios. Results are presented as thematic maps and GIS layers highlighting information on inundation depths and limits, evacuation plans and safe shelters, tsunami vulnerability, evacuation times, and tsunami risk levels. The results can be used for national and regional tsunami disaster management and planning. This work is funded by the TRIDEC (Collaborative, Complex and Critical Decision-Support in Evolving Crises) FP7 EU project and by the MAREMOTI (Mareograph and field tsunami observations, modeling and vulnerability studies for Northeast Atlantic and western Mediterranean) French project. Keywords: Tsunami, Algarve-Portugal, Evacuation, Vulnerability, Risk
NASA Astrophysics Data System (ADS)
Bouchard, R.; Locke, L.; Hansen, W.; Collins, S.; McArthur, S.
2007-12-01
DART systems are a critical component of the tsunami warning system as they provide the only real-time, in situ, tsunami detection before landfall. DART systems consist of a surface buoy that serves as a position locator and communications transceiver and a Bottom Pressure Recorder (BPR) on the seafloor. The BPR records temperature and pressure at 15-second intervals to a memory card for later retrieval for analysis and use by tsunami researchers, but the BPRs are normally recovered only once every two years. The DART systems also transmit subsets of the data, converted to an estimation of the sea surface height, in near real-time for use by the tsunami warning community. These data are available on NDBC's webpages, http://www.ndbc.noaa.gov/dart.shtml. Although not of the resolution of the data recorded to the BPR memory card, the near real-time data have proven to be of value in research applications [1]. Of particular interest are the DART data associated with geophysical events. The DART BPR continuously compares the measured sea height with a predicted sea height and, when the difference exceeds a threshold value, the BPR goes into Event Mode. Event Mode provides an extended, more frequent near real-time reporting of the sea surface heights for tsunami detection. The BPR can go into Event Mode because of geophysical triggers, such as tsunamis or seismic activity, which may or may not be tsunamigenic. The BPR can also go into Event Mode during recovery of the BPR as it leaves the seafloor, or when manually triggered by the Tsunami Warning Centers in advance of an expected tsunami. On occasion, the BPR will go into Event Mode without any associated tsunami or seismic activity or human intervention, and these are considered "False" Events. Approximately one-third of all Events can be classified as "False". NDBC is responsible for the operations, maintenance, and data management of the DART stations. Each DART station has a webpage with a drop-down list of all Events. NDBC maintains the non-geophysical Events in order to maintain the continuity of the time series records. In 2007, NDBC compiled all DART Events that occurred while under NDBC's operational control and made an assessment of their validity. The NDBC analysts performed the assessment using the characteristics of the data time series, triggering criteria, and associated seismic events. The compilation and assessments are catalogued in an NDBC technical document. The Catalog also includes a listing of the one-hour, high-resolution data, retrieved remotely from the BPRs, that are not available on the web pages. The Events are classified by their triggering mechanism and listed by station location and, for those Events associated with geophysical triggers, they are listed by their associated seismic events. The Catalog provides researchers with a valuable tool in locating, assessing, and applying near real-time DART data to tsunami research and will be updated following DART Events. A link to the published Catalog can be found on the NDBC DART website, http://www.ndbc.noaa.gov/dart.shtml. Reference: [1] Gower, J. and F. González (2006), U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10), 105-112.
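The event trigger described above compares each pressure-derived height with a short-term prediction. The sketch below mimics that logic with a hydrostatic pressure-to-height conversion, a low-order polynomial fit as a stand-in tide predictor, and a roughly 3 cm threshold; the predictor, threshold, and synthetic record are assumptions, not the exact NOAA/NDBC DART algorithm.

```python
import numpy as np

# Sketch of a DART-style event trigger: convert bottom pressure to an
# equivalent sea-surface height, predict the expected (tidal) height from the
# recent record, and flag Event Mode when |measured - predicted| exceeds a
# threshold. The predictor and the ~3 cm threshold are illustrative stand-ins.
RHO, G = 1025.0, 9.81          # seawater density (kg/m^3), gravity (m/s^2)
THRESHOLD_M = 0.03             # hypothetical trigger threshold (~3 cm)

def pressure_to_height(p_pascal):
    """Equivalent water-column height from bottom pressure (hydrostatic)."""
    return p_pascal / (RHO * G)

def event_mode(heights, window=20):
    """Flag samples whose departure from a short-term fit exceeds the threshold."""
    flags = np.zeros(len(heights), dtype=bool)
    for i in range(window, len(heights)):
        t = np.arange(window)
        # Low-order polynomial fit to the preceding samples as a tide proxy.
        coef = np.polyfit(t, heights[i - window:i], 2)
        predicted = np.polyval(coef, window)      # extrapolate one step ahead
        flags[i] = abs(heights[i] - predicted) > THRESHOLD_M
    return flags

# Synthetic 15-s samples: slow tide plus a 5 cm step mimicking a tsunami arrival.
n = 400
tide = 0.5 * np.sin(2 * np.pi * np.arange(n) * 15 / 44712.0)   # ~12.4 h tide
signal = tide.copy()
signal[300:] += 0.05
flags = event_mode(signal)
print("first flagged sample index:", int(np.argmax(flags)))
```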
Tsunami warnings: Understanding in Hawai'i
Gregg, Chris E.; Houghton, Bruce F.; Paton, Douglas; Johnston, David M.; Swanson, D.A.; Yanagi, B.S.
2007-01-01
The devastating southeast Asian tsunami of December 26, 2004 has brought home the destructive consequences of coastal hazards in the absence of effective warning systems. Since the 1946 tsunami that destroyed much of Hilo, Hawai'i, a network of pole mounted sirens has been used to provide an early public alert of future tsunamis. However, studies in the 1960s showed that understanding of the meaning of siren soundings was very low and that ambiguity in understanding had contributed to fatalities in the 1960 tsunami that again destroyed much of Hilo. The Hawaiian public has since been exposed to monthly tests of the sirens for more than 25 years and descriptions of the system have been widely published in telephone books for at least 45 years. However, currently there remains some uncertainty in the level of public understanding of the sirens and their implications for behavioral response. Here, we show from recent surveys of Hawai'i residents that awareness of the siren tests and test frequency is high, but these factors do not equate with increased understanding of the meaning of the siren, which remains disturbingly low (13%). Furthermore, the length of time people have lived in Hawai'i is not correlated systematically with understanding of the meaning of the sirens. An additional issue is that warning times for tsunamis generated locally in Hawai'i will be of the order of minutes to tens of minutes and limit the immediate utility of the sirens. Natural warning signs of such tsunamis may provide the earliest warning to residents. Analysis of a survey subgroup from Hilo suggests that awareness of natural signs is only moderate, and a majority may expect notification via alerts provided by official sources. We conclude that a major change is needed in tsunami education, even in Hawai'i, to increase public understanding of, and effective response to, both future official alerts and natural warning signs of future tsunamis.
NASA Astrophysics Data System (ADS)
Ausilia Paparo, Maria; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano
2017-02-01
Eastern Sicily is affected by earthquakes and tsunamis of local and remote origin, as is known from numerous historical chronicles. Recent studies have put emphasis on the role of submarine landslides as the direct cause of the main local tsunamis, envisaging that the earthquakes (in 1693 and 1908) did produce tsunamis, but also triggered mass failures that were able to generate even larger tsunamis. The debate is still open, and though no general consensus has been reached among scientists so far, this research has had the merit of attracting attention to the possible generation of tsunamis by landslides off Sicily. In this paper we investigate the tsunami potential of mass failures along one sector of the Hyblean-Malta Escarpment (HME) facing Augusta. The HME is the main offshore geological structure of the region, running almost parallel to the coast off eastern Sicily. Here, bottom morphology and slope steepness favour soil failures. In our work we study slope stability under seismic load along a number of HME transects by using the Minimum Lithostatic Deviation (MLD) method, which is based on limit-equilibrium theory. The main goal is to identify sectors of the HME that could be unstable under the effect of realistic earthquakes. We estimate the possible landslide volumes and use them as input for numerical codes to simulate the landslide motion and the consequent tsunami. This is an important step for the assessment of the tsunami hazard in eastern Sicily and for local tsunami mitigation policies. It is also important in view of tsunami warning systems, since it can help to identify the minimum earthquake magnitude capable of triggering destructive landslide-induced tsunamis, and therefore to set up appropriate knowledge-based criteria for launching alerts to the population.
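For readers unfamiliar with limit-equilibrium stability checks, the sketch below evaluates a pseudo-static factor of safety for a single planar sliding block under a horizontal seismic coefficient; this is a deliberately simplified stand-in for the kind of force balance the MLD method evaluates along each transect (not the MLD formulation itself), and all geometry and material values are hypothetical.

```python
import math

# Simplified pseudo-static limit-equilibrium check for a planar sliding block.
# NOT the Minimum Lithostatic Deviation method; material/geometry values are hypothetical.
def factor_of_safety(cohesion_kpa, friction_deg, slope_deg, weight_kn, area_m2, k_h):
    """Ratio of resisting to driving forces for a block on an inclined plane
    with a horizontal pseudo-static seismic coefficient k_h."""
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    # Normal force reduced by the slope-normal component of the seismic force.
    normal = weight_kn * math.cos(beta) - k_h * weight_kn * math.sin(beta)
    resisting = cohesion_kpa * area_m2 + normal * math.tan(phi)
    driving = weight_kn * math.sin(beta) + k_h * weight_kn * math.cos(beta)
    return resisting / driving

# Static vs. seismically loaded cases for a hypothetical escarpment slope segment.
for k_h in (0.0, 0.10, 0.20):
    fs = factor_of_safety(cohesion_kpa=20.0, friction_deg=25.0,
                          slope_deg=18.0, weight_kn=5.0e4, area_m2=1.0e3, k_h=k_h)
    print(f"k_h = {k_h:.2f}  ->  FS = {fs:.2f}")
```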
Earthquake and Tsunami planning, outreach and awareness in Humboldt County, California
NASA Astrophysics Data System (ADS)
Ozaki, V.; Nicolini, T.; Larkin, D.; Dengler, L.
2008-12-01
Humboldt County has the longest coastline in California and is one of the most seismically active areas of the state. It is at risk from earthquakes located onshore and offshore and from tsunamis generated locally from faults associated with the Cascadia subduction zone (CSZ), other regional fault systems, and from distant sources elsewhere in the Pacific. In 1995 the California Division of Mines and Geology published the first earthquake scenario to include both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of representatives from government agencies, tribes, service groups, academia and the private sector from the three northern coastal California counties, was formed in 1996 to coordinate and promote earthquake and tsunami hazard awareness and mitigation. The RCTWG and its member agencies have sponsored a variety of projects including education/outreach products and programs, tsunami hazard mapping, and signage and siren planning, and have sponsored an Earthquake - Tsunami Education Room at the Humboldt County fair for the past eleven years. Three editions of Living on Shaky Ground, an earthquake-tsunami preparedness magazine for California's North Coast, have been published since 1993 and a fourth is due to be published in fall 2008. In 2007, Humboldt County was the first region in the country to participate in a tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD and the first area in California to conduct a full-scale tsunami evacuation drill. The County has conducted numerous multi-agency, multi-discipline coordinated exercises using a county-wide tsunami response plan. Two Humboldt County communities were recognized as TsunamiReady by the National Weather Service in 2007. Over 300 tsunami hazard zone signs have been posted in Humboldt County since March 2008. Six assessment surveys from 1993 to 2006 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county, and additional surveys have tracked public awareness and tourist concerns about tsunami hazard signs. Over the thirteen-year period covered by the surveys, the percentage of respondents with houses secured to foundations has increased from 58 to 80 percent, respondents aware of a local tsunami hazard increased from 51 to 73 percent, and those knowing what the Cascadia subduction zone is increased from 16 to 42 percent.
NASA Astrophysics Data System (ADS)
Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.
2012-04-01
The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure that is supported by an agile knowledge base for critical decision support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich OGC SWE compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.
Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.
2012-01-01
Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders’ ability to rapidly communicate and coordinate a response to a potential threat. Methods To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Preparedness Capabilities (PHEP) framework. Findings The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, which ranged from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between the hours of 10:00pm Pacific Standard Time (PST) on Thursday March 10th and 2:00pm PST on Friday March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In response to this threat, the activities most commonly reported by the local government agencies included in this study were: emergency public information and warning, emergency operations coordination, and inter-organizational information sharing, which were reported by 86%, 75%, and 65% of all respondents, respectively. When looking at the distribution of responsibility, emergency management agencies were the most likely to report assuming a lead role in these common activities as well as those related to evacuation and community recovery. While activated less frequently, public health agencies carried out emergency response functions related to surveillance and epidemiology, environmental health, and mental health/psychological support. Both local public health and EMS agencies took part in mass care and medical material management activities. A large network of organizations contributed to response activities, with emergency management, law enforcement, fire, public health, public works, EMS, and media cited by more than half of respondents. Conclusions In response to the tsunami threat in California, we found that emergency management agencies assumed a lead role in the local response efforts. While public health and medical agencies played a supporting role in the response, they uniquely contributed to a number of specific activities. If the response to the recent tsunami is any indication, these support activities can be anticipated in planning for future events with similar characteristics to the tsunami threat. 
Additionally, we found that many respondents first learned of the tsunami through the media, rather than through rapid notification systems, which suggests that government agencies must continue to develop and maintain the ability to rapidly aggregate and analyze information in order to provide accurate assessments and guidance to a potentially well-informed public. Citation: Hunter JC, Crawley AW, Petrie M, Yang JE, Aragón TJ. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake. PLoS Currents Disasters. 2012 Jul 16 PMID:22953236
NASA Astrophysics Data System (ADS)
Dominey-Howes, D.
2009-12-01
The September 2009 tsunami was a regional South Pacific event of enormous significance. Our UNESCO-IOC ITST Samoa survey used a simplified version of a ‘coupled human-environment systems framework’ (Turner et al., 2003) to investigate the impacts and effects of the tsunami in Samoa. Further, the framework allowed us to identify those factors that affected the vulnerability and resilience of the human-environment system before, during and after the tsunami - a global first. Key findings (unprocessed) include: maximum run-up exceeded 14 metres above sea level; maximum inundation (at right angles to the shore) was approximately 400 metres; maximum inundation with the wave running parallel with the shore (but inland) exceeded 700 metres; buildings sustained varying degrees of damage; damage was correlated with depth of tsunami flow, velocity, condition of foundations, quality of building materials used, quality of workmanship, adherence to the building code and so on; buildings raised even one metre above the surrounding land surface suffered much less damage; plants, trees and mangroves reduced flow velocity and flow depth, leading to greater chances of human survival and lower levels of building damage; the tsunami has left a clear and distinguishable geological record in terms of sediments deposited in the coastal landscape; the clear sediment layer associated with this tsunami suggests that older (and prehistoric) tsunamis can be identified, helping to answer questions about frequency and magnitude of tsunamis; the tsunami caused widespread erosion of the coastal and beach zones, but this damage will repair itself naturally and quickly; the tsunami has had clear impacts on ecosystems and these are highly variable; ecosystems will repair themselves naturally and are unlikely to preserve long-term impacts; it is clear that some plant (tree) species are highly resilient and provided immediate places for safety during the tsunami and resources post-tsunami; people of Samoa are forgetting their knowledge of the value and uses of indigenous plant and animal species, and efforts are needed to increase understanding of the value of these plants and animals (thus increasing community resilience); video recording survivor stories is important; sadly, there is no tradition of story telling or memory of past tsunamis, so the capturing of survivor accounts means that such stories can be introduced to the cultural memory; permitting survivors to tell their stories allows them to heal emotionally, and also provides valuable information for future education and community outreach; the people of Samoa are hurting after the tsunami; impacts and effects are highly variable socially and spatially; where lives have been lost, the impacts and associated fear are much higher; communities require practical and long-term emotional care; and a complex picture is emerging about community experiences of warning and response behaviour that presents challenges to the Government of Samoa in terms of education and outreach for hazard reduction.
NASA Astrophysics Data System (ADS)
Savastano, Giorgio; Komjathy, Attila; Verkhoglyadova, Olga; Mazzoni, Augusto; Crespi, Mattia; Wei, Yong; Mannucci, Anthony J.
2017-04-01
It is well known that tsunamis can produce gravity waves that propagate up to the ionosphere, generating disturbed electron densities in the E and F regions. These ionospheric disturbances can be studied in detail using ionospheric total electron content (TEC) measurements collected by continuously operating ground-based receivers of the Global Navigation Satellite Systems (GNSS). Here, we present results using a new approach, named VARION (Variometric Approach for Real-Time Ionosphere Observation), to estimate slant TEC (sTEC) variations in a real-time scenario. Using the VARION algorithm, we compute TEC variations at 56 GPS receivers in Hawaii as induced by the 2012 Haida Gwaii tsunami event. We observe TEC perturbations with amplitudes of up to 0.25 TEC units and traveling ionospheric disturbances (TIDs) moving away from the earthquake epicenter at an approximate speed of 316 m/s. We perform a wavelet analysis to analyze localized variations of power in the TEC time series, and we find perturbation periods consistent with typical deep-ocean tsunami periods. Finally, we present comparisons with the real-time tsunami MOST (Method of Splitting Tsunami) model produced by the NOAA Center for Tsunami Research, and we observe variations in TEC that correlate in time and space with the tsunami waves.
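The variometric idea behind VARION can be illustrated with a few lines of code: time-differencing the geometry-free combination of the two carrier phases cancels geometry and (cycle-slip-free) ambiguities and leaves the epoch-to-epoch sTEC change. The sketch below applies the standard dual-frequency scaling to a synthetic phase series; it is not the VARION software, and the signal, sampling, and offsets are assumptions.

```python
import numpy as np

# Sketch of the core variometric idea: time-differencing the geometry-free
# (L1 - L2) carrier-phase combination removes geometry and integer ambiguities
# (absent cycle slips) and leaves the epoch-to-epoch change of slant TEC.
# The phase series below is synthetic; a real implementation reads RINEX
# observations and handles cycle slips, elevation cutoffs, and outliers.
F1, F2 = 1575.42e6, 1227.60e6            # GPS L1/L2 frequencies (Hz)
K = 40.308                               # approximate ionospheric constant (m^3 s^-2)
M_TO_TECU = F1**2 * F2**2 / (K * (F1**2 - F2**2) * 1e16)

def stec_rate(l1_m, l2_m, dt_s):
    """Slant TEC rate (TECU/s) from carrier phases expressed in meters."""
    l_gf = np.asarray(l1_m) - np.asarray(l2_m)   # geometry-free combination
    return np.diff(l_gf) * M_TO_TECU / dt_s

# Synthetic example: a 0.2 TECU oscillation with a ~15 minute period, 30 s sampling.
t = np.arange(0, 3600, 30.0)
stec_true = 0.2 * np.sin(2 * np.pi * t / 900.0)          # TECU
l_gf_true = stec_true / M_TO_TECU                        # equivalent meters of L1-L2
geometry = 2.0e7                                         # common range term (cancels)
l1 = geometry + l_gf_true
l2 = np.full_like(t, geometry)
print("peak sTEC rate: %.4f TECU/s" % np.abs(stec_rate(l1, l2, 30.0)).max())
```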
Tsunami waveform inversion of the 2007 Bengkulu, southern Sumatra, earthquake
NASA Astrophysics Data System (ADS)
Fujii, Y.; Satake, K.
2008-09-01
We performed tsunami waveform inversions for the Bengkulu, southern Sumatra, earthquake on September 12, 2007 (Mw 8.4 by USGS). The tsunami was recorded at many tide gauge stations around the Indian Ocean and by a DART system in the deep ocean. The observed tsunami records indicate that the amplitudes were less than several tens of centimeters at most stations, around 1 m at Padang, the nearest station to the source, and a few centimeters at the DART station. For the tsunami waveform inversions, we adopted 20-, 15- and 10-subfault models. The tsunami waveforms computed from the estimated slip distributions explain the observed waveforms at most stations, regardless of the subfault model. We found that large slips were consistently estimated at the deeper part (>24 km) of the fault plane, located more than 100 km from the trench axis. The largest slips of 6-9 m were located about 100-200 km northwest of the epicenter. The deep slips may have contributed to the relatively small tsunami for its earthquake size. The total seismic moment is calculated as 4.7 × 10^21 N m (Mw = 8.4) for the 10-subfault model, our preferred model from a comparison of tsunami waveforms at Cocos and the DART station.
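The inversion described above is linear: observed waveforms are modeled as a slip-weighted sum of precomputed subfault Green's functions. The sketch below solves such a system with non-negative least squares on synthetic Green's functions and a made-up slip pattern; the matrix, noise level, rigidity, and subfault area are illustrative assumptions, not the Bengkulu dataset.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of the standard linear tsunami waveform inversion: stack the Green's
# function waveforms (unit slip on each subfault, computed beforehand with a
# propagation model) into a matrix and solve for non-negative slips that best
# reproduce the observed records. Everything below is synthetic.
rng = np.random.default_rng(0)
n_samples, n_subfaults = 600, 10        # concatenated gauge samples, subfault count

G = rng.normal(size=(n_samples, n_subfaults))     # stand-in Green's functions
true_slip = np.array([0., 0., 1.5, 4.0, 6.5, 3.0, 0.5, 0., 0., 0.])  # meters
observed = G @ true_slip + 0.05 * rng.normal(size=n_samples)         # add noise

slip, _ = nnls(G, observed)                       # non-negativity keeps slips physical
moment = np.sum(3.0e10 * 400e6 * slip)            # mu * subfault area * slip (illustrative)
print("recovered slip (m):", np.round(slip, 2))
print("seismic moment (N m): %.2e" % moment)
```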
High tsunami frequency as a result of combined strike-slip faulting and coastal landslides
Hornbach, Matthew J.; Braudy, Nicole; Briggs, Richard W.; Cormier, Marie-Helene; Davis, Marcy B.; Diebold, John B.; Dieudonne, Nicole; Douilly, Roby; Frohlich, Cliff; Gulick, Sean P.S.; Johnson, Harold E.; Mann, Paul; McHugh, Cecilia; Ryan-Mishkin, Katherine; Prentice, Carol S.; Seeber, Leonardo; Sorlien, Christopher C.; Steckler, Michael S.; Symithe, Steeve Julien; Taylor, Frederick W.; Templeton, John
2010-01-01
Earthquakes on strike-slip faults can produce devastating natural hazards. However, because they consist predominantly of lateral motion, these faults are rarely associated with significant uplift or tsunami generation. And although submarine slides can generate tsunami, only a few per cent of all tsunami are believed to be triggered in this way. The 12 January Mw 7.0 Haiti earthquake exhibited primarily strike-slip motion but nevertheless generated a tsunami. Here we present data from a comprehensive field survey that covered the onshore and offshore area around the epicentre to document that modest uplift together with slope failure caused tsunamigenesis. Submarine landslides caused the most severe tsunami locally. Our analysis suggests that slide-generated tsunami occur an order-of-magnitude more frequently along the Gonave microplate than global estimates predict. Uplift was generated because of the earthquake's location, where the Caribbean and Gonave microplates collide obliquely. The earthquake also caused liquefaction at several river deltas that prograde rapidly and are prone to failure. We conclude that coastal strike-slip fault systems such as the Enriquillo-Plantain Garden fault produce relief conducive to rapid sedimentation, erosion and slope failure, so that even modest predominantly strike-slip earthquakes can cause potentially catastrophic slide-generated tsunami - a risk that is underestimated at present.
Tsunami Detection by High-Frequency Radar Beyond the Continental Shelf
NASA Astrophysics Data System (ADS)
Grilli, Stéphan T.; Grosdidier, Samuel; Guérin, Charles-Antoine
2016-12-01
Where coastal tsunami hazard is governed by near-field sources, such as submarine mass failures or meteo-tsunamis, tsunami propagation times may be too small for a detection based on deep or shallow water buoys. To offer sufficient warning time, it has been proposed to implement early warning systems relying on high-frequency (HF) radar remote sensing, that can provide a dense spatial coverage as far offshore as 200-300 km (e.g., for Diginext Ltd.'s Stradivarius radar). Shore-based HF radars have been used to measure nearshore currents (e.g., CODAR SeaSonde® system; http://www.codar.com/), by inverting the Doppler spectral shifts, these cause on ocean waves at the Bragg frequency. Both modeling work and an analysis of radar data following the Tohoku 2011 tsunami, have shown that, given proper detection algorithms, such radars could be used to detect tsunami-induced currents and issue a warning. However, long wave physics is such that tsunami currents will only rise above noise and background currents (i.e., be at least 10-15 cm/s), and become detectable, in fairly shallow water which would limit the direct detection of tsunami currents by HF radar to nearshore areas, unless there is a very wide shallow shelf. Here, we use numerical simulations of both HF radar remote sensing and tsunami propagation to develop and validate a new type of tsunami detection algorithm that does not have these limitations. To simulate the radar backscattered signal, we develop a numerical model including second-order effects in both wind waves and radar signal, with the wave angular frequency being modulated by a time-varying surface current, combining tsunami and background currents. In each "radar cell", the model represents wind waves with random phases and amplitudes extracted from a specified (wind speed dependent) energy density frequency spectrum, and includes effects of random environmental noise and background current; phases, noise, and background current are extracted from independent Gaussian distributions. The principle of the new algorithm is to compute correlations of HF radar signals measured/simulated in many pairs of distant "cells" located along the same tsunami wave ray, shifted in time by the tsunami propagation time between these cell locations; both rays and travel time are easily obtained as a function of long wave phase speed and local bathymetry. It is expected that, in the presence of a tsunami current, correlations computed as a function of range and an additional time lag will show a narrow elevated peak near the zero time lag, whereas no pattern in correlation will be observed in the absence of a tsunami current; this is because surface waves and background current are uncorrelated between pair of cells, particularly when time-shifted by the long-wave propagation time. This change in correlation pattern can be used as a threshold for tsunami detection. To validate the algorithm, we first identify key features of tsunami propagation in the Western Mediterranean Basin, where Stradivarius is deployed, by way of direct numerical simulations with a long wave model. Then, for the purpose of validating the algorithm we only model HF radar detection for idealized tsunami wave trains and bathymetry, but verify that such idealized case studies capture well the salient tsunami wave physics. 
Results show that, in the presence of strong background currents, the proposed method still allows detecting a tsunami with currents as low as 0.05 m/s, whereas a standard direct inversion based on radar signal Doppler spectra fails to reproduce tsunami currents weaker than 0.15-0.2 m/s. Hence, the new algorithm allows detecting tsunami arrival in deeper water, beyond the shelf and further away from the coast, and providing an early warning. Because the standard detection of tsunami currents works well at short range, we envision that, in a field situation, the new algorithm could complement the standard approach of direct near-field detection by providing a warning that a tsunami is approaching, at larger range and in greater depth. This warning would then be confirmed at shorter range by a direct inversion of tsunami currents, from which the magnitude of the tsunami would also be estimated. Hence, both algorithms would be complementary. In future work, the algorithm will be applied to actual tsunami case studies performed using a state-of-the-art long wave model, such as briefly presented here in the Mediterranean Basin.
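The core of the proposed detection algorithm is a correlation between pairs of cells, time-shifted by the known tsunami travel time between them. The sketch below reproduces that idea on synthetic current series (background noise plus an optional common tsunami signal arriving with the expected delay); the amplitudes, sampling, and travel lag are assumptions, and the series are not simulated radar backscatter.

```python
import numpy as np

# Pair-correlation detection idea: shift the far (offshore) cell's series by the
# known tsunami travel time to the near cell and look for a correlation peak
# near zero residual lag. All numbers below are illustrative.
rng = np.random.default_rng(1)
DT = 10.0                      # s, sampling of the current estimates (hypothetical)
N = 720                        # 2 hours of samples
MAX_LAG = 30                   # residual lags examined around the travel-time shift
t = np.arange(N) * DT

def cell_series(arrival_s, with_tsunami):
    u = 0.05 * rng.normal(size=N)                          # background current noise (m/s)
    if with_tsunami:
        u += 0.08 * np.sin(2 * np.pi * (t - arrival_s) / 900.0) * (t > arrival_s)
    return u

def lagged_correlation(u_far, u_near, travel_lag):
    """Correlation vs. residual lag after removing the known travel-time shift."""
    return np.array([
        np.corrcoef(u_far[:N - (travel_lag + k)], u_near[travel_lag + k:])[0, 1]
        for k in range(-MAX_LAG, MAX_LAG + 1)
    ])

travel_lag = 60                                            # 600 s between the two cells
for present in (False, True):
    far = cell_series(3000.0, present)                     # tsunami reaches far cell first
    near = cell_series(3000.0 + travel_lag * DT, present)  # ...and the near cell later
    c = lagged_correlation(far, near, travel_lag)
    print(f"tsunami present={present}: peak corr = {c.max():.2f} "
          f"at residual lag {(int(np.argmax(c)) - MAX_LAG) * DT:.0f} s")
```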
Numerical modeling of marine Gravity data for tsunami hazard zone mapping
NASA Astrophysics Data System (ADS)
Porwal, Nipun
2012-07-01
A tsunami is a series of ocean waves with very long wavelengths, ranging from 10 to 500 km. Tsunamis therefore behave as shallow-water waves and are hard to predict by various methods. Bottom Pressure Recorders of the Poseidon class are considered a preeminent method to detect tsunami waves, but the acoustic modems of ocean bottom pressure (OBP) sensors placed in the vicinity of trenches deeper than 6000 m fail to transmit OBP data to surface buoys. This paper therefore develops numerical modeling of gravity field coefficients from the Bureau Gravimétrique International (BGI), not for their usual role in geodesy, satellite orbit computation, and geophysics, but to generate high-resolution ocean bottom pressure (OBP) data by mathematical transformation of the coefficients using normalized Legendre polynomials. Real-time sea-level-monitored OBP data at 0.3° by 1° spatial resolution, produced for the past 10 years with a Kalman filter (kf080) by Estimating the Circulation and Climate of the Ocean (ECCO), have been correlated with the OBP data derived from the gravity field coefficients, supporting a feasibility study on future tsunami detection from space and on the identification of the most suitable sites for placing OBP sensors near deep trenches. The Levitus climatological temperature and salinity are assimilated into the version of the MITgcm using the adjoint method to obtain the sea-height segment. Then TOPEX/Poseidon satellite altimetry, surface momentum, heat, and freshwater fluxes from the NCEP reanalysis product, and the dynamic ocean topography DOT_DNSCMSS08_EGM08 are used to interpret sea-bottom elevation. All datasets are then combined with a high-resolution sea-floor topographic map using the raster calculator in ArcGIS 9.3, Boolean intersection algebra, and proximity analysis tools. The tsunami-prone areas and suitable sites for deployment of BPRs identified in this research are then validated using a passive microwave radiometry system for tsunami hazard zone mapping together with a network of seismometers. Using such a methodology for early tsunami hazard zone mapping also increases accuracy and reduces the time needed for tsunami predictions. KEYWORDS: Tsunami, Gravity Field Coefficients, Ocean Bottom Pressure, ECCO, BGI, Sea Bottom Temperature, Sea Floor Topography.
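The transformation mentioned above amounts to spherical-harmonic synthesis with fully normalized associated Legendre functions. The sketch below evaluates a field from a small, randomly generated coefficient table in the geodetic normalization convention; the coefficients and truncation degree are hypothetical and do not represent a real BGI gravity model.

```python
import numpy as np
from math import factorial, sqrt
from scipy.special import lpmv

# Minimal spherical-harmonic synthesis with fully normalized associated
# Legendre functions. The coefficient set below is random and low degree;
# it is NOT a real BGI gravity model.
NMAX = 4
rng = np.random.default_rng(2)
C = rng.normal(scale=1e-7, size=(NMAX + 1, NMAX + 1))   # C[n, m], hypothetical
S = rng.normal(scale=1e-7, size=(NMAX + 1, NMAX + 1))   # S[n, m], hypothetical

def normalized_pnm(n, m, x):
    """Fully normalized associated Legendre function (geodetic convention,
    Condon-Shortley phase removed)."""
    norm = sqrt((2 - (m == 0)) * (2 * n + 1) * factorial(n - m) / factorial(n + m))
    return (-1) ** m * norm * lpmv(m, n, x)

def synthesize(colat_deg, lon_deg):
    """Evaluate the field at one point from the coefficient tables."""
    x = np.cos(np.radians(colat_deg))
    lam = np.radians(lon_deg)
    total = 0.0
    for n in range(NMAX + 1):
        for m in range(n + 1):
            total += normalized_pnm(n, m, x) * (C[n, m] * np.cos(m * lam)
                                                + S[n, m] * np.sin(m * lam))
    return total

print("field value at colatitude 60 deg, longitude 140 deg E: %.3e" % synthesize(60.0, 140.0))
```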
Ionospheric Method of Detecting Tsunami-Generating Earthquakes.
ERIC Educational Resources Information Center
Najita, Kazutoshi; Yuen, Paul C.
1978-01-01
Reviews the earthquake phenomenon and its possible relation to ionospheric disturbances. Discusses the basic physical principles involved and the methods upon which instrumentation is being developed for possible use in a tsunami disaster warning system. (GA)
Post-tsunami beach recovery in Thailand: A case for punctuated equilibrium in coastal dynamics
NASA Astrophysics Data System (ADS)
Switzer, Adam D.; Gouramanis, Chris; Bristow, Charles; Yeo, Jeffrey; Kruawun, Jankaew; Rubin, Charles; Sin Lee, Ying; Tien Dat, Pham
2017-04-01
A morpho-geophysical investigation of two beaches in Thailand over the last decade shows that they have completely recovered from the 2004 Indian Ocean tsunami (IOT) without any human intervention. Although the beach systems show contrasting styles of recovery, in both cases natural processes have reconstructed the beaches to comparable pre-tsunami morphologies in under a decade, demonstrating the existence of punctuated equilibrium in coastal systems and the resilience of natural systems to catastrophic events. Through a combination of remote sensing, field surveys, and shallow geophysics, we reconstruct the post-event recovery of beaches at Phra Thong Island, a remote, near-pristine site that was severely impacted by the IOT. We identify periods of aggradation, progradation, and washover sedimentation that match local events, including a storm in November 2007. The rapid recovery of these systems implies that the majority of sediment scoured by the tsunami was not transported far offshore but remained in the littoral zone, within reach of the fair-weather waves that returned it to the beach naturally.
Assessment of tsunami hazard for coastal areas of Shandong Province, China
NASA Astrophysics Data System (ADS)
Feng, Xingru; Yin, Baoshu
2017-04-01
Shandong Province is located on the east coast of China and has a coastline of about 3100 km. Only a few tsunami events are recorded in the history of Shandong Province, but tsunami hazard assessment is still necessary because of the rapid economic development and increasing population of this area. The objective of this study was to evaluate the potential danger posed by tsunamis to Shandong Province. A numerical simulation approach was adopted to assess the tsunami hazard for coastal areas of Shandong Province. The Cornell multi-grid coupled tsunami numerical model (COMCOT) was used, and its efficacy was verified by comparison with three historical tsunami events; the simulated maximum tsunami wave heights agreed well with the observational data. Based on previous studies and statistical analyses, multiple earthquake scenarios in eight seismic zones were designed, with magnitudes set to the potential maximum values. The tsunamis they induced were then simulated with the COMCOT model to investigate their impact on the coastal areas of Shandong Province. The numerical results showed that the maximum tsunami wave height, caused by the earthquake scenario located in the sea area of the Mariana Islands, could reach up to 1.39 m off the eastern coast of Weihai city. Tsunamis from the seismic zones of the Bohai Sea, Okinawa Trough, and Manila Trench could also reach heights of >1 m in some areas, meaning that earthquakes in these zones should not be ignored. The inundation hazard was distributed primarily in some northern coastal areas near Yantai and southeastern coastal areas of the Shandong Peninsula. When considering both the magnitude and arrival time of tsunamis, it is suggested that greater attention be paid to earthquakes that occur in the Bohai Sea. In conclusion, the tsunami hazard facing the coastal area of Shandong Province is not very serious; however, disasters could occur if such events coincided with spring tides or other extreme oceanic conditions. The results of this study will be useful for the design of coastal engineering projects and the establishment of a tsunami warning system for Shandong Province.
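COMCOT solves the nonlinear shallow-water equations on nested grids with real bathymetry; that cannot be reproduced in a few lines, but the sketch below shows the underlying idea, linear shallow-water propagation on a 1-D staggered grid with the maximum wave height recorded along the way. The depth, source, and domain are placeholder values, not the Shandong configuration.

```python
import numpy as np

g = 9.81
L, N = 400e3, 800                 # domain length (m) and number of cells
dx = L / N
h = np.full(N, 200.0)             # assumed uniform depth (m); real runs use bathymetry
x = (np.arange(N) + 0.5) * dx

eta = 0.5 * np.exp(-((x - 100e3) / 20e3) ** 2)   # initial sea-surface hump (m)
u = np.zeros(N + 1)                              # velocities on staggered cell faces

dt = 0.5 * dx / np.sqrt(g * h.max())             # CFL-limited time step
eta_max = eta.copy()

for _ in range(int(3600 / dt)):                  # one hour of propagation
    # momentum: du/dt = -g * d(eta)/dx (interior faces; end faces act as walls)
    u[1:-1] -= g * dt / dx * np.diff(eta)
    # continuity: d(eta)/dt = -d(h*u)/dx, with depth averaged onto the faces
    h_face = 0.5 * (np.r_[h[0], h] + np.r_[h, h[-1]])
    eta -= dt / dx * np.diff(h_face * u)
    eta_max = np.maximum(eta_max, eta)

print(f"maximum wave height reached anywhere in the domain: {eta_max.max():.2f} m")
```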
Numerical Simulation of Several Tectonic Tsunami Sources at the Caribbean Basin
NASA Astrophysics Data System (ADS)
Chacon-Barrantes, S. E.; Lopez, A. M.; Macias, J.; Zamora, N.; Moore, C. W.; Llorente Isidro, M.
2016-12-01
The Tsunami Hazard Assessment Working Group (WG2) of the Intergovernmental Coordination Group for the Tsunami and Other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (ICG/CARIBE-EWS) has been tasked with identifying tsunami sources for the Caribbean region and evaluating their effects along Caribbean coasts. A list of tectonic sources was developed and presented at the Fall 2015 AGU meeting, and the WG2 is currently working on a list of non-tectonic sources. In addition, three experts meetings were held in 2016 to define worst-case, most credible scenarios for southern Hispaniola and Central America. The WG2 has been tasked with simulating these scenarios to provide an estimate of the resulting effects on coastal areas within the Caribbean. In this study we simulated tsunamis with two leading numerical models (NEOWAVE and Tsunami-HySEA) to compare results between them and report on the consequences for the Caribbean region if a tectonically induced tsunami originates from any of these postulated sources. The sources considered are located offshore Central America, at the North Panamá Deformed Belt (NPDB), at the South Caribbean Deformed Belt (SCDB), and around La Hispaniola Island. The results obtained in this study are critical to developing a catalog of scenarios that can be used in future CaribeWave exercises, as well as to their use by ICG/CARIBE-EWS member states as input for modeling tsunami inundation at their coastal locations. Inundation parameters are an additional step toward producing tsunami evacuation maps and developing plans and procedures to increase tsunami awareness and preparedness within the Caribbean.
NASA Astrophysics Data System (ADS)
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent submarine earthquakes and their subsequent tsunamis in December 2004 and July 2006, further tsunamis are expected in the near future, because increased tectonic stresses accumulated after a century of relative tectonic quiescence can lead to abrupt vertical seafloor displacements. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. For a tsunami impact, the hazard assessment is mostly based on numerical modelling, because model results normally offer the most precise database for a hazard analysis: they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The source used as the boundary condition of the model is usually chosen with a worst-case approach: the location and magnitude that are likely to occur and that are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded by the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must all be considered. Thus, a multi-scenario tsunami modelling approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast from the multi-scenario approach is to overlay all scenario inundation results and determine how often a point on land is significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence, a statistical analysis of historical data and of geophysical investigation results based on numerical modelling is added, which clearly improves the significance of the hazard assessment. For this purpose the present method is developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability for every point on land of being hit by a tsunami. These values are combined by a logical-tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before.
The result is a tsunami inundation probability map covering the southwest coast of Indonesia that nevertheless shows significant spatial diversity, offering a good basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
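The combination step described above, weighting each scenario's inundation footprint by the likelihood of its source, can be illustrated with a simple per-cell aggregation. The sketch below assumes independent scenarios with placeholder occurrence probabilities; the GITEWS logical-tree weighting of magnitudes, wave heights, and inundation is considerably more elaborate.

```python
import numpy as np

def inundation_probability(masks, source_probs):
    """Combine binary inundation masks (one per scenario) into the per-cell
    probability that at least one source inundates the cell, assuming the
    scenarios occur independently of one another."""
    masks = np.asarray(masks, dtype=float)            # shape (n_scenarios, ny, nx)
    p = np.asarray(source_probs)[:, None, None]       # per-scenario probability
    # P(inundated) = 1 - product over scenarios of (1 - p_i) wherever scenario i floods the cell
    return 1.0 - np.prod(1.0 - p * masks, axis=0)

# Three toy 4x4 scenario footprints with assumed (not GITEWS) source probabilities.
masks = [np.random.default_rng(i).integers(0, 2, size=(4, 4)) for i in range(3)]
source_probs = [0.01, 0.005, 0.02]
print(inundation_probability(masks, source_probs))
```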
Comparison of Human Response against Earthquake and Tsunami
NASA Astrophysics Data System (ADS)
Arikawa, T.; Güler, H. G.; Yalciner, A. C.
2017-12-01
The evacuation response to earthquakes and tsunamis is very important for reducing tsunami casualties, but human behavior after earthquake shaking is very difficult to predict. The purpose of this research is to clarify how human responses after an earthquake shock differ between countries and to consider the relation between response and the sense of safety, knowledge, and education. To this end, a questionnaire survey was conducted after the 21 July 2017 Gokova earthquake and tsunami, and the results are compared with human behavior during the 2015 Chilean earthquake and tsunami and the 2011 Japan earthquake and tsunami. The seismic intensity at the survey points was about 6 to 7. The questions covered the feeling of shaking, recollection of the tsunami threat, behavior after the shock, and so on. The questionnaire was administered to more than 20 people in 10 areas. The results are as follows: 1) most people felt the shaking was so strong that they could not stand; 2) none of the respondents recalled the possibility of a tsunami; 3) depending on the area, people felt that after the earthquake the beach was safer than being at home; 4) after they saw the sea withdrawing, they thought that a tsunami would come and ran away. Fig. 1 compares the evacuation rate within 10 minutes in 2011 Japan, 2015 Chile, and 2017 Turkey. From the education point of view, tsunami education is limited in Turkey. From the protection-facilities point of view, high sea walls are constructed only in Japan. From the warning point of view, there is no warning system against tsunamis in the Mediterranean Sea. This survey shows the importance of tsunami education, and that evacuation tends to be delayed if dependency on facilities and alarms is too high.
TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don; Bowman, Stephen M
2009-01-01
This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and the need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system, and of TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria, are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
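The sensitivity/uncertainty machinery sketched in the primer rests on two quantities: the first-order propagation of cross-section covariances through the k_eff sensitivity vector, sigma_k^2 = S^T C S, and a covariance-weighted correlation (c_k) between an application and an experiment. The Python sketch below evaluates both formulas with placeholder sensitivity vectors and a placeholder covariance matrix; it is a didactic stand-in, not the SCALE/TSUNAMI implementation.

```python
import numpy as np

def keff_uncertainty(S, C):
    """First-order relative uncertainty in k_eff from cross-section covariances:
    (sigma_k / k)^2 = S^T C S, with S the vector of relative sensitivities."""
    return float(np.sqrt(S @ C @ S))

def ck(S_app, S_exp, C):
    """Covariance-weighted similarity between an application and an experiment."""
    return float((S_app @ C @ S_exp) /
                 (keff_uncertainty(S_app, C) * keff_uncertainty(S_exp, C)))

# Placeholder data: 5 nuclide-reaction sensitivities and a covariance matrix.
rng = np.random.default_rng(1)
S_app = rng.normal(scale=0.05, size=5)
S_exp = S_app + rng.normal(scale=0.01, size=5)     # a "similar" experiment
A = rng.normal(scale=0.02, size=(5, 5))
C = A @ A.T                                        # symmetric positive semi-definite

print(f"relative k_eff uncertainty: {keff_uncertainty(S_app, C):.4f}")
print(f"c_k similarity: {ck(S_app, S_exp, C):.3f}")
```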
NASA Astrophysics Data System (ADS)
Makela, J. J.; Lognonne, P.; Occhipinti, G.; Hebert, H.; Gehrels, T.; Coisson, P.; Rolland, L. M.; Allgeyer, S.; Kherani, A.
2011-12-01
The Mw=9.0 earthquake that occurred off the east coast of Honshu, Japan on 11 March 2011 launched a tsunami that traveled across the Pacific Ocean, in turn launching vertically propagating atmospheric gravity waves. Upon reaching 250-350 km in altitude, these waves impressed their signature on the thermosphere/ionosphere system. We present observations of this signature obtained using a variety of radio instruments and an imaging system located on the islands of Hawaii. These measurements represent the first optical images recorded of the airglow signature resulting from the passage of a tsunami. Results from these instruments clearly show wave structure propagating in the upper atmosphere with the same velocity as the ocean tsunami, emphasizing the coupled nature of the ocean, atmosphere, and ionosphere. Modeling results are also presented to highlight current understandings of this coupling process.
The GNSS-based component for the new Indonesian tsunami early warning centre provided by GITEWS
NASA Astrophysics Data System (ADS)
Falck, C.; Ramatschi, M.; Bartsch, M.; Merx, A.; Hoeberechts, J.; Rothacher, M.
2009-04-01
Introduction Nowadays GNSS technologies are used for a large variety of precise positioning applications. The accuracy can reach the mm level, depending on the data analysis methods. GNSS technologies thus offer a high potential to support tsunami early warning systems, e.g., by detecting ground motion due to earthquakes and detecting tsunami waves on the ocean with GNSS instruments on a buoy. Although GNSS-based precise positioning is a standard method, it is not yet common to apply this technique under tight time constraints and, hence, in the absence of precise satellite orbits and clocks. The newly developed GNSS-based component utilises GNSS data measured on- and offshore and is the first system of its kind integrated into an operational early warning system (the Indonesian Tsunami Early Warning Centre, INATEWS, inaugurated at BMKG, Jakarta, on 11 November 2008). Motivation After the tsunami event of 26 December 2004, the German government initiated the GITEWS project (German Indonesian Tsunami Early Warning System) to develop a tsunami early warning system for Indonesia. GFZ Potsdam (German Research Centre for Geosciences), as the consortium leader of GITEWS, also covers several work packages, most of them related to sensor systems. The geodetic branch (Department 1) of the GFZ was assigned to develop a GNSS-based component. Brief system description The system covers all aspects from sensor stations with newly developed hardware and software designs, the manufacture and installation of stations, and real-time data transfer, to a newly developed automatic near-real-time data processing chain and a graphical user interface for early warning centre operators, including training on the system. GNSS sensors are installed on buoys, at tide gauges, and as real-time reference stations (RTR stations), either stand-alone or co-located with seismic sensors. The GNSS data are transmitted to the warning centre, where they are processed in a near-real-time data processing chain. For sensors on land, the processing system delivers deviations from their normal, mean coordinates. These deviations, or so-called displacements, are indicators of land-mass movements, which can occur, e.g., due to strong earthquakes. The ground motion information is a valuable source for a fast understanding of an earthquake's mechanism, with possible relevance for a potentially following tsunami. By this means the GNSS system supports the decision-making process on whether a tsunami has most probably been generated or not. For buoy-based GNSS data, the differential processing (with a GNSS reference station on land) delivers coordinates as well; only the vertical component is of interest, as it corresponds to the instantaneous sea level height. Deviations from the mean sea level height are an indicator of a possibly passing tsunami wave. The graphical user interface (GUI) of the GNSS system supports both a quick view for all staff members at the warning centre (24h/7d shifts) and deeper analysis by GNSS experts. The GNSS GUI system is implemented as a web-based application and allows all views to be displayed on different screens at the same time, even at remote locations. This is part of the concept, as it can support the dialogue between warning centre staff on duty or on standby and sensor station maintenance staff.
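The core output of the processing chain, deviations of a station's coordinates from its long-term mean, can be illustrated with a minimal anomaly check on a vertical-component time series. The window length and threshold below are placeholders, not the GITEWS settings.

```python
import numpy as np

def displacement_flags(heights_m, window=600, threshold_m=0.10):
    """Flag epochs whose vertical coordinate deviates from the trailing mean
    of the previous `window` epochs by more than `threshold_m`."""
    h = np.asarray(heights_m, dtype=float)
    flags = np.zeros(h.size, dtype=bool)
    for i in range(window, h.size):
        baseline = h[i - window:i].mean()        # trailing mean height
        flags[i] = abs(h[i] - baseline) > threshold_m
    return flags

# Synthetic 1 Hz buoy heights: noise plus a 0.3 m vertical offset halfway through.
rng = np.random.default_rng(2)
heights = 0.02 * rng.normal(size=3600)
heights[1800:] += 0.3
print("first flagged epoch:", int(np.argmax(displacement_flags(heights))))
```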
Acknowledgements The GITEWS project (German Indonesian Tsunami Early Warning System) is carried out by a large group of scientists and engineers from the German Research Centre for Geosciences (GFZ) and its partners from the German Aerospace Centre (DLR), the Alfred Wegener Institute for Polar and Marine Research (AWI), the GKSS Research Centre, the Konsortium Deutsche Meeresforschung (KDM), the Leibniz Institute for Marine Sciences (IFM-GEOMAR), the United Nations University (UNU), the Federal Institute for Geosciences and Natural Resources (BGR), the German Agency for Technical Cooperation (GTZ) and other international partners. The most relevant partners in Indonesia with respect to the GNSS component of GITEWS are the National Coordinating Agency for Surveys and Mapping (BAKOSURTANAL), the National Meteorology and Geophysics Agency (BMKG) and the National Agency for the Assessment and Application of Technology (BPPT). Funding is provided by the German Federal Ministry for Education and Research (BMBF), Grant 03TSU01.
GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps
Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.
2007-01-01
A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.
Evaluation of earthquake and tsunami on JSFR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chikazawa, Y.; Enuma, Y.; Kisohara, N.
2012-07-01
Evaluation of earthquake and tsunami effects on JSFR has been performed. For seismic design, safety components are confirmed to maintain their functions even against recent strong earthquakes. As for tsunami, some parts of the reactor building might be submerged, including the component cooling water system, whose final heat sink is sea water. However, in the JSFR design, safety-grade components are independent of the component cooling water system (CCWS). The JSFR emergency power supply adopts a gas turbine system with air cooling, since JSFR does not basically require quick start-up of the emergency power supply thanks to the natural-convection DHRS. Even in case of a long station blackout, the DHRS could be activated by emergency batteries or manually and be operated continuously by natural convection. (authors)
Numerical Modeling and Forecasting of Strong Sumatra Earthquakes
NASA Astrophysics Data System (ADS)
Xing, H. L.; Yin, C.
2007-12-01
ESyS-Crustal, a finite element based computational model and software, has been developed and applied to simulate complex nonlinear interacting fault systems with the goal of accurately predicting earthquakes and tsunami generation. With the available tectonic setting and GPS data around the Sumatra region, the simulation results using the developed software have clearly indicated that the shallow part of the subduction zone in the Sumatra region between latitudes 6S and 2N has been locked for a long time, and remained locked even after the northern part of the zone underwent a major slip event resulting in the infamous Boxing Day tsunami. Two strong earthquakes that occurred in the distant past in this region (between 6S and 1S), in 1797 (M8.2) and 1833 (M9.0) respectively, are indicative of the high potential for very large destructive earthquakes to occur in this region, with relatively long periods of quiescence in between. The results were presented at the 5th ACES International Workshop in 2006, before the 2007 Sumatra earthquakes occurred, which fell exactly within the predicted zone (see the ACES2006 web site and the detailed presentation file through the workshop agenda). The preliminary simulation results obtained so far show that there seem to be a few obvious events around the previously locked zone before it is totally ruptured, but apparently no indication of a giant earthquake similar to the 2004 M9 event in the near future, which several earthquake scientists believe will happen. Further detailed simulations will be carried out and presented in the meeting.
Inundation Mapping and Hazard Assessment of Tectonic and Landslide Tsunamis in Southeast Alaska
NASA Astrophysics Data System (ADS)
Suleimani, E.; Nicolsky, D.; Koehler, R. D., III
2014-12-01
The Alaska Earthquake Center conducts tsunami inundation mapping for coastal communities in Alaska, and is currently focused on the southeastern region and communities of Yakutat, Elfin Cove, Gustavus and Hoonah. This activity provides local emergency officials with tsunami hazard assessment, planning, and mitigation tools. At-risk communities are distributed along several segments of the Alaska coastline, each having a unique seismic history and potential tsunami hazard. Thus, a critical component of our project is accurate identification and characterization of potential tectonic and landslide tsunami sources. The primary tectonic element of Southeast Alaska is the Fairweather - Queen Charlotte fault system, which has ruptured in 5 large strike-slip earthquakes in the past 100 years. The 1958 "Lituya Bay" earthquake triggered a large landslide into Lituya Bay that generated a 540-m-high wave. The M7.7 Haida Gwaii earthquake of October 28, 2012 occurred along the same fault, but was associated with dominantly vertical motion, generating a local tsunami. Communities in Southeast Alaska are also vulnerable to hazards related to locally generated waves, due to proximity of communities to landslide-prone fjords and frequent earthquakes. The primary mechanisms for local tsunami generation are failure of steep rock slopes due to relaxation of internal stresses after deglaciation, and failure of thick unconsolidated sediments accumulated on underwater delta fronts at river mouths. We numerically model potential tsunami waves and inundation extent that may result from future hypothetical far- and near-field earthquakes and landslides. We perform simulations for each source scenario using the Alaska Tsunami Model, which is validated through a set of analytical benchmarks and tested against laboratory and field data. Results of numerical modeling combined with historical observations are compiled on inundation maps and used for site-specific tsunami hazard assessment by emergency planners.
Tsunami Preparedness, Response, Mitigation, and Recovery Planning in California
NASA Astrophysics Data System (ADS)
Miller, K.; Wilson, R. I.; Johnson, L. A.; Mccrink, T. P.; Schaffer, E.; Bower, D.; Davis, M.
2016-12-01
In California, officials of state, federal, and local governments have coordinated to implement a Tsunami Preparedness and Mitigation Program. Building upon past preparedness efforts carried out year-round, this group has leveraged government support at all levels. A primary goal is for everyone who lives at or visits the coast to understand basic life-safety measures when responding to official tsunami alerts or natural warnings. Preparedness actions include: observation of National Tsunami Preparedness Week, local "tsunami walk" drills, scenario-based exercises, testing of notification systems for public alert messaging, outreach materials, workshops, presentations, and media events. Program partners have worked together to develop emergency operations, evacuation plans, and tsunami annexes to plans for counties, cities, communities, and harbors in 20 counties along the coast. Working with state and federal partner agencies, coastal communities have begun to incorporate sophisticated tsunami "Playbook" scenario information into their planning. These innovative tsunami evacuation and response tools provide detailed evacuation maps and associated real-time response information for identifying areas where flooding could occur. This is critical information for evacuating populations on land near the shoreline. Acting on recommendations from the recent USGS-led, multi-discipline Science Application for Risk Reduction Tsunami Scenario report on impacts to California and American Society of Civil Engineers adoption proposals to the International Building Code, the state has begun to develop a strategy to incorporate probabilistic tsunami findings into state-level policy recommendations addressing building code adoption, as well as approaches to land use planning and building code implementation in local jurisdictions. Additional efforts, in the context of sustained community resiliency, include developing recovery planning guidance for local communities.
Tsunami Preparedness: Building On Past Efforts to Reach More People… California and Beyond!
NASA Astrophysics Data System (ADS)
Miller, K.; Siegel, J.; Pridmore, C. L.; Benthien, M. L.; Wilson, R. I.; Long, K.; Ross, S.
2014-12-01
The California Tsunami Program has continued to build upon past preparedness efforts, carried out year-round, while leveraging government support at all levels during National Tsunami Preparedness Week, the last week of March. A primary goal is for everyone who lives at or visits the coast to understand basic safety measures when responding to official tsunami alerts or natural warnings. In 2014, more so than ever before, many local, coastal jurisdictions conducted grass-roots activities in their areas. When requested, state and federal programs stepped in to contribute subject matter expertise, lessons learned, and support. And, this year, the new website, www.TsunamiZone.org, was developed. With a goal of establishing a baseline for future years, this website builds on the successes of the Great Shakeout Earthquake Drills (www.ShakeOut.org) by allowing people to locate and register for tsunami preparedness events in their area. Additionally, it provides a central location for basic tsunami preparedness information, and links to find out more. The idea is not only to empower people with the best available, vetted, scientifically-based public safety information, but also to provide ways in which individuals can take physical action to educate themselves and others. Several broad categories of preparedness actions include: official acknowledgement of National Tsunami Preparedness Week, local "tsunami walk" drills, simulated tsunami-based exercises, testing of sirens and notification systems, outreach materials (brochures, videos, maps), workshops, presentations, media events, and websites. Next steps include building on the foundation established in 2014 by leveraging ShakeOut audiences, providing people with more information about how they can participate in 2015, and carrying the effort forward to other states and territories.
The Geological Trace Of The 1932 Tsunamis In The Tropical Jalisco-Colima Coast, Mexico
NASA Astrophysics Data System (ADS)
Ramirez-Herrera, M.; Blecher, L.; Goff, J. R.; Corona, N.; Chague-Goff, C.; Lagos, M.; Hutchinson, I.; Aguilar, B.; Goguitchaichrili, A.; Machain-Castillo, M. L.; Rangel, V.; Zawadzki, A.; Jacobsen, G.
2013-05-01
The study and preservation of tsunami deposits are challenging in humid tropical environments. While tsunami deposits have been widely studied at temperate latitudes, few studies assess this problem in tropical environments owing to the difficulties intrinsic to these places (e.g., tsunami deposit preservation, post-burial changes in a tropical environment, mangrove vegetation, difficult access, and wildlife, among others). Here we assess the problem of tsunami-deposit preservation on the Jalisco-Colima tropical coast of Mexico, which parallels the more than 1000-km-long Mexican subduction zone, where historical accounts indicate the occurrence of two significant tsunamis on June 3 and 22, 1932 (Corona and Ramírez-Herrera, 2012a; Valdivia et al., 2012). To date, however, no geological evidence of these events had been reported. We present geological evidence of two large tsunamis related to the June 3, M 8.2 earthquake and the June 22, Ms 6.9 landslide-triggering event of 1932 (Corona and Ramírez-Herrera, 2012a, b). A multiproxy approach was applied to unravel the nature of anomalous sand units and sharp basal contacts in the stratigraphy of a number of sites at the Palo Verde estuary, the El Tecuán swales and marsh, and the La Manzanilla swales, on the Jalisco-Colima coast. Lines of evidence including historical, geomorphological, stratigraphic, grain size, organic matter content, microfossil (diatoms and foraminifera), geochemical, magnetic susceptibility, and AMS analyses, together with dating (210Pb and 14C) and modeling, corroborate the presence of deposits of both the 3 June 1932 tsunami at El Tecuán and La Manzanilla and the 22 June 1932 tsunami at Palo Verde. Evidence of at least four earlier tsunamis is also present in the stratigraphy. Work in progress should reveal the chronology of the earliest tsunamis and their origin. Corona, N., M.T. Ramirez-Herrera (2012a) Mapping and historical reconstruction of the great Mexican 1932 tsunami. Natural Hazards and Earth System Sciences, 12, 1337-1352. NHESS-2011-369. Corona Morales, N. y M.T. Ramírez-Herrera (2012b) Técnicas histórico-etnográficas en la reconstrucción y caracterización de tsunamis: El ejemplo del gran tsunami del 22 de junio de 1932, en las costas del Pacífico Mexicano. Revista de Geografía Norte Grande, 53, 107-122. Valdivia O. L., Castillo A. M.R., Estrada T. M. (2012). Tsunamis en Jalisco, Geocalli, Cuadernos de Geografía, Universidad de Guadalajara, Año 13, No. 25, 103p.
General Vulnerability and Exposure Profile to Tsunami in Puerto Rico
NASA Astrophysics Data System (ADS)
Ruiz, R.; Huérfano-Moreno, V.
2012-12-01
The Puerto Rico archipelago, located in the seismically active Caribbean region, has been directly affected by tsunamis in the last two centuries. The M 7.3 tsunamigenic earthquake that occurred on October 11, 1918, caused $29 million in damage, killed 116 people, and left 100 residents reported missing. Presently, deficiencies in urban planning have led to an increase in the number of vulnerable people living inside tsunami flood areas. Tsunami-prone areas have been delimited for Puerto Rico based on numerical tsunami modeling; however, the demographic, social, and physical (e.g., critical and essential facilities) characteristics of these areas have not been documented in detail. We are conducting a municipality- and community-level tsunami vulnerability and exposure study using Geographical Information System (GIS) tools. The results of our study are being integrated into the Puerto Rico Disaster Decision Support Tool (DDST). The DDST provides free access to a variety of updated geo-referenced information for Puerto Rico through internet-based scalable maps that will aid emergency managers and decision-makers in their responsibilities and improve the resilience of Puerto Rico's communities against tsunami hazard. This project aims to provide an initial estimate of Puerto Rico's vulnerability and exposure to tsunamis and to bring the community a technological tool that will help increase awareness of this hazard and assist in decision-making.
New buoy observation system for tsunami and crustal deformation
NASA Astrophysics Data System (ADS)
Takahashi, Narumi; Ishihara, Yasuhisa; Ochi, Hiroshi; Fukuda, Tatsuya; Tahara, Jun'ichiro; Maeda, Yosaku; Kido, Motoyuki; Ohta, Yusaku; Mutoh, Katsuhiko; Hashimoto, Gosei; Kogure, Satoshi; Kaneda, Yoshiyuki
2014-09-01
We have developed a new system for real-time observation of tsunamis and crustal deformation using a seafloor pressure sensor, an array of seafloor transponders, and a Precise Point Positioning (PPP) system on a buoy. The seafloor pressure sensor and the PPP system detect tsunamis, while the pressure sensor and the transponder array measure crustal deformation. The system is designed to be capable of detecting tsunamis and vertical crustal deformation of ±8 m with a resolution of less than 5 mm. A noteworthy innovation in our system is its resistance to disturbance by strong ocean currents. Seismogenic zones near Japan lie in areas of strong currents such as the Kuroshio, which reaches speeds of approximately 5.5 kt (2.8 m/s) around the Nankai Trough. Our techniques include slack mooring and a new acoustic transmission method that uses double pulses to send tsunami data. The slack ratio can be specified for the environment of the deployment location; we can adjust slack ratios, rope lengths, anchor weights, and buoy sizes to control the ability of the buoy system to maintain freeboard. The measured pressure data are converted to the time difference of a double pulse, a simple method that saves battery power during data transmission. The time difference of the double pulse is subject to error caused by buoy motion and fluctuations in the seawater environment; we set a wire-end station 1,000 m beneath the buoy to minimize this error. The crustal deformation data are measured by acoustic ranging between the buoy and six transponders on the seafloor. All pressure and crustal deformation data are sent to the land station in real time using Iridium communication.
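The acoustic link described above encodes each pressure sample as the time separation between two pulses. A minimal sketch of that idea follows, with an assumed linear mapping between pressure and pulse spacing; the actual scaling, quantization, and wire-end correction used in the buoy system are not specified in the abstract.

```python
def encode_pressure(pressure_dbar, p_min=5000.0, p_max=7000.0,
                    dt_min=0.10, dt_max=0.90):
    """Map a pressure reading linearly onto a pulse-pair time separation (s)."""
    frac = (pressure_dbar - p_min) / (p_max - p_min)
    return dt_min + frac * (dt_max - dt_min)

def decode_pressure(dt_seconds, p_min=5000.0, p_max=7000.0,
                    dt_min=0.10, dt_max=0.90):
    """Invert the mapping at the receiving station."""
    frac = (dt_seconds - dt_min) / (dt_max - dt_min)
    return p_min + frac * (p_max - p_min)

p = 6012.345                                  # dbar, roughly 6000 m water depth
dt = encode_pressure(p)
print(f"pulse separation {dt:.4f} s -> decoded {decode_pressure(dt):.3f} dbar")
```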
The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami
NASA Astrophysics Data System (ADS)
Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.
2011-12-01
The largest Pacific basin earthquake in 47 years, and also the largest magnitude earthquake since the Sumatra 2004 earthquake, struck off of the east coast of the Tohoku region of Honshu, Japan at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO), alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle wave, and W-phase magnitude estimations based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude reached at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor. Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of nearly three meters. The evacuation of Hawaii's coastlines commenced at 7:31 UTC. Concurrent with this tsunami event, a widely-felt Mw 4.6 earthquake occurred beneath the island of Hawai`i at 8:58 UTC. PTWC responded within three minutes of origin time with a Tsunami Information Statement stating that the Hawaii earthquake would not generate a tsunami. After issuing 27 international tsunami bulletins to Pacific basin countries, and 16 messages to the State of Hawaii during a period of 25 hours after the event began, PTWC concluded its role during the Tohoku tsunami event with the issuance of the corresponding warning cancellation message at 6:36 UTC on 12 March 2011. During the following weeks, however, the PTWC would continue to respond to dozens of aftershocks related to the earthquake. We will present a complete timeline of PTWC's activities, both domestic and international, during the Tohoku tsunami event. We will also illustrate the immense number of website hits, phone calls, and media requests that flooded PTWC during the course of the event, as well as the growing role social media plays in communicating tsunami hazard information to the public.
NASA Astrophysics Data System (ADS)
Mulia, Iyan E.; Gusman, Aditya Riadi; Satake, Kenji
2017-12-01
Recently, numerous tsunami observation networks have been deployed in several major tsunamigenic regions, but guidance on where to optimally place the measurement devices is limited. This study presents a methodological approach for selecting strategic observation locations for the purpose of tsunami source characterization, particularly in terms of the fault slip distribution. Initially, we identify favorable locations and determine the initial number of observations; these locations are selected at the extrema of empirical orthogonal function (EOF) spatial modes. To further improve the accuracy, we apply an optimization algorithm called mesh adaptive direct search to remove redundant measurement locations from the EOF-generated points. We test the proposed approach using multiple hypothetical tsunami sources around the Nankai Trough, Japan. The results suggest that the optimized observation points can produce more accurate fault slip estimates with considerably fewer observations than the existing tsunami observation networks.
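The first stage of the method, picking candidate gauge locations at the extrema of the leading EOF spatial modes of an ensemble of simulated tsunami fields, can be sketched with an SVD. The ensemble below is synthetic and the number of retained modes is an arbitrary choice; the mesh adaptive direct search refinement stage is omitted.

```python
import numpy as np

def eof_candidate_points(fields, n_modes=3):
    """fields: (n_scenarios, n_points) matrix of simulated offshore tsunami amplitudes.
    Returns indices of the extrema (max and min) of the leading EOF spatial modes."""
    X = fields - fields.mean(axis=0)            # remove the ensemble mean
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    candidates = []
    for mode in Vt[:n_modes]:                   # leading spatial modes
        candidates += [int(np.argmax(mode)), int(np.argmin(mode))]
    return sorted(set(candidates))

# Synthetic ensemble: 20 hypothetical slip scenarios sampled at 500 offshore points.
rng = np.random.default_rng(3)
fields = rng.normal(size=(20, 500)).cumsum(axis=1)   # smooth-ish spatial patterns
print(eof_candidate_points(fields))
```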
Variations in City Exposure and Sensitivity to Tsunami Hazards in Oregon
Wood, Nathan
2007-01-01
Evidence of past events and modeling of potential future events suggest that tsunamis are significant threats to Oregon coastal communities. Although a potential tsunami-inundation zone from a Cascadia Subduction Zone earthquake has been delineated, what is in this area and how communities have chosen to develop within it have not been documented. A vulnerability assessment using geographic-information-system tools was conducted to describe tsunami-prone landscapes on the Oregon coast and to document city variations in developed land, human populations, economic assets, and critical facilities relative to the tsunami-inundation zone. Results indicate that the Oregon tsunami-inundation zone contains approximately 22,201 residents (four percent of the total population in the seven coastal counties), 14,857 employees (six percent of the total labor force), and 53,714 day-use visitors on average every day to coastal Oregon State Parks within the tsunami-inundation zone. The tsunami-inundation zone also contains 1,829 businesses that generate approximately $1.9 billion in annual sales volume (seven and five percent of study-area totals, respectively) and tax parcels with a combined total value of $8.2 billion (12 percent of the study-area total). Although occupancy values are not known for each facility, the tsunami-inundation zone also contains numerous dependent-population facilities (for example, adult-residential-care facilities, child-day-care facilities, and schools), public venues (for example, religious organizations and libraries), and critical facilities (for example, police stations). Racial diversity of residents in the tsunami-inundation zone is low, with 96 percent identifying themselves as White, either alone or in combination with one or more race. Twenty-two percent of the residents in the tsunami-inundation zone are over 65 years in age, 36 percent of the residents live on unincorporated county lands, and 37 percent of the households are renter occupied. The employee population in the tsunami-inundation zone is largely in accommodation and food services, retail trade, manufacturing, and arts and entertainment sectors. Results indicate that vulnerability, described here by exposure (the amount of assets in tsunami-prone areas) and sensitivity (the relative percentage of assets in tsunami-prone areas) varies considerably among 26 incorporated cities in Oregon. City exposure and sensitivity to tsunami hazards is highest in the northern portion of the coast. The City of Seaside in Clatsop County has the highest exposure, the highest sensitivity, and the highest combined relative exposure and sensitivity to tsunamis. Results also indicate that the amount of city assets in tsunami-prone areas is weakly related to the amount of a community's land in this zone; the percentage of a city's assets, however, is strongly related to the percentage of its land that is in the tsunami-prone areas. This report will further the dialogue on societal risk to tsunami hazards in Oregon and help identify future preparedness, mitigation, response, and recovery planning needs within coastal cities and economic sectors of the state of Oregon.
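The exposure and sensitivity measures used in assessments like this one, counts of assets inside the hazard zone and the share of a community's assets those counts represent, reduce to a point-in-polygon tally. Below is a minimal geopandas sketch with made-up geometries standing in for the Oregon parcel, business, and facility layers; the real datasets and attribute schemas are not reproduced.

```python
import geopandas as gpd
from shapely.geometry import Point, Polygon

# Hypothetical community assets and a hypothetical tsunami-inundation polygon.
assets = gpd.GeoDataFrame(
    {"name": ["school", "fire station", "hotel", "library"]},
    geometry=[Point(0, 0), Point(1, 1), Point(5, 5), Point(6, 1)],
    crs="EPSG:32610",
)
zone = gpd.GeoDataFrame(
    geometry=[Polygon([(-1, -1), (3, -1), (3, 3), (-1, 3)])], crs="EPSG:32610"
)

# Exposure: assets whose locations fall inside the inundation zone.
exposed = gpd.sjoin(assets, zone, predicate="within")

exposure = len(exposed)                              # count of exposed assets
sensitivity = 100.0 * exposure / len(assets)         # percent of the community's assets exposed
print(f"exposure: {exposure} assets, sensitivity: {sensitivity:.1f}%")
```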
Geological evidence for Holocene earthquakes and tsunamis along the Nankai-Suruga Trough, Japan
NASA Astrophysics Data System (ADS)
Garrett, Ed; Fujiwara, Osamu; Garrett, Philip; Heyvaert, Vanessa M. A.; Shishikura, Masanobu; Yokoyama, Yusuke; Hubert-Ferrari, Aurélia; Brückner, Helmut; Nakamura, Atsunori; De Batist, Marc
2016-04-01
The Nankai-Suruga Trough, lying immediately south of Japan's densely populated and highly industrialised southern coastline, generates devastating great earthquakes (magnitude > 8). Intense shaking, crustal deformation and tsunami generation accompany these ruptures. Forecasting the hazards associated with future earthquakes along this >700 km long fault requires a comprehensive understanding of past fault behaviour. While the region benefits from a long and detailed historical record, palaeoseismology has the potential to provide a longer-term perspective and additional insights. Here, we summarise the current state of knowledge regarding geological evidence for past earthquakes and tsunamis, incorporating literature originally published in both Japanese and English. This evidence comes from a wide variety of sources, including uplifted marine terraces and biota, marine and lacustrine turbidites, liquefaction features, subsided marshes and tsunami deposits in coastal lakes and lowlands. We enhance available results with new age modelling approaches. While publications describe proposed evidence from > 70 sites, only a limited number provide compelling, well-dated evidence. The best available records allow us to map the most likely rupture zones of eleven earthquakes occurring during the historical period. Our spatiotemporal compilation suggests the AD 1707 earthquake ruptured almost the full length of the subduction zone and that earthquakes in AD 1361 and 684 were predecessors of similar magnitude. Intervening earthquakes were of lesser magnitude, highlighting variability in rupture mode. Recurrence intervals for ruptures of a single seismic segment range from less than 100 to more than 450 years during the historical period. Over longer timescales, palaeoseismic evidence suggests intervals ranging from 100 to 700 years. However, these figures reflect thresholds of evidence creation and preservation as well as genuine recurrence intervals. At present, we have not identified any geological data that support the occurrence of earthquakes of larger magnitude than that experienced in AD 1707; however, few published studies seek to establish the relative magnitudes of different earthquake and tsunami events. Alongside the paucity of research designed to quantify the magnitude of past earthquakes, we emphasise a number of other issues, including alternative hypotheses for proposed palaeoseismic evidence, the lack of robust chronological frameworks and insufficient appreciation of changing thresholds of evidence creation and preservation over time. These key issues must be addressed by future research.
NASA Astrophysics Data System (ADS)
Grilli, Stéphan T.; Guérin, Charles-Antoine; Shelby, Michael; Grilli, Annette R.; Moran, Patrick; Grosdidier, Samuel; Insua, Tania L.
2017-08-01
In past work, tsunami detection algorithms (TDAs) have been proposed, and successfully applied to offline tsunami detection, based on analyzing tsunami currents inverted from high-frequency (HF) radar Doppler spectra. With this method, however, the detection of small and short-lived tsunami currents in the most distant radar ranges is challenging due to conflicting requirements on the Doppler spectra integration time and resolution. To circumvent this issue, in Part I of this work, we proposed an alternative TDA, referred to as the time correlation (TC) TDA, that does not require inverting currents, but instead detects changes in patterns of correlations of radar signal time series measured in pairs of cells located along the main directions of tsunami propagation (predicted by geometric optics theory); such correlations can be maximized when one signal is time-shifted by the pre-computed long wave propagation time. We initially validated the TC-TDA with numerical simulations of idealized tsunamis in a simplified geometry. Here, we further develop, extend, and apply the TC algorithm to more realistic tsunami case studies. These are performed in the area west of Vancouver Island, BC, where Ocean Networks Canada recently deployed an HF radar (in Tofino, BC) to detect tsunamis from far- and near-field sources, up to a 110 km range. Two case studies are considered, both simulated using long wave models: (1) a far-field seismic tsunami and (2) a near-field landslide tsunami. Pending the availability of radar data, a radar signal simulator is parameterized for the Tofino HF radar characteristics, in particular its signal-to-noise ratio with range, and combined with the simulated tsunami currents to produce realistic time series of backscattered radar signal from a dense grid of cells. Numerical experiments show that the arrival of a tsunami causes a clear change in radar signal correlation patterns, even at the most distant ranges beyond the continental shelf, thus making an early tsunami detection possible with the TC-TDA. Based on these results, we discuss how the new algorithm could be combined with standard methods proposed earlier, based on a Doppler analysis, to develop a new tsunami detection system based on HF radar data that could increase warning time. This will be the object of future work, which will be based on actual, rather than simulated, radar data.
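The TC algorithm's core operation is a correlation between the backscatter time series of two cells aligned with the expected tsunami path, after one series is shifted by the pre-computed long-wave travel time between the cells. The sketch below demonstrates that operation on synthetic signals with an assumed uniform depth; the paper's radar-signal simulator and its correlation statistics are far more involved.

```python
import numpy as np

G = 9.81

def shifted_correlation(sig_a, sig_b, distance_m, depth_m, fs_hz):
    """Correlate offshore cell A with shoreward cell B after shifting B back
    by the shallow-water travel time between the two cells."""
    lag = int(round(distance_m / np.sqrt(G * depth_m) * fs_hz))   # samples
    a, b = sig_a[:len(sig_a) - lag], sig_b[lag:]
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Synthetic example: the same wave packet reaches cell B `delay` seconds after cell A.
fs, depth, distance = 1.0, 2000.0, 20e3          # 1 Hz sampling, 2 km depth, 20 km spacing
t = np.arange(0, 1800, 1.0 / fs)
wave = np.exp(-((t - 600) / 120) ** 2) * np.sin(2 * np.pi * t / 300)
delay = int(round(distance / np.sqrt(G * depth)))
rng = np.random.default_rng(4)
sig_a = wave + 0.2 * rng.normal(size=t.size)
sig_b = np.roll(wave, delay) + 0.2 * rng.normal(size=t.size)

print(f"time-shifted correlation: {shifted_correlation(sig_a, sig_b, distance, depth, fs):.2f}")
```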
NASA Astrophysics Data System (ADS)
Roger, J.; Clouard, V.; Moizan, E.
2014-12-01
The recent devastating tsunamis of the last decades have highlighted the essential necessity to deploy operational warning systems and educate coastal populations. Such preparation cannot be done correctly without at least a minimum knowledge of tsunami history. That is the case of the Lesser Antilles islands, where a few handfuls of tsunamis have been reported over the past 5 centuries, some of them leading to notable destruction and inundation. But the lack of accurate details for most of the historical tsunamis, and the limited period during which written information can be found, represents an important problem for tsunami hazard assessment in this region. Thus, it is of major necessity to try to find other evidence of past tsunamis by looking for sedimentary deposits. Unfortunately, tropical island environments do not seem to be the best places to keep such deposits buried. In fact, heavy rainfall, storms, and all other phenomena leading to coastal erosion, combined with human activities such as intensive sugarcane cultivation in coastal flat lands, may have caused the loss of potential tsunami deposits. Many places within the Lesser Antilles (from Sainte-Lucia to the British Virgin Islands) have been carefully investigated over the last 3 years, and nothing convincing has been found. New hope came when archaeological investigations excavated an 8-cm-thick sandy and shelly layer in downtown Fort-de-France (Martinique), wedged between two well-identified layers of human origin (Fig. 1): this sandy layer was quickly attributed, without any doubt, to the 1755 tsunami, using on the one hand the information provided by historical reports of the construction sites, and on the other hand numerical modeling of the tsunami (wave heights, velocity fields, etc.) showing the ability of this transoceanic tsunami to wrap around the island after ~7 hours of propagation, enter Fort-de-France's Bay with enough energy to carry sediments, and inundate it. Based on this discovery, we conclude that tsunami markers could have been simply buried and preserved by human earthmoving, leveling and other building activities. It also shows how collaborative research involving geology and archaeology can chart a new course toward greatly improving our tsunami databases.
Mental health in Aceh--Indonesia: A decade after the devastating tsunami 2004.
Marthoenis, Marthoenis; Yessi, Sarifah; Aichberger, Marion C; Schouler-Ocak, Meryam
2016-02-01
The province of Aceh has suffered enormously from the perennial armed conflict and the devastating tsunami in 2004. Despite the waves of external aid and national concern geared toward improving healthcare services as part of the reconstruction and rehabilitation efforts after the tsunami, mental health services still require much attention. This paper aims to understand the mental healthcare system in Aceh Province, Indonesia; its main focus is on the burden on the healthcare system, its development, service delivery, and cultural issues from the devastating tsunami in 2004 until the present. We reviewed published and unpublished reports from the local and national government, from international organizations (UN bodies, NGOs), and from the academic literature pertaining to mental-health-related programs conducted in Aceh. To some extent, mental health services in Aceh have improved compared with their condition before the tsunami. The development programs have focused on establishing policy, improving human resources, and enhancing service delivery. Culture and religious beliefs shape the pathways by which people seek mental health treatment, and the political system also determines the development of the mental health service in the province. The case of Aceh is a unique example in which conflict and disaster served as catalysts for the development of a mental healthcare system. Several factors contribute to the improvement of the mental health system, but security is a must. While the Acehnese enjoy the improvements, some issues such as stigma, access to care, and political fluctuations remain challenging.
Enhancement of EarthScope Infrastructure with Real Time Seismogeodesy
NASA Astrophysics Data System (ADS)
Bock, Y.; Melgar, D.; Geng, J.; Haase, J. S.; Crowell, B. W.; Squibb, M. B.
2013-12-01
Recent great earthquakes and ensuing tsunamis in Sumatra, Chile and Japan have demonstrated the need for accurate ground displacements that fully characterize the great amplitudes and broad dynamic range of motions associated with these complex ruptures. Our ability to model the source processes of these events and their effects, whether in real-time or after the fact, is limited by the weaknesses of both seismic and geodetic networks. Geodetic instruments provide the static component as well as coarse dynamic motions but are much less precise than seismic instruments, especially in the vertical direction. Seismic instruments provide exceptionally-sensitive dynamic motions but typically have difficulty in recovering unbiased near-field low-frequency absolute displacements. We have shown in several publications that an optimal combination of data from collocated GPS and strong motion accelerometers provides seismogeodetic displacement, velocity and point tilt waveforms spanning the full spectrum of seismic motion, without clipping and magnitude saturation. These observations are suitable for earthquake early warning (EEW) through detection of P wave arrivals, rapid assessment of earthquake magnitude, finite-source centroid moment tensor solutions and fault slip models, and tsunami warning, in particular in the near-source regions of large earthquakes. At present, more than 550 real-time GPS stations are operating in Western North America, a majority as part of the EarthScope/PBO effort with a concentration in the Cascadia region and southern California. Unfortunately, there are few collocations of GPS and accelerometers in this region (the exception being in parts of the BARD network in northern California). We have leveraged the considerable infrastructure already invested in the EarthScope project, and funding through NSF and NASA to create advanced software, hardware, and algorithms that make it possible to utilize EarthScope/PBO as an EEW test bed. We have developed cost-effective hardware and embedded firmware to upgrade existing real-time GPS stations with low-cost MEMS accelerometers. Fifteen PBO and SCIGN stations in southern California have already been upgraded with this technology. We have also developed a software suite to analyze seismogeodetic data in real time using a tightly-coupled precise point positioning (PPP) Kalman filter that supports PPP with ambiguity resolution (PPP-AR) throughout the seismically active regions of the Western U.S. The seismogeodetic system contributes directly to collaborative natural hazards research by providing technology for early warning systems for earthquakes, volcanoes and tsunamis, and for short-term high impact weather forecasting and related flooding hazards (we are also installing MEMS temperature and pressure sensors for GPS meteorology). The systems have also been deployed for earthquake engineering research for large structures (e.g., bridges, buildings, dams). Here we present the components and status of our seismogeodetic earthquake and tsunami monitoring system. Although the analysis techniques are quite advanced, the project lends itself to opportunities for education and outreach, specifically in illustrating concepts in elementary physics of position, velocity, and acceleration. Many of the animations generated in the research are available for development into appealing and accessible educational modules.
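The seismogeodetic combination referenced above fuses low-rate, noisy GPS displacements with high-rate accelerometer data in a Kalman filter whose state is displacement and velocity, with acceleration entering as the control input. The sketch below is a loosely coupled, single-component toy version with assumed noise levels and sampling rates; it is not the tightly coupled PPP-AR filter described in the abstract.

```python
import numpy as np

def seismogeodetic_kf(acc, gps_disp, dt, q_acc=1e-3, r_gps=1e-4):
    """Fuse accelerometer input (every epoch) with GPS displacement updates
    (NaN where no GPS sample exists) to estimate displacement and velocity."""
    x, P = np.zeros(2), np.eye(2)            # state [displacement, velocity] and covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    Q = q_acc * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
    H = np.array([[1.0, 0.0]])
    out = np.zeros((len(acc), 2))
    for k, a in enumerate(acc):
        x = F @ x + B * a                    # predict with acceleration as control input
        P = F @ P @ F.T + Q
        if not np.isnan(gps_disp[k]):        # GPS displacement update when available
            y = gps_disp[k] - H @ x
            S = H @ P @ H.T + r_gps
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
        out[k] = x
    return out

# Synthetic test: 100 Hz accelerometer, 1 Hz GPS, a 0.5 Hz sinusoidal ground motion.
dt, n = 0.01, 2000
t = np.arange(n) * dt
disp_true = 0.05 * np.sin(2 * np.pi * 0.5 * t)
acc_true = -0.05 * (2 * np.pi * 0.5) ** 2 * np.sin(2 * np.pi * 0.5 * t)
rng = np.random.default_rng(5)
acc = acc_true + 0.01 * rng.normal(size=n)
gps = np.full(n, np.nan)
gps[::100] = disp_true[::100] + 0.005 * rng.normal(size=n // 100)

est = seismogeodetic_kf(acc, gps, dt)
print(f"RMS displacement error: {np.sqrt(np.mean((est[:, 0] - disp_true) ** 2)):.4f} m")
```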
What caused a large number of fatalities in the Tohoku earthquake?
NASA Astrophysics Data System (ADS)
Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.
2012-04-01
The Mw 9.0 earthquake caused 20,000 deaths and missing persons in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of which was a "tsunami earthquake" that resulted in a death toll of 22,000. Since then, numerous breakwaters were constructed along the entire northeastern coast, tsunami evacuation drills were carried out, and hazard maps were distributed to local residents in numerous communities. Despite these constructions and preparedness efforts, the 11 March Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so all residents recognized it as the strongest and longest earthquake they had ever experienced in their lives. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find out the reasons behind the high number of fatalities caused by the 11 March tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate prefecture, in mid-April and early June 2011. Interviews lasted about 30 minutes or longer and focused on the survivors' evacuation behaviors and those they had observed in others. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly due to, or influenced by, earthquake science results. Some of the factors that affected residents' decisions are listed below. 1. Earthquake hazard assessments turned out to be incorrect. The earthquake magnitudes and resultant hazards in northeastern Japan assessed and publicized by the government were significantly smaller than those of the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings. The first tsunami warnings were far too small compared with the actual tsunami heights. 3. Previous frequent warnings with overestimated tsunami heights influenced residents' behavior. 4. Many local residents above 55 years old had experienced the 1960 Chile tsunami, which was significantly smaller than the 11 March tsunami; this sense of "knowing" put their lives at high risk. 5. Some local residents believed that, with a breakwater present, only slight flooding would occur. 6. Many people did not understand how a tsunami is generated under the sea, so the connection between earthquakes and tsunamis was not clear to them. These interviews made it clear that many deaths resulted because current technology and earthquake science underestimated tsunami heights, warning systems failed, and breakwaters were not strong or high enough. However, even if these problems recur in future earthquakes, better knowledge of earthquake and tsunami hazards could save more lives. It is therefore necessary for children to learn the basic mechanism of tsunami generation from elementary school onward.
Variations in Community Exposure and Sensitivity to Tsunami Hazards in the State of Hawai'i
Wood, Nathan; Church, Alyssia; Frazier, Tim; Yarnal, Brent
2007-01-01
Hawai`i has experienced numerous destructive tsunamis and the potential for future events threatens the safety and economic well-being of its coastal communities. Although tsunami-evacuation zones have been delineated, what is in these areas and how communities have chosen to develop within them has not been documented. A community-level vulnerability assessment using geographic-information-system tools was conducted to describe tsunami-prone landscapes on the Hawaiian coast and to document variations in land cover, demographics, economic assets, and critical facilities among 65 communities. Results indicate that the Hawai`i tsunami-evacuation zone contains approximately 56,678 residents (five percent of the total population), 67,113 employees (eleven percent of the State labor force), and 50,174 average daily visitors to hotels (44 percent of the State total). With regard to economic conditions, the tsunami-evacuation zone contains 5,779 businesses that generate $10.1 billion in annual sales volume (both eleven percent of State totals), and tax parcels with a combined total value of $36.1 billion (18 percent of the State total). Although occupancy values are not known for each facility, the tsunami-evacuation zone also contains numerous dependent-population facilities (for example, child-day-care facilities and schools), public venues (for example, religious organizations and parks) and critical facilities (for example, fire stations). The residential population in tsunami-prone areas is racially diverse, with most residents identifying themselves as White, Asian, or Native Hawaiian and Other Pacific Islander, either alone or in combination with one or more races. Fifty-one percent of the households in the tsunami-evacuation zone are renter-occupied. The employee population in the tsunami-evacuation zone is largely in accommodation and food services, health services, and retail-trade sectors. Results indicate that community vulnerability, described here by exposure (the amount of assets in tsunami-prone areas) and sensitivity (the relative percentage of assets in tsunami-prone areas), varies considerably among 65 coastal communities in Hawai`i. Honolulu has the highest exposure, Punalu`u has the highest sensitivity, and Ka`anapali has the highest combination of exposure and sensitivity to tsunamis. Results also indicate that the level of community-asset exposure to tsunamis is not determined by the amount of a community's land that is in tsunami-evacuation zones. Community sensitivity, however, is related to the percentage of a community's land that is in the tsunami-prone areas. This report will further the dialogue on societal risk to tsunami hazards in Hawai`i and help identify future preparedness, mitigation, response, and recovery planning needs within coastal communities and economic sectors of the State of Hawai`i.
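To make the exposure/sensitivity distinction used in this assessment concrete, the toy computation below treats exposure as the absolute count of an asset inside the evacuation zone and sensitivity as its share of the community total. The numbers are invented placeholders, not values from the report.

```python
# Toy illustration of the exposure/sensitivity distinction described above.
# All numbers are made up; "residents_in_zone" is the count inside the
# tsunami-evacuation zone and "residents_total" the community total.
communities = {
    "Community A": {"residents_in_zone": 40000, "residents_total": 350000},
    "Community B": {"residents_in_zone": 900,   "residents_total": 1100},
}

for name, c in communities.items():
    exposure = c["residents_in_zone"]                        # absolute assets at risk
    sensitivity = 100.0 * exposure / c["residents_total"]    # relative share at risk
    print(f"{name}: exposure={exposure}, sensitivity={sensitivity:.1f}%")
```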
NASA Astrophysics Data System (ADS)
Rakoto, Virgile; Lognonné, Philippe; Rolland, Lucie; Coïsson, Pierdavide; Drilleau, Mélanie
2017-04-01
Large underwater earthquakes (Mw > 7) can transmit part of their energy to the surrounding ocean through large sea-floor motions, generating tsunamis that propagate over long distances. The forcing effect of tsunami waves on the atmosphere generates internal gravity waves, which produce detectable ionospheric perturbations when they reach the upper atmosphere. These perturbations are frequently observed in the total electron content (TEC) measured by multi-frequency Global Navigation Satellite System (GNSS) data (e.g., GPS, GLONASS). In this paper, we performed for the first time an inversion of the sea level anomaly from GPS TEC data, using a least squares (LSQ) inversion through a normal-mode summation modeling technique. Using the 2012 Haida Gwaii tsunami in the far field as a test case, we showed that the peak-to-peak amplitude of the sea level anomaly inverted with this method has an error below 10%. Nevertheless, we cannot invert the second wave arriving 20 minutes later. This second wave is generally explained by coastal reflection, which the normal-mode modeling does not take into account. Our technique is then applied to two other tsunamis: the 2006 Kuril Islands tsunami in the far field, and the 2011 Tohoku tsunami in the nearer field. This demonstrates that the inversion using a normal-mode approach is able to estimate fairly well the amplitude of the first arrivals of the tsunami. In the future, we plan to invert the TEC data in real time in order to retrieve the tsunami height.
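The inversion step described above is, at its core, a linear least-squares problem once the TEC response of each sea-level basis function has been computed by normal-mode summation. The sketch below shows a generic damped least-squares solve of that form; the forward matrix G, the damping value, and the synthetic data are assumptions for illustration, not the authors' actual operator.

```python
import numpy as np

# Generic damped least-squares inversion of the form used to map TEC
# observations back to a sea-level anomaly.  G (the TEC response of each
# sea-level basis function, assumed pre-computed by normal-mode summation)
# and the data vector d are placeholders here.
def invert_sea_level(G, d, damping=1e-2):
    """Solve min ||G m - d||^2 + damping^2 ||m||^2 for the sea-level model m."""
    n = G.shape[1]
    A = np.vstack([G, damping * np.eye(n)])   # augment the system with damping rows
    b = np.concatenate([d, np.zeros(n)])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m

# Synthetic check: 200 TEC samples, 20 sea-level basis coefficients.
rng = np.random.default_rng(0)
G = rng.normal(size=(200, 20))
m_true = rng.normal(size=20)
d = G @ m_true + 0.05 * rng.normal(size=200)
m_est = invert_sea_level(G, d)
print("max coefficient error:", np.abs(m_est - m_true).max())
```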
Lessons from the Tōhoku tsunami: A model for island avifauna conservation prioritization.
Reynolds, Michelle H; Berkowitz, Paul; Klavitter, John L; Courtot, Karen N
2017-08-01
Earthquake-generated tsunamis threaten coastal areas and low-lying islands with sudden flooding. Although human hazards and infrastructure damage have been well documented for tsunamis in recent decades, the effects on wildlife communities rarely have been quantified. We describe a tsunami that hit the world's largest remaining tropical seabird rookery and estimate the effects of sudden flooding on 23 bird species nesting on Pacific islands more than 3,800 km from the epicenter. We used global positioning systems, tide gauge data, and satellite imagery to quantify characteristics of the Tōhoku earthquake-generated tsunami (11 March 2011) and its inundation extent across four Hawaiian Islands. We estimated short-term effects of sudden flooding to bird communities using spatially explicit data from Midway Atoll and Laysan Island, Hawai'i. We describe variation in species vulnerability based on breeding phenology, nesting habitat, and life history traits. The tsunami inundated 21%-100% of each island's area at Midway Atoll and Laysan Island. Procellariformes (albatrosses and petrels) chick and egg losses exceeded 258,500 at Midway Atoll while albatross chick losses at Laysan Island exceeded 21,400. The tsunami struck at night and during the peak of nesting for 14 colonial seabird species. Strongly philopatric Procellariformes were vulnerable to the tsunami. Nonmigratory, endemic, endangered Laysan Teal ( Anas laysanensis ) were sensitive to ecosystem effects such as habitat changes and carcass-initiated epizootics of avian botulism, and its populations declined approximately 40% on both atolls post-tsunami. Catastrophic flooding of Pacific islands occurs periodically not only from tsunamis, but also from storm surge and rainfall; with sea-level rise, the frequency of sudden flooding events will likely increase. As invasive predators occupy habitat on higher elevation Hawaiian Islands and globally important avian populations are concentrated on low-lying islands, additional conservation strategies may be warranted to increase resilience of island biodiversity encountering tsunamis and rising sea levels.
Tsunami waveform inversion of the 2007 Bengkulu, southern Sumatra earthquake
NASA Astrophysics Data System (ADS)
Fujii, Y.; Satake, K.
2007-12-01
We have performed tsunami waveform inversion for the 2007 Bengkulu, southern Sumatra earthquake of September 12, 2007 (4.520°S, 101.374°E, Mw = 8.4 at 11:10:26 UTC according to the USGS), and found that the large slips were located on the deeper part (> 20 km) of the fault plane, more than 100 km from the trench axis. The deep slip might have contributed to the relatively small tsunami for the earthquake's size. The largest slips, more than 6 m, were located beneath the Pagai Islands, about 100-200 km northwest of the epicenter. The obtained slip distribution yields a total seismic moment of 3.6 × 10^21 Nm (Mw = 8.3). The tsunami generated by this earthquake was recorded at many tide gauge stations located in and around the Indian Ocean. The DART system installed in the deep ocean and maintained by the Thai Meteorological Department (TMD) also captured this tsunami. We downloaded the tsunami waveforms at 16 stations from the University of Hawaii Sea Level Center (UHSLC) and National Oceanic & Atmospheric Administration (NOAA) web sites. The observed tsunami records indicate that the tsunami amplitudes were less than several tens of cm at most stations, around 1 m at Padang, the station nearest to the source, and a few cm at the DART station. For the tsunami waveform inversion, we divided the source area (length: 250 km, width: 200 km) into 20 subfaults. The tsunami waveforms from each subfault (50 km × 50 km), or Green's functions, were calculated by numerically solving the linear shallow-water long-wave equations. We adopted the focal mechanism of the Global CMT solution (strike: 327°, dip: 12°, rake: 114°) for each subfault, and assumed a rise time of 1 min. The computed tsunami waveforms from the estimated slip distribution explain the observed waveforms at most of the tide gauges and the DART station.
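The waveform inversion described above reduces to a linear problem once the Green's functions (the tsunami waveform each subfault would produce for unit slip) have been computed. The sketch below solves for slip with a non-negativity constraint and converts the result to a seismic moment; the non-negativity choice and the rigidity value are assumptions for illustration, not necessarily the authors' exact scheme.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of a Green's-function tsunami waveform inversion: stack the computed
# waveforms of every subfault into a matrix G (one column per subfault, all
# station records concatenated) and solve for non-negative slip.
def invert_slip(green_waveforms, observed):
    """green_waveforms: list of per-subfault concatenated waveforms (unit slip).
    observed: concatenated observed waveforms at the same samples."""
    G = np.column_stack(green_waveforms)
    slip, residual = nnls(G, observed)      # non-negative least squares
    return slip, residual

# Seismic moment from the slip estimate: M0 = mu * area * slip, summed over
# subfaults (50 km x 50 km each; rigidity mu = 4e10 Pa assumed for illustration).
def seismic_moment(slip, mu=4e10, subfault_area=50e3 * 50e3):
    m0 = mu * subfault_area * slip.sum()
    mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)   # standard moment-magnitude relation
    return m0, mw
```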
Lessons from the Tōhoku tsunami: A model for island avifauna conservation prioritization
Reynolds, Michelle H.; Berkowitz, Paul; Klavitter, John; Courtot, Karen
2017-01-01
Earthquake-generated tsunamis threaten coastal areas and low-lying islands with sudden flooding. Although human hazards and infrastructure damage have been well documented for tsunamis in recent decades, the effects on wildlife communities rarely have been quantified. We describe a tsunami that hit the world's largest remaining tropical seabird rookery and estimate the effects of sudden flooding on 23 bird species nesting on Pacific islands more than 3,800 km from the epicenter. We used global positioning systems, tide gauge data, and satellite imagery to quantify characteristics of the Tōhoku earthquake-generated tsunami (11 March 2011) and its inundation extent across four Hawaiian Islands. We estimated short-term effects of sudden flooding to bird communities using spatially explicit data from Midway Atoll and Laysan Island, Hawai'i. We describe variation in species vulnerability based on breeding phenology, nesting habitat, and life history traits. The tsunami inundated 21%–100% of each island's area at Midway Atoll and Laysan Island. Procellariformes (albatrosses and petrels) chick and egg losses exceeded 258,500 at Midway Atoll while albatross chick losses at Laysan Island exceeded 21,400. The tsunami struck at night and during the peak of nesting for 14 colonial seabird species. Strongly philopatric Procellariformes were vulnerable to the tsunami. Nonmigratory, endemic, endangered Laysan Teal (Anas laysanensis) were sensitive to ecosystem effects such as habitat changes and carcass-initiated epizootics of avian botulism, and its populations declined approximately 40% on both atolls post-tsunami. Catastrophic flooding of Pacific islands occurs periodically not only from tsunamis, but also from storm surge and rainfall; with sea-level rise, the frequency of sudden flooding events will likely increase. As invasive predators occupy habitat on higher elevation Hawaiian Islands and globally important avian populations are concentrated on low-lying islands, additional conservation strategies may be warranted to increase resilience of island biodiversity encountering tsunamis and rising sea levels.
A catalog of tsunamis in New Caledonia from 28 March 1875 to 30 September 2009
NASA Astrophysics Data System (ADS)
Sahal, Alexandre; Pelletier, Bernard; Chatelier, Jean; Lavigne, Franck; Schindelé, François
2010-06-01
In order to establish a tsunami alert system in New Caledonia, in April 2008 the French Secretary of State for Overseas Affairs, with the aid of the UNESCO French Commission, mandated an investigation to build a more complete record of the most recent tsunamis. To complete this task, a call for witnesses was broadcast through various media and in public locations. These witnesses were then interviewed on site about the phenomena they had observed. Previous witness reports obtained in the last few years were also used. For the most recent events, various archives were consulted. In total, 18 events were documented, of which 12 had not been mentioned in previous work. These results confirm an exposure to hazards of: (1) local origin (the southern part of the Vanuatu arc), with a very short post-seismic delay (< 30 min) before the arrival of wave trains; (2) regional origin (Solomon Islands arc, northern part of the Vanuatu arc), with a delay of several hours; and (3) trans-oceanic origin (Kamchatka 1952, South Chile 1960, Kuril Islands 2006, North Tonga 2009), unknown until now. These results highlight the necessity for New Caledonia to adopt an alert system, coupled with ocean tide gauges, that liaises with the main alert system for the Pacific (Pacific Tsunami Warning Center), and bring to light the importance of establishing a prevention campaign.
Toward tsunami early warning system in Indonesia by using rapid rupture durations estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madlazim
2012-06-20
Indonesia has had the Indonesian Tsunami Early Warning System (Ina-TEWS) since 2008. The Ina-TEWS uses automatic processing of the hypocenter and of Mwp, Mw (mB) and Mj. If an earthquake occurs in the ocean, at a depth < 70 km and with magnitude > 7, then Ina-TEWS issues an early warning that the earthquake may generate a tsunami. However, the warnings issued by Ina-TEWS are still not accurate. The purpose of this research is to estimate the rupture durations of large Indonesian earthquakes that occurred in the Indian Ocean, Java, Timor Sea, Banda Sea, Arafura Sea and Pacific Ocean. We analyzed at least 330 vertical seismograms recorded by the IRIS-DMC network, using a direct procedure for rapid assessment of earthquake tsunami potential based on simple measures of the P-wave vertical velocity records, in particular the high-frequency apparent rupture duration, T_dur. T_dur can be related to the critical parameters rupture length (L), depth (z), and shear modulus (μ), and may also be related to width (W), slip (D), z or μ. Our analysis shows that the rupture duration has a stronger influence on tsunami generation than Mw and depth. The rupture duration gives more information on tsunami impact, Mo/μ, depth and size than Mw and other currently used discriminants. The longer the rupture duration, the shallower the earthquake source. For rupture durations greater than 50 s, the depth is less than 50 km, Mw is greater than 7, and the rupture length is longer, because T_dur is proportional to L and Mo/μ is also proportional to L, implying a larger Mo/μ. The rupture duration therefore carries information on all four parameters. We also suggest that tsunami potential is not directly related to the faulting type of the source, and we find that events with rupture durations greater than 50 s generated tsunamis. With real-time seismogram data available, the rapidly calculated rupture duration discriminant can be obtained within 4-5 min after an earthquake occurs and can thus aid effective, accurate and reliable tsunami early warning for the Indonesia region.
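A rough sketch of how an apparent rupture duration T_dur can be measured from a vertical P-wave velocity record is given below: band-pass the trace to high frequencies, take the smoothed envelope, and measure how long it stays above a fraction of its peak. The band limits, smoothing window and 25% threshold are illustrative assumptions, not the exact recipe of this study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Rough sketch of a high-frequency rupture-duration measurement on a vertical
# P-wave velocity record.  Band limits, the 25% threshold, and the 5 s envelope
# smoothing are illustrative assumptions.
def rupture_duration(trace, fs, fmin=1.0, fmax=4.0, threshold=0.25):
    b, a = butter(4, [fmin / (fs / 2), fmax / (fs / 2)], btype="band")
    hf = filtfilt(b, a, trace)                 # high-frequency P-wave energy
    env = np.abs(hilbert(hf))                  # amplitude envelope
    win = int(5 * fs)                          # 5 s moving-average smoothing
    env = np.convolve(env, np.ones(win) / win, mode="same")
    above = env >= threshold * env.max()       # samples above the threshold
    idx = np.where(above)[0]
    t_dur = (idx[-1] - idx[0]) / fs if idx.size else 0.0
    return t_dur                               # seconds; > ~50 s flags tsunami potential
```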
Integrating Caribbean Seismic and Tsunami Hazard into Public Policy and Action
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.
2012-12-01
The Caribbean has a long history of tsunamis and earthquakes. Over the past 500 years, more than 80 tsunamis have been documented in the region by the NOAA National Geophysical Data Center. Almost 90% of these historical tsunamis have been associated with earthquakes. Since 1842 alone, 3510 lives have been lost to tsunamis; this is more than in the Northeastern Pacific over the same period. With a population of almost 160 million and a heavy concentration of residents, tourists, businesses and critical infrastructure along the Caribbean shores (especially in the northern and eastern Caribbean), the risk to lives and livelihoods is greater than ever before. Most of the countries also have a very high exposure to earthquakes. Given this elevated vulnerability, it is imperative that government officials take steps to mitigate the potentially devastating effects of these events. Nevertheless, given the low frequency of high-impact earthquakes and tsunamis in comparison to hurricanes, combined with social and economic considerations, the needed investments are not made and disasters like the 2010 Haiti earthquake occur. In the absence of frequent significant events, an important driving force for public officials to take action is the dissemination of scientific studies. When papers of this nature are published and media advisories issued, public officials demonstrate heightened interest in the topic, which in turn can lead to increased legislation and funding efforts. This is especially the case if the material can be easily understood by the stakeholders and there is a local contact. In addition, given the close link between earthquakes and tsunamis (in Puerto Rico alone, 50% of the high-impact earthquakes have also generated destructive tsunamis), it is very important that earthquake and tsunami hazard studies demonstrate consistency. Traditionally in the region, earthquake and tsunami impacts have been considered independently in emergency planning processes. For example, earthquake and tsunami exercises are conducted separately, without taking into consideration their compounding effects. Recognizing this deficiency, the UNESCO IOC Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS), which was established in 2005, decided to include both tsunami and earthquake impacts in the upcoming March 20, 2013 regional CARIBE WAVE/LANTEX tsunami exercise. In addition to the tsunami wave heights predicted by the National Weather Service Tsunami Warning Centers in Alaska and Hawaii, the USGS PAGER and ShakeMap results for the M8.5 scenario earthquake in the southern Caribbean were also integrated into the exercise manual. Additionally, in recent catastrophic planning for Puerto Rico, FEMA requested that local researchers determine both the earthquake and tsunami impacts for the same source. In the US, although the lead for earthquakes and tsunamis lies with two different agencies, USGS and NOAA/NWS, it has been very beneficial that the National Tsunami Hazard Mitigation Program partnership includes both agencies. By working together, the seismic and tsunami communities can not only achieve an even better understanding of the hazards but also foster more action on the part of government officials and the populations at risk.
NASA Astrophysics Data System (ADS)
Marinin, I. V.; Kabanikhin, S. I.; Krivorotko, O. I.; Karas, A.; Khidasheli, D. G.
2012-04-01
We consider new techniques and methods for earthquake- and tsunami-related problems, in particular inverse problems for the determination of tsunami source parameters, numerical simulation of long wave propagation in soil and water, and tsunami risk estimation. In addition, we touch upon database management and destruction-scenario visualization. New approaches and strategies, as well as mathematical tools and software, are presented. Long joint investigations by researchers of the Institute of Mathematical Geophysics and Computational Mathematics SB RAS and specialists from WAPMERR and Informap have produced dedicated theoretical approaches, numerical methods, and software for tsunami and earthquake modeling (modeling of the propagation and run-up of tsunami waves on coastal areas), visualization, and risk estimation for tsunamis and earthquakes. Algorithms have been developed for the operational determination of the origin and shape of the tsunami source. The TSS system numerically simulates the tsunami and/or earthquake source and makes it possible to solve both the direct and the inverse problem. It thus becomes possible to draw on advanced mathematical results to improve models and to increase the resolution of inverse problems. With TSS one can construct risk maps, online disaster scenarios, and estimates of potential damage to buildings and roads. One of the main tools for the numerical modeling is the finite volume method (FVM), which allows us to achieve stability with respect to possible input errors as well as optimal computing speed. Our approach to the inverse problem of tsunami and earthquake source determination is based on recent theoretical results concerning the Dirichlet problem for the wave equation. This problem is intrinsically ill-posed. We use an optimization approach to solve it and SVD analysis to estimate the degree of ill-posedness and to find the quasi-solution. The software system we developed is intended to provide a continuous ("no frost") pipeline of direct and inverse problems: solving the direct problem, visualization and comparison with observed data, and solving the inverse problem (correction of the model parameters). The main objective of further work is the creation of an operational emergency workstation tool that could be used by an emergency duty officer in real time.
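For the SVD analysis mentioned above, a minimal truncated-SVD quasi-solution of an ill-posed linear system A x = b might look like the sketch below; the relative cutoff used to discard small singular values is an assumed choice.

```python
import numpy as np

# Minimal truncated-SVD (quasi-solution) sketch for an ill-posed linear
# problem A x = b.  The truncation criterion (relative singular-value cutoff)
# is an assumption for illustration.
def tsvd_solve(A, b, rel_cutoff=1e-3):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rel_cutoff * s[0]                 # drop tiny singular values
    # the retained condition number indicates the degree of ill-posedness
    print("kept", keep.sum(), "of", s.size, "singular values;",
          "effective condition number", s[0] / s[keep][-1])
    x = Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])
    return x
```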
Tsunami Ionospheric warning and Ionospheric seismology
NASA Astrophysics Data System (ADS)
Lognonne, Philippe; Rolland, Lucie; Rakoto, Virgile; Coisson, Pierdavide; Occhipinti, Giovanni; Larmat, Carene; Walwer, Damien; Astafyeva, Elvira; Hebert, Helene; Okal, Emile; Makela, Jonathan
2014-05-01
The last decade has demonstrated that seismic waves and tsunamis are coupled to the ionosphere. Observations of total electron content (TEC) and airglow perturbations of unique quality and amplitude were made during the giant 2011 Tohoku, Japan earthquake, and observations of much smaller tsunamis, down to a few cm of sea uplift, are now routinely made, including for the 2006 Kuril, 2009 Samoa, 2010 Chile and 2012 Haida Gwaii tsunamis. This new branch of seismology is now mature enough to tackle the new challenges associated with the inversion of these data, with the goal either of providing maps or profiles of the Earth's surface vertical displacement (and therefore crucial information for tsunami warning systems) or of inverting, with ground and ionospheric data sets, the various parameters (atmospheric sound speed, viscosity, collision frequencies) controlling the coupling between the surface, the lower atmosphere and the ionosphere. We first present the state of the art in modeling the tsunami-atmosphere coupling, including the slight perturbations in tsunami phase and group velocity and the dependence of the coupling strength on local time, ocean depth and season. We then compare modelled signals with observations. For tsunamis, this is done with the different types of measurement that have proven ionospheric tsunami detection over the last 5 years (ground and space GPS, airglow), while we focus on GPS and GOCE observations for seismic waves. These observation systems allow the propagation of the signal to be tracked from the ground (with GPS and seismometers) to the neutral atmosphere (with infrasound sensors and GOCE drag measurements) to the ionosphere (with GPS TEC and airglow, among other ionospheric sounding techniques). Modelling with different techniques (normal modes, spectral element methods, finite differences) is used and shown. While the fits of the waveforms are generally very good, we analyse the differences and outline directions for future studies and improvements, enabling the integration of lateral variations of the solid Earth, bathymetry or atmosphere, finite source models, non-linearity of the waves, and better attenuation and coupling processes. All these effects are revealed by phase or amplitude discrepancies in selected observations. We then present the goals and first results of source inversions, with a focus on estimates of the sea level uplift location and amplitude, obtained either by using GPS networks close to the epicentre or, for tsunamis, GPS networks in the Hawaiian Islands.
Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning
NASA Astrophysics Data System (ADS)
Arbab-Zavar, Banafshe; Sabeur, Zoheir
2013-04-01
Currently, an early tsunami warning can be issued upon the detection of a seismic event at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best-matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of tsunamigenic earthquakes in real time and to simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low tsunami-signal amplitudes in deep water and the frequent occurrence of background signals and noise results in a generally low signal-to-noise ratio for the tsunami signal, which in turn makes its detection difficult. In order to improve the accuracy and confidence of detection, we apply a re-identification framework in which a tsunamigenic signal is detected by scanning a network of water-level sensing hydrodynamic stations. The aim is to re-identify the same signatures as the tsunami wave spatially propagates through the sensing network. Re-identification of the tsunamigenic signal is technically possible since the tsunami signal in the open ocean conserves the birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and can thereby be used to help discriminate certain background signals, such as wind waves, which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals is demonstrated using open data from NOAA for a recorded tsunami event in the Pacific Ocean. The approach will be tested in the future on other oceanic regions, including the Mediterranean Sea and the North East Atlantic Ocean. Both authors acknowledge that this research is conducted under the TRIDEC IP FP7 project[1], which involves the development of a system of systems for collaborative, complex and critical decision support in evolving crises. [1] TRIDEC IP ICT-2009.4.3 Intelligent Information Management Project Reference: 258723. http://www.tridec-online.eu/home
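The re-identification idea can be illustrated with a simple normalized cross-correlation check between the standardized water-level records of two stations, restricted to a plausible travel-time window, as in the sketch below; the correlation threshold and lag window are assumed values, not those of the TRIDEC detector.

```python
import numpy as np

# Sketch of signature re-identification across two sea-level stations: does
# the waveform detected at station A reappear at station B within a plausible
# tsunami travel-time window?  Threshold and window are illustrative values.
def reidentify(sig_a, sig_b, fs, max_lag_s, min_corr=0.7):
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    corr = np.correlate(b, a, mode="full") / len(a)    # normalized cross-correlation
    lags = np.arange(-len(a) + 1, len(b))              # lag (in samples) of each value
    corr = np.where(np.abs(lags) / fs <= max_lag_s, corr, -np.inf)
    k = int(np.argmax(corr))
    return corr[k] >= min_corr, lags[k] / fs           # (re-identified?, best lag in s)
```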
The Numerical Technique for the Landslide Tsunami Simulations Based on Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Kozelkov, A. S.
2017-12-01
The paper presents an integral technique for simulating all phases of a landslide-driven tsunami. The technique is based on the numerical solution of the system of Navier-Stokes equations for multiphase flows. The numerical algorithm uses a fully implicit approximation method, in which the equations of continuity and momentum conservation are coupled through implicit terms for the pressure gradient and mass flow. The method we propose removes severe restrictions on the time step and allows simulation of tsunami propagation to arbitrarily large distances. The landslide origin is simulated as an individual phase: a Newtonian fluid with its own density and viscosity, separated from the water and air phases by an interface. The basic formulas of the equation discretization and the expressions for the coefficients are presented, and the main steps of the computational procedure are described in the paper. To enable simulations of tsunami propagation across wide water areas, we propose a parallel implementation of the technique, which employs an algebraic multigrid method. The implementation of the multigrid method is based on global-level and cascade collection algorithms that impose no limitations on the scale of parallelism and make this technique applicable to petascale systems. We demonstrate the possibility of simulating all phases of a landslide-driven tsunami, including its generation, propagation and run-up. The technique has been verified against problems for which experimental data are available. The paper describes the mechanism for incorporating bathymetric data to simulate tsunamis in real water areas of the world ocean. Results of a comparison with nonlinear dispersion theory, which show good agreement, are presented for the case of a historical tsunami of volcanic origin on Montserrat Island in the Caribbean Sea.
Simulation of Tsunami Resistance of a Pinus Thunbergii tree in Coastal Forest in Japan
NASA Astrophysics Data System (ADS)
Nanko, K.; Suzuki, S.; Noguchi, H.; Hagino, H.
2015-12-01
Forests reduce the fluid force of a tsunami, but extreme tsunamis sometimes break down the forest trees. It is difficult to estimate the interaction between the fluid and the trees because the fluid deforms the tree architecture and the deformed trees in turn change the flow field. Dynamic tree deformation and fluid behavior should therefore be clarified by fluid-structure interaction analysis. As an initial step, we have developed a dynamic simulation of tree sway and breakage caused by tsunami, based on a vibrating system with multiple degrees of freedom. The target species of the simulation was Japanese black pine (Pinus thunbergii), the major species in the coastal forests that protect living areas from damage by blown sand and salt along the Japanese coast. For the simulation, a tree was segmented into 0.2 m long circular truncated cones. The turning moment induced by the tsunami and the tree's self-weight was calculated at the bottom of each segment. Tree deformation was computed with a multi-degree-of-freedom vibration equation. Tree sway was simulated by iterative calculation of the tree deformation with a time step of 0.05 s and a temporally varying tsunami flow velocity. From the calculated bending stress and turning moment at the tree base, we estimated the resistance of a Pinus thunbergii tree to breakage by a tsunami.
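As a static illustration of the loading term in such a model, the sketch below sums the drag-induced turning moment about the tree base over the submerged 0.2 m segments; the drag coefficient, water density and steady-flow assumption are simplifications of the dynamic multi-degree-of-freedom model described above.

```python
# Static sketch of the base turning-moment calculation for a segmented trunk
# under a steady tsunami flow.  The paper's model is dynamic; the drag
# coefficient and water density here are illustrative assumptions.
RHO_WATER = 1000.0   # kg/m^3
CD = 1.0             # assumed drag coefficient for a cylinder-like segment

def base_moment(diameters, seg_len, flow_speed, depth):
    """diameters: segment diameters from the base upward [m];
    seg_len: segment length [m]; flow_speed [m/s]; depth: inundation depth [m]."""
    moment = 0.0
    for i, d in enumerate(diameters):
        z = (i + 0.5) * seg_len                 # mid-height of the segment
        if z > depth:                           # only submerged segments are loaded
            break
        area = d * seg_len                      # frontal area of the segment
        drag = 0.5 * RHO_WATER * CD * area * flow_speed**2
        moment += drag * z                      # turning moment about the base
    return moment                               # N*m, to compare with breaking resistance
```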
An Evaluation of Infrastructure for Tsunami Evacuation in Padang, West Sumatra, Indonesia (Invited)
NASA Astrophysics Data System (ADS)
Cedillos, V.; Canney, N.; Deierlein, G.; Diposaptono, S.; Geist, E. L.; Henderson, S.; Ismail, F.; Jachowski, N.; McAdoo, B. G.; Muhari, A.; Natawidjaja, D. H.; Sieh, K. E.; Toth, J.; Tucker, B. E.; Wood, K.
2009-12-01
Padang has one of the world’s highest tsunami risks due to its high hazard, vulnerable terrain and population density. The current strategy to prepare for tsunamis in Padang is focused on developing early warning systems, planning evacuation routes, conducting evacuation drills, and raising local awareness. Although these are all necessary, they are insufficient. Padang’s proximity to the Sunda Trench and flat terrain make reaching safe ground impossible for much of the population. The natural warning in Padang - a strong earthquake that lasts over a minute - will be the first indicator of a potential tsunami. People will have about 30 minutes after the earthquake to reach safe ground. It is estimated that roughly 50,000 people in Padang will be unable to evacuate in that time. Given these conditions, other means to prepare for the expected tsunami must be developed. With this motivation, GeoHazards International and Stanford University’s Chapter of Engineers for a Sustainable World partnered with Indonesian organizations - Andalas University and Tsunami Alert Community in Padang, Laboratory for Earth Hazards, and the Ministry of Marine Affairs and Fisheries - in an effort to evaluate the need for and feasibility of tsunami evacuation infrastructure in Padang. Tsunami evacuation infrastructure can include earthquake-resistant bridges and evacuation structures that rise above the maximum tsunami water level, and can withstand the expected earthquake and tsunami forces. The choices for evacuation structures vary widely - new and existing buildings, evacuation towers, soil berms, elevated highways and pedestrian overpasses. This interdisciplinary project conducted a course at Stanford University, undertook several field investigations, and concluded that: (1) tsunami evacuation structures and bridges are essential to protect the people in Padang, (2) there is a need for a more thorough engineering-based evaluation than conducted to-date of the suitability of existing buildings to serve as evacuation structures, and of existing bridges to serve as elements of evacuation routes, and (3) additions to Padang’s tsunami evacuation infrastructure must carefully take into account technical matters (e.g. expected wave height, debris impact forces), social considerations (e.g. cultural acceptability, public’s confidence in the structure’s integrity), and political issues (e.g. land availability, cost, maintenance). Future plans include collaboration between U.S. and Indonesian engineers in developing designs for new tsunami evacuation structures, as well as providing training for Indonesian authorities on: (1) siting, designing, and constructing tsunami evacuation structures, and (2) evaluating the suitability of existing buildings to serve as tsunami evacuation shelters.
Installation of seafloor cabled seismic and tsunami observation system developed by using ICT
NASA Astrophysics Data System (ADS)
Shinohara, M.
2016-12-01
A seafloor cabled system is useful for earth science studies and disaster mitigation, because it allows real-time and long-term observation. Seafloor cabled systems with seismometers and tsunami-meters have therefore been used around Japan over the past 25 years. Because an increase in the number of sensors is needed, a new system with low production, deployment and operation costs is desired. In addition, the new system should provide sufficient flexibility of measurement after installation. To meet these demands, we started development of a new system using Information and Communication Technologies (ICT) for data transmission and system control. The new system can be made compact since various measurements are handled in software. Reliability is maintained by redundancy, which is easily implemented using ICT. The first system based on this concept was developed as the Ocean Bottom Cabled Seismometer (OBCS) system and deployed in the Japan Sea. Development of the second system started in 2012. The Ocean Bottom Cabled Seismometer and tsunami-meter (OBCST) system has both seismometers and tsunami-meters. Each observation node has a CPU and FPGAs. The OBCST system uses the standard TCP/IP protocol at a speed of 1 Gbps for data transmission, system control and monitoring. IEEE 1588 (PTP) is implemented to synchronize the real-time clocks, with an accuracy better than 300 ns. We developed two types of observation node: one is equipped with a pressure gauge as a tsunami sensor, and the other has an external Power over Ethernet (PoE) port for an additional observation sensor. Deployment of the OBCST system was carried out in September 2015 using a commercial telecommunication cable ship. The noise levels of the OBCST system are comparable to those of the existing cabled system off Sanriku, and are low at frequencies greater than 2 Hz and smaller than 0.1 Hz; this level of ambient seismic noise is close to the typical system noise. The pressure gauges have a resolution better than 1 hPa, which corresponds to a change in water height of less than 1 cm, and the data from all the pressure gauges are consistent.
Modeling of Marine Natural Hazards in the Lesser Antilles
NASA Astrophysics Data System (ADS)
Zahibo, Narcisse; Nikolkina, Irina; Pelinovsky, Efim
2010-05-01
The Caribbean Sea countries are often affected by various marine natural hazards: hurricanes and cyclones, tsunamis and flooding. The historical data on marine natural hazards for the Lesser Antilles, and especially for Guadeloupe, are presented briefly. Numerical simulations of several historical tsunamis in the Caribbean Sea (the 1755 Lisbon trans-Atlantic tsunami, the 1867 Virgin Islands earthquake tsunami, and the 2003 Montserrat volcano tsunami) are performed within the framework of nonlinear shallow-water theory. The numerical results demonstrate the importance of the real bathymetry variability for the direction of propagation of the tsunami wave and its characteristics. The prognostic tsunami wave height distribution along the Caribbean coast is computed using various forms of seismic and hydrodynamic sources. These results are used to estimate the far-field potential for tsunami hazards at coastal locations in the Caribbean Sea. The nonlinear shallow-water theory is also applied to model storm surges induced by tropical cyclones, in particular cyclones "Lilli" in 2002 and "Dean" in 2007. The results obtained are compared with observed data. The numerical models have been tested against known analytical solutions of the nonlinear shallow-water wave equations. The results are described in detail in [1-7]. References [1] N. Zahibo and E. Pelinovsky, Natural Hazards and Earth System Sciences, 1, 221 (2001). [2] N. Zahibo, E. Pelinovsky, A. Yalciner, A. Kurkin, A. Koselkov and A. Zaitsev, Oceanologica Acta, 26, 609 (2003). [3] N. Zahibo, E. Pelinovsky, A. Kurkin and A. Kozelkov, Science of Tsunami Hazards, 21, 202 (2003). [4] E. Pelinovsky, N. Zahibo, P. Dunkley, M. Edmonds, R. Herd, T. Talipova, A. Kozelkov and I. Nikolkina, Science of Tsunami Hazards, 22, 44 (2004). [5] N. Zahibo, E. Pelinovsky, E. Okal, A. Yalciner, C. Kharif, T. Talipova and A. Kozelkov, Science of Tsunami Hazards, 23, 25 (2005). [6] N. Zahibo, E. Pelinovsky, T. Talipova, A. Rabinovich, A. Kurkin and I. Nikolkina, Atmospheric Research, 84, 13 (2007). [7] N. Zahibo, E. Pelinovsky, A. Kurkin and I. Nikolkina, Tsunami hazard for the French West Indies, Lesser Antilles, in Integrated Coastal Zone Management (Ed. R. Krishnamurthy), Research Publ., Singapore, 2008, 517-535.
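For reference, the nonlinear shallow-water equations used in these simulations can be illustrated with a minimal one-dimensional Lax-Friedrichs scheme over a flat bottom, as sketched below; the real simulations are two-dimensional with actual bathymetry and sources, so this is only a toy setup with invented parameters.

```python
import numpy as np

# Minimal 1-D nonlinear shallow-water solver (Lax-Friedrichs, flat bottom,
# crude reflective walls).  Toy illustration only; parameters are invented.
g = 9.81

def step(h, hu, dx, dt):
    U = np.vstack([h, hu])                                  # conserved variables
    F = np.vstack([hu, hu**2 / h + 0.5 * g * h**2])         # fluxes
    Un = U.copy()
    Un[:, 1:-1] = 0.5 * (U[:, :-2] + U[:, 2:]) - dt / (2 * dx) * (F[:, 2:] - F[:, :-2])
    Un[:, 0], Un[:, -1] = Un[:, 1], Un[:, -2]               # copy interior to walls
    Un[1, 0] *= -1                                          # flip momentum at the walls
    Un[1, -1] *= -1
    return Un[0], Un[1]

# Gaussian hump of water relaxing into two outgoing waves.
x = np.linspace(0, 100e3, 2001)
dx = x[1] - x[0]
h = 4000.0 + 1.0 * np.exp(-((x - 50e3) / 5e3) ** 2)         # 1 m hump on 4 km depth
hu = np.zeros_like(x)
dt = 0.4 * dx / np.sqrt(g * h.max())                        # CFL-limited time step
for _ in range(500):
    h, hu = step(h, hu, dx, dt)
```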
Analysis of tsunami disaster resilience in Bandar Lampung Bay Coastal Zone
NASA Astrophysics Data System (ADS)
Alhamidi; Pakpahan, V. H.; Simanjuntak, J. E. S.
2018-05-01
Coastal areas contain a diversity of natural resources and have high economic value. They are influenced by changes in both land and sea, which makes them highly vulnerable to tsunamis. Bandar Lampung has considerable coastal potential, as it is located in a bay adjacent to the Sunda Strait. Based on the study of Heru Sri Naryanto (2003), Bandar Lampung ranks third in terms of tsunami vulnerability. The purpose of this study is therefore to determine the readiness of the region to face a tsunami and the magnitude of the potential tsunami risk in the Lampung Bay coastal region of Bandar Lampung, and hence to develop a tsunami disaster mitigation model or concept appropriate to the vulnerability and hazard, in order to create tsunami resilience for this region. The methodology comprised primary and secondary data collection, with quantitative analysis (spatial analysis and descriptive analysis) of the data obtained in the field. The results show that the level of preparedness of the Lampung Bay coastal region of Bandar Lampung for a tsunami is still low. Many developed areas and houses belonging to residents, both fishermen and non-fishermen, are located in the tsunami hazard zone. In addition, the level of education in the area is still low, with the majority of inhabitants working as fishermen, and the infrastructure is old and poorly maintained, so the area has become a slum. Therefore, the proposed development and planning for tsunami mitigation uses Internet of Things (IoT) technology: an embedded system of seismic sensors, placed both on land and in the ocean, to read pre-earthquake vibrations and faulting in the Earth's crust under the sea. The undersea seismic sensors detect crustal vibrations and are connected to marker buoys that inform the disaster mitigation center. The construction of disaster halls at selected points will also help to provide first aid to those who cannot reach the evacuation sites, which are far from the bay coast; these mitigation halls can be designed to be earthquake- and tsunami-resistant. The mitigation model combines the Bandar Lampung Spatial Plan with tsunami disaster mitigation as an integrated pre-disaster, during-disaster and post-disaster system, so that the city of Bandar Lampung becomes resilient to tsunamis.
NASA Astrophysics Data System (ADS)
Koarai, M.; Okatani, T.; Nakano, T.; Nakamura, T.; Hasegawa, M.
2012-07-01
A great earthquake occurred in the Tohoku District, Japan, on 11 March 2011. This earthquake is named "the 2011 off the Pacific coast of Tohoku Earthquake", and the damage caused by it is named "the Great East Japan Earthquake". About twenty thousand people were killed or went missing in the tsunami of this earthquake, a large area was flooded, and a large number of buildings were destroyed. The Geospatial Information Authority of Japan (GSI) has provided data on the tsunami-flooded area interpreted from aerial photos taken just after the great earthquake. These are fundamental data on tsunami damage and are very useful for reconstruction planning in the tsunami-damaged area. The authors used GIS to analyze the relationships among land use, landform classification, DEMs and flood depth in the area of the Sendai Plain flooded by the Great East Japan Earthquake tsunami. The land use data are the 100 m grid data of the National Land Information Data from the Ministry of Land, Infrastructure, Transport and Tourism (MLIT). The landform classification data are vector data from the Land Condition Map produced by GSI, and the DEMs are 5 m grid data measured with LiDAR by GSI after the earthquake. In particular, the authors focused on the relationship between tsunami damage and flood depth. The authors divided the tsunami damage into three categories by interpreting aerial photos: first, the completely destroyed area, where almost all wooden buildings were lost; second, the heavily damaged area, where a large number of houses were destroyed by the tsunami; and third, the flooded-only area, where houses were less damaged. The flood depth was measured photogrammetrically using digital images taken by a Mobile Mapping System (MMS). The results of these geographic analyses show the distribution of tsunami damage levels as follows: 1) The completely destroyed area was located within 1 km of the coastline; its flood depth was over 4 m, and there was no relationship between the damaged area and landform classification. 2) The heavily damaged area was observed up to 3 or 4 km from the coastline; its flood depth was over 1.5 m, and there is a good relationship between the damaged area and DEM height. 3) The flooded-only area was observed up to 4 or 5 km from the coastline; its flood depth was less than 1.5 m, and there is a good relationship between the damaged area and landform. For instance, certain areas in valley plains or flood plains were not affected by the tsunami, even though areas of almost the same height in coastal plains or deltas were flooded. These results mean that it is important for tsunami disaster management to consider not only DEMs but also landform classification.
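The depth thresholds reported above translate directly into a simple damage classifier, sketched below; the distance-to-coast and landform refinements discussed in the study are not included.

```python
# Simple classifier reproducing the flood-depth thresholds reported above
# (>= 4 m: completely destroyed, >= 1.5 m: heavily damaged, > 0 m: flooded
# only).  Distance-to-coast and landform refinements are omitted.
def damage_class(flood_depth_m):
    if flood_depth_m >= 4.0:
        return "completely destroyed"
    if flood_depth_m >= 1.5:
        return "heavily damaged"
    if flood_depth_m > 0.0:
        return "flooded only"
    return "not flooded"

print([damage_class(d) for d in (5.2, 2.0, 0.7, 0.0)])
```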
Tsunami hazard assessment and monitoring for the Black Sea area
NASA Astrophysics Data System (ADS)
Partheniu, Raluca; Ionescu, Constantin; Constantin, Angela; Moldovan, Iren; Diaconescu, Mihail; Marmureanu, Alexandru; Radulian, Mircea; Toader, Victorin
2016-04-01
NIEP has lately improved its research on tsunamis in the Black Sea. As part of the routine earthquake and tsunami monitoring activity, the first tsunami early-warning system in the Black Sea was implemented in 2013 and has been active in recent years. In order to monitor the seismic activity of the Black Sea, NIEP uses a total of 114 real-time stations and 2 seismic arrays, 18 of the stations being located in the Dobrogea area, situated in the vicinity of the Romanian Black Sea shoreline. Moreover, there is a data exchange with the countries surrounding the Black Sea, involving the acquisition of real-time data from 17 stations in Bulgaria, Turkey, Georgia and Ukraine. This improves the capability of the Romanian Seismic Network to monitor and more accurately locate earthquakes occurring in the Black Sea area. For tsunami monitoring and warning, 6 sea level monitoring stations, 1 infrasound barometer, 3 offshore marine buoys and 7 GPS/GNSS stations are installed at different locations along and near the Romanian shoreline. In the framework of the ASTARTE project, several objectives regarding seismic hazard and tsunami wave height assessment for the Black Sea were accomplished. The seismic hazard estimation was based on statistical studies of the seismic sources and their characteristics, compiled using different seismic catalogues. Two probabilistic methods were used for the evaluation of the seismic hazard: the Cornell method, based on the Gutenberg-Richter distribution parameters, and the Gumbel method, based on extreme-value statistics. The results give the maximum possible magnitudes and their recurrence periods for each seismic source. Using the Tsunami Analysis Tool (TAT) software, a set of tsunami modelling scenarios was generated for the Shabla area, the seismic source most likely to affect the Romanian shore. These simulations are structured in a database, in order to determine the maximum possible tsunami waves that could be generated and the minimum magnitudes that could trigger tsunamis in this area. Some particularities of the Shabla source are past observed magnitudes > 7 and a recurrence period of 175 years. Other important objectives of NIEP are to continue monitoring the seismic activity of the Black Sea, to improve the database of tsunami simulations for this area, to provide near-real-time fault plane solution estimates for the warning system, and to add new seismic, GPS/GNSS and sea level monitoring equipment to the existing network. Acknowledgements: This work was partially supported by the FP7-ENV2013 6.4-3 "Assessment, Strategy And Risk Reduction For Tsunamis in Europe" (ASTARTE) Project 603839/2013 and the PNII, Capacity Module III ASTARTE RO Project 268/2014. This work was also partially supported by the "Global Tsunami Informal Monitoring Service - 2" (GTIMS2) Project, JRC/IPR/2015/G.2/2006/NC 260286, Ref. Ares (2015)1440256 - 01.04.2015.
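The Cornell-type estimate rests on the Gutenberg-Richter relation log10 N(>=M) = a - b*M, from which the return period of a given magnitude follows as T = 1/N. The sketch below shows the computation with placeholder a and b values, not the parameters derived for the Shabla source.

```python
# Gutenberg-Richter recurrence sketch of the kind used in a Cornell-style
# hazard estimate: log10 N(>=M) = a - b*M, annual rate N, return period 1/N.
# The a and b values below are placeholders, not the Shabla source parameters.
def return_period(mag, a=3.5, b=1.0):
    annual_rate = 10 ** (a - b * mag)     # expected events per year with M >= mag
    return 1.0 / annual_rate              # mean recurrence interval in years

for m in (6.0, 6.5, 7.0, 7.5):
    print(f"M >= {m}: ~{return_period(m):.0f} yr return period")
```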
Sri Lanka's post-tsunami health system recovery: a qualitative analysis of physician perspectives.
Schenk, William Collin; Bui, Thuy
2018-01-01
The 2004 Indian Ocean tsunami caused significant damage to the health system in Sri Lanka. Rebuilding infrastructure and improving the mental health system were targets of recovery policies. Retrospective analyses of the post-tsunami health system recovery in Sri Lanka lack the perspectives of local stakeholders, including health care providers. In 2014 we interviewed 23 Sri Lankan physicians from the Eastern and Southern regions. Participants were recruited with snowball sampling. We used a content analysis approach in analysing the transcriptions. Sri Lankan physicians critiqued governance, sustainability and equity in the health system recovery. They held leadership roles as facilitators and sustainers of specific projects but were rarely formally consulted in recovery strategic planning. They identified instances of poor coordination among partners, corruption trends, local resource mismatches, regional resource disparities and the influence of the Sri Lankan civil war. Post-tsunami health system recovery planning and implementation in Sri Lanka did not involve local physician stakeholders in ways that have been prioritized more recently in other recovery frameworks. Despite limited formal inclusion, local physicians developed significant leadership roles that have informed their critical perspectives on the health system recovery. © The Author(s) 2018. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Tsunami evacuation mathematical model for the city of Padang
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kusdiantara, R.; Hadianti, R.; Badri Kusuma, M. S.
2012-05-22
A tsunami is a series of wave trains that travels at high speed on the sea surface. The traveling wave is caused by the displacement of a large volume of water after an underwater earthquake or volcanic eruption. The speed of a tsunami decreases as it approaches the shore, while its amplitude increases. Two large tsunamis occurred in Indonesia in recent decades, with huge casualties and damage. The Indonesian Tsunami Early Warning System has been installed along the west coast of Sumatra and gives about 10-15 minutes to evacuate people from high-risk regions to safe areas. In this paper, a mathematical model for tsunami evacuation is presented, with the city of Padang as a case study. In the model, the safe areas are chosen from selected existing high-rise buildings, low-risk regions with relatively high altitude, and a proposed flyover ring road. Each gathering point is located within a radius of approximately 1 km of the ring road. The model is formulated as an optimization problem with the total normalized evacuation time as the objective function. The constraints consist of the maximum allowable evacuation time on each route, the maximum capacity of each safe area, and the number of people to be evacuated. The optimization problem is solved numerically using a linear programming method in Matlab. Numerical results are shown for various evacuation scenarios for the city of Padang.
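The optimization described above has the structure of a transportation-type linear program. The toy sketch below (in Python with scipy rather than Matlab) assigns evacuees from gathering points to safe areas so that total normalized evacuation time is minimized subject to safe-area capacities; all numbers are invented placeholders.

```python
import numpy as np
from scipy.optimize import linprog

# Toy transportation-style version of the evacuation LP described above.
# All numbers are invented placeholders.
time = np.array([[0.2, 0.9, 0.6],      # normalized evacuation time from each
                 [0.8, 0.3, 0.5],      # gathering point (rows) to each safe
                 [0.4, 0.7, 0.2]])     # area (columns)
people = np.array([5000, 8000, 3000])        # evacuees at each gathering point
capacity = np.array([6000, 7000, 6000])      # capacity of each safe area

n_p, n_s = time.shape
c = time.ravel()                             # objective coefficients
# each gathering point's evacuees must all be assigned somewhere
A_eq = np.zeros((n_p, n_p * n_s))
for i in range(n_p):
    A_eq[i, i * n_s:(i + 1) * n_s] = 1.0
# safe-area capacities must not be exceeded
A_ub = np.zeros((n_s, n_p * n_s))
for j in range(n_s):
    A_ub[j, j::n_s] = 1.0
res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=people,
              bounds=(0, None), method="highs")
print(res.x.reshape(n_p, n_s))   # people sent from each point to each safe area
```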
Post-crisis analysis of an ineffective tsunami alert: the 2010 earthquake in Maule, Chile.
Soulé, Bastien
2014-04-01
Considering its huge magnitude and its location in a densely populated area of Chile, the Maule earthquake of 27 February 2010 caused a relatively low number of victims. However, the post-seismic tsunamis were particularly devastating that day; surprisingly, no full alert was launched at the national, regional or local level. This earthquake and the associated tsunamis are of interest in the context of natural hazards management as well as crisis management planning. Instead of focusing exclusively on the event itself, this article places emphasis on the process, systems and long-term approach that led the tsunami alert mechanism to be ineffectual. Notably, this perspective reveals interrelated forerunner signs of vulnerability. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
Tsunami Wave Height Estimation from GPS-Derived Ionospheric Data
NASA Astrophysics Data System (ADS)
Rakoto, Virgile; Lognonné, Philippe; Rolland, Lucie; Coïsson, P.
2018-05-01
Large underwater earthquakes (Mw > 7) can transmit part of their energy to the surrounding ocean through large seafloor motions, generating tsunamis that propagate over long distances. The forcing effect of tsunami waves on the atmosphere generates internal gravity waves that, when they reach the upper atmosphere, produce ionospheric perturbations. These perturbations are frequently observed in the total electron content (TEC) measured by multifrequency Global Navigation Satellite Systems (GNSS) such as GPS, GLONASS, and, in the future, Galileo. This paper describes the first inversion of the variation in sea level derived from GPS TEC data. We used a least squares inversion through normal-mode summation modeling. This technique was applied in the far field to three tsunamis associated with the 2012 Haida Gwaii, 2006 Kuril Islands, and 2011 Tohoku events, and for Tohoku also in the near field. With the exception of the Tohoku far-field case, for which the tsunami reconstruction by the TEC inversion is less efficient due to the ionospheric noise background associated with the geomagnetic storm that occurred on the day of the earthquake, we show that the peak-to-peak amplitude of the sea level variation inverted by this method can be compared to the tsunami wave height measured by a DART buoy with an error of less than 20%. This demonstrates that the inversion of TEC data with a tsunami normal-mode summation approach is able to estimate quite accurately the amplitude and waveform of the first tsunami arrival.
The role of deposits in tsunami risk assessment
Jaffe, B.
2008-01-01
An incomplete catalogue of tsunamis in the written record hinders tsunami risk assessment. Tsunami deposits, hard evidence of tsunami, can be used to extend the written record. The two primary factors in tsunami risk, tsunami frequency and magnitude, can be addressed through field and modeling studies of tsunami deposits. Recent research has increased the utility of tsunami deposits in tsunami risk assessment by improving the ability to identify tsunami deposits and developing models to determine tsunami magnitude from deposit characteristics. Copyright ASCE 2008.
The TRIDEC Project: Future-Saving FOSS GIS Applications for Tsunami Early Warning
NASA Astrophysics Data System (ADS)
Loewe, P.; Wächter, J.; Hammitzsch, M.
2011-12-01
The Boxing Day Tsunami of 2004 killed over 240,000 people in 14 countries and inundated the affected shorelines with waves reaching heights of up to 30 m. This natural disaster coincided with an information catastrophe: potentially life-saving early warning information existed, yet no means were available to deliver it to the communities under imminent threat. Tsunami early warning capabilities have since improved through the continuing development of modular Tsunami Early Warning Systems (TEWS). However, recent tsunami events, like the Chile 2010 and Tohoku 2011 tsunamis, demonstrate that the key challenge for ongoing TEWS research on the supranational scale still lies in the timely issuing of reliable early warning messages. Since 2004, the GFZ German Research Centre for Geosciences has built up expertise in the field of TEWS. Within GFZ, the Centre for GeoInformation Technology (CEGIT) has focused its work on the geoinformatics aspects of TEWS in two projects: the German Indonesian Tsunami Early Warning System (GITEWS), funded by the German Federal Ministry of Education and Research (BMBF), and the Distant Early Warning System (DEWS), a European project funded under the sixth Framework Programme (FP6). These developments are continued in the TRIDEC project (Collaborative, Complex, and Critical Decision Processes in Evolving Crises), funded under the European Union's seventh Framework Programme (FP7). This ongoing project focuses on real-time intelligent information management in Earth management and its long-term application. All TRIDEC developments are based on Free and Open Source Software (FOSS) components and industry standards wherever possible. Tsunami early warning in TRIDEC is also based on mature system architecture models to ensure long-term usability and the flexibility to adapt to future generations of tsunami sensors. All open source software produced by the project consortium is foreseen to be published on FOSSLAB, a publicly available software repository provided by CEGIT. FOSSLAB serves as a platform for the development of FOSS projects in a geospatial context, allowing results achieved in previous and ongoing project activities to be saved, advanced and reused, and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. FOSSLAB's potential to preserve and advance existing best practices for reuse in new scenarios is documented by a first case study: for TEWS education and public outreach, a comprehensive approach to generating high-resolution globe maps was compiled using GRASS GIS and the POV-Ray rendering software. The task resulted in the merging of isolated technical know-how, previously maintained in disparate GIS and rendering communities, into publicly available best practices. Beyond the scope of TRIDEC, FOSSLAB constitutes an umbrella encompassing several geoinformatics-related activities, such as the documentation of best practices for experiences and results gained while working with Spatial Data Infrastructures (SDI), Geographic Information Systems (GIS), geomatics, and future spatial processing on computation clusters and in cloud computing.
DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim
2010-05-01
The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective of creating a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Building on the upstream information flow, DEWS focuses on improving the downstream capacities of warning centres, especially by improving information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Spatial data are accessed via WMS (Web Map Service) [6] and WFS (Web Feature Service) [7] to depict the situation picture, and a simulation system is integrated via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other hazard types, e.g. volcanic eruptions or landslides, are to follow, so multi-hazard functionality is conceivable in the future. The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI, together with details of the information logistics. The DEWS Wide Area Centre, which connects national centres to allow international communication and exchange of warnings, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
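DEWS itself is not reproduced here, but the idea of packaging a warning as an OASIS CAP message, as described in the abstract, can be illustrated with a short generic sketch using only the Python standard library; the identifier, sender and event values are invented for illustration and are not DEWS output:

# Minimal, hypothetical CAP 1.1-style alert assembled with the Python standard library.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

NS = "urn:oasis:names:tc:emergency:cap:1.1"
ET.register_namespace("", NS)

alert = ET.Element(f"{{{NS}}}alert")
for tag, text in [
    ("identifier", "EXAMPLE-2024-0001"),          # invented identifier
    ("sender", "warning-centre@example.org"),     # placeholder sender
    ("sent", datetime.now(timezone.utc).isoformat(timespec="seconds")),
    ("status", "Exercise"),
    ("msgType", "Alert"),
    ("scope", "Public"),
]:
    ET.SubElement(alert, f"{{{NS}}}{tag}").text = text

info = ET.SubElement(alert, f"{{{NS}}}info")
for tag, text in [
    ("category", "Geo"),
    ("event", "Tsunami"),
    ("urgency", "Immediate"),
    ("severity", "Extreme"),
    ("certainty", "Observed"),
    ("headline", "Tsunami warning for example coastal segment"),
]:
    ET.SubElement(info, f"{{{NS}}}{tag}").text = text

print(ET.tostring(alert, encoding="unicode"))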
PACT - a bottom pressure based, compact deep-ocean tsunameter with acoustic surface coupling
NASA Astrophysics Data System (ADS)
Macrander, A.; Gouretski, V.; Boebel, O.
2009-04-01
The German-Indonesian Tsunami Early Warning System (GITEWS) processes a multitude of information to comprehensively and accurately evaluate the possible risks inherent to seismic events around Indonesia. Within just a few minutes, measurements of the vibration and horizontal movements off the coastal regions of Indonesia provide a clear picture of the location and intensity of a seaquake. However, not every seaquake causes a tsunami, nor is every tsunami caused by a seaquake. To avoid nerve-racking and costly false alarms and to protect against tsunamis caused by landslides, the oceanic sea level must be measured directly. This goal is pursued in the GITEWS work package "ocean instrumentation", which aims at the highest possible reliability and redundancy by developing a set of independent instruments that measure the sea level both offshore in the deep ocean and at the coast on the islands off Indonesia. Deep-ocean sea-level changes of less than a centimetre can be detected by pressure gauges deployed at the sea floor. Based on some of the concepts developed as part of the US DART system, a bottom-pressure-based, acoustically coupled tsunami detector (PACT) was developed under the auspices of the AWI in collaboration with two German SMEs and with support from the University of Bremen and the University of Rhode Island. The PACT system records ocean bottom pressure, performs on-board tsunami detection and acoustically relays the data to the surface buoy. Employing the computational power and communication technologies of the new millennium, PACT integrates the entire sea-floor package (pressure gauge, data logger and analyzer, acoustic modem, acoustic release and relocation aids) into a single unit, i.e. a standard benthos sphere. PACT thereby reduces costs and minimizes deployment effort while maximizing reliability and maintenance intervals. Several PACT systems are scheduled for their first deployment off Indonesia during 2009. In this presentation, the technical specifications and results from extensive laboratory and at-sea tests are shown.
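The abstract does not specify PACT's on-board detection logic; DART-style detection on a bottom-pressure record is commonly described as flagging departures of the measured height from a short-term prediction of the tides, and a schematic version with an invented threshold and window length might look like this:

# Schematic bottom-pressure tsunami detector: fit the recent record with a
# low-order polynomial (a stand-in for the tide prediction) and trigger when
# the newest sample departs from it by more than a threshold.
import numpy as np

def detect_tsunami(height_m, dt_s=15.0, window_s=3600.0, threshold_m=0.03):
    """height_m: recent sea-surface-equivalent heights, newest sample last.
    threshold_m and window_s are illustrative values, not PACT settings."""
    n = int(window_s / dt_s)
    recent = np.asarray(height_m[-n:])
    t = np.arange(recent.size) * dt_s
    coeffs = np.polyfit(t[:-1], recent[:-1], deg=3)   # predict from all but the newest sample
    predicted = np.polyval(coeffs, t[-1])
    anomaly = recent[-1] - predicted
    return abs(anomaly) > threshold_m, anomaly

# Example with synthetic data: a smooth tide plus a sudden 5 cm step.
t = np.arange(0, 7200, 15.0)
tide = 0.5 * np.sin(2 * np.pi * t / 44700.0)
tide[-1] += 0.05
print(detect_tsunami(tide))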
Mathematics of tsunami: modelling and identification
NASA Astrophysics Data System (ADS)
Krivorotko, Olga; Kabanikhin, Sergey
2015-04-01
Tsunami (long waves in deep water) motion caused by underwater earthquakes is described by the shallow water equations η_tt = div(gH(x,y) ∇η), (x,y) ∈ Ω, t ∈ (0,T); η|_{t=0} = q(x,y), η_t|_{t=0} = 0, (x,y) ∈ Ω. (1) The bottom relief H(x,y) and the initial perturbation data (the tsunami source q(x,y)) are required for the direct simulation of tsunamis. The main difficulty of tsunami modelling is the very large size of the computational domain (Ω ≈ 500 × 1000 kilometres in space and a computational time window T of about one hour for an initial perturbation amplitude max|q| of one metre). The calculation of the function η(x,y,t) of three variables in Ω × (0,T) requires large computing resources. We construct a new algorithm to determine numerically the moving tsunami wave height S(x,y), based on a kinematic-type approach and an analytical representation of the fundamental solution. The proposed algorithm for determining the function of two variables S(x,y) reduces the number of operations by a factor of about 1.5 compared with solving problem (1). If none of the functions depend on the variable y (the one-dimensional case), the moving tsunami wave height satisfies the well-known Airy-Green formula S(x) = S(0) (H(0)/H(x))^{1/4}. The problem of identifying the parameters of a tsunami source from additional measurements of a passing wave is called the inverse tsunami problem. We investigate two different inverse problems of determining a tsunami source q(x,y) from two different kinds of additional data: Deep-ocean Assessment and Reporting of Tsunamis (DART) measurements and satellite altimetry waveform images. These problems are severely ill-posed. The main idea is to combine the two types of measured data to reconstruct the source parameters. We apply regularization techniques, such as Fourier expansion, truncated singular value decomposition and numerical regularization, to control the degree of ill-posedness. An algorithm for selecting the number of retained singular values of the inverse-problem operator consistent with the error level in the measured data is described and analysed. In the numerical experiments we used the conjugate gradient method for solving the inverse tsunami problems. Gradient methods are based on minimizing the corresponding misfit function; to calculate the gradient of the misfit function, the adjoint problem is solved. Conservative finite-difference schemes for solving the direct and adjoint problems in the shallow water approximation are constructed. Results of numerical experiments on tsunami source reconstruction are presented and discussed. We show that using a combination of the two types of data increases the stability and efficiency of the tsunami source reconstruction. The non-profit organization WAPMERR (World Agency of Planetary Monitoring and Earthquake Risk Reduction), in collaboration with the Institute of Computational Mathematics and Mathematical Geophysics of SB RAS, developed the Integrated Tsunami Research and Information System (ITRIS) to simulate tsunami waves and earthquakes, river course changes, coastal zone floods, and risk estimates for coastal constructions under wave run-up and earthquakes. Special scientific plug-in components are embedded in a purpose-built GIS-type graphic shell for easy data retrieval, visualization and processing. We demonstrate the tsunami simulation plug-in for historical tsunami events (the 2004 Indian Ocean tsunami, the 2006 Simushir tsunami, and others).
This work was supported by the Ministry of Education and Science of the Russian Federation.
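As a small, self-contained illustration of the truncated singular value decomposition regularization mentioned in the preceding abstract, the sketch below recovers model coefficients in a synthetic ill-posed linear problem d = G m; the operator, noise level and truncation threshold are all placeholders, not the authors' setup:

# Truncated singular value decomposition (TSVD) for an ill-posed linear problem d = G m.
import numpy as np

rng = np.random.default_rng(1)
n_data, n_model = 120, 60
# Ill-conditioned operator: random matrix with rapidly decaying column scales.
G = rng.standard_normal((n_data, n_model)) @ np.diag(np.logspace(0, -6, n_model))
m_true = np.zeros(n_model); m_true[:5] = 1.0
d = G @ m_true + 1e-4 * rng.standard_normal(n_data)

U, s, Vt = np.linalg.svd(G, full_matrices=False)
# Keep only singular values above a level tied to the data noise (discrepancy-style choice).
k = int(np.sum(s > 1e-3))
m_tsvd = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])

print(k, np.linalg.norm(m_tsvd - m_true) / np.linalg.norm(m_true))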
Wood, Nathan; Soulard, Christopher
2008-01-01
Evidence of past events and modeling of potential future events suggest that tsunamis are significant threats to communities on the open-ocean and Strait of Juan de Fuca coasts of Washington. Although potential tsunami-inundation zones from a Cascadia Subduction Zone (CSZ) earthquake have been delineated, the amount and type of human development in tsunami-prone areas have not been documented. A vulnerability assessment using geographic-information-system tools was conducted to document variations in developed land, human populations, economic assets, and critical facilities relative to CSZ-related tsunami-inundation zones among communities on the open-ocean and Strait of Juan de Fuca coasts of Washington (including Clallam, Jefferson, Grays Harbor, and Pacific Counties). The tsunami-inundation zone in these counties contains 42,972 residents (24 percent of the total study-area population), 24,934 employees (33 percent of the total labor force), and 17,029 daily visitors to coastal Washington State Parks. The tsunami-inundation zone also contains 2,908 businesses that generate $4.6 billion in annual sales volume (31 and 40 percent of study-area totals, respectively) and tax parcels with a combined total value of $4.5 billion (25 percent of the study-area total). Although occupancy values are not known for each site, the tsunami-inundation zone also contains numerous dependent-population facilities (for example, schools and child-day-care centers), public venues (for example, religious organizations), and critical facilities (for example, police stations and public-works facilities). Racial diversity of residents in tsunami-prone areas is low: 89 percent of residents are White and 8 percent are American Indian or Alaska Native. Nineteen percent of the residents in the tsunami-inundation zone are over 65 years of age, 30 percent of the residents live on unincorporated county lands, and 35 percent of the households are renter occupied. Employees in the tsunami-inundation zone are largely in businesses related to health care and social assistance, accommodation and food services, and retail trade, reflecting businesses that cater to a growing retiree and tourist population. Community vulnerability, described here by exposure (the amount of assets in tsunami-prone areas) and sensitivity (the relative percentage of assets in tsunami-prone areas), varies among 13 incorporated cities, 7 Indian reservations, and 4 counties. The City of Aberdeen has the highest relative community exposure to tsunamis, whereas the City of Long Beach has the highest relative community sensitivity. Levels of community exposure and sensitivity to tsunamis are found to be related to the amount and percentage, respectively, of a community's land that is in a tsunami-inundation zone. This report will further the dialogue on societal risk from tsunami hazards in Washington and help risk managers determine where additional risk-reduction strategies may be needed.
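Exposure and sensitivity as defined in the report reduce to simple counts and percentages; a toy contrast with invented numbers (not values from the assessment) makes the distinction concrete:

# Toy contrast between community exposure (absolute amount of assets in the
# tsunami-inundation zone) and sensitivity (share of the community's assets in that zone).
# Numbers are invented for illustration.
communities = {
    "Town A": {"residents_total": 16000, "residents_in_zone": 4000},
    "Town B": {"residents_total": 1400,  "residents_in_zone": 1200},
}
for name, c in communities.items():
    exposure = c["residents_in_zone"]                                     # people in the zone
    sensitivity = 100.0 * c["residents_in_zone"] / c["residents_total"]   # percent of community
    print(f"{name}: exposure={exposure} residents, sensitivity={sensitivity:.0f}%")
# Town A has the higher exposure; Town B has the higher sensitivity.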
Preliminary Observations of the Tsunami's Impact on U.S. Trade and Transportation With Japan
DOT National Transportation Integrated Search
2011-05-01
The United States faces potential ramifications from the damage to Japan's freight transportation system caused by the March 2011 earthquake and tsunami. During that time, the United States may face lower levels of both air and maritime imports in au...
Subaqueous Tsunami Deposits from Ohtsuchi Bay of Sanriku Coast, North Eastern Japan
NASA Astrophysics Data System (ADS)
Haraguchi, T.; Fujiwara, O.; Shimazaki, K.
2005-12-01
Holocene tsunami history was analyzed using a drilling core obtained from Ohtsuchi Bay on the Sanriku coast, on the Pacific side of NE Japan. The saw-tooth Sanriku coastline, facing the Japan Trench, is well known for repeated devastation by historical great tsunamis. The worst tsunami damage in Japanese history, more than 20,000 fatalities, was recorded on this coast from the AD 1896 Meiji Sanriku Tsunami (M 8 1/2) centered off Sanriku. However, geological records of ancient tsunamis, such as tsunami deposits, have rarely been reported from the Sanriku coast. Reconstruction of the paleo-tsunami history, including the recurrence interval, provides fundamental data for tsunami disaster mitigation on the coast. The 24-meter-long core, obtained from the bay center at a water depth of 10 m, is mainly composed of sandy mud, excluding the basal gravel bed (the core bottom reached SL -34 m). Sand and gravelly sand beds ranging from several to 200 cm thick are intercalated in the core and are denoted TS-22 to TS-1 in ascending order. Most of these coarse-grained beds show evidence of deposition from high-energy, high-density currents: a basal erosion surface, rip-up clasts mixed with molluscan shells, inverse and normal grading, and a generally upward-fining sequence. The most likely origin of these event deposits is great tsunamis, because the coring site is a deep, low-energy bay floor isolated from major river mouths. Low sediment supply by river floods and small disturbance by wind waves at the drilling site are favorable for the preservation of tsunami deposits. Depositional ages of TS-1 to TS-22 were estimated from a depositional curve of the core based on ten 14C ages of marine shells. The recurrence interval of the 13 sand and gravel beds in the lower part of the core, TS-22 (ca. 7800 cal BP) to TS-10 (AD 1660-1700), is 400 to 500 years. The number of event beds in the upper part of the core, deposited during the last 400 years (TS-9 to TS-1), approximates the number of historic large tsunamis recorded around Ohtsuchi Bay (13-14 events). The remarkable difference in the recurrence intervals of event deposits between the lower and upper parts of the core reflects changes in the sediment supply system and in the preservation potential of the event deposits. Distinguishing tsunami deposits from other deposits, such as river-flood and storm deposits, is a problem that must be solved to reconstruct an accurate tsunami history on the Sanriku coast.
Integrated approach for coastal hazards and risks in Sri Lanka
NASA Astrophysics Data System (ADS)
Garcin, M.; Desprats, J. F.; Fontaine, M.; Pedreros, R.; Attanayake, N.; Fernando, S.; Siriwardana, C. H. E. R.; de Silva, U.; Poisson, B.
2008-06-01
The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean underscored the importance of understanding and accounting for coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (around 30 000 dead, 1 million people left homeless and 70% of the fishing fleet destroyed). Following this tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. This project aims to define, at a pilot site, a methodology for multiple coastal hazards assessment that might be useful for post-tsunami reconstruction and for development planning, and that could be applied to the whole coastline of Sri Lanka. The multi-hazard approach deals with very different coastal processes, both in terms of dynamics and in terms of return period. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study; although less dramatic than the tsunami, these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka. The GIS integrates the physical and demographic data available in Sri Lanka that are useful for assessing coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during monsoon periods as well as of the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated with field data acquired during the project. The results obtained from the models can be further integrated into the GIS, contributing to its enrichment and helping to better assess and mitigate these risks. The coastal-hazards-and-risks GIS coupled with modelling thus appears to be a very useful tool that can constitute the skeleton of a coastal zone management system. Decision makers will be able to make informed choices with regard to hazards during reconstruction and urban planning projects.
NASA Technical Reports Server (NTRS)
Hung, R. J.; Smith, R. E.
1978-01-01
Atmospheric acoustic-gravity waves associated with severe thunderstorms, tornadoes, typhoons (hurricanes) and tsunamis can be studied through the coupling between the ionosphere and the troposphere. Reverse ray tracing computations of acoustic-gravity waves observed by an ionospheric Doppler sounder array show that wave sources are in the nearby storm systems and that the waves are excited prior to the storms. Results show that ionospheric observations, together with satellite observations, can contribute to the understanding of the dynamical behavior of typhoons, severe storms and tsunamis.
NASA Astrophysics Data System (ADS)
Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.
2015-12-01
Tsunami Information technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While the project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), W-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.
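The abstract does not show any of the library internals; as one concrete example of the kind of routine such a seismic library typically bundles, the classical Mwp estimate (after Tsuboi et al., 1995) scales the peak of the integrated vertical P-wave displacement into a moment magnitude. The standalone sketch below uses synthetic values and folds the radiation-pattern correction into a nominal constant; it is not libseismic code:

# Rough, standalone sketch of an Mwp-style magnitude estimate from a vertical
# P-wave displacement record (after Tsuboi et al., 1995). Not libseismic code;
# the station parameters and the record below are synthetic.
import numpy as np

def mwp(displacement_m, dt_s, distance_m, rho=3400.0, alpha=7900.0):
    """Moment magnitude from the peak of the integrated P displacement.
    rho (kg/m^3) and alpha (m/s) are representative mantle values; the
    radiation-pattern correction is folded into a nominal factor of 1."""
    integ = np.cumsum(displacement_m) * dt_s          # running integral of displacement
    m0 = 4.0 * np.pi * rho * alpha**3 * distance_m * np.max(np.abs(integ))
    return (np.log10(m0) - 9.1) / 1.5

# Synthetic one-sided displacement pulse, 60 s at 20 samples/s, station at 3000 km.
t = np.arange(0, 60, 0.05)
disp = 1e-4 * np.exp(-((t - 20.0) / 5.0) ** 2)        # ~0.1 mm peak displacement
print(round(mwp(disp, 0.05, 3.0e6), 1))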
Role of Compressibility on Tsunami Propagation
NASA Astrophysics Data System (ADS)
Abdolali, Ali; Kirby, James T.
2017-12-01
In the present paper, we aim to reduce the discrepancies between tsunami arrival times evaluated from tsunami models and real measurements by considering the role of ocean compressibility. We perform qualitative studies to reveal the phase speed reduction rate via a modified version of the Mild Slope Equation for Weakly Compressible fluid (MSEWC) proposed by Sammarco et al. (2013). The model is validated against a 3-D computational model. Physical properties of the surface gravity waves are studied and compared with those of waves evaluated from an incompressible flow solver over realistic geometry for the 2011 Tohoku-oki event, revealing a reduction in phase speed.
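For scale, the incompressible long-wave speed c = sqrt(g h) is the baseline such corrections modify; a back-of-the-envelope calculation with an illustrative depth, distance and assumed percent-level speed reductions (not values from the paper) shows why even small phase-speed changes matter for basin-scale arrival times:

# Incompressible long-wave speed and trans-oceanic travel time; a small fractional
# speed reduction (compressibility is one contributor) maps into minutes of delay.
import math

g, h = 9.81, 4000.0                     # gravity (m/s^2), illustrative ocean depth (m)
c = math.sqrt(g * h)                    # ~198 m/s
distance = 8.0e6                        # 8000 km propagation path (illustrative)
t0 = distance / c                       # baseline travel time

for reduction in (0.005, 0.01):         # assumed 0.5% and 1% phase-speed reductions
    delay = distance / (c * (1 - reduction)) - t0
    print(f"speed reduction {reduction:.1%}: extra delay ~{delay / 60:.1f} min")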
GPS-TEC of the Ionospheric Disturbances as a Tool for Early Tsunami Warning
NASA Astrophysics Data System (ADS)
Kunitsyn, Viacheslav E.; Nesterov, Ivan A.; Shalimov, Sergey L.; Krysanov, Boris Yu.; Padokhin, Artem M.; Rekenthaler, Douglas
2013-04-01
Recently, GPS measurements have been used to retrieve information on the various types of ionospheric responses to seismic events (earthquakes, seismic Rayleigh waves, and tsunamis), which generate atmospheric waves that propagate up to ionospheric altitudes, where collisions between neutrals and charged particles set the ionospheric plasma in motion. These experimental results can be used in the architecture of future tsunami warning systems. The point is earlier detection (compared with seismological methods) of an ionospheric signal that can indicate the moment of tsunami generation. As an example, we consider the two-dimensional distributions of vertical total electron content (TEC) variations in the ionosphere, both close to and far from the epicenter of the Japan undersea earthquake of March 11, 2011, using radio tomographic (RT) reconstruction of high-temporal-resolution (2-minute) data from the Japanese and US GPS networks. Near-zone TEC variations show a diverging ionospheric perturbation with multi-component spectral composition emerging after the main shock. The initial phase of the disturbance can be used as an indicator of tsunami generation and subsequently for early tsunami warning. Far-zone TEC variations reveal a distinct wave train associated with gravity waves generated by the tsunami. According to the observations, the tsunami arrives at Hawaii and later at the coast of Southern California with a delay relative to the gravity waves. Therefore the gravity wave pattern can also be used in early tsunami warning. We support this scenario with modeling results using ocean-surface perturbation parameters corresponding to the earthquake considered. In addition, the modeling shows that at large distances from the source the gravity wave can outrun the tsunami. The work was supported by the Russian Foundation for Basic Research (grants 11-05-01157 and 12-05-33065).
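The TEC observable used in these studies comes from the standard geometry-free dual-frequency combination; a minimal sketch of that computation from L1/L2 pseudoranges (with hardware biases ignored and an invented pseudorange difference) is:

# Slant TEC from the geometry-free combination of dual-frequency GPS pseudoranges.
# Receiver and satellite hardware biases are ignored here for simplicity.
F1, F2 = 1575.42e6, 1227.60e6           # GPS L1 and L2 carrier frequencies (Hz)
K = 40.3                                # ionospheric constant (SI units)

def slant_tec(p1_m, p2_m):
    """TEC in TEC units (1 TECU = 1e16 electrons/m^2) from L1/L2 pseudoranges (m)."""
    factor = (F1**2 * F2**2) / (K * (F1**2 - F2**2))
    return factor * (p2_m - p1_m) / 1e16

# Example: a 3.2 m L2-L1 pseudorange difference corresponds to roughly 30 TECU.
print(round(slant_tec(20_000_000.0, 20_000_003.2), 1))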
Numerical tool for tsunami risk assessment in the southern coast of Dominican Republic
NASA Astrophysics Data System (ADS)
Macias Sanchez, J.; Llorente Isidro, M.; Ortega, S.; Gonzalez Vida, J. M., Sr.; Castro, M. J.
2016-12-01
The southern coast of the Dominican Republic is a densely populated region that includes several important cities, among them the capital, Santo Domingo. Important activities, including tourism, industry, commercial ports, and energy facilities, are concentrated along the southern coast. According to historical reports, it has been struck by large earthquakes accompanied by tsunamis, as in Azua in 1751 and more recently Pedernales in 2010, although their sources are not clearly identified. The aim of the present work is to develop a numerical tool to simulate the impact on the southern coast of the Dominican Republic of tsunamis generated in the Caribbean Sea. This tool, based on the Tsunami-HySEA model from the EDANYA group (University of Malaga, Spain), could be used in the framework of a Tsunami Early Warning System, thanks to its very short computing times when only propagation is computed, or it could be used to assess inundation impact, computing inundation at an initial 5-meter resolution. Numerical results corresponding to three theoretical sources are used to test the numerical tool.
Sato, Shinji
2015-01-01
Characteristics of the 2011 Tohoku Tsunami have been revealed by collaborative tsunami surveys extensively performed under the coordination of the Joint Tsunami Survey Group. The complex behaviors of the mega-tsunami were characterized by the unprecedented scale and the low occurrence frequency. The limitation and the performance of tsunami countermeasures were described on the basis of tsunami surveys, laboratory experiments and numerical analyses. These findings contributed to the introduction of two-level tsunami hazards to establish a new strategy for tsunami disaster mitigation, combining structure-based flood protection designed by the Level-1 tsunami and non-structure-based damage reduction planned by the Level-2 tsunami. PMID:26062739
CARIBE WAVE/LANTEX Caribbean and Western Atlantic Tsunami Exercises
NASA Astrophysics Data System (ADS)
von Hillebrandt-Andrade, C.; Whitmore, P.; Aliaga, B.; Huerfano Moreno, V.
2013-12-01
Over 75 tsunamis have been documented in the Caribbean and adjacent regions over the past 500 years. While most have been generated by local earthquakes, distantly generated tsunamis can also affect the region. For example, waves from the 1755 Lisbon earthquake and tsunami were observed in Cuba, the Dominican Republic and the British Virgin Islands, as well as Antigua, Martinique, Guadeloupe and Barbados in the Lesser Antilles. Since 1500, at least 4484 people are reported to have perished in these killer waves. Although the tsunami generated by the 2010 Haiti earthquake claimed only a few lives, the death tolls in the 1530 El Pilar, Venezuela; 1602 Port Royale, Jamaica; 1918 Puerto Rico; and 1946 Samaná, Dominican Republic tsunamis ranged up to over a thousand. Since then, there has been an explosive increase in residents, visitors, infrastructure, and economic activity along the coastlines, increasing the potential for human and economic loss. It has been estimated that on any given day upwards of 500,000 people could be in harm's way just along the beaches, with hundreds of thousands more working and living in the tsunami hazard zones. Given the relative infrequency of tsunamis, exercises are a valuable tool to test communications, evaluate preparedness and raise awareness. Exercises in the Caribbean are conducted under the framework of the UNESCO IOC Intergovernmental Coordination Group for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS) and the US National Tsunami Hazard Mitigation Program. On March 23, 2011, 34 countries and territories participated in the first CARIBE WAVE/LANTEX regional tsunami exercise, while in the second exercise, on March 20, 2013, a total of 45 countries and territories participated. 481 organizations (almost 200 more than in 2011) also registered to receive the bulletins issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center and/or the Puerto Rico Seismic Network. The CARIBE WAVE/LANTEX 13 scenario simulated a tsunami generated by a magnitude 8.5 earthquake originating north of Oranjestad, Aruba, in the Caribbean Sea. For the first time, earthquake impact was included in addition to expected tsunami impact. The initial message was issued by the warning centers over the established channels, while different mechanisms were then used by participants for further dissemination. The enhanced PTWC tsunami products for the Caribbean were also made available to the participants. To provide feedback on the exercise, an online survey tool with 85 questions was used. The survey demonstrated satisfaction with the exercise, timely receipt of bulletins and interest in the enhanced PTWC products. It also revealed that while 93% of the countries had an activation and response process, only 59% indicated that they also had an emergency response plan for tsunamis, and even fewer had tsunami evacuation plans and inundation maps. Given that 80% of those surveyed indicated that CARIBE WAVE should be conducted annually, CARIBE EWS decided that the next exercise would be held on March 26, 2014, instead of waiting until 2015.
NASA Astrophysics Data System (ADS)
Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.
Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.
Geological and historical evidence of irregular recurrent earthquakes in Japan.
Satake, Kenji
2015-10-28
Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres, releasing strains accumulated over decades to centuries of plate motion. Assuming a simple 'characteristic earthquake' model in which similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes, including geological traces of giant (M∼9) earthquakes, indicate a variety of sizes and recurrence intervals of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that the average recurrence interval of great earthquakes is approximately 100 years, but tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake, with an interval of approximately 600 years, is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence intervals of a few hundred years and a few thousand years had been recognized, but studies show that the three most recent Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for long-term forecasting, and several approaches, such as the use of geological data for the evaluation of future earthquake probabilities or the estimation of the maximum earthquake size in each subduction zone, are being pursued by government committees. © 2015 The Author(s).
Richmond, Bruce M.; Buckley, Mark; Etienne, Samuel; Chagué-Goff, Catherine; Clark, Kate; Goff, James; Dominey-Howes, Dale; Strotz, Luke
2011-01-01
The September 29th, 2009 tsunami caused widespread coastal modification within the islands of Samoa and northern Tonga in the South Pacific. Preliminary measurements indicate maximum runup values of around 17 m (Okal et al., 2010) and shore-normal inundation distances of up to ~620 m (Jaffe et al., 2010). Geological field reconnaissance studies were conducted as part of a UNESCO-IOC International Tsunami Survey Team survey within three weeks of the event in order to document the erosion, transport, and deposition of sediment by the tsunami. Data collected included: a) general morphology and geological characteristics of the coast, b) evidence of tsunami flow (inundation, flow depth and direction, wave height and runup), c) surficial and subsurface sediment samples, including deposit thickness and extent, d) topographic mapping, and e) boulder size and location measurements. Four main types of sedimentary deposits were identified: a) gravel fields consisting mostly of isolated cobbles and boulders, b) sand sheets from a few to ~25 cm thick, c) piles of organic (mostly vegetation) and man-made material forming debris ramparts, and d) surface mud deposits that settled from suspension from standing water in the tsunami aftermath. Tsunami deposits within the reef system were not widespread; however, surficial changes to the reefs were observed. PMID:27065478
NASA Astrophysics Data System (ADS)
Ulutas, Ergin
2013-01-01
Numerical simulations of the tsunami caused by the 11 March 2011 Tohoku-Oki earthquake (Mw 9.0) off the Pacific coast of Japan have been performed using diverse co-seismic source models. Co-seismic source models proposed by various observational agencies and researchers are used to elucidate the effects of uniform and non-uniform slip distributions on the tsunami generation and propagation stages. The non-linear shallow water equations are solved with a finite difference scheme, using a computational grid with different cell sizes over GEBCO30 bathymetry data. Overall results obtained and reported by various tsunami simulation models are compared with the available data from real-time kinematic global positioning system (RTK-GPS) buoys, cabled deep ocean-bottom pressure gauges (OBPG), and Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys. The purpose of this study is to provide a brief overview of the major differences between point-source and finite-fault methodologies for the generation and simulation of tsunamis. Tests of the uniform and non-uniform slip assumptions indicate that average uniform slip models may be used for tsunami simulations offshore and far from the source region. Nevertheless, heterogeneities of the slip distribution on the fault plane matter substantially for wave amplitudes in the near field and should be investigated further.
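The finite-difference treatment of the shallow water equations is standard; a minimal 1-D linearized sketch on a staggered grid with flat bathymetry (illustrative grid and source, not the authors' GEBCO-based configuration) gives the flavour of the propagation step:

# Minimal 1-D linearized shallow-water propagation on a staggered grid (leapfrog-style),
# flat bathymetry, closed ends. Illustrative only; not the study's configuration.
import numpy as np

g, h = 9.81, 4000.0                # gravity, uniform depth (m)
L, nx = 1.0e6, 500                 # 1000 km domain, 500 cells
dx = L / nx
dt = 0.5 * dx / np.sqrt(g * h)     # CFL-limited time step

x = (np.arange(nx) + 0.5) * dx
eta = np.exp(-((x - L / 2) / 5.0e4) ** 2)   # 1 m Gaussian hump as the initial source
u = np.zeros(nx + 1)                        # velocities on cell faces

for _ in range(400):
    # momentum: du/dt = -g d(eta)/dx  (interior faces only)
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    # continuity: d(eta)/dt = -h du/dx
    eta -= dt * h * (u[1:] - u[:-1]) / dx

print(float(eta.max()))            # the hump has split into two outgoing waves of ~0.5 m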
High tsunami frequency as a result of combined strike-slip faulting and coastal landslides
Hornbach, M.J.; Braudy, N.; Briggs, R.W.; Cormier, M.-H.; Davis, M.B.; Diebold, J.B.; Dieudonne, N.; Douilly, R.; Frohlich, C.; Gulick, S.P.S.; Johnson, H. E.; Mann, P.; McHugh, C.; Ryan-Mishkin, K.; Prentice, C.S.; Seeber, L.; Sorlien, C.C.; Steckler, M.S.; Symithe, S.J.; Taylor, F.W.; Templeton, J.
2010-01-01
Earthquakes on strike-slip faults can produce devastating natural hazards. However, because they consist predominantly of lateral motion, these faults are rarely associated with significant uplift or tsunami generation. And although submarine slides can generate tsunami, only a few per cent of all tsunami are believed to be triggered in this way. The 12 January 2010 Mw 7.0 Haiti earthquake exhibited primarily strike-slip motion but nevertheless generated a tsunami. Here we present data from a comprehensive field survey that covered the onshore and offshore area around the epicentre to document that modest uplift together with slope failure caused tsunamigenesis. Submarine landslides caused the most severe tsunami locally. Our analysis suggests that slide-generated tsunami occur an order of magnitude more frequently along the Gonave microplate than global estimates predict. Uplift was generated because of the earthquake's location, where the Caribbean and Gonave microplates collide obliquely. The earthquake also caused liquefaction at several river deltas that prograde rapidly and are prone to failure. We conclude that coastal strike-slip fault systems such as the Enriquillo-Plantain Garden fault produce relief conducive to rapid sedimentation, erosion and slope failure, so that even modest, predominantly strike-slip earthquakes can cause potentially catastrophic slide-generated tsunami, a risk that is underestimated at present. © 2010 Macmillan Publishers Limited. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-09
... information following testing of the associated NWS communications systems. The tests are planned annually, in March/April and again in September. Post-test feedback information will be requested from emergency... Collection; Comment Request; Feedback Survey for Annual Tsunami Warning Communications Tests AGENCY: National...
Tsunami on Sanriku Coast in 1586: Orphan or Ghost Tsunami?
NASA Astrophysics Data System (ADS)
Satake, K.
2017-12-01
The Peruvian earthquake of July 9, 1586 is the oldest earthquake known to have damaged Lima. The tsunami height was assigned as 24 m in Callao and 1-2 m in Miyagi prefecture in Japan by Soloviev and Go (1975). Dorbath et al. (1990) studied historical earthquakes in Peru and estimated that the 1586 earthquake was similar to the 1974 event (Mw 8.1), with a source length of 175 km. They cited two different tsunami heights in Callao, 3.7 m and 24 m, and judged that the latter was exaggerated. Okal et al. (2006) could not construct a source model explaining the tsunami heights in both Callao and Japan. More recently, Butler et al. (2017) estimated the age of coral boulders in Hawaii as AD 1572 +/- 21, speculated on a tsunami source in the Aleutians, and attributed the 1586 tsunami in Japan to that source. Historical tsunamis, both near-field and far-field, have been documented along the Sanriku coast since 1586 (e.g., Watanabe, 1998). However, there is no written document for the 1586 tsunami (Tsuji et al., 2013). Ninomiya (1960) compiled the historical tsunami records of the Sanriku coast soon after the 1960 Chilean tsunami and correlated the legend of a tsunami in Tokura with the 1586 Peruvian earthquake, although he noted that the dates were different. Regarding the legend, he referred to Kunitomi (1933), who compiled historical tsunami data after the 1933 Showa Sanriku tsunami. Kunitomi in turn referred to the "Tsunami history of Miyagi prefecture", published after the 1896 Meiji Sanriku tsunami. The "Tsunami history" described the earthquake and tsunami damage of the Tensho earthquake of January 18, 1586 (Gregorian) in central Japan and correlated it with the tsunami legend in Tokura dated June 30, 1586 (Gregorian). Following the 2011 Tohoku tsunami, the tsunami legend in Tokura was studied again (Ebina, 2015). A local resident published a story, heard from his grandfather, that many small valleys were named following the 1611 tsunami, which inundated further inland than the 2011 tsunami. Ebina (2015), based on historical documents, estimated that the legend dates from around 1750. From this research, the tsunami legend in Tokura is unlikely to derive from the Peruvian earthquake; hence the 1586 tsunami was not an orphan tsunami but rather a ghost, or fake, tsunami. The legend simply mentions a tsunami, yet tsunami heights of 1-2 m (Soloviev and Go) or 2-2.5 m (NOAA tsunami database) were speculated from it.
Performance Benchmarking of tsunami-HySEA for NTHMP Inundation Mapping Activities
NASA Astrophysics Data System (ADS)
González Vida, Jose M.; Castro, Manuel J.; Ortega Acosta, Sergio; Macías, Jorge; Millán, Alejandro
2016-04-01
According to the 2006 USA Tsunami Warning and Education Act, the tsunami inundation models used in National Tsunami Hazard Mitigation Program (NTHMP) projects must be validated against a set of existing standard problems (see [OAR-PMEL-135], [Proceedings of the 2011 NTHMP Model Benchmarking Workshop]). These Benchmark Problems (BPs) cover different tsunami processes related to the inundation stage that the models must reproduce to obtain approval from the NTHMP Mapping and Modeling Subcommittee (MMS). Tsunami-HySEA solves the two-dimensional shallow-water system using a high-order path-conservative finite volume method. Values of h, qx and qy in each grid cell represent cell averages of the water depth and momentum components. The numerical scheme is conservative for both mass and momentum on flat bathymetries and, in general, is mass preserving for arbitrary bathymetries. Tsunami-HySEA implements a PVM-type method that uses the fastest and the slowest wave speeds, similar to the HLL method (see [Castro et al, 2012]). A general overview of the derivation of the high-order methods is given in [Castro et al, 2009]. For very large domains, Tsunami-HySEA also implements a two-step scheme similar to leap-frog for the propagation step and a second-order TVD-WAF flux-limiter scheme, described in [de la Asunción et al, 2013], for the inundation step. Here, we present the results obtained by the Tsunami-HySEA model for the proposed BPs: BP1, solitary wave on a simple beach (non-breaking; analytic experiment); BP4, solitary wave on a simple beach (breaking; laboratory experiment); BP6, solitary wave on a conical island (laboratory experiment); BP7, runup on Monai Valley beach (laboratory experiment); and BP9, Okushiri Island tsunami (field experiment). The analysis and results of the Tsunami-HySEA model are presented, concluding that the model meets the required objectives for all the proposed BPs. References - Castro M.J., E.D. Fernández, A.M. Ferreiro, A. García, C. Parés (2009). High order extension of Roe schemes for two dimensional nonconservative hyperbolic systems. J. Sci. Comput. 39(1), 67-114. - Castro M.J., E.D. Fernández-Nieto (2012). A class of computationally fast first order finite volume solvers: PVM methods. SIAM J. Sci. Comput. 34, A2173-2196. - de la Asunción M., M.J. Castro, E.D. Fernández-Nieto, J.M. Mantas, et al. (2013). Efficient GPU implementation of a two waves TVD-WAF method for the two-dimensional one layer shallow water system on structured meshes. Computers & Fluids 80, 441-452. - OAR PMEL-135. Synolakis, C.E., E.N. Bernard, V.V. Titov, U. Kânoǧlu, and F.I. González (2007). Standards, criteria, and procedures for NOAA evaluation of tsunami numerical models. NOAA Tech. Memo., NOAA/Pacific Marine Environmental Laboratory, Seattle, WA, 55 pp. - Proceedings and results of the 2011 NTHMP Model Benchmarking Workshop. NOAA Special Report. July 2012. Acknowledgements: This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069), the Spanish Government research project DAIFLUID (MTM2012-38383-C02-01) and the Unit of Numerical Methods (UNM) of the Research Support Central Services (SCAI) of the University of Málaga.
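The PVM/HLL idea mentioned above, building a numerical flux from only the fastest and slowest wave-speed estimates, can be written down for the 1-D shallow water system in a few lines; the sketch below is a generic textbook-style HLL flux in Python, not the Tsunami-HySEA implementation:

# Generic HLL numerical flux for the 1-D shallow water equations U = (h, q),
# with q = h*u. Textbook form, not Tsunami-HySEA code.
import numpy as np

g = 9.81

def physical_flux(h, q):
    u = q / h
    return np.array([q, q * u + 0.5 * g * h**2])

def hll_flux(hL, qL, hR, qR):
    uL, uR = qL / hL, qR / hR
    cL, cR = np.sqrt(g * hL), np.sqrt(g * hR)
    sL = min(uL - cL, uR - cR)      # slowest wave speed estimate
    sR = max(uL + cL, uR + cR)      # fastest wave speed estimate
    FL, FR = physical_flux(hL, qL), physical_flux(hR, qR)
    if sL >= 0.0:
        return FL
    if sR <= 0.0:
        return FR
    UL, UR = np.array([hL, qL]), np.array([hR, qR])
    return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

# Dam-break-like interface: 2 m of water at rest next to 1 m of water at rest.
print(hll_flux(2.0, 0.0, 1.0, 0.0))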
High Resolution Tsunami Modeling and Assessment of Harbor Resilience; Case Study in Istanbul
NASA Astrophysics Data System (ADS)
Cevdet Yalciner, Ahmet; Aytore, Betul; Gokhan Guler, Hasan; Kanoglu, Utku; Duzgun, Sebnem; Zaytsev, Andrey; Arikawa, Taro; Tomita, Takashi; Ozer Sozdinler, Ceren; Necmioglu, Ocal; Meral Ozel, Nurcan
2014-05-01
Ports and harbors are among the coastal structures most vulnerable to tsunami attack. Harbors that are resilient to tsunami impacts are essential for proper, efficient and successful rescue operations and for reducing the loss of life and property in tsunami disasters. Several such critical coastal structures are located in the Marmara Sea. Haydarpasa and Yenikapi ports, located on the Marmara Sea coast of Istanbul, are selected as the sites of numerical experiments to test their resilience under tsunami impact. Cargo, container and ro-ro handling and short- and long-distance passenger transfers are common services in both ports. Haydarpasa port has two breakwaters with a total length of three kilometers; Yenikapi port has a one-kilometer-long breakwater. Accurate resilience analysis requires high-resolution tsunami modeling and careful assessment of the site. Therefore, building data with accurate footprint coordinates and elevations were obtained, and a high-resolution bathymetry and topography database with a grid size of less than 5 m was developed for modeling. Metadata for the various structures and infrastructure of the ports and their environs were processed. Different resistances of the structures/buildings/infrastructure are represented by assigning different friction coefficients in a friction matrix. Two tsunami conditions, high expected and moderate expected, are selected for numerical modeling. The hybrid tsunami simulation and visualization codes NAMI DANCE and the STOC-CADMAS System are utilized to compute all necessary tsunami parameters and obtain the spatial and temporal distributions of flow depth, current velocity, inundation distance and maximum water level in the study domain. Finally, the computed critical values of the tsunami parameters are evaluated and the structural performance of the port components is discussed with regard to improved resilience. ACKNOWLEDGEMENTS: Support by EU 603839 ASTARTE Project, UDAP-Ç-12-14 of AFAD, 108Y227 and 113M556 of TUBITAK, RAPSODI (CONCERT_Dis-021) of CONCERT-Japan Joint Call, Earthquake and Tsunami Disaster Mitigation in The Marmara Region and Disaster Education in Turkey Japan-Turkey Joint Research Project by SATREPS, 2011K140210 of DPT, Istanbul Metropolitan Municipality are acknowledged.
NASA Astrophysics Data System (ADS)
Fritz, H. M.; Phillips, D. A.; Okayasu, A.; Shimozono, T.; Liu, H.; Takeda, S.; Mohammed, F.; Skanavis, V.; Synolakis, C.; Takahashi, T.
2014-12-01
The 2004 Indian Ocean tsunami marked the advent of survivor videos, mainly from tourist areas in Thailand and basin-wide locations. Near-field video recordings on Sumatra's north tip at Banda Aceh were limited to inland areas a few kilometres off the beach (Fritz et al., 2006). The March 11, 2011, magnitude Mw 9.0 earthquake off the Tohoku coast of Japan caused catastrophic damage and loss of life, resulting in the costliest natural disaster in recorded history. The mid-afternoon tsunami arrival, combined with survivors equipped with cameras on top of vertical evacuation buildings, provided numerous inundation recordings with unprecedented spatial and temporal resolution. High-quality tsunami video recording sites at Yoriisohama, Kesennuma, Kamaishi and Miyako along Japan's Sanriku coast were surveyed, eyewitnesses interviewed and precise topographic data recorded using terrestrial laser scanning (TLS). The original video recordings were recovered from eyewitnesses and the Japanese Coast Guard (JCG). The analysis of the tsunami videos follows an adapted four-step procedure (Fritz et al., 2012). Measured overland flow velocities during tsunami runup exceed 13 m/s at Yoriisohama. The runup hydrograph at Yoriisohama highlights the undersampling at the Onagawa Nuclear Power Plant (NPP) pressure gauge, which skips the shorter-period second crest. Combined tsunami and runup hydrographs are derived from the videos based on water surface elevations at surface-piercing objects and along slopes identified in the acquired topographic TLS data. Several hydrographs reveal a drawdown to minus 10 m after the first wave crest, exposing harbor bottoms at Yoriisohama and Kamaishi. In some cases ship moorings resist the main tsunami crest only to be broken by the extreme drawdown. A multi-hour ship track for the Asia Symphony, with the vessel's complete tsunami drifting motion in Kamaishi Bay, is recovered from the universal shipborne AIS (Automatic Identification System). Multiple hydrographs corroborate the tsunami propagation through Miyako Bay and up the Hei River. Tsunami outflow currents of up to 11 m/s were measured in Kesennuma Bay, making navigation impossible. Further, we discuss the complex effects of coastal structures on inundation and outflow hydrographs as well as the associated flow velocities.
NASA Astrophysics Data System (ADS)
Occhipinti, G.; Manta, F.; Rolland, L.; Watada, S.; Makela, J. J.; Hill, E.; Astafieva, E.; Lognonne, P. H.
2017-12-01
Detection of ionospheric anomalies following the Sumatra and Tohoku earthquakes (e.g., Occhipinti 2015) demonstrated that the ionosphere is sensitive to earthquake and tsunami propagation: ground and oceanic vertical displacements induce acoustic-gravity waves that propagate within the neutral atmosphere and are detectable in the ionosphere. Observations supported by modelling proved that ionospheric anomalies related to tsunamis are deterministic and reproducible by numerical modeling via the ocean/neutral-atmosphere/ionosphere coupling mechanism (Occhipinti et al., 2008). To show that the tsunami signature in the ionosphere is routinely detected, we present perturbations of total electron content (TEC) measured by GPS following tsunamigenic earthquakes from 2004 to 2011 (Rolland et al. 2010, Occhipinti et al., 2013), namely Sumatra (26 December 2004 and 12 September 2007), Chile (14 November 2007), Samoa (29 September 2009) and the recent Tohoku-Oki (11 March 2011). Based on observations close to the epicenter, mainly performed by GPS networks located in Sumatra, Chile and Japan, we highlight the TEC perturbation observed within the first 8 min after the seismic rupture. This perturbation contains information about the ground displacement, as well as the consequent sea surface displacement resulting in the tsunami. In addition to GNSS-TEC observations close to the epicenter, exciting new measurements in the far field, performed by airglow imaging in Hawaii, show the propagation of the internal gravity waves induced by the Tohoku tsunami (Occhipinti et al., 2011). This revolutionary imaging technique is today supported by two new observations of moderate tsunamis: Queen Charlotte (M 7.7, 27 October 2013) and Chile (M 8.2, 16 September 2015). We finally detail our recent work (Manta et al., 2017) on the tsunami alert failure following the Mw 7.8 Mentawai event (25 October 2010) and its twin tsunami alert response following the Mw 7.8 Benyak event (2010). In this talk we present all these new tsunami observations in the ionosphere and discuss, in the light of modelling, the potential role of ionospheric sounding by GNSS-TEC and airglow cameras in ocean monitoring and future tsunami warning systems. All ref. here @ www.ipgp.fr/ ninto
Kirby, Stephen; Scholl, David; von Huene, Roland E.; Wells, Ray
2013-01-01
Tsunami modeling has shown that tsunami sources located along the Alaska Peninsula segment of the Aleutian-Alaska subduction zone have the greatest impacts on southern California shorelines, raising the highest tsunami waves for a given source seismic moment. The most probable sector for a Mw ~ 9 source within this subduction segment is between Kodiak Island and the Shumagin Islands, in what we call the Semidi subduction sector; these bounds represent the southwestern limit of the 1964 Mw 9.2 Alaska earthquake rupture and the northeastern edge of the Shumagin sector, which recent Global Positioning System (GPS) observations indicate is currently creeping. Geological and geophysical features in the Semidi sector that are thought to be relevant to the potential for large-magnitude, long-rupture-runout interplate thrust earthquakes are remarkably similar to those in northeastern Japan, where the destructive Mw 9.1 tsunamigenic earthquake of 11 March 2011 occurred. In this report we propose and justify the selection of a tsunami source seaward of the Alaska Peninsula for use in the Tsunami Scenario that is part of the U.S. Geological Survey (USGS) Science Application for Risk Reduction (SAFRR) Project. This tsunami source should have the potential to raise damaging tsunami waves on the California coast, especially at the ports of Los Angeles and Long Beach. Accordingly, we have summarized and abstracted the slip distribution from the source literature on the 2011 event, the best characterized of any subduction earthquake, and applied this synoptic slip distribution to the similar megathrust geometry of the Semidi sector. The resulting slip model has an average slip of 18.6 m and a moment magnitude of Mw = 9.1. The 2011 Tohoku earthquake was not anticipated, despite Japan having the best seismic and geodetic networks in the world and the best historical record in the world, extending back 1,500 years. What was lacking was adequate paleogeologic data on prehistoric earthquakes and tsunamis, a data gap that also presently applies to the Alaska Peninsula and the Aleutian Islands. Quantitative appraisal of potential tsunami sources in Alaska requires such investigations.
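The quoted average slip and magnitude can be cross-checked with the standard seismic-moment relation M0 = mu * A * D; the rigidity and rupture dimensions below are assumed round numbers for illustration, not values taken from the report:

# Consistency check: Mw from average slip and an assumed rupture area via M0 = mu * A * D.
import math

mu = 4.0e10                    # rigidity (Pa), assumed representative value
slip = 18.6                    # average slip (m), from the scenario source model
length, width = 600e3, 125e3   # assumed rupture dimensions (m) for the Semidi sector

m0 = mu * length * width * slip            # seismic moment (N*m)
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)  # moment magnitude
print(round(mw, 2))                        # ~9.1 for these round numbers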
NASA Astrophysics Data System (ADS)
Soto-Cordero, L.; Meltzer, A.
2014-12-01
A magnitude 6.4 earthquake offshore northern Puerto Rico earlier this year (1/13/14) is a reminder of the high risk of earthquakes and tsunamis in the northeastern Caribbean. Had the magnitude of this event been 0.1 larger (M 6.5), a tsunami warning would have been issued for the Puerto Rico-Virgin Islands (PRVI) region based on the West Coast and Alaska Tsunami Warning Center (WCATWC) and Puerto Rico Seismic Network (PRSN) response procedures at the time. Such an alert level would have led local authorities to issue evacuation orders for all PRVI coastal areas. Since the number of deaths associated with tsunamis in the Caribbean region is greater than the total casualties from tsunamis in the entire US (including the Hawaii and Alaska coasts), having an effective and redundant warning system is critical in order to save lives and to minimize false alarms that could result in significant economic costs and loss of confidence among Caribbean residents. We are evaluating three fundamental components of tsunami monitoring protocols currently in place in the northeastern Caribbean: 1) preliminary earthquake parameters (used to determine the potential that a tsunami will be generated and the basis of tsunami alert levels), 2) adequacy of the tsunami alert levels, and 3) tsunami message dissemination. We compiled a catalog of earthquake locations (2007-2014) and dissemination times from the PTWC, WCATWC and NEIC (final locations). The events were classified into 3 categories: local [17°-20°N, 63.5°-69°W], regional (Caribbean basin) and distant/teleseismic (Atlantic basin). A total of 104 local earthquakes, 31 regional and 25 distant events were analyzed. We found that, in general, preliminary epicentral locations have an accuracy of 40 km; 64% of local events were located with an accuracy of 20 km. The depth accuracy of local events shallower than 50 km, and of regional and distant earthquakes, is usually better than 30 km. For deeper local events the error distribution shows more variability (-32 to 81 km); preliminary locations tend to underestimate depth. A trade-off between epicentral location and depth was observed for several local events deeper than 50 km.
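The location-accuracy comparison described above boils down, for each event, to the great-circle separation between the preliminary and the final (NEIC) epicenter. A minimal sketch of that distance calculation, with hypothetical coordinates (the catalog formats and values are assumptions, not PRSN/NEIC data):

```python
import math

def epicentral_misfit_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between preliminary and final epicenters, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical local event: preliminary vs. final location
print(round(epicentral_misfit_km(19.02, -66.45, 18.90, -66.30), 1))  # misfit in km
```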
Tsunami Modeling and Prediction Using a Data Assimilation Technique with Kalman Filters
NASA Astrophysics Data System (ADS)
Barnier, G.; Dunham, E. M.
2016-12-01
Earthquake-induced tsunamis cause dramatic damage along densely populated coastlines. It is difficult to anticipate tsunami waves in advance, but if the earthquake occurs far enough from the coast, there may be enough time to evacuate the zones at risk. Therefore, any real-time information on the tsunami wavefield (as it propagates towards the coast) is extremely valuable for early warning systems. After the 2011 Tohoku earthquake, a dense tsunami-monitoring network (S-net) based on cabled ocean-bottom pressure sensors has been deployed along the Pacific coast of northeastern Japan. Maeda et al. (GRL, 2015) introduced a data assimilation technique to reconstruct the tsunami wavefield in real time by combining the numerical solution of the shallow water wave equations with additional terms penalizing the numerical solution for not matching observations. The penalty or gain matrix is determined through optimal interpolation and is independent of time. Here we explore a related data assimilation approach using the Kalman filter method to evolve the gain matrix. While more computationally expensive, the Kalman filter approach potentially provides more accurate reconstructions. We test our method on a 1D tsunami model derived from the Kozdon and Dunham (EPSL, 2014) dynamic rupture simulations of the 2011 Tohoku earthquake. For appropriate choices of model and data covariance matrices, the method reconstructs the tsunami wavefield prior to wave arrival at the coast. We plan to compare the Kalman filter method to the optimal interpolation method developed by Maeda et al. (GRL, 2015) and then to implement the method in 2D.
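A minimal sketch of the idea, not the authors' implementation: the tsunami state (sea-surface height and velocity on a grid) is advanced by a linear propagator F, and whenever pressure-gauge observations y arrive, the Kalman gain is recomputed from the evolving covariance and used to correct the state. The matrices and sensor layout below are illustrative assumptions; in the tsunami setting F would be the discretized shallow-water operator and H would pick out the S-net gauge locations.

```python
import numpy as np

def kalman_assimilate(x, P, F, Q, H, R, y):
    """One predict/update cycle: evolve state and covariance, then correct with observations y."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update (gain recomputed each step, unlike time-independent optimal interpolation)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (y - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy usage: 4-point state observed at two "gauges"
n = 4
x, P = np.zeros(n), np.eye(n)
F = 0.9 * np.eye(n) + 0.1 * np.eye(n, k=1)      # placeholder propagator, not a real SWE operator
Q, R = 1e-3 * np.eye(n), 1e-2 * np.eye(2)
H = np.zeros((2, n)); H[0, 1] = H[1, 3] = 1.0   # observations at grid points 1 and 3
x, P = kalman_assimilate(x, P, F, Q, H, R, y=np.array([0.05, 0.02]))
```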
An Optimal Design for Placements of Tsunami Observing Systems Around the Nankai Trough, Japan
NASA Astrophysics Data System (ADS)
Mulia, I. E.; Gusman, A. R.; Satake, K.
2017-12-01
Presently, there are numerous tsunami observing systems deployed in several major tsunamigenic regions throughout the world. However, documentation on how and where to optimally place such measurement devices is limited. This study presents a methodological approach to select the best and fewest observation points for the purpose of tsunami source characterization, particularly in the form of fault slip distributions. We apply the method to design a new tsunami observation network around the Nankai Trough, Japan. In brief, our method can be divided into two stages: initialization and optimization. The initialization stage aims to identify favorable locations for observation points, as well as to determine the initial number of observations. These points are generated from the extrema of empirical orthogonal function (EOF) spatial modes derived from 11 hypothetical tsunami events in the region. In order to further improve the accuracy, we apply an optimization algorithm called mesh adaptive direct search (MADS) to remove redundant measurements from the points generated in the first stage. A combinatorial search by MADS improves the accuracy and reduces the number of observations simultaneously. The EOF analysis of the hypothetical tsunamis, using the first 2 leading modes with 4 extrema on each mode, results in 30 observation points spread along the trench; this is obtained after replacing clustered points within a radius of 30 km with a single representative. Furthermore, the MADS optimization can improve the accuracy of the EOF-generated points by approximately 10-20% with fewer observations (23 points). Finally, we compare our result with the existing observation points (68 stations) in the region. The result shows that the optimized design, with a smaller number of observations, can produce better source characterizations, with approximately 20-60% improvement in accuracy for all 11 hypothetical cases. It should be noted, however, that our design is a tsunami-based approach; some of the existing observing systems are equipped with additional devices to measure other parameters of interest, e.g., for monitoring seismic activity.
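A sketch of the initialization idea, assuming the hypothetical scenarios are stacked as rows of a (scenarios x grid points) matrix of, e.g., maximum offshore amplitudes; the largest-magnitude loadings of the leading EOF spatial modes then suggest candidate gauge locations. Array shapes and the number of modes and extrema are illustrative assumptions, and the MADS step is not shown.

```python
import numpy as np

def candidate_points(scenarios, n_modes=2, n_extrema=4):
    """EOF (via SVD) of the scenario matrix; return grid indices with the largest
    absolute loadings in the leading spatial modes as candidate observation points."""
    anomalies = scenarios - scenarios.mean(axis=0)      # remove the scenario-mean field
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    candidates = set()
    for mode in vt[:n_modes]:                           # leading EOF spatial modes
        candidates.update(np.argsort(np.abs(mode))[-n_extrema:])
    return sorted(candidates)

# Toy example: 11 hypothetical scenarios on a 500-point offshore grid
rng = np.random.default_rng(0)
scenarios = rng.standard_normal((11, 500))
print(candidate_points(scenarios))
```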
Tsunami geology in paleoseismology
Yuichi Nishimura,; Jaffe, Bruce E.
2015-01-01
The 2004 Indian Ocean and 2011 Tohoku-oki disasters dramatically demonstrated the destructiveness and deadliness of tsunamis. For the assessment of future risk posed by tsunamis it is necessary to understand past tsunami events. Recent work on tsunami deposits has provided new information on paleotsunami events, including their recurrence interval and the size of the tsunamis (e.g. [187–189]). Tsunamis are observed not only on the margin of oceans but also in lakes. The majority of tsunamis are generated by earthquakes, but other events that displace water such as landslides and volcanic eruptions can also generate tsunamis. These non-earthquake tsunamis occur less frequently than earthquake tsunamis; it is, therefore, very important to find and study geologic evidence for past eruption and submarine landslide triggered tsunami events, as their rare occurrence may lead to risks being underestimated. Geologic investigations of tsunamis have historically relied on earthquake geology. Geophysicists estimate the parameters of vertical coseismic displacement that tsunami modelers use as a tsunami's initial condition. The modelers then let the simulated tsunami run ashore. This approach suffers from the relationship between the earthquake and seafloor displacement, the pertinent parameter in tsunami generation, being equivocal. In recent years, geologic investigations of tsunamis have added sedimentology and micropaleontology, which focus on identifying and interpreting depositional and erosional features of tsunamis. For example, coastal sediment may contain deposits that provide important information on past tsunami events [190, 191]. In some cases, a tsunami is recorded by a single sand layer. Elsewhere, tsunami deposits can consist of complex layers of mud, sand, and boulders, containing abundant stratigraphic evidence for sediment reworking and redeposition. These onshore sediments are geologic evidence for tsunamis and are called ‘tsunami deposits’ (Figs. 26 and 27). Tsunami deposits can be classified into two groups: modern tsunami deposits and paleotsunami deposits. A modern tsunami deposit is a deposit whose source event is known. A paleotsunami deposit is a deposit whose age is estimated and has a source that is either inferred to be a historical event or is unknown.
Field Survey of the 17 June 2017 Landslide and Tsunami in Karrat Fjord, Greenland
NASA Astrophysics Data System (ADS)
Fritz, H. M.; Giachetti, T.; Anderson, S.; Gauthier, D.
2017-12-01
On 17 June 2017 a massive landslide-generated tsunami impacted Karrat Fjord and the Uummannaq fjord system located some 280 km north of Ilulissat in western Greenland. The eastern of two easily recognized landslides detached completely and fell approximately 1 km to sea level, plunging into Karrat Fjord and generating a tsunami within the fjord system. The landslide-generated tsunami washed 4 victims and several houses into the fjord at Nuugaatsiaq, about 30 km west of the landslide. Eyewitnesses at Nuugaatsiaq and Illorsuit recorded the tsunami inundation on videos. The active western landslide features a back scarp and large cracks, and therefore remains a threat in Karrat Fjord. The villages of Nuugaatsiaq and Illorsuit remain evacuated. The Geotechnical Extreme Events Reconnaissance (GEER) survey team deployed to Greenland from July 6 to 9, 2017. The reconnaissance on July 8 involved approximately 800 km of helicopter flight and landings in several key locations. The survey focused on the landslides and coastlines within 30 km of the landslide in either fjord direction. The aerial reconnaissance collected high quality oblique aerial photogrammetry (OAP) of the landslide, scarp, and debris avalanche track. The 3D model of the landslide provides the ability to study the morphology of the slope on July 8, a baseline model for future surveys, and a means of comparison with earlier imagery to estimate what happened on June 17. Change detection using prior satellite imagery indicates an approximate 55 million m3 total landslide volume, of which 45 million m3 plunged into the fjord from elevations up to 1200 m above the water surface. The ground-based tsunami survey documented flow depths, runup heights, inundation distances, sediment deposition, damage patterns at various scales, performance of the man-made infrastructure, and impact on the natural and glacial environment. Perishable high-water marks include changes in vegetation and damage to roots, deposits and scour of soil and rock, stranded icebergs, as well as damage to homes and infrastructure. The tsunami runup heights exceeded 90 m laterally to the west of the landslide and 50 m across the 6 km wide fjord. The Greenland landslide-generated tsunami highlights coastal hazards to communities not commonly exposed to earthquake-generated tsunamis.
NASA Astrophysics Data System (ADS)
Levin, B.; Kopanina, A.; Ivelskaya, T.; Sasorova, E.
2007-12-01
A field survey of the coasts of the central Kuril Islands (Simushir, Urup, Ketoy) was carried out for the Institute of Marine Geology and Geophysics FEB RAS (Yuzhno-Sakhalinsk) from the vessel "Iskatel-4" to find deposits and other traces of the devastating influence of tsunami waves on soil and vegetation. Average run-up heights and inundation distances (tsunami flooding zones) were: h=6-9 m and 40-60 m (Ketoy); h=7-19 m and 80-300 m (Simushir). The field observations showed destruction of the soil layer. Estimates of the water-stream velocity required for hydraulic destruction of rocks yield average flow velocities during dynamic tsunami inundation in the range of about 30-50 m/s. Field observations of coastal plants in the tsunami inundation zones on Urup, Simushir and Ketoy Islands enabled us to recognize the character of the destructive influence of tsunami waves on plant structure and the essential signs of micro-phytocenoses for ecotopes at different distances from the coastline. Various plant species and life forms were found to show different reactions to sea waves. The results showed that selected plant species demonstrate a strong response to tsunami inundation. We found that the species most sensitive to the mechanical and physical-chemical impact of tsunamis are Pinus pumila (Pall.) Regel and Phyllodoce aleutica (Spreng.) A. Heller. The character of plant damage includes breaking of skeletal axes, disturbance of root systems, and leaf die-off. These findings allow us to use these species as effective indicators of the tsunami flooding zone and for estimating tsunami run-up heights. The analyses performed let us reconstruct possible events when a tsunami hits a coast with a specific shore morphology. At a slightly sloping coast (from the coastline to the first terrace), the wave front is characterized by a uniform rise of the water level; the water removes only a thin layer of soil material (no more than 2-3 cm) and the micro-phytocenoses maintain their stability. On impact with steep dune slopes, the tsunami wave generates violent horizontal streams that hit the sea bank with velocities of the order of 30 m/s and lead to considerable destruction of the soil layer to a depth of 30-35 cm and structural damage of the vegetation.
An Earthquake Source Sensitivity Analysis for Tsunami Propagation in the Eastern Mediterranean
NASA Astrophysics Data System (ADS)
Necmioglu, Ocal; Meral Ozel, Nurcan
2013-04-01
An earthquake source parameter sensitivity analysis for tsunami propagation in the Eastern Mediterranean has been performed based on the 8 August 1303 Crete and Dodecanese Islands earthquake, which resulted in destructive inundation in the Eastern Mediterranean. The analysis involves 23 cases describing different sets of strike, dip, rake and focal depth, while keeping the fault area and displacement, and thus the magnitude, the same. The main conclusions of the evaluation are drawn from the investigation of the wave height distributions at Tsunami Forecast Points (TFP). The comparison of earthquake versus initial tsunami source parameters indicated that the maximum initial wave height values correspond in general to changes in the rake angle. No clear depth dependency is observed within the depth range considered, and no strike angle dependency is observed in terms of amplitude change. The directivity sensitivity analysis indicated that, for the same strike and dip, a 180° shift in rake may lead to a 20% change in the calculated tsunami wave height. Moreover, an approximately 10 min difference in the arrival time of the initial wave has been observed. These differences are, however, greatly reduced in the far field. The dip sensitivity analysis, performed separately for thrust and normal faulting, indicated in both cases that an increase in the dip angle results in a decrease of the tsunami wave amplitude in the near field of approximately 40%. While a positive phase shift is observed, the period and the shape of the initial wave stay nearly the same for all dip angles at the respective TFPs. These effects are, however, not observed in the far field. The resolution of the bathymetry, on the other hand, is a limiting factor for further evaluation. Four different cases were considered for the depth sensitivity, indicating that within the depth range considered (15-60 km) an increase of the depth has only a smoothing effect on the synthetic tsunami wave height measurements at the selected TFPs. The strike sensitivity analysis showed a clear phase shift with respect to the variation of the strike angle, without leading to severe variation of the initial and maximum waves at the locations considered. Travel time maps for two cases corresponding to different strike values (60° vs. 150°) presented a more complex wave propagation for the case with a 60° strike angle, because the normal of the fault plane is orthogonal to the main bathymetric structure in the region, namely the eastern section of the Hellenic Arc between Crete and Rhodes Islands. For a given set of strike, dip and focal depth parameters, the effect of the variation in the rake angle has been evaluated in the rake sensitivity analysis. A waveform envelope composed of symmetric synthetic recordings at a single TFP could be clearly observed as a result of rake angle variations in the 0-180° range. This leads to the conclusion that, for a given magnitude (fault size and displacement), the expected maximum and minimum tsunami wave amplitudes could be evaluated as a waveform envelope rather than being limited to a single point in time or amplitude. The evaluation of the initial wave arrival times follows an expected pattern controlled by distance, whereas the maximum wave arrival time distribution presents no clear pattern. Nevertheless, the distribution is rather concentrated in the time domain for some TFPs.
Maximum positive and minimum negative wave amplitude distributions indicate a broader range for a subgroup of TFPs, whereas for the remaining TFPs the distributions are narrow. Any deviation from the expected trend of narrower amplitude distributions could be interpreted as the result of bathymetry and focusing effects. As similar studies conducted in different parts of the globe have indicated, the main characteristics of tsunami propagation are unique to each basin. It should be noted, however, that the synthetic measurements obtained at the TFPs in the absence of high-resolution bathymetric data should be considered only as overall guidance. The results indicate the importance of accurate earthquake source parameters for reliable tsunami predictions and the need for high-resolution bathymetric data to be able to perform calculations with higher accuracy. On the other hand, this study did not address other parameters, such as heterogeneous slip distribution and rupture duration, which affect the tsunami initiation and propagation process.
NASA Astrophysics Data System (ADS)
Tang, H.; WANG, J.
2017-12-01
The population living close to coastlines is increasing, which creates higher risk from coastal hazards such as tsunamis. However, the generation of a tsunami is not fully understood yet, especially for paleo-tsunamis. Tsunami deposits are among the concrete pieces of evidence in the geological record that can be used to study paleo-tsunamis. The understanding of tsunami deposits has significantly improved over the last decades. There are many inversion models (e.g. TsuSedMod, TSUFLIND, and TSUFLIND-EnKF) to study overland-flow characteristics based on tsunami deposits. However, none of them tries to reconstruct offshore tsunami wave characteristics (waveform, wave height, and wavelength) from tsunami deposits. Here we present a state-of-the-art inverse approach to reconstruct the offshore tsunami wave based on tsunami inundation data, the spatial distribution of tsunami deposits, and the marine-terrestrial sediment signal in the deposits. An ensemble Kalman filter (EnKF) method is used to assimilate both sediment transport simulations and field observation data. While more computationally expensive, the EnKF approach potentially provides more accurate reconstructions of the tsunami waveform. In addition to improving the inversion results, the ensemble-based method can also quantify the uncertainties of the results. Meanwhile, joint inversion improves the resolution of the tsunami waves compared with inversions using any single data type. The method will be tested with field survey data and gauge data from the 2011 Tohoku tsunami on the Sendai Plain.
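A minimal sketch of the ensemble analysis step (not the authors' TSUFLIND-EnKF code): each ensemble member holds candidate offshore wave parameters, a forward sediment-transport model maps them to predicted deposit observables, and the stochastic EnKF update nudges the ensemble toward the field data. The parameter names and the placeholder forward model are assumptions.

```python
import numpy as np

def enkf_update(ensemble, predicted_obs, y, obs_err_std, rng):
    """Stochastic EnKF analysis: ensemble (n_members, n_params), predicted_obs (n_members, n_obs)."""
    xa = ensemble - ensemble.mean(axis=0)
    ya = predicted_obs - predicted_obs.mean(axis=0)
    n = ensemble.shape[0]
    cxy = xa.T @ ya / (n - 1)                                 # parameter-observation covariance
    cyy = ya.T @ ya / (n - 1) + np.diag(obs_err_std ** 2)     # observation covariance + noise
    gain = cxy @ np.linalg.inv(cyy)
    perturbed = y + rng.normal(0.0, obs_err_std, size=predicted_obs.shape)
    return ensemble + (perturbed - predicted_obs) @ gain.T

# Toy usage: 50 members, 2 wave parameters (height, period), 3 deposit observables
rng = np.random.default_rng(1)
ens = rng.normal([5.0, 600.0], [1.0, 100.0], size=(50, 2))
pred = np.column_stack([ens[:, 0] * 0.02, ens[:, 0] * 0.01, ens[:, 1] * 1e-4])  # placeholder forward model
ens = enkf_update(ens, pred, y=np.array([0.12, 0.05, 0.07]),
                  obs_err_std=np.array([0.01, 0.01, 0.01]), rng=rng)
```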
Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin
2015-12-15
As a random event, a natural disaster has a complex occurrence mechanism. Comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance and current deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in this paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fits at the lower tail of the hazard factors and that the three-dimensional Frank copula function showed better fits at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
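For reference, the bivariate Frank copula and the corresponding joint return periods can be written down compactly. The sketch below is a generic illustration, not the paper's fitted model; the copula parameter, the marginal non-exceedance probabilities, and the mean interarrival time are placeholder values.

```python
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula C_theta(u, v), theta != 0."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    return -math.log(1.0 + num / (math.exp(-theta) - 1.0)) / theta

def joint_return_periods(u, v, theta, mu_years):
    """'OR' (either factor exceeded) and 'AND' (both exceeded) joint return periods."""
    c = frank_copula(u, v, theta)
    t_or = mu_years / (1.0 - c)
    t_and = mu_years / (1.0 - u - v + c)
    return t_or, t_and

# Placeholder values: 0.9 marginal non-exceedance for both factors, theta = 5, mean interarrival 0.25 yr
print(joint_return_periods(0.9, 0.9, theta=5.0, mu_years=0.25))
```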
NASA Astrophysics Data System (ADS)
Dominey-Howes, D.; Goff, J. R.
2009-12-01
National economies are increasingly dependent on the global telecommunications system - and in particular, its submarine cable infrastructure. Submarine cable traffic represents about 30% of global GDP, so the cost of losing, or even simply slowing, communications traffic is high. Many natural hazards are capable of damaging and destroying this infrastructure, but tsunamis are the most significant threat, particularly in waters >1000 m deep. Submarine cables and their shore-based infrastructure (the anchor points) are at risk from direct and indirect tsunami-related effects. During the 2004 Indian Ocean Tsunami in India and Indonesia, cables were broken (direct effect) as the tsunami eroded supporting sediments, and were further damaged by floating/submerged objects and intense nearshore currents. Shore-based infrastructure was also directly damaged in India, Indonesia, and the Maldives. The 1929 Grand Banks earthquake generated a submarine landslide and tsunami off Newfoundland which broke 12 submarine telegraph cables. In 2006, an earthquake in Taiwan generated submarine landslides and a tsunami. These landslides caused one of the largest disruptions in modern telecommunications history when nine cables in the Strait of Luzon were broken, disabling vital connections between SE Asia and the rest of the world. Although electronic traffic in and out of Australia was slowed, it did not cease because >70% of our traffic is routed via cables that pass through Hawaii. This is extremely significant because Hawaii is an internationally recognised bottleneck or “choke point” in the global telecommunications network. The fact that Hawaii is a choke point is important because it is regularly affected by numerous large magnitude natural hazards. Any damage to the submarine telecommunications infrastructure routed through Hawaii could result in significant impacts on the electronic flow of data and voice traffic, negatively affecting dependent economies such as Australia. Other choke points exist globally, many in high-hazard regions. We propose that proper risk assessments be undertaken at all bottlenecks in the global telecommunications system affected by natural hazards (such as tsunamis). We use Hawaii as an example of the sort of research that should be undertaken.
The GNSS-based Ground Tracking System (GTS) of GFZ; from GITEWS to PROTECTS and beyond
NASA Astrophysics Data System (ADS)
Falck, Carsten; Merx, Alexander; Ramatschi, Markus
2013-04-01
Introduction An automatic system for the near real-time determination and visualization of ground motions, i.e. co-seismic deformations of the Earth's surface, was developed by GFZ (German Research Centre for Geosciences) within the project GITEWS (German Indonesian Tsunami Early Warning System). The system is capable of delivering 3D-displacement vectors for locations with appropriate GPS equipment in the vicinity of an earthquake's epicenter with a delay of only a few minutes. These vectors can help to assess the tectonic movements caused by the earthquake, which must be known to make reliable early warning predictions, e.g., concerning the generation of tsunami waves. The GTS (Ground Tracking System) has been integrated into InaTEWS (Indonesian Tsunami Early Warning System) and has been in operation at the national warning center in Jakarta since November 2008. After the end of the GITEWS project, GFZ continues to support the GTS in Indonesia within the frame of PROTECTS (Project for Training, Education and Consulting for Tsunami Early Warning Systems), and recently some new developments have been introduced. We now aim to make further use of the achievements, e.g., by developing a license model for the GTS software package. Motivation After the tsunami of 26 December 2004, the German government initiated the GITEWS project to develop the main components for a tsunami early warning system in Indonesia. GFZ, as the consortium leader of GITEWS, had several work packages, most of them related to sensor systems. The geodetic branch (Department 1) of GFZ was assigned to develop a GNSS-based component, which has since been known as the GTS (Ground Tracking System). System benefit The ground motion information delivered by the GTS is a valuable source for a fast understanding of an earthquake's mechanism and is highly relevant for assessing the probability and magnitude of a potentially following tsunami. The system can detect the largest displacement vector values precisely where seismic systems may have problems determining earthquake magnitudes, e.g. close to an earthquake's epicenter. By considering displacement vectors, the GTS may significantly support the decision on whether a tsunami has been generated. Brief system description The GTS may be divided into three main components: 1) The data acquisition component receives and manages data from GNSS stations, transferred either in real time, file-based, or both in parallel, and includes, e.g., format conversion and real-time distribution to other services. It also acquires the most recent auxiliary data needed for data processing, e.g., GNSS satellite orbit data, or, in case of internet problems, generates them from broadcast ephemerides received by the connected GNSS network stations. 2) The automatic GNSS data processing unit calculates coordinate time series for all GNSS stations providing data. The processing kernel is the robust and well-supported »Bernese GPS Software«, wrapped into adaptations for fully automatic near real-time processing. The final products of this unit are 3D-displacement vectors, which are calculated as differences to the mean coordinates of the latest timespan prior to an earthquake. 3) The graphical user interface (GUI) of the GTS supports both a quick view for all staff members at the warning centre (24h/7d shifts) and deeper analysis by experts. The states of the connected GNSS networks and of the automatic data processing system are displayed.
Other views are available, e.g., to check intermediate processing steps or historic data. The GTS final products, the 3D-displacement vectors, are displayed as arrows and bars on a map view. The GUI system is implemented as a web-based application and allows all views to be displayed on many screens at the same time, even at remote locations. Acknowledgements The projects GITEWS (German Indonesian Tsunami Early Warning System) and PROTECTS (Project for Training, Education and Consulting for Tsunami Early Warning System) are carried out by a large group of scientists and engineers from (GFZ) German Research Centre for Geosciences and its partners from the German Aerospace Centre (DLR), the Alfred Wegener Institute for Polar and Marine Research (AWI), the GKSS Research Centre, the Konsortium Deutsche Meeresforschung (KDM), the Leibniz Institute for Marine Sciences (IFM-GEOMAR), the United Nations University (UNU), the Federal Institute for Geosciences and Natural Resources (BGR), the German Agency for Technical Cooperation (GTZ) and other international partners. Funding is provided by the German Federal Ministry for Education and Research (BMBF), Grant 03TSU01 and 03TSU07.
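The core product described above, the 3D displacement vector, is conceptually just the difference between a station's position estimate after the event and its mean position over a window before the event. A minimal sketch of that calculation (the window lengths and time-series layout are assumptions, not the GTS implementation):

```python
import numpy as np

def displacement_vector(times, enu, event_time, pre_window=600.0, post_window=120.0):
    """3D co-seismic displacement: mean ENU position after the event minus the mean ENU
    position over a pre-event window. `times` in seconds, `enu` as (n, 3) east/north/up in m."""
    times = np.asarray(times)
    enu = np.asarray(enu)
    pre = enu[(times >= event_time - pre_window) & (times < event_time)]
    post = enu[(times > event_time) & (times <= event_time + post_window)]
    return post.mean(axis=0) - pre.mean(axis=0)

# Toy usage: 1 Hz positions with a 0.4 m eastward, -0.1 m vertical step at t = 1000 s
t = np.arange(0.0, 1200.0)
pos = np.zeros((t.size, 3))
pos[t > 1000.0] += [0.4, 0.0, -0.1]
print(displacement_vector(t, pos, event_time=1000.0))   # ~[0.4, 0.0, -0.1]
```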
Improvement and speed optimization of numerical tsunami modelling program using OpenMP technology
NASA Astrophysics Data System (ADS)
Chernov, A.; Zaytsev, A.; Yalciner, A.; Kurkin, A.
2009-04-01
Currently, a basic problem of tsunami modeling is the low speed of calculations, which is unacceptable for operational warning services. Existing algorithms for numerical modeling of the hydrodynamic processes of tsunami waves were developed without taking advantage of modern computer facilities. Considerable acceleration of the calculations is possible by using parallel algorithms. We discuss here a new approach to parallelizing a tsunami modeling code using OpenMP technology (for multiprocessor systems with shared memory). Nowadays, multiprocessor systems are easily accessible to everyone, and the cost of using such systems is much lower than the cost of clusters. This also allows programmers to apply multithreaded algorithms on researchers' desktop computers. Another important advantage of this approach is the shared-memory mechanism: there is no need to send data over slow networks (for example, Ethernet). All memory is common to all computing threads, which yields almost linear scalability of the program. In the new version of NAMI DANCE, OpenMP technology and the multithreaded algorithm provide an 80% gain in speed compared with the single-threaded version on a dual-processor unit, and a 320% gain was attained on a four-core processor PC. Thus, it was possible to reduce calculation times considerably on scientific workstations (desktops) without a complete change of the program and user interfaces. Further modernization of the algorithms for preparing initial data and processing results using OpenMP looks reasonable. The final version of NAMI DANCE with the increased computational speed can be used not only for research purposes but also in real-time tsunami warning systems.
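This is not the NAMI DANCE source (which applies OpenMP directives in its own code base); as a rough Python analogue of the same shared-memory idea, the sketch below splits the grid-point loops of a 1D linear shallow-water leapfrog step across threads with Numba's prange, the counterpart of an OpenMP parallel for. Grid, depth, and time step are illustrative.

```python
import numpy as np
from numba import njit, prange

G = 9.81

@njit(parallel=True)
def leapfrog_step(eta, flux, depth, dt, dx):
    """One staggered-grid step of the 1D linear shallow-water equations.
    Each loop over grid points is split across threads (analogous to OpenMP parallel for)."""
    n = eta.size
    for i in prange(1, n):                       # momentum update at cell faces
        flux[i] -= G * depth[i] * dt / dx * (eta[i] - eta[i - 1])
    for i in prange(0, n - 1):                   # continuity update at cell centres
        eta[i] -= dt / dx * (flux[i + 1] - flux[i])

# Toy usage: 1 m initial hump over 4000 m depth, 2 km grid spacing (CFL-stable time step)
n = 2000
eta = np.exp(-((np.arange(n) - n / 2) ** 2) / 200.0)
flux = np.zeros(n)
depth = np.full(n, 4000.0)
leapfrog_step(eta, flux, depth, dt=5.0, dx=2000.0)
```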
Application of Catastrophe Risk Modelling to Evacuation Public Policy
NASA Astrophysics Data System (ADS)
Woo, G.
2009-04-01
The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane were appearing to head for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast, and chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a catastrophe risk model, is required to explore the casualty implications of different possible hazard scenarios, to assess the proportion of an evacuated population who would owe their lives to an evacuation, and to estimate the economic loss associated with an unnecessary evacuation. This paper will review the developing methodology for applying catastrophe risk modelling to support public policy in evacuation decision-making, and provide illustrations from across the range of natural hazards. Evacuation during volcanic crises is a prime example, recognizing the improving forecasting skill of volcanologists, now able to account probabilistically for precursory seismological, geodetic, and geochemical monitoring data. This methodology will be shown to help civic authorities make sounder risk-informed decisions on the timing and population segmentation of evacuation from both volcanoes and calderas, such as Vesuvius and Campi Flegrei, which are in densely populated urban regions.
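The cost-benefit logic described above can be made concrete in a few lines: given a stochastic event set with occurrence probabilities and modelled casualties, the expected benefit of evacuating (the statistical value of lives saved) is weighed against the cost of evacuation, including the chance that it turns out to be unnecessary. All numbers below are illustrative placeholders, not calibrated values, and this is a schematic sketch rather than a catastrophe-model implementation.

```python
def evacuation_favored(scenarios, evac_cost, value_per_life):
    """Simple risk-based check: evacuate if the expected value of lives saved exceeds
    the cost of evacuating, summed over a stochastic event set.
    Each scenario is (probability, lives_saved_if_evacuated)."""
    expected_benefit = sum(p * lives * value_per_life for p, lives in scenarios)
    return expected_benefit > evac_cost

# Placeholder event set: 5% chance of a deadly scenario, 95% chance the evacuation proves unnecessary
scenarios = [(0.05, 2000.0), (0.95, 0.0)]
print(evacuation_favored(scenarios, evac_cost=50e6, value_per_life=5e6))  # True: 5e8 expected benefit > 5e7 cost
```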
Post-tsunami outbreaks of influenza in evacuation centers in Miyagi Prefecture, Japan.
Hatta, Masumitsu; Endo, Shiro; Tokuda, Koichi; Kunishima, Hiroyuki; Arai, Kazuaki; Yano, Hisakazu; Ishibashi, Noriomi; Aoyagi, Tetsuji; Yamada, Mitsuhiro; Inomata, Shinya; Kanamori, Hajime; Gu, Yoshiaki; Kitagawa, Miho; Hirakata, Yoichi; Kaku, Mitsuo
2012-01-01
We describe 2 post-tsunami outbreaks of influenza A in evacuation centers in Miyagi Prefecture, Japan, in 2011. Although containment of the outbreak was challenging in the evacuation settings, prompt implementation of a systemic approach with a bundle of control measures was important to control the influenza outbreaks.
Post Fukushima tsunami simulations for Malaysian coasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, Hock Lye, E-mail: kohhl@ucsiuniversity.edu.my; Teh, Su Yean, E-mail: syteh@usm.my; Abas, Mohd Rosaidi Che
The recent recurrences of mega tsunamis in the Asian region have rekindled concern regarding potential tsunamis that could inflict severe damage on affected coastal facilities and communities. The 11 March 2011 Fukushima tsunami that crippled nuclear power plants in northern Japan has further raised the level of caution. The recent discovery of petroleum reserves in the coastal waters surrounding Malaysia further heightens the concern regarding tsunami hazards to petroleum facilities located along affected coasts. Working in a group, federal government agencies seek to understand the dynamics of tsunamis and their impacts under the coordination of the Malaysian National Centre for Tsunami Research, Malaysian Meteorological Department. Knowledge regarding the generation, propagation and runup of tsunamis provides the scientific basis to address safety issues. An in-house tsunami simulation model known as TUNA has been developed by the authors to assess tsunami hazards along affected beaches so that mitigation measures can be put in place. Capacity building in tsunami simulation plays a critical role in the development of tsunami resilience. This paper aims first to provide a simple introduction to tsunami simulation towards the achievement of tsunami simulation capacity building. The paper will also present several scenarios of tsunami dangers along affected Malaysian coastal regions via TUNA simulations to highlight tsunami threats. The choice of tsunami generation parameters reflects the concern following the Fukushima tsunami.
Survey of the July 17, 2006 Central Javan tsunami reveals 21m runup heights
NASA Astrophysics Data System (ADS)
Fritz, H.; Goff, J.; Harbitz, C.; McAdoo, B.; Moore, A.; Latief, H.; Kalligeris, N.; Kodjo, W.; Uslu, B.; Titov, V.; Synolakis, C.
2006-12-01
The Monday, July 17, 2006 Central Javan magnitude 7.7 earthquake triggered a substantial tsunami that killed 600 people along a 200 km stretch of coastline. The earthquake was not reported felt along the coastline. While a warning was issued by the PTWC, it did not trigger an evacuation warning (Synolakis, 2006). The Indian Ocean Tsunami Warning System, announced by UNESCO as operational in a press release two weeks before the event, did not function as promised. There were no seismic recordings transmitted to the PTWC, and two German tsunameter buoys had broken off their moorings and were not operational. Lifeguards along a tourist beach reported that, while they observed the harbinger shoreline recession, they attributed it to extreme storm waves that were pounding the beaches that day. Had the tsunami struck on the preceding Sunday, instead of Monday, the death toll would have been far higher. The International Tsunami Survey Team (ITST) surveyed the coastline measuring runup, inundation, flow depths and sediment deposition with standard methods (Synolakis and Okal, 2004). Runup values ranged up to 21 m with several readings over 10 m, while sand sheets up to 15 cm thick were deposited. The parent earthquake was similar, albeit of smaller magnitude, to that of the 1994 East Javan tsunami, which struck about 200 km to the east (Synolakis et al., 1995) and reached a maximum runup height of 11 m at only one location on steep cliffs. The unusual distribution of runup heights, and the pronounced extreme values near Nusa Kambangan, suggest a local coseismic landslide may have triggered an additional tsunami (Okal and Synolakis, 2005). The ITST observed that many coastal villages were completely abandoned after the tsunami, even in locales where there were no casualties. Whether residents will return is uncertain, but it is clear that an education campaign in tsunami hazard mitigation is urgently needed. In the aftermath of the tsunami, the Government of Indonesia enforced urgent emergency preparedness measures, including sirens, identification of rapid evacuation routes, and emergency drills, which were under way in some locations the team visited. Synolakis, C.E., What went wrong, Wall Street Journal, p. 12, July 25, 2006. Synolakis, C.E., and E.A. Okal, 1992-2002: Perspective on a decade of post-tsunami surveys, in: Tsunamis: Case Studies, K. Satake (ed.), Adv. Natur. Technol. Hazards, 23, 1-30, 2005. Okal, E.A., and Synolakis, C.E., Source discriminants for nearfield tsunamis, Geophysical Journal International, 158, 899-912, 2004. Synolakis, C.E., Imamura, F., Tsuji, Y., Matsutomi, S., Tinti, B., Cook, B., and Ushman, M., Damage, conditions of East Java tsunami of 1994 analyzed, EOS, 76 (26), 257 and 261-262, 1995.
NASA Astrophysics Data System (ADS)
Bernard, E. N.
2014-12-01
As the decade of mega-tsunamis has unfolded with new data, the science of tsunami has advanced at an unprecedented pace. Our responsibility to society should guide the use of these new scientific discoveries to better prepare society for the next tsunami. This presentation will focus on the impacts of the 2004 and 2011 tsunamis and new societal expectations accompanying enhanced funding for tsunami research. A list of scientific products, including tsunami hazard maps, tsunami energy scale, real-time tsunami flooding estimates, and real-time current velocities in harbors will be presented to illustrate society's need for relevant, easy to understand tsunami information. Appropriate use of these tsunami scientific products will be presented to demonstrate greater tsunami resilience for tsunami threatened coastlines. Finally, a scientific infrastructure is proposed to ensure that these products are both scientifically sound and represent today's best practices to protect the scientific integrity of the products as well as the safety of coastal residents.
NASA Astrophysics Data System (ADS)
Spahn, H.; Hoppe, M.; Vidiarina, H. D.; Usdianto, B.
2010-07-01
Five years after the 2004 tsunami, a lot has been achieved to make communities in Indonesia better prepared for tsunamis. This achievement is primarily linked to the development of the Indonesian Tsunami Early Warning System (InaTEWS). However, many challenges remain. This paper describes the experience with local capacity development for tsunami early warning (TEW) in Indonesia, based on the activities of a pilot project. TEW in Indonesia is still new to disaster management institutions and the public, as is the paradigm of Disaster Risk Reduction (DRR). The technology components of InaTEWS will soon be fully operational. The major challenge for the system is the establishment of clear institutional arrangements and capacities at national and local levels that support the development of public and institutional response capability at the local level. Due to a lack of information and national guidance, most local actors have a limited understanding of InaTEWS and DRR, and often show little political will and priority to engage in TEW. The often-limited capacity of local governments is contrasted by strong engagement of civil society organisations that opt for early warning based on natural warning signs rather than technology-based early warning. Bringing together the various actors, developing capacities in a multi-stakeholder cooperation for an effective warning system are key challenges for the end-to-end approach of InaTEWS. The development of local response capability needs to receive the same commitment as the development of the system's technology components. Public understanding of and trust in the system comes with knowledge and awareness on the part of the end users of the system and convincing performance on the part of the public service provider. Both sides need to be strengthened. This requires the integration of TEW into DRR, clear institutional arrangements, national guidance and intensive support for capacity development at local levels as well as dialogue between the various actors.
Analysis of tsunami disaster map by Geographic Information System (GIS): Aceh Singkil-Indonesia
NASA Astrophysics Data System (ADS)
Farhan, A.; Akhyar, H.
2017-02-01
Tsunami risk maps are used by stakeholders as a basis for deciding evacuation plans and for evaluation after a disaster. Disaster maps for the Aceh Singkil district of Aceh, Indonesia, have been developed and analyzed using GIS tools. Overlay methods are used to produce hazard, vulnerability and capacity maps, and finally the disaster risk map. The spatial inputs are topographic maps, the administrative map, and SRTM elevation data. The parameters include social, economic, physical and environmental vulnerability; the number of exposed people; houses, public buildings, critical facilities and productive land; population density, sex ratio, poverty ratio, disability ratio and age-group ratio; and protected forest, natural forest, and mangrove forest. The results show nine villages at a high level of tsunami disaster risk, seventeen at a moderate level, and the remaining villages at a low level.
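A common way to implement such an overlay (a generic sketch with assumed weights, layer names and thresholds, not the study's actual scheme) is a cell-by-cell combination of normalized hazard, vulnerability and capacity rasters into a risk index, which is then binned into low/moderate/high classes.

```python
import numpy as np

def risk_index(hazard, vulnerability, capacity, weights=(0.4, 0.4, 0.2)):
    """Weighted overlay of normalized (0-1) raster layers; higher capacity lowers risk."""
    wh, wv, wc = weights
    return wh * hazard + wv * vulnerability + wc * (1.0 - capacity)

def classify(risk, low=0.33, high=0.66):
    """Bin the continuous index into 0 = low, 1 = moderate, 2 = high risk."""
    return np.digitize(risk, [low, high])

# Toy 3x3 "village" rasters with values already scaled to 0-1
hazard = np.array([[0.9, 0.7, 0.2], [0.8, 0.4, 0.1], [0.6, 0.3, 0.1]])
vulnerability = np.full_like(hazard, 0.5)
capacity = np.full_like(hazard, 0.3)
print(classify(risk_index(hazard, vulnerability, capacity)))
```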
NASA Astrophysics Data System (ADS)
Farreras, Salvador; Ortiz, Modesto; Gonzalez, Juan I.
2007-03-01
The highly vulnerable Pacific southwest coast of Mexico has been repeatedly affected by local, regional and remote-source tsunamis. Mexico presently has no national tsunami warning system in operation. The implementation of key elements of a National Program on Tsunami Detection, Monitoring, Warning and Mitigation is in progress. For the detection and monitoring of local and regional events, a prototype of a robust, low-cost, high-frequency sea-level tsunami gauge, sampling every minute and equipped with 24-hour real-time transmission to the Internet, was developed and is currently in operation. Statistics allow identification of low, medium and extreme hazard categories of arriving tsunamis. These categories are used as prototypes for computer simulations of coastal flooding. A finite-difference numerical model is used, with linear wave theory for deep-ocean propagation, a nonlinear shallow-water formulation for the near shore and the interaction with the coast, and non-fixed boundaries for flooding and recession at the coast. For prevention purposes, tsunami inundation maps for several coastal communities are being produced in this way. The case of the heavily industrialized port of Lázaro Cárdenas, located on the sand shoals of a river delta, is illustrated, including a detailed vulnerability assessment study. For public education on preparedness and awareness, printed material for children and adults has been developed and published. It is intended to extend the future coverage of this program to the Mexican Caribbean and Gulf of Mexico coastal areas.
An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.
2009-12-01
For several years GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS), the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project and capable of acquiring, archiving and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore, not only real-time waveform data but also processing results are routed through GFZ to the attached warning centers. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to the data collection over the Internet, a GFZ VSAT hub for secured data collection from the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established through this backbone. For the Southeast Asia region, a VSAT hub had already been established in Jakarta in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by email and SMS. Manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step for starting a tsunami warning process. However, for the secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.
NASA Astrophysics Data System (ADS)
Vanacore, E. A.; Baez-Sanchez, G.; Huerfano, V.; Lopez, A. M.; Lugo, J.
2017-12-01
The Puerto Rico Seismic Network (PRSN) is an integral part of earthquake and tsunami monitoring in Puerto Rico and the Virgin Islands. The PRSN conducts scientific research as part of the University of Puerto Rico Mayaguez, carries out earthquake monitoring for the region, runs extensive earthquake and tsunami education and outreach programs, and acts as a Tsunami Warning Focal Point Alternate for Puerto Rico. During and in the immediate aftermath of Hurricane Maria, the PRSN duties and responsibilities evolved from those of a seismic network to those of a major information and communications center for the western side of Puerto Rico. Hurricane Maria effectively destroyed most communications on the island, critically between the eastern side of the island, where Puerto Rico Emergency Management's (PREMA) main office and the National Weather Service (NWS) are based, and the western side of the island. Additionally, many local emergency management agencies on the western side of the island lost a satellite-based emergency management information system called EMWIN, which provides critical tsunami and weather information. PRSN's EMWIN system remained functional, and consequently, via this system and radio communications, the PRSN became the only information source for NWS warnings and bulletins, tsunami alerts, and earthquake information for western Puerto Rico. Additionally, given its functional radio and geographic location, the PRSN became a critical communications relay for local emergency management. Here we will present the PRSN response in relation to Hurricane Maria, including the activation of the PRSN devolution plan, the adoption of duties, and experiences and lessons learned for continuity of operations and adoption of responsibilities during future catastrophic events.
NASA Astrophysics Data System (ADS)
Imai, K.; Sugawara, D.; Takahashi, T.
2017-12-01
The large flow caused by a tsunami transports sediment from the beach and forms tsunami deposits on land and in coastal lakes. Tsunami deposits are found especially well preserved and undisturbed in coastal lakes. Okamura & Matsuoka (2012) found tsunami deposits in a field survey of coastal lakes facing the Nankai Trough and identified deposits from the past eight Nankai Trough megathrust earthquakes. The environment in coastal lakes is stably calm and suitable for the preservation of tsunami deposits compared with other topographical settings such as plains. Therefore, the recurrence interval of megathrust earthquakes and tsunamis can potentially be discussed with high resolution. In addition, it has been pointed out that small events that cannot be detected in plains could be separated finely (Sawai, 2012). Various aspects of past tsunamis are expected to be elucidated by considering the topographical conditions of coastal lakes and using the relationship between the erosion-and-sedimentation processes of the lake bottom and the external forcing of the tsunami. In this research, a numerical examination based on a tsunami sediment transport model (Takahashi et al., 1999) was carried out for Ryujin-ike pond, Ohita, Japan, where tsunami deposits have been identified, and a deposit migration analysis was conducted for the tsunami deposit distribution process of historical Nankai Trough earthquakes. Furthermore, tsunami source conditions can potentially be investigated by comparing the observed data with computed tsunami deposit distributions. It is difficult to clarify details of the tsunami source from indistinct information on paleogeographical conditions. However, this result shows that the tsunami source scale can be constrained by combining the tsunami deposit distribution in lakes with computed data.
Itoh, Tomonori; Nakajima, Satoshi; Tanaka, Fumitaka; Nishiyama, Osamu; Matsumoto, Tatsuya; Endo, Hiroshi; Sakai, Toshiaki; Nakamura, Motoyuki; Morino, Yoshihiro
2014-09-01
The aims of this study were to evaluate reperfusion rate, therapeutic time course and in-hospital mortality pre- and post-Japan earthquake disaster, comparing patients with ST-elevation myocardial infarction (STEMI) treated in the inland area or the Tsunami-stricken area of Iwate prefecture. Subjects were 386 consecutive STEMI patients admitted to the four percutaneous coronary intervention (PCI) centers in Iwate prefecture in 2010 and 2011. Patients were divided into two groups: those treated in the inland or Tsunami-stricken area. We compared clinical characteristics, time course and in-hospital mortality in both years in the two groups. PCI was performed in 310 patients (80.3%). Door-to-balloon (D2B) time in the Tsunami-stricken area in 2011 was significantly shorter than in 2010 in patients treated with PCI. However, the rate of PCI performed in the Tsunami-stricken area in March-April 2011 was significantly lower than that in March-April 2010 (41.2% vs 85.7%; p=0.03). In-hospital mortality increased three-fold from 7.1% in March-April 2010 to 23.5% in March-April 2011 in the Tsunami-stricken area. Standardized mortality ratio (SMR) in March-April 2011 in the Tsunami-stricken area was significantly higher than the control SMR (SMR 4.72: 95% confidence interval (CI): 1.77-12.6: p=0.007). The rate of PCI decreased and in-hospital mortality increased immediately after the Japan earthquake disaster in the Tsunami-stricken area. Disorder in hospitals and in the distribution systems after the disaster impacted the clinical care and outcome of STEMI patients. © The European Society of Cardiology 2014.
The Adriatic Sea: A Long-Standing Laboratory for Sea Level Studies
NASA Astrophysics Data System (ADS)
Vilibić, Ivica; Šepić, Jadranka; Pasarić, Mira; Orlić, Mirko
2017-10-01
The paper provides a comprehensive review of all aspects of Adriatic Sea level research covered by the literature. It discusses changes occurring over millennial timescales and documented by a variety of natural and man-made proxies and post-glacial rebound models; mean sea level changes occurring over centennial to annual timescales and measured by modern instruments; and daily and higher-frequency changes (with periods ranging from minutes to a day) that are contributing to sea level extremes and are relevant for present-day flooding of coastal areas. Special tribute is paid to the historic sea level studies that shaped modern sea level research in the Adriatic, followed by a discussion of existing in situ and remote sensing observing systems operating in the Adriatic area, operational forecasting systems for Adriatic storm surges, as well as warning systems for tsunamis and meteotsunamis. Projections and predictions of sea level and related hazards are also included in the review. Based on this review, open issues and research gaps in the Adriatic Sea level studies are identified, as well as the additional research efforts needed to fill the gaps. The Adriatic Sea, thus, remains a laboratory for coastal sea level studies for semi-enclosed, coastal and marginal seas in the world ocean.
Nationwide tsunami hazard assessment project in Japan
NASA Astrophysics Data System (ADS)
Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.
2014-12-01
In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is that aleatory uncertainty predominates in the assessment while the use of epistemic uncertainty is kept to a minimum, because the number of possible combinations of epistemic uncertainties diverges quickly as the number of epistemic uncertainties in the assessment increases; we consider only the type of earthquake occurrence probability distribution as epistemic uncertainty. We briefly outline the nationwide PTHA as follows: (i) we consider all possible earthquakes in the future, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of the earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all initial water surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nested grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate the information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most popular representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in tsunami simulation and earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported as part of the research project on evaluation of hazard and risk of natural disasters, under the direction of the HERP of the Japanese Government.
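To illustrate what a hazard curve of this kind aggregates (a schematic sketch, not the project's method; the source probabilities, the lognormal height variability, and all numbers are assumptions), the annual exceedance probability of a coastal tsunami height h can be accumulated over many sources as P(H > h) = 1 - prod_i (1 - p_i * P_i(H > h)), with each source's conditional exceedance taken from its simulated height plus an uncertainty term.

```python
import math

def hazard_curve_point(h, sources, sigma_ln=0.3):
    """Annual probability that coastal tsunami height exceeds h, aggregated over sources.
    Each source is (annual_probability, simulated_median_height_m); a lognormal term
    stands in for modelling and slip-heterogeneity uncertainty."""
    def cond_exceed(median):
        z = (math.log(h) - math.log(median)) / sigma_ln
        return 0.5 * math.erfc(z / math.sqrt(2.0))     # P(H > h | source occurs)
    p_no_exceed = 1.0
    for p_src, median in sources:
        p_no_exceed *= 1.0 - p_src * cond_exceed(median)
    return 1.0 - p_no_exceed

# Toy source set: (annual probability, median simulated height at one coastal point)
sources = [(1.0 / 100.0, 2.0), (1.0 / 500.0, 6.0), (1.0 / 1000.0, 12.0)]
for h in (1.0, 3.0, 10.0):
    print(h, hazard_curve_point(h, sources))
```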