Science.gov

Sample records for verifying seismic design

  1. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, those schemes are not true designated verifier signature schemes in the traditional sense, because the designated person cannot efficiently simulate a signature that is indistinguishable from the signer's. This shortcoming fails to meet the requirements of special environments such as e-voting, calls for tenders, and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. Based on the properties of unitary transformations and a quantum one-way function, only the verifier designated by the signer can verify the validity of a signature, and the designated verifier cannot prove to a third party that the signature was produced by the signer rather than by himself, owing to a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of the scheme. Analysis shows that this new scheme satisfies the main security requirements of designated verifier signature schemes and resists the major attack strategies.

  2. The seismic design handbook

    SciTech Connect

Naeim, F.

    1989-01-01

    This book contains papers on the planning, analysis, and design of earthquake resistant building structures. Theories and concepts of earthquake resistant design and their implementation in seismic design practice are presented.

  3. Position paper: Seismic design criteria

    SciTech Connect

    Farnworth, S.K.

    1995-05-22

The purpose of this paper is to document the seismic design criteria to be used in the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies behind the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  4. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

This paper presents experiences from verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  5. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.

  6. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language, HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S with respect to these criteria are examined and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  7. Simplified seismic performance assessment and implications for seismic design

    NASA Astrophysics Data System (ADS)

    Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

    2014-08-01

The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios including different structural configurations.
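As a rough illustration of the substitute-structure step that underpins displacement-based assessment, the sketch below collapses an MDOF displaced shape into equivalent single-degree-of-freedom quantities using the standard DDBD relations. It is a simplification of the method reviewed in the paper; the function name and inputs are illustrative.

```python
def equivalent_sdof(masses, displacements, heights):
    """Standard DDBD substitute-structure relations: given story masses,
    an assumed displaced shape, and story heights, return the system
    design displacement, effective mass, and effective height."""
    md = [m * d for m, d in zip(masses, displacements)]
    d_sys = sum(m * d * d for m, d in zip(masses, displacements)) / sum(md)
    m_eff = sum(md) / d_sys                                    # effective (participating) mass
    h_eff = sum(x * h for x, h in zip(md, heights)) / sum(md)  # effective height
    return d_sys, m_eff, h_eff
```

For a uniform three-story frame with a linear displaced shape, the effective height lands around two-thirds to three-quarters of the total height, as expected for frame structures.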

  8. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

This manual discusses building and building-component behavior during earthquakes, and provides suggested details for seismic resistance which experience has shown to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  9. Seismic design guidelines for highway bridges

    NASA Astrophysics Data System (ADS)

    Mayes, R. L.; Sharpe, R. L.

    1981-10-01

    Guidelines for the seismic design of highway bridges are given. The guidelines are the recommendations of a team of nationally recognized experts which included consulting engineers, academicians, State highway, and Federal agency representatives from throughout the United States. The guidelines are comprehensive in nature and they embody several new concepts which are significant departures from existing design provisions. An extensive commentary documenting the basis for the guidelines and an example demonstrating their use are included. A draft of the guidelines was used to seismically redesign twenty-one bridges. A summary of the redesigns is included.

  10. Recent reliable observations and improved tests on synthetic catalogs with spatiotemporal clustering verify precursory decelerating-accelerating seismicity

    NASA Astrophysics Data System (ADS)

    Karakaisis, G. F.; Papazachos, C. B.; Scordilis, E. M.

    2013-07-01

We examined the seismic activity which preceded six strong mainshocks that occurred in the Aegean (M = 6.4-6.9, 33-43° N, 19-28° E) and two strong mainshocks that occurred in California (M = 6.5-7.1, 32-41° N, 115-125° W) during 1995-2010. We find that each of these eight mainshocks has been preceded by a pronounced decelerating and an equally easily identifiable accelerating seismic sequence with the time to the mainshock. The two preshock sequences of each mainshock occurred in separate space, time, and magnitude windows. In all eight cases, very low decelerating seismicity, as well as very low accelerating seismicity, is observed around the actual epicenter of the ensuing mainshock. Statistical tests on the observed measures of decelerating (q_d) and accelerating (q_a) seismicity against similar measures calculated using synthetic catalogs with spatiotemporal clustering based on the ETAS model show that there is an almost zero probability for each one of the two preshock sequences which preceded each of the eight mainshocks to be random. These results support the notion that every strong shallow mainshock is preceded by a decelerating and an accelerating seismic sequence with predictive properties for the ensuing mainshock.
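Decelerating-accelerating patterns of this kind are commonly quantified by fitting the cumulative Benioff strain with a power law of the time to the mainshock and comparing it against a linear fit. The sketch below implements that generic curvature measure (a Bowman-style C parameter), not the authors' q_d and q_a measures, whose exact definitions are not given in the abstract; the energy-magnitude relation and the fixed power-law exponent are common assumptions.

```python
import numpy as np

def benioff_strain(magnitudes):
    """Cumulative Benioff strain: running sum of the square root of
    seismic energy, using log10(E) = 1.5*M + 4.8 (a common relation)."""
    energy = 10.0 ** (1.5 * np.asarray(magnitudes, dtype=float) + 4.8)
    return np.cumsum(np.sqrt(energy))

def curvature_parameter(times, magnitudes, t_failure, m=0.3):
    """C = RMS(power-law fit) / RMS(linear fit) of the strain history;
    C well below 1 flags accelerating precursory seismicity."""
    times = np.asarray(times, dtype=float)
    s = benioff_strain(magnitudes)
    # Linear fit of strain vs. time
    lin = np.polyfit(times, s, 1)
    rms_lin = np.sqrt(np.mean((np.polyval(lin, times) - s) ** 2))
    # Power-law fit S(t) = A + B * (t_failure - t)**m, exponent fixed
    x = (t_failure - times) ** m
    pl = np.polyfit(x, s, 1)
    rms_pl = np.sqrt(np.mean((np.polyval(pl, x) - s) ** 2))
    return rms_pl / rms_lin
```

In practice this statistic would be compared against the same quantity computed on ETAS-type synthetic catalogs, as the abstract describes.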

  11. The Relationship Between Verified Organ Donor Designation and Patient Demographic and Medical Characteristics.

    PubMed

    Sehgal, N K R; Scallan, C; Sullivan, C; Cedeño, M; Pencak, J; Kirkland, J; Scott, K; Thornton, J D

    2016-04-01

    Previous studies on the correlates of organ donation consent have focused on self-reported willingness to donate and on self-reported medical suitability to donate. However, these may be subject to social desirability bias and inaccurate assessments of medical suitability. The authors sought to overcome these limitations by directly verifying donor designation on driver's licenses and by abstracting comorbid conditions from electronic health records. Using a cross-sectional study design, they reviewed the health records of 2070 randomly selected primary care patients at a large urban safety-net medical system to obtain demographic and medical characteristics. They also examined driver's licenses that were scanned into electronic health records as part of the patient registration process for donor designation. Overall, 943 (46%) patients were designated as a donor on their driver's license. On multivariate analysis, donor designation was positively associated with age 35-54 years, female sex, nonblack race, speaking English or Spanish, being employed, having private insurance, having an income >$45 000, and having fewer comorbid conditions. These demographic and medical characteristics resulted in patient subgroups with donor designation rates ranging from 21% to 75%. In conclusion, patient characteristics are strongly related to verified donor designation. Further work should tailor organ donation efforts to specific subgroups. PMID:26603147

  12. Tritium glovebox stripper system seismic design evaluation

    SciTech Connect

    Grinnell, J. J.; Klein, J. E.

    2015-09-01

The use of glovebox confinement at US Department of Energy (DOE) tritium facilities has been discussed in numerous publications. Glovebox confinement protects workers from radioactive material (especially tritium oxide), provides an inert atmosphere for the prevention of flammable gas mixtures and deflagrations, and allows recovery of tritium released from the process into the glovebox when a glovebox stripper system (GBSS) is part of the design. Tritium recovery from the glovebox atmosphere reduces emissions from the facility and the radiological dose to the public. Locating US DOE defense programs facilities away from public boundaries also helps reduce radiological doses to the public. This study is based on design concepts and identifies issues and considerations for the design of a seismic GBSS. The safety requirements and analysis presented here should be considered preliminary; safety requirements for the design of a GBSS should be developed and finalized as part of the final design process.

  13. Feasibility study and verified design concept for new improved hot gas facility

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The MSFC Hot Gas Facility (HGF) was fabricated in 1975 as a temporary facility to provide immediate turnaround testing to support the SRB and ET TPS development. This facility proved to be very useful and was used to make more than 1300 runs, far more than ever intended in the original design. Therefore, it was in need of constant repair and needed to be replaced with a new improved design to support the continuing SRB/ET TPS product improvement and/or removal efforts. MSFC contracted with Lockheed-Huntsville to work on this improved design through contract NAS8-36304 Feasibility Study and Verified Design Concept for the New Improved Hot Gas Facility. The results of Lockheed-Huntsville's efforts under this contract are summarized.

  14. Fiber designs for micro-seismic sensing

    NASA Astrophysics Data System (ADS)

    Gillooly, Andy M.; Hill, Mark D.

    2013-05-01

    Acrylate and polyimide coatings are found to have a suitable modulus for micro-seismic sensors whilst carbon coatings are too hard and inelastic for reliable use in this application. Fiber cladding designs can be optimized for mechanical reliability by using 80μm or 50μm cladding diameters and the numerical aperture (NA) increased to give low bend losses. To reduce splice losses, a bridging fiber has been developed, capable of reducing splice losses between telecoms fibers and reduced cladding diameter high NA sensor fibers by <50%.

  15. Guidelines for the seismic design of fire protection systems

    SciTech Connect

    Benda, B.; Cushing, R.; Driesen, G.E.

    1991-12-31

    The engineering knowledge gained from earthquake experience data surveys of fire protection system components is combined with analytical evaluation results to develop guidelines for the design of seismically rugged fire protection distribution piping. The seismic design guidelines of the National Fire Protection Association Standard NFPA-13 are reviewed, augmented, and summarized to define an efficient method for the seismic design of fire protection piping systems. 8 refs.

  16. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    NASA Astrophysics Data System (ADS)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  17. Verifying single-station seismic approaches using Earth-based data: Preparation for data return from the InSight mission to Mars

    NASA Astrophysics Data System (ADS)

    Panning, Mark P.; Beucler, Éric; Drilleau, Mélanie; Mocquet, Antoine; Lognonné, Philippe; Banerdt, W. Bruce

    2015-03-01

    The planned InSight mission will deliver a single seismic station containing 3-component broadband and short-period sensors to the surface of Mars in 2016. While much of the progress in understanding the Earth and Moon's interior has relied on the use of seismic networks for accurate location of sources, single station approaches can be applied to data returned from Mars in order to locate events and determine interior structure. In preparation for the data return from InSight, we use a terrestrial dataset recorded at the Global Seismic Network station BFO, located at the Black Forest Observatory in Germany, to verify an approach for event location and structure determination based on recordings of multiple orbit surface waves, which will be more favorable to record on Mars than Earth due to smaller planetary radius and potentially lower background noise. With this approach applied to events near the threshold of observability on Earth, we are able to determine epicentral distance within approximately 1° (corresponding to ∼60 km on Mars), and origin time within ∼30 s. With back azimuth determined from Rayleigh wave polarization, absolute locations are determined generally within an aperture of 10°, allowing for localization within large tectonic regions on Mars. With these locations, we are able to recover Earth mantle structure within ±5% (the InSight mission requirements for martian mantle structure) using 1D travel time inversions of P and S travel times for datasets of only 7 events. The location algorithm also allows for the measurement of great-circle averaged group velocity dispersion, which we measure between 40 and 200 s to scale the expected reliable frequency range of the InSight data from Earth to Mars data. Using the terrestrial data, we are able to resolve structure down to ∼200 km, but synthetic tests demonstrate we should be able to resolve martian structure to ∼400 km with the same frequency content given the smaller planetary size.
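The arithmetic behind multiple-orbit location can be sketched as follows: for Rayleigh-wave trains R1, R2 and R3 with a single group velocity U and planetary circumference L, the arrival times satisfy t1 = t0 + D/U, t2 = t0 + (L - D)/U and t3 = t0 + (L + D)/U, so three arrivals determine distance, origin time and velocity. This is a toy version, assuming a single non-dispersive velocity, of the approach verified in the paper (which also uses dispersion measurements and polarization back azimuths).

```python
def locate_from_orbits(t1, t2, t3, circumference):
    """Given arrival times (s) of the R1, R2 and R3 surface-wave trains
    and the planetary circumference L (km), return the epicentral
    distance D (km), origin time t0 (s), and group velocity U (km/s).

    Uses t3 - t1 = L/U (one full extra orbit between R1 and R3),
    then t2 - t1 = (L - 2*D)/U to recover the distance."""
    U = circumference / (t3 - t1)                 # great-circle group velocity
    D = (circumference - U * (t2 - t1)) / 2.0     # epicentral distance
    t0 = t1 - D / U                               # origin time
    return D, t0, U
```

With hypothetical numbers (L = 21000 km, U = 4 km/s, D = 3000 km, t0 = 100 s) the three synthetic arrivals invert exactly back to the input source parameters.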

  18. Cost reduction through improved seismic design

    SciTech Connect

    Severud, L.K.

    1984-01-01

During the past decade, many significant seismic technology developments have been accomplished by United States Department of Energy (USDOE) programs. Both base technology and major projects, such as the Fast Flux Test Facility (FFTF) and the Clinch River Breeder Reactor (CRBR) plant, have contributed to seismic technology development and validation. Improvements have come in the areas of ground motion definitions, soil-structure interaction, and structural analysis methods and criteria for piping, equipment, components, reactor core, and vessels. Examples of some of these lessons learned and technology developments are provided. Then, the highest-priority seismic technology needs achievable through DOE actions and sponsorship are identified and discussed. Satisfaction of these needs is expected to make important contributions toward cost avoidance and reduced capital costs of future liquid metal nuclear plants. 23 references, 12 figures.

  19. Prevention of seismic damages in telescope design

    NASA Astrophysics Data System (ADS)

    Perrotta, F.; Schipani, P.; Martelli, F.; Parodi, G.; Ottolini, M.

Some of the best astronomical sites are unfortunately located in potentially seismic areas. An appropriate study to evaluate the dynamic forces acting on telescope optics is therefore crucial to protect them from damage in case of earthquakes. We present a procedure to estimate the response of the VLT Survey Telescope (VST) primary mirror to a Maximum Likely Earthquake (MLE) at the European Southern Observatory (ESO) site of Cerro Paranal, Northern Chile.

  20. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  1. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

The hazard and risk perspective in research on induced seismicity usually focuses on how to reduce the occurrence of induced earthquakes. However, it is also well accepted that shear dilatancy accompanied by seismic energy radiation is a required process for reservoir creation in low-permeability rock. Assessment of induced seismic hazard for a planned stimulation experiment must therefore take into account the target reservoir properties. We present a generic modelling study in which induced seismic hazard can be analysed in balance with the permeability enhancement and the size of the stimulated reservoir. The model has two coupled components: 1) a flow model that solves the pressure diffusion equations, and 2) a stochastic seismicity model, which uses the transient pressure disturbances to trigger seismic events at so-called seed points. At triggering, a magnitude is randomly drawn from a Gutenberg-Richter distribution with a local b-value that depends on the stress state at the seed point. In the source area of the events the permeability is increased depending on the amount of slip, but only by a maximum factor of 200. Due to the stochastic nature of the modelling approach, a representative number of 500 model realizations are computed. The results demonstrate that planning and control of reservoir engineering operations may be compromised by the considerable variability of maximum observed magnitude, reservoir size, b-value and seismogenic index arising from the intrinsically random nature of induced seismicity. We also find that injection volume has the highest impact on both reservoir size and seismic hazard, while changing injection rate and strategy at constant final injection volume has a negligible effect. However, the impact of site-specific parameters on seismicity and reservoir properties is greater than the volume effect. In particular, conditions that lead to high b-values - for instance a low differential stress level - have a positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level, if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.
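A minimal sketch of the stochastic seismicity component described above: magnitudes drawn from a truncated Gutenberg-Richter distribution with a local b-value, and a slip-dependent permeability update capped at a factor of 200. The sampling is standard inverse-CDF; the slip-to-permeability law and its parameters are hypothetical placeholders, not the authors' formulation.

```python
import math
import random

def draw_magnitude(b, m_min=0.0, m_max=5.0):
    """Inverse-CDF sample from a truncated Gutenberg-Richter
    (exponential) magnitude distribution with slope b."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - random.random() * c) / beta

def enhance_permeability(k, slip, k0, alpha=50.0, cap=200.0):
    """Grow permeability with co-seismic slip (m), but never beyond
    cap times the initial value k0 (the factor-200 limit in the paper).
    The linear growth law and alpha are illustrative assumptions."""
    return min(k * (1.0 + alpha * slip), cap * k0)
```

Note how a higher b-value shifts the sampled magnitudes toward smaller events, which is exactly why, per the abstract, high-b conditions lower the hazard but slow permeability creation.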

  2. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  3. Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings

    NASA Astrophysics Data System (ADS)

    Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

    2012-09-01

Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and to studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage is still reported for newly designed RC buildings. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB, and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analysis using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for the immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three frames designed according to the 2010 CCSDB meet the seismic requirements, and that the frames are at almost the same safety level.
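A PSDM of the kind described is typically fit as a power law in log space ("cloud analysis"), with fragility curves then following from a lognormal assumption. The sketch below shows that generic recipe; the parameter names and the capacity input are illustrative, not taken from the paper.

```python
import math
import numpy as np

def fit_psdm(im, edp):
    """Cloud-analysis PSDM: fit ln(EDP) = ln(a) + b*ln(IM) + eps by
    least squares; return intercept ln(a), slope b, and dispersion."""
    b, ln_a = np.polyfit(np.log(im), np.log(edp), 1)
    resid = np.log(edp) - (ln_a + b * np.log(im))
    return ln_a, b, float(np.std(resid))

def fragility(im, ln_a, b, beta, capacity):
    """P(demand >= capacity | IM) under the lognormal assumption,
    via the standard normal CDF expressed with erf."""
    z = (ln_a + b * math.log(im) - math.log(capacity)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Each damage level (immediate occupancy, significant damage, collapse prevention) would supply its own capacity value, producing one fragility curve per level.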

  4. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety at the pipeline engineering site. Different from a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan Ms 8.0 earthquake is introduced as an example.
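Empirical relationships of the kind mentioned are usually log-linear regressions of maximum fault displacement on surface-wave magnitude (the Wells-and-Coppersmith form). The sketch below fits and applies such a relation; the coefficients come entirely from whatever data is supplied, and the paper's East Asia coefficients are not reproduced here.

```python
import numpy as np

def fit_displacement_relation(ms, max_disp):
    """Least-squares fit of log10(D_max) = a + b * Ms, where Ms is the
    surface-wave magnitude and D_max the maximum fault displacement (m)."""
    b, a = np.polyfit(ms, np.log10(max_disp), 1)
    return a, b

def predict_displacement(a, b, ms):
    """Design-basis displacement estimate for a given magnitude."""
    return 10.0 ** (a + b * ms)
```

Fitting on a magnitude-displacement catalog and then evaluating `predict_displacement` at the class-specific design magnitude gives the design-basis fault displacement for each pipe class.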

  5. Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy

    NASA Astrophysics Data System (ADS)

    Ramanathan, Karthik Narayan

Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial in pre-earthquake planning and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes, such as traffic disruption of the overall highway system, impact on the region's economy, and post-earthquake response and recovery, and more recently serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (events that brought about changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models are developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loadings. Fragility curves were then developed considering the vulnerability of multiple components, thereby helping to quantify the performance of highway bridge networks and to study the impact of seismic design principles on performance within a bridge class.
This not only leads to the development of fragility relations that are unique and better suited for bridges in California, but also leads to the creation of better bridge classes and sub-bins that have more consistent performance characteristics than those currently provided by the National Bridge Inventory. Another important feature of this research is associated with the development of damage state definitions and grouping of bridge components in a way that they have similar consequences in terms of repair and traffic implications following a seismic event. These definitions are in alignment with the California Department of Transportation’s design and operational experience, thereby enabling better performance assessment, emergency response, and management in the aftermath of a seismic event. The fragility curves developed as a part of this research will be employed in ShakeCast, a web-based post-earthquake situational awareness application that automatically retrieves earthquake shaking data and generates potential damage assessment notifications for emergency managers and responders.

  6. A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1992-01-01

    Schneider demonstrates that many fault-tolerant clock synchronization algorithms can be represented as refinements of a single proven-correct paradigm. Shankar provides a mechanical proof that Schneider's schema achieves Byzantine fault-tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault-tolerant midpoint convergence function satisfies three of the constraints. A hardware design implementing the fault-tolerant midpoint function is presented and shown to satisfy the remaining constraints. The synchronization circuit will recover completely from transient faults provided the maximum fault assumption is not violated. The initialization protocol for the circuit also provides a recovery mechanism from total system failure caused by correlated transient faults.
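    The fault-tolerant midpoint convergence function mentioned above has a compact algorithmic form: discard the f most extreme readings at each end, then average the extremes of what remains. A sketch in Python (purely illustrative; the record describes a hardware implementation):

```python
def ft_midpoint(readings, f):
    """Fault-tolerant midpoint convergence function: discard the f lowest
    and f highest clock readings, then take the midpoint of the extremes
    of the remaining values. Tolerates up to f Byzantine-faulty clocks."""
    if len(readings) <= 2 * f:
        raise ValueError("need more than 2f clock readings")
    s = sorted(readings)
    trimmed = s[f:len(s) - f]
    return (trimmed[0] + trimmed[-1]) / 2

# Four clocks, at most one faulty (f = 1): the wild value is discarded
print(ft_midpoint([100, 102, 103, 500], f=1))  # 102.5
```

    Because a faulty clock can only report a value that either gets trimmed or lies between two non-faulty readings, the result stays bounded by the good clocks' values.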

  7. Review of seismicity and ground motion studies related to development of seismic design at SRS

    SciTech Connect

    Stephenson, D.E.; Acree, J.R.

    1992-08-01

    The NRC response spectra developed in Reg. Guide 1.60 are being used in the studies related to restarting the existing Savannah River Site (SRS) reactors. Because this shape envelops all the other site-specific spectra which have been developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this value or at these probability levels. This spectral shape is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located across the Savannah River south of the SRS. This report provides a summary of the database used to develop the design basis earthquake, including the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided of the studies performed and methodologies used to establish the design basis earthquake for SRS. The ground motion response spectra developed from the various studies are also summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. The programs for resolving outstanding issues are discussed and conclusions are presented.

  8. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  9. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  10. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  11. Salt Repository Project input to seismic design: Revision 0. [Contains Glossary]

    SciTech Connect

    Not Available

    1987-12-01

    The Salt Repository Program (SRP) Input to Seismic Design (ISD) documents the assumptions, rationale, approaches, judgments, and analyses that support the development of seismic-specific data and information to be used for shaft design in accordance with the SRP Shaft Design Guide (SDG). The contents of this document are divided into four subject areas: (1) seismic assessment, (2) stratigraphy and material properties for seismic design, (3) development of seismic design parameters, and (4) host media stability. These four subject areas have been developed considering expected conditions at a proposed site in Deaf Smith County, Texas. The ISD should be used only in conjunction with seismic design of the exploratory and repository shafts. Seismic design considerations relating to surface facilities are not addressed in this document. 54 refs., 55 figs., 18 tabs.

  12. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  13. A New Event Detector Designed for the Seismic Research Observatories

    USGS Publications Warehouse

    Murdock, James N.; Hutt, Charles R.

    1983-01-01

    A new short-period event detector has been implemented on the Seismic Research Observatories. For each signal detected, a printed output gives estimates of the time of onset of the signal, direction of the first break, quality of onset, period and maximum amplitude of the signal, and an estimate of the variability of the background noise. On the SRO system, the new algorithm runs ~2.5x faster than the former (power level) detector. This increase in speed is due to the design of the algorithm: all operations can be performed by simple shifts, additions, and comparisons (floating point operations are not required). Even though a narrow-band recursive filter is not used, the algorithm appears to detect events competitively with those algorithms that employ such filters. Tests at Albuquerque Seismological Laboratory on data supplied by Blandford suggest performance commensurate with the on-line detector of the Seismic Data Analysis Center, Alexandria, Virginia.
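    A detector restricted to shifts, additions, and comparisons, as described above, can be sketched as an integer-only short-term/long-term average trigger. The structure below is in the spirit of that design but is not the published SRO algorithm, and the parameter values are illustrative:

```python
def detect(samples, sta_shift=3, lta_shift=6, ratio_shift=2):
    """Integer-only event trigger: short- and long-term rectified-amplitude
    averages maintained with shifts and adds only (no floating point).
    The accumulators hold the average scaled by 2**shift."""
    sta_acc = lta_acc = 0
    triggers = []
    for i, x in enumerate(samples):
        a = x if x >= 0 else -x                  # rectified amplitude
        sta_acc += a - (sta_acc >> sta_shift)    # short-term avg * 2**sta_shift
        lta_acc += a - (lta_acc >> lta_shift)    # long-term avg * 2**lta_shift
        sta = sta_acc >> sta_shift
        lta = lta_acc >> lta_shift
        # Declare an event when STA exceeds LTA * 2**ratio_shift, after warm-up
        if i > (1 << lta_shift) and sta > (lta << ratio_shift):
            triggers.append(i)
    return triggers

# Steady background of +/-16 counts, then an onset of 1024 counts
quiet, burst = [16, -16] * 100, [1024] * 20
onsets = detect(quiet + burst)
print(onsets[0])  # 200, the index where the burst begins
```

    Replacing divisions with right shifts is what lets such a detector run entirely in fixed-point hardware or slow processors.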

  14. Seismic isolation systems designed with distinct multiple frequencies

    SciTech Connect

    Wu, Ting-shu; Seidensticker, R.W.

    1991-01-01

    Two systems for seismic base isolation are presented. The main feature of these systems is that, instead of only one isolation frequency as in conventional isolation systems, they are designed to have two distinct isolation frequencies. When the responses during an earthquake exceed the design value(s), the system will automatically and passively shift to the second isolation frequency. Responses of these two systems to different ground motions, including a harmonic motion with a frequency equal to the primary isolation frequency, show that no excessive amplification will occur. Adoption of these new systems will greatly enhance the safety and reliability of an isolated superstructure against future strong earthquakes. 3 refs.

  15. Seismic isolation systems designed with distinct multiple frequencies

    SciTech Connect

    Wu, Ting-shu; Seidensticker, R.W.

    1991-12-31

    Two systems for seismic base isolation are presented. The main feature of these systems is that, instead of only one isolation frequency as in conventional isolation systems, they are designed to have two distinct isolation frequencies. When the responses during an earthquake exceed the design value(s), the system will automatically and passively shift to the second isolation frequency. Responses of these two systems to different ground motions, including a harmonic motion with a frequency equal to the primary isolation frequency, show that no excessive amplification will occur. Adoption of these new systems will greatly enhance the safety and reliability of an isolated superstructure against future strong earthquakes. 3 refs.
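    The isolation frequencies discussed in this record follow from the familiar single-degree-of-freedom relation f = sqrt(k/m)/(2*pi). A sketch with hypothetical stiffness and mass values (not taken from the paper):

```python
from math import pi, sqrt

def isolation_frequency(k, m):
    """Natural frequency in Hz of an isolated mass: f = sqrt(k/m) / (2*pi)."""
    return sqrt(k / m) / (2.0 * pi)

# Hypothetical 1.0e6 kg superstructure on bearings with two stiffness states
m = 1.0e6                                # kg
k_primary, k_secondary = 9.87e6, 2.47e6  # N/m, illustrative values only
print(round(isolation_frequency(k_primary, m), 2))    # primary isolation frequency
print(round(isolation_frequency(k_secondary, m), 2))  # lower frequency after the shift
```

    Shifting to a lower stiffness lengthens the isolation period, moving the system further from the energetic band of most ground motions.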

  16. Seismicity and seismic response of the Soviet-designed VVER (Water-cooled, Water moderated Energy Reactor) reactor plants

    SciTech Connect

    Ma, D.C.; Gvildys, J.; Wang, C.Y.; Spencer, B.W.; Sienicki, J.J.; Seidensticker, R.W.; Purvis, E.E. III

    1989-01-01

    On March 4, 1977, a strong earthquake occurred at Vrancea, Romania, about 350 km from the Kozloduy plant in Bulgaria. Subsequent to this event, construction of unit 2 of the Armenia plant was delayed over two years while seismic features were added. On December 7, 1988, another strong earthquake struck northwest Armenia about 90 km north of the Armenia plant. Extensive damage of residential and industrial facilities occurred in the vicinity of the epicenter. The earthquake did not damage the Armenia plant. Following this event, the Soviet government announced that the plant would be shut down permanently by March 18, 1989, and the station converted to a fossil-fired plant. This paper presents the results of the seismic analyses of the Soviet-designed VVER (Water-cooled, Water-moderated Energy Reactor) plants. Also presented is information concerning seismicity in the regions where VVERs are located and information on the seismic design of VVERs. The reference units are the VVER-440 model V230 (similar to the two units of the Armenia plant) and the VVER-1000 model V320 units at Kozloduy in Bulgaria. This document provides an initial basis for understanding the seismicity and seismic response of VVERs under seismic events. 1 ref., 9 figs., 3 tabs.

  17. Study of seismic design bases and site conditions for nuclear power plants

    SciTech Connect

    Not Available

    1980-04-01

    This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

  18. An Alternative Approach to "Identification of Unknowns": Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria.

    PubMed

    Martinez-Vaz, Betsy M; Denny, Roxanne; Young, Nevin D; Sadowsky, Michael J

    2015-12-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were significantly higher than pretest scores (p ≤ 0.001). Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement, with a p value of less than 0.001. PMID:26753033

  19. An Alternative Approach to “Identification of Unknowns”: Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria†

    PubMed Central

    Martinez-Vaz, Betsy M.; Denny, Roxanne; Young, Nevin D.; Sadowsky, Michael J.

    2015-01-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students’ knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students’ verification protocols and presentations. Posttest scores were significantly higher than pretest scores (p ≤ 0.001). Normalized learning gains (G) showed an improvement of students’ knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement, with a p value of less than 0.001. PMID:26753033
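    The normalized learning gains (G) reported in this record follow Hake's definition: the fraction of the possible improvement actually achieved. The pre/post scores below are hypothetical, chosen only to reproduce one of the reported values:

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized learning gain: G = (post - pre) / (100 - pre),
    i.e. the realized fraction of the maximum possible improvement."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages: pretest 40%, posttest 67.6%
print(round(normalized_gain(40.0, 67.6), 2))  # 0.46, matching the reported LO4 gain
```

    A class starting at 40% would therefore need a posttest average near 68% to produce the G = 0.46 reported for LO4.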

  20. A preliminary study on seismic design criteria of offshore platforms in Bohai Sea of China

    NASA Astrophysics Data System (ADS)

    Peng, Yanju; Lu, Yuejun; Yan, Jiaquan; Tang, Rongyu; Wang, Junqin; Li, Jiagang

    2010-06-01

    This paper analyzes the seismicity in the Bohai Sea, introducing a shape factor K to characterize the seismic risk distribution in sub-regions of the sea. Based on the seismic design ground motions for 46 platforms located in the Bohai Sea, a statistical analysis was performed for different peak ground acceleration (PGA) ratios at two different probability levels. In accordance with the two-stage design method, a scheme of two seismic design levels is proposed, and two seismic design objectives are established respectively for the strength-level earthquake and the ductility-level earthquake. By analogy with and comparison to the Chinese seismic design code for buildings, it is proposed that the strength-level earthquake and ductility-level earthquake have return periods of 200 years and 1000-2500 years, respectively. The validity of these proposed values is discussed. Finally, the PGAs corresponding to these two probability levels are calculated for different sub-regions of the Bohai Sea.
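    Return-period levels like the 200-year and 1000-2500-year values proposed above map to exceedance probabilities under the usual Poisson occurrence assumption, P = 1 - exp(-t/T). A sketch of that conversion:

```python
from math import exp, log

def exceedance_prob(return_period_yr, exposure_yr):
    """Poisson probability of at least one exceedance during the exposure time."""
    return 1.0 - exp(-exposure_yr / return_period_yr)

def return_period(prob, exposure_yr):
    """Return period implied by an exceedance probability over the exposure time."""
    return -exposure_yr / log(1.0 - prob)

# The proposed strength-level earthquake (200-yr return period) over a 50-yr life
print(round(exceedance_prob(200, 50), 3))  # 0.221
# The familiar 10%-in-50-years design level
print(round(return_period(0.10, 50)))      # 475
```

    The 475-year value is the return period behind the common "10% probability of exceedance in 50 years" design criterion, which is why it appears so often in building codes.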

  1. Design and application of an electromagnetic vibrator seismic source

    USGS Publications Warehouse

    Haines, S.S.

    2006-01-01

    Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. In order to address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to approximately 10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well suited to teaching applications, and to surveys requiring a precisely repeatable source signature.

  2. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    SciTech Connect

    Not Available

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. Effects of current regulatory requirements for piping design are evaluated, and recommendations for immediate licensing action, changes in existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  3. Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads

    SciTech Connect

    Klamerus, E.W.; Bohn, M.P.; Johnson, J.J.; Asfura, A.P.; Doyle, D.J.

    1994-02-01

    Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

  4. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    SciTech Connect

    Yoshida, Nozomu

    2008-07-08

    The common engineering assumption that the incident wave is the same across a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, which indicates that the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface simultaneously (continuous analysis) differs from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and the predominant periods at higher orders do not match between the upper and lower ground in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which indicates that the conventional engineering seismic base layer does not work as expected by the term 'base'. All these results indicate that waves that travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.
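    The non-zero reflection at the engineering seismic base layer follows directly from the impedance contrast across the interface. A sketch using the normal-incidence reflection coefficient (the material values are hypothetical, and sign conventions for R vary between texts):

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient at the interface between
    layer 1 and layer 2, in terms of acoustic impedances Z = rho * v
    (one common convention: R = (Z2 - Z1) / (Z2 + Z1))."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# Illustrative soft surface soil over a stiffer base layer (t/m^3, m/s)
r = reflection_coefficient(1.8, 200.0, 2.0, 400.0)
print(round(r, 3))  # 0.379: a substantial fraction of the wave is reflected
```

    Since R vanishes only when the impedances match, treating the base layer as a perfectly transmitting "base" is exactly the approximation the record argues against.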

  5. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit in the seismic design standards currently in place around the world is the assumption that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states is set, which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early 1990s the seismic engineering community has come to recognise numerous fundamental shortcomings in the seismic design procedures of modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings.
The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value, (ii) structural analysis to estimate the global structural response, given a certain value of seismic intensity, (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response, (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would have to go through a massive learning curve in order to apply the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from what are largely deterministic seismic design procedures in current codes, to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
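    In the simplest scalar case, stages (i)-(iii) combine into a numerical integration of a fragility curve against the seismic hazard curve to give a mean annual frequency of exceeding a damage state. A sketch with made-up hazard and fragility inputs (this is an illustration of the idea, not the PBEE framework's actual implementation):

```python
from math import erf, log, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def annual_freq_of_damage(hazard, theta, beta):
    """Mean annual frequency of exceeding a damage state: integrate a
    lognormal fragility (median theta, dispersion beta) against the
    hazard curve, given as (im, annual exceedance frequency) pairs."""
    total = 0.0
    for k in range(len(hazard) - 1):
        im_lo, lam_lo = hazard[k]
        im_hi, lam_hi = hazard[k + 1]
        im_mid = 0.5 * (im_lo + im_hi)
        p_fail = phi(log(im_mid / theta) / beta)
        total += p_fail * (lam_lo - lam_hi)  # frequency of events in this IM bin
    return total

# Hypothetical hazard curve (PGA in g, annual exceedance frequency) and fragility
hazard = [(0.1, 1e-1), (0.2, 2e-2), (0.4, 3e-3), (0.8, 3e-4), (1.6, 2e-5)]
lam = annual_freq_of_damage(hazard, theta=0.5, beta=0.4)
print(f"{lam:.1e}")  # a few exceedances per thousand years for these inputs
```

    Stage (iv) then repeats this kind of integration over many components and damage states, weighting each by its repair-cost consequence.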

  6. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Youngs, R.R.; Costantino, C.J.; Miller, L.F.

    2008-07-01

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. 
The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria. The characterization and analysis effort included 1) downhole measurements of the velocity properties (including uncertainties) of the basalt/interbed sequences, 2) confirmation of the geometry of the contact between the various basalt and interbedded sediments through examination of retrieved core from the core-hole and data collected through geophysical logging of each borehole, and 3) prediction of ground motion response to an earthquake using newly acquired and historic data. The data and analyses reflect a significant reduction in the uncertainty in shear wave velocities below the WTP and result in a significantly lower spectral acceleration (i.e., ground motion). The updated ground motion response analyses and corresponding design response spectra reflect a 25% lower peak horizontal acceleration than reflected in the 2005 design criteria. These results provide confidence that the WTP seismic design criteria are conservative. (authors)

  7. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  8. Design and development of digital seismic amplifier recorder

    SciTech Connect

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-16

    Digital seismic recording is a technique for recording seismic data in digital systems. This method is more convenient because it is more accurate than other methods of seismic recording. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve the accuracy of measurement by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units. Each non-inverting amplifier uses a 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of approximately 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. The results of qualitative analysis showed that the seismic signal conditioning can produce a large output, so that the data obtained are better than conventional data. This application can be used for geophysical methods that have low input voltages, such as microtremor measurements.
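    The reported ~1,000x amplification is consistent with the standard non-inverting op-amp gain formula, assuming the 1 MΩ resistor sits in the feedback path (the record does not state which resistor plays which role):

```python
def noninverting_gain(r_feedback, r_ground):
    """Closed-loop gain of a non-inverting op-amp stage: 1 + Rf/Rg."""
    return 1.0 + r_feedback / r_ground

# The stated resistor values: 1 MOhm (assumed feedback), 1 kOhm to ground
print(noninverting_gain(1e6, 1e3))  # 1001.0, i.e. roughly the 1,000x reported
```

    With the roles swapped the gain would be only 1.001, so the assumed arrangement is the one that matches the reported amplification.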

  9. Design and development of digital seismic amplifier recorder

    NASA Astrophysics Data System (ADS)

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-01

    Digital seismic recording is a technique for recording seismic data in digital systems. This method is more convenient because it is more accurate than other methods of seismic recording. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve the accuracy of measurement by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units. Each non-inverting amplifier uses a 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of approximately 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. The results of qualitative analysis showed that the seismic signal conditioning can produce a large output, so that the data obtained are better than conventional data. This application can be used for geophysical methods that have low input voltages, such as microtremor measurements.

  10. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.

  11. Experimental investigation of damage behavior of RC frame members including non-seismically designed columns

    NASA Astrophysics Data System (ADS)

    Chen, Linzhi; Lu, Xilin; Jiang, Huanjun; Zheng, Jianbo

    2009-06-01

    Reinforced concrete (RC) frame structures are one of the most commonly used structural systems, and their seismic performance is largely determined by the performance of columns and beams. This paper describes horizontal cyclic loading tests of ten column and three beam specimens, some of which were designed according to the current seismic design code and others according to the earlier non-seismic Chinese design code, with the aim of explaining the behavior of the damaged or collapsed RC frame structures observed during the Wenchuan earthquake. The effects of axial load ratio, shear span ratio, and transverse and longitudinal reinforcement ratios on hysteresis behavior, ductility and damage progression were incorporated in the experimental study. Test results indicate that the non-seismically designed columns show premature shear failure, larger maximum residual crack widths, and more concrete spalling than the seismically designed columns; in addition, their longitudinal reinforcement bars buckled severely. The axial load ratio and shear span ratio proved to be the most important factors affecting ductility and crack opening and closing behavior, while the longitudinal reinforcement ratio had only a minor effect on column ductility but exhibited more influence on beam ductility. Finally, the transverse reinforcement ratio did not influence the maximum residual crack width or crack-closing ability of the seismically designed columns.

  12. Design and implementation of telemetry seismic data acquisition system based on embedded P2P Ethernet

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Lin, J.; Chen, Z.

    2011-12-01

    A new design of telemetry seismic data acquisition system is presented which uses embedded, point-to-point (P2P) Ethernet networks. We explain the idea and motivation behind the use of a P2P Ethernet topology and show the problems that arise when such a topology is used in a seismic acquisition system. The paper focuses on the network protocols we developed, which include route-table generation and dynamic IP address management. This new design has been implemented on ARM and FPGA hardware, and we have tested it both in the laboratory and in seismic exploration.

  13. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    SciTech Connect

    RYAN GW

    2008-04-25

    In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project, and a strong intent to avoid potentially costly re-work of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) to choose the criteria that will result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in a conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and reduce the risk that project designs and analyses will require revision when the draft DOE-STD-1189 is finalized.

  14. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components, and the energy distribution in each story provides useful information for structural upgrades with the help of added devices. The objective is minimization of the structural displacement response. The application of this structural seismic response research is presented in a base-isolation example.

  15. Low-Noise Potential of Advanced Fan Stage Stator Vane Designs Verified in NASA Lewis Wind Tunnel Test

    NASA Technical Reports Server (NTRS)

    Hughes, Christopher E.

    1999-01-01

    With the advent of new, more stringent noise regulations in the next century, aircraft engine manufacturers are investigating new technologies to make the current generation of aircraft engines as well as the next generation of advanced engines quieter without sacrificing operating performance. A current NASA initiative called the Advanced Subsonic Technology (AST) Program has set as a goal a 6-EPNdB (effective perceived noise) reduction in aircraft engine noise relative to 1992 technology levels by the year 2000. As part of this noise program, and in cooperation with the Allison Engine Company, an advanced, low-noise, high-bypass-ratio fan stage design and several advanced technology stator vane designs were recently tested in NASA Lewis Research Center's 9- by 15-Foot Low-Speed Wind Tunnel (an anechoic facility). The project was called the NASA/Allison Low Noise Fan.

  16. Effective Parameters on Seismic Design of Rectangular Underground Structures

    SciTech Connect

    Amiri, G. Ghodrati; Maddah, N.; Mohebi, B.

    2008-07-08

    Underground structures are a significant part of transportation in modern society and, in seismic zones, must withstand both seismic and static loading. Embedded structures must conform to ground deformations during an earthquake, so a nearly exact evaluation of structure-to-ground distortion is critical. Several two-dimensional finite difference models are used to identify the parameters affecting the racking ratio (structure-to-ground distortion), including the flexibility ratio, various cross sections, embedment depth, and the Poisson's ratio of the soil. Results show that the influence of different cross sections by themselves is negligible, but embedment depth, in addition to flexibility ratio and Poisson's ratio, proves to be a consequential parameter. A comparison with the pseudo-static method (simplified frame analysis) is also performed. The results show that for a structure stiffer than the soil, the racking ratio decreases as the depth of burial decreases; on the other hand, shallow, flexible structures can suffer up to 30 percent greater distortion than deeper ones.

  17. Seismic design factors for RC special moment resisting frames in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

    2011-12-01

    This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16 and 32 stories in Dubai, UAE, utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated; namely, the response modification factor (R), deflection amplification factor (Cd), system overstrength factor (Ωo), and response modification factor for ductility (Rd), in addition to inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50); consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ωo factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) is to be expected in moment frames of varying heights.
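
    The factors named in this abstract are commonly related through capacity curve quantities; a hedged sketch of those typical definitions (symbols follow common usage, not necessarily the paper's exact formulation, and the numbers are hypothetical):

    ```python
    # Common pushover/IDA-based definitions (illustrative, not the paper's data):
    #   overstrength           Omega_o = V_max / V_design
    #   ductility reduction    R_d     = V_elastic / V_max
    #   response modification  R       = R_d * Omega_o = V_elastic / V_design
    def seismic_design_factors(v_design, v_max, v_elastic):
        """Return (R, Omega_o, R_d) from design, maximum, and elastic base shears."""
        omega_o = v_max / v_design
        r_d = v_elastic / v_max
        return r_d * omega_o, omega_o, r_d

    r, omega_o, r_d = seismic_design_factors(v_design=1000.0, v_max=2500.0, v_elastic=8000.0)
    print(round(r, 2), round(omega_o, 2), round(r_d, 2))  # 8.0 2.5 3.2
    ```
    
    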

  18. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. 
This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.

  19. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find the optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to conventional design methods to show the strengths and weaknesses of the algorithm. PMID:25202717

  20. Evaluation of collapse resistance of RC frame structures for Chinese schools in seismic design categories B and C

    NASA Astrophysics Data System (ADS)

    Tang, Baoxin; Lu, Xinzheng; Ye, Lieping; Shi, Wei

    2011-09-01

    According to the Code for Seismic Design of Buildings (GB50011-2001), ten typical reinforced concrete (RC) frame structures, used as school classroom buildings, are designed with different seismic fortification intensities (SFIs) (SFI=6 to 8.5) and different seismic design categories (SDCs) (SDC=B and C). The collapse resistance of the frames with SDC=B and C in terms of collapse fragility curves are quantitatively evaluated and compared via incremental dynamic analysis (IDA). The results show that the collapse resistance of structures should be evaluated based on both the absolute seismic resistance and the corresponding design seismic intensity. For the frames with SFI from 6 to 7.5, because they have relatively low absolute seismic resistance, their collapse resistance is insufficient even when their corresponding SDCs are upgraded from B to C. Thus, further measures are needed to enhance these structures, and some suggestions are proposed.

  1. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase of up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and the adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. 
Because of the sensitivity of the calculated response spectra to the velocity contrasts between the basalts and interbedded sediments, DOE initiated an effort to emplace additional boreholes at the WTP site and obtain direct Vs measurements and other physical property measurements in these layers. One core-hole and three boreholes have been installed at the WTP site to a maximum depth of 1468 ft (447 m) below ground surface. The three boreholes are within 500 ft (152 m) of and surrounding the high level waste vitrification and pretreatment facilities of the WTP, which were the Performance Category 3 (PC-3) structures affected by the interim design spectra. The core-hole is co-located with the borehole closest to the two PC-3 structures. These new measurements are expected to reduce the uncertainty in the modeled site response that is caused by the lack of direct knowledge of the Vs contrasts within these layers. (authors)
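
    The interbed Vs estimate described in this record reduces to a simple calculation: divide the measured Vp by an assumed Vp/Vs ratio range. A minimal sketch with hypothetical numbers (not the study's data):

    ```python
    # Shear-wave velocity bounds inferred from a measured Vp and an assumed
    # Vp/Vs ratio range. The Vp value and ratio bounds here are hypothetical
    # illustrations of the method, not measurements from the WTP boreholes.
    def vs_range(vp, vp_vs_low, vp_vs_high):
        """Return (low, high) bounds on Vs given Vp and a Vp/Vs ratio range."""
        return vp / vp_vs_high, vp / vp_vs_low

    lo, hi = vs_range(vp=2000.0, vp_vs_low=1.7, vp_vs_high=2.5)  # m/s
    print(round(lo), round(hi))  # 800 1176
    ```
    
    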

  2. Estimation of cyclic interstory drift capacity of steel framed structures and future applications for seismic design.

    PubMed

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288

  3. Estimation of Cyclic Interstory Drift Capacity of Steel Framed Structures and Future Applications for Seismic Design

    PubMed Central

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288
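
    The abstract above does not specify its energy-based damage model, so as an illustration only, here is the well-known Park-Ang index, a standard formulation that likewise combines peak deformation with cumulative hysteretic energy demand (all numbers hypothetical):

    ```python
    # Park-Ang damage index (a stand-in for the paper's unspecified model):
    #   DI = d_max/d_ult + beta * E_hyst / (F_y * d_ult)
    # The cumulative hysteretic energy term E_hyst captures the plastic-demand
    # effects that long-duration motions accumulate. Inputs are hypothetical.
    def park_ang_index(d_max, d_ult, e_hyst, f_y, beta=0.15):
        """Damage index: DI < 1 is repairable, DI >= 1 indicates collapse-level damage."""
        return d_max / d_ult + beta * e_hyst / (f_y * d_ult)

    di = park_ang_index(d_max=0.04, d_ult=0.06, e_hyst=120.0, f_y=500.0)
    print(round(di, 3))  # 1.267
    ```
    
    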

  4. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  5. Optimization of Seismic Network Design: Application to a Geophysical International Lunar Network

    NASA Astrophysics Data System (ADS)

    Yamada, R.; Garcia, R. F.; Lognonne, P.; Calvet, M.; Gagnepain-Beyneix, J.; Le Feuvre, M.

    2010-12-01

    During the next decade, several lunar seismic experiments are planned under the international lunar network initiative, such as the NASA ILN Anchor Nodes mission or Lunette DISCOVERY proposal, and the JAXA SELENE-2 and LUNA-GLOB penetrator missions, during which 1 to 4 seismic stations will be deployed on the lunar surface. Yamada et al. (submitted) have described how to design an optimized network in order to obtain the best scientific gain from these future lunar landing missions. In this presentation, we describe the expected gain from new lunar seismic observations potentially obtained by the optimized network, compared with the past Apollo seismic experiments. From the Apollo seismic experiments, valuable information about the lunar interior structure was obtained using deep and shallow moonquakes, and meteoroid impacts (e.g., Nakamura et al., 1983, Lognonné et al., 2003). However, due to the limited sensitivity of the Apollo lunar seismometers and the narrowness of the seismic network, the deep lunar structure, especially the core, was not properly retrieved. In addition, large uncertainties are associated with the inferred crustal thickness around the Apollo seismic stations. Improving this knowledge will help us understand the origin of the Earth-Moon system and the initial differentiation of the Moon. Therefore, we have studied the optimization of a seismic network consisting of three or four new seismometers in order to place better constraints on the lunar mantle structure and/or crustal thickness. The network is designed to minimize the a posteriori errors and maximize the resolution of the velocity perturbations inside the mantle and/or the crust through a linear inverse method. For the inversion, deep moonquakes from the active nests already located by Apollo seismic data are used, because these events are known to occur repeatedly at identical nests under tidal constraints. 
In addition, we use randomly distributed meteoroid impacts located either by the new seismic network or by detection of the impact flashes from Earth-based observation. The use of these impact events will greatly contribute to improve the knowledge of shallow structures, in particular the crust. Finally, a comparison between the a posteriori errors deduced from our optimized network with those of the Apollo network will indicate the potential of the optimized network and the expected scientific gain. This method will be a useful tool to consider for future geophysical network landing missions.
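
    The network-optimization criterion described above, minimizing a posteriori errors of a linear inverse problem, can be sketched in a few lines: for linearized data d = G m, the a posteriori model covariance (assuming unit data errors and no prior) is (GᵀG)⁻¹, and a station geometry that shrinks its diagonal resolves the model better. The toy 2-parameter sensitivity matrices below are hypothetical, not the study's kernels:

    ```python
    import numpy as np

    def posterior_variances(G):
        """Diagonal of the a posteriori model covariance (G^T G)^-1
        for a linear inverse problem d = G m with unit data errors."""
        return np.diag(np.linalg.inv(G.T @ G))

    # Nearly collinear rows (poor "station" geometry) vs. well-spread rows.
    G_clustered = np.array([[1.0, 0.2],
                            [0.9, 0.3]])
    G_spread = np.array([[1.0, 0.2],
                         [0.2, 1.0],
                         [0.7, 0.7]])

    # The spread geometry yields much smaller a posteriori variances.
    print(posterior_variances(G_clustered).max() > posterior_variances(G_spread).max())  # True
    ```
    
    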

  6. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA s Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  7. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA s Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  8. Seismic Assessment of High-Raised Designed Structures Based on 2800 Iranian Seismic Code (same as UBC1997)

    SciTech Connect

    Negar, Moharrami Gargari; Rassol, Mirgaderi

    2008-07-08

    Seismic design codes have been developed to ensure appropriate performance of structures during earthquakes. In this regard, the variety of load patterns, the history and location of plastic hinges, the ultimate capacity of the structure, its demand capacity, and many other questions about the actual versus assumed performance of structures during earthquakes have been considered by experts in the field. To reduce retrofit costs, evaluation of the nonlinear behavior of structures during earthquakes has been studied in more depth. In the late 1980s the first generation of structural retrofit codes was established, while design codes were still based on the linear behavior of structures. Consequently, comparison of the results of design codes and retrofit codes, which evaluate the actual behavior of the structure, has become of interest. This research evaluates structures designed by the 2800 code against the performance levels described in FEMA 356, and compares the results of modal analysis with the outcomes of static nonlinear analysis using the load patterns specified in FEMA 356. The structure was designed and checked against all regulations in the 2800 code and then evaluated by the FEMA 356 regulations. Finally, the results present the performance point of the structure and the distribution of plastic hinges over the whole structure at collapse.

  9. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Seismic design and construction standards for new buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE (CONTINUED) COMPLIANCE WITH OTHER FEDERAL...

  10. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 12 2011-01-01 2011-01-01 false Seismic design and construction standards for new buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE (CONTINUED) COMPLIANCE WITH OTHER FEDERAL...

  11. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
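
    The risk-targeting idea in this record, convolving a site's hazard curve with an uncertain collapse capacity, can be sketched numerically: the 50-year collapse probability is the sum over ground-motion bins of P(collapse | Sa) times the hazard increment. The hazard curve and lognormal fragility parameters below are hypothetical illustrations, not USGS map values:

    ```python
    import math

    def collapse_probability(sa, hazard, theta, beta):
        """Riemann-sum risk integral: sum of P(collapse | Sa) * |dH(Sa)| over
        a discretized hazard curve. theta/beta are the median and lognormal
        standard deviation of the collapse fragility (hypothetical here)."""
        def fragility(x):  # lognormal CDF: P(collapse | Sa = x)
            return 0.5 * (1.0 + math.erf(math.log(x / theta) / (beta * math.sqrt(2.0))))
        p = 0.0
        for i in range(len(sa) - 1):
            x_mid = 0.5 * (sa[i] + sa[i + 1])
            p += fragility(x_mid) * (hazard[i] - hazard[i + 1])  # |dH| over the bin
        return p

    sa = [0.1, 0.2, 0.4, 0.8, 1.6]              # spectral acceleration (g)
    hazard = [0.05, 0.01, 0.002, 0.0004, 0.0]   # 50-yr exceedance probability
    print(collapse_probability(sa, hazard, theta=1.0, beta=0.6))
    ```

    Because hazard-curve shapes differ site to site, two sites with the same mapped 2%-in-50-year motion yield different collapse probabilities from this integral, which is exactly the non-uniformity the risk-targeted adjustment corrects.
    
    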

  12. Verifying Ballast Water Treatment Performance

    EPA Science Inventory

    The U.S. Environmental Protection Agency, NSF International, Battelle, and U.S. Coast Guard are jointly developing a protocol for verifying the technical performance of commercially available technologies designed to treat ship ballast water for potentially invasive species. The...

  13. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  14. Seismic design of circular-section concrete-lined underground openings: Preclosure performance considerations for the Yucca Mountain Site

    SciTech Connect

    Richardson, A.M.; Blejwas, T.E.

    1992-07-01

    Yucca Mountain, the potential site of a repository for high-level radioactive waste, is situated in a region of natural and man-made seismicity. Underground openings excavated at this site must be designed for worker safety in the seismic environment anticipated for the preclosure period. This includes accesses developed for site characterization regardless of the ultimate outcome of the repository siting process. Experience with both civil and mining structures has shown that underground openings are much more resistant to seismic effects than surface structures, and that even severe dynamic strains can usually be accommodated with proper design. This paper discusses the design and performance of lined openings in the seismic environment of the potential site. The types and ranges of possible ground motions (seismic loads) are briefly discussed. Relevant historical records of underground opening performance during seismic loading are reviewed. Simple analytical methods of predicting liner performance under combined in situ, thermal, and seismic loading are presented, and results of calculations are discussed in the context of realistic performance requirements for concrete-lined openings for the preclosure period. Design features that will enhance liner stability and mitigate the impact of the potential seismic load are reviewed. The paper is limited to preclosure performance concerns involving worker safety because present decommissioning plans specify maintaining the option for liner removal at seal locations, thus decoupling liner design from repository postclosure performance issues.

  15. Malargüe seismic array: Design and deployment of the temporary array

    NASA Astrophysics Data System (ADS)

    Ruigrok, E.; Draganov, D.; Gómez, M.; Ruzzante, J.; Torres, D.; Lópes Pumarega, I.; Barbero, N.; Ramires, A.; Castaño Gañan, A. R.; van Wijk, K.; Wapenaar, K.

    2012-10-01

    We present the goals and the current status of the Malargüe seismic array. Our main goal is imaging and monitoring the subsurface below the Malargüe region, Mendoza, Argentina. More specifically, we focus on the Planchon-Peteroa Volcano and an area just east of the town of Malargüe. We start the project by installing a temporary array of 38 seismic stations, which will record continuously for one year. The array consists of two subarrays: one located on the flanks of the volcano, the other spread out on a plateau just east of the Andes. The imaging targets, such as the Moho and the Nazca slab, are relatively deep. Yet the array has a dense station spacing, allowing exploration-type processing. High-resolution imaging also requires a dense source spacing, which we aim to achieve by creating virtual sources at the receiver positions with a technique called seismic interferometry (SI). The array is designed such that a recent improvement of SI can be applied to the recordings. Other goals are to collect high-quality core-phase measurements and to characterize sources of microseism noise in the Southern Hemisphere. Furthermore, we plan to collaborate with researchers from the Pierre Auger Collaboration to study the coupling of seismic, atmospheric, and cosmic signals using data from our instruments and from the Pierre Auger detectors.

  16. Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Linn, A. M.; Law, B.

    2010-12-01

    Trust in an international agreement to limit future greenhouse gas emissions will depend on the ability of each nation to make accurate estimates of its own emissions, monitor their changes over time, and verify one another's estimates with independent information. A recent National Research Council committee assessed current capabilities for estimating and verifying emissions of greenhouse gases that result from human activities, have long lifetimes in the atmosphere, and are likely to be included in an international agreement. These include CO2, CH4, N2O, HFCs, PFCs, SF6, and CFCs. The analysis shows that countries have the capability to estimate their CO2 emissions from fossil-fuel use with sufficient accuracy to support monitoring of an international treaty, but accurate methods are not universally applied and the estimates cannot be checked against independent data. Deployment of existing methods and technologies could, within 5 years, yield a capability to both estimate and verify CO2 emissions from fossil-fuel use and deforestation, which comprise approximately three-quarters of the greenhouse gas emissions likely to be covered by a treaty. Estimates of emissions of the other greenhouse gases will remain uncertain in the near term.

  17. Seismic design technology for breeder reactor structures. Volume 1. Special topics in earthquake ground motion

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This report is divided into twelve chapters: seismic hazard analysis procedures, statistical and probabilistic considerations, vertical ground motion characteristics, vertical ground response spectrum shapes, effects of inclined rock strata on site response, correlation of ground response spectra with intensity, intensity attenuation relationships, peak ground acceleration in the very near field, statistical analysis of response spectral amplitudes, contributions of body and surface waves, evaluation of ground motion characteristics, and design earthquake motions. (DLC)

  18. Seismic Evaluation and Preliminary Design of Regular Setback Masonry Infilled Open Ground Storey RC Frame

    NASA Astrophysics Data System (ADS)

    Hashmi, Arshad K.

    2016-03-01

    Current seismic codes set stringent criteria for classifying a frame as regular or irregular, and these criteria decide only the type of analysis (equivalent static or dynamic) to be performed. In contrast, newer simplified methods such as pushover analysis can readily estimate the lateral load capacity of any structure, regular or irregular. An iterative design procedure based on pushover analysis and the serviceability requirement (inter-storey drift limitation) of the present seismic code therefore offers an alternative to current practice. The present paper deals with a regular setback frame combined with a vulnerable layout of masonry infill walls over the frame elevation (a probable case of vertical stiffness irregularity). Nonlinear time-history analysis and the Capacity Spectrum Method have been used to investigate the seismic performance of these frames. Finally, a recently developed preliminary design procedure satisfying the inter-storey drift serviceability criterion has been employed for the preliminary design of these frames.

  19. Martian seismicity

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Grimm, Robert E.

    1991-01-01

    The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed in imaging experiments show that Mars must have been a seismically active planet in the past, and there is no reason to discount the notion that Mars is seismically active today, though at a lower level of activity. Models for present-day Mars seismicity are explored. Depending on the sensitivity and geometry of a seismic network and the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound; a more refined estimate would specifically take into account the regional cooling of Tharsis and lead to a higher frequency of seismic events.

  20. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER

    SciTech Connect

    OMBERG SK

    2008-05-14

    This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria of DOE-STD-1189-2008, Appendix A. The existing and new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to the existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available early in CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex who may be implementing DOE-STD-1189-2008.

  1. Verifying performance requirements

    NASA Technical Reports Server (NTRS)

    Cross, Joseph

    1986-01-01

    Today, it is impossible to verify performance requirements on Ada software, except in a very approximate sense. There are several reasons for this difficulty, the main one being the lack of use of information on the mapping of the program onto the target machine. An approach to a partial solution to the verification of performance requirements on Ada software is proposed, called the rule-based verification approach. This approach is suitable when the target machine is well defined and when additional effort and expense are justified in order to guarantee that the performance requirements will be met by the final system.

  2. Verifiable Quantum Computing

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham

    Over the next five to ten years we will see a state of flux as quantum devices become part of the mainstream computing landscape. However, adopting and applying such a highly variable and novel technology is both costly and risky, as this quantum approach has an acute verification and validation problem: on the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. Our grand aim is to settle these key milestones to make the translation from theory to practice possible. Currently the most efficient way to verify a quantum computation is to employ cryptographic methods. I will present the current state of the art of various existing protocols, where generally there exists a trade-off between the practicality of a scheme and its generality, trust assumptions and security level. EK gratefully acknowledges funding through EPSRC Grants EP/N003829/1 and EP/M013243/1.

  3. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.
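
    The kind of trade-off such a decision framework optimizes can be caricatured in a few lines: choose the design level that minimizes up-front construction cost plus the expected discounted loss from failure. All numbers below (cost slope, baseline failure rate, loss, discount rate) are invented for illustration and are not taken from the paper:

```python
# Toy expected-cost trade-off: a stronger design costs more up front but
# lowers the expected (discounted) earthquake loss. Every parameter value
# here is an illustrative assumption.
def total_cost(x, c0=1.0, alpha=0.05, p0=0.02, loss=10.0, rate=0.03):
    construction = c0 * (1.0 + alpha * x)      # up-front cost grows with x
    annual_fail = p0 * (2.0 ** -x)             # failure prob. halves per unit x
    expected_loss = annual_fail * loss / rate  # discounted perpetuity of losses
    return construction + expected_loss

# Pick the integer design level with the lowest total expected cost
best = min(range(0, 11), key=total_cost)
```

    In the paper's setting the failure probability itself is uncertain and fluctuates over the project lifetime, which shifts the optimum upward relative to this static toy.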

  4. Seismic design repair and retrofit strategies for steel roof deck diaphragms

    NASA Astrophysics Data System (ADS)

    Franquet, John-Edward

    Structural engineers will often rely on the roof diaphragm to transfer lateral seismic loads to the bracing system of single-storey structures. The implementation of capacity-based design in the NBCC 2005 has increased the diaphragm design load due to the need to use the probable capacity of the bracing system, resulting in thicker decks, closer connector patterns and higher construction costs. Previous studies have shown that accounting for the in-plane flexibility of the diaphragm when calculating the overall building period can result in lower seismic forces and a more cost-efficient design. However, recent studies estimating the fundamental period of single-storey structures using ambient vibration testing showed that the in-situ approximation was much shorter than that obtained by analytical means. The difference lies partially in the diaphragm stiffness characteristics, which have been shown to decrease under increasing excitation amplitude. Using the diaphragm as the energy-dissipating element in the seismic force resisting system has also been investigated, as this would take advantage of the diaphragm's ductility and limited overstrength; thus, lower capacity-based seismic forces would result. An experimental program on 21.0 m by 7.31 m diaphragm test specimens was carried out to investigate the dynamic properties of diaphragms, including stiffness, ductility and capacity. The specimens consisted of 20- and 22-gauge panels with nailed frame fasteners and screwed sidelap connections, as well as a welded and button-punched specimen. Repair strategies for diaphragms that have previously undergone inelastic deformations were devised in an attempt to restore the original stiffness and strength, and were then evaluated experimentally. Experimental estimates of strength and stiffness are compared with those predicted by the Steel Deck Institute (SDI) method. A comparative building design study was also completed. This study looks at the difference in design and cost yielded by previous and current design practice with EBF braced frames. Two alternative design methodologies, in which the period is not restricted by code limitations and in which the diaphragm force is limited to the equivalent shear force calculated with RdRo = 1.95, are also used for comparison. The study highlights the importance of incorporating the diaphragm stiffness in design and the potential cost savings.

  5. Verifying versus falsifying banknotes

    NASA Astrophysics Data System (ADS)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi used to imitate paper-based, printed and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided is in most cases insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (proving to be false), however, is an inefficacious procedure. Ergonomic, verification-oriented security features are therefore needed. This demand is increasingly met by security features based on nanotechnology. The potential of nano-security has a twofold basis: (1) the unique optical effects displayed allow simple, fast and unambiguous inspection, and (2) the nanotechnology they are based on makes successful counterfeiting or simulation extremely improbable.

  6. Seismic analysis of the LSST telescope

    NASA Astrophysics Data System (ADS)

    Neill, Douglas R.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) will be located on the seismically active Chilean mountain of Cerro Pachón. The accelerations resulting from seismic events produce the most demanding load cases the telescope and its components must withstand. Seismic ground accelerations were applied to a comprehensive finite element analysis (FEA) model which included the telescope, its pier and the mountain top. Response accelerations for specific critical components (camera and secondary mirror assembly) on the telescope were determined by applying seismic accelerations in the form of Power Spectral Densities (PSD) to the FEA model. The PSDs were chosen based on the components' design lives. Survival-level accelerations were determined utilizing PSDs for seismic events with return periods 10 times the telescope's design life, which is equivalent to a 10% chance of occurring over the lifetime. Since the telescope has a design life of 30 years, it was analyzed for a return period of 300 years. Operational-level seismic accelerations were determined using return periods of 5 times the lifetimes. Since the seismic accelerations provided by the Chilean design codes were given in the form of Peak Spectral Accelerations (PSA), a method to convert between the two forms was developed. The accelerations are also affected by damping level. The LSST incorporates added damping to meet its rapid slew and settle requirements. This added damping also reduces the components' seismic accelerations. The analysis was repeated for the telescope pointing at horizon and at zenith. Closed form solutions were utilized to verify the results.
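
    One common first-order way to move between the two representations the abstract mentions is to invert Miles' equation for a lightly damped single-degree-of-freedom oscillator, treating the peak spectral acceleration as a multiple of the RMS response. This is a generic textbook approximation with assumed parameters (Q = 10, a 3-sigma peak factor), not necessarily the conversion method developed for the LSST analysis:

```python
import math

def psa_to_psd(psa_g, f_hz, q=10.0, peak_factor=3.0):
    """First-order conversion of a peak spectral acceleration (in g) at
    frequency f_hz to an equivalent base-input PSD (g^2/Hz), by inverting
    Miles' equation  g_rms = sqrt((pi/2) * f * Q * W(f))  with the peak
    response taken as peak_factor * RMS."""
    sigma = psa_g / peak_factor   # RMS response implied by the peak value
    return sigma ** 2 / ((math.pi / 2.0) * f_hz * q)
```

    The forward direction (PSD to peak response) is just Miles' equation times the peak factor, so the two functions round-trip exactly under these assumptions.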

  7. Seismic design evaluation guidelines for buried piping for the DOE HLW Facilities

    SciTech Connect

    Lin, Chi-Wen; Antaki, G.; Bandyopadhyay, K.; Bush, S.H.; Costantino, C.; Kennedy, R.

    1995-05-01

    This paper presents the seismic design and evaluation guidelines for underground piping for the Department of Energy (DOE) High-Level-Waste (HLW) Facilities. The underground piping includes both single and double containment steel pipes and concrete pipes with steel lining, with particular emphasis on the double containment piping. The design and evaluation guidelines presented in this paper follow the generally accepted beam-on-elastic-foundation analysis principle and the inertial response calculation method, respectively, for piping directly in contact with the soil or contained in a jacket. A standard analysis procedure is described along with the discussion of factors deemed to be significant for the design of the underground piping. The following key considerations are addressed: the design feature and safety requirements for the inner (core) pipe and the outer pipe; the effect of soil strain and wave passage; assimilation of the necessary seismic and soil data; inertial response calculation for the inner pipe; determination of support anchor movement loads; combination of design loads; and code comparison. Specifications and justifications of the key parameters used, stress components to be calculated and the allowable stress and strain limits for code evaluation are presented.
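
    The wave-passage part of such an analysis can be given a two-line flavor: a common guideline-style upper bound takes the axial strain in a long buried pipe as PGV/(αc) and the curvature as PGA/c², where c is the apparent propagation velocity and α depends on the wave type (α = 2 is typical for shear waves). These generic expressions and the sample values are illustrative assumptions, not the specific procedure of the DOE guidelines:

```python
def wave_passage_strain(pgv, pga, c, alpha=2.0):
    """Upper-bound axial strain (dimensionless) and curvature (1/m) in a
    long buried pipe from seismic wave passage.
    pgv: peak ground velocity [m/s]; pga: peak ground acceleration [m/s^2];
    c: apparent wave propagation velocity [m/s]."""
    axial_strain = pgv / (alpha * c)  # ground strain transmitted to the pipe
    curvature = pga / c ** 2          # bending demand from wave curvature
    return axial_strain, curvature

# Example with assumed values: PGV = 0.5 m/s, PGA = 3 m/s^2, c = 2000 m/s
strain, kappa = wave_passage_strain(0.5, 3.0, 2000.0)
```

    In practice the strain actually transferred to the pipe is further limited by soil-pipe friction, which is where the beam-on-elastic-foundation model mentioned above comes in.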

  8. On standard and optimal designs of industrial-scale 2-D seismic surveys

    NASA Astrophysics Data System (ADS)

    Guest, T.; Curtis, A.

    2011-08-01

    The principal aim of performing a survey or experiment is to maximize the desired information within a data set by minimizing the post-survey uncertainty on the ranges of the model parameter values. Using Bayesian, non-linear, statistical experimental design (SED) methods, we show how industrial-scale amplitude variation with offset (AVO) surveys can be constructed to maximize the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys. The design method allows offset-dependent errors, previously not allowed in non-linear geoscientific SED methods. The method is applied to a single common-midpoint gather. The results show that the optimal design is highly dependent on the ranges of the model parameter values when a low number of receivers is used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered, we find that a design with constant spatial receiver separation becomes close to optimal. This explains why regularly spaced 2-D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging, in which homogeneous data coverage confers distinct advantages, but also in providing data to constrain subsurface petrophysical information.

  9. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This report assesses the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish their value considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this recommendation. The alternate GMAMs are EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs. The primary reasons for this difference are the greater activity rate used in contemporary models for the Charleston source zone and the proper incorporation of uncertainty and randomness in the GMAMs.
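
    The recommended logic-tree combination amounts to a weighted sum of the alternative GMAM hazard curves at each spectral ordinate. Only the 0.6/0.3/0.1 weights below come from the report; the exceedance frequencies are invented placeholders for illustration:

```python
# Weighted-mean hazard at one spectral ordinate, using the report's
# recommended GMAM weights. The annual frequencies of exceedance are
# hypothetical, not SRS values.
weights = {"epri2004": 0.6, "usgs2002": 0.3, "silva2004": 0.1}
curves = {  # assumed annual frequencies of exceedance at one amplitude
    "epri2004": 4.0e-4,
    "usgs2002": 6.0e-4,
    "silva2004": 9.0e-4,
}
mean_hazard = sum(weights[m] * curves[m] for m in weights)
```

    Repeating this sum over a grid of amplitudes yields the combined hazard curve from which the uniform hazard spectra are read off.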

  10. Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project

    SciTech Connect

    Subramanian, C.V.; King, J.L.; Perkins, D.M.; Mudd, R.W.; Richardson, A.M.; Calovini, J.C.; Van Eeckhout, E.; Emerson, D.O.

    1990-08-01

    This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

  11. Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Conrads, T.J.

    1993-06-01

    In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP) and this task was sponsored by DOE, Environmental Management. The TSEP is comprised of a number of consultants known for their knowledge of seismic ground motion and expertise in the analysis of structures, systems and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance to these guidelines throughout Title 1 and Title 2 design phases of the project as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects which might employ the TSEP guidelines.

  12. AP1000{sup R} design robustness against extreme external events - Seismic, flooding, and aircraft crash

    SciTech Connect

    Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J.

    2012-07-01

    Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima-Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000{sup R} nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance and safety. The plant's compact safety-related footprint and the protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shut down the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond-design-basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail-safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air or HVAC). The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel containment vessel, which is further surrounded by a substantial steel-concrete composite shield building. The containment vessel is not affected by external flooding, and the shield building design provides hazard protection beyond that provided by a comparable reinforced concrete structure. The intent of this paper is to demonstrate the robustness of the AP1000 design against extreme events. The paper focuses on the plant's ability to withstand extreme external events such as beyond-design-basis flooding, seismic events, and malicious aircraft impact, and highlights the robustness of the AP1000 nuclear island design, including the protection provided by the unique AP1000 composite shield building. (authors)

  13. Seismic Ecology

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibrations, shocks during earthquakes) or indirectly through buildings and constructions, and they can be strong (felt by people) or weak (detectable only by instruments). A great deal of work has been devoted to the influence of violent seismic actions (above all earthquakes) on people and structures; this work instead studies weak but long-lasting seismic actions on buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized areas. Besides violent earthquakes, man-made seismic actions can exert an essential influence: explosions, seismic noise emitted by industrial facilities and moving transport, and radiation from high-rise buildings and constructions under wind action. The paper presents material on the increase of man-made seismicity in a number of regions of Russia that were previously aseismic. Along with seismic microzoning maps, maps should be built showing the variation of the amplitude spectra of seismic noise over days, months, and years. Information about the amplitudes and frequencies of oscillations from possible earthquakes and man-made sources in a given region allows sound design and construction of industrial and civil housing projects. Constructing a building, even in a seismically safe region, with a resonance frequency that coincides with the frequency of oscillations emitted locally by man-made objects can end in failure of the building, with the heaviest consequences for the people in it. Practical examples of detailed engineering-seismological investigations of large industrial and civil projects in Siberia (hydro power stations, bridges, constructions, etc.) are given.

  14. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  15. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    SciTech Connect

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-08

    A multi-objective advanced design methodology addressing seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests were used to calibrate a model of the joint in order to perform seismic simulations on several moment-resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of connections already damaged by an earthquake. The experimental activity, together with FE simulations, demonstrated the adequacy of the advanced design methodology.

  16. Optimal seismic design of reinforced concrete structures under time-history earthquake loads using an intelligent hybrid algorithm

    NASA Astrophysics Data System (ADS)

    Gharehbaghi, Sadjad; Khatibinia, Mohsen

    2015-03-01

    A reliable seismic-resistant design of structures is achieved, in accordance with seismic design codes, by designing structures under seven or more pairs of earthquake records. Based on the recommendations of seismic design codes, the average time-history response (ATHR) of the structure is required. This paper focuses on the optimal seismic design of reinforced concrete (RC) structures against ten earthquake records using a hybrid of a particle swarm optimization algorithm and an intelligent regression model (IRM). To reduce the computational cost of the optimization procedure caused by the repeated time-history analyses, the IRM is proposed to accurately predict the ATHR of structures. The proposed IRM combines the subtractive clustering algorithm (SA), the K-means clustering approach and a wavelet weighted least squares support vector machine (WWLS-SVM). To predict the ATHR, the input-output samples of structures are first classified by the SA and the K-means approach; then a WWLS-SVM is trained for each cluster with few samples and high accuracy. 9- and 18-storey RC frames are designed optimally to illustrate the effectiveness and practicality of the proposed IRM. The numerical results demonstrate the efficiency and computational advantages of the IRM for the optimal design of structures subjected to time-history earthquake loads.
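The cluster-then-regress idea behind the IRM can be sketched in miniature: partition design samples with a clustering step and fit a separate surrogate model per cluster. The sketch below substitutes plain k-means and per-cluster linear least squares for the paper's subtractive clustering and WWLS-SVM, on synthetic data; all names and numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: design variables (e.g. member sizes) -> a scalar
# response standing in for the ATHR.  Real inputs would come from
# structural time-history analyses; everything here is synthetic.
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + 0.05 * rng.standard_normal(200)

def kmeans(X, k, iters=50):
    """Plain k-means (the paper combines subtractive clustering + k-means)."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

centers, labels = kmeans(X, k=3)

# Per-cluster linear least squares as a crude stand-in for the
# wavelet-weighted LS-SVM surrogate trained on each cluster.
models = {}
for j in range(3):
    Xc = np.hstack([X[labels == j], np.ones((int(np.sum(labels == j)), 1))])
    models[j], *_ = np.linalg.lstsq(Xc, y[labels == j], rcond=None)

def predict(x):
    """Route a new design point to its nearest cluster's surrogate."""
    j = int(np.argmin(((centers - x) ** 2).sum(-1)))
    return float(np.dot(np.append(x, 1.0), models[j]))

err = np.mean([(predict(x) - t) ** 2 for x, t in zip(X, y)])
```

In the paper the surrogate replaces expensive time-history analyses inside the optimization loop; here `err` simply confirms the per-cluster fits recover the synthetic response.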

  17. Spatial correlation analysis of seismic noise for STAR X-ray infrastructure design

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Agostino, Raffaele; Festa, Lorenzo; Gervasi, Anna; Guerra, Ignazio; Palmer, Dennis T.; Serafini, Luca

    2014-05-01

    The Italian PON MaTeRiA project is focused on the creation of a research infrastructure, open to users, based on an innovative and evolutionary X-ray source. This source, named STAR (Southern Europe TBS for Applied Research), exploits Thomson backscattering (TBS) of laser radiation by fast electron beams. Its main performance targets are: an X-ray photon flux of 10⁹-10¹⁰ ph/s, an angular divergence variable between 2 and 10 mrad, an X-ray energy continuously variable between 8 keV and 150 keV, a bandwidth ΔE/E variable between 1 and 10%, and a ps time-resolved structure. To achieve these targets, bunches of electrons produced by a photo-injector are accelerated to relativistic velocities in a linear accelerator section. The electron beam, a few hundred micrometers wide, is driven by magnetic fields to the interaction point along a 15 m transport line, where it is focused into a 10 micrometer-wide area. In the same area, the laser beam is focused after being transported along a 12 m structure. Ground vibrations could greatly affect the collision probability, and thus the emittance, by deviating the paths of the beams during their travel in the STAR source. The study program to measure ground vibrations at the STAR site can therefore be used for site characterization in relation to accelerator design. Environmental and facility noise may affect X-ray operation especially if the predominant wavelengths in the microtremor wavefield are much smaller than the size of the linear accelerator. For much greater wavelengths, all the accelerator parts move in phase, and therefore even large displacements cannot generate any significant effect. On the other hand, for wavelengths equal to or less than half the accelerator size, several parts could move in phase opposition, and therefore even small displacements could affect its proper functioning.
It is therefore important to characterize the microtremor wavefield in both the frequency and wavelength domains. For this reason, we performed measurements of seismic noise to characterize the environmental noise at the site where the X-ray accelerator will be built. For the characterization of the site, we carried out several passive seismic monitoring experiments at different times of the day and in different weather conditions. We recorded microtremor using an array of broadband three-component seismic sensors arranged along the linear accelerator. For each measurement point, we determined the displacement, velocity and acceleration spectrograms and the power spectral density of both the horizontal and vertical components. We also determined the microtremor horizontal-to-vertical spectral ratio as a function of azimuth, to identify the main ground vibration direction and to detect site or building resonance frequencies. We applied a rotation matrix to transform the North-South and East-West signal components into transverse and radial components with respect to the direction of the linear accelerator. Subsequently, for each pair of seismic stations we determined the coherence function to analyze the spatial correlation of the seismic noise. These analyses have allowed us to exhaustively characterize the seismic noise of the study area in terms of power and space-time variability, in both frequency and wavelength.
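The rotation of North-South/East-West records into radial and transverse components with respect to the accelerator axis is a standard 2-D rotation; a minimal sketch follows (the azimuth value is hypothetical, and the sign convention is an assumption, since conventions vary between packages):

```python
import numpy as np

def rotate_ne_to_rt(north, east, azimuth_deg):
    """Rotate North/East records into radial/transverse components with
    respect to a line of the given azimuth (degrees clockwise from north).
    Sign conventions differ between packages; this is one common choice."""
    a = np.deg2rad(azimuth_deg)
    radial = np.cos(a) * north + np.sin(a) * east
    transverse = -np.sin(a) * north + np.cos(a) * east
    return radial, transverse

# Check: a signal polarized exactly along the (assumed) accelerator
# azimuth ends up entirely on the radial component.
t = np.linspace(0.0, 1.0, 500)
s = np.sin(2.0 * np.pi * 5.0 * t)   # 5 Hz test signal
az = 30.0                            # hypothetical accelerator azimuth
north = s * np.cos(np.deg2rad(az))
east = s * np.sin(np.deg2rad(az))
radial, transverse = rotate_ne_to_rt(north, east, az)
```

After rotation, station-pair coherence can be computed component by component to study the spatial correlation described in the abstract.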

  18. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur in determining the timing of the arrivals, whether by automated processing or by an analyst. One of the major obstacles encountered in properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses the factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy that can be used in this study, and have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase-pick measurement-error model into our statistical model. We can use this statistical model to optimally calibrate a picking-error model to regional data. A follow-on report will present the results of applying this analysis plan to an experiment/data-gathering task.

  19. Design and utilization of a portable seismic/acoustic calibration system

    SciTech Connect

    Stump, B.W.; Pearson, D.C.

    1996-10-01

    Empirical results from the current GSETT-3 illustrate the need for source-specific information for calibrating the monitoring system. With the specified location design goal of 1,000 km², preliminary analysis indicates the importance of regional calibration of travel times. This calibration information can be obtained in a passive manner, using locations derived from local seismic array arrival times and assuming that the resulting locations are accurate. Alternatively, an active approach can be taken, making near-source observations of seismic sources of opportunity to provide specific information on the time, location and characteristics of the source. Moderate to large mining explosions are one source type that may be amenable to such calibration. This paper describes an active ground-truthing procedure for regional calibration. A prototype data acquisition system is discussed that includes a primary ground-motion component for source time and location determination, and secondary, optional acoustic and video components for improved source phenomenology. The system costs approximately $25,000 and can be deployed and operated by one or two people, thus providing a cost-effective system for calibration and documentation of sources of interest. Practical implementation of the system is illustrated, emphasizing the minimal impact on an active mining operation.

  20. On the Need for Reliable Seismic Input Assessment for Optimized Design and Retrofit of Seismically Isolated Civil and Industrial Structures, Equipment, and Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Martelli, Alessandro

    2011-01-01

    Based on the experience of recent violent earthquakes, the limits of the methods currently used for the definition of seismic hazard are becoming more and more evident to many seismic engineers. Considerable improvement is felt necessary not only for the seismic classification of the territory (for which the probabilistic seismic hazard assessment—PSHA—is generally adopted at present), but also for the evaluation of local amplification. With regard to the first item, among others, a better knowledge of fault extension and near-fault effects is judged essential. The aforesaid improvements are particularly important for the design of seismically isolated structures, which relies on displacement. Such a design requires an accurate definition of the maximum displacement corresponding to the isolation period, and a reliable evaluation of the earthquake energy content at the low frequencies typical of isolated structures, for the site and ground of interest. These evaluations shall include possible near-fault effects even in the vertical direction; for the construction of high-risk plants and components and the retrofit of some cultural heritage, they shall be performed for earthquakes characterized by very long return periods. The design displacement shall not be underestimated, nor excessively overestimated, at least when rubber bearings are used in the seismic isolation (SI) system. In fact, as the transverse deformation of such SI systems decreases below a certain value, their horizontal stiffness increases. Thus, should a structure (e.g. a civil defence centre, a masterpiece, etc.) protected in the aforesaid way be designed to withstand an unnecessarily large earthquake, the behaviour of its SI system will be inadequate (i.e. it will be too stiff) during the much more frequent events that may really strike the structure during its life.
Furthermore, since SI can be used only when the room available laterally to the structure is sufficient to create a structural gap compatible with the design displacement, overestimating this displacement may lead to unnecessarily renouncing such a very efficient method, especially in the case of retrofits of existing buildings. Finally, for long structures (e.g. several bridges or viaducts and even some buildings) an accurate evaluation of the possibly different ground displacements along the structure is required (this also applies to conventionally built structures). In order to overcome the limits of PSHA, this method shall be complemented by the development and application of deterministic models. In particular, the lack of displacement records requires the use of models, once these have been calibrated against more commonly available velocity or acceleration records. The aforesaid remarks are now particularly important in the P.R. China and Italy, to ensure safe reconstruction after the Wenchuan earthquake of May 12, 2008 and the Abruzzo earthquake of April 6, 2009: in fact, wide use of SI and other anti-seismic systems has been planned in the areas struck by both events.

  1. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-08

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey-stiffness of the damped braces proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in an associated paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed for a medium-risk region, is supposed to be retrofitted as for a high-risk region, by insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using the step-by-step procedures described in the associated paper mentioned above; the behaviour of frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for the dynamic analyses.

  2. Damage investigation of girder bridges under the Wenchuan earthquake and corresponding seismic design recommendations

    NASA Astrophysics Data System (ADS)

    Li, Jianzhong; Peng, Tianbo; Xu, Yan

    2008-12-01

    An investigation of girder bridges on National Highway 213 and the Doujiangyan-Wenchuan expressway after the Wenchuan earthquake showed that typical types of damage included: span collapses due to unseating at expansion joints; shear key failure; and damage to expansion joints due to the large slide-induced relative displacement between the bottom of the girder and the top of the laminated-rubber bearing. This slide, however, can actually act as a form of isolation for the substructure; as a result, the piers and foundations of most of the bridges on state route 213 suffered minor damage. The exception was the Baihua Bridge, which suffered severe damage. Corresponding seismic design recommendations are presented based on this investigation.

  3. Some issues in the seismic design of nuclear power-plant facilities

    SciTech Connect

    Hadjian, A.H.; Iwan, W.D.

    1980-09-01

    This paper summarizes the major issues discussed by an international panel of experts during the post-SMIRT (Structural Mechanics in Reactor Technology) Seminar on Extreme Load Design of Nuclear Power-Plant Facilities, which was held in Berlin, Aug. 20-21, 1979. The emphasis of the deliberations was on the state of the art of seismic-response calculations to predict the expected performance of structures and equipment during earthquakes. Four separate panels discussed issues on (1) soil-structure interaction and structural response, (2) modeling, materials, and boundary conditions, (3) damping in structures and equipment, and (4) fragility levels of equipment. The international character of the seminar was particularly helpful in the cross-pollination of ideas regarding the issues and the steps required to enhance the cause of safety of nuclear plants.

  4. CHARACTERIZING THE YUCCA MOUNTAIN SITE FOR DEVELOPING SEISMIC DESIGN GROUND MOTIONS

    SciTech Connect

    S. Upadhyaya, I. Wong, R. Kulkarni, K. Stokoe, M. Dober, W. Silva, and R. Quittmeyer

    2006-02-24

    Yucca Mountain, Nevada is the designated site for the first long-term geologic repository to safely dispose of spent nuclear fuel and high-level nuclear waste in the U.S. Yucca Mountain consists of stacked layers of welded and non-welded volcanic tuffs. Site characterization studies are being performed to assess its future performance as a permanent geologic repository. These studies include the characterization of the shear-wave velocity (Vs) structure of the repository block and the surface facilities area. The Vs data are an input to the calculations of ground motions for the preclosure seismic design and for the postclosure performance assessment; their accurate estimation is therefore needed. Three techniques have been employed to date at the site: 24 downhole surveys, 15 suspension seismic logging surveys and 95 spectral-analysis-of-surface-waves (SASW) surveys. The three data sets were compared with one another, with Vs profiles developed from vertical seismic profiling data collected by the Lawrence Berkeley National Laboratory, and with Vs profiles developed independently by the University of Nevada, Reno using the refraction microtremor technique. Based on these data, base-case Vs profiles have been developed and used in site response analyses. Since the question of adequate sampling arises in site characterization programs and a correlation between geology and Vs would help address this issue, a possible correlation was evaluated. To assess the influence of different factors on velocity, statistical analyses of the Vs data were performed using the method of multi-factor Analysis of Variance (ANOVA). The results of this analysis suggest that the effect of each of three factors, depth, lithologic unit, and spatial location, on velocity is statistically significant.
Furthermore, velocity variation with depth is different at different spatial locations. Preliminary results show that the lithologic unit alone explains about 54% and 42% of the velocity variation in the suspension and downhole data sets, respectively. The three factors together explain about 73% and 81% of the velocity variation in the suspension and downhole data sets, respectively. Development of a relationship, using multiple regression analysis, that may serve as a predictive tool to estimate velocity at a new location is currently being examined.
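The "variance explained" percentages quoted above come from ANOVA sums of squares; a one-factor version (eta-squared, SS_between/SS_total) can be sketched on synthetic grouped Vs data. The unit labels and velocities below are illustrative assumptions, not the site data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic Vs samples grouped by lithologic unit: each unit gets its own
# mean velocity (m/s) plus scatter.  Names and values are illustrative.
unit_means = {"unit_A": 1900.0, "unit_B": 1400.0, "unit_C": 1100.0}
vs, unit = [], []
for name, mu in unit_means.items():
    vs.extend(mu + 100.0 * rng.standard_normal(60))
    unit.extend([name] * 60)
vs = np.asarray(vs)
unit = np.asarray(unit)

def eta_squared(values, groups):
    """One-factor 'variance explained' from ANOVA sums of squares:
    SS_between / SS_total."""
    grand = values.mean()
    ss_total = ((values - grand) ** 2).sum()
    ss_between = 0.0
    for g in np.unique(groups):
        sel = values[groups == g]
        ss_between += len(sel) * (sel.mean() - grand) ** 2
    return ss_between / ss_total

r2 = eta_squared(vs, unit)  # fraction of Vs variance explained by unit
```

The multi-factor version used in the study partitions SS_between across depth, unit and spatial location, but the explained-variance interpretation is the same.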

  5. On the Computation of H/V and its Application to Microzonation and Seismic Design

    NASA Astrophysics Data System (ADS)

    Perton, M.; Martínez, J. A.; Lermo, J. F.; Sanchez-Sesma, F. J.

    2014-12-01

    The H/V ratio is the square root of the ratio of horizontal to vertical energies of ground motion. It has been observed that the frequency of its main peak is well suited for the characterization of site effects, and the ratio has been widely used for microzonation and seismic structural design. Historically, the ratio was computed as the average of individual H/V ratios obtained from noise autocorrelations. It has recently been pointed out, however, that the H/V ratio should instead be calculated as the ratio of the average of H over the average of V. This calculation is based on the relation between the directional energies (the imaginary part of the Green's function) and the noise autocorrelations. In general, the average of ratios is different from the ratio of averages. Although the frequency of the main response was correctly obtained, the associated amplification factor has generally been badly predicted, showing little agreement with the amplification observed during strong earthquakes. The unexpected decay of such ratios at high frequency and the lack of stability and reproducibility of the H/V ratios are other problems facing the method. These problems are addressed here from the point of view of the normalization of noise correlations. Several normalization techniques have already been proposed in order to correctly retrieve the Green's function; some are well suited for retrieving the surface-wave contribution, while others are more appropriate for bulk-wave incidence. Since the H/V ratio may be used for various purposes, such as surface-wave tomography, microzonation or seismic design, different normalizations are discussed in relation to these objectives. The H/V ratios obtained from local historical earthquakes on top of, or far away from, the subduction zone are also discussed. ACKNOWLEDGEMENT This research has been partially supported by DGAPA-UNAM under Project IN104712 and the AXA Research Fund.
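The distinction the abstract draws, average of ratios versus ratio of averages, is easy to demonstrate numerically: on synthetic spectral-power estimates the two estimators give measurably different H/V values. The data and magnitudes below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic horizontal (H) and vertical (V) spectral power estimates for
# many noise windows at a single frequency, with window-to-window scatter.
n = 1000
H = 4.0 + 0.5 * rng.standard_normal(n) ** 2
V = 1.0 + 0.5 * rng.standard_normal(n) ** 2

# Classical practice: average the per-window H/V ratios.
hv_avg_of_ratios = float(np.mean(np.sqrt(H / V)))

# Recommended alternative: ratio of the averaged energies.
hv_ratio_of_avgs = float(np.sqrt(np.mean(H) / np.mean(V)))
```

Because averaging and taking ratios do not commute, the first estimator is biased relative to the second, which is the point the abstract makes about mispredicted amplification factors.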

  6. Pushover Analysis Methodologies: A Tool For Limited Damage Based Design Of Structure For Seismic Vibration

    NASA Astrophysics Data System (ADS)

    Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita

    Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. For economic reasons, however, strength can be provided so that the structure remains in the elastic range during low-to-moderate earthquakes and is allowed to undergo inelastic deformation without collapse in a severe earthquake. Implementing this design philosophy requires a rigorous nonlinear dynamic analysis to estimate the inelastic demands; such analysis is time consuming and requires expertise to judge the results obtained. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can be easily used by practicing engineers, bypassing intricate nonlinear dynamic analysis, and can be thought of as a substitute for the latter. The method is still under development and is increasingly popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength and ease of this state-of-the-art methodology for regular use in design offices in performance-based seismic design of structures.
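The core of a pushover analysis is pushing the structure through increasing lateral displacement and recording the base-shear capacity curve; a minimal sketch for a single-degree-of-freedom system with an assumed bilinear backbone (all parameter values are illustrative, not from the paper):

```python
# Minimal pushover sketch: a single-degree-of-freedom system with an
# assumed bilinear (elastic / strain-hardening) base-shear backbone.
def base_shear(d, k=50.0, fy=100.0, alpha=0.05):
    """Base shear at roof displacement d: elastic stiffness k up to the
    yield force fy, post-yield stiffness alpha * k."""
    dy = fy / k  # yield displacement
    if d <= dy:
        return k * d
    return fy + alpha * k * (d - dy)

# Push the structure through increasing roof displacements and record
# the capacity (pushover) curve as (displacement, base shear) pairs.
disps = [0.1 * i for i in range(81)]
curve = [(d, base_shear(d)) for d in disps]
```

A real pushover analysis applies an assumed lateral load pattern to a multi-degree-of-freedom model and tracks member yielding; the capacity curve above is the one-degree-of-freedom essence of that procedure.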

  7. Effects of charge design features on parameters of acoustic and seismic waves and cratering, for SMR chemical surface explosions

    NASA Astrophysics Data System (ADS)

    Gitterman, Y.

    2012-04-01

    A series of experimental on-surface shots was designed and conducted by the Geophysical Institute of Israel at the Sayarim Military Range (SMR) in the Negev desert, including two large calibration explosions: about 82 tons of strong IMI explosives in August 2009, and about 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, the CTBTO, the USA and several European countries, with the main goal of providing fully controlled ground truth (GT0) infrasound sources in different weather/wind conditions for the calibration of IMS infrasound stations in Europe, the Middle East and Asia. Strong boosters and an upward charge detonation scheme were applied to reduce the energy released into the ground and enlarge the energy radiated into the atmosphere, producing enhanced infrasound signals for better observation at far-regional stations. The following observations and results indicate the explosive energy partition achieved by this charge design: 1) crater sizes and local seismic (duration) magnitudes were smaller than expected for these large surface explosions; 2) small test shots of the same charge (1 ton) conducted at SMR with different detonation directions clearly showed lower seismic amplitudes/energy and smaller crater size for the upward detonation; 3) many infrasound stations at local and regional distances showed higher than expected peak amplitudes, even after application of a wind-correction procedure. For the large-scale explosions, high-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimate of the charge yield. Empirical relations for the air-blast parameters - peak pressure, impulse and the Secondary Shock (SS) time delay - as functions of distance were developed and analyzed.
The parameters, scaled by the cubic root of the estimated TNT-equivalent charges, were found consistent for all analyzed explosions, except for the SS time delays, which were clearly separated for the shot of IMI explosives (characterized by a much higher detonation velocity than ANFO). Additionally, acoustic records at close distances from the WSMR explosions Distant Image (2440 tons of ANFO) and Minor Uncle (2725 tons of ANFO) were used to extend the charge and distance range of the scaled SS-delay relationship, which showed consistency with the SMR ANFO shots. The developed charge design contributed to the success of this unique dual Sayarim explosion experiment, which provided the strongest GT0 sources since the establishment of the IMS network, clearly demonstrated the most favorable westward/eastward infrasound propagation (up to 3400/6250 km under the appropriate summer/winter weather patterns and stratospheric wind directions, respectively), and thus empirically verified common models of infrasound propagation in the atmosphere. The research was supported by the CTBTO, Vienna, and the Israel Ministry of Immigrant Absorption.
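The cubic-root (Hopkinson-Cranz) scaling used above maps records from different charge sizes onto a common scaled distance Z = R/W^(1/3); a minimal sketch with illustrative numbers:

```python
# Hopkinson-Cranz cube-root scaling: air-blast parameters from charges of
# different yield are compared at the scaled distance Z = R / W^(1/3).
# The charges and ranges below are illustrative, not the SMR gauge layout.
def scaled_distance(r_m, w_kg_tnt):
    """Scaled distance in m/kg^(1/3) for range r_m (m) and TNT-equivalent
    yield w_kg_tnt (kg)."""
    return r_m / w_kg_tnt ** (1.0 / 3.0)

# A gauge 400 m from a 100-ton (1e5 kg TNT-equivalent) shot and a gauge
# 40 m from a 100 kg shot sit at the same scaled distance.
z_large = scaled_distance(400.0, 1.0e5)
z_small = scaled_distance(40.0, 100.0)
```

Plotting peak pressure, impulse or SS delay against Z rather than raw distance is what lets shots of very different yield be compared on one empirical curve.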

  8. Overview of Thermal-Hydraulic Test Program for Evaluating or Verifying the Performance of New Design Features in APR1400 Reactor

    SciTech Connect

    Song, C.H.; Kwon, T.S.; Chu, I.C.; Jun, H.G.; Park, C.K.

    2002-07-01

    The experimental program and some test results for the thermal-hydraulic evaluation or verification of new design features in APR1400 are introduced for the major test items. APR1400 incorporates many advanced design features to enhance its performance and safety. New design features adopted in APR1400 include, among others, four trains of the safety injection system (SIS) with a direct vessel injection (DVI) mode and a passively operating safety injection tank (SIT), the In-containment Refueling Water Storage Tank (IRWST), and the safety depressurization and vent system (SDVS). For these new design features, experimental activities to ensure their performance and contribution to safety enhancement have been carried out at KAERI. They include the LBLOCA ECCS performance evaluation test for the DVI mode of the SIS, the performance verification test of the fluidic device as a passive flow controller, the performance evaluation test of the steam sparger for the SDVS, and the CEDM (control element drive mechanism) performance evaluation test. In this paper, the test program is briefly introduced, including the test objectives, experimental methods and some typical results for each test item. (authors)

  9. GA-based optimum design of a shape memory alloy device for seismic response mitigation

    NASA Astrophysics Data System (ADS)

    Ozbulut, O. E.; Roschke, P. N.; Y Lin, P.; Loh, C. H.

    2010-06-01

    Damping systems discussed in this work are optimized so that a three-story steel frame structure and its shape memory alloy (SMA) bracing system minimize response metrics due to a custom-tailored earthquake excitation. Multiple-objective numerical optimization that simultaneously minimizes displacements and accelerations of the structure is carried out with a genetic algorithm (GA) in order to optimize SMA bracing elements within the structure. After design of an optimal SMA damping system is complete, full-scale experimental shake table tests are conducted on a large-scale steel frame that is equipped with the optimal SMA devices. A fuzzy inference system is developed from data collected during the testing to simulate the dynamic material response of the SMA bracing subcomponents. Finally, nonlinear analyses of a three-story braced frame are carried out to evaluate the performance of comparable SMA and commonly used steel braces under dynamic loading conditions and to assess the effectiveness of GA-optimized SMA bracing design as compared to alternative designs of SMA braces. It is shown that peak displacement of a structure can be reduced without causing significant acceleration response amplification through a judicious selection of physical characteristics of the SMA devices. Also, SMA devices provide a recentering mechanism for the structure to return to its original position after a seismic event.

  10. Image resolution analysis: A new, robust approach to seismic survey design

    NASA Astrophysics Data System (ADS)

    Tzimeas, Constantinos

    Seismic survey design methods often rely on qualitative measures to provide an optimal image of the target. Fold, ray-tracing techniques counting ray hits on binned interfaces, and even advanced 3-D survey design methods that try to optimize offset and azimuth coverage are prone to fail in their imaging predictions, especially in complex geological or structural settings. The reason for the potential failure of these commonly used approaches is that they do not take into account the ray geometry at the target points. Inverse theory results can provide quantitative and objective constraints on acquisition design. Beylkin's contribution to this field is an elegant and simple equation describing a reconstructed point scatterer given the source/receiver distribution used in the imaging experiment. Quantitative measures of spatial image resolution were developed to assess the efficacy of competing acquisition geometries. Apart from the source/receiver configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate imaging of target points located underneath and near the flanks of the diapir. Three different survey designs were examined. Results from these simulations show that, contrary to simple models, near-offsets do not always produce better-resolved images than far-offsets. However, consideration of decreasing signal-to-noise ratio revealed that images obtained from the far-offset experiment degrade faster than the near-offset ones. The image analysis was performed on VSP field data as well as on synthetics generated by finite-difference forward modeling. The predicted image resolution results were compared to the resolution measured from the migrated sections of both the field data and the synthetics. 
This comparison confirms that image resolution analysis provides as good a resolution prediction as the prestack Kirchhoff depth-migrated section of the synthetic gathers. Even for the migrated field data, despite the presence of error-introducing factors (different signal-to-noise ratios, shape and frequency content of source wavelets, etc.), image resolution analysis performed well, exhibiting the same trends of resolution change at the different test points.

  11. A New Seismic Broadband Sensor Designed for Easy and Rapid Deployment

    NASA Astrophysics Data System (ADS)

    Guralp, Cansun; Pearcey, Chris; Nicholson, Bruce; Pearce, Nathan

    2014-05-01

    Properly deploying digital seismic broadband sensors in the field can be time consuming and logistically challenging. On active volcanoes the time it takes to install such instruments has to be particularly short in order to minimize the risk for the deployment personnel. In addition, once a seismometer is installed it is not always feasible to pay regular visits to the deployment site in order to correct for possible movements of the seismometer due to settling, sliding or other external events. In order to address those issues we have designed a new type of versatile and very robust three-component feedback sensor which can be easily installed and is capable of self-correcting changes in its tilt and of measuring orientation changes during deployment. The instrument can be installed by direct burial in soil, in a borehole, or in glacial ice, and can even be used under water as an ocean bottom seismometer (OBS). Its components are fitted above each other in a cylindrical stainless steel casing with a diameter of 51 mm. Each seismic sensor has a flat response to velocity from 30 s to 100 Hz and a tilt tolerance of up to 20 degrees. A tilt sensor and a two-axis magnetometer inside the casing capture changes in tilt and horizontal orientation during the course of the deployment. Their output can be fed to internal motors which in turn adjust the actual orientation of each sensor in the casing. The first production models of this instrument have been deployed as OBSs in an active submarine volcanic area along the Juan de Fuca Ridge in the NE Pacific. We are currently finishing units to be deployed for volcano monitoring in Icelandic glaciers. The instrument will be offered as an analogue version or with a 24-bit digitizer fitted into the same casing. A pointed tip can be added to the casing to ease direct burial.

  12. Capabilities of seismic networks and their design. Technical report for the period 1 January-31 March 1987

    SciTech Connect

    Baumstark, R.; Bulin, G.; Campanella, A.; Dysart, P.; Israelsson, H.

    1987-04-01

Results of the assessment of capabilities of seismic networks and the design of networks show that a network's location accuracy approaches a near-maximum when stations are distributed over an azimuthal sector of about 180° and that good depth determination requires stations within about 15° of the focus of the event. The latter suggests that global networks should include 100 or more seismological stations if accurate depth estimation is important. A study of about 100 local and regional events detected by NORESS is a systematic effort to test analytical methods being developed for extracting features from seismic waveforms that can be used to identify regional phases, and for recognizing repeated events from a single source such as quarry blasts. The methods under evaluation include the extraction of frequency-domain spectral parameters and particle motion information. Substantial effort was devoted to developing concepts and preparing for experiments in international exchange of seismic waveform data. This work included developing the technical concepts for a potential global seismic-monitoring system. The Center has been heavily involved in an examination of its data-base management system and the development of extensions to handle new types of data - principally data from seismic arrays and data anticipated from forthcoming GSE experiments.
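The 180° azimuthal-sector finding is commonly quantified through the azimuthal gap of a station distribution as seen from the epicenter. A minimal sketch (hypothetical function name, not from the report):

```python
def azimuthal_gap(azimuths_deg):
    """Largest gap (degrees) between consecutive station azimuths
    as seen from the event epicenter; smaller gaps mean better
    azimuthal coverage and location accuracy."""
    az = sorted(a % 360.0 for a in azimuths_deg)
    gaps = [b - a for a, b in zip(az, az[1:])]
    gaps.append(360.0 - az[-1] + az[0])  # wrap-around gap
    return max(gaps)

# Stations spread over only a 180-degree sector leave a 180-degree gap.
gap = azimuthal_gap([0, 45, 90, 135, 180])
```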

  13. MASSACHUSETTS DEP EELGRASS VERIFIED POINTS

    EPA Science Inventory

    Field verified points showing presence or absence of submerged rooted vascular plants along Massachusetts coastline. In addition to the photo interpreted eelgrass coverage (EELGRASS), this point coverage (EGRASVPT) was generated based on field-verified sites as well as all field...

  14. Basis of Design and Seismic Action for Long Suspension Bridges: the case of the Messina Strait Bridge

    SciTech Connect

    Bontempi, Franco

    2008-07-08

The basis of design for complex structures like suspension bridges is reviewed. Specific attention is devoted to seismic action, to the required performance, and to the associated structural analysis. Uncertainty is addressed in particular by probabilistic and soft-computing techniques. The paper makes specific reference to the work and the experience developed over recent years for the re-design of the Messina Strait Bridge.

  15. Verifiable threshold signature schemes against conspiracy attack.

    PubMed

    Gan, Yuan-Ju

    2004-01-01

In this study, the author has designed new verifiable (t,n) threshold untraceable signature schemes. The proposed schemes have the following properties: (1) Verification: the shadows of the secret distributed by the trusted center can be verified by all of the participants; (2) Security: even if the number of dishonest members exceeds the threshold value, they cannot obtain the system secret parameters, such as the group secret key, or forge other members' individual signatures; (3) Efficient verification: the verifier can verify the group signature easily, and the verification time of the group signature is equivalent to that of an individual signature; (4) Untraceability: the signers of the group signature cannot be traced. PMID:14663852
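The share-verification property (1) is commonly realized with Feldman-style commitments: the dealer publishes commitments to the polynomial coefficients so each participant can check their own shadow. A toy sketch, with illustrative parameters far too small for real security and not the author's actual scheme:

```python
# Toy Feldman-style verifiable secret sharing (illustrative only).
P = 23          # group modulus (p = 2q + 1)
Q = 11          # prime order of the subgroup generated by G
G = 4           # generator of the order-Q subgroup mod P

def deal(secret, coeffs, n):
    """Return shares f(i) for i = 1..n and public commitments G^{a_j}."""
    poly = [secret] + list(coeffs)
    shares = {i: sum(a * i**j for j, a in enumerate(poly)) % Q
              for i in range(1, n + 1)}
    commitments = [pow(G, a, P) for a in poly]
    return shares, commitments

def verify_share(i, share, commitments):
    """Check G^share == prod(C_j^{i^j}) mod P without learning the secret."""
    lhs = pow(G, share, P)
    rhs = 1
    for j, c in enumerate(commitments):
        rhs = (rhs * pow(c, i**j, P)) % P
    return lhs == rhs

shares, commits = deal(secret=7, coeffs=[3], n=4)
ok = all(verify_share(i, s, commits) for i, s in shares.items())
bad = verify_share(1, (shares[1] + 1) % Q, commits)   # tampered shadow fails
```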

  16. Design of an implantable seismic sensor placed on the ossicular chain.

    PubMed

    Sachse, M; Hortschitz, W; Stifter, M; Steiner, H; Sauter, T

    2013-10-01

This paper presents a design guideline for matching a fully implantable middle ear microphone with the physiology of human hearing. The guideline defines the first natural frequency of a seismic sensor placed at the tip of the manubrium mallei with respect to the frequency-dependent hearing of the human ear as well as the deflection of the ossicular chain. A transducer designed in compliance with the guideline presented reduces the range of the output signal while preserving all information obtained by the ossicular chain. In addition to output-signal compression, static deflections, which can mask the tiny motions of the ossicles, are reduced. For guideline verification, a microelectromechanical system (MEMS) based on silicon-on-insulator technology was produced and tested. This prototype is capable of resolving 0.4 pm/Hz with a custom-made read-out circuit. For a bandwidth of 0.1 kHz, this deflection is comparable with the lower threshold of speech (≈ 40 phon). PMID:23810385
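The guideline centres on choosing the sensor's first natural frequency, and for a simple spring-mass seismic transducer the governing relations can be sketched. The values below are illustrative only, not the paper's actual design parameters:

```python
import math

def first_natural_frequency(stiffness_n_per_m, mass_kg):
    """First natural frequency f0 = (1/(2*pi)) * sqrt(k/m)
    of an idealized spring-mass seismic transducer."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

def static_deflection_per_g(stiffness_n_per_m, mass_kg):
    """Proof-mass deflection under 1 g of static acceleration.
    Lowering f0 raises sensitivity but also raises the static
    deflection that can mask the tiny ossicle motions."""
    return mass_kg * 9.81 / stiffness_n_per_m

f0 = first_natural_frequency(1.0, 1e-6)   # 1 N/m spring, 1 mg proof mass
```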

  17. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  18. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

This site study plan describes seismic reflection surveys to be run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point, seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check/select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine if there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  19. Seismic design of low-level nuclear waste repositories and toxic waste management facilities

    SciTech Connect

    Chung, D.H.; Bernreuter, D.L.

    1984-05-08

The elements of typical hazardous waste facilities (HWFs) that are the major contributors to risk are focused on as the elements requiring additional consideration in the design and construction of low-level nuclear waste repositories and HWFs. From a recent study of six typical HWFs it was determined that the factors that contribute most to the human and environmental risk fall into four basic categories: geologic and seismological conditions at each HWF; engineered structures at each HWF; environmental conditions at each HWF; and the nature of the material being released. In selecting and carrying out the six case studies, three groups of hazardous waste facilities were examined: generator industries which treat or temporarily store their own wastes; generator facilities which dispose of their own hazardous wastes on site; and industries in the waste treatment and disposal business. The case studies have a diversity of geologic settings, nearby settlement patterns, and environments. Two sites are above a regional aquifer, two are near a bay important to regional fishing, one is in rural hills, and one is in a desert, although not isolated from nearby towns and a groundwater/surface-water system. From the results developed in the study, it was concluded that the effect of seismic activity on hazardous waste facilities poses a significant risk to the population. Fifteen reasons are given for this conclusion.

  20. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.
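Recognizing two compressional velocities implies a classic two-layer refraction interpretation, in which the head wave along the faster layer overtakes the direct wave beyond a crossover offset. A minimal sketch with hypothetical values, not the ASE's actual velocities or depths:

```python
import math

def crossover_distance(v1, v2, h):
    """Source-receiver offset beyond which the head wave refracted
    along the faster layer (v2) arrives before the direct wave in
    the surface layer (v1) of thickness h (two-layer case)."""
    if v2 <= v1:
        raise ValueError("refractor must be faster than the surface layer")
    return 2.0 * h * math.sqrt((v2 + v1) / (v2 - v1))

# e.g. a 100 m/s surface layer over a 250 m/s layer at 10 m depth
x = crossover_distance(100.0, 250.0, 10.0)
```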

  1. Updated Optimal Designs of Time-Lapse Seismic Surveys for Monitoring CO2 Leakage through Fault Zones

    NASA Astrophysics Data System (ADS)

    Liu, J.; Shang, X.; Sun, Y.; Chen, P.

    2012-12-01

Cost-effective time-lapse seismic surveys are crucial for long-term monitoring of geologic carbon sequestration. Similar to Shang and Huang (2012), in this study we have numerically modeled time-lapse seismic surveys for monitoring CO2 leakage through fault zones, and designed updated optimal surveys for time-lapse seismic data acquisition using elastic-wave sensitivity analysis. When CO2 was confined to a relatively deep region, our results show that the most desirable location for receivers at the surface is on the hanging-wall side of the two fault zones (high-angle normal and reverse faults). The places at the surface most sensitive to changes in P- and S-wave velocities and density are similar to each other, but are often not sensitive to the source location. When CO2 migrates close to the surface, our modeling suggests that the best region at the surface for time-lapse seismic surveys is very sensitive to the source location and to the elastic parameter to be monitored.

  2. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection. In addition, the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem by an approach involving modeling and analysis of the frequency synchronizer.

  3. Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces

    NASA Astrophysics Data System (ADS)

    Garg, Akshay; Singh, Amit

    2012-07-01

Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of this proposed mechanism is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units, i.e. a Mother Rover as a surrogate (carrier) and two Baby Rovers as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each having an individual power supply with solar cells and individual data transmission capabilities, to suitable sites such as the chasmata associated with Valles Marineris, craters, or sand dunes. The Mother Rover deploys these rovers in two opposite directions, and the rovers follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Until the active experiments are complete, the Mother Rover acts as a guiding unit to control the spatial spread of the detection instruments. After active shock experimentation, the Baby Rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering, and landslides. Further experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized for carrying out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of measurements (through concurrent readings) and prolong the life cycle of the overall experimentation. The proposed rover can be customised according to the associated scientific objectives and further needs.

  4. Geological investigation for CO2 storage: from seismic and well data to storage design

    NASA Astrophysics Data System (ADS)

    Chapuis, Flavie; Bauer, Hugues; Grataloup, Sandrine; Leynet, Aurélien; Bourgine, Bernard; Castagnac, Claire; Fillacier, Simon; Lecomte, Antony; Le Gallo, Yann; Bonijoly, Didier

    2010-05-01

The main purpose of this study is to evaluate the techno-economic potential of storing 200 000 tCO2 per year produced by a sugar beet distillery. To reach this goal, an accurate hydro-geological characterisation of a CO2 injection site is of primary importance because it will strongly influence the site selection, the storage design and the risk management. Geological investigation for CO2 storage is usually set in the center or deepest part of sedimentary basins. However, CO2 producers do not always match the geological settings, and so other geological configurations have to be studied. This is the aim of this project, which is located near the south-west border of the Paris Basin, in the Orléans region. Special geometries such as onlaps and pinch-outs of formations against the basement are likely to be observed and so have to be taken into account. Two deep saline aquifers are potentially good candidates for CO2 storage: the Triassic continental deposits capped by the Upper Triassic/Lower Jurassic continental shales, and the Dogger carbonate deposits capped by the Callovian and Oxfordian shales. First, a data review was undertaken to provide the palaeogeographical settings and ideas about the facies, thicknesses and depth of the targeted formations. It was followed by a seismic interpretation. Three hundred kilometres of seismic lines were reprocessed and interpreted to characterize the geometry of the studied area. The main structure identified is the Étampes fault, which affects all the formations.
Apart from the vicinity of the fault where drag folds appear, the layers are sub-horizontal and gently dip and thicken eastwards. Then, interpreted seismic lines, together with well data from more than 50 boreholes, were integrated into a 2D model of the main surfaces using geostatistics (Isatis® and Petrel® softwares). The main difficulty of this step was to generate a realistic model accounting for both the specific geometries linked to the basin border (onlapping, pinching out...) and the faults. While the former only concerns the Triassic, the latter also affects the overlying formations. Regarding the Dogger top surface, it is less than 700 m deep in the western area, which is too shallow for supercritical-state injection. Consequently, the next part of the study focused on the Triassic reservoir and integrated changes in petrophysical properties as a function of lateral lithological variation. This ultimately led to upgrading the model from 2D to 3D in order to perform the simulation of CO2 migration. To achieve this objective, we first applied sequence stratigraphy concepts to the Triassic deposits to compensate for the lack of quantitative petrophysical data. This provided qualitative data about the reservoir heterogeneities, which are crucial for realistic 3D modelling. Paleoenvironmental reconstructions show that the sediment supply direction is WSW-ENE, implying more proximal deposits to the west, and so better reservoir properties. The final step is to use this 3D model to elaborate a flow model to estimate the injectivity rate, the extension of the overpressure within the open aquifer, and the CO2 plume after 30 years of injection. Two injection rates and two well locations were combined into four scenarios. In each case, the fault has been considered a barrier to CO2 migration and the system treated as closed.
In the four cases, results are satisfactory: the overpressure is less than 30% of the initial pressure, and the reservoir capacity is sufficient for the goal of the project. The results of these simulations will then be integrated into the risk analysis of the project, which is of utmost importance to ensure safety and cope with public acceptance. Acknowledgements: This work is supported by the French Ministry of Research (DRRT), the regional Council "Région Centre", the European Regional Development Fund (FEDER) and the BRGM.

  5. Verifiable and Redactable Medical Documents

    PubMed Central

    Brown, Jordan; Blough, Douglas M.

    2012-01-01

    This paper considers how to verify provenance and integrity of data in medical documents that are exchanged in a distributed system of health IT services. Provenance refers to the sources of health information within the document and integrity means that the information was not modified after generation by the source. Our approach allows intermediate parties to redact the document by removing information that they do not wish to reveal. For example, patients can store verifiable health information and provide subsets of it to third parties, while redacting sensitive information that they do not wish employers, insurers, or others to receive. Our method uses a cryptographic primitive known as a redactable signature. We study practical issues and performance impacts of building, redacting, and verifying Continuity of Care Documents (CCDs) that are protected with redactable signatures. Results show that manipulating redactable CCDs provides superior security and privacy with little computational overhead. PMID:23304391
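The redactable-signature idea, signing per-section digests so that content can be removed while verification still succeeds, can be sketched as follows. HMAC stands in here for the real public-key redactable signature primitive (so signer and verifier share a key in this toy, unlike the paper's scheme); all names are illustrative:

```python
import hashlib
import hmac

SECRET_KEY = b"signer-key"   # stand-in for a real signing key pair

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def sign(sections):
    """Sign the concatenation of per-section hashes."""
    digests = [h(s) for s in sections]
    return hmac.new(SECRET_KEY, b"".join(digests), "sha256").digest()

def redact(sections, index):
    """Remove one section's content, keeping only its hash."""
    doc = [("plain", s) for s in sections]
    doc[index] = ("redacted", h(sections[index]))
    return doc

def verify(doc, signature):
    """Verify the signature without seeing redacted content."""
    digests = [v if kind == "redacted" else h(v) for kind, v in doc]
    expected = hmac.new(SECRET_KEY, b"".join(digests), "sha256").digest()
    return hmac.compare_digest(expected, signature)

sections = [b"demographics", b"allergies", b"hiv-status"]
sig = sign(sections)
redacted_doc = redact(sections, 2)   # hide the sensitive section
valid = verify(redacted_doc, sig)    # still verifies
```

Any tampering with an unredacted section changes its digest and invalidates the signature, which is the integrity property the paper relies on.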

  6. Verifying the Hanging Chain Model

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2013-01-01

The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…
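The variable-tension model referred to can be written out explicitly. With x measured upward from the free end, the tension at height x is the weight of the chain hanging below that point:

```latex
% Hanging chain of linear density \rho, length L, x measured from the free end.
% Tension at height x supports the chain below:  T(x) = \rho g x.
\[
  \rho\,u_{tt} \;=\; \frac{\partial}{\partial x}\!\bigl(T(x)\,u_x\bigr)
  \quad\Longrightarrow\quad
  u_{tt} \;=\; g\,\frac{\partial}{\partial x}\!\left(x\,u_x\right).
\]
% Separation of variables u(x,t) = \varphi(x)\cos\omega t gives a Bessel
% equation, with solutions bounded at the free end x = 0:
\[
  \varphi(x) \;=\; J_0\!\left(2\omega\sqrt{x/g}\right),
  \qquad
  \text{natural frequencies fixed by } J_0\!\left(2\omega_n\sqrt{L/g}\right) = 0 .
\]
```

The "modified version" the abstract verifies presumably adjusts this classical model; the form above is the standard starting point.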

  8. Seismic design and evaluation guidelines for the Department of Energy High-Level Waste Storage Tanks and Appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1995-10-01

This document provides seismic design and evaluation guidelines for underground high-level waste storage tanks. The guidelines reflect the knowledge acquired in the last two decades in defining seismic ground motion and calculating hydrodynamic loads, dynamic soil pressures, and other loads for underground tank structures, piping, and equipment. The application of the guidelines is illustrated with examples. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable to other types of tank structures as well. The application of these concepts, and of suitably adjusted versions of them, to other structural types will be addressed in a future version of this document. The original version of this document was published in January 1993. Since then, additional studies have been performed in several areas and the results are included in this revision. Comments received from the users are also addressed. Fundamental concepts supporting the basic seismic criteria contained in the original version have since been incorporated and published in DOE-STD-1020-94 and its technical basis documents. This information has been deleted in the current revision.

  9. Seismic design of steel structures with lead-extrusion dampers as knee braces

    NASA Astrophysics Data System (ADS)

Monir, Habib Saeed; Naser, Ali

    2008-07-01

One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one of these systems; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and the dampers act independently of velocity at frequencies within the seismic range. In this paper lead dampers are considered as knee braces in steel frames and are studied from an economic point of view. Because lead dampers do not obstruct structural panels, they can resolve bracing problems from an architectural point of view. The behavior of these dampers is compared with that of other kinds of dampers such as XADAS and TADAS. The results indicate that lead dampers perform well in absorbing the energy induced by an earthquake and in controlling the seismic movements of multi-story structures.

  10. Seismic design of steel structures with lead-extrusion dampers as knee braces

    SciTech Connect

    Monir, Habib Saeed; Naser, Ali

    2008-07-08

One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one of these systems; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and the dampers act independently of velocity at frequencies within the seismic range. In this paper lead dampers are considered as knee braces in steel frames and are studied from an economic point of view. Because lead dampers do not obstruct structural panels, they can resolve bracing problems from an architectural point of view. The behavior of these dampers is compared with that of other kinds of dampers such as XADAS and TADAS. The results indicate that lead dampers perform well in absorbing the energy induced by an earthquake and in controlling the seismic movements of multi-story structures.

  11. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (2) For probabilistic analyses supporting the demonstration of compliance with preclosure performance objectives, provide a mean seismic hazard curve for the surface facilities area. Results should be consistent with the PSHA for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (3) For annual ground motion exceedance probabilities appropriate for postclosure analyses, provide site-specific seismic time histories (acceleration, velocity, and displacement) for the waste emplacement level. Time histories should be consistent with the PSHA and reflect available knowledge on the limits to extreme ground motion at Yucca Mountain. (4) In support of ground-motion site-response modeling, perform field investigations and laboratory testing to provide a technical basis for model inputs. Characterize the repository block and areas in which important-to-safety surface facilities will be sited. Work should support characterization and reduction of uncertainties in inputs to ground-motion site-response modeling. (5) On the basis of rock mechanics, geologic, and seismic information, determine limits on extreme ground motion at Yucca Mountain and document the technical basis for them. (6) Update the ground-motion site-response model, as appropriate, on the basis of new data. Expand and enhance the technical basis for model validation to further increase confidence in the site-response modeling. (7) Document seismic methodologies and approaches in reports to be submitted to the NRC. (8) Address condition reports.
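Objective (2)'s mean hazard curve is typically queried by log-log interpolation to obtain the ground motion at a target annual exceedance probability. A generic sketch with illustrative, non-site-specific numbers (this is not the Yucca Mountain PSHA):

```python
import math

def ground_motion_at_aep(hazard_curve, target_aep):
    """Log-log interpolate a hazard curve, given as (PGA in g, annual
    exceedance probability) pairs sorted by increasing PGA, to find
    the PGA at a target annual exceedance probability."""
    for (x1, p1), (x2, p2) in zip(hazard_curve, hazard_curve[1:]):
        if p2 <= target_aep <= p1:
            f = (math.log(target_aep) - math.log(p1)) / (math.log(p2) - math.log(p1))
            return math.exp(math.log(x1) + f * (math.log(x2) - math.log(x1)))
    raise ValueError("target probability outside curve range")

# Illustrative mean hazard curve: PGA (g) vs. annual exceedance probability
curve = [(0.1, 1e-2), (0.3, 1e-3), (0.6, 1e-4), (1.0, 1e-5)]
pga = ground_motion_at_aep(curve, 1e-3)
```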

  12. Proceedings of seismic engineering 1991

    SciTech Connect

    Ware, A.G. )

    1991-01-01

This book contains the proceedings of the Seismic Engineering Technical Subcommittee of the ASME Pressure Vessels and Piping Division. Topics covered include: seismic damping and energy absorption, advanced seismic analysis methods, new analysis techniques and applications of advanced methods, seismic supports and test results, margins inherent in current design methods and risk assessment, and component and equipment qualification.

  13. Utilization of a finite element model to verify spent nuclear fuel storage rack welds

    SciTech Connect

    Nitzel, M.E.

    1998-07-01

Elastic and plastic finite element analyses were performed for the inner tie block assembly of a 25-port fuel rack designed for installation at the Idaho National Engineering and Environmental Laboratory (INEEL) Idaho Chemical Processing Plant (ICPP). The model was specifically developed to verify the adequacy of certain welds joining components of the fuel storage rack assembly. The work scope for this task was limited to an investigation of the stress levels in the inner tie welds when the rack was subjected to seismic loads. Structural acceptance criteria used for the elastic calculations performed were as defined by the rack's designer. Structural acceptance criteria used for the plastic calculations performed as part of this effort were as defined in Subsection NF and Appendix F of Section III of the ASME Boiler and Pressure Vessel Code. The results confirm that the welds joining the inner tie block to the surrounding rack structure meet the acceptance criteria. The analysis results verified that the inner tie block welds should be capable of transferring the expected seismic load without structural failure.

  14. Verifying Correct Functionality of Avionics Subsystems

    NASA Technical Reports Server (NTRS)

    Meuer, Ben t.

    2005-01-01

This project focuses on the testing of the telecommunications interface subsystem of the Multi-Mission System Architecture Platform to ensure proper functionality. The Multi-Mission System Architecture Platform is a set of basic tools designed to be used in future spacecraft. The responsibilities of the telecommunications interface include communication between the spacecraft and ground teams as well as acting as the bus controller for the system. The tests completed include bitwise read/write tests of each register, testing of status bits, and verification of various bus controller activities. Testing is accomplished through the use of software-based simulations run on an electronic design of the system. The tests are written in the Verilog Hardware Description Language, and they simulate specific states and conditions in the telecommunications interface. Upon successful completion, the output is examined to verify that the system responded appropriately.

  15. Application of Seismic Design Requirements to Cold Vacuum Drying (CVD) Facility Structures and Systems and Components

    SciTech Connect

    CREA, B.A.

    1999-11-15

    The methodology followed in assignment of Performance Class (PC) for Natural Phenomena Hazards (NPH) seismic loads for Cold Vacuum Drying Facility (CVDF) Structures, Systems and Components is defined. The loading definition associated with each PC and structure, system and component is then defined.

  16. Southern California Seismic Network: New Design and Implementation of Redundant and Reliable Real-time Data Acquisition Systems

    NASA Astrophysics Data System (ADS)

    Saleh, T.; Rico, H.; Solanki, K.; Hauksson, E.; Friberg, P.

    2005-12-01

The Southern California Seismic Network (SCSN) handles more than 2500 high-data-rate channels from more than 380 seismic stations distributed across southern California. These data are imported in real time from dataloggers, earthworm hubs, and partner networks. The SCSN also exports data to eight different partner networks. Both the imported and exported data are critical for emergency response and scientific research. Previous data acquisition systems were complex and difficult to operate, because they grew in an ad hoc fashion to meet the increasing needs for distributing real-time waveform data. To maximize reliability and redundancy, we apply best-practice methods from computer science for implementing the software and hardware configurations for import, export, and acquisition of real-time seismic data. Our approach makes use of failover software designs, methods for dividing labor diligently amongst the network nodes, and state-of-the-art networking redundancy technologies. To facilitate maintenance and daily operations we seek to provide some separation between major functions such as data import, export, acquisition, archiving, real-time processing, and alarming. As an example, we make waveform import and export functions independent by operating them on separate servers. Similarly, two independent servers provide waveform export, allowing data recipients to implement their own redundancy. The data import is handled differently by using one primary server and a live backup server. These data import servers run failover software that allows automatic switching of roles from primary to shadow in case of failure. Similar to the classic earthworm design, all the acquired waveform data are broadcast onto a private network, which allows multiple machines to acquire and process the data.
As we separate data import and export from acquisition, we are also working on new approaches to separate real-time processing and rapid, reliable archiving of real-time data. Further, improved network security is an integral part of the new design. Redundant firewalls will provide secure data imports, exports, and acquisition, as well as DMZ zones for web servers and other publicly available servers. We will present the detailed design of this new configuration, which is currently being implemented by the SCSN at Caltech. The design principles are general enough to be of use to most regional seismic networks.
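The primary/shadow failover described above can be sketched as a heartbeat check. This is a toy model under stated assumptions: the class and function names, timeout, and role-swap logic are hypothetical illustrations, not the SCSN's actual failover software.

```python
import time

class ImportServer:
    """Toy model of a data-import server in a primary/shadow pair
    (hypothetical names; illustrative only)."""

    def __init__(self, name, role):
        self.name = name
        self.role = role            # "primary" or "shadow"
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        self.last_heartbeat = time.monotonic()

def promote_if_stale(primary, shadow, timeout_s=5.0, now=None):
    """If the primary has missed heartbeats for longer than timeout_s,
    the shadow takes over the data-import role."""
    now = time.monotonic() if now is None else now
    if now - primary.last_heartbeat > timeout_s:
        primary.role, shadow.role = "shadow", "primary"
    return primary.role, shadow.role

# Simulated failure: pretend the primary's last heartbeat was 10 s ago.
p = ImportServer("import-1", "primary")
s = ImportServer("import-2", "shadow")
p.last_heartbeat -= 10.0
roles = promote_if_stale(p, s)
print(roles)  # ('shadow', 'primary')
```

In a real deployment the heartbeat would travel over the private network, and the promoted shadow would also take over the primary's service address.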

  17. Verifying differential pressure transmitter operation

    SciTech Connect

    Corley, M.A.; O'Neal, D.L.

    1999-07-01

The monitoring of chilled and hot water consumption has become more important in recent years. However, reduction of consumption through energy-conserving retrofits has significantly reduced flow velocities. This paper presents the results of a study performed to verify that differential pressure transmitters used in chilled and hot water metering capture actual conditions within acceptable accuracy, even at low flow rates. The results fell into three categories: transmitters whose expected and actual output coincided, transmitters that exhibited offset and slope errors, and transmitters that exhibited errors from unknown sources.
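The first two result categories above (coincident, offset/slope error) suggest a simple verification check: fit a line to measured versus expected output and compare it to the ideal intercept 0 and slope 1. A minimal sketch, with illustrative tolerances and simulated data rather than the study's measurements:

```python
def fit_line(x, y):
    """Least-squares line y = a + b*x through the calibration points."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def classify(a, b, tol_offset=0.05, tol_slope=0.02):
    """Category per the study's taxonomy (tolerances are illustrative)."""
    if abs(a) <= tol_offset and abs(b - 1.0) <= tol_slope:
        return "coincident"
    return "offset/slope error"

expected = [4.0, 8.0, 12.0, 16.0, 20.0]          # ideal 4-20 mA output
actual = [0.4 + 1.025 * e for e in expected]     # simulated faulty transmitter
a, b = fit_line(expected, actual)
print(round(a, 3), round(b, 3), classify(a, b))  # 0.4 1.025 offset/slope error
```

A transmitter whose residuals remain large after the linear fit would fall into the third category, errors from unknown sources.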

  18. A successful 3D seismic survey in the "no-data zone," offshore Mississippi delta: Survey design and refraction static correction processing

    SciTech Connect

    Carvill, C.; Faris, N.; Chambers, R.

    1996-12-31

This is a success story of survey design and refraction static correction processing for a large 3D seismic survey in the South Pass area of the Mississippi delta. In this transition zone, subaqueous mudflow gullies and lobes of the delta, in various states of consolidation and gas saturation, are strong absorbers of seismic energy. Seismic waves penetrating the mud are severely restricted in bandwidth and variously delayed by changes in mud velocity and thickness. Using a delay-time refraction static correction method, the authors find that the static corrections compensating for these delays commonly vary by 150 ms over a short distance. Application of the static corrections markedly improves the seismic stack volume. This paper shows that intelligent survey design and delay-time refraction static correction processing economically eliminate the historic no-data status of this area.
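The size of such statics can be illustrated with a simple layer-replacement calculation: the shift is the difference between traveltime through the slow layer and through a faster replacement velocity. The thicknesses and velocities below are illustrative assumptions, not the survey's values:

```python
def static_correction_ms(thickness_m, v_layer_mps, v_replacement_mps):
    """One-way static shift from replacing a slow near-surface layer (e.g.,
    unconsolidated mud) with a faster replacement velocity."""
    return 1000.0 * (thickness_m / v_layer_mps - thickness_m / v_replacement_mps)

# Two nearby receiver stations over mud of different thickness and velocity:
t1 = static_correction_ms(60.0, 400.0, 1600.0)
t2 = static_correction_ms(120.0, 450.0, 1600.0)
print(round(t1, 1), round(t2, 1), round(t2 - t1, 1))  # 112.5 191.7 79.2
```

Modest changes in mud thickness and velocity produce station-to-station static differences on the order of tens to 100+ ms, consistent with the variations reported above.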

  19. A study on the seismic fortification level of offshore platform in Bohai Sea of China

    NASA Astrophysics Data System (ADS)

    Lu, Y.

    2010-12-01

The Chinese sea areas are important sources of offshore petroleum resources, and they are also seismically active regions. Fixed offshore platforms (OPs) are fundamental facilities for marine resource exploitation. They are usually situated in a complex and severe environment and must endure many environmental loads over their life span, so damage to their structures can result in serious disasters. Among these environmental loads, the seismic load is highly destructive and unpredictable. When wind, wave, and current loads are not overly severe, seismic resistance dominates the strength design of platforms. Furthermore, strong earthquakes have occurred recently, or in the historical record, in all the sea areas of oil/gas exploitation in China. Seismic design of fixed OPs is therefore a very important issue. With the development of marine exploration and earthquake research in these sea areas, extensive studies on the seismotectonic environment and seismicity characteristics of the Chinese sea areas have been performed; meanwhile, more and more experience and data have been accumulated from OP design practice, laying a foundation for studying and establishing a seismic design standard for OPs. This paper first gives an overall view of the seismic environment of the sea areas of China and then, taking the Bohai Sea seismic risk study as an example, introduces a shape factor K to characterize the seismic risk distribution in sub-regions of the Bohai Sea. Based on the seismic design ground motions for 46 platforms in the Bohai Sea, a statistical analysis was performed on the peak ground acceleration (PGA) ratios at two different probability levels. In accordance with the two-stage design method, a scheme of two seismic design levels is proposed, and seismic design objectives are established for the strength-level earthquake and the ductility-level earthquake, respectively. 
By analogy and comparison with the Chinese seismic design code for buildings, it is proposed that the probability levels for the strength-level and ductility-level earthquakes take return periods of 200 years and 1000-2500 years, respectively. Comparison with codes developed by relevant industry institutions verifies the rationality and safety of the seismic fortification objectives for OPs. Finally, the seismic parameters in the sub-regions of the Bohai Sea are calculated based on seismic risk zoning and ground motion intensity maps.
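Return periods translate into exceedance probabilities under the usual Poisson assumption, P = 1 - exp(-t/T_R). A quick check for a nominal 50-year exposure (the exposure time is an assumption for illustration, not taken from the paper):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """Poisson model: P(at least one exceedance in t years) = 1 - exp(-t/T_R)."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# Return periods proposed for the strength- and ductility-level earthquakes:
probs = {rp: exceedance_probability(rp, 50.0) for rp in (200, 1000, 2500)}
for rp in (200, 1000, 2500):
    print(rp, round(100 * probs[rp], 1))   # % chance in a 50-year exposure
```

The 200-year event has roughly a 22% chance of being exceeded in 50 years, versus about 5% and 2% for the 1000- and 2500-year events, which is the sense in which the two levels separate strength and ductility checks.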

  20. Seismic design spectra 200 West and East Areas DOE Hanford Site, Washington

    SciTech Connect

    Tallman, A.M.

    1995-12-31

This document presents equal-hazard response spectra for the W236A project for the 200 East and West new high-level waste tanks. The hazard level is based upon WHC-SD-W236A-TI-002, Probabilistic Seismic Hazard Analysis, DOE Hanford Site, Washington. Spectral acceleration amplification is plotted against frequency (Hz) for horizontal and vertical motion and attached to this report. The vertical amplification is based upon the preliminary draft revision of Standard ASCE 4-86. The vertical spectral acceleration is equal to the horizontal at frequencies above 3.3 Hz because of near-field sources (less than 15 km).

  1. Seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1993-01-01

This document provides guidelines for the design and evaluation of underground high-level waste storage tanks under seismic loads. Attempts were made to reflect the knowledge acquired in the last two decades in the areas of defining the ground motion and calculating hydrodynamic loads and dynamic soil pressures for underground tank structures. The application of the analysis approach is illustrated with an example. The guidelines are developed for a specific design of underground storage tank, namely double-shell structures; however, the methodology discussed is applicable to other types of tank structures as well. The application of these concepts, and of suitably adjusted versions of them, to other structural types will be addressed in a future version of this document.

  2. COMPARISON OF HORIZONTAL SEISMIC COEFFICIENTS DEFINED BY CURRENT AND PREVIOUS DESIGN STANDARDS FOR PORT AND HARBOR FACILITIES

    NASA Astrophysics Data System (ADS)

    Takahashi, Hidenori; Ikuta, Akiho

The Japanese design standard for port and harbor facilities was revised in 2007, modifying the method used to calculate the horizontal seismic coefficient, kh. This comprehensive change means that quay walls designed to the previous standard could lack earthquake resistance in terms of the current standard. In the present study, the coefficients kh calculated by the two standards were compared for existing quay walls constructed in the Kanto area of Japan. In addition, the factors affecting the relationship between the two coefficients were identified by means of multiple regression analyses. The current-standard kh exceeded the previous-standard kh for only 16% of the walls. According to the multiple regression analyses, the ratio of the two coefficients tended to increase for quay walls located in a specific port and for walls with large heights and small importance factors.
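A multiple regression of the kh ratio on wall height and importance factor, of the kind the study performs, can be sketched with ordinary least squares. The data below are synthetic, constructed so the fit recovers known coefficients; they are not the paper's quay-wall data:

```python
def lstsq(X, y):
    """Ordinary least squares via the normal equations, solved by Gaussian
    elimination with partial pivoting. X is a list of rows [1, x1, x2]."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [arj - f * acj for arj, acj in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for c in reversed(range(k)):
        beta[c] = (b[c] - sum(A[c][j] * beta[j] for j in range(c + 1, k))) / A[c][c]
    return beta

# Synthetic data built as ratio = 0.5 + 0.04*H - 0.1*IF, so the regression
# should recover those coefficients:
H = [5.0, 7.5, 10.0, 12.5, 15.0]       # wall height, m
IF = [1.5, 1.2, 1.0, 1.2, 1.0]         # importance factor
ratio = [0.5 + 0.04 * h - 0.1 * i for h, i in zip(H, IF)]
beta = lstsq([[1.0, h, i] for h, i in zip(H, IF)], ratio)
print([round(v, 3) for v in beta])  # [0.5, 0.04, -0.1]
```

The signs of the fitted coefficients (positive for height, negative for importance factor) mirror the trend the study reports for the kh ratio.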

  3. A Seismic Isolation Application Using Rubber Bearings; Hangar Project in Turkey

    SciTech Connect

    Sesigur, Haluk; Cili, Feridun

    2008-07-08

Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport in Istanbul, Turkey. A seismic isolation system with the isolation layer arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. To verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has approximately one quarter the base shear of the non-isolated structure.

  4. Experimentally Verified Numerical Model of Thixoforming Process

    SciTech Connect

    Bialobrzeski, Andrzej; Kotynia, Monika; Petera, Jerzy

    2007-04-07

A new mathematical model of thixotropic casting, based on the two-phase approach for semi-solid metal alloys, is presented. The corresponding numerical algorithm has been implemented in original computer software using the finite element method in 3-D geometry and a Lagrangian approach to flow description. The model has been verified by means of an original thixoforming experiment in a model die specially designed for this purpose. Some particular cases of such casting and the influence of operating parameters on the segregation phenomenon are discussed.

  5. Simulation and Processing Seismic Data in Complex Geological Models

    NASA Astrophysics Data System (ADS)

    Forestieri da Gama Rodrigues, S.; Moreira Lupinacci, W.; Martins de Assis, C. A.

    2014-12-01

Seismic simulations in complex geological models are useful for verifying some limitations of seismic data. In this project, different geological models were designed to analyze difficulties encountered in the interpretation of seismic data. These data will also be made available to LENEP/UENF students for testing new tools to assist in seismic data processing. The geological models were created considering characteristics found in oil exploration. We simulated geological media with volcanic intrusions, salt domes, faults, pinch-outs, and layers far from the surface (Kanao, 2012). We used the software Tesseral Pro to simulate the seismic acquisitions. The acquisition geometries simulated were common-offset, end-on, and split-spread (Figure 1). Data acquired with constant offset require fewer processing routines. The processing flow, built with tools available in the Seismic Unix package (for details, see Pennington et al., 2005), consisted of geometric spreading correction, deconvolution, attenuation correction, and post-stack depth migration. In processing the data acquired with end-on and split-spread geometries, we included velocity analysis and NMO correction routines. Although we analyzed synthetic data and carefully applied each processing routine, we observed limitations of the seismic reflection method in imaging thin layers, layers at great depth, layers with low impedance contrast, and faults.
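The NMO correction mentioned in the flow above flattens the hyperbolic reflection moveout t(x) = sqrt(t0^2 + (x/v)^2). A minimal sketch with illustrative values (not the model parameters used in the project):

```python
import math

def nmo_time(t0_s, offset_m, v_mps):
    """Hyperbolic reflection moveout: t(x) = sqrt(t0^2 + (x/v)^2)."""
    return math.sqrt(t0_s ** 2 + (offset_m / v_mps) ** 2)

def nmo_shift_ms(t0_s, offset_m, v_mps):
    """Time shift removed when the NMO correction flattens the event to t0."""
    return 1000.0 * (nmo_time(t0_s, offset_m, v_mps) - t0_s)

# A 1.0 s event recorded at 1000 m offset in a 2000 m/s medium:
shift = nmo_shift_ms(1.0, 1000.0, 2000.0)
print(round(shift, 1))  # 118.0
```

Velocity analysis amounts to finding the v that best flattens the event across offsets; with common-offset data this step (and the correction) can be skipped, which is why that geometry needs fewer routines.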

  6. Verifying a Computer Algorithm Mathematically.

    ERIC Educational Resources Information Center

    Olson, Alton T.

    1986-01-01

    Presents an example of mathematics from an algorithmic point of view, with emphasis on the design and verification of this algorithm. The program involves finding roots for algebraic equations using the half-interval search algorithm. The program listing is included. (JN)
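The half-interval search the article describes is the classic bisection method: repeatedly halve an interval on which the function changes sign. A minimal sketch (the article's own program listing is not reproduced here):

```python
def bisect_root(f, lo, hi, tol=1e-10):
    """Half-interval (bisection) search for a root of f in [lo, hi].
    Requires f(lo) and f(hi) to have opposite signs."""
    assert f(lo) * f(hi) <= 0, "no sign change on the interval"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid        # the sign change is in the left half
        else:
            lo = mid        # the sign change is in the right half
    return (lo + hi) / 2

# Root of x^3 - 2x - 5 (a classic test equation) between 2 and 3:
r = bisect_root(lambda x: x**3 - 2*x - 5, 2.0, 3.0)
print(round(r, 6))  # 2.094551
```

The verification argument the article emphasizes is the loop invariant: the root always stays bracketed between lo and hi, and the bracket halves each iteration, so the method converges.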

  7. Verify MesoNAM Performance

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors on specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) bias (mean difference), 2) standard deviation of the bias, 3) root mean square error (RMSE), and 4) a hypothesis test for bias = 0. The 45 WS LWOs use the MesoNAM to support launch weather operations; however, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature, and dew point to the observed values from the sensors on the wind towers. The data were stratified by tower sensor, month, and onshore/offshore wind direction based on the orientation of the coastline relative to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record was based on the operational start of the current MesoNAM in mid-August 2006, so the task covered the first full month of data, September 2006, through May 2010. The analysis of model performance indicated: a) accuracy decreased as the forecast valid time from model initialization increased, b) there was a diurnal signal in T with a cool bias during the late night and a warm bias during the afternoon, c) there was a diurnal signal in Td with a low bias during the afternoon and a high bias during the late night, and d) the model parameters at each vertical level most closely matched the observed parameters at heights closest to those vertical levels. The AMU developed a GUI consisting of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWO to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics. 
The objective statistics give the LWOs knowledge of the model's strengths and weaknesses, and the GUI allows quick access to the data, which will result in improved forecasts for operations.
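The first three statistics in the list above can be computed directly from forecast/observation pairs. The numbers below are illustrative, not actual tower data:

```python
import math

def verification_stats(forecast, observed):
    """Bias (mean difference), standard deviation of the differences, and
    RMSE -- three of the four statistics listed above."""
    d = [f - o for f, o in zip(forecast, observed)]
    n = len(d)
    bias = sum(d) / n
    sd = math.sqrt(sum((x - bias) ** 2 for x in d) / (n - 1))
    rmse = math.sqrt(sum(x * x for x in d) / n)
    return bias, sd, rmse

# Illustrative forecast/observed temperature pairs (deg C):
fc = [20.1, 21.4, 22.0, 23.2]
ob = [19.8, 21.0, 22.5, 22.9]
bias, sd, rmse = verification_stats(fc, ob)
print(round(bias, 3), round(sd, 3), round(rmse, 3))  # 0.125 0.419 0.384
```

The fourth statistic, the hypothesis test for bias = 0, would then be a t-test of bias against sd/sqrt(n).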

  8. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  9. Verifying Deadlock-Freedom of Communication Fabrics

    NASA Astrophysics Data System (ADS)

    Gotmanov, Alexander; Chatterjee, Satrajit; Kishinevsky, Michael

Avoiding message-dependent deadlocks in communication fabrics is critical for modern microarchitectures. If discovered late in the design cycle, deadlocks lead to missed project deadlines and suboptimal design decisions. One approach to avoiding this problem is to gain a high level of confidence in an early microarchitectural model. However, formal proofs of liveness even on abstract models are hard due to the large number of queues and distributed control. In this work we address liveness verification of communication fabrics described in the form of high-level microarchitectural models that use a small set of well-defined primitives. We prove that under certain realistic restrictions, deadlock freedom can be reduced to unsatisfiability of a system of Boolean equations. Using this approach, we have automatically verified liveness of several non-trivial models (derived from industrial microarchitectures) where state-of-the-art model checkers failed and pen-and-paper proofs were either tedious or unknown.
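The reduction to Boolean unsatisfiability can be illustrated with a toy brute-force check. The encoding below is purely illustrative and far simpler than the paper's actual primitives: a variable marks a queue as permanently blocked, constraints express blocking dependencies, and if no assignment satisfies the "deadlock hypothesis" the system is deadlock-free.

```python
from itertools import product

def satisfiable(n_vars, clauses):
    """Brute-force SAT check. Clauses are lists of ints: +i is literal x_i,
    -i is its negation (1-indexed)."""
    for assign in product([False, True], repeat=n_vars):
        if all(any(assign[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return True
    return False

# Toy encoding (illustrative only): x_i = "queue i is permanently blocked".
clauses = [[-1, 2],   # if queue 1 is blocked, queue 2 is blocked
           [-2, 1],   # if queue 2 is blocked, queue 1 is blocked
           [1, 2],    # deadlock hypothesis: some queue is blocked
           [-1],      # queue 1 is eventually drained (cannot stay blocked)
           [-2]]      # queue 2 is eventually drained
sat = satisfiable(2, clauses)
print("deadlock-free" if not sat else "possible deadlock")  # deadlock-free
```

A real implementation would hand the same clauses to an industrial SAT solver, which is what makes the approach scale where explicit-state model checking does not.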

  10. Seismic Waveguide of Metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Hoon; Das, Mukunda P.

We developed a new earthquake-resistant design method, using acoustic metamaterials, to supplement conventional aseismic systems. The device is an attenuator that reduces the amplitude of a seismic wave exponentially. By constructing a cylindrical shell-type waveguide composed of many Helmholtz resonators that creates a stop band for the seismic frequency range, we convert the seismic wave into an attenuated one without touching the building we want to protect. It is a mechanical way to convert the seismic energy into sound and heat.
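The stop band of each resonator is centered on the Helmholtz resonance, f = (c/2*pi)*sqrt(A/(V*L)). A sketch with illustrative dimensions (end corrections ignored; these are not the paper's design values):

```python
import math

def helmholtz_frequency(c, neck_area, neck_length, cavity_volume):
    """Helmholtz resonance f = (c / 2*pi) * sqrt(A / (V * L)).
    c: sound speed (m/s), A: neck area (m^2), L: neck length (m),
    V: cavity volume (m^3). Neck-end corrections are ignored."""
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * neck_length))

# Illustrative resonator sized toward the seismic band (~0.1-10 Hz); a real
# device would need very large cavities:
f = helmholtz_frequency(c=343.0, neck_area=0.01, neck_length=0.5, cavity_volume=10.0)
print(round(f, 2))  # 2.44
```

The inverse scaling with cavity volume shows why targeting low seismic frequencies pushes the resonators toward building-scale dimensions.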

  11. Design and development of safety evaluation system of buildings on a seismic field based on the network platform

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhang, Lei; Chen, Xiangzhao; Zhang, Xinghua

    2015-03-01

This paper describes an on-site earthquake safety evaluation system for buildings, developed on a network platform. The system embeds quantitative research results developed in accordance with the provisions of Post-earthquake Field Works, Part 2: Safety Assessment of Buildings, GB18208.2-2001, packaged as an easy-to-use software platform. The system is aimed at allowing engineering professionals, civil engineering technicians, or earthquake-affected victims on site to assess damaged buildings over a network after earthquakes. The authors studied the function structure, the process design of the safety evaluation module, and the hierarchical analysis algorithm module of the system in depth, and developed the general architecture, development technology, and database design of the system. Technologies such as hierarchical architecture design and Java EE were used in the system development, and MySQL5 was adopted for the database. The result is a complete evaluation process of information collection, safety evaluation, and output of damage and safety degrees, as well as query and statistical analysis of assessed buildings. The system can play a positive role in sharing expert post-earthquake experience and promoting safety evaluation of buildings in a seismic field.

  12. A new instrumentation to measure seismic waves attenuation

    NASA Astrophysics Data System (ADS)

    Tisato, N.; Madonna, C.; Boutareaud, S.; Burg, J.

    2010-12-01

Attenuation of seismic waves is the general expression describing the loss of energy of an elastic perturbation during its propagation in a medium. As a geophysical method, measuring the attenuation of seismic waves is a key to uncovering essential information about the fluid saturation of buried rocks. Attenuation of seismic waves depends on several mechanisms. In the case of saturated rock, fluids play an important role: seismic waves create zones of overpressure by mobilizing the fluids in the pores of the rock. Starting from Gassmann-Biot theory (Gassmann, 1951), several models (e.g., White, 1975; Mavko and Jizba, 1991) have been formulated to describe the energy absorption by flow of fluids. According to Mavko et al. (1998), for rock with permeability equal to or less than 1 D, fluid viscosity between 1 cP and 10 cP, and low-frequency seismic waves (< 100 Hz), the most important processes that subtract energy from the seismic waves are squirt flow and patchy saturation. Numerical models such as Quintal et al. (2009) calculate how a patchy-saturated vertical rock section (25 cm in height), after stress steps of several kPa (e.g., 30 kPa), shows a dissimilar increase in pore pressure between gas-saturated and liquid-saturated layers. The Rock Deformation Laboratory at ETH-Zürich has designed and set up a new pressure vessel to measure seismic wave attenuation in rocks at frequencies between 0.1 and 100 Hz and to verify the predicted influence of seismic waves on the pore pressure in patchy-saturated rocks. We present this pressure vessel, which can reach confining pressures of 25 MPa and holds a 250 mm long, 76 mm diameter sample. Dynamic stress is applied at the top of the rock cylinder by a piezoelectric motor that can generate a stress of several kPa (> 100 kPa) in less than 10 ms. The vessel is equipped with 5 pressure sensors buried within the rock sample, a load cell, and a strain sensor to measure axial shortening while the motor generates the seismic waves. 
We designed and built the sensor conditioning system, and the acquisition software was developed in Matlab. We present the first results, at room pressure and temperature, based on measurements of the pore fluid pressure increase in a sandstone sample with a permeability of 200 to 500 mD, partially saturated with water and air. These preliminary results show the reliability of this new instrumentation for measuring seismic wave attenuation at low frequency and for verifying the pore fluid flow driven by seismic waves.
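Low-frequency attenuation is usually reported as 1/Q. In sub-resonance (forced-oscillation) tests of the kind this apparatus enables, it is commonly obtained from the stress-strain phase lag via 1/Q = tan(phi); the lag value below is an illustrative assumption, not a measurement from this study:

```python
import math

def quality_factor(phase_lag_rad):
    """Sub-resonance (forced-oscillation) attenuation estimate:
    1/Q = tan(phi), where phi is the stress-strain phase lag."""
    return 1.0 / math.tan(phase_lag_rad)

q = quality_factor(0.02)   # a 0.02 rad lag between stress and strain
print(round(q, 1))  # 50.0
```

A lag of a few hundredths of a radian thus corresponds to Q on the order of tens, typical of partially saturated sedimentary rock at seismic frequencies.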

  13. New finite element models and seismic analyses of the telescopes at W.M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    Kan, Frank W.; Sarawit, Andrew T.; Callahan, Shawn P.; Pollard, Mike L.

    2014-07-01

On 15 October 2006 a large earthquake damaged both telescopes at the W. M. Keck Observatory, resulting in weeks of observing downtime. A significant portion of the downtime was attributed to recovery efforts repairing damage to the telescope bearing journals, radial pad support structures, and encoder subsystems. Inadequate damping and strength in the seismic restraint design and the lack of break-away features on the azimuth radial pads were key design deficiencies. In May 2011 a feasibility study was conducted to review several options to enhance the protection of the telescopes, with the goal of minimizing the time to bring the telescopes back into operation after a large seismic event. At that time it was determined that new finite element models of the telescope structures were required to better understand the telescope responses to the design earthquakes required by local governing building codes and to the USGS seismic data collected at the site on 15 October 2006. These models were verified by comparing the calculated natural frequencies to the measured frequencies obtained from the servo identification study, and by comparing the time-history responses of the telescopes under the October 2006 seismic data to the actual observed damage. The results of two finite element methods, response spectrum analysis and time history analysis, used to determine the seismic demand forces and seismic response of each telescope to the design earthquakes, were compared. These models can be used to evaluate alternate seismic restraint design options for both Keck telescopes.

  14. Earthquake damage potential and critical scour depth of bridges exposed to flood and seismic hazards under lateral seismic loads

    NASA Astrophysics Data System (ADS)

    Song, Shin-Tai; Wang, Chun-Yao; Huang, Wen-Hsiu

    2015-12-01

Many bridges located in seismic hazard regions suffer from serious foundation exposure caused by riverbed scour. Loss of surrounding soil significantly reduces the lateral strength of pile foundations. When the scour depth exceeds a critical level, the strength of the foundation is insufficient to withstand the imposed seismic demand, creating the potential for unacceptable damage to the piles during an earthquake. This paper presents an analytical approach to assess the earthquake damage potential of bridges with foundation exposure and to identify the critical scour depth at which the seismic performance of a bridge departs from the original design. The approach employs the well-accepted response spectrum analysis method to determine the maximum seismic response of a bridge. The damage potential of a bridge is assessed by comparing the imposed seismic demand with the strengths of the column and the foundation. The versatility of the analytical approach is illustrated with a numerical example and verified by nonlinear finite element analysis. The analytical approach is also demonstrated to successfully determine the critical scour depth. Results highlight that relatively shallow scour depths can cause foundation damage during an earthquake, even for bridges designed to provide satisfactory seismic performance.

  15. Robust design of mass-uncertain rolling-pendulum TMDs for the seismic protection of buildings

    NASA Astrophysics Data System (ADS)

    Matta, Emiliano; De Stefano, Alessandro

    2009-01-01

Commonly used for mitigating wind- and traffic-induced vibrations in flexible structures, passive tuned mass dampers (TMDs) are rarely applied to the seismic control of buildings, their effectiveness against impulsive loads being conditional upon the adoption of large mass ratios. Instead of resorting to cumbersome metal or concrete devices, this paper suggests meeting that condition by turning non-structural masses sometimes available atop buildings into TMDs. An innovative roof-garden TMD, for instance, appears to be a promising tool capable of combining environmental and structural protection in one device. Unfortunately, because the amount of these masses is generally variable, the resulting mass-uncertain TMD (MUTMD) is prone to mistuning and loss of control effectiveness. In an attempt to minimize such adverse effects, robust analysis and synthesis against mass variations are applied in this study to MUTMDs of the rolling-pendulum type, a configuration characterized by a mass-independent natural period. Through simulations under harmonic and recorded ground motions of increasing intensity, the performance of circular and cycloidal rolling-pendulum MUTMDs is evaluated on an SDOF structure in order to illustrate their respective advantages as well as the drawbacks inherent in their non-linear behavior. A possible implementation of a roof-garden TMD on a real building structure is described and its control efficacy numerically demonstrated, showing that in practical applications MUTMDs can be a good alternative to traditional TMDs.
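The mass-independence of the rolling-pendulum period can be checked with the small-oscillation formula for a solid cylinder rolling inside a circular track, T = 2*pi*sqrt(3*(R - r)/(2*g)). The dimensions below are illustrative assumptions, not the paper's design:

```python
import math

def rolling_cylinder_period(R, r, g=9.81):
    """Small-oscillation period of a solid cylinder (radius r) rolling
    without slipping inside a circular track (radius R):
    T = 2*pi*sqrt(3*(R - r)/(2*g)). Mass does not appear -- the property
    that makes the device robust to mass uncertainty."""
    return 2 * math.pi * math.sqrt(3 * (R - r) / (2 * g))

# Track radius chosen (illustratively) to tune near a ~2 s structural period:
T = rolling_cylinder_period(R=0.70, r=0.05)
print(round(T, 2))  # 1.98
```

Because only the geometry enters, a roof garden that gains or loses mass stays tuned, whereas a spring-mass TMD would detune as sqrt(m).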

  16. Seismic hazard analyses for Taipei city including deaggregation, design spectra, and time history with excel applications

    NASA Astrophysics Data System (ADS)

    Wang, Jui-Pin; Huang, Duruo; Cheng, Chin-Tung; Shao, Kuo-Shin; Wu, Yuan-Chieh; Chang, Chih-Wei

    2013-03-01

Given the difficulty of earthquake forecasting, Probabilistic Seismic Hazard Analysis (PSHA) has been the method of choice for estimating site-specific ground motion or response spectra in earthquake engineering and engineering seismology. In this paper, the first in-depth PSHA study for Taipei, the economic center of Taiwan with a population of six million, was carried out. Unlike the very recent PSHA study for Taiwan, this study includes the follow-up hazard deaggregation, response spectra, and earthquake motion recommendations. Hazard deaggregation results show that moderate-size, near-source earthquakes are the most probable scenario for this city. Moreover, similar to the findings of a few recent studies, the earthquake risk for Taipei is relatively high; considering the city's importance, this risk should not be overlooked, and a revision of the local technical reference may be needed. In addition to the case study, some innovative Excel applications for PSHA are introduced in this paper. Given Excel's user-friendly nature and wide accessibility, such spreadsheet applications are as applicable to geoscience research as those developed for data reduction or quantitative analysis.
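The core PSHA computation behind such spreadsheets is the hazard sum, lambda(a) = sum over sources of (annual rate) * P(PGA > a | event). A minimal discrete sketch; the two scenarios and their parameters are invented for illustration, not taken from the Taipei study:

```python
import math

def annual_exceedance_rate(scenarios, a_threshold):
    """lambda(a) = sum of (annual rate) * P(PGA > a | scenario), using a
    lognormal ground-motion model (median in g, sigma of ln PGA)."""
    lam = 0.0
    for rate, median, sigma in scenarios:
        z = (math.log(a_threshold) - math.log(median)) / sigma
        lam += rate * 0.5 * math.erfc(z / math.sqrt(2.0))  # rate * (1 - Phi(z))
    return lam

# Two invented scenarios: (annual rate, median PGA in g, sigma_ln):
scenarios = [(0.05, 0.10, 0.6),    # frequent moderate, near-source
             (0.005, 0.30, 0.6)]   # rare large
lam = annual_exceedance_rate(scenarios, a_threshold=0.20)
print(round(lam, 4), round(1.0 / lam))   # annual rate, approx. return period (yr)
```

Repeating the sum over a grid of thresholds yields the hazard curve, and the per-scenario contributions at a fixed threshold are exactly what deaggregation reports.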

  17. Multidisciplinary co-operation in building design according to urbanistic zoning and seismic microzonation

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2005-05-01

Research and practice in seismology and urban planning interact where earthquakes impact urban areas. The roles of sub-area-wide and typological divisions of the town were investigated with a regression methodology, regarding their contribution to urban earthquake risk management. The inductive data set comprised recovery, preparedness, mitigation, and resilience planning. All historically constituted planning types reappear today as layers, as the zoning results are used by actors with different backgrounds: local authorities, civil protection, urban planners, and civil engineers. In resilience planning, the urban system is first theorized in its complexity and then approached in an integrated way. The steady restructuring process of the urban organism is evident in a dynamic analysis. Although expressed materially, the "urban frame" is realized spiritually, the adaptation of space being social as well. A retrospective investigation of the role of resilient individual buildings within the urban system of Bucharest, Romania, was undertaken in order to learn systemic lessons, considering the street as an educational environment. Information in the study and in the decision-making process stand in a reciprocal relationship, both contributing to the formation of public opinion. For a complete view of resilience, both zoning types, seismic and urbanistic, must be considered; through their superposition, new sub-area-wide divisions of the town appear, yielding recommendations according to the vulnerability of the building type.

  18. Efficacy of Code Provisions for Seismic Design of Asymmetric RC Building

    NASA Astrophysics Data System (ADS)

    Balakrishnan, Bijily; Sarkar, Pradip

    2016-04-01

The earthquake-resistant design code in India, IS 1893, was revised in 2002 to include provisions for torsional irregularity in asymmetric buildings. In line with other international codes, IS 1893:2002 requires estimating the design eccentricity from the static and accidental eccentricities. The present study attempts to evaluate the effectiveness of the design code requirements for torsionally irregular asymmetric buildings. Two similar asymmetric buildings, one designed considering and one ignoring the code requirement, were considered for this study. Nonlinear static and dynamic analyses performed on these buildings reveal the difference in their behaviour: the plan asymmetry makes the building non-ductile even when designed to the code provisions. The code criterion for plan asymmetry tends to improve the strength of members, but this study indicates that changing the stiffness distribution to reduce eccentricity may lead to a preferred mode of failure.
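The design-eccentricity requirement referred to above combines the static and accidental terms. As commonly cited from IS 1893:2002, both cases e_d = 1.5*e_s + 0.05*b and e_d = e_s - 0.05*b must be checked, with the governing one used for each element (sketch only; consult the code text for the exact clause):

```python
def design_eccentricities(e_static, b):
    """Two design-eccentricity cases per IS 1893:2002 (as commonly cited):
    e_static: static eccentricity between centers of mass and stiffness (m),
    b: floor-plan dimension perpendicular to the force direction (m)."""
    return (1.5 * e_static + 0.05 * b,   # amplified static + accidental
            e_static - 0.05 * b)         # reduced case, also to be checked

# A 2 m static eccentricity in a 20 m wide floor plate:
ed1, ed2 = design_eccentricities(e_static=2.0, b=20.0)
print(ed1, ed2)  # 4.0 1.0
```

The spread between the two cases (here 4.0 m versus 1.0 m) is what forces both torsionally flexible and torsionally stiff elements to be checked.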

  19. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies an emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information and of data that are free of noise-dominated traces and flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in diagnosing inconsistencies. A correlated vibroseis time-lapse 3D seismic data set from a CO2-flood monitoring survey is used to demonstrate the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  20. Seismic-acoustic communication for UGS

    NASA Astrophysics Data System (ADS)

    Cechak, Jaroslav

    2010-04-01

    The paper deals with Unattended Ground Sensors (UGS) and considers both present and future aspects of the practical deployment of this equipment under conditions of Electronic Warfare (EW), including the integration of UGS into a joint system using the Unmanned Aircraft System (UAS). The first part of the paper deals with the possibilities, characteristics and usable properties of seismic-acoustic communication in the group of nodes, supplementing the information coverage of existing UGS, including the selection of a suitable working frequency band for seismic communication. The second part of the paper then describes an alternative method of communication between nodes and UGS using LF radio communication, and analyses the design and real properties of a proposed communication channel in the LF band, the design of a loop antenna and its mechanical construction. The interim conclusions of each section generalize the results of seismic-acoustic and radio LF communications as verified in practice, and describe both the advantages and disadvantages of communication channels defined in this way. The third part of the paper deals with the possibility of integrating the UGS nodes into a central system consisting of a UAS device. It covers the design and an energy evaluation of a system operating on the principle of data selection from UGS. In addition, the paper includes illustrative photographs of the practical design and graphic results of real measurements.

  1. Theoretical and practical considerations for the design of the iMUSH active-source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, E.; Levander, A.; Harder, S. H.; Abers, G. A.; Creager, K. C.; Vidale, J. E.; Moran, S. C.; Malone, S. D.

    2013-12-01

    The multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment seeks to understand the details of the magmatic system that feeds Mount St. Helens using active- and passive-source seismic, magnetotelluric, and petrologic data. The active-source seismic component of this experiment will take place in the summer of 2014 utilizing all of the 2600 PASSCAL 'Texan' Reftek instruments which will record twenty-four 1000-2000 lb shots distributed around the Mount St. Helens region. The instruments will be deployed as two consecutive refraction profiles centered on the volcano, and a series of areal arrays. The actual number of areal arrays, as well as their locations, will depend strongly on the length of the experiment (3-4 weeks), the number of instrument deployers (50-60), and the time it will take per deployment given the available road network. The current work shows how we are balancing these practical considerations against theoretical experiment designs in order to achieve the proposed scientific goals with the available resources. One of the main goals of the active-source seismic experiment is to image the magmatic system down to the Moho (35-40 km). Calculating sensitivity kernels for multiple shot/receiver offsets shows that direct P waves should be sensitive to Moho depths at offsets of 150 km, and therefore this will likely be the length of the refraction profiles. Another primary objective of the experiment is to estimate the locations and volumes of different magma accumulation zones beneath the volcano using the areal arrays. With this in mind, the optimal locations of these arrays, as well as their associated shots, are estimated using an eigenvalue analysis of the approximate Hessian for each possible experiment design. 
This analysis seeks to minimize the number of small eigenvalues of the approximate Hessian that would amplify the propagation of data noise into regions of interest in the model space, such as the likely locations of magma reservoirs. In addition, this analysis provides insight into the tradeoff between the number of areal array deployments and the information that will be gained from the experiment. An additional factor incorporated into this study is the expected data quality in different regions around Mount St. Helens. Expected data quality is determined using the signal-to-noise ratios of data from existing seismometers in the region, and from forward modeling the wavefields from different experiment designs using SPECFEM3D software. In particular, we are interested in evaluating how topography near the volcano and low velocity volcaniclastic layers affect data quality. This information is especially important within 5 km of the volcano where only hiking trails are available for instrument deployment, and in a large area north of the volcano where road maintenance has lagged since the 1980 eruption. Instrument deployment will be slow in these regions, and therefore it is essential to understand if deployment of instruments here is a reasonable use of resources. A final step of this study will be validating different experiment designs based upon the above criteria by inverting synthetic data from velocity models that contain a generalized representation of the magma system to confirm that the main features of the models can be recovered.
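
    The eigenvalue screening described above can be illustrated with a toy design scorer. The tolerance and the synthetic Jacobians below are assumptions for illustration, not the iMUSH sensitivity kernels:

```python
import numpy as np

def count_small_eigenvalues(J, rel_tol=1e-3):
    """Score a candidate survey design via its approximate Hessian H = J^T J.

    J is a sensitivity (Jacobian) matrix: rows correspond to shot/receiver
    data, columns to model parameters. Small eigenvalues of H mark
    model-space directions along which data noise is amplified in the
    inversion, so a design with fewer small eigenvalues constrains the
    model better.
    """
    H = J.T @ J
    w = np.linalg.eigvalsh(H)               # eigenvalues, ascending
    return int(np.sum(w < rel_tol * w.max()))
```

    A well-conditioned design (independent sensitivities) scores 0; a redundant design, in which several model parameters are sensed identically, scores higher and would be revised before deployment.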

  2. Seismic analysis of Industrial Waste Landfill 4 at Y-12 Plant

    SciTech Connect

    1995-04-07

    This calculation seismically evaluates Landfill IV at Y-12 as required by Tennessee Rule 1200-1-7-04(2) for seismic impact zones. The calculation verifies that the landfill meets the seismic requirements of the Tennessee Division of Solid Waste ``Earthquake Evaluation Guidance Document.`` The theoretical displacements of 0.17 in. and 0.13 in. for the design basis earthquake are well below the limiting seismic slope stability design criteria. There is no potential for liquefaction, due to the absence of cohesionless soils, nor for loss or reduction of shear strength of the clays at this site as a result of earthquake vibration. The vegetative cover on the slopes will most likely be displaced and move during a large seismic event, but this is not considered a serious deficiency because the cover is not involved in the structural stability of the landfill and there would be no release of waste to the environment.

  3. Seismic and layout design for a tank-type fast reactor

    SciTech Connect

    Goodman, L.; Yamaki, Hideo; Davies, S.M.

    1984-06-01

    Hitachi Ltd. of Japan, with the assistance of the Bechtel Group, Inc. and the General Electric Company of the US, initiated a conceptual design study of a compact tank-type LMFBR. The Bechtel work concentrated on the layout of the nuclear island (NI) and its orientation with respect to the Control (CB) and Turbine (TGB) Buildings. This joint effort was carried out during 1982 and 1983 in four steps. Each step produced improvements in the design and reduced the plant size and cost. This paper describes the design evolution and the final result with respect to Bechtel's development of the NI layout.

  4. Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Topical report YMP/TR-003-NP

    SciTech Connect

    1996-10-01

    This topical report describes the methodology and criteria that the U.S. Department of Energy (DOE) proposes to use for preclosure seismic design of structures, systems, and components (SSCs) of the proposed geologic repository operations area that are important to safety. Title 10 of the Code of Federal Regulations, Part 60 (10 CFR 60), Disposal of High-Level Radioactive Wastes in Geologic Repositories, states that for a license to be issued for operation of a high-level waste repository, the U.S. Nuclear Regulatory Commission (NRC) must find that the facility will not constitute an unreasonable risk to the health and safety of the public. Section 60.131 (b)(1) requires that SSCs important to safety be designed so that natural phenomena and environmental conditions anticipated at the geologic repository operations area will not interfere with necessary safety functions. Among the natural phenomena specifically identified in the regulation as requiring safety consideration are the hazards of ground shaking and fault displacement due to earthquakes.

  5. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    SciTech Connect

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti

    2008-07-08

    The development of displacement-based approaches for earthquake engineering design appears very useful, capable of providing improved reliability by directly comparing computed response with expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, developed in structural engineering over the past ten years in an attempt to mitigate some of the deficiencies of current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts to apply the procedure to geotechnical engineering and, in particular, to earth retaining structures are discussed in [3], [4] and [5]. In this field, however, the outcomes of the research need further investigation in many respects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper and compared with those obtained from conventional pseudo-static analyses.
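
    For readers unfamiliar with DDBD, the final step of the substitute-structure calculation reduces to two SDOF relations: the effective stiffness follows from the effective period, and the design base shear from the target displacement. The numbers in the usage example are illustrative, not taken from the paper:

```python
import math

def ddbd_base_shear(m_eff, delta_d, t_eff):
    """Direct Displacement-Based Design, final step (a sketch).

    Given the effective mass m_eff (kg), the target design displacement
    delta_d (m), and the effective period t_eff (s) read from a damped
    displacement spectrum, the substitute-structure relations are
        K_e = 4 * pi^2 * m_eff / T_e^2  and  V_b = K_e * delta_d.
    Returns the design base shear V_b in N.
    """
    k_eff = 4.0 * math.pi ** 2 * m_eff / t_eff ** 2
    return k_eff * delta_d
```

    For example, with m_eff = 200 t, delta_d = 0.1 m and T_e = 2.0 s, the base shear is about 197 kN; for a diaphragm wall this shear would then be distributed to the wall and its anchors.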

  6. Seismic design or retrofit of buildings with metallic structural fuses by the damage-reduction spectrum

    NASA Astrophysics Data System (ADS)

    Li, Gang; Jiang, Yi; Zhang, Shuchuan; Zeng, Yan; Li, Qiang

    2015-03-01

    Recently, the structural fuse has become an important issue in the field of earthquake engineering. Due to the trilinearity of the pushover curve of buildings with metallic structural fuses, the mechanism of the structural fuse is investigated through the ductility equation of a single-degree-of-freedom system, and the corresponding damage-reduction spectrum is proposed to design and retrofit buildings. Furthermore, the controlling parameters, the stiffness ratio between the main frame and structural fuse and the ductility factor of the main frame, are parametrically studied, and it is shown that the structural fuse concept can be achieved by specific combinations of the controlling parameters based on the proposed damage-reduction spectrum. Finally, a design example and a retrofit example, variations of real engineering projects after the 2008 Wenchuan earthquake, are provided to demonstrate the effectiveness of the proposed design procedures using buckling restrained braces as the structural fuses.

  7. Optimization for performance-based design under seismic demands, including social costs

    NASA Astrophysics Data System (ADS)

    Möller, Oscar; Foschi, Ricardo O.; Ascheri, Juan P.; Rubinstein, Marcelo; Grossman, Sergio

    2015-06-01

    Performance-based design in earthquake engineering is a structural optimization problem that has, as the objective, the determination of design parameters for the minimization of total costs, while at the same time satisfying minimum reliability levels for the specified performance criteria. Total costs include those for construction and structural damage repairs, those associated with non-structural components and the social costs of economic losses, injuries and fatalities. This paper presents a general framework to approach this problem, using a numerical optimization strategy and incorporating the use of neural networks for the evaluation of dynamic responses and the reliability levels achieved for a given set of design parameters. The strategy is applied to an example of a three-story office building. The results show the importance of considering the social costs, and the optimum failure probabilities when minimum reliability constraints are not taken into account.
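
    The optimization described above can be caricatured in a few lines: minimize expected total cost over a design parameter subject to a reliability constraint. The cost and fragility curves here are hypothetical stand-ins for the neural-network response surfaces used by the authors:

```python
import numpy as np

def optimal_design(cost_fail=5e6, pf_max=1e-3):
    """Pick the design parameter d minimizing expected total cost
    (construction plus probability-weighted failure cost) subject to a
    maximum allowed failure probability. All curves are hypothetical."""
    d = np.linspace(0.5, 3.0, 251)            # design strength scale
    c_constr = 1e6 * d                        # construction cost grows with d
    pf = np.exp(-4.0 * d)                     # failure probability falls with d
    total = c_constr + pf * cost_fail         # expected total cost
    feasible = pf <= pf_max                   # reliability constraint
    i = np.argmin(np.where(feasible, total, np.inf))
    return d[i], total[i]
```

    With these stand-in curves the unconstrained cost minimum is infeasible, so the optimum sits at the reliability boundary, which mirrors the paper's observation that the optimal failure probability shifts once minimum reliability constraints are imposed.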

  8. The LUSI Seismic Experiment: Deployment of a Seismic Network around LUSI, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Karyono, Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Haryanto, Iyan; Masturyono, Masturyono; Hadi, Soffian; Rohadi, Suprianto; Suardi, Iman; Rudiyanto, Ariska; Pranata, Bayu

    2015-04-01

    The spectacular Lusi eruption started in northeast Java, Indonesia, on 29 May 2006, following a M6.3 earthquake striking the island. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system, and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. Lusi is located a few kilometres to the NE of the Arjuno-Welirang volcanic complex and sits upon the Watukosek fault system, which originates from this volcanic complex, was reactivated by the M6.3 earthquake in 2006, and is still periodically reactivated by the frequent seismicity. To date Lusi is still active, erupting gas, water, mud and clasts. Gas and water data show that the Lusi plumbing system is connected with the neighbouring Arjuno-Welirang volcanic complex, making the Lusi eruption a "sedimentary hosted geothermal system". To verify and characterize the occurrence of seismic activity and how it perturbs the connected Watukosek fault, the Arjuno-Welirang volcanic system and the ongoing Lusi eruption, we deployed 30 seismic stations (short-period and broadband) in this region of the East Java basin. The seismic stations are more densely distributed around Lusi and the Watukosek fault zone that stretches between Lusi and the Arjuno-Welirang (AW) complex; fewer stations are positioned around the volcanic arc. Our study sheds light on the seismic activity along the Watukosek fault system and describes the waveforms associated with the geysering activity of Lusi. The initial network aims to locate small events that may not be captured by the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG) seismic network, and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-Arjuno-Welirang region and temporal variations of vp/vs ratios.
    Such variations will then ideally be related to large-magnitude seismic events. This project is an unprecedented monitoring effort for a multi-component system including an active eruption (Lusi), an unlocked strike-slip fault and a neighbouring volcanic arc, all affected by frequent seismicity. Our study will also provide a large dataset for qualitative analyses of earthquake triggering and of earthquake-volcano and earthquake-earthquake interactions. The seismic experiment suggested in this study strengthens our knowledge of Lusi and will represent a step further towards the reconstruction of a society devastated by the Lusi disaster.

  9. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills....

  11. Seismic performance analysis and design suggestion for frame buildings with cast-in-place staircases

    NASA Astrophysics Data System (ADS)

    Feng, Yuan; Wu, Xiaobin; Xiong, Yaoqing; Li, Congchun; Yang, Wen

    2013-06-01

    Many staircases in reinforced concrete (RC) frame structures suffered severe damage during the Wenchuan earthquake. Elastic analyses for 18 RC structure models with and without staircases are conducted and compared to study the influence of the staircase on the stiffness, displacements and internal forces of the structures. To capture the yielding development and damage mechanism of frame structures, elasto-plastic analysis is carried out for one of the 18 models. Based on the features observed in the analyses, a new type of staircase design, i.e., isolating the staircases from the main structure to eliminate the effect of K-type struts, is proposed and discussed. It is concluded that the proposed method of staircase isolation is effective and feasible for engineering design, and does not significantly increase the construction cost.

  12. A structural design and analysis of a piping system including seismic load

    SciTech Connect

    Hsieh, B.J.; Kot, C.A.

    1991-01-01

    The structural design/analysis of a piping system at a nuclear fuel facility is used to investigate some aspects of current design procedures. Specifically, the effect of using various stress measures, including ASME Boiler and Pressure Vessel (B&PV) Code formulas, is evaluated. It is found that large differences in local maximum stress values may be calculated depending on the stress criterion used. However, when the global stress maxima for the entire system are compared, the differences are much smaller, being nevertheless, for some load combinations, of the order of 50 percent. The effect of using an Equivalent Static Method (ESM) analysis is also evaluated by comparing its results with those obtained from a Response Spectrum Method (RSM) analysis with the modal responses combined using the absolute summation (ABS), the square root of the sum of the squares (SRSS), and the 10 percent method (10PC). It is shown that a spectrum amplification factor (equivalent static coefficient greater than unity) of at least 1.32 must be used in the current application of the ESM analysis in order to obtain results which are conservative in all respects relative to an RSM analysis based on ABS. However, it appears that an adequate design would be obtained from the ESM approach even without the use of a spectrum amplification factor. 7 refs., 3 figs., 3 tabs.
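
    The three modal-combination rules compared in the abstract can be sketched as follows. The closely-spaced test in the 10 percent method uses the usual frequency-spacing criterion, and the example numbers in the usage note are illustrative:

```python
import numpy as np

def combine_abs(r):
    """Absolute summation: conservative upper bound on the peak response."""
    return float(np.abs(np.asarray(r, float)).sum())

def combine_srss(r):
    """Square root of the sum of the squares of the peak modal responses."""
    return float(np.sqrt((np.asarray(r, float) ** 2).sum()))

def combine_10pc(r, freqs):
    """10 percent method: SRSS plus cross terms for mode pairs whose
    frequencies lie within 10 percent of each other."""
    r = np.abs(np.asarray(r, float))
    f = np.asarray(freqs, float)
    total = (r ** 2).sum()
    for i in range(len(r)):
        for j in range(i + 1, len(r)):
            if abs(f[j] - f[i]) / min(f[i], f[j]) <= 0.10:
                total += 2.0 * r[i] * r[j]   # cross term for close modes
    return float(np.sqrt(total))
```

    For peak responses [1, 2, 2] at 1.0, 5.0 and 5.2 Hz, ABS gives 5.0, SRSS gives 3.0, and 10PC adds the cross term for the two close modes, landing between the two, which is the ordering the comparison above relies on.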

  13. Appraising the value of independent EIA follow-up verifiers

    SciTech Connect

    Wessels, Jan-Albert

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. 
Overall, the study provides insight on how to harness the learning opportunities arising from EIA follow-up through the appointment of independent verifiers. - Highlights: • A framework for appraising the role of independent verifiers is established. • The value added to EIA follow-up by independent verifiers in South Africa is documented. • Verifiers add most value when involved with screening, checking compliance, influencing decisions and community engagement. • Verifiers could be more creatively utilized in pre-construction preparation, giving feedback, and performance evaluation.

  14. Seismic analysis of diagrid structural frames with shear-link fuse devices

    NASA Astrophysics Data System (ADS)

    Moghaddasi B, Nasim S.; Zhang, Yunfeng

    2013-09-01

    This paper presents a new concept for enhancing the seismic ductility and damping capacity of diagrid structural frames by using shear-link fuse devices, and its seismic performance is assessed through nonlinear static and dynamic analysis. The architectural elegance of the diagrid structure, attributed to its triangular leaning-member configuration, and its high structural redundancy make this system a desirable choice for tall building design. However, forming a stable energy dissipation mechanism in diagrid framing remains to be investigated to expand its use in regions with high seismicity. To address this issue, a diagrid framing design is proposed here which provides a competitive design option in highly seismic regions through its increased ductility and improved energy dissipation capacity, provided by replaceable shear links interconnecting the diagonal members at their ends. The structural characteristics and seismic behavior (capacity, stiffness, energy dissipation, ductility) of the diagrid structural frame are demonstrated with a 21-story building diagrid frame subjected to nonlinear static and dynamic analysis. The findings from the nonlinear time history analysis verify that satisfactory seismic performance can be achieved by the proposed diagrid frame subjected to design basis earthquakes in California. In particular, one appealing feature of the proposed diagrid building is its reduced residual displacement after strong earthquakes.

  15. Research of CRP-based irregular 2D seismic acquisition

    NASA Astrophysics Data System (ADS)

    Zhao, Hu; Yin, Cheng; He, Guang-Ming; Chen, Ai-Ping; Jing, Long-Jiang

    2015-03-01

    Seismic exploration in the mountainous areas of western China is extremely difficult because of the complexity of the surface and subsurface, which results in shooting difficulties, seismic data with a low signal-to-noise ratio, and strong interference. The complexity of the subsurface structure leads to strong scattering of the reflection points; thus, the curved-line acquisition method has been used. However, the actual subsurface structural characteristics have rarely been considered. We propose a design method for irregular acquisition based on common reflection points (CRPs) that avoids difficult-to-shoot areas while considering the structural characteristics and CRP positions and optimizing the surface receiving-line position. We arrange the positions of the receiving points to ensure as little dispersion of the subsurface CRPs as possible and so improve the signal-to-noise ratio of the seismic data. We verify the applicability of the method using actual data from a site in the Sichuan Basin. The proposed method effectively solves the problem of seismic data acquisition and facilitates seismic exploration in structurally complex areas.

  16. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    ERIC Educational Resources Information Center

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  18. Seismic design technology for breeder reactor structures. Volume 2. Special topics in soil/structure interaction analyses

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: definition of seismic input ground motion, review of state-of-the-art procedures, analysis guidelines, rock/structure interaction analysis example, comparison of two- and three-dimensional analyses, and comparison of analyses using FLUSH and TRI/SAC Codes. (DLC)

  19. Comparison of seismic sources for shallow seismic: sledgehammer and pyrotechnics

    NASA Astrophysics Data System (ADS)

    Brom, Aleksander; Stan-Kłeczek, Iwona

    2015-10-01

    Pyrotechnic materials are a type of explosive material that produces thermal, luminous or sound effects, gas, smoke, and combinations thereof as the result of a self-sustaining chemical reaction. Pyrotechnics can therefore be used as a seismic source, designed to release accumulated energy in the form of a seismic wave recorded by tremor sensors (geophones) after its passage through the rock mass. The aim of this paper was to determine the utility of pyrotechnics for shallow engineering seismics. The work presented compares conventional seismic-wave excitation for the seismic refraction method, a plate and sledgehammer, with firecrackers set off on the surface. The released energy and the frequency spectra were compared for the two types of sources. The results did not determine which source gave the better data, but they revealed very interesting aspects of using pyrotechnics in seismic measurements, for example the use of pyrotechnic materials in MASW.
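
    The frequency-spectrum comparison above comes down to computing and overlaying amplitude spectra of the recorded source signatures. A minimal sketch, with the sampling interval and test wavelet assumed for illustration:

```python
import numpy as np

def amplitude_spectrum(trace, dt):
    """One-sided amplitude spectrum of a recorded source signature."""
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec = np.abs(np.fft.rfft(trace)) / len(trace)
    return freqs, spec

def dominant_frequency(trace, dt):
    """Frequency of the largest spectral peak, ignoring the DC term."""
    freqs, spec = amplitude_spectrum(trace, dt)
    return float(freqs[int(np.argmax(spec[1:])) + 1])
```

    Applied to sledgehammer and firecracker records, the two spectra can be overlaid directly, which is essentially the comparison reported in the abstract.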

  20. An IBM 370 assembly language program verifier

    NASA Technical Reports Server (NTRS)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  1. Seismic Survey

    USGS hydrologists conduct a seismic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method (not pictured here) for non-invasive assessment of earthen leve...

  2. Neural networks in seismic discrimination

    SciTech Connect

    Dowla, F.U.

    1995-01-01

    Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate depths of seismic events using regional seismograms. In this paper different types of network architecture and representation techniques are discussed. We address the important problem of designing neural networks with good generalization capabilities. Examples of neural networks for treaty verification applications are also described.
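
    As an illustration of the discrimination idea (not LLNL's actual networks), a single-layer network trained on two hypothetical discriminant features can separate synthetic "earthquake" and "explosion" populations; the features and data below are invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hypothetical discriminant features (e.g. a spectral ratio and a
# waveform-complexity measure); the populations are synthetic stand-ins
# for earthquake and explosion training sets.
X = np.vstack([
    rng.normal([-1.0, -1.0], 0.3, size=(100, 2)),   # class 0: earthquakes
    rng.normal([+1.0, +1.0], 0.3, size=(100, 2)),   # class 1: explosions
])
y = np.r_[np.zeros(100), np.ones(100)]

# Single-layer network (logistic regression) trained by gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output
    w -= (X.T @ (p - y)) / len(y)            # gradient step on weights
    b -= np.mean(p - y)                      # gradient step on bias

accuracy = float(np.mean(((X @ w + b) > 0) == y))
```

    Real discrimination networks use richer feature sets and architectures, but the training loop above is the same generalization problem the abstract highlights: a network that merely memorizes its training events is useless on new seismograms.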

  3. Monitoring and modeling the multi-time-scale seismic hazard of the southern Longmenshan fault: an experimental design of the `monitoring and modeling for prediction' system

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Li, L.; Liu, G.; Jiang, C.; Ma, H.

    2010-12-01

    To the southwest of WFSD-I and WFSD-II is the southern part of the Longmenshan fault, which has been quiet since the May 12, 2008, Wenchuan earthquake that ruptured the middle and northern parts of the Longmenshan fault zone. The seismic hazard in this region is a concern not only for the WFSD project but also for regional sustainability. This presentation discusses three major problems related to the seismic hazard of this fault segment: 1) if a major earthquake were to rupture this fault segment, what would the ‘scenario rupture’ look like as it prepares and occurs; 2) based on this concept of a ‘scenario rupture’, how to design the ‘monitoring and modeling for prediction’ system in this region, for the effective constraint of geodynamic models of earthquake preparation, the effective monitoring of potentially pre-seismic changes in geophysical fields, and the effective testing of predictive models and/or algorithms; and 3) what the potential contribution of the WFSD project would be, in both the long-term and the short-term sense, to the monitoring and modeling of seismic hazard in this region. In considering these three questions, lessons and experiences from the Wenchuan earthquake play an important role, and the relation between the Xianshuihe fault and the Longmenshan fault is one of the critical issues under consideration. Considering the state of the art of earthquake science and social needs, the monitoring and modeling endeavor should deal with different time scales, addressing both scientific and decision-making issues.
    Taking the lessons and experiences of previously conducted earthquake prediction experiment sites, we propose the concept of ‘seismological engineering’ (distinct from both ‘earthquake engineering’ and ‘engineering seismology’), dealing with the design of an operational multi-disciplinary observation system oriented at the monitoring and modeling of multi-time-scale seismic hazard for a specific tectonic region such as the southern part of the Longmenshan fault.

  4. Design and Implementation of a Wireless Sensor Network of GPS-enabled Seismic Sensors for the Study of Glaciers and Ice Sheets

    NASA Astrophysics Data System (ADS)

    Bilen, S. G.; Anandakrishnan, S.; Urbina, J. V.

    2012-12-01

In an effort to provide new and improved geophysical sensing capabilities for the study of ice sheets in Antarctica and Greenland, or to study mountain glaciers, we are developing a network of wirelessly interconnected seismic and GPS sensor nodes (called "geoPebbles"), with the primary objective of making such instruments more capable and cost effective. We describe our design methodology, which has enabled us to develop these state-of-the-art sensors using commercial off-the-shelf hardware combined with custom-designed hardware and software. Each geoPebble is a self-contained, wirelessly connected sensor for collecting seismic measurements and position information. Each node is built around a three-component seismic recorder, which includes an amplifier, filter, and 24-bit analog-to-digital card that can sample up to 10 kHz. Each unit also includes a microphone channel to record the ground-coupled airwave. The timing for each node is available through a carrier-phase measurement of the L1 GPS signal at an absolute accuracy of better than a microsecond. Each geoPebble includes 16 GB of solid-state storage, wireless communications capability to a central supervisory unit, and auxiliary measurement capability (up to eight 10-bit channels at low sample rates). We will report on current efforts to test this new instrument and how we are addressing the challenges imposed by the extreme weather conditions on the Antarctic continent. After fully validating its operational conditions, the geoPebble system will be available for NSF-sponsored glaciology research projects. Geophysical experiments in the polar region are logistically difficult. With the geoPebble system, the cost of doing today's experiments (low-resolution, 2D) will be significantly reduced, and the cost and feasibility of doing tomorrow's experiments (integrated seismic, positioning, 3D, etc.) will be reasonable.
Figure caption: Sketch of an experiment with geoPebbles scattered on the surface of the ice sheet. The seismic source can move through the array. The SQC node communicates with all the elements in the array.
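The storage and sampling figures quoted above permit a quick endurance estimate. The sketch below is a back-of-envelope calculation only; the channel count (three seismic plus the microphone), the 3-byte sample width, and the assumption that all channels run at the full 10 kHz are ours, not the authors', and file-format overhead and the low-rate auxiliary channels are ignored.

```python
def recording_hours(storage_gb=16, channels=4, sample_rate_hz=10_000,
                    bytes_per_sample=3):
    """Rough endurance of one geoPebble at maximum sample rate.

    Assumes four channels (three seismic + microphone) of 24-bit
    (3-byte) samples at 10 kHz, with no format overhead.
    """
    rate_bytes_per_s = channels * sample_rate_hz * bytes_per_sample
    return storage_gb * 1e9 / rate_bytes_per_s / 3600

print(round(recording_hours(), 1))  # → 37.0
```

Under these assumptions the 16 GB store holds roughly a day and a half of full-rate data, which suggests why the low-rate auxiliary channels and wireless offload to the supervisory unit matter for multi-day deployments.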

  5. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  6. Firms Verify Online IDs Via Schools

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young…

  7. Impact of lateral force-resisting system and design/construction practices on seismic performance and cost of tall buildings in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Galal, Khaled; Yehia, Sherif

    2013-09-01

The local design and construction practices in the United Arab Emirates (UAE), together with Dubai's unique rate of development, warrant special attention to the selection of Lateral Force-Resisting Systems (LFRS). This research proposes four different feasible solutions for the selection of the LFRS for tall buildings and quantifies the impact of these selections on seismic performance and cost. The systems considered are: Steel Special Moment-Resisting Frame (SMRF), Concrete SMRF, Steel Dual System (SMRF with Special Steel Plate Shear Wall, SPSW), and Concrete Dual System (SMRF with Special Concrete Shear Wall, SCSW). The LFRS selection is driven by the seismic setting as well as the adopted design and construction practices in Dubai. It is found that the concrete design alternatives are consistently less expensive than their steel counterparts. The steel dual system is expected to have the least damage based on its relatively smaller interstory drifts; however, this preferred performance comes at a higher initial construction cost. Conversely, the steel SMRF system is expected to have the most damage and associated repair cost due to its excessive flexibility. The two concrete alternatives are expected to have relatively moderate damage and repair costs in addition to their lower initial construction cost.

  8. Seismic, shock, and vibration isolation - 1988

    SciTech Connect

    Chung, H. ); Mostaghel, N. )

    1988-01-01

    This book contains papers presented at a conference on pressure vessels and piping. Topics covered include: Design of R-FBI bearings for seismic isolation; Benefits of vertical and horizontal seismic isolation for LMR nuclear reactor units; and Some remarks on the use and perspectives of seismic isolation for fast reactors.

  9. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness: a mathematical proof. Hardware verification research has focused on device verification and has largely ignored system composition verification. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

  10. Seismic isolation of nuclear power plants using sliding isolation bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

Nuclear power plants (NPP) are designed for earthquake shaking with very long return periods. Seismic isolation is a viable strategy to protect NPPs from extreme earthquake shaking because it filters a significant fraction of earthquake input energy. This study addresses the seismic isolation of NPPs using sliding bearings, with a focus on the single concave Friction Pendulum(TM) (FP) bearing. Friction at the sliding surface of an FP bearing changes continuously during an earthquake as a function of sliding velocity, axial pressure and temperature at the sliding surface. The temperature at the sliding surface, in turn, is a function of the histories of coefficient of friction, sliding velocity and axial pressure, and the travel path of the slider. A simple model to describe the complex interdependence of the coefficient of friction, axial pressure, sliding velocity and temperature at the sliding surface is proposed, and then verified and validated. Seismic hazard for a seismically isolated nuclear power plant is defined in the United States using a uniform hazard response spectrum (UHRS) at mean annual frequencies of exceedance (MAFE) of 10^-4 and 10^-5. A key design parameter is the clearance to the hard stop (CHS), which is influenced substantially by the definition of the seismic hazard. Four alternate representations of seismic hazard are studied, which incorporate different variabilities and uncertainties. Response-history analyses performed on single FP-bearing isolation systems using ground motions consistent with the four representations at the two shaking levels indicate that the CHS is influenced primarily by whether the observed difference between the two horizontal components of ground motions in a given set is accounted for. The UHRS at the MAFE of 10^-4 is increased by a design factor (≥ 1) for a conventional (fixed-base) nuclear structure to achieve a target annual frequency of unacceptable performance. Risk-oriented calculations are performed for eight sites across the United States to show that the factor is equal to 1.0 for seismically isolated NPPs, if the risk is dominated by horizontal earthquake shaking. Response-history analyses using different models of seismically isolated NPPs are performed to understand the importance of the choice of friction model, model complexity and vertical ground motion for calculating horizontal displacement response across a wide range of sites and shaking intensities. A friction model for the single concave FP bearing should address heating. The pressure- and velocity-dependencies were not important for the models and sites studied. Isolation-system displacements can be computed using a macro model comprising a single FP bearing.
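The velocity dependence of FP-bearing friction described in this record is commonly captured with an exponential interpolation between a slow-sliding and a fast-sliding coefficient. The sketch below uses that widely cited functional form only as an illustration: the parameter values are invented, and the pressure and temperature dependence that the study actually models is omitted.

```python
import math

def friction_coefficient(velocity, mu_slow=0.04, mu_fast=0.08, rate=50.0):
    """Velocity-dependent sliding friction for an FP bearing (sketch).

    Uses the common exponential form
        mu(v) = mu_fast - (mu_fast - mu_slow) * exp(-rate * |v|),
    which transitions from mu_slow at rest to mu_fast at high sliding
    velocity. Parameter values are illustrative, not from the study.
    """
    return mu_fast - (mu_fast - mu_slow) * math.exp(-rate * abs(velocity))

# Friction approaches mu_slow at rest and mu_fast at high sliding speed.
print(round(friction_coefficient(0.0), 3))  # → 0.04
print(round(friction_coefficient(1.0), 3))  # → 0.08
```

In a response-history analysis this coefficient would be re-evaluated at every time step from the current sliding velocity, which is one reason the interdependence with frictional heating noted in the abstract becomes important.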

  11. Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network

    NASA Astrophysics Data System (ADS)

    Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

    2004-12-01

Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology based earth sciences curriculum development process. We are developing a pedagogical framework, graduate level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from hundreds of seismograph stations in Nevada, California, and the world (including live data from Earthworm systems and the IRIS DMC BUD, Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. 
The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition, remote multi-channel data access, local time-series data management, and interactive multi-window waveform display and time-series analysis with centralized metadata control. Formally integrating educational seismology into the K-12 science curriculum, with an overall positive impact on science education practices, necessarily requires a collaborative effort between professional educators and seismologists that remains driven by teacher needs.

  12. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-12-09

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
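The serial-versus-speculative comparison the claim describes can be sketched in ordinary Python, with threads standing in for hardware speculation. The task and data structures below are invented for illustration; a real implementation would operate on the application's actual test instructions and machine state.

```python
import threading

def run_serially(tasks, data):
    # Reference execution: each test instruction runs in order,
    # so all data dependencies among the instructions are satisfied.
    for task in tasks:
        task(data)
    return dict(data)

def run_speculatively(tasks, data):
    # Speculative execution: the same instructions run concurrently
    # in a plurality of threads, with no dependency enforcement.
    threads = [threading.Thread(target=task, args=(data,)) for task in tasks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return dict(data)

def has_speculation_error(tasks, initial):
    # A mismatch between the serial result and the speculative
    # result signals a speculative multithreading error.
    serial = run_serially(tasks, dict(initial))
    speculative = run_speculatively(tasks, dict(initial))
    return serial != speculative

# Two writes to independent keys commute, so no error is reported.
t1 = lambda d: d.__setitem__("a", 1)
t2 = lambda d: d.__setitem__("b", 2)
print(has_speculation_error([t1, t2], {}))  # → False
```

Tasks with a true data dependency (one reading a value the other writes) could produce a mismatch under concurrent execution, which is exactly the condition the described method is designed to detect.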

  13. Verifying speculative multithreading in an application

    SciTech Connect

    Felton, Mitchell D

    2014-11-18

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.

  14. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of the seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design-basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. 
The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly evaluated and identified. This document supersedes the seismic classifications, assignments, and computations in "Seismic Analysis for Preclosure Safety" (BSC 2004a).

  15. Annual Hanford seismic report -- fiscal year 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1996-12-01

Seismic monitoring (SM) at the Hanford Site was established in 1969 by the US Geological Survey (USGS) under a contract with the US Atomic Energy Commission. Since 1980, the program has been managed by several contractors under the US Department of Energy (USDOE). Effective October 1, 1996, the Seismic Monitoring workscope, personnel, and associated contracts were transferred to the USDOE Pacific Northwest National Laboratory (PNNL). SM is tasked to provide uninterrupted collection and archiving of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) located on and encircling the Hanford Site. SM is also tasked to locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data compiled are used by SM, Waste Management, and engineering activities at the Hanford Site to evaluate seismic hazards and seismic design for the Site.

  16. Verifying disarmament: scientific, technological and political challenges

    SciTech Connect

    Pilat, Joseph R

    2011-01-25

There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would pose unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shut-down nuclear weapon programs and existing nuclear energy programs, and material- and weapon-production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  17. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    SciTech Connect

    I. Wong

    2004-11-05

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M&O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes.

  18. Seismic evaluation methods for existing buildings

    SciTech Connect

    Hsieh, B.J.

    1995-07-01

Recent US Department of Energy natural phenomena hazards mitigation directives require the earthquake reassessment of existing hazardous facilities and general use structures. This also applies to structures located in Seismic Zone 0 of the Uniform Building Code, where usually no consideration is given to seismic design but where DOE specifies seismic hazard levels. An economical approach for performing such a seismic evaluation, which relies heavily on preexisting structural analysis results, is outlined below. Specifically, three different methods are used to estimate the seismic capacity of a building, which is a unit of a building complex located on a site considered at low risk from earthquakes. For structures not originally seismically designed, which may not have or be able to prove sufficient capacity to meet new arbitrarily high seismic design requirements and which are located on low-seismicity sites, it may be very cost effective to perform detailed site-specific seismic hazard studies in order to establish the true seismic threat. This is particularly beneficial to sites with many buildings and facilities to be seismically evaluated.

  19. Positively Verifying Mating of Previously Unverifiable Flight Connectors

    NASA Technical Reports Server (NTRS)

    Pandipati R. K. Chetty

    2011-01-01

Errors in mating interchangeable connectors can result in a degraded or failed space mission. Current practice is to uniquely key connectors whose mating cannot be verified by ground tests, such as those used in explosive or non-explosive initiators and pyro valves; however, keying alone does not assure 100-percent correct mating. This problem can be overcome by the following approach, by which the mating of all flight connectors considered unverifiable via ground tests can be verified electrically. It requires two additional wires going through the connector of interest, a few resistors, and a voltage source. The test-point voltage V(sub tp) when the connector is not mated is the same as the input voltage; the input voltage is attenuated by the resistor R(sub 1) when the female (F) and male (M) connectors are mated correctly and properly, so the voltage at the test point is a function of R(sub 1) and R(sub 2). Monitoring of the test point can be done on ground support equipment (GSE) only, or it can be a telemetry point. For implementation on multiple connector pairs, a different value of R(sub 1), R(sub 2), or both can be selected for each pair, giving a unique test-point voltage for each connector pair. The correct test-point voltage is read only when the correct pair is mated correctly; thus, this design approach can positively verify the correct mating of the connector pairs, and it can be applied to any number of connectors on the flight vehicle.

  20. Verifiable process monitoring through enhanced data authentication.

    SciTech Connect

    Goncalves, Joao G. M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-09-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
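The capture-timestamp-authenticate pipeline described for EDAS can be illustrated with Python's standard library. Note the deliberate simplifications: EDAS uses public-key authentication and adds private-key encryption for confidentiality, whereas this sketch uses a shared-key HMAC and omits encryption entirely; the field names, key, and payload are invented.

```python
import hashlib
import hmac
import json
import time

AUTH_KEY = b"shared-demo-key"  # stand-in only; EDAS uses public-key signatures

def capture_record(source_id, payload, key=AUTH_KEY):
    """Wrap a captured instrument message with timestamp, source, and MAC."""
    record = {"source": source_id, "time": time.time(), "data": payload}
    body = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "auth": tag}

def verify_record(record, key=AUTH_KEY):
    """Recompute the MAC over the body and compare in constant time."""
    expected = hmac.new(key, record["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["auth"])

rec = capture_record("analyzer-1", "sample reading 4.2")
print(verify_record(rec))  # → True
```

Because the tag covers the timestamped body, any tampering with the branched data stream after capture makes verification fail, which is the property the inspectorate relies on.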

  1. Seismic Data Gathering and Validation

    SciTech Connect

    Coleman, Justin

    2015-02-01

Three earthquakes in the last seven years have exceeded their design basis earthquake values (implying that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011, detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007, [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using that data for two purposes. One purpose is to determine what margins exist in current industry standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site-response tools and SSI tools. The gathered data comprise free-field soil and in-structure acceleration time histories, as well as elastic and dynamic soil properties and structural drawings. Gathering data and comparing with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), decisions on needed protective measures, such as SI, can be made.

  2. Seismic monitoring of Poland - temporary seismic project - first results

    NASA Astrophysics Data System (ADS)

    Trojanowski, J.; Plesiewicz, B.; Wiszniowski, J.; Suchcicki, J.; Tokarz, A.

    2012-04-01

The aim of the project is to develop a national database of seismic activity for seismic hazard assessment. Poland is known as a region of very low seismicity; however, some earthquakes occur from time to time. The historical catalogue consists of fewer than one hundred earthquakes in a time span of almost one thousand years. Due to such a low occurrence rate, the study has been focusing on events at magnitudes lower than 2, which are more likely to occur during a few-year-long project. There are 24 mobile seismic stations involved in the project, deployed at temporary locations close to human settlements, which causes a high level of noise and disturbance in the recorded seismic signal. Moreover, the majority of Polish territory is covered by thick sediments. This raises the problem of reliably detecting small seismic events in noisy data. Most detection algorithms are based on the concept of the STA/LTA ratio and are designed for strong teleseismic events registered on many stations; unfortunately, they fail on weak events in signals with noise and disturbances. We therefore applied a Real-Time Recurrent Neural Network (RTRN) to detect small natural seismic events in Poland. This method can assess relations in the seismic signal in the frequency domain as well as in the timing of seismic phases. The RTRN was trained on a wide range of seismic signals: regional, teleseismic, and blasts. The method is routinely used to analyse data from the project. In the first two years of the project the seismic network was deployed in southern Poland, where relatively high seismicity is known. Since mid-2010 the stations have been working in several regions of central and northern Poland where some minor historical earthquakes occurred. Over one hundred seismic events with magnitudes from 0.5 to 2.3 confirm the activity of the Podhale region (Tatra Mountains, Carpathians), where an earthquake of magnitude 4.3 occurred in 2004. 
Initially three and now five seismic stations monitor this region of southern Poland. Locations of the events form a stable pattern of epicentral regions on Podhale. At the beginning of 2012 an unexpected earthquake of magnitude 3.8 was felt in western Poland, a region where not a single historical event had been reported.
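The STA/LTA concept that the project's neural-network detector is contrasted with is simple to state: trigger when a short-term average of signal energy rises sharply relative to a long-term average. A minimal sketch follows; the window lengths and the synthetic trace are illustrative only, and real detectors add a trigger threshold, de-trending, and filtering.

```python
def sta_lta(signal, n_sta, n_lta):
    """Classic STA/LTA ratio on the squared signal (energy).

    For each sample after the first LTA window, compare the mean
    energy of the trailing short window to that of the long window.
    """
    ratios = []
    for i in range(n_lta, len(signal) + 1):
        sta = sum(x * x for x in signal[i - n_sta:i]) / n_sta
        lta = sum(x * x for x in signal[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Quiet noise followed by a burst: the ratio jumps at the onset.
trace = [0.1] * 100 + [2.0] * 20
ratios = sta_lta(trace, n_sta=5, n_lta=50)
print(max(ratios) > 5)  # → True
```

The abstract's point is visible in this toy: for a weak event barely above the noise floor, the ratio barely moves, which is why a fixed STA/LTA threshold struggles on the small, noisy events the Polish network targets.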

  3. Seismic Tomography.

    ERIC Educational Resources Information Center

    Anderson, Don L.; Dziewonski, Adam M.

    1984-01-01

    Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

  4. Seismic Symphonies

    NASA Astrophysics Data System (ADS)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts at sound transposition of seismic records: seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system was devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper; to adapt the documents to the instrument's score, holes were punched at the peaks of the waves. The organ covers about three tonal scales, from heavy, deep sounds up to high, jarring notes. The translation of the seismic records is based on a criterion that matches higher notes to larger amplitudes and lower notes to smaller ones. The larger the amplitude of the recorded waves, the more of the organ's tonal scale the seismogram covers, and the more intense the emotional response the notes arouse in the listener. Elisa Strinna's Seismic Symphonies installation thus becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters in over a century of the Earth's seismic history can be revived: a bridge between art and science. 
Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, where its sounds evoke the heavens and symbolize cosmic harmony. But here it is the earth, "nature", the ground beneath our feet that is moving. It speaks to us not of harmony, but of our fragility. For the oldest earthquakes considered, Seismic Symphonies drew on the SISMOS archives, the INGV project for the recovery, high-resolution digital reproduction and distribution of seismograms of earthquakes in the Euro-Mediterranean area from 1895 to 1984. After its first showing at the Fondazione Bevilacqua La Masa in Venice, the organ was later exhibited at the Taipei Biennial in Taiwan, with seismograms provided by the Taiwanese Central Weather Bureau, and at the EACC Castello in Spain, with seismograms of Spanish earthquakes provided by the Instituto Geográfico Nacional.

  5. Derivation and implementation of a nonlinear experimental design criterion and its application to seismic network expansion at Kawerau geothermal field, New Zealand

    NASA Astrophysics Data System (ADS)

    Rawlinson, Z. J.; Townend, J.; Arnold, R.; Bannister, S.

    2012-09-01

    The accuracy with which geophysical observations are made is inherently determined by the geometry of the observation network, and typically depends on a highly non-linear relationship between data and earth parameters. Statistical experimental design provides a means of optimizing the network geometry to provide maximum information about parameters of interest. Here, we re-derive the nonlinear experimental design DN optimization method, without the need for the usual assumption of a multivariate normal model of data uncertainties. We demonstrate the criterion's utility by applying it to the problem of seismic network expansion in the active Kawerau geothermal field, Taupo Volcanic Zone, New Zealand. The design calculations maximize the ratio of the hypocentre data generalized variance (attributable to resolvable spatial separation of earthquakes) to the measurement error generalized variance (attributable to observational uncertainties), and incorporate realistic 3-D velocity and attenuation models, surface noise sources, and both P- and S-wave data. In geologically complex areas, statistical experimental design provides a means of objectively deploying finite observational resources to target areas of particular interest while taking into account environmental and logistical factors.
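The paper's D_N criterion maximizes a ratio of generalized variances under realistic 3-D velocity and attenuation models; a much-reduced toy version still conveys the core idea of ranking candidate stations by the determinant of an information matrix. In the sketch below, each station contributes a 2-D travel-time derivative row along its azimuth; the geometry and candidate set are entirely illustrative and are not the paper's method.

```python
import math

def info_det(azimuths):
    """det(G^T G) for a toy 2-D location problem.

    Each station contributes a derivative row (cos a, sin a), so the
    2x2 information matrix G^T G summarizes azimuthal coverage.
    """
    gxx = sum(math.cos(a) ** 2 for a in azimuths)
    gyy = sum(math.sin(a) ** 2 for a in azimuths)
    gxy = sum(math.cos(a) * math.sin(a) for a in azimuths)
    return gxx * gyy - gxy ** 2

def best_new_station(existing, candidates):
    """Pick the candidate azimuth that maximizes the information determinant."""
    return max(candidates, key=lambda a: info_det(existing + [a]))

# Two nearly collinear stations: a D-style criterion favors the
# candidate that opens up the azimuthal coverage.
existing = [0.0, 0.1]                      # clustered azimuths (radians)
candidates = [0.2, math.pi / 2, math.pi]
print(round(best_new_station(existing, candidates), 3))  # → 1.571
```

The perpendicular azimuth wins because it most reduces the hypocentre error ellipse's long axis, which is the intuition behind using determinant-based (D-type) criteria for network expansion.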

  6. Ringing load models verified against experiments

    SciTech Connect

    Krokstad, J.R.; Stansberg, C.T.

    1995-12-31

What is believed to be the main reason for discrepancies between measured and simulated loads in previous studies has been assessed. The focus has been on the balance between second- and third-order load components in relation to what is called the "fat body" load correction. It is important to understand that the use of Morison strip theory in combination with second-order wave theory gives rise to second- as well as third-order components in the horizontal force. A proper balance between the second- and third-order components of the horizontal force is regarded as the most central requirement for a sufficiently accurate ringing load model in irregular seas. It is also verified that simulated second-order components are largely overpredicted in both regular and irregular seas. Nonslender diffraction effects are important to incorporate in the FNV formulation in order to reduce the simulated second-order component and to match experiments more closely. A sufficiently accurate ringing simulation model using simplified methods is shown to be within close reach. Some further development and experimental verification must, however, be performed in order to take non-slender effects into account.

  7. Static behaviour of induced seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, Arnaud

    2016-04-01

The standard paradigm for describing seismicity induced by fluid injection is to apply non-linear diffusion dynamics in a poroelastic medium. I show that the spatio-temporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the geometrical description of the spatio-temporal patterns of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope and provides an alternative physical approach to poroelasticity for the modelling of induced seismicity. The applicability of the proposed geometrical approach is illustrated for the case of the 2006 Basel enhanced geothermal system stimulation experiment. Applicability to more problematic cases, where the stress field may be spatially heterogeneous, is also discussed.

  8. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

This work illustrates the advantages of using a Geographic Information System in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. Because the input data come in different formats, cover distinct geographical areas, and are subject to different interpretations, inconsistencies may appear and their management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with ISO 19100 norms and allows the designer to define the model architecture and interoperability. The GIS provides a framework for combining large volumes of geographically based data, with a uniform geographic reference and without duplications. All this information carries its own metadata following the ISO 19115 standard. In this work, the integration in the same environment of active fault and subduction slab geometries, combined with epicentre locations, has facilitated the definition of seismogenetic regions. This is a great support for national specialists from different countries, making their teamwork easier. The GIS capability for queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete, homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution, and results dissemination.

  9. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section B, Renovation calculations/supporting data

    SciTech Connect

    1995-07-14

The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and because of regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. This report is organized into seven parts. This document, Part V, Section B (Structural/Seismic Information), provides a description of the seismic and structural analyses performed on the NMSF and their results.

  10. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    NASA Technical Reports Server (NTRS)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

HAL/S's suitability for writing verifiable programs, a characteristic highly desirable in aerospace applications, is limited because many features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  11. iMUSH: The design of the Mount St. Helens high-resolution active source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, Eric; Levander, Alan; Harder, Steve; Abers, Geoff; Creager, Ken; Vidale, John; Moran, Seth; Malone, Steve

    2013-04-01

Mount St. Helens is one of the most societally relevant and geologically interesting volcanoes in the United States. Although much has been learned about the shallow structure of this volcano since its eruption in 1980, important questions remain regarding its magmatic system and its connectivity to the rest of the Cascadia arc. For example, the structure of the magma plumbing system below the shallowest magma chamber under the volcano is still only poorly known. This information will be useful for hazard assessment in the southwest Washington area, and also for gaining insight into fundamental scientific questions such as the assimilation and differentiation processes that lead to the formation of continental crust. As part of the multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment, funded by NSF GeoPRISMS and EarthScope, an active source seismic experiment will be conducted in late summer 2014. The experiment will utilize all 2600 IRIS/PASSCAL/USArray Texan instruments. The instruments will be deployed as two consecutive 1000-instrument refraction profiles (one N/S and one WNW/ESE). Each of these profiles will be accompanied by two 1600-instrument areal arrays at varying distances from Mount St. Helens. Finally, one 2600-instrument areal array will be centered on Mount St. Helens. These instruments will record a total of twenty-four 500-1000 kg shots. Each refraction profile will have an average station spacing of 150 m and a total length of 150 km. The stations in the areal arrays will be separated by ~1 km. A critical step in the success of this project is to develop an experimental setup that can resolve the most interesting aspects of the magmatic system. In particular, we want to determine the distribution of shot locations that will provide good coverage throughout the entire model space, while still allowing us to focus on regions likely to contain the magmatic plumbing system.
In this study, we approach this problem by calculating Fréchet kernels with dynamic ray tracing. An initial observation from these kernels is that waves traveling across the largest offsets of the experiment (~150 km) have sensitivity below depths of 30 km. This means that we may be able to image the magmatic system down to the Moho, estimated at ~40 km. Additional work is focusing on finding shot locations that provide high resolution around very shallow features beneath Mount St. Helens, such as the first magmatic reservoir at about 3 km depth and the associated Mount St. Helens seismic zone. One way in which we are guiding this search is to find the shot locations that maximize sensitivity values within the regions of interest after summing Fréchet kernels from each shot/station pair.
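The kernel-summation idea described above can be sketched in a few lines. This is a minimal, invented illustration: a small model grid, random arrays standing in for Fréchet kernels, and a hypothetical target region. The real experiment would use kernels computed by dynamic ray tracing through a 3-D velocity model.

```python
import numpy as np

def coverage_score(kernels_for_shot, region_mask):
    # Sum absolute sensitivity over stations, then integrate the
    # summed kernel over the target region of interest.
    total = np.sum(np.abs(kernels_for_shot), axis=0)
    return float(total[region_mask].sum())

# Invented stand-ins: 3 candidate shots, 2 stations each, kernels on
# a 4x4 model grid; the "target" is the central 2x2 block.
rng = np.random.default_rng(0)
kernels = {s: rng.random((2, 4, 4)) for s in ("shot1", "shot2", "shot3")}
region = np.zeros((4, 4), dtype=bool)
region[1:3, 1:3] = True

ranked = sorted(kernels, key=lambda s: coverage_score(kernels[s], region),
                reverse=True)
print(ranked[0])  # the shot whose summed kernel best covers the target
```

Ranking candidate shots this way lets limited shot resources be concentrated where they most improve resolution of the features of interest.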

  12. Ground Motion Simulations for Bursa Region (Turkey) Using Input Parameters derived from the Regional Seismic Network

    NASA Astrophysics Data System (ADS)

    Unal, B.; Askan, A.

    2014-12-01

Earthquakes are among the most destructive natural disasters in Turkey, and it is important to assess seismicity in different regions with the use of seismic networks. Bursa is located in the Marmara Region, northwestern Turkey, to the south of the very active North Anatolian Fault Zone. With around three million inhabitants and key industrial facilities of the country, Bursa is the fourth largest city in Turkey. Since most attention has focused on the North Anatolian Fault Zone, Bursa has, despite its significant seismicity, not been investigated extensively until recently. For reliable seismic hazard estimation and seismic design of structures, assessment of potential ground motions in this region using both recorded and simulated data is essential. In this study, we employ stochastic finite-fault simulation with the dynamic corner frequency approach to model previous events as well as to assess potential earthquakes in Bursa. To ensure that simulations yield reliable synthetic ground motions, the input parameters must be carefully derived from regional data. Using strong motion data collected at 33 stations in the region, site-specific parameters such as the near-surface high-frequency attenuation parameter and site amplifications are obtained. Similarly, source and path parameters are adopted from previous studies that also employ regional data. Initially, major previous events in the region are verified by comparing the records with the corresponding synthetics. Then simulations of scenario events in the region are performed. We present the results in terms of spatial distributions of peak ground motion parameters and time histories at selected locations.
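As background, stochastic simulation methods of this family typically combine an omega-squared source spectrum with a near-surface high-frequency diminution filter, often written exp(-pi*kappa*f). The sketch below shows only that spectral shaping with invented parameter values; it is not the paper's calibrated model, and the full method adds path attenuation, subfault summation, and time-domain synthesis.

```python
import numpy as np

def source_spectrum(f, m0, fc):
    # Brune omega-squared acceleration source spectrum (unscaled):
    # proportional to M0 * (2*pi*f)^2 / (1 + (f/fc)^2).
    return m0 * (2.0 * np.pi * f) ** 2 / (1.0 + (f / fc) ** 2)

def site_attenuation(f, kappa):
    # Near-surface high-frequency diminution filter exp(-pi*kappa*f);
    # kappa here is an invented example value, not a regional estimate.
    return np.exp(-np.pi * kappa * f)

# Invented example: Fourier amplitude shape between 0.1 and 20 Hz
f = np.linspace(0.1, 20.0, 200)
spec = source_spectrum(f, m0=1.0, fc=1.0) * site_attenuation(f, kappa=0.04)
```

The kappa filter is what makes site-specific calibration matter: without it, the omega-squared spectrum stays flat at high frequencies instead of decaying as observed on real records.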

  13. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    SciTech Connect

    Martin, R.L.; Gross, D.; Pearson, D.C.; Stump, B.W.; Anderson, D.P.

    1996-12-31

In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international Comprehensive Test Ban Treaty (CTBT, banning nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments that produce results of mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in the radiation of seismic energy from overburden casting shots; (4) identification of as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship between local and regional seismic amplitude and explosive yield for overburden cast, coal bulking, and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  14. Passive seismic experiment

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

    1972-01-01

The design, deployment, and operation of the Apollo 16 passive seismic experiment (PSE) are discussed. Since activation, all elements of the PSE have operated as planned, with the exception of the sensor thermal control system. Significant progress in the measurement of meteoroid flux in near-earth space has been made, along with delineation of active moonquake source regions. The data obtained indicate that moonquakes are concentrated at great depth (800 to 1000 km) and that the apparent disparity between meteoroid flux estimates based on lunar crater counts and those from earth-based observations can be resolved by seismic measurements in favor of the lower flux indicated by the crater count method. The results obtained from the PSE are summarized and their significance is discussed in detail.

  15. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Justin Coleman; Piyush Sabharwall

    2014-09-01

The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacities of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Opportunity therefore exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for Light Water Reactor (LWR) facilities using commercially available technology. However, SI has seen little application in the nuclear industry, and uncertainty remains in implementing the procedures outlined in ASCE-4. Opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.
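The hazard-fragility convolution mentioned above can be sketched numerically. This is a generic textbook-style illustration with an invented hazard curve and invented lognormal fragility parameters, not data from the report.

```python
import math

def fragility(a, a_median, beta):
    # Lognormal fragility: P(failure | peak ground acceleration = a).
    return 0.5 * (1.0 + math.erf(math.log(a / a_median) / (beta * math.sqrt(2.0))))

def annual_risk(pga, annual_rate, a_median, beta):
    # Discrete convolution of a hazard curve (annual exceedance rate
    # vs PGA) with a fragility curve: weight P(fail | a) by the rate
    # decrement in each acceleration bin and sum.
    risk = 0.0
    for i in range(len(pga) - 1):
        a_mid = 0.5 * (pga[i] + pga[i + 1])
        d_rate = annual_rate[i] - annual_rate[i + 1]
        risk += fragility(a_mid, a_median, beta) * d_rate
    return risk

# Invented hazard curve and fragility parameters, for illustration only
pga = [0.1, 0.2, 0.4, 0.8, 1.6]          # peak ground acceleration (g)
rate = [1e-2, 3e-3, 6e-4, 8e-5, 5e-6]    # annual exceedance rates
print(annual_risk(pga, rate, a_median=0.6, beta=0.4))
```

Seismic isolation, in these terms, shifts the fragility curve (raises the median capacity seen by the isolated equipment), which reduces the convolved risk even when the hazard curve itself is uncertain.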

  16. Infrasound Generation from the HH Seismic Hammer.

    SciTech Connect

    Jones, Kyle Richard

    2014-10-01

The HH Seismic Hammer is a large "weight-drop" source for active-source seismic experiments. This system provides a repetitive source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by its 13 metric ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation, and surface-air coupling studies for source characterization.

  17. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

Code of Federal Regulations, Title 28 (Judicial Administration), § 802.13 — Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity.

  18. Using Theorem Proving to Verify Properties of Agent Programs

    NASA Astrophysics Data System (ADS)

    Alechina, N.; Dastani, M.; Khan, F.; Logan, B.; Meyer, J.-J. Ch.

    We present a sound and complete logic for automatic verification of simpleAPL programs. simpleAPL is a simplified version of agent programming languages such as 3APL and 2APL designed for the implementation of cognitive agents with beliefs, goals and plans. Our logic is a variant of PDL, and allows the specification of safety and liveness properties of agent programs. We prove a correspondence between the operational semantics of simpleAPL and the models of the logic for two example program execution strategies. We show how to translate agent programs written in simpleAPL into expressions of the logic, and give an example in which we show how to verify correctness properties for a simple agent program using theorem-proving.

  19. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  20. Seismic risk perception in Italy

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Peruzza, Laura

    2014-05-01

Risk perception is a fundamental element in the definition and adoption of preventive countermeasures. In order to develop effective information and risk communication strategies, the perception of risks and the factors influencing it should be known. This paper presents results of a survey on seismic risk perception in Italy conducted from January 2013 to the present. The research design combines a psychometric and a cultural-theoretic approach. More than 7,000 online tests have been completed. The data collected show that seismic risk perception in Italy is strongly underestimated; 86 out of 100 Italian citizens living in the most dangerous zone (namely Zone 1) do not have a correct perception of seismic hazard. From these observations we deem that urgent measures are required in Italy to communicate seismic risk effectively. Finally, the research presents a comparison of seismic risk perception between two groups: one involved in campaigns of information and education on seismic risk, and a control group.

  1. Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP

    USGS Publications Warehouse

    Huang, M.J.; Shakal, A.F.

    2009-01-01

The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway, and another is in planning. Two of the port structures have been seismically strengthened. The primary goals of the strong motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise they are not possible to install after construction. This paper summarizes the two seismically-strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

  2. A Very High Resolution, Deep-Towed Multichannel Seismic Survey in the Yaquina Basin off Peru - Technical Design of the new Deep-Tow Streamer

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2002-12-01

Within the INGGAS project, a new deep-towed acoustic profiling instrument, consisting of a side-scan sonar fish and a 26-channel seismic streamer, has been developed for operation at full ocean depth. The digital channels are built from single hydrophones and three engineering nodes (EN), which are connected by cable segments either 1 m or 6.5 m long. Together with high-frequency surface sources (e.g. a GI gun), this hybrid system makes it possible to complete surveys with target resolutions of higher frequency content than fully surface-based configurations. Consequently, special attention has been devoted to positioning of the submerged towed instrument. Ultra Short Base Line (USBL) navigation of the tow fish allows precise coordinate evaluation even with more than 7 km of tow cable. Specially designed engineering nodes comprise a single hydrophone with compass, depth, pitch, and roll sensors. Optional extension of the streamer up to 96 hydrophone nodes and 75 engineering nodes is possible. A telemetry device allows uplink and downlink transmission of all system parameters and all recorded data from the tow fish in real time. Signals from the streamer and the various side-scan sensors are multiplexed along the deep-sea cable. Within the telemetry system, coaxial and fiber-optic connectors are available and can be chosen according to the ship's needs. If bandwidth is limited, only selected portions of the data are transmitted onboard to provide full online quality control, while a copy of the complete data set is stored within the submerged systems. Onboard, the record strings of the side scan and streamer are demultiplexed and distributed to the quality control (QC) systems by Ethernet. A standard marine multichannel control system is used to display shot gathers, spectra, and noise monitoring of the streamer channels, as well as to store data in SEG format.
Precise navigation post-processing includes all available positioning information from the vessel (DGPS), the USBL, the streamer (EN), and, optionally, first-break information. Exact positioning of each hydrophone can therefore be provided throughout the entire survey, which is essential input for later migration processing of the seismic data.

  3. Application of bounding spectra to seismic design of piping based on the performance of above ground piping in power plants subjected to strong motion earthquakes

    SciTech Connect

    Stevenson, J.D.

    1995-02-01

This report extends the potential application of Bounding Spectra evaluation procedures, developed as part of the A-46 Unresolved Safety Issue applicable to seismic verification of in-situ electrical and mechanical equipment, to in-situ safety-related piping in nuclear power plants. The report presents a summary of earthquake experience data which define the behavior of typical U.S. power plant piping subjected to strong motion earthquakes. The report defines piping system caveats which, if met, assure the seismic adequacy of piping systems whose seismic demand is within the bounding spectra input. Based on the observed behavior of piping in strong motion earthquakes, the report describes the capability of piping systems to carry seismic loads as a function of the type of connection (i.e., threaded versus welded). This report also discusses in some detail the basic causes and mechanisms of earthquake damage and failures in power plant piping systems.

  4. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding earthquake phenomena and their effects at the earth's surface represents an important step toward educating the population in earthquake-affected regions of the country and raising awareness about earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN-INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering), and the software firm "BETA Software". The project has many educational, scientific, and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparation of comprehensive educational materials, and the design and testing of didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being an earthquake waveform archive. A large amount of such data will thus be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating awareness of seismic risk, for experimenting with effective scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers will be installed in schools in the most important seismic areas (Vrancea, Dobrogea), vulnerable cities (Bucharest, Ploiesti, Iasi), and highly populated places (Cluj, Sibiu, Timisoara, Zalău). All elements of the seismic station are designed especially for educational purposes and can be operated independently by the students and teachers themselves. The first stage of the ROEDUSEIS project centered on producing educational materials for all levels of pre-university education (kindergarten, primary, secondary, and high school). A needs assessment preceded the preparation of the educational materials, carried out through a set of questionnaires for teachers and students sent to participating schools. Their responses provided feedback for editing the materials properly. The topics covered in the educational materials include: seismicity (general principles, characteristics of Romanian seismicity, historical local events), the structure of the Earth, the measurement of earthquakes, and seismic hazard and risk.

  5. Seismic sources

    DOEpatents

    Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

    1987-04-20

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for more than about one minute. 9 figs.

  6. 2008 United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; and others

    2008-01-01

    The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.

  7. Seismic no-data zone, offshore Mississippi delta: depositional controls on geotechnical properties, velocity structure, and seismic attenuation

    SciTech Connect

    May, J.A.; Meeder, C.A.; Tinkle, A.R.; Wener, K.R.

    1986-09-01

    Seismic acquisition problems plague exploration and production offshore the Mississippi delta. Geologic and geotechnical analyses of 300-ft borings and 20-ft piston cores, combined with subbottom acoustic measurements, help identify and predict the locations, types, and magnitudes of anomalous seismic zones. This knowledge is used to design acquisition and processing techniques to circumvent the seismic problems.

  8. Seismic refraction exploration

    SciTech Connect

    Ruehle, W.H.

    1980-12-30

    In seismic exploration, refracted seismic energy is detected by seismic receivers to produce seismograms of subsurface formations. The seismograms are produced by directing seismic energy from an array of sources at an angle to be refracted by the subsurface formations and detected by the receivers. The directivity of the array is obtained by delaying the seismic pulses produced by each source in the source array.
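The delay-and-fire directivity described above can be sketched numerically: for a line array, firing source i after a delay of i·Δx·sin(θ)/V tilts the composite wavefront by the steering angle θ. This is a minimal sketch; the spacing, angle, and near-surface velocity below are illustrative assumptions, not values from the patent.

```python
import math

def steering_delays(n_sources, spacing_m, angle_deg, velocity_ms):
    """Firing delays (s) that tilt a line array's wavefront by angle_deg.

    Delaying source i by i * spacing * sin(angle) / velocity makes the
    superposed wavefront leave the array at the desired angle, directing
    energy toward the refractor.  Values here are illustrative only.
    """
    theta = math.radians(angle_deg)
    return [i * spacing_m * math.sin(theta) / velocity_ms
            for i in range(n_sources)]

delays = steering_delays(n_sources=6, spacing_m=25.0,
                         angle_deg=30.0, velocity_ms=2000.0)
# Delays grow linearly along the array; the first source fires at t = 0.
```

The linear delay gradient is what produces the directivity: each source's pulse arrives in phase along the chosen takeoff direction and out of phase elsewhere.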

  9. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.
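For a single mode, the seismic-induced structural dynamics mentioned above reduce to a damped oscillator driven by ground acceleration, u'' + 2ζωu' + ω²u = -a_g(t), which is the building block of response-spectrum methods. A minimal sketch with a synthetic ground motion follows; the excitation, frequencies, and damping ratio are illustrative assumptions, not values from the paper.

```python
import math

def sdof_peak_response(freq_hz, damping, accel, dt):
    """Peak relative displacement of a damped SDOF oscillator
    u'' + 2*zeta*w*u' + w^2*u = -a_g(t), by central differences."""
    w = 2.0 * math.pi * freq_hz
    u_prev, u = 0.0, 0.0
    peak = 0.0
    for a_g in accel:
        vel = (u - u_prev) / dt                       # backward-difference velocity
        acc = -a_g - 2.0 * damping * w * vel - w * w * u
        u_next = 2.0 * u - u_prev + acc * dt * dt     # central-difference step
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Synthetic 2 Hz ground-acceleration history (illustrative: 4 s at 1 ms steps)
dt = 0.001
accel = [3.0 * math.sin(2.0 * math.pi * 2.0 * i * dt) for i in range(4000)]
# A 2 Hz oscillator resonates with the 2 Hz input; a stiff 20 Hz one barely moves.
resonant = sdof_peak_response(2.0, 0.05, accel, dt)
stiff = sdof_peak_response(20.0, 0.05, accel, dt)
```

Repeating this calculation over a range of oscillator frequencies yields the response spectrum used as seismic input in analyses like those described.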

  10. Verifying an interactive consistency circuit: A case study in the reuse of a verification technology

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1990-01-01

    The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.
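Interactive consistency among four processors with at most one fault is classically achieved by the oral-messages algorithm OM(1); the sketch below illustrates that round structure in software. It is not the verified hardware design itself, and the simple bit-flipping adversary is an assumption for illustration.

```python
def om1(commander_value, faulty=None):
    """Oral-messages algorithm OM(1): 1 commander (node 0) + 3 lieutenants.

    Tolerates one traitor among the four nodes.  Here a faulty node flips
    the bit depending on the receiver, trying to split the loyal nodes
    (one simple adversary, for illustration).  Returns each lieutenant's
    decision.
    """
    lieutenants = [1, 2, 3]

    def sent(sender, value, receiver):
        if sender == faulty:
            return value ^ (receiver % 2)   # traitor sends inconsistent bits
        return value

    # Round 1: commander sends its value to each lieutenant.
    direct = {i: sent(0, commander_value, i) for i in lieutenants}
    # Round 2: each lieutenant relays what it received to the others.
    relayed = {i: {j: sent(j, direct[j], i) for j in lieutenants if j != i}
               for i in lieutenants}
    # Decision: majority over own direct value and the two relayed copies.
    decisions = {}
    for i in lieutenants:
        votes = [direct[i]] + list(relayed[i].values())
        decisions[i] = 1 if sum(votes) >= 2 else 0
    return decisions
```

With a faulty lieutenant, the loyal lieutenants still follow a loyal commander; with a faulty commander, the lieutenants still agree with each other, which is the interactive consistency property the hardware implements as an explicit instruction.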

  11. A university-developed seismic source for shallow seismic surveys

    NASA Astrophysics Data System (ADS)

    Yordkayhun, Sawasdee; Na Suwan, Jumras

    2012-07-01

The main objectives of this study were to (1) design and develop a low-cost seismic source for shallow seismic surveys and (2) test the performance of the developed source at a test site. The surface seismic source, referred to here as a university-developed seismic source, is based upon the principle of an accelerated weight drop. A 30 kg activated mass is lifted by a mechanical rack-and-pinion gear and is accelerated by a mounted spring. When the mass is released from 0.5 m above the surface, it hits a 30 kg base plate and energy is transferred to the ground, generating a seismic wave. The developed source is portable, environmentally friendly, easy to operate and maintain, and is a highly repeatable impact source. To compare the developed source with a sledgehammer source, a source test was performed at a test site, a study site for mapping a major fault zone in southern Thailand. The sledgehammer and the developed sources were shot along a 300 m long seismic reflection profile with the same parameters. Data were recorded using a 12-channel off-end geometry with source and receiver spacings of 5 m, resulting in CDP stacked sections with 2.5 m between traces. Source performances were evaluated based on analyses of signal penetration, frequency content, and repeatability, as well as a comparison of the stacked sections. The results show that both surface sources are suitable for seismic studies down to a depth of about 200 m at the site. The hammer data are characterized by relatively higher-frequency signals than the developed-source data, whereas the developed source generates signals with overall higher energy transmission and greater signal penetration. In addition, the repeatability of the developed source is considerably higher than that of the hammer source.
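A back-of-envelope check on the source is straightforward: gravity alone gives the 30 kg mass dropped from 0.5 m about 147 J at impact. The mounted spring adds further, unspecified energy, so these figures are lower bounds.

```python
import math

# Gravity-only energy budget of the 30 kg mass dropped from 0.5 m.
# The mounted spring adds further (unspecified) energy, so these are
# lower bounds for the developed source.
m, h, g = 30.0, 0.5, 9.81          # kg, m, m/s^2
energy_j = m * g * h               # potential energy converted at impact
impact_v = math.sqrt(2.0 * g * h)  # impact speed, free fall only
# energy_j ≈ 147 J, impact_v ≈ 3.1 m/s
```

For comparison, an 8 kg sledgehammer swung at a few metres per second delivers a broadly similar order of energy, which is consistent with the two sources being comparable at this site.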

  12. Seismic reflection imaging of shallow oceanographic structures

    NASA Astrophysics Data System (ADS)

Piété, Helen; Marié, Louis; Marsset, Bruno; Thomas, Yannick; Gutscher, Marc-André

    2013-05-01

Multichannel seismic (MCS) reflection profiling can provide high lateral resolution images of deep ocean thermohaline fine structure. However, the shallowest layers of the water column (z < 150 m) have remained unexplored by this technique until recently. In order to explore the feasibility of shallow seismic oceanography (SO), we reprocessed and analyzed four multichannel seismic reflection sections featuring reflectors at depths between 10 and 150 m. The influence of the acquisition parameters was quantified, and seismic data processing dedicated to SO was also investigated. Conventional seismic acquisition systems were found to be ill-suited to the imaging of shallow oceanographic structures, because of a strong antenna filter effect induced by large offsets and seismic trace lengths, and because typical sources cannot provide both a high level of emission and fine vertical resolution. We considered a test case: the imaging of the seasonal thermocline on the western Brittany continental shelf. New oceanographic data acquired in this area allowed simulation of the seismic acquisition. Sea trials of a specifically designed system were performed during the ASPEX survey, conducted in early summer 2012. The seismic device featured: (i) four seismic streamers, each consisting of six traces of 1.80 m; (ii) a 1000 J SIG sparker source, providing a 400 Hz signal with an emission level of 205 dB re 1 μPa @ 1 m. This survey captured the 15 m thick, 30 m deep seasonal thermocline in unprecedented detail, showing images of vertical displacements most probably induced by internal waves.
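The source trade-off noted above can be quantified with the usual quarter-wavelength resolution criterion; assuming a typical seawater sound speed of 1500 m/s (an assumption, not a value from the paper), the 400 Hz sparker signal gives roughly metre-scale vertical resolution.

```python
# Quarter-wavelength vertical resolution of the sparker source.
# 1500 m/s is a typical seawater sound speed, assumed here.
sound_speed = 1500.0   # m/s
freq = 400.0           # Hz, dominant frequency of the SIG sparker signal
wavelength = sound_speed / freq   # 3.75 m
resolution = wavelength / 4.0     # ≈ 0.94 m
# Fine enough to resolve internal structure of a 15 m thick thermocline.
```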

  13. IMPLEMENTATION OF SEISMIC STOPS IN PIPING SYSTEMS.

    SciTech Connect

    BEZLER,P.

    1993-02-01

Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station - Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code, ''GAPPIPE'', developed by RLCA. The code has essentially all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method; an independent evaluation of the Byron piping using the nonlinear time-history capability of the ANSYS computer code, with comparisons to the RLCA results; and an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.
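GAPPIPE itself is proprietary, but the idea of equivalent linearization for a gapped support can be sketched generically: the support exerts no force until the clearance gap g closes, and one simple linearization replaces it with the amplitude-dependent secant stiffness k_eq = k(1 − g/A) for response amplitudes A > g. The stiffness and gap values below are illustrative assumptions, and the secant choice is only one possible linearization, not necessarily GAPPIPE's.

```python
def gap_force(x, k, gap):
    """Restoring force of a symmetric gapped support: zero inside the
    clearance gap, linear spring of stiffness k once the gap closes."""
    if abs(x) <= gap:
        return 0.0
    return k * (abs(x) - gap) * (1.0 if x > 0 else -1.0)

def secant_stiffness(amplitude, k, gap):
    """Secant (equivalent linear) stiffness at a given response amplitude.
    One simple linearization choice; GAPPIPE's actual scheme is proprietary."""
    if amplitude <= gap:
        return 0.0
    return k * (1.0 - gap / amplitude)

# Example: a 1e6 N/m spring with a 2 mm gap, 5 mm response amplitude.
k_eq = secant_stiffness(0.005, 1.0e6, 0.002)   # 6e5 N/m
```

The equivalent stiffness depends on the (unknown) response amplitude, which is why such methods iterate between a linear analysis and an updated linearization.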

  14. Implementation of Seismic Stops in Piping Systems

    SciTech Connect

    Bezler, P.; Simos, N.; Wang, Y.K.

    1993-02-01

Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code, ''GAPPIPE'', developed by RLCA. The code has essentially all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method; an independent evaluation of the Byron piping using the nonlinear time-history capability of the ANSYS computer code, with comparisons to the RLCA results; and an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.

  15. Buried tank-to-tank interaction during a seismic event

    SciTech Connect

    Moore, C.J.; Wagenblast, G.R.; Day, J.P.

    1995-12-01

Three-dimensional dynamic soil-structure interaction seismic analyses have become practical and accepted only since 1980. This new capability allows the study of interaction among closely spaced buried tanks during a seismic event. This paper presents the results of two studies of seismic tank-to-tank interaction at the US Department of Energy's Hanford Site. One study evaluates seismic tank-to-tank interaction for an existing reinforced concrete tank design used during construction of the Hanford Site in the 1940s. The other study evaluates seismic interaction and radius of separation for newly designed Hanford double-shelled buried waste tanks that are to be constructed.

  16. LANL seismic screening method for existing buildings

    SciTech Connect

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.
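The two-phase pass/fail logic can be illustrated with a toy filter; the capacity/demand threshold and return strings below are hypothetical, not the Laboratory's actual scoring rules.

```python
def screen_building(perf_cat, structure_type, capacity, demand,
                    cd_threshold=1.0):
    """Toy version of the two-phase screening logic.

    PC-3/PC-4 buildings and unreinforced-masonry types automatically fail
    (as in the LANL method); otherwise the building passes when its
    capacity/demand ratio meets a threshold.  The threshold value and the
    return strings here are hypothetical.
    """
    if perf_cat >= 3:
        return "fail: detailed analysis required"
    if structure_type in ("URM bearing wall", "masonry infill shear wall"):
        return "fail: detailed analysis required"
    ratio = capacity / demand
    return "pass" if ratio >= cd_threshold else f"fail: C/D = {ratio:.2f}"

# A PC-1 steel frame with adequate capacity screens out; a PC-3 building
# or a URM bearing-wall building is always referred to detailed analysis.
```

Ranking the failed buildings by their C/D ratios then prioritizes them for further evaluation, as the abstract describes.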

  17. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  18. Integrated seismic monitoring in Slovakia

    NASA Astrophysics Data System (ADS)

    Bystrický, E.; Kristeková, M.; Moczo, P.; Cipciar, A.; Fojtíková, L.; Pažák, P.; Gális, M.

    2009-04-01

Two seismic networks are operated on the territory of the Slovak Republic by two academic institutions. The Geophysical Institute of the Slovak Academy of Sciences operates the Slovak National Network of Seismic Stations (SNNSS, established in 2004), and the Faculty of Mathematics, Physics and Informatics, Comenius University Bratislava operates the Local Seismic Network Eastern Slovakia (LSNES, established in 2007). SNNSS is focused on regional seismicity and participates in the international data exchange on a regular basis. LSNES, designed to be compatible and complementary with the existing SNNSS infrastructure, is focused on the seismicity of the eastern Slovakia source zone. The two networks share a database and archive, so the expenses and workload of the joint data center operation are split between the two institutions. The cooperation enhances the overall reliability of the data center without interfering with the original scopes of the two networks. A relational database with a thin client based on a standard web browser is implemented; maintenance requirements for clients are reduced to a minimum, and system integrity is easier to manage. The database manages parametric data, macroseismic data, waveform data, inventory data, and geographic data. It is not only a central part of the data processing of the two institutions; it also forms the core of the warning system. The warning system functionality requires modules beyond the standard seismic database functionality: modules for editing, publishing, and automatic processing of macroseismic questionnaires were implemented for this purpose, and the database integrates macroseismic data with other seismic data.

  19. Seismic sources

    DOEpatents

    Green, Michael A.; Cook, Neville G. W.; McEvilly, Thomas V.; Majer, Ernest L.; Witherspoon, Paul A.

    1992-01-01

Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole relative to a stator that is clamped to the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements at a power level that causes heating to over 150.degree. C within one minute of operation, but energizing the elements for no more than about one minute.

  20. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope.

    PubMed

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. The paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response of the slope follows different variation rules under near-field and far-field earthquakes, and that the damage location and pattern differ with groundwater conditions. When no groundwater is present, destruction starts at the top of the slope, indicating a pronounced whipping effect under the earthquake. When groundwater levels are high, destruction starts at the toe of the slope, and the top of the slope shows obvious seismic subsidence after the earthquake. Furthermore, the presence of groundwater provides a certain damping effect. PMID:26560103

  1. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope

    PubMed Central

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. The paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response of the slope follows different variation rules under near-field and far-field earthquakes, and that the damage location and pattern differ with groundwater conditions. When no groundwater is present, destruction starts at the top of the slope, indicating a pronounced whipping effect under the earthquake. When groundwater levels are high, destruction starts at the toe of the slope, and the top of the slope shows obvious seismic subsidence after the earthquake. Furthermore, the presence of groundwater provides a certain damping effect. PMID:26560103

  2. The intelligent seismic retrofitting of structure based on the magnetorheological dampers

    NASA Astrophysics Data System (ADS)

    Li, Xiu-ling; Li, Hong-nan

    2009-03-01

Based on the state of the art in seismic damage principles and aseismic strengthening technology, an analysis and design method for the seismic retrofitting of earthquake-damaged reinforced concrete frames using magnetorheological (MR) dampers is proposed. Three levels of fortification objectives are put forward and quantified for the intelligent retrofitting of reinforced concrete frames using MR dampers. An experimental system for a three-floor reinforced concrete frame-shear wall eccentric structure has been built in the Matlab/Simulink software environment using the hardware/software resources of dSPACE. The shaking table experiment on the seismic retrofitting of the earthquake-damaged reinforced concrete frame-shear wall structure using MR dampers is implemented with rapid control prototyping (RCP) technology. The validity of the passive control strategies and the semi-active control strategy is verified under El Centro earthquake excitations with different peak values. The experimental results indicate that MR dampers can significantly enhance the aseismic performance of the seismically damaged reinforced concrete frame and meet all the earthquake fortification levels. The aseismic ability of the auto-reinforced MR damper intelligent structural system is much better than that of both the damaged structure and the structure reinforced with passive dampers.
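The paper does not spell out its control laws, but a common semi-active strategy for MR dampers is on-off skyhook switching: use the high damper setting only when the damper force opposes the structure's absolute motion. A minimal 1-DOF sketch under resonant base excitation follows; all parameters, and the skyhook rule itself, are illustrative assumptions rather than the referenced experiment's design.

```python
import math

def simulate(c_rule, t_end=20.0, dt=0.001):
    """1-DOF structure on a shaking base; the damper applies c * (relative
    velocity).  c_rule(absolute_vel, relative_vel) picks c every step.
    Parameters are illustrative, not from the referenced experiment."""
    m, k = 1000.0, 4.0e4                     # kg, N/m (natural freq ~1 Hz)
    wn = math.sqrt(k / m)
    x = v = vb = 0.0                         # relative disp/vel, base velocity
    peak = 0.0
    for i in range(int(t_end / dt)):
        ag = 1.0 * math.sin(wn * i * dt)     # resonant base acceleration, m/s^2
        c = c_rule(v + vb, v)                # absolute vel = relative + base
        a = (-k * x - c * v) / m - ag        # relative-coordinate equation
        v += a * dt
        x += v * dt
        vb += ag * dt
        peak = max(peak, abs(x))
    return peak

C_LOW, C_HIGH = 200.0, 4000.0

def skyhook(abs_vel, rel_vel):
    # On-off skyhook: high setting only when damping opposes absolute motion.
    return C_HIGH if abs_vel * rel_vel > 0.0 else C_LOW

peak_low = simulate(lambda a, r: C_LOW)      # damper left at its low state
peak_sky = simulate(skyhook)                 # semi-active switching
```

Under this resonant excitation the switching rule suppresses the resonant build-up relative to the low, constant damper setting, which is the basic benefit semi-active MR control aims for.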

  3. Seismic waveform viewer, processor and calculator

    Energy Science and Technology Software Center (ESTSC)

    2015-02-15

SWIFT is a computer code designed to do research-level signal analysis on seismic waveforms, including visualization, filtering, and measurement. LLNL is using this code in amplitude and global tomography efforts.
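The kind of waveform filtering such a tool performs can be sketched with a zero-phase FFT band-pass; this is a generic illustration, not SWIFT's actual algorithm, and the synthetic trace is an assumption.

```python
import numpy as np

def fft_bandpass(trace, dt, f_lo, f_hi):
    """Zero-phase band-pass by masking FFT bins outside [f_lo, f_hi].
    A generic illustration of waveform filtering, not SWIFT's algorithm."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(trace))

# Synthetic "seismogram": a 5 Hz signal buried under 50 Hz noise.
dt = 0.002                                   # 500 samples/s
t = np.arange(0, 4.0, dt)
signal = np.sin(2 * np.pi * 5.0 * t)
trace = signal + 0.8 * np.sin(2 * np.pi * 50.0 * t)
filtered = fft_bandpass(trace, dt, 2.0, 10.0)
# The 50 Hz component is removed; the 5 Hz arrival is preserved.
```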

  4. Verifying the Dependence of Fractal Coefficients on Different Spatial Distributions

    SciTech Connect

    Gospodinov, Dragomir; Marekova, Elisaveta; Marinov, Alexander

    2010-01-21

A fractal distribution requires that the number of objects larger than a specific size r has a power-law dependence on the size: N(r) = C r^(-D), i.e. N(r) ∝ r^(-D), where D is the fractal dimension. Usually the correlation integral is calculated to estimate the correlation fractal dimension of epicentres. A 'box-counting' procedure could also be applied, giving the 'capacity' fractal dimension. The fractal dimension can be an integer, in which case it is equivalent to a Euclidean dimension (zero for a point, one for a segment, two for a square, and three for a cube). In general, however, the fractal dimension is a fractional value, which is the origin of the term 'fractal'. The use of a power law to statistically describe a set of events or phenomena reveals the lack of a characteristic length scale; that is, fractal objects are scale invariant. Scale invariance and chaotic behavior underlie many natural hazard phenomena. Many studies of earthquakes reveal that their occurrence exhibits scale-invariant properties, so the fractal dimension can characterize them. It has first been confirmed that both aftershock rate decay in time and earthquake size distribution follow a power law. Recently many other earthquake distributions have been found to be scale-invariant. The spatial distribution of both regional seismicity and aftershocks shows some fractal features. Earthquake spatial distributions are considered fractal, but indirectly. There are two possible models which result in fractal earthquake distributions. The first considers that a fractal distribution of faults leads to a fractal distribution of earthquakes, because each earthquake is characteristic of the fault on which it occurs. The second assumes that each fault has a fractal distribution of earthquakes. 
Observations strongly favour the first hypothesis. Fractal coefficient analysis provides some important advantages in examining earthquake spatial distribution: it offers a simple way to quantify scale-invariant distributions of complex objects or phenomena by a small number of parameters, and it is becoming evident that the applicability of fractal distributions to geological problems could have a more fundamental basis, since chaotic behaviour could underlie geotectonic processes and the applicable statistics could often be fractal. The application of fractal distribution analysis has, however, some specific aspects. It is usually difficult to present an adequate interpretation of the obtained values of fractal coefficients for earthquake epicentre or hypocentre distributions. That is why in this paper we aimed at another goal - to verify how a fractal coefficient depends on different spatial distributions. We simulated earthquake spatial data by generating random points first in a 3D space (a cube), then in a parallelepiped, diminishing one of its sides. We then continued this procedure in 2D and 1D space. For each simulated data set we calculated the points' fractal coefficient (correlation fractal dimension of epicentres) and then checked for correlation between the coefficient values and the type of spatial distribution. In that way one can obtain a set of standard fractal coefficient values for varying spatial distributions, which can then be used when real earthquake data are analyzed, by comparing the real-data coefficients to the standard ones. Such an approach can help in interpreting fractal analysis results through different types of spatial distributions.
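The verification procedure described (simulate point clouds of different dimensionality, then estimate the correlation fractal dimension) can be sketched as follows. The probe radii and point counts are illustrative assumptions, and edge effects bias the estimates slightly low.

```python
import numpy as np

def correlation_dimension(points, r1=0.1, r2=0.3):
    """Correlation fractal dimension from the correlation integral
    C(r) = (2 / N(N-1)) * #{pairs closer than r}, estimated as the
    log-log slope between two probe radii inside the scaling range."""
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    pair_d = dist[np.triu_indices(len(points), k=1)]   # unique pairs only
    c1 = np.mean(pair_d < r1)
    c2 = np.mean(pair_d < r2)
    return np.log(c2 / c1) / np.log(r2 / r1)

rng = np.random.default_rng(0)
n = 600
cube = rng.random((n, 3))                        # points filling a unit cube
plane = np.c_[rng.random((n, 2)), np.zeros(n)]   # a degenerate (2-D) cloud
line = np.c_[rng.random(n), np.zeros((n, 2))]    # a 1-D cloud
d3 = correlation_dimension(cube)
d2 = correlation_dimension(plane)
d1 = correlation_dimension(line)
# The estimate tracks the true dimensionality of each cloud (d1 < d2 < d3),
# sitting slightly below 1, 2, and 3 because of edge effects.
```

Tabulating such estimates for clouds of progressively flattened shape gives exactly the kind of standard-value reference the paper proposes for interpreting real epicentre data.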

  5. The ENAM Explosive Seismic Source Test

    NASA Astrophysics Data System (ADS)

    Harder, S. H.; Magnani, M. B.

    2013-12-01

We present the results of the pilot study conducted as part of the eastern North American margin (ENAM) community seismic experiment (CSE) to test an innovative design of land explosive seismic source for crustal-scale seismic surveys. The ENAM CSE is a community-based onshore-offshore controlled- and passive-source seismic experiment spanning a 400 km-wide section of the mid-Atlantic East Coast margin around Cape Hatteras. The experiment was designed to address prominent research questions such as the role of the pre-existing lithospheric grain in the structure and evolution of the ENAM margin, the distribution of magmatism, and the along-strike segmentation of the margin. In addition to a broadband OBS deployment, the CSE will acquire multichannel marine seismic data and two major onshore-offshore controlled-source seismic profiles recording both marine sources (airguns) and land explosions. The data acquired as part of the ENAM CSE will be available to the community immediately upon completion of the QC procedures required for archiving purposes. The ENAM CSE provides an opportunity to test a radically new and more economical design for the land explosive seismic sources used in crustal-scale seismic surveys. Over the years we have incrementally improved the performance and reduced the cost of shooting crustal seismic shots. These improvements, which have come from better explosives and more efficient configurations of those explosives, are largely intuitive: higher-velocity explosives and shorter, larger-diameter explosive configurations. Recent theoretical advances, however, allow us not only to model these incremental improvements but also to move to more radical shot designs that further enhance performance and reduce costs. Because some of these designs are so radical, they need experimental verification. To better engineer the shots for the ENAM experiment, we are conducting an explosives test in the region of the ENAM CSE. The results of this test will guide engineering for the main ENAM experiment as well as other experiments in the future.

  6. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  7. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  8. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial Administration COURT SERVICES AND OFFENDER SUPERVISION AGENCY FOR THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access...

  9. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  10. Flutter Stability Verified for the Trailing Edge Blowing Fan

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Srivastava, Rakesh

    2005-01-01

The TURBO-AE aeroelastic code has been used to verify the flutter stability of the trailing edge blowing (TEB) fan, which is a unique technology demonstrator being designed and fabricated at the NASA Glenn Research Center for testing in Glenn's 9- by 15-Foot Low-Speed Wind Tunnel. Air can be blown out of slots near the trailing edges of the TEB fan blades to fill in the wakes downstream of the rotating blades, which reduces the rotor-stator interaction (tone) noise caused by the interaction of wakes with the downstream stators. The TEB fan will demonstrate a 1.6-EPNdB reduction in tone noise through wake filling. Furthermore, the reduced blade-row interaction will decrease the possibility of forced-response vibrations and enable closer spacing of blade rows, thus reducing engine length and weight. The detailed aeroelastic analysis capability of the three-dimensional Navier-Stokes TURBO-AE code was used to check the TEB fan rotor blades for flutter stability. Flutter calculations were first performed with no TEB flow; then select calculations were repeated with TEB flow turned on.

  11. VISION - Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics

    SciTech Connect

    Steven J. Piet; A. M. Yacout; J. J. Jacobson; C. Laws; G. E. Matthern; D. E. Shropshire

    2006-02-01

    The U.S. DOE Advanced Fuel Cycle Initiative’s (AFCI) fundamental objective is to provide technology options that - if implemented - would enable long-term growth of nuclear power while improving sustainability and energy security. The AFCI organization structure consists of four areas: Systems Analysis, Fuels, Separations, and Transmutations. The Systems Analysis Working Group is tasked with bridging the program technical areas and providing the models, tools, and analyses required to assess the feasibility of design and deployment options and inform key decision makers. An integral part of the Systems Analysis tool set is the development of a system-level model that can be used to examine the implications of different mixes of reactors, implications of fuel reprocessing, impact of deployment technologies, as well as potential "exit" or "off ramp" approaches to phase out technologies, waste management issues, and long-term repository needs. The Verifiable Fuel Cycle Simulation Model (VISION) is a computer-based simulation model that allows performing dynamic simulations of fuel cycles to quantify infrastructure requirements and identify key trade-offs between alternatives. It is based on the current AFCI system analysis tool "DYMOND-US" functionalities in addition to economics, isotopic decay, and other new functionalities. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI and Generation IV reactor development studies.

  12. Broadband seismology and small regional seismic networks

    USGS Publications Warehouse

    Herrmann, Robert B.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  13. Regional Seismic Methods of Identifying Explosions

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Hauk, T. F.

    2013-12-01

    A lesson from the 2006, 2009 and 2013 DPRK declared nuclear explosion Ms:mb observations is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, we need to put our empirical methods on a firmer physical footing. Here we review two of the main identification methods: 1) P/S ratios and 2) Moment Tensor techniques, which can be applied at regional distances (200-1600 km) to very small events, improving nuclear explosion monitoring and confidence in verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes (e.g. Walter et al., 1995). However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. Calculated intermediate period (10-100s) waveforms from regional 1-D models can match data and provide moment tensor results that separate explosions from earthquakes and cavity collapses (e.g. Ford et al. 2009). However, it has long been observed that some nuclear tests produce large Love waves and reversed Rayleigh waves that complicate moment tensor modeling. Again, the physical basis for the generation of these effects from explosions remains incompletely understood. We are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative amplitude techniques can be employed to better quantify differences between explosions and to understand those differences in terms of depth, media, and other properties.
We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Our goal is to improve our explosion models and our ability to understand and predict where P/S and moment tensor methods of identifying explosions work, and any circumstances where they may not. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
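    The P/S discriminant described above reduces, at its core, to comparing band-limited amplitudes measured in P and S wave windows. A minimal sketch of that comparison follows; the RMS amplitude measure, the 1.5 threshold, and the toy waveform windows are illustrative choices, not values from the studies cited.

```python
import math

def rms(window):
    """Root-mean-square amplitude of a waveform window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def p_s_ratio(p_window, s_window):
    """High-frequency P/S amplitude ratio; explosions tend to be S-poor,
    so ratios well above ~1 suggest an explosion source."""
    return rms(p_window) / rms(s_window)

def classify(ratio, threshold=1.5):
    # threshold is illustrative; real studies calibrate it per path and band
    return "explosion-like" if ratio > threshold else "earthquake-like"

# Toy windows: an S-poor (explosion-like) record vs. an S-rich one
explosionish = p_s_ratio([0.9, -1.1, 1.0, -0.8], [0.2, -0.3, 0.25, -0.2])
quakeish = p_s_ratio([0.4, -0.5, 0.45, -0.4], [1.0, -1.2, 1.1, -0.9])
```

    In practice, the ratio is formed in a narrow high-frequency band (above ~2 Hz per the abstract) after path and site corrections, which is where the physical-basis questions the authors raise come in.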

  14. Seismic qualification of unanchored equipment

    SciTech Connect

    Moran, T.J.

    1995-12-01

    This paper describes procedures used to design and qualify unanchored equipment to survive seismic events to the PC = 4 level in a moderate seismic area. The need for flexibility to move experimental equipment, together with the requirements for remote handling in a highly radioactive non-reactor nuclear facility, precluded normal equipment anchorage. Instead, equipment was designed to remain stable under anticipated DBE floor motions with sufficient margin to achieve the performance goal. The equipment was also designed to accommodate anticipated sliding motions with sufficient margin. The simplified design criteria used to achieve these goals were based on extensive time-history simulations of sliding, rocking, and overturning of generic equipment models. The entire process was subject to independent peer review and accepted in a Safety Evaluation Report. The process provides a model suitable for adaptation to similar applications and for assessment of the potential for seismic damage of existing, unanchored equipment. In particular, the paper describes: (1) Two-dimensional sliding studies of deformable equipment subject to 3-D floor excitation as the basis for simplified sliding radius and sliding velocity design criteria. (2) Two-dimensional rocking and overturning simulations of rigid equipment used to establish design criteria for minimum base dimensions and equipment rigidity to prevent overturning. (3) Assumed-mode rocking analyses of deformable equipment models used to establish uplift magnitudes and subsequent impacts during stable rocking motions. The model used for these dynamic impact studies is reported elsewhere.

  15. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system operable on command having a variable frequency oscillator generating an audio frequency signal proportional to the number of said seismic movements.
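    The bandpass-threshold-count chain the patent describes can be sketched as a threshold detector with a hold-off (dead time) so a single footstep produces a single count. The threshold, dead time, and sample trace below are illustrative, not taken from the patent.

```python
def count_seismic_events(samples, threshold, dead_time=3):
    """Count threshold crossings in a (already bandpass-filtered) trace,
    ignoring samples within `dead_time` samples of the last detection
    so one seismic movement is not counted twice."""
    count = 0
    last = -dead_time  # allow a detection on the very first sample
    for i, s in enumerate(samples):
        if abs(s) >= threshold and i - last >= dead_time:
            count += 1
            last = i
    return count

# Toy geophone trace with three distinct disturbances
trace = [0.0, 0.1, 0.9, 0.8, 0.1, 0.0, -0.95, -0.2, 0.05, 1.1]
events = count_seismic_events(trace, threshold=0.7)  # -> 3
```

    The stored count would then drive the audio readout: the patent's monitoring system emits an audio tone whose frequency is proportional to this count.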

  16. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    PubMed

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age. Three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, as well as a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  17. Advanced Seismic While Drilling System

    SciTech Connect

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high-frequencies. This is counter to the need for long signal transmission through rock. The patent pending SeismicPULSER{trademark} methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to have the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate low frequencies required for deep wells. The low frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. 
An 8-in diameter by 26-ft long SeismicPULSER{trademark} drill string tool was designed and manufactured by TII. An APS Turbine Alternator powered the SeismicPULSER{trademark} to produce two Hz frequency peak signals repeated every 20 seconds. Since the ION Geophysical, Inc. (ION) seismic survey surface recording system was designed to detect a minimum downhole signal of three Hz, successful performance was confirmed with a 5.3 Hz recording with the pumps running. The two Hz signal generated by the sparker was modulated with the 3.3 Hz signal produced by the mud pumps to create an intense 5.3 Hz peak frequency signal. The low frequency sparker source is ultimately capable of generating selectable peak frequencies of 1 to 40 Hz with high-frequency spectra content to 10 kHz. The lower frequencies and, perhaps, low-frequency sweeps, are needed to achieve sufficient range and resolution for realtime imaging in deep (15,000 ft+), high-temperature (150 C) wells for (a) geosteering, (b) accurate seismic hole depth, (c) accurate pore pressure determinations ahead of the bit, (d) near wellbore diagnostics with a downhole receiver and wired drill pipe, and (e) reservoir model verification. Furthermore, the pressure of the sparker bubble will disintegrate rock, resulting in increased overall rates of penetration. Other applications for the SeismicPULSER{trademark} technology are to deploy a low-frequency source for greater range on a wireline for Reverse Vertical Seismic Profiling (RVSP) and Cross-Well Tomography. Commercialization of the technology is being undertaken by first contacting stakeholders to define the value proposition for rig site services utilizing SeismicPULSER{trademark} technologies. Stakeholders include national oil companies, independent oil companies, service companies, and commercial investors. Service companies will introduce a new Drill Bit SWD service for deep HTHP wells.
Collaboration will be encouraged between stakeholders in the form of joint industry projects to develop prototype tools and initial field trials. No barriers have been identified for developing, utilizing, and exploiting the low-frequency SeismicPULSER{trademark} source in a variety of applications. Risks will be minimized since Drill Bit SWD will not interfere with the drilling operation, and can be performed in a relatively quiet environment when the pumps are turned off. The new source must be integrated with other Measurement While Drilling (MWD) tools. To date, each of the oil companies and service companies contacted has shown interest in participating in the commercialization of the low-frequency SeismicPULSER{trademark} source. A technical paper has been accepted for presentation at the 2009 Offshore Technology Conference (OTC) in a Society of Exploration Geophysicists/American Association of Petroleum Geologists (SEG/AAPG) technical session.
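    The 5.3 Hz peak reported above is the sum frequency produced when the 2 Hz sparker signal is modulated by the 3.3 Hz mud-pump signal. The product-to-sum trigonometric identity behind that mixing can be checked numerically; the sampling choices below are arbitrary, and this is only a sketch of the mixing arithmetic, not of the downhole physics.

```python
import math

f1, f2 = 2.0, 3.3  # sparker and mud-pump frequencies, Hz (from the abstract)

def mixed(t):
    """Product of the two tones: amplitude modulation."""
    return math.cos(2 * math.pi * f1 * t) * math.cos(2 * math.pi * f2 * t)

def sum_diff(t):
    """Trig identity: the product equals equal-amplitude tones at the
    sum frequency (5.3 Hz) and the difference frequency (1.3 Hz)."""
    return 0.5 * (math.cos(2 * math.pi * (f1 + f2) * t)
                  + math.cos(2 * math.pi * (f2 - f1) * t))

# The two expressions agree at every sampled instant
match = all(abs(mixed(t) - sum_diff(t)) < 1e-9
            for t in (k * 0.01 for k in range(200)))
```

    The recording system's 3 Hz detection floor is why the 5.3 Hz sum component, rather than the 2 Hz fundamental, is the signal that confirmed performance.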

  18. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. 
In some easy-access environments, this device could make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. 
Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our work is a significant departure from conventional seismic-survey field procedures.

  19. Seismic-Scale Rock Physics of Methane Hydrate

    SciTech Connect

    Amos Nur

    2009-01-08

    We quantify natural methane hydrate reservoirs by generating synthetic seismic traces and comparing them to real seismic data: if the synthetic matches the observed data, then the reservoir properties and conditions used in synthetic modeling might be the same as the actual, in-situ reservoir conditions. This approach is model-based: it uses rock physics equations that link the porosity and mineralogy of the host sediment, pressure, and hydrate saturation, and the resulting elastic-wave velocity and density. One result of such seismic forward modeling is a catalogue of seismic reflections of methane hydrate which can serve as a field guide to hydrate identification from real seismic data. We verify this approach using field data from known hydrate deposits.
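    The rock-physics link from reservoir properties to seismic response runs through acoustic impedance and, at an interface, the normal-incidence reflection coefficient. A minimal sketch follows; the velocities and densities are illustrative placeholders, not values from the study.

```python
def acoustic_impedance(velocity_m_s, density_kg_m3):
    """Acoustic impedance Z = velocity * density."""
    return velocity_m_s * density_kg_m3

def reflection_coefficient(z1, z2):
    """Normal-incidence reflectivity at an interface:
    R = (Z2 - Z1) / (Z2 + Z1), with z1 above and z2 below."""
    return (z2 - z1) / (z2 + z1)

# Hypothetical numbers: hydrate-bearing sand over free-gas sand.
# Hydrate stiffens the sediment (higher Vp); free gas lowers Vp and density,
# producing the strong negative reflection characteristic of a
# bottom-simulating reflector.
z_hydrate = acoustic_impedance(2200.0, 1900.0)
z_gas = acoustic_impedance(1400.0, 1850.0)
bsr = reflection_coefficient(z_hydrate, z_gas)  # negative reflection
```

    A catalogue of such modeled reflections, computed across porosity, mineralogy, pressure, and hydrate saturation, is what the abstract's "field guide" amounts to.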

  20. Micromachined silicon seismic transducers

    SciTech Connect

    Barron, C.C.; Fleming, J.G.; Sniegowski, J.J.; Armour, D.L.; Fleming, R.P.

    1995-08-01

    Batch-fabricated silicon seismic transducers could revolutionize the discipline of CTBT monitoring by providing inexpensive, easily deployable sensor arrays. Although our goal is to fabricate seismic sensors that provide the same performance level as the current state-of-the-art ``macro`` systems, if necessary one could deploy a larger number of these small sensors at closer proximity to the location being monitored in order to compensate for lower performance. We have chosen a modified pendulum design and are manufacturing prototypes in two different silicon micromachining fabrication technologies. The first set of prototypes, fabricated in our advanced surface-micromachining technology, are currently being packaged for testing in servo circuits -- we anticipate that these devices, which have masses in the 1--10 {mu}g range, will resolve sub-mG signals. Concurrently, we are developing a novel ``mold`` micromachining technology that promises to make proof masses in the 1--10 mg range possible -- our calculations indicate that devices made in this new technology will resolve down to at least sub-{mu}G signals, and may even approach the 10{sup {minus}10} G/{radical}Hz acceleration levels found in the low-earth-noise model.
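    The payoff from the heavier "mold"-technology proof masses follows from the Brownian (thermal) noise floor of a spring-mass sensor, which scales as 1/sqrt(mass). A sketch of that scaling is below; the resonance frequency, quality factor, and temperature are assumed values for illustration, not design parameters from the paper, and the thermal floor is only one contribution to overall resolution.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def brownian_noise_g(mass_kg, f0_hz, q, temp_k=300.0):
    """Thermal (Brownian) acceleration noise floor of a spring-mass
    sensor, in g/sqrt(Hz): sqrt(4*kB*T*w0 / (m*Q)) / g."""
    w0 = 2 * math.pi * f0_hz
    return math.sqrt(4 * K_B * temp_k * w0 / (mass_kg * q)) / 9.81

# Same (assumed) resonance and Q; masses spanning the two technologies
surface = brownian_noise_g(10e-9, 100.0, 50)  # ~10 ug surface-micromachined
molded = brownian_noise_g(10e-6, 100.0, 50)   # ~10 mg mold-technology mass

ratio = surface / molded  # noise improves as sqrt(mass ratio)
```

    A 1000x mass increase buys roughly a 32x (sqrt(1000)) reduction in the thermal floor under these assumptions; the rest of the claimed improvement would have to come from the sensing and servo electronics.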

  1. 26 CFR 301.6334-4 - Verified statements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ADMINISTRATION PROCEDURE AND ADMINISTRATION Seizure of Property for Collection of Taxes § 301.6334-4 Verified... dependent child; (2) The name, relationship, and Social Security Number of each individual whom the...

  2. 26 CFR 301.6334-4 - Verified statements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... ADMINISTRATION PROCEDURE AND ADMINISTRATION Seizure of Property for Collection of Taxes § 301.6334-4 Verified... dependent child; (2) The name, relationship, and Social Security Number of each individual whom the...

  3. Seismic evaluation of headframe and associated equipment

    SciTech Connect

    Not Available

    1982-11-01

    This report presents the results of studies on the seismic evaluation of the headframe structure and the associated equipment for the Canistered Waste Facility of a Nuclear Waste Storage Repository. The conceptual design for the repository was developed by Stearns-Roger Engineering Corporation. The studies described in this report were performed by Engineering Decision Analysis Company, Inc. (EDAC) for Battelle/Office of Nuclear Waste Isolation (ONWI). The evaluations included the following main tasks: Task I. Development of seismic input; Task II. Seismic evaluation of headframe; Task III. Seismic evaluation of the cable/hoist system; Task IV. Cask drop evaluation; Task V. Quality assurance analysis. Because some components in the system could not withstand the postulated seismic motions without failure or without exceeding specified factors of safety, it was recommended that complete and detailed seismic criteria should be developed for the seismic input motions and the design of the headframe structure and associated equipment, such as the cable/hoist system. The analyses performed in this study and the resulting understanding developed of the behavior of the headframe structure and the cable/hoist system will be extremely helpful in the development of such criteria. A description of the computer programs used in this study is presented in Appendix A.

  4. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste.
VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. We use Microsoft Excel 2003 and have not tested VISION with Microsoft Excel 2007. The VISION team uses both Powersim Studio 2005 and 2009 and it should work with either.
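    The stock-and-flow structure described above (fresh fuel to reactors, reactors to used-fuel storage, storage to separations or disposal) can be sketched as a toy mass-balance simulation. The rates, separation fraction, and stock names below are illustrative inventions, not VISION parameters, and this collapses VISION's many reactor and separation types into single stocks.

```python
def step_fuel_cycle(state, fab_rate, burn_rate, sep_rate, sep_fraction):
    """One time step of a toy stock-and-flow fuel cycle.
    All quantities are in arbitrary mass units; rates are per-step
    throughputs. Mass is conserved across the stocks."""
    s = dict(state)
    fab = min(fab_rate, s["fresh"])          # fuel fabrication
    s["fresh"] -= fab
    s["in_reactor"] += fab
    burned = min(burn_rate, s["in_reactor"])  # discharge to storage buffer
    s["in_reactor"] -= burned
    s["storage"] += burned
    pulled = min(sep_rate, s["storage"])      # pull used fuel to separations
    s["storage"] -= pulled
    s["recycled"] += pulled * sep_fraction      # recovered products
    s["waste"] += pulled * (1 - sep_fraction)   # partitioned wastes
    return s

state = {"fresh": 100.0, "in_reactor": 0.0, "storage": 0.0,
         "recycled": 0.0, "waste": 0.0}
for _ in range(10):
    state = step_fuel_cycle(state, fab_rate=10, burn_rate=8,
                            sep_rate=5, sep_fraction=0.9)

total = sum(state.values())  # conserved: still 100 mass units
```

    The dynamic (rather than steady-state) character the guide emphasizes shows up even here: the storage buffer grows while separations capacity lags the reactor discharge rate.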

  5. Salvo: Seismic imaging software for complex geologies

    SciTech Connect

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

    This report describes Salvo, a three-dimensional seismic-imaging software package for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and the data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. This report also describes I/O algorithms for large seismic data sets and images and parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, this report describes the required steps to compile, port, and optimize the Salvo software, and describes the validation data sets used to help verify a working copy of Salvo.
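    The core operation in a one-way (paraxial-style) imaging scheme like Salvo's is downward continuation of each frequency-wavenumber component of the recorded wavefield. A single-component phase-shift sketch is below; the velocity, frequency, and step size are arbitrary, and Salvo's actual finite-difference paraxial operators are considerably more elaborate than this constant-velocity phase shift.

```python
import cmath
import math

def phase_shift(p_hat, omega, kx, v, dz):
    """Downward-continue one (omega, kx) wavefield component by dz using
    the one-way dispersion relation kz = sqrt((omega/v)^2 - kx^2).
    Evanescent components (kx > omega/v) are exponentially damped."""
    kz = cmath.sqrt((omega / v) ** 2 - kx ** 2)  # imaginary if evanescent
    return p_hat * cmath.exp(1j * kz * dz)

omega = 2 * math.pi * 30.0  # a 30 Hz component, rad/s
v = 2500.0                  # constant medium velocity, m/s
p0 = 1.0 + 0.0j

propagating = phase_shift(p0, omega, kx=0.02, v=v, dz=10.0)  # |amp| stays 1
evanescent = phase_shift(p0, omega, kx=0.2, v=v, dz=10.0)    # decays
```

    Propagating energy is advanced with unit amplitude (a pure phase rotation), while evanescent energy decays with depth; `cmath.sqrt` returning the positive-imaginary branch for negative arguments is what makes the damping automatic.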

  6. Integration of onshore and offshore seismological data to study the seismicity of the Calabrian Region

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Guerra, Ignazio; D'Anna, Giuseppe; Gervasi, Anna; Harabaglia, Paolo; Luzio, Dario; Stellato, Gilda

    2014-05-01

    The Pollino Massif marks the transition from the Southern Apennines to the Calabrian Arc. On the western side it is characterized by moderately sized seismicity (about 9 M > 4 events in the last 50 years), well documented over the last 400 years. The moment tensor solutions available in this area yield mainly normal faults with a coherent Southern Apennine trend. This remains true also for the events localized on the Calabrian side of Pollino, south of the massif. In most of the Sibari Plain, seismic activity is very scarce, while it is again rather marked at its southeastern corner, both onshore and offshore. The above observations suggest that the stress field of a vast portion of Northern Calabria still resembles that of the Southern Apennines. In this frame, it becomes important to investigate the offshore seismicity of the Sibari Gulf and the deformation pattern within the Sibari Plain. The latter might function as a hinge that transfers the deformation of the extensional fault system in the Pollino area to a different offshore fault system. Since return times of larger events might be very long, we need to investigate the true seismic potential of the offshore faults and to verify whether they are truly strike-slip or whether they could involve relevant thrust or normal components, which would add the risk of potentially associated tsunamis. Despite their importance in understanding the seismotectonic processes taking place at the Southern Apennines - Calabrian Arc border and in surrounding areas, the seismicity and the seismogenic volumes of the Sibari Gulf have until now not been well characterized, due to the lack of offshore seismic stations. The seismicity of Calabria is monitored by the Italian National Seismic Network (INSN), managed by Istituto Nazionale di Geofisica e Vulcanologia, and by the Calabrian Regional Seismic Network (CRSN), managed by the University of Calabria. 
Both networks comprise only on-land seismic stations. The lack of offshore stations prevents accurate determination of hypocentral parameters even for moderate-to-strong earthquakes that occur offshore Calabria. With the aim of investigating the near-shore seismicity of the Sibari Gulf and its possible relationship with the Pollino activity, a project will start in early 2014 to improve the Calabrian Seismic Network's coverage of the Sibari Gulf area by deploying several Ocean Bottom Seismometers with Hydrophone (OBS/H). For this experiment, each OBS/H is equipped with a broad-band seismometer housed in a glass sphere designed to operate at depths of up to 6000 m and with a self-leveling sensor system. The OBS/Hs are also equipped with a hydrophone. Analog signals are recorded at a sampling frequency of 200 Hz by a four-channel 21-bit datalogger. In this work, we plan to present the preliminary results of the monitoring campaign, showing the improvement in hypocenter locations derived from the integration of the onshore and offshore seismic stations.
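    The payoff of adding offshore stations is better-constrained hypocenters. A toy grid-search epicenter location illustrates the principle; the station geometry (three on-land stations plus one OBS), the uniform velocity, and the 2-D grid are hypothetical simplifications, and real locations use 3-D velocity models and solve for depth as well.

```python
import math

def travel_time(ex, ey, sx, sy, v=6.0):
    """P travel time (s) for an epicentral distance in km at v km/s."""
    return math.hypot(ex - sx, ey - sy) / v

def locate(stations, arrivals, grid):
    """Grid-search epicenter minimizing demeaned travel-time residuals;
    demeaning removes the unknown origin time from the misfit."""
    best, best_misfit = None, float("inf")
    for gx, gy in grid:
        pred = [travel_time(gx, gy, sx, sy) for sx, sy in stations]
        res = [a - p for a, p in zip(arrivals, pred)]
        mean = sum(res) / len(res)
        misfit = sum((r - mean) ** 2 for r in res)
        if misfit < best_misfit:
            best, best_misfit = (gx, gy), misfit
    return best

# Hypothetical geometry, km: three on-land stations plus one offshore OBS
stations = [(0, 0), (20, 5), (5, 25), (40, -30)]
true = (30, -10)
arrivals = [travel_time(*true, sx, sy) for sx, sy in stations]

grid = [(x, y) for x in range(0, 51, 2) for y in range(-40, 21, 2)]
epicenter = locate(stations, arrivals, grid)  # recovers (30, -10)
```

    With only the three on-land stations, offshore epicenters sit outside the network and trade off strongly against origin time; the OBS closes the azimuthal gap, which is the motivation given in the abstract.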

  7. Seismic analysis of the large 70-meter antenna. Part 2: General dynamic response and a seismic safety check

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    An extensive dynamic analysis of the new JPL 70-meter antenna structure is presented. Analytical procedures are based on normal mode decomposition, including damping and special forcing functions. The dynamic response can be obtained for any arbitrarily selected point on the structure. A new computer program for computing the time-dependent resultant structural displacement, summing the effects of all participating modes, was also developed. Program compatibility with natural frequency analysis output was verified. The program was applied to the JPL 70-meter antenna structure, and the dynamic response for several specially selected points was computed. Seismic analysis of structures, a special application of the general dynamic analysis, is also based on normal modal decomposition. Strength specification of the antenna with respect to earthquake excitation is done using common response spectra. The results indicated a basically safe design under an assumed 5% or greater damping coefficient. However, for the antenna located at Goldstone, in a more active seismic environment, this study strongly recommends an experimental program to determine the true damping coefficient for a more reliable safety check.
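    Normal-mode decomposition starts from the undamped eigenproblem det(K - w^2 M) = 0. For a two-degree-of-freedom chain this reduces to a quadratic in w^2, which can be sketched directly; the system below is a generic textbook example, not the 70-meter antenna model.

```python
import math

def natural_frequencies_2dof(m1, m2, k1, k2):
    """Undamped natural frequencies (rad/s) of the 2-DOF chain
    ground -k1- m1 -k2- m2, from det(K - w^2 M) = 0, which expands to
    m1*m2*x^2 - (m1*k2 + m2*(k1 + k2))*x + k1*k2 = 0 with x = w^2."""
    a = m1 * m2
    b = -(m1 * k2 + m2 * (k1 + k2))
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    roots = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(r) for r in roots]

# Sanity check: unit masses and stiffnesses give the classic pair
# w^2 = (3 -+ sqrt(5))/2, i.e., w = 0.618... and 1.618...
w1, w2 = natural_frequencies_2dof(1.0, 1.0, 1.0, 1.0)
```

    The full analysis then sums the response of each such mode, weighted by its participation factor and the forcing (or, for the seismic check, by the response-spectrum ordinate at each modal frequency).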

  8. Verified Centers, Nonverified Centers or Other Facilities: A National Analysis of Burn Patient Treatment Location

    PubMed Central

    Zonies, David; Mack, Christopher; Kramer, Bradley; Rivara, Frederick; Klein, Matthew

    2009-01-01

Background Although comprehensive burn care requires significant resources, patients may be treated at verified burn centers, non-verified burn centers, or other facilities due to a variety of factors. The purpose of this study was to evaluate the association between patient and injury characteristics and treatment location using a national database. Study Design We performed an analysis of all burn patients admitted to United States hospitals participating in the Healthcare Cost and Utilization Project over 2 years. Univariate and multivariate analyses were performed to identify patient and injury factors associated with the likelihood of treatment at designated burn care facilities. Definitive care facilities were categorized as American Burn Association verified centers, non-verified burn centers, or other facilities. Results Over the two years, 29,971 burn patients were treated in 1,376 hospitals located in 19 participating states. A total of 6,712 (22%) patients were treated at verified centers, with 26% and 52% treated at non-verified or other facilities, respectively. Patients treated at verified centers were younger than those at non-verified or other facilities (33.1 years vs. 33.7 years vs. 41.9 years, p<0.001) and had a higher rate of inhalation injury (3.4% vs. 3.2% vs. 2.2%, p<0.001). Independent factors associated with treatment at verified centers include burns to the head/neck (RR 2.4, CI 2.1-2.7), hand (RR 1.8, CI 1.6-1.9), electrical injury (RR 1.4, CI 1.2-1.7), and fewer co-morbidities (RR 0.55, CI 0.5-0.6). Conclusions More than two-thirds of significantly burned patients are treated at non-verified burn centers in the U.S. Many patients meeting ABA criteria for transfer to a burn center are being treated at non-burn center facilities. PMID:20193892

  9. Development of Seismic Isolation Systems Using Periodic Materials

    SciTech Connect

    Yan, Yiqun; Mo, Yi-Lung; Menq, Farn-Yuh; Stokoe, II, Kenneth H.; Perkins, Judy; Tang, Yu

    2014-12-10

Advanced fast nuclear power plants and small modular fast reactors are composed of thin-walled structures such as pipes; as a result, they do not have sufficient inherent strength to resist seismic loads. Seismic isolation, therefore, is an effective solution for mitigating earthquake hazards for these types of structures. Base isolation, on which numerous studies have been conducted, is a well-established system for protecting structures against earthquakes. In conventional isolators, such as high-damping rubber bearings, lead-rubber bearings, and friction pendulum bearings, large relative displacements occur between upper structures and foundations, and isolation is provided only in the horizontal direction; these features are not desirable for piping systems. The concept of periodic materials, based on the theory of solid-state physics, can be applied to earthquake engineering. A periodic material possesses distinct characteristics that prevent waves within certain frequency ranges from being transmitted through it; such a material can therefore be used in structural foundations to block unwanted seismic waves at those frequencies. The frequency band that a periodic material filters out is called the band gap, and a structural foundation made of periodic material is referred to as a periodic foundation. The design of a nuclear power plant can therefore incorporate a periodic foundation as a desirable feature, and continuous maintenance of the structure is not needed. In this research project, three different types of periodic foundations were studied: one-dimensional, two-dimensional, and three-dimensional. The basic theories of periodic foundations are introduced first to find the band gaps; finite element methods are then used to perform parametric analysis and obtain attenuation zones; finally, experimental programs are conducted and the test data are analyzed to verify the theory. This procedure shows that the periodic foundation is a promising and effective way to mitigate structural damage caused by earthquake excitation.
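For the one-dimensional case, the band gap described above can be located with the classic two-layer (Rytov-type) dispersion relation. The sketch below is a minimal illustration with assumed layer properties (a concrete/rubber unit cell), not one of the project's tested designs.

```python
import math

def in_band_gap(freq_hz, cell):
    """Check whether a frequency lies in a band gap of a 1D periodic
    foundation made of a repeating two-layer unit cell.  The Bloch
    wavenumber q satisfies
        cos(q*L) = cos(k1*d1)*cos(k2*d2)
                   - 0.5*(z1/z2 + z2/z1)*sin(k1*d1)*sin(k2*d2),
    with ki = w/ci and impedances zi = rhoi*ci.  A propagating wave
    requires |RHS| <= 1; |RHS| > 1 means attenuation (band gap)."""
    (rho1, c1, d1), (rho2, c2, d2) = cell
    w = 2.0 * math.pi * freq_hz
    k1, k2 = w / c1, w / c2
    z1, z2 = rho1 * c1, rho2 * c2
    rhs = (math.cos(k1 * d1) * math.cos(k2 * d2)
           - 0.5 * (z1 / z2 + z2 / z1) * math.sin(k1 * d1) * math.sin(k2 * d2))
    return abs(rhs) > 1.0

# Assumed unit cell: 0.2 m of concrete (rho=2500 kg/m^3, c=3000 m/s)
# over 0.2 m of rubber (rho=1100 kg/m^3, c=100 m/s), shear waves.
cell = ((2500.0, 3000.0, 0.2), (1100.0, 100.0, 0.2))
print(in_band_gap(10.0, cell), in_band_gap(125.0, cell))
```

With this impedance contrast, low frequencies propagate while a gap opens near 100 Hz; an actual periodic foundation would tune layer thicknesses and moduli to push the gap down into the frequency band of strong ground motion.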

  10. Basis for seismic provisions of DOE-STD-1020

    SciTech Connect

    Kennedy, R.C.; Short, S.A.

    1994-04-01

DOE-STD-1020 provides a graded approach for the seismic design and evaluation of DOE structures, systems, and components (SSC). Each SSC is assigned to a Performance Category (PC) with a performance description and an approximate annual probability of seismic-induced unacceptable performance, P{sub F}. Seismic annual probability performance goals are specified for PC 1 through 4, for which specific seismic design and evaluation criteria are presented. DOE-STD-1020 also provides a seismic design and evaluation procedure applicable to any seismic performance goal annual probability of unacceptable performance specified by the user. The desired seismic performance goal is achieved by defining the seismic hazard in terms of a site-specific design/evaluation response spectrum (called herein the Design/Evaluation Basis Earthquake, DBE). Probabilistic seismic hazard estimates are used to establish the DBE. The resulting seismic hazard curves define the amplitude of the ground motion as a function of the annual probability of exceedance, P{sub H}, of the specified seismic hazard. Once the DBE is defined, the SSC is designed or evaluated for this DBE using adequately conservative deterministic acceptance criteria. To be adequately conservative, the acceptance criteria must introduce an additional reduction in the risk of unacceptable performance below the annual risk of exceeding the DBE. The ratio of the seismic hazard exceedance probability P{sub H} to the performance goal probability P{sub F} is defined herein as the risk reduction ratio. The required degree of conservatism in the deterministic acceptance criteria is a function of the specified risk reduction ratio.
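The bookkeeping between the performance goal, the risk reduction ratio, and the DBE can be sketched in a few lines. The power-law hazard curve and its anchor values below are assumptions for illustration only, not data from the standard.

```python
def hazard_pga(annual_prob_exceedance, a_ref=0.2, p_ref=1e-3, slope=3.0):
    """Illustrative power-law seismic hazard curve: peak ground
    acceleration (g) whose annual exceedance probability equals
    `annual_prob_exceedance`.  Anchored (by assumption) so that
    P_H = 1e-3/yr corresponds to 0.2 g; `slope` controls how fast
    ground motion grows as the probability drops."""
    return a_ref * (p_ref / annual_prob_exceedance) ** (1.0 / slope)

def dbe_for_goal(p_f, risk_reduction_ratio):
    """Given a performance goal P_F and the risk reduction ratio
    RR = P_H / P_F implied by the acceptance criteria, return the
    hazard exceedance probability P_H of the DBE and its ground motion."""
    p_h = p_f * risk_reduction_ratio
    return p_h, hazard_pga(p_h)

# Performance goal 1e-4/yr with RR = 10 -> design for the 1e-3/yr motion.
p_h, pga = dbe_for_goal(1e-4, 10.0)
print(round(p_h, 6), round(pga, 3))
```

The point of the sketch is the division of labor: the hazard curve sets P{sub H} to ground motion, while the risk reduction ratio carried by the deterministic acceptance criteria closes the gap from P{sub H} down to P{sub F}.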

  11. Angola Seismicity MAP

    NASA Astrophysics Data System (ADS)

    Neto, F. A. P.; Franca, G.

    2014-12-01

The purpose of this work was to study and document Angola's natural seismicity and to establish the country's first seismic database, facilitating consultation and searches for information on its seismic activity. The study was conducted based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work presented by Moreira (1968), which defined six seismogenic zones from macroseismic data; the most notable is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone. This is the most important seismic zone of Angola, covering the epicentral Quihita and Iona regions. It is geologically characterized by a transcontinental structure of Mesozoic tectono-magmatic activation, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic, and alkaline composition, kimberlites, and carbonatites, strongly marked by intense tectonism and presenting several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and during the Iona seismic activity of January 15, 1964, the main shock reached intensity VI-VII. The other five zones, although their seismicity rates are not significant, cannot be neglected: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; Gago Coutinho; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for the instrumental location of the epicenters. All the compiled information made possible the creation of the first seismic database for Angola and the preparation of the seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic zone, Porto Amboim, in the coastal portion of the sedimentary Kwanza basin.

  12. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are commonly used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. This paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate the expected damage states of buildings and constructions for the earthquakes given by the OSR-97B map (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites; the indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the extent of the oil pipeline systems located in the highly active seismic zones of the Russian Federation, corresponding procedures have been developed. They are based on mathematical modeling of the interaction between the system elements, the oil pipeline and the ground, under seismic loads. As a result, relationships between the probability of an oil pipeline system being damaged and the intensity of shaking in grades of seismic scales have been obtained. The following three damage states for oil pipeline systems have been considered: light damage, elastic deformation of the linear part or localized plastic deformation without breaching the pipeline; average damage, significant plastic deformation of the linear part with fistulas in some areas; complete destruction, large horizontal and vertical displacements of the linear part with mass fistulas, cracks, and "guillotine breaks" of the pipeline in some areas.
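Intensity-to-damage relationships of the kind described above are commonly expressed as fragility curves. A minimal sketch, assuming lognormal fragility with hypothetical median intensities and dispersion (not the calibrated values from the study):

```python
import math

def p_damage_at_least(intensity, median, beta):
    """Lognormal fragility: probability that damage reaches or exceeds
    a given state at macroseismic intensity `intensity`.  Uses the
    normal CDF via math.erf; `median` and `beta` are illustrative."""
    z = (math.log(intensity) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical median intensities (in intensity grades) for the three
# pipeline damage states described above; illustrative only.
states = {"light": 6.0, "average": 7.5, "complete": 9.0}
beta = 0.15

for name, median in states.items():
    print(name, round(p_damage_at_least(8.0, median, beta), 3))
```

At a given shaking intensity the lighter states are, as expected, more probable than the heavier ones; calibrating the medians and dispersion against observed damage statistics is precisely the empirical verification step the paper describes.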

  13. Development of material measures for performance verifying surface topography measuring instruments

    NASA Astrophysics Data System (ADS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-04-01

The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to performance-verify both tactile and optical areal surface topography measuring instruments. The manufacture of the material measures, using diamond turning followed by nickel electroforming, is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments.

  14. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of asking the cardholder for more information or taking on risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make genuine credit cards distinguishable from counterfeits. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging-based portable credit card verifier that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  15. The SCALE Verified, Archived Library of Inputs and Data - VALID

    SciTech Connect

    Marshall, William BJ J; Rearden, Bradley T

    2013-01-01

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. 
The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  16. Rock-physics and seismic-inversion based reservoir characterization of the Haynesville Shale

    NASA Astrophysics Data System (ADS)

    Jiang, Meijuan; Spikes, Kyle T.

    2016-06-01

Seismic reservoir characterization of unconventional gas shales is challenging due to their heterogeneity and anisotropy. Rock properties of unconventional gas shales such as porosity, pore-shape distribution, and composition are important for interpreting seismic data amplitude variations in order to locate optimal drilling locations. The presented seismic reservoir characterization procedure applied a grid-search algorithm to estimate the composition, pore-shape distribution, and porosity at the seismic scale from the seismically inverted impedances and a rock-physics model, using the Haynesville Shale as a case study. All the proposed rock properties affected the seismic velocities, and the combined effects of these rock properties on the seismic amplitude were investigated simultaneously. The P- and S-impedances correlated negatively with porosity, and VP/VS correlated positively with clay fraction and negatively with the pore-shape distribution and quartz fraction. The reliability of these estimated rock properties at the seismic scale was verified through comparisons between two sets of elastic properties: one coming from inverted impedances, which were obtained from simultaneous inversion of prestack seismic data, and one derived from these estimated rock properties. The differences between the two sets of elastic properties were less than a few percent, verifying the feasibility of the presented seismic reservoir characterization.
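The grid-search step can be sketched as follows. The linear forward model below is a hypothetical stand-in for the study's anisotropic rock-physics model, with made-up coefficients; only the search structure (scan a property grid, keep the least-misfit combination against the inverted impedances) reflects the described procedure.

```python
import itertools

def forward(porosity, clay):
    """Hypothetical linear proxy for a rock-physics model: both
    impedances decrease with porosity and clay fraction.  The
    coefficients are illustrative, not calibrated to the Haynesville."""
    ip = 12.0 - 18.0 * porosity - 3.0 * clay   # P-impedance proxy
    is_ = 7.5 - 12.0 * porosity - 2.5 * clay   # S-impedance proxy
    return ip, is_

def grid_search(ip_obs, is_obs):
    """Return the (porosity, clay) pair whose modeled impedances best
    match the seismically inverted ones (least-squares misfit)."""
    best, best_misfit = None, float("inf")
    for phi, clay in itertools.product(
            [i / 100.0 for i in range(0, 21)],      # porosity 0-20%
            [i / 100.0 for i in range(0, 51, 5)]):  # clay 0-50%
        ip, is_ = forward(phi, clay)
        misfit = (ip - ip_obs) ** 2 + (is_ - is_obs) ** 2
        if misfit < best_misfit:
            best, best_misfit = (phi, clay), misfit
    return best

# Synthetic "inverted" impedances generated from known properties.
ip_true, is_true = forward(0.08, 0.25)
print(grid_search(ip_true, is_true))
```

Because the proxy is linear and invertible, the search recovers the generating properties exactly; with a realistic rock-physics model the same loop trades analytic invertibility for flexibility.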

  17. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  18. Oklahoma seismic network. Final report

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr. |

    1993-07-01

    The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number of areas of the Midcontinent region because of the lack of instrumentation and (or) the sensitivity of the instruments deployed to monitor earthquake events. The Nemaha Uplift/Midcontinent Geophysical Anomaly is one of five principal areas east of the Rocky Mountain front that has a moderately high seismic-risk classification. The Nemaha uplift, which is common to the states of Oklahoma, Kansas, and Nebraska, is approximately 415 miles long and 12-14 miles wide. The Midcontinent Geophysical Anomaly extends southward from Minnesota across Iowa and the southeastern corner of Nebraska and probably terminates in central Kansas. A number of moderate-sized earthquakes--magnitude 5 or greater--have occurred along or west of the Nemaha uplift. The Oklahoma Geological Survey, in cooperation with the geological surveys of Kansas, Nebraska, and Iowa, conducted a 5-year investigation of the seismicity and tectonic relationships of the Nemaha uplift and associated geologic features in the Midcontinent. This investigation was intended to provide data to be used to design nuclear-power plants. However, the information is also being used to design better large-scale structures, such as dams and high-use buildings, and to provide the necessary data to evaluate earthquake-insurance rates in the Midcontinent.

  19. Seismic Imaging and Monitoring

    SciTech Connect

    Huang, Lianjie

    2012-07-09

I give an overview of LANL's capability in seismic imaging and monitoring. I present some seismic imaging and monitoring results, including imaging of complex structures, subsalt imaging of the Gulf of Mexico, fault/fracture zone imaging for geothermal exploration at the Jemez pueblo, time-lapse imaging of walkway vertical seismic profiling data for monitoring CO{sub 2} injection at SACROC, and microseismic event locations for monitoring CO{sub 2} injection at Aneth. These examples demonstrate LANL's high-resolution and high-fidelity seismic imaging and monitoring capabilities.

  20. Verifying Stiffness Parameters Of Filament-Wound Cylinders

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Rheinfurth, M.

    1994-01-01

    Predicted engineering stiffness parameters of filament-wound composite-material cylinders verified with respect to experimental data, by use of equations developed straightforwardly from applicable formulation of Hooke's law. Equations derived in engineering study of filament-wound rocket-motor cases, also applicable to other cylindrical pressure vessels made of orthotropic materials.

  1. Verifying continuous-variable entanglement in finite spaces

    SciTech Connect

    Sperling, J.; Vogel, W.

    2009-05-15

    Starting from arbitrary Hilbert spaces, we reduce the problem to verify entanglement of any bipartite quantum state to finite-dimensional subspaces. Entanglement can be fully characterized as a finite-dimensional property, even though in general the truncation of the Hilbert space may cause fake nonclassicality. A generalization for multipartite quantum states is also given.

  2. Elements of a system for verifying a Comprehensive Test Ban

    SciTech Connect

    Hannon, W.J.

    1987-03-06

The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military, and political efforts are required to establish and verify test ban treaties that will contribute to stability in the long term. It currently appears that there will be a significant number of unidentified events. (ACR)

  3. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  4. A Trustworthy Internet Auction Model with Verifiable Fairness.

    ERIC Educational Resources Information Center

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  5. Seismic Catalogue and Seismic Network in Haiti

    NASA Astrophysics Data System (ADS)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

The destructive earthquake that struck Haiti on January 12, 2010 highlighted the country's lack of preparedness to address seismic phenomena. At the moment of the earthquake, there was no seismic network operating in the country, and only partial knowledge of past seismicity was possible, due to the absence of a national catalogue. After the 2010 earthquake, advances began toward the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out on both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events that occurred in Hispaniola and its surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with the Dominican Republic and Puerto Rico seismological services playing a relevant role by providing local data from their national networks. Almost 30,000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7,700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were reported by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by considerable heterogeneity in the size parameter. It was then homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al. (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government, with data sent by telemetry through the Canadian system CARINA. In 2012, the Spanish IGN, together with ONEV and BME, installed 4 seismic stations with financial support from the Inter-American Development Bank and the Haitian Government. The 4 stations include strong-motion and broad-band sensors, complementing the 8 sensors initially installed. The stations communicate via SATMEX5 with the Canadian hub, which sends the data back to Haiti with minimum delay. In the immediate future, data transfer will be improved with the installation of a main antenna for data reception and the Seismic Warning Center of Port-au-Prince. Bidirectional satellite communication is considered of fundamental importance for robust real-time data transmission that is not affected in the case of a catastrophic event.
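The homogenization step can be sketched with generic linear magnitude conversions. The coefficients below are illustrative values in the spirit of widely used global relations (e.g. Scordilis-type Ms/mb conversions), NOT the Bonzoni et al. (2011) equations actually applied to the SISMO-HAITI catalogue, and the sample events are invented.

```python
def to_mw(magnitude, scale):
    """Homogenize a catalogue magnitude to moment magnitude Mw using a
    linear conversion Mw = slope*M + intercept for the given scale.
    Coefficients are illustrative placeholders, valid only over
    limited magnitude ranges in the relations they imitate."""
    conversions = {
        "Mw": (1.00, 0.00),   # already moment magnitude
        "Ms": (0.67, 2.07),   # surface-wave magnitude (approx. Ms 3.0-6.1)
        "mb": (0.85, 1.03),   # body-wave magnitude (approx. mb 3.5-6.2)
    }
    slope, intercept = conversions[scale]
    return slope * magnitude + intercept

# Hypothetical heterogeneous catalogue entries: (event id, magnitude, scale).
catalogue = [("evt1", 5.4, "Ms"), ("evt2", 7.0, "Mw"), ("evt3", 5.0, "mb")]
homogenized = [(evt, round(to_mw(m, s), 2)) for evt, m, s in catalogue]
print(homogenized)
```

The essential point is that every event ends up on a single size scale (Mw), which is what makes completeness analysis and hazard computation meaningful.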

  6. An economical educational seismic system

    USGS Publications Warehouse

    Lehman, J. D.

    1980-01-01

There is considerable interest in seismology from the nonprofessional or amateur standpoint. The operation of a seismic system can be satisfying and educational, especially when you have built and operated the system yourself. A long-period, indoor-type sensor and recording system that works extremely well has been developed in the James Madison University Physics Department. The system can be built quite economically, and any educational institution that cannot commit to a professional installation need not be without first-hand seismic information. The system design approach has been adopted by college students working on a project or senior thesis, several elementary and secondary science teachers, and the more ambitious tinkerer or hobbyist at home.

  7. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    U.S. Geological Survey

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  8. Improvement of broadband seismic station installations at the Observatoire de Grenoble (OSUG) seismic network

    NASA Astrophysics Data System (ADS)

    Langlais, M.; Vial, B.; Coutant, O.

    2013-04-01

We describe in this paper the improvements that were made to the installation of seismic broadband stations deployed by the Observatoire de Grenoble (OSUG) in the northern French Alps. This work was realized in the frame of a French-Italian ALCOTRA project (RISE) aimed at modernizing the broadband seismic networks across our common border. The project gave us the opportunity to improve some of our seismic recording sites, both in terms of sensor installation quality and of reliability. We detail in particular the thermal and barometric protection system that we designed and show its effect on the reduction of long-period noise above 20 s.

  9. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e., near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough in the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent properties of magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of a smart seismic isolation system. The novel isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid that provides a sufficient and uniform magnetic field for energizing the field-dependent properties of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the developed adaptive seismic isolator possesses controllable lateral stiffness while maintaining adequate vertical load capacity. This paper presents a comprehensive review of the development of the adaptive seismic isolator, including the design, analysis, and testing of two prototype adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototype MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second prototype provides a remarkable increase in lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the seismic isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive control.
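The practical meaning of a controllable stiffness range can be seen through the isolation frequency f = sqrt(k/m)/(2*pi): shifting k shifts the natural frequency the isolator presents to the ground motion. A minimal sketch with an assumed mass and baseline stiffness (illustrative numbers, not the prototypes' actual parameters), using the two stiffness increases reported above:

```python
import math

def isolation_frequency(k_n_per_m, mass_kg):
    """Natural frequency (Hz) of a mass on an isolator of lateral
    stiffness k: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k_n_per_m / mass_kg) / (2.0 * math.pi)

# Assumed: a 10-tonne mass on a baseline 1e5 N/m isolator.
m, k0 = 10_000.0, 1.0e5
f_base = isolation_frequency(k0, m)

# Field-on stiffness increases reported for the two prototypes:
# +37.49% and +1630%.  Frequency scales with sqrt of the stiffness ratio.
for increase in (0.3749, 16.30):
    f_on = isolation_frequency(k0 * (1.0 + increase), m)
    print(round(f_on / f_base, 2))
```

The second prototype's 1630% stiffness increase thus shifts the isolation frequency by roughly a factor of four, which is what lets a controller detune the system away from the dominant frequencies of an incoming near-fault or far-fault motion.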

  10. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-01

We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends each of their qubits to Alice one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding security, whatever Bob does, he cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, a malicious Bob does not necessarily send copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
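The stabilizer property that Alice's test relies on can be illustrated numerically: a graph state |G> is stabilized by K_a = X_a (product of Z_b over neighbors b of a) for every vertex a. A small dense-vector sketch for a 3-qubit linear graph (a classical simulation for illustration, obviously not how the protocol runs on hardware):

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.eye(1)
    for op in ops:
        out = np.kron(out, op)
    return out

def graph_state(n, edges):
    """|G> = (product of CZ over edges) |+>^n, built directly in the
    computational basis: amplitude of |x> is +-1/sqrt(2^n), the sign
    given by the parity of edges whose both endpoints are 1."""
    psi = np.ones(2 ** n) / np.sqrt(2 ** n)
    for idx in range(2 ** n):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        for a, b in edges:
            if bits[a] and bits[b]:
                psi[idx] = -psi[idx]
    return psi

def stabilizer(n, edges, a):
    """K_a = X_a * product of Z_b over neighbors b of vertex a."""
    neighbors = {b for e in edges for b in e if a in e and b != a}
    ops = [X if q == a else (Z if q in neighbors else I2) for q in range(n)]
    return kron_all(ops)

# 3-qubit linear graph state with edges 0-1 and 1-2.
n, edges = 3, [(0, 1), (1, 2)]
psi = graph_state(n, edges)
for a in range(n):
    print(a, np.allclose(stabilizer(n, edges, a) @ psi, psi))
```

Each K_a leaves the correct graph state invariant, so measuring these operators on randomly chosen copies and checking for +1 outcomes catches a server that sends the wrong state.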

  11. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).

  12. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  13. Seismic isolation of two dimensional periodic foundations

    SciTech Connect

    Yan, Y.; Mo, Y. L.; Laskar, A.; Cheng, Z.; Shi, Z.; Menq, F.; Tang, Y.

    2014-07-28

    Phononic crystals are now used to control acoustic waves. When the crystal is scaled up, it is called a periodic structure. The band gaps of the periodic structure can be brought down to the range from 0.5 Hz to 50 Hz. Therefore, the periodic structure has potential applications in reflecting seismic waves. In civil engineering, the periodic structure can serve as the foundation of an upper structure; this type of foundation is called a periodic foundation. When the frequency of seismic waves falls into the band gaps of the periodic foundation, the seismic waves can be blocked. Field experiments on a scaled two-dimensional (2D) periodic foundation with an upper structure were conducted to verify the band gap effects. Test results showed that the 2D periodic foundation can effectively reduce the response of the upper structure for excitations with frequencies within the band gaps. The experimental and finite element analysis results agree well with each other, indicating that the 2D periodic foundation is a feasible way of reducing seismic vibrations.
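
    For a one-dimensional idealization, the band gaps of a two-layer periodic foundation can be located from the classical layered-medium dispersion relation; frequencies where the right-hand side exceeds 1 in magnitude admit no real Bloch wavenumber and are blocked. A sketch with illustrative rubber/concrete properties, not the tested foundation's actual values:

```python
import numpy as np

def dispersion_rhs(f, rho1, c1, d1, rho2, c2, d2):
    """Right-hand side of the 1D two-layer dispersion relation
    cos(q(d1+d2)) = cos(k1 d1)cos(k2 d2)
                    - 0.5*(Z1/Z2 + Z2/Z1)*sin(k1 d1)*sin(k2 d2),
    with ki = 2*pi*f/ci and impedances Zi = rhoi*ci.
    |rhs| > 1 means frequency f lies inside a band gap."""
    w = 2.0 * np.pi * f
    k1, k2 = w / c1, w / c2
    Z1, Z2 = rho1 * c1, rho2 * c2
    return (np.cos(k1 * d1) * np.cos(k2 * d2)
            - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))

# Illustrative unit cell: 0.2 m of soft rubber over 0.2 m of concrete.
f = np.linspace(0.5, 50.0, 1000)
rhs = dispersion_rhs(f, rho1=1300.0, c1=30.0, d1=0.2,
                     rho2=2400.0, c2=2000.0, d2=0.2)
gap = np.abs(rhs) > 1.0
print(f"first gap starts near {f[gap][0]:.1f} Hz")
```

    With a soft, low-impedance layer paired against a stiff one, the first gap appears within the 0.5-50 Hz window of engineering interest, which is the design target the abstract describes.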

  14. The Spatial Scale of Detected Seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Chen, C.-C.

    2016-01-01

    An experimental method for the spatial resolution analysis of the earthquake frequency-magnitude distribution is introduced in order to identify the intrinsic spatial scale of the detected seismicity phenomenon. We consider the unbounded magnitude range m ∈ (-∞, +∞), which includes incomplete data below the completeness magnitude mc. By analyzing a relocated earthquake catalog of Taiwan, we find that the detected seismicity phenomenon is scale-variant for m ∈ (-∞, +∞), with its spatial grain a function of the configuration of the seismic network, while seismicity is known to be scale-invariant for m ∈ [mc, +∞). Correcting for data incompleteness at m < mc, based on knowledge of the spatial scale of the process, allows extending the analysis of the Gutenberg-Richter law and of the fractal dimension to lower magnitudes. This should allow verifying whether these parameters remain universal over a wider magnitude range. Our results also suggest that the commonly accepted Gaussian model of earthquake detection might be an artifact of observation.
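
    Extending the Gutenberg-Richter analysis below mc hinges on first estimating the b-value above it. A minimal sketch of the standard Aki (1965) maximum-likelihood estimator, run on a synthetic catalog rather than the Taiwan data:

```python
import math
import random

def b_value_ml(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for a complete catalog:
    b = log10(e) / (mean(m) - mc), using events with m >= mc.
    (For magnitudes binned at width dm, replace mc by mc - dm/2.)"""
    mags = [m for m in magnitudes if m >= mc]
    return math.log10(math.e) / (sum(mags) / len(mags) - mc)

# Synthetic Gutenberg-Richter catalog with a true b-value of 1.0:
# magnitudes above mc are exponentially distributed with rate b*ln(10).
rng = random.Random(42)
mc, b_true = 2.0, 1.0
catalog = [mc + rng.expovariate(b_true * math.log(10)) for _ in range(20000)]
print(f"estimated b-value: {b_value_ml(catalog, mc):.3f}")
```

    The estimator is unbiased only above the completeness magnitude, which is exactly why the abstract's incompleteness correction is needed before pushing the analysis to lower magnitudes.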

  15. Statistical classification methods applied to seismic discrimination

    SciTech Connect

    Ryan, F.M.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within ~2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
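
    As a concrete instance of one of the methods surveyed, quadratic discriminant analysis amounts to fitting a Gaussian per class and assigning an event to the class with the higher log-likelihood. A self-contained sketch on synthetic two-feature data; the feature interpretation and all numbers are illustrative, not taken from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data standing in for discrimination features
# (e.g. a P/S amplitude ratio and a spectral ratio); values are invented.
n = 500
earthquakes = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], n)
explosions  = rng.multivariate_normal([2.0, 1.5], [[0.5, 0.0], [0.0, 0.5]], n)

def fit_gaussian(X):
    """Per-class maximum-likelihood Gaussian fit."""
    return X.mean(axis=0), np.cov(X, rowvar=False)

def log_likelihood(x, mu, cov):
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d + logdet)

params = [fit_gaussian(earthquakes), fit_gaussian(explosions)]

def classify(x):
    """Quadratic discriminant: pick the class whose fitted Gaussian
    gives the higher log-likelihood (equal priors assumed)."""
    scores = [log_likelihood(x, mu, cov) for mu, cov in params]
    return int(np.argmax(scores))   # 0 = earthquake, 1 = explosion

X = np.vstack([earthquakes, explosions])
y = np.array([0] * n + [1] * n)
acc = np.mean([classify(x) == yi for x, yi in zip(X, y)])
print(f"training accuracy: {acc:.3f}")
```

    Linear discriminant analysis is the special case where both classes share one pooled covariance, which is the main trade-off between the first two methods listed in Section 4.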

  16. Real-Time Projection to Verify Plan Success During Execution

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occur during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
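
    The essence of the capability, stripped of the Mission Data System machinery, is a projection loop that re-checks every remaining goal against the projected state each time the state estimate changes. A toy sketch for the battery example; the step names, costs and safe floor are hypothetical:

```python
def first_projected_failure(battery_wh, remaining_steps, floor_wh=10.0):
    """Project the battery state across the remaining plan steps and
    return the index of the first step whose projected end state
    violates the safe-floor goal, or None if the plan is achievable."""
    level = battery_wh
    for i, (name, cost_wh) in enumerate(remaining_steps):
        level -= cost_wh
        if level < floor_wh:
            return i   # early warning: this future goal will fail
    return None

plan = [("drive to waypoint", 40.0), ("science observation", 20.0),
        ("drive to lander", 50.0)]

# Re-verification after an unanticipated extra draw: the plan that was
# achievable at 150 Wh is projected to fail at step 2 with only 100 Wh left.
print(first_projected_failure(150.0, plan))  # None -> plan achievable
print(first_projected_failure(100.0, plan))  # 2 -> final drive would breach the floor
```

    Running this projection on every state update is what turns after-the-fact goal-violation detection into the early warning the abstract describes.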

  17. Verifying non-Abelian statistics by numerical braiding Majorana fermions

    NASA Astrophysics Data System (ADS)

    Cheng, Qiu-Bo; He, Jing; Kou, Su-Peng

    2016-02-01

    Recently, Majorana fermions have attracted intensive attention because of their possible non-Abelian statistics and potential applications in topological quantum computation. This paper describes an approach to verify the non-Abelian statistics of Majorana fermions in topological superconductors. From the relationship between the braiding operator of Majorana fermions and that of Bogoliubov-de Gennes states, we determine that Majorana fermions in one-dimensional and two-dimensional topological superconductors both obey non-Abelian statistics.

  18. Nuclear archaeology: Verifying declarations of fissile-material production

    SciTech Connect

    Fetter, S. )

    1993-01-01

    Controlling the production of fissile material is an essential element of nonproliferation policy. Similarly, accounting for the past production of fissile material should be an important component of nuclear disarmament. This paper describes two promising techniques that make use of physical evidence at reactors and enrichment facilities to verify the past production of plutonium and highly enriched uranium. In the first technique, the concentrations of long-lived radionuclides in permanent components of the reactor core are used to estimate the neutron fluence in various regions of the reactor, and thereby verify declarations of plutonium production in the reactor. In the second technique, the ratio of the concentration of U-235 to that of U-234 in the tails is used to determine whether a given container of tails was used in the production of low-enriched uranium, which is suitable for reactor fuel, or highly enriched uranium, which can be used in nuclear weapons. Both techniques belong to the new field of "nuclear archaeology," in which the authors attempt to document past nuclear weapons activities and thereby lay a firm foundation for verifiable nuclear disarmament. 11 refs., 1 fig., 3 tabs.

  19. Robustness and device independence of verifiable blind quantum computing

    NASA Astrophysics Data System (ADS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-08-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456-60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant.

  20. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.
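
    The conflict described above is visible in the transmissibility of even a linear single-degree-of-freedom isolator: a mount soft enough to reject high-frequency ambient noise necessarily amplifies base motion near its own low resonance, which is where earthquake energy is concentrated. A sketch with illustrative numbers, not the HVEM design values:

```python
import math

def transmissibility(f, fn, zeta):
    """Absolute transmissibility of a linear single-DOF isolator:
    ratio of mass motion to base motion at excitation frequency f,
    for natural frequency fn and damping ratio zeta."""
    r = f / fn
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

# Soft mount (fn = 1 Hz, 5% damping): strongly attenuates 30 Hz ambient noise...
print(transmissibility(30.0, 1.0, 0.05))   # much less than 1
# ...but amplifies base motion near its own 1 Hz resonance,
# where earthquake ground motion has significant energy.
print(transmissibility(1.0, 1.0, 0.05))    # roughly 10
```

    Resolving this trade-off is what motivates the two-stage isolation scheme analyzed in the paper.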

  1. Operations plan for the Regional Seismic Test Network

    SciTech Connect

    Not Available

    1981-05-15

    The Regional Seismic Test Network program was established to provide a capability for detecting extremely small earth movements. Seismic signals from both natural and man-made earth motions will be analyzed with the ultimate objective of accurately locating underground nuclear explosions. The Sandia National Laboratories, Albuquerque, has designed an unattended seismic station capable of recording seismic information received at the location of the seismometers installed as part of that specific station. A network of stations is required to increase the capability of determining the source of the seismic signal and the location of the source. Current plans are to establish a five-station seismic network in the United States and Canada. The Department of Energy, Nevada Operations Office, has been assigned the responsibility for deploying, installing, and operating these remote stations. This Operations Plan provides the basic information and tasking to accomplish this assignment.

  2. Seismic Safety Of Simple Masonry Buildings

    SciTech Connect

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled; it is assumed that their fulfilment ensures suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Obviously, a wide percentage of the buildings deemed simple by codes should also satisfy the numerical safety verification, so that designers who must use the codes are not left with confusion and uncertainty. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings of different geometry are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications on the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the results obtained can contribute to improving the seismic code requirements.

  3. The Kyrgyz Seismic Network (KNET)

    NASA Astrophysics Data System (ADS)

    Bragin, V. D.; Willemann, R. J.; Matix, A. I.; Dudinskih, R. R.; Vernon, F.; Offield, G.

    2007-05-01

    The Kyrgyz Digital Seismic Network (KNET) is a regional continuous telemetric network of very broadband seismic data. KNET was installed in 1991, and the telemetry system was upgraded in 1998. The seismograms are transmitted in near real time. KNET is located along part of the boundary between the northern Tien Shan Mountains and the Kazakh platform. Several major tectonic features are spanned by the network, including a series of thrust faults in the Tien Shan, the Chu Valley, and the NW-SE trending ridges north of Bishkek. The network is designed to monitor regional seismic activity at the magnitude 3.5+ level as well as to provide high-quality data for research projects in regional and global broadband seismology. The array consists of 10 stations (3 of them at altitudes above 3600 m), 2 mountain repeaters, 1 intermediate database and 2 data centers; one of the data centers serves as a remote source for the IRIS database. KNET is operated by the International Research Center - Geodynamic Proving Ground in Bishkek (IGRC) with the participation of the Research Station of the Russian Academy of Sciences (RS RAS) and the Kyrgyz Institute of Seismology (KIS). The network consists of Streckeisen STS-2 sensors with 24-bit PASSCAL data loggers. All continuous real-time data are accessible through the IRIS DMC in Seattle with over 95% data availability, which compares favorably with the best networks currently operating worldwide. The national institutes of seismology of Kyrgyzstan and Kazakhstan, the National Nuclear Centre of Kazakhstan, RS RAS, divisions of the ministries for emergency situations and institutes of the Russian Academy of Sciences use KNET data for estimating seismic hazards and for studying the deep structure of the region. KNET data are also used by the National Nuclear Centre of the Republic of Kazakhstan, which, together with the Lamont laboratory (USA), carries out verification research and monitoring of nuclear detonations in China, India and Pakistan. A uniform digital catalogue of Central Asia, which will include data from the Kyrgyzstan, Kazakhstan, Uzbekistan and KNET seismic networks, is being developed. Chinese scientists have expressed interest in using KNET data, and also in linking a digital network located on the Tarim platform with KNET.

  4. Seismic amplitude processing and inversion

    NASA Astrophysics Data System (ADS)

    Dev, Ashwani

    2008-10-01

    Hydrocarbon exploration requires reliable seismic amplitudes to identify oil and gas reservoirs. Erroneous seismic amplitude processing can potentially generate large economic losses; correct amplitude processing is a prerequisite for any amplitude-dependent analysis. The accuracy of the subsurface image and the estimation of the elastic properties of subsurface sediments depend upon the reliability of the amplitudes. Geophone groups are wavenumber filters that change the seismic amplitudes because of a wavenumber-dependent information loss. Numerically defined filters deconvolve the recording group response from horizontal- and vertical-component seismic data recorded with groups of uniform and non-uniform geophone sensitivity, different group lengths and spacings, and noise. The filtering effect of an array increases as the group length increases, and only the wavenumber range defined by the group interval can be correctly compensated for the group effect. A rigorous, explicit spatial antialias filter is designed and applied by removing the energy above the first Nyquist wavenumber in the horizontal slowness-frequency domain. The filter removes the spatially aliased frequencies selectively at each slowness. The aliased energy is dispersive and present at both small and large horizontal slownesses. The filter can be applied explicitly to regularly or irregularly spaced traces and is independent of any event-linearity assumption. An integrative interpretation approach defines the effect of the structural setting on gas hydrate and free-gas accumulation at a site in the East Casey fault zone in the Gulf of Mexico. At a well location, hydrates are interpreted as fracture fillings with a maximum saturation of ~30% of the available pore space. Two low acoustic impedance (Ip) free-gas features terminating at the bottom simulating reflector (BSR) are interpreted from the 3D seismic data and the derived Ip volumes. The 2D Ip profile shows a contrast in BSR strength across the fault with limited lateral extent, and low-impedance free-gas features along the fault suggest that the fault is an active conduit for gas transport. Well logs and the 2D and 3D Ip data suggest that the most readily available pore spaces determine the host formations for hydrate deposition, and that the hydrate and free-gas distributions depend largely on the faults within the localized depositional setting.
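
    The group response being deconvolved is, for a uniform linear group, the classical periodic-sinc wavenumber filter. A sketch of that response and of a regularized inverse filter; the element count and spacing are illustrative, not the survey's parameters:

```python
import numpy as np

def group_response(k, n_phones, spacing):
    """Amplitude response of a uniform linear geophone group:
    |sin(N*pi*k*d) / (N*sin(pi*k*d))| for wavenumber k in cycles/m,
    N elements at spacing d. Equals 1 at k = 0 (no filtering at DC)."""
    k = np.asarray(k, dtype=float)
    num = np.sin(n_phones * np.pi * k * spacing)
    den = n_phones * np.sin(np.pi * k * spacing)
    out = np.ones_like(k)           # limit value where the denominator vanishes
    nz = np.abs(den) > 1e-12
    out[nz] = np.abs(num[nz] / den[nz])
    return out

# Hypothetical 12-phone group at 5 m spacing: the response notches
# wavenumbers near multiples of 1/(N*d) = 1/60 cycles/m.
k = np.linspace(0.0, 0.02, 201)
H = group_response(k, 12, 5.0)

# Deconvolving the group effect amounts to applying 1/H, regularized so
# the deep notches (where the information is lost) are not blown up.
inverse_filter = np.where(H > 0.1, 1.0 / H, 0.0)
```

    The regularization threshold makes explicit the abstract's point that only the wavenumber range defined by the group interval can be correctly compensated; inside the notches the data are gone.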

  5. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... appropriate model code, in which case the local code shall be utilized as the standard; or (2) The locality... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC... Congress (SBCC) Standard Building Code (SBC). (b) The seismic design and construction of a covered...

  6. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Component Seismic Safety Coordinator shall ensure that an individual familiar with seismic design provisions... exist, the appropriate Component Head shall ensure completion of one of the following: (1) For a new... provide a statement certifying compliance with the Department standards. The Component Head shall...

  7. Sensor-based warranty system for improving seismic performance of building structures

    NASA Astrophysics Data System (ADS)

    Miyamoto, Ryu; Mita, Akira

    2008-03-01

    This paper proposes a warranty system based on a seismic performance agreement and investigates its feasibility. Specifically, we focus on making clear to building users the relationship between seismic force and seismic damage or loss, and propose a warranty agreement in which accountability for seismic loss is defined in terms of ground motion parameters obtained by a seismic sensor. This study uses the Japan Meteorological Agency seismic intensity scale (I-jma) because of its general acceptance and recognition. A portfolio of buildings in 10 suburbs of the Kanto region is chosen for seismic portfolio analysis. The following conclusions were derived: (1) for a portfolio of 10 base-isolated buildings, the builder's expected seismic loss was found to be approximately 0.01%; (2) regarding the feasibility of risk finance by seismic derivatives, this study found that it is possible to transfer most of the builder's risk through a 0.01% premium rate; (3) builder risk reduction was verified by the use of a seismometer; and (4) a new warranty agreement providing seismic loss insurance for users was proposed, which can reduce the premium to 1/15 of that of current seismic insurance schemes.
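
    The builder's expected loss in such an agreement is, at its simplest, the probability-weighted sum of warranted losses over intensity bands. A toy sketch; the exceedance probabilities and loss ratios below are invented for illustration and are not the paper's figures:

```python
# Hypothetical annual occurrence probabilities for intensity bands and
# the loss ratio (fraction of building value) warranted in each band.
bands = [
    (1e-2, 0.000),   # lower intensity: no warranted loss
    (2e-3, 0.002),   # moderate intensity: minor repairs
    (4e-4, 0.020),   # high intensity: moderate damage
]

def expected_annual_loss(bands):
    """Expected annual loss ratio = sum over bands of p_i * loss_i."""
    return sum(p * loss for p, loss in bands)

eal = expected_annual_loss(bands)
print(f"expected annual loss: {eal:.6%} of building value")
```

    A premium rate covering this expected loss (plus a risk margin) is what the seismic derivative in the paper transfers away from the builder.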

  8. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya is a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974 and the recent 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach for the purposes of identifying spatial differences in seismicity, which can be utilised to form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data. The earthquake catalogue covers the time span from 1930 to 2007 and an area from 23.00° to 39.00°N and 59.00° to 80.00°E. A threshold magnitude of 5.2 is considered for the K-means cluster analysis. The current study uses the traditional metrics of cluster quality, in addition to a seismic hazard contextual metric, to attempt to constrain the preferred number of clusters found in the data. The spatial distribution of earthquakes from the catalogue was used to define the seismic clusters for Pakistan, which can be used further in the process of defining seismogenic sources and corresponding earthquake recurrence models for estimates of seismic hazard and risk in Pakistan. Consideration of the different approaches to cluster validation in a seismic hazard context suggests that Pakistan may be divided into K = 19 seismic clusters, including some portions of the neighbouring countries of Afghanistan, Tajikistan and India.
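
    The clustering step itself is standard Lloyd's K-means on epicenter coordinates. A self-contained sketch on synthetic epicenters; the three cluster centres below are invented, whereas the study settles on K = 19 over the real catalogue:

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: assign each epicenter to its nearest
    centroid, then move each centroid to the mean of its members."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = points[labels == j]
            if len(members):           # keep the old centroid if a cluster empties
                centroids[j] = members.mean(axis=0)
    return labels, centroids

# Synthetic epicenters (lon, lat) scattered around three invented centres.
rng = np.random.default_rng(1)
centres = [(67.0, 30.0), (73.5, 34.5), (61.5, 25.0)]
pts = np.vstack([rng.normal(c, scale=0.5, size=(100, 2)) for c in centres])
labels, cents = kmeans(pts, k=3)
```

    In practice the preferred K is chosen by re-running this for a range of K and scoring each partition with cluster-quality metrics, which is the validation question the study addresses in its hazard context.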

  9. Seismic station, USGS Northern California Seismic Network

    Traditional seismic stations such as this one require a source of power (solar here), a poured concrete foundation and several square feet of space. They are not always practical to install in urban areas, and that's where NetQuakes comes in....

  10. Verifying a Simplified Fuel Oil Field Measurement Protocol

    SciTech Connect

    Henderson, Hugh; Dentz, Jordan; Doty, Chris

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  11. Verifiable Quantum ( k, n)-threshold Secret Key Sharing

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Teng, Yi-Wei; Chai, Hai-Ping; Wen, Qiao-Yan

    2011-03-01

    Based on the Lagrange interpolation formula and a post-verification mechanism, we show how to construct a verifiable quantum (k, n) threshold secret key sharing scheme. Compared with previous secret sharing protocols, ours has two merits: (i) it can resist fraud by a dealer who generates and distributes fake shares among the participants during the secret distribution phase; and, most importantly, (ii) it can detect the cheating of a dishonest participant who provides a false share during the secret reconstruction phase, which would otherwise prevent the authorized group from recovering the correct secret.
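
    The classical backbone of the scheme, Lagrange-interpolation (k, n) threshold sharing over a prime field, can be sketched as follows. This is plain Shamir sharing, not the quantum protocol itself, and the prime and secret are arbitrary:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is mod P

def split(secret, k, n, rng=random.Random(0)):
    """Shamir (k, n) sharing: hide the secret in the constant term of a
    random degree-(k-1) polynomial and hand out n point evaluations."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares suffice
```

    A single false share silently corrupts the interpolated value, which is exactly the cheating scenario the paper's post-verification mechanism is designed to detect.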

  12. A verified case of recovered memories of sexual abuse.

    PubMed

    Bull, D L

    1999-01-01

    A case is presented that shows verifiable evidence of repression at work. Rachel, a 40-year-old woman with no history of mental illness and ten years of exemplary professional work, recovers memories of childhood sexual abuse by her father through a call from her youth pastor, in whom she had confided as an adolescent. This reminder triggered a severe depression, suicidal action, and the need for hospitalization. Rachel's older sister, herself an abuse victim, had witnessed the abuse, yet Rachel had no memory of the events. No apparent causes of false memories are present, so a mechanism other than ordinary forgetting must have been at work. PMID:10415991

  13. Verifying a Simplified Fuel Oil Flow Field Measurement Protocol

    SciTech Connect

    Henderson, H.; Dentz, J.; Doty, C.

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  14. Verifying compliance to the biological and toxin weapons convention.

    PubMed

    Zilinskas, R A

    1998-01-01

    There are difficult technical problems inherent in verifying compliance to the Biological Weapons and Toxin Convention (BWC) that are making it difficult to reach international agreement on a verification protocol. A compliance regime will most likely involve the formation of an Organization for the Prevention of Biological Warfare (OPBW). Based in part on the experience of UNSCOM in Iraq, this article considers the value of establishing an OPBW and the problems that would be faced by such an international organization. It also reviews the types of verification measures that might be applied by the OPBW and their limitations and benefits for deterring biological weapons programs. PMID:9800100

  15. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    NASA Astrophysics Data System (ADS)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were keenly interested in Galileo's discoveries. After failing to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. From November 1610 onward, however, after they had built a better telescope and obtained another from Venice, they were able to verify Galileo's observations and accepted them completely. Clavius, who adhered to the Ptolemaic system until his death in 1612, even noted these findings in his last edition of Sacrobosco's Sphaera. He and his confreres, however, avoided drawing any conclusions about the planetary system.

  16. Permeameter data verify new turbulence process for MODFLOW

    USGS Publications Warehouse

    Kuniansky, Eve L.; Halford, Keith J.; Shoemaker, W. Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005.
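
    The nonlinear gradient-discharge relation reported from the permeameter is commonly modeled with a Forchheimer-type law, i = a*q + b*q^2, whose two coefficients are recoverable by linear least squares. A sketch with synthetic numbers, not the published data:

```python
import numpy as np

# Synthetic gradient-vs-discharge data following a Forchheimer-type law
# i = a*q + b*q**2 (the coefficient values are invented for illustration).
a_true, b_true = 2.0, 50.0
q = np.linspace(0.001, 0.05, 25)          # specific discharge, m/s
i = a_true * q + b_true * q**2            # hydraulic gradient

# Least-squares fit of the two Forchheimer coefficients: the model is
# nonlinear in q but linear in (a, b), so an ordinary design matrix works.
A = np.column_stack([q, q**2])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, i, rcond=None)
print(f"a = {a_fit:.3f}, b = {b_fit:.3f}")
```

    A significantly nonzero quadratic coefficient is the signature of turbulent (non-Darcian) flow that the CFPM2 module approximates.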

  17. Permeameter data verify new turbulence process for MODFLOW.

    PubMed

    Kuniansky, Eve L; Halford, Keith J; Shoemaker, W Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005. PMID:18459958

  18. Stressing of fault patch during seismic swarms in central Apennines, Italy

    NASA Astrophysics Data System (ADS)

    De Gori, P.; Lucente, F. P.; Chiarabba, C.

    2015-04-01

    Persistent seismic swarms originate along the normal faulting system of the central Apennines (Italy). In this study, we analyze the space-time-energy distribution of one of the longest and most intense of these swarms, active since August 2013 in the high seismic risk area of the Gubbio basin. Our aim is to verify whether information relevant to constraining short-term earthquake occurrence scenarios is hidden in seismic swarms. During the swarm, the seismic moment release first accelerated, with a rapid migration of seismicity along the fault system, and then suddenly dropped. We observe a decrease in the b-value along the portion of the fault system where large-magnitude events concentrated, possibly indicating that a fault patch was dynamically stressed. This finding suggests that the onset of seismic swarms might promote the formation of critically stressed patches.
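The b-value referred to above is the slope of the Gutenberg-Richter relation log10 N = a - b*M; a lower b means relatively more large events. A common way to estimate it is Aki's maximum-likelihood formula, sketched below with synthetic magnitude catalogs (the magnitudes are illustrative, not the Gubbio data).

```python
import math

# Aki (1965) maximum-likelihood b-value estimator for the
# Gutenberg-Richter relation log10 N = a - b*M.
# m_c is the magnitude of completeness of the catalog.

def b_value(mags, m_c):
    m = [x for x in mags if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - m_c)

background = [1.0, 1.2, 1.1, 1.4, 1.3, 1.6, 1.2, 1.5]   # mostly small events
stressed   = [1.0, 1.8, 2.5, 1.6, 3.0, 2.2, 1.4, 2.7]   # richer in large events
print(b_value(background, 1.0), b_value(stressed, 1.0))
```

The catalog richer in large magnitudes yields the lower b-value, the signature interpreted in the abstract as a dynamically stressed patch.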

  19. Towards data fusion in seismic monitoring: Source characterization of mining blasts with acoustic and seismic records

    SciTech Connect

    Leach, R.R. Jr.; Dowla, F.U.

    1995-07-01

    Event identification that combines data from a diverse range of sensor types, such as seismic, hydroacoustic, infrasound, optical, or acoustic sensors, has been discussed recently as a way to improve treaty monitoring technology, especially for a Comprehensive Test Ban Treaty. In this exploratory study the authors compare features in acoustic and seismic data from ripple-fired mining blasts, in an effort to understand the issues of incorporating data fusion into seismic monitoring. They study the possibility of identifying features such as spectral scalloping at high frequencies using acoustic signals recorded in the near field during mining blasts. Recorded acoustic and seismic data from two mining blasts at Carlin, Nevada, were analyzed. The authors have found that there is a clear presence of the periodic and impulsive nature of the ripple-fire source present in the acoustic recordings at high frequencies. They have discovered that the arrival time and duration of the acoustic recordings are also clearly discernible at high frequencies. This is in contrast to the absence of these features in seismic signals, due to attenuation and scattering at high frequencies. The association of signals from different sensors offers solutions for difficult monitoring problems. Seismic or acoustic signals individually may not be able to detect a nuclear test hidden under a typical mining blast. However, the presence of an underground nuclear test during a mining event could be determined by deriving the mining explosion source from the acoustic recording, modeling a seismic signal from the derived source, and subtracting the modeled seismic signal from the seismic recording for the event. Recommendations in the design of data fusion systems for treaty monitoring are suggested.
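The spectral scalloping mentioned above arises because a ripple-fired blast is a train of nearly identical charges separated by a fixed delay, which imposes a comb (Dirichlet) modulation on the source spectrum. A minimal sketch, with an assumed shot count and delay chosen purely for illustration:

```python
import numpy as np

# Amplitude of the comb factor |sum_k exp(-2j*pi*f*k*dt)| for a train of
# n_shots identical charges delayed by dt seconds. Peaks occur at
# multiples of 1/dt; notches (the "scalloping") fall in between.
# n_shots and dt are illustrative values.

def comb_amplitude(freqs, n_shots=5, dt=0.025):
    k = np.arange(n_shots)
    phase = np.exp(-2j * np.pi * np.outer(freqs, k) * dt)
    return np.abs(phase.sum(axis=1))

freqs = np.array([0.0, 8.0, 20.0, 40.0])
print(comb_amplitude(freqs))  # peak, notch, partial, peak (1/dt = 40 Hz)
```

Because the notch spacing depends only on the delay time, this periodic structure survives in high-frequency acoustic records even when attenuation and scattering erase it from the seismic signal.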

  20. Method of migrating seismic records

    DOEpatents

    Ober, Curtis C.; Romero, Louis A.; Ghiglia, Dennis C.

    2000-01-01

    The present invention provides a method of migrating seismic records that retains the information in the seismic records and allows migration with significant reductions in computing cost. The present invention comprises phase encoding seismic records and combining the encoded seismic records before migration. Phase encoding can minimize the effect of unwanted cross terms while still allowing significant reductions in the cost to migrate a number of seismic records.
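The core idea, applying a distinct phase factor to each seismic record and summing before migration, can be sketched as follows. This is a simplified illustration of phase encoding with random phases, not the patented procedure itself; the array shapes and random-phase choice are assumptions.

```python
import numpy as np

# Phase-encode each shot record with a distinct random phase in the
# frequency domain, then stack into one composite record that can be
# migrated once instead of once per shot. Random phases make the
# unwanted cross terms tend to cancel on average.

rng = np.random.default_rng(0)

def encode_and_stack(records):
    """records: array of shape (n_shots, n_traces, n_samples)."""
    spectra = np.fft.rfft(records, axis=-1)
    phases = np.exp(2j * np.pi * rng.random(len(records)))
    stacked = (spectra * phases[:, None, None]).sum(axis=0)
    return np.fft.irfft(stacked, n=records.shape[-1], axis=-1)

records = rng.standard_normal((4, 8, 64))   # four synthetic shot records
composite = encode_and_stack(records)
print(composite.shape)                      # one record instead of four
```

Migrating the single composite record in place of four separate records is the source of the cost reduction the abstract describes.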

  1. A verified minimal YAC contig for human chromosome 21

    SciTech Connect

    Graw, S.L.; Patterson, D.; Drabkin, H.

    1994-09-01

    The goal of this project is the construction of a verified YAC contig of the complete long arm of human chromosome 21 utilizing YACs from the CEPH and St. Louis libraries. The YACs in this contig have been analyzed for size by PFGE, tested for chimerism by FISH or end-cloning, and verified for STS content by PCR. This last analysis has revealed a number of cases of conflict with the published STS order. To establish correct order, we have utilized STS content analysis of somatic cell hybrids containing portions of chromosome 21. Additional problems being addressed include completeness of coverage and possible deletions or gaps. Questions of completeness of the CEPH 810 YAC set arose after screening with 57 independently derived probes failed to identify clones for 11 (19%). Ten of the 11, however, do detect chromosome 21 cosmids when used to screen Lawrence Livermore library LL21NC02"G", a cosmid library constructed from flow-sorted chromosome 21. Remaining gaps in the contig are being closed by several methods. These include YAC fingerprinting and conversion of YACs to cosmids. In addition, we are establishing the overlap between the physical NotI map and the YAC contig by testing YACs for NotI sites and screening the YACs in the contig for the presence of NotI-linking clones.

  2. Successes and failures of recording and interpreting seismic data in structurally complex area: seismic case history

    SciTech Connect

    Morse, V.C.; Johnson, J.H.; Crittenden, J.L.; Anderson, T.D.

    1986-05-01

    There are successes and failures in recording and interpreting a single seismic line across the South Owl Creek Mountain fault on the west flank of the Casper arch, and information obtained from this type of work should help explorationists working in structurally complex areas. A depth cross section lacks a subthrust prospect but is illustrated to show that the South Owl Creek Mountain fault is steeper, with less apparent displacement, than in areas to the north. This cross section is derived from two-dimensional seismic modeling, using data processing methods designed specifically for modeling. A flat-horizon and balancing technique helps confirm model accuracy. High-quality data were acquired using specifically designed seismic field parameters. The authors concluded that the methodology used is valid and that an interactive modeling program, in addition to cross-line control, can improve seismic interpretations in structurally complex areas.

  3. Seismic fragility test of a 6-inch diameter pipe system

    SciTech Connect

    Chen, W. P.; Onesto, A. T.; DeVita, V.

    1987-02-01

    This report contains the test results and assessments of seismic fragility tests performed on a 6-inch diameter piping system. The test was funded by the US Nuclear Regulatory Commission (NRC) and conducted by ETEC. The objective of the test was to investigate the ability of a representative nuclear piping system to withstand high level dynamic seismic and other loadings. Levels of loadings achieved during seismic testing were 20 to 30 times larger than normal elastic design evaluations to ASME Level D limits would permit. Based on failure data obtained during seismic and other dynamic testing, it was concluded that nuclear piping systems are inherently able to withstand much larger dynamic seismic loadings than permitted by current design practice criteria or predicted by the probabilistic risk assessment (PRA) methods and several proposed nonlinear methods of failure analysis.

  4. BUILDING 341 Seismic Evaluation

    SciTech Connect

    Halle, J.

    2015-06-15

    The Seismic Evaluation of Building 341 located at Lawrence Livermore National Laboratory in Livermore, California has been completed. The subject building consists of a main building, Increment 1, and two smaller additions; Increments 2 and 3.

  5. Deepwater seismic acquisition technology

    SciTech Connect

    Caldwell, J.

    1996-09-01

    Although truly new technology is not required for successful acquisition of seismic data in deep Gulf of Mexico waters, it is helpful to review some basic aspects of these seismic surveys. Additionally, such surveys are likely to see early use of some emerging new technology that can improve data quality. Because such items as depth imaging, borehole seismic, 4-D and marine 3-component recording were covered in the May 1996 issue of World Oil, they are not discussed again here; however, these technologies will also play some role in deepwater seismic activities. This paper covers some new considerations for: (1) the longer data records needed in deeper water, (2) some pros and cons of very long streamer use, and (3) two new commercial systems for quantifying data quality.

  6. Discussing Seismic Data

    USGS scientists Debbie Hutchinson and Jonathan Childs discuss collected seismic data. This image was taken aboard U.S. Coast Guard Cutter Healy during a scientific expedition to map the Arctic seafloor.

  7. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    PubMed Central

    Lee, J. H.; Chang, L. T.; Shiau, A. C.; Chen, C. W.; Liao, Y. J.; Li, W. J.; Lee, M. S.; Hsu, S. M.

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with 60Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions. PMID:25883980
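The conversion from a mailed TLD readout to absorbed dose in water chains the three correction factors described above. The sketch below illustrates that chain and the 3% clinical tolerance check; the numeric values are illustrative, drawn only from the ranges quoted in the abstract, not from the actual audit data.

```python
# Convert a TLD readout into absorbed dose in water by applying the
# phantom scatter, energy-dependence, and setup-condition correction
# factors, then compare against the institution's stated dose.
# All numbers below are illustrative.

def absorbed_dose(tld_readout, scatter_cf, energy_cf, setup_cf):
    return tld_readout * scatter_cf * energy_cf * setup_cf

def within_tolerance(measured, stated, tol=0.03):
    return abs(measured - stated) / stated <= tol

dose = absorbed_dose(1.95, scatter_cf=1.05, energy_cf=1.00, setup_cf=0.99)
print(dose, within_tolerance(dose, 2.00))
```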

  8. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1992-01-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  9. AUTOMATING SHALLOW SEISMIC IMAGING

    SciTech Connect

    Steeples, Don W.

    2003-09-14

    The current project is a continuation of an effort to develop ultrashallow seismic imaging as a cost-effective method potentially applicable to DOE facilities. The objective of the present research is to develop and demonstrate the use of a cost-effective, automated method of conducting shallow seismic surveys, an approach that represents a significant departure from conventional seismic-survey field procedures. Initial testing of a mechanical geophone-planting device suggests that large numbers of geophones can be placed both quickly and automatically. The development of such a device could make the application of SSR considerably more efficient and less expensive. The imaging results obtained using automated seismic methods will be compared with results obtained using classical seismic techniques. Although this research falls primarily into the field of seismology, for comparison and quality-control purposes, some GPR data will be collected as well. In the final year of the research, demonstration surveys at one or more DOE facilities will be performed. An automated geophone-planting device of the type under development would not necessarily be limited to the use of shallow seismic reflection methods; it also would be capable of collecting data for seismic-refraction and possibly for surface-wave studies. Another element of our research plan involves monitoring the cone of depression of a pumping well that is being used as a proxy site for fluid-flow at a contaminated site. Our next data set will be collected at a well site where drawdown equilibrium has been reached. Noninvasive, in-situ methods such as placing geophones automatically and using near-surface seismic methods to identify and characterize the hydrologic flow regimes at contaminated sites support the prospect of developing effective, cost-conscious cleanup strategies for DOE and others.

  10. Passive seismic experiment

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

    1972-01-01

    The establishment of a network of seismic stations on the lunar surface as a result of equipment installed by Apollo 12, 14, and 15 flights is described. Four major discoveries obtained by analyzing seismic data from the network are discussed. The use of the system to detect vibrations of the lunar surface and the use of the data to determine the internal structure, physical state, and tectonic activity of the moon are examined.

  11. Application of areal seismics to mapping sandstone channels

    SciTech Connect

    Dobecki, T.L.

    1981-01-01

    The seismic formation mapping project is a two-part program whose prime objective is the evaluation of state-of-the-art seismic reflection methods as a means of mapping the subsurface configuration of low permeability sandstone channels - potential gas reservoirs typical of Tertiary and Cretaceous formations of the Western United States. The initial part of the program involved performing a computer model study to predict the effectiveness of seismic techniques applied to such targets and to develop criteria for interpreting real data. The second part consisted of a seismic field experiment designed to test and evaluate the ability to map known lenses. The field program utilized areal (3-D) acquisition methods at a site underlain by known, shallow Mesa Verde channels. Through the lessons learned by seismic modeling, it was possible to interpret field seismic data in terms of channeling and thereby predict the orientation of channels in the subsurface. By projecting these channels beyond the area of seismic coverage to predict their outcrop position, existing channels were located that agree quite well with the seismic description and projection. It is felt that this exercise has satisfied the program objectives and that seismic methods may be successfully applied to the description of channel sandstone reservoirs. The next test will come in the upcoming DOE-sponsored multi-well experiment.

  12. Third Quarter Hanford Seismic Report for Fiscal Year 2005

    SciTech Connect

    Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

    2005-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event on May 25 near Vantage, Washington. During the third quarter, stratigraphically 17 (85%) events occurred in the Columbia River basalt (approximately 0-5 km), no events in the pre-basalt sediments (approximately 5-10 km), and three (15%) in the crystalline basement (approximately 10-25 km). Geographically, five (25%) earthquakes occurred in swarm areas, 10 (50%) earthquakes were associated with a major geologic structure, and five (25%) were classified as random events.

  13. Annual Hanford Seismic Report for Fiscal Year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-12-01

    This report describes the seismic activity in and around the Hanford Site during Fiscal Year 2003. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 1,336 triggers during fiscal year 2003. Of these triggers, 590 were earthquakes, and 101 of the 590 earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 35 (34.6%) occurred in the Columbia River basalt, 29 (28.7%) were earthquakes in the pre-basalt sediments, and 37 (36.7%) were earthquakes in the crystalline basement. Geographically, 48 (47%) earthquakes occurred in swarm areas, 4 (4%) earthquakes were associated with a major geologic structure, and 49 (49%) were classified as random events. During the third and fourth quarters, an earthquake swarm consisting of 27 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur near the interface of the Columbia River Basalt Group and pre-basalt sediments.

  14. Seismic Consequence Abstraction

    SciTech Connect

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  15. Revised seismic and geologic siting regulations for nuclear power plants

    SciTech Connect

    Murphy, A.J.; Chokshi, N.C.

    1997-02-01

    The primary regulatory basis governing the seismic design of nuclear power plants is contained in Appendix A to Part 50, General Design Criteria for Nuclear Power Plants, of Title 10 of the Code of Federal Regulations (CFR). General Design Criteria (GDC) 2 defines requirements for design bases for protection against natural phenomena. GDC 2 states the performance criterion that "Structures, systems, and components important to safety shall be designed to withstand the effects of natural phenomena such as earthquakes, ... without loss of capability to perform their safety functions...". Appendix A to Part 100, Seismic and Geologic Siting Criteria for Nuclear Power Plants, has been the principal document which provided detailed criteria to evaluate the suitability of proposed sites and suitability of the plant design basis established in consideration of the seismic and geologic characteristics of the proposed sites. Appendix A defines required seismological and geological investigations and requirements for other design conditions such as soil stability, slope stability, and seismically induced floods and water waves, and requirements for seismic instrumentation. The NRC staff is in the process of revising Appendix A. The NRC has recently revised seismic siting and design regulations for future applications. These revisions are discussed in detail in this paper.

  16. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions were first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  17. Analysis of Fingerprint Image to Verify a Person

    NASA Astrophysics Data System (ADS)

    Jahankhani, Hossein; Mohid, Maktuba

    Identification and authentication technologies are increasing day by day to protect people and goods from crime and terrorism. This paper discusses fingerprint technology in depth, analyzing fingerprint images and verifying a person's identity, with a highlight on fingerprint matching. Several fingerprint matching algorithms are analysed and compared. The analysis has identified some major issues or factors in fingerprinting: location, rotation, clipping, noise, sensitivity or insensitivity to non-linear distortion, computational cost, and the accuracy level of fingerprint matching algorithms. A new fingerprint matching algorithm is also proposed in this research work. The proposed algorithm uses Euclidean distance, angle difference, and minutia type as matching parameters instead of specific location parameters (such as x or y coordinates), which makes the algorithm insensitive to location and rotation. The matching of local neighbourhoods at each stage makes the algorithm insensitive to non-linear distortion.
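The reason relative features survive translation and rotation can be shown in a few lines. The sketch below is an illustration of the general idea, not the paper's algorithm: it compares minutia pairs by Euclidean distance, relative angle, and type, and checks that these features are unchanged when the whole print is rigidly moved. The minutiae, tolerances, and helper names are all hypothetical.

```python
import math

# Compare minutia pairs by relative features (distance, angle difference,
# type) rather than absolute (x, y) positions, so a rigid shift or
# rotation of the whole print leaves the features unchanged.

def pair_features(m1, m2):
    """m = (x, y, theta, type). Returns (distance, angle diff, type pair)."""
    d = math.hypot(m2[0] - m1[0], m2[1] - m1[1])
    a = (m2[2] - m1[2]) % (2 * math.pi)
    return d, a, (m1[3], m2[3])

def pairs_match(p, q, d_tol=3.0, a_tol=0.15):
    return (abs(p[0] - q[0]) <= d_tol
            and abs(p[1] - q[1]) <= a_tol
            and p[2] == q[2])

def transform(m, dx, dy, rot):
    """Rigidly rotate by rot and translate by (dx, dy)."""
    x, y, t, k = m
    xr = x * math.cos(rot) - y * math.sin(rot) + dx
    yr = x * math.sin(rot) + y * math.cos(rot) + dy
    return (xr, yr, (t + rot) % (2 * math.pi), k)

a = (10.0, 20.0, 0.5, "ridge_ending")
b = (15.0, 28.0, 1.2, "bifurcation")
moved = [transform(m, dx=7.0, dy=-3.0, rot=0.6) for m in (a, b)]
print(pairs_match(pair_features(a, b), pair_features(*moved)))  # True
```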

  18. Beyond Hammers and Nails: Mitigating and Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Gurney, Kevin Robert

    2013-05-01

    One of the biggest challenges to future international agreements on climate change is an independent, science-driven method of verifying reductions in greenhouse gas emissions (GHG) [Niederberger and Kimble, 2011]. The scientific community has thus far emphasized atmospheric measurements to assess changes in emissions. An alternative is direct measurement or estimation of fluxes at the source. Given the many challenges facing the approach that uses "top-down" atmospheric measurements and recent advances in "bottom-up" estimation methods, I challenge the current doctrine, which has the atmospheric measurement approach "validating" bottom-up, "good-faith" emissions estimation [Balter, 2012] or which holds that the use of bottom-up estimation is like "dieting without weighing oneself" [Nisbet and Weiss, 2010].

  19. Developing an Approach for Analyzing and Verifying System Communication

    NASA Technical Reports Server (NTRS)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and that communication must be reliable. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays those messages so that issues can be detected.

  20. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
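For polynomial trajectories, the key observation is that the squared horizontal separation minus the separation minimum is itself a polynomial p(t), so a conflict exists on [0, T] exactly when p is negative somewhere on that interval, which can be decided by examining the endpoints and the real critical points of p. The sketch below illustrates this idea numerically; it is not the paper's formally verified algorithm, and the trajectories are illustrative.

```python
import numpy as np

# Horizontal conflict check for polynomial relative motion:
# p(t) = dx(t)^2 + dy(t)^2 - D^2 is a polynomial, so it is negative
# somewhere on [0, T] iff it is negative at 0, at T, or at one of its
# real critical points inside the interval.

def in_conflict(dx, dy, D, T):
    """dx, dy: polynomial coefficient arrays (highest degree first)."""
    p = np.polyadd(np.polymul(dx, dx), np.polymul(dy, dy))
    p = np.polysub(p, [D**2])
    candidates = [0.0, T]
    for t in np.roots(np.polyder(p)):
        if abs(t.imag) < 1e-9 and 0.0 < t.real < T:
            candidates.append(t.real)
    return any(np.polyval(p, t) < 0 for t in candidates)

# Head-on encounter: dx(t) = 10 - 2t, dy(t) = 0, separation minimum D = 5.
print(in_conflict([-2.0, 10.0], [0.0], 5.0, 10.0))   # True
# Same closure but with a constant 6-unit lateral offset: never closer than 6.
print(in_conflict([-2.0, 10.0], [6.0], 5.0, 10.0))   # False
```

Checking the critical points exactly (symbolically, in the verified development) rather than by sampling is what makes this style of algorithm both sound and complete.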

  1. Cryptanalysis and improvement of verifiable quantum ( k, n) secret sharing

    NASA Astrophysics Data System (ADS)

    Song, Xiuli; Liu, Yanbing

    2016-02-01

    After analyzing Yang's verifiable quantum secret sharing (VQSS) scheme, we show that in their scheme a participant can prepare a false quantum particle sequence corresponding to a forged share without any other participant being able to trace it. In addition, an attacker or a participant can forge a new quantum sequence by transforming an intercepted quantum sequence; moreover, the forged sequence can pass the verification of the other participants. We therefore propose a new VQSS scheme that improves on the existing one. In the improved scheme, we construct an identity-based quantum signature encryption algorithm, which ensures chosen-plaintext-attack security of the shares and their signatures transmitted in the quantum tunnel. We employ a dual quantum signature and a one-way function to protect against forgery and repudiation by the deceivers (dealer or participants). Furthermore, we add a reconstruction process for the quantum secret and prove its security against superposition attacks.

  2. Generating and verifying entangled-itinerant microwave fields

    NASA Astrophysics Data System (ADS)

    Ku, H. S.

    This thesis presents the experimental achievements of (1) generating entangled-microwave fields propagating on two physically separate transmission lines and (2) verifying the entangled states with efficient measurements. Shared entanglement between two parties is an essential resource for quantum information processing and quantum communication protocols. Experimentally, entangled pairs of electromagnetic fields can be realized by distributing a squeezed vacuum over two separated modes. As a result, entanglement is revealed by the strong cross-correlations between specific quadratures of the two modes. Although it is possible to verify the presence of entanglement with low-efficiency quadrature measurements, higher detection efficiencies are desired for performing protocols that exploit entanglement with high fidelity. In the microwave regime, Josephson parametric amplifiers (JPAs) fulfill the two major tasks mentioned above: JPAs prepare the required squeezed states to generate entanglement and enable us to perform efficient quadrature measurements. Therefore, for the purposes of entanglement generation and verification, ultralow-noise, frequency-tunable JPAs have been developed. Additionally, to increase the efficiency of entanglement generation, we integrate JPAs with two on-chip microwave passive components, a directional coupler and a quadrature hybrid, to form an entangler circuit. The two-mode entangled states are created at the two output modes of the entangler and are measured with a two-channel measurement apparatus where each of the two channels incorporates a JPA as a single-quadrature preamplifier. By employing this measurement scheme, the two measured quadratures of the two output modes can be chosen independently of each other, enabling a full characterization of the two-mode state. To definitively demonstrate the two-mode entanglement, I prove that the measured quadrature variances satisfy the inseparability criterion.
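An inseparability test of the kind mentioned at the end of the abstract can be illustrated with the Duan et al. criterion: in a convention where each vacuum quadrature has variance 1/2, a two-mode Gaussian state is entangled if Var(X1 - X2) + Var(P1 + P2) < 2. The sketch below simulates quadrature samples of an ideal two-mode squeezed state; the squeezing parameter is an assumption, and detector inefficiency and excess noise are ignored.

```python
import numpy as np

# Duan et al. inseparability criterion (vacuum quadrature variance 1/2):
# entangled if Var(X1 - X2) + Var(P1 + P2) < 2. Simulate an ideal
# two-mode squeezed state with squeezing parameter r.

rng = np.random.default_rng(1)

def squeezed_pair_samples(r, n=200_000):
    s = np.exp(-r) / np.sqrt(2)   # std of the squeezed combinations
    a = np.exp(+r) / np.sqrt(2)   # std of the antisqueezed combinations
    x_minus = rng.normal(0.0, s, n)   # (X1 - X2)/sqrt(2): squeezed
    x_plus  = rng.normal(0.0, a, n)   # (X1 + X2)/sqrt(2): antisqueezed
    p_plus  = rng.normal(0.0, s, n)   # (P1 + P2)/sqrt(2): squeezed
    p_minus = rng.normal(0.0, a, n)
    x1, x2 = (x_plus + x_minus) / np.sqrt(2), (x_plus - x_minus) / np.sqrt(2)
    p1, p2 = (p_plus + p_minus) / np.sqrt(2), (p_plus - p_minus) / np.sqrt(2)
    return x1, x2, p1, p2

def duan_sum(x1, x2, p1, p2):
    return np.var(x1 - x2) + np.var(p1 + p2)

x1, x2, p1, p2 = squeezed_pair_samples(r=0.5)
print(duan_sum(x1, x2, p1, p2))   # ~2*exp(-2r) ≈ 0.74, below the threshold of 2
```

In practice the measured variances must also be corrected for detection efficiency, which is why the high-efficiency JPA readout described above matters.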

  3. WiggleView: Visualizing Large Seismic Datasets

    NASA Astrophysics Data System (ADS)

    Nayak, A. M.; Leigh, J.; Johnson, A.; Russo, R.; Morin, P.; Laughbon, C.; Ahern, T.

    2002-12-01

    WiggleView is a tool for visualizing seismic data collected from a worldwide network of seismometers. The visualization consists of overlaying familiar 2D seismic traces recorded for the N-S, E-W and vertical components of the earth's displacement over the topographic map of the affected area. In addition, a 3D particle trace consisting of the integration of these 3 components provides a depiction of how an object placed at a particular seismic recording station would shake at the instant of the event. Data for the seismic events are obtained from repositories maintained by IRIS (Incorporated Research Institutions for Seismology) at the Data Management Center, Seattle, Washington. Suppose a seismologist wants to examine data gathered when an earthquake measuring 7.2 on the Richter scale hit Turkey on October 31, 1998. WiggleView displays data at 20 stations for this event. The tool's strength lies in being able to depict as many as 60 channels of waveforms and 20 traces of particle motion on a single display. This allows one to watch the seismic wave field expand about a source and see how it differs from place to place. It can also assist in understanding surface wave multipathing and anisotropy -- this is important for revealing structure and for seismic hazard estimation. WiggleView was designed for two display platforms: the standard PC-based desktop or laptop with a modern-day game graphics card; and a stereoscopic projection system called the GeoWall. The stereoscopic nature of the images enhances depth perception and thus allows better understanding of attenuation due to distance and earth structure, source directivity and seismic hazard estimation. Illustrations are available at the WiggleView website http://www.evl.uic.edu/atul/wiggleview
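The 3D particle trace described above is, at its core, the time-integral of the three velocity components. A minimal numpy sketch (synthetic input, not WiggleView code; the centring step is an illustrative choice):

```python
import numpy as np

def particle_trajectory(vel_ns, vel_ew, vel_z, dt):
    """Integrate three-component ground velocity (trapezoidal rule) into
    the 3-D displacement path a particle at the station would trace out."""
    def integrate(v):
        d = np.concatenate([[0.0], np.cumsum(0.5 * (v[1:] + v[:-1])) * dt])
        return d - d.mean()          # centre the trajectory on the origin
    # columns: east, north, up
    return np.column_stack([integrate(vel_ew),
                            integrate(vel_ns),
                            integrate(vel_z)])

dt = 0.01
t = np.arange(0.0, 10.0, dt)
# Rayleigh-wave-like toy motion: radial (N-S) and vertical in quadrature
traj = particle_trajectory(np.sin(2 * np.pi * t), np.zeros_like(t),
                           np.cos(2 * np.pi * t), dt)
```

For a 1 Hz velocity sinusoid of unit amplitude, the displacement amplitude comes out as 1/(2*pi), and the zero E-W component stays zero, which makes the sketch easy to sanity-check.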

  4. The Budget Guide to Seismic Network Management

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Ebel, J. E.

    2007-05-01

    Regardless of their size, there are certain tasks that all seismic networks must perform, including data collection and processing, earthquake location, information dissemination, and quality control. Small seismic networks are unlikely to possess the resources -- manpower and money -- required to do much in-house development. Fortunately, many free or inexpensive software solutions are available that can perform many of the required tasks. Often the available solutions are all-in-one turnkey packages designed and developed for much larger seismic networks, and the cost of adapting them to a smaller network must be weighed against the ease with which other, non-seismic software can be adapted to the same task. We describe here the software and hardware choices we have made for the New England Seismic Network (NESN), a sparse regional seismic network responsible for monitoring and reporting all seismicity within the New England region in the northeastern U.S. We have chosen a cost-effective approach to monitoring, using free, off-the-shelf solutions where available (e.g., Earthworm, HYP2000) and modifying freeware solutions when it is easier than trying to adapt a large, complicated package. We have selected for use software that is: free, likely to receive continued support from the seismic or, preferably, larger internet community, and modular. Modularity is key to our design because it ensures that if one component of our processing system becomes obsolete, we can insert a suitable replacement with few modifications to the other modules. Our automated event detection, identification and location system is based on a wavelet transform analysis of station data that arrive continuously via TCP/IP transmission over the internet. Our system for interactive analyst review of seismic events and remote system monitoring utilizes a combination of Earthworm modules, Perl cgi-bin scripts, Java, and native Unix commands and can now be carried out via internet browser from anywhere in the world. With our current communication and processing system we are able to achieve a monitoring threshold of about M2.0 for most of New England, in spite of high cultural noise and sparse station distribution, and maintain an extremely high rate of data recovery at minimal cost.
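The wavelet-transform detection idea can be sketched in a few lines: transform the continuous trace at several scales and flag samples whose coefficients stand far above a robust noise estimate. This is an illustrative stand-in, not the NESN implementation; the scales, threshold, and synthetic "event" are all made up:

```python
import numpy as np

def ricker(n, a):
    """Ricker (Mexican-hat) wavelet: n samples, width parameter a."""
    t = np.arange(n) - (n - 1) / 2.0
    x = t / a
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def wavelet_detect(trace, scales, nsigma=8.0):
    """Flag samples whose wavelet coefficient at any scale exceeds
    nsigma robust standard deviations (1.4826 * MAD) for that scale."""
    hits = np.zeros(len(trace), dtype=bool)
    for a in scales:
        c = np.convolve(trace, ricker(int(10 * a), a), mode="same")
        mad = np.median(np.abs(c - np.median(c)))
        hits |= np.abs(c) > nsigma * 1.4826 * (mad + 1e-12)
    return hits

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 5000)
trace[2500:2540] += 15.0 * ricker(40, 4.0)      # buried test "event"
hits = wavelet_detect(trace, scales=[2.0, 4.0, 8.0])
```

The MAD-based threshold keeps the detector robust even when the transient itself contaminates the noise statistics, which matters for continuous real-time streams.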

  5. Short-Period Seismic Noise in Vorkuta (Russia)

    SciTech Connect

    Kishkina, S. B.; Spivak, A. A.; Sweeney, J. J.

    2008-05-15

    Cultural development of new subpolar areas of Russia is associated with a need for detailed seismic research, including both mapping of regional seismicity and seismic monitoring of specific mining enterprises. Of special interest are the northern territories of European Russia, including shelves of the Kara and Barents Seas, Yamal Peninsula, and the Timan-Pechora region. Continuous seismic studies of these territories are important now because there is insufficient seismological knowledge of the area and an absence of systematic data on the seismicity of the region. Another task of current interest is the need to consider the seismic environment in the design, construction, and operation of natural gas extracting enterprises such as the construction of the North European Gas Pipeline. Issues of scientific importance for seismic studies in the region are the complex geodynamical setting, the presence of permafrost, and the complex tectonic structure. In particular, the Uralian Orogene (Fig. 1) strongly affects the propagation of seismic waves. The existing subpolar seismic stations [APA (67.57°N, 33.40°E), LVZ (67.90°N, 34.65°E), and NRIL (69.50°N, 88.40°E)] do not cover the extensive area between the Pechora and Ob Rivers (Fig. 1). Thus seismic observations in the Vorkuta area, which lies within the area of concern, represent a special interest. Continuous recording at a seismic station near the city of Vorkuta (67.50°N, 64.11°E) [1] has been conducted since 2005 for the purpose of regional seismic monitoring and, more specifically, detection of seismic signals caused by local mining enterprises. Current surveys of local seismic noise [7,8,9,11] are aimed in particular at a technical assessment of the suitability of the site for installation of a small-aperture seismic array, which would include 10-12 recording instruments, with the Vorkuta seismic station as the central element. When constructed, this seismic array will considerably improve the recording capacity of regional and local seismic events. It will allow detection of signatures of seismic waves propagating in submeridional and sublatitudinal directions. The latter is of special interest not only to assess the influence of the Urals on propagation patterns of seismic waves, but also to address other questions, such as the structure and dynamic characteristics of the internal dynamo of the Earth [9,13]. Recording seismic waves at low angular distances from seismically active subpolar zones will allow us to collect data on vortical and convective movements in subpolar lithosphere blocks and at the boundary of the inner core of the Earth, possibly giving essential clues to the modeling of the Earth's electromagnetic field [3,13]. The present study considers basic features of seismic noise at the Vorkuta station obtained through the analysis of seismic records from March 2006 to December 2007.

  6. Seismic analysis of a nonlinear airlock door system

    SciTech Connect

    Huang, S.N.

    1983-01-01

    The containment equipment airlock door of the Fast Flux Test Facility utilizes screw-type actuators as a push-pull mechanism for closing and opening operations. Special design features were used to protect these actuators from pressure differential loading. These features caused the door to behave as a nonlinear system during a seismic event. Seismic analyses, utilizing the time history method, were conducted to determine the seismic loads on these screw-type actuators. Several sizes of actuators were examined. Procedures for determining the final optimum design are discussed in detail.

  7. Multichannel Wiener deconvolution of vertical seismic profiles

    SciTech Connect

    Haldorsen, J.B.U.; Miller, D.E.; Walsh, J.J.

    1994-10-01

    The authors describe a technique for performing optimal, least-squares deconvolution of vertical seismic profile (VSP) data. The method is a two-step process that involves (1) estimating the source signature and (2) applying a least-squares optimum deconvolution operator that minimizes the noise not coherent with the source signature estimate. The optimum inverse problem, formulated in the frequency domain, gives as a solution an operator that can be interpreted as a simple inverse to the estimated aligned signature multiplied by semblance across the array. An application to a zero-offset VSP acquired with a dynamite source shows the effectiveness of the operator in attaining the two conflicting goals of adaptively spiking the effective source signature and minimizing the noise. Signature design for seismic surveys could benefit from observing that the optimum deconvolution operator gives a flat signal spectrum if and only if the seismic source has the same amplitude spectrum as the noise.
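The operator the abstract describes, the inverse of the aligned signature estimate weighted by array semblance, is compact in the frequency domain. A toy sketch under simplifying assumptions (traces already aligned, identical signatures, a hypothetical regularization constant):

```python
import numpy as np

def semblance_decon(traces, eps=1e-20):
    """Least-squares VSP-style deconvolution: the inverse of the stacked
    (aligned) signature estimate, weighted by the array semblance at
    each frequency.  eps is a small hypothetical regularizer."""
    X = np.fft.rfft(traces, axis=1)                  # receivers x freqs
    stack = X.mean(axis=0)                           # signature estimate
    power = (np.abs(X) ** 2).mean(axis=0)
    semblance = np.abs(stack) ** 2 / (power + eps)   # coherent fraction in [0, 1]
    op = semblance * np.conj(stack) / (np.abs(stack) ** 2 + eps)
    out = np.fft.irfft(X * op, n=traces.shape[1], axis=1)
    return out, semblance

# Identical minimum-phase-like signature on 5 receivers, delayed 10 samples
w = 0.8 ** np.arange(20)
trace = np.zeros(256)
trace[10:30] = w
traces = np.tile(trace, (5, 1))
out, semblance = semblance_decon(traces)
# With perfectly coherent traces the semblance is ~1 everywhere and the
# operator collapses each trace to a spike at sample 0 (the delay is part
# of the aligned signature, so it is removed too).
```

In the noiseless, fully coherent case the semblance weight is unity and the operator reduces to a plain spectral inverse; with incoherent noise the semblance term automatically suppresses frequencies where the array disagrees.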

  8. Seismic exploration for water on Mars

    NASA Technical Reports Server (NTRS)

    Page, Thornton

    1987-01-01

    It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present surface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each is firmly founded on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.

  9. Seismic stratigraphy on a micro budget

    SciTech Connect

    Clark, T.M.

    1984-04-01

    For a brief period, Tandy Corporation marketed an inexpensive digitizer under its Radio Shack trademark. A seismic stratigraphic analysis system has been developed using this device and a 48K Radio Shack microcomputer. This system has the capacity to enter well log curves and seismic traces at the digitizer, convert log curves to time by integration or interpolation, compute synthetic seismograms and time logs, and do synthetic modeling, wavelet estimation, and inversion of seismic and synthetic traces. The system allows great flexibility, as each process is designed as a stand-alone, interactive program, and data files are in identical format. Thus almost any order of operation may be chosen, and modeling may be in either depth or time. Display is to a pen plotter or dot-matrix printer-plotter. The plot routines allow flexibility in the number, order, spacing, and scale of the curves displayed.
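The core computation such a system performs, a reflectivity series from an impedance log convolved with a wavelet, fits in a few lines of numpy (illustrative, obviously not the Radio Shack-era code; the single-interface model and 30 Hz wavelet are made up):

```python
import numpy as np

def ricker(f, dt, length=0.128):
    """Zero-phase Ricker wavelet with peak frequency f (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_seismogram(velocity, density, wavelet):
    """Reflection coefficients from the acoustic impedance log,
    convolved with the wavelet: the classic synthetic seismogram."""
    z = velocity * density                        # acoustic impedance
    rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])      # reflection coefficients
    return np.convolve(rc, wavelet, mode="same")

vel = np.array([2000.0] * 50 + [3000.0] * 50)     # m/s; one interface
rho = np.array([2200.0] * 50 + [2400.0] * 50)     # kg/m^3
syn = synthetic_seismogram(vel, rho, ricker(30.0, dt=0.002))
```

For this single interface the peak amplitude equals the reflection coefficient, (7.2e6 - 4.4e6) / (7.2e6 + 4.4e6), and lands at the interface sample, which makes the sketch easy to verify by hand.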

  10. Permafrost Active Layer Seismic Interferometry Experiment (PALSIE).

    SciTech Connect

    Abbott, Robert; Knox, Hunter Anne; James, Stephanie; Lee, Rebekah; Cole, Chris

    2016-01-01

    We present findings from a novel field experiment conducted at Poker Flat Research Range in Fairbanks, Alaska that was designed to monitor changes in active layer thickness in real time. Results are derived primarily from seismic data streaming from seven Nanometrics Trillium Posthole seismometers directly buried in the upper section of the permafrost. The data were evaluated using two analysis methods: Horizontal to Vertical Spectral Ratio (HVSR) and ambient noise seismic interferometry. Results from the HVSR conclusively illustrated the method's effectiveness at determining the active layer's thickness with a single station. Investigations with the multi-station method (ambient noise seismic interferometry) are continuing at the University of Florida and have not yet conclusively determined active layer thickness changes. Further work continues with the Bureau of Land Management (BLM) to determine if the ground based measurements can constrain satellite imagery, which provide measurements on a much larger spatial scale.
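The single-station HVSR measurement reduces to a ratio of amplitude spectra. A toy numpy sketch (a fabricated 5 Hz horizontal resonance, not PALSIE data; the averaging convention for the two horizontals is one common choice among several):

```python
import numpy as np

def hvsr(north, east, vertical, dt):
    """Horizontal-to-vertical spectral ratio: quadratic mean of the two
    horizontal amplitude spectra over the vertical's, per frequency."""
    N, E, V = (np.abs(np.fft.rfft(c)) for c in (north, east, vertical))
    freqs = np.fft.rfftfreq(len(vertical), dt)
    return freqs, np.sqrt((N**2 + E**2) / 2.0) / (V + 1e-12)

dt = 0.01
t = np.arange(0.0, 60.0, dt)
rng = np.random.default_rng(2)
north = np.sin(2 * np.pi * 5.0 * t) + rng.normal(0, 0.1, t.size)
east = np.cos(2 * np.pi * 5.0 * t) + rng.normal(0, 0.1, t.size)
vertical = rng.normal(0, 1.0, t.size)
freqs, ratio = hvsr(north, east, vertical, dt)
peak_bin = int(np.argmin(np.abs(freqs - 5.0)))    # bin at the toy resonance
```

In practice the peak frequency of the smoothed ratio maps to layer thickness through the shear-wave velocity of the thawed layer; here the 5 Hz peak simply stands in for that resonance.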

  11. ROSE seismic data storage and exchange facility

    SciTech Connect

    LaTraille, S.L.; Gettrust, J.F.; Simpson, M.E.

    1982-10-10

    The Rivera Ocean Seismic Experiment was a multi-institutional marine seismic experiment held off western Mexico during early 1979. Dense spatial sampling was provided by ocean bottom seismometers and ocean bottom hydrophones recording both natural events and explosions. The ROSE experiment and a companion land experiment generated a database of approximately 76,000 'events' (source-instrument pairs). Procedures for efficient data storage, retrieval, and exchange were designed and implemented at the Hawaii Institute of Geophysics. Data capabilities of the exchange are demonstrated by using data from several participants to obtain qualitative estimates of the attenuation of seismic energy in the oceanic mantle near the East Pacific Rise and through the ocean/continent margin near Petatlan, Guerrero, Mexico. The database is now available for general use.

  12. Tornado Detection Based on Seismic Signal.

    NASA Astrophysics Data System (ADS)

    Tatom, Frank B.; Knupp, Kevin R.; Vitton, Stanley J.

    1995-02-01

    At the present time the only generally accepted method for detecting when a tornado is on the ground is human observation. Based on theoretical considerations combined with eyewitness testimony, there is strong reason to believe that a tornado in contact with the ground transfers a significant amount of energy into the ground. The amount of energy transferred depends upon the intensity of the tornado and the characteristics of the surface. Some portion of this energy takes the form of seismic waves, both body and surface waves. Surface waves (Rayleigh and possibly Love) represent the most likely type of seismic signal to be detected. Based on the existence of such a signal, a seismic tornado detector appears conceptually possible. The major concerns for designing such a detector are range of detection and discrimination between the tornadic signal and other types of surface waves generated by ground transportation equipment, high winds, or other nontornadic sources.

  13. Seismic Hazard Characterization at the DOE Savannah River Site (SRS): Status report

    SciTech Connect

    Savy, J.B.

    1994-06-24

    The purpose of the Seismic Hazard Characterization project for the Savannah River Site (SRS-SHC) is to develop estimates of the seismic hazard for several locations within the SRS. Given the differences in the geology and geotechnical characteristics at each location, the estimates of the seismic hazard must account for the specific local conditions at each site. Characterization of seismic hazard is a critical factor for the design of new facilities as well as for the review and potential retrofit of existing facilities at SRS. The scope of the SRS seismic hazard characterization reported in this document is limited to the Probabilistic Seismic Hazard Analysis (PSHA). The goal of the project is to provide seismic hazard estimates based on a state-of-the-art method that is consistent with the developments and findings of several ongoing studies expected to improve the state of seismic hazard analysis.

  14. Optimizing Seismic Monitoring Networks for EGS and Conventional Geothermal Projects

    NASA Astrophysics Data System (ADS)

    Kraft, Toni; Herrmann, Marcus; Bethmann, Falko; Wiemer, Stefan

    2013-04-01

    In the past several years, geological energy technologies have received growing attention and have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote sparsely populated areas, are now posing a considerable risk for the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential for the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring in an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic light systems and is therefore an important confidence building measure towards the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that allows us to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts. The algorithm is based on the D-optimal experimental design that aims to minimize the error ellipsoid of the linearized location problem. Optimization for additional criteria (e.g., focal mechanism determination or installation costs) can be included. We consider a 3D seismic velocity model, a European ambient seismic noise model derived from high-resolution land-use data, and existing seismic stations in the vicinity of the geotechnical site. Additionally, we account for the attenuation of the seismic signal with travel time and ambient seismic noise with depth to be able to correctly deal with borehole station networks. Using this algorithm we are able to find the optimal geometry and size of the seismic monitoring network that meets the predefined application-oriented performance criteria. This talk will focus on optimal network geometries for deep geothermal projects of the EGS and hydrothermal type, and discuss the requirements for basic seismic surveillance and high-resolution reservoir monitoring and characterization.
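The D-optimal criterion mentioned above maximizes det(J^T J) for the linearized location problem, which shrinks the volume of the location error ellipsoid. A brute-force toy version for a homogeneous half-space (hypothetical geometry and velocity, far simpler than the authors' 3D velocity and noise models):

```python
import numpy as np
from itertools import combinations

def jacobian(stations, source, v=3.5):
    """Travel-time derivatives w.r.t. hypocentre (x, y, z) and origin
    time for a homogeneous medium with velocity v (km/s)."""
    d = stations - source
    r = np.linalg.norm(d, axis=1, keepdims=True)
    return np.hstack([-d / (v * r), np.ones((len(stations), 1))])

def d_optimal_subset(candidates, source, k):
    """Exhaustively pick the k stations maximizing det(J^T J)."""
    best_idx, best_det = None, -np.inf
    for idx in combinations(range(len(candidates)), k):
        J = jacobian(candidates[list(idx)], source)
        det = np.linalg.det(J.T @ J)
        if det > best_det:
            best_idx, best_det = idx, det
    return best_idx, best_det

source = np.array([0.0, 0.0, 5.0])                       # km, toy hypocentre
spread = np.array([[10.0, 0, 0], [0, 12.0, 0], [-11.0, 0, 0], [0, -9.0, 0]])
cluster = np.array([[10.0, 0, 0], [11.0, 1.0, 0], [12.0, -1.0, 0], [10.5, 0.5, 0]])

det_spread = np.linalg.det(jacobian(spread, source).T @ jacobian(spread, source))
det_cluster = np.linalg.det(jacobian(cluster, source).T @ jacobian(cluster, source))
best_idx, best_det = d_optimal_subset(np.vstack([spread, cluster]), source, k=4)
```

As expected for D-optimality, an azimuthally spread geometry beats a one-sided cluster by orders of magnitude, and the exhaustive search can only do at least as well as the spread subset it contains.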

  15. Constraints on Subglacial Conditions from Seismicity

    NASA Astrophysics Data System (ADS)

    Lipovsky, B.; Olivo, D. C.; Dunham, E. M.

    2014-12-01

    A family of physics-based models designed to explain emergent, bandlimited, "tremor-like" seismograms sheds light on subglacial and englacial conditions. We consider two such models. In the first, a water-filled fracture hosts resonant modes; the seismically observable quality factor and characteristic frequency of these modes constrain the fracture length and aperture. In the second model, seismicity is generated by repeating stick-slip events on a fault patch (portion of the glacier bed) with sliding described by rate- and state-dependent friction laws. Wave propagation phenomena may additionally generate bandlimited seismic signals. These models make distinct predictions that may be used to address questions of glaciological concern. Laboratory friction experiments show that small, repeating earthquakes most likely occur at the ice-till interface and at conditions below the pressure melting point. These laboratory friction values, when combined with observed ice surface velocities, may also be used to constrain basal pore pressure. In contrast, seismic signals indicative of water-filled basal fractures suggest that, at least locally, temperatures are above the pressure melting point. We present a simple diagnostic test between these two processes that concerns the relationship between the multiple seismic spectral peaks generated by each process. Whereas repeating earthquakes generate evenly spaced spectral peaks through the Dirac comb effect, hydraulic fracture resonance, as a result of dispersive propagation of waves along the crack, generates spectral peaks that are not evenly spaced.
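The diagnostic at the end of the abstract, evenly spaced peaks from repeating events, can be illustrated with a synthetic spike train (toy recurrence interval and pulse shape, not glacier data):

```python
import numpy as np

dt, n = 0.01, 4096
T = 1.28                  # recurrence interval of the repeating events, s

# 32 identical Gaussian pulses, one every T seconds (period = 128 samples)
sig = np.zeros(n)
for k in range(32):
    sig += np.exp(-((np.arange(n) - (64 + 128 * k)) * dt) ** 2 / (2 * 0.05 ** 2))

spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(n, dt)

# Dirac comb: spectral energy only at integer multiples of 1/T.
comb_bins = np.arange(1, 4) * 32      # 1/T = 0.78125 Hz falls on bin 32
mid_bins = comb_bins + 16             # halfway between comb lines
```

A dispersive crack resonance would instead put its peaks at mode frequencies that are not integer multiples of a fundamental, so checking the spacing of observed peaks discriminates the two mechanisms.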

  16. Overview of seismic considerations at the Paducah Gaseous Diffusion Plant

    SciTech Connect

    Hunt, R.J.; Stoddart, W.C.; Burnett, W.A.; Beavers, J.E.

    1992-10-01

    This paper presents an overview of seismic considerations at the Paducah Gaseous Diffusion Plant (PGDP), which is managed by Martin Marietta Energy Systems, Inc., for the Department of Energy (DOE). The overview describes the original design, the seismic evaluations performed for the Safety Analysis Report (SAR) issued in 1985, and current evaluations and designs to address revised DOE requirements. Plans for ensuring that future changes in requirements and knowledge are addressed are also presented.

  17. Regional seismic discrimination research at LLNL

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.; Patton, H.J.; Jarpe, S.; Glenn, L.

    1995-10-01

    The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve the understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report the authors discuss preliminary efforts to geophysically characterize the Middle East and North Africa. They show that the remarkable stability of coda allows one to develop physically based, stable single station magnitude scales in new regions. They then discuss progress to date on evaluating and improving physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. The authors apply this modeling ability to develop improved discriminants including slopes of P to S ratios. They find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally they discuss development and use of new coda and waveform modeling tools to investigate special events.

  18. LLNL's regional seismic discrimination research

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.

    1995-07-01

    The ability to negotiate and verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve our understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report we discuss our preliminary efforts to geophysically characterize two regions, the Korean Peninsula and the Middle East-North Africa. We show that the remarkable stability of coda allows us to develop physically based, stable single station magnitude scales in new regions. We then discuss our progress to date on evaluating and improving our physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. We apply this modeling ability to develop improved discriminants including slopes of P to S ratios. We find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally we discuss our development and use of new coda and waveform modeling tools to investigate special events.
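At its simplest, a regional P/S amplitude-ratio discriminant of the kind evaluated in these two reports compares RMS amplitudes in P and S windows; the synthetic trace, window times, and amplitudes below are made up for illustration:

```python
import numpy as np

def log_p_over_s(trace, dt, p_onset, s_onset, win=2.0):
    """log10 of RMS(P window) / RMS(S window); explosions tend toward
    higher values than earthquakes at regional distances."""
    def rms(t0):
        i0 = int(round(t0 / dt))
        seg = trace[i0:i0 + int(round(win / dt))]
        return np.sqrt(np.mean(seg ** 2))
    return np.log10(rms(p_onset) / rms(s_onset))

dt = 0.01
t = np.arange(0.0, 20.0, dt)
trace = np.zeros_like(t)
trace[500:700] = 2.0 * np.sin(2 * np.pi * 5.0 * t[:200])    # "P" burst at 5 s
trace[1200:1400] = 1.0 * np.sin(2 * np.pi * 5.0 * t[:200])  # "S" burst at 12 s
ratio = log_p_over_s(trace, dt, p_onset=5.0, s_onset=12.0)
```

Real discriminants of this family are computed in narrow frequency bands and corrected for distance, and the "slopes of P to S ratios" the reports mention track how the ratio varies with frequency.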

  19. Effects of Large and Small-Source Seismic Surveys on Marine Mammals and Sea Turtles

    NASA Astrophysics Data System (ADS)

    Holst, M.; Richardson, W. J.; Koski, W. R.; Smultea, M. A.; Haley, B.; Fitzgerald, M. W.; Rawson, M.

    2006-05-01

    L-DEO implements a marine mammal and sea turtle monitoring and mitigation program during its seismic surveys. The program consists of visual observations, mitigation, and/or passive acoustic monitoring (PAM). Mitigation includes ramp ups, power-downs, and shutdowns of the seismic source if marine mammals or turtles are detected in or about to enter designated safety radii. Visual observations for marine mammals and turtles have taken place during all 11 L-DEO surveys since 2003, and PAM was done during five of those. Large sources were used during six cruises (10 to 20 airguns; 3050 to 8760 in³; PAM during four cruises). For two interpretable large-source surveys, densities of marine mammals were lower during seismic than non-seismic periods. During a shallow-water survey off Yucatán, delphinid densities during non-seismic periods were 19x higher than during seismic; however, this number is based on only 3 sightings during seismic and 11 sightings during non-seismic. During a Caribbean survey, densities were 1.4x higher during non-seismic. The mean closest point of approach (CPA) for delphinids for both cruises was significantly farther during seismic (1043 m) than during non-seismic (151 m) periods (Mann-Whitney U test, P < 0.001). Large whales were only seen during the Caribbean survey; mean CPA during seismic was 1722 m compared to 1539 m during non-seismic, but sample sizes were small. Acoustic detection rates with and without seismic were variable for three large-source surveys with PAM, with rates during seismic ranging from 1/3 to 6x those without seismic (n = 0 for fourth survey). The mean CPA for turtles was closer during non-seismic (139 m) than seismic (228 m) periods (P < 0.01). Small-source surveys used up to 6 airguns or 3 GI guns (75 to 1350 in³). During a Northwest Atlantic survey, delphinid densities during seismic and non-seismic were similar. However, in the Eastern Tropical Pacific, delphinid densities during non-seismic were 2x those during seismic. During a survey in Alaska, densities of large whales were 4.5x greater during non-seismic than seismic. In contrast, densities of Dall's porpoise were ~2x greater during seismic than during non-seismic; they also approached closer to the vessel during seismic (622 m) than during non-seismic (1044 m), though not significantly so (P = 0.16). CPAs for all other marine mammal groups sighted during small-source surveys were similar during seismic and non-seismic. For the one small-source survey with PAM, the acoustic detection rate during seismic was 1/3 of that without seismic. The mean CPA for turtles was 120 m during non-seismic and 285 m during seismic periods (P < 0.001). The large-source results suggest that, with operating airguns, some cetaceans tended to avoid the immediate area but often continued calling. Some displacement was also apparent during three interpretable small-source surveys, but the evidence was less clear than for large-source surveys. With both large and small sources, although some cetaceans avoided the airguns and vessel, others came to bowride during seismic operations. Sea turtles showed localized avoidance during large and small-source surveys.
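The CPA comparisons above rely on the Mann-Whitney U test; a miniature version with fabricated distance samples (not the survey data) and a large-sample normal approximation without tie correction:

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic and two-sided p-value (normal
    approximation, no tie correction): small p means the two
    samples plausibly come from different distributions."""
    nx, ny = len(x), len(y)
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = nx * ny / 2.0
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12.0)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))   # equals 2 * (1 - Phi(|z|))
    return u, p

# Fabricated closest-point-of-approach samples (metres)
cpa_seismic = [950, 1043, 1200, 880, 1100, 990, 1300, 1010, 870, 1150]
cpa_quiet = [120, 151, 180, 90, 200, 140, 160, 110, 170, 130]
u, p = mann_whitney_u(cpa_seismic, cpa_quiet)
```

Because every "seismic" CPA here exceeds every "quiet" CPA, U hits its maximum of nx*ny and the p-value is far below 0.001, mirroring the significance level reported in the abstract.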

  20. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that the data collection and processing are done in the "cloud" (Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, meant to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.

  1. Experimental Techniques Verified for Determining Yield and Flow Surfaces

    NASA Technical Reports Server (NTRS)

    Lerch, Brad A.; Ellis, Rod; Lissenden, Cliff J.

    1998-01-01

    Structural components in aircraft engines are subjected to multiaxial loads when in service. For such components, life prediction methodologies are dependent on the accuracy of the constitutive models that determine the elastic and inelastic portions of a loading cycle. A threshold surface (such as a yield surface) is customarily used to differentiate between reversible and irreversible flow. For elastoplastic materials, a yield surface can be used to delimit the elastic region in a given stress space. The concept of a yield surface is central to the mathematical formulation of a classical plasticity theory, but at elevated temperatures, material response can be highly time dependent. Thus, viscoplastic theories have been developed to account for this time dependency. Since the key to many of these theories is experimental validation, the objective of this work (refs. 1 and 2) at the NASA Lewis Research Center was to verify that current laboratory techniques and equipment are sufficient to determine flow surfaces at elevated temperatures. By probing many times in the axial-torsional stress space, we could define the yield and flow surfaces. A small offset definition of yield (10 microstrain) was used to delineate the boundary between reversible and irreversible behavior so that the material state remained essentially unchanged and multiple probes could be done on the same specimen. The strain was measured with an off-the-shelf multiaxial extensometer that could measure the axial and torsional strains over a wide range of temperatures. The accuracy and resolution of this extensometer were verified by comparing its data with strain gauge data at room temperature. The extensometer was found to have sufficient resolution for these experiments. In addition, the amount of crosstalk (i.e., the accumulation of apparent strain in one direction when strain in the other direction is applied) was found to be negligible.
Tubular specimens were induction heated to determine the flow surfaces at elevated temperatures. The heating system induced a large amount of noise in the data. By reducing thermal fluctuations and using appropriate data averaging schemes, we could render the noise inconsequential. Thus, accurate and reproducible flow surfaces (see the figure) could be obtained.
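    The small-offset yield definition described above can be sketched in code. The following is an illustrative example only, not the authors' analysis software: the probe data, modulus, and offset tolerance are hypothetical.

```python
import numpy as np

def offset_yield_stress(stress, strain, modulus, offset=10e-6):
    """Return the stress at which inelastic strain first exceeds `offset`.

    stress, strain : 1-D arrays from a single loading probe
    modulus        : elastic modulus fitted to the initial linear portion
    offset         : small-offset yield definition (10 microstrain here)
    """
    inelastic = strain - stress / modulus   # total minus elastic strain
    exceeded = np.nonzero(inelastic >= offset)[0]
    if exceeded.size == 0:
        return None                         # probe stayed in the elastic region
    return float(stress[exceeded[0]])

# Synthetic probe: linear up to 200 MPa, then mild plastic flow.
E = 200e3                                   # MPa, hypothetical modulus
stress = np.linspace(0.0, 300.0, 601)
strain = stress / E + np.where(stress > 200.0,
                               ((stress - 200.0) / 1e3) ** 2 * 1e-2, 0.0)
yield_stress = offset_yield_stress(stress, strain, E)
```

    In practice a probe is reversed as soon as the offset is exceeded, so the accumulated inelastic strain stays near the 10 με threshold and the material state is left essentially unchanged for the next probe.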

  2. Enhancing Seismic Monitoring Capability for Hydraulic Fracturing Induced Seismicity in Canada

    NASA Astrophysics Data System (ADS)

    Kao, H.; Cassidy, J. F.; Farahbod, A.; Lamontagne, M.

    2012-12-01

    The amount of natural gas produced from unconventional sources, such as shale gas, has increased dramatically over the last decade. One of the key factors in the success of shale gas production is the application of hydraulic fracturing (also known as "fracking") to facilitate the efficient recovery of natural gas from shale matrices. As fracking operations become routine in all major shale gas fields, their potential to induce local earthquakes at some locations has become a public concern. To address this concern, Natural Resources Canada has initiated a research effort to investigate the potential links between fracking operations and induced seismicity in some major shale gas basins of Canada. This federal-provincial collaborative research aims to assess whether shale gas fracking can alter the regional pattern of background seismicity and, if so, what the relationship is between how fracking is conducted and the maximum magnitude of induced seismicity. Other objectives include the investigation of the time scale of the interaction between fracking events and induced seismicity and the evaluation of induced seismicity potential for shale gas basins under different tectonic/geological conditions. The first phase of this research is to enhance the detection and monitoring capability for seismicity possibly related to shale gas recovery in Canada. Densification of the Canadian National Seismograph Network (CNSN) is currently underway in northeast British Columbia, where fracking operations are taking place. Additional seismic stations are planned for major shale gas basins in other regions where fracking might be likely in the future. All newly established CNSN stations are equipped with broadband seismographs with real-time continuous data transmission.
The design goal of the enhanced seismic network is to significantly lower the detection threshold such that the anticipated low-magnitude earthquakes that might be related to fracking operations can be identified and located shortly after their occurrence.

  3. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    SciTech Connect

    Karyono; Mazzini, Adriano; Sugiharto, Anton; Lupi, Matteo; Syafri, Ildrem; Masturyono; Rudiyanto, Ariska; Pranata, Bayu; Muzli; Widodo, Handi Sulistyo; Sudrajat, Ajat

    2015-04-24

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on the 29th of May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign aiming to identify the occurrence and the location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network, and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and of spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at the 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.
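    The ambient-noise step described in this record (cross-correlating station pairs to retrieve empirical Green's functions) can be illustrated with a minimal sketch. This is not the project's code: the sampling rate, window length, and inter-station delay are synthetic, and real processing stacks many windows and applies preprocessing such as spectral whitening.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 100.0                      # Hz, hypothetical sampling rate
n = 4096                        # one noise window (real surveys stack many)
lag_samples = 50                # simulated inter-station travel time (0.5 s)

# A common noise wavefield recorded at two stations with a relative delay.
source = rng.standard_normal(n + lag_samples)
sta_a = source[:n]
sta_b = source[lag_samples:lag_samples + n]

# Frequency-domain cross-correlation of the two records.
spec = np.fft.rfft(sta_a) * np.conj(np.fft.rfft(sta_b))
xcorr = np.fft.irfft(spec, n)

# The correlation peak approximates the inter-station travel time.
peak = int(np.argmax(xcorr))
travel_time = (peak if peak <= n // 2 else peak - n) / fs
```

    Stacking such correlations over months of noise converges toward the inter-station Green's function, whose Rayleigh-wave dispersion can then be inverted for shallow velocity structure.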

  4. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    NASA Astrophysics Data System (ADS)

    Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Masturyono; Rudiyanto, Ariska; Pranata, Bayu; Muzli; Widodo, Handi Sulistyo; Sudrajat, Ajat; Sugiharto, Anton

    2015-04-01

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on the 29th of May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign aiming to identify the occurrence and the location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network, and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and of spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at the 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.

  5. Realities of verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants

    SciTech Connect

    Swindle, D.W.

    1990-03-01

    Over a two and one-half year period beginning in 1981, representatives of six countries (United States, United Kingdom, Federal Republic of Germany, Australia, The Netherlands, and Japan) and the inspectorate organizations of the International Atomic Energy Agency and EURATOM developed and agreed to a technically sound approach for verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants. This effort, known as the Hexapartite Safeguards Project (HSP), led to the first international consensus on techniques and requirements for effective verification of the absence of weapons-grade nuclear materials production. Since that agreement, research and development has continued on the radiation detection technology-based technique, which confirms that the HSP goal is technically achievable. However, the realities of achieving the HSP goal of effective technical verification have not yet been fully attained. Issues such as design and operating conditions unique to each gas centrifuge plant, concern about the potential for sensitive technology disclosures, and on-site support requirements have hindered full implementation and operator support of the HSP agreement. In future arms control treaties that may limit or monitor fissile material production, the negotiators must recognize and account for the realities and practicalities of verifying the absence of HEU production. This paper will describe the experiences and realities of trying to achieve the goal of developing and implementing an effective approach for verifying the absence of HEU production. 3 figs.

  6. Generalized seismic wavelets

    NASA Astrophysics Data System (ADS)

    Wang, Yanghua

    2015-11-01

    The Ricker wavelet, which is often employed in seismic analysis, has a symmetrical form. Seismic wavelets observed in field data, however, are commonly asymmetric with respect to time. In order to better represent seismic signals, asymmetrical wavelets are defined systematically as fractional derivatives of a Gaussian function, of which the Ricker wavelet is just a special case: the integer derivative of order 2. The fractional value and a reference frequency are the two key parameters in the generalization. Frequency characteristics, such as the central frequency, the bandwidth, the mean frequency and the deviation, may be expressed analytically in closed form. In practice, once the statistical properties (the mean frequency and deviation) are numerically evaluated from the discrete Fourier spectra of seismic data, these analytical expressions can be used to uniquely determine the fractional value and the reference frequency, and subsequently to derive the various frequency quantities needed for wavelet analysis. It is demonstrated that field seismic signals, recorded at various depths in a vertical borehole, can be closely approximated by generalized wavelets defined in terms of fractional values and reference frequencies.
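    The construction can be sketched numerically: a Gaussian is differentiated in the frequency domain, where a fractional order u generalizes the integer Ricker case (u = 2). This is an illustrative implementation consistent with the description above, not the author's code; the sampling, reference frequency, and normalization are arbitrary choices.

```python
import numpy as np

def fractional_gaussian_wavelet(t, f0, u):
    """Fractional-derivative-of-Gaussian wavelet via spectral differentiation.

    t  : uniformly sampled time axis (s), long enough for the Gaussian to decay
    f0 : reference frequency (Hz) controlling the Gaussian width
    u  : derivative order; u = 2 recovers a Ricker-type wavelet (up to sign)
    """
    dt = t[1] - t[0]
    gauss = np.exp(-(np.pi * f0 * t) ** 2)
    omega = 2.0 * np.pi * np.fft.fftfreq(t.size, dt)
    # Multiplying by (i*omega)**u differentiates u times; the spectrum stays
    # Hermitian, so the inverse transform is real up to round-off.
    spec = np.fft.fft(gauss) * (1j * omega) ** u
    w = np.real(np.fft.ifft(spec))
    return w / np.max(np.abs(w))            # peak-normalized

t = np.arange(-0.256, 0.256, 0.002)        # 256 samples at 2 ms
ricker_like = fractional_gaussian_wavelet(t, 25.0, 2.0)   # symmetric
asym = fractional_gaussian_wavelet(t, 25.0, 1.7)          # asymmetric, fractional
```

    For u = 2 the result matches the analytic Ricker form (1 - 2(πf₀t)²)·exp(-(πf₀t)²) up to sign, while non-integer u produces the asymmetric wavelets the paper fits to borehole data.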

  7. Seismic source parameters

    SciTech Connect

    Johnson, L.R.

    1994-06-01

    The use of information contained in seismograms to infer the properties of an explosion source presents an interesting challenge, because the seismic waves recorded on the seismograms represent only small, indirect effects of the explosion. The essential physics of the problem includes the process by which these elastic waves are generated by the explosion and also the process involved in propagating the seismic waves from the source region to the sites where the seismic data are collected. Interpretation of the seismic data in terms of source properties requires that the effects of these generation and propagation processes be taken into account. The propagation process involves linear mechanics, and a variety of standard seismological methods have been developed for handling this part of the problem. The generation process presents a more difficult problem, as it involves non-linear mechanics, but semi-empirical methods that appear to yield reasonable results have been developed for handling this part of the problem. These basic properties of the seismic method are illustrated with some of the results from the NPE.

  8. Landslide seismic magnitude

    NASA Astrophysics Data System (ADS)

    Lin, C. H.; Jan, J. C.; Pu, H. C.; Tu, Y.; Chen, C. C.; Wu, Y. M.

    2015-11-01

    Landslides have become one of the most deadly natural disasters on Earth, due not only to a significant increase in extreme climate events caused by global warming, but also to rapid economic development in areas of topographic relief. How to detect landslides using a real-time system has become an important question for reducing possible landslide impacts on human society. However, traditional detection of landslides, either through direct surveys in the field or via remote sensing images obtained from aircraft or satellites, is highly time consuming. Here we analyze very-long-period seismic signals (20-50 s) generated by large landslides, such as those triggered by Typhoon Morakot, which passed through Taiwan in August 2009. In addition to successfully locating 109 large landslides, we define a landslide seismic magnitude based on an empirical formula: Lm = log(A) + 0.55 log(Δ) + 2.44, where A is the maximum displacement (μm) recorded at one seismic station and Δ is its distance (km) from the landslide. We conclude that both the location and the seismic magnitude of large landslides can be rapidly estimated from broadband seismic networks for both academic and applied purposes, similar to earthquake monitoring. We suggest that a real-time algorithm be set up for routine monitoring of landslides in places where they pose a frequent threat.
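    The empirical formula quoted above translates directly into code. Only the formula itself is from the paper; the station observations below, and the averaging of single-station estimates into a network value, are invented for illustration.

```python
import math

def landslide_magnitude(max_displacement_um, distance_km):
    """Single-station landslide seismic magnitude from the paper's formula:
    Lm = log10(A) + 0.55 * log10(D) + 2.44,
    with A the maximum displacement (micrometers) at the station and
    D the station-to-landslide distance (km)."""
    return math.log10(max_displacement_um) + 0.55 * math.log10(distance_km) + 2.44

def network_magnitude(observations):
    """Average the single-station estimates (a common convention for
    magnitude scales; hypothetical here, not specified in the abstract)."""
    return sum(landslide_magnitude(a, d) for a, d in observations) / len(observations)

# Hypothetical observations: (displacement in micrometers, distance in km)
obs = [(12.0, 80.0), (5.5, 150.0), (20.0, 45.0)]
lm = network_magnitude(obs)
```

    For example, a 10 μm maximum displacement recorded 100 km away gives Lm = 1 + 1.1 + 2.44 = 4.54.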

  9. Measurements verifying the optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa M.

    This thesis concentrates on laboratory measurements of the Electron Drift Instrument (EDI), focusing primarily on the optics of the system. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This drift velocity is determined using two electron beams directed perpendicular to the magnetic field that return to be detected by the spacecraft. The EDI will be used on the Magnetospheric Multi-Scale Mission. The EDI optics testing process takes measurements of the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state via simulation. The optics state tables were created in simulations, and these measurements are used to confirm their accuracy. The setup consisted of an apparatus, made up of the EDI's optics and sensor electronics, secured to a two-axis gear arm inside a vacuum chamber. An electron beam was projected at the apparatus, which used the EDI optics to focus the beam through the micro-channel plates and onto the circular 32-pad annular ring that makes up the sensor. The counts per pad over an interval of 1 ms were averaged over 25 samples and plotted in MATLAB. The measurements agreed well with the simulations, providing confidence in the EDI instrument.

  10. Measurements Verifying the Optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa; Kletzing, Craig; Bounds, Scott; Sigsbee, Kristine M.

    2015-04-01

    Magnetic reconnection is the breaking and reconnecting of opposing magnetic field lines, and is often associated with tremendous energy transfer. The energy transferred by reconnection directly affects people through its influence on geospace weather and technological systems such as telecommunication networks, GPS, and power grids. However, the mechanisms that cause magnetic reconnection are not well understood. The Magnetospheric Multi-Scale Mission (MMS) will use four spacecraft in a pyramid formation to make three-dimensional measurements of the structures in magnetic reconnection occurring in the Earth's magnetosphere. The spacecraft will repeatedly sample these regions for a prolonged period of time to gather data in more detail than has previously been possible. MMS is scheduled to be launched in March of 2015. The Electron Drift Instrument (EDI) will be used on MMS to measure the electric fields associated with magnetic reconnection. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This paper concentrates on measurements of the EDI's optics system. The testing process includes measuring the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state. The measurements agree well with simulations, and we are confident in the performance of the EDI instrument.

  11. Garbage collection can be made real-time and verifiable

    NASA Technical Reports Server (NTRS)

    Hino, James H.; Ross, Charles L.

    1988-01-01

    An efficient means of memory reclamation (also known as Garbage Collection) is essential for Machine Intelligence applications where dynamic storage allocation is desired or required. Solutions for real-time systems must introduce very small processing overhead and must also provide for verification, in order to meet application time budgets and to confirm the correctness of the software. Garbage Collection (GC) techniques are proposed for symbolic processing systems which may simultaneously meet both real-time requirements and verification requirements. The proposed memory reclamation technique takes advantage of the strong points of both the earlier Mark and Sweep technique and the more recent Copy Collection approaches. At least one practical implementation of these new GC techniques has already been developed and tested on a very-high-performance symbolic computing system. Complete GC processing of all generated garbage has been demonstrated to require as little as a few milliseconds to perform. This speed enables the effective operation of the GC function either as a background task or as an actual part of the application task itself.
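    As a minimal illustration of the classical Mark and Sweep technique the record refers to (not the paper's real-time hybrid collector, which also draws on Copy Collection), consider this sketch:

```python
class Node:
    """A heap object holding references to other heap objects."""
    def __init__(self):
        self.refs = []
        self.marked = False

class Heap:
    """Minimal, stop-the-world mark-and-sweep collector sketch.
    Real-time variants bound the pause by doing this work incrementally."""
    def __init__(self):
        self.objects = []

    def alloc(self):
        node = Node()
        self.objects.append(node)
        return node

    def collect(self, roots):
        # Mark phase: everything reachable from the roots survives.
        stack = list(roots)
        while stack:
            node = stack.pop()
            if not node.marked:
                node.marked = True
                stack.extend(node.refs)
        # Sweep phase: reclaim unmarked objects and reset marks.
        live = [n for n in self.objects if n.marked]
        reclaimed = len(self.objects) - len(live)
        for n in live:
            n.marked = False
        self.objects = live
        return reclaimed

heap = Heap()
a, b, c = heap.alloc(), heap.alloc(), heap.alloc()
a.refs.append(b)                 # a -> b stays reachable; c becomes garbage
reclaimed = heap.collect(roots=[a])
```

    The pause of a naive collector grows with heap size, which is exactly why the real-time approach described above bounds and interleaves this work.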

  12. Verifying operator fitness - an imperative not an option

    SciTech Connect

    Scott, A.B. Jr.

    1987-01-01

    In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster, suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later to be termed one of the best operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world - not only those that involve nuclear power - are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically those users of substances that have an impact on the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the lack of impairment from substance abuse, which many today consider it. Full fitness for duty implies mental and moral fitness, as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

  13. A credit card verifier structure using diffraction and spectroscopy concepts

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine a number of broadband light sources on the embossed hologram of the credit card, one at a time and each at a different incident angle, in such a way that a different color spectrum per incident angle is diffracted and separated in space. In this way, the number of pixels in each color plane can be investigated. We then apply a feed-forward back-propagation neural network configuration to separate counterfeit credit cards from real ones. Our experimental demonstration, using two off-the-shelf broadband white light-emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer, can distinguish all 69 counterfeit credit cards from the eight real ones.
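    The processing chain described (per-color-plane pixel counts feeding a small feed-forward network) can be sketched as follows. The image, network weights, and threshold are placeholders: the real system trains its 3-layer network by back-propagation on measured hologram images, whereas here the forward pass alone is shown with random weights.

```python
import numpy as np

def color_plane_counts(image, threshold=128):
    """Per-channel pixel counts for one diffraction image.
    image: H x W x 3 uint8 array; counts pixels whose channel value exceeds
    `threshold`, giving one count per color plane (R, G, B)."""
    return (image > threshold).sum(axis=(0, 1)).astype(float)

def forward(features, w1, b1, w2, b2):
    """One-hidden-layer feed-forward pass with sigmoid activations,
    mirroring the 3-layer (input-hidden-output) structure described."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    hidden = sigmoid(features @ w1 + b1)
    return sigmoid(hidden @ w2 + b2)       # > 0.5 -> "genuine", else "counterfeit"

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in image
x = color_plane_counts(img) / (64 * 64)    # normalized counts per plane
w1, b1 = rng.standard_normal((3, 4)), np.zeros(4)   # untrained, random weights
w2, b2 = rng.standard_normal(4), 0.0
score = forward(x, w1, b1, w2, b2)
```

    In the demonstrated system one such feature vector is extracted per incident angle, so the trained network sees the angle-multiplexed color signature that counterfeit holograms fail to reproduce.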

  14. A procedure for seismic risk reduction in Campania Region

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.

    2008-07-01

    The Campania Region has set up and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. In this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out the seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies in order to approve the safety checks. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At the moment several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy to mitigate seismic risk set by the Campania Region was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from data and vulnerability studies previously carried out, was selected for immediate retrofitting designs. Secondly, another set of buildings was identified for structural strengthening. These were selected using the criteria specified in the guidelines prepared by the scientific committee, based on data obtained from the first set of safety checks. The strengthening philosophy adopted in the projects will be described in the paper.

  15. Third Quarter Hanford Seismic report for Fiscal year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-09-11

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 356 triggers during the third quarter of fiscal year 2003. Of these triggers, 141 were earthquakes, 34 of which were located in the Hanford Seismic Network area. Stratigraphically, 15 occurred in the Columbia River basalt, 13 in the pre-basalt sediments, and 6 in the crystalline basement. Geographically, 22 earthquakes occurred in swarm areas, 1 earthquake was associated with a major geologic structure, and 11 were classified as random events. During the third quarter, an earthquake swarm consisting of 15 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur at the base of the Columbia River Basalt Group.

  16. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  17. Seismic surveys test on Innerhytta Pingo, Adventdalen, Svalbard Islands

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Rossi, Giuliana; Petronio, Lorenzo; Accaino, Flavio; Romeo, Roberto; Wheeler, Walter

    2015-04-01

    We present the preliminary results of an experimental full-wave seismic survey test conducted on the Innerhytta Pingo, located in Adventdalen, Svalbard Islands, Norway. Several seismic surveys were adopted in order to study the Pingo's inner structure, from classical reflection/refraction arrays to seismic tomography and surface-wave analysis. The aim of the project IMPERVIA, funded by the Italian PNRA, was the evaluation of the permafrost characteristics beneath this open-system Pingo by means of seismic investigation, evaluating best practice in terms of logistic deployment. The survey was done in April-May 2014: we collected 3 seismic lines with different spacing between receivers (from 2.5 m to 5 m), for a total length of more than 1 km. We collected data with different vertical geophones (with natural frequencies of 4.5 Hz and 14 Hz) as well as with a seismic snow-streamer. We tested different seismic sources (hammer, seismic gun, fire crackers and heavy weight drop), and we carefully verified geophone coupling in order to evaluate the different responses. In such peculiar conditions we found that fire crackers gave the best signal-to-noise ratio for refraction/reflection surveys. To ensure the best geophone coupling with the frozen soil, we dug snow pits to remove the snow-cover effect. On the other hand, for the surface-wave methods, the very high velocity of the permafrost strongly limits the generation of long wavelengths, both with these explosive sources and with the common sledgehammer. The only source capable of generating low frequencies was a heavy drop-weight system, which allowed us to analyze surface-wave dispersion below 10 Hz. Preliminary data analysis shows marked velocity inversions and strong velocity contrasts at depth. The combined use of surface and body waves highlights the presence of a heterogeneous soil deposit beneath a thick layer of permafrost. This level hosts the water circulation from depth that controls the evolution of the Pingo structure.

  18. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    NASA Astrophysics Data System (ADS)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department has already built a solar laboratory to promote and utilise solar energy, and designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  19. Adjustment of minimum seismic shear coefficient considering site effects for long-period structures

    NASA Astrophysics Data System (ADS)

    Guan, Minsheng; Du, Hongbiao; Cui, Jie; Zeng, Qingli; Jiang, Haibo

    2016-06-01

    Minimum seismic base shear is a key factor in the seismic design of long-period structures, and it is specified in several major national seismic building codes, viz. ASCE7-10, NZS1170.5 and GB50011-2010. In the current Chinese seismic design code GB50011-2010, however, the effect of soil type on the minimum seismic shear coefficient is not considered, which makes it difficult for long-period structures sited on hard soil or rock to meet the minimum base shear requirement. This paper aims to modify the current minimum seismic shear coefficient by taking site effects into account. For this purpose, effective peak acceleration (EPA) is used as a representation of the ordinate value of the design response spectrum at the plateau. A large number of earthquake records, for which EPAs are calculated, are examined through statistical analysis considering soil conditions as well as seismic fortification intensities. The study indicates that soil type has a significant effect on the spectral ordinates at the plateau, and hence on the minimum seismic shear coefficient. Modification factors for the current minimum seismic shear coefficient are preliminarily suggested for each site class. It is shown that the modified seismic shear coefficients are more effective for determining the minimum seismic base shear of long-period structures.
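    The role of the minimum shear coefficient can be illustrated with a small sketch: when the spectral coefficient of a long-period structure falls below the code floor, the floor governs the design base shear. All numbers below are hypothetical; actual coefficients depend on the governing code and, per this paper, on site class.

```python
def design_base_shear(spectral_coeff, min_shear_coeff, seismic_weight):
    """Base shear with a code-style floor on the shear coefficient.

    spectral_coeff : coefficient read from the design response spectrum
                     (small for long-period structures)
    min_shear_coeff: code minimum (the quantity this paper adjusts by site class)
    seismic_weight : effective seismic weight of the structure

    Returns (base shear, True if the floor governed the design).
    """
    governing = max(spectral_coeff, min_shear_coeff)
    return governing * seismic_weight, governing > spectral_coeff

# A long-period structure whose spectral demand falls below the floor
# (coefficients and weight, in kN, are invented for illustration):
shear, floor_governs = design_base_shear(0.012, 0.024, 5.0e5)
```

    When the floor governs, raising or lowering the minimum coefficient by site class, as the paper proposes, directly scales the design base shear.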

  20. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrel, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2014-08-19

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  1. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrell, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2015-09-29

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  2. Induced seismicity. Final report

    SciTech Connect

    Segall, P.

    1997-09-18

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Just completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating, on the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements along with seismic data are used to test and constrain thermo-mechanical models.

  3. Downhole seismic array system

    SciTech Connect

    Petermann, S.G.

    1992-03-03

    This patent describes an apparatus for receiving seismic signals from an earth formation at one or more points in a wellbore penetrating the formation. It comprises a sonde including extensible and retractable support means thereon for supporting seismic signal receiver means, hydraulic actuator means for extending and retracting the support means, body means for supporting the actuator means and the support means, and signal transmitting means for transmitting electrical signals related to seismic signals received by the receiver means; tubing means connected to the sonde for deploying the sonde in the wellbore, the tubing means including electrical conductor means disposed therein for conducting electrical signals between means on the surface of the formation and the sonde, and the tubing means comprising means for conducting hydraulic fluid to the sonde for operation of the actuator means; and means for supplying hydraulic fluid from the surface of the formation through the tubing means to the sonde for operating the actuator means.

  4. Seismic ruggedness of relays

    SciTech Connect

    Merz, K.L. )

    1991-08-01

    This report complements EPRI report NP-5223 Revision 1, February 1991, and presents additional information and analyses concerning generic seismic ruggedness of power plant relays. Existing and new test data have been used to construct Generic Equipment Ruggedness Spectra (GERS) which can be used in identifying rugged relays during seismic re-evaluation of nuclear power plants. This document is an EPRI Tier 1 report. The results of relay fragility tests for both old and new relays are included in an EPRI Tier 2 report with the same title. In addition to the presentation of relay GERS, the Tier 2 report addresses the applicability of GERS to relays of older vintage, discusses the important identifying nomenclature for each relay type, and examines relay adjustment effects on seismic ruggedness. 9 refs., 3 figs., 1 tab.

  5. Canadian Seismic Agreement

    SciTech Connect

    Wetmiller, R.J.; Lyons, J.A.; Shannon, W.E.; Munro, P.S.; Thomas, J.T.; Andrew, M.D.; Lapointe, S.P.; Lamontagne, M.; Wong, C.; Anglin, F.M.; Adams, J.; Cajka, M.G.; McNeil, W.; Drysdale, J.A. )

    1992-05-01

    This is a progress report of work carried out under the terms of a research agreement entitled the "Canadian Seismic Agreement" between the US Nuclear Regulatory Commission (USNRC), the Canadian Commercial Corporation, and the Geophysics Division of the Geological Survey of Canada (GD/GSC) during the period from July 01, 1989 to June 30, 1990. The "Canadian Seismic Agreement" generally supports the operation of various seismograph stations in eastern Canada and the collection and analysis of earthquake data for the purpose of mitigating seismic hazards in eastern Canada and the northeastern US. The specific activities carried out in this one-year period are summarized below under four headings: Eastern Canada Telemetered Network and local network developments, Datalab developments, strong-motion network developments, and earthquake activity. During this period the first surface fault unequivocally determined to have accompanied a historic earthquake in eastern North America occurred in northern Quebec.

  6. Magnitude correlations in global seismicity

    SciTech Connect

    Sarlis, N. V.

    2011-08-15

    By employing natural time analysis, we analyze the worldwide seismicity and study the existence of correlations between earthquake magnitudes. We find that global seismicity exhibits nontrivial magnitude correlations for earthquake magnitudes greater than Mw 6.5.

  7. Stress-Release Seismic Source for Seismic Velocity Measurement in Mines

    NASA Astrophysics Data System (ADS)

    Swanson, P. L.; Clark, C.; Richardson, J.; Martin, L.; Zahl, E.; Etter, A.

    2014-12-01

    Accurate seismic event locations are needed to delineate roles of mine geometry, stress and geologic structures in developing rockburst conditions. Accurate absolute locations are challenging in mine environments with rapid changes in seismic velocity due to sharp contrasts between individual layers and large time-dependent velocity gradients attending excavations. Periodic use of controlled seismic sources can help constrain the velocity in this continually evolving propagation medium comprising the miners' workplace. With a view to constructing realistic velocity models in environments in which use of explosives is problematic, a seismic source was developed subject to the following design constraints: (i) suitable for use in highly disturbed zones surrounding mine openings, (ii) able to produce usable signals over km-scale distances in the frequency range of typical coal mine seismic events (~10-100 Hz), (iii) repeatable, (iv) portable, (v) non-disruptive to mining operations, and (vi) safe for use in potentially explosive gaseous environments. Designs of the compressed load column seismic source (CLCSS), which generates a stress, or load, drop normal to the surface of mine openings, and the fiber-optic based source-initiation timer are presented. Tests were conducted in a coal mine at a depth of 500 m (1700 ft) and signals were recorded on the surface with a 72-ch (14 Hz) exploration seismograph for load drops of 150-470 kN (16-48 tons). Signal-to-noise ratios of unfiltered signals ranged from ~200 immediately above the source (500 m (1700 ft)) to ~8 at the farthest extent of the array (slant distance of ~800 m (2600 ft)), suggesting the potential for use over longer range. Results are compared with signals produced by weight drop and sledge hammer sources, indicating the superior waveform quality for first-arrival measurements with the CLCSS seismic source.

  8. Development of a wireless seismic array for volcano monitoring

    NASA Astrophysics Data System (ADS)

    Moure, David; Toma, Daniel; Lázaro, Antoni Manuel; Del Río, Joaquín; Carreras, Normandino; José Blanco, María

    2014-05-01

    Volcano monitoring is mainly based on three disciplines: seismology, geodesy, and geochemistry. Seismic arrays are used to locate the seismic source, based on analysis of the signals recorded by each seismometer. The most important advantages of arrays over classical seismic networks are easier deployment, no need for major infrastructure, and the ability to provide an approximate location for signals that a sparse seismic network cannot locate. In this paper the design of a low-power wireless array is presented. All sensors transmit their acquired data to a central node, which can calculate a candidate location of the seismic source in real time. The reliability of those locations depends, among other parameters (number of sensors, geometrical distribution), on the precision of time synchronization between the nodes. To achieve the necessary precision, the wireless seismic array implements a scheme based on the IEEE 1588 protocol, which keeps node clocks synchronized to better than a microsecond and thereby makes correlation of the signals from all sensors possible. The ultimate goal is for the central node to receive data from all seismometers, locate the seismic source, and transmit only the result, which dramatically reduces data traffic. Active volcanic areas are often located far from inhabited areas, where data transmission options are limited, so in situ calculation is crucial to reduce the volume of data the seismic array must transmit.
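
    The role of time synchronization can be illustrated with a short sketch: once IEEE 1588 keeps the node clocks aligned to sub-microsecond accuracy, the central node can estimate inter-sensor arrival delays by cross-correlation. This is a generic illustration, not the authors' implementation; the sample rate, signal, and names are assumptions.

```python
import numpy as np

fs = 100.0                                   # assumed sample rate, Hz
t = np.arange(0, 10, 1 / fs)
pulse = np.exp(-((t - 5.0) ** 2) / 0.01)     # transient arriving at t = 5 s
true_lag = 25                                # 25 samples = 0.25 s later at B
a = pulse                                    # node A recording
b = np.roll(pulse, true_lag)                 # node B recording (delayed copy)

xc = np.correlate(b, a, mode="full")         # cross-correlation of the pair
lag = int(np.argmax(xc)) - (len(a) - 1)      # peak offset = delay in samples
print(lag, lag / fs)                         # 25 0.25
```

    Delays like this, measured across several sensor pairs with a known array geometry, are what allow the central node to estimate a back azimuth or source location.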

  9. Synthesis of artificial spectrum-compatible seismic accelerograms

    NASA Astrophysics Data System (ADS)

    Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

    2014-08-01

    The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis. Moreover, it requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, for many regions few actual records of different seismic intensities are available. In addition, a large number of seismic accelerograms are required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of the structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which are useful in earthquake engineering for the dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert-Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics such as frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature.
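
    The Hilbert step of the transform can be sketched briefly. This is a generic illustration, not the authors' implementation: the empirical mode decomposition stage of the Hilbert-Huang transform is omitted, and the test signal, sampling rate, and names are assumptions. The analytic signal yields the instantaneous amplitude and frequency of a single mode.

```python
import numpy as np
from scipy.signal import hilbert

fs = 200.0
t = np.arange(0, 2, 1 / fs)
x = 1.5 * np.sin(2 * np.pi * 5.0 * t)          # one 5 Hz "mode", amplitude 1.5

z = hilbert(x)                                  # analytic signal x + j*H[x]
amplitude = np.abs(z)                           # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz

# Away from the record ends, the estimates recover the input parameters.
print(round(float(np.median(amplitude[50:-50])), 2))   # 1.5
print(round(float(np.median(inst_freq[50:-50])), 1))   # 5.0
```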

  10. New seismic sensors for footstep detection and other military applications

    NASA Astrophysics Data System (ADS)

    Pakhomov, Alex; Goldburt, Tim

    2004-09-01

    The performance of seismic security systems depends strongly on matching sensor characteristics to the particular application, and current seismic sensors do not yield the best possible results. In addition to identifying the requirements for optimal seismic sensors, we have developed seismic sensors for defense and security applications. We present two different types: a miniscule, extremely low-cost sensor and a bulk sensor. The miniscule, extremely low-cost sensor is an electret-based geophone for both seismic and acoustic detection systems. This geophone detects small objects (a walking, running, or crawling person, or a small underwater vehicle) moving on the surface, underground, or in the water. It can also detect large objects (heavy vehicles, trucks, tanks) and can be used in littoral warfare. The electret-based design significantly improves the technical characteristics, achieving unique performance: an expanded frequency response range at low frequencies, an improved sensitivity threshold and response accuracy, and improved protection from electromagnetic interference. The bulk sensor has an extremely large detection surface, a nanocomposite body in a special-form casing, and a special electronic circuit. These sensors allow detection of footstep signals at high ambient seismic noise levels; however, their installation requires significant groundwork.

  11. First Quarter Hanford Seismic Report for Fiscal Year 2011

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Clayton, Ray E.; Devary, Joseph L.

    2011-03-31

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 16 local earthquakes during the first quarter of FY 2011. Six earthquakes were located at shallow depths (less than 4 km), seven at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and three at depths greater than 9 km, within the basement. Geographically, thirteen earthquakes were located in known swarm areas and three were classified as random events. The highest-magnitude event (1.8 Mc) was recorded on October 19, 2010 at a depth of 17.5 km, with the epicenter located near the Yakima River between the Rattlesnake Mountain and Horse Heaven Hills swarm areas.

  12. Induced Seismicity Monitoring System

    NASA Astrophysics Data System (ADS)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. These include monitoring underground gas migration through detailed tomographic studies of rock properties, monitoring the integrity of the cap rock, and tracking microseismicity over time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly with stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity down to magnitudes only slightly below the level that can be felt at the surface (e.g., magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions, and features 9 channels of recording (currently a three-component 4.5 Hz geophone, a MEMS accelerometer, and a microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable SD flash cards of up to 64+ GB each. If available, data can be transmitted via cell-phone modem or picked up during site visits. Low power consumption allows for autonomous operation using only a 10-watt solar panel and a gel-cell battery.
    The system has been successfully tested for long-term (> 6 months) remote operations over a wide range of environments, from summer in Arizona to winter above 9000' in the mountains of southern Colorado. Statistically based on-board processing is used for detection, arrival-time picking, back-azimuth estimation, and magnitude estimates from coda waves and acoustic signals.
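
    On-board detection of the kind described above is commonly implemented as a short-term/long-term average (STA/LTA) energy trigger. The sketch below is a generic illustration, not the system's firmware; the window lengths, threshold, and test signal are assumptions.

```python
import numpy as np

def sta_lta_trigger(x, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
    """Boolean trigger mask from the short-term/long-term average energy ratio."""
    energy = np.asarray(x, dtype=float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-20) > threshold

fs = 100
noise = 0.01 * np.sin(2 * np.pi * 1.0 * np.arange(3000) / fs)  # weak background
x = noise.copy()
x[1500:1600] += 1.0                       # an impulsive "event"
mask = sta_lta_trigger(x, fs)
print(bool(mask[1550]), bool(mask[500]))  # True False
```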

  13. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the New START treaty, which was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral, or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to establish confidence among domestic, bilateral, and multilateral audiences that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components, and fissile materials from military stocks will need to be established.
This paper is intended to provide useful background information for establishing a conceptual approach to a five-year technical program plan for research and development of nuclear arms reductions verification and transparency technologies and procedures.

  14. Software for Verifying Image-Correlation Tie Points

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Yagi, Gary

    2008-01-01

    A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes imperfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present program relies on the availability of a left-to-right correlation map in addition to the usual right-to-left correlation map. The additional map must be generated, which doubles the processing time; this increased time can now be afforded in the data-processing pipeline because the parallelization discussed in the previous article reduced map-generation time from about 60 minutes to 3 minutes. Parallel cluster processing therefore enabled this better science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x",y") in the left image. If (x,y) and (x",y") are identical, the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that allows for round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.
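
    The round-trip check described above can be sketched as follows. This is an illustration of the criterion, not the actual pipeline's code; the function names, the dictionary representation of the correlation maps, and the sample points are assumptions.

```python
def verify_tie_points(left_to_right, right_to_left, points, tol=1.0):
    """Keep only points whose round-trip error is within a tol-pixel window."""
    verified = []
    for (x, y) in points:
        xp, yp = left_to_right[(x, y)]        # (x, y)   -> (x', y')
        xpp, ypp = right_to_left[(xp, yp)]    # (x', y') -> (x'', y'')
        if abs(xpp - x) <= tol and abs(ypp - y) <= tol:
            verified.append((x, y))
    return verified

# Two consistent tie points and one broken by parallax or noise:
l2r = {(0, 0): (10, 0), (1, 0): (11, 0), (2, 0): (12, 0)}
r2l = {(10, 0): (0, 0), (11, 0): (1, 0), (12, 0): (7, 5)}
print(verify_tie_points(l2r, r2l, [(0, 0), (1, 0), (2, 0)]))  # [(0, 0), (1, 0)]
```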

  15. A manufactured solution for verifying CFD boundary conditions: part II.

    SciTech Connect

    Bond, Ryan Bomar; Ober, Curtis Curry; Knupp, Patrick Michael

    2005-01-01

    Order-of-accuracy verification is necessary to ensure that software correctly solves a given set of equations. One method to verify the order of accuracy of a code is the method of manufactured solutions. In this study, a manufactured solution has been derived and implemented that allows verification of not only the Euler, Navier-Stokes, and Reynolds-Averaged Navier-Stokes (RANS) equation sets, but also some of their associated boundary conditions (BCs): slip, no-slip (adiabatic and isothermal), and outflow (subsonic, supersonic, and mixed). Order-of-accuracy verification has been performed for the Euler and Navier-Stokes equations and these BCs in a compressible computational fluid dynamics code. All of the results shown are on skewed, non-uniform meshes. RANS results will be presented in a future paper. The observed order of accuracy was lower than the expected order of accuracy in two cases. One of these cases resulted in the identification and correction of a coding mistake in the CHAD gradient correction that was reducing the observed order of accuracy. This mistake would have been undetectable on a Cartesian mesh. During the search for the CHAD gradient correction problem, an unrelated coding mistake was found and corrected. The other case in which the observed order of accuracy was less than expected was a test of the slip BC, although no specific coding or formulation mistake has yet been identified. After the correction of the identified coding mistakes, all of the aforementioned equation sets and BCs demonstrated the expected (or at least acceptable) order of accuracy except the slip condition.
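
    The observed order of accuracy referred to above is computed from the standard grid-refinement formula, not from anything specific to this paper: with discretization errors E on two meshes related by refinement ratio r, the observed order is p = ln(E_coarse/E_fine) / ln(r). A minimal sketch (the example numbers are illustrative):

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two systematically refined meshes."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# A second-order scheme cuts its error by a factor of 4 per mesh halving:
print(round(observed_order(1.0e-2, 2.5e-3), 2))  # 2.0
```

    In MMS verification, the error is measured against the exact manufactured solution, and the observed p is compared with the scheme's formal order.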

  16. Verifying and Quantifying Helicobacter pylori Infection Status of Research Mice

    PubMed Central

    Whary, Mark T.; Ge, Zhongming; Fox, James G.

    2012-01-01

    Mice used to model helicobacter gastritis should be screened by PCR prior to experimental dosing to confirm the absence of enterohepatic Helicobacter species (EHS) that colonize the cecum and colon of mice. Natural infections with EHS are common and impact of concurrent EHS infection on Helicobacter pylori-induced gastric pathology has been demonstrated. PCR of DNA isolated from gastric tissue is the most sensitive and efficient technique to confirm the H. pylori infection status of research mice after experimental dosing. To determine the level of colonization, quantitative PCR to estimate the equivalent colony-forming units of H. pylori per µg of mouse DNA is less labor-intensive than limiting dilution culture methods. Culture recovery of H. pylori is a less sensitive technique due to its fastidious in vitro culture requirements; however, recovery of viable organisms confirms persistent colonization and allows for further molecular characterization of wild-type or mutant H. pylori strains. ELISA is useful to confirm PCR and culture results and to correlate pro- and anti-inflammatory host immune responses with lesion severity and cytokine gene or protein expression. Histologic assessment with a silver stain has a role in identifying gastric bacteria with spiral morphology consistent with H. pylori but is a relatively insensitive technique and lacks specificity. A variety of spiral bacteria colonizing the lower bowel of mice can be observed in the stomach, particularly if gastric atrophy develops, and these species are not morphologically distinct at the level of light microscopy either in the stomach or lower bowel. Other less commonly used techniques to localize H. pylori in tissues include immunohistochemistry using labeled polyclonal antisera or in situ hybridization for H. pylori rRNA. 
In this chapter, we will summarize strategies to allow initiation of experiments with helicobacter-free mice and then focus on PCR and ELISA techniques to verify and quantify H. pylori infection of research mice. PMID:23015502
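
    The quantitative PCR step can be sketched generically: copy number is read off a standard curve of threshold cycle (Ct) against log10(copies), then normalized to the amount of mouse DNA assayed. This is not the chapter's protocol; the slope, intercept, and example Ct are illustrative assumptions (a slope near -3.32 corresponds to 100% amplification efficiency).

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert an assumed standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def copies_per_ug(ct, sample_dna_ug):
    """Genome-copy equivalents per microgram of input mouse DNA."""
    return copies_from_ct(ct) / sample_dna_ug

# On this assumed curve, a Ct of 24.72 corresponds to ~10^4 copies:
print(round(copies_from_ct(24.72)))  # 10000
```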

  17. Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8

    SciTech Connect

    Ivanka, Paskaleva; Mihaela, Kouteva; Franco, Vaccari; Panza, Giuliano F.

    2008-07-08

    The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong-motion database is quite limited and therefore not at all representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure to two main Bulgarian cities has proved capable of supplying a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking into consideration, simultaneously and correctly, the contributions of the seismic source and of seismic wave propagation through the crossed media to the earthquake ground motion at the target sites. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

  18. Generic seismic ruggedness of power plant equipment

    SciTech Connect

    Merz, K.L. )

    1991-08-01

    This report updates the results of a program whose overall objective is to demonstrate the generic seismic adequacy of as much nuclear power plant equipment as possible by collecting and evaluating existing seismic qualification test data. These data are used to construct "ruggedness" spectra below which equipment in operating plants designed to earlier earthquake criteria would be generically adequate. This document is an EPRI Tier 1 report. It gives the methodology for the collection and evaluation of the data used to construct a Generic Equipment Ruggedness Spectrum (GERS) for each equipment class considered; the program has resulted in fifteen finalized GERS. The GERS for each equipment class are included in an EPRI Tier 2 report with the same title. Associated with each GERS are inclusion rules, cautions, and checklists for field screening of in-place equipment for GERS applicability. A GERS provides a measure of equipment seismic resistance based on available test data. As such, a GERS may also be used to judge the seismic adequacy of similar new or replacement equipment, or to estimate the seismic margin of equipment re-evaluated with respect to earthquake levels greater than those considered to date. GERS for relays (included in the original version of this report) are now covered in a separate report (NP-7147). In addition to the presentation of GERS, the Tier 2 report addresses the applicability of GERS to equipment of older vintage, methods for estimating amplification factors for evaluating devices installed in cabinets and enclosures, and how seismic test data from related studies relate to the GERS approach. 28 refs., 5 figs., 4 tabs.

  19. Functional seismic evaluation of hospitals

    NASA Astrophysics Data System (ADS)

    Guevara, L. T.

    2003-04-01

    Functional collapse of hospitals (FCH) occurs when a medical complex, or part of it, although with neither structural nor nonstructural damage, is unable to provide the services required for immediate attention to earthquake victims and for the recovery of the affected community. As is known, FCH during and after an earthquake is produced not only by damage to nonstructural components, but also by an inappropriate or deficient distribution of essential and supporting medical spaces. This paper presents some conclusions from an analysis of the traditional architectural schemes for the design and construction of hospitals in the 20th century, and some recommendations for establishing evaluation parameters for the remodeling and seismic upgrade of existing hospitals in seismic zones, based on new concepts of: a) the relative location of each essential service (ES) within the medical complex; b) the capacity of each of these spaces to house the temporary activities required during a massive emergency (ME); c) the relationship between ES and the supporting services (SS); d) the flexibility of transforming nonessential services into complementary spaces for attending to an extraordinary number of victims; e) the dimensions and appropriateness of evacuation routes; and f) the appropriate supply and maintenance of emergency installations for water, electricity, and vital gases.

  20. Lunar seismicity and tectonics

    NASA Technical Reports Server (NTRS)

    Lammlein, D. R.

    1977-01-01

    Results are presented for an analysis of all moonquake data obtained by the Apollo seismic stations during the period from November 1969 to May 1974 and a preliminary analysis of critical data obtained in the interval from May 1974 to May 1975. More accurate locations are found for previously located moonquakes, and additional sources are located. Consideration is given to the sources of natural seismic signals, lunar seismic activity, moonquake periodicities, tidal periodicities in moonquake activity, hypocentral locations and occurrence characteristics of deep and shallow moonquakes, lunar tidal control over moonquakes, lunar tectonism, the locations of moonquake belts, and the dynamics of the lunar interior. It is concluded that: (1) moonquakes are distributed in several major belts of global extent that coincide with regions of the youngest and most intense volcanic and tectonic activity; (2) lunar tides control both the small quakes occurring at great depth and the larger quakes occurring near the surface; (3) the moon has a much thicker lithosphere than earth; (4) a single tectonic mechanism may account for all lunar seismic activity; and (5) lunar tidal stresses are an efficient triggering mechanism for moonquakes.

  1. Seismic Inversion Methods

    NASA Astrophysics Data System (ADS)

    Jackiewicz, Jason

    2009-09-01

    With the rapid advances in sophisticated solar modeling and the abundance of high-quality solar pulsation data, efficient and robust inversion techniques are crucial for seismic studies. We present some aspects of an efficient Fourier Optimally Localized Averaging (OLA) inversion method with an example applied to time-distance helioseismology.

  2. Seismic Inversion Methods

    SciTech Connect

    Jackiewicz, Jason

    2009-09-16

    With the rapid advances in sophisticated solar modeling and the abundance of high-quality solar pulsation data, efficient and robust inversion techniques are crucial for seismic studies. We present some aspects of an efficient Fourier Optimally Localized Averaging (OLA) inversion method with an example applied to time-distance helioseismology.

  3. The Viking seismic experiment

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Miller, W. F.; Duennebier, F. K.; Lazarewicz, A. R.; Sutton, G.; Latham, G. V.; Nakamura, Y.; Toksoz, M. F.; Kovach, R. L.; Knight, T. C. D.

    1976-01-01

    A three-axis short-period seismometer is now operating on Mars in the Utopia Planitia region. The noise background correlates well with wind gusts. Although no quakes have been detected in the first 60 days of observation, it is premature to draw any conclusions about the seismicity of Mars. The instrument is expected to return data for at least 2 years.

  4. Nonstructural seismic restraint guidelines

    SciTech Connect

    Butler, D.M.; Czapinski, R.H.; Firneno, M.J.; Feemster, H.C.; Fornaciari, N.R.; Hillaire, R.G.; Kinzel, R.L.; Kirk, D.; McMahon, T.T.

    1993-08-01

    The Nonstructural Seismic Restraint Guidelines provide general information about how to secure or restrain items (such as material, equipment, furniture, and tools) in order to prevent injury and property, environmental, or programmatic damage during or following an earthquake. All SNL sites may experience earthquakes of magnitude 6.0 or higher on the Richter scale. Therefore, these guidelines are written for all SNL sites.

  5. Continuous Seismic Profiling

    The USGS collaborated with cooperator U.S. Fish & Wildlife Service to conduct continuous seismic-reflection profiling in the Havasu National Wildlife Refuge. The survey was conducted as part of an applied research and technology transfer effort by the USGS Office of Groundwater Branch of Geophysics ...

  6. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm

    PubMed Central

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation which can provide navigation information and is capable of sensing polarization information has advantages of high-precision and anti-interference over polarization navigation sensors that use photodiodes. Although all types of imaging polarimeters exist, they may not qualify for the research on the imaging polarization navigation algorithm. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for the type of system that contained camera parameter calibration and the inconsistency of complementary metal oxide semiconductor calibration were discussed, designed, and implemented. Calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the system could acquire and compute the polarized skylight images throughout the calibrations and resolve orientation by the algorithm to verify in real-time. An orientation determination algorithm based on image processing was tested on the system. The performance and properties of the algorithm were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851
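For context, the Stokes-parameter arithmetic that underlies imaging polarimeters of this kind can be sketched as follows. This is a generic textbook computation, not the authors' code; the function name and the synthetic pixel are invented.

```python
import numpy as np

# Recover the degree and angle of linear polarization from four intensity
# images taken through polarizers at 0, 45, 90, and 135 degrees.
def stokes_dolp_aop(i0, i45, i90, i135):
    s0 = i0 + i90                       # total intensity
    s1 = i0 - i90                       # 0/90 difference
    s2 = i45 - i135                     # 45/135 difference
    dolp = np.sqrt(s1**2 + s2**2) / s0  # degree of linear polarization
    aop = 0.5 * np.arctan2(s2, s1)      # angle of polarization (radians)
    return dolp, aop

# Synthetic pixel: fully linearly polarized light at 30 degrees.
theta = np.deg2rad(30.0)
angles = np.deg2rad([0.0, 45.0, 90.0, 135.0])
# Malus's law: I(a) = I0 * cos^2(theta - a)
i0, i45, i90, i135 = (np.cos(theta - a) ** 2 for a in angles)

dolp, aop = stokes_dolp_aop(i0, i45, i90, i135)
print(round(dolp, 6), round(np.rad2deg(aop), 6))  # → 1.0 30.0
```

The half-angle in the arctan2 reflects the 180° ambiguity of linear polarization; sky-polarization navigation algorithms build their orientation estimate on per-pixel angles computed this way.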

  7. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm.

    PubMed

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation which can provide navigation information and is capable of sensing polarization information has advantages of high-precision and anti-interference over polarization navigation sensors that use photodiodes. Although all types of imaging polarimeters exist, they may not qualify for the research on the imaging polarization navigation algorithm. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for the type of system that contained camera parameter calibration and the inconsistency of complementary metal oxide semiconductor calibration were discussed, designed, and implemented. Calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the system could acquire and compute the polarized skylight images throughout the calibrations and resolve orientation by the algorithm to verify in real-time. An orientation determination algorithm based on image processing was tested on the system. The performance and properties of the algorithm were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851

  8. Separation of seismic blended data by sparse inversion over dictionary learning

    NASA Astrophysics Data System (ADS)

    Zhou, Yanhui; Chen, Wenchao; Gao, Jinghuai

    2014-07-01

    Recent development of blended acquisition calls for new procedures to process blended seismic measurements. Presently, deblending and reconstructing unblended data followed by conventional processing is the most practical processing workflow. In this paper we study seismic deblending by advanced sparse inversion with a learned dictionary. To make the method more effective, hybrid acquisition and time-dithering sequential shooting are introduced so that clean single-shot records can be used to train the dictionary, favoring a sparser representation of the data to be recovered. Deblending and dictionary learning with l1-norm based sparsity are combined to construct the corresponding problem with respect to the unknown recovery, dictionary, and coefficient sets. A two-step optimization approach is introduced. In the dictionary-learning step, clean single-shot data are selected as training data to learn the dictionary. For deblending, we fix the dictionary and employ an alternating scheme to update the recovery and coefficients separately. Synthetic and real field data were used to verify the performance of our method. The outcome can be a significant reference in designing high-efficiency, low-cost blended acquisition.
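The sparse-inversion step at the heart of such deblending can be illustrated with a minimal sketch. This is not the authors' method: the paper learns the dictionary from clean single-shot records, whereas here a fixed random orthonormal dictionary keeps the example self-contained, and the "blending noise" is simple additive noise.

```python
import numpy as np

# ISTA (iterative soft thresholding) for the l1-regularized problem
#   min_a 0.5*||y - D a||^2 + lam*||a||_1,
# i.e. recover a signal that is sparse in dictionary D from a noisy observation.
rng = np.random.default_rng(1)
n = 128
D, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal dictionary

# Ground truth: 5-sparse coefficient vector.
a_true = np.zeros(n)
a_true[rng.choice(n, 5, replace=False)] = rng.standard_normal(5) * 3.0
y = D @ a_true + 0.01 * rng.standard_normal(n)    # observation + noise

lam, step = 0.05, 1.0   # step = 1 is valid since ||D||_2 = 1
a = np.zeros(n)
for _ in range(200):
    grad = D.T @ (D @ a - y)
    z = a - step * grad
    a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

rel_err = np.linalg.norm(D @ a - D @ a_true) / np.linalg.norm(D @ a_true)
print(rel_err < 0.1)
```

In a dictionary-learning variant, the update of `D` from training patches would alternate with this sparse-coding step, as the two-step optimization in the abstract describes.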

  9. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic Initiating Event (IE) analysis is one of the most important tasks controlling the level of effort and quality of the whole Seismic Probabilistic Safety Assessment (SPRA). The typical problems concern how the internal PRA model and its complexity can be used, how to control the number of PRA components for which fragility evaluation should be performed, and how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions depend strongly on the possibility of improving the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  10. Spot: A Programming Language for Verified Flight Software

    NASA Technical Reports Server (NTRS)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  11. High Voltage Seismic Generator

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary result of a year of cooperation among three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor in a spark gap. The whole appliance is powered from a Li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specification of elements for the final device was chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. Then a field test was conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water. Geophones were placed on the ground in a straight line. A comparison was made between signals registered with the hammer source and with the sparker source. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. Besides the signal characteristics, the biggest advantage of the presented device is its size, 0.5 x 0.25 x 0.2 m, and its weight of approximately 7 kg. These features, together with the small Li-ion battery, make the constructed device very mobile. The project is still developing.

  12. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that certain classes of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply in the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundation as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have helped conventional control design and validation.

  13. IDMS: A System to Verify Component Interface Completeness and Compatibility for Product Integration

    NASA Astrophysics Data System (ADS)

    Areeprayolkij, Wantana; Limpiyakorn, Yachai; Gansawat, Duangrat

    The growing adoption of Component-Based Software Development has had a great impact on today's system architectural design. However, the design of subsystems that lack interoperability and reusability can cause problems during product integration. At worst, this may result in project failure. The literature suggests that the verification of interface descriptions and the management of interface changes are factors essential to the success of the product integration process. This paper thus presents an automated approach to facilitate reviewing component interfaces for completeness and compatibility. The Interface Descriptions Management System (IDMS) has been implemented to ease and speed up the interface review activities, using UML component diagrams as input. Interface compatibility is verified by traversing a component dependency graph called the Component Compatibility Graph (CCG), a visualization in which each node represents a component and each edge represents communication between associated components. Three case studies were conducted to subjectively evaluate the correctness and usefulness of IDMS.
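The idea behind such a compatibility graph can be sketched in a few lines. This is only an illustration of the concept, not IDMS itself; all component and interface names below are invented, and interfaces are compared as plain signature strings.

```python
# Nodes are components, edges are "caller depends on callee", and an edge is
# compatible only if every interface the caller requires is provided by the
# callee with an identical signature.
provides = {
    "Billing": {"charge(amount: int) -> bool"},
    "Ledger":  {"post(entry: str) -> None"},
}
requires = {  # edges of the graph: caller -> (callee, needed interfaces)
    "Billing": [("Ledger", {"post(entry: str) -> None"})],
    "WebUI":   [("Billing", {"charge(amount: int) -> bool",
                             "refund(amount: int) -> bool"})],
}

def check_compatibility():
    """Traverse every edge and report interfaces the callee fails to provide."""
    problems = []
    for caller, deps in requires.items():
        for callee, needed in deps:
            missing = needed - provides.get(callee, set())
            for sig in sorted(missing):
                problems.append(f"{caller} -> {callee}: missing {sig}")
    return problems

print(check_compatibility())
# → ['WebUI -> Billing: missing refund(amount: int) -> bool']
```

A real tool would extract `provides` and `requires` from UML component diagrams rather than hand-written dictionaries, and would also check completeness of each interface description.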

  14. Second Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-06-26

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, seven local earthquakes were recorded during the second quarter of fiscal year 2008. The largest event recorded by the network during the second quarter (February 3, 2008 - magnitude 2.3 Mc) was located northeast of Richland in Franklin County at a depth of 22.5 km. With regard to the depth distribution, two earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), three earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and two earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, five earthquakes occurred in swarm areas and two earthquakes were classified as random events.

  15. First Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-03-21

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one microearthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

  16. First quarter Hanford seismic report for fiscal year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. 
No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2000.

  17. Second Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-07-17

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences. 
No earthquakes triggered the Hanford Strong Motion Accelerometers during the second quarter of FY 2000.

  18. Third Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 818 triggers on two parallel detection and recording systems during the third quarter of fiscal year (FY) 2000. Thirteen seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 7 were earthquakes in the Columbia River Basalt Group, 1 was an earthquake in the pre-basalt sediments, and 5 were earthquakes in the crystalline basement. Three earthquakes occurred in known swarm areas, and 10 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the third quarter of FY 2000.

  19. Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Kolditz, O.

    2013-12-01

    The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is to predict and quantify the distribution and the amount of sub-/seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows which will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator of fracture density. Furthermore, from the complete 3D strain tensor we can predict the orientation of fractures at sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire new near-surface, high-resolution P- and S-wave 2-D seismic data in November this year in order to verify and calibrate our model results. Here, novel, parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between reservoir and overburden. Thereby, we will provide a tool for prediction and adapted time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities. 
Acknowledgement This work was sponsored in part by the Australian Commonwealth Government through the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC). PROTECT (PRediction Of deformation To Ensure Carbon Traps) is funded through the Geotechnologien Programme (grant 03G0797) of the German Ministry for Education and Research (BMBF). The PROTECT research group consists of Leibniz Institute for Applied Geophysics in Hannover, Technical University Darmstadt, Helmholtz-Zentrum für Umweltforschung in Leipzig, Trappe Erdöl Erdgas Consultant in Isernhagen (all Germany), and Curtin University in Perth, Australia.

  20. Network Optimization for Induced Seismicity Monitoring in Urban Areas

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Husen, S.; Wiemer, S.

    2012-12-01

    With the global challenge to satisfy an increasing demand for energy, geological energy technologies receive growing attention, and in the past several years projects have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate-sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, now pose a considerable risk to the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential to the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring in an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic-light systems and is therefore an important confidence-building measure toward the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that allows the design and evaluation of seismic network geometries for arbitrary geotechnical operation layouts. 
    The algorithm is based on D-optimal experimental design, which aims to minimize the error ellipsoid of the linearized location problem. Optimization for additional criteria (e.g., focal mechanism determination or installation costs) can be included. We consider a 3D seismic velocity model, a European ambient seismic noise model derived from high-resolution land-use data, and existing seismic stations in the vicinity of the geotechnical site. Using this algorithm we are able to find the optimal geometry and size of the seismic monitoring network that meets predefined, application-oriented performance criteria. In this talk we will focus on optimal network geometries for deep geothermal projects of the EGS and hydrothermal type. We will discuss the requirements for basic seismic surveillance and for high-resolution reservoir monitoring and characterization.
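A toy numerical sketch of the D-optimality criterion mentioned in this abstract: assuming a homogeneous velocity, an invented 3 x 3 surface grid of candidate stations, and a single target source (none of which comes from the abstract), pick the four-station subset that maximizes det(GᵀG), where G holds the travel-time derivatives with respect to the hypocenter coordinates and origin time.

```python
import numpy as np
from itertools import combinations

v = 3.0                                  # km/s, assumed homogeneous velocity
source = np.array([0.0, 0.0, -4.0])      # target source at 4 km depth

def jacobian(stations):
    # Row i: d(travel time)/d(hypocenter) for station i, plus origin-time column.
    d = source - stations                # vectors station -> source
    r = np.linalg.norm(d, axis=1, keepdims=True)
    return np.hstack([d / (v * r), np.ones((len(stations), 1))])

candidates = np.array([[x, y, 0.0]
                       for x in (-6.0, 0.0, 6.0)
                       for y in (-6.0, 0.0, 6.0)])  # surface grid, km

def d_opt(idx):
    G = jacobian(candidates[list(idx)])
    return np.linalg.det(G.T @ G)        # D-optimality criterion

best = max(combinations(range(len(candidates)), 4), key=d_opt)
print(sorted(best))
```

Maximizing det(GᵀG) minimizes the volume of the linearized location error ellipsoid; a realistic implementation would replace the straight-ray Jacobian with derivatives through a 3D velocity model and weight the rows by station noise levels.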

  1. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

    The Chilean subduction zone is among the seismically most active plate boundaries in the world, and its coastal ranges suffer a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered to be a mature seismic gap, having last ruptured in 1835 and being seismically quiet, without any magnitude 4.5 or larger earthquakes reported in global catalogues. It is located to the north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and to the south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom seismometers was deployed in the northern portion of the rupture area for a three month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data from a network consisting of 33 land stations of the GeoForschungsZentrum Potsdam were included, providing ideal coverage of both the rupture plane and areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P wave onsets and a 2.5D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well resolved seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards from the deformation front. 
Prior to the Great Maule earthquake the Collaborative Research Center SFB 574 'Volatiles and Fluids in Subduction Zones' shot several wide-angle profiles and operated a network, also consisting of OBS and land stations for six months in 2008. Both projects provide a great opportunity to study the evolution of a subduction zone within the seismic cycle of a great earthquake. The most profound features are (i) a sharp reduction in intraslab seismic activity after the Maule earthquake and (ii) a sharp increase in seismic activity at the slab interface above 50 km depth, where large parts of the rupture zone were largely aseismic prior to the Maule earthquake. Further, the aftershock seismicity shows a broader depth distribution above 50 km depth.

  2. Seismic evaluation of safety systems at the Savannah River reactors

    SciTech Connect

    Hardy, G.S.; Johnson, J.J.; Eder, S.J.; Monahon, T.M.; Ketcham, D.R.

    1989-12-31

    A thorough review of all safety-related systems in commercial nuclear power plants was prompted by the accident at the Three Mile Island Nuclear Power Plant. As a consequence of this review, the Nuclear Regulatory Commission (NRC) focused its attention on the environmental and seismic qualification of the industry's electrical and mechanical equipment. In 1980, the NRC issued Unresolved Safety Issue (USI) A-46 to verify the seismic adequacy of the equipment required to safely shut down a plant and maintain a stable condition for 72 hours. After extensive research by the NRC, it became apparent that traditional analysis and testing methods would not be a feasible mechanism to address USI A-46. The costs associated with the standard analytical and testing qualification approaches were exorbitant and could not be justified. In addition, the only equipment available for shake-table testing that is similar to the item being qualified is typically the nuclear plant component itself. After 8 years of studies and data collection, the NRC issued its "Generic Safety Evaluation Report" approving an alternate seismic qualification approach based on the use of seismic experience data. This experience-based seismic assessment approach will be the basis for evaluating each of the 70 pre-1972 commercial nuclear power units in the United States and an undetermined number of nuclear plants in foreign countries. This same cost-effective approach developed for the commercial nuclear power industry is currently being applied to the Savannah River Production Reactors to address similar seismic adequacy issues. This paper documents the results of the Savannah River Plant seismic evaluation program. This effort marks the first complete (non-trial) application of this state-of-the-art USI A-46 resolution methodology.

  3. A Hammer-Impact, Aluminum, Shear-Wave Seismic Source

    USGS Publications Warehouse

    Haines, Seth S.

    2007-01-01

    Near-surface seismic surveys often employ hammer impacts to create seismic energy. Shear-wave surveys using horizontally polarized waves require horizontal hammer impacts against a rigid object (the source) that is coupled to the ground surface. I have designed, built, and tested a source made out of aluminum and equipped with spikes to improve coupling. The source is effective in a variety of settings, and it is relatively simple and inexpensive to build.

  4. Seismic Tomography in Sensor Networks

    NASA Astrophysics Data System (ADS)

    Shi, L.; Song, W.; Lees, J. M.; Xing, G.

    2012-12-01

    Tomographic imaging, applied to seismology, requires a new, decentralized approach if high-resolution calculations are to be performed in a sensor-network configuration. Real-time retrieval of data from a large network of wireless seismic stations to a central server is virtually impossible because of the sheer volume of data and the resource limitations of the nodes. In this paper, we propose and design a distributed algorithm that processes data and inverts the tomography within the network, avoiding costly data collection and centralized computation. Based on a partition of the tomographic inversion problem, the new algorithm distributes the computational burden to sensor nodes and performs real-time tomographic inversion in the network, so that a high-resolution tomographic model can be recovered in real time under the constraints of network resources. Our emulation results indicate that the distributed algorithm successfully reconstructs the synthetic models while substantially reducing and balancing the communication and computation costs.
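The partitioned, in-network inversion described above can be illustrated with a row-action (Kaczmarz/ART) scheme in which each node keeps only its own ray rows and the nodes circulate the current model. This is a hedged sketch of the general idea, not the authors' algorithm; the function name and the synthetic system are invented for illustration.

```python
import numpy as np

def distributed_kaczmarz(g_parts, d_parts, n_cells, sweeps=1000, relax=1.0):
    """Row-action (ART/Kaczmarz) tomographic inversion in which each node
    holds only its own ray-path rows (g) and travel times (d); nodes pass
    around the current slowness model m instead of shipping raw data."""
    m = np.zeros(n_cells)
    for _ in range(sweeps):
        for g, d in zip(g_parts, d_parts):        # one node's local update
            for row, t in zip(g, d):
                norm = row @ row
                if norm > 0.0:
                    m = m + relax * (t - row @ m) / norm * row
    return m

# Synthetic check: 4 slowness cells, 6 rays split across 2 "nodes"
rng = np.random.default_rng(0)
m_true = np.array([1.0, 2.0, 0.5, 1.5])
g = rng.standard_normal((6, 4))                   # random sensitivity matrix
d = g @ m_true                                    # noise-free travel times
m_est = distributed_kaczmarz([g[:3], g[3:]], [d[:3], d[3:]], 4)
```

Because only the model vector (here 4 numbers) circulates, the per-node communication cost is independent of how many rays each node recorded, which is the point of the in-network formulation.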

  5. Sound source localization technique using a seismic streamer and its extension for whale localization during seismic surveys.

    PubMed

    Abadi, Shima H; Wilcock, William S D; Tolstoy, Maya; Crone, Timothy J; Carbotte, Suzanne M

    2015-12-01

    Marine seismic surveys are under increasing scrutiny because of concern that they may disturb or otherwise harm marine mammals and impede their communications. Most of the energy from seismic surveys is low frequency, so concerns are particularly focused on baleen whales. Extensive mitigation efforts accompany seismic surveys, including visual and acoustic monitoring, but the possibility remains that not all animals in an area can be observed and located. One potential way to improve mitigation efforts is to utilize the seismic hydrophone streamer to detect and locate calling baleen whales. This study describes a method to localize low frequency sound sources with data recorded by a streamer. Beamforming is used to estimate the angle of arriving energy relative to sub-arrays of the streamer, which constrains the horizontal propagation velocity to each sub-array for a given trial location. A grid search method is then used to minimize the time residual for relative arrival times along the streamer estimated by cross correlation. Results from both simulation and experiment are shown, and data from the marine mammal observers and the passive acoustic monitoring conducted simultaneously with the seismic survey are used to verify the analysis. PMID:26723349
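The grid-search step of such a localization can be sketched as follows. This is a simplified illustration of minimizing relative-arrival-time residuals over trial source positions, not the authors' implementation; the receiver geometry, sound speed, and function name are all invented for the example.

```python
import numpy as np

def locate_source(rx, ry, t_rel, c, xs, ys):
    """Grid search: for each trial point, compare predicted travel times with
    the observed relative arrival times (means removed, since cross-correlation
    yields only relative times) and keep the best-fitting point."""
    best, best_cost = None, np.inf
    for x in xs:
        for y in ys:
            t = np.hypot(rx - x, ry - y) / c          # predicted travel times
            r = (t_rel - t_rel.mean()) - (t - t.mean())
            cost = float(r @ r)                       # sum of squared residuals
            if cost < best_cost:
                best, best_cost = (x, y), cost
    return best

# Synthetic check: 24 hydrophones along x, a source at (300 m, 400 m)
rx = np.arange(24) * 25.0                             # receiver x-positions (m)
ry = np.zeros(24)
t_obs = np.hypot(rx - 300.0, ry - 400.0) / 1500.0     # c = 1500 m/s
xs = np.arange(0.0, 1001.0, 10.0)
ys = np.arange(10.0, 1001.0, 10.0)                    # search one side only
est = locate_source(rx, ry, t_obs, 1500.0, xs, ys)
```

Restricting the search to one side of the streamer sidesteps the left-right ambiguity inherent to a single linear array.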

  6. Seismic Modification of Asteroid Surfaces: Laboratory Simulations

    NASA Astrophysics Data System (ADS)

    Izenberg, Noam R.; Barnouin-Jha, O. S.

    2006-09-01

    We are conducting experiments to explore the effects of impact-derived seismic shaking on the morphology and stratigraphy of asteroids. A better understanding of the processes of seismic shaking in the terrestrial gravitational field should provide useful insights for low gravity environments into crater degradation and erasure processes, the development of "ponds" (Eros), and the evolution of the regolith on Itokawa. The vibration tables at the JHU/APL spacecraft testing facility can be configured to induce both vertical and horizontal accelerations of up to a few gravities over amplitudes of a few centimeters. The Seismic Simulation Mockup, a 1-meter square, 40 cm deep Plexiglas sandbox, boltable to the table, is designed to handle the accelerations of simulated seismic events. Preliminary experiments using playground sand as a regolith simulant provide results for the response of slopes, flat surfaces, and landforms to single jerks and sustained shaking in different horizontal directions. Some notable empirical results include slow downslope motion of large pebbles relative to smaller and lighter materials; sustained convection of lighter material near the surface while large blocks remain nearly motionless during sustained quaking; formation of landslides in craters, when subjected to single jerks, primarily in the direction of initial acceleration; significant softening of crater rims after only a few small jerks; and complete erasure of a crater upon one large jerk. This work is supported by NASA DDAP grant NNG05GC08G.

  7. Seismic Hazard of Romania: Deterministic Approach

    NASA Astrophysics Data System (ADS)

    Radulian, M.; Vaccari, F.; Mândrescu, N.; Panza, G. F.; Moldoveanu, C. L.

    The seismic hazard of Romania is estimated in terms of peak ground motion values (displacement, velocity, and design ground acceleration, DGA) by computing complete synthetic seismograms that are considered representative of the different seismogenic and structural zones of the country. The deterministic method addresses issues largely neglected in probabilistic hazard analysis, e.g., how crustal properties affect attenuation, since the ground motion parameters are not derived from overly simplified attenuation ``functions,'' but rather from synthetic time histories. The synthesis of the hazard is divided into two parts: that of shallow-focus earthquakes, and that of the intermediate-focus events of the Vrancea region. The previous hazard maps of Romania completely ignore the seismic activity in the southeastern part of the country (due to the seismic source of the Shabla zone). For the Vrancea intermediate-depth earthquakes, which control the seismic hazard level over most of the territory, the comparison of the numerical results with the historically based intensity map shows significant differences. These could be due to structural or source properties not captured by our modeling, or to differences in the distribution of damageable buildings over the territory (meaning that future earthquakes may be more damaging in regions other than those that experienced damage in the past). Since the deterministic modeling is highly sensitive to source and path effects, it can be used to improve the seismological parameters of the historical events.

  8. Seismic Hazards in Seattle

    NASA Astrophysics Data System (ADS)

    Delorey, Andrew; Vidale, John

    2010-05-01

    Much of Seattle, in the northwestern United States, lies atop a sedimentary basin that extends approximately 9 km deep. The basin structure is the result of the evolution of the Puget Lowland fore arc, which combines strike-slip and thrust-fault movements to accommodate right-lateral strike-slip and N-S shortening due to the oblique subduction of the Juan de Fuca Plate beneath North America. The Seattle Basin has been observed to amplify and distort the seismic waves from a variety of moderate and large earthquakes in ways that affect the hazard from those earthquakes. Seismic hazard assessments heavily depend upon upper crustal and near-surface S-wave velocity models, which have traditionally been constructed from P-wave models using an empirical relationship between P-wave and S-wave velocity or by interpolating across widely spaced observations of shallow geologic structures. Improving the accuracy and resolution of shallow S-wave models using direct measurements is key to improving seismic hazard assessments and predictions for levels of ground shaking. Tomography, with short-period Rayleigh waves extracted using noise interferometry, can refine S-wave velocity models in urban areas with dense arrays of short period and broadband instruments. We apply this technique to the Seattle area to develop a new shallow S-wave model for use in hazard assessment. Continuous data come from the Seismic Hazards in Puget Sound (SHIPS) array, whose inter-station distances range from a few to tens of kilometers. This allows us to extract Rayleigh waves between 2 and 10 seconds period that are sensitive to shallow basin shear wave velocities. Our results show that shear wave velocities are about 25% lower in some regions in the upper 3 km than previous estimates and align more closely with surface geological features and gravity observations.
We validate our model by comparing synthetic waveforms to several earthquakes recorded locally on accelerometers operated by the United States Geological Survey (USGS) and the Pacific Northwest Seismic Network (PNSN). Then, we make predictions of the levels of shaking during likely future events at different areas around Seattle by running simulations using a finite difference code. As is typical in subduction zones, Seattle is exposed to shallow crustal events, intraplate events in the down-going slab, and large megathrust events. Of these three types of events, only large intraplate events have been recorded locally, so simulations are our best opportunity to make predictions for all of the possible scenarios. Our results can be used to update seismic hazard maps for Seattle, and the approach can be reproduced in other urban areas with dense arrays of short period and broadband instruments.
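The core of noise interferometry (cross-correlating long noise records at two stations and reading an inter-station travel time off the correlation peak) can be sketched with a toy example. The delayed-copy noise model and all numbers below are illustrative, not SHIPS data.

```python
import numpy as np

fs = 100.0                      # sampling rate (Hz)
dist = 3000.0                   # inter-station distance (m)
lag = 200                       # true delay in samples -> 2 s travel time

# Model the same noise wavefield passing station B first, then station A
rng = np.random.default_rng(1)
wavefield = rng.standard_normal(6000)
sta_a = wavefield[:-lag]
sta_b = wavefield[lag:]         # B's record leads A's by `lag` samples

# Cross-correlate and read the inter-station travel time off the peak
xc = np.correlate(sta_a, sta_b, mode="full")
k = int(np.argmax(xc)) - (len(sta_b) - 1)   # peak lag in samples
v_est = dist / (k / fs)                     # recovered velocity (m/s)
```

Real processing averages many correlation windows so that the diffuse noise field approximates the inter-station Green's function; the single delayed copy here just makes the peak-picking step explicit.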

  9. Geophysical Monitoring at the CO2SINK Site: Combining Seismic and Geoelectric Data

    NASA Astrophysics Data System (ADS)

    Giese, R.; Lüth, S.; Cosma, C.; Juhlin, C.; Kiessling, D.; Schütt, H.; Schöbel, B.; Schmidt-Hattenberger, C.; Schilling, F.; Co2SINK Group

    2009-04-01

    The CO2SINK project at the German town of Ketzin (near Berlin) is aimed at pilot storage of CO2 and at developing and testing efficient integrated monitoring procedures (physical, chemical, and biological observations) for assessing the processes triggered within the reservoir by a long-term injection operation. In particular, geophysical methods such as seismic and geoelectric measurements have delivered the structural framework, and they make it possible to observe the reaction of the reservoir and the caprock to CO2 propagation at locations that are not accessible to direct observation. We report on the seismic monitoring program of the CO2SINK project, which comprises baseline and repeat observations at different scales in time and space, combined with comprehensive geoelectrical monitoring performed in the Ketzin wells and on the surface. The main objectives of the 3D seismic survey (carried out in spring 2005) were to provide the structural model around the location of the Ketzin wells, to verify earlier geologic interpretations of structure based on vintage 2D seismic and borehole data, and to provide a baseline for future seismic surveys. The uppermost 1000 m are well imaged and show an anticlinal structure with an east-west striking central graben on its top. The 3D baseline survey was extended by VSP (vertical seismic profiling), MSP (moving source profiling) on 7 profiles, and crosshole tomographic measurements. 2D "star" measurements were carried out on the 7 MSP profiles in order to tie in the down-hole surveys with the 3D baseline survey. These measurements provide enhanced resolution in time (faster and more cost effective than a full 3D survey) and space (higher source and receiver frequencies). Three crosshole measurements were performed: one baseline survey in May 2008, and two repeats in July and August 2008, respectively.
A third crosshole repeat is planned for a later stage in the project, when a steady-state situation has been reached in the reservoir between the two observation boreholes Ktzi 200 and Ktzi 202. The interpretation of the time-lapse crosshole seismic measurements is still work in progress. A time-lapse effect can be recognized on cross correlations of baseline and repeat data, indicating that considering the full waveform of the recordings has the potential to locate subtle changes in the seismic properties of the reservoir due to CO2 injection. In addition, we show the results of the site-specific geoelectrical monitoring concept VERA (Vertical Electrical Resistivity Array), which covers electrical resistivity measurements in all three Ketzin wells. The array consists of 45 permanent electrodes (15 in each well), placed on the electrically insulated casings of the wells in the 600 m to 750 m depth range with a spacing of 10 m. This layout was designed according to numerical forward modeling assuming electrical properties of pre- and post-injection scenarios. In addition to the geoelectric downhole measurement setup, surface-to-surface and surface-to-downhole measurements are added in order to enlarge the area of observation between the three Ketzin wells to a hemispherical area (with a radius of about 1.5 km) around the wells. First results of the Electrical Resistivity Tomography (ERT) fit the expected reservoir behaviour. Higher resistivity values (currently up to a factor of 3 relative to other horizons) represent the intervals of the sandstone reservoir as preferred pathways of the CO2 propagation.

  10. Results from the latest SN-4 multi-parametric benthic observatory experiment (MARsite EU project) in the Gulf of Izmit, Turkey: oceanographic, chemical and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Embriaco, Davide; Marinaro, Giuditta; Frugoni, Francesco; Giovanetti, Gabriele; Monna, Stephen; Etiope, Giuseppe; Gasperini, Luca; Çağatay, Namık; Favali, Paolo

    2015-04-01

    An autonomous, long-term multiparametric benthic observatory (SN-4) was designed to study gas seepage and seismic energy release along the submerged segment of the North Anatolian Fault (NAF). Episodic gas seepage occurs at the seafloor in the Gulf of Izmit (Sea of Marmara, NW Turkey) along this submerged segment of the NAF, which ruptured during the 1999 Mw7.4 Izmit earthquake. The SN-4 observatory already operated in the Gulf of Izmit at the western end of the 1999 Izmit earthquake rupture for about one year at 166 m water depth during the 2009-2010 experiment (EGU2014-13412-1, EGU General Assembly 2014). SN-4 was re-deployed at the same site for a new long-term mission (September 2013 - April 2014) in the framework of the MARsite (New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite, http://marsite.eu/ ) EC project, which aims at evaluating seismic risk and managing long-term monitoring activities in the Marmara Sea. A main scientific objective of the SN-4 experiment is to investigate the possible correlations between seafloor methane seepage and release of seismic energy. We used the same site as in the 2009-2010 campaign to verify both the occurrence of previously observed phenomena and the reliability of results obtained in the previous experiment (Embriaco et al., 2014, doi:10.1093/gji/ggt436). In particular, we are interested in the detection of gas release at the seafloor, in the role played by oceanographic phenomena in this detection, and in the association of gas and seismic energy release. The scientific payload included, among other instruments, a three-component broad-band seismometer, and gas and oceanographic sensors. We present a technical description of the observatory, including the data acquisition and control system, results from the preliminary analysis of this new multidisciplinary data set, and a comparison with the previous experiment.

  11. A case study: Time-lapse seismic monitoring of a thin heavy oil reservoir

    NASA Astrophysics Data System (ADS)

    Zhang, Yajun

    This thesis presents a case study on time-lapse seismic monitoring. The target area is located at East Senlac, near the Alberta-Saskatchewan border, a heavy oil reservoir in the Western Canadian Sedimentary Basin. In order to observe rock-property-related seismic anomalies, two perpendicular seismic lines were set up. One seismic line, along the N-S direction, is subject to Steam Assisted Gravity Drainage (SAGD), while the other, along the W-E direction, is not affected. The case study covers feasibility study, processing strategy, repeatability evaluation, seismic attribute analysis, and impedance inversion. A systematic feasibility study is conducted through prediction of rock properties based on Gassmann's equation, technical risk assessment, forward modelling, and seismic survey design. The first-stage simulation of oil substitution by steam indicates that a time-lapse seismic monitoring project is feasible, although great challenges might be encountered; continuous gas injection barely induces seismic variations. In seismic data processing, better seismic quality is obtained by employing the prestack simultaneous processing (PSP) strategy. Three metrics (Pearson correlation, normalized root-mean-square, and predictability) are employed to quantify post-stack seismic repeatability. Higher repeatability along the W-E direction than along the N-S direction reflects the different local geological environments. The non-uniform CMP stack fold distribution is found to be the main factor affecting seismic repeatability. The power spectra calculated from the N-S seismic surveys demonstrate that higher-frequency energy tends to increase with time, owing to possible decreases in pore pressure and pore temperature. On the other hand, the impedance inverted using the recently proposed hybrid data transformation shows mixed variations; the continuous gas injection and the simultaneous drop in temperature and pressure are possibly the main reasons for these mixed impedance variations.
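The three post-stack repeatability metrics named above (Pearson correlation, normalized root-mean-square, and predictability) have standard definitions that can be sketched as follows; the traces are synthetic and the lag window is an arbitrary choice, so this is an illustration rather than the thesis's exact computation.

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(x ** 2))

def nrms(a, b):
    """Normalized RMS difference in percent: 0 for identical traces,
    200 for identical traces of opposite polarity."""
    return 200.0 * rms(a - b) / (rms(a) + rms(b))

def xcorr(a, b, k):
    """Unnormalized cross-correlation of equal-length traces at integer lag k."""
    return float(a[: len(a) - k] @ b[k:]) if k >= 0 else xcorr(b, a, -k)

def predictability(a, b, maxlag=10):
    """PRED-style metric over a lag window, in percent (100 = fully predictable)."""
    lags = range(-maxlag, maxlag + 1)
    num = sum(xcorr(a, b, k) ** 2 for k in lags)
    den = sum(xcorr(a, a, k) * xcorr(b, b, k) for k in lags)
    return 100.0 * num / den

# Synthetic baseline trace and a slightly perturbed monitor trace
t = np.linspace(0.0, 1.0, 500)
base = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)
monitor = base + 0.05 * np.sin(2 * np.pi * 55 * t)
corr = np.corrcoef(base, monitor)[0, 1]           # Pearson correlation
```

The metrics are complementary: NRMS is sensitive to any amplitude or time-shift difference, while correlation and predictability emphasize waveform similarity.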

  12. Quiet Clean Short-haul Experimental Engine (QCSEE) Under-The-Wing (UTW) composite nacelle subsystem test report. [to verify strength of selected composite materials

    NASA Technical Reports Server (NTRS)

    Stotler, C. L., Jr.; Johnston, E. A.; Freeman, D. S.

    1977-01-01

    The element and subcomponent testing conducted to verify the under the wing composite nacelle design is reported. This composite nacelle consists of an inlet, outer cowl doors, inner cowl doors, and a variable fan nozzle. The element tests provided the mechanical properties used in the nacelle design. The subcomponent tests verified that the critical panel and joint areas of the nacelle had adequate structural integrity.

  13. Seismic qualification of existing safety class manipulators

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1992-01-01

    Within a nuclear fuel handling facility there are two bridge-type electromechanical manipulators that were constructed over twenty-five years ago, when seismic considerations were minimal. These manipulators, together with the facility, are being reactivated. Detailed analyses have shown that the manipulators will satisfy the requirements of ANSI/AISC N690-1984 when subjected to loadings that include the site-specific design basis earthquake. 4 refs.

  15. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section A report, existing conditions calculations/supporting information

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite, because of the lack of receiver sites for material and because of regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. Based upon US Department of Energy (DOE) Albuquerque Operations Office (DOE/AL) and LANL projections, storage space limitations/restrictions will begin to affect LANL's ability to meet its missions between 1998 and 2002.

  16. Seismic detection of tornadoes

    USGS Publications Warehouse

    Tatom, F. B.

    1993-01-01

    Tornadoes represent the most violent of all forms of atmospheric storms, each year resulting in hundreds of millions of dollars in property damage and approximately one hundred fatalities. In recent years, considerable success has been achieved in detecting tornadic storms by means of Doppler radar. However, radar systems cannot determine when a tornado is actually in contact with the ground, except possibly at extremely close range. At the present time, human observation is the only truly reliable way of knowing that a tornado is actually on the ground. However, considerable evidence exists indicating that a tornado in contact with the ground produces a significant seismic signal. If such signals are generated, the seismic detection and warning of an imminent tornado become a distinct possibility.

  17. New seismic codes and their impact on the acoustician

    NASA Astrophysics Data System (ADS)

    Lama, Patrick J.

    2005-09-01

    New seismic building codes for HVAC and electrical equipment, pipes, ducts, and conduits are being adopted nationwide. These codes affect the way acousticians practice their profession. Recently published model codes (such as IBC, NFPA, ASCE and NBC T1809-4) specify systems that require documented seismic protection. Specific performance and prescriptive code provisions that affect acoustical system applications, and how such systems can be made to comply, are included. Key terms in these codes (life safety, essential, seismic use group, category and importance factor) are explained and illustrated. A table listing major code seismic demand formulas (horizontal static seismic force, acting at the center of gravity of the equipment, pipe, duct or conduit) is a useful reference. A table that defines which HVAC systems require static or dynamic analysis based on seismic use group, design category and importance factor is provided. A discussion of code-mandated Certificates of Compliance for both mountings and equipment is included and may impact acoustical decisions. New codes may require that engineers, architects and acousticians use seismic restraints with acoustical ceilings, floating floors, resilient pipe and duct supports, HVAC equipment and architectural items. The ``How To'' for all of this is presented with tables, details and graphs.

  18. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires information on the seismic hazard at the plant site, the dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low-probability-of-failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of the lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and the functional form of the fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including the slopes of the seismic hazard curves and the likelihoods assigned to those curves.
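The component fragility modeling discussed above is conventionally lognormal, with the HCLPF capacity defined as the acceleration at which there is 95% confidence of less than 5% failure probability. A minimal sketch of those two standard formulas, with illustrative (made-up) parameter values:

```python
import math
from statistics import NormalDist

def fragility(a, a_med, beta_c):
    """Lognormal fragility: probability of failure at peak ground
    acceleration a, given median capacity a_med and composite
    logarithmic standard deviation beta_c."""
    return NormalDist().cdf(math.log(a / a_med) / beta_c)

def hclpf(a_med, beta_r, beta_u):
    """High Confidence (95%) of Low Probability (5%) of Failure capacity:
    a_med * exp(-1.645 * (beta_r + beta_u))."""
    return a_med * math.exp(-1.645 * (beta_r + beta_u))

# Illustrative (made-up) component: median capacity 1.2 g,
# randomness beta_r = 0.25, modelling uncertainty beta_u = 0.35
cap = hclpf(1.2, 0.25, 0.35)      # HCLPF capacity in g
p50 = fragility(1.2, 1.2, 0.43)   # failure probability at the median capacity
```

By construction, the fragility evaluated at the median capacity is exactly 0.5, and increasing either beta pushes the HCLPF capacity down, which is one way the modeling assumptions studied in this record feed through to plant-level results.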

  19. Investigation of the Seismic Performance of Reinforced Highway Embankments

    NASA Astrophysics Data System (ADS)

    Toksoy, Y. S.; Edinçliler, A.

    2014-12-01

    Although highway embankments are highly prone to earthquake-induced damage, few studies in the literature concentrate on improving their seismic performance. Embankments that are quite stable under static load conditions can simply collapse during earthquakes under destructive seismic loading. This poses a serious threat to the structural integrity of the embankment, its service quality, and its serviceability. The objective of this study is to determine the effect of geosynthetic reinforcement on the seismic performance of highway embankments and to evaluate the seismic performance of a geotextile-reinforced embankment under different earthquake motions. A 1:50 scale highway embankment model was designed and reinforced with geosynthetics in order to increase its seismic performance. A series of shaking table tests was performed on identical unreinforced and reinforced embankment models using earthquake excitations with different characteristics. The experimental results were evaluated by comparing the unreinforced and reinforced cases. Results revealed that the reinforced embankment models show better seismic performance, especially under the ground excitations used in this study. The prototype embankment was also modelled numerically, and a similar trend in seismic behavior is obtained in the finite element simulations.

  20. Patterns of significant seismic quiescence on the Mexican Pacific coast

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Rudolf-Navarro, A. H.; Angulo-Brown, F.; Barrera-Ferrer, A. G.

    Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence, using the definition of seismic quiescence proposed by Schreider (1990). This algorithm delineates the area of quiescence where an earthquake of great magnitude is likely to occur. We have applied our algorithm to the earthquake catalog of the Mexican Pacific coast between 14 and 21 degrees North latitude and 94 and 106 degrees West longitude, for depths less than or equal to 60 km and magnitudes greater than or equal to 4.3, from January 1965 until December 2014. We found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, no earthquakes of great magnitude have occurred in Mexico so far this century. However, we have identified well-defined seismic quiescences in the Guerrero seismic gap, which are apparently correlated with the occurrence of the silent earthquakes in 2002, 2006 and 2010 recently discovered by GPS technology.
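A common way to decide whether a rate drop is "significant", in the spirit of the quiescence detection described above (though not Schreider's definition itself), is a Z-test comparing the background seismicity rate with the rate in a candidate window; the counts below are invented for illustration.

```python
import math

def rate_z(n_bg, t_bg, n_win, t_win):
    """Z-test on Poisson event rates: positive Z means the candidate
    window's rate is below the background rate (possible quiescence);
    Z > ~2 is conventionally taken as significant."""
    r_bg, r_win = n_bg / t_bg, n_win / t_win
    # Poisson rate estimates have variance r / t
    return (r_bg - r_win) / math.sqrt(r_bg / t_bg + r_win / t_win)

# Illustrative counts: 400 events in 40 yr of background,
# but only 2 events in a 2 yr candidate window
z = rate_z(400, 40.0, 2, 2.0)
```

In a gridded application, the same statistic would be evaluated per cell and per window start time, and contiguous high-Z cells would outline the candidate quiescence area.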

  1. First Quarter Hanford Seismic Report for Fiscal Year 1999

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    1999-05-26

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. The group also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the first quarter of FY99 for stations in the HSN was 99.8%. There were 121 triggers during the first quarter of fiscal year 1999. Fourteen triggers were local earthquakes: seven (50%) were in the Columbia River Basalt Group, none occurred in the pre-basalt sediments, and seven (50%) were in the crystalline basement. One earthquake (7%) occurred near or along the Horn Rapids anticline, seven earthquakes (50%) occurred in a known swarm area, and six earthquakes (43%) were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometer during the first quarter of FY99.

  2. Hanford quarterly seismic report -- 97A seismicity on and near the Hanford Site, Pasco Basin, Washington, October 1, 1996 through December 31, 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1997-02-01

    Seismic Monitoring is part of PNNL's Applied Geology and Geochemistry Group. The Seismic Monitoring Analysis and Repair Team (SMART) operates, maintains, and analyzes data from the Hanford Seismic Network (HSN), extending the site's historical seismic database and fulfilling US Department of Energy, Richland Operations Office requirements and orders. The SMART also maintains the Eastern Washington Regional Network (EWRN). The University of Washington uses the data from the EWRN and other seismic networks in the Northwest to provide the SMART with the necessary regional input for the seismic hazards analysis at the Hanford Site. The SMART is tasked to provide an uninterrupted collection of high-quality raw seismic data from the HSN located on and around the Hanford Site. These unprocessed data are permanently archived. SMART also is tasked to locate and identify sources of seismic activity, monitor changes in the historical pattern of seismic activity at the Hanford Site, and build a local earthquake database (processed data) that is permanently archived. Local earthquakes are defined as earthquakes that occur within 46 to 47 degrees north latitude and 119 to 120 degrees west longitude. The data are used by the Hanford contractor for waste management activities, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site.

  3. Albuquerque Basin seismic network

    USGS Publications Warehouse

    Jaksha, Lawrence H.; Locke, Jerry; Thompson, J.B.; Garcia, Alvin

    1977-01-01

    The U.S. Geological Survey has recently completed the installation of a seismic network around the Albuquerque Basin in New Mexico. The network consists of two seismometer arrays: a thirteen-station array monitoring an area of approximately 28,000 km² and an eight-element array monitoring the area immediately adjacent to the Albuquerque Seismological Laboratory. This report describes the instrumentation deployed in the network.

  4. Paradoxes of Italian seismicity

    NASA Astrophysics Data System (ADS)

    Boschi, E.; Pantosti, D.; Valensise, G.

    Ten years after Europe's largest earthquake of the past half-century, scientists will gather in the heart of Italy's earthquake country to assess the current state of study of active tectonics and seismic hazards in the region. The meeting, organized by the Istituto Nazionale di Geofisica, Rome, and sponsored by the major Italian research institutions, will be held in Sorrento November 19-24, 1990.

  5. Lunar seismic data analysis

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Dorman, H. J.

    1982-01-01

    The scientific data transmitted continuously from all ALSEP (Apollo Lunar Surface Experiment Package) stations on the Moon and recorded on instrumentation tapes at receiving stations distributed around the Earth were processed. The processing produced sets of computer-compatible digital tapes, from which various other data sets convenient for analysis were generated. The seismograms were read, various types of seismic events were classified, and the detected events were cataloged.

  6. Monitoring and verifying changes of organic carbon in soil

    USGS Publications Warehouse

    Post, W.M.; Izaurralde, R. C.; Mann, L. K.; Bliss, Norman B.

    2001-01-01

    Changes in soil and vegetation management can impact strongly on the rates of carbon (C) accumulation and loss in soil, even over short periods of time. Detecting the effects of such changes in accumulation and loss rates on the amount of C stored in soil presents many challenges. Consideration of the temporal and spatial heterogeneity of soil properties, general environmental conditions, and management history is essential when designing methods for monitoring and projecting changes in soil C stocks. Several approaches and tools will be required to develop reliable estimates of changes in soil C at scales ranging from the individual experimental plot to whole regional and national inventories. In this paper we present an overview of soil properties and processes that must be considered. We classify the methods for determining soil C changes as direct or indirect. Direct methods include field and laboratory measurements of total C, various physical and chemical fractions, and C isotopes. A promising direct method is eddy covariance measurement of CO2 fluxes. Indirect methods include simple and stratified accounting, use of environmental and topographic relationships, and modeling approaches. We present a conceptual plan for monitoring soil C changes at regional scales that can be readily implemented. Finally, we anticipate significant improvements in soil C monitoring with the advent of instruments capable of direct and precise measurements in the field as well as methods for interpreting and extrapolating spatial and temporal information.

  7. Seismic basement in Poland

    NASA Astrophysics Data System (ADS)

    Grad, Marek; Polkowski, Marcin

    2015-09-01

    The area of contact between Precambrian and Phanerozoic Europe in Poland has a complicated sedimentary cover and basement structure. The sedimentary cover is thinnest in the Mazury-Belarus anteclise (only 0.3-1 km), increasing to 7-8 km along the East European Craton margin and 9-12 km in the Trans-European Suture Zone (TESZ). The Variscan domain is characterized by a 1- to 2-km-thick sedimentary cover, while the Carpathians carry very thick sediments, up to c. 20 km. The map of basement depth is created by combining data from geological boreholes with a set of regional seismic refraction profiles. These data do not constrain the basement depth in the central part of the TESZ or in the Carpathians, so the data set is supplemented by 32 models from deep seismic sounding profiles and by a map of a high-resistivity (low-conductivity) layer from magnetotelluric soundings, identified as the basement. Together, these data constrain the basement depth and the P-wave seismic velocities of the crystalline and consolidated basement for the whole area of Poland. Finally, the variation of basement depth and velocity is discussed with respect to geophysical fields and the tectonic division of the area.

  8. Definition of Exclusion Zones Using Seismic Data

    NASA Astrophysics Data System (ADS)

    Bartal, Y.; Villagran, M.; Ben Horin, Y.; Leonard, G.; Joswig, M.

    In verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), there is a motivation to be effective, efficient and economical, and to prevent abuse of the right to conduct an On-Site Inspection (OSI) in the territory of a challenged State Party. In particular, it is in the interest of a State Party to avoid irrelevant searches in specific areas. In this study we propose several techniques to determine "exclusion zones", defined as areas where an event could not possibly have occurred. All the techniques are based on simple arrival-time differences between seismic stations and are thus less prone to modeling errors than standard event location methods. The techniques proposed are: angular sector exclusion based on a tripartite micro-array, half-space exclusion based on a station pair, and closed-area exclusion based on circumferential networks.
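    The station-pair half-space test can be sketched in a few lines (a uniform-velocity illustration, not the authors' implementation):

```python
import math

def point_excluded(p, sta_a, sta_b, t_a, t_b, v=6.0):
    """Half-space exclusion from one station pair: under a uniform wave
    speed v, a source at p would produce the arrival-time difference
    (d_a - d_b) / v between stations A and B.  If the sign of that
    prediction contradicts the observed arrival order, p is excluded."""
    d_a = math.dist(p, sta_a)
    d_b = math.dist(p, sta_b)
    predicted = (d_a - d_b) / v      # predicted t_a - t_b for a source at p
    observed = t_a - t_b
    return predicted * observed < 0  # opposite signs: p cannot host the event
```

    For example, if station A at (0, 0) records the signal one second before station B at (10, 0), any candidate point closer to B than to A is excluded, and sweeping this test over a map carves out the half-space behind the perpendicular bisector of the station pair.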

  9. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
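    The claim that the LR approaches its bound as replicates accumulate can be illustrated with a deliberately toy model (invented for illustration; it is not the likeLTD likelihood): assume each replicate independently recovers the decisive profile information with probability s, so the combined LR climbs monotonically toward the inverse match probability.

```python
def toy_replicate_lr(match_prob, detect_prob, n_replicates):
    """Toy stand-in for the replicate behaviour described in the abstract:
    the LR for a true contributor rises toward its upper bound
    1 / match_prob as replicates accumulate.  Not the likeLTD model."""
    bound = 1.0 / match_prob
    return bound * (1.0 - (1.0 - detect_prob) ** n_replicates)

# LRs for 1, 2, 4 and 8 replicates: increasing, never exceeding the bound
lrs = [toy_replicate_lr(1e-9, 0.5, n) for n in (1, 2, 4, 8)]
```

    The monotone approach to the bound is the qualitative behaviour the paper verifies for the real likeLTD likelihood; the geometric form here is purely illustrative.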

  10. Study on Application of Seismic Isolation System to ABWR-II Building

    SciTech Connect

    Hideaki Saito; Hideo Tanaka; Atsuko Noguchi; Junji Suhara; Yasuaki Fukushima

    2004-07-01

    This paper reports the results of a study that evaluated the applicability of a seismic isolation system to nuclear power plants. The study focuses on the possibility of a standard design with improved seismic safety of building and equipment for the ABWR-II. A base isolation system with laminated lead-rubber bearings was applied in the study. Based on the structural design of the isolated buildings, it was confirmed that the design seismic loads can be largely reduced and that seismic elements of buildings and equipment can be designed more easily than for non-isolated buildings. Improvements in building construction cost and schedule were also confirmed. The results of seismic probabilistic safety assessments showed that an isolated building has a much higher degree of seismic safety than a non-isolated building. The study concludes that the seismic isolation system is readily applicable to ABWR-II plants. In addition, with the aim of enhancing the earthquake resistance of future ABWR-II plants, a building concept was developed in which much of the important equipment is laid out on a floor directly supported by the base isolation system. For this plant, further improvement of seismic reliability is expected due to the reduction of the seismic responses of important equipment. (authors)

  11. Monitoring hydraulic fracturing with seismic emission volume

    NASA Astrophysics Data System (ADS)

    Niu, F.; Tang, Y.; Chen, H.; TAO, K.; Levander, A.

    2014-12-01

    Recent developments in horizontal drilling and hydraulic fracturing have made it possible to access reservoirs that were not available for large-scale production in the past. Hydraulic fracturing is designed to enhance rock permeability and reservoir drainage through the creation of fracture networks. Microseismic monitoring has proven to be an effective and valuable technology for imaging hydraulic fracture geometry. Based on data acquisition, seismic monitoring techniques fall into two categories: downhole and surface monitoring. Surface monitoring is challenging because of the extremely low signal-to-noise ratio of the raw data. We applied techniques used in earthquake seismology and developed an integrated monitoring system for mapping hydraulic fractures. The system consists of 20 to 30 state-of-the-art broadband seismographs, which are generally hundreds of times more sensitive than regular geophones. We have conducted two experiments in two basins in China with very different geology and formation mechanisms. In each case, we observed clear microseismic events, which may correspond to induced seismicity directly associated with fracturing and to triggered events on pre-existing faults. However, the magnitude of these events is generally larger than magnitude -1, approximately one to two magnitudes larger than those detected by downhole instruments. Spectrum-frequency analysis of the continuous surface recordings indicated high seismic energy associated with injection stages. This seismic energy can be back-projected to a volume surrounding each injection stage. Imaging the seismic emission volume (SEV) appears to be an effective way to map the stimulated reservoir volume, as well as natural fractures.
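    The back-projection of seismic energy onto a volume can be sketched as delay-and-stack imaging over a grid of trial source locations (a minimal 2-D sketch with an assumed uniform velocity; the actual SEV imaging uses calibrated velocity models and envelope processing):

```python
import math

def back_project(energy_traces, stations, grid_nodes, v=3.0, dt=0.01, win=50):
    """Delay-and-stack imaging: for each trial grid node, delay every
    station's energy trace by the node-to-station travel time and sum a
    short window.  Nodes near the true source stack coherently and
    light up in the resulting image."""
    image = {}
    for node in grid_nodes:
        stacked = 0.0
        for trace, sta in zip(energy_traces, stations):
            lag = int(math.dist(node, sta) / (v * dt))   # travel time in samples
            stacked += sum(trace[lag:lag + win])
        image[node] = stacked
    return image

# Demo: two stations, with an energy burst arriving at the lag predicted
# for a source at (5, 5) (dist ≈ 7.07 km, v = 3 km/s, dt = 0.01 s -> 235 samples)
stations = [(0.0, 0.0), (10.0, 0.0)]
trace = [0.0] * 600
for i in range(235, 240):
    trace[i] = 1.0
image = back_project([trace, trace], stations,
                     [(5.0, 5.0), (0.0, 0.0), (9.0, 9.0)])
# the node nearest the true source accumulates the largest stacked energy
```

    Summing the image over the injection period, rather than locating discrete events, is what distinguishes the SEV approach from conventional microseismic event location.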

  12. Seismic hazard assessment in Aswan, Egypt

    NASA Astrophysics Data System (ADS)

    Deif, A.; Hamed, H.; Ibrahim, H. A.; Abou Elenean, K.; El-Amin, E.

    2011-12-01

    The study of earthquake activity and seismic hazard assessment around Aswan is very important due to the proximity of the Aswan High Dam. The Aswan High Dam is founded on hard Precambrian bedrock and is considered the most important project in Egypt from the social, agricultural and electrical energy production points of view. The seismotectonic settings around Aswan strongly suggest that medium to large earthquakes are possible, particularly along the Kalabsha, Seiyal and Khor El-Ramla faults. The seismic hazard for Aswan is calculated using the probabilistic approach within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for eight ground motion spectral periods and for a return period of 475 years, which is deemed appropriate for structural design standards in the Egyptian building codes. The results are also displayed in terms of uniform hazard spectra for rock sites at the Aswan High Dam for return periods of 475 and 2475 years. In addition, the ground-motion levels are deaggregated at the dam site, in order to provide insight into which events are the most important for hazard estimation. The peak ground acceleration ranges between 36 and 152 cm/s² for a return period of 475 years (equivalent to a 90% probability of non-exceedance in 50 years). Spectral hazard values clearly indicate that, compared with countries of high seismic risk, the seismicity in the Aswan region can be described as low at most sites to moderate in the area between the Kalabsha and Seiyal faults.
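    The 475- and 2475-year figures quoted here follow from the standard Poisson relation between exceedance probability and mean return period, P = 1 − exp(−t/T). A minimal sketch (the standard formula, not code from the paper):

```python
import math

def return_period(p_exceedance, t_years):
    """Poisson relation between the probability of exceedance in t years
    and the mean return period T:
        P = 1 - exp(-t / T)   =>   T = -t / ln(1 - P)"""
    return -t_years / math.log(1.0 - p_exceedance)

# 10% exceedance (90% non-exceedance) in 50 yr -> ~475 yr
# 2% exceedance in 50 yr                       -> ~2475 yr
```

    This is why building codes phrase the same design level interchangeably as "10% in 50 years" or "475-year return period".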

  13. Seismic Adequacy Review of PC012 SCEs that are Potential Seismic Hazards with PC3 SCEs at Cold Vacuum Dryer (CVD) Facility

    SciTech Connect

    OCOMA, E.C.

    1999-08-12

    This document provides a seismic adequacy review of PC012 systems, components, and equipment (SCE) anchorages that are potential seismic interaction hazards with PC3 SCEs during a Design Basis Earthquake. The PC012 items are identified in the Safety Equipment List as 3/1 SCEs.

  14. Seismic Imager Space Telescope

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers, Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe; Ampuero, Jean Paul; Leprince, Sebastien; Michel, Remi

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. 
A GSI would revolutionize the understanding of earthquake dynamics, perhaps ultimately leading to effective warning capabilities, improved management of earthquake risk, and improved public-safety policies. The spacecraft's position, high optical quality, large field of view, and large field of regard would make it an ideal platform for other scientific studies, and much of the same data could simply be reused. If different data, such as multi-spectral data, are required, additional instruments could share the telescope.

  15. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique has been developed that can yield a higher signal-to-noise ratio than a conventional single-pulse method. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time, providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random-impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with far fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data, especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with a cutoff frequency equal to the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array, as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding is accomplished by a "shift-and-stack" method that is much simpler and quicker than cross-correlation. The simplicity of the coding keeps the mechanical design of the source simple, and several different types of mechanical systems could be adapted to generate a linear impact sweep. It also allows the technique to be used with conventional acquisition systems, with only minor modifications.
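    The "shift-and-stack" decoding lends itself to a very small sketch: align the raw record on each recorded impact time and sum, so that reflections at a fixed lag after every impact add coherently while cross-talk between impacts tends to cancel (the impact times and sample interval below are invented for illustration):

```python
def shift_and_stack(record, impact_times, dt=0.001, out_len=200):
    """SIST-style decoding: cut the raw record at each known impact time
    and stack the segments.  A reflection arriving at a fixed lag after
    every impact adds coherently in the stacked output."""
    out = [0.0] * out_len
    for t in impact_times:
        i0 = int(round(t / dt))
        for k in range(out_len):
            if i0 + k < len(record):
                out[k] += record[i0 + k]
    return out

# Impacts at a linearly increasing interval (a crude "impact sweep"),
# each followed 30 samples later by the same reflection:
impacts = [0.0, 0.100, 0.190, 0.270, 0.340]
rec = [0.0] * 600
for t in impacts:
    rec[int(t / 0.001) + 30] = 1.0
decoded = shift_and_stack(rec, impacts)   # decoded[30] carries the stack
```

    Because the impact interval keeps changing, the spurious lags between different impacts never repeat, which is exactly why the stack suppresses correlation noise.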

  16. Seismic databases of The Caucasus

    NASA Astrophysics Data System (ADS)

    Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.

    2012-12-01

    The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring for a better understanding of the tectonic processes under way there. The Seismic Monitoring Center of Georgia (Ilia State University) operates the country's digital seismic network and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily accessible, and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, drawn from both analog and digital seismic networks. The first analog seismic station in the Caucasus was installed in Tbilisi, Georgia, in 1899. The number of analog stations increased over the following decades, and by the 1980s about 100 analog stations operated across the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 just two analog instruments remained in operation. A new digital seismic network has been developed in Georgia since 2003; the number of digital stations has grown, and today more than 25 operate in the country. The database includes detailed information about all equipment installed at the seismic stations and is available online. This provides a convenient interface for seismic data exchange among the neighboring countries of the Caucasus. It also simplifies both seismic data processing and transfer to the database, and reduces operator mistakes during routine work. The database was built using PHP, MySQL, JavaScript, Ajax, GMT, Gmap, and Hypoinverse.
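    As an illustration of how such a catalog can be organized relationally, here is a hypothetical miniature schema using Python's built-in sqlite3 (the actual database is built on MySQL with a PHP front end; all table, column, and station names below are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE station (
        code      TEXT PRIMARY KEY,
        lat       REAL, lon REAL,
        installed INTEGER,             -- year of installation
        kind      TEXT                 -- 'analog' or 'digital'
    );
    CREATE TABLE event (
        id        INTEGER PRIMARY KEY,
        otime     TEXT,                -- origin time (ISO 8601)
        lat       REAL, lon REAL,
        depth_km  REAL,
        mag       REAL
    );
""")
# e.g. a record for the 1899 analog station in Tbilisi (code invented):
con.execute("INSERT INTO station VALUES ('TBI', 41.72, 44.79, 1899, 'analog')")
analog = con.execute(
    "SELECT code FROM station WHERE kind = 'analog'").fetchall()
```

    Keeping station metadata (including historical analog equipment) in the same schema as the event catalog is what lets the database answer provenance queries such as "which instruments recorded this 1960s event".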

  17. 49 CFR 40.139 - On what basis does the MRO verify test results involving opiates?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false On what basis does the MRO verify test results... Verification Process § 40.139 On what basis does the MRO verify test results involving opiates? As the MRO, you... laboratory confirms the presence of 6-acetylmorphine (6-AM) in the specimen, you must verify the test...

  18. 49 CFR 40.139 - On what basis does the MRO verify test results involving opiates?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false On what basis does the MRO verify test results... Verification Process § 40.139 On what basis does the MRO verify test results involving opiates? As the MRO, you... laboratory confirms the presence of 6-acetylmorphine (6-AM) in the specimen, you must verify the test...

  19. 75 FR 31288 - Plant-Verified Drop Shipment (PVDS)-Nonpostal Documentation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... 111 Plant-Verified Drop Shipment (PVDS)--Nonpostal Documentation AGENCY: Postal Service TM . ACTION... Service, Domestic Mail Manual (DMM ) 705.15. 2.14 to clarify that PS Form 8125, Plant-Verified Drop...: As a result of reviews of USPS policy concerning practices at induction points of plant-verified...

  20. Small Arrays for Seismic Intruder Detections: A Simulation Based Experiment

    NASA Astrophysics Data System (ADS)

    Pitarka, A.

    2014-12-01

    Seismic sensors such as geophones and fiber optics have been increasingly recognized as promising technologies for intelligence surveillance, including intruder detection and perimeter defense systems. Geophone arrays can provide cost-effective intruder detection for protecting assets with large perimeters. A seismic intruder detection system uses one or more arrays of geophones designed to record seismic signals from footsteps and ground vehicles. Using a series of real-time signal processing algorithms, the system detects, classifies, and monitors the intruder's movement. We have carried out numerical experiments to demonstrate the capability of a seismic array to detect moving targets that generate seismic signals. The seismic source is modeled as a vertical force acting on the ground that generates continuous impulsive seismic signals with different predominant frequencies. Frequency-wavenumber analysis of the synthetic array data was used to demonstrate the array's capability to accurately determine the intruder's direction of movement. The performance of the array was also analyzed for detecting two or more objects moving at the same time. One drawback of a single-array system is its inefficiency at detecting seismic signals deflected by large underground objects. We show simulation results of the effect of an underground concrete block in shielding the seismic signal coming from an intruder. Based on the simulations, we found that multiple small arrays can greatly improve the system's detection capability in the presence of underground structures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
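    The direction-finding step can be sketched with delay-and-sum beamforming over trial azimuths, a time-domain cousin of the frequency-wavenumber analysis mentioned above (the geometry, wave speed, and sampling below are assumed for illustration):

```python
import math

def best_azimuth(traces, stations, v, dt, n_az=72):
    """Steer a small array over n_az trial azimuths, align each trace with
    the plane-wave delay for that azimuth, and return the azimuth (deg)
    whose aligned stack carries the most energy."""
    best_az, best_power = 0.0, -1.0
    n = min(len(tr) for tr in traces)
    for a in range(n_az):
        az = 2.0 * math.pi * a / n_az
        sx, sy = math.cos(az) / v, math.sin(az) / v     # slowness vector
        stack = [0.0] * n
        for tr, (x, y) in zip(traces, stations):
            shift = int(round((sx * x + sy * y) / dt))  # plane-wave delay
            for k in range(n):
                j = k + shift
                if 0 <= j < len(tr):
                    stack[k] += tr[j]
        power = sum(s * s for s in stack)
        if power > best_power:
            best_az, best_power = math.degrees(az), power
    return best_az

# A plane wave crossing a 4-element array along +x at 1 km/s:
stations = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
traces = []
for x, _y in stations:
    tr = [0.0] * 400
    tr[150 + int(100 * x)] = 1.0     # 100-sample delay per km (dt = 0.01 s)
    traces.append(tr)
az = best_azimuth(traces, stations, v=1.0, dt=0.01)
```

    Tracking how the best-fitting azimuth drifts over successive time windows is what turns this single estimate into a monitor of the intruder's movement direction.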

  1. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria underwent a great change after the Boumerdes earthquake of May 21st, 2003. The installation of the new Algerian Digital Seismic Network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66: 15 broadband, 2 very-broadband, 47 short-period, and 21 accelerometer stations, connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed with Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional and teleseismic events have been located by the automatic processing, then revised and archived in databases. This new data set is characterized by accurate automatic locations of local seismicity and the ability to determine focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses and cultural noise are checked using power spectral density (PSD) analysis to determine the noise level. The data quality of the ADSN broadband stations is controlled in quasi-real time with the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass positions of the sensors, and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network supports many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, ShakeMap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system.
Over the next two years, the acquisition of new seismic equipment to reach 50 additional broadband stations will densify the network and further enhance the performance of the Algerian Digital Seismic Network.
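    The PSD noise check described above can be illustrated with the simplest possible estimator, a direct single-segment periodogram (a sketch only; PQLX computes averaged FFT-segment spectra and aggregates them into probability density functions):

```python
import cmath, math

def periodogram(x, fs):
    """One-sided periodogram PSD estimate via a direct DFT:
    P[k] = |X_k|^2 / (fs * N), with bin k at frequency k * fs / N Hz.
    (O(N^2) on purpose -- the FFT is an optimization, not the concept.)"""
    n = len(x)
    psd = []
    for k in range(n // 2 + 1):
        xk = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        psd.append(abs(xk) ** 2 / (fs * n))
    return psd

# A pure 5 Hz tone sampled at 64 Hz lands in bin k = 5:
fs, n = 64, 64
tone = [math.sin(2 * math.pi * 5 * j / fs) for j in range(n)]
spectrum = periodogram(tone, fs)
```

    Comparing such spectra against reference noise models is how station operators decide whether a vault, sensor, or telemetry link has degraded.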

  2. Savannah River Site disaggregated seismic spectra

    SciTech Connect

    Stephenson, D.E.

    1993-02-01

    The objective of this technical note is to characterize seismic ground motion at the Savannah River Site (SRS) from postulated earthquakes that may impact facilities at the site. This task is accomplished by reviewing the deterministic and probabilistic assessments of the seismic hazard to establish the earthquakes that control the hazard at the site, and then evaluating the associated seismic ground motions in terms of response spectra. For engineering design criteria of earthquake-resistant structures, response spectra serve the function of characterizing ground motions as a function of period or frequency. These motions then provide the input parameters used in the analysis of structural response. Because they use the maximum response, response spectra are an inherently conservative design tool. Response spectra are described in terms of amplitude, duration, and frequency content, and these are related to source parameters, travel path, and site conditions. Studies by a number of investigators have shown by statistical analysis that for different magnitudes the response spectrum values differ across periods. These facts support Jennings' position that using different shapes of design spectra for earthquakes of different magnitudes and travel paths is a better practice than employing a single, general-purpose shape. All seismic ground motion characterization results indicate that the PGA is controlled by a local event with Mw < 6 and R < 30 km. The results also show that lower frequencies are controlled by a larger, more distant event, typically the Charleston source. The PGA of 0.2 g, based originally on the Blume study, is consistent with LLNL report UCRL-15910 (1990) and with the DOE position on LLNL/EPRI.

  4. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

    This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable estimation of the system's parametric uncertainty. The adaptation is based on both the tracking error and the predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time-invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time-delay margin using a matrix measure method. Using this analytical method, the free design parameter n of the optimal control modification adaptive law can be determined to meet a specification of stability margin for verification purposes.
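
    A small sketch of the matrix measure (logarithmic norm) that underlies time-delay margin estimates of this kind. The delay bound shown is one generic matrix-measure-style sufficient condition, not the paper's actual derivation; the function names are hypothetical.

```python
import numpy as np

def matrix_measure_2(A):
    """Matrix measure (logarithmic norm) induced by the 2-norm:
    mu_2(A) = largest eigenvalue of (A + A^T) / 2."""
    A = np.asarray(A, dtype=float)
    return np.linalg.eigvalsh((A + A.T) / 2).max()

def delay_margin_bound(A, Ad):
    """A coarse sufficient bound for the delay margin of
        x'(t) = A x(t) + Ad x(t - tau).
    If the undelayed system x' = (A + Ad) x is contracting
    (mu_2(A + Ad) < 0), stability is roughly retained while
        tau < -mu_2(A + Ad) / (||Ad|| * (||A|| + ||Ad||)).
    One of several matrix-measure-style estimates; sketch only.
    """
    mu = matrix_measure_2(A + Ad)
    if mu >= 0:
        return 0.0
    nA = np.linalg.norm(A, 2)
    nAd = np.linalg.norm(Ad, 2)
    return -mu / (nAd * (nA + nAd))
```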

  5. Workmanship Coupon Verifies and Validates the Remote Inspection System Used to Inspect Dry Shielded Canister Welds

    SciTech Connect

    Custer, K. E.; Zirker, L. R.; Dowalo, J. A.; Kaylor, J. E.

    2002-02-25

    The Idaho National Engineering and Environmental Laboratory (INEEL) is operated by Bechtel-BWXT Idaho LLC (BBWI), which recently completed a very successful Three-Mile Island-2 (TMI-2) program for the Department of Energy. This complex and challenging program loaded, welded, and transported an unprecedented 27 dry shielded canisters in seven months, and did so ahead of schedule. The program moved over 340 canisters of TMI-2 core debris that had been in wet storage into a dry storage facility at the INEEL. Welding flaws discovered in mid-campaign in the manually welded purge and vent ports had to be verified as not affecting previously completed seal welds. A portable workmanship coupon was designed and built to validate remote inspection of completed in-service seal welds. This document outlines the methodology and advantages of building and using workmanship coupons.

  6. Modelling of NW Himalayan Seismicity

    NASA Astrophysics Data System (ADS)

    Bansal, A. R.; Dimri, V. P.

    2014-12-01

    The northwest Himalaya is a seismically active region, owing to the collision of the Indian and Eurasian plates, and has experienced many large earthquakes in the past. A systematic analysis of seismicity is useful for seismic hazard estimation of the region. We analyzed the seismicity of the northwestern Himalaya since 1980. The magnitude of completeness of the catalogue was estimated using different methods and found to be 3.0. A large difference in magnitude of completeness was found among the methods, and a reliable value was obtained after testing the distribution of magnitudes with time. The region is prone to large earthquakes, and many studies have shown that seismic activation or quiescence takes place before large earthquakes. We studied this behavior of seismicity using the Epidemic Type Aftershock Sequence (ETAS) model and found that a stationary ETAS model is more suitable for modelling the seismicity of this region. The earthquake catalogue was de-clustered using a stochastic approach to study the behavior of background and triggered seismicity. The triggered seismicity is found to have shallower depths compared to the background events.
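
    The ETAS model referred to above describes the seismicity rate as a background rate plus aftershock contributions from every past event. A minimal sketch of the conditional intensity, with illustrative (not fitted) parameter values:

```python
import numpy as np

def etas_rate(t, events, mu, K, alpha, c, p, m0):
    """Conditional intensity of the ETAS model at time t:
        lambda(t) = mu + sum over past events (t_i, m_i) of
                    K * exp(alpha * (m_i - m0)) / (t - t_i + c)**p
    mu is the background rate; the sum is the Omori-law aftershock
    contribution, scaled by the magnitude of each triggering event.
    """
    rate = mu
    for ti, mi in events:
        if ti < t:
            rate += K * np.exp(alpha * (mi - m0)) / (t - ti + c) ** p
    return rate
```

A stationary ETAS model, as preferred in the study, keeps these parameters fixed in time; de-clustering separates the background term mu from the triggered sum.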

  7. Seismicity around Brazilian dam reservoirs

    SciTech Connect

    Coelho, P.E.F.P. )

    1987-01-01

    More than 30 cases of seismicity associated with dam reservoir sites are known throughout the world. Despite the lack of data in some areas where seismicity occurred after reservoir impounding, distinct seismic patterns have been observed in seismic areas after the implementation of dam projects. This has demonstrated that reservoir loading can trigger earthquakes. A mechanism of earthquake generation by reservoir impounding is proposed here, with particular application to the Brazilian cases and to areas subject to low confining stress conditions in stable regions. Six artificial lakes are described, and the associated earthquake sources are discussed in terms of natural or induced seismicity. Earthquake monitoring in Brazil up to 1967, when Brasilia's seismological station started operation, was mainly based on personal communications to the media. Therefore, there is a general lack of seismic records in relatively uninhabited areas, making it difficult to establish a seismic risk classification for the territory and to distinguish natural from induced seismicity. Despite this, the cases reported here show an alteration of the original seismic stability at dam sites after reservoir loading, as observed by the inhabitants or recorded by Brasilia's seismological station. All cases appear to be related to an increase in pore pressure in permeable rocks or fracture zones that are confined between impermeable rock slabs or more competent rock. It is apparent that some cases show some participation of high residual stress conditions in the area.

  8. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    SciTech Connect

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian; Kamali, M. Koohi

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It gives probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed, covering the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps were prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels, using the SEISRISK III software.
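
    The return periods quoted above correspond to fixed probabilities of exceedance under a Poisson occurrence model (for example, 475 years is approximately 10% in 50 years, and 2475 years approximately 2% in 50 years). A minimal sketch of the conversion:

```python
import math

def return_period(pe, t_years):
    """Return period T for probability of exceedance `pe` in `t_years`,
    assuming Poisson occurrence: pe = 1 - exp(-t / T)."""
    return -t_years / math.log(1.0 - pe)

def prob_exceedance(T, t_years):
    """Inverse relation: probability of exceedance in `t_years`
    for an event with return period T."""
    return 1.0 - math.exp(-t_years / T)
```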

  9. Sloshing of coolant in a seismically isolated reactor

    SciTech Connect

    Wu, Ting-shu; Gvildys, J.; Seidensticker, R.W.

    1988-01-01

    During a seismic event, the liquid coolant inside the reactor vessel will undergo sloshing motion, which is a low-frequency phenomenon. In a reactor system incorporating seismic isolation, the isolation frequency usually is also very low, so there is concern about potential amplification of the sloshing motion of the liquid coolant. This study investigates the effects of seismic isolation on the sloshing of liquid coolant inside the reactor vessel of a liquid metal cooled reactor. Based on a synthetic ground motion whose response spectra envelop those specified by NRC Regulatory Guide 1.60, it is found that the maximum sloshing wave height increases from 18 in. to almost 30 in. when the system is seismically isolated. Since a higher sloshing wave may introduce severe impact forces and thermal shocks to the reactor closure and other components within the reactor vessel, adequate design considerations should be made either to suppress the wave height or to reduce the effects caused by high waves.
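
    The low-frequency character of sloshing noted above can be illustrated with linear theory for a vertical cylindrical tank, where the first-mode sloshing frequency depends on tank radius and liquid depth. This is a generic textbook formula, not the study's actual reactor geometry:

```python
import math

def sloshing_frequency(R, H, g=9.81):
    """First-mode sloshing frequency (Hz) of liquid in a vertical
    cylindrical tank of radius R and liquid depth H (linear theory):
        omega^2 = (g * xi / R) * tanh(xi * H / R),  xi = 1.8412,
    where xi is the first zero of the derivative of the Bessel
    function J1. Large tanks give frequencies well below 1 Hz,
    close to typical isolation frequencies.
    """
    xi = 1.8412
    omega2 = g * xi / R * math.tanh(xi * H / R)
    return math.sqrt(omega2) / (2 * math.pi)
```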

  10. Seismic analysis of a large LMFBR with fluid-structure interactions

    SciTech Connect

    Ma, D.C.

    1985-01-01

    The seismic analysis of a large LMFBR with many internal components and structures is presented. Both vertical and horizontal seismic excitations are considered. The important hydrodynamic phenomena, such as fluid-structure interaction, sloshing, fluid coupling and fluid inertia effects, are included in the analysis. The results of this study are discussed in detail. Information useful for the design of future reactors under seismic conditions is also given. 4 refs., 12 figs.

  11. Hanford annual second quarter seismic report, fiscal year 1998: Seismicity on and near the Hanford Site, Pasco, Washington

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.; Rohay, A.C.

    1998-06-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. The staff also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the second quarter of FY98 for stations in the HSN was 99.92%; for stations of the EWRN it was 99.46%. During the second quarter of FY98, the acquisition computer triggered 159 times. Of these triggers, 14 were local earthquakes: 7 (50%) in the Columbia River Basalt Group, 3 (21%) in the pre-basalt sediments, and 4 (29%) in the crystalline basement. The geologic and tectonic environments where these earthquakes occurred are discussed in this report. The most significant seismic event of the second quarter occurred on March 23, 1998, when a magnitude 1.9 (Mc) earthquake near Eltopia, Washington was felt by local residents. Although this was a small event, it was felt at the surface and is an indication of the potential impact on Hanford of seismic events that are common to the Site.

  12. Magnitude Dependent Seismic Quiescence of 2008 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sacks, S. I.; Takanami, T.; Smith, D. E.; Rydelek, P. A.

    2014-12-01

    The change in seismicity leading to the 2008 Wenchuan earthquake (Mw 7.9) has been studied by various authors based on statistics and/or pattern recognition (Huang, 2008; Yan et al., 2009; Chen and Wang, 2010; Yi et al., 2011). We show, in particular, that magnitude-dependent seismic quiescence is observed for the Wenchuan earthquake and that it adds to other similar observations. Such studies of seismic quiescence prior to major earthquakes include the 1982 Urakawa-Oki earthquake (M 7.1) (Taylor et al., 1992), the 1994 Hokkaido-Toho-Oki earthquake (Mw 8.2) (Takanami et al., 1996), and the 2011 Tohoku earthquake (Mw 9.0) (Katsumata, 2011). Smith and Sacks (2013) proposed a magnitude-dependent quiescence based on a physical earthquake model (Rydelek and Sacks, 1995) and demonstrated that the quiescence can be reproduced by the introduction of "asperities" (dilatancy-hardened zones). Actual observations indicate the change occurs in a broader area than the eventual earthquake fault zone. To accept this explanation, we need to verify the model, as it predicts somewhat controversial features of earthquakes, such as magnitude-dependent stress drop at the lower magnitude range and dynamically appearing asperities with repeating slip in some parts of the rupture zone. We show supportive observations. We will also need to verify that dilatancy diffusion is taking place; so far, we have only indirect evidence, which needs to be more quantitatively substantiated.

  13. Seismic isolation of nuclear power plants using elastomeric bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

    Seismic isolation using low damping rubber (LDR) and lead-rubber (LR) bearings is a viable strategy for mitigating the effects of extreme earthquake shaking on safety-related nuclear structures. Although seismic isolation has been deployed in nuclear structures in France and South Africa, it has not seen widespread use because of limited new build nuclear construction in the past 30 years and a lack of guidelines, codes and standards for the analysis, design and construction of isolation systems specific to nuclear structures. The nuclear accident at Fukushima Daiichi in March 2011 has led the nuclear community to consider seismic isolation for new large light water and small modular reactors to withstand the effects of extreme earthquakes. The mechanical properties of LDR and LR bearings are not expected to change substantially in design basis shaking. However, under shaking more intense than design basis, the properties of the lead cores in lead-rubber bearings may degrade due to heating associated with energy dissipation, some bearings in an isolation system may experience net tension, and the compression and tension stiffness may be affected by the horizontal displacement of the isolation system. The effects of intra-earthquake changes in mechanical properties on the response of base-isolated nuclear power plants (NPPs) were investigated using an advanced numerical model of a lead-rubber bearing that has been verified and validated, and implemented in OpenSees and ABAQUS. A series of experiments was conducted at the University at Buffalo to characterize the behavior of elastomeric bearings in tension. The test data were used to validate a phenomenological model of an elastomeric bearing in tension. Three times the shear modulus of the rubber was found to be a reasonable estimate of the cavitation stress of an elastomeric bearing. 
The sequence of loading did not change the behavior of an elastomeric bearing under cyclic tension, and there was no significant change in the shear modulus, compressive stiffness, or buckling load of a bearing following cavitation. Response-history analysis of base-isolated NPPs was performed using a two-node macro model and a lumped-mass stick model. A comparison of responses obtained from analyses using simplified and advanced isolator models showed that the variation in buckling load due to horizontal displacement and the strength degradation due to heating of lead cores affect the responses of a base-isolated NPP most significantly. The two-node macro model can be used to estimate the horizontal displacement response of a base-isolated NPP, but a three-dimensional model that explicitly considers all of the bearings in the isolation system is required to estimate demands on individual bearings and to investigate rocking and torsional responses. The use of the simplified LR bearing model underestimated the torsional and rocking response of the base-isolated NPP. Vertical spectral response at the top of the containment building was very sensitive to how damping was defined for the response-history analysis.

  14. Seismic monitoring of geomorphic processes

    NASA Astrophysics Data System (ADS)

    Burtin, A.; Hovius, N.; Turowski, J. M.

    2014-12-01

    In seismology, the signal is usually analysed for earthquake data, yet earthquakes represent less than 1% of continuous recordings. The remaining data were long considered seismic noise and ignored. Over the past decades, the analysis of seismic noise has steadily grown in popularity, leading to new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor its transfer through the landscape. Surface processes vary in nature, mechanism and magnitude, and in space and time, and this variability can be observed in the seismic signals. This contribution aims to give an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layouts of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully exploit the richness of continuous seismic signals, to better quantify geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and sediment transfer in the landscape on the scales of external forcing.
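
    Detection of discrete events in continuous seismic records, as discussed above, is commonly done with a short-term average over long-term average (STA/LTA) trigger. A minimal sketch with illustrative window lengths and threshold; the contribution does not prescribe a specific detector:

```python
import numpy as np

def sta_lta(signal, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA ratio on the squared signal (a simple
    characteristic function). An event is declared where the
    short-term energy rises well above the long-term background.
    Window lengths (seconds) are illustrative defaults."""
    cf = np.asarray(signal, dtype=float) ** 2
    nsta = int(sta_win * fs)
    nlta = int(lta_win * fs)
    sta = np.convolve(cf, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(cf, np.ones(nlta) / nlta, mode="same")
    eps = 1e-12  # avoid division by zero in quiet stretches
    return sta / (lta + eps)

def trigger(ratio, on=3.0):
    """Indices where the STA/LTA ratio exceeds the 'on' threshold."""
    return np.where(ratio > on)[0]
```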

  15. A future for drifting seismic networks

    NASA Astrophysics Data System (ADS)

    Simons, F. J.; Nolet, G.; Babcock, J.

    2007-12-01

    One-dimensional, radial Earth models are sufficiently well constrained to accurately locate earthquakes and calculate the paths followed by seismic rays. The differences between observations and theoretical predictions of seismograms in such Earth models can be used to reconstruct the three-dimensional wave speed distribution in the regions sampled by the seismic waves, by the technique of seismic tomography. Caused by thermal, compositional, and textural variations, wave speed anomalies remain the premier data source to fully understand the structure and evolution of our planet, from the scale of mantle convection and the mechanisms of heat transfer from core to surface to the interaction between the deep Earth and surface processes such as plate motion and crustal deformation. Unequal geographical data coverage continues to fundamentally limit the quality of tomographic reconstructions of seismic wave speeds in the interior of the Earth. Only at great cost can geophysicists overcome the difficulties of placing seismographs on the two thirds of the Earth's surface that is covered by oceans. The lack of spatial data coverage strongly hampers the determination of the structure of the Earth in the uncovered regions: all 3-D Earth models are marked by blank spots in areas, distributed throughout the Earth, where little or no information can be obtained. As a possible solution to gaining equal geographic data coverage, we have developed MERMAID, a prototype mobile receiver that could provide an easy, cost-effective way to collect seismic data in the ocean. It is a modification of the robotic floating instruments designed and used by oceanographers. Like them, MERMAID spends its life at depth but is capable of surfacing using a pump and bladder. We have equipped it with a hydrophone to record water pressure variations induced by compressional (P) waves. 
Untethered and passively drifting, such a floating seismometer will surface upon detection of a "useful" seismic event (useful for seismic tomography, that is), determine a GPS location, and transmit the waveforms to a satellite. In this presentation we discuss the progress made in this field by our group. More specifically, we discuss the results of preliminary tests conducted off-shore La Jolla in 2003 and 2004, as well as just-received results from a third successful in situ test completed in August 2007. We will draw attention to design issues and bottlenecks, and to the need for and features of the sophisticated onboard data analysis software that we have developed and tested. We will chart a road map toward our ultimate goal: a worldwide array of MERMAID floating hydrophones on the scale of the current international land-based seismic arrays. This, we believe, has the potential to progressively eliminate the discrepancies in spatial coverage that currently result in seismic Earth models that are very poorly resolved in places.

  16. Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel

    USGS Publications Warehouse

    Wang, Z.; Shi, B.; Kiefer, J.D.

    2005-01-01

    PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is the basis. Seismologists should provide the best information on seismic hazards and communicate it to users and policy makers. However, there has been a lack of effort to communicate the uncertainties in seismic hazard assessment in the central U.S. The use of 10%, 5%, and 2% PE in 50 years causes confusion in communicating seismic hazard assessments. It would be easier to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.

  17. Seismic hazard from induced seismicity: effect of time-dependent hazard variables

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2012-12-01

    Geothermal systems are drawing large attention worldwide as an alternative source of energy. Although geothermal energy is beneficial, field operations can produce induced seismicity whose effects range from light and unfelt to severe and damaging. In a recent paper (Convertito et al., 2012), we investigated the effect of time-dependent seismicity parameters on seismic hazard from induced seismicity. The analysis considered the time-variation of the b-value of the Gutenberg-Richter relationship and of the seismicity rate, and assumed a non-homogeneous Poisson model to solve the hazard integral. The procedure was tested in The Geysers geothermal area in Northern California, where commercial exploitation started in the 1960s. The analyzed dataset consists of earthquakes recorded during the period 2007 through 2010 by the LBNL Geysers/Calpine network. To test the reliability of the analysis, we applied a simple forecasting procedure that compares the estimated hazard values, in terms of ground-motion values with a fixed probability of exceedance, against the observed ground-motion values. The procedure is feasible for monitoring purposes and for calibrating the production/extraction rate to avoid adverse consequences. However, one of the main assumptions we made is that both the median predictions and the standard deviation of the ground-motion prediction equation (GMPE) are stationary. Particularly for geothermal areas, where the number of recorded earthquakes can change rapidly with time, we want to investigate how a variation of the coefficients of the GMPE and of its standard deviation influences the hazard estimates. Basically, we hypothesize that the physical-mechanical properties of a highly fractured medium that is continuously perturbed by field operations can produce variations of both source and medium properties that cannot be captured by a stationary GMPE. 
We assume a standard GMPE which accounts for the main effects that modify the scaling of the peak ground motion parameters (e.g., magnitude, geometrical spreading and anelastic attenuation). Moreover, we consider both the inter-event and intra-event components of the standard deviation. For comparison, we use the same dataset analyzed by Convertito et al. (2012), and for successive time windows we perform the regression analysis to infer the time-dependent coefficients of the GMPE. After testing the statistical significance of the new coefficients and verifying a reduction in the total standard deviation, we introduce the new model in the hazard integral. Hazard maps and site-specific analyses in terms of a uniform hazard spectrum are used to compare the new results with those obtained in our previous study, to investigate which coefficients and which components of the total standard deviation really matter for refining seismic hazard estimates for induced seismicity. Convertito et al. (2012). From Induced Seismicity to Direct Time-Dependent Seismic Hazard, BSSA 102(6), doi:10.1785/0120120036.
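
    The time-dependent hazard integral described above can be sketched under a non-homogeneous Poisson model in which the seismicity rate and b-value vary per time bin. Here P(A > a | event) is collapsed to a single fixed number for simplicity, whereas the actual study derives it from the full GMPE with inter- and intra-event variability:

```python
import numpy as np

def exceedance_prob(times, rate, b, m, m0, p_gm_given_event):
    """Probability that ground motion A > a occurs over the window
    covered by `times`, under a non-homogeneous Poisson model.

    times            : bin edges (increasing)
    rate[i], b[i]    : seismicity rate (events >= m0 per unit time) and
                       Gutenberg-Richter b-value in time bin i
    p_gm_given_event : P(A > a | event with magnitude >= m), fixed here
    """
    times = np.asarray(times, dtype=float)
    rate = np.asarray(rate, dtype=float)
    b = np.asarray(b, dtype=float)
    dt = np.diff(times)
    # rate of events with magnitude >= m in each bin (G-R scaling)
    lam_m = rate[:-1] * 10.0 ** (-b[:-1] * (m - m0))
    integral = np.sum(lam_m * p_gm_given_event * dt)
    return 1.0 - np.exp(-integral)
```

Raising the b-value in a window lowers the rate of the larger events and hence the hazard, which is the time-dependence the analysis exploits.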

  18. Seismic vibration source

    NASA Technical Reports Server (NTRS)

    Dowler, W. L.; Varsi, G.; Yang, L. C. (Inventor)

    1979-01-01

    A system for vibrating the earth where seismic mapping is to take place is described. A relatively shallow hole, such as one 10 feet deep, is formed in the earth; a solid propellant is placed in the hole; and a portion of the hole above the propellant is sealed with a device that can rapidly open and close to allow a repeatedly interrupted escape of gas. The propellant is ignited so that high-pressure gas is created, which escapes in pulses to vibrate the earth.

  19. Seismic risk perception test

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro

    2013-04-01

    The perception of risks involves the process of collecting, selecting and interpreting signals about uncertain impacts of events, activities or technologies. In the natural sciences the term risk seems clearly defined, meaning the probability distribution of adverse effects, but the everyday use of risk has different connotations (Renn, 2008). The two terms, hazard and risk, are often used interchangeably by the public. Knowledge, experience, values, attitudes and feelings all influence people's thinking and judgement about the seriousness and acceptability of risks. Within the social sciences, however, the terminology of 'risk perception' has become the conventional standard (Slovic, 1987). The mental models and other psychological mechanisms which people use to judge risks (such as cognitive heuristics and risk images) are internalized through social and cultural learning and constantly moderated (reinforced, modified, amplified or attenuated) by media reports, peer influences and other communication processes (Morgan et al., 2001). Yet a theory of risk perception that offers an integrative, as well as empirically valid, approach to understanding and explaining risk perception is still missing. To understand the perception of risk it is necessary to consider several areas: social, psychological, cultural, and their interactions. Among the various international research efforts on the perception of natural hazards, the semantic differential method seemed promising (Osgood, C.E., Suci, G., & Tannenbaum, P. 1957, The measurement of meaning. Urbana, IL: University of Illinois Press). The test on seismic risk perception has been constructed with the semantic differential method; opposite adjectives or terms are compared using a seven-point Likert scale. 
The test consists of an informative part and six sections dedicated, respectively, to: hazard; vulnerability (home and workplace); exposed value (with reference to population and territory); seismic risk in general; risk information and its sources; and comparison between seismic risk and other natural hazards. Informative data include: region, province, municipality of residence, date of compilation, age, sex, place of birth, nationality, marital status, children, level of education, and employment. The test yields a perception score for each factor: hazard, exposed value, and vulnerability. These scores can be related to the scientific data on hazard, vulnerability and exposed value. In January 2013 a survey started in the Po Valley and Southern Apennines. The survey will be conducted via the web, using institutional sites of regions, provinces and municipalities, online newspapers with local circulation, etc. Preliminary data will be discussed. Improving our understanding of the perception of seismic risk would allow us to inform more effectively and to build better educational projects to mitigate risk.
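
    Scoring a semantic-differential test of this kind typically amounts to averaging the seven-point item responses within each factor, flipping reverse-keyed items. A minimal sketch with hypothetical item names; the actual questionnaire items and groupings are not given in this abstract:

```python
def factor_scores(responses, factor_items):
    """Mean semantic-differential score per factor on a 1-7 Likert scale.

    responses    : dict item -> integer in 1..7 for one respondent
    factor_items : dict factor -> list of (item, reverse_keyed) pairs
    Item names and groupings here are hypothetical.
    """
    scores = {}
    for factor, items in factor_items.items():
        vals = []
        for item, reverse in items:
            v = responses[item]
            vals.append(8 - v if reverse else v)  # flip reverse-keyed items
        scores[factor] = sum(vals) / len(vals)
    return scores
```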

  20. Elastic-Wavefield Seismic Stratigraphy: A New Seismic Imaging Technology

    SciTech Connect

    Bob A. Hardage; Milo M. Backus; Michael V. DeAngelo; Sergey Fomel; Khaled Fouad; Robert J. Graebner; Paul E. Murray; Randy Remington; Diana Sava

    2006-07-31

    The purpose of our research has been to develop and demonstrate a seismic technology that will provide the oil and gas industry a better methodology for understanding reservoir and seal architectures and for improving interpretations of hydrocarbon systems. Our research goal was to expand the valuable science of seismic stratigraphy beyond the constraints of compressional (P-P) seismic data by using all modes (P-P, P-SV, SH-SH, SV-SV, SV-P) of a seismic elastic wavefield to define depositional sequences and facies. Our objective was to demonstrate that one or more modes of an elastic wavefield may image stratal surfaces across some stratigraphic intervals that are not seen by companion wave modes and thus provide different, but equally valid, information regarding depositional sequences and sedimentary facies within that interval. We use the term elastic wavefield stratigraphy to describe the methodology we use to integrate seismic sequences and seismic facies from all modes of an elastic wavefield into a seismic interpretation. We interpreted both onshore and marine multicomponent seismic surveys to select the data examples that we use to document the principles of elastic wavefield stratigraphy. We have also used examples from published papers that illustrate some concepts better than did the multicomponent seismic data that were available for our analysis. In each interpretation study, we used rock physics modeling to explain how and why certain geological conditions caused differences in P and S reflectivities that resulted in P-wave seismic sequences and facies being different from depth-equivalent S-wave sequences and facies across the targets we studied.
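
    The differences between P-wave and S-wave seismic sequences mentioned above arise because P and S reflectivities at the same interface differ. A minimal normal-incidence sketch (the study itself uses fuller rock physics modeling and oblique-incidence modes):

```python
def normal_incidence_reflectivity(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient at an interface:
        R = (Z2 - Z1) / (Z2 + Z1),  with impedance Z = rho * v.
    Use Vp for P-P reflectivity or Vs for S-S reflectivity; where the
    P and S impedance contrasts differ, the two wave modes image the
    same stratal surface with different strengths."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)
```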

  1. Development of the Multi-Level Seismic Receiver (MLSR)

    SciTech Connect

    Sleefe, G.E.; Engler, B.P.; Drozda, P.M.; Franco, R.J.; Morgan, J.

    1995-02-01

    The Advanced Geophysical Technology Department (6114) and the Telemetry Technology Development Department (2664) have, in conjunction with the Oil Recovery Technology Partnership, developed a Multi-Level Seismic Receiver (MLSR) for use in crosswell seismic surveys. The MLSR was designed and evaluated with the significant support of many industry partners in the oil exploration industry. The unit was designed to record and process superior quality seismic data operating in severe borehole environments, including high temperature (up to 200{degrees}C) and static pressure (10,000 psi). This development has utilized state-of-the-art technology in transducers, data acquisition, and real-time data communication and data processing. The mechanical design of the receiver has been carefully modeled and evaluated to insure excellent signal coupling into the receiver.

  2. NSR&D Program Fiscal Year (FY) 2015 Call for Proposals Mitigation of Seismic Risk at Nuclear Facilities using Seismic Isolation

    SciTech Connect

    Coleman, Justin

    2015-02-01

    Seismic isolation (SI) has the potential to drastically reduce the seismic response of structures, systems, or components (SSCs) and therefore the risk associated with large seismic events (a large seismic event could be defined as the design basis earthquake (DBE) and/or the beyond design basis earthquake (BDBE), depending on the site location). This corresponds to a potential increase in nuclear safety by minimizing the structural response and thus minimizing the risk of material release during large seismic events, which have uncertainty associated with their magnitude and frequency. The national consensus standard, American Society of Civil Engineers (ASCE) Standard 4, Seismic Analysis of Safety Related Nuclear Structures, recently incorporated language and commentary for seismically isolating a large light water reactor or similar large nuclear structure. Some potential benefits of SI are: 1) substantially decoupling the SSC from the earthquake hazard, thus decreasing the risk of material release during large earthquakes; 2) cost savings for the facility and/or equipment; and 3) applicability to both nuclear (current and next generation) and high hazard non-nuclear facilities. Issue: To date, no one has evaluated how the reduction in seismic risk translates into reduced cost to construct a nuclear facility. Objective: Use seismic probabilistic risk assessment (SPRA) to evaluate the reduction in seismic risk and estimate the potential cost savings of seismically isolating a generic nuclear facility. This project would leverage ongoing Idaho National Laboratory (INL) activities that are developing advanced SPRA methods using Nonlinear Soil-Structure Interaction (NLSSI) analysis. Technical Approach: The proposed study is intended to obtain an estimate of the reduction in seismic risk and construction cost that might be achieved by seismically isolating a nuclear facility. The nuclear facility is a representative pressurized water reactor building, a nuclear power plant (NPP) structure. 
Figure 1: Project activities
The study will consider a representative NPP reinforced concrete reactor building and a representative plant safety system, and will leverage existing research and development (R&D) activities at INL. Figure 1 shows the proposed study steps, with the steps in blue representing activities already funded at INL and the steps in purple representing activities that would be funded under this proposal. The following results will be documented: 1) a comparison of seismic risk for the non-seismically isolated (non-SI) and seismically isolated (SI) NPP, and 2) an estimate of construction cost savings when implementing SI at the site of the generic NPP.
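At its core, the SPRA comparison described in this record amounts to convolving a site hazard curve with a component fragility curve to obtain an annual failure frequency; seismic isolation can then be represented, to first order, as an increase in the fragility median. A minimal sketch in Python with entirely hypothetical hazard and fragility parameters (the study's actual SPRA and NLSSI models are far more detailed):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical power-law hazard curve: annual frequency of exceeding PGA `a` (in g).
def hazard(a, k0=1e-4, k=2.5):
    return k0 * a ** (-k)

# Lognormal fragility: conditional failure probability given PGA = a.
def fragility(a, median, beta):
    return norm.cdf(np.log(a / median) / beta)

a = np.logspace(-2, 1, 2000)        # PGA grid, 0.01 g to 10 g
dH = -np.gradient(hazard(a), a)     # hazard density |dH/da| (positive)

def annual_failure_freq(median, beta=0.4):
    """Risk integral: P_f = integral of fragility(a) * |dH/da| da."""
    y = fragility(a, median, beta) * dH
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(a)))  # trapezoid rule

p_fixed = annual_failure_freq(median=0.9)  # non-isolated: assumed 0.9 g capacity
p_iso = annual_failure_freq(median=2.5)    # isolated: assumed higher capacity

print(f"non-SI: {p_fixed:.2e}/yr  SI: {p_iso:.2e}/yr")
```

Because the isolated fragility median is higher at every acceleration level, the isolated failure frequency comes out strictly lower; the ratio of the two is one way to express the risk reduction that SI buys.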

  3. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. For a modern geothermal system to achieve the greatest efficiency while remaining socially acceptable, it must be possible to manage the system so that potential impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of a traffic light: such a system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation-medium properties are not constant in time. We use a non-homogeneous seismicity model in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect peak ground-motion values to attenuate on average in the same way; the residual differences can then reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. 
We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values with fixed probabilities of exceedance. Those values can finally be compared with the observations by using appropriate statistical tests.
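The non-homogeneous hazard idea above can be sketched by letting the Gutenberg-Richter parameters vary between time windows and summing exceedance contributions over the magnitude distribution. The rate, b-value, and attenuation coefficients below are illustrative placeholders, not values fitted to The Geysers dataset:

```python
import numpy as np
from scipy.stats import norm

# Time-varying Gutenberg-Richter recurrence: annual rate of events with M >= m.
def gr_rate(m, a_t, b_t):
    return 10.0 ** (a_t - b_t * m)

# Toy ground-motion model at a fixed distance: ln(PGA) is normal with a
# magnitude-dependent mean. Coefficients are illustrative, not calibrated.
def prob_exceed(pga, m, sigma=0.6):
    ln_mean = -4.0 + 1.2 * m
    return 1.0 - norm.cdf(np.log(pga), ln_mean, sigma)

def exceedance_rate(pga, a_t, b_t, mags=np.arange(1.0, 5.0, 0.1)):
    """Hazard integral over the magnitude distribution for one time window."""
    bin_rates = gr_rate(mags, a_t, b_t) - gr_rate(mags + 0.1, a_t, b_t)
    return float(np.sum(bin_rates * prob_exceed(pga, mags + 0.05)))

# Two windows with different seismicity parameters: stimulation raises the
# activity rate (higher a) and lowers the b-value (relatively more large events).
quiet = exceedance_rate(0.05, a_t=2.0, b_t=1.2)
stimulated = exceedance_rate(0.05, a_t=2.6, b_t=0.9)
print(stimulated > quiet)
```

Recomputing this rate window by window, as the a- and b-values are re-estimated from the incoming catalogue, is the essence of the time-evolving hazard assessment the abstract describes.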

  4. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    SciTech Connect

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35{degrees}--39{degrees}N and longitudes 87{degrees}--92{degrees}W. Most of these earthquakes occur within a 1.5{degrees} x 2{degrees} zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used in a complementary manner with regional seismic network data and, in addition, in verifying the correctness of previously published focal-mechanism solutions.

  5. Overview of seismic panel activities

    SciTech Connect

    Bandyopadhyay, K.K.

    1991-01-01

    In January 1991, the DOE-EM appointed a Seismic Panel to develop seismic criteria that can be used for evaluation of underground storage tanks containing high level radioactive wastes. The Panel expects to issue the first draft of the criteria report in January 1992. This paper provides an overview of the Panel's activities and briefly discusses the criteria. 3 refs.

  6. Development of a HT seismic downhole tool.

    SciTech Connect

    Maldonado, Frank P.; Greving, Jeffrey J.; Henfling, Joseph Anthony; Chavira, David J.; Uhl, James Eugene; Polsky, Yarom

    2009-06-01

    Enhanced Geothermal Systems (EGS) require the stimulation of the drilled well, likely through hydraulic fracturing. Whether fracturing of the rock occurs by shear destabilization of natural fractures or by extensional failure of weaker zones, control of the fracture process will be required to create the flow paths necessary for effective heat mining. As such, microseismic monitoring provides one method for real-time mapping of the fractures created during the hydraulic fracturing process. This monitoring is necessary to help assess stimulation effectiveness and provide the information necessary to properly create the reservoir. In addition, reservoir monitoring of the microseismic activity can provide information on reservoir performance and evolution over time. To our knowledge, no seismic tool exists that will operate above 125°C for the long monitoring durations that may be necessary. Replacing failed tools is costly and introduces potential errors such as depth variance. Sandia has designed a high-temperature seismic tool for long-term deployment in geothermal applications. It is capable of detecting microseismic events and operating continuously at temperatures up to 240°C. This project includes the design and fabrication of two High Temperature (HT) seismic tools that will have the capability to operate in both temporary and long-term monitoring modes. To ensure the developed tool meets industry requirements for high sampling rates (>2 ksps) and high resolution (24-bit Analog-to-Digital Converter), two electronic designs will be implemented. One design will utilize newly developed 200°C electronic components; the other will use qualified Silicon-on-Insulator (SOI) devices and will have a continuous operating temperature of 240°C.
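The digitizer requirements quoted above (>2 ksps sampling, 24-bit ADC) can be sanity-checked with a few lines of arithmetic; the full-scale voltage below is an assumed example value, not taken from the tool specification:

```python
import math

bits = 24                                  # stated ADC resolution
full_scale_v = 5.0                         # assumed full-scale input (example only)
lsb_v = full_scale_v / 2 ** bits           # smallest resolvable voltage step
dyn_range_db = 20 * math.log10(2 ** bits)  # ideal dynamic range, ~144.5 dB

fs_sps = 2000.0                            # stated minimum sampling rate (>2 ksps)
nyquist_hz = fs_sps / 2                    # usable microseismic signal bandwidth

print(f"LSB = {lsb_v * 1e6:.3f} uV, dynamic range = {dyn_range_db:.1f} dB, "
      f"bandwidth = {nyquist_hz:.0f} Hz")
```

A sub-microvolt step size and roughly 144 dB of ideal dynamic range is what lets a 24-bit digitizer capture both weak microseismic events and larger signals on one channel, and 2 ksps leaves a 1 kHz bandwidth for the recorded waveforms.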

  7. Seismic Prediction While Drilling (SPWD): Seismic exploration ahead of the drill bit using phased array sources

    NASA Astrophysics Data System (ADS)

    Jaksch, Katrin; Giese, Rüdiger; Kopf, Matthias

    2010-05-01

    When drilling for deep reservoirs, prior exploration is indispensable. In recent years the focus has shifted to geological structures such as thin layers or hydrothermal fault systems. Surface 2D or 3D seismics and borehole seismic measurements such as Vertical Seismic Profiling (VSP) or Seismic While Drilling (SWD) cannot always resolve these structures: the resolution worsens the deeper and smaller the sought-after structures are. Potential horizons, such as thin layers in oil exploration or fault zones usable for geothermal energy production, could therefore be missed or not identified while drilling. A device to explore the geology at high resolution ahead of the drill bit, in the direction of drilling, would thus be of great value. It would allow the drilling path to be adjusted according to the actual geology and would minimize the exploration risk and hence the drilling costs. Within the project SPWD, a device for seismic exploration ahead of the drill bit is being developed. It should allow seismic prediction of areas about 50 to 100 meters ahead of the drill bit with a resolution of one meter. At the GFZ a first prototype, consisting of separate units for seismic sources, receivers and data loggers, has been designed and manufactured. Four standard magnetostrictive actuators serve as seismic sources and four 3-component geophones as receivers. Every unit, actuator or geophone, can be rotated in steps of 15° around the longitudinal axis of the prototype to test different measurement configurations. The SPWD prototype emits signal frequencies of about 500 up to 5000 Hz, significantly higher than in VSP and SWD. An increased radiation of seismic wave energy in the direction of the borehole axis allows imaging of the areas still to be drilled. 
Therefore, every actuator must be controlled independently with respect to the amplitude and phase of the source signal, in order to maximize the energy of the seismic source and reach a sufficient exploration range. The next step for focusing is to apply the phased-array method: depending on the seismic wave velocities of the surrounding rock, the spacing of the actuators and the frequencies used, the signal phase for each actuator can be determined. Over the past year, several measurements with the prototype have been carried out under defined conditions at a test site in a mine. The test site consists of a rock block of about 100 by 200 meters surrounded by three galleries. For testing the prototype, two horizontal boreholes were drilled, directed toward one of the galleries to obtain a strong reflector. The borehole seismic data show an overall good signal-to-noise ratio in their amplitudes and frequency spectra; data quality correlates strongly with the fracture density along the borehole, with highly fractured zones associated with a lower signal-to-noise ratio. Additionally, the geophones of the prototype record reflections from ahead and from the rear in the seismic data. In particular, the reflections from the gallery ahead are used for the calibration of the focusing. The direct seismic wave field shows distinct compressional and shear waves. The analysis of several seismic measurements, focusing on the direct seismic waves, shows that the phased-array technique can clearly influence the directional characteristics of the radiated seismic waves: the amplitudes can be enhanced up to three times in the desired direction and simultaneously attenuated in the reverse direction. A major step toward directional investigation in boreholes has been accomplished. 
However, the focusing of the seismic waves has to be improved in further measurements by calibrating the initiating source signals, in order to maximize the energy in the desired direction. As a next step, the development of a wireline prototype for application in vertical boreholes with depths of up to 2000 meters is planned for this year. The prototype must be modified and adapted to the pressure and temperature conditions in deep boreholes. This project is funded by the German Federal Environment Ministry.
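The phased-array focusing described above amounts to computing per-actuator firing delays so that the wavefronts add constructively along the borehole axis (end-fire steering). A sketch with illustrative values for rock velocity, actuator spacing, and source frequency (none are taken from the SPWD prototype specification):

```python
import numpy as np

v = 4000.0        # assumed P-wave velocity of the surrounding rock, m/s
spacing = 0.5     # assumed actuator spacing along the tool axis, m
f = 2000.0        # source frequency, within the stated 500-5000 Hz band
n = 4             # four magnetostrictive actuators, as in the prototype

# End-fire steering: delay each actuator by the travel time from the previous
# one so the wavefronts add constructively in the drilling direction.
delays_s = np.arange(n) * spacing / v
phases_rad = (2 * np.pi * f * delays_s) % (2 * np.pi)

for i, (t, p) in enumerate(zip(delays_s, phases_rad)):
    print(f"actuator {i}: delay {t * 1e6:6.1f} us, phase {np.degrees(p):5.1f} deg")
```

With these values each actuator fires 125 µs (90° of phase at 2 kHz) after its neighbor; in practice the delays would be recalibrated from the measured velocities of the surrounding rock, which is exactly the calibration step the abstract names as ongoing work.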

  8. Global overview of subduction seismicity

    NASA Astrophysics Data System (ADS)

    Funiciello, F.; Presti, D.; Heuret, A.; Piromallo, C.

    2013-12-01

    In the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', we propose the first global overview of subduction seismicity. Previous studies have focused on interplate seismicity, intraslab seismicity, upper-plate deformation, or the relation between interplate and intraslab seismicity, but the three components of subduction seismicity have never been addressed in a systematic and exhaustive study. To allow such a study, nodal planes and seismic moments of worldwide subduction-related earthquakes have been extracted from the EHB hypocenter and Harvard CMT catalogues for the period 1976 - 2007. Data were collected for centroid depths between sea level and 700 km and for magnitudes Mw ≥ 5.5. For each subduction zone, a set of trench-normal transects was constructed, choosing a 120 km cross-section width on each side of a vertical plane and a spacing of 1 degree along the trench. For each of the 505 resulting transects, the whole subduction seismogenic zone was mapped as focal mechanisms projected onto a vertical plane after their faulting-type classification according to the Aki-Richards convention. Transect by transect, the seismicity that can be considered unrelated to the subduction process under investigation was first removed; then the upper-plate seismicity (i.e. earthquakes generated within the upper plate as a result of the subduction process) was selected. After removing from the resulting event subset the interplate seismicity identified in the framework of this project by Heuret et al. (2011), we can be reasonably confident that the remaining seismicity is related to the subducting plate. Among these earthquakes we then selected the shallow (0-70 km), intermediate (70-300 km) and deep (300-660 km) seismicity. Following Heuret et al. 
(2011), the 505 transects were merged into 62 larger segments that are ideally homogeneous in terms of their seismogenic-zone characteristics. For each subduction zone around the world, interplate, intraslab and upper-plate seismicity have been estimated and compared with each other through several parameters (seismic rate, moment-release rate, maximum expressed magnitude) in order to obtain a snapshot of the general behaviour of global subduction-related seismicity. In a second step, the seismological parameters have been compared to long-term geodynamic parameters (e.g., subduction velocity, plate and trench absolute motions, slab age, thermal parameter and geometry, sediment thickness at the trench) with the aim of finding possible cause-effect relationships.
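The depth binning used in the study (shallow 0-70 km, intermediate 70-300 km, deep 300-660 km) is straightforward to encode; the sample centroid depths below are hypothetical:

```python
# Depth classes used in the study for subducting-plate seismicity.
def classify_depth(depth_km):
    if 0 <= depth_km < 70:
        return "shallow"
    if 70 <= depth_km < 300:
        return "intermediate"
    if 300 <= depth_km <= 660:
        return "deep"
    return None  # outside the depth range considered

# Hypothetical centroid depths in km.
events_km = [12.0, 95.5, 410.0, 700.0]
print([classify_depth(d) for d in events_km])
```

Applying such a classifier per transect, after stripping upper-plate and interplate events, yields the per-bin event counts from which the seismic rates and moment-release rates are compared across the 62 segments.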

  9. Key aspects governing induced seismicity

    NASA Astrophysics Data System (ADS)

    Buijze, Loes; Wassing, Brecht; Fokker, Peter

    2013-04-01

    In the past decades, numerous examples of earthquakes induced by human-induced changes in subsurface fluid pressures have been reported. This poses a major threat to the future development of some of these operations and calls for an understanding and quantification of the seismicity generated. From geomechanical considerations and insights from laboratory experiments, the factors controlling induced seismicity may be grouped into four categories: the magnitude of the stress disturbance, the pre-existing stress conditions, the reservoir/fault rock properties and the local geometry. We investigated whether the (relative) contributions of these factors and their influence on the magnitudes generated can be recognized by looking at the entire dataset of reported cases of induced seismicity as a whole, and what this might imply for future developments. An extensive database has been built from over 160 known cases of induced seismicity worldwide, incorporating the relevant geological, seismological and fluid-related parameters. The cases studied include hydrocarbon depletion and secondary recovery, waste-water injection, (enhanced) geothermal systems and hydraulic fracturing, with observed magnitudes ranging from less than -1.5 to 7. The parameters taken into account were based on the theoretical background of the mechanisms of induced seismicity and include the injection/depletion-related parameters, (spatial) characteristics of the seismicity, lithological properties and the local stress situation. Correlations between the seismic response and the geological/geomechanical characteristics of the various sites were investigated. The injected/depleted volumes and the scale of the activities are major controlling factors on the maximum magnitudes generated. Spatial signatures of the seismicity, such as its depth and lateral spread, were observed to be distinct for different activities, which is useful when considering future operations. 
Where available, the local stress situation is considered, as well as the influence of natural seismicity. Finally, we related induced seismicity to several reservoir and fault-rock properties, including fault-rock stability as observed in the laboratory. The c