Science.gov

Sample records for verifying seismic design

  1. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, these schemes are not true designated verifier signature schemes in the traditional sense, because the designated person lacks the capability to efficiently simulate a signature that is indistinguishable from one produced by the signer. This fails to satisfy the requirements of special environments such as e-voting, calls for tender, and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. By the properties of unitary transformations and a quantum one-way function, only a verifier designated by the signer can verify the validity of a signature, and the designated verifier cannot prove to a third party that the signature was produced by the signer or by himself, owing to a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of this scheme. Analysis results show that the new scheme satisfies the main security requirements of a designated verifier signature scheme and resists the major attack strategies.

  2. Position paper: Seismic design criteria

    SciTech Connect

    Farnworth, S.K.

    1995-05-22

The purpose of this paper is to document the seismic design criteria to be used in the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies for determining the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  3. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps (a) identify architecturally significant deviations that eluded code reviews, (b) clarify the design rules to the team, and (c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  4. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically, as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis and the "correctness" models.

  5. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language, HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S with respect to these criteria are examined and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  6. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    SciTech Connect

Hofmayer, C.; Miller, C.; Wang, Y.; Costello, J.

    2003-07-15

A research effort was undertaken to determine the need for any changes to USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement-based seismic design methods, such as given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement-based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results between the analyses, and to develop insights regarding the work that would be needed before the displacement-based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.

  7. Simplified seismic performance assessment and implications for seismic design

    NASA Astrophysics Data System (ADS)

    Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

    2014-08-01

The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios including different structural configurations.

  8. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which experience has shown to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  9. Seismic design parameters - A user guide

    USGS Publications Warehouse

    Leyendecker, E.V.; Frankel, A.D.; Rukstales, K.S.

    2001-01-01

The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings (1997 NEHRP Provisions) introduced a seismic design procedure that is based on the explicit use of spectral response acceleration rather than the traditional peak ground acceleration and/or peak ground velocity or zone factors. The spectral response accelerations are obtained from spectral response acceleration maps accompanying the report. Maps are available for the United States and a number of U.S. territories. Since 1997, additional codes and standards have also adopted seismic design approaches based on the same procedure used in the NEHRP Provisions and the accompanying maps. The design documents using the 1997 NEHRP Provisions procedure may be divided into three categories: (1) Design of New Construction, (2) Design and Evaluation of Existing Construction, and (3) Design of Residential Construction. A CD-ROM has been prepared for use in conjunction with the design documents in each of these three categories. The spectral accelerations obtained using the software on the CD are the same as those that would be obtained by using the maps accompanying the design documents. The software has been prepared to operate on a personal computer using a Windows (Microsoft Corporation) operating environment and a point-and-click interface. The user can obtain the spectral acceleration values that would be obtained by use of the maps accompanying the design documents, include site factors appropriate for the Site Class provided by the user, calculate a response spectrum that includes the site factor, and plot the response spectrum. Sites may be located by providing the latitude-longitude or zip code for all areas covered by the maps. All of the maps used in the various documents are also included on the CD-ROM.
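The general design-spectrum procedure described above can be summarized in a short sketch. The two-point spectrum shape below (a short-period plateau SDS and a 1-s value SD1 built from mapped accelerations Ss and S1 and site coefficients Fa and Fv) follows the published NEHRP general procedure, but the numerical inputs in the example are illustrative, not values read from the accompanying maps.

```python
def design_spectrum(Ss, S1, Fa, Fv, periods):
    """NEHRP-style general design spectrum (a sketch).

    Ss, S1 : mapped short-period and 1-s spectral accelerations (g)
    Fa, Fv : site coefficients for the given Site Class
    Returns the list of spectral accelerations Sa(T) for the given periods.
    """
    SDS = (2.0 / 3.0) * Fa * Ss      # design short-period acceleration
    SD1 = (2.0 / 3.0) * Fv * S1      # design 1-s acceleration
    T0 = 0.2 * SD1 / SDS             # start of the plateau
    Ts = SD1 / SDS                   # end of the plateau
    Sa = []
    for T in periods:
        if T < T0:                   # rising branch
            Sa.append(SDS * (0.4 + 0.6 * T / T0))
        elif T <= Ts:                # constant-acceleration plateau
            Sa.append(SDS)
        else:                        # constant-velocity branch
            Sa.append(SD1 / T)
    return Sa

# Illustrative inputs: Ss = 1.5 g, S1 = 0.6 g, site coefficients Fa = 1.0, Fv = 1.5.
sa = design_spectrum(1.5, 0.6, 1.0, 1.5, [0.0, 0.5, 1.0, 2.0])
```

With these inputs SDS = 1.0 g and SD1 = 0.6 g, so the plateau runs from T0 = 0.12 s to Ts = 0.6 s and the spectrum decays as SD1/T beyond it.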

  10. The Relationship Between Verified Organ Donor Designation and Patient Demographic and Medical Characteristics.

    PubMed

    Sehgal, N K R; Scallan, C; Sullivan, C; Cedeño, M; Pencak, J; Kirkland, J; Scott, K; Thornton, J D

    2016-04-01

    Previous studies on the correlates of organ donation consent have focused on self-reported willingness to donate and on self-reported medical suitability to donate. However, these may be subject to social desirability bias and inaccurate assessments of medical suitability. The authors sought to overcome these limitations by directly verifying donor designation on driver's licenses and by abstracting comorbid conditions from electronic health records. Using a cross-sectional study design, they reviewed the health records of 2070 randomly selected primary care patients at a large urban safety-net medical system to obtain demographic and medical characteristics. They also examined driver's licenses that were scanned into electronic health records as part of the patient registration process for donor designation. Overall, 943 (46%) patients were designated as a donor on their driver's license. On multivariate analysis, donor designation was positively associated with age 35-54 years, female sex, nonblack race, speaking English or Spanish, being employed, having private insurance, having an income >$45 000, and having fewer comorbid conditions. These demographic and medical characteristics resulted in patient subgroups with donor designation rates ranging from 21% to 75%. In conclusion, patient characteristics are strongly related to verified donor designation. Further work should tailor organ donation efforts to specific subgroups. PMID:26603147

  11. Tritium glovebox stripper system seismic design evaluation

    SciTech Connect

    Grinnell, J. J.; Klein, J. E.

    2015-09-01

The use of glovebox confinement at US Department of Energy (DOE) tritium facilities has been discussed in numerous publications. Glovebox confinement protects the workers from radioactive material (especially tritium oxide), provides an inert atmosphere for prevention of flammable gas mixtures and deflagrations, and allows recovery of tritium released from the process into the glovebox when a glovebox stripper system (GBSS) is part of the design. Tritium recovery from the glovebox atmosphere reduces emissions from the facility and the radiological dose to the public. Locating US DOE defense programs facilities away from public boundaries also aids in reducing radiological doses to the public. This study is based upon design concepts and identifies issues and considerations for the design of a seismic GBSS. The safety requirements and analysis presented should be considered preliminary; safety requirements for the design of a GBSS should be developed and finalized as part of the final design process.

  12. A Preliminary study on the seismic conceptual design

    NASA Astrophysics Data System (ADS)

    Zhao, Zhen; Xie, Lili

    2014-08-01

Seismic conceptual design is an essential part of seismic design codes. The term "seismic conceptual design" implies three aspects: the given concept itself, the specific provisions related to that concept, and designing in accordance with those provisions. Seismic conceptual design can be classified into two categories: the strict (or traditional) seismic conceptual design and the generalized seismic conceptual design. The authors define the connotations of both and study their characteristics, in particular the differences between them. Although the two sound very close, their differences are apparent: strict conceptual designs are usually worked out directly from engineering practice and/or lessons learned from earthquake damage, while generalized conceptual designs result from a series of visions aiming to realize the general objectives of the seismic codes. Strict (traditional) conceptual designs are indispensable elements of seismic codes for making designed structures safer, and generalized conceptual designs play a key role in directing codes toward more advanced and effective seismic design.

  13. Feasibility study and verified design concept for new improved hot gas facility

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The MSFC Hot Gas Facility (HGF) was fabricated in 1975 as a temporary facility to provide immediate turnaround testing to support the SRB and ET TPS development. This facility proved to be very useful and was used to make more than 1300 runs, far more than ever intended in the original design. Therefore, it was in need of constant repair and needed to be replaced with a new improved design to support the continuing SRB/ET TPS product improvement and/or removal efforts. MSFC contracted with Lockheed-Huntsville to work on this improved design through contract NAS8-36304 Feasibility Study and Verified Design Concept for the New Improved Hot Gas Facility. The results of Lockheed-Huntsville's efforts under this contract are summarized.

  14. Guidelines for the seismic design of fire protection systems

    SciTech Connect

Benda, B.; Cushing, R.; Driesen, G.E.

    1991-01-01

    The engineering knowledge gained from earthquake experience data surveys of fire protection system components is combined with analytical evaluation results to develop guidelines for the design of seismically rugged fire protection distribution piping. The seismic design guidelines of the National Fire Protection Association Standard NFPA-13 are reviewed, augmented, and summarized to define an efficient method for the seismic design of fire protection piping systems. 8 refs.

  16. Design of the IPIRG-2 simulated seismic forcing function

    SciTech Connect

    Olson, R.; Scott, P.; Wilkowski, G.

    1996-02-01

    A series of pipe system experiments was conducted in IPIRG-2 that used a realistic seismic forcing function. Because the seismic forcing function was more complex than the single-frequency increasing-amplitude sinusoidal forcing function used in the IPIRG-1 pipe system experiments, considerable effort went into designing the function. This report documents the design process for the seismic forcing function used in the IPIRG-2 pipe system experiments.

  17. Implied preference for seismic design level and earthquake insurance.

    PubMed

    Goda, K; Hong, H P

    2008-04-01

Seismic risk can be reduced by implementing newly developed seismic provisions in design codes. Furthermore, financial protection or enhanced utility and happiness for stakeholders could be gained through the purchase of earthquake insurance; if this were not so, there would be no market for such insurance. However, the perceived benefit associated with insurance is not universally shared by stakeholders, partly due to their diverse risk attitudes. This study investigates the implied seismic design preference with insurance options for decision-makers of bounded rationality whose preferences could be adequately represented by the cumulative prospect theory (CPT). The investigation is focused on assessing the sensitivity of the implied seismic design preference with insurance options to model parameters of the CPT and to fair and unfair insurance arrangements. Numerical results suggest that human cognitive limitation and risk perception can affect the implied seismic design preference by the CPT significantly. The mandatory purchase of fair insurance will lead the implied seismic design preference to the optimum design level that is dictated by the minimum expected lifecycle cost rule. Unfair insurance decreases the expected gain as well as its associated variability, which is preferred by risk-averse decision-makers. The obtained results of the implied preference for the combination of the seismic design level and insurance option suggest that property owners, financial institutions, and municipalities can take advantage of affordable insurance to establish successful seismic risk management strategies. PMID:18419667
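The cumulative prospect theory representation the study relies on can be illustrated with the standard Tversky-Kahneman value function. The sketch below uses the commonly cited 1992 parameter estimates (alpha = 0.88, lambda = 2.25) purely for illustration; the paper's own parameter choices may differ.

```python
def cpt_value(x, alpha=0.88, lam=2.25):
    """Cumulative prospect theory value function: concave over gains,
    convex over losses, with losses weighted more heavily (lam > 1,
    i.e. loss aversion)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

# Loss aversion: a $100 loss looms larger than a $100 gain.
gain = cpt_value(100.0)
loss = cpt_value(-100.0)
```

Because gains are valued concavely and losses are scaled by lam, |cpt_value(-100)| exceeds cpt_value(100), which is the mechanism behind the diverse insurance preferences the abstract describes.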

  18. Seismic design accelerations for the LSST telescope

    NASA Astrophysics Data System (ADS)

    Neill, Douglas R.; Warner, Mike; Sebag, Jacques

    2012-09-01

    The Large Synoptic Survey Telescope will be located on a seismically active Chilean mountain. Seismic ground accelerations produce the telescope's most demanding load cases. Consequently, accurate prediction of these accelerations is required. These seismic accelerations, in the form of Peak Spectral Acceleration (PSA), were compared for site specific surveys, the Chilean building codes and measured seismic accelerations. Methods were also investigated for adjusting for variations in damping level and return period. The return period is the average interval of time between occurrences of a specific intensity.
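Under the usual Poisson assumption, the return period defined above and the probability of exceedance over an exposure time are related by p = 1 - exp(-t/T). A minimal sketch, showing how the common 10%-in-50-years and 2%-in-50-years design levels map to the familiar 475- and 2475-year return periods:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by a probability p_exceed of at least one
    exceedance in t_years, under a Poisson (memoryless) occurrence model."""
    return -t_years / math.log(1.0 - p_exceed)

def prob_exceed(T_return, t_years):
    """Probability of at least one exceedance in t_years, given return period T_return."""
    return 1.0 - math.exp(-t_years / T_return)

p475 = round(return_period(0.10, 50))    # 10% in 50 years -> 475-year return period
p2475 = round(return_period(0.02, 50))   # 2% in 50 years -> 2475-year return period
```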

  19. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    NASA Astrophysics Data System (ADS)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  20. Verifying single-station seismic approaches using Earth-based data: Preparation for data return from the InSight mission to Mars

    NASA Astrophysics Data System (ADS)

    Panning, Mark P.; Beucler, Éric; Drilleau, Mélanie; Mocquet, Antoine; Lognonné, Philippe; Banerdt, W. Bruce

    2015-03-01

The planned InSight mission will deliver a single seismic station containing 3-component broadband and short-period sensors to the surface of Mars in 2016. While much of the progress in understanding the Earth and Moon's interior has relied on the use of seismic networks for accurate location of sources, single station approaches can be applied to data returned from Mars in order to locate events and determine interior structure. In preparation for the data return from InSight, we use a terrestrial dataset recorded at the Global Seismic Network station BFO, located at the Black Forest Observatory in Germany, to verify an approach for event location and structure determination based on recordings of multiple orbit surface waves, which will be more favorable to record on Mars than on Earth due to smaller planetary radius and potentially lower background noise. With this approach applied to events near the threshold of observability on Earth, we are able to determine epicentral distance within approximately 1° (corresponding to ∼60 km on Mars), and origin time within ∼30 s. With back azimuth determined from Rayleigh wave polarization, absolute locations are determined generally within an aperture of 10°, allowing for localization within large tectonic regions on Mars. With these locations, we are able to recover Earth mantle structure within ±5% (the InSight mission requirements for martian mantle structure) using 1D travel time inversions of P and S travel times for datasets of only 7 events. The location algorithm also allows for the measurement of great-circle averaged group velocity dispersion, which we measure between 40 and 200 s to scale the expected reliable frequency range of the InSight data from Earth to Mars data. Using the terrestrial data, we are able to resolve structure down to ∼200 km, but synthetic tests demonstrate we should be able to resolve martian structure to ∼400 km with the same frequency content given the smaller planetary size.
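The multiple-orbit measurement geometry lends itself to a compact sketch. Assuming arrival times t1, t2, t3 for the minor-arc Rayleigh wave (R1), the major-arc wave (R2), and the minor-arc wave after one extra orbit (R3), the R3-R1 lag gives the great-circle group velocity and the R2-R1 lag then gives epicentral distance and origin time. The function, variable names, and numbers below are hypothetical and synthetic, not InSight or BFO data.

```python
def locate_from_orbits(t1, t2, t3, circumference_km):
    """Single-station location from multiple-orbit surface-wave arrivals.

    R1 and R3 follow the same minor-arc path, R3 with one extra full orbit,
    so t3 - t1 yields the average group velocity; R2 travels the major arc,
    so t2 - t1 then fixes the epicentral distance and the origin time.
    Returns (group_velocity_km_s, distance_km, origin_time_s).
    """
    U = circumference_km / (t3 - t1)                 # group velocity
    dist = 0.5 * (circumference_km - U * (t2 - t1))  # minor-arc distance
    t0 = t1 - dist / U                               # origin time
    return U, dist, t0

# Synthetic example: ~Mars circumference, U = 4 km/s, distance 3000 km,
# origin at t = 0 s, giving arrivals at 750, 4586, and 6086 s.
U, dist, t0 = locate_from_orbits(750.0, 4586.0, 6086.0, 21344.0)
```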

  1. Seismic design and evaluation criteria based on target performance goals

    SciTech Connect

    Murray, R.C.; Nelson, T.A.; Kennedy, R.P.; Short, S.A.

    1994-04-01

The Department of Energy utilizes deterministic seismic design/evaluation criteria developed to achieve probabilistic performance goals. These seismic design and evaluation criteria are intended to apply equally to the design of new facilities and to the evaluation of existing facilities. In addition, the criteria are intended to cover design and evaluation of buildings, equipment, piping, and other structures. Four separate sets of seismic design/evaluation criteria have been presented, each with a different performance goal. In all these criteria, earthquake loading is selected from seismic hazard curves on a probabilistic basis, but seismic response evaluation methods and acceptable behavior limits are deterministic approaches with which design engineers are familiar. For analytical evaluations, conservatism has been introduced through the use of conservative inelastic demand-capacity ratios combined with ductile detailing requirements, through the use of minimum specified material strengths and conservative code capacity equations, and through the use of a seismic scale factor. For evaluation by testing or by experience data, conservatism has been introduced through the use of an increased scale factor that is applied to the prescribed design/evaluation input motion.

  2. Cost reduction through improved seismic design

    SciTech Connect

    Severud, L.K.

    1984-01-01

During the past decade, many significant seismic technology developments have been accomplished by United States Department of Energy (USDOE) programs. Both base technology and major projects, such as the Fast Flux Test Facility (FFTF) and the Clinch River Breeder Reactor (CRBR) plant, have contributed to seismic technology development and validation. Improvements have come in the areas of ground motion definitions, soil-structure interaction, and structural analysis methods and criteria for piping, equipment, components, reactor core, and vessels. Examples of some of these lessons learned and technology developments are provided. Then, the highest-priority seismic technology needs achievable through DOE actions and sponsorship are identified and discussed. Satisfaction of these needs is expected to make important contributions toward cost avoidance and reduced capital costs of future liquid metal nuclear plants. 23 references, 12 figures.

  3. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  4. Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings

    NASA Astrophysics Data System (ADS)

    Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

    2012-09-01

Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage is still reported for newly designed RC buildings. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analysis using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three different frames designed according to the 2010 CCSDB meet the seismic requirements and are almost at the same safety level.
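Fragility curves derived from a PSDM are conventionally written in lognormal form, P(demand >= capacity | IM) = Phi((ln IM - ln theta) / beta), with median theta and dispersion beta estimated from the nonlinear time-history results. A minimal sketch with illustrative parameters, not values fitted in the paper:

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility: probability that demand reaches capacity at
    intensity measure im, i.e. Phi((ln im - ln median) / beta), where Phi
    is the standard normal CDF (written here via math.erf)."""
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative parameters: median capacity 1.2 g of spectral acceleration,
# total dispersion beta = 0.5.
p_median = fragility(1.2, 1.2, 0.5)   # at the median IM the probability is 0.5
```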

  5. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Unlike a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan Ms 8.0 earthquake is introduced as an example in this paper.
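Empirical relations of the kind described above are typically log-linear fits, log10(D) = a + b * Ms. The sketch below uses placeholder coefficients of the order seen in global regressions; the East Asia coefficients actually fitted in the paper are not given in the abstract.

```python
def max_fault_displacement_m(Ms, a=-5.46, b=0.82):
    """Log-linear relation log10(D) = a + b * Ms between maximum fault
    displacement D (meters) and surface-wave magnitude Ms.

    The default coefficients a, b are illustrative placeholders, not the
    regression values developed in the paper.
    """
    return 10.0 ** (a + b * Ms)

# Under these placeholder coefficients a magnitude-8 event implies a
# maximum displacement on the order of ten meters.
d8 = max_fault_displacement_m(8.0)
```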

  6. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  7. Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy

    NASA Astrophysics Data System (ADS)

    Ramanathan, Karthik Narayan

    Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial to pre-earthquake planning and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes such as traffic disruption of the overall highway system, impact on the region's economy and post-earthquake response and recovery, and more recently serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (these events effected changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models are developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loadings. Fragility curves were then developed considering the vulnerability of multiple components and thereby help to quantify the performance of highway bridge networks and to study the impact of seismic design principles on the performance within a bridge class. 
This not only leads to the development of fragility relations that are unique and better suited for bridges in California, but also leads to the creation of better bridge classes and sub-bins that have more consistent performance characteristics than those currently provided by the National Bridge Inventory. Another important feature of this research is associated with the development of damage state definitions and grouping of bridge components in a way that they have similar consequences in terms of repair and traffic implications following a seismic event. These definitions are in alignment with the California Department of Transportation’s design and operational experience, thereby enabling better performance assessment, emergency response, and management in the aftermath of a seismic event. The fragility curves developed as a part of this research will be employed in ShakeCast, a web-based post-earthquake situational awareness application that automatically retrieves earthquake shaking data and generates potential damage assessment notifications for emergency managers and responders.

  8. Review of seismicity and ground motion studies related to development of seismic design at SRS

    SciTech Connect

    Stephenson, D.E.; Acree, J.R.

    1992-08-01

    The NRC response spectrum developed in Regulatory Guide 1.60 is being used in the studies related to restarting the existing Savannah River Site (SRS) reactors. Because it envelops all of the other site-specific spectra that have been developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this value or at these probability levels. This spectral shape is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located across the Savannah River from the SRS. This report provides a summary of the database used to develop the design basis earthquake, including the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided of the studies performed and methodologies used to establish the design basis earthquake for SRS. The ground motion response spectra developed from the various studies are also summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. The programs for resolving outstanding issues are discussed and conclusions are presented.

  9. A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1992-01-01

    Schneider demonstrates that many fault-tolerant clock synchronization algorithms can be represented as refinements of a single proven-correct paradigm. Shankar provides a mechanical proof that Schneider's schema achieves Byzantine fault-tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault-tolerant midpoint convergence function satisfies three of the constraints. A hardware design is presented, implementing the fault-tolerant midpoint function, which is shown to satisfy the remaining constraints. The synchronization circuit will recover completely from transient faults provided the maximum fault assumption is not violated. The initialization protocol for the circuit also provides a recovery mechanism from total system failure caused by correlated transient faults.
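
    The fault-tolerant midpoint convergence function at the heart of this circuit is simple to state: discard the f smallest and f largest clock readings, then take the midpoint of the extremes of what remains. A sketch, assuming the standard n > 3f bound for this function:

```python
def ft_midpoint(readings, f):
    """Fault-tolerant midpoint convergence function: drop the f lowest
    and f highest clock readings, then return the midpoint of the
    extremes of what remains.  Tolerates up to f Byzantine-faulty
    clocks when the number of readings n satisfies n > 3f."""
    if len(readings) <= 3 * f:
        raise ValueError("need n > 3f readings to tolerate f faults")
    trimmed = sorted(readings)[f:len(readings) - f]
    return (trimmed[0] + trimmed[-1]) / 2.0
```

    Because every surviving extreme is bracketed by correct readings, a single faulty clock cannot drag the result outside the range spanned by the correct ones.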

  10. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 12 2011-01-01 2011-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  11. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, Thomas M.; Rohay, Alan C.; Reidel, Steve; Gardner, Martin G.

    2007-02-27

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase of up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and the adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis.
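
    Enveloping a family of candidate spectra at roughly the 84th percentile, as described above, amounts to taking a percentile ordinate period by period. A sketch with a hand-rolled linear-interpolation percentile (the actual analysis operated on full probabilistic hazard results, not a small sample like this):

```python
import math

def percentile(values, p):
    """p-th percentile (0-100) with linear interpolation."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = math.floor(k), math.ceil(k)
    if lo == hi:
        return s[lo]
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def envelope_spectra(spectra, p=84.0):
    """Period-by-period percentile envelope over candidate response
    spectra (each spectrum is a list of spectral ordinates sampled at
    the same set of periods)."""
    return [percentile(ordinates, p) for ordinates in zip(*spectra)]
```

    The resulting spectrum sits above most candidates at every period, which is what makes the envelope conservative.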

  12. Salt Repository Project input to seismic design: Revision 0. [Contains Glossary]

    SciTech Connect

    Not Available

    1987-12-01

    The Salt Repository Program (SRP) Input to Seismic Design (ISD) documents the assumptions, rationale, approaches, judgments, and analyses that support the development of seismic-specific data and information to be used for shaft design in accordance with the SRP Shaft Design Guide (SDG). The contents of this document are divided into four subject areas: (1) seismic assessment, (2) stratigraphy and material properties for seismic design, (3) development of seismic design parameters, and (4) host media stability. These four subject areas have been developed considering expected conditions at a proposed site in Deaf Smith County, Texas. The ISD should be used only in conjunction with seismic design of the exploratory and repository shafts. Seismic design considerations relating to surface facilities are not addressed in this document. 54 refs., 55 figs., 18 tabs.

  13. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  14. RCC for seismic design. [Roller-Compacted Concrete

    SciTech Connect

    Wong, N.C.; Forrest, M.P.; Lo, S.H. )

    1994-09-01

    This article describes how the use of roller-compacted concrete is saving $10 million on the seismic retrofit of Southern California's historic multiple-arch Littlerock Dam. Throughout its 70-year existence, the Littlerock Dam in Southern California's Angeles National Forest has been the subject of a question posed by its proximity to the San Andreas Fault: could this 28-arch dam withstand any major movement from that fault line, much less "the big one"? Working with the state's Division of Safety of Dams, Woodward-Clyde Consultants, Oakland, Calif., performed stability and stress analyses to find the answer. The evaluation showed that, as feared, the dam failed to meet required seismic safety criteria, principally due to its lack of lateral stability, a deficiency inherent in multiple-arch dams. To provide adequate seismic stability, the authors developed a rehabilitation design centered on the use of roller-compacted concrete (RCC) to construct a gravity section between and around the downstream portions of the existing buttresses. The authors also proposed that the arches be resurfaced and stiffened with steel-fiber-reinforced silica-fume concrete. An alternative design would have required filling the arch bays between the buttresses with mass concrete at a cost of $22.5 million. The RCC buttress repair construction, scheduled for completion this fall, will cost about $13 million.

  15. Fast Bayesian optimal experimental design for seismic source inversion

    NASA Astrophysics Data System (ADS)

    Long, Quan; Motamed, Mohammad; Tempone, Raúl

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.
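
    The paper's key observation, that the misfit Hessian grows with the number of receivers so the Laplace-approximated posterior tightens and the expected information gain saturates, can be illustrated with a one-parameter linear-Gaussian toy model (the sensitivity `g`, the variances, and the cost values below are invented for illustration, not the paper's elastodynamic forward model):

```python
import math

def laplace_eig(n_receivers, g=1.0, prior_var=1.0, noise_var=0.1):
    """Expected information gain (KL divergence from prior to posterior)
    for a linear-Gaussian toy source model.  The data-misfit Hessian
    grows linearly with the number of receivers, so the Laplace
    (Gaussian) posterior variance shrinks and the gain grows, but only
    logarithmically."""
    hessian = n_receivers * g * g / noise_var      # misfit Hessian
    post_var = 1.0 / (1.0 / prior_var + hessian)   # Laplace posterior
    return 0.5 * math.log(prior_var / post_var)

def best_receiver_count(candidates, cost_per_receiver=0.05):
    """Pick the candidate count maximizing gain minus a linear cost."""
    return max(candidates, key=lambda n: laplace_eig(n) - cost_per_receiver * n)
```

    Because the gain is logarithmic in the receiver count while cost is linear, trading one against the other yields a finite optimal number of receivers.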

  16. A New Event Detector Designed for the Seismic Research Observatories

    USGS Publications Warehouse

    Murdock, James N.; Hutt, Charles R.

    1983-01-01

    A new short-period event detector has been implemented on the Seismic Research Observatories. For each signal detected, a printed output gives estimates of the time of onset of the signal, direction of the first break, quality of onset, period and maximum amplitude of the signal, and an estimate of the variability of the background noise. On the SRO system, the new algorithm runs ~2.5x faster than the former (power level) detector. This increase in speed is due to the design of the algorithm: all operations can be performed by simple shifts, additions, and comparisons (floating point operations are not required). Even though a narrow-band recursive filter is not used, the algorithm appears to detect events competitively with those algorithms that employ such filters. Tests at Albuquerque Seismological Laboratory on data supplied by Blandford suggest performance commensurate with the on-line detector of the Seismic Data Analysis Center, Alexandria, Virginia.
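
    A detector restricted to shifts, additions, and comparisons can be sketched as a pair of exponentially weighted averages maintained with power-of-two update rates. This is a generic integer short-term/long-term-average trigger in that spirit, not the Murdock-Hutt recursion itself:

```python
def detect(samples, sta_shift=2, lta_shift=6, ratio_shift=2):
    """Integer STA/LTA-style event trigger using only shifts, adds and
    comparisons.  Each average is an exponential moving average updated
    as avg += (x - avg) >> shift; a trigger fires when the short-term
    average exceeds the long-term average left-shifted by ratio_shift
    (i.e. a ratio of 4), after a warm-up period."""
    sta = lta = 0
    triggers = []
    for i, x in enumerate(samples):
        a = x if x >= 0 else -x          # rectify the sample
        sta += (a - sta) >> sta_shift    # fast (short-term) average
        lta += (a - lta) >> lta_shift    # slow (long-term) average
        if i > (1 << lta_shift) and sta > (lta << ratio_shift):
            triggers.append(i)
    return triggers
```

    On a synthetic trace with a step onset, the trigger fires within a few samples of the onset and releases once the long-term average catches up.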

  17. Seismic isolation systems designed with distinct multiple frequencies

    SciTech Connect

    Wu, Ting-shu; Seidensticker, R.W.

    1991-01-01

    Two systems for seismic base isolation are presented. The main feature of these systems is that, instead of only one isolation frequency as in conventional isolation systems, they are designed to have two distinct isolation frequencies. When the responses during an earthquake exceed the design value(s), the system will automatically and passively shift to the second isolation frequency. Responses of these two systems to different ground motions, including a harmonic motion with a frequency equal to the primary isolation frequency, show that no excessive amplification will occur. Adoption of these new systems will greatly enhance the safety and reliability of an isolated superstructure against future strong earthquakes. 3 refs.
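
    The passive shift to a second isolation frequency can be mimicked in a toy single-degree-of-freedom simulation that swaps stiffness once the relative displacement crosses a threshold. All parameter values here are illustrative, and the paper's systems realize the shift mechanically rather than in software:

```python
import math

def isolator_response(ground_accel, dt, m=1.0, f1=0.5, f2=1.0,
                      d_limit=0.1, zeta=0.1):
    """Semi-implicit Euler stepping of an SDOF isolator that passively
    switches from its primary isolation frequency f1 to a second
    frequency f2 whenever relative displacement exceeds d_limit.
    Returns the peak absolute relative displacement."""
    k1 = m * (2.0 * math.pi * f1) ** 2
    k2 = m * (2.0 * math.pi * f2) ** 2
    c = 2.0 * zeta * math.sqrt(k1 * m)       # damping tied to f1
    x = v = peak = 0.0
    for ag in ground_accel:
        k = k2 if abs(x) > d_limit else k1   # passive stiffness shift
        a = -ag - (c * v + k * x) / m        # relative-motion equation
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak
```

    Driving the model harmonically at the primary frequency shows the point of the design: the switched system peaks well below the single-frequency resonant response (recovered by making `d_limit` effectively infinite).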

  19. Seismicity and seismic response of the Soviet-designed VVER (Water-cooled, Water moderated Energy Reactor) reactor plants

    SciTech Connect

    Ma, D.C.; Gvildys, J.; Wang, C.Y.; Spencer, B.W.; Sienicki, J.J.; Seidensticker, R.W.; Purvis, E.E. III

    1989-01-01

    On March 4, 1977, a strong earthquake occurred at Vrancea, Romania, about 350 km from the Kozloduy plant in Bulgaria. Subsequent to this event, construction of unit 2 of the Armenia plant was delayed over two years while seismic features were added. On December 7, 1988, another strong earthquake struck northwest Armenia about 90 km north of the Armenia plant. Extensive damage to residential and industrial facilities occurred in the vicinity of the epicenter. The earthquake did not damage the Armenia plant. Following this event, the Soviet government announced that the plant would be shut down permanently by March 18, 1989, and the station converted to a fossil-fired plant. This paper presents the results of the seismic analyses of the Soviet-designed VVER (Water-cooled, Water-moderated Energy Reactor) plants. Also presented is information concerning seismicity in the regions where VVERs are located and information on the seismic design of VVERs. The reference units are the VVER-440 model V230 (similar to the two units of the Armenia plant) and the VVER-1000 model V320 units at Kozloduy in Bulgaria. This document provides an initial basis for understanding the seismicity and seismic response of VVERs under seismic events. 1 ref., 9 figs., 3 tabs.

  20. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Seismic design and construction standards for new... Seismic design and construction standards for new buildings. (a) In the design and construction of...) 2002 American Society of Civil Engineers (ASCE) 7, Minimum Design Loads for Buildings and...

  1. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Seismic design and construction standards for new... Seismic design and construction standards for new buildings. (a) In the design and construction of...) 2002 American Society of Civil Engineers (ASCE) 7, Minimum Design Loads for Buildings and...

  2. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Seismic design and construction standards for new... Seismic design and construction standards for new buildings. (a) In the design and construction of...) 2002 American Society of Civil Engineers (ASCE) 7, Minimum Design Loads for Buildings and...

  3. Design of engineered cementitious composites for ductile seismic resistant elements

    NASA Astrophysics Data System (ADS)

    Kanda, Tetsushi

    This dissertation focuses on designing Engineered Cementitious Composites (ECCs) to achieve high-performance seismic resistant elements. To attain this goal, three major tasks were accomplished. Task 1 aims at achieving new ECCs incorporating low-cost fibers, which often exhibit fiber rupture in crack bridging and are thus named "fiber-rupture-type ECCs". Achieving the new ECC requires a new practical and comprehensive composite design theory. For this theory, single-fiber behavior was first investigated. Specifically, fiber rupture in the composite and chemical bond at the fiber/matrix interface were experimentally examined and mathematically modeled. This model for single-fiber behavior was then implemented in a proposed bridging law, a theoretical model for the relationship between the fiber bridging stress of the composite and the Crack Opening Displacement (COD). This new bridging law was finally employed to establish a new composite design theory. Task 2 was initiated to facilitate structural interpretation of ECC's material behavior investigated in Task 1. For this purpose, uniaxial tensile behavior, one of ECC's most important properties, was theoretically characterized by a stress-strain relation from a micromechanics viewpoint. As a result, a theory is proposed that expresses ECC's tensile stress-strain relation in terms of micromechanics parameters of the composite, such as bond strengths. Task 3 primarily demonstrates an integrated design scheme for ductile seismic elements that spans from micromechanics at the single-fiber level to structural design tools such as non-linear FEM analysis. The significance of this design scheme is that the influence of ECC's microstructure on an element's structural performance is quantitatively captured. This means that a powerful tool is obtained for tailoring constitutive micromechanics parameters in order to maximize the structural performance of elements. 
While the tool is still preliminary, completing this tool in future studies will enable one to optimally exploit the performance of constitutive materials, thus resulting in maximum structural safety with reasonable cost.

  4. Study of seismic design bases and site conditions for nuclear power plants

    SciTech Connect

    Not Available

    1980-04-01

    This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

  5. An Alternative Approach to "Identification of Unknowns": Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria.

    PubMed

    Martinez-Vaz, Betsy M; Denny, Roxanne; Young, Nevin D; Sadowsky, Michael J

    2015-12-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were significantly higher than pretest scores (p ≤ 0.001). Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001. PMID:26753033
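
    The normalized learning gains quoted (e.g. G = 0.46) follow Hake's standard definition, the fraction of the maximum possible improvement actually achieved:

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized learning gain: the achieved fraction of the
    possible improvement, G = (post - pre) / (100 - pre), with scores
    expressed as percentages."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)
```

    A class moving from 50% to 75% thus earns G = 0.5 regardless of how much headroom another class started with.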

  6. An Alternative Approach to Identification of Unknowns: Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria

    PubMed Central

    Martinez-Vaz, Betsy M.; Denny, Roxanne; Young, Nevin D.; Sadowsky, Michael J.

    2015-01-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were significantly higher than pretest scores (p ≤ 0.001). Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001. PMID:26753033

  7. Design Of Bridges For Non Synchronous Seismic Motion

    SciTech Connect

    Nuti, Camillo; Vanzi, Ivo

    2008-07-08

    This paper aims to develop and validate structural design criteria which account for the effects of earthquake spatial variability. In past works [1, 2] the two simplest forms of this problem were dealt with: differential displacements between two points belonging to the soil or to two single-degree-of-freedom structures. Seismic action was defined according to EC8 [3]; the structures were assumed to be linear elastic SDOF oscillators. Although this problem may seem trivial, existing code models appeared improvable on this aspect. For the differential displacements of two points on the ground, these results are now validated and generalized using the newly developed response spectra contained in the new Italian seismic code [4]; the resulting code formulation is presented. Next, the problem of statistically defining the differential displacement among any number of points on the ground (which is needed for continuous-deck bridges) is approached, and some preliminary results are shown. It is also shown that current code (e.g., EC8) rules may be improved on this aspect.

  8. Design and application of an electromagnetic vibrator seismic source

    USGS Publications Warehouse

    Haines, S.S.

    2006-01-01

    Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. In order to address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to about 10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well-suited to teaching applications, and to surveys requiring a precisely-repeatable source signature.
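
    A frequency-sweep drive signal of the kind the EMvibe produces can be generated by integrating a linearly ramping instantaneous frequency into the phase. The sample rate and duration below are arbitrary choices for illustration, not the instrument's:

```python
import math

def linear_sweep(f0, f1, duration_s, sample_rate):
    """Linear chirp: the instantaneous frequency ramps from f0 to f1
    over duration_s, so the phase is the integral
    2*pi*(f0*t + (f1 - f0)*t**2 / (2*duration_s))."""
    n = int(duration_s * sample_rate)
    rate = (f1 - f0) / duration_s
    return [math.sin(2.0 * math.pi * (f0 * t + 0.5 * rate * t * t))
            for t in (i / sample_rate for i in range(n))]
```

    Correlating the recorded trace with this sweep (standard vibroseis processing) compresses the long signal back into a short wavelet.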

  9. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    SciTech Connect

    Not Available

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. Effects of current regulatory requirements for piping design are evaluated, and recommendations for immediate licensing action, changes in existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  10. Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads

    SciTech Connect

    Klamerus, E.W.; Bohn, M.P.; Johnson, J.J.; Asfura, A.P.; Doyle, D.J.

    1994-02-01

    Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

  11. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    SciTech Connect

    Yoshida, Nozomu

    2008-07-08

    The common engineering assumption that the incident wave is uniform over a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, which indicates that the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface simultaneously (continuous analysis) differs from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and the higher-order predominant periods do not match between the upper and lower ground in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which indicates that the conventional engineering seismic base layer does not work as expected from the term 'base'. All these results indicate that waves that travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.

  12. Investigation of Optimal Seismic Design Methodology for Piping Systems Supported by Elasto-plastic Dampers

    NASA Astrophysics Data System (ADS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

In this study, the applicability of a previously developed optimal seismic design methodology, which can consider the structural integrity of not only piping systems but also elasto-plastic supporting devices, is studied for seismic waves with various frequency characteristics. This methodology employs a genetic algorithm and can search for optimal conditions such as the supporting location and the capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Numerical simulations are performed using a simple piping system model. The results show that the proposed optimal seismic design methodology is applicable to the seismic design of piping systems subjected to seismic waves with various frequency characteristics. The mechanism of optimization is also clarified.
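As a rough illustration of the genetic-algorithm search described here (not the authors' implementation), the sketch below evolves a damper location and stiffness against a stand-in response function; the candidate locations, stiffness values, and the response surrogate are all assumptions:

```python
# Toy GA: pick the (support location, damper stiffness) pair that minimizes
# a stand-in peak-response measure. The real method evaluates a piping model.
import random

LOCATIONS = [1, 2, 3, 4, 5]          # candidate support points (assumed)
STIFFNESS = [1e5, 5e5, 1e6, 5e6]     # candidate damper stiffnesses, N/m (assumed)

def response(loc, k):
    """Stand-in response surrogate: best near mid-span, moderate stiffness."""
    return abs(loc - 3) + abs(k - 1e6) / 1e6

def evolve(pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [(rng.choice(LOCATIONS), rng.choice(STIFFNESS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: response(*ind))
        parents = pop[: pop_size // 2]              # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])                    # crossover: mix genes
            if rng.random() < 0.2:                  # mutation on location gene
                child = (rng.choice(LOCATIONS), child[1])
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: response(*ind))

best = evolve()
print("best (location, stiffness):", best)
```

The actual methodology replaces `response` with a nonlinear time-history evaluation of the piping system and its elasto-plastic dampers.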

  13. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Youngs, R.R.; Costantino, C.J.; Miller, L.F.

    2008-07-01

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. 
The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria. The characterization and analysis effort included 1) downhole measurements of the velocity properties (including uncertainties) of the basalt/interbed sequences, 2) confirmation of the geometry of the contact between the various basalt and interbedded sediments through examination of retrieved core from the core-hole and data collected through geophysical logging of each borehole, and 3) prediction of ground motion response to an earthquake using newly acquired and historic data. The data and analyses reflect a significant reduction in the uncertainty in shear wave velocities below the WTP and result in a significantly lower spectral acceleration (i.e., ground motion). The updated ground motion response analyses and corresponding design response spectra reflect a 25% lower peak horizontal acceleration than reflected in the 2005 design criteria. These results provide confidence that the WTP seismic design criteria are conservative. (authors)

  14. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  15. Optimization Criteria In Design Of Seismic Isolated Building

    SciTech Connect

    Clemente, Paolo; Buffarini, Giacomo

    2008-07-08

Use of new anti-seismic techniques is certainly suitable for buildings of strategic importance and, in general, in cases of very high risk. For ordinary buildings, instead, the cost of the base isolation system should be balanced by an equivalent saving in the structure. Comparison criteria were first defined; then a large numerical investigation was carried out to analyze the effectiveness and economic suitability of seismic isolation in concrete buildings.

  16. Design and development of digital seismic amplifier recorder

    NASA Astrophysics Data System (ADS)

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-01

Digital seismic recording captures seismic data in digital form, a method that is more convenient and more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units built around the IC 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of approximately 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. Qualitative analysis showed that this seismic signal conditioning produces a large output, so the data obtained are better than conventional data. The system can be used for geophysical methods with low input voltages, such as microtremor measurements.
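The quoted ~1,000x amplification is consistent with the ideal closed-loop gain of a single non-inverting op-amp stage using those resistor values; a minimal check:

```python
# Ideal non-inverting op-amp gain: G = 1 + Rf / Rg.
# With Rf = 1 MOhm and Rg = 1 kOhm this gives 1001, i.e. roughly the
# x1,000 amplification quoted in the abstract.

def noninverting_gain(r_feedback_ohms, r_ground_ohms):
    """Ideal closed-loop gain of a non-inverting op-amp amplifier stage."""
    return 1.0 + r_feedback_ohms / r_ground_ohms

gain = noninverting_gain(1e6, 1e3)
print(f"single-stage gain: {gain:.0f}")  # 1001
```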

  17. Design and development of digital seismic amplifier recorder

    SciTech Connect

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-16

Digital seismic recording captures seismic data in digital form, a method that is more convenient and more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units built around the IC 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of approximately 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. Qualitative analysis showed that this seismic signal conditioning produces a large output, so the data obtained are better than conventional data. The system can be used for geophysical methods with low input voltages, such as microtremor measurements.

  18. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront cost. The green building community has faced many of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.

  19. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, Thomas M.; Rohay, Alan C.; Youngs, Robert R.; Costantino, Carl J.; Miller, Lewis F.

    2008-02-28

In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were re-evaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities.

  20. Experimental investigation of damage behavior of RC frame members including non-seismically designed columns

    NASA Astrophysics Data System (ADS)

    Chen, Linzhi; Lu, Xilin; Jiang, Huanjun; Zheng, Jianbo

    2009-06-01

Reinforced concrete (RC) frame structures are one of the most commonly used structural systems, and their seismic performance is largely determined by the performance of columns and beams. This paper describes horizontal cyclic loading tests of ten column and three beam specimens, some of which were designed according to the current seismic design code and others according to the earlier non-seismic Chinese design code, with the aim of explaining the behavior of the damaged or collapsed RC frame structures observed during the Wenchuan earthquake. The effects of axial load ratio, shear span ratio, and transverse and longitudinal reinforcement ratios on hysteresis behavior, ductility and damage progression were incorporated in the experimental study. Test results indicate that the non-seismically designed columns show premature shear failure, larger maximum residual crack widths and more concrete spalling than the seismically designed columns; in addition, their longitudinal reinforcement bars buckled severely. The axial load ratio and shear span ratio proved to be the most important factors affecting ductility and crack opening and closing behavior, while the longitudinal reinforcement ratio had only a minor effect on column ductility but more influence on beam ductility. Finally, the transverse reinforcement ratio did not influence the maximum residual crack width or the crack-closing ability of the seismically designed columns.

  1. Design and implementation of telemetry seismic data acquisition system based on embedded P2P Ethernet

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Lin, J.; Chen, Z.

    2011-12-01

A new design of a telemetry seismic data acquisition system is presented that uses embedded, point-to-point (P2P) Ethernet networks. We explain the idea and motivation behind the use of a P2P Ethernet topology and show the problems that arise when such a topology is used in a seismic acquisition system. The paper focuses on the network protocols we developed, including route table generation and dynamic IP address management. The design has been implemented on ARM and FPGA hardware and tested both in the laboratory and in seismic exploration.

  2. USP Verified Dietary Supplements

    MedlinePLUS


  3. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    SciTech Connect

    RYAN GW

    2008-04-25

In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project, and a strong intent to avoid potentially costly rework of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) to choose the criteria that will result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in a conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and reduce the risk that project designs and analyses will require revision when the draft DOE-STD-1189 is finalized.
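The "most conservative categorization wins" selection described above amounts to a comparison over an ordered category scale. The sketch below uses the SDC-1 through SDC-5 labels of ANSI/ANS 2.26, but the ordering helper and example categories are hypothetical illustrations:

```python
# Pick the more stringent of two seismic design categorizations, one from
# each rule set (e.g., DOE O 420.1B-referenced standards vs. draft
# DOE-STD-1189). Ordering by increasing stringency is assumed here.

SDC_ORDER = ["SDC-1", "SDC-2", "SDC-3", "SDC-4", "SDC-5"]

def most_conservative(category_a, category_b):
    """Return the more stringent of two seismic design categories."""
    return max(category_a, category_b, key=SDC_ORDER.index)

# Example: the two rule sets categorize a component differently.
print(most_conservative("SDC-2", "SDC-3"))
```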

  4. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components, and the energy distribution in each story provides useful information for structural upgrade with the help of added devices. The objective is minimization of the structural displacement response. An application of this structural seismic response research is presented in a base-isolation example.

  5. EVALUATING DESIGN AND VERIFYING COMPLIANCE OF WETLANDS CREATED UNDER SECTION 404 OF THE CLEAN WATER ACT IN OREGON

    EPA Science Inventory

Permit specifications, construction plans, and field measurements were used to examine the correlation between design and conditions "as-built" in a population of 11 palustrine emergent marshes created in the metropolitan area of Portland, Oregon between 1980-1986. The projects ran...

  6. On the seismic design of piping for fossil fired power stations

    SciTech Connect

    Lazzeri, L.

    1996-12-01

The seismic design criteria are briefly reviewed, and the importance of yielding phenomena in the seismic response is presented. The decisive importance of ductility is confirmed by field observations: ductility reduces the response by flattening the peaks. Several piping systems are analyzed under static equivalent conditions with ZPA loading, assuming some ductility in the system. Problems are found only for very flexible systems.

  7. Revision of seismic design codes corresponding to building damages in the ``5.12'' Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yayong

    2010-06-01

    A large number of buildings were seriously damaged or collapsed in the 5.12 Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the Standard for classification of seismic protection of building constructions GB50223-2008 and Code for Seismic Design of Buildings GB50011-2001. The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.

  8. Seismic design factors for RC special moment resisting frames in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

    2011-12-01

This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16 and 32 stories in Dubai, UAE utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated; namely, the response modification factor (R), the deflection amplification factor (Cd), the system overstrength factor (Ω0), and the response modification factor for ductility (Rd), in addition to inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50); consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ω0 factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) is to be expected in moment frames with varying heights.
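For readers unfamiliar with the factors evaluated above, a minimal sketch of their standard relationships follows; the numeric values are illustrative assumptions, not the study's computed factors:

```python
# Standard relationships among seismic design factors:
#   R  = Rd * Omega_0   (ductility reduction times overstrength)
#   inelastic drift estimate = Cd * elastic drift / I

def response_modification(r_ductility, omega_overstrength):
    """R as the product of the ductility factor and the overstrength factor."""
    return r_ductility * omega_overstrength

def inelastic_drift(elastic_drift, c_d, importance_factor=1.0):
    """Amplify the elastic drift to an inelastic-drift estimate via Cd."""
    return c_d * elastic_drift / importance_factor

R = response_modification(r_ductility=4.0, omega_overstrength=2.0)  # assumed values
print("R =", R)
print("drift estimate =", inelastic_drift(0.002, c_d=5.5))
```

The study's point is that Rd (and hence R) inferred from nonlinear analysis shifts with the hazard level of the input records, which is why period-dependent factors are recommended.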

  9. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. 
This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.

  10. The 1995 forum on appropriate criteria and methods for seismic design of nuclear piping

    SciTech Connect

    Slagis, G.C.

    1996-12-01

    A record of the 1995 Forum on Appropriate Criteria and Methods for Seismic Design of Nuclear Piping is provided. The focus of the forum was the earthquake experience data base and whether the data base demonstrates that seismic inertia loads will not cause failure in ductile piping systems. This was a follow-up to the 1994 Forum when the use of earthquake experience data, including the recent Northridge earthquake, to justify a design-by-rule method was explored. Two possible topics for the next forum were identified--inspection after an earthquake and design for safe-shutdown earthquake only.

  11. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717

  12. Performance-based seismic design of steel frames utilizing colliding bodies algorithm.

    PubMed

    Veladi, H

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717

  13. Investigation of Optimal Seismic Design Methodology for Piping Systems Supported by Elasto-Plastic Dampers

    NASA Astrophysics Data System (ADS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

In this study, an optimal seismic design methodology that can consider the structural integrity of not only the piping systems but also the elasto-plastic supporting devices is developed. This methodology employs a genetic algorithm and can search for optimal conditions such as the supporting location and the capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Four types of evaluation functions are considered. It is found that the proposed optimal seismic design methodology is very effective and can be applied to the actual seismic design of piping systems supported by elasto-plastic dampers. The effectiveness of the evaluation functions is also clarified.

  14. Seismic responses of a pool-type fast reactor with different core support designs

    SciTech Connect

Wu, Ting-shu; Seidensticker, R.W.

    1989-01-01

    In designing the core support system for a pool-type fast reactor, there are many issues which must be considered in order to achieve an optimum and balanced design. These issues include safety, reliability, as well as costs. Several design options are possible to support the reactor core. Different core support options yield different frequency ranges and responses. Seismic responses of a large pool-type fast reactor incorporated with different core support designs have been investigated. 4 refs., 3 figs.

  15. Evaluation of collapse resistance of RC frame structures for Chinese schools in seismic design categories B and C

    NASA Astrophysics Data System (ADS)

    Tang, Baoxin; Lu, Xinzheng; Ye, Lieping; Shi, Wei

    2011-09-01

    According to the Code for Seismic Design of Buildings (GB50011-2001), ten typical reinforced concrete (RC) frame structures, used as school classroom buildings, are designed with different seismic fortification intensities (SFIs) (SFI=6 to 8.5) and different seismic design categories (SDCs) (SDC=B and C). The collapse resistance of the frames with SDC=B and C in terms of collapse fragility curves are quantitatively evaluated and compared via incremental dynamic analysis (IDA). The results show that the collapse resistance of structures should be evaluated based on both the absolute seismic resistance and the corresponding design seismic intensity. For the frames with SFI from 6 to 7.5, because they have relatively low absolute seismic resistance, their collapse resistance is insufficient even when their corresponding SDCs are upgraded from B to C. Thus, further measures are needed to enhance these structures, and some suggestions are proposed.
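A collapse fragility curve of the kind compared in this study is typically obtained by fitting a lognormal distribution to the collapse intensities found in the IDA runs; the sketch below uses hypothetical data, not the paper's results:

```python
# Fit a lognormal collapse fragility curve to IDA collapse intensities.
import math
import statistics

# Hypothetical spectral accelerations at collapse (g), one per ground motion:
sa_collapse = [0.62, 0.75, 0.81, 0.90, 0.98, 1.10, 1.25, 1.40]

# Lognormal fit: median capacity theta and log standard deviation beta.
logs = [math.log(x) for x in sa_collapse]
theta = math.exp(statistics.mean(logs))
beta = statistics.stdev(logs)

def p_collapse(sa):
    """Probability of collapse at intensity sa (lognormal CDF via erf)."""
    return 0.5 * (1.0 + math.erf(math.log(sa / theta) / (beta * math.sqrt(2.0))))

print(f"median collapse capacity: {theta:.2f} g")
print(f"P(collapse | Sa = 0.5 g): {p_collapse(0.5):.2f}")
```

Comparing such curves for the SDC B and SDC C designs at each fortification intensity is what quantifies the collapse-resistance gap the authors report.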

  16. The 1994 Forum on Appropriate Criteria and Methods for Seismic Design of Nuclear Piping

    SciTech Connect

    Slagis, G.C.

    1995-12-31

    A record of the 1994 Forum on Appropriate Criteria and Methods for Seismic Design of Nuclear Piping is provided. The focus of the forum was the design-by-rule method for seismic design of piping. Issues such as acceptance criteria, ductility considerations, demonstration of margin, training, verification and costs were discussed. The use of earthquake experience data, including the recent Northridge earthquake, to justify a design-by-rule method was explored. The majority of the participants felt there are not significant advantages to developing a design-by-rule approach for new plant design. One major disadvantage was considered by many to be training. Extensive training will be required to properly implement a design-by-rule approach. Verification of designs was considered by the majority to be equally important for design-by-rule as for design-by-analysis. If a design-by-rule method is going to be effective, the method will have to be based on ductility considerations (UBC approach). A significant issue will be justification of seismic margins with liberal rules. The UBC approach is being questioned by some because of the recent structural cracking problems in the Northridge earthquake.

  17. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra.
Because of the sensitivity of the calculated response spectra to the velocity contrasts between the basalts and interbedded sediments, DOE initiated an effort to emplace additional boreholes at the WTP site and obtain direct Vs measurements and other physical property measurements in these layers. One core-hole and three boreholes have been installed at the WTP site to a maximum depth of 1468 ft (447 m) below ground surface. The three boreholes are within 500 ft (152 m) of and surrounding the high level waste vitrification and pretreatment facilities of the WTP, which were the Performance Category 3 (PC-3) structures affected by the interim design spectra. The core-hole is co-located with the borehole closest to the two PC-3 structures. These new measurements are expected to reduce the uncertainty in the modeled site response that is caused by the lack of direct knowledge of the Vs contrasts within these layers. (authors)

  18. Estimation of Cyclic Interstory Drift Capacity of Steel Framed Structures and Future Applications for Seismic Design

    PubMed Central

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288
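
    The abstract does not spell out the authors' energy-based damage model, but a widely used analogue that likewise accounts for cumulative plastic deformation is the Park-Ang damage index, which adds a weighted hysteretic-energy term to the peak displacement demand. The sketch below is illustrative only; all numerical values are hypothetical:

    ```python
    def park_ang_damage(d_max, e_hyst, d_ult, f_y, beta=0.15):
        """Park-Ang style damage index: peak displacement demand plus a
        weighted cumulative hysteretic-energy term, both normalized by the
        ultimate monotonic capacity. An index >= 1.0 is conventionally
        read as collapse; beta weights the cumulative-damage contribution."""
        displacement_term = d_max / d_ult
        energy_term = beta * e_hyst / (f_y * d_ult)
        return displacement_term + energy_term

    # A member pushed to only 60% of its ultimate displacement can still
    # exceed the collapse threshold when a long-duration motion has
    # accumulated substantial hysteretic energy.
    dm = park_ang_damage(d_max=0.06, e_hyst=250.0, d_ult=0.10, f_y=500.0)
    ```

    This is exactly the point the paper makes about long-duration ground motions: maximum interstory drift alone would rate this member as safe, while the cumulative-demand term does not.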

  20. Optimization of Seismic Network Design: Application to a Geophysical International Lunar Network

    NASA Astrophysics Data System (ADS)

    Yamada, R.; Garcia, R. F.; Lognonne, P.; Calvet, M.; Gagnepain-Beyneix, J.; Le Feuvre, M.

    2010-12-01

    During the next decade, some lunar seismic experiments are planned under the international lunar network initiative, such as NASA ILN Anchor nodes mission or Lunette DISCOVERY proposal, JAXA SELENE-2 and LUNA-GLOB penetrator missions, during which 1 to 4 seismic stations will be deployed on the lunar surface. Yamada et al. (submitted) have described how to design the optimized network in order to obtain the best scientific gain from these future lunar landing missions. In this presentation, we will describe the expected gain from the new lunar seismic observations potentially obtained by the optimized network compared with past Apollo seismic experiments. From the Apollo seismic experiments, valuable information about the lunar interior structure was obtained using deep and shallow moonquakes, and meteoroid impacts (e.g., Nakamura et al., 1983; Lognonné et al., 2003). However, due to the limited sensitivity of Apollo lunar seismometers and the narrowness of the seismic network, the deep lunar structure, especially the core, was not properly retrieved. In addition, large uncertainties are associated with the inferred crustal thickness around the Apollo seismic stations. Improvements in this knowledge will help us to understand the origin of the Earth-Moon system and the initial differentiation of the Moon. Therefore, we have studied the optimization of a seismic network consisting of three or four new seismometers in order to place better constraints on the lunar mantle structure and/or crustal thickness. The network is designed to minimize the a posteriori errors and maximize the resolution of the velocity perturbations inside the mantle and/or the crust through a linear inverse method. For the inversion, the deep moonquakes from active sources already located by Apollo seismic data are used, because it is known that these events occur repeatedly at identical nests depending on tidal constraints. 
In addition, we use randomly distributed meteoroid impacts located either by the new seismic network or by detection of the impact flashes from Earth-based observation. The use of these impact events will greatly contribute to improve the knowledge of shallow structures, in particular the crust. Finally, a comparison between the a posteriori errors deduced from our optimized network with those of the Apollo network will indicate the potential of the optimized network and the expected scientific gain. This method will be a useful tool to consider for future geophysical network landing missions.
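
    For a linear inverse method such as the one described, the a posteriori errors used to rank candidate networks come directly from the posterior model covariance of a linear Gaussian inverse problem. A minimal sketch, with entirely synthetic sensitivities standing in for real travel-time kernels:

    ```python
    import numpy as np

    def posterior_covariance(G, Cd, Cm):
        """A posteriori model covariance of a linear Gaussian inverse
        problem: C_post = (G^T Cd^-1 G + Cm^-1)^-1."""
        return np.linalg.inv(G.T @ np.linalg.inv(Cd) @ G + np.linalg.inv(Cm))

    def design_score(G, Cd, Cm):
        # D-optimality style criterion: a smaller log-determinant of the
        # posterior covariance means smaller a posteriori errors overall.
        sign, logdet = np.linalg.slogdet(posterior_covariance(G, Cd, Cm))
        return logdet

    # Toy comparison of two candidate station geometries. Rows of G are
    # travel-time data, columns are velocity parameters; the "wide"
    # geometry is assumed to yield sensitivities ten times stronger.
    rng = np.random.default_rng(0)
    Cd = 0.01 * np.eye(6)      # picking-error variance
    Cm = np.eye(4)             # prior model variance
    G_wide = rng.normal(size=(6, 4))
    G_narrow = 0.1 * G_wide
    wide_better = design_score(G_wide, Cd, Cm) < design_score(G_narrow, Cd, Cm)
    ```

    Comparing such scores across candidate geometries, as the abstract describes for the optimized network versus the Apollo network, quantifies the expected scientific gain before any instrument is deployed.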

  1. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 102-76.30 What seismic safety standards must Federal agencies follow in the design and construction of... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?...

  2. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 102-76.30 What seismic safety standards must Federal agencies follow in the design and construction of... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?...

  3. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  4. Seismic Assessment of High-Raised Designed Structures Based on 2800 Iranian Seismic Code (same as UBC1997)

    SciTech Connect

    Negar, Moharrami Gargari; Rassol, Mirgaderi

    2008-07-08

    Seismic design codes aim to ensure adequate performance of structures during earthquakes; to this end, experts have examined load patterns, the history and location of plastic hinges, the ultimate and demand capacities of structures, and many other questions about the actual versus assumed behavior of structures under earthquake loading. To reduce retrofit costs, the non-linear behavior of structures during earthquakes has been studied in increasing depth. The first generation of structural retrofit codes, which evaluate the actual behavior of a structure, was established in the late 1980s, while design codes still relied on linear structural behavior; comparing the results of design codes against retrofit codes has therefore become of interest. This research evaluates structures designed to the 2800 code against the performance levels described in FEMA356, and compares the results of modal analysis with those of static non-linear analysis using the load patterns of FEMA356. The structure is designed and checked against all provisions of the 2800 code and then evaluated per FEMA356. Finally, the results give the performance point of the structure and the distribution of plastic hinges over the whole structure at collapse.

  5. Multi Canister Overpack (MCO) Handling Machine Trolley Seismic Uplift Constraint Design Loads

    SciTech Connect

    SWENSON, C.E.

    2000-03-09

    The MCO Handling Machine (MHM) trolley moves along the top of the MHM bridge girders on east-west oriented rails. To prevent trolley wheel uplift during a seismic event, passive uplift constraints are provided as shown in Figure 1-1. North-south trolley wheel movement is prevented by flanges on the trolley wheels. When the MHM is positioned over a Multi-Canister Overpack (MCO) storage tube, east-west seismic restraints are activated to prevent trolley movement during MCO handling. The active seismic constraints consist of a plunger, which is inserted into slots positioned along the tracks as shown in Figure 1-1. When the MHM trolley is moving between storage tube positions, the active seismic restraints are not engaged. The MHM has been designed and analyzed in accordance with ASME NOG-1-1995. The ALSTHOM seismic analysis (Reference 3) reported seismic uplift restraint loading and EDERER performed corresponding structural calculations. The ALSTHOM and EDERER calculations were performed with the east-west seismic restraints activated and the uplift restraints experiencing only vertical loading. In support of development of the CSB Safety Analysis Report (SAR), an evaluation of the MHM seismic response was requested for the case where the east-west trolley restraints are not engaged. For this case, the associated trolley movements would result in east-west lateral loads on the uplift constraints due to friction, as shown in Figure 1-2. During preliminary evaluations, questions were raised as to whether the EDERER calculations considered the latest ALSTHOM seismic analysis loads (See NCR No. 00-SNFP-0008, Reference 5). Further evaluation led to the conclusion that the EDERER calculations used appropriate vertical loading, but the uplift restraints would need to be re-analyzed and modified to account for lateral loading. The disposition of NCR 00-SNFP-0008 will track the redesign and modification effort. 
The purpose of this calculation is to establish bounding seismic loads (vertical and horizontal) for input into the uplift restraint hardware redesign calculations. To minimize iterations on the uplift redesign effort, efforts were made to assure that the final loading input was reasonable but unquestionably on the conservative side.
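
    The bounding load logic described above can be sketched in a few lines: with the east-west restraints disengaged, the vertical seismic reaction loads the uplift constraint directly, and trolley sliding adds a Coulomb-friction lateral component. The friction coefficient, load factor, and reaction value below are illustrative assumptions, not values from the referenced ALSTHOM or EDERER calculations:

    ```python
    def bounding_uplift_loads(v_seismic, mu, load_factor=1.2):
        """Bounding loads on a trolley uplift constraint when the east-west
        restraints are disengaged: the vertical seismic reaction V loads
        the constraint directly, and trolley sliding adds a Coulomb-
        friction lateral component H = mu * V. The load factor keeps the
        bound on the conservative side."""
        v = load_factor * v_seismic
        h = mu * v
        return v, h

    # Illustrative numbers only (kips), not values from the actual analyses.
    v, h = bounding_uplift_loads(v_seismic=50.0, mu=0.4)
    ```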

  6. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  7. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
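
    The core of the risk-targeted approach is a risk integral: collapse probability is obtained by integrating a lognormal collapse fragility against the site's hazard curve, so the same design ground motion yields different risks where hazard-curve shapes differ. A minimal numerical sketch, with an illustrative power-law hazard curve and assumed fragility parameters (not the actual USGS curves or NEHRP values):

    ```python
    import math

    def norm_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def collapse_risk(gms, haz, design_gm, beta=0.8):
        """Collapse risk over the hazard exposure time: integrate a
        lognormal collapse fragility against the hazard curve, bin by bin.
        The fragility's 10th percentile is anchored at the design ground
        motion (an anchoring convention of the risk-targeted approach);
        beta is the capacity uncertainty that drives the adjustment."""
        median = design_gm * math.exp(1.2816 * beta)  # Phi^-1(0.90) = 1.2816
        risk = 0.0
        for i in range(len(gms) - 1):
            a_mid = 0.5 * (gms[i] + gms[i + 1])
            p_col = norm_cdf(math.log(a_mid / median) / beta)
            risk += p_col * (haz[i] - haz[i + 1])  # hazard occurring in bin
        return risk

    # Toy hazard curve (exceedance probability vs ground motion, in g);
    # the power-law shape is illustrative, not a USGS curve.
    gms = [0.1 * k for k in range(1, 21)]
    haz = [0.5 * (0.1 / a) ** 2 for a in gms]
    risk_1g = collapse_risk(gms, haz, design_gm=1.0)
    risk_15g = collapse_risk(gms, haz, design_gm=1.5)
    ```

    Iterating the design ground motion until this integral hits a target risk is, in outline, how the risk-targeted maps adjust the uniform-hazard values up or down depending on local hazard-curve shape.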

  8. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part I: Theoretical formulation

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-08

    The insertion of steel braces equipped with dissipative devices proves very effective in enhancing the performance of a framed building under horizontal seismic loads. Multi-level design criteria were proposed according to the Performance-Based Design, in order to get, for a specific level of the seismic intensity, a designated performance objective of the building (e.g., an assigned damage level of either the framed structure or non-structural elements). In this paper a design procedure aiming to proportion braces with hysteretic dampers in order to attain, for a specific level of the seismic intensity, a designated performance level of the building is proposed. Specifically, a proportional stiffness criterion, which assumes the elastic lateral storey-stiffness due to the braces proportional to that of the unbraced frame, is combined with the Direct Displacement-Based Design, in which the design starts from target deformations. A computer code has been prepared for the nonlinear static and dynamic analyses, using a step-by-step procedure. Frame members and hysteretic dampers are idealized by bilinear models.
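
    The Direct Displacement-Based Design step that the procedure builds on can be sketched compactly: start from the target displacement, scale the displacement spectrum for the effective damping supplied by the dampers, read off an effective period, and convert to an effective stiffness and base shear. The spectrum corner values, damping rule, and numbers below are illustrative assumptions, not the paper's values:

    ```python
    import math

    def ddbd_base_shear(m_eff, d_target, xi_eff, d_corner=0.6, t_corner=4.0):
        """Direct Displacement-Based Design in miniature: scale the
        displacement spectrum for the effective damping, read off the
        effective period at the target displacement, convert to an
        effective stiffness, and return the design base shear."""
        eta = math.sqrt(7.0 / (2.0 + 100.0 * xi_eff))   # EC8-style damping rule
        t_eff = t_corner * d_target / (eta * d_corner)  # linear spectrum branch
        k_eff = 4.0 * math.pi ** 2 * m_eff / t_eff ** 2
        return k_eff * d_target

    # Illustrative: 500 t effective mass, 0.30 m target displacement,
    # 15% equivalent viscous damping contributed by the hysteretic dampers.
    v_base = ddbd_base_shear(m_eff=500e3, d_target=0.30, xi_eff=0.15)
    ```

    The paper's proportional stiffness criterion would then split the effective stiffness between the unbraced frame and the dampered braces in an assumed ratio.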

  9. Verified Software Toolchain

    NASA Astrophysics Data System (ADS)

    Appel, Andrew W.

    The software toolchain includes static analyzers to check assertions about programs; optimizing compilers to translate programs to machine language; operating systems and libraries to supply context for programs. Our Verified Software Toolchain verifies with machine-checked proofs that the assertions claimed at the top of the toolchain really hold in the machine-language program, running in the operating-system context, on a weakly-consistent-shared-memory machine.

  10. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.
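
    The compounding effect described above is multiplicative: if each link of the seismic methodology chain carries its own conservative bias relative to a best estimate, the biases multiply through the chain. A trivial sketch with hypothetical per-link factors (not values from the study):

    ```python
    def response_ratio(link_factors):
        """Ratio of an 'Evaluation Method' response, where every link of
        the seismic methodology chain carries its own conservative bias,
        to a best-estimate response with all links at their median.
        Per-link biases multiply, which is how conservatisms compound."""
        ratio = 1.0
        for f in link_factors:
            ratio *= f
        return ratio

    # Illustrative per-link bias factors for the four SMC links: seismic
    # input, soil-structure interaction, structural response, subsystems.
    compound = response_ratio([1.3, 1.2, 1.25, 1.4])
    ```

    Even modest per-link conservatisms of 20-40% compound to well over a factor of two across the chain, which is why the study evaluates links in combination rather than in isolation.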

  11. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as known, may play a significant role on the level, duration and frequency content of ground motion. Ridges and crests are often lost inside the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article by Massa et al. on topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Second, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results have highlighted the inadequacy of the national provisions concerning the definition of the seismic load at top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.
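
    Amending a GMPE with a topographic term, as the study proposes, amounts to adding a site-specific additive term to the usual log-linear functional form. The skeleton below uses placeholder coefficients and a hypothetical +0.2 log10-units crest amplification; these are not the Bindi et al. values:

    ```python
    import math

    def ln_gmpe(mag, r_jb, site_term=0.0, topo_term=0.0):
        """Skeleton GMPE in the usual functional form: magnitude scaling,
        geometric attenuation with a near-source saturation depth, a site
        term, and an additive topographic-amplification term of the kind
        the study proposes. All coefficients are placeholders."""
        c0, c1, c2, h = -1.0, 0.5, -1.3, 6.0
        r = math.sqrt(r_jb ** 2 + h ** 2)
        return c0 + c1 * mag + c2 * math.log10(r) + site_term + topo_term

    # Same event and distance, predicted on flat ground vs a ridge crest
    # carrying a hypothetical +0.2 log10-units amplification term.
    flat = ln_gmpe(mag=5.5, r_jb=10.0)
    crest = ln_gmpe(mag=5.5, r_jb=10.0, topo_term=0.2)
    ```

    Because the term is additive in log space, it acts as a constant multiplicative amplification of the predicted ground motion at the topographic site, which is the behaviour the measured crest/flat ratios are used to calibrate.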

  12. Verifying Ballast Water Treatment Performance

    EPA Science Inventory

    The U.S. Environmental Protection Agency, NSF International, Battelle, and U.S. Coast Guard are jointly developing a protocol for verifying the technical performance of commercially available technologies designed to treat ship ballast water for potentially invasive species. The...

  13. Design and analysis of a seismically stable platform: An evaluation

    NASA Astrophysics Data System (ADS)

    Jaenke, M. G.

    1980-08-01

    A design of a high precision test platform suitable for the testing of inertial guidance components and systems is evaluated. The platform will also serve as a prototype facility for test concept verification for a planned precision guidance test facility. It is concluded that the proposed passive system is a basically sound concept, although an active system should improve transmissibility of motion. The induction of platform rocking modes by purely translational ground motion is considered. The effect of mismatched seismometer pairs on production of apparent platform rocking modes and the difficulty of matching seismometers to monitor small rotations is discussed. Recommendations are given for analyzing platform performance and interpreting future testing results.

  14. Malargüe seismic array: Design and deployment of the temporary array

    NASA Astrophysics Data System (ADS)

    Ruigrok, E.; Draganov, D.; Gómez, M.; Ruzzante, J.; Torres, D.; Lópes Pumarega, I.; Barbero, N.; Ramires, A.; Castaño Gañan, A. R.; van Wijk, K.; Wapenaar, K.

    2012-10-01

    We present the goals and the current status of the Malargüe seismic array. Our main goal is imaging and monitoring the subsurface below the Malargüe region, Mendoza, Argentina. More specifically, we focus on the Planchon-Peteroa Volcano and an area just east of the town of Malargüe. We begin the project by installing a temporary array of 38 seismic stations, which will record continuously for one year. The array consists of two subarrays: one array located on the flanks of the volcano; the other spread out on a plateau just east of the Andes. The imaging targets, like the Moho and the Nazca slab, are relatively deep. Yet, the array has a dense station spacing, allowing exploration-type processing. For high-resolution imaging, also a dense source spacing is required. This we aim to achieve by creating virtual sources at the receiver positions, with a technique called seismic interferometry (SI). The array is designed such that a recent improvement of SI can be applied to the recordings. Other goals are to collect high-quality core-phase measurements and to characterize sources of microseism noise in the Southern Hemisphere. Furthermore, we plan to collaborate with researchers from the Pierre Auger Collaboration to study coupling of seismic, atmospheric, and cosmic signals using data from our instruments and from the Pierre Auger detectors.

  15. Seismic design technology for breeder reactor structures. Volume 1. Special topics in earthquake ground motion

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This report is divided into twelve chapters: seismic hazard analysis procedures, statistical and probabilistic considerations, vertical ground motion characteristics, vertical ground response spectrum shapes, effects of inclined rock strata on site response, correlation of ground response spectra with intensity, intensity attenuation relationships, peak ground acceleration in the very near field, statistical analysis of response spectral amplitudes, contributions of body and surface waves, evaluation of ground motion characteristics, and design earthquake motions. (DLC)

  16. Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Linn, A. M.; Law, B.

    2010-12-01

    Trust in an international agreement to limit future greenhouse gas emissions will depend on the ability of each nation to make accurate estimates of its own emissions, monitor their changes over time, and verify one another's estimates with independent information. A recent National Research Council committee assessed current capabilities for estimating and verifying emissions from greenhouse gases that result from human activities, have long lifetimes in the atmosphere, and are likely to be included in an international agreement. These include CO2, CH4, N2O, HFCs, PFCs, SF6, and CFCs. The analysis shows that countries have the capability to estimate their CO2 emissions from fossil-fuel use with sufficient accuracy to support monitoring of an international treaty, but accurate methods are not universally applied and the estimates cannot be checked against independent data. Deployment of existing methods and technologies could, within 5 years, yield a capability to both estimate and verify CO2 emissions from fossil-fuel use and deforestation, which comprise approximately three-quarters of greenhouse emissions likely covered by a treaty. Estimates of emissions of other greenhouse gases will remain uncertain in the near term.

  17. Martian seismicity

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Grimm, Robert E.

    1991-01-01

    The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed from imaging experiments show that Mars must have been a seismically active planet in the past and there is no reason to discount the notion that Mars is seismically active today, but at a lower level of activity. Models are explored for present-day Martian seismicity. Depending on the sensitivity and geometry of a seismic network and the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound, and a more refined estimate would take into account specifically the regional cooling of Tharsis and lead to a higher frequency of seismic events.
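
    Estimates of "a reasonable number of events per decade" typically come from a cumulative Gutenberg-Richter recurrence law combined with a network detection threshold. A minimal sketch, in which the activity parameters for Mars are model assumptions, not measured quantities:

    ```python
    def expected_detections(a, b, m_detect, years):
        """Expected number of detected events over an observation period,
        from a cumulative Gutenberg-Richter law log10 N(>=m) = a - b*m,
        where N is the annual count of events at or above magnitude m and
        m_detect is the network's detection threshold."""
        annual_rate = 10.0 ** (a - b * m_detect)
        return annual_rate * years

    # Illustrative activity model: a = 3.0, b = 1.0 gives one event per
    # year at magnitude >= 3, hence ten expected detections in a decade.
    n_decade = expected_detections(a=3.0, b=1.0, m_detect=3.0, years=10)
    ```

    Lowering the detection threshold (better instruments, lower attenuation) raises the expected count exponentially, which is why network sensitivity dominates such feasibility estimates.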

  18. Intelligent monitoring of seismic damage identification using wireless smart sensors: design and validation

    NASA Astrophysics Data System (ADS)

    Kim, Jinho; Jang, Young-Du; Jang, Won-rak

    2011-04-01

    Structural health monitoring (SHM) has been adopted as a technique to monitor structural performance and detect damage in aging infrastructure. The ultimate goals of implementing an SHM system are to improve infrastructure maintenance, increase public safety, and minimize the economic impact of an extreme loading event by streamlining repair and retrofit measures. With recent advances in wireless communication technology, wireless SHM systems have emerged as a promising alternative for rapid, accurate and low-cost structural monitoring. This article presents a developing damage-identification algorithm to advance the detection and diagnosis of structural damage for SHM using networks of wireless smart sensors. These networks are used as a vibration-based structural monitoring system that allows extraction of mode shapes from output-only vibration data from an underground structure. The mode shape information can further be used in modal methods of damage detection. The sensors are also used to experimentally verify analytical models of post-earthquake evaluation based on system identification analysis. A damage measurement system could play a significant role in monitoring and recording, with a higher level of completeness, the actual seismic response of structures and in non-destructive seismic damage assessment techniques based on dynamic signature analysis.
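
    The abstract does not name the authors' mode-shape extraction method, but a standard technique for output-only data of this kind is Frequency Domain Decomposition (FDD): at a resonance peak, the first singular vector of the output cross-spectral density matrix approximates the operating mode shape. A toy sketch with a synthetic single-mode CSD:

    ```python
    import numpy as np

    def fdd_mode_shape(csd_matrix):
        """Frequency Domain Decomposition in one step: at a resonance
        peak, the first left singular vector of the output cross-spectral
        density matrix approximates the operating mode shape. Only output
        (response) measurements are needed, matching the output-only
        setting of wireless SHM networks."""
        u, s, vh = np.linalg.svd(csd_matrix)
        return u[:, 0]

    # Toy CSD at a peak dominated by a single mode with shape [1, 2, 1],
    # plus a small noise floor; the SVD recovers the shape.
    phi = np.array([1.0, 2.0, 1.0])
    csd = np.outer(phi, phi) + 1e-3 * np.eye(3)
    shape = fdd_mode_shape(csd)
    shape = shape / shape[np.argmax(np.abs(shape))]  # unit-normalize, fix sign
    ```

    Changes in such identified mode shapes before and after an earthquake are what modal damage-detection methods compare.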

  19. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER

    SciTech Connect

    OMBERG SK

    2008-05-14

    This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria from DOE-STD-1189-2008, Appendix A. The existing and new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available in early CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex who may be implementing DOE-STD-1189-2008.

  20. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. ?? 2009 Elsevier Ltd. All rights reserved.
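
    The Bayesian decision framework described above reduces, in its simplest form, to choosing the protection level that minimizes total expected cost: up-front construction cost plus expected lifetime loss, with the loss expectation taken over the total (aleatory plus epistemic) uncertainty. The cost functions below are illustrative assumptions, not the paper's models:

    ```python
    import math

    def expected_total_cost(level, construction_cost, expected_loss):
        """Bayesian decision rule in miniature: the total expected cost of
        a seismic protection level is its construction cost plus the
        expected lifetime loss, where the loss expectation is taken over
        the TOTAL (aleatory plus epistemic) uncertainty."""
        return construction_cost(level) + expected_loss(level)

    # Illustrative cost model: stronger design costs more up front but
    # cuts expected losses exponentially; the optimum balances the two.
    construction = lambda x: 1.0 + 0.5 * x
    loss = lambda x: 2.0 * math.exp(-1.5 * x)
    levels = [0.1 * k for k in range(31)]
    best = min(levels, key=lambda x: expected_total_cost(x, construction, loss))
    ```

    The paper's point is that adding temporal fluctuations of epistemic uncertainty and regulatory safety constraints to this minimization pushes the optimal level above the normative level at the time of construction.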

  1. Seismic design repair and retrofit strategies for steel roof deck diaphragms

    NASA Astrophysics Data System (ADS)

    Franquet, John-Edward

    Structural engineers will often rely on the roof diaphragm to transfer lateral seismic loads to the bracing system of single-storey structures. The implementation of capacity-based design in the NBCC 2005 has caused an increase in the diaphragm design load due to the need to use the probable capacity of the bracing system, thus resulting in thicker decks, closer connector patterns and higher construction costs. Previous studies have shown that accounting for the in-plane flexibility of the diaphragm when calculating the overall building period can result in lower seismic forces and a more cost-efficient design. However, recent studies estimating the fundamental period of single storey structures using ambient vibration testing showed that the in-situ approximation was much shorter than that obtained using analytical means. The difference lies partially in the diaphragm stiffness characteristics which have been shown to decrease under increasing excitation amplitude. Using the diaphragm as the energy-dissipating element in the seismic force resisting system has also been investigated as this would take advantage of the diaphragm's ductility and limited overstrength; thus, lower capacity based seismic forces would result. An experimental program on 21.0 m by 7.31 m diaphragm test specimens was carried out so as to investigate the dynamic properties of diaphragms including the stiffness, ductility and capacity. The specimens consisted of 20 and 22 gauge panels with nailed frame fasteners and screwed sidelap connections as well as a welded and button-punched specimen. Repair strategies for diaphragms that have previously undergone inelastic deformations were devised in an attempt to restore the original stiffness and strength and were then experimentally evaluated. Strength and stiffness experimental estimations are compared with those predicted with the Steel Deck Institute (SDI) method. A building design comparative study was also completed. 
This study looks at the difference in design and cost yielded by previous and current design practice with EBF braced frames. Two alternate design methodologies, where the period is not restricted by code limitations and where the diaphragm force is limited to the equivalent shear force calculated with RdR o = 1.95, are also used for comparison. This study highlights the importance of incorporating the diaphragm stiffness in design and the potential cost savings.
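    The period argument above can be illustrated with a single-mode, springs-in-series sketch, in which the diaphragm's in-plane flexibility lengthens the building period. This is a common idealization, not the study's model, and all mass and stiffness values below are hypothetical:

```python
import math

def fundamental_period(mass, k_frame, k_diaphragm):
    """Single-mode idealization: bracing and diaphragm act as springs in
    series, so diaphragm flexibility lengthens the period. All values
    used below are hypothetical, not taken from the test program."""
    k_eff = 1.0 / (1.0 / k_frame + 1.0 / k_diaphragm)   # series stiffness, N/m
    return 2.0 * math.pi * math.sqrt(mass / k_eff)      # seconds

# near-rigid diaphragm vs. a flexible one, same frame stiffness and mass
T_rigid = fundamental_period(2.0e5, 1.0e7, 1.0e12)
T_flexible = fundamental_period(2.0e5, 1.0e7, 2.0e7)
```

    A longer period generally maps to a lower spectral acceleration, which is the mechanism behind the lower design forces mentioned in the abstract.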

  2. On standard and optimal designs of industrial-scale 2-D seismic surveys

    NASA Astrophysics Data System (ADS)

    Guest, T.; Curtis, A.

    2011-08-01

    The principal aim of performing a survey or experiment is to maximize the desired information within a data set by minimizing the post-survey uncertainty on the ranges of the model parameter values. Using Bayesian, non-linear, statistical experimental design (SED) methods we show how industrial-scale amplitude variation with offset (AVO) surveys can be constructed to maximize the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys. The design method allows offset-dependent errors, previously not allowed in non-linear geoscientific SED methods. The method is applied to a single common-midpoint gather. The results show that the optimal design is highly dependent on the ranges of the model parameter values when a low number of receivers is used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered, we find that a survey design with constant spatial receiver separation becomes close to optimal. This explains why regularly-spaced, 2-D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging, in which homogeneous data coverage confers distinct advantages, but also in providing data to constrain subsurface petrophysical information.
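    As a toy analogue of this survey-design idea, the sketch below scores candidate receiver sets with a D-optimality criterion for the two-term linearized AVO model R(θ) ≈ A + B·sin²θ (a Shuey-type simplification; the paper's criterion is non-linear and Bayesian, and the angles and offset-dependent noise levels here are hypothetical):

```python
import math
from itertools import combinations

def design_quality(angles_deg, noise_std):
    """D-optimality score det(G^T W G) for the two-term linearized AVO
    model R(theta) = A + B * sin(theta)**2 with offset-dependent noise.
    A linearized stand-in for the paper's non-linear Bayesian criterion."""
    s11 = s12 = s22 = 0.0
    for th, sd in zip(angles_deg, noise_std):
        w = 1.0 / sd ** 2                        # inverse-variance weight
        x = math.sin(math.radians(th)) ** 2
        s11 += w
        s12 += w * x
        s22 += w * x * x
    return s11 * s22 - s12 * s12                 # det of the 2x2 information matrix

# hypothetical candidate receivers: (incidence angle in degrees, noise std)
candidates = [(5, 0.010), (15, 0.012), (25, 0.015), (35, 0.020), (45, 0.030)]
best = max(combinations(candidates, 3),
           key=lambda c: design_quality([a for a, _ in c], [s for _, s in c]))
```

    Larger determinants mean smaller joint posterior uncertainty on (A, B), which is why the criterion rewards receiver sets that spread angles while weighting down noisy far offsets.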

  3. Seismic design evaluation guidelines for buried piping for the DOE HLW Facilities

    SciTech Connect

    Lin, Chi-Wen; Antaki, G.; Bandyopadhyay, K.; Bush, S.H.; Costantino, C.; Kennedy, R.

    1995-05-01

    This paper presents the seismic design and evaluation guidelines for underground piping for the Department of Energy (DOE) High-Level-Waste (HLW) Facilities. The underground piping includes both single and double containment steel pipes and concrete pipes with steel lining, with particular emphasis on the double containment piping. The design and evaluation guidelines presented in this paper follow the generally accepted beam-on-elastic-foundation analysis principle and the inertial response calculation method, respectively, for piping directly in contact with the soil or contained in a jacket. A standard analysis procedure is described, along with a discussion of factors deemed significant for the design of the underground piping. The following key considerations are addressed: the design features and safety requirements for the inner (core) pipe and the outer pipe; the effects of soil strain and wave passage; assimilation of the necessary seismic and soil data; inertial response calculation for the inner pipe; determination of support anchor movement loads; combination of design loads; and code comparison. Specifications and justifications of the key parameters used, the stress components to be calculated, and the allowable stress and strain limits for code evaluation are presented.
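    The beam-on-elastic-foundation principle mentioned above has a classical closed form for an infinite beam on a Winkler foundation under a point load (Hetényi's solution). The sketch below is a minimal illustration of that idealization, not the guidelines' actual procedure, and all numerical inputs are illustrative:

```python
import math

def winkler_deflection(P, k, E, I, x):
    """Deflection of an infinite beam resting on a Winkler (elastic)
    foundation under a point load P at x = 0 (Hetenyi's classical
    solution). k is the foundation modulus per unit length; E*I is the
    pipe's flexural rigidity. All numbers used below are illustrative."""
    beta = (k / (4.0 * E * I)) ** 0.25           # characteristic parameter, 1/m
    bx = beta * abs(x)
    return (P * beta / (2.0 * k)) * math.exp(-bx) * (math.cos(bx) + math.sin(bx))

# hypothetical pipe: P = 10 kN, k = 5 MN/m^2, E = 200 GPa, I = 1e-4 m^4
y0 = winkler_deflection(1.0e4, 5.0e6, 2.0e11, 1.0e-4, 0.0)   # peak deflection
```

    The characteristic parameter β sets how quickly the soil-supported pipe response decays away from the load, which is why soil stiffness dominates the response length scale in such analyses.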

  4. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  5. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This report presents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs. The primary reasons for this difference are the greater activity rate used in contemporary models for the Charleston source zone and the proper incorporation of uncertainty and randomness in GMAMs.
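    The logic-tree combination described above amounts to a weighted mean of the hazard estimates from the alternate GMAMs. A minimal sketch, using the report's 0.6/0.3/0.1 weights but hypothetical exceedance rates:

```python
def combined_rate(rate_by_model, weights):
    """Weighted mean annual exceedance rate across ground motion
    attenuation models, the logic-tree combination used in PSHA.
    The weights follow the report (0.6/0.3/0.1); the rates below are
    hypothetical, not taken from the SRS study."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[m] * r for m, r in rate_by_model.items())

# hypothetical exceedance rates at one spectral acceleration level
rates = {"EPRI2004": 1.0e-4, "USGS2002": 2.0e-4, "Silva2004": 3.0e-4}
weights = {"EPRI2004": 0.6, "USGS2002": 0.3, "Silva2004": 0.1}
combined = combined_rate(rates, weights)   # 1.5e-4 for these toy rates
```

    Repeating this combination across a grid of ground motion levels yields the weighted hazard curve from which uniform hazard spectra are read.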

  6. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Design and Construction... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?...

  7. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Design and Construction... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?...

  8. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 76-DESIGN AND CONSTRUCTION Design and Construction... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?...

  9. Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project

    SciTech Connect

    Subramanian, C.V.; King, J.L.; Perkins, D.M.; Mudd, R.W.; Richardson, A.M.; Calovini, J.C.; Van Eeckhout, E.; Emerson, D.O.

    1990-08-01

    This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

  10. Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Conrads, T.J.

    1993-06-01

    In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-Level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP) and this task was sponsored by DOE, Environmental Management. The TSEP comprises a number of consultants known for their knowledge of seismic ground motion and their expertise in the analysis of structures, systems and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance with these guidelines throughout the Title I and Title II design phases of the project, as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects which might employ the TSEP guidelines.

  11. Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system

    USGS Publications Warehouse

    Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.

    1972-01-01

    The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.

  12. AP1000® design robustness against extreme external events - Seismic, flooding, and aircraft crash

    SciTech Connect

    Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J.

    2012-07-01

    Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima-Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000® nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance and safety. The plant's compact safety-related footprint and the protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shut down the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond-design-basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail-safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air or HVAC). The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel containment vessel, which is further surrounded by a substantial steel-concrete composite shield building. The containment vessel is not affected by external flooding, and the shield building design provides hazard protection beyond that provided by a comparable reinforced concrete structure. The intent of this paper is to demonstrate the robustness of the AP1000 design against extreme events. The paper will focus on the plant's ability to withstand extreme external events such as beyond-design-basis flooding, seismic events, and malicious aircraft impact. The paper will highlight the robustness of the AP1000 nuclear island design, including the protection provided by the unique AP1000 composite shield building. (authors)

  13. Seismic Ecology

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The paper addresses the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibrations, shocks during earthquakes) or indirectly through buildings and structures, and they can be strong (felt by people) or weak (detected only by instruments). A great deal of work has been devoted to the influence of strong seismic actions (above all earthquakes) on people and structures; this work studies weak but prolonged seismic actions on buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized areas. Besides strong earthquakes, man-made seismic actions can exert an essential influence: explosions, seismic noise emitted by industrial facilities and moving transport, and vibration radiated by high-rise buildings and structures under wind action. The paper presents materials on the increase of man-made seismicity in a number of Russian regions that were previously aseismic. Along with seismic microzoning maps, it proposes maps showing the variation of the amplitude spectra of seismic noise over days, months, and years. Information on the amplitudes and frequencies of oscillations from possible earthquakes and man-made sources in a given region allows the design and construction of industrial and civil projects to be carried out soundly. Even in seismically safe regions, a building with a resonance frequency coinciding with the frequency of oscillations emitted locally by man-made sources can fail, with the heaviest consequences for its occupants. Practical examples of detailed engineering-seismological investigations of large industrial and civil projects in Siberia (hydroelectric power stations, bridges, and other structures) are given.

  14. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

    The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations to the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  15. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    SciTech Connect

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-08

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of connections already damaged by an earthquake. The experimental activity, together with FE simulation, demonstrated the adequacy of the advanced design methodology.

  16. Optimal seismic design of reinforced concrete structures under time-history earthquake loads using an intelligent hybrid algorithm

    NASA Astrophysics Data System (ADS)

    Gharehbaghi, Sadjad; Khatibinia, Mohsen

    2015-03-01

    A reliable seismic-resistant design of structures is achieved, in accordance with the seismic design codes, by designing structures under seven or more pairs of earthquake records. Based on the recommendations of seismic design codes, the average time-history response (ATHR) of the structure is required. This paper focuses on the optimal seismic design of reinforced concrete (RC) structures against ten earthquake records using a hybrid of a particle swarm optimization algorithm and an intelligent regression model (IRM). In order to reduce the computational time of the optimization procedure due to the cost of time-history analyses, the IRM is proposed to accurately predict the ATHR of structures. The proposed IRM combines the subtractive algorithm (SA), the K-means clustering approach, and a wavelet weighted least squares support vector machine (WWLS-SVM). To predict the ATHR of structures, the input-output samples of structures are first classified by the SA and K-means clustering approach. Then, the WWLS-SVM is trained, with few samples and high accuracy, for each cluster. 9- and 18-storey RC frames are designed optimally to illustrate the effectiveness and practicality of the proposed IRM. The numerical results demonstrate the efficiency and computational advantages of the IRM for the optimal design of structures subjected to time-history earthquake loads.
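    The cluster-then-regress structure of such a surrogate can be sketched in miniature: partition the samples with plain 1-D k-means (standing in for the SA/K-means step) and fit a least-squares line per cluster (standing in for the per-cluster WWLS-SVM). Everything below is an illustrative stand-in, not the authors' implementation:

```python
def kmeans_1d(xs, k, iters=20):
    """Plain 1-D k-means standing in for the paper's SA/K-means step.
    Initial centroids are spread across the sorted sample range."""
    srt = sorted(xs)
    cents = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda j: abs(x - cents[j]))].append(x)
        cents = [sum(g) / len(g) if g else cents[j] for j, g in enumerate(groups)]
    return cents, groups

def fit_line(xs, ys):
    """Closed-form least squares y = a + b*x: a toy per-cluster surrogate
    in place of the WWLS-SVM trained for each cluster in the paper."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# cluster hypothetical design samples, then fit one local surrogate each
samples = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
centroids, clusters = kmeans_1d(samples, 2)
```

    Training a small local model per cluster, rather than one global model, is what lets the surrogate stay accurate with few samples, which is the source of the computational savings the abstract reports.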

  17. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur in the determination of the timing of the arrivals, whether these errors are made by automated processing or by an analyst. One of the major obstacles encountered in properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses the factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy that can be used in this study. We have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase pick measurement error model into our statistical model. We can use this statistical model to optimally calibrate a picking error model to regional data. A follow-on report will present the results of this analysis plan applied to an implementation of an experiment/data-gathering task.

  18. Design and utilization of a portable seismic/acoustic calibration system

    SciTech Connect

    Stump, B.W.; Pearson, D.C.

    1996-10-01

    Empirical results from the current GSETT-3 illustrate the need for source-specific information for the purpose of calibrating the monitoring system. With the specified location design goal of 1,000 km², preliminary analysis indicates the importance of regional calibration of travel times. This calibration information can be obtained in a passive manner, utilizing locations derived from local seismic array arrival times and assuming the resulting locations are accurate. Alternatively, an active approach can be undertaken, making near-source observations of seismic sources of opportunity to provide specific information on the time, location and characteristics of the source. Moderate to large mining explosions are one source type that may be amenable to such calibration. This paper describes an active ground-truthing procedure for regional calibration. A prototype data acquisition system is discussed that includes a primary ground motion component for source time and location determination, and secondary, optional acoustic and video components for improved source phenomenology. The system costs approximately $25,000 and can be deployed and operated by one to two people, thus providing a cost-effective system for calibration and documentation of sources of interest. Practical implementation of the system is illustrated, emphasizing the minimal impact on an active mining operation.

  19. Core restraint and seismic analysis of a large heterogeneous free-flowering core design. Final report

    SciTech Connect

    Madell, J.T.; Moran, T.J.; Ash, J.E.; Fulford, P.J.

    1980-11-01

    The core restraint and seismic performance of a large heterogeneous core was analyzed. A free-flowering core restraint system was selected for this study, as opposed to the limited-free bow system of the FFTF and CRBRP. The key features of the core restraint system, such as stiff reflector assemblies and load pad properties, were specified in this study. Other features - such as the fuel-assembly description, flux and temperature distributions, and clearances between the assembly nozzle and grid plate - were obtained from the other parts of a large, heterogeneous Core Study 11 and 12. Core restraint analysis was performed with NUBOW-3D over the first two cycles of operation. The SCRAP code was used to analyze the time-history seismic response of the core with the effects of fluid, impact, and bowed assemblies modeled in the code. The core restraint system design was assessed in terms of the predicted forces, impacts, displacements, and reactivity effects for different cycle times and power/flow ratios.

  20. On the Need for Reliable Seismic Input Assessment for Optimized Design and Retrofit of Seismically Isolated Civil and Industrial Structures, Equipment, and Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Martelli, Alessandro

    2011-01-01

    Based on the experience of recent violent earthquakes, the limits of the methods currently used for the definition of seismic hazard are becoming more and more evident to many seismic engineers. Considerable improvement is felt necessary not only for the seismic classification of the territory (for which probabilistic seismic hazard assessment—PSHA—is generally adopted at present), but also for the evaluation of local amplification. With regard to the first item, a better knowledge of fault extension and near-fault effects, among others, is judged essential. The aforesaid improvements are particularly important for the design of seismically isolated structures, which relies on displacement. Such a design requires an accurate definition of the maximum displacement corresponding to the isolation period, and a reliable evaluation of the earthquake energy content at the low frequencies typical of isolated structures, for the site and ground of interest. These evaluations shall include possible near-fault effects, even in the vertical direction; for the construction of high-risk plants and components and the retrofit of some cultural heritage, they shall be performed for earthquakes characterized by very long return periods. The design displacement shall not be underestimated, but neither excessively overestimated, at least when rubber bearings are used in the seismic isolation (SI) system. In fact, as the transverse deformation of such SI systems decreases below a certain value, their horizontal stiffness increases. Thus, should a structure (e.g. a civil defence centre, a masterpiece, etc.) protected in this way be designed to withstand an unnecessarily large earthquake, the behaviour of its SI system would be inadequate (i.e. too stiff) during the much more frequent events that may actually strike the structure during its life. Furthermore, since SI can be used only when the room available laterally is sufficient to create a structural gap compatible with the design displacement, overestimating this displacement may lead to unnecessarily renouncing the use of a very efficient protection method, especially when retrofitting existing buildings. Finally, for long structures (e.g. several bridges or viaducts and even some buildings) an accurate evaluation of the possibly different ground displacements along the structure is required (this also applies to conventionally built structures). In order to overcome the limits of PSHA, the method shall be complemented by the development and application of deterministic models. In particular, the lack of displacement records requires the use of models calibrated against more commonly available velocity or acceleration records. The aforesaid remarks are now particularly important in the P.R. China and Italy, to ensure safe reconstruction after the Wenchuan earthquake of May 12, 2008 and the Abruzzo earthquake of April 6, 2009: in fact, wide use of SI and other anti-seismic systems has been planned in the areas struck by both events.
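    The displacement demand discussed above is, in common design practice, tied to the pseudo-spectral relation S_d = S_a·(T/2π)² evaluated at the isolation period. A minimal sketch with hypothetical spectral values, not figures from the paper:

```python
import math

def design_displacement(sa_g, t_iso):
    """Pseudo-spectral displacement S_d = S_a * (T / (2*pi))**2 at the
    isolation period: the quantity whose under- or over-estimation the
    paper warns about. sa_g is spectral acceleration in units of g."""
    g = 9.81  # m/s^2
    return sa_g * g * (t_iso / (2.0 * math.pi)) ** 2

# hypothetical: Sa = 0.2 g at an isolation period of 3 s
d_design = design_displacement(0.2, 3.0)   # metres
```

    Because S_d grows with the square of the period, long-period (isolated) structures are especially sensitive to errors in the low-frequency energy content of the design spectrum, which is the paper's central concern.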

  1. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-08

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey-stiffness of the damped braces proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in an associated paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed in a medium-risk region, is supposed to be retrofitted as in a high-risk region by the insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using the step-by-step procedures described in the associated paper mentioned above; the behaviour of frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for the dynamic analyses.

  2. Some issues in the seismic design of nuclear power-plant facilities

    SciTech Connect

    Hadjian, A.H.; Iwan, W.D.

    1980-09-01

    This paper summarizes the major issues discussed by an international panel of experts during the post-SMIRT (Structural Mechanics in Reactor Technology) Seminar on Extreme Load Design of Nuclear Power-Plant Facilities, which was held in Berlin, Aug. 20-21, 1979. The emphasis of the deliberations was on the state of the art of seismic-response calculations to predict the expected performance of structures and equipment during earthquakes. Four separate panels discussed issues on (1) soil-structure interaction and structural response, (2) modeling, materials, and boundary conditions, (3) damping in structures and equipment, and (4) fragility levels of equipment. The international character of the seminar was particularly helpful in the cross-pollination of ideas regarding the issues and the steps required to enhance the cause of safety of nuclear plants.

  3. On the Computation of H/V and its Application to Microzonation and Seismic Design

    NASA Astrophysics Data System (ADS)

    Perton, M.; Martínez, J. A.; Lermo, J. F.; Sánchez-Sesma, F. J.

    2014-12-01

    The H/V ratio is the square root of the ratio of horizontal to vertical energies of ground motion. It has been observed that the frequency of its main peak is well suited to characterizing site effects, and the ratio has been widely used for micro-zonation and seismic structural design. Historically, the ratio was computed as the average of individual H/V ratios obtained from noise autocorrelations. Nevertheless, it has recently been pointed out that the H/V ratio should instead be calculated as the ratio of the average of H over the average of V. This calculation is based on the relation between the directional energies (the imaginary part of the Green's function) and the noise autocorrelations. In general, the average of ratios differs from the ratio of averages. Although the frequency of the main response was correctly obtained, the associated amplification factor has generally been poorly predicted, showing little agreement with the amplification observed during strong earthquakes. The unexpected decay of such ratios at high frequency and the lack of stability and reproducibility of the H/V ratios are other problems facing the method. These problems are addressed here from the point of view of the normalization of noise correlations. In fact, several normalization techniques have already been proposed in order to correctly retrieve the Green's function. Some of them are well suited to retrieving the surface-wave contribution, while others are more appropriate for bulk-wave incidence. Since the H/V ratio may be used for various purposes, such as surface-wave tomography, micro-zonation or seismic design, different normalizations are discussed in relation to these objectives. The H/V ratios obtained from local historical earthquakes on top of, or far away from, the subduction zone are also discussed. ACKNOWLEDGEMENT This research has been partially supported by DGAPA-UNAM under Project IN104712 and the AXA Research Fund.
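
    The abstract's central point, that averaging per-window ratios is not the same as taking the ratio of averaged energies, can be illustrated numerically. A minimal sketch with synthetic lognormal spectral energies (all values hypothetical, not data from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n_windows, n_freqs = 200, 64

# Synthetic horizontal and vertical spectral energies for many noise
# windows (hypothetical lognormal data, not measurements).
H = rng.lognormal(mean=0.5, sigma=0.8, size=(n_windows, n_freqs))
V = rng.lognormal(mean=0.0, sigma=0.8, size=(n_windows, n_freqs))

# Classical estimate: average the per-window ratios, then take sqrt.
hv_avg_of_ratios = np.sqrt((H / V).mean(axis=0))

# Estimate advocated in the abstract: ratio of the averaged energies.
hv_ratio_of_avgs = np.sqrt(H.mean(axis=0) / V.mean(axis=0))

# The two estimators differ in general; for heavy-tailed spectra the
# average of ratios is biased high (Jensen's inequality applied to 1/V),
# which is one way an amplification factor can be over-predicted.
bias = hv_avg_of_ratios.mean() / hv_ratio_of_avgs.mean()
```

    On data like this the classical estimator systematically exceeds the ratio-of-averages one, consistent with the over-predicted amplification factors the abstract describes.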

  4. Effects of charge design features on parameters of acoustic and seismic waves and cratering, for SMR chemical surface explosions

    NASA Astrophysics Data System (ADS)

    Gitterman, Y.

    2012-04-01

    A series of experimental on-surface shots was designed and conducted by the Geophysical Institute of Israel at Sayarim Military Range (SMR) in the Negev desert, including two large calibration explosions: about 82 tons of strong IMI explosives in August 2009, and about 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, the CTBTO, the USA and several European countries, with the main goal of providing fully controlled ground-truth (GT0) infrasound sources in different weather/wind conditions for the calibration of IMS infrasound stations in Europe, the Middle East and Asia. Strong boosters and an upward charge-detonation scheme were applied to reduce the energy released to the ground and enlarge the energy radiated to the atmosphere, producing enhanced infrasound signals for better observation at far-regional stations. The following observations and results indicate that the explosive energy was partitioned as required by this charge design: 1) crater size and local seismic (duration) magnitudes were smaller than expected for these large surface explosions; 2) small test shots of the same charge (1 ton) conducted at SMR with different detonation directions clearly showed lower seismic amplitudes/energy and smaller crater size for the upward detonation; 3) many infrasound stations at local and regional distances showed higher than expected peak amplitudes, even after application of a wind-correction procedure. For the large-scale explosions, high-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimate of the charge yield. Empirical relations for air-blast parameters - peak pressure, impulse and the Secondary Shock (SS) time delay - as functions of distance were developed and analyzed.
The parameters, scaled by the cube root of the estimated TNT-equivalent charges, were found consistent for all analyzed explosions, except for the SS time delays, which were clearly separated for the shot of IMI explosives (characterized by a much higher detonation velocity than ANFO). Additionally, acoustic records at close distances from the WSMR explosions Distant Image (2440 tons of ANFO) and Minor Uncle (2725 tons of ANFO) were used to extend the charge and distance range for the scaled SS-delay relationship, which showed consistency with the SMR ANFO shots. The specific charge design developed here contributed to the success of this unique dual Sayarim explosion experiment, which provided the strongest GT0 sources since the establishment of the IMS network, demonstrated clearly the most favorable westward/eastward infrasound propagation up to 3400/6250 km according to the appropriate summer/winter weather patterns and stratospheric wind directions, respectively, and thus empirically verified common models of infrasound propagation in the atmosphere. The research was supported by the CTBTO, Vienna, and the Israel Ministry of Immigrant Absorption.
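
    The cube-root scaling used to compare air-blast parameters across charge sizes is the standard Hopkinson-Cranz scaled distance. A short sketch; the charge masses and gauge ranges below are illustrative, not the actual Sayarim layout:

```python
def scaled_distance(range_m, charge_kg):
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3), in m/kg**(1/3).
    Shots observed at equal Z show comparable peak pressure and scaled
    impulse, which is how different charge sizes collapse onto one
    empirical air-blast curve."""
    return range_m / charge_kg ** (1.0 / 3.0)

# Illustrative numbers only (not the actual Sayarim gauge positions):
z_test_shot = scaled_distance(100.0, 1000.0)      # 1 t shot, gauge at 100 m
z_large_shot = scaled_distance(464.2, 100000.0)   # 100 t shot, gauge at ~464 m
# Both gauges sit at essentially the same scaled distance Z = 10,
# so similar peak pressures would be expected at them.
```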

  5. GA-based optimum design of a shape memory alloy device for seismic response mitigation

    NASA Astrophysics Data System (ADS)

    Ozbulut, O. E.; Roschke, P. N.; Lin, P. Y.; Loh, C. H.

    2010-06-01

    Damping systems discussed in this work are optimized so that a three-story steel frame structure and its shape memory alloy (SMA) bracing system minimize response metrics due to a custom-tailored earthquake excitation. Multiple-objective numerical optimization that simultaneously minimizes displacements and accelerations of the structure is carried out with a genetic algorithm (GA) in order to optimize SMA bracing elements within the structure. After design of an optimal SMA damping system is complete, full-scale experimental shake table tests are conducted on a large-scale steel frame that is equipped with the optimal SMA devices. A fuzzy inference system is developed from data collected during the testing to simulate the dynamic material response of the SMA bracing subcomponents. Finally, nonlinear analyses of a three-story braced frame are carried out to evaluate the performance of comparable SMA and commonly used steel braces under dynamic loading conditions and to assess the effectiveness of GA-optimized SMA bracing design as compared to alternative designs of SMA braces. It is shown that peak displacement of a structure can be reduced without causing significant acceleration response amplification through a judicious selection of physical characteristics of the SMA devices. Also, SMA devices provide a recentering mechanism for the structure to return to its original position after a seismic event.
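
    As a rough illustration of the GA-based approach described above, the sketch below runs a tiny mutation-only genetic algorithm on a weighted sum of two stand-in objectives (peak displacement and peak acceleration). The objective functions and parameter ranges are invented for illustration and bear no relation to the paper's structural model:

```python
import random

random.seed(1)

def objectives(x):
    # Toy stand-ins for peak displacement and peak acceleration of the
    # frame as functions of two SMA device parameters, both in [0, 1].
    k, f = x
    disp = (1.0 - k) ** 2 + 0.1 * f        # stiffer brace -> less drift
    accel = k ** 2 + (1.0 - f) ** 2        # too stiff -> more acceleration
    return disp, accel

def fitness(x, w=0.5):
    d, a = objectives(x)
    return w * d + (1.0 - w) * a           # weighted-sum scalarization

def evolve(pop_size=40, gens=60):
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)   # binary tournament selection
            parent = min(a, b, key=fitness)
            # Gaussian mutation, clipped to the feasible range.
            child = [min(1.0, max(0.0, g + random.gauss(0.0, 0.05)))
                     for g in parent]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

best = evolve()   # converges near the minimum of the weighted sum
```

    A production multi-objective GA would keep a Pareto front rather than a single weighted sum, but the selection/mutation loop is the same in spirit.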

  6. Image resolution analysis: A new, robust approach to seismic survey design

    NASA Astrophysics Data System (ADS)

    Tzimeas, Constantinos

    Seismic survey design methods often rely on qualitative measures to provide an optimal image of their objective target. Fold, ray-tracing techniques that count ray hits on binned interfaces, and even advanced 3-D survey design methods that try to optimize offset and azimuth coverage are prone to fail in their imaging predictions, especially in complex geological or structural settings. The reason for the potential failure of these commonly used approaches is that they do not take into account the ray geometry at the target points. Inverse-theory results can provide quantitative and objective constraints on acquisition design. Beylkin's contribution to this field is an elegant and simple equation describing a reconstructed point scatterer given the source/receiver distribution used in the imaging experiment. Quantitative measures of spatial image resolution were developed to assess the efficacy of competing acquisition geometries. Apart from the source/receiver configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate imaging of target points located underneath and near the flanks of the diapir. Three different survey designs were examined. Results from these simulations show that, contrary to simple models, near offsets do not always produce better-resolved images than far offsets. However, consideration of decreasing signal-to-noise ratio revealed that images obtained from the far-offset experiment degrade faster than the near-offset ones. The image analysis was performed on VSP field data as well as on synthetics generated by finite-difference forward modeling. The predicted image resolution results were compared to the resolution measured from the migrated sections of both the field data and the synthetics.
This comparison confirms that image resolution analysis provides as good a resolution prediction as the prestack Kirchhoff depth-migrated section of the synthetic gathers. Even in the case of the migrated field data, despite the presence of error-introducing factors (different signal-to-noise ratios, shape and frequency content of source wavelets, etc.), image resolution analysis performed well, exhibiting the same trends of resolution change at the different test points.

  7. Basis of Design and Seismic Action for Long Suspension Bridges: the case of the Messina Strait Bridge

    SciTech Connect

    Bontempi, Franco

    2008-07-08

    The basis of design for complex structures like suspension bridges is reviewed. Specific attention is devoted to the seismic action, to the required performance, and to the associated structural analysis. Uncertainty is addressed in particular through probabilistic and soft-computing techniques. The paper makes specific reference to the work and the experience developed over recent years for the re-design of the Messina Strait Bridge.

  8. Design of an implantable seismic sensor placed on the ossicular chain.

    PubMed

    Sachse, M; Hortschitz, W; Stifter, M; Steiner, H; Sauter, T

    2013-10-01

    This paper presents a design guideline for matching a fully implantable middle-ear microphone with the physiology of human hearing. The guideline defines the first natural frequency of a seismic sensor placed at the tip of the manubrium mallei with respect to the frequency-dependent hearing of the human ear as well as the deflection of the ossicular chain. A transducer designed in compliance with the presented guideline reduces the range of the output signal while preserving all information carried by the ossicular chain. In addition to output-signal compression, static deflections, which can mask the tiny motions of the ossicles, are reduced. For guideline verification, a microelectromechanical system (MEMS) based on silicon-on-insulator technology was produced and tested. This prototype is capable of resolving 0.4 pm/Hz with a custom-made read-out circuit. For a bandwidth of 0.1 kHz, this deflection is comparable with the lower threshold of speech (approximately 40 phon). PMID:23810385
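
    The link between a seismic (inertial) sensor's first natural frequency and its deflection response follows from the standard spring-mass relations. A sketch with hypothetical mass and stiffness values, not those of the published MEMS device:

```python
import math

def natural_frequency_hz(k_n_per_m, m_kg):
    """First natural frequency f0 = sqrt(k/m) / (2*pi) of a spring-mass
    seismic sensor."""
    return math.sqrt(k_n_per_m / m_kg) / (2.0 * math.pi)

# Hypothetical MEMS proof mass and suspension stiffness (illustrative).
m = 2e-9    # kg
k = 0.08    # N/m
f0 = natural_frequency_hz(k, m)   # on the order of 1 kHz for these values

# Well below resonance, the proof-mass deflection per unit input
# acceleration is 1 / omega0**2, so lowering f0 raises sensitivity but
# also increases static deflection - the trade-off the guideline manages.
omega0 = 2.0 * math.pi * f0
sens_m_per_ms2 = 1.0 / omega0 ** 2
```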

  9. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  10. MASSACHUSETTS DEP EELGRASS VERIFIED POINTS

    EPA Science Inventory

    Field verified points showing presence or absence of submerged rooted vascular plants along Massachusetts coastline. In addition to the photo interpreted eelgrass coverage (EELGRASS), this point coverage (EGRASVPT) was generated based on field-verified sites as well as all field...

  11. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to be run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check/select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine whether there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  12. Verifiable threshold signature schemes against conspiracy attack.

    PubMed

    Gan, Yuan-Ju

    2004-01-01

    In this study, the author has designed new verifiable (t,n) threshold untraceable signature schemes. The proposed schemes have the following properties: (1) Verification: the shadows of the secret distributed by the trusted center can be verified by all of the participants; (2) Security: even if the number of dishonest members exceeds the threshold value, they cannot recover the system secret parameters, such as the group secret key, or forge another member's individual signature; (3) Efficient verification: the verifier can verify the group signature easily, and the verification time of the group signature is equivalent to that of an individual signature; (4) Untraceability: the signers of the group signature cannot be traced. PMID:14663852
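
    The verification property, in which participants check the shadows distributed by the trusted center against public commitments, can be illustrated with a textbook Feldman-style verifiable secret sharing sketch. The parameters are toy-sized and the construction is generic, not the paper's actual scheme:

```python
import random

# Toy-sized Feldman-style verifiable secret sharing: the dealer publishes
# commitments to the polynomial coefficients, and every participant can
# check that the shadow (share) received is consistent with them.
# Real schemes use large primes; these parameters are illustrative only.
p, q = 2039, 1019            # p = 2q + 1, both prime
g = pow(2, (p - 1) // q, p)  # generator of the order-q subgroup

random.seed(7)
t, n, secret = 3, 5, 123
coeffs = [secret] + [random.randrange(1, q) for _ in range(t - 1)]

def f(x):                    # dealer's secret polynomial over GF(q)
    return sum(c * x ** i for i, c in enumerate(coeffs)) % q

shares = {i: f(i) for i in range(1, n + 1)}      # sent privately
commitments = [pow(g, c, p) for c in coeffs]     # published by the dealer

def verify_share(i, share):
    """Check g**share == prod(C_j ** (i**j)) mod p."""
    lhs = pow(g, share, p)
    rhs = 1
    for j, C in enumerate(commitments):
        rhs = rhs * pow(C, i ** j, p) % p
    return lhs == rhs
```

    Any t valid shares reconstruct the secret by Lagrange interpolation over GF(q), while a corrupted share fails the public check.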

  13. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  14. Simplified seismic collapse capacity-based evaluation and design of frame buildings with and without supplemental damping systems

    NASA Astrophysics Data System (ADS)

    Hamidia, Mohammad Javad

    A simplified procedure is developed for estimating the seismic sidesway collapse capacity of frame building structures. The procedure is then extended to quantify the seismic collapse capacity of buildings incorporating supplemental damping systems. The proposed procedure is based on a robust database of seismic peak displacement responses of viscously damped nonlinear single-degree-of-freedom systems over various seismic intensities, and uses nonlinear static (pushover) analysis without the need for nonlinear time-history dynamic analysis. The proposed procedure is assessed by comparing its collapse capacity predictions for 1470 different building models with those obtained from incremental nonlinear dynamic analyses. A straightforward, unifying, collapse-capacity-based design procedure aimed at achieving a pre-determined probability of collapse under the maximum considered earthquake event is also introduced for structures equipped with viscous dampers (linear and nonlinear) and hysteretic dampers. The proposed simplified procedure offers a simple, yet efficient, computational/analytical tool that is capable of predicting collapse capacities with acceptable accuracy for a wide variety of frame building structures incorporating several types of supplemental damping systems.

  15. Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces

    NASA Astrophysics Data System (ADS)

    Garg, Akshay; Singh, Amit

    2012-07-01

    Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of the proposed mission is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units: a Mother Rover as a surrogate (carrier) and two Baby Rovers as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each having an individual power supply with solar cells and individual data-transmission capabilities, to suitable sites such as the chasmata associated with Valles Marineris, craters or sand dunes. The Mother Rover deploys the Baby Rovers in two opposite directions, and they follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Until the active experiments begin, the Mother Rover acts as a guiding unit to control the spatial spread of the detection instruments. After the active shock experimentation, the Baby Rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering and landslides. Further experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized for carrying out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of measurements (through concurrent readings) and prolong the life cycle of the overall experiment.
The proposed rover can be customised according to the associated scientific objectives and further needs.

  16. Geological investigation for CO2 storage: from seismic and well data to storage design

    NASA Astrophysics Data System (ADS)

    Chapuis, Flavie; Bauer, Hugues; Grataloup, Sandrine; Leynet, Aurélien; Bourgine, Bernard; Castagnac, Claire; Fillacier, Simon; Lecomte, Antony; Le Gallo, Yann; Bonijoly, Didier

    2010-05-01

    The main purpose of this study is to evaluate the techno-economic potential of storing 200 000 tCO2 per year produced by a sugar beet distillery. To reach this goal, an accurate hydro-geological characterisation of a CO2 injection site is of primary importance, because it strongly influences site selection, storage design and risk management. Geological investigation for CO2 storage is usually focused on the center or deepest part of sedimentary basins. However, CO2 producers are not always located near such settings, so other geological configurations have to be studied. This is the aim of this project, which is located near the south-west border of the Paris Basin, in the Orléans region. Special geometries, such as onlaps and pinch-outs of formations against the basement, are likely to be observed and so have to be taken into account. Two deep saline aquifers are potentially good candidates for CO2 storage: the Triassic continental deposits capped by the Upper Triassic/Lower Jurassic continental shales, and the Dogger carbonate deposits capped by the Callovian and Oxfordian shales. First, a data review was undertaken to establish the palaeogeographical settings and initial estimates of the facies, thicknesses and depths of the targeted formations. It was followed by a seismic interpretation. Three hundred kilometres of seismic lines were reprocessed and interpreted to characterize the geometry of the studied area. The main structure identified is the Étampes fault, which affects all the formations.
Apart from the vicinity of the fault, where drag folds appear, the layers are sub-horizontal and gently dip and thicken eastwards. Then, the interpreted seismic lines, together with well data from more than 50 boreholes, were integrated into a 2D model of the main surfaces using geostatistics (Isatis® and Petrel® software). The main difficulty of this step was to generate a realistic model accounting for both the specific geometries linked to the basin border (onlapping, pinching out...) and the faults. While the former only concern the Triassic, the latter also affect the overlying formations. The Dogger top surface is less than 700 m deep in the western area, which is too shallow for injection in the supercritical state. Consequently, the next part of the study focused on the Triassic reservoir and integrated changes in petrophysical properties as a function of lateral lithological variation. This ultimately led to upgrading the model from 2D to 3D in order to simulate CO2 migration. To achieve this objective, we first applied sequence stratigraphy concepts to the Triassic deposits to compensate for the lack of quantitative petrophysical data. This provided qualitative information about the reservoir heterogeneities, which is crucial for realistic 3D modelling. Paleoenvironmental reconstructions show that the sediment supply direction is WSW-ENE, implying more proximal deposits, and so better reservoir properties, to the west. The final step is to use this 3D model to elaborate a flow model estimating the injectivity rate, the extension of the overpressure within the open aquifer, and the CO2 plume after 30 years of injection. Two injection rates and two well locations were combined into four scenarios. In every case, the fault has been considered a barrier to CO2 migration and the system treated as closed.
In all four cases the results are satisfactory: the overpressure is less than 30% of the initial pressure, and the reservoir capacity is sufficient for the goal of the project. The results of these simulations will then be integrated into the risk analysis of the project, which is of utmost importance to ensure safety and support public acceptance. Acknowledgements: This work is supported by the French Ministry of Research (DRRT), the regional Council "Région Centre", the European Regional Development Fund (FEDER) and the BRGM.

  17. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection, and the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem through modeling and analysis of the frequency synchronizer.

  18. Verifiable and Redactable Medical Documents

    PubMed Central

    Brown, Jordan; Blough, Douglas M.

    2012-01-01

    This paper considers how to verify provenance and integrity of data in medical documents that are exchanged in a distributed system of health IT services. Provenance refers to the sources of health information within the document and integrity means that the information was not modified after generation by the source. Our approach allows intermediate parties to redact the document by removing information that they do not wish to reveal. For example, patients can store verifiable health information and provide subsets of it to third parties, while redacting sensitive information that they do not wish employers, insurers, or others to receive. Our method uses a cryptographic primitive known as a redactable signature. We study practical issues and performance impacts of building, redacting, and verifying Continuity of Care Documents (CCDs) that are protected with redactable signatures. Results show that manipulating redactable CCDs provides superior security and privacy with little computational overhead. PMID:23304391
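
    The core redaction idea, signing per-field hashes so that any field can later be replaced by its hash without breaking the signature, can be sketched as follows. For brevity this sketch uses an HMAC as a stand-in for the real public-key redactable-signature primitive, so verification here would require the signing key; the paper's scheme does not have that limitation:

```python
import hashlib
import hmac

SIGNING_KEY = b"demo-key"  # stand-in for the signer's key (illustrative)

def field_hash(value: bytes) -> bytes:
    return hashlib.sha256(b"leaf|" + value).digest()

def sign(values):
    # The signature covers only the field hashes, so a field can later be
    # replaced by its hash without invalidating the signature.
    digest = b"".join(field_hash(v) for v in values)
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()

def redact(fields, index):
    out = list(fields)
    kind, value = out[index]
    if kind == "clear":
        out[index] = ("hash", field_hash(value))  # disclose the hash only
    return out

def verify(fields, signature):
    digest = b"".join(field_hash(v) if kind == "clear" else v
                      for kind, v in fields)
    expected = hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# Hypothetical CCD-like field list.
ccd = [("clear", b"name: Jane Doe"),
       ("clear", b"dx: hypertension"),
       ("clear", b"ssn: 000-00-0000")]
sig = sign([v for _, v in ccd])
partial = redact(ccd, 2)   # hide the SSN before sharing with a third party
```

    The redacted document still verifies against the original signature, while any modification of a disclosed field does not.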

  19. Seismic design and evaluation guidelines for the Department of Energy High-Level Waste Storage Tanks and Appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1995-10-01

    This document provides seismic design and evaluation guidelines for underground high-level waste storage tanks. The guidelines reflect the knowledge acquired in the last two decades in defining seismic ground motion and calculating hydrodynamic loads, dynamic soil pressures and other loads for underground tank structures, piping and equipment. The application of the guidelines is illustrated with examples. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable to other types of tank structures as well. The application of these concepts, and of suitably adjusted versions of them, to other structural types will be addressed in a future version of this document. The original version of this document was published in January 1993. Since then, additional studies have been performed in several areas and the results are included in this revision. Comments received from users are also addressed. Fundamental concepts supporting the basic seismic criteria contained in the original version have since been incorporated and published in DOE-STD-1020-94 and its technical basis documents. That information has been deleted from the current revision.

  20. Seismic design of steel structures with lead-extrusion dampers as knee braces

    SciTech Connect

    Monir, Habib Saeed; Naser, Ali

    2008-07-08

    One of the effective methods of decreasing the seismic response of structures to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and their behavior is independent of velocity within the seismic frequency range. In this paper, lead dampers are considered as knee braces in steel frames and are studied from an economic point of view. Because lead dampers do not obstruct structural panels, they can also resolve bracing problems from an architectural point of view. The behavior of these dampers is compared with that of other kinds of dampers, such as XADAS and TADAS devices. The results indicate that lead dampers absorb the energy induced by earthquakes properly and perform well in controlling the seismic movements of multi-story structures.
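
    The near-rectangular, rate-independent hysteresis loop attributed to LEDs can be reproduced with a simple elastoplastic (stiff spring plus yielding slider) model. The stiffness and yield force below are illustrative values, not device data from the paper:

```python
def led_force(disp_history, k_elastic=50.0, f_yield=10.0):
    """Rate-independent elastoplastic model producing the near-rectangular
    hysteresis loop of a lead-extrusion damper. Returns the damper force
    at each displacement sample (units are arbitrary/illustrative)."""
    forces, plastic = [], 0.0
    for u in disp_history:
        f_trial = k_elastic * (u - plastic)
        if abs(f_trial) > f_yield:          # slider yields: force saturates
            sign = 1.0 if f_trial > 0 else -1.0
            plastic = u - sign * f_yield / k_elastic
            f_trial = sign * f_yield
        forces.append(f_trial)
    return forces

# One push-pull displacement cycle: the force saturates at +/- f_yield,
# tracing a loop whose area (the dissipated energy) approaches
# 4 * f_yield * u_max for a stiff elastic branch.
cycle = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5, 0.0]
fs = led_force(cycle)
```

    Because the model has no velocity term, the loop is the same at any loading rate, matching the velocity independence noted in the abstract.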

  1. Verifying the Hanging Chain Model

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2013-01-01

    The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…

  3. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. The technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (2) For probabilistic analyses supporting the demonstration of compliance with preclosure performance objectives, provide a mean seismic hazard curve for the surface facilities area. Results should be consistent with the PSHA for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (3) For annual ground motion exceedance probabilities appropriate for postclosure analyses, provide site-specific seismic time histories (acceleration, velocity, and displacement) for the waste emplacement level. Time histories should be consistent with the PSHA and reflect available knowledge on the limits to extreme ground motion at Yucca Mountain. (4) In support of ground-motion site-response modeling, perform field investigations and laboratory testing to provide a technical basis for model inputs. Characterize the repository block and areas in which important-to-safety surface facilities will be sited. Work should support characterization and reduction of uncertainties in inputs to ground-motion site-response modeling. (5) On the basis of rock mechanics, geologic, and seismic information, determine limits on extreme ground motion at Yucca Mountain and document the technical basis for them. (6) Update the ground-motion site-response model, as appropriate, on the basis of new data. Expand and enhance the technical basis for model validation to further increase confidence in the site-response modeling. (7) Document seismic methodologies and approaches in reports to be submitted to the NRC. (8) Address condition reports.
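
    One building block of the design response spectra mentioned in objective (1) is the peak response of a damped single-degree-of-freedom oscillator to a ground-acceleration history. The sketch below computes one spectrum point with the Newmark average-acceleration method; it is a generic textbook procedure, not the project's site-response code:

```python
import math

def pseudo_accel_spectrum_point(ag, dt, period_s, zeta=0.05):
    """Peak pseudo-acceleration Sa = omega**2 * max|u| of a unit-mass
    linear SDOF oscillator driven by ground acceleration ag (m/s^2),
    integrated with the Newmark average-acceleration method."""
    w = 2.0 * math.pi / period_s
    c, k = 2.0 * zeta * w, w * w          # per unit mass
    u, v = 0.0, 0.0
    a = -ag[0] - c * v - k * u            # relative acceleration at t = 0
    u_max = 0.0
    denom = 1.0 + c * dt / 2.0 + k * dt * dt / 4.0
    for ag_next in ag[1:]:
        u_pred = u + dt * v + dt * dt / 4.0 * a
        v_pred = v + dt / 2.0 * a
        a_next = (-ag_next - c * v_pred - k * u_pred) / denom
        u = u_pred + dt * dt / 4.0 * a_next
        v = v_pred + dt / 2.0 * a_next
        a = a_next
        u_max = max(u_max, abs(u))
    return w * w * u_max

# Resonant sine input: Sa should approach ag_peak / (2 * zeta) = 10 m/s^2
# for a 1 m/s^2 input, i.e. a 10x dynamic amplification at 5% damping.
dt, T = 0.005, 0.5
ag = [math.sin(2.0 * math.pi / T * i * dt) for i in range(2000)]
sa = pseudo_accel_spectrum_point(ag, dt, T, zeta=0.05)
```

    Sweeping `period_s` over a grid of oscillator periods, and repeating for several damping ratios, yields the family of response-spectrum curves a design spectrum is built from.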

  4. Southern California Seismic Network: New Design and Implementation of Redundant and Reliable Real-time Data Acquisition Systems

    NASA Astrophysics Data System (ADS)

    Saleh, T.; Rico, H.; Solanki, K.; Hauksson, E.; Friberg, P.

    2005-12-01

    The Southern California Seismic Network (SCSN) handles more than 2500 high-data-rate channels from more than 380 seismic stations distributed across southern California. These data are imported in real time from dataloggers, earthworm hubs, and partner networks. The SCSN also exports data to eight different partner networks. Both the imported and exported data are critical for emergency response and scientific research. Previous data acquisition systems were complex and difficult to operate, because they grew in an ad hoc fashion to meet the increasing needs for distributing real-time waveform data. To maximize reliability and redundancy, we apply best-practice methods from computer science for implementing the software and hardware configurations for import, export, and acquisition of real-time seismic data. Our approach makes use of failover software designs, methods for dividing labor diligently amongst the network nodes, and state-of-the-art networking redundancy technologies. To facilitate maintenance and daily operations, we seek to provide some separation between major functions such as data import, export, acquisition, archiving, real-time processing, and alarming. As an example, we make waveform import and export functions independent by operating them on separate servers. Similarly, two independent servers provide waveform export, allowing data recipients to implement their own redundancy. The data import is handled differently, using one primary server and a live backup server. These data import servers run failover software that allows automatic role switching from primary to shadow in case of failure. Similar to the classic earthworm design, all the acquired waveform data are broadcast onto a private network, which allows multiple machines to acquire and process the data. 
As we separate data import and export away from acquisition, we are also working on new approaches to separate real-time processing and rapid, reliable archiving of real-time data. Further, improved network security is an integral part of the new design. Redundant firewalls will provide secure data import, export, and acquisition, as well as DMZs for web servers and other publicly available servers. We will present the detailed design of this new configuration, which is currently being implemented by the SCSN at Caltech. The design principles are general enough to be of use to most regional seismic networks.

  5. Rapid estimation of earthquake loss based on instrumental seismic intensity: design and realization

    NASA Astrophysics Data System (ADS)

    Huang, Hongsheng; Chen, Lin; Zhu, Gengqing; Wang, Lin; Lin, Yanzhao; Wang, Huishan

    2013-11-01

    As a result of our ability to acquire large volumes of real-time earthquake observation data, coupled with increased computer performance, near real-time instrumental seismic intensity can be obtained by using ground motion data observed by instruments together with appropriate spatial interpolation methods. By combining vulnerability study results from earthquake disaster research with earthquake disaster assessment models, we can estimate the losses caused by devastating earthquakes, in an attempt to provide more reliable information for earthquake emergency response and decision support. This paper analyzes the latest progress on methods of rapid earthquake loss estimation in China and abroad. A new method that uses rapid reporting of seismic instrumental intensity to estimate earthquake loss is proposed and the relevant software is developed. Finally, a case study using the ML 4.9 earthquake that occurred in Shun-chang County, Fujian Province, on March 13, 2007 is given as an example of the proposed method.
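
The interpolation-then-estimate step described above can be sketched in a few lines. This is a generic inverse-distance-weighting (IDW) example; the abstract does not state which spatial interpolation method the authors actually used, and the station coordinates and intensity values below are invented:

```python
import math

def idw_intensity(stations, query, power=2.0):
    """Inverse-distance-weighted estimate of instrumental intensity at a
    query point, from (x_km, y_km, intensity) station triples.
    (Hypothetical helper; the paper does not name its interpolation scheme.)"""
    num = den = 0.0
    for x, y, val in stations:
        d = math.hypot(query[0] - x, query[1] - y)
        if d < 1e-9:          # query coincides with a station
            return val
        w = d ** -power
        num += w * val
        den += w
    return num / den

# Three stations around a site at the origin (made-up values)
stations = [(10.0, 0.0, 6.0), (0.0, 12.0, 5.5), (-8.0, -6.0, 7.0)]
est = idw_intensity(stations, (0.0, 0.0))
```

The estimate falls between the station values, weighted toward the nearer stations; a loss model would then map the interpolated intensity field to damage and economic loss.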

  6. Verifying differential pressure transmitter operation

    SciTech Connect

    Corley, M.A.; O'Neal, D.L.

    1999-07-01

    The monitoring of chilled and hot water consumption has become more important in recent years. However, reduction of consumption through energy conserving retrofits has reduced flow velocities significantly. This paper presents the results of a study performed to verify that differential pressure transmitters used in chilled and hot water metering were able to capture actual conditions within acceptable accuracy even at low flow rates. The results fell into three categories: transmitters whose expected and actual output coincided, transmitters that exhibited offset and slope errors, and transmitters that exhibited errors from unknown sources.
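
The offset and slope errors the study reports can be detected by regressing actual transmitter output against expected output. The sketch below uses ordinary least squares on made-up readings; the study's actual calibration data and pass/fail thresholds are not given in the abstract:

```python
def fit_line(expected, actual):
    """Ordinary least-squares fit: actual ~ slope * expected + offset.
    A slope near 1 and offset near 0 indicate a healthy transmitter.
    (Illustrative only; not the study's documented procedure.)"""
    n = len(expected)
    mx = sum(expected) / n
    my = sum(actual) / n
    sxx = sum((x - mx) ** 2 for x in expected)
    sxy = sum((x - mx) * (y - my) for x, y in zip(expected, actual))
    slope = sxy / sxx
    offset = my - slope * mx
    return slope, offset

# A transmitter reading 2% high with a +0.05 psi offset (invented numbers)
expected = [0.0, 1.0, 2.0, 3.0, 4.0]
actual = [0.05 + 1.02 * x for x in expected]
slope, offset = fit_line(expected, actual)
```

Transmitters whose fitted slope and offset deviate from (1, 0) beyond the metering accuracy budget would fall into the study's second category; residual scatter with a good fit would suggest errors from unknown sources.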

  7. Optimum seismic structural design based on random vibration and fuzzy graded damages

    NASA Technical Reports Server (NTRS)

    Cheng, Franklin Y.; Ou, Jin-Ping

    1990-01-01

    This paper presents the fuzzy dynamical reliability and failure probability as well as the basic principles and the analytical method of loss assessment for nonlinear seismic steel structures. Also presented is the optimization formulation and a numerical example for double objectives, initial construction cost and expected failure loss, and dynamical reliability constraints. The earthquake ground motion is based on a stationary filtered non-white noise and the fuzzy damage grade is described by damage index.

  8. Seismic design spectra 200 West and East Areas DOE Hanford Site, Washington

    SciTech Connect

    Tallman, A.M.

    1995-12-31

    This document presents equal hazard response spectra for the W236A project for the 200 East and West new high-level waste tanks. The hazard level is based upon WHC-SD-W236A-TI-002, Probabilistic Seismic Hazard Analysis, DOE Hanford Site, Washington. Spectral acceleration amplification is plotted against frequency (Hz) for horizontal and vertical motion and attached to this report. The vertical amplification is based upon the preliminary draft revision of Standard ASCE 4-86. The vertical spectral acceleration is equal to the horizontal at frequencies above 3.3 Hz because of near-field sources (less than 15 km).

  9. Seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1993-01-01

    This document provides guidelines for the design and evaluation of underground high-level waste storage tanks due to seismic loads. Attempts were made to reflect the knowledge acquired in the last two decades in the areas of defining the ground motion and calculating hydrodynamic loads and dynamic soil pressures for underground tank structures. The application of the analysis approach is illustrated with an example. The guidelines are developed for specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable for other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document.

  10. A Seismic Isolation Application Using Rubber Bearings; Hangar Project in Turkey

    SciTech Connect

    Sesigur, Haluk; Cili, Feridun

    2008-07-08

    Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport, located in Istanbul, Turkey. A seismic isolation system with the isolation layer arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a base shear approximately one quarter that of the non-isolated structure.

  11. Experimentally Verified Numerical Model of Thixoforming Process

    SciTech Connect

    Bialobrzeski, Andrzej; Kotynia, Monika; Petera, Jerzy

    2007-04-07

    A new mathematical model of thixotropic casting based on the two-phase approach for the semi-solid metal alloys is presented. The corresponding numerical algorithm has been implemented in original computer software using the finite element method in the 3-D geometry and using the Lagrangian approach to flow description. The model has been verified by means of an original experiment of thixoforming in a model die specially designed for this purpose. Some particular cases of such casting and influence of operating parameters on the segregation phenomenon have been discussed.

  12. Experimentally Verified Numerical Model of Thixoforming Process

    NASA Astrophysics Data System (ADS)

    Białobrzeski, Andrzej; Kotynia, Monika; Petera, Jerzy

    2007-04-01

    A new mathematical model of thixotropic casting based on the two-phase approach for the semi-solid metal alloys is presented. The corresponding numerical algorithm has been implemented in original computer software using the finite element method in the 3-D geometry and using the Lagrangian approach to flow description. The model has been verified by means of an original experiment of thixoforming in a model die specially designed for this purpose. Some particular cases of such casting and influence of operating parameters on the segregation phenomenon have been discussed.

  13. Simulation and Processing Seismic Data in Complex Geological Models

    NASA Astrophysics Data System (ADS)

    Forestieri da Gama Rodrigues, S.; Moreira Lupinacci, W.; Martins de Assis, C. A.

    2014-12-01

    Seismic simulations in complex geological models are useful for verifying some limitations of seismic data. In this project, different geological models were designed to analyze difficulties encountered in the interpretation of seismic data. Another goal is to make these data available to LENEP/UENF students for testing new tools that assist in seismic data processing. The geological models were created considering characteristics found in oil exploration. We simulated geological media with volcanic intrusions, salt domes, faults, pinch-outs and layers more distant from the surface (Kanao, 2012). We used the software Tesseral Pro to simulate the seismic acquisitions. The simulated acquisition geometries were common-offset, end-on and split-spread (Figure 1). Data acquired with constant offset require fewer processing routines. The processing flow, using tools available in the Seismic Unix package (for more details, see Pennington et al., 2005), was geometric spreading correction, deconvolution, attenuation correction and post-stack depth migration. In processing the data acquired with end-on and split-spread geometries, we included velocity analysis and NMO correction routines. Although we analyzed synthetic data and carefully applied each processing routine, we observed some limitations of seismic reflection in imaging thin layers, layers at great depth, layers with low impedance contrast, and faults.

  14. A very high-resolution, deep-towed, multichannel seismic streamer, part I: technical design

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2003-04-01

    In order to allow very high-resolution seismic data collection, a new deep-towed multichannel seismic streamer was developed within the gas hydrate initiative of the "Geotechnologien" program. The essential factor in terms of lateral resolution is the size of the Fresnel zone. Using migration algorithms, resolution can be enhanced up to half a wavelength, but this is only valid in the inline direction and will not recognize side effects. Since the Fresnel zone is determined by the depth of source and receiver, as well as the velocity and frequency of the acoustic waves, lowering the source and receiver towards the sea floor increases the lateral resolution. In our case we concentrated on lowering the receiver array, resulting in a hybrid system architecture that still uses conventional surface-operated airguns. Assuming a working depth of 3000 m and a source signal of 200 Hz, the radius is reduced from 106 m for the surface configuration to 26 m for the hybrid case. The digital streamer comprises single hydrophone nodes coupled by cable sections of individual length. Owing to this modular architecture, the streamer layout can be adapted to the source and target requirements. Currently 26 hydrophones are available, sampled at 0.25 ms using a 24-bit A/D converter. Together with high-resolution data acquisition, good positioning is another requirement. Therefore three of the hydrophone modules are extended to engineering modules. These nodes include a depth sensor as well as a compass, enabling online display of the relative positioning of the streamer. Absolute coordinates of the deep-towed system are measured with an ultra-short baseline (USBL) system. Using a depth sensor within the deployed transponder, the position can be measured to within 1% of the slant range even at very large offsets to the surface vessel. 
A permanent online connection to the deployed system is provided by a telemetry system, which is capable of handling the connection for the additional side-scan sonar at the same time. Onboard the vessel, the data are distributed online to the quality-control and recording systems of the side-scan and seismic applications via Ethernet connections. In case of reduced bandwidth, only a portion of the data is transmitted, while all raw data are stored within the Linux-based PC system installed in the towed unit. Navigation and data processing are described in poster part II by Breitzke and Bialas.
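
The quoted Fresnel-zone radii can be reproduced with the common small-offset approximation r ≈ sqrt(λz/2), assuming a 1500 m/s water velocity. Note that the ~180 m receiver-to-target distance used for the hybrid case below is back-solved from the quoted 26 m radius; it is not stated in the abstract:

```python
import math

def fresnel_radius(velocity_ms, freq_hz, depth_m):
    """First-Fresnel-zone radius r ~= sqrt(lambda * z / 2), the common
    small-offset approximation (assumed here; the abstract only quotes
    the resulting radii)."""
    wavelength = velocity_ms / freq_hz
    return math.sqrt(wavelength * depth_m / 2.0)

# Surface towing: 200 Hz source, ~1500 m/s water velocity, 3000 m to target
r_surface = fresnel_radius(1500.0, 200.0, 3000.0)   # ~106 m, as in the text
# Deep towing: same source, receiver lowered to ~180 m above the target
r_hybrid = fresnel_radius(1500.0, 200.0, 180.0)     # ~26 m
```

The factor-of-four shrinkage of the Fresnel zone is what motivates towing the receiver array near the sea floor rather than at the surface.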

  15. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  16. Verify MesoNAM Performance

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors at specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) Bias (mean difference), 2) Standard deviation of Bias, 3) Root Mean Square Error (RMSE), and 4) Hypothesis test for Bias = 0. The 45 WS LWOs use the MesoNAM to support launch weather operations. However, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature and dew point to the observed values from the sensors on wind towers. The data were stratified by tower sensor, month and onshore/offshore wind direction based on the orientation of the coastline to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record for the data used in this task was based on the operational start of the current MesoNAM in mid-August 2006, so the task began with the first full month of data, September 2006, through May 2010. The analysis of model performance indicated: a) The accuracy decreased as the forecast valid time from the model initialization increased, b) There was a diurnal signal in T with a cool bias during the late night and a warm bias during the afternoon, c) There was a diurnal signal in Td with a low bias during the afternoon and a high bias during the late night, and d) The model parameters at each vertical level most closely matched the observed parameters at heights closest to those vertical levels. The AMU developed a GUI that consists of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWO to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics. 
The objective statistics give the LWOs knowledge of the model's strengths and weaknesses and the GUI allows quick access to the data which will result in improved forecasts for operations.
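
The verification statistics named above are straightforward to compute from paired forecast/observation series. A minimal sketch with invented sample values (the AMU's actual tower data are not reproduced here):

```python
import math

def verify_stats(forecast, observed):
    """Bias (mean forecast-minus-observed difference), standard deviation
    of the differences, and RMSE -- three of the point statistics named in
    the abstract. Sample data below are made up."""
    diffs = [f - o for f, o in zip(forecast, observed)]
    n = len(diffs)
    bias = sum(diffs) / n
    var = sum((d - bias) ** 2 for d in diffs) / (n - 1)  # sample variance
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, math.sqrt(var), rmse

forecast = [25.1, 24.8, 26.0, 23.9]   # e.g. forecast 2 m temperature, deg C
observed = [24.7, 25.0, 25.5, 24.1]
bias, sd, rmse = verify_stats(forecast, observed)
```

Stratifying these three numbers by tower, sensor height, month, and onshore/offshore flow, as the task did, turns them into a diagnostic of where and when the model can be trusted.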

  17. Verifying Deadlock-Freedom of Communication Fabrics

    NASA Astrophysics Data System (ADS)

    Gotmanov, Alexander; Chatterjee, Satrajit; Kishinevsky, Michael

    Avoiding message-dependent deadlocks in communication fabrics is critical for modern microarchitectures. If discovered late in the design cycle, deadlocks lead to missed project deadlines and suboptimal design decisions. One approach to avoid this problem is to gain a high level of confidence in an early microarchitectural model. However, formal proofs of liveness even on abstract models are hard due to the large number of queues and distributed control. In this work we address liveness verification of communication fabrics described in the form of high-level microarchitectural models which use a small set of well-defined primitives. We prove that under certain realistic restrictions, deadlock freedom can be reduced to unsatisfiability of a system of Boolean equations. Using this approach, we have automatically verified liveness of several non-trivial models (derived from industrial microarchitectures) where state-of-the-art model checkers failed and pen-and-paper proofs were either tedious or unknown.
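
The reduction can be illustrated at toy scale: treat "component i is permanently blocked" as a Boolean variable and ask whether the blocking equations admit a nonzero solution. The sketch below brute-forces the check by enumeration rather than SAT, and the two example systems are invented for illustration, not taken from the paper:

```python
from itertools import product

def has_deadlock(blocking_rules, n):
    """Brute-force check of a tiny 'blocking equation' system: variable i
    means 'component i is permanently blocked'; blocking_rules[i] maps a
    full assignment to whether the equations allow component i to be
    blocked. A nonzero consistent assignment witnesses a deadlock.
    (Toy enumeration standing in for the paper's SAT-based reduction.)"""
    for bits in product([False, True], repeat=n):
        if any(bits) and all(b == rule(bits)
                             for b, rule in zip(bits, blocking_rules)):
            return True
    return False

# Two units in a dependency cycle: each can be blocked only if the other is.
cyclic = [lambda b: b[1], lambda b: b[0]]
# Acyclic pipeline: unit 1 can be blocked only by unit 0; unit 0 never blocks.
acyclic = [lambda b: False, lambda b: b[0]]
```

The cyclic system admits the all-blocked assignment (deadlock), while the acyclic one admits only the all-free assignment (deadlock-free); at industrial scale the same question is handed to a SAT solver instead of enumerated.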

  18. Design and development of safety evaluation system of buildings on a seismic field based on the network platform

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhang, Lei; Chen, Xiangzhao; Zhang, Xinghua

    2015-03-01

    This paper describes a set of on-site earthquake safety evaluation systems for buildings, developed on a network platform. The system embeds quantitative research results developed in accordance with the provisions of Post-earthquake Field Works, Part 2: Safety Assessment of Buildings, GB18208.2-2001, and was further developed into an easy-to-use software platform. The system is aimed at allowing engineering professionals, civil engineering technicians or earthquake-affected victims on site to assess damaged buildings through a network after earthquakes. The authors studied the function structure, the process design of the safety evaluation module, and the hierarchical analysis algorithm module of the system in depth, and developed the general architecture design, development technology and database design of the system. Technologies such as hierarchical architecture design and Java EE were used in the system development, and MySQL5 was adopted for the database. The result is a complete evaluation process of information collection, safety evaluation, and output of damage and safety degrees, as well as query and statistical analysis of evaluated buildings. The system can play a positive role in sharing expert post-earthquake experience and promoting safety evaluation of buildings on a seismic field.

  19. Earthquake damage potential and critical scour depth of bridges exposed to flood and seismic hazards under lateral seismic loads

    NASA Astrophysics Data System (ADS)

    Song, Shin-Tai; Wang, Chun-Yao; Huang, Wen-Hsiu

    2015-12-01

    Many bridges located in seismic hazard regions suffer from serious foundation exposure caused by riverbed scour. Loss of surrounding soil significantly reduces the lateral strength of pile foundations. When the scour depth exceeds a critical level, the strength of the foundation is insufficient to withstand the imposed seismic demand, which induces the potential for unacceptable damage to the piles during an earthquake. This paper presents an analytical approach to assess the earthquake damage potential of bridges with foundation exposure and identify the critical scour depth that causes the seismic performance of a bridge to differ from the original design. The approach employs the well-accepted response spectrum analysis method to determine the maximum seismic response of a bridge. The damage potential of a bridge is assessed by comparing the imposed seismic demand with the strengths of the column and the foundation. The versatility of the analytical approach is illustrated with a numerical example and verified by the nonlinear finite element analysis. The analytical approach is also demonstrated to successfully determine the critical scour depth. Results highlight that relatively shallow scour depths can cause foundation damage during an earthquake, even for bridges designed to provide satisfactory seismic performance.

  20. Robust design of mass-uncertain rolling-pendulum TMDs for the seismic protection of buildings

    NASA Astrophysics Data System (ADS)

    Matta, Emiliano; De Stefano, Alessandro

    2009-01-01

    Commonly used for mitigating wind- and traffic-induced vibrations in flexible structures, passive tuned mass dampers (TMDs) are rarely applied to the seismic control of buildings, their effectiveness against impulsive loads being conditional upon the adoption of large mass ratios. Instead of resorting to cumbersome metal or concrete devices, this paper suggests meeting that condition by turning into TMDs the non-structural masses sometimes available atop buildings. An innovative roof-garden TMD, for instance, appears to be a promising tool capable of combining environmental and structural protection in one device. Unfortunately, because the amount of these masses is generally variable, the resulting mass-uncertain TMD (MUTMD) is prone to mistuning and loss of control. In an attempt to minimize such adverse effects, robust analysis and synthesis against mass variations are applied in this study to MUTMDs of the rolling-pendulum type, a configuration characterized by a mass-independent natural period. Through simulations under harmonic and recorded ground motions of increasing intensity, the performance of circular and cycloidal rolling-pendulum MUTMDs is evaluated on an SDOF structure in order to illustrate their respective advantages as well as the drawbacks inherent in their non-linear behavior. A possible implementation of a roof-garden TMD on a real building structure is described and its control efficacy numerically demonstrated, showing that in practical applications MUTMDs can be a good alternative to traditional TMDs.
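
The mass-independence of the rolling-pendulum period can be checked directly: for a solid ball rolling in a circular track, the small-amplitude period depends only on geometry and g, with the mass cancelling out. This is the standard textbook result, not the paper's specific device model:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def rolling_pendulum_period(track_radius_m, ball_radius_m):
    """Small-amplitude period of a solid ball rolling in a circular track:
    T = 2*pi*sqrt(7*(R - r)/(5*g)). No mass term appears, which is the
    mass-independence the MUTMD configuration relies on.
    (Textbook formula; the paper's exact geometry is not given.)"""
    return 2.0 * math.pi * math.sqrt(
        7.0 * (track_radius_m - ball_radius_m) / (5.0 * G))

# To tune such a TMD to a 2 s building period, solve for the effective
# radius R - r (illustrative target period, not from the paper):
target_T = 2.0
eff_radius = (target_T / (2.0 * math.pi)) ** 2 * 5.0 * G / 7.0  # ~0.71 m
```

Because the uncertain roof-garden mass never enters the formula, variations in mass detune a translational TMD but leave a rolling-pendulum MUTMD's period unchanged.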

  1. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  2. Seismic-acoustic communication for UGS

    NASA Astrophysics Data System (ADS)

    Cechak, Jaroslav

    2010-04-01

    The paper deals with Unattended Ground Sensors (UGS) and takes into consideration both present and future aspects of the practical deployment of this equipment under conditions of Electronic Warfare (EW), including the integration of UGS into a joint system using the Unmanned Aircraft System (UAS). The first part of the paper deals with the possibilities, characteristics and useable properties of seismic-acoustic communication in the group of nodes, supplementing the information coverage of existing UGS, including the selection of a suitable working frequency band for seismic communication. The second part of the paper then describes an alternative method of communication between nodes and UGS using LF radio communication, and analyses the design and real properties of a proposed communication channel in LF band, the design of a loop antenna and its mechanical construction. The interim conclusions of each section generalize the results of seismic-acoustic and radio LF communications as verified in practice, and describe both the advantages and disadvantages of communication channels defined in this way. The third part of the paper deals with the possibility of integrating the nodes-UGS to a central system consisting of a UAS device. It covers the design and an energy evaluation of a system operating on the principle of data selection from UGS. In addition, the paper includes illustrative photographs of the practical design and graphic results of real measurements.

  3. Theoretical and practical considerations for the design of the iMUSH active-source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, E.; Levander, A.; Harder, S. H.; Abers, G. A.; Creager, K. C.; Vidale, J. E.; Moran, S. C.; Malone, S. D.

    2013-12-01

    The multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment seeks to understand the details of the magmatic system that feeds Mount St. Helens using active- and passive-source seismic, magnetotelluric, and petrologic data. The active-source seismic component of this experiment will take place in the summer of 2014 utilizing all of the 2600 PASSCAL 'Texan' Reftek instruments which will record twenty-four 1000-2000 lb shots distributed around the Mount St. Helens region. The instruments will be deployed as two consecutive refraction profiles centered on the volcano, and a series of areal arrays. The actual number of areal arrays, as well as their locations, will depend strongly on the length of the experiment (3-4 weeks), the number of instrument deployers (50-60), and the time it will take per deployment given the available road network. The current work shows how we are balancing these practical considerations against theoretical experiment designs in order to achieve the proposed scientific goals with the available resources. One of the main goals of the active-source seismic experiment is to image the magmatic system down to the Moho (35-40 km). Calculating sensitivity kernels for multiple shot/receiver offsets shows that direct P waves should be sensitive to Moho depths at offsets of 150 km, and therefore this will likely be the length of the refraction profiles. Another primary objective of the experiment is to estimate the locations and volumes of different magma accumulation zones beneath the volcano using the areal arrays. With this in mind, the optimal locations of these arrays, as well as their associated shots, are estimated using an eigenvalue analysis of the approximate Hessian for each possible experiment design. 
This analysis seeks to minimize the number of small eigenvalues of the approximate Hessian that would amplify the propagation of data noise into regions of interest in the model space, such as the likely locations of magma reservoirs. In addition, this analysis provides insight into the tradeoff between the number of areal array deployments and the information that will be gained from the experiment. An additional factor incorporated into this study is the expected data quality in different regions around Mount St. Helens. Expected data quality is determined using the signal-to-noise ratios of data from existing seismometers in the region, and from forward modeling the wavefields from different experiment designs using SPECFEM3D software. In particular, we are interested in evaluating how topography near the volcano and low velocity volcaniclastic layers affect data quality. This information is especially important within 5 km of the volcano where only hiking trails are available for instrument deployment, and in a large area north of the volcano where road maintenance has lagged since the 1980 eruption. Instrument deployment will be slow in these regions, and therefore it is essential to understand if deployment of instruments here is a reasonable use of resources. A final step of this study will be validating different experiment designs based upon the above criteria by inverting synthetic data from velocity models that contain a generalized representation of the magma system to confirm that the main features of the models can be recovered.
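
The eigenvalue screening described above can be sketched with the Gauss-Newton approximation H = JᵀJ: designs whose approximate Hessian has many near-zero eigenvalues leave model directions unconstrained, so data noise is amplified in those directions. The sensitivity matrices below are random stand-ins, not iMUSH geometry, and the tolerance is arbitrary (assumes NumPy):

```python
import numpy as np

def count_poorly_constrained(J, rel_tol=1e-6):
    """Count near-zero eigenvalues of the approximate (Gauss-Newton)
    Hessian H = J^T J for a candidate survey design with sensitivity
    matrix J (rows = observations, columns = model cells). Small
    eigenvalues mark model directions the design barely constrains.
    (Sketch of the approach the abstract names; numbers are illustrative.)"""
    H = J.T @ J
    eig = np.linalg.eigvalsh(H)       # symmetric eigenvalue solver
    return int(np.sum(eig < rel_tol * eig.max()))

rng = np.random.default_rng(0)
# Design A: 20 observations constraining 5 parameters -> full rank
J_good = rng.standard_normal((20, 5))
# Design B: only 3 observations for 5 parameters -> >= 2 null directions
J_poor = rng.standard_normal((3, 5))
```

Comparing this count (or the full eigenvalue spectrum) across candidate shot/array layouts is one way to quantify the tradeoff between the number of areal-array deployments and the information gained.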

  4. Design and performance of a high temperature/high pressure, hydrogen tolerant, bend insensitive single-mode fiber for downhole seismic systems and applications

    NASA Astrophysics Data System (ADS)

    Gillooly, Andy; Hankey, Judith; Hill, Mark; Cooper, Laurence; Bergonzo, Aurélien

    2014-06-01

    The design of an optical fiber to give optimized sensing and lifetime performance for downhole fiber optic seismic sensors is presented. The SM1500SC(7/80)P is designed with an 80 μm cladding diameter, pure silica core, high numerical aperture, high cutoff wavelength and a polyimide coating to achieve outstanding performance when used in a coiled deployment state and operating in high-temperature and hydrogen-rich environments.

  5. Seismic analysis of Industrial Waste Landfill 4 at Y-12 Plant

    SciTech Connect

    1995-04-07

    This calculation was performed to seismically evaluate Landfill IV at Y-12 as required by Tennessee Rule 1200-1-7-04(2) for seismic impact zones. The calculation verifies that the landfill meets the seismic requirements of the Tennessee Division of Solid Waste ``Earthquake Evaluation Guidance Document.`` The theoretical displacements of 0.17 in. and 0.13 in. for the design basis earthquake are well below the limiting seismic slope stability design criteria. There is no potential for liquefaction, due to the absence of cohesionless soils, or for loss or reduction of shear strength of the clays at this site as a result of earthquake vibration. The vegetative cover on slopes will most likely be displaced and move during a large seismic event, but this is not considered a serious deficiency because the cover is not involved in the structural stability of the landfill and there would be no release of waste to the environment.

  6. Seismic and layout design for a tank-type fast reactor

    SciTech Connect

    Goodman, L.; Yamaki, Hideo; Davies, S.M.

    1984-06-01

    Hitachi Ltd. of Japan, with the assistance of the Bechtel Group, Inc. and the General Electric Company of the US, initiated a conceptual design study of a compact tank-type LMFBR. The Bechtel work concentrated on the layout of the nuclear island (NI) and its orientation with respect to the Control (CB) and Turbine (TGB) Buildings. This joint effort was carried out during 1982 and 1983 in four steps, each of which produced improvements in the design and reduced the plant size and cost. This paper describes the design evolution and the final result with respect to Bechtel's development of the NI layout.

  7. Design of a large remote seismic exploration data acquisition system, with the architecture of a distributed storage area network

    NASA Astrophysics Data System (ADS)

    Cao, Ping; Song, Ke-zhu; Yang, Jun-feng; Ruan, Fu-ming

    2011-03-01

    Nowadays, seismic exploration data acquisition (DAQ) systems have developed into remote forms covering large-scale areas. Several features of this kind of application must be considered. Firstly, there are many sensors, which are placed remotely. Secondly, the total data throughput is high. Thirdly, optical fibres are not suitable everywhere because of cost control, harsh running environments, etc. Fourthly, expandability and upgradability are a must. Designing this kind of remote DAQ (rDAQ) is a challenge: data transmission, clock synchronization, data storage, etc. must be considered carefully. A four-level hierarchical model of the rDAQ is proposed, in which the rDAQ is divided into four different function levels. From this model, a simple and clear architecture based on a distributed storage area network is derived. rDAQs with this architecture have the advantages of flexible configuration, expandability and stability. The architecture can be applied to designs ranging from simple single-cable systems to large-scale exploration DAQs.

  8. Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Topical report YMP/TR-003-NP

    SciTech Connect

    1996-10-01

    This topical report describes the methodology and criteria that the U.S. Department of Energy (DOE) proposes to use for preclosure seismic design of structures, systems, and components (SSCs) of the proposed geologic repository operations area that are important to safety. Title 10 of the Code of Federal Regulations, Part 60 (10 CFR 60), Disposal of High-Level Radioactive Wastes in Geologic Repositories, states that for a license to be issued for operation of a high-level waste repository, the U.S. Nuclear Regulatory Commission (NRC) must find that the facility will not constitute an unreasonable risk to the health and safety of the public. Section 60.131 (b)(1) requires that SSCs important to safety be designed so that natural phenomena and environmental conditions anticipated at the geologic repository operations area will not interfere with necessary safety functions. Among the natural phenomena specifically identified in the regulation as requiring safety consideration are the hazards of ground shaking and fault displacement due to earthquakes.

  9. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    SciTech Connect

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti

    2008-07-08

    The development of displacement-based approaches for earthquake engineering design appears to be very useful and capable of providing improved reliability by directly comparing computed response and expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, which has been developed in structural engineering over the past ten years in an attempt to mitigate some of the deficiencies of current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts to apply the procedure to geotechnical engineering and, in particular, earth retaining structures are discussed in [3], [4] and [5]. However, in this field the outcomes of the research need to be investigated further in many respects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper and compared to those obtained from conventional pseudo-static analyses.

  10. Seismic design or retrofit of buildings with metallic structural fuses by the damage-reduction spectrum

    NASA Astrophysics Data System (ADS)

    Li, Gang; Jiang, Yi; Zhang, Shuchuan; Zeng, Yan; Li, Qiang

    2015-03-01

    Recently, the structural fuse has become an important issue in the field of earthquake engineering. Due to the trilinearity of the pushover curve of buildings with metallic structural fuses, the mechanism of the structural fuse is investigated through the ductility equation of a single-degree-of-freedom system, and the corresponding damage-reduction spectrum is proposed to design and retrofit buildings. Furthermore, the controlling parameters, the stiffness ratio between the main frame and structural fuse and the ductility factor of the main frame, are parametrically studied, and it is shown that the structural fuse concept can be achieved by specific combinations of the controlling parameters based on the proposed damage-reduction spectrum. Finally, a design example and a retrofit example, variations of real engineering projects after the 2008 Wenchuan earthquake, are provided to demonstrate the effectiveness of the proposed design procedures using buckling restrained braces as the structural fuses.

  11. Optimization for performance-based design under seismic demands, including social costs

    NASA Astrophysics Data System (ADS)

    Möller, Oscar; Foschi, Ricardo O.; Ascheri, Juan P.; Rubinstein, Marcelo; Grossman, Sergio

    2015-06-01

    Performance-based design in earthquake engineering is a structural optimization problem whose objective is the determination of design parameters that minimize total costs while satisfying minimum reliability levels for the specified performance criteria. Total costs include those for construction and structural damage repairs, those associated with non-structural components, and the social costs of economic losses, injuries and fatalities. This paper presents a general framework to approach this problem, using a numerical optimization strategy and incorporating neural networks for the evaluation of dynamic responses and of the reliability levels achieved for a given set of design parameters. The strategy is applied to an example of a three-story office building. The results show the importance of considering the social costs, and the optimum failure probabilities obtained when minimum reliability constraints are not imposed.

  12. The LUSI Seismic Experiment: Deployment of a Seismic Network around LUSI, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Karyono, Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Haryanto, Iyan; Masturyono, Masturyono; Hadi, Soffian; Rohadi, Suprianto; Suardi, Iman; Rudiyanto, Ariska; Pranata, Bayu

    2015-04-01

    The spectacular Lusi eruption started in northeast Java, Indonesia, on 29 May 2006, following a M6.3 earthquake striking the island. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system, and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. Lusi is located a few kilometres to the NE of the Arjuno-Welirang volcanic complex, from which the Watukosek fault system originates; this fault system was reactivated by the M6.3 earthquake in 2006 and is still periodically reactivated by the frequent seismicity. To date Lusi is still active, erupting gas, water, mud and clasts. Gas and water data show that the Lusi plumbing system is connected with the neighbouring Arjuno-Welirang volcanic complex, which makes the Lusi eruption a "sedimentary-hosted geothermal system". To verify and characterise the occurrence of seismic activity and how it perturbs the connected Watukosek fault, the Arjuno-Welirang volcanic system and the ongoing Lusi eruption, we deployed 30 seismic stations (short-period and broadband) in this region of the East Java basin. The stations are most densely distributed around Lusi and the Watukosek fault zone that stretches between Lusi and the Arjuno-Welirang (AW) complex; fewer stations are positioned around the volcanic arc. Our study sheds light on the seismic activity along the Watukosek fault system and describes the waveforms associated with the geysering activity of Lusi. The initial network aims to locate small events that may not be captured by the seismic network of the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG), and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-Arjuno-Welirang region and monitoring of temporal variations of vp/vs ratios.
Such variations will then ideally be related to large-magnitude seismic events. This project is an unprecedented monitoring effort for a multi-component system including the active Lusi eruption, an unlocked strike-slip fault and a neighbouring volcanic arc, all affected by frequent seismicity. Our study will also provide a large dataset for qualitative analysis of earthquake triggering, earthquake-volcano and earthquake-earthquake interactions. The seismic experiment strengthens our knowledge about Lusi and represents a step further towards the reconstruction of a society devastated by the Lusi disaster.

  13. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operations, and closure of MSW landfills. These regulat...

  14. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills....

  15. Seismic Hazard Assessment: Issues and Alternatives

    NASA Astrophysics Data System (ADS)

    Wang, Zhenming

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important of the two in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space, and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is primarily a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics on which PSHA is based are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications.
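As a concrete illustration of what a probabilistic hazard estimate involves, the sketch below implements the standard PSHA rate sum over sources. It is not taken from this paper; the source parameters and the lognormal ground-motion model are assumptions for illustration:

```python
from math import erf, sqrt, log

# Annual rate of exceeding ground-motion level `a`: sum over seismic sources of
# the source's activity rate times the probability that its ground motion at the
# site exceeds `a`. A lognormal ground-motion distribution is assumed.
def lognormal_exceedance(a, median, sigma_ln):
    """P(A > a) for a lognormal A with the given median and log-standard deviation."""
    z = (log(a) - log(median)) / sigma_ln
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

def annual_exceedance_rate(a, sources):
    """sources: list of (annual_rate, median_pga_g, sigma_ln) tuples."""
    return sum(nu * lognormal_exceedance(a, med, sig) for nu, med, sig in sources)

# Two illustrative sources: a frequent moderate one and a rare strong one.
sources = [(0.1, 0.05, 0.6), (0.001, 0.3, 0.6)]
for a in (0.05, 0.1, 0.2, 0.4):
    print(f"PGA > {a:.2f} g: {annual_exceedance_rate(a, sources):.2e} /yr")
```

A DSHA estimate, by contrast, would report the ground motion from a single controlling scenario without this rate aggregation.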

  16. Seismic performance analysis and design suggestion for frame buildings with cast-in-place staircases

    NASA Astrophysics Data System (ADS)

    Feng, Yuan; Wu, Xiaobin; Xiong, Yaoqing; Li, Congchun; Yang, Wen

    2013-06-01

    Many staircases in reinforced concrete (RC) frame structures suffered severe damage during the Wenchuan earthquake. Elastic analyses of 18 RC structure models with and without staircases are conducted and compared to study the influence of the staircase on the stiffness, displacements and internal forces of the structures. To capture the yielding development and damage mechanism of frame structures, elasto-plastic analysis is carried out for one of the 18 models. Based on the features observed in the analyses, a new type of staircase design, i.e., isolating the staircase from the main structure to eliminate the K-type strut effect, is proposed and discussed. It is concluded that the proposed method of staircase isolation is effective and feasible for engineering design, and does not significantly increase the construction cost.

  17. A structural design and analysis of a piping system including seismic load

    SciTech Connect

    Hsieh, B.J.; Kot, C.A.

    1991-01-01

    The structural design/analysis of a piping system at a nuclear fuel facility is used to investigate some aspects of current design procedures. Specifically, the effect of using various stress measures, including ASME Boiler and Pressure Vessel (B&PV) Code formulas, is evaluated. It is found that large differences in local maximum stress values may be calculated depending on the stress criterion used. However, when the global stress maxima for the entire system are compared, the differences are much smaller, being nevertheless, for some load combinations, of the order of 50 percent. The effect of using an Equivalent Static Method (ESM) analysis is also evaluated by comparing its results with those obtained from a Response Spectrum Method (RSM) analysis with the modal responses combined by the absolute summation (ABS), the square root of the sum of the squares (SRSS), and the 10 percent (10PC) methods. It is shown that a spectrum amplification factor (equivalent static coefficient greater than unity) of at least 1.32 must be used in the current application of the ESM analysis in order to obtain results which are conservative in all respects relative to an RSM analysis based on ABS. However, it appears that an adequate design would be obtained from the ESM approach even without the use of a spectrum amplification factor. 7 refs., 3 figs., 3 tabs.
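The two modal combination rules compared above can be sketched in a few lines. The modal values below are hypothetical, not from the paper's piping model:

```python
import numpy as np

# Combining peak modal responses from a response-spectrum analysis by the
# absolute-summation (ABS) and square-root-of-sum-of-squares (SRSS) rules.
# ABS is the most conservative combination, which is why the ESM results are
# benchmarked against an RSM analysis based on ABS.
modal_peaks = np.array([12.0, -7.5, 3.2, -1.1])  # hypothetical peak modal stresses (MPa)

abs_combined = np.sum(np.abs(modal_peaks))        # ABS: upper bound on the peak
srss_combined = np.sqrt(np.sum(modal_peaks**2))   # SRSS: assumes well-separated modes

print(abs_combined)    # 23.8
print(srss_combined)   # ~14.55
```

SRSS is always at or below ABS, so an equivalent static coefficient calibrated against ABS is conservative with respect to SRSS as well.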

  18. Appraising the value of independent EIA follow-up verifiers

    SciTech Connect

    Wessels, Jan-Albert

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. 
Overall, the study provides insight on how to harness the learning opportunities arising from EIA follow-up through the appointment of independent verifiers. - Highlights: • A framework for appraising the role of independent verifiers is established. • The value added to EIA follow-up by independent verifiers in South Africa is documented. • Verifiers add most value when involved with screening, checking compliance, influencing decisions and community engagement. • Verifiers could be more creatively utilized in pre-construction preparation, giving feedback, and performance evaluation.

  19. Seismic analysis of diagrid structural frames with shear-link fuse devices

    NASA Astrophysics Data System (ADS)

    Moghaddasi B, Nasim S.; Zhang, Yunfeng

    2013-09-01

    This paper presents a new concept for enhancing the seismic ductility and damping capacity of diagrid structural frames by using shear-link fuse devices, and its seismic performance is assessed through nonlinear static and dynamic analysis. The architectural elegance of the diagrid structure, attributed to its triangular leaning-member configuration, and its high structural redundancy make this system a desirable choice for tall building design. However, forming a stable energy dissipation mechanism in diagrid framing remains to be investigated to expand its use in regions of high seismicity. To address this issue, a diagrid framing design is proposed here which provides a competitive design option in highly seismic regions through its increased ductility and the improved energy dissipation capacity provided by replaceable shear links interconnecting the diagonal members at their ends. The structural characteristics and seismic behavior (capacity, stiffness, energy dissipation, ductility) of the diagrid structural frame are demonstrated with a 21-story diagrid building frame subjected to nonlinear static and dynamic analysis. The findings from the nonlinear time history analysis verify that satisfactory seismic performance can be achieved by the proposed diagrid frame subjected to design basis earthquakes in California. In particular, one appealing feature of the proposed diagrid building is its reduced residual displacement after strong earthquakes.

  20. Research of CRP-based irregular 2D seismic acquisition

    NASA Astrophysics Data System (ADS)

    Zhao, Hu; Yin, Cheng; He, Guang-Ming; Chen, Ai-Ping; Jing, Long-Jiang

    2015-03-01

    Seismic exploration in the mountainous areas of western China is extremely difficult because of the complexity of the surface and subsurface, which results in shooting difficulties, seismic data with low signal-to-noise ratio, and strong interference. The complexity of the subsurface structure leads to strong scattering of the reflection points; thus, the curved-line acquisition method has been used. However, the actual subsurface structural characteristics have rarely been considered. We propose a design method for irregular acquisition based on common reflection points (CRP) that avoids difficult-to-shoot areas while considering the structural characteristics and CRP positions and optimizing the surface receiving-line position. We arrange the positions of the receiving points to ensure as little dispersion of the subsurface CRPs as possible, to improve the signal-to-noise ratio of the seismic data. We verify the applicability of the method using actual data from a site in the Sichuan Basin. The proposed method solves the problem of seismic data acquisition in such terrain and facilitates seismic exploration in structurally complex areas.

  1. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    ERIC Educational Resources Information Center

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  3. Azimuthal variation of radiation of seismic energy from cast blasts

    SciTech Connect

    Pearson, D.C.; Stump, B.W.; Martin, R.L.

    1996-12-31

    As part of a series of seismic experiments designed to improve understanding of the impact of mining blasts on verifying a Comprehensive Test Ban Treaty, a sixteen-station network of three-component seismic sensors was deployed around a large cast shot in the Black Thunder Mine. The seismic stations were placed, where possible, at a range of 2.5 kilometers with a constant inter-station spacing of 22.5 degrees. All of the data were recorded with the seismometers oriented such that the radial component pointed to the middle of the approximately 2 kilometer long shot. High quality data were recorded at each station. Data were scaled to a range of 2.5 kilometers and the sum of the absolute values of the vertical, radial, and transverse channels was computed. These observations were used to construct radiation patterns of the seismic energy propagating from the cast shot. It is obvious that cast shots do not radiate seismic energy isotropically: most of the vertical motion occurs behind the highwall, while the radial and transverse components of motion are enhanced in directions parallel to the highwall. These findings have implications for local (0.1 to 15 kilometer range) and possibly for regional (100 to 2,000 kilometer range) seismic observations of cast blasting. Locally, it could be argued that peak particle velocities could be scaled not only by range but also by azimuthal direction from the shot. This result implies that long-term planning of pit orientation relative to sensitive structures could mitigate problems with vibration levels from future blasting operations. Regionally, the local radiation pattern may be important in determining the magnitude of large-scale cast blasts. Improving the transparency of mining operations to international seismic monitoring systems may be possible with similar considerations.
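The amplitude measure described above (scaling to a common range, then summing absolute channel amplitudes) might be sketched as follows. The waveforms are synthetic and the linear spreading correction is an assumption standing in for whatever scaling the authors applied:

```python
import numpy as np

# For each station: scale the record to the common 2.5 km reference range, then
# sum the absolute amplitudes of the vertical (Z), radial (R) and transverse (T)
# channels. Plotting this value against station azimuth reveals an anisotropic
# radiation pattern.
def energy_proxy(z, r, t, range_km, ref_km=2.5):
    scale = range_km / ref_km  # assumed geometric-spreading correction
    return scale * (np.abs(z).sum() + np.abs(r).sum() + np.abs(t).sum())

azimuths = np.arange(0.0, 360.0, 22.5)  # sixteen stations, 22.5 degrees apart
rng = np.random.default_rng(1)
pattern = [energy_proxy(rng.standard_normal(100),
                        rng.standard_normal(100),
                        rng.standard_normal(100),
                        range_km=2.5) for _ in azimuths]
print(len(pattern))  # one summary amplitude per azimuth
```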

  4. Seismic design technology for breeder reactor structures. Volume 2. Special topics in soil/structure interaction analyses

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: definition of seismic input ground motion, review of state-of-the-art procedures, analysis guidelines, rock/structure interaction analysis example, comparison of two- and three-dimensional analyses, and comparison of analyses using FLUSH and TRI/SAC Codes. (DLC)

  5. Seismic Survey

    USGS Multimedia Gallery

    USGS hydrologists conduct a seismic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method (not pictured here) for non-invasive assessment of earthen leve...

  6. Static behaviour of induced seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.

    2015-12-01

    The standard paradigm to describe seismicity induced by fluid injection is to apply nonlinear diffusion dynamics in a poroelastic medium. I show that the spatiotemporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the behaviour of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope.

  7. Neural networks in seismic discrimination

    SciTech Connect

    Dowla, F.U.

    1995-01-01

    Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate depths of seismic events using regional seismograms. In this paper different types of network architecture and representation techniques are discussed. We address the important problem of designing neural networks with good generalization capabilities. Examples of neural networks for treaty verification applications are also described.

  8. Impact of lateral force-resisting system and design/construction practices on seismic performance and cost of tall buildings in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Galal, Khaled; Yehia, Sherif

    2013-09-01

    The local design and construction practices in the United Arab Emirates (UAE), together with Dubai's unique rate of development, warrant special attention to the selection of Lateral Force-Resisting Systems (LFRS). This research proposes four different feasible solutions for the selection of the LFRS for tall buildings and quantifies the impact of these selections on seismic performance and cost. The systems considered are: Steel Special Moment-Resisting Frame (SMRF), Concrete SMRF, Steel Dual System (SMRF with Special Steel Plate Shear Wall, SPSW), and Concrete Dual System (SMRF with Special Concrete Shear Wall, SCSW). The LFRS selection is driven by the seismic setting as well as the design and construction practices adopted in Dubai. It is found that the concrete design alternatives are consistently less expensive than their steel counterparts. The steel dual system is expected to have the least damage, based on its relatively smaller interstory drifts; however, this preferred performance comes at a higher initial construction cost. Conversely, the steel SMRF system is expected to have the most damage and associated repair cost due to its excessive flexibility. The two concrete alternatives are expected to have relatively moderate damage and repair costs, in addition to their lower initial construction cost.

  9. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  10. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  11. Firms Verify Online IDs Via Schools

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young

  12. Seismic, shock, and vibration isolation - 1988

    SciTech Connect

    Chung, H. ); Mostaghel, N. )

    1988-01-01

    This book contains papers presented at a conference on pressure vessels and piping. Topics covered include: Design of R-FBI bearings for seismic isolation; Benefits of vertical and horizontal seismic isolation for LMR nuclear reactor units; and Some remarks on the use and perspectives of seismic isolation for fast reactors.

  13. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness: a mathematical proof. Hardware verification research has focused on device verification and has largely ignored verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

  14. Seismic isolation of nuclear power plants using sliding isolation bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

    Nuclear power plants (NPP) are designed for earthquake shaking with very long return periods. Seismic isolation is a viable strategy to protect NPPs from extreme earthquake shaking because it filters a significant fraction of earthquake input energy. This study addresses the seismic isolation of NPPs using sliding bearings, with a focus on the single concave Friction Pendulum(TM) (FP) bearing. Friction at the sliding surface of an FP bearing changes continuously during an earthquake as a function of sliding velocity, axial pressure and temperature at the sliding surface. The temperature at the sliding surface, in turn, is a function of the histories of coefficient of friction, sliding velocity and axial pressure, and the travel path of the slider. A simple model to describe the complex interdependence of the coefficient of friction, axial pressure, sliding velocity and temperature at the sliding surface is proposed, and then verified and validated. Seismic hazard for a seismically isolated nuclear power plant is defined in the United States using a uniform hazard response spectrum (UHRS) at mean annual frequencies of exceedance (MAFE) of 10-4 and 10 -5. A key design parameter is the clearance to the hard stop (CHS), which is influenced substantially by the definition of the seismic hazard. Four alternate representations of seismic hazard are studied, which incorporate different variabilities and uncertainties. Response-history analyses performed on single FP-bearing isolation systems using ground motions consistent with the four representations at the two shaking levels indicate that the CHS is influenced primarily by whether the observed difference between the two horizontal components of ground motions in a given set is accounted for. The UHRS at the MAFE of 10-4 is increased by a design factor (? 1) for conventional (fixed base) nuclear structure to achieve a target annual frequency of unacceptable performance. 
Risk oriented calculations are performed for eight sites across the United States to show that the factor is equal to 1.0 for seismically isolated NPPs, if the risk is dominated by horizontal earthquake shaking. Response-history analyses using different models of seismically isolated NPPs are performed to understand the importance of the choice of friction model, model complexity and vertical ground motion for calculating horizontal displacement response across a wide range of sites and shaking intensities. A friction model for the single concave FP bearing should address heating. The pressure- and velocity-dependencies were not important for the models and sites studied. Isolation-system displacements can be computed using a macro model comprising a single FP bearing.
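The interdependence described above, friction varying with sliding velocity, axial pressure and frictional heating, can be sketched as a simple rate- and temperature-dependent friction law. This is a minimal illustration with assumed functional forms and constants (`mu_slow`, `mu_fast`, the decay rates), not the verified model from the study:

```python
import math

def friction_coefficient(velocity, pressure, temperature,
                         mu_slow=0.04, mu_fast=0.08,
                         rate_a=25.0, p_ref=50e6, k_p=0.1, k_t=0.004):
    """Illustrative coefficient of friction for a sliding (FP-type) bearing.

    Combines three commonly cited dependencies (all functional forms and
    constants here are assumptions for illustration):
      - velocity: mu rises from mu_slow to mu_fast with sliding speed (m/s)
      - pressure: mu decreases mildly with axial pressure (Pa)
      - temperature: frictional heating at the surface reduces mu (deg C)
    """
    mu_v = mu_fast - (mu_fast - mu_slow) * math.exp(-rate_a * abs(velocity))
    pressure_factor = 1.0 - k_p * math.log1p(pressure / p_ref)
    temperature_factor = math.exp(-k_t * temperature)
    return mu_v * pressure_factor * temperature_factor
```

In a response-history analysis, a law like this would be re-evaluated at every time step, with the temperature itself updated from the accumulated frictional work along the slider's travel path.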

  15. Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network

    NASA Astrophysics Data System (ADS)

    Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

    2004-12-01

    Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry-based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology-based earth sciences curriculum development process. We are developing a pedagogical framework, graduate-level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from hundreds of seismograph stations in Nevada, California and around the world (including live data from Earthworm systems and the IRIS DMC BUD - Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. 
The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition, remote multi-channel data access, local time-series data management, interactive multi-window waveform display and time-series analysis with centralized meta-data control. Formally integrating educational seismology into the K-12 science curriculum with an overall positive impact on science education practices necessarily requires a collaborative effort between professional educators and seismologists, yet one driven exclusively by teacher needs.

  16. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. 
The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly evaluated and identified. This document supersedes the seismic classifications, assignments, and computations in "Seismic Analysis for Preclosure Safety" (BSC 2004a).

  17. The Non-Proliferation Experiment recorded at the Pinedale Seismic Research Facility

    SciTech Connect

    Carr, D.B.

    1994-06-01

    The Non-Proliferation Experiment was recorded by five different seismic stations operated by Sandia National Laboratories at the Pinedale Seismic Research Facility, approximately 7.6° from the Nevada Test Site. Two stations are different versions of the Deployable Seismic Verification System developed by the Department of Energy to provide seismic data to verify compliance with a Comprehensive Test Ban Treaty. Vault and borehole versions of the Designated Seismic Stations also recorded the event. The final station is test instrumentation located at depths of 10, 40 and 1200 feet. Although the event is seen clearly at all the stations, there are variations in the raw data due to the different bandwidths and depths of deployment. One Deployable Seismic Verification System has been operating at Pinedale for over three years and in that time recorded 14 nuclear explosions and 4 earthquakes from the Nevada Test Site, along with numerous other western U.S. earthquakes. Several discriminants based on the work by Taylor et al. (1989) have been applied to this data. First the discriminants were tested by comparing the explosions only to the 4 earthquakes located on the Test Site. Only one discriminant, log(Lg/Pg), did not show clear separation between the earthquakes and nuclear explosions. When other western U.S. events are included, only the mb vs. Ms discriminant separated the events. In all cases where discrimination was possible, the Non-Proliferation Experiment was indistinguishable from a nuclear explosion.
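Once phase amplitudes and magnitudes have been measured from the seismogram, the discriminants mentioned above reduce to simple comparisons. A schematic sketch, with an illustrative offset rather than the thresholds from Taylor et al.:

```python
import math

def log_lg_pg(lg_amplitude, pg_amplitude):
    # log10 ratio of Lg to Pg phase amplitudes, as measured from the record.
    return math.log10(lg_amplitude / pg_amplitude)

def mb_ms_screen(mb, ms, offset=1.2):
    # mb:Ms screen: for a given body-wave magnitude mb, explosions excite
    # weaker surface waves (smaller Ms) than earthquakes do.
    # The offset of 1.2 is illustrative, not a calibrated value.
    return "earthquake-like" if ms >= mb - offset else "explosion-like"
```

A discriminant "separates" event populations when earthquakes and explosions fall on opposite sides of a line like the one in `mb_ms_screen`; the abstract reports that only this mb vs. Ms measure did so once other western U.S. events were included.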

  18. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Component Seismic Safety Coordinator shall ensure that an individual familiar with seismic design provisions... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Seismic Safety Program... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety...

  19. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Component Seismic Safety Coordinator shall ensure that an individual familiar with seismic design provisions... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Seismic Safety Program... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety...

  20. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Component Seismic Safety Coordinator shall ensure that an individual familiar with seismic design provisions... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Seismic Safety Program... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety...

  1. Annual Hanford seismic report -- fiscal year 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1996-12-01

    Seismic monitoring (SM) at the Hanford Site was established in 1969 by the US Geological Survey (USGS) under a contract with the US Atomic Energy Commission. Since 1980, the program has been managed by several contractors under the US Department of Energy (USDOE). Effective October 1, 1996, the Seismic Monitoring workscope, personnel, and associated contracts were transferred to the USDOE Pacific Northwest National Laboratory (PNNL). SM is tasked to provide uninterrupted collection and archiving of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) located on and encircling the Hanford Site. SM is also tasked to locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data compiled are used by SM, Waste Management, and engineering activities at the Hanford Site to evaluate seismic hazards and seismic design for the Site.

  2. Verifying speculative multithreading in an application

    SciTech Connect

    Felton, Mitchell D

    2014-11-18

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
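The serial-versus-speculative comparison in this claim can be sketched in miniature. Here the "speculative" path is just thread-based execution of the same test instructions — a simplified stand-in, since the patent targets hardware speculative multithreading — with order-insensitive tasks so that a correct run must reproduce the serial result:

```python
import threading

def run_serial(tasks, value):
    # Execute the test instructions one after another, satisfying all data
    # dependencies by construction (strict program order).
    for task in tasks:
        value = task(value)
    return value

def run_threaded(tasks, value):
    # Stand-in for speculative execution: each task runs in its own thread,
    # with a lock serializing updates to the shared value.
    lock = threading.Lock()
    state = {"value": value}
    def worker(task):
        with lock:
            state["value"] = task(state["value"])
    threads = [threading.Thread(target=worker, args=(t,)) for t in tasks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return state["value"]

def speculative_error_exists(tasks, value):
    # The comparison step from the claim: a mismatch between the serial and
    # speculative results indicates a speculative multithreading error.
    return run_serial(tasks, value) != run_threaded(tasks, value)
```

With commutative tasks (e.g. a list of increments) the two results must agree, so `speculative_error_exists` returning True would flag a genuine threading defect rather than a legitimate reordering.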

  3. Verifying speculative multithreading in an application

    SciTech Connect

    Felton, Mitchell D

    2014-12-09

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.

  4. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    SciTech Connect

    I. Wong

    2004-11-05

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M&O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes.

  5. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant... include the identification and date of the model code or standard that is used in the seismic design of... design. The statement shall identify the model code or standard identified that is used in the...

  6. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant... include the identification and date of the model code or standard that is used in the seismic design of... design. The statement shall identify the model code or standard identified that is used in the...

  7. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant... include the identification and date of the model code or standard that is used in the seismic design of... design. The statement shall identify the model code or standard identified that is used in the...

  8. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant... include the identification and date of the model code or standard that is used in the seismic design of... design. The statement shall identify the model code or standard identified that is used in the...

  9. Verifying approximate solutions to differential equations

    NASA Astrophysics Data System (ADS)

    Enright, W. H.

    2006-01-01

    It is now standard practice in computational science for large-scale simulations to be implemented and investigated in a problem solving environment (PSE) such as MATLAB or MAPLE. In such an environment, a scientist or engineer will formulate a mathematical model, approximate its solution using an appropriate numerical method, visualize the approximate solution and verify (or validate) the quality of the approximate solution. Traditionally, we have been most concerned with the development of effective numerical software for generating the approximate solution and several efficient and reliable numerical libraries are now available for use within the most widely used PSEs. On the other hand, the visualization and verification tasks have received little attention, even though each often requires as much computational effort as is involved in generating the approximate solution. In this paper, we will investigate the effectiveness of a suite of tools that we have recently introduced in the MATLAB PSE to verify approximate solutions of ordinary differential equations. We will use the notion of 'effectivity index', widely used by researchers in the adaptive mesh PDE community, to quantify the credibility of our verification tools. Numerical examples will be presented to illustrate the effectiveness of these tools when applied to a standard numerical method on two model test problems.
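The effectivity index — the ratio of an estimated error to the true error, with values near 1 indicating a credible estimate — can be illustrated on a model problem. A sketch using forward Euler with step-halving error estimation (the paper's actual tools and test problems differ):

```python
import math

def euler(f, y0, t0, t1, n):
    # Fixed-step forward Euler integration of y' = f(t, y) from t0 to t1.
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# Model problem: y' = -y, y(0) = 1, with exact solution y(1) = exp(-1).
f = lambda t, y: -y
exact = math.exp(-1.0)

y_n = euler(f, 1.0, 0.0, 1.0, 100)    # step size h
y_2n = euler(f, 1.0, 0.0, 1.0, 200)   # step size h/2

# Forward Euler is first order: error(h) ~ C*h, so (y_2n - y_n) estimates
# roughly half the error of y_n; doubling it gives an error estimate.
estimated_error = 2.0 * (y_2n - y_n)
true_error = exact - y_n
effectivity = estimated_error / true_error  # near 1 => credible estimate
```

On this problem the index comes out close to 1, which is exactly the behavior a verification tool is meant to certify.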

  10. Verifying disarmament: scientific, technological and political challenges

    SciTech Connect

    Pilat, Joseph R

    2011-01-25

    There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would pose unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shut-down nuclear weapon programs and existing nuclear energy programs, and material and weapon production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  11. Seismic Data Gathering and Validation

    SciTech Connect

    Coleman, Justin

    2015-02-01

    Three recent earthquakes in the last seven years have exceeded their design basis earthquake values (so it is implied that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011, detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007, [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using that data for two purposes. One purpose is to determine what margins exist in current industry standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site response tools and SSI tools. The gathered data represent free-field soil and in-structure acceleration time histories. Gathered data also include elastic and dynamic soil properties and structural drawings. Gathering data and comparing them with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), then decisions on needed protective measures, such as seismic isolation, can be made.

  12. Positively Verifying Mating of Previously Unverifiable Flight Connectors

    NASA Technical Reports Server (NTRS)

    Pandipati R. K. Chetty

    2011-01-01

    Current practice is to uniquely key connectors that, when mated, cannot be verified by ground tests, such as those used in explosive or non-explosive initiators and pyro valves. However, this practice does not assure 100-percent correct mating, and errors in mating of interchangeable connectors can result in a degraded or failed space mission. This problem could be overcome by the following approach. Mating of all flight connectors considered not verifiable via ground tests can be verified electrically. The approach requires two additional wires going through the connector of interest, a few resistors, and a voltage source. The test-point voltage Vtp when the connector is not mated will be the same as the input voltage, which gets attenuated by the resistor R1 when the female (F) and male (M) connectors are mated correctly and properly. The voltage at the test point will be a function of R1 and R2. Monitoring of the test point could be done on ground support equipment (GSE) only, or it can be a telemetry point. For implementation on multiple connector pairs, a different value for R1 or R2 or both can be selected for each pair of connectors that would result in a unique test-point voltage for each connector pair. Each test-point voltage is unique, and the correct test-point voltage is read only when the correct pair is mated correctly together. Thus, this design approach can be used to positively verify the correct mating of the connector pairs. This design approach can be applied to any number of connectors on the flight vehicle.

  13. Verifying Timestamps of Occultation Observation Systems

    NASA Astrophysics Data System (ADS)

    Barry, M. A. Tony; Gault, Dave; Bolt, Greg; McEwan, Alistair; Filipović, Miroslav D.; White, Graeme L.

    2015-04-01

    We describe an image timestamp verification system to determine the exposure timing characteristics and continuity of images made by an imaging camera and recorder, with reference to Coordinated Universal Time. The original use was to verify the timestamps of stellar occultation recording systems, but the system is applicable to lunar flashes, planetary transits, sprite recording, or any area where reliable timestamps are required. The system offers good temporal resolution (down to 2 ms, referred to Coordinated Universal Time) and provides exposure duration and interframe dead time information. The system uses inexpensive, off-the-shelf components, requires minimal assembly, and requires no high-voltage components or connections. We also describe an application to load fits (and other format) image files, which can decode the verification image timestamp. Source code, wiring diagrams, and built applications are provided to aid the construction and use of the device.
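Checking continuity of a recording comes down to verifying that consecutive frame timestamps are spaced one nominal frame period apart. A minimal sketch of that check (function names and the tolerance are illustrative, not part of the described system):

```python
def frame_intervals(timestamps):
    # Differences between consecutive frame timestamps, in seconds.
    return [t1 - t0 for t0, t1 in zip(timestamps, timestamps[1:])]

def check_timestamps(timestamps, frame_period, tol=0.002):
    """Return indices of intervals that deviate from the nominal frame
    period by more than tol seconds (dropped frames, timing glitches)."""
    gaps = []
    for i, dt in enumerate(frame_intervals(timestamps)):
        if abs(dt - frame_period) > tol:
            gaps.append(i)
    return gaps
```

An empty result means the recording is continuous at the stated cadence; flagged indices locate interruptions, which matters when a timestamp anchors an occultation event time against Coordinated Universal Time.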

  14. Verifiable process monitoring through enhanced data authentication.

    SciTech Connect

    Goncalves, Joao G. M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-09-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
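The capture-authenticate-timestamp flow can be sketched as follows. This uses an HMAC with a shared key as a deliberate simplification — the system described uses public-key authentication plus encryption for confidentiality — and all names and the record layout are illustrative:

```python
import hashlib
import hmac
import json

# Stand-in shared key; the described system uses public-key signatures.
AUTH_KEY = b"inspectorate-shared-secret"

def package_record(payload: bytes, source: str, timestamp: float) -> dict:
    """Wrap a captured instrument message with its source, a timestamp,
    and a MAC over the whole record, as close to the sensor as possible."""
    record = {
        "source": source,
        "timestamp": timestamp,
        "payload": payload.hex(),
    }
    body = json.dumps(record, sort_keys=True).encode()
    record["mac"] = hmac.new(AUTH_KEY, body, hashlib.sha256).hexdigest()
    return record

def verify_record(record: dict) -> bool:
    """Recompute the MAC from the record fields; a mismatch means the
    branched data stream was altered after capture."""
    body = json.dumps(
        {k: record[k] for k in ("source", "timestamp", "payload")},
        sort_keys=True).encode()
    expected = hmac.new(AUTH_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["mac"])
```

The point of the branched stream is exactly this property: the inspectorate can confirm that the operator data it received matches what the instrument produced, without the monitoring layer altering or disrupting the operator's process.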

  15. Seismic Tomography.

    ERIC Educational Resources Information Center

    Anderson, Don L.; Dziewonski, Adam M.

    1984-01-01

    Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

  16. Seismic Symphonies

    NASA Astrophysics Data System (ADS)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

    The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts at sound transposition of seismic records. In this case seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system was devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper. Adapting the documents to the instrument's score, holes have been drilled on the waves' peaks. The organ covers about three tonal scales; starting from heavy and deep sounds it reaches up to high and jarring notes. The translation of the seismic records is based on a criterion that matches higher sounds to larger amplitudes and lower sounds to smaller ones. Translating the seismogram into the organ score, the larger the amplitude of the recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ and the more the notes arouse an intense emotional response in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters of over a century of seismic history of the Earth can be revived. A bridge between art and science. 
Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, and its sounds evoke the heavens and symbolize cosmic harmony. But here it is the earth, "nature", the ground beneath our feet that is moving. It speaks to us not of harmony, but of our fragility. For the oldest earthquakes considered, Seismic Symphonies drew on the SISMOS archives, the INGV project for the recovery, high-resolution digital reproduction and distribution of seismograms of earthquakes in the Euro-Mediterranean area from 1895 to 1984. After its first exhibition at the Fondazione Bevilacqua La Masa in Venice, the organ was later exhibited in Taiwan at the Taipei Biennial, with seismograms provided by the Taiwanese Central Weather Bureau, and at the EACC Castello in Spain, with seismograms of Spanish earthquakes provided by the Instituto Geográfico Nacional.

  17. Ringing load models verified against experiments

    SciTech Connect

    Krokstad, J.R.; Stansberg, C.T.

    1995-12-31

    What is believed to be the main reason for discrepancies between measured and simulated loads in previous studies has been assessed. The focus has been on the balance between second- and third-order load components in relation to what is called the "fat body" load correction. It is important to understand that the use of Morison strip theory in combination with second-order wave theory gives rise to second- as well as third-order components in the horizontal force. A proper balance between second- and third-order components in the horizontal force is regarded as the most central requirement for a sufficiently accurate ringing load model in irregular seas. It is also verified that simulated second-order components are largely overpredicted in both regular and irregular seas. Non-slender diffraction effects are important to incorporate in the FNV formulation in order to reduce the simulated second-order component and to match experiments more closely. A sufficiently accurate ringing simulation model using simplified methods is shown to be within close reach. Some further development and experimental verification must, however, be performed in order to take non-slender effects into account.

  18. Digital seismic inverse methods

    SciTech Connect

    Robinson, E.A.

    1984-01-01

    This mathematically based text presents the basic geophysical models used today. It is designed as a text or reference for courses in geophysics, and also as a professional reference. Its contents include: the seismic method as a communication system; the design of high-resolution digital filters; principles of digital Wiener filtering; sampling geophysical data; filter theory and wave propagation; random processes; spectral estimation; predictive decomposition of seismic traces; multichannel z-transforms and minimum delay; the spectral function of a layered system and the determination of waveforms at depth; deconvolution; spectral approach to geophysical inversion by Lorentz, Fourier, and Radon transforms; dynamic predictive deconvolution; the normal incidence synthetic seismogram; maximum entropy and the relationship of the partial autocorrelation to the reflection coefficients of a layered system; maximum entropy spectral decomposition of a seismogram into its minimum entropy component plus noise; optimum stacking techniques; optimum digital filters for signal-to-noise ratio enhancement; bibliography; and index.

  19. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section B, Renovation calculations/supporting data

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. This report is organized into seven parts. This document, Part V, Section B - Structural/Seismic Information, provides a description of the seismic and structural analyses performed on the NMSF and their results.

  20. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

    This work illustrates the advantages of using a Geographic Information System (GIS) in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. Because the input data present different formats, cover distinct geographical areas, and are subject to different interpretations, inconsistencies may appear and data management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with the ISO 19100 series of standards and allows the designer to define the model architecture and interoperability. The GIS provides a framework for combining large volumes of geographic data with a uniform geographic reference and without duplication. All this information carries its own metadata following the ISO 19115 standard. In this work, integrating active fault and subduction slab geometries with epicentre locations in the same environment has facilitated the definition of seismogenic regions, substantially easing teamwork among national specialists from different countries. The GIS capacity for queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete results of seismic hazard calculations into continuous maps, and to check and validate partial results of the study. GIS-based products, such as complete, homogenized databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution and results dissemination.

  1. iMUSH: The design of the Mount St. Helens high-resolution active source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, Eric; Levander, Alan; Harder, Steve; Abers, Geoff; Creager, Ken; Vidale, John; Moran, Seth; Malone, Steve

    2013-04-01

    Mount St. Helens is one of the most societally relevant and geologically interesting volcanoes in the United States. Although much has been learned about the shallow structure of this volcano since its eruption in 1980, important questions still remain regarding its magmatic system and connectivity to the rest of the Cascadia arc. For example, the structure of the magma plumbing system below the shallowest magma chamber under the volcano is still only poorly known. This information will be useful for hazard assessment for the southwest Washington area, and also for gaining insight into fundamental scientific questions such as the assimilation and differentiation processes that lead to the formation of continental crust. As part of the multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment, funded by NSF GeoPRISMS and EarthScope, an active source seismic experiment will be conducted in late summer 2014. The experiment will utilize all of the 2600 IRIS/PASSCAL/USArray Texan instruments. The instruments will be deployed as two 1000-instrument consecutive refraction profiles (one N/S and one WNW/ESE). Each of these profiles will be accompanied by two 1600-instrument areal arrays at varying distances from Mount St. Helens. Finally, one 2600-instrument areal array will be centered on Mount St. Helens. These instruments will record a total of twenty-four 500-1000 kg shots. Each refraction profile will have an average station spacing of 150 m, and a total length of 150 km. The stations in the areal arrays will be separated by ~1 km. A critical step in the success of this project is to develop an experimental setup that can resolve the most interesting aspects of the magmatic system. In particular, we want to determine the distribution of shot locations that will provide good coverage throughout the entire model space, while still allowing us to focus on regions likely to contain the magmatic plumbing system. 
In this study, we approach this problem by calculating Fréchet kernels with dynamic ray tracing. An initial observation from these kernels is that waves traveling across the largest offsets of the experiment (~150 km) have sensitivity below depths of 30 km. This means that we may be able to image the magmatic system down to the Moho, estimated at ~40 km. Additional work is focusing on searching for the shot locations that provide high resolution around very shallow features beneath Mount St. Helens, such as the first magmatic reservoir at about 3 km depth, and the associated Mount St. Helens seismic zone. One way in which we are guiding this search is to find the shot locations that maximize sensitivity values within the regions of interest after summing Fréchet kernels from each shot/station pair.
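    The kernel-summing shot search described above can be sketched with a toy model. Everything below is hypothetical and purely illustrative: the Gaussian "kernel" proxy (peaking near each shot/station midpoint at a turning depth of one quarter of the offset, a rough diving-wave rule of thumb), the station spacing, the candidate shot points, and the target region are invented for the example and are not the actual iMUSH design:

```python
import math

def kernel_weight(shot_x, sta_x, px, pz, width=5.0):
    """Toy Frechet-kernel proxy: sensitivity peaks near the shot/station
    midpoint, at a turning depth of ~offset/4 (all distances in km)."""
    offset = abs(shot_x - sta_x)
    mid_x, turn_z = (shot_x + sta_x) / 2.0, offset / 4.0
    return math.exp(-((px - mid_x) ** 2 + (pz - turn_z) ** 2) / width ** 2)

def region_sensitivity(shot_x, stations, region):
    """Sum the kernel proxy over all stations and all target-region points."""
    return sum(kernel_weight(shot_x, s, px, pz)
               for s in stations for (px, pz) in region)

stations = [float(x) for x in range(0, 151, 5)]   # 150 km profile, 5 km spacing
# hypothetical plumbing-system target: 60-90 km along profile, 5-15 km deep
region = [(float(px), float(pz)) for px in range(60, 91, 5) for pz in range(5, 16, 5)]
candidates = [0.0, 40.0, 75.0, 120.0, 150.0]      # candidate shot points (km)
best = max(candidates, key=lambda x: region_sensitivity(x, stations, region))
print(best)
```

    Ranking candidate shots by summed sensitivity in the region of interest mimics, in miniature, the kernel-stacking criterion the abstract describes; a shot over the target naturally outranks shots at the profile ends.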

  2. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    NASA Technical Reports Server (NTRS)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The ability to write verifiable programs in HAL/S, a characteristic highly desirable in aerospace applications, is lacking because many features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language fails with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  3. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    SciTech Connect

    Martin, R.L.; Gross, D.; Pearson, D.C.; Stump, B.W.; Anderson, D.P.

    1996-12-31

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international Comprehensive Test Ban Treaty (CTBT), which bans nuclear explosion tests, a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments whose results benefit both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in the radiation of seismic energy from overburden casting shots; (4) identification of as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship of local and regional seismic amplitude to explosive yield for overburden cast, coal bulking, and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  4. On the distribution of seismic reflection coefficients and seismic amplitudes

    SciTech Connect

    Painter, S.; Paterson, L.; Beresford, G.

    1995-07-01

    Reflection coefficient sequences from 14 wells in Australia have a statistical character consistent with a non-Gaussian scaling noise model based on the Levy-stable family of probability distributions. Experimental histograms of reflection coefficients are accurately approximated by symmetric Levy-stable probability density functions with Levy index between 0.99 and 1.43. These distributions have the same canonical role in mathematical statistics as the Gaussian distribution, but they have slowly decaying tails and infinite moments. The distribution of reflection coefficients is independent of the spatial scale (statistically self-similar), and the reflection coefficient sequences have long-range dependence. These results suggest that the logarithm of seismic impedance can be modeled accurately using fractional Levy motion, which is a generalization of fractional Brownian motion. Synthetic seismograms produced from the authors' model for the reflection coefficients also have Levy-stable distributions. These simulations include transmission losses, the effects of reverberations, and the loss of resolution caused by band-limited wavelets, and suggest that actual seismic amplitudes with sufficient signal-to-noise ratio should also have a Levy-stable distribution. This prediction is verified using poststack seismic data acquired in the Timor Sea and in the continental USA. However, prestack seismic amplitudes from the Timor Sea are nearly Gaussian. The authors attribute the difference between prestack and poststack data to the high level of measurement noise in the prestack data.
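    The heavy-tailed behavior described above can be illustrated with a standard-library sketch. As an assumption for the example, we use the Cauchy distribution, the symmetric Levy-stable law with index alpha = 1 (near the fitted range 0.99-1.43), against a Gaussian (alpha = 2); the sample size and tail threshold are arbitrary illustration choices:

```python
import math
import random

random.seed(0)
N = 200_000

# Cauchy draws via the inverse-CDF identity X = tan(pi * (U - 1/2)).
cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(N)]
gauss = [random.gauss(0.0, 1.0) for _ in range(N)]

# Fraction of "large-amplitude" samples beyond five scale units.
tail = 5.0
frac_cauchy = sum(abs(v) > tail for v in cauchy) / N
frac_gauss = sum(abs(v) > tail for v in gauss) / N
print(frac_cauchy, frac_gauss)
```

    Roughly 12% of the Cauchy samples exceed the threshold (analytically, P(|X| > 5) = (2/pi) arctan(1/5), about 0.126), while Gaussian exceedances at that level are essentially absent; this slow tail decay is the qualitative signature reported for the reflection-coefficient histograms.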

  5. Passive seismic experiment

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

    1972-01-01

    The design, deployment, and operation of the Apollo 16 passive seismic experiment (PSE) are discussed. Since activation, all elements of the PSE have operated as planned, with the exception of the sensor thermal control system. Significant progress in the measurement of meteoroid flux in near-earth space has been made, along with delineation of active moonquake source regions. The data obtained indicate that moonquakes are concentrated at great depth (800 to 1000 km) and that the apparent disparity between meteoroid flux estimates based on lunar crater counts and those from earth-based observations can be resolved by seismic measurements in favor of the lower flux indicated by the crater count method. The results obtained from the PSE are summarized and their significance is discussed in detail.

  6. Cross well seismic reservoir characterization

    SciTech Connect

    Sheline, H.E.

    1995-08-01

    A striking example of how cross well seismic reflection data can help characterize a reservoir has resulted from an ongoing multi-discipline study of the carbonate Mishrif reservoir offshore Dubai, U.A.E. Because the study objectives include a more detailed description of intra-reservoir structure and layering, Dubai Petroleum Company (DPC) analyzed the feasibility of Cross Well Seismic (CWS) and decided to acquire two surveys between three wells 337 to 523 feet apart. DPC has concluded that CWS can be cost-effectively acquired offshore in a carbonate reservoir, as well as processed and interpreted. However, it is generally not easy to acquire cross well seismic when and where it will be most useful. A CWS survey can provide multiple images, such as a velocity tomogram, P-wave reflections, and S-wave reflections. To date, tomograms and P-wave reflections have been produced, and the reflection data have proven to be the most useful for reservoir characterization. CWS reflection data have provided vertical seismic reflection resolution of around 2 feet, which is more than 10 times better than surface seismic data (2D or 3D). The increase in vertical resolution has provided important detailed information about the reservoir: its continuity/heterogeneity; its detailed structure, stratigraphy, and layering; and definition of any faults with more than 2 feet of offset. The CWS has shown detailed intra-Mishrif reflectors. These reflectors have verified or changed detailed correlations between well bores, and show significant intra-Mishrif thinning. They imply time-stratigraphic layering which is consistent with tracer study results and regional sequence stratigraphy. This new data will be used to improve the reservoir model description.

  7. Web Seismic Un*x: making seismic reflection processing more accessible

    NASA Astrophysics Data System (ADS)

    Templeton, M. E.; Gough, C. A.

    1999-05-01

    Web Seismic Un*x is a browser-based user interface for the Seismic Un*x freeware developed at Colorado School of Mines. The interface allows users to process and display seismic reflection data from any remote platform that runs a graphical Web browser. Users access data and create processing jobs on a remote server by completing form-based Web pages whose Common Gateway Interface scripts are written in Perl. These scripts supply parameters, manage files, call Seismic Un*x routines and return data plots. The interface was designed for undergraduate commuter students taking geophysics courses who need to: (a) process seismic data and other time series as a class using computers in campus teaching labs and (b) complete course assignments at home. Students from an undergraduate applied geophysics course tested the Web user interface while completing laboratory assignments in which they acquired and processed common-depth-point seismic reflection data into a subsurface image. This freeware, which will be publicly available by summer 1999, was developed and tested on a Solaris 2.5 server and will be ported to other versions of Unix, including Linux.

  8. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Justin Coleman; Piyush Sabharwall

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore, opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for light water reactor (LWR) facilities using commercially available technology. However, SI has seen little application in the nuclear industry, and uncertainty remains about implementing the procedures outlined in ASCE 4. Opportunity exists to determine the barriers associated with implementation of the current ASCE 4 standard language.
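    The hazard-fragility convolution mentioned above can be sketched numerically. All numbers and functional forms here are invented for illustration: a power-law hazard curve and a lognormal fragility with hypothetical parameters (median capacity 0.9 g, logarithmic standard deviation 0.4); none of them describe any actual facility:

```python
import math

def fragility(a, median=0.9, beta=0.4):
    """Lognormal fragility: P(failure | PGA = a), with a in g."""
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def hazard(a):
    """Toy hazard curve: annual frequency of exceeding PGA `a` (g),
    anchored at 1e-4/yr for 0.5 g (illustrative power law)."""
    return 1e-4 * (0.5 / a) ** 2.5

# Annual failure frequency: sum P(failure | a) over hazard-curve increments.
amin, amax, n = 0.05, 3.0, 3000
step = (amax - amin) / n
risk = 0.0
for i in range(n):
    a0, a1 = amin + i * step, amin + (i + 1) * step
    dfreq = hazard(a0) - hazard(a1)     # frequency of shaking falling in [a0, a1]
    risk += fragility(0.5 * (a0 + a1)) * dfreq
print(risk)
```

    The loop bins the hazard curve into PGA intervals, weights each interval's occurrence frequency by the fragility at its midpoint, and accumulates an annual failure frequency; an isolation system would enter this picture by shifting the fragility curve toward higher capacity, lowering the convolved risk.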

  9. Self-verifiable paper documents and automatic content verification

    NASA Astrophysics Data System (ADS)

    Tian, Yibin; Zhan, Xiaonong; Wu, Chaohong; Ming, Wei

    2014-02-01

    This report describes a method for the creation and automatic content verification of low-cost self-verifiable paper documents. The image of an original document is decomposed into symbol templates and their corresponding locations. The resulting data are further compressed, encrypted, and encoded in custom-designed high-capacity color barcodes. The original image and barcodes are printed on the same paper to form a self-verifiable authentic document. During content verification, the paper document is scanned to obtain the barcodes and target image. The original image is reconstructed from data extracted from the barcodes, then registered with and compared to the target image. The verification is carried out hierarchically, from the entire image down to the word and symbol levels. For symbol-level comparison, multiple types of features and shape matching are utilized in a cascade. The proposed verification method is inexpensive, robust and fast. Evaluation on 216 character tables and 102 real documents achieved an alteration detection rate greater than 99% and a false-positive rate below 1% at the word/symbol level.
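    The compress-encrypt-encode round trip can be sketched with the Python standard library. This is a toy stand-in, not the authors' method: the real system encodes symbol templates and locations into color barcodes and compares images hierarchically, whereas the sketch below simply treats the document as bytes, uses zlib for compression, and substitutes a SHA-256 counter-mode keystream as a placeholder cipher (illustrative only, not vetted cryptography):

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Placeholder stream cipher: SHA-256 in counter mode (toy, not vetted)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encode_payload(document: bytes, key: bytes) -> bytes:
    """Compress then encrypt: the data that would go into the barcodes."""
    compressed = zlib.compress(document, 9)
    return bytes(a ^ b for a, b in zip(compressed, keystream(key, len(compressed))))

def verify(scanned: bytes, payload: bytes, key: bytes) -> bool:
    """Reconstruct the original from the payload and compare with the scan."""
    plain = bytes(a ^ b for a, b in zip(payload, keystream(key, len(payload))))
    return zlib.decompress(plain) == scanned

original = b"PAY TO THE ORDER OF: J. Smith, $100.00"
payload = encode_payload(original, b"issuer-secret")
print(verify(original, payload, b"issuer-secret"))   # intact content verifies
print(verify(original.replace(b"100", b"900"), payload, b"issuer-secret"))
```

    An unaltered scan reproduces the reconstruction exactly and verifies; any content change makes the comparison fail, mirroring (in bytes rather than images) the alteration-detection behavior evaluated in the paper.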

  10. Infrasound Generation from the HH Seismic Hammer.

    SciTech Connect

    Jones, Kyle Richard

    2014-10-01

    The HH Seismic Hammer is a large "weight-drop" source for active source seismic experiments. This system provides a repetitive source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by its 13-metric-ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation, and surface-air coupling studies for source characterization.

  11. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task, because it depends not only on seismic hazard but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus here. There is no question that seismic hazard exists at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the choice of methods and (2) the difficulty of characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how the input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.
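    The three inputs named above (seismic sources, occurrence frequencies, and ground-motion attenuation) combine in a hazard calculation. The single-source sketch below is hedged throughout: the Gutenberg-Richter coefficients, the attenuation form ln(PGA) = c0 + c1*M - c2*ln(R), and the 80 km source distance are invented for illustration and are not the report's inputs for Paducah:

```python
import math

# Hypothetical Gutenberg-Richter recurrence: rate(M >= m) = rate0 * 10**(-b*(m - m_min))
rate0, b, m_min, m_max = 0.2, 1.0, 5.0, 8.0   # 0.2 events/yr with M >= 5 (invented)
c0, c1, c2, R = -4.0, 1.2, 1.3, 80.0          # assumed attenuation; site 80 km away

def exceedance_rate(target_pga):
    """Annual rate of events whose median PGA at the site exceeds `target_pga` (g)."""
    # invert the attenuation relation for the magnitude that just reaches the target
    m_star = (math.log(target_pga) - c0 + c2 * math.log(R)) / c1
    m_star = min(max(m_star, m_min), m_max)
    return rate0 * (10.0 ** (-b * (m_star - m_min)) - 10.0 ** (-b * (m_max - m_min)))

for a in (0.05, 0.1, 0.2):
    print(a, exceedance_rate(a))
```

    A full probabilistic analysis sums such terms over all source zones and integrates over distance and ground-motion variability; the uncertainty in each ingredient is exactly the difficulty the report describes.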

  12. Seismic assessment for offshore pipelines

    SciTech Connect

    Bruschi, R.; Gudmestad, O.T.; Blaker, F.; Nadim, F.

    1995-12-31

    An international consensus on seismic design criteria for onshore pipelines has been established over the last thirty years. The need to assess seismic design for offshore pipelines has not been similarly recognized. In this paper, the geotechnical hazard for a pipeline routed across steep slopes and irregular terrain affected by earthquakes is discussed. The integrity of both natural and artificial load-bearing supports is assessed. The response of the pipeline to direct excitation from the soil, or through discontinuous, sparsely distributed natural or artificial supports, is also discussed.

  13. Seismic risk perception in Italy

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Peruzza, Laura

    2014-05-01

    Risk perception is a fundamental element in the definition and adoption of preventive countermeasures. In order to develop effective information and risk communication strategies, the perception of risks and the factors influencing it should be known. This paper presents results of a survey on seismic risk perception in Italy conducted from January 2013 to the present. The research design combines a psychometric and a cultural-theoretic approach. More than 7,000 online tests have been compiled. The data collected show that seismic risk perception in Italy is strongly underestimated; 86 out of 100 Italian citizens living in the most dangerous zone (namely Zone 1) do not have a correct perception of seismic hazard. From these observations we conclude that urgent measures are required in Italy to communicate seismic risk effectively. Finally, the research presents a comparison of seismic risk perception between two groups: one involved in information and education campaigns on seismic risk, and a control group.

  14. A Very High Resolution, Deep-Towed Multichannel Seismic Survey in the Yaquina Basin off Peru - Technical Design of the new Deep-Tow Streamer

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2002-12-01

    Within the INGGAS project, a new deep-towed acoustic profiling instrument, consisting of a side scan sonar fish and a 26-channel seismic streamer, has been developed for operation at full ocean depth. The digital channels are built from single hydrophones and three engineering nodes (EN), which are connected by either 1 m or 6.5 m long cable segments. Together with high-frequency surface sources (e.g. GI gun), this hybrid system makes it possible to complete surveys with higher-frequency target resolution than fully surface-based configurations. Consequently, special effort has been devoted to positioning of the submerged towed instrument. Ultra Short Base Line (USBL) navigation of the tow fish allows precise coordinate evaluation even with more than 7 km of tow cable. Specially designed engineering nodes comprise a single hydrophone with compass, depth, pitch and roll sensors. Optional extension of the streamer to up to 96 hydrophone nodes and 75 engineering nodes is possible. A telemetry device allows up- and downlink transmission of all system parameters and all recorded data from the tow fish in real time. Signals from the streamer and the various side scan sensors are multiplexed along the deep-sea cable. Within the telemetry system, coaxial and fiber optic connectors are available and can be chosen according to the ship's needs. When bandwidth is limited, only selected portions of the data are transmitted onboard to provide full online quality control, while a copy of the complete data set is stored within the submerged systems. Onboard, the record strings of side scan and streamer are demultiplexed and distributed to the quality control (QC) systems by Ethernet. A standard marine multichannel control system is used to display shot gathers, spectra and noise monitoring of the streamer channels, as well as for data storage in SEG format.
    Precise navigation post-processing includes all available positioning information from the vessel (DGPS), the USBL, the streamer (EN) and, optionally, first-break information. Therefore, exact positioning of each hydrophone can be provided throughout the entire survey, which is essential input for later migration processing of the seismic data.

  15. Reproducing the seismic response of a cluster of buildings with acoustic meta-materials

    NASA Astrophysics Data System (ADS)

    Colombi, A.; Rupin, M.; Roux, P.

    2013-12-01

    Recently, new kinds of metamaterials have appeared in the fields of acoustics and electromagnetism. They are realized using so-called sub-wavelength resonators, because their characteristic size (in the plane of interest) is far smaller than the wavelength. These objects offer the opportunity to manipulate the wavefield at will. We have built an experiment to study their effect on elastic waves, which is of strong interest in seismology. The experiment already reveals that when excited by an acoustic or a seismic source, these metamaterials may give rise to (1) band-gap effects, i.e. frequency bands in which seismic energy cannot propagate, or (2) sub-wavelength focusing, i.e. concentration of energy at wavelengths much smaller than that of the originally propagated signal. A cluster of aluminum beams rigidly mounted on an aluminum plate in which we propagate flexural modes constitutes the metamaterial, and represents an ideal model for studying the seismic response of a cluster of tall buildings and the so-called site-city interaction, two important but still poorly understood aspects of seismic vulnerability. Using the analogy with metamaterials, we present preliminary results that show how such complex phenomena may be downscaled to laboratory size through the choice of a few relevant design parameters, and used to model real-scale configurations. The research methodology relies equally on numerical simulations and laboratory experiments, both reproducing the same plate-beams model. With numerical simulations in particular, we characterize the effective properties of the cluster and conduct a sensitivity analysis on the relevant design parameters. The study highlights the appearance of (1) band gaps and (2) sub-wavelength energy focusing. The first could potentially be exploited to realize earthquake-protected building configurations, seismic cloaking, or protection of strategic buildings.
    The second represents a potentially dangerous phenomenon that needs further study to verify whether it can occur in existing cluster configurations.

  16. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  17. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  18. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  19. An arbitrated quantum signature scheme with fast signing and verifying

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Qin, Su-Juan; Su, Qi

    2013-11-01

    Existing arbitrated quantum signature (AQS) schemes are almost all based on the Leung quantum one-time pad (L-QOTP) algorithm. In these schemes, the receiver can achieve an existential forgery of the sender's signatures under a known-message attack, and the sender can successfully disavow any of her/his signatures by a simple attack. In this paper, a solution to these problems is given by designing a new QOTP algorithm that relies largely on inserting decoy states into fixed insertion positions. Furthermore, we present an AQS scheme with fast signing and verifying based on the new QOTP algorithm. It uses only single-particle states and is unconditionally secure. To fulfill the functions of AQS schemes, our scheme requires significantly lower computational cost than other AQS schemes based on the L-QOTP algorithm.

  20. Using Theorem Proving to Verify Properties of Agent Programs

    NASA Astrophysics Data System (ADS)

    Alechina, N.; Dastani, M.; Khan, F.; Logan, B.; Meyer, J.-J. Ch.

    We present a sound and complete logic for automatic verification of simpleAPL programs. simpleAPL is a simplified version of agent programming languages such as 3APL and 2APL designed for the implementation of cognitive agents with beliefs, goals and plans. Our logic is a variant of PDL, and allows the specification of safety and liveness properties of agent programs. We prove a correspondence between the operational semantics of simpleAPL and the models of the logic for two example program execution strategies. We show how to translate agent programs written in simpleAPL into expressions of the logic, and give an example in which we show how to verify correctness properties for a simple agent program using theorem-proving.

  1. Application of bounding spectra to seismic design of piping based on the performance of above ground piping in power plants subjected to strong motion earthquakes

    SciTech Connect

    Stevenson, J.D.

    1995-02-01

    This report extends the potential application of Bounding Spectra evaluation procedures, developed as part of the A-46 Unresolved Safety Issue applicable to seismic verification of in-situ electrical and mechanical equipment, to in-situ safety-related piping in nuclear power plants. The report presents a summary of earthquake experience data which define the behavior of typical U.S. power plant piping subject to strong motion earthquakes. The report defines piping system caveats which assure the seismic adequacy of piping systems that meet them and whose seismic demand is within the bounding spectra input. Based on the observed behavior of piping in strong motion earthquakes, the report describes the capability of piping systems to carry seismic loads as a function of the type of connection (i.e., threaded versus welded). The report also discusses in some detail the basic causes and mechanisms of earthquake damage and failures in power plant piping systems.

  2. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most active seismic countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high and thus understanding the earthquake phenomena and their effects at the earth surface represents an important step toward the education of population in earthquake affected regions of the country and aims to raise the awareness about the earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development " URBAN - INCERC" Bucharest, the Babe?-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing of several comprehensive educational materials, designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. Thus a large amount of such data will be used by students and teachers for educational purposes. For the social objectives, the project represents an effective instrument for informing and creating an awareness of the seismic risk, for experimentation into the efficacy of scientific communication, and for an increase in the direct involvement of schools and the general public. 
A network of nine seismic stations with SEP seismometers will be installed in several schools in the most important seismic areas (Vrancea, Dobrogea), vulnerable cities (Bucharest, Ploiesti, Iasi) or highly populated places (Cluj, Sibiu, Timisoara, Zalău). All the elements of the seismic station are especially designed for educational purposes and can be operated independently by the students and teachers themselves. The first stage of the ROEDUSEIS project centered on the preparation of educational materials for all levels of pre-university education (kindergarten, primary, secondary and high school). A needs assessment preceded the preparation of these materials, carried out through a set of questionnaires for teachers and students sent to participating schools. Their responses served as feedback for editing the materials properly. The topics covered within the educational materials include: seismicity (general principles, characteristics of Romanian seismicity, historical local events), structure of the Earth, measurement of earthquakes, and seismic hazard and risk.

  3. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Congress (SBCC) Standard Building Code (SBC). (b) The seismic design and construction of a covered building... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Seismic safety standards... Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program ...

  4. Seismic behavior of buried pipelines constructed by design criteria and construction specifications of both Korea and the US

    NASA Astrophysics Data System (ADS)

    Jeon, S.-S.

    2013-03-01

    Lifeline damage induced by earthquake loading not only causes structural damage but also communication problems resulting from the interruption of various energy utilities such as electric power, gas, and water resources. Earthquake loss estimation systems in the US, for example HAZUS (Hazard in US), have been established for the purpose of prevention of and efficient response to earthquake hazards. Sufficient damage records obtained from earthquakes are required to establish these systems; however, in Korea, insufficient data sets of damage records are currently available. In this study, according to the design criteria and construction specifications of pipelines in Korea and the US, the behavior of both brittle and ductile pipelines embedded in dense sand overlying various in-situ soils, such as clay, sand, and gravel, was examined and compared with respect to the mechanical characteristics of the pipelines under various earthquake loadings.

  5. Seismic sources

    DOEpatents

    Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

    1987-04-20

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for more than about one minute. 9 figs.

  6. 2008 United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; and others

    2008-01-01

    The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.

  7. Seismic no-data zone, offshore Mississippi delta: depositional controls on geotechnical properties, velocity structure, and seismic attenuation

    SciTech Connect

    May, J.A.; Meeder, C.A.; Tinkle, A.R.; Wener, K.R.

    1986-09-01

    Seismic acquisition problems plague exploration and production offshore the Mississippi delta. Geologic and geotechnical analyses of 300-ft borings and 20-ft piston cores, combined with subbottom acoustic measurements, help identify and predict the locations, types, and magnitudes of anomalous seismic zones. This knowledge is used to design acquisition and processing techniques to circumvent the seismic problems.

  8. Seismic behavior of buried pipelines constructed by design criteria and construction specifications of both Korea and the US

    NASA Astrophysics Data System (ADS)

    Jeon, S.-S.

    2013-09-01

    Earthquake loss estimation systems in the US, for example HAZUS (Hazard in US), have been established based on sufficient damage records for the purpose of prevention of and efficient response to earthquake hazards; however, in Korea, insufficient data sets of earthquakes and damage records are currently available. In this study, earthquake damage to pipelines in Korea was reevaluated using the pipeline repair rate (RR) recommended in HAZUS, to determine the degree of confidence when RR is used without modification for the damage estimation of pipelines in Korea. Numerical analyses using a commercial finite element program, ABAQUS, were carried out to compare the stresses and strains mobilized in both brittle and ductile pipelines constructed by the design criteria and construction specifications of both Korea and the US. These pipelines were embedded in dense sand overlying three different in situ soils (clay, sand, and gravel) subjected to earthquake excitations with peak ground accelerations (PGAs) of 0.2 to 1.2 g and to the 1994 Northridge and 1999 Chi-Chi earthquake loadings. The numerical results show that differences in the stress and strain rates are less than 10%. This implies that the RR in HAZUS can be used for earthquake damage estimation of pipelines in Korea with a 90% confidence level.
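The HAZUS pipeline repair rate mentioned above maps a ground-motion intensity to an expected number of repairs per kilometer of pipe. The sketch below uses the commonly cited HAZUS wave-propagation form RR ≈ 0.0001 × PGV^2.25 (RR in repairs/km, PGV in cm/s) with a 0.3 multiplier for ductile pipe; treat these coefficients as illustrative assumptions rather than the exact values used in the study.

```python
# Sketch of a HAZUS-style pipeline repair-rate relation.
# Coefficients follow the commonly cited HAZUS wave-propagation model
# (RR in repairs/km, PGV in cm/s); treat them as illustrative assumptions.

def repair_rate(pgv_cm_s: float, ductile: bool = False) -> float:
    """Expected pipeline repairs per km for a given peak ground velocity."""
    rr = 0.0001 * pgv_cm_s ** 2.25       # brittle pipe (e.g. cast iron)
    return 0.3 * rr if ductile else rr   # ductile pipe gets a 0.3 multiplier

def expected_repairs(length_km: float, pgv_cm_s: float,
                     ductile: bool = False) -> float:
    """Expected repair count over a pipeline segment of given length."""
    return length_km * repair_rate(pgv_cm_s, ductile)

if __name__ == "__main__":
    for pgv in (10.0, 30.0, 50.0):
        print(pgv, repair_rate(pgv), repair_rate(pgv, ductile=True))
```

Scaling RR by segment length, as `expected_repairs` does, is how such rates are typically turned into loss estimates for a pipeline inventory.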

  9. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.

  10. A university-developed seismic source for shallow seismic surveys

    NASA Astrophysics Data System (ADS)

    Yordkayhun, Sawasdee; Na Suwan, Jumras

    2012-07-01

    The main objectives of this study were to (1) design and develop a low-cost seismic source for shallow seismic surveys and (2) test the performance of the developed source at a test site. The surface seismic source, referred to here as a university-developed seismic source, is based upon the principle of an accelerated weight drop. A 30 kg activated mass is lifted by a mechanical rack and pinion gear and is accelerated by a mounted spring. When the mass is released from 0.5 m above the surface, it hits a 30 kg base plate and energy is transferred to the ground, generating a seismic wave. The developed source is portable, environmentally friendly, easy to operate and maintain, and is a highly repeatable impact source. To compare the developed source with a sledgehammer source, a source test was performed at a test site, a study site for mapping a major fault zone in southern Thailand. The sledgehammer and the developed source were shot along a 300 m long seismic reflection profile with the same parameters. Data were recorded using a 12-channel off-end geometry with source and receiver spacing of 5 m, resulting in CDP stacked sections with 2.5 m between traces. Source performance was evaluated based on analyses of signal penetration, frequency content and repeatability, as well as a comparison of stacked sections. The results show that both surface sources are suitable for seismic studies down to a depth of about 200 m at the site. The hammer data are characterized by relatively higher frequency signals than the developed source data, whereas the developed source generates signals with overall higher energy transmission and greater signal penetration. In addition, the repeatability of the developed source is considerably higher than that of the hammer source.
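The source energy described above can be bounded with elementary mechanics: gravity alone delivers m·g·h, and the mounted spring adds whatever energy it stores at release. A minimal sketch (the 100 J spring contribution is a made-up illustrative number, not a measured value):

```python
# Back-of-envelope energy for the accelerated weight drop described above:
# a 30 kg mass falling 0.5 m, optionally boosted by a mounted spring.
import math

M, H, G = 30.0, 0.5, 9.81            # mass (kg), drop height (m), gravity (m/s^2)

def impact_energy(spring_energy_j: float = 0.0) -> float:
    """Kinetic energy (J) delivered to the base plate at impact."""
    return M * G * H + spring_energy_j

def impact_velocity(spring_energy_j: float = 0.0) -> float:
    """Impact speed (m/s) from the kinetic energy at the base plate."""
    return math.sqrt(2.0 * impact_energy(spring_energy_j) / M)

print(impact_energy())          # gravity-only energy, about 147 J
print(impact_velocity(100.0))   # a spring boost raises the impact velocity
```

Even a modest spring contribution noticeably raises the impact velocity, which is the rationale for accelerating the mass rather than relying on free fall alone.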

  11. Buried tank-to-tank interaction during a seismic event

    SciTech Connect

    Moore, C.J.; Wagenblast, G.R.; Day, J.P.

    1995-12-01

    Three-dimensional dynamic soil-structure interaction seismic analyses have become practical and accepted only since 1980. This new capability allows the study of interaction among closely spaced buried tanks during a seismic event. This paper presents the results of two studies of seismic tank-to-tank interaction at the US Department of Energy's Hanford Site. One study evaluates seismic tank-to-tank interaction for an existing reinforced concrete tank design used during construction of the Hanford Site in the 1940s. The other study evaluates seismic interaction and radius of separation for newly designed Hanford double-shelled buried waste tanks that are to be constructed.

  12. LANL seismic screening method for existing buildings

    SciTech Connect

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.
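As a rough illustration of the two-phase flow described above (not the actual LANL scoring tables, which are not reproduced here), the following sketch screens a hypothetical inventory: automatic failures for PC-3/PC-4 and unreinforced masonry buildings, a Phase One hazard-score check, a Phase Two capacity/demand ratio, and a priority ranking of the buildings that fail. All thresholds and building data are invented.

```python
# Illustrative two-phase seismic screening sketch (hypothetical data).
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    perf_category: int          # DOE performance category (PC)
    unreinforced_masonry: bool
    hazard_score: float         # Phase One: configuration/physical hazards
    capacity: float             # Phase Two: structural capacity
    demand: float               # Phase Two: seismic demand

def screen(b, hazard_limit=1.0, cd_limit=1.0):
    """Return (passes_screening, capacity/demand ratio)."""
    if b.perf_category >= 3 or b.unreinforced_masonry:
        return False, 0.0       # automatic failure -> detailed seismic analysis
    cd = b.capacity / b.demand
    return b.hazard_score <= hazard_limit and cd >= cd_limit, cd

inventory = [
    Building("B1", 1, False, 0.4, 120.0, 80.0),   # passes both phases
    Building("B2", 2, False, 0.8, 60.0, 90.0),    # fails capacity/demand
    Building("B3", 0, True, 0.2, 200.0, 50.0),    # URM: automatic failure
]

failed = []
for b in inventory:
    ok, cd = screen(b)
    if not ok:
        failed.append((cd, b.name))
failed.sort()                   # lowest ratio = highest evaluation priority
print(failed)
```

The sort at the end mirrors the method's goal of rationally prioritizing inadequate buildings for further evaluation rather than merely pass/failing them.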

  13. Implementation of Seismic Stops in Piping Systems

    SciTech Connect

    Bezler, P.; Simos, N.; Wang, Y.K.

    1993-02-01

    Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code ''GAPPIPE'' developed by RLCA. The code essentially has all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method, an independent evaluation of the Byron piping using the nonlinear time history capability of the ANSYS computer code, and, by comparing results to those developed by RLCA, an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.
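The equivalent linearization idea behind GAPPIPE can be illustrated in miniature (this is a generic textbook-style secant-stiffness iteration, not the proprietary RLCA method): replace the gapped support with an amplitude-dependent linear stiffness k_eq(A) and iterate until the assumed single-degree-of-freedom response amplitude is consistent with that stiffness. All parameter values are hypothetical.

```python
# Toy equivalent-linearization demo for a gapped support (hypothetical values).
import math

def gap_force(x, k=1.0e6, gap=2.0e-3):
    """Bilinear gap-spring force: no contact force inside the gap."""
    if abs(x) <= gap:
        return 0.0
    return k * (abs(x) - gap) * math.copysign(1.0, x)

def equivalent_stiffness(amplitude, k=1.0e6, gap=2.0e-3):
    """Secant stiffness of the gap element at a given response amplitude."""
    return gap_force(amplitude, k, gap) / amplitude if amplitude > gap else 0.0

def linearized_amplitude(m, c, f0, omega, k=1.0e6, gap=2.0e-3, iters=50):
    """Relaxed fixed-point iteration on the amplitude of
    m*x'' + c*x' + k_eq(A)*x = f0*sin(omega*t)."""
    A = f0 / (m * omega ** 2)            # initial guess: inertia-dominated
    for _ in range(iters):
        keq = equivalent_stiffness(A, k, gap)
        A_new = f0 / math.hypot(keq - m * omega ** 2, c * omega)
        A = 0.5 * (A + A_new)            # relaxation to stabilize the iteration
    return A

print(linearized_amplitude(m=100.0, c=500.0, f0=1000.0, omega=30.0))
```

The converged amplitude exceeds the gap width, so the support engages with a reduced secant stiffness; a linear analysis with this k_eq approximates the nonlinear response, which is the essence of the technique BNL reviewed.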

  15. Verifying an interactive consistency circuit: A case study in the reuse of a verification technology

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1990-01-01

    The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.

  16. Seismic Performance Requirements for WETF

    SciTech Connect

    Hans Jordan

    2001-01-01

    This report develops recommendations for requirements on the Weapons Engineering Tritium Facility (WETF) performance during seismic events. These recommendations are based on fragility estimates of WETF structures, systems, and components that were developed by LANL experts during facility walkdowns. They follow DOE guidance as set forth in standards DOE-STD-1021-93, ''Natural Phenomena Hazards Performance Categorization Guidelines for Structures, Systems, and Components'' and DOE-STD-1020-94, ''Natural Phenomena Hazards Design and Evaluation Criteria for Department of Energy Facilities''. Major recommendations are that WETF institute a stringent combustible loading control program and that additional seismic bracing and anchoring be provided for gloveboxes and heavy equipment.

  17. Vertical seismic profiling

    SciTech Connect

    Wyatt, K.D.

    1986-12-02

    A method is described for converting vertical seismic profiling (VSP) seismic data to surface seismic data. The seismic source used to obtain the VSP seismic data was offset a desired distance from a borehole, the method comprising the steps of: (a) selecting a first VSP data trace from the VSP seismic data; (b) mapping segments of the first VSP data trace at respective VSP times into locations on a plot of surface seismic time as a function of distance from the borehole; (c) repeating steps (a) and (b) for at least a portion of the VSP data traces, other than the first VSP data trace, in the VSP seismic data; and (d) summing sections of each VSP data trace which are mapped into the same location in the plot to produce the surface seismic data.
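Steps (a)-(d) of the claim amount to a map-and-stack loop. A highly simplified sketch under a constant-velocity assumption (a real VSP-CDP transform uses a proper velocity model and offset-dependent ray paths): each sample recorded at depth z is shifted by the one-way travel time z/v to its equivalent surface time and summed into the output, normalizing by fold.

```python
# Simplified sketch of the VSP map-and-sum flow in steps (a)-(d),
# assuming a single constant velocity v (a strong simplification).
import numpy as np

def vsp_to_surface(vsp_traces, depths, dt, v=2000.0):
    """vsp_traces: (n_receivers, n_samples) array; depths: receiver depths (m);
    dt: sample interval (s); returns one surface-equivalent trace."""
    n_rx, n_t = vsp_traces.shape
    out = np.zeros(n_t)
    fold = np.zeros(n_t)
    for trace, z in zip(vsp_traces, depths):
        shift = int(round((z / v) / dt))     # one-way delay down to depth z
        for i in range(n_t - shift):
            out[i + shift] += trace[i]       # map VSP sample to surface time
            fold[i + shift] += 1.0
    fold[fold == 0] = 1.0
    return out / fold                        # normalize summed samples by fold
```

In the patented method, the mapping also spreads each trace segment across offset bins before summing; the fold normalization above stands in for that stacking step.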

  18. Seismic sources

    DOEpatents

    Green, Michael A. (Oakland, CA); Cook, Neville G. W. (Lafayette, CA); McEvilly, Thomas V. (Berkeley, CA); Majer, Ernest L. (El Cerrito, CA); Witherspoon, Paul A. (Berkeley, CA)

    1992-01-01

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole relative to a stator that is clamped to the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements at a power level that causes heating to over 150.degree. C. within one minute of operation, but energizing the elements for no more than about one minute.

  19. Seismic waveform viewer, processor and calculator

    Energy Science and Technology Software Center (ESTSC)

    2015-02-15

    SWIFT is a computer code designed to do research-level signal analysis on seismic waveforms, including visualization, filtering and measurement. LLNL is using this code in its amplitude and global tomography efforts.

  20. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope

    PubMed Central

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

    Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. This paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response values of the slope follow different variation rules under near-field and far-field earthquakes, and that the damage location and pattern of the slope differ under varying groundwater conditions. The destruction starts at the top of the slope when no groundwater is present, indicating an obvious whipping effect under the earthquake. The destruction starts at the toe of the slope when the groundwater level is high; meanwhile, the top of the slope shows an obvious seismic subsidence phenomenon after the earthquake. Furthermore, the presence of groundwater has a certain damping effect. PMID:26560103

  2. The intelligent seismic retrofitting of structure based on the magnetorheological dampers

    NASA Astrophysics Data System (ADS)

    Li, Xiu-ling; Li, Hong-nan

    2009-03-01

    Based on the state of the art in seismic damage principles and aseismic strengthening technology, an analysis and design method for the seismic retrofitting of earthquake-damaged reinforced concrete frames using magnetorheological (MR) dampers is proposed. Three levels of fortification objectives are put forward and quantified for the intelligent retrofitting of reinforced concrete frames using MR dampers. The experiment system of a three-floor reinforced concrete frame-shear wall eccentric structure has been built based on the Matlab/Simulink software environment and the hardware/software resources of dSPACE. The shaking table experiment of seismic retrofitting of the earthquake-damaged reinforced concrete frame-shear wall structure using MR dampers is implemented using rapid control prototyping (RCP) technology. The validity of the passive control strategies and the semi-active control strategy is verified under El Centro earthquake excitations with different peak values. The experimental results indicate that MR dampers can significantly enhance the aseismic performance level of the seismically damaged reinforced concrete frame and meet all the earthquake fortification levels. The aseismic capacity of the self-strengthening intelligent structural system with MR dampers is much better than that of both the damaged structure and the structure strengthened with passive dampers.

  3. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  4. The ENAM Explosive Seismic Source Test

    NASA Astrophysics Data System (ADS)

    Harder, S. H.; Magnani, M. B.

    2013-12-01

    We present the results of the pilot study conducted as part of the eastern North American margin (ENAM) community seismic experiment (CSE) to test an innovative design of land explosive seismic source for crustal-scale seismic surveys. The ENAM CSE is a community based onshore-offshore controlled- and passive-source seismic experiment spanning a 400 km-wide section of the mid-Atlantic East Coast margin around Cape Hatteras. The experiment was designed to address prominent research questions such as the role of the pre-existing lithospheric grain on the structure and evolution of the ENAM margin, the distribution of magmatism, and the along-strike segmentation of the margin. In addition to a broadband OBS deployment, the CSE will acquire multichannel marine seismic data and two major onshore-offshore controlled-source seismic profiles recording both marine sources (airguns) and land explosions. The data acquired as part of the ENAM CSE will be available to the community immediately upon completion of QC procedures required for archiving purposes. The ENAM CSE provides an opportunity to test a radically new and more economical design for land explosive seismic sources used for crustal-scale seismic surveys. Over the years we have incrementally improved the performance and reduced the cost of shooting crustal seismic shots. These improvements have come from better explosives and more efficient configuration of those explosives. These improvements are largely intuitive, using higher velocity explosives and shorter, but larger diameter explosive configurations. However, recently theoretical advances now allow us to model not only these incremental improvements, but to move to more radical shot designs, which further enhance performance and reduce costs. Because some of these designs are so radical, they need experimental verification. To better engineer the shots for the ENAM experiment we are conducting an explosives test in the region of the ENAM CSE. 
The results of this test will guide engineering for the main ENAM experiment as well as other experiments in the future.

  5. Verifying the Dependence of Fractal Coefficients on Different Spatial Distributions

    SciTech Connect

    Gospodinov, Dragomir; Marekova, Elisaveta; Marinov, Alexander

    2010-01-21

    A fractal distribution requires that the number of objects larger than a specific size r has a power-law dependence on the size: N(r) = C/r^D ∝ r^(-D), where D is the fractal dimension. Usually the correlation integral is calculated to estimate the correlation fractal dimension of epicentres. A 'box-counting' procedure could also be applied, giving the 'capacity' fractal dimension. The fractal dimension can be an integer, in which case it is equivalent to a Euclidean dimension (zero for a point, one for a segment, two for a square, and three for a cube). In general, however, the fractal dimension is a fractional value, which is the origin of the term 'fractal'. The use of a power law to statistically describe a set of events or phenomena reveals the lack of a characteristic length scale; that is, fractal objects are scale invariant. Scale invariance and chaotic behavior underlie many natural hazard phenomena. Many studies of earthquakes reveal that their occurrence exhibits scale-invariant properties, so the fractal dimension can characterize them. It was first confirmed that both aftershock rate decay in time and earthquake size distribution follow a power law. Recently many other earthquake distributions have been found to be scale-invariant. The spatial distributions of both regional seismicity and aftershocks show some fractal features. Earthquake spatial distributions are considered fractal, but indirectly. There are two possible models which result in fractal earthquake distributions. The first model considers that a fractal distribution of faults leads to a fractal distribution of earthquakes, because each earthquake is characteristic of the fault on which it occurs. The second assumes that each fault has a fractal distribution of earthquakes. 
Observations strongly favour the first hypothesis. Fractal coefficient analysis provides some important advantages in examining earthquake spatial distribution: it offers a simple way to quantify scale-invariant distributions of complex objects or phenomena by a small number of parameters, and it is becoming evident that the applicability of fractal distributions to geological problems could have a more fundamental basis, since chaotic behaviour could underlie the geotectonic processes and the applicable statistics could often be fractal. The application of fractal distribution analysis has, however, some specific aspects. It is usually difficult to present an adequate interpretation of the obtained values of fractal coefficients for earthquake epicentre or hypocentre distributions. That is why in this paper we aimed at another goal: to verify how a fractal coefficient depends on different spatial distributions. We simulated earthquake spatial data by generating random points, first in a 3D space (a cube), then in a parallelepiped, diminishing one of its sides. We then continued this procedure in 2D and 1D space. For each simulated data set we calculated the points' fractal coefficient (the correlation fractal dimension of epicentres) and then checked for correlation between the coefficient values and the type of spatial distribution. In that way one can obtain a set of standard fractal coefficient values for varying spatial distributions, which can then be used when real earthquake data are analyzed, by comparing the real data coefficients to the standard ones. Such an approach can help in interpreting fractal analysis results through different types of spatial distributions.
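The simulation described above can be reproduced in a few lines: generate uniform random points in 1D, 2D, and 3D, compute the Grassberger-Procaccia correlation integral C(r), and take the slope of log C(r) versus log r as the correlation dimension. The point counts and distance range below are arbitrary choices; edge effects bias the estimates slightly below the embedding dimension.

```python
# Estimate the correlation fractal dimension D from C(r) ~ r^D for points
# simulated in 1D, 2D, and 3D unit hypercubes (Grassberger-Procaccia).
import numpy as np

def correlation_dimension(points, r_values):
    """points: (n, d) array; returns slope of log C(r) vs log r."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    dists = d[np.triu_indices(n, k=1)]       # unique pairwise distances
    c = np.array([(dists < r).mean() for r in r_values])  # correlation integral
    slope, _ = np.polyfit(np.log(r_values), np.log(c), 1)
    return slope

rng = np.random.default_rng(0)
r = np.logspace(-1.3, -0.7, 8)               # distance range ~0.05 to 0.2
for dim in (1, 2, 3):
    pts = rng.random((800, dim))             # uniform points in a unit hypercube
    print(dim, round(correlation_dimension(pts, r), 2))
```

Repeating the estimate while one side of the sampling box is shrunk, as the paper does with a parallelepiped, shows the coefficient drifting from the 3D value toward the 2D one.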

  6. Broadband seismology and small regional seismic networks

    USGS Publications Warehouse

    Herrmann, Robert B.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  7. Seismic safety of high concrete dams

    NASA Astrophysics Data System (ADS)

    Chen, Houqun

    2014-08-01

    China is a country of high seismicity with many hydropower resources. Recently, a series of high arch dams have either been completed or are being constructed in seismic regions, of which most are concrete dams. The evaluation of seismic safety often becomes a critical problem in dam design. In this paper, a brief introduction to major progress in the research on seismic aspects of large concrete dams, conducted mainly at the Institute of Water Resources and Hydropower Research (IWHR) during the past 60 years, is presented. The dam site-specific ground motion input, improved response analysis, dynamic model test verification, field experiment investigations, dynamic behavior of dam concrete, and seismic monitoring and observation are described. Methods to prevent collapse of high concrete dams under maximum credible earthquakes are discussed.

  8. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  9. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... registration card, or union card to verify your identity. If you do not have identification papers to verify... records such as medical records. (3) Electronic requests. If you make a request by computer or other... of your relationship to the minor or legal incompetent has been previously given to SSA. (7)...

  10. 26 CFR 301.6334-4 - Verified statements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 18 2012-04-01 2012-04-01 false Verified statements. 301.6334-4 Section 301.6334-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURE AND ADMINISTRATION PROCEDURE AND ADMINISTRATION Seizure of Property for Collection of Taxes 301.6334-4 Verified statements. (a) In general. For...

  11. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  12. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  13. Regional Seismic Methods of Identifying Explosions

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Hauk, T. F.

    2013-12-01

    A lesson from the 2006, 2009 and 2013 DPRK declared nuclear explosion Ms:mb observations is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, we need to put our empirical methods on a firmer physical footing. Here we review two of the main identification methods: 1) P/S ratios and 2) moment tensor techniques, which can be applied at regional distances (200-1600 km) to very small events, improving nuclear explosion monitoring and confidence in verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes (e.g. Walter et al., 1995). However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. Calculated intermediate-period (10-100 s) waveforms from regional 1-D models can match data and provide moment tensor results that separate explosions from earthquakes and cavity collapses (e.g. Ford et al. 2009). However, it has long been observed that some nuclear tests produce large Love waves and reversed Rayleigh waves that complicate moment tensor modeling. Again, the physical basis for the generation of these effects from explosions remains incompletely understood. We are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative amplitude techniques can be employed to better quantify differences between explosions and can be used to understand those differences in terms of depth, media and other properties. 
We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Our goal is to improve our explosion models and our ability to understand and predict where P/S and moment tensor methods of identifying explosions work, and any circumstances where they may not. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
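As a rough illustration of the P/S discriminant (not the authors' actual processing chain), the sketch below band-passes a record above 2 Hz and compares RMS amplitudes in windows around hypothetical P and S picks. The synthetic "earthquake" and "explosion" traces, the band limits, and all window parameters are invented for the demonstration:

```python
import numpy as np

def ps_ratio(trace, fs, p_win, s_win, fmin=2.0, fmax=8.0):
    """High-frequency P/S amplitude-ratio discriminant (sketch).
    trace: 1-D waveform; fs: sampling rate (Hz);
    p_win, s_win: (start, end) times in seconds around the picked arrivals."""
    # crude band-pass via an FFT mask (a real workflow would use a proper filter)
    spec = np.fft.rfft(trace)
    f = np.fft.rfftfreq(len(trace), 1.0 / fs)
    spec[(f < fmin) | (f > fmax)] = 0.0
    band = np.fft.irfft(spec, n=len(trace))

    def rms(win):
        i0, i1 = int(win[0] * fs), int(win[1] * fs)
        return np.sqrt(np.mean(band[i0:i1] ** 2))

    return rms(p_win) / rms(s_win)

# toy records: explosions are relatively S-poor at high frequency
fs = 100.0
t = np.arange(0, 30, 0.01)
p_pulse = np.exp(-((t - 5) ** 2) / 0.1) * np.sin(2 * np.pi * 5 * t)
s_pulse = np.exp(-((t - 15) ** 2) / 0.5) * np.sin(2 * np.pi * 4 * t)
earthquake = p_pulse + 3.0 * s_pulse   # S larger than P
explosion  = p_pulse + 0.3 * s_pulse   # S smaller than P

r_eq = ps_ratio(earthquake, fs, (4, 6), (13, 17))
r_ex = ps_ratio(explosion, fs, (4, 6), (13, 17))
# a simple threshold on the ratio then separates the two toy populations
```

The physics question the abstract raises is exactly why this separation holds (or fails) as source depth, size, and geology vary; the arithmetic of the discriminant itself is this simple.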

  14. VISION - Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics

    SciTech Connect

    Steven J. Piet; A. M. Yacout; J. J. Jacobson; C. Laws; G. E. Matthern; D. E. Shropshire

    2006-02-01

    The U.S. DOE Advanced Fuel Cycle Initiative's (AFCI) fundamental objective is to provide technology options that, if implemented, would enable long-term growth of nuclear power while improving sustainability and energy security. The AFCI organization structure consists of four areas: Systems Analysis, Fuels, Separations, and Transmutation. The Systems Analysis Working Group is tasked with bridging the program technical areas and providing the models, tools, and analyses required to assess the feasibility of design and deployment options and inform key decision makers. An integral part of the Systems Analysis tool set is the development of a system-level model that can be used to examine the implications of different mixes of reactors, the implications of fuel reprocessing, the impact of deployment technologies, as well as potential "exit" or "off-ramp" approaches to phase out technologies, waste management issues, and long-term repository needs. The Verifiable Fuel Cycle Simulation Model (VISION) is a computer-based simulation model that allows performing dynamic simulations of fuel cycles to quantify infrastructure requirements and identify key trade-offs between alternatives. It is based on the current AFCI system analysis tool "DYMOND-US" functionalities, with the addition of economics, isotopic decay, and other new functionalities. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI and Generation IV reactor development studies.

  15. Verified models of multiagent systems for vehicle health management

    NASA Astrophysics Data System (ADS)

    Esterline, Albert; Gandluri, Bhanu; Sundaresan, Mannur; Sankar, Jagannathan

    2005-05-01

    A multiagent framework for data acquisition, analysis, and diagnosis in health management is proposed. It uses the contract net protocol, a protocol for high-level distributed problem solving that provides adaptive and flexible solutions where task decomposition and assignment of subtasks is natural. Java is used to wrap implementations of existing techniques for individual tasks, such as neural networks or fuzzy rule bases for fault classification. The Java wrapping supplies an agent interface that allows an implementation to participate in the contract net protocol. This framework is demonstrated with a simple Java prototype that monitors a laboratory specimen that generates acoustic emission signals due to fracture-induced failure. A multiagent system that conforms to our framework can focus resources as well as select important data and extract important information. Such a system is extensible and decentralized, and redundancy in it provides fault tolerance and graceful degradation. Finally, the flexibility inherent in such a system allows new strategies to develop on the fly. The behavior of a non-trivial concurrent system (such as multiagent systems) is too complex and uncontrollable to be thoroughly tested, so methods have been developed to check the design of a concurrent system against formal specifications of the system's behavior. We review one such method, model checking with SPIN, and discuss how it can be used to verify control aspects of multiagent systems that conform to our framework.

  16. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area, including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, a pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system operable on command having a variable-frequency oscillator generating an audio-frequency signal proportional to the number of said seismic movements.
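The amplify/threshold/count chain described in the patent can be mimicked in software. The threshold, dead time, and synthetic footstep bursts below are illustrative values invented for the sketch, not parameters from the patent:

```python
import numpy as np

def count_seismic_events(signal, threshold, refractory=50):
    """Threshold detector with a dead time (in samples), mimicking the
    bandpass-then-threshold-then-count chain. Parameters are illustrative."""
    count, last = 0, -refractory
    for i, v in enumerate(np.abs(signal)):
        if v > threshold and i - last >= refractory:
            count += 1          # one stored event per triggering burst
            last = i            # dead time suppresses re-triggering
    return count

# three footstep-like bursts buried in low-level noise
rng = np.random.default_rng(1)
sig = 0.05 * rng.standard_normal(2000)
for start in (200, 800, 1500):
    sig[start:start + 20] += 1.0

print(count_seismic_events(sig, threshold=0.5))  # → 3
```

The dead time plays the role the bandpass filter plays in hardware: it keeps one physical disturbance from being counted many times.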

  17. Advanced Seismic While Drilling System

    SciTech Connect

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high frequencies. This is counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER™ methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be needed to achieve the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. 
An 8-in diameter by 26-ft long SeismicPULSER™ drill string tool was designed and manufactured by TII. An APS Turbine Alternator powered the SeismicPULSER™ to produce 2 Hz peak-frequency signals repeated every 20 seconds. Since the ION Geophysical, Inc. (ION) seismic survey surface recording system was designed to detect a minimum downhole signal of 3 Hz, successful performance was confirmed with a 5.3 Hz recording with the pumps running. The 2 Hz signal generated by the sparker was modulated with the 3.3 Hz signal produced by the mud pumps to create an intense 5.3 Hz peak-frequency signal. The low-frequency sparker source is ultimately capable of generating selectable peak frequencies of 1 to 40 Hz with high-frequency spectral content to 10 kHz. The lower frequencies and, perhaps, low-frequency sweeps, are needed to achieve sufficient range and resolution for real-time imaging in deep (15,000 ft+), high-temperature (150 C) wells for (a) geosteering, (b) accurate seismic hole depth, (c) accurate pore pressure determinations ahead of the bit, (d) near-wellbore diagnostics with a downhole receiver and wired drill pipe, and (e) reservoir model verification. Furthermore, the pressure of the sparker bubble will disintegrate rock, resulting in increased overall rates of penetration. Other applications for the SeismicPULSER™ technology are to deploy a low-frequency source for greater range on a wireline for Reverse Vertical Seismic Profiling (RVSP) and Cross-Well Tomography. Commercialization of the technology is being undertaken by first contacting stakeholders to define the value proposition for rig-site services utilizing SeismicPULSER™ technologies. Stakeholders include national oil companies, independent oil companies, independents, service companies, and commercial investors. Service companies will introduce a new Drill Bit SWD service for deep HTHP wells. 
Collaboration will be encouraged between stakeholders in the form of joint industry projects to develop prototype tools and initial field trials. No barriers have been identified for developing, utilizing, and exploiting the low-frequency SeismicPULSER™ source in a variety of applications. Risks will be minimized since Drill Bit SWD will not interfere with the drilling operation, and can be performed in a relatively quiet environment when the pumps are turned off. The new source must be integrated with other Measurement While Drilling (MWD) tools. To date, each of the oil companies and service companies contacted has shown interest in participating in the commercialization of the low-frequency SeismicPULSER™ source. A technical paper has been accepted for presentation at the 2009 Offshore Technology Conference (OTC) in a Society of Exploration Geophysicists/American Association of Petroleum Geologists (SEG/AAPG) technical session.
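The 5.3 Hz observation follows from simple amplitude modulation: multiplying a 2 Hz carrier by a 3.3 Hz signal produces spectral lines at the difference (1.3 Hz) and sum (5.3 Hz) frequencies, since sin(a)sin(b) = [cos(a-b) - cos(a+b)]/2. A minimal numerical check, with sampling parameters chosen for convenience:

```python
import numpy as np

fs, T = 100.0, 60.0                 # sampling rate (Hz) and record length (s)
t = np.arange(0, T, 1 / fs)

# 2 Hz sparker signal amplitude-modulated by the 3.3 Hz pump signature
mixed = np.sin(2 * np.pi * 2.0 * t) * np.sin(2 * np.pi * 3.3 * t)

spec = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
peaks = freqs[np.argsort(spec)[-2:]]   # the two strongest spectral lines
# expect lines at the difference and sum frequencies: 1.3 Hz and 5.3 Hz
```

With a 60 s record both lines fall exactly on FFT bins, so the two dominant peaks are 1.3 Hz and 5.3 Hz; the 5.3 Hz line is the one the surface system recorded.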

  18. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the "Autojuggie" showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. 
In some easy-access environments, this device could make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. 
Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our work is a significant departure from conventional seismic-survey field procedures.

  19. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    PubMed

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age; three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims. PMID:26451884

  20. Micromachined silicon seismic transducers

    SciTech Connect

    Barron, C.C.; Fleming, J.G.; Sniegowski, J.J.; Armour, D.L.; Fleming, R.P.

    1995-08-01

    Batch-fabricated silicon seismic transducers could revolutionize the discipline of CTBT monitoring by providing inexpensive, easily deployable sensor arrays. Although our goal is to fabricate seismic sensors that provide the same performance level as the current state-of-the-art "macro" systems, if necessary one could deploy a larger number of these small sensors at closer proximity to the location being monitored in order to compensate for lower performance. We have chosen a modified pendulum design and are manufacturing prototypes in two different silicon micromachining fabrication technologies. The first set of prototypes, fabricated in our advanced surface-micromachining technology, are currently being packaged for testing in servo circuits; we anticipate that these devices, which have masses in the 1-10 µg range, will resolve sub-mG signals. Concurrently, we are developing a novel "mold" micromachining technology that promises to make proof masses in the 1-10 mg range possible; our calculations indicate that devices made in this new technology will resolve down to at least sub-µG signals, and may even approach the 10^-10 G/√Hz acceleration levels found in the low-earth-noise model.

  1. Seismic-Scale Rock Physics of Methane Hydrate

    SciTech Connect

    Amos Nur

    2009-01-08

    We quantify natural methane hydrate reservoirs by generating synthetic seismic traces and comparing them to real seismic data: if the synthetic matches the observed data, then the reservoir properties and conditions used in synthetic modeling might be the same as the actual, in-situ reservoir conditions. This approach is model-based: it uses rock physics equations that link the porosity and mineralogy of the host sediment, pressure, and hydrate saturation, and the resulting elastic-wave velocity and density. One result of such seismic forward modeling is a catalogue of seismic reflections of methane hydrate which can serve as a field guide to hydrate identification from real seismic data. We verify this approach using field data from known hydrate deposits.
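The forward-modeling loop the abstract describes (rock physics model to impedance, impedance to reflectivity, reflectivity to synthetic trace) can be sketched as follows. The layer velocities, densities, and interface times are hypothetical numbers for illustration, and a Ricker wavelet stands in for the real source signature:

```python
import numpy as np

def ricker(f0, dt, length=0.25):
    """Zero-phase Ricker wavelet of peak frequency f0 (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f0 * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

# hypothetical 1-D earth model: velocity (m/s) and density (kg/m^3) per layer;
# a hydrate-bearing layer is stiffened relative to the sediments around it
vp  = np.array([1600.0, 1900.0, 1500.0])   # background / hydrate / slow layer below
rho = np.array([1900.0, 2000.0, 1850.0])

z = vp * rho                                # acoustic impedance
refl = (z[1:] - z[:-1]) / (z[1:] + z[:-1])  # normal-incidence reflectivity

# place the two interfaces on a two-way-time axis and convolve with the wavelet
dt = 0.002                                  # 2 ms sample interval
trace = np.zeros(500)
trace[[150, 200]] = refl                    # hypothetical interface times
synthetic = np.convolve(trace, ricker(30.0, dt), mode="same")
```

Varying hydrate saturation in the rock physics model changes vp and rho, hence refl, hence the synthetic; matching the synthetic to real data is the inversion loop the abstract describes. Note the sign flip between the two interfaces: the top of the stiff layer reflects with positive polarity, its base with negative polarity.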

  2. Seismic evaluation of headframe and associated equipment

    SciTech Connect

    Not Available

    1982-11-01

    This report presents the results of studies on the seismic evaluation of the headframe structure and the associated equipment for the Canistered Waste Facility of a Nuclear Waste Storage Repository. The conceptual design for the repository was developed by Stearns-Roger Engineering Corporation. The studies described in this report were performed by Engineering Decision Analysis Company, Inc. (EDAC) for Battelle/Office of Nuclear Waste Isolation (ONWI). The evaluations included the following main tasks: Task I. Development of seismic input; Task II. Seismic evaluation of headframe; Task III. Seismic evaluation of the cable/hoist system; Task IV. Cask drop evaluation; Task V. Quality assurance analysis. Because some components in the system could not withstand the postulated seismic motions without failure or without exceeding specified factors of safety, it was recommended that complete and detailed seismic criteria should be developed for the seismic input motions and the design of the headframe structure and associated equipment, such as the cable/hoist system. The analyses performed in this study and the resulting understanding developed of the behavior of the headframe structure and the cable/hoist system will be extremely helpful in the development of such criteria. A description of the computer programs used in this study is presented in Appendix A.

  3. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... electronic means, e.g., over the Internet, we require you to verify your identity by using identity... personally identifiable information over open networks such as the Internet, we use encryption in all of...

  4. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... electronic means, e.g., over the Internet, we require you to verify your identity by using identity... personally identifiable information over open networks such as the Internet, we use encryption in all of...

  5. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... electronic means, e.g., over the Internet, we require you to verify your identity by using identity... personally identifiable information over open networks such as the Internet, we use encryption in all of...

  6. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating "what if" scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., "reactor types" not individual reactors and "separation types" not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. 
VISION comprises several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. We use Microsoft Excel 2003 and have not tested VISION with Microsoft Excel 2007. The VISION team uses both Powersim Studio 2005 and 2009, and VISION should work with either.

  7. Integration of onshore and offshore seismological data to study the seismicity of the Calabrian Region

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Guerra, Ignazio; D'Anna, Giuseppe; Gervasi, Anna; Harabaglia, Paolo; Luzio, Dario; Stellato, Gilda

    2014-05-01

    The Pollino Massif marks the transition from the Southern Apennines to the Calabrian Arc. On the western side it is characterized by moderately sized seismicity (about 9 M > 4 events in the last 50 years), well documented in the last 400 years. The moment tensor solutions available in this area yield mainly normal faults with a coherent Southern Apenninic trend. This remains true also for the events that are localized on the Calabrian side of Pollino, south of the massif. In most of the Sibari Plain, seismic activity is very scarce, while it is again rather marked on its southeastern corner, both onshore and offshore. The above observations point to the perspective that the stress field of a vast portion of Northern Calabria still resembles that of the Southern Apennines. In this frame, it becomes important to investigate the offshore seismicity of the Sibari Gulf and the deformation pattern within the Sibari Plain. The latter might function as a hinge to transfer the deformation of the extensional fault system in the Pollino area to a different offshore fault system. Since return times of larger events might be very long, we need to investigate the true seismic potential of the offshore faults and to verify whether they are truly strike-slip or whether they could involve relevant thrust or normal components, which would add the risk of potentially associated tsunamis. Despite their importance in the understanding of the seismotectonic processes taking place at the Southern Apenninic-Calabrian Arc border and surrounding areas, the seismicity and the seismogenic volumes of the Sibari Gulf have until now not been well characterized, due to the lack of offshore seismic stations. The seismicity of Calabria is monitored by the Italian National Seismic Network (INSN) managed by Istituto Nazionale di Geofisica e Vulcanologia and by the Calabrian Regional Seismic Network (CRSN) managed by the University of Calabria. 
Both networks comprise only on-land seismic stations. The lack of offshore stations prevents accurate determination of the hypocentral parameters even for moderate to strong earthquakes that occur offshore Calabria. With the aim of investigating the near-shore seismicity in the Sibari Gulf and its eventual relationship with the Pollino activity, in early 2014 a project will start to improve the Calabrian Seismic Network's monitoring of the Sibari Gulf area by deploying several Ocean Bottom Seismometers with Hydrophone (OBS/H). For this experiment, each OBS/H is equipped with a broad-band seismometer housed in a glass sphere designed to operate at a depth of up to 6000 m and with an auto-levelling sensor system. The OBS/Hs are also equipped with a hydrophone. Analogue signals are recorded with a sampling frequency of 200 Hz by a four-channel 21-bit datalogger. In this work, we plan to present the preliminary results of the monitoring campaign, showing the improvement in hypocenter locations derived from the integration of the onshore and offshore seismic stations.

  8. Salvo: Seismic imaging software for complex geologies

    SciTech Connect

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

    This report describes Salvo, three-dimensional seismic-imaging software for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and the finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes at a higher computational cost, which has been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. It also describes I/O algorithms for large seismic data sets and images, and the parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, it describes the steps required to compile, port, and optimize the Salvo software, and the validation data sets used to help verify a working copy of Salvo.
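
The one-way extrapolation at the heart of paraxial imaging codes like the one described can be illustrated with a single phase-shift step in the frequency-wavenumber domain. This is a generic sketch, not Salvo's implementation; all names and parameter values are illustrative:

```python
import numpy as np

def phase_shift_step(P, kx, omega, v, dz):
    """Extrapolate a monochromatic wavefield slice P(kx) down by dz.

    P     : wavefield in the horizontal-wavenumber domain
    kx    : horizontal wavenumbers (rad/m)
    omega : angular frequency (rad/s)
    v     : medium velocity (m/s)
    dz    : depth step (m)
    """
    kz2 = (omega / v) ** 2 - kx ** 2
    # Propagating components get a pure phase shift; evanescent ones decay.
    kz = np.where(kz2 > 0, np.sqrt(np.abs(kz2)), 0.0)
    decay = np.where(kz2 > 0, 1.0, np.exp(-np.sqrt(np.abs(kz2)) * dz))
    return P * decay * np.exp(1j * kz * dz)

nx = 64
kx = 2 * np.pi * np.fft.fftfreq(nx, d=10.0)   # 10 m trace spacing (illustrative)
P0 = np.ones(nx, dtype=complex)
P1 = phase_shift_step(P0, kx, omega=2 * np.pi * 20.0, v=2000.0, dz=5.0)
# Propagating components keep unit amplitude; evanescent ones are attenuated.
print(np.abs(P1[:4]))
```

A full migration repeats this step over depth and frequency and applies an imaging condition at each level; the paraxial finite-difference approach approximates the same square-root operator in laterally varying media.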

  9. Development of Seismic Isolation Systems Using Periodic Materials

    SciTech Connect

    Yan, Yiqun; Mo, Yi-Lung; Menq, Farn-Yuh; Stokoe, II, Kenneth H.; Perkins, Judy; Tang, Yu

    2014-12-10

    Advanced fast nuclear power plants and small modular fast reactors are composed of thin-walled structures such as pipes; as a result, they do not have sufficient inherent strength to resist seismic loads. Seismic isolation, therefore, is an effective solution for mitigating earthquake hazards for these types of structures. Base isolation, on which numerous studies have been conducted, is a well-established structural protection system against earthquakes. In conventional isolators, such as high-damping rubber bearings, lead-rubber bearings, and friction pendulum bearings, large relative displacements occur between the upper structure and the foundation, and isolation is provided only in the horizontal direction; these features are not desirable for piping systems. The concept of periodic materials, based on the theory of solid-state physics, can be applied to earthquake engineering. A periodic material possesses distinct characteristics that prevent waves in certain frequency ranges from being transmitted through it; such a material can therefore be used in structural foundations to block unwanted seismic waves at those frequencies. The frequency band over which a periodic material filters out waves is called the band gap, and a structural foundation made of periodic material is referred to as a periodic foundation. The design of a nuclear power plant can therefore be built around the desirable features of a periodic foundation, without requiring continuous maintenance of the isolation system. In this research project, three types of periodic foundations were studied: one-dimensional, two-dimensional, and three-dimensional. The basic theories of periodic foundations are introduced first to find the band gaps; then finite element methods are used to perform parametric analysis and obtain attenuation zones; finally, experimental programs are conducted and the test data are analyzed to verify the theory. 
This procedure shows that the periodic foundation is a promising and effective way to mitigate structural damage caused by earthquake excitation.
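
For the one-dimensional case, the band gaps described above can be located from the classical transfer-matrix dispersion relation for a two-layer unit cell at normal incidence: cos(qL) = cos(k1 d1)cos(k2 d2) - (1/2)(Z1/Z2 + Z2/Z1) sin(k1 d1) sin(k2 d2). Frequencies where the right-hand side exceeds 1 in magnitude lie inside a gap. A minimal sketch with illustrative material values (not the project's actual layer properties):

```python
import numpy as np

# Two-layer unit cell: a soft rubber-like layer over a stiff concrete-like layer.
rho1, c1, d1 = 1300.0, 100.0, 0.2    # density (kg/m^3), wave speed (m/s), thickness (m)
rho2, c2, d2 = 2400.0, 3000.0, 0.2
Z1, Z2 = rho1 * c1, rho2 * c2        # acoustic impedances

def dispersion_rhs(f):
    """Right-hand side of cos(q*L); |rhs| > 1 means f lies in a band gap."""
    w = 2 * np.pi * f
    k1, k2 = w / c1, w / c2
    return (np.cos(k1 * d1) * np.cos(k2 * d2)
            - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))

freqs = np.linspace(0.1, 200.0, 2000)
in_gap = np.abs(dispersion_rhs(freqs)) > 1.0
gap_freqs = freqs[in_gap]
if in_gap.any():
    print(f"first band gap opens near {gap_freqs.min():.0f} Hz")
```

With a soft layer in the cell, the lowest gap falls into the low-frequency range relevant to seismic excitation, which is the design target the report describes.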

  10. Revolutionary seismic vessel sees first action in North Sea

    SciTech Connect

    Greenway, J.

    1995-08-28

    This paper reviews the design of a new seismic surveying vessel which was developed in response to the increased need for 3D seismic data acquisition. To help in the distribution of the seismic equipment in tow, this ship was developed to have a wide, continuous beam to allow uniform distribution of a large number of seismic streamers. This width to length ratio also provides better stability of the ship as it moves through the water increasing the quality of the seismic data. The propulsion systems have also been constructed to better drag the arrays through the water. Specifications and cost benefit analysis of this new seismic vessel are reviewed and compared to conventional seismic survey methods and vessels.

  11. Basis for seismic provisions of DOE-STD-1020

    SciTech Connect

    Kennedy, R.C.; Short, S.A.

    1994-04-01

    DOE-STD-1020 provides a graded approach for the seismic design and evaluation of DOE structures, systems, and components (SSC). Each SSC is assigned to a Performance Category (PC) with a performance description and an approximate annual probability of seismic-induced unacceptable performance, P{sub F}. Seismic performance goals, expressed as annual probabilities, are given for PC 1 through 4, for which specific seismic design and evaluation criteria are presented. DOE-STD-1020 also provides a seismic design and evaluation procedure applicable to any seismic performance goal (annual probability of unacceptable performance) specified by the user. The desired seismic performance goal is achieved by defining the seismic hazard in terms of a site-specific design/evaluation response spectrum (called herein the Design/Evaluation Basis Earthquake, DBE). Probabilistic seismic hazard estimates are used to establish the DBE. The resulting seismic hazard curves define the amplitude of the ground motion as a function of the annual probability of exceedance, P{sub H}, of the specified seismic hazard. Once the DBE is defined, the SSC is designed or evaluated for this DBE using adequately conservative deterministic acceptance criteria. To be adequately conservative, the acceptance criteria must introduce an additional reduction in the risk of unacceptable performance below the annual risk of exceeding the DBE. The ratio of the seismic hazard exceedance probability P{sub H} to the performance goal probability P{sub F} is defined herein as the risk reduction ratio. The required degree of conservatism in the deterministic acceptance criteria is a function of the specified risk reduction ratio.
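
The DBE-selection logic described above can be sketched numerically: pick the ground motion whose annual exceedance probability P{sub H} equals the risk reduction ratio times the performance goal P{sub F}. The hazard-curve values below are illustrative placeholders, not values from the standard:

```python
import numpy as np

# Illustrative site hazard curve: peak ground acceleration (g) vs.
# annual probability of exceedance.
pga = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
annual_p_exceed = np.array([1e-2, 3e-3, 8e-4, 2e-4, 4e-5])

def dbe_pga(performance_goal, risk_reduction_ratio):
    """Interpolate the hazard curve (log-log) at P_H = R_R * P_F."""
    p_h = risk_reduction_ratio * performance_goal
    # Hazard curves are close to linear in log-log space, so interpolate there.
    return np.exp(np.interp(np.log(p_h),
                            np.log(annual_p_exceed[::-1]),
                            np.log(pga[::-1])))

# e.g. a 1e-4/yr performance goal with a risk reduction ratio of 10
a_dbe = dbe_pga(1e-4, 10.0)
print(f"DBE anchored at about {a_dbe:.2f} g")
```

The deterministic acceptance criteria then supply the remaining factor of conservatism between P{sub H} and P{sub F}.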

  12. Verified Centers, Nonverified Centers or Other Facilities: A National Analysis of Burn Patient Treatment Location

    PubMed Central

    Zonies, David; Mack, Christopher; Kramer, Bradley; Rivara, Frederick; Klein, Matthew

    2009-01-01

    Background Although comprehensive burn care requires significant resources, patients may be treated at verified burn centers, non-verified burn centers, or other facilities due to a variety of factors. The purpose of this study was to evaluate the association between patient and injury characteristics and treatment location using a national database. Study Design We performed an analysis of all burn patients admitted to United States hospitals participating in the Healthcare Cost and Utilization Project over 2 years. Univariate and multivariate analyses were performed to identify patient and injury factors associated with the likelihood of treatment at designated burn care facilities. Definitive care facilities were categorized as American Burn Association verified centers, non-verified burn centers, or other facilities. Results Over the two years, 29,971 burn patients were treated in 1,376 hospitals located in 19 participating states. A total of 6,712 (22%) patients were treated at verified centers, with 26% and 52% treated at non-verified or other facilities, respectively. Patients treated at verified centers were younger than those at non-verified or other facilities (33.1 years vs. 33.7 years vs. 41.9 years, p<0.001) and had a higher rate of inhalation injury (3.4% vs. 3.2% vs. 2.2%, p<0.001). Independent factors associated with treatment at verified centers included burns to the head/neck (RR 2.4, CI 2.1-2.7), hand (RR 1.8, CI 1.6-1.9), electrical injury (RR 1.4, CI 1.2-1.7), and fewer co-morbidities (RR 0.55, CI 0.5-0.6). Conclusions More than two-thirds of significantly burned patients are treated at non-verified burn centers in the U.S. Many patients meeting ABA criteria for transfer to a burn center are being treated at non-burn-center facilities. PMID:20193892

  13. Angola Seismicity MAP

    NASA Astrophysics Data System (ADS)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document the natural seismicity of Angola and to establish the first seismic database for the country, facilitating consultation and searches for information on its seismic activity. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), who defined six seismogenic zones from macroseismic data. The most notable is the Sá da Bandeira (Lubango)-Chibemba-Onccua-Iona zone, the most important seismic zone of Angola, covering the epicentral Quihita and Iona regions. It is geologically characterized by a transcontinental structure of Mesozoic tectono-magmatic activation, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic, and alkaline composition, kimberlites, and carbonatites, strongly marked by intense tectonism and featuring several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and in the Iona seismic activity of January 15, 1964 the main shock reached intensity VI-VII. Although the other five zones do not have significant seismicity rates, they cannot be neglected; they are: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambndua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. 
All the compiled information made possible the creation of the first database of seismic data for Angola and the preparation of a seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic zone, Porto Amboim, in the coastal portion of the sedimentary Kwanza basin.

  14. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. This paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes expressed in different seismic intensity scales, are used to verify the regional parameters of mathematical models that simulate the physical and economic vulnerability of different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. To estimate the expected damage states of buildings and constructions for earthquakes according to the OSR-97B map (return period T=1,000 years), big cities and towns were divided into unit sites, with coordinates represented as dots at the centers of the unit sites, and the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. 
Taking into account the extent of the oil pipeline systems located in the highly active seismic zones of the Russian Federation, corresponding procedures have also been developed. They are based on mathematical modeling of the interaction of the system's elements, the oil pipeline and the ground, under seismic loads. As a result, relationships between the probability of damage to an oil pipeline system and the intensity of shaking in grades of seismic scales have been obtained. Three damage states for oil pipeline systems have been considered: light damage, i.e. elastic deformation of the linear part or localized plastic deformation without breaching the pipeline; average damage, i.e. significant plastic deformation of the linear part with fistulas in some areas; and complete destruction, i.e. large horizontal and vertical displacements of the linear part, with mass fistulas, cracks, and "guillotine breaks" of the pipeline in some areas.
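
Empirical vulnerability functions of the kind described above are commonly parameterized as lognormal fragility curves giving the probability of reaching or exceeding a damage state at a given intensity. The sketch below is a generic illustration; the median intensities and dispersion are hypothetical, not the MMSK-86 calibration:

```python
import math

def fragility(intensity, median, beta):
    """Lognormal CDF: P(damage >= state | intensity)."""
    z = (math.log(intensity) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical median intensities (in macroseismic grades) per damage state
medians = {"light": 6.0, "average": 7.5, "complete": 9.0}
beta = 0.25   # hypothetical lognormal dispersion

intensity = 8.0
probs = {state: fragility(intensity, m, beta) for state, m in medians.items()}
for state, p in probs.items():
    print(f"P(>= {state} damage | I={intensity}) = {p:.2f}")
```

Summing such probabilities over building stock per unit site yields the damage-state percentages that the zoning maps display.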

  15. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  16. Oklahoma seismic network. Final report

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr.

    1993-07-01

    The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number of areas of the Midcontinent region because of the lack of instrumentation and (or) the sensitivity of the instruments deployed to monitor earthquake events. The Nemaha Uplift/Midcontinent Geophysical Anomaly is one of five principal areas east of the Rocky Mountain front that has a moderately high seismic-risk classification. The Nemaha uplift, which is common to the states of Oklahoma, Kansas, and Nebraska, is approximately 415 miles long and 12-14 miles wide. The Midcontinent Geophysical Anomaly extends southward from Minnesota across Iowa and the southeastern corner of Nebraska and probably terminates in central Kansas. A number of moderate-sized earthquakes--magnitude 5 or greater--have occurred along or west of the Nemaha uplift. The Oklahoma Geological Survey, in cooperation with the geological surveys of Kansas, Nebraska, and Iowa, conducted a 5-year investigation of the seismicity and tectonic relationships of the Nemaha uplift and associated geologic features in the Midcontinent. This investigation was intended to provide data to be used to design nuclear-power plants. However, the information is also being used to design better large-scale structures, such as dams and high-use buildings, and to provide the necessary data to evaluate earthquake-insurance rates in the Midcontinent.

  17. Seismic Imaging and Monitoring

    SciTech Connect

    Huang, Lianjie

    2012-07-09

    I give an overview of LANL's capability in seismic imaging and monitoring. I present some seismic imaging and monitoring results, including imaging of complex structures, subsalt imaging of the Gulf of Mexico, fault/fracture zone imaging for geothermal exploration at the Jemez pueblo, time-lapse imaging of walkway vertical seismic profiling data for monitoring CO{sub 2} injection at SACROC, and microseismic event locations for monitoring CO{sub 2} injection at Aneth. These examples demonstrate LANL's high-resolution and high-fidelity seismic imaging and monitoring capabilities.

  18. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of asking for more information from the credit card holder or accepting risk at payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make a genuine credit card more distinguishable from a counterfeit. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  19. The SCALE Verified, Archived Library of Inputs and Data - VALID

    SciTech Connect

    Marshall, William BJ J; Rearden, Bradley T

    2013-01-01

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. 
The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  20. Verifying continuous-variable entanglement in finite spaces

    SciTech Connect

    Sperling, J.; Vogel, W.

    2009-05-15

    Starting from arbitrary Hilbert spaces, we reduce the problem of verifying entanglement of any bipartite quantum state to finite-dimensional subspaces. Entanglement can be fully characterized as a finite-dimensional property, even though in general the truncation of the Hilbert space may cause fake nonclassicality. A generalization for multipartite quantum states is also given.

  1. 34 CFR 668.56 - Items to be verified.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Information 668.56 Items to be verified. (a) Except as provided in paragraphs (b), (c), (d), and (e) of this... information used to determine the applicant's EFC: (1) Adjusted gross income (AGI) for the base year if base... included on the tax return form, excluding information contained on schedules appended to such forms....

  2. A Trustworthy Internet Auction Model with Verifiable Fairness.

    ERIC Educational Resources Information Center

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  3. Seismic Catalogue and Seismic Network in Haiti

    NASA Astrophysics Data System (ADS)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

    The destructive earthquake that struck Haiti on January 12, 2010 highlighted the country's lack of preparedness to address seismic phenomena. At the moment of the earthquake there was no seismic network operating in the country, and only partial knowledge of past seismicity was available, owing to the absence of a national catalogue. After the 2010 earthquake, work began toward the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out on both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events in Hispaniola and its surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30,000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7,700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were used by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by significant heterogeneity in the size parameter. It was therefore homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al. (2011) for the eastern Caribbean. At present this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government, with data sent by telemetry through the Canadian system CARINA. 
In 2012, the Spanish IGN, together with ONEV and BME, installed 4 seismic stations with financial support from the Inter-American Development Bank and the Haitian Government. The 4 stations include strong-motion and broad-band sensors, complementing the 8 sensors initially installed. The stations communicate via SATMEX5 with the Canadian hub, which sends the data back to Haiti with minimum delay. In the immediate future, data transfer will be improved with the installation of a main antenna for data reception and the Seismic Warning Center of Port-au-Prince. Bidirectional satellite communication is considered of fundamental importance for robust real-time data transmission that is not affected in the case of a catastrophic event.
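
The catalogue-homogenization step described above can be sketched as a table of scale-specific linear conversions applied per event. The coefficients below are hypothetical placeholders, not the Bonzoni et al. (2011) regressions:

```python
# Hypothetical linear conversions Mw = slope * M + intercept, one per
# reported magnitude scale. Real work would fit these empirically per region.
CONVERSIONS = {
    "Mw": (1.00, 0.00),   # already moment magnitude
    "Ms": (0.67, 2.07),   # hypothetical coefficients
    "mb": (0.85, 1.03),   # hypothetical coefficients
    "ML": (0.94, 0.36),   # hypothetical coefficients
}

def to_mw(magnitude, scale):
    """Homogenize one catalogue entry to Mw (hypothetical coefficients)."""
    slope, intercept = CONVERSIONS[scale]
    return slope * magnitude + intercept

catalog = [(5.8, "Ms"), (5.1, "mb"), (4.4, "ML"), (6.0, "Mw")]
homogenized = [round(to_mw(m, s), 2) for m, s in catalog]
print(homogenized)
```

Applying one regression per source scale removes the size-parameter heterogeneity so that completeness and recurrence analyses operate on a single magnitude measure.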

  4. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, namely near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough in the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent properties of magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of the smart seismic isolation system. The isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid that provides a sufficient and uniform magnetic field for energizing the field-dependent properties of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. This paper presents a comprehensive review of the development of the adaptive seismic isolator, including the design, analysis, and testing of two prototype adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototype MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a lateral stiffness increase of up to 1630%. Such a range of controllable stiffness makes the isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive control.

  5. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    SciTech Connect

    Tolk, Keith M.; Stoker, Gerald C.

    1999-07-20

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.

  6. Scanning Seismic Intrusion Detector

    NASA Technical Reports Server (NTRS)

    Lee, R. D.

    1982-01-01

    Scanning seismic intrusion detector employs array of automatically or manually scanned sensors to determine approximate location of intruder. Automatic-scanning feature enables one operator to tend system of many sensors. Typical sensors used with new system are moving-coil seismic pickups. Detector finds uses in industrial security systems.

  7. Seismic data denoising based on the fractional Fourier transformation

    NASA Astrophysics Data System (ADS)

    Zhai, Ming-Yue

    2014-10-01

    Seismic data may suffer from noise contamination too severe to allow further processing and interpretation. In this paper, a new scheme based on the fractional Fourier transform (FrFT) in the time-frequency domain is proposed to mitigate noise. The scheme consists of two steps. In the first step, the seismic signal is filtered with an ordinary Butterworth filter in the frequency domain. The residual noise after frequency filtering occupies the same frequency band as the filtered seismic signal. To mitigate this residual noise further, an FrFT filter is applied in the second step. Results on simulated seismic signals and measured data verify the validity of the proposed scheme in both the frequency and time-frequency domains.
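
The first (Butterworth) stage of the two-step scheme described above can be sketched as follows; the FrFT-domain second stage is not reproduced here, and the signal parameters are illustrative:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500.0                                # sampling rate (Hz), illustrative
t = np.arange(0, 2.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 15.0 * t)      # stand-in for a seismic component
rng = np.random.default_rng(0)
noisy = clean + rng.normal(0.0, 0.5, t.size)   # broadband noise

# 4th-order low-pass Butterworth at 40 Hz, applied zero-phase via filtfilt
b, a = butter(4, 40.0 / (fs / 2.0), btype="low")
filtered = filtfilt(b, a, noisy)

def snr_db(ref, x):
    """Signal-to-noise ratio of x relative to the reference signal, in dB."""
    return 10 * np.log10(np.sum(ref**2) / np.sum((x - ref)**2))

print(f"SNR before: {snr_db(clean, noisy):.1f} dB, after: {snr_db(clean, filtered):.1f} dB")
```

Out-of-band noise is suppressed, but in-band noise survives this stage, which is exactly the residual the paper's FrFT filter is designed to attack.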

  8. Application of seismic tomography in underground mining

    SciTech Connect

    Scott, D.F.; Williams, T.J.; Friedel, M.J.

    1996-12-01

    Seismic tomography, as used in mining, is based on the principle that highly stressed rock will exhibit relatively higher P-wave velocities than rock under less stress. A decrease or increase in stress over time can be verified by comparing successive tomograms. Personnel at the Spokane Research Center have been investigating the use of seismic tomography to identify stress in remnant ore pillars in deep (greater than 1220 m) underground mines. In this process, three-dimensional seismic surveys are conducted in a pillar between mine levels. A sledgehammer is used to generate P-waves, which are recorded by geophones connected to a stacking signal seismograph capable of collecting and storing the P-wave data. Travel times are input into a spreadsheet, apparent velocities are generated, and the results are merged into imaging software. Mine workings are superimposed over the apparent P-wave velocity contours to generate a final tomographic image. Results of a seismic tomographic survey at the Sunshine Mine, Kellogg, ID, indicate that low-velocity areas (low stress) are associated with mine workings and high-velocity areas (higher stress) are associated with areas where no mining has taken place. A high stress gradient was identified in an area where ground failed. From this tomographic survey, as well as four earlier surveys at other deep underground mines, a method was developed to identify relative stress in remnant ore pillars. This information is useful in making decisions about miner safety when mining such ore pillars.
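
The travel-time-to-apparent-velocity step described above reduces, for each source-geophone pair, to straight-ray distance over picked travel time. A minimal sketch with illustrative geometry and picks:

```python
import math

# Hypothetical pillar survey geometry (meters) and travel-time picks (ms)
sources = {"S1": (0.0, 0.0, 0.0), "S2": (30.0, 0.0, 0.0)}
receivers = {"R1": (0.0, 40.0, 0.0), "R2": (30.0, 40.0, 10.0)}
travel_times_ms = {("S1", "R1"): 8.9, ("S1", "R2"): 11.6, ("S2", "R2"): 9.4}

def apparent_velocity(src, rec, t_ms):
    """Straight-ray apparent P-wave velocity (m/s) for one source-geophone pair."""
    dist = math.dist(sources[src], receivers[rec])
    return dist / (t_ms * 1e-3)

for (s, r), t_ms in travel_times_ms.items():
    print(f"{s}->{r}: {apparent_velocity(s, r, t_ms):.0f} m/s")
```

The tomographic inversion then distributes these path-averaged velocities over a grid, and relatively high values flag the highly stressed parts of the pillar.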

  9. The Spatial Scale of Detected Seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Chen, C.-C.

    2015-07-01

    An experimental method for the spatial resolution analysis of the earthquake frequency-magnitude distribution is introduced in order to identify the intrinsic spatial scale of the detected seismicity phenomenon. We consider the unbounded magnitude range m ∈ (-∞, +∞), which includes incomplete data below the completeness magnitude m_c. By analyzing a relocated earthquake catalog of Taiwan, we find that the detected seismicity phenomenon is scale-variant for m ∈ (-∞, +∞), with its spatial grain a function of the configuration of the seismic network, while seismicity is known to be scale-invariant for m ∈ [m_c, +∞). Correction for data incompleteness for m < m_c, based on knowledge of the spatial scale of the process, allows extending the analysis of the Gutenberg-Richter law and of the fractal dimension to lower magnitudes. This shall allow verifying the continuity of universality of these parameters over a wider magnitude range. Our results also suggest that the commonly accepted Gaussian model of earthquake detection might be an artifact of observation.

  10. The Spatial Scale of Detected Seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Chen, C.-C.

    2016-01-01

    An experimental method for the spatial resolution analysis of the earthquake frequency-magnitude distribution is introduced in order to identify the intrinsic spatial scale of the detected seismicity phenomenon. We consider the unbounded magnitude range m ∈ (-∞, +∞), which includes incomplete data below the completeness magnitude mc. By analyzing a relocated earthquake catalog of Taiwan, we find that the detected seismicity phenomenon is scale-variant for m ∈ (-∞, +∞) with its spatial grain a function of the configuration of the seismic network, while seismicity is known to be scale invariant for m ∈ [mc, +∞). Correction for data incompleteness for m < mc based on the knowledge of the spatial scale of the process allows extending the analysis of the Gutenberg-Richter law and of the fractal dimension to lower magnitudes. This shall allow verifying the continuity of universality of these parameters over a wider magnitude range. Our results also suggest that the commonly accepted Gaussian model of earthquake detection might be an artifact of observation.
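The Gutenberg-Richter analysis mentioned above is usually anchored by the b-value, which for magnitudes at or above mc can be estimated with the classical Aki maximum-likelihood formula b = log10(e) / (mean(m) - mc). A minimal sketch of that textbook estimator, not the authors' code:

```python
import math

def b_value(magnitudes, mc):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value
    from magnitudes at or above the completeness magnitude mc."""
    above = [m for m in magnitudes if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - mc)
```

For example, a catalog with mean magnitude 0.5 units above mc gives b ≈ 0.87, close to the commonly observed b ≈ 1.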

  11. Seismic isolation of two dimensional periodic foundations

    SciTech Connect

    Yan, Y.; Mo, Y. L.; Laskar, A.; Cheng, Z.; Shi, Z.; Menq, F.; Tang, Y.

    2014-07-28

    Phononic crystals are now used to control acoustic waves. When the crystal is scaled up, it is called a periodic structure. The band gaps of the periodic structure can be brought down to the range from 0.5 Hz to 50 Hz. Therefore, the periodic structure has potential applications in seismic wave reflection. In civil engineering, the periodic structure can serve as the foundation of the upper structure. This type of foundation, consisting of a periodic structure, is called a periodic foundation. When the frequency of seismic waves falls into the band gaps of the periodic foundation, the seismic wave can be blocked. Field experiments of a scaled two dimensional (2D) periodic foundation with an upper structure were conducted to verify the band gap effects. Test results showed the 2D periodic foundation can effectively reduce the response of the upper structure for excitations with frequencies within the frequency band gaps. The experimental and finite element analysis results agree well with each other, indicating that the 2D periodic foundation is a feasible way of reducing seismic vibrations.

  12. Statistical classification methods applied to seismic discrimination

    SciTech Connect

    Ryan, F.M.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low-energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within approximately 2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
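Of the methods listed, K-th Nearest Neighbor discrimination is the simplest to sketch: an event is assigned the majority class among its k nearest training events in feature space. The toy example below uses hypothetical two-feature discriminants (not the report's data or code):

```python
import math
from collections import Counter

def knn_classify(train, labels, x, k=3):
    """Classify feature vector x by majority vote of its k nearest
    training points under Euclidean distance.

    train: list of feature vectors; labels: one class label per vector,
    e.g. 'earthquake' or 'explosion'.
    """
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]
```

Missing observations, one of the difficulties the report highlights, are awkward for this method because the distance is undefined when a feature is absent.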

  13. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.

  14. Seismic monitoring of the Yucca Mountain facility

    SciTech Connect

    Garbin, H.D.; Herrington, P.B.; Kromer, R.P.

    1997-08-01

    Questions have arisen regarding the applicability of seismic sensors to detect mining (re-entry) with a tunnel boring machine (TBM). Unlike cut-and-blast techniques of mining, which produce impulsive seismic signals, the TBM produces seismic signals of long duration. (There are well established techniques available for detecting and locating the sources of the impulsive signals.) The Yucca Mountain repository offered an opportunity to perform field evaluations of the capabilities of seismic sensors because during much of 1996, mining there was progressing with the use of a TBM. During the mining of the repository's southern branch, an effort was designed to evaluate whether the TBM could be detected, identified and located using seismic sensors. Three data acquisition stations were established in the Yucca Mountain area to monitor the TBM activity. A short-term-average to long-term-average (STA/LTA) ratio algorithm was developed for signal detection, based on the characteristics of the time series. For location of the source of detected signals, FK analysis was used on the array data to estimate back azimuths. The back azimuth from the 3-component system was estimated from the horizontal components. Unique features in the timing of the seismic signal were used to identify the source as the TBM.
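The detector described, a short-term-average to long-term-average ratio, can be sketched in a few lines. This is a generic STA/LTA implementation (not the authors' code); both windows are trailing moving averages of signal energy, aligned at their ending sample:

```python
import numpy as np

def sta_lta_trigger(signal, n_sta, n_lta, threshold):
    """Return sample indices where the short-term-average to
    long-term-average ratio of signal energy exceeds threshold."""
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.cumsum(np.concatenate(([0.0], energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta     # window ends at sample i
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
    n = min(len(sta), len(lta))                      # align window ends
    ratio = sta[-n:] / (lta[-n:] + 1e-12)
    offset = len(energy) - n
    return offset + np.where(ratio > threshold)[0]
```

An impulsive blast trips the trigger almost instantly, while the long-duration TBM signal raises both averages slowly, which is why the ratio threshold and window lengths must be tuned for this source type.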

  15. Seismic Safety Of Simple Masonry Buildings

    SciTech Connect

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings explicit safety verifications are not compulsory if specific code rules are fulfilled. In fact it is assumed that their fulfilment ensures a suitable seismic behaviour of buildings and thus adequate safety under earthquakes. Italian and European seismic codes differ in the requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at site. Obviously, a wide percentage of buildings assumed simple by codes should satisfy the numerical safety verification, so that no confusion and uncertainty have to be given rise to designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings, having different geometry, are analysed and results from nonlinear static analyses performed by varying the acceleration at site are presented and discussed. Indications on the congruence between code rules and results of numerical analyses performed according to the code itself are supplied and, in this context, the obtained result can provide a contribution for improving the seismic code requirements.

  16. The Kyrgyz Seismic Network (KNET)

    NASA Astrophysics Data System (ADS)

    Bragin, V. D.; Willemann, R. J.; Matix, A. I.; Dudinskih, R. R.; Vernon, F.; Offield, G.

    2007-05-01

    The Kyrgyz Digital Seismic Network (KNET) is a regional continuous telemetric network of very broadband seismic data. KNET was installed in 1991, and the telemetry system was upgraded in 1998. The seismograms are transmitted in near real time. KNET is located along part of the boundary between the northern Tien Shan Mountains and the Kazakh platform. Several major tectonic features are spanned by the network, including a series of thrust faults in the Tien Shan, the Chu Valley, and the NW-SE trending ridges north of Bishkek. This network is designed to monitor regional seismic activity at the magnitude 3.5+ level as well as to provide high quality data for research projects in regional and global broadband seismology. The Kyrgyz seismic network array consists of 10 stations (3 of them at altitudes above 3600 m), 2 mountain repeaters, 1 intermediate database and 2 data centers. One of the data centers is a remote source for the IRIS database. KNET is operated by the International Research Center - Geodynamic Proving Ground in Bishkek (IGRC) with the participation of the Research Station of the Russian Academy of Sciences (RS RAS) and the Kyrgyz Institute of Seismology (KIS). The network consists of Streckeisen STS-2 sensors with 24-bit PASSCAL data loggers. All continuous real-time data are accessible through the IRIS DMC in Seattle with over 95% data availability, which compares favorably to the best networks currently operating worldwide. National institutes of seismology in Kyrgyzstan and Kazakhstan, the National Nuclear Centre of Kazakhstan, RS RAS, divisions of the ministries on extreme situations and institutes of the Russian Academy of Sciences use KNET data for estimating seismic hazards and for studying the deep structure of the region. KNET data is also used by the National Nuclear Centre of the Republic of Kazakhstan, which together with the LAMONT laboratory (USA) carries out verification research and monitoring of nuclear detonations in China, India and Pakistan. 
A uniform digital catalogue of Central Asia data, which will include data from the Kyrgyzstan, Kazakhstan, Uzbekistan and KNET seismic networks, is being developed. Chinese scientists have expressed interest in using KNET data, and also in linking a digital network located on the Tarim platform with KNET.
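Array processing of the kind KNET supports often reduces, in its simplest form, to a plane-wave fit: relative arrival delays across the array equal the station offsets dotted with the horizontal slowness vector, and the back azimuth points opposite to the propagation direction. A hedged least-squares sketch with synthetic units, not KNET's actual processing chain:

```python
import numpy as np

def back_azimuth(coords_en, delays):
    """Plane-wave fit: relative delays t = r . s over station offsets r
    (east, north, in km) give the horizontal slowness s (s/km); the
    back azimuth points opposite to the propagation direction."""
    R = np.asarray(coords_en, dtype=float)
    t = np.asarray(delays, dtype=float)
    s, *_ = np.linalg.lstsq(R, t, rcond=None)
    return np.degrees(np.arctan2(-s[0], -s[1])) % 360.0  # degrees from north
```

With exact synthetic delays the fit recovers the slowness, and hence the azimuth, exactly; with real picks the least-squares residual indicates the quality of the plane-wave assumption.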

  17. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-01

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends each qubit to Alice one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding security, whatever Bob does, he cannot obtain any information about Alice's computation because of the no-signaling principle. Furthermore, a malicious Bob may not send copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
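What "verifying the stabilizers" means can be made concrete on the smallest graph state: two qubits joined by one edge, |G> = CZ|+>|+>, which is stabilized by K1 = X⊗Z and K2 = Z⊗X. The numerical check below is an illustration of the stabilizer property only, not the protocol's test procedure:

```python
import numpy as np

# Pauli matrices and the controlled-Z gate
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

# Two-qubit graph state (one edge): CZ applied to |+>|+>
plus = np.array([1.0, 1.0]) / np.sqrt(2)
g = CZ @ np.kron(plus, plus)

# Graph-state stabilizers K_i = X on qubit i, Z on its neighbours
K1 = np.kron(X, Z)
K2 = np.kron(Z, X)
assert np.allclose(K1 @ g, g)  # K1 leaves |G> unchanged
assert np.allclose(K2 @ g, g)  # so does K2
```

In the protocol, Alice measures randomly chosen copies in the bases of these stabilizer operators; honest copies pass with certainty, while deviated states fail with detectable probability.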

  18. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
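A common building block of such horizontal criteria is the divergence test: two aircraft are horizontally diverging when their relative position and relative velocity vectors have a positive dot product (the squared distance is then strictly increasing). A minimal sketch of that test only, not the paper's PVS-verified algorithms:

```python
def horizontally_diverging(p_own, v_own, p_other, v_other):
    """True if horizontal separation is strictly increasing, i.e. the
    relative position and relative velocity have a positive dot product."""
    sx, sy = p_own[0] - p_other[0], p_own[1] - p_other[1]
    vx, vy = v_own[0] - v_other[0], v_own[1] - v_other[1]
    return sx * vx + sy * vy > 0
```

The criteria framework then requires that any recovery maneuver either aircraft chooses makes this predicate (and its vertical counterpart) true, which is what yields implicit coordination.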

  19. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  20. Real-Time Projection to Verify Plan Success During Execution

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occurred during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
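The battery example reduces to a feasibility check that can be re-run each time the state changes: simulate the remaining energy through the remaining activities and confirm it never drops below the safe floor. A hedged sketch with invented activity tuples, not the Mission Data System API:

```python
import numpy as np

def plan_achievable(battery_wh, activities, floor_wh):
    """Re-verify a plan: simulate remaining battery energy after each
    activity, given as (consumption_wh, recharge_wh), and check it
    never drops below the safe floor."""
    deltas = np.array([recharge - use for use, recharge in activities])
    remaining = battery_wh + np.cumsum(deltas)
    return bool(np.all(remaining >= floor_wh))
```

During execution, `battery_wh` and `activities` would be updated from the current state, so an unexpectedly high drain flags the plan as unachievable well before the floor is actually reached.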

  1. Verifying non-Abelian statistics by numerical braiding Majorana fermions

    NASA Astrophysics Data System (ADS)

    Cheng, Qiu-Bo; He, Jing; Kou, Su-Peng

    2016-02-01

    Recently, Majorana fermions have attracted intensive attention because of their possible non-Abelian statistics and potential applications in topological quantum computation. This paper describes an approach to verify the non-Abelian statistics of Majorana fermions in topological superconductors. From the relationship between the braiding operator of Majorana fermions and that of Bogoliubov-de Gennes states, we determine that Majorana fermions in one-dimensional and two-dimensional topological superconductors both obey non-Abelian statistics.

  2. Cross-correlation: an objective tool to indicate induced seismicity

    NASA Astrophysics Data System (ADS)

    Oprsal, Ivo; Eisner, Leo

    2014-03-01

    Differentiation between natural and induced seismicity is crucial for the ability to safely and soundly carry out various underground experiments and operations. This paper defines an objective tool for one of the criteria used to discriminate between natural and induced seismicity. The qualitative correlation between earthquake rates and the injected volume has been an established tool for investigating the possibility of induced, or triggered, seismicity. We derive mathematically, and verify using numerical examples, that the definition of normalized cross-correlation (NCC) between positive random functions exhibits high values with a limit equal to one, if these functions (such as earthquake rates and injection volumes) have a large mean and low standard deviation. In such a case, the high NCC values do not necessarily imply a temporal relationship between the phenomena. Instead of positive-value time histories, the functions with their running mean subtracted should be used for cross-correlation. The NCC of such functions (called here NCCEP) may be close to zero, or may oscillate between positive and negative values in cases where seismicity is not related to injection. We apply this method to case studies of seismicity in Colorado, the United Kingdom, Switzerland and south-central Oklahoma, and show that NCCEP reliably determines induced seismicity. Finally, we introduce a geomechanical model explaining the positive cross-correlation observed in the induced seismicity data sets.
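The paper's point is easy to reproduce numerically: two unrelated positive series with large means correlate almost perfectly until the means are removed. The sketch below uses a global mean in place of the paper's running mean, for brevity:

```python
import numpy as np

def ncc_zero_lag(x, y):
    """Zero-lag normalized cross-correlation of the raw series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y)))

def nccep_zero_lag(x, y):
    """The same correlation after removing each series' mean, analogous
    to the paper's NCCEP (which subtracts a running mean)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return ncc_zero_lag(x - x.mean(), y - y.mean())
```

For two series with mean 100 and orthogonal unit fluctuations, the raw NCC is nearly 1 while the demeaned correlation is 0, which is exactly the artifact the authors warn against.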

  3. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate model code, in which case the local code shall be utilized as the standard; or (2) The locality... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC... Congress (SBCC) Standard Building Code (SBC). (b) The seismic design and construction of a covered...

  4. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate model code, in which case the local code shall be utilized as the standard; or (2) The locality... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC... Congress (SBCC) Standard Building Code (SBC). (b) The seismic design and construction of a covered...

  5. 41 CFR 128-1.8005 - Seismic safety standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... appropriate model code, in which case the local code shall be utilized as the standard; or (2) The locality... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC... Congress (SBCC) Standard Building Code (SBC). (b) The seismic design and construction of a covered...

  6. Nuclear archaeology: Verifying declarations of fissile-material production

    SciTech Connect

Fetter, S.

    1993-01-01

    Controlling the production of fissile material is an essential element of nonproliferation policy. Similarly, accounting for the past production of fissile material should be an important component of nuclear disarmament. This paper describes two promising techniques that make use of physical evidence at reactors and enrichment facilities to verify the past production of plutonium and highly enriched uranium. In the first technique, the concentrations of long-lived radionuclides in permanent components of the reactor core are used to estimate the neutron fluence in various regions of the reactor, and thereby verify declarations of plutonium production in the reactor. In the second technique, the ratio of the concentration of U-235 to that of U-234 in the tails is used to determine whether a given container of tails was used in the production of low-enriched uranium, which is suitable for reactor fuel, or highly enriched uranium, which can be used in nuclear weapons. Both techniques belong to the new field of "nuclear archaeology," in which the authors attempt to document past nuclear weapons activities and thereby lay a firm foundation for verifiable nuclear disarmament. 11 refs., 1 fig., 3 tabs.

  7. Robustness and device independence of verifiable blind quantum computing

    NASA Astrophysics Data System (ADS)

    Gheorghiu, Alexandru; Kashefi, Elham; Wallden, Petros

    2015-08-01

    Recent advances in theoretical and experimental quantum computing bring us closer to scalable quantum computing devices. This makes the need for protocols that verify the correct functionality of quantum operations timely and has led to the field of quantum verification. In this paper we address key challenges to make quantum verification protocols applicable to experimental implementations. We prove the robustness of the single server verifiable universal blind quantum computing protocol of Fitzsimons and Kashefi (2012 arXiv:1203.5217) in the most general scenario. This includes the case where the purification of the deviated input state is in the hands of an adversarial server. The proved robustness property allows the composition of this protocol with a device-independent state tomography protocol that we give, which is based on the rigidity of CHSH games as proposed by Reichardt et al (2013 Nature 496 456-60). The resulting composite protocol has lower round complexity for the verification of entangled quantum servers with a classical verifier and, as we show, can be made fault tolerant.

  8. Sensor-based warranty system for improving seismic performance of building structures

    NASA Astrophysics Data System (ADS)

    Miyamoto, Ryu; Mita, Akira

    2008-03-01

    This paper proposes a warranty system based on a seismic performance agreement, and investigates its feasibility. Specifically, we focus on making clear to building users the relationship between seismic force and seismic damage or loss, and propose a warranty agreement in which accountability for seismic loss is defined in terms of ground motion parameters obtained by a seismic sensor. This study uses the Japan Meteorological Agency seismic intensity scale (I-jma) because of its general acceptance and recognition. A portfolio of buildings in 10 suburbs of the Kanto region is chosen for seismic portfolio analysis. The following conclusions were derived: 1. For a portfolio of 10 base-isolated buildings, the builder's seismic expected loss was found to be approximately 0.01%. 2. Regarding the feasibility of risk finance by seismic derivatives, this study found that it is possible to transfer most of the builder's risk through a 0.01% premium rate. 3. The builder's risk reduction was verified through use of a seismometer. 4. A new contract warranty agreement for seismic loss insurance for users was proposed, which can reduce the premium to 1/15 of that under current seismic insurance schemes.

  9. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya is a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974 and the recent 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard; a crucial input in seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach for the purposes of identifying spatial differences in seismicity, which can be utilised to form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitioning in a seismic hazard context. A magnitude homogeneous earthquake catalogue has been compiled using various available earthquake data. 
The earthquake catalogue covers a time span from 1930 to 2007 and an area from 23.00°N to 39.00°N and 59.00°E to 80.00°E. A threshold magnitude of 5.2 is considered for K-means cluster analysis. The current study uses the traditional metrics of cluster quality, in addition to a seismic hazard contextual metric, to attempt to constrain the preferred number of clusters found in the data. The spatial distribution of earthquakes from the catalogue was used to define the seismic clusters for Pakistan, which can be used further in the process of defining seismogenic sources and corresponding earthquake recurrence models for estimates of seismic hazard and risk in Pakistan. Consideration of the different approaches to cluster validation in a seismic hazard context suggests that Pakistan may be divided into K = 19 seismic clusters, including some portions of the neighbouring countries of Afghanistan, Tajikistan and India.
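The clustering step is plain Lloyd's K-means on epicentre coordinates. A self-contained numpy sketch (Euclidean distance in degrees, a simplification adequate only for a regional catalogue; not the study's implementation):

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's algorithm on (lat, lon) epicentres.
    Returns (centroids, labels)."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = pts[rng.choice(len(pts), k, replace=False)]
    for _ in range(iters):
        # assign each epicentre to its nearest centroid
        d = np.linalg.norm(pts[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned epicentres
        new = np.array([pts[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```

Cluster-quality metrics (and, in the study, a hazard-context metric) are then evaluated over a range of k to choose the preferred partition, such as the K = 19 reported.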

  10. Seismic station, USGS Northern California Seismic Network

    USGS Multimedia Gallery

    Traditional seismic stations such as this one require a source of power (solar here), a poured concrete foundation and several square feet of space. They are not always practical to install in urban areas, and that's where NetQuakes comes in....

  11. Stressing of fault patch during seismic swarms in central Apennines, Italy

    NASA Astrophysics Data System (ADS)

    De Gori, P.; Lucente, F. P.; Chiarabba, C.

    2015-04-01

    Persistent seismic swarms originate along the normal faulting system of the central Apennines (Italy). In this study, we analyze the space-time-energy distribution of one of the longest and most intense of these swarms, active since August 2013 in the high seismic risk area of the Gubbio basin. Our aim is to verify whether information relevant to constraining short-term earthquake occurrence scenarios is hidden in seismic swarms. During the swarm, the seismic moment release first accelerated, with a rapid migration of seismicity along the fault system, and then suddenly dropped. We observe a decrease of the b-value along the portion of the fault system where large-magnitude events concentrated, possibly indicating that a fault patch was dynamically stressed. This finding suggests that the onset of seismic swarms might help the formation of critically stressed patches.


  12. Method of migrating seismic records

    DOEpatents

    Ober, Curtis C. (Las Lunas, NM); Romero, Louis A. (Albuquerque, NM); Ghiglia, Dennis C. (Longmont, CO)

    2000-01-01

    The present invention provides a method of migrating seismic records that retains the information in the seismic records and allows migration with significant reductions in computing cost. The present invention comprises phase encoding seismic records and combining the encoded seismic records before migration. Phase encoding can minimize the effect of unwanted cross terms while still allowing significant reductions in the cost to migrate a number of seismic records.
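The encoding idea can be caricatured in a few lines: give each shot record a phase factor in the frequency domain and stack the encoded records, so a single migration pass handles many records while the unwanted cross terms tend to cancel on average. The sketch below uses one frequency-independent random phase per record, a simplification for illustration, not the patent's specific encoding:

```python
import numpy as np

def phase_encode_and_stack(records, seed=0):
    """Apply a random phase factor to each record's spectrum and sum
    them into one composite record for a single migration pass."""
    rng = np.random.default_rng(seed)
    stacked = np.zeros_like(np.fft.rfft(records[0]))
    for rec in records:
        phi = rng.uniform(0, 2 * np.pi)          # one random phase per record
        stacked += np.exp(1j * phi) * np.fft.rfft(rec)
    return stacked  # migrate this once instead of migrating each record
```

Because the phases are random, averaging over several independent encodings further suppresses the residual cross terms, at a cost still far below migrating every record separately.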

  13. Verifying a Simplified Fuel Oil Field Measurement Protocol

    SciTech Connect

    Henderson, Hugh; Dentz, Jordan; Doty, Chris

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  14. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    NASA Astrophysics Data System (ADS)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After failing to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and obtained another one from Venice, they could verify Galilei's observations and accepted them completely. Clavius, who adhered to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He and his confreres, however, avoided drawing any conclusions with respect to the planetary system.


  15. Verifying the Performance of RTDs in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Hashemian, H. M.

    2003-09-01

    This paper describes a number of techniques that have been developed for nuclear power plants to ensure that optimum steady-state and transient performance is achieved with the resistance temperature detectors (RTDs) that are used in the plant for critical temperature measurements. This includes precision laboratory calibration of RTDs, the Loop Current Step Response (LCSR) method for in-situ response time measurements, a cross calibration technique to verify the steady-state performance of RTDs as installed in the plant, and the Time Domain Reflectometry (TDR) test that is used to identify the location of a problem along RTD cables.
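The cross-calibration technique rests on redundancy: at an isothermal plant condition, each RTD is compared with the average of all RTDs, and a deviation beyond tolerance flags a drifting sensor. A hedged sketch of that idea only (in practice the plant procedure iteratively excludes outliers from the average and applies corrections):

```python
def cross_calibrate(readings, tolerance_c=0.3):
    """Cross calibration of redundant RTDs at an isothermal plateau.

    readings: {sensor_tag: temperature_C}. Returns the process average,
    each sensor's deviation from it, and tags deviating beyond tolerance.
    """
    avg = sum(readings.values()) / len(readings)
    deviations = {tag: r - avg for tag, r in readings.items()}
    outliers = [tag for tag, d in deviations.items() if abs(d) > tolerance_c]
    return avg, deviations, outliers
```

A single drifted sensor also biases the simple average and inflates the other deviations, which is why the iterative exclusion step matters in a real procedure.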

  16. Verifiable Quantum ( k, n)-threshold Secret Key Sharing

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Teng, Yi-Wei; Chai, Hai-Ping; Wen, Qiao-Yan

    2011-03-01

    Based on the Lagrange interpolation formula and a post-verification mechanism, we show how to construct a verifiable quantum ( k, n)-threshold secret key sharing scheme. Compared with previous secret sharing protocols, ours has the following merits: (i) it can resist fraud by a dealer who generates and distributes fake shares among the participants during the secret distribution phase; and, most importantly, (ii) it can detect cheating by a dishonest participant who provides a false share during the secret reconstruction phase that would prevent the authorized group from recovering the correct secret.
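The classical backbone of such a scheme is Shamir's (k, n)-threshold sharing via Lagrange interpolation over a prime field; the quantum construction adds its distribution and verification layers on top of this arithmetic. A classical sketch only, not the proposed quantum protocol:

```python
# Shamir (k, n)-threshold sharing: shares are points on a degree-(k-1)
# polynomial over GF(P); any k shares recover f(0) = secret.
P = 2**61 - 1  # a Mersenne prime field modulus

def make_shares(secret, coeffs, n):
    """Shares (x, f(x)) of f(x) = secret + c1*x + ... + c_{k-1}*x^{k-1} mod P."""
    poly = [secret] + list(coeffs)
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 from any k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret
```

Any k of the n shares reconstruct the secret exactly, while k - 1 shares reveal nothing; the quantum scheme's post-verification is what catches a fake share before reconstruction is trusted.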

  17. Permeameter data verify new turbulence process for MODFLOW.

    PubMed

    Kuniansky, Eve L; Halford, Keith J; Shoemaker, W Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005. PMID:18459958

  18. Verifying a Simplified Fuel Oil Flow Field Measurement Protocol

    SciTech Connect

    Henderson, H.; Dentz, J.; Doty, C.

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  19. Permeameter data verify new turbulence process for MODFLOW

    USGS Publications Warehouse

    Kuniansky, Eve L.; Halford, Keith J.; Shoemaker, W. Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005.
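
    The nonlinear gradient-discharge relation described in these permeameter records is commonly represented by a Forchheimer-type law, i = aQ + bQ². The sketch below fits such a law by least squares; the coefficients and data are illustrative assumptions, not the CFPM2 formulation or the Key Largo measurements.

```python
# Fitting a Forchheimer-type nonlinear relation i = a*Q + b*Q**2 between
# hydraulic gradient i and discharge Q, one common way to represent the
# turbulent-flow behavior described above (illustrative only).
import numpy as np

def fit_forchheimer(Q, i):
    """Least-squares estimates of the linear (a) and inertial (b) coefficients."""
    A = np.column_stack([Q, Q**2])
    coef, *_ = np.linalg.lstsq(A, i, rcond=None)
    return coef  # [a, b]

# Synthetic permeameter-style data (hypothetical values).
Q = np.array([0.001, 0.002, 0.004, 0.008, 0.016])
i_obs = 2.0 * Q + 500.0 * Q**2
a, b = fit_forchheimer(Q, i_obs)
```

    A nonzero fitted b indicates a departure from Darcy's law, which is the behavior the turbulent-flow approximation is designed to capture.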

  20. Successes and failures of recording and interpreting seismic data in structurally complex area: seismic case history

    SciTech Connect

    Morse, V.C.; Johnson, J.H.; Crittenden, J.L.; Anderson, T.D.

    1986-05-01

    There are successes and failures in recording and interpreting a single seismic line across the South Owl Creek Mountain fault on the west flank of the Casper arch. Information obtained from this type of work should help explorationists who are exploring structurally complex areas. A depth cross section lacks a subthrust prospect, but is illustrated to show that the South Owl Creek Mountain fault is steeper with less apparent displacement than in areas to the north. This cross section is derived from two-dimensional seismic modeling, using data processing methods specifically for modeling. A flat horizon and balancing technique helps confirm model accuracy. High-quality data were acquired using specifically designed seismic field parameters. The authors concluded that the methodology used is valid, and an interactive modeling program in addition to cross-line control can improve seismic interpretations in structurally complex areas.

  1. Seismic fragility test of a 6-inch diameter pipe system

    SciTech Connect

    Chen, W. P.; Onesto, A. T.; DeVita, V.

    1987-02-01

    This report contains the test results and assessments of seismic fragility tests performed on a 6-inch diameter piping system. The test was funded by the US Nuclear Regulatory Commission (NRC) and conducted by ETEC. The objective of the test was to investigate the ability of a representative nuclear piping system to withstand high level dynamic seismic and other loadings. Levels of loadings achieved during seismic testing were 20 to 30 times larger than normal elastic design evaluations to ASME Level D limits would permit. Based on failure data obtained during seismic and other dynamic testing, it was concluded that nuclear piping systems are inherently able to withstand much larger dynamic seismic loadings than permitted by current design practice criteria or predicted by the probabilistic risk assessment (PRA) methods and several proposed nonlinear methods of failure analysis.

  2. Discussing Seismic Data

    USGS Multimedia Gallery

    USGS scientists Debbie Hutchinson and Jonathan Childs discuss collected seismic data. This image was taken aboard U.S. Coast Guard Cutter Healy during a scientific expedition to map the Arctic seafloor.

  3. Seismic isolation analysis of FPS bearings in spatial lattice shell structures

    NASA Astrophysics Data System (ADS)

    Yong-Chul, Kim; Xue, Suduo; Zhuang, Peng; Zhao, Wei; Li, Chenghao

    2010-03-01

    A theoretical model of a friction pendulum system (FPS) is introduced to examine its application to the seismic isolation of spatial lattice shell structures. An equation of motion of the lattice shell with FPS bearings is developed. Seismic isolation studies are then performed for both double-layer and single-layer lattice shell structures under different seismic inputs and design parameters of the FPS. The influence of the frictional coefficient and radius of the FPS on seismic performance is discussed. Based on the study, suggestions for the seismic isolation design of lattice shells with FPS bearings are given, and conclusions are drawn that could be helpful in the application of FPS bearings.

  4. A verified minimal YAC contig for human chromosome 21

    SciTech Connect

    Graw, S.L.; Patterson, D.; Drabkin, H.

    1994-09-01

    The goal of this project is the construction of a verified YAC contig of the complete long arm of human chromosome 21 utilizing YACs from the CEPH and St. Louis libraries. The YACs in this contig have been analyzed for size by PFGE, tested for chimerism by FISH or end-cloning, and verified for STS content by PCR. This last analysis has revealed a number of cases of conflict with the published STS order. To establish correct order, we have utilized STS content analysis of somatic cell hybrids containing portions of chromosome 21. Additional problems being addressed include completeness of coverage and possible deletions or gaps. Questions of completeness of the CEPH 810 YAC set arose after screening with 57 independently derived probes failed to identify clones for 11 (19%). Ten of the 11, however, do detect chromosome 21 cosmids when used to screen Lawrence Livermore library LL21NC02"G", a cosmid library constructed from flow-sorted chromosome 21. Remaining gaps in the contig are being closed by several methods. These include YAC fingerprinting and conversion of YACs to cosmids. In addition, we are establishing the overlap between the physical NotI map and the YAC contig by testing YACs for NotI sites and screening the YACs in the contig for the presence of NotI-linking clones.

  5. Third Quarter Hanford Seismic Report for Fiscal Year 2005

    SciTech Connect

    Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

    2005-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event on May 25 near Vantage, Washington. During the third quarter, stratigraphically, 17 (85%) events occurred in the Columbia River basalt (approximately 0-5 km), no events in the pre-basalt sediments (approximately 5-10 km), and three (15%) in the crystalline basement (approximately 10-25 km). Geographically, five (25%) earthquakes occurred in swarm areas, 10 (50%) were associated with a major geologic structure, and five (25%) were classified as random events.

  6. Annual Hanford Seismic Report for Fiscal Year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-12-01

    This report describes the seismic activity in and around the Hanford Site during fiscal year 2003. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 1,336 triggers during fiscal year 2003. Of these triggers, 590 were earthquakes, of which 101 were located in the Hanford Seismic Network area. Stratigraphically, 35 (34.6%) occurred in the Columbia River basalt, 29 (28.7%) in the pre-basalt sediments, and 37 (36.7%) in the crystalline basement. Geographically, 48 (47%) earthquakes occurred in swarm areas, 4 (4%) were associated with a major geologic structure, and 49 (49%) were classified as random events. During the third and fourth quarters, an earthquake swarm consisting of 27 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur near the interface of the Columbia River Basalt Group and pre-basalt sediments.

  7. Seismic Consequence Abstraction

    SciTech Connect

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  8. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions were first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  9. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    PubMed Central

    Lee, J. H.; Chang, L. T.; Shiau, A. C.; Chen, C. W.; Liao, Y. J.; Li, W. J.; Lee, M. S.; Hsu, S. M.

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with 60Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions. PMID:25883980
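
    Converting a TLD readout into absorbed dose, as described above, amounts to multiplying the readout by the correction factors in sequence. The sketch below uses illustrative factor values within the reported ranges and a hypothetical delivered dose; it is not the authors' dosimetric system.

```python
# Chaining the three correction factors named above (phantom scatter,
# energy dependence, setup condition) to convert a TLD readout into
# absorbed dose in water. All numeric values are illustrative assumptions.
def absorbed_dose(readout_gy, k_scatter, k_energy, k_setup):
    """Dose = readout multiplied by each correction factor."""
    return readout_gy * k_scatter * k_energy * k_setup

# Hypothetical audit: 2.00 Gy readout, factors within the reported ranges.
dose = absorbed_dose(2.00, k_scatter=1.05, k_energy=1.00, k_setup=1.00)
# Deviation from a hypothetical 2.06 Gy delivered dose.
deviation = abs(dose - 2.06) / 2.06
```

    In this made-up audit the deviation comes out under 3%, the acceptance level the clinical tests report.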

  10. A novel simple phantom for verifying the dose of radiation therapy.

    PubMed

    Lee, J H; Chang, L T; Shiau, A C; Chen, C W; Liao, Y J; Li, W J; Lee, M S; Hsu, S M

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with (60)Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions. PMID:25883980

  11. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1992-01-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  12. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    SciTech Connect

    Srivas, M.; Bickford, M.

    1992-05-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  13. The Budget Guide to Seismic Network Management

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Ebel, J. E.

    2007-05-01

    Regardless of their size, there are certain tasks that all seismic networks must perform, including data collection and processing, earthquake location, information dissemination, and quality control. Small seismic networks are unlikely to possess the resources -- manpower and money -- required to do much in-house development. Fortunately, there are a lot of free or inexpensive software solutions available that are able to perform many of the required tasks. Often the available solutions are all-in-one turnkey packages designed and developed for much larger seismic networks, and the cost of adapting them to a smaller network must be weighed against the ease with which other, non-seismic software can be adapted to the same task. We describe here the software and hardware choices we have made for the New England Seismic Network (NESN), a sparse regional seismic network responsible for monitoring and reporting all seismicity within the New England region in the northeastern U.S. We have chosen to use a cost-effective approach to monitoring using free, off-the-shelf solutions where available (e.g., Earthworm, HYP2000) and modifying freeware solutions when it is easier than trying to adapt a large, complicated package. We have selected for use software that is: free, likely to receive continued support from the seismic or, preferably, larger internet community, and modular. Modularity is key to our design because it ensures that if one component of our processing system becomes obsolete, we can insert a suitable replacement with few modifications to the other modules. Our automated event detection, identification and location system is based on a wavelet transform analysis of station data that arrive continuously via TCP/IP transmission over the internet. 
Our system for interactive analyst review of seismic events and remote system monitoring utilizes a combination of Earthworm modules, Perl cgi-bin scripts, Java, and native Unix commands and can now be carried out via an internet browser from anywhere in the world. With our current communication and processing system we are able to achieve a monitoring threshold of about M2.0 for most of New England, in spite of high cultural noise and sparse station distribution, and maintain an extremely high rate of data recovery, for minimal cost.
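
    The wavelet-transform detection idea mentioned above can be sketched as convolving the continuous trace with a Ricker wavelet and triggering where the response exceeds a noise-derived threshold. This is an illustrative sketch with assumed widths and thresholds, not the NESN production detector.

```python
# Minimal sketch of wavelet-style event detection: correlate the trace with
# a Ricker (Mexican-hat) wavelet and trigger where the response exceeds a
# robust, noise-based threshold. Parameters are assumptions for illustration.
import numpy as np

def ricker(points, a):
    """Ricker wavelet with width parameter a, sampled at `points` samples."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def detect(trace, width=8, k=8.0):
    """Return sample indices where the wavelet response exceeds k robust sigmas."""
    resp = np.abs(np.convolve(trace, ricker(8 * width, width), mode="same"))
    # Robust noise level from the median absolute deviation of the response.
    sigma = np.median(np.abs(resp - np.median(resp))) / 0.6745 + 1e-12
    return np.flatnonzero(resp > np.median(resp) + k * sigma)
```

    On a synthetic trace of low-level noise with one impulsive arrival, the detector triggers only in the neighborhood of the arrival; a real system would add coincidence logic across stations before declaring an event.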

  14. Short-Period Seismic Noise in Vorkuta (Russia)

    SciTech Connect

    Kishkina, S B; Spivak, A A; Sweeney, J J

    2008-05-15

    Cultural development of new subpolar areas of Russia is associated with a need for detailed seismic research, including both mapping of regional seismicity and seismic monitoring of specific mining enterprises. Of special interest are the northern territories of European Russia, including the shelves of the Kara and Barents Seas, the Yamal Peninsula, and the Timan-Pechora region. Continuous seismic studies of these territories are important now because there is insufficient seismological knowledge of the area and an absence of systematic data on the seismicity of the region. Another task of current interest is the need to consider the seismic environment in the design, construction, and operation of natural gas extracting enterprises such as the construction of the North European Gas Pipeline. Issues of scientific importance for seismic studies in the region are the complex geodynamical setting, the presence of permafrost, and the complex tectonic structure. In particular, the Uralian Orogene (Fig. 1) strongly affects the propagation of seismic waves. The existing subpolar seismic stations [APA (67.57°N; 33.40°E), LVZ (67.90°N; 34.65°E), and NRIL (69.50°N; 88.40°E)] do not cover the extensive area between the Pechora and Ob Rivers (Fig. 1). Thus seismic observations in the Vorkuta area, which lies within the area of concern, are of special interest. Continuous recording at a seismic station near the city of Vorkuta (67.50°N; 64.11°E) [1] has been conducted since 2005 for the purpose of regional seismic monitoring and, more specifically, detection of seismic signals caused by local mining enterprises. Current surveys of local seismic noise [7,8,9,11] are particularly aimed at a technical assessment of the suitability of the site for installation of a small-aperture seismic array, which would include 10-12 recording instruments, with the Vorkuta seismic station as the central element.
When constructed, this seismic array will considerably improve the recording capacity for regional and local seismic events. It will allow detection of signatures of seismic waves propagating in submeridional and sublatitudinal directions. The latter is of special interest not only to assess the influence of the Urals on propagation patterns of seismic waves, but also to address other questions, such as the structure and dynamic characteristics of the internal dynamo of the Earth [9,13]. Recording seismic waves at low angular distances from seismically active subpolar zones will allow us to collect data on vortical and convective movements in subpolar lithosphere blocks and at the boundary of the inner core of the Earth, possibly giving essential clues to the modeling of the Earth's electromagnetic field [3,13]. The present study considers basic features of seismic noise at the Vorkuta station obtained through the analysis of seismic records from March 2006 through December 2007.

  15. Cryptanalysis and improvement of verifiable quantum (k, n) secret sharing

    NASA Astrophysics Data System (ADS)

    Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun

    2015-12-01

    After analyzing Yang's verifiable quantum secret sharing (VQSS) scheme, we show that in their scheme a participant can prepare a false quantum particle sequence corresponding to a forged share without any other participant being able to trace it. In addition, an attacker or a participant can forge a new quantum sequence by transforming an intercepted quantum sequence; moreover, the forged sequence can pass the verification of the other participants. We therefore propose a new VQSS scheme to improve the existing one. In the improved scheme, we construct an identity-based quantum signature encryption algorithm, which ensures chosen-plaintext-attack security of the shares and their signatures transmitted in the quantum tunnel. We employ a dual quantum signature and a one-way function to guard against forgery and repudiation by deceivers (the dealer or participants). Furthermore, we add a reconstruction process for the quantum secret and prove its security against superposition attack.

  16. Analysis of Fingerprint Image to Verify a Person

    NASA Astrophysics Data System (ADS)

    Jahankhani, Hossein; Mohid, Maktuba

    Identification and authentication technologies are increasing day by day to protect people and goods from crime and terrorism. This paper discusses fingerprint technology in depth and analyzes fingerprint images to verify a person, with a highlight on fingerprint matching. Several fingerprint matching algorithms are analysed and compared. The analysis identifies some major issues or factors in fingerprinting: location, rotation, clipping, noise, non-linear distortion sensitivity/insensitivity properties, computational cost, and the accuracy level of fingerprint matching algorithms. A new fingerprint matching algorithm is also proposed in this work. The proposed algorithm uses Euclidean distance, angle difference, and minutia type as matching parameters instead of specific location parameters (such as x or y coordinates), which makes the algorithm insensitive to location and rotation. The matching of local neighbourhoods at each stage makes the algorithm insensitive to non-linear distortion.
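
    The rotation- and translation-insensitivity of such matching parameters can be sketched as follows: describing a neighbor minutia by its Euclidean distance, relative angle, and type (rather than absolute coordinates) leaves the description unchanged when the whole print is shifted or rotated. The function names and tolerances below are hypothetical, not the paper's algorithm.

```python
# Minutia neighborhood comparison using the matching parameters named above:
# Euclidean distance, angle difference, and minutia type. A minimal sketch
# with assumed tolerances; minutiae are (x, y, angle_rad, type) tuples.
import math

def neighbor_features(center, neighbor):
    """Relative features of a neighbor minutia w.r.t. a central one."""
    (x0, y0, a0, _), (x1, y1, a1, t1) = center, neighbor
    dist = math.hypot(x1 - x0, y1 - y0)       # translation-invariant
    dang = (a1 - a0) % (2 * math.pi)          # relative angle, rotation-invariant
    return dist, dang, t1

def neighbors_match(f1, f2, d_tol=5.0, a_tol=0.2):
    """Two neighbor descriptions match if type agrees and distance/angle are close."""
    d1, a1, t1 = f1
    d2, a2, t2 = f2
    da = min(abs(a1 - a2), 2 * math.pi - abs(a1 - a2))  # wrap-around difference
    return t1 == t2 and abs(d1 - d2) <= d_tol and da <= a_tol
```

    Because only relative quantities enter the comparison, a rigidly rotated and translated copy of the same neighborhood yields identical features.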

  17. Beyond Hammers and Nails: Mitigating and Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Gurney, Kevin Robert

    2013-05-01

    One of the biggest challenges to future international agreements on climate change is an independent, science-driven method of verifying reductions in greenhouse gas emissions (GHG) [Niederberger and Kimble, 2011]. The scientific community has thus far emphasized atmospheric measurements to assess changes in emissions. An alternative is direct measurement or estimation of fluxes at the source. Given the many challenges facing the approach that uses "top-down" atmospheric measurements and recent advances in "bottom-up" estimation methods, I challenge the current doctrine, which has the atmospheric measurement approach "validating" bottom-up, "good-faith" emissions estimation [Balter, 2012] or which holds that the use of bottom-up estimation is like "dieting without weighing oneself" [Nisbet and Weiss, 2010].

  18. Developing an Approach for Analyzing and Verifying System Communication

    NASA Technical Reports Server (NTRS)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and these systems of systems require reliable communication. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  19. Needed: Verified models to predict the fracture of weldments

    SciTech Connect

    Keefer, D.W.; Reuter, W.G.; Smartt, H.B.; Johnson, J.A. . Chemical and Materials Research and Engineering Group); David, S.A. )

    1993-09-01

    The workshop participants agreed that it is necessary to develop verified models to predict the fracture process (initiation of crack growth, crack growth, and instability) for weldments. The capability needs to include weld mismatch, residual stresses, etc., as well as some information about limits of applicability. These models provide the basis for predicting fitness-for-service (purpose), which allows: (1) the establishment of NDE capabilities and accept/reject criteria; (2) assessing flaws for making decisions such as accept, reject, or accept with limits on operating conditions (including time). The workshop participants acknowledged the need to have close collaboration between experiment, micromechanics, and continuum mechanics. The recommended areas of research are first, initiation of crack growth [(a) experimental, (b) continuum, and (c) micromechanics] and second, crack growth.

  20. Cryptanalysis and improvement of verifiable quantum (k, n) secret sharing

    NASA Astrophysics Data System (ADS)

    Song, Xiuli; Liu, Yanbing

    2016-02-01

    After analyzing Yang's verifiable quantum secret sharing (VQSS) scheme, we show that in their scheme a participant can prepare a false quantum particle sequence corresponding to a forged share without any other participant being able to trace it. In addition, an attacker or a participant can forge a new quantum sequence by transforming an intercepted quantum sequence; moreover, the forged sequence can pass the verification of the other participants. We therefore propose a new VQSS scheme that improves on the existing one. In the improved scheme, we construct an identity-based quantum signature encryption algorithm, which ensures chosen-plaintext-attack security of the shares and their signatures transmitted in the quantum tunnel. We employ a dual quantum signature and a one-way function to protect against forgery and repudiation by deceivers (dealer or participants). Furthermore, we add a reconstruction process for the quantum secret and prove the security of this process against the superposition attack.

  1. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
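The completeness claim above has a concrete numerical analogue: for polynomial trajectories, the minimum of the squared horizontal separation over the lookahead interval is attained either at an interval endpoint or at a real critical point of the polynomial, so checking only those finitely many points misses no conflict and raises no false alarm. A minimal sketch of that idea (not the formally verified algorithm of the paper; the function name and NumPy coefficient convention, highest degree first, are assumptions):

```python
import numpy as np

def in_conflict(dx, dy, D, T):
    """Horizontal-only conflict test for polynomial relative motion.

    dx, dy: coefficients (highest degree first) of the relative-position
    polynomials between the two aircraft; D: separation minimum; T: lookahead.
    Conflict iff |d(t)|^2 - D^2 drops below zero somewhere on [0, T].
    """
    # p(t) = dx(t)^2 + dy(t)^2 - D^2, still a polynomial.
    p = np.polyadd(np.polymul(dx, dx), np.polymul(dy, dy))
    p = np.polyadd(p, [-D**2])
    # The minimum on [0, T] occurs at an endpoint or a real critical point.
    crit = np.roots(np.polyder(p))
    ts = [0.0, T] + [r.real for r in crit
                     if abs(r.imag) < 1e-9 and 0.0 <= r.real <= T]
    return min(np.polyval(p, t) for t in ts) < 0
```

Because the candidate set is exhaustive for polynomials, the check is (up to floating-point root finding) both sound and complete, which is the property the paper establishes formally.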

  2. Use of VSP to enhance surface seismic processing

    SciTech Connect

    O'Rourke, T.J.

    1988-08-01

    Many exploratory successes in Mediterranean basins have been guided by the seismic location of structures. Continued exploration (and delineation) efforts must search for smaller, more subtle indications which may be obscured on seismic data by the effects of multiples, attenuation, and velocity errors. By introducing vertical seismic profiles (VSP) and well-log information, the processing of existing seismic data can be fine-tuned, often improving resolution and correlation. With VSP measurements, velocities used in the stacking process can be verified. When events are dipping, the potential errors in converting stacking velocities to interval velocities for migration are avoided by using directly measured VSP interval velocities. This may affect fault positioning and closure. Frequency-dependent propagation effects are removed with an inverse Q-filter, computed from the VSP data. This compensation often boosts resolution. The response from surface seismic, VSP, and logs should be the same around a well. Differences are often due to variations in the source wavelet, which can be reconciled by matching both the frequency and phase of the wavelets or converting all results to zero phase. Sections of different vintages can be matched with the same method. After matching, inversion is constrained to match the VSP and logs at the well. This provides low-frequency control and gives a sensible correlation with formations. Introducing VSP and well-log information while reprocessing existing surface seismic data can often provide a cost-effective solution to the problems encountered in stratigraphic or development geophysics.
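The inverse Q-filter mentioned above can be illustrated in its simplest, amplitude-only form: under a constant-Q earth model, a component at frequency f is attenuated by exp(-pi f t / Q) over travel time t, so compensation multiplies the spectrum by the reciprocal gain. A hedged sketch (phase dispersion due to Q is ignored, and the function name is illustrative, not from the paper):

```python
import numpy as np

def inverse_q_gain(trace, dt, t_travel, Q):
    """Amplitude-only inverse-Q compensation of a single trace.

    Boosts each frequency f by exp(pi * f * t_travel / Q), the inverse of
    the attenuation a constant-Q earth applies over travel time t_travel.
    """
    spec = np.fft.rfft(trace)
    f = np.fft.rfftfreq(len(trace), d=dt)
    return np.fft.irfft(spec * np.exp(np.pi * f * t_travel / Q), n=len(trace))
```

In practice the gain must be capped at high frequencies to avoid amplifying noise, and Q itself is what the VSP measurement supplies.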

  3. Seismic analysis of a nonlinear airlock door system

    SciTech Connect

    Huang, S.N.

    1983-01-01

    The containment equipment airlock door of the Fast Flux Test Facility utilizes screw-type actuators as a push-pull mechanism for closing and opening operations. Special design features were used to protect these actuators from pressure differential loading. These made the door behave as a nonlinear system during a seismic event. Seismic analyses, utilizing the time history method, were conducted to determine the seismic loads on these screw-type actuators. Several sizes of actuators were examined. Procedures for determining the final optimum design are discussed in detail.

  4. A PZT-based smart aggregate for seismic shear stress monitoring

    NASA Astrophysics Data System (ADS)

    Hou, S.; Zhang, H. B.; Ou, J. P.

    2013-06-01

    A lead zirconate titanate (PZT)-based smart aggregate (SA) is proposed for seismic shear stress monitoring in concrete structures. This SA uses a d15-mode PZT as the sensing element. A calibration test is designed in which a cyclic shear stress with a dominant frequency of the earthquake response spectrum is applied on the two opposite sides of the proposed SA using a specially designed loading mold. The test is repeated on six copies of the proposed SA. The maximum applied shear stress is larger than the shear strength of ordinary concrete to allow measurements during failure. The output voltage of the SA is experimentally verified as varying linearly with the applied stress in the loading range. The sensitivity of the proposed SA to the applied stress under the given boundary conditions is examined. The calibrated sensitivity value is then compared with the calculated value, which is obtained by computing the stress distribution in the SA using finite element analysis (FEA). The calculated values and the calibrated values are approximately the same, indicating that the established finite element (FE) model is reliable. Monotonic loading is also applied on the proposed SA to induce cracks between the SA and the loading mold, and the SA's response to cracking processes is examined. It is found that the proposed SA underestimates the cracking process. This study demonstrates that the proposed SA can be used in monitoring the overall shear stress development process in concrete during a seismic event.
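Because the SA's output voltage varies linearly with applied stress in the loading range, the calibration step reduces to fitting a least-squares slope, the sensitivity in volts per unit stress. An illustrative sketch (the function name and data are hypothetical; an actual calibration would also report residuals and confidence bounds):

```python
def sensitivity(stress, volts):
    """Least-squares slope of the linear model volts = a * stress + b.

    Returns a, the calibrated sensitivity (V per unit stress).
    """
    n = len(stress)
    sx, sy = sum(stress), sum(volts)
    sxx = sum(x * x for x in stress)
    sxy = sum(x * y for x, y in zip(stress, volts))
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)
```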

  5. Seismic Hazard Characterization at the DOE Savannah River Site (SRS): Status report

    SciTech Connect

    Savy, J.B.

    1994-06-24

    The purpose of the Seismic Hazard Characterization project for the Savannah River Site (SRS-SHC) is to develop estimates of the seismic hazard for several locations within the SRS. Given the differences in the geology and geotechnical characteristics at each location, the estimates of the seismic hazard are to allow for the specific local conditions at each site. Characterization of seismic hazard is a critical factor for the design of new facilities as well as for the review and potential retrofit of existing facilities at SRS. The scope of the SRS seismic hazard characterization reported in this document is limited to the Probabilistic Seismic Hazard Analysis (PSHA). The goal of the project is to provide seismic hazard estimates based on a state-of-the-art method that is consistent with the developments and findings of several ongoing studies expected to improve the state of seismic hazard analysis.

  6. Tornado Detection Based on Seismic Signal.

    NASA Astrophysics Data System (ADS)

    Tatom, Frank B.; Knupp, Kevin R.; Vitton, Stanley J.

    1995-02-01

    At the present time the only generally accepted method for detecting when a tornado is on the ground is human observation. Based on theoretical considerations combined with eyewitness testimony, there is strong reason to believe that a tornado in contact with the ground transfers a significant amount of energy into the ground. The amount of energy transferred depends upon the intensity of the tornado and the characteristics of the surface. Some portion of this energy takes the form of seismic waves, both body and surface waves. Surface waves (Rayleigh and possibly Love) represent the most likely type of seismic signal to be detected. Based on the existence of such a signal, a seismic tornado detector appears conceptually possible. The major concerns for designing such a detector are range of detection and discrimination between the tornadic signal and other types of surface waves generated by ground transportation equipment, high winds, or other nontornadic sources.

  7. Permafrost Active Layer Seismic Interferometry Experiment (PALSIE).

    SciTech Connect

    Abbott, Robert; Knox, Hunter Anne; James, Stephanie; Lee, Rebekah; Cole, Chris

    2016-01-01

    We present findings from a novel field experiment conducted at Poker Flat Research Range in Fairbanks, Alaska that was designed to monitor changes in active layer thickness in real time. Results are derived primarily from seismic data streaming from seven Nanometric Trillium Posthole seismometers directly buried in the upper section of the permafrost. The data were evaluated using two analysis methods: Horizontal to Vertical Spectral Ratio (HVSR) and ambient noise seismic interferometry. Results from the HVSR conclusively illustrated the method's effectiveness at determining the active layer's thickness with a single station. Investigations with the multi-station method (ambient noise seismic interferometry) are continuing at the University of Florida and have not yet conclusively determined active layer thickness changes. Further work continues with the Bureau of Land Management (BLM) to determine if the ground based measurements can constrain satellite imagery, which provide measurements on a much larger spatial scale.
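The single-station HVSR idea can be sketched concretely: the peak frequency f0 of the horizontal-to-vertical spectral ratio, combined with the quarter-wavelength rule h = Vs / (4 f0), yields a layer-thickness estimate. A minimal sketch, not the processing actually used in the study (real HVSR work smooths spectra and stacks many noise windows; function names are illustrative):

```python
import numpy as np

def hvsr_peak(north, east, vert, dt):
    """Frequency of the peak of the H/V spectral ratio for one window.

    H is the mean of the two horizontal amplitude spectra.
    """
    f = np.fft.rfftfreq(len(vert), d=dt)
    H = 0.5 * (np.abs(np.fft.rfft(north)) + np.abs(np.fft.rfft(east)))
    V = np.abs(np.fft.rfft(vert))
    ratio = H / (V + 1e-12)   # small floor avoids division by zero
    ratio[0] = 0.0            # ignore the DC bin
    return f[np.argmax(ratio)]

def layer_thickness(f0, vs):
    """Quarter-wavelength rule: resonant layer thickness h = Vs / (4 f0)."""
    return vs / (4.0 * f0)
```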

  8. SEISMIC MODELING ENGINES PHASE 1 FINAL REPORT

    SciTech Connect

    BRUCE P. MARION

    2006-02-09

    Seismic modeling is a core component of petroleum exploration and production today. Potential applications include modeling the influence of dip on anisotropic migration; source/receiver placement in deviated-well three-dimensional surveys for vertical seismic profiling (VSP); and the generation of realistic data sets for testing contractor-supplied migration algorithms or for interpreting AVO (amplitude variation with offset) responses. This project was designed to extend the use of a finite-difference modeling package, developed at Lawrence Berkeley Laboratories, to the advanced applications needed by industry. The approach included a realistic, easy-to-use 2-D modeling package for the desktop of the practicing geophysicist. The feasibility of providing a wide-ranging set of seismic modeling engines was fully demonstrated in Phase I. The technical focus was on adding variable gridding in both the horizontal and vertical directions, incorporating attenuation, improving absorbing boundary conditions and adding the optional coefficient finite difference methods.

  9. Seismic exploration for water on Mars

    NASA Technical Reports Server (NTRS)

    Page, Thornton

    1987-01-01

    It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present surface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each is well formed on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.

  10. Seismic stratigraphy on a micro budget

    SciTech Connect

    Clark, T.M.

    1984-04-01

    For a brief period, Tandy Corporation marketed an inexpensive digitizer under its Radio Shack trademark. A seismic stratigraphic analysis system has been developed using this device and a 48K Radio Shack microcomputer. This system has the capacity to enter well log curves and seismic traces at the digitizer, convert log curves to the time dimension by integration or interpolation, compute synthetic seismograms and time logs, and do synthetic modeling, wavelet estimation, and inversion of seismic and synthetic traces. The system allows great flexibility, as each process is designed as a stand-alone, interactive program, and data files are in identical format. Thus almost any order of operation may be chosen, and modeling may be in either depth or time. Display is to a pen plotter or dot-matrix printer-plotter. The plot routines allow flexibility in the number, order, spacing, and scale of the curves displayed.
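The synthetic-seismogram step described above is, at its core, a reflectivity series derived from an impedance log convolved with a wavelet. A minimal sketch in modern Python rather than the original microcomputer implementation (function name assumed):

```python
def synthetic_seismogram(impedance, wavelet):
    """Convolutional-model synthetic from an acoustic impedance log.

    Reflection coefficient at each interface: r_i = (Z_{i+1} - Z_i) / (Z_{i+1} + Z_i),
    then the reflectivity series is convolved with the wavelet.
    """
    r = [(z2 - z1) / (z2 + z1) for z1, z2 in zip(impedance, impedance[1:])]
    out = [0.0] * (len(r) + len(wavelet) - 1)
    for i, ri in enumerate(r):          # direct convolution
        for j, wj in enumerate(wavelet):
            out[i + j] += ri * wj
    return out
```

This assumes the log has already been converted to two-way time with uniform sampling, which is what the integration/interpolation step provides.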

  11. Optimizing Seismic Monitoring Networks for EGS and Conventional Geothermal Projects

    NASA Astrophysics Data System (ADS)

    Kraft, Toni; Herrmann, Marcus; Bethmann, Falko; Wiemer, Stefan

    2013-04-01

    In the past several years, geological energy technologies have received growing attention and have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, now pose a considerable risk for the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential for the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring at an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic-light systems and is therefore an important confidence-building measure towards the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that allows us to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts. The algorithm is based on D-optimal experimental design, which aims to minimize the error ellipsoid of the linearized location problem. Optimization for additional criteria (e.g., focal-mechanism determination or installation costs) can be included. We consider a 3D seismic velocity model, a European ambient seismic noise model derived from high-resolution land-use data, and existing seismic stations in the vicinity of the geotechnical site. Additionally, we account for the attenuation of the seismic signal with travel time and for the variation of ambient seismic noise with depth, so that borehole station networks are handled correctly. Using this algorithm we are able to find the optimal geometry and size of the seismic monitoring network that meets predefined, application-oriented performance criteria. This talk will focus on optimal network geometries for deep geothermal projects of the EGS and hydrothermal type, and discuss the requirements for basic seismic surveillance and for high-resolution reservoir monitoring and characterization.
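The D-optimal criterion named above maximizes the determinant of the Fisher information matrix of the linearized location problem, which is equivalent to minimizing the volume of the location error ellipsoid. A heavily simplified sketch for an epicenter in a homogeneous medium (the actual algorithm uses a 3D velocity model, noise and attenuation models, and more criteria; all names here are illustrative):

```python
import itertools
import math

def info_det(stations, source, v=3.5):
    """det of the 2-D Fisher information G^T G for an epicenter.

    In a homogeneous medium with velocity v, each row of the travel-time
    Jacobian G is the horizontal slowness vector pointing source -> station.
    """
    gxx = gxy = gyy = 0.0
    for sx, sy in stations:
        dx, dy = sx - source[0], sy - source[1]
        r = math.hypot(dx, dy)
        gx, gy = dx / (v * r), dy / (v * r)
        gxx += gx * gx
        gxy += gx * gy
        gyy += gy * gy
    return gxx * gyy - gxy * gxy

def best_network(candidates, k, source):
    """Brute-force D-optimal choice of k stations from candidate sites."""
    return max(itertools.combinations(candidates, k),
               key=lambda s: info_det(s, source))
```

Even this toy version reproduces the key behavior: azimuthally spread stations carry far more location information than a collinear array, which constrains only one coordinate.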

  12. Seismic bearing capacity and settlements of foundations

    SciTech Connect

    Richards, R. Jr.; Elms, D.G.; Budhu, M.

    1993-04-01

    Field and laboratory observations of seismic settlements of shallow foundations on granular soils that are not attributable to changes in density or liquefaction are explained in terms of seismic degradation of bearing capacity. Limit analysis using a Coulomb-type mechanism including inertial forces in the soil and on the footing gives expressions for seismic bearing capacity factors that are directly related to their static counterparts. Comparison of the two clearly depicts the rapid deterioration of the overall foundation capacity with increasing acceleration. Such periodic inertial fluidization makes finite settlements possible even in moderate earthquakes. Reduction in foundation capacity is due to both the seismic degradation of soil strength and the lateral inertial forces transmitted by shear to the foundation through the structure and any surcharge. A straightforward sliding-block procedure with examples is also presented for computing these settlements due to loss of bearing capacity for short time periods. The approach also leads to a design procedure for footings based on limiting seismic settlements to a prescribed value.
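The sliding-block procedure mentioned above is Newmark-style: whenever the ground acceleration exceeds the critical (yield) acceleration at which bearing capacity is momentarily lost, the footing accumulates relative velocity, and the settlement is the time integral of that velocity. A minimal one-way sliding sketch (the function name and the one-directional simplification are assumptions, not the paper's full procedure):

```python
def newmark_settlement(accel, dt, a_crit):
    """Newmark sliding-block displacement for a sampled acceleration record.

    accel: ground acceleration samples; dt: sample interval;
    a_crit: critical (yield) acceleration. Sliding starts when accel
    exceeds a_crit and continues until the relative velocity decays to zero.
    """
    v = d = 0.0
    for a in accel:
        if a > a_crit or v > 0.0:
            v = max(v + (a - a_crit) * dt, 0.0)  # relative velocity, no reversal
            d += v * dt                           # accumulate permanent settlement
    return d
```

Limiting the computed settlement to a prescribed value then gives the design check the abstract describes.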

  13. Constraints on Subglacial Conditions from Seismicity

    NASA Astrophysics Data System (ADS)

    Lipovsky, B.; Olivo, D. C.; Dunham, E. M.

    2014-12-01

    A family of physics-based models designed to explain emergent, bandlimited, "tremor-like" seismograms sheds light on subglacial and englacial conditions. We consider two such models. In the first, a water-filled fracture hosts resonant modes; the seismically observable quality factor and characteristic frequency of these modes constrain the fracture length and aperture. In the second model, seismicity is generated by repeating stick-slip events on a fault patch (portion of the glacier bed) with sliding described by rate- and state-dependent friction laws. Wave propagation phenomena may additionally generate bandlimited seismic signals. These models make distinct predictions that may be used to address questions of glaciological concern. Laboratory friction experiments show that small, repeating earthquakes most likely occur at the ice-till interface and at conditions below the pressure melting point. These laboratory friction values, when combined with observed ice surface velocities, may also be used to constrain basal pore pressure. In contrast, seismic signals indicative of water-filled basal fractures suggest that, at least locally, temperatures are above the pressure melting point. We present a simple diagnostic test between these two processes that concerns the relationship between the multiple seismic spectral peaks generated by each process. Whereas repeating earthquakes generate evenly spaced spectral peaks through the Dirac comb effect, hydraulic fracture resonance, as a result of dispersive propagation of waves along the crack, generates spectral peaks that are not evenly spaced.

  14. Overview of seismic considerations at the Paducah Gaseous Diffusion Plant

    SciTech Connect

    Hunt, R.J.; Stoddart, W.C.; Burnett, W.A.; Beavers, J.E.

    1992-10-01

    This paper presents an overview of seismic considerations at the Paducah Gaseous Diffusion Plant (PGDP), which is managed by Martin Marietta Energy Systems, Inc., for the Department of Energy (DOE). The overview describes the original design, the seismic evaluations performed for the Safety Analysis Report (SAR) issued in 1985, and current evaluations and designs that address revised DOE requirements. Future plans to ensure that changes in requirements and knowledge are addressed are also described.

  15. Effects of Large and Small-Source Seismic Surveys on Marine Mammals and Sea Turtles

    NASA Astrophysics Data System (ADS)

    Holst, M.; Richardson, W. J.; Koski, W. R.; Smultea, M. A.; Haley, B.; Fitzgerald, M. W.; Rawson, M.

    2006-05-01

    L-DEO implements a marine mammal and sea turtle monitoring and mitigation program during its seismic surveys. The program consists of visual observations, mitigation, and/or passive acoustic monitoring (PAM). Mitigation includes ramp ups, powerdowns, and shutdowns of the seismic source if marine mammals or turtles are detected in or about to enter designated safety radii. Visual observations for marine mammals and turtles have taken place during all 11 L-DEO surveys since 2003, and PAM was done during five of those. Large sources were used during six cruises (10 to 20 airguns; 3050 to 8760 in3; PAM during four cruises). For two interpretable large-source surveys, densities of marine mammals were lower during seismic than non-seismic periods. During a shallow-water survey off Yucatán, delphinid densities during non-seismic periods were 19x higher than during seismic; however, this number is based on only 3 sightings during seismic and 11 sightings during non-seismic. During a Caribbean survey, densities were 1.4x higher during non-seismic. The mean closest point of approach (CPA) for delphinids for both cruises was significantly farther during seismic (1043 m) than during non-seismic (151 m) periods (Mann-Whitney U test, P < 0.001). Large whales were only seen during the Caribbean survey; mean CPA during seismic was 1722 m compared to 1539 m during non-seismic, but sample sizes were small. Acoustic detection rates with and without seismic were variable for three large-source surveys with PAM, with rates during seismic ranging from 1/3 to 6x those without seismic (n = 0 for fourth survey). The mean CPA for turtles was closer during non-seismic (139 m) than seismic (228 m) periods (P < 0.01). Small-source surveys used up to 6 airguns or 3 GI guns (75 to 1350 in3). During a Northwest Atlantic survey, delphinid densities during seismic and non-seismic were similar. However, in the Eastern Tropical Pacific, delphinid densities during non-seismic were 2x those during seismic. During a survey in Alaska, densities of large whales were 4.5x greater during non-seismic than seismic. In contrast, densities of Dall's porpoise were ~2x greater during seismic than during non-seismic; they also approached closer to the vessel during seismic (622 m) than during non-seismic (1044 m), though not significantly so (P = 0.16). CPAs for all other marine mammal groups sighted during small-source surveys were similar during seismic and non-seismic. For the one small-source survey with PAM, the acoustic detection rate during seismic was 1/3 of that without seismic. The mean CPA for turtles was 120 m during non-seismic and 285 m during seismic periods (P < 0.001). The large-source results suggest that, with operating airguns, some cetaceans tended to avoid the immediate area but often continued calling. Some displacement was also apparent during three interpretable small-source surveys, but the evidence was less clear than for large-source surveys. With both large and small sources, although some cetaceans avoided the airguns and vessel, others came to bowride during seismic operations. Sea turtles showed localized avoidance during large and small-source surveys.

  16. Regional seismic discrimination research at LLNL

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.; Patton, H.J.; Jarpe, S.; Glenn, L.

    1995-10-01

    The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve the understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report the authors discuss preliminary efforts to geophysically characterize the Middle East and North Africa. They show that the remarkable stability of coda allows one to develop physically based, stable single station magnitude scales in new regions. They then discuss progress to date on evaluating and improving physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. The authors apply this modeling ability to develop improved discriminants including slopes of P to S ratios. They find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally they discuss development and use of new coda and waveform modeling tools to investigate special events.

  17. LLNL's regional seismic discrimination research

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.

    1995-07-01

    The ability to negotiate and verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve our understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report we discuss our preliminary efforts to geophysically characterize two regions, the Korean Peninsula and the Middle East-North Africa. We show that the remarkable stability of coda allows us to develop physically based, stable single station magnitude scales in new regions. We then discuss our progress to date on evaluating and improving our physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. We apply this modeling ability to develop improved discriminants including slopes of P to S ratios. We find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally we discuss our development and use of new coda and waveform modeling tools to investigate special events.

  18. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that the data collection and processing are done in the "cloud" (the Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of the waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The location of fixed sensors is displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.
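The block-wide-cell display described above amounts to reducing station readings to a per-cell maximum, so raw coordinates never leave the aggregation step. A hedged sketch of that reduction (not CSN's actual pipeline; the cell indexing scheme and all names are assumptions):

```python
def peak_ground_map(readings, cell=0.01):
    """Reduce (lat, lon, peak acceleration) readings to a max per map cell.

    Station positions are only ever reported as integer cell indices at
    `cell`-degree resolution, supporting a block-level privacy policy.
    """
    grid = {}
    for lat, lon, pga in readings:
        key = (int(lat // cell), int(lon // cell))  # cell indices, not raw coords
        grid[key] = max(grid.get(key, 0.0), pga)
    return grid
```

A count-based variant of the same reduction (suppressing cells with fewer than two sensors) would implement the two-per-cell display rule for mobile sensors.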

  19. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    NASA Astrophysics Data System (ADS)

    Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Masturyono; Rudiyanto, Ariska; Pranata, Bayu; Muzli; Widodo, Handi Sulistyo; Sudrajat, Ajat; Sugiharto, Anton

    2015-04-01

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on the 29th of May 2006 following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign aiming to identify the occurrence and the location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and of spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering studies, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.
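The ambient-noise step described above rests on the result that the cross-correlation of diffuse noise recorded at two stations approximates the inter-station Green's function, with the peak lag revealing the inter-station travel time. A minimal single-window sketch (real processing whitens, filters, and stacks many windows; the function name is illustrative):

```python
import numpy as np

def noise_crosscorrelation(u1, u2, max_lag):
    """Cross-correlation C(L) = sum_t u1[t] * u2[t+L] for lags -max_lag..max_lag.

    Stacking many such window correlations approximates the
    inter-station Green's function.
    """
    n = len(u1)
    out = []
    for L in range(-max_lag, max_lag + 1):
        a = u1[max(0, -L): n - max(0, L)]   # overlapping portion of trace 1
        b = u2[max(0, L): n - max(0, -L)]   # shifted portion of trace 2
        out.append(float(np.dot(a, b)))
    return out
```

Measuring how the peak lag varies with frequency band is what exposes the Rayleigh-wave dispersion used for the velocity inversion.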

  20. Enhancing Seismic Monitoring Capability for Hydraulic Fracturing Induced Seismicity in Canada

    NASA Astrophysics Data System (ADS)

    Kao, H.; Cassidy, J. F.; Farahbod, A.; Lamontagne, M.

    2012-12-01

    The amount of natural gas produced from unconventional sources, such as shale gas, has increased dramatically over the last decade. One of the key factors in the success of shale gas production is the application of hydraulic fracturing (also known as "fracking") to facilitate the efficient recovery of natural gas from shale matrices. As fracking operations become routine in all major shale gas fields, their potential to induce local earthquakes at some locations has become a public concern. To address this concern, Natural Resources Canada has initiated a research effort to investigate the potential links between fracking operations and induced seismicity in some major shale gas basins of Canada. This federal-provincial collaborative research aims to assess whether shale gas fracking can alter the regional pattern of background seismicity and, if so, what the relationship is between how fracking is conducted and the maximum magnitude of induced seismicity. Other objectives include the investigation of the time scale of the interaction between fracking events and induced seismicity and the evaluation of induced seismicity potential for shale gas basins under different tectonic/geological conditions. The first phase of this research is to enhance the detection and monitoring capability for seismicity possibly related to shale gas recovery in Canada. Densification of the Canadian National Seismograph Network (CNSN) is currently underway in northeast British Columbia, where fracking operations are taking place. Additional seismic stations are planned for major shale gas basins in other regions where fracking might be likely in the future. All newly established CNSN stations are equipped with broadband seismographs with real-time continuous data transmission.
The design goal of the enhanced seismic network is to significantly lower the detection threshold such that the anticipated low-magnitude earthquakes that might be related to fracking operations can be identified and located shortly after their occurrence.

  1. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    SciTech Connect

    Karyono; Mazzini, Adriano; Sugiharto, Anton; Lupi, Matteo; Syafri, Ildrem; Masturyono,; Rudiyanto, Ariska; Pranata, Bayu; Muzli,; Widodo, Handi Sulistyo; Sudrajat, Ajat

    2015-04-24

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on 29 May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign to identify the occurrence and location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering studies, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at the 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.

  2. Generalized seismic wavelets

    NASA Astrophysics Data System (ADS)

    Wang, Yanghua

    2015-11-01

    The Ricker wavelet, which is often employed in seismic analysis, has a symmetrical form. Seismic wavelets observed from field data, however, are commonly asymmetric with respect to the time variation. In order to better represent seismic signals, asymmetrical wavelets are defined systematically as fractional derivatives of a Gaussian function in which the Ricker wavelet becomes just a special case with the integer derivative of order 2. The fractional value and a reference frequency are two key parameters in the generalization. Frequency characteristics, such as the central frequency, the bandwidth, the mean frequency and the deviation, may be expressed analytically in closed forms. In practice, once the statistical properties (the mean frequency and deviation) are numerically evaluated from the discrete Fourier spectra of seismic data, these analytical expressions can be used to uniquely determine the fractional value and the reference frequency, and subsequently to derive various frequency quantities needed for the wavelet analysis. It is demonstrated that field seismic signals, recorded at various depths in a vertical borehole, can be closely approximated by generalized wavelets, defined in terms of fractional values and reference frequencies.
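
    As a concrete illustration of the order-2 special case mentioned above, the Ricker wavelet can be evaluated directly; this is a hedged sketch, and the peak frequency and time grid below are arbitrary choices, not taken from the record:

```python
import math

def ricker(t, f):
    """Ricker wavelet: normalized (negative) 2nd derivative of a Gaussian.
    f is the peak frequency in Hz, t the time in seconds."""
    x = (math.pi * f * t) ** 2
    return (1.0 - 2.0 * x) * math.exp(-x)

# Sample a 30 Hz wavelet on a +/-100 ms grid (1 ms steps)
samples = [ricker(i / 1000.0, 30.0) for i in range(-100, 101)]
```

    The symmetry of this special case is visible immediately (ricker(t, f) == ricker(-t, f)); the fractional-derivative generalization described in the abstract breaks exactly this symmetry.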

  3. Landslide seismic magnitude

    NASA Astrophysics Data System (ADS)

    Lin, C. H.; Jan, J. C.; Pu, H. C.; Tu, Y.; Chen, C. C.; Wu, Y. M.

    2015-11-01

    Landslides have become one of the most deadly natural disasters on earth, due not only to a significant increase in extreme climate events caused by global warming, but also to rapid economic development in areas of topographic relief. How to detect landslides using a real-time system has become an important question for reducing possible landslide impacts on human society. However, traditional detection of landslides, either through direct surveys in the field or remote sensing images obtained via aircraft or satellites, is highly time consuming. Here we analyze very long period seismic signals (20-50 s) generated by large landslides, such as those triggered by Typhoon Morakot, which passed through Taiwan in August 2009. In addition to successfully locating 109 large landslides, we define landslide seismic magnitude based on an empirical formula: Lm = log(A) + 0.55 log(Δ) + 2.44, where A is the maximum displacement (μm) recorded at one seismic station and Δ is its distance (km) from the landslide. We conclude that both the location and seismic magnitude of large landslides can be rapidly estimated from broadband seismic networks for both academic and applied purposes, similar to earthquake monitoring. We suggest a real-time algorithm be set up for routine monitoring of landslides in places where they pose a frequent threat.
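
    The empirical formula above is straightforward to apply; a minimal sketch, assuming base-10 logarithms (conventional for magnitude scales) and made-up amplitude and distance values:

```python
import math

def landslide_magnitude(a_um, dist_km):
    """Lm = log10(A) + 0.55*log10(D) + 2.44, with A the maximum
    displacement (micrometers) at one station and D its distance (km)."""
    return math.log10(a_um) + 0.55 * math.log10(dist_km) + 2.44

# A 1000-um peak displacement recorded 100 km from the landslide
lm = landslide_magnitude(1000.0, 100.0)  # 3.0 + 1.1 + 2.44 = 6.54
```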

  4. Gravity of the New Madrid seismic zone; a preliminary study

    USGS Publications Warehouse

    Langenheim, V.E.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Mo. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  5. Third Quarter Hanford Seismic report for Fiscal year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-09-11

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 356 triggers during the third quarter of fiscal year 2003. Of these triggers, 141 were earthquakes, 34 of which were located in the Hanford Seismic Network area. Stratigraphically, 15 occurred in the Columbia River basalt, 13 in the pre-basalt sediments, and 6 in the crystalline basement. Geographically, 22 earthquakes occurred in swarm areas, 1 was associated with a major geologic structure, and 11 were classified as random events. During the third quarter, an earthquake swarm consisting of 15 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur at the base of the Columbia River Basalt Group.

  6. Experimental Techniques Verified for Determining Yield and Flow Surfaces

    NASA Technical Reports Server (NTRS)

    Lerch, Brad A.; Ellis, Rod; Lissenden, Cliff J.

    1998-01-01

    Structural components in aircraft engines are subjected to multiaxial loads when in service. For such components, life prediction methodologies are dependent on the accuracy of the constitutive models that determine the elastic and inelastic portions of a loading cycle. A threshold surface (such as a yield surface) is customarily used to differentiate between reversible and irreversible flow. For elastoplastic materials, a yield surface can be used to delimit the elastic region in a given stress space. The concept of a yield surface is central to the mathematical formulation of a classical plasticity theory, but at elevated temperatures, material response can be highly time dependent. Thus, viscoplastic theories have been developed to account for this time dependency. Since the key to many of these theories is experimental validation, the objective of this work (refs. 1 and 2) at the NASA Lewis Research Center was to verify that current laboratory techniques and equipment are sufficient to determine flow surfaces at elevated temperatures. By probing many times in the axial-torsional stress space, we could define the yield and flow surfaces. A small offset definition of yield (10 με) was used to delineate the boundary between reversible and irreversible behavior so that the material state remained essentially unchanged and multiple probes could be done on the same specimen. The strain was measured with an off-the-shelf multiaxial extensometer that could measure the axial and torsional strains over a wide range of temperatures. The accuracy and resolution of this extensometer were verified by comparing its data with strain gauge data at room temperature. The extensometer was found to have sufficient resolution for these experiments. In addition, the amount of crosstalk (i.e., the accumulation of apparent strain in one direction when strain in the other direction is applied) was found to be negligible. 
Tubular specimens were induction heated to determine the flow surfaces at elevated temperatures. The heating system induced a large amount of noise in the data. By reducing thermal fluctuations and using appropriate data averaging schemes, we could render the noise inconsequential. Thus, accurate and reproducible flow surfaces (see the figure) could be obtained.

  7. Seismic surveys test on Innerhytta Pingo, Adventdalen, Svalbard Islands

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Rossi, Giuliana; Petronio, Lorenzo; Accaino, Flavio; Romeo, Roberto; Wheeler, Walter

    2015-04-01

    We present the preliminary results of an experimental full-wave seismic survey test conducted on the Innerhytta Pingo, located in Adventdalen, Svalbard Islands, Norway. Several seismic surveys were adopted in order to study the Pingo's inner structure, from classical reflection/refraction arrays to seismic tomography and surface wave analysis. The aim of the project IMPERVIA, funded by the Italian PNRA, was the evaluation of the permafrost characteristics beneath this open-system Pingo by the use of seismic investigation, evaluating the best practice in terms of logistic deployment. The survey was done in April-May 2014: we collected 3 seismic lines with different spacing between receivers (from 2.5 m to 5 m), for a total length of more than 1 km. We collected data with different vertical geophones (with natural frequencies of 4.5 Hz and 14 Hz) as well as with a seismic snow-streamer. We tested different seismic sources (hammer, seismic gun, fire crackers, and heavy weight drop), and we carefully verified geophone coupling in order to evaluate the different responses. In such peculiar conditions we noted that fire-crackers allow the best signal-to-noise ratio for refraction/reflection surveys. To ensure the best geophone coupling with the frozen soil, we dug snow pits to remove the snow-cover effect. On the other hand, for the surface wave methods, the very high velocity of the permafrost strongly limits the generation of long wavelengths, with the explosive sources as with the common sledgehammer. The only source capable of generating low frequencies was a heavy drop weight system, which allows analysis of surface wave dispersion below 10 Hz. Preliminary data analysis evidences marked velocity inversions and strong velocity contrasts at depth. The combined use of surface and body waves highlights the presence of a heterogeneous soil deposit level beneath a thick layer of permafrost. 
This is the level that hosts the water circulation from depth controlling the Pingo structure evolution.

  8. Realities of verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants

    SciTech Connect

    Swindle, D.W.

    1990-03-01

    Over a two and one-half year period beginning in 1981, representatives of six countries (United States, United Kingdom, Federal Republic of Germany, Australia, The Netherlands, and Japan) and the inspectorate organizations of the International Atomic Energy Agency and EURATOM developed and agreed to a technically sound approach for verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants. This effort, known as the Hexapartite Safeguards Project (HSP), led to the first international consensus on techniques and requirements for effective verification of the absence of weapons-grade nuclear materials production. Since that agreement, research and development have continued on the radiation detection technology-based technique that technically confirms the HSP goal is achievable. However, the realities of achieving the HSP goal of effective technical verification have not yet been fully attained. Issues such as design and operating conditions unique to each gas centrifuge plant, concern about the potential for sensitive technology disclosures, and on-site support requirements have hindered full implementation and operator support of the HSP agreement. In future arms control treaties that may limit or monitor fissile material production, the negotiators must recognize and account for the realities and practicalities in verifying the absence of HEU production. This paper will describe the experiences and realities of trying to achieve the goal of developing and implementing an effective approach for verifying the absence of HEU production. 3 figs.

  9. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., a fixed site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. 
    Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in a tabular format. Our study shows that in the case of the Ajaristkali HPP study area, a significant contribution to seismic hazard comes from local sources with quite low Mmax values, and the two attenuation laws considered give quite different PGA and SA values.
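
    As a minimal illustration of the Poisson occurrence model and a (simplified, untruncated) Gutenberg-Richter magnitude distribution mentioned above — this is not the CRISIS2007 implementation, and the a and b values are invented:

```python
import math

def gr_annual_rate(m, a=4.0, b=1.0):
    """Annual rate of events with magnitude >= m, from log10 N = a - b*m."""
    return 10.0 ** (a - b * m)

def poisson_exceedance(rate, years):
    """P(at least one event in `years`) under a Poisson process."""
    return 1.0 - math.exp(-rate * years)

rate = gr_annual_rate(5.0)          # 0.1 events/year with a=4, b=1
p50 = poisson_exceedance(rate, 50)  # exceedance probability over a 50-year life
```

    A full PSHA integrates such rates over all sources, magnitudes, and distances through a GMPE; this fragment only shows the occurrence-model arithmetic.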

  10. Magnitude correlations in global seismicity

    SciTech Connect

    Sarlis, N. V.

    2011-08-15

    By employing natural time analysis, we analyze the worldwide seismicity and study the existence of correlations between earthquake magnitudes. We find that global seismicity exhibits nontrivial magnitude correlations for earthquake magnitudes greater than M{sub w}6.5.

  11. Controllable seismic source

    DOEpatents

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrell, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2015-09-29

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  12. Controllable seismic source

    DOEpatents

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrel, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2014-08-19

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  13. Induced seismicity. Final report

    SciTech Connect

    Segall, P.

    1997-09-18

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Just-completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interacting effects of dilatancy, friction, and shear heating on the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements along with seismic data are used to test and constrain thermo-mechanical models.

  14. Parallel computation of seismic analysis of high arch dam

    NASA Astrophysics Data System (ADS)

    Chen, Houqun; Ma, Huaifa; Tu, Jin; Cheng, Guangqing; Tang, Juzhen

    2008-03-01

    Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms of the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, all the following factors are involved, such as the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combining effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  15. Robust discrimination of human footsteps using seismic signals

    NASA Astrophysics Data System (ADS)

    Faghfouri, Aram E.; Frish, Michael B.

    2011-06-01

    This paper provides a statistical analysis method for detecting and discriminating different seismic activity sources, such as humans, animals, and vehicles, using their seismic signals. A five-step process is employed for this purpose: (1) a set of signals with known seismic activity is utilized to verify the algorithms; (2) for each data file, the vibration signal is segmented by a sliding window and its noise is reduced; (3) a set of features capturing the statistical and spectral properties of each window is extracted; this set is formed as an array and is called a feature array; (4) a portion of the labeled feature arrays is utilized to train a classifier for discriminating different types of signals; and (5) the rest of the labeled feature arrays are employed to test the performance of the developed classifier. The results indicate that the classifier achieves a probability of detection (pd) above 95% and a false alarm rate (pfa) less than 1%.
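
    The five-step pipeline can be caricatured end-to-end with synthetic signals; everything below (the signal models, the two features, and a nearest-centroid stand-in for the paper's unspecified classifier) is invented for illustration:

```python
import math
import random

random.seed(0)

def footstep_signal(n=512):
    # Impulsive source: sparse spikes over low background noise
    s = [random.gauss(0.0, 0.05) for _ in range(n)]
    for i in range(0, n, 64):
        s[i] += 1.0
    return s

def vehicle_signal(n=512):
    # Continuous source: low-frequency sinusoid plus noise
    return [0.4 * math.sin(2 * math.pi * 0.05 * i) + random.gauss(0.0, 0.05)
            for i in range(n)]

def features(window):
    # Step 3: a tiny "feature array" per window (RMS and crest factor)
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    return (rms, max(abs(x) for x in window) / rms)

def windows(signal, size=128, step=64):
    # Step 2: sliding-window segmentation
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

# Steps 1-3: labeled signals -> labeled feature arrays
train = [(features(w), 'footstep') for w in windows(footstep_signal())] + \
        [(features(w), 'vehicle') for w in windows(vehicle_signal())]

# Step 4: train a nearest-centroid classifier on the labeled features
def centroid(label):
    fs = [f for f, lbl in train if lbl == label]
    return tuple(sum(c) / len(fs) for c in zip(*fs))

centroids = {lbl: centroid(lbl) for lbl in ('footstep', 'vehicle')}

def classify(window):
    f = features(window)
    return min(centroids, key=lambda lbl: sum((a - b) ** 2
               for a, b in zip(f, centroids[lbl])))

# Step 5: evaluate on freshly generated signals
test_hits = sum(classify(w) == 'footstep' for w in windows(footstep_signal()))
```

    The crest factor alone separates impulsive footsteps from continuous vehicle vibration in this toy setup; the paper's real feature set and classifier are not specified in the abstract.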

  16. Measurements verifying the optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa M.

    This thesis concentrates on laboratory measurements of the Electron Drift Instrument (EDI), focussing primarily on the optics of the system. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This drift velocity is determined using two electron beams directed perpendicular to the magnetic field returning to be detected by the spacecraft. The EDI will be used on the Magnetospheric Multi-Scale Mission. The EDI optics testing process takes measurements of the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for optimization of the desired optics state via simulation. The optics state tables were created in simulations, and we are using these measurements to confirm their accuracy. The setup consisted of an apparatus, made up of the EDI's optics and sensor electronics, secured to a two-axis gear arm inside a vacuum chamber. An electron beam was projected at the apparatus, which used the EDI optics to focus the beam through the microchannel plates and onto the circular 32-pad annular ring that makes up the sensor. The counts per pad over an interval of 1 ms were averaged over 25 samples and plotted in MATLAB. The plotted measurements agreed well with the simulations, providing confidence in the EDI instrument.

  17. Measurements Verifying the Optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa; Kletzing, Craig; Bounds, Scott; Sigsbee, Kristine M.

    2015-04-01

    Magnetic reconnection is the process of breaking and reconnecting of opposing magnetic field lines, and is often associated with tremendous energy transfer. The energy transferred by reconnection directly affects people through its influence on geospace weather and technological systems - such as telecommunication networks, GPS, and power grids. However, the mechanisms that cause magnetic reconnection are not well understood. The Magnetospheric Multi-Scale Mission (MMS) will use four spacecraft in a pyramid formation to make three-dimensional measurements of the structures in magnetic reconnection occurring in the Earth's magnetosphere. The spacecraft will repeatedly sample these regions for a prolonged period of time to gather data in more detail than has been previously possible. MMS is scheduled to be launched in March of 2015. The Electron Drift Instrument (EDI) will be used on MMS to measure the electric fields associated with magnetic reconnection. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This paper concentrates on measurements of the EDI's optics system. The testing process includes measuring the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state. The measurements agree well with simulations, and we are confident in the performance of the EDI instrument.

  18. A credit card verifier structure using diffraction and spectroscopy concepts

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

    We propose and experimentally demonstrate an angle-multiplexing based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine a number of broadband light sources on it, one at a time and each at a different incident angle, such that a different color spectrum per incident angle is diffracted and separated in space. In this way, the number of pixels of each color plane is investigated. Then we apply a feed-forward back-propagation neural network configuration to separate counterfeit credit cards from real ones. Our experimental demonstration using two off-the-shelf broadband white light emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer can identify all 69 counterfeit credit cards from eight real credit cards.

  19. Garbage collection can be made real-time and verifiable

    NASA Technical Reports Server (NTRS)

    Hino, James H.; Ross, Charles L.

    1988-01-01

    An efficient means of memory reclamation (also known as Garbage Collection) is essential for Machine Intelligence applications where dynamic storage allocation is desired or required. Solutions for real-time systems must introduce very small processing overhead and must also provide for the verification of the software in order to meet the application time budgets and to verify the correctness of the software. Garbage Collection (GC) techniques are proposed for symbolic processing systems which may simultaneously meet both real-time requirements and verification requirements. The proposed memory reclamation technique takes advantage of the strong points of both the earlier Mark and Sweep technique and the more recent Copy Collection approaches. At least one practical implementation of these new GC techniques has already been developed and tested on a very-high performance symbolic computing system. Complete GC processing of all generated garbage has been demonstrated to require as little as a few milliseconds to perform. This speed enables the effective operation of the GC function as either a background task or as an actual part of the application task itself.
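
    For readers unfamiliar with the Mark and Sweep technique named above, a minimal reachability sketch follows; the dictionary-based heap model is an invented illustration, not the high-performance implementation the record describes:

```python
def mark_sweep(heap, roots):
    """heap: {obj_id: [ids it references]}; roots: ids reachable a priori.
    Returns the set of ids reclaimed as garbage."""
    marked = set()
    stack = list(roots)
    while stack:                       # mark phase: trace from the roots
        obj = stack.pop()
        if obj not in marked:
            marked.add(obj)
            stack.extend(heap.get(obj, []))
    # sweep phase: anything unmarked is unreachable and reclaimed
    return set(heap) - marked

heap = {'a': ['b'], 'b': [], 'c': ['d'], 'd': ['c']}  # c <-> d is a cycle
garbage = mark_sweep(heap, roots=['a'])  # the unreachable cycle is reclaimed
```

    Unlike reference counting, tracing collectors of this family reclaim cycles; the real-time variants discussed above bound how long the mark and sweep phases may interrupt the application.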

  20. Stress-Release Seismic Source for Seismic Velocity Measurement in Mines

    NASA Astrophysics Data System (ADS)

    Swanson, P. L.; Clark, C.; Richardson, J.; Martin, L.; Zahl, E.; Etter, A.

    2014-12-01

    Accurate seismic event locations are needed to delineate roles of mine geometry, stress and geologic structures in developing rockburst conditions. Accurate absolute locations are challenging in mine environments with rapid changes in seismic velocity due to sharp contrasts between individual layers and large time-dependent velocity gradients attending excavations. Periodic use of controlled seismic sources can help constrain the velocity in this continually evolving propagation medium comprising the miners' workplace. With a view to constructing realistic velocity models in environments in which use of explosives is problematic, a seismic source was developed subject to the following design constraints: (i) suitable for use in highly disturbed zones surrounding mine openings, (ii) able to produce usable signals over km-scale distances in the frequency range of typical coal mine seismic events (~10-100 Hz), (iii) repeatable, (iv) portable, (v) non-disruptive to mining operations, and (vi) safe for use in potentially explosive gaseous environments. Designs of the compressed load column seismic source (CLCSS), which generates a stress, or load, drop normal to the surface of mine openings, and the fiber-optic based source-initiation timer are presented. Tests were conducted in a coal mine at a depth of 500 m (1700 ft) and signals were recorded on the surface with a 72-ch (14 Hz) exploration seismograph for load drops of 150-470 kN (16-48 tons). Signal-to-noise ratios of unfiltered signals ranged from ~200 immediately above the source (500 m (1700 ft)) to ~8 at the farthest extent of the array (slant distance of ~800 m (2600 ft)), suggesting the potential for use over longer range. Results are compared with signals produced by weight drop and sledge hammer sources, indicating the superior waveform quality for first-arrival measurements with the CLCSS seismic source.

  1. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  2. Synthesis of artificial spectrum-compatible seismic accelerograms

    NASA Astrophysics Data System (ADS)

    Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

    2014-08-01

    The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis and requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, few actual records of different seismic intensities are available for many regions. In addition, a large number of accelerograms is required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which are useful in earthquake engineering for the dynamic analysis and design of buildings. In the proposed method, a single natural earthquake record is decomposed into amplitude and frequency components using the Hilbert-Huang transform. The method is illustrated by studying 20 natural seismic records with different characteristics, such as frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature.
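    As context for the spectral matching above, the acceleration response spectrum that a spectrum-compatible accelerogram must reproduce can be computed by stepping a damped single-degree-of-freedom oscillator through the record. The following sketch (Newmark average-acceleration integration, assumed 5% damping; an illustration of the target quantity, not the authors' Hilbert-Huang method) computes a pseudo-acceleration spectrum:

```python
import numpy as np

def response_spectrum(accel, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground-motion record.

    For each period, solves u'' + 2*zeta*w*u' + w^2*u = -a_g(t) for a
    unit-mass SDOF oscillator with the Newmark average-acceleration
    method, then returns Sa = w^2 * max|u|.
    """
    accel = np.asarray(accel, dtype=float)
    sa = np.empty(len(periods))
    for i, T in enumerate(periods):
        w = 2.0 * np.pi / T
        k, c = w * w, 2.0 * zeta * w           # stiffness, damping (m = 1)
        kt = k + 2.0 * c / dt + 4.0 / dt**2    # effective stiffness
        u = v = 0.0
        a = -accel[0]                          # initial oscillator acceleration
        umax = 0.0
        for j in range(len(accel) - 1):
            dp = -(accel[j + 1] - accel[j])
            dph = dp + (4.0 / dt + 2.0 * c) * v + 2.0 * a
            du = dph / kt
            dv = 2.0 / dt * du - 2.0 * v
            da = 4.0 / dt**2 * du - 4.0 / dt * v - 2.0 * a
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        sa[i] = w * w * umax
    return sa
```

A harmonic record peaks the spectrum at the oscillator period matching its own, which is the behavior a spectrum-matching procedure exploits.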

  3. First Quarter Hanford Seismic Report for Fiscal Year 2011

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Clayton, Ray E.; Devary, Joseph L.

    2011-03-31

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 16 local earthquakes during the first quarter of FY 2011. Six earthquakes were located at shallow depths (less than 4 km), seven at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and three at depths greater than 9 km, within the basement. Geographically, thirteen earthquakes were located in known swarm areas and three were classified as random events. The highest-magnitude event (1.8 Mc) was recorded on October 19, 2010 at a depth of 17.5 km, with its epicenter located near the Yakima River between the Rattlesnake Mountain and Horse Heaven Hills swarm areas.

  4. New seismic sensors for footstep detection and other military applications

    NASA Astrophysics Data System (ADS)

    Pakhomov, Alex; Goldburt, Tim

    2004-09-01

    The performance of seismic security systems depends on how well the characteristics of the seismic sensors match the particular application, and current seismic sensors do not yield the best possible results. In addition to identifying the requirements for optimal seismic sensors, we have developed seismic sensors for defense and security applications. We present two different types of seismic sensor: a minuscule, extremely low-cost sensor and a bulk sensor. The minuscule, extremely low-cost sensor is an electret-based geophone for both seismic and acoustic detection systems. This geophone detects small objects, i.e. a walking, running, or crawling person or a small underwater vehicle, moving on the surface, underground, or in the water. It can also detect large objects, i.e. heavy vehicles, trucks, and tanks, and can be used in littoral warfare. The electret-based design significantly improves the technical characteristics: an expanded frequency response range at low frequencies, an improved sensitivity threshold and accuracy of response, and better protection of the sensor from electromagnetic interference. The bulk sensor has an extremely large detection surface, a nanocomposite body in a special-form casing, and a special electronic circuit. These sensors allow detection of footstep signals at high ambient seismic noise levels; however, installation requires significant groundwork effort.

  5. Induced Seismicity Monitoring System

    NASA Astrophysics Data System (ADS)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these include monitoring underground gas migration through detailed tomographic studies of rock properties, cap-rock integrity, and microseismicity over time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly having stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity to magnitude levels only slightly less than that which can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions, and features 9 channels of recording (currently a three-component 4.5 Hz geophone, a MEMS accelerometer, and a microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable flash SD cards of 64 GB or more each. If available, data can be transmitted via cell-phone modem or picked up during site visits. Low power consumption allows for autonomous operation using only a 10-watt solar panel and a gel-cell battery. The system has been successfully tested in long-term (>6 months) remote operations over a wide range of environments, from summer in Arizona to winter above 9000' in the mountains of southern Colorado. Statistically based on-board processing is used for detection, arrival-time picking, back-azimuth estimation, and magnitude estimation from coda waves and acoustic signals.
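    The abstract's statistically based on-board detection is not specified in detail; a classic STA/LTA (short-term average over long-term average) energy trigger is a common baseline for this kind of smart sensor and can be sketched as follows (a hedged illustration with assumed window lengths, not the system's actual algorithm):

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    """STA/LTA ratio of a seismic trace sampled at fs Hz.

    Both averages are trailing windows of signal energy ending at the
    same sample; a detection is declared where the ratio exceeds a
    chosen threshold. Returns the ratio for samples where both
    windows are fully populated.
    """
    e = np.asarray(trace, dtype=float) ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(e)))
    sta = (csum[ns:] - csum[:-ns]) / ns   # window ending at i + ns - 1
    lta = (csum[nl:] - csum[:-nl]) / nl   # window ending at i + nl - 1
    # align both windows to end at the same sample
    return sta[nl - ns:] / np.maximum(lta, 1e-12)
```

On a noise trace with an impulsive arrival, the ratio sits near 1 in the noise and jumps at the event onset, which is what makes it usable for low-power on-board triggering.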

  6. Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8

    SciTech Connect

    Ivanka, Paskaleva; Mihaela, Kouteva; Franco, Vaccari; Panza, Giuliano F.

    2008-07-08

    The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong-motion database is quite limited and therefore not representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure to two main Bulgarian cities has supplied a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking into account, simultaneously and consistently, the contributions to the earthquake ground motion at the target sites of both the seismic source and the seismic wave propagation through the intervening media. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

  7. Evaluation of Seismic Risk of Siberia Territory

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The outcomes of modern geophysical research by the Geophysical Survey SB RAS, directed at studying the geodynamic situation in large industrial and civil centers on the territory of Siberia in order to evaluate the seismic risk of these territories and to predict extreme situations of natural and man-made origin, are presented in this paper. First of all, this concerns the testing and updating of a geoinformation system, developed by the Russian Emergency Ministry, designed for calculations regarding seismic hazard and the response to destructive earthquakes. The GIS database contains catalogues of earthquakes and faults, seismic zonation maps, vectorized city maps, information on industrial and housing stock, data on the character of buildings and population in inhabited places, etc. The geoinformation system allows the following problems to be solved on the basis of probabilistic approaches: estimating the earthquake impact and the forces, facilities, and supplies required for life support of the injured population; determining the consequences of failures at chemically and explosively hazardous facilities; and optimizing the technology and conduct of rescue operations. Using this computer program, maps of earthquake risk have been constructed for several seismically dangerous regions of Siberia. These maps display the probable number of injured people and the relative economic damage from an earthquake that can occur at various sites of the territory according to the seismic zonation map. The obtained maps have made it possible to determine the places where detailed seismological observations should be arranged. In addition, wide-ranging investigations are being carried out on the territory of Siberia using new methods for evaluating the physical state of industrial and civil structures (buildings, hydroelectric power stations, bridges, dams, etc.), high-performance detailed electromagnetic surveys of ground conditions in urban territories, roads, runways, etc., and studies of seismic conditions in large industrial and civil centers.

  8. Capable faulting, environmental effects and seismic landscape in the area affected by the 1997 Umbria-Marche (Central Italy) seismic sequence

    NASA Astrophysics Data System (ADS)

    Guerrieri, L.; Blumetti, A. M.; Esposito, E.; Michetti, A. M.; Porfido, S.; Serva, L.; Tondi, E.; Vittori, E.

    2009-10-01

    The September-October 1997 seismic sequence in the Umbria-Marche regions of Central Italy has been one of the best studied from the seismological, macroseismic, and geological points of view. Numerous papers were published in the period immediately after the seismic sequence, providing a significant database of effects triggered by the earthquakes on the natural environment. In the following years, further studies have provided additional pieces of evidence that allow the seismic sequence to be better related to its geological background. Moreover, recent developments in the characterization of coseismic environmental effects provide new horizons in seismic hazard assessment (SHA) procedures, which should take into account even the long-term geomorphological and geological features resulting from repeated characteristic earthquakes (the concept of the "seismic landscape"). This paper reviews the current state of knowledge on the 1997 Umbria-Marche seismic sequence, with particular regard to the coseismic environmental effects (primary and secondary) that have been used for ESI seismic intensity assessment, in order to verify whether i) they are consistent with geological, seismological, and macroseismic data in the location and characterization of the seismogenic structure, and ii) they fit the "seismic landscape" features that mark the epicentral area of Colfiorito.

  9. Software for Verifying Image-Correlation Tie Points

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Yagi, Gary

    2008-01-01

    A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes not perfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present computer program relies on the availability of a left-to-right correlation map in addition to the usual right-to-left correlation map. The additional map must be generated, which doubles the processing time. Such increased time can now be afforded in the data-processing pipeline, since the time for map generation has been reduced from about 60 minutes to 3 minutes by the parallelization discussed in the previous article; the parallel cluster processing, therefore, enabled this better science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x'',y'') in the left image. If (x,y) and (x'',y'') are identical, then the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that allows for round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.
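    The round-trip criterion described above can be sketched directly. The array layout, function name, and tolerance below are assumptions for illustration, not the program's actual interface:

```python
import numpy as np

def verify_tie_points(lr_map, rl_map, tol=1.0):
    """Keep tie points whose left->right->left round trip closes.

    lr_map[y, x] holds (x', y'), the right-image match of left pixel
    (x, y); rl_map[y', x'] maps right pixels back to the left image.
    A tie point is verified when the round trip lands within `tol`
    pixels of its start, which relaxes the perfect-match criterion
    to admit round-off error and a small amount of noise.
    """
    h, w = lr_map.shape[:2]
    ok = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xp, yp = lr_map[y, x]
            xi, yi = int(round(xp)), int(round(yp))
            if not (0 <= xi < rl_map.shape[1] and 0 <= yi < rl_map.shape[0]):
                continue   # match fell outside the right image
            xpp, ypp = rl_map[yi, xi]
            ok[y, x] = (xpp - x) ** 2 + (ypp - y) ** 2 <= tol ** 2
    return ok
```

On a synthetic pair where the right image is the left shifted by two pixels, every in-bounds tie point verifies except any whose return entry has been corrupted.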

  10. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-Russian nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral, or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to give domestic, bilateral, and multilateral audiences confidence that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components, and fissile materials from military stocks will need to be established. 
This paper is intended to provide useful background information for establishing a conceptual approach to a five-year technical program plan for research and development of nuclear arms reductions verification and transparency technologies and procedures.

  11. Verifying and Postprocessing the Ensemble Spread-Error Relationship

    NASA Astrophysics Data System (ADS)

    Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli

    2013-04-01

    With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own forecast error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well-calibrated or not? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Secondly, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit can provide a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member Western US temperature forecasts for the U.S. Army Test and Evaluation Command and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both of these systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships. 
We will describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecast, and present verification statistics using error-spread metrics, along with figures from operational ensemble forecasts before and after calibration.
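    The core spread-error diagnostic, the correlation between ensemble spread and the absolute error of the ensemble mean, can be computed as below. This is a minimal sketch on synthetic data; the paper's metrics relating this correlation to its theoretical upper limit are not reproduced here:

```python
import numpy as np

def spread_error_correlation(ens, obs):
    """Pearson correlation between ensemble spread and forecast error.

    ens: (n_times, n_members) forecasts; obs: (n_times,) observations.
    Spread is the across-member standard deviation; error is the
    absolute error of the ensemble mean.
    """
    spread = ens.std(axis=1, ddof=1)
    err = np.abs(ens.mean(axis=1) - obs)
    return np.corrcoef(spread, err)[0, 1]
```

A usefully dispersive ensemble shows a clearly positive correlation; how close it comes to the theoretical limit depends on how variable the spread itself is, which is the first of the two attributes the paper assesses.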

  12. Seismic gap of Michoacan, Mexico

    SciTech Connect

    Singh, S.K.; Yamamoto, J.; Havskov, J.; Guzman, M.; Novelo, D.; Castro, R.

    1980-01-01

    A 150 km segment of the subduction plate boundary along the Pacific coast of Mexico between the aftershock areas of the Colima earthquake (Jan. 10, 1973; Ms = 7.5) and the recent Petatlan earthquake (March 14, 1979; Ms = 7.6) has not experienced a major earthquake since 1911 and, thus, has been designated as a seismic gap. There has been considerable discussion in the scientific community about instrumenting this gap for intensive observation. An examination of the 1911 earthquake (M = 7 3/4), however, provides strong evidence that its location was about 280 km NNW of the epicenter reported by Gutenberg and Richter. Study of the seismicity of Mexico in the past century gives some additional evidence that no major earthquake (M ≳ 7.5) occurred in the area. Thus, presently available evidence suggests that no large earthquake has occurred in this gap for at least the past 78 years and perhaps for as long as 178 years.

  13. Functional seismic evaluation of hospitals

    NASA Astrophysics Data System (ADS)

    Guevara, L. T.

    2003-04-01

    Functional collapse of hospitals (FCH) occurs when a medical complex, or part of it, although with neither structural nor nonstructural damage, is unable to provide the services required for immediate attention to earthquake victims and for the recovery of the affected community. As is known, FCH during and after an earthquake is caused not only by damage to nonstructural components but also by an inappropriate or deficient distribution of essential and supporting medical spaces. This paper presents some conclusions from an analysis of the traditional architectural schemes for the design and construction of hospitals in the 20th century, and some recommendations for establishing evaluation parameters for the remodeling and seismic upgrade of existing hospitals in seismic zones, based on the new concepts of: a) the relative location of each essential service (ES) within the medical complex; b) the capacity of each of these spaces to house the temporary activities required for attending to a massive emergency (ME); c) the relationship between the ES and the supporting services (SS); d) the flexibility to transform nonessential services into complementary spaces for attending to an extraordinary number of victims; e) the dimensions and appropriateness of evacuation routes; and f) the appropriate supply and maintenance of water, electricity, and vital-gas emergency installations.

  14. Nonstructural seismic restraint guidelines

    SciTech Connect

    Butler, D.M.; Czapinski, R.H.; Firneno, M.J.; Feemster, H.C.; Fornaciari, N.R.; Hillaire, R.G.; Kinzel, R.L.; Kirk, D.; McMahon, T.T.

    1993-08-01

    The Nonstructural Seismic Restraint Guidelines provide general information about how to secure or restrain items (such as material, equipment, furniture, and tools) in order to prevent injury and property, environmental, or programmatic damage during or following an earthquake. All SNL sites may experience earthquakes of magnitude 6.0 or higher on the Richter scale. Therefore, these guidelines are written for all SNL sites.

  15. Lunar seismicity and tectonics

    NASA Technical Reports Server (NTRS)

    Lammlein, D. R.

    1977-01-01

    Results are presented for an analysis of all moonquake data obtained by the Apollo seismic stations during the period from November 1969 to May 1974 and a preliminary analysis of critical data obtained in the interval from May 1974 to May 1975. More accurate locations are found for previously located moonquakes, and additional sources are located. Consideration is given to the sources of natural seismic signals, lunar seismic activity, moonquake periodicities, tidal periodicities in moonquake activity, hypocentral locations and occurrence characteristics of deep and shallow moonquakes, lunar tidal control over moonquakes, lunar tectonism, the locations of moonquake belts, and the dynamics of the lunar interior. It is concluded that: (1) moonquakes are distributed in several major belts of global extent that coincide with regions of the youngest and most intense volcanic and tectonic activity; (2) lunar tides control both the small quakes occurring at great depth and the larger quakes occurring near the surface; (3) the moon has a much thicker lithosphere than earth; (4) a single tectonic mechanism may account for all lunar seismic activity; and (5) lunar tidal stresses are an efficient triggering mechanism for moonquakes.

  16. The Viking seismic experiment

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Miller, W. F.; Duennebier, F. K.; Lazarewicz, A. R.; Sutton, G.; Latham, G. V.; Nakamura, Y.; Toksoz, M. F.; Kovach, R. L.; Knight, T. C. D.

    1976-01-01

    A three-axis short-period seismometer is now operating on Mars in the Utopia Planitia region. The noise background correlates well with wind gusts. Although no quakes have been detected in the first 60 days of observation, it is premature to draw any conclusions about the seismicity of Mars. The instrument is expected to return data for at least 2 years.

  17. Continuous Seismic Profiling

    USGS Multimedia Gallery

    The USGS collaborated with cooperator U.S. Fish & Wildlife Service to conduct continuous seismic-reflection profiling in the Havasu National Wildlife Refuge. The survey was conducted as part of an applied research and technology transfer effort by the USGS Office of Groundwater Branch of Geophysics ...

  18. Seismic Inversion Methods

    SciTech Connect

    Jackiewicz, Jason

    2009-09-16

    With the rapid advances in sophisticated solar modeling and the abundance of high-quality solar pulsation data, efficient and robust inversion techniques are crucial for seismic studies. We present some aspects of an efficient Fourier Optimally Localized Averaging (OLA) inversion method with an example applied to time-distance helioseismology.
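    The OLA idea can be illustrated with a minimal subtractive-OLA (SOLA) sketch: choose inversion coefficients so that the weighted sum of sensitivity kernels matches a localized target function. The Gaussian kernels, target width, and Tikhonov damping below are assumptions for illustration, not the Fourier OLA implementation of the abstract:

```python
import numpy as np

def sola_coefficients(kernels, target, dx, mu=1e-3):
    """Subtractive OLA: weights c with sum_i c_i K_i(x) ~ T(x).

    Minimizes the integrated misfit between the averaging kernel and
    the localized target, with Tikhonov damping mu on the coefficients.
    kernels: (n_kernels, nx) sampled on a uniform grid of spacing dx.
    """
    A = kernels @ kernels.T * dx + mu * np.eye(len(kernels))
    b = kernels @ target * dx
    return np.linalg.solve(A, b)
```

The resulting averaging kernel (the weighted kernel sum) should peak where the target is centered, which is what "optimally localized" refers to.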

  19. AUTOMATING SHALLOW SEISMIC IMAGING

    EPA Science Inventory

    Our current EMSP project continues an effort begun in 1997 to develop ultrashallow seismic imaging as a cost-effective method applicable to DOE facilities. The objective of the present research is to refine and demonstrate the use of an automated method of conducting shallow seis...

  20. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic Initiating Event (IE) analysis is one of the most important tasks controlling the level of effort and quality of the whole Seismic Probabilistic Safety Assessment (SPRA). The typical problems concern the following aspects: how the internal PRA model and its complexity can be used; how to control the number of PRA components for which fragility evaluation should be performed; and, finally, how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions depend strongly on the possibility of improving the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  1. Seismic while drilling: Operational experiences in Viet Nam

    SciTech Connect

    Jackson, M.; Einchcomb, C.

    1997-03-01

    The BP/Statoil alliance in Viet Nam has used seismic while drilling on four wells during the last two years. Three wells employed the Western Atlas Tomex system, and the last well, Schlumberger's SWD system. The perceived value of seismic while drilling (SWD) lies in being able to supply real-time data linking drill-bit position to a seismic picture of the well. However, once confidence in the equipment and methodology is attained, SWD can influence well design and the planning associated with drilling wells. More important, SWD can remove uncertainty when actually drilling wells, allowing risk assessment to be carried out more accurately and confidently.

  2. Compliant liquid column damper modified by shape memory alloy device for seismic vibration control

    NASA Astrophysics Data System (ADS)

    Gur, Sourav; Mishra, Sudib Kumar; Bhowmick, Sutanu; Chakraborty, Subrata

    2014-10-01

    Liquid column dampers (LCDs) have long been used for the seismic vibration control of flexible structures. In contrast, tuning LCDs to short-period structures poses difficulty. Various modifications have been proposed to the original LCD configuration to improve its performance in relatively stiff structures. One such system, referred to as a compliant LCD, has been proposed recently by connecting the LCD to the structure with a spring. In this study, an improvement is attempted in compliant LCDs by replacing the linear spring with a spring made of shape memory alloy (SMA). Considering the dissipative, super-elastic, force-deformation hysteresis of SMA, triggered by stress-induced microstructural phase transition, the performance is expected to improve further. The optimum parameters for the SMA-compliant LCD are obtained through design optimization, which is based on nonlinear random vibration response analysis via stochastic linearization of the force-deformation hysteresis of the SMA and of the dissipation by liquid motion through an orifice. Substantially enhanced performance of the SMA-LCD over a conventional compliant LCD is demonstrated, the consistency of which is further verified under recorded ground motions. The robustness of the improved performance is also validated by a parametric study concerning anticipated variations in system parameters as well as variability in seismic loading.

  3. Separation of seismic blended data by sparse inversion over dictionary learning

    NASA Astrophysics Data System (ADS)

    Zhou, Yanhui; Chen, Wenchao; Gao, Jinghuai

    2014-07-01

    Recent development of blended acquisition calls for new procedures to process blended seismic measurements. Presently, deblending and reconstructing unblended data, followed by conventional processing, is the most practical processing workflow. We study seismic deblending by advanced sparse inversion with a learned dictionary in this paper. To make our method more effective, hybrid acquisition and time-dithering sequential shooting are introduced so that clean single-shot records can be used to train the dictionary, favoring a sparser representation of the data to be recovered. Deblending and dictionary learning with l1-norm based sparsity are combined to construct the corresponding problem with respect to the unknown recovery, dictionary, and coefficient sets. A two-step optimization approach is introduced. In the dictionary-learning step, clean single-shot data are selected as training data to learn the dictionary. For deblending, we fix the dictionary and employ an alternating scheme to update the recovery and the coefficients separately. Synthetic and real field data were used to verify the performance of our method. The outcome can serve as a significant reference in designing high-efficiency, low-cost blended acquisition.
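    The deblending step with a fixed dictionary reduces to an l1-regularized sparse-coding subproblem. A standard solver for it is ISTA (iterative soft-thresholding); the sketch below is a generic illustration of that subproblem on synthetic data, not the authors' exact alternating scheme:

```python
import numpy as np

def ista(D, d, lam=0.01, n_iter=500):
    """Sparse coding with a fixed dictionary via ISTA.

    Solves min_c 0.5*||d - D c||_2^2 + lam*||c||_1 by gradient steps
    of size 1/L (L = Lipschitz constant of the smooth part) followed
    by soft-thresholding.
    """
    L = np.linalg.norm(D, 2) ** 2        # largest eigenvalue of D^T D
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = c - (D.T @ (D @ c - d)) / L  # gradient step on the misfit
        c = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
    return c
```

With a random unit-norm dictionary and a sparse ground truth, the recovered coefficients reconstruct the data to within a small residual, which is the property the deblending step relies on.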

  4. High Voltage Seismic Generator

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary results of a year of cooperation among three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different seismic measurement methods, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. Energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor into a spark gap. The whole appliance is powered from a li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation, a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specification of elements for the final device was chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. A field test was then conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water, and geophones were placed on the ground in a straight line. Signals registered with a hammer source and with the sparker source were compared. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. Beyond its signal characteristics, the biggest advantage of the presented device is its size, 0.5 x 0.25 x 0.2 m, and its weight of approximately 7 kg. These features, together with a small li-ion battery, make the constructed device very mobile. The project is still developing.
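The energy delivered by such a sparker source follows directly from the capacitor relation E = ½CV². A minimal sketch, with illustrative capacitance and voltage values only (the abstract gives neither):

```python
def stored_energy_j(capacitance_f, voltage_v):
    """Energy stored in a capacitor bank: E = 0.5 * C * V^2 (joules)."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Illustrative numbers, not the actual device parameters:
# a 100 uF bank charged to 4 kV stores 800 J.
energy = stored_energy_j(100e-6, 4000.0)
```

The quadratic dependence on voltage is why high-voltage charging, rather than a larger capacitor bank, is the compact route to a more energetic discharge.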

  5. Methodology for the Caracas Seismic Microzonation Study

    NASA Astrophysics Data System (ADS)

    Hernández, J. J.; Schmitz, M.

    2007-05-01

    Currently, the Venezuelan Foundation for Seismological Research (FUNVISIS) is executing the Caracas Seismic Microzonation Study. Its fundamental objectives are the selection of microzones of similar response and the determination of landslide susceptibility. Both result from a guided combination of damage data from the 1967 Caracas earthquake, a landslide inventory, geophysical investigations, seismic hazard analysis, geological information, a geotechnical database, and earthquake engineering estimations of soil response and probable hillside behavior. Geophysical investigations include seismic refraction, microtremor and gravimetric measurements for modeling the valley basin; sediment thickness to bedrock reaches 350 m. The model was calibrated with three deep boreholes, in which accelerometers will be placed for future comparisons between surface and bedrock seismic motions. A probabilistic seismic hazard analysis was performed, leading to uniform hazard spectra for a 475-year mean return period and deaggregation of magnitude-distance pairs, differentiated within the Caracas bedrock. A parametric one-dimensional dynamic soil response study was performed with varying sediment thickness (0-350 m), average shear wave velocity in the upper 30 m (150-500 m/s), and nonlinear soil properties; 144 representative soil profiles were analyzed. The mean amplification of spectral response values at the surface relative to the bedrock is obtained, in order to determine probable surface spectra using the bedrock PSHA spectra as input. The seismic effects of the basin are incorporated in an approximate way, from numerical simulations of the 2D and 3D seismic response and statistical data from around the world. Finally, a set of microzones with similar average response spectra is selected by correlating their geological, geophysical and geotechnical properties with those of the parametric study. 
Hillside pre-seismic hazard is established from lithological properties, slope gradients, and seasonal and rainfall wetness indexes, estimating the static factor of safety. Newmark displacements and probabilities of failure are estimated using PSHA results and statistical correlations, leading to a qualification of the earthquake-induced landslide susceptibility. The results will allow updating of the municipal ordinances, in order to improve the design and construction safety of new buildings and to establish reinforcement priorities for existing ones. Contribution to projects FONACIT 200400738 (with funds from IDB) and FONACIT-ECOS Nord 2004000347.
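The Newmark sliding-block approach mentioned above starts from a critical acceleration derived from the static factor of safety. A minimal sketch of the standard infinite-slope form of that relation (a textbook simplification, not necessarily the exact formulation used in the study):

```python
import math

def critical_acceleration(factor_of_safety, slope_deg, g=9.81):
    """Newmark critical acceleration a_c = (FS - 1) * g * sin(slope):
    the ground acceleration above which a rigid block resting on the
    slope starts to accumulate permanent downslope displacement."""
    return (factor_of_safety - 1.0) * g * math.sin(math.radians(slope_deg))

# A marginally stable slope (FS = 1) has zero critical acceleration,
# i.e. any shaking produces displacement.
a_c = critical_acceleration(1.0, 30.0)
```

Shaking histories from the PSHA are then integrated over the intervals where ground acceleration exceeds a_c to estimate the cumulative Newmark displacement.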

  6. Second Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-06-26

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, seven local earthquakes were recorded during the second quarter of fiscal year 2008. The largest event recorded by the network during the second quarter (February 3, 2008 - magnitude 2.3 Mc) was located northeast of Richland in Franklin County at a depth of 22.5 km. With regard to the depth distribution, two earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), three earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and two earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, five earthquakes occurred in swarm areas and two earthquakes were classified as random events.

  7. Second Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-07-17

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47° N latitude and 119-120° W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences. 
No earthquakes triggered the Hanford Strong Motion Accelerometers during the second quarter of FY 2000.

  8. First quarter Hanford seismic report for fiscal year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47° N latitude and 119-120° W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. 
No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2000.

  9. First Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-03-21

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one micro earthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

  10. Third Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 818 triggers on two parallel detection and recording systems during the third quarter of fiscal year (FY) 2000. Thirteen seismic events were located by the Hanford Seismic Network within the reporting region of 46-47° N latitude and 119-120° W longitude; 7 were earthquakes in the Columbia River Basalt Group, 1 was an earthquake in the pre-basalt sediments, and 5 were earthquakes in the crystalline basement. Three earthquakes occurred in known swarm areas, and 10 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the third quarter of FY 2000.

  11. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm.

    PubMed

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation which can provide navigation information and is capable of sensing polarization information has advantages of high-precision and anti-interference over polarization navigation sensors that use photodiodes. Although all types of imaging polarimeters exist, they may not qualify for the research on the imaging polarization navigation algorithm. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for the type of system that contained camera parameter calibration and the inconsistency of complementary metal oxide semiconductor calibration were discussed, designed, and implemented. Calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the system could acquire and compute the polarized skylight images throughout the calibrations and resolve orientation by the algorithm to verify in real-time. An orientation determination algorithm based on image processing was tested on the system. The performance and properties of the algorithm were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851

  12. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    SciTech Connect

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques comprise two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and of the model testing. The purpose of this work is to show that there are two viable analytical alternatives for verifying the structural adequacy of a Type B package and obtaining an NRC license. In addition, these data will help support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  13. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm

    PubMed Central

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation which can provide navigation information and is capable of sensing polarization information has advantages of high-precision and anti-interference over polarization navigation sensors that use photodiodes. Although all types of imaging polarimeters exist, they may not qualify for the research on the imaging polarization navigation algorithm. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for the type of system that contained camera parameter calibration and the inconsistency of complementary metal oxide semiconductor calibration were discussed, designed, and implemented. Calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the system could acquire and compute the polarized skylight images throughout the calibrations and resolve orientation by the algorithm to verify in real-time. An orientation determination algorithm based on image processing was tested on the system. The performance and properties of the algorithm were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851

  14. Network Optimization for Induced Seismicity Monitoring in Urban Areas

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Husen, S.; Wiemer, S.

    2012-12-01

    With the global challenge of satisfying an increasing demand for energy, geological energy technologies are receiving growing attention and have been initiated in or close to urban areas in the past several years. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, now pose a considerable risk to the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential to the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring at an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic light systems and is therefore an important confidence building measure towards the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that allows us to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts. 
The algorithm is based on D-optimal experimental design, which aims to minimize the error ellipsoid of the linearized location problem. Optimization for additional criteria (e.g., focal mechanism determination or installation costs) can be included. We consider a 3D seismic velocity model, a European ambient seismic noise model derived from high-resolution land-use data, and existing seismic stations in the vicinity of the geotechnical site. Using this algorithm we are able to find the optimal geometry and size of the seismic monitoring network that meets the predefined, application-oriented performance criteria. In this talk we focus on optimal network geometries for deep geothermal projects of the EGS and hydrothermal types. We discuss the requirements for basic seismic surveillance and for high-resolution reservoir monitoring and characterization.
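The D-optimal criterion described above maximizes the determinant of the Fisher information matrix of the linearized location problem, which is equivalent to minimizing the volume of the hypocenter error ellipsoid. A toy 2-D, constant-velocity sketch of the idea (the actual algorithm uses a 3-D velocity model, noise weighting, and further criteria):

```python
import math
from itertools import combinations

def jacobian_row(station, event=(0.0, 0.0), v=3.0):
    """Derivatives of the straight-ray travel time t = r/v with
    respect to the event coordinates (2-D, constant velocity v km/s)."""
    dx, dy = event[0] - station[0], event[1] - station[1]
    r = math.hypot(dx, dy)
    return dx / (r * v), dy / (r * v)

def d_criterion(stations):
    """det(A^T A) for the linearized location problem; a larger
    determinant means a smaller error ellipse (D-optimality)."""
    sxx = sxy = syy = 0.0
    for s in stations:
        gx, gy = jacobian_row(s)
        sxx += gx * gx
        sxy += gx * gy
        syy += gy * gy
    return sxx * syy - sxy * sxy

# Pick the 3-station subset whose geometry best constrains the event.
candidates = [(10, 0), (-10, 0), (0, 10), (7, 7), (5, 0)]
best = max(combinations(candidates, 3), key=d_criterion)
```

Collinear stations give a singular information matrix (zero determinant), which is exactly the degenerate geometry the optimization avoids.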

  15. Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Kolditz, O.

    2013-12-01

    The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is to predict and quantify the distribution and amount of sub-seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows that will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator of fracture density. Furthermore, from the complete 3D strain tensor we can predict the orientation of fractures at the sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire in November this year new near-surface, high-resolution P- and S-wave 2-D seismic data in order to verify and calibrate our model results. Here, novel parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between reservoir and overburden. We will thereby provide a tool for prediction and for adapted, time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities. 
Acknowledgement: This work was sponsored in part by the Australian Commonwealth Government through the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC). PROTECT is funded through the Geotechnologien Programme (grant 03G0797) of the German Ministry for Education and Research (BMBF). The PROTECT research group consists of the Leibniz Institute for Applied Geophysics in Hannover, Technical University Darmstadt, Helmholtz-Zentrum für Umweltforschung in Leipzig, Trappe Erdöl Erdgas Consultant in Isernhagen (all Germany), and Curtin University in Perth, Australia.

  16. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

    The Chilean subduction zone is among the most seismically active plate boundaries in the world, and its coastal ranges suffer a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered a mature seismic gap, having last ruptured in 1835 and being seismically quiet, with no earthquakes of magnitude 4.5 or larger reported in global catalogues. It is located to the north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and to the south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom seismometers was deployed in the northern portion of the rupture area for a three-month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data from a network of 33 land stations of the GeoForschungsZentrum Potsdam were included, providing ideal coverage of both the rupture plane and the areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P wave onsets and a 2.5D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well-resolved, seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards of the deformation front. 
Prior to the great Maule earthquake, the Collaborative Research Center SFB 574 'Volatiles and Fluids in Subduction Zones' shot several wide-angle profiles and operated a network, also consisting of OBS and land stations, for six months in 2008. Both projects provide a great opportunity to study the evolution of a subduction zone within the seismic cycle of a great earthquake. The most profound features are (i) a sharp reduction in intraslab seismic activity after the Maule earthquake and (ii) a sharp increase in seismic activity at the slab interface above 50 km depth, where large parts of the rupture zone were largely aseismic prior to the Maule earthquake. Furthermore, the aftershock seismicity shows a broader depth distribution above 50 km depth.

  17. Intermediate depth seismicity - a reflection seismic approach

    NASA Astrophysics Data System (ADS)

    Haberland, C.; Rietbrock, A.

    2004-12-01

    During subduction the descending oceanic lithosphere is subject to metamorphic reactions, some of them associated with the release of fluids. It is now widely accepted that these reactions and the associated dehydration processes are directly related to the generation of intermediate depth earthquakes (dehydration embrittlement). However, the structure of the layered oceanic plate at depth and the location of the earthquakes relative to structural units of the subducting plate (sources within the oceanic crust and/or in the upper oceanic mantle lithosphere?) are still not resolved. This is mainly due to the fact that the observational resolution needed to address these topics (on the order of only a few kilometers) is hardly achieved in field experiments and related studies. Here we study the wavefields of intermediate depth earthquakes typically observed by temporary networks in order to assess their high-resolution potential for resolving the structure of the downgoing slab and the locus of seismicity. In particular, we study whether the subducted oceanic Moho can be detected by the analysis of secondary phases of local earthquakes (near-vertical reflections). Due to the irregular geometry of sources and receivers we apply an imaging technique similar to diffraction-stack migration. The method is tested using synthetic data based both on 2-D finite difference simulations and on 3-D kinematic ray tracing. The accuracy of the hypocenter locations and onset times, crucial for the coherent application of stacking techniques, was achieved by using relatively relocated intermediate depth seismicity. Additionally, we simulate the propagation of the wavefields at larger distances (wide angle), indicating the development of guided waves traveling in the low-velocity waveguide associated with the modeled oceanic crust. We also present an application to local earthquake data from the South American subduction zone.

  18. Benchmark problems and results for verifying resonance calculation methodologies

    SciTech Connect

    Wu, H.; Yang, W.; Qin, Y.; He, L.; Cao, L.; Zheng, Y.; Liu, Q.

    2012-07-01

    Resonance calculation is one of the most important procedures in multi-group neutron transport calculation. With the development of new nuclear reactor concepts, many new types of fuel assembly have been proposed. Compared to traditional designs, most of the new fuel assemblies have different fuel types, either with complex isotopic compositions or with complicated geometries. This renders the traditional resonance calculation methods invalid. Recently, many advanced resonance calculation methods have been proposed. However, there are few benchmark problems for evaluating those methods in a comprehensive comparison. In this paper, we design 5 groups of benchmark problems comprising 21 typical cases of different geometries and fuel contents. The reference results for the benchmark problems are generated based on the sub-group method, the ultra-fine group method, the function expansion method and the Monte Carlo method. It is shown that these benchmark problems and their results can help evaluate the validity of newly developed resonance calculation methods in future work. (authors)

  19. Hot clamp design for LMFBR piping systems

    SciTech Connect

    Kobayashi, T.; Tateishi, M.

    1993-02-01

    Thin-wall, large-diameter piping for liquid metal fast breeder reactor (LMFBR) plants can be subjected to significant thermal transients during reactor scrams. To reduce local thermal stresses, an insulated cold clamp was designed for the Fast Flux Test Facility and was subsequently applied to some prototype reactors. However, cost minimization of LMFBRs requires much simpler designs. This paper presents a hot clamp design concept, which uses standard clamp halves attached directly to the pipe surface, leaving an initial gap. The combination of a flexible pipe and a rigid clamp achieves a self-controlling effect on clamp-induced pipe stresses due to the initial gap. A 3-D contact and inelastic history analysis was performed to verify the hot clamp concept. Considerations for reducing the initial stress at installation, mitigating the clamp restraint on pipe expansion during thermal shocks, and maintaining the desired pipe-clamp stiffness during a seismic event are discussed.

  20. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that a certain class of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial and error approach for system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundations as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have helped conventional control design and validation.

  1. IDMS: A System to Verify Component Interface Completeness and Compatibility for Product Integration

    NASA Astrophysics Data System (ADS)

    Areeprayolkij, Wantana; Limpiyakorn, Yachai; Gansawat, Duangrat

    The growing adoption of Component-Based Software Development has had a great impact on today's system architectural design. However, the design of subsystems that lack interoperability and reusability can cause problems during product integration; at worst, this may result in project failure. The literature suggests that verification of interface descriptions and management of interface changes are factors essential to the success of the product integration process. This paper therefore presents an automation approach to facilitate reviewing component interfaces for completeness and compatibility. The Interface Descriptions Management System (IDMS) has been implemented to ease and speed up interface review activities, using UML component diagrams as input. Interface compatibility is verified by traversing a component dependency graph called the Component Compatibility Graph (CCG), a visualization in which each node represents a component and each edge represents communication between associated components. Three case studies were conducted to subjectively evaluate the correctness and usefulness of IDMS.
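    The compatibility check described above can be sketched as a simple graph construction: every "requires" interface must be matched by some component's "provides" interface, and unmatched requirements flag incompleteness. The component names and data model below are hypothetical illustrations, not IDMS's actual input format:

```python
def build_ccg(components):
    """Build a Component Compatibility Graph: each node is a component,
    each edge wires a 'requires' interface to another component's
    'provides' interface; unmatched requirements indicate incompleteness."""
    provided = {}
    for comp, ifaces in components.items():
        for iface in ifaces.get("provides", []):
            provided.setdefault(iface, []).append(comp)
    edges, missing = [], []
    for comp, ifaces in components.items():
        for iface in ifaces.get("requires", []):
            if iface in provided:
                for supplier in provided[iface]:
                    edges.append((comp, supplier, iface))
            else:
                missing.append((comp, iface))
    return edges, missing

# Hypothetical components (illustrative only).
components = {
    "Billing": {"requires": ["IPayment"], "provides": ["IBilling"]},
    "Payment": {"requires": [], "provides": ["IPayment"]},
    "Reports": {"requires": ["IBilling", "IAudit"], "provides": []},
}
edges, missing = build_ccg(components)
# 'missing' flags Reports' unmet IAudit dependency.
```

Traversing `edges` then supports the kind of dependency review the paper describes, while `missing` surfaces interface-completeness failures before integration.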

  2. A Hammer-Impact, Aluminum, Shear-Wave Seismic Source

    USGS Publications Warehouse

    Haines, Seth S.

    2007-01-01

    Near-surface seismic surveys often employ hammer impacts to create seismic energy. Shear-wave surveys using horizontally polarized waves require horizontal hammer impacts against a rigid object (the source) that is coupled to the ground surface. I have designed, built, and tested a source made out of aluminum and equipped with spikes to improve coupling. The source is effective in a variety of settings, and it is relatively simple and inexpensive to build.

  3. Seismic Tomography in Sensor Networks

    NASA Astrophysics Data System (ADS)

    Shi, L.; Song, W.; Lees, J. M.; Xing, G.

    2012-12-01

    Tomographic imaging, applied to seismology, requires a new, decentralized approach if high-resolution calculations are to be performed in a sensor network configuration. Real-time data retrieval from a large network of wireless seismic stations to a central server is virtually impossible due to the sheer volume of data and resource limitations. In this paper, we propose and design a distributed algorithm that processes data and inverts the tomographic model within the network, avoiding costly data collection and centralized computation. Based on a partition of the tomographic inversion problem, the new algorithms distribute the computational burden across sensor nodes and perform real-time tomographic inversion in the network, so that a high-resolution tomographic model can be recovered in real time under the constraints of network resources. Our emulation results indicate that the distributed algorithms successfully reconstruct the synthetic models while greatly reducing and balancing the communication and computation costs.
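    One way to partition a travel-time inversion across nodes, in the spirit of the distributed scheme described above (though not necessarily the authors' actual algorithm), is a row-action method such as Kaczmarz/ART, where each node sweeps only its own travel-time rows against a shared slowness model. A toy sketch with synthetic geometry:

```python
import numpy as np

def kaczmarz_sweep(G, t, m, relax=1.0):
    """One sweep over this node's rows: project m onto each travel-time
    constraint g_i . m = t_i in turn (classic ART update)."""
    for g_i, t_i in zip(G, t):
        denom = g_i @ g_i
        if denom > 0:
            m = m + relax * (t_i - g_i @ m) / denom * g_i
    return m

rng = np.random.default_rng(0)
m_true = rng.uniform(0.2, 0.5, size=8)    # true slowness of 8 cells (s/km)
G = rng.uniform(0.0, 1.0, size=(40, 8))   # ray-length matrix, 40 rays
t = G @ m_true                            # noise-free synthetic travel times

# Partition the 40 rays across 4 "nodes"; each node sweeps its block in turn,
# standing in for nodes passing the current model around the network.
m = np.zeros(8)
for _ in range(200):
    for block in np.array_split(np.arange(40), 4):
        m = kaczmarz_sweep(G[block], t[block], m)
```

Each node needs only its own rows of `G` and `t`, so no raw waveform data is centralized, which mirrors the communication-saving motivation of the paper.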

  4. Seismic Hazard of Romania: Deterministic Approach

    NASA Astrophysics Data System (ADS)

    Radulian, M.; Vaccari, F.; Mândrescu, N.; Panza, G. F.; Moldoveanu, C. L.

    The seismic hazard of Romania is estimated in terms of peak ground motion values (displacement, velocity, and design ground acceleration, DGA) by computing complete synthetic seismograms, which are considered representative of the different seismogenic and structural zones of the country. The deterministic method addresses issues largely neglected in probabilistic hazard analysis, e.g., how crustal properties affect attenuation, since the ground motion parameters are not derived from overly simplified attenuation ``functions,'' but rather from synthetic time histories. The synthesis of the hazard is divided into two parts: one for shallow-focus earthquakes, and the other for intermediate-focus events of the Vrancea region. The previous hazard maps of Romania completely ignore the seismic activity in the southeastern part of the country (due to the Shabla seismic zone). For the Vrancea intermediate-depth earthquakes, which control the seismic hazard level over most of the territory, comparison of the numerical results with the historically based intensity map shows significant differences. These could be due to structural or source properties not captured by our modeling, or to differences in the distribution of damageable buildings over the territory (meaning that future earthquakes may be more damaging in regions other than those that experienced damage in the past). Since the deterministic modeling is highly sensitive to source and path effects, it can be used to improve the seismological parameters of the historical events.

  5. Sound source localization technique using a seismic streamer and its extension for whale localization during seismic surveys.

    PubMed

    Abadi, Shima H; Wilcock, William S D; Tolstoy, Maya; Crone, Timothy J; Carbotte, Suzanne M

    2015-12-01

    Marine seismic surveys are under increasing scrutiny because of concern that they may disturb or otherwise harm marine mammals and impede their communications. Most of the energy from seismic surveys is low frequency, so concerns are particularly focused on baleen whales. Extensive mitigation efforts accompany seismic surveys, including visual and acoustic monitoring, but the possibility remains that not all animals in an area can be observed and located. One potential way to improve mitigation efforts is to utilize the seismic hydrophone streamer to detect and locate calling baleen whales. This study describes a method to localize low-frequency sound sources with data recorded by a streamer. Beamforming is used to estimate the angle of arriving energy relative to sub-arrays of the streamer, which constrains the horizontal propagation velocity to each sub-array for a given trial location. A grid search method is then used to minimize the time residual for relative arrival times along the streamer estimated by cross correlation. Results from both simulation and experiment are shown, and data from the marine mammal observers and the passive acoustic monitoring conducted simultaneously with the seismic survey are used to verify the analysis. PMID:26723349
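    The grid-search step can be sketched as follows, assuming a straight-ray, constant-velocity water column (an illustrative simplification; the paper's method additionally uses beamforming to constrain horizontal propagation velocity per sub-array). Geometry and velocity below are made up for the example:

```python
import numpy as np

def locate(receivers, rel_times, grid, v=1500.0):
    """Find the trial source position minimizing the squared residual
    between predicted and observed relative arrival times.
    rel_times are arrival times relative to the first receiver."""
    best, best_cost = None, np.inf
    for src in grid:
        tt = np.linalg.norm(receivers - src, axis=1) / v
        pred = tt - tt[0]                  # same relative reference
        cost = np.sum((pred - rel_times) ** 2)
        if cost < best_cost:
            best, best_cost = src, cost
    return best

# Receivers every 100 m along a 2-D (range, depth) streamer at 10 m depth.
receivers = np.column_stack([np.arange(0, 2000, 100.0),
                             np.full(20, 10.0)])
true_src = np.array([750.0, 400.0])
tt_true = np.linalg.norm(receivers - true_src, axis=1) / 1500.0
rel_times = tt_true - tt_true[0]           # stands in for cross-correlation lags

grid = np.array([[x, z] for x in np.arange(0, 2000, 50.0)
                        for z in np.arange(50, 1000, 50.0)])
est = locate(receivers, rel_times, grid)
```

In practice `rel_times` would come from cross-correlating sub-array traces, and the cost would be evaluated over a 3-D grid with the beamforming-derived velocity constraint.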

  6. Seismic Hazards in Seattle

    NASA Astrophysics Data System (ADS)

    Delorey, Andrew; Vidale, John

    2010-05-01

    Much of Seattle, in the northwestern United States, lies atop a sedimentary basin that extends approximately 9 km deep. The basin structure is the result of the evolution of the Puget Lowland fore arc, which combines strike-slip and thrust-fault movements to accommodate right-lateral strike-slip and N-S shortening due to the oblique subduction of the Juan de Fuca Plate beneath North America. The Seattle Basin has been observed to amplify and distort the seismic waves from a variety of moderate and large earthquakes in ways that affect the hazard from those earthquakes. Seismic hazard assessments depend heavily upon upper crustal and near-surface S-wave velocity models, which have traditionally been constructed from P-wave models using an empirical relationship between P-wave and S-wave velocity, or by interpolating across widely spaced observations of shallow geologic structures. Improving the accuracy and resolution of shallow S-wave models using direct measurements is key to improving seismic hazard assessments and predictions of levels of ground shaking. Tomography, with short-period Rayleigh waves extracted using noise interferometry, can refine S-wave velocity models in urban areas with dense arrays of short-period and broadband instruments. We apply this technique to the Seattle area to develop a new shallow S-wave model for use in hazard assessment. Stations of the Seismic Hazards in Puget Sound (SHIPS) array record continuously, with inter-station distances ranging from a few to tens of kilometers. This allows us to extract Rayleigh waves with periods between 2 and 10 seconds that are sensitive to shallow basin shear-wave velocities. Our results show that shear-wave velocities in the upper 3 km are about 25% lower in some regions than previous estimates and align more closely with surface geological features and gravity observations.
We validate our model by comparing synthetic waveforms to several earthquakes recorded locally on accelerometers operated by the United States Geological Survey (USGS) and the Pacific Northwest Seismic Network (PNSN). Then, we make predictions of the levels of shaking during likely future events in different areas around Seattle by running simulations with a finite difference code. As is typical in subduction zones, Seattle is exposed to shallow crustal events, intraplate events in the down-going slab, and large megathrust events. Of these three types of events, only large intraplate events have been recorded locally, so simulations are our best opportunity to make predictions for all of the possible scenarios. Our results can be used to update seismic hazard maps for Seattle and can be reproduced in other urban areas with dense arrays of short-period and broadband instruments.
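    The noise-interferometry step described above can be illustrated with a toy synthetic: cross-correlating long ambient-noise records from two stations peaks at the inter-station travel time, which is the observable fed into the tomography. A minimal sketch (one plane-wave noise field, 1-D geometry, made-up sample rate and delay):

```python
import numpy as np

rng = np.random.default_rng(42)
fs, lag = 100, 25                  # samples/s; 0.25 s inter-station delay
noise = rng.standard_normal(2 ** 18)
sta_a = noise[:-lag]
sta_b = noise[lag:]                # same noise wavefield, arriving later at B

# FFT-based circular cross-correlation (zero-padded, so effectively linear).
nfft = 2 ** 19
spec = np.fft.rfft(sta_a, nfft) * np.conj(np.fft.rfft(sta_b, nfft))
xc = np.fft.irfft(spec, nfft)
shift = int(np.argmax(xc))
if shift > nfft // 2:              # wrap negative lags back around
    shift -= nfft
travel_time = abs(shift) / fs      # recovered inter-station delay, in s
```

Real applications stack many days of correlations and then measure dispersion (travel time versus period) rather than a single broadband peak, but the principle is the same.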

  7. Results from the latest SN-4 multi-parametric benthic observatory experiment (MARsite EU project) in the Gulf of Izmit, Turkey: oceanographic, chemical and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Embriaco, Davide; Marinaro, Giuditta; Frugoni, Francesco; Giovanetti, Gabriele; Monna, Stephen; Etiope, Giuseppe; Gasperini, Luca; Çağatay, Namık; Favali, Paolo

    2015-04-01

    An autonomous, long-term multiparametric benthic observatory (SN-4) was designed to study gas seepage and seismic energy release along the submerged segment of the North Anatolian Fault (NAF). Episodic gas seepage occurs at the seafloor in the Gulf of Izmit (Sea of Marmara, NW Turkey) along this submerged segment of the NAF, which ruptured during the 1999 Mw 7.4 Izmit earthquake. The SN-4 observatory previously operated in the Gulf of Izmit, at the western end of the 1999 Izmit earthquake rupture, for about one year at 166 m water depth during the 2009-2010 experiment (EGU2014-13412-1, EGU General Assembly 2014). SN-4 was re-deployed at the same site for a new long-term mission (September 2013 - April 2014) in the framework of the MARsite (New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite, http://marsite.eu/ ) EC project, which aims at evaluating seismic risk and managing long-term monitoring activities in the Marmara Sea. A main scientific objective of the SN-4 experiment is to investigate possible correlations between seafloor methane seepage and the release of seismic energy. We used the same site as the 2009-2010 campaign to verify both the recurrence of previously observed phenomena and the reliability of results obtained in the previous experiment (Embriaco et al., 2014, doi:10.1093/gji/ggt436). In particular, we are interested in the detection of gas release at the seafloor, the role played by oceanographic phenomena in this detection, and the association between gas and seismic energy release. The scientific payload included, among other instruments, a three-component broad-band seismometer, and gas and oceanographic sensors. We present a technical description of the observatory, including the data acquisition and control system, results from preliminary analysis of this new multidisciplinary data set, and a comparison with the previous experiment.

  8. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section A report, existing conditions calculations/supporting information

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. Based upon US Department of Energy (DOE) Albuquerque Operations Office (DOE/AL) and LANL projections, storage space limitations/restrictions will begin to affect LANL's ability to meet its missions between 1998 and 2002.

  9. Patterns of significant seismic quiescence on the Mexican Pacific coast

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Rudolf-Navarro, A. H.; Angulo-Brown, F.; Barrera-Ferrer, A. G.

    Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence, using the definition of seismic quiescence proposed by Schreider (1990). This algorithm shows the area of quiescence where an earthquake of great magnitude may occur. We have applied our algorithm to the earthquake catalog of the Mexican Pacific coast, between 14 and 21 degrees north latitude and 94 and 106 degrees west longitude, for events with depths less than or equal to 60 km and magnitude greater than or equal to 4.3 that occurred from January 1965 until December 2014. We found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, earthquakes of great magnitude have not occurred in Mexico in this century. However, we have identified well-defined seismic quiescences in the Guerrero seismic gap, which are apparently correlated with the occurrence of the silent earthquakes in 2002, 2006 and 2010 recently discovered by GPS technology.

  10. First Quarter Hanford Seismic Report for Fiscal Year 1999

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    1999-05-26

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. The organization also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the first quarter of FY99 for stations in the HSN was 99.8%. There were 121 triggers during the first quarter of fiscal year 1999. Fourteen triggers were local earthquakes: seven (50%) were in the Columbia River Basalt Group, none occurred in the pre-basalt sediments, and seven (50%) were in the crystalline basement. One earthquake (7%) occurred near or along the Horn Rapids anticline, seven earthquakes (50%) occurred in a known swarm area, and six earthquakes (43%) were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometer during the first quarter of FY99.

  11. Investigation of the Seismic Performance of Reinforced Highway Embankments

    NASA Astrophysics Data System (ADS)

    Toksoy, Y. S.; Edinçliler, A.

    2014-12-01

    Although highway embankments are highly prone to earthquake-induced damage, few studies in the literature concentrate on improving their seismic performance. Embankments that are quite stable under static load conditions can simply collapse during earthquakes due to destructive seismic loading. This poses a serious threat to the structural integrity of the embankment, its service quality, and its serviceability. The objective of this study is to determine the effect of geosynthetic reinforcement on the seismic performance of highway embankments and to evaluate the seismic performance of a geotextile-reinforced embankment under different earthquake motions. A 1:50 scale highway embankment model was designed and reinforced with geosynthetics in order to increase its seismic performance. A series of shaking table tests were performed on identical unreinforced and reinforced embankment models using earthquake excitations with different characteristics. The experimental results were evaluated by comparing the unreinforced and reinforced cases. Results revealed that the reinforced embankment models exhibit better seismic performance, especially under the specified ground excitations used in this study. The prototype embankment was also modelled numerically, and a similar trend in seismic behavior was obtained in the finite element simulations.

  12. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
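    The hazard/fragility convolution underlying such estimates, and the HCLPF capacity mentioned above, can be sketched numerically. The constants below (hazard-curve slope, median capacity, randomness and uncertainty betas) are illustrative assumptions, not values from the three plants studied; only the lognormal fragility form and the standard HCLPF expression are taken from common PRA practice:

```python
import math
from statistics import NormalDist

def failure_freq(k0, k, a_med, beta_c, a_lo=0.05, a_hi=5.0, n=2000):
    """Annual failure frequency: integrate -dH/da * P(fail | a) over PGA a,
    for a power-law hazard H(a) = k0 * a**-k and a lognormal fragility."""
    phi = NormalDist().cdf
    total, da = 0.0, (a_hi - a_lo) / n
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        dh_da = k0 * k * a ** (-k - 1)             # hazard density (positive)
        p_fail = phi(math.log(a / a_med) / beta_c)  # lognormal fragility
        total += dh_da * p_fail * da
    return total

a_med, beta_r, beta_u = 0.9, 0.25, 0.35    # median capacity (g) and betas
beta_c = math.hypot(beta_r, beta_u)        # composite logarithmic std. dev.
hclpf = a_med * math.exp(-1.65 * (beta_r + beta_u))   # 95% conf., 5% failure
pf = failure_freq(k0=1e-4, k=2.5, a_med=a_med, beta_c=beta_c)
```

A sensitivity study of the kind the paper describes would repeat this integral while varying the betas, the hazard-curve slope `k`, and the fragility truncation, and observe the spread in `pf`.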

  13. Seismic capacity of switchgear

    SciTech Connect

    Bandyopadhyay, K.; Hofmayer, C.; Kassir, M.; Pepper, S.

    1989-01-01

    As part of a component fragility program sponsored by the USNRC, BNL has collected existing information on the seismic capacity of switchgear assemblies from major manufacturers. Existing seismic test data for both low- and medium-voltage switchgear assemblies have been evaluated, and the generic results are presented in this paper. The failure modes are identified and the corresponding generic lower-bound capacity levels are established. The test response spectra have been used as a measure of the test vibration input. The results indicate that relays chatter at a very low input level at the base of the switchgear cabinet. Changes of state of devices, including relays, have been observed. Breaker tripping occurs at a higher vibration level. Although structural failure of internal elements has been noticed, the overall switchgear cabinet structure withstands a high vibration level. 5 refs., 2 figs., 2 tabs.

  14. Seismic detection of tornadoes

    USGS Publications Warehouse

    Tatom, F. B.

    1993-01-01

    Tornadoes represent the most violent of all forms of atmospheric storms, each year causing hundreds of millions of dollars in property damage and approximately one hundred fatalities. In recent years, considerable success has been achieved in detecting tornadic storms by means of Doppler radar. However, radar systems cannot determine when a tornado is actually in contact with the ground, except possibly at extremely close range. At the present time, human observation is the only truly reliable way of knowing that a tornado is actually on the ground. However, considerable evidence exists indicating that a tornado in contact with the ground produces a significant seismic signal. If such signals are generated, the seismic detection and warning of an imminent tornado become a distinct possibility.

  15. Calculating California Seismicity Rates

    USGS Publications Warehouse

    Felzer, Karen R.

    2008-01-01

    Empirically, the rate of earthquakes ≥ magnitude M is well fit by the Gutenberg-Richter relationship, log N = a - bM (1), where N is the number of earthquakes ≥ M over a given time period, a is the number of M ≥ 0 earthquakes over the same period, and b is a parameter that determines the ratio of larger to smaller earthquakes (Ishimoto and Iida 1939; Gutenberg and Richter 1944). Thus to characterize the seismicity rate, N, and risk in a given region we need to solve for the values of a and b. Here we are concerned with solving for the long-term average values of these parameters for the state of California. My primary data source is a catalog of 1850-2006 M ≥ 4.0 seismicity compiled with Tianqing Cao (Appendix H). Because earthquakes outside of the state can influence California, I consider both earthquakes within the state and within 100 km of the state border (Figure 1).
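    Equation (1) can be solved for a and b directly from a catalog. The sketch below uses the Aki (1965) maximum-likelihood b-value estimator on a synthetic exponentially distributed catalog (true b = 1), not the paper's 1850-2006 California catalog:

```python
import math
import random

def gr_fit(mags, m_min):
    """Fit log10 N(>=M) = a - b*M for a continuous-magnitude catalog
    complete above m_min, using the Aki maximum-likelihood b-value."""
    m = [x for x in mags if x >= m_min]
    b = math.log10(math.e) / (sum(m) / len(m) - m_min)   # Aki (1965) MLE
    a = math.log10(len(m)) + b * m_min   # anchor so N(>=m_min) is matched
    return a, b

# Synthetic catalog: Gutenberg-Richter implies exponentially distributed
# magnitudes above the completeness threshold, here with true b = 1.
random.seed(1)
beta = 1.0 * math.log(10)            # b-value in natural-log form
mags = [4.0 + random.expovariate(beta) for _ in range(20000)]
a, b = gr_fit(mags, m_min=4.0)       # b should come out close to 1.0
```

With binned magnitudes, a standard refinement replaces `m_min` with `m_min - dm/2` in the denominator to correct for the bin width.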

  16. Hanford quarterly seismic report -- 97A seismicity on and near the Hanford Site, Pasco Basin, Washington, October 1, 1996 through December 31, 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1997-02-01

    Seismic Monitoring is part of PNNL's Applied Geology and Geochemistry Group. The Seismic Monitoring Analysis and Repair Team (SMART) operates, maintains, and analyzes data from the Hanford Seismic Network (HSN), extending the site's historical seismic database and fulfilling US Department of Energy, Richland Operations Office requirements and orders. The SMART also maintains the Eastern Washington Regional Network (EWRN). The University of Washington uses the data from the EWRN and other seismic networks in the Northwest to provide the SMART with the necessary regional input for the seismic hazards analysis at the Hanford Site. The SMART is tasked to provide an uninterrupted collection of high-quality raw seismic data from the HSN located on and around the Hanford Site. These unprocessed data are permanently archived. SMART also is tasked to locate and identify sources of seismic activity, monitor changes in the historical pattern of seismic activity at the Hanford Site, and build a local earthquake database (processed data) that is permanently archived. Local earthquakes are defined as earthquakes that occur between 46 and 47 degrees north latitude and 119 and 120 degrees west longitude. The data are used by the Hanford contractor for waste management activities, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site.

  17. A Three-Day Seismic Experiment in an Urban Setting: An Introduction to Seismology for Minority Students

    NASA Astrophysics Data System (ADS)

    Lorenzo, J. M.; Anderson, L. C.; Bart, P. J.; Ferrell, R. E.; Tomkin, J. H.

    2004-12-01

    Summer program participants of LSU GAEMP (Geoscience Alliance to Improve Minority Participation) are non-traditional STEM (science, technology, engineering, or math) underrepresented minorities from 9 Minority-Serving Institutions in the states of Louisiana, Texas and Mississippi. In the summer of 2004, twelve students completed a six-week field and lab program across the lower U.S.A. Because of the urban background of many of the participants, one three-day module on earthquakes and earth deformation emphasized the design of a non-conventional seismic experiment, field acquisition, and analysis of data in an urban setting. Day one introduced stress, major fault types and their plate tectonic setting based on a case study of active growth faulting, emphasizing its effects on urban planning. Students visited the field to verify the location of faults from prior interpretations using GPS and topographic maps, and to discuss observed faulted buildings, offices and roadways. Later, students were exposed to the principles of active seismology, divided into six working groups, and required to design by the next morning a realistic experiment to verify faults in the shallow subsurface. Day two was dedicated to collecting shallow (<300 m) shear-wave seismic refraction data from both sides of a suspected growth fault, with the student expectation that a thicker sediment sequence would be observed on the down-thrown block. Day three involved pencil-and-paper analyses of the data for reflection and refraction thickness and velocity estimation, capped with a discussion and formal oral presentations of group results. The student-led design, active field deployment of equipment, and formal discussion groups provided the widest range of activities to promote awareness of the relevance of seismology in modern society.

  18. Lunar seismic data analysis

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Dorman, H. J.

    1982-01-01

    The scientific data transmitted continuously from all ALSEP (Apollo Lunar Surface Experiment Package) stations on the Moon and recorded on instrumentation tapes at receiving stations distributed around the Earth were processed. The processing produced sets of computer-compatible digital tapes, from which various other data sets convenient for analysis were generated. The seismograms were read, various types of seismic events were classified, and the detected events were cataloged.

  19. Albuquerque Basin seismic network

    USGS Publications Warehouse

    Jaksha, Lawrence H.; Locke, Jerry; Thompson, J.B.; Garcia, Alvin

    1977-01-01

    The U.S. Geological Survey has recently completed the installation of a seismic network around the Albuquerque Basin in New Mexico. The network consists of two seismometer arrays: a thirteen-station array monitoring an area of approximately 28,000 km² and an eight-element array monitoring the area immediately adjacent to the Albuquerque Seismological Laboratory. This report describes the instrumentation deployed in the network.

  20. Seismic attributes revisited

    SciTech Connect

    Taner, M.T.; O`Doherty, R.; Schuelke, J.S.; Baysal, E.

    1994-12-31

    Since their introduction in the early 1970s, seismic attributes have become one of the most commonly used and powerful interpretation tools. In earlier applications they were considered a management display tool. This view changed rapidly as their use in interpretation expanded, although results were analyzed in a qualitative manner. Recently, as evidenced by an increasing number of papers, attribute use has finally become more quantitative through calibration with well-bore measurements. In this paper the authors will present a review of basic attributes, their recently introduced derivatives, and their expected significance. They will also classify them with respect to their computation and their end use. They define all seismically derived parameters as Seismic Attributes. These can be velocity, amplitude, the rate of change of either with respect to time or space, and so on. They can be computed from pre-stack or post-stack data sets. Some of the attributes computed from complex traces, such as envelope and phase, correspond to various measurements of the propagating wavefront; the authors call these Physical Attributes. These attributes may be used for prediction or extrapolation of lithological or reservoir characteristics. Others are computed from reflection configurations and continuity; the authors call these Geometric Attributes, which are used in structural and stratigraphic interpretation. In this paper they confine themselves to the classification and definition of the attributes and their possible uses, giving examples from different geological settings.
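    The complex-trace "Physical Attributes" mentioned above (envelope and instantaneous phase) come from the analytic signal of the trace. A minimal sketch on a synthetic trace, using the standard one-sided-spectrum construction (equivalent to a Hilbert transform); the trace itself is made up for illustration:

```python
import numpy as np

def analytic(trace):
    """Analytic signal via a one-sided spectrum (Hilbert-transform pair):
    zero the negative frequencies and double the positive ones."""
    n = len(trace)
    spec = np.fft.fft(trace)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

# Synthetic trace: a 30 Hz Gaussian-windowed cosine centered at 0.5 s.
t = np.linspace(0, 1, 1000, endpoint=False)
trace = np.exp(-((t - 0.5) ** 2) / 0.005) * np.cos(2 * np.pi * 30 * t)

z = analytic(trace)
envelope = np.abs(z)       # reflection strength (amplitude envelope)
phase = np.angle(z)        # instantaneous phase
```

The envelope tracks the Gaussian window regardless of the carrier's oscillation, which is why it is useful as a lithology-independent measure of reflection strength.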

  1. Seismic basement in Poland

    NASA Astrophysics Data System (ADS)

    Grad, Marek; Polkowski, Marcin

    2015-09-01

    The area of contact between Precambrian and Phanerozoic Europe in Poland has a complicated structure of sedimentary cover and basement. The thinnest sedimentary cover, in the Mazury-Belarus anteclise, is only 0.3-1 km thick; it increases to 7-8 km along the East European Craton margin and to 9-12 km in the Trans-European Suture Zone (TESZ). The Variscan domain is characterized by a 1- to 2-km-thick sedimentary cover, while the Carpathians are characterized by very thick sediments, up to c. 20 km. The map of the basement depth is created by combining data from geological boreholes with a set of regional seismic refraction profiles. These data do not constrain the basement depth in the central part of the TESZ or in the Carpathians. Therefore, the data set is supplemented by 32 models from deep seismic sounding profiles and by a map of a high-resistivity (low-conductivity) layer from magnetotelluric soundings, identified as basement. Together, these data provide knowledge of the basement depth and of the P-wave seismic velocities of the crystalline and consolidated types of basement for the whole area of Poland. Finally, the differentiation of the basement depth and velocity is discussed with respect to geophysical fields and the tectonic division of the area.

  2. Quiet Clean Short-haul Experimental Engine (QCSEE) Under-The-Wing (UTW) composite nacelle subsystem test report. [to verify strength of selected composite materials

    NASA Technical Reports Server (NTRS)

    Stotler, C. L., Jr.; Johnston, E. A.; Freeman, D. S.

    1977-01-01

    The element and subcomponent testing conducted to verify the under-the-wing composite nacelle design is reported. The composite nacelle consists of an inlet, outer cowl doors, inner cowl doors, and a variable fan nozzle. The element tests provided the mechanical properties used in the nacelle design, and the subcomponent tests verified that the critical panel and joint areas of the nacelle had adequate structural integrity.

  3. ELASTIC-WAVEFIELD SEISMIC STRATIGRAPHY: A NEW SEISMIC IMAGING TECHNOLOGY

    SciTech Connect

    Bob A. Hardage

    2004-05-06

    The focus of elastic-wavefield seismic stratigraphy research shifted from onshore prospects to marine environments during this report period. Four-component ocean-bottom-cable (4-C OBC) seismic data acquired in water depths of 2400 to 2500 feet across Green Canyon Block 237 in the Gulf of Mexico were processed and analyzed. The P-P and P-SV images of strata immediately below the seafloor exhibit amazing differences in P-P and P-SV seismic facies. These data may be one of the classic examples of the basic concepts of elastic-wavefield seismic stratigraphy.

  4. Study on Application of Seismic Isolation System to ABWR-II Building

    SciTech Connect

    Hideaki Saito; Hideo Tanaka; Atsuko Noguchi; Junji Suhara; Yasuaki Fukushima

    2004-07-01

    This paper reports the results of a study that evaluated the applicability of seismic isolation systems to nuclear power plants. The study focuses on the possibility of a standard design with improved seismic safety of building and equipment for ABWR-II. A base isolation system with laminated lead-rubber bearings was applied in the study. Based on the structural design of isolated buildings, it was confirmed that the design seismic loads can be largely reduced and that seismic elements of buildings and equipment can be more easily designed than for non-isolated buildings. Improvement in building construction cost and schedule was also confirmed. The analytical results of seismic probabilistic safety assessments showed that an isolated building has a much higher degree of seismic safety than a non-isolated building. The study concludes that the seismic isolation system is readily applicable to ABWR-II plants. In addition, with the aim of enhancing the earthquake resistance of future ABWR-II plants, a building concept was developed in which much of the important equipment is laid out on a floor directly supported by the base isolation system. In this plant, further improvement of seismic reliability is expected due to reduction of the seismic responses of important equipment. (authors)

  5. Monitoring hydraulic fracturing with seismic emission volume

    NASA Astrophysics Data System (ADS)

    Niu, F.; Tang, Y.; Chen, H.; TAO, K.; Levander, A.

    2014-12-01

    Recent developments in horizontal drilling and hydraulic fracturing have made it possible to access reservoirs that were not available for massive production in the past. Hydraulic fracturing is designed to enhance rock permeability and reservoir drainage through the creation of fracture networks. Microseismic monitoring has proven to be an effective and valuable technology for imaging hydraulic fracture geometry. Based on data acquisition, seismic monitoring techniques fall into two categories: downhole and surface monitoring. Surface monitoring is challenging because of the extremely low signal-to-noise ratio of the raw data. We applied techniques used in earthquake seismology and developed an integrated monitoring system for mapping hydraulic fractures. The system consists of 20 to 30 state-of-the-art broadband seismographs, which are generally hundreds of times more sensitive than regular geophones. We have conducted two experiments in two basins in China with very different geology and formation mechanisms. In each case, we observed clear microseismic events, which may correspond to induced seismicity directly associated with fracturing and to triggered events on pre-existing faults. However, the magnitude of these events is generally larger than magnitude -1, approximately one to two magnitudes larger than those detected by downhole instruments. Spectrum-frequency analysis of the continuous surface recordings indicated high seismic energy associated with injection stages. The seismic energy can be back-projected to a volume that surrounds each injection stage. Imaging the seismic emission volume (SEV) appears to be an effective way to map the stimulated reservoir volume, as well as natural fractures.
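
    The back-projection step described in the abstract can be sketched as a travel-time-shifted stack over a grid of candidate source locations. Everything below (station geometry, velocity, impulsive synthetic records) is a made-up toy, not the authors' processing chain:

```python
import numpy as np

v = 3000.0                                  # assumed P-wave speed, m/s
dt = 0.004                                  # sample interval, s
stations = np.array([[0.0, 0.0], [1000.0, 0.0],
                     [0.0, 1000.0], [1000.0, 1000.0]])
true_src = np.array([400.0, 600.0])         # used only to build the synthetic
nsamp = 512
records = np.zeros((len(stations), nsamp))
for i, sta in enumerate(stations):
    tt = np.linalg.norm(sta - true_src) / v
    records[i, int(round(tt / dt))] = 1.0   # impulsive arrival

# back-project: shift each record by the candidate travel time and stack
xs = np.arange(0.0, 1001.0, 100.0)
ys = np.arange(0.0, 1001.0, 100.0)
image = np.zeros((xs.size, ys.size))
for ix, x in enumerate(xs):
    for iy, y in enumerate(ys):
        for i, sta in enumerate(stations):
            tt = np.hypot(sta[0] - x, sta[1] - y) / v
            image[ix, iy] += records[i, int(round(tt / dt))]
# the stack should be brightest at the grid node nearest the true source
```

    In practice one stacks envelopes or characteristic functions of noisy continuous data rather than clean impulses, but the kinematics are the same.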

  6. Seismic Adequacy Review of PC012 SCEs that are Potential Seismic Hazards with PC3 SCEs at Cold Vacuum Dryer (CVD) Facility

    SciTech Connect

    OCOMA, E.C.

    1999-08-12

    This document provides a seismic adequacy review of PC012 systems, components & equipment (SCE) anchorage that are potential seismic interaction hazards with PC3 SCEs during a Design Basis Earthquake. The PC012 items are identified in the Safety Equipment List as 3/1 SCEs.

  7. Seismic Imager Space Telescope

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers,Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe; Ampuero, Jean Paul; Leprince, Sebastien; Michel, Remi

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. 
A GSI would revolutionize the understanding of earthquake dynamics, perhaps leading ultimately to effective warning capabilities, to improved management of earthquake risk, and to improved public safety policies. The position of the spacecraft, its high optical quality, large field of view, and large field of regard will make it an ideal platform for other scientific studies. The same data could be simply reused for other studies. If different data, such as multi-spectral data, is required, additional instruments could share the telescope.

  8. Monitoring and verifying changes of organic carbon in soil

    USGS Publications Warehouse

    Post, W.M.; Izaurralde, R. C.; Mann, L. K.; Bliss, Norman B.

    2001-01-01

    Changes in soil and vegetation management can strongly impact the rates of carbon (C) accumulation and loss in soil, even over short periods of time. Detecting the effects of such changes in accumulation and loss rates on the amount of C stored in soil presents many challenges. Consideration of the temporal and spatial heterogeneity of soil properties, general environmental conditions, and management history is essential when designing methods for monitoring and projecting changes in soil C stocks. Several approaches and tools will be required to develop reliable estimates of changes in soil C at scales ranging from the individual experimental plot to whole regional and national inventories. In this paper we present an overview of soil properties and processes that must be considered. We classify the methods for determining soil C changes as direct or indirect. Direct methods include field and laboratory measurements of total C, various physical and chemical fractions, and C isotopes. A promising direct method is eddy covariance measurement of CO2 fluxes. Indirect methods include simple and stratified accounting, use of environmental and topographic relationships, and modeling approaches. We present a conceptual plan for monitoring soil C changes at regional scales that can be readily implemented. Finally, we anticipate significant improvements in soil C monitoring with the advent of instruments capable of direct and precise measurements in the field as well as methods for interpreting and extrapolating spatial and temporal information.

  9. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, B.T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  10. Seismic databases of The Caucasus

    NASA Astrophysics Data System (ADS)

    Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.

    2012-12-01

    The Caucasus is one of the active segments of the Alpine-Himalayan collision belt, and the region needs continuous seismic monitoring for a better understanding of its tectonic processes. The Seismic Monitoring Center of Georgia (Ilia State University) operates the country's digital seismic network and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily accessible, and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, drawn from both analog and digital seismic networks. The first analog seismic station in Georgia was installed in Tbilisi in 1899. The number of analog stations increased during the following decades, and in the 1980s about 100 analog stations operated across the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 only two analog instruments remained in operation. A new digital seismic network has been developed in Georgia since 2003; the number of digital stations has been growing, and today more than 25 operate in the country. The database includes detailed information about all equipment installed at the seismic stations and is available online, providing a convenient interface for seismic data exchange between neighboring Caucasus countries. It also simplifies both seismic data processing and transfer to the database, and decreases operator mistakes during routine work. The database was created using PHP, MySQL, JavaScript, Ajax, GMT, Gmap, and Hypoinverse.

  11. Small Arrays for Seismic Intruder Detections: A Simulation Based Experiment

    NASA Astrophysics Data System (ADS)

    Pitarka, A.

    2014-12-01

    Seismic sensors such as geophones and fiber optics have been increasingly recognized as promising technologies for intelligence surveillance, including intruder detection and perimeter defense systems. Geophone arrays have the capability to provide cost-effective intruder detection in protecting assets with large perimeters. A seismic intruder detection system uses one or multiple arrays of geophones designed to record seismic signals from footsteps and ground vehicles. Using a series of real-time signal processing algorithms, the system detects, classifies, and monitors the intruder's movement. We have carried out numerical experiments to demonstrate the capability of a seismic array to detect moving targets that generate seismic signals. The seismic source is modeled as a vertical force acting on the ground that generates continuous impulsive seismic signals with different predominant frequencies. Frequency-wavenumber analysis of the synthetic array data was used to demonstrate the array's capability to accurately determine the intruder's movement direction. The performance of the array was also analyzed in detecting two or more objects moving at the same time. One of the drawbacks of using a single-array system is its inefficiency at detecting seismic signals deflected by large underground objects. We show simulation results of the effect of an underground concrete block in shielding the seismic signal coming from an intruder. Based on simulations, we found that multiple small arrays can greatly improve the system's detection capability in the presence of underground structures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
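
    Direction estimation from a small array, as in the frequency-wavenumber analysis mentioned above, can be illustrated with a simple delay-and-sum azimuth scan (a time-domain cousin of f-k analysis). The five-element geometry, 300 m/s wave speed, and 20 Hz signal below are assumptions made for the synthetic, not parameters from the study:

```python
import numpy as np

dt = 0.002
v = 300.0                                   # assumed near-surface wave speed, m/s
az_true = 60.0                              # propagation azimuth of the synthetic, deg
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0],
                   [-10.0, 0.0], [0.0, -10.0]])
slow = np.array([np.cos(np.radians(az_true)),
                 np.sin(np.radians(az_true))]) / v    # slowness vector, s/m
nsamp = 1024
t = np.arange(nsamp) * dt
# plane wave: each element sees the signal delayed by coord . slowness
traces = np.array([np.sin(2 * np.pi * 20.0 * (t - c @ slow)) for c in coords])

# delay-and-sum: scan trial azimuths, keep the one with maximum stack power
best_az, best_pow = None, -1.0
for az in range(0, 360, 5):
    s = np.array([np.cos(np.radians(az)), np.sin(np.radians(az))]) / v
    stack = np.zeros(nsamp)
    for i, c in enumerate(coords):
        stack += np.roll(traces[i], -int(round((c @ s) / dt)))
    p = float(np.mean(stack ** 2))
    if p > best_pow:
        best_az, best_pow = az, p
# best_az should recover the propagation azimuth of the synthetic
```

    A full f-k analysis would also scan slowness magnitude, which separates footstep-like surface waves from faster body-wave noise.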

  12. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria changed greatly after the Boumerdes earthquake of May 21st, 2003. The installation of the new Algerian Digital Seismic Network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66: 15 broadband, 2 very broadband, and 47 short-period stations, plus 21 accelerometers, connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional, and tele-seismic events have been located by automatic processing, revised, and archived in databases. This new data set is characterized by the accuracy of the automatic locations of local seismicity and the ability to determine focal mechanisms. Periodically, recorded data, including earthquakes, calibration pulses, and cultural noise, are checked using power spectral density (PSD) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi-real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass position of the sensors, and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network supports many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, ShakeMap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. Over the next two years, the acquisition of new seismic equipment, reaching 50 new broadband stations, will densify the network and enhance the performance of the Algerian Digital Seismic Network.
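
    Noise-level checks of the PQLX/PSD kind described above reduce, at their core, to estimating power spectral densities of station recordings. A minimal sketch with SciPy's Welch estimator on a synthetic record (the 0.2 Hz "microseism" tone and the noise level are invented for illustration):

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                  # sample rate, Hz
t = np.arange(0, 600, 1 / fs)               # ten minutes of data
rng = np.random.default_rng(0)
# synthetic record: a 0.2 Hz microseism-like tone plus white instrument noise
x = 0.5 * np.sin(2 * np.pi * 0.2 * t) + rng.normal(0.0, 0.1, t.size)

f, pxx = welch(x, fs=fs, nperseg=4096)      # PSD estimate, (units)^2/Hz
pxx_db = 10.0 * np.log10(pxx)               # dB scale, as in PQLX-style plots
peak_freq = f[np.argmax(pxx)]               # should sit near the 0.2 Hz tone
```

    Accumulating many such hourly PSDs into probability density functions is what produces the familiar PQLX noise-model plots.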

  13. Savannah River Site disaggregated seismic spectra

    SciTech Connect

    Stephenson, D.E.

    1993-02-01

    The objective of this technical note is to characterize seismic ground motion at the Savannah River Site (SRS) from postulated earthquakes that may impact facilities at the site. This task is accomplished by reviewing the deterministic and probabilistic assessments of the seismic hazard to establish the earthquakes that control the hazard at the site, and then evaluating the associated seismic ground motions in terms of response spectra. For engineering design criteria of earthquake-resistant structures, response spectra serve the function of characterizing ground motions as a function of period or frequency. These motions then provide the input parameters that are used in the analysis of structural response. Because they use the maximum response, response spectra are an inherently conservative design tool. Response spectra are described in terms of amplitude, duration, and frequency content, and these are related to source parameters, travel path, and site conditions. Studies by a number of investigators have shown by statistical analysis that the response spectrum values at different periods differ for earthquakes of different magnitudes. These facts support Jennings' position that using different shapes of design spectra for earthquakes of different magnitudes and travel paths is a better practice than employing a single, general-purpose shape. All seismic ground motion characterization results indicate that the PGA is controlled by a local event with Mw < 6 and R < 30 km. The results also show that lower frequencies are controlled by a larger, more distant event, typically the Charleston source. The PGA of 0.2 g, based originally on the Blume study, is consistent with LLNL report UCRL-15910 (1990) and with the DOE position on LLNL/EPRI.
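
    The response spectra discussed here are computed by sweeping a damped single-degree-of-freedom oscillator over period and recording its peak response. A minimal sketch using Newmark average-acceleration integration (the sinusoidal test motion is an assumption for illustration, not an SRS record):

```python
import numpy as np

def response_spectrum(accel, dt, periods, zeta=0.05):
    """Pseudo-acceleration spectrum of a ground acceleration record for a
    linear SDOF oscillator (Newmark average-acceleration method, unit mass)."""
    sa = np.zeros(len(periods))
    for j, T in enumerate(periods):
        wn = 2.0 * np.pi / T
        k, c = wn ** 2, 2.0 * zeta * wn
        kh = k + 2.0 * c / dt + 4.0 / dt ** 2        # effective stiffness
        u = v = 0.0
        a = -accel[0]                                # u(0) = v(0) = 0
        umax = 0.0
        for ag in accel[1:]:
            ph = -ag + (4.0 / dt ** 2 + 2.0 * c / dt) * u + (4.0 / dt + c) * v + a
            un = ph / kh
            vn = 2.0 * (un - u) / dt - v
            an = -ag - c * vn - k * un               # equilibrium at the new step
            u, v, a = un, vn, an
            umax = max(umax, abs(u))
        sa[j] = wn ** 2 * umax                       # pseudo-acceleration
    return sa

# resonant check: a 1 Hz oscillator driven at 1 Hz with 5% damping should
# amplify toward 1/(2*zeta) = 10 times the ground-motion amplitude
dt = 0.005
t = np.arange(0.0, 20.0, dt)
ag = np.sin(2.0 * np.pi * t)
sa = response_spectrum(ag, dt, [1.0])
```

    Sweeping `periods` over, say, 0.03-10 s for a recorded accelerogram produces the familiar design-spectrum shape whose amplitude, duration, and frequency content the note discusses.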

  15. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
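
    The two properties discussed here (the LR rising with replicate number toward the inverse match probability) can be reproduced in a deliberately simplified single-allele dropout/drop-in model. The model structure and parameter values below are invented for illustration and are far simpler than likeLTD's:

```python
# toy model: a single allele A is seen in all r replicates.
# Hp: the suspect (who carries A) is the donor; dropout prob d per replicate.
# Hd: an unknown donor; with prob p (allele frequency) they carry A,
#     otherwise A must appear by drop-in with prob c per replicate.
d, c, p = 0.3, 0.05, 0.1

def likelihood_ratio(r):
    numerator = (1.0 - d) ** r                            # P(data | Hp)
    denominator = p * (1.0 - d) ** r + (1.0 - p) * c ** r # P(data | Hd)
    return numerator / denominator

bound = 1.0 / p                                           # inverse match probability
lrs = [likelihood_ratio(r) for r in range(1, 9)]
# lrs increases with replicates and approaches, but never exceeds, bound
```

    Because the drop-in term shrinks faster than the dropout term as r grows (c < 1 - d here), the Hd likelihood becomes dominated by the true-match term p, so the LR climbs monotonically toward 1/p, mirroring the behavior the paper verifies for likeLTD.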

  16. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    SciTech Connect

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian; Kamali, M. Koohi

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475, and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in building design. A catalogue of seismic events that includes both historical and instrumental events was developed, covering the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels, using the SEISRISK III software.
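
    Recurrence relationships of the Gutenberg-Richter form log10 N = a - bM underlie this kind of hazard assessment. A sketch of fitting the b-value with Aki's maximum-likelihood estimator on a synthetic catalogue (the b = 1, Mc = 3 catalogue is simulated, not the Shahrekord data):

```python
import numpy as np

rng = np.random.default_rng(1)
b_true, mc = 1.0, 3.0                  # synthetic b-value and completeness magnitude
beta = b_true * np.log(10.0)
# G-R magnitudes above Mc are exponentially distributed with rate beta
mags = mc + rng.exponential(1.0 / beta, size=5000)

# Aki (1965) maximum-likelihood estimate for a continuous magnitude catalogue
b_hat = np.log10(np.e) / (mags.mean() - mc)
```

    With the recurrence parameters and source geometries in hand, a tool such as SEISRISK III integrates over sources and ground-motion models to produce the iso-acceleration maps described above.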

  17. Sloshing of coolant in a seismically isolated reactor

    SciTech Connect

    Wu, Ting-shu; Gvildys, J.; Seidensticker, R.W.

    1988-01-01

    During a seismic event, the liquid coolant inside the reactor vessel undergoes sloshing, which is a low-frequency phenomenon. In a reactor system incorporating seismic isolation, the isolation frequency is usually also very low, so there is concern about potential amplification of the sloshing motion of the liquid coolant. This study investigates the effects of seismic isolation on the sloshing of liquid coolant inside the reactor vessel of a liquid-metal-cooled reactor. Based on a synthetic ground motion whose response spectra envelop those specified by NRC Regulatory Guide 1.60, it is found that the maximum sloshing wave height increases from 18 in. to almost 30 in. when the system is seismically isolated. Since higher sloshing waves may introduce severe impact forces and thermal shocks to the reactor closure and other components within the reactor vessel, adequate design considerations should be made either to suppress the wave height or to reduce the effects caused by high waves.
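
    The low sloshing frequency referred to above follows from linear potential-flow theory: for an upright cylindrical tank the first-mode frequency is f1 = (1/2*pi) * sqrt((g*lam1/R) * tanh(lam1*h/R)), with lam1 ~ 1.8412 the first zero of J1'. The 10 m radius and 12 m depth below are hypothetical, not the study's reactor dimensions:

```python
import math

def first_sloshing_frequency(radius_m, depth_m, g=9.81):
    """First sloshing mode of an upright cylindrical tank (Hz),
    linear potential-flow result; lam1 is the first zero of J1'."""
    lam1 = 1.8412
    return math.sqrt((g * lam1 / radius_m)
                     * math.tanh(lam1 * depth_m / radius_m)) / (2.0 * math.pi)

# hypothetical vessel: 10 m radius, 12 m liquid depth
f1 = first_sloshing_frequency(10.0, 12.0)   # roughly 0.2 Hz, i.e. a ~5 s period
```

    A period of several seconds sits close to typical isolation periods, which is exactly why coupling between the isolation system and the sloshing mode is a design concern.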

  18. 75 FR 31288 - Plant-Verified Drop Shipment (PVDS)-Nonpostal Documentation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... 111 Plant-Verified Drop Shipment (PVDS)--Nonpostal Documentation AGENCY: Postal Service TM . ACTION... Service, Domestic Mail Manual (DMM ) 705.15. 2.14 to clarify that PS Form 8125, Plant-Verified Drop...: As a result of reviews of USPS policy concerning practices at induction points of plant-verified...

  19. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  20. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  1. 43 CFR 3602.29 - How will BLM verify my production?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false How will BLM verify my production? 3602.29... Materials Sales Administration of Sales 3602.29 How will BLM verify my production? (a) You must submit at... your sales contract so BLM can verify that you have made the required payments. BLM will specify...

  2. 43 CFR 3602.29 - How will BLM verify my production?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false How will BLM verify my production? 3602.29... Materials Sales Administration of Sales 3602.29 How will BLM verify my production? (a) You must submit at... your sales contract so BLM can verify that you have made the required payments. BLM will specify...

  3. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  4. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  5. Hanford annual second quarter seismic report, fiscal year 1998: Seismicity on and near the Hanford Site, Pasco, Washington

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.; Rohay, A.C.

    1998-06-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. The staff also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the second quarter of FY98 was 99.92% for stations in the HSN and 99.46% for stations of the EWRN. For the second quarter of FY98, the acquisition computer triggered 159 times. Of these triggers, 14 were local earthquakes: 7 (50%) in the Columbia River Basalt Group, 3 (21%) in the pre-basalt sediments, and 4 (29%) in the crystalline basement. The geologic and tectonic environments where these earthquakes occurred are discussed in this report. The most significant seismic event of the second quarter occurred on March 23, 1998, when a magnitude 1.9 (Mc) earthquake near Eltopia, Washington, was felt by local residents. Although this was a small event, it was felt at the surface and is an indication of the potential impact on Hanford of seismic events that are common to the Site.

  6. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

    This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable a parameter estimation of the system parametric uncertainty. The adaptation is based on both the tracking error and predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time delay margin using a matrix measure method. Using this analytical method, the free design parameter n of the optimal control modification adaptive law can be determined to meet a specification of stability margin for verification purposes.

  7. Seismic isolation of nuclear power plants using elastomeric bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

    Seismic isolation using low damping rubber (LDR) and lead-rubber (LR) bearings is a viable strategy for mitigating the effects of extreme earthquake shaking on safety-related nuclear structures. Although seismic isolation has been deployed in nuclear structures in France and South Africa, it has not seen widespread use because of limited new build nuclear construction in the past 30 years and a lack of guidelines, codes and standards for the analysis, design and construction of isolation systems specific to nuclear structures. The nuclear accident at Fukushima Daiichi in March 2011 has led the nuclear community to consider seismic isolation for new large light water and small modular reactors to withstand the effects of extreme earthquakes. The mechanical properties of LDR and LR bearings are not expected to change substantially in design basis shaking. However, under shaking more intense than design basis, the properties of the lead cores in lead-rubber bearings may degrade due to heating associated with energy dissipation, some bearings in an isolation system may experience net tension, and the compression and tension stiffness may be affected by the horizontal displacement of the isolation system. The effects of intra-earthquake changes in mechanical properties on the response of base-isolated nuclear power plants (NPPs) were investigated using an advanced numerical model of a lead-rubber bearing that has been verified and validated, and implemented in OpenSees and ABAQUS. A series of experiments was conducted at the University at Buffalo to characterize the behavior of elastomeric bearings in tension. The test data were used to validate a phenomenological model of an elastomeric bearing in tension. The value of three times the shear modulus of the rubber was found to be a reasonable estimate of the cavitation stress of an elastomeric bearing. 
The sequence of loading did not change the behavior of an elastomeric bearing under cyclic tension, and there was no significant change in the shear modulus, compressive stiffness, and buckling load of a bearing following cavitation. Response-history analysis of base-isolated NPPs was performed using a two-node macro model and a lumped-mass stick model. A comparison of responses obtained from analysis using simplified and advanced isolator models showed that the variation in buckling load due to horizontal displacement and strength degradation due to heating of lead cores affect the responses of a base-isolated NPP most significantly. The two-node macro model can be used to estimate the horizontal displacement response of a base-isolated NPP, but a three-dimensional model that explicitly considers all of the bearings in the isolation system will be required to estimate demands on individual bearings, and to investigate rocking and torsional responses. The use of the simplified LR bearing model underestimated the torsional and rocking response of the base-isolated NPP. Vertical spectral response at the top of containment building was very sensitive to how damping was defined for the response-history analysis.
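    The cavitation rule of thumb reported in the abstract can be written as a one-line estimate (a minimal sketch, not the authors' code; the example shear modulus is an assumed typical value for low damping rubber):

    ```python
    # Rule of thumb from the study: the cavitation (tensile) stress of an
    # elastomeric bearing is roughly three times the rubber shear modulus.
    def cavitation_stress(shear_modulus_mpa: float) -> float:
        """Estimated cavitation stress in MPa, given G in MPa."""
        return 3.0 * shear_modulus_mpa

    # Assumed typical low-damping rubber, G = 0.7 MPa
    print(cavitation_stress(0.7))  # about 2.1 MPa
    ```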

  8. A future for drifting seismic networks

    NASA Astrophysics Data System (ADS)

    Simons, F. J.; Nolet, G.; Babcock, J.

    2007-12-01

One-dimensional, radial Earth models are sufficiently well constrained to accurately locate earthquakes and calculate the paths followed by seismic rays. The differences between observations and theoretical predictions of seismograms in such Earth models can be used to reconstruct the three-dimensional wave speed distribution in the regions sampled by the seismic waves, by the technique of seismic tomography. Caused by thermal, compositional, and textural variations, wave speed anomalies remain the premier data source to fully understand the structure and evolution of our planet, from the scale of mantle convection and the mechanisms of heat transfer from core to surface to the interaction between the deep Earth and surface processes such as plate motion and crustal deformation. Unequal geographical data coverage continues to fundamentally limit the quality of tomographic reconstructions of seismic wave speeds in the interior of the Earth. Only at great cost can geophysicists overcome the difficulties of placing seismographs on the two thirds of the Earth's surface that is covered by oceans. The lack of spatial data coverage strongly hampers the determination of the structure of the Earth in the uncovered regions: all 3-D Earth models are marked by blank spots in areas, distributed throughout the Earth, where little or no information can be obtained. As a possible solution to gaining equal geographic data coverage, we have developed MERMAID, a prototype mobile receiver that could provide an easy, cost-effective way to collect seismic data in the ocean. It is a modification of the robotic floating instruments designed and used by oceanographers. Like them, MERMAID spends its life at depth but is capable of surfacing using a pump and bladder. We have equipped it with a hydrophone to record water pressure variations induced by compressional (P) waves. 
Untethered and passively drifting, such a floating seismometer will surface upon detection of a "useful" seismic event (for seismic tomography, that is), determine a GPS location, and transmit the waveforms to a satellite. In this presentation we discuss the progress made in this field by our group. More specifically, we discuss the results of preliminary tests conducted offshore La Jolla in 2003 and 2004, as well as just-obtained results from a third successful in situ test completed in August 2007. We will draw attention to design issues and bottlenecks, and to the need for and features of the sophisticated onboard data analysis software which we have developed and tested. We will chart a road map toward our ultimate goal: a worldwide array of MERMAID floating hydrophones, on the scale of the current international land-based seismic arrays. This, we believe, has the potential to progressively eliminate the discrepancies in spatial coverage that currently result in seismic Earth models that are very poorly resolved in places.

  9. Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel

    USGS Publications Warehouse

    Wang, Z.; Shi, B.; Kiefer, J.D.

    2005-01-01

PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is the foundation of those decisions. Seismologists should provide the best information on seismic hazards and communicate it to users and policy makers. However, too little effort has gone into communicating the uncertainties in seismic hazard assessment in the central U.S. Use of 10%, 5%, and 2% PE in 50 years causes confusion in communicating seismic hazard assessment. It would be easier to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.
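    For readers unfamiliar with the PE convention mentioned above, the three exceedance levels map to mean return periods under the usual Poisson occurrence assumption; a minimal sketch (not from the paper):

    ```python
    import math

    # Under a Poisson model, "p probability of exceedance in t years"
    # corresponds to a mean return period T = -t / ln(1 - p).
    def return_period(pe: float, years: float = 50.0) -> float:
        return -years / math.log(1.0 - pe)

    for pe in (0.10, 0.05, 0.02):
        print(f"{pe:.0%} PE in 50 yr -> return period ~{return_period(pe):.0f} yr")
    # 10% -> ~475 yr, 5% -> ~975 yr, 2% -> ~2475 yr
    ```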

  10. Seismic hazard from induced seismicity: effect of time-dependent hazard variables

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2012-12-01

Geothermal systems are drawing large attention worldwide as an alternative source of energy. Although geothermal energy is beneficial, field operations can produce induced seismicity whose effects can range from light and unfelt to severely damaging. In a recent paper (Convertito et al., 2012), we investigated the effect of time-dependent seismicity parameters on seismic hazard from induced seismicity. The analysis considered the time-variation of the b-value of the Gutenberg-Richter relationship and of the seismicity rate, and assumed a non-homogeneous Poisson model to solve the hazard integral. The procedure was tested in The Geysers geothermal area in Northern California, where commercial exploitation started in the 1960s. The analyzed dataset consists of earthquakes recorded during the period 2007 through 2010 by the LBNL Geysers/Calpine network. To test the reliability of the analysis, we applied a simple forecasting procedure which compares the estimated hazard values, in terms of ground-motion values having a fixed probability of exceedance, with the observed ground-motion values. The procedure is feasible for monitoring purposes and for calibrating the production/extraction rate to avoid adverse consequences. However, one of the main assumptions we made is that both the median predictions and the standard deviation of the ground-motion prediction equation (GMPE) are stationary. Particularly for geothermal areas, where the number of recorded earthquakes can change rapidly with time, we want to investigate how a variation of the coefficients of the GMPE used, and of its standard deviation, influences the hazard estimates. Basically, we hypothesize that the physical-mechanical properties of a highly fractured medium that is continuously perturbed by field operations can produce variations of both source and medium properties that cannot be captured by a stationary GMPE. 
We assume a standard GMPE that accounts for the main effects modifying the scaling of the peak ground-motion parameters (e.g., magnitude, geometrical spreading and anelastic attenuation). Moreover, we consider both the inter-event and intra-event components of the standard deviation. For comparison, we use the same dataset analyzed by Convertito et al. (2012), and for successive time windows we perform the regression analysis to infer the time-dependent coefficients of the GMPE. After having tested the statistical significance of the new coefficients and having verified a reduction in the total standard deviation, we introduce the new model into the hazard integral. Hazard maps and site-specific analyses in terms of a uniform hazard spectrum are used to compare the new results with those obtained in our previous study, to investigate which coefficients and which components of the total standard deviation really matter for refining seismic hazard estimates for induced seismicity. Convertito et al. (2012). From Induced Seismicity to Direct Time-Dependent Seismic Hazard, BSSA 102(6), doi:10.1785/0120120036.
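    The windowed regression step can be sketched on synthetic data (an illustrative reconstruction, not the authors' code; the GMPE functional form and all coefficient values below are assumptions):

    ```python
    import numpy as np

    # Fit log10(PGA) = c0 + c1*M - c2*log10(R) - c3*R for one time window
    # and report the total standard deviation of the residuals. Repeating
    # this over successive windows yields time-dependent coefficients.
    def fit_gmpe(mag, dist, log_pga):
        A = np.column_stack([np.ones_like(mag), mag, -np.log10(dist), -dist])
        coef, *_ = np.linalg.lstsq(A, log_pga, rcond=None)
        sigma = np.std(log_pga - A @ coef)  # total standard deviation
        return coef, sigma

    # Synthetic "window" of induced events (assumed values throughout)
    rng = np.random.default_rng(0)
    mag = rng.uniform(1.0, 3.5, 200)
    dist = rng.uniform(1.0, 20.0, 200)       # hypocentral distance, km
    true = np.array([-1.5, 0.8, 1.0, 0.01])  # c0, c1, c2, c3
    log_pga = (true[0] + true[1] * mag - true[2] * np.log10(dist)
               - true[3] * dist + rng.normal(0.0, 0.2, mag.size))
    coef, sigma = fit_gmpe(mag, dist, log_pga)
    print(coef.round(2), round(float(sigma), 2))
    ```

    Comparing the recovered coefficients and sigma across windows is what reveals whether the medium response is drifting.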

  11. Seismic vibration source

    NASA Technical Reports Server (NTRS)

    Dowler, W. L.; Varsi, G.; Yang, L. C. (Inventor)

    1979-01-01

A system for vibrating the earth at a location where seismic mapping is to take place is described. A relatively shallow hole, such as one 10 feet deep, is formed in the earth; a solid propellant is placed in the hole; and a portion of the hole above the propellant is sealed with a device that can rapidly open and close to allow a repeatedly interrupted escape of gas. The propellant is ignited so that high-pressure gas is created, which escapes in pulses to vibrate the earth.

  12. Seismic risk perception test

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro

    2013-04-01

The perception of risks involves the process of collecting, selecting and interpreting signals about uncertain impacts of events, activities or technologies. In the natural sciences the term risk seems to be clearly defined: it means the probability distribution of adverse effects. The everyday use of risk, however, has different connotations (Renn, 2008), and the two terms, hazards and risks, are often used interchangeably by the public. Knowledge, experience, values, attitudes and feelings all influence the thinking and judgement of people about the seriousness and acceptability of risks. Within the social sciences the terminology of 'risk perception' has become the conventional standard (Slovic, 1987). The mental models and other psychological mechanisms which people use to judge risks (such as cognitive heuristics and risk images) are internalized through social and cultural learning and constantly moderated (reinforced, modified, amplified or attenuated) by media reports, peer influences and other communication processes (Morgan et al., 2001). Yet a theory of risk perception that offers an integrative, as well as empirically valid, approach to understanding and explaining risk perception is still missing. Understanding the perception of risk requires considering several areas, social, psychological and cultural, and their interactions. Among the various international research efforts on the perception of natural hazards, the semantic differential method (Osgood, C.E., Suci, G., & Tannenbaum, P. 1957, The Measurement of Meaning. Urbana, IL: University of Illinois Press) seemed promising. The test on seismic risk perception was therefore constructed with the semantic differential method, using a seven-point Likert scale to compare opposite adjectives or terms. 
The test consists of an informative part and six sections dedicated, respectively, to: hazard; vulnerability (home and workplace); exposed value (with reference to population and territory); seismic risk in general; risk information and its sources; and comparison between seismic risk and other natural hazards. Informative data include: region, province, municipality of residence, date of compilation, age, sex, place of birth, nationality, marital status, children, level of education, and employment. The test yields a perception score for each factor: hazard, exposed value, and vulnerability. These scores can be related to the scientific data on hazard, vulnerability and exposed value. In January 2013 a survey started in the Po Valley and Southern Apennines. The survey will be conducted via the web, using institutional sites of regions, provinces and municipalities, local online newspapers, etc. Preliminary data will be discussed. Improving our understanding of the perception of seismic risk would allow us to inform the public more effectively and to build better educational projects to mitigate risk.
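    A minimal sketch of how one semantic-differential section might be scored (illustrative only; the item names and the reversed-item handling are assumptions, not taken from the actual questionnaire):

    ```python
    # Each item pairs opposite adjectives on a 1..7 Likert scale; items
    # whose polarity runs the other way are reversed (v -> 8 - v) before
    # averaging into a section score.
    def section_score(responses, reversed_items=()):
        """Mean 1..7 score for a section; responses map item -> rating."""
        vals = [(8 - v) if item in reversed_items else v
                for item, v in responses.items()]
        return sum(vals) / len(vals)

    # Hypothetical hazard-section responses (item names are invented)
    hazard = {"safe_dangerous": 6, "calm_alarming": 5, "unlikely_likely": 2}
    print(section_score(hazard, reversed_items={"unlikely_likely"}))
    ```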

  13. Development of the Multi-Level Seismic Receiver (MLSR)

    SciTech Connect

    Sleefe, G.E.; Engler, B.P.; Drozda, P.M.; Franco, R.J.; Morgan, J.

    1995-02-01

The Advanced Geophysical Technology Department (6114) and the Telemetry Technology Development Department (2664) have, in conjunction with the Oil Recovery Technology Partnership, developed a Multi-Level Seismic Receiver (MLSR) for use in crosswell seismic surveys. The MLSR was designed and evaluated with the significant support of many partners in the oil exploration industry. The unit was designed to record and process superior-quality seismic data while operating in severe borehole environments, including high temperature (up to 200{degrees}C) and static pressure (10,000 psi). This development has utilized state-of-the-art technology in transducers, data acquisition, and real-time data communication and processing. The mechanical design of the receiver has been carefully modeled and evaluated to ensure excellent signal coupling into the receiver.

  14. Elastic-Wavefield Seismic Stratigraphy: A New Seismic Imaging Technology

    SciTech Connect

    Bob A. Hardage; Milo M. Backus; Michael V. DeAngelo; Sergey Fomel; Khaled Fouad; Robert J. Graebner; Paul E. Murray; Randy Remington; Diana Sava

    2006-07-31

    The purpose of our research has been to develop and demonstrate a seismic technology that will provide the oil and gas industry a better methodology for understanding reservoir and seal architectures and for improving interpretations of hydrocarbon systems. Our research goal was to expand the valuable science of seismic stratigraphy beyond the constraints of compressional (P-P) seismic data by using all modes (P-P, P-SV, SH-SH, SV-SV, SV-P) of a seismic elastic wavefield to define depositional sequences and facies. Our objective was to demonstrate that one or more modes of an elastic wavefield may image stratal surfaces across some stratigraphic intervals that are not seen by companion wave modes and thus provide different, but equally valid, information regarding depositional sequences and sedimentary facies within that interval. We use the term elastic wavefield stratigraphy to describe the methodology we use to integrate seismic sequences and seismic facies from all modes of an elastic wavefield into a seismic interpretation. We interpreted both onshore and marine multicomponent seismic surveys to select the data examples that we use to document the principles of elastic wavefield stratigraphy. We have also used examples from published papers that illustrate some concepts better than did the multicomponent seismic data that were available for our analysis. In each interpretation study, we used rock physics modeling to explain how and why certain geological conditions caused differences in P and S reflectivities that resulted in P-wave seismic sequences and facies being different from depth-equivalent S-wave sequences and facies across the targets we studied.

  15. NSR&D Program Fiscal Year (FY) 2015 Call for Proposals Mitigation of Seismic Risk at Nuclear Facilities using Seismic Isolation

    SciTech Connect

    Coleman, Justin

    2015-02-01

Seismic isolation (SI) has the potential to drastically reduce the seismic response of structures, systems, or components (SSCs) and therefore the risk associated with large seismic events (a large seismic event could be defined as the design basis earthquake (DBE) and/or the beyond design basis earthquake (BDBE), depending on the site location). This corresponds to a potential increase in nuclear safety: minimizing the structural response minimizes the risk of material release during large seismic events, whose magnitude and frequency carry uncertainty. The national consensus standard, American Society of Civil Engineers (ASCE) Standard 4, Seismic Analysis of Safety Related Nuclear Structures, recently incorporated language and commentary for seismically isolating a large light water reactor or similar large nuclear structure. Some potential benefits of SI are: 1) substantially decoupling the SSC from the earthquake hazard, thus decreasing the risk of material release during large earthquakes; 2) cost savings for the facility and/or equipment; and 3) applicability to both nuclear (current and next generation) and high-hazard non-nuclear facilities. Issue: To date, no one has evaluated how the reduction in seismic risk translates into reduced cost to construct a nuclear facility. Objective: Use seismic probabilistic risk assessment (SPRA) to evaluate the reduction in seismic risk and estimate the potential cost savings of seismically isolating a generic nuclear facility. This project would leverage ongoing Idaho National Laboratory (INL) activities that are developing advanced SPRA methods using Nonlinear Soil-Structure Interaction (NLSSI) analysis. Technical Approach: The proposed study is intended to obtain an estimate of the reduction in seismic risk and construction cost that might be achieved by seismically isolating a nuclear facility. The nuclear facility is a representative pressurized water reactor building nuclear power plant (NPP) structure. 
Figure 1: Project activities.
The study will consider a representative NPP reinforced concrete reactor building and a representative plant safety system, and will leverage existing research and development (R&D) activities at INL. Figure 1 shows the proposed study steps, with the steps in blue representing activities already funded at INL and the steps in purple the activities that would be funded under this proposal. The following results will be documented: 1) a comparison of seismic risk for the non-seismically isolated (non-SI) and seismically isolated (SI) NPP, and 2) an estimate of construction cost savings when implementing SI at the site of the generic NPP.

  16. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. For a modern geothermal system to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so that potential impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels on different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation medium properties are not constant in time. We use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values, aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground-motion values attenuate in the same way; as a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. 
We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values having fixed probabilities of exceedance. Those values can finally be compared with the observations using appropriate statistical tests.
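    The time-evolving hazard computation can be sketched with a non-homogeneous Poisson model (a minimal illustration, not the authors' implementation; the rate history and the per-event exceedance probability below are invented numbers):

    ```python
    import math

    # P(at least one exceedance in [0, T]) = 1 - exp(-integral of lam(t)*p dt),
    # where lam(t) is the time-varying event rate and p the probability
    # that a single event exceeds the ground-motion level of interest.
    def prob_exceedance(lam, T, p_exceed_given_event, dt=0.01):
        t, integral = 0.0, 0.0
        while t < T:
            integral += lam(t) * p_exceed_given_event * dt
            t += dt
        return 1.0 - math.exp(-integral)

    # Assumed rate history: seismicity ramping up during stimulation (1/day)
    lam = lambda t: 0.5 + 0.05 * t
    print(round(prob_exceedance(lam, 30.0, 0.02), 3))  # about 0.53
    ```

    With a constant rate this reduces to the classical homogeneous-Poisson hazard; the time dependence enters only through the integrated rate.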

  17. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    SciTech Connect

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35{degrees}--39{degrees}N and longitudes 87{degrees}--92{degrees}W. Most of these earthquakes occur within a 1.5{degrees} x 2{degrees} zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used complementarily with regional seismic network data and, in addition, in verifying the correctness of previously published focal mechanism solutions.
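    As a side note (not from the report), seismic moments obtained this way are conventionally mapped to moment magnitude via the standard relation:

    ```python
    import math

    # Standard moment-magnitude relation: Mw = (2/3) * (log10(M0) - 9.1),
    # with the seismic moment M0 in newton-meters.
    def moment_magnitude(m0_newton_meters: float) -> float:
        return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

    print(round(moment_magnitude(1e17), 2))  # a moderate event: Mw = 5.27
    ```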

  18. Seismic Prediction While Drilling (SPWD): Seismic exploration ahead of the drill bit using phased array sources

    NASA Astrophysics Data System (ADS)

    Jaksch, Katrin; Giese, Rüdiger; Kopf, Matthias

    2010-05-01

When drilling for deep reservoirs, prior exploration is indispensable. In recent years the focus has shifted toward geological structures such as thin layers or hydrothermal fault systems. Surface 2D or 3D seismics and borehole methods such as Vertical Seismic Profiling (VSP) or Seismic While Drilling (SWD) cannot always resolve these structures, and the resolution worsens the deeper and smaller the sought-after structures are. Potential horizons, such as thin layers in oil exploration or fault zones usable for geothermal energy production, can therefore be missed or remain unidentified while drilling. A device to explore the geology at high resolution ahead of the drill bit, in the direction of drilling, would be of great value: it would allow the drilling path to be adjusted to the actual geology, minimizing exploration risk and hence drilling costs. Within the SPWD project such a device for seismic exploration ahead of the drill bit is being developed. It should allow seismic prediction of areas about 50 to 100 meters ahead of the drill bit with a resolution of one meter. At the GFZ a first prototype consisting of units for seismic sources, receivers and data loggers has been designed and manufactured. Four standard magnetostrictive actuators serve as seismic sources and four 3-component geophones as receivers. Every unit, actuator or geophone, can be rotated in steps of 15° around the longitudinal axis of the prototype to test different measurement configurations. The SPWD prototype emits signal frequencies of about 500 up to 5000 Hz, significantly higher than in VSP and SWD. Increased radiation of seismic wave energy along the borehole axis allows a view into the areas to be drilled. 
Therefore, every actuator must be controlled independently with respect to the amplitude and phase of the source signal, to maximize the energy of the seismic source and reach a sufficient exploration range. The next step for focusing is to use the phased-array method: depending on the seismic wave velocities of the surrounding rock, the spacing of the actuators and the frequencies used, the signal phase for each actuator can be determined. Over the past year, several measurements with the prototype have been carried out under defined conditions at a test site in a mine. The test site consists of a rock block, about 100 by 200 meters, surrounded by three galleries. For testing the prototype, two horizontal boreholes were drilled, directed toward one of the galleries to provide a strong reflector. The borehole seismic data show an overall good signal-to-noise ratio in amplitude and frequency spectra; the data quality correlates strongly with the fracture density along the borehole, with highly fractured intervals associated with a lower signal-to-noise ratio. Additionally, the geophones of the prototype record reflections from ahead and from behind in the seismic data; in particular, the reflections from the gallery ahead are used to calibrate the focusing. The direct seismic wavefield shows distinct compression and shear waves. Analysis of several measurements focused on the direct seismic waves shows that the phased-array technique can clearly influence the directional characteristics of the radiated seismic waves: the amplitudes of the seismic waves can be enhanced up to threefold in the desired direction and simultaneously attenuated in the reverse direction. A major step toward directional investigation in boreholes has been accomplished. 
However, the focusing of the seismic waves must still be improved to maximize the energy in the desired direction, by calibrating the initiating source signals in further measurements. A next step this year is the development of a wireline prototype for application in vertical boreholes at depths of up to 2000 meters. The prototype must be modified and adapted to the pressure and temperature conditions in deep boreholes. This project is funded by the German Federal Environment Ministry.
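    The per-actuator phase/delay computation described above can be illustrated as follows (a hedged sketch; the array geometry, wave speed, and the textbook plane-wave steering formula delay = x·cosθ/v are assumptions, not project specifics):

    ```python
    import math

    # Plane-wave steering: fire each actuator with a delay proportional to
    # its position along the array axis so the wavefronts add up in the
    # chosen direction. Values below are illustrative assumptions.
    def steering_delays(positions_m, velocity_mps, angle_deg):
        """Per-actuator firing delays (s); angle measured from the array axis."""
        c = math.cos(math.radians(angle_deg))
        raw = [p * c / velocity_mps for p in positions_m]
        t0 = min(raw)
        return [t - t0 for t in raw]  # shift so the earliest delay is zero

    # Four actuators 0.5 m apart, P-wave velocity 4000 m/s, axial steering
    print(steering_delays([0.0, 0.5, 1.0, 1.5], 4000.0, 0.0))
    ```

    For axial steering the delays grow linearly along the array (0, 125, 250, 375 µs here); steering off-axis shrinks them by the cosine of the steering angle.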

  19. Development of a HT seismic downhole tool.

    SciTech Connect

    Maldonado, Frank P.; Greving, Jeffrey J.; Henfling, Joseph Anthony; Chavira, David J.; Uhl, James Eugene; Polsky, Yarom

    2009-06-01

    Enhanced Geothermal Systems (EGS) require the stimulation of the drilled well, likely through hydraulic fracturing. Whether fracturing of the rock occurs by shear destabilization of natural fractures or by extensional failure of weaker zones, control of the fracture process will be required to create the flow paths necessary for effective heat mining. As such, microseismic monitoring provides one method for real-time mapping of the fractures created during the hydraulic fracturing process. This monitoring is necessary to help assess stimulation effectiveness and provide the information necessary to properly create the reservoir. In addition, reservoir monitoring of the microseismic activity can provide information on reservoir performance and evolution over time. To our knowledge, no seismic tool exists that will operate above 125 C for the long monitoring durations that may be necessary. Replacing failed tools is costly and introduces potential errors such as depth variance, etc. Sandia has designed a high temperature seismic tool for long-term deployment in geothermal applications. It is capable of detecting microseismic events and operating continuously at temperatures up to 240 C. This project includes the design and fabrication of two High Temperature (HT) seismic tools that will have the capability to operate in both temporary and long-term monitoring modes. To ensure the developed tool meets industry requirements for high sampling rates (>2ksps) and high resolution (24-bit Analog-to-Digital Converter) two electronic designs will be implemented. One electronic design will utilize newly developed 200 C electronic components. The other design will use qualified Silicon-on-Insulator (SOI) devices and will have a continuous operating temperature of 240 C.

  20. Key aspects governing induced seismicity

    NASA Astrophysics Data System (ADS)

    Buijze, Loes; Wassing, Brecht; Fokker, Peter

    2013-04-01

In the past decades numerous examples of earthquakes induced by human-induced changes in subsurface fluid pressures have been reported. This poses a major threat to the future development of some of these operations and calls for an understanding and quantification of the seismicity generated. From geomechanical considerations and insights from laboratory experiments, the factors controlling induced seismicity may be grouped into four categories: the magnitude of the stress disturbance, the pre-existing stress conditions, the reservoir/fault rock properties, and the local geometry. We investigated whether the (relative) contributions of these factors and their influence on the magnitudes generated could be recognized by examining the entire body of reported cases of induced seismicity as a whole, and what this might imply for future developments. An extensive database has been built from over 160 known cases of induced seismicity worldwide, incorporating the relevant geological, seismological and fluid-related parameters. The cases studied include hydrocarbon depletion and secondary recovery, waste water injection, (enhanced) geothermal systems and hydraulic fracturing, with observed magnitudes ranging from less than -1.5 to 7. The parameters taken into account were based on the theoretical background of the mechanisms of induced seismicity and include the injection/depletion-related parameters, (spatial) characteristics of seismicity, lithological properties and the local stress situation. Correlations between the seismic response and the geological/geomechanical characteristics of the various sites were investigated. The injected/depleted volumes and the scale of the activities are major controlling factors on the maximum magnitudes generated. Spatial signatures of seismicity, such as the depth and lateral spread of the seismicity, were observed to be distinct for different activities, which is useful when considering future operations. 
Where available the local stress situation is considered, as well as the influence of the natural seismicity. Finally, we related induced seismicity to several reservoir and fault rock properties, including fault rock stability as is observed from the laboratory. The combination of activities of different natures and associated seismicity occurring through distinct mechanisms in this dataset is very useful for a better understanding of the factors governing induced seismicity and the operation-specific seismic expressions.