Sample records for verifying seismic design

  1. The seismic design handbook

    SciTech Connect

    Naeim, F. (John A. Martin and Associates, Los Angeles, CA (US))

    1989-01-01

    This book contains papers on the planning, analysis, and design of earthquake resistant building structures. Theories and concepts of earthquake resistant design and their implementation in seismic design practice are presented.

  2. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-04-01

    The validity of most quantum signature schemes reported in the literature can be verified by a designated person; however, those schemes are not true designated verifier signature schemes in the traditional sense, because the designated person lacks the capability to efficiently simulate a signature that is indistinguishable from one produced by the signer. This makes them unsuitable for special environments such as e-voting, calls for tenders, and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. Owing to the properties of unitary transformations and a quantum one-way function, only a verifier designated by the signer can verify the validity of a signature, and, thanks to a transcript simulation algorithm, the designated verifier cannot prove to a third party whether the signature was produced by the signer or by himself. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of this scheme. Analysis results show that this new scheme satisfies the main security requirements of designated verifier signature schemes and resists the major attack strategies.

  3. SEISMIC BRIDGE FRAGILITIES FOR POST-DESIGN VERIFICATION

    Microsoft Academic Search

    BRYANT G. NIELSON; MATTHEW E. BOWERS

    Conditional probabilistic statements of bridge damage due to seismic loading, known as fragility curves, are often utilized in risk assessment and prioritization activities. This study proposes and illustrates the use of bridge fragility curves in verifying the performance objectives which govern a bridge's seismic design. The bridge fragility is convolved with a probabilistic representation of the seismic shaking hazard at

  4. Verifying IP-core based system-on-chip designs

    Microsoft Academic Search

    Pankaj Chauhan; Edmund M. Clarke; Yuan Lu; Dong Wang

    1999-01-01

    We describe a methodology for verifying system-on-chip designs. In our methodology, the problem of verifying system-on-chip designs is decomposed into three tasks. First, we verify, once and for all, the standard bus interconnecting IP cores in the system. The next task is to verify the glue logic, which connects the IP cores to the buses. Finally, using the verified bus

  5. Position paper: Seismic design criteria

    SciTech Connect

    Farnworth, S.K.

    1995-05-22

    The purpose of this paper is to document the seismic design criteria to be used on the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies for determining the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  6. On Designatedly Verified (Non-interactive) Watermarking Schemes

    E-print Network

    On Designatedly Verified (Non-interactive) Watermarking Schemes. Malapati Raja Sekhar, Takeshi Okamoto. Abstract: Although many watermarking schemes consider the case of universal verifiability ... a watermarking scheme with (non-interactive) designated verification through non-invertible schemes was proposed

  7. The Hob system for verifying software design properties

    E-print Network

    Lam, Patrick, Ph. D. Massachusetts Institute of Technology

    2007-01-01

    This dissertation introduces novel techniques for verifying that programs conform to their designs. My Hob system, as described in this dissertation, allows developers to statically ensure that implementations preserve ...

  8. Verifying the "correctness" of your optical proximity correction designs

    NASA Astrophysics Data System (ADS)

    Malhotra, Vinod K.; Chang, Fang C.

    1999-07-01

    The emerging demand for smaller and smaller IC features, undiminished by the delay of next-generation stepper technologies, has increased the need for OPC and PSM designs that are becoming critical for leading-edge IC manufacturing. However, modifications made to the original layout by OPC or PSM design tools generally preclude the use of conventional design verification tools to verify the modified designs. Therefore, the question of design 'correctness' often goes unanswered until after the wafers have been printed. This is extremely costly in terms of time and money. In this paper, we address the critical issue that has thus far remained open: the development of methods for physical verification of OPC designs. Our approach uses fast lithography simulation to map the modified mask design to the final patterns produced on the wafer. The simulated wafer pattern is matched against the specified tolerances and the problem areas are reported. It is a hierarchical verification tool. The hierarchical processing of the data makes it a high-performance tool and keeps the data volume in check. We validate this technology by comparing the simulation results with the experimental data. In addition, performance measurements indicate that it is an effective and practical solution to the problem of verifying correctness of full-chip OPC designs.

  9. Application of process monitoring to verify facility design

    SciTech Connect

    Hakkila, E.A.

    1989-01-01

    Process monitoring has been proposed as a safeguards measure to ensure that a facility is operating as designed, or as a surveillance measure to ensure that material is not removed from the facility in an undeclared manner. In a process-monitoring system, the facility operator monitors process operations such as tank levels, densities, and temperatures; process flows; and physical parameters such as valve positions to ensure that the operations performed are both desired and required. At many facilities (for example, Idaho), the process-monitoring system is also an important safety feature to prevent criticality. Verifying facility design is necessary for application of safeguards in a reprocessing plant. Verifying all pipes and valves through comparison of blueprints with the as-built facility is an almost impossible task with the International Atomic Energy Agency's limited inspection resources. We propose applying process monitoring for international safeguards facility design verification. By carefully selecting process-operating variables, it may be possible to verify that plant flows are as described and that key measurement points are not bypassed. 8 refs.

  10. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    SciTech Connect

    Hofmayer, C.; Miller, C.; Wang, Y.; Costello, J.

    2003-07-15

    A research effort was undertaken to determine the need for any changes to USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement based seismic design methods, such as given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results between the analyses, and to develop insights regarding the work that would be needed before the displacement based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.

  11. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps (a) identify architecturally significant deviations that eluded code reviews, (b) clarify the design rules to the team, and (c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  12. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.

  13. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to the existing programming language, HAL/S is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. A diversity of features for scientific computing, concurrent and real-time programming, and error handling are discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S with respect to these criteria are examined and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  14. Efficient Extension of Standard Schnorr/RSA Signatures into Universal Designated-Verifier Signatures

    Microsoft Academic Search

    Ron Steinfeld; Huaxiong Wang; Josef Pieprzyk

    2004-01-01

    Universal Designated-Verifier Signature (UDVS) schemes are digital signature schemes with additional functionality which allows any holder of a signature to designate the signature to any desired designated-verifier such that the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. Since UDVS schemes reduce to standard signatures when

  15. Seismic design criteria for LNG tanks

    SciTech Connect

    Devanna, L.R.; Blackman, J.

    1980-01-01

    According to this review of the various factors considered in revising the seismic design criteria in the National Fire Protection Association's LNG standards (NFPA 59A), the new criteria reflect the state-of-the-art and represent a rational, straightforward approach that will provide the same level of safety for LNG tanks across the country without restricting the exercise of good judgment and the incorporation of new developments. When properly applied, the approach evaluates all factors that could be relevant to seismic design, then specifies the safe-shut-down and operating-basis earthquake levels (SSE and OBE); the use of two sets of specifications avoids the need for an LNG-tank safety system designed for extremely high earthquake levels while assuring a tank's structural integrity during the worst earthquake that could reasonably be expected to occur over the facility's operating life.

  16. Seismic Design and Evaluation of Concrete Dams - An Engineering Manual

    Microsoft Academic Search

    Ghanaat Yusof; Anjana K. Chudgar

    This paper provides an overview of the US Army Corps of Engineers' guidance for seismic design and evaluation of concrete dams, as presented in the Engineer Manual EM 1110-2-6053. The requirements to design and evaluate concrete dams to have a predictable performance for specified levels of seismic hazard are discussed. The seismic input and performance levels associated with serviceability, damage

  17. Motion based seismic design and loss estimation of diagrid structures

    E-print Network

    Liptack, Robert J. (Robert Jeffrey)

    2013-01-01

    Diagrids are becoming an increasingly popular structural system in high rise design and construction. Little research has been performed on the seismic performance of Diagrids and how it integrates with seismic loss ...

  18. Verifying the design of a Cobol system using Cognac (Andy Kellens, Kris De Schutter, and Theo D'Hondt)

    E-print Network

    De Schutter, Kris

    Verifying the design of a Cobol system using Cognac. Andy Kellens, Kris De Schutter, and Theo D'Hondt. However, it seems that the vast majority of these tools neglects the Cobol language, which is still one of the main languages in use today ... a general framework for documenting design rules in Cobol code and verifying their validity with respect

  19. Cognac: a framework for documenting and verifying the design of Cobol systems (Andy Kellens)

    E-print Network

    De Schutter, Kris

    Cognac: a framework for documenting and verifying the design of Cobol systems. Andy Kellens (Vrije Universiteit Brussel). ... one of the main languages still in use today in industry, namely Cobol. In this paper we present Cognac, an extension of the IntensiVE tool that allows for documenting and verifying design rules in Cobol systems

  20. A New Design of Seismic Stations Deployed in South Tyrol

    Microsoft Academic Search

    P. Melichar; N. Horn

    2007-01-01

    When designing the seismic network in South Tyrol, the seismic service of Austria and the Civil Defense in South Tyrol combined more than 10 years of experience in running seismic networks and private communication systems. In recent years a high data return rate of > 99% and network uptime of > 99% have been achieved by the combination of high quality station

  1. THE VALIDATION SQUARE: HOW DOES ONE VERIFY AND VALIDATE A DESIGN METHOD?

    E-print Network

    Seepersad, Carolyn Conner

    THE VALIDATION SQUARE: HOW DOES ONE VERIFY AND VALIDATE A DESIGN METHOD? Carolyn C. Seepersad, Georgia Institute of Technology, Atlanta, Georgia 30332-0405. Abstract: Validation of engineering research has traditionally been anchored in the tradition of scientific inquiry. This demands formal, rigorous and quantitative validation

  2. A Tripartite Strong Designated Verifier Scheme Based On Threshold RSA Signatures

    E-print Network

    Markowitch, Olivier

    A Tripartite Strong Designated Verifier Scheme Based On Threshold RSA Signatures. Jérôme Dossogne. ... by the size of an RSA modulus. Moreover, in addition to the computation of a classical RSA signature, ... verification remains the same as for a classical RSA signature verification. Keywords: Digital Signature

  3. New Extensions of Pairing-based Signatures into Universal (Multi) Designated Verifier Signatures

    E-print Network

    Vergnaud, Damien

    2008-01-01

    The concept of universal designated verifier signatures was introduced by Steinfeld, Bull, Wang and Pieprzyk at Asiacrypt 2003. These signatures can be used as standard publicly verifiable digital signatures but have an additional functionality which allows any holder of a signature to designate the signature to any desired verifier. This designated verifier can check that the message was indeed signed, but is unable to convince anyone else of this fact. We propose new efficient constructions for pairing-based short signatures. Our first scheme is based on Boneh-Boyen signatures and its security can be analyzed in the standard security model. We prove its resistance to forgery assuming the hardness of the so-called strong Diffie-Hellman problem, under the knowledge-of-exponent assumption. The second scheme is compatible with the Boneh-Lynn-Shacham signatures and is proven unforgeable, in the random oracle model, under the assumption that the computational bilinear Diffie-Hellman problem is intractable. Both s...

  4. Verifying IP-Core based System-On-Chip Designs Pankaj Chauhan, Edmund M. Clarke, Yuan Lu and Dong Wang

    E-print Network

    Clarke, Edmund M.

    Verifying IP-Core based System-On-Chip Designs. Pankaj Chauhan, Edmund M. Clarke, Yuan Lu and Dong Wang. ... once and for all, the standard bus interconnecting IP Cores in the system. The next task is to verify the glue logic, which connects the IP Cores to the buses. Finally, using the verified bus protocols and the IP

  5. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    Microsoft Academic Search

    Thomas M. Brouns; Alan C. Rohay; Steve Reidel; Martin G. Gardner

    2007-01-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase of up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The

  6. Seismic design strategy for surface facilities at the prospective Yucca Mountain Nuclear Waste Repository

    Microsoft Academic Search

    C. L. Wu; C. V. Subramanian

    1986-01-01

    The seismic environment for the surface facilities is one of the most important design considerations of the prospective nuclear waste repository at the Yucca Mountain site. As the repository design progresses through the site characterization and advanced conceptual design, the seismic design strategy will be modified to resolve seismic design issues and to enhance confidence in the design. The strategy

  7. Seismic Assessment and Earthquake Resistant Design Considerations

    Microsoft Academic Search

    Mustafa Erdik

    The probability of a major seismic event occurring during the lifetime of the Marmaray Project's immersed tunnel is high, so an alternative to the conventional two-level earthquake design approach was needed. The authors describe the methods used to develop a design basis earthquake, which is a set of assumptions regarding the time-dependent stress distribution within the fault segments that

  8. Modern seismic design of reinforced concrete (RC) structures

    E-print Network

    Université Paris-Sud XI

    Modern seismic design of reinforced concrete (RC) structures is generally based ... on the formation of plastic hinges for earthquake energy dissipation. However, existing under-designed buildings or bridges may not be resistant enough and may collapse if an earthquake occurs. Considering that the major

  9. "Seismic Behavior and Design of Steel Shear Walls", A. Astaneh-Asl, SEAONC Seminar, November 2001, San Francisco

    E-print Network

    Astaneh-Asl, Abolhassan

    "Seismic Behavior and Design of Steel Shear Walls", by Abolhassan Astaneh-Asl, Ph.D. (ce.berkeley.edu/~astaneh), SEAONC Seminar, November 2001, San Francisco. Introduction: Steel plate shear wall systems have been used in recent years in highly seismic areas to resist

  10. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    The hazard and risk perspective in research on induced seismicity usually focuses on the question of how to reduce the occurrence of induced earthquakes. However, it is also well accepted that shear-dilatancy accompanied by seismic energy radiation is a required process for reservoir creation in low permeability rock. Assessment of induced seismic hazard for a planned stimulation experiment must take into account the target reservoir properties. We present a generic modelling study, in which induced seismic hazard can be analysed in balance with the permeability enhancement and the size of the stimulated reservoir. The model has two coupled components: 1) a flow model that solves the pressure diffusion equations, and 2) a stochastic seismicity model, which uses the transient pressure disturbances to trigger seismic events at so-called seed points. At triggering, a magnitude is randomly drawn from a Gutenberg-Richter distribution with a local b-value that depends on the stress state at the seed point. In the source area of the events the permeability is increased depending on the amount of slip, but only by a maximum factor of 200. Due to the stochastic nature of the modelling approach, a representative number of 500 model realizations are computed. The results demonstrate that planning and control of reservoir engineering operations may be compromised by the considerable variability of maximum observed magnitude, reservoir size, b-value and seismogenic index arising from the intrinsic virtually random nature of induced seismicity. We also find that injection volume has the highest impact on both reservoir size and seismic hazard, while changing injection rate and strategy at constant final injection volume has a negligible effect. However, the impact of site-specific parameters on seismicity and reservoir properties is greater than the volume effect. In particular, conditions that lead to high b-values - for instance a low differential stress level - have a high positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded in case of high b-value conditions. Nevertheless, target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level, if either initial permeability is already high or if several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis into the design of reservoir stimulation.
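
    As a rough illustration of the stochastic seismicity component described above (not the authors' code), the sketch below triggers seed points from an assumed pressure field and draws each magnitude from a doubly truncated Gutenberg-Richter law with a local b-value; every name, range, and threshold is an illustrative assumption.

    ```python
    # Sketch of the seed-point triggering and Gutenberg-Richter sampling step.
    import numpy as np

    rng = np.random.default_rng(0)

    def gutenberg_richter_magnitude(b_value, m_min=0.0, m_max=4.0):
        """Draw one magnitude from a doubly truncated Gutenberg-Richter law."""
        u = rng.uniform()
        beta = b_value * np.log(10.0)
        c = 1.0 - np.exp(-beta * (m_max - m_min))
        # Inverse-CDF sampling of the truncated exponential in magnitude.
        return m_min - np.log(1.0 - u * c) / beta

    def run_realization(n_seeds=1000):
        pressure = rng.uniform(0.0, 10.0, n_seeds)   # MPa, stand-in for the flow model
        strength = rng.uniform(2.0, 12.0, n_seeds)   # MPa, random seed-point strength
        b_local  = rng.uniform(0.8, 1.6, n_seeds)    # stress-dependent b-value (assumed range)
        triggered = pressure > strength
        return np.array([gutenberg_richter_magnitude(b) for b in b_local[triggered]])

    # Many stochastic realizations, as in the study (which used 500).
    max_mags = [run_realization().max(initial=-np.inf) for _ in range(500)]
    print("median maximum magnitude over realizations:",
          np.median([m for m in max_mags if np.isfinite(m)]))
    ```

    Repeating such realizations many times is what produces the spread in maximum observed magnitude and reservoir size that the abstract describes.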

  11. Seismic-reflection technique used to verify shallow rebound fracture zones in the Pierre Shale of South Dakota ( USA).

    USGS Publications Warehouse

    Nichols, T.C., Jr.; King, K.W.; Collins, D.S.; Williams, R.A.

    1988-01-01

    Shallow seismic-reflection data are presented to demonstrate their usefulness for locating and showing the continuity and lateral extent of rebound fracture zones in the Pierre Shale. Rebound fracture zones, identified in boreholes near Hayes, South Dakota, have variable depth, thickness, and character, thus making questionable the correlation of these zones between holes. Thus, the subsequent determination of dip and of continuity of the zones is somewhat tenuous, especially if the fracture characteristics change significantly between holes. Once rebound fracture zones have been identified and located by borehole geotechnical and geologic data, seismic profiles can reveal the extent and geometry of fractures in these zones, thus providing valuable preconstruction information without the cost of additional drilling.-Authors

  12. SEISMIC DESIGN CONSIDERATIONS FOR PRECAST CONCRETE SHEAR WALL CONNECTIONS

    E-print Network

    SEISMIC DESIGN CONSIDERATIONS FOR PRECAST CONCRETE SHEAR WALL CONNECTIONS. K.A. Soudki, J.S. West. ... steel reinforcement, post-tensioning, shear keys and debonding of continuity reinforcement ... The precast concrete shear wall panel system is popular in North America for low, medium and high rise residential

  13. THE RELIABILITY OF CAPACITY-DESIGNED COMPONENTS IN SEISMIC RESISTANT SYSTEMS

    E-print Network

    Baker, Jack W.

    THE RELIABILITY OF CAPACITY-DESIGNED COMPONENTS IN SEISMIC RESISTANT SYSTEMS. A dissertation ... design codes to help ensure ductile response and energy dissipation in seismic resisting systems ... to contribute to the understanding of the reliability of capacity-designed components in seismic resistant

  14. Review of seismicity and ground motion studies related to development of seismic design at SRS

    SciTech Connect

    Stephenson, D.E. [Westinghouse Savannah River Co., Aiken, SC (United States)]; Acree, J.R. [Westinghouse Environmental and Geotechnical Services, Inc., Columbia, SC (United States)]

    1992-08-01

    The NRC response spectrum developed in Reg. Guide 1.60 is being used in the studies related to restarting the existing Savannah River Site (SRS) reactors. Because it envelops all the other site-specific spectra which have been developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this value or with these probability levels. This spectral shape is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located south of the Savannah River from the SRS. This report provides a summary of the database used to develop the design basis earthquake. This includes the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided for the studies performed and methodologies used to establish the design basis earthquake for SRS. The ground motion response spectra developed from the various studies are also summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. The programs for resolving outstanding issues are discussed and conclusions are presented.

  15. High-performance braces for seismic design

    E-print Network

    Lim, Tim S

    2013-01-01

    The fundamental challenge for the structural engineer in designing earthquake-resistant structures is to design buildings with both adequate ductility and sufficient stiffness. Traditional lateral force resisting systems ...

  16. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    SciTech Connect

    Reddy, D.P. (ed)

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  17. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    NASA Astrophysics Data System (ADS)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  18. Malargüe seismic array: Design and deployment of the temporary array

    NASA Astrophysics Data System (ADS)

    Ruigrok, E.; Draganov, D.; Gómez, M.; Ruzzante, J.; Torres, D.; Lópes Pumarega, I.; Barbero, N.; Ramires, A.; Castaño Gañan, A. R.; van Wijk, K.; Wapenaar, K.

    2012-10-01

    We present the goals and the current status of the Malargüe seismic array. Our main goal is imaging and monitoring the subsurface below the Malargüe region, Mendoza, Argentina. More specifically, we focus on the Planchón-Peteroa Volcano and an area just east of the town of Malargüe. We start our project installing a temporary array of 38 seismic stations, which records continuously for one year. The array consists of two subarrays: one array located on the flanks of the volcano; the other spread out on a plateau just east of the Andes. The imaging targets, like the Moho and the Nazca slab, are relatively deep. Yet, the array has a dense station spacing, allowing exploration-type processing. For high-resolution imaging, also a dense source spacing is required. This we aim to achieve by creating virtual sources at the receiver positions, with a technique called seismic interferometry (SI). The array is designed such that a recent improvement of SI can be applied to the recordings. Other goals are to collect high-quality core-phase measurements and to characterize sources of microseism noise in the Southern Hemisphere. Furthermore, we plan to collaborate with researchers from the Pierre Auger Collaboration to study coupling of seismic, atmospheric, and cosmic signals using data from our instruments and from the Pierre Auger detectors.

  19. Actuator saturation and control design for buildings under seismic excitation

    Microsoft Academic Search

    Jin-Hoon Kim; Faryar Jabbari; J. N. Yang

    2000-01-01

    Efficient design techniques are presented for improving the response of seismic-excited buildings under actuators with limited capacity. Information regarding the actuator capacity and estimated peak ground acceleration is used to fine-tune the controller and reduce conservatism. Both state feedback and observer-based controllers are discussed. The observer-based controller is based on measurement of the ground acceleration, though this requirement can

  20. A New Design of Seismic Stations Deployed in South Tyrol

    NASA Astrophysics Data System (ADS)

    Melichar, P.; Horn, N.

    2007-05-01

    When designing the seismic network in South Tyrol, the seismic service of Austria and the Civil Defense in South Tyrol combined more than 10 years of experience in running seismic networks and private communication systems. In recent years a high data return rate of > 99% and network uptime of > 99% have been achieved by the combination of high quality station design and equipment, and the use of the Antelope data acquisition and processing software, which comes with a suite of network monitoring & alerting tools including Nagios, etc. The new Data Center is located in the city of Bolzano and is connected to the other Data Centers in Austria, Switzerland, and Italy for data back-up purposes. Each Data Center also uses a redundant communication system if the primary system fails. When designing the South Tyrol network, new improvements were made in seismometer installation, grounding, lightning protection and data communications in order to improve the quality of the recorded data as well as network uptime and data return. The 12 new stations are each equipped with a 6-channel Q330+PB14f connected to STS2 + EpiSensor sensors. One of the key achievements was made in the grounding concept for the whole seismic station: aluminum boxes were introduced, which provided Faraday-cage isolation. Lightning protection devices are used for the equipment inside the aluminum housing where the seismometer and data logger are housed. For the seismometer cables a special shielding was introduced. The broadband seismometer and strong-motion sensor are placed on a thick glass plate and are therefore isolated from the ground. The precise seismometer orientation was achieved by a special groove on the glass plate, and in case of a strong earthquake the seismometer is tied to the base plate. Temperature stability was achieved by styrofoam sheets inside the seismometer's aluminum protection box.

  1. Fast Bayesian optimal experimental design for seismic source inversion

    NASA Astrophysics Data System (ADS)

    Long, Quan; Motamed, Mohammad; Tempone, Raúl

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.
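
    The two key quantities named above can be sketched compactly in LaTeX; the exact notation and any higher-order correction terms in the paper may differ. Here ξ stands for the design (number and locations of receivers), π for the prior, and Σ(θ, ξ) for the inverse Hessian of the negative log-posterior at its mode:

    ```latex
    % Expected information gain (expected KL divergence from prior to posterior)
    \[
      I(\xi) \;=\; \mathbb{E}_{y}\!\left[
          \int p(\theta \mid y, \xi)\,
          \ln \frac{p(\theta \mid y, \xi)}{\pi(\theta)} \, d\theta \right]
    \]
    % Laplace approximation: replace the posterior by a Gaussian N(\hat\theta, \Sigma),
    % which to leading order gives
    \[
      I(\xi) \;\approx\; \mathbb{E}_{\theta \sim \pi}\!\left[
          -\tfrac{1}{2} \ln \det\!\bigl(2\pi e\, \Sigma(\theta, \xi)\bigr)
          \;-\; \ln \pi(\theta) \right]
    \]
    ```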

  2. Study of seismic design bases and site conditions for nuclear power plants

    SciTech Connect

    Not Available

    1980-04-01

    This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

  3. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    SciTech Connect

    Not Available

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. Effects of current regulatory requirements for piping design are evaluated, and recommendations for immediate licensing action, changes in existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  4. Design and application of an electromagnetic vibrator seismic source

    USGS Publications Warehouse

    Haines, S.S.

    2006-01-01

    Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. In order to address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to about 10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well-suited to teaching applications, and to surveys requiring a precisely-repeatable source signature.
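
    A minimal sketch of the kind of frequency-sweep drive signal mentioned above (5 to 650 Hz); the sample rate, sweep length, and amplitude envelope are assumptions for illustration, not specifications of the EMvibe:

    ```python
    # Linear frequency sweep (chirp) covering the reported 5-650 Hz band.
    import numpy as np

    fs = 4000.0            # samples per second (assumed)
    T  = 8.0               # sweep length in seconds (assumed)
    f0, f1 = 5.0, 650.0    # band reported in the abstract

    t = np.arange(0.0, T, 1.0 / fs)
    phase = 2.0 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2.0 * T))  # linear chirp phase
    taper = np.hanning(len(t))                                      # Hann amplitude envelope
    sweep = taper * np.sin(phase)
    print(sweep.shape, float(sweep.min()), float(sweep.max()))
    ```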

  5. Design Of Bridges For Non Synchronous Seismic Motion

    SciTech Connect

    Nuti, Camillo [Dipartimento di Strutture, Dis, Università di Roma 3, Via Segre 4-6, 00146, Roma (Italy)]; Vanzi, Ivo [Dipartimento di Progettazione, Pricos, Università di Chieti, Viale Pindaro 42, 65127, Pescara (Italy)]

    2008-07-08

    This paper aims to develop and validate structural design criteria which account for the effects of the spatial variability of earthquake motion. In past works [1, 2] the two simplest forms of this problem were dealt with: differential displacements between two points belonging to the soil or to two single-degree-of-freedom structures. The seismic action was defined according to EC8 [3]; the structures were assumed to be linear elastic SDOF oscillators. Although this problem may seem trivial, existing code models appeared improvable in this respect. For the differential displacements of two points on the ground, these results are now validated and generalized using the newly developed response spectra contained in the new Italian seismic code [4]; the resulting code formulation is presented. Next, the problem of statistically defining the differential displacement among any number of points on the ground (which is needed for continuous-deck bridges) is approached, and some preliminary results are shown. It is also shown that the rules of current codes (e.g. EC8) may be improved on this aspect.
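
    As a rough illustration of the quantity being combined (not necessarily the formulation used in the paper or in EC8), the mean-square differential displacement of two points with peak displacements d_A and d_B and correlation coefficient ρ_AB can be sketched as:

    ```latex
    \[
      \mathbb{E}\!\left[\Delta_{AB}^{2}\right]
      \;\approx\; d_A^{2} + d_B^{2} - 2\,\rho_{AB}\, d_A\, d_B
    \]
    ```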

  6. Why perform time-lapse seismic monitoring? Is it to verify the reservoir model? No! We should conduct time-lapse

    E-print Network

    Why perform time-lapse seismic monitoring? Is it to verify the reservoir model? No! We should conduct time-lapse seismic surveys in order to find out what is incorrect in the reservoir model, in a way ... production (phase I and phase II, each using four-component ocean-bottom cables). The project, initiated

  7. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER

    Microsoft Academic Search

    Omberg, S.K.

    2008-01-01

    This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria from DOE-STD-1189-2008, Appendix A. The existing seismic design criteria and the new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work

  8. Performance-based design of reinforced concrete buildings subjected to seismic forces

    E-print Network

    Kalghatgi, Nikhil S.

    1998-01-01

    An approach for evaluating reinforced concrete structural frame systems subjected to seismic forces under the framework of performance-based design methodology was developed. The method integrates the design criteria according...

  9. Estimation of Characteristic Period for Energy Based Seismic Design

    SciTech Connect

    Hancloglu, Baykal; Polat, Zekeriya; Kircil, Murat Serdar [Yildiz Technical University, Department of Civil Engineering, Besiktas, 34349 Istanbul (Turkey)]

    2008-07-08

    Estimation of input energy using approximate methods has always been a considerable research topic in energy-based seismic design, and several approaches have been proposed by many researchers to estimate the energy input to SDOF systems in the last decades. The characteristic period is the key parameter of most of these approaches, and it is defined as the period at which the peak value of the input energy occurs. In this study an equation is proposed for estimating the characteristic period, considering an extensive earthquake ground motion database which includes a total of 268 far-field records, two horizontal components from 134 recording stations located on both soft and firm soil sites. For this purpose statistical regression analyses are performed to develop an equation in terms of a number of structural parameters, and it is found that the developed equation yields satisfactory results compared with the characteristic periods calculated from time-history analyses of SDOF systems.
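
    The definition quoted above (the characteristic period is the period at which the input energy to a linear SDOF system peaks) can be sketched as follows; the ground motion is a synthetic stand-in, and the paper's regression equation is not reproduced:

    ```python
    # Relative input energy spectrum of a linear SDOF and its peak period.
    import numpy as np

    def sdof_input_energy(accel_g, dt, period, damping=0.05):
        """E_I = -integral(a_g * v dt) per unit mass, average-acceleration Newmark scheme."""
        w = 2.0 * np.pi / period
        u = v = 0.0
        a = -accel_g[0]
        energy = 0.0
        for ag_next in accel_g[1:]:
            p_next = -ag_next
            k_eff = w**2 + 4.0 / dt**2 + 2.0 * (2.0 * damping * w) / dt
            rhs = p_next + (4.0 / dt**2) * u + (4.0 / dt) * v + a \
                  + (2.0 * damping * w) * ((2.0 / dt) * u + v)
            u_next = rhs / k_eff
            v_next = (2.0 / dt) * (u_next - u) - v
            a_next = p_next - 2.0 * damping * w * v_next - w**2 * u_next
            energy += -ag_next * v_next * dt
            u, v, a = u_next, v_next, a_next
        return energy

    rng = np.random.default_rng(1)
    dt = 0.01
    accel = rng.standard_normal(2000)          # synthetic "record", illustration only

    periods = np.arange(0.1, 3.0, 0.05)
    energies = [sdof_input_energy(accel, dt, T) for T in periods]
    print("characteristic period (s):", periods[int(np.argmax(energies))])
    ```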

  10. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants, provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  11. Considerations for developing seismic design criteria for nuclear waste storage repositories

    NASA Astrophysics Data System (ADS)

    Owen, G. N.; Yanev, P. I.; Scholl, R. E.

    1980-04-01

    The development of seismic design criteria for the underground structures of repositories is addressed. An initial step in the development of seismic design criteria for the underground structures of repositories is the development of performance criteria, or minimum standards of acceptable behavior. A number of possible damage modes are identified for the operating phase of the repository; however, no damage modes are foreseen that would perturb the long-term function of the repository, except for the possibility of increased permeability within the rock mass. Subsequent steps in formulating acceptable seismic design criteria for the underground structures involve the quantification of the design process. The necessity of specifying the form of ground motion that would be needed for seismic analysis and the procedures that may be used for making ground motion predictions are discussed. Further, analytic concerns including rock properties, failure criteria, modeling techniques, seismic hardening criteria for the host rock mass, and probabilistic considerations are examined.

  12. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  13. Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads

    SciTech Connect

    Klamerus, E.W.; Bohn, M.P. [Sandia National Labs., Albuquerque, NM (United States)]; Johnson, J.J.; Asfura, A.P.; Doyle, D.J. [EQE Engineering, Inc., San Francisco, CA (United States)]

    1994-02-01

    Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

  14. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. From the early nineties the seismic engineering community has begun to recognise numerous fundamental shortcomings with such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings. The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value, (ii) structural analysis to estimate the global structural response, given a certain value of seismic intensity, (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response, (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would have to go through a massive learning curve in order to apply the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from what are largely deterministic seismic design procedures in current codes, to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
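
    The four analysis stages listed in the abstract are conventionally tied together by a PEER-style framing integral; a compact LaTeX sketch (the lecture itself may present it differently) is:

    ```latex
    % Mean annual frequency of exceeding a decision-variable level dv,
    % assembled from the four stages (hazard -> response -> damage -> loss):
    \[
      \lambda(DV > dv) \;=\;
      \iiint G\bigl(dv \mid dm\bigr)\,
             \bigl|\,dG\bigl(dm \mid edp\bigr)\bigr|\,
             \bigl|\,dG\bigl(edp \mid im\bigr)\bigr|\,
             \bigl|\,d\lambda(im)\bigr|
    \]
    % IM: ground-motion intensity measure, EDP: engineering demand parameter,
    % DM: damage measure, DV: decision variable (e.g. repair cost).
    ```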

  15. Investigation of Optimal Seismic Design Methodology for Piping Systems Supported by Elasto-plastic Dampers

    NASA Astrophysics Data System (ADS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    In this study, the applicability of a previously developed optimal seismic design methodology, which can consider the structural integrity of not only piping systems but also elasto-plastic supporting devices, is studied for seismic waves with various frequency characteristics. This methodology employs a genetic algorithm and can search the optimal conditions such as the supporting location and the capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Numerical simulations are performed using a simple piping system model. As a result, it is shown that the proposed optimal seismic design methodology is applicable to the seismic design of piping systems subjected to seismic waves with various frequency characteristics. The mechanism of optimization is also clarified.
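
    A minimal sketch (not the paper's implementation) of a genetic-algorithm search over support location and damper capacity and stiffness, with the piping time-history analysis replaced by a toy objective; every name and parameter range below is an illustrative assumption:

    ```python
    # Toy GA over (support location index, damper capacity, damper stiffness).
    import numpy as np

    rng = np.random.default_rng(2)
    N_LOCATIONS = 10                    # candidate support points along the pipe

    def toy_response(gene):
        """Stand-in for a piping time-history analysis: lower is better."""
        loc, capacity, stiffness = gene
        return abs(loc - 6) + (capacity - 3.0) ** 2 + (stiffness - 50.0) ** 2 / 100.0

    def random_gene():
        return (int(rng.integers(N_LOCATIONS)), rng.uniform(0.5, 6.0), rng.uniform(10.0, 100.0))

    def mutate(gene):
        loc, cap, k = gene
        if rng.random() < 0.3:
            loc = int(rng.integers(N_LOCATIONS))
        return (loc, cap + rng.normal(0, 0.3), k + rng.normal(0, 5.0))

    def crossover(a, b):
        return tuple(a[i] if rng.random() < 0.5 else b[i] for i in range(3))

    population = [random_gene() for _ in range(40)]
    for _ in range(60):
        population.sort(key=toy_response)
        parents = population[:10]                  # truncation selection
        children = []
        while len(children) < 30:
            a = parents[rng.integers(len(parents))]
            b = parents[rng.integers(len(parents))]
            children.append(mutate(crossover(a, b)))
        population = parents + children

    print("best (location index, capacity, stiffness):", min(population, key=toy_response))
    ```

    In the methodology described above, the fitness evaluation would instead be a nonlinear time-history analysis of the piping model with the candidate elasto-plastic dampers attached.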

  16. Seismic analysis and base isolation retrofit design of a steel truss vertical lift bridge

    Microsoft Academic Search

    Itunumi Savage; John C. Eddy; Gregory I. Orsolini

    1999-01-01

    Bridges with steel superstructures are frequently ideal candidates for seismic retrofit utilizing base isolation. The seismic assessment and retrofit design of the Three Mile Slough Bridge included both the evaluation of a conventional retrofit scenario and a base isolation retrofit scenario. The structure is a five span riveted steel truss bridge with a vertical center lift span attached to the

  17. Life-cycle optimization in the establishment of performance-acceptance parameters for seismic design

    Microsoft Academic Search

    L. Esteva; O. Díaz-López; J. García-Pérez; G. Sierra; E. Ismael

    2002-01-01

    A life-cycle formulation is presented for the determination of optimum values of the mechanical properties of a structural system exposed to seismic risk. The resulting values are intended for providing support for the establishment of performance-acceptance criteria and parameters for seismic design. A method is developed for the determination of expected damage functions in terms of simplified reference models of

  18. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C. [Pacific Northwest National Laboratory, Richland, WA (United States)]; Youngs, R.R. [Geomatrix Consultants, Inc., Oakland, CA (United States)]; Costantino, C.J. [C.J. Costantino and Associates, Valley, NY (United States)]; Miller, L.F. [U.S. Department of Energy, Office of River Protection, Richland, WA (United States)]

    2008-07-01

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria. The characterization and analysis effort included 1) downhole measurements of the velocity properties (including uncertainties) of the basalt/interbed sequences, 2) confirmation of the geometry of the contact between the various basalt and interbedded sediments through examination of retrieved core from the core-hole and data collected through geophysical logging of each borehole, and 3) prediction of ground motion response to an earthquake using newly acquired and historic data. The data and analyses reflect a significant reduction in the uncertainty in shear wave velocities below the WTP and result in a significantly lower spectral acceleration (i.e., ground motion). The updated ground motion response analyses and corresponding design response spectra reflect a 25% lower peak horizontal acceleration than reflected in the 2005 design criteria. These results provide confidence that the WTP seismic design criteria are conservative. (authors)

  19. Experimentally verified design guidelines for minimizing the gray zone width of Josephson comparators

    NASA Astrophysics Data System (ADS)

    Ebert, Bjoern; Mielke, Olaf; Kunert, Juergen; Stolz, Ronny; Ortlepp, Thomas

    2010-05-01

    We investigated the gray zone width of Josephson comparators by means of circuit simulations and experiments, looking at the dependences on different circuit parameters and topologies. Eight different comparator circuits were simulated and designed for a 1 kA cm⁻² niobium device. With our sophisticated measurement set-up, the lowest reported gray zone width of 3.2 µA at 4.2 K could be measured. Moreover, the results obtained allow us to derive a set of design rules for further reduction of the gray zone width, which was the original goal of our investigations.
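
    As an illustration of how a gray zone width can be extracted from measured switching probabilities, the sketch below fits an error-function transition to synthetic counts; the model, the 10%-90% width convention and every number are assumptions for illustration, not values from the paper.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import erf
      from scipy.stats import norm

      def switching_probability(i_in, i0, sigma):
          # Error-function (Gaussian) model of the comparator decision transition.
          return 0.5 * (1.0 + erf((i_in - i0) / (np.sqrt(2.0) * sigma)))

      # Synthetic "measured" switching probabilities (illustrative numbers only).
      rng = np.random.default_rng(1)
      i_in = np.linspace(-10e-6, 10e-6, 41)                 # input current sweep [A]
      p_true = switching_probability(i_in, 0.5e-6, 1.5e-6)  # assumed true transition
      trials = 2000                                         # decisions counted per bias point
      p_meas = rng.binomial(trials, p_true) / trials

      (i0_fit, sigma_fit), _ = curve_fit(switching_probability, i_in, p_meas, p0=(0.0, 1e-6))

      # One common convention: gray zone width = current span between 10% and 90% switching.
      width_10_90 = sigma_fit * (norm.ppf(0.9) - norm.ppf(0.1))
      print(f"fitted sigma      = {sigma_fit * 1e6:.2f} uA")
      print(f"10%-90% gray zone = {width_10_90 * 1e6:.2f} uA")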

  20. Use of process monitoring for verifying facility design of large-scale reprocessing plants

    SciTech Connect

    Hakkila, E.A.; Zack, N.R. (Los Alamos National Lab., NM (USA)); Ehinger, M.H. (Oak Ridge National Lab., TN (USA)); Franssen, F. (International Atomic Energy Agency, Vienna (Austria))

    1991-01-01

    During the decade of the 1990s, the International Atomic Energy Agency (IAEA) faces the challenge of implementing safeguards in large, new reprocessing facilities. The Agency will be involved in the design, construction, checkout and initial operation of these new facilities to ensure effective safeguards are implemented. One aspect of the Agency involvement is in the area of design verification. The United States Support Program has initiated a task to develop methods for applying process data collection and validation during the cold commissioning phase of plant construction. This paper summarizes the results of this task. 14 refs., 1 tab.

  1. Seismic design and analysis considerations for high level nuclear waste repositories

    SciTech Connect

    Hossain, Q.A.

    1993-09-30

    A high level nuclear waste repository, like the one at Nevada's Yucca Mountain that is being investigated for site suitability, will have some unique seismic design and analysis considerations. These are discussed, and a design philosophy that can rationally account for the unique performance objectives of such facilities is presented. A case is made for the use of DOE's performance goal-based seismic design and evaluation methodology that is based on a hybrid 'deterministic' and 'probabilistic' concept. How and to what extent this methodology should be modified to adapt it to a potential site like Yucca Mountain is also outlined. Finally, the issue of designing for seismic fault rupture is discussed briefly, and the desirability of using the proposed seismic design philosophy in fault rupture evaluation is described.

  2. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    SciTech Connect

    Yoshida, Nozomu [Department of Civil and Environmental Engineering, Tohoku Gakuin University, Tagajo 1-13-1, Miyagi (Japan)

    2008-07-08

    The engineering intuition that the incident wave at the engineering seismic base layer is common over a widespread area is shown to be incorrect. An illustrative example is first presented, showing that the ground-surface motion evaluated by analyzing the ground from the seismic bedrock to the surface as a single system (continuous analysis) differs from that obtained when the profile is split at the engineering seismic base layer and the two parts are analyzed separately (separate analysis). The reason is investigated by several approaches. An eigenvalue analysis indicates that the first predominant period of the continuous analysis does not appear in the separate analysis, and that the higher-order predominant periods of the upper and lower portions do not match in the separate analysis. Earthquake response analysis shows that the wave reflected at the engineering seismic base layer is not zero, which means the conventional engineering seismic base layer does not behave as a true 'base'. All these results indicate that waves which travel down to depth after reflecting within the surface layers, and are reflected again at the seismic bedrock, cannot be neglected when evaluating the response at the ground surface. In other words, the interaction between the surface layers and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.
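
    A minimal way to see the eigenvalue argument is to model the soil column as a lumped-mass shear chain and compare the fundamental period of the full bedrock-to-surface profile with that of the upper portion analyzed alone on a fixed 'engineering base layer'; the layer properties below are invented for illustration only.

      import numpy as np
      from scipy.linalg import eigh

      def fundamental_period(h, vs, rho):
          """First natural period of a fixed-base lumped-mass shear column.
          h, vs, rho: per-layer thickness [m], shear-wave velocity [m/s], density [t/m3]."""
          n = len(h)
          k = rho * vs**2 / h                  # layer shear stiffness per unit plan area
          m = np.zeros(n)                      # masses lumped at the surface and interfaces
          m[0] = rho[0] * h[0] / 2.0
          for i in range(1, n):
              m[i] = (rho[i - 1] * h[i - 1] + rho[i] * h[i]) / 2.0
          K = np.zeros((n, n))
          for i in range(n):                   # spring of layer i links node i to the node below
              K[i, i] += k[i]
              if i + 1 < n:
                  K[i + 1, i + 1] += k[i]
                  K[i, i + 1] -= k[i]
                  K[i + 1, i] -= k[i]
          w2, _ = eigh(K, np.diag(m))          # generalized eigenvalue problem
          return 2.0 * np.pi / np.sqrt(w2[0])

      # illustrative profile: four soft surface layers over two stiff layers down to bedrock
      h   = np.array([10.0, 10.0, 20.0, 20.0, 60.0, 80.0])
      vs  = np.array([150., 250., 350., 450., 700., 1200.])
      rho = np.array([1.8, 1.8, 1.9, 1.9, 2.0, 2.2])

      T_continuous = fundamental_period(h, vs, rho)              # bedrock-to-surface model
      T_separate = fundamental_period(h[:4], vs[:4], rho[:4])    # upper part fixed at the "base layer"
      print(f"T1 continuous analysis: {T_continuous:.2f} s")
      print(f"T1 separate analysis  : {T_separate:.2f} s")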

  3. A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1992-01-01

    Schneider demonstrates that many fault tolerant clock synchronization algorithms can be represented as refinements of a single proven correct paradigm. Shankar provides mechanical proof that Schneider's schema achieves Byzantine fault tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault tolerant midpoint convergence function satisfies three of the constraints. A hardware design is presented, implementing the fault tolerant midpoint function, which is shown to satisfy the remaining constraints. The synchronization circuit will recover completely from transient faults provided the maximum fault assumption is not violated. The initialization protocol for the circuit also provides a recovery mechanism from total system failure caused by correlated transient faults.
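
    The fault-tolerant midpoint convergence function mentioned above has a compact description: discard the f largest and f smallest of the exchanged clock readings and average the extremes of what remains. A sketch, with variable names of my own choosing rather than those of the verified design:

      def fault_tolerant_midpoint(readings, f):
          """Fault-tolerant midpoint convergence function.

          readings: clock values gathered from the other nodes (len > 2f)
          f:        maximum number of faulty clocks tolerated
          """
          if len(readings) <= 2 * f:
              raise ValueError("need more than 2f readings to tolerate f faults")
          ordered = sorted(readings)
          trimmed = ordered[f:len(ordered) - f]     # drop the f lowest and f highest values
          return (trimmed[0] + trimmed[-1]) / 2.0   # midpoint of the remaining extremes

      # example: 4 clocks, tolerating f = 1 Byzantine fault; the outlier 340 is discarded
      print(fault_tolerant_midpoint([100.0, 102.0, 98.0, 340.0], f=1))   # -> 101.0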

  4. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.

  5. Design and implementation of telemetry seismic data acquisition system based on embedded P2P Ethernet

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Lin, J.; Chen, Z.

    2011-12-01

    A new design of a telemetry seismic data acquisition system is presented that uses embedded, point-to-point (P2P) Ethernet networks. We explain the idea and motivation behind the use of a P2P Ethernet topology and show the problems that arise when such a topology is used in a seismic acquisition system. The paper focuses on the network protocols we developed, which include generation of the route table and dynamic IP address management. This new design has been implemented based on an ARM processor and an FPGA, and we have tested it in the laboratory and in field seismic exploration.

  6. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    SciTech Connect

    RYAN GW

    2008-04-25

    In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project and a strong intent to avoid potentially costly re-work of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) to choose the criteria that will result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in a conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and reduce the risk that project designs and analyses will require revision when the draft DOE-STD-1189 is finalized.

  7. Design of innovative dynamic systems for seismic response mitigation

    E-print Network

    Seymour, Douglas (Douglas Benjamin)

    2012-01-01

    Rocking wall systems consist of shear walls, laterally connected to a building, that are moment-released in their strong plane. Their purpose is to mitigate seismic structural response by constraining a building primarily ...

  8. Optimization Criteria In Design Of Seismic Isolated Building

    SciTech Connect

    Clemente, Paolo; Buffarini, Giacomo [ENEA Casaccia Research Centre, Via Anguillarese 301, 00123 Rome (Italy)

    2008-07-08

    Use of new anti-seismic techniques is certainly suitable for buildings of strategic importance and, in general, in the case of very high risk. For ordinary buildings, instead, the cost of the base isolation system should be balanced by an equivalent saving in the structure. The comparison criteria were first defined; then a large numerical investigation was carried out to analyze the effectiveness and the economic suitability of seismic isolation in concrete buildings.

  9. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Microsoft Academic Search

    D. L. Bernreuter; A. C. Boissonnade; C. M. Short

    1998-01-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage

  10. INNOVATIVE DESIGN AND TESTING OF A SEISMIC RETROFITTED STEEL DECK TRUSS

    E-print Network

    Bruneau, Michel

    He is a lead investigator of various research projects in the area of earthquake-resistant design at Buffalo. Dr. Majid Sarraf, P.E., is the designer of a number of major bridge projects, including a section of a very complex detour bridge.

  11. EUROSTEEL 2011, August 31 -September 2, 2011, Budapest, Hungary CAPACITY DESIGN IN SEISMIC RESISTANT STEEL BUILDINGS

    E-print Network

    Baker, Jack W.

    A Reliability-Based Methodology to Establish Capacity-Design Factors. Capacity design relies on amplifying the expected strength of the critical component and applying 'resistance' factors to reduce its design strength; on this basis, capacity-design factors are established for different seismic resisting systems, supporting the rational development of capacity design.

  12. Considerations for developing seismic design criteria for nuclear waste storage repositories

    SciTech Connect

    Owen, G.N.; Yanev, P.I.; Scholl, R.E.

    1980-04-01

    The function of seismic design criteria is to reduce the potential for hazards that may arise during various stages of the repository life. During the operational phase, the major concern is with the possible effects of earthquakes on surface facilities, underground facilities, and equipment. During the decommissioned phase, the major concern is with the potential effects of earthquakes on the geologic formation, which may result in a reduction in isolation capacity. Existing standards and guides or criteria used for the static and seismic design of licensed nuclear facilities were reviewed and evaluated for their applicability to repository design. This report is directed mainly toward the development of seismic design criteria for the underground structures of repositories. An initial step in the development of seismic design criteria for the underground structures of repositories is the development of performance criteria, or minimum standards of acceptable behavior. A number of possible damage modes are identified for the operating phase of the repository; however, no damage modes are foreseen that would perturb the long-term function of the repository, except for the possibility of increased permeability within the rock mass. Subsequent steps in formulating acceptable seismic design criteria for the underground structures involve the quantification of the design process. The report discusses the necessity of specifying the form of ground motion that would be needed for seismic analysis and the procedures that may be used for making ground motion predictions. Further discussions outline what is needed for analysis, including rock properties, failure criteria, modeling techniques, seismic hardening criteria for the host rock mass, and probabilistic considerations.

  13. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter [Vienna University of Technology, Center of Mechanics and Structural Dynamics (Austria)

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for structural upgrade with the help of added devices. The objective is the minimization of the structural displacement response. The application of the structural seismic response research is presented in a base-isolation example.

  14. Seismic design factors for RC special moment resisting frames in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

    2011-12-01

    This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16 and 32 stories in Dubai, UAE, utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated; namely, the response modification factor (R), deflection amplification factor (Cd), system overstrength factor (Ωo), and response modification factor for ductility (Rd), in addition to inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50). Consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ωo factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) in moment frames with varying heights is to be expected.
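
    The design factors named above are typically back-calculated as ratios of capacity and demand quantities taken from pushover and nonlinear dynamic analyses; a hedged sketch of the usual ratio definitions follows, with placeholder forces and displacements (definitions vary slightly between studies).

      # Placeholder capacity/demand quantities for one frame (illustrative values only):
      V_design  = 1000.0    # design base shear from the code response-spectrum analysis [kN]
      V_yield   = 1800.0    # base shear at significant yield from pushover [kN]
      V_elastic = 5400.0    # base shear demand if the frame remained elastic [kN]
      d_elastic = 0.10      # roof displacement under the design-level elastic forces [m]
      d_inelast = 0.45      # peak inelastic roof displacement from nonlinear dynamic analysis [m]

      omega_o = V_yield / V_design        # system overstrength factor
      R_d     = V_elastic / V_yield       # response modification factor for ductility
      R       = R_d * omega_o             # response modification factor (= V_elastic / V_design)
      C_d     = d_inelast / d_elastic     # deflection amplification factor

      print(f"Omega_o = {omega_o:.2f}, R_d = {R_d:.2f}, R = {R:.2f}, C_d = {C_d:.2f}")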

  15. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.

  16. VxWorks-based real-time data gathering software design for seismic data acquisition system

    Microsoft Academic Search

    Hong-cai Cheng; Ping Cao; Ke-zhu Song; Jun-feng Yang; Fu-Ming Ruan

    2010-01-01

    A real-time data gathering design is introduced in this paper, which can be used in remote sensing systems such as seismic data acquisition systems. Generally, this kind of system has a distributed architecture. Data acquired remotely should be transferred to a storage device or analysis center continuously. As the scale of remote sensing grows, it becomes more difficult to gather

  17. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find the optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to conventional design methods to show the strengths and weaknesses of the algorithm. PMID:25202717
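
    For readers unfamiliar with the metaheuristic, the following sketch implements one common formulation of the colliding bodies optimization update (masses from inverse fitness, better half stationary, worse half moving, coefficient of restitution decreasing over iterations) on a toy function; it is a generic illustration, not the paper's pushover-coupled design code.

      import numpy as np

      def cbo_minimize(f, lb, ub, n_bodies=20, n_iter=200, seed=0):
          """Colliding bodies optimization (one common formulation) for minimizing a
          non-negative objective f over box bounds lb..ub."""
          rng = np.random.default_rng(seed)
          dim = len(lb)
          x = rng.uniform(lb, ub, size=(n_bodies, dim))
          half = n_bodies // 2
          best_x, best_f = None, np.inf
          for it in range(n_iter):
              fit = np.apply_along_axis(f, 1, x)
              order = np.argsort(fit)
              x, fit = x[order], fit[order]                 # best bodies first
              if fit[0] < best_f:
                  best_x, best_f = x[0].copy(), fit[0]
              m = 1.0 / (fit + 1e-12)
              m = m / m.sum()                               # body "masses" from inverse fitness
              eps = 1.0 - it / n_iter                       # coefficient of restitution
              v_new = np.zeros_like(x)
              x_new = np.zeros_like(x)
              for i in range(half):                         # stationary (better) bodies
                  j = i + half                              # paired moving (worse) body
                  v_before = x[i] - x[j]                    # moving body's velocity before impact
                  v_new[i] = (m[j] + eps * m[j]) * v_before / (m[i] + m[j])
                  v_new[j] = (m[j] - eps * m[i]) * v_before / (m[i] + m[j])
                  x_new[i] = x[i] + rng.uniform(-1, 1, dim) * v_new[i]
                  x_new[j] = x[i] + rng.uniform(-1, 1, dim) * v_new[j]
              x = np.clip(x_new, lb, ub)
          return best_x, best_f

      # toy usage: minimize a shifted sphere function
      lb, ub = np.full(4, -5.0), np.full(4, 5.0)
      xb, fb = cbo_minimize(lambda z: np.sum((z - 1.2) ** 2), lb, ub)
      print(xb.round(3), round(fb, 6))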

  18. Evaluation of collapse resistance of RC frame structures for Chinese schools in seismic design categories B and C

    NASA Astrophysics Data System (ADS)

    Tang, Baoxin; Lu, Xinzheng; Ye, Lieping; Shi, Wei

    2011-09-01

    According to the Code for Seismic Design of Buildings (GB50011-2001), ten typical reinforced concrete (RC) frame structures, used as school classroom buildings, are designed with different seismic fortification intensities (SFIs) (SFI=6 to 8.5) and different seismic design categories (SDCs) (SDC=B and C). The collapse resistance of the frames with SDC=B and C in terms of collapse fragility curves are quantitatively evaluated and compared via incremental dynamic analysis (IDA). The results show that the collapse resistance of structures should be evaluated based on both the absolute seismic resistance and the corresponding design seismic intensity. For the frames with SFI from 6 to 7.5, because they have relatively low absolute seismic resistance, their collapse resistance is insufficient even when their corresponding SDCs are upgraded from B to C. Thus, further measures are needed to enhance these structures, and some suggestions are proposed.
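
    Collapse fragility curves of the kind evaluated here are commonly summarized by a lognormal fit to the collapse intensities obtained from IDA; a minimal sketch follows, with invented placeholder intensities.

      import numpy as np
      from scipy.stats import norm

      # collapse intensities (e.g., Sa(T1) in g) from IDA, one value per ground motion record
      im_collapse = np.array([0.62, 0.75, 0.81, 0.94, 1.02, 1.10, 1.23, 1.35, 1.48, 1.70])

      theta = np.exp(np.mean(np.log(im_collapse)))      # median collapse intensity
      beta = np.std(np.log(im_collapse), ddof=1)        # lognormal dispersion

      im = np.linspace(0.2, 2.5, 24)
      p_collapse = norm.cdf(np.log(im / theta) / beta)  # fragility: P(collapse | IM = im)

      for x, p in zip(im[::6], p_collapse[::6]):
          print(f"IM = {x:.2f} g  ->  P(collapse) = {p:.2f}")
      print(f"median = {theta:.2f} g, dispersion beta = {beta:.2f}")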

  19. Seismic responses of a pool-type fast reactor with different core support designs

    SciTech Connect

    Wu, Ting-shu; Seidensticker, R.W. (Argonne National Lab., IL (USA))

    1989-01-01

    In designing the core support system for a pool-type fast reactor, there are many issues which must be considered in order to achieve an optimum and balanced design. These issues include safety, reliability, as well as costs. Several design options are possible to support the reactor core. Different core support options yield different frequency ranges and responses. Seismic responses of a large pool-type fast reactor incorporated with different core support designs have been investigated. 4 refs., 3 figs.

  20. Investigation of Optimal Seismic Design Methodology for Piping Systems Supported by Elasto-Plastic Dampers

    NASA Astrophysics Data System (ADS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    In this study, the optimal seismic design methodology that can consider the structural integrity of not only the piping systems but also elasto-plastic supporting devices is developed. This methodology employs a genetic algorithm and can search the optimal conditions such as the supporting location, capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Four types of evaluation functions are considered. It is found that the proposed optimal seismic design methodology is very effective and can be applied to the actual seismic design for piping systems supported by elasto-plastic dampers. The effectiveness of the evaluation functions is also clarified.
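
    To make the search loop concrete, here is a toy genetic-algorithm sketch for choosing damper locations along a piping run; the 'evaluation function' is a made-up surrogate standing in for the nonlinear piping analysis and the four evaluation functions studied in the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      N_LOC, POP, GEN, P_MUT = 10, 40, 80, 0.05
      effectiveness = np.linspace(1.0, 0.3, N_LOC)      # assumed benefit of a damper at each location
      damper_cost = 0.15                                # relative cost per damper (assumed)

      def evaluate(layout):
          # Surrogate evaluation function (to be minimized): residual response + damper cost.
          residual = 1.0 / (1.0 + layout @ effectiveness)
          return residual + damper_cost * layout.sum() / N_LOC

      pop = rng.integers(0, 2, size=(POP, N_LOC))       # 1 = damper installed at that support
      for gen in range(GEN):
          scores = np.apply_along_axis(evaluate, 1, pop)
          # tournament selection
          idx = rng.integers(0, POP, size=(POP, 2))
          winners = np.where(scores[idx[:, 0]] < scores[idx[:, 1]], idx[:, 0], idx[:, 1])
          parents = pop[winners]
          # single-point crossover between consecutive parents
          children = parents.copy()
          for i in range(0, POP - 1, 2):
              cut = rng.integers(1, N_LOC)
              children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:].copy(),
                                                          parents[i, cut:].copy())
          # bit-flip mutation and elitism
          flip = rng.random(children.shape) < P_MUT
          children = np.where(flip, 1 - children, children)
          children[0] = pop[scores.argmin()]
          pop = children

      scores = np.apply_along_axis(evaluate, 1, pop)
      print("best layout:", pop[scores.argmin()], " score:", round(scores.min(), 4))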

  1. Effective Parameters on Seismic Design of Rectangular Underground Structures

    SciTech Connect

    Amiri, G. Ghodrati [Center of Excellence for Fundamental Studies in Structural Engineering, College of Civil Engineering, Iran University of Science and Technology, Narmak, Tehran 16846 (Iran, Islamic Republic of)]; Maddah, N.; Mohebi, B. [College of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)]

    2008-07-08

    Underground structures are a significant part of the transportation infrastructure in modern society and, in seismic zones, must withstand both seismic and static loadings. Embedded structures must conform to ground deformations during an earthquake, so an accurate evaluation of the structure-to-ground distortion is critical. Several two-dimensional finite difference models are used to find the parameters that affect the racking ratio (structure-to-ground distortion), including flexibility ratio, various cross sections, embedment depth, and Poisson's ratio of the soil. Results show that the influence of different cross sections by themselves is negligible, but embedment depth, in addition to flexibility ratio and Poisson's ratio, is a significant parameter. A comparison with the pseudo-static method (simplified frame analysis) is also performed. The results show that for a structure stiffer than the soil, the racking ratio decreases as the depth of burial decreases; on the other hand, shallow and flexible structures can suffer up to 30 percent greater distortion than deeper ones.

  2. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P. [Pacific Northwest National Laboratory, Richland, WA (United States); Gardner, M.G. [EnergySolutions, Richland, WA (United States)

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. Because of the sensitivity of the calculated response spectra to the velocity contrasts between the basalts and interbedded sediments, DOE initiated an effort to emplace additional boreholes at the WTP site and obtain direct Vs measurements and other physical property measurements in these layers. One core-hole and three boreholes have been installed at the WTP site to a maximum depth of 1468 ft (447 m) below ground surface. The three boreholes are within 500 ft (152 m) of and surrounding the high level waste vitrification and pretreatment facilities of the WTP, which were the Performance Category 3 (PC-3) structures affected by the interim design spectra. The core-hole is co-located with the borehole closest to the two PC-3 structures. These new measurements are expected to reduce the uncertainty in the modeled site response that is caused by the lack of direct knowledge of the Vs contrasts within these layers. (authors)
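
    The interim treatment of the interbed velocities described above amounts to propagating an assumed Vp/Vs range through the available Vp data; a small sketch of that bookkeeping follows (the Vp value, ratio range and basalt velocity are illustrative, not the project's).

      # Estimate a shear-wave velocity range for a sedimentary interbed from older
      # compressional-wave data, given an assumed range of Vp/Vs ratios.
      vp_interbed = 2100.0                     # measured Vp [m/s] (illustrative)
      vp_vs_range = (1.8, 2.0, 2.3, 2.7)       # assumed plausible Vp/Vs ratios
      vs_basalt = 2000.0                       # illustrative basalt shear-wave velocity [m/s]

      for r in vp_vs_range:
          vs = vp_interbed / r
          contrast = vs_basalt / vs            # velocity contrast driving the modeled site response
          print(f"Vp/Vs = {r:.1f} -> Vs ~ {vs:6.0f} m/s, basalt/interbed contrast ~ {contrast:.2f}")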

  3. Estimation of Cyclic Interstory Drift Capacity of Steel Framed Structures and Future Applications for Seismic Design

    PubMed Central

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288

  4. optimization of seismic network design: application to a geophysical international lunar network

    NASA Astrophysics Data System (ADS)

    Yamada, R.; Garcia, R. F.; Lognonne, P.; Calvet, M.; Gagnepain-Beyneix, J.; Le Feuvre, M.

    2010-12-01

    During the next decade, some lunar seismic experiments are planned under the international lunar network initiative, such as the NASA ILN Anchor nodes mission or Lunette DISCOVERY proposal, and the JAXA SELENE-2 and LUNA-GLOB penetrator missions, during which 1 to 4 seismic stations will be deployed on the lunar surface. Yamada et al. (submitted) have described how to design the optimized network in order to obtain the best scientific gain from these future lunar landing missions. In this presentation, we will describe the expected gain from the new lunar seismic observations potentially obtained by the optimized network compared with the past Apollo seismic experiments. From the Apollo seismic experiments, valuable information about the lunar interior structure was obtained using deep and shallow moonquakes, and meteoroid impacts (e.g., Nakamura et al., 1983, Lognonné et al., 2003). However, due to the limited sensitivity of the Apollo lunar seismometers and the narrowness of the seismic network, the deep lunar structure, especially the core, was not properly retrieved. In addition, large uncertainties are associated with the inferred crustal thickness around the Apollo seismic stations. Improving this knowledge will help us to understand the origin of the Earth-Moon system and the initial differentiation of the Moon. Therefore, we have studied the optimization of a seismic network consisting of three or four new seismometers in order to place better constraints on the lunar mantle structure and/or crustal thickness. The network is designed to minimize the a posteriori errors and maximize the resolution of the velocity perturbations inside the mantle and/or the crust through a linear inverse method. For the inversion, the deep moonquakes from active sources already located by Apollo seismic data are used, because it is known that these events occur repeatedly at identical nests depending on tidal constraints. In addition, we use randomly distributed meteoroid impacts located either by the new seismic network or by detection of the impact flashes from Earth-based observation. The use of these impact events will greatly contribute to improving the knowledge of shallow structures, in particular the crust. Finally, a comparison of the a posteriori errors deduced from our optimized network with those of the Apollo network will indicate the potential of the optimized network and the expected scientific gain. This method will be a useful tool to consider for future geophysical network landing missions.
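
    The 'a posteriori errors and resolution' criterion used to rank candidate networks can be written compactly for a linearized inverse problem: the posterior model covariance is C_post = (G' Cd^-1 G + Cm^-1)^-1 and the resolution matrix is R = I - C_post Cm^-1. The sketch below scores two hypothetical station geometries this way; the sensitivity matrices are random placeholders standing in for travel-time kernels.

      import numpy as np

      def appraise_design(G, sigma_d, sigma_m):
          """Posterior covariance and resolution matrix for a linearized inverse problem."""
          Cd_inv = np.diag(1.0 / sigma_d**2)
          Cm_inv = np.diag(1.0 / sigma_m**2)
          C_post = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
          R = np.eye(G.shape[1]) - C_post @ Cm_inv
          return C_post, R

      rng = np.random.default_rng(42)
      n_model = 5                                   # e.g., velocity perturbations in 5 mantle shells
      sigma_m = np.full(n_model, 0.05)              # prior standard deviations (assumed)

      # Two candidate networks = two data sets with different (placeholder) sensitivity matrices.
      G_sparse = rng.normal(size=(8, n_model))      # 3-station network, fewer usable arrivals
      G_dense = rng.normal(size=(20, n_model))      # 4-station network, more arrivals

      for name, G in [("3 stations", G_sparse), ("4 stations", G_dense)]:
          C_post, R = appraise_design(G, np.full(G.shape[0], 0.5), sigma_m)
          print(f"{name}: mean posterior std = {np.sqrt(np.diag(C_post)).mean():.3f}, "
                f"trace(resolution) = {np.trace(R):.2f}")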

  5. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  6. Seismic Assessment of High-Raised Designed Structures Based on 2800 Iranian Seismic Code (same as UBC1997)

    SciTech Connect

    Negar, Moharrami Gargari [Ministry of Roads and Transportation, Ports and Shipping Organization, P.S.O Building, South Didar St, Shahid Haghani Highway, Vanak Sq, Tehran (Iran, Islamic Republic of); Rassol, Mirgaderi [University of Tehran, Englab Square, Tehran (Iran, Islamic Republic of)

    2008-07-08

    Seismic design codes aim to ensure appropriate performance of structures during earthquakes; in this regard, the variety of load patterns, the history and location of plastic hinges, the ultimate capacity of the structure, its demand capacity, and many other questions about the actual and assumed performance of structures during earthquakes have been considered by experts in this field. To reduce retrofit costs, the evaluation of the nonlinear behavior of structures during earthquakes has been studied more extensively. The first generation of structural retrofit codes was established in the late 1980s, while design codes were still based on the linear behavior of structures. Consequently, comparison of the results of design and retrofit codes, which evaluate the actual behavior of the structure, has attracted attention. This research evaluates structures designed by the 2800 code against the performance levels described in FEMA 356, and it compares the results of modal analysis with the outcomes of static nonlinear analysis using the load patterns given in FEMA 356. The structure is designed and checked against all provisions of the 2800 code and then evaluated according to FEMA 356. Finally, the performance point of the structure and the distribution of plastic hinges over the whole structure at collapse are presented.

  7. Seismic design analysis of the country masonry school buildings in the meizoseismal area

    NASA Astrophysics Data System (ADS)

    Feng, Yuan; Yi, Dan; Bi, Qiong

    2011-09-01

    Several reinforcing schemes are illustrated that are based on the loading characteristics of typical country masonry school buildings with sparsely spaced transversal walls and large depth. From the seismic damage observed following the Wenchuan Earthquake, the effects of reinforcing schemes, tie-columns and tie-beams on the seismic resistance of masonry buildings are analyzed. The concept of improving the ductility of these types of buildings is presented. Finally, some suggestions are proposed for the design of masonry buildings with sparsely spaced transversal walls and large depth.

  8. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part I: Theoretical formulation

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso [Dipartimento di Modellistica per l'Ingegneria, Universita della Calabria, 87036, Arcavacata di Rende, Cosenza (Italy)

    2008-07-08

    The insertion of steel braces equipped with dissipative devices proves very effective in enhancing the performance of a framed building under horizontal seismic loads. Multi-level design criteria were proposed according to Performance-Based Design, in order to attain, for a specific level of seismic intensity, a designated performance objective for the building (e.g., an assigned damage level of either the framed structure or the non-structural elements). In this paper a design procedure is proposed that aims to proportion braces with hysteretic dampers so as to attain, for a specific level of seismic intensity, a designated performance level of the building. Specifically, a proportional stiffness criterion, which assumes the elastic lateral storey stiffness due to the braces to be proportional to that of the unbraced frame, is combined with Direct Displacement-Based Design, in which the design starts from target deformations. A computer code has been prepared for the nonlinear static and dynamic analyses, using a step-by-step procedure. Frame members and hysteretic dampers are idealized by bilinear models.
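
    As a concrete illustration of the Direct Displacement-Based Design step referred to above (target displacement profile, equivalent SDOF, effective period from a damped displacement spectrum, required stiffness and base shear), here is a minimal sketch; the displacement profile, equivalent damping and spectrum parameters are assumptions, not the paper's.

      import numpy as np

      # storey masses [t] and an assumed target displacement profile [m] for a 4-storey frame
      m = np.array([60.0, 60.0, 60.0, 50.0])
      delta = np.array([0.08, 0.15, 0.21, 0.25])

      # equivalent SDOF (substitute structure) quantities
      delta_d = np.sum(m * delta**2) / np.sum(m * delta)   # design displacement
      m_e = np.sum(m * delta) / delta_d                    # effective mass
      xi_eq = 0.15                                         # equivalent viscous damping (frame + dampers), assumed

      # 5%-damped displacement spectrum assumed linear up to a corner period, then flat
      T_corner, d_corner_5 = 4.0, 0.60                     # [s], [m] (assumed spectrum)
      eta = np.sqrt(0.07 / (0.02 + xi_eq))                 # damping reduction factor (EC8-type)
      d_corner = eta * d_corner_5

      T_eff = T_corner * delta_d / d_corner                # effective period delivering delta_d
      k_eff = 4.0 * np.pi**2 * m_e / T_eff**2              # required effective stiffness [kN/m]
      V_base = k_eff * delta_d                             # design base shear [kN]

      print(f"delta_d = {delta_d:.3f} m, m_e = {m_e:.1f} t, T_eff = {T_eff:.2f} s")
      print(f"k_eff = {k_eff:.0f} kN/m, V_base = {V_base:.0f} kN")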

  9. Seismic Vulnerability Assessment of Gravity Load Designed R\\/C Frames

    Microsoft Academic Search

    Angelo Masi

    2003-01-01

    The seismic vulnerability of some frame structures, typical of existing Reinforced Concrete buildings designed only to vertical loads, has been evaluated. They are representative of building types widely present in the Italian building stock of the last 30 years. A simulated design of the structures has been made with reference to the codes in force, the available handbooks and the

  10. A performance goal-based seismic design philosophy for waste repository facilities

    SciTech Connect

    Hossain, Q.A.

    1994-02-01

    A performance goal-based seismic design philosophy, compatible with DOE's present natural phenomena hazards mitigation and 'graded approach' philosophy, has been proposed for high level nuclear waste repository facilities. The rationale, evolution, and the desirable features of this method have been described. Why and how the method should and can be applied to the design of a repository facility are also discussed.

  11. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    Microsoft Academic Search

    RYAN GW

    2008-01-01

    In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design

  12. ENVIRONMENT, SAFETY & HEALTH DIVISION Seismic Design Specification for

    E-print Network

    Wechsler, Risa H.

    Design requirements cover experimental equipment design requirements, approved concrete anchors, and special design requirements. Seismic design loads must be based on values determined in accordance with ASCE 7-2005, and all anchors installed into hardened concrete (such as epoxy or expansion anchors) must be designed and installed in accordance

  13. SEISMIC MONITORING APPLIED TO MINES SAFETY AND OPTIMAL DESIGN OF MINE LAYOUTS IN HARD ROCK MASS SITUATIONS

    E-print Network

    Boyer, Edmond

    SEISMIC MONITORING APPLIED TO MINES SAFETY AND OPTIMAL DESIGN OF MINE LAYOUTS IN HARD ROCK MASS SITUATIONS. P. Bigarre and M. Bennani, National Institute for Industrial Environment and Risks - Laboratory of Rock Mechanics. In hard rock, tabular situations are usually associated with induced seismic activity, i.e. the occurrence

  14. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which are known to play a significant role in the level, duration and frequency content of ground motion. Ridges and crests are often lost within the large number of sites considered in the definition of a GMPE. Hence, it is likely that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article of Massa et al. about topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results highlight the inadequacy of the national provisions concerning the definition of the seismic load at the top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.
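
    The amendment described, an additive topographic term appended to a GMPE, can be illustrated with a generic attenuation form; the functional shape and every coefficient below are invented for illustration and are not the Bindi et al. model.

      import numpy as np

      def ln_sa(M, R_km, topo_amp=0.0, a=-1.5, b=0.9, c=-1.3, h=6.0):
          """Generic GMPE form: ln(SA) = a + b*(M-5) + c*ln(sqrt(R^2+h^2)) + topographic term.
          topo_amp is an additive (natural-log) topographic amplification coefficient."""
          return a + b * (M - 5.0) + c * np.log(np.sqrt(R_km**2 + h**2)) + topo_amp

      M, R = 6.0, 10.0
      sa_flat = np.exp(ln_sa(M, R))                          # reference flat-site prediction [g]
      sa_crest = np.exp(ln_sa(M, R, topo_amp=np.log(1.5)))   # crest with an assumed 1.5x amplification
      print(f"flat site: {sa_flat:.3f} g, ridge crest: {sa_crest:.3f} g")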

  15. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER

    SciTech Connect

    OMBERG SK

    2008-05-14

    This paper describes the approach taken by two Fluor Hanford projects for implementation of the seismic design criteria from DOE-STD-1189-2008, Appendix A. The existing seismic design criteria and the new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to the existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available in early CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex that may be implementing DOE-STD-1189-2008.

  16. Seismic design technology for breeder reactor structures. Volume 1. Special topics in earthquake ground motion

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This report is divided into twelve chapters: seismic hazard analysis procedures, statistical and probabilistic considerations, vertical ground motion characteristics, vertical ground response spectrum shapes, effects of inclined rock strata on site response, correlation of ground response spectra with intensity, intensity attenuation relationships, peak ground acceleration in the very near field, statistical analysis of response spectral amplitudes, contributions of body and surface waves, evaluation of ground motion characteristics, and design earthquake motions. (DLC)

  17. Seismic design of the waste-handling building at the prospective Yucca Mountain nuclear waste repository

    Microsoft Academic Search

    C. V. Subramanian; C. L. Wu; C. D. DeGabriele

    1988-01-01

    The site for the first prospective high-level nuclear waste repository is located in Yucca Mountain at the southwest corner of the Nevada Test Site in Nye County, Nevada. The preliminary site investigation and seismic hazard evaluation indicate that the design ground acceleration in the horizontal direction of 0.40 g (with a vertical acceleration equal to two-thirds the horizontal) has a

  18. Building configuration and seismic design: The architecture of earthquake resistance

    Microsoft Academic Search

    C. Arnold; R. Reitherman; D. Whitaker

    1981-01-01

    The relationship between a building's architecture and its ability to withstand earthquakes is examined. Aspects of ground motion which are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings are provided, with a focus on the architectural aspects of configuration design. Configuration derivation, building type as it relates to

  19. Intelligent monitoring of seismic damage identification using wireless smart sensors: design and validation

    NASA Astrophysics Data System (ADS)

    Kim, Jinho; Jang, Young-Du; Jang, Won-rak

    2011-04-01

    Structural health monitoring (SHM) has been adopted as a technique to monitor structural performance and to detect damage in aging infrastructure. The ultimate goals of implementing an SHM system are to improve infrastructure maintenance, increase public safety, and minimize the economic impact of an extreme loading event by streamlining repair and retrofit measures. With recent advances in wireless communication technology, wireless SHM systems have emerged as a promising alternative solution for rapid, accurate and low-cost structural monitoring. This article presents the development of a damage-identification algorithm to advance the detection and diagnosis of damage to structures for SHM using networks of wireless smart sensors. Networks of wireless smart sensors are used as a vibration-based structural monitoring network that allows extraction of mode shapes from output-only vibration data from an underground structure. The mode shape information can further be used in modal methods of damage detection. These sensors are used to experimentally verify analytical models of post-earthquake evaluation based on system identification analysis. A damage measurement system could play a significant role in monitoring and recording, with a higher level of completeness, the actual seismic response of structures, and in non-destructive seismic damage assessment techniques based on dynamic signature analysis.

  20. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. ?? 2009 Elsevier Ltd. All rights reserved.
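
    The decision framework described, minimizing expected lifetime cost over the total (aleatory plus epistemic) uncertainty, can be sketched numerically; the cost model, hazard scenarios and weights below are placeholders, not the paper's case study.

      import numpy as np

      design_pga = np.array([0.10, 0.15, 0.20, 0.25, 0.30])   # candidate design levels [g]
      upfront_cost = 1.0 + 2.0 * design_pga                   # construction cost (relative units, assumed)
      failure_loss = 50.0                                     # consequence of failure (relative units, assumed)
      life = 50.0                                             # design life [years]

      # epistemic uncertainty: alternative hazard models, each giving the annual rate of
      # exceeding a given PGA as lam0 * (pga / 0.1)**(-k); weights sum to 1 (all assumed)
      scenarios = [(0.4, 2e-3, 2.0), (0.4, 4e-3, 2.3), (0.2, 8e-3, 2.6)]   # (weight, lam0, k)

      def lifetime_failure_prob(pga, lam0, k):
          rate = lam0 * (pga / 0.1) ** (-k)        # annual exceedance rate at the design level
          return 1.0 - np.exp(-rate * life)        # Poisson occurrence over the design life

      expected_cost = upfront_cost.copy()
      for w, lam0, k in scenarios:
          expected_cost += w * failure_loss * lifetime_failure_prob(design_pga, lam0, k)

      best = np.argmin(expected_cost)
      print(f"optimal design level ~ {design_pga[best]:.2f} g "
            f"(expected lifetime cost {expected_cost[best]:.2f})")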

  1. On standard and optimal designs of industrial-scale 2-D seismic surveys

    NASA Astrophysics Data System (ADS)

    Guest, T.; Curtis, A.

    2011-08-01

    The principal aim of performing a survey or experiment is to maximize the desired information within a data set by minimizing the post-survey uncertainty on the ranges of the model parameter values. Using Bayesian, non-linear, statistical experimental design (SED) methods we show how industrial scale amplitude variations with offset (AVO) surveys can be constructed to maximize the information content contained in AVO crossplots, the principal source of petrophysical information from seismic surveys. The design method allows offset dependent errors, previously not allowed in non-linear geoscientific SED methods. The method is applied to a single common-midpoint gather. The results show that the optimal design is highly dependent on the ranges of the model parameter values when a low number of receivers is being used, but that a single optimal design exists for the complete range of parameters once the number of receivers is increased above a threshold value. However, when acquisition and processing costs are considered we find that a design with constant spatial receiver separation survey becomes close to optimal. This explains why regularly-spaced, 2-D seismic surveys have performed so well historically, not only from the point of view of noise attenuation and imaging in which homogeneous data coverage confers distinct advantages, but also to provide data to constrain subsurface petrophysical information.
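
    In the linearized two-term (intercept-gradient) AVO setting, the information content of an acquisition geometry can be scored by the log-determinant of the posterior precision of the model parameters, with offset-dependent data errors; a hedged sketch follows in which every number and the noise model are assumptions for illustration.

      import numpy as np

      def design_score(angles_deg, sigma0=0.01, noise_growth=0.0004, prior_std=(0.2, 0.4)):
          """log det of the posterior precision for the AVO intercept/gradient (A, B),
          with data noise growing with incidence angle (a stand-in for offset-dependent errors)."""
          theta = np.radians(angles_deg)
          G = np.column_stack([np.ones_like(theta), np.sin(theta) ** 2])   # R(theta) ~ A + B sin^2(theta)
          sigma_d = sigma0 + noise_growth * angles_deg                     # angle/offset-dependent errors
          H = G.T @ np.diag(1.0 / sigma_d**2) @ G + np.diag(1.0 / np.array(prior_std) ** 2)
          return np.linalg.slogdet(H)[1]

      # compare a few candidate receiver spreads (expressed here as incidence angles)
      candidates = {
          "near offsets only":    np.linspace(2, 18, 24),
          "regular full spread":  np.linspace(2, 40, 24),
          "ends-weighted spread": np.concatenate([np.linspace(2, 8, 12), np.linspace(34, 40, 12)]),
      }
      for name, ang in candidates.items():
          print(f"{name:22s} -> information score {design_score(ang):.2f}")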

  2. Seismic design evaluation guidelines for buried piping for the DOE HLW Facilities

    SciTech Connect

    Lin, Chi-Wen [Consultant, Martinez, CA (United States); Antaki, G. [Westinghouse Savannah River Co., Aiken, SC (United States); Bandyopadhyay, K. [Brookhaven National Lab., Upton, NY (United States); Bush, S.H. [Review & Synthesis Association, Richland, WA (United States); Costantino, C. [City Univ. of New York, New York, NY (United States); Kennedy, R. [RPK Structural Mechanics, Yorba Linda, CA (United States). Consultant

    1995-05-01

    This paper presents the seismic design and evaluation guidelines for underground piping for the Department of Energy (DOE) High-Level-Waste (HLW) Facilities. The underground piping includes both single and double containment steel pipes and concrete pipes with steel lining, with particular emphasis on the double containment piping. The design and evaluation guidelines presented in this paper follow the generally accepted beam-on-elastic-foundation analysis principle and the inertial response calculation method, respectively, for piping directly in contact with the soil or contained in a jacket. A standard analysis procedure is described along with the discussion of factors deemed to be significant for the design of the underground piping. The following key considerations are addressed: the design feature and safety requirements for the inner (core) pipe and the outer pipe; the effect of soil strain and wave passage; assimilation of the necessary seismic and soil data; inertial response calculation for the inner pipe; determination of support anchor movement loads; combination of design loads; and code comparison. Specifications and justifications of the key parameters used, stress components to be calculated and the allowable stress and strain limits for code evaluation are presented.
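
    The wave-passage effect listed among the key considerations is often checked first with simplified closed-form ground strains (Newmark-type) before the beam-on-elastic-foundation model is run; a sketch with assumed ground motion and wave speed values follows.

      # Simplified wave-passage ground strains for a buried pipe (all values assumed/illustrative).
      pgv = 0.5          # peak ground velocity [m/s]
      pga = 3.0          # peak ground acceleration [m/s^2]
      c = 2000.0         # apparent wave propagation velocity along the pipe axis [m/s]
      alpha = 2.0        # wave coefficient (e.g., ~2 for shear waves in some guides; assumed)
      r_pipe = 0.30      # outer radius of the core pipe [m]

      eps_axial = pgv / (alpha * c)          # axial ground strain transferred to the pipe
      curvature = pga / c**2                 # ground curvature from wave passage
      eps_bending = curvature * r_pipe       # bending strain at the extreme fiber

      print(f"axial strain   = {eps_axial:.2e}")
      print(f"bending strain = {eps_bending:.2e}")
      print(f"combined       = {eps_axial + eps_bending:.2e}")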

  3. Improvement of the Damping Constants for Seismic Design of Piping System for NPP

    SciTech Connect

    Kei Kobayashi; Takashi Satoh [Tokyo Electric Power Company (Japan); Nobuyuki Kojima [Mitsubishi Heavy Industries Ltd. (Japan); Kiyoshi Hattori [Toshiba Corporation (Japan); Masaki Nakagawa [Hitachi Ltd. (Japan); Akihito Otani [Ishikawajima-Harima Heavy Industries Company Ltd., 1 Shin-Nakaharacho, Isogoku, Yokohama 235-8501 (Japan)

    2002-07-01

    The present design damping constants for nuclear power plant (NPP) piping systems in Japan were developed through discussion among expert researchers, electric utilities and power plant manufacturers. They are standardized in 'Technical guidelines for seismic design of Nuclear Power Plants' (JEAG 4601-1991 Supplemental Edition). However, some of the damping constants are too conservative because of a lack of experimental data. To reduce this excessive conservatism, piping systems supported by U-bolts were chosen, and U-bolt support element tests and piping model excitation tests were performed to obtain proper damping constants. The damping mechanism consists of damping due to piping materials, damping due to fluid interaction, damping due to plastic deformation of piping and supports, and damping due to friction and collision between piping and supports. Because the damping due to friction and collision was considered to be dominant, we focused our effort on formulating these phenomena with a physical model. The validity of the damping estimation method was confirmed by comparing data obtained from the elemental tests and the actual-scale piping model test. New design damping constants were decided from the damping estimations for piping systems in an actual plant. From now on, we will use the new design damping constants for U-bolt supported piping systems, which were proposed from this study, as a standard in Japanese piping seismic design. (authors)

  4. Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Conrads, T.J.

    1993-06-01

    In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP) and this task was sponsored by DOE, Environmental Management. The TSEP is comprised of a number of consultants known for their knowledge of seismic ground motion and expertise in the analysis of structures, systems and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance to these guidelines throughout Title 1 and Title 2 design phases of the project as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects which might employ the TSEP guidelines.

  5. Low-Noise Potential of Advanced Fan Stage Stator Vane Designs Verified in NASA Lewis Wind Tunnel Test

    NASA Technical Reports Server (NTRS)

    Hughes, Christopher E.

    1999-01-01

    With the advent of new, more stringent noise regulations in the next century, aircraft engine manufacturers are investigating new technologies to make the current generation of aircraft engines as well as the next generation of advanced engines quieter without sacrificing operating performance. A current NASA initiative called the Advanced Subsonic Technology (AST) Program has set as a goal a 6-EPNdB (effective perceived noise) reduction in aircraft engine noise relative to 1992 technology levels by the year 2000. As part of this noise program, and in cooperation with the Allison Engine Company, an advanced, low-noise, high-bypass-ratio fan stage design and several advanced technology stator vane designs were recently tested in NASA Lewis Research Center's 9- by 15-Foot Low-Speed Wind Tunnel (an anechoic facility). The project was called the NASA/Allison Low Noise Fan.

  6. ISET Journal of Earthquake Technology, Paper No. 440, Vol. 41, No. 1, March 2004, pp. 53-73 TOWARDS PERFORMANCE-BASED SEISMIC DESIGN OF MDOF

    E-print Network

    Gupta, Vinay Kumar

    Structures may sustain residual deformations in the event of a design-level earthquake, even if they perform exactly as expected. Parameters influencing residual deformations are therefore relevant when assessing seismic performance or designing seismic-resistant structures, recognizing the economic disadvantages of designing buildings to withstand earthquakes

  7. Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project

    SciTech Connect

    Subramanian, C.V. [Sandia National Labs., Albuquerque, NM (USA); King, J.L. [Science Applications International Corp., Las Vegas, NV (USA); Perkins, D.M. [Geological Survey, Denver, CO (USA); Mudd, R.W. [Fenix and Scisson, Inc., Tulsa, OK (USA); Richardson, A.M. [Parsons, Brinckerhoff, Quade and Douglas, Inc., San Francisco, CA (USA); Calovini, J.C. [Holmes and Narver, Inc., Las Vegas, NV (USA); Van Eeckhout, E. [Los Alamos National Lab., NM (USA); Emerson, D.O. [Lawrence Livermore National Lab., CA (USA)

    1990-08-01

    This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

  8. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

    The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations to the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  9. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    SciTech Connect

    Pucinotti, Raffaele [Department of Mechanics and Materials, Mediterranean University of Reggio Calabria, loc. Feo di Vito, Reggio Calabria, 89126 (Italy); Ferrario, Fabio; Bursi, Oreste S. [Department of Mechanical and Structural Engineering, University of Trento, via Mesiano 7, Trento, 38050 (Italy)

    2008-07-08

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment-resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of a connection already damaged by an earthquake. The experimental activity, together with FE simulations, demonstrated the adequacy of the advanced design methodology.

  10. Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system

    USGS Publications Warehouse

    Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.

    1972-01-01

    The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.

  11. Optimal seismic design of reinforced concrete structures under time-history earthquake loads using an intelligent hybrid algorithm

    NASA Astrophysics Data System (ADS)

    Gharehbaghi, Sadjad; Khatibinia, Mohsen

    2015-03-01

    A reliable seismic-resistant design of structures is achieved in accordance with the seismic design codes by designing structures under seven or more pairs of earthquake records. Based on the recommendations of seismic design codes, the average time-history responses (ATHR) of the structure are required. This paper focuses on the optimal seismic design of reinforced concrete (RC) structures against ten earthquake records using a hybrid of a particle swarm optimization algorithm and an intelligent regression model (IRM). In order to reduce the computational time of the optimization procedure, which is dominated by the cost of the time-history analyses, the IRM is proposed to accurately predict the ATHR of structures. The proposed IRM combines the subtractive algorithm (SA), the K-means clustering approach and the wavelet weighted least squares support vector machine (WWLS-SVM). To predict the ATHR of structures, the input-output samples of structures are first classified by the SA and the K-means clustering approach. Then, a WWLS-SVM is trained for each cluster using few samples while retaining high accuracy. 9- and 18-storey RC frames are designed optimally to illustrate the effectiveness and practicality of the proposed IRM. The numerical results demonstrate the efficiency and computational advantages of the IRM for the optimal design of structures subjected to time-history earthquake loads.
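
    As a rough illustration of the surrogate-modelling idea described above (clustering the design samples and training a separate regression model per cluster to predict the average time-history response), the sketch below uses K-means and a standard RBF support-vector regressor from scikit-learn in place of the paper's subtractive algorithm and WWLS-SVM; all data are synthetic placeholders.

        # Minimal cluster-then-regress surrogate for average time-history responses (ATHR).
        # Assumptions: synthetic data; plain SVR instead of the wavelet weighted LS-SVM;
        # K-means instead of the subtractive clustering step.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(200, 5))                              # design variables (e.g. member sizes)
        y = X @ rng.uniform(size=5) + 0.1 * rng.normal(size=200)    # placeholder ATHR values

        kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
        models = {c: SVR(kernel="rbf").fit(X[kmeans.labels_ == c], y[kmeans.labels_ == c])
                  for c in range(4)}                                # one regressor per cluster

        def predict_athr(x_new):
            """Route a new design to its cluster and query that cluster's regressor."""
            c = int(kmeans.predict(x_new.reshape(1, -1))[0])
            return float(models[c].predict(x_new.reshape(1, -1))[0])

        print(predict_athr(rng.uniform(size=5)))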

  12. Fuzzy genetic optimization on performance-based seismic design of reinforced concrete bridge piers with single-column type

    Microsoft Academic Search

    Yu-Chi Sung; Chin-Kuo Su

    2010-01-01

    This paper presents a fuzzy genetic optimization for performance-based seismic design (PBSD) of reinforced concrete (RC) bridge piers with single-column type. The design is modeled as a constrained optimization problem with the objective of minimizing construction cost subject to the constraints of qualified structural capacity and suitable reinforcement arrangements for the designed RC pier. A violation of the constraints is

  13. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    NASA Astrophysics Data System (ADS)

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-01

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey stiffness of the damped braces to be proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in a companion paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed for a medium-risk region, is assumed to be retrofitted as for a high-risk region by the insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using the step-by-step procedures described in the companion paper mentioned above; the behaviour of the frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for the dynamic analyses.
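
    A minimal sketch of the proportional-stiffness assumption mentioned above: the elastic lateral stiffness of the damped braces at each storey is taken as a constant fraction of the corresponding unbraced-frame storey stiffness. The proportionality factor and the storey stiffnesses below are placeholders, not values from the paper.

        # Sketch only: damped-brace storey stiffness proportional to the unbraced
        # frame's elastic lateral storey stiffness. Numbers are illustrative.
        frame_storey_stiffness = [120e3, 110e3, 100e3, 90e3, 75e3, 60e3]  # kN/m, six storeys
        alpha = 0.5   # assumed proportionality factor between braces and bare frame

        brace_stiffness = [alpha * k for k in frame_storey_stiffness]
        for i, (kf, kb) in enumerate(zip(frame_storey_stiffness, brace_stiffness), start=1):
            print(f"storey {i}: frame {kf:.0f} kN/m, braces {kb:.0f} kN/m, total {kf + kb:.0f} kN/m")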

  14. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso [Dipartimento di Modellistica per l'Ingegneria, Universita della Calabria, 87036, Arcavacata di Rende, Cosenza (Italy)

    2008-07-08

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey stiffness of the damped braces to be proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in a companion paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed for a medium-risk region, is assumed to be retrofitted as for a high-risk region by the insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using the step-by-step procedures described in the companion paper mentioned above; the behaviour of the frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for the dynamic analyses.

  15. CHARACTERIZING THE YUCCA MOUNTAIN SITE FOR DEVELOPING SEISMIC DESIGN GROUND MOTIONS

    SciTech Connect

    S. Upadhyaya, I. Wong, R. Kulkarni, K. Stokoe, M. Dober, W. Silva, and R. Quittmeyer

    2006-02-24

    Yucca Mountain, Nevada, is the designated site for the first long-term geologic repository to safely dispose of spent nuclear fuel and high-level nuclear waste in the U.S. Yucca Mountain consists of stacked layers of welded and non-welded volcanic tuffs. Site characterization studies are being performed to assess its future performance as a permanent geologic repository. These studies include the characterization of the shear-wave velocity (Vs) structure of the repository block and the surface facilities area. The Vs data are an input to the calculations of ground motions for the preclosure seismic design and for the postclosure performance assessment, and therefore their accurate estimation is needed. Three techniques have been employed: to date, 24 downhole surveys, 15 suspension seismic logging surveys and 95 spectral-analysis-of-surface-waves (SASW) surveys have been performed at the site. The three data sets were compared with one another, with Vs profiles developed from vertical seismic profiling data collected by the Lawrence Berkeley National Laboratory, and with Vs profiles developed independently by the University of Nevada, Reno using the refraction microtremor technique. Based on these data, base-case Vs profiles have been developed and used in site response analyses. Since the question of adequate sampling arises in site characterization programs and a correlation between geology and Vs would help address this issue, a possible correlation was evaluated. To assess the influence of different factors on velocity, statistical analyses of the Vs data were performed using the method of multi-factor Analysis of Variance (ANOVA). The results of this analysis suggest that the effect of each of three factors, depth, lithologic unit, and spatial location, on velocity is statistically significant. Furthermore, velocity variation with depth differs among spatial locations. Preliminary results show that the lithologic unit alone explains about 54% and 42% of the velocity variation in the suspension and downhole data sets, respectively. The three factors together explain about 73% and 81% of the velocity variation in the suspension and downhole data sets, respectively. The development of a relationship, using multiple regression analysis, which may be used as a predictive tool to estimate velocity at a new location, is currently being examined.
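
    To illustrate the kind of multi-factor analysis of variance described above (testing whether depth, lithologic unit and spatial location each have a statistically significant effect on shear-wave velocity), the sketch below fits an ordinary-least-squares model with categorical factors using statsmodels. The data frame and factor labels are synthetic placeholders, not the Yucca Mountain data set.

        # Illustrative multi-factor ANOVA of shear-wave velocity (Vs). Synthetic data only.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        rng = np.random.default_rng(1)
        n = 300
        df = pd.DataFrame({
            "vs": rng.normal(800, 150, n),                               # placeholder Vs values (m/s)
            "depth_bin": rng.choice(["0-30m", "30-100m", "100-300m"], n),
            "lith_unit": rng.choice(["unit_A", "unit_B", "unit_C"], n),  # hypothetical unit labels
            "location": rng.choice(["north", "central", "south"], n),
        })

        model = ols("vs ~ C(depth_bin) + C(lith_unit) + C(location)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))   # F statistics and p-values for each factor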

  16. On the Computation of H/V and its Application to Microzonation and Seismic Design

    NASA Astrophysics Data System (ADS)

    Perton, M.; Martínez, J. A.; Lermo, J. F.; Sánchez-Sesma, F. J.

    2014-12-01

    The H/V ratio is the square root of the ratio of horizontal to vertical energies of ground motion. It has been observed that the frequency of its main peak is well suited for the characterization of site effects, and it has been widely used for micro-zonation and seismic structural design. Historically, that ratio was computed as the average of individual H/V ratios obtained from noise autocorrelations. Nevertheless, it has recently been pointed out that the H/V ratio should instead be calculated as the ratio of the average of H to the average of V. This calculation is based on the relation between the directional energies (the imaginary part of the Green's function) and the noise autocorrelations. In general, the average of ratios is different from the ratio of averages. Although the frequency of the main response was correctly obtained, the associated amplification factor has generally been poorly predicted, showing little agreement with the amplification observed during strong earthquakes. The unexpected decay of such ratios at high frequency and the lack of stability and reproducibility of the H/V ratios are other problems that face the method. These problems are addressed here from the point of view of the normalization of noise correlations. In fact, several normalization techniques have already been proposed in order to correctly retrieve the Green's function. Some of them are well suited for the retrieval of the surface-wave contribution, while others are more appropriate for bulk-wave incidence. Since the H/V ratio may be used for various purposes, such as surface-wave tomography, micro-zonation or seismic design, different normalizations are discussed as a function of the objectives. The H/V ratios obtained from local historical earthquakes on top of or far away from the subduction zone are also discussed. ACKNOWLEDGEMENT This research has been partially supported by DGAPA-UNAM under Project IN104712 and the AXA Research Fund.
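
    The key computational point in the abstract above is the difference between averaging individual H/V ratios and taking the ratio of the averaged directional energies. A minimal numpy sketch of the two estimators, with synthetic spectra standing in for real noise records, is:

        # Two ways of forming H/V from many noise windows. Synthetic spectra only.
        import numpy as np

        rng = np.random.default_rng(2)
        n_windows, n_freqs = 50, 256
        # placeholder directional energy spectra per window (east, north, vertical)
        E, N, Z = (rng.gamma(shape=2.0, scale=1.0, size=(n_windows, n_freqs)) for _ in range(3))

        # (a) average of the individual ratios (the historical practice)
        hv_avg_of_ratios = np.mean(np.sqrt((E + N) / Z), axis=0)

        # (b) ratio of the averaged energies (the formulation advocated in the abstract)
        hv_ratio_of_avgs = np.sqrt((E.mean(axis=0) + N.mean(axis=0)) / Z.mean(axis=0))

        print(hv_avg_of_ratios[:3], hv_ratio_of_avgs[:3])   # the two estimators differ in general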

  17. Verifying Ballast Water Treatment Performance

    EPA Science Inventory

    The U.S. Environmental Protection Agency, NSF International, Battelle, and U.S. Coast Guard are jointly developing a protocol for verifying the technical performance of commercially available technologies designed to treat ship ballast water for potentially invasive species. The...

  18. Basis of Design and Seismic Action for Long Suspension Bridges: the case of the Messina Strait Bridge

    SciTech Connect

    Bontempi, Franco [University of Rome 'La Sapienza', School of Engineering Via Eudossiana 18- 00184 Roma (Italy)

    2008-07-08

    The basis of design for complex structures such as suspension bridges is reviewed. Specific attention is devoted to the seismic action, to the required performance, and to the associated structural analysis. Uncertainty is specifically addressed by probabilistic and soft-computing techniques. The paper makes specific reference to the work and the experience developed over recent years for the re-design of the Messina Strait Bridge.

  19. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  20. A New Seismic Broadband Sensor Designed for Easy and Rapid Deployment

    NASA Astrophysics Data System (ADS)

    Guralp, Cansun; Pearcey, Chris; Nicholson, Bruce; Pearce, Nathan

    2014-05-01

    Properly deploying digital seismic broadband sensors in the field can be time consuming and logistically challenging. On active volcanoes, the time it takes to install such instruments has to be particularly short in order to minimize the risk to the deployment personnel. In addition, once a seismometer is installed it is not always feasible to pay regular visits to the deployment site in order to correct for possible movements of the seismometer due to settling, sliding or other external events. To address these issues we have designed a new type of versatile and very robust three-component feedback sensor which can be easily installed and is capable of self-correcting changes in its tilt and of measuring orientation changes during deployment. The instrument can be installed by direct burial in soil, in a borehole or in glacial ice, and can even be used under water as an ocean bottom seismometer (OBS). Its components are fitted above each other in a cylindrical stainless-steel casing with a diameter of 51 mm. Each seismic sensor has a flat response to velocity between 30 s and 100 Hz and a tilt tolerance of up to 20 degrees. A tilt sensor and a two-axis magnetometer inside the casing capture changes in tilt and horizontal orientation during the course of the deployment. Their output can be fed into internal motors which in turn adjust the actual orientation of each sensor in the casing. The first production models of this instrument have been deployed as OBSs in an active submarine volcanic area along the Juan de Fuca Ridge in the NE Pacific. We are currently finishing units to be deployed for volcano monitoring in Icelandic glaciers. This instrument will be offered as an analogue version or with a 24-bit digitizer fitted into the same casing. A pointed tip can be added to the casing to ease direct burial.

  1. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to run north-south and east-west across the Deaf Smith County site and intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check and select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine whether there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  2. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  3. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  4. Seismic response of perforated lightweight aggregate concrete wall panels for low-rise modular classrooms

    Microsoft Academic Search

    Y. H. Chai; John D. Anderson

    2005-01-01

    In this paper, details of precast concrete wall panels for construction of low-rise modular school buildings are described. These panels, designed to be the primary lateral force resisting system of the building, are relatively thin. Compared to the case for conventional moment-resisting frames or shear-wall buildings, where details have been extensively verified to be seismically effective, the seismic response of

  5. Simplified seismic collapse capacity-based evaluation and design of frame buildings with and without supplemental damping systems

    NASA Astrophysics Data System (ADS)

    Hamidia, Mohammad Javad

    A simplified procedure is developed for estimating the seismic sidesway collapse capacity of frame building structures. The procedure is then extended to quantify the seismic collapse capacity of buildings incorporating supplemental damping systems. The proposed procedure is based on a robust database of seismic peak displacement responses of viscously damped nonlinear single-degree-of-freedom systems for various seismic intensities, and uses nonlinear static (pushover) analysis without the need for nonlinear time-history dynamic analysis. The proposed procedure is assessed by comparing its collapse capacity predictions for 1470 different building models with those obtained from incremental nonlinear dynamic analyses. A straightforward, unifying collapse-capacity-based design procedure aimed at achieving a pre-determined probability of collapse under the maximum considered earthquake event is also introduced for structures equipped with viscous dampers (linear and nonlinear) and hysteretic dampers. The proposed simplified procedure offers a simple, yet efficient, computational/analytical tool that is capable of predicting collapse capacities with acceptable accuracy for a wide variety of frame building structures incorporating several types of supplemental damping systems.
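
    As a very rough sketch of the idea described above (combining a pushover-based displacement capacity with a pre-computed database of peak displacement demands of equivalent single-degree-of-freedom systems), the snippet below interpolates a hypothetical demand-versus-intensity table and reports the intensity at which the demand reaches the capacity. All numbers are placeholders; the actual procedure in the thesis is considerably more detailed.

        # Sketch only: collapse capacity taken as the seismic intensity at which the
        # peak displacement demand of an equivalent SDOF (from a pre-computed database)
        # reaches the displacement capacity inferred from a pushover analysis.
        import numpy as np

        # Hypothetical database entry: spectral acceleration (g) vs peak displacement (m)
        sa_levels = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4])
        peak_disp = np.array([0.03, 0.07, 0.12, 0.19, 0.28, 0.40, 0.57])

        disp_capacity = 0.35   # placeholder displacement capacity from a pushover curve (m)

        collapse_sa = np.interp(disp_capacity, peak_disp, sa_levels)
        print(f"estimated collapse capacity: {collapse_sa:.2f} g")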

  6. Seismic design of low-level nuclear waste repositories and toxic waste management facilities

    SciTech Connect

    Chung, D.H.; Bernreuter, D.L.

    1984-05-08

    The focus is on identifying the elements of typical hazardous waste facilities (HWFs) that are the major contributors to risk and therefore require additional consideration in the design and construction of low-level nuclear waste repositories and HWFs. From a recent study of six typical HWFs it was determined that the factors that contribute most to the human and environmental risk fall into four basic categories: the geologic and seismological conditions at each HWF; the engineered structures at each HWF; the environmental conditions at each HWF; and the nature of the material being released. In selecting and carrying out the six case studies, three groups of hazardous waste facilities were examined: generator industries which treat or temporarily store their own wastes; generator facilities which dispose of their own hazardous wastes on site; and industries in the waste treatment and disposal business. The case studies cover a diversity of geologic settings, nearby settlement patterns, and environments. Two sites are above a regional aquifer, two are near a bay important to regional fishing, one is in rural hills, and one is in a desert, although not isolated from nearby towns and a groundwater/surface-water system. From the results developed in the study, it was concluded that the effect of seismic activity on hazardous facilities poses a significant risk to the population. Fifteen reasons are given for this conclusion.

  7. Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces

    NASA Astrophysics Data System (ADS)

    Garg, Akshay; Singh, Amit

    2012-07-01

    Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of the proposed mechanism is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units, i.e. a Mother Rover as a surrogate (carrier) and two Baby Rovers as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each with its own solar-cell power supply and data transmission capability, to suitable sites such as chasmata associated with Valles Marineris, craters or sand dunes. The Mother Rover deploys these rovers in two opposite directions, and the rovers follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Until the active experiments begin, the Mother Rover acts as a guiding unit to control the spatial spread of the detection instruments. After the active shock experimentation, the Baby Rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering and landslides. Other experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized for carrying out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of the measurements (through concurrent readings) and to prolong the life cycle of the overall experiment. The proposed rover can be customised according to the associated scientific objectives and further needs.

  8. Verifying Randomized Byzantine Agreement

    Microsoft Academic Search

    Marta Z. Kwiatkowska; Gethin Norman

    2002-01-01

    Distributed systems increasingly rely on fault-tolerant and secure authorization services. An essential primitive used to implement such services is the Byzantine agreement protocol for achieving agreement among n parties even if t parties (t < n/3) are corrupt and behave maliciously. We describe our experience verifying the randomized protocol ABBA (Asynchronous Binary Byzantine Agreement) of Cachin, Kursawe and

  9. Verifying Compilers and ASMs

    Microsoft Academic Search

    Gerhard Goos; Wolf Zimmermann

    2000-01-01

    A verifying compiler ensures that the compiled code is always correct, but the compiler may also terminate with an error message and then fail to generate code. We argue that with respect to compiler correctness this is the best possible result which can be achieved in practice. Such a compiler may even include unverified

  10. A study on the seismic fortification level of offshore platform in Bohai Sea of China

    NASA Astrophysics Data System (ADS)

    Lu, Y.

    2010-12-01

    The Chinese sea areas are important sources of offshore petroleum resources and at the same time are seismically active regions. Fixed offshore platforms (OPs) are fundamental facilities for marine resource exploitation and are usually situated in a complex and severe environment, having to endure many environmental loads over their life span; therefore, damage to their structures may result in serious disasters. Among these environmental loads, the seismic load has a tremendous destructive effect and is not predictable. Where wind, wave and current are not overly severe, seismic resistance dominates the strength design of platforms. Furthermore, strong earthquakes have occurred recently or historically in all the sea areas of oil/gas exploitation in China. Therefore, the seismic design of fixed OPs is a very important issue. With the development of marine exploration and earthquake research in the sea areas, extensive studies on the seismotectonic environment and seismicity characteristics of the sea areas of China have been performed; meanwhile, more and more experience and data have been accumulated from OP design practice, which lays a foundation for studying and establishing a seismic design standard for OPs. This paper first gives an overall view of the seismic environment of the sea areas of China and then, taking the Bohai Sea seismic risk study as an example, introduces a shape factor K to characterize the seismic risk distribution in sub-regions of the Bohai Sea. Based on the seismic design ground motions for 46 platforms in the Bohai Sea, a statistical analysis was performed for the ratios of peak ground acceleration (PGA) at two different probability levels. In accordance with the two-stage design method, a scheme of two seismic design levels is proposed, and two seismic design objectives are established, respectively, for the strength-level earthquake and the ductility-level earthquake. By analogy with and comparison to the Chinese seismic design code for buildings, it is proposed that the probability levels for the strength-level and ductility-level earthquakes correspond to return periods of 200 and 1000-2500 years, respectively. By comparison with the codes developed by relevant industry institutions, the rationality and safety of the seismic fortification objectives for OPs are verified. Finally, the seismic parameters in the sub-regions of the Bohai Sea are calculated based on seismic risk zoning and ground motion intensity maps.

  11. Seismic design and evaluation guidelines for the Department of Energy High-Level Waste Storage Tanks and Appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1995-10-01

    This document provides seismic design and evaluation guidelines for underground high-level waste storage tanks. The guidelines reflect the knowledge acquired in the last two decades in defining seismic ground motion and calculating hydrodynamic loads, dynamic soil pressures and other loads for underground tank structures, piping and equipment. The application of the guidelines is illustrated with examples. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable for other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document. The original version of this document was published in January 1993. Since then, additional studies have been performed in several areas and the results are included in this revision. Comments received from the users are also addressed. Fundamental concepts supporting the basic seismic criteria contained in the original version have since then been incorporated and published in DOE-STD-1020-94 and its technical basis documents. This information has been deleted in the current revision.

  12. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites. Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (2) For probabilistic analyses supporting the demonstration of compliance with preclosure performance objectives, provide a mean seismic hazard curve for the surface facilities area. Results should be consistent with the PSHA for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (3) For annual ground motion exceedance probabilities appropriate for postclosure analyses, provide site-specific seismic time histories (acceleration, velocity, and displacement) for the waste emplacement level. Time histories should be consistent with the PSHA and reflect available knowledge on the limits to extreme ground motion at Yucca Mountain. (4) In support of ground-motion site-response modeling, perform field investigations and laboratory testing to provide a technical basis for model inputs. Characterize the repository block and areas in which important-to-safety surface facilities will be sited. Work should support characterization and reduction of uncertainties in inputs to ground-motion site-response modeling. (5) On the basis of rock mechanics, geologic, and seismic information, determine limits on extreme ground motion at Yucca Mountain and document the technical basis for them. (6) Update the ground-motion site-response model, as appropriate, on the basis of new data. Expand and enhance the technical basis for model validation to further increase confidence in the site-response modeling.
(7) Document seismic methodologies and approaches in reports to be submitted to the NRC. (8) Address condition reports.

  13. A Seismic Isolation Application Using Rubber Bearings; Hangar Project in Turkey

    SciTech Connect

    Sesigur, Haluk; Cili, Feridun [Istanbul Technical University, Faculty of Architecture, Division of Theory of Structures 34434, Taskisla, Istanbul (Turkey)

    2008-07-08

    Seismic isolation is an effective design strategy to mitigate the seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport, which is located in Istanbul, Turkey. A seismic isolation system in which the isolation layer is arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns on the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a lower base shear (approximately one quarter) than the non-isolated structure.

  14. Seismic Waveguide of Metamaterials

    E-print Network

    Kim, Sang-Hoon

    2012-01-01

    We have developed a new earthquake-resistant design method that supports conventional aseismic designs using acoustic metamaterials. We suggest a simple and practical method to reduce the amplitude of a seismic wave exponentially. Our device is an attenuator of seismic waves. Constructing a cylindrical shell-type waveguide that creates a stop-band for the seismic wave, we convert the wave into an evanescent wave over some frequency range without touching the building we want to protect.

  15. Unconditionally verifiable blind computation

    E-print Network

    Fitzsimons, Joseph F

    2012-01-01

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. Recently the authors together with Broadbent proposed a universal unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. In this paper we extend the BQC protocol presented in [Broadbent, Fitzsimons and Kashefi, FOCS 2009 p517] with new functionality allowing blind computational basis m...

  16. Seismic signal discrimination using adaptive system parameters

    Microsoft Academic Search

    N. Magotra; D. Hush; J. Bibbo; E. Clmelt

    1990-01-01

    The motivation for the research presented is the idea of using seismic information to verify compliance with respect to a comprehensive test ban treaty. A method is given for automatically discriminating between seismic events by using a multilayer perceptron neural network system. The input to the neural net consists of the filter coefficients of an adaptive line enhancer. The seismic

  17. Seismic design of steel structures with lead-extrusion dampers as knee braces

    SciTech Connect

    Monir, Habib Saeed [Islamic Azad Unviersity, Maragheh Branch (Iran, Islamic Republic of); Naser, Ali [Department of Civil Engineering, Urmia University (Iran, Islamic Republic of)

    2008-07-08

    One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and the dampers act independently of velocity for frequencies within the seismic range. In this paper lead dampers are considered as knee braces in steel frames and are studied from an economic viewpoint. Because lead dampers do not obstruct structural panels, this characteristic can resolve bracing problems from an architectural viewpoint. The behaviour of these dampers is compared with that of other kinds of dampers, such as XADAS and TADAS. The results indicate that lead dampers perform well in absorbing the energy induced by an earthquake and in controlling the seismic movements of multi-storey structures.

  18. Seismic design of steel structures with lead-extrusion dampers as knee braces

    NASA Astrophysics Data System (ADS)

    monir, Habib Saeed; Naser, Ali

    2008-07-01

    One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and the dampers act independently of velocity for frequencies within the seismic range. In this paper lead dampers are considered as knee braces in steel frames and are studied from an economic viewpoint. Because lead dampers do not obstruct structural panels, this characteristic can resolve bracing problems from an architectural viewpoint. The behaviour of these dampers is compared with that of other kinds of dampers, such as XADAS and TADAS. The results indicate that lead dampers perform well in absorbing the energy induced by an earthquake and in controlling the seismic movements of multi-storey structures.

  19. Simulation and Processing Seismic Data in Complex Geological Models

    NASA Astrophysics Data System (ADS)

    Forestieri da Gama Rodrigues, S.; Moreira Lupinacci, W.; Martins de Assis, C. A.

    2014-12-01

    Seismic simulations in complex geological models are useful for verifying some limitations of seismic data. In this project, different geological models were designed to analyze some of the difficulties encountered in the interpretation of seismic data. Another aim is to make these data available to LENEP/UENF students to test new tools to assist in seismic data processing. The geological models were created considering characteristics found in oil exploration. We simulated geological media with volcanic intrusions, salt domes, faults, pinch-outs and layers more distant from the surface (Kanao, 2012). We used the software Tesseral Pro to simulate the seismic acquisitions. The acquisition geometries simulated were of the common-offset, end-on and split-spread types (Figure 1). Data acquired with constant offset require fewer processing routines. The processing flow used, with tools available in the Seismic Unix package (for more details, see Pennington et al., 2005), was geometric spreading correction, deconvolution, attenuation correction and post-stack depth migration. In processing the data acquired with the end-on and split-spread geometries, we included velocity analysis and NMO correction routines. Although we analyzed synthetic data and carefully applied each processing routine, we can observe some limitations of seismic reflection in imaging thin layers, layers at great depth, layers with low impedance contrast, and faults.
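
    One step of the processing flow mentioned above, normal-moveout (NMO) correction, can be sketched with the standard hyperbolic travel-time relation t(x) = sqrt(t0^2 + (x/v)^2). The snippet below applies it to a synthetic gather; it is a generic illustration, not tied to the Seismic Unix implementation used in the project, and the velocity and geometry values are placeholders.

        # Hyperbolic NMO correction on a synthetic common-midpoint gather.
        # t(x) = sqrt(t0**2 + (x / v)**2); each trace is resampled at the moved-out times.
        import numpy as np

        dt, n_samples = 0.004, 500                 # 4 ms sampling, 2 s traces
        offsets = np.arange(100.0, 1100.0, 200.0)  # source-receiver offsets (m)
        v_nmo = 2200.0                             # assumed NMO velocity (m/s)

        gather = np.random.default_rng(3).normal(size=(len(offsets), n_samples))  # placeholder data
        t0 = np.arange(n_samples) * dt

        corrected = np.zeros_like(gather)
        for i, x in enumerate(offsets):
            tx = np.sqrt(t0**2 + (x / v_nmo)**2)          # travel time at this offset
            corrected[i] = np.interp(tx, t0, gather[i])   # sample the trace at the moved-out times
        print(corrected.shape)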

  20. Unconditionally verifiable blind computation

    E-print Network

    Joseph F. Fitzsimons; Elham Kashefi

    2013-08-15

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. The authors, together with Broadbent, previously proposed a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with new functionality allowing blind computational basis measurements, which we use to construct a new verifiable BQC protocol based on a new class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. The new resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations to be performed must first be put into a nearest neighbour form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.

  1. Verifying performance requirements

    NASA Technical Reports Server (NTRS)

    Cross, Joseph

    1986-01-01

    Today, it is impossible to verify performance requirements on Ada software, except in a very approximate sense. There are several reasons for this difficulty, the main one being the lack of use of information about the mapping of the program onto the target machine. An approach to a partial solution to the verification of performance requirements on Ada software is proposed, called the rule-based verification approach. This approach is suitable when the target machine is well defined and when additional effort and expense are justified in order to guarantee that the performance requirements will be met by the final system.

  2. Image resolution analysis: a new, robust approach to seismic survey design

    E-print Network

    Tzimeas, Constantinos

    2005-08-29

    ...configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate...

  3. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  4. Optimum seismic structural design based on random vibration and fuzzy graded damages

    NASA Technical Reports Server (NTRS)

    Cheng, Franklin Y.; Ou, Jin-Ping

    1990-01-01

    This paper presents the fuzzy dynamical reliability and failure probability, as well as the basic principles and the analytical method of loss assessment, for nonlinear seismic steel structures. Also presented are the optimization formulation and a numerical example with dual objectives (initial construction cost and expected failure loss) and dynamical reliability constraints. The earthquake ground motion is based on a stationary filtered non-white noise, and the fuzzy damage grade is described by a damage index.

  5. Computational fluid dynamics verified the advantages of streamlined impeller design in improving flow patterns and anti-haemolysis properties of centrifugal pump.

    PubMed

    Qian, K X; Wang, F Q; Zeng, P; Ru, W M; Yuan, H Y; Feng, Z G

    2006-01-01

    Computational fluid dynamics (CFD) technology was applied to predict the flow patterns in the authors' streamlined blood pump and an American bio-pump with straight vanes and shroud, respectively. Meanwhile, haemolysis comparative tests of the two pumps were performed to verify the theoretical analysis. The results revealed that the flow patterns in the streamlined impeller are coincident with its logarithmic vanes and parabolic shroud, and there is neither separate flow nor impact in the authors' pump. In the bio-pump, the main flow has the form of logarithmic spiral in vertical section and parabola in cross section, thus there are both stagnation and swirl between the main flow and the straight vanes and shroud. Haemolysis comparative tests demonstrated that the authors' pump has an index of haemolysis of 0.030, less than that of the bio-pump (0.065). PMID:17060163

  6. COMPARISON OF HORIZONTAL SEISMIC COEFFICIENTS DEFINED BY CURRENT AND PREVIOUS DESIGN STANDARDS FOR PORT AND HARBOR FACILITIES

    NASA Astrophysics Data System (ADS)

    Takahashi, Hidenori; Ikuta, Akiho

    The Japanese design standard for port and harbor facilities was revised in 2007, modifying the method used to calculate the horizontal seismic coefficient, kh. The comprehensive change of the method indicates that quay walls designed to the previous standard could lack earthquake resistance in terms of the current standard. In the present study, the coefficients kh calculated by the two standards were compared for existing quay walls constructed in the Kanto area, Japan. In addition, the factors that affect the relationship between the two types of coefficients were identified by means of multiple regression analyses. In only 16% of cases did the kh of the current standard exceed that of the previous standard. According to the multiple regression analyses, the ratio of the two coefficients tended to increase for quay walls that were located in a specific port and had large wall heights and small importance factors.
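
    To illustrate the kind of multiple regression analysis described above (relating the ratio of the current to the previous horizontal seismic coefficient to wall height, importance factor and port), the sketch below fits an ordinary-least-squares model with statsmodels. The predictors and the data are synthetic placeholders, not the Kanto-area quay-wall data set.

        # Illustrative multiple regression for the ratio of current kh to previous kh.
        # Synthetic placeholder data only.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 120
        df = pd.DataFrame({
            "kh_ratio": rng.normal(0.85, 0.15, n),          # current kh / previous kh
            "wall_height": rng.uniform(5.0, 20.0, n),       # m
            "importance_factor": rng.choice([1.0, 1.2, 1.5], n),
            "port": rng.choice(["A", "B", "C"], n),
        })

        fit = smf.ols("kh_ratio ~ wall_height + importance_factor + C(port)", data=df).fit()
        print(fit.summary())   # coefficients indicate which factors drive the kh ratio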

  7. Seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1993-01-01

    This document provides guidelines for the design and evaluation of underground high-level waste storage tanks subjected to seismic loads. Attempts were made to reflect the knowledge acquired in the last two decades in the areas of defining the ground motion and calculating hydrodynamic loads and dynamic soil pressures for underground tank structures. The application of the analysis approach is illustrated with an example. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable to other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document.

  8. Seismic Waveguide of Metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Hoon; Das, Mukunda P.

    We developed a new earthquake-resistant design method that supports conventional aseismic systems using acoustic metamaterials. The device is an attenuator of seismic waves that reduces the amplitude of the wave exponentially. Constructing a cylindrical shell-type waveguide composed of many Helmholtz resonators that creates a stop-band for the seismic frequency range, we convert the seismic wave into an attenuated one without touching the building that we want to protect. It is a mechanical way to convert the seismic energy into sound and heat.
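
    As a small numerical aside on the Helmholtz resonators mentioned above, the classical resonance frequency of a single resonator is f0 = (c / (2*pi)) * sqrt(A / (V * L_eff)). The snippet below evaluates this textbook formula for assumed neck and cavity dimensions; the dimensions and the sound speed are illustrative assumptions, not values from the paper.

        # Resonance frequency of a single Helmholtz resonator (textbook formula).
        # f0 = (c / (2*pi)) * sqrt(A / (V * L_eff)). Dimensions are illustrative only.
        import math

        c = 343.0              # assumed speed of sound in the resonator medium (m/s)
        neck_radius = 0.05     # m
        neck_length = 0.20     # m
        cavity_volume = 2.0    # m^3

        A = math.pi * neck_radius**2
        L_eff = neck_length + 1.7 * neck_radius   # common end-correction approximation
        f0 = (c / (2 * math.pi)) * math.sqrt(A / (cavity_volume * L_eff))
        print(f"resonance frequency: {f0:.2f} Hz")   # of the order of a few hertz here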

  9. Utilization of a finite element model to verify spent nuclear fuel storage rack welds

    SciTech Connect

    Nitzel, M.E.

    1998-07-01

    Elastic and plastic finite element analyses were performed for the inner tie block assembly of a 25 port fuel rack designed for installation at the Idaho National Engineering and Environmental Laboratory (INEEL) Idaho Chemical Processing Plant (ICPP). The model was specifically developed to verify the adequacy of certain welds joining components of the fuel storage rack assembly. The work scope for this task was limited to an investigation of the stress levels in the inner tie welds when the rack was subjected to seismic loads. Structural acceptance criteria used for the elastic calculations performed were as defined by the rack's designer. Structural acceptance criteria used for the plastic calculations performed as part of this effort were as defined in Subsection NF and Appendix F of Section III of the ASME Boiler and Pressure Vessel Code. The results confirm that the welds joining the inner tie block to the surrounding rack structure meet the acceptance criteria. The analysis results verified that the inner tie block welds should be capable of transferring the expected seismic load without structural failure.

  10. Overview of Thermal-Hydraulic Test Program for Evaluating or Verifying the Performance of New Design Features in APR1400 Reactor

    SciTech Connect

    Song, C.H.; Kwon, T.S.; Chu, I.C.; Jun, H.G.; Park, C.K. [Korea Atomic Energy Research Institute, Yuseong P.O. Box 105, Daejeon 305-600 (Korea, Republic of)

    2002-07-01

    The experimental program and some of the test results for the thermal-hydraulic evaluation or verification of new design features in APR1400 are introduced for the major test items. APR1400 incorporates many advanced design features to enhance its performance and safety. New design features adopted in APR1400 include, among others, four trains of the safety injection system (SIS) with a direct vessel injection (DVI) mode and a passively operating safety injection tank (SIT), the In-containment Refueling Water Storage Tank (IRWST) and the safety depressurization and vent system (SDVS). For these new design features, experimental activities relevant to ensuring their performance and contribution to safety enhancement have been carried out at KAERI. They include the LBLOCA ECCS performance evaluation test for the DVI mode of the SIS, the performance verification test of the fluidic device as a passive flow controller, the performance evaluation test of the steam sparger for the SDVS, and the CEDM (control element drive mechanism) performance evaluation test. In this paper, the test program is briefly introduced, including the test objectives, experimental methods and some typical results for each test item. (authors)

  11. The LUSI Seismic Experiment: Deployment of a Seismic Network around LUSI, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Karyono, Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Haryanto, Iyan; Masturyono, Masturyono; Hadi, Soffian; Rohadi, Suprianto; Suardi, Iman; Rudiyanto, Ariska; Pranata, Bayu

    2015-04-01

    The spectacular Lusi eruption started in northeast Java, Indonesia, on 29 May 2006, following a M6.3 earthquake striking the island. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system, and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. Lusi is located a few kilometres to the NE of the Arjuno-Welirang volcanic complex and sits upon the Watukosek fault system. From this volcanic complex originates the Watukosek fault system, which was reactivated by the M6.3 earthquake in 2006 and is still periodically reactivated by the frequent seismicity. To date Lusi is still active, erupting gas, water, mud and clasts. Gas and water data show that the Lusi plumbing system is connected with the neighbouring Arjuno-Welirang volcanic complex, which makes the Lusi eruption a "sedimentary hosted geothermal system". To verify and characterise the occurrence of seismic activity and how it perturbs the connected Watukosek fault, the Arjuno-Welirang volcanic system and the ongoing Lusi eruption, we deployed 30 seismic stations (short-period and broadband) in this region of the East Java basin. The seismic stations are more densely distributed around Lusi and the Watukosek fault zone that stretches between Lusi and the Arjuno-Welirang (AW) complex. Fewer stations are positioned around the volcanic arc. Our study sheds light on the seismic activity along the Watukosek fault system and describes the waveforms associated with the geysering activity of Lusi. The initial network aims to locate small events that may not be captured by the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG) seismic network, and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-Arjuno-Welirang region and of temporal variations of vp/vs ratios. Such variations will then ideally be related to large-magnitude seismic events. This project is an unprecedented monitoring effort of a multi-component system including the active Lusi eruption, an unlocked strike-slip fault and a neighbouring volcanic arc, all affected by frequent seismicity. Our study will also provide a large dataset for a qualitative analysis of earthquake-triggering studies and of earthquake-volcano and earthquake-earthquake interactions. The seismic experiment presented here strengthens our knowledge about Lusi and represents a step further towards the reconstruction of a society devastated by the Lusi disaster.

  12. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
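
    One of the diagnostics mentioned above, consistency of first-break times and normal moveout with offset, can be illustrated on synthetic picks; the velocities, offsets and residual threshold below are assumptions chosen only for demonstration.

        import numpy as np

        # Synthetic offsets (m); a reflection with t0 = 0.20 s and v = 1800 m/s.
        offsets = np.linspace(0.0, 1000.0, 21)
        t0, v = 0.20, 1800.0
        t_nmo = np.sqrt(t0**2 + (offsets / v)**2)     # NMO hyperbola t(x)

        # First breaks for a direct/refracted arrival with apparent velocity 600 m/s.
        t_fb = offsets / 600.0
        picked = t_fb + np.random.default_rng(0).normal(0.0, 0.01, offsets.size)

        # Simple QC flag: picks deviating strongly from the linear offset trend may
        # indicate bad geometry headers or noise-dominated traces.
        residual = picked - t_fb
        print("suspect traces:", np.where(np.abs(residual) > 0.02)[0])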

  13. Seismic Hazard Assessment: Issues and Alternatives

    NASA Astrophysics Data System (ADS)

    Wang, Zhenming

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications.
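
    For readers unfamiliar with the mechanics of a probabilistic hazard estimate, the sketch below computes annual exceedance rates for a single source, assuming a truncated Gutenberg-Richter recurrence and a toy lognormal ground-motion model. All parameter values are illustrative, and the sketch takes no position on the PSHA/DSHA debate summarized above.

        import numpy as np
        from scipy.stats import norm

        # One source at fixed distance R; truncated Gutenberg-Richter recurrence.
        a, b, m_min, m_max = 4.0, 1.0, 5.0, 7.5
        R = 30.0                                  # source-to-site distance, km
        mags = np.linspace(m_min, m_max, 200)
        rate_m = 10**(a - b*mags)                 # cumulative annual rate of M >= m
        pdf_m = -np.gradient(rate_m, mags)        # occurrence-rate density per magnitude

        # Toy ground-motion model: ln PGA (g) = c0 + c1*M - c2*ln(R), with sigma.
        c0, c1, c2, sigma = -3.5, 0.9, 1.0, 0.6
        ln_med = c0 + c1*mags - c2*np.log(R)

        pga_levels = np.logspace(-2, 0, 50)       # 0.01 g to 1 g
        hazard = [np.trapz(pdf_m * norm.sf((np.log(x) - ln_med)/sigma), mags)
                  for x in pga_levels]            # annual rate of exceeding each level
        for x, lam in zip(pga_levels[::10], hazard[::10]):
            print(f"PGA > {x:.3f} g : {lam:.2e} /yr")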

  14. Robust design of mass-uncertain rolling-pendulum TMDs for the seismic protection of buildings

    NASA Astrophysics Data System (ADS)

    Matta, Emiliano; De Stefano, Alessandro

    2009-01-01

    Commonly used for mitigating wind- and traffic-induced vibrations in flexible structures, passive tuned mass dampers (TMDs) are rarely applied to the seismic control of buildings, their effectiveness against impulsive loads being conditional upon adoption of large mass ratios. Instead of resorting to cumbersome metal or concrete devices, this paper suggests meeting that condition by turning into TMDs non-structural masses sometimes available atop buildings. An innovative roof-garden TMD, for instance, appears to be a promising tool capable of combining environmental and structural protection in one device. Unfortunately, because the amount of these masses is generally variable, the resulting mass-uncertain TMD (MUTMD) appears prone to mistuning and control loss. In an attempt to minimize such adverse effects, robust analysis and synthesis against mass variations are applied in this study to MUTMDs of the rolling-pendulum type, a configuration characterized by a mass-independent natural period. Through simulations under harmonic and recorded ground motions of increasing intensity, the performance of circular and cycloidal rolling-pendulum MUTMDs is evaluated on an SDOF structure in order to illustrate their respective advantages as well as the drawbacks inherent in their non-linear behavior. A possible implementation of a roof-garden TMD on a real building structure is described and its control efficacy numerically demonstrated, showing that in practical applications MUTMDs can become a good alternative to traditional TMDs.
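
    The mass-independence of the rolling-pendulum period, the property exploited above, follows from the idealized circular-track formula T = 2*pi*sqrt(R/g); the sketch below tunes a hypothetical device to a 1.2 s structure. Real rollers add a constant inertia factor but preserve the mass-independence.

        import numpy as np

        g = 9.81

        def rolling_pendulum_period(R_track):
            """Idealized circular rolling/sliding pendulum: T = 2*pi*sqrt(R/g),
            independent of the moving mass."""
            return 2.0 * np.pi * np.sqrt(R_track / g)

        # Tune the TMD to a hypothetical building with a 1.2 s fundamental period.
        T_target = 1.2
        R = g * (T_target / (2.0 * np.pi))**2
        print(f"required track radius ~ {R:.2f} m, giving T = {rolling_pendulum_period(R):.2f} s")
        # Because T does not depend on the mass, a roof garden whose weight varies stays
        # tuned in frequency; only the mass ratio (and hence effectiveness) changes.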

  15. Theoretical and practical considerations for the design of the iMUSH active-source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, E.; Levander, A.; Harder, S. H.; Abers, G. A.; Creager, K. C.; Vidale, J. E.; Moran, S. C.; Malone, S. D.

    2013-12-01

    The multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment seeks to understand the details of the magmatic system that feeds Mount St. Helens using active- and passive-source seismic, magnetotelluric, and petrologic data. The active-source seismic component of this experiment will take place in the summer of 2014 utilizing all of the 2600 PASSCAL 'Texan' Reftek instruments which will record twenty-four 1000-2000 lb shots distributed around the Mount St. Helens region. The instruments will be deployed as two consecutive refraction profiles centered on the volcano, and a series of areal arrays. The actual number of areal arrays, as well as their locations, will depend strongly on the length of the experiment (3-4 weeks), the number of instrument deployers (50-60), and the time it will take per deployment given the available road network. The current work shows how we are balancing these practical considerations against theoretical experiment designs in order to achieve the proposed scientific goals with the available resources. One of the main goals of the active-source seismic experiment is to image the magmatic system down to the Moho (35-40 km). Calculating sensitivity kernels for multiple shot/receiver offsets shows that direct P waves should be sensitive to Moho depths at offsets of 150 km, and therefore this will likely be the length of the refraction profiles. Another primary objective of the experiment is to estimate the locations and volumes of different magma accumulation zones beneath the volcano using the areal arrays. With this in mind, the optimal locations of these arrays, as well as their associated shots, are estimated using an eigenvalue analysis of the approximate Hessian for each possible experiment design. This analysis seeks to minimize the number of small eigenvalues of the approximate Hessian that would amplify the propagation of data noise into regions of interest in the model space, such as the likely locations of magma reservoirs. In addition, this analysis provides insight into the tradeoff between the number of areal array deployments and the information that will be gained from the experiment. An additional factor incorporated into this study is the expected data quality in different regions around Mount St. Helens. Expected data quality is determined using the signal-to-noise ratios of data from existing seismometers in the region, and from forward modeling the wavefields from different experiment designs using SPECFEM3D software. In particular, we are interested in evaluating how topography near the volcano and low velocity volcaniclastic layers affect data quality. This information is especially important within 5 km of the volcano where only hiking trails are available for instrument deployment, and in a large area north of the volcano where road maintenance has lagged since the 1980 eruption. Instrument deployment will be slow in these regions, and therefore it is essential to understand if deployment of instruments here is a reasonable use of resources. A final step of this study will be validating different experiment designs based upon the above criteria by inverting synthetic data from velocity models that contain a generalized representation of the magma system to confirm that the main features of the models can be recovered.
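
    A minimal sketch of the eigenvalue screening described above: for each candidate layout, form the approximate Hessian J^T J from a sensitivity (Jacobian) matrix and count poorly constrained model directions. Random matrices stand in for the real travel-time kernels, so the numbers are illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)

        def poorly_constrained(J, tol=1e-3):
            """Approximate Hessian H = J^T J; count eigenvalues far below the largest,
            i.e. model directions along which data noise would be amplified."""
            eigvals = np.linalg.eigvalsh(J.T @ J)
            return int(np.sum(eigvals < tol * eigvals.max()))

        # Stand-in sensitivity matrices (rows = travel-time data, cols = model cells).
        J_design_A = rng.normal(size=(300, 80))
        J_design_B = rng.normal(size=(60, 80))    # fewer shot-receiver pairs than cells

        for name, J in [("design A", J_design_A), ("design B", J_design_B)]:
            print(f"{name}: {poorly_constrained(J)} poorly constrained directions of {J.shape[1]}")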

  16. AUTOMATIC DATA PROCESSING AT BURAR SEISMIC STATION

    Microsoft Academic Search

    Daniela Ghica; Johannes Schweitzer

    BURAR seismic data are continuously recorded and transmitted in real-time to the Romanian National Data Centre (RO_NDC), where they are automatically processed using a program developed at NORSAR for detecting and associating seismic signals from regional array data, and applied for BURAR characteristics. Automatic estimates from detections (slowness vector and onset time) were verified with events listed in PDE bulletins

  17. Statistical classification methods applied to seismic discrimination

    Microsoft Academic Search

    F. M. Ryan; D. N. Anderson; K. K. Anderson; D. N. Hagedorn; K. T. Higbee; N. E. Miller; T. Redgate; A. C. Rohay

    1996-01-01

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within 2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained

  18. Regional seismic discrimination research at LLNL

    Microsoft Academic Search

    W. R. Walter; K. M. Mayeda; P. Goldstein; H. J. Patton; S. Jarpe; L. Glenn

    1995-01-01

    The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and

  19. Improving the design and performance of concrete bridges in seismic regions

    E-print Network

    Tobolski, Matthew Joseph

    2010-01-01

    The concrete mix design consisted of a nominal 3/8 maximum aggregate size mix, developed to achieve a target compressive strength of 7 ksi at 28 days.

  20. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    NASA Astrophysics Data System (ADS)

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti

    2008-07-01

    The development of displacement-based approaches for earthquake engineering design appears to be very useful and capable of providing improved reliability by directly comparing computed response and expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, which has been developed in structural engineering over the past ten years in an attempt to mitigate some of the deficiencies in current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts at applying the procedure to geotechnical engineering and, in particular, earth retaining structures are discussed in [3], [4] and [5]. However, in this field the outcomes of the research need to be further investigated in many aspects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper, and compared to those obtained from conventional pseudo-static analyses.

  1. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    SciTech Connect

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti [Department of Civil and Environmental Engineering, University of Perugia (Italy)

    2008-07-08

    The development of displacement-based approaches for earthquake engineering design appears to be very useful and capable of providing improved reliability by directly comparing computed response and expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, which has been developed in structural engineering over the past ten years in an attempt to mitigate some of the deficiencies in current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts at applying the procedure to geotechnical engineering and, in particular, earth retaining structures are discussed in [3], [4] and [5]. However, in this field the outcomes of the research need to be further investigated in many aspects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper, and compared to those obtained from conventional pseudo-static analyses.
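
    For context, the core substitute-structure step of DDBD can be written in a few lines: choose a target design displacement and an equivalent viscous damping, read the effective period off a damping-scaled displacement spectrum, and back out the effective stiffness and base shear. The values and the damping-scaling rule below are generic assumptions, not taken from the paper.

        import numpy as np

        # Substitute-structure step of Direct Displacement-Based Design (illustrative values).
        m_eff   = 250e3       # effective mass, kg
        delta_d = 0.05        # target design displacement, m
        xi_eq   = 0.15        # equivalent viscous damping (elastic + hysteretic)

        # Displacement spectrum: corner period and corner displacement at 5% damping,
        # scaled to xi_eq with the common (0.07/(0.02+xi))**0.5 rule (an assumption here).
        T_c, delta_c5 = 4.0, 0.40
        eta = np.sqrt(0.07 / (0.02 + xi_eq))
        delta_c = delta_c5 * eta

        T_eff = T_c * delta_d / delta_c          # effective period from the linear spectrum
        K_eff = m_eff * (2*np.pi / T_eff)**2     # effective (secant) stiffness
        V_base = K_eff * delta_d                 # design base shear
        print(f"T_eff = {T_eff:.2f} s, K_eff = {K_eff/1e6:.2f} MN/m, V_base = {V_base/1e3:.0f} kN")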

  2. A Verifiable Fingerprint Vault Scheme

    Microsoft Academic Search

    Qiong Li; Xiamu Niu; Zhifang Wang; Yuhua Jiao; Sheng-he Sun

    2005-01-01

    By adopting a non-interactive information-theoretic secure verifiable secret sharing scheme in an unorthodox way, a verifiable fingerprint vault scheme is presented in this paper. The fuzzy vault scheme is a novel cryptographic construct which can increase the security of the biometric template in a biometric authentication system. It can also be used to bind the cryptographic key and the user in

  3. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection. In addition, the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem by an approach involving modeling and analysis of the frequency synchronizer.

  4. Optimization for performance-based design under seismic demands, including social costs

    NASA Astrophysics Data System (ADS)

    Möller, Oscar; Foschi, Ricardo O.; Ascheri, Juan P.; Rubinstein, Marcelo; Grossman, Sergio

    2015-06-01

    Performance-based design in earthquake engineering is a structural optimization problem that has, as the objective, the determination of design parameters for the minimization of total costs, while at the same time satisfying minimum reliability levels for the specified performance criteria. Total costs include those for construction and structural damage repairs, those associated with non-structural components and the social costs of economic losses, injuries and fatalities. This paper presents a general framework to approach this problem, using a numerical optimization strategy and incorporating the use of neural networks for the evaluation of dynamic responses and the reliability levels achieved for a given set of design parameters. The strategy is applied to an example of a three-story office building. The results show the importance of considering the social costs, and the optimum failure probabilities when minimum reliability constraints are not taken into account.
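
    A minimal sketch of the optimization structure described above: minimize construction cost plus expected failure-consequence (including social) costs over a design parameter, subject to a minimum reliability constraint. The cost model and the toy failure-probability function below stand in for the neural-network response and reliability evaluation used in the paper.

        import numpy as np
        from scipy.optimize import minimize

        def failure_prob(x):
            # Toy stand-in for the reliability evaluation: a stronger/stiffer design
            # (larger x) gives a lower failure probability.
            return 1e-2 * np.exp(-3.0 * x)

        def total_cost(x):
            x = float(x[0])
            c_construction = 1.0 + 0.8 * x        # grows with design strength
            c_consequence  = 50.0                 # damage + social costs given failure
            return c_construction + c_consequence * failure_prob(x)

        # Minimum reliability constraint: Pf <= 1e-3 (placeholder target).
        cons = {"type": "ineq", "fun": lambda x: 1e-3 - failure_prob(float(x[0]))}
        res = minimize(total_cost, x0=[0.5], bounds=[(0.0, 3.0)], constraints=cons)
        print("optimal design parameter:", res.x[0], " Pf:", failure_prob(res.x[0]))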

  5. Verifying Correct Functionality of Avionics Subsystems

    NASA Technical Reports Server (NTRS)

    Meuer, Ben T.

    2005-01-01

    This project focuses on the testing of the telecommunications interface subsystem of the Multi-Mission System Architecture Platform to ensure proper functionality. The Multi-Mission System Architecture Platform is a set of basic tools designed to be used in future spacecraft. The responsibilities of the telecommunications interface include communication between the spacecraft and ground teams as well as acting as the bus controller for the system. The tests completed include bit-wise read/write tests to each register, testing of status bits, and verifying various bus controller activities. Testing is accomplished through the use of software-based simulations run on an electronic design of the system. The tests are written in the Verilog Hardware Description Language and they simulate specific states and conditions in telecommunication interfaces. Upon successful completion, the output is examined to verify that the system responded appropriately.

  6. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operations, and closure of MSW landfills. These regulat...

  7. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills....

  8. Seismic wavelet estimation: a frequency domain solution to a geophysical noisy input-output problem

    Microsoft Academic Search

    Andrew T. Walden; Roy E. White

    1998-01-01

    In seismic reflection prospecting for oil and gas a key step is the ability to estimate the seismic wavelet (impulse response) traveling through the Earth. Such estimation enables filters to be designed to deblur the recorded seismic time series and allows the integration of downhole and surface seismic data for seismic interpretation purposes. An appropriate model for the seismic time
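
    The frequency-domain idea referred to in the title can be sketched as the ratio of the input-output cross-spectrum to the input power spectrum (a transfer-function estimate); the example below uses a synthetic reflectivity and a Ricker-like wavelet, and illustrates the general approach rather than the authors' estimator.

        import numpy as np
        from scipy.signal import csd, welch

        rng = np.random.default_rng(0)
        fs, n = 250.0, 4096
        reflectivity = rng.normal(size=n)          # stand-in for well-derived reflectivity

        # True wavelet used to build the synthetic "recorded" trace plus noise.
        t = np.arange(-0.1, 0.1, 1/fs)
        f0 = 30.0
        wavelet = (1 - 2*(np.pi*f0*t)**2) * np.exp(-(np.pi*f0*t)**2)
        trace = np.convolve(reflectivity, wavelet, mode="same") + 0.1*rng.normal(size=n)

        # Frequency-domain wavelet estimate: W(f) ~ S_xy(f) / S_xx(f)
        f, Sxy = csd(reflectivity, trace, fs=fs, nperseg=512)
        _, Sxx = welch(reflectivity, fs=fs, nperseg=512)
        W_est = Sxy / Sxx
        print("estimated wavelet amplitude spectrum (first bins):", np.abs(W_est[:5]).round(3))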

  9. Detection capabilities of the BURAR seismic arraycontributions to the monitoring of regional and distant seismicity

    Microsoft Academic Search

    Daniela Veronica Ghica

    Data recorded with the Bucovina Romanian seismic array (BURAR) between January 2005 and December 2008 were analyzed to verify the monitoring capabilities for regional and distant seismicity. For this time interval, nearly 35,000 events detected by BURAR and identified in seismic bulletins (Preliminary Determination of Epicenters and Romanian Earthquake Catalogue) were investigated using parameters such as backazimuth, epicentral distance

  10. Neural networks in seismic discrimination

    SciTech Connect

    Dowla, F.U.

    1995-01-01

    Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate depths of seismic events using regional seismograms. In this paper different types of network architecture and representation techniques are discussed. We address the important problem of designing neural networks with good generalization capabilities. Examples of neural networks for treaty verification applications are also described.
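
    As a toy illustration of such a classifier (not the networks developed at LLNL), the sketch below trains a small feed-forward network on two synthetic discriminant features, loosely labeled as an amplitude ratio and an mb-Ms difference; scikit-learn's MLPClassifier is assumed to be available.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(2)
        n = 200
        # Synthetic features per event: [log(Lg/Pg), mb - Ms] (made-up distributions).
        earthquakes = np.column_stack([rng.normal(0.6, 0.2, n), rng.normal(-0.5, 0.3, n)])
        explosions  = np.column_stack([rng.normal(0.1, 0.2, n), rng.normal(0.8, 0.3, n)])
        X = np.vstack([earthquakes, explosions])
        y = np.array([0]*n + [1]*n)               # 0 = earthquake, 1 = explosion

        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))
        label = clf.predict([[0.2, 0.7]])[0]
        print("new event [0.2, 0.7] classed as:", "explosion" if label else "earthquake")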

  11. Seismic Hazard Analysis Quo vadis?

    Microsoft Academic Search

    Jens-Uwe Klügel

    2008-01-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, technical and financial risk analysis.

  12. Seismic upgrades of healthcare facilities.

    PubMed

    Yusuf, A

    1997-06-01

    Before 1989 seismic upgrading of hospital structures was not a primary consideration among hospital owners. However, after extensive earthquake damage to hospital buildings at Loma Prieta in Northern California in 1989 and then at Northridge in Southern California in 1994, hospital owners, legislators, and design teams became concerned about the need for seismic upgrading of existing facilities. Because the damage hospital structures sustained in the earthquakes was so severe and far-reaching, California has enacted laws that mandate seismic upgrading for existing facilities. Now hospital owners will have to upgrade buildings that do not conform to statewide seismic adequacy laws. By 2030, California expects all of its hospital structures to be sufficiently seismic-resistant. Slowly, regions in the Midwest and on the East Coast are following their example. This article outlines reasons and ways for seismic upgrading of existing facilities. PMID:10168656

  13. Seismic, shock, and vibration isolation - 1988

    SciTech Connect

    Chung, H. (Argonne National Lab., Argonne, IL (US)); Mostaghel, N. (Univ. of Utah, Salt Lake City, UT (US))

    1988-01-01

    This book contains papers presented at a conference on pressure vessels and piping. Topics covered include: Design of R-FBI bearings for seismic isolation; Benefits of vertical and horizontal seismic isolation for LMR nuclear reactor units; and Some remarks on the use and perspectives of seismic isolation for fast reactors.

  14. Seismic Waves: How Earthquakes Move the Earth

    NSDL National Science Digital Library

    Integrated Teaching and Learning Program,

    Students learn about the types of seismic waves produced by earthquakes and how they move the Earth. The dangers of earthquakes are presented as well as the necessity for engineers to design structures for earthquake-prone areas that are able to withstand the forces of seismic waves. Students learn how engineers build shake tables that simulate the ground motions of the Earth caused by seismic waves in order to test the seismic performance of buildings.

  15. The Non-Proliferation Experiment recorded at the Pinedale. Seismic research facility

    SciTech Connect

    Carr, D.B.

    1994-06-01

    The Non-Proliferation Experiment was recorded by five different seismic stations operated by Sandia National Laboratories at the Pinedale Seismic Research Facility, approximately 7.6° from the Nevada Test Site. Two stations are different versions of the Deployable Seismic Verification System developed by the Department of Energy to provide seismic data to verify compliance with a Comprehensive Test Ban Treaty. Vault and borehole versions of the Designated Seismic Stations also recorded the event. The final station is test instrumentation located at depths of 10, 40 and 1200 feet. Although the event is seen clearly at all the stations, there are variations in the raw data due to the different bandwidths and depths of deployment. One Deployable Seismic Verification System has been operating at Pinedale for over three years and in that time recorded 14 nuclear explosions and 4 earthquakes from the Nevada Test Site, along with numerous other western U.S. earthquakes. Several discriminants based on the work by Taylor et al. (1989) have been applied to this data. First the discriminants were tested by comparing the explosions only to the 4 earthquakes located on the Test Site. Only one discriminant, log(Lg/Pg), did not show clear separation between the earthquakes and nuclear explosions. When other western U.S. events are included, only the mb vs. Ms discriminant separated the events. In all cases where discrimination was possible, the Non-Proliferation Experiment was indistinguishable from a nuclear explosion.

  16. Seismic Survey

    USGS Multimedia Gallery

    USGS hydrologists conduct a seismic survey in New Orleans, Louisiana. The survey was one of several geophysical methods used during USGS applied research on the utility of the multi-channel analysis of surface waves (MASW) seismic method (not pictured here) for non-invasive assessment of earthen leve...

  17. Recent advances in the Lesser Antilles observatories Part 1 : Seismic Data Acquisition Design based on EarthWorm and SeisComP

    NASA Astrophysics Data System (ADS)

    Saurel, Jean-Marie; Randriamora, Frédéric; Bosson, Alexis; Kitou, Thierry; Vidal, Cyril; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

    Lesser Antilles observatories are in charge of monitoring the volcanoes and earthquakes in the Eastern Caribbean region. During the past two years, our seismic networks have evolved toward a fully digital technology. These changes, which include modern three-component sensors, high-dynamic-range digitizers, and high-speed terrestrial and satellite telemetry, improve data quality but also increase the data flows to process and to store. Moreover, the generalization of data exchange to build a wide virtual seismic network around the Caribbean domain requires great flexibility to provide and receive data flows in various formats. Like many observatories, we have decided to use the most popular and robust open-source data acquisition systems in use in today's observatory community: EarthWorm and SeisComP. The former is renowned for its ability to process real-time seismic data flows, with a high number of tunable modules (filters, triggers, automatic pickers, locators). The latter is renowned for its ability to exchange seismic data using the international SEED standard (Standard for Exchange of Earthquake Data), either by producing archive files or by managing output and input SEEDLink flows. The French Antilles Seismological and Volcanological Observatories have chosen to take advantage of the best features of each package to design a new data flow scheme and to integrate it into our global observatory data management system, WebObs [Beauducel et al., 2004]; see the companion paper (Part 2). We assigned the tasks to the different packages according to their main abilities: - EarthWorm first performs the integration of data from different heterogeneous sources; - SeisComP takes all this homogeneous EarthWorm data flow, adds other sources and produces SEED archives and a SEED data flow; - EarthWorm is then used again to process this clean and complete SEEDLink data flow, mainly producing triggers, automatic locations and alarms; - WebObs provides a friendly human interface, both to the administrator for station management, and to the regular user for real-time everyday analysis of the seismic data (event classification database, location scripts, automatic shakemaps and a regional catalog with associated hypocenter maps).

  18. Seismic evaluation methods for existing buildings

    SciTech Connect

    Hsieh, B.J.

    1995-07-01

    Recent US Department of Energy natural phenomena hazards mitigation directives require the earthquake reassessment of existing hazardous facilities and general-use structures. This also applies to structures located, in accordance with the Uniform Building Code, in Seismic Zone 0, where usually no consideration is given to seismic design but where DOE specifies seismic hazard levels. An economical approach for performing such a seismic evaluation, which relies heavily on the use of preexisting structural analysis results, is outlined below. Specifically, three different methods are used to estimate the seismic capacity of a building, which is a unit of a building complex located on a site considered at low risk of earthquakes. For structures not originally designed for seismic loads, which may not have, or be able to prove, sufficient capacity to meet new and arbitrarily high seismic design requirements, and which are located on low-seismicity sites, it may be very cost effective to perform detailed site-specific seismic hazard studies in order to establish the true seismic threat. This is particularly beneficial to sites with many buildings and facilities to be seismically evaluated.

  19. Seismic Refraction

    NSDL National Science Digital Library

    Robert D. Cicerone

    This lab allows the students to review the relevant formulas for the analysis of seismic refraction data and provides three different data sets to analyze three different geologic settings (three-layer model, dipping interface, fault).
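
    The standard two-layer, flat-interface analysis that such a lab typically walks through can be scripted directly: fit the direct and refracted travel-time branches, then recover the layer velocities and interface depth from the slopes and the intercept time. The model values below are synthetic.

        import numpy as np

        # Synthetic first arrivals for a flat two-layer model: v1 = 800 m/s,
        # v2 = 2400 m/s, interface depth h = 15 m.
        v1, v2, h = 800.0, 2400.0, 15.0
        x = np.linspace(5, 200, 40)                          # geophone offsets, m
        t_direct = x / v1
        t_refr = x / v2 + 2*h*np.sqrt(v2**2 - v1**2) / (v1*v2)
        t_obs = np.minimum(t_direct, t_refr)                 # first-arrival picks

        # Analysis: fit the two branches, then invert for velocities and depth.
        near, far = x < 40, x > 80                           # crude branch separation
        v1_est = 1.0 / np.polyfit(x[near], t_obs[near], 1)[0]
        slope, ti = np.polyfit(x[far], t_obs[far], 1)        # refracted branch
        v2_est = 1.0 / slope
        h_est = ti * v1_est * v2_est / (2*np.sqrt(v2_est**2 - v1_est**2))
        print(f"v1 ~ {v1_est:.0f} m/s, v2 ~ {v2_est:.0f} m/s, depth ~ {h_est:.1f} m")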

  20. vUML: A Tool for Verifying UML Models

    Microsoft Academic Search

    Johan Lilius; Ivan Porres Paltor

    1999-01-01

    The Unified Modelling Language (UML) is a standardised notation for describing object-oriented software designs. We present vUML, a tool that automatically verifies UML models. vUML verifies models where the behaviour of the objects is described using UML Statecharts diagrams. It supports concurrent and distributed models containing active objects and synchronous and asynchronous communication between objects. The tool uses the SPIN

  1. A new seismic discriminant for earthquakes and explosions

    Microsoft Academic Search

    Bradley B. Woods; Donald V. Helmberger

    1993-01-01

    With the spread of nuclear weapons technology, more regions of the world need to be monitored in order to verify nuclear nonproliferation and limited test-ban treaties. Seismic monitoring is the primary means to remotely sense contained underground explosions (Bolt, 1976; Dahlman and Israelson, 1977). Both underground explosions and earthquakes generate seismic energy, which propagates through the Earth as elastic waves.

  2. Probabilistic seismic hazard analysis of Islamabad, Pakistan

    NASA Astrophysics Data System (ADS)

    Bhatti, Abdul Qadir; Hassan, Syed Zamir Ul; Rafi, Zahid; Khatoon, Zubeda; Ali, Qurban

    2011-08-01

    Pakistan is prone to seismic activity, and its capital, Islamabad, is located close to the Main Boundary Thrust (MBT) fault. On October 8th, 2005 the disastrous Muzaffarabad earthquake shook Islamabad and damaged many high-rise buildings. A probabilistic seismic hazard analysis technique was used to estimate strong ground motion parameters for a closely spaced 1 km grid. Traditionally, PGA is calculated, which is then used in structural earthquake resistant design or seismic safety assessment. However, Peak Ground Acceleration (PGA) is not sufficient to design for seismic load or to account for the modern building code's emphasis on the use of spectral acceleration values. Therefore, a seismic hazard analysis was performed for Islamabad, and the design parameters that are required by codes to account for seismic loading were derived.
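
    Turning such hazard results into code-style design parameters usually means constructing a design spectrum from spectral accelerations at reference periods. The sketch below uses a generic two-branch (plateau plus 1/T) shape with placeholder values; it is not the spectral shape or the values adopted in the study or in any particular building code.

        import numpy as np

        # Hypothetical uniform-hazard spectral accelerations for one grid cell (in g).
        Ss, S1 = 0.9, 0.35        # Sa at 0.2 s and 1.0 s (placeholder values)
        Ts = S1 / Ss              # transition period between plateau and 1/T branch

        def design_sa(T):
            """Generic two-branch design spectrum: plateau Ss up to Ts, then S1/T."""
            T = np.asarray(T, dtype=float)
            return np.where(T <= Ts, Ss, S1 / np.maximum(T, 1e-6))

        periods = np.array([0.1, 0.2, 0.39, 0.5, 1.0, 2.0])
        for T, sa in zip(periods, design_sa(periods)):
            print(f"T = {T:4.2f} s : Sa = {sa:.2f} g")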

  3. E-Verify is a service of DHS and SSA WHAT IS E-VERIFY?

    E-print Network

    Bolding, M. Chad

    E-Verify is a service of DHS and SSA WHAT IS E-VERIFY? Federal Law requires that all employers called E-Verify to assist employers further in verifying the employment eligibility of all newly-hired employees. Through E-Verify, employers send information from the Form I-9 about you to SSA and DHS (only

  4. Verify Your Address for Postal Service Standards Verify Your Beneficiary and Personal Email

    E-print Network

    Tryon, Michael D.

    Verify Your Address for Postal Service Standards Verify Your Beneficiary and Personal Email Your by verifying that your beneficiary and personal email are accurate and up-to-date. By verifying your address, ensuring you have a beneficiary listed and verifying your personal email address you help us ensure your

  5. Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network

    NASA Astrophysics Data System (ADS)

    Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

    2004-12-01

    Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology based earth sciences curriculum development process. We are developing a pedagogical framework, graduate level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from 100's seismograph stations in Nevada, California and world (including live data from Earthworm systems and the IRIS DMC BUD - Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition, remote multi-channel data access, local time-series data management, interactive multi-window waveform display and time-series analysis with centralized meta-data control. Formally integrating educational seismology into the K-12 science curriculum with an overall "positive" impact to science education practices necessarily requires a collaborative effort between professional educators and seismologists yet driven exclusively by teacher needs.

  6. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    SciTech Connect

    Martin, R.L.; Gross, D. [Thunder Basin Coal Co., Wright, WY (United States); Pearson, D.C.; Stump, B.W. [Los Alamos National Lab., NM (United States); Anderson, D.P. [Southern Methodist Univ., Dallas, TX (United States). Dept. of Geological Sciences

    1996-12-31

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international, worldwide, Comprehensive Test Ban Treaty (CTBT, no nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments to produce results with mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics which were covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out of sequence, simultaneous detonation in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements leading to determination of the relationship of local and regional seismic amplitude to explosive yield for overburden cast, coal bulking and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  7. Verified Software Toolchain Andrew W. Appel

    E-print Network

    Appel, Andrew W.

    and libraries to supply context for programs. Our Verified Software Toolchain verifies with ... machine-language program, running in the operating-system context, on a weakly-consistent shared-memory machine. Our

  8. Subsurface imaging with ocean bottom seismic

    Microsoft Academic Search

    Cafarelli

    1995-01-01

    Ocean bottom seismic, suited for shallow water and obstructed offshore areas, offers a range of benefits--higher bandwidth, design flexibility and virtually unlimited offsets. Its future may include reservoir monitoring. This article will discuss aspects of technology dealing with ocean bottom seismic. Topics presented include: Operational details; Design considerations; Receiver placement; Earlier applications; Attenuating reverberations with dual sensors; and Future applications.

  9. Subsurface imaging with ocean bottom seismic

    SciTech Connect

    Cafarelli, B. [PGS Ocean Bottom Seismic, Inc., Houston, TX (United States)

    1995-10-01

    Ocean bottom seismic, suited for shallow water and obstructed offshore areas, offers a range of benefits--higher bandwidth, design flexibility and virtually unlimited offsets. Its future may include reservoir monitoring. This article will discuss aspects of technology dealing with ocean bottom seismic. Topics presented include: Operational details; Design considerations; Receiver placement; Earlier applications; Attenuating reverberations with dual sensors; and Future applications.

  10. Caisson Foundations subjected to Seismic Faulting

    E-print Network

    Slide excerpts: research project on caisson foundations subjected to seismic faulting, funded by the Greek Railway Organization (OSE). Motivating examples include the collapse of a bridge in the 1999 Chi-Chi (Taiwan) earthquake and a 1999 case with no damage; topics cover the design of bridges against seismic faulting.

  11. This Organization Participates in E-Verify

    E-print Network

    Subramanian, Venkat

    ... Administration (SSA) and, if necessary, the Department of Homeland Security (DHS), with information from each new ... DHS and/or the SSA before taking adverse action against you, including terminating your employment. ... E-Verify, please contact DHS: 888-897-7781, www.dhs.gov/E-Verify. The E-Verify logo and mark are registered

  12. Probabilistic seismic hazard assessment for the effect of vertical ground motions on seismic response of highway bridges

    Microsoft Academic Search

    Zeynep Yilmaz

    2008-01-01

    Typically, the vertical component of the ground motion is not considered explicitly in seismic design of bridges, but in some cases the vertical component can have a significant effect on the structural response. The key question of when the vertical component should be incorporated in design is answered by the probabilistic seismic hazard assessment study incorporating the probabilistic seismic demand

  13. ENGINEERING APPLICATIONS OF THE HIGH-RESOLUTION SEISMIC TECHNIQUES: TUNNEL DRILLING

    E-print Network

    Politècnica de Catalunya, Universitat

    Álvarez ... Engineering works (tunnels, etc.) require detailed characterization of the rock massif, including information ... tomography were used for tunnel design in two different areas. Two seismic data acquisition experiments

  14. A Mechanically Verified Code Generator

    E-print Network

    Boyer, Robert Stephen

    ... in part at Computational Logic, Inc. by the Defense Advanced Research Projects Agency, ARPA Orders 6082 ... the Defense Advanced Research Projects Agency or the U.S. Government. Chapter 1, INTRODUCTION: A compiler provides the ability to program/specify/design in a notation which is more elegant, abstract, or expressive

  15. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...in the design and construction of Federal facilities...Contracts and Property Management Federal Property Management Regulations System...Continued) FEDERAL MANAGEMENT REGULATION REAL... 76-DESIGN AND CONSTRUCTION Design and...

  16. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...in the design and construction of Federal facilities...Contracts and Property Management Federal Property Management Regulations System...Continued) FEDERAL MANAGEMENT REGULATION REAL... 76-DESIGN AND CONSTRUCTION Design and...

  17. Seismic waves

    NSDL National Science Digital Library

    University of Utah. Astrophysics Science Project Integrating Research and Education (ASPIRE)

    2003-01-01

    What causes seismic waves and how do they travel through the Earth? This instructional tutorial, part of an interactive laboratory series for grades 8-12, introduces students to seismic waves caused by earthquakes. Students answer questions as they move through the tutorial and investigate how P and S waves travel through layers of the Earth. In one activity, students can produce and view wave motion in a chain of particles. Scored student results are provided. A second activity introduces Love and Rayleigh waves. In a third activity, students study P and S waves by activating four seismographs, watching the resulting P and S waves travel through the Earth, and answering interactive questions. Five web sites about waves, seismic action, and earthquakes are included. Copyright 2005 Eisenhower National Clearinghouse

  18. Advanced Seismic While Drilling System

    SciTech Connect

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high-frequencies. This is counter to the need for long signal transmission through rock. The patent pending SeismicPULSER{trademark} methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to have the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate low frequencies required for deep wells. The low frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER{trademark} drill string tool was designed and manufactured by TII. An APS Turbine Alternator powered the SeismicPULSER{trademark} to produce two Hz frequency peak signals repeated every 20 seconds. Since the ION Geophysical, Inc. (ION) seismic survey surface recording system was designed to detect a minimum downhole signal of three Hz, successful performance was confirmed with a 5.3 Hz recording with the pumps running. The two Hz signal generated by the sparker was modulated with the 3.3 Hz signal produced by the mud pumps to create an intense 5.3 Hz peak frequency signal. The low frequency sparker source is ultimately capable of generating selectable peak frequencies of 1 to 40 Hz with high-frequency spectra content to 10 kHz. The lower frequencies and, perhaps, low-frequency sweeps, are needed to achieve sufficient range and resolution for realtime imaging in deep (15,000 ft+), high-temperature (150 C) wells for (a) geosteering, (b) accurate seismic hole depth, (c) accurate pore pressure determinations ahead of the bit, (d) near wellbore diagnostics with a downhole receiver and wired drill pipe, and (e) reservoir model verification. Furthermore, the pressure of the sparker bubble will disintegrate rock resulting in an increased overall rates of penetration. Other applications for the SeismicPULSER{trademark} technology are to deploy a low-frequency source for greater range on a wireline for Reverse Vertical Seismic Profiling (RVSP) and Cross-Well Tomography. 
Commercialization of the technology is being undertaken by first contacting stakeholders to define the value proposition for rig site services utilizing SeismicPULSER{trademark} technologies. Stakeholders include national oil companies, independent oil companies, independents, service companies, and commercial investors. Service companies will introduce a new Drill Bit SWD service for deep HTHP wells. Collaboration will be encouraged between stakeholders in the form of joint industry projects to develop prototype tools and initial field trials. No barriers have been identified for developing, utilizing, and exploiting the low-frequency SeismicPULSER{trademark} source in a

  19. Seismic Tomography.

    ERIC Educational Resources Information Center

    Anderson, Don L.; Dziewonski, Adam M.

    1984-01-01

    Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

  20. Seismic Symphonies

    NASA Astrophysics Data System (ADS)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

    The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts of sound transposition of seismic records. In this case seismic frequencies are not converted automatically into the "sound of the earthquake." However, it has been studied a musical translation system that, based on the organ tonal scale, generates a totally unexpected sequence of sounds which is intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they in fact come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are made up by copies of seismograms recorded during some earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half a seismograph and half a barrel organ. The organ plays through holes practiced on paper. Adapting the documents to the instrument score, holes have been drilled on the waves' peaks. The organ covers about three tonal scales, starting from heavy and deep sounds it reaches up to high and jarring notes. The translation of the seismic records is based on a criterion that does match the highest sounds to larger amplitudes with lower ones to minors. Translating the seismogram in the organ score, the larger the amplitude of recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ and the notes arouse an intense emotional response in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which can be revived the memory of the greatest disasters of over a century of seismic history of the Earth. A bridge between art and science. Seismic Symphonies is also a symbolic inversion: the instrument of the organ is most commonly used in churches, and its sounds are derived from the heavens and symbolize cosmic harmony. But here it is the earth, "nature", the ground beneath our feet that is moving. It speaks to us not of harmony, but of our fragility. For the oldest earthquakes considered, Seismic Symphonies drew on SISMOS archives, the INGV project for recovery, high resolution digital reproduction and distribution of the seismograms of earthquakes of the Euro-Mediterranean area from 1895 to 1984. After the first exposure to the Fondazione Bevilacqua La Masa in Venice, the organ was later exhibited in Taiwan, the Taipei Biennial, with seismograms provided from the Taiwanese Central Weather Bureau, and at the EACC Castello in Spain, with seismograms of Spanish earthquakes provided by the Instituto Geogrfico Nacional.

  1. Verifying Test Hypotheses - HOL/TestGen

    E-print Network

    Slide excerpts: "Verifying Test Hypotheses - HOL/TestGen: An Experiment in Test and Proof", Thomas Malcher, January 20, 2014. Outline: Introduction; Test Hypotheses; HOL/TestGen Demo; Verifying Test Hypotheses; Conclusion.

  2. Seismic retrofit of rectangular RC bridge columns using wire mesh wrap casing

    Microsoft Academic Search

    Sung-Hoon Kim; Dae-Kon Kim

    Many bridges were constructed before a seismic design provision was implemented in Korea. Therefore, poor seismic response of non-seismically detailed RC bridge columns could be expected during a seismic event. The aim of this study is to report experimental results of a seismic retrofit method for the non-seismically detailed rectangular RC bridge columns by casing lap-splice region with a Stainless

  3. Seismic Waves and the Slinky

    NSDL National Science Digital Library

    Lawrence Braile

    This teaching guide is designed to introduce the concepts of seismic waves that propagate within the Earth, and to provide ideas and suggestions for how to teach about seismic waves. The guide provides information on the types and properties of seismic waves and instructions for using the slinky to effectively demonstrate seismic wave characteristics and wave propagation. Most of the activities described in the guide are useful both as demonstrations for the teacher and as exploratory activities for students. A slinky is used to demonstrate P and S waves, Love waves on a floor or tabletop, and Rayleigh waves using three people and a long slinky. Five slinkys attached to a wood block show that waves propagate in all directions from the source and that wave vibration for P and S sources will be different in different directions from the source.

  4. Cycles in mining seismicity

    NASA Astrophysics Data System (ADS)

    Marcak, Henryk

    2013-07-01

    Stochastic models of self-organized criticality and intermittent criticality are used to describe the structure of seismic catalogs. The intermittent models introduce three phases of the seismic cycle: increase in seismic energy, seismic relaxation, and seismic quiescence after the final relaxation. In this paper, seismic mining catalogs from a deep copper mine are searched to find these three phases of the seismic cycle. In spite of the differences between the seismic records from earthquakes and the building of stresses in the mine, the cycles can be estimated in mining seismicity.

  5. Seismic Reflection and Refraction

    NSDL National Science Digital Library

    This web site provides a brief introduction to the process of seismic exploration. Included are a definition of seismic exploration, a listing of possible applications of seismic methods, definitions of seismic reflection and refraction, and an explanation of data processing with seismic methods. The text descriptions are accompanied by visualizations helping to aid the reader in their understanding of the concepts discussed.

  6. Specifying and Verifying UML Activity Diagrams via Graph Transformation ?

    E-print Network

    Baldan, Paolo

    Specifying and Verifying UML Activity Diagrams via Graph Transformation. Paolo Baldan, Andrea... ...for system specification and verification based on UML diagrams and interpreted in terms of graphs. The use of visual modeling techniques, like the UML [22], for the design and development of large...

  7. A Note on Understanding and Verifying Component Based Systems

    Microsoft Academic Search

    Eddy Truyen; Wouter Joosen; Pierre Verbaeten

    In this paper we present a model that helps to understand what component-based systems are. The definition of this system model is based on the notion of component frameworks and collaboration-based design. The system model allows one to specify and verify the structure of a component-based system in a uniform, compositional and hierarchically structured way. A possible application of...

  8. ARMor: Fully Verified Software Fault Isolation University of Utah, USA

    E-print Network

    Regehr, John

    ARMor: Fully Verified Software Fault Isolation. Lu Zhao, University of Utah, USA. We have designed and implemented ARMor, a system that uses software fault isolation (SFI) to sandbox... protecting components such as the RTOS and critical control loops from other, less-trusted components. ARMor guarantees memory safety...

  9. Architecture Rationalization: A Methodology for Architecture Verifiability, Traceability and Completeness

    E-print Network

    Han, Jun

    Architecture Rationalization: A Methodology for Architecture Verifiability, Traceability and Completeness. E-mail: {atang, jhan}@it.swin.edu.au. Abstract: Architecture modeling is practiced extensively in the software industry... of architecture designs. Deficiencies in any of these three areas in an architecture model can be costly and risky...

  10. Verifying a nuclear weapon`s response to radiation environments

    SciTech Connect

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  11. Seismic Signals

    NSDL National Science Digital Library

    Not so long ago, people living near volcanoes had little that might help them to anticipate an eruption. A deep rumble, a puff of smoke, and ash might foreshadow a major volcanic event. Or a volcano might erupt with no warning at all. This interactive feature illustrates some of the types of seismic activity that may precede an eruption, which modern seismologists are studying in hopes of improving their ability to predict eruptions.

  12. Seismic Waves

    NSDL National Science Digital Library

    In this activity, students learn about the different types of seismic waves in an environment they can control. Using an interactive, online wave generator, they will study P waves, S waves, Love waves, and Rayleigh waves, and examine a combination of P and S waves that crudely simulates the wave motion experienced during an earthquake. A tutorial is provided to show how the wave generator is used.

  13. Seismic Waves

    NSDL National Science Digital Library

    Jeffrey Barker

    This demonstration elucidates the concept of propagation of compressional waves (primary or P waves) and shear waves (secondary or S waves), which constitute the seismic waves used in locating and modeling earthquakes and underground nuclear explosions, and for imaging the interior structure of the Earth. The demonstration uses a slinky, pushed along its axis to create a compressional (longitudinal) wave, and moved up and down on one end to create a shear (transverse) wave.

  14. OVERVIEW OF THE PROBABILISTIC SEISMIC HAZARD ANALYSES OF YUCCA MOUNTAIN

    Microsoft Academic Search

    Ivan G. Wong

    2006-01-01

    Probabilistic seismic hazard analysis (PSHA) is now established practice as the basis for determining seismic design ground motions for important nuclear facilities. Consistent with this practice, PSHAs for ground motion and fault displacement have been performed for the Yucca Mountain site (Stepp et al., 2001). The methodology used for the PSHAs incorporated multiple expert evaluations of seismic sources, the potential

  15. Verifiable Encryption of Digital Signatures and Applications

    E-print Network

    Amir, Yair

    Verifiable Encryption of Digital Signatures and Applications. Giuseppe Ateniese, The Johns Hopkins University. This paper presents new simple schemes for verifiable encryption of digital signatures. We make... Verifiable encryption schemes appeared in [Ateniese 1999] and the certified email protocol appeared...

  16. Esta organización participa en E-Verify

    E-print Network

    Subramanian, Venkat

    ...the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS)... the opportunity to contact DHS or SSA before being penalized in any way or having your employment terminated... To learn more about E-Verify, contact DHS at: 888-897-7781, www.dhs.gov/E-Verify.

  17. Abstracting Pointers for a Verifying Compiler

    Microsoft Academic Search

    Gregory Kulczycki; Heather Keown; Murali Sitaraman; Bruce W. Weide

    2007-01-01

    Abstract The ultimate objective of a verifying compiler is to prove that proposed code implements a full behavioral specification. Experience reveals this to be especially difficult for programs that involve pointers or refer- ences and linked data structures. In some situations, pointers are unavoidable; in some others, verification can be simplified through suitable abstractions. Re- gardless, a verifying compiler should

  18. Verifying Atomicity via Data Independence Ohad Shacham

    E-print Network

    Aiken, Alex

    ...that could be repaired and then subsequently automatically verified. Moreover, we show that the remaining... While verifying data-independence is undecidable in the general case, we provide succinct sufficient conditions that can be used to establish a composed operation as data-independent. We show that for the common case...

  19. Describing and Verifying Web Service Using CCS

    Microsoft Academic Search

    Li Bao; Weishi Zhang; Xiuguo Zhang

    2006-01-01

    Formal methods are an effective way of modeling and verifying concurrent systems. An important research field is the description and verification of Web services using formal methods. Guaranteeing the validity of a Web service composition is necessary for enhancing the value of the composite service. CCS is a kind of process algebra which can be used to model concurrent systems. Web services...

  20. Verified interoperable implementations of security protocols

    Microsoft Academic Search

    Karthikeyan Bhargavan; Cédric Fournet; Andrew D. Gordon; Stephen Tse

    2008-01-01

    We present an architecture and tools for verifying implementations of security protocols. Our implementations can run with both concrete and symbolic implementations of cryptographic algorithms. The concrete implementation is for production and interoperability testing. The symbolic implementation is for debugging and formal verification. We develop our approach for protocols written in F#, a dialect of ML, and verify them by

  1. Verified Interoperable Implementations of Security Protocols

    Microsoft Academic Search

    Karthikeyan Bhargavan; Cédric Fournet; Andrew D. Gordon; Stephen Tse

    2006-01-01

    We present an architecture and tools for verifying implementations of security protocols. Our implementations can run with both concrete and symbolic implementations of cryptographic algorithms. The concrete implementation is for production and interoperability testing. The symbolic implementation is for debugging and formal verification. We develop our approach for protocols written in F#, a dialect of ML, and verify...

  2. Verified Interoperable Implementations of Security Protocols

    Microsoft Academic Search

    Karthikeyan Bhargavan; Cédric Fournet; Andrew D. Gordon; Stephen Tse

    2007-01-01

    We present an architecture and tools for verifying implementations of security protocols. Our implementations can run with both concrete and symbolic implementations of cryptographic algorithms. The concrete implementation is for production and interoperability testing. The symbolic implementation is for debugging and formal verification. We develop our approach for protocols written in F#, a dialect of ML, and verify them by

  3. Instrumentation for verifying conventional forces in Europe

    Microsoft Academic Search

    R. H. Howes; A. DeVolpi

    1991-01-01

    The requirements of verifying the Treaty on Conventional Armed Forces in Europe (CFE-I), signed in Paris in November 1990, are outlined. The goals of verification are discussed. Some problems specific to verifying conventional forces are identified. Two basic inventory models that have been proposed for use in CFE treaties are described and their technology requirements are examined

  4. iMUSH: The design of the Mount St. Helens high-resolution active source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, Eric; Levander, Alan; Harder, Steve; Abers, Geoff; Creager, Ken; Vidale, John; Moran, Seth; Malone, Steve

    2013-04-01

    Mount St. Helens is one of the most societally relevant and geologically interesting volcanoes in the United States. Although much has been learned about the shallow structure of this volcano since its eruption in 1980, important questions still remain regarding its magmatic system and connectivity to the rest of the Cascadia arc. For example, the structure of the magma plumbing system below the shallowest magma chamber under the volcano is still only poorly known. This information will be useful for hazard assessment for the southwest Washington area, and also for gaining insight into fundamental scientific questions such as the assimilation and differentiation processes that lead to the formation of continental crust. As part of the multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment, funded by NSF GeoPRISMS and EarthScope, an active source seismic experiment will be conducted in late summer 2014. The experiment will utilize all of the 2600 IRIS/PASSCAL/USArray Texan instruments. The instruments will be deployed as two 1000-instrument consecutive refraction profiles (one N/S and one WNW/ESE). Each of these profiles will be accompanied by two 1600-instrument areal arrays at varying distances from Mount St. Helens. Finally, one 2600-instrument areal array will be centered on Mount St. Helens. These instruments will record a total of twenty-four 500-1000 kg shots. Each refraction profile will have an average station spacing of 150 m, and a total length of 150 km. The stations in the areal arrays will be separated by ~1 km. A critical step in the success of this project is to develop an experimental setup that can resolve the most interesting aspects of the magmatic system. In particular, we want to determine the distribution of shot locations that will provide good coverage throughout the entire model space, while still allowing us to focus on regions likely to contain the magmatic plumbing system. In this study, we approach this problem by calculating Frchet kernels with dynamic ray tracing. An initial observation from these kernels is that waves traveling across the largest offsets of the experiment (~150km) have sensitivity below depths of 30km. This means that we may be able to image the magmatic system down to the Moho, estimated at ~40 km. Additional work is focusing on searching for the shot locations that provide high resolution around very shallow features beneath Mount St. Helens, such as the first magmatic reservoir at about 3 km depth, and the associated Mount St. Helens seismic zone. One way in which we are guiding this search is to find the shot locations that maximize sensitivity values within the regions of interest after summing Frchet kernels from each shot/station pair
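
    The survey-design step described above, choosing shot locations whose summed Fréchet kernels best cover a target region, can be sketched as a greedy selection over candidate shots. The sensitivity maps below are random placeholders rather than ray-traced kernels, and the function name and geometry are invented for illustration:

```python
import numpy as np

def greedy_shot_selection(kernels, roi_mask, n_shots):
    """Greedily pick shots whose combined sensitivity best covers the ROI.

    kernels:  array (n_candidates, nz, nx) of per-shot sensitivity maps
              (e.g., summed |Frechet kernel| over the receiver array).
    roi_mask: boolean array (nz, nx) marking the region of interest.
    Returns the indices of the selected candidate shots.
    """
    selected, coverage = [], np.zeros_like(kernels[0])
    for _ in range(n_shots):
        # score each unused candidate by the ROI sensitivity it would add
        gains = [np.sum(np.maximum(coverage, k)[roi_mask]) if i not in selected
                 else -np.inf
                 for i, k in enumerate(kernels)]
        best = int(np.argmax(gains))
        selected.append(best)
        coverage = np.maximum(coverage, kernels[best])
    return selected

# toy example: 20 candidate shots over a 50 x 80 model, ROI in the center
rng = np.random.default_rng(0)
kernels = rng.random((20, 50, 80))
roi = np.zeros((50, 80), bool)
roi[20:35, 30:50] = True
print(greedy_shot_selection(kernels, roi, n_shots=4))
```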

  5. Implementation of Seismic Stops in Piping Systems

    SciTech Connect

    Bezler, P.; Simos, N.; Wang, Y.K.

    1993-02-01

    Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code ''GAPPIPE'' developed by RLCA. The code essentially has all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent Linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method, an independent evaluation of the Byron piping using the nonlinear time history capability of the ANSYS computer code and by result comparisons to the RLCA developed results, an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron Piping using linear response spectrum methods.
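
    The equivalent linearization idea behind GAPPIPE, replacing a nonlinear gapped support with an amplitude-dependent linear spring, can be illustrated with a minimal single-degree-of-freedom sketch. The secant-stiffness rule, the under-relaxed fixed-point iteration, and all numbers below are assumptions for illustration only; they are not the proprietary RLCA formulation:

```python
import math

def gapped_support_response(m, c, k0, kg, gap, F0, omega,
                            tol=1e-9, iters=200, relax=0.5):
    """Equivalent-linearization estimate of the steady-state amplitude of a
    single-DOF oscillator restrained by a gapped support (illustrative only).

    Once the amplitude A exceeds the clearance, the gap element is replaced
    by a secant stiffness kg*(1 - gap/A); the amplitude and the equivalent
    stiffness are then iterated to a fixed point with under-relaxation.
    """
    A = F0 / k0                          # starting guess: gap not engaged
    k_eq = k0
    for _ in range(iters):
        k_gap = kg * max(0.0, 1.0 - gap / A)
        k_eq = k0 + k_gap                # amplitude-dependent stiffness
        A_new = F0 / math.sqrt((k_eq - m * omega**2)**2 + (c * omega)**2)
        if abs(A_new - A) < tol:
            break
        A += relax * (A_new - A)         # damp the fixed-point iteration
    return A, k_eq

# illustrative numbers (SI units), not a real piping model
print(gapped_support_response(m=50.0, c=200.0, k0=2.0e5, kg=8.0e5,
                              gap=0.002, F0=1.0e3, omega=40.0))
```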

  6. Microsoft Application Verifier http://www.msstudy.co.kr

    E-print Network

    Hunt, Galen


  7. E-Verify es un servicio de DHS y SSA: ¿Qué es E-Verify?

    E-print Network

    Bolding, M. Chad

    E-Verify es un servicio de DHS y SSA QU ES E-VERIFY? La ley federal requiere que todos los Seguridad Nacional (DHS por sus siglas en ingls) y la Administracin del Seguro Social (SSA por sus siglas, a travs de E-Verify, los empleadores envian informacin acerca de usted a la SSA y el DHS (slo para no

  8. WHAT IS E-VERIFY?

    E-print Network

    Bandettini, Peter A.

    Federal law requires that all employers verify the identity and eligibility to work in the United States of each new employee. The Department of Homeland Security (DHS) and the Social Security Administration (SSA)... information from the Form I-9 about you is sent to SSA and DHS (only for non-citizens) to ensure that you... Under E-Verify: fast, free, simple, secure. (888) 464-4218, www.dhs.gov/E-Verify. E-Verify is a service...

  9. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-06-15

    This report covers a one-year, no-cost extension that was requested and received in 2003; the extension runs through September 14, 2004. The extension has been used to continue data analysis and prepare additional manuscripts for submission to refereed journals. The primary research focus of the original three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under all conditions (please see the Status and Results of Experiments sections for details). In some easy-access environments, this device could make shallow seismic surveying considerably more efficient and less expensive. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Another element of our research was monitoring the cone of depression around a pumping well, with the well serving as a proxy location for fluid-flow at a contaminated DOE site. We collected data from a well site at which drawdown equilibrium had been reached and at another site during a pumping test. Data analysis disclosed that although we were successful in imaging the water table using seismic reflection techniques (Johnson, 2003), we were not able to explicitly delineate the cone of depression.

  10. Voter verifiability in homomorphic election schemes

    E-print Network

    Forsythe, Joy Marie

    2005-01-01

    Voters are now demanding the ability to verify that their votes are cast and counted as intended. Most existing cryptographic election protocols do not treat the voter as a computationally-limited entity separate from the ...

  11. 24 CFR 3286.507 - Verifying installation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...2013-04-01 2013-04-01 false Verifying installation. 3286.507 Section 3286.507 Housing...AND URBAN DEVELOPMENT MANUFACTURED HOME INSTALLATION PROGRAM Inspection of Installations in HUD-Administered States ...

  12. 24 CFR 3286.507 - Verifying installation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...2010-04-01 2010-04-01 false Verifying installation. 3286.507 Section 3286.507 Housing...AND URBAN DEVELOPMENT MANUFACTURED HOME INSTALLATION PROGRAM Inspection of Installations in HUD-Administered States ...

  13. 24 CFR 3286.507 - Verifying installation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...2012-04-01 2012-04-01 false Verifying installation. 3286.507 Section 3286.507 Housing...AND URBAN DEVELOPMENT MANUFACTURED HOME INSTALLATION PROGRAM Inspection of Installations in HUD-Administered States ...

  14. 24 CFR 3286.507 - Verifying installation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...2014-04-01 2014-04-01 false Verifying installation. 3286.507 Section 3286.507 Housing...AND URBAN DEVELOPMENT MANUFACTURED HOME INSTALLATION PROGRAM Inspection of Installations in HUD-Administered States ...

  15. 24 CFR 3286.507 - Verifying installation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...2011-04-01 2011-04-01 false Verifying installation. 3286.507 Section 3286.507 Housing...AND URBAN DEVELOPMENT MANUFACTURED HOME INSTALLATION PROGRAM Inspection of Installations in HUD-Administered States ...

  16. Geppetto: Versatile Verifiable Computation Craig Costello

    E-print Network

    Geppetto: Versatile Verifiable Computation. Craig Costello (Microsoft Research); ... (University of Virginia); Michael Naehrig (Microsoft Research); Bryan Parno...

  17. LANL seismic screening method for existing buildings

    SciTech Connect

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O. [and others]

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.
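
    The two-phase screening flow described above (a Phase One configuration/physical-hazard score followed by a Phase Two capacity/demand check, with automatic failures for PC-3/PC-4 and unreinforced masonry buildings) can be sketched as follows. The field names, weights, and thresholds are invented placeholders, not the actual LANL criteria:

```python
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    perf_category: int          # PC-0, PC-1, PC-2 are screenable; PC-3/PC-4 are not
    unreinforced_masonry: bool  # URM bearing wall or masonry infill shear wall
    hazard_score: float         # Phase One: configuration/physical hazards (higher = worse)
    capacity_demand: float      # Phase Two: structural capacity / seismic demand

def screen(b, hazard_limit=2.0, cd_limit=1.0):
    """Return (passes, reason) under hypothetical screening thresholds."""
    if b.perf_category > 2 or b.unreinforced_masonry:
        return False, "automatic fail: requires detailed seismic analysis"
    if b.hazard_score > hazard_limit:
        return False, "Phase One fail: configuration/physical hazard score too high"
    if b.capacity_demand < cd_limit:
        return False, "Phase Two fail: capacity/demand ratio below limit"
    return True, "screened out: adequate seismic characteristics"

inventory = [
    Building("Bldg A", 1, False, 0.5, 1.8),
    Building("Bldg B", 2, False, 1.0, 0.6),
    Building("Bldg C", 0, True, 0.0, 2.5),
]
for b in inventory:
    print(b.name, screen(b))
# rank the buildings that fail, worst capacity/demand first, for further evaluation
failed = sorted((b for b in inventory if not screen(b)[0]), key=lambda b: b.capacity_demand)
print("priority for detailed evaluation:", [b.name for b in failed])
```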

  18. Seismic retrofitting of deficient Canadian buildings

    E-print Network

    Gemme, Marie-Claude

    2009-01-01

    Many developed countries such as Canada and the United States are facing a significant infrastructure crisis. Most of their facilities have been built with little consideration of seismic design and durability issues. As ...

  19. A university-developed seismic source for shallow seismic surveys

    NASA Astrophysics Data System (ADS)

    Yordkayhun, Sawasdee; Na Suwan, Jumras

    2012-07-01

    The main objectives of this study were to (1) design and develop a low cost seismic source for shallow seismic surveys and (2) test the performance of the developed source at a test site. The surface seismic source, referred to here as a university-developed seismic source is based upon the principle of an accelerated weight drop. A 30 kg activated mass is lifted by a mechanical rack and pinion gear and is accelerated by a mounted spring. When the mass is released from 0.5 m above the surface, it hits a 30 kg base plate and energy is transferred to the ground, generating a seismic wave. The developed source is portable, environmentally friendly, easy to operate and maintain, and is a highly repeatable impact source. To compare the developed source with a sledgehammer source, a source test was performed at a test site, a study site for mapping a major fault zone in southern Thailand. The sledgehammer and the developed sources were shot along a 300 m long seismic reflection profile with the same parameters. Data were recorded using 12 channels off-end geometry with source and receiver spacing of 5 m, resulting in CDP stacked sections with 2.5 m between traces. Source performances were evaluated based on analyses of signal penetration, frequency content and repeatability, as well as the comparison of stacked sections. The results show that both surface sources are suitable for seismic studies down to a depth of about 200 m at the site. The hammer data are characterized by relatively higher frequency signals than the developed source data, whereas the developed source generates signals with overall higher signal energy transmission and greater signal penetration. In addition, the repeatability of the developed source is considerably higher than the hammer source.
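
    A rough energy budget for an accelerated weight drop of the kind described above follows from the drop height plus the stored spring energy. The spring constant and compression used below are assumed values for illustration; the paper reports only the 30 kg mass and the 0.5 m drop height:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def weight_drop_impact(mass=30.0, drop_height=0.5,
                       spring_k=None, spring_compression=None):
    """Estimate impact energy (J) and velocity (m/s) of an accelerated
    weight drop: gravitational potential energy plus optional spring energy.
    """
    energy = mass * G * drop_height                       # ~147 J from gravity alone
    if spring_k is not None and spring_compression is not None:
        energy += 0.5 * spring_k * spring_compression**2  # stored spring energy
    velocity = math.sqrt(2.0 * energy / mass)
    return energy, velocity

# free fall only, and with a hypothetical 10 kN/m spring compressed 0.2 m
print(weight_drop_impact())
print(weight_drop_impact(spring_k=1.0e4, spring_compression=0.2))
```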

  20. Seismic-hazard assessment

    E-print Network

    Université Paris-Sud XI

    Géosciences, numéro 4, septembre 2006. ...earthquake shaking, and (ii) improve existing structures and design new buildings to better resist earthquakes. Both approaches require a reliable assessment of the hazard... An earthquake occurs when a fault (an area...

  1. Application of bounding spectra to seismic design of piping based on the performance of above ground piping in power plants subjected to strong motion earthquakes

    SciTech Connect

    Stevenson, J.D. [Stevenson and Associates, Cleveland, OH (United States)]

    1995-02-01

    This report extends the potential application of Bounding Spectra evaluation procedures, developed as part of the A-46 Unresolved Safety Issue applicable to seismic verification of in-situ electrical and mechanical equipment, to in-situ safety related piping in nuclear power plants. The report presents a summary of earthquake experience data which define the behavior of typical U.S. power plant piping subject to strong motion earthquakes. The report defines those piping system caveats which would assure the seismic adequacy of the piping systems which meet those caveats and whose seismic demand are within the bounding spectra input. Based on the observed behavior of piping in strong motion earthquakes, the report describes the capabilities of the piping system to carry seismic loads as a function of the type of connection (i.e. threaded versus welded). This report also discusses in some detail the basic causes and mechanisms for earthquake damages and failures to power plant piping systems.

  2. The ENAM Explosive Seismic Source Test

    NASA Astrophysics Data System (ADS)

    Harder, S. H.; Magnani, M. B.

    2013-12-01

    We present the results of the pilot study conducted as part of the eastern North American margin (ENAM) community seismic experiment (CSE) to test an innovative design of land explosive seismic source for crustal-scale seismic surveys. The ENAM CSE is a community based onshore-offshore controlled- and passive-source seismic experiment spanning a 400 km-wide section of the mid-Atlantic East Coast margin around Cape Hatteras. The experiment was designed to address prominent research questions such as the role of the pre-existing lithospheric grain on the structure and evolution of the ENAM margin, the distribution of magmatism, and the along-strike segmentation of the margin. In addition to a broadband OBS deployment, the CSE will acquire multichannel marine seismic data and two major onshore-offshore controlled-source seismic profiles recording both marine sources (airguns) and land explosions. The data acquired as part of the ENAM CSE will be available to the community immediately upon completion of QC procedures required for archiving purposes. The ENAM CSE provides an opportunity to test a radically new and more economical design for land explosive seismic sources used for crustal-scale seismic surveys. Over the years we have incrementally improved the performance and reduced the cost of shooting crustal seismic shots. These improvements have come from better explosives and more efficient configuration of those explosives. These improvements are largely intuitive, using higher velocity explosives and shorter, but larger diameter explosive configurations. However, recently theoretical advances now allow us to model not only these incremental improvements, but to move to more radical shot designs, which further enhance performance and reduce costs. Because some of these designs are so radical, they need experimental verification. To better engineer the shots for the ENAM experiment we are conducting an explosives test in the region of the ENAM CSE. The results of this test will guide engineering for the main ENAM experiment as well as other experiments in the future.

  3. Lamport clocks: verifying a directory cache-coherence protocol

    Microsoft Academic Search

    Manoj Plakal; Daniel J. Sorin; Anne E. Condon; Mark D. Hill

    1998-01-01

    Modern shared-memory multiprocessors use complex memory sys- tem implementations that include a variety of non-trivial and inter- acting optimizations. More time is spent in verifying the correctness of such implementations than in designing the system. In particular, large-scale Distributed Shared Memory (DSM) sys- tems usually rely on a directory cache-coherence protocol to pro- vide the illusion of a sequentially consistent

  4. Seismic sources

    DOEpatents

    Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

    1987-04-20

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for more than about one minute. 9 figs.

  5. Utilizing USArray Stations to Verify Tornado Observations

    NASA Astrophysics Data System (ADS)

    Tytell, J. E.; Tatom, F.; Vernon, F.

    2012-12-01

    Of the several hundred individual locations in which stations from the USArray Transportable Array (TA) network have been positioned, quite a few have had close pass-bys (< 10 km) from tornadoes. When a tornado is on the ground there are two clearly observable signals in nearby seismic stations: A discernable tilt of the crust and an increase in seismic noise due to the vibrational energy released into the ground. We will examine three separate scenarios in which individual stations from the TA network experienced pass-bys from tornadoes that were on the ground. Satellite, Doppler radar, track information and even some surrounding seismic stations will be presented for verification. We will then attempt to determine energy signatures from each tornado example in order to isolate a common vibrational or acoustical energy signature of tornadoes when in contact with the ground. By successfully isolating these energy signatures it may be possible to pinpoint on-ground tornado positions in real-time, estimate intensities, and even help to provide early-warning and now-casting support.
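
    The rise in seismic noise mentioned above is the kind of signature a standard short-term/long-term average (STA/LTA) detector picks up. The sketch below is a generic STA/LTA on a synthetic trace; the window lengths, threshold, and data are arbitrary and do not represent the authors' processing:

```python
import numpy as np

def sta_lta(signal, fs, sta_win=5.0, lta_win=60.0):
    """Classic STA/LTA ratio computed on the squared signal (energy envelope).

    A sustained rise of the ratio above a threshold flags an increase in
    seismic noise such as the one reported near tornado tracks.
    """
    x2 = np.asarray(signal, float) ** 2
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    box = lambda n: np.ones(n) / n
    sta = np.convolve(x2, box(sta_n), mode="same")
    lta = np.convolve(x2, box(lta_n), mode="same")
    return sta / np.maximum(lta, 1e-20)

# synthetic test: background noise with a stronger noise burst in the middle
fs = 40.0
noise = np.random.default_rng(1).normal(0.0, 1.0, int(600 * fs))
noise[int(250 * fs):int(300 * fs)] *= 4.0          # simulated noise increase
ratio = sta_lta(noise, fs)
print("max STA/LTA:", ratio.max(), "triggered:", ratio.max() > 3.0)
```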

  6. Application of displacement-based design method to assess the level of structural damage due to blast loads

    Microsoft Academic Search

    Ramezan Ali Izadifard; Mahmoud Reza Maheri

    2010-01-01

    In this paper, a displacement-based design (DBD) methodology commonly used for seismic design and evaluation of structures is adopted to determine the performance of structures under blast loading. In this method, structural performance is linked to measurable quantities such as the displacement ductility. To verify the applicability of the method and the accuracy of the results, a simple structural shape,...

  7. Using Magnetic Disk instead of Main Memory in the Murφ Verifier

    E-print Network

    Dill, David L.

    ...Interactions among these components are a notorious source of design errors. Conventional verification... We describe a version of the explicit state enumeration verifier Murφ that allows the use of magnetic disk instead of main memory...

  8. Enforceable and Verifiable Stale-Safe Security Properties in Distributed Systems

    E-print Network

    Texas at San Antonio, University of

    Enforceable and Verifiable Stale-Safe Security Properties in Distributed Systems. Jianwei Niu... of authorization state are not globally synchronized. This problem is so intrinsic that it is inevitable... these SMs can be designed so as to satisfy the stale-safe security properties. Next, we formally verify...

  9. Verified Computer Algebra in Acl2

    Microsoft Academic Search

    I. Medina-Bulo; F. Palomo-Lozano; J. A. Alonso-Jimenez; J. L. Ruiz-Reina

    In this paper, we present the formal verification of a Common Lisp implementation of Buchberger's algorithm for computing Gröbner bases of polynomial ideals. This work is carried out in the Acl2 system and shows how verified Computer Algebra can be achieved in an executable logic.

  10. Computational Verifiable Secret Sharing Revisited Michael Backes

    E-print Network

    Verifiable secret sharing (VSS) is an important primitive in distributed cryptography... at most t of them. In the computational setting, the feasibility of VSS schemes based on commitments was established over two decades ago. Interestingly, all known computational VSS schemes rely on the homomorphic...

  11. Verifying Magic Square Properties Sample Proof

    E-print Network

    White, Donald L.

    Sample proof. Theorem: Adding the same number n to each entry in a 3 by 3 magic square with magic number M yields a magic square with magic number M + 3n. Proof: Suppose we are given a 3 by 3 magic square, called Square 1, and the three numbers in some row, column...
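
    The theorem is easy to sanity-check numerically. The sketch below uses the classic Lo Shu square (magic number 15) and an arbitrary shift n = 4 to confirm that every row, column, and diagonal sum moves from M to M + 3n:

```python
def line_sums(square):
    """Return all row, column, and diagonal sums of a 3x3 square."""
    rows = [sum(r) for r in square]
    cols = [sum(square[i][j] for i in range(3)) for j in range(3)]
    diags = [sum(square[i][i] for i in range(3)),
             sum(square[i][2 - i] for i in range(3))]
    return rows + cols + diags

lo_shu = [[2, 7, 6], [9, 5, 1], [4, 3, 8]]       # magic number M = 15
n = 4
shifted = [[x + n for x in row] for row in lo_shu]

M = set(line_sums(lo_shu))
M_shifted = set(line_sums(shifted))
print(M, M_shifted)                               # {15} and {27}
assert M == {15} and M_shifted == {15 + 3 * n}
```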

  12. Verifying and validating a simulation model

    Microsoft Academic Search

    Anbin Hu; Ye San; Zicai Wang

    2001-01-01

    This paper presents the verification and validation (V&V) of a simulation model, with emphasis on possible modification. Based on the analysis, a new framework is proposed, and new terms are defined. An example is employed to demonstrate how the framework and the related terms are used in verifying and validating an existing model.

  13. Attempts to Verify Written English Anthony Penniston

    E-print Network

    Harley, Eric R.

    The English language offers a complex and ambiguous grammar that is readily understood by its... This paper discusses the research and implementation of several techniques for algorithmically analyzing...

  14. This Employer Participates in E-Verify

    E-print Network

    Meyers, Steven D.

    ...the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS), with information from each new... to contact SSA and/or DHS before taking adverse action against you, including terminating your employment... For more information on E-Verify, please contact DHS at: 1-888-464-4218.

  15. This Employer Participates in E-Verify

    E-print Network

    ...the Social Security Administration (SSA) and, if necessary, the Department of Homeland Security (DHS), with information from each new... to contact SSA and/or DHS before taking adverse action against you, including terminating your employment... For more information on E-Verify, please contact DHS at: 1-888-464-4218. If you have the right to work, don't let anyone take it away.

  16. Verifiable Agent Interaction in Abductive Logic Programming

    Microsoft Academic Search

    Via Saragat

    SCIFF is a new abductive logic programming proof-procedure for reasoning with expectations in dynamic environments. SCIFF is also the main component of a framework thought to specify and verify interaction in open agent societies. In this paper we present the declarative and operational semantics of SCIFF, its termination, soundness and completeness results, and some sample applications to demonstrate its use

  17. Verifying Concurrent Memory Reclamation Algorithms with Grace

    E-print Network

    Rinetzky, Noam

    Verifying Concurrent Memory Reclamation Algorithms with Grace. Alexey Gotsman, Noam Rinetzky. ...Memory reclamation techniques proposed for it (such as hazard pointers, read-copy-update and epoch-based reclamation) have proved very challenging for formal reasoning. In this paper, we show that different memory reclamation techniques actually...

  18. Verifying liveness properties of multifunction composite protocols

    Microsoft Academic Search

    Jun-cheol Park

    2003-01-01

    In protocol composition techniques, component protocols are combined in various ways to obtain a complex protocol whose execution sequences consist of interleaved execution sequences of the component protocols. In this paper, we investigate the problem of verifying liveness properties of the composite protocol from the known properties of its components. We first characterize a class of composite protocols that encompasses

  19. verifying complex software systems the challenge

    E-print Network

    van Dyk, David

    Slide fragment: "Verifying complex software systems: the challenge," Gerard Holzmann (gholzmann@acm.org). Topics include latent defects in complex software systems and complex multi-threaded software systems with hidden dependencies (the Charles Perrow effect).

  20. Firms Verify Online IDs Via Schools

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young

  1. Specifying and verifying PLC systems with TLA

    Microsoft Academic Search

    Hehua Zhang; Stephan Merz; Ming Gu

    2010-01-01

    We report on a method for formally specifying and verifying programmable logic controllers (PLCs) in the specification language TLA+. The specification framework is generic. It separates the description of the environment from that of the controller itself and its structure is consistent with the scan cycle mechanism used by PLCs. Specifications can be parameterized with the number of replicated components.

  2. Verifying sensor network security protocol implementations

    Microsoft Academic Search

    Youssef Wasfy Hanna

    2008-01-01

    Verifying sensor network security protocol implementations using testing/simulation might leave some flaws undetected. Formal verification techniques have been very successful in detecting faults in security protocol specifications; however, they generally require building a formal description (model) of the protocol. Building accurate models is hard, thus hindering the application of formal verification. In this work, a framework for automating formal verification...

  3. The Limited Verifier Signature and Its Application

    Microsoft Academic Search

    Shunsuke ARAKI; Satoshi UEHARA; Kyoki IMAMURA

    1999-01-01

    In ordinary digital signature schemes, anyone can verify signatures with the signer's public key. However, it is not necessary for everyone to be convinced of the justification of the signer's dishonorable message, such as a bill. It is enough for the receiver alone to convince outsiders of the signature's justification if the signer does not execute a contract. On the other hand...

  4. Scalable and scalably-verifiable sequential synthesis

    Microsoft Academic Search

    Alan Mishchenko; Michael L. Case; Robert K. Brayton; Stephen Jang

    2008-01-01

    This paper describes an efficient implementation of an effective sequential synthesis operation that uses induction to detect and merge sequentially-equivalent nodes. State-encoding, scan chains, and test vectors are essentially preserved. Moreover, the sequential synthesis results are guaranteed to be sequentially verifiable against the original circuits. Verification can use an independent inductive prover similar to that used for synthesis, with guaranteed

  5. Scalable and scalably-verifiable sequential synthesis

    Microsoft Academic Search

    Alan Mishchenko; Michael Case; Robert Brayton; Stephen Jang

    2008-01-01

    This paper describes an efficient implementation of sequential synthesis that uses induction to detect and merge sequentially-equivalent nodes. State-encoding, scan chains, and test vectors are essentially preserved. Moreover, the sequential synthesis results are sequentially verifiable using an independent inductive prover similar to that used for synthesis, with guaranteed completeness. Experiments with this sequential synthesis show effectiveness. When applied to a

  6. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    NASA Technical Reports Server (NTRS)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The ability to write verifiable programs in HAL/S, a characteristic which is highly desirable in aerospace applications, is lacking, since many of the features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language fails with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  7. Honest Verifier vs Dishonest Verifier in Public Coin Zero-Knowledge Proofs

    E-print Network

    Wigderson, Avi

    Honest Verifier vs Dishonest Verifier in Public Coin Zero-Knowledge Proofs. Ivan Damgård, Oded... applies only to constant-round proof systems. It builds on Damgård's transformation (see Crypto '93), using... (and Yung; see Crypto '92), which was used by Damgård. Consequently, the protocols resulting from our...

  8. Cloud Verifier: Verifiable Auditing Service for IaaS Clouds Joshua Schiffman

    E-print Network

    Jaeger, Trent

    Cloud Verifier: Verifiable Auditing Service for IaaS Clouds. Joshua Schiffman... University Park, PA, USA. Cloud computing has commoditized the compute paradigm, but its adoption has been stymied by the cloud platform's lack of transparency, which leaves customers...

  9. Verified Runtime Validation of Verified CPS Models From Model Checking to Checking Models

    E-print Network

    Clarke, Edmund M.

    ModelPlex: Verified Runtime Validation of Verified CPS Models. From Model Checking to Checking Models. Stefan Mitsch, André Platzer, Computer Science Department, Carnegie Mellon University. Clarke Symposium, Sept. 20, 2014. For details, see the ModelPlex paper at RV'14.

  10. Alf-Verifier: An Eclipse Plugin for Verifying Alf/UML Executable Models

    E-print Network

    Université Paris-Sud XI

    Alf-Verifier: An Eclipse Plugin for Verifying Alf/UML Executable Models. Elena Planas, David... with the integrity constraints defined in the class diagram (specified in UML) and returns meaningful feedback... The OMG has recently published the first version of the "Foundational Subset for Executable UML Models" (fUML...

  11. Development of Seismic Isolation Systems Using Periodic Materials

    SciTech Connect

    Mo, Yi-Lung; Stokoe, Kenneth H.; Perkins, Judy; Tang, Yu

    2014-12-10

    Advanced fast nuclear power plants and small modular fast reactors are composed of thin-walled structures such as pipes; as a result, they do not have sufficient inherent strength to resist seismic loads. Seismic isolation, therefore, is an effective solution for mitigating earthquake hazards for these types of structures. Base isolation, on which numerous studies have been conducted, is a well-defined structure protection system against earthquakes. In conventional isolators, such as high-damping rubber bearings, lead-rubber bearings, and friction pendulum bearings, large relative displacements occur between upper structures and foundations. Only isolation in a horizontal direction is provided; these features are not desirable for the piping systems. The concept of periodic materials, based on the theory of solid-state physics, can be applied to earthquake engineering. The periodic material is a material that possesses distinct characteristics that prevent waves with certain frequencies from being transmitted through it; therefore, this material can be used in structural foundations to block unwanted seismic waves with certain frequencies. The frequency band of periodic material that can filter out waves is called the band gap, and the structural foundation made of periodic material is referred to as the periodic foundation. The design of a nuclear power plant, therefore, can be unified around the desirable feature of a periodic foundation, while the continuous maintenance of the structure is not needed. In this research project, three different types of periodic foundations were studied: one-dimensional, two-dimensional, and three-dimensional. The basic theories of periodic foundations are introduced first to find the band gaps; then the finite element methods are used, to perform parametric analysis, and obtain attenuation zones; finally, experimental programs are conducted, and the test data are analyzed to verify the theory. This procedure shows that the periodic foundation is a promising and effective way to mitigate structural damage caused by earthquake excitation.
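
    For the one-dimensional case mentioned above, the location of band gaps in a two-material periodic stack can be estimated from the classical normal-incidence dispersion relation cos(qL) = cos(k1·d1)·cos(k2·d2) − ((Z1² + Z2²)/(2·Z1·Z2))·sin(k1·d1)·sin(k2·d2); frequencies where the right-hand side exceeds 1 in magnitude fall in an attenuation zone. The layer properties below are generic rubber-like and concrete-like placeholders, not the specimens tested in this project:

```python
import numpy as np

def band_gap_frequencies(rho1, c1, d1, rho2, c2, d2, f_max=200.0, n=4000):
    """Scan 0..f_max Hz and flag frequencies inside a 1D phononic band gap.

    Propagation through the periodic two-layer stack is impossible where
    |cos(k1*d1)*cos(k2*d2) - 0.5*(Z1/Z2 + Z2/Z1)*sin(k1*d1)*sin(k2*d2)| > 1.
    """
    f = np.linspace(1e-3, f_max, n)
    omega = 2 * np.pi * f
    k1, k2 = omega / c1, omega / c2          # wavenumbers in each layer
    Z1, Z2 = rho1 * c1, rho2 * c2            # acoustic impedances
    rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
           - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))
    return f, np.abs(rhs) > 1.0              # True where waves are attenuated

# placeholder layers: a soft rubber-like layer over a stiff concrete-like layer
f, in_gap = band_gap_frequencies(rho1=1100.0, c1=50.0, d1=0.2,
                                 rho2=2400.0, c2=3000.0, d2=0.2)
gap_f = f[in_gap]
print("attenuation zone starts near %.1f Hz" % gap_f.min() if gap_f.size
      else "no band gap below f_max")
```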

  12. Seismic behavior of structural silicone glazing

    SciTech Connect

    Zarghamee, M.S.; Schwartz, T.A. [Simpson Gumpertz and Heger Inc., Arlington, MA (United States)]; Gladstone, M. [Dow Corning Corp., Fremont, CA (United States)]

    1996-12-31

    In seismic events, glass curtain walls undergo racking deformation, while the flat glass lites do not rack due to their high shear stiffness. If the glass curtain wall is not isolated from the building frame by specifically designed connections that accommodate relative motion, seismic racking motion of the building frame will demand significant resiliency of the sealant that secures the glass to the curtain wall framing. In typical four-sided structural silicone glazing systems used in buildings with unbraced moment frames, the magnitude of seismic racking is likely to stress the sealants significantly beyond the sealant design strength. In this paper, the extent of the expected seismic racking motion, the behavior of the structural silicone glazing when subjected to the expected racking motion, and the field performance of a building with four-sided structural silicone glazing during the Northridge earthquake are discussed. The details of a curtain wall design concept consisting of shop-glazed subframes connected to the building frame and the connections that accommodate seismic motion of the subframe relative to the building frame is developed. Specific recommendations are made for the design of the four-sided structural silicone glazing systems for seismic loads.

  13. Seismic interferometry for seismic source location and interpolation of three-dimensional ocean bottom seismic data

    NASA Astrophysics Data System (ADS)

    Cao, Weiping

    This dissertation develops new seismic interferometry algorithms for the estimation of seismic source locations and for the interpolation of sparse three-dimensional (3D) ocean bottom seismic (OBS) data. Unlike conventional source location and interpolation methods, which rely heavily on the accuracy of the velocity models, the interferometric techniques extract the multiple scattering information in the data and provide reliable results without knowledge of the velocity models. There are three main chapters in this dissertation. In Chapter 2, an interferometric imaging scheme, formulated as time-reversal mirrors (TRMs) in acoustics, is developed to backpropagate and refocus incident wavefields to their actual source location, with the subsequent benefits of imaging with super-resolution and super-stacking properties. These benefits of TRMs have been previously verified with computer simulations and tank experiments, but not with exploration-scale seismic data. We now demonstrate, for the first time, the super-resolution and super-stacking properties of TRMs with field seismic data. Tests on both synthetic and field data show that TRM has the potential to exceed the Rayleigh resolution limit by factors of 7 or more. Results also show that TRM has significant resilience to strong Gaussian noise, and that accurate imaging of source locations from passive seismic data can be accomplished with traces having signal-to-noise ratios as low as 0.001. Synthetic tests also demonstrate that TRMs enhance the signal by a factor proportional to the square root of the product of the number of traces and the number of events in the traces. This enhancement property is denoted super-stacking and greatly exceeds the well-known square-root number-of-traces factor. Super-resolution and super-stacking are properties also enjoyed by seismic interferometry and reverse-time migration (RTM) with the exact velocity model. In Chapter 3, the equation for the vertical resolution limit Δz is derived for images obtained by cross-correlation migration (CCM). The analytical formula shows that Δz_CCM = 4·z_o²·λ/L_g², compared to the vertical resolution limit Δz_post = λ/2 for conventional migration; here λ is the dominant wavelength, z_o is the depth of the scatterer, and L_g is the half-length of the recording aperture. This result explains the degraded vertical resolution in CCM images compared to those obtained from poststack migration, and suggests that good resolution at depth requires a large recording aperture. I extend the interferometric interpolation of ocean bottom seismic (OBS) traces from 2D data to 3D data in Chapter 4. The sparse OBS data are interferometrically correlated with the model-based Green's functions for the sea-floor model to generate the densely recorded OBS traces, and a local matching filter is applied to reduce the artifacts in the interpolated data. Information about the source wavelet and the multilayered velocity model below the sea floor is not needed. Results with 3D synthetic data show that the OBS traces can be faithfully interpolated from a sparse sampling interval of 50 m, about one half of the minimum horizontal wavelength, in both the x and y directions. Interpolation results are also computed using sparse OBS data with different recording intervals, and the error analysis shows degrading interpolation accuracy as the recording interval increases.
To mitigate the artifacts in the interferometry correlation results, an anti-aliasing condition is derived and demonstrated with a simple numerical example. The anti-aliasing theory developed here is a new development that establishes a fundamental criterion for numerically applying seismic interferometry to seismic data of any type.
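
    The two resolution limits quoted in Chapter 3, Δz_CCM = 4·z_o²·λ/L_g² for cross-correlation migration and Δz_post = λ/2 for conventional migration, are easy to compare numerically. The wavelength, depth, and aperture values below are arbitrary illustrative choices:

```python
def ccm_vertical_resolution(wavelength, depth, half_aperture):
    """Vertical resolution limit of cross-correlation migration (CCM):
    dz = 4 * z_o**2 * lambda / L_g**2, as quoted in the abstract above."""
    return 4.0 * depth**2 * wavelength / half_aperture**2

def poststack_vertical_resolution(wavelength):
    """Conventional (poststack) migration limit, dz = lambda / 2."""
    return wavelength / 2.0

# illustrative numbers: 100 m wavelength, 2 km deep scatterer, 5 km half-aperture
lam, z_o, L_g = 100.0, 2000.0, 5000.0
print("CCM resolution:       %.1f m" % ccm_vertical_resolution(lam, z_o, L_g))
print("Poststack resolution: %.1f m" % poststack_vertical_resolution(lam))
```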

  14. Positively Verifying Mating of Previously Unverifiable Flight Connectors

    NASA Technical Reports Server (NTRS)

    Pandipati R. K. Chetty

    2011-01-01

    Current practice is to uniquely key connectors whose mating cannot be verified by ground tests, such as those used on explosive or non-explosive initiators and pyro valves. However, this practice does not assure 100-percent correct mating, and errors in mating interchangeable connectors can result in a degraded or failed space mission. This problem can be overcome by the following approach: mating of all flight connectors considered not verifiable via ground tests can be verified electrically. The approach requires two additional wires routed through the connector of interest, a few resistors, and a voltage source. The test-point voltage V_tp equals the input voltage when the connector is not mated, and is attenuated by the resistor R_1 when the female (F) and male (M) connectors are mated correctly; the voltage at the test point is then a function of R_1 and R_2. Monitoring of the test point can be done on ground support equipment (GSE) only, or it can be a telemetry point. For implementation on multiple connector pairs, a different value of R_1, R_2, or both can be selected for each pair so that each pair produces a unique test-point voltage; the correct test-point voltage is read only when the correct pair is properly mated. Thus, this design approach can be used to positively verify the correct mating of connector pairs, and it can be applied to any number of connectors on the flight vehicle.
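
    The mated-state voltage described above behaves like a resistive divider. The sketch below assumes the simplest topology, V_tp = V_in·R2/(R1 + R2) when mated and V_tp = V_in when unmated; the actual flight circuit is not detailed in the abstract, and all resistor values and connector names are hypothetical:

```python
def test_point_voltage(v_in, r1, r2, mated):
    """Expected test-point voltage: full input when unmated, divided when mated."""
    if not mated:
        return v_in
    return v_in * r2 / (r1 + r2)

# hypothetical table of connector pairs, each with its own divider ratio (ohms)
pairs = {"P1/J1": (1000.0, 1000.0),   # expect 2.50 V when mated
         "P2/J2": (1000.0, 3000.0),   # expect 3.75 V when mated
         "P3/J3": (3000.0, 1000.0)}   # expect 1.25 V when mated
V_IN, TOL = 5.0, 0.05

def verify_mating(name, measured):
    """Compare a measured test-point voltage with the expected unique value."""
    r1, r2 = pairs[name]
    expected = test_point_voltage(V_IN, r1, r2, mated=True)
    return abs(measured - expected) <= TOL, expected

print(verify_mating("P2/J2", 3.74))   # (True, 3.75)  correct pair, properly mated
print(verify_mating("P2/J2", 2.50))   # (False, 3.75) wrong pair or not mated
```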

  15. Parallel Seismic Ray Tracing

    E-print Network

    Jain, Tarun K

    2013-12-09

    Seismic ray tracing is a common method for understanding and modeling seismic wave propagation. The wavefront construction (WFC) method handles wavefronts instead of individual rays, thereby providing a mechanism to control ray density...

  16. Using Embedded Wired and Wireless Seismic Networks in the Moment-Resisting Steel Frame Factor Building for Damage Identification

    E-print Network

    Kohler, Monica; Heaton, Thomas H.; Govindan, Ramesh; Davis, Paul; Estrin, D.

    2006-01-01

    ...embedded wired and wireless seismic networks in the moment-resisting steel frame Factor building... building seismic array to guide the design of a wireless system... Acknowledgments: We appreciate discussions with Erdal Safak and advice on building...

  17. Shear building representations of seismically isolated buildings

    Microsoft Academic Search

    Cenk Alhan; Melih Sürmeli

    Seismic isolation, with its capability of reducing floor accelerations and interstory drifts simultaneously, is recognized as an earthquake resistant design method that protects contents of a building along with the building itself. In research studies, superstructures of seismically isolated buildings are commonly modeled as idealized shear buildings. Shear building representation corresponds to an idealized structure where the beams are infinitely...

  18. Decision analysis for seismic retrofit of structures

    E-print Network

    Williams, Ryan J.

    2009-05-15

    ...given structure as well as the seismic hazard at a specific building location is incorporated into the decision-making process. The prescribed methodology is used to study two identical reinforced concrete buildings, one located in Memphis, Tennessee... Contents include: Design Details of Example Buildings; Fragilities of Example Buildings; Seismic Hazard at Memphis, TN and San Francisco, CA...

  19. Virtual Seismic Atlas

    NSDL National Science Digital Library

    Virtual Seismic Atlas

    The Virtual Seismic Atlas is an open-access community resource for sharing the geological interpretation of seismic data. By browsing freely through the site you will find seismic images and interpretations, and you can download higher-resolution images for your own use.

  20. Verifying policy-based web services security

    Microsoft Academic Search

    Karthikeyan Bhargavan; Cédric Fournet; Andrew D. Gordon

    2008-01-01

    WS-SecurityPolicy is a declarative language for configuring web services security mechanisms. We describe a formal semantics for WS-SecurityPolicy and propose a more abstract language for specifying secure links between web services and their clients. We present the architecture and implementation of tools that (1) compile policy files from link specifications, and (2) verify by invoking a theorem prover

  1. Seismic retrofitting of the Ste-Justine Hospital in Montreal

    E-print Network

    Chartrand, Valerie

    2009-01-01

    Seismic engineering provides design and construction techniques so that buildings and other structures can survive the tremendous forces of earthquakes. While codes and design practices have resulted in greatly improved ...

  2. Seismic rehabilitation of wood diaphragms in unreinforced masonry buildings

    E-print Network

    Grubbs, Amber Jo

    2002-01-01

    objectives: (1) assessing the adequacy of current seismic rehabilitation guidelines for evaluating existing wood diaphragms in pre-1950's URM buildings and for designing necessary retrofits; and (2) evaluating the effect of diaphragm retrofits, as designed...

  3. Marine Seismic Data Center

    NSDL National Science Digital Library

    This is the homepage of the Marine Seismic Data Center (MSDC) of the University of Texas Institute for Geophysics (UTIG). MSDC's purpose is to organize seismic reflection and refraction data into a modern relational database management system accessible through the Internet. The web site provides access to metadata, SEG-Y (seismic shot record conversion) files, navigation files, seismic profile images, processing histories and more. The main features of the web site include a geographic search engine, a metadata search engine, and metadata pages for the cruises. A tool for plotting seismic sections is being tested and will be added in the future.

  4. Fuel storage basin seismic analysis

    SciTech Connect

    Kanjilal, S.K.; Winkel, B.V.

    1991-08-01

    The 105-KE and 105-KW Fuel Storage Basins were constructed more than 35 years ago as repositories for irradiated fuel from the K East and K West Reactors. Currently, the basins contain irradiated fuel from the N Reactor. To continue to use the basins as desired, seismic adequacy in accordance with current US Department of Energy facility requirements must be demonstrated. The 105-KE and 105-KW Basins are reinforced concrete, belowground reservoirs with a 16-ft water depth. The entire water retention boundary, which currently includes a portion of the adjacent reactor buildings, must be qualified for the Hanford Site design basis earthquake. The reactor building interface joints are sealed against leakage with rubber water stops. Demonstration of the seismic adequacy of these interface joints was initially identified as a key issue in the seismic qualification effort. The issue of water leakage through seismically induced cracks was also investigated. This issue, coupled with the relatively complex geometry of the basins, dictated a need for three-dimensional modeling. A three-dimensional soil/structure interaction model was developed with the SASSI computer code. The development of three-dimensional models of the interfacing structures using the ANSYS code was also found to be necessary. 8 refs., 7 figs., 1 tab.

  5. Non-interactive Designated Verifier Proofs and Undeniable Signatures

    E-print Network

    Paterson, Kenny

    Paterson, Information Security Group, Royal Holloway, University of London, UK ... to be no formal security modelling for NIDV undeniable signatures or for NIDV proofs in general. Indeed, recent ... and are therefore of independent interest. We therefore present two security models, one for general NIDV proofs

  6. Underlying Assumptions and Designated Verifier Chifumi Sato1

    E-print Network

    Digital signatures, Standard model. 1 Introduction: Security of cryptographic protocols and schemes ... scheme satisfying security based on the difficulty of the problems. Also we prove that the difficulty ... For example, the Unforgeability (or Existential Unforgeability under an Adaptive Chosen Message Attack

  7. WRITING, VERIFYING, AND EXPLOITING FORMAL SPECIFICATIONS FOR HARDWARE DESIGNS

    E-print Network

    Dill, David L.

    UNIVERSITY IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY Kanna Shimizu October 2002. © Copyright by Kanna Shimizu 2002. All Rights Reserved. I certify that I have read

  8. WRITING, VERIFYING, AND EXPLOITING FORMAL SPECIFICATIONS FOR HARDWARE DESIGNS

    E-print Network

    Dill, David L.

    university in partial fulfillment of the requirements for the degree of doctor of philosophy Kanna Shimizu August 2002. © Copyright by Kanna Shimizu 2002. All Rights Reserved. I certify that I have

  9. Design of a potential long-term test of gas production from a hydrate deposit at the PBU-L106 site in North Slope, Alaska: Geomechanical system response and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Chiaramonte, L.; Kowalsky, M. B.; Rutqvist, J.; Moridis, G. J.

    2009-12-01

    In an effort to optimize the design of a potential long-term production test at the PBU-L106 site in North Slope, Alaska, we have developed a coupled modeling framework that includes the simulation of (1) large-scale production at the test site, (2) the corresponding geomechanical changes in the system caused by production, and (3) time-lapse geophysical (seismic) surveys. The long-term test is to be conducted within the deposit of the C-layer, which extends from a depth of 2226 to 2374 ft, and is characterized by two hydrate-bearing strata separated by a 30 ft shale interlayer. In this study we examine the expected geomechanical response of the permafrost-associated hydrate deposit (C-Layer) at the PBU L106 site during depressurization-induced production, and assess the potential for monitoring the system response with seismic measurements. Gas hydrates increase the strength of the sediments (often unconsolidated) they impregnate. Thus hydrate disassociation in the course of gas production could potentially affect the geomechanical stability of such deposits, leading to sediment failure and potentially affecting wellbore stability and integrity at the production site and/or at neighboring conventional production facilities. For the geomechanical analysis we use a coupled hydraulic, thermodynamic and geomechanical model (TOUGH+HYDRATE+FLAC3D, T+H+F for short) simulating production from a single vertical well at the center of an infinite-acting hydrate deposit. We investigate the geomechanical stability of the C-Layer, well stability and possible interference (due to production) with pre-existing wells in the vicinity, as well as the system sensitivity to important parameters (saturation, permeability, porosity and heterogeneity). The time-lapse seismic surveys are simulated using a finite-difference elastic wave propagation model that is linked to the T+H+F code. The seismic properties, such as the elastic and shear moduli, are a function of the simulated time- and space-varying pressure and temperature, the aqueous-, gas-, and hydrate-phase saturation, and the porosity. We examine a variety of seismic measurement configurations and survey parameters to determine the optimal approach for detecting changes occurring in the hydrate deposit during production that can be used as the basis for monitoring hydrate dissociation, and the corresponding hydrate saturation and geomechanical status. The general approach we are developing (involving the coupled simulation of production, the geomechanical response, and the evolution of geophysical properties in hydrate accumulations under production) will be a valuable tool that can be used to maximize production potential, minimize risks of geomechanical instabilities, and ensure that the system can be adequately monitored using remote sensing techniques.

  10. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-12-09

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
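
    The check described in the patent abstract above amounts to running the same test instructions twice, once serially and once across threads, and flagging any mismatch. The Python sketch below is a toy illustration of that comparison only; the work function and thread pool are assumptions, not the patented mechanism.

      # Toy illustration: compare a serial run against a multithreaded run.
      from concurrent.futures import ThreadPoolExecutor

      def test_instruction(x):
          # Stand-in for one "test instruction"; free of hidden data dependencies.
          return x * x + 1

      inputs = list(range(100))

      # Serial execution: ordering and data dependencies trivially satisfied.
      serial_result = [test_instruction(x) for x in inputs]

      # "Speculative" execution: the same instructions spread across threads.
      with ThreadPoolExecutor(max_workers=4) as pool:
          speculative_result = list(pool.map(test_instruction, inputs))

      if serial_result != speculative_result:
          print("speculative multithreading error detected")
      else:
          print("serial and speculative results match")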

  11. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-11-18

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.

  12. Subocean bottom explosive seismic system

    SciTech Connect

    Wener, K. R.; Tinkle, A. R.

    1985-05-07

    The invention provides a system having at least one subocean bottom seismic device, such as a seismic source or a seismic detector, and a planting unit. When lowered, the planting unit selectively implants the seismic device at predetermined locations in the ocean bottom, releases from the implanted device, and, when raised, uncoils a signal cable from it. The signal cable, which is capable of retrieving the implanted seismic device, is connected to an anchored buoy that contains a first communications unit. A second seismic device is carried in a predetermined pattern near the implanted seismic device and is connected to a second communications unit.

  13. Design and installation of a monitoring network to investigate the correlations between geoelectrical fluctuations and seismicity of Basilicata region (southern Italy)

    NASA Astrophysics Data System (ADS)

    Colangelo, Gerardo; Balasco, Marianna; Lapenna, Vincenzo; Telesca, Luciano

    In past and recent years, geoelectrical fluctuations measured in seismic areas have been attributed to stress and strain changes associated with earthquakes. The complex nature of this problem has suggested the development of monitoring networks based on multi-parametric remote stations, in order to perform geophysical monitoring over a long time period and with a high spatial resolution. In this work we present the geophysical monitoring network, built up with remote stations able to jointly detect geoelectrical and seismometric parameters, in the Basilicata region, a seismically active area of the southern Apennine Chain (Italy) struck by strong earthquakes in the past (1857) and recently (1980). The seismological setting and a very low level of anthropic noise allow us to consider the investigated area as well suited to studying possible correlations between tectonic activity and anomalous patterns in geoelectrical data.

  14. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. 
Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our

  15. Verifying disarmament: scientific, technological and political challenges

    SciTech Connect

    Pilat, Joseph R [Los Alamos National Laboratory

    2011-01-25

    There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would have unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shutdown nuclear weapon and existing nuclear energy programs, and material and weapon production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the needed verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  16. Design of a myo-seismic transducer for non-invasive transcutaneous vectorial recording of locally fast muscle-fibre micro-contractions.

    PubMed

    Journe, H L; de Jonge, A B

    1995-01-01

    Mechanical recording usually concerns the analysis of movements in bio-mechanical research projects. Mechanical recording of locally fast muscle-fibre micro-contractions, however, is a little-developed and rarely-applied myographic technique. In the last decade, acoustic or myophonic measurements came increasingly into the picture when they were also applied to research on general muscle activity, such as in muscle fatigue studies. In this paper, a new micro-seismic recording technique is introduced. The technique registers extremely local activity in the velocity and force vector of skin movement as a function of time. The recording method is sensitive to micro excursions caused by muscle fibres under the skin. The resolution in time is at least 100 µs, which is demonstrated in an experiment where a mechanical contraction is provoked by electrical stimulation of the median nerve. This indicates a seismic variant, referred to as seismic-myography (SMG), of surface EMGs, and offers complementary features. The most important features are: 1. Insensitivity to low-frequency, large movement artefacts. 2. Sensitivity to fast mechanical micro-excursions and velocities. 3. Fast and precise discrimination of local mechanical events. 4. Vectorial reconstruction of superficial mechanical activity which can be used for the identification and functional behaviour of subcutaneous muscle fibres and, in addition, for the localisation of motor endplate zones. 5. The method is easy to use. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:7649066

  17. Improvement of broadband seismic station installations at the Observatoire de Grenoble (OSUG) seismic network

    NASA Astrophysics Data System (ADS)

    Langlais, M.; Vial, B.; Coutant, O.

    2013-04-01

    We describe in this paper several improvements made to the installation of the broadband seismic stations deployed by the Observatoire de Grenoble (OSUG) in the northern French Alps. This work was carried out in the framework of a French-Italian ALCOTRA project (RISE), aimed at modernizing the broadband seismic networks across our common border. The project gave us the opportunity to improve some of our seismic recording sites, both in terms of sensor installation quality and in terms of reliability. In particular, we detail the thermal and barometric protection system that we designed and show its effect on the reduction of long-period noise above 20 s.

  18. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough on the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent property of the magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of a smart seismic isolation system. The novel isolator contains a unique laminated structure of steel and MR elastomer layers, which enables its large-scale civil engineering application, and a solenoid to provide a sufficient and uniform magnetic field for energizing the field-dependent property of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the developed adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. In this paper, a comprehensive review of the development of the adaptive seismic isolator is presented, including the design, analysis and testing of two prototypical adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototypical MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable increase in lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive control.

  19. Verifying Timestamps of Occultation Observation Systems

    NASA Astrophysics Data System (ADS)

    Barry, M. A. Tony; Gault, Dave; Bolt, Greg; McEwan, Alistair; Filipović, Miroslav D.; White, Graeme L.

    2015-04-01

    We describe an image timestamp verification system to determine the exposure timing characteristics and continuity of images made by an imaging camera and recorder, with reference to Coordinated Universal Time. The original use was to verify the timestamps of stellar occultation recording systems, but the system is applicable to lunar flashes, planetary transits, sprite recording, or any area where reliable timestamps are required. The system offers good temporal resolution (down to 2 ms, referred to Coordinated Universal Time) and provides exposure duration and interframe dead time information. The system uses inexpensive, off-the-shelf components, requires minimal assembly, and requires no high-voltage components or connections. We also describe an application to load FITS (and other format) image files, which can decode the verification image timestamp. Source code, wiring diagrams, and built applications are provided to aid the construction and use of the device.

  20. Measurement of the seismic attenuation performance of the VIRGO Superattenuator

    Microsoft Academic Search

    S. Braccini; L. Barsotti; C. Bradaschia; G. Cella; A. Di Virgilio; I. Ferrante; F. Fidecaro; I. Fiori; F. Frasconi; A. Gennai; A. Giazotto; F. Paoletti; R. Passaquieti; D. Passuello; R. Poggiani; E. Campagna; G. Guidi; G. Losurdo; F. Martelli; M. Mazzoni; B. Perniola; F. Piergiovanni; R. Stanga; F. Vetrano; A. Vicer; L. Brocco; S. Frasca; E. Majorana; A. Pai; C. Palomba; P. Puppo; P. Rapagnani; F. Ricci; G. Ballardin; R. Barill; R. Cavalieri; E. Cuoco; V. Dattilo; D. Enard; R. Flaminio; A. Freise; S. Hebri; L. Holloway; P. La Penna; M. Loupias; J. Marque; C. Moins; A. Pasqualetti; P. Ruggi; R. Taddei; Z. Zhang; F. Acernese; S. Avino; F. Barone; E. Calloni; R. De Rosa; L. Di Fiore; A. Eleuteri; L. Giordano; L. Milano; S. Pardi; K. Qipiani; I. Ricciardi; G. Russo; S. Solimeno; D. Babusci; G. Giordano; P. Amico; L. Bosi; L. Gammaitoni; F. Marchesoni; M. Punturo; F. Travasso; H. Vocca; C. Boccara; J. Moreau; V. Loriette; V. Reita; J. M. Mackowski; N. Morgado; L. Pinard; A. Remillieux; M. Barsuglia; M. A. Bizouard; V. Brisson; F. Cavalier; A. C. Clapson; M. Davier; P. Hello; S. Krecklbergh; F. Beauville; D. Buskulic; R. Gouaty; D. Grosjean; F. Marion; A. Masserot; B. Mours; E. Tournefier; D. Tombolato; D. Verkindt; M. Yvert; S. Aoudia; F. Bondu; A. Brillet; E. Chassande-Mottin; F. Cleva; J. P. Coulon; B. Dujardin; J. D. Fournier; H. Heitmann; C. N. Man; A. Spallicci; J. Y. Vinet

    2005-01-01

    The gravitational wave detector VIRGO aims at extending the detection band down to a few Hertz by isolating the mirrors of the interferometer from seismic noise. This result is achieved by hanging each mirror through an elastic suspension (Superattenuator), designed to filter mechanical vibrations in all the degrees of freedom. An experimental upper limit of the mirror residual seismic noise

  1. Verifiable process monitoring through enhanced data authentication.

    SciTech Connect

    Goncalves, Joao G. M. (European Commission Joint Research Centre, Italy); Schwalbach, Peter (European Commission Directorate General-Energy, Luxembourg); Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-09-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
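
    The EDAS concept above combines source tagging, public-key authentication, and encryption of the branched data stream. The Python sketch below illustrates those three steps using the third-party cryptography package; the record fields, key handling, and algorithm choices are assumptions for illustration, not the actual EDAS implementation.

      # Conceptual sketch: timestamp/tag a record, sign it, then encrypt a copy.
      import json, time
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding
      from cryptography.fernet import Fernet

      # Signing key pair (a real system would provision keys inside the
      # tamper-indicating enclosure rather than generating them ad hoc).
      signing_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      verify_key = signing_key.public_key()

      # Capture a measurement close to the sensor and add source and timestamp.
      record = {"source": "analyzer-01", "timestamp": time.time(), "reading": 42.7}
      payload = json.dumps(record, sort_keys=True).encode()

      pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH)
      signature = signing_key.sign(payload, pss, hashes.SHA256())  # authentication

      session_key = Fernet.generate_key()
      ciphertext = Fernet(session_key).encrypt(payload)            # confidentiality

      # The receiving side verifies authenticity before trusting the record.
      verify_key.verify(signature, payload, pss, hashes.SHA256())
      print("record authenticated; ciphertext bytes:", len(ciphertext))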

  2. Verifying an interactive consistency circuit: A case study in the reuse of a verification technology

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1990-01-01

    The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.

  3. Usefulness of the fiber-optic interferometer for the investigation of the seismic rotation waves

    Microsoft Academic Search

    LESZEK R. JAROSZEWICZ; ZBIGNIEW KRAJEWSKI; ROMAN TEISSEYRE

    In this paper, new areas of application of the fiber-optic Sagnac interferometer are discussed and proposed. Because this system detects absolute rotation, it is directly suited to detection of seismic rotation waves, the rotational events present in seismic waves. In most cases those waves are extracted from recordings of differential seismic signals. However, all differences in

  4. Research and Development of Seismic Base Isolation Technique for Civil Engineering Structures

    Microsoft Academic Search

    Sun Hong-ling; Li Qing

    2010-01-01

    Base isolation is one of the most promising alternatives among structural control methods. In recent decades, base isolation has been seriously considered for civil structures, such as buildings and bridges, subjected to ground motion. Seismic isolation techniques have been applied successfully abroad, especially in Japan, and buildings with seismic isolation design have performed well in past earthquakes. Seismic

  5. AUTOMATING SHALLOW SEISMIC IMAGING

    Microsoft Academic Search

    Steeples; Don W

    2003-01-01

    The current project is a continuation of an effort to develop ultrashallow seismic imaging as a cost-effective method potentially applicable to DOE facilities. The objective of the present research is to develop and demonstrate the use of a cost-effective, automated method of conducting shallow seismic surveys, an approach that represents a significant departure from conventional seismic-survey field procedures. Initial testing

  6. AUTOMATING SHALLOW SEISMIC IMAGING

    Microsoft Academic Search

    Steeples; Don W

    2002-01-01

    Our current EMSP project continues an effort begun in 1997 to develop ultrashallow seismic imaging as a cost-effective method applicable to DOE facilities. The objective of the present research is to refine and demonstrate the use of an automated method of conducting shallow seismic surveys--an approach that represents a significant departure from conventional seismic-survey field procedures. Recent tests involving a

  7. Seismic isolation of two dimensional periodic foundations

    SciTech Connect

    Yan, Y.; Mo, Y. L., E-mail: yilungmo@central.uh.edu [University of Houston, Houston, Texas 77004 (United States); Laskar, A. [Indian Institute of Technology Bombay, Powai, Mumbai (India); Cheng, Z.; Shi, Z. [Beijing Jiaotong University, Beijing (China); Menq, F. [University of Texas, Austin, Texas 78712 (United States); Tang, Y. [Argonne National Laboratory, Argonne, Illinois 60439 (United States)

    2014-07-28

    Phononic crystals are now used to control acoustic waves. When the crystal is scaled up, it is called a periodic structure. The band gaps of the periodic structure can be brought down to the range from 0.5 Hz to 50 Hz. Therefore, the periodic structure has potential applications in seismic wave reflection. In civil engineering, the periodic structure can serve as the foundation of the upper structure; this type of foundation is called a periodic foundation. When the frequency of seismic waves falls into the band gaps of the periodic foundation, the seismic wave can be blocked. Field experiments on a scaled two-dimensional (2D) periodic foundation with an upper structure were conducted to verify the band gap effects. Test results showed that the 2D periodic foundation can effectively reduce the response of the upper structure for excitations with frequencies within the band gaps. The experimental and finite element analysis results agree well with each other, indicating that a 2D periodic foundation is a feasible way of reducing seismic vibrations.
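
    The entry above rests on a simple rule: an excitation is blocked only if its dominant frequency falls inside a band gap of the periodic foundation. The toy Python check below illustrates that rule; the single 0.5-50 Hz gap is taken from the abstract, and the test frequencies are made up.

      # Toy band-gap check for the periodic-foundation idea described above.
      band_gaps_hz = [(0.5, 50.0)]  # from the abstract; real designs have several

      def is_isolated(frequency_hz):
          """True if the excitation frequency lies inside any band gap."""
          return any(lo <= frequency_hz <= hi for lo, hi in band_gaps_hz)

      for f in (0.2, 2.0, 15.0, 60.0):
          print(f, "Hz ->", "blocked by foundation" if is_isolated(f) else "transmitted")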

  8. Application of seismic tomography in underground mining

    SciTech Connect

    Scott, D.F.; Williams, T.J. [Department of Energy, Spokane, WA (United States); Friedel, M.J.

    1996-12-01

    Seismic tomography, as used in mining, is based on the principle that highly stressed rock will demonstrate relatively higher P-wave velocities than rock under less stress. A decrease or increase in stress over time can be verified by comparing successive tomograms. Personnel at the Spokane Research Center have been investigating the use of seismic tomography to identify stress in remnant ore pillars in deep (greater than 1220 m) underground mines. In this process, three-dimensional seismic surveys are conducted in a pillar between mine levels. A sledgehammer is used to generate P-waves, which are recorded by geophones connected to a stacking signal seismograph capable of collecting and storing the P-wave data. Travel times are input into a spreadsheet, and apparent velocities are then generated and merged into imaging software. Mine workings are superimposed over apparent P-wave velocity contours to generate a final tomographic image. Results of a seismic tomographic survey at the Sunshine Mine, Kellogg, ID, indicate that low-velocity areas (low stress) are associated with mine workings and high-velocity areas (higher stress) are associated with areas where no mining has taken place. A high stress gradient was identified in an area where ground failed. From this tomographic survey, as well as four earlier surveys at other deep underground mines, a method was developed to identify relative stress in remnant ore pillars. This information is useful in making decisions about miner safety when mining such ore pillars.
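
    The spreadsheet step in the entry above is essentially path length divided by travel time. The Python sketch below shows that apparent-velocity calculation under a straight-ray assumption; the coordinates and travel times are invented for illustration, not survey data.

      # Straight-ray apparent velocities from first-arrival travel-time picks.
      import math

      # (source_xyz_m, geophone_xyz_m, travel_time_s) -- illustrative picks only
      picks = [
          ((0.0, 0.0, 0.0), (30.0, 0.0, 10.0), 0.0062),
          ((0.0, 0.0, 0.0), (30.0, 15.0, 10.0), 0.0071),
          ((0.0, 5.0, 0.0), (30.0, 0.0, 20.0), 0.0068),
      ]

      for src, rec, t in picks:
          path = math.dist(src, rec)      # straight-ray path length in metres
          v_apparent = path / t           # higher velocity suggests higher stress
          print(f"path {path:6.1f} m  t {t*1e3:5.2f} ms  v {v_apparent:7.0f} m/s")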

  9. Statistical classification methods applied to seismic discrimination

    SciTech Connect

    Ryan, F.M. [ed.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within approximately 2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
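
    Of the methods listed in the report above, linear discriminant analysis is the simplest to demonstrate. The Python sketch below applies it to a synthetic two-feature discrimination problem; the feature choice (mb- and Ms-like magnitudes) and all numbers are assumptions for illustration, not data from the report.

      # Toy linear discriminant analysis for two event classes.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      earthquakes = rng.normal(loc=[4.5, 4.6], scale=0.25, size=(50, 2))  # class 0
      explosions = rng.normal(loc=[4.5, 3.9], scale=0.25, size=(50, 2))   # class 1

      X = np.vstack([earthquakes, explosions])
      y = np.array([0] * 50 + [1] * 50)

      clf = LinearDiscriminantAnalysis().fit(X, y)
      print("training accuracy:", clf.score(X, y))
      print("predicted class for features (4.4, 4.0):", clf.predict([[4.4, 4.0]])[0])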

  10. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.

  11. Seismic Imaging and Monitoring

    SciTech Connect

    Huang, Lianjie [Los Alamos National Laboratory

    2012-07-09

    I give an overview of LANL's capability in seismic imaging and monitoring. I present some seismic imaging and monitoring results, including imaging of complex structures, subsalt imaging of the Gulf of Mexico, fault/fracture zone imaging for geothermal exploration at the Jemez pueblo, time-lapse imaging of walkway vertical seismic profiling data for monitoring CO{sub 2} injection at SACROC, and microseismic event locations for monitoring CO{sub 2} injection at Aneth. These examples demonstrate LANL's high-resolution and high-fidelity seismic imaging and monitoring capabilities.

  12. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya is a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974 and the recent 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input in seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach for the purposes of identifying spatial differences in seismicity, which can be utilised to form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data. The earthquake catalogue covers a time span from 1930 to 2007 and an area from 23.00° to 39.00°N and 59.00° to 80.00°E. A threshold magnitude of 5.2 is considered for K-means cluster analysis. The current study uses the traditional metrics of cluster quality, in addition to a seismic hazard contextual metric, to attempt to constrain the preferred number of clusters found in the data. The spatial distribution of earthquakes from the catalogue was used to define the seismic clusters for Pakistan, which can be used further in the process of defining seismogenic sources and corresponding earthquake recurrence models for estimates of seismic hazard and risk in Pakistan. Consideration of the different approaches to cluster validation in a seismic hazard context suggests that Pakistan may be divided into K = 19 seismic clusters, including some portions of the neighbouring countries of Afghanistan, Tajikistan and India.
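
    The clustering step in the study above is standard K-means applied to epicentre coordinates of events above a magnitude threshold. The Python sketch below mirrors that step on a synthetic catalogue; the random events are placeholders for the homogenised 1930-2007 catalogue, while the 5.2 threshold and K = 19 follow the abstract.

      # K-means partitioning of a synthetic earthquake catalogue.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      # Columns: latitude, longitude, magnitude (synthetic stand-in catalogue).
      catalogue = np.column_stack([
          rng.uniform(23.0, 39.0, 500),
          rng.uniform(59.0, 80.0, 500),
          rng.uniform(4.0, 7.5, 500),
      ])

      # Keep only events at or above the threshold magnitude used in the study.
      events = catalogue[catalogue[:, 2] >= 5.2][:, :2]

      kmeans = KMeans(n_clusters=19, n_init=10, random_state=0).fit(events)
      print("events used:", len(events))
      print("cluster sizes:", np.bincount(kmeans.labels_))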

  13. Seismic Safety Of Simple Masonry Buildings

    SciTech Connect

    Guadagnuolo, Mariateresa; Faella, Giuseppe [Dipartimento di Cultura del Progetto, Seconda Universita di Napoli Abbazia di S. Lorenzo ad Septimum, 81031, Aversa (Italy)

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings explicit safety verifications are not compulsory if specific code rules are fulfilled. In fact it is assumed that their fulfilment ensures a suitable seismic behaviour of buildings and thus adequate safety under earthquakes. Italian and European seismic codes differ in the requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at site. Obviously, a wide percentage of buildings assumed simple by codes should satisfy the numerical safety verification, so that no confusion or uncertainty arises for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings, having different geometry, are analysed and results from nonlinear static analyses performed by varying the acceleration at site are presented and discussed. Indications on the congruence between code rules and results of numerical analyses performed according to the code itself are supplied and, in this context, the obtained results can provide a contribution toward improving the seismic code requirements.

  14. Downhole seismic logging for high-resolution reflection surveying in unconsolidated overburden

    Microsoft Academic Search

    J. A. Hunter; S. E. Pullan; R. A. Burns; R. L. Good; J. B. Harris; A. Pugin; A. Skvortsov; N. N. Goriainov

    1998-01-01

    Downhole seismic velocity logging techniques have been developed and applied in support of high-resolution reflection seismic surveys. Data obtained from downhole seismic logging can provide accurate velocity-depth functions and directly correlate seismic reflections to depth. The methodologies described in this paper are designed for slimhole applications in plastic-cased boreholes (minimum ID of 50 mm) and with source and detector arrays

  15. Stressing of fault patch during seismic swarms in central Apennines, Italy

    NASA Astrophysics Data System (ADS)

    De Gori, P.; Lucente, F. P.; Chiarabba, C.

    2015-04-01

    Persistent seismic swarms originate along the normal faulting system of the central Apennines (Italy). In this study, we analyze the space-time-energy distribution of one of the longest and most intense of these swarms, active since August 2013 in the high seismic risk area of the Gubbio basin. Our aim is to verify whether information relevant to constraining short-term earthquake occurrence scenarios is hidden in seismic swarms. During the swarm, the seismic moment release first accelerated, with a rapid migration of seismicity along the fault system, and then suddenly dropped. We observe a decrease of the b-value along the portion of the fault system where large magnitude events concentrated, possibly indicating that a fault patch was dynamically stressed. This finding suggests that the onset of seismic swarms might help the formation of critically stressed patches.

  16. Induced Seismicity Potential of Energy Technologies

    NASA Astrophysics Data System (ADS)

    Hitzman, Murray

    2013-03-01

    Earthquakes attributable to human activities, "induced seismic events," have received heightened public attention in the United States over the past several years. Upon request from the U.S. Congress and the Department of Energy, the National Research Council was asked to assemble a committee of experts to examine the scale, scope, and consequences of seismicity induced during fluid injection and withdrawal associated with geothermal energy development, oil and gas development, and carbon capture and storage (CCS). The committee's report, publicly released in June 2012, indicates that induced seismicity associated with fluid injection or withdrawal is caused in most cases by change in pore fluid pressure and/or change in stress in the subsurface in the presence of faults with specific properties and orientations and a critical state of stress in the rocks. The factor that appears to have the most direct consequence in regard to induced seismicity is the net fluid balance (total balance of fluid introduced into or removed from the subsurface). Energy technology projects that are designed to maintain a balance between the amount of fluid being injected and withdrawn, such as most oil and gas development projects, appear to produce fewer seismic events than projects that do not maintain fluid balance. Major findings from the study include: (1) as presently implemented, the process of hydraulic fracturing for shale gas recovery does not pose a high risk for inducing felt seismic events; (2) injection for disposal of waste water derived from energy technologies does pose some risk for induced seismicity, but very few events have been documented over the past several decades relative to the large number of disposal wells in operation; and (3) CCS, due to the large net volumes of injected fluids suggested for future large-scale carbon storage projects, may have potential for inducing larger seismic events.

  17. Seismic response study for base-isolated CANDU 3

    SciTech Connect

    Biswas, J.K.; Saudy, A.M. [Atomic Energy of Canada Limited, Saskatoon, Saskatchewan (Canada). Civil Engineering Branch

    1995-12-31

    The design of the CANDU 3 nuclear power plant rated at 450 MW of net output power is being developed by AECL. During the development of the CANDU 3 design, various design options including the use of seismic isolator bearings are considered to mitigate effects of seismic loads. The current design of CANDU 3 is of fixed-base construction. However, analytical studies are undertaken to determine the effects of using seismic isolation. This paper presents a study of the benefits of using seismic isolator bearings for the CANDU 3 nuclear power plant. To base-isolate the CANDU 3 plant, the reactor and other safety-related buildings would be located on a common mat isolated from the foundation with the use of elastomeric bearings. Seismic analyses are performed to predict the behavior of the structures. A mathematical model consisting of lumped masses and beams to represent different buildings of the CANDU 3 plant is considered in the analysis. The model considers the nonlinear characteristics of the elastomeric bearing. Nonlinear time-history analyses are performed to determine the seismic responses. The acceleration, displacement and floor response spectra of different buildings are determined for both the fixed-base and base-isolated cases. The results show that the use of seismic isolation would reduce the acceleration responses of the buildings significantly. However, the displacement responses of the buildings would be increased which would require special considerations for interconnected systems. Moreover, it is shown that the floor response spectra would be reduced drastically for a base-isolated structure as compared with a fixed-base structure. This reduction of seismic responses would be of considerable benefit for the design of structures and seismic qualification of components. Lastly, a parametric study is performed to determine the effect of varying seismic input using non-linear analysis techniques.

  18. Automatic Generation of the C# Code for Security Protocols Verified with Casper\\/FDR

    Microsoft Academic Search

    Chul-wuk Jeon; Il-gon Kim; Jin-young Choi

    2005-01-01

    Formal methods technique offer a means of verifying the correctness of the design process used to create the security protocol. Notwithstanding the successful verification of the design of security protocols, the implementation code for them may contain security flaws, due to the mistakes made by the programmers or bugs in the programming language itself. We propose an ACG-C# tool, which

  19. On Modeling and Verifying of Application Protocols of TTCAN in Flight-Control System with UPPAAL

    Microsoft Academic Search

    Xiao Wu; Heng Ling; Yunwei Dong

    2009-01-01

    TTCAN is the most potential protocol used to construct the communication layer of flight control system of unmanned aircraft vehicle (UAV). In this paper, we propose a novel UAV flight control system design which is based on TTCAN. We not only design the model of the system but also verify its non-functional properties such as reliability, security, schedulability and fault-tolerance

  20. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions was first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  1. Seismic fragility test of a 6-inch diameter pipe system

    SciTech Connect

    Chen, W. P.; Onesto, A. T.; DeVita, V.

    1987-02-01

    This report contains the test results and assessments of seismic fragility tests performed on a 6-inch diameter piping system. The test was funded by the US Nuclear Regulatory Commission (NRC) and conducted by ETEC. The objective of the test was to investigate the ability of a representative nuclear piping system to withstand high level dynamic seismic and other loadings. Levels of loadings achieved during seismic testing were 20 to 30 times larger than normal elastic design evaluations to ASME Level D limits would permit. Based on failure data obtained during seismic and other dynamic testing, it was concluded that nuclear piping systems are inherently able to withstand much larger dynamic seismic loadings than permitted by current design practice criteria or predicted by the probabilistic risk assessment (PRA) methods and several proposed nonlinear methods of failure analysis.

  2. Scanning Seismic Intrusion Detector

    NASA Technical Reports Server (NTRS)

    Lee, R. D.

    1982-01-01

    Scanning seismic intrusion detector employs array of automatically or manually scanned sensors to determine approximate location of intruder. Automatic-scanning feature enables one operator to tend system of many sensors. Typical sensors used with new system are moving-coil seismic pickups. Detector finds uses in industrial security systems.

  3. Seismic performance of RC shear wall structure with novel shape memory alloy dampers in coupling beams

    NASA Astrophysics Data System (ADS)

    Mao, Chenxi; Dong, Jinzhi; Li, Hui; Ou, Jinping

    2012-04-01

    The shear wall system is widely adopted in high-rise buildings because of its high lateral stiffness in resisting earthquakes. According to the concept of ductility-based seismic design, coupling beams in a shear wall structure are required to yield prior to damage of the wall limbs. However, damage in coupling beams results in repair costs after an earthquake, and in some cases it is difficult to repair the coupling beams if the damage is severe. In order to solve this problem, a novel passive SMA damper was proposed in this study. The coupling beams connecting wall limbs are split in the middle, and the dampers are installed between the ends of the two cantilevers. The relative flexural deformation of the wall limbs is then transferred to the ends of the coupling beams and then to the SMA dampers. After earthquakes the deformation of the dampers can recover automatically because of the pseudoelasticity of the austenite SMA material. In order to verify the validity of the proposed dampers, seismic responses of a 12-story coupled shear wall with such passive SMA dampers in its coupling beams were investigated. The additional stiffness and yielding deformation of the dampers and their ratios to the lateral stiffness and yielding displacements of the wall limbs are key design parameters and were addressed. Analytical results indicate that the displacement responses of the shear wall structure with such dampers are reduced remarkably. The deformation of the structure is concentrated in the dampers and the damage of the coupling beams is reduced.

  4. A PZT-based smart aggregate for seismic shear stress monitoring

    NASA Astrophysics Data System (ADS)

    Hou, S.; Zhang, H. B.; Ou, J. P.

    2013-06-01

    A lead zirconate titanate (PZT)-based smart aggregate (SA) is proposed for seismic shear stress monitoring in concrete structures. This SA uses a d15-mode PZT as the sensing element. A calibration test is designed in which a cyclic shear stress with a dominant frequency of the earthquake response spectrum is applied on the two opposite sides of the proposed SA using a specially designed loading mold. The test is repeated on six copies of the proposed SA. The maximum applied shear stress is larger than the shear strength of ordinary concrete to allow measurements during failure. The output voltage of the SA is experimentally verified as varying linearly with the applied stress in the loading range. The sensitivity of the proposed SA to the applied stress under the given boundary conditions is examined. The calibrated sensitivity value is then compared with the calculated value, which is obtained by computing the stress distribution in the SA using finite element analysis (FEA). The calculated values and the calibrated values are approximately the same, indicating that the established finite element (FE) model is reliable. Monotonic loading is also applied on the proposed SA to induce cracks between the SA and the loading mold, and the SAs response to cracking processes is examined. It is found that the proposed SA underestimates the cracking process. This study demonstrates that the proposed SA can be used in monitoring the overall shear stress development process in concrete during a seismic event.

  5. Towards data fusion in seismic monitoring: Source characterization of mining blasts with acoustic and seismic records

    SciTech Connect

    Leach, R.R. Jr.; Dowla, F.U.

    1995-07-01

    Event identification that combines data from a diverse range of sensor types, such as seismic, hydroacoustic, infrasound, optical, or acoustic sensors, has been discussed recently as a way to improve treaty monitoring technology, especially for a Comprehensive Test Ban Treaty. In this exploratory study the authors compare features in acoustic and seismic data from ripple-fired mining blasts, in an effort to understand the issues of incorporating data fusion into seismic monitoring. They study the possibility of identifying features such as spectral scalloping at high frequencies using acoustic signals recorded in the near field during mining blasts. Recorded acoustic and seismic data from two mining blasts at Carlin, Nevada, were analyzed. The authors have found that there is a clear presence of the periodic and impulsive nature of the ripple-fire source present in the acoustic recordings at high frequencies. They have discovered that the arrival time and duration of the acoustic recordings are also clearly discernible at high frequencies. This is in contrast to the absence of these features in seismic signals, due to attenuation and scattering at high frequencies. The association of signals from different sensors offers solutions for difficult monitoring problems. Seismic or acoustic signals individually may not be able to detect a nuclear test hidden under a typical mining blast. However, the presence of an underground nuclear test during a mining event could be determined by deriving the mining explosion source from the acoustic recording, modeling a seismic signal from the derived source, and subtracting the modeled seismic signal from the seismic recording for the event. Recommendations in the design of data fusion systems for treaty monitoring are suggested.
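
    Spectral scalloping from ripple firing appears as periodic notches in the high-frequency spectrum, and one common way to expose such periodicity is cepstral analysis of the acoustic record. The sketch below is illustrative only and is not the authors' processing chain; the synthetic signal, inter-shot delay, and sampling rate are all assumptions.

```python
import numpy as np

fs = 1000.0                       # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic "ripple-fired" acoustic record: several delayed copies of a pulse.
delay = 0.1                       # assumed inter-shot delay (s)
pulse = np.exp(-((t - 0.2) ** 2) / (2 * 0.005 ** 2))
signal = sum(np.roll(pulse, int(k * delay * fs)) for k in range(5))
signal += 0.05 * np.random.default_rng(0).standard_normal(t.size)

# Real cepstrum: inverse FFT of the log power spectrum. Periodic spectral
# notches (scalloping) map to a peak at the quefrency equal to the delay.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))
quefrency = np.arange(cepstrum.size) / fs

lo, hi = int(0.02 * fs), int(0.5 * fs)
peak = quefrency[np.argmax(cepstrum[lo:hi]) + lo]
print(f"estimated inter-shot delay: {peak:.3f} s (true value {delay} s)")
```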

  6. Efficient Secure and Verifiable Outsourcing of Matrix Multiplications

    E-print Network

    Efficient Secure and Verifiable Outsourcing of Matrix Multiplications. Yihua Zhang and Marina Blanton. Abstract: With the emergence of cloud computing services, a resource-constrained client can outsource its computation ...; this is called verifiable delegation or verifiable outsourcing. Furthermore, the data used in the computation may ...

  7. Efficient Secure and Verifiable Outsourcing of Matrix Multiplications

    E-print Network

    Blanton, Marina

    Efficient Secure and Verifiable Outsourcing of Matrix Multiplications. Yihua Zhang and Marina Blanton. Abstract: With the emergence of cloud computing services, a resource-constrained client can outsource its computation ...; this is called verifiable delegation or verifiable outsourcing. Furthermore, the data used in the computation may ...

  8. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...2014-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial...Privacy Act 802.13 Verifying your identity. (a) Requests for your own records...about yourself, you must verify your identity. You must state your full...

  9. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...2012-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial...Privacy Act 802.13 Verifying your identity. (a) Requests for your own records...about yourself, you must verify your identity. You must state your full...

  10. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...2013-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial...Privacy Act 802.13 Verifying your identity. (a) Requests for your own records...about yourself, you must verify your identity. You must state your full...

  11. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...2011-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial...Privacy Act 802.13 Verifying your identity. (a) Requests for your own records...about yourself, you must verify your identity. You must state your full...

  12. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...2010-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial...Privacy Act 802.13 Verifying your identity. (a) Requests for your own records...about yourself, you must verify your identity. You must state your full...

  13. Third Quarter Hanford Seismic Report for Fiscal Year 2005

    SciTech Connect

    Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

    2005-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event on May 25 near Vantage, Washington. During the third quarter, stratigraphically, 17 (85%) events occurred in the Columbia River basalt (approximately 0-5 km), no events occurred in the pre-basalt sediments (approximately 5-10 km), and three (15%) occurred in the crystalline basement (approximately 10-25 km). During the third quarter, geographically, five (25%) earthquakes occurred in swarm areas, 10 (50%) earthquakes were associated with a major geologic structure, and 5 (25%) were classified as random events.

  14. 3-D Seismic Methods for Shallow Imaging Beneath Pavement

    E-print Network

    Miller, Brian

    2013-05-31

    The research presented in this dissertation focuses on survey design and acquisition of near-surface 3D seismic reflection and surface wave data on pavement. Increased efficiency for mapping simple subsurface interfaces through a combined use...

  15. Acoustic and seismic signal processing for footstep detection

    E-print Network

    Bland, Ross E. (Ross Edward)

    2006-01-01

    The problem of detecting footsteps using acoustic and seismic sensors is approached from three different angles in this thesis. First, accelerometer data processing systems are designed to make footsteps more apparent to ...

  16. The retrofitting of existing buildings for seismic criteria

    E-print Network

    Besing, Christa, 1978-

    2004-01-01

    This thesis describes the process for retrofitting a building for seismic criteria. It explains the need for a new, performance-based design code to provide a range of acceptable building behavior. It then outlines the ...

  17. Seismic Hazard Characterization at the DOE Savannah River Site (SRS): Status report

    SciTech Connect

    Savy, J.B.

    1994-06-24

    The purpose of the Seismic Hazard Characterization project for the Savannah River Site (SRS-SHC) is to develop estimates of the seismic hazard for several locations within the SRS. Given the differences in the geology and geotechnical characteristics at each location, the estimates of the seismic hazard are to allow for the specific local conditions at each site. Characterization of seismic hazard is a critical factor for the design of new facilities as well as for the review and potential retrofit of existing facilities at SRS. The scope of the SRS seismic hazard characterization reported in this document is limited to the Probabilistic Seismic Hazard Analysis (PSHA). The goal of the project is to provide seismic hazard estimates based on a state-of-the-art method which is consistent with developments and findings of several ongoing studies which are deemed to bring improvements in the state of the seismic hazard analyses.

  18. Utilizing synthesis to verify Boolean function models

    Microsoft Academic Search

    Azam Beg; P. W. Chandana Prasad; Walid Ibrahim; Emad Abu Shama

    2008-01-01

    In this paper, we compare two different Boolean function reduction methods in order to justify the analytical model of the Monte Carlo data for Boolean function complexity. We use a binary decision diagram (BDD) complexity model (proposed earlier) and weigh it against the complexity behavior generated by Synopsys Design Compiler (DC). We use this synthesis tool (that utilizes a standard

  19. MULTI-HAZARD (BLAST, SEISMIC, TSUNAMIS, COLLISION) RESISTANT BRIDGE PIERS David Keller

    E-print Network

    Bruneau, Michel

    MULTI-HAZARD (BLAST, SEISMIC, TSUNAMIS, COLLISION) RESISTANT BRIDGE PIERS David Keller Structural of these constraints drives the development of innovative multi-hazard design concepts. This paper presents the results piers retrofitted with steel jackets, both designed to be ductile from a seismic design perspective. LS

  20. Frictional melting of peridotite and seismic slip

    Microsoft Academic Search

    P. Del Gaudio; G. Di Toro; R. Han; T. Hirose; S. Nielsen; T. Shimamoto; A. Cavallo

    2009-01-01

    The evolution of the frictional strength along a fault at seismic slip rates (about 1 m/s) is a key factor controlling earthquake mechanics. At mantle depths, friction-induced melting and melt lubrication may influence earthquake slip and seismological data. We report on laboratory experiments designed to investigate dynamic fault strength and frictional melting processes in mantle rocks. We performed 20 experiments

  1. Improved seismic details for highway bridges

    Microsoft Academic Search

    James E. Roberts

    1995-01-01

    Five years have passed since the Governor's Board of Enquiry into the cause of structure failures during the Loma Prieta earthquake issued its final report with the warning title Competing Against Time. California Department of Transportation (Caltrans) staff engineers, consulting firms, independent peer review teams, and university researchers have co-operated in an unprecedented, accelerated program of research-based bridge seismic design

  2. Seismic data used to predict formation pressures

    Microsoft Academic Search

    1992-01-01

    A new set of equations helps estimate formation fluid pressures and minimum fracture pressures in liquid-filled, overpressured, soft rock areas before any wells are drilled in the area. This paper reports on the calculation method, which uses reflection seismic data to make the estimates; these should be helpful for the initial design of mud weight and casing programs. These equations

  3. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-04-01

    In this report we will show some new Q-related seismic attributes on the Burlington-Seitel data set. One example will be called the Energy Absorption Attribute (EAA) and is based on a spectral analysis. The EAA algorithm is designed to detect a sudden increase in the rate of exponential decay in the relatively higher-frequency portion of the spectrum. In addition, we will show results from a hybrid attribute that combines attenuation with relative acoustic impedance to give a better indication of commercial gas saturation.
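
    As described, the EAA attribute quantifies the rate of exponential decay of the higher-frequency part of the spectrum and looks for a sudden increase in that rate. A minimal, illustrative way to estimate such a decay rate in a single window is to fit a line to the log amplitude spectrum over a high-frequency band; the band limits, sample rate, and synthetic trace below are assumptions, not the Burlington-Seitel parameters or the published EAA algorithm.

```python
import numpy as np

fs = 500.0                                   # assumed sample rate (Hz)
rng = np.random.default_rng(1)
trace = rng.standard_normal(512)             # stand-in for a windowed seismic trace

# Amplitude spectrum of the tapered window.
freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
amp = np.abs(np.fft.rfft(trace * np.hanning(trace.size)))

# Fit log-amplitude vs. frequency over an assumed "higher-frequency" band;
# the negative of the slope is the exponential decay rate used as the attribute.
band = (freqs >= 80.0) & (freqs <= 200.0)
slope, _ = np.polyfit(freqs[band], np.log(amp[band] + 1e-12), 1)
decay_rate = -slope
print(f"high-frequency spectral decay rate: {decay_rate:.4f} per Hz")

# Computed window-by-window along a trace, a sudden increase in decay_rate
# would flag the kind of anomaly the EAA attribute is designed to detect.
```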

  4. Effects of Large and Small-Source Seismic Surveys on Marine Mammals and Sea Turtles

    NASA Astrophysics Data System (ADS)

    Holst, M.; Richardson, W. J.; Koski, W. R.; Smultea, M. A.; Haley, B.; Fitzgerald, M. W.; Rawson, M.

    2006-05-01

    L-DEO implements a marine mammal and sea turtle monitoring and mitigation program during its seismic surveys. The program consists of visual observations, mitigation, and/or passive acoustic monitoring (PAM). Mitigation includes ramp-ups, power-downs, and shutdowns of the seismic source if marine mammals or turtles are detected in or about to enter designated safety radii. Visual observations for marine mammals and turtles have taken place during all 11 L-DEO surveys since 2003, and PAM was done during five of those. Large sources were used during six cruises (10 to 20 airguns; 3050 to 8760 in³; PAM during four cruises). For two interpretable large-source surveys, densities of marine mammals were lower during seismic than non-seismic periods. During a shallow-water survey off Yucatán, delphinid densities during non-seismic periods were 19x higher than during seismic; however, this number is based on only 3 sightings during seismic and 11 sightings during non-seismic. During a Caribbean survey, densities were 1.4x higher during non-seismic. The mean closest point of approach (CPA) for delphinids for both cruises was significantly farther during seismic (1043 m) than during non-seismic (151 m) periods (Mann-Whitney U test, P < 0.001). Large whales were only seen during the Caribbean survey; mean CPA during seismic was 1722 m compared to 1539 m during non-seismic, but sample sizes were small. Acoustic detection rates with and without seismic were variable for three large-source surveys with PAM, with rates during seismic ranging from 1/3 to 6x those without seismic (n = 0 for the fourth survey). The mean CPA for turtles was closer during non-seismic (139 m) than seismic (228 m) periods (P < 0.01). Small-source surveys used up to 6 airguns or 3 GI guns (75 to 1350 in³). During a Northwest Atlantic survey, delphinid densities during seismic and non-seismic were similar. However, in the Eastern Tropical Pacific, delphinid densities during non-seismic were 2x those during seismic. During a survey in Alaska, densities of large whales were 4.5x greater during non-seismic than seismic. In contrast, densities of Dall's porpoise were ~2x greater during seismic than during non-seismic; they also approached closer to the vessel during seismic (622 m) than during non-seismic (1044 m), though not significantly so (P = 0.16). CPAs for all other marine mammal groups sighted during small-source surveys were similar during seismic and non-seismic. For the one small-source survey with PAM, the acoustic detection rate during seismic was 1/3 of that without seismic. The mean CPA for turtles was 120 m during non-seismic and 285 m during seismic periods (P < 0.001). The large-source results suggest that, with operating airguns, some cetaceans tended to avoid the immediate area but often continued calling. Some displacement was also apparent during three interpretable small-source surveys, but the evidence was less clear than for large-source surveys. With both large and small sources, although some cetaceans avoided the airguns and vessel, others came to bowride during seismic operations. Sea turtles showed localized avoidance during large and small-source surveys.
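
    The CPA comparisons above rely on the nonparametric Mann-Whitney U test. A minimal sketch of how such a comparison could be run on two samples of closest-approach distances is shown below, using made-up distance values (the record only reports the means and p-values, not the per-sighting data).

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical CPA samples (metres); the actual per-sighting distances
# behind the reported means are not given in the record.
cpa_seismic = np.array([850, 1200, 990, 1430, 760, 1100, 1310])
cpa_nonseismic = np.array([120, 180, 95, 210, 160, 140, 130])

# Two-sided Mann-Whitney U test, as used for the delphinid CPA comparison.
stat, p = mannwhitneyu(cpa_seismic, cpa_nonseismic, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```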

  5. Seismic station, USGS Northern California Seismic Network

    USGS Multimedia Gallery

    Traditional seismic stations such as this one require a source of power (solar here), a poured concrete foundation and several square feet of space. They are not always practical to install in urban areas, and that's where NetQuakes comes in....

  6. Seismic Hazard of Eritrea

    NASA Astrophysics Data System (ADS)

    Hagos, L.; Arvidsson, R.

    2003-04-01

    The method of spatially smoothed seismicity developed by Frankel (1995), and later extended by Lapajne et al. (1997), is applied to estimate the seismic hazard of Eritrea. The extended method, unlike the original one, involves the delineation of the whole region into subregions with statistically determined directions of seismogenic faults pertaining to the respective tectonic regions (Poljak, 2000). Fault-rupture-oriented elliptical Gaussian smoothing results in spatial models of expected seismicity. A seismic catalogue was compiled from ISC, NEIC, and Turyomurgyendo (1996) and homogenized to Ms. Three seismicity models suggested by Frankel (1995), which are based on different time and magnitude intervals, are used in this approach, and a fourth model suggested by Lapajne et al. (2000), which is based on the seismic energy release, is also used to enhance the influence of historical events on the hazard computation. Activity rates and maximum likelihood estimates of b-values for the different models are computed using the OHAZ program. The western part of the region shows no seismic activity. The b-value for models 1-3 is estimated to be 0.91. Mmax has been estimated to be 7.0. Correlation distances are obtained objectively from the location error in the seismic catalogue. The attenuation relationship by Ambraseys et al. (1996) was found suitable for the region under study. PGA values for 10% probability of exceedance in 50 years (return period of 475 years) are computed for each model, and a combined seismic hazard map was produced by subjectively assigning weights to each of the models. A worst-case map is also obtained, showing the highest PGA values at each location from the four hazard maps. The map indicates a higher hazard along the main tectonic features of the East African and Red Sea rift systems, with the highest PGA values within Eritrea, exceeding 25% of g, located north of the Red Sea port of Massawa. In areas around Asmara, PGA values exceed 10% of g.
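
    Two of the quantities mentioned here, the maximum-likelihood b-value and the 475-year return period implied by a 10% exceedance probability in 50 years, can be reproduced with short calculations. The sketch below uses the standard Aki-Utsu estimator and hypothetical magnitudes; it is not the OHAZ implementation.

```python
import numpy as np

# Maximum-likelihood (Aki-Utsu) b-value from magnitudes above the completeness
# threshold m_min. The magnitudes here are hypothetical.
mags = np.array([4.1, 4.3, 4.0, 4.7, 5.1, 4.2, 4.4, 4.9, 4.0, 4.6])
m_min = 4.0
b_value = np.log10(np.e) / (mags.mean() - m_min)
print(f"b-value (Aki-Utsu): {b_value:.2f}")

# Return period corresponding to a 10% probability of exceedance in 50 years
# (the 475-year value quoted in the abstract), assuming Poissonian occurrence.
p, t = 0.10, 50.0
return_period = -t / np.log(1.0 - p)
print(f"return period: {return_period:.0f} years")
```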

  7. Empirical correlation verifies true formation skin

    SciTech Connect

    Kutasov, I.M. [MultiSpectrum Technologies, Santa Monica, CA (United States)

    1995-04-03

    To determine formation (true) skin and the rate-dependent skin, a semi-theoretical equation is proposed for relating the critical value of flow rate (q_c) to formation permeability, formation porosity, and gas/oil dynamic viscosity. An accurate evaluation of skin is important for designing remedial treatments or evaluating gas well productivity. Three examples illustrate the proposed equation. In all cases, the actual gas/oil flow rates are compared with the calculated critical flow rate.

  8. Numerical simulation on seismic retrofitting performance of reinforced concrete columns strengthened with fibre reinforced polymer sheets

    Microsoft Academic Search

    Zhishen Wu; Dachang Zhang; Vistasp M. Karbhari

    2010-01-01

    This paper evaluates the seismic performance of reinforced concrete columns retrofitted with fibre reinforced polymer (FRP) sheets through numerical simulations of the loaddeformation response using two-dimensional finite element analysis (2D-FEA). The relatively rational mesh configuration is verified through comparison of analysis results obtained from the different mesh configurations. The seismic performance of three reinforced concrete (RC) columns strengthened with FRP

  9. Seismic requalification of a safety class crane

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1991-01-01

    A remotely operated 5-ton crane within a nuclear fuel handling facility was designed and constructed over 25 years ago. At that time, less severe design criteria, particularly on seismic loadings, were in use. This crane is being reactivated and requalified under new design criteria with loads including a site specific design basis earthquake. Detailed analyses of the crane show that the maximum stress coefficient is less than 90% of the code allowable, indicating that this existing crane is able to withstand loadings including those from the design basis earthquake. 3 refs., 8 figs., 2 tabs.

  10. Seismic exploration for water on Mars

    NASA Technical Reports Server (NTRS)

    Page, Thornton

    1987-01-01

    It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present surface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each is well formed on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.

  11. Characterization of the Virgo Seismic Environment

    E-print Network

    The Virgo Collaboration; T. Accadia; F. Acernese; P. Astone; G. Ballardin; F. Barone; M. Barsuglia; A. Basti; Th. S. Bauer; M. Bebronne; M. G. Beker; A. Belletoile; M. Bitossi; M. A. Bizouard; M. Blom; F. Bondu; L. Bonelli; R. Bonnand; V. Boschi; L. Bosi; B. Bouhou; S. Braccini; C. Bradaschia; M. Branchesi; T. Briant; A. Brillet; V. Brisson; T. Bulik; H. J. Bulten; D. Buskulic; C. Buy; G. Cagnoli; E. Calloni; B. Canuel; F. Carbognani; F. Cavalier; R. Cavalieri; G. Cella; E. Cesarini; O. Chaibi; E. Chassande-Mottin; A. Chincarini; A. Chiummo; F. Cleva; E. Coccia; P. -F. Cohadon; C. N. Colacino; J. Colas; A. Colla; M. Colombini; A. Conte; M. Coughlin; J. -P. Coulon; E. Cuoco; S. DAntonio; V. Dattilo; M. Davier; R. Day; R. De Rosa; G. Debreczeni; W. Del Pozzo; M. del Prete; L. Di Fiore; A. Di Lieto; M. Di Paolo Emilio; A. Di Virgilio; A. Dietz; M. Drago; G. Endroczi; V. Fafone; I. Ferrante; F. Fidecaro; I. Fiori; R. Flaminio; L. A. Forte; J. -D. Fournier; J. Franc; S. Frasca; F. Frasconi; M. Galimberti; L. Gammaitoni; F. Garufi; M. E. Gaspar; G. Gemme; E. Genin; A. Gennai; A. Giazotto; R. Gouaty; M. Granata; C. Greverie; G. M. Guidi; J. -F. Hayau; A. Heidmann; H. Heitmann; P. Hello; P. Jaranowski; I. Kowalska; A. Krolak; N. Leroy; N. Letendre; T. G. F. Li; N. Liguori; M. Lorenzini; V. Loriette; G. Losurdo; E. Majorana; I. Maksimovic; N. Man; M. Mantovani; F. Marchesoni; F. Marion; J. Marque; F. Martelli; A. Masserot; C. Michel; L. Milano; Y. Minenkov; M. Mohan; N. Morgado; A. Morgia; S. Mosca; B. Mours; L. Naticchioni; F. Nocera; G. Pagliaroli; L. Palladino; C. Palomba; F. Paoletti; M. Parisi; A. Pasqualetti; R. Passaquieti; D. Passuello; G. Persichetti; F. Piergiovanni; M. Pietka; L. Pinard; R. Poggiani; M. Prato; G. A. Prodi; M. Punturo; P. Puppo; D. S. Rabeling; I. Racz; P. Rapagnani; V. Re; T. Regimbau; F. Ricci; F. Robinet; A. Rocchi; L. Rolland; R. Romano; D. Rosinska; P. Ruggi; B. Sassolas; D. Sentenac; L. Sperandio; R. Sturani; B. Swinkels; M. Tacca; L. Taffarello; A. Toncelli; M. Tonelli; O. Torre; E. Tournefier; F. Travasso; G. Vajente; J. F. J. van den Brand; C. Van Den Broeck; S. van der Putten; M. Vasuth; M. Vavoulidis; G. Vedovato; D. Verkindt; F. Vetrano; A. Vicere; J. -Y. Vinet; S. Vitale; H. Vocca; R. L. Ward; M. Was; M. Yvert; A. Zadrozny; J. -P. Zendri

    2011-08-08

    The Virgo gravitational wave detector is an interferometer (ITF) with 3 km arms located in Pisa, Italy. From July to October 2010, Virgo performed its third science run (VSR3) in coincidence with the LIGO detectors. Despite several techniques adopted to isolate the interferometer from the environment, seismic noise remains an important issue for Virgo. Vibrations produced by the detector infrastructure (such as air conditioning units, water chillers/heaters, and pumps) are found to affect Virgo's sensitivity, with the main coupling mechanisms being beam jitter and scattered light processes. The Advanced Virgo (AdV) design seeks to reduce ITF couplings to environmental noise by having most vibration-sensitive components suspended and in-vacuum, as well as by muffling and relocating loud machines. During the months of June and July 2010, a Guralp-3TD seismometer was stationed at various locations around the Virgo site hosting major infrastructure machines. Seismic data were examined using spectral and coherence analysis with seismic probes close to the detector. The primary aim of this study was to identify noisy machines which seismically affect the ITF environment and thus require mitigation attention. The analyzed machines are located at various distances from the experimental halls, ranging from 10 m to 100 m. An attempt is made to measure the attenuation of the emitted noise at the ITF and correlate it to the distance from the source and to seismic attenuation models in soil.
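
    The study relies on spectral and coherence analysis between the roving seismometer and probes near the ITF. A minimal illustration of that step with scipy is sketched below; the sampling rate, machine line frequency, and synthetic signals are assumptions, not Virgo data.

```python
import numpy as np
from scipy.signal import coherence

fs = 256.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 600, 1.0 / fs)             # 10 minutes of synthetic data
rng = np.random.default_rng(2)

# Synthetic machine vibration at ~24.8 Hz recorded by the roving seismometer,
# partially present (attenuated, noisier) at a probe near the detector.
machine = np.sin(2 * np.pi * 24.8 * t) + 0.5 * rng.standard_normal(t.size)
probe = 0.1 * np.sin(2 * np.pi * 24.8 * t) + 1.0 * rng.standard_normal(t.size)

# Magnitude-squared coherence; a peak at the machine line identifies it as a
# candidate source that seismically affects the ITF environment.
f, Cxy = coherence(machine, probe, fs=fs, nperseg=4096)
print(f"peak coherence {Cxy.max():.2f} at {f[np.argmax(Cxy)]:.1f} Hz")
```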

  12. Gravity of the New Madrid seismic zone; a preliminary study

    USGS Publications Warehouse

    Langenheim, V.E.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Mo. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  13. Discussing Seismic Data

    USGS Multimedia Gallery

    USGS scientists Debbie Hutchinson and Jonathan Childs discuss collected seismic data. This image was taken on U.S. Coast Guard Cutter Healy and was during a scientific expedition to map the Arctic seafloor....

  14. A procedure for seismic risk reduction in Campania Region

    SciTech Connect

    Zuccaro, G. [Study Centre PLINIVS-University of Naples 'Federico II', via Toledo, 402 - I 80134 - Naples (Italy); Palmieri, M.; Cicalese, S.; Grassi, V.; Rauci, M. [Campania Region-Civil Protection Office-Centro Direzionale, is. C3 - 80 143 - Naples (Italy); Maggio, F. [Campania Region-Public Works Office-via Cesare Battisti, 30 - I 8100, Caserta (Italy)

    2008-07-08

    The Campania Region has set up and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. In this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of the seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out the seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies in order to approve the safety checks produced. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At the moment, several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy to mitigate seismic risk set by the Campania Region was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from previously performed vulnerability data and studies, was selected for immediate retrofitting design. Secondly, another set of buildings was identified for structural strengthening. These were selected using the criteria specified in the guidelines prepared by the scientific committee and based on data obtained from the first set of safety checks. The strengthening philosophy adopted in the projects will be described in the paper.

  15. Shake It Up! Engineering for Seismic Waves

    NSDL National Science Digital Library

    2014-09-18

    Students learn about how engineers design and build shake tables to test the ability of buildings to withstand the various types of seismic waves generated by earthquakes. Just like engineers, students design and build shake tables to test their own model buildings made of toothpicks and mini marshmallows. Once students are satisfied with the performance of their buildings, they put them through a one-minute simulated earthquake challenge.

  16. Automating Shallow Seismic Imaging

    Microsoft Academic Search

    Steeples; Don W

    2004-01-01

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate

  17. AUTOMATING SHALLOW SEISMIC IMAGING

    SciTech Connect

    Steeples, Don W.

    2003-09-14

    The current project is a continuation of an effort to develop ultrashallow seismic imaging as a cost-effective method potentially applicable to DOE facilities. The objective of the present research is to develop and demonstrate the use of a cost-effective, automated method of conducting shallow seismic surveys, an approach that represents a significant departure from conventional seismic-survey field procedures. Initial testing of a mechanical geophone-planting device suggests that large numbers of geophones can be placed both quickly and automatically. The development of such a device could make the application of SSR considerably more efficient and less expensive. The imaging results obtained using automated seismic methods will be compared with results obtained using classical seismic techniques. Although this research falls primarily into the field of seismology, for comparison and quality-control purposes, some GPR data will be collected as well. In the final year of the research, demonstration surveys at one or more DOE facilities will be performed. An automated geophone-planting device of the type under development would not necessarily be limited to the use of shallow seismic reflection methods; it also would be capable of collecting data for seismic-refraction and possibly for surface-wave studies. Another element of our research plan involves monitoring the cone of depression of a pumping well that is being used as a proxy site for fluid-flow at a contaminated site. Our next data set will be collected at a well site where drawdown equilibrium has been reached. Noninvasive, in-situ methods such as placing geophones automatically and using near-surface seismic methods to identify and characterize the hydrologic flow regimes at contaminated sites support the prospect of developing effective, cost-conscious cleanup strategies for DOE and others.

  18. Passive seismic experiment

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

    1972-01-01

    The establishment of a network of seismic stations on the lunar surface as a result of equipment installed by Apollo 12, 14, and 15 flights is described. Four major discoveries obtained by analyzing seismic data from the network are discussed. The use of the system to detect vibrations of the lunar surface and the use of the data to determine the internal structure, physical state, and tectonic activity of the moon are examined.

  19. Seismic surveys test on Innerhytta Pingo, Adventdalen, Svalbard Islands

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Rossi, Giuliana; Petronio, Lorenzo; Accaino, Flavio; Romeo, Roberto; Wheeler, Walter

    2015-04-01

    We present the preliminary results of an experimental full-wave seismic survey test conducted on the Innerhytta Pingo, located in Adventdalen, Svalbard Islands, Norway. Several seismic survey methods were adopted in order to study the Pingo's inner structure, from classical reflection/refraction arrays to seismic tomography and surface wave analysis. The aim of the project IMPERVIA, funded by the Italian PNRA, was the evaluation of the permafrost characteristics beneath this open-system Pingo by means of seismic investigation, evaluating the best practice in terms of logistic deployment. The survey was done in April-May 2014: we collected 3 seismic lines with different spacing between receivers (from 2.5 m to 5 m), for a total length of more than 1 km. We collected data with different vertical geophones (with natural frequencies of 4.5 Hz and 14 Hz) as well as with a seismic snow-streamer. We tested different seismic sources (hammer, seismic gun, firecrackers and heavy weight drop), and we carefully verified geophone coupling in order to evaluate the different responses. In such peculiar conditions we found that firecrackers provide the best signal-to-noise ratio for refraction/reflection surveys. To ensure the best geophone coupling with the frozen soil, we dug snow pits to remove the snow-cover effect. On the other hand, for the surface wave methods, the very high velocity of the permafrost strongly limits the generation of long wavelengths, with both the explosive sources and the common sledgehammer. The only source capable of generating low frequencies was a heavy drop weight system, which allowed analysis of surface wave dispersion below 10 Hz. Preliminary data analysis shows marked velocity inversions and strong velocity contrasts at depth. The combined use of surface and body waves highlights the presence of a heterogeneous soil deposit level beneath a thick layer of permafrost. This is the level that hosts the water circulation from depth, controlling the evolution of the Pingo structure.

  20. Seismicity and strain accumulation around Karliova Triple Junction (Turkey)

    NASA Astrophysics Data System (ADS)

    Aktug, Bahadir; Dikmen, Unal; Dogru, Asli; Ozener, Haluk

    2013-07-01

    GPS studies in Turkey date back to the early 1990s, but were mostly focused on the seismically active North Anatolian Fault System (NAFS) or on the more populated Western Anatolia. Relatively few studies were made of the seismically less active East Anatolian Fault System (EAFS), although it has the potential to produce large earthquakes. In this study, we present the results of a combination of geodetic and seismological data around the Karliova Triple Junction (KTJ), which lies at the intersection of the North and East Anatolian Fault Systems. In particular, the geodetic slip rates obtained through block modeling of GPS velocities were compared with b-values to assess seismicity in the region. The Yedisu segment, one of the best-known seismic gaps in Turkey, was specifically analyzed. The relatively low b-values across the Yedisu segment verify the accumulation of seismic energy in this segment, and the GPS-derived geodetic slip rates suggest that it has the potential to produce an earthquake of Mw 7.5 across an 80-km rupture zone. Additionally, analysis of earthquake data reveals that the study area has a ductile or rigid-ductile behavior with respect to its surroundings, characterized by varying b-values. Although seismic events of moderate to high magnitude are confined along the major fault zones, there are also low-seismicity zones along the eastern part of the Bitlis Suture Zone and around Yedisu. Since the high-seismicity areas within the region may not accumulate sufficient stress for a large earthquake to occur, it is considered that the deformation in such areas occurs in a ductile manner. On the other hand, the areas characterized by low b-values may have the capacity for stress accumulation, which could lead to brittle deformation.
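
    One way to see how a geodetically derived slip rate can translate into an earthquake potential of roughly this size is a simple seismic-moment calculation. Every number below (slip rate, elapsed time, seismogenic width, rigidity) is a hypothetical value chosen only to illustrate the arithmetic; none is a figure from the study.

```python
import math

# Hypothetical inputs for a locked fault segment.
slip_rate_m_per_yr = 0.010      # 10 mm/yr geodetic slip rate (assumed)
locked_years = 250.0            # time since the last large rupture (assumed)
rupture_length_m = 80e3         # 80 km rupture length (as quoted for Yedisu)
rupture_width_m = 15e3          # seismogenic width (assumed)
rigidity_pa = 3.0e10            # crustal rigidity (typical value)

# Accumulated slip deficit and the corresponding seismic moment M0 = mu * A * D.
slip_deficit_m = slip_rate_m_per_yr * locked_years
moment_nm = rigidity_pa * rupture_length_m * rupture_width_m * slip_deficit_m

# Moment magnitude Mw = (2/3) * (log10 M0 - 9.05), with M0 in N*m.
mw = (2.0 / 3.0) * (math.log10(moment_nm) - 9.05)
print(f"slip deficit: {slip_deficit_m:.2f} m, Mw ~ {mw:.1f}")
```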

  1. USGS National Seismic Hazard Maps

    NSDL National Science Digital Library

    This set of resources provides seismic hazard assessments and information on design values and mitigation for the U.S. and areas around the world. Map resources include the U.S. National and Regional probabilistic ground motion map collection, which covers the 50 states, Puerto Rico, and selected countries. These maps display peak ground acceleration (PGA) values, and are used as the basis for seismic provisions in building codes and for new construction. There is also a custom mapping and analysis tool, which enables users to re-plot these maps for area of interest, get hazard values using latitude/longitude or zip code, find predominant magnitudes and distances, and map the probability of given magnitude within a certain distance from a site. The ground motion calculator, a Java application, determines hazard curves, uniform hazard response spectra, and design parameters for sites in the 50 states and most territories. There is also a two-part earthquake hazards 'primer', which provides links to hazard maps and frequently-asked-questions, and more detailed information for building and safety planners.

  2. Synthesis of artificial spectrum-compatible seismic accelerograms

    NASA Astrophysics Data System (ADS)

    Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

    2014-08-01

    The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis. Moreover, it requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, not many actual records of different seismic intensities are available for many regions. In addition, a large number of seismic accelerograms are required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of the structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which could be useful in earthquake engineering for the dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert-Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics, such as different frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature.
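
    The decomposition into amplitude and frequency components is based on the Hilbert-Huang transform. As a much-reduced illustration of the Hilbert step only (omitting the empirical mode decomposition that the full method requires), the following sketch extracts an amplitude envelope and instantaneous frequency from a synthetic record; the signal and sampling rate are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

fs = 100.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 20.0, 1.0 / fs)

# Synthetic narrow-band "record": a decaying 2 Hz oscillation.
record = np.exp(-0.1 * t) * np.sin(2 * np.pi * 2.0 * t)

# Analytic signal via the Hilbert transform.
analytic = hilbert(record)
envelope = np.abs(analytic)                            # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs          # instantaneous frequency (Hz)

print(f"mean instantaneous frequency: {inst_freq.mean():.2f} Hz")
print(f"peak envelope amplitude: {envelope.max():.2f}")
```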

  3. Compliant liquid column damper modified by shape memory alloy device for seismic vibration control

    NASA Astrophysics Data System (ADS)

    Gur, Sourav; Mishra, Sudib Kumar; Bhowmick, Sutanu; Chakraborty, Subrata

    2014-10-01

    Liquid column dampers (LCDs) have long been used for the seismic vibration control of flexible structures. In contrast, tuning LCDs to short-period structures poses difficulty. Various modifications have been proposed to the original LCD configuration to improve its performance in relatively stiff structures. One such system, referred to as a compliant LCD, has been proposed recently by connecting the LCD to the structure with a spring. In this study, an improvement is attempted in compliant LCDs by replacing the linear spring with a spring made of shape memory alloy (SMA). Considering the dissipative, super-elastic force-deformation hysteresis of SMA, triggered by stress-induced micro-structural phase transition, the performance is expected to improve further. The optimum parameters for the SMA-compliant LCD are obtained through design optimization, which is based on a nonlinear random vibration response analysis via stochastic linearization of the force-deformation hysteresis of the SMA and the dissipation by liquid motion through an orifice. Substantially enhanced performance of the SMA-LCD over a conventional compliant LCD is demonstrated, the consistency of which is further verified under recorded ground motions. The robustness of the improved performance is also validated by a parametric study concerning the anticipated variations in system parameters as well as variability in seismic loading.

  4. Seismic hazard assessment in Greece: Revisited

    NASA Astrophysics Data System (ADS)

    Makropoulos, Kostas; Chousianitis, Kostas; Kaviris, George; Kassaras, Ioannis

    2013-04-01

    Greece is the most earthquake-prone country in the eastern Mediterranean territory and one of the most seismically active areas globally. Seismic Hazard Assessment (SHA) is a useful procedure to estimate the expected earthquake magnitude and the strong ground-motion parameters which are necessary for earthquake-resistant design. Several studies on the SHA of Greece are available, constituting the basis of the National Seismic Code. However, the recently available more complete, accurate and homogeneous seismological data (the new earthquake catalogue of Makropoulos et al., 2012), the revised seismic zones determined within the framework of the SHARE project (2012), new empirical attenuation formulas extracted for several regions in Greece, as well as new SHA algorithms, are innovations that motivated the present study. Here, the expected earthquake magnitude for Greece is evaluated by applying the zone-free, upper-bounded Gumbel's third asymptotic distribution of extreme values. The peak ground acceleration (PGA), velocity (PGV) and displacement (PGD) are calculated at the seismic bedrock using two methods: (a) Gumbel's first asymptotic distribution of extreme values, since it is valid for initial open-end distributions, and (b) the Cornell-McGuire approach, using the CRISIS2007 (Ordaz et al., 2007) software. The latter takes into account seismic source zones for which seismicity parameters are assigned following a Poisson recurrence model. Thus, each source is characterized by a series of seismic parameters, such as the magnitude recurrence and the recurrence rate for the threshold magnitude, while different predictive equations can be assigned to different seismic source zones. Recently available attenuation parameters were considered. Moreover, new attenuation parameters for the very seismically active Corinth Gulf, deduced during this study from recordings of the RASMON accelerometric array, were used. The hazard parameters, such as the most probable annual maximum earthquake magnitude (mode) and the maximum expected earthquake magnitude with 70% and 90% probability of not being exceeded in 50 and 100 years, are determined and compiled into a GIS mapping scheme. The data quality allowed the estimation of strong ground motion parameters (PGA, PGV and PGD) within cells of small dimensions of 0.25° × 0.25°. The results are discussed and compared with the ones obtained by other studies.
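
    Gumbel's first asymptotic distribution fitted to annual maxima yields directly the magnitudes quoted as "not exceeded with 70% or 90% probability in 50 or 100 years". A minimal sketch using scipy is shown below, with hypothetical annual-maximum magnitudes rather than the Makropoulos et al. (2012) catalogue.

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical annual-maximum magnitudes for a region (one value per year).
annual_max = np.array([4.8, 5.1, 4.6, 5.4, 4.9, 5.7, 5.0, 4.7, 5.2, 5.5,
                       4.9, 5.3, 4.6, 5.8, 5.1, 4.8, 5.0, 5.6, 4.7, 5.2])

# Fit the Gumbel I (right-skewed extreme value) distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_max)

# Magnitude not exceeded with probability P in T years (independent years):
# solve F(m)**T = P  =>  F(m) = P**(1/T)  =>  m = F^-1(P**(1/T)).
for P in (0.70, 0.90):
    for T in (50, 100):
        m = gumbel_r.ppf(P ** (1.0 / T), loc=loc, scale=scale)
        print(f"M not exceeded with {P:.0%} probability in {T} yr: {m:.1f}")
```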

  5. Possibility to determine parameters of seismic waves from the results of electric field measurements on the ocean floor

    Microsoft Academic Search

    B. A. Burov

    2008-01-01

    1. The possibility of separating seismic information from the variations of the electric field (EF) recorded in the sea with bottom cable antennas is shown on the basis of experimental data during a seismic perturbation (SP) related to a remote earthquake. The information about recording signals caused by seismic vibrations (SV) in stationary bottom antennas designed for measuring the EF

  6. Seismic deformation analysis of Tuttle Creek Dam Timothy D. Stark, Michael H. Beaty, Peter M. Byrne, Gonzalo Castro,

    E-print Network

    Seismic deformation analysis of Tuttle Creek Dam. Timothy D. Stark, Michael H. Beaty, Peter M. Byrne, and David L. Mathews. Abstract: To facilitate the design of seismic remediation for Tuttle Creek Dam in east central Kansas, a seismic finite difference analysis of the dam was performed using the software FLAC

  7. Suitable Structure of PM and Copper Plate Systems for Reducing Vibration Transmission and Improving Damping Effect in a Superconducting Seismic Isolation Device

    Microsoft Academic Search

    S. Sasaki; K. Shimada; M. Tsuda; T. Hamajima; N. Kawai; K. Yasui

    2011-01-01

    We have investigated the basic properties of levitation force and vibration transmission in a magnetic levitation type superconducting seismic isolation device. Since it is very difficult in a real seismic isolation device to keep the stationary levitation against any horizontal disturbances, we have devised a permanent magnet-permanent magnet (PM-PM) system with a copper plate and verified that the stable stationary levitation of the seismic isolation object

  8. Verifying Neutron Tomography Performance using Test Objects

    NASA Astrophysics Data System (ADS)

    Kaestner, A. P.; Lehmann, E. H.; Hovind, J.; Radebe, M. J.; de Beer, F. C.; Sim, C. M.

    In an effort to provide a standardized method to quantify the imaging capabilities of neutron imaging beam-lines worldwide, we propose a set of test objects for neutron tomography. The test objects are designed to quantify spatial resolution and material contrast in tomograms. The resolution samples aim at detecting a thin film embedded in a different material. Two samples with complementary material compositions are proposed for this purpose. The contrast sample has several insets of different materials. The measurements are proposed to be done using both radiography and tomography. The image processing methods needed to evaluate the performance of the reconstructed data are presented. The methods are automated to avoid subjective decisions by persons who evaluate the data. Experimental data to demonstrate the test objects and their analysis methods were acquired with the cold neutron imaging beam-line, ICON, in comparison with data from the thermal neutron facility, NEUTRA at Paul Scherrer Institut, Switzerland. This is a first initiative and is open for discussion among the participants to further improve the evaluation procedure.
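
    One observer-independent measure implied here for the contrast sample is a contrast-to-noise ratio between an inset region and the surrounding matrix in the reconstructed data. The sketch below shows one common definition applied to hypothetical regions of a reconstructed slice; it is not necessarily the specific metric adopted by the authors.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical reconstructed tomogram values (attenuation coefficients) sampled
# from an inset material and from the surrounding matrix.
inset = rng.normal(loc=0.85, scale=0.05, size=500)
matrix = rng.normal(loc=0.60, scale=0.05, size=500)

# A common contrast-to-noise ratio definition: difference of the region means
# divided by the pooled standard deviation of the two regions.
cnr = abs(inset.mean() - matrix.mean()) / np.sqrt(0.5 * (inset.var() + matrix.var()))
print(f"contrast-to-noise ratio: {cnr:.1f}")
```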

  9. Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8

    SciTech Connect

    Ivanka, Paskaleva [CLSMEE--BAS, 3 Acad G. Bonchev str, 1113 Sofia (Bulgaria); Mihaela, Kouteva [CLSMEE-BAS, 3 Acad G. Bonchev str, 1113 Sofia (Bulgaria); ESP-SAND, ICTP, Trieste (Italy); Franco, Vaccari [DST-University of Trieste, Via E. Weiss 4, 34127 Trieste (Italy); Panza, Giuliano F. [DST-University of Trieste, Via E. Weiss 4, 34127 Trieste (Italy); ESP-SAND, ICTP, Trieste (Italy)

    2008-07-08

    The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong motion database is quite limited, and therefore not representative at all of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure for two main Bulgarian cities has been capable of supplying a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking simultaneously and correctly into consideration the contributions to the earthquake ground motion at the target sites of both the seismic source and the seismic wave propagation in the crossed media. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

  10. 7 CFR 1780.57 - Design policies.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...intended for sheltering persons or property will be designed with appropriate seismic safety provisions in compliance with the Earthquake Hazards Reduction Act of 1977 (42 U.S.C. 7701 et seq.), and Executive Order 12699, Seismic Safety of...

  11. Seismic Safety Study

    SciTech Connect

    Tokarz, F J; Coats, D W

    2006-05-16

    During the past three decades, the Laboratory has been proactive in providing a seismically safe working environment for its employees and the general public. Completed seismic upgrades during this period have exceeded $30M, with over 24 buildings structurally upgraded. Nevertheless, seismic questions still frequently arise regarding the safety of existing buildings. To address these issues, a comprehensive study was undertaken to develop an improved understanding of the seismic integrity of the Laboratory's entire building inventory at the Livermore Main Site and Site 300. The completed study of February 2005 extended the results from the 1998 seismic safety study per Presidential Executive Order 12941, which required each federal agency to develop an inventory of its buildings and to estimate the cost of mitigating unacceptable seismic risks. Degenkolb Engineers, who performed the first study, was recontracted to perform structural evaluations, rank order the buildings based on their level of seismic deficiencies, and develop conceptual rehabilitation schemes for the most seriously deficient buildings. Their evaluation is based on screening procedures and guidelines as established by the Interagency Committee on Seismic Safety in Construction (ICSSC). Currently, there is an inventory of 635 buildings in the Laboratory's Facility Information Management System (FIMS) database, of which 58 buildings were identified by Degenkolb Engineers as requiring seismic rehabilitation. The remaining 577 buildings were judged to be adequate from a seismic safety viewpoint. The basis for these evaluations followed the seismic safety performance objectives of DOE standard (DOE STD 1020) Performance Category 1 (PC1). The 58 buildings were ranked according to three risk-based priority classifications (A, B, and C) as shown in Figure 1-1 (all 58 buildings have structural deficiencies). Table 1-1 provides a brief description of their expected performance and damage state following a major earthquake, rating the seismic vulnerability on a scale of 1-10, where 10 represents the highest and worst. Buildings in classifications A and B were judged to require the Laboratory's highest attention towards rehabilitation; classification C buildings could defer rehabilitation until a major remodel is undertaken. Strengthening schemes were developed by Degenkolb Engineers for the most seriously deficient A and B classifications (15 total), which the Laboratory's Plant Engineering Department used as its basis for rehabilitation construction cost estimates. A detailed evaluation of Building 2580, a strengthening scheme, and a construction cost estimate are pending. Specific details of the total estimated rehabilitation costs, a proposed 10-year seismic rehabilitation plan, exemption categories by building, DOE performance guidelines, cost comparisons for rehabilitation, and LLNL reports by Degenkolb Engineers are provided in Appendix A. Based on the results of Degenkolb Engineers' evaluations, along with the prevailing practice for the disposition of seismically deficient buildings and risk-based evaluations, it is concluded that there is no need to evacuate occupants from these 58 buildings prior to their rehabilitation.

  12. Results from the latest SN-4 multi-parametric benthic observatory experiment (MARsite EU project) in the Gulf of Izmit, Turkey: oceanographic, chemical and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Embriaco, Davide; Marinaro, Giuditta; Frugoni, Francesco; Giovanetti, Gabriele; Monna, Stephen; Etiope, Giuseppe; Gasperini, Luca; Çağatay, Namık; Favali, Paolo

    2015-04-01

    An autonomous and long-term multiparametric benthic observatory (SN-4) was designed to study gas seepage and seismic energy release along the submerged segment of the North Anatolian Fault (NAF). Episodic gas seepage occurs at the seafloor in the Gulf of Izmit (Sea of Marmara, NW Turkey) along this submerged segment of the NAF, which ruptured during the 1999 Mw 7.4 Izmit earthquake. The SN-4 observatory already operated in the Gulf of Izmit at the western end of the 1999 Izmit earthquake rupture for about one year at 166 m water depth during the 2009-2010 experiment (EGU2014-13412-1, EGU General Assembly 2014). SN-4 was re-deployed at the same site for a new long-term mission (September 2013 - April 2014) in the framework of the MARsite (New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite, http://marsite.eu/ ) EC project, which aims at evaluating seismic risk and managing long-term monitoring activities in the Marmara Sea. A main scientific objective of the SN-4 experiment is to investigate the possible correlations between seafloor methane seepage and release of seismic energy. We used the same site as the 2009-2010 campaign to verify both the occurrence of previously observed phenomena and the reliability of results obtained in the previous experiment (Embriaco et al., 2014, doi:10.1093/gji/ggt436). In particular, we are interested in the detection of gas release at the seafloor, in the role played by oceanographic phenomena in this detection, and in the association of gas and seismic energy release. The scientific payload included, among other instruments, a three-component broad-band seismometer, and gas and oceanographic sensors. We present a technical description of the observatory, including the data acquisition and control system, results from the preliminary analysis of this new multidisciplinary data set, and a comparison with the previous experiment.

  13. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that the data collection and processing are done in the "cloud" (Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.
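
    The privacy rule described for fixed and mobile sensors (display only counts per block-wide cell, and show mobile sensors only when at least two share a cell) can be illustrated with a few lines of aggregation code. The cell size, coordinates, and threshold below are assumptions for illustration; they are not the CSN implementation.

```python
from collections import Counter

# Hypothetical sensor locations (latitude, longitude) in the Pasadena area.
sensors = [(34.148, -118.144), (34.149, -118.145), (34.156, -118.131),
           (34.156, -118.132), (34.162, -118.120)]

CELL = 0.01          # assumed cell size in degrees (roughly block scale)
MIN_COUNT = 2        # show a cell only if at least two sensors fall in it

def cell_of(lat, lon, size=CELL):
    """Map a coordinate to the index of its grid cell."""
    return (int(lat // size), int(lon // size))

counts = Counter(cell_of(lat, lon) for lat, lon in sensors)

# Only cells meeting the anonymity threshold are displayed, and only as counts.
displayable = {cell: n for cell, n in counts.items() if n >= MIN_COUNT}
print(displayable)
```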

  14. Seismic Performance Evaluation of Concentrically Braced Frames

    NASA Astrophysics Data System (ADS)

    Hsiao, Po-Chien

    Concentrically braced frames (CBFs) are broadly used as lateral-load resisting systems in buildings throughout the US. In high seismic regions, special concentrically braced frames (SCBFs) are used where ductility under seismic loading is necessary. Their large elastic stiffness and strength efficiently sustain the seismic demands during smaller, more frequent earthquakes. During large, infrequent earthquakes, SCBFs exhibit highly nonlinear behavior due to brace buckling and yielding and the inelastic behavior induced by secondary deformation of the framing system. These response modes reduce the system demands relative to an elastic system without supplemental damping. In design, these reduced demands are estimated using a response modification coefficient, commonly termed the R factor. The R factor values are important to the seismic performance of a building. Procedures put forth in FEMA P695 were developed to evaluate R factors through a formalized procedure, with the objective of a consistent level of collapse potential for all building types. The primary objective of the research was to evaluate the seismic performance of SCBFs. To achieve this goal, an improved model for SCBFs, including a proposed gusset plate connection model, that permits accurate simulation of inelastic deformations of the brace, gusset plate connections, beams and columns, as well as brace fracture, was developed and validated using a large number of experiments. Response history analyses were conducted using the validated model. A series of SCBF buildings of different story heights were designed and evaluated. The FEMA P695 method and an alternate procedure were applied to SCBFs and NCBFs; NCBFs are designed without ductile detailing. The evaluation using the P695 method shows results contrary to the alternate evaluation procedure and to current knowledge, according to which short-story SCBF structures are more vulnerable than their taller counterparts and NCBFs are more vulnerable than SCBFs.

  15. Controlled Rocking System for Seismic Retrofit of Steel Truss Bridge Piers

    E-print Network

    Bruneau, Michel

    rocking approach to seismic resistance was implemented into the design of the South Rangitikei Rail Bridge, and 1995 Kobe earthquake in Japan have demonstrated the need for improved methods for the design little to no ductility and inadequate strength to resist seismic demands. Many other non-ductile failure

  16. Study on Application of Seismic Isolation System to ABWR-II Building

    Microsoft Academic Search

    Hideaki Saito; Hideo Tanaka; Atsuko Noguchi; Junji Suhara; Yasuaki Fukushima

    2004-01-01

    This paper reports the result of a study that evaluated the applicability of the seismic isolation system to nuclear power plants. The study focuses on possibilities of a standard design with improved seismic safety of building and equipment for ABWR-II. A base isolation system with laminated lead rubber bearing was applied in the study. Based on the structural design of

  17. Seismic response of base-isolated buildings using a viscoelastic model

    Microsoft Academic Search

    Uras

    1993-01-01

    Due to recent developments in elastomer technology,seismic isolation using elastomer bearings is rapidly gaining acceptance as a design tool to enhance structural seismic margins and to protect people and equipment from earthquake damage. With proper design of isolators, the fundamental frequency of the structure can be reduced to a value that is lower than the dominant frequencies of earthquake ground

  18. Specifying and Verifying Ultra-reliability and Fault-tolerance Properties

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.

    1983-01-01

    A methodology to rigorously verify ultrareliability and fault tolerance system properties is described. The methodology utilizes a hierarchy of formal mathematical specifications of system design and incremental design proof to prove the system has the desired properties. A small example of the approach is given, and the application of the methodology to the large scale proof of SIFT, a fault tolerant flight control operating system, is discussed.

  19. Seismic while drilling: Operational experiences in Viet Nam

    SciTech Connect

    Jackson, M.; Einchcomb, C.

    1997-03-01

    The BP/Statoil alliance in Viet Nam has used seismic while drilling on four wells during the last two years. Three wells employed the Western Atlas Tomex system, and the last well, Schlumberger's SWD system. Perceived value of seismic while drilling (SWD) lies in being able to supply real-time data linking drill bit position to a seismic picture of the well. However, once confidence in equipment and methodology is attained, SWD can influence well design and planning associated with drilling wells. More important, SWD can remove uncertainty when actually drilling wells, allowing risk assessment to be carried out more accurately and confidently.

  20. Investigations on local seismic phases and modeling of seismic signals

    Microsoft Academic Search

    B. Massinon; P. Mechler

    1993-01-01

    During this three years period of activity we have worked on the basic topic of our grant that is to say 'Investigations on local seismic phases and modeling of seismic signals' but we have also enlarged this research in some cases to teleseismic distances. We also worked on data processing of seismic waves recorded at regional distances by a mini-array

  1. Development of material measures for performance verifying surface topography measuring instruments

    NASA Astrophysics Data System (ADS)

    Leach, Richard; Giusca, Claudiu; Rickens, Kai; Riemer, Oltmann; Rubert, Paul

    2014-04-01

    The development of two irregular-geometry material measures for performance verifying surface topography measuring instruments is described. The material measures are designed to be used to performance verify tactile and optical areal surface topography measuring instruments. The manufacture of the material measures using diamond turning followed by nickel electroforming is described in detail. Measurement results are then obtained using a traceable stylus instrument and a commercial coherence scanning interferometer, and the results are shown to agree to within the measurement uncertainties. The material measures are now commercially available as part of a suite of material measures aimed at the calibration and performance verification of areal surface topography measuring instruments.

  2. Experimental Evaluation of Integrity of FBR Core under Seismic Events

    NASA Astrophysics Data System (ADS)

    Chellapandi, Perumal; Rajan Babu, Vinayagamoorthy; Puthiyavinayagam, Pillai; Chetal, Subhash Chander; Raj, Baldev

    The core of Prototype Fast Breeder Reactor (PFBR) is designed to produce 1250 MWt at full power. PFBR is under construction at Kalpakkam, India. In PFBR, the core is of free standing type and one of the major safety criteria for the design of core subassemblies is that the integrity of the core subassemblies should not be impaired and they should not be lifted up from the grid plate even during seismic condition. The net downward force acting on the grid plate is less than the weight of the subassembly due to the hydraulic lifting forces acting on it. Experimental analysis has been carried out to ensure that the subassembly does not get lifted off due to vertical seismic excitation. This paper gives the details of the methodology adopted for the experimental seismic analysis carried out on a core subassembly and the upward displacement of the subassembly under the combined effect of upward fluid force and vertical seismic excitations.
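
    A back-of-envelope check of the lift-off criterion described above can be written as a quasi-static vertical force balance; the mass, hydraulic force and acceleration below are invented illustration values, not PFBR design data.

    ```python
    # Illustrative quasi-static lift-off check for a free-standing core subassembly
    # (all numbers are invented; they are not PFBR design data). The criterion in the
    # abstract is that the net downward force on the grid plate must remain positive
    # under the combined hydraulic lifting force and vertical seismic excitation.
    G = 9.81  # m/s^2

    def lifts_off(mass_kg, hydraulic_lift_N, vertical_accel_g):
        """True if upward forces exceed the weight, i.e. the net hold-down force <= 0."""
        weight = mass_kg * G
        seismic_up = mass_kg * vertical_accel_g * G  # inertial force from vertical excitation
        return weight - hydraulic_lift_N - seismic_up <= 0.0

    # Hypothetical subassembly: 250 kg, 1.5 kN hydraulic lift, 0.3 g vertical acceleration.
    print(lifts_off(mass_kg=250.0, hydraulic_lift_N=1500.0, vertical_accel_g=0.3))  # stays seated
    ```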

  3. Magnitude correlations in global seismicity

    SciTech Connect

    Sarlis, N. V. [Solid State Section and Solid Earth Physics Institute, Physics Department, University of Athens, Panepistimiopolis, Zografos GR-157 84, Athens (Greece)

    2011-08-15

    By employing natural time analysis, we analyze the worldwide seismicity and study the existence of correlations between earthquake magnitudes. We find that global seismicity exhibits nontrivial magnitude correlations for earthquake magnitudes greater than Mw 6.5.

  4. Dealer-Leakage Resilient Verifiable Secret Sharing Ruxandra F. Olimid

    E-print Network

    .olimid@fmi.unibuc.ro September 19, 2014 Abstract Verifiable Secret Sharing (VSS) guarantees that honest parties reconstruct-Leakage Resilient Verifiable Secret Sharing (DLR-VSS) as a stronger notion of VSS that achieves security under this settings. We propose an efficient DLR-VSS and prove its properties in the semi-honest adversarial model. 1

  5. A Practical (Non-interactive) Publicly Verifiable Secret Sharing Scheme

    E-print Network

    . A publicly verifiable secret sharing (PVSS) scheme, proposed by Stadler in [Sta96], is a VSS scheme in which play essential roles in the systems using VSS. Achieving si- multaneously the following two features will not be able to gain any information about the secret. #12;The verifiable secret sharing (VSS) schemes

  6. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study?

    E-print Network

    -Vaandrager timed automata model, of the Steam Boiler Controller problem, a hybrid systems benchmark. This pa- perVerifying Hybrid Systems Modeled as Timed Automata: A Case Study? Presented at HART '97, Grenoble for specifying and verifying systems represented in terms of a speci c mathematical model. In 2 , we describe how

  7. Encrypted Receipts for Voter-Verified Elections Using Homomorphic Encryption

    E-print Network

    Goldwasser, Shafi

    Encrypted Receipts for Voter-Verified Elections Using Homomorphic Encryption by Joy Marie Forsythe by . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Arthur C. Smith Chairman, Department Committee on Graduate Students #12;2 #12;Encrypted Receipts for Voter-Verified Elections Using Homomorphic Encryption by Joy Marie Forsythe Submitted to the Department

  8. Efficient Private Techniques for Verifying Social Proximity Michael J. Freedman

    E-print Network

    Singh, Jaswinder Pal

    Efficient Private Techniques for Verifying Social Proximity Michael J. Freedman and Antonio privacy and security goals at a fraction of the cost of its current Private Matching [3] pro- tocol-fold. First, we describe and define a security model for verifying social con- nectedness in a privacy

  9. An Information Flow Verifier for Small Embedded Systems

    Microsoft Academic Search

    Dorina Ghindici; Gilles Grimaud; Isabelle Simplot-ryl

    2007-01-01

    Insecurity arising from illegal information flow represents a real threat in small computing environments allowing code sharing, dy- namic class loading and overloading. We introduce a verifier able to cer- tify at loading time Java applications already typed with signatures de- scribing possible information flows. The verifier is implemented as a class loader and can be used on any Java

  10. Towards Formally Verified Optimizing Compilation in Flight Control Software

    E-print Network

    Paris-Sud XI, Universit de

    simplifying pilots' tasks. Since these controls play a crucial role in flight safety, flight control softwareTowards Formally Verified Optimizing Compilation in Flight Control Software Ricardo Bedin Frana1 and verified optimizing compiler for the development of level A critical flight control software. First

  11. Verifying Red-Black Trees Paolo Baldan1

    E-print Network

    Baldan, Paolo

    Verifying Red-Black Trees Paolo Baldan1 , Andrea Corradini2 , Javier Esparza3 , Tobias Heindel3,heindets,koenigba,koziouvi}@fmi.uni-stuttgart.de Abstract. We show how to verify the correctness of insertion of ele- ments into red-black trees--a form of balanced search trees--using anal- ysis techniques developed for graph rewriting. We first model red

  12. A Practical Scheme for Non-interactive Verifiable Secret Sharing

    Microsoft Academic Search

    Paul Feldman

    1987-01-01

    This paper presents an extremely efficient, non-interactive protocol for verifiable secret sharing. Verifiable secret sharing (VSS) is a way of bequeathing information to a set of processors such that a quorum of processors is needed to access the information. VSS is a fundamental tool of cryptography and distributed computing. Seemingly difficult problems such as secret bidding, fair voting, leader election,
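
    As a rough illustration of the idea, the sketch below implements a Feldman-style non-interactive VSS with toy (insecure) parameters: the dealer publishes commitments to the polynomial coefficients, and each party can check its own share against them. The parameter values and helper names are assumptions for the example only.

    ```python
    import random

    # Toy, insecure parameters for illustration only (real deployments use large primes).
    p = 23            # prime modulus
    q = 11            # prime order of the subgroup, p = 2q + 1
    g = 2             # generator of the order-q subgroup of Z_p*

    def deal(secret, k, n):
        """Shamir-share `secret` with threshold k; publish Feldman-style commitments."""
        coeffs = [secret % q] + [random.randrange(q) for _ in range(k - 1)]
        commitments = [pow(g, a, p) for a in coeffs]          # public values
        shares = {i: sum(a * pow(i, j, q) for j, a in enumerate(coeffs)) % q
                  for i in range(1, n + 1)}                   # private share for party i
        return shares, commitments

    def verify(i, share, commitments):
        """Party i checks g^share == prod_j C_j^(i^j) (mod p)."""
        rhs = 1
        for j, C in enumerate(commitments):
            rhs = (rhs * pow(C, pow(i, j), p)) % p
        return pow(g, share, p) == rhs

    shares, commitments = deal(secret=7, k=3, n=5)
    print(all(verify(i, s, commitments) for i, s in shares.items()))  # expected: True
    ```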

  13. Verifying the Dependence of Fractal Coefficients on Different Spatial Distributions

    SciTech Connect

    Gospodinov, Dragomir [Plovdiv University 'Paisii Hilendarski', 24, Tsar Asen Str., Plovdiv (Bulgaria); Geophysical Institute of Bulgarian Academy of Sciences, Akad. G. Bonchev Str., bl.3, Sofia (Bulgaria); Marekova, Elisaveta; Marinov, Alexander [Plovdiv University 'Paisii Hilendarski', 24, Tsar Asen Str., Plovdiv (Bulgaria)

    2010-01-21

    A fractal distribution requires that the number of objects larger than a specific size r has a power-law dependence on the size, N(r) = C/r^D ∝ r^(-D), where D is the fractal dimension. Usually the correlation integral is calculated to estimate the correlation fractal dimension of epicentres. A 'box-counting' procedure could also be applied, giving the 'capacity' fractal dimension. The fractal dimension can be an integer, and then it is equivalent to a Euclidean dimension (it is zero for a point, one for a segment, two for a square and three for a cube). In general the fractal dimension is not an integer but a fractional dimension, which is the origin of the term 'fractal'. The use of a power law to statistically describe a set of events or phenomena reveals the lack of a characteristic length scale, that is, fractal objects are scale invariant. Scaling invariance and chaotic behavior constitute the base of a lot of natural hazards phenomena. Many studies of earthquakes reveal that their occurrence exhibits scale-invariant properties, so the fractal dimension can characterize them. It has first been confirmed that both aftershock rate decay in time and earthquake size distribution follow a power law. Recently many other earthquake distributions have been found to be scale-invariant. The spatial distributions of both regional seismicity and aftershocks show some fractal features. Earthquake spatial distributions are considered fractal, but indirectly. There are two possible models which result in fractal earthquake distributions. The first model considers that a fractal distribution of faults leads to a fractal distribution of earthquakes, because each earthquake is characteristic of the fault on which it occurs. The second assumes that each fault has a fractal distribution of earthquakes. Observations strongly favour the first hypothesis. The fractal coefficient analysis provides some important advantages in examining earthquake spatial distribution: it is a simple way to quantify scale-invariant distributions of complex objects or phenomena by a small number of parameters, and it is becoming evident that the applicability of fractal distributions to geological problems could have a more fundamental basis, since chaotic behaviour could underlie the geotectonic processes and the applicable statistics could often be fractal. The application of fractal distribution analysis has, however, some specific aspects. It is usually difficult to present an adequate interpretation of the obtained values of fractal coefficients for earthquake epicenter or hypocenter distributions. That is why in this paper we aimed at another goal - to verify how a fractal coefficient depends on different spatial distributions. We simulated earthquake spatial data by randomly generating points, first in a 3D space (a cube), then in a parallelepiped, diminishing one of its sides. We then continued this procedure in 2D and 1D space. For each simulated data set we calculated the points' fractal coefficient (correlation fractal dimension of epicentres) and then checked for correlation between the coefficient values and the type of spatial distribution. In that way one can obtain a set of standard fractal coefficient values for varying spatial distributions. These can then be used when real earthquake data are analyzed, by comparing the real data coefficient values to the standard fractal coefficients. Such an approach can help in interpreting the fractal analysis results through different types of spatial distributions.
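
    The simulation-and-estimation procedure outlined above can be sketched as follows: generate random points filling a volume, a plane or a line, compute the correlation integral C(r), and estimate D from the slope of log C(r) versus log r. The point counts, radii and least-squares fit below are illustrative choices, not the authors' settings.

    ```python
    import numpy as np

    def correlation_dimension(points, r_values):
        """Estimate the correlation fractal dimension D as the slope of log C(r) vs log r,
        where C(r) is the fraction of point pairs separated by less than r."""
        diff = points[:, None, :] - points[None, :, :]        # O(n^2) memory; small n assumed
        dist = np.sqrt((diff ** 2).sum(-1))
        pair_d = dist[np.triu_indices(len(points), k=1)]
        C = np.array([(pair_d < r).mean() for r in r_values])
        slope, _ = np.polyfit(np.log(r_values), np.log(C), 1)
        return slope

    rng = np.random.default_rng(0)
    r = np.logspace(-1.5, -0.5, 10)
    n = 1000
    cube = rng.random((n, 3))                                  # points filling a 3D volume
    plane = np.c_[rng.random((n, 2)), np.zeros(n)]             # points confined to a plane
    line = np.c_[rng.random(n), np.zeros((n, 2))]              # points confined to a line
    for name, pts in [("cube", cube), ("plane", plane), ("line", line)]:
        print(name, round(correlation_dimension(pts, r), 2))   # roughly 3, 2 and 1
    ```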

  14. VISION - Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics

    SciTech Connect

    Steven J. Piet; A. M. Yacout; J. J. Jacobson; C. Laws; G. E. Matthern; D. E. Shropshire

    2006-02-01

    The U.S. DOE Advanced Fuel Cycle Initiatives (AFCI) fundamental objective is to provide technology options that - if implemented - would enable long-term growth of nuclear power while improving sustainability and energy security. The AFCI organization structure consists of four areas; Systems Analysis, Fuels, Separations and Transmutations. The Systems Analysis Working Group is tasked with bridging the program technical areas and providing the models, tools, and analyses required to assess the feasibility of design and deployment options and inform key decision makers. An integral part of the Systems Analysis tool set is the development of a system level model that can be used to examine the implications of the different mixes of reactors, implications of fuel reprocessing, impact of deployment technologies, as well as potential "exit" or "off ramp" approaches to phase out technologies, waste management issues and long-term repository needs. The Verifiable Fuel Cycle Simulation Model (VISION) is a computer-based simulation model that allows performing dynamic simulations of fuel cycles to quantify infrastructure requirements and identify key trade-offs between alternatives. It is based on the current AFCI system analysis tool "DYMOND-US" functionalities in addition to economics, isotopic decay, and other new functionalities. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI and Generation IV reactor development studies.

  15. First quarter Hanford seismic report for fiscal year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2000.

  16. Second Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-06-26

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, seven local earthquakes were recorded during the second quarter of fiscal year 2008. The largest event recorded by the network during the second quarter (February 3, 2008 - magnitude 2.3 Mc) was located northeast of Richland in Franklin County at a depth of 22.5 km. With regard to the depth distribution, two earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), three earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and two earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, five earthquakes occurred in swarm areas and two earthquakes were classified as random events.

  17. First Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-03-21

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one micro earthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

  18. Neural networks in seismic discrimination

    Microsoft Academic Search

    F. U. Dowla

    1995-01-01

    Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate

  19. Induced seismicity. Final report

    SciTech Connect

    Segall, P.

    1997-09-18

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Just completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating, on the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements along with seismic data are used to test and constrain thermo-mechanical models.

  20. Correlating Seismicity and Subsidence in the Tokai Region, Central Japan

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Woessner, J.; Yoshida, A.; Hosono, K.; Noguchi, S.; Takayama, H.

    2003-12-01

    We investigate the correlation between seismicity rate changes, as well as changes in the earthquake size distribution (b-value), and transients in geodetic data in the Tokai area of Central Japan. As a first target, we analyze the period of accelerated subsidence in 1988-1990. Three largely independent seismic catalogs cover this region: NIED, JMA and JUNEC, offering a unique opportunity to verify seismicity anomalies based on independent sources. We spatially and temporally map out seismicity rates, finding that a significant decrease in the earthquake rate of M > 2.0 events coincides with the accelerated subsidence period; however, this anomaly disappears when including smaller magnitudes in the analysis. This relative quiescence of larger events can readily be explained when interpreting the transient in seismicity in the framework of a change in the earthquake size distribution, or b-value. The background b-value of about 0.8 increased in the period 1987.5 to 1989.5 to a value of b=1.2; this is confirmed in all three data sets. This b-value transient occurred in the immediate vicinity of a patch of low b-value in the centre of Suruga Bay (b=0.5) that could be interpreted as a major locked patch or asperity. We also analyze stress tensor inversions of NIED focal mechanism data, finding an increase in thrust-type earthquakes during the anomalous period. While a unique interpretation of the relationship between subsidence, b-value and stress tensor inversion results is not possible, we propose that an increase in the locking strength, a slow stick event, is consistent with all observations. We are now investigating the seismicity transients accompanying the recent slow slip event that started in the year 2000. Our ultimate goal is to construct a quantitative model that relates micro-seismicity and deformation data in the Tokai region.
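
    A b-value transient of the kind described is usually quantified with the maximum-likelihood estimator; a minimal sketch, assuming the standard Aki (1965) formula with an optional binning correction and a synthetic Gutenberg-Richter catalogue, is given below.

    ```python
    import numpy as np

    def b_value(magnitudes, mc, dm=0.0):
        """Maximum-likelihood b-value (Aki 1965); dm is the magnitude binning width,
        applied as the usual half-bin correction when the catalogue is binned."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    # Synthetic Gutenberg-Richter catalogue with true b = 1.0 above completeness mc = 2.0.
    rng = np.random.default_rng(1)
    mc = 2.0
    mags = mc + rng.exponential(scale=1.0 / np.log(10.0), size=5000)
    print(round(b_value(mags, mc), 2))   # expected: close to 1.0
    ```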

  1. Methodology for the Caracas Seismic Microzonation Study

    NASA Astrophysics Data System (ADS)

    Hernández, J. J.; Schmitz, M.

    2007-05-01

    Currently, the Venezuelan Foundation for Seismological Research (FUNVISIS) is executing the Caracas Seismic Microzonation Study. Fundamental objectives are the selection of microzones of similar response and the determination of landslide susceptibility. Both result from a guided combination of damage data of the 1967 Caracas earthquake, a landslides inventory, geophysical investigations, seismic hazard analysis, geological information, a geotechnical database, and earthquake engineering estimations of soil response and probable hillside behavior. Geophysical investigations include refraction seismic, microtremor and gravimeter measurements, for modeling the valley basin; sediment thickness to bedrock reaches 350 m. The model was calibrated with three deep drillings, in which accelerometers will be placed for future comparisons between surface and bedrock seismic motions. Probabilistic seismic hazard analysis was performed, leading to uniform hazard spectra for a 475-year mean return period and deaggregation of magnitude-distance pairs, differentiated within the Caracas bedrock. A parametric one-dimensional dynamic soil response study was performed with varying sediment thickness (0-350 m), average shear wave velocity in the upper 30 m (150-500 m/s), and nonlinear soil properties; 144 representative soil profiles were analyzed. The mean amplification of spectral response values at the surface relative to the bedrock is obtained, in order to determine probable surface spectra using the bedrock PSHA spectra as input. The seismic effects of the basin are incorporated in an approximate way, from numerical simulations of the 2D and 3D seismic response and statistical data from around the world. Finally, a set of microzones with similar average response spectra is selected, by correlating their geological, geophysical and geotechnical properties with those of the parametric study. Hillside pre-seismic hazard is established from lithological properties, slope gradients and seasonal and rainfall wetness indexes, estimating the static factor of safety. Newmark displacements and probabilities of failure are estimated using PSHA results and statistical correlations, leading to the qualification of the earthquake-induced landslide susceptibility. The results will allow updating the municipal ordinances, in order to improve the design and construction safety of new buildings and establish reinforcement priorities for the existing ones. Contribution to projects FONACIT 200400738 (with funds from IDB) and FONACIT-ECOS Nord 2004000347.
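
    The Newmark displacements mentioned above come from the classical rigid-block procedure: sliding accumulates whenever ground acceleration exceeds the critical acceleration implied by the static factor of safety. A minimal sketch with a synthetic ground-motion pulse (not the study's inputs) follows.

    ```python
    import numpy as np

    def newmark_displacement(accel, dt, a_crit):
        """Classical Newmark rigid-block displacement: integrate sliding velocity
        accumulated whenever ground acceleration exceeds the critical acceleration
        a_crit (accelerations in m/s^2, dt in s; one-way sliding assumed)."""
        v = 0.0
        d = 0.0
        for a in accel:
            excess = a - a_crit
            if v > 0.0 or excess > 0.0:
                v = max(v + excess * dt, 0.0)   # block decelerates once a drops below a_crit
                d += v * dt
        return d

    # Synthetic ground motion: a decaying 2 Hz sine pulse with 3 m/s^2 peak acceleration.
    dt = 0.005
    t = np.arange(0.0, 4.0, dt)
    accel = 3.0 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.5 * t)
    print(round(newmark_displacement(accel, dt, a_crit=1.0), 3), "m")
    ```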

  2. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

    The Chilean subduction zone is among the seismically most active plate boundaries in the world and its coastal ranges suffer from a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered to be a mature seismic gap, rupturing last in 1835 and being seismically quiet without any magnitude 4.5 or larger earthquakes reported in global catalogues. It is located to the north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and to the south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom-seismometers was deployed in the northern portion of the rupture area for a three month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data of a network consisting of 33 land stations of the GeoForschungsZentrum Potsdam were included into the network, providing an ideal coverage of both the rupture plane and areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P wave onsets and a 2.5D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well resolved seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards from the deformation front. Prior to the Great Maule earthquake the Collaborative Research Center SFB 574 'Volatiles and Fluids in Subduction Zones' shot several wide-angle profiles and operated a network, also consisting of OBS and land stations, for six months in 2008. Both projects provide a great opportunity to study the evolution of a subduction zone within the seismic cycle of a great earthquake. The most profound features are (i) a sharp reduction in intraslab seismic activity after the Maule earthquake and (ii) a sharp increase in seismic activity at the slab interface above 50 km depth, where large parts of the rupture zone were largely aseismic prior to the Maule earthquake. Further, the aftershock seismicity shows a broader depth distribution above 50 km depth.

  3. Seismic interferometry for seismic source location and interpolation of three-dimensional ocean bottom seismic data

    Microsoft Academic Search

    Weiping Cao

    2009-01-01

    This dissertation develops new seismic interferometry algorithms for estimation of seismic source locations and for the interpolation of sparse three-dimensional (3D) ocean bottom seismic (OBS) data. Unlike the conventional source location and interpolation methods, which heavily rely on the accuracy of the velocity models, the interferometric techniques extract the multiple scattering information in the data, and provide reliable results without

  4. Best Practices for Addressing Induced Seismicity Associated with Enhanced Geothermal Systems (EGS)

    E-print Network

    Majer, E.

    2014-01-01

    Earthquake Resistant Design and Construction, Prepared for FEMA by the National Institute of Building ... Earthquake-Resistant Design Concepts: An Introduction to the NEHRP Recommended Seismic Provisions for New Buildings

  5. Seismic Adequacy Review of PC012 SCEs that are Potential Seismic Hazards with PC3 SCEs at Cold Vacuum Dryer (CVD) Facility

    SciTech Connect

    OCOMA, E.C.

    1999-08-12

    This document provides a seismic adequacy review of PC012 Systems, Components & Equipment anchorage that are potential seismic interaction hazards with PC3 SCEs during a Design Basis Earthquake. The PC012 items are identified in the Safety Equipment List as 3/1 SCEs.

  6. Korea Seismic Networks and Korea Integrated Seismic System (KISS)

    NASA Astrophysics Data System (ADS)

    Park, J. H.; Chi, H. C.; Lim, I. S.; Kim, G. Y.

    2009-04-01

    The modernization of the seismic network in Korea was motivated by the Youngweol (1996, Ml 4.5) and Gyeongju (1997, Ml 4.2) earthquakes. KMA (Korea Meteorological Agency) has built 45 digital seismic stations which compose the National Seismic Network. KEPRI (Korea Electric Power Research Institute) and KINS (Korea Institute of Nuclear Safety) also have built 15 and 4 digital seismic stations, respectively. KIGAM (Korea Institute of Geoscience and Mineral Resources) also has built 37 stations up to 2008, including the Hyodongri complex seismic observatory, which hosts GPS, a geomagnetic observation system and a borehole seismic system. Since 2002 the Korea Integrated Seismic System (KISS) has been playing the main role in real-time seismic data exchange between the different seismic networks operated by four earthquake monitoring institutes: KMA, KEPRI, KINS and KIGAM. Seismic data from the different seismic networks are gathered into the data pool of KISS, from which clients can receive data in real time. Before expanding and modernizing the Korean seismic stations, the consortium of the four institutes established standard criteria for seismic observation, such as instrument, data format, and communication protocol, for the purpose of integrating the seismic networks. More than 150 digital stations (velocity or accelerometer) installed from 1998 to 2008 in Korea could be easily linked to KISS in real time due to these standard criteria. When a big earthquake happens, the observed peak acceleration value can be used as the instrumental intensity at the local site, and the distribution of peak accelerations shows roughly the severity of the damaged area. Real Time Intensity Color Mapping (RTICOM) has been developed to generate an every-second contour map of nationwide intensity based on the peak acceleration values retrieved through KISS from local stations. RTICOM can be used for rapid evaluation of intensity and for decision making against earthquake damage.
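
    A minimal sketch of the RTICOM-style step of turning station peak accelerations into instrumental intensities is shown below; the PGA-to-intensity coefficients follow the widely used Wald et al. (1999) relation purely for illustration, since the abstract does not specify the relation actually used, and the station codes and values are invented.

    ```python
    import numpy as np

    def instrumental_intensity(pga_cm_s2):
        """Convert peak ground acceleration (cm/s^2) to an instrumental intensity.
        Coefficients follow the Wald et al. (1999) relation for moderate-to-strong
        shaking, used here only as a stand-in for the relation used by RTICOM."""
        return 3.66 * np.log10(pga_cm_s2) - 1.66

    # Hypothetical one-second station peaks (cm/s^2); a nationwide contour/grid map
    # would then be drawn from these point intensities every second.
    stations = {"ST1": 120.0, "ST2": 350.0, "ST3": 40.0}
    for code, pga in stations.items():
        print(code, round(instrumental_intensity(pga), 1))
    ```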

  7. Seismic sloshing of reactor tank with internals

    SciTech Connect

    Ma, D.C.; Gvildys, J.; Chang, Y.W.

    1985-01-01

    A large commercial size breeder reactor tank contains a huge amount of liquid sodium. Due to the presence of large free-surface areas, the liquid sodium will participate in sloshing motion under seismic disturbances. Of interest in the reactor design is the magnitude of the hydrodynamic pressure acting on the internal components and the maximum wave height of the free-surface when coolant sloshes. This paper presents a new seismic analysis methodology which calculates the sloshing loads on submerged components. Results of a study performed on the sloshing of a reactor tank with many in-tank components are also discussed. Objective is to investigate the effects of the internal components on the sloshing response and to calculate the maximum sloshing loads on internal components that can be used for design purposes.

  8. Investigation of the Seismic Performance of Reinforced Highway Embankments

    NASA Astrophysics Data System (ADS)

    Toksoy, Y. S.; Edinliler, A.

    2014-12-01

    Despite the fact that highway embankments are highly prone to earthquake-induced damage, there are not enough studies in the literature concentrating on improving the seismic performance of highway embankments. Embankments which are quite stable under static load conditions can simply collapse during earthquakes due to the destructive seismic loading. This situation poses a serious threat to the structural integrity of the embankment, service quality and serviceability. The objective of this study is to determine the effect of geosynthetic reinforcement on the seismic performance of highway embankments and to evaluate the seismic performance of the geotextile-reinforced embankment under different earthquake motions. A 1:50 scale highway embankment model is designed and reinforced with geosynthetics in order to increase the seismic performance of the embankment model. A series of shaking table tests were performed for the identical unreinforced and reinforced embankment models using earthquake excitations with different characteristics. The experimental results were evaluated by comparing the unreinforced and reinforced cases. Results revealed that the reinforced embankment models show better seismic performance, especially under the specified ground excitations used in this study. The prototype embankment was also numerically modelled, and a similar seismic behavior trend is obtained in the finite element simulations.

  9. Probabilistic seismic hazard estimation of Manipur, India

    NASA Astrophysics Data System (ADS)

    Pallav, Kumar; Raghukanth, S. T. G.; Darunkumar Singh, Konjengbam

    2012-10-01

    This paper deals with the estimation of spectral acceleration for Manipur based on probabilistic seismic hazard analysis (PSHA). The 500 km region surrounding Manipur is divided into seven tectonic zones and major faults located in these zones are used to estimate seismic hazard. The earthquake recurrence relations for the seven zones have been estimated from past seismicity data. Ground motion prediction equations proposed by Boore and Atkinson (2008 Earthq. Spectra 24 99-138) for shallow active regions and Atkinson and Boore (2003 Bull. Seismol. Soc. Am. 93 1703-29) for the Indo-Burma subduction zone are used for estimating ground motion. The uniform hazard response spectra for all the nine constituent districts of Manipur (Senapati, Tamenglong, Churachandpur, Chandel, Imphal east, Imphal west, Ukhrul, Thoubal and Bishnupur) at 100-, 500- and 2500-year return periods have been computed from PSHA. A contour map of peak ground acceleration over Manipur is also presented for 100-, 500-, and 2500-year return periods with variations of 0.075-0.225, 0.18-0.63 and 0.3-1.15 g, respectively, throughout the state. These results may be of use to planners and engineers for site selection, designing earthquake resistant structures and, further, may help the state administration in seismic hazard mitigation.
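
    The PSHA recipe summarised above combines source recurrence rates with a ground-motion prediction equation to obtain annual exceedance rates and return-period ground motions. The single-source sketch below uses a truncated Gutenberg-Richter distribution and a toy lognormal GMPE; all parameter values are invented for illustration and are not the paper's models.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Minimal single-source PSHA sketch. Annual rate of exceeding each PGA level a:
    #   lambda(a) = nu * sum_m P[M = m] * P[PGA > a | m, r]
    nu, b, m_min, m_max, r_km = 0.05, 1.0, 5.0, 7.5, 40.0   # assumed source parameters
    mags = np.arange(m_min, m_max + 1e-9, 0.1)
    beta = b * np.log(10.0)
    pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    p_m = pdf * 0.1                                          # discretised P[M = m]

    def gmpe_ln_pga(m, r):
        """Toy GMPE: median ln PGA (g) as a function of magnitude and distance (sigma = 0.6)."""
        return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0)

    pga_levels = np.logspace(-2, 0, 50)                      # 0.01 g to 1 g
    lam = np.array([
        nu * np.sum(p_m * (1.0 - norm.cdf((np.log(a) - gmpe_ln_pga(mags, r_km)) / 0.6)))
        for a in pga_levels
    ])
    # PGA with a 475-year return period (about 10% exceedance probability in 50 years):
    print(round(np.interp(1.0 / 475.0, lam[::-1], pga_levels[::-1]), 3), "g")
    ```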

  10. Development of a magnetostrictive borehole seismic source

    Microsoft Academic Search

    R. P. Cutler; G. E. Sleefe; R. G. Keefe

    1997-01-01

    A magnetostrictive borehole seismic source was developed for use in high resolution crosswell surveys in environmental applications. The source is a clamped, vertical-shear, swept frequency, reaction-mass shaker design consisting of a spring pre-loaded magnetostrictive rod with permanent magnet bias, drive coils to induce an alternating magnetic field, and an integral tungsten reaction mass. The actuator was tested extensively in the

  11. Seismic qualification of existing safety class manipulators

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1992-01-01

    There are two bridge type electromechanical manipulators within a nuclear fuel handling facility which were constructed over twenty-five years ago. At that time, there were only minimal seismic considerations. These manipulators together with the facility are being reactivated. Detailed analyses have shown that the manipulators will satisfy the requirements of ANSI/AISC N690-1984 when they are subjected to loadings including the site specific design basis earthquake. 4 refs.

  12. Seismic qualification of existing safety class manipulators

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1992-05-01

    There are two bridge type electromechanical manipulators within a nuclear fuel handling facility which were constructed over twenty-five years ago. At that time, there were only minimal seismic considerations. These manipulators together with the facility are being reactivated. Detailed analyses have shown that the manipulators will satisfy the requirements of ANSI/AISC N690-1984 when they are subjected to loadings including the site specific design basis earthquake. 4 refs.

  13. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    SciTech Connect

    Tolk, Keith M.; Stoker, Gerald C.

    1999-07-20

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.
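
    One simple way to use such a weld signature for identity verification, consistent with the description above, is to compare a new scan against the enrolled reference using normalized cross-correlation and a threshold; the sketch below is a hypothetical illustration, not the system's actual algorithm.

    ```python
    import numpy as np

    def normalized_correlation(scan, reference):
        """Peak normalized cross-correlation between a new eddy-current scan and an
        enrolled reference signature (both 1-D traces around the weld)."""
        s = (scan - scan.mean()) / scan.std()
        r = (reference - reference.mean()) / reference.std()
        corr = np.correlate(s, r, mode="full") / len(r)   # allow a small circumferential offset
        return corr.max()

    def same_container(scan, reference, threshold=0.9):
        """Identity check: accept only if the signature match exceeds the threshold."""
        return normalized_correlation(scan, reference) >= threshold

    # Synthetic demonstration: a noisy rescan of the same signature matches, a different one does not.
    rng = np.random.default_rng(0)
    reference = rng.normal(size=360)                  # enrolled weld signature (one sample per degree)
    rescan = reference + 0.1 * rng.normal(size=360)   # later scan of the same container
    other = rng.normal(size=360)                      # a different container
    print(same_container(rescan, reference), same_container(other, reference))
    ```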

  14. AUTOMATING SHALLOW SEISMIC IMAGING

    EPA Science Inventory

    Our current EMSP project continues an effort begun in 1997 to develop ultrashallow seismic imaging as a cost-effective method applicable to DOE facilities. The objective of the present research is to refine and demonstrate the use of an automated method of conducting shallow seis...

  15. Seismic mass Top electrode

    E-print Network

    Kraft, Michael

    of which provides loop transducer is described. A bulk-micromachined an accurate measure is that the micromachined, capacitive sensing element had only three connections (top and bottom electrodes and seismic mass domains, (Burstein and Kaiser, 1996). Usually, for an analogue, closed loop accelerometer the latter

  16. Continuous Seismic Profiling

    USGS Multimedia Gallery

    The USGS collaborated with cooperator U.S. Fish & Wildlife Service to conduct continuous seismic-reflection profiling in the Havasu National Wildlife Refuge. The survey was conducted as part of an applied research and technology transfer effort by the USGS Office of Groundwater Branch of Geophysics ...

  17. Downhole hydraulic seismic generator

    Microsoft Academic Search

    D. L. Gregory; H. C. Hardee; D. O. Smallwood

    1990-01-01

    A downhole hydraulic seismic generator system for transmitting energy wave vibrations into earth strata surrounding a borehole. The system contains an elongated, unitary housing operably connected to a well head aboveground by support and electrical cabling, and contains clamping apparatus for selectively clamping the housing to the walls of the borehole. The system further comprises a hydraulic oscillator containing a

  18. Downhole hydraulic seismic generator

    Microsoft Academic Search

    Danny L. Gregory; Harry C. Hardee; David O. Smallwood

    1992-01-01

    A downhole hydraulic seismic generator system for transmitting energy wave vibrations into earth strata surrounding a borehole. The system contains an elongated, unitary housing operably connected to a well head aboveground by support and electrical cabling, and contains clamping apparatus for selectively clamping the housing to the walls of the borehole. The system further comprises a hydraulic oscillator containing a

  19. Algorithms for verifying the integrity of untrusted storage

    E-print Network

    Sudan Ajay, 1980-

    2004-01-01

    This work addresses the problem of verifying that untrusted storage behaves like valid storage. The problem is important in a system such as a network file system or database where a client accesses data stored remotely ...

  20. Verifying Properties of Distributed Systems: Prospects for Practicality

    E-print Network

    Massachusetts at Amherst, University of

    Verifying Properties of Distributed Systems: Prospects for Practicality Lori A. Clarke Leon J. Osterweil University of Massachusetts Amherst Abstract Industry is rapidly embracing distributed systems. Although there are many advantages to distribution, such systems are certainly more difficult to understand

  1. Encrypted Receipts for Voter-Verified Elections Using Homomorphic Encryption

    E-print Network

    Forsythe, Joy Marie

    Voters are now demanding the ability to verify that their votes are cast and counted as intended. Most existing cryptographic election protocols do not treat the voter as a computationally-limited entity separate from the ...

  2. A Verifiable Random Function With Short Proofs and Keys

    E-print Network

    Dodis, Yevgeniy

    not necessarily look random. They use an expensive Goldreich-Levin hardcore bit [GL89] to convert a VUF like verifiability. Our construction is direct; it does not use a Goldreich-Levin hardcore bit, saving

  3. Verifying Flexible Timeline-Based Plans and A. Finzi

    E-print Network

    Tronci, Enrico

    of verifying a flexible plan before actual execution. This paper explores how a model-checking verification timeline-based P&S system (EUROPA (Frank and Jonsson 2003), IDEA (Jonsson et al. 2000), APSI-TRF (Cesta

  4. Receipt-Free Universally-Verifiable Voting with Everlasting Privacy

    Microsoft Academic Search

    Tal Moran; Moni Naor

    2006-01-01

    We present the first universally verifiable voting scheme that can be based on a general assumption (existence of a non-interactive commitment scheme). Our scheme is also the first receipt-free scheme to give

  5. 34 CFR 668.56 - Information to be verified.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...year the Secretary publishes in the Federal Register notice the FAFSA information that an institution and an applicant may be required to verify. (b) For each applicant whose FAFSA information is selected for verification by the...

  6. 34 CFR 668.56 - Information to be verified.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...year the Secretary publishes in the Federal Register notice the FAFSA information that an institution and an applicant may be required to verify. (b) For each applicant whose FAFSA information is selected for verification by the...

  7. 34 CFR 668.56 - Information to be verified.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...year the Secretary publishes in the Federal Register notice the FAFSA information that an institution and an applicant may be required to verify. (b) For each applicant whose FAFSA information is selected for verification by the...

  8. Optimizing and verifying an ensemble-based rainfall model

    E-print Network

    Friedman, Sara Hargrove

    2007-01-01

    In this thesis, I modified, optimized, and verified the stochastic Recursive Cluster-point Rainfall model of Chatdarong (2006). A novel error metric allows comparison of the stochastic ensemble of rainfall image forecasts ...

  9. Multi-Oracle Interactive Protocols with Space Bounded Verifiers

    Microsoft Academic Search

    Uriel Feige; Adi Shamir

    1989-01-01

    It is proved that both in the multiprover model of M. Ben-or et al. (Proc. 20th Symp. Theory Comput., 1988, p.113-131) and in the noisy oracle model of U. Feige et al. (Proc. CRYPTO 88) a finite-state verifier can accept any recursive language. The power of verifiers with simultaneous time bounds and space bounds is considered as well

  10. Information-theoretic secure verifiable secret sharing over RSA modulus

    Microsoft Academic Search

    Qiu Gang; Wang Hong; Wei Shimin; Xiao Guozhen

    2006-01-01

    The well-known non-interactive and information-theoretic secure verifiable secret sharing scheme presented by Pedersen is over a large prime. In this paper, we construct a novel non-interactive and information-theoretic verifiable secret sharing over an RSA (Rivest, Shamir, Adleman) modulus and give the rigorous security proof. It is shown how to distribute a secret among a group such that any set of k parties

  11. Bolivian Seismic Network

    NASA Astrophysics Data System (ADS)

    Minaya, E.; Rougon, P.; Valero, D.; Fernandez, G.; Lazaro, E.; Cano, W.

    2007-05-01

    One of the biggest challenges for the Seismic Network of Bolivia, composed of seven stations, is to integrate the different characteristics of its stations. The Observatory San Calixto, the network operator, is one of the few private seismic observatories in the world and for this reason works only with agreement support or external cooperation. This problem needs a prompt solution to obtain a data system that is more convenient, real-time, more effective, and compatible with a future network extension. At present there are also differences in equipment and transmission. Two of the network stations are part of the IMS (International Monitoring System): the information is transmitted by telemetry from Primary Station PS6 (LPAZ) to the OSC, and then by VSAT to the IMS and by optical fiber to AFTAC. The auxiliary seismic station AS08 (SIV) sends information to DASE France by satellite; DASE then transmits it to the IMS, and to the OSC by Internet. A similar arrangement is used for another station, MOC. The data of the other four stations are transmitted by telemetry to the OSC center, but these stations differ in that they still operate with an analog system. The network does not cover all of Bolivian territory, so complete monitoring of the country's seismic activity is not possible. For this reason it is very important for Bolivia to extend the network by installing additional stations, together with a project to make these new stations compatible (especially in data formats) with the existing and temporary stations. Temporary stations are mainly used to support the network and to evaluate micro-activity in areas that have a possible seismic threat, or where, because of the current network distribution, the activity is unknown to us.

  12. High Voltage Seismic Generator

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary result of annual cooperation of three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high-voltage thyristor into a spark gap. The whole appliance is powered from a Li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specifications of elements for the final device were chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. Then a field test was conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water. Geophones were placed on the ground in a straight line. A comparison of the signals registered with the hammer source and the sparker source was made. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. Apart from the signal characteristics, the biggest advantage of the presented device is its size, 0.5 x 0.25 x 0.2 m, and weight of approximately 7 kg. These features, together with the small Li-ion battery, make the constructed device very mobile. The project is still developing.

  13. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria has seen a great change after the Boumerdes earthquake of May 21st, 2003. Indeed the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broad band, 02 very broad band, 47 short period and 21 accelerometers connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network began operation, a significant number of local, regional and tele-seismic events have been located by the automatic processing, revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data, including earthquakes, calibration pulses and cultural noise, are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the "PQLX" software by computing PDFs and PSDs of the recordings. Other tools and programs allow the monitoring and maintenance of the entire electronic system, for example to check the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid determination of earthquakes, message alerts, moment tensor estimation, seismic source determination, shakemap calculation, etc. Compliance with international standards makes it possible to contribute to regional seismic monitoring and to the Mediterranean warning system. Over the next two years, the acquisition of new seismic equipment to reach 50 new BB stations will densify the network and enhance the performance of the Algerian Digital Seismic Network.
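
    A stripped-down stand-in for the PSD noise check described above can be written with Welch's method; the sampling rate, synthetic trace and band limits below are assumptions for illustration, and the operational workflow relies on PQLX/PDFs rather than this sketch.

    ```python
    import numpy as np
    from scipy.signal import welch

    def noise_psd_db(trace, fs):
        """Power spectral density of a ground-acceleration trace, in dB relative to
        1 (m/s^2)^2/Hz, estimated with Welch's method."""
        f, pxx = welch(trace, fs=fs, nperseg=4096)
        return f, 10.0 * np.log10(pxx)

    # Synthetic one-hour record at 100 Hz: white instrument noise plus a 0.2 Hz microseism peak.
    fs = 100.0
    t = np.arange(0, 3600.0, 1.0 / fs)
    rng = np.random.default_rng(2)
    trace = 1e-7 * rng.normal(size=t.size) + 5e-7 * np.sin(2 * np.pi * 0.2 * t)
    f, psd_db = noise_psd_db(trace, fs)
    band = (f > 0.1) & (f < 1.0)
    print("median PSD in 0.1-1 Hz band:", round(np.median(psd_db[band]), 1), "dB")
    ```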

  14. A probabilistic assessment of the seismic hazard in Turkey

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Doyuran, V.; Akkaş, N.; Gülkan, P.

    1985-08-01

    Seismic hazard maps for Turkey are developed by utilizing current probabilistic procedures. The maps contain probabilistic estimates of the maximum MSK intensity and the maximum horizontal peak ground acceleration for return periods of 225, 475 and 10,000 years. A modified and updated catalogue of the large and damaging earthquakes affecting Turkey has been prepared to provide a basis for the spatial correlation of the seismic activity with the geo-tectonic elements. Brief descriptions of the seismic sources adopted for Turkey for the purpose of seismic source regionalization are given. Recurrence relationship regression constants, maximum magnitudes and intensity attenuation relationships are provided for each seismic source zone. Earthquake occurrence is modeled with either a point-source model or a fault-rupture model, depending on the seismic source. The results are discussed in detail. The study is intended to serve as a reference for more advanced approaches, to stimulate discussion and suggestions on the data base, assumptions and inputs, and to pave the way for the risk-based assessment of the seismic hazard in the site selection and design of nuclear power plants and in the design of common buildings and engineering facilities.
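
    The recurrence relationships mentioned above are usually of the Gutenberg-Richter form, log10 N(M) = a - b*M. The sketch below uses assumed a and b values (not the regression constants derived in the study) to show how such constants translate into annual exceedance rates and return periods.

```python
# Gutenberg-Richter recurrence sketch: annual rate of events with magnitude >= M.
# The a and b values below are assumptions for illustration, not the regression
# constants of the study.

def annual_rate(magnitude: float, a: float, b: float) -> float:
    """log10 N(M) = a - b*M  ->  N(M) = 10**(a - b*M) events per year."""
    return 10.0 ** (a - b * magnitude)

if __name__ == "__main__":
    a, b = 4.0, 1.0     # assumed source-zone constants
    for M in (5.0, 6.0, 7.0):
        rate = annual_rate(M, a, b)
        print(f"M >= {M}: {rate:.3f} events/yr (return period ~ {1.0/rate:.0f} yr)")
```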

  15. Seismic monitoring of Poland - description and results of temporary seismic project with mobile seismic network

    NASA Astrophysics Data System (ADS)

    Trojanowski, Jacek; Plesiewicz, Beata; Wiszniowski, Jan

    2014-12-01

    The paper describes a temporary seismic project aimed at developing the national database of natural seismic activity for seismic hazard assessment, officially called "Monitoring of Seismic Hazard of Territory of Poland" (MSHTP). Due to the low seismicity of Poland, the project was focused on events of magnitude range 1-3 in selected regions in order to maximize the chance of recording any natural event. The project used mobile seismic stations and was divided into two stages. Five years of measurements brought over one hundred natural seismic events of magnitudes ML in the range 0.5-3.8. Most of them were located in the Podhale region in the Carpathians. Together with previously recorded events this made it possible to conduct a preliminary study on a ground motion prediction equation for this region. Only one natural event, of magnitude ML = 3.8, was recorded outside the Carpathians, in a surprising location in central-west Poland.

  16. Seismic Monitoring of Poland - Description and Results of Temporary Seismic Project with Mobile Seismic Network

    NASA Astrophysics Data System (ADS)

    Trojanowski, Jacek; Plesiewicz, Beata; Wiszniowski, Jan

    2015-02-01

    The paper describes a temporary seismic project aimed at developing the national database of natural seismic activity for seismic hazard assessment, officially called "Monitoring of Seismic Hazard of Territory of Poland" (MSHTP). Due to the low seismicity of Poland, the project was focused on events of magnitude range 1-3 in selected regions in order to maximize the chance of recording any natural event. The project used mobile seismic stations and was divided into two stages. Five years of measurements brought over one hundred natural seismic events of magnitudes ML in the range 0.5-3.8. Most of them were located in the Podhale region in the Carpathians. Together with previously recorded events this made it possible to conduct a preliminary study on a ground motion prediction equation for this region. Only one natural event, of magnitude ML = 3.8, was recorded outside the Carpathians, in a surprising location in central-west Poland.

  17. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    SciTech Connect

    Amiri, G. Ghodrati [Iran University of Science and Technology--Islamic Azad University of Shahrekord, Narmak, Tehran 16846 (Iran, Islamic Republic of); Dehkordi, M. Raeisi [Department of Civil Engineering, Islamic Azad University of Shahrekord (Iran, Islamic Republic of); Amrei, S. A. Razavian [College of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Kamali, M. Koohi [Department of Civil Engineering, Islamic Azad University of Shahrekord (Iran, Islamic Republic of)

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It gives probabilistic estimates of the Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is expressed as peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to show the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels, using the SEISRISK III software.

  18. Sloshing of coolant in a seismically isolated reactor

    SciTech Connect

    Wu, Ting-shu; Gvildys, J.; Seidensticker, R.W.

    1988-01-01

    During a seismic event, the liquid coolant inside the reactor vessel undergoes sloshing, which is a low-frequency phenomenon. In a reactor system incorporating seismic isolation, the isolation frequency is usually also very low, so there is concern about potential amplification of the sloshing motion of the liquid coolant. This study investigates the effects of seismic isolation on the sloshing of liquid coolant inside the reactor vessel of a liquid-metal-cooled reactor. Based on a synthetic ground motion whose response spectra envelop those specified by NRC Regulatory Guide 1.60, it is found that the maximum sloshing wave height increases from 18 in. to almost 30 in. when the system is seismically isolated. Since higher sloshing waves may introduce severe impact forces and thermal shocks to the reactor closure and other components within the reactor vessel, adequate design measures should be taken either to suppress the wave height or to reduce the effects caused by high waves.
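
    The low sloshing frequency referred to above can be estimated, for a rigid upright cylindrical tank, from the classical first convective mode relation omega_1^2 = (1.841 * g / R) * tanh(1.841 * h / R). The sketch below evaluates this relation for assumed vessel dimensions (not the reactor's actual geometry) to show why the sloshing frequency lands near typical isolation frequencies.

```python
# First convective (sloshing) mode of liquid in a rigid cylindrical tank:
#   omega_1^2 = (1.841 * g / R) * tanh(1.841 * h / R)
# Vessel radius and liquid depth below are assumed values for illustration.
import math

def sloshing_frequency_hz(radius_m: float, depth_m: float, g: float = 9.81) -> float:
    xi1 = 1.841  # first root of the derivative of the Bessel function J1
    omega_sq = (xi1 * g / radius_m) * math.tanh(xi1 * depth_m / radius_m)
    return math.sqrt(omega_sq) / (2.0 * math.pi)

if __name__ == "__main__":
    R, h = 5.0, 8.0   # assumed radius and liquid depth in metres
    f1 = sloshing_frequency_hz(R, h)
    print(f"first sloshing mode ~ {f1:.2f} Hz")   # roughly 0.3 Hz for these values
```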

  19. Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel

    USGS Publications Warehouse

    Wang, Z.; Shi, B.; Kiefer, J.D.

    2005-01-01

    PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is their basis. Seismologists should provide the best information on seismic hazards and communicate it to users and policy makers. There is, however, a lack of effort in communicating the uncertainties in seismic hazard assessment in the central U.S. Use of the 10%, 5%, and 2% probabilities of exceedance (PE) in 50 years causes confusion in communicating seismic hazard assessment. It would be easier to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.
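
    Under the usual Poisson assumption, an "x% PE in 50 years" level maps to a mean return period through T = -t / ln(1 - p). The sketch below performs that conversion for the three PE levels mentioned, which is one way of stating the design ground motion's associated probability more explicitly.

```python
# Convert "p probability of exceedance in t years" to a mean return period,
# assuming Poissonian earthquake occurrence: T = -t / ln(1 - p).
import math

def return_period_years(pe: float, exposure_years: float = 50.0) -> float:
    return -exposure_years / math.log(1.0 - pe)

if __name__ == "__main__":
    for pe in (0.10, 0.05, 0.02):
        T = return_period_years(pe)
        print(f"{pe:4.0%} PE in 50 yr  ->  return period ~ {T:6.0f} yr")
    # ~475, ~975 and ~2475 years, the values behind common hazard-map levels
```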

  20. Monitoring hydraulic fracturing with seismic emission volume

    NASA Astrophysics Data System (ADS)

    Niu, F.; Tang, Y.; Chen, H.; TAO, K.; Levander, A.

    2014-12-01

    Recent developments in horizontal drilling and hydraulic fracturing have made it possible to access reservoirs that were not available for large-scale production in the past. Hydraulic fracturing is designed to enhance rock permeability and reservoir drainage through the creation of fracture networks. Microseismic monitoring has proven to be an effective and valuable technology for imaging hydraulic fracture geometry. Based on data acquisition, seismic monitoring techniques are divided into two categories: downhole and surface monitoring. Surface monitoring is challenging because of the extremely low signal-to-noise ratio of the raw data. We applied techniques used in earthquake seismology and developed an integrated monitoring system for mapping hydraulic fractures. The system consists of 20 to 30 state-of-the-art broadband seismographs, which are generally hundreds of times more sensitive than regular geophones. We have conducted two experiments in two basins in China with very different geology and formation mechanisms. In each case we observed clear microseismic events, which may correspond to induced seismicity directly associated with fracturing and to triggered events on pre-existing faults. However, the magnitude of these events is generally larger than magnitude -1, approximately one to two magnitude units larger than those detected by downhole instruments. Spectral analysis of the continuous surface recordings indicated high seismic energy associated with the injection stages. This seismic energy can be back-projected to a volume that surrounds each injection stage. Imaging the seismic emission volume (SEV) appears to be an effective way to map the stimulated reservoir volume, as well as natural fractures.
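
    Back-projection of surface recordings onto a subsurface grid is, at its simplest, a delay-and-stack operation: for each candidate grid node, traces are shifted by the predicted travel times and their energy summed. The sketch below illustrates this with a homogeneous velocity model and synthetic traces; the station layout, velocity, and use of squared amplitudes are simplifying assumptions, not the authors' processing flow.

```python
# Delay-and-stack back-projection sketch (homogeneous velocity, 2D geometry).
# Station layout, velocity, and windowing are illustrative assumptions.
import numpy as np

def back_project(traces, stations, nodes, velocity, fs):
    """Stack squared amplitudes aligned on straight-ray travel times.

    traces   : (n_sta, n_samp) array of recordings
    stations : (n_sta, 2) receiver coordinates in metres
    nodes    : (n_node, 2) candidate source positions in metres
    """
    n_samp = traces.shape[1]
    energy = np.zeros(len(nodes))
    for k, node in enumerate(nodes):
        dist = np.linalg.norm(stations - node, axis=1)
        shifts = np.rint(dist / velocity * fs).astype(int)
        stack = np.zeros(n_samp)
        for tr, s in zip(traces, shifts):
            stack[: n_samp - s] += tr[s:]          # undo the travel-time delay
        energy[k] = np.sum(stack ** 2)
    return energy

if __name__ == "__main__":
    fs, velocity = 500.0, 3000.0
    stations = np.array([[x, 0.0] for x in np.linspace(-2000, 2000, 21)])
    true_src = np.array([300.0, 2500.0])
    t0, n_samp = 0.5, int(2.0 * fs)
    traces = np.zeros((len(stations), n_samp))
    for i, sta in enumerate(stations):            # build a synthetic event
        tt = np.linalg.norm(sta - true_src) / velocity
        traces[i, int(round((t0 + tt) * fs))] = 1.0
    nodes = np.array([[x, 2500.0] for x in np.linspace(-1000, 1000, 41)])
    e = back_project(traces, stations, nodes, velocity, fs)
    print("best node:", nodes[np.argmax(e)])      # should be near x = 300 m
```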

  1. An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.

    2009-12-01

    For several years GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS), the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project and capable of acquiring, archiving and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore not only real-time waveform data but also processing results are routed through GFZ to the attached warning centers. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network, GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to the data collection over the Internet, a GFZ VSAT hub for secured data collection of the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and the first data links were established through this backbone. For the Southeast Asia region, a VSAT hub had already been established in Jakarta in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by email and SMS, and manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step in starting a tsunami warning process. However, for a secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.

  2. Development of the Multi-Level Seismic Receiver (MLSR)

    SciTech Connect

    Sleefe, G.E.; Engler, B.P.; Drozda, P.M.; Franco, R.J.; Morgan, J.

    1995-02-01

    The Advanced Geophysical Technology Department (6114) and the Telemetry Technology Development Department (2664) have, in conjunction with the Oil Recovery Technology Partnership, developed a Multi-Level Seismic Receiver (MLSR) for use in crosswell seismic surveys. The MLSR was designed and evaluated with the significant support of many industry partners in the oil exploration industry. The unit was designed to record and process superior quality seismic data while operating in severe borehole environments, including high temperature (up to 200 °C) and static pressure (10,000 psi). This development has utilized state-of-the-art technology in transducers, data acquisition, and real-time data communication and data processing. The mechanical design of the receiver has been carefully modeled and evaluated to ensure excellent signal coupling into the receiver.

  3. Development of the Multi-Level Seismic Receiver (MLSR)

    NASA Astrophysics Data System (ADS)

    Sleefe, G. E.; Engler, B. P.; Drozda, P. M.; Franco, R. J.; Morgan, Jeff

    1995-02-01

    The Advanced Geophysical Technology Department (6114) and the Telemetry Technology Development Department (2664) have, in conjunction with the Oil Recovery Technology Partnership, developed a Multi-Level Seismic Receiver (MLSR) for use in crosswell seismic surveys. The MLSR was designed and evaluated with the significant support of many industry partners in the oil exploration industry. The unit was designed to record and process superior quality seismic data while operating in severe borehole environments, including high temperature (up to 200 °C) and static pressure (10,000 psi). This development has utilized state-of-the-art technology in transducers, data acquisition, and real-time data communication and data processing. The mechanical design of the receiver has been carefully modeled and evaluated to ensure excellent signal coupling into the receiver.

  4. Verifying End-to-End Protocols using Induction with CSP/FDR

    Microsoft Academic Search

    S. J. Creese; Joy N. Reed

    1999-01-01

    We investigate a technique, suitable for process algebraic, finite-state machine (model-checking) automated tools, for formally modelling arbitrary network topologies. We model aspects of a protocol for multiservice networks, and demonstrate how the technique can be used to verify end-to-end properties of protocols designed for arbitrary numbers of intermediate nodes. Our models are presented in a version of CSP allowing automatic

  5. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    SciTech Connect

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele [Department of Civil and Environmental Engineering, University of Catania, Viale Andrea Doria no. 6, 95125, Catania (Italy)

    2008-07-08

    Analysis of the seismic performance of a solid waste landfill generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should be different. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strength of liner materials due to the effect of inertial forces in the refuse mass. In the paper the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations of waste properties, fill heights, site conditions, and design rock motions.

  6. ELASTIC-WAVEFIELD SEISMIC STRATIGRAPHY: A NEW SEISMIC IMAGING TECHNOLOGY

    SciTech Connect

    Bob A. Hardage

    2004-05-06

    The focus of elastic-wavefield seismic stratigraphy research shifted from onshore prospects to marine environments during this report period. Four-component ocean-bottom-cable (4-C OBC) seismic data acquired in water depths of 2400 to 2500 feet across Green Canyon Block 237 in the Gulf of Mexico were processed and analyzed. The P-P and P-SV images of strata immediately below the seafloor exhibit amazing differences in P-P and P-SV seismic facies. These data may be one of the classic examples of the basic concepts of elastic-wavefield seismic stratigraphy.

  7. Poor boy 3D seismic effort yields South Central Kentucky discovery

    SciTech Connect

    Hanratty, M. [Hanratty (Michael), Grapevine, TX (United States)

    1996-11-04

    Clinton County, Ky., is on the eastern flank of the Cincinnati arch and the western edge of the Appalachian basin and the Pine Mountain overthrust. Clinton County has long been known for high-volume fractured carbonate wells. The discovery of these fractured reservoirs, unfortunately, has historically been serendipitous. The author currently uses 2D seismic and satellite imagery to design 3D high-resolution seismic shoots. This method has proven to be the most efficient and is the core of his program. The paper describes exploration methods, seismic acquisition, the well data base, and seismic interpretation.

  8. Assessment of wind turbine seismic risk : existing literature and simple study of tower moment demand.

    SciTech Connect

    Prowell, Ian (University of California, San Diego, CA); Veers, Paul S.

    2009-03-01

    Various sources of risk exist for all civil structures, one of which is seismic risk. As structures change in scale, the magnitude of seismic risk changes relative to risk from other sources. This paper presents an introduction to seismic hazard as applied to wind turbine structures. The existing design methods and research regarding seismic risk for wind turbines are then summarized. Finally, a preliminary assessment is made based on current guidelines to understand how tower moment demand scales as rated power increases. Potential areas of uncertainty in the application of the current guidelines are summarized.

  9. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
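
    The encoding and decoding described above can be illustrated in a few lines: impact times are generated so that the impact rate ramps linearly with time, and decoding simply shifts the continuous record by each recorded impact time and stacks. The sketch below is a toy illustration of that shift-and-stack idea, not the authors' field implementation.

```python
# Toy sketch of the Swept Impact Seismic Technique (SIST) idea:
# impacts at a linearly increasing rate, decoded by shift-and-stack.
import numpy as np

def impact_times(duration_s, rate_start_hz, rate_end_hz):
    """Impact times for a rate that ramps linearly from rate_start to rate_end."""
    times, t = [], 0.0
    while t < duration_s:
        times.append(t)
        rate = rate_start_hz + (rate_end_hz - rate_start_hz) * t / duration_s
        t += 1.0 / rate
    return np.array(times)

def shift_and_stack(record, times, fs, out_len):
    """Align the record on each impact time and stack (the decoding step)."""
    out = np.zeros(out_len)
    for t in times:
        i = int(round(t * fs))
        seg = record[i:i + out_len]
        out[:seg.size] += seg
    return out

if __name__ == "__main__":
    fs = 1000.0
    times = impact_times(10.0, 5.0, 25.0)             # assumed sweep parameters
    earth = np.zeros(400)                              # toy reflectivity series
    earth[[60, 180, 300]] = [1.0, 0.6, 0.3]
    record = np.zeros(int(12 * fs))
    for t in times:                                    # superpose delayed responses
        i = int(round(t * fs))
        record[i:i + earth.size] += earth
    record += 0.5 * np.random.randn(record.size)       # ambient noise
    stacked = shift_and_stack(record, times, fs, earth.size)
    print("strongest decoded samples:", np.argsort(stacked)[-3:])
```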

  10. Seismic exploration method

    SciTech Connect

    Hackett, G.K.

    1980-09-16

    A seismic exploration method in which a coded energy signal is generated and transmitted into the earth. The seismic energy reflected from within the earth is sensed and sampled to form a raw trace, and the raw trace is crosscorrelated with a record of the coded energy signal. A predictive operator derived from the autocorrelation function of the coded energy signal is then subtractively applied to the crosscorrelated trace to remove correlation residuals and thereby produce a high-quality processed trace. Geophysical principles are used to interpret the processed trace and identify subterranean mineral deposits of interest. Based on this interpretation, at least one well is drilled to further explore and/or develop the mineral deposits of interest.
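
    The core of the processing described is the crosscorrelation of the raw trace with the recorded source code, which compresses each coded arrival back into a sharp pulse. The sketch below shows that compression step on synthetic data; the code waveform and reflectivity are invented for illustration, and the subsequent predictive-operator step for removing correlation residuals is not included.

```python
# Crosscorrelation of a coded-source raw trace with the source signature
# (the compression step only; residual removal via a predictive operator
# derived from the code autocorrelation is omitted from this sketch).
import numpy as np

def correlate_with_code(raw_trace: np.ndarray, code: np.ndarray) -> np.ndarray:
    """Crosscorrelate trace with code; output aligned so lag 0 is sample 0."""
    full = np.correlate(raw_trace, code, mode="full")
    return full[code.size - 1:]                # keep non-negative lags only

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    code = rng.choice([-1.0, 1.0], size=256)   # invented pseudo-random source code
    reflectivity = np.zeros(600)
    reflectivity[[100, 250, 420]] = [1.0, -0.7, 0.4]
    raw = np.convolve(reflectivity, code)      # earth response to the coded source
    raw += 0.5 * rng.standard_normal(raw.size) # additive noise
    processed = correlate_with_code(raw, code)
    print("largest |amplitude| at samples:", np.argsort(np.abs(processed))[-3:])
```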

  11. Seismicity, seismology and erosion

    NASA Astrophysics Data System (ADS)

    Hovius, Niels; Meunier, Patrick; Burtin, Arnaud; Marc, Odin

    2013-04-01

    At the interface of geomorphology and seismology, patterns of erosion can be used to constrain seismic processes, and seismological instruments to determine geomorphic activity. For example, earthquakes trigger mass wasting in proportion to peak ground velocity or acceleration, modulated by local geologic and topographic conditions. This geomorphic response determines the mass balance and net topographic effect of earthquakes. It can also be used to obtain information about the distribution of seismic slip where instrumental observations are not available. Equally, seismometers can register the signals of geomorphic processes, revealing their location, type and magnitude. The high temporal resolution of such records can help determine the exact meteorological conditions that gave rise to erosion events, and the interactions between individual surface processes during such events. We will illustrate this synergy of disciplines with examples from active mountain belts around the world, including Taiwan, Japan, Papua New Guinea and the Alps.

  12. Seismic Applications of Nonlinear Response Spectra Based on the Theory of Modal Analysis

    Microsoft Academic Search

    K. K. F. Wong

    2011-01-01

    A fast nonlinear response spectra analysis algorithm based on the theory of modal analysis and superposition is proposed to overcome the drawbacks of using the time-consuming nonlinear response history analysis in seismic design. Because linear modal analysis has found great acceptance in performance-based seismic engineering, it is here extended to the nonlinear domain by using the force analogy method that

  13. Seismic behavior of a post-tensioned integral bridge including soil-structure interaction (SSI)

    Microsoft Academic Search

    Constantine Spyrakos; George Ioannidis

    2003-01-01

    Current practice usually pays little attention to the effect of soil-structure interaction (SSI) on seismic analysis and design of bridges. The objective of this research study is to assess the significance of SSI on the modal with geometric stiffness and seismic response of a bridge with integral abutments that has been constructed using a new bridge system technology. Emphasis is

  14. Developing Smart Seismic Arrays: A Simulation Environment, Observational Database, and Advanced Signal Processing

    Microsoft Academic Search

    P E Harben; D Harris; S Myers; S Larsen; J Wagoner; J Trebes; K Nelson

    2003-01-01

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to

  15. Applying Seismic Methods to National Security Problems: Matched Field Processing With Geological Heterogeneity

    Microsoft Academic Search

    S Myers; S Larsen; J Wagoner; B Henderer; D McCallen; J Trebes; P Harben; D Harris

    2003-01-01

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to

  16. Seismic response analysis of cylindrical tanks with initial irregularities on side walls

    Microsoft Academic Search

    H. Zui; T. Shinke

    1984-01-01

    The objective of this paper is to report the findings of the study on the effects of initial irregularities on the seismic response of cylindrical tanks. Initial irregularities such as out-of-roundness induce circumferential hydrodynamic pressure components of high-order modes, which are neglected in current design assumptions. Seismic response formulas for cylindrical tanks with arbitrary initial irregularities have been derived from

  17. Seismic response analysis of cylindrical tanks with initial irregularities on side walls

    Microsoft Academic Search

    H. Zui; T. Shinke

    1985-01-01

    The objective of this paper is to report the findings of the study on the effects of initial irregularities on the seismic response of cylindrical tanks. The initial irregularities induce circumferential hydrodynamic pressure components of high-order modes, which are neglected in current design assumptions. Seismic response formulas for cylindrical tanks with arbitrary initial irregularities have been derived from Lagrange's kinematic

  18. Seismic Considerations--Elementary and Secondary Schools, Revised Edition. Earthquake Hazards Reduction Series 34.

    ERIC Educational Resources Information Center

    Building Seismic Safety Council, Washington, DC.

    Elementary and secondary schools deserve special attention with respect to seismic safety because of their special occupancy characteristics and their importance to immediate and long-term earthquake disaster relief and recovery efforts. Seismic safety provisions, when incorporated in a sound design from the very beginning, usually amount to only

  19. The use of waveform similarity to define planes of mining-induced seismic events

    Microsoft Academic Search

    S. M. Spottiswoode; A. M. Milev

    1998-01-01

    Mining-induced seismicity results from a complex interaction of ambient and mining-induced stresses acting on a rock mass that has been intersected by a variety of geological weaknesses and discontinuities. A major challenge is to improve mine design methodology by identifying the geological features that are seismically active and then changing the direction of mining to reduce the potential for shear

  20. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...2011-01-01... Seismic acknowledgments. Section 1792.104 ...REGULATIONS, AND EXECUTIVE ORDERS; Seismic Safety of Federally Assisted New Building Construction; 1792.104 Seismic acknowledgments. For each...

  1. Seismic Facies Characterization By Scale Analysis

    E-print Network

    Herrmann, Felix J.

    2000-01-01

    Over the years, there has been an ongoing struggle to relate well-log and seismic data due to the inherent bandwidth limitation of seismic data, the problem of seismic amplitudes, and the apparent inability to delineate ...

  2. Automating Shallow Seismic Imaging

    Microsoft Academic Search

    Steeples, Don W.

    2003-01-01

    Our primary research focus during the current three-year period of funding has been to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Recent tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' have shown that large numbers of geophones can be placed quickly

  3. Seismic Eruption Teaching Modules

    NSDL National Science Digital Library

    Lawrence Braile

    This site presents educational modules for teaching about earthquakes, volcano eruptions and related plate tectonic concepts using an interactive computer program for mapping called Seismic/Eruption (also called SeisVolE). The program includes up-to-date earthquake and volcanic eruption catalogs and allows the user to display earthquake and volcanic eruption activity in "speeded up real time" on global, regional or local maps that also show the topography of the area in a shaded relief map image. SeisVolE is an interactive program that includes a number of tools that allow the user to analyze earthquake and volcanic eruption data and produce effective displays to illustrate seismicity and volcano patterns. The program can be used to sort data and provide results for statistical analysis, to generate detailed earthquake and volcano activity maps of specific areas or for specific purposes, to investigate earthquake sequences such as foreshocks and aftershocks, and to produce cross section or 3-D perspective views of earthquake locations. The Seismic/Eruption program can be a powerful and effective tool for teaching about plate tectonics and geologic hazards using earthquake and volcano locations, and for learning (or practicing) fundamental science skills such as statistical analysis, graphing, and map skills. The teaching modules describe and illustrate how to use the Seismic/Eruption program effectively in demonstrations, classroom presentations and interactive presentations, and independent study/research. Because the program has many useful options and can be used to examine earthquake activity and volcanic eruption data, the modules provide instructions and examples of quantitative analysis, graphing of results, creating useful maps and cross section diagrams, and performing in-depth exploration and research. The examples are intended to illustrate the features and capabilities of the program and stimulate interest in using the program for discovery learning in Earth science, especially earthquakes, volcanoes and plate tectonics.

  4. Modeling Beam-Column Joints in Fragility Assessment of Gravity Load Designed Reinforced Concrete Frames

    Microsoft Academic Search

    Ozan Cem Celik; Bruce R. Ellingwood

    2008-01-01

    Reinforced concrete (RC) frame structures customarily have been designed in regions of low-to-moderate seismicity with little or no consideration of their seismic resistance. The move toward performance-based seismic engineering requires accurate reliability-based structural analysis models of gravity load designed (GLD) RC frames for predicting their behavior under seismic effects and for developing seismic fragilities that can be used as a

  5. Seismic excitation by space shuttles

    Microsoft Academic Search

    H. Kanamori; J. Mori; B. Sturtevant; D. L. Anderson; T. Heaton

    1992-01-01

    Shock waves generated by the space shuttles Columbia (August 13, 1989), Atlantis (April 11, 1991) and Discovery (September 18, 1991) on their return to Edwards Air Force Base, California, were recorded by TERRAscope (Caltech's broadband seismic network), the Caltech-U.S.G.S Southern California Seismic Network (SCSN), and the University of Southern California (USC) Los Angeles Basin Seismic Network. The spatial pattern of

  6. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2003-06-01

    Our primary research focus during the current three-year period of funding has been to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Recent tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' have shown that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under all conditions (please see the Status and Results of Experiments sections for details). In some easy-access environments, this device is expected to make shallow seismic surveying considerably more efficient and less expensive. Another element of our research plan involved monitoring the cone of depression around a pumping well, with the well serving as a proxy location for fluid-flow at a contaminated DOE site. To try to achieve that goal, we collected data from a well site at which drawdown equilibrium had been reached and at another site during a pumping test. Data analysis disclosed that although we were successful in imaging the water table using seismic reflection techniques (Johnson, 2003), we were not able to explicitly delineate the cone of depression (see Status and Results of Experiments).

  7. Anonymity and verifiability in multi-attribute reverse auction

    E-print Network

    Srinath, T R; Pais, Alwyn Roshan; 10.5121/ijitcs.2011.1401

    2011-01-01

    The use of e-Auction services has been increasing in recent years. The security requirements in conducting e-Auctions are mainly bid privacy, anonymity and public verifiability. Most secure protocols concentrate on privacy and anonymity, which are achieved through bidder-resolved multi-party computation, assuming two or more trusted third parties, either through numerous auctioneers or with asymmetric models in which the commercial entity of an auction issuer or registration manager is assumed in addition to the auctioneer. Multi-attribute reverse auctions promise higher market efficiency and effective information exchange. This work extends and uses the existing schemes. The scheme uses a scoring function and winner determination in multi-attribute auctions to implement public verifiability, and anonymity is achieved through bidder-side pseudonym generation. Results and analysis show that this is a simple and effective scheme. This scheme ensures public verifiability and anonymity in multi-attribute auctions w...
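
    A multi-attribute reverse auction typically scores each bid with a weighted utility over price and non-price attributes and selects the highest-scoring bid; publishing the scoring function is what makes the outcome publicly checkable. The sketch below shows a generic additive scoring and winner-determination step; the weights, attributes, and normalisation are illustrative assumptions, not the specific protocol proposed in the paper.

```python
# Generic additive scoring and winner determination for a multi-attribute
# reverse auction (illustrative; not the specific scheme of the paper).
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str
    price: float        # lower is better
    quality: float      # 0..1, higher is better
    delivery_days: int  # lower is better

def score(bid: Bid, max_price: float, max_days: int, weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted additive score; each attribute normalised to [0, 1]."""
    w_price, w_quality, w_delivery = weights
    return (w_price * (1.0 - bid.price / max_price)
            + w_quality * bid.quality
            + w_delivery * (1.0 - bid.delivery_days / max_days))

def determine_winner(bids, max_price=1000.0, max_days=30):
    return max(bids, key=lambda b: score(b, max_price, max_days))

if __name__ == "__main__":
    bids = [Bid("A", 800.0, 0.9, 10), Bid("B", 600.0, 0.6, 20), Bid("C", 700.0, 0.8, 5)]
    winner = determine_winner(bids)
    print("winner:", winner.bidder, "score:", round(score(winner, 1000.0, 30), 3))
```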

  8. Usage of Friction-damped Braced Frames for Seismic Vibration Control

    E-print Network

    Fink, Brynnan 1992-

    2012-04-16

    resistant structures. Author keywords: Design earthquake loads; Input energy; Inelastic structures; Ductility ratio; Hysteretic energy; Damage indexes; Damage spectra; Nonlinear optimization. Introduction The assessment of seismic performance... possible structural damage under seismic loads. The earthquake-resistant design of structures has been an active area of research for many decades. Early works have dealt with specifying earthquake loads in terms of the elastic and inelastic design...

  9. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-04-01

    In this report we will show results of seismic and well log derived attenuation attributes from a deep water Gulf of Mexico data set. This data was contributed by Burlington Resources and Seitel Inc. The data consists of ten square kilometers of 3D seismic data and three well penetrations. We have computed anomalous seismic absorption attributes on the seismic data and have computed Q from the well log curves. The results show a good correlation between the anomalous absorption (attenuation) attributes and the presence of gas as indicated by well logs.

  10. Reassessment of the Seismicity and seismic hazards of Libya

    NASA Astrophysics Data System (ADS)

    Ben Suleman, A.; Elmeladi, A.

    2009-04-01

    The tectonic evolution of Libya, located at the northern extreme of the African continent, has yielded a complex crustal structure that is composed of a series of basins and uplifts. The present-day deformation of Libya is the result of the Eurasia-Africa continental collision. At the end of 2005, the Libyan National Seismological Network was established to monitor local, regional and teleseismic activity, as well as to provide high-quality data for research projects both locally and on the regional and global scale. This study aims to discuss the seismicity of Libya by using the new data from the Libyan national seismological network and to focus on the seismic hazards. At first glance the seismic activity map shows dominant trends of seismicity, with most of the seismic activity concentrated along the northern coastal areas. Four major seismic trends are quite noticeable. The first trend strikes NW-SE, coinciding with the eastern border of the Hun Graben. A second trend, also striking NW-SE in the offshore area, might be a continuation of the first. The other two trends are located in the western Gulf of Sirt and on the Cyrenaica platform. The rest of the seismicity is diffuse, either offshore or inland, with no good correlation with well-mapped faults. Detailed investigation of the Libyan seismicity indicates that Libya has experienced earthquakes of varying magnitudes and that there is definitely a certain amount of seismic risk involved in engineering projects, particularly in the northern regions. Detailed investigation of the distribution of the Libyan earthquakes in space and time, along with all other geological considerations, suggested the classification of the country into four seismic zones, with the Hun graben zone being the most seismically active.

  11. Seismic Imager Space Telescope

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers, Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe; Ampuero, Jean Paul; Leprince, Sebastien; Michel, Remi

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. A GSI would revolutionize the understanding of earthquake dynamics, perhaps leading ultimately to effective warning capabilities, to improved management of earthquake risk, and to improved public safety policies. The position of the spacecraft, its high optical quality, large field of view, and large field of regard will make it an ideal platform for other scientific studies. The same data could be simply reused for other studies. If different data, such as multi-spectral data, is required, additional instruments could share the telescope.

  12. The SCALE Verified, Archived Library of Inputs and Data - VALID

    SciTech Connect

    Marshall, William BJ [ORNL]; Rearden, Bradley T [ORNL]

    2013-01-01

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  13. A methodology for assessment of nuclear power plant seismic margin

    SciTech Connect

    Reed, J.W. (Benjamin (J.R.) and Associates, Inc., Mountain View, CA (United States)); Kennedy, R.P. (Structural Mechanics Consulting, Inc., Yorba Linda, CA (United States)); Buttemer, D.R. (Pickard, Lowe and Garrick, Inc., Newport Beach, CA (United States)); Idriss, I.M. (California Univ., Davis, CA (United States). Dept. of Civil Engineering); Moore, D.P.; Barr, T.; Wooten, K.D.; Smith, J.E. (South

    1991-08-01

    EPRI's seismic margin methodology enables utility engineers to quantify a nuclear power plant's ability to withstand an earthquake greater than the design basis and still safely shut down for at least 72 hours. This cost-effective, practical methodology uses generic screening of system and component seismic ruggedness and does not require probabilistic calculations. The revision adds depth, detail, and more complete procedures to the original report but does not change the basic method. The seismic margin methodology enables engineers to choose a functional success path and several alternatives to shut down the plant, and to identify the subset of plant structures and components associated with the path selected. The methodology further provides guidelines for screening the seismic ruggedness of the subset components and structures. Components that satisfy the screening criteria require no further evaluation; the remaining components require deterministic evaluations using in-structure motions calculated for the earthquake being assessed. The seismic margin earthquake (SME) is chosen to be sufficiently larger than the SSE to establish a significant seismic margin. Methodology procedures determine the weakest-link components and establish the HCLPF (high confidence of low probability of failure) level of ground motion for which the plant can safely shut down. Plant walkdowns ensure component compliance with the screening guideline conditions. The revision gives detailed methods for calculating HCLPF values for an extensive variety of components.
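
    HCLPF capacities in this kind of margin assessment are commonly computed from a lognormal fragility description, HCLPF = Am * exp(-1.65 * (beta_R + beta_U)), where Am is the median ground-motion capacity and beta_R, beta_U are the randomness and uncertainty log-standard deviations. The sketch below evaluates that generic fragility relation for assumed parameters; it is not a reproduction of the EPRI screening procedure.

```python
# HCLPF capacity from a lognormal fragility description (generic relation used
# in seismic margin and fragility work; parameter values below are assumptions).
import math

def hclpf(a_median_g: float, beta_r: float, beta_u: float) -> float:
    """HCLPF ~ Am * exp(-1.65 * (beta_r + beta_u)): 95% confidence of <5% failure."""
    return a_median_g * math.exp(-1.65 * (beta_r + beta_u))

if __name__ == "__main__":
    Am, beta_r, beta_u = 0.9, 0.25, 0.35   # assumed component fragility parameters
    print(f"HCLPF capacity ~ {hclpf(Am, beta_r, beta_u):.2f} g")
```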

  14. Seismic Prediction While Drilling (SPWD): Seismic exploration ahead of the drill bit using phased array sources

    NASA Astrophysics Data System (ADS)

    Jaksch, Katrin; Giese, Rüdiger; Kopf, Matthias

    2010-05-01

    In the case of drilling for deep reservoirs, prior exploration is indispensable. In recent years the focus has shifted more towards geological structures such as thin layers or hydrothermal fault systems. Besides 2D or 3D surface seismics and borehole seismic measurements such as Vertical Seismic Profiling (VSP) or Seismic While Drilling (SWD), these methods cannot always resolve such structures; the resolution worsens the deeper and smaller the sought-after structures are. Potential horizons such as thin layers in oil exploration, or fault zones usable for geothermal energy production, could therefore be missed or remain unidentified while drilling. A device to explore the geology at high resolution ahead of the drill bit, in the direction of drilling, would be of great importance: it would allow the drilling path to be adjusted according to the actual geology and would minimize the exploration risk and hence the drilling costs. Within the SPWD project a device for seismic exploration ahead of the drill bit is being developed. The device should allow seismic prediction of areas about 50 to 100 meters ahead of the drill bit with a resolution of one meter. At the GFZ a first prototype consisting of separate units for seismic sources, receivers and data loggers has been designed and manufactured. Four standard magnetostrictive actuators serve as seismic sources and four 3-component geophones as receivers. Every unit, actuator or geophone, can be rotated in steps of 15° around the longitudinal axis of the prototype to test different measurement configurations. The SPWD prototype emits signal frequencies of about 500 up to 5000 Hz, significantly higher than in VSP and SWD. Increased radiation of seismic wave energy in the direction of the borehole axis allows a view into the areas to be drilled. Therefore, every actuator must be controlled independently with respect to the amplitude and phase of the source signal, so that the energy of the seismic source is maximized and a sufficient exploration range is reached. The next step for focusing is to use the phased array method: depending on the seismic wave velocities of the surrounding rock, the spacing of the actuators and the frequencies used, the signal phase for each actuator can be determined. For about a year, several measurements with the prototype have been carried out under defined conditions at a test site in a mine. The test site consists of a rock block of about 100 by 200 meters surrounded by three galleries. For testing the prototype, two horizontal boreholes were drilled; they are directed towards one of the galleries to obtain a strong reflector. The borehole seismic data show overall a good signal-to-noise ratio in amplitude and frequency spectra; the data quality correlates strongly with the fracture density along the borehole, with fractured sections associated with a lower signal-to-noise ratio. Additionally, the geophones of the prototype record reflections from ahead and from behind in the seismic data. In particular, the reflections from the gallery ahead are used for the calibration of the focusing. The direct seismic wave field shows distinct compression and shear waves. The analysis of several seismic measurements, with a focus on the direct seismic waves, shows that the phased array technology can explicitly influence the directional characteristics of the radiated seismic waves.
The amplitudes of the seismic waves can be enhanced up to three times in the desired direction and simultaneously attenuated in the reverse direction. A major step towards directional investigation in boreholes has thus been accomplished, but the focusing of the seismic waves still has to be improved in further measurements, by calibrating the initiating source signals, in order to maximize the energy in the desired direction. A next step this year is the development of a wireline prototype for application in vertical boreholes with depths of no more than 2000 meters. The prototype must be modified and adapted to the conditions in
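
    The phased-array step mentioned above amounts to computing, for each actuator, the time (and hence phase) delay needed so that the wavefronts add constructively in a chosen steering direction. The sketch below computes those delays for a linear array of sources along the borehole axis in a homogeneous medium; the spacing, velocity, frequency and steering angle are assumed values for illustration only.

```python
# Phased-array steering sketch: per-actuator time and phase delays for a
# linear source array along the tool axis (homogeneous medium assumed;
# spacing, velocity, frequency and steering angle are illustrative values).
import math

def steering_delays(n_sources, spacing_m, velocity_m_s, steer_angle_deg):
    """Firing delays that align wavefronts in the steering direction.

    Sources sit at x_i = i * spacing_m along the tool axis; the steering angle
    is measured from that axis (0 deg = straight ahead along the axis).
    """
    cos_a = math.cos(math.radians(steer_angle_deg))
    raw = [i * spacing_m * cos_a / velocity_m_s for i in range(n_sources)]
    t0 = min(raw)
    return [r - t0 for r in raw]   # shift so the earliest firing time is zero

if __name__ == "__main__":
    freq_hz = 2000.0               # within the 500-5000 Hz source band
    delays = steering_delays(n_sources=4, spacing_m=0.5,
                             velocity_m_s=4000.0, steer_angle_deg=0.0)
    for i, d in enumerate(delays):
        phase_deg = (360.0 * freq_hz * d) % 360.0
        print(f"actuator {i}: delay = {d*1e6:6.1f} us, phase = {phase_deg:6.1f} deg")
```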

  15. Enormous sums of money are invested by industry and scientific funding agencies every year in seismic, well log-

    E-print Network

    optimally (Curtis, 1999; Curtis et al., 2004); updating shallow resistivity survey designs in real ... in seismic, well logging, electromagnetic, earthquake monitoring and micro-seismic surveys, and in laboratory-based experiments. For each survey or experiment a design process must first take place

  16. Delineation of potential seismic sources for seismic zoning of Iran

    Microsoft Academic Search

    Noorbakhsh Mirzaei; Mengtan Gao; Yun-Tai Chen

    1999-01-01

    A total of 235 potential seismic sources in Iran and neighboring regions are delineated based on available geological, geophysical, tectonic and earthquake data for seismic hazard assessment of the country. In practice, two key assumptions are considered; first, the assumption of earthquake repeatedness, implying that major earthquakes occur preferentially near the sites of previous earthquakes; second, the assumption of tectonic

  17. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1992-01-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  18. Benefits of vertical and horizontal seismic isolation for LMR (liquid metal reactor) nuclear reactor units

    SciTech Connect

    Wu, Ting-shu; Chang, Y.W.; Seidensticker, R.W.

    1988-01-01

    Seismic isolation has been shown to be able to reduce transmitted seismic force and lower response accelerations of a structure. When applied to nuclear reactors, it will minimize seismic influence on the reactor design and provide a design which is less site dependent. In liquid metal reactors where components are virtually at atmospheric pressure but under severe thermal conditions, thin-walled structures are generally used for primary systems. Thin-walled structures, however, have little inherent seismic resistance. The concept of seismic isolation therefore offers a viable and effective approach that permits the reactor structures to better withstand thermal and seismic loadings simultaneously. The majority of published work on seismic isolation deals with use of horizontal isolation system only. In this investigation, however, local vertical isolation is also provided for the primary system. Such local vertical isolation is found to result in significant benefits for major massive components, such as the reactor cover, designed to withstand vertical motions and loadings. Preliminary estimations on commodity savings of the primary system show that, with additional local vertical isolation, the savings could be twice that estimated for horizontal isolation only. The degree of effectiveness of vertical isolation depends on the diameter of the reactor vessel. As the reactor vessel diameter increases, the vertical seismic effects become more pronounced and vertical isolation can make a significant contribution.

  19. VERIFYING THE DETERMINANT IN PARALLEL - Miklos Santha and Sovanna Tan

    E-print Network

    Fondements et Applications, Université Paris 7

    VERIFYING THE DETERMINANT IN PARALLEL Miklos Santha and Sovanna Tan Abstract. In this paper we of the verification of problems whose computation is equivalent to the determinant. We observe that for a few problems different reductions the class of problems which are reducible to the verification of the determinant

  20. Engineering a Sound Assertion Semantics for the Verifying Compiler

    Microsoft Academic Search

    Patrice Chalin

    2010-01-01

    The Verifying Compiler (VC) project is a core component of the Dependable Systems Evolution Grand Challenge. The VC offers the promise of automatically proving that a program or component is correct, where correctness is defined by program assertions. While several VC prototypes exist, all adopt a semantics for assertions that is unsound. This paper presents a consolidation of VC requirements

  1. Verifying Resource Access Control on Mobile Interactive Devices - Frédéric Besson

    E-print Network

    Paris-Sud XI, Université de

    definition of the basic security policy to enforce, viz. that an application will always ask for a permission ... example is the security architecture for embedded Java on mobile telephones, defined in the Mobile ... Verifying Resource Access Control on Mobile Interactive Devices. Frédéric Besson, Guillaume Dufay

  2. Efficient and Verifiable Algorithms for Secure Outsourcing of Cryptographic Computations

    E-print Network

    Efficient and Verifiable Algorithms for Secure Outsourcing of Cryptographic Computations. Mehmet ... be reduced by means of secure outsourcing of modular exponentiations to a potentially untrusted server S, known as secure outsourced computation. We propose new efficient outsourcing algorithms for modular

  3. Secure and Verifiable Outsourcing of Large-Scale Biometric Computations

    E-print Network

    Blanton, Marina

    We consider secure outsourcing of large-scale biometric experiments to a cloud, where privacy of the data must be preserved. One of the largest possibilities that the cloud enables is computation outsourcing, in which the client uses cloud resources to carry out computations over its data.
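
    One generic way to make such outsourced batch computations verifiable is to hide comparisons whose answers the client already knows inside the workload and reject the batch if any sentinel comes back wrong. The sketch below is a hedged, generic illustration of that idea, not the specific protocol of the paper; the Hamming-distance matcher and all names are assumptions.

```python
import random

def hamming_distance(a, b):
    """Toy biometric matcher: Hamming distance between equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def cloud_compute(pairs):
    """Stand-in for the untrusted cloud: scores every pair it receives."""
    return [hamming_distance(a, b) for a, b in pairs]

def verified_outsource(real_pairs, sentinel_pairs):
    """Hide known-answer sentinels among the real pairs and check them on return."""
    expected = [hamming_distance(a, b) for a, b in sentinel_pairs]
    batch = list(real_pairs) + list(sentinel_pairs)
    order = list(range(len(batch)))
    random.shuffle(order)                      # cloud sees a shuffled batch
    results_shuffled = cloud_compute([batch[i] for i in order])
    results = [None] * len(batch)
    for pos, i in enumerate(order):
        results[i] = results_shuffled[pos]
    n = len(real_pairs)
    if results[n:] != expected:                # any wrong sentinel -> reject all
        raise ValueError("sentinel mismatch: cloud results rejected")
    return results[:n]

if __name__ == "__main__":
    real = [("0110", "0111"), ("1010", "0010")]
    sentinels = [("0000", "1111")]             # known answer: distance 4
    print(verified_outsource(real, sentinels)) # [1, 1]
```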

  4. Broadcast (and Round) Efficient Verifiable Secret Sharing Clint Givens

    E-print Network

    Pavel Raykov. September 16, 2013. Verifiable secret sharing (VSS) is a fundamental cryptographic primitive. We study VSS protocols with honest majority. In this setting it is typically assumed that parties are connected by pairwise secure channels; when a third or more of the parties are corrupt, it is impossible to construct VSS (and, more generally, MPC) protocols in this setting without a broadcast channel.

  5. On the Complexity of Verifiable Secret Sharing and Multiparty Computation

    E-print Network

    (Appears in Proc. of STOC 2000.) We first study the problem of doing Verifiable Secret Sharing (VSS) information-theoretically. For all access structures where VSS is possible at all, we show that, up to a polynomial-time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is unconditional.

  6. A Verifiable Secret Sharing Scheme with Statistical Zero-Knowledge

    E-print Network

    Intuitively, a secret sharing scheme is a verifiable secret sharing scheme (VSS for short) if every player can be convinced that the share received from the dealer is correct. In a VSS scheme, therefore, the dealer cannot transmit a false share to any player without detection. The notion of VSS was first introduced by Chor, Goldwasser, Micali, and Awerbuch.
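
    As a concrete picture of what a player can check, the sketch below implements Feldman-style share verification over toy parameters: the dealer publishes commitments to the polynomial coefficients, and a player accepts a share only if it is consistent with those commitments. The group parameters are tiny demonstration values, and the scheme shown is Feldman's, offered only to illustrate the "shares can be checked" idea rather than the statistically zero-knowledge construction of the paper.

```python
import random

# Toy parameters: p = 2q + 1, and g generates the subgroup of order q mod p.
q, p, g = 11, 23, 2
assert pow(g, q, p) == 1

def deal(secret, threshold, n_players):
    """Dealer: random degree-(threshold-1) polynomial with f(0) = secret."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(threshold - 1)]
    shares = {i: sum(c * pow(i, j, q) for j, c in enumerate(coeffs)) % q
              for i in range(1, n_players + 1)}
    commitments = [pow(g, c, p) for c in coeffs]        # broadcast publicly
    return shares, commitments

def verify_share(i, share, commitments):
    """Player i accepts the share only if it matches the public commitments."""
    lhs = pow(g, share, p)
    rhs = 1
    for j, C in enumerate(commitments):
        rhs = (rhs * pow(C, pow(i, j), p)) % p
    return lhs == rhs

if __name__ == "__main__":
    shares, comms = deal(secret=7, threshold=3, n_players=5)
    print(all(verify_share(i, s, comms) for i, s in shares.items()))   # True
    print(verify_share(1, (shares[1] + 1) % q, comms))                 # False: bad share
```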

  7. A Verifiable Secret Sharing Scheme with Statistical Zero-Knowledge

    E-print Network

    Intuitively, a secret sharing scheme is a verifiable secret sharing scheme (VSS for short) if every player can be convinced that the share received from the dealer is correct. In a VSS scheme, therefore, the dealer cannot transmit a false share to any player without detection. The notion of VSS was first introduced by Chor, Goldwasser, Micali, and Awerbuch.

  8. Verifying Program Optimizations in Agda Case Study: List Deforestation

    E-print Network

    Abel, Andreas

    3 July 2012. We verify, in Agda, an instance of deforestation. As a result we show that the summation of the first n natural numbers, implemented by producing the list of those numbers and then folding it with addition, equals the corresponding deforested, list-free computation. The elimination of intermediate data structures is called deforestation, since data structures are tree-shaped in the general case; in our case study they are lists.
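
    The equality being machine-checked is easy to state in ordinary code. The Python sketch below (an illustration of the deforestation idea under assumed function names, not the Agda development itself) compares the list-building definition of the sum of the first n naturals with its fused, list-free counterpart and with the closed form n(n+1)/2.

```python
def sum_via_list(n):
    """Produce the intermediate list [1, ..., n], then fold it with +."""
    xs = list(range(1, n + 1))     # the intermediate structure deforestation removes
    total = 0
    for x in xs:
        total += x
    return total

def sum_fused(n):
    """Deforested version: accumulate directly, no intermediate list."""
    total = 0
    for x in range(1, n + 1):
        total += x
    return total

def sum_closed_form(n):
    return n * (n + 1) // 2

if __name__ == "__main__":
    # Spot-check the equality that a proof assistant establishes for all n.
    assert all(sum_via_list(n) == sum_fused(n) == sum_closed_form(n)
               for n in range(0, 200))
    print("all three definitions agree on 0..199")
```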

  9. Verifying Program Optimizations in Agda Case Study: List Deforestation

    E-print Network

    Abel, Andreas

    16 July 2009. We verify, in Agda, an instance of deforestation. As a result we show that the summation of the first n natural numbers, implemented by producing the list of those numbers and then folding it with addition, equals the corresponding deforested, list-free computation. The elimination of intermediate data structures is called deforestation, since data structures are tree-shaped in the general case; in our case study they are lists.

  10. Variables and Formulas, continued III. Verifying Magic Square Properties

    E-print Network

    White, Donald L.

    Variables and Formulas, continued III. Verifying Magic Square Properties We previously investigated properties of Magic Squares by considering specific examples with numbers. We'll now use variables to further investigate these properties. 1. Suppose we have a 3 by 3 Magic Square, called Square 1, and that the three
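
    The property being verified has a direct computational reading: every row, every column, and both main diagonals of the square must share one sum (for a 3 by 3 square of 1 through 9, that sum is 15). A small checker is sketched below; it tests the property for a particular square numerically, whereas the handout goes on to establish it with variables.

```python
def is_magic(square):
    """Return True if every row, column, and both diagonals share one sum."""
    n = len(square)
    if any(len(row) != n for row in square):
        return False
    target = sum(square[0])
    sums = [sum(row) for row in square]                               # rows
    sums += [sum(square[r][c] for r in range(n)) for c in range(n)]   # columns
    sums.append(sum(square[i][i] for i in range(n)))                  # main diagonal
    sums.append(sum(square[i][n - 1 - i] for i in range(n)))          # anti-diagonal
    return all(s == target for s in sums)

if __name__ == "__main__":
    square1 = [[2, 7, 6],
               [9, 5, 1],
               [4, 3, 8]]
    print(is_magic(square1))                              # True: every line sums to 15
    print(is_magic([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))    # False
```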

  11. Verifying Policy-Based Security for Web Services Karthikeyan Bhargavan

    E-print Network

    Fournet, Cedric

    WS-SecurityPolicy is a declarative configuration language for driving web services security mechanisms. We describe a formal semantics for WS-SecurityPolicy, and propose a more abstract link language for specifying the security goals of web services and their clients.

  12. Verifying Proofs in Constant Depth Olaf Beyersdorff1

    E-print Network

    Mahajan, Meena

    The notion of a proof system was introduced by Cook and Reckhow in their seminal paper [13] as a function f that has as its range exactly the set of all tautologies. In addition, one needs to guarantee that proofs can be verified efficiently; in the model of Cook and Reckhow, the verifier runs in polynomial time.
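
    In this setting, "verifying a proof" means running an efficient checker on a (possibly very long) certificate. The hedged sketch below shows the simplest such proof system for propositional tautologies: a proof is an exhaustive truth table, and the verifier's work is polynomial in the length of that proof. It only illustrates the definition; the paper studies much weaker, constant-depth verifiers, and all names here are assumptions.

```python
from itertools import product

def verify_truth_table_proof(formula, n_vars, proof):
    """Accept iff `proof` covers every assignment and `formula` holds on each.

    `formula` is a Python callable on n_vars booleans; `proof` is a list of
    (assignment_tuple, True) pairs supplied by the prover.
    """
    required = set(product([False, True], repeat=n_vars))
    claimed = {tuple(a) for a, v in proof if v is True}
    if claimed != required:                     # proof must cover all assignments
        return False
    return all(formula(*a) for a, _ in proof)   # and the formula must hold on each

if __name__ == "__main__":
    taut = lambda x, y: (x or not x) and (y or not y)
    proof = [(a, True) for a in product([False, True], repeat=2)]
    print(verify_truth_table_proof(taut, 2, proof))        # True
    non_taut = lambda x, y: x or y
    print(verify_truth_table_proof(non_taut, 2, proof))    # False
```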

  13. Committing Encryption and Publicly-Verifiable SignCryption

    E-print Network

    Yitchak Gertner, Amir Herzberg (Department of Computer Science, Bar-Ilan University). Encryption is often conceived as a committing process, in the sense that the ciphertext may serve as a commitment to the plaintext; however, this property does not follow from the standard definitions of secure encryption. We define and construct symmetric and asymmetric committing encryption schemes.

  14. Verifying Stiffness Parameters Of Filament-Wound Cylinders

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Rheinfurth, M.

    1994-01-01

    Predicted engineering stiffness parameters of filament-wound composite-material cylinders are verified with respect to experimental data by use of equations developed straightforwardly from the applicable formulation of Hooke's law. The equations, derived in an engineering study of filament-wound rocket-motor cases, are also applicable to other cylindrical pressure vessels made of orthotropic materials.
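
    The kind of cross-check the brief describes can be mocked up from the standard plane-stress form of Hooke's law for an orthotropic lamina. The sketch below uses generic textbook relations, not the brief's actual equations or data, and the material constants are illustrative assumptions: it builds the compliance matrix from the engineering constants, inverts it, and verifies the reciprocity relation nu12/E1 = nu21/E2 that a consistent set of measured constants must satisfy.

```python
import numpy as np

def plane_stress_compliance(E1, E2, nu12, G12):
    """In-plane compliance matrix [S] of an orthotropic lamina from engineering constants."""
    nu21 = nu12 * E2 / E1            # reciprocity relation
    return np.array([[1 / E1,      -nu21 / E2, 0.0],
                     [-nu12 / E1,   1 / E2,    0.0],
                     [0.0,          0.0,       1 / G12]])

if __name__ == "__main__":
    # Illustrative carbon/epoxy-like constants in GPa, not measured data.
    E1, E2, nu12, G12 = 140.0, 10.0, 0.30, 5.0
    S = plane_stress_compliance(E1, E2, nu12, G12)
    Q = np.linalg.inv(S)                         # stiffness matrix [Q]
    # Consistency checks a verification exercise would apply:
    assert np.allclose(S, S.T), "compliance must be symmetric (reciprocity)"
    assert np.allclose(Q @ S, np.eye(3)), "stiffness must invert compliance"
    stress = np.array([0.5, 0.05, 0.02])         # trial stress state, GPa
    print("Q (GPa):\n", np.round(Q, 2))
    print("strains under the trial stress:", np.round(S @ stress, 5))
```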

  15. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  16. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF OFFICIAL RECORDS AND INFORMATION The Privacy Act 401.45 Verifying...clearly unwarranted invasion of privacy if disclosed to someone other...means, e.g., over the Internet, we require you to...

  17. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF OFFICIAL RECORDS AND INFORMATION The Privacy Act 401.45 Verifying...clearly unwarranted invasion of privacy if disclosed to someone other...means, e.g., over the Internet, we require you to...

  18. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF OFFICIAL RECORDS AND INFORMATION The Privacy Act 401.45 Verifying...clearly unwarranted invasion of privacy if disclosed to someone other...means, e.g., over the Internet, we require you to...

  19. Verifying Pointer Safety for Programs with Unknown Calls

    E-print Network

    Qin, Shengchao

    Chenguang Luo, Florin Craciun, Shengchao Qin. We study the verification of pointer safety for programs with unknown procedure calls. Given a Hoare-style partial correctness specification S = {Pre} C {Post} in separation logic, where the program C contains calls to some unknown procedure U, we infer a specification that U must satisfy in order for the overall program to meet its specification.

  20. Adaptive Cruise Control: Hybrid, Distributed, and Now Formally Verified

    E-print Network

    Sarah M. Loos, André Platzer, Ligia Nistor. We consider a distributed car control system in which every car is controlled by adaptive cruise control. A version of this work appeared at FM [LPN11a]. Keywords: distributed car control, multi-agent systems, highway traffic safety.
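
    The heart of such a verification is a symbolic safe-following-distance condition. As a hedged, simplified sketch (a worst-case derivation under the assumptions stated below, not the invariant proved in the paper, and with illustrative parameter values): suppose both cars can brake at rate b, the follower may keep accelerating at up to A for a reaction time eps before it reacts, and the leader may brake at b immediately; the gap is safe if the follower's worst-case stopping distance minus the leader's stopping distance still fits inside it.

```python
def safe_gap(v_follower, v_leader, gap, A=2.0, b=6.0, eps=0.5):
    """Conservative check that the follower cannot reach the leader.

    Worst case for the follower: accelerate at A for eps seconds, then brake at b.
    Worst case for safety on the leader's side: it brakes at b immediately,
    travelling as little as possible.  SI units; parameter values are illustrative.
    """
    v_after_reaction = v_follower + A * eps
    follower_stop = (v_follower * eps + 0.5 * A * eps ** 2          # reaction phase
                     + v_after_reaction ** 2 / (2 * b))             # braking phase
    leader_stop = v_leader ** 2 / (2 * b)
    return gap > follower_stop - leader_stop

if __name__ == "__main__":
    print(safe_gap(v_follower=30.0, v_leader=25.0, gap=50.0))   # True: enough margin
    print(safe_gap(v_follower=30.0, v_leader=25.0, gap=30.0))   # False: too close
```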