Science.gov

Sample records for verifying seismic design

  1. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

    The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, these schemes are not true designated verifier signature schemes in the traditional sense, because the designated person cannot efficiently simulate a signature indistinguishable from the signer's. This shortcoming makes them unsuitable for settings such as e-voting, calls for tenders, and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. Based on the properties of unitary transformations and a quantum one-way function, only a verifier designated by the signer can verify the validity of a signature, and, through a transcript simulation algorithm, the designated verifier cannot prove to a third party whether the signature was produced by the signer or by himself. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of the scheme. Analysis shows that the new scheme satisfies the main security requirements of a designated verifier signature scheme and withstands the major attack strategies.

  2. Verifying the "correctness" of your optical proximity correction designs

    NASA Astrophysics Data System (ADS)

    Malhotra, Vinod K.; Chang, Fang C.

    1999-07-01

    The emerging demand for ever-smaller IC features, undiminished by the delay of next-generation stepper technologies, has increased the need for OPC and PSM designs, which are becoming critical for leading-edge IC manufacturing. However, the modifications that OPC or PSM design tools make to the original layout generally preclude the use of conventional design verification tools on the modified designs. Therefore, the question of design 'correctness' often goes unanswered until after the wafers have been printed, which is extremely costly in terms of time and money. In this paper, we address this critical issue, which has thus far remained open: the development of methods for physical verification of OPC designs. Our approach uses fast lithography simulation to map the modified mask design to the final patterns produced on the wafer. The simulated wafer pattern is matched against the specified tolerances, and the problem areas are reported. It is a hierarchical verification tool; the hierarchical processing of the data makes it a high-performance tool and keeps the data volume in check. We validate this technology by comparing the simulation results with experimental data. In addition, performance measurements indicate that it is an effective and practical solution to the problem of verifying the correctness of full-chip OPC designs.
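    The tolerance check described above can be pictured with a simplified 1-D stand-in (names and the edge representation are illustrative, not the paper's tool): compare simulated edge positions against design targets and report where the edge-placement error exceeds the specification.

```python
# Hypothetical sketch: flag edge-placement errors (EPE) exceeding a tolerance.
# A 1-D simplification of full-chip contour-vs-target checking.

def find_epe_violations(design_edges, simulated_edges, tol_nm):
    """Return indices of edges whose simulated position deviates from the
    design target by more than tol_nm."""
    return [i for i, (d, s) in enumerate(zip(design_edges, simulated_edges))
            if abs(s - d) > tol_nm]

# Design edge positions (nm) and positions predicted by litho simulation.
design = [0.0, 180.0, 360.0, 540.0]
simulated = [2.0, 176.0, 371.0, 542.5]

violations = find_epe_violations(design, simulated, tol_nm=10.0)  # edge 2 deviates by 11 nm
```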

  3. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps to a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and these to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  4. Position paper: Seismic design criteria

    SciTech Connect

    Farnworth, S.K.

    1995-05-22

    The purpose of this paper is to document the seismic design criteria to be used in the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies behind the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  5. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically, as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis and the "correctness" models.

  6. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to an existing programming language, HAL/S, is discussed. HAL/S is a general-purpose high-level language designed to accommodate the software needs of the NASA Space Shuttle project; its diversity of features for scientific computing, concurrent and real-time programming, and error handling is discussed. The criteria by which features were evaluated for inclusion in the verifiable subset are described. Individual features of HAL/S are examined against these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented, along with recommendations for the use of HAL/S in the area of program verification.

  7. Displacement Based Seismic Design Criteria

    SciTech Connect

    Costello, J.F.; Hofmayer, C.; Park, Y.J.

    1999-03-29

    The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to seismic design of the specialized structures, systems and components of nuclear power plants and of such significance to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on the recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and on-going activities for future codes. It also discusses some technical issues for further consideration.

  8. DISPLACEMENT BASED SEISMIC DESIGN CRITERIA

    SciTech Connect

    HOFMAYER,C.H.

    1999-03-29

    The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to seismic design of the specialized structures, systems and components of nuclear power plants and of such significance to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on the recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and on-going activities for future codes. It also discusses some technical issues for further consideration.

  9. Establishing seismic design criteria to achieve an acceptable seismic margin

    SciTech Connect

    Kennedy, R.P.

    1997-01-01

    In order to develop risk-based seismic design criteria, the following four issues must be addressed: (1) What target annual probability of seismically induced unacceptable performance is acceptable? (2) What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the Safe Shutdown Earthquake ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper; each of the other three is. Issues 2 and 3 are integrally tied together, so a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions that reasonably achieve the target goal. The remainder of the paper discusses an approach for developing seismic design criteria aimed at achieving the desired seismic margin defined in the resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented.
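    The target quantity in Issue 1 can be sketched numerically: the annual probability of unacceptable performance is the convolution of the site hazard curve with the component fragility. The hazard and fragility numbers below are invented for illustration only.

```python
import math

def lognormal_cdf(x, median, beta):
    """P(capacity <= x) for a lognormal fragility with the given median
    capacity and logarithmic standard deviation beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_failure_prob(pga, exceed_freq, median, beta):
    """Riemann-sum convolution of the hazard curve with the fragility."""
    p = 0.0
    for i in range(len(pga) - 1):
        dH = exceed_freq[i] - exceed_freq[i + 1]   # annual freq. of a in [a_i, a_{i+1})
        a_mid = 0.5 * (pga[i] + pga[i + 1])
        p += dH * lognormal_cdf(a_mid, median, beta)
    return p

# Illustrative hazard curve: annual frequency of exceeding each PGA (g).
pga = [0.1, 0.2, 0.4, 0.8, 1.6]
freq = [1e-2, 2e-3, 4e-4, 8e-5, 1e-5]

pf = annual_failure_prob(pga, freq, median=1.0, beta=0.4)  # on the order of 1e-4
```

With such a calculation in hand, Issues 2 and 3 become trade-offs: a higher SSE exceedance frequency can be offset by a larger seismic margin (fragility median relative to the SSE) to hit the same target probability.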

  10. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    SciTech Connect

    Hofmayer, C.; Miller, C.; Wang, Y.; Costello, J.

    2003-07-15

    A research effort was undertaken to determine the need for any changes to USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement-based seismic design methods, such as given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement-based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model, an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results between the analyses, and to develop insights regarding the work that would be needed before the displacement-based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.

  11. Ground penetrating radar and active seismic investigation of stratigraphically verified pyroclastic deposits

    NASA Astrophysics Data System (ADS)

    Gase, A.; Bradford, J. H.; Brand, B. D.

    2015-12-01

    We conducted ground-penetrating radar (GPR) and active seismic surveys in July and August 2015, parallel to outcrops of the pyroclastic density current deposits of the May 18, 1980 eruption of Mount St. Helens (MSH), Washington. The primary objective of this study is to compare geophysical properties that influence electromagnetic and elastic wave velocities with stratigraphic parameters in the unsaturated zone. The deposits of interest are composed of pumice, volcanic ash, and lava blocks comprising a wide range of intrinsic porosities and grain sizes from sand to boulders. Single-offset GPR surveys for reflection data were performed with a Sensors and Software pulseEKKO Pro 100 GPR using 50 MHz, 100 MHz, and 200 MHz antennae. GPR data processing includes time-zero correction, a dewow filter, migration, and elevation correction. Multi-offset acquisitions with the 100 MHz antennae, at offsets ranging from 1 m to 16 m, are used for reflection tomography to create 2-D electromagnetic wave velocity models. Seismic surveys are performed with 72 geophones spaced at two meters, using a sledgehammer source with shot points at each receiver point. We couple P-wave refraction tomography with Rayleigh wave inversion to compute Vp/Vs ratios. The two geophysical datasets are then compared with stratigraphic information to illustrate the influence of lithological parameters (e.g. stratification, grain-size distribution, porosity, and sorting) on geophysical properties of unsaturated pyroclastic deposits. Future work will include joint petrophysical inversion of the multiple datasets to estimate porosity and water content in the unsaturated zone.
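    Once Vp (from refraction tomography) and Vs (from Rayleigh-wave inversion) are in hand, the Vp/Vs ratio and the elastic Poisson's ratio follow directly. The velocities below are invented illustrations, not values from the survey.

```python
# Compute Vp/Vs and Poisson's ratio for an isotropic elastic medium.

def poissons_ratio(vp, vs):
    """nu = (Vp^2 - 2 Vs^2) / (2 (Vp^2 - Vs^2)), in terms of (Vp/Vs)^2."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

vp, vs = 800.0, 400.0   # m/s, hypothetical for unsaturated pyroclastic deposits
ratio = vp / vs         # 2.0
nu = poissons_ratio(vp, vs)  # 1/3
```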

  12. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

    This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which have been shown by experience to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  13. Simplified seismic performance assessment and implications for seismic design

    NASA Astrophysics Data System (ADS)

    Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

    2014-08-01

    The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios including different structural configurations.
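    The MDOF-to-equivalent-SDOF relations that such displacement-based methods rest on can be sketched briefly. The story masses and linear displacement profile below are invented for illustration; the paper itself works with calibrated benchmark models.

```python
# Equivalent-SDOF properties from a story mass vector and displacement profile.

def equivalent_sdof(masses, phi):
    """Return (participation factor Gamma, effective mass) for profile phi:
    Gamma = phi^T M 1 / phi^T M phi, m_eff = (phi^T M 1)^2 / phi^T M phi."""
    num = sum(m * p for m, p in zip(masses, phi))      # phi^T M 1
    den = sum(m * p * p for m, p in zip(masses, phi))  # phi^T M phi
    gamma = num / den
    return gamma, gamma * num

# Four-story example: equal story masses, linear first-mode-like profile.
masses = [100.0] * 4             # tonnes
phi = [0.25, 0.5, 0.75, 1.0]     # normalized to roof displacement

gamma, m_eff = equivalent_sdof(masses, phi)  # Gamma = 4/3, m_eff = 1000/3 t
```

The substitute-structure displacement demand is then the roof displacement divided by Gamma, which is how MDOF drift demands map to the SDOF characteristics the abstract mentions.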

  14. Seismic design guidelines for highway bridges

    NASA Astrophysics Data System (ADS)

    Mayes, R. L.; Sharpe, R. L.

    1981-10-01

    Guidelines for the seismic design of highway bridges are given. The guidelines are the recommendations of a team of nationally recognized experts which included consulting engineers, academicians, and State highway and Federal agency representatives from throughout the United States. The guidelines are comprehensive in nature and embody several new concepts which are significant departures from existing design provisions. An extensive commentary documenting the basis for the guidelines and an example demonstrating their use are included. A draft of the guidelines was used to seismically redesign twenty-one bridges; a summary of the redesigns is included.

  15. Optimal cost basis for seismic design

    SciTech Connect

    Hadjian, A.H.

    1993-09-01

    The paper summarizes a methodology for establishing seismic design levels based on a cost-benefit assessment. The methodology requires the development of costs and benefits for varying design levels of vibratory ground motion and surface fault displacements. The optimum design level for a given structure is that which gives the minimum total direct and earthquake consequence costs. The example used is the surface facilities of the proposed high-level waste repository at Yucca Mountain, Nevada.
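    The cost-benefit idea can be reduced to a few lines: construction cost rises with the design ground-motion level while expected earthquake consequence cost falls, and the optimum design level minimizes their sum. All numbers below are invented for illustration, not values from the Yucca Mountain study.

```python
# Pick the design PGA that minimizes total cost = construction + consequences.

design_pga = [0.2, 0.4, 0.6, 0.8, 1.0]          # candidate design levels (g)
construction = [10.0, 12.0, 15.0, 19.0, 24.0]   # $M, grows with design level
consequence = [40.0, 18.0, 8.0, 5.0, 4.0]       # $M expected losses, shrinks

total = [c + q for c, q in zip(construction, consequence)]
optimum = design_pga[total.index(min(total))]   # 0.6 g in this toy example
```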

  16. The Relationship Between Verified Organ Donor Designation and Patient Demographic and Medical Characteristics.

    PubMed

    Sehgal, N K R; Scallan, C; Sullivan, C; Cedeño, M; Pencak, J; Kirkland, J; Scott, K; Thornton, J D

    2016-04-01

    Previous studies on the correlates of organ donation consent have focused on self-reported willingness to donate and on self-reported medical suitability to donate. However, these may be subject to social desirability bias and inaccurate assessments of medical suitability. The authors sought to overcome these limitations by directly verifying donor designation on driver's licenses and by abstracting comorbid conditions from electronic health records. Using a cross-sectional study design, they reviewed the health records of 2070 randomly selected primary care patients at a large urban safety-net medical system to obtain demographic and medical characteristics. They also examined driver's licenses that were scanned into electronic health records as part of the patient registration process for donor designation. Overall, 943 (46%) patients were designated as a donor on their driver's license. On multivariate analysis, donor designation was positively associated with age 35-54 years, female sex, nonblack race, speaking English or Spanish, being employed, having private insurance, having an income >$45 000, and having fewer comorbid conditions. These demographic and medical characteristics resulted in patient subgroups with donor designation rates ranging from 21% to 75%. In conclusion, patient characteristics are strongly related to verified donor designation. Further work should tailor organ donation efforts to specific subgroups. PMID:26603147

  17. Feasibility study and verified design concept for new improved hot gas facility

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The MSFC Hot Gas Facility (HGF) was fabricated in 1975 as a temporary facility to provide immediate turnaround testing to support SRB and ET TPS development. This facility proved to be very useful and was used to make more than 1300 runs, far more than ever intended in the original design. It was therefore in need of constant repair and needed to be replaced with a new, improved design to support the continuing SRB/ET TPS product improvement and/or removal efforts. MSFC contracted with Lockheed-Huntsville to work on this improved design through contract NAS8-36304, "Feasibility Study and Verified Design Concept for the New Improved Hot Gas Facility." The results of Lockheed-Huntsville's efforts under this contract are summarized.

  18. Tritium glovebox stripper system seismic design evaluation

    SciTech Connect

    Grinnell, J. J.; Klein, J. E.

    2015-09-01

    The use of glovebox confinement at US Department of Energy (DOE) tritium facilities has been discussed in numerous publications. Glovebox confinement protects workers from radioactive material (especially tritium oxide), provides an inert atmosphere for prevention of flammable gas mixtures and deflagrations, and allows recovery of tritium released from the process into the glovebox when a glovebox stripper system (GBSS) is part of the design. Tritium recovery from the glovebox atmosphere reduces emissions from the facility and the radiological dose to the public; locating US DOE defense programs facilities away from public boundaries also aids in reducing those doses. This is a concept-level study to identify issues and considerations for the design of a seismic GBSS; the safety requirements and analysis presented should be considered preliminary. Safety requirements for the design of a GBSS should be developed and finalized as part of the final design process.

  19. Seismic-reflection technique used to verify shallow rebound fracture zones in the Pierre Shale of South Dakota ( USA).

    USGS Publications Warehouse

    Nichols, T.C., Jr.; King, K.W.; Collins, D.S.; Williams, R.A.

    1988-01-01

    Shallow seismic-reflection data are presented to demonstrate their usefulness for locating and showing the continuity and lateral extent of rebound fracture zones in the Pierre Shale. Rebound fracture zones, identified in boreholes near Hayes, South Dakota, have variable depth, thickness, and character, making the correlation of these zones between holes questionable; the subsequent determination of dip and continuity of the zones is thus somewhat tenuous, especially if the fracture characteristics change significantly between holes. Once rebound fracture zones have been identified and located by borehole geotechnical and geologic data, seismic profiles can reveal the extent and geometry of fractures in these zones, thus providing valuable preconstruction information without the cost of additional drilling.-Authors

  20. Guidelines for the seismic design of fire protection systems

    SciTech Connect

    Benda, B.; Cushing, R.; Driesen, G.E.

    1991-12-31

    The engineering knowledge gained from earthquake experience data surveys of fire protection system components is combined with analytical evaluation results to develop guidelines for the design of seismically rugged fire protection distribution piping. The seismic design guidelines of the National Fire Protection Association Standard NFPA-13 are reviewed, augmented, and summarized to define an efficient method for the seismic design of fire protection piping systems. 8 refs.

  1. Guidelines for the seismic design of fire protection systems

    SciTech Connect

    Benda, B.; Cushing, R.; Driesen, G.E.

    1991-01-01

    The engineering knowledge gained from earthquake experience data surveys of fire protection system components is combined with analytical evaluation results to develop guidelines for the design of seismically rugged fire protection distribution piping. The seismic design guidelines of the National Fire Protection Association Standard NFPA-13 are reviewed, augmented, and summarized to define an efficient method for the seismic design of fire protection piping systems. 8 refs.

  2. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    NASA Astrophysics Data System (ADS)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  3. Verifying single-station seismic approaches using Earth-based data: Preparation for data return from the InSight mission to Mars

    NASA Astrophysics Data System (ADS)

    Panning, Mark P.; Beucler, Éric; Drilleau, Mélanie; Mocquet, Antoine; Lognonné, Philippe; Banerdt, W. Bruce

    2015-03-01

    The planned InSight mission will deliver a single seismic station containing 3-component broadband and short-period sensors to the surface of Mars in 2016. While much of the progress in understanding the Earth and Moon's interior has relied on the use of seismic networks for accurate location of sources, single station approaches can be applied to data returned from Mars in order to locate events and determine interior structure. In preparation for the data return from InSight, we use a terrestrial dataset recorded at the Global Seismic Network station BFO, located at the Black Forest Observatory in Germany, to verify an approach for event location and structure determination based on recordings of multiple orbit surface waves, which will be more favorable to record on Mars than Earth due to smaller planetary radius and potentially lower background noise. With this approach applied to events near the threshold of observability on Earth, we are able to determine epicentral distance within approximately 1° (corresponding to ∼60 km on Mars), and origin time within ∼30 s. With back azimuth determined from Rayleigh wave polarization, absolute locations are determined generally within an aperture of 10°, allowing for localization within large tectonic regions on Mars. With these locations, we are able to recover Earth mantle structure within ±5% (the InSight mission requirements for martian mantle structure) using 1D travel time inversions of P and S travel times for datasets of only 7 events. The location algorithm also allows for the measurement of great-circle averaged group velocity dispersion, which we measure between 40 and 200 s to scale the expected reliable frequency range of the InSight data from Earth to Mars data. Using the terrestrial data, we are able to resolve structure down to ∼200 km, but synthetic tests demonstrate we should be able to resolve martian structure to ∼400 km with the same frequency content given the smaller planetary size.
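    The multiple-orbit surface-wave idea has a compact algebraic core: Rayleigh trains R1, R2, and R3 traveling at group velocity U on a planet of circumference C arrive at t1 = t0 + D/U, t2 = t0 + (C - D)/U, and t3 = t0 + (C + D)/U, so three arrival times determine U, the epicentral distance D, and the origin time t0. The sketch below uses synthetic arrivals, not InSight or BFO data, and an approximate Mars circumference.

```python
# Single-station location from R1/R2/R3 Rayleigh-wave arrival times.

def locate_from_r123(t1, t2, t3, circumference):
    """Solve t1 = t0 + D/U, t2 = t0 + (C - D)/U, t3 = t0 + (C + D)/U."""
    U = circumference / (t3 - t1)              # R3 trails R1 by one full orbit
    D = 0.5 * (circumference - U * (t2 - t1))  # from t2 - t1 = (C - 2D)/U
    t0 = t1 - D / U
    return U, D, t0

C_MARS = 21344.0   # km, roughly 2*pi*3397 km

# Forward-model synthetic arrivals for U = 4 km/s, D = 3000 km, t0 = 100 s.
t1 = 100.0 + 3000.0 / 4.0
t2 = 100.0 + (C_MARS - 3000.0) / 4.0
t3 = 100.0 + (C_MARS + 3000.0) / 4.0

U, D, t0 = locate_from_r123(t1, t2, t3, C_MARS)  # recovers 4.0, 3000.0, 100.0
```

Back azimuth from Rayleigh-wave polarization, as in the abstract, then turns the distance D into an absolute location.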

  4. Seismic upgrade design for an exhaust stack building

    SciTech Connect

    Maryak, M.E.; Malik, L.E.

    1991-01-01

    An exhaust stack building of a nuclear reactor facility with a complex structural configuration has been analyzed, evaluated, and retrofitted for seismic forces. The building was built in the 1950's and had not been designed to resist seismic forces. A rigorous analysis and evaluation program was implemented to minimize the costly retrofits required to upgrade the building to resist high seismic forces. Seismic evaluations were performed for the building in its as-is configuration and as modified for several upgrade schemes. Soil-structure interaction, basemat flexibility, and the influence of the nearby reactor building were considered in rigorous seismic analyses. These analyses and evaluations enabled limited upgrades to qualify the stack building for the seismic forces. Some of the major conclusions of this study are: a phased approach to seismic analysis, using simplified models to evaluate practicable upgrade schemes and then incorporating the most suitable scheme into a rigorous model to obtain design forces for upgrades, is an efficient and cost-effective approach for seismic qualification of nuclear facilities to higher seismic criteria; and finalizing the upgrade of a major nuclear facility is an iterative process, which continues throughout the construction of the upgrades.

  5. Seismic upgrade design for an exhaust stack building

    SciTech Connect

    Maryak, M.E.; Malik, L.E.

    1991-12-31

    An exhaust stack building of a nuclear reactor facility with a complex structural configuration has been analyzed, evaluated, and retrofitted for seismic forces. The building was built in the 1950's and had not been designed to resist seismic forces. A rigorous analysis and evaluation program was implemented to minimize the costly retrofits required to upgrade the building to resist high seismic forces. Seismic evaluations were performed for the building in its as-is configuration and as modified for several upgrade schemes. Soil-structure interaction, basemat flexibility, and the influence of the nearby reactor building were considered in rigorous seismic analyses. These analyses and evaluations enabled limited upgrades to qualify the stack building for the seismic forces. Some of the major conclusions of this study are: a phased approach to seismic analysis, using simplified models to evaluate practicable upgrade schemes and then incorporating the most suitable scheme into a rigorous model to obtain design forces for upgrades, is an efficient and cost-effective approach for seismic qualification of nuclear facilities to higher seismic criteria; and finalizing the upgrade of a major nuclear facility is an iterative process, which continues throughout the construction of the upgrades.

  6. Understanding seismic design criteria for Japanese Nuclear Power Plants

    SciTech Connect

    Park, Y.J.; Hofmayer, C.H.; Costello, J.F.

    1995-04-01

    This paper summarizes the results of recent survey studies on the seismic design practice for nuclear power plants in Japan. The seismic design codes and standards for both nuclear as well as non-nuclear structures have been reviewed and summarized. Some key documents for understanding Japanese seismic design criteria are also listed with brief descriptions. The paper highlights the design criteria to determine the seismic demand and component capacity in comparison with U.S. criteria, the background studies which have led to the current Japanese design criteria, and a survey of current research activities. More detailed technical descriptions are presented on the development of Japanese shear wall equations, design requirements for containment structures, and ductility requirements.

  7. Seismic Endoscopy: Design of New Instruments

    NASA Astrophysics Data System (ADS)

    Conil, F.; Nicollin, F.; Gibert, D.

    2003-04-01

    In order to produce 3-D images around shallow boreholes, under field conditions and within reasonable data-acquisition times, several instrumental developments have been carried out. The first is the design of a directional probe working in the 20-100 kHz frequency range; the idea is to create a tool composed of multiple elementary piezoelectric elements able to cover the whole space to be explored. Made of a special rigid polyurethane foam with excellent attenuation performance, the prototypes are covered by a flexible polyurethane resin. By multiplying the number of elementary receptors around the vertical axis and stacking the elementary sensors, a complete multi-azimuth, multi-offset design has been conceived. In addition, a test site has been built to provide a controlled medium at the scales of interest for seismic endoscopy, dedicated to experiments near field conditions. Various reflectors are placed at well-known positions and embedded in a homogeneous cement medium; the whole edifice (2.2 m in diameter and 8 m deep) also contains 4 PVC tubes to simulate boreholes. The second part of these instrumental developments concerns the synthesis of input signals. Many modern devices used in ultrasonic experiments have a nonlinear output response outside their nominal range; this is especially true in geophysical acoustic experiments, where high acoustic power is necessary to insonify deep geological targets. Thanks to the high-speed electronic and computerized devices now available, it is possible to plug experimental set-ups into nonlinear inversion algorithms such as simulated annealing. First experiments showed the robustness of the method in the case of a nonlinear analog architecture. Large wavelet families have, for example, been constructed with the method, and a multiscale Non-Destructive Testing method has been performed as an efficient method to detect and characterise

  8. On verifying a high-level design. [cost and error analysis]

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages capable of expressing design specifications have been developed, but some time will be required before they have the expressive power needed for real applications. Simulation-based approaches are more useful for finding errors in designs than for proving the correctness of a design. Hybrid approaches that combine simulation with formal design verification techniques are argued to be the most promising over the short term.

  9. Optimum design for pipe-support allocation against seismic loading

    SciTech Connect

    Hara, Fumio; Iwasaki, Akira

    1996-12-01

    This paper deals with an optimum design methodology for a piping system subjected to seismic design loading, reducing its dynamic response by selecting the locations of pipe supports and thereby reducing the number of supports required. The authors employ a Genetic Algorithm to obtain a reasonably optimum solution for the support locations, support capacities and number of supports. The design condition specified by these quantities is encoded as an integer string for each support-allocation candidate, and many strings are prepared to express various pipe-support allocation states. For each string, the authors evaluate the seismic response of the piping system to the design seismic excitation and apply the Genetic Algorithm to select the next generation of support-allocation candidates, improving a seismic design performance measure specified by a weighted linear combination of seismic response magnitude, support capacity and the number of supports needed. Continuing this selection process yields a reasonably optimum solution to the seismic design problem. The authors examine the feasibility of this optimum design method for 5, 7 and 10 degree-of-freedom piping system models and find that it offers a theoretically feasible solution. The designer is thus freed from the severe uncertainty in damping values, provided the pipe supports guarantee their design damping capacity. Finally, the authors discuss the usefulness of the Genetic Algorithm for the seismic design of piping systems and some sensitive points in applying it to actual design problems.
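
    The encoding and selection loop described in the abstract can be sketched as a small genetic algorithm. Everything below is illustrative: the number of candidate locations, the capacity levels, the weights, and the stand-in `response_proxy` (which replaces the real piping seismic response analysis) are assumptions, not values from the paper.

```python
import random

N_LOCS = 10           # candidate support locations (hypothetical)
CAP_LEVELS = 4        # 0 = no support, 1..3 = increasing capacity
W_RESP, W_CAP, W_NUM = 1.0, 0.1, 0.5   # assumed combination weights

def response_proxy(genome):
    # Stand-in for the seismic response analysis: response magnitude
    # falls as total support capacity rises.  A real design loop would
    # run a time-history analysis of the piping model here.
    return 10.0 / (1.0 + sum(genome))

def fitness(genome):
    # Weighted linear combination of response magnitude, total support
    # capacity, and number of supports (lower is better).
    n_supports = sum(1 for g in genome if g > 0)
    return (W_RESP * response_proxy(genome)
            + W_CAP * sum(genome)
            + W_NUM * n_supports)

def evolve(pop_size=40, generations=60, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(CAP_LEVELS) for _ in range(N_LOCS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]           # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_LOCS)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_LOCS):             # per-gene mutation
                if rng.random() < p_mut:
                    child[i] = rng.randrange(CAP_LEVELS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

    The integer string plays the role of the paper's encoded support-allocation state; swapping `response_proxy` for a real structural analysis turns the sketch into the described workflow.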

  10. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  11. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist it in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. That document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) ground motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinion and differing interpretations into the seismic hazard characterization. In response, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretation among earth science experts, and for this reason probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches were investigated for characterizing the seismic events (magnitude and distance) that best represent the ground motion level determined with the probabilistic hazard analysis. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  12. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.
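
    The role the b-value plays above can be illustrated by sampling event magnitudes from a doubly truncated Gutenberg-Richter distribution: a higher b-value yields systematically smaller magnitudes. The magnitude bounds and the inverse-CDF formulation below are generic textbook choices, not values from the study.

```python
import math
import random

def sample_magnitudes(n, b, m_min=0.0, m_max=4.0, seed=0):
    # Inverse-CDF sampling from a doubly truncated Gutenberg-Richter
    # distribution with
    #   F(m) = (1 - 10**(-b*(m - m_min))) / (1 - 10**(-b*(m_max - m_min)))
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    out = []
    for _ in range(n):
        u = rng.random()
        out.append(m_min - math.log(1.0 - u * norm) / beta)
    return out
```

    Comparing samples drawn with b = 0.8 and b = 1.5 shows the high-b population concentrated at small magnitudes, which is why it trades lower hazard against less permeability enhancement per event.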

  13. A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1992-01-01

    Schneider demonstrates that many fault-tolerant clock synchronization algorithms can be represented as refinements of a single proven-correct paradigm. Shankar provides a mechanical proof that Schneider's schema achieves Byzantine fault-tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault-tolerant midpoint convergence function satisfies three of the constraints. A hardware design implementing the fault-tolerant midpoint function is presented and shown to satisfy the remaining constraints. The synchronization circuit recovers completely from transient faults provided the maximum fault assumption is not violated, and its initialization protocol also provides a recovery mechanism from total system failure caused by correlated transient faults.

  14. Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings

    NASA Astrophysics Data System (ADS)

    Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

    2012-09-01

    Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage is still reported in newly designed RC buildings. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB, and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analyses using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three frames designed according to the 2010 CCSDB meet the seismic requirements and correspond to nearly the same safety level.
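
    A probabilistic seismic demand model of the kind described is commonly fit as a power law between an intensity measure (IM) and an engineering demand parameter (EDP), with lognormal dispersion; a fragility curve then follows from the normal CDF. The sketch below uses that standard cloud-analysis form with synthetic data; the study's actual frame models, ground motions, and damage-state capacities are not reproduced.

```python
import math

def fit_psdm(im, edp):
    # Cloud-analysis regression: ln(EDP) = ln(a) + b*ln(IM) + eps,
    # with dispersion beta estimated from the residuals.
    x = [math.log(v) for v in im]
    y = [math.log(v) for v in edp]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    ln_a = ybar - b * xbar
    resid = [yi - (ln_a + b * xi) for xi, yi in zip(x, y)]
    beta = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return ln_a, b, beta

def fragility(im, ln_a, b, beta, capacity):
    # P(demand >= capacity | IM) under the lognormal PSDM.
    z = (ln_a + b * math.log(im) - math.log(capacity)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Synthetic cloud data: EDP ~ 0.5 * IM**1.2 with mild scatter (illustrative).
im = [0.1, 0.2, 0.3, 0.5, 0.8, 1.0]
scatter = [1.10, 0.92, 1.05, 0.95, 1.08, 0.97]
edp = [0.5 * g ** 1.2 * f for g, f in zip(im, scatter)]
ln_a, b, beta = fit_psdm(im, edp)
```

    Evaluating `fragility` over a range of IM values, once per damage-state capacity, yields the family of fragility curves the abstract refers to.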

  15. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety at the pipeline engineering site. Unlike a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan Ms 8.0 earthquake is introduced as an example.
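
    Empirical maximum-fault-displacement relations of this kind are typically log-linear in magnitude. The paper's own Ms-based coefficients are not given in the abstract, so the sketch below uses the Wells and Coppersmith (1994) all-slip-type coefficients for moment magnitude as a stand-in; treat both the form and the numbers as assumptions for illustration.

```python
def max_fault_displacement(m, a=-5.46, b=0.82):
    # log10(D_max [m]) = a + b * M
    # Default a, b are the Wells & Coppersmith (1994) all-slip-type
    # values for moment magnitude, used here only as a stand-in for
    # the paper's Ms-based regression.
    return 10.0 ** (a + b * m)
```

    For a magnitude-8.0 event this stand-in relation predicts a maximum displacement on the order of ten metres, the scale relevant to the Wenchuan example.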

  16. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of natural phenomena hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the facility. DOE has developed Standards for site characterization and hazard assessment to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop these results, important application issues are summarized, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  17. Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy

    NASA Astrophysics Data System (ADS)

    Ramanathan, Karthik Narayan

    Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial in pre-earthquake planning and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes, such as traffic disruption of the overall highway system, impact on the region's economy and post-earthquake response and recovery, and more recently serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (events that effected changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models were developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loading. Fragility curves were then developed considering the vulnerability of multiple components, and thereby help to quantify the performance of highway bridge networks and to study the impact of seismic design principles on performance within a bridge class. This not only leads to the development of fragility relations

  18. Review of seismicity and ground motion studies related to development of seismic design at SRS

    SciTech Connect

    Stephenson, D.E.; Acree, J.R.

    1992-08-01

    The NRC response spectra developed in Reg. Guide 1.60 are being used in studies related to restarting the existing Savannah River Site (SRS) reactors. Because this shape envelops all the other site-specific spectra developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this level or with these probability levels. It is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located across the Savannah River from the SRS. This report provides a summary of the database used to develop the design basis earthquake, including the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided of the studies performed and the methodologies used to establish the design basis earthquake for SRS, and the ground motion response spectra developed from the various studies are summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. Programs for resolving outstanding issues are discussed and conclusions are presented.

  19. Design soil profiles for seismic analyses of AP600 plant standard design

    SciTech Connect

    Ostadan, F.; Gross, K.K.; Liu, C.I.T.; Orr, R.S.

    1996-12-01

    Future nuclear power plants are based on standard designs. For seismic qualification, a variety of subsurface conditions are considered in the design. While the soil properties play a significant role in the seismic responses, an unlimited number of site conditions can be postulated for analyses. In this paper, a systematic and effective approach is described to arrive at the design soil profiles for the AP600 nuclear plant.

  20. Sensor placement for the analysis of seismic surface waves: sources of error, design criterion and array design algorithms

    NASA Astrophysics Data System (ADS)

    Maranò, Stefano; Fäh, Donat; Lu, Yue M.

    2014-06-01

    Seismic surface waves can be measured by deploying an array of seismometers on the surface of the earth. The goal of such measurement surveys is, usually, to estimate the velocity of propagation and the direction of arrival of the seismic waves. In this paper, we address the issue of sensor placement for the analysis of seismic surface waves from ambient vibration wavefields. First, we explain in detail how the array geometry affects the mean-squared estimation error of parameters of interest, such as the velocity and direction of propagation, both at low and high signal-to-noise ratios (SNRs). Secondly, we propose a cost function suitable for the design of the array geometry with particular focus on the estimation of the wavenumber of both Love and Rayleigh waves. Thirdly, we present and compare several computational approaches to minimize the proposed cost function. Numerical experiments verify the effectiveness of our cost function and resulting array geometry designs, leading to greatly improved estimation performance in comparison to arbitrary array geometries, both at low and high SNR levels.
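
    One way to realize the cost-function minimization described above is to score a candidate geometry by the largest sidelobe of its array response over a wavenumber grid, then search for geometries that minimize that score. The cost function, grid ranges, and random-search optimizer below are illustrative stand-ins for the paper's actual criterion and algorithms.

```python
import cmath
import math
import random

def array_response(pos, kx, ky):
    # Normalized magnitude of the array transfer function at
    # wavenumber (kx, ky); equals 1 at k = 0 for any geometry.
    s = sum(cmath.exp(-1j * (kx * x + ky * y)) for x, y in pos)
    return abs(s) / len(pos)

def sidelobe_cost(pos, k_min=0.5, k_max=3.0, n_grid=24, n_az=12):
    # Largest response away from the main lobe over a polar wavenumber
    # grid: a proxy for the ambiguity that degrades wavenumber estimates.
    worst = 0.0
    for i in range(n_grid):
        k = k_min + (k_max - k_min) * i / (n_grid - 1)
        for j in range(n_az):
            th = 2.0 * math.pi * j / n_az
            worst = max(worst,
                        array_response(pos, k * math.cos(th), k * math.sin(th)))
    return worst

def random_search(n_sensors=7, radius=5.0, iters=300, seed=0):
    # Crude optimizer: keep the best of many random geometries.  The
    # paper compares more sophisticated approaches; this is a baseline.
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        pos = [(rng.uniform(-radius, radius), rng.uniform(-radius, radius))
               for _ in range(n_sensors)]
        c = sidelobe_cost(pos)
        if c < best_cost:
            best, best_cost = pos, c
    return best, best_cost

best, best_cost = random_search(iters=150)
```

    A degenerate geometry (all sensors at one point) has cost 1.0 everywhere; any well-spread geometry does substantially better, which is the sense in which designed arrays outperform arbitrary ones.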

  1. A New Design of Seismic Stations Deployed in South Tyrol

    NASA Astrophysics Data System (ADS)

    Melichar, P.; Horn, N.

    2007-05-01

    When designing the seismic network in South Tyrol, the seismic service of Austria and the Civil Defense of South Tyrol combined more than 10 years of experience in running seismic networks and private communication systems. In recent years, a data return rate of >99% and a network uptime of >99% have been achieved through the combination of high-quality station design and equipment and the use of the Antelope data acquisition and processing software, which comes with a suite of network monitoring and alerting tools (including Nagios). The new data center is located in the city of Bolzano and is connected to the other data centers in Austria, Switzerland, and Italy for data backup purposes; each data center also uses a redundant communication system in case the primary system fails. When designing the South Tyrol network, improvements were made in seismometer installation, grounding, lightning protection and data communications in order to improve the quality of the recorded data as well as network uptime and data return. The 12 new stations are equipped with a 6-channel Q330+PB14f connected to an STS2 plus EpiSensor. A key improvement was the grounding concept for the whole seismic station: aluminum boxes housing the seismometer and data logger provide Faraday-cage isolation, lightning protection devices protect the equipment inside, and special shielding was introduced for the seismometer cables. The broadband seismometer and strong-motion sensor are placed on a thick glass plate and thereby isolated from the ground. Precise seismometer orientation is set by a groove on the glass plate, and in case of a strong earthquake the seismometer is tied to the base plate. Temperature stability is achieved with styrofoam sheets inside the seismometer's aluminum protection box.

  2. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  3. RCC for seismic design. [Roller-Compacted Concrete

    SciTech Connect

    Wong, N.C.; Forrest, M.P.; Lo, S.H. )

    1994-09-01

    This article describes how the use of roller-compacted concrete is saving $10 million on the seismic retrofit of Southern California's historic multiple-arch Littlerock Dam. Throughout its 70-year existence, the Littlerock Dam in Southern California's Angeles National Forest has raised one question: situated near the San Andreas Fault, could this 28-arch dam withstand any major movement from that fault line, much less ``the big one''? Working with the state's Division of Safety of Dams, Woodward-Clyde Consultants, Oakland, Calif., performed stability and stress analyses to find the answer. The evaluation showed that, as feared, the dam failed to meet required seismic safety criteria, principally due to its lack of lateral stability, a deficiency inherent in multiple-arch dams. To provide adequate seismic stability, the authors developed a rehabilitation design centered on the use of roller-compacted concrete (RCC) to construct a gravity section between and around the downstream portions of the existing buttresses. The authors also proposed that the arches be resurfaced and stiffened with steel-fiber-reinforced silica-fume concrete. The alternative design would have required filling the arch bays between the buttresses with mass concrete at a cost of $22.5 million. The RCC buttress repair construction, scheduled for completion this fall, will cost about $13 million.

  4. A New Event Detector Designed for the Seismic Research Observatories

    USGS Publications Warehouse

    Murdock, James N.; Hutt, Charles R.

    1983-01-01

    A new short-period event detector has been implemented on the Seismic Research Observatories. For each signal detected, a printed output gives estimates of the time of onset of the signal, direction of the first break, quality of onset, period and maximum amplitude of the signal, and an estimate of the variability of the background noise. On the SRO system, the new algorithm runs ~2.5x faster than the former (power level) detector. This increase in speed is due to the design of the algorithm: all operations can be performed by simple shifts, additions, and comparisons (floating point operations are not required). Even though a narrow-band recursive filter is not used, the algorithm appears to detect events competitively with those algorithms that employ such filters. Tests at Albuquerque Seismological Laboratory on data supplied by Blandford suggest performance commensurate with the on-line detector of the Seismic Data Analysis Center, Alexandria, Virginia.
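
    The "shifts, additions, and comparisons" style of detector can be illustrated with an integer STA/LTA trigger in which both running averages are maintained with bit shifts. This is a generic sketch, not the actual SRO (Murdock-Hutt) algorithm; the shift counts, trigger ratio, and warmup length are arbitrary choices.

```python
def detect(samples, sta_shift=2, lta_shift=6, ratio_shift=1, warmup=100):
    # Recursive integer averages kept as scaled accumulators so that
    # avg = acc >> shift converges to the true mean of |x|:
    #   acc += |x| - (acc >> shift)
    # Trigger when STA > (LTA << ratio_shift), i.e. STA > 2*LTA here.
    # Only shifts, additions, and comparisons are used -- no floats.
    sta_acc = lta_acc = 0
    triggers = []
    for i, x in enumerate(samples):
        a = abs(int(x))
        sta_acc += a - (sta_acc >> sta_shift)
        lta_acc += a - (lta_acc >> lta_shift)
        if i > warmup and (sta_acc >> sta_shift) > ((lta_acc >> lta_shift) << ratio_shift):
            triggers.append(i)
    return triggers
```

    On a synthetic record of steady background noise followed by a strong burst, the short-term average overtakes twice the long-term average within a sample or two of the onset.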

  5. Reduce seismic design conservatism through large-scale earthquake experiments

    SciTech Connect

    Tang, H.T.; Stepp, J.C. )

    1992-01-01

    For structures founded on soil deposits, the interaction between the soil and the structure caused by incident seismic waves modifies the foundation input motion and the dynamic characteristics of the soil-structure system; as a result, soil-structure interaction (SSI) plays a critical role in the design of nuclear plant structures. Recognizing that experimental validation and quantification are required, two scaled cylindrical reinforced-concrete containment models (1/4-scale and 1/12-scale of typical full-scale reactor containments) were constructed in Lotung, an active seismic region in Taiwan. Forced vibration tests (FVT) were conducted to characterize the dynamic behavior of the soil-structure system. Based on these data, a series of round-robin blind prediction and post-test correlation analyses was performed using various currently available SSI methods.

  6. Seismic isolation systems designed with distinct multiple frequencies

    SciTech Connect

    Wu, Ting-shu; Seidensticker, R.W.

    1991-01-01

    Two systems for seismic base isolation are presented. The main feature of these systems is that, instead of the single isolation frequency of conventional isolation systems, they are designed with two distinct isolation frequencies. When the response during an earthquake exceeds the design value(s), the system automatically and passively shifts to the second isolation frequency. Responses of these two systems to different ground motions, including a harmonic motion with frequency equal to the primary isolation frequency, show that no excessive amplification occurs. Adoption of these new systems will greatly enhance the safety and reliability of an isolated superstructure against future strong earthquakes. 3 refs.

  8. An Alternative Approach to “Identification of Unknowns”: Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria†

    PubMed Central

    Martinez-Vaz, Betsy M.; Denny, Roxanne; Young, Nevin D.; Sadowsky, Michael J.

    2015-01-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students’ knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students’ verification protocols and presentations. Posttest scores were higher than pretest scores at or below p = 0.001. Normalized learning gains (G) showed an improvement of students’ knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001. PMID:26753033

  9. Seismic design of a uranium conversion plant building

    SciTech Connect

    Peixoto, O.J.M.; Botelho, C.L.A. ); Braganca, A. Jr.; C. Santos, S.H. de

    1992-01-01

    The design of facilities with small radioactive inventory has traditionally been performed following the usual criteria for industrial buildings. In the last few years, more stringent criteria have been adopted in new nuclear facilities in order to achieve higher standards of environmental protection. In uranium conversion plants, the UF6 (uranium hexafluoride) production step is the part of the process with the highest potential for radioactivity release to the environment, because of the operations performed in the UF6 desublimers and cylinder-filling areas, as well as the UF6 distillation facilities when they are also required in the process. This paper presents the design guidelines and some details of the seismic resistance design of a UF6 production building to be constructed in Brazil.

  10. Study of seismic design bases and site conditions for nuclear power plants

    SciTech Connect

    Not Available

    1980-04-01

    This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

  11. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

    The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions, and it is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, from a high-level executive summary to detailed maps, data, and graphs. Results are formatted so that they can be included directly in a final engineering report. In addition to single-site analysis, the application supports a batch mode for the simultaneous consideration of multiple locations. Finally, an application programming interface (API) allows other developers to integrate this application's results into larger applications for additional processing. Development has proceeded in an iterative manner, working with engineers through email, meetings, and workshops; each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process, and it is now used to produce over 1800 reports daily. Recent efforts have made it a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub, facilitating improved incorporation of user feedback and new features to ensure the application's continued success.
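
    Programmatic use of the API typically starts with building a query URL. The endpoint path and parameter names below follow the USGS design-maps web service as commonly documented, but treat them as assumptions and verify against the current API documentation before relying on them.

```python
from urllib.parse import urlencode

# Assumed base endpoint of the USGS design-maps web service.
BASE = "https://earthquake.usgs.gov/ws/designmaps"

def design_maps_url(edition, latitude, longitude, risk_category, site_class, title):
    # Builds a GET request URL for one site and one code edition.
    # Parameter names mirror the service's query string as assumed here.
    params = urlencode({
        "latitude": latitude,
        "longitude": longitude,
        "riskCategory": risk_category,
        "siteClass": site_class,
        "title": title,
    })
    return f"{BASE}/{edition}.json?{params}"

url = design_maps_url("asce7-16", 34.05, -118.25, "II", "C", "Example")
```

    Fetching the resulting URL (e.g. with `urllib.request`) would return the JSON design parameters that the web application renders in its reports.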

  12. Design Of Bridges For Non Synchronous Seismic Motion

    SciTech Connect

    Nuti, Camillo; Vanzi, Ivo

    2008-07-08

    This paper aims to develop and validate structural design criteria which account for the effects of earthquake spatial variability. In past works [1, 2], the two simplest forms of this problem were dealt with: differential displacements between two points belonging to the soil, or between two single-degree-of-freedom structures. Seismic action was defined according to EC8 [3], and the structures were assumed to be linear elastic SDOF oscillators. Although this problem may seem trivial, existing code models appeared improvable in this respect. For the differential displacements of two points on the ground, these results are now validated and generalized using the newly developed response spectra contained in the new Italian seismic code [4], and the resulting code formulation is presented. Next, the problem of statistically defining the differential displacement among any number of points on the ground (which is needed for continuous-deck bridges) is approached, and some preliminary results are shown. It is also shown that current code rules (e.g. EC8) may be improved in this respect.

  13. Design and application of an electromagnetic vibrator seismic source

    USGS Publications Warehouse

    Haines, S.S.

    2006-01-01

    Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. To address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to approximately 10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well suited to teaching applications, and to surveys requiring a precisely repeatable source signature.
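The frequency-sweep signal described above can be illustrated with a standard linear chirp; the sample rate, sweep duration, and end tapers below are illustrative choices, not the EMvibe's published drive parameters.

```python
import numpy as np
from scipy.signal import chirp

# Sketch of a 5-650 Hz linear sweep like the EMvibe's; fs, duration, and
# taper length are illustrative assumptions.
fs = 4000                       # samples per second (must exceed 2 * 650 Hz)
T = 8.0                         # sweep duration in seconds
n_samples = int(T * fs)
t = np.arange(n_samples) / fs
sweep = chirp(t, f0=5.0, f1=650.0, t1=T, method="linear")

# Cosine tapers at both ends reduce spectral ringing from an abrupt start/stop.
n = int(0.1 * fs)               # 0.1 s taper
ramp = 0.5 * (1 - np.cos(np.pi * np.arange(n) / n))
taper = np.ones(n_samples)
taper[:n], taper[-n:] = ramp, ramp[::-1]
sweep *= taper
```

Cross-correlating the recorded traces with this pilot sweep compresses the long signal back to a short wavelet, as in conventional vibroseis processing.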

  14. Research on seismic survey design for doubly complex areas

    NASA Astrophysics Data System (ADS)

    Zhao, Hu; Yin, Cheng; Wu, Ming-Sheng; Wu, Xiao-Hua; Pan, Shu-Lin

    2012-06-01

    The complex geological conditions in doubly complex areas tend to result in difficult surface survey operations and poor imaging of the target layers in the subsurface, which has a great impact on seismic data quality. In this paper, we propose an optimal crooked-line survey method for decreasing the surface survey operational difficulties and improving sub-layer event continuity. The method concentrates on the surface shooting conditions: first, proper shot positions are selected based on the specific surface topographic features to reduce shooting difficulties, and then the receiver positions are optimized to meet the prerequisite that the subsurface reflection points remain in a straight line. This method can not only lower the shooting difficulty in areas with rough surface conditions but also overcome the subsurface reflection-point bending problem that appears in the traditional crooked-line survey method. In addition, we use local infill shooting rather than conventional overall infill shooting to improve sub-layer event continuity and uniformity at lower survey operation cost. A model has been calculated and processed with the proposed optimal crooked-line survey and local infill shooting design workflow, and the results show that the new method works for seismic surveys in doubly complex areas.

  15. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    SciTech Connect

    Not Available

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered the overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. The effects of current regulatory requirements for piping design are evaluated, and recommendations for immediate licensing action, changes to existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  16. Estimation of Characteristic Period for Energy Based Seismic Design

    SciTech Connect

    Hancloglu, Baykal; Polat, Zekeriya; Kircil, Murat Serdar

    2008-07-08

    Estimating input energy with approximate methods has always been a considerable research topic in energy-based seismic design, and several approaches to estimating the energy input to SDOF systems have been proposed over the last decades. The characteristic period is the key parameter of most of these approaches; it is defined as the period at which the peak value of the input energy occurs. In this study, an equation is proposed for estimating the characteristic period, based on an extensive earthquake ground motion database comprising 268 far-field records (two horizontal components from each of 134 recording stations) on both soft and firm soil sites. Statistical regression analyses are performed to develop an equation in terms of a number of structural parameters, and the developed equation is found to yield satisfactory results when compared with the characteristic periods calculated from time-history analyses of SDOF systems.
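As a hedged illustration of the kind of regression the study performs, the sketch below fits a power-law relation between an SDOF period and a synthetic "characteristic period" by least squares in log space; the functional form, coefficients, and data are invented for illustration, not the paper's equation.

```python
import numpy as np

# Illustrative only: fit log(Tc) = a + b * log(T) on synthetic data standing in
# for the study's database; the model form and coefficients are assumptions.
rng = np.random.default_rng(0)
T = rng.uniform(0.1, 3.0, 268)                  # SDOF periods, one per record
true_a, true_b = 0.2, 0.9
Tc = np.exp(true_a + true_b * np.log(T) + 0.01 * rng.standard_normal(T.size))

# Ordinary least squares in log space: solve X @ beta = y
X = np.column_stack([np.ones(T.size), np.log(T)])
beta, *_ = np.linalg.lstsq(X, np.log(Tc), rcond=None)
a_hat, b_hat = beta
```

With real records one would regress against several structural parameters at once and report goodness-of-fit statistics alongside the fitted equation.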

  17. Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads

    SciTech Connect

    Klamerus, E.W.; Bohn, M.P.; Johnson, J.J.; Asfura, A.P.; Doyle, D.J.

    1994-02-01

    Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

  18. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    SciTech Connect

    Yoshida, Nozomu

    2008-07-08

    The engineering common sense that the incident wave is the same over a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, in which the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface as a whole (continuous analysis) differs from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed in two parts (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and that the higher-order predominant periods of the upper and lower ground do not match in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, i.e., the conventional engineering seismic base layer does not act as the term 'base' suggests. All these results indicate that waves which travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.
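The role of predominant periods in the argument above can be illustrated with the textbook closed-form results for a uniform soil layer over a rigid base: the quarter-wavelength fundamental frequency and the commonly used damped transfer-function approximation. The layer properties below are illustrative, not from the paper.

```python
import math

def fundamental_frequency(vs, h):
    """f0 = Vs / 4H for a uniform layer over a rigid base (quarter-wavelength rule)."""
    return vs / (4.0 * h)

def transfer_amplitude(f, vs, h, damping=0.05):
    """|H(f)| for a damped uniform layer (surface motion / rigid-base input),
    using the common approximation 1 / sqrt(cos^2(kH) + (damping * kH)^2)."""
    kh = 2.0 * math.pi * f * h / vs
    return 1.0 / math.sqrt(math.cos(kh) ** 2 + (damping * kh) ** 2)

# Illustrative layer: Vs = 200 m/s, H = 30 m -> f0 = 5/3 Hz
f0 = fundamental_frequency(200.0, 30.0)
amp_at_f0 = transfer_amplitude(f0, 200.0, 30.0)
```

Cutting the profile at an intermediate "base layer" changes kH for each sub-column, so the predominant periods of the separated models need not reproduce those of the full column, which is the abstract's point.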

  19. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit in the seismic design standards currently in place around the world is the premise that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set, which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for nonlinear response using a set of empirical correction factors. Since the early nineties, the seismic engineering community has come to recognise numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors to structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specified seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S., a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings. 
The PBEE framework

  20. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Youngs, R.R.; Costantino, C.J.; Miller, L.F.

    2008-07-01

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria.

  1. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference; thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that the relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during the review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  2. Low-Noise Potential of Advanced Fan Stage Stator Vane Designs Verified in NASA Lewis Wind Tunnel Test

    NASA Technical Reports Server (NTRS)

    Hughes, Christopher E.

    1999-01-01

    With the advent of new, more stringent noise regulations in the next century, aircraft engine manufacturers are investigating new technologies to make the current generation of aircraft engines as well as the next generation of advanced engines quieter without sacrificing operating performance. A current NASA initiative called the Advanced Subsonic Technology (AST) Program has set as a goal a 6-EPNdB (effective perceived noise) reduction in aircraft engine noise relative to 1992 technology levels by the year 2000. As part of this noise program, and in cooperation with the Allison Engine Company, an advanced, low-noise, high-bypass-ratio fan stage design and several advanced technology stator vane designs were recently tested in NASA Lewis Research Center's 9- by 15-Foot Low-Speed Wind Tunnel (an anechoic facility). The project was called the NASA/Allison Low Noise Fan.

  3. Optimization Criteria In Design Of Seismic Isolated Building

    SciTech Connect

    Clemente, Paolo; Buffarini, Giacomo

    2008-07-08

    The use of new anti-seismic techniques is certainly suitable for buildings of strategic importance and, in general, in cases of very high risk. For ordinary buildings, instead, the cost of a base isolation system should be balanced by an equivalent saving in the structure. Comparison criteria are first defined; then a large numerical investigation is carried out to analyze the effectiveness and the economic suitability of seismic isolation in concrete buildings.

  4. Design and development of digital seismic amplifier recorder

    SciTech Connect

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-16

    Digital seismic recording is a technique for recording seismic data in digital form; it is more convenient because it is more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors (geophones) with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units built around the IC 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of about 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. The qualitative analysis showed that the seismic signal conditioning can produce a large output, so the data obtained are better than conventional data. This application can be used for geophysical methods with low input voltages, such as microtremor surveys.
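The quoted gain of roughly 1,000 is consistent with the ideal non-inverting op-amp gain formula G = 1 + Rf/Rg; assigning the 1 MΩ resistor to the feedback path is an assumption here, since the abstract does not state which resistor plays which role.

```python
def noninverting_gain(r_feedback, r_ground):
    """Ideal op-amp non-inverting amplifier gain: G = 1 + Rf / Rg."""
    return 1.0 + r_feedback / r_ground

# Assuming Rf = 1 MOhm (feedback) and Rg = 1 kOhm (to ground), as a sketch:
gain = noninverting_gain(1e6, 1e3)  # 1001, i.e. roughly the quoted 1,000x
```

In practice the 741's gain-bandwidth product (about 1 MHz) limits a 1,000x stage to roughly 1 kHz of bandwidth, which is still ample for 4.5 Hz geophone signals.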

  5. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging; one of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.

  6. Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, Thomas M.; Rohay, Alan C.; Youngs, Robert R.; Costantino, Carl J.; Miller, Lewis F.

    2008-02-28

    In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy’s (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were re-evaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary’s approval of the final seismic criteria this past summer, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities.

  7. Design and implementation of telemetry seismic data acquisition system based on embedded P2P Ethernet

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Lin, J.; Chen, Z.

    2011-12-01

    A new design of a telemetry seismic data acquisition system is presented which uses embedded, point-to-point (P2P) Ethernet networks. We explain the idea and motivation behind the use of the P2P Ethernet topology and show the problems that arise when such a topology is used in a seismic acquisition system. The paper focuses on the network protocols we developed, which include route-table generation and dynamic IP address management. The design has been implemented on ARM and FPGA hardware and tested both in the laboratory and in seismic exploration.

  8. SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS

    SciTech Connect

    RYAN GW

    2008-04-25

    In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project, and a strong intent to avoid potentially costly re-work of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) and choosing the criteria that result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and to reduce the risk that project designs and analyses will require revision when draft DOE-STD-1189 is finalized.

  9. New seismic design and evaluation criteria for the Department of Energy

    SciTech Connect

    Kennedy, R.P.; Short, S.A.; Nelson, T.A.; Murray, R.C.

    1992-12-01

    Seismic design and evaluation criteria based on probabilistic performance goals have been developed for Department of Energy (DOE) facilities across the United States. These criteria utilize probabilistic seismic hazard curves for the specification of earthquake loading, combined with deterministic response evaluation methods and permissible behavior limits. Through the use of such a design/evaluation approach, it can be demonstrated that there is a high likelihood that the probabilistic performance goals will be achieved. These criteria have been described in previous technical papers. The purpose of this paper is to present proposed modifications to the DOE seismic design and evaluation criteria. These modifications account for the various slopes of seismic hazard curves, make corrections to earlier versions, and take advantage of an improved quantitative basis for the acceptance criteria.
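The link between a probabilistic hazard curve and a performance goal can be sketched with the standard risk integral, which convolves a fragility curve with the hazard density. The power-law hazard, lognormal fragility, and all numbers below are illustrative assumptions, not DOE criteria.

```python
import math

def hazard(a, k0=1e-4, k=2.0):
    """Illustrative power-law hazard curve: annual P[A > a] = k0 * a**(-k)."""
    return k0 * a ** (-k)

def fragility(a, median=1.0, beta=0.4):
    """Lognormal fragility: P[failure | A = a]."""
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def annual_failure_probability(a_lo=0.05, a_hi=5.0, n=20000):
    """Numerically convolve the fragility with the hazard density |dH/da|."""
    da = (a_hi - a_lo) / n
    total = 0.0
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        dens = (hazard(a - 0.5 * da) - hazard(a + 0.5 * da)) / da  # -dH/da > 0
        total += fragility(a) * dens * da
    return total

pf = annual_failure_probability()
```

For this hazard slope and fragility the untruncated closed form is H(median) * exp((k * beta)**2 / 2) ≈ 1.38e-4 per year; the numerical value is slightly lower because the integration range is truncated. A performance goal is met when such an annual probability falls below the target.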

  10. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. From the displacement and its derivatives, the energy components are obtained. The energy distribution in each story provides useful information for structural upgrades with the help of added devices, the objective being minimization of the structural displacement response. The application of this structural seismic response research is presented in a base-isolation example.
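The energy components mentioned above can be sketched for a linear SDOF system, where the input energy is balanced by kinetic, strain, and damping energies. The excitation pulse and system properties below are illustrative assumptions, not the Bam record or the paper's multi-story model.

```python
import math

def sdof_energy_balance(m=1.0, k=400.0, zeta=0.05, dt=1e-3, total_time=4.0):
    """Integrate a linear SDOF under a 1 s base-acceleration pulse with the
    Newmark average-acceleration method and tally the energy components.
    Returns (input, kinetic, strain, damping) energies at the final step."""
    c = 2.0 * zeta * math.sqrt(k * m)
    n = int(total_time / dt)
    ag = [math.sin(4.0 * math.pi * i * dt) if i * dt < 1.0 else 0.0
          for i in range(n + 1)]
    u = v = 0.0
    a = -ag[0]                                  # equilibrium at t = 0
    keff = k + 2.0 * c / dt + 4.0 * m / dt ** 2
    e_in = e_damp = 0.0
    for i in range(n):
        dp = -m * (ag[i + 1] - ag[i])
        dp_eff = dp + (4.0 * m / dt + 2.0 * c) * v + 2.0 * m * a
        du = dp_eff / keff
        dv = 2.0 * du / dt - 2.0 * v
        u_new, v_new = u + du, v + dv
        a_new = (-m * ag[i + 1] - c * v_new - k * u_new) / m
        # trapezoidal work increments: input and damping energies
        e_in += -m * 0.5 * (ag[i] * v + ag[i + 1] * v_new) * dt
        e_damp += c * 0.5 * (v ** 2 + v_new ** 2) * dt
        u, v, a = u_new, v_new, a_new
    return e_in, 0.5 * m * v ** 2, 0.5 * k * u ** 2, e_damp

e_in, e_k, e_s, e_d = sdof_energy_balance()
```

For a multi-story model the same bookkeeping is done story by story, which is what makes the energy distribution useful for placing added damping or isolation devices.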

  11. On the seismic design of piping for fossil fired power stations

    SciTech Connect

    Lazzeri, L.

    1996-12-01

    The seismic design criteria are briefly reviewed, and the importance of yielding phenomena in the seismic response is presented. The decisive importance of ductility is confirmed by field observations: ductility reduces the response, flattening the peaks. Analyses are performed on several piping systems under static-equivalent conditions with ZPA loading, assuming some ductility in the system. Problems are found only for very flexible systems.

  12. Effective Parameters on Seismic Design of Rectangular Underground Structures

    SciTech Connect

    Amiri, G. Ghodrati; Maddah, N.; Mohebi, B.

    2008-07-08

    Underground structures are a significant part of transportation in modern society, and in seismic zones they should withstand both seismic and static loadings. Embedded structures must conform to the ground deformations during an earthquake, so an accurate evaluation of the structure-to-ground distortion is critical. Several two-dimensional finite difference models are used to identify the parameters that affect the racking ratio (structure-to-ground distortion), including the flexibility ratio, various cross sections, embedment depth, and the Poisson's ratio of the soil. Results show that the influence of the different cross sections by themselves is negligible, but embedment depth, in addition to the flexibility ratio and Poisson's ratio, is a consequential parameter. A comparison with the pseudo-static method (simplified frame analysis) is also performed. The results show that for a structure stiffer than the soil, the racking ratio decreases as the depth of burial decreases; on the other hand, shallow flexible structures can suffer up to 30 percent greater distortion than deeper ones.

  13. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  14. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  15. Seismic design factors for RC special moment resisting frames in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

    2011-12-01

    This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16, and 32 stories in Dubai, UAE, utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated, namely the response modification factor (R), deflection amplification factor (Cd), system overstrength factor (Ω0), and response modification factor for ductility (Rd), in addition to inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50); consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ω0 factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) is to be expected in moment frames of varying heights.
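The factors evaluated in the study relate through the commonly used decomposition R = Rd * Ω0. The sketch below computes them from generic analysis quantities, with definitions following common practice (e.g., ATC-19) and purely illustrative numbers, not the paper's results.

```python
def seismic_design_factors(v_elastic, v_max, v_design, d_inelastic, d_design):
    """Compute standard seismic design factors from analysis results.

    v_elastic   -- base shear demand if the frame remained elastic
    v_max       -- maximum base shear from nonlinear analysis (capacity)
    v_design    -- code design base shear
    d_inelastic -- peak inelastic roof displacement
    d_design    -- elastic roof displacement at the design force level
    """
    R = v_elastic / v_design        # response modification factor
    omega_0 = v_max / v_design      # system overstrength factor
    R_d = R / omega_0               # ductility reduction, so R = R_d * omega_0
    C_d = d_inelastic / d_design    # deflection amplification factor
    return R, omega_0, R_d, C_d

# Illustrative numbers only (kN and m):
R, omega_0, R_d, C_d = seismic_design_factors(8000.0, 2500.0, 1000.0, 0.30, 0.06)
```

Evaluating these ratios per building height and per record ensemble (10/50 versus 2/50) is what reveals the period and intensity dependence the study reports.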

  16. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major

  17. The 1995 forum on appropriate criteria and methods for seismic design of nuclear piping

    SciTech Connect

    Slagis, G.C.

    1996-12-01

    A record of the 1995 Forum on Appropriate Criteria and Methods for Seismic Design of Nuclear Piping is provided. The focus of the forum was the earthquake experience data base and whether the data base demonstrates that seismic inertia loads will not cause failure in ductile piping systems. This was a follow-up to the 1994 Forum when the use of earthquake experience data, including the recent Northridge earthquake, to justify a design-by-rule method was explored. Two possible topics for the next forum were identified--inspection after an earthquake and design for safe-shutdown earthquake only.

  18. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  19. Performance-based seismic design of steel frames utilizing colliding bodies algorithm.

    PubMed

    Veladi, H

    2014-01-01

    A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find the optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared with conventional design methods to show the strengths and weaknesses of the algorithm. PMID:25202717

  20. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find the optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared with conventional design methods to show the strengths and weaknesses of the algorithm. PMID:25202717

  1. Verifying Ballast Water Treatment Performance

    EPA Science Inventory

    The U.S. Environmental Protection Agency, NSF International, Battelle, and U.S. Coast Guard are jointly developing a protocol for verifying the technical performance of commercially available technologies designed to treat ship ballast water for potentially invasive species. The...

  2. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84. percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. Because of the
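
    The Vp-to-Vs estimate described above is simple arithmetic; a minimal sketch, in which the function name, Vp value and Vp/Vs band are illustrative assumptions rather than site data:

```python
# Sketch of the interbed shear-wave velocity estimate described above:
# Vs is back-calculated from measured Vp using an assumed range of
# Vp/Vs ratios drawn from analogues in similar materials.

def vs_range_from_vp(vp_m_s, vp_vs_ratios):
    """Return (low, high) shear-wave velocity bounds for one Vp value."""
    ratios = sorted(vp_vs_ratios)
    return vp_m_s / ratios[-1], vp_m_s / ratios[0]

# Hypothetical interbed Vp and an assumed Vp/Vs band (not site data):
vp = 2000.0                 # m/s, compressional-wave velocity
ratio_band = (1.8, 2.5)     # assumed Vp/Vs range for the interbed material

vs_low, vs_high = vs_range_from_vp(vp, ratio_band)
print(f"Vs between {vs_low:.0f} and {vs_high:.0f} m/s")
```

    The spread between the two bounds is exactly the additional response-spectrum uncertainty the abstract refers to.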

  3. Estimation of Cyclic Interstory Drift Capacity of Steel Framed Structures and Future Applications for Seismic Design

    PubMed Central

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288

  4. Estimation of cyclic interstory drift capacity of steel framed structures and future applications for seismic design.

    PubMed

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288
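
    The abstract does not specify the energy-based damage model; as a hedged illustration of the general idea, a Park-Ang-type index combines peak ductility demand with cumulative hysteretic energy, which is why long-duration motions matter even at the same peak drift. The names and parameter values below are invented:

```python
# Park-Ang-type damage index (illustrative, not the paper's model):
# DI = mu_max/mu_ult + beta * E_h / (F_y * d_y * mu_ult)
# DI >= 1.0 is conventionally read as collapse-level damage.
def damage_index(mu_max, mu_ult, hysteretic_energy, fy, dy, beta=0.15):
    """Combine peak ductility demand with cumulative hysteretic energy."""
    return mu_max / mu_ult + beta * hysteretic_energy / (fy * dy * mu_ult)

# Same peak ductility, different duration (hence different energy demand):
short = damage_index(mu_max=3.0, mu_ult=6.0, hysteretic_energy=0.8e5,
                     fy=400e3, dy=0.02)
long_ = damage_index(mu_max=3.0, mu_ult=6.0, hysteretic_energy=4.0e5,
                     fy=400e3, dy=0.02)
print(f"short-duration DI: {short:.2f}, long-duration DI: {long_:.2f}")
```

    The long-duration case crosses the collapse threshold although the maximum interstory drift is unchanged, which is the limitation of drift-only parameters the paper points to.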

  5. optimization of seismic network design: application to a geophysical international lunar network

    NASA Astrophysics Data System (ADS)

    Yamada, R.; Garcia, R. F.; Lognonne, P.; Calvet, M.; Gagnepain-Beyneix, J.; Le Feuvre, M.

    2010-12-01

    During the next decade, several lunar seismic experiments are planned under the international lunar network initiative, such as the NASA ILN Anchor Nodes mission, the Lunette DISCOVERY proposal, and the JAXA SELENE-2 and LUNA-GLOB penetrator missions, during which 1 to 4 seismic stations will be deployed on the lunar surface. Yamada et al. (submitted) have described how to design an optimized network in order to obtain the best scientific gain from these future lunar landing missions. In this presentation, we will describe the expected gain from the new lunar seismic observations potentially obtained by the optimized network compared with the past Apollo seismic experiments. From the Apollo seismic experiments, valuable information about the lunar interior structure was obtained using deep and shallow moonquakes and meteoroid impacts (e.g., Nakamura et al., 1983; Lognonné et al., 2003). However, due to the limited sensitivity of the Apollo lunar seismometers and the narrowness of the seismic network, the deep lunar structure, especially the core, was not properly retrieved. In addition, large uncertainties are associated with the inferred crustal thickness around the Apollo seismic stations. Improving this knowledge will help us understand the origin of the Earth-Moon system and the initial differentiation of the Moon. Therefore, we have studied the optimization of a seismic network consisting of three or four new seismometers in order to place better constraints on the lunar mantle structure and/or crustal thickness. The network is designed to minimize the a posteriori errors and maximize the resolution of the velocity perturbations inside the mantle and/or the crust through a linear inverse method. For the inversion, the deep moonquakes already located from Apollo seismic data are used, because these events are known to occur repeatedly at identical nests depending on tidal constraints. In addition, we use randomly distributed meteoroid impacts

  6. Deterministic seismic design and evaluation criteria to meet probabilistic performance goals

    SciTech Connect

    Short, S.A. ); Murray, R.C.; Nelson, T.A. ); Hill, J.R. . Office of Safety Appraisals)

    1990-12-01

    For DOE facilities across the United States, seismic design and evaluation criteria are based on probabilistic performance goals. In addition, other programs such as Advanced Light Water Reactors, New Production Reactors, and IPEEE for commercial nuclear power plants utilize design and evaluation criteria based on probabilistic performance goals. The use of probabilistic performance goals is a departure from design practice for commercial nuclear power plants which have traditionally been designed utilizing a deterministic specification of earthquake loading combined with deterministic response evaluation methods and permissible behavior limits. Approaches which utilize probabilistic seismic hazard curves for specification of earthquake loading and deterministic response evaluation methods and permissible behavior limits are discussed in this paper. Through the use of such design/evaluation approaches, it may be demonstrated that there is high likelihood that probabilistic performance goals can be achieved. 12 refs., 2 figs., 9 tabs.
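
    The bookkeeping behind such approaches can be sketched as follows; the function and the numbers are illustrative assumptions, not DOE criteria:

```python
# Sketch: a deterministic procedure meets a probabilistic performance
# goal if the annual exceedance probability H of the design ground
# motion, divided by the risk-reduction ratio R earned by conservative
# evaluation rules and behavior limits, is at or below the goal.
def meets_performance_goal(hazard_exceedance, risk_reduction_ratio, goal):
    """Achieved annual failure probability ~ H / R; compare with goal."""
    return hazard_exceedance / risk_reduction_ratio <= goal

# Illustrative: design motion with 1e-3 annual exceedance, evaluation
# rules assumed to buy a factor-of-10 risk reduction, goal of 1e-4.
print(meets_performance_goal(1e-3, 10.0, 1e-4))
```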

  7. Seismic Assessment of High-Raised Designed Structures Based on 2800 Iranian Seismic Code (same as UBC1997)

    SciTech Connect

    Negar, Moharrami Gargari; Rassol, Mirgaderi

    2008-07-08

    Seismic design codes aim to ensure appropriate performance of structures during earthquakes; to this end, a variety of load patterns, the history and location of plastic hinges, the ultimate capacity of structures, the demand on structures, and many other questions about the actual and assumed performance of structures during earthquakes have been considered by experts in this field. To reduce the cost of retrofitting structures, the nonlinear behavior of structures during earthquakes has been studied in greater depth. In the late 1980s the first generation of structural retrofit codes was established, while design codes were still based on the linear behavior of structures. Consequently, comparing the results of design codes with those of retrofit codes, which evaluate the actual behavior of the structure, has become of interest. This research evaluates structures designed by the 2800 code against the performance levels described in FEMA356, and it compares the results of modal analysis with the outcomes of static nonlinear analysis using the load patterns given in FEMA356. The structure is designed and checked against all regulations of the 2800 code and then evaluated by the FEMA356 regulations. Finally, the performance point of the structure and the distribution of plastic hinges over the whole structure at collapse are presented.

  8. Multi Canister Overpack (MCO) Handling Machine Trolley Seismic Uplift Constraint Design Loads

    SciTech Connect

    SWENSON, C.E.

    2000-03-09

    The MCO Handling Machine (MHM) trolley moves along the top of the MHM bridge girders on east-west oriented rails. To prevent trolley wheel uplift during a seismic event, passive uplift constraints are provided as shown in Figure 1-1. North-south trolley wheel movement is prevented by flanges on the trolley wheels. When the MHM is positioned over a Multi-Canister Overpack (MCO) storage tube, east-west seismic restraints are activated to prevent trolley movement during MCO handling. The active seismic constraints consist of a plunger, which is inserted into slots positioned along the tracks as shown in Figure 1-1. When the MHM trolley is moving between storage tube positions, the active seismic restraints are not engaged. The MHM has been designed and analyzed in accordance with ASME NOG-1-1995. The ALSTHOM seismic analysis (Reference 3) reported seismic uplift restraint loading and EDERER performed corresponding structural calculations. The ALSTHOM and EDERER calculations were performed with the east-west seismic restraints activated and the uplift restraints experiencing only vertical loading. In support of development of the CSB Safety Analysis Report (SAR), an evaluation of the MHM seismic response was requested for the case where the east-west trolley restraints are not engaged. For this case, the associated trolley movements would result in east-west lateral loads on the uplift constraints due to friction, as shown in Figure 1-2. During preliminary evaluations, questions were raised as to whether the EDERER calculations considered the latest ALSTHOM seismic analysis loads (See NCR No. 00-SNFP-0008, Reference 5). Further evaluation led to the conclusion that the EDERER calculations used appropriate vertical loading, but the uplift restraints would need to be re-analyzed and modified to account for lateral loading. The disposition of NCR 00-SNFP-0008 will track the redesign and modification effort. 
The purpose of this calculation is to establish bounding seismic

  9. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
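
    The risk-targeted adjustment rests on convolving a collapse fragility with the site hazard curve; a minimal sketch, assuming a lognormal fragility and an invented power-law hazard curve (not USGS data):

```python
# Hedged sketch of the risk calculation behind "risk-targeted" maps:
# convolve a lognormal collapse fragility with a site hazard curve to
# get the 50-year collapse probability.
import math

def lognorm_cdf(x, median, beta):
    """Lognormal CDF: probability of collapse given Sa = x."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_collapse_rate(hazard, median, beta):
    """Integrate P(collapse | a) * |d(lambda)/da| over the hazard curve.

    hazard: list of (sa, lam) points, lam = annual exceedance rate.
    """
    rate = 0.0
    for (a0, l0), (a1, l1) in zip(hazard, hazard[1:]):
        a_mid = 0.5 * (a0 + a1)
        rate += lognorm_cdf(a_mid, median, beta) * (l0 - l1)
    return rate

# Illustrative power-law hazard curve lambda(a) = 1e-4 * a**-2:
hazard = [(a, 1e-4 * a ** -2) for a in [0.05 * i for i in range(1, 41)]]
rate = annual_collapse_rate(hazard, median=1.0, beta=0.6)
p50 = 1.0 - math.exp(-50.0 * rate)
print(f"50-year collapse probability = {p50:.2%}")
```

    Because the shape of the hazard curve enters the integral, two sites with the same 2%-in-50-year ground motion generally end up with different collapse probabilities, which is exactly the non-uniformity the paper addresses.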

  10. Recent results of a seismically isolated optical table prototype designed for advanced LIGO

    NASA Astrophysics Data System (ADS)

    Sannibale, V.; Abbott, B.; Aso, Y.; Boschi, V.; Coyne, D.; DeSalvo, R.; Márka, S.; Ottaway, D.; Stochino, A.

    2008-07-01

    The Horizontal Access Module Seismic Attenuation System (HAM-SAS) is a mechanical device expressly designed to isolate a multipurpose optical table and fit in the tight space of the LIGO HAM Ultra-High-Vacuum chamber. Seismic attenuation in the detectors' sensitivity frequency band is achieved with state-of-the-art passive mechanical attenuators. These devices should provide an attenuation factor of about 70 dB above 10 Hz at the suspension point of the Advanced LIGO triple pendulum suspension. Automatic control techniques are used to position the optical table and damp rigid body modes. Here, we report the main results obtained from the full scale prototype installed at the MIT LIGO Advanced System Test Interferometer (LASTI) facility. Seismic attenuation performance, control strategies, improvements and limitations are also discussed.

  11. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part I: Theoretical formulation

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-08

    The insertion of steel braces equipped with dissipative devices proves very effective in enhancing the performance of a framed building under horizontal seismic loads. Multi-level design criteria were proposed according to Performance-Based Design, in order to achieve, for a specific level of seismic intensity, a designated performance objective for the building (e.g., an assigned damage level of either the framed structure or the non-structural elements). In this paper a design procedure is proposed that proportions braces with hysteretic dampers so as to attain, for a specific level of seismic intensity, a designated performance level of the building. Specifically, a proportional stiffness criterion, which assumes the elastic lateral storey stiffness due to the braces to be proportional to that of the unbraced frame, is combined with Direct Displacement-Based Design, in which the design starts from target deformations. A computer code has been prepared for the nonlinear static and dynamic analyses, using a step-by-step procedure. Frame members and hysteretic dampers are idealized by bilinear models.
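
    The core arithmetic of a Direct Displacement-Based Design step can be sketched as follows; the substitute-structure values are illustrative assumptions, not taken from the paper:

```python
# Direct Displacement-Based Design, minimal step (Priestley-style):
# starting from a target displacement, compute the effective stiffness
# of the substitute SDOF structure and hence the design base shear.
import math

def ddbd_base_shear(target_disp, eff_mass, eff_period):
    """K_e = 4*pi^2 * m_e / T_e^2;  V_b = K_e * d_d."""
    k_eff = 4.0 * math.pi ** 2 * eff_mass / eff_period ** 2
    return k_eff * target_disp

# Illustrative SDOF substitute structure (SI units: kg, s, m, N):
vb = ddbd_base_shear(target_disp=0.25, eff_mass=800e3, eff_period=2.0)
print(f"design base shear = {vb / 1e3:.0f} kN")
```

    The novelty of the paper's procedure lies in how this base shear is then apportioned between the unbraced frame and the dissipative braces via the proportional stiffness criterion.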

  12. Tube-shape verifier

    NASA Technical Reports Server (NTRS)

    Anderson, A. N.; Christ, C. R.

    1980-01-01

    Inexpensive apparatus checks accuracy of bent tubes. Assortment of slotted angles and clamps is bolted down to flat aluminum plate outlining shape of standard tube bent to desired configuration. Newly bent tubes are then checked against this outline. Because parts are bolted down, tubes can be checked very rapidly without disturbing outline. One verifier per tube-bending machine can really speed up production in tube-bending shop.

  13. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  14. Seismic design of circular-section concrete-lined underground openings: Preclosure performance considerations for the Yucca Mountain Site

    SciTech Connect

    Richardson, A.M.; Blejwas, T.E.

    1992-07-01

    Yucca Mountain, the potential site of a repository for high-level radioactive waste, is situated in a region of natural and man-made seismicity. Underground openings excavated at this site must be designed for worker safety in the seismic environment anticipated for the preclosure period. This includes accesses developed for site characterization regardless of the ultimate outcome of the repository siting process. Experience with both civil and mining structures has shown that underground openings are much more resistant to seismic effects than surface structures, and that even severe dynamic strains can usually be accommodated with proper design. This paper discusses the design and performance of lined openings in the seismic environment of the potential site. The types and ranges of possible ground motions (seismic loads) are briefly discussed. Relevant historical records of underground opening performance during seismic loading are reviewed. Simple analytical methods of predicting liner performance under combined in situ, thermal, and seismic loading are presented, and results of calculations are discussed in the context of realistic performance requirements for concrete-lined openings for the preclosure period. Design features that will enhance liner stability and mitigate the impact of the potential seismic load are reviewed. The paper is limited to preclosure performance concerns involving worker safety because present decommissioning plans specify maintaining the option for liner removal at seal locations, thus decoupling liner design from repository postclosure performance issues.

  15. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as is known, may play a significant role in the level, duration and frequency content of ground motion. Ridges and crests are often lost inside the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article by Massa et al. on topographic effects, aims to overcome this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviors of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results have highlighted the inadequacy of the national provisions concerning the definition of the seismic load at the top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.
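
    The proposed amendment operates in log space: a site-specific topographic term is added to the GMPE median. A minimal sketch with an invented toy GMPE (not the Bindi et al. model) and an assumed crest amplification factor:

```python
# Sketch of amending a GMPE with a topographic term in log space.
# The coefficients and amplification factor are purely illustrative.
import math

def ln_median_pga(mag, r_km, c=(-3.5, 0.9, -1.2)):
    """Toy GMPE: ln(PGA) = c0 + c1*M + c2*ln(R + 10). Not a real model."""
    c0, c1, c2 = c
    return c0 + c1 * mag + c2 * math.log(r_km + 10.0)

def ln_median_with_topo(mag, r_km, ln_topo_amp):
    """Amended GMPE: add a site-specific topographic term in log space."""
    return ln_median_pga(mag, r_km) + ln_topo_amp

flat = math.exp(ln_median_pga(6.0, 20.0))
crest = math.exp(ln_median_with_topo(6.0, 20.0, math.log(1.5)))  # assumed 1.5x
print(f"flat site: {flat:.3f} g, crest: {crest:.3f} g")
```

    Because the term is additive in log space, it scales the median prediction by a constant factor at the amended site while leaving all other sites untouched.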

  16. Verifiable Quantum Computing

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham

    Over the next five to ten years we will see a state of flux as quantum devices become part of the mainstream computing landscape. However, adopting and applying such a highly variable and novel technology is both costly and risky, as this quantum approach has an acute verification and validation problem: on the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. Our grand aim is to settle these key milestones to make the translation from theory to practice possible. Currently the most efficient ways to verify a quantum computation are to employ cryptographic methods. I will present the current state of the art of various existing protocols, where there generally exists a trade-off between the practicality of a scheme and its generality, trust assumptions and security level. EK gratefully acknowledges funding through EPSRC Grants EP/N003829/1 and EP/M013243/1.

  17. Programmable Verifiers in Imperative Programming

    NASA Astrophysics Data System (ADS)

    Chen, Yifeng

    This paper studies the relation between execution and verification. A simple imperative language called VerExec with execution and verification commands is introduced. A machine only executes the execution commands of a program, while the compiler only performs the verification commands. Common commands in other languages can be defined as a combination of execution and verification commands. Design of verifiers then becomes program design using verification commands. It is shown that type checking, abstract interpretation, model checking and Hoare Logic are all special verification programs, as are many of their combinations.
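
    The split between execution and verification commands can be rendered as a toy interpreter; the command representation below is an invented illustration, not the VerExec syntax:

```python
# Toy rendering of the VerExec idea: a program mixes execution and
# verification commands; the "compiler" performs only the verification
# commands, while the "machine" runs only the execution commands.
def compile_program(program):
    """Perform all verification commands; raise on the first failure."""
    for kind, action in program:
        if kind == "verify" and not action():
            raise AssertionError("static verification failed")

def run_program(program, state):
    """Execute only the execution commands, threading state through."""
    for kind, action in program:
        if kind == "exec":
            state = action(state)
    return state

program = [
    ("verify", lambda: True),        # stands in for, e.g., a type check
    ("exec",   lambda s: s + 1),
    ("verify", lambda: 1 + 1 == 2),  # e.g., an abstract-interpretation fact
    ("exec",   lambda s: s * 2),
]
compile_program(program)        # verification pass (compiler's job)
print(run_program(program, 0))  # execution pass (machine's job)
```

    A "common command" in an ordinary language would then be a pair of one verification command and one execution command, matching the paper's framing.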

  18. Effect of URM infills on seismic vulnerability of Indian code designed RC frame buildings

    NASA Astrophysics Data System (ADS)

    Haldar, Putul; Singh, Yogendra; Paul, D. K.

    2012-03-01

    Unreinforced Masonry (URM) is the most common partitioning material in framed buildings in India and many other countries. Although it is well-known that under lateral loading the behavior and modes of failure of the frame buildings change significantly due to infill-frame interaction, the general design practice is to treat infills as nonstructural elements and their stiffness, strength and interaction with the frame is often ignored, primarily because of difficulties in simulation and lack of modeling guidelines in design codes. The Indian Standard, like many other national codes, does not provide explicit insight into the anticipated performance and associated vulnerability of infilled frames. This paper presents an analytical study on the seismic performance and fragility analysis of Indian code-designed RC frame buildings with and without URM infills. Infills are modeled as diagonal struts as per ASCE 41 guidelines and various modes of failure are considered. HAZUS methodology along with nonlinear static analysis is used to compare the seismic vulnerability of bare and infilled frames. The comparative study suggests that URM infills result in a significant increase in the seismic vulnerability of RC frames and their effect needs to be properly incorporated in design codes.
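
    The diagonal strut modeling cited above reduces each infill panel to an equivalent strut whose width follows the FEMA 356/ASCE 41 (Mainstone-type) expression; a sketch with illustrative frame and infill properties, not values from the study:

```python
# Equivalent diagonal strut width per the FEMA 356 / ASCE 41
# formulation: a = 0.175 * (lam1 * h_col)**-0.4 * r_inf, with
# lam1 = [E_me * t_inf * sin(2*theta) / (4 * E_fe * I_col * h_inf)]**0.25
import math

def strut_width(E_frame, I_col, h_col, E_infill, t_inf, h_inf, L_inf):
    """Return equivalent strut width for one infill panel."""
    theta = math.atan(h_inf / L_inf)      # strut inclination
    r_inf = math.hypot(h_inf, L_inf)      # diagonal length of the panel
    lam1 = (E_infill * t_inf * math.sin(2.0 * theta)
            / (4.0 * E_frame * I_col * h_inf)) ** 0.25
    return 0.175 * (lam1 * h_col) ** -0.4 * r_inf

# Illustrative RC frame with URM infill (units: N/mm^2, mm):
a = strut_width(E_frame=25000.0, I_col=1.07e9, h_col=3000.0,
                E_infill=4000.0, t_inf=230.0, h_inf=2700.0, L_inf=4000.0)
print(f"equivalent strut width = {a:.0f} mm")
```

    The strut width, together with the infill thickness and modulus, sets the stiffness the infill adds to each story, which is what drives the vulnerability increase the paper reports.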

  19. Verifying versus falsifying banknotes

    NASA Astrophysics Data System (ADS)

    van Renesse, Rudolf L.

    1998-04-01

    A series of counterfeit Dutch, German, English, and U.S. banknotes was examined with respect to the various modi operandi used to imitate paper-based, printed and post-printed security features. These features provide positive evidence (verifiability) as well as negative evidence (falsifiability). It appears that the positive evidence provided is in most cases insufficiently convincing: banknote inspection mainly rests on negative evidence. The act of falsifying (proving to be false), however, is an inefficacious procedure. Ergonomic verificatory security features are demanded. This demand is increasingly met by security features based on nanotechnology. The potential of nano-security has a twofold basis: (1) the unique optical effects displayed allow simple, fast and unambiguous inspection, and (2) the nanotechnology they are based on makes successful counterfeiting or simulation extremely improbable.

  20. Some considerations for establishing seismic design criteria for nuclear plant piping

    SciTech Connect

    Chen, W.P.; Chokshi, N.C.

    1997-01-01

    The Energy Technology Engineering Center (ETEC) is providing assistance to the U.S. NRC in developing regulatory positions on the seismic analysis of piping. As part of this effort, ETEC previously performed reviews of the ASME Code, Section III piping seismic design criteria as revised by the 1994 Addenda. These revised criteria were based on evaluations by the ASME Special Task Group on Integrated Piping Criteria (STGIPC) and the Technical Core Group (TCG) of the Advanced Reactor Corporation (ARC) of the earlier joint Electric Power Research Institute (EPRI)/NRC Piping & Fitting Dynamic Reliability (PFDR) program. Previous ETEC evaluations reported at the 23rd WRSM of seismic margins associated with the revised criteria are reviewed. These evaluations had concluded, in part, that although margins for the tuned PFDR tests appeared acceptable (>2), margins in detuned tests could be unacceptable (<1). This conclusion was based primarily on margin reduction factors (MRFs) developed by the ASME STGIPC and ARC/TCG from realistic analyses of PFDR test 36. This paper reports more recent results including: (1) an approach developed for establishing appropriate seismic margins based on PRA considerations, (2) independent assessments of frequency effects on margins, (3) the development of margins based on failure mode considerations, and (4) the implications of Code Section III rules for Section XI.

  1. Seismic design technology for breeder reactor structures. Volume 1. Special topics in earthquake ground motion

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This report is divided into twelve chapters: seismic hazard analysis procedures, statistical and probabilistic considerations, vertical ground motion characteristics, vertical ground response spectrum shapes, effects of inclined rock strata on site response, correlation of ground response spectra with intensity, intensity attenuation relationships, peak ground acceleration in the very mean field, statistical analysis of response spectral amplitudes, contributions of body and surface waves, evaluation of ground motion characteristics, and design earthquake motions. (DLC)

  2. Seismic Evaluation and Preliminary Design of Regular Setback Masonry Infilled Open Ground Storey RC Frame

    NASA Astrophysics Data System (ADS)

    Hashmi, Arshad K.

    2016-06-01

    The current seismic code prescribes stringent criteria for classifying a frame as regular or irregular, and these criteria determine only the type of analysis (equivalent static or dynamic) to be performed. In contrast, newer simplified methods such as pushover analysis can readily estimate the lateral load capacity of any structure, regular or irregular. An iterative design procedure based on pushover analysis, targeting the serviceability requirement (inter-storey drift limitation) of the present seismic code, can provide an alternative to current practice. This paper deals with regular setback frames combined with a vulnerable layout of masonry infill walls over the frame elevation (a probable case of "vertical stiffness irregularity"). Nonlinear time-history analysis and the Capacity Spectrum Method have been implemented to investigate the seismic performance of these frames. Finally, a recently developed preliminary design procedure satisfying the inter-storey drift serviceability criterion has been employed for the preliminary design of these frames.

  3. Probabilistic seismic hazard characterization and design parameters for the Pantex Plant

    SciTech Connect

    Bernreuter, D. L.; Foxall, W.; Savy, J. B.

    1998-10-19

    The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr., 500 yr., 1000 yr., 2000 yr., 10,000 yr., and 100,000 yr.). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.
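
    The return periods quoted above map to annual exceedance probabilities, and a uniform hazard value at a given return period can be read off a hazard curve by log-log interpolation. A sketch using an entirely hypothetical hazard curve (not the Pantex results):

```python
import math

def annual_exceedance(return_period_yr):
    """Annual probability of exceedance for a given mean return period."""
    return 1.0 / return_period_yr

def interp_hazard(pga_grid, aep_grid, target_aep):
    """Log-log interpolation of a hazard curve: PGA (g) vs annual exceedance
    probability (AEP). Grids must be ordered with AEP decreasing."""
    for i in range(len(aep_grid) - 1):
        if aep_grid[i] >= target_aep >= aep_grid[i + 1]:
            t = (math.log(target_aep) - math.log(aep_grid[i])) / \
                (math.log(aep_grid[i + 1]) - math.log(aep_grid[i]))
            return math.exp(math.log(pga_grid[i]) +
                            t * (math.log(pga_grid[i + 1]) - math.log(pga_grid[i])))
    raise ValueError("target AEP outside tabulated range")

# Hypothetical hazard curve for illustration only:
pga = [0.05, 0.10, 0.20, 0.40]   # g
aep = [1e-2, 2e-3, 4e-4, 5e-5]   # 1/yr
pga_2000yr = interp_hazard(pga, aep, annual_exceedance(2000))
```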

  5. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER]

    SciTech Connect

    OMBERG SK

    2008-05-14

    This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria of DOE-STD-1189-2008, Appendix A. The existing and new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to the existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available in early CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex who may be implementing DOE-STD-1189-2008.

  6. Martian seismicity

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Grimm, Robert E.

    1991-01-01

    The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed in imaging experiments show that Mars must have been a seismically active planet in the past, and there is no reason to discount the notion that Mars is seismically active today, albeit at a lower level of activity. Models for present-day Mars seismicity are explored. Depending on the sensitivity and geometry of a seismic network and the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound; a more refined estimate would specifically take into account the regional cooling of Tharsis and lead to a higher frequency of seismic events.
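
    One simple way to frame the expected number of detectable events is a cumulative Gutenberg-Richter recurrence law scaled to the observation window; the a- and b-values below are placeholders, not estimates for Mars:

```python
def expected_events(a, b, m_min, years):
    """Cumulative Gutenberg-Richter rate: N(>=m_min) = 10**(a - b*m_min)
    events per year, scaled to an observation window in years."""
    return (10.0 ** (a - b * m_min)) * years

# Placeholder recurrence parameters for a weakly active planet:
decade_count = expected_events(a=3.0, b=1.0, m_min=3.0, years=10)
```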

  7. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.
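
    The decision framework can be sketched as minimizing total expected loss, averaging the aleatory expected loss over epistemic model weights. The loss function and all numbers below are toy assumptions, not the paper's model:

```python
def expected_loss(design_level, models, weights, loss_fn):
    """Total-uncertainty expected loss: average the aleatory expected loss
    loss_fn(design_level, model) over epistemic model weights."""
    return sum(w * loss_fn(design_level, m) for m, w in zip(models, weights))

def optimal_design(levels, models, weights, loss_fn):
    """Pick the design level minimizing total expected loss."""
    return min(levels, key=lambda d: expected_loss(d, models, weights, loss_fn))

# Toy loss: construction cost grows with design level, failure loss shrinks.
def loss_fn(d, hazard):
    return 1.0 * d + 100.0 * hazard / (1.0 + d)

best = optimal_design(levels=[0, 1, 2, 4, 8],
                      models=[0.1, 0.3],   # two epistemic hazard estimates
                      weights=[0.5, 0.5],
                      loss_fn=loss_fn)
```

Note that, as the abstract states, only the weighted (total) loss matters here: the optimum is insensitive to how the uncertainty is split between the two hazard estimates, as long as their weighted mean is unchanged.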

  8. Seismic analysis of the LSST telescope

    NASA Astrophysics Data System (ADS)

    Neill, Douglas R.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) will be located on the seismically active Chilean mountain of Cerro Pachón. The accelerations resulting from seismic events produce the most demanding load cases the telescope and its components must withstand. Seismic ground accelerations were applied to a comprehensive finite element analysis (FEA) model which included the telescope, its pier, and the mountain top. Response accelerations for specific critical components (camera and secondary mirror assembly) on the telescope were determined by applying seismic accelerations in the form of Power Spectral Densities (PSD) to the FEA model. The PSDs were chosen based on the components' design lives. Survival-level accelerations were determined utilizing PSDs for seismic events with return periods 10 times the telescope's design life, which is equivalent to a 10% chance of occurrence over the lifetime. Since the telescope has a design life of 30 years, it was analyzed for a return period of 300 years. Operational-level seismic accelerations were determined using return periods of 5 times the lifetimes. Since the seismic accelerations provided by the Chilean design codes were given in the form of Peak Spectral Accelerations (PSA), a method to convert between the two forms was developed. The accelerations are also affected by the damping level. The LSST incorporates added damping to meet its rapid slew-and-settle requirements. This added damping also reduces the components' seismic accelerations. The analysis was repeated for the telescope pointing at horizon and at zenith. Closed-form solutions were utilized to verify the results.
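
    The random-vibration step of such an analysis reduces a response PSD to an RMS (and a conventional 3-sigma peak) acceleration by integrating over frequency. A minimal sketch with a hypothetical flat PSD, not the LSST load cases:

```python
import math

def rms_from_psd(freqs, psd):
    """RMS acceleration from a one-sided acceleration PSD (g^2/Hz),
    by trapezoidal integration over frequency."""
    area = 0.0
    for i in range(len(freqs) - 1):
        area += 0.5 * (psd[i] + psd[i + 1]) * (freqs[i + 1] - freqs[i])
    return math.sqrt(area)

# Hypothetical flat PSD of 0.01 g^2/Hz from 1 to 101 Hz:
g_rms = rms_from_psd([1.0, 101.0], [0.01, 0.01])
peak_3sigma = 3.0 * g_rms   # common 3-sigma peak estimate
```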

  9. Seismic design evaluation guidelines for buried piping for the DOE HLW Facilities

    SciTech Connect

    Lin, Chi-Wen; Antaki, G.; Bandyopadhyay, K.; Bush, S.H.; Costantino, C.; Kennedy, R.

    1995-05-01

    This paper presents the seismic design and evaluation guidelines for underground piping for the Department of Energy (DOE) High-Level-Waste (HLW) Facilities. The underground piping includes both single and double containment steel pipes and concrete pipes with steel lining, with particular emphasis on the double containment piping. The design and evaluation guidelines presented in this paper follow the generally accepted beam-on-elastic-foundation analysis principle and the inertial response calculation method, respectively, for piping directly in contact with the soil or contained in a jacket. A standard analysis procedure is described along with the discussion of factors deemed to be significant for the design of the underground piping. The following key considerations are addressed: the design feature and safety requirements for the inner (core) pipe and the outer pipe; the effect of soil strain and wave passage; assimilation of the necessary seismic and soil data; inertial response calculation for the inner pipe; determination of support anchor movement loads; combination of design loads; and code comparison. Specifications and justifications of the key parameters used, stress components to be calculated and the allowable stress and strain limits for code evaluation are presented.
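
    For the soil-strain consideration mentioned above, a common first-order estimate of the axial strain imposed on a buried pipe by wave passage is eps = PGV / (alpha * c). This is a generic textbook sketch with hypothetical numbers, not the guideline's procedure:

```python
def wave_passage_strain(pgv, c, alpha=2.0):
    """Axial ground strain from wave passage for a buried pipe:
    eps = PGV / (alpha * c), with peak ground velocity PGV in m/s and
    apparent wave speed c in m/s (alpha ~ 2 is a common shear-wave assumption)."""
    return pgv / (alpha * c)

# Hypothetical site values: PGV = 0.5 m/s, apparent wave speed 500 m/s.
eps = wave_passage_strain(pgv=0.5, c=500.0)
```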

  10. Displacement-based seismic design of flat slab-shear wall buildings

    NASA Astrophysics Data System (ADS)

    Sen, Subhajit; Singh, Yogendra

    2016-06-01

    The flat slab system is becoming widely popular for multistory buildings due to its several advantages. However, the performance of flat slab buildings under earthquake loading is unsatisfactory due to their vulnerability to punching shear failure. Several national design codes provide guidelines for designing flat slab systems under gravity load only. Nevertheless, flat slab buildings are also being constructed in high-seismicity regions. In this paper, the performance of flat slab buildings of various heights, designed for gravity load alone according to code, is evaluated under earthquake loading as per the ASCE/SEI 41 methodology. Continuity of slab bottom reinforcement through the column cage improves the performance of flat slab buildings to some extent, but it is observed that these flat slab systems are not adequate in high-seismicity areas and need additional primary lateral load resisting systems such as shear walls. A displacement-based method is proposed to proportion shear walls as primary lateral load resisting elements to ensure satisfactory performance. The methodology is validated using design examples of flat slab buildings of various heights.
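
    A displacement-based proportioning step of this general kind can be sketched as: pick a design displacement from the target drift profile, read the effective period off a design displacement spectrum, and back out the required effective stiffness of the substitute single-degree-of-freedom system. All values below are hypothetical, not the paper's design examples:

```python
import math

def effective_period(delta_d, delta_c, t_c):
    """Effective period from a linear design displacement spectrum that
    reaches corner displacement delta_c (m) at corner period t_c (s)."""
    return t_c * delta_d / delta_c

def required_stiffness(mass, t_eff):
    """Effective stiffness (N/m) of an SDOF substitute structure:
    K = 4*pi^2*m / T^2."""
    return 4.0 * math.pi ** 2 * mass / t_eff ** 2

# Hypothetical values:
delta_d = 0.25                                             # design displacement (m)
t_eff = effective_period(delta_d, delta_c=0.5, t_c=4.0)    # effective period (s)
k_eff = required_stiffness(mass=500e3, t_eff=t_eff)        # required stiffness (N/m)
```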

  11. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are EPRI (2004), USGS (2002), and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3, and 0.1 are recommended for EPRI (2004), USGS (2002), and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs.
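
    Combining hazard curves from alternate GMAMs with logic-tree weights amounts to a weighted average at each ground-motion level. A sketch using the report's 0.6/0.3/0.1 weights but entirely made-up hazard values:

```python
def weighted_hazard(curves, weights):
    """Weighted mean of hazard curves (annual exceedance probabilities at the
    same ground-motion levels), given logic-tree weights that sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    n = len(curves[0])
    return [sum(w * c[i] for c, w in zip(curves, weights)) for i in range(n)]

# Hypothetical annual exceedance values at three fixed PGA levels:
epri  = [1.0e-3, 2.0e-4, 3.0e-5]
usgs  = [2.0e-3, 3.0e-4, 5.0e-5]
silva = [1.5e-3, 2.5e-4, 4.0e-5]
mean_curve = weighted_hazard([epri, usgs, silva], [0.6, 0.3, 0.1])
```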

  12. Improved Simplified Methods for Effective Seismic Analysis and Design of Isolated and Damped Bridges in Western and Eastern North America

    NASA Astrophysics Data System (ADS)

    Koval, Viacheslav

    The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events that have occurred along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated through extensive parametric nonlinear time history analyses in this thesis. It was found that there is a need to adjust existing design guidelines to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method will lead to conservative results and that a wider range of seismically isolated bridges can be covered by this method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than current existing simplified methods and can be applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not been fully exploited yet to achieve enhanced performance under different levels of seismic hazard. 
A novel Dual-Level Seismic

  13. Toward verified biological models.

    PubMed

    Sadot, Avital; Fisher, Jasmin; Barak, Dan; Admanit, Yishai; Stern, Michael J; Hubbard, E Jane Albert; Harel, David

    2008-01-01

    The last several decades have witnessed a vast accumulation of biological data and data analysis. Many of these data sets represent only a small fraction of the system's behavior, making the visualization of full system behavior difficult. A more complete understanding of a biological system is gained when different types of data (and/or conclusions drawn from the data) are integrated into a larger-scale representation or model of the system. Ideally, this type of model is consistent with all available data about the system, and it is then used to generate additional hypotheses to be tested. Computer-based methods intended to formulate models that integrate various events and to test the consistency of these models with respect to the laboratory-based observations on which they are based are potentially very useful. In addition, in contrast to informal models, the consistency of such formal computer-based models with laboratory data can be tested rigorously by methods of formal verification. We combined two formal modeling approaches in computer science that were originally developed for non-biological system design. One is the inter-object approach using the language of live sequence charts (LSCs) with the Play-Engine tool, and the other is the intra-object approach using the language of statecharts and Rhapsody as the tool. Integration is carried out using InterPlay, a simulation engine coordinator. Using these tools, we constructed a combined model comprising three modules. One module represents the early lineage of the somatic gonad of C. elegans in LSCs, while a second more detailed module in statecharts represents an interaction between two cells within this lineage that determine their developmental outcome. Using the advantages of the tools, we created a third module representing a set of key experimental data using LSCs. We tested the combined statechart-LSC model by showing that the simulations were consistent with the set of experimental LSCs. 
This small

  14. Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project

    SciTech Connect

    Subramanian, C.V.; King, J.L.; Perkins, D.M.; Mudd, R.W.; Richardson, A.M.; Calovini, J.C.; Van Eeckhout, E.; Emerson, D.O.

    1990-08-01

    This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

  15. Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Conrads, T.J.

    1993-06-01

    In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-Level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP), and the task was sponsored by DOE Environmental Management. The TSEP comprises a number of consultants known for their knowledge of seismic ground motion and expertise in the analysis of structures, systems, and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance with these guidelines throughout the Title I and Title II design phases of the project, as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects that might employ the TSEP guidelines.

  16. AP1000® design robustness against extreme external events - Seismic, flooding, and aircraft crash

    SciTech Connect

    Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J.

    2012-07-01

    Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000® nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, and safety. The plant's compact safety-related footprint and the protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shut down the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond-design-basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail-safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air, or HVAC). The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel containment vessel.

  17. Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system

    USGS Publications Warehouse

    Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.

    1972-01-01

    The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.
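
    Extrapolating peak motions with magnitude and distance is typically done with an attenuation relationship of the generic form log10(PGA) = a + b*M + c*log10(R). The coefficients below are placeholders for illustration, not values from this report:

```python
import math

def peak_acceleration(magnitude, distance_km, a=-1.0, b=0.25, c=-1.0):
    """Generic attenuation form: log10(PGA) = a + b*M + c*log10(R).
    Coefficients are illustrative placeholders, not fitted values."""
    return 10.0 ** (a + b * magnitude + c * math.log10(distance_km))

# Shaking at 10 km for the magnitude range specified in the stipulations:
pga_m55 = peak_acceleration(5.5, 10.0)
pga_m85 = peak_acceleration(8.5, 10.0)   # larger magnitude -> stronger shaking
```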

  18. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

    The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations to the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method versus the Evaluation Method is also demonstrated.
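
    The compounding effect described above can be sketched by multiplying per-link conservatism factors through the four-link SMC; the factors below are hypothetical:

```python
from functools import reduce

def compounded_margin(link_factors):
    """Overall conservatism from per-link factors in the seismic methodology
    chain (seismic input, soil-structure interaction, major structural
    response, subsystem response): factors multiply through the chain."""
    return reduce(lambda x, y: x * y, link_factors, 1.0)

# Hypothetical per-link conservatism factors:
overall = compounded_margin([1.5, 1.3, 1.2, 1.4])
```

Even modest per-link factors compound to a large overall conservatism, which is the effect the report sets out to quantify.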

  19. Seismic Ecology

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibration actions, force shocks during earthquakes) or indirectly through various buildings and constructions, and can be strong (felt by people) or weak (detected only by sensing devices). A great deal of work has been devoted to the influence of violent seismic actions (first of all, earthquakes) on people and various constructions. This work is devoted to the study of weak but long-lasting seismic actions on various buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized areas. Besides violent earthquakes, man-made seismic actions can exert an essential influence: explosions, seismic noise emitted by plant facilities and moving transport, radiation from high-rise buildings and constructions under the action of wind, etc. Materials on the increase of man-made seismicity in a number of Russian regions that were previously aseismic are presented in the paper. Along with seismic microzoning maps, maps should be built indicating the variation of the amplitude spectra of seismic noise within a day, a month, and a year. Information about the amplitudes and frequencies of oscillations from possible earthquakes and man-made sources in a given region allows sound design and construction of industrial and civil housing projects. The construction of a building, even in a region that is not seismically dangerous, with a resonance frequency coinciding in magnitude with the frequency of oscillations emitted locally by man-made objects can end in failure of the building and the heaviest consequences for the people. Practical examples are given of detailed engineering-seismological investigation of large industrial and civil housing projects in Siberia (hydro power

  20. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    SciTech Connect

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-08

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment-resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of connections already damaged by an earthquake. The experimental activity, together with FE simulation, demonstrated the adequacy of the advanced design methodology.

  1. Optimal seismic design of reinforced concrete structures under time-history earthquake loads using an intelligent hybrid algorithm

    NASA Astrophysics Data System (ADS)

    Gharehbaghi, Sadjad; Khatibinia, Mohsen

    2015-03-01

    A reliable seismic-resistant design of structures is achieved in accordance with the seismic design codes by designing structures under seven or more pairs of earthquake records. Based on the recommendations of seismic design codes, the average time-history response (ATHR) of the structure is required. This paper focuses on the optimal seismic design of reinforced concrete (RC) structures against ten earthquake records using a hybrid of a particle swarm optimization algorithm and an intelligent regression model (IRM). In order to reduce the computational time of the optimization procedure caused by the computational effort of time-history analyses, the IRM is proposed to accurately predict the ATHR of structures. The proposed IRM combines the subtractive algorithm (SA), the K-means clustering approach, and the wavelet weighted least squares support vector machine (WWLS-SVM). To predict the ATHR of structures, first, the input-output samples of structures are classified by the SA and the K-means clustering approach. Then, the WWLS-SVM is trained, with few samples and high accuracy, for each cluster. 9- and 18-storey RC frames are designed optimally to illustrate the effectiveness and practicality of the proposed IRM. The numerical results demonstrate the efficiency and computational advantages of the IRM for the optimal design of structures subjected to time-history earthquake loads.
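
    Inside such a procedure, the optimizer repeatedly queries the surrogate instead of running time-history analyses. Below is a minimal particle swarm optimizer with a toy quadratic standing in for the surrogate's predicted response (not the paper's WWLS-SVM model or its RC frame designs):

```python
import random

def pso(cost, bounds, n_particles=20, iters=50, seed=0):
    """Minimal particle swarm optimizer; `cost` stands in for the surrogate
    prediction of the averaged time-history response."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = pbest[pcost.index(min(pcost))][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive + social terms (w=0.7, c1=c2=1.5)
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < cost(g):
                    g = pos[i][:]
    return g

# Toy surrogate: quadratic bowl with minimum at (2, -1).
best = pso(lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2,
           bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```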

  2. The optimum design of time delay in time-domain seismic beam-forming based on receiver array

    NASA Astrophysics Data System (ADS)

    Ge, L.; Jiang, T.; Xu, X.; Jia, H.; Yang, Z.

    2013-12-01

    Generally, it is hard to obtain high signal-to-noise ratio (SNR) data in seismic prospecting in mining areas, especially when field noise is strong. To improve the quality of seismic data from complicated ore bodies, we developed the Time-domain Seismic Beam-forming Based on Receiver Array (TSBBRA) method, which can extract a directional wave beam in any direction. However, the quality of the reflected seismic data can be improved only when the direction parameter matches the direction of the waves reflected from the target body, so it is important to determine the direction of reflected waves from underground targets. In addition, previous studies have shown that the time-delay parameter of TSBBRA controls the direction of the main beam, so the optimal design of this parameter is of great significance. The optimum design of the time delay is part of seismic pre-processing, which uses delay-and-sum in the time domain to form the directional reflected seismic beam with the strongest energy for the specified receiving array. First, we establish the velocity model from the original seismic records and profiles of the assigned exploration area. Second, we simulate seismic wave propagation and the receiver-array response with a finite-difference method. Then, we calculate the optimum beam direction for the assigned reflection targets and give directional diagrams. Next, we synthesize seismic records for a set of time delays using TSBBRA, plot the curves of energy versus time delay, and obtain the optimum time delay. The results are as follows: the optimum delay time is 1.125 ms, 0.625 ms, and 0.500 ms for the waves reflected from the first, second, and third targets, respectively. In addition, to analyze the performance of TSBBRA, we calculated the SNR of the reflected wave signal before and after TSBBRA processing for the given model. The result shows that the SNR increased by 1.2-9.4 dB on average with TSBBRA. 
In conclusion, the optimum design
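    The delay-and-sum operation at the heart of the method above can be sketched as follows. This is a minimal illustrative implementation, not the TSBBRA code: the receiver geometry, sampling interval, and toy wavelet are all assumptions, and it only shows that steering with matching delays maximizes the stacked beam energy.

    ```python
    def delay_and_sum(traces, delays, dt):
        """Delay-and-sum beamforming: advance each receiver trace by an
        integer-sample delay and stack, reinforcing arrivals from the
        steered direction (hypothetical helper)."""
        n = len(traces[0])
        beam = [0.0] * n
        for trace, delay in zip(traces, delays):
            shift = int(round(delay / dt))  # samples to advance this trace
            for i in range(n):
                j = i + shift
                if 0 <= j < n:
                    beam[i] += trace[j]
        return beam

    def beam_energy(beam):
        return sum(s * s for s in beam)

    # Toy example: a plane wave crossing 4 receivers with 1 ms moveout.
    dt = 0.0005                       # 0.5 ms sampling interval
    wavelet = [0.0, 0.5, 1.0, 0.5, 0.0]
    traces = []
    for k in range(4):
        tr = [0.0] * 40
        onset = 10 + 2 * k            # 1 ms (2 samples) moveout per receiver
        for m, w in enumerate(wavelet):
            tr[onset + m] = w
        traces.append(tr)

    # Delays matching the moveout align the arrivals; zero delays do not.
    matched = delay_and_sum(traces, [0.001 * k for k in range(4)], dt)
    unmatched = delay_and_sum(traces, [0.0] * 4, dt)
    ```

    Sweeping the delay and picking the value that maximizes `beam_energy` is the essence of the optimum time-delay search described in the abstract.
    
    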

  3. Spatial correlation analysis of seismic noise for STAR X-ray infrastructure design

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Agostino, Raffaele; Festa, Lorenzo; Gervasi, Anna; Guerra, Ignazio; Palmer, Dennis T.; Serafini, Luca

    2014-05-01

    The Italian PON MaTeRiA project is focused on the creation of a research infrastructure, open to users, based on an innovative and evolutionary X-ray source. This source, named STAR (Southern Europe TBS for Applied Research), exploits Thomson backscattering (TBS) of laser radiation by fast-electron beams. Its main performance figures are: X-ray photon flux of 10^9-10^10 ph/s, angular divergence variable between 2 and 10 mrad, X-ray energy continuously variable between 8 keV and 150 keV, bandwidth ΔE/E variable between 1 and 10%, and a ps time-resolved structure. To achieve this performance, bunches of electrons produced by a photo-injector are accelerated to relativistic velocities by a linear accelerator section. The electron beam, a few hundred micrometers wide, is driven by magnetic fields to the interaction point along a 15 m transport line, where it is focused into a 10-micrometer-wide area. In the same area, the laser beam is focused after being transported along a 12 m structure. Ground vibrations could greatly affect the collision probability, and thus the emittance, by deviating the paths of the beams during their travel in the STAR source. Therefore, the program to measure ground vibrations at the STAR site can be used for site characterization in relation to accelerator design. Environmental and facility noise may affect the X-ray operation, especially if the predominant wavelengths in the microtremor wavefield are much smaller than the size of the linear accelerator. For much greater wavelengths, all the accelerator parts move in phase, so even large displacements cannot generate any significant effect. On the other hand, for wavelengths equal to or less than half the accelerator size, several parts could move in phase opposition, so small displacements could affect its proper functioning. 
Therefore, it is important to characterize the microtremor wavefield in both the frequency and wavelength domains.

  4. Design and utilization of a portable seismic/acoustic calibration system

    SciTech Connect

    Stump, B.W.; Pearson, D.C.

    1996-10-01

    Empirical results from the current GSETT-3 illustrate the need for source-specific information for calibrating the monitoring system. With the specified location design goal of 1,000 km², preliminary analysis indicates the importance of regional calibration of travel times. This calibration information can be obtained in a passive manner, utilizing locations derived from local seismic array arrival times and assuming the resulting locations are accurate. Alternatively, an active approach can be undertaken, making near-source observations of seismic sources of opportunity to provide specific information on the time, location, and characteristics of the source. Moderate to large mining explosions are one source type that may be amenable to such calibration. This paper describes an active ground-truthing procedure for regional calibration. A prototype data acquisition system is discussed that includes a primary ground-motion component for source time and location determination, and secondary, optional acoustic and video components for improved source phenomenology. The system costs approximately $25,000 and can be deployed and operated by one or two people, thus providing a cost-effective system for calibration and documentation of sources of interest. Practical implementation of the system is illustrated, emphasizing the minimal impact on an active mining operation.

  5. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur in determining the timing of the arrivals, whether these errors are made by automated processing or by an analyst. One of the major obstacles to properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses the factors that need to be modeled to properly study phase arrival-time picking errors. We have developed a multivariate statistical model, an experimental design, and an analysis strategy that can be used in this study, and have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase-pick measurement error model into our statistical model. This statistical model can be used to optimally calibrate a picking-error model to regional data. A follow-on report will present the results of applying this analysis plan to an experiment/data-gathering task.

  6. On the Need for Reliable Seismic Input Assessment for Optimized Design and Retrofit of Seismically Isolated Civil and Industrial Structures, Equipment, and Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Martelli, Alessandro

    2011-01-01

    Based on the experience of recent violent earthquakes, the limits of the methods currently used for the definition of seismic hazard are becoming more and more evident to many seismic engineers. Considerable improvement is felt necessary not only for the seismic classification of the territory (for which probabilistic seismic hazard assessment, PSHA, is generally adopted at present), but also for the evaluation of local amplification. With regard to the first item, a better knowledge of fault extension and near-fault effects, among others, is judged essential. The aforesaid improvements are particularly important for the design of seismically isolated structures, which relies on displacement. Such a design requires an accurate definition of the maximum displacement corresponding to the isolation period, and a reliable evaluation of the earthquake energy content at the low frequencies typical of isolated structures, for the site and ground of interest. These evaluations shall include possible near-fault effects, even in the vertical direction; for the construction of high-risk plants and components and the retrofit of some cultural heritage, they shall be performed for earthquakes characterized by very long return periods. The design displacement shall not be underestimated, but neither excessively overestimated, at least when rubber bearings are used in the seismic isolation (SI) system. In fact, as the transverse deformation of such SI systems decreases below a certain value, their horizontal stiffness increases. Thus, should a structure (e.g. a civil defence centre, a masterpiece, etc.) protected in the aforesaid way be designed to withstand an unnecessarily large earthquake, the behaviour of its SI system will be inadequate (i.e. it will be too stiff) during the much more frequent events that may actually strike the structure during its life. Furthermore, since SI can be used only when the room available to the structure

  7. MASSACHUSETTS DEP EELGRASS VERIFIED POINTS

    EPA Science Inventory

    Field verified points showing presence or absence of submerged rooted vascular plants along Massachusetts coastline. In addition to the photo interpreted eelgrass coverage (EELGRASS), this point coverage (EGRASVPT) was generated based on field-verified sites as well as all field...

  8. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    SciTech Connect

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-08

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey stiffness of the damped braces to be proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in an associated paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed for a medium-risk region, is assumed to be retrofitted as for a high-risk region by the insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using the step-by-step procedures described in the associated paper mentioned above; the behaviour of frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for the dynamic analyses.
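    The proportionality assumption stated above (brace stiffness proportional to the unbraced frame's storey stiffness) can be expressed in a few lines. This is a sketch under assumed, illustrative numbers: the helper name, the proportionality ratio, and the storey stiffness values are hypothetical, not taken from the paper.

    ```python
    def damped_brace_stiffness(k_frame_storeys, alpha):
        """Distribute the elastic lateral stiffness of the damped braces
        in proportion to the unbraced frame's storey stiffnesses.
        alpha is a designer-chosen proportionality ratio (assumed)."""
        return [alpha * k for k in k_frame_storeys]

    # Illustrative storey stiffnesses (kN/m) for a six-storey frame.
    k_frame = [120e3, 110e3, 95e3, 80e3, 60e3, 40e3]
    k_brace = damped_brace_stiffness(k_frame, 0.8)
    ```

    Keeping a single ratio across all storeys preserves the frame's lateral stiffness distribution, which is what makes the retrofitted response predictable within the displacement-based procedure.
    
    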

  9. Simulation of complete seismic surveys for evaluation of experiment design and processing

    SciTech Connect

    Oezdenvar, T.; McMechan, G.A.; Chaney, P.

    1996-03-01

    Synthesis of complete seismic survey data sets allows analysis and optimization of all stages in an acquisition/processing sequence. The characteristics of available survey designs, parameter choices, and processing algorithms may be evaluated prior to field acquisition to produce a composite system in which all stages have compatible performance; this maximizes the cost effectiveness for a given level of accuracy, or for targets with specific characteristics. Data sets synthesized for three salt structures provide representative comparisons of time and depth migration, post-stack and prestack processing, and illustrate effects of varying recording aperture and shot spacing, iterative focusing analysis, and the interaction of migration algorithms with recording aperture. A final example demonstrates successful simulation of both 2-D acquisition and processing of a real data line over a salt pod in the Gulf of Mexico.

  10. Some issues in the seismic design of nuclear power-plant facilities

    SciTech Connect

    Hadjian, A.H.; Iwan, W.D.

    1980-09-01

    This paper summarizes the major issues discussed by an international panel of experts during the post-SMIRT (Structural Mechanics in Reactor Technology) Seminar on Extreme Load Design of Nuclear Power-Plant Facilities, which was held in Berlin, Aug. 20-21, 1979. The emphasis of the deliberations was on the state of the art of seismic-response calculations to predict the expected performance of structures and equipment during earthquakes. Four separate panels discussed issues on (1) soil-structure interaction and structural response, (2) modeling, materials, and boundary conditions, (3) damping in structures and equipment, and (4) fragility levels of equipment. The international character of the seminar was particularly helpful in the cross-pollination of ideas regarding the issues and the steps required to enhance the cause of safety of nuclear plants.

  11. CHARACTERIZING THE YUCCA MOUNTAIN SITE FOR DEVELOPING SEISMIC DESIGN GROUND MOTIONS

    SciTech Connect

    S. Upadhyaya, I. Wong, R. Kulkarni, K. Stokoe, M. Dober, W. Silva, and R. Quittmeyer

    2006-02-24

    Yucca Mountain, Nevada is the designated site for the first long-term geologic repository to safely dispose of spent nuclear fuel and high-level nuclear waste in the U.S. Yucca Mountain consists of stacked layers of welded and non-welded volcanic tuffs. Site characterization studies are being performed to assess its future performance as a permanent geologic repository. These studies include the characterization of the shear-wave velocity (Vs) structure of the repository block and the surface facilities area. The Vs data are an input to the calculations of ground motions for the preclosure seismic design and for the postclosure performance assessment, and therefore their accurate estimation is needed. Three techniques have been employed at the site to date: 24 downhole surveys, 15 suspension seismic logging surveys, and 95 spectral-analysis-of-surface-waves (SASW) surveys. The three data sets were compared with one another, with Vs profiles developed from vertical seismic profiling data collected by the Lawrence Berkeley National Laboratory, and with Vs profiles developed independently by the University of Nevada, Reno using the refraction microtremor technique. Based on these data, base-case Vs profiles have been developed and used in site response analyses. Since the question of adequate sampling arises in site characterization programs, and a correlation between geology and Vs would help address this issue, a possible correlation was evaluated. To assess the influence of different factors on velocity, statistical analyses of the Vs data were performed using multi-factor Analysis of Variance (ANOVA). The results of this analysis suggest that the effect of each of three factors, depth, lithologic unit, and spatial location, on velocity is statistically significant. Furthermore, velocity variation with depth differs at different spatial locations: preliminary results show that the lithologic unit alone explains about 54% and 42% of

  12. On the Computation of H/V and its Application to Microzonation and Seismic Design

    NASA Astrophysics Data System (ADS)

    Perton, M.; Martínez, J. A.; Lermo, J. F.; Sanchez-Sesma, F. J.

    2014-12-01

    The H/V ratio is the square root of the ratio of horizontal to vertical energies of ground motion. It has been observed that the frequency of its main peak is well suited to the characterization of site effects, and it has been widely used for microzonation and seismic structural design. Historically, the ratio was computed as the average of individual H/V ratios obtained from noise autocorrelations. However, it has recently been pointed out that the H/V ratio should instead be calculated as the ratio of the average of H over the average of V. This calculation is based on the relation between the directional energies (the imaginary part of the Green's function) and the noise autocorrelations. In general, the average of ratios differs from the ratio of averages. Although the frequency of the main response was correctly obtained, the associated amplification factor has generally been badly predicted, matching poorly the amplification observed during strong earthquakes. The unexpected decay of such ratios at high frequency and the lack of stability and reproducibility of the H/V ratios are other problems faced by the method. These problems are addressed here from the point of view of the normalization of noise correlations. In fact, several normalization techniques have already been proposed to correctly retrieve the Green's function. Some are well suited for retrieving the surface-wave contribution, while others are more appropriate for bulk-wave incidence. Since the H/V ratio may be used for various purposes, such as surface-wave tomography, microzonation, or seismic design, different normalizations are discussed as a function of the objectives. The H/V obtained from local historical earthquakes on top of, or far away from, the subduction zone is also discussed. ACKNOWLEDGEMENT This research has been partially supported by DGAPA-UNAM under Project IN104712 and the AXA Research Fund.
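    The abstract's central distinction, that the average of ratios generally differs from the ratio of averages, can be checked numerically. The per-window spectral energies below are purely illustrative numbers, not real microtremor data.

    ```python
    import math

    # Synthetic horizontal (H) and vertical (V) spectral energies for
    # three noise windows at one frequency (illustrative values only).
    H = [4.0, 9.0, 16.0]
    V = [1.0, 1.0, 4.0]

    # Historical approach: average the individual per-window H/V ratios.
    avg_of_ratios = sum(math.sqrt(h / v) for h, v in zip(H, V)) / len(H)

    # Approach advocated in the abstract: square root of the ratio of
    # the averaged energies.
    ratio_of_avgs = math.sqrt((sum(H) / len(H)) / (sum(V) / len(V)))
    ```

    With these values the two estimators disagree (roughly 2.33 versus 2.20), which is why the choice of estimator changes the predicted amplification factor even when the peak frequency agrees.
    
    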

  13. Pushover Analysis Methodologies: A Tool For Limited Damage Based Design Of Structure For Seismic Vibration

    NASA Astrophysics Data System (ADS)

    Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita

    Vibration transmitted to a structure during an earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerate steps so that structures are able to survive even severe ground motion. However, for economic reasons, strength can be provided so that the structure remains in the elastic range in low-to-moderate earthquakes and is allowed to undergo inelastic deformation in a severe earthquake without collapse. To implement this design philosophy, a rigorous nonlinear dynamic analysis must be performed to estimate the inelastic demands. Such analysis, however, is time consuming and requires expertise to judge the results. In this context, the present paper discusses and demonstrates an alternative simple method, known as the pushover method, which can easily be used by practicing engineers, bypassing intricate nonlinear dynamic analysis, and can be thought of as a substitute for the latter. This method is still under development and is increasingly becoming popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strengths, and ease of this state-of-the-art methodology for regular use in design offices in performance-based seismic design of structures.

  14. Effects of charge design features on parameters of acoustic and seismic waves and cratering, for SMR chemical surface explosions

    NASA Astrophysics Data System (ADS)

    Gitterman, Y.

    2012-04-01

    A series of experimental on-surface shots was designed and conducted by the Geophysical Institute of Israel at the Sayarim Military Range (SMR) in the Negev desert, including two large calibration explosions: about 82 tons of strong IMI explosives in August 2009, and about 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, the CTBTO, the USA, and several European countries, with the main goal of providing fully controlled ground-truth (GT0) infrasound sources in different weather/wind conditions, for the calibration of IMS infrasound stations in Europe, the Middle East, and Asia. Strong boosters and an upward charge-detonation scheme were applied to reduce the energy release to the ground and enlarge the energy radiated to the atmosphere, producing enhanced infrasound signals for better observation at far-regional stations. The following observations and results indicate the explosive energy partition achieved by this charge design: 1) crater size and local seismic (duration) magnitudes were found to be smaller than expected for these large surface explosions; 2) small test shots of the same charge (1 ton) conducted at SMR with different detonation directions clearly showed lower seismic amplitudes/energy and smaller crater size for the upward detonation; 3) many infrasound stations at local and regional distances showed higher-than-expected peak amplitudes, even after application of a wind-correction procedure. For the large-scale explosions, high-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimation of the charge yield. Empirical relations for the air-blast parameters, peak pressure, impulse, and the Secondary Shock (SS) time delay, as functions of distance, were developed and analyzed. 
The parameters, scaled by the cube root of the estimated TNT-equivalent charges, were found consistent for all analyzed explosions, except for the SS
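    The cube-root scaling used above is the standard Hopkinson-Cranz similarity rule: air-blast parameters from shots of different yield are compared at equal scaled distance Z = R / W^(1/3). The following minimal sketch (a generic illustration with assumed ranges and yields, not the paper's data) shows two shots of very different yield observed at the same scaled distance.

    ```python
    def scaled_distance(r_m, charge_kg_tnt):
        """Hopkinson-Cranz cube-root scaling: Z = R / W**(1/3),
        in m/kg^(1/3), where R is range and W the TNT-equivalent yield."""
        return r_m / charge_kg_tnt ** (1.0 / 3.0)

    # A 1 t shot at 100 m and a 100 t shot at ~464 m share Z ≈ 10,
    # so their peak pressures should be comparable under the scaling law.
    z1 = scaled_distance(100.0, 1000.0)
    z2 = scaled_distance(464.159, 100000.0)
    ```

    Fitting peak pressure or impulse against Z, rather than raw distance, is what lets a single empirical relation cover explosions of different charge sizes, as the abstract reports.
    
    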

  15. A New Seismic Broadband Sensor Designed for Easy and Rapid Deployment

    NASA Astrophysics Data System (ADS)

    Guralp, Cansun; Pearcey, Chris; Nicholson, Bruce; Pearce, Nathan

    2014-05-01

    Properly deploying digital seismic broadband sensors in the field can be time consuming and logistically challenging. On active volcanoes, the time needed to install such instruments has to be particularly short in order to minimize the risk to the deployment personnel. In addition, once a seismometer is installed, it is not always feasible to pay regular visits to the deployment site to correct for possible movements of the seismometer due to settling, sliding, or other external events. To address these issues, we have designed a new type of versatile and very robust three-component feedback sensor which can be easily installed and is capable of self-correcting changes in its tilt and of measuring orientation changes during deployment. The instrument can be installed by direct burial in soil, in a borehole, or in glacial ice, and can even be used under water as an ocean-bottom seismometer (OBS). Its components are fitted one above the other in a cylindrical stainless-steel casing with a diameter of 51 mm. Each seismic sensor has a flat response to velocity from 30 s to 100 Hz and a tilt tolerance of up to 20 degrees. A tilt sensor and a two-axis magnetometer inside the casing capture changes in tilt and horizontal orientation during the course of the deployment. Their output can be fed into internal motors which in turn adjust the actual orientation of each sensor in the casing. First production models of this instrument have been deployed as OBSs in an active submarine volcanic area along the Juan de Fuca Ridge in the NE Pacific. We are currently finishing units to be deployed for volcano monitoring in Icelandic glaciers. The instrument will be offered as an analogue version or with a 24-bit digitizer fitted into the same casing. A pointed tip can be added to the casing to ease direct burial.

  16. Experimentally verified, theoretical design of dual-tuned, low-pass birdcage radiofrequency resonators for magnetic resonance imaging and magnetic resonance spectroscopy of human brain at 3.0 Tesla.

    PubMed

    Shen, G X; Wu, J F; Boada, F E; Thulborn, K R

    1999-02-01

    A new theoretical method is presented for designing frequency responses of double-tuned, low-pass birdcage coils. This method is based on Kirchhoff's equations through a nonsymmetric matrix algorithm and extended through a modification of the corresponding eigenvalue system from a single-tuned mode. Designs from this method are verified for sodium/proton, dual-tuned, double-quadrature, low-pass birdcage coils at 1.5 Tesla and 3.0 Tesla and then are used to design dual-tuned, double-quadrature, lithium/proton and phosphorus/proton birdcage coils for 3.0 Tesla. All frequencies show experimental deviations of less than 3% from theory under unloaded conditions. The frequency shifts caused by loading and radiofrequency shielding are less than 1 MHz and can be compensated readily by adjustment of variable capacitors. Applications to human neuroimaging and spectroscopy are demonstrated. PMID:10080273

  17. Basis of Design and Seismic Action for Long Suspension Bridges: the case of the Messina Strait Bridge

    SciTech Connect

    Bontempi, Franco

    2008-07-08

    The basis of design for complex structures like suspension bridges is reviewed. Specific attention is devoted to the seismic action, to the required performance, and to the connected structural analysis. Uncertainty is specifically addressed by probabilistic and soft-computing techniques. The paper makes detailed reference to the work and the experience developed during recent years for the re-design of the Messina Strait Bridge.

  18. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection, and the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem by an approach involving modeling and analysis of the frequency synchronizer.

  19. Verifiable and Redactable Medical Documents

    PubMed Central

    Brown, Jordan; Blough, Douglas M.

    2012-01-01

    This paper considers how to verify provenance and integrity of data in medical documents that are exchanged in a distributed system of health IT services. Provenance refers to the sources of health information within the document and integrity means that the information was not modified after generation by the source. Our approach allows intermediate parties to redact the document by removing information that they do not wish to reveal. For example, patients can store verifiable health information and provide subsets of it to third parties, while redacting sensitive information that they do not wish employers, insurers, or others to receive. Our method uses a cryptographic primitive known as a redactable signature. We study practical issues and performance impacts of building, redacting, and verifying Continuity of Care Documents (CCDs) that are protected with redactable signatures. Results show that manipulating redactable CCDs provides superior security and privacy with little computational overhead. PMID:23304391
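    The redaction idea described above can be illustrated with a toy model: commit to each field with a salted hash, then sign the ordered list of commitments, so that any field can later be hidden while the remaining fields and the signature still verify. This sketch is not the paper's scheme; real redactable signatures use an actual digital signature over the commitments, whereas here a plain hash stands in for the signing step, and all field names are hypothetical.

    ```python
    import hashlib
    import os

    def commit(field, salt):
        """Salted hash commitment to one document field."""
        return hashlib.sha256(salt + field.encode()).hexdigest()

    def sign_document(fields):
        """Commit to each field with a fresh salt, then 'sign' the
        ordered commitments (a real scheme would sign them with a
        private key; a hash stands in here)."""
        salts = [os.urandom(16) for _ in fields]
        commits = [commit(f, s) for f, s in zip(fields, salts)]
        signature = hashlib.sha256("".join(commits).encode()).hexdigest()
        return salts, commits, signature

    def redact(fields, salts, index):
        """Hide one field: the verifier keeps its commitment but never
        sees the field value or its salt."""
        fields = fields[:index] + [None] + fields[index + 1:]
        salts = salts[:index] + [None] + salts[index + 1:]
        return fields, salts

    def verify(fields, salts, commits, signature):
        """Check every revealed field against its commitment, then check
        the signature over the full commitment list."""
        for f, s, c in zip(fields, salts, commits):
            if f is not None and commit(f, s) != c:
                return False  # a revealed field was modified
        return hashlib.sha256("".join(commits).encode()).hexdigest() == signature

    fields = ["name: Alice", "allergy: penicillin", "diagnosis: flu"]
    salts, commits, sig = sign_document(fields)
    red_fields, red_salts = redact(fields, salts, 1)  # hide the allergy
    ```

    The redacted document still verifies because the commitments, not the raw fields, are what the signature covers; tampering with any revealed field breaks verification.
    
    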

  20. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... seismic safety level in the 2000 edition of the NEHRP Recommended Provisions for the Development of... Officials, 4051 West Flossmoor Rd., Country Club Hill, IL 60478. Telephone: (800) 786-4452. Fax: (800) 214-7167. (c) The NEHRP Recommended Provisions for the Development of Seismic Regulations for New...

  1. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  2. Quantum proxy signature scheme with public verifiability

    NASA Astrophysics Data System (ADS)

    Zhou, Jingxian; Zhou, Yajian; Niu, Xinxin; Yang, Yixian

    2011-10-01

    In recent years, with the development of quantum cryptography, quantum signatures have also made great progress. However, the effectiveness of all the quantum signature schemes reported in the literature can only be verified by a designated person; therefore, their wide application is limited. To solve this problem, a new quantum proxy signature scheme is presented, using EPR entangled states and unitary transformations to generate the proxy signature. The proxy signer announces his public key when he generates the final signature. By the properties of unitary transformations and a quantum one-way function, anyone can verify whether the signature is valid using the public key, so the quantum proxy signature scheme in this paper is publicly verifiable. Quantum key distribution and the one-time pad encryption algorithm guarantee the unconditional security of the scheme. Analysis results show that this new scheme satisfies strong non-counterfeitability and strong non-disavowal.

  3. Verifying the Hanging Chain Model

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2013-01-01

    The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…

  4. Updated Optimal Designs of Time-Lapse Seismic Surveys for Monitoring CO2 Leakage through Fault Zones

    NASA Astrophysics Data System (ADS)

    Liu, J.; Shang, X.; Sun, Y.; Chen, P.

    2012-12-01

    Cost-effective time-lapse seismic surveys are crucial for the long-term monitoring of geologic carbon sequestration. Similar to Shang and Huang (2012), in this study we have numerically modeled time-lapse seismic surveys for monitoring CO2 leakage through fault zones, and designed updated optimal surveys for time-lapse seismic data acquisition using elastic-wave sensitivity analysis. When CO2 was confined to a relatively deep region, our results show that the most desirable location for receivers at the surface is on the hanging-wall side of the two fault zones (high-angle normal faults and reverse faults). The places at the surface most sensitive to changes in the different P- and S-wave velocities and in density are similar to one another, but are often not sensitive to the source location. When CO2 migrates close to the surface, our modeling suggests that the best region at the surface for time-lapse seismic surveys is very sensitive to the source location and to the elastic parameter to be monitored.

  5. Detector verifier for circuit analyzers

    NASA Technical Reports Server (NTRS)

    Pope, D. L.; Wooters, R. L.

    1980-01-01

    An economical tool checks the operation of an automatic circuit analyzer. Each loop is addressed directly from the analyzer console by switching the internal analyzer bridge to a resistance equal to that of the connecting cable plus the specified limiting test value. The procedure verifies whether faults detected in the circuit under test are actually due to analyzer malfunction. Standard-length universal test cables make it possible to shift the detector tool from cable to cable without resistance compensation.

  6. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  7. Structural design of active seismic isolation floor with a charging function

    NASA Astrophysics Data System (ADS)

    Nakakoji, Hayato; Miura, Nanako

    2016-04-01

    This study determines an optimum structure for a seismic isolation floor subjected to horizontal ground motions. Although a seismic isolation floor is effective at reducing vibration, its response grows when it is excited by long-period ground motions, and an experiment shows that castered equipment on a seismically isolated structure moves and suffers damage. Moreover, the permissible displacement of the floor is limited. The focus is therefore on active seismic isolation, but an active system cannot operate without a power supply. To address these problems, our previous study considered energy regeneration; however, it analyzed only a simple model and did not select a structure suited to active control and energy regeneration. This research proposes, and evaluates by numerical simulation, a new structure whose regenerated energy exceeds the energy required for active control.

  8. Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces

    NASA Astrophysics Data System (ADS)

    Garg, Akshay; Singh, Amit

    2012-07-01

    Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of this proposed mechanism is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units: a Mother Rover acting as a surrogate (carrier) and two Baby Rovers acting as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each having an individual solar-cell power supply and individual data-transmission capability, to suitable sites such as the chasmata associated with Valles Marineris, craters, or sand dunes. The Mother Rover deploys the Baby Rovers in two opposite directions, and the rovers follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Throughout the active experiments, the Mother Rover acts as a guiding unit that controls the spatial spread of the detection instruments. After active shock experimentation, the Baby Rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering, and landslides. Further experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized to carry out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of measurements (through concurrent readings) and prolong the life cycle of the overall experiment. 
The proposed rover can be customised according to the associated scientific objectives and further

  9. Verifying Correct Functionality of Avionics Subsystems

    NASA Technical Reports Server (NTRS)

    Meuer, Ben T.

    2005-01-01

    This project focuses on the testing of the telecommunications interface subsystem of the Multi-Mission System Architecture Platform to ensure proper functionality. The Multi-Mission System Architecture Platform is a set of basic tools designed to be used in future spacecraft. The responsibilities of the telecommunications interface include communication between the spacecraft and ground teams as well as acting as the bus controller for the system. The tests completed include bitwise read/write tests of each register, testing of status bits, and verification of various bus controller activities. Testing is accomplished through the use of software-based simulations run on an electronic design of the system. The tests are written in the Verilog hardware description language, and they simulate specific states and conditions in the telecommunications interface. Upon successful completion, the output is examined to verify that the system responded appropriately.

  10. Geological investigation for CO2 storage: from seismic and well data to storage design

    NASA Astrophysics Data System (ADS)

    Chapuis, Flavie; Bauer, Hugues; Grataloup, Sandrine; Leynet, Aurélien; Bourgine, Bernard; Castagnac, Claire; Fillacier, Simon; Lecomte, Antony; Le Gallo, Yann; Bonijoly, Didier

    2010-05-01

    Geological investigation for CO2 storage: from seismic and well data to storage design Chapuis F.1, Bauer H.1, Grataloup S.1, Leynet A.1, Bourgine B.1, Castagnac C.1, Fillacier, S.2, Lecomte A.2, Le Gallo Y.2, Bonijoly D.1. 1 BRGM, 3 av Claude Guillemin, 45060 Orléans Cedex, France, f.chapuis@brgm.fr, d.bonijoly@brgm.fr 2 Geogreen, 7, rue E. et A. Peugeot, 92563 Rueil-Malmaison Cedex, France, ylg@greogreen.fr The main purpose of this study is to evaluate the techno-economic potential of storing 200 000 tCO2 per year produced by a sugar beet distillery. To reach this goal, an accurate hydrogeological characterisation of the CO2 injection site is of primary importance, because it strongly influences site selection, storage design and risk management. Geological investigation for CO2 storage is usually set in the centre or deepest part of a sedimentary basin. However, CO2 producers do not always match the geological settings, so other geological configurations have to be studied. This is the aim of this project, which is located near the south-west border of the Paris Basin, in the Orléans region. Special geometries such as onlaps and pinch-outs of formations against the basement are likely to be observed and so have to be taken into account. Two deep saline aquifers are potentially good candidates for CO2 storage: the Triassic continental deposits capped by the Upper Triassic/Lower Jurassic continental shales, and the Dogger carbonate deposits capped by the Callovian and Oxfordian shales. First, a data review was undertaken to establish the palaeogeographical settings and the likely facies, thicknesses and depths of the targeted formations. It was followed by a seismic interpretation: three hundred kilometres of seismic lines were reprocessed and interpreted to characterise the geometry of the studied area. The main structure identified is the Étampes fault, which affects all the formations. Apart from the vicinity of the fault where drag

  12. Utilization of a finite element model to verify spent nuclear fuel storage rack welds

    SciTech Connect

    Nitzel, M.E.

    1998-07-01

    Elastic and plastic finite element analyses were performed for the inner tie block assembly of a 25-port fuel rack designed for installation at the Idaho National Engineering and Environmental Laboratory (INEEL) Idaho Chemical Processing Plant (ICPP). The model was specifically developed to verify the adequacy of certain welds joining components of the fuel storage rack assembly. The work scope for this task was limited to an investigation of the stress levels in the inner tie welds when the rack was subjected to seismic loads. Structural acceptance criteria for the elastic calculations were as defined by the rack's designer. Structural acceptance criteria for the plastic calculations performed as part of this effort were as defined in Subsection NF and Appendix F of Section III of the ASME Boiler and Pressure Vessel Code. The results confirm that the welds joining the inner tie block to the surrounding rack structure meet the acceptance criteria. The analysis results verified that the inner tie block welds should be capable of transferring the expected seismic load without structural failure.

  13. Seismic design and evaluation guidelines for the Department of Energy High-Level Waste Storage Tanks and Appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1995-10-01

    This document provides seismic design and evaluation guidelines for underground high-level waste storage tanks. The guidelines reflect the knowledge acquired in the last two decades in defining seismic ground motion and calculating hydrodynamic loads, dynamic soil pressures and other loads for underground tank structures, piping and equipment. The application of the guidelines is illustrated with examples. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable for other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document. The original version of this document was published in January 1993. Since then, additional studies have been performed in several areas and the results are included in this revision. Comments received from the users are also addressed. Fundamental concepts supporting the basic seismic criteria contained in the original version have since then been incorporated and published in DOE-STD-1020-94 and its technical basis documents. This information has been deleted in the current revision.

  14. Seismic design of steel structures with lead-extrusion dampers as knee braces

    SciTech Connect

    Monir, Habib Saeed; Naser, Ali

    2008-07-08

    One effective method of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and their behavior is independent of velocity at frequencies within the seismic range. In this paper, lead dampers are used as knee braces in steel frames and are studied from an economic viewpoint. Because lead dampers do not obstruct structural panels, they avoid the architectural problems of conventional braces. Their behavior is compared with that of other damper types such as XADAS and TADAS. The results indicate that lead dampers absorb earthquake-induced energy effectively and perform well in controlling the seismic movements of multi-story structures.

  15. Simplified design method and seismic performance of space trusses with consideration of the influence of the stiffness of their lower supporting columns

    NASA Astrophysics Data System (ADS)

    Fan, Feng; Sun, Menghan; Zhi, Xudong

    2016-06-01

    The static and dynamic performance of two types of space truss structures, the square pyramid space truss (SPST) and the diagonal-on-square pyramid space truss (DSPST), is studied to determine the effect of the stiffness of their lower supporting members. A simplified model for the supporting columns and an equivalent spring-mass system are presented, and the feasibility of the simplified model is demonstrated through theoretical analysis and through comparative examples against the full model. Elastic analyses under frequently occurring earthquakes and elasto-plastic analyses under seldom occurring earthquakes, using the TAFT and El Centro seismic records, show that the results of the simplified method fall within those of the full model; the two methods agree well, and the simplified method greatly improves computational efficiency. The study verifies that the dynamic effect of the supporting structures was insufficiently considered in past space truss design. The method proposed in the paper is also significant for other space truss structures.

  16. Proceedings of seismic engineering 1991

    SciTech Connect

    Ware, A.G.

    1991-01-01

    This book contains proceedings of the Seismic Engineering Technical Subcommittee of the ASME Pressure Vessels and Piping Division. Topics covered include: seismic damping and energy absorption, advanced seismic analysis methods, new analysis techniques and applications of advanced methods, seismic supports and test results, margins inherent in the current design methods, risk assessment, and component and equipment qualification.

  17. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites. 
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at

  18. Probabilistic Seismic Hazard Characterization and Design Parameters for the Sites of the Nuclear Power Plants of Ukraine

    SciTech Connect

    Savy, J.B.; Foxall, W.

    2000-01-26

    The U.S. Department of Energy (US DOE), under the auspices of the International Nuclear Safety Program (INSP), is supporting in-depth safety assessments (ISA) of nuclear power plants in Eastern Europe and the former Soviet Union for the purpose of evaluating the safety of, and upgrades necessary to, the stock of nuclear power plants in Ukraine. For this purpose the Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) has been asked to assess the seismic hazard and design parameters at the sites of the nuclear power plants in Ukraine. The probabilistic seismic hazard (PSH) estimates were updated using the latest available data and knowledge from LLNL, the U.S. Geological Survey, and other relevant recent studies from several consulting companies. Special attention was given to accounting for the local seismicity, the deep-focus earthquakes of the Vrancea zone in Romania, the region around Crimea, and the system of potentially active faults associated with the Pripyat-Dnieper-Donets rift. Aleatory (random) uncertainty was estimated from the available data, and the epistemic (knowledge) uncertainty was estimated by considering the existing models in the literature and the interpretations of a small group of experts elicited during a workshop conducted in Kiev, Ukraine, on February 2-4, 1999.

  19. Southern California Seismic Network: New Design and Implementation of Redundant and Reliable Real-time Data Acquisition Systems

    NASA Astrophysics Data System (ADS)

    Saleh, T.; Rico, H.; Solanki, K.; Hauksson, E.; Friberg, P.

    2005-12-01

    The Southern California Seismic Network (SCSN) handles more than 2500 high-data-rate channels from more than 380 seismic stations distributed across southern California. These data are imported in real time from dataloggers, earthworm hubs, and partner networks. The SCSN also exports data to eight different partner networks. Both the imported and exported data are critical for emergency response and scientific research. Previous data acquisition systems were complex and difficult to operate, because they grew in an ad hoc fashion to meet the increasing needs for distributing real-time waveform data. To maximize reliability and redundancy, we apply best-practice methods from computer science when implementing the software and hardware configurations for import, export, and acquisition of real-time seismic data. Our approach makes use of failover software designs, methods for dividing labor diligently among the network nodes, and state-of-the-art network redundancy technologies. To facilitate maintenance and daily operations we provide some separation between major functions such as data import, export, acquisition, archiving, real-time processing, and alarming. As an example, we make waveform import and export functions independent by operating them on separate servers. Similarly, two independent servers provide waveform export, allowing data recipients to implement their own redundancy. Data import is handled differently, using one primary server and a live backup server. These data import servers run failover software that allows automatic role switching from primary to shadow in case of failure. Similar to the classic earthworm design, all the acquired waveform data are broadcast onto a private network, which allows multiple machines to acquire and process the data. As we separate data import and export from acquisition, we are also working on new approaches to separate real-time processing from rapid, reliable archiving of real-time data

  20. A successful 3D seismic survey in the "no-data zone," offshore Mississippi delta: Survey design and refraction static correction processing

    SciTech Connect

    Carvill, C.; Faris, N.; Chambers, R.

    1996-12-31

    This is a success story of survey design and refraction static correction processing for a large 3D seismic survey in the South Pass area of the Mississippi delta. In this transition zone, subaqueous mudflow gullies and lobes of the delta, in various states of consolidation and gas saturation, are strong absorbers of seismic energy. Seismic waves penetrating the mud are severely restricted in bandwidth and variously delayed by changes in mud velocity and thickness. Using a delay-time refraction static correction method, the authors find that the compensation for the various delays, i.e., the static corrections, commonly varies by 150 ms over a short distance. Application of the static corrections markedly improves the seismic stack volume. This paper shows that intelligent survey design and delay-time refraction static correction processing economically eliminated the historic "no-data" status of this area.
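A delay-time static of the kind described is, at its simplest, the extra travel time accumulated crossing the low-velocity mud relative to a faster replacement velocity. The sketch below uses illustrative values, not parameters from the survey:

```python
def mud_static_ms(thickness_m: float, v_mud_mps: float, v_repl_mps: float) -> float:
    """Static correction in milliseconds: delay through a mud layer of the
    given thickness, relative to a faster replacement velocity."""
    return 1000.0 * (thickness_m / v_mud_mps - thickness_m / v_repl_mps)

# Gas-charged mud is slow, so even a modest layer produces large statics,
# on the order of the 150 ms variations reported above:
print(round(mud_static_ms(120.0, 500.0, 1600.0), 1))  # 165.0
```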

  1. Rapid estimation of earthquake loss based on instrumental seismic intensity: design and realization

    NASA Astrophysics Data System (ADS)

    Huang, Hongsheng; Chen, Lin; Zhu, Gengqing; Wang, Lin; Lin, Yanzhao; Wang, Huishan

    2013-11-01

    As a result of our ability to acquire large volumes of real-time earthquake observation data, coupled with increased computer performance, near-real-time instrumental seismic intensity can be obtained from ground motion data observed by instruments, using appropriate spatial interpolation methods. By combining vulnerability results from earthquake disaster research with earthquake disaster assessment models, we can estimate the losses caused by devastating earthquakes, in an attempt to provide more reliable information for earthquake emergency response and decision support. This paper analyzes the latest progress on methods of rapid earthquake loss estimation in China and abroad. A new method for estimating earthquake loss based on rapid reporting of instrumental seismic intensity is proposed, and the relevant software is developed. Finally, a case study using the ML 4.9 earthquake that occurred in Shun-chang county, Fujian Province on March 13, 2007 is given as an example of the proposed method.
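The paper does not specify which spatial interpolation method it uses; one common choice for gridding station intensities is inverse-distance weighting, sketched below with made-up station values:

```python
import math

def idw_intensity(stations, x, y, power=2.0):
    """Inverse-distance-weighted interpolation of instrumental intensity at
    (x, y) from (xi, yi, intensity) station triples. Illustrative stand-in
    for an unspecified interpolation scheme."""
    num = den = 0.0
    for xi, yi, ii in stations:
        d = math.hypot(x - xi, y - yi)
        if d < 1e-9:              # exactly on a station: return its value
            return ii
        w = 1.0 / d ** power
        num += w * ii
        den += w
    return num / den

stations = [(0, 0, 6.5), (10, 0, 5.0), (0, 10, 5.5)]  # hypothetical readings
print(round(idw_intensity(stations, 2.0, 2.0), 2))    # 6.26
```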

  2. Verifying a Computer Algorithm Mathematically.

    ERIC Educational Resources Information Center

    Olson, Alton T.

    1986-01-01

    Presents an example of mathematics from an algorithmic point of view, with emphasis on the design and verification of this algorithm. The program involves finding roots for algebraic equations using the half-interval search algorithm. The program listing is included. (JN)
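The half-interval search the article discusses is the classic bisection method; the sketch below is a reconstruction of that algorithm, not the article's own program listing:

```python
def half_interval_root(f, lo, hi, tol=1e-10):
    """Half-interval (bisection) search: repeatedly halve an interval whose
    endpoints bracket a sign change until it is narrower than tol."""
    f_lo = f(lo)
    if f_lo * f(hi) > 0:
        raise ValueError("f(lo) and f(hi) must differ in sign")
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f_lo * f(mid) <= 0:    # sign change in [lo, mid]
            hi = mid
        else:                     # sign change in [mid, hi]
            lo, f_lo = mid, f(mid)
    return (lo + hi) / 2.0

# Root of x^3 - 2x - 5 = 0, a classic test equation:
root = half_interval_root(lambda x: x**3 - 2*x - 5, 2.0, 3.0)
print(round(root, 6))  # 2.094551
```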

  3. Verify MesoNAM Performance

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors at specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) Bias (mean difference), 2) Standard deviation of Bias, 3) Root Mean Square Error (RMSE), and 4) Hypothesis test for Bias = 0. The 45 WS LWOs use the MesoNAM to support launch weather operations. However, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature and dew point to the observed values from the sensors on wind towers. The data were stratified by tower sensor, month and onshore/offshore wind direction based on the orientation of the coastline to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record for the data used in this task was based on the operational start of the current MesoNAM in mid-August 2006, so the task began with the first full month of data, September 2006, and ran through May 2010. The analysis of model performance indicated: a) accuracy decreased as the forecast valid time from model initialization increased, b) there was a diurnal signal in temperature, with a cool bias during the late night and a warm bias during the afternoon, c) there was a diurnal signal in dew point, with a low bias during the afternoon and a high bias during the late night, and d) the model parameters at each vertical level most closely matched the observed parameters at the heights closest to those vertical levels. The AMU developed a GUI consisting of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWOs to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics. 
The objective statistics give the LWOs knowledge of the model's strengths and
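The first three verification statistics listed above (bias, standard deviation of the bias, and RMSE) can be computed directly from paired forecasts and observations; the values below are made-up placeholders, not tower data:

```python
import math

def verification_stats(forecast, observed):
    """Bias (mean forecast-minus-observed difference), sample standard
    deviation of the differences, and RMSE."""
    diffs = [f - o for f, o in zip(forecast, observed)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, sd, rmse

fcst = [10.2, 11.0, 9.5, 12.1]   # hypothetical forecast values
obs = [10.0, 10.5, 10.0, 11.5]   # hypothetical observed values
bias, sd, rmse = verification_stats(fcst, obs)
print(round(bias, 3), round(rmse, 3))  # 0.2 0.474
```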

  4. A study on the seismic fortification level of offshore platform in Bohai Sea of China

    NASA Astrophysics Data System (ADS)

    Lu, Y.

    2010-12-01

    seismic design code for buildings, it is proposed that the probability levels for the strength-level and ductility-level earthquakes take return periods of 200 years and 1000-2500 years, respectively. By comparison with the codes developed by the relevant industry institutions, the rationality and safety of the seismic fortification objectives for OPs are verified. Finally, the seismic parameters in the sub-regions of the Bohai Sea are calculated based on seismic risk zoning and ground motion intensity maps.
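The return periods quoted above convert to exceedance probabilities over a platform's service life via the standard Poisson (memoryless) assumption; the 50-year design life below is an assumption for illustration, not a value from the study:

```python
import math

def exceedance_probability(return_period_years: float, exposure_years: float) -> float:
    """Probability of at least one exceedance during the exposure time,
    assuming Poisson occurrence (the usual basis of statements like
    '10% in 50 years')."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# A 200-year strength-level event over an assumed 50-year platform life:
print(round(exceedance_probability(200.0, 50.0), 3))   # 0.221
# A 2500-year ductility-level event over the same life:
print(round(exceedance_probability(2500.0, 50.0), 3))  # 0.02
```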

  5. A Seismic Isolation Application Using Rubber Bearings; Hangar Project in Turkey

    SciTech Connect

    Sesigur, Haluk; Cili, Feridun

    2008-07-08

    Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport in Istanbul, Turkey. A seismic isolation system with the isolation layer arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, and isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. In order to verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a base shear approximately one quarter that of the non-isolated structure.

  6. Seismic design spectra 200 West and East Areas DOE Hanford Site, Washington

    SciTech Connect

    Tallman, A.M.

    1995-12-31

    This document presents equal-hazard response spectra for the W236A project for the 200 East and West new high-level waste tanks. The hazard level is based upon WHC-SD-W236A-TI-002, Probabilistic Seismic Hazard Analysis, DOE Hanford Site, Washington. Spectral acceleration amplification is plotted against frequency (Hz) for horizontal and vertical motion and attached to this report. The vertical amplification is based upon the preliminary draft revision of Standard ASCE 4-86. The vertical spectral acceleration is equal to the horizontal at frequencies above 3.3 Hz because of near-field (less than 15 km) sources.
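The report's rule, vertical spectral acceleration equal to horizontal above 3.3 Hz, can be sketched as a simple lookup. The 2/3 vertical-to-horizontal ratio used below the crossover is a common rule of thumb assumed here for illustration, not a value stated in the document:

```python
def vertical_spectrum(freq_hz: float, horizontal_sa: float,
                      crossover_hz: float = 3.3, low_freq_ratio: float = 2 / 3) -> float:
    """Vertical spectral acceleration from the horizontal: equal above the
    crossover frequency (near-field sources), a reduced fraction below it.
    The low-frequency ratio is an assumed rule of thumb."""
    return horizontal_sa if freq_hz >= crossover_hz else low_freq_ratio * horizontal_sa

print(vertical_spectrum(5.0, 0.30))  # 0.3  (above 3.3 Hz: V = H)
print(round(vertical_spectrum(1.0, 0.30), 2))  # 0.2  (below crossover: reduced)
```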

  7. Seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1993-01-01

    This document provides guidelines for the design and evaluation of underground high-level waste storage tanks due to seismic loads. Attempts were made to reflect the knowledge acquired in the last two decades in the areas of defining the ground motion and calculating hydrodynamic loads and dynamic soil pressures for underground tank structures. The application of the analysis approach is illustrated with an example. The guidelines are developed for specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable for other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document.

  8. A very high-resolution, deep-towed, multichannel seismic streamer, part I: technical design

    NASA Astrophysics Data System (ADS)

    Bialas, J.; Breitzke, M.

    2003-04-01

    In order to allow very high-resolution seismic data collection, a new deep-towed multichannel seismic streamer was developed within the gas hydrate initiative of the "Geotechnologien" program. The essential factor determining lateral resolution is the size of the Fresnel zone. Using migration algorithms, resolution can be enhanced up to half a wavelength, but this holds only for the inline direction and does not capture side effects. Since the Fresnel zone is determined by the depths of the source and receiver, as well as by the velocity and frequency of the acoustic waves, lowering the source and receiver towards the sea floor increases the lateral resolution. In our case we concentrated on lowering the receiver array, resulting in a hybrid system architecture that still uses conventional surface-operated airguns. Assuming a working depth of 3000 m and a source signal of 200 Hz, the Fresnel radius is reduced from 106 m for the surface configuration to 26 m for the hybrid case. The digital streamer comprises single hydrophone nodes coupled by cable sections of individual length. Owing to this modular architecture, the streamer layout can be adapted to the source and target requirements. Currently 26 hydrophones are available, sampled at 0.25 ms using a 24-bit A/D converter. Together with high-resolution data acquisition, good positioning is another requirement; therefore three of the hydrophone modules are extended to engineering modules. These nodes include a depth sensor as well as a compass, enabling online display of the relative positioning of the streamer. Absolute coordinates of the deep-towed system are measured with an ultra-short baseline (USBL) system. Using a depth sensor within the deployed transponder, the position can be measured to within 1% of the slant range even at very large offsets to the surface vessel. 
A permanent online connection to the deployed system is provided by a telemetry system, which is capable
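The Fresnel-radius figures quoted in this record can be reproduced with the textbook approximation r = sqrt(v·z / (2f)) for a source/receiver above a reflector at depth z. A minimal sketch, assuming a water velocity of 1500 m/s (our assumption; the record does not state the velocity used):

```python
import math

def fresnel_radius(velocity_ms, depth_m, frequency_hz):
    """First Fresnel zone radius r = sqrt(v * z / (2 * f)) for a
    (near-)coincident source and receiver above a reflector at depth z."""
    return math.sqrt(velocity_ms * depth_m / (2.0 * frequency_hz))

# Surface-towed case from the abstract: 3000 m depth, 200 Hz source signal,
# assumed water velocity 1500 m/s.
r_surface = fresnel_radius(1500.0, 3000.0, 200.0)
print(f"{r_surface:.0f} m")  # 106 m, matching the value quoted above
```

Lowering the receiver array shortens the effective travel path below the receiver, which is how the hybrid configuration reaches the much smaller 26 m radius quoted in the abstract.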

  9. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  10. New finite element models and seismic analyses of the telescopes at W.M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    Kan, Frank W.; Sarawit, Andrew T.; Callahan, Shawn P.; Pollard, Mike L.

    2014-07-01

    On 15 October 2006 a large earthquake damaged both telescopes at the W. M. Keck Observatory, resulting in weeks of observing downtime. A significant portion of the downtime was attributed to recovery efforts repairing damage to telescope bearing journals, radial pad support structures and encoder subsystems. Inadequate damping and strength in the seismic restraint design and the lack of break-away features on the azimuth radial pads were key design deficiencies. In May 2011 a feasibility study was conducted to review several options to enhance the protection of the telescopes, with the goal of minimizing the time needed to bring the telescopes back into operation after a large seismic event. At that time it was determined that new finite element models of the telescope structures were required to better understand the telescope responses both to the design earthquakes required by the local governing building codes and to the USGS seismic data collected at the site on 15 October 2006. These models were verified by comparing the calculated natural frequencies from the models to the measured frequencies obtained from the servo identification study, and by comparing the time-history responses of the telescopes to the October 2006 seismic data against the actual observed damage. The results of two finite element methods, response spectrum analysis and time history analysis, used to determine the seismic demand forces and the seismic response of each telescope to the design earthquakes, were compared. These models can be used to evaluate alternate seismic restraint design options for both Keck telescopes.

  11. Design and development of safety evaluation system of buildings on a seismic field based on the network platform

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhang, Lei; Chen, Xiangzhao; Zhang, Xinghua

    2015-03-01

    This paper describes an on-site earthquake safety evaluation system for buildings, developed on a network platform. The system embeds quantitative research results completed in accordance with the provisions of Post-earthquake Field Works, Part 2: Safety Assessment of Buildings (GB18208.2-2001), further developed into an easy-to-use software platform. The system is aimed at allowing engineering professionals, civil engineering technicians, or earthquake-affected victims on site to assess damaged buildings through a network after earthquakes. The authors studied in depth the function structure, the process design of the safety evaluation module, and the hierarchical analysis algorithm module of the system, and developed the overall architecture design, development technology and database design of the system. Technologies such as hierarchical architecture design and Java EE were used in the system development, and MySQL5 was adopted for the database. The result is a complete evaluation process of information collection, safety evaluation, and output of damage and safety degrees, as well as query and statistical analysis of evaluated buildings. The system can play a positive role in sharing expert post-earthquake experience and promoting safety evaluation of buildings on a seismic field.

  12. A new instrumentation to measure seismic waves attenuation

    NASA Astrophysics Data System (ADS)

    Tisato, N.; Madonna, C.; Boutareaud, S.; Burg, J.

    2010-12-01

    Attenuation of seismic waves is the general expression describing the loss of energy of an elastic perturbation during its propagation in a medium. As a geophysical method, measuring the attenuation of seismic waves is key to uncovering essential information about the fluid saturation of buried rocks. Attenuation of seismic waves depends on several mechanisms; in the case of saturated rock, fluids play an important role. Seismic waves create zones of overpressure by mobilizing the fluids in the pores of the rock. Starting from Gassmann-Biot theory (Gassmann, 1951), several models (e.g. White, 1975; Mavko and Jizba, 1991) have been formulated to describe the energy absorption by flow of fluids. According to Mavko et al. (1998), for rock with permeability equal to or less than 1 D, fluid viscosity between 1 cP and 10 cP, and low-frequency seismic waves (< 100 Hz), the most important processes that subtract energy from the seismic waves are squirt flow and patchy saturation. Numerical models such as that of Quintal et al. (2009) calculate how a patchy-saturated vertical rock section (25 cm in height), after stress steps of several kPa (e.g. 30 kPa), shows a dissimilar increase in pore pressure between gas-saturated and liquid-saturated layers. The Rock Deformation Laboratory at ETH-Zürich has designed and set up a new pressure vessel to measure seismic wave attenuation in rocks at frequencies between 0.1 and 100 Hz and to verify the predicted influence of seismic waves on the pore pressure in patchy-saturated rocks. We present this pressure vessel, which can reach confining pressures of 25 MPa and holds a 250 mm long, 76 mm diameter sample. Dynamic stress is applied at the top of the rock cylinder by a piezoelectric motor that can generate a stress of several kPa (> 100 kPa) in less than 10 ms. The vessel is equipped with 5 pressure sensors buried within the rock sample, a load cell and a strain sensor to measure axial shortening while the motor generates the seismic waves. The sensor
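In forced-oscillation apparatus of this kind, attenuation is commonly obtained from the phase lag φ between applied stress and measured strain, with 1/Q = tan φ. A hedged sketch with synthetic signals; this is the standard sub-resonance technique, not necessarily the exact processing used at ETH-Zürich:

```python
import numpy as np

# Synthetic forced-oscillation data: 1 Hz sinusoidal stress, strain lagging
# by a small phase angle phi; in the sub-resonance method 1/Q = tan(phi).
fs, f0, phi_true = 1000.0, 1.0, 0.02          # sample rate, drive freq, lag (rad)
t = np.arange(0, 10, 1 / fs)                  # exactly 10 drive periods
stress = np.sin(2 * np.pi * f0 * t)
strain = 0.5 * np.sin(2 * np.pi * f0 * t - phi_true)

def phase_at(x, f):
    """Phase of signal x at frequency f via its Fourier coefficient."""
    return np.angle(np.sum(x * np.exp(-2j * np.pi * f * t)))

phi = phase_at(stress, f0) - phase_at(strain, f0)
q_inv = np.tan(phi)
print(round(q_inv, 3))  # ~0.02, recovering the imposed attenuation
```

Averaging over an integer number of drive periods, as above, keeps the Fourier phase estimate unbiased.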

  13. Earthquake damage potential and critical scour depth of bridges exposed to flood and seismic hazards under lateral seismic loads

    NASA Astrophysics Data System (ADS)

    Song, Shin-Tai; Wang, Chun-Yao; Huang, Wen-Hsiu

    2015-12-01

    Many bridges located in seismic hazard regions suffer from serious foundation exposure caused by riverbed scour. Loss of surrounding soil significantly reduces the lateral strength of pile foundations. When the scour depth exceeds a critical level, the strength of the foundation is insufficient to withstand the imposed seismic demand, which induces the potential for unacceptable damage to the piles during an earthquake. This paper presents an analytical approach to assess the earthquake damage potential of bridges with foundation exposure and identify the critical scour depth that causes the seismic performance of a bridge to differ from the original design. The approach employs the well-accepted response spectrum analysis method to determine the maximum seismic response of a bridge. The damage potential of a bridge is assessed by comparing the imposed seismic demand with the strengths of the column and the foundation. The versatility of the analytical approach is illustrated with a numerical example and verified by the nonlinear finite element analysis. The analytical approach is also demonstrated to successfully determine the critical scour depth. Results highlight that relatively shallow scour depths can cause foundation damage during an earthquake, even for bridges designed to provide satisfactory seismic performance.

  14. Seismic hazard evaluation for design and/or verification of a high voltage system

    SciTech Connect

    Grases, J.; Malaver, A.; Lopez, S.; Rivero, P.

    1995-12-31

    The Venezuelan capital, Caracas, with a population of about 5 million, is within the area of contact of the Caribbean and South American tectonic plates. Since 1567, the valley where it lies and its surroundings have been shaken by at least six destructive events from different seismogenic sources. Electric energy is served to the city by a high voltage system consisting of 4 power stations, 20 substations (230 kV downwards) and 80 km of high voltage lines, covering an area of about 135 x 60 km². Given the variety of soil conditions, topographical irregularities and proximity to potentially active faults, it was decided to perform a seismic hazard study. This paper gives the results of that study, synthesized in two hazard-parameter maps, which allow a conservative characterization of the acceleration on firm soils. Specific site coefficients allow for changes in soil conditions and topographical effects. Sites whose proximity to fault lines is less than about 2 km require additional field studies in order to rule out the possibility of permanent ground displacements.

  15. Multidisciplinary co-operation in building design according to urbanistic zoning and seismic microzonation

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2005-05-01

    Research and practice in seismology and urban planning intersect in the impact of earthquakes on urban areas. The roles of sub-area-wide or typological divisions of the town were investigated with a regression methodology, regarding their contribution to urban earthquake risk management. The inductive data set comprised recovery, preparedness, mitigation and resilience planning. All historically constituted planning types are found again today as layers, as the zoning results are used by actors with different backgrounds: local authorities, civil protection, urban planners, civil engineers. In resilience planning, the urban system is first theorized in its complexity and then approached in an integrated way. The steady restructuring process of the urban organism becomes evident in a dynamic analysis. Although expressed materially, the "urban frame" is realized spiritually, space adaptation being also social. A retrospective investigation of the role of resilient individual buildings within the urban system of Bucharest, Romania, was undertaken in order to learn systemic lessons, considering the street an educational environment. (In)formation in the study and the decision-making process stand in a reciprocal relationship, both being obliged in the (in)formation of public opinion. For a complete view on resilience, both zoning types, seismic and urbanistic, must be considered; through their superposition new sub-area-wide divisions of the town appear, with recommendations made according to the vulnerability of the building type.

  16. Efficacy of Code Provisions for Seismic Design of Asymmetric RC Building

    NASA Astrophysics Data System (ADS)

    Balakrishnan, Bijily; Sarkar, Pradip

    2016-04-01

    The earthquake resistant design code in India, IS: 1893, was revised in 2002 to include provisions for torsional irregularity in asymmetric buildings. In line with other international codes, IS 1893: 2002 requires estimating the design eccentricity from the static and accidental eccentricities. The present study attempts to evaluate the effectiveness of the design code requirements for designing torsionally irregular asymmetric buildings. Two similar asymmetric buildings, designed respectively considering and ignoring the code requirement, have been considered for this study. Nonlinear static and dynamic analyses are performed on these buildings to reveal the difference in their behaviour, and it is found that the plan asymmetry in the building makes it non-ductile even after design with the code provisions. The code criterion for plan asymmetry tends to improve the strength of members, but this study indicates that changing the stiffness distribution to reduce eccentricity may lead to a preferred mode of failure.
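The static-plus-accidental design eccentricity mentioned above can be sketched as follows. The coefficients reflect our reading of IS 1893 (Part 1): 2002 (clause 7.9.2) and should be checked against the code text; the input values are purely illustrative:

```python
def design_eccentricities(e_static, b):
    """Both design-eccentricity cases for a floor with static eccentricity
    e_static and plan dimension b perpendicular to the ground motion:
    e_d = 1.5*e_s + 0.05*b  and  e_d = e_s - 0.05*b (the worse effect
    governs). Coefficients per our reading of IS 1893 (Part 1): 2002."""
    return (1.5 * e_static + 0.05 * b, e_static - 0.05 * b)

# Hypothetical floor: 1.2 m static eccentricity, 20 m plan dimension.
e1, e2 = design_eccentricities(e_static=1.2, b=20.0)
print(round(e1, 2), round(e2, 2))  # 2.8 0.2
```

Each lateral-force-resisting element is then designed for whichever of the two cases produces the more severe force in it.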

  18. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  19. Theoretical and practical considerations for the design of the iMUSH active-source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, E.; Levander, A.; Harder, S. H.; Abers, G. A.; Creager, K. C.; Vidale, J. E.; Moran, S. C.; Malone, S. D.

    2013-12-01

    The multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment seeks to understand the details of the magmatic system that feeds Mount St. Helens using active- and passive-source seismic, magnetotelluric, and petrologic data. The active-source seismic component of this experiment will take place in the summer of 2014 utilizing all of the 2600 PASSCAL 'Texan' Reftek instruments which will record twenty-four 1000-2000 lb shots distributed around the Mount St. Helens region. The instruments will be deployed as two consecutive refraction profiles centered on the volcano, and a series of areal arrays. The actual number of areal arrays, as well as their locations, will depend strongly on the length of the experiment (3-4 weeks), the number of instrument deployers (50-60), and the time it will take per deployment given the available road network. The current work shows how we are balancing these practical considerations against theoretical experiment designs in order to achieve the proposed scientific goals with the available resources. One of the main goals of the active-source seismic experiment is to image the magmatic system down to the Moho (35-40 km). Calculating sensitivity kernels for multiple shot/receiver offsets shows that direct P waves should be sensitive to Moho depths at offsets of 150 km, and therefore this will likely be the length of the refraction profiles. Another primary objective of the experiment is to estimate the locations and volumes of different magma accumulation zones beneath the volcano using the areal arrays. With this in mind, the optimal locations of these arrays, as well as their associated shots, are estimated using an eigenvalue analysis of the approximate Hessian for each possible experiment design. This analysis seeks to minimize the number of small eigenvalues of the approximate Hessian that would amplify the propagation of data noise into regions of interest in the model space, such as the likely locations of magma
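The eigenvalue screening of candidate experiment designs described above can be sketched as follows. The Jacobians here are random stand-ins for the real shot/receiver sensitivity kernels, so this only illustrates the criterion (count of poorly constrained model directions), not the iMUSH analysis itself:

```python
import numpy as np

def small_eigenvalue_count(jacobian, tol=1e-6):
    """Form the Gauss-Newton approximate Hessian H = J^T J and count
    eigenvalues below tol * (largest eigenvalue): these correspond to
    model-space directions that would amplify data noise."""
    h = jacobian.T @ jacobian
    w = np.linalg.eigvalsh(h)
    return int(np.sum(w < tol * w.max()))

# Toy comparison of two hypothetical survey designs (random sensitivities):
rng = np.random.default_rng(0)
j_dense = rng.standard_normal((80, 20))    # many shot-receiver pairs
j_sparse = j_dense[:10, :]                 # few pairs: underdetermined
print(small_eigenvalue_count(j_dense), small_eigenvalue_count(j_sparse))  # 0 10
```

A design minimizing this count over the regions of interest (e.g. suspected magma accumulation zones) is preferred, which is the spirit of the analysis the abstract describes.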

  20. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verified statement. 2.33... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The application must include a statement that is signed in accordance with the requirements of § 2.193...

  1. Appraising the value of independent EIA follow-up verifiers

    SciTech Connect

    Wessels, Jan-Albert

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  2. Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Topical report YMP/TR-003-NP

    SciTech Connect

    1996-10-01

    This topical report describes the methodology and criteria that the U.S. Department of Energy (DOE) proposes to use for preclosure seismic design of structures, systems, and components (SSCs) of the proposed geologic repository operations area that are important to safety. Title 10 of the Code of Federal Regulations, Part 60 (10 CFR 60), Disposal of High-Level Radioactive Wastes in Geologic Repositories, states that for a license to be issued for operation of a high-level waste repository, the U.S. Nuclear Regulatory Commission (NRC) must find that the facility will not constitute an unreasonable risk to the health and safety of the public. Section 60.131 (b)(1) requires that SSCs important to safety be designed so that natural phenomena and environmental conditions anticipated at the geologic repository operations area will not interfere with necessary safety functions. Among the natural phenomena specifically identified in the regulation as requiring safety consideration are the hazards of ground shaking and fault displacement due to earthquakes.

  3. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    SciTech Connect

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti

    2008-07-08

    The development of displacement-based approaches for earthquake engineering design appears to be very useful and capable of providing improved reliability by directly comparing computed response and expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, which has been developed in structural engineering over the past ten years in the attempt to mitigate some of the deficiencies in current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts at applying the procedure to geotechnical engineering and, in particular, earth retaining structures are discussed in [3], [4] and [5]. In this field, however, the outcomes of the research need to be further investigated in many respects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper and compared to those obtained from conventional pseudo-static analyses.

  4. Optimization for performance-based design under seismic demands, including social costs

    NASA Astrophysics Data System (ADS)

    Möller, Oscar; Foschi, Ricardo O.; Ascheri, Juan P.; Rubinstein, Marcelo; Grossman, Sergio

    2015-06-01

    Performance-based design in earthquake engineering is a structural optimization problem that has, as the objective, the determination of design parameters for the minimization of total costs, while at the same time satisfying minimum reliability levels for the specified performance criteria. Total costs include those for construction and structural damage repairs, those associated with non-structural components and the social costs of economic losses, injuries and fatalities. This paper presents a general framework to approach this problem, using a numerical optimization strategy and incorporating the use of neural networks for the evaluation of dynamic responses and the reliability levels achieved for a given set of design parameters. The strategy is applied to an example of a three-story office building. The results show the importance of considering the social costs, and the optimum failure probabilities when minimum reliability constraints are not taken into account.

  5. Seismic design or retrofit of buildings with metallic structural fuses by the damage-reduction spectrum

    NASA Astrophysics Data System (ADS)

    Li, Gang; Jiang, Yi; Zhang, Shuchuan; Zeng, Yan; Li, Qiang

    2015-03-01

    Recently, the structural fuse has become an important issue in the field of earthquake engineering. Due to the trilinearity of the pushover curve of buildings with metallic structural fuses, the mechanism of the structural fuse is investigated through the ductility equation of a single-degree-of-freedom system, and the corresponding damage-reduction spectrum is proposed to design and retrofit buildings. Furthermore, the controlling parameters, the stiffness ratio between the main frame and structural fuse and the ductility factor of the main frame, are parametrically studied, and it is shown that the structural fuse concept can be achieved by specific combinations of the controlling parameters based on the proposed damage-reduction spectrum. Finally, a design example and a retrofit example, variations of real engineering projects after the 2008 Wenchuan earthquake, are provided to demonstrate the effectiveness of the proposed design procedures using buckling restrained braces as the structural fuses.

  6. The LUSI Seismic Experiment: Deployment of a Seismic Network around LUSI, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Karyono, Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Haryanto, Iyan; Masturyono, Masturyono; Hadi, Soffian; Rohadi, Suprianto; Suardi, Iman; Rudiyanto, Ariska; Pranata, Bayu

    2015-04-01

    The spectacular Lusi eruption started in northeast Java, Indonesia, on 29 May 2006, following a M6.3 earthquake striking the island. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system, and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. Lusi is located a few kilometres to the NE of the Arjuno-Welirang volcanic complex and sits upon the Watukosek fault system, which originates from this volcanic complex, was reactivated by the M6.3 earthquake in 2006, and is still periodically reactivated by the frequent seismicity. To date Lusi is still active, erupting gas, water, mud and clasts. Gas and water data show that the Lusi plumbing system is connected with the neighbouring Arjuno-Welirang volcanic complex, making the Lusi eruption a "sedimentary-hosted geothermal system". To verify and characterize the occurrence of seismic activity and how it perturbs the connected Watukosek fault, the Arjuno-Welirang volcanic system and the ongoing Lusi eruption, we deployed 30 seismic stations (short-period and broadband) in this region of the East Java basin. The seismic stations are more densely distributed around Lusi and the Watukosek fault zone that stretches between Lusi and the Arjuno-Welirang (AW) complex; fewer stations are positioned around the volcanic arc. Our study sheds light on the seismic activity along the Watukosek fault system and describes the waveforms associated with the geysering activity of Lusi. The initial network aims to locate small events that may not be captured by the seismic network of the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG), and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-Arjuno-Welirang region and of temporal variations of vp/vs ratios. Such variations will then be ideally related to

  7. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operations, and closure of MSW landfills. These regulat...

  9. A structural design and analysis of a piping system including seismic load

    SciTech Connect

    Hsieh, B.J.; Kot, C.A.

    1991-01-01

    The structural design/analysis of a piping system at a nuclear fuel facility is used to investigate some aspects of current design procedures. Specifically, the effect of using various stress measures, including ASME Boiler and Pressure Vessel (B&PV) Code formulas, is evaluated. It is found that large differences in local maximum stress values may be calculated depending on the stress criterion used. However, when the global stress maxima for the entire system are compared, the differences are much smaller, being nevertheless, for some load combinations, of the order of 50 percent. The effect of using an Equivalent Static Method (ESM) analysis is also evaluated by comparing its results with those obtained from a Response Spectrum Method (RSM) analysis with the modal responses combined using the absolute summation (ABS), the square root of the sum of the squares (SRSS), and the 10 percent method (10PC). It is shown that a spectrum amplification factor (equivalent static coefficient greater than unity) of at least 1.32 must be used in the current application of the ESM analysis in order to obtain results that are conservative in all respects relative to an RSM analysis based on ABS. However, it appears that an adequate design would be obtained from the ESM approach even without the use of a spectrum amplification factor. 7 refs., 3 figs., 3 tabs.
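The modal-combination rules compared in this record differ only in how peak modal responses are summed. A minimal sketch of the first two (ABS and SRSS) with hypothetical modal peaks; the 10PC rule additionally includes cross terms for closely spaced modes and is omitted here:

```python
import numpy as np

def combine_modal(responses, method="srss"):
    """Combine peak modal responses: ABS (absolute sum, most conservative)
    vs SRSS (square root of the sum of the squares)."""
    r = np.abs(np.asarray(responses, dtype=float))
    if method == "abs":
        return float(r.sum())
    if method == "srss":
        return float(np.sqrt((r ** 2).sum()))
    raise ValueError(f"unknown method: {method}")

modal_peaks = [3.0, 4.0, 1.0]              # hypothetical peak stresses per mode
print(combine_modal(modal_peaks, "abs"))   # 8.0
print(combine_modal(modal_peaks, "srss"))  # ~5.1
```

Because ABS assumes all modal peaks occur simultaneously with the same sign, it bounds SRSS from above, which is why the abstract benchmarks ESM conservatism against the ABS combination.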

  10. Analyzing Interaction Patterns to Verify a Simulation/Game Model

    ERIC Educational Resources Information Center

    Myers, Rodney Dean

    2012-01-01

    In order for simulations and games to be effective for learning, instructional designers must verify that the underlying computational models being used have an appropriate degree of fidelity to the conceptual models of their real-world counterparts. A simulation/game that provides incorrect feedback is likely to promote misunderstanding and…

  11. Performance-based design and evaluation for liquefaction-related seismic hazards

    NASA Astrophysics Data System (ADS)

    Huang, Yi-Min

    Soil liquefaction can cause serious damage both during and following strong ground shaking. The liquefaction-induced hazards discussed in this research include ground motion modification, flow slides, lateral spreading, and ground surface settlement. With the exception of ground motion modification, all of these post-liquefaction hazards are commonly estimated using empirical methods. Uncertainty in these empirical relationships is usually not explicitly accounted for, which affects the accuracy and consistency of liquefaction-induced hazard prediction. Some investigators have proposed probabilistic models to deal with post-liquefaction problems, such as residual strength and lateral spreading displacement. A probabilistic model for post-liquefaction settlement, however, is not currently available. The goal of this dissertation is to provide improved procedures for estimation of post-liquefaction hazards, and to suggest procedures for computing the probability of exceeding a damage level of concern. This has been accomplished by implementing the performance-based earthquake engineering (PBEE) framework developed by the Pacific Earthquake Engineering Research Center (PEER) into a limit state exceedance formulation, which involves concepts of demand (loading) and capacity (resistance). In the proposed procedures, the demand of a post-liquefaction hazard is estimated using PEER PBEE framework, and the capacity is characterized probabilistically for various damage levels. For design purposes, a limit state exceedance for a specific damage level is a special case in the PBEE computations. The mean annual rate of exceeding this specific damage level for a liquefaction-related hazard of interest then can be computed and applied to engineering design. An important accomplishment of this research was the development of procedures for performance-based analysis of post-liquefaction settlement. The development required characterization of a maximum volumetric strain. A
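The limit-state formulation described above convolves a probabilistic capacity (fragility) with the seismic demand (hazard curve) to obtain a mean annual rate of exceeding a damage level. A hedged numerical sketch of that convolution; all numbers are illustrative, not values from the dissertation:

```python
import numpy as np
from math import erf, sqrt, log

# Discretized hazard curve: mean annual rate of IM (e.g. PGA, in g)
# being exceeded at each intensity level. Illustrative values only.
im = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
haz = np.array([2e-2, 8e-3, 3e-3, 1e-3, 4e-4, 1e-4])

def fragility(x, median=0.35, beta=0.4):
    """Lognormal fragility P(damage level exceeded | IM = x)."""
    return 0.5 * (1.0 + erf(log(x / median) / (beta * sqrt(2.0))))

# lambda(dm) = sum over IM bins of P(exceed | im) * |d lambda(im)|
dlam = -np.diff(np.append(haz, 0.0))       # rate "spent" in each bin
rate = float(sum(fragility(v) * d for v, d in zip(im, dlam)))
print(f"mean annual exceedance rate: {rate:.2e}")
```

In a design setting, this rate (or its reciprocal, a return period) is compared against an acceptable-risk target for the settlement damage level of interest.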

  12. Verifying a nuclear weapon's response to radiation environments

    SciTech Connect

    Dean, F.F.; Barrett, W.H.

    1998-05-01

    The process described in the paper is being applied as part of the design verification of a replacement component designed for a nuclear weapon currently in the active stockpile. This process is an adaptation of the process successfully used in nuclear weapon development programs. The verification process concentrates on evaluating system response to radiation environments, verifying system performance during and after exposure to radiation environments, and assessing system survivability.

  13. An IBM 370 assembly language program verifier

    NASA Technical Reports Server (NTRS)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  14. Firms Verify Online IDs Via Schools

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young…

  15. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  16. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  17. Research of CRP-based irregular 2D seismic acquisition

    NASA Astrophysics Data System (ADS)

    Zhao, Hu; Yin, Cheng; He, Guang-Ming; Chen, Ai-Ping; Jing, Long-Jiang

    2015-03-01

    Seismic exploration in the mountainous areas of western China is extremely difficult because of the complexity of the surface and subsurface, which results in shooting difficulties, seismic data with low signal-to-noise ratio, and strong interference. The complexity of the subsurface structure leads to strong scattering of the reflection points; thus, the curved-line acquisition method has been used. However, the actual subsurface structural characteristics have rarely been considered. We propose a design method for irregular acquisition based on common reflection points (CRP) to avoid difficult-to-shoot areas, while considering the structural characteristics and CRP positions and optimizing the surface receiving-line position. We arrange the positions of the receiving points to ensure as little dispersion of subsurface CRP as possible, thereby improving the signal-to-noise ratio of the seismic data. We verify the applicability of the method using actual data from a site in the Sichuan Basin. The proposed method effectively addresses the difficulties of seismic data acquisition and facilitates seismic exploration in structurally complex areas.

  18. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness: a mathematical proof. Hardware verification research has focused on device verification and has largely ignored system composition verification. To address this deficiency, we examine how the current hardware verification methodology can be extended to verify complete systems.

  19. Chemical weapons convention verifiability assessment. Final report

    SciTech Connect

    Mengel, R.W.; Meselson, M.; Dee, W.C.; Palarino, R.N.; Eimers, F.

    1994-01-18

    The U.S. is in the process of the ratification of the Chemical Weapons Convention (CWC). A significant element of this process is the evaluation of the verifiability of the CWC. In addition to the U.S. Government assessment, a separate independent evaluation has been conducted by a group of recognized non-governmental CWC experts. This report documents the findings, conclusions and recommendations of these experts. The verifiability assessment evaluated the kinds of violations that might be carried out undetected, the difficulty in accomplishing each violation, and the overall strengths and weaknesses of the CWC with regard to verification. Principal conclusions are: (1) reporting and routine inspection provisions of the CWC are adequate for verification of declarations; (2) restrictions on challenge inspection facility access and sampling and analysis limit verification; (3) difficulty in discriminating between permitted and prohibited activities at commercial facilities complicates verifiability; (4) fundamental to achieving verification aims is a highly qualified and trained corps of CWC inspectors; and (5) technology to support improved verification will evolve into the future.

  20. Seismic design technology for breeder reactor structures. Volume 2. Special topics in soil/structure interaction analyses

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: definition of seismic input ground motion, review of state-of-the-art procedures, analysis guidelines, rock/structure interaction analysis example, comparison of two- and three-dimensional analyses, and comparison of analyses using FLUSH and TRI/SAC Codes. (DLC)

  1. Static behaviour of induced seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.

    2015-12-01

    The standard paradigm to describe seismicity induced by fluid injection is to apply nonlinear diffusion dynamics in a poroelastic medium. I show that the spatiotemporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the behaviour of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope.

  2. Comparison of seismic sources for shallow seismic: sledgehammer and pyrotechnics

    NASA Astrophysics Data System (ADS)

    Brom, Aleksander; Stan-Kłeczek, Iwona

    2015-10-01

    Pyrotechnic materials are a class of explosive materials that produce thermal, luminous or sound effects, gas, smoke, or combinations of these as a result of a self-sustaining chemical reaction. Pyrotechnics can therefore be used as a seismic source, designed to release accumulated energy in the form of a seismic wave recorded by tremor sensors (geophones) after its passage through the rock mass. The aim of this paper was to determine the utility of pyrotechnics for shallow engineering seismics. The work compares conventional seismic-wave excitation for the seismic refraction method (sledgehammer and plate) with firecrackers activated at the surface. The energy released and the frequency spectra were compared for the two types of sources. The results did not establish which source gives better results, but they revealed interesting aspects of using pyrotechnics in seismic measurements, for example the use of pyrotechnic materials in MASW.

  3. Monitoring and modeling the multi-time-scale seismic hazard of the southern Longmenshan fault: an experimental design of the `monitoring and modeling for prediction' system

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Li, L.; Liu, G.; Jiang, C.; Ma, H.

    2010-12-01

    To the southwest of the WFSD-I and WFSD-II is the southern part of the Longmenshan fault, which has remained quiet since the May 12, 2008, Wenchuan earthquake that ruptured the middle and northern parts of the Longmenshan fault zone. The seismic hazard in this region is a concern not only within the WFSD project but also for regional sustainability. This presentation discusses three major problems related to the seismic hazard of this fault segment: 1) if a major earthquake were to rupture this fault segment, what would the ‘scenario rupture’ look like as it prepares and occurs; 2) based on this concept of ‘scenario rupture’, how to design the ‘monitoring and modeling for prediction’ system in this region, for the effective constraint of geodynamic models of earthquake preparation, the effective monitoring of potentially pre-seismic changes of geophysical fields, and the effective testing of predictive models and/or algorithms; and 3) what will be the potential contribution of the WFSD project, in both the long-term and short-term sense, to the monitoring and modeling of seismic hazard in this region. In considering these three questions, lessons and experiences from the Wenchuan earthquake play an important role, and the relation between the Xianshuihe fault and the Longmenshan fault is one of the critical issues under consideration. Considering the state of the art of earthquake science and social needs, the monitoring and modeling endeavor should deal with different time scales, addressing both scientific issues and decision-making issues. Taking the lessons and experiences of previously conducted earthquake prediction experiment sites, we propose the concept of ‘seismological engineering’ (which is different from either ‘earthquake engineering’ or ‘engineering seismology’) dealing with the design of the operational multi-disciplinary observation system oriented at the monitoring and

  4. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  5. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-11-18

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including ensuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
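The serial-versus-speculative comparison claimed in this patent can be sketched in a few lines: run the test instructions once in order, run them again concurrently, and flag any mismatch. The harness below is an illustrative toy, not the patented implementation; the instruction set here is just a list of read-only callables.

```python
import threading

def run_serial(tests, state):
    # Execute test instructions one by one; data dependencies are trivially honored.
    return [t(state) for t in tests]

def run_speculative(tests, state):
    # Execute the same instructions concurrently ("speculatively") in threads.
    results = [None] * len(tests)
    def worker(i):
        results[i] = tests[i](state)
    threads = [threading.Thread(target=worker, args=(i,)) for i in range(len(tests))]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

def has_speculation_error(tests, state):
    # A mismatch between serial and speculative results flags an error.
    return run_serial(tests, dict(state)) != run_speculative(tests, dict(state))

# Independent, read-only instructions: both executions must agree.
tests = [lambda s, k=k: s["base"] + k for k in range(4)]
print(has_speculation_error(tests, {"base": 10}))  # False: results match
```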

  6. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-12-09

    Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including ensuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.

  7. Seismic offset balancing

    SciTech Connect

    Ross, C.P.; Beale, P.L.

    1994-01-01

    The ability to successfully predict lithology and fluid content from reflection seismic records using AVO techniques is contingent upon accurate pre-analysis conditioning of the seismic data. However, all too often, residual amplitude effects remain after the many offset-dependent processing steps are completed. Residual amplitude effects often represent a significant error when compared to the amplitude variation with offset (AVO) response that the authors are attempting to quantify. They propose a model-based, offset-dependent amplitude balancing method that attempts to correct for these residuals and other errors due to suboptimal processing. Seismic offset balancing attempts to quantify the relationship between the offset response of background seismic reflections and corresponding theoretical predictions for average lithologic interfaces thought to cause these background reflections. It is assumed that any deviation from the theoretical response is a result of residual processing phenomena and/or suboptimal processing, and a simple offset-dependent scaling function is designed to correct for these differences. This function can then be applied to seismic data over both prospective and nonprospective zones within an area where the theoretical values are appropriate and the seismic characteristics are consistent. A conservative application of the above procedure results in an AVO response over both gas sands and wet sands that is much closer to theoretically expected values. A case history from the Gulf of Mexico Flexure Trend is presented as an example to demonstrate the offset balancing technique.
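The offset-dependent scaling function described in this abstract can be sketched as a smoothed ratio of the theoretical background response to the observed one. The amplitudes and the quadratic smoothing below are invented for illustration; the paper's actual theoretical response would come from Zoeppritz-type modeling of the average lithologic interface.

```python
import numpy as np

# Hypothetical background amplitudes picked at each offset, and a hypothetical
# theoretical AVO response for the average background interface.
offsets  = np.array([200.0, 600.0, 1000.0, 1400.0, 1800.0])   # meters
observed = np.array([1.00, 0.80, 0.55, 0.42, 0.30])           # residual decay present
theory   = np.array([1.00, 0.95, 0.88, 0.80, 0.70])           # modeled response

# Offset-dependent scaling function: ratio of theory to observation, smoothed
# with a low-order polynomial so random noise is not boosted.
coeffs = np.polyfit(offsets, theory / observed, deg=2)
scale = np.polyval(coeffs, offsets)

# Applying the scaler pulls the background response toward the theoretical trend.
balanced = observed * scale
print(np.round(balanced, 3))
```

The same fitted `scale` function would then be applied to the prospective zone, where deviations from theory are interpreted as genuine AVO anomalies rather than processing residuals.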

  8. Impact of lateral force-resisting system and design/construction practices on seismic performance and cost of tall buildings in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Galal, Khaled; Yehia, Sherif

    2013-09-01

    The local design and construction practices in the United Arab Emirates (UAE), together with Dubai's unique rate of development, warrant special attention to the selection of Lateral Force-Resisting Systems (LFRS). This research proposes four different feasible solutions for the selection of the LFRS for tall buildings and quantifies the impact of these selections on seismic performance and cost. The systems considered are: Steel Special Moment-Resisting Frame (SMRF), Concrete SMRF, Steel Dual System (SMRF with Special Steel Plate Shear Wall, SPSW), and Concrete Dual System (SMRF with Special Concrete Shear Wall, SCSW). The LFRS selection is driven by the seismic setting as well as the adopted design and construction practices in Dubai. It is found that the concrete design alternatives are consistently less expensive than their steel counterparts. The steel dual system is expected to have the least damage based on its relatively smaller interstory drifts. However, this preferred performance comes at a higher initial construction cost. Conversely, the steel SMRF system is expected to have the most damage and associated repair cost due to its excessive flexibility. The two concrete alternatives are expected to have relatively moderate damage and repair costs in addition to their lower initial construction cost.

  9. Verifying disarmament: scientific, technological and political challenges

    SciTech Connect

    Pilat, Joseph R

    2011-01-25

    There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would have unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shut-down nuclear weapon programs and existing nuclear energy programs, and material and weapon production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  10. Seismic, shock, and vibration isolation - 1988

    SciTech Connect

    Chung, H.; Mostaghel, N.

    1988-01-01

    This book contains papers presented at a conference on pressure vessels and piping. Topics covered include: Design of R-FBI bearings for seismic isolation; Benefits of vertical and horizontal seismic isolation for LMR nuclear reactor units; and Some remarks on the use and perspectives of seismic isolation for fast reactors.

  11. Recent advances in the Lesser Antilles observatories Part 1 : Seismic Data Acquisition Design based on EarthWorm and SeisComP

    NASA Astrophysics Data System (ADS)

    Saurel, Jean-Marie; Randriamora, Frédéric; Bosson, Alexis; Kitou, Thierry; Vidal, Cyril; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

    Lesser Antilles observatories are in charge of monitoring the volcanoes and earthquakes in the Eastern Caribbean region. During the past two years, our seismic networks have evolved toward fully digital technology. These changes, which include modern three-component sensors, high-dynamic-range digitizers, and high-speed terrestrial and satellite telemetry, improve data quality but also increase the data flows to process and store. Moreover, the generalization of data exchange to build a wide virtual seismic network around the Caribbean domain requires great flexibility to provide and receive data flows in various formats. Like many observatories, we have decided to use the most popular and robust open-source data acquisition systems in today's observatory community: EarthWorm and SeisComP. The former is renowned for its ability to process real-time seismic data flows, with a high number of tunable modules (filters, triggers, automatic pickers, locators). The latter is renowned for its ability to exchange seismic data using the international SEED standard (Standard for Exchange of Earthquake Data), either by producing archive files or by managing output and input SEEDLink flows. The French Antilles Seismological and Volcanological Observatories have chosen to take advantage of the best features of each package to design a new data flow scheme and to integrate it into our global observatory data management system, WebObs [Beauducel et al., 2004]; see the companion paper (Part 2).
We assigned tasks to the different packages according to their main abilities: - EarthWorm first performs the integration of data from heterogeneous sources; - SeisComP takes this homogeneous EarthWorm data flow, adds other sources, and produces SEED archives and a SEED data flow; - EarthWorm is then used again to process this clean and complete SEEDLink data flow, mainly producing triggers, automatic locations and alarms; - WebObs provides a friendly human interface, both

  12. Procedures for computing site seismicity

    NASA Astrophysics Data System (ADS)

    Ferritto, John

    1994-02-01

    This report was prepared as part of the Navy's Seismic Hazard Mitigation Program. The Navy has numerous bases located in seismically active regions throughout the world. Safe, effective design of waterfront structures requires determining expected earthquake ground motion. The Navy's problem is further complicated by the presence of soft saturated marginal soils that can significantly amplify the levels of seismic shaking, as evidenced in the 1989 Loma Prieta earthquake. The Naval Facilities Engineering Command's seismic design manual, NAVFAC P-355.1, requires a probabilistic assessment of ground motion for design of essential structures. This report presents the basis for the Navy's Seismic Hazard Analysis procedure that was developed and is intended to be used with the Seismic Hazard Analysis computer program and user's manual. This report also presents data on geology and seismology to establish the background for the seismic hazard model developed. The procedure uses the historical epicenter database and available geologic data, together with source models, recurrence models, and attenuation relationships, to compute the probability distribution of site acceleration and an appropriate spectrum. This report discusses the developed stochastic model for seismic hazard evaluation and the associated research.
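A probabilistic hazard procedure of the kind outlined above ultimately reports exceedance probabilities over a design life. A minimal sketch, assuming Poisson (memoryless) earthquake occurrence, converts a mean annual rate of exceedance from the hazard curve into such a probability; the numbers below are the common generic "10% in 50 years" design level, not figures from this report.

```python
import math

def prob_exceedance(annual_rate, years):
    # Poisson assumption: P(at least one exceedance in `years`)
    return 1.0 - math.exp(-annual_rate * years)

# A 475-year return period corresponds to roughly 10% in 50 years.
rate = 1.0 / 475.0
print(f"{prob_exceedance(rate, 50):.3f}")  # ~0.100
```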

  13. Verifiable process monitoring through enhanced data authentication.

    SciTech Connect

    Goncalves, Joao G. M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-09-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear process control. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication, providing 'jointly verifiable' data, and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
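The "capture, tag, authenticate" step at the heart of this concept can be sketched as follows. EDAS itself uses public-key authentication plus encryption; this toy uses a keyed HMAC with a pre-shared demo key and a fixed timestamp purely for brevity and determinism, and the record fields are invented.

```python
import hashlib
import hmac
import json

KEY = b"shared-demo-key"  # assumption: a pre-shared verification key, not EDAS's PKI

def authenticate(record, source):
    # Tag the captured instrument record with its source and a timestamp,
    # then compute an integrity tag over the canonical serialization.
    msg = {"source": source, "time": 1700000000, "data": record}
    payload = json.dumps(msg, sort_keys=True).encode()
    tag = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return payload, tag

def verify(payload, tag):
    # The inspectorate recomputes the tag; constant-time compare avoids leaks.
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

payload, tag = authenticate({"mass_kg": 12.7}, "scale-03")
print(verify(payload, tag))  # True: the branched record is intact
```

Any tampering with the branched payload after capture makes `verify` return False, which is the property the safeguards data stream relies on.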

  14. Positively Verifying Mating of Previously Unverifiable Flight Connectors

    NASA Technical Reports Server (NTRS)

    Pandipati R. K. Chetty

    2011-01-01

    Current practice is to uniquely key connectors whose mating, such as in explosive or non-explosive initiators and pyro valves, cannot be verified by ground tests. However, this practice does not assure 100-percent correct mating, and errors in mating interchangeable connectors can result in a degraded or failed space mission. This problem can be overcome by the following approach, which verifies electrically the mating of all flight connectors considered not verifiable via ground tests. It requires two additional wires routed through the connector of interest, a few resistors, and a voltage source. The test-point voltage V_tp when the connector is not mated is the same as the input voltage; when the female (F) and male (M) connectors are mated correctly and properly, the input is attenuated by the resistor R_1, and the voltage at the test point becomes a function of R_1 and R_2. Monitoring of the test point could be done on ground support equipment (GSE) only, or it can be a telemetry point. For implementation on multiple connector pairs, a different value of R_1, R_2, or both can be selected for each pair of connectors, resulting in a unique test-point voltage for each connector pair. Each test-point voltage is unique, and the correct test-point voltage is read only when the correct pair is mated correctly. Thus, this design approach can be used to positively verify correct mating of the connector pairs, and it can be applied to any number of connectors on the flight vehicle.

  15. Seismic isolation of nuclear power plants using sliding isolation bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

    Nuclear power plants (NPP) are designed for earthquake shaking with very long return periods. Seismic isolation is a viable strategy to protect NPPs from extreme earthquake shaking because it filters a significant fraction of earthquake input energy. This study addresses the seismic isolation of NPPs using sliding bearings, with a focus on the single concave Friction Pendulum(TM) (FP) bearing. Friction at the sliding surface of an FP bearing changes continuously during an earthquake as a function of sliding velocity, axial pressure and temperature at the sliding surface. The temperature at the sliding surface, in turn, is a function of the histories of coefficient of friction, sliding velocity and axial pressure, and the travel path of the slider. A simple model to describe the complex interdependence of the coefficient of friction, axial pressure, sliding velocity and temperature at the sliding surface is proposed, and then verified and validated. Seismic hazard for a seismically isolated nuclear power plant is defined in the United States using a uniform hazard response spectrum (UHRS) at mean annual frequencies of exceedance (MAFE) of 10^-4 and 10^-5. A key design parameter is the clearance to the hard stop (CHS), which is influenced substantially by the definition of the seismic hazard. Four alternate representations of seismic hazard are studied, which incorporate different variabilities and uncertainties. Response-history analyses performed on single FP-bearing isolation systems using ground motions consistent with the four representations at the two shaking levels indicate that the CHS is influenced primarily by whether the observed difference between the two horizontal components of ground motions in a given set is accounted for. The UHRS at the MAFE of 10^-4 is increased by a design factor (≥ 1) for conventional (fixed base) nuclear structure to achieve a target annual frequency of unacceptable performance. Risk oriented calculations are performed for
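A friction model of the kind described in this abstract couples velocity, pressure, and temperature dependence. The functional forms and constants below are illustrative assumptions in the spirit of published sliding-bearing models, not the dissertation's calibrated model.

```python
import math

def friction_coefficient(velocity, pressure, temperature,
                         mu_slow=0.04, mu_fast=0.10, rate=25.0,
                         p_ref=50e6, alpha=0.25, t_ref=20.0, beta=7e-4):
    # Velocity dependence: transitions from mu_slow to mu_fast as sliding speeds up
    mu = mu_fast - (mu_fast - mu_slow) * math.exp(-rate * abs(velocity))
    # Higher axial pressure reduces friction (illustrative power law)
    mu *= (p_ref / max(pressure, 1.0)) ** alpha
    # Frictional heating at the sliding surface also reduces friction
    mu *= max(0.5, 1.0 - beta * (temperature - t_ref))
    return mu

mu_cold = friction_coefficient(velocity=0.5, pressure=50e6, temperature=20.0)
mu_hot  = friction_coefficient(velocity=0.5, pressure=50e6, temperature=250.0)
print(mu_cold > mu_hot)  # True: heating during sliding lowers the coefficient
```

In a response-history analysis, this coefficient would be re-evaluated at every time step, with the surface temperature itself updated from the accumulated frictional work along the slider's travel path.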

  16. Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network

    NASA Astrophysics Data System (ADS)

    Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

    2004-12-01

    Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry-based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology-based earth sciences curriculum development process. We are developing a pedagogical framework, graduate-level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from hundreds of seismograph stations in Nevada, California and around the world (including live data from Earthworm systems and the IRIS DMC BUD - Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition

  17. Ringing load models verified against experiments

    SciTech Connect

    Krokstad, J.R.; Stansberg, C.T.

    1995-12-31

    What is believed to be the main reason for discrepancies between measured and simulated loads in previous studies has been assessed. We have focused on the balance between second- and third-order load components in relation to what is called the "fat body" load correction. It is important to understand that the use of Morison strip theory in combination with second-order wave theory gives rise to second- as well as third-order components in the horizontal force. A proper balance between second- and third-order components in the horizontal force is regarded as the most central requirement for a sufficiently accurate ringing load model in irregular seas. It is also verified that simulated second-order components are largely overpredicted in both regular and irregular seas. Nonslender diffraction effects are important to incorporate in the FNV formulation in order to reduce the simulated second-order component and to match experiments more closely. A sufficiently accurate ringing simulation model based on simplified methods is shown to be within close reach. Some further development and experimental verification must, however, be performed in order to take non-slender effects into account.

  18. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  19. Annual Hanford seismic report -- fiscal year 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1996-12-01

    Seismic monitoring (SM) at the Hanford Site was established in 1969 by the US Geological Survey (USGS) under a contract with the US Atomic Energy Commission. Since 1980, the program has been managed by several contractors under the US Department of Energy (USDOE). Effective October 1, 1996, the Seismic Monitoring workscope, personnel, and associated contracts were transferred to the USDOE Pacific Northwest National Laboratory (PNNL). SM is tasked to provide an uninterrupted collection and archive of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) located on and encircling the Hanford Site. SM is also tasked to locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data compiled are used by SM, Waste Management, and engineering activities at the Hanford Site to evaluate seismic hazards and seismic design for the Site.

  20. Seismic bearing

    NASA Astrophysics Data System (ADS)

    Power, Dennis

    2009-05-01

    Textron Systems (Textron) has been using geophones for target detection for many years. This sensing capability was previously utilized for detection and classification purposes only. Recently Textron has been evaluating multiaxis geophones to calculate bearings and track targets, more specifically personnel. This capability not only aids the system in locating personnel in bearing space or Cartesian space but also enhances detection and reduces false alarms. Textron has been involved in the testing and evaluation of several sensors at multiple sites. One of the challenges of calculating seismic bearing is obtaining an adequate signal-to-noise ratio. The sensor signal-to-noise ratio is a function of sensor coupling to the ground, seismic propagation, and range to target. The goals of testing at multiple sites are to gain a good understanding of the maximum and minimum ranges for bearing and detection and to exploit that information to tailor sensor system emplacement to achieve the desired performance. Test sites include 10A Site, Devens, MA; McKenna Airfield, Ft. Benning, GA; and Yuma Proving Ground, Yuma, AZ. Geophone sensors evaluated include a 28 Hz triax spike, a 15 Hz triax spike, and a hybrid triax spike consisting of a 10 Hz vertical geophone and two 28 Hz horizontal geophones. The algorithm uses raw seismic data to calculate the bearings. All evaluated sensors have a triaxial geophone configuration mounted to a spike housing/fixture. The suite of sensors also compares various types of geophones to evaluate the benefits of lower bandwidth. The data products of these tests include raw geophone signals, seismic features, seismic bearings, seismic detections, and GPS position truth data. The analyses produce Probability of Detection vs. range, bearing accuracy vs. range, and seismic feature level vs. range. These analysis products are compared across test sites and sensor types.
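
    The abstract does not disclose Textron's bearing algorithm; a common textbook approach for a triaxial geophone is polarization analysis of a signal window. A minimal sketch under that assumption (the function name and the eigenvector sign convention are illustrative, not Textron's method):

```python
import numpy as np

def bearing_from_triax(n, e, z):
    """Estimate source back-azimuth (degrees from north) from a window of
    north/east/vertical geophone samples via polarization analysis: the
    dominant eigenvector of the sample covariance matrix gives the
    particle-motion direction of a rectilinear (P-wave) arrival."""
    data = np.vstack([n, e, z])          # 3 x nsamples
    cov = np.cov(data)                   # 3 x 3 covariance matrix
    _, v = np.linalg.eigh(cov)           # eigenvalues in ascending order
    p = v[:, -1]                         # principal polarization vector
    # Resolve the 180-degree ambiguity: take the upward-pointing sense,
    # consistent with an incoming compressional arrival.
    if p[2] < 0:
        p = -p
    az = np.degrees(np.arctan2(p[1], p[0]))  # atan2(east, north)
    return az % 360.0
```

    In practice the window would be chosen around a detected arrival, and the signal-to-noise caveats in the abstract apply directly: a weak or poorly coupled signal flattens the covariance matrix and degrades the eigenvector estimate.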

  1. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    SciTech Connect

    I. Wong

    2004-11-05

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M&O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes.
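
    The report's RVT equivalent-linear model is project-specific, but the site-response effect it quantifies can be illustrated with the textbook transfer function for one uniform soil layer over an elastic halfspace (undamped SH waves; this is a simplification of the report's method, and the parameter values in the usage note are illustrative, not Yucca Mountain properties):

```python
import numpy as np

def layer_amplification(f, h, vs_soil, rho_soil, vs_rock, rho_rock):
    """Surface amplification of vertically propagating SH waves for a
    single uniform soil layer of thickness h over an elastic halfspace,
    neglecting damping: |F| = 1 / sqrt(cos^2(k) + (alpha*sin(k))^2),
    with k = 2*pi*f*h/Vs and alpha the soil/rock impedance ratio."""
    k = 2.0 * np.pi * f * h / vs_soil
    alpha = (rho_soil * vs_soil) / (rho_rock * vs_rock)  # impedance ratio
    return 1.0 / np.sqrt(np.cos(k) ** 2 + (alpha * np.sin(k) ** 2 * alpha / alpha) ** 2)
```

    At the fundamental site frequency f0 = Vs/(4h) the undamped amplification peaks at 1/alpha, which is why low-impedance cover above the reference rock outcrop (Point A) changes the design motions.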

  2. New sensitive seismic cable with imbedded geophones

    NASA Astrophysics Data System (ADS)

    Pakhomov, Alex; Pisano, Dan; Goldburt, Tim

    2005-10-01

    Seismic detection systems for homeland security applications are an important additional layer to perimeter and border protection and other security systems. General Sensing Systems (GSS) has been developing low-mass, low-cost, highly sensitive geophones. These geophones are being incorporated within a seismic cable. This article reports on the concept of a seismic-sensitive cable and seismic-sensitive ribbon design. Unlike existing seismic cables with sensitivity distributed along their lengths, the new GSS cable and ribbon possess high sensitivity concentrated at several points along the cable/ribbon, with spacing of about 8-12 to 100 meters between geophones. This cable/ribbon is highly suitable for design and installation in extended perimeter protection systems. It allows the use of a mechanical cable layer for high-speed installation. We show that any installation mistakes in using the GSS seismic-sensitive cable/ribbon have low impact on the output seismic signal value and detection range of security systems.

  3. 76 FR 1620 - Trials to Verify and Describe Clinical Benefit of Midodrine Hydrochloride; Establishment of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ..., we are placing in the docket a brief description of a recommended clinical trial design. We are also... HUMAN SERVICES Food and Drug Administration Trials to Verify and Describe Clinical Benefit of Midodrine... to facilitate communication regarding the conduct of clinical trials needed to verify and...

  4. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    NASA Technical Reports Server (NTRS)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The ability to write verifiable programs in HAL/S, a characteristic highly desirable in aerospace applications, is limited because many features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  5. Seismic Data Gathering and Validation

    SciTech Connect

    Coleman, Justin

    2015-02-01

    Three earthquakes in the last seven years have exceeded the design basis earthquake values of the affected plants (implying that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011; detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007 [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety-class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using those data for two purposes. One purpose is to determine what margins exist in current industry-standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site response tools and SSI tools. The gathered data comprise free-field soil and in-structure acceleration time histories, along with elastic and dynamic soil properties and structural drawings. Gathering data and comparing them with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), decisions on needed protective measures, such as SI, can be made.

  6. Seismic component fragility data base for IPEEE

    SciTech Connect

    Bandyopadhyay, K.; Hofmayer, C.

    1990-01-01

    Seismic probabilistic risk assessment or a seismic margin study will require a reliable data base of seismic fragility of various equipment classes. Brookhaven National Laboratory (BNL) has selected a group of equipment and generically evaluated the seismic fragility of each equipment class by use of existing test data. This paper briefly discusses the evaluation methodology and the fragility results. The fragility analysis results when used in the Individual Plant Examination for External Events (IPEEE) Program for nuclear power plants are expected to provide insights into seismic vulnerabilities of equipment for earthquakes beyond the design basis. 3 refs., 1 fig., 1 tab.
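
    The BNL fragility results themselves are not reproduced here, but in seismic PRA and margin studies equipment fragility is conventionally modeled as a lognormal curve, and a margin is often summarized by an HCLPF capacity. A minimal sketch (the 1%-composite-probability HCLPF convention used below is one common choice, and the numeric values in the usage note are illustrative, not BNL data):

```python
import math

def fragility(a, a_m, beta_c):
    """Conditional probability of failure at ground-motion level `a` for a
    lognormal fragility with median capacity `a_m` and composite
    logarithmic standard deviation `beta_c` (standard normal CDF of the
    normalized log demand/capacity ratio)."""
    return 0.5 * (1.0 + math.erf(math.log(a / a_m) / (beta_c * math.sqrt(2.0))))

def hclpf(a_m, beta_c):
    """High Confidence of Low Probability of Failure capacity, taken here
    as the ~1% point of the composite fragility: a_m * exp(-2.326*beta_c)."""
    return a_m * math.exp(-2.326 * beta_c)
```

    For example, a component with a median capacity of 1.2 g and beta_c = 0.4 has an HCLPF near 0.47 g, i.e. well above typical design basis levels, which is the kind of insight the IPEEE evaluations aim to provide.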

  7. Seismic monitoring of Poland - temporary seismic project - first results

    NASA Astrophysics Data System (ADS)

    Trojanowski, J.; Plesiewicz, B.; Wiszniowski, J.; Suchcicki, J.; Tokarz, A.

    2012-04-01

    The aim of the project is to develop a national database of seismic activity for seismic hazard assessment. Poland is known as a region of very low seismicity; however, some earthquakes occur from time to time. The historical catalogue consists of fewer than one hundred earthquakes over a time span of almost one thousand years. Due to such a low occurrence rate, the study has focused on events of magnitude lower than 2, which are more likely to occur during a few-year-long project. There are 24 mobile seismic stations involved in the project, deployed in temporary locations close to human neighbourhoods. This causes a high level of noise and disturbances in the recorded seismic signal. Moreover, the majority of Polish territory is covered by thick sediments. This raises the problem of reliably detecting small seismic events in noisy data. The majority of detection algorithms are based on the concept of the STA/LTA ratio and are designed for strong teleseismic events registered on many stations. Unfortunately, they fail for weak events in signals with noise and disturbances. It was therefore decided to apply a Real Time Recurrent Neural Network (RTRN) to detect small natural seismic events in Poland. This method is able to assess relations in the seismic signal in the frequency domain as well as in the timing of seismic phases. The RTRN was trained on a wide range of seismic signals - regional and teleseismic events as well as blasts. The method is routinely used to analyse data from the project. In the first two years of the project the seismic network was set up in southern Poland, where relatively high seismicity is known. Since mid-2010 the stations have been working in several regions of central and northern Poland where some minor historical earthquakes occurred. Over one hundred seismic events in the magnitude range 0.5 to 2.3 confirm the activity of the Podhale region (Tatra Mountains, Carpathians), where an earthquake of magnitude 4.3 occurred in 2004. 
Initially three
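
    For reference, the classical STA/LTA trigger that the abstract contrasts with the RTRN approach can be sketched in a few lines (window lengths below are typical choices, not the project's settings):

```python
import numpy as np

def sta_lta(x, fs, sta_win=0.5, lta_win=10.0):
    """Classic STA/LTA characteristic function: the ratio of a short-term
    average of signal energy to a long-term average, computed with cumulative
    sums. Returns one ratio per sample once both windows are full."""
    nsta = int(sta_win * fs)
    nlta = int(lta_win * fs)
    e = np.asarray(x, dtype=float) ** 2
    csum = np.cumsum(np.concatenate(([0.0], e)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))
    # Align both averages so each ratio compares windows ending at the
    # same sample of the trace.
    return sta[-n:] / np.maximum(lta[-n:], 1e-20)
```

    An event is declared when the ratio crosses a threshold (commonly 3-5). The abstract's point is precisely that for weak events in disturbed, noisy signals this ratio often fails to separate signal from noise, motivating the RTRN detector.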

  8. Seismic Tomography.

    ERIC Educational Resources Information Center

    Anderson, Don L.; Dziewonski, Adam M.

    1984-01-01

    Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

  9. Seismic Symphonies

    NASA Astrophysics Data System (ADS)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

    The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder, and a seismologist. The work differs from other attempts at sound transposition of seismic records. In this case seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system was devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper. Adapting the documents to the instrument score, holes have been drilled at the waves' peaks. The organ covers about three tonal scales; starting from heavy, deep sounds, it reaches up to high, jarring notes. The translation of the seismic records is based on a criterion that matches the highest sounds to the largest amplitudes and lower sounds to smaller ones. In translating the seismogram into the organ score, the larger the amplitude of the recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ and the more intense the emotional response the notes arouse in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters of over a century of the Earth's seismic history can be revived. A bridge between art and science. Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, and its sounds are derived from the heavens and

  10. Static behaviour of induced seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, Arnaud

    2016-04-01

    The standard paradigm to describe seismicity induced by fluid injection is to apply non-linear diffusion dynamics in a poroelastic medium. I show that the spatio-temporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the geometrical description of the spatio-temporal patterns of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope and provides an alternative physical approach to poroelasticity for the modelling of induced seismicity. The applicability of the proposed geometrical approach is illustrated for the case of the 2006 Basel enhanced geothermal system stimulation experiment. Applicability to more problematic cases where the stress field may be spatially heterogeneous is also discussed.

  11. Designing a low-cost effective network for monitoring large scale regional seismicity in a soft-soil region (Alsace, France)

    NASA Astrophysics Data System (ADS)

    Bès de Berc, M.; Doubre, C.; Wodling, H.; Jund, H.; Hernandez, A.; Blumentritt, H.

    2015-12-01

    The Seismological Observatory of the North-East of France (ObSNEF) is developing its monitoring network within the framework of several projects. Among these projects, RESIF (Réseau sismologique et géodésique français) provides for the instrumentation of broad-band seismic stations separated by 50-100 km. With the recent and future development of geothermal industrial projects in the Alsace region, the ObSNEF is responsible for designing, building, and operating a dense regional seismic network able to detect and localize earthquakes with a completeness magnitude of 1.5 and no clipping for M6.0. The project has to be realized before summer 2016. Several complex technical and financial constraints shape such a project. First, most of the Alsace region (150x150 km2), particularly the whole Upper Rhine Graben, is a soft-soil plain where seismic signals are dominated by high-frequency noise. Second, all signals have to be transmitted in near real time. Finally, the total cost of the project must not exceed $450,000. Regarding the noise level in Alsace, in order to achieve a reduction of 40 dB for frequencies above 1 Hz, we plan to instrument 5 of the 8 planned new stations with post-hole sensors in 50 m deep wells. The 3 remaining stations would be located on bedrock along the Vosges piedmont. In order to be sensitive to low-magnitude regional events, we plan to install low-noise short-period post-hole velocimeters. In order to avoid saturation for strong potential local events (M6.0 at 10 km), each velocimeter will be coupled with a surface strong-motion sensor. Regarding connectivity, these stations will have no wired network, which reduces linking costs and delays; we will therefore use solar panels and a 3G/GPRS network. The infrastructure will be minimal, reduced to an outdoor box on a secured parcel of land. In addition to the data-logger, we will use a 12 V ruggedized computer, hosting a SeedLink server for near

  12. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

    This work illustrates the advantages of using a Geographic Information System (GIS) in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. As the input data come in different formats, cover distinct geographical areas, and are subject to different interpretations, data inconsistencies may appear and their management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with the ISO 19100 norms and allows the designer to define the model architecture and interoperability. The GIS provides a framework for combining large volumes of geographically based data, with a uniform geographic reference and without duplications. All this information carries its own metadata following the ISO 19115 normative. In this work, the integration in the same environment of active fault and subduction slab geometries, combined with epicentre locations, has facilitated the definition of seismogenic regions. This is a great support for national specialists from different countries, making their teamwork easier. The GIS capacity for queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution, and results dissemination.

  13. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    NASA Astrophysics Data System (ADS)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) use of a scenario-based neo-deterministic approach (NDSHA) for the calculation of the seismic input, and b) control of the numerical model of an existing building using free-vibration measurements of the real structure. The key point of this approach is the strict collaboration of the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the response of the building in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then the realistic values of spectral acceleration, which include the appropriate amplification obtained through the modeling of a "scenario" input applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the spectra of the national codes (i.e. NTC 2008, for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g. schools) of the Trieste Province. The adoption of the scenario input has in most cases increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase in elements to reinforce is reasonable, especially considering the important reduction of the risk level.
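
    Once the fundamental period is fixed by the vibrometric calibration, selecting the design spectral acceleration reduces to reading the scenario and code spectra at that period and taking the conservative envelope. A toy sketch with invented ordinates (none of these numbers come from NTC 2008 or the Trieste cases):

```python
import numpy as np

# Hypothetical spectral ordinates (period in s, Sa in g), for illustration only.
periods = np.array([0.0, 0.1, 0.3, 0.5, 1.0, 2.0])
sa_scenario = np.array([0.35, 0.80, 0.95, 0.70, 0.40, 0.18])
sa_code = np.array([0.30, 0.70, 0.75, 0.60, 0.35, 0.15])

t_fund = 0.42  # fundamental period identified from free-vibration measurements

sa_s = np.interp(t_fund, periods, sa_scenario)  # scenario spectrum at T
sa_c = np.interp(t_fund, periods, sa_code)      # code spectrum at T
# Conservative choice: verify the structure against the envelope of the two.
sa_design = max(sa_s, sa_c)
```

    Here the scenario ordinate governs, mirroring the abstract's observation that scenario spectra generally, though not necessarily, exceed the code values.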

  14. iMUSH: The design of the Mount St. Helens high-resolution active source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, Eric; Levander, Alan; Harder, Steve; Abers, Geoff; Creager, Ken; Vidale, John; Moran, Seth; Malone, Steve

    2013-04-01

    Mount St. Helens is one of the most societally relevant and geologically interesting volcanoes in the United States. Although much has been learned about the shallow structure of this volcano since its eruption in 1980, important questions still remain regarding its magmatic system and connectivity to the rest of the Cascadia arc. For example, the structure of the magma plumbing system below the shallowest magma chamber under the volcano is still only poorly known. This information will be useful for hazard assessment for the southwest Washington area, and also for gaining insight into fundamental scientific questions such as the assimilation and differentiation processes that lead to the formation of continental crust. As part of the multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment, funded by NSF GeoPRISMS and EarthScope, an active source seismic experiment will be conducted in late summer 2014. The experiment will utilize all of the 2600 IRIS/PASSCAL/USArray Texan instruments. The instruments will be deployed as two 1000-instrument consecutive refraction profiles (one N/S and one WNW/ESE). Each of these profiles will be accompanied by two 1600-instrument areal arrays at varying distances from Mount St. Helens. Finally, one 2600-instrument areal array will be centered on Mount St. Helens. These instruments will record a total of twenty-four 500-1000 kg shots. Each refraction profile will have an average station spacing of 150 m, and a total length of 150 km. The stations in the areal arrays will be separated by ~1 km. A critical step in the success of this project is to develop an experimental setup that can resolve the most interesting aspects of the magmatic system. In particular, we want to determine the distribution of shot locations that will provide good coverage throughout the entire model space, while still allowing us to focus on regions likely to contain the magmatic plumbing system. In this study, we approach this problem by

  15. Advanced Seismic While Drilling System

    SciTech Connect

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical sources, hydraulic sources, air guns, and explosives, by their very nature produce high frequencies. This is counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER(TM) methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock, through the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to achieve the range needed for deep (15,000+ ft), high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker so that it could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER(TM) drill string tool was designed and manufactured by TII

  16. Ground Motion Simulations for Bursa Region (Turkey) Using Input Parameters derived from the Regional Seismic Network

    NASA Astrophysics Data System (ADS)

    Unal, B.; Askan, A.

    2014-12-01

    Earthquakes are among the most destructive natural disasters in Turkey, and it is important to assess seismicity in different regions with the use of seismic networks. Bursa is located in the Marmara Region, Northwestern Turkey, to the south of the very active North Anatolian Fault Zone. With around three million inhabitants and key industrial facilities of the country, Bursa is the fourth largest city in Turkey. Because most attention has focused on the North Anatolian Fault Zone, the Bursa area, despite its significant seismicity, has not been investigated extensively until recently. For reliable seismic hazard estimation and seismic design of structures, assessment of potential ground motions in this region is essential, using both recorded and simulated data. In this study, we employ stochastic finite-fault simulation with the dynamic corner frequency approach to model previous events as well as to assess potential earthquakes in Bursa. To ensure simulations with reliable synthetic ground motion outputs, the input parameters must be carefully derived from regional data. In this study, using strong motion data collected at 33 stations in the region, site-specific parameters such as the near-surface high-frequency attenuation parameter and amplifications are obtained. Similarly, source and path parameters are adopted from previous studies that also employ regional data. Initially, major previous events in the region are verified by comparing the records with the corresponding synthetics. Then simulations of scenario events in the region are performed. We present the results in terms of the spatial distribution of peak ground motion parameters and time histories at selected locations.
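
    The stochastic finite-fault method builds on the point-source omega-squared spectral model, with path attenuation and the near-surface kappa parameter the abstract mentions. A schematic single-corner sketch (the lumped constant `c` and the default parameter values are illustrative, not the Bursa calibration):

```python
import numpy as np

def fas_point_source(f, m0, fc, r, kappa=0.04, q=600.0, beta=3.5, c=1.0):
    """Fourier acceleration spectrum of an omega-squared point source with
    anelastic path attenuation and near-surface kappa filtering, in the
    stochastic-method form. `c` lumps radiation pattern, free-surface and
    partition constants; m0 is seismic moment, fc the corner frequency,
    r the hypocentral distance."""
    source = c * m0 * (2.0 * np.pi * f) ** 2 / (1.0 + (f / fc) ** 2)
    path = np.exp(-np.pi * f * r / (q * beta)) / r   # anelastic + geometric
    site = np.exp(-np.pi * kappa * f)                # near-surface kappa
    return source * path * site
```

    Below the corner frequency the acceleration spectrum grows as f^2, and at high frequencies the regionally derived kappa controls the decay, which is why the abstract stresses estimating kappa from the 33 local stations.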

  17. Using Theorem Proving to Verify Properties of Agent Programs

    NASA Astrophysics Data System (ADS)

    Alechina, N.; Dastani, M.; Khan, F.; Logan, B.; Meyer, J.-J. Ch.

    We present a sound and complete logic for automatic verification of simpleAPL programs. simpleAPL is a simplified version of agent programming languages such as 3APL and 2APL designed for the implementation of cognitive agents with beliefs, goals and plans. Our logic is a variant of PDL, and allows the specification of safety and liveness properties of agent programs. We prove a correspondence between the operational semantics of simpleAPL and the models of the logic for two example program execution strategies. We show how to translate agent programs written in simpleAPL into expressions of the logic, and give an example in which we show how to verify correctness properties for a simple agent program using theorem-proving.

  18. An arbitrated quantum signature scheme with fast signing and verifying

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Qin, Su-Juan; Su, Qi

    2013-11-01

    Existing arbitrated quantum signature (AQS) schemes are almost all based on the Leung quantum one-time pad (L-QOTP) algorithm. In these schemes, the receiver can achieve an existential forgery of the sender's signatures under the known message attack, and the sender can successfully disavow any of her/his signatures by a simple attack. In this paper, a solution to these problems is given by designing a new QOTP algorithm that relies largely on inserting decoy states into fixed insertion positions. Furthermore, we present an AQS scheme with fast signing and verifying, which is based on the new QOTP algorithm. It uses only single-particle states and is unconditionally secure. To fulfill the functions of AQS schemes, our scheme requires significantly lower computational cost than other AQS schemes based on the L-QOTP algorithm.

  19. Onshore seismic amplifications due to bathymetric features

    NASA Astrophysics Data System (ADS)

    Rodríguez-Castellanos, A.; Carbajal-Romero, M.; Flores-Guzmán, N.; Olivera-Villaseñor, E.; Kryvko, A.

    2016-08-01

    We perform numerical calculations for onshore seismic amplifications, taking into consideration the effect of bathymetric features on the propagation of seismic movements. To this end, the boundary element method is applied. Boundary elements are employed to irradiate waves and, consequently, force densities can be obtained for each boundary element. From this assumption, Huygens' principle is applied, and since the diffracted waves are built at the boundary from which they are radiated, this idea is equivalent to Somigliana's representation theorem. The application of boundary conditions leads to a linear system of Fredholm integral equations. Several numerical models are analyzed, with the first one being used to verify the proposed formulation, and the others being used to estimate onshore seismic amplifications due to the presence of bathymetric features. The results obtained show that compressional waves (P-waves) generate onshore seismic amplifications that can vary from 1.2 to 5.2 times the amplitude of the incident wave. On the other hand, shear waves (S-waves) can cause seismic amplifications of up to 4.0 times the incident wave. Furthermore, an important result is that in most cases the highest seismic amplifications from an offshore earthquake are located on the shoreline and not offshore, irrespective of the seafloor configuration. Moreover, the influence of the incident angle of seismic waves on the seismic amplifications is highlighted.

  20. Seismic monitoring of rockfalls at Spitz quarry (NÖ, Austria)

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald; Roncat, Andreas; Schweigl, Joachim

    2016-04-01

    In the recent past, significant rockfalls, which pose a danger to persons, railways and roads, occurred in the quarry of Spitz (NÖ, Austria). An existing seismic warning system did not fulfill the expected efficiency and reliability standards, since the ratio of well-detected events to undetected events or false alarms was not satisfactory. Our aim was to analyze how a seismic warning system must be designed in order to overcome these deficiencies. A small-scale seismic network was deployed in the Spitz quarry to evaluate the possibility of improving the early-warning rockfall monitoring network by means of seismic observations. A new methodology based on seismic methods, which enables the detection and location of rockfalls above a critical size, was developed. In order to perform this task, a small-scale (200 m × 200 m) passive seismic network comprising 7 monitoring seismic stations acquiring data in continuous mode was established in the quarry of Spitz so that it covered the rockfall hazard area. On the 2nd of October 2015, an induced rockfall experiment was performed. It began at 09:00 a.m. local time (07:00 UTC) and lasted about 1.5 hours. The entire data set was analyzed using the pSysmon software. In order to locate the impact point of the rockfalls, we used a procedure based on the back-projection of the maximum resultant amplitude recorded at each station of the network within a time window onto every grid point covering the whole area of interest. In order to verify the performance of the employed algorithm for detection and localization, we performed artificially induced rockfalls. We also used a terrestrial laser scanner and a camera, not only to map the rockfall block trajectories, but also to determine the volume of rock lost or gained in different areas of the quarry. This allowed us to relate the lost mass to the strength of the collision (pseudo-magnitude) of the rockfall and to reconstruct the associated trajectories.
The location test performed
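
    The grid back-projection idea described above can be illustrated with a simplified amplitude-based locator: assuming amplitudes decay roughly as 1/r, the grid node where the distance-corrected station amplitudes agree best is taken as the impact point. This is a hypothetical sketch under that assumption, not the pSysmon implementation.

```python
import numpy as np

def locate_by_amplitude(stations, amps, grid_x, grid_y):
    """Amplitude back-projection: assuming a_i ~ A0 / r_i, pick the grid
    node where the distance-corrected amplitudes are most consistent."""
    best, best_score = None, np.inf
    for gx in grid_x:
        for gy in grid_y:
            r = np.hypot(stations[:, 0] - gx, stations[:, 1] - gy)
            a0 = amps * np.maximum(r, 1.0)    # undo 1/r geometric spreading
            score = np.std(a0) / np.mean(a0)  # coefficient of variation
            if score < best_score:
                best, best_score = (gx, gy), score
    return best
```

For a synthetic source obeying the 1/r decay exactly, the corrected amplitudes agree perfectly at the true node, so the coefficient of variation vanishes there.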

  1. On the distribution of seismic reflection coefficients and seismic amplitudes

    SciTech Connect

    Painter, S.; Paterson, L.; Beresford, G.

    1995-07-01

    Reflection coefficient sequences from 14 wells in Australia have a statistical character consistent with a non-Gaussian scaling noise model based on the Levy-stable family of probability distributions. Experimental histograms of reflection coefficients are accurately approximated by symmetric Levy-stable probability density functions with Levy index between 0.99 and 1.43. These distributions have the same canonical role in mathematical statistics as the Gaussian distribution, but they have slowly decaying tails and infinite moments. The distribution of reflection coefficients is independent of the spatial scale (statistically self-similar), and the reflection coefficient sequences have long-range dependence. These results suggest that the logarithm of seismic impedance can be modeled accurately using fractional Levy motion, which is a generalization of fractional Brownian motion. Synthetic seismograms produced from the authors' model for the reflection coefficients also have Levy-stable distributions. These simulations include transmission losses, the effects of reverberations, and the loss of resolution caused by band-limited wavelets, and suggest that actual seismic amplitudes with sufficient signal-to-noise ratio should also have a Levy-stable distribution. This prediction is verified using poststack seismic data acquired in the Timor Sea and in the continental USA. However, prestack seismic amplitudes from the Timor Sea are nearly Gaussian. The authors attribute the difference between prestack and poststack data to the high level of measurement noise in the prestack data.
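
    Symmetric Levy-stable variates of the kind used to model reflection coefficients can be drawn with the Chambers-Mallows-Stuck method; a small sketch follows, where the stability index alpha and the sample size are illustrative rather than fitted values from the study.

```python
import numpy as np

def sym_stable(alpha, size, rng=None):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates
    (skewness beta = 0); alpha = 1 gives a Cauchy, alpha = 2 a Gaussian
    (up to scale)."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))
```

For alpha in the reported 0.99-1.43 range, samples show the slowly decaying tails that distinguish these distributions from Gaussian noise.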

  2. Cross well seismic reservoir characterization

    SciTech Connect

    Sheline, H.E.

    1995-08-01

    A striking example of how Cross Well Seismic reflection data can help characterize a reservoir has resulted from an ongoing multi-discipline study of the carbonate Mishrif reservoir offshore Dubai, U.A.E. Because the study objectives include a more detailed description of intra-reservoir structure and layering, Dubai Petroleum Company (DPC) analyzed the feasibility of Cross Well Seismic (CWS) and decided to acquire two surveys between three wells 337 to 523 feet apart. DPC has concluded that CWS can be cost-effectively acquired offshore in a carbonate reservoir, as well as processed and interpreted. However, it is generally not easy to acquire cross well seismic when and where it will be most useful. A CWS survey can provide multiple images such as a velocity tomogram, P-wave reflections, and S-wave reflections. To date, tomograms and P-wave reflections have been produced, and the reflection data have proven to be the most useful for reservoir characterization. Cross Well Seismic reflection data have provided a level of vertical seismic reflection resolution of around 2 feet, which is more than 10 times better than surface seismic data (2D or 3D). The increase in vertical resolution has provided important detailed information about the reservoir: its continuity/heterogeneity; its detailed structure, stratigraphy and layering; and definition of any faults with more than 2 feet of offset. The CWS has shown detailed intra-Mishrif reflectors. These reflectors have verified or changed detailed correlations between well bores, and show significant intra-Mishrif thinning. These reflectors imply time-stratigraphic layering which is consistent with tracer study results and regional sequence stratigraphy. This new data will be used to improve the reservoir model description.

  3. Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP

    USGS Publications Warehouse

    Huang, M.J.; Shakal, A.F.

    2009-01-01

    The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway and another one is in planning. Two of the port structures have been seismically strengthened. The primary goals of the strong motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise they cannot be installed after construction. This paper summarizes the two seismically-strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

  4. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Justin Coleman; Piyush Sabharwall

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (capacities of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of the seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore, opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed, in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for Light Water Reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty with implementing the procedures outlined in ASCE-4. Opportunity exists to determine barriers associated with implementation of the current ASCE-4 standard language.
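
    The convolution of hazard with fragility mentioned above reduces, in discrete form, to an annual failure probability P_f = Σ F(a) ΔH(a), where H(a) is the annual exceedance rate and F(a) the conditional failure probability. A minimal numeric sketch, with assumed (purely illustrative) hazard and fragility curves:

```python
import math

def seismic_risk(hazard, fragility, a_grid):
    """Annual failure probability from convolving a hazard curve H(a)
    (annual exceedance) with a fragility F(a) = P[failure | a]."""
    risk = 0.0
    for a0, a1 in zip(a_grid, a_grid[1:]):
        dH = hazard(a0) - hazard(a1)          # hazard mass in this acceleration bin
        risk += fragility(0.5 * (a0 + a1)) * dH
    return risk

# Illustrative (assumed) curves: power-law hazard, lognormal fragility.
hazard = lambda a: 1e-3 * (0.1 / a) ** 2
fragility = lambda a: 0.5 * (1 + math.erf(math.log(a / 0.5) / (0.4 * math.sqrt(2))))
```

Because F(a) is bounded by 1, the computed risk can never exceed the exceedance rate at the lowest acceleration considered, which is a useful sanity check on such integrations.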

  5. Infrasound Generation from the HH Seismic Hammer.

    SciTech Connect

    Jones, Kyle Richard

    2014-10-01

    The HH Seismic hammer is a large "weight-drop" source for active-source seismic experiments. This system provides a repetitive source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by the 13-metric-ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation, and surface-air coupling studies for source characterization.

  6. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  7. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  8. Seismic risk perception in Italy

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Peruzza, Laura

    2014-05-01

    Risk perception is a fundamental element in the definition and the adoption of preventive counter-measures. In order to develop effective information and risk communication strategies, the perception of risks and the influencing factors should be known. This paper presents results of a survey on seismic risk perception in Italy conducted from January 2013 onward. The research design combines a psychometric and a cultural-theoretic approach. More than 7,000 online tests have been completed. The data collected show that in Italy seismic risk perception is strongly underestimated; 86 out of 100 Italian citizens living in the most dangerous zone (namely Zone 1) do not have a correct perception of seismic hazard. From these observations we deem that extremely urgent measures are required in Italy to communicate seismic risk effectively. Finally, the research presents a comparison between groups on seismic risk perception: a group involved in campaigns of information and education on seismic risk, and a control group.

  9. Application of bounding spectra to seismic design of piping based on the performance of above ground piping in power plants subjected to strong motion earthquakes

    SciTech Connect

    Stevenson, J.D.

    1995-02-01

    This report extends the potential application of Bounding Spectra evaluation procedures, developed as part of the A-46 Unresolved Safety Issue applicable to seismic verification of in-situ electrical and mechanical equipment, to in-situ safety-related piping in nuclear power plants. The report presents a summary of earthquake experience data which define the behavior of typical U.S. power plant piping subject to strong motion earthquakes. The report defines the piping system caveats which would assure the seismic adequacy of piping systems that meet those caveats and whose seismic demands are within the bounding spectra input. Based on the observed behavior of piping in strong motion earthquakes, the report describes the capability of the piping system to carry seismic loads as a function of the type of connection (i.e., threaded versus welded). This report also discusses in some detail the basic causes and mechanisms of earthquake damage and failures in power plant piping systems.

  10. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  11. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  12. 2008 United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; and others

    2008-01-01

    The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.

  13. Verifying the Dependence of Fractal Coefficients on Different Spatial Distributions

    SciTech Connect

    Gospodinov, Dragomir; Marekova, Elisaveta; Marinov, Alexander

    2010-01-21

    A fractal distribution requires that the number of objects larger than a specific size r has a power-law dependence on the size: N(r) = C/r^D ∝ r^(-D), where D is the fractal dimension. Usually the correlation integral is calculated to estimate the correlation fractal dimension of epicentres. A 'box-counting' procedure can also be applied, giving the 'capacity' fractal dimension. The fractal dimension can be an integer, in which case it is equivalent to a Euclidean dimension (zero for a point, one for a segment, two for a square, and three for a cube). In general, the fractal dimension is not an integer but a fractional dimension, which is the origin of the term 'fractal'. The use of a power law to statistically describe a set of events or phenomena reveals the lack of a characteristic length scale; that is, fractal objects are scale-invariant. Scale invariance and chaotic behavior underlie many natural hazard phenomena. Many studies of earthquakes reveal that their occurrence exhibits scale-invariant properties, so the fractal dimension can characterize them. It was first confirmed that both aftershock rate decay in time and earthquake size distribution follow a power law. Recently many other earthquake distributions have been found to be scale-invariant. The spatial distributions of both regional seismicity and aftershocks show some fractal features. Earthquake spatial distributions are considered fractal, but indirectly. There are two possible models which result in fractal earthquake distributions. The first model considers that a fractal distribution of faults leads to a fractal distribution of earthquakes, because each earthquake is characteristic of the fault on which it occurs. The second assumes that each fault has a fractal distribution of earthquakes. Observations strongly favour the first hypothesis. The fractal coefficients analysis provides some important advantages in examining earthquake spatial
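
    The 'box-counting' estimate of the capacity dimension mentioned above counts the occupied boxes N(r) at several box sizes r and fits the slope of log N(r) versus log(1/r). A small sketch of that procedure (box sizes and data are illustrative):

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the capacity fractal dimension of a point set as the slope
    of log N(r) versus log(1/r), where N(r) counts occupied boxes of side r."""
    counts = []
    for r in sizes:
        boxes = {tuple(np.floor(p / r).astype(int)) for p in points}
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```

Applied to epicentre coordinates, a slope well below the embedding dimension indicates clustering on a fractal subset rather than space-filling seismicity.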

  14. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  15. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION...; contents. A verified statement should contain all the facts upon which the witness relies, and to...

  16. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Verifying your identity. 401.45 Section 401.45 Employees' Benefits SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF OFFICIAL RECORDS AND INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a request for notification of or access to...

  17. 34 CFR 668.56 - Items to be verified.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Items to be verified. 668.56 Section 668.56 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION... Information § 668.56 Items to be verified. (a) Except as provided in paragraphs (b), (c), (d), and (e) of...

  18. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Verifying your identity. 802.13 Section 802.13 Judicial Administration COURT SERVICES AND OFFENDER SUPERVISION AGENCY FOR THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your...

  19. 26 CFR 301.6334-4 - Verified statements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Verified statements. 301.6334-4 Section 301.6334-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURE AND ADMINISTRATION PROCEDURE AND ADMINISTRATION Seizure of Property for Collection of Taxes § 301.6334-4 Verified statements. (a) In general. For...

  20. 26 CFR 301.6334-4 - Verified statements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 18 2011-04-01 2011-04-01 false Verified statements. 301.6334-4 Section 301.6334-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURE AND ADMINISTRATION PROCEDURE AND ADMINISTRATION Seizure of Property for Collection of Taxes § 301.6334-4 Verified statements. (a) In general. For...

  1. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding earthquake phenomena and their effects at the earth's surface represents an important step toward educating the population in earthquake-affected regions of the country, raising awareness about earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN-INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering), and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. Thus a large amount of such data will be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating an awareness of seismic risk, for experimenting with the efficacy of scientific communication, and for an increase in the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers

  2. Seismic no-data zone, offshore Mississippi delta: depositional controls on geotechnical properties, velocity structure, and seismic attenuation

    SciTech Connect

    May, J.A.; Meeder, C.A.; Tinkle, A.R.; Wener, K.R.

    1986-09-01

    Seismic acquisition problems plague exploration and production offshore the Mississippi delta. Geologic and geotechnical analyses of 300-ft borings and 20-ft piston cores, combined with subbottom acoustic measurements, help identify and predict the locations, types, and magnitudes of anomalous seismic zones. This knowledge is used to design acquisition and processing techniques to circumvent the seismic problems.

  3. Flutter Stability Verified for the Trailing Edge Blowing Fan

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.; Srivastava, Rakesh

    2005-01-01

    The TURBO-AE aeroelastic code has been used to verify the flutter stability of the trailing edge blowing (TEB) fan, which is a unique technology demonstrator being designed and fabricated at the NASA Glenn Research Center for testing in Glenn's 9- by 15-Foot Low-Speed Wind Tunnel. Air can be blown out of slots near the trailing edges of the TEB fan blades to fill in the wakes downstream of the rotating blades, which reduces the rotor-stator interaction (tone) noise caused by the interaction of wakes with the downstream stators. The TEB fan will demonstrate a 1.6-EPNdB reduction in tone noise through wake filling. Furthermore, the reduced blade-row interaction will decrease the possibility of forced-response vibrations and enable closer spacing of blade rows, thus reducing engine length and weight. The detailed aeroelastic analysis capability of the three-dimensional Navier-Stokes TURBO-AE code was used to check the TEB fan rotor blades for flutter stability. Flutter calculations were first performed with no TEB flow; then select calculations were repeated with TEB flow turned on.

  4. Simple method to verify OPC data based on exposure condition

    NASA Astrophysics Data System (ADS)

    Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

    2006-03-01

    In a world where sub-100nm lithography tools are everyday household items for device makers, device shrinkage is proceeding at a rate no one ever imagined. With devices shrinking at such a high rate, the demand placed on Optical Proximity Correction (OPC) is greater than ever. To meet this demand, more aggressive OPC tactics are involved. Aggressive OPC tactics are a must for sub-100nm lithography, but they eventually result in greater room for OPC error and greater complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify these complex OPC errors, but each of these methods has its pros and cons. ORC verification of OPC data is accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but its accuracy is a total downfall process-wise. In this study, we created a new method for OPC data verification that combines the best of both the ORC and DRC verification methods: a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the 80nm-tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
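
    At its core, the kind of tolerance check described above reduces, per edge fragment, to comparing the simulated wafer edge position against the design target and flagging edge-placement errors (EPE) beyond an allowed band. The following is a hypothetical, simplified sketch; the function name, data layout, and nm units are assumptions, not the paper's implementation.

```python
def verify_epe(target_edges, simulated_edges, tolerance):
    """Flag edge fragments whose simulated wafer position deviates from
    the design target by more than the allowed edge-placement error.
    All positions and the tolerance are in the same unit (e.g. nm)."""
    violations = []
    for i, (t, s) in enumerate(zip(target_edges, simulated_edges)):
        epe = s - t                      # signed edge-placement error
        if abs(epe) > tolerance:
            violations.append((i, epe))  # fragment index and its error
    return violations
```

In a full-chip flow, the same comparison would run hierarchically over every fragment produced by the OPC tool, with the simulated positions coming from lithography simulation at the process's illumination condition.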

  5. VISION - Verifiable Fuel Cycle Simulation of Nuclear Fuel Cycle Dynamics

    SciTech Connect

    Steven J. Piet; A. M. Yacout; J. J. Jacobson; C. Laws; G. E. Matthern; D. E. Shropshire

    2006-02-01

    The U.S. DOE Advanced Fuel Cycle Initiative's (AFCI) fundamental objective is to provide technology options that, if implemented, would enable long-term growth of nuclear power while improving sustainability and energy security. The AFCI organization structure consists of four areas: Systems Analysis, Fuels, Separations, and Transmutation. The Systems Analysis Working Group is tasked with bridging the program technical areas and providing the models, tools, and analyses required to assess the feasibility of design and deployment options and inform key decision makers. An integral part of the Systems Analysis tool set is the development of a system-level model that can be used to examine the implications of different mixes of reactors, the implications of fuel reprocessing, the impact of deployment technologies, as well as potential "exit" or "off ramp" approaches to phase out technologies, waste management issues, and long-term repository needs. The Verifiable Fuel Cycle Simulation Model (VISION) is a computer-based simulation model that allows dynamic simulations of fuel cycles to be performed, quantifying infrastructure requirements and identifying key trade-offs between alternatives. It is based on the current AFCI system analysis tool "DYMOND-US" functionalities, in addition to economics, isotopic decay, and other new functionalities. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI and Generation IV reactor development studies.

  6. Seismic sources

    DOEpatents

    Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

    1987-04-20

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for no more than about one minute. 9 figs.

  7. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.

  8. IMPLEMENTATION OF SEISMIC STOPS IN PIPING SYSTEMS.

    SciTech Connect

    BEZLER,P.

    1993-02-01

    Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code ''GAPPIPE'' developed by RLCA. The code has essentially all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method; an independent evaluation of the Byron piping using the nonlinear time history capability of the ANSYS computer code; and, by comparing results to those developed by RLCA, an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.
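The nonlinear characteristic that GAPPIPE linearizes is simple to state: the support exerts no force while the pipe moves within its clearance gap, and spring stiffness engages once the gap closes. A sketch of that force-deflection law (the gap size and stiffness below are illustrative values, not Byron design data):

```python
def gap_support_force(x, gap, k):
    """Restoring force of a gapped ('Seismic Stop') support.

    The pipe moves freely inside +/- gap; beyond the gap the spring
    engages with stiffness k. GAPPIPE replaces this nonlinearity with
    an equivalent linear stiffness; this sketch only shows the
    nonlinear characteristic itself.
    x   : pipe displacement at the support (m)
    gap : clearance half-width (m)
    k   : spring stiffness once engaged (N/m)
    """
    if abs(x) <= gap:
        return 0.0  # inside the clearance: no restraint
    sign = 1.0 if x > 0 else -1.0
    return -k * (x - sign * gap)  # spring force measured from gap edge

# 2 mm clearance, 1e6 N/m spring: no force at 1 mm, restraint at 3 mm
f_free = gap_support_force(0.001, 0.002, 1.0e6)
f_hit = gap_support_force(0.003, 0.002, 1.0e6)
```
Because the force is zero over part of each cycle, the effective stiffness seen by the piping depends on response amplitude, which is exactly why an equivalent-linearization step (or a nonlinear time-history run, as in the BNL/ANSYS check) is needed.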

  9. Implementation of Seismic Stops in Piping Systems

    SciTech Connect

    Bezler, P.; Simos, N.; Wang, Y.K.

    1993-02-01

    Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code ''GAPPIPE'' developed by RLCA. The code has essentially all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method; an independent evaluation of the Byron piping using the nonlinear time history capability of the ANSYS computer code; and, by comparing results to those developed by RLCA, an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.

  10. Seismic refraction exploration

    SciTech Connect

    Ruehle, W.H.

    1980-12-30

    In seismic exploration, refracted seismic energy is detected by seismic receivers to produce seismograms of subsurface formations. The seismograms are produced by directing seismic energy from an array of sources at an angle to be refracted by the subsurface formations and detected by the receivers. The directivity of the array is obtained by delaying the seismic pulses produced by each source in the source array.
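The array directivity described above follows from plane-wave geometry: delaying the i-th source by i·d·sin(θ)/v makes the individual wavelets add in phase along a wavefront leaving at angle θ from the vertical. A minimal sketch (the spacing, near-surface velocity, and take-off angle are assumed example values, not from the patent):

```python
import math

def steering_delays(n_sources, spacing_m, velocity_ms, angle_deg):
    """Firing delays that tilt the wavefront of a linear source array.

    Delaying source i by i * d * sin(theta) / v steers the summed
    energy at `angle_deg` from vertical, directing it toward the
    refractor as the abstract describes.
    """
    dt = spacing_m * math.sin(math.radians(angle_deg)) / velocity_ms
    return [i * dt for i in range(n_sources)]

# 4 sources 10 m apart in 2000 m/s material, 30 degree take-off angle:
delays = steering_delays(4, 10.0, 2000.0, 30.0)
```
Each successive source fires 2.5 ms later in this example; larger angles or slower near-surface material require longer delays.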

  11. A university-developed seismic source for shallow seismic surveys

    NASA Astrophysics Data System (ADS)

    Yordkayhun, Sawasdee; Na Suwan, Jumras

    2012-07-01

    The main objectives of this study were to (1) design and develop a low cost seismic source for shallow seismic surveys and (2) test the performance of the developed source at a test site. The surface seismic source, referred to here as a university-developed seismic source is based upon the principle of an accelerated weight drop. A 30 kg activated mass is lifted by a mechanical rack and pinion gear and is accelerated by a mounted spring. When the mass is released from 0.5 m above the surface, it hits a 30 kg base plate and energy is transferred to the ground, generating a seismic wave. The developed source is portable, environmentally friendly, easy to operate and maintain, and is a highly repeatable impact source. To compare the developed source with a sledgehammer source, a source test was performed at a test site, a study site for mapping a major fault zone in southern Thailand. The sledgehammer and the developed sources were shot along a 300 m long seismic reflection profile with the same parameters. Data were recorded using 12 channels off-end geometry with source and receiver spacing of 5 m, resulting in CDP stacked sections with 2.5 m between traces. Source performances were evaluated based on analyses of signal penetration, frequency content and repeatability, as well as the comparison of stacked sections. The results show that both surface sources are suitable for seismic studies down to a depth of about 200 m at the site. The hammer data are characterized by relatively higher frequency signals than the developed source data, whereas the developed source generates signals with overall higher signal energy transmission and greater signal penetration. In addition, the repeatability of the developed source is considerably higher than the hammer source.
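As a rough check on the source design above, the gravitational part of the delivered energy is just E = mgh for the 30 kg mass released from 0.5 m; the mounted spring adds an unspecified amount on top of this. A sketch (the spring contribution is deliberately omitted, so this is a lower bound):

```python
def impact_energy_j(mass_kg, drop_height_m, g=9.81):
    """Gravitational energy delivered by a free-falling weight drop.

    The developed source also stores energy in a mounted spring that
    accelerates the mass, so the true delivered energy exceeds this
    lower bound (the spring constant is not given in the abstract).
    """
    return mass_kg * g * drop_height_m

# The 30 kg activated mass released from 0.5 m above the base plate:
e = impact_energy_j(30.0, 0.5)
```
The repeatability advantage over a sledgehammer comes from the fact that mass, drop height, and spring preload are fixed by the mechanism, so this energy is the same shot after shot.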

  12. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    PubMed

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

    In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually-obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age; three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. Findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and that an appreciation for speakers who make general claims increases with age. PMID:26451884

  13. Seismic reflection imaging of shallow oceanographic structures

    NASA Astrophysics Data System (ADS)

    Piété, Helen; Marié, Louis; Marsset, Bruno; Thomas, Yannick; Gutscher, Marc-André

    2013-05-01

    Multichannel seismic (MCS) reflection profiling can provide high lateral resolution images of deep ocean thermohaline fine structure. However, the shallowest layers of the water column (z < 150 m) have remained unexplored by this technique until recently. In order to explore the feasibility of shallow seismic oceanography (SO), we reprocessed and analyzed four multichannel seismic reflection sections featuring reflectors at depths between 10 and 150 m. The influence of the acquisition parameters was quantified. Seismic data processing dedicated to SO was also investigated. Conventional seismic acquisition systems were found to be ill-suited to the imaging of shallow oceanographic structures, because of a high antenna filter effect induced by large offsets and seismic trace lengths, and sources that typically cannot provide both a high level of emission and fine vertical resolution. We considered a test case, the imagery of the seasonal thermocline on the western Brittany continental shelf. New oceanographic data acquired in this area allowed simulation of the seismic acquisition. Sea trials of a specifically designed system were performed during the ASPEX survey, conducted in early summer 2012. The seismic device featured: (i) four seismic streamers, each consisting of six traces of 1.80 m; (ii) a 1000 J SIG sparker source, providing a 400 Hz signal with a level of emission of 205 dB re 1 μPa @ 1 m. This survey captured the 15 m thick, 30 m deep seasonal thermocline in unprecedented detail, showing images of vertical displacements most probably induced by internal waves.
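The claim that conventional sources cannot provide fine vertical resolution can be checked with the standard quarter-wavelength rule of thumb, resolution ≈ c/(4f): for the 400 Hz sparker in roughly 1500 m/s seawater this is just under a metre, consistent with resolving a 15 m thick thermocline in detail. A sketch (the quarter-wavelength criterion is a common rule of thumb, not a formula quoted from the paper):

```python
def vertical_resolution_m(sound_speed_ms, peak_freq_hz):
    """Quarter-wavelength estimate of seismic vertical resolution.

    resolution ~= wavelength / 4 = c / (4 * f). Seawater sound speed
    of ~1500 m/s is a typical assumed value.
    """
    return sound_speed_ms / (4.0 * peak_freq_hz)

# The ASPEX 400 Hz sparker source in ~1500 m/s seawater:
res = vertical_resolution_m(1500.0, 400.0)
```
By comparison, a conventional low-frequency exploration source at, say, 50 Hz would resolve only about 7.5 m, too coarse for the shallowest water-column structure.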

  14. LANL seismic screening method for existing buildings

    SciTech Connect

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  15. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope.

    PubMed

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

    Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. This paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response values of the slope follow different variation rules under near-field and far-field earthquakes, and that the damage location and pattern of the slope differ with groundwater conditions. When no groundwater is present, destruction starts at the top of the slope, which shows an obvious whipping effect under the earthquake. When groundwater levels are high, destruction starts at the toe of the slope, and the top of the slope shows obvious seismic subsidence after the earthquake. Furthermore, the presence of groundwater provides a certain damping effect. PMID:26560103

  16. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope

    PubMed Central

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

    Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. This paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response values of the slope follow different variation rules under near-field and far-field earthquakes, and that the damage location and pattern of the slope differ with groundwater conditions. When no groundwater is present, destruction starts at the top of the slope, which shows an obvious whipping effect under the earthquake. When groundwater levels are high, destruction starts at the toe of the slope, and the top of the slope shows obvious seismic subsidence after the earthquake. Furthermore, the presence of groundwater provides a certain damping effect. PMID:26560103

  17. Seismic waveform viewer, processor and calculator

    Energy Science and Technology Software Center (ESTSC)

    2015-02-15

    SWIFT is a computer code designed to do research-level signal analysis on seismic waveforms, including visualization, filtering, and measurement. LLNL is using this code in amplitude and global tomography efforts.

  18. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes. VISION is comprised of several
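At system level, the mass flows described above amount to stock-and-flow bookkeeping. The toy step below moves tonnes between buffers the way the guide describes (enrichment splits feed into EU and DU, reactors discharge to a used-fuel buffer, separations pull from it); all names, rates, and the single-stream simplification are illustrative, not VISION's actual interface:

```python
def step(stocks, enrich_feed, eu_fraction, burn, to_separations):
    """One time step of a toy VISION-style stock-and-flow model.

    Splits enrichment feed into enriched uranium (to fabrication) and
    depleted uranium (to storage), moves fuel through the reactor into
    used-fuel storage, and pulls used fuel to separations. The real
    model adds isotopics, decay, economics, and many more streams.
    """
    s = dict(stocks)
    s["natural_u"] -= enrich_feed
    s["fresh_fuel"] += enrich_feed * eu_fraction        # EU to fabrication
    s["depleted_u"] += enrich_feed * (1 - eu_fraction)  # DU to storage
    s["fresh_fuel"] -= burn
    s["used_fuel"] += burn                              # reactor discharge
    s["used_fuel"] -= to_separations
    s["separated"] += to_separations                    # or to disposal
    return s

stocks = {"natural_u": 1000.0, "fresh_fuel": 0.0, "depleted_u": 0.0,
          "used_fuel": 50.0, "separated": 0.0}
stocks = step(stocks, enrich_feed=100.0, eu_fraction=0.12,
              burn=10.0, to_separations=5.0)
```
The key property a dynamic fuel-cycle model must preserve is mass balance: whatever leaves one buffer must arrive in another, so the total across buffers is constant between steps.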

  19. "Foreign material" to verify root fusion in welded joints

    NASA Technical Reports Server (NTRS)

    Kleint, R. E.

    1980-01-01

    Foil or thin wire at weld root is used to verify weld penetration. When weld is adequate, material mixes with weld and traces of it diffuse to weld crown. Spectroscopic analysis of samples identifies foreign material and verifies root has fused. Weld roots are usually inaccessible to visual inspection, and X-ray and ultrasonic inspection techniques are not always reliable. Good results are obtained with use of gold/nickel alloy.

  20. Seismic sources

    DOEpatents

    Green, Michael A.; Cook, Neville G. W.; McEvilly, Thomas V.; Majer, Ernest L.; Witherspoon, Paul A.

    1992-01-01

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole relative to a stator that is clamped to the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements at a power level that causes heating to over 150.degree. C. within one minute of operation, but energizing the elements for no more than about one minute.

  1. Seismic performance of underground facilities

    SciTech Connect

    Marine, I W

    1982-01-01

    A workshop was held in Augusta, GA, February 11-13, 1981 to review and assess the state-of-the-art for determining and predicting earthquake damage to underground facilities. The papers presented related to data collection and analysis, modeling, and repository design. Discussion groups addressed seismology, rock mechanics and hydrology, modeling, design, and licensing, siting, and tectonics. Most scientists in attendance believed that enough was known to proceed with site selection, design, and licensing of a waste repository. However, there was recognition of several items of research that would enhance understanding of the subsurface effects of seismicity. In general, the subsurface effects of earthquakes are substantially less than their surface effects. This conclusion is supported by both observation and by modeling studies. The absence of wave reflections, the absence of high flexural stresses, and the absence of poor soil conditions contribute to the improved seismic performance of subsurface facilities. Seismic considerations for geologic disposal of nuclear waste vary with the phase of operation. During construction and waste loading, the primary concern is for the safety of onsite personnel. However, during long-term waste storage, the principal interest is in the migration of contaminants due to seismic cracking and enhancement of permeability. Backfilling the storage facility will mitigate this effect.

  2. Seismic safety of high concrete dams

    NASA Astrophysics Data System (ADS)

    Chen, Houqun

    2014-08-01

    China is a country of high seismicity with many hydropower resources. Recently, a series of high arch dams have either been completed or are being constructed in seismic regions, of which most are concrete dams. The evaluation of seismic safety often becomes a critical problem in dam design. In this paper, a brief introduction to major progress in the research on seismic aspects of large concrete dams, conducted mainly at the Institute of Water Resources and Hydropower Research (IWHR) during the past 60 years, is presented. The dam site-specific ground motion input, improved response analysis, dynamic model test verification, field experiment investigations, dynamic behavior of dam concrete, and seismic monitoring and observation are described. Methods to prevent collapse of high concrete dams under maximum credible earthquakes are discussed.

  3. Broadband seismology and small regional seismic networks

    USGS Publications Warehouse

    Herrmann, Robert B.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  4. Seismic qualification of unanchored equipment

    SciTech Connect

    Moran, T.J.

    1995-12-01

    This paper describes procedures used to design and qualify unanchored equipment to survive seismic events to the PC = 4 level in a moderate seismic area. The need for flexibility to move experimental equipment together with the requirements for remote handling in a highly-radioactive non-reactor nuclear facility precluded normal equipment anchorage. Instead, equipment was designed to remain stable under anticipated DBE floor motions with sufficient margin to achieve the performance goal. The equipment was also designed to accommodate anticipated sliding motions with sufficient margin. The simplified design criteria used to achieve these goals were based on extensive time-history simulations of sliding, rocking, and overturning of generic equipment models. The entire process was subject to independent peer review and accepted in a Safety Evaluation Report. The process provides a model suitable for adaptation to similar applications and for assessment of the potential for seismic damage of existing, unanchored equipment. In particular, the paper describes: (1) Two-dimensional sliding studies of deformable equipment subject to 3-D floor excitation as the basis for simplified sliding radius and sliding velocity design criteria. (2) Two-dimensional rocking and overturning simulations of rigid equipment used to establish design criteria for minimum base dimensions and equipment rigidity to prevent overturning. (3) Assumed-mode rocking analyses of deformable equipment models used to establish uplift magnitudes and subsequent impacts during stable rocking motions. The model used for these dynamic impact studies is reported elsewhere.

  5. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system operable on command having a variable frequency oscillator generating an audio frequency signal proportional to the number of said seismic movements.
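The detector and counter stages described above can be sketched as a threshold test with a hold-off, so that one seismic disturbance registers once; the monitoring stage would then, on command, emit a tone with frequency proportional to the stored count. All parameter values below are illustrative, not taken from the patent:

```python
def count_events(samples, threshold, hold_off):
    """Count threshold crossings in a band-limited geophone signal.

    Mimics the threshold detector and pulse counter described above:
    a sample exceeding `threshold` registers one event, after which
    `hold_off` further samples are ignored so a single footstep is
    not counted several times.
    """
    count, skip = 0, 0
    for s in samples:
        if skip:
            skip -= 1            # still inside the hold-off window
        elif abs(s) > threshold:
            count += 1           # new man-associated seismic movement
            skip = hold_off
    return count

# Two separated bursts above an assumed threshold of 4 register twice:
n = count_events([0, 0, 5, 6, 0, 0, 7, 0], threshold=4, hold_off=2)
```
The bandpass amplifier in the real system does the work of rejecting wind and traffic noise before this stage, which is why a bare amplitude threshold suffices afterwards.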

  6. Seismic-Scale Rock Physics of Methane Hydrate

    SciTech Connect

    Amos Nur

    2009-01-08

    We quantify natural methane hydrate reservoirs by generating synthetic seismic traces and comparing them to real seismic data: if the synthetic matches the observed data, then the reservoir properties and conditions used in synthetic modeling might be the same as the actual, in-situ reservoir conditions. This approach is model-based: it uses rock physics equations that link the porosity and mineralogy of the host sediment, pressure, and hydrate saturation, and the resulting elastic-wave velocity and density. One result of such seismic forward modeling is a catalogue of seismic reflections of methane hydrate which can serve as a field guide to hydrate identification from real seismic data. We verify this approach using field data from known hydrate deposits.
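The forward-modeling link from sediment properties to seismic reflections rests on acoustic impedance: Z = velocity × density, and at normal incidence the reflectivity is R = (Z2 − Z1)/(Z2 + Z1). A minimal sketch (the layer velocities and densities are invented example values, not data from the report):

```python
def reflection_coefficient(v1, rho1, v2, rho2):
    """Normal-incidence reflectivity from an impedance contrast.

    Z = velocity * density; R = (Z2 - Z1) / (Z2 + Z1). Hydrate tends
    to stiffen the host sediment, raising velocity and impedance, so
    the top of a hydrate zone often produces a positive reflection.
    """
    z1, z2 = v1 * rho1, v2 * rho2
    return (z2 - z1) / (z2 + z1)

# Assumed example: water-saturated sand (1800 m/s, 2000 kg/m^3) over
# hydrate-bearing sand (2400 m/s, 1950 kg/m^3)
r = reflection_coefficient(1800.0, 2000.0, 2400.0, 1950.0)
```
Rock-physics relations supply v and rho as functions of porosity, mineralogy, pressure, and hydrate saturation; sweeping those inputs and convolving R with a wavelet is what builds the catalogue of synthetic hydrate reflections the abstract describes.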

  7. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

    The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and
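The quoted return times follow from the Poisson relation between exceedance probability and return period: P(at least one exceedance in t years) = 1 − exp(−t/T), so T = −t / ln(1 − p). A quick check reproduces the map levels (this arithmetic is standard hazard-map bookkeeping, not code from the USGS):

```python
import math

def return_period_years(p_exceed, window_years=50.0):
    """Return period implied by a Poisson exceedance probability.

    From P = 1 - exp(-t/T), solve T = -t / ln(1 - p).
    """
    return -window_years / math.log(1.0 - p_exceed)

# 10%, 5%, and 2% in 50 years -> roughly 475, 975, and 2475 years,
# matching the "about 500, 1000, and 2500 years" quoted above.
periods = [return_period_years(p) for p in (0.10, 0.05, 0.02)]
```
Note the exact values (about 475, 975, and 2475 years) are slightly below the rounded figures in the abstract because the relation is exponential, not linear.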

  8. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses running to millions of US dollars. Instead of asking the cardholder for more information or taking on risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make genuine credit cards more distinguishable from counterfeit credit cards. Several optical approaches to the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity in design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  9. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. 
Near-surface seismology is in

  10. The SCALE Verified, Archived Library of Inputs and Data - VALID

    SciTech Connect

    Marshall, William B. J.; Rearden, Bradley T

    2013-01-01

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional

  11. Micromachined silicon seismic transducers

    SciTech Connect

    Barron, C.C.; Fleming, J.G.; Sniegowski, J.J.; Armour, D.L.; Fleming, R.P.

    1995-08-01

    Batch-fabricated silicon seismic transducers could revolutionize the discipline of CTBT monitoring by providing inexpensive, easily deployable sensor arrays. Although our goal is to fabricate seismic sensors that provide the same performance level as the current state-of-the-art ``macro`` systems, if necessary one could deploy a larger number of these small sensors in closer proximity to the location being monitored in order to compensate for lower performance. We have chosen a modified pendulum design and are manufacturing prototypes in two different silicon micromachining fabrication technologies. The first set of prototypes, fabricated in our advanced surface-micromachining technology, is currently being packaged for testing in servo circuits -- we anticipate that these devices, which have masses in the 1--10 {mu}g range, will resolve sub-mG signals. Concurrently, we are developing a novel ``mold`` micromachining technology that promises to make proof masses in the 1--10 mg range possible -- our calculations indicate that devices made in this new technology will resolve down to at least sub-{mu}G signals, and may even approach the 10{sup {minus}10} G/{radical}Hz acceleration levels found in the low-earth-noise model.
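The resolution claims above follow from the standard thermal (Brownian) noise limit of a spring-mass sensor. A minimal sketch, with illustrative parameters only (the mass, resonant frequency, and quality factor below are assumptions, not values from the paper):

```python
import math

# Thermal (Brownian) acceleration noise floor of a spring-mass seismic sensor:
#   a_n = sqrt(4 * k_B * T * omega_0 / (m * Q))   [m/s^2 per sqrt(Hz)]
K_B = 1.380649e-23  # Boltzmann constant, J/K
G0 = 9.81           # standard gravity, m/s^2

def brownian_noise_floor(mass_kg, f0_hz, q, temp_k=300.0):
    """Acceleration noise density in G per sqrt(Hz)."""
    omega0 = 2.0 * math.pi * f0_hz
    a_n = math.sqrt(4.0 * K_B * temp_k * omega0 / (mass_kg * q))
    return a_n / G0

# A 1 mg proof mass (100 Hz resonance, Q = 100, room temperature) already
# sits in the sub-microG/rtHz regime, consistent with the abstract's claim:
print(f"{brownian_noise_floor(1e-6, 100.0, 100.0):.2e} G/rtHz")
```

This is why the milligram-range "mold" proof masses are expected to outperform the microgram-range surface-micromachined ones: the noise density scales as 1/sqrt(m).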

  12. Verifying continuous-variable entanglement in finite spaces

    SciTech Connect

    Sperling, J.; Vogel, W.

    2009-05-15

    Starting from arbitrary Hilbert spaces, we reduce the problem of verifying entanglement of any bipartite quantum state to finite-dimensional subspaces. Entanglement can be fully characterized as a finite-dimensional property, even though in general the truncation of the Hilbert space may cause fake nonclassicality. A generalization to multipartite quantum states is also given.
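For a concrete picture of entanglement verification on a finite-dimensional subspace, the Peres-Horodecki partial-transpose test gives a minimal sketch (an illustration of the general idea, not the paper's specific criteria):

```python
import numpy as np

# A negative eigenvalue of the partial transpose of a bipartite density
# matrix proves the state is entangled (Peres-Horodecki / PPT criterion).
def partial_transpose(rho, d_a, d_b):
    """Partial transpose over subsystem B of a (d_a*d_b x d_a*d_b) matrix."""
    r = rho.reshape(d_a, d_b, d_a, d_b)
    return r.transpose(0, 3, 2, 1).reshape(d_a * d_b, d_a * d_b)

# Two-qubit Bell state (|00> + |11>)/sqrt(2): maximally entangled.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
eigs = np.linalg.eigvalsh(partial_transpose(rho, 2, 2))
print(eigs.min())  # a negative value certifies entanglement
```

Here the two-qubit space is itself the finite-dimensional subspace; the paper's point is that the same kind of finite-dimensional test suffices even when the underlying Hilbert spaces are infinite-dimensional.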

  13. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  14. A Trustworthy Internet Auction Model with Verifiable Fairness.

    ERIC Educational Resources Information Center

    Liao, Gen-Yih; Hwang, Jing-Jang

    2001-01-01

    Describes an Internet auction model achieving verifiable fairness, a requirement aimed at enhancing the trust of bidders in auctioneers. Analysis results demonstrate that the proposed model satisfies various requirements regarding fairness and privacy. Moreover, in the proposed model, the losing bids remain sealed. (Author/AEF)

  15. Verifying Stiffness Parameters Of Filament-Wound Cylinders

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Rheinfurth, M.

    1994-01-01

    Predicted engineering stiffness parameters of filament-wound composite-material cylinders verified with respect to experimental data, by use of equations developed straightforwardly from applicable formulation of Hooke's law. Equations derived in engineering study of filament-wound rocket-motor cases, also applicable to other cylindrical pressure vessels made of orthotropic materials.

  16. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Verifying your identity. 401.45 Section 401.45 Employees' Benefits SOCIAL SECURITY ADMINISTRATION PRIVACY AND DISCLOSURE OF OFFICIAL RECORDS AND... used, we alert you that personally identifiable information (such as your social security...

  17. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... under false pretenses is a criminal offense. (2) Request by telephone. If you make a request by telephone, you must verify your identity by providing identifying particulars which parallel the record to which notification or access is being sought. If we determine that the particulars provided by...

  18. Seismic analysis of the large 70-meter antenna. Part 2: General dynamic response and a seismic safety check

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    An extensive dynamic analysis for the new JPL 70-meter antenna structure is presented. Analytical procedures are based on normal mode decomposition, which includes damping and special forcing functions. The dynamic response can be obtained for any arbitrarily selected point on the structure. A new computer program for computing the time-dependent resultant structural displacement, summing the effects of all participating modes, was also developed. Program compatibility with natural frequency analysis output was verified. The program was applied to the JPL 70-meter antenna structure, and the dynamic response for several specially selected points was computed. Seismic analysis of structures, a special application of the general dynamic analysis, is also based on normal mode decomposition. Strength specification of the antenna with respect to earthquake excitation is done by using the common response spectra. The results indicated a basically safe design under an assumed 5% or greater damping coefficient. However, for the antenna located at Goldstone, with its more active seismic environment, this study strongly recommends an experimental program to determine the true damping coefficient for a more reliable safety check.
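The normal-mode response-spectrum machinery described above can be sketched on a toy two-degree-of-freedom system (the masses, stiffnesses, and spectral accelerations below are invented for illustration, not antenna data):

```python
import numpy as np

# Response-spectrum analysis by normal-mode decomposition:
# peak modal displacements u_i = Gamma_i * phi_i * Sa_i / omega_i^2,
# combined across modes by SRSS (square root of sum of squares).
M = np.diag([2.0e5, 1.5e5])                    # lumped masses, kg
K = np.array([[6.0e8, -2.5e8],
              [-2.5e8, 2.5e8]])                # stiffness matrix, N/m

# Generalized eigenproblem K*phi = omega^2 * M*phi.
w2, phi = np.linalg.eig(np.linalg.solve(M, K))
w2 = w2.real                                   # eigenvalues are real here
order = np.argsort(w2)
w, phi = np.sqrt(w2[order]), phi.real[:, order]

r = np.ones(2)                                 # influence vector (base shaking)
Sa = np.array([4.0, 2.5])                      # spectral accelerations, m/s^2

u_modes = np.zeros((2, 2))
for i in range(2):
    p = phi[:, i]
    gamma = (p @ M @ r) / (p @ M @ p)          # modal participation factor
    u_modes[:, i] = gamma * p * Sa[i] / w[i] ** 2

u_peak = np.sqrt((u_modes ** 2).sum(axis=1))   # SRSS combination
print(u_peak)                                  # peak displacement per DOF, m
```

The SRSS combination is the standard way to sum "the effects of all participating modes" when only spectral maxima, not time histories, are available.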

  19. A seismic metamaterial: The resonant metawedge

    PubMed Central

    Colombi, Andrea; Colquitt, Daniel; Roux, Philippe; Guenneau, Sebastien; Craster, Richard V.

    2016-01-01

    Critical concepts from three different fields, elasticity, plasmonics and metamaterials, are brought together to design a metasurface at the geophysical scale, the resonant metawedge, to control seismic Rayleigh waves. Made of spatially graded vertical subwavelength resonators on an elastic substrate, the metawedge can either mode convert incident surface Rayleigh waves into bulk elastic shear waves or reflect the Rayleigh waves creating a “seismic rainbow” effect analogous to the optical rainbow for electromagnetic metasurfaces. Time-domain spectral element simulations demonstrate the broadband efficacy of the metawedge in mode conversion while an analytical model is developed to accurately describe and predict the seismic rainbow effect; allowing the metawedge to be designed without the need for extensive parametric studies and simulations. The efficiency of the resonant metawedge shows that large-scale mechanical metamaterials are feasible, will have application, and that the time is ripe for considering many optical devices in the seismic and geophysical context. PMID:27283587

  1. A seismic metamaterial: The resonant metawedge

    NASA Astrophysics Data System (ADS)

    Colombi, Andrea; Colquitt, Daniel; Roux, Philippe; Guenneau, Sebastien; Craster, Richard V.

    2016-06-01

    Critical concepts from three different fields, elasticity, plasmonics and metamaterials, are brought together to design a metasurface at the geophysical scale, the resonant metawedge, to control seismic Rayleigh waves. Made of spatially graded vertical subwavelength resonators on an elastic substrate, the metawedge can either mode convert incident surface Rayleigh waves into bulk elastic shear waves or reflect the Rayleigh waves creating a “seismic rainbow” effect analogous to the optical rainbow for electromagnetic metasurfaces. Time-domain spectral element simulations demonstrate the broadband efficacy of the metawedge in mode conversion while an analytical model is developed to accurately describe and predict the seismic rainbow effect; allowing the metawedge to be designed without the need for extensive parametric studies and simulations. The efficiency of the resonant metawedge shows that large-scale mechanical metamaterials are feasible, will have application, and that the time is ripe for considering many optical devices in the seismic and geophysical context.

  2. Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.; Rohay, A.C.

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46{degree}-47{degree}N latitude and 119{degree}-120{degree}W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. 
No earthquakes triggered the Hanford Strong Motion

  3. Development of Seismic Isolation Systems Using Periodic Materials

    SciTech Connect

    Yan, Yiqun; Mo, Yi-Lung; Menq, Farn-Yuh; Stokoe, II, Kenneth H.; Perkins, Judy; Tang, Yu

    2014-12-10

    Advanced fast nuclear power plants and small modular fast reactors are composed of thin-walled structures such as pipes; as a result, they do not have sufficient inherent strength to resist seismic loads. Seismic isolation, therefore, is an effective solution for mitigating earthquake hazards for these types of structures. Base isolation, on which numerous studies have been conducted, is a well-established structural protection system against earthquakes. In conventional isolators, such as high-damping rubber bearings, lead-rubber bearings, and friction pendulum bearings, large relative displacements occur between upper structures and foundations, and isolation is provided only in the horizontal direction; these features are not desirable for piping systems. The concept of periodic materials, based on the theory of solid-state physics, can be applied to earthquake engineering. A periodic material possesses distinct characteristics that prevent waves in certain frequency ranges from being transmitted through it; therefore, this material can be used in structural foundations to block unwanted seismic waves of those frequencies. The frequency band over which a periodic material filters out waves is called the band gap, and a structural foundation made of periodic material is referred to as a periodic foundation. The design of a nuclear power plant, therefore, can be unified around the desirable feature of a periodic foundation, while continuous maintenance of the structure is not needed. In this research project, three different types of periodic foundations were studied: one-dimensional, two-dimensional, and three-dimensional. The basic theories of periodic foundations are introduced first to find the band gaps; finite element methods are then used to perform parametric analyses and obtain attenuation zones; finally, experimental programs are conducted and the test data are analyzed to verify the theory. This procedure shows that the
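For a one-dimensional periodic foundation, the band gap can be located with the classical two-layer transfer-matrix dispersion relation (a sketch of the basic theory mentioned above; the rubber-like and concrete-like layer properties are illustrative assumptions, not the project's values):

```python
import numpy as np

# 1-D periodic (two-layer) medium, normal incidence. Bloch dispersion:
#   cos(q*L) = cos(k1*d1)*cos(k2*d2)
#              - 0.5*(z1/z2 + z2/z1)*sin(k1*d1)*sin(k2*d2)
# where k_i = omega/c_i and z_i = rho_i*c_i. Frequencies where |RHS| > 1
# admit no real Bloch wavenumber q: that is the band gap.
def band_gap_mask(freqs_hz, rho1, c1, d1, rho2, c2, d2):
    w = 2.0 * np.pi * freqs_hz
    k1, k2 = w / c1, w / c2
    z1, z2 = rho1 * c1, rho2 * c2
    rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
           - 0.5 * (z1 / z2 + z2 / z1) * np.sin(k1 * d1) * np.sin(k2 * d2))
    return np.abs(rhs) > 1.0        # True where waves cannot propagate

freqs = np.linspace(1.0, 200.0, 2000)
# Illustrative layers: soft rubber-like (rho=1300, c=100, 0.2 m) over
# stiff concrete-like (rho=2400, c=2500, 0.2 m).
gap = band_gap_mask(freqs, 1300.0, 100.0, 0.2, 2400.0, 2500.0, 0.2)
print(f"first band gap starts near {freqs[gap].min():.1f} Hz")
```

The strong impedance contrast between the two layers is what opens a wide gap; parametric sweeps over densities, wave speeds, and thicknesses (as in the finite element studies above) shift the gap toward the frequency band of interest.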

  4. Salvo: Seismic imaging software for complex geologies

    SciTech Connect

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

    This report describes Salvo, a three-dimensional seismic-imaging software package for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. It also describes I/O algorithms for large seismic data sets and images, and the parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, this report describes the steps required to compile, port, and optimize the Salvo software, and describes the validation data sets used to help verify a working copy of Salvo.
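The core operation in this family of imaging algorithms is one-way downward continuation of a monochromatic wavefield. A minimal phase-shift sketch (an illustration of the general technique, not Salvo's actual implementation):

```python
import numpy as np

# One-way wavefield extrapolation by phase shift: a wavefield P(kx, omega)
# at depth z is continued to z + dz by
#   P(z+dz) = P(z) * exp(i * kz * dz),  kz = sqrt(omega^2/c^2 - kx^2).
nx, dx, dz = 128, 10.0, 10.0      # grid size, trace spacing, depth step (m)
c, freq = 2000.0, 25.0            # medium velocity (m/s), frequency (Hz)
omega = 2.0 * np.pi * freq

p = np.zeros(nx, dtype=complex)
p[nx // 2] = 1.0                  # point disturbance recorded at the surface
P = np.fft.fft(p)
kx = 2.0 * np.pi * np.fft.fftfreq(nx, dx)

kz2 = (omega / c) ** 2 - kx ** 2
# Propagating components (kz2 >= 0) get a pure phase shift;
# evanescent components (kz2 < 0) decay exponentially with depth.
kz = np.where(kz2 >= 0, np.sqrt(np.abs(kz2)), 1j * np.sqrt(np.abs(kz2)))
P_down = P * np.exp(1j * kz * dz)
p_down = np.fft.ifft(P_down)      # wavefield one depth step down
```

Repeating this step layer by layer, with a velocity that varies in depth, is the essence of phase-shift migration; the paraxial finite-difference schemes in Salvo generalize it to laterally varying, complex velocity models.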

  5. Design of a potential long-term test of gas production from a hydrate deposit at the PBU-L106 site in North Slope, Alaska: Geomechanical system response and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Chiaramonte, L.; Kowalsky, M. B.; Rutqvist, J.; Moridis, G. J.

    2009-12-01

    In an effort to optimize the design of a potential long-term production test at the PBU-L106 site in North Slope, Alaska, we have developed a coupled modeling framework that includes the simulation of (1) large-scale production at the test site, (2) the corresponding geomechanical changes in the system caused by production, and (3) time-lapse geophysical (seismic) surveys. The long-term test is to be conducted within the deposit of the C-layer, which extends from a depth of 2226 to 2374 ft, and is characterized by two hydrate-bearing strata separated by a 30 ft shale interlayer. In this study we examine the expected geomechanical response of the permafrost-associated hydrate deposit (C-Layer) at the PBU L106 site during depressurization-induced production, and assess the potential for monitoring the system response with seismic measurements. Gas hydrates increase the strength of the sediments (often unconsolidated) they impregnate. Thus hydrate disassociation in the course of gas production could potentially affect the geomechanical stability of such deposits, leading to sediment failure and potentially affecting wellbore stability and integrity at the production site and/or at neighboring conventional production facilities. For the geomechanical analysis we use a coupled hydraulic, thermodynamic and geomechanical model (TOUGH+HYDRATE+FLAC3D, T+H+F for short) simulating production from a single vertical well at the center of an infinite-acting hydrate deposit. We investigate the geomechanical stability of the C-Layer, well stability and possible interference (due to production) with pre-existing wells in the vicinity, as well as the system sensitivity to important parameters (saturation, permeability, porosity and heterogeneity). The time-lapse seismic surveys are simulated using a finite-difference elastic wave propagation model that is linked to the T+H+F code. 
The seismic properties, such as the elastic and shear moduli, are a function of the simulated time- and
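The link between the simulated mechanical state and the seismic observables can be sketched with the standard isotropic moduli-velocity relations (illustrative values only, not PBU-L106 site data):

```python
import math

# Isotropic elastic wave speeds from bulk modulus K, shear modulus G,
# and bulk density rho:
#   Vp = sqrt((K + 4/3 * G) / rho),   Vs = sqrt(G / rho)
def velocities(k_pa, g_pa, rho):
    vp = math.sqrt((k_pa + 4.0 * g_pa / 3.0) / rho)
    vs = math.sqrt(g_pa / rho)
    return vp, vs

# Hydrate dissociation weakens the sediment frame (lower K and G), so the
# P-wave velocity drops -- the time-lapse signature the surveys look for.
vp_hydrate, _ = velocities(8.0e9, 4.0e9, 2000.0)   # hydrate-bearing (assumed)
vp_dissoc, _ = velocities(4.0e9, 1.5e9, 2000.0)    # after dissociation (assumed)
print(f"{vp_hydrate:.0f} m/s -> {vp_dissoc:.0f} m/s")
```

This is the sense in which the elastic wave propagation model is "linked" to the T+H+F code: the coupled simulator updates the moduli and saturations, and the seismic forward model maps them to velocities for the synthetic time-lapse surveys.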

  6. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    SciTech Connect

    Tolk, Keith M.; Stoker, Gerald C.

    1999-07-20

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.

  7. Basis for seismic provisions of DOE-STD-1020

    SciTech Connect

    Kennedy, R.C.; Short, S.A.

    1994-04-01

    DOE-STD-1020 provides a graded approach for the seismic design and evaluation of DOE structures, systems, and components (SSC). Each SSC is assigned to a Performance Category (PC) with a performance description and an approximate annual probability of seismic-induced unacceptable performance, P{sub F}. Seismic annual probability performance goals are specified for PC 1 through 4, for which specific seismic design and evaluation criteria are presented. DOE-STD-1020 also provides a seismic design and evaluation procedure applicable to any seismic performance goal annual probability of unacceptable performance specified by the user. The desired seismic performance goal is achieved by defining the seismic hazard in terms of a site-specific design/evaluation response spectrum (called herein the Design/Evaluation Basis Earthquake, DBE). Probabilistic seismic hazard estimates are used to establish the DBE. The resulting seismic hazard curves define the amplitude of the ground motion as a function of the annual probability of exceedance P{sub H} of the specified seismic hazard. Once the DBE is defined, the SSC is designed or evaluated for this DBE using adequately conservative deterministic acceptance criteria. To be adequately conservative, the acceptance criteria must introduce an additional reduction in the risk of unacceptable performance below the annual risk of exceeding the DBE. The ratio of the seismic hazard exceedance probability P{sub H} to the performance goal probability P{sub F} is defined herein as the risk reduction ratio. The required degree of conservatism in the deterministic acceptance criteria is a function of the specified risk reduction ratio.
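The risk reduction ratio defined above reduces to simple arithmetic; a numeric sketch (the probabilities below are illustrative, not the standard's tabulated values):

```python
# Risk reduction ratio per the definition above:
#   R_R = P_H / P_F
# where P_H is the annual exceedance probability of the DBE and P_F is the
# performance-goal annual probability of unacceptable performance.
def risk_reduction_ratio(p_hazard, p_goal):
    return p_hazard / p_goal

# Example: a DBE chosen at 4e-4/yr exceedance paired with a 1e-4/yr
# performance goal means the deterministic acceptance criteria must supply
# a factor-of-4 reduction in the probability of unacceptable performance.
print(risk_reduction_ratio(4e-4, 1e-4))
```

The larger the required ratio, the more conservatism the deterministic acceptance criteria must embed relative to the DBE ground motion.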

  8. Study on Seismic Zoning of Sino-Mongolia Arc Areas

    NASA Astrophysics Data System (ADS)

    Xu, G.

    2015-12-01

    According to the agreement on cooperation in seismic zoning between the Institute of Geophysics, China Earthquake Administration, and the Research Center of Astronomy and Geophysics, Mongolian Academy of Sciences, data on geotectonics, active faults, seismicity, and the geophysical field were collected and analyzed; field investigations were then carried out for the Bolnay, Ar Hutul, and Gobi Altay faults; and a uniform earthquake catalogue of Mongolia and North China was established for the seismic hazard study in the Sino-Mongolia arc areas. Furthermore, the active faults and epicenters were mapped, and 2 seismic belts and their 54 potential seismic sources were determined. Based on the data and results mentioned above, the seismicity parameters for the two seismic belts and their potential sources were studied. Finally, seismic zoning with different probabilities in the Sino-Mongolia arc areas was carried out using the Chinese probabilistic hazard analysis method. By analyzing the data and results, we draw the following main conclusions. Firstly, the tectonic stress field in the study areas originates from the collision of the India Plate with the Eurasian Plate, transmitted through the Qinghai-Tibet Plateau. This is why seismicity is higher in the west than in the east, and all earthquakes with magnitude 8 or greater occurred in the west. Secondly, the determination of the 2 arc seismic belts, the Altay seismic belt and the Bolnay-Baikal seismic belt, is reasonable in terms of their geotectonic location, geodynamic origin, and seismicity characteristics. Finally, there are some differences between our results and the Mongolia Intensity Zoning map published in 1985 in the shape of the seismic zoning map, especially in the areas near Ulaanbaatar. We argue that our results are reasonable given that they take into account recent studies of active faults and their parameters, so they can be used as a reference for seismic design.
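The probabilistic machinery behind such a zoning map can be sketched with a Gutenberg-Richter source and a Poisson exceedance model (a generic illustration with assumed a- and b-values, not the study's parameters):

```python
import math

# Gutenberg-Richter recurrence: log10 N(>=M) = a - b*M gives the annual
# rate of events at or above magnitude M; a Poisson occurrence model then
# converts that rate into a probability of exceedance over an exposure time.
def annual_rate(m, a=4.0, b=1.0):
    """Annual rate of events with magnitude >= m (illustrative a, b)."""
    return 10.0 ** (a - b * m)

def prob_exceedance(m, years, a=4.0, b=1.0):
    """Poisson probability of at least one event >= m within `years`."""
    return 1.0 - math.exp(-annual_rate(m, a, b) * years)

# Probability of at least one magnitude >= 6 event in a 50-year exposure:
print(f"{prob_exceedance(6.0, 50.0):.3f}")
```

Zoning maps "with different probability" correspond to evaluating this kind of model at different exceedance levels (e.g. 10% or 2% in 50 years) and mapping the resulting ground-motion or intensity values.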

  9. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. This paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes presented in different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to the OSR-97B map (return period T=1,000 years) within big cities and towns, the cities were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites. The indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability of cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  10. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  11. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
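The horizontal divergence property the criteria guarantee has a simple geometric core, sketched below (a toy check illustrating the underlying idea only, not the formally verified PVS criteria):

```python
# Two aircraft are horizontally diverging when the relative position s and
# relative velocity v satisfy s . v > 0, i.e. the horizontal range between
# them is strictly increasing.
def horizontally_diverging(s, v):
    """s, v: 2-D relative position and relative velocity as (x, y) tuples."""
    return s[0] * v[0] + s[1] * v[1] > 0.0

# After a correct recovery maneuver, the relative velocity points away
# from the other aircraft:
print(horizontally_diverging((1000.0, 0.0), (50.0, 10.0)))    # diverging
print(horizontally_diverging((1000.0, 0.0), (-50.0, 10.0)))   # still closing
```

The criteria framework strengthens this basic condition so that it holds whether one aircraft maneuvers or both do (implicit coordination), which is what the PVS formalization proves.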

  12. Verifying Anonymous Credential Systems in Applied Pi Calculus

    NASA Astrophysics Data System (ADS)

    Li, Xiangxi; Zhang, Yu; Deng, Yuxin

    Anonymous credentials are widely used to certify properties of a credential owner or to support the owner in demanding valuable services, while hiding the user's identity at the same time. A credential system (a.k.a. pseudonym system) usually consists of multiple interactive procedures between users and organizations, including generating pseudonyms, issuing credentials, and verifying credentials, which are required to meet various security properties. We propose a general symbolic model (based on the applied pi calculus) for anonymous credential systems and give formal definitions of a few important security properties, including pseudonym and credential unforgeability, credential safety, and pseudonym untraceability. We specialize the general formalization and apply it to the verification of a concrete anonymous credential system proposed by Camenisch and Lysyanskaya. The analysis is done automatically with the tool ProVerif and several security properties have been verified.

  13. Watermarking medical images with anonymous patient identification to verify authenticity.

    PubMed

    Coatrieux, Gouenou; Quantin, Catherine; Montagner, Julien; Fassa, Maniane; Allaert, François-André; Roux, Christian

    2008-01-01

    When dealing with medical image management, there is a need to ensure information authenticity and dependability. Being able to verify the information belongs to the correct patient and is issued from the right source is a major concern. Verification can help to reduce the risk of errors when identifying documents in daily practice or when sending a patient's Electronic Health Record. At the same time, patient privacy issues may appear during the verification process when the verifier accesses patient data without appropriate authorization. In this paper we discuss the combination of watermarking with different identifiers ranging from DICOM standard UID to an Anonymous European Patient Identifier in order to improve medical image protection in terms of authenticity and maintainability. PMID:18487808

  14. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-01

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
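    The core of the verification step is that a correct graph state is the common +1 eigenstate of its stabilizers, so measuring stabilizers on test copies certifies the state. A minimal classical simulation of this idea for the single-edge graph state (illustrative only; the protocol itself uses single-qubit measurements, not full state vectors):

    ```python
    import numpy as np

    # Single-qubit Pauli operators and the |+> state
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])
    plus = np.array([1, 1]) / np.sqrt(2)

    # Two-qubit graph state for a single edge: CZ |+>|+>
    CZ = np.diag([1, 1, 1, -1])
    g = CZ @ np.kron(plus, plus)

    # Its stabilizers are K1 = X(x)Z and K2 = Z(x)X; the graph state is their
    # common +1 eigenstate, which is what Alice's test copies are checked against.
    for K in (np.kron(X, Z), np.kron(Z, X)):
        assert np.allclose(K @ g, g)
    ```

    A state Bob corrupts fails at least one of these stabilizer checks with nonzero probability, which is what makes random testing of copies effective.
    
    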

  15. EXProt: a database for proteins with an experimentally verified function.

    PubMed

    Ursing, Björn M; van Enckevort, Frank H J; Leunissen, Jack A M; Siezen, Roland J

    2002-01-01

    EXProt is a non-redundant protein database containing a selection of entries from genome annotation projects and public databases, aimed at including only proteins with an experimentally verified function. In EXProt release 2.0 we have collected entries from the Pseudomonas aeruginosa community annotation project (PseudoCAP), the Escherichia coli genome and proteome database (GenProtEC) and the translated coding sequences from the Prokaryotes division of EMBL nucleotide sequence database, which are described as having an experimentally verified function. Each entry in EXProt has a unique ID number and contains information about the species, amino acid sequence, functional annotation and, in most cases, links to references in MEDLINE/PubMed and to the entry in the original database. EXProt is indexed in SRS at CMBI (http://www.cmbi.kun.nl/srs/) and can be searched with BLAST and FASTA through the EXProt web page (http://www.cmbi.kun.nl/EXProt/). PMID:11752251
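    The record structure described above (unique ID, species, sequence, functional annotation, literature links, source database) can be sketched as a simple data type. Field names here are illustrative, not the actual EXProt schema:

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ExProtEntry:
        """One EXProt-style record; field names are hypothetical."""
        entry_id: str                 # unique EXProt ID number
        species: str
        sequence: str                 # amino acid sequence
        function: str                 # experimentally verified functional annotation
        medline_ids: List[str] = field(default_factory=list)  # MEDLINE/PubMed links
        source_db: str = ""           # e.g. PseudoCAP, GenProtEC, EMBL

    e = ExProtEntry("EXP000001", "Pseudomonas aeruginosa",
                    "MKT...", "catalase", ["11752251"], "PseudoCAP")
    assert e.source_db == "PseudoCAP"
    ```
    
    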

  16. Real-Time Projection to Verify Plan Success During Execution

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occur during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
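    The battery-management example reduces to a simple projection loop: from the current measured state, walk the remaining plan and confirm no goal (here, the safe energy floor) becomes unachievable. A toy sketch under assumed names and numbers (not the Mission Data System API):

    ```python
    def plan_achievable(battery_wh, activities, floor_wh):
        """Project battery state across the remaining activities; the plan is
        achievable only if energy never drops below the safe floor.
        Hypothetical model: each activity is (name, net_wh), negative = consumption."""
        level = battery_wh
        for name, net_wh in activities:
            level += net_wh
            if level < floor_wh:
                return False, name        # first goal that becomes unachievable
        return True, None

    plan = [("drive-to-wp1", -300), ("science-obs", -120), ("recharge", +200)]

    # Re-verify continuously as execution updates the measured battery state:
    ok, failing = plan_achievable(1000, plan, floor_wh=400)
    assert ok
    ok, failing = plan_achievable(700, plan, floor_wh=400)   # unanticipated drain
    assert not ok and failing == "science-obs"
    ```

    The second call shows the point of early detection: the failure is predicted one activity ahead, while operators still have options.
    
    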

  17. Seismic margins and calibration of piping systems

    SciTech Connect

    Shieh, L.C.; Tsai, N.C.; Yang, M.S.; Wong, W.L.

    1985-01-01

    The Seismic Safety Margins Research Program (SSMRP) is a US Nuclear Regulatory Commission-funded, multiyear program conducted by Lawrence Livermore National Laboratory (LLNL). Its objective is to develop a complete, fully coupled analysis procedure for estimating the risk of earthquake-induced radioactive release from a commercial nuclear power plant and to identify the major contributors to that risk; the methodology uses state-of-the-art seismic and systems analysis and explicitly includes the uncertainties in such a process. The results will be used to improve seismic licensing requirements for nuclear power plants. In Phase I of SSMRP, the overall seismic risk assessment methodology was developed and assembled. The application of this methodology to the seismic PRA (Probabilistic Risk Assessment) at the Zion Nuclear Power Plant has been documented. This report documents the method for deriving response factors. The response factors, which relate design-calculated responses to best-estimate values, were used in the seismic response determination of piping systems for a simplified seismic probabilistic risk assessment. 13 references, 31 figures, 25 tables.
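    The response-factor idea is simple arithmetic: F is the ratio of the design-calculated response to the best-estimate (median) response, so dividing a design value by F recovers a best-estimate value. A sketch with illustrative numbers (not taken from the report's tables):

    ```python
    # Response factor: F = R_design / R_best_estimate (illustrative values, in g)
    design = {"pipe-A": 0.45, "pipe-B": 0.60}        # design-calculated responses
    best_estimate = {"pipe-A": 0.30, "pipe-B": 0.48}  # median (best-estimate) responses

    factors = {k: design[k] / best_estimate[k] for k in design}

    # A best-estimate response is recovered by dividing the design value by F:
    assert abs(design["pipe-A"] / factors["pipe-A"] - best_estimate["pipe-A"]) < 1e-12
    ```
    
    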

  18. Nuclear archaeology: Verifying declarations of fissile-material production

    SciTech Connect

    Fetter, S. )

    1993-01-01

    Controlling the production of fissile material is an essential element of nonproliferation policy. Similarly, accounting for the past production of fissile material should be an important component of nuclear disarmament. This paper describes two promising techniques that make use of physical evidence at reactors and enrichment facilities to verify the past production of plutonium and highly enriched uranium. In the first technique, the concentrations of long-lived radionuclides in permanent components of the reactor core are used to estimate the neutron fluence in various regions of the reactor, and thereby verify declarations of plutonium production in the reactor. In the second technique, the ratio of the concentration of U-235 to that of U-234 in the tails is used to determine whether a given container of tails was used in the production of low-enriched uranium, which is suitable for reactor fuel, or highly enriched uranium, which can be used in nuclear weapons. Both techniques belong to the new field of "nuclear archaeology," in which the authors attempt to document past nuclear weapons activities and thereby lay a firm foundation for verifiable nuclear disarmament. 11 refs., 1 fig., 3 tabs.
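    A sense of why enrichment declarations are checkable at all comes from the standard two-component uranium mass balance, which fixes how much feed and tails any declared product implies. This sketch uses that textbook balance (it does not model the U-234/U-235 forensic signature itself, which requires stage separation factors):

    ```python
    def feed_and_tails(product_kg, x_product, x_feed, x_tails):
        """Standard two-component mass balance for an enrichment campaign:
        F = P * (xP - xT) / (xF - xT), and T = F - P (assays as weight fractions)."""
        feed = product_kg * (x_product - x_tails) / (x_feed - x_tails)
        return feed, feed - product_kg

    # Per kg of product from natural feed (0.711 wt% U-235) with 0.3 wt% tails:
    feed_leu, tails_leu = feed_and_tails(1.0, 0.045, 0.00711, 0.003)  # 4.5% LEU
    feed_heu, tails_heu = feed_and_tails(1.0, 0.90,  0.00711, 0.003)  # 90% HEU

    # HEU production leaves behind roughly 20x more feed and tails per kg of
    # product, so the declared tails inventory constrains what was produced.
    assert feed_heu > 20 * feed_leu
    ```
    
    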

  19. Annual Hanford Seismic Report for Fiscal Year 2002

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2002-11-15

    This report summarizes the earthquake activity on Hanford for FY 2002. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 1,177 triggers during fiscal year 2002. Of these triggers, 553 were earthquakes. Forty-two earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 13 occurred in the Columbia River basalt, 12 in the pre-basalt sediments, and 17 in the crystalline basement. Geographically, 13 earthquakes occurred in swarm areas, 1 earthquake was associated with major structures, and 28 were random events. There were no earthquake triggers of the Hanford Strong Motion Accelerometers during fiscal year 2002.

  20. Second Quarter Hanford Seismic Report for Fiscal Year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-04-16

    This report describes the earthquakes that occurred on and near the Hanford Site during the second quarter of FY03. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 271 triggers during the second quarter of fiscal year 2003. Of these triggers, 141 were earthquakes. Twenty earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 9 earthquakes occurred in the Columbia River basalt, 2 in the pre-basalt sediments, and 9 in the crystalline basement. Geographically, 6 earthquakes occurred in swarm areas, 2 earthquakes were associated with a major geologic structure, and 12 were classified as random events.

  1. Angola Seismicity MAP

    NASA Astrophysics Data System (ADS)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document Angola's natural seismicity and to establish the country's first seismic database, facilitating consultation and searching of information on seismic activity in the country. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), who defined six seismogenic zones from macroseismic data. The most important of these is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone, Angola's principal seismic zone, covering the Quihita and Iona epicentral regions. Geologically it is characterized by a transcontinental tectono-magmatic structure activated in the Mesozoic, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic, and alkaline composition, including kimberlites and carbonatites, and is strongly marked by intense tectonism with numerous faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) at Quihita, and during the Iona seismic activity of January 15, 1964, the main shock reached intensity VI-VII. Although their seismicity rates are lower, the other five zones cannot be neglected: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. The compiled information made possible the creation of the first database of seismic data for Angola and the preparation of a seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  2. Rock-physics and seismic-inversion based reservoir characterization of the Haynesville Shale

    NASA Astrophysics Data System (ADS)

    Jiang, Meijuan; Spikes, Kyle T.

    2016-06-01

    Seismic reservoir characterization of unconventional gas shales is challenging due to their heterogeneity and anisotropy. Rock properties of unconventional gas shales such as porosity, pore-shape distribution, and composition are important for interpreting seismic data amplitude variations in order to locate optimal drilling locations. The presented seismic reservoir characterization procedure applied a grid-search algorithm to estimate the composition, pore-shape distribution, and porosity at the seismic scale from the seismically inverted impedances and a rock-physics model, using the Haynesville Shale as a case study. All the proposed rock properties affected the seismic velocities, and the combined effects of these rock properties on the seismic amplitude were investigated simultaneously. The P- and S-impedances correlated negatively with porosity, and the VP/VS correlated positively with clay fraction and negatively with the pore-shape distribution and quartz fraction. The reliability of these estimated rock properties at the seismic scale was verified through comparisons between two sets of elastic properties: one coming from inverted impedances, which were obtained from simultaneous inversion of prestack seismic data, and one derived from these estimated rock properties. The differences between the two sets of elastic properties were less than a few percent, verifying the feasibility of the presented seismic reservoir characterization.
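    The grid-search step can be sketched as follows: sweep candidate rock properties, forward-model elastic properties with a rock-physics model, and keep the combination that best matches the inverted impedances. The forward model below is a toy linear stand-in with made-up coefficients, not the paper's anisotropic rock-physics model:

    ```python
    import numpy as np

    def toy_impedance(porosity, clay):
        """Toy stand-in for a rock-physics model (coefficients are invented):
        P-impedance falls with porosity and clay; Vp/Vs rises with clay."""
        ip = 12.0 - 14.0 * porosity - 2.0 * clay    # pseudo P-impedance
        vpvs = 1.55 + 0.5 * clay - 0.2 * porosity
        return ip, vpvs

    def grid_search(ip_obs, vpvs_obs):
        """Exhaustively search porosity/clay for the best misfit to the
        seismically inverted observables."""
        best, best_err = None, np.inf
        for phi in np.arange(0.02, 0.15, 0.005):
            for clay in np.arange(0.2, 0.6, 0.02):
                ip, vpvs = toy_impedance(phi, clay)
                err = ((ip - ip_obs) / ip_obs) ** 2 + ((vpvs - vpvs_obs) / vpvs_obs) ** 2
                if err < best_err:
                    best, best_err = (phi, clay), err
        return best

    # Synthetic check: properties used to generate the "observations" are recovered
    ip_true, vpvs_true = toy_impedance(0.08, 0.40)
    phi, clay = grid_search(ip_true, vpvs_true)
    assert abs(phi - 0.08) < 0.005 and abs(clay - 0.40) < 0.02
    ```

    The paper's verification step corresponds to the final round trip here: forward-model the estimated properties and compare against the inverted elastic properties.
    
    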

  3. Seismic signal of avalanches

    NASA Astrophysics Data System (ADS)

    Pesaresi, Damiano; Ravanat, Xavier; Thibert, Emmanuel

    2010-05-01

    The characterization of avalanches with seismic signals is an important task. For risk mitigation, remotely estimating avalanche activity by means of seismic signals is a good alternative to direct observations, which are often limited by visual conditions and observer availability. In seismology, the main challenge is to discriminate avalanche signals within the natural earth seismic activity and background noise. Some anthropogenic low-frequency (infrasound) sources, such as helicopters, also generate seismic signals. In order to characterize an avalanche seismic signal, a 3-axis broadband seismometer (Guralp 3T) has been set up on a real-scale avalanche test site at Lautaret (France). The sensor is located in proximity to 2 avalanche paths where avalanches can be artificially released. Preliminary results of seismic records are presented, correlated with avalanche physical parameters (volume released, velocity, energy).

  4. Oklahoma seismic network. Final report

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr.

    1993-07-01

    The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number of areas of the Midcontinent region because of the lack of instrumentation and (or) the sensitivity of the instruments deployed to monitor earthquake events. The Nemaha Uplift/Midcontinent Geophysical Anomaly is one of five principal areas east of the Rocky Mountain front that has a moderately high seismic-risk classification. The Nemaha uplift, which is common to the states of Oklahoma, Kansas, and Nebraska, is approximately 415 miles long and 12-14 miles wide. The Midcontinent Geophysical Anomaly extends southward from Minnesota across Iowa and the southeastern corner of Nebraska and probably terminates in central Kansas. A number of moderate-sized earthquakes--magnitude 5 or greater--have occurred along or west of the Nemaha uplift. The Oklahoma Geological Survey, in cooperation with the geological surveys of Kansas, Nebraska, and Iowa, conducted a 5-year investigation of the seismicity and tectonic relationships of the Nemaha uplift and associated geologic features in the Midcontinent. This investigation was intended to provide data to be used to design nuclear-power plants. However, the information is also being used to design better large-scale structures, such as dams and high-use buildings, and to provide the necessary data to evaluate earthquake-insurance rates in the Midcontinent.

  5. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  6. Volcano seismicity in Alaska

    NASA Astrophysics Data System (ADS)

    Buurman, Helena

    I examine the many facets of volcano seismicity in Alaska: from the short-lived eruption seismicity that is limited to only the few weeks during which a volcano is active, to the seismicity that occurs in the months following an eruption, and finally to the long-term volcano seismicity that occurs in the years in which volcanoes are dormant. I use the rich seismic dataset that was recorded during the 2009 eruption of Redoubt Volcano to examine eruptive volcano seismicity. I show that the progression of magma through the conduit system at Redoubt could be readily tracked by the seismicity. Many of my interpretations benefited greatly from the numerous other datasets collected during the eruption. Rarely was there volcanic activity that did not manifest itself seismically in some way, resulting in a remarkably complete chronology of the 2009 eruption within the seismic record. I also use the Redoubt seismic dataset to study post-eruptive seismicity. During the year following the eruption there were a number of unexplained bursts of shallow seismicity that did not culminate in eruptive activity despite closely mirroring seismic signals that had preceded explosions less than a year prior. I show that these episodes of shallow seismicity were in fact related to volcanic processes much deeper in the volcanic edifice by demonstrating that earthquakes that were related to magmatic activity during the eruption were also present during the renewed shallow unrest. These results show that magmatic processes can continue for many months after eruptions end, suggesting that volcanoes can stay active for much longer than previously thought. In the final chapter I characterize volcanic earthquakes on a much broader scale by analyzing a decade of continuous seismic data across 46 volcanoes in the Aleutian arc to search for regional-scale trends in volcano seismicity. I find that volcanic earthquakes below 20 km depth are much more common in the central region of the arc

  7. Seismic Imaging and Monitoring

    SciTech Connect

    Huang, Lianjie

    2012-07-09

    I give an overview of LANL's capability in seismic imaging and monitoring. I present some seismic imaging and monitoring results, including imaging of complex structures, subsalt imaging of the Gulf of Mexico, fault/fracture zone imaging for geothermal exploration at the Jemez pueblo, time-lapse imaging of walkaway vertical seismic profiling data for monitoring CO2 injection at SACROC, and microseismic event locations for monitoring CO2 injection at Aneth. These examples demonstrate LANL's high-resolution and high-fidelity seismic imaging and monitoring capabilities.

  8. Permeameter data verify new turbulence process for MODFLOW

    USGS Publications Warehouse

    Kuniansky, Eve L.; Halford, Keith J.; Shoemaker, W. Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005.
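    The "nonlinear relation between hydraulic gradient and discharge" is the signature of turbulent (non-Darcian) flow, commonly modeled by adding a quadratic term to Darcy's linear law (a Forchheimer-type relation). A sketch of fitting such a relation to gradient-discharge data, with synthetic numbers rather than the published permeameter measurements:

    ```python
    import numpy as np

    # Forchheimer-type relation: gradient i = a*Q + b*Q**2, where the
    # quadratic term captures turbulent head losses absent from Darcy's law.
    a_true, b_true = 2.0, 15.0                 # illustrative coefficients
    Q = np.linspace(0.001, 0.01, 20)           # discharge values
    i = a_true * Q + b_true * Q**2             # synthetic "measured" gradients

    # Least-squares fit of both coefficients from the data
    A = np.column_stack([Q, Q**2])
    a_fit, b_fit = np.linalg.lstsq(A, i, rcond=None)[0]

    assert abs(a_fit - a_true) < 1e-6 and abs(b_fit - b_true) < 1e-3
    ```

    A nonzero fitted quadratic coefficient is what distinguishes turbulent behavior from the purely linear Darcy regime that standard MODFLOW assumes.
    
    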

  9. Verifying a Simplified Fuel Oil Flow Field Measurement Protocol

    SciTech Connect

    Henderson, H.; Dentz, J.; Doty, C.

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  10. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    NASA Astrophysics Data System (ADS)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galileo's discoveries. After they failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and obtained another one from Venice, they could verify Galileo's observations and completely accepted them. Clavius, who adhered to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He and his confreres, however, avoided drawing any conclusions with respect to the planetary system.

  11. Verified bites by the woodlouse spider, Dysdera crocata.

    PubMed

    Vetter, Richard S; Isbister, Geoffrey K

    2006-06-01

    Bites by the woodlouse spider, Dysdera crocata, are virtually innocuous. The main symptom is minor pain, typically lasting less than 1h, probably due mostly to mechanical puncture of the skin. However, because the spider has a strong proclivity to bite, has large fangs which it bares when threatened and is commonly mistaken for the medically important brown recluse spider in the United States, documentation of the mild effects of its bites may prevent excessive, unwarranted and possibly harmful treatment. We present information on eight verified bites reported to us as well as eight additional bites recorded in the literature. PMID:16574180

  12. Verifying compliance to the biological and toxin weapons convention.

    PubMed

    Zilinskas, R A

    1998-01-01

    There are difficult technical problems inherent in verifying compliance with the Biological and Toxin Weapons Convention (BWC) that are making it difficult to reach international agreement on a verification protocol. A compliance regime will most likely involve the formation of an Organization for the Prevention of Biological Warfare (OPBW). Based in part on the experience of UNSCOM in Iraq, this article considers the value of establishing an OPBW and the problems that would be faced by such an international organization. It also reviews the types of verification measures that might be applied by the OPBW and their limitations and benefits for deterring biological weapons programs. PMID:9800100

  13. Verifying a Simplified Fuel Oil Field Measurement Protocol

    SciTech Connect

    Henderson, Hugh; Dentz, Jordan; Doty, Chris

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  14. Verifiable Quantum ( k, n)-threshold Secret Key Sharing

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Teng, Yi-Wei; Chai, Hai-Ping; Wen, Qiao-Yan

    2011-03-01

    Based on Lagrange interpolation formula and the post-verification mechanism, we show how to construct a verifiable quantum ( k, n) threshold secret key sharing scheme. Compared with the previous secret sharing protocols, ours has the merits: (i) it can resist the fraud of the dealer who generates and distributes fake shares among the participants during the secret distribution phase; Most importantly, (ii) It can check the cheating of the dishonest participant who provides a false share during the secret reconstruction phase such that the authorized group cannot recover the correct secret.
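    The Lagrange-interpolation backbone that the scheme builds on is classical Shamir (k, n) sharing: the secret is the constant term of a random degree-(k-1) polynomial, and any k shares recover it by interpolating at x = 0. A classical sketch of just that backbone (the quantum scheme adds the post-verification mechanism on top, which is not modeled here):

    ```python
    import random

    P = 2**61 - 1  # prime modulus for the finite field

    def make_shares(secret, k, n):
        """Shares are points (x, f(x)) on a random degree-(k-1) polynomial
        with f(0) = secret."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        return [(x, sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 over GF(P)."""
        total = 0
        for xi, yi in shares:
            num = den = 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
        return total

    shares = make_shares(123456789, k=3, n=5)
    assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice
    assert reconstruct(shares[1:4]) == 123456789
    ```

    The cheating-detection properties described in the abstract are precisely what this bare construction lacks: a participant submitting a false (x, y) point silently corrupts the interpolation, which motivates the verification mechanism.
    
    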

  15. An economical educational seismic system

    USGS Publications Warehouse

    Lehman, J. D.

    1980-01-01

    There is considerable interest in seismology from the nonprofessional or amateur standpoint. The operation of a seismic system can be satisfying and educational, especially when you have built and operated the system yourself. A long-period indoor-type sensor and recording system that works extremely well has been developed in the James Madison University Physics Department. The system can be built quite economically, and any educational institution that cannot commit itself to a professional installation need not be without first-hand seismic information. The system design approach has been selected by college students working on a project or senior thesis, by several elementary and secondary science teachers, and by the more ambitious tinkerer or hobbyist at home.

  16. Applied methods to verify LP turbine performance after retrofit

    SciTech Connect

    Overby, R.; Lindberg, G.

    1996-12-31

    With increasing operational hours of power plants, many utilities may find it necessary to replace turbine components, such as low pressure turbines. In order to decide between different technical and economic solutions, the utility often takes the opportunity to choose between an OEM or non-OEM supplier. This paper will deal with the retrofitting of LP turbines. Depending on the scope of supply, the contract must define the amount of improvement and specifically how to verify this improvement. Unfortunately, today's Test Codes, such as ASME PTC 6 and 6.1, do not satisfactorily cover these cases. The methods used by Florida Power and Light (FP&L) and its supplier to verify the improvement of the low pressure turbine retrofit at the Martin No. 1 and Sanford No. 4 units will be discussed and the experience gained will be presented. In particular, the influence of the thermal cycle on the applicability of the available methods will be analyzed and recommendations given.

  17. A verified minimal YAC contig for human chromosome 21

    SciTech Connect

    Graw, S.L.; Patterson, D.; Drabkin, H.

    1994-09-01

    The goal of this project is the construction of a verified YAC contig of the complete long arm of human chromosome 21 utilizing YACs from the CEPH and St. Louis libraries. The YACs in this contig have been analyzed for size by PFGE, tested for chimerism by FISH or end-cloning, and verified for STS content by PCR. This last analysis has revealed a number of cases of conflict with the published STS order. To establish correct order, we have utilized STS content analysis of somatic cell hybrids containing portions of chromosome 21. Additional problems being addressed include completeness of coverage and possible deletions or gaps. Questions of completeness of the CEPH 810 YAC set arose after screening with 57 independently derived probes failed to identify clones for 11 (19%). Ten of the 11, however, do detect chromosome 21 cosmids when used to screen Lawrence Livermore library LL21NC02"G", a cosmid library constructed from flow-sorted chromosomes 21. Remaining gaps in the contig are being closed by several methods. These include YAC fingerprinting and conversion of YACs to cosmids. In addition, we are establishing the overlap between the physical NotI map and the YAC contig by testing YACs for NotI sites and screening the YACs in the contig for the presence of NotI-linking clones.

  18. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    U.S. Geological Survey

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  19. Seismic Catalogue and Seismic Network in Haiti

    NASA Astrophysics Data System (ADS)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

    The destructive earthquake that occurred on January 10, 2010 in Haiti highlighted the lack of preparedness of the country to address seismic phenomena. At the moment of the earthquake, there was no seismic network operating in the country, and only partial knowledge of past seismicity was available, due to the absence of a national catalogue. After the 2010 earthquake, advances began towards the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out on both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events that occurred in the Hispaniola Island and surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30,000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7,700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were given by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by important heterogeneity in the size parameter. It was then homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government. The data were sent by telemetry through the Canadian system CARINA. In 2012, the Spanish IGN together
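    Catalogue homogenization of the kind described here typically applies linear empirical relations Mw = a*M + b per magnitude scale. A sketch of the mechanics, using illustrative global coefficients (after Scordilis, 2006) rather than the Caribbean-specific Bonzoni et al. (2011) relations actually used in SISMO-HAITI:

    ```python
    # Placeholder linear conversion coefficients (a, b) per magnitude scale;
    # illustrative global values, NOT the project's regional relations.
    COEFFS = {"Ms": (0.67, 2.07), "mb": (0.85, 1.03)}

    def to_mw(mag, scale):
        """Homogenize one magnitude to Mw via Mw = a*M + b."""
        a, b = COEFFS[scale]
        return a * mag + b

    # Events already reported as Mw pass through unchanged
    catalogue = [("Ms", 6.0), ("mb", 5.2), ("Mw", 7.0)]
    homogenized = [m if s == "Mw" else to_mw(m, s) for s, m in catalogue]
    ```

    The point of the step is that frequency-magnitude statistics and hazard input only make sense once every event carries a magnitude on the same scale.
    
    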

  20. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough on the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent property of magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of the smart seismic isolation system. The novel isolator contains a unique laminated structure of steel and MR elastomer layers, which enables its large-scale civil engineering applications, and a solenoid to provide a sufficient and uniform magnetic field for energizing the field-dependent property of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the developed adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. In this paper, a comprehensive review of the development of the adaptive seismic isolator is presented, including the design, analysis and testing of two prototypical adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototypical MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable increase of lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the seismic isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive controls.

  1. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the more relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by the risk perception of people. In order to develop effective information and risk communication strategies, the perception of risks and the influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, it is necessary to consider several perspectives: social, psychological and cultural, and their interactions. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing perceived hazard, vulnerability and exposure with real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) is designed with the semantic differential method, using opposite terms on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by telephone interview (C.A.T.I.) to a national statistical sample of over 4,000 people in January-February 2015. Results show that risk perception seems to be underestimated for all the indicators considered. In particular, scores on the seismic Vulnerability factor are extremely low compared with the housing data provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence with respect to other natural hazards, participation in risk reduction activities and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by

  2. Seismic-source representation for spall

    SciTech Connect

    Day, S.M.; McLaughlin, K.L.

    1990-11-21

    Spall may be a significant secondary source of seismic waves from underground explosions. The proper representation of spall as a seismic source is important for forward and inverse modeling of explosions for yield estimation and discrimination studies. We present a new derivation of a widely used point force representation for spall, which is based on a horizontal tension crack model. The derivation clarifies the relationship between point force and moment tensor representations of the tension crack. For wavelengths long compared with spall depth, the two representations are equivalent, and the moment tensor time history is proportional to the doubly integrated time history of the point force. Numerical experiments verify that, for regional seismic phases, this equivalence is valid for all frequencies for which the point-source (long wavelength) approximation is valid. Further analysis shows that the moment tensor and point force representations retain their validity for non-planar spall surfaces, provided that the average dip of the surface is small. The equivalency of the two representations implies that a singular inverse problem will result from attempts to infer simultaneously the spectra of both these source terms from seismic waveforms. If the spall moment tensor alone is estimated by inversion of waveform data, the inferred numerical values of its components will depend inversely upon the source depth which is assumed in the inversion formalism.
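
    The long-wavelength equivalence stated above means the spall moment tensor time history is proportional to the doubly integrated point-force time history. A minimal numerical sketch of that double integration, using the trapezoidal rule and a constant-force example (illustrative only, not the authors' modeling code):

```python
def cumulative_trapezoid(y, dt):
    """Cumulative trapezoidal integral of samples y with spacing dt."""
    out = [0.0]
    for i in range(1, len(y)):
        out.append(out[-1] + 0.5 * (y[i - 1] + y[i]) * dt)
    return out

def double_integral(force, dt):
    """Doubly integrate a point-force time history (the quantity the
    spall moment tensor history is proportional to at long wavelengths)."""
    return cumulative_trapezoid(cumulative_trapezoid(force, dt), dt)

# Example: a constant unit force over 1 s; the exact double integral at
# time t is t**2 / 2, i.e. 0.5 at t = 1 s.
dt = 0.001
force = [1.0] * 1001              # samples at t = 0 ... 1 s
moment_proxy = double_integral(force, dt)
```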

  3. Application of seismic tomography in underground mining

    SciTech Connect

    Scott, D.F.; Williams, T.J.; Friedel, M.J.

    1996-12-01

    Seismic tomography, as used in mining, is based on the principle that highly stressed rock will exhibit relatively higher P-wave velocities than rock under less stress. A decrease or increase in stress over time can be verified by comparing successive tomograms. Personnel at the Spokane Research Center have been investigating the use of seismic tomography to identify stress in remnant ore pillars in deep (greater than 1220 m) underground mines. In this process, three-dimensional seismic surveys are conducted in a pillar between mine levels. A sledgehammer is used to generate P-waves, which are recorded by geophones connected to a stacking signal seismograph capable of collecting and storing the P-wave data. Travel times are input into a spreadsheet, and apparent velocities are then generated and merged into imaging software. Mine workings are superimposed over apparent P-wave velocity contours to generate a final tomographic image. Results of a seismic tomographic survey at the Sunshine Mine, Kellogg, ID, indicate that low-velocity areas (low stress) are associated with mine workings and high-velocity areas (higher stress) are associated with areas where no mining has taken place. A high stress gradient was identified in an area where ground had failed. From this tomographic survey, as well as four earlier surveys at other deep underground mines, a method was developed to identify relative stress in remnant ore pillars. This information is useful in making decisions about miner safety when mining such ore pillars.
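
    The spreadsheet step above reduces to a straight-ray calculation: apparent velocity is source-to-geophone distance divided by P-wave travel time. A minimal sketch (function name and example values are hypothetical):

```python
import math

def apparent_velocity(source, geophone, travel_time_s):
    """Straight-ray apparent P-wave velocity (m/s) between two 3-D points
    given in metres. Higher values suggest more highly stressed rock."""
    dist = math.dist(source, geophone)   # straight-ray path length
    return dist / travel_time_s

# Example ray: a 100 m path travelled in 20 ms -> 5000 m/s.
v = apparent_velocity((0.0, 0.0, 0.0), (100.0, 0.0, 0.0), 0.020)
```

Real tomographic inversion distributes slowness along many crossing rays; this is only the single-ray input to that process.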

  4. The Spatial Scale of Detected Seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Chen, C.-C.

    2016-01-01

    An experimental method for the spatial resolution analysis of the earthquake frequency-magnitude distribution is introduced in order to identify the intrinsic spatial scale of the detected seismicity phenomenon. We consider the unbounded magnitude range m ∈ (-∞, +∞), which includes incomplete data below the completeness magnitude mc. By analyzing a relocated earthquake catalog of Taiwan, we find that the detected seismicity phenomenon is scale-variant for m ∈ (-∞, +∞), with its spatial grain a function of the configuration of the seismic network, while seismicity is known to be scale-invariant for m ∈ [mc, +∞). Correction for data incompleteness for m < mc, based on the knowledge of the spatial scale of the process, allows extending the analysis of the Gutenberg-Richter law and of the fractal dimension to lower magnitudes. This should allow verifying the continuity of universality of these parameters over a wider magnitude range. Our results also suggest that the commonly accepted Gaussian model of earthquake detection might be an artifact of observation.
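
    Extending the Gutenberg-Richter analysis presumes a b-value estimated on the complete part of the catalog, m ≥ mc. A standard sketch is Aki's maximum-likelihood estimator (shown without the magnitude-binning correction a real study would apply):

```python
import math

def b_value_mle(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for events with m >= mc.

    b = log10(e) / (mean(m) - mc). A binned catalog would replace mc by
    mc - dm/2, where dm is the bin width; omitted here for clarity."""
    complete = [m for m in magnitudes if m >= mc]
    mean_m = sum(complete) / len(complete)
    return math.log10(math.e) / (mean_m - mc)
```

When the mean magnitude exceeds mc by exactly log10(e) ≈ 0.434, the estimator returns b = 1, the canonical Gutenberg-Richter slope.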

  5. Seismic isolation of two dimensional periodic foundations

    SciTech Connect

    Yan, Y.; Mo, Y. L.; Laskar, A.; Cheng, Z.; Shi, Z.; Menq, F.; Tang, Y.

    2014-07-28

    Phononic crystals are now used to control acoustic waves. When the crystal is scaled up, it is called a periodic structure. The band gaps of the periodic structure can be brought down to the range from 0.5 Hz to 50 Hz. Therefore, the periodic structure has potential applications in seismic wave reflection. In civil engineering, the periodic structure can serve as the foundation of an upper structure. This type of foundation, consisting of a periodic structure, is called a periodic foundation. When the frequency of seismic waves falls into the band gaps of the periodic foundation, the seismic waves can be blocked. Field experiments on a scaled two-dimensional (2D) periodic foundation with an upper structure were conducted to verify the band gap effects. Test results showed that the 2D periodic foundation can effectively reduce the response of the upper structure for excitations with frequencies within the band gaps. The experimental and finite element analysis results agree well with each other, indicating that the 2D periodic foundation is a feasible way of reducing seismic vibrations.
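
    The attenuation logic above is a simple interval test: an excitation component is blocked when its frequency lies inside a band gap. A minimal sketch (the gap intervals below are illustrative numbers inside the quoted 0.5-50 Hz range, not measured values):

```python
def in_band_gap(freq_hz, gaps):
    """True if a wave frequency falls inside any (low, high) band gap,
    i.e. the periodic foundation is expected to block that component."""
    return any(low <= freq_hz <= high for low, high in gaps)

# Hypothetical band gaps of a scaled 2D periodic foundation.
gaps = [(2.0, 9.0), (13.0, 20.0)]
```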

  6. A Novel Simple Phantom for Verifying the Dose of Radiation Therapy

    PubMed Central

    Lee, J. H.; Chang, L. T.; Shiau, A. C.; Chen, C. W.; Liao, Y. J.; Li, W. J.; Lee, M. S.; Hsu, S. M.

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with 60Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions. PMID:25883980
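
    The correction chain described above amounts to a product of factors converting a TLD readout into absorbed dose in water, followed by the 3% audit test. A minimal sketch (function names and the factor values in the example are illustrative, not the calibrated ones):

```python
def absorbed_dose(tld_readout, calib, k_scatter, k_energy, k_setup):
    """Absorbed dose in water from a mailed-TLD readout.

    calib converts readout to dose; k_scatter, k_energy and k_setup are
    the phantom scatter, energy-dependence and setup-condition correction
    factors described in the audit protocol."""
    return tld_readout * calib * k_scatter * k_energy * k_setup

def within_tolerance(measured, stated, tol=0.03):
    """Audit passes when the dose deviation is within 3%."""
    return abs(measured - stated) / stated <= tol
```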

  7. A novel simple phantom for verifying the dose of radiation therapy.

    PubMed

    Lee, J H; Chang, L T; Shiau, A C; Chen, C W; Liao, Y J; Li, W J; Lee, M S; Hsu, S M

    2015-01-01

    A standard protocol of dosimetric measurements is used by the organizations responsible for verifying that the doses delivered in radiation-therapy institutions are within authorized limits. This study evaluated a self-designed simple auditing phantom for use in verifying the dose of radiation therapy; the phantom design, dose audit system, and clinical tests are described. Thermoluminescent dosimeters (TLDs) were used as postal dosimeters, and mailable phantoms were produced for use in postal audits. Correction factors are important for converting TLD readout values from phantoms into the absorbed dose in water. The phantom scatter correction factor was used to quantify the difference in the scattered dose between a solid water phantom and homemade phantoms; its value ranged from 1.084 to 1.031. The energy-dependence correction factor was used to compare the TLD readout of the unit dose irradiated by audit beam energies with (60)Co in the solid water phantom; its value was 0.99 to 1.01. The setup-condition factor was used to correct for differences in dose-output calibration conditions. Clinical tests of the device calibrating the dose output revealed that the dose deviation was within 3%. Therefore, our homemade phantoms and dosimetric system can be applied for accurately verifying the doses applied in radiation-therapy institutions. PMID:25883980

  8. Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1992-01-01

    This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.

  9. Seismic Computerized Alert Network

    USGS Publications Warehouse

    1986-01-01

    In 1985 the USGS devised a model for a Seismic Computerized Alert Network (SCAN) that would use continuous monitoring of seismic data from existing types of instruments to provide automatic, highly-reliable early warnings of earthquake shaking. In a large earthquake, substantial damaging ground motions may occur at great distances from the earthquake's epicenter.

  10. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.

  11. Study on Seismicity of Sino-Mongolia Arc Areas

    NASA Astrophysics Data System (ADS)

    Xu, Guangyin; Wang, Suyun

    2016-04-01

    Using the earthquake catalogues from China and Mongolia and the global catalogue, a uniform catalogue of North China, Mongolia and adjacent areas, covering the region 80-130°E, 40-55°N, has been established by the Institute of Geophysics, China Earthquake Administration and the Research Center of Astronomy and Geophysics, Mongolian Academy of Sciences for seismic hazard analysis and the seismic zoning map of Mongolia, according to the following principles. 1) Earthquakes that appear in only one catalogue need further verification: if such an earthquake occurred in the country the catalogue comes from, it is adopted; if not, it is checked against additional data. 2) Events common to the three data sources were checked and verified as follows: (1) the parameters of earthquakes that occurred in China are taken from the China catalogue; (2) the parameters of earthquakes that occurred in Mongolia are taken from the Mongolia catalogue; (3) the parameters of earthquakes that occurred in the adjacent areas are taken from the global catalogue by Song et al. Based on the uniform catalogue, the seismicity of North China, Mongolia and adjacent areas is analyzed, and the following conclusions are drawn. 1) The epicenter map can be roughly divided into two parts, bounded by the 105°E meridian, in accordance with the "North-South Seismic Belt" of China; seismicity is at a high level with many strong earthquakes in the west and at a low level with few strong events in the east. 2) Most earthquakes are shallow-focus events, but there are also several intermediate- or deep-focus events in the study area. 3) Earthquakes with magnitude greater than 5 are essentially complete since 1450 A.D., and the seismicity of the study area has been at a high level since 1700 A.D. 4) Two seismic belts, the Altay seismic belt and the Bolnay-Baikal seismic belt, are identified according to the epicenters and tectonics. 5) The b-values of the magnitude-frequency
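
    The compilation principles above reduce to a source-priority rule for cross-listed events: each national catalogue takes precedence inside its own borders, the global catalogue elsewhere. A minimal sketch (region labels and the function name are hypothetical):

```python
def pick_source(region):
    """Source priority when merging a cross-listed event into the
    uniform catalogue: national catalogues win inside their borders,
    the global catalogue (Song et al.) covers the adjacent areas."""
    if region == "China":
        return "China catalogue"
    if region == "Mongolia":
        return "Mongolia catalogue"
    return "global catalogue"
```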

  12. Developing an Approach for Analyzing and Verifying System Communication

    NASA Technical Reports Server (NTRS)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks, and that communication must be reliable. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  13. Cryptanalysis and improvement of verifiable quantum ( k, n) secret sharing

    NASA Astrophysics Data System (ADS)

    Song, Xiuli; Liu, Yanbing

    2016-02-01

    After analyzing Yang's verifiable quantum secret sharing (VQSS) scheme, we show that in their scheme a participant can prepare a false quantum particle sequence corresponding to a forged share without any other participant being able to trace it. In addition, an attacker or a participant can forge a new quantum sequence by transforming an intercepted quantum sequence; moreover, the forged sequence can pass the verification of the other participants. We therefore propose a new VQSS scheme to improve the existing one. In the improved scheme, we construct an identity-based quantum signature encryption algorithm, which ensures chosen-plaintext-attack security of the shares and their signatures transmitted in the quantum channel. We employ dual quantum signatures and a one-way function to protect against forgery and repudiation by deceivers (dealer or participants). Furthermore, we add a reconstruction process for the quantum secret and prove the security of this process against superposition attacks.

  14. Considerations for using research data to verify clinical data accuracy.

    PubMed

    Fort, Daniel; Weng, Chunhua; Bakken, Suzanne; Wilcox, Adam B

    2014-01-01

    Collected to support clinical decisions and processes, clinical data may be subject to validity issues when used for research. The objective of this study is to examine methods and issues in summarizing and evaluating the accuracy of clinical data as compared to primary research data. We hypothesized that research survey data on a patient cohort could serve as a reference standard for uncovering potential biases in clinical data. We compared the summary statistics between clinical and research datasets. Seven clinical variables, i.e., height, weight, gender, ethnicity, systolic and diastolic blood pressure, and diabetes status, were included in the study. Our results show that the clinical data and research data had similar summary statistical profiles, but there are detectable differences in definitions and measurements for individual variables such as height, diastolic blood pressure, and diabetes status. We discuss the implications of these results and confirm the important considerations for using research data to verify clinical data accuracy. PMID:25717415

  15. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
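
    The verified algorithm reasons symbolically on the polynomials themselves to achieve both soundness and completeness. The sketch below only illustrates the quantity being decided, via dense sampling of the squared horizontal separation; it is explicitly neither sound nor complete (a sampled probe can miss a brief loss of separation), and all names are hypothetical:

```python
def horizontal_sep_sq(px, py, t):
    """Squared horizontal separation at time t for polynomial relative
    motion; px, py are coefficient lists (lowest order first) of the
    relative x(t), y(t) positions."""
    x = sum(c * t ** i for i, c in enumerate(px))
    y = sum(c * t ** i for i, c in enumerate(py))
    return x * x + y * y

def conflict_sampled(px, py, lookahead, min_sep, steps=10_000):
    """Sampled conflict probe over [0, lookahead]: NOT the paper's
    sound-and-complete algorithm, just an approximation of its
    predicate 'separation drops below min_sep within the lookahead'."""
    return any(
        horizontal_sep_sq(px, py, lookahead * k / steps) < min_sep ** 2
        for k in range(steps + 1)
    )
```

For example, an aircraft closing at 4 units/s from 10 units away loses a 5-unit separation minimum within a 5 s lookahead, while one opening the range never does.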

  16. Beyond Hammers and Nails: Mitigating and Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Gurney, Kevin Robert

    2013-05-01

    One of the biggest challenges to future international agreements on climate change is an independent, science-driven method of verifying reductions in greenhouse gas emissions (GHG) [Niederberger and Kimble, 2011]. The scientific community has thus far emphasized atmospheric measurements to assess changes in emissions. An alternative is direct measurement or estimation of fluxes at the source. Given the many challenges facing the approach that uses "top-down" atmospheric measurements and recent advances in "bottom-up" estimation methods, I challenge the current doctrine, which has the atmospheric measurement approach "validating" bottom-up, "good-faith" emissions estimation [Balter, 2012] or which holds that the use of bottom-up estimation is like "dieting without weighing oneself" [Nisbet and Weiss, 2010].

  17. Seismic Safety Of Simple Masonry Buildings

    SciTech Connect

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled; it is assumed that fulfilling them ensures a suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Obviously, a wide percentage of the buildings deemed simple by the codes should also satisfy the numerical safety verification, so that no confusion or uncertainty arises for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings with different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications of the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the obtained results can provide a contribution towards improving the seismic code requirements.

  18. Seismic monitoring of the Yucca Mountain facility

    SciTech Connect

    Garbin, H.D.; Herrington, P.B.; Kromer, R.P.

    1997-08-01

    Questions have arisen regarding the applicability of seismic sensors to detect mining (re-entry) with a tunnel boring machine (TBM). Unlike cut-and-blast mining, which produces impulsive seismic signals, a TBM produces seismic signals of long duration. (There are well-established techniques for detecting and locating the sources of impulsive signals.) The Yucca Mountain repository offered an opportunity to perform field evaluations of the capabilities of seismic sensors, because during much of 1996 mining there was progressing with a TBM. During the mining of the repository's southern branch, an effort was designed to evaluate whether the TBM could be detected, identified and located using seismic sensors. Three data acquisition stations were established in the Yucca Mountain area to monitor TBM activity. A ratio of short-term average to long-term average (STA/LTA) algorithm was developed for signal detection, based on the characteristics seen in the time series. To locate the source of detected signals, FK analysis was applied to the array data to estimate back azimuths; the back azimuth from the 3-component system was estimated from the horizontal components. Unique features in the timing of the seismic signal were used to identify the source as the TBM.
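
    The detector described above can be sketched as a running short-term/long-term energy ratio with a trigger threshold. Window lengths and threshold below are illustrative choices, not the values used in the field study:

```python
def sta_lta(samples, sta_len, lta_len):
    """STA/LTA ratio of signal energy at each sample index.
    Entries are None until enough history exists for the long window."""
    ratios = [None] * len(samples)
    energy = [s * s for s in samples]
    for i in range(lta_len, len(samples)):
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len:i]) / lta_len
        ratios[i] = sta / lta
    return ratios

def first_trigger(ratios, threshold=3.0):
    """Index of the first sample whose STA/LTA exceeds the threshold."""
    for i, r in enumerate(ratios):
        if r is not None and r > threshold:
            return i
    return None
```

An emergent long-duration source like a TBM raises the short-term energy well before the long-term average catches up, so the ratio spikes at signal onset.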

  19. Generating and verifying entangled-itinerant microwave fields

    NASA Astrophysics Data System (ADS)

    Ku, H. S.

    This thesis presents the experimental achievements of (1) generating entangled-microwave fields propagating on two physically separate transmission lines and (2) verifying the entangled states with efficient measurements. Shared entanglement between two parties is an essential resource for quantum information processing and quantum communication protocols. Experimentally, entangled pairs of electromagnetic fields can be realized by distributing a squeezed vacuum over two separated modes. As a result, entanglement is revealed by the strong cross-correlations between specific quadratures of the two modes. Although it is possible to verify the presence of entanglement with low-efficiency quadrature measurements, higher detection efficiencies are desired for performing protocols that exploit entanglement with high fidelity. In the microwave regime, Josephson parametric amplifiers (JPAs) fulfill the two major tasks mentioned above: JPAs prepare the required squeezed states to generate entanglement and enable us to perform efficient quadrature measurements. Therefore, for the purposes of entanglement generation and verification, ultralow-noise--frequency-tunable JPAs have been developed. Additionally, to increase the efficiency of entanglement generation, we integrate JPAs with two on-chip microwave passive components, a directional coupler and a quadrature hybrid, to form an entangler circuit. The two-mode entangled states are created at the two output modes of the entangler and are measured with a two-channel measurement apparatus where each of the two channels incorporates a JPA as a single-quadrature preamplifier. By employing this measurement scheme, the two measured quadratures of the two output modes can be chosen independently of each other, enabling a full characterization of the two-mode state. To definitively demonstrate the two-mode entanglement, I prove that the measured quadrature variances satisfy the inseparability criterion.

  20. The Kyrgyz Seismic Network (KNET)

    NASA Astrophysics Data System (ADS)

    Bragin, V. D.; Willemann, R. J.; Matix, A. I.; Dudinskih, R. R.; Vernon, F.; Offield, G.

    2007-05-01

    The Kyrgyz Digital Seismic Network (KNET) is a regional continuous telemetric network providing very-broadband seismic data. KNET was installed in 1991, and the telemetry system was upgraded in 1998. Seismograms are transmitted in near real time. KNET is located along part of the boundary between the northern Tien Shan Mountains and the Kazakh platform. Several major tectonic features are spanned by the network, including a series of thrust faults in the Tien Shan, the Chu Valley, and the NW-SE trending ridges north of Bishkek. The network is designed to monitor regional seismic activity at the magnitude 3.5+ level as well as to provide high-quality data for research projects in regional and global broadband seismology. The array consists of 10 stations, 3 of them at altitudes above 3600 m, 2 mountain repeaters, 1 intermediate database and 2 data centers; one of the data centers is a remote source for the IRIS database. KNET is operated by the International Research Center - Geodynamic Proving Ground in Bishkek (IGRC) with the participation of the Research Station of the Russian Academy of Sciences (RS RAS) and the Kyrgyz Institute of Seismology (KIS). The network consists of Streckeisen STS-2 sensors with 24-bit PASSCAL data loggers. All continuous real-time data are accessible through the IRIS DMC in Seattle with over 95% data availability, which compares favorably with the best networks currently operating worldwide. National institutes of seismology in Kyrgyzstan and Kazakhstan, the National Nuclear Centre of Kazakhstan, RS RAS, divisions of the ministries on extreme situations and institutes of the Russian Academy of Sciences use KNET data for estimating seismic hazards and for studying the deep-seated structure of the territory. KNET data are also used by the National Nuclear Centre of the Republic of Kazakhstan, which together with the Lamont laboratory (USA) carries out verification research and monitoring of nuclear detonations in China, India and Pakistan.

  1. Induced Seismicity Potential of Energy Technologies

    NASA Astrophysics Data System (ADS)

    Hitzman, Murray

    2013-03-01

    Earthquakes attributable to human activities, "induced seismic events," have received heightened public attention in the United States over the past several years. Upon request from the U.S. Congress and the Department of Energy, the National Research Council was asked to assemble a committee of experts to examine the scale, scope, and consequences of seismicity induced during fluid injection and withdrawal associated with geothermal energy development, oil and gas development, and carbon capture and storage (CCS). The committee's report, publicly released in June 2012, indicates that induced seismicity associated with fluid injection or withdrawal is caused in most cases by change in pore fluid pressure and/or change in stress in the subsurface in the presence of faults with specific properties and orientations and a critical state of stress in the rocks. The factor that appears to have the most direct consequence in regard to induced seismicity is the net fluid balance (total balance of fluid introduced into or removed from the subsurface). Energy technology projects that are designed to maintain a balance between the amount of fluid being injected and withdrawn, such as most oil and gas development projects, appear to produce fewer seismic events than projects that do not maintain fluid balance. Major findings from the study include: (1) as presently implemented, the process of hydraulic fracturing for shale gas recovery does not pose a high risk for inducing felt seismic events; (2) injection for disposal of waste water derived from energy technologies does pose some risk for induced seismicity, but very few events have been documented over the past several decades relative to the large number of disposal wells in operation; and (3) CCS, due to the large net volumes of injected fluids suggested for future large-scale carbon storage projects, may have potential for inducing larger seismic events.

  2. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

Pakistan and the western Himalaya form a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974; and the 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach for the purposes of identifying spatial differences in seismicity, which can be used to form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis, and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data.
The earthquake catalogue covers a time span from 1930 to 2007 and
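The clustering step described above can be sketched with a plain K-means pass over epicenter coordinates. This is a minimal illustration, not the study's method or data: the catalogue, cluster count, and coordinates below are invented placeholders.

```python
import numpy as np

def farthest_point_init(points, k):
    # Deterministic seeding: start from the first point, then repeatedly
    # take the point farthest from all chosen centers.
    centers = [points[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centers], axis=0)
        centers.append(points[int(d.argmax())])
    return np.array(centers)

def kmeans(points, k, iters=100):
    centers = farthest_point_init(points, k)
    for _ in range(iters):
        # Assign each event to its nearest cluster center, then recenter.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two synthetic epicenter clouds (lon, lat); values are illustrative only.
rng = np.random.default_rng(1)
west = rng.normal([66.0, 25.0], 0.4, size=(50, 2))
north = rng.normal([73.5, 34.5], 0.4, size=(50, 2))
labels, centers = kmeans(np.vstack([west, north]), k=2)
```

In a real partitioning study the cluster count would itself be varied and scored (e.g. against catalogue completeness and tectonic structure) rather than fixed in advance.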

  3. Effects of vertical excitation on the seismic performance of a seismically isolated bridge with sliding friction bearings

    NASA Astrophysics Data System (ADS)

    Wang, Changfeng; Zhao, Jikang; Zhu, Long; Bao, Yijun

    2016-03-01

A finite element model with contact/friction elements is constructed for a sliding friction bearing in a seismically isolated bridge under vertical excitation. The effects of vertical excitation on the seismic performance of a seismically isolated bridge with sliding friction bearings, for different bearing friction coefficients and pier stiffness levels (pier diameters), are discussed using example calculations, and the effect of the direction of vertical excitation on the analysis results is explored. The analysis results show that vertical excitation has a relatively large impact on the seismic performance of a seismically isolated bridge with sliding friction bearings, and it should be considered when designing such a bridge where vertical excitation dominates.
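A rough back-of-the-envelope illustration of why vertical excitation matters for sliding bearings: the Coulomb friction capacity is μN, and vertical ground acceleration modulates the normal force N. The mass, friction coefficient, and excitation below are assumed values for the sketch, not the paper's bridge model.

```python
import numpy as np

g = 9.81          # gravitational acceleration, m/s^2
m = 5.0e5         # supported mass, kg (assumed)
mu = 0.05         # bearing friction coefficient (assumed)

t = np.linspace(0.0, 10.0, 1001)
a_v = 0.3 * g * np.sin(2 * np.pi * 2.0 * t)   # assumed 2 Hz vertical excitation

N = m * (g + a_v)               # instantaneous normal force on the bearing
F_cap = mu * N                  # sliding-friction capacity at each instant
F_static = mu * m * g           # capacity with no vertical motion

# A +/-0.3 g vertical motion swings the friction capacity by roughly +/-30%,
# which is why a purely horizontal analysis can misjudge sliding onset.
print(F_cap.min() / F_static, F_cap.max() / F_static)
```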

  4. Experimental Techniques Verified for Determining Yield and Flow Surfaces

    NASA Technical Reports Server (NTRS)

    Lerch, Brad A.; Ellis, Rod; Lissenden, Cliff J.

    1998-01-01

Structural components in aircraft engines are subjected to multiaxial loads when in service. For such components, life prediction methodologies depend on the accuracy of the constitutive models that determine the elastic and inelastic portions of a loading cycle. A threshold surface (such as a yield surface) is customarily used to differentiate between reversible and irreversible flow. For elastoplastic materials, a yield surface can be used to delimit the elastic region in a given stress space. The concept of a yield surface is central to the mathematical formulation of classical plasticity theory, but at elevated temperatures, material response can be highly time dependent. Thus, viscoplastic theories have been developed to account for this time dependency. Since the key to many of these theories is experimental validation, the objective of this work (refs. 1 and 2) at the NASA Lewis Research Center was to verify that current laboratory techniques and equipment are sufficient to determine flow surfaces at elevated temperatures. By probing many times in the axial-torsional stress space, we could define the yield and flow surfaces. A small-offset definition of yield (10 με) was used to delineate the boundary between reversible and irreversible behavior, so that the material state remained essentially unchanged and multiple probes could be done on the same specimen. The strain was measured with an off-the-shelf multiaxial extensometer that could measure the axial and torsional strains over a wide range of temperatures. The accuracy and resolution of this extensometer were verified by comparing its data with strain gauge data at room temperature. The extensometer was found to have sufficient resolution for these experiments. In addition, the amount of crosstalk (i.e., the accumulation of apparent strain in one direction when strain in the other direction is applied) was found to be negligible. Tubular specimens were induction heated to determine the flow
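The small-offset yield definition can be sketched as follows: fit the elastic modulus from the initial portion of a probe, then flag yield when the inelastic strain (total strain minus stress/E) first exceeds the offset. The synthetic stress-strain data, modulus, and offset handling below are illustrative assumptions, not the experiment's procedure.

```python
import numpy as np

def yield_stress(stress, strain, offset=10e-6, n_elastic=20):
    """Return the stress at which inelastic strain first exceeds `offset`."""
    # Elastic modulus from a linear fit to the first few (elastic) points.
    E = np.polyfit(strain[:n_elastic], stress[:n_elastic], 1)[0]
    inelastic = strain - stress / E
    idx = int(np.argmax(inelastic > offset))   # first index past the offset
    return stress[idx]

# Synthetic probe (illustrative): elastic to 300 MPa, then plastic flow.
E_true = 200e3   # MPa
stress = np.linspace(0.0, 400.0, 401)
strain = stress / E_true + np.where(stress > 300.0, (stress - 300.0) * 5e-6, 0.0)

ys = yield_stress(stress, strain)
print(ys)
```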

  5. Realities of verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants

    SciTech Connect

    Swindle, D.W.

    1990-03-01

Over a two and one-half year period beginning in 1981, representatives of six countries (United States, United Kingdom, Federal Republic of Germany, Australia, The Netherlands, and Japan) and the inspectorate organizations of the International Atomic Energy Agency and EURATOM developed and agreed to a technically sound approach for verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants. This effort, known as the Hexapartite Safeguards Project (HSP), led to the first international consensus on techniques and requirements for effective verification of the absence of weapons-grade nuclear materials production. Since that agreement, research and development has continued on the radiation detection technique, confirming that the HSP goal is technically achievable. However, the realities of achieving the HSP goal of effective technical verification have not yet been fully attained. Issues such as design and operating conditions unique to each gas centrifuge plant, concern about the potential for sensitive technology disclosures, and on-site support requirements have hindered full implementation and operator support of the HSP agreement. In future arms control treaties that may limit or monitor fissile material production, the negotiators must recognize and account for the realities and practicalities of verifying the absence of HEU production. This paper will describe the experiences and realities of trying to achieve the goal of developing and implementing an effective approach for verifying the absence of HEU production. 3 figs.

  6. Measurements verifying the optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa M.

This thesis concentrates on laboratory measurements of the Electron Drift Instrument (EDI), focusing primarily on the optics of the system. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This drift velocity is determined using two electron beams directed perpendicular to the magnetic field that return to be detected by the spacecraft. The EDI will be used on the Magnetospheric Multi-Scale Mission. The EDI optics testing process measures the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state via simulation. The optics state tables were created in simulations, and these measurements are used to confirm their accuracy. The setup consisted of an apparatus, made up of the EDI's optics and sensor electronics, secured to a two-axis gear arm inside a vacuum chamber. An electron beam was projected at the apparatus, which used the EDI optics to focus the beam through the microchannel plates and onto the circular 32-pad annular ring that makes up the sensor. The counts per pad over an interval of 1 ms were averaged over 25 samples and plotted in MATLAB. The plotted measurements agreed well with the simulations, providing confidence in the EDI instrument.

  7. Verifying operator fitness - an imperative not an option

    SciTech Connect

    Scott, A.B. Jr.

    1987-01-01

In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later termed one of the best-operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world - not only those involved with nuclear power - are grappling with the problem of how to assure the fitness for duty of those in their employ, specifically users of substances that impair the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than freedom from substance-abuse impairment, which is all that many today consider it. Full fitness for duty implies mental and moral fitness as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

  8. Garbage collection can be made real-time and verifiable

    NASA Technical Reports Server (NTRS)

    Hino, James H.; Ross, Charles L.

    1988-01-01

An efficient means of memory reclamation (also known as Garbage Collection) is essential for Machine Intelligence applications where dynamic storage allocation is desired or required. Solutions for real-time systems must introduce very little processing overhead and must also support verification of the software, so that applications can meet their time budgets and the correctness of the software can be established. Garbage Collection (GC) techniques are proposed for symbolic processing systems which may simultaneously meet both real-time and verification requirements. The proposed memory reclamation technique takes advantage of the strong points of both the earlier mark-and-sweep technique and the more recent copy collection approaches. At least one practical implementation of these new GC techniques has already been developed and tested on a very-high-performance symbolic computing system. Complete GC processing of all generated garbage has been demonstrated to require as little as a few milliseconds to perform. This speed enables the effective operation of the GC function as either a background task or as an actual part of the application task itself.
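The mark-and-sweep half of the hybrid described above can be sketched on a toy object graph; the node names and heap layout are invented for illustration, and a real collector would of course operate on raw heap cells, not Python objects.

```python
# Minimal mark-and-sweep sketch: mark everything reachable from the roots,
# then sweep away whatever was never marked.
class Node:
    def __init__(self, name):
        self.name = name
        self.refs = []        # outgoing references to other nodes
        self.marked = False

def mark(roots):
    stack = list(roots)
    while stack:
        n = stack.pop()
        if not n.marked:
            n.marked = True
            stack.extend(n.refs)

def sweep(heap):
    live = [n for n in heap if n.marked]
    for n in live:
        n.marked = False      # reset mark bits for the next cycle
    return live               # unmarked nodes are reclaimed

a, b, c, d = (Node(x) for x in "abcd")
a.refs = [b]; b.refs = [c]    # d is unreachable garbage
heap = [a, b, c, d]
mark([a])
heap = sweep(heap)
print([n.name for n in heap])   # -> ['a', 'b', 'c']
```

A copying collector instead evacuates the live nodes to a fresh space during the traversal; the hybrid in the abstract combines the strengths of both.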

  9. Measurements Verifying the Optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa; Kletzing, Craig; Bounds, Scott; Sigsbee, Kristine M.

    2015-04-01

Magnetic reconnection is the process of breaking and reconnecting of opposing magnetic field lines, and is often associated with tremendous energy transfer. The energy transferred by reconnection directly affects people through its influence on geospace weather and technological systems - such as telecommunication networks, GPS, and power grids. However, the mechanisms that cause magnetic reconnection are not well understood. The Magnetospheric Multi-Scale Mission (MMS) will use four spacecraft in a pyramid formation to make three-dimensional measurements of the structures in magnetic reconnection occurring in the Earth's magnetosphere. The spacecraft will repeatedly sample these regions for a prolonged period of time to gather data in more detail than has been previously possible. MMS is scheduled to be launched in March of 2015. The Electron Drift Instrument (EDI) will be used on MMS to measure the electric fields associated with magnetic reconnection. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This paper concentrates on measurements of the EDI's optics system. The testing process includes measuring the optics response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state. The measurements agree well with simulations and we are confident in the performance of the EDI instrument.

  10. Improved data analysis for verifying quantum nonlocality and entanglement

    NASA Astrophysics Data System (ADS)

    Zhang, Yanbao; Glancy, Scott; Knill, Emanuel

    2012-06-01

Given a finite number of experimental results originating from local measurements on two separated quantum systems in an unknown state, are these systems nonlocally correlated or entangled with each other? These properties can be verified by violating a Bell inequality or satisfying an entanglement witness. However, violation or satisfaction could be due to statistical fluctuations in finite measurements. Rigorous upper bounds are therefore required on the maximum probability (i.e., the p-value), according to local realistic or separable states, of a violation or satisfaction as high as that observed. Here, we propose a rigorous upper bound that improves the known bound from large deviation theory [R. Gill, arXiv:quant-ph/0110137]. The proposed bound is robust against experimental instability and the memory loophole [J. Barrett et al., Phys. Rev. A 66, 042111 (2002)]. Compared with our previous method [Phys. Rev. A 84, 062118 (2011)], the proposed method takes advantage of the particular Bell inequality or entanglement witness tested in an experiment, so the computation complexity is reduced. Also, this method can be easily extended to test a set of independent Bell inequalities or entanglement witnesses simultaneously.
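As a point of comparison for the kind of bound being improved, a plain Hoeffding large-deviation bound on the p-value of a CHSH violation can be sketched. This is not the paper's proposed bound; the trial count, observed value, and per-trial range below are assumptions made purely for illustration.

```python
import math

def hoeffding_p_value(s_obs, n, s_lhv=2.0, span=8.0):
    """Hoeffding bound on P(CHSH estimate >= s_obs) under local realism.

    Assumes n i.i.d. trials whose per-trial contribution to the CHSH
    estimate lies in an interval of width `span` (here [-4, 4]), with the
    local-realist mean capped at s_lhv = 2.
    """
    if s_obs <= s_lhv:
        return 1.0
    return math.exp(-2.0 * n * (s_obs - s_lhv) ** 2 / span ** 2)

# Illustrative numbers: a CHSH estimate of 2.5 from 10,000 trials.
p = hoeffding_p_value(s_obs=2.5, n=10000)
print(p)
```

The abstract's point is that bounds of this generic form are loose; exploiting the structure of the specific inequality tested yields smaller, still-rigorous p-values.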

  11. A credit card verifier structure using diffraction and spectroscopy concepts

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we choose to shine, one at a time, a number of broadband light sources, each at a different incident angle, on the embossed hologram of the credit card, in such a way that a different color spectrum is diffracted and separated in space for each incident angle. In this way, the number of pixels in each color plane is investigated. We then apply a feed-forward back-propagation neural network configuration to separate counterfeit credit cards from real ones. Our experimental demonstration, using two off-the-shelf broadband white light emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer, can identify all 69 counterfeit credit cards from among eight real credit cards.

  12. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  13. Piping and equipment resistance to seismic-generated missiles

    SciTech Connect

    LaSalle, F.R.; Golbeg, P.R.; Chenault, D.M.

    1992-02-01

    For reactor and nuclear facilities, both Title 10, Code of Federal Regulations, Part 50, and US Department of Energy Order 6430.1A require assessments of the interaction of non-Safety Class 1 piping and equipment with Safety Class 1 piping and equipment during a seismic event to maintain the safety function. The safety class systems of nuclear reactors or nuclear facilities are designed to the applicable American Society of Mechanical Engineers standards and Seismic Category 1 criteria that require rigorous analysis, construction, and quality assurance. Because non-safety class systems are generally designed to lesser standards and seismic criteria, they may become missiles during a safe shutdown earthquake. The resistance of piping, tubing, and equipment to seismically generated missiles is addressed in the paper. Gross plastic and local penetration failures are considered with applicable test verification. Missile types and seismic zones of influence are discussed. Field qualification data are also developed for missile evaluation.

  14. Successes and failures of recording and interpreting seismic data in structurally complex area: seismic case history

    SciTech Connect

    Morse, V.C.; Johnson, J.H.; Crittenden, J.L.; Anderson, T.D.

    1986-05-01

    There are successes and failures in recording and interpreting a single seismic line across the South Owl Creek Mountain fault on the west flank of the Casper arch. Information obtained from this type of work should help explorationists who are exploring structurally complex areas. A depth cross section lacks a subthrust prospect, but is illustrated to show that the South Owl Creek Mountain fault is steeper with less apparent displacement than in areas to the north. This cross section is derived from two-dimensional seismic modeling, using data processing methods specifically for modeling. A flat horizon and balancing technique helps confirm model accuracy. High-quality data were acquired using specifically designed seismic field parameters. The authors concluded that the methodology used is valid, and an interactive modeling program in addition to cross-line control can improve seismic interpretations in structurally complex areas.

  15. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

Randy, James J.; Cherry, Jeffery L.; Rashid, Yusef R.; Chokshi, Nilesh

    2000-02-03

Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions was first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical, and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  16. Method of migrating seismic records

    DOEpatents

    Ober, Curtis C.; Romero, Louis A.; Ghiglia, Dennis C.

    2000-01-01

    The present invention provides a method of migrating seismic records that retains the information in the seismic records and allows migration with significant reductions in computing cost. The present invention comprises phase encoding seismic records and combining the encoded seismic records before migration. Phase encoding can minimize the effect of unwanted cross terms while still allowing significant reductions in the cost to migrate a number of seismic records.
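The phase-encoding idea can be sketched as follows: give each shot record its own random phase factor, stack the encoded records, and migrate the composite once instead of migrating every shot separately. The array shapes and values below are illustrative placeholders, not the patented method's specifics.

```python
import numpy as np

# Hedged sketch of phase encoding before migration. Cross terms between
# different shots acquire mismatched random phases and tend to cancel when
# results are accumulated, which is what keeps the composite usable.
rng = np.random.default_rng(0)
n_shots, n_traces, n_samples = 8, 32, 256
records = rng.normal(size=(n_shots, n_traces, n_samples))  # synthetic shots

# One random phase factor per shot record.
phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=n_shots))

# Encode and stack: the composite has the shape of a single record.
encoded = np.einsum("s,stn->tn", phases, records.astype(complex))

print(encoded.shape)   # one migration input instead of n_shots
```

The cost saving comes from running the migration operator once on `encoded` rather than `n_shots` times; the encoding is what controls the crosstalk that a plain sum would introduce.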

  17. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-04-01

In this report we will show some new Q-related seismic attributes on the Burlington-Seitel data set. One example, called the Energy Absorption Attribute (EAA), is based on a spectral analysis. The EAA algorithm is designed to detect a sudden increase in the rate of exponential decay in the relatively higher-frequency portion of the spectrum. In addition, we will show results from a hybrid attribute that combines attenuation with relative acoustic impedance to give a better indication of commercial gas saturation.
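A spectral-decay attribute in the spirit of the EAA (though not its actual algorithm, which is not given here) can be sketched by fitting the exponential decay rate of the high-frequency amplitude spectrum of a windowed trace. The frequency band and synthetic traces are illustrative assumptions.

```python
import numpy as np

def decay_rate(trace, dt, f_lo=40.0, f_hi=90.0):
    """Fit log-amplitude vs. frequency over [f_lo, f_hi]; return the decay rate."""
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), dt)
    band = (freqs >= f_lo) & (freqs <= f_hi) & (spec > 0)
    slope, _ = np.polyfit(freqs[band], np.log(spec[band]), 1)
    return -slope   # larger value = faster high-frequency decay

# Synthetic traces built directly from exponentially decaying spectra,
# so the fitted rates are known in advance.
dt = 0.002
freqs = np.fft.rfftfreq(512, dt)
fast = np.fft.irfft(np.exp(-0.05 * freqs))   # strong spectral decay
slow = np.fft.irfft(np.exp(-0.01 * freqs))   # mild spectral decay

print(decay_rate(fast, dt), decay_rate(slow, dt))
```

An attribute volume would apply this window by window down each trace and flag sudden downward jumps in the recovered rate.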

  18. Seismic sequences in the Sombrero Seismic Zone

    NASA Astrophysics Data System (ADS)

    Pulliam, J.; Huerfano, V. A.; ten Brink, U.; von Hillebrandt, C.

    2007-05-01

    The northeastern Caribbean, in the vicinity of Puerto Rico and the Virgin Islands, has a long and well-documented history of devastating earthquakes and tsunamis, including major events in 1670, 1787, 1867, 1916, 1918, and 1943. Recently, seismicity has been concentrated to the north and west of the British Virgin Islands, in the region referred to as the Sombrero Seismic Zone by the Puerto Rico Seismic Network (PRSN). In the combined seismicity catalog maintained by the PRSN, several hundred small to moderate magnitude events can be found in this region prior to 2006. However, beginning in 2006 and continuing to the present, the rate of seismicity in the Sombrero suddenly increased, and a new locus of activity developed to the east of the previous location. Accurate estimates of seismic hazard, and the tsunamigenic potential of seismic events, depend on an accurate and comprehensive understanding of how strain is being accommodated in this corner region. Are faults locked and accumulating strain for release in a major event? Or is strain being released via slip over a diffuse system of faults? A careful analysis of seismicity patterns in the Sombrero region has the potential to both identify faults and modes of failure, provided the aggregation scheme is tuned to properly identify related events. To this end, we experimented with a scheme to identify seismic sequences based on physical and temporal proximity, under the assumptions that (a) events occur on related fault systems as stress is refocused by immediately previous events and (b) such 'stress waves' die out with time, so that two events that occur on the same system within a relatively short time window can be said to have a similar 'trigger' in ways that two nearby events that occurred years apart cannot. Patterns that emerge from the identification, temporal sequence, and refined locations of such sequences of events carry information about stress accommodation that is obscured by large clouds of
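The aggregation scheme described above, grouping events that are close in both space and time so that chains of such links form sequences, can be sketched as a union-find pass over an event catalog. The thresholds and the four-event catalog below are invented placeholders, not PRSN data.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float    # origin time, days
    x: float    # km east (illustrative local coordinates)
    y: float    # km north

def sequences(events, d_max=10.0, t_max=30.0):
    """Group events linked within d_max km and t_max days (single linkage)."""
    parent = list(range(len(events)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i, a in enumerate(events):
        for j in range(i + 1, len(events)):
            b = events[j]
            close_t = abs(a.t - b.t) <= t_max
            close_d = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5 <= d_max
            if close_t and close_d:
                parent[find(i)] = find(j)   # merge the two sequences
    groups = {}
    for i in range(len(events)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Events 0 and 1 link (close in space and time); event 2 is nearby in space
# but far later in time; event 3 is far away in space.
cat = [Event(0, 0, 0), Event(5, 3, 1), Event(400, 2, 2), Event(100, 80, 80)]
print(sequences(cat))
```

This implements the assumption in the text that a shared "trigger" requires proximity in both dimensions: nearby events years apart fall into different sequences.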

  19. Revised seismic and geologic siting regulations for nuclear power plants

    SciTech Connect

    Murphy, A.J.; Chokshi, N.C.

    1997-02-01

The primary regulatory basis governing the seismic design of nuclear power plants is contained in Appendix A to Part 50, General Design Criteria for Nuclear Power Plants, of Title 10 of the Code of Federal Regulations (CFR). General Design Criterion (GDC) 2 defines requirements for design bases for protection against natural phenomena. GDC 2 states the performance criterion that "Structures, systems, and components important to safety shall be designed to withstand the effects of natural phenomena such as earthquakes ... without loss of capability to perform their safety functions ...". Appendix A to Part 100, Seismic and Geologic Siting Criteria for Nuclear Power Plants, has been the principal document providing detailed criteria to evaluate the suitability of proposed sites and the suitability of the plant design basis established in consideration of the seismic and geologic characteristics of the proposed sites. Appendix A defines required seismological and geological investigations, requirements for other design conditions such as soil stability, slope stability, and seismically induced floods and water waves, and requirements for seismic instrumentation. The NRC staff is in the process of revising Appendix A. The NRC has recently revised seismic siting and design regulations for future applications. These revisions are discussed in detail in this paper.

  20. Development of Towed Marine Seismic Vibrator as an Alternative Seismic Source

    NASA Astrophysics Data System (ADS)

    Ozasa, H.; Mikada, H.; Murakami, F.; Jamali Hondori, E.; Takekawa, J.; Asakawa, E.; Sato, F.

    2015-12-01

The principal issue with marine impulsive sources for acquiring seismic data is whether the emission of acoustic energy harms marine mammals, since the level of the source signal released into the marine environment can be very large compared to the hearing range of the mammals. We propose a marine seismic vibrator as an alternative to impulsive sources to mitigate the risk of impact on the marine environment while satisfying the necessary conditions of seismic surveys. These conditions include the repeatability and the controllability of source signals, in both amplitude and phase, for high-quality measurements. We therefore designed a towed marine seismic vibrator (MSV) as a new type of marine vibratory seismic source that employs a hydraulic servo system to meet the controllability condition in phase and amplitude, which assures repeatability as well. After fabricating a downsized MSV that requires 30 kVA of power at a depth of about 250 m in water, several sea trials were conducted to test the source characteristics of the downsized MSV in terms of amplitude, frequency, and the horizontal and vertical directivities of the generated field. The maximum sound level satisfied the designed specification in the frequencies ranging from 3 to 300 Hz, almost omnidirectionally. After checking the source characteristics, we then conducted a trial seismic survey, using both the downsized MSV and a 480-cubic-inch airgun for comparison, with a 2,000-m-long streamer cable directly above a cabled earthquake observatory in the Japan Sea. The result showed that the penetration of seismic signals generated by the downsized MSV was comparable to that of the airgun, although there was a slight difference in the signal-to-noise ratio. The MSV could become a versatile alternative to existing impulsive seismic sources such as airguns that does not harm marine mammals.
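A controllable vibrator of this kind is typically driven by a swept-frequency pilot signal. A linear 3-300 Hz sweep, with illustrative (assumed) duration, taper, and sample rate, might be generated like this:

```python
import numpy as np

fs = 2000.0                  # sample rate, Hz (assumed)
T = 10.0                     # sweep length, s (assumed)
t = np.arange(int(T * fs)) / fs

f0, f1 = 3.0, 300.0          # band reported for the MSV trials
# Linear chirp: instantaneous frequency f0 + (f1 - f0) * t / T.
phase = 2.0 * np.pi * (f0 * t + 0.5 * (f1 - f0) / T * t ** 2)
taper = np.hanning(len(t))   # smooth start/stop to limit spectral splatter
sweep = taper * np.sin(phase)

print(len(sweep))
```

Because the pilot is known exactly, the received data can later be correlated against it, which is the usual way vibrator surveys recover an impulsive-like wavelet while keeping the radiated level low.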

  1. Third Quarter Hanford Seismic Report for Fiscal Year 2005

    SciTech Connect

    Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

    2005-09-01

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event on May 25 near Vantage, Washington. During the third quarter, stratigraphically 17 (85%) events occurred in the Columbia River basalt (approximately 0-5 km), no events in the pre-basalt sediments (approximately 5-10 km), and three (15%) in the crystalline basement (approximately 10-25 km). Geographically, five (25%) earthquakes occurred in swarm areas, 10 (50%) earthquakes were associated with a major geologic structure, and 5 (25%) were classified as random events.

  2. Annual Hanford Seismic Report for Fiscal Year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-12-01

This report describes the seismic activity in and around the Hanford Site during fiscal year 2003. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 1,336 triggers during fiscal year 2003. Of these triggers, 590 were earthquakes. One hundred and one of the 590 earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 35 (34.6%) occurred in the Columbia River basalt, 29 (28.7%) in the pre-basalt sediments, and 37 (36.7%) in the crystalline basement. Geographically, 48 (47%) earthquakes occurred in swarm areas, 4 (4%) earthquakes were associated with a major geologic structure, and 49 (49%) were classified as random events. During the third and fourth quarters, an earthquake swarm consisting of 27 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur near the interface of the Columbia River Basalt Group and the pre-basalt sediments.

  3. BUILDING 341 Seismic Evaluation

    SciTech Connect

    Halle, J.

    2015-06-15

    The Seismic Evaluation of Building 341 located at Lawrence Livermore National Laboratory in Livermore, California has been completed. The subject building consists of a main building, Increment 1, and two smaller additions; Increments 2 and 3.

  4. Seismicity, 1980-86

    SciTech Connect

    Hill, D.P.; Eaton, J.P.; Jones, L.M.

    1990-01-01

    Tens of thousands of small earthquakes occur in California each year, reflecting brittle deformation of the margins of the Pacific and North American plates as they grind inexorably past one another along the San Andreas fault system. The deformational patterns revealed by this ongoing earthquake activity provide a wealth of information on the tectonic processes along this major transform boundary that, every few hundred years, culminate in rupture of the San Andreas fault in a great (M {approx} 8) earthquake. This chapter describes the regional seismicity and the San Andreas transform boundary; seismicity along the San Andreas Fault system; and focal mechanisms and transform-boundary kinematics. Seismicity patterns and the earthquake cycle and distributed seismicity and deformation of the plate margins are discussed.

  5. Seismic attenuation in Florida

    SciTech Connect

    Bellini, J.J.; Bartolini, T.J.; Lord, K.M.; Smith, D.L. (Dept. of Geology)

    1993-03-01

    Seismic signals recorded by the expanded distribution of earthquake seismograph stations throughout Florida and data from a comprehensive review of record archives from station GAI contribute to an initial seismic attenuation model for the Florida Plateau. Based on calculations of surface particle velocity, a pattern of attenuation exists that appears to deviate from that established for the remainder of the southeastern US. Most values suggest greater seismic attenuation within the Florida Plateau. However, a separate pattern may exist for those signals arising from the Gulf of Mexico. These results have important implications for seismic hazard assessments in Florida and may be indicative of the unique lithospheric identity of the Florida basement as an exotic terrane.

  6. Seismic Ray Theory

    NASA Astrophysics Data System (ADS)

    Cerveny, V.

    2001-07-01

    The seismic ray method plays an important role in seismology, seismic exploration, and in the interpretation of seismic measurements. Seismic Ray Theory presents the most comprehensive treatment of the method available. Many new concepts that extend the possibilities and increase the method's efficiency are included. The book has a tutorial character: derivations start with a relatively simple problem, in which the main ideas are easier to explain, and then advance to more complex problems. Most of the derived equations are expressed in algorithmic form and may be used directly for computer programming. This book will prove to be an invaluable advanced text and reference in all academic institutions in which seismology is taught or researched.

  7. Software for Verifying Image-Correlation Tie Points

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Yagi, Gary

    2008-01-01

    A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes imperfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present computer program relies on the availability of a left-right correlation map in addition to the usual right-left correlation map. The additional map must be generated, which doubles the processing time. Such increased time can now be afforded in the data-processing pipeline, since the time for map generation has been reduced from about 60 to 3 minutes by the parallelization discussed in the previous article; parallel cluster processing therefore enabled this improved science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x",y") in the left image. If (x,y) and (x",y") are identical, then the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that allows for round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.
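
The round-trip check described in this record can be sketched in a few lines. The function name and the dictionary representation of the correlation maps are illustrative assumptions, not the actual software's interface:

```python
def verify_tie_points(lr_map, rl_map, tol=1.0):
    """Keep only tie points whose left -> right -> left round trip
    lands within `tol` pixels of the starting point."""
    verified = {}
    for (x, y), (xp, yp) in lr_map.items():
        back = rl_map.get((xp, yp))
        if back is None:
            continue  # point visible in one image only (e.g., parallax)
        xb, yb = back
        if abs(xb - x) <= tol and abs(yb - y) <= tol:
            verified[(x, y)] = (xp, yp)
    return verified

# A consistent point survives; an inconsistent one is rejected.
left_right = {(0, 0): (5, 0), (1, 1): (6, 1)}
right_left = {(5, 0): (0, 0), (6, 1): (9, 9)}
print(verify_tie_points(left_right, right_left))  # {(0, 0): (5, 0)}
```

The `tol` argument plays the role of the error window mentioned in the abstract.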

  8. A manufactured solution for verifying CFD boundary conditions: part II.

    SciTech Connect

    Bond, Ryan Bomar; Ober, Curtis Curry; Knupp, Patrick Michael

    2005-01-01

    Order-of-accuracy verification is necessary to ensure that software correctly solves a given set of equations. One method to verify the order of accuracy of a code is the method of manufactured solutions. In this study, a manufactured solution has been derived and implemented that allows verification of not only the Euler, Navier-Stokes, and Reynolds-Averaged Navier-Stokes (RANS) equation sets, but also some of their associated boundary conditions (BC's): slip, no-slip (adiabatic and isothermal), and outflow (subsonic, supersonic, and mixed). Order-of-accuracy verification has been performed for the Euler and Navier-Stokes equations and these BC's in a compressible computational fluid dynamics code. All of the results shown are on skewed, non-uniform meshes. RANS results will be presented in a future paper. The observed order of accuracy was lower than the expected order of accuracy in two cases. One of these cases resulted in the identification and correction of a coding mistake in the CHAD gradient correction that was reducing the observed order of accuracy. This mistake would have been undetectable on a Cartesian mesh. During the search for the CHAD gradient correction problem, an unrelated coding mistake was found and corrected. The other case in which the observed order of accuracy was less than expected was a test of the slip BC, although no specific coding or formulation mistakes have yet been identified. After the correction of the identified coding mistakes, all of the aforementioned equation sets and BC's demonstrated the expected (or at least acceptable) order of accuracy except the slip condition.
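
The observed order of accuracy in studies like this one is typically computed from discretization errors on two meshes related by a refinement ratio r, as p = ln(e_coarse / e_fine) / ln(r). A minimal sketch with invented error values:

```python
import math

def observed_order(e_coarse, e_fine, r=2.0):
    """Observed order of accuracy from errors on two meshes
    related by refinement ratio r."""
    return math.log(e_coarse / e_fine) / math.log(r)

# A second-order scheme quarters the error when h is halved.
print(round(observed_order(4.0e-3, 1.0e-3), 6))  # 2.0
```

Comparing this observed p against the scheme's formal order is the pass/fail test the abstract describes.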

  9. Analytical Approaches to Verify Food Integrity: Needs and Challenges.

    PubMed

    Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M

    2016-09-01

    A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing, as well, food supply chain and future requirements to more effectively mitigate food fraud. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood of fraudsters to penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and possibly sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is evident already today, for instance, as in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food

  10. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the New START treaty, which was signed April 8, 2010, and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that U.S.-Russian nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to give domestic, bilateral, and multilateral audiences confidence that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components and fissile materials from military stocks will need to be established.
This paper is intended to provide useful background information

  11. Passive seismic experiment

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

    1972-01-01

    The establishment of a network of seismic stations on the lunar surface as a result of equipment installed by Apollo 12, 14, and 15 flights is described. Four major discoveries obtained by analyzing seismic data from the network are discussed. The use of the system to detect vibrations of the lunar surface and the use of the data to determine the internal structure, physical state, and tectonic activity of the moon are examined.

  12. AUTOMATING SHALLOW SEISMIC IMAGING

    SciTech Connect

    Steeples, Don W.

    2003-09-14

    The current project is a continuation of an effort to develop ultrashallow seismic imaging as a cost-effective method potentially applicable to DOE facilities. The objective of the present research is to develop and demonstrate the use of a cost-effective, automated method of conducting shallow seismic surveys, an approach that represents a significant departure from conventional seismic-survey field procedures. Initial testing of a mechanical geophone-planting device suggests that large numbers of geophones can be placed both quickly and automatically. The development of such a device could make the application of SSR (shallow seismic reflection) considerably more efficient and less expensive. The imaging results obtained using automated seismic methods will be compared with results obtained using classical seismic techniques. Although this research falls primarily into the field of seismology, for comparison and quality-control purposes, some GPR data will be collected as well. In the final year of the research, demonstration surveys at one or more DOE facilities will be performed. An automated geophone-planting device of the type under development would not necessarily be limited to the use of shallow seismic reflection methods; it also would be capable of collecting data for seismic-refraction and possibly for surface-wave studies. Another element of our research plan involves monitoring the cone of depression of a pumping well that is being used as a proxy site for fluid-flow at a contaminated site. Our next data set will be collected at a well site where drawdown equilibrium has been reached. Noninvasive, in-situ methods such as placing geophones automatically and using near-surface seismic methods to identify and characterize the hydrologic flow regimes at contaminated sites support the prospect of developing effective, cost-conscious cleanup strategies for DOE and others.

  13. Seismic risk management solution for nuclear power plants

    DOE PAGESBeta

    Coleman, Justin; Sabharwall, Piyush

    2014-12-01

    Nuclear power plants should safely operate during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to expectable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise to minimize seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized and SI systems have been proposed in American Society of Civil Engineer Standard 4, ASCE-4, to be released in the winter of 2014, for light water reactors facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for seismic analysis of safety related nuclear structures such that the responses to design basis seismic events, computed in accordance with this standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. As a result, this article highlights the gaps identified at this meeting.
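
The convolution of hazard with fragility described above can be sketched numerically. The lognormal fragility form is common practice in seismic probabilistic risk assessment, but the ground-motion bins, occurrence rates, and capacity parameters below are invented purely for illustration:

```python
import math

def lognormal_fragility(a, median, beta):
    """Conditional failure probability at ground-motion level `a` for a
    lognormal fragility with median capacity `median` and log-std `beta`."""
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def annual_failure_rate(levels, rates, median, beta):
    """Convolve a discretized hazard curve (annual occurrence rate per
    ground-motion bin) with the component fragility."""
    return sum(r * lognormal_fragility(a, median, beta)
               for a, r in zip(levels, rates))

# At the median capacity the fragility is exactly 0.5.
rate = annual_failure_rate([0.5], [1.0e-3], median=0.5, beta=0.4)
print(rate)  # 0.0005
```

Seismic isolation enters this picture by raising the effective median capacity (or lowering the demand), which drives the convolved rate down.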

  15. 76 FR 45843 - Agency Information Collection Activities: E-Verify Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... SECURITY U.S. Citizenship and Immigration Services Agency Information Collection Activities: E-Verify... Program; Verify Employment Eligibility Status''. Instead it should read ``E-Verify Program.'' Second, comments are not requested on the E-Verify Memorandum of Understanding as previously published....

  16. Short-Period Seismic Noise in Vorkuta (Russia)

    SciTech Connect

    Kishkina, S B; Spivak, A A; Sweeney, J J

    2008-05-15

    Cultural development of new subpolar areas of Russia is associated with a need for detailed seismic research, including both mapping of regional seismicity and seismic monitoring of specific mining enterprises. Of special interest are the northern territories of European Russia, including the shelves of the Kara and Barents Seas, the Yamal Peninsula, and the Timan-Pechora region. Continuous seismic studies of these territories are important now because there is insufficient seismological knowledge of the area and an absence of systematic data on the seismicity of the region. Another task of current interest is the need to consider the seismic environment in the design, construction, and operation of natural gas extraction enterprises such as the construction of the North European Gas Pipeline. Issues of scientific importance for seismic studies in the region are the complex geodynamical setting, the presence of permafrost, and the complex tectonic structure. In particular, the Uralian Orogene (Fig. 1) strongly affects the propagation of seismic waves. The existing subpolar seismic stations [APA (67.57°N, 33.40°E), LVZ (67.90°N, 34.65°E), and NRIL (69.50°N, 88.40°E)] do not cover the extensive area between the Pechora and Ob Rivers (Fig. 1). Thus seismic observations in the Vorkuta area, which lies within the area of concern, are of special interest. Continuous recording at a seismic station near the city of Vorkuta (67.50°N, 64.11°E) [1] has been conducted since 2005 for the purpose of regional seismic monitoring and, more specifically, detection of seismic signals caused by local mining enterprises. Current surveys of local seismic noise [7,8,9,11] are particularly aimed at a technical survey for the suitability of the site for installation of a small-aperture seismic array, which would include 10-12 recording instruments, with the Vorkuta seismic station as the central element. When constructed, this seismic

  17. The Budget Guide to Seismic Network Management

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Ebel, J. E.

    2007-05-01

    Regardless of their size, there are certain tasks that all seismic networks must perform, including data collection and processing, earthquake location, information dissemination, and quality control. Small seismic networks are unlikely to possess the resources -- manpower and money -- required to do much in-house development. Fortunately, there are a lot of free or inexpensive software solutions available that are able to perform many of the required tasks. Often the available solutions are all-in-one turnkey packages designed and developed for much larger seismic networks, and the cost of adapting them to a smaller network must be weighed against the ease with which other, non-seismic software can be adapted to the same task. We describe here the software and hardware choices we have made for the New England Seismic Network (NESN), a sparse regional seismic network responsible for monitoring and reporting all seismicity within the New England region in the northeastern U.S. We have chosen to use a cost-effective approach to monitoring using free, off-the-shelf solutions where available (e.g., Earthworm, HYP2000) and modifying freeware solutions when it is easier than trying to adapt a large, complicated package. We have selected for use software that is: free, likely to receive continued support from the seismic or, preferably, larger internet community, and modular. Modularity is key to our design because it ensures that if one component of our processing system becomes obsolete, we can insert a suitable replacement with few modifications to the other modules. Our automated event detection, identification and location system is based on a wavelet transform analysis of station data that arrive continuously via TCP/IP transmission over the internet. 
Our system for interactive analyst review of seismic events and remote system monitoring utilizes a combination of Earthworm modules, Perl cgi-bin scripts, Java, and native Unix commands and can now be carried out via
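
The record gives no implementation details of the wavelet-transform detector, so the following is only a generic sketch of the idea: correlate the continuous trace with a wavelet at one or more scales and flag samples whose coefficients exceed a threshold. The Ricker wavelet and the fixed threshold are assumptions, not the NESN design:

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def wavelet_detect(trace, widths, threshold):
    """Flag samples where any wavelet-scale correlation exceeds threshold."""
    flags = np.zeros(len(trace), dtype=bool)
    for a in widths:
        w = ricker(int(10 * a), a)
        coeff = np.convolve(trace, w, mode="same")
        flags |= np.abs(coeff) > threshold
    return flags

# A single impulse stands out against a quiet trace.
trace = np.zeros(200)
trace[100] = 5.0
flags = wavelet_detect(trace, widths=[4.0], threshold=3.0)
print(flags[100], flags[0])  # True False
```

A production detector would add scale-dependent thresholds and noise-adaptive normalization; this sketch only shows the transform-and-threshold core.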

  18. Using Multiple Representations to Make and Verify Conjectures

    ERIC Educational Resources Information Center

    Garcia, Martha; Benitez, Alma

    2011-01-01

    This article reports on the results of research, the objective of which was to document and analyze the manner in which students relate different representations when solving problems. A total of 20 students attending their first year of university studies took part in the study. In order to design the problem, the underlying information in each…

  19. Compressive and Shear Wave Velocity Profiles using Seismic Refraction Technique

    NASA Astrophysics Data System (ADS)

    Aziman, M.; Hazreek, Z. A. M.; Azhar, A. T. S.; Haimi, D. S.

    2016-04-01

    Seismic refraction measurement is one of the geophysical exploration techniques used to determine the soil profile. Meanwhile, the borehole technique is an established way to identify changes in soil layers based on the number of blows required to penetrate the soil. Both techniques are commonly adopted for subsurface investigation. The seismic refraction test is a non-destructive and relatively fast assessment compared to the borehole technique. The compressive and shear wave velocities derived from seismic refraction measurements can be used directly to calculate soil parameters such as soil modulus and Poisson’s ratio. This study investigates the seismic refraction technique to obtain compressive and shear wave velocity profiles. Using vertical and horizontal geophones as well as vertical and horizontal strike directions of the transient seismic source, the propagation of compressive and shear waves can be examined, respectively. The study was conducted at Sejagung Sri Medan. The seismic velocity profile was obtained at a depth of 20 m. The velocity of the shear wave is about half of the velocity of the compression wave. The compressive and shear wave velocity profiles were verified against borehole data and showed good agreement.
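
The relation between the two measured velocities and Poisson's ratio can be written as nu = (Vp^2 - 2 Vs^2) / (2 (Vp^2 - Vs^2)). A small sketch; note that the reported observation Vs roughly equal to Vp/2 implies nu near 1/3:

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (valid for vp/vs > sqrt(2))."""
    r2 = (vp / vs) ** 2  # squared velocity ratio
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# Vs equal to half of Vp gives nu = 1/3.
print(round(poissons_ratio(2.0, 1.0), 4))  # 0.3333
```

Only the velocity ratio matters, so the same result holds at any depth in the profile where Vs is half of Vp.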

  20. Static corrections for enhanced signal detection at IMS seismic arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, Neil; Wookey, James; Selby, Neil

    2016-04-01

    Seismic monitoring forms an important part of the International Monitoring System (IMS) for verifying the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Analysis of seismic data can be used to discriminate between nuclear explosions and the tens of thousands of natural earthquakes of similar magnitude that occur every year. This is known as "forensic seismology", and techniques include measuring the P-to-S wave amplitude ratio, the body-to-surface wave magnitude ratio (mb/Ms), and source depth. Measurement of these seismic discriminants requires very high signal-to-noise ratio (SNR) data, and this has led to the development and deployment of seismic arrays as part of the IMS. Array processing methodologies such as stacking can be used, but optimum SNR improvement needs an accurate estimate of the arrival time of the particular seismic phase. To enhance the imaging capability of IMS arrays, we aim to develop site-specific static corrections to the arrival time as a function of frequency, slowness and backazimuth. Here, we present initial results for the IMS TORD array in Niger. Vespagrams are calculated for various events using the F-statistic to clearly identify seismic phases and measure their arrival times. Observed arrival times are compared with those predicted by 1D and 3D velocity models, and residuals are calculated for a range of backazimuths and slownesses. Finally, we demonstrate the improvement in signal fidelity provided by these corrections.
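
The stacking step that static corrections are meant to sharpen can be sketched as a delay-and-sum: shift each station's trace by its predicted arrival-time offset (plane-wave moveout plus the station static) and average. The integer-sample shifting below is a simplification for illustration:

```python
import numpy as np

def delay_and_stack(traces, delays, dt):
    """Align traces by their predicted delays (in seconds) and average,
    so coherent arrivals add while incoherent noise averages down."""
    stacked = np.zeros(traces.shape[1])
    for trace, delay in zip(traces, delays):
        stacked += np.roll(trace, -int(round(delay / dt)))
    return stacked / len(traces)

# Two traces carrying the same pulse, one delayed by 3 samples.
t0 = np.zeros(100); t0[50] = 1.0
t1 = np.zeros(100); t1[53] = 1.0
beam = delay_and_stack(np.vstack([t0, t1]), delays=[0.0, 3.0], dt=1.0)
print(beam[50])  # 1.0
```

A frequency-, slowness-, and backazimuth-dependent static correction of the kind the abstract proposes would simply refine the `delays` fed to such a stack.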

  1. Seismic Hazard Characterization at the DOE Savannah River Site (SRS): Status report

    SciTech Connect

    Savy, J.B.

    1994-06-24

    The purpose of the Seismic Hazard Characterization project for the Savannah River Site (SRS-SHC) is to develop estimates of the seismic hazard for several locations within the SRS. Given the differences in the geology and geotechnical characteristics at each location, the estimates of the seismic hazard are to allow for the specific local conditions at each site. Characterization of seismic hazard is a critical factor for the design of new facilities as well as for the review and potential retrofit of existing facilities at SRS. The scope of the SRS seismic hazard characterization reported in this document is limited to the Probabilistic Seismic Hazard Analysis (PSHA). The goal of the project is to provide seismic hazard estimates based on a state-of-the-art method which is consistent with developments and findings of several ongoing studies which are deemed to bring improvements in the state of the seismic hazard analyses.
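
A full PSHA combines recurrence models, ground-motion attenuation, and site response; its smallest building block is the earthquake recurrence rate. Below is a Gutenberg-Richter sketch with invented a- and b-values, plus the Poissonian exceedance probability used in hazard statements. None of the numbers correspond to the SRS study:

```python
import math

def gr_annual_rate(m, a=4.0, b=1.0):
    """Annual rate of earthquakes with magnitude >= m (Gutenberg-Richter)."""
    return 10.0 ** (a - b * m)

def exceedance_probability(m, t_years, a=4.0, b=1.0):
    """Probability of at least one magnitude >= m event in t_years,
    assuming Poissonian occurrence."""
    return 1.0 - math.exp(-gr_annual_rate(m, a, b) * t_years)

# With these invented parameters, M >= 6 has a 1%/yr rate, giving
# roughly a 39% chance of at least one such event in 50 years.
print(round(exceedance_probability(6.0, 50.0), 3))  # 0.393
```

Site-specific local conditions, as emphasized in the abstract, enter later in the chain, when ground motion at each location is computed from these source rates.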

  2. Optimizing Seismic Monitoring Networks for EGS and Conventional Geothermal Projects

    NASA Astrophysics Data System (ADS)

    Kraft, Toni; Herrmann, Marcus; Bethmann, Falko; Wiemer, Stefan

    2013-04-01

    In the past several years, geological energy technologies have received growing attention and have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate-sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, now pose a considerable risk to the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential for the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring in an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic light systems and is therefore an important confidence-building measure toward the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that makes it possible to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts.
The algorithm is based on the D-optimal experimental design that aims to minimize the error ellipsoid of the linearized
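
A D-optimal comparison of candidate geometries can be sketched by forming the travel-time Jacobian G for a trial source and ranking networks by det(G^T G); a larger determinant means a smaller linearized error ellipsoid. The homogeneous velocity model below is a deliberate simplification, and the geometries are invented:

```python
import numpy as np

def d_criterion(stations, source, v=3.0):
    """det(G^T G) for the travel-time Jacobian of a homogeneous-velocity
    location problem; larger values mean a tighter error ellipsoid."""
    source = np.asarray(source, dtype=float)
    rows = []
    for s in np.asarray(stations, dtype=float):
        d = source - s
        rows.append(d / (np.linalg.norm(d) * v))  # d(travel time)/d(source)
    G = np.array(rows)
    return np.linalg.det(G.T @ G)

# A surrounding network constrains the source far better than a
# one-sided line of stations at the same distances.
src = (0.0, 0.0, -5.0)
ring = [(10, 0, 0), (-10, 0, 0), (0, 10, 0), (0, -10, 0)]
line = [(10, 0, 0), (11, 0, 0), (12, 0, 0), (13, 0, 0)]
print(d_criterion(ring, src) > d_criterion(line, src))  # True
```

An optimizer of the kind described would search over station placements to maximize this criterion for the expected source region.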

  3. Tornado Detection Based on Seismic Signal.

    NASA Astrophysics Data System (ADS)

    Tatom, Frank B.; Knupp, Kevin R.; Vitton, Stanley J.

    1995-02-01

    At the present time the only generally accepted method for detecting when a tornado is on the ground is human observation. Based on theoretical considerations combined with eyewitness testimony, there is strong reason to believe that a tornado in contact with the ground transfers a significant amount of energy into the ground. The amount of energy transferred depends upon the intensity of the tornado and the characteristics of the surface. Some portion of this energy takes the form of seismic waves, both body and surface waves. Surface waves (Rayleigh and possibly Love) represent the most likely type of seismic signal to be detected. Based on the existence of such a signal, a seismic tornado detector appears conceptually possible. The major concerns for designing such a detector are range of detection and discrimination between the tornadic signal and other types of surface waves generated by ground transportation equipment, high winds, or other nontornadic sources.

  4. Permafrost Active Layer Seismic Interferometry Experiment (PALSIE).

    SciTech Connect

    Abbott, Robert; Knox, Hunter Anne; James, Stephanie; Lee, Rebekah; Cole, Chris

    2016-01-01

    We present findings from a novel field experiment conducted at Poker Flat Research Range in Fairbanks, Alaska that was designed to monitor changes in active layer thickness in real time. Results are derived primarily from seismic data streaming from seven Nanometrics Trillium Posthole seismometers directly buried in the upper section of the permafrost. The data were evaluated using two analysis methods: Horizontal to Vertical Spectral Ratio (HVSR) and ambient noise seismic interferometry. Results from the HVSR conclusively illustrated the method's effectiveness at determining the active layer's thickness with a single station. Investigations with the multi-station method (ambient noise seismic interferometry) are continuing at the University of Florida and have not yet conclusively determined active layer thickness changes. Further work continues with the Bureau of Land Management (BLM) to determine if the ground based measurements can constrain satellite imagery, which provide measurements on a much larger spatial scale.
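
A common way to turn a single-station HVSR measurement into a layer thickness is the quarter-wavelength rule h = Vs / (4 f0), where f0 is the spectral-ratio peak frequency. The spectra and shear velocity below are synthetic, for illustration only, and this is one standard interpretation rather than necessarily the workflow used in the experiment:

```python
import numpy as np

def hvsr_thickness(freqs, h_spectrum, v_spectrum, vs):
    """Layer thickness from the HVSR peak frequency via h = Vs / (4 f0)."""
    f0 = freqs[np.argmax(h_spectrum / v_spectrum)]
    return vs / (4.0 * f0)

# Synthetic spectra whose ratio peaks at 5 Hz; Vs = 200 m/s gives 10 m.
freqs = np.arange(1.0, 21.0)
h_spec = np.exp(-((freqs - 5.0) ** 2))
v_spec = np.ones_like(freqs)
print(hvsr_thickness(freqs, h_spec, v_spec, vs=200.0))  # 10.0
```

Tracking f0 through the thaw season is what lets a single buried station monitor active-layer thickness changes in near real time.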

  5. Seismic exploration for water on Mars

    NASA Technical Reports Server (NTRS)

    Page, Thornton

    1987-01-01

    It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present surface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each is well formed on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.

  6. SEISMIC MODELING ENGINES PHASE 1 FINAL REPORT

    SciTech Connect

    BRUCE P. MARION

    2006-02-09

    Seismic modeling is a core component of petroleum exploration and production today. Potential applications include modeling the influence of dip on anisotropic migration; source/receiver placement in deviated-well three-dimensional surveys for vertical seismic profiling (VSP); and the generation of realistic data sets for testing contractor-supplied migration algorithms or for interpreting AVO (amplitude variation with offset) responses. This project was designed to extend the use of a finite-difference modeling package, developed at Lawrence Berkeley Laboratories, to the advanced applications needed by industry. The approach included providing a realistic, easy-to-use 2-D modeling package for the desktop of the practicing geophysicist. The feasibility of providing a wide-ranging set of seismic modeling engines was fully demonstrated in Phase I. The technical focus was on adding variable gridding in both the horizontal and vertical directions, incorporating attenuation, improving absorbing boundary conditions, and adding optional-coefficient finite-difference methods.

  7. Overview of seismic considerations at the Paducah Gaseous Diffusion Plant

    SciTech Connect

    Hunt, R.J.; Stoddart, W.C.; Burnett, W.A.; Beavers, J.E.

    1992-10-01

    This paper presents an overview of seismic considerations at the Paducah Gaseous Diffusion Plant (PGDP), which is managed by Martin Marietta Energy Systems, Inc., for the Department of Energy (DOE). The overview describes the original design, the seismic evaluations performed for the Safety Analysis Report (SAR) issued in 1985, and current evaluations and designs to address revised DOE requirements. Future plans to ensure that changes in requirements and knowledge are addressed are also described.

  8. Constraints on Subglacial Conditions from Seismicity

    NASA Astrophysics Data System (ADS)

    Lipovsky, B.; Olivo, D. C.; Dunham, E. M.

    2014-12-01

    A family of physics-based models designed to explain emergent, bandlimited, "tremor-like" seismograms sheds light on subglacial and englacial conditions. We consider two such models. In the first, a water-filled fracture hosts resonant modes; the seismically observable quality factor and characteristic frequency of these modes constrain the fracture length and aperture. In the second model, seismicity is generated by repeating stick-slip events on a fault patch (portion of the glacier bed) with sliding described by rate- and state-dependent friction laws. Wave propagation phenomena may additionally generate bandlimited seismic signals. These models make distinct predictions that may be used to address questions of glaciological concern. Laboratory friction experiments show that small, repeating earthquakes most likely occur at the ice-till interface and at conditions below the pressure melting point. These laboratory friction values, when combined with observed ice surface velocities, may also be used to constrain basal pore pressure. In contrast, seismic signals indicative of water-filled basal fractures suggest that, at least locally, temperatures are above the pressure melting point. We present a simple diagnostic test between these two processes that concerns the relationship between the multiple seismic spectral peaks generated by each process. Whereas repeating earthquakes generate evenly spaced spectral peaks through the Dirac comb effect, hydraulic fracture resonance, as a result of dispersive propagation of waves along the crack, generates spectral peaks that are not evenly spaced.
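The Dirac-comb effect mentioned above — identical repeating events producing evenly spaced spectral peaks — can be demonstrated on a synthetic record (all values here are illustrative, not from the study):

```python
import numpy as np

fs = 1000.0          # sampling rate (Hz)
T = 0.1              # repeat interval of the stick-slip events (s)
dur = 2.0            # record length (s)
n = int(fs * dur)

x = np.zeros(n)
x[::int(fs * T)] = 1.0                       # identical impulses every T seconds

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
peaks = freqs[spec > 0.9 * spec.max()]       # spectral lines of the comb
spacing = np.diff(peaks)                     # evenly spaced, at 1/T = 10 Hz
```

A dispersive crack resonance would instead produce peaks whose spacing drifts with frequency, which is the diagnostic contrast the abstract describes.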

  9. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    SciTech Connect

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  10. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm

    PubMed Central

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation, which can provide navigation information by sensing polarization, has advantages in precision and interference resistance over polarization navigation sensors that use photodiodes. Although many types of imaging polarimeters exist, they may not be suitable for research on imaging polarization navigation algorithms. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for this type of system, covering camera-parameter calibration and correction of complementary metal oxide semiconductor (CMOS) response inconsistency, were discussed, designed, and implemented. The calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that, with these calibrations, the system could acquire and process polarized skylight images and resolve orientation with the algorithm under verification in real time. An orientation determination algorithm based on image processing was tested on the system, and its performance and properties were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851

  11. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm.

    PubMed

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation, which can provide navigation information by sensing polarization, has advantages in precision and interference resistance over polarization navigation sensors that use photodiodes. Although many types of imaging polarimeters exist, they may not be suitable for research on imaging polarization navigation algorithms. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for this type of system, covering camera-parameter calibration and correction of complementary metal oxide semiconductor (CMOS) response inconsistency, were discussed, designed, and implemented. The calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that, with these calibrations, the system could acquire and process polarized skylight images and resolve orientation with the algorithm under verification in real time. An orientation determination algorithm based on image processing was tested on the system, and its performance and properties were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filter. PMID:26805851

  12. Seismic Safety Study

    SciTech Connect

    Tokarz, F J; Coats, D W

    2006-05-16

    During the past three decades, the Laboratory has been proactive in providing a seismically safe working environment for its employees and the general public. Seismic upgrades completed during this period have exceeded $30M, with over 24 buildings structurally upgraded. Nevertheless, questions still frequently arise regarding the seismic safety of existing buildings. To address these issues, a comprehensive study was undertaken to develop an improved understanding of the seismic integrity of the Laboratory's entire building inventory at the Livermore Main Site and Site 300. The study, completed in February 2005, extended the results of the 1998 seismic safety study performed under Presidential Executive Order 12941, which required each federal agency to develop an inventory of its buildings and to estimate the cost of mitigating unacceptable seismic risks. Degenkolb Engineers, who performed the first study, was recontracted to perform structural evaluations, rank-order the buildings by their level of seismic deficiency, and develop conceptual rehabilitation schemes for the most seriously deficient buildings. Their evaluation is based on screening procedures and guidelines established by the Interagency Committee on Seismic Safety in Construction (ICSSC). Currently, there is an inventory of 635 buildings in the Laboratory's Facility Information Management System (FIMS) database, of which 58 were identified by Degenkolb Engineers as requiring seismic rehabilitation. The remaining 577 buildings were judged adequate from a seismic safety viewpoint. These evaluations followed the seismic safety performance objectives of DOE standard (DOE STD 1020) Performance Category 1 (PC1). The 58 buildings were ranked according to three risk-based priority classifications (A, B, and C) as shown in Figure 1-1 (all 58 buildings have structural deficiencies). Table 1-1 provides a brief description of their expected performance and damage state

  13. Spot: A Programming Language for Verified Flight Software

    NASA Technical Reports Server (NTRS)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  14. Effects of Large and Small-Source Seismic Surveys on Marine Mammals and Sea Turtles

    NASA Astrophysics Data System (ADS)

    Holst, M.; Richardson, W. J.; Koski, W. R.; Smultea, M. A.; Haley, B.; Fitzgerald, M. W.; Rawson, M.

    2006-05-01

    L-DEO implements a marine mammal and sea turtle monitoring and mitigation program during its seismic surveys. The program consists of visual observations, mitigation, and/or passive acoustic monitoring (PAM). Mitigation includes ramp-ups, power-downs, and shutdowns of the seismic source if marine mammals or turtles are detected in or about to enter designated safety radii. Visual observations for marine mammals and turtles have taken place during all 11 L-DEO surveys since 2003, and PAM was done during five of those. Large sources were used during six cruises (10 to 20 airguns; 3050 to 8760 in³; PAM during four cruises). For two interpretable large-source surveys, densities of marine mammals were lower during seismic than non-seismic periods. During a shallow-water survey off Yucatán, delphinid densities during non-seismic periods were 19x higher than during seismic; however, this number is based on only 3 sightings during seismic and 11 sightings during non-seismic. During a Caribbean survey, densities were 1.4x higher during non-seismic. The mean closest point of approach (CPA) for delphinids for both cruises was significantly farther during seismic (1043 m) than during non-seismic (151 m) periods (Mann-Whitney U test, P < 0.001). Large whales were only seen during the Caribbean survey; mean CPA during seismic was 1722 m compared to 1539 m during non-seismic, but sample sizes were small. Acoustic detection rates with and without seismic were variable for three large-source surveys with PAM, with rates during seismic ranging from 1/3 to 6x those without seismic (n = 0 for fourth survey). The mean CPA for turtles was closer during non-seismic (139 m) than seismic (228 m) periods (P < 0.01). Small-source surveys used up to 6 airguns or 3 GI guns (75 to 1350 in³). During a Northwest Atlantic survey, delphinid densities during seismic and non-seismic were similar. However, in the Eastern Tropical Pacific, delphinid densities during non-seismic were 2x those during
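The Mann-Whitney U statistic used above to compare closest-point-of-approach distances can be sketched in a few lines. This toy version omits the normal approximation and p-value that a full test (e.g. scipy.stats.mannwhitneyu) provides, and the sample distances are invented, not survey data:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples (no tie correction).

    U counts, over all pairs, how often a value from `a` exceeds one from `b`
    (ties count 1/2); it is the basis of the rank-sum comparison.
    """
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Toy closest-point-of-approach distances in metres (invented for illustration):
cpa_seismic = [900.0, 1100.0, 1200.0]
cpa_non_seismic = [100.0, 150.0, 200.0]
u = mann_whitney_u(cpa_seismic, cpa_non_seismic)   # 9.0: every seismic CPA is farther
```

A U equal to the product of the sample sizes (here 3 x 3 = 9) means complete separation of the two samples; values near half that indicate no difference.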

  15. LLNL's regional seismic discrimination research

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.

    1995-07-01

    The ability to negotiate and verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve our understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report we discuss our preliminary efforts to geophysically characterize two regions, the Korean Peninsula and the Middle East-North Africa region. We show that the remarkable stability of coda allows us to develop physically based, stable single station magnitude scales in new regions. We then discuss our progress to date on evaluating and improving our physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. We apply this modeling ability to develop improved discriminants including slopes of P to S ratios. We find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally we discuss our development and use of new coda and waveform modeling tools to investigate special events.

  16. Regional seismic discrimination research at LLNL

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.; Patton, H.J.; Jarpe, S.; Glenn, L.

    1995-10-01

    The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve the understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report the authors discuss preliminary efforts to geophysically characterize the Middle East and North Africa. They show that the remarkable stability of coda allows one to develop physically based, stable single station magnitude scales in new regions. They then discuss progress to date on evaluating and improving physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. The authors apply this modeling ability to develop improved discriminants including slopes of P to S ratios. They find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally they discuss development and use of new coda and waveform modeling tools to investigate special events.

  17. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that certain classes of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundations as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root-locus plots have helped conventional control design and validation.

  18. Web Seismic Un*x: making seismic reflection processing more accessible

    NASA Astrophysics Data System (ADS)

    Templeton, M. E.; Gough, C. A.

    1999-05-01

    Web Seismic Un*x is a browser-based user interface for the Seismic Un*x freeware developed at Colorado School of Mines. The interface allows users to process and display seismic reflection data from any remote platform that runs a graphical Web browser. Users access data and create processing jobs on a remote server by completing form-based Web pages whose Common Gateway Interface scripts are written in Perl. These scripts supply parameters, manage files, call Seismic Un*x routines and return data plots. The interface was designed for undergraduate commuter students taking geophysics courses who need to: (a) process seismic data and other time series as a class using computers in campus teaching labs and (b) complete course assignments at home. Students from an undergraduate applied geophysics course tested the Web user interface while completing laboratory assignments in which they acquired and processed common-depth-point seismic reflection data into a subsurface image. This freeware, which will be publicly available by summer 1999, was developed and tested on a Solaris 2.5 server and will be ported to other versions of Unix, including Linux.

  19. Estimation of background noise level on seismic station using statistical analysis for improved analysis accuracy

    NASA Astrophysics Data System (ADS)

    Han, S. M.; Hahm, I.

    2015-12-01

    We evaluated the background noise levels of seismic stations in order to collect high-quality observation data and produce accurate seismic information. The background noise level was determined using the PSD (Power Spectral Density) method of McNamara and Buland (2004). This method, which uses long-term data, is influenced not only by the sensor's inherent electronic noise and pulse-like transients during stabilization but also by missing data, and it is affected at certain frequencies by irregular signals unrelated to site characteristics. Filtering out such abnormal signals within an automated system is difficult and inefficient. To solve these problems, we devised a method for extracting the data that are normally distributed within 90 to 99% confidence intervals at each period. The applicability of the method was verified using 62 seismic stations with broadband and short-period sensors operated by the KMA (Korea Meteorological Administration). The evaluation standards were the NHNM (New High Noise Model) and NLNM (New Low Noise Model) published by the USGS (United States Geological Survey). These models were designed for the western United States, whereas the Korean Peninsula, surrounded by ocean on three sides, has a complicated geological structure and a high population density. We therefore re-designed an appropriate model for the Korean Peninsula from the statistically combined results. An important feature is that the secondary-microseism peak appears at a higher frequency band. Acknowledgements: This research was carried out as a part of "Research for the Meteorological and Earthquake Observation Technology and Its Application" supported by the 2015 National Institute of Meteorological Research (NIMR) in the Korea Meteorological Administration.
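A simplified stand-in for the screening idea described above — segment-wise PSDs with outlying segments discarded per frequency — might look like this. The percentile band and window choices are assumptions, not the authors' exact 90-99% confidence-interval procedure:

```python
import numpy as np

def segment_psds(x, fs, nfft=256):
    """One-sided PSD of each non-overlapping Hann-windowed segment of x."""
    win = np.hanning(nfft)
    norm = fs * (win ** 2).sum()
    nseg = len(x) // nfft
    segs = x[:nseg * nfft].reshape(nseg, nfft) * win
    return np.abs(np.fft.rfft(segs, axis=1)) ** 2 / norm   # shape (nseg, nfft//2 + 1)

def screened_noise_level(psds, lo=5.0, hi=95.0):
    """Per-frequency median of PSD values inside the [lo, hi] percentile band;
    segments contaminated by spikes or gaps fall outside and are ignored."""
    plo = np.percentile(psds, lo, axis=0)
    phi = np.percentile(psds, hi, axis=0)
    kept = np.where((psds >= plo) & (psds <= phi), psds, np.nan)
    return np.nanmedian(kept, axis=0)
```

Production implementations of the McNamara-Buland approach (e.g. the PQLX/ObsPy PPSD tools) use overlapping windows and full probability density functions; this sketch keeps only the outlier-screening idea.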

  20. Enhancing Seismic Monitoring Capability for Hydraulic Fracturing Induced Seismicity in Canada

    NASA Astrophysics Data System (ADS)

    Kao, H.; Cassidy, J. F.; Farahbod, A.; Lamontagne, M.

    2012-12-01

    The amount of natural gas produced from unconventional sources, such as shale gas, has increased dramatically over the last decade. One of the key factors in the success of shale gas production is the application of hydraulic fracturing (also known as "fracking") to facilitate the efficient recovery of natural gas from shale matrices. As fracking operations become routine in all major shale gas fields, their potential to induce local earthquakes at some locations has become a public concern. To address this concern, Natural Resources Canada has initiated a research effort to investigate the potential links between fracking operations and induced seismicity in some major shale gas basins of Canada. This federal-provincial collaborative research aims to assess whether shale gas fracking can alter the regional pattern of background seismicity and, if so, what the relationship would be between how fracking is conducted and the maximum magnitude of induced seismicity. Other objectives include investigating the time scale of the interaction between fracking events and induced seismicity and evaluating the induced-seismicity potential of shale gas basins under different tectonic/geological conditions. The first phase of this research is to enhance the detection and monitoring capability for seismicity possibly related to shale gas recovery in Canada. Densification of the Canadian National Seismograph Network (CNSN) is currently underway in northeast British Columbia, where fracking operations are taking place. Additional seismic stations are planned for major shale gas basins in other regions where fracking might be likely in the future. All newly established CNSN stations are equipped with broadband seismographs with real-time continuous data transmission. The design goal of the enhanced seismic network is to significantly lower the detection threshold such that the anticipated low-magnitude earthquakes that might be related to fracking operations can be

  1. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    NASA Astrophysics Data System (ADS)

    Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Masturyono; Rudiyanto, Ariska; Pranata, Bayu; Muzli; Widodo, Handi Sulistyo; Sudrajat, Ajat; Sugiharto, Anton

    2015-04-01

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on 29 May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign to identify the occurrence and location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for qualitative analyses of earthquake triggering, earthquake-volcano, and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.
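The ambient-noise cross-correlation step can be sketched as below: correlating long noise records from two stations concentrates energy at the inter-station travel time, which is the basis for extracting empirical Green's functions. The synthetic delay and record length are illustrative, not values from the experiment:

```python
import numpy as np

def noise_crosscorrelation(u1, u2, max_lag):
    """Cross-correlate two ambient-noise records; return lags -max_lag..+max_lag
    (in samples). Stacking many such correlations over time windows is how
    ambient-noise interferometry builds up the empirical Green's function."""
    n = len(u1)
    full = np.correlate(u1, u2, mode="full")   # lags -(n-1) .. +(n-1)
    mid = n - 1                                # index of zero lag
    return full[mid - max_lag: mid + max_lag + 1]

# Synthetic check: one noise source recorded at a second station with a
# 40-sample propagation delay (illustrative numbers only).
rng = np.random.default_rng(1)
src = rng.standard_normal(5000)
delay = 40
u1 = src
u2 = np.concatenate([np.zeros(delay), src[:-delay]])
cc = noise_crosscorrelation(u1, u2, max_lag=100)
lag = int(np.argmax(np.abs(cc))) - 100         # peak magnitude equals the delay;
                                               # its sign follows np.correlate's convention
```

In practice the records are first whitened and one-bit normalized, and correlations are stacked over weeks of data before measuring Rayleigh-wave dispersion.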

  2. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    SciTech Connect

    Karyono; Mazzini, Adriano; Sugiharto, Anton; Lupi, Matteo; Syafri, Ildrem; Masturyono,; Rudiyanto, Ariska; Pranata, Bayu; Muzli,; Widodo, Handi Sulistyo; Sudrajat, Ajat

    2015-04-24

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on 29 May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deploy 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign to identify the occurrence and location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for qualitative analyses of earthquake triggering, earthquake-volcano, and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.

  3. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. First, data collection and processing are done in the "cloud" (the Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of the waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are released to users outside the network only after a felt earthquake.
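The privacy rule described above — showing fixed-sensor locations only as per-cell counts, and only when at least two sensors share a cell — can be sketched as follows (the cell size, threshold, and coordinates are hypothetical, not CSN's actual scheme):

```python
from collections import Counter

def cell_counts(sensor_lonlats, cell_deg=0.001, min_count=2):
    """Aggregate sensor positions into grid cells and report only cells
    holding at least `min_count` sensors, so that no individual host's
    location can be singled out on the public map."""
    counts = Counter(
        (int(lon // cell_deg), int(lat // cell_deg))   # snap to cell indices
        for lon, lat in sensor_lonlats
    )
    return {cell: c for cell, c in counts.items() if c >= min_count}

# Three hypothetical hosts: two share a cell, one is alone (and is suppressed).
cells = cell_counts([(-118.1234, 34.1501), (-118.1236, 34.1503),
                     (-118.2005, 34.3001)])
```

Only the shared cell survives with a count of 2; the isolated sensor is never displayed.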

  4. Gravity of the New Madrid seismic zone; a preliminary study

    USGS Publications Warehouse

    Langenheim, V.E.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Mo. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  5. Seismic surveys test on Innerhytta Pingo, Adventdalen, Svalbard Islands

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Rossi, Giuliana; Petronio, Lorenzo; Accaino, Flavio; Romeo, Roberto; Wheeler, Walter

    2015-04-01

    We present the preliminary results of an experimental full-wave seismic survey test conducted on the Innerhytta Pingo, located in Adventdalen, Svalbard Islands, Norway. Several seismic survey methods were used to study the pingo's inner structure, from classical reflection/refraction arrays to seismic tomography and surface-wave analysis. The aim of the project IMPERVIA, funded by the Italian PNRA, was the evaluation of the permafrost characteristics beneath this open-system pingo by means of seismic investigation, while evaluating best practice in terms of logistic deployment. The survey was done in April-May 2014: we collected 3 seismic lines with different receiver spacings (from 2.5 m to 5 m), for a total length of more than 1 km. We collected data with different vertical geophones (natural frequencies of 4.5 Hz and 14 Hz) as well as with a seismic snow-streamer. We tested different seismic sources (hammer, seismic gun, fire crackers, and heavy weight drop), and we carefully verified geophone coupling in order to evaluate the different responses. In these peculiar conditions we found that fire crackers give the best signal-to-noise ratio for refraction/reflection surveys. To ensure the best geophone coupling with the frozen soil, we dug snow pits to remove the snow-cover effect. On the other hand, for the surface-wave methods, the very high velocity of the permafrost strongly limits the generation of long wavelengths, both with these explosive sources and with the common sledgehammer. The only source capable of generating low frequencies was a heavy drop-weight system, which allowed us to analyze surface-wave dispersion below 10 Hz. Preliminary data analysis shows marked velocity inversions and strong velocity contrasts at depth. The combined use of surface and body waves highlights the presence of a heterogeneous soil deposit beneath a thick layer of permafrost. This is the level that hosts the water circulation from depth controlling

  6. A procedure for seismic risk reduction in Campania Region

    SciTech Connect

    Zuccaro, G.; Palmieri, M.; Cicalese, S.; Grassi, V.; Rauci, M.; Maggio, F.

    2008-07-08

    The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of the seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out the analyses in accordance with the guidelines. The scientific committee may require additional documents and studies before approving the safety checks. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. Several seismic safety checks have now been completed, and the results are presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from previously realised vulnerability data and studies, was selected for immediate retrofitting design. A second set of buildings was then identified for structural strengthening; these were selected using the criteria specified in the guidelines prepared by the scientific committee and based on

  7. Third Quarter Hanford Seismic report for Fiscal year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-09-11

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 356 triggers during the third quarter of fiscal year 2003. Of these triggers, 141 were earthquakes, of which 34 were located in the Hanford Seismic Network area. Stratigraphically, 15 occurred in the Columbia River basalt, 13 in the pre-basalt sediments, and 6 in the crystalline basement. Geographically, 22 earthquakes occurred in swarm areas, 1 earthquake was associated with a major geologic structure, and 11 were classified as random events. During the third quarter, an earthquake swarm consisting of 15 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur at the base of the Columbia River Basalt Group.

  8. A procedure for seismic risk reduction in Campania Region

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.

    2008-07-01

    The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of the seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out the analyses in accordance with the guidelines. The scientific committee may require additional documents and studies before approving the safety checks. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. Several seismic safety checks have now been completed, and the results are presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from previously realised vulnerability data and studies, was selected for immediate retrofitting design. A second set of buildings was then identified for structural strengthening; these were selected using the criteria specified in the guidelines prepared by the scientific committee and based on

  9. Seismic source parameters

    SciTech Connect

    Johnson, L.R.

    1994-06-01

    The use of information contained in seismograms to infer the properties of an explosion source presents an interesting challenge because the seismic waves recorded on the seismograms represent only small, indirect effects of the explosion. The essential physics of the problem includes the process by which these elastic waves are generated by the explosion and the process of propagating the seismic waves from the source region to the sites where the seismic data are collected. Interpretation of the seismic data in terms of source properties requires that the effects of these generation and propagation processes be taken into account. The propagation process involves linear mechanics, and a variety of standard seismological methods have been developed for handling this part of the problem. The generation process presents a more difficult problem, as it involves non-linear mechanics, but semi-empirical methods that appear to yield reasonable results have been developed for handling it. These basic properties of the seismic method are illustrated with some of the results from the NPE.

  10. Landslide seismic magnitude

    NASA Astrophysics Data System (ADS)

    Lin, C. H.; Jan, J. C.; Pu, H. C.; Tu, Y.; Chen, C. C.; Wu, Y. M.

    2015-11-01

    Landslides have become one of the deadliest natural disasters on Earth, due not only to a significant increase in extreme climate events driven by global warming, but also to rapid economic development in areas of high topographic relief. How to detect landslides with a real-time system has become an important question for reducing their possible impacts on human society. However, traditional detection of landslides, either through direct surveys in the field or through remote sensing images obtained via aircraft or satellites, is highly time consuming. Here we analyze very long period seismic signals (20-50 s) generated by large landslides, such as those triggered by Typhoon Morakot, which passed through Taiwan in August 2009. In addition to successfully locating 109 large landslides, we define a landslide seismic magnitude based on an empirical formula: Lm = log10(A) + 0.55 log10(Δ) + 2.44, where A is the maximum displacement (μm) recorded at one seismic station and Δ is its distance (km) from the landslide. We conclude that both the location and the seismic magnitude of large landslides can be rapidly estimated from broadband seismic networks for both academic and applied purposes, similar to earthquake monitoring. We suggest a real-time algorithm be set up for routine monitoring of landslides in places where they pose a frequent threat.
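
The empirical relation above is straightforward to apply per station and to average across a network; a minimal sketch (the station readings below are illustrative values, not data from the paper):

```python
import math

def landslide_magnitude(max_disp_um: float, distance_km: float) -> float:
    """Landslide seismic magnitude from the paper's empirical formula:
    Lm = log10(A) + 0.55*log10(D) + 2.44, where A is the maximum
    displacement (micrometers) recorded at one station and D is that
    station's distance (km) from the landslide."""
    return math.log10(max_disp_um) + 0.55 * math.log10(distance_km) + 2.44

# Averaging single-station estimates over a small network
# (readings are invented for illustration):
readings = [(120.0, 80.0), (45.0, 150.0), (200.0, 60.0)]  # (A in um, D in km)
estimates = [landslide_magnitude(a, d) for a, d in readings]
Lm = sum(estimates) / len(estimates)
```

In practice each broadband station yields its own estimate, and averaging reduces the effect of site and radiation-pattern variability.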

  11. Adjustment of minimum seismic shear coefficient considering site effects for long-period structures

    NASA Astrophysics Data System (ADS)

    Guan, Minsheng; Du, Hongbiao; Cui, Jie; Zeng, Qingli; Jiang, Haibo

    2016-06-01

    Minimum seismic base shear is a key factor in the seismic design of long-period structures and is specified in several major national seismic building codes, viz. ASCE7-10, NZS1170.5 and GB50011-2010. The current Chinese seismic design code GB50011-2010, however, does not consider the effect of soil type on the minimum seismic shear coefficient, which makes it difficult for long-period structures sited on firm soil or rock to meet the minimum base shear requirement. This paper aims to modify the current minimum seismic shear coefficient by taking site effects into account. For this purpose, effective peak acceleration (EPA) is used to represent the ordinate value of the design response spectrum at the plateau. A large set of earthquake records, for which EPAs are calculated, is examined through statistical analysis considering soil conditions as well as seismic fortification intensities. The study indicates that soil type has a significant effect on the spectral ordinates at the plateau as well as on the minimum seismic shear coefficient. Modification factors for the current minimum seismic shear coefficient are preliminarily suggested for each site class. It is shown that the modified seismic shear coefficients are more effective for determining the minimum seismic base shear of long-period structures.
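
As a rough illustration of how an EPA value can be extracted from a response spectrum, here is a sketch using the common ATC-3-style definition (mean 5%-damped spectral acceleration over the 0.1-0.5 s band, divided by 2.5); the paper may use a different variant, and the spectrum below is synthetic:

```python
import numpy as np

def effective_peak_acceleration(periods, sa):
    """EPA per the ATC-3-style definition (an assumption here, not
    necessarily this paper's exact variant): the mean spectral
    acceleration over the 0.1-0.5 s plateau band divided by 2.5."""
    periods = np.asarray(periods)
    sa = np.asarray(sa)
    band = (periods >= 0.1) & (periods <= 0.5)
    return sa[band].mean() / 2.5

# Synthetic spectrum with a flat plateau of 1.0 g between 0.1 and 0.5 s,
# so EPA should come out to 1.0 / 2.5 = 0.4 g.
T = np.linspace(0.05, 2.0, 40)
Sa = np.where((T >= 0.1) & (T <= 0.5), 1.0, 0.4)
epa = effective_peak_acceleration(T, Sa)
```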

  12. Probabilistic seismic demand analysis of nonlinear structures

    NASA Astrophysics Data System (ADS)

    Shome, Nilesh

    Recent earthquakes in California have initiated improvements in current design philosophy, and at present the civil engineering community is working towards performance-based earthquake engineering of structures. The objective of this study is to develop efficient but accurate procedures for probabilistic analysis of the nonlinear seismic behavior of structures. The proposed procedures support the near-term development of seismic building assessments, which require an estimate of seismic demand at a given intensity level. We also develop procedures to estimate the probability of exceeding any specified nonlinear response level due to future ground motions at a specific site. This is referred to as Probabilistic Seismic Demand Analysis (PSDA). The latter procedure prepares the way for the next stage of seismic assessment, which considers the uncertainties in nonlinear response and capacity. The proposed procedures require structure-specific nonlinear analyses for a relatively small set of recorded accelerograms and (site-specific or USGS-map-like) seismic hazard analyses. We address some of the important issues of nonlinear seismic demand analysis: the selection of records for structural analysis, the number of records to be used, the scaling of records, etc. Initially these issues are studied through nonlinear analysis of structures for a number of magnitude-distance bins of records. Subsequently we introduce regression analysis of response results against spectral acceleration, magnitude, duration, etc., which helps to resolve these issues more systematically. We illustrate the demand-hazard calculations through two major example problems: a 5-story and a 20-story SMRF building. Several simple but quite accurate closed-form solutions are also proposed to expedite the demand-hazard calculations. We find that vector-valued (e.g., 2-D) PSDA estimates demand hazard more accurately.
This procedure, however, requires information about 2
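
The regression of nonlinear response against spectral acceleration mentioned above is commonly fit as a power law, ln D = ln a + b ln Sa; a minimal sketch on synthetic data (the coefficients and dispersion below are illustrative, not results from this study):

```python
import numpy as np

# Synthetic nonlinear-response results: peak drift ratio vs spectral
# acceleration, with lognormal record-to-record scatter (illustrative).
rng = np.random.default_rng(0)
sa = rng.uniform(0.1, 1.5, 40)                            # Sa (g)
drift = 0.02 * sa**1.1 * np.exp(rng.normal(0.0, 0.3, 40))  # peak drift ratio

# Fit the power-law demand model ln(D) = ln(a) + b*ln(Sa) by least squares.
b, ln_a = np.polyfit(np.log(sa), np.log(drift), 1)
a = np.exp(ln_a)

# Residual standard deviation estimates record-to-record dispersion,
# which feeds directly into the demand-hazard integral.
sigma = np.std(np.log(drift) - (ln_a + b * np.log(sa)))
```

The fitted (a, b, sigma) triple is what closed-form demand-hazard solutions of this kind typically consume.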

  13. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic-hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., a maximum site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
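
The truncated exponential Gutenberg-Richter model and the Poisson occurrence assumption described above can be sketched as follows (the a, b and Mmax values are illustrative placeholders, not Adjaria-specific results):

```python
import math

def gr_rate(m, a=4.0, b=1.0, m_min=4.0, m_max=7.5):
    """Annual rate of events with magnitude >= m under a truncated
    exponential Gutenberg-Richter distribution. The a, b, m_min and
    m_max values here are illustrative, not fitted to any catalog."""
    if m >= m_max:
        return 0.0
    beta = b * math.log(10.0)
    rate_min = 10.0 ** (a - b * m_min)  # annual rate of M >= m_min
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return rate_min * num / den

def poisson_exceedance(annual_rate, years):
    """Probability of at least one occurrence in `years`, assuming the
    Poisson temporal model used in classical PSHA."""
    return 1.0 - math.exp(-annual_rate * years)
```

For example, a ground motion with a 475-year return period has roughly a 10% chance of being exceeded in a 50-year design life under this model.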

  14. The Italian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto

    2016-04-01

    The Italian National Seismic Network comprises about 400 stations, mainly broadband, installed in Italy and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center), CNT, has installed and operates most of these stations, although a considerable number of stations contributing to the INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite links, wireless connections and wired lines, and the SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near real time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real time, hypocenter parameters which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution and specialised products.

  15. Stress-Release Seismic Source for Seismic Velocity Measurement in Mines

    NASA Astrophysics Data System (ADS)

    Swanson, P. L.; Clark, C.; Richardson, J.; Martin, L.; Zahl, E.; Etter, A.

    2014-12-01

    Accurate seismic event locations are needed to delineate roles of mine geometry, stress and geologic structures in developing rockburst conditions. Accurate absolute locations are challenging in mine environments with rapid changes in seismic velocity due to sharp contrasts between individual layers and large time-dependent velocity gradients attending excavations. Periodic use of controlled seismic sources can help constrain the velocity in this continually evolving propagation medium comprising the miners' workplace. With a view to constructing realistic velocity models in environments in which use of explosives is problematic, a seismic source was developed subject to the following design constraints: (i) suitable for use in highly disturbed zones surrounding mine openings, (ii) able to produce usable signals over km-scale distances in the frequency range of typical coal mine seismic events (~10-100 Hz), (iii) repeatable, (iv) portable, (v) non-disruptive to mining operations, and (vi) safe for use in potentially explosive gaseous environments. Designs of the compressed load column seismic source (CLCSS), which generates a stress, or load, drop normal to the surface of mine openings, and the fiber-optic based source-initiation timer are presented. Tests were conducted in a coal mine at a depth of 500 m (1700 ft) and signals were recorded on the surface with a 72-ch (14 Hz) exploration seismograph for load drops of 150-470 kN (16-48 tons). Signal-to-noise ratios of unfiltered signals ranged from ~200 immediately above the source (500 m (1700 ft)) to ~8 at the farthest extent of the array (slant distance of ~800 m (2600 ft)), suggesting the potential for use over longer range. Results are compared with signals produced by weight drop and sledge hammer sources, indicating the superior waveform quality for first-arrival measurements with the CLCSS seismic source.

  16. First Quarter Hanford Seismic Report for Fiscal Year 2011

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Clayton, Ray E.; Devary, Joseph L.

    2011-03-31

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 16 local earthquakes during the first quarter of FY 2011. Six earthquakes were located at shallow depths (less than 4 km), seven at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and three at depths greater than 9 km, within the basement. Geographically, thirteen earthquakes were located in known swarm areas and three were classified as random events. The highest-magnitude event (1.8 Mc) was recorded on October 19, 2010 at a depth of 17.5 km, with its epicenter located near the Yakima River between the Rattlesnake Mountain and Horse Heaven Hills swarm areas.
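
The depth classification used in these quarterly reports (shallow, less than 4 km; intermediate, 4-9 km; deep, greater than 9 km) can be sketched as a simple binning function; the layer names follow the reports' stratigraphic interpretation, and the depth list below is illustrative:

```python
def classify_depth(depth_km: float) -> str:
    """Bin hypocentral depth into the stratigraphic units used in the
    Hanford quarterly reports: Columbia River basalt (shallow),
    pre-basalt sediments (intermediate), crystalline basement (deep)."""
    if depth_km < 4.0:
        return "Columbia River basalt"
    elif depth_km <= 9.0:
        return "pre-basalt sediments"
    else:
        return "crystalline basement"

# Tally events per layer for a list of depths (invented values):
counts: dict[str, int] = {}
for d in [1.2, 3.9, 5.5, 8.0, 12.3, 17.5]:   # depths in km
    layer = classify_depth(d)
    counts[layer] = counts.get(layer, 0) + 1
```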

  17. Development of a wireless seismic array for volcano monitoring

    NASA Astrophysics Data System (ADS)

    Moure, David; Toma, Daniel; Lázaro, Antoni Manuel; Del Río, Joaquín; Carreras, Normandino; José Blanco, María

    2014-05-01

    Volcano monitoring is mainly based on three disciplines: seismology, geodesy and geochemistry. Seismic arrays are used to locate the seismic source through analysis of the signals recorded by each seismometer. The most important advantages of arrays over classical seismic networks are painless deployment, no need for major infrastructure, and the ability to provide an approximate location for signals that a classical network cannot locate. In this paper the design of a low-power wireless array is presented. All sensors transmit acquired data to a central node, which is capable of calculating the possible location of the seismic source in real time. The reliability of those locations depends, among other parameters (number of sensors and geometrical distribution), on the precision of time synchronization between the nodes. To achieve the necessary precision, the wireless seismic array implements a time synchronization protocol based on IEEE 1588, which ensures clock synchronization between nodes to better than a microsecond, so that the signals from all the sensors can be reliably correlated. The ultimate goal is for the central node to receive data from all the seismometers, locate the seismic source, and transmit only the result, which dramatically reduces data traffic. Active volcanic areas are often located far from inhabited areas, where data transmission options are limited; in situ calculation is therefore crucial to reduce the volume of data transmitted by the seismic array.
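
The IEEE 1588 synchronization mentioned above rests on a two-way timestamp exchange; a minimal sketch of the standard offset/delay computation, assuming a symmetric network path (the timestamps below are invented for illustration):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Two-way time transfer as in IEEE 1588 (PTP).
    t1: master sends Sync; t2: slave receives it (slave clock);
    t3: slave sends Delay_Req (slave clock); t4: master receives it.
    Assumes a symmetric path; returns (slave clock offset, one-way delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Example: slave clock runs 250 us ahead, true one-way delay is 40 us.
t1 = 0.0
t2 = t1 + 40e-6 + 250e-6      # arrival time as read on the slave clock
t3 = t2 + 1e-3                # slave replies 1 ms later (slave clock)
t4 = (t3 - 250e-6) + 40e-6    # arrival time as read on the master clock
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
# offset recovers ~250e-6 s, delay recovers ~40e-6 s
```

The symmetric-path assumption is why the recovered offset is exact here; in the field, path asymmetry is the dominant residual error.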

  18. Synthesis of artificial spectrum-compatible seismic accelerograms

    NASA Astrophysics Data System (ADS)

    Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

    2014-08-01

    The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis, which requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, for many regions few actual records of different seismic intensities are available. In addition, a large number of seismic accelerograms is required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which are useful in earthquake engineering for the dynamic analysis and design of buildings. In the proposed method, a single natural earthquake record is decomposed into amplitude and frequency components using the Hilbert-Huang transform. The method is illustrated by studying 20 natural seismic records with different characteristics, such as frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established methods in the literature.
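
The Hilbert step of the transform, which extracts the amplitude and frequency components mentioned above, can be sketched with SciPy on a synthetic signal (EMD sifting, the other half of the Hilbert-Huang transform, is omitted here):

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic "record": a 5 Hz carrier under a Gaussian envelope peaking at t=2 s.
fs = 200.0                          # sampling rate (Hz)
t = np.arange(0.0, 4.0, 1.0 / fs)
x = np.exp(-((t - 2.0) ** 2)) * np.sin(2 * np.pi * 5.0 * t)

analytic = hilbert(x)                           # analytic signal x + i*H[x]
amplitude = np.abs(analytic)                    # instantaneous amplitude (envelope)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency (Hz)
```

For this narrowband signal the recovered envelope peaks at t = 2 s and the instantaneous frequency hovers near the 5 Hz carrier; in the full HHT, each empirical mode function is processed this way.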

  19. Magnitude correlations in global seismicity

    SciTech Connect

    Sarlis, N. V.

    2011-08-15

    By employing natural time analysis, we analyze the worldwide seismicity and study the existence of correlations between earthquake magnitudes. We find that global seismicity exhibits nontrivial magnitude correlations for earthquake magnitudes greater than Mw 6.5.

  20. Seismic ruggedness of relays

    SciTech Connect

    Merz, K.L.

    1991-08-01

    This report complements EPRI report NP-5223 Revision 1, February 1991, and presents additional information and analyses concerning generic seismic ruggedness of power plant relays. Existing and new test data have been used to construct Generic Equipment Ruggedness Spectra (GERS) which can be used in identifying rugged relays during seismic re-evaluation of nuclear power plants. This document is an EPRI tier 1 report. The results of relay fragility tests for both old and new relays are included in an EPRI tier 2 report with the same title. In addition to the presentation of relay GERS, the tier 2 report addresses the applicability of GERS to relays of older vintage, discusses the important identifying nomenclature for each relay type, and examines relay adjustment effects on seismic ruggedness. 9 refs., 3 figs, 1 tab.

  1. Downhole seismic array system

    SciTech Connect

    Petermann, S.G.

    1992-03-03

    This patent describes an apparatus for receiving seismic signals from an earth formation at one or more points in a wellbore penetrating the formation. It comprises: a sonde including extensible and retractable support means for supporting seismic signal receiver means, hydraulic actuator means for extending and retracting the support means, body means for supporting the actuator means and the support means, and signal transmitting means for transmitting electrical signals related to seismic signals received by the receiver means; tubing means connected to the sonde for deploying the sonde in the wellbore, the tubing means including electrical conductor means disposed therein for conducting electrical signals between means on the surface of the formation and the sonde, and the tubing means comprising means for conducting hydraulic fluid to the sonde for operation of the actuator means; and means for supplying hydraulic fluid from the surface of the formation through the tubing means to the sonde for operating the actuator means.

  2. Induced seismicity. Final report

    SciTech Connect

    Segall, P.

    1997-09-18

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Recently completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating in the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements, along with seismic data, are used to test and constrain thermo-mechanical models.

  3. Canadian Seismic Agreement

    SciTech Connect

    Wetmiller, R.J.; Lyons, J.A.; Shannon, W.E.; Munro, P.S.; Thomas, J.T.; Andrew, M.D.; Lapointe, S.P.; Lamontagne, M.; Wong, C.; Anglin, F.M.; Adams, J.; Cajka, M.G.; McNeil, W.; Drysdale, J.A.

    1992-05-01

    This is a progress report on work carried out under the terms of a research agreement entitled the "Canadian Seismic Agreement" between the US Nuclear Regulatory Commission (USNRC), the Canadian Commercial Corporation and the Geophysics Division of the Geological Survey of Canada (GD/GSC) during the period from July 01, 1989 to June 30, 1990. The "Canadian Seismic Agreement" generally supports the operation of various seismograph stations in eastern Canada and the collection and analysis of earthquake data for the purpose of mitigating seismic hazards in eastern Canada and the northeastern US. The specific activities carried out in this one-year period are summarized under four headings: Eastern Canada Telemetered Network and local network developments, Datalab developments, strong-motion network developments, and earthquake activity. During this period the first surface fault unequivocally determined to have accompanied a historic earthquake in eastern North America occurred in northern Quebec.

  4. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrell, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2015-09-29

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  5. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrel, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2014-08-19

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  6. Quiet Clean Short-haul Experimental Engine (QCSEE) Under-The-Wing (UTW) composite nacelle subsystem test report. [to verify strength of selected composite materials

    NASA Technical Reports Server (NTRS)

    Stotler, C. L., Jr.; Johnston, E. A.; Freeman, D. S.

    1977-01-01

    The element and subcomponent testing conducted to verify the under-the-wing composite nacelle design is reported. This composite nacelle consists of an inlet, outer cowl doors, inner cowl doors, and a variable fan nozzle. The element tests provided the mechanical properties used in the nacelle design; the subcomponent tests verified that the critical panel and joint areas of the nacelle had adequate structural integrity.

  7. Induced Seismicity Monitoring System

    NASA Astrophysics Data System (ADS)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these involve monitoring underground gas migration through detailed tomographic studies of rock properties, the integrity of the cap rock, and microseismicity over time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly with stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity down to magnitude levels only slightly below what can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions, and features 9 channels of recording (currently a 3C 4.5 Hz geophone, a MEMS accelerometer, and a microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell-phone modem or picked up during site visits. Low power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery. The system has been successfully tested for long-term (> 6 months) remote operations over a wide range
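The "event-detected" recording mode mentioned above is commonly implemented with a short-term/long-term average (STA/LTA) trigger. The abstract does not name the detection algorithm, so the following is a generic sketch with made-up window lengths and threshold, not the system's actual firmware logic:

```python
def sta_lta(signal, sta_len=10, lta_len=100):
    """Ratio of short-term to long-term average amplitude, a classic
    seismic event-detection statistic: the ratio spikes when a transient
    arrives on top of steady background noise."""
    ratios = []
    for i in range(lta_len, len(signal) + 1):
        sta = sum(abs(x) for x in signal[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in signal[i - lta_len:i]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: background noise followed by an event-like burst
trace = [0.1] * 200 + [1.0] * 50
ratios = sta_lta(trace)
triggered = any(r > 3.0 for r in ratios)   # simple threshold trigger
```

During pure background the ratio stays near 1; when the burst enters the short window the ratio exceeds the threshold and the recorder would switch to event mode.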

  8. Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8

    SciTech Connect

    Ivanka, Paskaleva; Mihaela, Kouteva; Franco, Vaccari; Panza, Giuliano F.

    2008-07-08

    The earthquake record and the Code for design and construction in seismic regions in Bulgaria show that the territory of the Republic of Bulgaria is exposed to high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong-motion database is quite limited, and therefore not representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure to two main Bulgarian cities has supplied a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking into account, simultaneously and consistently, the contributions to the earthquake ground motion at the target sites of both the seismic source and the seismic wave propagation in the crossed media. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

  9. Monitoring and verifying changes of organic carbon in soil

    USGS Publications Warehouse

    Post, W.M.; Izaurralde, R. C.; Mann, L. K.; Bliss, Norman B.

    2001-01-01

    Changes in soil and vegetation management can impact strongly on the rates of carbon (C) accumulation and loss in soil, even over short periods of time. Detecting the effects of such changes in accumulation and loss rates on the amount of C stored in soil presents many challenges. Consideration of the temporal and spatial heterogeneity of soil properties, general environmental conditions, and management history is essential when designing methods for monitoring and projecting changes in soil C stocks. Several approaches and tools will be required to develop reliable estimates of changes in soil C at scales ranging from the individual experimental plot to whole regional and national inventories. In this paper we present an overview of soil properties and processes that must be considered. We classify the methods for determining soil C changes as direct or indirect. Direct methods include field and laboratory measurements of total C, various physical and chemical fractions, and C isotopes. A promising direct method is eddy covariance measurement of CO2 fluxes. Indirect methods include simple and stratified accounting, use of environmental and topographic relationships, and modeling approaches. We present a conceptual plan for monitoring soil C changes at regional scales that can be readily implemented. Finally, we anticipate significant improvements in soil C monitoring with the advent of instruments capable of direct and precise measurements in the field as well as methods for interpreting and extrapolating spatial and temporal information.

  10. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, B.T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  11. Generic seismic ruggedness of power plant equipment

    SciTech Connect

    Merz, K.L. )

    1991-08-01

    This report updates the results of a program whose overall objective is to demonstrate the generic seismic adequacy of as much nuclear power plant equipment as possible by collecting and evaluating existing seismic qualification test data. These data are used to construct "ruggedness" spectra below which equipment in operating plants designed to earlier earthquake criteria would be generically adequate. This document is an EPRI Tier 1 Report. It gives the methodology for the collection and evaluation of the data used to construct a Generic Equipment Ruggedness Spectrum (GERS) for each equipment class considered; this update resulted in fifteen finalized GERS. The GERS for each equipment class are included in an EPRI Tier 2 Report with the same title. Associated with each GERS are inclusion rules, cautions, and checklists for field screening of in-place equipment for GERS applicability. A GERS provides a measure of equipment seismic resistance based on available test data. As such, a GERS may also be used to judge the seismic adequacy of similar new or replacement equipment, or to estimate the seismic margin of equipment re-evaluated with respect to earthquake levels greater than those considered to date. GERS for relays (included in the original version of this report) are now covered in a separate report (NP-7147). In addition to the presentation of GERS, the Tier 2 report addresses the applicability of GERS to equipment of older vintage, methods for estimating amplification factors for evaluating devices installed in cabinets and enclosures, and how seismic test data from related studies relate to the GERS approach. 28 refs., 5 figs., 4 tabs.

  12. Comparing USGS national seismic hazard maps with internet-based macroseismic intensity observations

    NASA Astrophysics Data System (ADS)

    Mak, Sum; Schorlemmer, Danijel

    2016-04-01

    Verifying a nationwide seismic hazard assessment using data collected after the assessment was made (i.e., prospective data) is a direct consistency check of the assessment. We directly compared the rate of ground-motion exceedance predicted by the four available versions of the USGS national seismic hazard map (NSHMP, 1996, 2002, 2008, 2014) with the rate actually observed during 2000-2013. The data were prospective with respect to the two earlier versions of the NSHMP. We used two sets of largely independent data: 1) the USGS "Did You Feel It?" (DYFI) intensity reports, and 2) instrumental ground-motion records extracted from ShakeMap stations. Although both are observed data, they come with different degrees of accuracy. Our results indicated that for California, the predicted and observed hazards were very comparable. The two sets of data gave consistent results, implying robustness. The consistency also encourages the use of DYFI data for hazard verification in the Central and Eastern US (CEUS), where instrumental records are lacking. The results showed that the observed ground-motion exceedance was also consistent with the prediction in the CEUS. The primary value of this study is to demonstrate the usefulness of DYFI data, originally designed for community communication rather than scientific analysis, for the purpose of hazard verification.
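A minimal version of such a predicted-versus-observed consistency check, assuming exceedances at (approximately independent) sites can be modeled as a Poisson count with the map-predicted rate. All numbers below are made up for illustration; the paper's actual test statistic is not specified here:

```python
from math import exp

def poisson_tail(k_obs, mu):
    """P(X >= k_obs) for X ~ Poisson(mu): one minus the CDF up to k_obs - 1."""
    term = exp(-mu)          # P(X = 0)
    cdf = 0.0
    for k in range(k_obs):
        cdf += term
        term *= mu / (k + 1)  # recurrence P(X = k+1) = P(X = k) * mu / (k+1)
    return 1.0 - cdf

# Hypothetical setup: the map predicts an annual exceedance rate of 0.002
# per site; 40 sites are observed for 14 years; 3 exceedances are observed.
mu = 0.002 * 40 * 14          # expected exceedance count across sites and years
observed = 3
p_value = poisson_tail(observed, mu)
consistent = p_value > 0.05   # fail to reject the map's predicted rate
```

Here the observed count is somewhat above the expectation of about 1.1 but not significantly so, so the hypothetical map would pass the check.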

  13. Separation of seismic blended data by sparse inversion over dictionary learning

    NASA Astrophysics Data System (ADS)

    Zhou, Yanhui; Chen, Wenchao; Gao, Jinghuai

    2014-07-01

    Recent development of blended acquisition calls for new procedures to process blended seismic measurements. Presently, deblending and reconstructing unblended data, followed by conventional processing, is the most practical processing workflow. In this paper we study seismic deblending by advanced sparse inversion with a learned dictionary. To make our method more effective, hybrid acquisition and time-dithering sequential shooting are introduced so that clean single-shot records can be used to train the dictionary, favoring a sparser representation of the data to be recovered. Deblending and dictionary learning with l1-norm based sparsity are combined to construct the corresponding problem with respect to unknown recovery, dictionary, and coefficient sets. A two-step optimization approach is introduced. In the dictionary-learning step, the clean single-shot data are selected as training data to learn the dictionary. For deblending, we fix the dictionary and employ an alternating scheme to update the recovery and the coefficients separately. Synthetic and real field data were used to verify the performance of our method. The outcome can be a significant reference in designing high-efficiency, low-cost blended acquisition.
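With the dictionary held fixed, the coefficient-update step reduces to an l1-regularized least-squares problem. A generic sketch using iterative soft-thresholding (ISTA) on a random toy dictionary follows; the paper's actual learned dictionary, solver, and parameters are not specified here, so everything below is illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, y, lam=0.01, n_iter=500):
    """Minimize 0.5*||y - D x||^2 + lam*||x||_1 by proximal gradient steps."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)           # gradient of the data-misfit term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy problem: a signal built from 2 atoms of a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 32))
D /= np.linalg.norm(D, axis=0)             # normalize atoms
x_true = np.zeros(32)
x_true[3] = 1.5
x_true[17] = -2.0
y = D @ x_true
x_hat = ista(D, y)
```

The recovered coefficients concentrate on the two true atoms, illustrating why a dictionary that sparsifies the single-shot data helps separate blended records.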

  14. Compliant liquid column damper modified by shape memory alloy device for seismic vibration control

    NASA Astrophysics Data System (ADS)

    Gur, Sourav; Mishra, Sudib Kumar; Bhowmick, Sutanu; Chakraborty, Subrata

    2014-10-01

    Liquid column dampers (LCDs) have long been used for the seismic vibration control of flexible structures. In contrast, tuning LCDs to short-period structures poses difficulty. Various modifications have been proposed on the original LCD configuration for improving its performance in relatively stiff structures. One such system, referred to as a compliant-LCD has been proposed recently by connecting the LCD to the structure with a spring. In this study, an improvement is attempted in compliant LCDs by replacing the linear spring with a spring made of shape memory alloy (SMA). Considering the dissipative, super-elastic, force-deformation hysteresis of SMA triggered by stress-induced micro-structural phase transition, the performance is expected to improve further. The optimum parameters for the SMA-compliant LCD are obtained through design optimization, which is based on a nonlinear random vibration response analysis via stochastic linearization of the force-deformation hysteresis of SMA and dissipation by liquid motion through an orifice. Substantially enhanced performance of the SMA-LCD over a conventional compliant LCD is demonstrated, the consistency of which is further verified under recorded ground motions. The robustness of the improved performance is also validated by parametric study concerning the anticipated variations in system parameters as well as variability in seismic loading.

  15. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes. Obtaining synthetic waveforms through numerical simulation receives an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to use the numerical packages and to correctly access and utilize the computational resources is a troublesome task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well designed SQL database serves as the data layer, while HPC and a dedicated pipeline for it form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing users professional access to the computational code through its interfaces and delivering our computational resources to the users over the cloud, users can customize the simulation at expert level, then submit and run the job through the platform.

  16. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
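The upper bound described above follows from the match probability, which under standard Hardy-Weinberg assumptions is a product of genotype probabilities across loci. A toy illustration with made-up allele frequencies (not case data, and ignoring the subpopulation corrections a real forensic calculation would apply):

```python
def genotype_prob(p, q=None):
    """Hardy-Weinberg genotype probability: p**2 for a homozygote,
    2*p*q for a heterozygote."""
    return p * p if q is None else 2.0 * p * q

# Hypothetical allele frequencies at three loci
# (heterozygote, homozygote, heterozygote)
loci = [(0.12, 0.08), (0.20, None), (0.05, 0.30)]

match_prob = 1.0
for p, q in loci:
    match_prob *= genotype_prob(p, q)

# The LR in favour of the prosecution hypothesis cannot exceed this bound;
# with enough replicates the low-template LR should approach it.
lr_upper_bound = 1.0 / match_prob
```

For a real profile with many loci the match probability is far smaller, so the bound, and hence the attainable weight of evidence, is correspondingly larger.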

  17. Verifying likelihoods for low template DNA profiles using multiple replicates.

    PubMed

    Steele, Christopher D; Greenhalgh, Matthew; Balding, David J

    2014-11-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140

  18. MERCURY vs. TART Comparisons to Verify Thermal Scattering

    SciTech Connect

    Cullen, D E; McKinley, S; Hagmann, C

    2006-03-30

    Recently the results from many Monte Carlo codes were compared for a series of theoretical pin-cells; the results are documented in ref. [3]; details are also provided here in Appendix A and B. The purpose of this earlier code comparison was primarily to determine how accurately our codes model both bound and free atom neutron thermal scattering. Prior to this study many people assumed that our Monte Carlo transport codes were all now so accurate that they would all produce more or less the same answers, say, for example, K-eff to within 0.1%. The results demonstrated that in reality we see a rather large spread in the results for even simple scalar parameters, such as K-eff, where we found differences in excess of 2%, far exceeding many people's expectations. The differences between code results were traced to four major factors: (1) differences between the sets of nuclear data used; (2) the accuracy of the nuclear data processing codes; (3) the accuracy of the models used in our Monte Carlo transport codes; and (4) code-user-selected input options. Naturally at Livermore we would like to ensure that we minimize the effects of these factors. In this report we compare the results using two of our Monte Carlo transport codes, MERCURY [2] and TART [2], with the following constraints designed to address the four points listed above: (1) Both codes used exactly the same nuclear data, namely the TART 2005 data. (2) Each code used its own nuclear data processing code. Even though these two data processing codes are independent, they have been extensively tested to ensure the processed output results closely agree. (3) Both used the same nuclear physics models. This required that some physics be turned off in each code, namely, (a) unresolved resonance energy region self-shielding was turned off in TART, since this is not currently available in MERCURY; (b) delayed neutrons were treated as prompt in TART, since this is not currently available in MERCURY; (c) Classical, rather than

  19. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic Initiating Event (IE) analysis is one of the most important tasks controlling the level of effort and quality of a whole Seismic Probabilistic Risk Assessment (SPRA). The typical problems are related to the following aspects: how the internal PRA model and its complexity can be used, how to control the number of PRA components for which fragility evaluation should be performed, and finally how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions are highly dependent on the possibility of improving the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  20. Seismic while drilling: Operational experiences in Viet Nam

    SciTech Connect

    Jackson, M.; Einchcomb, C.

    1997-03-01

    The BP/Statoil alliance in Viet Nam has used seismic while drilling on four wells during the last two years. Three wells employed the Western Atlas Tomex system, and the last well, Schlumberger's SWD system. The perceived value of seismic while drilling (SWD) lies in being able to supply real-time data linking drill-bit position to a seismic picture of the well. However, once confidence in the equipment and methodology is attained, SWD can influence well design and the planning associated with drilling wells. More important, SWD can remove uncertainty when actually drilling wells, allowing risk assessment to be carried out more accurately and confidently.

  1. The 2012 Ferrara seismic sequence: Regional crustal structure, earthquake sources, and seismic hazard

    NASA Astrophysics Data System (ADS)

    Malagnini, Luca; Herrmann, Robert B.; Munafò, Irene; Buttinelli, Mauro; Anselmi, Mario; Akinci, Aybige; Boschi, E.

    2012-10-01

    Inadequate seismic design codes can be dangerous, particularly when they underestimate the true hazard. In this study we use data from a sequence of moderate-sized earthquakes in northeast Italy to validate and test a regional wave propagation model which, in turn, is used to understand some weaknesses of the current design spectra. Our velocity model, while regionalized and somewhat ad hoc, is consistent with geophysical observations and the local geology. In the 0.02-0.1 Hz band, this model is validated by using it to calculate moment tensor solutions of 20 earthquakes (5.6 ≥ MW ≥ 3.2) in the 2012 Ferrara, Italy, seismic sequence. The seismic spectra observed for the relatively small main shock significantly exceeded the design spectra to be used in the area for critical structures. Observations and synthetics reveal that the ground motions are dominated by long-duration surface waves, which, apparently, the design codes do not adequately anticipate. In light of our results, the present seismic hazard assessment in the entire Pianura Padana, including the city of Milan, needs to be re-evaluated.
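Checking whether observed motions exceed a code design spectrum amounts to an ordinate-by-ordinate comparison across periods. The values below are illustrative placeholders, not the Ferrara data or the Italian code spectrum:

```python
# Hypothetical 5%-damped pseudo-acceleration ordinates, in units of g
periods  = [0.1, 0.2, 0.5, 1.0, 2.0]           # spectral periods, s
design   = [0.30, 0.45, 0.40, 0.25, 0.12]      # design spectrum ordinates
observed = [0.25, 0.50, 0.55, 0.35, 0.10]      # observed spectrum ordinates

# Periods at which the observed spectrum exceeds the design spectrum;
# exceedances at longer periods would point to surface-wave energy that
# the design code did not anticipate.
exceeded = [T for T, d, o in zip(periods, design, observed) if o > d]
```

In this made-up example the observed spectrum exceeds the design spectrum at intermediate and long periods while staying below it at the shortest period.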

  2. 48 CFR 4.1803 - Verifying CAGE codes prior to award.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Verifying CAGE codes prior... GENERAL ADMINISTRATIVE MATTERS Commercial and Government Entity Code 4.1803 Verifying CAGE codes prior to award. (a) Contracting officers shall verify the offeror's CAGE code by reviewing the...

  3. 13 CFR 127.403 - What happens if SBA verifies the concern's eligibility?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What happens if SBA verifies the concern's eligibility? 127.403 Section 127.403 Business Credit and Assistance SMALL BUSINESS....403 What happens if SBA verifies the concern's eligibility? If SBA verifies that the concern...

  4. 49 CFR 40.139 - On what basis does the MRO verify test results involving opiates?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false On what basis does the MRO verify test results... Verification Process § 40.139 On what basis does the MRO verify test results involving opiates? As the MRO, you... laboratory confirms the presence of 6-acetylmorphine (6-AM) in the specimen, you must verify the test...

  5. 49 CFR 40.139 - On what basis does the MRO verify test results involving opiates?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false On what basis does the MRO verify test results... Verification Process § 40.139 On what basis does the MRO verify test results involving opiates? As the MRO, you... laboratory confirms the presence of 6-acetylmorphine (6-AM) in the specimen, you must verify the test...

  6. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  7. 75 FR 31288 - Plant-Verified Drop Shipment (PVDS)-Nonpostal Documentation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... 111 Plant-Verified Drop Shipment (PVDS)--Nonpostal Documentation AGENCY: Postal Service TM . ACTION... Service, Domestic Mail Manual (DMM ) 705.15. 2.14 to clarify that PS Form 8125, Plant-Verified Drop...: As a result of reviews of USPS policy concerning practices at induction points of plant-verified...

  8. 75 FR 876 - Agency Information Collection Activities: E-Verify Data Collection Survey, New Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-06

    ... SECURITY U.S. Citizenship and Immigration Services Agency Information Collection Activities: E-Verify Data... Collection Under Review: E-Verify Data ] Collection Survey, Control No. OMB-55. The Department of Homeland... Collection: New information collection. (2) Title of the Form/Collection: E-Verify Data Collection....

  9. Mobile seismic exploration

    NASA Astrophysics Data System (ADS)

    Dräbenstedt, A.; Cao, X.; Polom, U.; Pätzold, F.; Zeller, T.; Hecker, P.; Seyfried, V.; Rembe, C.

    2016-06-01

    Laser Doppler vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions below 1 nm/s/√Hz. Thermal displacements and air turbulence critically influence LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or micro-electro-mechanical sensors (MEMS)). Developing a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV mounted on the optical table impinges on the ground below the car through the hole. A reference geophone detects remaining vibrations on the table. We present the results of the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as a laser geophone.

  10. Nonstructural seismic restraint guidelines

    SciTech Connect

    Butler, D.M.; Czapinski, R.H.; Firneno, M.J.; Feemster, H.C.; Fornaciari, N.R.; Hillaire, R.G.; Kinzel, R.L.; Kirk, D.; McMahon, T.T.

    1993-08-01

    The Nonstructural Seismic Restraint Guidelines provide general information about how to secure or restrain items (such as material, equipment, furniture, and tools) in order to prevent injury and property, environmental, or programmatic damage during or following an earthquake. All SNL sites may experience earthquakes of magnitude 6.0 or higher on the Richter scale. Therefore, these guidelines are written for all SNL sites.

  11. Seismic Inversion Methods

    SciTech Connect

    Jackiewicz, Jason

    2009-09-16

    With the rapid advances in sophisticated solar modeling and the abundance of high-quality solar pulsation data, efficient and robust inversion techniques are crucial for seismic studies. We present some aspects of an efficient Fourier Optimally Localized Averaging (OLA) inversion method with an example applied to time-distance helioseismology.
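In OLA-style inversions one combines the sensitivity kernels of the measurements so that their weighted sum approximates a localized target (averaging) kernel. A schematic unregularized least-squares version with toy Gaussian kernels follows; a real helioseismic OLA inversion would also trade off against noise and include normalization constraints, none of which are shown here:

```python
import numpy as np

z = np.linspace(0.0, 1.0, 201)   # depth grid
dz = z[1] - z[0]

# Toy sensitivity kernels: five broad Gaussians, normalized to unit integral
centers = np.linspace(0.1, 0.9, 5)
K = np.exp(-((z[None, :] - centers[:, None]) / 0.15) ** 2)
K /= (K.sum(axis=1) * dz)[:, None]

# Target: a narrower averaging kernel localized at depth 0.5
target = np.exp(-((z - 0.5) / 0.05) ** 2)
target /= target.sum() * dz

# Choose coefficients c so that sum_i c_i * K_i best matches the target
c, *_ = np.linalg.lstsq(K.T, target, rcond=None)
avg_kernel = c @ K
```

By construction the fitted combination matches the target at least as well as any single kernel; the coefficients `c` are then applied to the corresponding travel-time (or frequency-shift) measurements to form the localized average.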

  12. Geothermal induced seismicity program plan

    SciTech Connect

    Not Available

    1981-03-01

    A plan for a National Geothermal Induced Seismicity Program has been prepared in consultation with a panel of experts from industry, academia, and government. The program calls for baseline seismic monitoring in regions of known future geothermal development, continued seismic monitoring and characterization of earthquakes in zones of geothermal fluid production and injection, modeling of the earthquake-inducing mechanism, and in situ measurement of stresses in the geothermal development. The Geothermal Induced Seismicity Program (GISP) will have as its objectives the evaluation of the seismic hazard, if any, associated with geothermal resource exploitation and the devising of a technology which, when properly utilized, will control or mitigate such hazards.

  13. A Case Study of Verifying and Validating an Astrophysical Simulation Code

    NASA Astrophysics Data System (ADS)

    Calder, A. C.; Taylor, N. T.; Antypas, K.; Sheeler, D.; Dubey, A.

    2006-12-01

    We describe the process of verifying and validating FLASH, a parallel, multi-physics simulation code intended to model astrophysical environments. Verification tests are designed to test and quantify the accuracy of the code. Validation tests are meant to ensure that simulations meaningfully describe nature by comparing the results of simulations to relevant laboratory experiments. The centerpiece of the verification process is the re-engineered FlashTest toolkit, which is used both as a stand-alone testing application and as a manager for a nightly test-suite. FlashTest exercises the unit test framework now available in FLASH3, the most recently released version, as well as a variety of standard verification tests. We also present a validation example in which simulations were directly compared to a laboratory experiment. We discuss our findings and evaluate the agreement between simulations and experiment.
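Verification tests of this kind typically quantify accuracy via the observed order of convergence under grid refinement. The Richardson-style estimate below is a generic illustration of that idea, not FlashTest's actual machinery:

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed convergence order from errors at two resolutions:
    err ~ C * h**p  implies  p = log(err_coarse/err_fine) / log(refinement)."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

# E.g. halving the grid spacing cuts the error from 4e-2 to 1e-2,
# consistent with a second-order scheme (p close to 2)
p = observed_order(4e-2, 1e-2)
```

A verification suite would compare `p` against the scheme's formal order and flag regressions when the observed order degrades.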

  14. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

    This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable parameter estimation of the system's parametric uncertainty. The adaptation is based on both the tracking error and the predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time-invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time delay margin using a matrix measure method. Using this analytical method, the free design parameter n of the optimal control modification adaptive law can be determined to meet a specification of stability margin for verification purposes.
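    The paper's optimal control modification is not reproduced here, but the underlying model-reference adaptive control idea can be sketched for a first-order plant with one unknown parameter. All values below are illustrative choices, and the update laws are the standard Lyapunov-based ones rather than the paper's modified law:

```python
# First-order plant x' = a*x + u with unknown a (true value assumed 1.0 here),
# driven to follow the stable reference model xm' = am*xm + bm*r.
a_true, am, bm, r = 1.0, -2.0, 2.0, 1.0
gamma = 2.0                # adaptation gain, an illustrative choice
dt, steps = 1e-3, 50000    # Euler integration over 50 s

x = xm = kx = kr = 0.0     # plant state, model state, adaptive gains
for _ in range(steps):
    e = x - xm                       # tracking error
    u = kx * x + kr * r              # adaptive control law
    kx += dt * (-gamma * e * x)      # Lyapunov-based gain updates
    kr += dt * (-gamma * e * r)
    x += dt * (a_true * x + u)
    xm += dt * (am * xm + bm * r)

print(abs(x - xm))   # residual tracking error after adaptation
```

    The tracking error decays to near zero even though the plant parameter is never identified individually; the paper's contribution concerns making such schemes verifiable with analytical stability margins, which this toy does not address.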

  15. Second Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-06-26

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, seven local earthquakes were recorded during the second quarter of fiscal year 2008. The largest event recorded by the network during the second quarter (February 3, 2008 - magnitude 2.3 Mc) was located northeast of Richland in Franklin County at a depth of 22.5 km. With regard to the depth distribution, two earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), three earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and two earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, five earthquakes occurred in swarm areas and two earthquakes were classified as random events.
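    The depth classification used throughout these quarterly reports maps hypocentral depth onto the likely geologic unit. A small helper capturing the quoted ranges (handling of the exact 4 and 9 km boundaries is a guess, as the abstract only gives open intervals):

```python
def classify_depth(depth_km):
    """Map hypocentral depth to the likely geologic unit, per the report's
    quoted ranges; treatment of exactly 4 and 9 km is an assumption."""
    if depth_km < 4.0:
        return "Columbia River basalts"
    if depth_km <= 9.0:
        return "pre-basalt sediments"
    return "crystalline basement"

for d in (2.5, 6.0, 22.5):
    print(f"{d} km -> {classify_depth(d)}")
```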

  16. First Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-03-21

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one micro earthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

  17. Second Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-07-17

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the second quarter of FY 2000.

  18. Third Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 818 triggers on two parallel detection and recording systems during the third quarter of fiscal year (FY) 2000. Thirteen seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 7 were earthquakes in the Columbia River Basalt Group, 1 was an earthquake in the pre-basalt sediments, and 5 were earthquakes in the crystalline basement. Three earthquakes occurred in known swarm areas, and 10 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the third quarter of FY 2000.

  19. First quarter Hanford seismic report for fiscal year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2000.

  20. Second Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Stephen P.; Rohay, Alan C.

    2000-07-17

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences.

  1. Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Kolditz, O.

    2013-12-01

    The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is to predict and quantify the distribution and the amount of sub-seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows which will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data that are provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator for fracture density. Furthermore, from the complete 3-D strain tensor we can predict the orientation of fractures at sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire new near-surface, high-resolution P- and S-wave 2-D seismic data in November of this year in order to verify and calibrate our model results. Here, novel and parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between reservoir and overburden. Thereby, we will provide a tool for prediction and adapted time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities.

  2. Network Optimization for Induced Seismicity Monitoring in Urban Areas

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Husen, S.; Wiemer, S.

    2012-12-01

    With the global challenge to satisfy an increasing demand for energy, geological energy technologies have received growing attention and have been initiated in or close to urban areas in recent years. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, are now posing a considerable risk for the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential to the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring at an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic light systems and is therefore an important confidence building measure towards the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that allows one to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts.
The algorithm is based on the D-optimal experimental
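    D-optimal network design of this kind maximizes the determinant of the Fisher information of the hypocenter-location problem. A minimal sketch, assuming a homogeneous medium and hypothetical station coordinates (the abstract's actual algorithm is more elaborate), compares an azimuthally distributed network against a collinear one:

```python
import numpy as np

def traveltime_jacobian(event, stations, v=5.0):
    """Partial derivatives of the P arrival time at each station with
    respect to hypocenter (x, y, z) and origin time t0, for a
    homogeneous medium of velocity v (km, km/s)."""
    rows = []
    for s in stations:
        d = event - s
        R = np.linalg.norm(d)
        rows.append([d[0] / (v * R), d[1] / (v * R), d[2] / (v * R), 1.0])
    return np.array(rows)

def d_criterion(event, stations):
    """D-optimality score: determinant of the Fisher information G^T G."""
    G = traveltime_jacobian(event, np.asarray(stations, dtype=float))
    return np.linalg.det(G.T @ G)

event = np.array([0.0, 0.0, 5.0])   # hypothetical source at 5 km depth
surround = [(10.0, 0.0, 0.0), (0.0, 15.0, 0.0), (-20.0, 0.0, 0.0), (0.0, -8.0, 0.0)]
one_sided = [(10.0, 0.0, 0.0), (12.0, 0.0, 0.0), (14.0, 0.0, 0.0), (16.0, 0.0, 0.0)]

print(d_criterion(event, surround))    # azimuthal coverage: nonzero information
print(d_criterion(event, one_sided))   # collinear stations: singular (~0)
```

    The collinear geometry carries no information about the cross-line coordinate, so its information determinant collapses to zero; an optimizer driven by this criterion therefore favors geometries that surround the anticipated seismicity.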

  3. High Voltage Seismic Generator

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary results of a year of cooperation among three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor into a spark gap. The whole appliance is powered from a li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specifications of elements for the final device were chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. A field test was then conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water. Geophones were placed on the ground in a straight line. A comparison of the signals registered with the hammer source and the sparker source was made. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. Besides its signal characteristics, the biggest advantage of the presented device is its size, 0.5 x 0.25 x 0.2 m, and its weight of approximately 7 kg. These features, together with a small li-ion battery, make
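    The energy available per discharge from such a source follows directly from the capacitor energy formula E = CV²/2. The component values below are hypothetical, since the abstract gives no specifications:

```python
# Hypothetical component values (the abstract gives no specifications):
C = 400e-6            # capacitor bank capacitance, farads
V = 4000.0            # charging voltage, volts
E = 0.5 * C * V ** 2  # stored energy, E = C*V^2 / 2
print(f"energy per discharge: {E:.0f} J")   # 3200 J
```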

  4. Seismicity of west Texas

    SciTech Connect

    Dumas, D.B.

    1981-01-01

    A four-year seismic study has found the Basin and Range province of west Texas and the adjacent area of Mexico to be more seismically active than heretofore known. A University of Texas five-station seismic array around the Marfa Basin has located or detected approximately 800 local and regional earthquakes with S-P times of less than 30 sec. A crustal model for the Basin and Range is derived from natural and artificial sources and contains four layers having velocities of 3.60, 4.93, 6.11, and 6.60 km/sec, respectively, overlying a mantle of 8.37 km/sec. A moderate level of seismic activity has been detected near Van Horn, in the Marfa Basin, and along the Texas-Mexico border between latitudes 30° and 31°N. Five earthquake sequences were recorded, two near the Texas-Mexico border and three in the Marfa Basin. Four of these sequences showed quiescent periods in foreshock activity preceding the main shock. On the eastern side of the Marfa Basin a diffuse linear seismic zone may represent an unmapped fault, striking N50°W, that coincides with Muehlberger's proposed eastern boundary of Basin and Range faulting. A new epicenter for the Valentine, Texas earthquake of August 16, 1931 has been relocated instrumentally at the northern end of this diffuse zone. Regional and local teleseismic P-wave arrival time anomalies observed for the nearby Gnome underground nuclear explosion of 1961 are used to determine station corrections and thus to locate the new 1931 epicenter at 30.69°N, 104.57°W. Several estimates of magnitude (m_b) based on intensity data range from 5.6 to 6.4. Fault-plane and composite fault-plane solutions support Muehlberger's hypothesis that the Basin and Range is undergoing extension in a SW-NE direction.
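    The 30-sec S-P cutoff corresponds to a maximum event distance that can be estimated from the velocity model. A rough sketch, assuming straight-ray propagation, the model's 6.11 km/sec layer for P, and a Poisson-solid S velocity (the abstract gives no S velocities):

```python
import math

def sp_distance(dt_sp, vp, vs):
    """Event distance implied by an S-P time difference, assuming
    straight rays at constant P and S velocities."""
    return dt_sp * vp * vs / (vp - vs)

vp = 6.11                  # crustal P velocity from the model (km/s)
vs = vp / math.sqrt(3.0)   # assumed Poisson-solid S velocity (not in the abstract)
d = sp_distance(30.0, vp, vs)
print(f"{d:.0f} km")       # roughly 250 km for the 30-s S-P cutoff
```

    This back-of-the-envelope bound is consistent with the array recording both local and regional events.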

  5. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

    The Chilean subduction zone is among the seismically most active plate boundaries in the world and its coastal ranges suffer from a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered to be a mature seismic gap, rupturing last in 1835 and being seismically quiet without any magnitude 4.5 or larger earthquakes reported in global catalogues. It is located to the north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and to the south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom seismometers was deployed in the northern portion of the rupture area for a three month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data from a network of 33 land stations of the GeoForschungsZentrum Potsdam were included, providing an ideal coverage of both the rupture plane and areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P wave onsets and a 2.5D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well resolved seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards from the deformation front. Prior to
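    Automatic P-onset detection of the kind mentioned above is commonly done with an STA/LTA trigger (the abstract does not specify the authors' picker). A minimal sketch on synthetic data:

```python
import numpy as np

def sta_lta_pick(x, sta_len=20, lta_len=200, threshold=4.0):
    """First sample where the short-term/long-term average power ratio
    exceeds the threshold, or None if the trigger never fires."""
    power = x ** 2
    for i in range(lta_len, len(x)):
        sta = power[i - sta_len:i].mean()   # short-term average
        lta = power[i - lta_len:i].mean()   # long-term average
        if lta > 0.0 and sta / lta > threshold:
            return i
    return None

rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(2000)     # background noise
trace[1000:] += rng.standard_normal(1000)   # impulsive arrival at sample 1000
pick = sta_lta_pick(trace)
print(pick)   # a few samples after the onset at 1000
```

    Window lengths and threshold are illustrative; in practice they are tuned to the noise conditions of each station.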

  6. Results from the latest SN-4 multi-parametric benthic observatory experiment (MARsite EU project) in the Gulf of Izmit, Turkey: oceanographic, chemical and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Embriaco, Davide; Marinaro, Giuditta; Frugoni, Francesco; Giovanetti, Gabriele; Monna, Stephen; Etiope, Giuseppe; Gasperini, Luca; Çağatay, Namık; Favali, Paolo

    2015-04-01

    An autonomous and long-term multiparametric benthic observatory (SN-4) was designed to study gas seepage and seismic energy release along the submerged segment of the North Anatolian Fault (NAF). Episodic gas seepage occurs at the seafloor in the Gulf of Izmit (Sea of Marmara, NW Turkey) along this submerged segment of the NAF, which ruptured during the 1999 Mw7.4 Izmit earthquake. The SN-4 observatory already operated in the Gulf of Izmit at the western end of the 1999 Izmit earthquake rupture for about one-year at 166 m water depth during the 2009-2010 experiment (EGU2014-13412-1, EGU General Assembly 2014). SN-4 was re-deployed in the same site for a new long term mission (September 2013 - April 2014) in the framework of MARsite (New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite, http://marsite.eu/ ) EC project, which aims at evaluating seismic risk and managing of long-term monitoring activities in the Marmara Sea. A main scientific objective of the SN-4 experiment is to investigate the possible correlations between seafloor methane seepage and release of seismic energy. We used the same site of the 2009-2010 campaign to verify both the occurrence of previously observed phenomena and the reliability of results obtained in the previous experiment (Embriaco et al., 2014, doi:10.1093/gji/ggt436). In particular, we are interested in the detection of gas release at the seafloor, in the role played by oceanographic phenomena in this detection, and in the association of gas and seismic energy release. The scientific payload included, among other instruments, a three-component broad-band seismometer, and gas and oceanographic sensors. We present a technical description of the observatory, including the data acquisition and control system, results from the preliminary analysis of this new multidisciplinary data set, and a comparison with the previous experiment.

  7. Guideline for the seismic technical evaluation of replacement items for nuclear power plants

    SciTech Connect

    Harris, S.P.; Cushing, R.W. ); Johnson, H.W. ); Abeles, J.M. )

    1993-02-01

    Seismic qualification for equipment originally installed in nuclear power plants was typically performed by the original equipment suppliers or manufactures (OES/OEM). Many of the OES/OEM no longer maintain quality assurance programs with adequate controls for supplying nuclear equipment. Utilities themselves must provide reasonable assurance in the continued seismic adequacy of such replacement items. This guideline provides practical, cost-effective techniques which can be used to provide reasonable assurance that replacement items will meet seismic performance requirements necessary to maintain the seismic design basis of commercial nuclear power plants. It also provides a method for determining when a seismic technical evaluation of replacement items (STERI) is required as part of the procurement process for spare and replacement items. Guidance on supplier program requirements necessary to maintain continued seismic adequacy and on documentation of maintaining required seismic adequacy is also included.

  8. A Hammer-Impact, Aluminum, Shear-Wave Seismic Source

    USGS Publications Warehouse

    Haines, Seth S.

    2007-01-01

    Near-surface seismic surveys often employ hammer impacts to create seismic energy. Shear-wave surveys using horizontally polarized waves require horizontal hammer impacts against a rigid object (the source) that is coupled to the ground surface. I have designed, built, and tested a source made out of aluminum and equipped with spikes to improve coupling. The source is effective in a variety of settings, and it is relatively simple and inexpensive to build.

  9. Seismic Rotations Observed with Inertial Seismic Sensors

    NASA Astrophysics Data System (ADS)

    Jean, V.

    2006-12-01

    Recent interest has arisen in the seismological community concerning possible effects of Earth rotations on signals recorded by inertial seismometers. Wiechert and Schluter (1903) and, more recently, Pancha et al. (2000) and Igel et al. (2005, 2006) show that, in the teleseismic range, rotations may be neglected and account for less than 0.1% of the translational waves generated by earthquakes. By contrast, we may see effects of rotation on seismic traces recorded in the near field of an earthquake. As instruments deliver unsaturated signals in this near field, rotation detection will become more and more frequent. We may also observe rotation effects in the noise signal at long period. In the near field, the integrated three-component signal of the accelerograms (i.e., the velocity signal) diverges, and this drift is the effect of a nearly invisible small step in the acceleration signal. The second integration diverges as well, so the co-seismic displacement cannot be estimated. By studying the long-period noise, we have found that the two horizontal components of some GEOSCOPE stations equipped with the STS-1 seismometer from Streckeisen present the same noise, both in amplitude and in phase, with a coherency greater than 95%. This similarity occurs at some stations and not at others, and only during some time periods. The noise therefore has a quite stable horizontal polarisation at N045 during these periods. We argue that these two separate effects come from ground rotations and the way they are recorded by seismic instruments. For example, GEOSCOPE stations equipped with the STS-2, which has a quite different mechanical structure, do not exhibit the polarisation effect. Mechanical pendulums such as the vertical LaCoste sensor and the horizontal 'garden-gate' sensor show effects of rotations on the different translation motions of the mass. Therefore, for the long-period noise, a quite probable explanation is that a rotation around the vertical axis acts similarly on the two horizontal
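    The reported >95% coherency between horizontal components can be illustrated with synthetic data: a common (e.g., rotation-induced) signal buried in independent instrument noise yields a high inter-channel correlation. All amplitudes and periods below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 600.0, 0.5)                  # 10 minutes sampled at 2 Hz
common = np.sin(2.0 * np.pi * t / 100.0)        # hypothetical rotation-induced signal
noise_n = 0.05 * rng.standard_normal(t.size)    # independent instrument self-noise
noise_e = 0.05 * rng.standard_normal(t.size)
north, east = common + noise_n, common + noise_e

# When a common signal dominates both horizontal channels, their
# correlation approaches 1, mimicking the >95% coherency reported.
r = np.corrcoef(north, east)[0, 1]
print(f"inter-channel correlation: {r:.3f}")
```

    If the common signal is removed, the correlation collapses toward zero, which is the discriminant the abstract exploits between STS-1 and STS-2 stations.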

  10. Intermediate depth seismicity - a reflection seismic approach

    NASA Astrophysics Data System (ADS)

    Haberland, C.; Rietbrock, A.

    2004-12-01

    During subduction the descending oceanic lithosphere is subject to metamorphic reactions, some of them associated with the release of fluids. It is now widely accepted that these reactions and the associated dehydration processes are directly related to the generation of intermediate-depth earthquakes (dehydration embrittlement). However, the structure of the layered oceanic plate at depth and the location of the earthquakes relative to structural units of the subducting plate (sources within the oceanic crust and/or in the upper oceanic mantle lithosphere?) have still not been resolved. This is mainly due to the fact that the observational resolution needed to address these topics (in the range of only a few kilometers) is hardly achieved in field experiments and related studies. Here we study the wavefields of intermediate-depth earthquakes typically observed by temporary networks in order to assess their high-resolution potential in resolving the structure of the downgoing slab and the locus of seismicity. In particular we study whether the subducted oceanic Moho can be detected by the analysis of secondary phases of local earthquakes (near-vertical reflections). Due to the irregular geometry of sources and receivers we apply an imaging technique similar to diffraction stack migration. The method is tested using synthetic data based both on 2-D finite difference simulations and on 3-D kinematic ray tracing. The accuracy of the hypocenter locations and onset times, crucial for the successful application of stacking techniques (coherency), was achieved by using relatively relocated intermediate-depth seismicity. Additionally, we simulate the propagation of the wavefields at larger distances (wide angle), indicating the development of guided waves traveling in the low-velocity waveguide associated with the modeled oceanic crust. We also present an application to local earthquake data from the South American subduction zone.
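    A diffraction-stack migration of the kind described sums trace amplitudes along the traveltime curve of each candidate image point. A toy zero-offset, constant-velocity version is sketched below (the study's actual source-receiver geometry is irregular and 3-D, and its velocities are layered):

```python
import numpy as np

# Zero-offset diffraction-stack (Kirchhoff-style) imaging of a point scatterer
# in a constant-velocity medium; a toy stand-in for migrating reflected phases.
v, dt, nt = 2000.0, 1e-3, 700          # velocity (m/s), sample interval (s), samples
recv_x = np.arange(0.0, 1001.0, 50.0)  # surface receiver line (m)
x0, z0 = 500.0, 250.0                  # true scatterer position (m)

# Synthesize traces: a narrow Gaussian pulse on each diffraction hyperbola.
times = np.arange(nt) * dt
traces = np.empty((recv_x.size, nt))
for k, xr in enumerate(recv_x):
    t_k = 2.0 * np.hypot(xr - x0, z0) / v        # two-way traveltime
    traces[k] = np.exp(-((times - t_k) / 0.004) ** 2)

# Migrate: for each image point, stack trace amplitudes along its hyperbola.
xs = np.arange(0.0, 1001.0, 50.0)
zs = np.arange(100.0, 401.0, 50.0)
image = np.zeros((zs.size, xs.size))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        for k, xr in enumerate(recv_x):
            it = int(round(2.0 * np.hypot(xr - x, z) / v / dt))
            if it < nt:
                image[iz, ix] += traces[k, it]

iz, ix = np.unravel_index(image.argmax(), image.shape)
print(zs[iz], xs[ix])   # recovers the scatterer at (z, x) = (250.0, 500.0)
```

    Amplitudes stack coherently only at the true scatterer position, which is why accurate hypocenters and onset times are so critical for this class of methods.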

  11. Probabilistic seismic hazard estimation of Manipur, India

    NASA Astrophysics Data System (ADS)

    Pallav, Kumar; Raghukanth, S. T. G.; Darunkumar Singh, Konjengbam

    2012-10-01

    This paper deals with the estimation of spectral acceleration for Manipur based on probabilistic seismic hazard analysis (PSHA). The 500 km region surrounding Manipur is divided into seven tectonic zones, and major faults located in these zones are used to estimate seismic hazard. The earthquake recurrence relations for the seven zones have been estimated from past seismicity data. Ground motion prediction equations proposed by Boore and Atkinson (2008 Earthq. Spectra 24 99-138) for shallow active regions and Atkinson and Boore (2003 Bull. Seismol. Soc. Am. 93 1703-29) for the Indo-Burma subduction zone are used for estimating ground motion. The uniform hazard response spectra for all nine constituent districts of Manipur (Senapati, Tamenglong, Churachandpur, Chandel, Imphal east, Imphal west, Ukhrul, Thoubal and Bishnupur) at 100-, 500-, and 2500-year return periods have been computed from PSHA. A contour map of peak ground acceleration over Manipur is also presented for 100-, 500-, and 2500-year return periods, with variations of 0.075-0.225, 0.18-0.63 and 0.3-1.15 g, respectively, throughout the state. These results may be of use to planners and engineers for site selection and for designing earthquake-resistant structures and, further, may help the state administration in seismic hazard mitigation.
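
    The PSHA workflow this record describes (recurrence relations per zone combined with a ground motion prediction equation) can be sketched for a single point source. The recurrence constants, the toy lognormal ground-motion model, and the distance below are hypothetical placeholders, not the Boore-Atkinson (2008) or Atkinson-Boore (2003) GMPEs used in the paper.

    ```python
    import math

    # Toy PSHA sketch for one point source. All coefficients are
    # illustrative, not taken from the paper's GMPEs.

    def annual_rate(m_lo, m_hi, a, b):
        """Gutenberg-Richter annual rate of events with magnitude in [m_lo, m_hi)."""
        return 10 ** (a - b * m_lo) - 10 ** (a - b * m_hi)

    def prob_exceed(pga_g, mag, dist_km, sigma_ln=0.6):
        """P(PGA > pga_g | mag, dist) under a hypothetical lognormal GMPE."""
        # Hypothetical median attenuation: ln(PGA) = -4.0 + 1.0*M - 1.3*ln(R + 10)
        mean_ln = -4.0 + 1.0 * mag - 1.3 * math.log(dist_km + 10.0)
        z = (math.log(pga_g) - mean_ln) / sigma_ln
        return 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - Phi(z)

    def hazard_curve_point(pga_g, dist_km, a=3.0, b=1.0,
                           m_min=4.0, m_max=8.0, dm=0.1):
        """Annual exceedance rate: sum over magnitude bins of rate * P(exceed)."""
        rate = 0.0
        m = m_min
        while m < m_max - 1e-9:
            r = annual_rate(m, m + dm, a, b)
            rate += r * prob_exceed(pga_g, m + dm / 2.0, dist_km)
            m += dm
        return rate
    ```

    Evaluating `hazard_curve_point` over a range of PGA values yields a hazard curve; the return period quoted in the paper is the reciprocal of the annual exceedance rate.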

  12. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R-packages provide a platform for development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S-language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS), and inverting data involved in a variety of geophysical applications. On CRAN (the Comprehensive R Archive Network, http://www.r-project.org/), currently available packages related to seismic analysis include RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These provide signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  13. Seismic monitoring system replacement at Temelin plant

    SciTech Connect

    Baltus, R.; Palusamy, S.S.

    1996-12-01

    The VVER-1000 plants under construction at Temelin (Czech Republic) were designed with an automatic reactor trip system triggered on seismic peak accelerations. Within the plant I&C upgrade, Westinghouse designed a digital Seismic Monitoring System to be integrated into an Artificial Intelligence based Diagnostic and Monitoring System. The system meets the requirements of the emerging standards prepared by the US NRC on the basis of EPRI studies, which recommend a detailed data evaluation and a pre-shutdown plant inspection before an orderly shutdown, if required, rather than an immediate emergency shutdown. The paper presents the arguments about automatic trip, as discussed in an IAEA meeting attended by expert consultants from Japan, Russia, the US, and Eastern and Western Europe. It describes the system installed at Temelin, including the plant-specific criteria for OBE exceedance. Finally, it presents the capabilities and limitations of the integration into an overall Diagnostic and Monitoring System.

  14. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section A report, existing conditions calculations/supporting information

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. Based upon US Department of Energy (DOE) Albuquerque Operations Office (DOE/AL) and LANL projections, storage space limitations/restrictions will begin to affect LANL's ability to meet its missions between 1998 and 2002.

  15. Teaching Reflection Seismic Processing

    NASA Astrophysics Data System (ADS)

    Forel, D.; Benz, T.; Pennington, W. D.

    2004-12-01

    Without pictures, it is difficult to give students a feeling for wave propagation, transmission, and reflection. Even with pictures, wave propagation remains static to many. However, when students use and modify scripts that generate wavefronts and rays through a geologic model that they have modified themselves, we find that they gain a real feeling for wave propagation. To facilitate teaching 2-D seismic reflection data processing (from acquisition through migration) to our undergraduate and graduate Reflection Seismology students, we use Seismic Un*x (SU) software. SU is maintained and distributed by the Colorado School of Mines, and it is freely available (at www.cwp.mines.edu/cwpcodes). Our approach includes the use of synthetic and real seismic data, processing scripts, and detailed explanations of the scripts. Our real data were provided by Gregory F. Moore of the University of Hawaii. This approach can be used by any school at virtually no expense for either software or data, and can provide students with a sound introduction to techniques used in the processing of reflection seismic data. The same software can be used for other purposes, such as research, with no additional expense. Students who have completed a course using SU are well equipped to begin using it for research as well. Scripts for each processing step are supplied and explained to the students. Our detailed description of the scripts means students do not have to know anything about SU to start. Experience with the Unix operating system is preferable but not necessary; our notes include Computer Hints to help the beginner work with the Unix operating system. We include several examples of synthetic model building, acquiring shot gathers through synthetic models, sorting shot gathers to CMP gathers, gain, 1-D frequency filtering, f-k filtering, deconvolution, semblance displays and velocity analysis, flattening data (NMO), stacking the CMPs, and migration. We use two real (marine) data sets.

  16. Seismic qualification of existing safety class manipulators

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1992-05-01

    There are two bridge-type electromechanical manipulators within a nuclear fuel handling facility that were constructed over twenty-five years ago. At that time, there were only minimal seismic considerations. These manipulators, together with the facility, are being reactivated. Detailed analyses have shown that the manipulators will satisfy the requirements of ANSI/AISC N690-1984 when they are subjected to loadings including the site-specific design basis earthquake. 4 refs.

  18. Hanford quarterly seismic report -- 97A seismicity on and near the Hanford Site, Pasco Basin, Washington, October 1, 1996 through December 31, 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1997-02-01

    Seismic Monitoring is part of PNNL's Applied Geology and Geochemistry Group. The Seismic Monitoring Analysis and Repair Team (SMART) operates, maintains, and analyzes data from the Hanford Seismic Network (HSN), extending the site's historical seismic database and fulfilling US Department of Energy, Richland Operations Office requirements and orders. The SMART also maintains the Eastern Washington Regional Network (EWRN). The University of Washington uses the data from the EWRN and other seismic networks in the Northwest to provide the SMART with necessary regional input for the seismic hazards analysis at the Hanford Site. The SMART is tasked to provide an uninterrupted collection of high-quality raw seismic data from the HSN located on and around the Hanford Site. These unprocessed data are permanently archived. SMART also is tasked to locate and identify sources of seismic activity, monitor changes in the historical pattern of seismic activity at the Hanford Site, and build a local earthquake database (processed data) that is permanently archived. Local earthquakes are defined as earthquakes that occur between 46 and 47 degrees north latitude and 119 and 120 degrees west longitude. The data are used by the Hanford contractor for waste management activities, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site.

  19. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification of, and information regarding, the seismic hazard at the plant site, the dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to the seismic hazard and the assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low-probability-of-failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of the lower tail of the fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and the functional form of the fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including the slopes of the seismic hazard curves and the likelihoods assigned to those curves.
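
    The lognormal fragility model and the HCLPF capacity mentioned in this record can be sketched briefly. The numeric capacities below are hypothetical; the sketch uses the standard lognormal fragility form and the common approximation HCLPF = Am * exp(-1.65 * (beta_r + beta_u)), where Am is the median capacity and beta_r, beta_u are the logarithmic standard deviations for randomness and uncertainty.

    ```python
    import math

    # Minimal sketch of the lognormal fragility model used in seismic PRA.
    # Am (median capacity, in g) and the beta values below are illustrative.

    def fragility(a, am, beta_c):
        """Mean conditional failure probability at acceleration a,
        using a composite logarithmic standard deviation beta_c."""
        # Phi(ln(a/am)/beta_c) expressed via the complementary error function
        return 0.5 * math.erfc(-math.log(a / am) / (beta_c * math.sqrt(2.0)))

    def hclpf(am, beta_r, beta_u):
        """High-confidence, low-probability-of-failure capacity:
        ~95% confidence of less than ~5% failure probability."""
        return am * math.exp(-1.65 * (beta_r + beta_u))
    ```

    For example, a component with a median capacity of 1.2 g and beta_r = 0.25, beta_u = 0.35 has an HCLPF capacity of about 0.45 g; the record's observation that results are insensitive to the fragility betas but sensitive to the hazard curve can be explored by varying these inputs.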

  20. First Quarter Hanford Seismic Report for Fiscal Year 1999

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    1999-05-26

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. The organization also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the first quarter of FY99 for stations in the HSN was 99.8%. There were 121 triggers during the first quarter of fiscal year 1999. Fourteen triggers were local earthquakes; seven (50%) were in the Columbia River Basalt Group, no earthquakes occurred in the pre-basalt sediments, and seven (50%) were in the crystalline basement. One earthquake (7%) occurred near or along the Horn Rapids anticline, seven earthquakes (50%) occurred in a known swarm area, and six earthquakes (43%) were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometer during the first quarter of FY99.