Science.gov

Sample records for verifying seismic design

  1. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, those schemes are not true designated verifier signature schemes in the traditional sense, because the designated person lacks the capability to efficiently simulate a signature that is indistinguishable from the signer's. This fails to satisfy the requirements of some special environments such as e-voting, calls for tenders, and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. By the properties of unitary transformation and a quantum one-way function, only a verifier designated by the signer can verify the validity of a signature, and the designated verifier cannot prove to a third party whether the signature was produced by the signer or by himself through a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of this scheme. Analysis results show that this new scheme satisfies the main security requirements of a designated verifier signature scheme and resists the major attack strategies.

  2. Verifying the HETG spectrometer Rowland design

    NASA Astrophysics Data System (ADS)

    Stage, Michael D.; Dewey, Daniel

    1998-11-01

The HETGS on AXAF is the coordinated operation of the AXAF High-Resolution Mirror Assembly (HRMA), the high-energy transmission grating (HETG), and the grating-readout array of the AXAF CCD Imaging Spectrometer (ACIS-S). XRCF calibration data are analyzed to verify the Rowland geometry design of the HETGS. In particular, ACIS-S imaging of the quadrant shutter focus test is used to probe the focus, alignment, and astigmatism of the spectra produced by diffraction through the high- and medium-energy gratings of the HETG. The experimental results are compared to expected values and to results obtained with the AXAF simulator, MARX.

  3. The seismic design handbook

    SciTech Connect

Naeim, F.

    1989-01-01

    This book contains papers on the planning, analysis, and design of earthquake resistant building structures. Theories and concepts of earthquake resistant design and their implementation in seismic design practice are presented.

  4. Position paper: Seismic design criteria

    SciTech Connect

    Farnworth, S.K.

    1995-05-22

The purpose of this paper is to document the seismic design criteria to be used in the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies for determining the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  5. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.

  6. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically, as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis and the "correctness" models.

  7. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

An attempt to evaluate the applicability of program verification techniques to an existing programming language, HAL/S, is discussed. HAL/S is a general-purpose high-level language designed to accommodate the software needs of the NASA Space Shuttle project, offering a diversity of features for scientific computing, concurrent and real-time programming, and error handling. The criteria by which features were evaluated for inclusion in the verifiable subset are described. Individual features of HAL/S are examined with respect to these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented, along with recommendations for the use of HAL/S in program verification.

  8. DISPLACEMENT BASED SEISMIC DESIGN CRITERIA

    SciTech Connect

Hofmayer, C.H.

    1999-03-29

The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to the seismic design of the specialized structures, systems and components of nuclear power plants, and of such significance as to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and of on-going activities for future codes. It also discusses some technical issues for further consideration.

  9. Establishing seismic design criteria to achieve an acceptable seismic margin

    SciTech Connect

    Kennedy, R.P.

    1997-01-01

In order to develop risk-based seismic design criteria, the following four issues must be addressed: (1) What target annual probability of seismically induced unacceptable performance is acceptable? (2) What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the Safe Shutdown Earthquake ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues is addressed. Issues 2 and 3 are integrally tied together, so a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions that reasonably achieve the target goal. The remainder of the paper discusses an approach which can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in the resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented.
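The relationship between the target annual probability (Issue 1) and the seismic margin (Issue 2) rests on convolving a site hazard curve with a component fragility curve. A minimal sketch of that convolution; the hazard ordinates and fragility parameters below are illustrative placeholders, not values from the paper:

```python
import math

def lognormal_cdf(x, median, beta):
    """Conditional probability of unacceptable performance at ground motion x,
    for a lognormal fragility with the given median capacity and dispersion."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_failure_probability(hazard, median, beta):
    """Convolve a hazard curve (PGA vs. annual frequency of exceedance) with a
    lognormal fragility curve by summing over hazard-curve increments."""
    p = 0.0
    for (a1, h1), (a2, h2) in zip(hazard, hazard[1:]):
        p += lognormal_cdf(0.5 * (a1 + a2), median, beta) * (h1 - h2)
    return p

# Illustrative hazard curve: (PGA in g, annual frequency of exceedance)
hazard = [(0.1, 1e-2), (0.2, 2e-3), (0.4, 3e-4), (0.8, 3e-5), (1.6, 2e-6)]
p_f = annual_failure_probability(hazard, median=0.9, beta=0.4)
```

Raising the seismic margin (the fragility median relative to the design ground motion) lowers the annual failure probability, which is the trade-off the paper's Issues 2 and 3 jointly resolve.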

  10. Ground penetrating radar and active seismic investigation of stratigraphically verified pyroclastic deposits

    NASA Astrophysics Data System (ADS)

    Gase, A.; Bradford, J. H.; Brand, B. D.

    2015-12-01

We conducted ground-penetrating radar (GPR) and active seismic surveys in July and August 2015, parallel to outcrops of the pyroclastic density current deposits of the May 18, 1980 eruption of Mount St. Helens (MSH), Washington. The primary objective of this study is to compare geophysical properties that influence electromagnetic and elastic wave velocities with stratigraphic parameters in the unsaturated zone. The deposits of interest are composed of pumice, volcanic ash, and lava blocks comprising a wide range of intrinsic porosities and grain sizes from sand to boulders. Single-offset GPR surveys for reflection data were performed with a Sensors and Software pulseEKKO Pro 100 GPR using 50 MHz, 100 MHz, and 200 MHz antennae. GPR data processing includes time-zero correction, dewow filtering, migration, and elevation correction. Multi-offset acquisitions with 100 MHz antennae and offsets ranging from 1 m to 16 m are used for reflection tomography to create 2-D electromagnetic wave velocity models. Seismic surveys are performed with 72 geophones spaced at two meters, using a sledgehammer source with shot points at each receiver point. We couple P-wave refraction tomography with Rayleigh wave inversion to compute Vp/Vs ratios. The two geophysical datasets are then compared with stratigraphic information to illustrate the influence of lithological parameters (e.g., stratification, grain-size distribution, porosity, and sorting) on the geophysical properties of unsaturated pyroclastic deposits. Future work will include joint petrophysical inversion of the multiple datasets to estimate porosity and water content in the unsaturated zone.
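The porosity-estimation goal mentioned at the end can be illustrated with a standard petrophysical chain: GPR velocity gives the bulk dielectric permittivity, and a CRIM-style mixing law can then be inverted for porosity. This is a generic sketch, not the study's inversion; the matrix permittivity and the 0.15 m/ns velocity are assumed values:

```python
import math

C = 0.3  # speed of light in vacuum, m/ns

def permittivity_from_velocity(v):
    """Relative dielectric permittivity from GPR wave velocity v (m/ns)."""
    return (C / v) ** 2

def porosity_crim(eps_bulk, eps_matrix=5.0, eps_water=81.0, eps_air=1.0, sw=0.0):
    """Invert the CRIM mixing law for porosity phi at water saturation sw:
    sqrt(eps_bulk) = (1 - phi)*sqrt(eps_matrix)
                     + phi*(sw*sqrt(eps_water) + (1 - sw)*sqrt(eps_air))."""
    num = math.sqrt(eps_bulk) - math.sqrt(eps_matrix)
    den = sw * math.sqrt(eps_water) + (1.0 - sw) * math.sqrt(eps_air) - math.sqrt(eps_matrix)
    return num / den

# Dry pyroclastic deposit with an assumed GPR velocity of 0.15 m/ns
eps = permittivity_from_velocity(0.15)
phi = porosity_crim(eps, sw=0.0)
```

Joint use of the seismic Vp/Vs models would add an independent constraint on the same porosity and saturation unknowns, which is what the proposed joint petrophysical inversion exploits.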

  11. Simplified seismic performance assessment and implications for seismic design

    NASA Astrophysics Data System (ADS)

    Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

    2014-08-01

The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed, and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios, including different structural configurations.
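The relation between equivalent single-degree-of-freedom characteristics and the multi-degree-of-freedom response is, in displacement-based methods, the substitute-structure conversion. A minimal sketch; the storey masses and displacement profile below are hypothetical:

```python
def equivalent_sdof(masses, displacements):
    """Substitute-structure conversion: effective displacement and effective
    mass of the equivalent SDOF system from a storey displacement profile."""
    s1 = sum(m * d for m, d in zip(masses, displacements))
    s2 = sum(m * d * d for m, d in zip(masses, displacements))
    d_eff = s2 / s1            # effective (work-equivalent) displacement
    m_eff = s1 / d_eff         # effective mass
    return d_eff, m_eff

# Three equal storey masses (tonnes) with an assumed linear displacement profile (m)
d_eff, m_eff = equivalent_sdof([200.0, 200.0, 200.0], [0.1, 0.2, 0.3])
```

The effective displacement and mass, together with an equivalent damping estimate, let a single displacement spectrum ordinate stand in for the full multi-storey analysis, which is what makes the simplified assessment fast enough for conceptual design.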

  12. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

This manual discusses building and building-component behavior during earthquakes, and provides suggested details for seismic resistance which experience has shown to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low- and moderate-seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  13. Seismic design parameters - A user guide

    USGS Publications Warehouse

    Leyendecker, E.V.; Frankel, A.D.; Rukstales, K.S.

    2001-01-01

The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings (1997 NEHRP Provisions) introduced a seismic design procedure that is based on the explicit use of spectral response acceleration rather than the traditional peak ground acceleration and/or peak ground velocity or zone factors. The spectral response accelerations are obtained from spectral response acceleration maps accompanying the report. Maps are available for the United States and a number of U.S. territories. Since 1997, additional codes and standards have adopted seismic design approaches based on the same procedure used in the NEHRP Provisions and the accompanying maps. The design documents using the 1997 NEHRP Provisions procedure may be divided into three categories: (1) design of new construction, (2) design and evaluation of existing construction, and (3) design of residential construction. A CD-ROM has been prepared for use in conjunction with the design documents in each of these three categories. The spectral accelerations obtained using the software on the CD are the same as those that would be obtained by using the maps accompanying the design documents. The software has been prepared to operate on a personal computer using a Windows (Microsoft Corporation) operating environment and a point-and-click interface. The user can obtain the spectral acceleration values that would be obtained by use of the maps accompanying the design documents, include site factors appropriate for the Site Class provided by the user, calculate a response spectrum that includes the site factor, and plot the response spectrum. Sites may be located by providing the latitude-longitude or zip code for all areas covered by the maps. All of the maps used in the various documents are also included on the CD-ROM.
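The response spectrum such software constructs follows the general NEHRP two-parameter shape, built from the site-adjusted short-period and 1-second spectral accelerations. A simplified sketch of that shape (the SDS and SD1 values are placeholders, and the long-period transition is omitted):

```python
def design_spectrum(t, sds, sd1):
    """NEHRP-style general design response spectrum (5% damping).
    t: period in seconds; sds, sd1: site-adjusted design spectral
    accelerations at short periods and at 1 s, respectively."""
    t0 = 0.2 * sd1 / sds   # start of the constant-acceleration plateau
    ts = sd1 / sds         # transition to the constant-velocity branch
    if t < t0:
        return sds * (0.4 + 0.6 * t / t0)   # linear ramp from 0.4*SDS
    if t <= ts:
        return sds                           # plateau
    return sd1 / t                           # 1/T decay

sa = [design_spectrum(t, sds=1.0, sd1=0.4) for t in (0.0, 0.2, 1.0, 2.0)]
```

For SDS = 1.0 g and SD1 = 0.4 g this gives 0.4 g at zero period, the 1.0 g plateau between T0 = 0.08 s and TS = 0.4 s, and 0.4 g/T beyond, mirroring how the mapped values anchor the full spectrum.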

  14. The Relationship Between Verified Organ Donor Designation and Patient Demographic and Medical Characteristics.

    PubMed

    Sehgal, N K R; Scallan, C; Sullivan, C; Cedeño, M; Pencak, J; Kirkland, J; Scott, K; Thornton, J D

    2016-04-01

    Previous studies on the correlates of organ donation consent have focused on self-reported willingness to donate and on self-reported medical suitability to donate. However, these may be subject to social desirability bias and inaccurate assessments of medical suitability. The authors sought to overcome these limitations by directly verifying donor designation on driver's licenses and by abstracting comorbid conditions from electronic health records. Using a cross-sectional study design, they reviewed the health records of 2070 randomly selected primary care patients at a large urban safety-net medical system to obtain demographic and medical characteristics. They also examined driver's licenses that were scanned into electronic health records as part of the patient registration process for donor designation. Overall, 943 (46%) patients were designated as a donor on their driver's license. On multivariate analysis, donor designation was positively associated with age 35-54 years, female sex, nonblack race, speaking English or Spanish, being employed, having private insurance, having an income >$45 000, and having fewer comorbid conditions. These demographic and medical characteristics resulted in patient subgroups with donor designation rates ranging from 21% to 75%. In conclusion, patient characteristics are strongly related to verified donor designation. Further work should tailor organ donation efforts to specific subgroups.

  15. Research on performance-based seismic design criteria

    NASA Astrophysics Data System (ADS)

    Xie, Li-Li; Ma, Yu-Hong

    2002-03-01

The seismic design criterion adopted in existing seismic design codes is reviewed. It is pointed out that the presently used criterion does not satisfy the requirements of today's social and economic development. A new performance-based seismic design criterion composed of three components is presented in this paper. It can not only effectively control economic losses and casualties, but also ensure that buildings remain functional during earthquakes. The three components are: classification of seismic design for buildings; determination of seismic design intensity and/or seismic design ground motion for controlling seismic economic losses and casualties; and determination of importance factors in terms of the service periods of buildings. For controlling seismic human losses, the idea of a socially acceptable casualty level is presented, and an 'Optimal Economic Decision Model' and an 'Optimal Safe Decision Model' are established. Finally, a new method is recommended for calculating the importance factors of structures by adjusting their service periods, on the basis that more important structures have longer service periods than conventional ones. More important structures with longer service periods will therefore be designed for higher seismic loads, provided that the exceedance probability of the seismic hazard over the different service periods is the same.
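The service-period idea can be made concrete: holding the exceedance probability over the service period fixed, a longer service period implies a smaller annual exceedance probability, i.e. a rarer and stronger design ground motion. A minimal sketch under the usual assumption of independence between years (the 50- and 100-year figures are illustrative):

```python
def annual_exceedance(p_service, years):
    """Annual exceedance probability that yields probability p_service
    over a service period of `years`, assuming independent years."""
    return 1.0 - (1.0 - p_service) ** (1.0 / years)

def return_period(p_service, years):
    """Mean return period (years) of the corresponding design ground motion."""
    return 1.0 / annual_exceedance(p_service, years)

# Same 10% exceedance probability over a conventional 50-year service period
# versus a doubled 100-year period for a more important structure:
rp_50 = return_period(0.10, 50)
rp_100 = return_period(0.10, 100)
```

Doubling the service period roughly doubles the design return period (from about 475 to about 950 years here), which is exactly the mechanism by which a longer service period translates into a higher design load.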

  16. Tritium glovebox stripper system seismic design evaluation

    SciTech Connect

    Grinnell, J. J.; Klein, J. E.

    2015-09-01

The use of glovebox confinement at US Department of Energy (DOE) tritium facilities has been discussed in numerous publications. Glovebox confinement protects workers from radioactive material (especially tritium oxide), provides an inert atmosphere for the prevention of flammable gas mixtures and deflagrations, and allows recovery of tritium released from the process into the glovebox when a glovebox stripper system (GBSS) is part of the design. Tritium recovery from the glovebox atmosphere reduces emissions from the facility and the radiological dose to the public. Locating US DOE defense programs facilities away from public boundaries also aids in reducing radiological doses to the public. This study is based upon design concepts and identifies issues and considerations for the design of a seismic GBSS. The safety requirements and analysis presented should be considered preliminary; safety requirements for the design of a GBSS should be developed and finalized as part of the final design process.

  17. Feasibility study and verified design concept for new improved hot gas facility

    NASA Technical Reports Server (NTRS)

    1986-01-01

The MSFC Hot Gas Facility (HGF) was fabricated in 1975 as a temporary facility to provide immediate-turnaround testing to support SRB and ET TPS development. This facility proved to be very useful and was used to make more than 1300 runs, far more than ever intended in the original design. It was therefore in need of constant repair and needed to be replaced with a new, improved design to support the continuing SRB/ET TPS product improvement and/or removal efforts. MSFC contracted with Lockheed-Huntsville to work on this improved design through contract NAS8-36304, Feasibility Study and Verified Design Concept for the New Improved Hot Gas Facility. The results of Lockheed-Huntsville's efforts under this contract are summarized.

  18. Design of the IPIRG-2 simulated seismic forcing function

    SciTech Connect

    Olson, R.; Scott, P.; Wilkowski, G.

    1996-02-01

    A series of pipe system experiments was conducted in IPIRG-2 that used a realistic seismic forcing function. Because the seismic forcing function was more complex than the single-frequency increasing-amplitude sinusoidal forcing function used in the IPIRG-1 pipe system experiments, considerable effort went into designing the function. This report documents the design process for the seismic forcing function used in the IPIRG-2 pipe system experiments.

  19. Implied preference for seismic design level and earthquake insurance.

    PubMed

    Goda, K; Hong, H P

    2008-04-01

Seismic risk can be reduced by implementing newly developed seismic provisions in design codes. Furthermore, financial protection, or enhanced utility and happiness for stakeholders, could be gained through the purchase of earthquake insurance; if this were not so, there would be no market for such insurance. However, the perceived benefit associated with insurance is not universally shared by stakeholders, partly due to their diverse risk attitudes. This study investigates the implied seismic design preference with insurance options for decision-makers of bounded rationality whose preferences can be adequately represented by cumulative prospect theory (CPT). The investigation focuses on assessing the sensitivity of the implied seismic design preference with insurance options to the model parameters of the CPT and to fair and unfair insurance arrangements. Numerical results suggest that human cognitive limitations and risk perception can significantly affect the seismic design preference implied by the CPT. The mandatory purchase of fair insurance will lead the implied seismic design preference to the optimum design level dictated by the minimum expected lifecycle cost rule. Unfair insurance decreases the expected gain as well as its associated variability, which is preferred by risk-averse decision-makers. The obtained results for the implied preference for combinations of seismic design level and insurance option suggest that property owners, financial institutions, and municipalities can take advantage of affordable insurance to establish successful seismic risk management strategies.
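The CPT machinery the study builds on combines an S-shaped value function with inverse-S probability weighting. A sketch using the original Tversky–Kahneman parameter estimates (alpha = 0.88, lambda = 2.25, gamma = 0.61), which may differ from the values calibrated in the paper:

```python
def cpt_value(x, alpha=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and steeper for
    losses (lam is the loss-aversion coefficient)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def cpt_weight(p, gamma=0.61):
    """CPT probability weighting: overweights small probabilities and
    underweights moderate-to-large ones (inverse-S shape)."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)
```

A rare, large seismic loss is thus both over-weighted in probability and amplified by loss aversion, which is why the perceived value of earthquake insurance can diverge sharply from its expected-cost value and why risk perception shifts the implied design preference.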

  20. Verified by Visa and MasterCard SecureCode: Or, How Not to Design Authentication

    NASA Astrophysics Data System (ADS)

    Murdoch, Steven J.; Anderson, Ross

    Banks worldwide are starting to authenticate online card transactions using the '3-D Secure' protocol, which is branded as Verified by Visa and MasterCard SecureCode. This has been partly driven by the sharp increase in online fraud that followed the deployment of EMV smart cards for cardholder-present payments in Europe and elsewhere. 3-D Secure has so far escaped academic scrutiny; yet it might be a textbook example of how not to design an authentication protocol. It ignores good design principles and has significant vulnerabilities, some of which are already being exploited. Also, it provides a fascinating lesson in security economics. While other single sign-on schemes such as OpenID, InfoCard and Liberty came up with decent technology they got the economics wrong, and their schemes have not been adopted. 3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants and customers - given a gentle regulatory nudge.

  1. Verifying single-station seismic approaches using Earth-based data: Preparation for data return from the InSight mission to Mars

    NASA Astrophysics Data System (ADS)

    Panning, Mark P.; Beucler, Éric; Drilleau, Mélanie; Mocquet, Antoine; Lognonné, Philippe; Banerdt, W. Bruce

    2015-03-01

The planned InSight mission will deliver a single seismic station containing 3-component broadband and short-period sensors to the surface of Mars in 2016. While much of the progress in understanding the interiors of the Earth and Moon has relied on the use of seismic networks for accurate location of sources, single-station approaches can be applied to data returned from Mars in order to locate events and determine interior structure. In preparation for the data return from InSight, we use a terrestrial dataset recorded at the Global Seismic Network station BFO, located at the Black Forest Observatory in Germany, to verify an approach for event location and structure determination based on recordings of multiple-orbit surface waves, which will be more favorable to record on Mars than on Earth due to the smaller planetary radius and potentially lower background noise. With this approach applied to events near the threshold of observability on Earth, we are able to determine epicentral distance within approximately 1° (corresponding to ∼60 km on Mars), and origin time within ∼30 s. With back azimuth determined from Rayleigh wave polarization, absolute locations are determined generally within an aperture of 10°, allowing for localization within large tectonic regions on Mars. With these locations, we are able to recover Earth mantle structure within ±5% (the InSight mission requirement for martian mantle structure) using 1D travel time inversions of P and S travel times for datasets of only 7 events. The location algorithm also allows for the measurement of great-circle-averaged group velocity dispersion, which we measure between 40 and 200 s to scale the expected reliable frequency range of the InSight data from Earth to Mars. Using the terrestrial data, we are able to resolve structure down to ∼200 km, but synthetic tests demonstrate we should be able to resolve martian structure to ∼400 km with the same frequency content, given the smaller planetary size.
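The multiple-orbit idea reduces to simple timing algebra: R1 travels the minor arc, R2 the major arc, and R3 the minor arc plus one full circumference, all at roughly the same group velocity. A sketch of that algebra with a synthetic Mars-sized example (the arrival times and 4 km/s group velocity are invented for illustration):

```python
def locate_single_station(t1, t2, t3, circumference):
    """Epicentral distance, origin time, and group velocity from the arrival
    times (s) of the R1, R2 and R3 surface-wave trains at one station."""
    u = circumference / (t3 - t1)              # R3 lags R1 by one full orbit
    d = 0.5 * (circumference - u * (t2 - t1))  # minor-arc epicentral distance
    t0 = t1 - d / u                            # origin time
    return d, t0, u

# Synthetic check: circumference ~21,344 km (Mars), U = 4 km/s,
# event 3,000 km away with origin time t0 = 0
d, t0, u = locate_single_station(750.0, 4586.0, 6086.0, 21344.0)
```

Distance and origin time follow from timing alone; the back azimuth from Rayleigh wave polarization then turns the distance into an absolute location on the sphere.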

  2. Concept design for seismic upgrade of Keck telescopes

    NASA Astrophysics Data System (ADS)

    Kan, Frank W.; Park, Samuel; Sarawit, Andrew T.; Cranston, P. Graham

    2016-08-01

    On 15 October 2006, a large earthquake damaged both telescopes at W. M. Keck Observatory resulting in weeks of observing downtime. A significant portion of the downtime was attributed to recovery efforts repairing damage to telescope bearing journals, radial pad support structures, and encoder subsystems. To reduce the risk of damage and loss of observing time in future seismic events, we developed a conceptual design for the seismic upgrade of the twin Keck Telescopes. The paper covers the design requirements and constraints for the seismic upgrade, the evaluation method used to check the safety of sensitive components, and the trade-off study used to compare different options and to select the best design. Various design options such as base isolating the structure, strengthening seismic restraints, adding dampers, adding break-away mechanisms, and combinations of these design options are considered in this study. Nonlinear time history analyses are performed to evaluate the performance of the design concepts.

  3. Understanding seismic design criteria for Japanese Nuclear Power Plants

    SciTech Connect

    Park, Y.J.; Hofmayer, C.H.; Costello, J.F.

    1995-04-01

    This paper summarizes the results of recent survey studies on the seismic design practice for nuclear power plants in Japan. The seismic design codes and standards for both nuclear as well as non-nuclear structures have been reviewed and summarized. Some key documents for understanding Japanese seismic design criteria are also listed with brief descriptions. The paper highlights the design criteria to determine the seismic demand and component capacity in comparison with U.S. criteria, the background studies which have led to the current Japanese design criteria, and a survey of current research activities. More detailed technical descriptions are presented on the development of Japanese shear wall equations, design requirements for containment structures, and ductility requirements.

  4. Seismic design methods for oil and gas transmission pipelines: A comparative study

    SciTech Connect

    Zarea, M.; Akel, S.; Champavere, R.; Betbeder-Matibet, J.; Conoscente, J.P.

    1995-12-31

    The results of a comparative study of two main seismic design methods for buried hydrocarbon transmission pipelines are presented. Several aspects of each method are analyzed: description of the assumptions, necessary input parameters and expected results. In addition, a brief parametric study is applied to several configurations (straight pipes and bends), for both travelling waves and permanent ground displacement (fault) effects. Finally, the validity of the methods is verified by analyzing field experience from past earthquakes with these methods.

  5. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
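The probabilistic hazard assessment at the core of 10 CFR Part 100.23 can be sketched as a minimal hazard integration: sum, over seismic sources, each source's activity rate times the probability that its ground motion exceeds a given level. The attenuation coefficients and source model below are invented for illustration, not a real ground-motion prediction equation:

```python
import math

def p_exceed(a, m, r, sigma=0.6):
    """Probability that PGA exceeds a (g) for magnitude m at distance r (km),
    using a toy attenuation relation with lognormal scatter sigma."""
    ln_median = -1.0 + 0.6 * m - 1.3 * math.log(r + 10.0)  # invented coefficients
    z = (math.log(a) - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(a, sources):
    """Hazard-curve ordinate; sources is a list of (annual rate, magnitude,
    distance) tuples representing the seismic source model."""
    return sum(nu * p_exceed(a, m, r) for nu, m, r in sources)

sources = [(0.05, 6.0, 30.0), (0.01, 7.0, 80.0)]  # hypothetical source model
rate = annual_exceedance_rate(0.2, sources)
```

Because the sum composites all possible earthquakes, no single magnitude-distance pair is identified; deaggregating this sum to find the (m, r) pairs that dominate it at the design level is the kind of characterization the report investigates to restore the link to engineering design practice.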

  6. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    SciTech Connect

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  7. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt a uniform-confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety at the pipeline engineering site. Unlike a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline comprising pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between maximum fault displacement and surface-wave magnitude is obtained with supplemented earthquake data from East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan Ms 8.0 earthquake is introduced as an example.
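The exceedance probabilities quoted for the three pipe classes map to mean return periods under the usual Poisson assumption, T = -t / ln(1 - p). A small sketch (the Poisson assumption is ours, not stated in the abstract):

```python
import math

def return_period(p_exceed, t_years=50.0):
    """Mean return period for an exceedance probability p_exceed over
    t_years, assuming Poisson occurrence of exceedances."""
    return -t_years / math.log(1.0 - p_exceed)

# the three pipe classes: 2%, 5%, 10% in 50 years
periods = {p: return_period(p) for p in (0.02, 0.05, 0.10)}
# ~2475 yr, ~975 yr, and ~475 yr respectively
```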

  8. Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings

    NASA Astrophysics Data System (ADS)

    Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

    2012-09-01

    Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and to studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage to newly designed RC buildings is still reported. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB, and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analyses using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for the immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three frames designed according to the 2010 CCSDB meet the seismic requirements, and that the frames are at almost the same safety level.
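A PSDM-based fragility computation of the kind described above is commonly written as a lognormal model: ln(median demand) = ln a + b ln(IM), and the fragility curve is the normal CDF of the log demand-to-capacity ratio. A minimal sketch with hypothetical coefficients (the paper's fitted values are not given in the abstract):

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(im, a, b, beta_d, c_median, beta_c):
    """P(demand >= capacity | IM) for a cloud-method PSDM:
    ln(median demand) = ln(a) + b*ln(IM), with demand dispersion beta_d
    and lognormal capacity (median c_median, dispersion beta_c)."""
    ln_demand = math.log(a) + b * math.log(im)
    beta = math.sqrt(beta_d ** 2 + beta_c ** 2)
    return normal_cdf((ln_demand - math.log(c_median)) / beta)

# hypothetical coefficients, purely for illustration
p = fragility(im=0.4, a=0.05, b=1.1, beta_d=0.4, c_median=0.02, beta_c=0.3)
```

Evaluating this over a grid of intensity measures produces the fragility curves for each damage level (each with its own capacity median and dispersion).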

  9. Use of process monitoring for verifying facility design of large-scale reprocessing plants

    SciTech Connect

    Hakkila, E.A.; Zack, N.R. ); Ehinger, M.H. ); Franssen, F. )

    1991-01-01

    During the decade of the 1990s, the International Atomic Energy Agency (IAEA) faces the challenge of implementing safeguards in large, new reprocessing facilities. The Agency will be involved in the design, construction, checkout and initial operation of these new facilities to ensure effective safeguards are implemented. One aspect of the Agency involvement is in the area of design verification. The United States Support Program has initiated a task to develop methods for applying process data collection and validation during the cold commissioning phase of plant construction. This paper summarizes the results of this task. 14 refs., 1 tab.

  10. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed, and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the facility. DOE has developed Standards for site characterization and hazard assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop these results, a summary of important application issues is provided, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  11. Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy

    NASA Astrophysics Data System (ADS)

    Ramanathan, Karthik Narayan

    Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial in pre-earthquake planning and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes, such as traffic disruption of the overall highway system and impact on the region's economy and post-earthquake response and recovery, and, more recently, serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (events that prompted changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models are developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loadings. Fragility curves were then developed considering the vulnerability of multiple components, and thereby help to quantify the performance of highway bridge networks and to study the impact of seismic design principles on performance within a bridge class. This not only leads to the development of fragility relations

  12. A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1992-01-01

    Schneider demonstrates that many fault tolerant clock synchronization algorithms can be represented as refinements of a single proven correct paradigm. Shankar provides mechanical proof that Schneider's schema achieves Byzantine fault tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault tolerant midpoint convergence function satisfies three of the constraints. A hardware design is presented, implementing the fault tolerant midpoint function, which is shown to satisfy the remaining constraints. The synchronization circuit will recover completely from transient faults provided the maximum fault assumption is not violated. The initialization protocol for the circuit also provides a recovery mechanism from total system failure caused by correlated transient faults.
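The fault-tolerant midpoint convergence function at the heart of Schneider's schema discards the k largest and k smallest clock readings and averages the extremes of the remainder, so up to k Byzantine-faulty clocks cannot drag the result outside the range of the good ones. A minimal software sketch of that function (not Miner's hardware implementation):

```python
def fault_tolerant_midpoint(readings, k):
    """Fault-tolerant midpoint convergence: discard the k largest and k
    smallest clock readings, then average the extremes of what remains.
    Requires n > 2k readings (at most k faulty clocks among n)."""
    n = len(readings)
    if n <= 2 * k:
        raise ValueError("need more than 2k readings")
    s = sorted(readings)
    trimmed = s[k:n - k]
    return (trimmed[0] + trimmed[-1]) / 2.0

# one faulty clock (k=1) reporting a wild value is ignored:
adj = fault_tolerant_midpoint([10.0, 10.2, 9.9, 500.0], k=1)  # -> 10.1
```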

  13. Review of seismicity and ground motion studies related to development of seismic design at SRS

    SciTech Connect

    Stephenson, D.E.; Acree, J.R.

    1992-08-01

    The NRC response spectrum developed in Regulatory Guide 1.60 is being used in the studies related to restarting the existing Savannah River Site (SRS) reactors. Because it envelops all of the other site-specific spectra that have been developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this value or with these probability levels. This spectral shape is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located across the Savannah River from the SRS. This report provides a summary of the database used to develop the design basis earthquake, including the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided of the studies performed and the methodologies used to establish the design basis earthquake for SRS. The ground motion response spectra developed from the various studies are also summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. The programs for resolving outstanding issues are discussed and conclusions are presented.

  14. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 12 2011-01-01 2011-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  15. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  16. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  17. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Seismic design and construction standards for new..., REGULATIONS, AND EXECUTIVE ORDERS Seismic Safety of Federally Assisted New Building Construction § 1792.103 Seismic design and construction standards for new buildings. (a) In the design and construction...

  18. The Seismic Design of Waterfront Retaining Structures

    DTIC Science & Technology

    1992-11-01

    displacements, submergence, liquefaction potential, and excess pore water pressures, as well as inertial and hydrodynamic forces, are incorporated in the design...backfill. Procedures for incorporating the effects of submergence within the earth pressure computations, including consideration of excess pore water ...computations as specified by one of the following procedures: Restrained water case Free water case - restricted to soils of high permeability (e.g. k > 1 cm

  19. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with reference to updating seismic design provisions in codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) the emergence of a new spectral format that outdates the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity, and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the Canadian Highway Bridge Design Code (CHBDC) for its next edition, as was done for its building counterpart, the National Building Code of Canada (NBCC); indeed, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their updating processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probability of exceedance in 50 years. To support sound statistical inference, 389 Canadian cities are selected. The study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).

  20. Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

  1. A New Design of Seismic Stations Deployed in South Tyrol

    NASA Astrophysics Data System (ADS)

    Melichar, P.; Horn, N.

    2007-05-01

    When designing the seismic network in South Tyrol, the seismic service of Austria and the Civil Defense of South Tyrol combined more than 10 years of experience in running seismic networks and private communication systems. In recent years, a data return rate of > 99% and network uptime of > 99% have been achieved through the combination of high-quality station design and equipment and the use of the Antelope data acquisition and processing software, which comes with a suite of network monitoring and alerting tools (Nagios, etc.). The new data center is located in the city of Bolzano and is connected to data centers in Austria, Switzerland, and Italy for data backup purposes; each data center also uses a redundant communication system in case the primary system fails. When designing the South Tyrol network, improvements were made in seismometer installation, grounding, lightning protection, and data communications in order to improve the quality of the recorded data as well as network uptime and data return. The 12 new stations are equipped with a 6-channel Q330 + PB14f connected to an STS-2 + EpiSensor sensor. A key advance was the grounding concept for the whole seismic station: aluminum boxes were introduced that provide Faraday-cage isolation. Lightning protection devices are used for the equipment inside the aluminum housing, where the seismometer and data logger are housed, and special shielding was introduced for the seismometer cables. The broadband seismometer and strong-motion sensor are placed on a thick glass plate and thereby isolated from the ground. Precise seismometer orientation is achieved by a special groove on the glass plate, and in case of a strong earthquake the seismometer is tied down to the base plate. Temperature stability is achieved with styrofoam sheets inside the seismometer's aluminum protection box.

  2. RCC for seismic design. [Roller-Compacted Concrete

    SciTech Connect

    Wong, N.C.; Forrest, M.P.; Lo, S.H. )

    1994-09-01

    This article describes how the use of roller-compacted concrete (RCC) is saving $10 million on the seismic retrofit of Southern California's historic multiple-arch Littlerock Dam. Throughout its 70-year existence, the Littlerock Dam, in Southern California's Angeles National Forest, has been a subject of concern: situated near the San Andreas Fault, could this 28-arch dam withstand any major movement from that fault line, much less "the big one"? Working with the state's Division of Safety of Dams, Woodward-Clyde Consultants, Oakland, Calif., performed stability and stress analyses to find the answer. The evaluation showed that, as feared, the dam failed to meet the required seismic safety criteria, principally because of its lack of lateral stability, a deficiency inherent in multiple-arch dams. To provide adequate seismic stability, the authors developed a rehabilitation design centered on the use of roller-compacted concrete to construct a gravity section between and around the downstream portions of the existing buttresses. The authors also proposed that the arches be resurfaced and stiffened with steel-fiber-reinforced silica-fume concrete. The alternative design would have required filling the arch bays between the buttresses with mass concrete at a cost of $22.5 million. The RCC buttress repair, scheduled for completion this fall, will cost about $13 million.

  3. A New Event Detector Designed for the Seismic Research Observatories

    USGS Publications Warehouse

    Murdock, James N.; Hutt, Charles R.

    1983-01-01

    A new short-period event detector has been implemented on the Seismic Research Observatories. For each signal detected, a printed output gives estimates of the time of onset of the signal, direction of the first break, quality of onset, period and maximum amplitude of the signal, and an estimate of the variability of the background noise. On the SRO system, the new algorithm runs ~2.5x faster than the former (power level) detector. This increase in speed is due to the design of the algorithm: all operations can be performed by simple shifts, additions, and comparisons (floating point operations are not required). Even though a narrow-band recursive filter is not used, the algorithm appears to detect events competitively with those algorithms that employ such filters. Tests at Albuquerque Seismological Laboratory on data supplied by Blandford suggest performance commensurate with the on-line detector of the Seismic Data Analysis Center, Alexandria, Virginia.
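An event detector restricted to shifts, additions, and comparisons can be sketched as fixed-point short-term/long-term averaging with a comparison trigger. This is an illustrative reconstruction of the general idea only, not the published Murdock-Hutt algorithm; the shift amounts and threshold are arbitrary choices:

```python
def detect(samples, sta_shift=3, lta_shift=6):
    """Integer-only event trigger: short-term and long-term rectified
    averages kept as fixed-point accumulators updated with shifts and
    adds; the trigger itself is a comparison, so no floating point is
    needed. Returns the index of the first trigger, or None."""
    sta_acc = lta_acc = 0
    for i, x in enumerate(samples):
        r = abs(x)                              # rectified sample
        sta_acc += r - (sta_acc >> sta_shift)   # fast average * 2**sta_shift
        lta_acc += r - (lta_acc >> lta_shift)   # slow (background) average
        sta = sta_acc >> sta_shift
        lta = lta_acc >> lta_shift
        # wait for the long-term average to settle, then trigger when the
        # short-term average exceeds 1.5x the long-term average plus a floor
        if i > (1 << lta_shift) and sta > lta + (lta >> 1) + 4:
            return i
    return None

noise = [2, -2, 3, -1] * 50        # 200 samples of quiet background
burst = [50, -48, 52, -47] * 10    # strong signal onset
onset = detect(noise + burst)      # triggers shortly after sample 200
```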

  4. Study of seismic design bases and site conditions for nuclear power plants

    SciTech Connect

    Not Available

    1980-04-01

    This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

  5. An Alternative Approach to "Identification of Unknowns": Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria.

    PubMed

    Martinez-Vaz, Betsy M; Denny, Roxanne; Young, Nevin D; Sadowsky, Michael J

    2015-12-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students' knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students' verification protocols and presentations. Posttest scores were higher than pretest scores at or below p = 0.001. Normalized learning gains (G) showed an improvement of students' knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001.

  6. An Alternative Approach to “Identification of Unknowns”: Designing a Protocol to Verify the Identities of Nitrogen Fixing Bacteria†

    PubMed Central

    Martinez-Vaz, Betsy M.; Denny, Roxanne; Young, Nevin D.; Sadowsky, Michael J.

    2015-01-01

    Microbiology courses often include a laboratory activity on the identification of unknown microbes. This activity consists of providing students with microbial cultures and running biochemical assays to identify the organisms. This approach lacks molecular techniques such as sequencing of genes encoding 16S rRNA, which is currently the method of choice for identification of unknown bacteria. A laboratory activity was developed to teach students how to identify microorganisms using 16S rRNA polymerase chain reaction (PCR) and validate microbial identities using biochemical techniques. We hypothesized that designing an experimental protocol to confirm the identity of a bacterium would improve students’ knowledge of microbial identification techniques and the physiological characteristics of bacterial species. Nitrogen-fixing bacteria were isolated from the root nodules of Medicago truncatula and prepared for 16S rRNA PCR analysis. Once DNA sequencing revealed the identity of the organisms, the students designed experimental protocols to verify the identity of rhizobia. An assessment was conducted by analyzing pre- and posttest scores and by grading students’ verification protocols and presentations. Posttest scores were higher than pretest scores at or below p = 0.001. Normalized learning gains (G) showed an improvement of students’ knowledge of microbial identification methods (LO4, G = 0.46), biochemical properties of nitrogen-fixing bacteria (LO3, G = 0.45), and the events leading to the establishment of nitrogen-fixing symbioses (LO1&2, G = 0.51, G = 0.37). An evaluation of verification protocols also showed significant improvement with a p value of less than 0.001. PMID:26753033

  7. Design of engineered cementitious composites for ductile seismic resistant elements

    NASA Astrophysics Data System (ADS)

    Kanda, Tetsushi

    This dissertation focuses on designing Engineered Cementitious Composites (ECCs) to achieve high-performance seismic-resistant elements. To attain this goal, three major tasks were accomplished. Task 1 aimed at achieving new ECCs based on low-cost fibers, which often rupture in crack bridging; these are thus named "fiber-rupture-type ECCs". Achieving the new ECCs required a new, practical, and comprehensive composite design theory. For this theory, single-fiber behavior was first investigated: fiber rupture in the composite and the chemical bond at the fiber/matrix interface were experimentally examined and mathematically modeled. This model of single-fiber behavior was then implemented in a proposed bridging law, a theoretical model of the relationship between the fiber bridging stress of the composite and the crack opening displacement (COD). The new bridging law was finally employed to establish a new composite design theory. Task 2 facilitated the structural interpretation of the ECC material behavior investigated in Task 1. For this purpose, uniaxial tensile behavior, one of the most important ECC properties, was theoretically characterized via the stress-strain relation from a micromechanics viewpoint. A theory is proposed that expresses the ECC tensile stress-strain relation in terms of micromechanics parameters of the composite, such as bond strengths. Task 3 demonstrates an integrated design scheme for ductile seismic elements that spans from micromechanics at the single-fiber level to structural design tools such as nonlinear FEM analysis. The significance of this design scheme is that the influence of the ECC microstructure on an element's structural performance is quantitatively captured. This means that a powerful tool is obtained for tailoring constitutive micromechanics parameters in order to maximize the structural performance of elements. While the tool is still preliminary, completing it in future studies will enable one to

  8. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

    The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions, and is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes. Users specify the design code of interest, the location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted so that they can be included directly in a final engineering report. In addition to single-site analysis, the application supports a batch mode for the simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available that allows other developers to integrate this application's results into larger applications for additional processing. Development has proceeded in an iterative manner, working with engineers through email, meetings, and workshops; each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process, and it is now used to produce over 1800 reports daily. Recent efforts have made the application a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates the incorporation of user feedback to add new features, ensuring the application's continued success.

  9. Design and application of an electromagnetic vibrator seismic source

    USGS Publications Warehouse

    Haines, S.S.

    2006-01-01

    Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. To address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to ~10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well suited to teaching applications, and to surveys requiring a precisely repeatable source signature.

  10. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    SciTech Connect

    Not Available

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. The effects of current regulatory requirements on piping design are evaluated, and recommendations for immediate licensing action, changes in existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  11. Estimation of Characteristic Period for Energy Based Seismic Design

    SciTech Connect

    Hancloglu, Baykal; Polat, Zekeriya; Kircil, Murat Serdar

    2008-07-08

    Estimating input energy using approximate methods has always been a considerable research topic in energy-based seismic design, and several approaches for estimating the energy input to SDOF systems have been proposed over the last decades. The characteristic period is the key parameter of most of these approaches; it is defined as the period at which the peak value of the input energy occurs. In this study an equation is proposed for estimating the characteristic period, based on an extensive earthquake ground motion database that includes a total of 268 far-field records (two horizontal components from each of 134 recording stations located on both soft and firm soil sites). Statistical regression analyses are performed to develop an equation in terms of a number of structural parameters, and the developed equation is found to yield satisfactory results when compared with characteristic periods calculated from time history analyses of SDOF systems.
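The input energy that these approximate methods estimate can be computed exactly for a linear SDOF system by time-history analysis: integrate the equation of motion and accumulate the relative input energy E_I = -m ∫ üg v dt. A minimal Newmark average-acceleration sketch, with an energy-balance check (E_I should equal kinetic + strain + damping energy) as a built-in sanity test; the damping ratio and mass defaults are illustrative assumptions:

```python
import math

def sdof_energy(ag, dt, period, zeta=0.05, m=1.0):
    """Linear SDOF under ground acceleration ag (m/s^2 samples at step dt),
    integrated with Newmark average acceleration (beta=1/4, gamma=1/2).
    Returns input, kinetic, strain, and damping-dissipated energies."""
    wn = 2.0 * math.pi / period
    k, c = m * wn * wn, 2.0 * zeta * m * wn
    beta, gamma = 0.25, 0.5
    keff = k + gamma * c / (beta * dt) + m / (beta * dt * dt)
    u = v = 0.0
    a = -ag[0]                      # initial relative acceleration
    e_in = e_damp = 0.0
    for i in range(1, len(ag)):
        p = -m * ag[i]              # effective earthquake load
        rhs = (p
               + m * (u / (beta * dt * dt) + v / (beta * dt)
                      + a * (1.0 / (2.0 * beta) - 1.0))
               + c * (gamma * u / (beta * dt)
                      + v * (gamma / beta - 1.0)
                      + dt * a * (gamma / (2.0 * beta) - 1.0)))
        u_new = rhs / keff
        a_new = ((u_new - u) / (beta * dt * dt)
                 - v / (beta * dt) - a * (1.0 / (2.0 * beta) - 1.0))
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        # trapezoidal accumulation of input and damping energies
        e_in += -m * 0.5 * (ag[i - 1] * v + ag[i] * v_new) * dt
        e_damp += c * 0.5 * (v * v + v_new * v_new) * dt
        u, v, a = u_new, v_new, a_new
    return {"input": e_in, "kinetic": 0.5 * m * v * v,
            "strain": 0.5 * k * u * u, "damping": e_damp}
```

Sweeping `period` over a range and locating the peak of `e_in` gives the characteristic period for a given record, which is the quantity the proposed regression equation estimates.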

  12. Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads

    SciTech Connect

    Klamerus, E.W.; Bohn, M.P.; Johnson, J.J.; Asfura, A.P.; Doyle, D.J.

    1994-02-01

    Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

  13. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is the notion that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set, which engineers then demonstrate are satisfied for their structure, typically through elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early 1990s, the seismic engineering community has come to recognise numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect its seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings. 
The PBEE framework

  14. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    SciTech Connect

    Yoshida, Nozomu

    2008-07-08

    The common engineering assumption that the incident wave at the engineering seismic base layer is uniform over a widespread area is shown to be incorrect. An illustrative example is first presented, showing that the earthquake motion at the ground surface evaluated by an analysis that treats the ground from the seismic bedrock to the ground surface as a single system (continuous analysis) differs from that obtained when the ground is separated at the engineering seismic base layer and each part is analyzed separately (separate analysis). The reason is investigated through several approaches. An eigenvalue analysis indicates that the first predominant period of the continuous analysis cannot be found in the separate analysis, and that the higher-order predominant periods of the upper and lower ground do not match in the separate analysis. Earthquake response analysis shows that the reflected wave at the engineering seismic base layer is not zero, meaning that the conventional engineering seismic base layer does not behave as the term 'base' suggests. All these results indicate that waves which travel down to depth after reflecting in the surface layer, and which reflect again at the seismic bedrock, cannot be neglected when evaluating the response at the ground surface. In other words, the interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.

  15. A new methodology for energy-based seismic design of steel moment frames

    NASA Astrophysics Data System (ADS)

    Mezgebo, Mebrahtom Gebrekirstos; Lui, Eric M.

    2017-01-01

    A procedure is proposed whereby input and hysteretic energy spectra developed for single-degree-of-freedom (SDOF) systems are applied to multi-degree-of-freedom (MDOF) steel moment resisting frames. The proposed procedure is verified using four frames of three, five, seven and nine stories, each subjected to the fault-normal and fault-parallel components of three actual earthquakes. Very good estimates are obtained for the three- and five-story frames, and reasonably acceptable estimates for the seven- and nine-story frames. A method for distributing the hysteretic energy over the frame height is also proposed. This distribution scheme allows the energy demand component of a proposed energy-based seismic design (EBSD) procedure to be determined for each story. To address the capacity component of EBSD, a story-wise optimization design procedure is developed by utilizing the energy dissipating capacity from plastic hinge formation/rotation in these moment frames. The proposed EBSD procedure is demonstrated in the design of a three-story one-bay steel moment frame.

  16. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  17. Design and development of digital seismic amplifier recorder

    SciTech Connect

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-16

    Digital seismic recording is a technique for recording seismic data in digital form; it is more convenient and more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors/geophones with a natural frequency of 4.5 Hz. The signal is amplified by 12 non-inverting amplifier units, each built around an IC 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of about 1,000 times. The amplified signal is then converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was also used to control the ADC. The qualitative analysis showed that this signal conditioning produces a large output, so the data obtained are better than conventional data. This application can be used for geophysical methods with low input voltages, such as microtremor surveys.
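The reported amplification follows from the standard ideal non-inverting amplifier relation G = 1 + Rf/Rg; with the stated 1 MΩ and 1 kΩ resistors this gives 1001, i.e. roughly the 1,000x figure quoted. A one-line check:

```python
def noninverting_gain(r_feedback, r_ground):
    """Ideal non-inverting op-amp gain: G = 1 + Rf / Rg."""
    return 1.0 + r_feedback / r_ground

gain = noninverting_gain(1e6, 1e3)   # 1 MΩ feedback, 1 kΩ to ground -> 1001
```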

  18. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic-resistant design. Though these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging; one of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) is currently facing.

  19. Experimental investigation of damage behavior of RC frame members including non-seismically designed columns

    NASA Astrophysics Data System (ADS)

    Chen, Linzhi; Lu, Xilin; Jiang, Huanjun; Zheng, Jianbo

    2009-06-01

    Reinforced concrete (RC) frame structures are one of the most commonly used structural systems, and their seismic performance is largely determined by the performance of their columns and beams. This paper describes horizontal cyclic loading tests of ten column and three beam specimens, some designed according to the current seismic design code and others according to the earlier non-seismic Chinese design code, with the aim of reproducing the behavior of the damaged or collapsed RC frame structures observed during the Wenchuan earthquake. The effects of axial load ratio, shear span ratio, and transverse and longitudinal reinforcement ratios on hysteresis behavior, ductility and damage progression were incorporated in the experimental study. Test results indicate that the non-seismically designed columns show premature shear failure, larger maximum residual crack widths and more concrete spalling than the seismically designed columns; in addition, their longitudinal reinforcement bars buckled severely. The axial load ratio and shear span ratio proved to be the most important factors affecting ductility, crack opening width and crack closing ability, while the longitudinal reinforcement ratio had only a minor effect on column ductility but a greater influence on beam ductility. Finally, the transverse reinforcement ratio did not influence the maximum residual crack width or the crack closing ability of the seismically designed columns.

  20. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for structural upgrade with the help of added devices, the objective being minimization of the structural displacement response. An application of this structural seismic response research is presented in a base-isolation example.

  1. Effective Parameters on Seismic Design of Rectangular Underground Structures

    SciTech Connect

    Amiri, G. Ghodrati; Maddah, N.; Mohebi, B.

    2008-07-08

    Underground structures are a significant part of transportation in modern society and, in seismic zones, must withstand both seismic and static loading. Embedded structures should conform to ground deformations during an earthquake, so an accurate evaluation of the structure-to-ground distortion is critical. Several two-dimensional finite difference models are used to identify the parameters affecting the racking ratio (structure-to-ground distortion), including the flexibility ratio, various cross sections, embedment depth, and the Poisson's ratio of the soil. Results show that the influence of different cross sections by themselves is negligible, but embedment depth, together with the flexibility ratio and Poisson's ratio, is a consequential parameter. A comparison with the pseudo-static method (simplified frame analysis) is also performed. The results show that for a structure stiffer than the soil, the racking ratio decreases as the depth of burial decreases; on the other hand, shallow flexible structures can suffer up to 30 percent greater distortion than deeper ones.
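The two ratios this abstract revolves around can be written down directly. The flexibility-ratio definition below follows one common formulation (after Wang, 1993); the symbol names and example numbers are illustrative, not taken from the paper:

```python
def flexibility_ratio(g_soil, width, height, racking_stiffness):
    """F = (G * W) / (S * H): relative stiffness of the ground and the
    structure, with S the force per unit length needed for a unit racking
    deflection (one common definition, after Wang, 1993)."""
    return (g_soil * width) / (racking_stiffness * height)

def racking_ratio(delta_structure, delta_free_field):
    """R = structure racking distortion / free-field shear distortion."""
    return delta_structure / delta_free_field

# F > 1: structure more flexible than the ground, so R typically exceeds 1.
f = flexibility_ratio(50e3, 8.0, 4.0, 20e3)      # illustrative units: kPa, m
r = racking_ratio(0.0016, 0.002)                 # stiff structure: R < 1
```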

  2. Seismic design factors for RC special moment resisting frames in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

    2011-12-01

    This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16 and 32 stories in Dubai, UAE, utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated, namely the response modification factor (R), deflection amplification factor (Cd), system overstrength factor (Ωo), and response modification factor for ductility (Rd), in addition to the inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50); consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ωo factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) is to be expected in moment frames of varying heights.
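The factors evaluated here are, in common usage, simple ratios of elastic, yield, and design-level response quantities. The definitions sketched below are the generic ones (code definitions differ in detail), and the numbers are placeholders rather than the paper's results:

```python
def seismic_design_factors(v_elastic, v_yield, v_design,
                           drift_inelastic, drift_elastic_design):
    """Generic factor definitions (codes differ in detail):
      Omega_o = Vy / Vd          (overstrength)
      R_d     = Ve / Vy          (ductility reduction)
      R       = R_d * Omega_o = Ve / Vd
      C_d     = inelastic drift / elastic drift under design forces
    """
    omega_o = v_yield / v_design
    r_d = v_elastic / v_yield
    return {"Omega_o": omega_o, "R_d": r_d, "R": r_d * omega_o,
            "C_d": drift_inelastic / drift_elastic_design}

# Placeholder base shears (kN) and drifts (m), not the paper's results.
factors = seismic_design_factors(8000.0, 3000.0, 1000.0, 0.02, 0.004)
```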

  3. Development and implementation of seismic design and evaluation criteria for NIF

    SciTech Connect

    Sommer, S.C.; MacCalden, P.B.

    1998-03-17

    The National Ignition Facility (NIF) is being built at the Lawrence Livermore National Laboratory (LLNL) as an international research center for inertial confinement fusion (ICF). This paper will provide an overview of NIF, review the NIF seismic criteria, and briefly discuss seismic analyses of NIF optical support structures that have been performed by LLNL and the Ralph M. Parsons Company, the Architect and Engineer (A&E) for NIF. The NIF seismic design and evaluation criteria are based on provisions in DOE Standard 1020 (DOE-STD-1020), the Uniform Building Code (UBC), and the LLNL Mechanical Engineering Design Safety Standards (MEDSS). Different levels of seismic requirements apply to NIF structures, systems, and components (SSCs) based on their function. The highest level of requirements is defined for optical support structures and for SSCs that could influence the performance of optical support structures, while the minimum level corresponds to the Performance Category 2 (PC2) requirements in DOE-STD-1020. To demonstrate that the NIF seismic criteria are satisfied, structural analyses have been performed by LLNL and Parsons to evaluate the responses of optical support structures and other SSCs to seismic-induced forces.

  4. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  5. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    PubMed Central

    Veladi, H.

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm. PMID:25202717

  6. Performance-based seismic design of steel frames utilizing colliding bodies algorithm.

    PubMed

    Veladi, H

    2014-01-01

    A pushover analysis method based on semirigid connection concept is developed and the colliding bodies optimization algorithm is employed to find optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to the conventional design methods to show the power or weakness of the algorithm.

  7. Evaluation of collapse resistance of RC frame structures for Chinese schools in seismic design categories B and C

    NASA Astrophysics Data System (ADS)

    Tang, Baoxin; Lu, Xinzheng; Ye, Lieping; Shi, Wei

    2011-09-01

    According to the Code for Seismic Design of Buildings (GB50011-2001), ten typical reinforced concrete (RC) frame structures, used as school classroom buildings, are designed with different seismic fortification intensities (SFIs) (SFI=6 to 8.5) and different seismic design categories (SDCs) (SDC=B and C). The collapse resistance of the frames with SDC=B and C, in terms of collapse fragility curves, is quantitatively evaluated and compared via incremental dynamic analysis (IDA). The results show that the collapse resistance of structures should be evaluated based on both the absolute seismic resistance and the corresponding design seismic intensity. For the frames with SFI from 6 to 7.5, because they have relatively low absolute seismic resistance, their collapse resistance is insufficient even when their corresponding SDCs are upgraded from B to C. Thus, further measures are needed to enhance these structures, and some suggestions are proposed.
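The collapse fragility curves produced by IDA are typically obtained by fitting a lognormal distribution to the collapse-level intensities of the record set. A minimal sketch of that fit (the intensity values are illustrative, not the paper's data):

```python
import math

def collapse_fragility(collapse_ims):
    """Lognormal fit to collapse-level intensities from IDA:
    median = exp(mean(ln IM)), beta = sample std of ln IM."""
    logs = [math.log(x) for x in collapse_ims]
    n = len(logs)
    mu = sum(logs) / n
    beta = math.sqrt(sum((l - mu) ** 2 for l in logs) / (n - 1))
    return math.exp(mu), beta

def p_collapse(im, median, beta):
    """Fragility curve: lognormal CDF evaluated at intensity im."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

# Illustrative collapse intensities (Sa in g) for a handful of records.
median, beta = collapse_fragility([0.41, 0.55, 0.62, 0.80, 1.05])
```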

  8. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  9. Designing linings of mutually influencing parallel shallow circular tunnels under seismic effects of earthquake

    NASA Astrophysics Data System (ADS)

    Sammal, A. S.; Antsiferov, S. V.; Deev, P. V.

    2016-09-01

    The paper deals with the seismic design of parallel shallow tunnel linings, based on identifying the most unfavorable lining stress states under long longitudinal and shear seismic waves propagating through the tunnel cross section in different directions and combinations. For this purpose, the sums and differences of the normal tangential stresses on the lining's internal outline caused by the two wave types are examined for extrema with respect to the angle of incidence. The method allows analytic plotting of a curve illustrating the structure's stresses. The paper gives an example design calculation.

  10. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. Because of the
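The interbed shear-wave estimate described above reduces to dividing the measured Vp by an assumed Vp/Vs ratio; the values below are hypothetical, not the study's measurements:

```python
def vs_from_vp(vp, vp_over_vs):
    """Shear-wave velocity from compressional velocity and an assumed Vp/Vs."""
    return vp / vp_over_vs

# Hypothetical interbed values: Vp measured, Vp/Vs bracketed from analogues
# in similar materials, giving a range of possible Vs.
vp = 1800.0                                              # m/s, illustrative
vs_range = (vs_from_vp(vp, 2.5), vs_from_vp(vp, 1.8))    # low to high estimate
```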

  11. Estimation of Cyclic Interstory Drift Capacity of Steel Framed Structures and Future Applications for Seismic Design

    PubMed Central

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E.; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions. PMID:25089288
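Energy-based damage indices of the kind discussed here typically combine the peak displacement demand with the cumulative hysteretic energy. The Park-Ang-style form below is a generic illustration with placeholder numbers, not necessarily the exact model used by the authors:

```python
def damage_index(max_disp, ult_disp, hysteretic_energy, yield_force, beta=0.15):
    """Park-Ang-style index: a peak-displacement term plus a cumulative
    hysteretic-energy term; DI >= 1.0 is conventionally taken as failure.
    (Generic illustration; all parameter values are placeholders.)"""
    return (max_disp / ult_disp
            + beta * hysteretic_energy / (yield_force * ult_disp))

di = damage_index(0.06, 0.10, 40.0, 200.0)   # = 0.6 + 0.3 = 0.9 here
```

The cumulative-energy term is what lets such an index penalize long-duration ground motions that a pure peak-drift measure would treat the same as a short one.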

  12. Estimation of cyclic interstory drift capacity of steel framed structures and future applications for seismic design.

    PubMed

    Bojórquez, Edén; Reyes-Salazar, Alfredo; Ruiz, Sonia E; Terán-Gilmore, Amador

    2014-01-01

    Several studies have been devoted to calibrate damage indices for steel and reinforced concrete members with the purpose of overcoming some of the shortcomings of the parameters currently used during seismic design. Nevertheless, there is a challenge to study and calibrate the use of such indices for the practical structural evaluation of complex structures. In this paper, an energy-based damage model for multidegree-of-freedom (MDOF) steel framed structures that accounts explicitly for the effects of cumulative plastic deformation demands is used to estimate the cyclic drift capacity of steel structures. To achieve this, seismic hazard curves are used to discuss the limitations of the maximum interstory drift demand as a performance parameter to achieve adequate damage control. Then the concept of cyclic drift capacity, which incorporates information of the influence of cumulative plastic deformation demands, is introduced as an alternative for future applications of seismic design of structures subjected to long duration ground motions.

  13. Verifying Ballast Water Treatment Performance

    EPA Science Inventory

    The U.S. Environmental Protection Agency, NSF International, Battelle, and U.S. Coast Guard are jointly developing a protocol for verifying the technical performance of commercially available technologies designed to treat ship ballast water for potentially invasive species. The...

  14. Multi Canister Overpack (MCO) Handling Machine Trolley Seismic Uplift Constraint Design Loads

    SciTech Connect

    SWENSON, C.E.

    2000-03-09

    The MCO Handling Machine (MHM) trolley moves along the top of the MHM bridge girders on east-west oriented rails. To prevent trolley wheel uplift during a seismic event, passive uplift constraints are provided as shown in Figure 1-1. North-south trolley wheel movement is prevented by flanges on the trolley wheels. When the MHM is positioned over a Multi-Canister Overpack (MCO) storage tube, east-west seismic restraints are activated to prevent trolley movement during MCO handling. The active seismic constraints consist of a plunger, which is inserted into slots positioned along the tracks as shown in Figure 1-1. When the MHM trolley is moving between storage tube positions, the active seismic restraints are not engaged. The MHM has been designed and analyzed in accordance with ASME NOG-1-1995. The ALSTHOM seismic analysis (Reference 3) reported seismic uplift restraint loading and EDERER performed corresponding structural calculations. The ALSTHOM and EDERER calculations were performed with the east-west seismic restraints activated and the uplift restraints experiencing only vertical loading. In support of development of the CSB Safety Analysis Report (SAR), an evaluation of the MHM seismic response was requested for the case where the east-west trolley restraints are not engaged. For this case, the associated trolley movements would result in east-west lateral loads on the uplift constraints due to friction, as shown in Figure 1-2. During preliminary evaluations, questions were raised as to whether the EDERER calculations considered the latest ALSTHOM seismic analysis loads (See NCR No. 00-SNFP-0008, Reference 5). Further evaluation led to the conclusion that the EDERER calculations used appropriate vertical loading, but the uplift restraints would need to be re-analyzed and modified to account for lateral loading. The disposition of NCR 00-SNFP-0008 will track the redesign and modification effort. 
The purpose of this calculation is to establish bounding seismic

  15. 7 CFR 1792.103 - Seismic design and construction standards for new buildings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Seismic design and construction standards for new buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE (CONTINUED) COMPLIANCE WITH OTHER FEDERAL STATUTES, REGULATIONS, AND EXECUTIVE ORDERS...

  16. Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
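The risk-targeting adjustment rests on the risk integral: convolve a lognormal collapse fragility with the slope of the site's hazard curve to obtain the collapse probability, then adjust the design ground motion until a target risk is met. A minimal numerical sketch, in which the power-law hazard curve and the fragility parameters are hypothetical stand-ins:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def collapse_risk(hazard, median_capacity, beta, a_lo=0.01, a_hi=5.0, n=2000):
    """P(collapse) = integral of fragility(a) * |dH/da| da (midpoint rule),
    with H(a) the hazard curve and a lognormal collapse fragility."""
    total, da = 0.0, (a_hi - a_lo) / n
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        frag = normal_cdf(math.log(a / median_capacity) / beta)
        slope = (hazard(a + 0.5 * da) - hazard(a - 0.5 * da)) / da  # dH/da <= 0
        total += frag * (-slope) * da
    return total

def hazard(a):
    """Hypothetical power-law hazard curve (50-yr exceedance probability)."""
    return min(1.0, 1e-3 * a ** -2.0)

# Raising the median collapse capacity (a stronger design) lowers the risk.
risk_weak = collapse_risk(hazard, median_capacity=1.0, beta=0.6)
risk_strong = collapse_risk(hazard, median_capacity=1.5, beta=0.6)
```

Because hazard-curve shapes vary site to site, holding this integral fixed (rather than the 2%-in-50-years exceedance value) is precisely what shifts the mapped design values up or down by region.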

  17. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  18. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part I: Theoretical formulation

    NASA Astrophysics Data System (ADS)

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-01

    The insertion of steel braces equipped with dissipative devices proves to be very effective in enhancing the performance of a framed building under horizontal seismic loads. Multi-level design criteria were proposed according to the Performance-Based Design, in order to get, for a specific level of the seismic intensity, a designated performance objective of the building (e.g., an assigned damage level of either the framed structure or non-structural elements). In this paper a design procedure aiming to proportion braces with hysteretic dampers in order to attain, for a specific level of the seismic intensity, a designated performance level of the building is proposed. Specifically, a proportional stiffness criterion, which assumes the elastic lateral storey-stiffness due to the braces proportional to that of the unbraced frame, is combined with the Direct Displacement-Based Design, in which the design starts from target deformations. A computer code has been prepared for the nonlinear static and dynamic analyses, using a step-by-step procedure. Frame members and hysteretic dampers are idealized by bilinear models.

  19. Seismic design of circular-section concrete-lined underground openings: Preclosure performance considerations for the Yucca Mountain Site

    SciTech Connect

    Richardson, A.M.; Blejwas, T.E.

    1992-07-01

    Yucca Mountain, the potential site of a repository for high-level radioactive waste, is situated in a region of natural and man-made seismicity. Underground openings excavated at this site must be designed for worker safety in the seismic environment anticipated for the preclosure period. This includes accesses developed for site characterization regardless of the ultimate outcome of the repository siting process. Experience with both civil and mining structures has shown that underground openings are much more resistant to seismic effects than surface structures, and that even severe dynamic strains can usually be accommodated with proper design. This paper discusses the design and performance of lined openings in the seismic environment of the potential site. The types and ranges of possible ground motions (seismic loads) are briefly discussed. Relevant historical records of underground opening performance during seismic loading are reviewed. Simple analytical methods of predicting liner performance under combined in situ, thermal, and seismic loading are presented, and results of calculations are discussed in the context of realistic performance requirements for concrete-lined openings for the preclosure period. Design features that will enhance liner stability and mitigate the impact of the potential seismic load are reviewed. The paper is limited to preclosure performance concerns involving worker safety because present decommissioning plans specify maintaining the option for liner removal at seal locations, thus decoupling liner design from repository postclosure performance issues.

  20. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as is known, may play a significant role on the level, duration and frequency content of ground motion. Ridges and crests are often lost inside the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article of Massa et al. about topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results have highlighted the inadequacy of the national provisions concerning the definition of the seismic load at top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.

  1. Experimental Evaluation of the Failure of a Seismic Design Category - B Precast Concrete Beam-Column Connection System

    DTIC Science & Technology

    2014-12-01

    ERDC TR-14-12, December 2014. Experimental Evaluation of the Failure of a Seismic Design Category B Precast Concrete Beam-Column Connection System. …experiment to test a precast concrete beam-column system to failure. This experiment was designed to evaluate the performance of precast frame…

  2. Martian seismicity

    NASA Technical Reports Server (NTRS)

    Phillips, Roger J.; Grimm, Robert E.

    1991-01-01

    The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed from imaging experiments show that Mars must have been a seismically active planet in the past, and there is no reason to discount the notion that Mars is seismically active today, but at a lower level of activity. Models are explored for present-day Martian seismicity. Depending on the sensitivity and geometry of a seismic network and the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound, and a more refined estimate would take into account specifically the regional cooling of Tharsis and lead to a higher frequency of seismic events.

  3. Some considerations for establishing seismic design criteria for nuclear plant piping

    SciTech Connect

    Chen, W.P.; Chokshi, N.C.

    1997-01-01

    The Energy Technology Engineering Center (ETEC) is providing assistance to the U.S. NRC in developing regulatory positions on the seismic analysis of piping. As part of this effort, ETEC previously performed reviews of the ASME Code, Section III piping seismic design criteria as revised by the 1994 Addenda. These revised criteria were based on evaluations by the ASME Special Task Group on Integrated Piping Criteria (STGIPC) and the Technical Core Group (TCG) of the Advanced Reactor Corporation (ARC) of the earlier joint Electric Power Research Institute (EPRI)/NRC Piping & Fitting Dynamic Reliability (PFDR) program. Previous ETEC evaluations reported at the 23rd WRSM of seismic margins associated with the revised criteria are reviewed. These evaluations had concluded, in part, that although margins for the tuned PFDR tests appeared acceptable (>2), margins in detuned tests could be unacceptable (<1). This conclusion was based primarily on margin reduction factors (MRFs) developed by the ASME STGIPC and ARC/TCG from realistic analyses of PFDR test 36. This paper reports more recent results including: (1) an approach developed for establishing appropriate seismic margins based on PRA considerations, (2) independent assessments of frequency effects on margins, (3) the development of margins based on failure mode considerations, and (4) the implications of Code Section III rules for Section XI.

  4. Seismic design technology for breeder reactor structures. Volume 1. Special topics in earthquake ground motion

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This report is divided into twelve chapters: seismic hazard analysis procedures, statistical and probabilistic considerations, vertical ground motion characteristics, vertical ground response spectrum shapes, effects of inclined rock strata on site response, correlation of ground response spectra with intensity, intensity attenuation relationships, peak ground acceleration in the very near field, statistical analysis of response spectral amplitudes, contributions of body and surface waves, evaluation of ground motion characteristics, and design earthquake motions.

  5. Probabilistic seismic hazard characterization and design parameters for the Pantex Plant

    SciTech Connect

    Bernreuter, D. L.; Foxall, W.; Savy, J. B.

    1998-10-19

    The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr., 500 yr., 1000 yr., 2000 yr., 10,000 yr., and 100,000 yr.). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.
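
The return-period values quoted above correspond to reading the site hazard curve at an annual exceedance frequency of 1/T. A minimal sketch of that lookup, using a made-up power-law hazard curve rather than the Pantex study's results:

```python
import numpy as np

# Toy hazard curve for one spectral period: annual exceedance frequency vs PGA (g).
# The shape and constants are illustrative only, not the Pantex study values.
pga = np.logspace(-2, 0.3, 200)
afe = 2e-3 * (pga / 0.1) ** -2.2

def motion_at_return_period(T_years):
    """Interpolate the hazard curve (log-log) at annual frequency 1/T."""
    target = 1.0 / T_years
    # np.interp needs an increasing x-axis, so flip the (decreasing) frequency axis
    return float(np.exp(np.interp(np.log(target),
                                  np.log(afe[::-1]), np.log(pga[::-1]))))

for T in (100, 500, 1000, 2000, 10_000, 100_000):
    print(f"{T:>7} yr: {motion_at_return_period(T):.3f} g")
```

Repeating this lookup across spectral periods, one return period at a time, is what produces a uniform hazard spectrum of the kind computed in the study.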

  6. Seismic Evaluation and Preliminary Design of Regular Setback Masonry Infilled Open Ground Storey RC Frame

    NASA Astrophysics Data System (ADS)

    Hashmi, Arshad K.

    2016-06-01

    The current seismic code prescribes certain stringent criteria for classifying a frame as regular or irregular. These criteria, however, decide only the type of analysis (i.e. equivalent static analysis or dynamic analysis) to be performed. On the contrary, newly developed simplified methods such as pushover analysis can readily give the lateral load capacity of any structure (e.g. a regular or irregular frame). Iterative design with the help of pushover analysis for the serviceability requirement (i.e. the inter-storey drift limitation) of the present seismic code can provide an alternative to current practice. The present paper deals with a regular setback frame in combination with a vulnerable layout of masonry infill walls over the frame elevation (i.e. a probable case of "vertical stiffness irregularity"). Nonlinear time history analysis and the Capacity Spectrum Method have been implemented to investigate the seismic performance of these frames. Finally, a recently developed preliminary design procedure satisfying the serviceability criterion of inter-storey drift limitation has been employed for the preliminary design of these frames.

  7. Seismic analysis and design of buried pipelines for fault movement

    SciTech Connect

    Wang, L.R.L.; Yeh, Y.H.

    1984-06-01

    Lifelines, such as gas and oil transmission lines and water and sewer pipelines, have been damaged heavily in recent earthquakes. Damage to these lifelines has caused major, catastrophic disruption of essential services. Large, abrupt differential ground movements at an active fault present one of the most severe earthquake effects on a buried pipeline system. Although simplified analysis procedures have been proposed for buried pipelines across strike-slip fault zones causing tensile failure of the pipeline (called tensile strike-slip fault), the results are not accurate enough because of several assumptions involved. Furthermore, several other important failure mechanisms and parameters have not been investigated. This paper presents the analysis procedures and results for buried pipelines subjected to tensile strike-slip fault after modifying some of the assumptions used previously. Based on the analysis results, this paper also discusses the design criteria for buried pipelines subjected to various fault movements.

  8. IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER

    SciTech Connect

    OMBERG SK

    2008-05-14

    This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria from DOE-STD-1189-2008, Appendix A. The existing seismic design criteria and the new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available in early CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex that may be implementing DOE-STD-1189-2008.

  9. Decision making with epistemic uncertainty under safety constraints: An application to seismic design

    USGS Publications Warehouse

    Veneziano, D.; Agarwal, A.; Karaca, E.

    2009-01-01

    The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.

  10. Seismic design repair and retrofit strategies for steel roof deck diaphragms

    NASA Astrophysics Data System (ADS)

    Franquet, John-Edward

    Structural engineers will often rely on the roof diaphragm to transfer lateral seismic loads to the bracing system of single-storey structures. The implementation of capacity-based design in the NBCC 2005 has caused an increase in the diaphragm design load due to the need to use the probable capacity of the bracing system, thus resulting in thicker decks, closer connector patterns and higher construction costs. Previous studies have shown that accounting for the in-plane flexibility of the diaphragm when calculating the overall building period can result in lower seismic forces and a more cost-efficient design. However, recent studies estimating the fundamental period of single-storey structures using ambient vibration testing showed that the in-situ approximation was much shorter than that obtained using analytical means. The difference lies partially in the diaphragm stiffness characteristics, which have been shown to decrease under increasing excitation amplitude. Using the diaphragm as the energy-dissipating element in the seismic force resisting system has also been investigated, as this would take advantage of the diaphragm's ductility and limited overstrength; thus, lower capacity-based seismic forces would result. An experimental program on 21.0 m by 7.31 m diaphragm test specimens was carried out so as to investigate the dynamic properties of diaphragms, including the stiffness, ductility and capacity. The specimens consisted of 20 and 22 gauge panels with nailed frame fasteners and screwed sidelap connections, as well as a welded and button-punched specimen. Repair strategies for diaphragms that have previously undergone inelastic deformations were devised in an attempt to restore the original stiffness and strength and were then experimentally evaluated. Strength and stiffness experimental estimations are compared with those predicted with the Steel Deck Institute (SDI) method. A building design comparative study was also completed. This study looks at the

  11. Displacement-based seismic design of flat slab-shear wall buildings

    NASA Astrophysics Data System (ADS)

    Sen, Subhajit; Singh, Yogendra

    2016-06-01

    Flat slab system is becoming widely popular for multistory buildings due to its several advantages. However, the performance of flat slab buildings under earthquake loading is unsatisfactory due to their vulnerability to punching shear failure. Several national design codes provide guidelines for designing flat slab system under gravity load only. Nevertheless, flat slab buildings are also being constructed in high seismicity regions. In this paper, performance of flat slab buildings of various heights, designed for gravity load alone according to code, is evaluated under earthquake loading as per ASCE/SEI 41 methodology. Continuity of slab bottom reinforcement through column cage improves the performance of flat slab buildings to some extent, but it is observed that these flat slab systems are not adequate in high seismicity areas and need additional primary lateral load resisting systems such as shear walls. A displacement-based method is proposed to proportion shear walls as primary lateral load resisting elements to ensure satisfactory performance. The methodology is validated using design examples of flat slab buildings with various heights.
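
The displacement-based proportioning described above follows the general logic of Direct Displacement-Based Design: pick a target drift, convert it to an equivalent SDOF design displacement, and back out the effective stiffness and base shear the walls must supply. The sketch below uses that generic substitute-structure calculation with invented numbers and an assumed EC8-style damping reduction, not the paper's own design examples.

```python
from math import pi

# Direct displacement-based design of an equivalent SDOF substitute structure.
# All numbers are illustrative, not the paper's design examples.
m_eff = 800e3          # effective mass, kg
h_eff = 14.0           # effective height, m
drift_limit = 0.015    # target inter-storey drift ratio
delta_d = drift_limit * h_eff            # target design displacement, m

xi_eq = 0.12           # assumed equivalent viscous damping (elastic + hysteretic)
# Reduce the 5%-damped displacement spectrum (an EC8-type eta factor, assumed here)
eta = (0.10 / (0.05 + xi_eq)) ** 0.5
delta_corner, t_corner = 0.6 * eta, 4.0  # damped corner displacement (m), period (s)

# Effective period: the displacement spectrum is linear up to the corner period
t_eff = t_corner * delta_d / delta_corner
k_eff = 4.0 * pi ** 2 * m_eff / t_eff ** 2   # effective stiffness, N/m
v_base = k_eff * delta_d                     # design base shear, N
```

The resulting base shear is then distributed to the shear walls, which is the proportioning step the paper's methodology automates for flat slab buildings of various heights.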

  12. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 41, Public Contracts and Property Management, Vol. 3 (revised as of 2012-01-01). Federal Management Regulation, Real Property, Part 102-76, Design and Construction: What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?

  13. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 41, Public Contracts and Property Management, Vol. 3 (revised as of 2010-07-01). Federal Management Regulation, Real Property, Part 102-76, Design and Construction: What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?

  14. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 41, Public Contracts and Property Management, Vol. 3 (revised as of 2011-01-01). Federal Management Regulation, Real Property, Part 102-76, Design and Construction: What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?

  15. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 41, Public Contracts and Property Management, Vol. 3 (revised as of 2013-07-01). Federal Management Regulation, Real Property, Part 102-76, Design and Construction: What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?

  16. 41 CFR 102-76.30 - What seismic safety standards must Federal agencies follow in the design and construction of...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 41, Public Contracts and Property Management, Vol. 3 (revised as of 2014-01-01). Federal Management Regulation, Real Property, Part 102-76, Design and Construction: What seismic safety standards must Federal agencies follow in the design and construction of Federal facilities?

  17. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    SciTech Connect

    , R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a regional specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The
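
The logic-tree weighting quoted in the abstract amounts to a weighted mean of the branch hazard curves. The sketch below uses the abstract's 0.6/0.3/0.1 GMAM weights, but the hazard curves themselves are entirely made up for illustration:

```python
import numpy as np

# Logic-tree combination of hazard curves from alternative ground motion
# attenuation models (GMAMs). The curves are invented; only the 0.6 / 0.3 / 0.1
# weights come from the abstract's recommendation.
sa = np.logspace(-2, 0.5, 100)
curves = {
    "EPRI-2004":  1e-4 * (sa / 0.30) ** -2.8,
    "USGS-2002":  1e-4 * (sa / 0.25) ** -2.6,
    "Silva-2004": 1e-4 * (sa / 0.35) ** -3.0,
}
weights = {"EPRI-2004": 0.6, "USGS-2002": 0.3, "Silva-2004": 0.1}
assert abs(sum(weights.values()) - 1.0) < 1e-12  # weights must sum to one

# Mean hazard: weight the annual exceedance frequencies branch by branch
mean_hazard = sum(weights[k] * curves[k] for k in curves)
```

Weighting the exceedance frequencies directly (rather than the ground motion values) is the standard mean-hazard combination; the weighted curve always lies between the extreme branches at every shaking level.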

  18. Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Conrads, T.J.

    1993-06-01

    In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP) and this task was sponsored by DOE, Environmental Management. The TSEP is comprised of a number of consultants known for their knowledge of seismic ground motion and expertise in the analysis of structures, systems and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance to these guidelines throughout Title 1 and Title 2 design phases of the project as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects which might employ the TSEP guidelines.

  19. Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project

    SciTech Connect

    Subramanian, C.V.; King, J.L.; Perkins, D.M.; Mudd, R.W.; Richardson, A.M.; Calovini, J.C.; Van Eeckhout, E.; Emerson, D.O.

    1990-08-01

    This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

  20. Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system

    USGS Publications Warehouse

    Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.

    1972-01-01

    The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.

  1. AP1000® design robustness against extreme external events - Seismic, flooding, and aircraft crash

    SciTech Connect

    Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J.

    2012-07-01

    Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima-Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000® nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance and safety. The plant's compact safety related footprint and protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shutdown the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond design basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air or HVAC). The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel containment vessel.

  2. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  3. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    NASA Astrophysics Data System (ADS)

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-01

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full strength joints with concrete filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective to evaluate fire resistance of the connection already damaged by an earthquake. The experimental activity together with FE simulation demonstrated the adequacy of the advanced design methodology.

  4. Computational fluid dynamics verified the advantages of streamlined impeller design in improving flow patterns and anti-haemolysis properties of centrifugal pump.

    PubMed

    Qian, K X; Wang, F Q; Zeng, P; Ru, W M; Yuan, H Y; Feng, Z G

    2006-01-01

    Computational fluid dynamics (CFD) technology was applied to predict the flow patterns in the authors' streamlined blood pump and an American bio-pump with straight vanes and shroud, respectively. Meanwhile, comparative haemolysis tests of the two pumps were performed to verify the theoretical analysis. The results revealed that the flow patterns in the streamlined impeller are coincident with its logarithmic vanes and parabolic shroud, and there is neither separated flow nor impact in the authors' pump. In the bio-pump, the main flow has the form of a logarithmic spiral in vertical section and a parabola in cross section, thus there are both stagnation and swirl between the main flow and the straight vanes and shroud. The comparative haemolysis tests demonstrated that the authors' pump has an index of haemolysis of 0.030, less than that of the bio-pump (0.065).

  5. Optimal seismic design of reinforced concrete structures under time-history earthquake loads using an intelligent hybrid algorithm

    NASA Astrophysics Data System (ADS)

    Gharehbaghi, Sadjad; Khatibinia, Mohsen

    2015-03-01

    A reliable seismic-resistant design of structures is achieved in accordance with the seismic design codes by designing structures under seven or more pairs of earthquake records. Based on the recommendations of seismic design codes, the average time-history response (ATHR) of the structure is required. This paper focuses on the optimal seismic design of reinforced concrete (RC) structures against ten earthquake records using a hybrid of a particle swarm optimization algorithm and an intelligent regression model (IRM). In order to reduce the computational time of the optimization procedure due to the cost of time-history analyses, the IRM is proposed to accurately predict the ATHR of structures. The proposed IRM combines the subtractive algorithm (SA), the K-means clustering approach and a wavelet weighted least squares support vector machine (WWLS-SVM). To predict the ATHR of structures, first, the input-output samples of structures are classified by SA and the K-means clustering approach. Then, a WWLS-SVM is trained with few samples and high accuracy for each cluster. 9- and 18-storey RC frames are designed optimally to illustrate the effectiveness and practicality of the proposed IRM. The numerical results demonstrate the efficiency and computational advantages of the IRM for optimal design of structures subjected to time-history earthquake loads.
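    The cluster-then-regress structure of such a surrogate can be sketched in a few lines. The snippet below is a hypothetical stand-in: plain k-means and per-cluster linear least squares replace the paper's SA/K-means/WWLS-SVM pipeline, and the data are purely illustrative.

```python
import numpy as np

def kmeans(X, init_centers, iters=20):
    """Simple k-means: assign samples to nearest center, recompute centers."""
    centers = init_centers.copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(0) for j in range(len(centers))])
    return labels, centers

def fit_per_cluster(X, y, labels, k):
    """Fit one least-squares model (with bias term) per cluster."""
    A = np.hstack([X, np.ones((len(X), 1))])
    return [np.linalg.lstsq(A[labels == j], y[labels == j], rcond=None)[0]
            for j in range(k)]

def predict(x, centers, models):
    """Route a query to its nearest cluster's model."""
    j = int(np.argmin(((centers - x) ** 2).sum(-1)))
    return float(np.append(x, 1.0) @ models[j])

# Toy data: two well-separated regimes with different linear responses
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)), rng.normal(5.0, 0.3, (100, 2))])
y = np.concatenate([X[:100] @ [1.0, 2.0], X[100:] @ [3.0, -1.0] + 4.0])
labels, centers = kmeans(X, init_centers=X[[0, 100]])
models = fit_per_cluster(X, y, labels, 2)
print(round(predict(np.array([5.0, 5.0]), centers, models), 3))  # -> 14.0
```

    Routing each query to a locally trained model is what lets the surrogate stay accurate with few training samples per cluster, which is the computational point the abstract makes.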

  6. The optimum design of time delay in time-domain seismic beam-forming based on receiver array

    NASA Astrophysics Data System (ADS)

    Ge, L.; Jiang, T.; Xu, X.; Jia, H.; Yang, Z.

    2013-12-01

    Generally, it is hard to obtain high signal-to-noise ratio (SNR) data in seismic prospecting in a mining area, especially when noise in the field is strong. To improve the quality of seismic data from complicated ore bodies, we developed the Time-domain Seismic Beam-forming Based on Receiver Array (TSBBRA) method, which can extract a directional wave beam in any direction. Only when the direction parameter matches the direction of the waves reflected from the target body can the quality of the reflected seismic data be improved, so it is important to determine the direction of reflected waves from target bodies underground. In addition, previous studies have shown that the time delay parameter of TSBBRA controls the direction of the main beam, so the optimum design of this parameter is of great significance. The optimum design of the time delay is carried out in seismic pre-processing, which uses time-domain delay and sum to form the directional reflected seismic beam with the strongest energy for the specified receiving array. Firstly, we establish the velocity model according to the original seismic records and profiles of the assigned exploration area. Secondly, we simulate the propagation of seismic waves and the response of the receiver array with the finite-difference method. Then, we calculate the optimum beam direction for the assigned reflection targets and give directional diagrams. We then synthesize seismic records with a set of time delays using TSBBRA, plot the curves of energy versus time delay, and obtain the optimum time delay. The results are as follows: the optimum delay time is 1.125 ms, 0.625 ms and 0.500 ms for the reflected waves from the first, second and third targets, respectively. Besides, to analyze the performance of TSBBRA, we calculated the SNR of the reflected wave signal before and after TSBBRA processing for the given model. The result shows that SNR increased by 1.2-9.4 dB on average with TSBBRA. 
In conclusion, the optimum design
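    The delay-and-sum operation at the core of TSBBRA can be sketched compactly. The sampling rate and per-receiver moveout below are illustrative, not values from the study:

```python
import numpy as np

def delay_and_sum(traces, delays, dt):
    """Stack traces after removing per-receiver delays (seconds)."""
    beam = np.zeros(traces.shape[1])
    for trace, delay in zip(traces, delays):
        beam += np.roll(trace, -int(round(delay / dt)))  # advance by the delay
    return beam / len(traces)

dt = 0.5e-3                                   # 0.5 ms sampling
t = np.arange(400) * dt
pulse = np.exp(-((t - 0.05) / 0.004) ** 2)    # Gaussian wavelet at 50 ms
moveout = 1.125e-3                            # per-receiver delay, s
traces = np.stack([np.roll(pulse, int(round(i * moveout / dt)))
                   for i in range(4)])        # plane wave across 4 receivers
beam = delay_and_sum(traces, [i * moveout for i in range(4)], dt)
print(beam.max() > 0.99)  # delays matched: coherent stack restores the pulse
```

    When the assumed delays match the true moveout the wavelets stack coherently; a mismatched delay set smears the pulse and lowers the beam energy, which is why the energy-versus-delay curves described above locate the optimum.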

  7. On the Need for Reliable Seismic Input Assessment for Optimized Design and Retrofit of Seismically Isolated Civil and Industrial Structures, Equipment, and Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Martelli, Alessandro

    2011-01-01

    Based on the experience of recent violent earthquakes, the limits of the methods that are currently used for the definition of seismic hazard are becoming more and more evident to several seismic engineers. Considerable improvement is felt necessary not only for the seismic classification of the territory (for which the probabilistic seismic hazard assessment—PSHA—is generally adopted at present), but also for the evaluation of local amplification. With regard to the first item, among others, a better knowledge of fault extension and near-fault effects is judged essential. The aforesaid improvements are particularly important for the design of seismically isolated structures, which relies on displacement. Thus, such a design requires an accurate definition of the maximum value of displacement corresponding to the isolation period, and a reliable evaluation of the earthquake energy content at the low frequencies that are typical of isolated structures, for the site and ground of interest. These evaluations shall include possible near-fault effects even in the vertical direction; for the construction of high-risk plants and components and the retrofit of some cultural heritage, they shall be performed for earthquakes characterized by very long return periods. The design displacement shall not be underestimated, nor excessively overestimated, at least when using rubber bearings in the seismic isolation (SI) system. In fact, by decreasing transverse deformation of such SI systems below a certain value, their horizontal stiffness increases. Thus, should a structure (e.g. a civil defence centre, a masterpiece, etc.) protected in the aforesaid way be designed to withstand an unnecessarily large earthquake, the behaviour of its SI system will be inadequate (i.e. it will be too stiff) during much more frequent events, which may really strike the structure during its life. Furthermore, since SI can be used only when the room available to the structure

  8. Spatial correlation analysis of seismic noise for STAR X-ray infrastructure design

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Agostino, Raffaele; Festa, Lorenzo; Gervasi, Anna; Guerra, Ignazio; Palmer, Dennis T.; Serafini, Luca

    2014-05-01

    The Italian PON MaTeRiA project is focused on the creation of a research infrastructure open to users based on an innovative and evolutionary X-ray source. This source, named STAR (Southern Europe TBS for Applied Research), exploits the Thomson backscattering process of laser radiation by fast-electron beams (Thomson Back Scattering - TBS). Its main performance figures are: X-ray photon flux 10^9-10^10 ph/s, angular divergence variable between 2 and 10 mrad, X-ray energy continuously variable between 8 keV and 150 keV, bandwidth ΔE/E variable between 1 and 10%, and a ps time-resolved structure. In order to achieve this performance, bunches of electrons produced by a photo-injector are accelerated to relativistic velocities by a linear accelerator section. The electron beam, a few hundred micrometers wide, is driven by magnetic fields to the interaction point along a 15 m transport line, where it is focused into a 10-micrometer-wide area. In the same area, the laser beam is focused after being transported along a 12 m structure. Ground vibrations could greatly affect the collision probability and thus the emittance by deviating the paths of the beams during their travel in the STAR source. Therefore, the study program to measure ground vibrations at the STAR site can be used for site characterization in relation to accelerator design. The environmental and facility noise may affect the X-ray operation especially if the predominant wavelengths in the microtremor wavefield are much smaller than the size of the linear accelerator. For wavelengths much greater, all the accelerator parts move in phase, and therefore even large displacements cannot generate any significant effect. On the other hand, for wavelengths equal to or less than half the accelerator size, several parts could move in phase opposition, and therefore small displacements could affect its proper functioning. 
    Therefore, it is important to characterize the microtremor wavefield in both the frequency and wavelength domains
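    The in-phase argument above reduces to comparing the microtremor wavelength λ = c/f with the accelerator span. The phase velocities and frequencies below are hypothetical numbers chosen only to illustrate the two regimes:

```python
# Back-of-envelope check of the phase argument: a disturbance with phase
# velocity c (m/s) and frequency f (Hz) has wavelength c / f, which is
# compared with the ~15 m transport line. All numbers are illustrative.
def wavelength(c_m_s, f_hz):
    return c_m_s / f_hz

span = 15.0                          # transport-line length, m
lam_low = wavelength(400.0, 10.0)    # 400 m/s at 10 Hz -> 40 m
lam_high = wavelength(150.0, 25.0)   # 150 m/s at 25 Hz -> 6 m
print(lam_low > 2 * span)   # True: parts move nearly in phase, small net effect
print(lam_high <= span / 2) # True: parts can move in opposition
```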

  9. UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

    SciTech Connect

    A. VELASCO; ET AL

    2001-02-01

    The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur in the determination of the timing of the arrivals, whether these errors are made by automated processing or by an analyst. One of the major obstacles encountered in properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses possible factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy that can be used in this study. We have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase pick measurement error model into our statistical model. We can use this statistical model to optimally calibrate a picking error model to regional data. A follow-on report will present the results of this analysis plan applied to an implementation of an experiment/data-gathering task.

  10. Definition of Verifiable School IPM

    EPA Pesticide Factsheets

    EPA is promoting the use of verifiable school IPM, an approach comprising several documented elements, including pest identification, action thresholds, monitoring, and effective pest control.

  11. Displacement-Based Seismic Design Procedure for Framed Buildings with Dissipative Braces Part II: Numerical Results

    NASA Astrophysics Data System (ADS)

    Mazza, Fabio; Vulcano, Alfonso

    2008-07-01

    For a widespread application of dissipative braces to protect framed buildings against seismic loads, practical and reliable design procedures are needed. In this paper a design procedure based on the Direct Displacement-Based Design approach is adopted, assuming the elastic lateral storey-stiffness of the damped braces proportional to that of the unbraced frame. To check the effectiveness of the design procedure, presented in an associated paper, a six-storey reinforced concrete plane frame, representative of a medium-rise symmetric framed building, is considered as the primary test structure; this structure, designed in a medium-risk region, is supposed to be retrofitted as in a high-risk region, by insertion of diagonal braces equipped with hysteretic dampers. A numerical investigation is carried out to study the nonlinear static and dynamic responses of the primary and the damped braced test structures, using step-by-step procedures described in the associated paper mentioned above; the behaviour of frame members and hysteretic dampers is idealized by bilinear models. Real and artificial accelerograms, matching the EC8 response spectrum for a medium soil class, are considered for dynamic analyses.

  12. Simulation of complete seismic surveys for evaluation of experiment design and processing

    SciTech Connect

    Oezdenvar, T.; McMechan, G.A.; Chaney, P.

    1996-03-01

    Synthesis of complete seismic survey data sets allows analysis and optimization of all stages in an acquisition/processing sequence. The characteristics of available survey designs, parameter choices, and processing algorithms may be evaluated prior to field acquisition to produce a composite system in which all stages have compatible performance; this maximizes the cost effectiveness for a given level of accuracy, or for targets with specific characteristics. Data sets synthesized for three salt structures provide representative comparisons of time and depth migration, post-stack and prestack processing, and illustrate effects of varying recording aperture and shot spacing, iterative focusing analysis, and the interaction of migration algorithms with recording aperture. A final example demonstrates successful simulation of both 2-D acquisition and processing of a real data line over a salt pod in the Gulf of Mexico.

  13. Some issues in the seismic design of nuclear power-plant facilities

    SciTech Connect

    Hadjian, A.H.; Iwan, W.D.

    1980-09-01

    This paper summarizes the major issues discussed by an international panel of experts during the post-SMIRT (Structural Mechanics in Reactor Technology) Seminar on Extreme Load Design of Nuclear Power-Plant Facilities, which was held in Berlin, Aug. 20-21, 1979. The emphasis of the deliberations was on the state of the art of seismic-response calculations to predict the expected performance of structures and equipment during earthquakes. Four separate panels discussed issues on (1) soil-structure interaction and structural response, (2) modeling, materials, and boundary conditions, (3) damping in structures and equipment, and (4) fragility levels of equipment. The international character of the seminar was particularly helpful in the cross-pollination of ideas regarding the issues and the steps required to enhance the cause of safety of nuclear plants.

  14. Enhancement of Seismic Performance Using Shear Link Braces in a Building Designed Only for Gravity Loads

    NASA Astrophysics Data System (ADS)

    Maniyar, S. U.; Paul, D. K.

    2012-02-01

    The present work attempts to study the behaviour of a building designed for gravity loads only under the effect of lateral seismic load. Such a building is generally deficient against lateral forces and needs to be retrofitted against lateral earthquake forces. A retrofitting scheme providing an aluminium shear link with chevron braces is suggested to improve its performance. Past earthquakes have shown a great deal of damage to deficient RC frame buildings designed without any consideration of lateral earthquake forces. Chevron braces with the aluminium shear link can be implemented as an effective retrofit measure. A comparison of the performance of the building initially designed for gravity load only with the building retrofitted using chevron braces with the aluminium shear link is presented in this paper. The behaviour of the building is worked out by performing nonlinear static pushover analysis and nonlinear time history analyses. A parametric study has also been carried out to study the effect of the shear link and braces on the retrofitted building. The performance of the RC building designed for gravity loads only, as evaluated from the nonlinear static pushover analysis, lies in the life safety and collapse prevention ranges for DBE and MCE levels of earthquake, respectively. The same building, when retrofitted using chevron braces with the aluminium shear link, shows improved performance. This device is very simple, economic, effective and can be placed in a building very easily. The dissipation of damaging energy/damage is localised in the shear link, which can be replaced after a major earthquake.

  15. Effects of charge design features on parameters of acoustic and seismic waves and cratering, for SMR chemical surface explosions

    NASA Astrophysics Data System (ADS)

    Gitterman, Y.

    2012-04-01

    A series of experimental on-surface shots was designed and conducted by the Geophysical Institute of Israel at Sayarim Military Range (SMR) in the Negev desert, including two large calibration explosions: about 82 tons of strong IMI explosives in August 2009, and about 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, CTBTO, USA and several European countries, with the main goal of providing fully controlled ground truth (GT0) infrasound sources in different weather/wind conditions, for calibration of IMS infrasound stations in Europe, the Middle East and Asia. Strong boosters and an upward charge detonation scheme were applied to provide a reduced energy release to the ground and an enlarged energy radiation to the atmosphere, producing enhanced infrasound signals for better observation at far-regional stations. The following observations and results indicate the required explosive-energy partition for this charge design: 1) crater size and local seismic (duration) magnitudes were found smaller than expected for these large surface explosions; 2) small test shots of the same charge (1 ton) conducted at SMR with different detonation directions showed clearly lower seismic amplitudes/energy and smaller crater size for the upward detonation; 3) many infrasound stations at local and regional distances showed higher than expected peak amplitudes, even after application of a wind-correction procedure. For the large-scale explosions, high-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimation of the charge yield. Empirical relations for air-blast parameters - peak pressure, impulse and the Secondary Shock (SS) time delay - depending on distance, were developed and analyzed. 
The parameters, scaled by the cube root of the estimated TNT-equivalent charges, were found consistent for all analyzed explosions, except for SS
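    The cube-root scaling invoked above is the standard Hopkinson-Cranz similarity law: air-blast parameters measured at range R from a charge of TNT-equivalent mass W collapse onto a common curve when plotted against the scaled distance Z = R / W^(1/3). The gauge range and TNT equivalence below are illustrative values, not data from the study:

```python
# Hopkinson-Cranz (cube-root) scaled distance, Z = R / W**(1/3).
# r_m: gauge range in meters; w_kg_tnt: TNT-equivalent charge mass in kg.
def scaled_distance(r_m, w_kg_tnt):
    """Scaled distance in m/kg^(1/3)."""
    return r_m / w_kg_tnt ** (1.0 / 3.0)

# e.g. a gauge 300 m from a shot with an assumed ~100,000 kg TNT equivalent
print(round(scaled_distance(300.0, 100_000.0), 2))  # -> 6.46
```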

  16. GA-based optimum design of a shape memory alloy device for seismic response mitigation

    NASA Astrophysics Data System (ADS)

    Ozbulut, O. E.; Roschke, P. N.; Y Lin, P.; Loh, C. H.

    2010-06-01

    Damping systems discussed in this work are optimized so that a three-story steel frame structure and its shape memory alloy (SMA) bracing system minimize response metrics due to a custom-tailored earthquake excitation. Multiple-objective numerical optimization that simultaneously minimizes displacements and accelerations of the structure is carried out with a genetic algorithm (GA) in order to optimize the SMA bracing elements within the structure. After design of an optimal SMA damping system is complete, full-scale experimental shake table tests are conducted on a large-scale steel frame that is equipped with the optimal SMA devices. A fuzzy inference system is developed from data collected during the testing to simulate the dynamic material response of the SMA bracing subcomponents. Finally, nonlinear analyses of a three-story braced frame are carried out to evaluate the performance of comparable SMA and commonly used steel braces under dynamic loading conditions and to assess the effectiveness of the GA-optimized SMA bracing design as compared to alternative designs of SMA braces. It is shown that the peak displacement of a structure can be reduced without causing significant acceleration response amplification through a judicious selection of the physical characteristics of the SMA devices. Also, SMA devices provide a recentering mechanism for the structure to return to its original position after a seismic event.
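    A toy version of such a GA search over device parameters is sketched below. The fitness function is an invented quadratic surrogate for the two competing response metrics (displacement and acceleration), combined by weighted-sum scalarization; it is not the paper's structural model, and all parameters are illustrative.

```python
import numpy as np

def fitness(p):
    disp = (p[0] - 2.0) ** 2                       # stand-in: displacement metric
    acc = (p[0] - 3.0) ** 2 + (p[1] - 1.0) ** 2    # stand-in: acceleration metric
    return 0.5 * disp + 0.5 * acc                  # weighted-sum scalarization

def ga(pop_size=40, gens=80, sigma=0.1, seed=0):
    """Elitist GA: keep the best half, refill with mutated copies."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.0, 5.0, (pop_size, 2))     # random initial designs
    for _ in range(gens):
        order = np.argsort([fitness(p) for p in pop])
        parents = pop[order[: pop_size // 2]]      # truncation selection
        kids = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        pop = np.vstack([parents, kids + rng.normal(0, sigma, kids.shape)])
    return min(pop, key=fitness)

best = ga()
# the toy optimum is p = (2.5, 1.0), a compromise between the two objectives
print(fitness(best) < 0.3)
```

    The compromise optimum (neither objective's individual minimum) mirrors the trade-off in the abstract: displacement can be reduced without amplifying acceleration only by balancing the two metrics rather than minimizing either alone.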

  17. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  18. Seismic design technology for breeder reactor structures. Volume 4. Special topics in piping and equipment

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into five chapters: experimental verification of piping systems, analytical verification of piping restraint systems, seismic analysis techniques for piping systems with multisupport input, development of floor spectra from input response spectra, and seismic analysis procedures for in-core components. (DLC)

  19. Load Distribution Patterns for Displacement-based Seismic Design of RC Framed Buildings

    NASA Astrophysics Data System (ADS)

    Varughese, Jiji Anna; Menon, Devdas; Meher Prasad, A.

    2014-12-01

    The behaviour of tall frames is characterized by the influence of higher modes in addition to the fundamental mode, and thus the design procedures for Displacement-based Design (DBD) adopt several measures to control higher mode effects. The performance of 4-, 9- and 15-storey frames designed by DBD was verified using non-linear time history analyses. Higher values of inter-storey drift and damage index were seen near the top of the tall frames, which shows the inefficiency of the design method in accounting for higher mode effects. As the principle of damage-limiting aseismic design is to obtain uniform damage along the height of the frame, several load distribution patterns were examined and the storey shear distributions were compared to identify the pattern best able to produce uniform damage. The Chao load distribution was found to give higher storey shear at the top, and thus the frames were redesigned using this load distribution. The efficiency of the Chao load distribution in reducing higher mode effects is demonstrated using non-linear time history analyses.

  20. Site study plan for EDBH (Engineering Design Boreholes) seismic surveys, Deaf Smith County site, Texas: Revision 1

    SciTech Connect

    Hume, H.

    1987-12-01

    This site study plan describes seismic reflection surveys to run north-south and east-west across the Deaf Smith County site, intersecting near the Engineering Design Boreholes (EDBH). Both conventional and shallow high-resolution surveys will be run. The field program has been designed to acquire subsurface geologic and stratigraphic data to address information/data needs resulting from Federal and State regulations and Repository program requirements. The data acquired by the conventional surveys will be common-depth-point seismic reflection data optimized for reflection events that indicate geologic structure near the repository horizon. The data will also resolve the basement structure and shallow reflection events up to about the top of the evaporite sequence. Field acquisition includes a testing phase to check/select parameters and a production phase. The field data will be subjected immediately to conventional data processing and interpretation to determine if there are any anomalous structural or stratigraphic conditions that could affect the choice of the EDBH sites. After the EDBHs have been drilled and logged, including vertical seismic profiling, the data will be reprocessed and reinterpreted for detailed structural and stratigraphic information to guide shaft development. The shallow high-resolution seismic reflection lines will be run along the same alignments, but the lines will be shorter and limited to the immediate vicinity of the EDBH sites. These lines are planned to detect faults or thick channel sands that may be present at the EDBH sites. 23 refs., 7 figs., 5 tabs.

  1. Software Model Checking for Verifying Distributed Algorithms

    DTIC Science & Technology

    2014-10-28

    Model checking: the verification procedure is an intelligent exhaustive search of the state space of the design. (Slide excerpts: "Verifying Synchronous Distributed App," Sagar Chaki, Carnegie Mellon University, June 11, 2014; tool usage and tutorial material at the project webpage, http://mcda.googlecode.com.)

  2. Model verifies design of mobile data modem

    NASA Technical Reports Server (NTRS)

    Davarian, F.; Sumida, J.

    1986-01-01

    It has been proposed to use differential minimum shift keying (DMSK) modems in spacecraft-based mobile communications systems. To employ these modems, the transmitted carrier frequency must be known prior to signal detection. In addition, the time needed by the receiver to lock onto the carrier frequency must be minimized. The present article is concerned with a DMSK modem developed for the Mobile Satellite Service. This device demonstrated fast acquisition time and good performance in the presence of fading. However, certain problems arose in initial attempts to study the acquisition behavior of the AFC loop through breadboard techniques. The development of a software model of the AFC loop is discussed, taking into account two cases which were plotted using the model. Attention is given to a demonstration of the viability of the modem by an approach involving modeling and analysis of the frequency synchronizer.

  3. Seismic design of low-level nuclear waste repositories and toxic waste management facilities

    SciTech Connect

    Chung, D.H.; Bernreuter, D.L.

    1984-05-08

    The focus is on identifying the elements of typical hazardous waste facilities (HWFs) that are the major contributors to risk, as these elements require additional consideration in the design and construction of low-level nuclear waste management repositories and HWFs. From a recent study of six typical HWFs it was determined that the factors that contribute most to the human and environmental risk fall into four basic categories: geologic and seismological conditions at each HWF; engineered structures at each HWF; environmental conditions at each HWF; and the nature of the material being released. In selecting and carrying out the six case studies, three groups of hazardous waste facilities were examined: generator industries which treat or temporarily store their own wastes; generator facilities which dispose of their own hazardous wastes on site; and industries in the waste treatment and disposal business. The case studies have a diversity of geologic settings, nearby settlement patterns, and environments. Two sites are above a regional aquifer, two are near a bay important to regional fishing, one is in rural hills, and one is in a desert, although not isolated from nearby towns and a groundwater/surface-water system. From the results developed in the study, it was concluded that the effect of seismic activity on hazardous facilities poses a significant risk to the population. Fifteen reasons are given for this conclusion.

  4. Verifying the Hanging Chain Model

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2013-01-01

    The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…
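    The classic equation referred to above can be stated explicitly. Measuring $x$ upward from the free end of a uniform chain, the tension at height $x$ is $T(x) = \rho g x$, so horizontal displacements $u(x,t)$ obey

```latex
\[
\frac{\partial^2 u}{\partial t^2}
  = \frac{\partial}{\partial x}\!\left(g\,x\,\frac{\partial u}{\partial x}\right)
  = g\,x\,\frac{\partial^2 u}{\partial x^2} + g\,\frac{\partial u}{\partial x}.
\]
```

    Separation of variables gives the well-known normal modes $u(x,t)=J_0\!\bigl(2\omega\sqrt{x/g}\bigr)\cos\omega t$, with the frequencies fixed by $J_0\!\bigl(2\omega\sqrt{L/g}\bigr)=0$ at the fixed end $x=L$; these Bessel-function mode shapes are the profiles against which tracked displacement data can be compared.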

  5. Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces

    NASA Astrophysics Data System (ADS)

    Garg, Akshay; Singh, Amit

    2012-07-01

    Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of this proposed mechanism is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units: a Mother Rover as a surrogate (carrier) and two Baby Rovers as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each having an individual power supply with solar cells and individual data transmission capabilities, to suitable sites such as chasmata associated with Valles Marineris, craters or sand dunes. The Mother Rover deploys these rovers in two opposite directions, and the rovers follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Until the time of the active experiments, the Mother Rover acts as a guiding unit to control the spatial spread of the detection instruments. After active shock experimentation, the baby rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering and landslides. Further experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized for carrying out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of measurements (through concurrent readings) and prolong the life cycle of the overall experimentation. 
The proposed rover can be customised according to the associated scientific objectives and further

  6. Geological investigation for CO2 storage: from seismic and well data to storage design

    NASA Astrophysics Data System (ADS)

    Chapuis, Flavie; Bauer, Hugues; Grataloup, Sandrine; Leynet, Aurélien; Bourgine, Bernard; Castagnac, Claire; Fillacier, Simon; Lecomte, Antony; Le Gallo, Yann; Bonijoly, Didier

    2010-05-01

    The main purpose of this study is to evaluate the techno-economic potential of storing 200 000 tCO2 per year produced by a sugar beet distillery. To reach this goal, an accurate hydro-geological characterisation of a CO2 injection site is of primary importance because it will strongly influence the site selection, the storage design and the risk management. Geological investigation for CO2 storage usually targets the center or deepest part of sedimentary basins. However, CO2 producers are not always located over such geological settings, and so other geological configurations have to be studied. This is the aim of this project, which is located near the south-west border of the Paris Basin, in the Orléans region. Special geometries such as onlaps and pinch-outs of formations against the basement are likely to be observed and so have to be taken into account. Two deep saline aquifers are potentially good candidates for CO2 storage: the Triassic continental deposits capped by the Upper Triassic/Lower Jurassic continental shales, and the Dogger carbonate deposits capped by the Callovian and Oxfordian shales. First, a data review was undertaken to provide the palaeogeographical settings and ideas about the facies, thicknesses and depth of the targeted formations. It was followed by a seismic interpretation. Three hundred kilometres of seismic lines were reprocessed and interpreted to characterize the geometry of the studied area. The main structure identified is the Étampes fault, which affects all the formations. Apart from the vicinity of the fault where drag

  7. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at

  8. Seismic testing

    SciTech Connect

    Knott, S.

    1981-10-01

    Electric Power Research Institute (EPRI) research programs in seismic testing to improve earthquake design guidelines lower the safety-design costs of nuclear power plants. Explosive tests that simulate earthquakes help determine how structures respond to ground motion and how these responses are related to the soil and geologic conditions at a specific site. The explosive tests also develop data for simulation with several computer codes. Photographs illustrate testing techniques. 6 references. (DCK)

  9. Seismic design and evaluation guidelines for the Department of Energy High-Level Waste Storage Tanks and Appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1995-10-01

    This document provides seismic design and evaluation guidelines for underground high-level waste storage tanks. The guidelines reflect the knowledge acquired in the last two decades in defining seismic ground motion and calculating hydrodynamic loads, dynamic soil pressures and other loads for underground tank structures, piping and equipment. The application of the guidelines is illustrated with examples. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable to other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document. The original version of this document was published in January 1993. Since then, additional studies have been performed in several areas and the results are included in this revision. Comments received from the users are also addressed. Fundamental concepts supporting the basic seismic criteria contained in the original version have since been incorporated and published in DOE-STD-1020-94 and its technical basis documents. This information has been deleted in the current revision.

  10. Seismic design of steel structures with lead-extrusion dampers as knee braces

    SciTech Connect

    Monir, Habib Saeed; Naser, Ali

    2008-07-08

    One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and their behavior is independent of velocity at frequencies within the seismic range. In this paper, lead dampers are used as knee braces in steel frames and studied from an economic viewpoint. Because lead dampers do not obstruct structural panels, they can also resolve bracing problems from an architectural standpoint. The behavior of these dampers is compared with that of other damper types such as XADAS and TADAS. The results indicate that lead dampers effectively absorb the energy induced by an earthquake and perform well in controlling the seismic movements of multi-story structures.

  11. Seismic design of steel structures with lead-extrusion dampers as knee braces

    NASA Astrophysics Data System (ADS)

    Monir, Habib Saeed; Naser, Ali

    2008-07-01

    One of the effective methods of decreasing the seismic response of a structure to dynamic earthquake loads is the use of energy-dissipating systems. Lead-extrusion dampers (LEDs) are one such system; they dissipate energy in a lead sleeve through the movement of a steel rod. The hysteresis loops of these dampers are approximately rectangular, and their behavior is independent of velocity at frequencies within the seismic range. In this paper, lead dampers are used as knee braces in steel frames and studied from an economic viewpoint. Because lead dampers do not obstruct structural panels, they can also resolve bracing problems from an architectural standpoint. The behavior of these dampers is compared with that of other damper types such as XADAS and TADAS. The results indicate that lead dampers effectively absorb the energy induced by an earthquake and perform well in controlling the seismic movements of multi-story structures.

  12. Simplified design method and seismic performance of space trusses with consideration of the influence of the stiffness of their lower supporting columns

    NASA Astrophysics Data System (ADS)

    Fan, Feng; Sun, Menghan; Zhi, Xudong

    2016-06-01

    The static and dynamic performance of two types of space truss structures, i.e., the square pyramid space truss (SPST) and the diagonal-on-square pyramid space truss (DSPST), is studied to determine the effect of the stiffness of their lower supporting members. A simplified model for the supporting columns and the equivalent spring-mass system is presented. The feasibility of the simplified model is demonstrated through theoretical analysis and comparative examples against the full model. Elastic analyses under frequently occurring earthquakes and elasto-plastic analyses under seldom occurring earthquakes, using the TAFT and El Centro seismic records, show that the results of the simplified method are encompassed by those of the full model; the two methods agree well, and the simplified method greatly improves computational efficiency. This study verified that the dynamic effect of the supporting structures was underestimated in past space truss design. The method proposed in the paper is also significant for other space truss structures.

  13. Probabilistic Seismic Hazard Characterization and Design Parameters for the Sites of the Nuclear Power Plants of Ukraine

    SciTech Connect

    Savy, J.B.; Foxall, W.

    2000-01-26

    The U.S. Department of Energy (US DOE), under the auspices of the International Nuclear Safety Program (INSP), is supporting in-depth safety assessments (ISA) of nuclear power plants in Eastern Europe and the former Soviet Union for the purpose of evaluating the safety of, and upgrades necessary to, the stock of nuclear power plants in Ukraine. For this purpose the Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) has been asked to assess the seismic hazard and design parameters at the sites of the nuclear power plants in Ukraine. The probabilistic seismic hazard (PSH) estimates were updated using the latest available data and knowledge from LLNL, the U.S. Geological Survey, and other relevant recent studies from several consulting companies. Special attention was given to the local seismicity, the deep-focus earthquakes of the Vrancea zone in Romania, the region around Crimea, and the system of potentially active faults associated with the Pripyat-Dnieper-Donets rift. Aleatory (random) uncertainty was estimated from the available data, and the epistemic (knowledge) uncertainty was estimated by considering the existing models in the literature and the interpretations of a small group of experts elicited during a workshop conducted in Kiev, Ukraine, on February 2-4, 1999.

  14. A successful 3D seismic survey in the "no-data zone," offshore Mississippi delta: Survey design and refraction static correction processing

    SciTech Connect

    Carvill, C.; Faris, N.; Chambers, R.

    1996-12-31

    This is a success story of survey design and refraction static correction processing of a large 3D seismic survey in the South Pass area of the Mississippi delta. In this transition zone, subaqueous mudflow gullies and lobes of the delta, in various states of consolidation and gas saturation, are strong absorbers of seismic energy. Seismic waves penetrating the mud are severely restricted in bandwidth and variously delayed by changes in mud velocity and thickness. Using a delay-time refraction static correction method, the authors find that the compensation for the various delays, i.e., the static corrections, commonly varies by 150 ms over a short distance. Application of the static corrections markedly improves the seismic stack volume. This paper shows that intelligent survey design and delay-time refraction static correction processing economically eliminated the historic "no-data" status of this area.
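The delay-time statics idea described above can be sketched with the textbook relations; the velocities and delay time below are illustrative assumptions, not values from this survey:

```python
import math

def layer_thickness_from_delay(delay_time, v1, v2):
    """Near-surface layer thickness from a refraction delay time.

    Standard delay-time relation: t_d = h * cos(theta_c) / v1,
    where the critical angle satisfies sin(theta_c) = v1 / v2.
    """
    cos_tc = math.sqrt(1.0 - (v1 / v2) ** 2)
    return delay_time * v1 / cos_tc

def static_correction(h, v1, v_repl):
    """Time shift from replacing the slow layer (velocity v1) with a
    replacement velocity v_repl over thickness h (vertical-path approximation)."""
    return h / v1 - h / v_repl

# Assumed example values: ~40 m of slow mud (400 m/s) over 1800 m/s
# material, with a 1600 m/s replacement velocity.
h = layer_thickness_from_delay(delay_time=0.0977, v1=400.0, v2=1800.0)
shift = static_correction(h, v1=400.0, v_repl=1600.0)  # ~75 ms static
```

With these assumed numbers a single mud lobe already produces a static of roughly 75 ms, consistent in order of magnitude with the 150 ms variations the abstract reports.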

  15. Rapid estimation of earthquake loss based on instrumental seismic intensity: design and realization

    NASA Astrophysics Data System (ADS)

    Huang, Hongsheng; Chen, Lin; Zhu, Gengqing; Wang, Lin; Lin, Yanzhao; Wang, Huishan

    2013-11-01

    As a result of our ability to acquire large volumes of real-time earthquake observation data, coupled with increased computer performance, near-real-time instrumental seismic intensity can be obtained from instrumentally observed ground motion data using appropriate spatial interpolation methods. By combining vulnerability results from earthquake disaster research with earthquake disaster assessment models, we can estimate the losses caused by devastating earthquakes, in an attempt to provide more reliable information for earthquake emergency response and decision support. This paper analyzes the latest progress in rapid earthquake loss estimation methods in China and abroad. A new method that uses rapid reporting of instrumental seismic intensity to estimate earthquake loss is proposed, and the relevant software is developed. Finally, a case study of the ML 4.9 earthquake that occurred in Shun-chang county, Fujian Province on March 13, 2007 is given as an example of the proposed method.
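The spatial interpolation step mentioned above is not specified in the abstract; a common simple choice is inverse-distance weighting, sketched here with hypothetical station coordinates and intensities:

```python
import math

def idw_intensity(stations, query, power=2.0):
    """Inverse-distance-weighted interpolation of instrumental intensity.

    stations: list of (x, y, intensity) tuples; query: (x, y).
    This is a generic interpolator, not the paper's specific method.
    """
    num = den = 0.0
    for x, y, val in stations:
        d = math.hypot(query[0] - x, query[1] - y)
        if d < 1e-12:
            return val  # query point coincides with a station
        w = 1.0 / d ** power
        num += w * val
        den += w
    return num / den

# Hypothetical stations (km coordinates, intensity values)
est = idw_intensity([(0, 0, 5.0), (10, 0, 6.0), (0, 10, 4.0)], (5, 5))
```

The interpolated intensity grid would then feed a vulnerability model to map intensity to expected loss per building class.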

  16. Optimum seismic structural design based on random vibration and fuzzy graded damages

    NASA Technical Reports Server (NTRS)

    Cheng, Franklin Y.; Ou, Jin-Ping

    1990-01-01

    This paper presents the fuzzy dynamical reliability and failure probability, as well as the basic principles and analytical method of loss assessment, for nonlinear seismic steel structures. Also presented are the optimization formulation and a numerical example with two objectives, initial construction cost and expected failure loss, under dynamical reliability constraints. The earthquake ground motion is modeled as stationary filtered non-white noise, and the fuzzy damage grade is described by a damage index.

  17. Verifying a Computer Algorithm Mathematically.

    ERIC Educational Resources Information Center

    Olson, Alton T.

    1986-01-01

    Presents an example of mathematics from an algorithmic point of view, with emphasis on the design and verification of this algorithm. The program involves finding roots for algebraic equations using the half-interval search algorithm. The program listing is included. (JN)
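The half-interval (bisection) search named above can be sketched directly; the polynomial used in the example is an assumption, since the record does not reproduce the program listing:

```python
def bisect_root(f, a, b, tol=1e-10, max_iter=200):
    """Half-interval search for a root of f on [a, b].

    Requires f(a) and f(b) to have opposite signs; each iteration
    halves the bracketing interval and keeps the half with a sign change.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        mid = (a + b) / 2.0
        fm = f(mid)
        if fm == 0 or (b - a) / 2.0 < tol:
            return mid
        if fa * fm < 0:
            b, fb = mid, fm  # root lies in the left half
        else:
            a, fa = mid, fm  # root lies in the right half
    return (a + b) / 2.0

# Assumed example: the real root of x^3 - x - 2 on [1, 2]
root = bisect_root(lambda x: x**3 - x - 2, 1.0, 2.0)
```

Verification, as the article emphasizes, rests on the invariant that the root always remains bracketed between a and b.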

  18. Verify MesoNAM Performance

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2010-01-01

    The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors at specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) Bias (mean difference), 2) Standard deviation of Bias, 3) Root Mean Square Error (RMSE), and 4) Hypothesis test for Bias = 0. The 45 WS LWOs use the MesoNAM to support launch weather operations. However, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature and dew point to the observed values from the sensors on wind towers. The data were stratified by tower sensor, month and onshore/offshore wind direction based on the orientation of the coastline to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record for the data used in this task was based on the operational start of the current MesoNAM in mid-August 2006, so the task began with the first full month of data, September 2006, through May 2010. The analysis of model performance indicated: a) The accuracy decreased as the forecast valid time from the model initialization increased, b) There was a diurnal signal in T with a cool bias during the late night and a warm bias during the afternoon, c) There was a diurnal signal in Td with a low bias during the afternoon and a high bias during the late night, and d) The model parameters at each vertical level most closely matched the observed parameters at heights closest to those vertical levels. The AMU developed a GUI that consists of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWO to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics.
The objective statistics give the LWOs knowledge of the model's strengths and
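The three basic verification statistics listed in the record (bias, standard deviation of the bias, RMSE) are standard and can be computed as follows; the forecast/observed sample values are illustrative only:

```python
import math

def verification_stats(forecast, observed):
    """Paired forecast-vs-observation verification statistics:
    bias (mean difference), sample standard deviation of the bias,
    and root mean square error."""
    diffs = [f - o for f, o in zip(forecast, observed)]
    n = len(diffs)
    bias = sum(diffs) / n
    var = sum((d - bias) ** 2 for d in diffs) / (n - 1)  # sample variance
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return {"bias": bias, "sd_bias": math.sqrt(var), "rmse": rmse}

# Illustrative forecast and observed wind speeds (m/s)
stats = verification_stats([10.2, 9.8, 11.0, 10.5], [10.0, 10.0, 10.0, 10.0])
```

The hypothesis test for Bias = 0 mentioned in the record would then be a one-sample t-test on the difference series, t = bias / (sd_bias / sqrt(n)).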

  19. Simulation and Processing Seismic Data in Complex Geological Models

    NASA Astrophysics Data System (ADS)

    Forestieri da Gama Rodrigues, S.; Moreira Lupinacci, W.; Martins de Assis, C. A.

    2014-12-01

    Seismic simulations in complex geological models are useful for verifying some limitations of seismic data. In this project, different geological models were designed to analyze some difficulties encountered in the interpretation of seismic data. These data will also be made available to LENEP/UENF students for testing new tools to assist in seismic data processing. The geological models were created considering characteristics found in oil exploration. We simulated geological media with volcanic intrusions, salt domes, faults, pinch-outs and layers distant from the surface (Kanao, 2012). We used the software Tesseral Pro to simulate the seismic acquisitions. The acquisition geometries simulated were of the common-offset, end-on and split-spread types. (Figure 1) Data acquired with constant offset require fewer processing routines. The processing flow used, with tools available in the Seismic Unix package (for more details, see Pennington et al., 2005), was geometric spreading correction, deconvolution, attenuation correction and post-stack depth migration. In processing the data acquired with end-on and split-spread geometries, we included velocity analysis and NMO correction routines. Although we analyzed synthetic data and carefully applied each processing routine, we observe some limitations of seismic reflection in imaging thin layers, layers at great depth, layers with low impedance contrast, and faults.
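The first step of the flow above, geometric spreading correction, is commonly implemented as a time-power gain applied sample by sample; a minimal sketch, with the exponent and sample interval as assumptions:

```python
def spreading_correction(trace, dt, tpow=2.0):
    """Geometric spreading correction: scale each sample by t**tpow,
    compensating the amplitude decay of an expanding wavefront.

    trace: list of amplitude samples; dt: sample interval in seconds.
    The exponent tpow=2.0 is a common default, not this paper's choice.
    """
    return [s * ((i * dt) ** tpow) for i, s in enumerate(trace)]

# Four unit samples at a 4 ms sample interval
gained = spreading_correction([1.0, 1.0, 1.0, 1.0], dt=0.004)
```

Later samples (greater travel time) receive larger gains, restoring deep reflections relative to shallow ones before deconvolution and stacking.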

  20. Seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances

    SciTech Connect

    Bandyopadhyay, K.; Cornell, A.; Costantino, C.; Kennedy, R.; Miller, C.; Veletsos, A.

    1993-01-01

    This document provides guidelines for the seismic design and evaluation of underground high-level waste storage tanks. Attempts were made to reflect the knowledge acquired in the last two decades in defining the ground motion and calculating hydrodynamic loads and dynamic soil pressures for underground tank structures. The application of the analysis approach is illustrated with an example. The guidelines are developed for a specific design of underground storage tanks, namely double-shell structures. However, the methodology discussed is applicable to other types of tank structures as well. The application of these and of suitably adjusted versions of these concepts to other structural types will be addressed in a future version of this document.

  1. Seismic Waveguide of Metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Hoon; Das, Mukunda P.

    We developed a new earthquake-resistant design method that supplements conventional aseismic systems using acoustic metamaterials. The device is a seismic-wave attenuator that reduces the amplitude of the wave exponentially. By constructing a cylindrical shell-type waveguide composed of many Helmholtz resonators, which creates a stop band for the seismic frequency range, we convert the seismic wave into an attenuated one without touching the building we want to protect. It is a mechanical way of converting seismic energy into sound and heat.
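The stop-band center of each resonator is set by the classical Helmholtz formula f = (c / 2π)·√(A / (V·L)); the geometry below is an illustrative assumption showing that reaching seismic frequencies (~1 Hz) requires a very large cavity:

```python
import math

def helmholtz_frequency(c, neck_area, neck_length, cavity_volume):
    """Resonance frequency of a Helmholtz resonator:
    f = (c / 2*pi) * sqrt(A / (V * L)).

    c: sound speed (m/s), neck_area A (m^2), neck_length L (m),
    cavity_volume V (m^3).
    """
    return (c / (2.0 * math.pi)) * math.sqrt(
        neck_area / (cavity_volume * neck_length)
    )

# Assumed geometry: 30 m^3 cavity, 1 m neck of 0.01 m^2 cross-section,
# air at c = 343 m/s -- tuned near 1 Hz, inside the seismic band.
f = helmholtz_frequency(c=343.0, neck_area=0.01, neck_length=1.0,
                        cavity_volume=30.0)
```

A stack of resonators with slightly detuned frequencies would widen the stop band across the damaging 0.1 to 10 Hz range.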

  2. Seismic design spectra 200 West and East Areas DOE Hanford Site, Washington

    SciTech Connect

    Tallman, A.M.

    1995-12-31

    This document presents equal-hazard response spectra for the W236A project for the 200 East and West new high-level waste tanks. The hazard level is based upon WHC-SD-W236A-TI-002, Probabilistic Seismic Hazard Analysis, DOE Hanford Site, Washington. Spectral acceleration amplification is plotted against frequency (Hz) for horizontal and vertical motion and attached to this report. The vertical amplification is based upon the preliminary draft revision of Standard ASCE 4-86. The vertical spectral acceleration is equal to the horizontal at frequencies above 3.3 Hz because of near-field (less than 15 km) sources.

  3. Effect of Seismic Zone and Story Height on Response Reduction Factor for SMRF Designed According to IS 1893(Part-1):2002

    NASA Astrophysics Data System (ADS)

    Rao, P. Pravin Venkat; Gupta, L. M.

    2016-12-01

    The Indian seismic code design procedure, which permits estimation of the inelastic deformation capacity of lateral force-resisting systems, has been questioned because no coherent basis exists for the values of the response reduction factor tabulated in the code. The Indian code at present does not give deterministic values of the ductility reduction factor and overstrength factor to be used in design, because of the inadequacy of currently available research results. Hence, this study focuses on the variation of the overstrength and ductility factors in steel moment-resisting frames across seismic zones and numbers of stories. A total of 12 steel moment-resisting frames were analyzed and designed. The response reduction factor was determined by nonlinear static pushover analysis. The results show that the overstrength and ductility factors vary with the number of stories and the seismic zone. It is also observed that, for different seismic zones and story counts, the ductility reduction factor differs from the overall structural ductility. Three buildings of different heights had an average overstrength 63% higher in Zone II than in Zone V. These observations are significant for building seismic provision codes, which at present do not account for the variation of the response reduction factor.
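The response reduction factor the study evaluates is commonly decomposed as R = (overstrength) × (ductility reduction); a minimal sketch using the widely cited Newmark-Hall ductility relations, with the numeric inputs as assumptions rather than the paper's results:

```python
import math

def ductility_reduction(mu, long_period=True):
    """Newmark-Hall style ductility reduction factor:
    R_mu = mu in the long-period range (equal-displacement rule),
    R_mu = sqrt(2*mu - 1) in the short-period range (equal-energy rule)."""
    return mu if long_period else math.sqrt(2.0 * mu - 1.0)

def response_reduction_factor(overstrength, r_mu, redundancy=1.0):
    """Common decomposition R = Rs * R_mu (* an optional redundancy factor)."""
    return overstrength * r_mu * redundancy

# Assumed example frame: overstrength 2.5, displacement ductility 2.0
R = response_reduction_factor(overstrength=2.5, r_mu=ductility_reduction(2.0))
```

The study's finding that R varies with zone and height means neither factor can safely be tabulated as a single code constant.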

  4. Information spreadsheet for Verify user registration

    EPA Pesticide Factsheets

    In this spreadsheet, user(s) provide their company’s manufacturer code, user contact information for Verify, and user roles. This spreadsheet is used for the Company Authorizing Official (CAO), CROMERR Signer, and Verify Submitters.

  5. Characterization and performance evaluation of a vertical seismic isolator using link and crank mechanism

    NASA Astrophysics Data System (ADS)

    Tsujiuchi, N.; Ito, A.; Sekiya, Y.; Nan, C.; Yasuda, M.

    2016-09-01

    In recent years, various seismic isolators have been developed to prevent earthquake damage to valuable art and other rare objects. Many seismic isolators only defend against horizontal motions, which are the usual cause of falling objects. However, a seismic isolator designed for vertical vibration is needed, since earthquakes with strong vertical motion, such as the 2004 Niigata Prefecture Chuetsu Earthquake, have occurred, and the increased height of conventional vertical isolators is undesirable. In this study, we developed a vertical seismic isolator that can be installed at a lower height and supports loads using a horizontal spring, without requiring a vertical spring, through a mechanism that combines links and cranks. A dynamic model was proposed and the frequency characteristics were simulated for sine-wave inputs. Shaking tests were also performed. The measured natural frequency was 0.57 Hz, and the theoretical frequency characteristics were close to the experimental values. In addition, we verified this vertical seismic isolator's performance through shaking tests and simulations for typical seismic waves in Japan; the average acceleration reduction rate was 0.21, confirming its seismic isolation performance.
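The reported 0.57 Hz natural frequency follows the basic spring-mass relation f = (1/2π)·√(k/m); the stiffness and mass below are assumed values chosen only to reproduce that frequency, not data from the paper:

```python
import math

def natural_frequency(k_eff, mass):
    """Natural frequency of an equivalent spring-mass system:
    f = (1 / 2*pi) * sqrt(k / m), with k in N/m and m in kg."""
    return math.sqrt(k_eff / mass) / (2.0 * math.pi)

# Assumed: an effective stiffness of ~1282 N/m supporting a 100 kg load
# gives roughly the 0.57 Hz the authors measured.
f = natural_frequency(k_eff=1282.0, mass=100.0)
```

The link-and-crank mechanism matters precisely because it lets a horizontal spring provide this low effective vertical stiffness without the tall static deflection a direct vertical spring would need.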

  6. Design and development of safety evaluation system of buildings on a seismic field based on the network platform

    NASA Astrophysics Data System (ADS)

    Sun, Baitao; Zhang, Lei; Chen, Xiangzhao; Zhang, Xinghua

    2015-03-01

    This paper describes an on-site earthquake safety evaluation system for buildings developed on a network platform. The system embeds quantitative research results obtained in accordance with the provisions of Post-earthquake Field Works, Part 2: Safety Assessment of Buildings (GB18208.2-2001), further developed into an easy-to-use software platform. The system is aimed at allowing engineering professionals, civil engineering technicians, or earthquake-affected victims on site to assess damaged buildings through a network after earthquakes. The authors studied the function structure, the process design of the safety evaluation module, and the hierarchical analysis algorithm module of the system in depth, and developed the general architecture design, development technology, and database design of the system. Technologies such as hierarchical architecture design and Java EE were used in the system development, and MySQL5 was adopted for the database. The result is a complete evaluation process of information collection, safety evaluation, and output of damage and safety degrees, as well as query and statistical analysis of identified buildings. The system can play a positive role in sharing expert post-earthquake experience and promoting safety evaluation of buildings in a seismic field.

  7. Seismic analysis of Industrial Waste Landfill 4 at Y-12 Plant

    SciTech Connect

    1995-04-07

    This calculation seismically evaluated Landfill IV at Y-12 as required by Tennessee Rule 1200-1-7-04(2) for seismic impact zones. The calculation verifies that the landfill meets the seismic requirements of the Tennessee Division of Solid Waste "Earthquake Evaluation Guidance Document." The theoretical displacements of 0.17 in. and 0.13 in. for the design basis earthquake are well below the limiting seismic slope stability design criteria. There is no potential for liquefaction, due to the absence of cohesionless soils, or for loss or reduction of shear strength of the clays at this site as a result of earthquake vibration. The vegetative cover on slopes will most likely be displaced and move during a large seismic event, but this is not considered a serious deficiency because the cover is not involved in the structural stability of the landfill and there would be no release of waste to the environment.

  8. Seismic hazard evaluation for design and/or verification of a high voltage system

    SciTech Connect

    Grases, J.; Malaver, A.; Lopez, S.; Rivero, P.

    1995-12-31

    The Venezuelan capital, Caracas, with a population of about 5 million, is within the area of contact of the Caribbean and South American tectonic plates. Since 1567, the valley where it lies and its surroundings have been shaken by at least six destructive events from different seismogenic sources. Electric energy is supplied to the city by a high voltage system consisting of 4 power stations, 20 substations (230 kV downwards) and 80 km of high voltage lines, covering an area of about 135 x 60 km². Given the variety of soil conditions, topographical irregularities and proximity to potentially active faults, it was decided to perform a seismic hazard study. This paper gives the results of that study, synthesized in two hazard-parameter maps that allow a conservative characterization of the acceleration on firm soils. Specific site coefficients allow for changes in soil conditions and topographical effects. Sites whose proximity to fault lines is less than about 2 km require additional field studies in order to rule out the possibility of permanent ground displacements.

  9. Seismic hazard analyses for Taipei city including deaggregation, design spectra, and time history with excel applications

    NASA Astrophysics Data System (ADS)

    Wang, Jui-Pin; Huang, Duruo; Cheng, Chin-Tung; Shao, Kuo-Shin; Wu, Yuan-Chieh; Chang, Chih-Wei

    2013-03-01

    Given the difficulty of earthquake forecasting, Probabilistic Seismic Hazard Analysis (PSHA) has been the method of choice for estimating site-specific ground motion or response spectra in earthquake engineering and engineering seismology. In this paper, the first in-depth PSHA study for Taipei, the economic center of Taiwan with a population of six million, was carried out. Unlike the very recent PSHA study for Taiwan, this study includes the follow-up hazard deaggregation, response spectra, and earthquake motion recommendations. Hazard deaggregation results show that moderate-size, near-source earthquakes are the most probable scenario for this city. Moreover, similar to the findings of a few recent studies, the earthquake risk for Taipei is relatively high; considering this city's importance, the high risk should not be overlooked, and a revision of the local technical reference may be needed. In addition to the case study, some innovative Excel applications of PSHA are introduced in this paper. Such spreadsheet applications are applicable to geosciences research, given Excel's user-friendly nature and wide accessibility, as they are to data reduction or quantitative analysis.
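The core PSHA computation the paper applies is the total-probability sum over earthquake scenarios, λ(a) = Σᵢ νᵢ · P(PGA > a | scenario i), with lognormal ground-motion variability. A minimal sketch; the scenario rates, medians, and sigma are assumed numbers, not the Taipei hazard model:

```python
import math

def normal_sf(z):
    """Standard normal survival function P(Z > z), via the
    complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(scenarios, target_pga):
    """Minimal PSHA sum: lambda(a) = sum_i nu_i * P(PGA > a | scenario i).

    scenarios: list of (annual_rate, median_pga_g, sigma_ln), where the
    ground motion given a scenario is lognormal about its median.
    """
    lam = 0.0
    for rate, median, sigma in scenarios:
        z = (math.log(target_pga) - math.log(median)) / sigma
        lam += rate * normal_sf(z)
    return lam

# Two assumed scenarios: a frequent moderate event and a rarer strong one.
lam = annual_exceedance_rate([(0.05, 0.10, 0.6), (0.01, 0.25, 0.6)], 0.2)
```

Deaggregation, as reported in the paper, amounts to asking which scenario terms dominate this sum at the design ground-motion level.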

  10. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to ensure that subsequent processing steps use appropriate header information and data free of noise-dominated traces and flawed vertical stacking. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in diagnosing inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used to demonstrate the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  11. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  12. The LUSI Seismic Experiment: Deployment of a Seismic Network around LUSI, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Karyono, Karyono; Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Haryanto, Iyan; Masturyono, Masturyono; Hadi, Soffian; Rohadi, Suprianto; Suardi, Iman; Rudiyanto, Ariska; Pranata, Bayu

    2015-04-01

    The spectacular Lusi eruption started in northeast Java, Indonesia, on 29 May 2006, following a M6.3 earthquake striking the island. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system, and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. Lusi is located a few kilometres to the NE of the Arjuno-Welirang volcanic complex, from which the Watukosek fault system originates; the fault system was reactivated by the M6.3 earthquake in 2006 and is still periodically reactivated by the frequent seismicity. To date Lusi is still active, erupting gas, water, mud and clasts. Gas and water data show that the Lusi plumbing system is connected with the neighbouring Arjuno-Welirang volcanic complex, making the Lusi eruption a "sedimentary-hosted geothermal system". To verify and characterise the occurrence of seismic activity and how it perturbs the connected Watukosek fault, the Arjuno-Welirang volcanic system, and the ongoing Lusi eruption, we deployed 30 seismic stations (short-period and broadband) in this region of the East Java basin. The seismic stations are more densely distributed around Lusi and the Watukosek fault zone that stretches between Lusi and the Arjuno-Welirang (AW) complex. Fewer stations are positioned around the volcanic arc. Our study sheds light on the seismic activity along the Watukosek fault system and describes the waveforms associated with the geysering activity of Lusi. The initial network aims to locate small events that may not be captured by the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG) seismic network, and it will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-Arjuno-Welirang region and temporal variations of vp/vs ratios. Such variations will then be ideally related to

  13. Efficacy of Code Provisions for Seismic Design of Asymmetric RC Building

    NASA Astrophysics Data System (ADS)

    Balakrishnan, Bijily; Sarkar, Pradip

    2016-06-01

    The earthquake resistant design code in India, IS: 1893, was revised in 2002 to include provisions for torsional irregularity in asymmetric buildings. In line with other international codes, IS 1893: 2002 requires estimating the design eccentricity from static and accidental eccentricity. The present study attempts to evaluate the effectiveness of the design code requirements for designing torsionally irregular asymmetric buildings. Two similar asymmetric buildings, one designed considering and the other ignoring the code requirement, are considered for this study. Nonlinear static and dynamic analyses are performed on these buildings to assess the difference in their behaviour, and it is found that the plan asymmetry in the building makes it non-ductile even after design with code provisions. The code criterion for plan asymmetry tends to improve the strength of members, but this study indicates that changing the stiffness distribution to reduce eccentricity may lead to a preferred mode of failure.

  14. An identity verifier evaluation of performance

    SciTech Connect

    Maxwell, R.L.

    1987-01-01

    Because the development of personnel identity verifiers is active in several areas, an independent comparative evaluation of such devices must be continuously pursued if the security industry is to apply them. An evaluation of several verifiers was conducted in the winter of 1986/1987 at Sandia National Laboratories. In a nonrigorous attempt to comparatively evaluate these verifiers in a field security environment, about 80 individuals were enrolled in five different verifiers. The enrollees were then encouraged to attempt a verification on each device several times a day for about four months, such that both single-try and multiple-try information could be extracted from the data. Results indicated a general improvement in verifier performance with regard to accuracy and operating time compared to previous similar evaluations of verifiers at Sandia.

  15. Theoretical and practical considerations for the design of the iMUSH active-source seismic experiment

    NASA Astrophysics Data System (ADS)

    Kiser, E.; Levander, A.; Harder, S. H.; Abers, G. A.; Creager, K. C.; Vidale, J. E.; Moran, S. C.; Malone, S. D.

    2013-12-01

    The multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment seeks to understand the details of the magmatic system that feeds Mount St. Helens using active- and passive-source seismic, magnetotelluric, and petrologic data. The active-source seismic component of this experiment will take place in the summer of 2014 utilizing all of the 2600 PASSCAL 'Texan' Reftek instruments which will record twenty-four 1000-2000 lb shots distributed around the Mount St. Helens region. The instruments will be deployed as two consecutive refraction profiles centered on the volcano, and a series of areal arrays. The actual number of areal arrays, as well as their locations, will depend strongly on the length of the experiment (3-4 weeks), the number of instrument deployers (50-60), and the time it will take per deployment given the available road network. The current work shows how we are balancing these practical considerations against theoretical experiment designs in order to achieve the proposed scientific goals with the available resources. One of the main goals of the active-source seismic experiment is to image the magmatic system down to the Moho (35-40 km). Calculating sensitivity kernels for multiple shot/receiver offsets shows that direct P waves should be sensitive to Moho depths at offsets of 150 km, and therefore this will likely be the length of the refraction profiles. Another primary objective of the experiment is to estimate the locations and volumes of different magma accumulation zones beneath the volcano using the areal arrays. With this in mind, the optimal locations of these arrays, as well as their associated shots, are estimated using an eigenvalue analysis of the approximate Hessian for each possible experiment design. This analysis seeks to minimize the number of small eigenvalues of the approximate Hessian that would amplify the propagation of data noise into regions of interest in the model space, such as the likely locations of magma
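The eigenvalue screening of candidate survey designs described above can be sketched numerically. The following toy is purely illustrative (matrix sizes, the threshold, and the two candidate geometries are invented for demonstration, not taken from the iMUSH analysis); it only shows why a design whose shot/receiver pairs are redundant produces more near-zero eigenvalues of the approximate Hessian, and hence more noise amplification:

```python
import numpy as np

def design_quality(J, threshold=1e-3):
    """Score a candidate design by the eigenvalues of its approximate
    Hessian H = J^T J (J = sensitivity/Jacobian matrix). Fewer small
    eigenvalues -> less amplification of data noise into the model."""
    H = J.T @ J
    eigvals = np.linalg.eigvalsh(H)  # real, ascending (H is symmetric)
    n_small = int(np.sum(eigvals < threshold * eigvals.max()))
    return eigvals, n_small

# Toy comparison: a well-spread design vs. a clustered (redundant) one.
rng = np.random.default_rng(0)
J_spread = rng.standard_normal((40, 10))        # 40 shot/receiver pairs, 10 model cells
J_clustered = np.repeat(rng.standard_normal((4, 10)), 10, axis=0)  # 4 geometries repeated

_, small_spread = design_quality(J_spread)
_, small_clustered = design_quality(J_clustered)
print(small_spread, small_clustered)  # clustered design has more near-zero eigenvalues
```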

  16. Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Topical report YMP/TR-003-NP

    SciTech Connect

    1996-10-01

    This topical report describes the methodology and criteria that the U.S. Department of Energy (DOE) proposes to use for preclosure seismic design of structures, systems, and components (SSCs) of the proposed geologic repository operations area that are important to safety. Title 10 of the Code of Federal Regulations, Part 60 (10 CFR 60), Disposal of High-Level Radioactive Wastes in Geologic Repositories, states that for a license to be issued for operation of a high-level waste repository, the U.S. Nuclear Regulatory Commission (NRC) must find that the facility will not constitute an unreasonable risk to the health and safety of the public. Section 60.131 (b)(1) requires that SSCs important to safety be designed so that natural phenomena and environmental conditions anticipated at the geologic repository operations area will not interfere with necessary safety functions. Among the natural phenomena specifically identified in the regulation as requiring safety consideration are the hazards of ground shaking and fault displacement due to earthquakes.

  17. The DDBD Method In The A-Seismic Design of Anchored Diaphragm Walls

    SciTech Connect

    Manuela, Cecconi; Vincenzo, Pane; Sara, Vecchietti

    2008-07-08

    The development of displacement-based approaches for earthquake engineering design appears to be very useful and capable of providing improved reliability by directly comparing computed response and expected structural performance. In particular, the design procedure known as the Direct Displacement Based Design (DDBD) method, developed in structural engineering over the past ten years in an attempt to mitigate some of the deficiencies of current force-based design methods, has been shown to be very effective and promising ([1], [2]). The first attempts to apply the procedure to geotechnical engineering and, in particular, to earth retaining structures are discussed in [3], [4] and [5]. In this field, however, the outcomes of the research need to be investigated further in many respects. The paper focuses on the application of the DDBD method to anchored diaphragm walls. The results of the DDBD method are discussed in detail in the paper, and compared to those obtained from conventional pseudo-static analyses.

  18. Seismic design or retrofit of buildings with metallic structural fuses by the damage-reduction spectrum

    NASA Astrophysics Data System (ADS)

    Li, Gang; Jiang, Yi; Zhang, Shuchuan; Zeng, Yan; Li, Qiang

    2015-03-01

    Recently, the structural fuse has become an important issue in the field of earthquake engineering. Due to the trilinearity of the pushover curve of buildings with metallic structural fuses, the mechanism of the structural fuse is investigated through the ductility equation of a single-degree-of-freedom system, and the corresponding damage-reduction spectrum is proposed to design and retrofit buildings. Furthermore, the controlling parameters, namely the stiffness ratio between the main frame and the structural fuse and the ductility factor of the main frame, are parametrically studied, and it is shown that the structural fuse concept can be achieved by specific combinations of the controlling parameters based on the proposed damage-reduction spectrum. Finally, a design example and a retrofit example, variations of real engineering projects after the 2008 Wenchuan earthquake, are provided to demonstrate the effectiveness of the proposed design procedures using buckling restrained braces as the structural fuses.
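The trilinear pushover curve that underlies this scheme follows from the fuse yielding before the main frame. A minimal sketch (all stiffness and drift values are invented for illustration, not taken from the paper):

```python
# Trilinear base-shear vs. roof-drift curve of frame + metallic fuse:
# stage 1: frame and fuse both elastic (stiffness kf + kd);
# stage 2: fuse yielded, frame still elastic (stiffness kf);
# stage 3: frame yielded (plateau). The fuse is sized to yield first.
kf, kd = 10.0, 20.0            # frame / fuse stiffness (illustrative units)
d_fuse, d_frame = 0.01, 0.03   # yield drifts of fuse and frame

def base_shear(d):
    v = (kf + kd) * min(d, d_fuse)            # both elastic
    if d > d_fuse:
        v += kf * (min(d, d_frame) - d_fuse)  # fuse yielded, frame elastic
    return v                                  # constant beyond d_frame

for d in (0.005, 0.02, 0.05):
    print(round(base_shear(d), 3))
```

The stiffness ratio kd/kf is precisely the first of the two controlling parameters the abstract names.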

  19. Optimization for performance-based design under seismic demands, including social costs

    NASA Astrophysics Data System (ADS)

    Möller, Oscar; Foschi, Ricardo O.; Ascheri, Juan P.; Rubinstein, Marcelo; Grossman, Sergio

    2015-06-01

    Performance-based design in earthquake engineering is a structural optimization problem that has, as the objective, the determination of design parameters for the minimization of total costs, while at the same time satisfying minimum reliability levels for the specified performance criteria. Total costs include those for construction and structural damage repairs, those associated with non-structural components and the social costs of economic losses, injuries and fatalities. This paper presents a general framework to approach this problem, using a numerical optimization strategy and incorporating the use of neural networks for the evaluation of dynamic responses and the reliability levels achieved for a given set of design parameters. The strategy is applied to an example of a three-story office building. The results show the importance of considering the social costs, and the optimum failure probabilities when minimum reliability constraints are not taken into account.
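The structure of the optimization problem (minimize total cost subject to a minimum-reliability constraint) can be sketched in one design dimension. The fragility and cost figures below are purely illustrative stand-ins, not values from the paper, and a simple grid search replaces the paper's neural-network-assisted strategy:

```python
import numpy as np

# Hypothetical one-parameter illustration: design strength x (normalized).
def p_failure(x):                # stand-in fragility: higher strength -> lower Pf
    return np.exp(-3.0 * x)

def total_cost(x, c0=1.0, c_strength=0.8, c_loss=50.0):
    construction = c0 + c_strength * x
    expected_loss = c_loss * p_failure(x)    # damage repair + social losses
    return construction + expected_loss

x_grid = np.linspace(0.1, 3.0, 300)
costs = total_cost(x_grid)
feasible = p_failure(x_grid) <= 1e-2         # minimum-reliability constraint
x_opt = x_grid[feasible][np.argmin(costs[feasible])]
print(round(float(x_opt), 2), round(float(p_failure(x_opt)), 4))
```

Raising `c_loss` (the social-cost weight) pushes the optimum toward stronger designs, which is the qualitative effect the paper reports.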

  20. Appraising the value of independent EIA follow-up verifiers

    SciTech Connect

    Wessels, Jan-Albert

    2015-01-15

    Independent Environmental Impact Assessment (EIA) follow-up verifiers such as monitoring agencies, checkers, supervisors and control officers are active on various construction sites across the world. There are, however, differing views on the value that these verifiers add and very limited learning in EIA has been drawn from independent verifiers. This paper aims to appraise how and to what extent independent EIA follow-up verifiers add value in major construction projects in the developing country context of South Africa. A framework for appraising the role of independent verifiers was established and four South African case studies were examined through a mixture of site visits, project document analysis, and interviews. Appraisal results were documented in the performance areas of: planning, doing, checking, acting, public participating and integration with other programs. The results indicate that independent verifiers add most value to major construction projects when involved with screening EIA requirements of new projects, allocation of financial and human resources, checking legal compliance, influencing implementation, reporting conformance results, community and stakeholder engagement, integration with self-responsibility programs such as environmental management systems (EMS), and controlling records. It was apparent that verifiers could be more creatively utilized in pre-construction preparation, providing feedback of knowledge into assessment of new projects, giving input to the planning and design phase of projects, and performance evaluation. The study confirms the benefits of proponent and regulator follow-up, specifically in having independent verifiers that disclose information, facilitate discussion among stakeholders, are adaptable and proactive, aid in the integration of EIA with other programs, and instill trust in EIA enforcement by conformance evaluation. Overall, the study provides insight on how to harness the learning opportunities

  1. RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES

    EPA Science Inventory

    On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills....

  2. Seismic verification of a comprehensive test ban

    SciTech Connect

    Hannon, W.J.

    1985-01-18

    The capabilities of in-country seismic-monitoring systems for verifying the absence of underground nuclear explosions are compared against challenges posed by possible clandestine testing schemes. Although analyses indicate that extensive networks of in-country seismic arrays are needed to verify a Comprehensive Test Ban Treaty, such networks cannot ensure that all underground nuclear explosions will be identified. Political and military judgments will determine the level of risk acceptable to each nation. 35 references, 9 figures.

  3. Towards Automatic Analysis of Election Verifiability Properties

    NASA Astrophysics Data System (ADS)

    Smyth, Ben; Ryan, Mark; Kremer, Steve; Kourjieh, Mounira

    We present a symbolic definition that captures some cases of election verifiability for electronic voting protocols. Our definition is given in terms of reachability assertions in the applied pi calculus and is amenable to automated reasoning using the software tool ProVerif. The definition distinguishes three aspects of verifiability, which we call individual, universal, and eligibility verifiability. We demonstrate the applicability of our formalism by analysing the protocols due to Fujioka, Okamoto & Ohta and a variant of the one by Juels, Catalano & Jakobsson (implemented as Civitas by Clarkson, Chong & Myers).

  4. A structural design and analysis of a piping system including seismic load

    SciTech Connect

    Hsieh, B.J.; Kot, C.A.

    1991-01-01

    The structural design/analysis of a piping system at a nuclear fuel facility is used to investigate some aspects of current design procedures. Specifically, the effect of using various stress measures, including the ASME Boiler and Pressure Vessel (B&PV) Code formulas, is evaluated. It is found that large differences in local maximum stress values may be calculated depending on the stress criterion used. However, when the global stress maxima for the entire system are compared, the differences are much smaller, being nevertheless, for some load combinations, of the order of 50 percent. The effect of using an Equivalent Static Method (ESM) analysis is also evaluated by comparing its results with those obtained from a Response Spectrum Method (RSM) analysis with the modal responses combined using the absolute summation (ABS), the square root of the sum of the squares (SRSS), and the 10 percent method (10PC). It is shown that a spectrum amplification factor (equivalent static coefficient greater than unity) of at least 1.32 must be used in the current application of the ESM analysis in order to obtain results which are conservative in all respects relative to an RSM analysis based on ABS. However, it appears that an adequate design would be obtained from the ESM approach even without the use of a spectrum amplification factor. 7 refs., 3 figs., 3 tabs.
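The two modal-combination rules compared here are simple to state. A minimal sketch with invented modal peaks (ABS is always the more conservative of the two):

```python
import numpy as np

# Peak modal responses (e.g., stress at one pipe location) from a
# response spectrum analysis -- illustrative values only.
R = np.array([12.0, 7.5, 3.2, 1.1])

abs_comb  = np.sum(np.abs(R))        # absolute summation (ABS)
srss_comb = np.sqrt(np.sum(R**2))    # square root of the sum of squares (SRSS)

print(abs_comb, round(float(srss_comb), 2))  # ABS >= SRSS always
```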

  5. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., any effect the addition will have on the seismic resistance of the existing portion of the structure. If the reviewer determines that the addition will decrease the level of seismic resistance of the... reviewer shall verify that the current level of seismic resistance of the existing building at least...

  6. 41 CFR 128-1.8006 - Seismic Safety Program requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., any effect the addition will have on the seismic resistance of the existing portion of the structure. If the reviewer determines that the addition will decrease the level of seismic resistance of the... reviewer shall verify that the current level of seismic resistance of the existing building at least...

  7. Seismic design and engineering research at the U.S. Geological Survey

    USGS Publications Warehouse

    1988-01-01

    The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data, and for the development of improved methodologies to estimate and predict earthquake ground motion. Instrumental observations of strong ground shaking induced by damaging earthquakes, and the corresponding response of man-made structures, provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.

  8. Artificial Seismic Shadow Zone by Acoustic Metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Hoon; Das, Mukunda P.

    2013-08-01

    We developed a new method of earthquake-proof engineering to create an artificial seismic shadow zone using acoustic metamaterials. Huge empty boxes with a few side-holes corresponding to the resonance frequencies of seismic waves are designed and buried around the buildings to be protected, so that the velocity of the seismic wave becomes imaginary. The meta-barrier composed of many meta-boxes attenuates the seismic waves, reducing the amplitude of the wave exponentially by dissipating the seismic energy. This is a mechanical method of converting the seismic energy into sound and heat. We estimated the sound level generated from a seismic wave. This method of area protection differs from the point protection of conventional seismic design, including the traditional cloaking method. The artificial seismic shadow zone is tested by computer simulation and compared with a normal barrier.
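A box with a side-hole is the classical Helmholtz resonator, so its resonance frequency follows the textbook formula f = (c / 2&#960;)&#183;&#8730;(A / (V&#183;L_eff)). The sketch below is illustrative only: the box dimensions and the near-surface wave speed are invented, not taken from the paper:

```python
import math

def helmholtz_frequency(c, neck_area, cavity_volume, neck_length, neck_radius):
    """Resonance frequency of a cavity with one side hole; L_eff adds
    the standard end correction of roughly 1.7 * hole radius."""
    L_eff = neck_length + 1.7 * neck_radius
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * L_eff))

# Illustrative numbers: a ~10 m buried box, hole radius 0.5 m, and a
# near-surface wave speed of ~300 m/s (assumed, site-dependent).
r = 0.5
f = helmholtz_frequency(c=300.0, neck_area=math.pi * r**2,
                        cavity_volume=1000.0, neck_length=0.3, neck_radius=r)
print(round(f, 2))   # lands in the few-Hz band of damaging seismic waves
```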

  9. SOAR Telescope seismic performance II: seismic mitigation

    NASA Astrophysics Data System (ADS)

    Elias, Jonathan H.; Muñoz, Freddy; Warner, Michael; Rivera, Rossano; Martínez, Manuel

    2016-07-01

    We describe design modifications to the SOAR telescope intended to reduce the impact of future major earthquakes, based on the facility's experience during recent events, most notably the September 2015 Illapel earthquake. Specific modifications include a redesign of the encoder systems for both azimuth and elevation, seismic trigger for the emergency stop system, and additional protections for the telescope secondary mirror system. The secondary mirror protection may combine measures to reduce amplification of seismic vibration and "fail-safe" components within the assembly. The status of these upgrades is presented.

  10. Building and Verifying a Predictive Model of Interruption Resumption

    DTIC Science & Technology

    2012-03-01

    willingly anthropomorphize robots with very little evidence that the robot can think or act for itself [31]–[33]. We hoped that the combination of following...equal: The design and perception of humanoid robot heads," in Proc. 4th Int. Conf. Designing Trafton et al.: Building and Verifying a Predictive Model of...models of humanoid robots," Proc. IEEE Int. Conf. Robot. Autom., 2005, pp. 2767–2772. [34] T. Fawcett, "An introduction to ROC analysis," Pattern

  11. Verifiable Quantum Encryption and its Practical Applications

    NASA Astrophysics Data System (ADS)

    Shi, Run-hua

    2016-12-01

    In this paper, we present a novel verifiable quantum encryption scheme, in which a sender encrypts a classical plaintext into a quantum ciphertext such that only a specified receiver can decrypt the ciphertext and recover the plaintext. This scheme not only ensures the unconditional security of the plaintext, but can also verify the validity of the plaintext. In addition, we consider its practical applications with key reuse and further present a practical application protocol for secure two-party quantum scalar product.

  12. An IBM 370 assembly language program verifier

    NASA Technical Reports Server (NTRS)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  13. Verifiable Quantum Encryption and its Practical Applications

    NASA Astrophysics Data System (ADS)

    Shi, Run-hua

    2017-04-01

    In this paper, we present a novel verifiable quantum encryption scheme, in which a sender encrypts a classical plaintext into a quantum ciphertext such that only a specified receiver can decrypt the ciphertext and recover the plaintext. This scheme not only ensures the unconditional security of the plaintext, but can also verify the validity of the plaintext. In addition, we consider its practical applications with key reuse and further present a practical application protocol for secure two-party quantum scalar product.

  14. Seismic upgrades of healthcare facilities.

    PubMed

    Yusuf, A

    1997-06-01

    Before 1989 seismic upgrading of hospital structures was not a primary consideration among hospital owners. However, after extensive earthquake damage to hospital buildings at Loma Prieta in Northern California in 1989 and then at Northridge in Southern California in 1994, hospital owners, legislators, and design teams became concerned about the need for seismic upgrading of existing facilities. Because the damage hospital structures sustained in the earthquakes was so severe and far-reaching, California has enacted laws that mandate seismic upgrading for existing facilities. Now hospital owners will have to upgrade buildings that do not conform to statewide seismic adequacy laws. By 2030, California expects all of its hospital structures to be sufficiently seismic-resistant. Slowly, regions in the Midwest and on the East Coast are following their example. This article outlines reasons and ways for seismic upgrading of existing facilities.

  15. Seismic offset balancing

    SciTech Connect

    Ross, C.P.; Beale, P.L.

    1994-01-01

    The ability to successfully predict lithology and fluid content from reflection seismic records using AVO techniques is contingent upon accurate pre-analysis conditioning of the seismic data. However, all too often, residual amplitude effects remain after the many offset-dependent processing steps are completed. Residual amplitude effects often represent a significant error when compared to the amplitude variation with offset (AVO) response that the authors are attempting to quantify. They propose a model-based, offset-dependent amplitude balancing method that attempts to correct for these residuals and other errors due to sub-optimal processing. Seismic offset balancing attempts to quantify the relationship between the offset response of background seismic reflections and corresponding theoretical predictions for the average lithologic interfaces thought to cause these background reflections. It is assumed that any deviation from the theoretical response is a result of residual processing phenomena and/or sub-optimal processing, and a simple offset-dependent scaling function is designed to correct for these differences. This function can then be applied to seismic data over both prospective and nonprospective zones within an area where the theoretical values are appropriate and the seismic characteristics are consistent. A conservative application of the above procedure results in an AVO response over both gas sands and wet sands that is much closer to theoretically expected values. A case history from the Gulf of Mexico Flexure Trend is presented as an example to demonstrate the offset balancing technique.
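A minimal sketch of such an offset-dependent scaling function: per offset bin, a scalar maps the observed mean background amplitude to the theoretical background trend, and the same scalars are then applied to target reflections. All amplitudes below are synthetic, not data from the case history:

```python
import numpy as np

offsets = np.array([200.0, 600.0, 1000.0, 1400.0])         # offset bins [m]
theory  = np.array([1.00, 0.95, 0.88, 0.80])               # modelled background AVO
observed_background = np.array([1.10, 0.90, 0.70, 0.50])   # residual decay in data

scalars = theory / observed_background     # offset-dependent correction
balanced = observed_background * scalars   # background now matches theory

# The same scalars are then applied to target (prospective) reflections:
target = np.array([0.60, 0.58, 0.50, 0.40])
target_balanced = target * scalars
print(np.round(scalars, 3))
```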

  16. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  17. 37 CFR 2.33 - Verified statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under § 2.20 of the applicant's continued use or bona fide intention to use the mark in commerce. (d) (e... COMMERCE RULES OF PRACTICE IN TRADEMARK CASES The Written Application § 2.33 Verified statement. (a) The... behalf of the applicant under § 2.193(e)(1). (b)(1) In an application under section 1(a) of the Act,...

  18. Firms Verify Online IDs Via Schools

    ERIC Educational Resources Information Center

    Davis, Michelle R.

    2008-01-01

    Companies selling services to protect children and teenagers from sexual predators on the Internet have enlisted the help of schools and teachers to verify students' personal information. Those companies are also sharing some of the information with Web sites, which can pass it along to businesses for use in targeting advertising to young…

  19. Seismic design technology for breeder reactor structures. Volume 2. Special topics in soil/structure interaction analyses

    SciTech Connect

    Reddy, D.P.

    1983-04-01

    This volume is divided into six chapters: definition of seismic input ground motion, review of state-of-the-art procedures, analysis guidelines, rock/structure interaction analysis example, comparison of two- and three-dimensional analyses, and comparison of analyses using FLUSH and TRI/SAC Codes. (DLC)

  20. Procedures for computing site seismicity

    NASA Astrophysics Data System (ADS)

    Ferritto, John

    1994-02-01

    This report was prepared as part of the Navy's Seismic Hazard Mitigation Program. The Navy has numerous bases located in seismically active regions throughout the world. Safe, effective design of waterfront structures requires determining expected earthquake ground motion. The Navy's problem is further complicated by the presence of soft, saturated marginal soils that can significantly amplify the levels of seismic shaking, as evidenced in the 1989 Loma Prieta earthquake. The Naval Facilities Engineering Command's seismic design manual, NAVFAC P-355.1, requires a probabilistic assessment of ground motion for the design of essential structures. This report presents the basis for the Navy's Seismic Hazard Analysis procedure, which was developed to be used with the Seismic Hazard Analysis computer program and its user's manual. The report also presents data on geology and seismology to establish the background for the seismic hazard model developed. The procedure uses the historical epicenter database and available geologic data, together with source models, recurrence models, and attenuation relationships, to compute the probability distribution of site acceleration and an appropriate spectrum. The report discusses the stochastic model developed for seismic hazard evaluation and the associated research.
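Probabilistic statements of ground-motion hazard of this kind usually rest on a Poisson exceedance model: if shaking above a level occurs at annual rate lam, the probability of at least one exceedance in t years is 1 - exp(-lam*t). A minimal sketch:

```python
import math

def prob_exceedance(annual_rate, years):
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

# Example: the common design target of 10% probability of exceedance in
# 50 years corresponds to an annual rate of ~1/475 (475-year return period).
lam = -math.log(1.0 - 0.10) / 50.0
print(round(1.0 / lam))                        # return period in years
print(round(prob_exceedance(lam, 50.0), 2))    # recovers the 10% target
```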

  1. Optimised resource construction for verifiable quantum computation

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham; Wallden, Petros

    2017-04-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph.

  2. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness through a mathematical proof. Hardware verification research has focused on device verification and has largely ignored verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

  3. Design and Implementation of a Wireless Sensor Network of GPS-enabled Seismic Sensors for the Study of Glaciers and Ice Sheets

    NASA Astrophysics Data System (ADS)

    Bilen, S. G.; Anandakrishnan, S.; Urbina, J. V.

    2012-12-01

    In an effort to provide new and improved geophysical sensing capabilities for the study of ice sheets in Antarctica and Greenland, or to study mountain glaciers, we are developing a network of wirelessly interconnected seismic and GPS sensor nodes (called "geoPebbles"), with the primary objective of making such instruments more capable and cost effective. We describe our design methodology, which has enabled us to develop these state-of-the-art sensors using commercial-off-the-shelf hardware combined with custom-designed hardware and software. Each geoPebble is a self-contained, wirelessly connected sensor for collecting seismic measurements and position information. Each node is built around a three-component seismic recorder, which includes an amplifier, filter, and 24-bit analog-to-digital card that can sample up to 10 kHz. Each unit also includes a microphone channel to record the ground-coupled airwave. The timing for each node is available through a carrier-phase measurement of the L1 GPS signal at an absolute accuracy of better than a microsecond. Each geoPebble includes 16 GB of solid-state storage, wireless communications capability to a central supervisory unit, and auxiliary measurement capability (up to eight 10-bit channels at low sample rates). We will report on current efforts to test this new instrument and how we are addressing the challenges imposed by the extreme weather conditions on the Antarctic continent. After fully validating its operational conditions, the geoPebble system will be available for NSF-sponsored glaciology research projects. Geophysical experiments in the polar region are logistically difficult. With the geoPebble system, the cost of doing today's experiments (low-resolution, 2D) will be significantly reduced, and the cost and feasibility of doing tomorrow's experiments (integrated seismic, positioning, 3D, etc.) will be reasonable. Sketch of an experiment with geoPebbles scattered on the surface of the ice sheet. 
The seismic

  4. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
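The core bookkeeping behind any fault-source hazard calculation of this kind is the conversion of a magnitude-frequency relationship into annual occurrence rates per magnitude bin. A minimal sketch, assuming a doubly truncated Gutenberg-Richter recurrence; the function name and parameters are illustrative, not OpenQuake's API:

```python
def truncated_gr_rates(a, b, m_min, m_max, bin_width=0.1):
    """Annual occurrence rates per magnitude bin for a doubly truncated
    Gutenberg-Richter distribution, log10 N(>=m) = a - b*m, truncated at
    m_min and m_max.  Returns (bin-centre magnitude, annual rate) pairs."""
    rates = []
    m = m_min
    while m < m_max - 1e-9:
        lo, hi = m, min(m + bin_width, m_max)
        # Cumulative rate above each bin edge, with the m_max tail removed.
        n_lo = 10 ** (a - b * lo) - 10 ** (a - b * m_max)
        n_hi = 10 ** (a - b * hi) - 10 ** (a - b * m_max)
        rates.append((round((lo + hi) / 2, 2), n_lo - n_hi))
        m += bin_width
    return rates
```

The per-bin rates telescope back to the total rate between the truncation magnitudes, which is a useful sanity check on any implementation.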

  5. Seismic Discrimination

    DTIC Science & Technology

    1981-03-31

for second-order Sturm-Liouville boundary-value problems, such a count of eigenvalues may be established in terms of the number of zero crossings of...will be operational during the next six months. Section 11 describes a series of activities in the development and implementation of the seismic...element of seismic research, with emphasis on those areas directly related to the operations of the SDC. Substantial progress has been made in the

  6. Seismic seiches

    USGS Publications Warehouse

    McGarr, Arthur; Gupta, Harsh K.

    2011-01-01

    Seismic seiche is a term first used by Kvale (1955) to discuss oscillations of lake levels in Norway and England caused by the Assam earthquake of August 15, 1950. This definition has since been generalized to apply to standing waves set up in closed, or partially closed, bodies of water including rivers, shipping channels, lakes, swimming pools and tanks due to the passage of seismic waves from an earthquake.

  7. Impact of lateral force-resisting system and design/construction practices on seismic performance and cost of tall buildings in Dubai, UAE

    NASA Astrophysics Data System (ADS)

    AlHamaydeh, Mohammad; Galal, Khaled; Yehia, Sherif

    2013-09-01

The local design and construction practices in the United Arab Emirates (UAE), together with Dubai's unique rate of development, warrant special attention to the selection of Lateral Force-Resisting Systems (LFRS). This research proposes four different feasible solutions for the selection of the LFRS for tall buildings and quantifies the impact of these selections on seismic performance and cost. The systems considered are: Steel Special Moment-Resisting Frame (SMRF), Concrete SMRF, Steel Dual System (SMRF with Special Steel Plate Shear Wall, SPSW), and Concrete Dual System (SMRF with Special Concrete Shear Wall, SCSW). The LFRS selection is driven by the seismic setting as well as the adopted design and construction practices in Dubai. It is found that the concrete design alternatives are consistently less expensive than their steel counterparts. The steel dual system is expected to sustain the least damage, based on its relatively smaller interstory drifts. However, this preferred performance comes at a higher initial construction cost. Conversely, the steel SMRF system is expected to sustain the most damage and associated repair cost due to its excessive flexibility. The two concrete alternatives are expected to sustain relatively moderate damage and repair costs, in addition to their lower initial construction cost.

  8. Seismic isolation of nuclear power plants using sliding isolation bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

Nuclear power plants (NPP) are designed for earthquake shaking with very long return periods. Seismic isolation is a viable strategy to protect NPPs from extreme earthquake shaking because it filters out a significant fraction of the earthquake input energy. This study addresses the seismic isolation of NPPs using sliding bearings, with a focus on the single concave Friction Pendulum(TM) (FP) bearing. Friction at the sliding surface of an FP bearing changes continuously during an earthquake as a function of sliding velocity, axial pressure and temperature at the sliding surface. The temperature at the sliding surface, in turn, is a function of the histories of the coefficient of friction, sliding velocity and axial pressure, and of the travel path of the slider. A simple model to describe the complex interdependence of the coefficient of friction, axial pressure, sliding velocity and temperature at the sliding surface is proposed, and then verified and validated. Seismic hazard for a seismically isolated nuclear power plant is defined in the United States using a uniform hazard response spectrum (UHRS) at mean annual frequencies of exceedance (MAFE) of 10-4 and 10-5. A key design parameter is the clearance to the hard stop (CHS), which is influenced substantially by the definition of the seismic hazard. Four alternate representations of seismic hazard are studied, which incorporate different variabilities and uncertainties. Response-history analyses performed on single FP-bearing isolation systems using ground motions consistent with the four representations at the two shaking levels indicate that the CHS is influenced primarily by whether the observed difference between the two horizontal components of ground motion in a given set is accounted for. The UHRS at the MAFE of 10-4 is increased by a design factor (≥ 1) for a conventional (fixed-base) nuclear structure to achieve a target annual frequency of unacceptable performance. Risk oriented calculations are performed for
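The velocity dependence of sliding friction described above is commonly captured with an exponential law of the form mu(v) = mu_fast - (mu_fast - mu_slow)·exp(-a|v|). The sketch below uses that form; the linear temperature reduction and all parameter values are illustrative placeholders, not the verified model from this study:

```python
import math

def friction_coefficient(velocity, mu_slow=0.04, mu_fast=0.08, rate=20.0,
                         temperature=20.0, ref_temp=20.0, heat_factor=0.0005):
    """Sliding friction for an FP bearing.  Velocity (m/s) dependence follows
    the widely used exponential form; the temperature term below is a crude
    illustrative stand-in for frictional-heating effects, clamped so friction
    never drops below half of mu_slow."""
    mu_v = mu_fast - (mu_fast - mu_slow) * math.exp(-rate * abs(velocity))
    return max(mu_slow * 0.5,
               mu_v * (1.0 - heat_factor * (temperature - ref_temp)))
```

The qualitative behaviour matches the abstract: friction rises from its slow-sliding value toward a high-velocity plateau, and heating at the sliding surface reduces it.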

  9. Recent advances in the Lesser Antilles observatories Part 1 : Seismic Data Acquisition Design based on EarthWorm and SeisComP

    NASA Astrophysics Data System (ADS)

    Saurel, Jean-Marie; Randriamora, Frédéric; Bosson, Alexis; Kitou, Thierry; Vidal, Cyril; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

    2010-05-01

Lesser Antilles observatories are in charge of monitoring the volcanoes and earthquakes in the Eastern Caribbean region. During the past two years, our seismic networks have evolved toward fully digital technology. These changes, which include modern three-component sensors, high-dynamic-range digitizers, and high-speed terrestrial and satellite telemetry, improve data quality but also increase the data flows to process and store. Moreover, the generalization of data exchange to build a wide virtual seismic network around the Caribbean domain requires great flexibility in providing and receiving data flows in various formats. Like many observatories, we have decided to use the most popular and robust open-source data acquisition systems in today's observatory community: EarthWorm and SeisComP. The former is renowned for its ability to process real-time seismic data flows, with a high number of tunable modules (filters, triggers, automatic pickers, locators). The latter is renowned for its ability to exchange seismic data using the international SEED standard (Standard for Exchange of Earthquake Data), either by producing archive files or by managing output and input SEEDLink flows. The French Antilles Seismological and Volcanological Observatories have chosen to take advantage of the best features of each software package to design a new data flow scheme and to integrate it into our global observatory data management system, WebObs [Beauducel et al., 2004]1, see the companion paper (Part 2). 
We assigned tasks to the different software packages according to their main abilities: - EarthWorm first performs the integration of data from heterogeneous sources; - SeisComP takes this homogeneous EarthWorm data flow, adds other sources, and produces SEED archives and a SEED data flow; - EarthWorm is then used again to process this clean and complete SEEDLink data flow, mainly producing triggers, automatic locations and alarms; - WebObs provides a friendly human interface, both
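The division of labour described above (ingest heterogeneous sources, homogenize, archive, then trigger) can be mimicked in miniature. The sketch below is a toy pipeline; its stage logic, packet format, and threshold are illustrative, and none of its names correspond to actual EarthWorm or SeisComP modules:

```python
import queue

def acquisition_pipeline(raw_packets):
    """Toy mirror of the observatory data flow: heterogeneous packets are
    homogenized (EarthWorm's role), archived (SeisComP's role), then scanned
    for triggers (EarthWorm again).  Purely illustrative."""
    homogenized = queue.Queue()
    for pkt in raw_packets:                      # stage 1: ingest anything
        homogenized.put({"station": pkt.get("sta", "?"),
                         "samples": pkt.get("data", [])})
    archive, triggers = [], []
    while not homogenized.empty():               # stages 2 and 3
        rec = homogenized.get()
        archive.append(rec)                      # stand-in for a SEED archive
        if rec["samples"] and max(abs(s) for s in rec["samples"]) > 1.0:
            triggers.append(rec["station"])      # crude amplitude trigger
    return archive, triggers
```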

  10. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-11-18

Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially, thereby producing a serial result, including ensuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads, thereby producing a speculative result; and determining whether a speculative multithreading error exists, including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
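The claimed check can be sketched in a few lines: run the same work serially and in threads, then flag any mismatch. The summation workload and all function names below are illustrative, not from the patent:

```python
import threading

def serial_sum(data):
    # Serial reference execution: data dependencies trivially satisfied.
    total = 0
    for x in data:
        total += x
    return total

def speculative_sum(data, n_threads=4):
    # Toy "speculative" execution: threads work on disjoint strided chunks
    # and partial results are combined afterwards.
    chunks = [data[i::n_threads] for i in range(n_threads)]
    partials = [0] * n_threads

    def work(i):
        partials[i] = sum(chunks[i])

    threads = [threading.Thread(target=work, args=(i,)) for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partials)

def has_speculation_error(data):
    # The patented check: a mismatch between the serial and speculative
    # results signals a speculative multithreading error.
    return serial_sum(data) != speculative_sum(data)
```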

  11. Verifying speculative multithreading in an application

    DOEpatents

    Felton, Mitchell D

    2014-12-09

Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially, thereby producing a serial result, including ensuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads, thereby producing a speculative result; and determining whether a speculative multithreading error exists, including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.

  12. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of the seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed the beyond-design-basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  13. Coherence estimation algorithm using Kendall's concordance measurement on seismic data

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Gao, Jing-Huai; Zhang, Bing; Wang, Da-Xing

    2016-09-01

Coherence methods are widely used to describe the discontinuity and heterogeneity of seismic data. Traditional coherence methods use a linear correlation coefficient to measure the relationship between two random variables (i.e., between two seismic traces). Mathematically, however, a linear correlation coefficient cannot describe nonlinear relationships between variables. To overcome this limitation, we propose an improved concordance measurement algorithm based on Kendall's tau, focusing on the sensitivity of the linear correlation coefficient and of concordance measurements to the waveform. Two designed numerical models are used to test how waveform similarity is affected by these two factors. Analysis of both the numerical model results and real seismic data processing suggests that the proposed method, combined with an information divergence measurement, can precisely characterize variations in waveform and the heterogeneity of an underground geological body, and does so with high resolution. In addition, we verified its effectiveness through application to real seismic data from northern China.
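Kendall's tau itself is straightforward to compute from pairwise concordance counts. A minimal O(n²) reference version (tied pairs are simply ignored here, a detail the abstract does not specify):

```python
def kendall_tau(x, y):
    """Kendall's tau: concordance measure between two equal-length traces.
    Unlike the linear (Pearson) coefficient, it responds to any monotone
    relationship, not just a linear one."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1    # pair ordered the same way in both traces
            elif s < 0:
                discordant += 1    # pair ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)
```

Note that a perfectly monotone but nonlinear pairing (e.g. y = x²) still yields tau = 1, which is exactly the property that motivates replacing the linear coefficient.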

  14. Verifying disarmament: scientific, technological and political challenges

    SciTech Connect

    Pilat, Joseph R

    2011-01-25

There is growing interest in, and hopes for, nuclear disarmament in governments and nongovernmental organizations (NGOs) around the world. If a nuclear-weapon-free world is to be achievable, verification and compliance will be critical. Verifying disarmament would pose unprecedented scientific, technological and political challenges. Verification would have to address warheads, components, materials, testing, facilities, delivery capabilities, virtual capabilities from existing or shut-down nuclear weapon programs and existing nuclear energy programs, and material and weapon production and related capabilities. Moreover, it would likely have far more stringent requirements. The verification of dismantlement or elimination of nuclear warheads and components is widely recognized as the most pressing problem. There has been considerable research and development done in the United States and elsewhere on warhead and dismantlement transparency and verification since the early 1990s. However, we do not today know how to verify low numbers or zero. We need to develop the verification tools and systems approaches that would allow us to meet this complex set of challenges. There is a real opportunity to explore verification options and, given any realistic time frame for disarmament, there is considerable scope to invest resources at the national and international levels to undertake research, development and demonstrations in an effort to address the anticipated and perhaps unanticipated verification challenges of disarmament now and for the next decades. Cooperative approaches have the greatest possibility for success.

  15. 7 CFR 1792.104 - Seismic acknowledgments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant to § 1792.103 of this subpart will be used in the design of the building. (a) For projects in which... include the identification and date of the model code or standard that is used in the seismic design...

  16. Proving Correctness for Pointer Programs in a Verifying Compiler

    NASA Technical Reports Server (NTRS)

    Kulczycki, Gregory; Singh, Amrinder

    2008-01-01

    This research describes a component-based approach to proving the correctness of programs involving pointer behavior. The approach supports modular reasoning and is designed to be used within the larger context of a verifying compiler. The approach consists of two parts. When a system component requires the direct manipulation of pointer operations in its implementation, we implement it using a built-in component specifically designed to capture the functional and performance behavior of pointers. When a system component requires pointer behavior via a linked data structure, we ensure that the complexities of the pointer operations are encapsulated within the data structure and are hidden to the client component. In this way, programs that rely on pointers can be verified modularly, without requiring special rules for pointers. The ultimate objective of a verifying compiler is to prove-with as little human intervention as possible-that proposed program code is correct with respect to a full behavioral specification. Full verification for software is especially important for an agency like NASA that is routinely involved in the development of mission critical systems.

  17. New sensitive seismic cable with imbedded geophones

    NASA Astrophysics Data System (ADS)

    Pakhomov, Alex; Pisano, Dan; Goldburt, Tim

    2005-10-01

    Seismic detection systems for homeland security applications are an important additional layer to perimeter and border protection and other security systems. General Sensing Systems has been developing low mass, low cost, highly sensitive geophones. These geophones are being incorporated within a seismic cable. This article reports on the concept of a seismic sensitive cable and seismic sensitive ribbon design. Unlike existing seismic cables with sensitivity distributed along their lengths, the GSS new cable and ribbon possesses high sensitivity distributed in several points along the cable/ribbon with spacing of about 8-12 to 100 meters between geophones. This cable/ribbon is highly suitable for design and installation in extended perimeter protection systems. It allows the use of a mechanical cable layer for high speed installation. We show that any installation mistakes in using the GSS seismic sensitive cable/ribbon have low impact on output seismic signal value and detection range of security systems.

  18. Verifiable process monitoring through enhanced data authentication.

    SciTech Connect

    Goncalves, Joao G. M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-09-01

To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear process control. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
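The capture-authenticate-timestamp step can be illustrated with a symmetric HMAC in place of the public-key signatures EDAS actually employs (Python's standard library has no public-key primitives). All names, the record format, and the key are hypothetical:

```python
import hashlib
import hmac
import json
import time

# Placeholder shared secret; EDAS itself uses public-key signatures instead.
SHARED_KEY = b"inspectorate-demo-key"

def branch_record(raw, source, key=SHARED_KEY, timestamp=None):
    """Wrap captured sensor bytes with source and timestamp, then append an
    authentication tag so the branched copy can be verified for integrity
    and origin."""
    record = {"source": source,
              "timestamp": timestamp if timestamp is not None else time.time(),
              "payload": raw.hex()}
    body = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return body, tag

def verify_record(body, tag, key=SHARED_KEY):
    # Constant-time comparison of the recomputed tag against the stored one.
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

Any tampering with the branched record after capture invalidates the tag, which is the property the safeguards stream relies on.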

  19. Seismic Data Gathering and Validation

    SciTech Connect

    Coleman, Justin

    2015-02-01

Three recent earthquakes in the last seven years have exceeded their design basis earthquake values (so it is implied that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011, detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007, [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety-class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using that data for two purposes. One purpose is to determine what margins exist in current industry-standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site response tools and SSI tools. The gathered data represent free-field soil and in-structure acceleration time histories. The gathered data also include elastic and dynamic soil properties and structural drawings. Gathering data and comparing them with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), decisions on needed protective measures, such as SI, can be made.

  20. Wide-angle Marine Seismic Refraction Imaging of Vertical Faults: Pre-Stack Turning Wave Migrations of Synthetic Data and Implications for Survey Design

    NASA Astrophysics Data System (ADS)

    Miller, N. C.; Lizarralde, D.; McGuire, J.; Hole, J. A.

    2006-12-01

We consider methodologies, including survey design and processing algorithms, best suited to imaging vertical reflectors in oceanic crust using marine seismic techniques. The ability to image the reflectivity structure of transform faults as a function of depth, for example, may provide new insights into what controls seismicity along these plate boundaries. Turning-wave migration has been used with success to image vertical faults on land. With synthetic datasets we find that this approach has unique difficulties in the deep ocean. The fault-reflected crustal refraction phase (Pg-r) typically used in pre-stack migrations is difficult to isolate in marine seismic data. An "imagable" Pg-r is only observed in a time window between the first arrivals and arrivals from the sediments and the thick, slow water layer at offsets beyond ~25 km. Ocean-bottom seismometers (OBSs), as opposed to a long surface streamer, must be used to acquire data suitable for crustal-scale vertical imaging. The critical distance for Moho reflections (PmP) in oceanic crust is also ~25 km; thus Pg-r and PmP-r are observed with very little separation, and the fault-reflected mantle refraction (Pn-r) arrives prior to Pg-r as the observation window opens with increased OBS-to-fault distance. This situation presents difficulties for "first-arrival" based Kirchhoff migration approaches and suggests that wave-equation approaches, which in theory can image all three phases simultaneously, may be more suitable for vertical imaging in oceanic crust. We will present a comparison of these approaches as applied to a synthetic dataset generated from realistic, stochastic velocity models. We will assess their suitability, the migration artifacts unique to the deep ocean, and the ideal instrument layout for such an experiment.
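The ~25 km critical distance for Moho reflections follows from Snell's law for a flat reflector. A sketch with assumed oceanic-crust values; the velocities and interface depth below are illustrative, not taken from the abstract:

```python
import math

def critical_offset(v_upper, v_lower, depth):
    """Source-receiver offset at which a reflection from a flat interface at
    `depth` reaches the critical angle (source and receiver at the same
    level, straight ray paths)."""
    ic = math.asin(v_upper / v_lower)   # Snell's law critical angle
    return 2.0 * depth * math.tan(ic)

# Assumed illustrative values: ~7 km/s lower crust over ~8 km/s mantle,
# interface ~7 km below the instruments, giving an offset in the ~25 km range.
```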

  1. Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network

    NASA Astrophysics Data System (ADS)

    Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

    2004-12-01

Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry-based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology-based earth sciences curriculum development process. We are developing a pedagogical framework, graduate-level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from hundreds of seismograph stations in Nevada, California, and the world (including live data from Earthworm systems and the IRIS DMC BUD - Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition

  2. Seismic monitoring of Poland - temporary seismic project - first results

    NASA Astrophysics Data System (ADS)

    Trojanowski, J.; Plesiewicz, B.; Wiszniowski, J.; Suchcicki, J.; Tokarz, A.

    2012-04-01

The aim of the project is to develop a national database of seismic activity for seismic hazard assessment. Poland is known as a region of very low seismicity; however, some earthquakes occur from time to time. The historical catalogue consists of fewer than one hundred earthquakes over a time span of almost one thousand years. Due to such a low occurrence rate, the study has focussed on events at magnitudes lower than 2, which are more likely to occur during a few-year-long project. There are 24 mobile seismic stations involved in the project, deployed in temporary locations close to human habitation. This causes a high level of noise and disturbances in the recorded seismic signal. Moreover, the majority of Polish territory is covered by thick sediments. This raises the problem of a reliable detection method for small seismic events in noisy data. The majority of algorithms are based on the concept of the STA/LTA ratio and are designed for strong teleseismic events registered on many stations. Unfortunately, they fail on the problem of weak events in signal with noise and disturbances. It was decided to apply a Real Time Recurrent Neural Network (RTRN) to detect small natural seismic events in Poland. This method is able to assess relationships in the seismic signal in the frequency domain as well as in the timing of seismic phases. The RTRN was trained on a wide range of seismic signals - regional and teleseismic events as well as blasts. The method is routinely used to analyse data from the project. In the first two years of the project the seismic network was set up in southern Poland, where relatively high seismicity is known. Since mid-2010 the stations have been working in several regions of central and northern Poland where some minor historical earthquakes occurred. Over one hundred seismic events in the magnitude range from 0.5 to 2.3 confirm the activity of the Podhale region (Tatra Mountains, Carpathians), where an earthquake of magnitude 4.3 occurred in 2004. 
Initially three
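The STA/LTA baseline that the abstract says fails for weak events can be sketched as follows; the window lengths and the rectified-average form are one common convention among several:

```python
def sta_lta(signal, sta_len, lta_len):
    """Classic short-term / long-term average ratio.  A detector fires where
    the ratio exceeds a chosen threshold.  Returns one ratio per sample
    (0.0 until a full long-term window of history has accumulated)."""
    ratios = [0.0] * len(signal)
    for i in range(lta_len, len(signal)):
        # Averages of the rectified signal over the trailing windows.
        sta = sum(abs(s) for s in signal[i - sta_len:i]) / sta_len
        lta = sum(abs(s) for s in signal[i - lta_len:i]) / lta_len
        ratios[i] = sta / lta if lta > 0 else 0.0
    return ratios
```

On a weak event buried in noise the ratio barely rises, which is the failure mode that motivates the RTRN detector described above.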

  3. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    SciTech Connect

    I. Wong

    2004-11-05

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M&O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes.

  4. Seismic Discrimination

    DTIC Science & Technology

    1975-12-31

dyn-cm. It can be seen that there is a wide range of the potential contribution of different seismic zones to excitation of the Chandler wobble ..."Correction to the Excitation of the Chandler Wobble by Earthquakes," Geophys. J. R. Astron. Soc. 32, 203-217 (1973). 22. S. C. Solomon, N. H. Sleep

  5. Seismic Tomography.

    ERIC Educational Resources Information Center

    Anderson, Don L.; Dziewonski, Adam M.

    1984-01-01

    Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

  6. Seismic Symphonies

    NASA Astrophysics Data System (ADS)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts at sound transposition of seismic records: seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system was devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper. To adapt the documents to the instrument's score, holes have been drilled at the peaks of the waves. The organ covers about three tonal scales: starting from heavy, deep sounds, it reaches up to high, jarring notes. The translation of the seismic records is based on a criterion that matches higher sounds to larger amplitudes and lower sounds to smaller ones. In translating the seismogram into the organ score, the larger the amplitude of the recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ, and the notes arouse an intense emotional response in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters of over a century of the Earth's seismic history can be revived. A bridge between art and science. Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, and its sounds are derived from the heavens and
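The stated matching criterion (larger amplitude, higher note) amounts to a simple normalization onto the organ's roughly three-octave range. A hedged sketch; the note numbering and range are illustrative, not the installation's actual score:

```python
def amplitudes_to_notes(amplitudes, n_notes=36):
    """Map seismogram peak amplitudes onto an organ-like tonal scale of
    n_notes semitones (~3 octaves): the larger the amplitude, the higher
    the note, per the installation's stated criterion."""
    peak = max(abs(a) for a in amplitudes)
    if peak == 0:
        return [0] * len(amplitudes)
    # Scale each sample into [0, n_notes - 1] by its fraction of the peak.
    return [round(abs(a) / peak * (n_notes - 1)) for a in amplitudes]
```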

  7. Derivation and implementation of a nonlinear experimental design criterion and its application to seismic network expansion at Kawerau geothermal field, New Zealand

    NASA Astrophysics Data System (ADS)

    Rawlinson, Z. J.; Townend, J.; Arnold, R.; Bannister, S.

    2012-09-01

    The accuracy with which geophysical observations are made is inherently determined by the geometry of the observation network, and typically depends on a highly non-linear relationship between data and earth parameters. Statistical experimental design provides a means of optimizing the network geometry to provide maximum information about parameters of interest. Here, we re-derive the nonlinear experimental design DN optimization method, without the need for the usual assumption of a multivariate normal model of data uncertainties. We demonstrate the criterion's utility by applying it to the problem of seismic network expansion in the active Kawerau geothermal field, Taupo Volcanic Zone, New Zealand. The design calculations maximize the ratio of the hypocentre data generalized variance (attributable to resolvable spatial separation of earthquakes) to the measurement error generalized variance (attributable to observational uncertainties), and incorporate realistic 3-D velocity and attenuation models, surface noise sources, and both P- and S-wave data. In geologically complex areas, statistical experimental design provides a means of objectively deploying finite observational resources to target areas of particular interest while taking into account environmental and logistical factors.
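    The criterion can be illustrated with a small numerical sketch. This is a toy version under strong assumptions (a uniform-velocity medium and Gaussian, uncorrelated picking errors, rather than the 3-D velocity and attenuation models used in the study; all function names are hypothetical): the D_N-style figure of merit is the ratio of the generalized variance (determinant of the covariance matrix) of the travel-time data to that of the measurement errors, and network expansion greedily adds the candidate site that maximizes it.

```python
import numpy as np

def travel_times(stations, hypocentres, v=5000.0):
    """P-wave travel times (s) from each hypocentre to each station,
    assuming a uniform velocity of v m/s (a strong simplification)."""
    d = np.linalg.norm(hypocentres[:, None, :] - stations[None, :, :], axis=2)
    return d / v

def dn_ratio(stations, hypocentres, sigma_noise=0.02):
    """D_N-style figure of merit: generalized variance of the travel-time
    data (regularized by the noise covariance so the determinant is
    well defined) over the generalized variance of the measurement errors."""
    t = travel_times(stations, hypocentres)              # (n_events, n_stations)
    c_data = np.cov(t, rowvar=False)                     # data covariance
    c_noise = sigma_noise ** 2 * np.eye(len(stations))   # uncorrelated errors
    return np.linalg.det(c_data + c_noise) / np.linalg.det(c_noise)

def best_new_station(existing, candidates, hypocentres):
    """Greedy one-site network expansion: choose the candidate that
    maximizes the D_N-style ratio when added to the existing network."""
    scores = [dn_ratio(np.vstack([existing, c[None, :]]), hypocentres)
              for c in candidates]
    return int(np.argmax(scores)), scores
```

    A site nearly co-located with an existing station contributes almost no new generalized variance, so the greedy step favours a geometrically complementary location — the qualitative behaviour the full nonlinear criterion formalizes.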

  8. Seismic Discrimination

    DTIC Science & Technology

    1977-03-31

    Determining Phase and Group Velocities of Surface Seismic Waves 21 B. Group-Velocity Measurements Across Eurasia from Mashad SRO 22 C. Group-Velocity...Albuquerque), MAIO (Mashad), GUMO (Guam), NWAO (Australia), SNZO (New Zealand), and TATO (Taiwan). Fairly extensive data are now available for the...include a new rapid algorithm for the determination of group and phase velocity, a series of observations of Rayleigh-wave dispersion at the Mashad

  9. Onshore seismic amplifications due to bathymetric features

    NASA Astrophysics Data System (ADS)

    Rodríguez-Castellanos, A.; Carbajal-Romero, M.; Flores-Guzmán, N.; Olivera-Villaseñor, E.; Kryvko, A.

    2016-08-01

    We perform numerical calculations of onshore seismic amplifications, taking into consideration the effect of bathymetric features on the propagation of seismic motion. To this end, the boundary element method is applied: boundary elements are used to radiate waves and, consequently, force densities can be obtained for each boundary element. Huygens' principle is then applied, and since the diffracted waves are constructed at the boundary from which they are radiated, this formulation is equivalent to Somigliana's representation theorem. The application of boundary conditions leads to a system of linear equations (discretized Fredholm integral equations). Several numerical models are analyzed; the first is used to verify the proposed formulation, and the others to estimate onshore seismic amplifications due to the presence of bathymetric features. The results obtained show that compressional waves (P-waves) generate onshore seismic amplifications that can vary from 1.2 to 5.2 times the amplitude of the incident wave, while shear waves (S-waves) can cause seismic amplifications of up to 4.0 times the incident wave. Furthermore, an important result is that in most cases the highest seismic amplifications from an offshore earthquake are located on the shoreline and not offshore, regardless of the seafloor configuration. The influence of the incidence angle of the seismic waves on the amplifications is also highlighted.

  10. Evaluation of verifiability in HAL/S. [programming language for aerospace computers

    NASA Technical Reports Server (NTRS)

    Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.

    1979-01-01

    The ability to write verifiable programs in HAL/S, a characteristic highly desirable in aerospace applications, is lacking, since many features of HAL/S do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language fails with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.

  11. Ground Motion Simulations for Bursa Region (Turkey) Using Input Parameters derived from the Regional Seismic Network

    NASA Astrophysics Data System (ADS)

    Unal, B.; Askan, A.

    2014-12-01

    Earthquakes are among the most destructive natural disasters in Turkey, and it is important to assess seismicity in different regions with the use of seismic networks. Bursa is located in the Marmara Region, Northwestern Turkey, to the south of the very active North Anatolian Fault Zone. With around three million inhabitants and key industrial facilities of the country, Bursa is the fourth largest city in Turkey. Since most attention has focused on the North Anatolian Fault Zone, the Bursa area, despite its significant seismicity, was not investigated extensively until recently. For reliable seismic hazard estimation and seismic design of structures, assessment of potential ground motions in this region is essential, using both recorded and simulated data. In this study, we employ stochastic finite-fault simulation with the dynamic corner frequency approach to model previous events as well as to assess potential earthquakes in Bursa. To ensure that the simulations yield reliable synthetic ground motions, the input parameters must be carefully derived from regional data. Using strong-motion data collected at 33 stations in the region, site-specific parameters such as the near-surface high-frequency attenuation parameter and site amplifications are obtained; source and path parameters are adopted from previous studies that also employ regional data. Initially, major previous events in the region are verified by comparing the records with the corresponding synthetics. Then, simulations of scenario events in the region are performed. We present the results in terms of the spatial distribution of peak ground motion parameters and time histories at selected locations.
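    The spectral shaping at the heart of such stochastic simulations can be sketched in miniature. This is a minimal point-source illustration, not the finite-fault, dynamic-corner-frequency implementation used in the study: windowed Gaussian noise is shaped by an omega-squared acceleration spectrum and the near-surface high-frequency attenuation filter exp(-pi*kappa*f); the corner frequency and kappa values below are illustrative, not calibrated for Bursa.

```python
import numpy as np

def stochastic_record(fc, kappa, dt=0.005, n=4096, seed=0):
    """Toy stochastic ground-motion trace: shape windowed Gaussian noise
    by an omega-squared acceleration source spectrum
        A(f) ~ (2*pi*f)**2 / (1 + (f/fc)**2)
    and the near-surface attenuation filter exp(-pi*kappa*f)."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n) * np.hanning(n)   # windowed white noise
    spec = np.fft.rfft(noise)
    spec = spec / (np.abs(spec) + 1e-12)             # keep phases, unit amplitude
    f = np.fft.rfftfreq(n, dt)
    shape = (2 * np.pi * f) ** 2 / (1.0 + (f / fc) ** 2) * np.exp(-np.pi * kappa * f)
    acc = np.fft.irfft(spec * shape, n)
    return acc / np.abs(acc).max()                   # normalized acceleration
```

    Increasing kappa visibly depletes the high-frequency content of the synthetic trace, which is precisely the site effect that the regional strong-motion data are used to constrain.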

  12. Designing a low-cost effective network for monitoring large scale regional seismicity in a soft-soil region (Alsace, France)

    NASA Astrophysics Data System (ADS)

    Bès de Berc, M.; Doubre, C.; Wodling, H.; Jund, H.; Hernandez, A.; Blumentritt, H.

    2015-12-01

    The Seismological Observatory of the North-East of France (ObSNEF) is developing its monitoring network within the framework of several projects. Among these projects, RESIF (Réseau sismologique et géodésique français) supports the instrumentation of broad-band seismic stations separated by 50-100 km. With the recent and future development of industrial geothermal projects in the Alsace region, the ObSNEF is responsible for designing, building and operating a dense regional seismic network able to detect and locate earthquakes with a completeness magnitude of 1.5 and without clipping for M6.0 events. The project must be completed before summer 2016. Such a project faces several complex technical and financial constraints. First, most of the Alsace region (150x150 km2), and particularly the whole Upper Rhine Graben, is a soft-soil plain where seismic signals are dominated by high-frequency noise. Second, all signals have to be transmitted in near real-time. Finally, the total cost of the project must not exceed $450,000. Regarding the noise level in Alsace, in order to achieve a reduction of 40 dB for frequencies above 1 Hz, we plan to install post-hole sensors in 50 m deep wells at 5 of the 8 planned new stations. The 3 remaining stations will be located on bedrock along the Vosges piedmont. To remain sensitive to low-magnitude regional events, we plan to install low-noise, short-period post-hole velocimeters; to avoid saturation during potential large local events (M6.0 at 10 km), each velocimeter will be coupled with a surface strong-motion sensor. Regarding connectivity, these stations will have no wired network, which reduces linking costs and delays; we will therefore use solar panels and a 3G/GPRS network. The infrastructure will be minimal, reduced to an outdoor box on a secured parcel of land. In addition to the data-logger, we will use a 12V ruggedized computer hosting a SeedLink server for near-real-time data transmission.

  13. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section B, Renovation calculations/supporting data

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and because of regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. This report is organized into seven parts. This document, Part V, Section B - Structural/Seismic Information, provides a description of the seismic and structural analyses performed on the NMSF and their results.

  14. Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.

    NASA Astrophysics Data System (ADS)

    Torres, Y.; Escalante, M. P.

    2009-04-01

    This work illustrates the advantages of using a Geographic Information System in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. As the input data present different formats, cover distinct geographical areas and are subject to different interpretations, data inconsistencies may appear and their management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with the ISO 19100 series of standards and allows the designer to define model architecture and interoperability. The GIS provides a framework for combining large volumes of geographically based data with a uniform geographic reference and without duplications. All this information carries its own metadata following the ISO 19115 standard. In this work, the integration in the same environment of active fault and subduction slab geometries, combined with epicentre locations, has facilitated the definition of seismogenic regions — a great support for national specialists from different countries in working together. The GIS capacity for making queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete, homogenized databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution and results dissemination.

  15. On the distribution of seismic reflection coefficients and seismic amplitudes

    SciTech Connect

    Painter, S.; Paterson, L.; Beresford, G.

    1995-07-01

    Reflection coefficient sequences from 14 wells in Australia have a statistical character consistent with a non-Gaussian scaling noise model based on the Levy-stable family of probability distributions. Experimental histograms of reflection coefficients are accurately approximated by symmetric Levy-stable probability density functions with Levy index between 0.99 and 1.43. These distributions have the same canonical role in mathematical statistics as the Gaussian distribution, but they have slowly decaying tails and infinite moments. The distribution of reflection coefficients is independent of the spatial scale (statistically self-similar), and the reflection coefficient sequences have long-range dependence. These results suggest that the logarithm of seismic impedance can be modeled accurately using fractional Levy motion, which is a generalization of fractional Brownian motion. Synthetic seismograms produced from the authors' model for the reflection coefficients also have Levy-stable distributions. These simulations include transmission losses, the effects of reverberations, and the loss of resolution caused by band-limited wavelets, and suggest that actual seismic amplitudes with sufficient signal-to-noise ratio should also have a Levy-stable distribution. This prediction is verified using post-stack seismic data acquired in the Timor Sea and in the continental USA. However, prestack seismic amplitudes from the Timor Sea are nearly Gaussian. The authors attribute the difference between prestack and poststack data to the high level of measurement noise in the prestack data.
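    The heavy-tailed behavior described here is easy to reproduce numerically. A minimal sketch (assuming the symmetric case, skewness beta = 0) draws Levy-stable samples with the Chambers-Mallows-Stuck transformation; for a Levy index alpha in the 0.99-1.43 range reported above, large-amplitude outliers occur at rates a Gaussian model cannot produce.

```python
import numpy as np

def sym_stable(alpha, size, rng):
    """Symmetric alpha-stable samples via the Chambers-Mallows-Stuck
    transformation (beta = 0). Tails decay as |x|**-(1 + alpha), so
    moments of order >= alpha diverge; alpha = 2 recovers the Gaussian
    family and alpha = 1 the Cauchy distribution."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))
```

    With alpha around 1.2, a sample of 10^5 draws typically contains a few thousand values beyond |x| = 10, whereas a standard Gaussian sample of the same size contains essentially none — the qualitative signature seen in the well-log histograms.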

  16. Seismic monitoring of rockfalls at Spitz quarry (NÖ, Austria)

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald; Roncat, Andreas; Schweigl, Joachim

    2016-04-01

    In the recent past, significant rockfalls, which pose a danger to persons, railways and roads, occurred in the quarry of Spitz (NÖ, Austria). An existing seismic warning system did not fulfill the expected efficiency and reliability standards, since the ratio of well-detected events to undetected events or false alarms was not satisfactory. Our aim was to analyze how a seismic warning system must be designed in order to overcome these deficiencies. A small-scale seismic network was deployed in the Spitz quarry to evaluate the possibility of improving the early-warning rockfall monitoring network by means of seismic observations, and a new methodology based on seismic methods, which enables the detection and location of rockfalls above a critical size, was developed. To this end, a small-scale (200x200 m2) passive seismic network comprising 7 monitoring stations acquiring data in continuous mode was established in the quarry so that it covered the rockfall hazard area. On the 2nd of October 2015, an induced rockfall experiment was performed; it began at 09:00 a.m. local time (07:00 UTC) and lasted about 1.5 hours. The entire data set was analyzed using the pSysmon software. To locate the impact points of the rockfalls, we used a procedure based on the back-projection of the maximum resultant amplitude recorded at each station of the network, within a time window, to every grid point covering the whole area of interest. The performance of the detection and localization algorithm was verified against these man-made rockfalls. We also used a terrestrial laser scanner and a camera, not only to map the rockfall block trajectories, but also to determine the volume of rock lost or gained in different areas of the quarry. This allowed us to relate the lost mass to the strength of the collision (pseudo-magnitude) of each rockfall and to reconstruct its trajectory.
The location test performed
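    The amplitude back-projection described above can be sketched as a grid search. This toy version is not the pSysmon implementation: it assumes peak amplitudes decay geometrically with distance (decay exponent and geometry are illustrative), corrects each station's peak amplitude back to a candidate source, and scores each grid point by how consistent the implied source amplitudes are across stations.

```python
import numpy as np

def locate_impact(stations, amplitudes, grid, decay_exp=1.0):
    """For each candidate grid point, undo an assumed geometric-spreading
    decay (A_source ~ A_obs * r**decay_exp) and score the point by the
    relative spread of the implied source amplitudes across stations;
    the most consistent point is the estimated impact location."""
    best, best_score = None, np.inf
    for g in grid:
        r = np.linalg.norm(stations - g, axis=1)
        a0 = amplitudes * r ** decay_exp   # implied source amplitudes
        score = a0.std() / a0.mean()       # coefficient of variation
        if score < best_score:
            best, best_score = g, score
    return np.asarray(best), best_score
```

    At the true impact point the distance-corrected amplitudes agree across all stations, so the coefficient of variation collapses toward zero there.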

  17. Importance of analytically verifying chemical treatments

    USGS Publications Warehouse

    Rach, J.J.; Gaikowski, M.P.; Olson, J.J.

    1997-01-01

    Hydrogen peroxide is considered a low regulatory priority compound by the U.S. Food and Drug Administration. It is used to control fungal infections on fish eggs. We studied the treatment profiles of hydrogen peroxide in Heath, McDonald egg jar, and Clark-Williamson incubators during treatments intended to deliver an effective regimen of at least 500 µL hydrogen peroxide/L (i.e., treatments of 500 and 1,000 µL/L) for 15 min. Hydrogen peroxide concentrations decreased with increasing distance from the influent water in both Heath and Clark-Williamson incubators. The top treatment tray (tray 2) of the Heath incubator received more than 90% of the intended regimen during the 500 µL/L treatment, whereas at 1,000 µL/L, all trays had hydrogen peroxide concentrations at or above 500 µL/L for 15 min. None of the compartments in the Clark-Williamson incubator received the intended therapeutic regimen when treated at 500 µL/L. The McDonald egg jar system distributed the intended concentration for the designated treatment period in all jars, except those located directly below the influent water. Our results indicate that dilution of therapeutants applied through certain egg incubation systems significantly decreases the efficacy of treatments and may render them ineffective. The dilution characteristics of egg incubation systems should be assessed in order to ensure proper delivery of all intended chemical concentrations and exposure regimens. Suggestions for maintaining the minimum effective concentrations in evaluated incubators are included.

  18. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Coleman, Justin; Sabharwall, Piyush

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk, which is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore, opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed, in the American Society of Civil Engineers (ASCE) 4 standard to be released in 2014, for light water reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE-4. Opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.

  19. Infrasound Generation from the HH Seismic Hammer.

    SciTech Connect

    Jones, Kyle Richard

    2014-10-01

    The HH Seismic Hammer is a large, "weight-drop" source for active-source seismic experiments. This system provides a repeatable source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by its 13 metric ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation, and surface-air coupling studies for source characterization.

  20. Complexity in Design-Driven Innovation: A Case Study of Knowledge Transfer Flow in Subsea Seismic Sensor Technology and Design Education

    ERIC Educational Resources Information Center

    Pavel, Nenad; Berg, Arild

    2015-01-01

    To the extent previously claimed, concept exploration is not the key to product innovation. However, companies that are design-focused are twice as innovative as those that are not. To study design-driven innovation and its occurrence in design education, two case studies are conducted. The first is an example of design practice which includes…

  1. Final Report: Seismic Hazard Assessment at the PGDP

    SciTech Connect

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it not only depends on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.
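    The hazard-estimation methodology referred to here can be sketched in miniature. Assuming a standard probabilistic seismic hazard analysis (PSHA) formulation with a lognormal ground-motion model, the annual rate of exceeding a ground-motion level x is the sum over sources of each source's activity rate times the probability that its ground motion exceeds x; all rates, medians and sigmas below are illustrative, not values from the Paducah study.

```python
import numpy as np
from math import erf, sqrt

def hazard_curve(sources, gm_levels):
    """Toy PSHA hazard curve: lambda(GM > x) = sum_i nu_i * P(GM > x | i),
    where each source i is (annual rate nu, median ground motion, ln-sigma)
    and P comes from a lognormal ground-motion model. Real analyses
    integrate over magnitude, distance and attenuation uncertainty."""
    def exceed_prob(x, median, sigma_ln):
        z = (np.log(x) - np.log(median)) / sigma_ln
        return 0.5 * (1.0 - erf(z / sqrt(2.0)))   # lognormal survival function
    total = np.zeros(len(gm_levels))
    for nu, median, sigma_ln in sources:
        total += nu * np.array([exceed_prob(x, median, sigma_ln)
                                for x in gm_levels])
    return total
```

    Inverting such a curve at a chosen annual exceedance rate is what yields a design ground-motion level; the policy question discussed above is which rate to select.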

  2. Seismic assessment for offshore pipelines

    SciTech Connect

    Bruschi, R.; Gudmestad, O.T.; Blaker, F.; Nadim, F.

    1995-12-31

    An international consensus on seismic design criteria for onshore pipelines has been established during the last thirty years. The need to assess seismic design for offshore pipelines has not been similarly recognized. In this paper, the geotechnical hazard for a pipeline routed across steep slopes and irregular terrain affected by earthquakes is discussed. The integrity of both natural and artificial load-bearing supports is assessed, and the response of the pipeline to direct excitation from the soil, or through discontinuous, sparsely distributed natural or artificial supports, is commented upon.

  3. Seismic risk perception in Italy

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Peruzza, Laura

    2014-05-01

    Risk perception is a fundamental element in the definition and the adoption of preventive countermeasures. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. This paper presents results of a survey on seismic risk perception in Italy conducted from January 2013 to the present. The research design combines a psychometric and a cultural-theoretic approach. More than 7,000 online tests have been completed. The data collected show that seismic risk perception in Italy is strongly underestimated; 86 out of 100 Italian citizens living in the most dangerous zone (namely Zone 1) do not have a correct perception of seismic hazard. From these observations we conclude that extremely urgent measures are required in Italy to communicate seismic risk effectively. Finally, the research presents a comparison of seismic risk perception between two groups: one involved in campaigns of information and education on seismic risk, and a control group.

  4. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 8 2013-10-01 2013-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION RULES OF PRACTICE MODIFIED PROCEDURES § 1112.6 Verified statements; contents. A verified statement should...

  5. 49 CFR 1112.6 - Verified statements; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 8 2012-10-01 2012-10-01 false Verified statements; contents. 1112.6 Section 1112.6 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION RULES OF PRACTICE MODIFIED PROCEDURES § 1112.6 Verified statements; contents. A verified statement should...

  6. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make...

  7. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make...

  8. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  9. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make...

  10. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  11. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make...

  12. 20 CFR 401.45 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Verifying your identity. 401.45 Section 401... INFORMATION The Privacy Act § 401.45 Verifying your identity. (a) When required. Unless you are making a... representative, you must verify your identity in accordance with paragraph (b) of this section if: (1) You make...

  13. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Verifying your identity. 802.13 Section... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own records. When you make a request for access to records about yourself, you must verify your identity....

  14. Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP

    USGS Publications Warehouse

    Huang, M.J.; Shakal, A.F.

    2009-01-01

    The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway and another one is in planning. Two of the port structures have been seismically strengthened. The primary goals of the strong motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise they are not possible to install after construction. This paper summarizes the two seismically-strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

  15. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  16. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  17. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  18. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  19. 41 CFR 128-1.8004 - Seismic Safety Coordinators.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Seismic Safety... Management Regulations System (Continued) DEPARTMENT OF JUSTICE 1-INTRODUCTION 1.80-Seismic Safety Program § 128-1.8004 Seismic Safety Coordinators. (a) The Justice Management Division shall designate...

  20. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, so understanding earthquake phenomena and their effects at the Earth's surface is an important step toward educating the population in earthquake-affected regions of the country and raising awareness of earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. A large amount of such data will thus be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating awareness of seismic risk, for experimenting with effective scientific communication, and for increasing the direct involvement of schools and the general public. 
A network of nine seismic stations with SEP seismometers

  1. 2008 United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; ,

    2008-01-01

    The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.

  2. Seismic sources

    DOEpatents

    Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

    1987-04-20

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for more than about one minute. 9 figs.

  3. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist expected earthquakes of the site. Two intensities are referred to as Operating Basis Earthquake and Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity. Thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismic induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismic-induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.
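For a linear structure, the seismic-induced structural dynamics formulation reduces to the matrix equation M·ü + C·u̇ + K·u = −M·ι·a_g(t), typically handled by modal decomposition. A minimal sketch of the modal step follows; the 2-DOF shear-building model and all its numbers are hypothetical, not taken from the NASTRAN study.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 2-DOF shear-building model (masses in kg, stiffness in N/m).
M = np.diag([2.0e5, 1.5e5])
K = np.array([[5.0e8, -2.0e8],
              [-2.0e8, 2.0e8]])

# Generalized eigenproblem K*phi = omega^2 * M*phi; eigh returns ascending
# eigenvalues and M-orthonormal mode shapes (Phi.T @ M @ Phi = I).
w2, Phi = eigh(K, M)
omega = np.sqrt(w2)                  # natural circular frequencies (rad/s)

# Modal participation factors for uniform base excitation (influence
# vector of ones); with M-orthonormal modes, sum(Gamma**2) = total mass.
iota = np.ones(2)
Gamma = Phi.T @ M @ iota

print("frequencies (Hz):", omega / (2 * np.pi))
print("participation factors:", Gamma)
```

Each mode then responds as an independent single-DOF oscillator, which is what makes response-spectrum combination of modal maxima possible.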

  4. Simplified Procedures for Seismic Analysis and Design of Piers and Wharves in Marine Oil and LNG Terminals

    DTIC Science & Technology

    2010-06-01

    prestressed concrete piles connected to the deck slab with dowels. The following is a step-by-step summary of the procedure to implement these formulas to... prestressed concrete pile with dowel-connection for both design levels is ρ = 0.05. 12. Compute the dimensionless parameters η = M_P / M_C,y and EI_e...42 8.1.2 Prestressed Concrete Piles

  5. LANL seismic screening method for existing buildings

    SciTech Connect

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.
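The two-phase pass/fail logic described above can be sketched as a small decision routine; the data model, the auto-fail type names, and the capacity/demand threshold of 1.0 are illustrative assumptions, not the published LANL criteria.

```python
# Sketch of the two-phase screening: Phase One scores configuration/physical
# hazards, Phase Two computes capacity/demand ratios; PC-3/PC-4 buildings
# and URM/infill types fail automatically. Threshold and names are invented.
AUTO_FAIL_TYPES = {"URM bearing wall", "masonry infill shear wall"}

def screen(perf_cat, bldg_type, hazard_score, capacity_demand):
    """Return (passes, reason) for one building."""
    if perf_cat >= 3:
        return False, "PC-3/PC-4: detailed seismic analysis required"
    if bldg_type in AUTO_FAIL_TYPES:
        return False, "building type: detailed seismic analysis required"
    if capacity_demand < 1.0:
        # Failing buildings are ranked by their scores for prioritization.
        return False, f"capacity/demand {capacity_demand:.2f} < 1.0"
    return True, f"screened out (Phase One score {hazard_score})"

print(screen(1, "steel moment frame", 85, 1.4))
print(screen(2, "URM bearing wall", 90, 2.0))
print(screen(4, "steel moment frame", 95, 2.5))
```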

  6. Induced seismicity after borehole fluid injections

    NASA Astrophysics Data System (ADS)

    Langenbruch, Cornelius; Shapiro, Serge

    2010-05-01

    We present a model for the temporal distribution of microseismic events induced by borehole fluid injections into reservoirs. We focus on seismicity induced after the stop of fluid injection. Our main concern here is the identification of parameters controlling the decay rate of seismicity after injection stops. The particular importance of a theoretical model for the occurrence of seismicity after the stop of injection is underlined by observations after stimulations of geothermal reservoirs at different locations. These stimulations have shown that the post-injection phase carries a high seismic risk, which is so far uncontrollable because the processes leading to post-injection events are not well understood. Based on the assumption that pore pressure diffusion is the governing mechanism triggering seismic events, we develop a method to calculate the seismicity rate during and after fluid injection. We show that the solution obtained after injection is very similar to the frequency scaling law of aftershocks, namely the Omori law. We propose a modified Omori law, which describes how post-injection seismicity depends on the parameters of the injection source and reservoir rock and on the strength of a pre-existing fracture system in the reservoir. We analyze two end members of fracture strength, representing stable and unstable pre-existing fracture systems. Our results show that the decay rate of post-injection seismicity is highly dependent on the strength of the fracture system. Furthermore, we show that the existence of an unstable fracture system in a reservoir results in a critical trend of seismic activity, which explains the occurrence of the largest-magnitude events shortly after the stop of injection. This result coincides with observations made after the stimulation of Enhanced Geothermal Systems (EGS). We verify our theoretical model by an application to synthetic data sets resulting from finite element
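The post-injection decay described above can be illustrated with a generic modified-Omori form r(t) = r0 / (1 + (t − t_stop)/c)^p. This is only a stand-in for the paper's actual solution: r0, c, and the decay exponent p (here loosely playing the role the paper assigns to fracture-system strength) are invented parameters.

```python
import numpy as np

def post_injection_rate(t, t_stop, r0, c, p):
    """Generic modified-Omori decay r(t) = r0 / (1 + (t - t_stop)/c)**p.

    Illustrative stand-in for the paper's solution; all parameters are
    free, invented values.
    """
    dt = np.asarray(t, dtype=float) - t_stop
    return r0 / (1.0 + dt / c) ** p

t = np.linspace(0.0, 10.0, 6)                              # days after shut-in
stable = post_injection_rate(t, 0.0, 100.0, 1.0, p=2.0)    # fast decay
unstable = post_injection_rate(t, 0.0, 100.0, 1.0, p=0.8)  # slow decay
print("stable fracture system:  ", np.round(stable, 2))
print("unstable fracture system:", np.round(unstable, 2))
```

A smaller exponent keeps the event rate elevated long after shut-in, which is the qualitative behavior the paper associates with unstable pre-existing fracture systems.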

  7. Verifying an interactive consistency circuit: A case study in the reuse of a verification technology

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1990-01-01

    The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.

  8. Application of bounding spectra to seismic design of piping based on the performance of above ground piping in power plants subjected to strong motion earthquakes

    SciTech Connect

    Stevenson, J.D.

    1995-02-01

    This report extends the potential application of Bounding Spectra evaluation procedures, developed as part of the A-46 Unresolved Safety Issue applicable to seismic verification of in-situ electrical and mechanical equipment, to in-situ safety related piping in nuclear power plants. The report presents a summary of earthquake experience data which define the behavior of typical U.S. power plant piping subject to strong motion earthquakes. The report defines those piping system caveats which would assure the seismic adequacy of the piping systems which meet those caveats and whose seismic demand are within the bounding spectra input. Based on the observed behavior of piping in strong motion earthquakes, the report describes the capabilities of the piping system to carry seismic loads as a function of the type of connection (i.e. threaded versus welded). This report also discusses in some detail the basic causes and mechanisms for earthquake damages and failures to power plant piping systems.
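Reduced to its core, the screening idea above is a pointwise comparison of a piping system's seismic demand spectrum against the bounding spectrum, gated by the configuration caveats. A minimal sketch follows; the spectra and the caveat flags are invented for illustration and are not values from the report.

```python
import numpy as np

# Invented demand and bounding spectra on a common frequency grid (g units);
# the caveat flags stand in for checks such as welded (not threaded)
# connections. None of these values are from the report.
freqs = np.array([1.0, 2.5, 5.0, 10.0, 20.0, 33.0])        # Hz
bounding_sa = np.array([0.8, 1.5, 1.5, 1.2, 0.9, 0.6])     # g
demand_sa = np.array([0.5, 1.1, 1.3, 0.9, 0.6, 0.4])       # g

def within_bounding(demand, bounding, caveats_met):
    """Screen in only if caveats hold and demand never exceeds the bound."""
    return bool(caveats_met and np.all(demand <= bounding))

caveats_met = all([True,    # welded connections (illustrative caveat)
                   True])   # adequate support spacing (illustrative caveat)
print("screens in:", within_bounding(demand_sa, bounding_sa, caveats_met))
```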

  9. Implementation of Seismic Stops in Piping Systems

    SciTech Connect

    Bezler, P.; Simos, N.; Wang, Y.K.

    1993-02-01

    Commonwealth Edison has submitted a request to NRC to replace the snubbers in the Reactor Coolant Bypass Line of Byron Station-Unit 2 with gapped pipe supports. The specific supports intended for use are commercial units designated ''Seismic Stops'' manufactured by Robert L. Cloud Associates, Inc. (RLCA). These devices have the physical appearance of snubbers and are essentially spring supports incorporating clearance gaps sized for the Byron Station application. Although the devices have a nonlinear stiffness characteristic, their design adequacy is demonstrated through the use of a proprietary linear elastic piping analysis code, ''GAPPIPE'', developed by RLCA. The code has essentially all the capabilities of a conventional piping analysis code while including an equivalent linearization technique to process the nonlinear spring elements. Brookhaven National Laboratory (BNL) has assisted the NRC staff in its evaluation of the RLCA implementation of the equivalent linearization technique and the GAPPIPE code. Towards this end, BNL performed a detailed review of the theoretical basis for the method; an independent evaluation of the Byron piping using the nonlinear time-history capability of the ANSYS computer code; and, by comparison with the RLCA results, an assessment of the adequacy of the response estimates developed with GAPPIPE. Associated studies included efforts to verify the ANSYS analysis results and the development of bounding calculations for the Byron piping using linear response spectrum methods.
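The equivalent-linearization idea behind gapped supports can be sketched on a single-DOF static model: replace the gap element with an amplitude-dependent secant stiffness, then iterate until stiffness and amplitude agree. GAPPIPE's actual (proprietary) formulation is more elaborate; the model and all numbers below are illustrative.

```python
# Single-DOF sketch of equivalent linearization for a gapped support:
# below the gap the support adds no stiffness; above it, the nonlinear
# element is replaced by a secant stiffness that depends on the response
# amplitude. All values are invented for illustration.
k_pipe = 100.0   # pipe stiffness (N/mm)
k_sup = 400.0    # support stiffness engaged beyond the gap (N/mm)
gap = 0.01       # clearance gap (mm)
force = 5.0      # applied load (N)

def secant_stiffness(amp):
    return 0.0 if amp <= gap else k_sup * (1.0 - gap / amp)

amp = force / k_pipe                    # initial guess: support inactive
for _ in range(200):
    new_amp = force / (k_pipe + secant_stiffness(amp))
    amp = 0.5 * (amp + new_amp)         # under-relaxation for stability
print(f"converged amplitude: {amp:.5f} mm")
```

For this model the iteration converges to (force + k_sup·gap) / (k_pipe + k_sup) = 0.018 mm, which can be checked by substituting the secant stiffness back into the force balance.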

  11. Midget Seismic in Sandbox Models

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Buddensiek, M. L.; Philipp, J.; Kukowski, N.; Oncken, O.

    2008-12-01

    Analog sandbox simulation has been applied to study geological processes to provide qualitative and quantitative insights into specific geological problems. In nature, the structures that are simulated in those sandbox models are often inferred from seismic data. With the study introduced here, we want to combine analog sandbox simulation techniques with seismic physical modeling of those sandbox models. The long-term objectives of this approach are (1) imaging of seismic and seismological events of actively deforming and static 3D analog models, and (2) assessment of the transferability of the model data to field data in order to improve field data acquisition and interpretation according to the addressed geological problem. To achieve this objective, a new midget-seismic facility for laboratory use was designed and developed, comprising a seismic tank, a PC control unit including piezo-electric transducers, and a positioning system. The first experiments are aimed at studying the wave field properties of the piezo-transducers in order to investigate their feasibility for seismic profiling. The properties investigated are their directionality and the change of waveform due to their size (5-12 mm) compared to the wavelengths (< 1.5 mm). The best-quality signals and least directionality and waveform change are achieved when the center source frequency is between 350-500 kHz and the offset is less than 8 cm for a reflector depth of 10 cm. With respect to the technical hardware, reflection processing on such a small scale is feasible as long as the offset does not exceed a certain value, which depends on the reflector depth and frequency. The next steps will include a study of material properties and of the effects of wave propagation in anisotropic and isotropic media by physical studies, before we finally start using different seismic imaging and processing techniques on static and actively deforming 3D analog models.

  12. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  13. Seismic waveform viewer, processor and calculator

    SciTech Connect

    2015-02-15

    SWIFT is a computer code designed to do research-level signal analysis on seismic waveforms, including visualization, filtering, and measurement. LLNL is using this code in amplitude and global tomography efforts.

  14. Seismic sources

    DOEpatents

    Green, Michael A.; Cook, Neville G. W.; McEvilly, Thomas V.; Majer, Ernest L.; Witherspoon, Paul A.

    1992-01-01

    Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole relative to a stator that is clamped to the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements at a power level that causes heating to over 150 °C within one minute of operation, but energizing the elements for no more than about one minute.

  15. Seismic assessment of buried pipelines

    SciTech Connect

    Al-Chaar, G.; Brady, P.; Fernandez, G.

    1995-12-31

    A structure and its lifelines are closely linked because the disruption of lifeline systems will obstruct emergency service functions that are vitally needed after an earthquake. As an example of the criticality of these systems, the Association of Bay Area Government (ABAG) recorded thousands of leaks in pipelines that resulted in more than twenty million gallons of hazardous materials being released in several recorded earthquakes. The cost of cleaning the spills from these materials was very high. This information supports the development of seismic protection of lifeline systems. The US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL) has, among its missions, the responsibility to develop seismic vulnerability assessment procedures for military installations. Within this mission, a preliminary research program to assess the seismic vulnerability of buried pipeline systems on military installations was initiated. Phase 1 of this research project resulted in two major studies. In the first, evaluating current procedures to seismically design or evaluate existing lifeline systems, the authors found several significant aspects that deserve special consideration and need to be addressed in future research. The second was focused on identifying parameters related to buried pipeline system vulnerability and developing a generalized analytical method to relate these parameters to the seismic vulnerability assessment of existing pipeline systems.

  16. Effect of Different Groundwater Levels on Seismic Dynamic Response and Failure Mode of Sandy Slope.

    PubMed

    Huang, Shuai; Lv, Yuejun; Peng, Yanju; Zhang, Lifang; Xiu, Liwei

    2015-01-01

    Heavy seismic damage tends to occur in slopes when groundwater is present. The main objectives of this paper are to determine the dynamic response and failure mode of a sandy slope subjected simultaneously to seismic forces and variable groundwater conditions. The paper applies the finite element method, a fast and efficient design tool in modern engineering analysis, to evaluate the dynamic response of the slope under these combined conditions. A shaking table test is conducted to analyze the failure mode and verify the accuracy of the finite element results. The results show that the dynamic response of the slope follows different patterns under near-field and far-field earthquakes, and that the damage location and pattern differ with groundwater conditions. With no groundwater, failure starts at the top of the slope, which exhibits an obvious whipping effect under the earthquake. With high groundwater levels, failure starts at the toe of the slope, and the top of the slope shows obvious seismic subsidence after the earthquake. Furthermore, the groundwater provides a certain damping effect.
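At its smallest, the kind of dynamic-response computation such an evaluation performs can be illustrated by Newmark-beta time integration of a single degree of freedom under base excitation. This is a generic sketch, not the paper's slope FEM; the parameters and the sine-pulse "ground motion" are invented.

```python
import numpy as np

# Newmark-beta (average acceleration) integration of a single-DOF system
# m*a + c*v + k*u = -m*ag(t), with an invented 2 Hz, 0.3 g, 2 s sine pulse
# standing in for a ground-motion record.
m, k, zeta = 1.0, 400.0, 0.05
wn = np.sqrt(k / m)
c = 2.0 * zeta * wn * m
dt, n = 0.005, 1000
t = np.arange(n) * dt
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * (t < 2.0)

beta, gamma = 0.25, 0.5                 # unconditionally stable scheme
u = v = 0.0
a = (-m * ag[0] - c * v - k * u) / m
keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
umax = 0.0
for i in range(1, n):
    # Effective load from the previous state plus the new ground motion.
    p = (-m * ag[i]
         + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
         + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                + dt * (gamma / (2 * beta) - 1) * a))
    unew = p / keff
    vnew = (gamma / (beta * dt) * (unew - u) + (1 - gamma / beta) * v
            + dt * (1 - gamma / (2 * beta)) * a)
    a = (unew - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    u, v = unew, vnew
    umax = max(umax, abs(u))
print(f"peak relative displacement: {umax:.4f} m")
```

A slope FEM repeats the same time-stepping over thousands of coupled degrees of freedom, with pore pressure altering the effective stiffness and damping.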

  18. The ENAM Explosive Seismic Source Test

    NASA Astrophysics Data System (ADS)

    Harder, S. H.; Magnani, M. B.

    2013-12-01

    We present the results of the pilot study conducted as part of the eastern North American margin (ENAM) community seismic experiment (CSE) to test an innovative design of land explosive seismic source for crustal-scale seismic surveys. The ENAM CSE is a community-based onshore-offshore controlled- and passive-source seismic experiment spanning a 400 km-wide section of the mid-Atlantic East Coast margin around Cape Hatteras. The experiment was designed to address prominent research questions such as the role of the pre-existing lithospheric grain in the structure and evolution of the ENAM margin, the distribution of magmatism, and the along-strike segmentation of the margin. In addition to a broadband OBS deployment, the CSE will acquire multichannel marine seismic data and two major onshore-offshore controlled-source seismic profiles recording both marine sources (airguns) and land explosions. The data acquired as part of the ENAM CSE will be available to the community immediately upon completion of the QC procedures required for archiving. The ENAM CSE provides an opportunity to test a radically new and more economical design for the land explosive seismic sources used in crustal-scale seismic surveys. Over the years we have incrementally improved the performance and reduced the cost of shooting crustal seismic shots. These improvements have come from better explosives and more efficient configurations of those explosives: largely intuitive changes, using higher-velocity explosives and shorter but larger-diameter explosive configurations. However, recent theoretical advances allow us to model not only these incremental improvements but also more radical shot designs that further enhance performance and reduce costs. Because some of these designs are so radical, they need experimental verification. To better engineer the shots for the ENAM experiment, we are conducting an explosives test in the region of the ENAM CSE. 
The results of

  19. Seismic spatial wavefield gradient and rotational rate measurements as new observables in land seismic exploration

    NASA Astrophysics Data System (ADS)

    Schmelzbach, Cedric; Sollberger, David; Van Renterghem, Cédéric; Häusler, Mauro; Robertsson, Johan; Greenhalgh, Stewart

    2016-04-01

    Traditionally, land-seismic data acquisition is conducted using vertical-component sensors. A more complete representation of the seismic wavefield can be obtained by employing multicomponent sensors recording the full vector wavefield. If groups of multicomponent sensors are deployed, then spatial seismic wavefield gradients and rotational rates can be estimated by differencing the outputs of closely spaced sensors. Such data capture all six degrees of freedom of a rigid body (three components of translation and three components of rotation), and hence allow an even more complete representation of the seismic wavefield compared to single station triaxial data. Seismic gradient and rotation data open up new possibilities to process land-seismic data. Potential benefits and applications of wavefield gradient data include local slowness estimation, improved arrival identification, wavefield separation and noise suppression. Using synthetic and field data, we explored the reliability and sensitivity of various multicomponent sensor layouts to estimate seismic wavefield gradients and rotational rates. Due to the wavelength and incidence-angle dependence of sensor-group reception patterns as a function of the number of sensors, station spacing and layout, one has to counterbalance the impacts of truncation errors, random noise attenuation, and sensitivity to perturbations such as amplitude variations and positioning errors when searching for optimum receiver configurations. Field experiments with special rotational rate sensors were used to verify array-based rotational-rate estimates. Seismic wavefield gradient estimates and inferred wavefield attributes such as instantaneous slowness enable improved arrival identification, e.g. wave type and path. Under favorable conditions, seismic-wavefield gradient attributes can be extracted from conventional vertical-component data and used to, for example, enhance the identification of shear waves. A further promising
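Differencing closely spaced sensors to estimate a spatial wavefield gradient, and recovering local slowness from it, can be sketched on a synthetic plane wave; the sensor layout and wave parameters below are invented for illustration.

```python
import numpy as np

# Synthetic vertical-component plane wave traveling along +x, sampled by a
# five-sensor cross layout with 1 m spacing (all values invented).
dx = dy = 1.0           # sensor spacing (m)
c = 500.0               # horizontal phase velocity (m/s)
f = 20.0                # dominant frequency (Hz)
t = np.linspace(0.0, 0.2, 2001)

def vz(x, y):
    # No y-dependence: propagation is purely along +x.
    return np.sin(2 * np.pi * f * (t - x / c))

# Central-difference gradient estimates from sensor pairs around the origin.
dvz_dx = (vz(dx, 0) - vz(-dx, 0)) / (2 * dx)
dvz_dy = (vz(0, dy) - vz(0, -dy)) / (2 * dy)   # vanishes for this geometry

# |d vz/dx| peaks at (2*pi*f)/c, so the gradient amplitude recovers the
# horizontal slowness 1/c up to a small finite-difference truncation error.
slowness_est = np.max(np.abs(dvz_dx)) / (2 * np.pi * f)
print(f"estimated slowness: {slowness_est:.6f} s/m (true {1 / c:.6f})")
```

The truncation error grows with the spacing-to-wavelength ratio, which is one face of the receiver-configuration trade-off discussed in the abstract.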

  20. Seismic safety of high concrete dams

    NASA Astrophysics Data System (ADS)

    Chen, Houqun

    2014-08-01

    China is a country of high seismicity with many hydropower resources. Recently, a series of high arch dams have either been completed or are being constructed in seismic regions, of which most are concrete dams. The evaluation of seismic safety often becomes a critical problem in dam design. In this paper, a brief introduction to major progress in the research on seismic aspects of large concrete dams, conducted mainly at the Institute of Water Resources and Hydropower Research (IWHR) during the past 60 years, is presented. The dam site-specific ground motion input, improved response analysis, dynamic model test verification, field experiment investigations, dynamic behavior of dam concrete, and seismic monitoring and observation are described. Methods to prevent collapse of high concrete dams under maximum credible earthquakes are discussed.

  1. Broadband seismology and small regional seismic networks

    USGS Publications Warehouse

    Herrmann, Robert B.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  2. Verified models of multiagent systems for vehicle health management

    NASA Astrophysics Data System (ADS)

    Esterline, Albert; Gandluri, Bhanu; Sundaresan, Mannur; Sankar, Jagannathan

    2005-05-01

A multiagent framework for data acquisition, analysis, and diagnosis in health management is proposed. It uses the contract net protocol, a protocol for high-level distributed problem solving that provides adaptive and flexible solutions where task decomposition and assignment of subtasks is natural. Java is used to wrap implementations of existing techniques for individual tasks, such as neural networks or fuzzy rule bases for fault classification. The Java wrapping supplies an agent interface that allows an implementation to participate in the contract net protocol. This framework is demonstrated with a simple Java prototype that monitors a laboratory specimen that generates acoustic emission signals due to fracture-induced failure. A multiagent system that conforms to our framework can focus resources as well as select important data and extract important information. Such a system is extensible and decentralized, and redundancy in it provides fault tolerance and graceful degradation. Finally, the flexibility inherent in such a system allows new strategies to develop on the fly. The behavior of a non-trivial concurrent system (such as a multiagent system) is too complex and uncontrollable to be thoroughly tested, so methods have been developed to check the design of a concurrent system against formal specifications of the system's behavior. We review one such method, model checking with SPIN, and discuss how it can be used to verify control aspects of multiagent systems that conform to our framework.
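The core award cycle of the contract net protocol described above (announce a task, collect bids, award to the best bidder) can be sketched in a few lines. The agent names, capabilities, and costs below are invented for illustration and do not come from the paper's Java prototype:

```python
# Minimal contract-net round: a manager announces a task, capable agents
# bid, and the task is awarded to the best (lowest-cost) bidder.
class Agent:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = capabilities   # task name -> estimated cost

    def bid(self, task):
        """Return a cost bid for the task, or None to decline."""
        return self.capabilities.get(task)

def award(task, agents):
    """Award the announced task to the lowest-cost bidder, if any."""
    bids = [(a.bid(task), a.name) for a in agents if a.bid(task) is not None]
    return min(bids)[1] if bids else None

# Hypothetical diagnosis agents in the spirit of the framework:
agents = [
    Agent("neural-net-classifier", {"classify-fault": 3.0}),
    Agent("fuzzy-rule-classifier", {"classify-fault": 2.0}),
    Agent("ae-signal-monitor", {"acquire-signal": 1.0}),
]
winner = award("classify-fault", agents)   # lowest-cost capable bidder wins
```

In a real system the bids would reflect load and data locality rather than fixed costs, and the protocol adds announcement, expiration, and result-reporting phases omitted here.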

  3. Regional Seismic Methods of Identifying Explosions

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Hauk, T. F.

    2013-12-01

A lesson from the 2006, 2009 and 2013 DPRK declared nuclear explosion Ms:mb observations is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, we need to put our empirical methods on a firmer physical footing. Here we review two of the main identification methods: 1) P/S ratios and 2) moment tensor techniques, which can be applied at regional distances (200-1600 km) to very small events, improving nuclear explosion monitoring and confidence in verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes (e.g. Walter et al., 1995). However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. Calculated intermediate-period (10-100 s) waveforms from regional 1-D models can match data and provide moment tensor results that separate explosions from earthquakes and cavity collapses (e.g. Ford et al. 2009). However, it has long been observed that some nuclear tests produce large Love waves and reversed Rayleigh waves that complicate moment tensor modeling. Again, the physical basis for the generation of these effects from explosions remains incompletely understood. We are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative amplitude techniques can be employed to better quantify differences between explosions and to understand those differences in terms of depth, media and other properties. We are also making use of the Source Physics
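The high-frequency P/S ratio discriminant described above can be sketched as a simple amplitude comparison between windowed arrivals. Everything here is synthetic and illustrative: the threshold of 1.0, the window positions, and the fake trace are assumptions, not values from the study (operational discriminants calibrate thresholds per station and path):

```python
import numpy as np

def rms(x):
    """Root-mean-square amplitude of a windowed waveform."""
    return np.sqrt(np.mean(np.square(x)))

def p_s_discriminant(trace, p_window, s_window, threshold=1.0):
    """Classify an event from the high-frequency P/S amplitude ratio.

    trace     : 1-D array of (already band-passed, >~2 Hz) ground motion
    p_window  : (start, stop) sample indices bracketing the P arrival
    s_window  : (start, stop) sample indices bracketing the S arrival
    threshold : ratio above which the event is flagged explosion-like
                (illustrative; real thresholds are station-calibrated)
    """
    ratio = rms(trace[slice(*p_window)]) / rms(trace[slice(*s_window)])
    label = "explosion-like" if ratio > threshold else "earthquake-like"
    return ratio, label

# Synthetic illustration: explosions excite P relatively more than S.
rng = np.random.default_rng(0)
trace = 0.05 * rng.standard_normal(2000)                   # background noise
trace[300:400] += 1.0 * np.sin(np.linspace(0, 40, 100))    # strong "P"
trace[1200:1400] += 0.3 * np.sin(np.linspace(0, 60, 200))  # weaker "S"
ratio, label = p_s_discriminant(trace, (300, 400), (1200, 1400))
```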

  4. Seismic intrusion detector system

    DOEpatents

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

A system for monitoring man-associated seismic movements within a control area, including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, a pulse-counting system for counting and storing the number of seismic movements within the area, and a monitoring system, operable on command, having a variable-frequency oscillator that generates an audio-frequency signal proportional to the number of said seismic movements.
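The threshold-detect, pulse-count, and tone-readout chain of the patent can be sketched as follows. The dead-time logic, threshold, and oscillator constants are invented for illustration; the patent's actual analog circuit values are not given in this abstract:

```python
def count_seismic_events(signal, threshold, dead_time):
    """Count threshold crossings, ignoring retriggers within `dead_time`
    samples -- a crude digital stand-in for the bandpass + threshold +
    pulse-counting chain of the detector."""
    count, last = 0, -dead_time
    for i, v in enumerate(signal):
        if v > threshold and i - last >= dead_time:
            count += 1
            last = i
    return count

def report_tone_hz(count, base_hz=100.0, step_hz=50.0):
    """Audio frequency proportional to the stored event count, mimicking
    the variable-frequency-oscillator readout (constants are invented)."""
    return base_hz + step_hz * count

sig = [0.0] * 1000
for i in (100, 105, 400, 800):   # two close pulses plus two isolated ones
    sig[i] = 1.0
n = count_seismic_events(sig, threshold=0.5, dead_time=50)
tone = report_tone_hz(n)
```

The pulse at sample 105 falls inside the dead time of the one at 100, so only three events are counted.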

  5. Reasoning about knowledge: Children's evaluations of generality and verifiability.

    PubMed

    Koenig, Melissa A; Cole, Caitlin A; Meyer, Meredith; Ridge, Katherine E; Kushnir, Tamar; Gelman, Susan A

    2015-12-01

In a series of experiments, we examined 3- to 8-year-old children's (N=223) and adults' (N=32) use of two properties of testimony to estimate a speaker's knowledge: generality and verifiability. Participants were presented with a "Generic speaker" who made a series of 4 general claims about "pangolins" (a novel animal kind), and a "Specific speaker" who made a series of 4 specific claims about "this pangolin" as an individual. To investigate the role of verifiability, we systematically varied whether the claim referred to a perceptually obvious feature visible in a picture (e.g., "has a pointy nose") or a non-evident feature that was not visible (e.g., "sleeps in a hollow tree"). Three main findings emerged: (1) young children showed a pronounced reliance on verifiability that decreased with age; three-year-old children were especially prone to credit knowledge to speakers who made verifiable claims, whereas 7- to 8-year-olds and adults credited knowledge to generic speakers regardless of whether the claims were verifiable; (2) children's attributions of knowledge to generic speakers were not detectable until age 5, and only when those claims were also verifiable; (3) children often generalized speakers' knowledge outside of the pangolin domain, indicating a belief that a person's knowledge about pangolins likely extends to new facts. The findings indicate that young children may be inclined to doubt speakers who make claims they cannot verify themselves, and reveal a developmentally increasing appreciation for speakers who make general claims.

  6. Advanced Seismic While Drilling System

    SciTech Connect

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high frequencies. This is counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER™ methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to have the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER™ drill string tool was designed and manufactured by TII

  7. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and
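The correspondence between exceedance probability and return time quoted above follows from a Poisson occurrence model, p = 1 - exp(-horizon/T). A quick check of the abstract's round numbers:

```python
import math

def return_period_years(p_exceed, horizon_years=50.0):
    """Mean return period T implied by an exceedance probability p over a
    time horizon, assuming Poisson occurrence: p = 1 - exp(-horizon/T)."""
    return -horizon_years / math.log(1.0 - p_exceed)

# 10%, 5%, and 2% in 50 years -> roughly 475, 975, and 2475 years,
# commonly rounded to 500, 1000, and 2500 as in the text.
periods = {p: return_period_years(p) for p in (0.10, 0.05, 0.02)}
```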

  8. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. 
Near-surface seismology is in

  9. Micromachined silicon seismic transducers

    SciTech Connect

    Barron, C.C.; Fleming, J.G.; Sniegowski, J.J.; Armour, D.L.; Fleming, R.P.

    1995-08-01

Batch-fabricated silicon seismic transducers could revolutionize the discipline of CTBT monitoring by providing inexpensive, easily deployable sensor arrays. Although our goal is to fabricate seismic sensors that provide the same performance level as the current state-of-the-art "macro" systems, if necessary one could deploy a larger number of these small sensors in closer proximity to the location being monitored in order to compensate for lower performance. We have chosen a modified pendulum design and are manufacturing prototypes in two different silicon micromachining fabrication technologies. The first set of prototypes, fabricated in our advanced surface-micromachining technology, are currently being packaged for testing in servo circuits; we anticipate that these devices, which have masses in the 1-10 µg range, will resolve sub-mG signals. Concurrently, we are developing a novel "mold" micromachining technology that promises to make proof masses in the 1-10 mg range possible; our calculations indicate that devices made in this new technology will resolve down to at least sub-µG signals, and may even approach the 10⁻¹⁰ G/√Hz acceleration levels found in the low-earth-noise model.
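The link between proof mass and resolution asserted above can be illustrated with the standard Brownian (thermal) noise floor of a spring-mass sensor. The resonance frequency and quality factor used below are illustrative assumptions, not values from the report:

```python
import math

def thermal_accel_noise_g_per_rtHz(mass_kg, f0_hz, Q, T=300.0):
    """Thermal acceleration noise floor of a mechanical oscillator,
    a_n = sqrt(4 kB T w0 / (m Q)), expressed in G/sqrt(Hz). Larger proof
    masses push this floor down as 1/sqrt(m), which is why mg-scale
    'mold' devices can resolve far smaller signals than ug-scale ones."""
    kB, g = 1.380649e-23, 9.81   # Boltzmann constant, standard gravity
    w0 = 2.0 * math.pi * f0_hz
    return math.sqrt(4.0 * kB * T * w0 / (mass_kg * Q)) / g

# Illustrative comparison (f0 = 100 Hz, Q = 50 are assumed values):
tiny = thermal_accel_noise_g_per_rtHz(5e-9, 100.0, 50)   # ~5 ug proof mass
big  = thermal_accel_noise_g_per_rtHz(5e-6, 100.0, 50)   # ~5 mg proof mass
```

A 1000x larger proof mass lowers the thermal floor by sqrt(1000), about a factor of 32, consistent with the qualitative claim in the abstract.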

  10. Seismic-Scale Rock Physics of Methane Hydrate

    SciTech Connect

    Amos Nur

    2009-01-08

We quantify natural methane hydrate reservoirs by generating synthetic seismic traces and comparing them to real seismic data: if the synthetic matches the observed data, then the reservoir properties and conditions used in synthetic modeling might be the same as the actual, in-situ reservoir conditions. This approach is model-based: it uses rock physics equations that link the porosity and mineralogy of the host sediment, pressure, and hydrate saturation to the resulting elastic-wave velocity and density. One result of such seismic forward modeling is a catalogue of seismic reflections of methane hydrate which can serve as a field guide to hydrate identification from real seismic data. We verify this approach using field data from known hydrate deposits.
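The elementary step in this forward-modeling chain is turning layer velocities and densities into reflection amplitudes. A minimal sketch using the normal-incidence reflection coefficient; the velocity and density values are illustrative, not calibrated rock-physics outputs from the study:

```python
def acoustic_impedance(velocity_m_s, density_kg_m3):
    """Acoustic impedance Z = rho * Vp."""
    return velocity_m_s * density_kg_m3

def reflection_coefficient(z_upper, z_lower):
    """Normal-incidence reflection coefficient at a layer boundary:
    R = (Z2 - Z1) / (Z2 + Z1)."""
    return (z_lower - z_upper) / (z_lower + z_upper)

# Illustrative values: hydrate stiffens the sediment, raising Vp and
# impedance, so a hydrate layer over free gas yields a strong negative
# reflection (the classic bottom-simulating-reflector signature).
z_hydrate = acoustic_impedance(2200.0, 1900.0)   # hydrate-bearing sand
z_gas     = acoustic_impedance(1400.0, 1850.0)   # gas-charged sand below
r = reflection_coefficient(z_hydrate, z_gas)
```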

  11. VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model

    SciTech Connect

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

    2009-08-01

The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time-varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU are designated as wastes. VISION is comprised of several
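The system-level stock-and-flow bookkeeping described above can be sketched as a toy annual time step. All stock names, the product/tails split, and the throughput rates are invented for illustration; VISION itself tracks many more streams (separations, recycle, RU/TRU balancing) than this sketch:

```python
# Toy fuel-cycle step in the spirit of VISION: natural uranium is
# enriched into fresh fuel (with depleted-uranium tails), fuel is burned
# in reactors, and used fuel accumulates in a storage buffer.
def step(state, enrich_t, burn_t):
    """Advance the stocks by one year (tonnes; all rates illustrative)."""
    s = dict(state)
    enriched = min(enrich_t, s["natural_U"])
    s["natural_U"]  -= enriched
    s["fresh_fuel"] += 0.12 * enriched    # enriched-product fraction (assumed)
    s["depleted_U"] += 0.88 * enriched    # tails to DU storage (assumed)
    burned = min(burn_t, s["fresh_fuel"])
    s["fresh_fuel"] -= burned
    s["used_fuel"]  += burned             # transmuted fuel to storage buffer
    return s

state = {"natural_U": 1000.0, "fresh_fuel": 0.0,
         "depleted_U": 0.0, "used_fuel": 0.0}
for _ in range(5):
    state = step(state, enrich_t=100.0, burn_t=10.0)
```

A useful invariant of such models is mass conservation: the stocks always sum to the initial inventory.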

  12. Learn About SmartWay Verified Aerodynamic Devices

    EPA Pesticide Factsheets

    Installing EPA-verified aerodynamic technologies on your trailer can help fleet and truck owners save fuel. Options include gap reducers, skirts, or tails and can be installed individually or in combination.

  13. Specifying and Verifying the Correctness of Dynamic Software Updates

    DTIC Science & Technology

    2011-11-15

Modified semantics: before Redis version 1.3.8, a set whose last element was removed would remain in the database. We use the backward... We applied our approach to updates for the Redis key-value store and several synthetic programs. Using Thor, a verification tool, we could verify many of the synthetic programs; using Otter, a...

  14. Long-Term Verifiability of Remote Electronic Elections

    NASA Astrophysics Data System (ADS)

    Langer, Lucie

    Retention of election documents is essential for verifying the proper conduct of an election ex post. The documents retained provide for later review in case an election contest is filed. Moreover, the principle of public elections laid down in German basic law implies the need for public verifiability. This applies to remote electronic voting in particular as physical observation is not achievable in this case.

  15. A seismic metamaterial: The resonant metawedge

    NASA Astrophysics Data System (ADS)

    Colombi, Andrea; Colquitt, Daniel; Roux, Philippe; Guenneau, Sebastien; Craster, Richard V.

    2016-06-01

    Critical concepts from three different fields, elasticity, plasmonics and metamaterials, are brought together to design a metasurface at the geophysical scale, the resonant metawedge, to control seismic Rayleigh waves. Made of spatially graded vertical subwavelength resonators on an elastic substrate, the metawedge can either mode convert incident surface Rayleigh waves into bulk elastic shear waves or reflect the Rayleigh waves creating a “seismic rainbow” effect analogous to the optical rainbow for electromagnetic metasurfaces. Time-domain spectral element simulations demonstrate the broadband efficacy of the metawedge in mode conversion while an analytical model is developed to accurately describe and predict the seismic rainbow effect; allowing the metawedge to be designed without the need for extensive parametric studies and simulations. The efficiency of the resonant metawedge shows that large-scale mechanical metamaterials are feasible, will have application, and that the time is ripe for considering many optical devices in the seismic and geophysical context.
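The grading principle behind the metawedge, that each resonator's local resonance sets where a given frequency is reflected or converted, can be illustrated with the fundamental longitudinal resonance of a base-clamped rod. The material properties and lengths below are illustrative assumptions, not the paper's parameters:

```python
import math

def rod_resonance_hz(length_m, E_pa, rho_kg_m3):
    """Fundamental longitudinal resonance of a base-clamped vertical rod,
    f1 = c / (4 L) with c = sqrt(E / rho). Spatially grading L along the
    wedge grades the local resonance frequency."""
    c = math.sqrt(E_pa / rho_kg_m3)
    return c / (4.0 * length_m)

# Graded wedge sketch: taller resonators resonate at lower frequency
# (E = 20 GPa, rho = 2500 kg/m^3 are assumed rock-like values).
freqs = [rod_resonance_hz(L, 2.0e10, 2500.0) for L in (14.0, 10.0, 6.0)]
```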

  16. A seismic metamaterial: The resonant metawedge.

    PubMed

    Colombi, Andrea; Colquitt, Daniel; Roux, Philippe; Guenneau, Sebastien; Craster, Richard V

    2016-06-10

    Critical concepts from three different fields, elasticity, plasmonics and metamaterials, are brought together to design a metasurface at the geophysical scale, the resonant metawedge, to control seismic Rayleigh waves. Made of spatially graded vertical subwavelength resonators on an elastic substrate, the metawedge can either mode convert incident surface Rayleigh waves into bulk elastic shear waves or reflect the Rayleigh waves creating a "seismic rainbow" effect analogous to the optical rainbow for electromagnetic metasurfaces. Time-domain spectral element simulations demonstrate the broadband efficacy of the metawedge in mode conversion while an analytical model is developed to accurately describe and predict the seismic rainbow effect; allowing the metawedge to be designed without the need for extensive parametric studies and simulations. The efficiency of the resonant metawedge shows that large-scale mechanical metamaterials are feasible, will have application, and that the time is ripe for considering many optical devices in the seismic and geophysical context.

  17. A seismic metamaterial: The resonant metawedge

    PubMed Central

    Colombi, Andrea; Colquitt, Daniel; Roux, Philippe; Guenneau, Sebastien; Craster, Richard V.

    2016-01-01

    Critical concepts from three different fields, elasticity, plasmonics and metamaterials, are brought together to design a metasurface at the geophysical scale, the resonant metawedge, to control seismic Rayleigh waves. Made of spatially graded vertical subwavelength resonators on an elastic substrate, the metawedge can either mode convert incident surface Rayleigh waves into bulk elastic shear waves or reflect the Rayleigh waves creating a “seismic rainbow” effect analogous to the optical rainbow for electromagnetic metasurfaces. Time-domain spectral element simulations demonstrate the broadband efficacy of the metawedge in mode conversion while an analytical model is developed to accurately describe and predict the seismic rainbow effect; allowing the metawedge to be designed without the need for extensive parametric studies and simulations. The efficiency of the resonant metawedge shows that large-scale mechanical metamaterials are feasible, will have application, and that the time is ripe for considering many optical devices in the seismic and geophysical context. PMID:27283587

  18. Integration of onshore and offshore seismological data to study the seismicity of the Calabrian Region

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Guerra, Ignazio; D'Anna, Giuseppe; Gervasi, Anna; Harabaglia, Paolo; Luzio, Dario; Stellato, Gilda

    2014-05-01

The Pollino Massif marks the transition from the Southern Apennines to the Calabrian Arc. On the western side it is characterized by moderately sized seismicity (about 9 M > 4 events in the last 50 years), well documented in the last 400 years. The moment tensor solutions available in this area yield mainly normal faults with a coherent Southern Apennine trend. This remains true also for the events that are localized on the Calabrian side of Pollino, south of the massif. In most of the Sibari plain, seismic activity is very scarce, while it is again rather marked on its southeastern corner, both onshore and offshore. The above observations point to the perspective that the stress field of a vast portion of Northern Calabria still resembles that of the Southern Apennines. In this frame, it becomes important to investigate the offshore seismicity of the Sibari Gulf and the deformation pattern within the Sibari plain. The latter might function as a hinge to transfer the deformation of the extensional fault system in the Pollino area to a different offshore fault system. Since return times of larger events might be very long, we need to investigate the true seismic potential of the offshore faults and to verify whether they are truly strike-slip or whether they could involve relevant thrust or normal components, which would add the risk of potentially associated tsunamis. Despite their importance in the understanding of the seismotectonic processes taking place at the Southern Apennines - Calabrian Arc border and surrounding areas, the seismicity and the seismogenic volumes of the Sibari Gulf have until now not been well characterized due to the lack of offshore seismic stations. The seismicity of Calabria is monitored by the Italian National Seismic Network (INSN), managed by Istituto Nazionale di Geofisica e Vulcanologia, and by the Calabrian Regional Seismic Network (CRSN), managed by the University of Calabria. Both networks comprise only on

  19. Multichannel algorithms for seismic reflectivity inversion

    NASA Astrophysics Data System (ADS)

    Wang, Ruo; Wang, Yanghua

    2017-02-01

Seismic reflectivity inversion is a deconvolution process for quantitatively extracting the reflectivity series and depicting the layered subsurface structure. The conventional method is a single-channel inversion and cannot clearly characterise stratified structures, especially from seismic data with a low signal-to-noise ratio. Because it is implemented on a trace-by-trace basis, the continuity along reflections in the original seismic data is deteriorated in the inversion results. We propose here multichannel inversion algorithms that exploit the information of adjacent traces during seismic reflectivity inversion. Explicitly, we incorporate a spatial prediction filter into the conventional Cauchy-constrained inversion method. We verify the validity and feasibility of the method using field data experiments and find an improved lateral continuity and clearer structures achieved by the multichannel algorithms. Finally, we compare the performance of three multichannel algorithms and evaluate their effectiveness based on the lateral coherency and structure characterisation of the inverted reflectivity profiles, as well as the residual energy of the seismic data.
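The deconvolution problem underlying this work is the convolutional model d = W r + n, where W is the wavelet convolution matrix and r the reflectivity series. A minimal single-channel sketch with simple Tikhonov damping standing in for regularization (the paper's method uses a Cauchy constraint plus a spatial prediction filter across traces, both omitted here; the toy wavelet and spike positions are invented):

```python
import numpy as np

def wavelet_matrix(wavelet, n):
    """Convolution matrix W so that W @ r == np.convolve(r, wavelet)."""
    W = np.zeros((n + len(wavelet) - 1, n))
    for j in range(n):
        W[j:j + len(wavelet), j] = wavelet
    return W

def invert_reflectivity(trace, wavelet, mu=1e-2):
    """Damped least-squares reflectivity estimate:
    r = (W^T W + mu I)^-1 W^T d."""
    W = wavelet_matrix(wavelet, len(trace) - len(wavelet) + 1)
    A = W.T @ W + mu * np.eye(W.shape[1])
    return np.linalg.solve(A, W.T @ trace)

n = 50
r_true = np.zeros(n)
r_true[[10, 25, 40]] = [0.8, -0.5, 0.3]      # sparse reflectivity spikes
wavelet = np.array([0.2, 1.0, 0.2])          # toy source wavelet
d = np.convolve(r_true, wavelet)             # noise-free synthetic trace
r_est = invert_reflectivity(d, wavelet)
```

With noise-free data the damped solution recovers the spikes up to slight shrinkage; the Cauchy constraint sharpens them further, and the multichannel extension couples adjacent traces for lateral continuity.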

  20. Salvo: Seismic imaging software for complex geologies

    SciTech Connect

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

This report describes Salvo, a three-dimensional seismic-imaging software for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and the data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. This report also describes I/O algorithms for large seismic data sets and images and parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, this report describes the required steps to compile, port and optimize the Salvo software, and describes the validation data sets used to help verify a working copy of Salvo.

  1. Seismic analysis of the large 70-meter antenna. Part 2: General dynamic response and a seismic safety check

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

An extensive dynamic analysis of the new JPL 70-meter antenna structure is presented. Analytical procedures are based on normal mode decomposition, which includes damping and special forcing functions. The dynamic response can be obtained for any arbitrarily selected point on the structure. A new computer program for computing the time-dependent resultant structural displacement, summing the effects of all participating modes, was also developed. Program compatibility with natural frequency analysis output was verified. The program was applied to the JPL 70-meter antenna structure, and the dynamic response for several specially selected points was computed. Seismic analysis of structures, a special application of the general dynamic analysis, is also based on normal mode decomposition. Strength specification of the antenna with respect to earthquake excitation is done by using the common response spectra. The results indicated a basically safe design under an assumed damping coefficient of 5% or more. However, for the antenna located at Goldstone, with its more active seismic environment, this study strongly recommends an experimental program to determine the true damping coefficient for a more reliable safety check.
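The normal-mode superposition used above, where the response at any point is the sum of mode shape times modal coordinate, can be sketched for the free-vibration case. The mode shapes, frequencies, and initial modal amplitudes below are invented illustration values, not the antenna's, and the forced-response convolution integral is omitted for brevity:

```python
import numpy as np

def modal_displacement(t, phis, omegas, zetas, q0s):
    """Displacement at one structural point by normal-mode superposition:
    u(t) = sum_i phi_i * q_i(t), with each modal coordinate taken here as
    a damped free vibration q_i(t) = q0_i exp(-zeta_i w_i t) cos(w_d t)."""
    u = np.zeros_like(t)
    for phi, w, z, q0 in zip(phis, omegas, zetas, q0s):
        wd = w * np.sqrt(1.0 - z**2)               # damped natural frequency
        u += phi * q0 * np.exp(-z * w * t) * np.cos(wd * t)
    return u

t = np.linspace(0.0, 5.0, 1001)
# Illustrative two-mode model at an assumed 5% damping per mode:
u = modal_displacement(t, phis=[1.0, 0.4], omegas=[6.0, 20.0],
                       zetas=[0.05, 0.05], q0s=[0.01, 0.004])
```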

  2. Verified Centers, Nonverified Centers or Other Facilities: A National Analysis of Burn Patient Treatment Location

    PubMed Central

    Zonies, David; Mack, Christopher; Kramer, Bradley; Rivara, Frederick; Klein, Matthew

    2009-01-01

Background Although comprehensive burn care requires significant resources, patients may be treated at verified burn centers, non-verified burn centers, or other facilities due to a variety of factors. The purpose of this study was to evaluate the association between patient and injury characteristics and treatment location using a national database. Study Design We performed an analysis of all burn patients admitted to United States hospitals participating in the Healthcare Cost and Utilization Project over 2 years. Univariate and multivariate analyses were performed to identify patient and injury factors associated with the likelihood of treatment at designated burn care facilities. Definitive care facilities were categorized as American Burn Association verified centers, non-verified burn centers, or other facilities. Results Over the two years, 29,971 burn patients were treated in 1,376 hospitals located in 19 participating states. A total of 6,712 (22%) patients were treated at verified centers, with 26% and 52% treated at non-verified or other facilities, respectively. Patients treated at verified centers were younger than those at non-verified or other facilities (33.1 years vs. 33.7 years vs. 41.9 years, p<0.001) and had a higher rate of inhalation injury (3.4% vs. 3.2% vs. 2.2%, p<0.001). Independent factors associated with treatment at verified centers include burns to the head/neck (RR 2.4, CI 2.1-2.7), hand (RR 1.8, CI 1.6-1.9), electrical injury (RR 1.4, CI 1.2-1.7), and fewer co-morbidities (RR 0.55, CI 0.5-0.6). Conclusions More than two-thirds of significantly burned patients are treated at non-verified burn centers in the U.S. Many patients meeting ABA criteria for transfer to a burn center are being treated at non-burn-center facilities. PMID:20193892

  3. Development of Seismic Isolation Systems Using Periodic Materials

    SciTech Connect

    Yan, Yiqun; Mo, Yi-Lung; Menq, Farn-Yuh; Stokoe, II, Kenneth H.; Perkins, Judy; Tang, Yu

    2014-12-10

Advanced fast nuclear power plants and small modular fast reactors are composed of thin-walled structures such as pipes; as a result, they do not have sufficient inherent strength to resist seismic loads. Seismic isolation, therefore, is an effective solution for mitigating earthquake hazards for these types of structures. Base isolation, on which numerous studies have been conducted, is a well-established system for protecting structures against earthquakes. In conventional isolators, such as high-damping rubber bearings, lead-rubber bearings, and friction pendulum bearings, large relative displacements occur between upper structures and foundations, and isolation is provided only in the horizontal direction; these features are not desirable for piping systems. The concept of periodic materials, based on the theory of solid-state physics, can be applied to earthquake engineering. A periodic material possesses distinct characteristics that prevent waves with certain frequencies from being transmitted through it; therefore, this material can be used in structural foundations to block unwanted seismic waves with certain frequencies. The frequency band over which a periodic material filters out waves is called the band gap, and a structural foundation made of periodic material is referred to as a periodic foundation. The design of a nuclear power plant, therefore, can be unified around the desirable feature of a periodic foundation, while continuous maintenance of the structure is not needed. In this research project, three different types of periodic foundations were studied: one-dimensional, two-dimensional, and three-dimensional. The basic theories of periodic foundations are introduced first to find the band gaps; then finite element methods are used to perform parametric analysis and obtain attenuation zones; finally, experimental programs are conducted and the test data are analyzed to verify the theory. This procedure shows that the
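For the one-dimensional case, the band gap can be located with the classical two-layer transfer-matrix dispersion relation: a frequency lies in a gap when |cos(qL)| = |cos(k1 d1)cos(k2 d2) - ½(Z1/Z2 + Z2/Z1)sin(k1 d1)sin(k2 d2)| exceeds 1. The layer thicknesses and material properties below are illustrative stand-ins (stiff concrete over soft rubber), not the project's design values:

```python
import numpy as np

def in_band_gap(f_hz, layers):
    """True if f_hz lies in a band gap of a 1-D two-layer periodic cell.

    layers = [(thickness_m, wave_speed_m_s, density_kg_m3), ...] for the
    two materials of one unit cell."""
    (d1, c1, rho1), (d2, c2, rho2) = layers
    w = 2.0 * np.pi * f_hz
    k1, k2 = w / c1, w / c2                 # layer wavenumbers
    z1, z2 = rho1 * c1, rho2 * c2           # layer impedances
    rhs = (np.cos(k1 * d1) * np.cos(k2 * d2)
           - 0.5 * (z1 / z2 + z2 / z1) * np.sin(k1 * d1) * np.sin(k2 * d2))
    return abs(rhs) > 1.0                   # no real Bloch wavenumber

# Illustrative unit cell: 0.2 m concrete-like layer + 0.2 m rubber-like layer.
cell = [(0.2, 3000.0, 2400.0), (0.2, 100.0, 1100.0)]
gap_freqs = [f for f in range(1, 201) if in_band_gap(float(f), cell)]
```

The strong impedance contrast opens a wide gap within the low-frequency band relevant to seismic excitation, which is the design target of a periodic foundation.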

  4. Study on Seismic Zoning of Sino-Mongolia Arc Areas

    NASA Astrophysics Data System (ADS)

    Xu, G.

    2015-12-01

    According to the agreement on cooperation in seismic zoning between the Institute of Geophysics, China Earthquake Administration, and the Research Center of Astronomy and Geophysics, Mongolian Academy of Sciences, data on geotectonics, active faults, seismicity, and the geophysical field were collected and analyzed; field investigations were then carried out on the Bolnay, Ar Hutul, and Gobi Altay faults, and a uniform earthquake catalogue of Mongolia and North China was established for the seismic hazard study of the Sino-Mongolia arc areas. Furthermore, the active faults and epicenters were mapped, and 2 seismic belts and their 54 potential seismic sources were determined. Based on the data and results mentioned above, the seismicity parameters for the two seismic belts and their potential sources were studied. Finally, seismic zoning at different probability levels in the Sino-Mongolia arc areas was carried out using the Chinese probabilistic seismic hazard analysis method. From these data and results we draw the following main conclusions. First, the tectonic stress field in the study areas originates from the collision and compression of the India Plate against the Eurasian Plate, transmitted from the Qinghai-Tibet Plateau; this is why seismicity is higher in the west than in the east, and why all earthquakes of magnitude 8 or greater occurred in the west. Second, the determination of the 2 arc seismic belts, the Altay seismic belt and the Bolnay-Baikal seismic belt, is reasonable in terms of their geotectonic location, geodynamic origin, and seismicity characteristics. Finally, there are some differences between our results and the Mongolian intensity zoning map published in 1985 in the shape of the seismic zoning map, especially in the areas near Ulaanbaatar. We argue that our results are reasonable, given their use of recent studies of active faults and their parameters, and they can therefore serve as a reference for seismic design.

  5. Angola Seismicity MAP

    NASA Astrophysics Data System (ADS)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document the natural seismicity of Angola and to establish the first seismic database for the country, facilitating consultation of and searches for information on its seismic activity. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), which defined six seismogenic zones from macroseismic data; the most important is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone. This is the most significant seismic zone of Angola, covering the epicentral regions of Quihita and Iona, geologically characterized by a transcontinental tectono-magmatic structure activated in the Mesozoic with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic-alkaline, kimberlitic, and carbonatitic composition, strongly marked by intense tectonism and cut by several faults and fractures (locally called the Lucapa corridor). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and during the Iona seismic activity of January 15, 1964, the main shock reached intensity VI-VII. Although their seismicity rates are not significant, the other five zones cannot be neglected: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; Gago Coutinho; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. All the compiled information made possible the creation of the first seismic database for Angola and the preparation of a seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  6. Seismic hazard map of the western hemisphere

    USGS Publications Warehouse

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the
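    The "10% chance of exceedance in 50 years" hazard level used for such maps translates, under the usual Poisson occurrence assumption, into a mean return period of roughly 475 years; a minimal sketch of that conversion:

    ```python
    import math

    def return_period(p_exceed, exposure_years):
        # Poisson occurrence model: P(at least one exceedance in t years)
        #   p = 1 - exp(-t / T)   =>   T = -t / ln(1 - p)
        return -exposure_years / math.log(1.0 - p_exceed)

    # 10% in 50 years -> ~475-year return period (the level mapped here);
    # 2% in 50 years -> ~2475 years (a common code-level hazard).
    t_475 = return_period(0.10, 50.0)
    t_2475 = return_period(0.02, 50.0)
    ```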

  7. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue in natural and technological risk analysis. In general, three types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. This paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analyses of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes expressed in different seismic intensity scales, are used to verify the regional parameters of mathematical models that simulate the physical and economic vulnerability of different building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and structures to earthquakes for the Northern Caucasus Federal District of the Russian Federation and the Krasnodar area, which are characterized by rather high seismic activity and high population density. To estimate the expected damage states of buildings and structures for earthquakes according to the OSR-97B map (return period T=1,000 years) within large cities and towns, the cities were divided into unit sites, their coordinates represented as points at the centers of the sites; the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal District of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of the oil pipeline systems located in the highly active seismic zones in

  8. Design of a potential long-term test of gas production from a hydrate deposit at the PBU-L106 site in North Slope, Alaska: Geomechanical system response and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Chiaramonte, L.; Kowalsky, M. B.; Rutqvist, J.; Moridis, G. J.

    2009-12-01

    In an effort to optimize the design of a potential long-term production test at the PBU-L106 site in North Slope, Alaska, we have developed a coupled modeling framework that includes the simulation of (1) large-scale production at the test site, (2) the corresponding geomechanical changes in the system caused by production, and (3) time-lapse geophysical (seismic) surveys. The long-term test is to be conducted within the deposit of the C-layer, which extends from a depth of 2226 to 2374 ft, and is characterized by two hydrate-bearing strata separated by a 30 ft shale interlayer. In this study we examine the expected geomechanical response of the permafrost-associated hydrate deposit (C-Layer) at the PBU L106 site during depressurization-induced production, and assess the potential for monitoring the system response with seismic measurements. Gas hydrates increase the strength of the sediments (often unconsolidated) they impregnate. Thus, hydrate dissociation in the course of gas production could potentially affect the geomechanical stability of such deposits, leading to sediment failure and potentially affecting wellbore stability and integrity at the production site and/or at neighboring conventional production facilities. For the geomechanical analysis we use a coupled hydraulic, thermodynamic and geomechanical model (TOUGH+HYDRATE+FLAC3D, T+H+F for short) simulating production from a single vertical well at the center of an infinite-acting hydrate deposit. We investigate the geomechanical stability of the C-Layer, well stability and possible interference (due to production) with pre-existing wells in the vicinity, as well as the system sensitivity to important parameters (saturation, permeability, porosity and heterogeneity). The time-lapse seismic surveys are simulated using a finite-difference elastic wave propagation model that is linked to the T+H+F code.
The seismic properties, such as the elastic and shear moduli, are a function of the simulated time- and

  9. The SCALE Verified, Archived Library of Inputs and Data - VALID

    SciTech Connect

    Marshall, William BJ J; Rearden, Bradley T

    2013-01-01

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional

  10. New Madrid Seismic Zone

    DTIC Science & Technology

    2007-11-02

    New Madrid Seismic Zone, by J. David Norwood, COL, USA; Project Advisor: Michael A. Pearson, COL, USA. USAWC Strategy Research Project, United States Army. Distribution Statement A. AUTHOR: J. David Norwood, Colonel, U.S. Army. TITLE: New Madrid Seismic Zone. FORMAT: Strategy Research Project. DATE: 22 April 1998. PAGES:

  11. Oklahoma seismic network. Final report

    SciTech Connect

    Luza, K.V.; Lawson, J.E., Jr.

    1993-07-01

    The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number of areas of the Midcontinent region because of the lack of instrumentation and (or) the sensitivity of the instruments deployed to monitor earthquake events. The Nemaha Uplift/Midcontinent Geophysical Anomaly is one of five principal areas east of the Rocky Mountain front that has a moderately high seismic-risk classification. The Nemaha uplift, which is common to the states of Oklahoma, Kansas, and Nebraska, is approximately 415 miles long and 12-14 miles wide. The Midcontinent Geophysical Anomaly extends southward from Minnesota across Iowa and the southeastern corner of Nebraska and probably terminates in central Kansas. A number of moderate-sized earthquakes--magnitude 5 or greater--have occurred along or west of the Nemaha uplift. The Oklahoma Geological Survey, in cooperation with the geological surveys of Kansas, Nebraska, and Iowa, conducted a 5-year investigation of the seismicity and tectonic relationships of the Nemaha uplift and associated geologic features in the Midcontinent. This investigation was intended to provide data to be used to design nuclear-power plants. However, the information is also being used to design better large-scale structures, such as dams and high-use buildings, and to provide the necessary data to evaluate earthquake-insurance rates in the Midcontinent.

  12. Evolution of optically nondestructive and data-non-intrusive credit card verifiers

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2010-04-01

    Since the introduction of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Rather than asking the cardholder for more information or accepting risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make a genuine credit card more distinguishable from a counterfeit one. Several optical approaches to the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging-based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity of design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

  13. Rock-physics and seismic-inversion based reservoir characterization of the Haynesville Shale

    NASA Astrophysics Data System (ADS)

    Jiang, Meijuan; Spikes, Kyle T.

    2016-06-01

    Seismic reservoir characterization of unconventional gas shales is challenging due to their heterogeneity and anisotropy. Rock properties of unconventional gas shales such as porosity, pore-shape distribution, and composition are important for interpreting seismic data amplitude variations in order to locate optimal drilling locations. The presented seismic reservoir characterization procedure applied a grid-search algorithm to estimate the composition, pore-shape distribution, and porosity at the seismic scale from the seismically inverted impedances and a rock-physics model, using the Haynesville Shale as a case study. All the proposed rock properties affected the seismic velocities, and the combined effects of these rock properties on the seismic amplitude were investigated simultaneously. The P- and S-impedances correlated negatively with porosity, and the VP/VS ratio correlated positively with clay fraction and negatively with the pore-shape distribution and quartz fraction. The reliability of these estimated rock properties at the seismic scale was verified through comparisons between two sets of elastic properties: one coming from inverted impedances, which were obtained from simultaneous inversion of prestack seismic data, and one derived from these estimated rock properties. The differences between the two sets of elastic properties were less than a few percent, verifying the feasibility of the presented seismic reservoir characterization.
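    The grid-search idea the abstract describes can be sketched as follows: forward-model the impedances for every candidate (porosity, clay fraction) pair and keep the pair that best matches the inverted impedances. The linear `forward_model` below is a placeholder proxy with illustrative coefficients, not the calibrated Haynesville rock-physics model used in the paper.

    ```python
    import numpy as np

    def forward_model(phi, clay):
        # Placeholder linear proxy for a calibrated rock-physics model.
        # Coefficients are illustrative only -- NOT the paper's model
        # (impedances in arbitrary but consistent units).
        ip = 12.0 - 15.0 * phi - 2.0 * clay   # P-impedance
        is_ = 7.5 - 10.0 * phi - 2.5 * clay   # S-impedance
        return ip, is_

    def grid_search(ip_obs, is_obs):
        # Exhaustively scan porosity and clay fraction, keeping the pair
        # whose modeled impedances best fit the inverted impedances.
        best, best_err = None, float("inf")
        for phi in np.linspace(0.0, 0.15, 31):       # porosity 0-15%
            for clay in np.linspace(0.0, 0.60, 61):  # clay fraction 0-60%
                ip, is_ = forward_model(phi, clay)
                err = (ip - ip_obs) ** 2 + (is_ - is_obs) ** 2
                if err < best_err:
                    best, best_err = (phi, clay), err
        return best
    ```

    With a linear, invertible proxy the search recovers the true pair exactly when it lies on the grid; the real procedure adds a third unknown (pore-shape distribution) and a nonlinear rock-physics forward model.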

  14. Seismic Imaging and Monitoring

    SciTech Connect

    Huang, Lianjie

    2012-07-09

    I give an overview of LANL's capability in seismic imaging and monitoring. I present some seismic imaging and monitoring results, including imaging of complex structures, subsalt imaging of the Gulf of Mexico, fault/fracture zone imaging for geothermal exploration at the Jemez pueblo, time-lapse imaging of walkaway vertical seismic profiling data for monitoring CO2 injection at SACROC, and microseismic event locations for monitoring CO2 injection at Aneth. These examples demonstrate LANL's high-resolution and high-fidelity seismic imaging and monitoring capabilities.

  15. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  16. Public Verifiable Multi-sender Identity Based Threshold Signcryption

    NASA Astrophysics Data System (ADS)

    Chen, Wen; Lei, Feiyu; Guo, Fang; Chen, Guang

    In this paper, we present a new identity-based signcryption scheme with public verifiability using quadratic residues and pairings over elliptic curves, and give a security proof for the original scheme in the random oracle model. Furthermore, the paper focuses on a multi-sender (t, n) identity-based threshold signcryption. Finally, we prove that the threshold-setting scheme is as secure as the original scheme.

  17. Verifying Stiffness Parameters Of Filament-Wound Cylinders

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Rheinfurth, M.

    1994-01-01

    Predicted engineering stiffness parameters of filament-wound composite-material cylinders verified with respect to experimental data, by use of equations developed straightforwardly from applicable formulation of Hooke's law. Equations derived in engineering study of filament-wound rocket-motor cases, also applicable to other cylindrical pressure vessels made of orthotropic materials.

  18. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own... identification and location of requested records, you may also, at your option, include your social security... agency, except under the provisions of the Privacy Act, 5 U.S.C. 552a, or the Freedom of Information...

  19. 28 CFR 802.13 - Verifying your identity.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... COLUMBIA DISCLOSURE OF RECORDS Privacy Act § 802.13 Verifying your identity. (a) Requests for your own... identification and location of requested records, you may also, at your option, include your social security... agency, except under the provisions of the Privacy Act, 5 U.S.C. 552a, or the Freedom of Information...

  20. Seismic Catalogue and Seismic Network in Haiti

    NASA Astrophysics Data System (ADS)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

    The destructive earthquake that occurred on January 12, 2010 in Haiti highlighted the country's lack of preparedness for seismic phenomena. At the moment of the earthquake, no seismic network was operating in the country, and only partial knowledge of past seismicity was available, owing to the absence of a national catalogue. After the 2010 earthquake, advances began toward the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work on both fronts. First, a seismic catalogue has been built, compiling data on historical and instrumental events in the Hispaniola Island and surroundings, within the framework of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30,000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7,700 events with Mw between 4.0 and 8.3. Since different magnitude scales were given by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by significant heterogeneity in the size parameter. It was therefore homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al. (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government, with data sent by telemetry through the Canadian CARINA system. In 2012, the Spanish IGN together
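    The magnitude homogenization step described above amounts to mapping each event's reported magnitude to Mw through scale-specific linear relations. A minimal sketch follows; the coefficients are illustrative placeholders, since the catalogue itself used the empirical equations of Bonzoni et al. (2011) for the eastern Caribbean, whose values are not given here.

    ```python
    def to_mw(mag, scale):
        # Homogenize a catalogue magnitude to moment magnitude Mw via a
        # linear relation Mw = a*m + b. The (a, b) pairs below are
        # illustrative placeholders, NOT the Bonzoni et al. (2011) values.
        coeffs = {"Ms": (0.67, 2.07), "mb": (0.85, 1.03), "ML": (1.00, 0.00)}
        a, b = coeffs[scale]
        return a * mag + b

    # Hypothetical mini-catalogue: (magnitude, scale) pairs
    catalogue = [(5.0, "Ms"), (4.8, "mb"), (4.2, "ML")]
    homogenized = [to_mw(m, s) for m, s in catalogue]
    ```

    In practice each relation is only valid over a stated magnitude range, and events already reported in Mw are kept unchanged.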

  1. An economical educational seismic system

    USGS Publications Warehouse

    Lehman, J. D.

    1980-01-01

    There is considerable interest in seismology at the nonprofessional or amateur level. The operation of a seismic system can be satisfying and educational, especially when you have built and operated the system yourself. A long-period indoor-type sensor and recording system that works extremely well has been developed in the James Madison University Physics Department. The system can be built quite economically, and any educational institution that cannot commit to a professional installation need not be without first-hand seismic information. The system design approach has been adopted by college students working on a project or senior thesis, by several elementary and secondary science teachers, and by the more ambitious tinkerer or hobbyist at home.

  2. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  3. Seismic analysis of a vacuum vessel

    SciTech Connect

    Chen, W.W.

    1993-01-01

    This paper presents the results of the seismic analysis for the preliminary design of a vacuum vessel for the ground engineering system (GES) of the SP-100 project. It describes the method of calculating the elevated seismic response spectra at various levels within the vacuum vessel using the simplified computer code developed by Weiner. A modal superposition analysis under design response spectra loading was performed for a three-dimensional finite-element model using the general-purpose finite-element computer code ANSYS. The in-vessel elevated seismic response spectra at various levels in the vacuum vessel, along with vessel mode shapes and frequencies are presented. Also included are descriptions of the results of the modal analyses for some significant preliminary design points at various elevations of the vessel.
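    The modal superposition under response-spectrum loading described above combines the peak response of each mode, read from the design spectrum, into a single estimated peak. The sketch below uses the common SRSS (square-root-of-sum-of-squares) combination rule as an illustration; the paper does not state which combination rule its ANSYS analysis used.

    ```python
    import numpy as np

    def spectral_displacement(sa, freq_hz):
        # Convert spectral acceleration to spectral displacement: Sd = Sa / w^2
        w = 2.0 * np.pi * freq_hz
        return sa / w**2

    def srss_peak(freqs_hz, gammas, sa_values):
        # Peak modal response |Gamma_i| * Sd_i for each mode, combined by
        # the square-root-of-sum-of-squares (SRSS) rule.
        peaks = [abs(g) * spectral_displacement(sa, f)
                 for f, g, sa in zip(freqs_hz, gammas, sa_values)]
        return float(np.sqrt(np.sum(np.square(peaks))))
    ```

    SRSS is appropriate for well-separated modal frequencies; closely spaced modes call for the CQC rule instead.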

  4. Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity

    SciTech Connect

    Tolk, Keith M.; Stoker, Gerald C.

    1999-07-20

    An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.

  5. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, owing to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough in the development of a novel adaptive seismic isolation system, conceived as the quest for ultimate protection of civil structures, utilizing the field-dependent properties of the magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of a smart seismic isolation system. The isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid that provides a sufficient and uniform magnetic field for energizing the field-dependent properties of the MR elastomer. With the controllable shear modulus/damping of the MR elastomer, the adaptive seismic isolator possesses controllable lateral stiffness while maintaining adequate vertical load capacity. This paper presents a comprehensive review of the development of the adaptive seismic isolator, including the design, analysis, and testing of two prototype adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototype MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable increase in lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive control.

  6. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the most relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by people's risk perception. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, several perspectives and their interactions must be considered: social, psychological, and cultural. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing perceived hazard, vulnerability, and exposure with real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) was designed using the semantic differential method, with opposite terms on a seven-point Likert scale. The questionnaire yields scores on five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by computer-assisted telephone interview (C.A.T.I.) to a national statistical sample of over 4,000 people in January-February 2015. Results show that risk perception appears to be underestimated for all indicators considered; in particular, scores on the seismic Vulnerability factor are extremely low compared with the housing information provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence relative to other natural hazards, participation in risk-reduction activities, and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by

  7. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    PubMed

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  8. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing

    NASA Astrophysics Data System (ADS)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-01

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  9. Watermarking medical images with anonymous patient identification to verify authenticity.

    PubMed

    Coatrieux, Gouenou; Quantin, Catherine; Montagner, Julien; Fassa, Maniane; Allaert, François-André; Roux, Christian

    2008-01-01

    When dealing with medical image management, there is a need to ensure information authenticity and dependability. Being able to verify that the information belongs to the correct patient and comes from the right source is a major concern. Verification can help to reduce the risk of errors when identifying documents in daily practice or when sending a patient's Electronic Health Record. At the same time, patient privacy issues may appear during the verification process when the verifier accesses patient data without appropriate authorization. In this paper we discuss the combination of watermarking with different identifiers, ranging from the DICOM standard UID to an Anonymous European Patient Identifier, in order to improve medical image protection in terms of authenticity and maintainability.

  10. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).

  11. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  12. Real-Time Projection to Verify Plan Success During Execution

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

    2012-01-01

    The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations when they actually occur during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.

  13. Concurrency and Complexity in Verifying Dynamic Adaptation: A Case Study

    DTIC Science & Technology

    2005-01-01

    Concurrency and Complexity in Verifying Dynamic Adaptation: A Case Study. Karun N. Biyani, Sandeep S. Kulkarni, Department of Computer Science... lattice. References: 1. Sandeep S. Kulkarni, Karun N. Biyani, and Umamaheswaran Arumugam. Composing distributed fault-tolerance components. In... and Autonomic Computing. PhD thesis, Michigan State University, 2004. 7. Sandeep Kulkarni and Karun Biyani. Correctness of component-based adaptation

  14. Fall 2014 SEI Research Review: Verifying Evolving Software

    DTIC Science & Technology

    2014-10-28

    Scalable verification of evolving software: • reduce re-verification effort • close the semantic gap between compiler and verifier • enable safe use of... compiler optimizations in safety-critical code. Related Work: current solutions are limited by • effectiveness (syntactic slicing, regression...) developed by us. Gurfinkel, October 28, 2014 © 2014 Carnegie Mellon University. Model Problem: Certifying Compiler for

  15. 10 CFR 36.39 - Design requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... lost for more than 10 seconds. (j) Seismic. For panoramic irradiators to be built in seismic areas, the... an earthquake by designing to the seismic requirements of an appropriate source such as American..., “Special Provisions for Seismic Design,” or local building codes, if current. (k) Wiring. For...

  16. 10 CFR 36.39 - Design requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... lost for more than 10 seconds. (j) Seismic. For panoramic irradiators to be built in seismic areas, the... an earthquake by designing to the seismic requirements of an appropriate source such as American..., “Special Provisions for Seismic Design,” or local building codes, if current. (k) Wiring. For...

  17. 10 CFR 36.39 - Design requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... lost for more than 10 seconds. (j) Seismic. For panoramic irradiators to be built in seismic areas, the... an earthquake by designing to the seismic requirements of an appropriate source such as American..., “Special Provisions for Seismic Design,” or local building codes, if current. (k) Wiring. For...

  18. 10 CFR 36.39 - Design requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... lost for more than 10 seconds. (j) Seismic. For panoramic irradiators to be built in seismic areas, the... an earthquake by designing to the seismic requirements of an appropriate source such as American..., “Special Provisions for Seismic Design,” or local building codes, if current. (k) Wiring. For...

  19. Application of seismic tomography in underground mining

    SciTech Connect

    Scott, D.F.; Williams, T.J.; Friedel, M.J.

    1996-12-01

    Seismic tomography, as used in mining, is based on the principle that highly stressed rock will demonstrate relatively higher P-wave velocities than rock under less stress. A decrease or increase in stress over time can be verified by comparing successive tomograms. Personnel at the Spokane Research Center have been investigating the use of seismic tomography to identify stress in remnant ore pillars in deep (greater than 1220 m) underground mines. In this process, three-dimensional seismic surveys are conducted in a pillar between mine levels. A sledgehammer is used to generate P-waves, which are recorded by geophones connected to a stacking signal seismograph capable of collecting and storing the P-wave data. Travel times are input into a spreadsheet, and apparent velocities are then generated and merged into imaging software. Mine workings are superimposed over apparent P-wave velocity contours to generate a final tomographic image. Results of a seismic tomographic survey at the Sunshine Mine, Kellogg, ID, indicate that low-velocity areas (low stress) are associated with mine workings and high-velocity areas (higher stress) are associated with areas where no mining has taken place. A high stress gradient was identified in an area where ground failed. From this tomographic survey, as well as four earlier surveys at other deep underground mines, a method was developed to identify relative stress in remnant ore pillars. This information is useful in making decisions about miner safety when mining such ore pillars.
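    The velocity step of the workflow described above, converting hammer-shot travel times into apparent straight-ray P-wave velocities, can be sketched as follows; the source/receiver coordinates and travel time are hypothetical, not survey data.

    ```python
    import math

    def apparent_velocity(source, receiver, travel_time):
        """Apparent straight-ray P-wave velocity (m/s) between a hammer-shot
        source and a geophone receiver (coordinates in metres, time in s)."""
        distance = math.dist(source, receiver)  # straight-ray path length
        return distance / travel_time

    # Hypothetical source/receiver pair on opposite faces of a pillar:
    v = apparent_velocity((0.0, 0.0, 0.0), (30.0, 0.0, 0.0), 0.006)
    # 30 m traversed in 6 ms corresponds to a hard-rock-like velocity
    ```

    In practice such velocities would be computed for every ray path in the survey and contoured to form the tomogram.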

  20. The Spatial Scale of Detected Seismicity

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Chen, C.-C.

    2016-01-01

    An experimental method for the spatial resolution analysis of the earthquake frequency-magnitude distribution is introduced in order to identify the intrinsic spatial scale of the detected seismicity phenomenon. We consider the unbounded magnitude range m ∈ (-∞, +∞), which includes incomplete data below the completeness magnitude m_c. By analyzing a relocated earthquake catalog of Taiwan, we find that the detected seismicity phenomenon is scale-variant for m ∈ (-∞, +∞), with its spatial grain a function of the configuration of the seismic network, while seismicity is known to be scale-invariant for m ∈ [m_c, +∞). Correction for data incompleteness for m < m_c, based on knowledge of the spatial scale of the process, allows extending the analysis of the Gutenberg-Richter law and of the fractal dimension to lower magnitudes. This shall allow verifying the continuity of universality of these parameters over a wider magnitude range. Our results also suggest that the commonly accepted Gaussian model of earthquake detection might be an artifact of observation.
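    For context, the Gutenberg-Richter b-value for the complete part of a catalog (m ≥ m_c) is commonly estimated with the Aki/Utsu maximum-likelihood formula; a minimal sketch on an invented magnitude list follows. This is standard practice, not the resolution-analysis method of the paper.

    ```python
    import math

    def b_value(magnitudes, m_c, dm=0.1):
        """Aki/Utsu maximum-likelihood Gutenberg-Richter b-value for
        events with magnitude >= m_c, binned at interval dm."""
        m = [x for x in magnitudes if x >= m_c]
        mean_m = sum(m) / len(m)
        # Utsu's correction (dm / 2) accounts for magnitude binning.
        return math.log10(math.e) / (mean_m - (m_c - dm / 2))

    # Invented magnitudes, illustrative only:
    mags = [2.0, 2.1, 2.2, 2.5, 3.0, 2.3, 2.15, 2.6, 2.05, 2.45]
    b = b_value(mags, m_c=2.0)   # roughly 1, as for typical tectonic seismicity
    ```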

  1. Seismic isolation of two dimensional periodic foundations

    SciTech Connect

    Yan, Y.; Mo, Y. L.; Laskar, A.; Cheng, Z.; Shi, Z.; Menq, F.; Tang, Y.

    2014-07-28

    Phononic crystals are now used to control acoustic waves. When the crystal goes to a larger scale, it is called a periodic structure. The band gaps of the periodic structure can be reduced to range from 0.5 Hz to 50 Hz. Therefore, the periodic structure has potential applications in seismic wave reflection. In civil engineering, the periodic structure can serve as the foundation of an upper structure. This type of foundation, consisting of a periodic structure, is called a periodic foundation. When the frequency of seismic waves falls into the band gaps of the periodic foundation, the seismic waves can be blocked. Field experiments on a scaled two-dimensional (2D) periodic foundation with an upper structure were conducted to verify the band gap effects. Test results showed that the 2D periodic foundation can effectively reduce the response of the upper structure for excitations with frequencies within the frequency band gaps. The experimental and finite element analysis results agree well with each other, indicating that the 2D periodic foundation is a feasible way of reducing seismic vibrations.
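    The band-gap mechanism invoked here can be illustrated with the textbook one-dimensional diatomic mass-spring chain, whose dispersion relation has a frequency gap between the acoustic and optical branches; the stiffness and masses below are illustrative, not the tested foundation's parameters.

    ```python
    import math

    def diatomic_branches(k, m1, m2, q, a=1.0):
        """Acoustic and optical angular frequencies of a 1D diatomic chain
        (spring stiffness k, alternating masses m1 and m2, wavenumber q)."""
        s = k * (1 / m1 + 1 / m2)
        r = k * math.sqrt((1 / m1 + 1 / m2) ** 2
                          - 4 * math.sin(q * a) ** 2 / (m1 * m2))
        return math.sqrt(s - r), math.sqrt(s + r)

    # Scan the first Brillouin zone with illustrative parameters:
    k, m1, m2 = 1.0, 1.0, 3.0
    qs = [i * math.pi / 2 / 100 for i in range(101)]
    acoustic = [diatomic_branches(k, m1, m2, q)[0] for q in qs]
    optical  = [diatomic_branches(k, m1, m2, q)[1] for q in qs]
    gap = min(optical) - max(acoustic)   # positive width -> blocked band
    ```

    Waves whose frequencies fall inside `gap` cannot propagate through the chain, which is the one-dimensional analogue of the foundation's band-gap filtering.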

  2. Statistical classification methods applied to seismic discrimination

    SciTech Connect

    Ryan, F.M.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within {approx}2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
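    Of the methods surveyed, K-th nearest neighbour discrimination is the simplest to illustrate. Below is a minimal sketch, not taken from the report, that labels a hypothetical event by majority vote among its k closest training events in a toy two-dimensional feature space; the feature values and class names are invented.

    ```python
    import math
    from collections import Counter

    def knn_classify(train, labels, x, k=3):
        """K-nearest-neighbour discrimination: label feature vector x
        by majority vote among the k closest training events."""
        order = sorted(range(len(train)),
                       key=lambda i: math.dist(train[i], x))
        votes = Counter(labels[i] for i in order[:k])
        return votes.most_common(1)[0][0]

    # Toy 2-D features (e.g. two discriminants per event), two classes:
    train = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
             (0.9, 0.8), (0.8, 0.9), (0.85, 0.75)]
    labels = ["earthquake", "earthquake", "earthquake",
              "explosion", "explosion", "explosion"]
    pred = knn_classify(train, labels, (0.12, 0.18))
    ```

    Real CTBT discrimination would of course use many more discriminants and must cope with the missing observations the report highlights.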

  3. Operations plan for the Regional Seismic Test Network

    SciTech Connect

    Not Available

    1981-05-15

    The Regional Seismic Test Network program was established to provide a capability for detection of extremely sensitive earth movements. Seismic signals from both natural and man-made earth motions will be analyzed with the ultimate objective of accurately locating underground nuclear explosions. The Sandia National Laboratories, Albuquerque, has designed an unattended seismic station capable of recording seismic information received at the location of the seismometers installed as part of that specific station. A network of stations is required to increase the capability of determining the source of the seismic signal and the location of the source. Current plans are to establish a five-station seismic network in the United States and Canada. The Department of Energy, Nevada Operations Office, has been assigned the responsibility for deploying, installing, and operating these remote stations. This Operation Plan provides the basic information and tasking to accomplish this assignment.

  4. United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.

    2008-01-01

    The U.S. Geological Survey's maps of earthquake shaking hazards provide information essential to creating and updating the seismic design provisions of building codes and insurance rates used in the United States. Periodic revisions of these maps incorporate the results of new research. Buildings, bridges, highways, and utilities built to meet modern seismic design provisions are better able to withstand earthquakes, not only saving lives but also enabling critical activities to continue with less disruption. These maps can also help people assess the hazard to their homes or places of work and can inform insurance rates.

  5. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.

  6. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures, and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite-size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.

  7. Seismic Safety Of Simple Masonry Buildings

    SciTech Connect

    Guadagnuolo, Mariateresa; Faella, Giuseppe

    2008-07-08

    Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings, explicit safety verifications are not compulsory if specific code rules are fulfilled; it is assumed that their fulfilment ensures suitable seismic behaviour and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Obviously, a wide percentage of the buildings deemed simple by the codes should satisfy the numerical safety verification, so as not to create confusion and uncertainty for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings with different geometries are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications on the congruence between the code rules and the results of numerical analyses performed according to the code itself are supplied; in this context, the results obtained can contribute to improving the seismic code requirements.

  8. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya is a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974 and the recent 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard; a crucial input in seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach for the purposes of identifying spatial differences in seismicity, which can be utilised to form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude homogenous earthquake catalogue has been compiled using various available earthquake data. 
The earthquake catalogue covers a time span from 1930 to 2007 and

  9. Cross-correlation—an objective tool to indicate induced seismicity

    NASA Astrophysics Data System (ADS)

    Oprsal, Ivo; Eisner, Leo

    2014-03-01

    Differentiation between natural and induced seismicity is crucial for the ability to safely and soundly carry out various underground experiments and operations. This paper defines an objective tool for one of the criteria used to discriminate between natural and induced seismicity. The qualitative correlation between earthquake rates and the injected volume has been an established tool for investigating the possibility of induced, or triggered, seismicity. We derive mathematically, and verify using numerical examples, that the definition of normalized cross-correlation (NCC) between positive random functions exhibits high values with a limit equal to one, if these functions (such as earthquake rates and injection volumes) have a large mean and low standard deviation. In such a case, the high NCC values do not necessarily imply a temporal relationship between the phenomena. Instead of positive-value time histories, the functions with their running mean subtracted should be used for cross-correlation. The NCC of such functions (called here NCCEP) may be close to zero, or may oscillate between positive and negative values, in cases where seismicity is not related to injection. We apply this method to case studies of seismicity in Colorado, the United Kingdom, Switzerland and south-central Oklahoma, and show that NCCEP reliably determines induced seismicity. Finally, we introduce a geomechanical model explaining the positive cross-correlation observed in the induced seismicity data sets.
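    The central observation, that the plain NCC of two positive series with a large mean and small standard deviation is close to one regardless of any real relationship, while the cross-correlation of the mean-removed series (in the spirit of NCCEP) is not, can be sketched as follows. The zero-lag formula and the synthetic series are illustrative, not the paper's data.

    ```python
    import math

    def ncc(x, y):
        """Zero-lag normalized cross-correlation of two equal-length series."""
        num = sum(a * b for a, b in zip(x, y))
        den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
        return num / den

    def demean(x):
        m = sum(x) / len(x)
        return [a - m for a in x]

    # Two unrelated positive series with a large mean and small variance
    # (stand-ins for, e.g., monthly event rates and injected volumes):
    rates   = [100 + d for d in (1, -1, 1, -1, 1, -1, 1, -1)]
    volumes = [100 + d for d in (1, 1, -1, -1, 1, 1, -1, -1)]

    high = ncc(rates, volumes)                   # near 1 despite no relation
    low  = ncc(demean(rates), demean(volumes))   # near 0 after demeaning
    ```

    The deviations were chosen orthogonal, so demeaning exposes the absence of any relationship that the raw NCC value conceals.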

  10. Verifying Galileo's discoveries: telescope-making at the Collegio Romano

    NASA Astrophysics Data System (ADS)

    Reeves, Eileen; van Helden, Albert

    The Jesuits of the Collegio Romano in Rome, especially the mathematicians Clavius and Grienberger, were very interested in Galilei's discoveries. After they had failed to observe the celestial phenomena with telescopes of their own construction, they expressed serious doubts. But from November 1610 onward, after they had built a better telescope and obtained another from Venice, and could verify Galilei's observations, they accepted them completely. Clavius, who adhered to the Ptolemaic system until his death in 1612, even pointed out these facts in his last edition of Sacrobosco's Sphaera. He, as well as his confrères, however, avoided any conclusions with respect to the planetary system.

  11. Verifiable Quantum ( k, n)-threshold Secret Key Sharing

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Teng, Yi-Wei; Chai, Hai-Ping; Wen, Qiao-Yan

    2011-03-01

    Based on the Lagrange interpolation formula and a post-verification mechanism, we show how to construct a verifiable quantum (k, n)-threshold secret key sharing scheme. Compared with previous secret sharing protocols, ours has the following merits: (i) it can resist fraud by a dealer who generates and distributes fake shares among the participants during the secret distribution phase; and, most importantly, (ii) it can detect cheating by a dishonest participant who provides a false share during the secret reconstruction phase, which would otherwise prevent the authorized group from recovering the correct secret.
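    The classical backbone of such a (k, n)-threshold scheme is Shamir secret sharing: the secret is the constant term of a random degree-(k-1) polynomial, and any k shares recover it by Lagrange interpolation at x = 0. The sketch below shows only this classical layer over a prime field; the quantum distribution and verification mechanisms of the paper are not reproduced, and the prime modulus is an arbitrary choice.

    ```python
    import random

    P = 2 ** 61 - 1  # a Mersenne prime modulus (illustrative choice)

    def make_shares(secret, k, n):
        """Split `secret` into n shares; any k of them recover it."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        def f(x):
            return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 over GF(P)."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if j != i:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, P - 2, P)) % P
        return secret

    shares = make_shares(123456789, k=3, n=5)
    recovered = reconstruct(shares[:3])   # any 3 of the 5 shares suffice
    ```

    The verifiability the paper adds is precisely a check that each submitted share really lies on the dealer's polynomial, so a false share is caught before reconstruction.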

  12. Verifying a Simplified Fuel Oil Flow Field Measurement Protocol

    SciTech Connect

    Henderson, H.; Dentz, J.; Doty, C.

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  13. Verifying a Simplified Fuel Oil Field Measurement Protocol

    SciTech Connect

    Henderson, Hugh; Dentz, Jordan; Doty, Chris

    2013-07-01

    The Better Buildings program is a U.S. Department of Energy program funding energy efficiency retrofits in buildings nationwide. The program is in need of an inexpensive method for measuring fuel oil consumption that can be used in evaluating the impact that retrofits have in existing properties with oil heat. This project developed and verified a fuel oil flow field measurement protocol that is cost effective and can be performed with little training for use by the Better Buildings program as well as other programs and researchers.

  14. Permeameter data verify new turbulence process for MODFLOW.

    PubMed

    Kuniansky, Eve L; Halford, Keith J; Shoemaker, W Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005.

  15. Permeameter data verify new turbulence process for MODFLOW

    USGS Publications Warehouse

    Kuniansky, Eve L.; Halford, Keith J.; Shoemaker, W. Barclay

    2008-01-01

    A sample of Key Largo Limestone from southern Florida exhibited turbulent flow behavior along three orthogonal axes as reported in recently published permeameter experiments. The limestone sample was a cube measuring 0.2 m on edge. The published nonlinear relation between hydraulic gradient and discharge was simulated using the turbulent flow approximation applied in the Conduit Flow Process (CFP) for MODFLOW-2005 mode 2, CFPM2. The good agreement between the experimental data and the simulated results verifies the utility of the approach used to simulate the effects of turbulent flow on head distributions and flux in the CFPM2 module of MODFLOW-2005.
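    The nonlinear gradient-discharge behaviour described above is commonly modeled by a Forchheimer-type law, i = aQ + bQ², where the quadratic term captures turbulent head loss. A minimal least-squares fit of that form is sketched below on synthetic data; the coefficients and data are invented, and this is not the CFPM2 formulation itself.

    ```python
    import numpy as np

    def fit_forchheimer(Q, i):
        """Least-squares fit of i = a*Q + b*Q**2 (turbulent head loss)."""
        A = np.column_stack([Q, Q ** 2])
        (a, b), *_ = np.linalg.lstsq(A, i, rcond=None)
        return a, b

    # Synthetic discharge/gradient data generated from known a and b:
    Q = np.linspace(0.001, 0.01, 20)   # discharge (arbitrary units)
    i = 5.0 * Q + 400.0 * Q ** 2       # hydraulic gradient
    a, b = fit_forchheimer(Q, i)       # recovers the generating coefficients
    ```

    A clearly nonzero b on permeameter data is the signature of the turbulent regime that a linear Darcy model cannot capture.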

  16. A robust control method for seismic protection of civil frame building

    NASA Astrophysics Data System (ADS)

    Wu, Jong-Cheng; Chih, Hsin-Hsien; Chen, Chern-Hwa

    2006-06-01

    Recently, more and more experimental studies indicate that a mature active control design toward practical implementation requires consideration of robustness criteria in the design process, including performance robustness in reducing tracking error and in resisting external disturbance and measurement noise, and stability robustness with respect to system uncertainty. In this paper, a robust control method employing these robustness criteria, which can be converted to a generalized H∞ control problem, is presented for the control of civil structures. To facilitate computation of H∞ controllers, an efficient solution procedure based on linear matrix inequalities (LMI), the so-called LMI-based H∞ control, is introduced. To verify the applicability of the proposed method, extensive simulations were conducted on a numerical building model with active bracings under seismic excitation, constructed from a full-scale steel frame building that was once tested on a shake table. In the simulation, system uncertainty is assumed in the controller design, and the use of acceleration feedback is emphasized for practical consideration. The simulation results demonstrate that the performance of the proposed H∞ controllers is remarkable and robust, and confirm the efficiency of the LMI-based approach. Therefore, this robust control method is suitable for application to the seismic protection of civil frame buildings.

  17. Clinical experience with a computerized record and verify system.

    PubMed

    Podmaniczky, K C; Mohan, R; Kutcher, G J; Kestler, C; Vikram, B

    1985-08-01

    To improve the quality of patient care by detecting and preventing many types of treatment mistakes, we have implemented a computerized system for recording and verifying external beam radiation treatments on our therapy machines. It inhibits the radiation beam if treatment machine settings do not agree with prescribed values to within maximum permissible deviations (tolerances). The tolerances are determined from experience and adjusted when necessary to make the system more effective and less susceptible to "false alarms." The system uses a common database for all treatment machines. As a result, it permits statistical analysis and generation of reports based on data encompassing the entire patient population, as well as verification of treatments of patients transferred from one machine to another. Reports of verification failures reveal patterns of mistakes; knowing these, attempts can be made to reduce their frequency. "Significant" mistakes that were prevented are extracted by treatment planning personnel from these reports. Analysis of the data indicates a rate of approximately 150 "significant" mistakes detected and prevented per machine per year, representing 1.0% of all fields treated. We present and discuss our experiences with the system and with the frequency, patterns, and significance of verification failures. We also present selected cases in which significant set-up mistakes were made and then detected and prevented by the Record and Verify System, and discuss the overall effect these mistakes would have had on dose distribution had they not been prevented.
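    The core record-and-verify check, comparing each machine setting against the prescription within a per-parameter tolerance and inhibiting the beam on any disagreement, can be sketched as follows. The parameter names, values, and tolerances are hypothetical placeholders, not the actual configuration of the authors' system.

```python
def verify_settings(prescribed, measured, tolerances):
    """Compare treatment-machine settings against the prescription.

    Returns a list of (parameter, measured, prescribed) tuples for every
    parameter whose deviation exceeds its tolerance; an empty list means
    the beam may be enabled.
    """
    failures = []
    for name, target in prescribed.items():
        actual = measured.get(name)
        tol = tolerances.get(name, 0.0)
        if actual is None or abs(actual - target) > tol:
            failures.append((name, actual, target))
    return failures

# Hypothetical prescription and tolerances for one field
prescribed = {"gantry_deg": 90.0, "field_x_cm": 10.0, "mu": 150.0}
tolerances = {"gantry_deg": 1.0, "field_x_cm": 0.2, "mu": 0.5}

ok = verify_settings(prescribed, {"gantry_deg": 90.4, "field_x_cm": 10.1, "mu": 150.0}, tolerances)
bad = verify_settings(prescribed, {"gantry_deg": 94.0, "field_x_cm": 10.1, "mu": 150.0}, tolerances)
```

    Logging each nonempty failure list is what makes the statistical reporting of mistake patterns described in the abstract possible.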

  18. Method of migrating seismic records

    DOEpatents

    Ober, Curtis C.; Romero, Louis A.; Ghiglia, Dennis C.

    2000-01-01

    The present invention provides a method of migrating seismic records that retains the information in the seismic records and allows migration with significant reductions in computing cost. The present invention comprises phase encoding seismic records and combining the encoded seismic records before migration. Phase encoding can minimize the effect of unwanted cross terms while still allowing significant reductions in the cost to migrate a number of seismic records.
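    The combining step described above can be sketched in a few lines: each shot record gets a random phase factor in the frequency domain, and the encoded records are summed into a single composite record that is then migrated once instead of many times. This is a minimal illustration of the phase-encode-and-stack idea, not the patented method itself; the migration step is omitted and the random constant-phase scheme is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def phase_encode_and_stack(records):
    """Apply a random phase factor to each shot record in the frequency
    domain, then sum into one composite record. Cross terms between
    different shots acquire random relative phases and tend to cancel
    on average, while each record's own energy is preserved.
    """
    records = np.asarray(records, dtype=float)
    n_shots, n_samples = records.shape
    spectra = np.fft.rfft(records, axis=1)
    phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=n_shots))
    combined = (spectra * phases[:, None]).sum(axis=0)
    return np.fft.irfft(combined, n=n_samples)

shots = rng.standard_normal((8, 128))   # 8 hypothetical shot records
composite = phase_encode_and_stack(shots)
```

    Migrating one composite record instead of eight is where the claimed reduction in computing cost comes from.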

  19. Successes and failures of recording and interpreting seismic data in structurally complex area: seismic case history

    SciTech Connect

    Morse, V.C.; Johnson, J.H.; Crittenden, J.L.; Anderson, T.D.

    1986-05-01

    There are successes and failures in recording and interpreting a single seismic line across the South Owl Creek Mountain fault on the west flank of the Casper arch. Information obtained from this type of work should help explorationists who are exploring structurally complex areas. A depth cross section lacks a subthrust prospect, but is illustrated to show that the South Owl Creek Mountain fault is steeper with less apparent displacement than in areas to the north. This cross section is derived from two-dimensional seismic modeling, using data processing methods specifically for modeling. A flat horizon and balancing technique helps confirm model accuracy. High-quality data were acquired using specifically designed seismic field parameters. The authors concluded that the methodology used is valid, and an interactive modeling program in addition to cross-line control can improve seismic interpretations in structurally complex areas.

  20. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-04-01

    In this report we will show some new Q-related seismic attributes on the Burlington-Seitel data set. One example will be called the Energy Absorption Attribute (EAA) and is based on a spectral analysis. The EAA algorithm is designed to detect a sudden increase in the rate of exponential decay in the relatively higher frequency portion of the spectrum. In addition, we will show results from a hybrid attribute that combines attenuation with relative acoustic impedance to give a better indication of commercial gas saturation.
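    The quantity the EAA responds to, the rate of exponential spectral decay, can be estimated with a straight-line fit to the log amplitude spectrum over the higher-frequency band. The sketch below uses that simple fit on a synthetic spectrum; the actual EAA algorithm is proprietary to the report's authors and is not reproduced here.

```python
import numpy as np

def decay_rate(freqs, amplitudes, f_min):
    """Exponential decay rate alpha of an amplitude spectrum
    A(f) ~ A0 * exp(-alpha * f), estimated over f >= f_min by a
    straight-line fit to the log spectrum. Larger alpha means
    faster decay, i.e. stronger apparent absorption."""
    mask = freqs >= f_min
    slope, _intercept = np.polyfit(freqs[mask], np.log(amplitudes[mask]), 1)
    return -slope

freqs = np.linspace(1.0, 100.0, 200)            # Hz
spectrum = 3.0 * np.exp(-0.04 * freqs)          # synthetic absorptive spectrum
alpha = decay_rate(freqs, spectrum, f_min=30.0)
```

    An abrupt downdip increase in alpha between neighboring analysis windows is the kind of signature an absorption attribute is designed to flag.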

  1. Multifrequency seismic detectability of seasonal thermoclines assessed from ARGO data

    NASA Astrophysics Data System (ADS)

    Ker, S.; Le Gonidec, Y.; Marié, L.

    2016-08-01

    Seismic oceanography is a developing research topic where new acoustic methods allow high-resolution teledetection of the thermohaline structure of the ocean. First implementations to study the Ocean Surface Boundary Layer (OSBL) have recently been achieved but remain very challenging due to the weakness and shallowness of such seismic reflectors. In this article, we develop a multifrequency seismic analysis of hydrographic data sets collected in a seasonally stratified midlatitude shelf by ARGO network floats to assess the detectability issue of shallow thermoclines. This analysis, for which sensitivity to the data reduction scheme used by ARGO floats for the transmission of the profiles is discussed, allows characterizing both the depth location and the frequency dependency of the dominant reflective feature of such complex structures. This approach provides the first statistical distribution of the range of variability of the frequency-dependent seismic reflection amplitude of the midlatitude seasonal thermoclines. We introduce a new parameter to quantify the overall capability of a multichannel seismic setup, including the source strength, the fold, and the ambient noise level, to detect shallow thermoclines. Seismic source signals are approximated by Ricker wavelets, providing quantitative guidelines to help in the design of seismic experiments targeting such oceanic reflectors. For shallow midlatitude seasonal thermoclines, we show that the detectability is optimal for seismic peak frequencies between 200 and 400 Hz: this means that airgun and Sparker sources are not well suited and that significant improvements of source devices will be necessary before seismic imaging of OSBL structures can be reliably attempted.
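    The Ricker wavelet used above as the source approximation has the standard closed form w(t) = (1 - 2π²f²t²) exp(-π²f²t²), where f is the peak frequency. A minimal implementation, evaluated at a peak frequency inside the 200-400 Hz band the study identifies as optimal:

```python
import numpy as np

def ricker(t, f_peak):
    """Ricker (Mexican-hat) wavelet with peak frequency f_peak in Hz:
    w(t) = (1 - 2*a) * exp(-a), with a = (pi * f_peak * t)**2."""
    a = (np.pi * f_peak * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

t = np.linspace(-0.02, 0.02, 2001)   # +/- 20 ms around zero time
w = ricker(t, 300.0)                 # peak frequency in the 200-400 Hz optimum
```

    The wavelet is zero-mean with a single central peak and two negative side lobes, which is why its amplitude spectrum has a well-defined peak frequency convenient for detectability arguments.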

  2. Seismic attenuation in Florida

    SciTech Connect

    Bellini, J.J.; Bartolini, T.J.; Lord, K.M.; Smith, D.L. (Dept. of Geology)

    1993-03-01

    Seismic signals recorded by the expanded distribution of earthquake seismograph stations throughout Florida and data from a comprehensive review of record archives from stations GAI contribute to an initial seismic attenuation model for the Florida Plateau. Based on calculations of surface particle velocity, a pattern of attenuation exists that appears to deviate from that established for the remainder of the southeastern US. Most values suggest greater seismic attenuation within the Florida Plateau. However, a separate pattern may exist for those signals arising from the Gulf of Mexico. These results have important implications for seismic hazard assessments in Florida and may be indicative of the unique lithospheric identity of the Florida basement as an exotic terrane.

  3. BUILDING 341 Seismic Evaluation

    SciTech Connect

    Halle, J.

    2015-06-15

    The Seismic Evaluation of Building 341, located at Lawrence Livermore National Laboratory in Livermore, California, has been completed. The subject building consists of a main building, Increment 1, and two smaller additions, Increments 2 and 3.

  4. Deepwater seismic acquisition technology

    SciTech Connect

    Caldwell, J.

    1996-09-01

    Although truly new technology is not required for successful acquisition of seismic data in deep Gulf of Mexico waters, it is helpful to review some basic aspects of these seismic surveys. Additionally, such surveys are likely to see early use of some emerging new technology which can improve data quality. Because such items as depth imaging, borehole seismic, 4-D and marine 3-component recording were mentioned in the May 1996 issue of World Oil, they are not discussed again here. However, these technologies will also play some role in deepwater seismic activities. This paper covers new considerations for: (1) longer data records needed in deeper water, (2) some pros and cons of very long streamer use, and (3) two new commercial systems for quantifying data quality.

  5. Third Quarter Hanford Seismic Report for Fiscal Year 2005

    SciTech Connect

    Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

    2005-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event May 25 near Vantage, Washington. During the third quarter, stratigraphically 17 (85%) events occurred in the Columbia River basalt (approximately 0-5 km), no events in the pre-basalt sediments (approximately 5-10 km), and three (15%) in the crystalline basement (approximately 10-25 km). Geographically, five (25%) earthquakes occurred in swarm areas, 10 (50%) earthquakes were associated with a major geologic structure, and 5 (25%) were classified as random events.

  6. Annual Hanford Seismic Report for Fiscal Year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-12-01

    This report describes the seismic activity in and around the Hanford Site during Fiscal Year 2003. Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 1,336 triggers during fiscal year 2003. Of these triggers, 590 were earthquakes. Of these, 101 earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 35 (34.6%) occurred in the Columbia River basalt, 29 (28.7%) were earthquakes in the pre-basalt sediments, and 37 (36.7%) were earthquakes in the crystalline basement. Geographically, 48 (47%) earthquakes occurred in swarm areas, 4 (4%) earthquakes were associated with a major geologic structure, and 49 (49%) were classified as random events. During the third and fourth quarters, an earthquake swarm consisting of 27 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur near the interface of the Columbia River Basalt Group and pre-basalt sediments.

  7. Probabilistic Seismic Hazard Deaggregation for Selected Egyptian Cities

    NASA Astrophysics Data System (ADS)

    Sawires, Rashad; Peláez, José A.; Fat-Helbary, Raafat E.; Panzera, Francesco; Ibrahim, Hamza A.; Hamdache, Mohamed

    2017-02-01

    A probabilistic seismic hazard analysis in terms of peak ground acceleration (PGA) and spectral acceleration (SA) values has been performed for the Egyptian territory. Eighty-eight potential seismic sources (for shallow- and intermediate-depth seismicity) in and around Egypt were identified and characterized based on an updated and unified earthquake catalog spanning the time period from 2200 B.C. until 2013 A.D. A logic-tree approach was followed, after a sensitivity analysis, to consider the epistemic uncertainty in the different input parameters, including the selected ground-motion attenuation models to predict the ground motion for the different tectonic environments. Then the seismic hazard deaggregation results, in terms of distance and magnitude, for the most important cities in Egypt have been computed to help in understanding the relative contributions of the different seismic sources. Seismic hazard deaggregation, in particular, was computed for PGA and SA at periods of 0.2, 1.0 and 2.0 s for rock-site conditions, and for 10% probability of exceedance in 50 years. In general, the results at most of the cities indicate that the distance to the seismic sources which mostly contribute to the seismic hazard is mainly controlled by the nearby seismic sources (especially for PGA). However, distant events contribute more to the hazard for larger spectral periods (for 1.0 and 2.0 s). A significant result of this type of work is that seismic hazard deaggregation provides useful data on the distance and magnitude of the contributing seismic sources to the hazard in a certain place, which can be applied to generate scenario earthquakes and select acceleration records for seismic design.
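    The 10%-in-50-years exceedance level used above has a standard interpretation: under the usual Poissonian occurrence assumption, p = 1 - exp(-t / T_r), so the corresponding mean return period T_r is about 475 years. A one-line check:

```python
import math

def return_period(p_exceed, t_years):
    """Mean return period for exceedance probability p_exceed over a
    t_years exposure time, assuming Poissonian occurrence:
    p = 1 - exp(-t / T_r)  =>  T_r = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

tr = return_period(0.10, 50.0)   # the 10%-in-50-years design level
```

    This is why hazard maps quoting "10% probability of exceedance in 50 years" are often described as 475-year return-period maps.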

  8. Development of Towed Marine Seismic Vibrator as an Alternative Seismic Source

    NASA Astrophysics Data System (ADS)

    Ozasa, H.; Mikada, H.; Murakami, F.; Jamali Hondori, E.; Takekawa, J.; Asakawa, E.; Sato, F.

    2015-12-01

    The principal issue with respect to marine impulsive sources used to acquire seismic data is whether the emission of acoustic energy inflicts harm on marine mammals, since the volume of the source signal released into the marine environment can be very large compared to the sound range of the mammals. We propose a marine seismic vibrator as an alternative to impulsive sources, to mitigate the risk of impact to the marine environment while satisfying the necessary conditions of seismic surveys. These conditions include the repeatability and the controllability of source signals in both amplitude and phase for high-quality measurements. We therefore designed a towed marine seismic vibrator (MSV) as a new type of marine vibratory seismic source that employs a hydraulic servo system for controllability in phase and amplitude, which assures repeatability as well. After fabricating a downsized MSV that requires 30 kVA of power at a depth of about 250 m in water, several sea trials were conducted to test the source characteristics of the downsized MSV in terms of amplitude, frequency, and horizontal and vertical directivities of the generated field. The maximum sound level satisfied the designed specification in frequencies ranging from 3 to 300 Hz, almost omnidirectionally. After checking the source characteristics, we then conducted a trial seismic survey, using both the downsized MSV and a 480-cubic-inch airgun for comparison, with a 2,000-m-long streamer cable right above a cabled earthquake observatory in the Japan Sea. The result showed that the penetration of seismic signals generated by the downsized MSV was comparable to that of the airgun, although there was a slight difference in signal-to-noise ratio. The MSV could become a versatile alternative to existing impulsive seismic sources such as airguns, one that will not harm living marine mammals.

  9. Revised seismic and geologic siting regulations for nuclear power plants

    SciTech Connect

    Murphy, A.J.; Chokshi, N.C.

    1997-02-01

    The primary regulatory basis governing the seismic design of nuclear power plants is contained in Appendix A to Part 50, General Design Criteria for Nuclear Power Plants, of Title 10 of the Code of Federal Regulations (CFR). General Design Criteria (GDC) 2 defines requirements for design bases for protection against natural phenomena. GDC 2 states the performance criterion that "Structures, systems, and components important to safety shall be designed to withstand the effects of natural phenomena such as earthquakes, . . . without loss of capability to perform their safety functions. . ." Appendix A to Part 100, Seismic and Geologic Siting Criteria for Nuclear Power Plants, has been the principal document which provided detailed criteria to evaluate the suitability of proposed sites and suitability of the plant design basis established in consideration of the seismic and geologic characteristics of the proposed sites. Appendix A defines required seismological and geological investigations and requirements for other design conditions such as soil stability, slope stability, and seismically induced floods and water waves, and requirements for seismic instrumentation. The NRC staff is in the process of revising Appendix A. The NRC has recently revised seismic siting and design regulations for future applications. These revisions are discussed in detail in this paper.

  10. Seismicity and seismotectonics in eastern Canada and vicinity

    NASA Astrophysics Data System (ADS)

    Ma, Shutian

    The aim of this thesis is to explore the fundamental nature of seismicity and seismotectonics in eastern Canada and vicinity. The findings have some instructive roles in seismological research and seismic hazard assessments. The first part is focused on developments and refinements to methodologies required for analysis of seismic phenomena. The second part is devoted to case histories, in which these methods are applied with the goal of developing greater insight into the nature of intraplate seismicity. Chapter Two describes a hybrid method for precise determination of earthquake hypocenters. The method partitions the inversion process by separating the inversion into distinct steps. The benefits of splitting the problem up stem from inherent tradeoffs between the focal depth, epicentral location and the origin time in the inversion process. Examples show that the approach yields more accurate solutions than those obtained using the original hypoDD analysis procedure. In Chapter Three a new moment-tensor inversion method is described and tested. The method is tailored for small earthquakes. It is interactive and uses adjustable, independently weighted time windows to isolate crustal phases. The technique is applied to a number of small earthquakes and also verified using a synthetic event. In Chapter Four, the developed techniques are applied to investigate seismicity of the western Quebec seismic zone (WQSZ). Seismicity in this zone is mainly localized along a hotspot track. A statistical approach is used to delineate spatial clusters of seismicity. The locations of several clusters are consistent with paleoseismic evidence for large prehistoric earthquakes, suggesting these clusters may be exceptionally long-lived aftershock sequences from prehistoric earthquakes. Chapter Five provides an analysis of seismicity and seismotectonics in northern Ontario. Four distinct types of seismic activity are noted. No obvious correlation was found between the seismicity on the Severn

  11. Seismic Consequence Abstraction

    SciTech Connect

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  12. Beyond Hammers and Nails: Mitigating and Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Gurney, Kevin Robert

    2013-05-01

    One of the biggest challenges to future international agreements on climate change is an independent, science-driven method of verifying reductions in greenhouse gas emissions (GHG) [Niederberger and Kimble, 2011]. The scientific community has thus far emphasized atmospheric measurements to assess changes in emissions. An alternative is direct measurement or estimation of fluxes at the source. Given the many challenges facing the approach that uses "top-down" atmospheric measurements and recent advances in "bottom-up" estimation methods, I challenge the current doctrine, which has the atmospheric measurement approach "validating" bottom-up, "good-faith" emissions estimation [Balter, 2012] or which holds that the use of bottom-up estimation is like "dieting without weighing oneself" [Nisbet and Weiss, 2010].

  13. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
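    The core idea above, deciding loss of separation exactly when trajectories are polynomial, can be sketched as a root-finding problem: with polynomial relative positions x(t), y(t), horizontal separation sep is lost within the lookahead window exactly when g(t) = x(t)² + y(t)² - sep² reaches zero (or starts negative) on [0, T]. The sketch below uses numpy's numerical root finder and is only an informal illustration of the idea, not NASA's formally verified algorithm.

```python
from numpy.polynomial import Polynomial

def horizontal_conflict(rel_x, rel_y, sep, lookahead):
    """Check whether two aircraft with polynomial relative position
    (rel_x(t), rel_y(t)) lose horizontal separation `sep` within
    [0, lookahead]. Decided via the sign and real roots of
    g(t) = x(t)**2 + y(t)**2 - sep**2."""
    g = rel_x * rel_x + rel_y * rel_y - sep ** 2
    if g(0.0) < 0.0:                 # already in conflict at t = 0
        return True
    for r in g.roots():              # g changes sign only at real roots
        if abs(r.imag) < 1e-9 and 0.0 <= r.real <= lookahead:
            return True
    return False

rel_y = Polynomial([0.0])
closing = Polynomial([10.0, -2.0])   # head-on closure: x(t) = 10 - 2t
opening = Polynomial([10.0, 2.0])    # diverging: x(t) = 10 + 2t
in_conflict = horizontal_conflict(closing, rel_y, 5.0, 10.0)
clear = horizontal_conflict(opening, rel_y, 5.0, 10.0)
```

    The paper's contribution is making this kind of check provably sound and complete; a numerical root finder like the one above offers no such guarantee and would need exact root isolation to match it.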

  14. A computerized record and verify system for radiation treatments.

    PubMed

    Mohan, R; Podmaniczky, K C; Caley, R; Lapidus, A; Laughlin, J S

    1984-10-01

    We have developed a general purpose, comprehensive, and highly reliable computerized Record and Verify System to detect and prevent mistakes in the delivery of external beam radiation therapy. This system helps prevent accidental delivery of a dangerous dose, improves quality control, and provides invaluable record keeping and report generating capabilities. Currently, treatment machine and couch parameter settings of four different machines are monitored by the system and compared with prescribed values. The system inhibits a machine from being turned on if the settings do not agree with the prescribed values to within specified maximum permissible deviations. The system is user-friendly and provides useful, complete, and easily accessible data. We describe many aspects of the system including hardware, software, data, and operation, and we conclude with a brief discussion of clinical experience and preliminary data.

  15. Analysis of Fingerprint Image to Verify a Person

    NASA Astrophysics Data System (ADS)

    Jahankhani, Hossein; Mohid, Maktuba

    Identification and authentication technologies are increasing day by day to protect people and goods from crime and terrorism. This paper discusses fingerprint technology in depth, analyzing fingerprint images to verify a person, with a highlight on fingerprint matching. Some fingerprint matching algorithms are analysed and compared. The outcomes of the analysis have identified some major issues or factors of fingerprinting, which are location, rotation, clipping, noise, non-linear distortion sensitivity/insensitivity properties, computational cost and accuracy level of fingerprint matching algorithms. A new fingerprint matching algorithm is also proposed in this research work. The proposed algorithm uses Euclidean distance, angle difference, and minutia type as matching parameters instead of specific location parameters (such as x or y coordinates), which makes the algorithm location and rotation insensitive. The matching of local neighbourhoods at each stage makes the algorithm non-linear distortion insensitive.
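    The key property claimed above, that descriptors built from Euclidean distance, angle difference, and minutia type are unchanged by translation and rotation, can be illustrated with a small sketch. The minutia format, tolerances, and matching rule below are hypothetical simplifications, not the paper's actual algorithm.

```python
import math

def local_descriptor(center, neighbors):
    """Describe a minutia by its neighbors using only relative
    measurements (Euclidean distance, angle difference, minutia type),
    so the descriptor is translation- and rotation-insensitive.
    Each minutia is (x, y, theta, mtype)."""
    cx, cy, ctheta, _ = center
    desc = []
    for x, y, theta, mtype in neighbors:
        dist = math.hypot(x - cx, y - cy)
        dtheta = (theta - ctheta) % (2 * math.pi)
        desc.append((dist, dtheta, mtype))
    return sorted(desc)

def descriptors_match(d1, d2, dist_tol=2.0, ang_tol=0.1):
    """True if every neighbor entry in d1 has a counterpart in d2
    within the distance/angle tolerances and with the same type."""
    if len(d1) != len(d2):
        return False
    return all(
        abs(a[0] - b[0]) <= dist_tol
        and abs(a[1] - b[1]) <= ang_tol
        and a[2] == b[2]
        for a, b in zip(d1, d2)
    )

def _transform(m, phi, tx, ty):
    """Rotate a minutia by phi about the origin and translate it."""
    x, y, theta, mtype = m
    c, s = math.cos(phi), math.sin(phi)
    return (c * x - s * y + tx, s * x + c * y + ty, theta + phi, mtype)

center = (0.0, 0.0, 0.2, "ending")
neighbors = [(3.0, 0.0, 0.5, "bifurcation"), (0.0, 4.0, 1.0, "ending")]
moved_center = _transform(center, 0.7, 10.0, -5.0)
moved_neighbors = [_transform(m, 0.7, 10.0, -5.0) for m in neighbors]
d1 = local_descriptor(center, neighbors)
d2 = local_descriptor(moved_center, moved_neighbors)
```

    Because only relative quantities enter the descriptor, rotating and translating the whole constellation leaves it unchanged, while the per-entry tolerances are what provide the distortion insensitivity the abstract describes.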

  16. Cryptanalysis and improvement of verifiable quantum ( k, n) secret sharing

    NASA Astrophysics Data System (ADS)

    Song, Xiuli; Liu, Yanbing

    2016-02-01

    After analyzing Yang's verifiable quantum secret sharing (VQSS) scheme, we show that in their scheme a participant can prepare a false quantum particle sequence corresponding to a forged share, while no other participant can trace it. In addition, an attacker or a participant can forge a new quantum sequence by transforming an intercepted quantum sequence; moreover, the forged sequence can pass the verification of other participants. We therefore propose a new VQSS scheme to improve the existing one. In the improved scheme, we construct an identity-based quantum signature encryption algorithm, which ensures chosen-plaintext-attack security of the shares and their signatures transmitted in the quantum tunnel. We employ dual quantum signature and a one-way function to trace against forgery and repudiation by the deceivers (dealer or participants). Furthermore, we add a reconstruction process for the quantum secret and prove the security of this process against superposition attacks.

  17. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions were first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  18. Short-Period Seismic Noise in Vorkuta (Russia)

    SciTech Connect

    Kishkina, S B; Spivak, A A; Sweeney, J J

    2008-05-15

    Cultural development of new subpolar areas of Russia is associated with a need for detailed seismic research, including both mapping of regional seismicity and seismic monitoring of specific mining enterprises. Of special interest are the northern territories of European Russia, including shelves of the Kara and Barents Seas, Yamal Peninsula, and the Timan-Pechora region. Continuous seismic studies of these territories are important now because there is insufficient seismological knowledge of the area and an absence of systematic data on the seismicity of the region. Another task of current interest is the necessity to consider the seismic environment in the design, construction, and operation of natural gas extracting enterprises such as the construction of the North European Gas Pipeline. Issues of scientific importance for seismic studies in the region are the complex geodynamical setting, the presence of permafrost, and the complex tectonic structure. In particular, the Uralian Orogene (Fig. 1) strongly affects the propagation of seismic waves. The existing subpolar seismic stations [APA (67.57°N, 33.40°E), LVZ (67.90°N, 34.65°E), and NRIL (69.50°N, 88.40°E)] do not cover the extensive area between the Pechora and Ob Rivers (Fig. 1). Thus seismic observations in the Vorkuta area, which lies within the area of concern, represent a special interest. Continuous recording at a seismic station near the city of Vorkuta (67.50°N, 64.11°E) [1] has been conducted since 2005 for the purpose of regional seismic monitoring and, more specifically, detection of seismic signals caused by local mining enterprises. Current surveys of local seismic noise [7,8,9,11], are particularly aimed at a technical survey for the suitability of the site for installation of a small-aperture seismic array, which would include 10-12 recording instruments, with the Vorkuta seismic station as the central element. When constructed, this seismic

  19. Seismic risk management solution for nuclear power plants

    SciTech Connect

    Coleman, Justin; Sabharwall, Piyush

    2014-12-01

    Nuclear power plants should safely operate during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to acceptable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise to minimize seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in American Society of Civil Engineers Standard 4, ASCE-4, to be released in the winter of 2014, for light water reactor facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for seismic analysis of safety related nuclear structures such that the responses to design basis seismic events, computed in accordance with this standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. As a result, this article highlights the gaps identified at this meeting.
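    The convolution of hazard with fragility mentioned above has a standard numerical form: the annual failure frequency is the integral of the fragility curve against the derivative of the hazard curve, P_f = ∫ F(a) |dH/da| da. The sketch below evaluates it with a trapezoidal rule; the power-law hazard curve and lognormal fragility parameters are hypothetical illustrations, not data from any plant.

```python
import numpy as np
from math import erf, log, sqrt

def lognormal_fragility(a, a_median, beta):
    """Conditional probability of failure at ground-motion level a for a
    component with median capacity a_median and log-std-dev beta."""
    return 0.5 * (1.0 + erf(log(a / a_median) / (beta * sqrt(2.0))))

def annual_failure_frequency(accels, hazard, a_median, beta):
    """Convolve a hazard curve (annual frequency of exceeding each
    acceleration) with a lognormal fragility:
    P_f = integral of F(a) * |d hazard / d a| da (trapezoidal rule)."""
    frag = np.array([lognormal_fragility(a, a_median, beta) for a in accels])
    density = -np.gradient(hazard, accels)   # hazard decreases with a
    y = frag * density
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(accels)))

accels = np.linspace(0.05, 3.0, 400)         # peak ground acceleration, g
hazard = 1.0e-4 * accels ** -2.0             # hypothetical power-law hazard curve
risk = annual_failure_frequency(accels, hazard, a_median=0.9, beta=0.4)
```

    Seismic isolation enters this calculation by shifting the fragility curve to higher capacities (larger effective a_median), which directly reduces the integrated risk.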

  20. Seismic risk management solution for nuclear power plants

    DOE PAGES

    Coleman, Justin; Sabharwall, Piyush

    2014-12-01

    Nuclear power plants should operate safely during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to acceptable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise for minimizing seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in American Society of Civil Engineers Standard 4 (ASCE-4), to be released in the winter of 2014, for light water reactor facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for seismic analysis of safety-related nuclear structures such that the responses to design basis seismic events, computed in accordance with this standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. This article highlights the gaps identified at that meeting.

  1. The Budget Guide to Seismic Network Management

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Ebel, J. E.

    2007-05-01

    Regardless of their size, there are certain tasks that all seismic networks must perform, including data collection and processing, earthquake location, information dissemination, and quality control. Small seismic networks are unlikely to possess the resources -- manpower and money -- required to do much in-house development. Fortunately, many free or inexpensive software solutions are available that can perform the required tasks. Often the available solutions are all-in-one turnkey packages designed and developed for much larger seismic networks, and the cost of adapting them to a smaller network must be weighed against the ease with which other, non-seismic software can be adapted to the same task. We describe here the software and hardware choices we have made for the New England Seismic Network (NESN), a sparse regional seismic network responsible for monitoring and reporting all seismicity within the New England region in the northeastern U.S. We have chosen a cost-effective approach to monitoring using free, off-the-shelf solutions where available (e.g., Earthworm, HYP2000) and modifying freeware solutions when that is easier than adapting a large, complicated package. We have selected software that is free, likely to receive continued support from the seismic or, preferably, larger internet community, and modular. Modularity is key to our design because it ensures that if one component of our processing system becomes obsolete, we can insert a suitable replacement with few modifications to the other modules. Our automated event detection, identification and location system is based on a wavelet transform analysis of station data that arrive continuously via TCP/IP transmission over the internet. Our system for interactive analyst review of seismic events and remote system monitoring utilizes a combination of Earthworm modules, Perl cgi-bin scripts, Java, and native Unix commands and can now be carried out via
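    The wavelet-transform detection idea can be illustrated with a minimal sketch (not NESN's actual detector): one Haar detail pass followed by a median-based threshold. The threshold factor and the toy trace are assumptions:

```python
def haar_details(x):
    """One level of the Haar wavelet transform: detail (high-pass) coefficients."""
    return [(x[i] - x[i + 1]) / 2.0 for i in range(0, len(x) - 1, 2)]

def detect(x, k=5.0):
    """Flag detail coefficients whose magnitude exceeds k times the median,
    i.e. windows carrying anomalously high high-frequency energy."""
    e = [abs(v) for v in haar_details(x)]
    med = sorted(e)[len(e) // 2] or 1e-12   # guard against a zero threshold
    return [i for i, v in enumerate(e) if v > k * med]

trace = [0.0] * 256
trace[100], trace[101] = 1.0, -1.0          # an impulsive "event" in the record
print(detect(trace))                        # → [50]  (coefficient index 100 // 2)
```

A production detector would run this over multiple scales and channels and then associate picks across stations before locating an event.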

  2. Seismic Hazard Characterization at the DOE Savannah River Site (SRS): Status report

    SciTech Connect

    Savy, J.B.

    1994-06-24

    The purpose of the Seismic Hazard Characterization project for the Savannah River Site (SRS-SHC) is to develop estimates of the seismic hazard for several locations within the SRS. Given the differences in geology and geotechnical characteristics at each location, the estimates of the seismic hazard account for the specific local conditions at each site. Characterization of seismic hazard is a critical factor in the design of new facilities as well as in the review and potential retrofit of existing facilities at SRS. The scope of the SRS seismic hazard characterization reported in this document is limited to Probabilistic Seismic Hazard Analysis (PSHA). The goal of the project is to provide seismic hazard estimates based on a state-of-the-art method consistent with the developments and findings of several ongoing studies that are expected to improve the state of seismic hazard analysis.

  3. Seismic exploration for water on Mars

    NASA Technical Reports Server (NTRS)

    Page, Thornton

    1987-01-01

    It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present surface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each is well formed on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.

  4. Permafrost Active Layer Seismic Interferometry Experiment (PALSIE).

    SciTech Connect

    Abbott, Robert; Knox, Hunter Anne; James, Stephanie; Lee, Rebekah; Cole, Chris

    2016-01-01

    We present findings from a novel field experiment conducted at Poker Flat Research Range in Fairbanks, Alaska that was designed to monitor changes in active layer thickness in real time. Results are derived primarily from seismic data streaming from seven Nanometric Trillium Posthole seismometers directly buried in the upper section of the permafrost. The data were evaluated using two analysis methods: Horizontal to Vertical Spectral Ratio (HVSR) and ambient noise seismic interferometry. Results from the HVSR conclusively illustrated the method's effectiveness at determining the active layer's thickness with a single station. Investigations with the multi-station method (ambient noise seismic interferometry) are continuing at the University of Florida and have not yet conclusively determined active layer thickness changes. Further work continues with the Bureau of Land Management (BLM) to determine if the ground based measurements can constrain satellite imagery, which provide measurements on a much larger spatial scale.
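    As a rough illustration of the single-station HVSR idea (the PALSIE processing is considerably more involved), one can pick the HVSR peak frequency and apply the quarter-wavelength rule to estimate layer thickness; the spectra and shear-wave velocity below are synthetic assumptions:

```python
import math

def hvsr(north, east, vert):
    """Horizontal-to-vertical spectral ratio from three amplitude spectra
    sampled at the same frequencies."""
    return [math.sqrt((n * n + e * e) / 2.0) / max(v, 1e-12)
            for n, e, v in zip(north, east, vert)]

def layer_thickness(freqs, ratio, vs):
    """Quarter-wavelength rule: thickness h = Vs / (4 * f0), with f0 the
    frequency of the HVSR peak."""
    f0 = freqs[max(range(len(ratio)), key=ratio.__getitem__)]
    return vs / (4.0 * f0)

freqs = [float(f) for f in range(1, 51)]                      # 1..50 Hz
vert = [1.0] * 50                                             # flat vertical spectrum
horiz = [1.0 + 4.0 * math.exp(-((f - 25.0) ** 2) / 8.0) for f in freqs]
h = layer_thickness(freqs, hvsr(horiz, horiz, vert), vs=150.0)
print(h)  # → 1.5  (metres, for the assumed 150 m/s Vs and the 25 Hz peak)
```

In a permafrost setting the strong velocity contrast at the thaw front is what makes the HVSR peak a usable proxy for active layer thickness.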

  5. SEISMIC MODELING ENGINES PHASE 1 FINAL REPORT

    SciTech Connect

    BRUCE P. MARION

    2006-02-09

    Seismic modeling is a core component of petroleum exploration and production today. Potential applications include modeling the influence of dip on anisotropic migration; source/receiver placement in deviated-well three-dimensional surveys for vertical seismic profiling (VSP); and the generation of realistic data sets for testing contractor-supplied migration algorithms or for interpreting AVO (amplitude variation with offset) responses. This project was designed to extend the use of a finite-difference modeling package, developed at Lawrence Berkeley Laboratories, to the advanced applications needed by industry. The approach included a realistic, easy-to-use 2-D modeling package for the desktop of the practicing geophysicist. The feasibility of providing a wide-ranging set of seismic modeling engines was fully demonstrated in Phase I. The technical focus was on adding variable gridding in both the horizontal and vertical directions, incorporating attenuation, improving absorbing boundary conditions and adding the optional coefficient finite difference methods.
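    A minimal 1-D explicit finite-difference step with simple first-order absorbing ends conveys the flavor of such a modeling engine (the actual package is 2-D, variably gridded, and includes attenuation); all grid parameters here are illustrative:

```python
import math

def fd_step(p_prev, p, c, dt, dx):
    """One explicit step of the 1-D acoustic wave equation, 2nd order in
    time and space. Stability requires c*dt/dx <= 1 (CFL condition)."""
    r2 = (c * dt / dx) ** 2
    nxt = [0.0] * len(p)
    for i in range(1, len(p) - 1):
        nxt[i] = 2 * p[i] - p_prev[i] + r2 * (p[i + 1] - 2 * p[i] + p[i - 1])
    a = (c * dt - dx) / (c * dt + dx)        # first-order (Mur-style) absorbing ends
    nxt[0] = p[1] + a * (nxt[1] - p[0])
    nxt[-1] = p[-2] + a * (nxt[-2] - p[-1])
    return nxt

n = 200
p = [math.exp(-0.01 * (i - 100) ** 2) for i in range(n)]   # initial pressure pulse
p_prev = p[:]                                              # medium starts at rest
for _ in range(400):
    p_prev, p = p, fd_step(p_prev, p, c=2000.0, dt=0.0005, dx=2.0)
print(max(abs(v) for v in p))   # small residual: the split pulse was absorbed
```

Variable gridding and higher-order (optimized-coefficient) stencils, mentioned in the abstract, reduce the dispersion and cost of exactly this kind of update loop.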

  6. Seismic Safety Study

    SciTech Connect

    Tokarz, F J; Coats, D W

    2006-05-16

    During the past three decades, the Laboratory has been proactive in providing a seismically safe working environment for its employees and the general public. Completed seismic upgrades during this period have exceeded $30M with over 24 buildings structurally upgraded. Nevertheless, seismic questions still frequently arise regarding the safety of existing buildings. To address these issues, a comprehensive study was undertaken to develop an improved understanding of the seismic integrity of the Laboratory's entire building inventory at the Livermore Main Site and Site 300. The completed study of February 2005 extended the results from the 1998 seismic safety study per Presidential Executive Order 12941, which required each federal agency to develop an inventory of its buildings and to estimate the cost of mitigating unacceptable seismic risks. Degenkolb Engineers, who performed the first study, was recontracted to perform structural evaluations, rank order the buildings based on their level of seismic deficiencies, and to develop conceptual rehabilitation schemes for the most seriously deficient buildings. Their evaluation is based on screening procedures and guidelines as established by the Interagency Committee on Seismic Safety in Construction (ICSSC). Currently, there is an inventory of 635 buildings in the Laboratory's Facility Information Management System's (FIMS's) database, out of which 58 buildings were identified by Degenkolb Engineers that require seismic rehabilitation. The remaining 577 buildings were judged to be adequate from a seismic safety viewpoint. The basis for these evaluations followed the seismic safety performance objectives of DOE standard (DOE STD 1020) Performance Category 1 (PC1). The 58 buildings were ranked according to three risk-based priority classifications (A, B, and C) as shown in Figure 1-1 (all 58 buildings have structural deficiencies). Table 1-1 provides a brief description of their expected performance and damage state

  7. Probabilistic Seismic Hazard Assessment for Iraq

    SciTech Connect

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq; Shakir, Ammar M.; Mahdi, Hanan; Numan, Nazar M.S.; Al-Shukri, Haydar; Chlaib, Hussein K.; Ameen, Taher H.; Abd, Najah A.

    2016-05-06

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997; an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date; b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design; and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
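    The two probability levels mentioned, 10% and 2% chance of exceedance in 50 years, correspond to the familiar 475- and 2475-year return periods under a Poisson occurrence assumption:

```python
import math

def return_period(p_exceed, window_years):
    """Mean return period for an exceedance probability p_exceed within a
    window of window_years, assuming Poisson-distributed occurrences:
    p = 1 - exp(-T / RP)  =>  RP = -T / ln(1 - p)."""
    return -window_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))  # → 475
print(round(return_period(0.02, 50)))  # → 2475
```

This is why results computed at the 10%-in-50-years level cannot simply be reused by a code that requires the 2%-in-50-years level: the underlying ground motions belong to a roughly five-times-rarer event.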

  8. Constraints on Subglacial Conditions from Seismicity

    NASA Astrophysics Data System (ADS)

    Lipovsky, B.; Olivo, D. C.; Dunham, E. M.

    2014-12-01

    A family of physics-based models designed to explain emergent, bandlimited, "tremor-like" seismograms sheds light on subglacial and englacial conditions. We consider two such models. In the first, a water-filled fracture hosts resonant modes; the seismically observable quality factor and characteristic frequency of these modes constrain the fracture length and aperture. In the second model, seismicity is generated by repeating stick-slip events on a fault patch (a portion of the glacier bed) with sliding described by rate- and state-dependent friction laws. Wave propagation phenomena may additionally generate bandlimited seismic signals. These models make distinct predictions that may be used to address questions of glaciological concern. Laboratory friction experiments show that small, repeating earthquakes most likely occur at the ice-till interface and at conditions below the pressure melting point. These laboratory friction values, when combined with observed ice surface velocities, may also be used to constrain basal pore pressure. In contrast, seismic signals indicative of water-filled basal fractures suggest that, at least locally, temperatures are above the pressure melting point. We present a simple diagnostic test between these two processes that concerns the relationship between the multiple seismic spectral peaks generated by each process. Whereas repeating earthquakes generate evenly spaced spectral peaks through the Dirac comb effect, hydraulic fracture resonance, as a result of dispersive propagation of waves along the crack, generates spectral peaks that are not evenly spaced.
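    The Dirac comb effect can be seen in a toy spectrum of a periodic pulse train (all parameters invented): identical repeating events produce evenly spaced spectral peaks, whereas a dispersive resonating crack would not:

```python
import cmath

def spectrum(x):
    """Magnitude DFT of a real sequence; O(N^2), fine for a sketch."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N)))
            for k in range(N // 2)]

# Pulse train: identical "events" repeating every T samples
N, T = 128, 16
x = [1.0 if n % T == 0 else 0.0 for n in range(N)]
mags = spectrum(x)
peaks = [k for k, m in enumerate(mags) if m > 0.5 * max(mags)]
print(peaks)  # → [0, 8, 16, 24, 32, 40, 48, 56]  (evenly spaced at N // T)
```

Checking whether observed spectral peaks are evenly spaced, as here, or irregularly spaced is exactly the diagnostic the abstract proposes.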

  9. A robotics-based testbed for verifying a method of identifying contact-dynamics model parameters

    NASA Astrophysics Data System (ADS)

    Ma, Ou; Boyden, Samuel

    2006-05-01

    This paper describes a general method of identifying the key parameters of multiple-point contact-dynamics models and a robotics-based testbed for experimentally verifying the new method. Some current and future flight systems are required to make physical contact on orbit for servicing tasks such as docking, refueling, and repair. Because of the high risks associated with contact operations, the design and operation of such a flight system must be thoroughly analyzed and verified in advance by hardware testing and/or high-fidelity computer simulation. Computer simulations are increasingly playing a major role in system verification because it is extremely difficult to test 6-DOF microgravity contact dynamics on the ground. However, the accuracy of a computer simulation depends not only on the mathematical model (i.e., formulation, algorithms, and computer code) but also on the values of the model parameters. It is, therefore, desirable to have a systematic method which can identify multiple model parameters directly from routine physical tests of the contact components. The robotics-based experimental testbed introduced in this paper is specially designed to test and verify such a method of identifying contact parameters. The method is capable of identifying the key stiffness, damping, and friction parameters of a contact-dynamics model all together from hardware tests of contacting components having complicated geometries and multiple contacts. It can also be used to extract contact-dynamics model parameters of a dynamic system from routine tests of complex contact hardware. The paper discusses the major design requirements of this experimental testbed and how they are met by the specific design of the system.

  10. Common Core Units in Business Education: Sorting, Checking, and Verifying.

    ERIC Educational Resources Information Center

    Contra Costa County Superintendent of Schools, CA.

    This secondary unit of instruction on handling sales orders is one of sixteen Common Core Units in Business Education (CCUBE). The units were designed for implementing the sixteen common core competencies identified in the California Business Education Program Guide for Office and Distributive Education. Each competency-based unit is designed to…

  11. Static corrections for enhanced signal detection at IMS seismic arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, Neil; Wookey, James; Selby, Neil

    2016-04-01

    Seismic monitoring forms an important part of the International Monitoring System (IMS) for verifying the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Analysis of seismic data can be used to discriminate between nuclear explosions and the tens of thousands of natural earthquakes of similar magnitude that occur every year. This is known as "forensic seismology", and techniques include measuring the P-to-S wave amplitude ratio, the body-to-surface wave magnitude ratio (mb/Ms), and source depth. Measurement of these seismic discriminants requires very high signal-to-noise ratio (SNR) data, and this has led to the development and deployment of seismic arrays as part of the IMS. Array processing methodologies such as stacking can be used, but optimum SNR improvement needs an accurate estimate of the arrival time of the particular seismic phase. To enhance the imaging capability of IMS arrays, we aim to develop site-specific static corrections to the arrival time as a function of frequency, slowness and backazimuth. Here, we present initial results for the IMS TORD array in Niger. Vespagrams are calculated for various events using the F-statistic to clearly identify seismic phases and measure their arrival times. Observed arrival times are compared with those predicted by 1D and 3D velocity models, and residuals are calculated for a range of backazimuths and slownesses. Finally, we demonstrate the improvement in signal fidelity provided by these corrections.
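    Delay-and-sum stacking, whose alignment the proposed static corrections refine, can be sketched as follows; the array geometry, slowness, and sampling below are invented:

```python
def beam(traces, positions, slowness, dt):
    """Delay-and-sum beam: shift each trace by s·r (nearest sample) and stack."""
    n = len(traces[0])
    out = [0.0] * n
    for tr, (x, y) in zip(traces, positions):
        shift = int(round((slowness[0] * x + slowness[1] * y) / dt))
        for i in range(n):
            if 0 <= i + shift < n:
                out[i] += tr[i + shift] / len(traces)
    return out

# Three-station line array; plane wave with slowness 0.25 s/km along x
positions = [(0.0, 0.0), (1000.0, 0.0), (2000.0, 0.0)]
dt, s = 0.01, (0.25e-3, 0.0)                       # slowness in s/m
traces = []
for x, _ in positions:
    tr = [0.0] * 300
    tr[100 + int(round(s[0] * x / dt))] = 1.0      # arrival delayed by s·r
    traces.append(tr)
print(max(beam(traces, positions, s, dt)))         # ≈ 1.0: coherent stack
print(max(beam(traces, positions, (0.0, 0.0), dt)))  # smeared without the shifts
```

Site-specific static corrections amount to small extra time shifts per station on top of the plane-wave delays, tightening the alignment and hence the beam's SNR.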

  12. Overview of seismic considerations at the Paducah Gaseous Diffusion Plant

    SciTech Connect

    Hunt, R.J.; Stoddart, W.C.; Burnett, W.A.; Beavers, J.E.

    1992-10-01

    This paper presents an overview of seismic considerations at the Paducah Gaseous Diffusion Plant (PGDP), which is managed by Martin Marietta Energy Systems, Inc., for the Department of Energy (DOE). The overview describes the original design, the seismic evaluations performed for the Safety Analysis Report (SAR) issued in 1985, and current evaluations and designs to address revised DOE requirements. Future plans to ensure changes in requirements and knowledge are addressed.

  13. Effects of Large and Small-Source Seismic Surveys on Marine Mammals and Sea Turtles

    NASA Astrophysics Data System (ADS)

    Holst, M.; Richardson, W. J.; Koski, W. R.; Smultea, M. A.; Haley, B.; Fitzgerald, M. W.; Rawson, M.

    2006-05-01

    L-DEO implements a marine mammal and sea turtle monitoring and mitigation program during its seismic surveys. The program consists of visual observations, mitigation, and/or passive acoustic monitoring (PAM). Mitigation includes ramp-ups, power-downs, and shutdowns of the seismic source if marine mammals or turtles are detected in or about to enter designated safety radii. Visual observations for marine mammals and turtles have taken place during all 11 L-DEO surveys since 2003, and PAM was done during five of those. Large sources were used during six cruises (10 to 20 airguns; 3050 to 8760 in³; PAM during four cruises). For two interpretable large-source surveys, densities of marine mammals were lower during seismic than non-seismic periods. During a shallow-water survey off Yucatán, delphinid densities during non-seismic periods were 19x higher than during seismic; however, this number is based on only 3 sightings during seismic and 11 sightings during non-seismic. During a Caribbean survey, densities were 1.4x higher during non-seismic. The mean closest point of approach (CPA) for delphinids for both cruises was significantly farther during seismic (1043 m) than during non-seismic (151 m) periods (Mann-Whitney U test, P < 0.001). Large whales were only seen during the Caribbean survey; mean CPA during seismic was 1722 m compared to 1539 m during non-seismic, but sample sizes were small. Acoustic detection rates with and without seismic were variable for three large-source surveys with PAM, with rates during seismic ranging from 1/3 to 6x those without seismic (n = 0 for the fourth survey). The mean CPA for turtles was closer during non-seismic (139 m) than seismic (228 m) periods (P < 0.01). Small-source surveys used up to 6 airguns or 3 GI guns (75 to 1350 in³). During a Northwest Atlantic survey, delphinid densities during seismic and non-seismic were similar. However, in the Eastern Tropical Pacific, delphinid densities during non-seismic were 2x those during
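    The Mann-Whitney U statistic used for the CPA comparison can be computed without any statistics library; the sample CPA values below are invented, not the survey data:

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a versus sample b.
    Counts pairs where a wins (1), ties count 0.5; no tie correction."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in a for y in b)

# Toy CPA distances in metres: larger stand-off during seismic periods
seismic = [900.0, 1100.0, 1200.0]
quiet = [120.0, 150.0, 180.0]
print(mann_whitney_u(seismic, quiet))  # → 9.0  (every seismic CPA exceeds every quiet one)
```

A full test would convert U to a p-value via the normal approximation or exact tables; the extreme U here (the maximum possible for 3 × 3 samples) is what a strong avoidance signal looks like.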

  14. Seismic performance of RC shear wall structure with novel shape memory alloy dampers in coupling beams

    NASA Astrophysics Data System (ADS)

    Mao, Chenxi; Dong, Jinzhi; Li, Hui; Ou, Jinping

    2012-04-01

    The shear wall system is widely adopted in high-rise buildings because of its high lateral stiffness in resisting earthquakes. According to the concept of ductile seismic design, coupling beams in a shear wall structure are required to yield prior to damage of the wall limbs. However, damage in coupling beams results in repair costs after an earthquake, and in some cases the coupling beams are difficult to repair if the damage is severe. To solve this problem, a novel passive SMA damper is proposed in this study. The coupling beams connecting the wall limbs are split in the middle, and the dampers are installed between the ends of the two cantilevers. The relative flexural deformation of the wall limbs is thus transferred to the ends of the coupling beams and then to the SMA dampers. After an earthquake, the deformation of the dampers recovers automatically because of the pseudoelasticity of the austenite SMA material. To verify the validity of the proposed dampers, the seismic responses of a 12-story coupled shear wall with such passive SMA dampers in its coupling beams were investigated. The additional stiffness and yielding deformation of the dampers and their ratios to the lateral stiffness and yielding displacements of the wall limbs are key design parameters and were addressed. Analytical results indicate that the displacement responses of the shear wall structure with such dampers are reduced remarkably. The deformation of the structure is concentrated in the dampers, and damage to the coupling beams is reduced.
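    The pseudoelastic behavior that lets the dampers recenter can be caricatured by the loading backbone of an idealized flag-shaped model; the stiffnesses and transformation force below are hypothetical, and a full model would also track the lower unloading plateau that returns the damper to zero residual displacement:

```python
def sma_backbone(d, k1=2000.0, f_y=5.0, k2=200.0):
    """Loading branch of an idealized flag-shaped (pseudoelastic) SMA damper:
    initial stiffness k1 up to the transformation force f_y, then plateau
    stiffness k2. All parameter values here are illustrative only."""
    dy = f_y / k1                        # displacement at start of transformation
    if abs(d) <= dy:
        return k1 * d                    # elastic branch
    sign = 1.0 if d > 0 else -1.0
    return sign * (f_y + k2 * (abs(d) - dy))

print(sma_backbone(0.001))   # → 2.0  (elastic branch)
print(sma_backbone(0.005))   # → 5.5  (transformation plateau)
```

The design parameters named in the abstract map directly onto this sketch: k1 and dy are the added stiffness and yielding deformation whose ratios to the wall-limb properties govern the response.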

  15. Web Seismic Un*x: making seismic reflection processing more accessible

    NASA Astrophysics Data System (ADS)

    Templeton, M. E.; Gough, C. A.

    1999-05-01

    Web Seismic Un*x is a browser-based user interface for the Seismic Un*x freeware developed at Colorado School of Mines. The interface allows users to process and display seismic reflection data from any remote platform that runs a graphical Web browser. Users access data and create processing jobs on a remote server by completing form-based Web pages whose Common Gateway Interface scripts are written in Perl. These scripts supply parameters, manage files, call Seismic Un*x routines and return data plots. The interface was designed for undergraduate commuter students taking geophysics courses who need to: (a) process seismic data and other time series as a class using computers in campus teaching labs and (b) complete course assignments at home. Students from an undergraduate applied geophysics course tested the Web user interface while completing laboratory assignments in which they acquired and processed common-depth-point seismic reflection data into a subsurface image. This freeware, which will be publicly available by summer 1999, was developed and tested on a Solaris 2.5 server and will be ported to other versions of Unix, including Linux.

  16. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    SciTech Connect

    Karyono; Mazzini, Adriano; Sugiharto, Anton; Lupi, Matteo; Syafri, Ildrem; Masturyono,; Rudiyanto, Ariska; Pranata, Bayu; Muzli,; Widodo, Handi Sulistyo; Sudrajat, Ajat

    2015-04-24

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on 29 May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deployed 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign aiming to identify the occurrence and location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering studies, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.

  17. The Lusi seismic experiment: An initial study to understand the effect of seismic activity to Lusi

    NASA Astrophysics Data System (ADS)

    Karyono, Mazzini, Adriano; Lupi, Matteo; Syafri, Ildrem; Masturyono, Rudiyanto, Ariska; Pranata, Bayu; Muzli, Widodo, Handi Sulistyo; Sudrajat, Ajat; Sugiharto, Anton

    2015-04-01

    The spectacular Lumpur Sidoarjo (Lusi) eruption started in northeast Java on 29 May 2006, following a M6.3 earthquake striking the island [1,2]. Initially, several gas and mud eruption sites appeared along the reactivated strike-slip Watukosek fault system [3], and within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. The Lusi seismic experiment is a project that aims to begin a detailed study of seismicity around the Lusi area. In this initial phase we deployed 30 seismometers strategically distributed in the area around Lusi and along the Watukosek fault zone that stretches between Lusi and the Arjuno Welirang (AW) complex. The purpose of the initial monitoring is to conduct a preliminary seismic campaign aiming to identify the occurrence and location of local seismic events in east Java, particularly beneath Lusi. This network will locate small events that may not be captured by the existing BMKG network. It will be crucial for designing the second phase of the seismic experiment, which will consist of a local earthquake tomography of the Lusi-AW region and spatial and temporal variations of vp/vs ratios. The goal of this study is to understand how the seismicity occurring along the Sunda subduction zone affects the behavior of the Lusi eruption. Our study will also provide a large dataset for a qualitative analysis of earthquake triggering studies, earthquake-volcano and earthquake-earthquake interactions. In this study, we will extract Green's functions from ambient seismic noise data in order to image the shallow subsurface structure beneath the Lusi area. The waveform cross-correlation technique will be applied to all recordings of ambient seismic noise at 30 seismographic stations around the Lusi area. We use the dispersive behaviour of the retrieved Rayleigh waves to infer velocity structures in the shallow subsurface.
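    The ambient-noise workflow rests on cross-correlating station pairs; a toy time-domain version (with invented impulse "wavelets") shows how the correlation peak recovers the inter-station travel time:

```python
def xcorr(a, b, max_lag):
    """Time-domain cross-correlation of two records for lags in [-max_lag, max_lag].
    Stacked over long ambient-noise recordings, the result approximates the
    inter-station Green's function."""
    n = len(a)
    return [sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
            for lag in range(-max_lag, max_lag + 1)]

a = [0.0] * 64
b = [0.0] * 64
a[10], b[14] = 1.0, 1.0          # same wavelet arrives 4 samples later at station b
c = xcorr(a, b, max_lag=8)
print(c.index(max(c)) - 8)       # → 4  (inter-station travel time in samples)
```

Repeating this over many station pairs and frequency bands yields the dispersive Rayleigh-wave travel times that the abstract proposes to invert for shallow velocity structure.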

  18. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that data collection and processing are done in the "cloud" (the Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.
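    The block-cell privacy rule described above can be sketched as follows; the cell size and minimum count are assumptions, not the CSN's actual parameters:

```python
from collections import Counter

def displayable_cells(sensors, cell_m=100.0, min_count=2):
    """Bin sensor coordinates into block-wide cells and keep only cells holding
    at least min_count sensors, so no individual host can be singled out."""
    cells = Counter((int(x // cell_m), int(y // cell_m)) for x, y in sensors)
    return {cell: n for cell, n in cells.items() if n >= min_count}

sensors = [(10.0, 10.0), (20.0, 30.0), (250.0, 10.0)]
print(displayable_cells(sensors))   # → {(0, 0): 2}  (the lone sensor is suppressed)
```

For mobile sensors the same thresholding is applied before display, which is what prevents a single moving phone from being tracked cell to cell.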

  19. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  20. Seismic requalification of a safety class crane

    SciTech Connect

    Wu, Ting-shu; Moran, T.J.

    1991-01-01

    A remotely operated 5-ton crane within a nuclear fuel handling facility was designed and constructed over 25 years ago. At that time, less severe design criteria, particularly on seismic loadings, were in use. This crane is being reactivated and requalified under new design criteria with loads including a site specific design basis earthquake. Detailed analyses of the crane show that the maximum stress coefficient is less than 90% of the code allowable, indicating that this existing crane is able to withstand loadings including those from the design basis earthquake. 3 refs., 8 figs., 2 tabs.

  1. Regional seismic discrimination research at LLNL

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.; Patton, H.J.; Jarpe, S.; Glenn, L.

    1995-10-01

    The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve the understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report the authors discuss preliminary efforts to geophysically characterize the Middle East and North Africa. They show that the remarkable stability of coda allows one to develop physically based, stable single station magnitude scales in new regions. They then discuss progress to date on evaluating and improving physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. The authors apply this modeling ability to develop improved discriminants including slopes of P to S ratios. They find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally they discuss development and use of new coda and waveform modeling tools to investigate special events.
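    A toy version of a P-to-S amplitude-ratio screen illustrates the discriminant family discussed here; real discriminants are measured in frequency bands and calibrated per region, and the dB threshold below is invented:

```python
import math

def ps_ratio_db(p_amp, s_amp):
    """P-to-S amplitude ratio in decibels. Explosions tend to be P-rich
    relative to earthquakes, so larger values suggest an explosion-like source."""
    return 20.0 * math.log10(p_amp / s_amp)

def classify(p_amp, s_amp, threshold_db=3.0):
    """Toy screening rule with an assumed, uncalibrated threshold."""
    return ("explosion-like" if ps_ratio_db(p_amp, s_amp) > threshold_db
            else "earthquake-like")

print(classify(4.0, 1.0))   # → explosion-like  (P/S = 4, about +12 dB)
print(classify(1.0, 2.0))   # → earthquake-like (P/S = 0.5, about -6 dB)
```

The report's point about combining disparate discriminants corresponds to running several such measures jointly, so that events consistently flagged by all of them stand out as outliers.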

  2. LLNL's regional seismic discrimination research

    SciTech Connect

    Walter, W.R.; Mayeda, K.M.; Goldstein, P.

    1995-07-01

The ability to negotiate and verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve our understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report we discuss our preliminary efforts to geophysically characterize two regions, the Korean Peninsula and the Middle East-North Africa. We show that the remarkable stability of coda allows us to develop physically based, stable single station magnitude scales in new regions. We then discuss our progress to date on evaluating and improving our physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. We apply this modeling ability to develop improved discriminants including slopes of P to S ratios. We find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally we discuss our development and use of new coda and waveform modeling tools to investigate special events.

  3. Verifying operator fitness - an imperative not an option

    SciTech Connect

    Scott, A.B. Jr.

    1987-01-01

In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster suffered a jarring reversal. Through an incredible series of personal errors, the operators at what was later to be termed one of the best operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world, not only those involved with nuclear power, are grappling with how to assure the fitness for duty of those in their employ, specifically users of substances that impair the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the absence of impairment from substance abuse, which is how many consider it today. Full fitness for duty implies mental and moral fitness as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

  4. VDVR: verifiable visualization of projection-based data.

    PubMed

    Zheng, Ziyi; Xu, Wei; Mueller, Klaus

    2010-01-01

    Practical volume visualization pipelines are never without compromises and errors. A delicate and often-studied component is the interpolation of off-grid samples, where aliasing can lead to misleading artifacts and blurring, potentially hiding fine details of critical importance. The verifiable visualization framework we describe aims to account for these errors directly in the volume generation stage, and we specifically target volumetric data obtained via computed tomography (CT) reconstruction. In this case the raw data are the X-ray projections obtained from the scanner and the volume data generation process is the CT algorithm. Our framework informs the CT reconstruction process of the specific filter intended for interpolation in the subsequent visualization process, and this in turn ensures an accurate interpolation there at a set tolerance. Here, we focus on fast trilinear interpolation in conjunction with an octree-type mixed resolution volume representation without T-junctions. Efficient rendering is achieved by a space-efficient and locality-optimized representation, which can straightforwardly exploit fast fixed-function pipelines on GPUs.

  5. [Determining and verifying reference intervals in clinical laboratories].

    PubMed

    Henny, Joseph

    2011-01-01

Based on the original recommendation of the Expert Panel on the Theory of Reference Values of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC-LM), updated guidelines were recently published under the auspices of the IFCC-LM and the Clinical and Laboratory Standards Institute (CLSI). This article summarises these new proposals: (1) defining more precisely the often-confusing terminology, notably the terms reference limits and decision limits; (2) showing the different steps for determining reference limits according to the original procedure and the conditions which should be respected; and (3) proposing a simple methodology that allows clinical laboratories to satisfy the needs of regulations and standards. The updated document proposes to verify whether published reference limits are applicable to the laboratory involved. Finally, the strengths and limits of the revised recommendations (notably the selection of the reference population, the maintenance of analytical quality, the choice of statistical methodology, etc.) are briefly discussed.

  6. A credit card verifier structure using diffraction and spectroscopy concepts

    NASA Astrophysics Data System (ADS)

    Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

    2008-04-01

We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish between real and counterfeit cards. As the embossed hologram is a diffractive optical element, we shine, one at a time, a number of broadband light sources, each at a different incident angle, on the embossed hologram of the credit card, such that a different color spectrum is diffracted and separated in space for each incident angle. In this way, the number of pixels in each color plane is investigated. We then apply a feed-forward back-propagation neural network configuration to separate the counterfeit credit cards from the real ones. Our experimental demonstration, using two off-the-shelf broadband white light-emitting diodes, a digital camera, a 3-layer neural network, and a notebook computer, can distinguish all 69 counterfeit credit cards from the eight real credit cards.

  7. Measurements verifying the optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa M.

This thesis concentrates on laboratory measurements of the Electron Drift Instrument (EDI), focusing primarily on the optics of the system. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This drift velocity is determined using two electron beams directed perpendicular to the magnetic field that return to be detected by the spacecraft. The EDI will be used on the Magnetospheric Multi-Scale Mission. The EDI optics testing process takes measurements of the optics' response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for optimization of the desired optics state via simulation. The optics state tables were created in simulations, and we use these measurements to confirm their accuracy. The setup consisted of an apparatus, made up of the EDI's optics and sensor electronics, secured to a two-axis gear arm inside a vacuum chamber. An electron beam was projected at the apparatus, which used the EDI optics to focus the beam onto the microchannel plates and the circular 32-pad annular ring that makes up the sensor. The counts per pad over intervals of 1 ms were averaged over 25 samples and plotted in MATLAB. The plotted measurements agreed well with the simulations, providing confidence in the EDI instrument.

  8. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    2017-01-01

The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level in nested simulations our reality exists in. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, among others, four key assumptions are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that whether local reality is a grand simulation is feasible to detect with adequate computing capability. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and their implications are discussed.

  9. Measurements Verifying the Optics of the Electron Drift Instrument

    NASA Astrophysics Data System (ADS)

    Kooi, Vanessa; Kletzing, Craig; Bounds, Scott; Sigsbee, Kristine M.

    2015-04-01

Magnetic reconnection is the process of breaking and reconnecting of opposing magnetic field lines, and is often associated with tremendous energy transfer. The energy transferred by reconnection directly affects people through its influence on geospace weather and technological systems - such as telecommunication networks, GPS, and power grids. However, the mechanisms that cause magnetic reconnection are not well understood. The Magnetospheric Multi-Scale Mission (MMS) will use four spacecraft in a pyramid formation to make three-dimensional measurements of the structures in magnetic reconnection occurring in the Earth's magnetosphere. The spacecraft will repeatedly sample these regions for a prolonged period of time to gather data in more detail than has been previously possible. MMS is scheduled to be launched in March of 2015. The Electron Drift Instrument (EDI) will be used on MMS to measure the electric fields associated with magnetic reconnection. The EDI is a device used on spacecraft to measure electric fields by emitting an electron beam and measuring the E x B drift of the returning electrons after one gyration. This paper concentrates on measurements of the EDI's optics system. The testing process includes measuring the optics response to a uni-directional electron beam. These measurements are used to verify the response of the EDI's optics and to allow for the optimization of the desired optics state. The measurements agree well with simulations and we are confident in the performance of the EDI instrument.

  10. Gravity of the New Madrid seismic zone; a preliminary study

    USGS Publications Warehouse

    Langenheim, V.E.

    1995-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Mo. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  11. Landslide seismic magnitude

    NASA Astrophysics Data System (ADS)

    Lin, C. H.; Jan, J. C.; Pu, H. C.; Tu, Y.; Chen, C. C.; Wu, Y. M.

    2015-11-01

Landslides have become one of the most deadly natural disasters on earth, due not only to a significant increase in climate extremes caused by global warming, but also to rapid economic development in areas of topographic relief. How to detect landslides using a real-time system has become an important question for reducing their possible impacts on human society. However, traditional detection of landslides, either through direct surveys in the field or remote sensing images obtained via aircraft or satellites, is highly time consuming. Here we analyze very long period seismic signals (20-50 s) generated by large landslides, such as those triggered by Typhoon Morakot, which passed through Taiwan in August 2009. In addition to successfully locating 109 large landslides, we define a landslide seismic magnitude based on an empirical formula: Lm = log(A) + 0.55 log(Δ) + 2.44, where A is the maximum displacement (μm) recorded at one seismic station and Δ is its distance (km) from the landslide. We conclude that both the location and seismic magnitude of large landslides can be rapidly estimated from broadband seismic networks for both academic and applied purposes, similar to earthquake monitoring. We suggest a real-time algorithm be set up for routine monitoring of landslides in places where they pose a frequent threat.
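The empirical magnitude relation in this abstract can be applied directly to a single station reading; a minimal sketch in Python, where the displacement and distance values are illustrative assumptions, not data from the study:

```python
import math

def landslide_magnitude(A_um, delta_km):
    """Empirical landslide seismic magnitude from the abstract:
    Lm = log10(A) + 0.55*log10(delta) + 2.44, where A is the maximum
    displacement (micrometers) recorded at one station and delta is
    the station's distance (km) from the landslide."""
    return math.log10(A_um) + 0.55 * math.log10(delta_km) + 2.44

# Illustrative reading: 10 um maximum displacement at 100 km distance.
Lm = landslide_magnitude(10.0, 100.0)  # -> 4.54
```

In practice one would average the estimate over all stations that recorded the event, as is done for earthquake magnitudes.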

  12. A procedure for seismic risk reduction in Campania Region

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.

    2008-07-01

The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of the seismic analyses. At the same time, the Region has issued a public competition to select seismic engineering experts to carry out the seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies in order to approve the safety checks elaborated. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At the moment several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy to mitigate seismic risk set by the Campania Region was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from data and vulnerability studies previously carried out, was selected for immediate retrofitting designs. Secondly, another set of buildings was identified for structural strengthening. These were selected using the criteria specified in the guidelines prepared by the scientific committee and based on

  13. Third Quarter Hanford Seismic report for Fiscal year 2003

    SciTech Connect

    Hartshorn, Donald C.; Reidel, Steve P.; Rohay, Alan C.

    2003-09-11

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 356 triggers during the third quarter of fiscal year 2003. Of these triggers, 141 were earthquakes. Thirty-four of the 141 earthquakes were located in the Hanford Seismic Network area. Stratigraphically, 15 occurred in the Columbia River basalt, 13 in the pre-basalt sediments, and 6 in the crystalline basement. Geographically, 22 earthquakes occurred in swarm areas, 1 earthquake was associated with a major geologic structure, and 11 were classified as random events. During the third quarter, an earthquake swarm consisting of 15 earthquakes occurred on the south limb of Rattlesnake Mountain. The earthquakes are centered over the northwest extension of the Horse Heaven Hills anticline and probably occur at the base of the Columbia River Basalt Group.

  14. Seismic hazard assessment in Greece: Revisited

    NASA Astrophysics Data System (ADS)

    Makropoulos, Kostas; Chousianitis, Kostas; Kaviris, George; Kassaras, Ioannis

    2013-04-01

Greece is the most earthquake-prone country in the eastern Mediterranean territory and one of the most active areas globally. Seismic Hazard Assessment (SHA) is a useful procedure to estimate the expected earthquake magnitude and the strong ground-motion parameters necessary for earthquake-resistant design. Several studies on the SHA of Greece are available, constituting the basis of the National Seismic Code. However, the recently available more complete, accurate and homogeneous seismological data (the new earthquake catalogue of Makropoulos et al., 2012), the revised seismic zones determined within the framework of the SHARE project (2012), new empirical attenuation formulas extracted for several regions in Greece, as well as new SHA algorithms, are innovations that motivated the present study. Here, the expected earthquake magnitude for Greece is evaluated by applying the zone-free, upper-bounded Gumbel's third asymptotic distribution of extreme values. The peak ground acceleration (PGA), velocity (PGV) and displacement (PGD) are calculated at the seismic bedrock using two methods: (a) Gumbel's first asymptotic distribution of extreme values, since it is valid for initial open-end distributions, and (b) the Cornell-McGuire approach, using the CRISIS2007 (Ordaz et al., 2007) software. The latter takes into account seismic source zones for which seismicity parameters are assigned following a Poisson recurrence model. Thus, each source is characterized by a series of seismic parameters, such as the magnitude recurrence and the recurrence rate for a threshold magnitude, while different predictive equations can be assigned to different seismic source zones. Recently available attenuation parameters were considered. Moreover, new attenuation parameters for the very seismically active Corinth Gulf, deduced during this study from recordings of the RASMON accelerometric array, were used. The hazard parameters such as the most probable annual maximum
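Gumbel's first asymptotic distribution of extreme values, mentioned in this abstract, yields design ground-motion values in closed form; a minimal sketch, where the distribution parameters `u` and `alpha` are illustrative assumptions and not values derived in the study:

```python
import math

def gumbel1_design_value(u, alpha, return_period_years):
    """Gumbel's first asymptotic distribution of annual maxima:
    F(x) = exp(-exp(-alpha*(x - u))). The design value for return
    period T is the x_T satisfying F(x_T) = 1 - 1/T, i.e.
    x_T = u - ln(-ln(1 - 1/T)) / alpha."""
    p = 1.0 - 1.0 / return_period_years
    return u - math.log(-math.log(p)) / alpha

# Illustrative parameters (not from the study): mode u = 0.08 g,
# dispersion alpha = 25 per g; 475-year design PGA.
pga_475 = gumbel1_design_value(0.08, 25.0, 475.0)
```

The 475-year return period used in the example corresponds to the common code criterion of 10% probability of exceedance in 50 years.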

  15. Realities of verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants

    SciTech Connect

    Swindle, D.W.

    1990-03-01

Over a two and one-half year period beginning in 1981, representatives of six countries (United States, United Kingdom, Federal Republic of Germany, Australia, The Netherlands, and Japan) and the inspectorate organizations of the International Atomic Energy Agency and EURATOM developed and agreed to a technically sound approach for verifying the absence of highly enriched uranium (HEU) in gas centrifuge enrichment plants. This effort, known as the Hexapartite Safeguards Project (HSP), led to the first international consensus on techniques and requirements for effective verification of the absence of weapons-grade nuclear materials production. Since that agreement, research and development has continued on the radiation detection technology-based technique that technically confirms the HSP goal is achievable. However, the realities of achieving the HSP goal of effective technical verification have not yet been fully attained. Issues such as design and operating conditions unique to each gas centrifuge plant, concern about the potential for sensitive technology disclosures, and on-site support requirements have hindered full implementation and operator support of the HSP agreement. In future arms control treaties that may limit or monitor fissile material production, the negotiators must recognize and account for the realities and practicalities in verifying the absence of HEU production. This paper will describe the experiences and realities of trying to achieve the goal of developing and implementing an effective approach for verifying the absence of HEU production. 3 figs.

  16. Adjustment of minimum seismic shear coefficient considering site effects for long-period structures

    NASA Astrophysics Data System (ADS)

    Guan, Minsheng; Du, Hongbiao; Cui, Jie; Zeng, Qingli; Jiang, Haibo

    2016-06-01

Minimum seismic base shear is a key factor employed in the seismic design of long-period structures, and it is specified in some of the major national seismic building codes, viz. ASCE7-10, NZS1170.5 and GB50011-2010. In the current Chinese seismic design code GB50011-2010, however, the effects of soil type on the minimum seismic shear coefficient are not considered, which makes it difficult for long-period structures sited on stiff soil or rock to meet the minimum base shear requirement. This paper aims to modify the current minimum seismic shear coefficient by taking site effects into account. For this purpose, effective peak acceleration (EPA) is used as a representation of the ordinate value of the design response spectrum at the plateau. A large number of earthquake records, for which EPAs are calculated, are examined through statistical analysis considering soil conditions as well as the seismic fortification intensities. The study indicates that soil type has a significant effect on the spectral ordinates at the plateau as well as on the minimum seismic shear coefficient. Modification factors for the current minimum seismic shear coefficient are preliminarily suggested for each site class. It is shown that the modified seismic shear coefficients are more effective for determining the minimum seismic base shear of long-period structures.

  17. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic-hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., a fixed site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
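The Poisson occurrence model and truncated exponential Gutenberg-Richter distribution named in this abstract combine into a simple exceedance calculation; a minimal sketch, where the `a`, `b` values and magnitude bounds are illustrative assumptions rather than parameters from the study:

```python
import math

def gr_rate(m, a, b, m_min, m_max):
    """Annual rate of events with magnitude >= m under a truncated
    exponential Gutenberg-Richter (1944) recurrence, log10 N(m) = a - b*m,
    restricted to the magnitude window [m_min, m_max]."""
    if m >= m_max:
        return 0.0
    n = lambda x: 10.0 ** (a - b * x)
    return n(max(m, m_min)) - n(m_max)

def poisson_exceedance_prob(rate, years):
    """Probability of at least one exceedance in the given exposure
    time, assuming Poisson temporal occurrence: P = 1 - exp(-rate*t)."""
    return 1.0 - math.exp(-rate * years)

# Illustrative source: a = 4.0, b = 1.0, magnitudes 4.0-7.5.
lam = gr_rate(6.0, a=4.0, b=1.0, m_min=4.0, m_max=7.5)  # annual rate of M >= 6
p50 = poisson_exceedance_prob(lam, 50.0)                # 50-year probability
```

A full PSHA, as implemented in CRISIS2007, additionally integrates such rates over source geometries and ground-motion prediction equations to obtain the hazard curve.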

  18. The Italian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto

    2016-04-01

    The Italian National Seismic Network is composed by about 400 stations, mainly broadband, installed in the Country and in the surrounding regions. About 110 stations feature also collocated strong motion instruments. The Centro Nazionale Terremoti, (National Earthquake Center), CNT, has installed and operates most of these stations, although a considerable number of stations contributing to the INGV surveillance has been installed and is maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or even other Italian or European Institutions. The important technological upgrades carried out in the last years has allowed for significant improvements of the seismic monitoring of Italy and of the Euro-Mediterranean Countries. The adopted data transmission systems include satellite, wireless connections and wired lines. The Seedlink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing, continuous, quality checked data. The data acquisition system was designed to accomplish, in near-real-time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results are closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real-time, hypocenter parameters which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti. spanning from data acquisition to archiving, distribution and specialised products.

  19. Seismic surveys test on Innerhytta Pingo, Adventdalen, Svalbard Islands

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo; Rossi, Giuliana; Petronio, Lorenzo; Accaino, Flavio; Romeo, Roberto; Wheeler, Walter

    2015-04-01

We present the preliminary results of an experimental full-wave seismic survey test conducted on the Innerhytta Pingo, located in Adventdalen, Svalbard Islands, Norway. Several seismic surveys were adopted in order to study the pingo's inner structure, from classical reflection/refraction arrays to seismic tomography and surface-wave analysis. The aim of the project IMPERVIA, funded by the Italian PNRA, was the evaluation of the permafrost characteristics beneath this open-system pingo by means of seismic investigation, evaluating the best practice in terms of logistic deployment. The survey was done in April-May 2014: we collected 3 seismic lines with different spacing between receivers (from 2.5 m to 5 m), for a total length of more than 1 km. We collected data with different vertical geophones (with natural frequencies of 4.5 Hz and 14 Hz) as well as with a seismic snow-streamer. We tested different seismic sources (hammer, seismic gun, firecrackers and heavy weight drop), and we carefully verified geophone coupling in order to evaluate the different responses. In such peculiar conditions we noted that firecrackers allow the best signal-to-noise ratio for refraction/reflection surveys. To ensure the best geophone coupling with the frozen soil, we dug snow pits to remove the snow-cover effect. On the other hand, for the surface-wave methods, the very high velocity of the permafrost strongly limits the generation of long wavelengths, both with these explosive sources and with the common sledgehammer. The only source capable of generating low frequencies was a heavy drop-weight system, which allows analysis of surface-wave dispersion below 10 Hz. Preliminary data analysis evidences marked velocity inversions and strong velocity contrasts at depth. The combined use of surface and body waves highlights the presence of a heterogeneous soil deposit level beneath a thick layer of permafrost. This is the level that hosts the water circulation from depth controlling

  20. Topographic effects on seismic response of long-span rigid-frame bridge under SV seismic wave

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhao, Cheng-Gang; Qu, Tie-Jun

    2008-05-01

    Seismic ground motions of two neighboring mountains and the free surface between them are calculated under the SV seismic waves with three different incident angles. The results are then taken as the inputs of multi-point seismic excitations for the foundation of a long-span bridge built over the valley in the analysis considering the integrated influence of traveling wave and topography. On the basis of a dynamic analytical method, a finite element model is created for the seismic responses of a four-span rigid-frame bridge of 440 m. The pier-top displacement and the pier-bottom internal force of the bridge are calculated. Then the results are compared with those considering traveling-wave effect only. The conclusions can serve as a seismic design reference for the structures located on the complex mountain topography.

  1. Estimation of background noise level on seismic station using statistical analysis for improved analysis accuracy

    NASA Astrophysics Data System (ADS)

    Han, S. M.; Hahm, I.

    2015-12-01

We evaluated the background noise level of seismic stations in order to collect high-quality observation data and produce accurate seismic information. In this study, the background noise level was determined using the PSD (Power Spectral Density) method of McNamara and Buland (2004). This method, which uses long-term data, is influenced not only by the sensor's intrinsic electronic noise and the pulse transients produced while the sensor stabilizes, but also by missing data, and it can be dominated at particular frequencies by irregular signals unrelated to site characteristics. Filtering out such abnormal signals within an automated system is difficult and inefficient. To solve these problems, we devised a method that extracts, at each period, the data falling within the 90 to 99% confidence intervals of a normal distribution. The availability of the method was verified using 62 seismic stations with broadband and short-period sensors operated by the KMA (Korea Meteorological Administration). The evaluation standards were the NHNM (New High Noise Model) and NLNM (New Low Noise Model) published by the USGS (United States Geological Survey). These models, however, were designed for the western United States, whereas the Korean Peninsula is surrounded by the ocean on three sides and has a complicated geological structure and a high population density. We therefore re-designed an appropriate model for the Korean Peninsula from the statistically combined results. Its important feature is that the secondary-microseism peak appears at a higher frequency band. Acknowledgements: This research was carried out as a part of "Research for the Meteorological and Earthquake Observation Technology and Its Application" supported by the 2015 National Institute of Meteorological Research (NIMR) in the Korea Meteorological Administration.
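The per-period confidence-interval screening described in this abstract might be sketched as follows; the percentile bounds, the NaN-masking convention, and the synthetic PSD matrix are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def screen_psd(psd_matrix, lo_pct=5.0, hi_pct=95.0):
    """Keep, independently at each period, only the PSD estimates lying
    inside the [lo_pct, hi_pct] percentile band; estimates outside the
    band (e.g. stabilization pulses, data gaps) become NaN.
    psd_matrix: array of shape (n_segments, n_periods), dB values."""
    lo = np.percentile(psd_matrix, lo_pct, axis=0)
    hi = np.percentile(psd_matrix, hi_pct, axis=0)
    mask = (psd_matrix >= lo) & (psd_matrix <= hi)
    return np.where(mask, psd_matrix, np.nan)

def background_level(psd_matrix):
    """Median of the screened estimates at each period, as a simple
    per-period background-noise level."""
    return np.nanmedian(screen_psd(psd_matrix), axis=0)

# Illustrative matrix: 20 quiet segments at -140 dB plus one outlier
# segment at -60 dB, over 3 periods.
psd = np.vstack([np.full((20, 3), -140.0), np.full((1, 3), -60.0)])
level = background_level(psd)  # outlier segment is screened out
```

The resulting per-period levels can then be compared against the NHNM/NLNM reference curves.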

  2. Stress-Release Seismic Source for Seismic Velocity Measurement in Mines

    NASA Astrophysics Data System (ADS)

    Swanson, P. L.; Clark, C.; Richardson, J.; Martin, L.; Zahl, E.; Etter, A.

    2014-12-01

    Accurate seismic event locations are needed to delineate roles of mine geometry, stress and geologic structures in developing rockburst conditions. Accurate absolute locations are challenging in mine environments with rapid changes in seismic velocity due to sharp contrasts between individual layers and large time-dependent velocity gradients attending excavations. Periodic use of controlled seismic sources can help constrain the velocity in this continually evolving propagation medium comprising the miners' workplace. With a view to constructing realistic velocity models in environments in which use of explosives is problematic, a seismic source was developed subject to the following design constraints: (i) suitable for use in highly disturbed zones surrounding mine openings, (ii) able to produce usable signals over km-scale distances in the frequency range of typical coal mine seismic events (~10-100 Hz), (iii) repeatable, (iv) portable, (v) non-disruptive to mining operations, and (vi) safe for use in potentially explosive gaseous environments. Designs of the compressed load column seismic source (CLCSS), which generates a stress, or load, drop normal to the surface of mine openings, and the fiber-optic based source-initiation timer are presented. Tests were conducted in a coal mine at a depth of 500 m (1700 ft) and signals were recorded on the surface with a 72-ch (14 Hz) exploration seismograph for load drops of 150-470 kN (16-48 tons). Signal-to-noise ratios of unfiltered signals ranged from ~200 immediately above the source (500 m (1700 ft)) to ~8 at the farthest extent of the array (slant distance of ~800 m (2600 ft)), suggesting the potential for use over longer range. Results are compared with signals produced by weight drop and sledge hammer sources, indicating the superior waveform quality for first-arrival measurements with the CLCSS seismic source.
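    The signal-to-noise figures quoted above can be reproduced in principle as the RMS amplitude in a window around the first arrival divided by the RMS of pre-trigger noise. The synthetic trace and window choices below are illustrative assumptions, not the paper's processing:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 1000.0
    t = np.arange(0, 4, 1 / fs)
    noise = rng.normal(0, 1.0, t.size)
    # Toy arrival: a 30 Hz wavelet under a Gaussian envelope at t = 2 s.
    arrival = 200.0 * np.exp(-((t - 2.0) ** 2) / 0.01) * np.sin(2 * np.pi * 30 * t)
    trace = noise + arrival

    def snr(trace, fs, noise_win=(0.0, 1.5), signal_win=(1.9, 2.3)):
        """RMS signal-to-noise ratio between two time windows (assumed picks)."""
        def rms(a, b):
            seg = trace[int(a * fs):int(b * fs)]
            return np.sqrt(np.mean(seg ** 2))
        return rms(*signal_win) / rms(*noise_win)
    ```

    For the trace above the ratio is large; applied to the noise alone it hovers near 1, mirroring how SNR falls off with slant distance from the source.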

  3. Canadian Seismic Agreement

    SciTech Connect

    Wetmiller, R.J.; Lyons, J.A.; Shannon, W.E.; Munro, P.S.; Thomas, J.T.; Andrew, M.D.; Lapointe, S.P.; Lamontagne, M.; Wong, C.; Anglin, F.M.; Adams, J.; Cajka, M.G.; McNeil, W.; Drysdale, J.A. )

    1992-05-01

    This is a progress report of work carried out under the terms of a research agreement entitled the "Canadian Seismic Agreement" between the US Nuclear Regulatory Commission (USNRC), the Canadian Commercial Corporation, and the Geophysics Division of the Geological Survey of Canada (GD/GSC) during the period from July 01, 1989 to June 30, 1990. The "Canadian Seismic Agreement" generally supports the operation of various seismograph stations in eastern Canada and the collection and analysis of earthquake data for the purpose of mitigating seismic hazards in eastern Canada and the northeastern US. The specific activities carried out in this one-year period are summarized below under four headings: Eastern Canada Telemetered Network and local network developments, Datalab developments, strong-motion network developments, and earthquake activity. During this period, the first surface fault in eastern North America unequivocally determined to have accompanied a historic earthquake occurred in northern Quebec.

  4. Induced seismicity. Final report

    SciTech Connect

    Segall, P.

    1997-09-18

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Recently completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating in the generation of earthquakes. The former has involved modeling the thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements, along with seismic data, are used to test and constrain thermo-mechanical models.

  5. Unraveling Megathrust Seismicity

    NASA Astrophysics Data System (ADS)

    Funiciello, Francesca; Corbi, Fabio; van Dinther, Ylona; Heuret, Arnauld

    2013-12-01

    The majority of global seismicity originates at subduction zones, either within the converging plates or along the plate interface. In particular, events with Mw ≥ 8.0 usually occur at the subduction megathrust, which is the frictional interface between subducting and overriding plates. Consequently, seismicity at subduction megathrusts is responsible for most of the seismic energy globally released during the last century [Pacheco and Sykes, 1992]. What's more, during the last decade giant megathrust earthquakes occurred at an increased rate with respect to the last century [Ammon et al., 2010], often revealing unexpected characteristics and resulting in catastrophic effects. Determining the controlling factors of these events would have fundamental implications for earthquake and tsunami hazard assessment.

  6. 3-D Seismic Interpretation

    NASA Astrophysics Data System (ADS)

    Moore, Gregory F.

    2009-05-01

    This volume is a brief introduction aimed at those who wish to gain a basic and relatively quick understanding of the interpretation of three-dimensional (3-D) seismic reflection data. The book is well written, clearly illustrated, and easy to follow. Enough elementary mathematics is presented for a basic understanding of seismic methods, but more complex mathematical derivations are avoided. References are listed for readers interested in more advanced explanations. After a brief introduction, the book logically begins with a succinct chapter on modern 3-D seismic data acquisition and processing. Standard 3-D acquisition methods are presented, and an appendix expands on more recent acquisition techniques, such as multiple-azimuth and wide-azimuth acquisition. Although this chapter covers the basics of standard time processing quite well, there is only a single sentence about prestack depth imaging, and anisotropic processing is not mentioned at all, even though both techniques are now becoming standard.

  7. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrell, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2015-09-29

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  8. Controllable seismic source

    SciTech Connect

    Gomez, Antonio; DeRego, Paul Jeffrey; Ferrel, Patrick Andrew; Thom, Robert Anthony; Trujillo, Joshua J.; Herridge, Brian

    2014-08-19

    An apparatus for generating seismic waves includes a housing, a strike surface within the housing, and a hammer movably disposed within the housing. An actuator induces a striking motion in the hammer such that the hammer impacts the strike surface as part of the striking motion. The actuator is selectively adjustable to change characteristics of the striking motion and characteristics of seismic waves generated by the impact. The hammer may be modified to change the physical characteristics of the hammer, thereby changing characteristics of seismic waves generated by the hammer. The hammer may be disposed within a removable shock cavity, and the apparatus may include two hammers and two shock cavities positioned symmetrically about a center of the apparatus.

  9. Seismic ruggedness of relays

    SciTech Connect

    Merz, K.L. )

    1991-08-01

    This report complements EPRI report NP-5223 Revision 1, February 1991, and presents additional information and analyses concerning generic seismic ruggedness of power plant relays. Existing and new test data have been used to construct Generic Equipment Ruggedness Spectra (GERS) which can be used in identifying rugged relays during seismic re-evaluation of nuclear power plants. This document is an EPRI tier 1 report. The results of relay fragility tests for both old and new relays are included in an EPRI tier 2 report with the same title. In addition to the presentation of relay GERS, the tier 2 report addresses the applicability of GERS to relays of older vintage, discusses the important identifying nomenclature for each relay type, and examines relay adjustment effects on seismic ruggedness. 9 refs., 3 figs, 1 tab.

  10. Synthesis of artificial spectrum-compatible seismic accelerograms

    NASA Astrophysics Data System (ADS)

    Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

    2014-08-01

    The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis. Moreover, it requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, not many actual records of different seismic intensities are available for many regions. In addition, a large number of seismic accelerograms are required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of the structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which could be useful in earthquake engineering for the dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert-Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics such as different frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature.
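    The amplitude/frequency decomposition mentioned above can be sketched for its Hilbert stage: the analytic signal yields an instantaneous amplitude and frequency. (The full Hilbert-Huang transform first splits the record into intrinsic mode functions via empirical mode decomposition; that step is omitted here, so this toy mono-component signal is an assumption, not the paper's method.)

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 200.0
    t = np.arange(0, 10, 1 / fs)
    # Amplitude-modulated 5 Hz carrier standing in for one intrinsic mode.
    x = (1 + 0.5 * np.sin(2 * np.pi * 0.2 * t)) * np.sin(2 * np.pi * 5.0 * t)

    analytic = hilbert(x)
    amplitude = np.abs(analytic)                  # instantaneous amplitude (envelope)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs # instantaneous frequency (Hz)
    ```

    Spectrum matching would then iteratively rescale these components until the response spectrum of the synthesized record fits the target.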

  11. Development of a wireless seismic array for volcano monitoring

    NASA Astrophysics Data System (ADS)

    Moure, David; Toma, Daniel; Lázaro, Antoni Manuel; Del Río, Joaquín; Carreras, Normandino; José Blanco, María

    2014-05-01

    Volcano monitoring is mainly based on three disciplines: seismology, geodesy, and geochemistry. Seismic arrays are used to locate the seismic source through analysis of the signals recorded by each seismometer. The most important advantages of arrays over classical seismic networks are painless deployment, no need for major infrastructure, and the ability to provide an approximate location for signals that a seismic network cannot locate. In this paper the design of a low-power wireless array is presented. All sensors transmit their acquired data to a central node, which can calculate the likely location of the seismic source in real time. The reliability of those locations depends, among other parameters (number of sensors, geometrical distribution), on the precision of time synchronization between the nodes. To achieve the necessary precision, the wireless seismic array implements a time synchronization scheme based on the IEEE 1588 protocol, which keeps the node clocks synchronized to better than a microsecond and thus allows the signals from all the sensors to be correlated. The ultimate goal is for the central node to receive data from all the seismometers, locate the seismic source, and transmit only the result, which dramatically reduces data traffic. Active volcanic areas are often located far from inhabited areas, where data transmission options are limited; in situ calculation is therefore crucial to reduce the volume of data the seismic array must transmit.
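    The array principle above rests on measuring relative arrival times between sensors, typically from the peak of their cross-correlation, which is why sub-microsecond clock synchronization matters. A minimal sketch with a synthetic wavelet and an assumed 0.05 s true delay:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    fs = 500.0
    n = 4096
    wavelet = np.zeros(n)
    t = np.arange(200) / fs
    wavelet[1000:1200] = np.sin(2 * np.pi * 8 * t) * np.hanning(200)

    true_lag = 25                                  # 25 samples = 0.05 s
    s1 = wavelet + rng.normal(0, 0.05, n)          # sensor 1
    s2 = np.roll(wavelet, true_lag) + rng.normal(0, 0.05, n)  # sensor 2, later

    # Peak of the full cross-correlation gives the relative arrival time.
    xcorr = np.correlate(s2, s1, mode="full")
    est_lag = int(np.argmax(xcorr)) - (n - 1)      # positive: s2 arrives later
    delay_s = est_lag / fs
    ```

    A clock offset between nodes would bias `delay_s` directly, so synchronization error must stay far below the inter-sensor travel-time differences being measured.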

  12. First Quarter Hanford Seismic Report for Fiscal Year 2011

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Clayton, Ray E.; Devary, Joseph L.

    2011-03-31

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 16 local earthquakes during the first quarter of FY 2011. Six earthquakes were located at shallow depths (less than 4 km), seven at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and three at depths greater than 9 km, within the basement. Geographically, thirteen earthquakes were located in known swarm areas and three were classified as random events. The highest-magnitude event (1.8 Mc) was recorded on October 19, 2010 at a depth of 17.5 km, with its epicenter near the Yakima River between the Rattlesnake Mountain and Horse Heaven Hills swarm areas.

  13. A western gray whale mitigation and monitoring program for a 3-D seismic survey, Sakhalin Island, Russia.

    PubMed

    Johnson, S R; Richardson, W J; Yazvenko, S B; Blokhin, S A; Gailey, G; Jenkerson, M R; Meier, S K; Melton, H R; Newcomer, M W; Perlov, A S; Rutenko, S A; Würsig, B; Martin, C R; Egging, D E

    2007-11-01

    residual impacts. Aerial and vessel-based surveys determined the distribution of whales before, during and after the seismic survey. Daily aerial reconnaissance helped verify whale-free areas and select the sequence of seismic lines to be surveyed. A scout vessel with MMOs aboard was positioned 4 km shoreward of the active seismic vessel to provide better visual coverage of the 4-5 km buffer and to help define the inshore edge of the 4-5 km buffer. A second scout vessel remained near the seismic vessel. Shore-based observers determined whale numbers, distribution, and behavior during and after the seismic survey. Acoustic monitoring documented received sound levels near and in the main whale feeding area. Statistical analyses of aerial survey data indicated that about 5-10 gray whales moved away from waters near (inshore of) the seismic survey during seismic operations. They shifted into the core gray whale feeding area farther south, and the proportion of gray whales observed feeding did not change over the study period. Five shutdowns of the air guns were invoked for gray whales seen within or near the buffer. A previously unknown gray whale feeding area (the Offshore feeding area) was discovered south and offshore from the nearshore Piltun feeding area. The Offshore area has subsequently been shown to be used by feeding gray whales during several years when no anthropogenic activity occurred near the Piltun feeding area. Shore-based counts indicated that whales continued to feed inshore of the Odoptu block throughout the seismic survey, with no significant correlation between gray whale abundance and seismic activity. Average values of most behavioral parameters were similar to those without seismic surveys. Univariate analysis showed no correlation between seismic sound levels and any behavioral parameter. Multiple regression analyses indicated that, after allowance for environmental covariates, 5 of 11 behavioral parameters were statistically correlated with estimated

  14. Analytical Approaches to Verify Food Integrity: Needs and Challenges.

    PubMed

    Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M

    2016-09-01

    A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing, as well, food supply chain and future requirements to more effectively mitigate food fraud. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood of fraudsters to penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and possibly sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is evident already today, for instance, as in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food

  15. Scenarios for exercising technical approaches to verified nuclear reductions

    SciTech Connect

    Doyle, James

    2010-01-01

    Presidents Obama and Medvedev in April 2009 committed to a continuing process of step-by-step nuclear arms reductions beyond the new START treaty that was signed April 8, 2010 and to the eventual goal of a world free of nuclear weapons. In addition, the US Nuclear Posture Review released April 6, 2010 commits the US to initiate a comprehensive national research and development program to support continued progress toward a world free of nuclear weapons, including expanded work on verification technologies and the development of transparency measures. It is impossible to predict the specific directions that US-RU nuclear arms reductions will take over the next 5-10 years. Additional bilateral treaties could be reached requiring effective verification, as indicated by statements made by the Obama administration. There could also be transparency agreements or other initiatives (unilateral, bilateral, or multilateral) that require monitoring with a standard of verification lower than formal arms control, but that still need to give domestic, bilateral, and multilateral audiences confidence that declared actions are implemented. The US Nuclear Posture Review and other statements give some indication of the kinds of actions and declarations that may need to be confirmed in a bilateral or multilateral setting. Several new elements of the nuclear arsenals could be directly limited. For example, it is likely that both strategic and nonstrategic nuclear warheads (deployed and in storage), warhead components, and aggregate stocks of such items could be accountable under a future treaty or transparency agreement. In addition, new initiatives or agreements may require the verified dismantlement of a certain number of nuclear warheads over a specified time period. Eventually, procedures for confirming the elimination of nuclear warheads, components, and fissile materials from military stocks will need to be established.
This paper is intended to provide useful background information

  16. Interpolation of aliased seismic traces

    SciTech Connect

    Monk, D.J.; McBeath, R.G.; Wason, C.B.

    1993-08-10

    A method of interpolating seismic traces is described, comprising the steps of: (a) processing seismic data to produce input seismic traces; (b) transforming the input seismic traces from the x, y, and time domain into the x-slope, y-slope, and time domain by using a two-dimensional power diversity slant stack; and (c) transforming the product of step (b) back into the x, y, and time domain using an inverse slant stack.
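    The slant-stack (tau-p) idea behind step (b) can be sketched in one spatial dimension: each output trace u(p, tau) sums the input samples d(x, tau + p·x) along a line of slope p, so a plane wave stacks coherently at its own slowness. Nearest-sample shifting and the toy geometry are simplifying assumptions; the patented method adds a second spatial dimension and power-diversity weighting:

    ```python
    import numpy as np

    def slant_stack(data, x, dt, slopes):
        """data: (n_traces, n_t); x: trace offsets (m); slopes p in s/m."""
        n_traces, n_t = data.shape
        out = np.zeros((len(slopes), n_t))
        for ip, p in enumerate(slopes):
            for ix, xi in enumerate(x):
                shift = int(round(p * xi / dt))   # samples of moveout for this trace
                if shift >= 0:
                    out[ip, :n_t - shift] += data[ix, shift:]
                else:
                    out[ip, -shift:] += data[ix, :n_t + shift]
        return out

    # A plane wave with 2 samples of moveout per trace (0.008 s/m here)
    # should stack coherently only at the matching slope.
    dt, n_t = 0.004, 400
    x = np.arange(8) * 1.0                        # unit trace spacing
    data = np.zeros((8, n_t))
    for ix in range(8):
        data[ix, 100 + ix * 2] = 1.0

    tp = slant_stack(data, x, dt, slopes=[0.0, 0.008, 0.016])
    ```

    Interpolation then amounts to evaluating the inverse transform at new x positions, where even spatially aliased dips remain separated in the slope domain.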

  17. Robust discrimination of human footsteps using seismic signals

    NASA Astrophysics Data System (ADS)

    Faghfouri, Aram E.; Frish, Michael B.

    2011-06-01

    This paper provides a statistical analysis method for detecting and discriminating different seismic activity sources such as humans, animals, and vehicles using their seismic signals. A five-step process is employed for this purpose: (1) a set of signals from known seismic activities is used to verify the algorithms; (2) for each data file, the vibration signal is segmented by a sliding window and its noise is reduced; (3) a set of features capturing the statistical and spectral properties of each window is extracted and assembled into an array, called a feature array; (4) a portion of the labeled feature arrays is used to train a classifier to discriminate the different types of signals; and (5) the remaining labeled feature arrays are used to test the performance of the trained classifier. The results indicate that the classifier achieves a probability of detection (pd) above 95% and a false-alarm rate (pfa) below 1%.
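    The five-step pipeline can be sketched end to end on synthetic data: segment the trace, extract a small feature array per window, train on labeled windows, and test on the held-out remainder. The two features and the nearest-centroid classifier here are stand-ins, since the paper does not specify its feature set or classifier:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs = 1000.0

    def make_trace(rate_hz, amp, seconds=4.0):
        """Synthetic trace: periodic impulsive 'footfalls' on background noise."""
        n = int(seconds * fs)
        x = rng.normal(0.0, 0.1, n)
        for k in range(int(seconds * rate_hz)):
            i = int(k * fs / rate_hz)
            x[i:i + 50] += amp * np.exp(-np.arange(50) / 10.0)
        return x

    def features(win):
        """Feature array: [RMS amplitude, spectral centroid in Hz]."""
        spec = np.abs(np.fft.rfft(win))
        freqs = np.fft.rfftfreq(win.size, 1.0 / fs)
        return np.array([np.sqrt(np.mean(win ** 2)),
                         np.sum(freqs * spec) / np.sum(spec)])

    def windowed_features(x, size=500):
        return [features(x[i:i + size]) for i in range(0, x.size - size + 1, size)]

    human = windowed_features(make_trace(2.0, 1.0))     # light, ~2 impacts/s
    vehicle = windowed_features(make_trace(15.0, 3.0))  # heavier, denser vibration

    # Normalize with training-set statistics, then classify by nearest centroid.
    train = np.array(human[:4] + vehicle[:4])
    mu, sd = train.mean(axis=0), train.std(axis=0)
    centroids = {0: ((np.array(human[:4]) - mu) / sd).mean(axis=0),
                 1: ((np.array(vehicle[:4]) - mu) / sd).mean(axis=0)}

    def classify(f):
        z = (f - mu) / sd
        return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))

    preds = [classify(f) for f in human[4:]] + [classify(f) for f in vehicle[4:]]
    truth = [0] * len(human[4:]) + [1] * len(vehicle[4:])
    accuracy = np.mean(np.array(preds) == np.array(truth))
    ```

    Real footstep discrimination uses many more features and a trained statistical classifier, but the train/test split and feature-array structure follow the same pattern.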

  18. Induced Seismicity Monitoring System

    NASA Astrophysics Data System (ADS)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these involve monitoring underground gas migration through detailed tomographic studies of rock properties, the integrity of the cap rock, and microseismicity over time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly with stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity down to magnitude levels only slightly less than what can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions, and features 9 channels of recording (currently 3C 4.5 Hz geophone, MEMS accelerometer, and microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell phone modem or picked up via site visits. Low-power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery.
The system has been successfully tested for long-term (> 6 months) remote operations over a wide range
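    For the "event-detected" recording mode mentioned above, a standard trigger in low-cost monitors is the STA/LTA ratio (short-term over long-term average of signal energy). The abstract does not say which detector the system uses, so the window lengths and the threshold of 4 below are generic assumptions, not the system's actual settings:

    ```python
    import numpy as np

    def sta_lta(x, fs, sta_s=0.5, lta_s=10.0):
        """Causal STA/LTA on squared amplitudes, computed via cumulative sums."""
        c = np.concatenate([[0.0], np.cumsum(x ** 2)])
        sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
        idx = np.arange(lta_n, x.size)             # defined once the LTA window fills
        sta = (c[idx + 1] - c[idx + 1 - sta_n]) / sta_n
        lta = (c[idx + 1] - c[idx + 1 - lta_n]) / lta_n
        ratio = np.zeros(x.size)
        ratio[idx] = sta / lta
        return ratio

    rng = np.random.default_rng(4)
    fs = 100.0
    x = rng.normal(0.0, 1.0, 60 * int(fs))         # 60 s of background noise
    onset_true = 30 * int(fs)                      # event begins at t = 30 s
    x[onset_true:onset_true + 300] += 8.0 * np.sin(
        2 * np.pi * 10.0 * np.arange(300) / fs)    # 3 s, 10 Hz burst

    ratio = sta_lta(x, fs)
    trigger = ratio > 4.0                          # assumed trigger threshold
    onset_est = int(np.argmax(trigger))            # first triggered sample
    ```

    On an embedded processor the same recursion runs sample by sample, writing a pre-event buffer plus the triggered window to storage.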

  19. Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8

    SciTech Connect

    Ivanka, Paskaleva; Mihaela, Kouteva; Franco, Vaccari; Panza, Giuliano F.

    2008-07-08

    The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong motion database is quite limited, and therefore not at all representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure to two main Bulgarian cities has proved capable of supplying a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking into account, simultaneously and correctly, the contributions of the seismic source and of seismic wave propagation through the crossed media to the earthquake ground motion at the target sites. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

  20. Seismic Performance Evaluation of Concentrically Braced Frames

    NASA Astrophysics Data System (ADS)

    Hsiao, Po-Chien

    Concentrically braced frames (CBFs) are broadly used as lateral-load resisting systems in buildings throughout the US. In high seismic regions, special concentrically braced frames (SCBFs) are used where ductility under seismic loading is necessary. Their large elastic stiffness and strength efficiently sustain the seismic demands during smaller, more frequent earthquakes. During large, infrequent earthquakes, SCBFs exhibit highly nonlinear behavior due to brace buckling and yielding and the inelastic behavior induced by secondary deformation of the framing system. These response modes reduce the system demands relative to an elastic system without supplemental damping. In design, these reduced demands are estimated using a response modification coefficient, commonly termed the R factor. The R factor values are important to the seismic performance of a building. The procedure put forth in FEMA P695 was developed to evaluate R factors through a formalized process with the objective of a consistent level of collapse potential for all building types. The primary objective of the research was to evaluate the seismic performance of SCBFs. To achieve this goal, an improved model for SCBFs, including a proposed gusset plate connection model, was developed and validated against a large number of experiments; it permits accurate simulation of the inelastic deformations of the brace, gusset plate connections, beams, and columns, as well as brace fracture. Response history analyses were conducted using the validated model. A series of SCBF buildings of different story heights were designed and evaluated. The FEMA P695 method and an alternate procedure were applied to SCBFs and NCBFs, where NCBFs are designed without ductile detailing. The evaluation using the FEMA P695 method gives results contrary to the alternate evaluation procedure and to current knowledge, which hold that short-story SCBF structures are more vulnerable than their taller counterparts and that NCBFs are more vulnerable than SCBFs.
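    The role of the R factor can be made concrete with the U.S. equivalent-lateral-force formula, in which the elastic spectral demand is divided by R/Ie to obtain the design base shear (ASCE 7-style; period-dependent caps and minimums are omitted here, and the weight and spectral value below are assumed, not from the dissertation):

    ```python
    def design_base_shear(S_DS, R, Ie, W):
        """Seismic response coefficient Cs = S_DS / (R / Ie); base shear V = Cs * W.
        Simplified short-period form; real codes add period-dependent limits."""
        Cs = S_DS / (R / Ie)
        return Cs * W

    W = 20_000.0   # seismic weight (kN), assumed
    S_DS = 1.0     # short-period design spectral acceleration (g), assumed

    V_scbf = design_base_shear(S_DS, R=6.0, Ie=1.0, W=W)     # SCBF: R = 6 in ASCE 7
    V_elastic = design_base_shear(S_DS, R=1.0, Ie=1.0, W=W)  # no ductility credit
    ```

    The SCBF is thus designed for one sixth of the elastic demand, which is exactly the ductility credit that the FEMA P695 collapse assessment is meant to justify.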

  1. Generic seismic ruggedness of power plant equipment

    SciTech Connect

    Merz, K.L. )

    1991-08-01

    This report updates the results of a program whose overall objective is to demonstrate the generic seismic adequacy of as much nuclear power plant equipment as possible by collecting and evaluating existing seismic qualification test data. These data are used to construct "ruggedness" spectra below which equipment in operating plants designed to earlier earthquake criteria would be generically adequate. This document is an EPRI Tier 1 Report. The report gives the methodology for the collection and evaluation of the data used to construct a Generic Equipment Ruggedness Spectrum (GERS) for each equipment class considered; the program has resulted in fifteen finalized GERS. The GERS for each equipment class are included in an EPRI Tier 2 Report with the same title. Associated with each GERS are inclusion rules, cautions, and checklists for field screening of in-place equipment for GERS applicability. A GERS provides a measure of equipment seismic resistance based on available test data. As such, a GERS may also be used to judge the seismic adequacy of similar new or replacement equipment, or to estimate the seismic margin of equipment re-evaluated with respect to earthquake levels greater than those considered to date. GERS for relays (included in the original version of this report) are now covered in a separate report (NP-7147). In addition to the presentation of GERS, the Tier 2 report addresses the applicability of GERS to equipment of older vintage, methods for estimating amplification factors for evaluating devices installed in cabinets and enclosures, and how seismic test data from related studies relate to the GERS approach. 28 refs., 5 figs., 4 tabs.
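    The screening use of a GERS reduces to a spectrum comparison: equipment passes if its ruggedness spectrum envelops the seismic demand at every frequency of interest. The tabulated spectra below are made-up placeholders, not published GERS values:

    ```python
    import numpy as np

    freqs = np.array([1, 2, 4, 8, 16, 33], dtype=float)   # Hz
    gers = np.array([2.0, 4.5, 6.0, 6.0, 5.0, 3.0])       # capacity (g), assumed
    demand = np.array([1.0, 2.5, 4.0, 3.5, 2.0, 1.5])     # required spectrum (g), assumed

    def screen(gers, demand):
        """True if capacity envelops demand at all tabulated frequencies;
        also returns the minimum capacity-to-demand margin."""
        margin = gers / demand
        return bool(np.all(margin >= 1.0)), float(margin.min())

    passes, min_margin = screen(gers, demand)
    ```

    The minimum margin identifies the governing frequency, which is where the inclusion rules and cautions attached to each GERS matter most.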

  2. High performance seismic sensor requirements for military and security applications

    NASA Astrophysics Data System (ADS)

    Pakhomov, A.; Pisano, D.; Sicignano, A.; Goldburt, T.

    2005-05-01

    General Sensing Systems (GSS) has been developing seismic sensors for various security and military applications for the past several years. Research and development in this area does not serve a single purpose, as security and military applications vary widely. Many of the requirements for seismic sensors are well known. Herein we describe additional requirements, not usually at the center of attention, that are associated with high-performance seismic sensors. We find that the hard issues related to "remote" deployment/installation methods can be solved, provided the seismic sensor has sensitivity to arbitrarily oriented impacts/vibrations rather than the usual single-axis sensitivity. Our results show that such a sensor can be designed, in particular based on electret materials. We report that traditional frequency-response linearity is not always the appropriate goal; issues such as the useful signal frequency band and interference immunity should be taken into account directly. In addition, the mechanical oscillator of the seismic sensor should have a very broad dynamic range of about 120 dB, or an adjustable sensitivity for use in various tactical applications. We find that what is needed is not so much increased sensitivity as a reduced sensitivity threshold; a lower threshold, and hence a longer target detection range, can be achieved in low-noise environmental conditions. We also show that attempting to design and manufacture a universal seismic sensor for every possible application seems unreasonable; it makes more sense to design a set of seismic sensors that together satisfy the full range of applications and their multi-objective requirements.

  3. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes, and obtaining synthetic waveforms through numerical simulation receives increasing attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve: users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to use the numerical packages and to correctly access and utilize computational resources is a troublesome task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while the HPC resources and a dedicated pipeline form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources over the cloud, the platform lets users customize simulations at expert level and submit and run jobs through it.

  4. Functional seismic evaluation of hospitals

    NASA Astrophysics Data System (ADS)

    Guevara, L. T.

    2003-04-01

    Functional collapse of hospitals (FCH) occurs when a medical complex, or part of it, although with neither structural nor nonstructural damage, is unable to provide the services required for immediate attention to earthquake victims and for the recovery of the affected community. As is known, FCH during and after an earthquake is produced not only by damage to nonstructural components, but also by an inappropriate or deficient distribution of essential and supporting medical spaces. This paper presents some conclusions from the analysis of traditional architectural schemes for the design and construction of hospitals in the 20th century, and some recommendations for establishing evaluation parameters for the remodeling and seismic upgrade of existing hospitals in seismic zones, based on new concepts of: a) the relative location of each essential service (ES) within the medical complex; b) the capacity of each of these spaces to house temporary activities required for the attention of a massive emergency (ME); c) the relationship between ES and the supporting services (SS); d) the flexibility of transforming nonessential services into complementary spaces for the attention of an extraordinary number of victims; e) the dimensions and appropriateness of evacuation routes; and f) the appropriate supply and maintenance of emergency installations for water, electricity, and vital gases.

  5. The seismic analyzer: interpreting and illustrating 2D seismic data.

    PubMed

    Patel, Daniel; Giertsen, Christopher; Thurmond, John; Gjelberg, John; Gröller, M Eduard

    2008-01-01

    We present a toolbox for quickly interpreting and illustrating 2D slices of seismic volumetric reflection data. Searching for oil and gas involves creating a structural overview of seismic reflection data to identify hydrocarbon reservoirs. We improve the search for seismic structures by precalculating the horizon structures of the seismic data prior to interpretation. We improve the annotation of seismic structures by applying novel illustrative rendering algorithms tailored to seismic data, such as deformed texturing and line and texture transfer functions. The illustrative rendering results in multi-attribute and scale-invariant visualizations where features are represented clearly in both highly zoomed-in and zoomed-out views. Thumbnail views in combination with interactive appearance control allow for a quick overview of the data before detailed interpretation takes place. These techniques help reduce the work of seismic illustrators and interpreters.

  6. Spot: A Programming Language for Verified Flight Software

    NASA Technical Reports Server (NTRS)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  7. Assessment of the Metrological Performance of Seismic Tables for a QMS Recognition

    NASA Astrophysics Data System (ADS)

    Silva Ribeiro, A.; Campos Costa, A.; Candeias, P.; Sousa, J. Alves e.; Lages Martins, L.; Freitas Martins, A. C.; Ferreira, A. C.

    2016-11-01

    Seismic testing and analysis using large infrastructures, such as shaking tables and reaction walls, is performed worldwide and requires the use of complex instrumentation systems. To assure the accuracy of these systems, conformity assessment is needed to verify compliance with standards and applications, and Quality Management Systems (QMS) are increasingly applied to domains where risk analysis is critical as a way to provide formal recognition. This paper describes an approach to assessing the metrological performance of seismic shake tables as part of a QMS recognition, with the analysis of a case study of the LNEC seismic shake table.

  8. Mobile seismic exploration

    NASA Astrophysics Data System (ADS)

    Dräbenstedt, A.; Cao, X.; Polom, U.; Pätzold, F.; Zeller, T.; Hecker, P.; Seyfried, V.; Rembe, C.

    2016-06-01

    Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm, and high-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions below 1 nm/s/√Hz. Thermal displacements and air turbulence have critical influences on LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro-Electro-Mechanical Sensors (MEMS)). Approaching a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole, while a reference geophone detects remaining vibrations on the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as laser geophone.
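    The noise figures above combine as spectral densities: a flat density integrated over a measurement band gives an RMS noise of density × √bandwidth. A small sketch with the numbers quoted in the abstract (the flat-spectrum assumption is a simplification; real LDV noise rises toward low frequencies):

```python
import math

def band_noise_rms(density, f_low, f_high):
    """RMS noise from a flat spectral density integrated over a band:
    density * sqrt(bandwidth)."""
    return density * math.sqrt(f_high - f_low)

# "several 100 nm/sqrt(Hz)" displacement noise over the ~1-250 Hz seismic band
print(band_noise_rms(100e-9, 1.0, 250.0))  # ~1.6e-6 m RMS
```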

  9. Hanford Seismic Network

    SciTech Connect

    Reidel, S.P.; Hartshorn, D.C.

    1997-05-01

    This report describes the Hanford Seismic Network. The network consists of two instrument arrays: seismometers and strong motion accelerometers. The seismometers determine the location and magnitude of earthquakes, and the strong motion accelerometers determine ground motion. Together these instrument arrays comply with the intent of DOE Order 5480.20, Natural Phenomena Hazards Mitigation.

  10. Nonstructural seismic restraint guidelines

    SciTech Connect

    Butler, D.M.; Czapinski, R.H.; Firneno, M.J.; Feemster, H.C.; Fornaciari, N.R.; Hillaire, R.G.; Kinzel, R.L.; Kirk, D.; McMahon, T.T.

    1993-08-01

    The Nonstructural Seismic Restraint Guidelines provide general information about how to secure or restrain items (such as material, equipment, furniture, and tools) in order to prevent injury and property, environmental, or programmatic damage during or following an earthquake. All SNL sites may experience earthquakes of magnitude 6.0 or higher on the Richter scale. Therefore, these guidelines are written for all SNL sites.

  11. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that a certain class of neuro-fuzzy controllers can be proven bounded and stable, these results are implementation dependent and difficult to apply in the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundation as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results of its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root-locus plots have helped conventional control design and validation.

  12. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm

    PubMed Central

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation, which can provide navigation information by sensing polarization information, has advantages of high precision and anti-interference over polarization navigation sensors that use photodiodes. Although many types of imaging polarimeters exist, they may not qualify for research on the imaging polarization navigation algorithm. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for this type of system, covering camera parameter calibration and complementary-metal-oxide-semiconductor inconsistency calibration, were discussed, designed, and implemented. Calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the system could acquire and compute the polarized skylight images through the calibrations and resolve orientation with the algorithm under verification in real time. An orientation determination algorithm based on image processing was tested on the system, and its performance and properties were evaluated. The rate of the algorithm was over 1 Hz, the error was over 0.313°, and the population standard deviation was 0.148° without any data filtering. PMID:26805851
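    The "population standard deviation" quoted above is the N-divisor form of the statistic (as opposed to the N-1 sample form). A minimal sketch with invented orientation-error samples, not the paper's data:

```python
import statistics

# Population standard deviation divides by N, not N-1; Python's stdlib
# exposes both forms as pstdev and stdev. The samples are illustrative.
errors_deg = [0.10, -0.20, 0.15, 0.05, -0.10]
print(statistics.pstdev(errors_deg))
```

    For comparison, `statistics.stdev(errors_deg)` gives the slightly larger N-1 estimate.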

  13. Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask

    SciTech Connect

    Zimmer, A.; Koploy, M.

    1991-12-01

    In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help to support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

  14. Second Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-06-26

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, seven local earthquakes were recorded during the second quarter of fiscal year 2008. The largest event recorded by the network during the second quarter (February 3, 2008 - magnitude 2.3 Mc) was located northeast of Richland in Franklin County at a depth of 22.5 km. With regard to the depth distribution, two earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), three earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and two earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, five earthquakes occurred in swarm areas and two earthquakes were classified as random events.
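    The depth binning used throughout these quarterly reports (shallow events in the Columbia River basalts, intermediate in the pre-basalt sediments, deep in the crystalline basement) can be sketched as a simple classifier; the handling of events exactly at the 4 km and 9 km boundaries is an assumption here, not spelled out in the reports:

```python
def stratigraphic_unit(depth_km):
    """Assign an event to a geologic unit by the reports' depth bins."""
    if depth_km < 4.0:
        return "Columbia River basalts"      # shallow
    if depth_km <= 9.0:
        return "pre-basalt sediments"        # intermediate
    return "crystalline basement"            # deep

print(stratigraphic_unit(22.5))  # the quarter's largest event, at 22.5 km
```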

  15. First Quarter Hanford Seismic Report for Fiscal Year 2008

    SciTech Connect

    Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

    2008-03-21

    The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one micro earthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

  16. Third Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-09-01

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 818 triggers on two parallel detection and recording systems during the third quarter of fiscal year (FY) 2000. Thirteen seismic events were located by the Hanford Seismic Network within the reporting region of 46-47° N latitude and 119-120° W longitude; 7 were earthquakes in the Columbia River Basalt Group, 1 was an earthquake in the pre-basalt sediments, and 5 were earthquakes in the crystalline basement. Three earthquakes occurred in known swarm areas, and 10 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the third quarter of FY 2000.

  17. First quarter Hanford seismic report for fiscal year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-02-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers.

  18. Second Quarter Hanford Seismic Report for Fiscal Year 2000

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    2000-07-17

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47° N latitude and 119-120° W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the second quarter of FY 2000.

  19. High Voltage Seismic Generator

    NASA Astrophysics Data System (ADS)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary result of a year of cooperation between three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor into a spark gap. The whole appliance is powered from a li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specification of elements for the final device was chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. A field test was then conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water, and geophones were placed on the ground in a straight line. A comparison of the signals registered with a hammer source and with the sparker source was made. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high voltage generator. Besides the signal characteristics, the biggest advantage of the presented device is its size, 0.5 x 0.25 x 0.2 m, and its weight of approximately 7 kg; these features, together with the small li-ion battery, make the device highly portable.
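    The energy available per shot from such a capacitor-discharge source is E = CV²/2. A minimal sketch; the capacitance and charging voltage below are invented for illustration, since the abstract does not state the bank's rating:

```python
def capacitor_energy_j(capacitance_f, voltage_v):
    """Energy stored in a capacitor bank: E = 0.5 * C * V**2 (joules)."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Illustrative rating only, not the AGH UST device's actual specification.
print(capacitor_energy_j(100e-6, 5000.0))  # 100 uF charged to 5 kV -> 1250 J
```

    The quadratic dependence on voltage is why such sources charge to kilovolt levels rather than simply enlarging the bank.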

  20. Comparing USGS national seismic hazard maps with internet-based macroseismic intensity observations

    NASA Astrophysics Data System (ADS)

    Mak, Sum; Schorlemmer, Danijel

    2016-04-01

    Verifying a nationwide seismic hazard assessment using data collected after the assessment has been made (i.e., prospective data) is a direct consistency check of the assessment. We directly compared the rate of ground motion exceedance predicted by the four available versions of the USGS national seismic hazard map (NSHMP: 1996, 2002, 2008, 2014) with the rate actually observed during 2000-2013. The data were prospective to the two earlier versions of the NSHMP. We used two sets of somewhat independent data, namely 1) the USGS "Did You Feel It?" (DYFI) intensity reports and 2) instrumental ground motion records extracted from ShakeMap stations. Although both are observed data, they come with different degrees of accuracy. Our results indicated that for California, the predicted and observed hazards were very comparable. The two sets of data gave consistent results, implying robustness. The consistency also encourages the use of DYFI data for hazard verification in the Central and Eastern US (CEUS), where instrumental records are lacking. The results showed that the observed ground-motion exceedance was also consistent with the predicted rate in the CEUS. The primary value of this study is to demonstrate the usefulness of DYFI data, originally designed for community communication rather than scientific analysis, for the purpose of hazard verification.
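    The verification described above amounts to comparing an observed exceedance count with the count the map predicts. Under a stationary-rate assumption the predicted count is just rate × exposure; the rate, duration, and site count below are invented for illustration, not taken from the study:

```python
def expected_exceedances(annual_rate, years, n_sites):
    """Expected count of site-level ground-motion exceedances, assuming
    independent sites and a stationary annual exceedance rate (Poisson)."""
    return annual_rate * years * n_sites

# e.g. a 0.002/yr exceedance level, 14 years of observations, 500 sites
print(expected_exceedances(0.002, 14, 500))  # ~14 expected exceedances
```

    Comparing this expectation against the observed count (e.g. with a Poisson test) is the consistency check the abstract refers to.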

  1. Separation of seismic blended data by sparse inversion over dictionary learning

    NASA Astrophysics Data System (ADS)

    Zhou, Yanhui; Chen, Wenchao; Gao, Jinghuai

    2014-07-01

    Recent development of blended acquisition calls for new procedures to process blended seismic measurements. Presently, deblending and reconstructing unblended data followed by conventional processing is the most practical processing workflow. We study seismic deblending by advanced sparse inversion with a learned dictionary in this paper. To make our method more effective, hybrid acquisition and time-dithering sequential shooting are introduced so that clean single-shot records can be used to train the dictionary to favor the sparser representation of the data to be recovered. Deblending and dictionary learning with l1-norm based sparsity are combined to construct the corresponding problem with respect to the unknown recovery, dictionary, and coefficient sets. A two-step optimization approach is introduced. In the dictionary-learning step, the clean single-shot data are selected as training data to learn the dictionary. For deblending, we fix the dictionary and employ an alternating scheme to update the recovery and coefficients separately. Synthetic and real field data were used to verify the performance of our method. The results can serve as a significant reference in designing high-efficiency, low-cost blended acquisition.
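    The coefficient-update half of the two-step scheme is a standard l1-regularized sparse-coding problem. A generic ISTA (iterative soft-thresholding) sketch for that step, with a toy dictionary; this is a stand-in for the idea, not the authors' implementation:

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm, the core of sparse-coding updates."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(D, y, lam=0.1, n_iter=100):
    """ISTA: minimize 0.5*||y - D a||^2 + lam*||a||_1 for a fixed dictionary D."""
    a = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1/L, L = largest singular value^2
    for _ in range(n_iter):
        a = soft_threshold(a + step * D.T @ (y - D @ a), step * lam)
    return a

# Toy example: identity dictionary, so the solution is soft_threshold(y, lam)
D = np.eye(4)
y = np.array([1.0, 0.05, -2.0, 0.0])
print(sparse_code(D, y, lam=0.1))  # recovers [0.9, 0, -1.9, 0], small entry zeroed
```

    In the full scheme this coefficient update alternates with a dictionary update and a data-recovery update, as the abstract describes.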

  2. Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT

    NASA Astrophysics Data System (ADS)

    Krawczyk, C. M.; Kolditz, O.

    2013-12-01

    The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is to predict and quantify the distribution and the amount of sub-/seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows which will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data that are provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator for fracture density. Furthermore, from the complete 3D strain tensor we can predict the orientation of fractures at sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire new near-surface, high-resolution P- and S-wave 2-D seismic data in November this year in order to verify and calibrate our model results. Here, novel and parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between reservoir and overburden. Thereby, we will provide a tool for prediction and adapted time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities.

  3. Seismic margin review of the Maine Yankee Atomic Power Station: Summary report

    SciTech Connect

    Prassinos, P.G.; Murray, R.C.; Cummings, G.E.

    1987-03-01

    This Summary Report is the first of three volumes for the Seismic Margin Review of the Maine Yankee Atomic Power Station. Volume 2 is the Systems Analysis of the first trial seismic margin review. Volume 3 documents the results of the fragility screening for the review. The three volumes demonstrate how the seismic margin review guidance (NUREG/CR-4482) of the Nuclear Regulatory Commission (NRC) Seismic Design Margins Program can be applied. The overall objectives of the trial review are to assess the seismic margins of a particular pressurized water reactor, and to test the adequacy of this review approach, quantification techniques, and guidelines for performing the review. Results from the trial review will be used to revise the seismic margin methodology and guidelines so that the NRC and industry can readily apply them to assess the inherent quantitative seismic capacity of nuclear power plants.

  4. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

    The Chilean subduction zone is among the most seismically active plate boundaries in the world, and its coastal regions experience a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered a mature seismic gap, having last ruptured in 1835 and being seismically quiet, with no magnitude 4.5 or larger earthquakes reported in global catalogues. It is located north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom seismometers was deployed in the northern portion of the rupture area for a three-month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data from a network of 33 land stations of the GeoForschungsZentrum Potsdam were included, providing ideal coverage of both the rupture plane and the areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P-wave onsets and a 2.5-D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well-resolved, seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards from the deformation front. Prior to

  5. Seismic Tomography in Sensor Networks

    NASA Astrophysics Data System (ADS)

    Shi, L.; Song, W.; Lees, J. M.; Xing, G.

    2012-12-01

    Tomographic imaging, applied to seismology, requires a new, decentralized approach if high-resolution calculations are to be performed in a sensor-network configuration. Real-time retrieval of data from a large network of wireless seismic stations to a central server is virtually impossible because of the sheer data volume and the resource limitations of the nodes. In this paper, we propose and design a distributed algorithm that processes data and computes the tomographic inversion within the network, avoiding costly data collection and centralized computation. Based on a partition of the tomographic inversion problem, the algorithm distributes the computational burden to sensor nodes and performs tomographic inversion in the network, so that a high-resolution tomographic model can be recovered in real time within the constraints of network resources. Our emulation results indicate that the distributed algorithm successfully reconstructs the synthetic models while substantially reducing and balancing the communication and computation costs.
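
    The row-partitioned in-network inversion idea can be sketched in a few lines. This is a toy illustration only: a random matrix stands in for real ray-path geometry, and distributed Kaczmarz row projections stand in for the authors' (unspecified) partition scheme.

```python
import numpy as np

# Toy linear system d = G @ m_true standing in for travel-time
# tomography: rows of G play the role of ray paths, m is the
# slowness model. The random matrix is NOT real ray geometry.
rng = np.random.default_rng(0)
n_cells, n_rays, n_nodes = 8, 24, 4
G = rng.standard_normal((n_rays, n_cells))
m_true = rng.random(n_cells)
d = G @ m_true

# Partition the rays among sensor nodes: each node stores and
# processes only its own ray equations.
parts = np.array_split(np.arange(n_rays), n_nodes)

# Distributed Kaczmarz sweeps: the model estimate is passed from
# node to node; each node applies row projections for its rays,
# so no node ever needs the full data set.
m = np.zeros(n_cells)
for _ in range(200):
    for node_rows in parts:
        for i in node_rows:
            g = G[i]
            m += (d[i] - g @ m) / (g @ g) * g

residual = np.linalg.norm(G @ m - d) / np.linalg.norm(d)
```

    Because the system is consistent, the node-by-node projections converge to the same model a centralized least-squares solve would give, which is the essential point of the in-network approach.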

  6. Development of an Innovative Downhole Seismic Source

    NASA Astrophysics Data System (ADS)

    Reichhardt, D.

    2005-05-01

    MSE Technology Applications, Inc. (MSE) previously designed, built, and tested an innovative downhole seismic source. The design criteria included a size limitation (the source needed to fit into a 2-inch diameter well casing); the use of .22 caliber power loads as the energy source; the ability to fire at least 12 times before reloading; the ability to function under water (depth is limited by the internal pressure from the .22 caliber power loads, which must be greater than the pressure exerted by the water column); and the use of no more than 24-volt DC power. MSE developed the design criteria from the need for a downhole seismic source suitable for high-resolution seismic tomography applications. Tomographic methods may provide detailed information at waste sites for both characterization and monitoring. Since borehole diameters are kept to a minimum (i.e., 2 inches or less) to reduce waste volumes from drill cuttings, or the borehole may be installed using a direct-push technology such as a Geoprobe or cone penetrometer, a small-diameter source is desirable. Additionally, the use of .22 caliber power loads reduces the amount of supporting equipment required to operate the source compared to other downhole seismic sources (e.g., air guns and piezoelectric sources). MSE tested and evaluated the completed seismic source to assess the effectiveness of the .22 caliber power loads as energy sources and the operational ease of using the source. Results of the testing indicated that the power loads provided energy suitable for high-resolution cross-well seismic tomography applications. Operation of the source required significantly less supporting equipment than other downhole sources tested. However, the testing suggested the system could be improved if the number of mechanical components was reduced. Subsequent research suggested that the power loads could be fired using an electric current.
As a result, MSE believes that the entire

  7. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform R, which is based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS), and inverting data involved in a variety of geophysical applications. The packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These cover signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  8. Probabilistic seismic hazard estimation of Manipur, India

    NASA Astrophysics Data System (ADS)

    Pallav, Kumar; Raghukanth, S. T. G.; Darunkumar Singh, Konjengbam

    2012-10-01

    This paper deals with the estimation of spectral acceleration for Manipur based on probabilistic seismic hazard analysis (PSHA). The 500 km region surrounding Manipur is divided into seven tectonic zones, and major faults located in these zones are used to estimate seismic hazard. The earthquake recurrence relations for the seven zones have been estimated from past seismicity data. Ground motion prediction equations proposed by Boore and Atkinson (2008 Earthq. Spectra 24 99-138) for shallow active regions and by Atkinson and Boore (2003 Bull. Seismol. Soc. Am. 93 1703-29) for the Indo-Burma subduction zone are used for estimating ground motion. The uniform hazard response spectra for all nine constituent districts of Manipur (Senapati, Tamenglong, Churachandpur, Chandel, Imphal east, Imphal west, Ukhrul, Thoubal and Bishnupur) at 100-, 500- and 2500-year return periods have been computed from PSHA. A contour map of peak ground acceleration over Manipur is also presented for 100-, 500- and 2500-year return periods, with values ranging over 0.075-0.225, 0.18-0.63 and 0.3-1.15 g, respectively, across the state. These results may be of use to planners and engineers for site selection and for designing earthquake-resistant structures and, further, may help the state administration in seismic hazard mitigation.
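
    The core PSHA computation described here, combining earthquake recurrence rates for a source zone with a ground-motion prediction equation to obtain annual exceedance rates, can be sketched as follows. All numerical values, and the toy attenuation relation, are invented for illustration and are not those of the study or of the Boore-Atkinson equations.

```python
import math

# Illustrative single-zone PSHA sketch (invented parameters):
# Gutenberg-Richter recurrence combined with a lognormal
# ground-motion model, integrated over magnitude.
a_gr, b_gr = 4.0, 1.0        # G-R: log10 N(>=m) = a - b*m
m_min, m_max, dm = 5.0, 8.0, 0.1
sigma_ln = 0.6               # aleatory scatter of ln(PGA)

def median_pga(m, r_km=50.0):
    """Toy attenuation: ln PGA grows with magnitude, decays with distance."""
    return math.exp(-3.0 + 0.9 * m - 1.1 * math.log(r_km))

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def annual_exceedance(a_target):
    """Annual rate of PGA > a_target from this one source zone."""
    lam, m = 0.0, m_min
    while m < m_max:
        # rate of events with magnitude in [m, m + dm)
        rate = 10 ** (a_gr - b_gr * m) - 10 ** (a_gr - b_gr * (m + dm))
        z = (math.log(a_target) - math.log(median_pga(m + dm / 2))) / sigma_ln
        lam += rate * (1.0 - phi(z))
        m += dm
    return lam

# The hazard curve falls with the target motion; a return period
# (e.g. ~475 years) corresponds to lam ~= 1/475.
lam_low, lam_high = annual_exceedance(0.05), annual_exceedance(0.5)
```

    Summing such contributions over all seven zones, and reading off the motion at the 100-, 500- and 2500-year rates, is the structure of the uniform hazard computation the abstract reports.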

  9. Sound source localization technique using a seismic streamer and its extension for whale localization during seismic surveys.

    PubMed

    Abadi, Shima H; Wilcock, William S D; Tolstoy, Maya; Crone, Timothy J; Carbotte, Suzanne M

    2015-12-01

    Marine seismic surveys are under increasing scrutiny because of concern that they may disturb or otherwise harm marine mammals and impede their communications. Most of the energy from seismic surveys is low frequency, so concerns are particularly focused on baleen whales. Extensive mitigation efforts accompany seismic surveys, including visual and acoustic monitoring, but the possibility remains that not all animals in an area can be observed and located. One potential way to improve mitigation efforts is to utilize the seismic hydrophone streamer to detect and locate calling baleen whales. This study describes a method to localize low-frequency sound sources with data recorded by a streamer. Beamforming is used to estimate the angle of arriving energy relative to sub-arrays of the streamer, which constrains the horizontal propagation velocity to each sub-array for a given trial location. A grid search method is then used to minimize the time residual for relative arrival times along the streamer estimated by cross-correlation. Results from both simulation and experiment are shown, and data from the marine mammal observers and the passive acoustic monitoring conducted simultaneously with the seismic survey are used to verify the analysis.
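
    The second stage of the scheme, a grid search over trial source positions against relative arrival times of the kind cross-correlation would deliver, can be illustrated with a toy 2-D geometry. The streamer layout, sound speed, and source position below are invented, and noise-free delays stand in for measured cross-correlation lags.

```python
import numpy as np

# Toy 2-D localization sketch: hydrophones along a straight
# streamer record a low-frequency call; relative arrival times
# constrain the source position.
c = 1500.0                                             # sound speed, m/s
hydrophones = np.array([[x, 0.0] for x in range(0, 3000, 300)], float)
true_src = np.array([1200.0, 900.0])

def arrivals(src):
    """Travel times from a source position to every hydrophone."""
    return np.linalg.norm(hydrophones - src, axis=1) / c

# "Measured" relative delays, as cross-correlation would give
# them (referenced to the first hydrophone).
dt_obs = arrivals(true_src) - arrivals(true_src)[0]

# Grid search: minimize the RMS residual of relative arrival
# times over candidate source positions on one side of the line.
best, best_cost = None, np.inf
for x in np.arange(0.0, 3000.0, 50.0):
    for y in np.arange(100.0, 2000.0, 50.0):
        t = arrivals(np.array([x, y]))
        resid = (t - t[0]) - dt_obs
        cost = np.sqrt(np.mean(resid ** 2))
        if cost < best_cost:
            best, best_cost = (x, y), cost
```

    Note the restriction to y > 0: a straight streamer cannot distinguish mirror-image positions on either side of the line, which is one reason the real method works with sub-array angles as well as delays.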

  10. Results from the latest SN-4 multi-parametric benthic observatory experiment (MARsite EU project) in the Gulf of Izmit, Turkey: oceanographic, chemical and seismic monitoring

    NASA Astrophysics Data System (ADS)

    Embriaco, Davide; Marinaro, Giuditta; Frugoni, Francesco; Giovanetti, Gabriele; Monna, Stephen; Etiope, Giuseppe; Gasperini, Luca; Çağatay, Namık; Favali, Paolo

    2015-04-01

    An autonomous, long-term multiparametric benthic observatory (SN-4) was designed to study gas seepage and seismic energy release along the submerged segment of the North Anatolian Fault (NAF). Episodic gas seepage occurs at the seafloor in the Gulf of Izmit (Sea of Marmara, NW Turkey) along this submerged segment of the NAF, which ruptured during the 1999 Mw7.4 Izmit earthquake. The SN-4 observatory had already operated in the Gulf of Izmit, at the western end of the 1999 Izmit earthquake rupture, for about one year at 166 m water depth during the 2009-2010 experiment (EGU2014-13412-1, EGU General Assembly 2014). SN-4 was re-deployed at the same site for a new long-term mission (September 2013 - April 2014) in the framework of the MARsite EC project (New Directions in Seismic Hazard assessment through Focused Earth Observation in the Marmara Supersite, http://marsite.eu/ ), which aims at evaluating seismic risk and managing long-term monitoring activities in the Marmara Sea. A main scientific objective of the SN-4 experiment is to investigate possible correlations between seafloor methane seepage and the release of seismic energy. We used the same site as the 2009-2010 campaign to verify both the recurrence of previously observed phenomena and the reliability of the results obtained in the previous experiment (Embriaco et al., 2014, doi:10.1093/gji/ggt436). In particular, we are interested in the detection of gas release at the seafloor, in the role played by oceanographic phenomena in this detection, and in the association between gas and seismic energy release. The scientific payload included, among other instruments, a three-component broad-band seismometer and gas and oceanographic sensors. We present a technical description of the observatory, including the data acquisition and control system, results from the preliminary analysis of this new multidisciplinary data set, and a comparison with the previous experiment.

  11. Hanford quarterly seismic report -- 97A seismicity on and near the Hanford Site, Pasco Basin, Washington, October 1, 1996 through December 31, 1996

    SciTech Connect

    Hartshorn, D.C.; Reidel, S.P.

    1997-02-01

    Seismic Monitoring is part of PNNL's Applied Geology and Geochemistry Group. The Seismic Monitoring Analysis and Repair Team (SMART) operates, maintains, and analyzes data from the Hanford Seismic Network (HSN), extending the site's historical seismic database and fulfilling US Department of Energy, Richland Operations Office requirements and orders. The SMART also maintains the Eastern Washington Regional Network (EWRN). The University of Washington uses the data from the EWRN and other seismic networks in the Northwest to provide the SMART with the necessary regional input for the seismic hazards analysis at the Hanford Site. The SMART is tasked to provide an uninterrupted collection of high-quality raw seismic data from the HSN located on and around the Hanford Site. These unprocessed data are permanently archived. SMART also is tasked to locate and identify sources of seismic activity, monitor changes in the historical pattern of seismic activity at the Hanford Site, and build a local earthquake database (processed data) that is permanently archived. Local earthquakes are defined as earthquakes that occur within 46 to 47 degrees north latitude and 119 to 120 degrees west longitude. The data are used by the Hanford contractor for waste management activities, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site.

  12. Patterns of significant seismic quiescence on the Mexican Pacific coast

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.; Rudolf-Navarro, A. H.; Angulo-Brown, F.; Barrera-Ferrer, A. G.

    Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence, using the definition of seismic quiescence proposed by Schreider (1990). This algorithm delineates areas of quiescence where an earthquake of great magnitude is likely to occur. We have applied our algorithm to the earthquake catalog of the Mexican Pacific coast, between 14 and 21 degrees north latitude and 94 and 106 degrees west longitude, for events with depths less than or equal to 60 km and magnitude greater than or equal to 4.3 that occurred from January 1965 until December 2014. We found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, earthquakes of great magnitude have not occurred in Mexico in this century. However, we have identified well-defined seismic quiescences in the Guerrero seismic gap, which are apparently correlated with the occurrence of the silent earthquakes of 2002, 2006 and 2010 recently discovered by GPS technology.
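
    A minimal rate-drop detector conveys the flavor of quiescence identification, though what follows is a generic Z-statistic comparison of recent versus background rates, not Schreider's definition, and the synthetic catalogue and threshold are invented.

```python
import numpy as np

# Quiescence-detection sketch: compare the event rate in a recent
# window with the long-term background rate and flag significant
# drops with a Z statistic.
rng = np.random.default_rng(1)
years = 50
background_rate = 20.0          # events per year

# Synthetic catalogue: Poisson background, with a quiescent final
# 3 years at one quarter of the background rate.
counts = rng.poisson(background_rate, years)
counts[-3:] = rng.poisson(background_rate / 4.0, 3)

def z_value(counts, window):
    """Z statistic comparing the last `window` years to the rest."""
    recent, past = counts[-window:], counts[:-window]
    num = past.mean() - recent.mean()
    den = np.sqrt(recent.var(ddof=1) / len(recent) +
                  past.var(ddof=1) / len(past))
    return num / den

z = z_value(counts, window=3)
quiescent = z > 2.0             # large positive z => rate drop
```

    Sliding such a window over the catalogue, cell by cell over the study region, is the structure of a quiescence-mapping algorithm; the published method differs in its exact statistic and declustering.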

  13. First Quarter Hanford Seismic Report for Fiscal Year 1999

    SciTech Connect

    DC Hartshorn; SP Reidel; AC Rohay

    1999-05-26

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. The staff also locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the first quarter of FY99 for stations in the HSN was 99.8%. There were 121 triggers during the first quarter of fiscal year 1999. Fourteen triggers were local earthquakes: seven (50%) were in the Columbia River Basalt Group, none occurred in the pre-basalt sediments, and seven (50%) were in the crystalline basement. One earthquake (7%) occurred near or along the Horn Rapids anticline, seven earthquakes (50%) occurred in a known swarm area, and six earthquakes (43%) were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometer during the first quarter of FY99.

  14. Investigation of the Seismic Performance of Reinforced Highway Embankments

    NASA Astrophysics Data System (ADS)

    Toksoy, Y. S.; Edinçliler, A.

    2014-12-01

    Despite the fact that highway embankments are highly prone to earthquake-induced damage, there are not enough studies in the literature concentrating on improving the seismic performance of highway embankments. Embankments that are quite stable under static load conditions can simply collapse during earthquakes under destructive seismic loading. This poses a serious threat to the structural integrity of the embankment, service quality and serviceability. The objective of this study is to determine the effect of geosynthetic reinforcement on the seismic performance of highway embankments and to evaluate the seismic performance of a geotextile-reinforced embankment under different earthquake motions. A 1:50 scale highway embankment model was designed and reinforced with geosynthetics in order to increase the seismic performance of the embankment model. A series of shaking table tests was performed on identical unreinforced and reinforced embankment models using earthquake excitations with different characteristics. The experimental results were evaluated by comparing the unreinforced and reinforced cases. Results revealed that the reinforced embankment models show better seismic performance, especially under the specific ground excitations used in this study. The prototype embankment was also modelled numerically, and a similar trend in seismic behavior was obtained in the finite element simulations.

  15. Validation of seismic probabilistic risk assessments of nuclear power plants

    SciTech Connect

    Ellingwood, B.

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
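
    The fragility quantities discussed here follow standard seismic-PRA conventions: a lognormal fragility with median capacity and separate logarithmic standard deviations for randomness and uncertainty, from which the HCLPF capacity is derived. The parameter values in the sketch below are invented, not taken from the three plants studied.

```python
import math

# Lognormal fragility sketch (invented parameters): median
# ground-acceleration capacity Am with logarithmic standard
# deviations beta_R (randomness) and beta_U (uncertainty).
Am = 0.9                       # median capacity, g
beta_R, beta_U = 0.25, 0.35

# HCLPF capacity: 95% confidence of less than 5% probability of
# failure, computed with the standard 1.645 normal quantile.
hclpf = Am * math.exp(-1.645 * (beta_R + beta_U))

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_fail(a):
    """Mean (composite) conditional failure probability at PGA a."""
    beta_c = math.sqrt(beta_R ** 2 + beta_U ** 2)
    return phi(math.log(a / Am) / beta_c)

# At the HCLPF capacity the mean failure probability is small,
# which is the point of the high-confidence, low-probability metric.
p_at_hclpf = p_fail(hclpf)
```

    Convolving such fragility curves with the family of seismic hazard curves is what makes the accident sequence frequencies so sensitive to hazard-curve slope, as the abstract concludes.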

  16. Models of protein-ligand crystal structures: trust, but verify

    NASA Astrophysics Data System (ADS)

    Deller, Marc C.; Rupp, Bernhard

    2015-09-01

    X-ray crystallography provides the most accurate models of protein-ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein-ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein-ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein-ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein-ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein-ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein-ligand models for their computational and biological studies, and we provide an overview of how this can be achieved.

  17. Verifying a computational method for predicting extreme ground motion

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, B.T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

    2011-01-01

    In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

  18. Monitoring and verifying changes of organic carbon in soil

    USGS Publications Warehouse

    Post, W.M.; Izaurralde, R. C.; Mann, L. K.; Bliss, Norman B.

    2001-01-01

    Changes in soil and vegetation management can impact strongly on the rates of carbon (C) accumulation and loss in soil, even over short periods of time. Detecting the effects of such changes in accumulation and loss rates on the amount of C stored in soil presents many challenges. Consideration of the temporal and spatial heterogeneity of soil properties, general environmental conditions, and management history is essential when designing methods for monitoring and projecting changes in soil C stocks. Several approaches and tools will be required to develop reliable estimates of changes in soil C at scales ranging from the individual experimental plot to whole regional and national inventories. In this paper we present an overview of soil properties and processes that must be considered. We classify the methods for determining soil C changes as direct or indirect. Direct methods include field and laboratory measurements of total C, various physical and chemical fractions, and C isotopes. A promising direct method is eddy covariance measurement of CO2 fluxes. Indirect methods include simple and stratified accounting, use of environmental and topographic relationships, and modeling approaches. We present a conceptual plan for monitoring soil C changes at regional scales that can be readily implemented. Finally, we anticipate significant improvements in soil C monitoring with the advent of instruments capable of direct and precise measurements in the field as well as methods for interpreting and extrapolating spatial and temporal information.
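
    For the design point about temporal and spatial heterogeneity, a standard power calculation shows how plot-scale variability drives the sampling effort needed to detect a soil-carbon change. The numbers are illustrative, not from the paper.

```python
import math

# Sample-size sketch for detecting a soil-carbon change with a
# paired (resampling) design: how many paired cores are needed to
# detect a mean change `delta` given the variability of the
# paired differences? All values are invented.
delta = 2.0                    # change to detect, Mg C / ha
sigma = 6.0                    # std dev of paired differences, Mg C / ha
z_alpha, z_beta = 1.96, 0.84   # 5% two-sided test, 80% power

# Classic normal-approximation sample-size formula.
n = math.ceil(((z_alpha + z_beta) * sigma / delta) ** 2)
```

    With these values the design needs on the order of seventy paired samples, which illustrates why stratification and careful accounting of heterogeneity matter before monitoring begins.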

  19. Seismic detection of tornadoes

    USGS Publications Warehouse

    Tatom, F. B.

    1993-01-01

    Tornadoes represent the most violent of all forms of atmospheric storms, each year resulting in hundreds of millions of dollars in property damage and approximately one hundred fatalities. In recent years, considerable success has been achieved in detecting tornadic storms by means of Doppler radar. However, radar systems cannot determine when a tornado is actually in contact with the ground, except possibly at extremely close range. At the present time, human observation is the only truly reliable way of knowing that a tornado is actually on the ground. However, considerable evidence exists indicating that a tornado in contact with the ground produces a significant seismic signal. If such signals are generated, the seismic detection and warning of an imminent tornado becomes a distinct possibility.

  20. Canadian seismic agreement

    SciTech Connect

    Wetmiller, R.J.; Lyons, J.A.; Shannon, W.E.; Munro, P.S.; Thomas, J.T.; Andrew, M.D.; Lamontagne, M.; Wong, C.; Anglin, F.M.; Plouffe, M.; Adams, J.; Drysdale, J.A. . Geophysics Div.)

    1990-04-01

    During the period of this report, the contract resources were spent on operation and maintenance of the Eastern Canada Telemetred Network (ECTN), development of special purpose local network systems, servicing and maintenance of the strong-motion seismograph network in eastern Canada, operation of the Ottawa data lab and earthquake monitoring and reporting. Of special note in this period was the final completion of the Sudbury (SLTN) and Charlevoix (CLTN) local networks and the integration of their data processing and analysis requirements in the regular analysis stream for ECTN data. These networks now acquire high quality digital data for detailed analysis of seismic activity and source properties from these two areas, thus effectively doubling the amount of seismic data being received by the Ottawa data lab. 37 refs., 17 figs., 2 tabs.

  1. Albuquerque Basin seismic network

    USGS Publications Warehouse

    Jaksha, Lawrence H.; Locke, Jerry; Thompson, J.B.; Garcia, Alvin

    1977-01-01

    The U.S. Geological Survey has recently completed the installation of a seismic network around the Albuquerque Basin in New Mexico. The network consists of two seismometer arrays, a thirteen-station array monitoring an area of approximately 28,000 km² and an eight-element array monitoring the area immediately adjacent to the Albuquerque Seismological Laboratory. This report describes the instrumentation deployed in the network.

  2. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
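
    The bounding behavior described here, with the LR for a true contributor rising toward the inverse match probability as replicates accumulate, can be reproduced in a toy single-allele model. The dropout, drop-in, and match-probability values below are invented, and this is emphatically not the likeLTD model.

```python
# Toy single-allele replicate model: the suspect's allele is seen
# in every one of k replicates. d = per-replicate dropout rate,
# c = per-replicate drop-in rate, q = match probability. All
# values are invented for illustration.
d, c, q = 0.3, 0.05, 0.01

def lr(k):
    """Likelihood ratio after k replicates that all show the allele."""
    p_hp = (1.0 - d) ** k                          # suspect contributed
    p_hd = q * (1.0 - d) ** k + (1.0 - q) * c ** k  # unknown contributed
    return p_hp / p_hd

bound = 1.0 / q                  # inverse match probability
lrs = [lr(k) for k in (1, 2, 4, 8)]
```

    In this model the LR increases monotonically with replicates and approaches, but never exceeds, 1/q, mirroring the validation criterion the authors apply to likeLTD output.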

  3. Models of protein–ligand crystal structures: trust, but verify

    PubMed Central

    Deller, Marc C.

    2015-01-01

    X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575

  4. Seismic basement in Poland

    NASA Astrophysics Data System (ADS)

    Grad, Marek; Polkowski, Marcin

    2016-06-01

The area of contact between Precambrian and Phanerozoic Europe in Poland has a complicated sedimentary cover and basement structure. The sedimentary cover is thinnest in the Mazury-Belarus anteclize, only 0.3-1 km thick; it increases to 7-8 km along the East European Craton margin and to 9-12 km in the Trans-European Suture Zone (TESZ). The Variscan domain is characterized by a 1- to 2-km-thick sedimentary cover, while the Carpathians are characterized by very thick sediments, up to c. 20 km. The map of basement depth is created by combining data from geological boreholes with a set of regional seismic refraction profiles. These data do not constrain the basement depth in the central part of the TESZ or in the Carpathians, so the data set is supplemented by 32 models from deep seismic sounding profiles and by a map of a high-resistivity (low-conductivity) layer from magnetotelluric soundings, identified as the basement. Together, these data constrain the basement depth and the P-wave seismic velocities of the crystalline and consolidated basement for the whole area of Poland. Finally, the variation of basement depth and velocity is discussed with respect to geophysical fields and the tectonic division of the area.

  5. Quiet Clean Short-haul Experimental Engine (QCSEE) Under-The-Wing (UTW) composite nacelle subsystem test report. [to verify strength of selected composite materials

    NASA Technical Reports Server (NTRS)

    Stotler, C. L., Jr.; Johnston, E. A.; Freeman, D. S.

    1977-01-01

The element and subcomponent testing conducted to verify the under-the-wing composite nacelle design is reported. The composite nacelle consists of an inlet, outer cowl doors, inner cowl doors, and a variable fan nozzle. The element tests provided the mechanical properties used in the nacelle design, and the subcomponent tests verified that the critical panel and joint areas of the nacelle had adequate structural integrity.

  6. Seismic Adequacy Review of PC012 SCEs that are Potential Seismic Hazards with PC3 SCEs at Cold Vacuum Dryer (CVD) Facility

    SciTech Connect

    OCOMA, E.C.

    1999-08-12

This document provides a seismic adequacy review of the anchorage of PC012 systems, components, and equipment (SCEs) that are potential seismic interaction hazards with PC3 SCEs during a Design Basis Earthquake. The PC012 items are identified in the Safety Equipment List as 3/1 SCEs.

  7. MERCURY vs. TART Comparisons to Verify Thermal Scattering

    SciTech Connect

    Cullen, D E; McKinley, S; Hagmann, C

    2006-03-30

Recently the results from many Monte Carlo codes were compared for a series of theoretical pin-cells; the results are documented in ref. [3], and details are also provided here in Appendices A and B. The purpose of this earlier code comparison was primarily to determine how accurately our codes model both bound- and free-atom neutron thermal scattering. Prior to this study many people assumed that our Monte Carlo transport codes were all now so accurate that they would all produce more or less the same answers, say, K-eff to within 0.1%. The results demonstrated that in reality we see a rather large spread in the results for even simple scalar parameters, such as K-eff, where we found differences in excess of 2%, far exceeding many people's expectations. The differences between code results were traced to four major factors: (1) differences between the sets of nuclear data used; (2) the accuracy of the nuclear data processing codes; (3) the accuracy of the models used in our Monte Carlo transport codes; and (4) user-selected code input options. Naturally at Livermore we would like to ensure that we minimize the effects of these factors. In this report we compare the results from two of our Monte Carlo transport codes, MERCURY [2] and TART [2], with the following constraints designed to address the four points listed above: (1) Both codes used exactly the same nuclear data, namely the TART 2005 data. (2) Each code used its own nuclear data processing code. Even though these two data processing codes are independent, they have been extensively tested to ensure that the processed output results closely agree. (3) Both used the same nuclear physics models. This required that some physics be turned off in each code, namely, (a) unresolved resonance energy region self-shielding was turned off in TART, since this is not currently available in MERCURY; (b) delayed neutrons were treated as prompt in TART, since this is not currently available in MERCURY;
(c) Classical, rather than

  8. Monitoring hydraulic fracturing with seismic emission volume

    NASA Astrophysics Data System (ADS)

    Niu, F.; Tang, Y.; Chen, H.; TAO, K.; Levander, A.

    2014-12-01

Recent developments in horizontal drilling and hydraulic fracturing have made it possible to access reservoirs that were not previously available for massive production. Hydraulic fracturing is designed to enhance rock permeability and reservoir drainage through the creation of fracture networks. Microseismic monitoring has proven to be an effective and valuable technology for imaging hydraulic fracture geometry. Based on data acquisition, seismic monitoring techniques are divided into two categories: downhole and surface monitoring. Surface monitoring is challenging because of the extremely low signal-to-noise ratio of the raw data. We applied techniques used in earthquake seismology and developed an integrated monitoring system for mapping hydraulic fractures. The system consists of 20 to 30 state-of-the-art broadband seismographs, which are generally hundreds of times more sensitive than regular geophones. We have conducted two experiments in two basins in China with very different geology and formation mechanisms. In each case, we observed clear microseismic events, which may correspond to induced seismicity directly associated with fracturing and to triggered events on pre-existing faults. However, the magnitude of these events is generally larger than magnitude -1, approximately one to two magnitudes larger than those detected by downhole instruments. Spectrum-frequency analysis of the continuous surface recordings indicated high seismic energy associated with injection stages. This seismic energy can be back-projected to a volume that surrounds each injection stage. Imaging the seismic emission volume (SEV) appears to be an effective way to map the stimulated reservoir volume, as well as natural fractures.
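The back-projection step described above can be sketched as follows. This is a minimal 2-D illustration with a hypothetical uniform velocity, synthetic envelope pulses, and made-up station and source coordinates, not the authors' processing chain:

```python
import numpy as np

v = 3.0                      # assumed uniform velocity, km/s
dt = 0.01                    # sample interval, s
stations = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])  # km
true_src = np.array([1.0, 3.0])
t0 = 1.0                     # origin time, s

n = 1000
t = np.arange(n) * dt
traces = np.zeros((len(stations), n))
for i, s in enumerate(stations):
    arrival = t0 + np.linalg.norm(s - true_src) / v
    traces[i] = np.exp(-((t - arrival) / 0.05) ** 2)   # synthetic envelope pulse

# Back-project onto a 2-D grid (a volume in the 3-D case): at each candidate
# point, stack the station envelopes sampled at the predicted arrival times
xs = ys = np.arange(0.0, 4.01, 0.25)
stack = np.zeros((len(xs), len(ys)))
for jx, x in enumerate(xs):
    for jy, y in enumerate(ys):
        for i, s in enumerate(stations):
            shift = int(round((t0 + np.hypot(s[0] - x, s[1] - y) / v) / dt))
            if shift < n:
                stack[jx, jy] += traces[i, shift]

ix, iy = np.unravel_index(np.argmax(stack), stack.shape)
print(xs[ix], ys[iy])        # peak should fall near the true source (1.0, 3.0)
```

The peak of the stacked energy recovers the source location; repeating this over sliding time windows outlines the emission volume around each injection stage.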

  9. Electron optics simulation to verify the design and behavior of a plasma diagnostics tool via MATLAB

    NASA Astrophysics Data System (ADS)

    Tompkins, Vincent Rashod

The Association for Unmanned Vehicle Systems International (AUVSI) predicts that 80% of the U.S. unmanned aerial vehicle (UAV) market will be in agricultural and rural areas where cooperatives have a strong presence. Agricultural cooperatives could use UAVs in crop scouting to provide timely high-resolution imagery of crop conditions, and rural electric cooperatives (RECs) could use UAVs to perform routine line inspection. Our research investigated the level of interest in and awareness of UAVs among these rural cooperatives and analyzed the feasibility of UAV adoption. Surveys were sent to Oklahoma grain and farm supply cooperatives and RECs. The survey investigated knowledge of and interest in UAVs, and elicited information on crop scouting fees and costs, distribution line inspection costs, and preventable line loss. The results indicated a low level of knowledge but a high level of interest in UAV technology. Modeling suggests that UAV applications could be feasible for both RECs and agricultural cooperatives. Final regulations from the Federal Aviation Administration, particularly restrictions on line-of-sight operation and altitude, appear to be a major impediment to UAV adoption. Our survey results suggest that REC applications would be particularly sensitive to the regulatory structure.

  10. PSPVDC: An Adaptation of the PSP that Incorporates Verified Design by Contract

    DTIC Science & Technology

    2013-05-01

of only three works in the literature that propose a combination of PSP and formal methods. Babar and Potter [Babar 2005] combine Abrial's B Method... proposals [Babar 2005, Suzumori 2003]. Nonetheless, the main difficulty associated with the method resides in developing a competence in carrying out the... Software Engineering. Shanghai, China, May 2006. www.irisa.fr/lande/lande/icse-proceedings/icse/p761.pdf. [Babar 2005] Babar, Abdul and Potter, John

  11. 76 FR 45843 - Agency Information Collection Activities: E-Verify Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ...-Verify web interface, and performing initial and secondary queries (for example: referring and resolving... of time estimated for an average respondent to respond: 125,015 completing the E-Verify web...

  12. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

Seismic monitoring in Algeria has seen a great change since the Boumerdes earthquake of May 21st, 2003. The installation of the new Algerian Digital Seismic Network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations (15 broadband, 2 very-broadband, and 47 short-period sensors, plus 21 accelerometers), connected in real time using various modes of transmission (VSAT, ADSL, GSM, etc.) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network began operating, a significant number of local, regional, and tele-seismic events have been located by the automatic processing, revised, and archived in databases. This new data set is characterized by accurate automatic location of local seismicity and the ability to determine focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses, and cultural noise are checked using Power Spectral Density (PSD) analysis to determine the noise level. The data quality of the ADSN broadband stations is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass positions of the sensors, and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network supports many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messaging, moment tensor estimation, seismic source determination, shakemap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to
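The routine PSD noise checks mentioned above can be sketched with a minimal Welch-style estimator. This is an illustration only; PQLX itself follows the McNamara-Buland PSD/PDF method, and all values below are synthetic:

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Average periodogram over 50%-overlapping Hann-windowed segments."""
    win = np.hanning(nperseg)
    norm = fs * (win ** 2).sum()
    step = nperseg // 2
    segs = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    psd = np.zeros(nperseg // 2 + 1)
    for s in segs:
        spec = np.fft.rfft(win * (s - s.mean()))   # demean, window, transform
        psd += (np.abs(spec) ** 2) / norm
    psd /= len(segs)
    psd[1:-1] *= 2.0                               # fold to a one-sided spectrum
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, psd

fs = 100.0                                         # sample rate, Hz
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 5.0 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)

freqs, psd = welch_psd(x, fs)
print(freqs[np.argmax(psd)])                       # dominant peak near 5 Hz
```

Comparing such PSD estimates against reference noise models over time is the essence of the station-quality checks described in the abstract.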

  13. Small Arrays for Seismic Intruder Detections: A Simulation Based Experiment

    NASA Astrophysics Data System (ADS)

    Pitarka, A.

    2014-12-01

Seismic sensors such as geophones and fiber optics have been increasingly recognized as promising technologies for intelligence surveillance, including intruder detection and perimeter defense systems. Geophone arrays have the capability to provide cost-effective intruder detection for protecting assets with large perimeters. A seismic intruder detection system uses one or multiple arrays of geophones designed to record seismic signals from footsteps and ground vehicles. Using a series of real-time signal processing algorithms, the system detects, classifies, and monitors the intruder's movement. We have carried out numerical experiments to demonstrate the capability of a seismic array to detect moving targets that generate seismic signals. The seismic source is modeled as a vertical force acting on the ground that generates continuous impulsive seismic signals with different predominant frequencies. Frequency-wavenumber analysis of the synthetic array data was used to demonstrate the array's capability to accurately determine the intruder's direction of movement. The performance of the array was also analyzed in detecting two or more objects moving at the same time. One drawback of a single-array system is its inefficiency at detecting seismic signals deflected by large underground objects. We will show simulation results of the effect of an underground concrete block on shielding the seismic signal coming from an intruder. Based on simulations, we found that multiple small arrays can greatly improve the system's detection capability in the presence of underground structures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
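The direction estimation described above can be sketched with a simple time-domain delay-and-sum beamformer, a close relative of frequency-wavenumber analysis. All geometry, wave-speed, and wavelet parameters below are hypothetical:

```python
import numpy as np

fs = 500.0                               # sample rate, Hz
v = 300.0                                # assumed surface-wave speed, m/s
sensors = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)  # m

true_az = np.deg2rad(60.0)               # propagation direction of incoming wave
slowness = np.array([np.cos(true_az), np.sin(true_az)]) / v

n = 2048
t = np.arange(n) / fs
src = np.exp(-((t - 1.0) / 0.02) ** 2)   # impulsive footstep-like wavelet
traces = np.array([np.interp(t - sensors[i] @ slowness, t, src)
                   for i in range(len(sensors))])

# Scan trial azimuths: undo each trial delay and measure the stack power
azimuths = np.deg2rad(np.arange(0, 360, 2))
power = []
for az in azimuths:
    s = np.array([np.cos(az), np.sin(az)]) / v
    stack = sum(np.interp(t + sensors[i] @ s, t, traces[i])
                for i in range(len(sensors)))
    power.append((stack ** 2).sum())

best = np.rad2deg(azimuths[int(np.argmax(power))])
print(best)                              # should recover ~60 degrees
```

The stack power peaks when the trial slowness matches the true plane-wave slowness, recovering the direction of approach.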

  14. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time, providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with far fewer impacts than are normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data, especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with a cutoff frequency equal to the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
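The shift-and-stack decoding described above can be sketched as follows, assuming a synthetic single-reflection earth response and a hypothetical 5-25 impacts/s linear sweep:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                                   # samples per second
n = int(8.0 * fs)                             # 8 s continuous record
t = np.arange(n) / fs

def response(tau):
    # single-impact earth response: one reflection wavelet at 0.15 s lag
    return np.exp(-((tau - 0.15) / 0.01) ** 2) * np.cos(2 * np.pi * 80 * (tau - 0.15))

# impact times following a linearly increasing impact rate (5 -> 25 impacts/s)
impact_times = np.cumsum(1.0 / np.linspace(5.0, 25.0, 60))

trace = 0.2 * rng.standard_normal(n)          # ambient noise
for ti in impact_times:
    trace += response(t - ti)                 # superposed impact responses

# Decode by shift-and-stack: align the record on each recorded impact time
win = int(0.3 * fs)
stack = np.zeros(win)
for ti in impact_times:
    i0 = int(round(ti * fs))
    stack += trace[i0:i0 + win]
stack /= len(impact_times)

peak_lag = np.argmax(np.abs(stack)) / fs
print(peak_lag)                               # reflection recovered near 0.15 s
```

Because the impact rate sweeps, the misaligned responses of neighboring impacts smear out while the true reflection stacks coherently, which is the correlation-noise suppression the abstract describes.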

  15. Seismic Imager Space Telescope

    NASA Technical Reports Server (NTRS)

Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers, Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe; Ampuero, Jean Paul; Leprince, Sebastien; Michel, Remi

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. A GSI would revolutionize the understanding of earthquake dynamics, perhaps leading ultimately to effective warning

  16. GPR and seismic imaging in a gypsum quarry

    NASA Astrophysics Data System (ADS)

    Dérobert, Xavier; Abraham, Odile

    2000-10-01

A combination of ground penetrating radar (GPR) and seismic imaging has been performed in a gypsum quarry in western Europe. The objective was to localize the main cracks and damaged areas inside some of the pillars, which showed indications of having reached stress limits. The GPR imaging was designed from classical profiles with GPR processing and customized, PC-based image-processing software. Detection of energy reflections appears to be an efficient means of localizing damaged areas. Seismic tomographic images have been obtained from travel-time measurements, which were inverted using a simultaneous iterative reconstruction technique (SIRT) to provide a map of seismic velocities. The imaging techniques employed are compared herein. The two techniques are complementary: seismic tomography produces a map of velocities related to the state of the pillar's internal stress, while radar data serve to localize the main cracks. Moreover, the two imaging processes present similarities with respect to damaged zone detection.
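The SIRT inversion mentioned above can be sketched on a toy travel-time system. The ray geometry and slowness values below are hypothetical, and this is only the basic iteration, not the processing actually used in the quarry study:

```python
import numpy as np

def sirt(A, t, n_iter=200):
    """SIRT: iteratively back-project travel-time residuals along the rays.
    A[i, j] is the length of ray i in cell j; t holds the observed travel times."""
    s = np.zeros(A.shape[1])
    row_len = A.sum(axis=1)                          # total path length of each ray
    hits = np.count_nonzero(A, axis=0).clip(min=1)   # rays crossing each cell
    for _ in range(n_iter):
        residual = (t - A @ s) / row_len             # per-ray average slowness misfit
        s += (A.T @ residual) / hits                 # distribute back to the cells
    return s

# Toy 2x2 model crossed by two horizontal and two vertical unit-length rays
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
s_true = np.array([0.5, 0.3, 0.2, 0.4])              # slowness, s/km
t = A @ s_true

s_est = sirt(A, t)
print(np.allclose(A @ s_est, t, atol=1e-6))          # travel times reproduced
```

With so few rays the slowness model is non-unique, but the recovered model reproduces the observed travel times, which is all SIRT guarantees; real surveys add many crossing rays to constrain the velocity map.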

  17. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    SciTech Connect

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian; Kamali, M. Koohi

    2008-07-08

This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It gives probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475, and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed, covering the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps were prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels, using the SEISRISK III software.
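The return periods quoted above are related to exceedance probability through the usual Poisson assumption, P = 1 - exp(-T/RP); for example, 10% in 50 years corresponds to the 475-year return period:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by exceedance probability p_exceed over t_years
    under a Poisson occurrence model: RP = -T / ln(1 - P)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))   # ~475 years
print(round(return_period(0.02, 50)))   # ~2475 years
```

The same relation recovers the 2475-year return period from a 2% probability of exceedance in 50 years, the other standard design level.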

  18. Modelling of NW Himalayan Seismicity

    NASA Astrophysics Data System (ADS)

    Bansal, A. R.; Dimri, V. P.

    2014-12-01

The northwest Himalaya is a seismically active region owing to the collision of the Indian and Eurasian plates, and it has experienced many large earthquakes in the past. A systematic analysis of seismicity is useful for seismic hazard estimation of the region. We analyzed the seismicity of the northwestern Himalaya since 1980. The magnitude of completeness of the catalogue was estimated using different methods and found to be 3.0. A large spread in the magnitude of completeness was found across methods, and a reliable value was obtained after testing the distribution of magnitudes with time. The region is prone to large earthquakes, and many studies have shown that seismic activation or quiescence takes place before large earthquakes. We studied such behavior of seismicity based on the Epidemic Type Aftershock Sequence (ETAS) model and found that a stationary ETAS model is more suitable for modelling the seismicity of this region. The earthquake catalogue was de-clustered using a stochastic approach to study the behavior of background and triggered seismicity. The triggered seismicity is found to have shallower depths compared to the background events.
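The ETAS model mentioned above describes the earthquake rate as a background term plus aftershock contributions from past events. A minimal sketch of the conditional intensity, with hypothetical parameter values:

```python
import math

# ETAS conditional intensity (hypothetical parameters):
# lambda(t) = mu + sum_i K * exp(alpha*(M_i - Mc)) / (t - t_i + c)**p
def etas_intensity(t, events, mu=0.1, K=0.02, alpha=1.5, c=0.01, p=1.1, Mc=3.0):
    """Rate of events at time t (per day) given past (time, magnitude) pairs."""
    rate = mu
    for ti, Mi in events:
        if ti < t:
            rate += K * math.exp(alpha * (Mi - Mc)) / (t - ti + c) ** p
    return rate

# One M5.5 event at t = 10 days temporarily elevates the rate above background
events = [(10.0, 5.5)]
print(etas_intensity(9.0, events))     # background only: 0.1
print(etas_intensity(10.1, events))    # elevated just after the mainshock
```

Declustering separates the background term mu from the triggered (summation) term, which is the background/triggered split the abstract analyzes.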

  19. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    SciTech Connect

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-04-01

    In this report we will show results of seismic and well log derived attenuation attributes from a deep water Gulf of Mexico data set. This data was contributed by Burlington Resources and Seitel Inc. The data consists of ten square kilometers of 3D seismic data and three well penetrations. We have computed anomalous seismic absorption attributes on the seismic data and have computed Q from the well log curves. The results show a good correlation between the anomalous absorption (attenuation) attributes and the presence of gas as indicated by well logs.

  20. Seismic hazard estimation of northern Iran using smoothed seismicity

    NASA Astrophysics Data System (ADS)

    Khoshnevis, Naeem; Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Cramer, Chris H.

    2017-03-01

This article presents a seismic hazard assessment for northern Iran, where a smoothed seismicity approach has been used in combination with an updated seismic catalog and a ground motion prediction equation recently found to yield a good fit to data. We evaluate the hazard over a geographical area including the seismic zones of Azerbaijan, the Alborz Mountain Range, and Kopeh-Dagh, as well as parts of other neighboring seismic zones that fall within our region of interest. In the chosen approach, seismic events are not assigned to specific faults but are assumed to be potential seismogenic sources distributed within regular grid cells. After performing the corresponding magnitude conversions, we decluster both historical and instrumental seismicity catalogs to obtain earthquake rates based on the number of events within each cell, and smooth the results to account for the uncertainty in the spatial distribution of future earthquakes. Seismicity parameters are computed for each seismic zone separately, and for the entire region of interest as a single uniform seismotectonic region. In the analysis, we consider uncertainties in the ground motion prediction equation and the seismicity parameters, and combine the resulting models using a logic tree. The results are presented in terms of expected peak ground acceleration (PGA) maps and hazard curves at selected locations, considering exceedance probabilities of 2 and 10% in 50 years for rock site conditions. According to our results, the highest levels of hazard are observed west of the North Tabriz and east of the North Alborz faults, where expected PGA values are between about 0.5 and 1 g for 10 and 2% probability of exceedance in 50 years, respectively. We analyze our results in light of similar estimates available in the literature and offer our perspective on the differences observed.
We find our results to be helpful in understanding seismic hazard for northern Iran, but recognize that additional efforts are necessary to
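The gridding-and-smoothing step described above can be sketched as follows (a Frankel-style Gaussian kernel with hypothetical cell size and correlation distance):

```python
import numpy as np

def smooth_counts(counts, cell_km=10.0, corr_km=50.0):
    """Gaussian-smooth gridded event counts; the total rate is preserved
    because each event's kernel is normalized to unit mass."""
    ny, nx = counts.shape
    y, x = np.mgrid[0:ny, 0:nx]
    smoothed = np.zeros((ny, nx))
    for iy, ix in zip(*np.nonzero(counts)):
        d2 = ((x - ix) ** 2 + (y - iy) ** 2) * cell_km ** 2
        kernel = np.exp(-d2 / corr_km ** 2)
        smoothed += counts[iy, ix] * kernel / kernel.sum()
    return smoothed

counts = np.zeros((21, 21))
counts[10, 10] = 50            # a cluster of 50 declustered events in one cell
rates = smooth_counts(counts)
print(rates.sum())             # total count preserved: 50
```

The smoothed grid then feeds the recurrence model cell by cell, spreading the rate of observed seismicity over neighboring cells where future earthquakes may plausibly occur.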

  1. Verifiable Adaptive Control with Analytical Stability Margins by Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.

    2010-01-01

This paper presents a verifiable model-reference adaptive control method based on an optimal control formulation for linear uncertain systems. A predictor model is formulated to enable estimation of the system's parametric uncertainty. The adaptation is based on both the tracking error and the predictor error. Using a singular perturbation argument, it can be shown that the closed-loop system tends to a linear time-invariant model asymptotically under an assumption of fast adaptation. A stability margin analysis is given to estimate a lower bound of the time-delay margin using a matrix measure method. Using this analytical method, the free design parameter n of the optimal control modification adaptive law can be determined to meet a stability margin specification for verification purposes.

  2. Workmanship Coupon Verifies and Validates the Remote Inspection System Used to Inspect Dry Shielded Canister Welds

    SciTech Connect

    Custer, K. E.; Zirker, L. R.; Dowalo, J. A.; Kaylor, J. E.

    2002-02-25

The Idaho National Engineering and Environmental Laboratory (INEEL) is operated by Bechtel-BWXT Idaho LLC (BBWI), which recently completed a very successful Three Mile Island-2 (TMI-2) program for the Department of Energy. This complex and challenging program loaded, welded, and transported an unprecedented 27 dry shielded canisters in seven months, and did so ahead of schedule. The program moved over 340 canisters of TMI-2 core debris that had been in wet storage into a dry storage facility at the INEEL. Welding flaws in the manually welded purge and vent ports, discovered in mid-campaign, had to be verified as not affecting previously completed seal welds. A portable workmanship coupon was designed and built to validate remote inspection of completed in-service seal welds. This document outlines the methodology and advantages of building and using workmanship coupons.

  3. 75 FR 31288 - Plant-Verified Drop Shipment (PVDS)-Nonpostal Documentation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... 111 Plant-Verified Drop Shipment (PVDS)--Nonpostal Documentation AGENCY: Postal Service TM . ACTION... Service, Domestic Mail Manual (DMM ) 705.15. 2.14 to clarify that PS Form 8125, Plant-Verified Drop...: As a result of reviews of USPS policy concerning practices at induction points of plant-verified...

  4. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  5. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  6. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  7. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  8. 31 CFR 363.14 - How will you verify my identity?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false How will you verify my identity? 363... you verify my identity? (a) Individual. When you establish an account, we may use a verification service to verify your identity using information you provide about yourself on the online application....

  9. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  10. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  11. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  12. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  13. 49 CFR 40.149 - May the MRO change a verified drug test result?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false May the MRO change a verified drug test result? 40... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Medical Review Officers and the Verification Process § 40.149 May the MRO change a verified drug test result? (a) As the MRO, you may change a verified...

  14. Mechanical characterization of seismic base isolation elastomers

    SciTech Connect

    Kulak, R.F.; Hughes, T.H.

    1991-01-01

Of the various devices proposed for seismic isolation, the laminated elastomer bearing is emerging as the preferred device for large buildings and structures, such as nuclear reactor plants. The laminated bearing is constructed from alternating thin layers of elastomer and metallic plates (shims). The elastomer is usually a carbon-filled natural rubber that exhibits damping when subjected to shear; recently, some blends of natural and synthetic rubbers have appeared. Before candidate elastomers can be used in seismic isolation bearings, their response to design-basis and beyond-design-basis loads must be determined. This entails the development of constitutive models and then the determination of associated material parameters through specimen testing. This paper describes the methods used to obtain data for characterizing the mechanical response of elastomers used for seismic isolation. The data provide a database for determining material parameters associated with nonlinear constitutive models. In addition, the paper presents a definition for a damping ratio that does not exhibit the usual reduction at higher strain cycles. 2 refs., 6 figs., 1 tab.

  15. Application of seismic isolation to industrial tanks

    SciTech Connect

    Zayas, V.A.; Low, S.S.

    1995-12-01

    The state-of-the-art in the application of seismic isolation to industrial tanks is presented. Use of seismic isolation in industrial tanks can reduce lateral shaking forces by factors of 3 to 5 for strong earthquake loadings. This level of force reduction offers a practical and economical means of designing tanks on a linear elastic basis, and thereby reduces the risk of local failures and leakage during earthquakes. The case studies presented include: LNG Storage Tanks, an Ammonia Storage Tank, and an Emergency Fire and Cooling Water Tank. The tank capacities range from 50 thousand gallons to 19 million gallons. Two applications are new tanks, and one is a retrofit of an existing tank. The methodology for the design of the isolation bearings and tank structures is presented. The dynamic analysis methods used to perform the seismic analysis of the isolated tanks are reviewed, including the hydrodynamic modeling methods. The engineering principles and theory of the Friction Pendulum isolation bearings are discussed. This pendulum based isolation system results in the same natural period of vibration regardless of changes in the fluid levels in the tank, or temperature, aging, and environmental conditions. Test results for the isolation bearings are presented, including comparisons of experimental and analytical results for dynamic loadings, and strength, temperature and aging tests.
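The fluid-level independence claimed for the Friction Pendulum bearing follows from pendulum dynamics: the isolated period is T = 2*pi*sqrt(R/g), set by the radius of the concave sliding surface alone, with no mass term. A minimal sketch (the 2.24 m radius is an illustrative value, not one from the case studies):

```python
import math

def fps_period(radius_m, g=9.81):
    """Natural period of a Friction Pendulum isolation bearing,
    T = 2*pi*sqrt(R/g). The supported mass cancels out, so the period
    is unaffected by changes in tank fluid level, temperature, or aging."""
    return 2.0 * math.pi * math.sqrt(radius_m / g)

# A concave surface radius of 2.24 m gives roughly a 3 s isolation period
print(round(fps_period(2.24), 2))  # ~3.0
```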

  16. Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section A report, existing conditions calculations/supporting information

    SciTech Connect

    1995-07-14

    The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. Based upon US Department of Energy (DOE) Albuquerque Operations (DOE/Al) Office and LANL projections, storage space limitations/restrictions will begin to affect LANL's ability to meet its missions between 1998 and 2002.

  17. Validating induced seismicity forecast models—Induced Seismicity Test Bench

    NASA Astrophysics Data System (ADS)

    Király-Proag, Eszter; Zechar, J. Douglas; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph

    2016-08-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). The two models incorporate different mixes of physics-based elements and stochastic representations of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in but is only mediocre at forecasting the spatial distribution. On the other hand, SaSS forecasts the spatial distribution better and gives better seismicity rate estimates before shut-in. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in.
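Ranking probabilistic seismicity forecasts of the kind this test bench compares is commonly done by scoring each model's binned rate forecast against the observed event counts with a Poisson log-likelihood. This is a generic sketch of that scoring idea; the rates and counts below are made-up illustrative numbers, not data from the Basel or Soultz-sous-Forêts sequences.

```python
import math

def poisson_loglik(forecast_rates, observed_counts):
    """Joint log-likelihood of observed event counts under independent
    Poisson forecasts, one expected rate per space-time bin."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(forecast_rates, observed_counts))

# Hypothetical per-bin rates from two competing models, and observed counts
model_a = [2.0, 5.0, 1.0, 0.5]
model_b = [3.0, 3.0, 2.0, 1.0]
observed = [2, 4, 1, 1]

# The model with the higher log-likelihood better matches the observations
print(poisson_loglik(model_a, observed) > poisson_loglik(model_b, observed))
```

In practice such scores are compared bin-by-bin in time and space, which is how a test bench can conclude that one model does better before shut-in and another after.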

  18. Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel

    USGS Publications Warehouse

    Wang, Z.; Shi, B.; Kiefer, J.D.

    2005-01-01

    PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is the basis for those decisions. Seismologists should provide the best information on seismic hazards and communicate it to users and policy makers. In the central U.S., however, there has been a lack of effort in communicating the uncertainties in seismic hazard assessment. Use of 10%, 5%, and 2% PE in 50 years causes confusion in communicating seismic hazard assessment. It would be easier to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.
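The 10%, 5%, and 2% probability-of-exceedance (PE) levels discussed above map onto return periods under the usual Poissonian occurrence assumption, pe = 1 - exp(-t/T). A short sketch of that conversion, which recovers the familiar ~475-, ~975-, and ~2475-year values:

```python
import math

def return_period(pe, t_years=50.0):
    """Return period implied by a probability of exceedance `pe` over
    `t_years`, assuming Poissonian earthquake occurrence:
    pe = 1 - exp(-t/T)  =>  T = -t / ln(1 - pe)."""
    return -t_years / math.log(1.0 - pe)

for pe in (0.10, 0.05, 0.02):
    print(f"{pe:.0%} PE in 50 yr -> ~{return_period(pe):.0f}-year return period")
```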

  19. Seismic isolation of nuclear power plants using elastomeric bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

    Seismic isolation using low damping rubber (LDR) and lead-rubber (LR) bearings is a viable strategy for mitigating the effects of extreme earthquake shaking on safety-related nuclear structures. Although seismic isolation has been deployed in nuclear structures in France and South Africa, it has not seen widespread use because of limited new build nuclear construction in the past 30 years and a lack of guidelines, codes and standards for the analysis, design and construction of isolation systems specific to nuclear structures. The nuclear accident at Fukushima Daiichi in March 2011 has led the nuclear community to consider seismic isolation for new large light water and small modular reactors to withstand the effects of extreme earthquakes. The mechanical properties of LDR and LR bearings are not expected to change substantially in design basis shaking. However, under shaking more intense than design basis, the properties of the lead cores in lead-rubber bearings may degrade due to heating associated with energy dissipation, some bearings in an isolation system may experience net tension, and the compression and tension stiffness may be affected by the horizontal displacement of the isolation system. The effects of intra-earthquake changes in mechanical properties on the response of base-isolated nuclear power plants (NPPs) were investigated using an advanced numerical model of a lead-rubber bearing that has been verified and validated, and implemented in OpenSees and ABAQUS. A series of experiments were conducted at University at Buffalo to characterize the behavior of elastomeric bearings in tension. The test data were used to validate a phenomenological model of an elastomeric bearing in tension. The value of three times the shear modulus of rubber in elastomeric bearing was found to be a reasonable estimate of the cavitation stress of a bearing. The sequence of loading did not change the behavior of an elastomeric bearing under cyclic tension, and there was no
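The abstract's rule of thumb, that cavitation in an elastomeric bearing begins at roughly three times the rubber shear modulus, is easy to apply directly. A one-line sketch; the 0.65 MPa shear modulus is a hypothetical low-damping-rubber value, not a figure from the dissertation.

```python
def cavitation_stress(shear_modulus_mpa):
    """Estimated cavitation (tensile) stress of an elastomeric bearing,
    taken as three times the rubber shear modulus per the finding above."""
    return 3.0 * shear_modulus_mpa

# Hypothetical low-damping rubber with G = 0.65 MPa
print(f"{cavitation_stress(0.65):.2f} MPa")
```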

  20. Seismic behavior of lightweight concrete columns

    NASA Astrophysics Data System (ADS)

    Rabbat, B. G.; Daniel, J. I.; Weinmann, T. L.; Hanson, N. W.

    1982-09-01

    Sixteen full-scale, column-beam assemblies, which represented a portion of a frame subjected to simulated seismic loading, were tested. Controlled test parameters included concrete type, column size, amount of main column steel, size and spacing of column confining hoops, and magnitude of column axial load. The columns were subjected to constant axial load and slow moment reversals at increasing inelastic deformations. Test data showed that properly designed lightweight concrete columns maintained ductility and strength when subjected to large inelastic deformations from load reversals. Confinement requirements for normal weight concrete columns were shown to be applicable to lightweight concrete columns up to thirty percent of the design strength.