These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

The Hob system for verifying software design properties  

E-print Network

This dissertation introduces novel techniques for verifying that programs conform to their designs. My Hob system, as described in this dissertation, allows developers to statically ensure that implementations preserve ...

Lam, Patrick, Ph. D. Massachusetts Institute of Technology

2007-01-01

2

Assessment of seismic design response factors of concrete wall buildings  

NASA Astrophysics Data System (ADS)

To verify the seismic design response factors of high-rise buildings, five reference structures, varying in height from 20 to 60 stories, were selected and designed according to modern design codes to represent a wide range of concrete wall structures. Verified fiber-based analytical models for inelastic simulation were developed, considering the geometric nonlinearity and material inelasticity of the structural members. The ground motion uncertainty was accounted for by employing 20 earthquake records representing two seismic scenarios, consistent with the latest understanding of the tectonic setting and seismicity of the selected reference region (UAE). A large number of Inelastic Pushover Analyses (IPAs) and Incremental Dynamic Collapse Analyses (IDCAs) were deployed for the reference structures to estimate the seismic design response factors. It is concluded that the factors adopted by the design code are adequately conservative. The results of this systematic assessment of seismic design response factors apply to a wide variety of contemporary concrete wall buildings with various characteristics.

Mwafy, Aman

2011-03-01

3

Application of process monitoring to verify facility design  

SciTech Connect

Process monitoring has been proposed as a safeguards measure to ensure that a facility is operating as designed, or as a surveillance measure to ensure that material is not removed from the facility in an undeclared manner. In a process-monitoring system, the facility operator monitors process operations such as tank levels, densities, and temperatures; process flows; and physical parameters such as valve positions to ensure that the operations performed are both desired and required. At many facilities (for example, Idaho), the process-monitoring system is also an important safety feature to prevent criticality. Verifying facility design is necessary for application of safeguards in a reprocessing plant. Verifying all pipes and valves through comparison of blueprints with the as-built facility is an almost impossible task with the International Atomic Energy Agency's limited inspection resources. We propose applying process monitoring for international safeguards facility design verification. By carefully selecting process-operating variables, it may be possible to verify that plant flows are as described and that key measurement points are not bypassed. 8 refs.
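The comparison of declared operations against measured process variables can be sketched as follows. This is a hypothetical, minimal illustration; the function name, tolerance, and data are invented and are not drawn from the safeguards system described above.

```python
# Hypothetical sketch: flag tank-level changes that have no matching
# declared operation (names, tolerances, and data are illustrative only).

def find_undeclared_changes(readings, declared_ops, tolerance=0.5):
    """Compare successive tank-level readings against declared
    transfer operations; return the indices of unexplained changes."""
    anomalies = []
    for i in range(1, len(readings)):
        change = readings[i] - readings[i - 1]
        # An interval is explained if a declared operation predicts
        # a change of roughly this size at this step.
        declared = declared_ops.get(i, 0.0)
        if abs(change - declared) > tolerance:
            anomalies.append(i)
    return anomalies

levels = [100.0, 100.1, 90.0, 90.2, 70.1]
ops = {2: -10.0, 4: -20.0}        # declared transfers at steps 2 and 4
print(find_undeclared_changes(levels, ops))  # -> []

levels_bad = [100.0, 95.0, 85.0]  # 5-unit drop at step 1 never declared
print(find_undeclared_changes(levels_bad, {2: -10.0}))  # -> [1]
```

A real process-monitoring system would, of course, track many correlated variables (levels, densities, temperatures, valve positions) rather than a single scalar series.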

Hakkila, E.A.

1989-01-01

4

Verifying Architectural Design Rules of the Flight Software Product Line  

NASA Technical Reports Server (NTRS)

This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules for the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and, through them, to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
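The kind of consistency check described can be illustrated with a small sketch that compares extracted module dependencies against an allowed-dependency rule set. The layer names and rules below are invented for illustration; they are not the actual CFS architectural rules.

```python
# Illustrative architectural consistency check (the layer names and
# rules are hypothetical, not the actual CFS rules).

ALLOWED = {
    "app":  {"lib", "osal"},   # apps may call the library and the OS layer
    "lib":  {"osal"},          # the library may call only the OS layer
    "osal": set(),             # the OS abstraction layer calls nothing above it
}

def check_dependencies(deps):
    """deps: iterable of (from_module, to_module) pairs extracted from
    the implementation; returns the pairs that violate the rules."""
    return [(src, dst) for src, dst in deps
            if dst not in ALLOWED.get(src, set())]

observed = [("app", "lib"), ("lib", "osal"), ("lib", "app")]
print(check_dependencies(observed))  # -> [('lib', 'app')]
```

In practice the dependency pairs would be extracted automatically from the code (e.g. from include graphs or call graphs) rather than listed by hand.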

Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

2009-01-01

5

Evaluation of strategies for seismic design  

E-print Network

Current trends in seismic design require a new approach, oriented in satisfying motion related design requirements and limiting both structural and non-structural damage. Seismic isolation and damping devices are currently ...

Tsertikidou, Despoina

2012-01-01

6

Design of a verifiable subset for HAL/S  

NASA Technical Reports Server (NTRS)

An attempt to evaluate the applicability of program verification techniques to the existing programming language HAL/S is discussed. HAL/S is a general-purpose high-level language designed to accommodate the software needs of the NASA Space Shuttle project, offering a diversity of features for scientific computing, concurrent and real-time programming, and error handling. The criteria by which features were evaluated for inclusion in the verifiable subset are described. Individual features of HAL/S are examined with respect to these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations for the use of HAL/S with respect to program verification.

Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

1979-01-01

7

Structural concepts and details for seismic design  

SciTech Connect

This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which have shown by experience to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones is carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

Not Available

1991-09-01

8

Simplified seismic performance assessment and implications for seismic design  

NASA Astrophysics Data System (ADS)

The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios including different structural configurations.

Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

2014-08-01

9

Pseudostatic seismic stability of slopes: Design charts  

SciTech Connect

A variational limiting equilibrium approach that is an extension of Baker and Garber's analysis is presented. In this extension a pseudostatic seismic force is included to assess the stability of simple slopes. The normal stress distribution over the potential slip surface is determined mathematically so as to render the minimum factor of safety while satisfying explicitly all limiting equilibrium equations. The equation describing this distribution contains terms related to seismicity and it makes the problem statically determinate without a priori statical assumptions. A closed-form solution is then obtained for a log spiral failure mechanism. Consequently, a parametric study of the effects of pseudostatic seismic force is conducted with ease. Furthermore, design charts for assessing the seismic stability of simple slopes are presented. For nonseismic cases these charts coincide with Taylor's chart. For the seismic cases, identical results could also be obtained by using a conventional log spiral analysis (though with much greater effort). The design charts should therefore be acceptable to a wide range of practitioners.
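For contrast with the closed-form log-spiral solution described above, the basic pseudostatic idea can be sketched for the much simpler case of a rigid block on a planar slip surface. This is illustrative only, under assumptions of my own (a single block, dry conditions); it is not the paper's variational analysis.

```python
import math

# Simplified planar-block pseudostatic check (NOT the variational
# log-spiral solution of the paper): a rigid block of weight W rests on
# a plane inclined at beta, with cohesion c acting over contact area A,
# friction angle phi, and horizontal seismic coefficient kh.

def pseudostatic_fs(W, beta_deg, c, A, phi_deg, kh):
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # The seismic force kh*W acts horizontally: it reduces the normal
    # force on the slip plane and adds to the driving force along it.
    normal = W * math.cos(beta) - kh * W * math.sin(beta)
    driving = W * math.sin(beta) + kh * W * math.cos(beta)
    resisting = c * A + normal * math.tan(phi)
    return resisting / driving

# Static case (kh = 0) vs. a modest seismic coefficient:
print(round(pseudostatic_fs(1000, 30, 5, 20, 35, 0.0), 2))   # -> 1.41
print(round(pseudostatic_fs(1000, 30, 5, 20, 35, 0.15), 2))  # -> 1.04
```

As expected, the factor of safety drops as the seismic coefficient grows, which is the trend the design charts quantify for the more realistic log-spiral mechanism.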

Leshchinsky, D.; San, K.C. (Univ. of Delaware, Newark, DE (United States). Dept. of Civil Engineering)

1994-09-01

10

On the Security of a Designated-Verifier Proxy Signature Scheme and its Improved Scheme (Revisited)  

Microsoft Academic Search

As a special type of signature, a proxy signature allows an entity, called the original signer, to delegate its signing capability to another entity, which produces signatures on its behalf. By combining the ideas of proxy signatures and designated-verifier signatures, Q. Wang et al. proposed an identity-based strong designated-verifier proxy signature (DVP) scheme and claimed that their scheme satisfied all of the security
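The core designated-verifier idea can be shown with a toy sketch: signer and designated verifier derive a shared Diffie-Hellman key, and the "signature" is a MAC under that key. The designated verifier can check it but cannot convince a third party, since the verifier could have produced the same tag itself. All parameters below are toy-sized and insecure, and this is not the identity-based scheme discussed in this record.

```python
import hashlib, hmac

# Toy designated-verifier signature sketch (insecure parameter sizes;
# illustration of the concept only, NOT the scheme analyzed above).

P, G = 0xFFFFFFFB, 5                 # toy public Diffie-Hellman parameters
sk_signer, sk_verifier = 123456, 654321
pk_signer = pow(G, sk_signer, P)
pk_verifier = pow(G, sk_verifier, P)

def shared_key(my_sk, their_pk):
    # Both parties compute the same g^(xy) mod p value.
    return pow(their_pk, my_sk, P).to_bytes(8, "big")

def dv_sign(message, my_sk, their_pk):
    # The "signature" is a MAC under the shared key, so either party
    # could have created it -- hence only the designated verifier is
    # convinced of its origin.
    return hmac.new(shared_key(my_sk, their_pk), message, hashlib.sha256).digest()

msg = b"transfer design documents"
sig = dv_sign(msg, sk_signer, pk_verifier)

# The designated verifier recomputes the same tag from its own key:
check = dv_sign(msg, sk_verifier, pk_signer)
print(hmac.compare_digest(sig, check))  # -> True
```

Real designated-verifier schemes add non-transferability proofs and, in the identity-based setting, derive keys from identities via a trusted authority; none of that is modeled here.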

Jianhong Zhang

2007-01-01

11

Risk analysis for seismic design of tailings dams

Microsoft Academic Search

Probabilistic seismic risk analysis is a promising method for evaluating design options and establishing seismic design parameters. However, there have been few examples in the literature to guide practitioners in its use. This paper demonstrates the value of risk analysis for mine tailings dams and provides a case-history application for a seismically active portion of Nevada. Risk analysis provided the

Steven G. Vick; Gail M. Atkinson; Charles I. Wilmot

1985-01-01

12

Voting with designated verifier signature-like protocol

E-print Network

We propose in this paper a new voting scheme where the voter, while receiving a receipt for his/her vote allowing further contestations, cannot use it to reveal the vote to other unspecified entities

Markowitch, Olivier

13

A Preliminary study on the seismic conceptual design  

NASA Astrophysics Data System (ADS)

Seismic conceptual design is an essential part of seismic design codes. The term "seismic conceptual design" implies three aspects: the concept itself, the specific code provisions related to the concept, and design practice that follows those provisions. Seismic conceptual design can be classified into two categories: strict (traditional) seismic conceptual design and generalized seismic conceptual design. The authors define the connotations of both and study their characteristics, in particular the differences between them. Although the two sound very close, their differences are apparent: strict conceptual designs are usually derived directly from engineering practice and/or lessons learned from earthquake damage, while generalized conceptual designs result from a series of visions aiming to realize the general objectives of the seismic codes. Strict (traditional) conceptual designs are indispensable elements of seismic codes for making designed structures safer, while generalized conceptual designs play a key role in steering seismic codes toward more advanced and effective forms.

Zhao, Zhen; Xie, Lili

2014-08-01

14

Implied preference for seismic design level and earthquake insurance.  

PubMed

Seismic risk can be reduced by implementing newly developed seismic provisions in design codes. Furthermore, financial protection or enhanced utility and happiness for stakeholders could be gained through the purchase of earthquake insurance. If this were not so, there would be no market for such insurance. However, the perceived benefit associated with insurance is not universally shared by stakeholders, partly due to their diverse risk attitudes. This study investigates the implied seismic design preference with insurance options for decisionmakers of bounded rationality whose preferences can be adequately represented by the cumulative prospect theory (CPT). The investigation is focused on assessing the sensitivity of the implied seismic design preference with insurance options to model parameters of the CPT and to fair and unfair insurance arrangements. Numerical results suggest that human cognitive limitation and risk perception can affect the implied seismic design preference by the CPT significantly. The mandatory purchase of fair insurance will lead the implied seismic design preference to the optimum design level that is dictated by the minimum expected lifecycle cost rule. Unfair insurance decreases the expected gain as well as its associated variability, which is preferred by risk-averse decisionmakers. The obtained results of the implied preference for the combination of the seismic design level and insurance option suggest that property owners, financial institutions, and municipalities can take advantage of affordable insurance to establish successful seismic risk management strategies. PMID:18419667
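The CPT machinery referenced here rests on two ingredients: a value function that is concave for gains and convex (and steeper) for losses, and a probability-weighting function that overweights small probabilities. A minimal sketch follows, using the commonly cited Tversky-Kahneman (1992) parameter estimates for illustration only; these are not the calibration used in this particular study.

```python
# Sketch of the Tversky-Kahneman value and probability-weighting
# functions underlying CPT (parameter values are the widely cited 1992
# estimates, used here only for illustration).

def cpt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Concave for gains, convex and steeper for losses (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def cpt_weight(p, gamma=0.61):
    """Overweights small probabilities, underweights large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# A loss hurts more than an equal gain helps:
print(cpt_value(100), cpt_value(-100))
# A rare event (p = 0.01) is overweighted:
print(round(cpt_weight(0.01), 3))  # -> 0.055
```

This overweighting of rare, severe events is exactly why insurance can look attractive to CPT decisionmakers even when its expected monetary value is negative.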

Goda, K; Hong, H P

2008-04-01

15

Risk analysis for seismic design of (tailings dams)  

SciTech Connect

Probabilistic seismic risk analysis is a promising method for evaluating design options and establishing seismic design parameters. However, there have been few examples in the literature to guide practitioners in its use. This paper demonstrates the value of risk analysis for mine tailings dams and provides a case-history application for a seismically active portion of Nevada. Risk analysis provided the basis for selecting among design options having varying liquefaction resistance, and for establishing input parameters for dynamic analysis. Ranges are presented for the quantity and cleanup cost of tailings released in seismic failures to aid in determining expected failure consequences. It is shown that for many tailings dams, accepted lifetime failure probabilities of a few percent may provide a reasonable basis for probabilistic determination of seismic design criteria.

Vick, S.G.; Atkinson, G.M.; Wilmot, C.I.

1985-07-01

16

Verifying seismic design of nuclear reactors by testing. Technical evaluation report  

SciTech Connect

The purpose of the study is to develop a program plan to provide assurance by physical demonstration that nuclear power plants are earthquake resistant and to allow nuclear power plant operators to: (1) decide whether tests should be conducted on their facilities; (2) specify the tests that should be performed; and (3) estimate the cost of the effort to complete the recommended test program.

Barclay, B.; Malthan, J.A.; Masri, S.F.; Safford, F.B.

1980-04-01

17

A provably secure really source hiding designated verifier signature scheme based on random oracle model  

E-print Network

the public key of the signer to verify whether the signature verification equation holds, where the equation involves the signer's public key. In 2008, an impersonation attack on [15] was found; hence, a modification of [15] was provided. They claimed

18

Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis  

SciTech Connect

The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
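The probabilistic hazard computation referred to above can be sketched as a discrete sum over scenario earthquakes: each scenario contributes its annual rate times the probability that it exceeds a given ground-motion level. The scenario list and the attenuation relation below are invented for illustration; they are not the LLNL models.

```python
import math

# Minimal discrete PSHA sketch (hypothetical rates and a toy attenuation
# relation, NOT the LLNL methodology described above).

# (magnitude, distance_km, annual_rate) scenarios -- illustrative only
scenarios = [(5.0, 20.0, 0.10), (6.0, 40.0, 0.02), (7.0, 80.0, 0.005)]

def median_pga(m, r):
    """Toy attenuation relation: ln PGA = -3.5 + 0.9*M - 1.2*ln(R), in g."""
    return math.exp(-3.5 + 0.9 * m - 1.2 * math.log(r))

def exceedance_rate(pga_level, sigma_ln=0.6):
    """Annual rate of exceeding pga_level: sum over scenarios of
    rate_i * P(PGA > level | scenario_i), with lognormal variability."""
    total = 0.0
    for m, r, rate in scenarios:
        z = (math.log(pga_level) - math.log(median_pga(m, r))) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2))  # 1 - Phi(z)
        total += rate * p_exceed
    return total

# Hazard curve: the exceedance rate falls as the PGA level rises.
for level in (0.05, 0.1, 0.2):
    print(level, exceedance_rate(level))
```

Deaggregating this sum by magnitude and distance is what recovers the "representative event" that the report seeks to link back to engineering design practice.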

Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

1998-04-01

19

Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings  

NASA Astrophysics Data System (ADS)

Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and to studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage to newly designed RC buildings is still reported. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB, and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analysis using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for the immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three frames designed according to the 2010 CCSDB meet the seismic requirements and are almost at the same safety level.
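The step from a PSDM to a fragility curve can be sketched as follows. A power-law demand model ln(EDP) = ln(a) + b·ln(IM) with lognormal dispersion is combined with a lognormal capacity for a damage state; the coefficients and capacities below are hypothetical, not the values fitted for these frames.

```python
import math
from statistics import NormalDist

# Sketch of deriving a fragility curve from a probabilistic seismic
# demand model (all numbers below are invented for illustration, not
# fitted to the frames discussed above).

a, b = 0.03, 1.1        # hypothetical PSDM coefficients (drift vs. Sa)
beta_d = 0.35           # demand dispersion (lognormal std of ln EDP)
capacity = 0.02         # hypothetical drift capacity for a damage state
beta_c = 0.25           # capacity dispersion

def fragility(im):
    """P(demand >= capacity | IM = im) with lognormal demand and capacity."""
    median_demand = a * im ** b
    beta = math.hypot(beta_d, beta_c)   # combined dispersion
    return NormalDist().cdf(math.log(median_demand / capacity) / beta)

# Probability of reaching the damage state rises with intensity:
for sa in (0.2, 0.5, 1.0):
    print(sa, round(fragility(sa), 3))
```

Repeating this for each damage state (immediate occupancy, significant damage, collapse prevention) yields the family of fragility curves the abstract describes.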

Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

2012-09-01

20

Next generation seismic fragility curves for California bridges incorporating the evolution in seismic design philosophy  

NASA Astrophysics Data System (ADS)

Quantitative and qualitative assessment of the seismic risk to highway bridges is crucial in pre-earthquake planning, and post-earthquake response of transportation systems. Such assessments provide valuable knowledge about a number of principal effects of earthquakes such as traffic disruption of the overall highway system, impact on the regions’ economy and post-earthquake response and recovery, and more recently serve as measures to quantify resilience. Unlike previous work, this study captures unique bridge design attributes specific to California bridge classes along with their evolution over three significant design eras, separated by the historic 1971 San Fernando and 1989 Loma Prieta earthquakes (these events affected changes in bridge seismic design philosophy). This research developed next-generation fragility curves for four multispan concrete bridge classes by synthesizing new knowledge and emerging modeling capabilities, and by closely coordinating new and ongoing national research initiatives with expertise from bridge designers. A multi-phase framework was developed for generating fragility curves, which provides decision makers with essential tools for emergency response, design, planning, policy support, and maximizing investments in bridge retrofit. This framework encompasses generational changes in bridge design and construction details. Parameterized high-fidelity three-dimensional nonlinear analytical models are developed for the portfolios of bridge classes within different design eras. These models incorporate a wide range of geometric and material uncertainties, and their responses are characterized under seismic loadings. Fragility curves were then developed considering the vulnerability of multiple components and thereby help to quantify the performance of highway bridge networks and to study the impact of seismic design principles on the performance within a bridge class. 
This not only leads to the development of fragility relations that are unique and better suited for bridges in California, but also leads to the creation of better bridge classes and sub-bins that have more consistent performance characteristics than those currently provided by the National Bridge Inventory. Another important feature of this research is associated with the development of damage state definitions and grouping of bridge components in a way that they have similar consequences in terms of repair and traffic implications following a seismic event. These definitions are in alignment with the California Department of Transportation’s design and operational experience, thereby enabling better performance assessment, emergency response, and management in the aftermath of a seismic event. The fragility curves developed as a part of this research will be employed in ShakeCast, a web-based post-earthquake situational awareness application that automatically retrieves earthquake shaking data and generates potential damage assessment notifications for emergency managers and responders.

Ramanathan, Karthik Narayan

21

Recommended revisions to Nuclear Regulatory Commission seismic design criteria  

Microsoft Academic Search

This report recommends changes in the Nuclear Regulatory Commission's (NRC's) criteria now used in the seismic design of nuclear power plants. Areas covered include ground motion, soil-structure interaction, structures, and equipment and components. Members of the Engineering Mechanics Section of the Nuclear Test Engineering Division at Lawrence Livermore Laboratory (LLL) generally agreed upon the recommendations, which are based on: (1)

Coats

1979-01-01

22

High-performance braces for seismic design  

E-print Network

The fundamental challenge for the structural engineer in designing earthquake-resistant structures is to design buildings with both adequate ductility and sufficient stiffness. Traditional lateral force resisting systems ...

Lim, Tim S

2013-01-01

23

On verifying a high-level design. [cost and error analysis]

NASA Technical Reports Server (NTRS)

An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

1993-01-01

24

Verified Craters  

NSDL National Science Digital Library

Spinning globe showing yellow dots to represent the location of approximately 150 verified craters scattered throughout the world. They are largely grouped on the North American, European, and Australian continents.

Thomson, Joycelyn; Wasilewski, Peter

2002-09-06

25

Review of seismicity and ground motion studies related to development of seismic design at SRS  

SciTech Connect

The NRC response spectra developed in Reg. Guide 1.60 are being used in the studies related to restarting the existing Savannah River Site (SRS) reactors. Because this spectral shape envelops all the other site-specific spectra which have been developed for SRS, it provides significant conservatism in the design and analysis of the reactor systems for ground motions of this value or with these probability levels. This spectral shape is also the shape used for the design of the recently licensed Vogtle Nuclear Station, located south of the Savannah River from the SRS. This report provides a summary of the database used to develop the design basis earthquake, including the seismicity, rates of occurrence, magnitudes, and attenuation relationships. A summary is provided of the studies performed and methodologies used to establish the design basis earthquake for SRS. The ground motion response spectra developed from the various studies are also summarized. The seismic hazard and PGAs developed for other critical facilities in the region are discussed, and the SRS seismic instrumentation is presented. The programs for resolving outstanding issues are discussed and conclusions are presented.

Stephenson, D.E. [Westinghouse Savannah River Co., Aiken, SC (United States); Acree, J.R. [Westinghouse Environmental and Geotechnical Services, Inc., Columbia, SC (United States)

1992-08-01

26

Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.  

SciTech Connect

This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

2006-10-01

27

Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington  

SciTech Connect

The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy’s (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis.
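Enveloping uncertainties at roughly the 84th percentile can be sketched for a single spectral ordinate: for lognormally distributed ground motion, the 84th percentile is close to exp(mean(ln Sa) + 1·std(ln Sa)). The sample values below are invented; the actual WTP spectra came from velocity data, statistical analysis, expert elicitation, and ground-motion simulation.

```python
import math
from statistics import mean, stdev

# Sketch of an ~84th-percentile envelope at one spectral period
# (sample values are hypothetical, not the WTP data).

def percentile84(samples):
    """For lognormal data, exp(mean + 1*std) of the logs is close to
    the 84th percentile (one-sigma level)."""
    logs = [math.log(s) for s in samples]
    return math.exp(mean(logs) + stdev(logs))

# Spectral accelerations (g) at one period from several simulations:
sa_samples = [0.31, 0.40, 0.36, 0.52, 0.45]
design_value = percentile84(sa_samples)
print(round(design_value, 3))  # -> 0.49
```

Repeating this across all periods produces a design spectrum that sits above most of the individual analyses, which is how enveloping the uncertainty drives the design basis upward.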

Brouns, Thomas M.; Rohay, Alan C.; Reidel, Steve; Gardner, Martin G.

2007-02-27

28

Displacement-Based Seismic Design of Structures  

Microsoft Academic Search

The concept of designing structures to achieve a specified performance limit state defined by strain or drift limits was first introduced, in New Zealand, in 1993. Over the following years, and in particular the past five years, an intense coordinated research effort has been underway in Europe and the USA to develop the concept to the stage where it is

M. J. N. Priestley; G. M. Calvi; M. J. Kowalsky; Graham H. Powell

2008-01-01

29

Sensor placement for the analysis of seismic surface waves: sources of error, design criterion and array design algorithms  

NASA Astrophysics Data System (ADS)

Seismic surface waves can be measured by deploying an array of seismometers on the surface of the earth. The goal of such measurement surveys is, usually, to estimate the velocity of propagation and the direction of arrival of the seismic waves. In this paper, we address the issue of sensor placement for the analysis of seismic surface waves from ambient vibration wavefields. First, we explain in detail how the array geometry affects the mean-squared estimation error of parameters of interest, such as the velocity and direction of propagation, both at low and high signal-to-noise ratios (SNRs). Secondly, we propose a cost function suitable for the design of the array geometry with particular focus on the estimation of the wavenumber of both Love and Rayleigh waves. Thirdly, we present and compare several computational approaches to minimize the proposed cost function. Numerical experiments verify the effectiveness of our cost function and resulting array geometry designs, leading to greatly improved estimation performance in comparison to arbitrary array geometries, both at low and high SNR levels.
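How array geometry affects wavenumber estimation can be made concrete by evaluating an array's response to a plane wave: a sharp main lobe and low sidelobes mean smaller estimation error. The cross-shaped sensor layout below is an arbitrary example, not one of the optimized designs from the paper.

```python
import cmath

# Sketch of a planar array's response to a plane wave with wavenumber
# vector (kx, ky); sensor coordinates are an arbitrary example, not an
# optimized geometry from the paper.

def array_response(positions, kx, ky):
    """|sum_n exp(i k . r_n)|^2 / N^2 for sensor positions in metres."""
    s = sum(cmath.exp(1j * (kx * x + ky * y)) for x, y in positions)
    return abs(s) ** 2 / len(positions) ** 2

# A small cross-shaped array:
pos = [(0, 0), (10, 0), (-10, 0), (0, 10), (0, -10)]

print(array_response(pos, 0.0, 0.0))            # -> 1.0 (main lobe at k = 0)
print(round(array_response(pos, 0.3, 0.0), 3))  # a sidelobe level
```

A design cost function like the one proposed in the paper would, in effect, search over sensor positions to push such sidelobe levels down over the wavenumber range of interest.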

Maranò, Stefano; Fäh, Donat; Lu, Yue M.

2014-06-01

30

Use of process monitoring for verifying facility design of large-scale reprocessing plants  

SciTech Connect

During the decade of the 1990s, the International Atomic Energy Agency (IAEA) faces the challenge of implementing safeguards in large, new reprocessing facilities. The Agency will be involved in the design, construction, checkout and initial operation of these new facilities to ensure effective safeguards are implemented. One aspect of the Agency involvement is in the area of design verification. The United States Support Program has initiated a task to develop methods for applying process data collection and validation during the cold commissioning phase of plant construction. This paper summarizes the results of this task. 14 refs., 1 tab.

Hakkila, E.A.; Zack, N.R. (Los Alamos National Lab., NM (USA)); Ehinger, M.H. (Oak Ridge National Lab., TN (USA)); Franssen, F. (International Atomic Energy Agency, Vienna (Austria))

1991-01-01

31

Seismic design technology for Breeder Reactor structures. Volume 3: special topics in reactor structures  

SciTech Connect

This volume is divided into six chapters: analysis techniques, equivalent damping values, probabilistic design factors, design verifications, equivalent response cycles for fatigue analysis, and seismic isolation. (JDB)

Reddy, D.P. (ed)

1983-04-01

32

A verified design of a fault-tolerant clock synchronization circuit: Preliminary investigations  

NASA Technical Reports Server (NTRS)

Schneider demonstrates that many fault tolerant clock synchronization algorithms can be represented as refinements of a single proven correct paradigm. Shankar provides a mechanical proof that Schneider's schema achieves Byzantine fault tolerant clock synchronization provided that 11 constraints are satisfied. Some of the constraints are assumptions about physical properties of the system and cannot be established formally. Proofs are given that the fault tolerant midpoint convergence function satisfies three of the constraints. A hardware design is presented, implementing the fault tolerant midpoint function, which is shown to satisfy the remaining constraints. The synchronization circuit will recover completely from transient faults provided the maximum fault assumption is not violated. The initialization protocol for the circuit also provides a recovery mechanism from total system failure caused by correlated transient faults.
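The fault-tolerant midpoint convergence function mentioned above is simple to state: discard the f largest and f smallest clock readings, then take the midpoint of the extremes that remain. A minimal sketch (the numeric readings are invented for illustration):

```python
# Sketch of the fault-tolerant midpoint convergence function used in
# Schneider-style clock synchronization: with at most f faulty clocks
# among n >= 3f + 1, the result stays within the range of the good
# clocks' readings.

def ft_midpoint(readings, f):
    assert len(readings) >= 3 * f + 1, "need n >= 3f + 1 readings"
    # Drop the f smallest and f largest readings, which bounds the
    # influence of up to f arbitrarily faulty (Byzantine) clocks.
    trimmed = sorted(readings)[f:len(readings) - f]
    return (trimmed[0] + trimmed[-1]) / 2

# Four clocks, one Byzantine (f = 1) reporting a wildly wrong value:
clocks = [100.2, 100.4, 100.3, 999.0]
print(round(ft_midpoint(clocks, 1), 2))  # -> 100.35
```

The hardware design described in the record implements this same trim-and-midpoint computation in logic rather than software.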

Miner, Paul S.

1992-01-01

33

A New Design of Seismic Stations Deployed in South Tyrol  

NASA Astrophysics Data System (ADS)

When designing the seismic network in South Tyrol, the seismic service of Austria and the Civil defense in South Tyrol combined more than 10 years of experience in running seismic networks and private communication systems. In recent years a data return rate of > 99% and network uptime of > 99% have been achieved through the combination of high-quality station design and equipment and the use of the Antelope data acquisition and processing software, which comes with a suite of network monitoring and alerting tools including Nagios. The new Data Center is located in the city of Bolzano and is connected to the other Data Centers in Austria, Switzerland, and Italy for data backup purposes. Each Data Center also uses a redundant communication system in case the primary system fails. When designing the South Tyrol network, improvements were made in seismometer installation, grounding, lightning protection and data communications in order to improve the quality of the recorded data as well as network uptime and data return. The 12 new stations are equipped with six-channel Q330 + PB14f data loggers connected to STS-2 and EpiSensor sensors. One of the key achievements was the grounding concept for the whole seismic station: aluminum boxes were introduced that provide Faraday-cage isolation. Lightning protection devices are used for the equipment inside the aluminum housing where the seismometer and data logger are placed. For the seismometer cables a special shielding was introduced. The broadband seismometer and strong-motion sensor are placed on a thick glass plate and are therefore isolated from the ground. Precise seismometer orientation is achieved by a special groove on the glass plate, and in case of a strong earthquake the seismometer is tied down to the base plate. Temperature stability is achieved by styrofoam sheets inside the seismometer's aluminum protection box.

Melichar, P.; Horn, N.

2007-05-01

34

RCC for seismic design. [Roller-Compacted Concrete  

SciTech Connect

This article describes how the use of roller-compacted concrete is saving $10 million on the seismic retrofit of Southern California's historic multiple-arch Littlerock Dam. Throughout its 70-year existence, the Littlerock Dam in Southern California's Angeles National Forest has faced a persistent question: situated near the San Andreas Fault, could this 28-arch dam withstand any major movement from that fault line, much less "the big one"? Working with the state's Division of Safety of Dams, Woodward-Clyde Consultants, Oakland, Calif., performed stability and stress analyses to find the answer. The evaluation showed that, as feared, the dam failed to meet required seismic safety criteria, principally due to its lack of lateral stability, a deficiency inherent in multiple-arch dams. To provide adequate seismic stability, the authors developed a rehabilitation design centered around the use of roller-compacted concrete (RCC) to construct a gravity section between and around the downstream portions of the existing buttresses. The authors also proposed that the arches be resurfaced and stiffened with steel-fiber-reinforced silica-fume concrete. The alternative design would have required filling the arch bays between the buttresses with mass concrete at a cost of $22.5 million. The RCC buttress repair construction, scheduled for completion this fall, will cost about $13 million.

Wong, N.C.; Forrest, M.P.; Lo, S.H. (Woodward-Clyde Consultants, Oakland, CA (United States))

1994-09-01

35

Study of seismic design bases and site conditions for nuclear power plants  

SciTech Connect

This report presents the results of an investigation of four topics pertinent to the seismic design of nuclear power plants: Design accelerations by regions of the continental United States; review and compilation of design-basis seismic levels and soil conditions for existing nuclear power plants; regional distribution of shear wave velocity of foundation materials at nuclear power plant sites; and technical review of surface-founded seismic analysis versus embedded approaches.

Not Available

1980-04-01

36

Design and application of an electromagnetic vibrator seismic source  

USGS Publications Warehouse

Vibrational seismic sources frequently provide a higher-frequency seismic wavelet (and therefore better resolution) than other sources, and can provide a superior signal-to-noise ratio in many settings. However, they are often prohibitively expensive for lower-budget shallow surveys. In order to address this problem, I designed and built a simple but effective vibrator source for about one thousand dollars. The "EMvibe" is an inexpensive electromagnetic vibrator that can be built with easy-to-machine parts and off-the-shelf electronics. It can repeatably produce pulse and frequency-sweep signals in the range of 5 to 650 Hz, and provides sufficient energy for recording at offsets up to 20 m. Analysis of frequency spectra shows that the EMvibe provides a broader frequency range than the sledgehammer at offsets up to ~10 m in data collected at a site with soft sediments in the upper several meters. The EMvibe offers a high-resolution alternative to the sledgehammer for shallow surveys. It is well-suited to teaching applications, and to surveys requiring a precisely-repeatable source signature.

Haines, S. S.

2006-01-01

37

Design Of Bridges For Non Synchronous Seismic Motion  

SciTech Connect

This paper aims to develop and validate structural design criteria which account for the effects of earthquake spatial variability. In past works [1, 2] the two simplest forms of this problem were dealt with: differential displacements between two points belonging to the soil or to two single-degree-of-freedom structures. Seismic action was defined according to EC8 [3]; the structures were assumed to be linear elastic SDOF oscillators. Although this problem may seem trivial, existing code models appeared improvable in this respect. For the differential displacements of two points on the ground, these results are now validated and generalized using the newly developed response spectra contained in the new Italian seismic code [4]; the resulting code formulation is presented. Next, the problem of statistically defining the differential displacement among any number of points on the ground (which is needed for continuous-deck bridges) is approached, and some preliminary results are shown. It is also shown that current code rules (e.g., EC8) may be improved in this respect.

Nuti, Camillo [Dipartimento di Strutture, Dis, Universita di Roma 3, Via Segre 4-6, 00146, Roma (Italy); Vanzi, Ivo [Dipartimento di Progettazione, Pricos, Universita di Chieti, Viale Pindaro 42, 65127, Pescara (Italy)

2008-07-08

38

Seismic Analysis Issues in Design Certification Applications for New Reactors  

SciTech Connect

The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

Miranda, M.; Morante, R.; Xu, J.

2011-07-17

39

Assessment of the impact of degraded shear wall stiffnesses on seismic plant risk and seismic design loads  

SciTech Connect

Test results sponsored by the USNRC have shown that reinforced shear wall (Seismic Category I) structures exhibit stiffnesses and natural frequencies which are smaller than those calculated in the design process. The USNRC has sponsored Sandia National Labs to perform an evaluation of the effects of the reduced frequencies on several existing seismic PRAs in order to determine the seismic risk implications inherent in these test results. This report presents the results for the re-evaluation of the seismic risk for three nuclear power plants: the Peach Bottom Atomic Power Station, the Zion Nuclear Power Plant, and Arkansas Nuclear One -- Unit 1 (ANO-1). Increases in core damage frequencies for seismic initiated events at Peach Bottom were 25 to 30 percent (depending on whether LLNL or EPRI hazard curves were used). At the ANO-1 site, the corresponding increases in plant risk were 10 percent (for each set of hazard curves). Finally, at Zion, there was essentially no change in the computed core damage frequency when the reduction in shear wall stiffness was included. In addition, an evaluation of deterministic "design-like" structural dynamic calculations with and without the shear stiffness reductions was made. Deterministic loads calculated for these two cases typically increased on the order of 10 to 20 percent for the affected structures.

Klamerus, E.W.; Bohn, M.P. [Sandia National Labs., Albuquerque, NM (United States); Johnson, J.J.; Asfura, A.P.; Doyle, D.J. [EQE Engineering, Inc., San Francisco, CA (United States)

1994-02-01

40

Performance-based design of reinforced concrete buildings subjected to seismic forces  

E-print Network

An approach for evaluating reinforced concrete structural frame systems subjected to seismic forces under the framework of performance-based design methodology was developed. The method integrates the design criteria according...

Kalghatgi, Nikhil S.

2012-06-07

41

Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)  

NASA Astrophysics Data System (ADS)

The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early 1990s, the seismic engineering community has recognised numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings.
The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value, (ii) structural analysis to estimate the global structural response, given a certain value of seismic intensity, (iii) damage analysis, in which fragility functions are used to express the probability that a building component exceeds a damage state, as a function of the global structural response, (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would have to go through a massive learning curve in order to apply the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from what are largely deterministic seismic design procedures in current codes, to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
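The four-stage chain described above (hazard, structural response, fragility, loss) can be sketched end to end as a toy computation. All numbers, the lognormal fragility parameters, and the single-component repair-cost model below are illustrative placeholders, not values from the lecture:

```python
import math

# Stage (i): hazard — mean annual rate of exceeding each intensity level.
intensities = [0.1, 0.3, 0.5]          # spectral acceleration (g), toy values
hazard_rate = [0.02, 0.004, 0.0005]    # lambda(IM > im)

def drift_given_im(im):
    # Stage (ii): structural analysis — median interstory drift vs intensity.
    return 0.02 * im

def p_damage(drift):
    # Stage (iii): lognormal fragility for one component damage state.
    median, beta = 0.005, 0.4
    return 0.5 * (1 + math.erf(math.log(drift / median) / (beta * math.sqrt(2))))

repair_cost = 50_000  # Stage (iv): repair cost if the damage state is reached.

# Expected annual loss: sum over discrete hazard-rate increments.
rates = [r1 - r2 for r1, r2 in zip(hazard_rate, hazard_rate[1:] + [0.0])]
eal = sum(dr * p_damage(drift_given_im(im)) * repair_cost
          for im, dr in zip(intensities, rates))
print(f"expected annual loss ≈ ${eal:,.0f}")
```

A real PBEE loss analysis integrates over continuous hazard curves and many components; the discrete three-point sum here only illustrates how the four stages compose.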

Sullivan, T. J.

2012-04-01

42

Engineering Seismic Base Layer for Defining Design Earthquake Motion  

SciTech Connect

The common engineering assumption that the incident wave is uniform across a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, which indicates that the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface simultaneously (continuous analysis) differs from the one obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and the predominant periods at higher order do not match between the upper and lower ground in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which indicates that the conventional engineering seismic base layer does not work as expected by the term 'base'. All these results indicate that the wave that goes down to the deep depths after reflecting in the surface layer and again reflects at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.
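The abstract's central observation, a nonzero reflected wave at the base layer, follows directly from the impedance contrast across the interface. A minimal sketch of that relationship (the densities and velocities are illustrative, not from the paper):

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient at the interface between a
    surface layer (1) and the layer below it (2), from the impedance contrast."""
    z1, z2 = rho1 * v1, rho2 * v2   # acoustic impedances (density * velocity)
    return (z2 - z1) / (z2 + z1)

# Soft surface layer (1800 kg/m^3, 300 m/s) over a stiffer base layer:
r = reflection_coefficient(rho1=1800, v1=300, rho2=2100, v2=700)
print(f"reflection coefficient = {r:.3f}")  # nonzero, so energy is reflected back up
```

Any realistic impedance contrast gives a nonzero coefficient, which is why treating the engineering base layer as a perfectly absorbing 'base' in a separate analysis misses part of the surface response.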

Yoshida, Nozomu [Department of Civil and Environmental Engineering, Tohoku Gakuin University, Tagajo 1-13-1, Miyagi (Japan)

2008-07-08

43

USP Verified Dietary Supplements  

MedlinePLUS

USP Verified dietary supplements are products that have ... it means. Where to find USP Verified dietary supplements: view USP Verified products and where they can ...

44

Technical Basis for Certification of Seismic Design Criteria for the Waste Treatment Plant, Hanford, Washington  

SciTech Connect

In August 2007, Secretary of Energy Samuel W. Bodman approved the final seismic and ground motion criteria for the Waste Treatment and Immobilization Plant (WTP) at the Department of Energy's (DOE) Hanford Site. Construction of the WTP began in 2002 based on seismic design criteria established in 1999 and a probabilistic seismic hazard analysis completed in 1996. The design criteria were reevaluated in 2005 to address questions from the Defense Nuclear Facilities Safety Board (DNFSB), resulting in an increase by up to 40% in the seismic design basis. DOE announced in 2006 the suspension of construction on the pretreatment and high-level waste vitrification facilities within the WTP to validate the design with more stringent seismic criteria. In 2007, the U.S. Congress mandated that the Secretary of Energy certify the final seismic and ground motion criteria prior to expenditure of funds on construction of these two facilities. With the Secretary's approval of the final seismic criteria in the summer of 2007, DOE authorized restart of construction of the pretreatment and high-level waste vitrification facilities. The technical basis for the certification of seismic design criteria resulted from a two-year Seismic Boreholes Project that planned, collected, and analyzed geological data from four new boreholes drilled to depths of approximately 1400 feet below ground surface on the WTP site. A key uncertainty identified in the 2005 analyses was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The absence of directly-measured seismic shear wave velocities in the sedimentary interbeds resulted in the use of a wider and more conservative range of velocities in the 2005 analyses. The Seismic Boreholes Project was designed to directly measure the velocities and velocity contrasts in the basalts and sediments below the WTP, reanalyze the ground motion response, and assess the level of conservatism in the 2005 seismic design criteria. 
The characterization and analysis effort included 1) downhole measurements of the velocity properties (including uncertainties) of the basalt/interbed sequences, 2) confirmation of the geometry of the contact between the various basalt and interbedded sediments through examination of retrieved core from the core-hole and data collected through geophysical logging of each borehole, and 3) prediction of ground motion response to an earthquake using newly acquired and historic data. The data and analyses reflect a significant reduction in the uncertainty in shear wave velocities below the WTP and result in a significantly lower spectral acceleration (i.e., ground motion). The updated ground motion response analyses and corresponding design response spectra reflect a 25% lower peak horizontal acceleration than reflected in the 2005 design criteria. These results provide confidence that the WTP seismic design criteria are conservative. (authors)

Brouns, T.M.; Rohay, A.C. [Pacific Northwest National Laboratory, Richland, WA (United States); Youngs, R.R. [Geomatrix Consultants, Inc., Oakland, CA (United States); Costantino, C.J. [C.J. Costantino and Associates, Valley, NY (United States); Miller, L.F. [U.S. Department of Energy, Office of River Protection, Richland, WA (United States)

2008-07-01

45

Development of Guidelines for Incorporation of Vertical Ground Motion Effects in Seismic Design of Highway Bridges.  

National Technical Information Service (NTIS)

This study was undertaken with the objective of assessing the current provisions in SDC-2006 for incorporating vertical effects of ground motions in seismic evaluation and design of ordinary highway bridges. A comprehensive series of simulations was carri...

E. Erduran, N. Abrahamson, S. K. Kunnath, Y. H. Chai, Z. Yilmaz

2008-01-01

46

Overcoming barriers to high performance seismic design using lessons learned from the green building industry  

NASA Astrophysics Data System (ADS)

NEHRP's Provisions currently govern conventional seismic-resistant design. Though these provisions ensure the life-safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) is currently facing.

Glezil, Dorothy

47

Design and implementation of telemetry seismic data acquisition system based on embedded P2P Ethernet  

NASA Astrophysics Data System (ADS)

A new design of a telemetry seismic data acquisition system is presented which uses embedded, point-to-point (P2P) Ethernet networks. We explain the idea and motivation behind the use of a P2P Ethernet topology and show the problems that arise when such a topology is used in a seismic acquisition system. The paper focuses on the network protocols we developed, which include route-table generation and dynamic IP address management. The design has been implemented on ARM and FPGA hardware, and we have tested it in the laboratory and in field seismic exploration.
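In a point-to-point chain, each station can reach the recorder only through its upstream neighbor, so route-table generation reduces to mapping every node to the adjacent node toward the head of the line. A toy sketch of that idea (the function and node names are hypothetical, not the paper's protocol):

```python
def build_route_table(chain):
    """Next-hop route table for a daisy-chained P2P acquisition line.

    chain[0] is the central recorder; every other node's next hop toward
    the recorder is simply its upstream neighbor in the chain.
    """
    table = {}
    for i, node in enumerate(chain):
        table[node] = chain[i - 1] if i > 0 else None  # recorder has no next hop
    return table

print(build_route_table(["recorder", "st1", "st2", "st3"]))
# {'recorder': None, 'st1': 'recorder', 'st2': 'st1', 'st3': 'st2'}
```

The real system must also handle dynamic IP assignment and link failures, which this static sketch omits.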

Zhang, L.; Lin, J.; Chen, Z.

2011-12-01

48

SEISMIC DESIGN REQUIREMENTS SELECTION METHODOLOGY FOR THE SLUDGE TREATMENT & M-91 SOLID WASTE PROCESSING FACILITIES PROJECTS  

SciTech Connect

In complying with direction from the U.S. Department of Energy (DOE), Richland Operations Office (RL) (07-KBC-0055, 'Direction Associated with Implementation of DOE-STD-1189 for the Sludge Treatment Project,' and 08-SED-0063, 'RL Action on the Safety Design Strategy (SDS) for Obtaining Additional Solid Waste Processing Capabilities (M-91 Project) and Use of Draft DOE-STD-1189-YR'), it has been determined that the seismic design requirements currently in the Project Hanford Management Contract (PHMC) will be modified by DOE-STD-1189, Integration of Safety into the Design Process (March 2007 draft), for these two key PHMC projects. Seismic design requirements for other PHMC facilities and projects will remain unchanged. Considering the current early Critical Decision (CD) phases of both the Sludge Treatment Project (STP) and the Solid Waste Processing Facilities (M-91) Project and a strong intent to avoid potentially costly re-work of both engineering and nuclear safety analyses, this document describes how Fluor Hanford, Inc. (FH) will maintain compliance with the PHMC by considering both the current seismic standards referenced by DOE O 420.1B, Facility Safety, and draft DOE-STD-1189 (i.e., ASCE/SEI 43-05, Seismic Design Criteria for Structures, Systems, and Components in Nuclear Facilities, and ANSI/ANS 2.26-2004, Categorization of Nuclear Facility Structures, Systems and Components for Seismic Design, as modified by draft DOE-STD-1189) to choose the criteria that will result in the most conservative seismic design categorization and engineering design. Following the process described in this document will result in a conservative seismic design categorization and design products. This approach is expected to resolve discrepancies between the existing and new requirements and reduce the risk that project designs and analyses will require revision when the draft DOE-STD-1189 is finalized.

RYAN GW

2008-04-25

49

Design, manufacturing and evaluation of the performance of steel like fiber reinforced elastomeric seismic isolators  

Microsoft Academic Search

In this research, specimens of fiber reinforced elastomeric seismic isolators have been designed, manufactured and their dynamic and mechanical characteristics have then been studied by performing vertical and horizontal (compression–shear) tests. For the sake of comparison, one steel reinforced elastomeric isolator specimen has also been designed, manufactured and subjected to similar tests. In design of fiber reinforced isolators, the tensile

Ghasem Dehghani Ashkezari; Ali Akbar Aghakouchak; Mehrdad Kokabi

2008-01-01

50

Seismic design factors for RC special moment resisting frames in Dubai, UAE  

NASA Astrophysics Data System (ADS)

This study investigates the seismic design factors for three reinforced concrete (RC) framed buildings with 4, 16 and 32 stories in Dubai, UAE, utilizing nonlinear analysis. The buildings are designed according to the response spectrum procedure defined in the 2009 International Building Code (IBC'09). Two ensembles of ground motion records with 10% and 2% probability of exceedance in 50 years (10/50 and 2/50, respectively) are used. The nonlinear dynamic responses to the earthquake records are computed using IDARC-2D. Key seismic design parameters are evaluated, namely: the response modification factor (R), the deflection amplification factor (Cd), the system overstrength factor (Ωo), and the response modification factor for ductility (Rd), in addition to inelastic interstory drift. The evaluated seismic design factors are found to depend significantly on the considered ground motion (10/50 versus 2/50). Consequently, resolution of the controversy over Dubai seismicity is urged. The seismic design factors for the 2/50 records show an increase over their counterparts for the 10/50 records in the range of 200%-400%, except for the Ωo factor, which shows a mere 30% increase. Based on the observed trends, period-dependent R and Cd factors are recommended if consistent collapse probability (or collapse prevention performance) in moment frames with varying heights is to be expected.
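The factors evaluated in the study have standard code definitions relating the elastic, yield, and design base shears and the drifts. A minimal sketch of those relationships (the function name and the numerical values are illustrative only, not results from the study):

```python
def seismic_design_factors(v_elastic, v_yield, v_design, d_inelastic, d_elastic_design):
    """Code-style seismic design factors from analysis results, using the
    standard definitions:
      R  = V_e / V_d   (response modification)
      Ωo = V_y / V_d   (system overstrength)
      Rd = V_e / V_y   (ductility reduction), so that R = Rd * Ωo
      Cd = Δ_inelastic / Δ_elastic_design   (deflection amplification)
    """
    R = v_elastic / v_design
    omega_o = v_yield / v_design
    Rd = v_elastic / v_yield
    Cd = d_inelastic / d_elastic_design
    return R, omega_o, Rd, Cd

# Illustrative base shears (kN) and roof drifts (mm):
R, omega_o, Rd, Cd = seismic_design_factors(
    v_elastic=12_000, v_yield=4_800, v_design=1_500,
    d_inelastic=300, d_elastic_design=60)
print(R, omega_o, Rd, Cd)  # → 8.0 3.2 2.5 5.0
```

The identity R = Rd · Ωo is what lets studies like this one report ductility and overstrength contributions separately.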

Alhamaydeh, Mohammad; Abdullah, Sulayman; Hamid, Ahmed; Mustapha, Abdilwahhab

2011-12-01

51

Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering  

NASA Astrophysics Data System (ADS)

With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. 
This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.

Filiatrault, Andre; Sullivan, Timothy

2014-08-01

52

Seismic design repair and retrofit strategies for steel roof deck diaphragms  

Microsoft Academic Search

Structural engineers will often rely on the roof diaphragm to transfer lateral seismic loads to the bracing system of single-storey structures. The implementation of capacity-based design in the NBCC 2005 has caused an increase in the diaphragm design load due to the need to use the probable capacity of the bracing system, thus resulting in thicker decks, closer connector patterns

John-Edward Franquet

2010-01-01

53

Seismic responses of a pool-type fast reactor with different core support designs  

SciTech Connect

In designing the core support system for a pool-type fast reactor, there are many issues which must be considered in order to achieve an optimum and balanced design. These issues include safety, reliability, as well as costs. Several design options are possible to support the reactor core. Different core support options yield different frequency ranges and responses. Seismic responses of a large pool-type fast reactor incorporated with different core support designs have been investigated. 4 refs., 3 figs.

Wu, Ting-shu; Seidensticker, R.W. (Argonne National Lab., IL (USA))

1989-01-01

54

Seismic design, analysis, and testing of the HTGR MK-IVA steam generator  

SciTech Connect

The paper addresses the design of the helically coiled economizer-evaporator-superheater (EES) portion of the high-temperature gas-cooled reactor (HTGR) MK-IVA steam generator. The MK-IVA steam generator must sustain seismic loads resulting from ground level accelerations up to 0.3 g, safe shutdown earthquake (SSE), for an envelope of varying soil conditions. This presents a challenging design problem, because the high-temperature structure must have sufficient flexibility to accommodate thermal growth and differential expansions between heat transfer tubing and support structures, yet incorporate sufficient rigidity in those structural elements which transmit seismic loads.

Orr, J.D.; Schleicher, R.W.; Tong, K.

1982-08-01

55

Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington  

SciTech Connect

The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and the adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra.
Because of the sensitivity of the calculated response spectra to the velocity contrasts between the basalts and interbedded sediments, DOE initiated an effort to emplace additional boreholes at the WTP site and obtain direct Vs measurements and other physical property measurements in these layers. One core-hole and three boreholes have been installed at the WTP site to a maximum depth of 1468 ft (447 m) below ground surface. The three boreholes are within 500 ft (152 m) of and surrounding the high level waste vitrification and pretreatment facilities of the WTP, which were the Performance Category 3 (PC-3) structures affected by the interim design spectra. The core-hole is co-located with the borehole closest to the two PC-3 structures. These new measurements are expected to reduce the uncertainty in the modeled site response that is caused by the lack of direct knowledge of the Vs contrasts within these layers. (authors)
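The Vp-to-Vs estimation described in this abstract can be sketched as a simple bounded conversion. The Vp/Vs ratio range and the example Vp value below are illustrative assumptions, not values from the study.

```python
# A minimal sketch: bound the shear-wave velocity (Vs) of an interbed from
# an older compressional-wave velocity (Vp) measurement using an assumed
# Vp/Vs ratio range, as done for the Ellensburg Formation interbeds.
# The ratio bounds and example Vp are hypothetical.

def vs_bounds(vp, vp_vs_low=1.6, vp_vs_high=2.2):
    """Return (vs_min, vs_max) in the same units as vp."""
    return vp / vp_vs_high, vp / vp_vs_low

vs_min, vs_max = vs_bounds(2000.0)  # hypothetical interbed Vp of 2000 m/s
print(f"Vs plausibly between {vs_min:.0f} and {vs_max:.0f} m/s")
```

A wider assumed ratio range translates directly into a wider Vs band, which is the uncertainty the new borehole measurements were meant to collapse.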

Brouns, T.M.; Rohay, A.C.; Reidel, S.P. [Pacific Northwest National Laboratory, Richland, WA (United States); Gardner, M.G. [EnergySolutions, Richland, WA (United States)

2007-07-01

56

Verifiably Secure Devices  

E-print Network

We put forward the notion of a verifiably secure device, in essence a stronger notion of secure computation, and achieve it in the ballot-box model. Verifiably secure devices (1) provide a perfect solution to the problem of ...

Lepinski, Matt

2007-12-05

57

Recommended revisions to Nuclear Regulatory Commission seismic design criteria. Technical report  

Microsoft Academic Search

This report recommends changes in the Nuclear Regulatory Commission's (NRC's) criteria now used in the seismic design of nuclear power plants. Areas covered include ground motion, soil-structure interaction, structures, and equipment and components. Members of the Engineering Mechanics Section of the Nuclear Test Engineering Division at Lawrence Livermore Laboratory (LLL) generally agreed upon the recommendations, which are based on (1)

Coats

1980-01-01

58

Optimal design of steel frames subject to gravity and seismic codes' prescribed lateral forces  

Microsoft Academic Search

Allowable stress design of two-dimensional braced and unbraced steel frames based on AISC specifications subject to gravity and seismic lateral forces is formulated as a structural optimization problem. The nonlinear constrained minimization algorithm employed is the feasible directions method. The objective function is the weight of the structure, and behaviour constraints include combined bending and axial stress, shear stress, buckling,

A. M. Memari; M. Madhkhan

1999-01-01

59

Risk-Targeted versus Current Seismic Design Maps for the Conterminous United States  

USGS Publications Warehouse

The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), and in the International Building Code (ICC, 2006/2003/2000) and ASCE Standard 7-05 (ASCE, 2005a), provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the capacity against collapse of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of its collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), herein we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
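The collapse-probability reasoning in this abstract can be sketched numerically: convolve a site hazard curve with a lognormal collapse fragility to obtain an annual collapse frequency, then convert to a 50-year probability. The hazard-curve shape and fragility parameters below are illustrative assumptions, not values from the paper.

```python
import math

# Sketch of the risk integration behind risk-targeted ground motions:
# annual collapse frequency = sum over acceleration a of
#   P(collapse | a) * (frequency of motions falling in [a, a + da]).
# Hazard curve and fragility parameters are made-up placeholders.

def lognormal_cdf(x, median, beta):
    """CDF of a lognormal distribution with given median and log-std beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_collapse_freq(hazard, fragility_median, beta=0.6, n=2000):
    """Numerically convolve a hazard curve with a lognormal fragility."""
    a_lo, a_hi = 0.01, 5.0
    da = (a_hi - a_lo) / n
    total = 0.0
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        d_haz = hazard(a) - hazard(a + da)   # motions in this acceleration bin
        total += lognormal_cdf(a, fragility_median, beta) * d_haz
    return total

hazard = lambda a: 1e-4 * a ** -2.0          # hypothetical power-law hazard curve
freq = annual_collapse_freq(hazard, fragility_median=1.0)
p_collapse_50yr = 1.0 - math.exp(-50.0 * freq)
print(f"50-year collapse probability: {p_collapse_50yr:.2e}")
```

Under this model, two sites whose hazard curves share the same 2%-in-50-year value but differ in shape yield different collapse probabilities, which is exactly the non-uniformity the risk-targeted maps are meant to remove.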

Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

2007-01-01

60

Design methodologies for the seismic retrofitting of bridges  

E-print Network

This paper formulates an earthquake design strategy for bridges. Earthquakes can cause extreme economic damage and loss of life. Structural engineers must be conscious of earthquake slip type, earthquake proximity, and ...

Otenti, Alexander A. (Alexander Alfred), 1981-

2004-01-01

61

Publicly Verifiable Secret Sharing  

Microsoft Academic Search

A secret sharing scheme allows a secret to be shared among several participants such that only certain groups of them can recover it. Verifiable secret sharing has been proposed to achieve security against cheating participants. Its first realization had the special property that everybody, not only the participants, can verify that the shares are correctly distributed. We will call such

Markus Stadler

1996-01-01

62

Verifying Diagnostic Software  

NASA Technical Reports Server (NTRS)

Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

Lindsey, Tony; Pecheur, Charles

2004-01-01

63

Recent advances in the Lesser Antilles observatories. Part 1: Seismic Data Acquisition Design based on EarthWorm and SeisComP

E-print Network

Part 1: Seismic data acquisition design based on EarthWorm and SeisComP, two packages widely used in the observatories community. The first is renowned for its ability to process real time ...

Beauducel, François

64

Architecture for Verifiable Software  

NASA Technical Reports Server (NTRS)

Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

Reinholtz, William; Dvorak, Daniel

2005-01-01

65

Probabilistic seismic hazard characterization and design parameters for the Pantex Plant  

SciTech Connect

The Hazards Mitigation Center at Lawrence Livermore National Laboratory (LLNL) updated the seismic hazard and design parameters at the Pantex Plant. The probabilistic seismic hazard (PSH) estimates were first updated using the latest available data and knowledge from LLNL (1993, 1998), Frankel et al. (1996), and other relevant recent studies from several consulting companies. Special attention was given to account for the local seismicity and for the system of potentially active faults associated with the Amarillo-Wichita uplift. Aleatory (random) uncertainty was estimated from the available data and the epistemic (knowledge) uncertainty was taken from results of similar studies. Special attention was given to soil amplification factors for the site. Horizontal Peak Ground Acceleration (PGA) and 5% damped uniform hazard spectra were calculated for six return periods (100 yr., 500 yr., 1000 yr., 2000 yr., 10,000 yr., and 100,000 yr.). The design parameters were calculated following DOE standards (DOE-STD-1022 to 1024). Response spectra for design or evaluation of Performance Category 1 through 4 structures, systems, and components are presented.
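The return periods quoted above map to exceedance probabilities through the usual Poisson-occurrence assumption; a small sketch of that conversion (the 2475-year case corresponds to the familiar 2%-in-50-years level):

```python
import math

# Sketch: converting a hazard return period to an exceedance probability
# over an exposure time, assuming Poisson earthquake occurrence.

def exceedance_prob(return_period_yr, exposure_yr=50.0):
    """P(at least one exceedance during exposure_yr) for the return period."""
    annual_rate = 1.0 / return_period_yr
    return 1.0 - math.exp(-annual_rate * exposure_yr)

# the six return periods used for the Pantex uniform hazard spectra
for T in (100, 500, 1000, 2000, 10_000, 100_000):
    print(f"{T:>7} yr return period -> {exceedance_prob(T):.4f} in 50 yr")
```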

Bernreuter, D. L.; Foxall, W.; Savy, J. B.

1998-10-19

66

1-G model tests and hollow cylindrical torsional shear experiments on seismic residual displacements of fill dams from the viewpoint of seismic performance-based design  

Microsoft Academic Search

This paper concerns technological efforts for the general acceptance of performance-based seismic design principle of geotechnical structures. Among many problems to be solved, a particular emphasis was placed on the prediction of residual displacement that remains after a strong earthquake. Because of the complicated behavior of soils undergoing cyclic loading, the prediction is often either complicated/costly or not very accurate.

Seda Sendir Torisu; Junichi Sato; Ikuo Towhata; Tsuyoshi Honda

2010-01-01

67

IMPLEMENTATION OF THE SEISMIC DESIGN CRITERIA OF DOE-STD-1189-2008 APPENDIX A [FULL PAPER  

SciTech Connect

This paper describes the approach taken by two Fluor Hanford projects for implementing the seismic design criteria from DOE-STD-1189-2008, Appendix A. The existing seismic design criteria and the new seismic design criteria are described, and an assessment of the primary differences is provided. The gaps within the new system of seismic design criteria, which necessitate conducting portions of the work to the existing technical standards pending availability of applicable industry standards, are discussed. Two Hanford Site projects currently in the Critical Decision (CD)-1 phase of design have developed an approach to implementation of the new criteria. Calculations have been performed to determine the seismic design category for one project, based on information available in early CD-1. The potential effects of the DOE-STD-1189-2008, Appendix A seismic design criteria on the process of project alternatives analysis are discussed. Presentation of this work is expected to benefit others in the DOE Complex that may be implementing DOE-STD-1189-2008.

OMBERG SK

2008-05-14

68

Decision making with epistemic uncertainty under safety constraints: An application to seismic design  

USGS Publications Warehouse

The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project. © 2009 Elsevier Ltd. All rights reserved.

Veneziano, D.; Agarwal, A.; Karaca, E.

2009-01-01

69

Seismic design repair and retrofit strategies for steel roof deck diaphragms  

NASA Astrophysics Data System (ADS)

Structural engineers will often rely on the roof diaphragm to transfer lateral seismic loads to the bracing system of single-storey structures. The implementation of capacity-based design in the NBCC 2005 has caused an increase in the diaphragm design load due to the need to use the probable capacity of the bracing system, thus resulting in thicker decks, closer connector patterns and higher construction costs. Previous studies have shown that accounting for the in-plane flexibility of the diaphragm when calculating the overall building period can result in lower seismic forces and a more cost-efficient design. However, recent studies estimating the fundamental period of single storey structures using ambient vibration testing showed that the in-situ approximation was much shorter than that obtained using analytical means. The difference lies partially in the diaphragm stiffness characteristics which have been shown to decrease under increasing excitation amplitude. Using the diaphragm as the energy-dissipating element in the seismic force resisting system has also been investigated as this would take advantage of the diaphragm's ductility and limited overstrength; thus, lower capacity based seismic forces would result. An experimental program on 21.0m by 7.31m diaphragm test specimens was carried out so as to investigate the dynamic properties of diaphragms including the stiffness, ductility and capacity. The specimens consisted of 20 and 22 gauge panels with nailed frame fasteners and screwed sidelap connections as well as a welded and button-punched specimen. Repair strategies for diaphragms that have previously undergone inelastic deformations were devised in an attempt to restore the original stiffness and strength and were then experimentally evaluated. Strength and stiffness experimental estimations are compared with those predicted with the Steel Deck Institute (SDI) method. A building design comparative study was also completed.
This study looks at the difference in design and cost yielded by previous and current design practice with EBF braced frames. Two alternate design methodologies, where the period is not restricted by code limitations and where the diaphragm force is limited to the equivalent shear force calculated with RdRo = 1.95, are also used for comparison. This study highlights the importance of incorporating the diaphragm stiffness in design and the potential cost savings.

Franquet, John-Edward

70

Seismic design evaluation guidelines for buried piping for the DOE HLW Facilities  

SciTech Connect

This paper presents the seismic design and evaluation guidelines for underground piping for the Department of Energy (DOE) High-Level-Waste (HLW) Facilities. The underground piping includes both single and double containment steel pipes and concrete pipes with steel lining, with particular emphasis on the double containment piping. The design and evaluation guidelines presented in this paper follow the generally accepted beam-on-elastic-foundation analysis principle and the inertial response calculation method, respectively, for piping directly in contact with the soil or contained in a jacket. A standard analysis procedure is described along with the discussion of factors deemed to be significant for the design of the underground piping. The following key considerations are addressed: the design feature and safety requirements for the inner (core) pipe and the outer pipe; the effect of soil strain and wave passage; assimilation of the necessary seismic and soil data; inertial response calculation for the inner pipe; determination of support anchor movement loads; combination of design loads; and code comparison. Specifications and justifications of the key parameters used, stress components to be calculated and the allowable stress and strain limits for code evaluation are presented.
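The beam-on-elastic-foundation idealization mentioned above is governed by the characteristic parameter beta = (k / (4 E I))^(1/4); its reciprocal 1/beta sets the length over which a soil-imposed displacement decays along the pipe. A sketch with illustrative numbers, none of which are taken from the DOE guidelines:

```python
import math

# Sketch: characteristic parameter of a buried pipe modeled as a beam on an
# elastic foundation, beta = (k / (4 E I)) ** 0.25. All inputs are
# illustrative assumptions, not values from the guidelines.

E = 200e9                       # steel Young's modulus, Pa
D, t = 0.3239, 0.0095           # hypothetical 12-in pipe: OD and wall, m
I = math.pi / 64.0 * (D**4 - (D - 2.0 * t)**4)   # second moment of area, m^4
k = 5.0e6                       # assumed soil subgrade modulus, N/m per m of pipe

beta = (k / (4.0 * E * I)) ** 0.25               # 1/m
print(f"characteristic decay length 1/beta = {1.0 / beta:.1f} m")
```

Stiffer soil (larger k) shortens the decay length, concentrating bending near an imposed anchor movement; a stiffer pipe spreads it out.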

Lin, Chi-Wen [Consultant, Martinez, CA (United States); Antaki, G. [Westinghouse Savannah River Co., Aiken, SC (United States); Bandyopadhyay, K. [Brookhaven National Lab., Upton, NY (United States); Bush, S.H. [Review & Synthesis Association, Richland, WA (United States); Costantino, C. [City Univ. of New York, New York, NY (United States); Kennedy, R. [RPK Structural Mechanics, Yorba Linda, CA (United States). Consultant

1995-05-01

71

Verifying Atomic Data Types  

Microsoft Academic Search

Atomic transactions are a widely-accepted technique for organizing computation in fault-tolerant distributed systems. In most languages and systems based on transactions, atomicity is implemented through atomic objects, typed data objects that provide their own synchronization and recovery. Hence, atomicity is the key correctness condition required of a data type implementation. This paper presents a technique for verifying the correctness of

Jeannette M. Wing

1989-01-01

72

SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)  

SciTech Connect

This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a regional specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004) respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs.
The primary reasons for this difference are the greater activity rate used in contemporary models for the Charleston source zone and the proper incorporation of uncertainty and randomness in GMAMs.
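The weighted combination of alternate attenuation models described above is a simple weighted average of the hazard each model predicts. The weights are those recommended in the abstract; the exceedance frequencies below are made-up placeholders.

```python
# Sketch: combining hazard estimates from the alternate ground motion
# attenuation models (GMAMs) with the weights recommended in the abstract
# (EPRI 2004: 0.6, USGS 2002: 0.3, Silva et al. 2004: 0.1). The annual
# exceedance frequencies are hypothetical, not study values.

weights = {"EPRI2004": 0.6, "USGS2002": 0.3, "Silva2004": 0.1}
freqs = {"EPRI2004": 4.0e-4, "USGS2002": 6.0e-4, "Silva2004": 2.0e-4}

assert abs(sum(weights.values()) - 1.0) < 1e-12   # weights must sum to 1
combined = sum(w * freqs[m] for m, w in weights.items())
print(f"weighted annual exceedance frequency: {combined:.2e}")
```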

(NOEMAIL), R

2005-12-14

73

Response spectrum of seismic design code for zones lack of near-fault strong earthquake records  

Microsoft Academic Search

It was shown from a study of recent near-fault earthquake ground motions that near-fault effects were seldom considered in the existing Chinese seismic code. Referring to the UBC97 design concept for near-fault factors, based on collected worldwide free-field records of near-fault earthquake ground motions classified by earthquake magnitude and site condition, the attenuation relationship expressions of the

Xin-Le Li; Hui-Juan Dou; Xi Zhu; Jian-Gang Sun

2007-01-01

74

Implementation of seismic design and evaluation guidelines for the Department of Energy high-level waste storage tanks and appurtenances  

SciTech Connect

In the fall of 1992, a draft of the Seismic Design and Evaluation Guidelines for the Department of Energy (DOE) High-level Waste Storage Tanks and Appurtenances was issued. The guidelines were prepared by the Tanks Seismic Experts Panel (TSEP) and this task was sponsored by DOE, Environmental Management. The TSEP is comprised of a number of consultants known for their knowledge of seismic ground motion and expertise in the analysis of structures, systems and components subjected to seismic loads. The development of these guidelines was managed by staff from Brookhaven National Laboratory, Engineering Research and Applications Division, Department of Nuclear Energy. This paper describes the process used to incorporate the Seismic Design and Evaluation Guidelines for the DOE High-Level Waste Storage Tanks and Appurtenances into the design criteria for the Multi-Function Waste Tank Project at the Hanford Site. This project will design and construct six new high-level waste tanks in the 200 Areas at the Hanford Site. This paper also discusses the vehicles used to ensure compliance to these guidelines throughout Title 1 and Title 2 design phases of the project as well as the strategy used to ensure consistent and cost-effective application of the guidelines by the structural analysts. The paper includes lessons learned and provides recommendations for other tank design projects which might employ the TSEP guidelines.

Conrads, T.J.

1993-06-01

75

Exploratory Shaft Seismic Design Basis Working Group report; Yucca Mountain Project  

SciTech Connect

This report was prepared for the Yucca Mountain Project (YMP), which is managed by the US Department of Energy. The participants in the YMP are investigating the suitability of a site at Yucca Mountain, Nevada, for construction of a repository for high-level radioactive waste. An exploratory shaft facility (ESF) will be constructed to permit site characterization. The major components of the ESF are two shafts that will be used to provide access to the underground test areas for men, utilities, and ventilation. If a repository is constructed at the site, the exploratory shafts will be converted for use as intake ventilation shafts. In the context of both underground nuclear explosions (conducted at the nearby Nevada Test Site) and earthquakes, the report contains discussions of faulting potential at the site, control motions at depth, material properties of the different rock layers relevant to seismic design, the strain tensor for each of the waveforms along the shaft liners, and the method for combining the different strain components along the shaft liners. The report also describes analytic methods, assumptions used to ensure conservatism, and uncertainties in the data. The analyses show that none of the shafts' structures, systems, or components are important to public radiological safety; therefore, the shafts need only be designed to ensure worker safety, and the report recommends seismic design parameters appropriate for this purpose. 31 refs., 5 figs., 6 tabs.

Subramanian, C.V. [Sandia National Labs., Albuquerque, NM (USA); King, J.L. [Science Applications International Corp., Las Vegas, NV (USA); Perkins, D.M. [Geological Survey, Denver, CO (USA); Mudd, R.W. [Fenix and Scisson, Inc., Tulsa, OK (USA); Richardson, A.M. [Parsons, Brinckerhoff, Quade and Douglas, Inc., San Francisco, CA (USA); Calovini, J.C. [Holmes and Narver, Inc., Las Vegas, NV (USA); Van Eeckhout, E. [Los Alamos National Lab., NM (USA); Emerson, D.O. [Lawrence Livermore National Lab., CA (USA)

1990-08-01

76

AP1000{sup R} design robustness against extreme external events - Seismic, flooding, and aircraft crash  

SciTech Connect

Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima-Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000{sup R} nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance and safety. The plant's compact safety related footprint and protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shutdown the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond design basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air or HVAC).
The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel containment vessel which is further surrounded by a substantial 'steel concrete' composite shield building. The containment vessel is not affected by external flooding, and the shield building design provides hazard protection beyond that provided by a comparable reinforced concrete structure. The intent of this paper is to demonstrate the robustness of the AP1000 design against extreme events. The paper will focus on the plant's ability to withstand extreme external events such as beyond design basis flooding, seismic events, and malicious aircraft impact. The paper will highlight the robustness of the AP1000 nuclear island design including the protection provided by the unique AP1000 composite shield building. (authors)

Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J. [Westinghouse Electric Company LLC, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)

2012-07-01

77

Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system  

USGS Publications Warehouse

The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.

Page, Robert A.; Boore, D. M.; Joyner, W. B.; Coulter, H. W.

1972-01-01

78

Seismic Ecology  

NASA Astrophysics Data System (ADS)

The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibrations, strong shocks during earthquakes) or indirectly through buildings and constructions, and can be strong (felt by people) or weak (detected only by instruments). A great deal of work has been devoted to the influence of violent seismic actions (first of all, earthquakes) on people and structures. This work is devoted to the study of weak but prolonged seismic actions on buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized territories. Apart from violent earthquakes, man-caused seismic actions can exert an essential influence: explosions, seismic noise emitted by plant facilities and moving transport, radiation from high-rise buildings and constructions under wind action, etc. Materials on the increase of man-caused seismicity in a number of regions of Russia that earlier were not seismic are presented in the paper. Along with seismic microzoning maps, maps should be built indicating the variation of amplitude spectra of seismic noise within a day, months, and years. Information about the amplitudes and frequencies of oscillations from possible earthquakes and from man-caused sources in specific regions allows well-founded design and construction of industrial and civil housing projects. The construction of buildings even in regions that are not seismically dangerous can end in failure of those buildings and the heaviest consequences for people if one of their resonance frequencies coincides in magnitude with the frequency of oscillations emitted at that location by man-caused objects.
Practical examples of detailed engineering-seismological investigations of large industrial and civil projects in Siberia (hydro power stations, bridges, constructions, etc.) are given.

Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

79

Spatial correlation analysis of seismic noise for STAR X-ray infrastructure design  

NASA Astrophysics Data System (ADS)

The Italian PON MaTeRiA project is focused on the creation of a research infrastructure open to users, based on an innovative and evolutionary X-ray source. This source, named STAR (Southern Europe TBS for Applied Research), exploits the Thomson backscattering of laser radiation by fast-electron beams (Thomson Back Scattering - TBS). Its main performance figures are: X-ray photon flux of 10^9-10^10 ph/s, angular divergence variable between 2 and 10 mrad, X-ray energy continuously variable between 8 keV and 150 keV, bandwidth ΔE/E variable between 1 and 10%, and ps time-resolved structure. To achieve this performance, bunches of electrons produced by a photo-injector are accelerated to relativistic velocities by a linear accelerator section. The electron beam, a few hundred micrometers wide, is driven by magnetic fields to the interaction point along a 15 m transport line, where it is focused into a 10 micrometer-wide area. In the same area, the laser beam is focused after being transported along a 12 m structure. Ground vibrations could greatly affect the collision probability, and thus the emittance, by deviating the paths of the beams during their travel in the STAR source. Therefore, the study program to measure ground vibrations at the STAR site can be used for site characterization in relation to accelerator design. Environmental and facility noise may affect the X-ray operation, especially if the predominant wavelengths in the microtremor wavefield are much smaller than the size of the linear accelerator. For wavelengths much greater, all the accelerator parts move in phase, and therefore even large displacements cannot generate any significant effect. On the other hand, for wavelengths equal to or less than half the accelerator size, several parts could move in phase opposition, and therefore small displacements could affect its proper functioning. It is therefore important to characterize the microtremor wavefield in both the frequency and wavelength domains.
For this reason, we performed measurements of seismic noise to characterize the environmental noise at the site where the X-ray accelerator will be built. For site characterization, we carried out several passive seismic monitoring experiments at different times of day and in different weather conditions. We recorded microtremor using an array of broadband three-component seismic sensors arranged along the linear accelerator. For each measurement point, we determined the displacement, velocity and acceleration spectrograms and the power spectral density of both the horizontal and vertical components. We also determined the microtremor horizontal-to-vertical spectral ratio as a function of azimuth, to identify the main ground vibration direction and to detect any site or building resonance frequencies. We applied a rotation matrix to transform the North-South and East-West signal components into transverse and radial components with respect to the direction of the linear accelerator. Subsequently, for each pair of seismic stations we determined the coherence function to analyze the spatial correlation of the seismic noise. These analyses allowed us to exhaustively characterize the seismic noise of the study area in terms of power and space-time variability, in both frequency and wavelength.
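The component rotation described above can be sketched as a simple 2-D transform. The function name and the example azimuth below are illustrative, not taken from the study:

```python
import numpy as np

def rotate_horizontals(north, east, azimuth_deg):
    """Rotate NS/EW samples into radial/transverse components
    relative to a reference line (e.g. the accelerator axis)
    at the given azimuth, measured clockwise from north."""
    az = np.radians(azimuth_deg)
    # Standard 2-D rotation: radial points along the axis,
    # transverse is perpendicular to it.
    radial = north * np.cos(az) + east * np.sin(az)
    transverse = -north * np.sin(az) + east * np.cos(az)
    return radial, transverse

# Example: pure northward motion seen along a 90-degree (east-west) axis
n = np.array([1.0, 2.0])
e = np.zeros(2)
r, t = rotate_horizontals(n, e, 90.0)
```

For an east-west axis, northward motion maps entirely onto the transverse component, as expected.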

D'Alessandro, Antonino; Agostino, Raffaele; Festa, Lorenzo; Gervasi, Anna; Guerra, Ignazio; Palmer, Dennis T.; Serafini, Luca

2014-05-01

80

Core restraint and seismic analysis of a large heterogeneous free-flowering core design. Final report  

SciTech Connect

The core restraint and seismic performance of a large heterogeneous core was analyzed. A free-flowering core restraint system was selected for this study, as opposed to the limited-free bow system of the FFTF and CRBRP. The key features of the core restraint system, such as stiff reflector assemblies and load pad properties, were specified in this study. Other features - such as the fuel-assembly description, flux and temperature distributions, and clearances between the assembly nozzle and grid plate - were obtained from the other parts of a large, heterogeneous Core Study 11 and 12. Core restraint analysis was performed with NUBOW-3D over the first two cycles of operation. The SCRAP code was used to analyze the time-history seismic response of the core with the effects of fluid, impact, and bowed assemblies modeled in the code. The core restraint system design was assessed in terms of the predicted forces, impacts, displacements, and reactivity effects for different cycle times and power/flow ratios.

Madell, J.T.; Moran, T.J.; Ash, J.E.; Fulford, P.J.

1980-11-01

81

Effects of charge design features on parameters of acoustic and seismic waves and cratering, for SMR chemical surface explosions  

NASA Astrophysics Data System (ADS)

A series of experimental on-surface shots was designed and conducted by the Geophysical Institute of Israel at the Sayarim Military Range (SMR) in the Negev desert, including two large calibration explosions: about 82 tons of strong IMI explosives in August 2009, and about 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, the CTBTO, the USA and several European countries, with the main goal of providing fully controlled ground truth (GT0) infrasound sources in different weather/wind conditions for calibration of IMS infrasound stations in Europe, the Middle East and Asia. Strong boosters and an upward charge detonation scheme were applied to reduce the energy released into the ground and enlarge the energy radiated into the atmosphere, producing enhanced infrasound signals for better observation at far-regional stations. The following observations and results indicate the explosive energy partition achieved by this charge design: 1) crater sizes and local seismic (duration) magnitudes were found to be smaller than expected for surface explosions of this size; 2) small test shots of the same charge (1 ton) conducted at SMR with different detonation directions showed clearly lower seismic amplitudes/energy and smaller crater sizes for the upward detonation; 3) many infrasound stations at local and regional distances showed higher than expected peak amplitudes, even after application of a wind-correction procedure. For the large-scale explosions, high-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimation of the charge yield. Empirical relations for the air-blast parameters - peak pressure, impulse and the secondary shock (SS) time delay - as functions of distance were developed and analyzed.
The parameters, scaled by the cube root of the estimated TNT-equivalent charges, were found consistent across all analyzed explosions, except for the SS time delays, which were clearly separated for the shot of IMI explosives (characterized by a much higher detonation velocity than ANFO). Additionally, acoustic records at close distances from the WSMR explosions Distant Image (2440 tons of ANFO) and Minor Uncle (2725 tons of ANFO) were used to extend the charge and distance range of the scaled SS-delay relationship, which showed consistency with the SMR ANFO shots. The developed charge design contributed to the success of this unique dual Sayarim explosion experiment, which provided the strongest GT0 sources since the establishment of the IMS network, clearly demonstrated the most favorable westward/eastward infrasound propagation up to 3400/6250 km under the appropriate summer/winter weather patterns and stratospheric wind directions, respectively, and thus empirically verified common models of infrasound propagation in the atmosphere. The research was supported by the CTBTO, Vienna, and the Israel Ministry of Immigrant Absorption.
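The cube-root scaling mentioned above is the standard Hopkinson-Cranz form. The sketch below only illustrates the scaled-distance computation; the assumption of a TNT-equivalence factor of 1 for the 82-ton shot is for illustration, not taken from the study:

```python
def scaled_distance(r_m, charge_kg_tnt):
    """Hopkinson-Cranz cube-root scaled distance Z = R / W^(1/3),
    with R in metres and W the TNT-equivalent yield in kg.
    Air-blast parameters measured at equal Z are comparable
    across charges of different size."""
    return r_m / charge_kg_tnt ** (1.0 / 3.0)

# Gauges at 100-600 m from an ~82-ton shot, assuming (hypothetically)
# a TNT equivalence of 1, i.e. W = 82,000 kg:
z_near = scaled_distance(100.0, 82_000.0)   # ~2.3 m/kg^(1/3)
z_far = scaled_distance(600.0, 82_000.0)    # ~13.8 m/kg^(1/3)
```

Peak pressure and impulse relations of the kind developed in the study are typically expressed as functions of this Z.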

Gitterman, Y.

2012-04-01

82

Adapting the structural design actions standard for the seismic design of new industrial plant  

Microsoft Academic Search

There are overseas publications that have considered the differences in the typical structural systems necessary to support the equipment and distributive systems needed to process industrial feedstock. Current standards such as ASCE 7 and FEMA 450 incorporate these in a specific manner relating to the design of industrial plant. The design engineer for industrial structures is required to develop a feel

G. H. Lindup

83

Basis of Design and Seismic Action for Long Suspension Bridges: the case of the Messina Strait Bridge  

SciTech Connect

The basis of design for complex structures such as suspension bridges is reviewed. Specific attention is devoted to seismic action, to the required performance, and to the connected structural analysis. Uncertainty is specifically addressed by probabilistic and soft-computing techniques. The paper makes specific reference to the work and the experience developed during recent years for the re-design of the Messina Strait Bridge.

Bontempi, Franco [University of Rome 'La Sapienza', School of Engineering Via Eudossiana 18- 00184 Roma (Italy)

2008-07-08

84

Design of an implantable seismic sensor placed on the ossicular chain.  

PubMed

This paper presents a design guideline for matching a fully implantable middle ear microphone to the physiology of human hearing. The guideline defines the first natural frequency of a seismic sensor placed at the tip of the manubrium mallei with respect to the frequency-dependent hearing of the human ear as well as the deflection of the ossicular chain. A transducer designed in compliance with the presented guideline reduces the range of the output signal while preserving all information obtained from the ossicular chain. In addition to output signal compression, static deflections, which can mask the tiny motions of the ossicles, are reduced. For guideline verification, a microelectromechanical system (MEMS) based on silicon-on-insulator technology was produced and tested. This prototype is capable of resolving 0.4 pm/√Hz with a custom-made read-out circuit. For a bandwidth of 0.1 kHz, this deflection is comparable with the lower threshold of speech (≈ 40 phon). PMID:23810385
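Assuming the reported resolution is a displacement noise floor in pm/√Hz (a common convention for MEMS read-outs), the minimum resolvable deflection over the stated 0.1 kHz bandwidth follows from multiplying by the square root of the bandwidth; a minimal sketch:

```python
import math

# Assumed reading of the reported figure: a displacement noise floor
# of 0.4 pm/sqrt(Hz). For white noise, the RMS noise over a measurement
# bandwidth B grows as sqrt(B), so the smallest resolvable deflection is:
noise_floor_pm = 0.4      # pm / sqrt(Hz), as reported
bandwidth_hz = 100.0      # 0.1 kHz, as reported
min_deflection_pm = noise_floor_pm * math.sqrt(bandwidth_hz)  # 4.0 pm
```

This picometre-scale figure is what the abstract compares with the ossicle deflections at the lower threshold of speech.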

Sachse, M; Hortschitz, W; Stifter, M; Steiner, H; Sauter, T

2013-10-01

85

Optimisation of seismic network design: Application to a geophysical international lunar network  

Microsoft Academic Search

Fundamental scientific questions concerning the internal structure and dynamics of the Moon, and their implications for the Earth–Moon system, are driving the deployment of a new broadband seismological network on the surface of the Moon. Information about lunar seismicity and seismic subsurface models from the Apollo missions is used as a priori information in this study to optimise the geometry

Ryuhei Yamada; Raphaël F. Garcia; Philippe Lognonné; Mathieu Le Feuvre; Marie Calvet; Jeannine Gagnepain-Beyneix

2011-01-01

86

Verifying ventilation flows  

SciTech Connect

An innovative technique using a hot-film anemometer has been developed for measuring the flow distribution through generator rotors. When designing large air-cooled generators with the highest efficiency, engineers need to know the total flow rate to the rotor as well as the flow distribution to ensure there are no local hot spots. However, gaining an understanding of generator ventilation has been hampered by a lack of experimental data on rotating machines. Furthermore, while the axial flow distribution along the rotor body can be estimated using standard techniques, such methods average the flow in the circumferential direction. To overcome these limitations, engineers at Westinghouse Electric Corp.'s Power Generation Technology Division in Orlando, Fla., have developed a novel technique for measuring airflow through each vent hole. The technique uses a hot-film anemometer--a velocity-measurement device with a very-high-frequency response--to measure such flows separately while the rotor is spinning at a rated speed of 3,600 rpm.

Laster, W.R. [Westinghouse Electric Corp., Orlando, FL (United States). Power Generation Technology Div.; Sanford, G.W. [Westinghouse Electric Corp., Charlotte, NC (United States)

1996-10-01

87

Active seismic experiment  

NASA Technical Reports Server (NTRS)

The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

Kovach, R. L.; Watkins, J. S.; Talwani, P.

1972-01-01

88

Overview of Thermal-Hydraulic Test Program for Evaluating or Verifying the Performance of New Design Features in APR1400 Reactor  

SciTech Connect

The experimental program and some test results for the thermal-hydraulic evaluation or verification of new design features in the APR1400 are introduced for the major test items. The APR1400 incorporates many advanced design features to enhance its performance and safety. New design features adopted in the APR1400 include, among others, four trains of the safety injection system (SIS) with a direct vessel injection (DVI) mode and a passively operating safety injection tank (SIT), the In-containment Refueling Water Storage Tank (IRWST), and the safety depressurization and vent system (SDVS). For these new design features, experimental activities for ensuring their performance and their contribution to safety enhancement have been carried out at KAERI. They include the LBLOCA ECCS performance evaluation test for the DVI mode of the SIS, the performance verification test of the fluidic device as a passive flow controller, the performance evaluation test of the steam sparger for the SDVS, and the CEDM (control element drive mechanism) performance evaluation test. In this paper, the test program is briefly introduced, including the test objectives, experimental method, and some typical results for each test item. (authors)

Song, C.H.; Kwon, T.S.; Chu, I.C.; Jun, H.G.; Park, C.K. [Korea Atomic Energy Research Institute, Yuseong P.O. Box 105, Daejeon 305-600 (Korea, Republic of)

2002-07-01

89

Seismic design of low-level nuclear waste repositories and toxic waste management facilities  

SciTech Connect

The focus is on identifying the elements of typical hazardous waste facilities (HWFs) that are the major contributors to risk, as these are the elements that require additional consideration in the design and construction of low-level nuclear waste repositories and HWFs. From a recent study of six typical HWFs, it was determined that the factors contributing most to human and environmental risk fall into four basic categories: the geologic and seismological conditions at each HWF; the engineered structures at each HWF; the environmental conditions at each HWF; and the nature of the material being released. In selecting and carrying out the six case studies, three groups of hazardous waste facilities were examined: generator industries that treat or temporarily store their own wastes; generator facilities that dispose of their own hazardous wastes on site; and industries in the waste treatment and disposal business. The case studies have a diversity of geologic settings, nearby settlement patterns, and environments. Two sites are above a regional aquifer, two are near a bay important to regional fishing, one is in rural hills, and one is in a desert, although not isolated from nearby towns and a groundwater/surface-water system. From the results developed in the study, it was concluded that the effect of seismic activity on hazardous facilities poses a significant risk to the population. Fifteen reasons are given for this conclusion.

Chung, D.H.; Bernreuter, D.L.

1984-05-08

90

NASA/TM-2009-000000: From Verified Models to Verifiable Code

E-print Network

NASA/TM-2009-000000. From Verified Models to Verifiable Code. Leonard Lensink (L.Lensink@cs.ru.nl), Radboud University Nijmegen, The Netherlands; César Muñoz (cesar.a.munoz@nasa.gov), NASA Langley Research Center, Hampton, Virginia, USA. June 2009.

Muñoz, César A.

91

7 CFR 1792.103 - Seismic design and construction standards for new buildings.  

Code of Federal Regulations, 2013 CFR

...NEHRP Recommended Provisions for the Development of Seismic Regulations for New Buildings is available from the Office of Earthquakes and Natural Hazards, Federal Emergency Management Agency, 500 C Street, SW., Washington, DC 20472. [69 FR...

2013-01-01

92

Conceptual Design and Architecture of Mars Exploration Rover (MER) for Seismic Experiments Over Martian Surfaces  

NASA Astrophysics Data System (ADS)

Keywords: MER, Mars, Rover, Seismometer. Mars has been a subject of human interest for exploration missions for quite some time now. Both rover and orbiter missions have been employed to suit mission objectives. Rovers have been preferentially deployed for close-range reconnaissance and detailed experimentation with the highest accuracy. However, it is essential to strike a balance between the chosen science objectives and the rover operations as a whole. The objective of the proposed mechanism is to design a vehicle (MER) to carry out seismic studies over the Martian surface. The conceptual design consists of three units: a Mother Rover as a surrogate (carrier) and two Baby Rovers as seeders for several MEMS-based accelerometer/seismometer units (nodes). The Mother Rover can carry these Baby Rovers, each having an individual solar-cell power supply and individual data transmission capabilities, to suitable sites such as the chasma associated with Valles Marineris, craters, or sand dunes. The Mother Rover deploys the Baby Rovers in two opposite directions, and the Baby Rovers follow a triangulation pattern to study shock waves generated by firing tungsten carbide shells into the ground. Until the active experiments are complete, the Mother Rover acts as a guiding unit to control the spatial spread of the detection instruments. After the active shock experimentation, the Baby Rovers can still act as passive seismometer units to study and record passive shocks from thermal quakes, impact cratering and landslides. Further experiments/payloads (XPS/GAP/APXS) can also be carried by the Mother Rover. A secondary power system consisting of batteries can also be utilized for carrying out further experiments over shallow valley surfaces. The whole arrangement is conceptually expected to increase the accuracy of measurements (through concurrent readings) and prolong the life cycle of the overall experiment.
The proposed rover can be customised according to the associated scientific objectives and further needs.

Garg, Akshay; Singh, Amit

2012-07-01

93

MASSACHUSETTS DEP EELGRASS VERIFIED POINTS  

EPA Science Inventory

Field verified points showing presence or absence of submerged rooted vascular plants along Massachusetts coastline. In addition to the photo interpreted eelgrass coverage (EELGRASS), this point coverage (EGRASVPT) was generated based on field-verified sites as well as all field...

94

Seismic Studies  

SciTech Connect

This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (2) For probabilistic analyses supporting the demonstration of compliance with preclosure performance objectives, provide a mean seismic hazard curve for the surface facilities area. Results should be consistent with the PSHA for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at Yucca Mountain. (3) For annual ground motion exceedance probabilities appropriate for postclosure analyses, provide site-specific seismic time histories (acceleration, velocity, and displacement) for the waste emplacement level. Time histories should be consistent with the PSHA and reflect available knowledge on the limits to extreme ground motion at Yucca Mountain. (4) In support of ground-motion site-response modeling, perform field investigations and laboratory testing to provide a technical basis for model inputs. Characterize the repository block and areas in which important-to-safety surface facilities will be sited. Work should support characterization and reduction of uncertainties in inputs to ground-motion site-response modeling. (5) On the basis of rock mechanics, geologic, and seismic information, determine limits on extreme ground motion at Yucca Mountain and document the technical basis for them. (6) Update the ground-motion site-response model, as appropriate, on the basis of new data. Expand and enhance the technical basis for model validation to further increase confidence in the site-response modeling. (7) Document seismic methodologies and approaches in reports to be submitted to the NRC. (8) Address condition reports.

R. Quittmeyer

2006-09-25

95

Lessons learned from the ``5.12'' Wenchuan Earthquake: evaluation of earthquake performance objectives and the importance of seismic conceptual design principles  

NASA Astrophysics Data System (ADS)

Many different types of buildings were severely damaged or collapsed during the May 12, 2008 Great Wenchuan Earthquake. Based on survey data collected in regions that were subjected to moderate to severe earthquake intensities, a comparison was carried out between the observed building damage and the three earthquake performance objectives and seismic conceptual design principles specified by the national “Code for Seismic Design of Buildings GB50011-2001.” Actual damage and predicted damage for a given earthquake level are compared for different types of structures. Seismic conceptual design principles are discussed with respect to multiple defense lines, strong column-weak beam design, link beams of shear walls, ductility detailing of masonry structures, exits and staircases, nonstructural elements, etc. Suggestions for improving the seismic design of structures are also proposed. It is concluded that the seismic performance objectives for the three earthquake levels, i.e., “no failure under the minor earthquake level,” “repairable damage under the moderate earthquake level” and “no collapse under the major earthquake level,” can be achieved if the seismic design principles are applied by strictly following the code requirements and ensuring construction quality.

Wang, Yayong

2008-09-01

96

Verifiable conditions of - Optimization Online  

E-print Network

properties of proposed verifiable sufficient conditions, describe their limits of .... We use the term “s-semigoodness” to comply with the terminology of the ...... Verdera, Eds. International Congress of Mathematicians, Madrid 2006, Vol.

2010-09-17

97

Seismic response of perforated lightweight aggregate concrete wall panels for low-rise modular classrooms  

Microsoft Academic Search

In this paper, details of precast concrete wall panels for construction of low-rise modular school buildings are described. These panels, designed to be the primary lateral force resisting system of the building, are relatively thin. Compared to the case for conventional moment-resisting frames or shear-wall buildings, where details have been extensively verified to be seismically effective, the seismic response of

Y. H. Chai; John D. Anderson

2005-01-01

98

Image resolution analysis: a new, robust approach to seismic survey design  

E-print Network

configuration, parameters such as the structure and seismic velocity also influence image resolution. Understanding their effect on image quality allows us to better interpret the resolution results for the surveys under examination. A salt model was used to simulate...

Tzimeas, Constantinos

2005-08-29

99

Seismic Screening, Evaluation, Rehabilitation, and Design Provisions for Wood-Framed Structures  

E-print Network

scores and modifiers from external, visual inspection (15-30 min/building). Lateral-load-resisting system of examining construction drawings. Plans revealed a structural deficiency, but screening alone indicated, "Rapid visual screening of buildings for potential seismic hazards" (ATC 2002); FEMA 356, "Prestandard

Gupta, Rakesh

100

Core restraint and seismic analysis of a large heterogeneous free-flowering core design. Final report  

Microsoft Academic Search

The core restraint and seismic performance of a large heterogeneous core was analyzed. A free-flowering core restraint system was selected for this study, as opposed to the limited-free bow system of the FFTF and CRBRP. The key features of the core restraint system, such as stiff reflector assemblies and load pad properties, were specified in this study. Other features -

J. T. Madell; T. J. Moran; J. E. Ash; P. J. Fulford

1980-01-01

101

Optimum seismic structural design based on random vibration and fuzzy graded damages  

NASA Technical Reports Server (NTRS)

This paper presents the fuzzy dynamical reliability and failure probability, as well as the basic principles and the analytical method of loss assessment, for nonlinear seismic steel structures. Also presented are the optimization formulation and a numerical example with two objectives, initial construction cost and expected failure loss, under dynamical reliability constraints. The earthquake ground motion is based on a stationary filtered non-white noise, and the fuzzy damage grade is described by a damage index.

Cheng, Franklin Y.; Ou, Jin-Ping

1990-01-01

102

Verifiable Secret-Ballot Elections  

Microsoft Academic Search

Privacy in secret-ballot elections has traditionally been attained by using a ballot box or voting booth to disassociate voters from ballots. Although such a system might achieve privacy, there is often little confidence in the accuracy of the announced tally. This thesis describes a practical scheme for conducting secret-ballot elections in which the outcome of an election is verifiable by

Josh Daniel Cohen Benaloh

1987-01-01

103

Verifying the Hanging Chain Model  

ERIC Educational Resources Information Center

The wave equation with variable tension is a classic partial differential equation that can be used to describe the horizontal displacements of a vertical hanging chain with one end fixed and the other end free to move. Using a web camera and TRACKER software to record displacement data from a vibrating hanging chain, we verify a modified version…

Karls, Michael A.

2013-01-01

104

A Seismic Isolation Application Using Rubber Bearings; Hangar Project in Turkey  

SciTech Connect

Seismic isolation is an effective design strategy to mitigate seismic hazard, wherein the structure and its contents are protected from the damaging effects of an earthquake. This paper presents the Hangar Project at Sabiha Goekcen Airport in Istanbul, Turkey. A seismic isolation system in which the isolation layer is arranged at the top of the columns was selected. The seismic hazard analysis, superstructure design, isolator design and testing were based on the Uniform Building Code (1997) and met all requirements of the Turkish Earthquake Code (2007). The substructure, which has steel vertical trusses on the facades and RC H-shaped columns along the middle axis of the building, was designed with an R factor limited to 2.0 in accordance with the Turkish Earthquake Code. To verify the effectiveness of the isolation system, nonlinear static and dynamic analyses were performed. The analyses revealed that the isolated building has a base shear approximately 1/4 that of the non-isolated structure.

Sesigur, Haluk; Cili, Feridun [Istanbul Technical University, Faculty of Architecture, Division of Theory of Structures 34434, Taskisla, Istanbul (Turkey)

2008-07-08

105

COMPARISON OF HORIZONTAL SEISMIC COEFFICIENTS DEFINED BY CURRENT AND PREVIOUS DESIGN STANDARDS FOR PORT AND HARBOR FACILITIES  

NASA Astrophysics Data System (ADS)

The Japanese design standard for port and harbor facilities was revised in 2007, modifying the method used to calculate the horizontal seismic coefficient, kh. The comprehensive change of the method indicates that quay walls designed to the previous standard could lack earthquake resistance in terms of the current standard. In the present study, the coefficients kh calculated by the two standards were compared for existing quay walls constructed in the Kanto area, Japan. In addition, the factors that affected the relationship between the two types of coefficients were identified by means of multiple regression analyses. Only 16% of the kh values from the current standard exceeded those from the previous standard. According to the multiple regression analyses, the ratio of the two coefficients tended to increase for quay walls that were located in a specific port and had a large wall height and a small importance factor.
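The multiple-regression step can be sketched as an ordinary least-squares fit of the kh ratio against wall height and importance factor. The data below are invented for illustration (constructed to follow an exact linear trend in height) and are not from the study:

```python
import numpy as np

# Hypothetical data: ratio kh_current / kh_previous for five quay walls.
# The ratios are constructed to equal 0.3 + 0.05 * height exactly, with
# no dependence on the importance factor, so the fit is easy to check.
height_m = np.array([10.0, 12.0, 8.0, 15.0, 9.0])
importance = np.array([1.2, 1.0, 1.5, 1.0, 1.2])
kh_ratio = np.array([0.80, 0.90, 0.70, 1.05, 0.75])

# Design matrix: intercept, height, importance factor.
X = np.column_stack([np.ones_like(height_m), height_m, importance])
coeffs, *_ = np.linalg.lstsq(X, kh_ratio, rcond=None)
predicted = X @ coeffs
```

With real data the residuals would be nonzero and the coefficient signs would indicate which wall characteristics drive the ratio, as in the study's conclusion about wall height and importance factor.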

Takahashi, Hidenori; Ikuta, Akiho

106

Seismic design spectra 200 West and East Areas DOE Hanford Site, Washington  

SciTech Connect

This document presents equal-hazard response spectra for the W236A project for the 200 East and West new high-level waste tanks. The hazard level is based upon WHC-SD-W236A-TI-002, Probabilistic Seismic Hazard Analysis, DOE Hanford Site, Washington. Spectral acceleration amplification is plotted against frequency (Hz) for horizontal and vertical motion and attached to this report. The vertical amplification is based upon the preliminary draft revision of Standard ASCE 4-86. The vertical spectral acceleration is equal to the horizontal at frequencies above 3.3 Hz because of near-field (less than 15 km) sources.

Tallman, A.M.

1995-12-31

107

Seismic hazard assessment and design spectra for the Kozani-Grevena region (Greece) after the earthquake of May 13, 1995  

NASA Astrophysics Data System (ADS)

The Kozani-Grevena (Greece) destructive earthquake occurred in a region of low seismicity. A considerable amount of strong-motion data was acquired from the permanent strong-motion network of the Institute of Engineering Seismology and Earthquake Engineering (ITSAK), as well as from a temporary network installed after the earthquake. On the basis of this data set and the observed macroseismic intensities, local attenuation relations for peak ground acceleration and velocity are proposed. An a posteriori seismic hazard analysis is attempted for the affected and surrounding areas in terms of peak ground acceleration, velocity, bracketed duration and spectral acceleration. The analysis shows that the event of May 13, 1995 can be characterized as one with a mean return period of 500 to 1000 years. Relying on the observed spectral-acceleration amplification factors and the expected peak ground acceleration for a mean return period of 500 years, region-specific elastic design spectra for the buildings of the Kozani and Grevena prefectures are proposed.

Theodulidis, N.; Lekidis, V.; Margaris, B.; Papazachos, C.; Papaioannou, Ch.; Dimitriu, P.

108

The need for verifiable visualization.  

PubMed

Visualization is often employed as part of the simulation science pipeline: it's the window through which scientists examine their data for deriving new science, and the lens used to view modeling and discretization interactions within their simulations. We advocate that, as a component of the simulation science pipeline, visualization must be explicitly considered as part of the validation and verification (V&V) process. In this article, the authors define V&V in the context of computational science, discuss the role of V&V in the scientific process, and present arguments for the need for verifiable visualization. PMID:18753037

Kirby, Robert M; Silva, Cláudio T

2008-01-01

109

New finite element models and seismic analyses of the telescopes at W.M. Keck Observatory  

NASA Astrophysics Data System (ADS)

On 15 October 2006 a large earthquake damaged both telescopes at Keck Observatory, resulting in weeks of observing downtime. A significant portion of the downtime was attributed to recovery efforts repairing damage to telescope bearing journals, radial pad support structures, and encoder subsystems. Inadequate damping and strength in the seismic restraint design and the lack of break-away features on the azimuth radial pads were key design deficiencies. In May 2011 a feasibility study reviewed several options to enhance the protection of the telescopes, with the goal of minimizing the time needed to bring them back into operation after a large seismic event. The study determined that new finite element models of the telescope structures were required to better understand the telescope responses both to the design earthquakes required by the local governing building codes and to the USGS seismic data collected at the site on 15 October 2006. The models were verified by comparing their calculated natural frequencies with the frequencies measured in a servo identification study, and by comparing the computed time-history responses under the October 2006 seismic data with the damage actually observed. Two analysis methods, response spectrum analysis and time history analysis, were used to determine the seismic demand forces and seismic response of each telescope to the design earthquakes, and their results were compared. The models can also be used to evaluate alternative seismic restraint design options for both Keck telescopes.
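The time-history side of such an analysis can be illustrated with a minimal single-degree-of-freedom sketch (not the telescopes' actual models): integrate the equation of motion under a ground-acceleration history with the textbook Newmark average-acceleration method and record the peak displacement. All parameters below are generic illustrative choices.

```python
import math

def newmark_sdof(ag, dt, period, zeta=0.05):
    """Peak displacement of a linear SDOF oscillator (unit mass) under a
    ground-acceleration history `ag`, integrated with the Newmark
    average-acceleration method (gamma = 1/2, beta = 1/4)."""
    w = 2.0 * math.pi / period
    k, c = w * w, 2.0 * zeta * w          # stiffness and damping per unit mass
    g, b = 0.5, 0.25
    u, v = 0.0, 0.0
    a = -ag[0] - c * v - k * u            # equation of motion at t = 0
    kh = k + g * c / (b * dt) + 1.0 / (b * dt * dt)   # effective stiffness
    umax = 0.0
    for agi in ag[1:]:
        # effective load at the new step (Chopra-style linear Newmark)
        ph = (-agi
              + u / (b * dt * dt) + v / (b * dt) + (1 / (2 * b) - 1) * a
              + c * (g * u / (b * dt) + (g / b - 1) * v
                     + dt * (g / (2 * b) - 1) * a))
        un = ph / kh
        vn = g * (un - u) / (b * dt) + (1 - g / b) * v + dt * (1 - g / (2 * b)) * a
        an = (un - u) / (b * dt * dt) - v / (b * dt) - (1 / (2 * b) - 1) * a
        u, v, a = un, vn, an
        umax = max(umax, abs(u))
    return umax
```

Sweeping `period` over a grid and taking `(2 * math.pi / period) ** 2 * umax` for each gives the pseudo-acceleration response spectrum used in the companion response-spectrum method.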

Kan, Frank W.; Sarawit, Andrew T.; Callahan, Shawn P.; Pollard, Mike L.

2014-07-01

110

Verifying Search Results Over Web Collections  

E-print Network

Searching accounts for one of the most frequently performed computations over the Internet as well as one of the most important applications of outsourced computing, producing results that critically affect users' decision-making behaviors. As such, verifying the integrity of Internet-based searches over vast amounts of web contents is essential. We provide the first solution to this general security problem. We introduce the concept of an authenticated web crawler and present the design and prototype implementation of this new concept. An authenticated web crawler is a trusted program that computes a special "signature" $s$ of a collection of web contents it visits. Subject to this signature, web searches can be verified to be correct with respect to the integrity of their produced results. This signature also allows the verification of complicated queries on web pages, such as conjunctive keyword searches. In our solution, along with the web pages that satisfy any given search query, the search engine also ...
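A classic building block for this kind of verifiable search is a Merkle hash tree: the crawler publishes a single root digest, and each returned document carries a logarithmic-size proof. The paper's actual construction is more elaborate (it also supports conjunctive keyword queries); the sketch below, with made-up page contents, shows only the verification principle.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash over a list of byte-string documents."""
    level = [h(b"leaf:" + d) for d in leaves]
    if not level:
        return h(b"empty")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])           # pad odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling path showing that leaves[index] is under the root."""
    level = [h(b"leaf:" + d) for d in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        path.append((level[index ^ 1], index % 2))   # (sibling, 0 = left child)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify_result(root, doc, path):
    """Recompute the root from a returned document and its proof."""
    node = h(b"leaf:" + doc)
    for sibling, pos in path:
        node = h(node + sibling) if pos == 0 else h(sibling + node)
    return node == root
```

A client holding only `root` can check any search result against it without trusting the search engine.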

Goodrich, Michael T; Ohrimenko, Olga; Papamanthou, Charalampos; Tamassia, Roberto; Triandopoulos, Nikos; Lopes, Cristina Videira

2012-01-01

111

Experimentally Verified Numerical Model of Thixoforming Process  

SciTech Connect

A new mathematical model of thixotropic casting, based on the two-phase approach for semi-solid metal alloys, is presented. The corresponding numerical algorithm has been implemented in original computer software using the finite element method in 3-D geometry and a Lagrangian approach to flow description. The model has been verified by means of an original thixoforming experiment in a model die specially designed for this purpose. Some particular cases of such casting and the influence of operating parameters on the segregation phenomenon are discussed.

Bialobrzeski, Andrzej [Foundry Institute Krakow, Zakopianska 73, 30-418 Cracow (Poland); Kotynia, Monika; Petera, Jerzy [Faculty of Process and Environmental Engineering, Technical University of Lodz, Wolczanska 213, 93-005 Lodz (Poland)

2007-04-07

112

A Supervised Verifiable Voting Protocol for the Victorian Electoral Commission  

E-print Network

We present the design of a supervised verifiable voting protocol suitable for use for elections in the state of Victoria, with ranked single transferable vote (STV), which some Victorian elections require. We conclude with a threat ...

Doran, Simon J.

113

SEISMIC RISK MAPS FOR EUROCODE-8 DESIGNED BUILDINGS - Thomas Ulrich  

E-print Network

to its design peak ground acceleration (PGA), and the standard deviation of the lognormal distribution between 1.7 x 10^-7 (for a design PGA of 0.7 m/s2) and 1.0 x 10^-5 (for a design PGA of 3.0 m/s2), they find that X varies from 0.14 (for a design PGA of 0.7 m/s2) to 0.85 (for a design PGA of 3.0 m/s2)
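The quantities in this fragment (a lognormal fragility curve tied to the design PGA, convolved with a hazard curve to obtain an annual failure probability) can be sketched as follows. The median, dispersion, and hazard points below are invented for illustration, not taken from the study.

```python
import math

def lognormal_fragility(pga, median, beta):
    """P(failure | PGA): lognormal fragility curve Phi(ln(pga/median)/beta),
    with the normal CDF built from math.erf."""
    x = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def annual_failure_prob(hazard, median, beta):
    """Discrete convolution of an annual-exceedance hazard curve
    (list of (pga, rate) pairs, rate decreasing with pga) with the
    fragility curve: sum over bins of (rate in bin) * P(fail | bin)."""
    pf = 0.0
    for (a0, r0), (a1, r1) in zip(hazard, hazard[1:]):
        pf += (r0 - r1) * lognormal_fragility(0.5 * (a0 + a1), median, beta)
    return pf
```

The PGA units are arbitrary here (use m/s2 consistently to match the fragment's values).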

114

Seismic analysis of Industrial Waste Landfill 4 at Y-12 Plant  

SciTech Connect

This calculation seismically evaluates Landfill IV at Y-12 as required by Tennessee Rule 1200-1-7-04(2) for seismic impact zones. The calculation verifies that the landfill meets the seismic requirements of the Tennessee Division of Solid Waste, ``Earthquake Evaluation Guidance Document.`` The theoretical displacements of 0.17 in. and 0.13 in. for the design basis earthquake are well below the limiting seismic slope stability design criteria. There is no potential for liquefaction, due to the absence of cohesionless soils, or for loss or reduction of shear strength of the clays at this site as a result of earthquake vibration. The vegetative cover on slopes will most likely be displaced and move during a large seismic event, but this is not considered a serious deficiency because the cover is not involved in the structural stability of the landfill and there would be no release of waste to the environment.

NONE

1995-04-07

115

Seismic hazard evaluation for design and/or verification of a high voltage system  

SciTech Connect

The Venezuelan capital, Caracas, with a population of about 5 million, is within the area of contact of the Caribbean and South American tectonic plates. Since 1567, the valley where it lies and its surroundings have been shaken by at least six destructive events from different seismogenic sources. Electric energy is served to the city by a high voltage system consisting of 4 power stations, 20 substations (230 kV downwards) and 80 km of high voltage lines, covering an area of about 135 x 60 km{sup 2}. Given the variety of soil conditions, topographical irregularities and proximity to potentially active faults, it was decided to perform a seismic hazard study. This paper gives the results of that study, synthesized in two hazard-parameter maps that allow a conservative characterization of the acceleration on firm soils. Specific site coefficients allow for changes in soil conditions and topographical effects. Sites closer than about 2 km to fault lines require additional field studies in order to rule out the possibility of permanent ground displacements.

Grases, J.; Malaver, A. [Ingenieria de Consulta, Caracas (Venezuela); Lopez, S.; Rivero, P. [Electricidad de Caracas (Venezuela)

1995-12-31

116

Seismic hazard analyses for Taipei city including deaggregation, design spectra, and time history with excel applications  

NASA Astrophysics Data System (ADS)

Given the difficulty of earthquake forecasting, Probabilistic Seismic Hazard Analysis (PSHA) has become the standard method for estimating site-specific ground motion or response spectra in earthquake engineering and engineering seismology. In this paper, the first in-depth PSHA study for Taipei, the economic center of Taiwan with a population of six million, was carried out. Unlike the very recent PSHA study for Taiwan, this study includes the follow-up hazard deaggregation, response spectra, and earthquake motion recommendations. Hazard deaggregation results show that moderate-size, near-source earthquakes are the most probable scenario for this city. Moreover, similar to the findings of a few recent studies, the earthquake risk for Taipei is relatively high; considering the city's importance, this risk should not be overlooked, and a potential revision of the local technical reference may be needed. In addition to the case study, some innovative Excel applications to PSHA are introduced in this paper. Such spreadsheet applications are applicable to geosciences research, such as data reduction or quantitative analysis, given Excel's user-friendly nature and wide accessibility.
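The hazard integral behind any PSHA study (spreadsheet or otherwise) can be sketched in a few lines: sum, over magnitude bins of a truncated Gutenberg-Richter source, the rate of that bin times the probability that its median ground motion, with lognormal scatter, exceeds the target level. All source and attenuation coefficients below are assumed for illustration and are not Taipei's; the per-bin terms divided by the total rate are exactly the deaggregation weights the abstract mentions.

```python
import math

def phi_c(x):
    """Standard normal complementary CDF."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def hazard_rate(a_g, nu=0.05, b=1.0, m_min=5.0, m_max=7.5,
                r_km=20.0, sigma=0.6, dm=0.1):
    """Annual rate of PGA > a_g (in g) from a single source at r_km,
    with truncated Gutenberg-Richter magnitudes and the illustrative
    attenuation ln PGA = -4.0 + 1.0*M - 1.5*ln R (coefficients assumed)."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    rate, m = 0.0, m_min + 0.5 * dm
    while m < m_max:
        # probability mass of this magnitude bin under truncated G-R
        pm = (math.exp(-beta * (m - 0.5 * dm - m_min))
              - math.exp(-beta * (m + 0.5 * dm - m_min))) / norm
        ln_med = -4.0 + 1.0 * m - 1.5 * math.log(r_km)
        rate += nu * pm * phi_c((math.log(a_g) - ln_med) / sigma)
        m += dm
    return rate
```

Evaluating `hazard_rate` over a grid of `a_g` values traces out the hazard curve; inverting it at a target rate (e.g. 1/475 per year) gives the design ground motion.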

Wang, Jui-Pin; Huang, Duruo; Cheng, Chin-Tung; Shao, Kuo-Shin; Wu, Yuan-Chieh; Chang, Chih-Wei

2013-03-01

117

Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset  

USGS Publications Warehouse

The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use, in subsequent processing steps, of appropriate header information and data that are free of noise-dominated traces and flawed vertical stacking. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
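One of the simplest QC screens of this kind, flagging dead or noise-dominated traces before vertical stacking, can be sketched as a median-relative RMS test. The thresholds below are illustrative choices, not values from the article.

```python
import math

def flag_noisy_traces(gather, hi=3.0, lo=0.33):
    """Return indices of traces whose RMS amplitude deviates strongly
    from the median trace RMS of the gather: below `lo` * median (dead
    traces) or above `hi` * median (noise-dominated traces)."""
    rms = [math.sqrt(sum(s * s for s in tr) / len(tr)) for tr in gather]
    med = sorted(rms)[len(rms) // 2]
    return [i for i, r in enumerate(rms) if r < lo * med or r > hi * med]
```

Flagged traces would be excluded (or re-examined against header information) before stacking.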

Raef, A.

2009-01-01

118

Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Topical report YMP/TR-003-NP  

SciTech Connect

This topical report describes the methodology and criteria that the U.S. Department of Energy (DOE) proposes to use for preclosure seismic design of structures, systems, and components (SSCs) of the proposed geologic repository operations area that are important to safety. Title 10 of the Code of Federal Regulations, Part 60 (10 CFR 60), Disposal of High-Level Radioactive Wastes in Geologic Repositories, states that for a license to be issued for operation of a high-level waste repository, the U.S. Nuclear Regulatory Commission (NRC) must find that the facility will not constitute an unreasonable risk to the health and safety of the public. Section 60.131 (b)(1) requires that SSCs important to safety be designed so that natural phenomena and environmental conditions anticipated at the geologic repository operations area will not interfere with necessary safety functions. Among the natural phenomena specifically identified in the regulation as requiring safety consideration are the hazards of ground shaking and fault displacement due to earthquakes.

NONE

1996-10-01

119

Verifying Deadlock-Freedom of Communication Fabrics  

NASA Astrophysics Data System (ADS)

Avoiding message-dependent deadlocks in communication fabrics is critical for modern microarchitectures. If discovered late in the design cycle, deadlocks lead to missed project deadlines and suboptimal design decisions. One approach to avoiding this problem is to gain a high level of confidence in an early microarchitectural model. However, formal proofs of liveness, even on abstract models, are hard due to the large number of queues and distributed control. In this work we address liveness verification of communication fabrics described in the form of high-level microarchitectural models that use a small set of well-defined primitives. We prove that under certain realistic restrictions, deadlock freedom can be reduced to unsatisfiability of a system of Boolean equations. Using this approach, we have automatically verified liveness of several non-trivial models (derived from industrial microarchitectures) where state-of-the-art model checkers failed and pen-and-paper proofs were either tedious or unknown.
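The flavor of such a reduction can be shown on a toy wait-for model: encode "if agent i is blocked, some agent it waits for is also blocked" as Boolean constraints; a non-empty satisfying assignment is a circular wait, so unsatisfiability implies deadlock freedom. The brute-force checker below is a stand-in for a real SAT solver and only illustrates the encoding, not the paper's primitives.

```python
from itertools import product

def has_deadlock(waits_for):
    """Brute-force satisfiability check of the Boolean system
        b_i  ->  OR_{j in waits_for[i]} b_j      (for every agent i).
    A non-empty satisfying assignment is a set of mutually blocked
    agents (a circular wait); unsatisfiability means deadlock freedom.
    Exponential in the number of agents -- for tiny models only."""
    agents = sorted(waits_for)
    for bits in product([False, True], repeat=len(agents)):
        blocked = {a for a, b in zip(agents, bits) if b}
        if not blocked:
            continue  # the all-false assignment is trivially satisfying
        if all(any(j in blocked for j in waits_for[i]) for i in blocked):
            return True
    return False
```

A real tool would hand the same constraints to a SAT solver instead of enumerating assignments.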

Gotmanov, Alexander; Chatterjee, Satrajit; Kishinevsky, Michael

120

The 1978 Yellowstone-eastern Snake River Plain seismic profiling experiment: Crustal structure of the Yellowstone region and experiment design  

Microsoft Academic Search

In 1978 a major seismic profiling experiment was conducted in the Yellowstone-eastern Snake River Plain region of Idaho and Wyoming. Fifteen shots were recorded that provided coverage to distances of 300 km. In this paper, travel time and synthetic seismogram modeling were used to evaluate an average P wave velocity and apparent Q structure of the crust from two seismic

R. B. Smith; M. M. Schilly; L. W. Braile; J. Ansorge; J. L. Lehman; M. R. Baker; C. Prodehl; J. H. Healy; S. Mueller; R. W. Greensfelder

1982-01-01

121

RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES  

EPA Science Inventory

On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills. These regulat...

122

RCRA SUBTITLE D (258): SEISMIC DESIGN GUIDANCE FOR MUNICIPAL SOLID WASTE LANDFILL FACILITIES  

EPA Science Inventory

On October 9, 1993, the new RCRA Subtitle D regulations (40 CFR Part 258) went into effect. These regulations are applicable to landfills receiving municipal solid waste (MSW) and establish minimum Federal criteria for the siting, design, operation, and closure of MSW landfills....

123

41 CFR 128-1.8005 - Seismic safety standards.  

Code of Federal Regulations, 2010 CFR

...the model building codes that the Interagency Committee on Seismic Safety in Construction...Standard Building Code (SBC). (b) The seismic design and construction...higher level of seismic safety than provided...appropriate model code, in which...

2010-07-01

124

How Students Verify Conjectures: Teachers' Expectations  

ERIC Educational Resources Information Center

Eight teachers were interviewed concerning how students verify conjectures. The study is a sequel to a previous study, "How Students Verify Conjectures" [Bergqvist, T. (2000). "How students verify conjectures." "Research reports in Mathematics Education" 3]. Teachers' expectations of students' reasoning and performance are examined, and also how…

Bergqvist, Tomas

2005-01-01

125

Flexible and Publicly Verifiable Aggregation Query for Outsourced Databases in Cloud  

E-print Network

investigates this challenging problem and proposes an efficient publicly verifiable aggregation query scheme. The goal is to design a client-verifiable (or publicly verifiable) aggregation query scheme that supports more flexible

126

Correlation of Seismic Velocities with Earthwork Factors.  

National Technical Information Service (NTIS)

The study was made to determine whether seismic data can be used to obtain satisfactory design earthwork factors for roadway excavation. The study shows an apparent correlation between seismic velocity and earthwork factor for the three metamorphic rock t...

T. Smith, M. McCauley, R. Mearns, K. Baumeister

1972-01-01

127

Correlation of Seismic Velocities with Earthwork Factors.  

National Technical Information Service (NTIS)

The study was made to determine whether seismic data can be used to obtain satisfactory design earthwork factors for roadway excavation. The study shows an apparent correlation between seismic velocity and earthwork factor for the sedimentary rock types e...

T. Smith, M. McCauley, K. Baumeister

1972-01-01

128

Artificial Seismic Shadow Zone by Acoustic Metamaterials  

NASA Astrophysics Data System (ADS)

We developed a new method of earthquake-proof engineering to create an artificial seismic shadow zone using acoustic metamaterials. By designing huge empty boxes with a few side-holes corresponding to the resonance frequencies of seismic waves and burying them around the buildings that we want to protect, the velocity of the seismic wave becomes imaginary. The meta-barrier composed of many meta-boxes attenuates the seismic waves, which reduces the amplitude of the wave exponentially by dissipating the seismic energy. This is a mechanical method of converting the seismic energy into sound and heat. We estimated the sound level generated from a seismic wave. This method of area protection differs from the point protection of conventional seismic design, including the traditional cloaking method. The artificial seismic shadow zone is tested by computer simulation and compared with a normal barrier.
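The exponential amplitude decay claimed above is easy to quantify: with an assumed per-box decay constant, the amplitude after a wave crosses n meta-boxes, and the corresponding reduction in decibels, follow directly. The value of alpha below is an illustrative assumption, not a figure from the paper.

```python
import math

def barrier_attenuation(a0, alpha, n_boxes):
    """Amplitude after a seismic wave crosses n_boxes resonant meta-boxes,
    each dissipating a fixed fraction of the energy, so the amplitude
    decays exponentially (alpha = assumed per-box decay constant).
    Returns (remaining amplitude, attenuation in dB)."""
    a = a0 * math.exp(-alpha * n_boxes)
    db = 20.0 * math.log10(a / a0)
    return a, db
```

For example, ten boxes at alpha = 0.5 attenuate the wave by a little over 43 dB.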

Kim, Sang-Hoon; Das, Mukunda P.

2013-08-01

129

Verifying Test Hypotheses - HOL/TestGen  

E-print Network

Verifying Test Hypotheses - HOL/TestGen: An Experiment in Test and Proof. Thomas Malcher, January 20, 2014. Outline: Introduction; Test Hypotheses; HOL/TestGen Demo; Verifying Test Hypotheses; Conclusion.

130

Design and test of a laser-based optical-fiber Bragg-grating accelerometer for seismic applications  

NASA Astrophysics Data System (ADS)

We report on a proof-of-principle work aimed at the development of fast-response fiber-optic accelerometers for seismic monitoring. The system is based on a semiconductor diode-laser source that interrogates a newly devised two-dimensional inertial sensor suitable for measurement of horizontal ground accelerations. Plane acceleration components of the sensor's mass are detected by two fiber Bragg gratings anchored to its structure. Calibration and comparison with a commercial accelerometer are presented. A great potential, in terms of frequency response and sensitivity, is demonstrated in view of possible field applications in active seismic areas.
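The sensing principle rests on two standard fiber Bragg grating relations: the Bragg condition lambda_B = 2 * n_eff * Lambda, and the strain-induced wavelength shift scaled by (1 - p_e), where p_e is the effective photo-elastic coefficient (about 0.22 for silica fibre). The numbers in the sketch below are generic illustrative values, not the paper's device parameters.

```python
def bragg_wavelength(n_eff, pitch_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda (nm)."""
    return 2.0 * n_eff * pitch_nm

def strain_shift_pm(lambda_nm, strain, p_e=0.22):
    """Wavelength shift in picometres for a given axial strain:
    d(lambda)/lambda = (1 - p_e) * strain, with (1 - p_e) ~ 0.78
    for silica fibre."""
    return (1.0 - p_e) * strain * lambda_nm * 1000.0
```

With n_eff = 1.447 and a 535.5 nm pitch, the Bragg wavelength lands near the 1550 nm telecom band, and 1 microstrain shifts it by roughly 1.2 pm, which sets the interrogation resolution the accelerometer needs.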

Gagliardi, G.; Salza, M.; Ferraro, P.; DeNatale, P.; Di Maio, A.; Carlino, S.; DeNatale, G.; Boschi, E.

2008-08-01

131

Verifiable Secret Redistribution for Threshold Sharing Schemes  

E-print Network

Theodore M. Wong, Chenxi Wang; Pittsburgh, PA 15213. Abstract: We present a new protocol for verifiably redistributing secrets from an (m, n) ... Keywords: non
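Redistribution protocols of this kind build on Shamir's (m, n) threshold scheme. For reference, here is a minimal sketch of sharing and reconstruction over a prime field; the paper's actual contribution, verifiability of the redistribution step, is not shown, and the prime and secret below are arbitrary choices.

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def share(secret, m, n):
    """Split `secret` into n shares, any m of which reconstruct it:
    evaluate a random degree m-1 polynomial with constant term `secret`
    at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Redistribution then means issuing fresh shares of the same secret to a new access structure without ever reconstructing it in one place; the protocol above would be the primitive each participant runs on a subshare.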

Wing, Jeannette M.

132

Security of Verifiably Encrypted Signatures - Markus Ruckert  

E-print Network

correctly. The security properties are unforgeability and opacity. Unforgeability states that a malicious signer should not be able to forge verifiably encrypted signatures; opacity prevents extraction of the underlying ordinary signature. Security of verifiably encrypted signatures is defined via unforgeability and opacity [BGLS03] ...

133

Seismic design technology for breeder reactor structures. Volume 2. Special topics in soil/structure interaction analyses  

SciTech Connect

This volume is divided into six chapters: definition of seismic input ground motion, review of state-of-the-art procedures, analysis guidelines, rock/structure interaction analysis example, comparison of two- and three-dimensional analyses, and comparison of analyses using FLUSH and TRI/SAC Codes. (DLC)

Reddy, D.P.

1983-04-01

134

Neural networks in seismic discrimination  

SciTech Connect

Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate depths of seismic events using regional seismograms. In this paper different types of network architecture and representation techniques are discussed. We address the important problem of designing neural networks with good generalization capabilities. Examples of neural networks for treaty verification applications are also described.

Dowla, F.U.

1995-01-01

135

Seismic upgrades of healthcare facilities.  

PubMed

Before 1989, seismic upgrading of hospital structures was not a primary consideration among hospital owners. However, after extensive earthquake damage to hospital buildings at Loma Prieta in Northern California in 1989 and then at Northridge in Southern California in 1994, hospital owners, legislators, and design teams became concerned about the need for seismic upgrading of existing facilities. Because the damage hospital structures sustained in the earthquakes was so severe and far-reaching, California has enacted laws that mandate seismic upgrading of existing facilities. Now hospital owners will have to upgrade buildings that do not conform to statewide seismic adequacy laws. By 2030, California expects all of its hospital structures to be sufficiently seismic-resistant. Slowly, regions in the Midwest and on the East Coast are following its example. This article outlines reasons and methods for seismic upgrading of existing facilities. PMID:10168656

Yusuf, A

1997-06-01

136

Oklahoma seismic network. Final report  

Microsoft Academic Search

The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number

K. V. Luza; J. E. Jr. Lawson

1993-01-01

137

Design and Implementation of a Wireless Sensor Network of GPS-enabled Seismic Sensors for the Study of Glaciers and Ice Sheets  

NASA Astrophysics Data System (ADS)

In an effort to provide new and improved geophysical sensing capabilities for the study of ice sheets in Antarctica and Greenland, or to study mountain glaciers, we are developing a network of wirelessly interconnected seismic and GPS sensor nodes (called "geoPebbles"), with the primary objective of making such instruments more capable and cost effective. We describe our design methodology, which has enabled us to develop these state-of-the-art sensors using commercial-off-the-shelf hardware combined with custom-designed hardware and software. Each geoPebble is a self-contained, wirelessly connected sensor for collecting seismic measurements and position information. Each node is built around a three-component seismic recorder, which includes an amplifier, filter, and 24-bit analog-to-digital card that can sample up to 10 kHz. Each unit also includes a microphone channel to record the ground-coupled airwave. The timing for each node is available through a carrier-phase measurement of the L1 GPS signal at an absolute accuracy of better than a microsecond. Each geoPebble includes 16 GB of solid-state storage, wireless communications capability to a central supervisory unit, and auxiliary measurements capability (up to eight 10-bit channels at low sample rates). We will report on current efforts to test this new instrument and how we are addressing the challenges imposed by the extreme weather conditions on the Antarctic continent. After fully validating its operational conditions, the geoPebble system will be available for NSF-sponsored glaciology research projects. Geophysical experiments in the polar region are logistically difficult. With the geoPebble system, the cost of doing today's experiments (low-resolution, 2D) will be significantly reduced, and the cost and feasibility of doing tomorrow's experiments (integrated seismic, positioning, 3D, etc.) will be reasonable.
(Figure: Sketch of an experiment with geoPebbles scattered on the surface of the ice sheet; the seismic source can move through the array, and the SQC node communicates with all elements in the array.)

Bilen, S. G.; Anandakrishnan, S.; Urbina, J. V.

2012-12-01

138

Seismic, shock, and vibration isolation - 1988  

SciTech Connect

This book contains papers presented at a conference on pressure vessels and piping. Topics covered include: Design of R-FBI bearings for seismic isolation; Benefits of vertical and horizontal seismic isolation for LMR nuclear reactor units; and Some remarks on the use and perspectives of seismic isolation for fast reactors.

Chung, H. (Argonne National Lab., Argonne, IL (US)); Mostaghel, N. (Univ. of Utah, Salt Lake City, UT (US))

1988-01-01

139

Impact of lateral force-resisting system and design/construction practices on seismic performance and cost of tall buildings in Dubai, UAE  

NASA Astrophysics Data System (ADS)

The local design and construction practices in the United Arab Emirates (UAE), together with Dubai's unique rate of development, warrant special attention to the selection of Lateral Force-Resisting Systems (LFRS). This research proposes four different feasible solutions for the selection of the LFRS for tall buildings and quantifies the impact of these selections on seismic performance and cost. The systems considered are: Steel Special Moment-Resisting Frame (SMRF), Concrete SMRF, Steel Dual System (SMRF with Special Steel Plate Shear Wall, SPSW), and Concrete Dual System (SMRF with Special Concrete Shear Wall, SCSW). The LFRS selection is driven by the seismic setup as well as the adopted design and construction practices in Dubai. It is found that the concrete design alternatives are consistently less expensive than their steel counterparts. The steel dual system is expected to have the least damage based on its relatively small interstory drifts. However, this preferred performance comes at a higher initial construction cost. Conversely, the steel SMRF system is expected to have the most damage and associated repair cost due to its excessive flexibility. The two concrete alternatives are expected to have relatively moderate damage and repair costs in addition to their lower initial construction cost.

AlHamaydeh, Mohammad; Galal, Khaled; Yehia, Sherif

2013-09-01

140

Seismic Waves: How Earthquakes Move the Earth  

NSDL National Science Digital Library

Students learn about the types of seismic waves produced by earthquakes and how they move the Earth. The dangers of earthquakes are presented as well as the necessity for engineers to design structures for earthquake-prone areas that are able to withstand the forces of seismic waves. Students learn how engineers build shake tables that simulate the ground motions of the Earth caused by seismic waves in order to test the seismic performance of buildings.

Integrated Teaching And Learning Program

141

Verifiable Private Multi-party Computation: Ranging and Ranking  

E-print Network

focus on the problem of verifiable privacy preserving multi- party computation. We thoroughly analyze the attacks on existing privacy preserving multi-party computation approaches and design a series of protocols, Multi-party Computation, Ranking, Ranging, Dot Product. I. INTRODUCTION Privacy preserving multi

Li, Xiang-Yang

142

Voter verifiability in homomorphic election schemes  

E-print Network

Voters are now demanding the ability to verify that their votes are cast and counted as intended. Most existing cryptographic election protocols do not treat the voter as a computationally-limited entity separate from the ...

Forsythe, Joy Marie

2005-01-01

143

Recent advances in the Lesser Antilles observatories Part 1 : Seismic Data Acquisition Design based on EarthWorm and SeisComP  

NASA Astrophysics Data System (ADS)

Lesser Antilles observatories are in charge of monitoring the volcanoes and earthquakes in the Eastern Caribbean region. During the past two years, our seismic networks have evolved toward a fully digital technology. These changes, which include modern three-component sensors, high dynamic range digitizers, and high speed terrestrial and satellite telemetry, improve data quality but also increase the data flows to process and to store. Moreover, the generalization of data exchange to build a wide virtual seismic network around the Caribbean domain requires great flexibility to provide and receive data flows in various formats. Like many observatories, we have decided to use the most popular and robust open source data acquisition systems in use in today's observatory community: EarthWorm and SeisComP. The former is renowned for its ability to process real time seismic data flows, with a high number of tunable modules (filters, triggers, automatic pickers, locators). The latter is renowned for its ability to exchange seismic data using the international SEED standard (Standard for Exchange of Earthquake Data), either by producing archive files, or by managing output and input SEEDLink flows. French Antilles Seismological and Volcanological Observatories have chosen to take advantage of the best features of each software package to design a new data flow scheme and to integrate it into our global observatory data management system, WebObs [Beauducel et al., 2004]; see the companion paper (Part 2). 
We assigned tasks to the different software packages according to their main strengths: - EarthWorm first integrates the data from the heterogeneous sources; - SeisComP takes this homogeneous EarthWorm data flow, adds other sources, and produces SEED archives and a SEED data flow; - EarthWorm is then used again to process this clean and complete SEEDLink data flow, mainly producing triggers, automatic locations, and alarms; - WebObs provides a friendly human interface, both to the administrator for station management and to the regular user for routine real-time analysis of the seismic data (event classification database, location scripts, automatic shakemaps, and a regional catalog with associated hypocenter maps).
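The task split described above can be caricatured as a three-stage fan-in pipeline: normalize heterogeneous source packets into one record format, archive the unified stream, and run a detection pass over it. Everything below (field names, the amplitude trigger) is an invented toy stand-in for the real EarthWorm/SeisComP modules, shown only to make the data-flow roles concrete.

```python
from queue import Queue

def normalize(source_name, packet):
    """Stage 1 (EarthWorm-like integration): fold a source-specific
    packet into one homogeneous record format (field names assumed)."""
    return {"net": source_name, "sta": packet["sta"], "samples": packet["data"]}

def run_pipeline(sources):
    """Fan-in -> archive + trigger, mirroring the task split above
    (integration, archiving, detection) in a toy single-threaded form."""
    unified = Queue()
    for name, packets in sources.items():
        for p in packets:
            unified.put(normalize(name, p))
    archive, triggers = [], []
    while not unified.empty():
        rec = unified.get()
        archive.append(rec)                           # archiving stage
        if max(abs(s) for s in rec["samples"]) > 100:  # toy detection stand-in
            triggers.append((rec["net"], rec["sta"]))
    return archive, triggers
```

In the real system each stage is a separate long-running process exchanging SEEDLink packets rather than in-memory queue items.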

Saurel, Jean-Marie; Randriamora, Frédéric; Bosson, Alexis; Kitou, Thierry; Vidal, Cyril; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie

2010-05-01

144

System-on-Chip: Reuse and Integration Pre-designed and pre-verified hardware and software blocks can be combined on chips for many different applications - they promise large productivity gains  

Microsoft Academic Search

Over the past ten years, as integrated circuits became increasingly more complex and expensive, the industry began to embrace new design and reuse methodologies that are collectively referred to as system-on-chip (SoC) design. In this paper, we focus on the reuse and integration issues encountered in this paradigm shift. The reusable components, called intellectual property (IP) blocks or cores,

Resve Saleh; Shahriar Mirabbasi; Mark Greenstreet; Guy Lemieux; Partha Pratim Pande; Cristian Grecu; Andre Ivanov

145

Seafloor seismic data study  

SciTech Connect

A significant consideration in the design of offshore platforms, for seismically-active regions, is their response to earthquakes. However, the comprehensive inclusion of such considerations in a design activity is a formidable task. Elements of this task include characterization of projected seismicity, response of the (saturated) soils, transfer of the seismic energy from the soil to the platform, platform response to this energy deposition, and consequences associated with the platform response. A substantial amount of effort has been directed toward the synthesis of capabilities to provide these phenomenological characterizations. This effort has resulted in the development of a variety of analytical techniques. Collectively, these techniques can serve as the methodological basis for evaluation of seismic aspects pertaining to offshore platform design. Sandia National Laboratories has been involved in the design, development, installation, and interrogation of a Seafloor Earthquake Measurement System (SEMS) for nearly ten years. This R and D activity has produced an instrumentation system which is currently operating in the Shell Oil Company Beta Field, offshore Long Beach, California. The current SEMS unit (Hickerson, et al, 1986) is an upgraded version of the original unit, which was also fielded offshore California. 12 refs., 4 figs., 1 tab.

Engi, D.

1986-01-01

146

Broadband seismic energy source  

SciTech Connect

A vibratory seismic energy source capable of generating significant energy over a broad frequency band is described. The vibrating baseplate and associated structure are designed to have minimum weight while still retaining sufficient structural integrity to permit the use of high actuator forces. This, coupled with a large reaction mass, results in the generation of significant energy levels in the earth at high frequencies.

Bedenbender, J.W.; Weber, R.M.

1981-03-03

147

How to Bypass Verified Boot Security in Chromium OS  

E-print Network

Verified boot is an interesting feature of Chromium OS that should detect any modification of the firmware, kernel, or root file system (rootfs) by a dedicated adversary. However, by exploiting a design flaw in verified boot, we show that an adversary can replace the original rootfs with a malicious rootfs containing exploits such as spyware and still pass the verified boot process. The exploit is based on the fact that although a kernel partition is paired with a rootfs, the kernel partition and the rootfs are verified independently of each other. We experimentally demonstrate an attack using both the base and developer versions of Chromium OS in which the adversary installs spyware on the target system to send cached user data to the attacker machine in plain text, data that is otherwise inaccessible in encrypted form. We also discuss possible directions to mitigate the vulnerability.
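The pairing flaw described in this abstract can be modeled in a few lines. The following is an illustrative sketch, not actual Chromium OS code: the function names, the hash-based stand-ins for signature and dm-verity checks, and the byte strings are all hypothetical. It shows only that when the kernel check and the rootfs check share no pairing information, a signed kernel can be booted with an attacker-supplied rootfs.

```python
import hashlib

# Illustrative model (not Chromium OS code): verified boot checks the
# kernel partition and the rootfs independently; nothing binds one to
# the other.

def verify_kernel(kernel: bytes, signed_digest: str) -> bool:
    # stand-in for signature verification of the kernel partition
    return hashlib.sha256(kernel).hexdigest() == signed_digest

def verify_rootfs(rootfs: bytes, hash_table: str) -> bool:
    # stand-in for block-hash (dm-verity-style) verification of the rootfs
    return hashlib.sha256(rootfs).hexdigest() == hash_table

def verified_boot(kernel, signed_digest, rootfs, hash_table) -> bool:
    # Flaw modeled here: the two checks carry no pairing information.
    return verify_kernel(kernel, signed_digest) and verify_rootfs(rootfs, hash_table)

orig_kernel = b"stock kernel"
evil_rootfs = b"rootfs with spyware"

# The adversary keeps the original signed kernel but ships a malicious
# rootfs together with that rootfs's own (internally valid) hash table:
# both independent checks still pass.
ok = verified_boot(orig_kernel, hashlib.sha256(orig_kernel).hexdigest(),
                   evil_rootfs, hashlib.sha256(evil_rootfs).hexdigest())
print(ok)  # True: the malicious rootfs passes this model of verified boot
```

A fix along the lines the paper suggests would make the rootfs hash table part of the signed kernel payload, so the two checks can no longer be mixed and matched.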

Husain, Mohammad Iftekhar; Qiao, Chunming; Sridhar, Ramalingam

2012-01-01

148

Annual Hanford seismic report -- fiscal year 1996  

SciTech Connect

Seismic monitoring (SM) at the Hanford Site was established in 1969 by the US Geological Survey (USGS) under a contract with the US Atomic Energy Commission. Since 1980, the program has been managed by several contractors under the US Department of Energy (USDOE). Effective October 1, 1996, the Seismic Monitoring workscope, personnel, and associated contracts were transferred to the USDOE Pacific Northwest National Laboratory (PNNL). SM is tasked to provide uninterrupted collection and archiving of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) located on and encircling the Hanford Site. SM is also tasked to locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data compiled are used by SM, Waste Management, and engineering activities at the Hanford Site to evaluate seismic hazards and seismic design for the Site.

Hartshorn, D.C.; Reidel, S.P.

1996-12-01

149

SEISMIC ANALYSIS FOR PRECLOSURE SAFETY  

SciTech Connect

The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. 
The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly evaluated and identified. This document supersedes the seismic classifications, assignments, and computations in ''Seismic Analysis for Preclosure Safety'' (BSC 2004a).

E.N. Lindner

2004-12-03

150

Teacher Directed Design: Content Knowledge, Pedagogy and Assessment under the Nevada K-12 Real-Time Seismic Network  

NASA Astrophysics Data System (ADS)

Education professionals and seismologists under the emerging SUN (Shaking Up Nevada) program are leveraging the existing infrastructure of the real-time Nevada K-12 Seismic Network to provide a unique inquiry based science experience for teachers. The concept and effort are driven by teacher needs and emphasize rigorous content knowledge acquisition coupled with the translation of that knowledge into an integrated seismology based earth sciences curriculum development process. We are developing a pedagogical framework, graduate level coursework, and materials to initiate the SUN model for teacher professional development in an effort to integrate the research benefits of real-time seismic data with science education needs in Nevada. A component of SUN is to evaluate teacher acquisition of qualified seismological and earth science information and pedagogy both in workshops and in the classroom and to assess the impact on student achievement. SUN's mission is to positively impact earth science education practices. With the upcoming EarthScope initiative, the program is timely and will incorporate EarthScope real-time seismic data (USArray) and educational materials in graduate course materials and teacher development programs. A number of schools in Nevada are contributing real-time data from both inexpensive and high-quality seismographs that are integrated with Nevada regional seismic network operations as well as the IRIS DMC. A powerful and unique component of the Nevada technology model is that schools can receive "stable" continuous live data feeds from hundreds of seismograph stations in Nevada, California, and the world (including live data from Earthworm systems and the IRIS DMC BUD, Buffer of Uniform Data). Students and teachers see their own networked seismograph station within a global context, as participants in regional and global monitoring. 
The robust real-time Internet communications protocols invoked in the Nevada network provide for local data acquisition, remote multi-channel data access, local time-series data management, interactive multi-window waveform display and time-series analysis with centralized meta-data control. Formally integrating educational seismology into the K-12 science curriculum with an overall "positive" impact to science education practices necessarily requires a collaborative effort between professional educators and seismologists yet driven exclusively by teacher needs.

Cantrell, P.; Ewing-Taylor, J.; Crippen, K. J.; Smith, K. D.; Snelson, C. M.

2004-12-01

151

An IBM 370 assembly language program verifier  

NASA Technical Reports Server (NTRS)

The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

Maurer, W. D.

1977-01-01

152

Publicly Verifiable Lotteries: Applications of Delaying Functions  

Microsoft Academic Search

This paper uses delaying functions, functions that require significant calculation time, in the development of a one-pass lottery scheme in which winners are chosen fairly using only internal information. Since all this information may be published (even before the lottery closes), anyone can do the calculation and therefore verify that the winner was chosen correctly. Since the calculation uses a delaying function, ticket purchasers cannot take
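A delaying function can be approximated with iterated hashing, which is inherently sequential. The sketch below is an assumed construction for illustration, not the scheme from the paper; `delay`, the iteration count, and the winner-selection rule are all hypothetical. It shows the verifiability property: anyone holding the published ticket list can redo the calculation and confirm the winner.

```python
import hashlib

def delay(seed: bytes, iterations: int = 100_000) -> bytes:
    # Iterated SHA-256: each step depends on the previous one, so the
    # result cannot be produced faster than ~iterations sequential hashes.
    h = seed
    for _ in range(iterations):
        h = hashlib.sha256(h).digest()
    return h

def pick_winner(tickets: list[str]) -> str:
    # Seed the delaying function with all published ticket data, then
    # reduce its output to an index into the ticket list.
    seed = hashlib.sha256("".join(tickets).encode()).digest()
    index = int.from_bytes(delay(seed), "big") % len(tickets)
    return tickets[index]

tickets = ["alice", "bob", "carol"]
w = pick_winner(tickets)
# Verification is just recomputation: any party gets the same winner.
assert w == pick_winner(tickets)
```

Because the seed is fixed once the ticket list is published, the delay is what prevents a late purchaser from computing the outcome before the lottery closes.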

David M. Goldschlag; Stuart G. Stubblebine

1998-01-01

153

Verifying Influence Diagrams using Dimensional Analysis  

Microsoft Academic Search

Developing a valid model is of primary importance. Various verification and validation procedures are used to establish confidence in the model output. To establish confidence that a model produces the right behaviour for the right reasons, it is essential to ensure that the structure of the model represents the real-world system. Amongst the verification procedures employed, dimensional analysis is used to verify the
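As a toy illustration of dimensional analysis as a verification check (an assumed encoding, not the paper's method), dimensions can be tracked as exponent tuples and compared across an equation such as dLevel/dt = inflow - outflow:

```python
# Dimensions encoded as exponents of (stock_units, time); this encoding
# and the rate equation are hypothetical examples, not from the paper.
level = (1, 0)               # a stock, measured in stock units
inflow = outflow = (1, -1)   # flows, measured in stock units per time

def ddt(dim):
    # differentiating with respect to time lowers the time exponent by 1
    return (dim[0], dim[1] - 1)

lhs = ddt(level)
# Every term of dLevel/dt = inflow - outflow must share one dimension.
assert lhs == inflow == outflow, "equation is dimensionally inconsistent"
print(lhs)  # (1, -1)
```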

G W Komanapalli

154

Verifying and validating a simulation model  

Microsoft Academic Search

This paper presents the verification and validation (V&V) of a simulation model with emphasis on possible modifications. Based on the analysis, a new framework is proposed, and new terms are defined. An example is employed to demonstrate how the framework and related terms are used in verifying and validating an existing model.

Anbin Hu; Ye San; Zicai Wang

2001-01-01

155

Verifying and validating a simulation model  

Microsoft Academic Search

This paper presents the verification and validation of a simulation model with the emphasis on possible modification. Based on the analysis, a new framework is proposed, and new terms are defined. An example is employed to demonstrate how the framework and related terms are used in verifying and validating an existing model

Anbin Hu; Ye San; Zicai Wang

2001-01-01

156

NASA/TM-2009-215726: Formally Verified Practical Algorithms for Recovery from Loss of Separation  

E-print Network

NASA/TM-2009-215726, Formally Verified Practical Algorithms for Recovery from Loss of Separation. Hampton, Virginia, June 2009.

Butler, Ricky W.

157

Verifying Concurrent Memory Reclamation Algorithms with Grace  

E-print Network

Memory reclamation algorithms proposed for concurrent data structures, such as hazard pointers, read-copy-update, and epoch-based reclamation, have proved very challenging for formal reasoning. In this paper, we show that different memory reclamation techniques actually

Rinetzky, Noam

158

Verifying ET-LOTOS programmes with KRONOS  

Microsoft Academic Search

This paper shows that real-time systems described in a reasonable subset of ET-LOTOS can be verified with Kronos by compiling them into timed automata. We illustrate the practical interest of our approach with a case study: the Tick-Tock protocol

Conrado Daws; Alfredo Olivero; Sergio Yovine

1994-01-01

159

Seismic waves  

NSDL National Science Digital Library

What causes seismic waves and how do they travel through the Earth? This instructional tutorial, part of an interactive laboratory series for grades 8-12, introduces students to seismic waves caused by earthquakes. Students answer questions as they move through the tutorial and investigate how P and S waves travel through layers of the Earth. In one activity, students can produce and view wave motion in a chain of particles. Scored student results are provided. A second activity introduces Love and Rayleigh waves. In a third activity, students study P and S waves by activating four seismographs, watching the resulting P and S waves travel through the Earth, and answering interactive questions. Five web sites about waves, seismic action, and earthquakes are included. Copyright 2005 Eisenhower National Clearinghouse

University of Utah. Astrophysics Science Project Integrating Research and Education (ASPIRE)

2003-01-01

160

Seismic Monitor  

NSDL National Science Digital Library

This web site provides an interactive map of global seismic activity that is updated every 30 minutes. The site uses data from the National Earthquake Information Center to produce a world map with clickable areas of seismic activity. Users can click on geographical areas of the map, and will be taken to a table which describes the time, location, magnitude and comments about particular seismic events. Information is kept for earthquakes that have occurred in the last 24 hours, 15 days, and five years. For earthquakes of a magnitude of 6.0 and over, links are provided to special information pages that try to explain the where, how and why that particular event occurred. The user can also view the ground motion associated with an event and visit seismology laboratories around the world.

161

Seismic Waves  

NSDL National Science Digital Library

The first site for this Topic in Depth comes from the Department of Geological and Mining Engineering and Sciences at Michigan Technological University and is called What Is Seismology? (1). The site describes the basics of seismology, the various types of waves associated with it, and even contains a link that shows you how to make your own P and S waves. Next is the Earthquakes Overview site (2), provided by The Tech Museum. Visitors can explore topics such as seismographs and waves through an informative and well-done site that can be enjoyed by all age levels. The third site, from the USGS Earthquake Hazards Program (3) Web site, contains animations of various seismic waves that give a very clear look at what happens during an earthquake. The site also contains other relevant links worth investigating. The University of Alaska Fairbanks Seismic Waves (4) Web site provides a diagram of an earthquake wave traveling through the earth and shows how far it travels from 15 seconds to 4 minutes after an earthquake event. The fifth site, called UK Macroseismology Home Page (5), explores the study of observable effects of earthquakes on people, buildings, and nature. Included are descriptions of macroseismic methods and the usefulness of macroseismic studies, among others. The Lesson Plans - High School (6) Web site offered by the Mid-America Earthquake Center contains links to several good lesson plans from various sources related to earthquakes and seismic waves. The seventh site is maintained by the Earth Ocean Atmosphere Scientific Systems company. The main page, called Earthstation Library (7), offers information on several topics, including a multimedia presentation on earthquakes and seismic waves. Under the Shockwave Demonstrations heading, visitors will find a link that provides a very interesting, visually stunning look at the subject. 
Lastly, from Earthscope comes the Earthscope Data (8) Web site, which provides a map that gives locations and links to seismic stations that give real-time seismic data from around the US.

Brieske, Joel A.

2002-01-01

162

Seismic Tomography  

NASA Astrophysics Data System (ADS)

The inversion of seismic travel-time data for radially varying media was initially investigated by Herglotz, Wiechert, and Bateman (the HWB method) in the early part of the 20th century [1]. Tomographic inversions for laterally varying media began in seismology starting in the 1970s. This included early work by Aki, Christoffersson, and Husebye who developed an inversion technique for estimating lithospheric structure beneath a seismic array from distant earthquakes (the ACH method) [2]. Also, Alekseev and others in Russia performed early inversions of refraction data for laterally varying upper mantle structure [3]. Aki and Lee [4] developed an inversion technique using travel-time data from local earthquakes.
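In linearized form, the travel-time inversions described above reduce to a linear system t = L s relating observed travel times to cell slownesses through ray path lengths. The two-cell model below is hypothetical, chosen only to show the mechanics on a system small enough to invert by hand:

```python
# Linearized travel-time tomography on a hypothetical 2-cell model:
# t = L s, where L[i][j] is the path length of ray i in cell j and
# s[j] is the slowness (1/velocity) of cell j.
L = [[2.0, 1.0],   # ray 1: 2 km in cell 1, 1 km in cell 2
     [1.0, 3.0]]   # ray 2: 1 km in cell 1, 3 km in cell 2
true_s = [0.25, 0.20]   # true slownesses in s/km (assumed for the demo)

# Forward model: synthetic travel times
t = [L[i][0] * true_s[0] + L[i][1] * true_s[1] for i in range(2)]

# Inversion: solve the 2x2 system by Cramer's rule
det = L[0][0] * L[1][1] - L[0][1] * L[1][0]
s1 = (t[0] * L[1][1] - t[1] * L[0][1]) / det
s2 = (L[0][0] * t[1] - L[1][0] * t[0]) / det
print(round(s1, 6), round(s2, 6))  # recovers 0.25 0.2
```

Real tomography solves this same kind of system with thousands of rays and cells, using damped least squares rather than a direct inverse, but the travel-time-to-slowness relationship is the same.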

Nowack, Robert L.; Li, Cuiping

163

Towards composition of verified hardware devices  

NASA Technical Reports Server (NTRS)

Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness, and mathematical proof offers one. Hardware verification research has focused on device verification and has largely ignored system composition verification. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

1991-01-01

164

Seismic isolation and passive response-control buildings in Japan  

Microsoft Academic Search

This paper presents a brief introduction to seismic isolation and passive structural response-control buildings in Japan. A total of 287 projects on seismically isolated building structures had obtained the required special permission in Japan by the end of September 1996. The effectiveness of seismic isolation buildings has been demonstrated and verified through the 1995 Hyogoken-nanbu (Kobe) earthquake. It has resulted

Yoshikazu Kitagawa; Mitsumasa Midorikawa

1998-01-01

165

Seismic Tomography.  

ERIC Educational Resources Information Center

Describes how seismic tomography is used to analyze the waves produced by earthquakes. The information obtained from the procedure can then be used to map the earth's mantle in three dimensions. The resulting maps are then studied to determine such information as the convective flow that propels the crustal plates. (JN)

Anderson, Don L.; Dziewonski, Adam M.

1984-01-01

166

7 CFR 1792.104 - Seismic acknowledgments.  

Code of Federal Regulations, 2010 CFR

...identification and date of the model code or standard that is used in the seismic design of the building project...statement shall identify the model code or standard identified that is used in the seismic design of the building or...

2010-01-01

167

Advanced Seismic While Drilling System  

SciTech Connect

A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high-frequencies. This is counter to the need for long signal transmission through rock. The patent pending SeismicPULSER{trademark} methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to have the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker that could generate low frequencies required for deep wells. The low frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER{trademark} drill string tool was designed and manufactured by TII. 
An APS Turbine Alternator powered the SeismicPULSER{trademark} to produce 2-Hz peak-frequency signals repeated every 20 seconds. Since the ION Geophysical, Inc. (ION) seismic survey surface recording system was designed to detect a minimum downhole signal of 3 Hz, successful performance was confirmed with a 5.3 Hz recording with the pumps running. The 2-Hz signal generated by the sparker was modulated with the 3.3 Hz signal produced by the mud pumps to create an intense 5.3 Hz peak-frequency signal. The low frequency sparker source is ultimately capable of generating selectable peak frequencies of 1 to 40 Hz with high-frequency spectral content to 10 kHz. The lower frequencies and, perhaps, low-frequency sweeps, are needed to achieve sufficient range and resolution for real-time imaging in deep (15,000 ft+), high-temperature (150 C) wells for (a) geosteering, (b) accurate seismic hole depth, (c) accurate pore pressure determinations ahead of the bit, (d) near wellbore diagnostics with a downhole receiver and wired drill pipe, and (e) reservoir model verification. Furthermore, the pressure of the sparker bubble will disintegrate rock, resulting in increased overall rates of penetration. Other applications for the SeismicPULSER{trademark} technology are to deploy a low-frequency source for greater range on a wireline for Reverse Vertical Seismic Profiling (RVSP) and Cross-Well Tomography. Commercialization of the technology is being undertaken by first contacting stakeholders to define the value proposition for rig site services utilizing SeismicPULSER{trademark} technologies. Stakeholders include national oil companies, independent oil companies, independents, service companies, and commercial investors. Service companies will introduce a new Drill Bit SWD service for deep HTHP wells. Collaboration will be encouraged between stakeholders in the form of joint industry projects to develop prototype tools and initial field trials. 
No barriers have been identified for developing, utilizing, and exploiting the low-frequency SeismicPULSER{trademark} source in a

Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

2008-06-30

168

Design Strategy for a Formally Verified Reliable Computing Platform  

E-print Network

must demonstrate that these systems meet stringent reliability requirements. Flight-critical components of commercial aircraft should have a probability of failure of at most 10^-9 for a 10-hour mission. Major tasks: 1. Quantifying the probability of system failure due to physical
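To relate a 10^-9-per-mission requirement to a constant per-hour failure rate, one can assume an exponential failure model; the assumption is for illustration only and is not stated in the record above:

```python
import math

# Exponential failure model (assumed): P(fail in T hours) = 1 - exp(-lam*T).
# Solve for the constant failure rate lam given the mission requirement.
p_fail = 1e-9          # maximum allowed probability of failure per mission
mission_hours = 10.0   # mission duration

lam = -math.log(1.0 - p_fail) / mission_hours  # failures per hour
print(f"{lam:.3e}")  # ~1.000e-10, i.e. one expected failure per 10^10 hours
```

For probabilities this small, lam is essentially p_fail / mission_hours, which is why such requirements are often quoted directly as 10^-10 failures per hour.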

Butler, Ricky W.

169

E-Verify is a service of DHS and SSA WHAT IS E-VERIFY?  

E-print Network

, Social Security number, and date of birth match government records. If your employer uses E-Verify, you of Special Counsel for Immigration Related Unfair Employment Practices at 1-800-255-7688 (TDD: 1

Bolding, M. Chad

170

Design of an UML conceptual model and implementation of a GIS with metadata information for a seismic hazard assessment cooperative project.  

NASA Astrophysics Data System (ADS)

This work illustrates the advantages of using a Geographic Information System in a cooperative project with researchers from different countries, such as the RESIS II project (financed by the Norwegian Government and managed by CEPREDENAC) for seismic hazard assessment of Central America. As input data come in different formats, cover distinct geographical areas, and are subject to different interpretations, data inconsistencies may appear and their management becomes complicated. To homogenize the data and integrate them in a GIS, a conceptual model must first be developed. This is accomplished in two phases: requirements analysis and conceptualization. The Unified Modeling Language (UML) is used to compose the conceptual model of the GIS. UML complies with ISO 19100 norms and allows the designer to define model architecture and interoperability. The GIS provides a frame for combining large volumes of geographic data with a uniform geographic reference and without duplications. All this information contains its own metadata following the ISO 19115 normative. In this work, the integration in the same environment of active fault and subduction slab geometries, combined with epicentre locations, has facilitated the definition of seismogenetic regions. This greatly helps national specialists from different countries work together. The GIS capacity for making queries (by location and by attributes) and geostatistical analyses is used to interpolate discrete data resulting from seismic hazard calculations and to create continuous maps, as well as to check and validate partial results of the study. GIS-based products, such as complete, homogenised databases and thematic cartography of the region, are distributed to all researchers, facilitating cross-national communication, project execution, and results dissemination.

Torres, Y.; Escalante, M. P.

2009-04-01

171

Conceptual design report: Nuclear materials storage facility renovation. Part 5, Structural/seismic investigation. Section B, Renovation calculations/supporting data  

SciTech Connect

The Nuclear Materials Storage Facility (NMSF) at the Los Alamos National Laboratory (LANL) was a Fiscal Year (FY) 1984 line-item project completed in 1987 that has never been operated because of major design and construction deficiencies. This renovation project, which will correct those deficiencies and allow operation of the facility, is proposed as an FY 97 line item. The mission of the project is to provide centralized intermediate and long-term storage of special nuclear materials (SNM) associated with defined LANL programmatic missions and to establish a centralized SNM shipping and receiving location for Technical Area (TA)-55 at LANL. Based on current projections, existing storage space for SNM at other locations at LANL will be loaded to capacity by approximately 2002. This will adversely affect LANL's ability to meet its mission requirements in the future. The affected missions include LANL's weapons research, development, and testing (WRD&T) program; special materials recovery; stockpile surveillance/evaluation; advanced fuels and heat sources development and production; and safe, secure storage of existing nuclear materials inventories. The problem is further exacerbated by LANL's inability to ship any materials offsite because of the lack of receiver sites for material and regulatory issues. Correction of the current deficiencies and enhancement of the facility will provide centralized storage close to a nuclear materials processing facility. The project will enable long-term, cost-effective storage in a secure environment with reduced radiation exposure to workers, and eliminate potential exposures to the public. This report is organized into seven parts. This document, Part V, Section B - Structural/Seismic Information, provides a description of the seismic and structural analyses performed on the NMSF and their results.

NONE

1995-07-14

172

Seismic Reflection and Refraction  

NSDL National Science Digital Library

This web site provides a brief introduction to the process of seismic exploration. Included are a definition of seismic exploration, a listing of possible applications of seismic methods, definitions of seismic reflection and refraction, and an explanation of data processing with seismic methods. The text descriptions are accompanied by visualizations helping to aid the reader in their understanding of the concepts discussed.

173

iMUSH: The design of the Mount St. Helens high-resolution active source seismic experiment  

NASA Astrophysics Data System (ADS)

Mount St. Helens is one of the most societally relevant and geologically interesting volcanoes in the United States. Although much has been learned about the shallow structure of this volcano since its eruption in 1980, important questions still remain regarding its magmatic system and connectivity to the rest of the Cascadia arc. For example, the structure of the magma plumbing system below the shallowest magma chamber under the volcano is still only poorly known. This information will be useful for hazard assessment for the southwest Washington area, and also for gaining insight into fundamental scientific questions such as the assimilation and differentiation processes that lead to the formation of continental crust. As part of the multi-disciplinary imaging of Magma Under St. Helens (iMUSH) experiment, funded by NSF GeoPRISMS and EarthScope, an active source seismic experiment will be conducted in late summer 2014. The experiment will utilize all of the 2600 IRIS/PASSCAL/USArray Texan instruments. The instruments will be deployed as two 1000-instrument consecutive refraction profiles (one N/S and one WNW/ESE). Each of these profiles will be accompanied by two 1600-instrument areal arrays at varying distances from Mount St. Helens. Finally, one 2600-instrument areal array will be centered on Mount St. Helens. These instruments will record a total of twenty-four 500-1000 kg shots. Each refraction profile will have an average station spacing of 150 m, and a total length of 150 km. The stations in the areal arrays will be separated by ~1 km. A critical step in the success of this project is to develop an experimental setup that can resolve the most interesting aspects of the magmatic system. In particular, we want to determine the distribution of shot locations that will provide good coverage throughout the entire model space, while still allowing us to focus on regions likely to contain the magmatic plumbing system. 
In this study, we approach this problem by calculating Fréchet kernels with dynamic ray tracing. An initial observation from these kernels is that waves traveling across the largest offsets of the experiment (~150 km) have sensitivity below depths of 30 km. This means that we may be able to image the magmatic system down to the Moho, estimated at ~40 km. Additional work is focusing on searching for the shot locations that provide high resolution around very shallow features beneath Mount St. Helens, such as the first magmatic reservoir at about 3 km depth, and the associated Mount St. Helens seismic zone. One way in which we are guiding this search is to find the shot locations that maximize sensitivity values within the regions of interest after summing Fréchet kernels from each shot/station pair.

Kiser, Eric; Levander, Alan; Harder, Steve; Abers, Geoff; Creager, Ken; Vidale, John; Moran, Seth; Malone, Steve

2013-04-01

174

Seismic Waves  

NSDL National Science Digital Library

In this activity, students learn about the different types of seismic waves in an environment they can control. Using an interactive, online wave generator, they will study P waves, S waves, Love waves, and Rayleigh waves, and examine a combination of P and S waves that crudely simulates the wave motion experienced during an earthquake. A tutorial is provided to show how the wave generator is used.

175

Seismic Signals  

NSDL National Science Digital Library

Not so long ago, people living near volcanoes had little that might help them to anticipate an eruption. A deep rumble, a puff of smoke, and ash might foreshadow a major volcanic event. Or a volcano might erupt with no warning at all. This interactive feature illustrates some of the types of seismic activity that may precede an eruption, which modern seismologists are studying in hopes of improving their ability to predict eruptions.

2010-11-30

176

Seismic Waves  

NSDL National Science Digital Library

This demonstration elucidates the concept of propagation of compressional waves (primary or P waves) and shear waves (secondary or S waves), which constitute the seismic waves used in locating and modeling earthquakes and underground nuclear explosions, and for imaging the interior structure of the Earth. The demonstration uses a slinky, pushed along its axis to create a compressional (longitudinal) wave, and moved up and down on one end to create a shear (transverse) wave.
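The distinction the slinky makes visible can also be expressed quantitatively: P- and S-wave speeds follow directly from the elastic moduli and density of the medium. The granite-like values below are illustrative, not taken from the activity:

```python
import math

def vp(K, mu, rho):
    """Compressional (P) wave speed from bulk modulus K, shear modulus mu, density rho."""
    return math.sqrt((K + 4.0 * mu / 3.0) / rho)

def vs(mu, rho):
    """Shear (S) wave speed; depends only on rigidity and density."""
    return math.sqrt(mu / rho)

# representative crustal-rock values (illustrative)
K, mu, rho = 50e9, 30e9, 2700.0     # Pa, Pa, kg/m^3
print(f"Vp = {vp(K, mu, rho):.0f} m/s, Vs = {vs(mu, rho):.0f} m/s")
# P waves always outrun S waves; Vp/Vs is ~1.73 for a Poisson solid
```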

Barker, Jeffrey

177

Seismic Waves and the Slinky  

NSDL National Science Digital Library

This teaching guide is designed to introduce the concepts of seismic waves that propagate within the Earth, and to provide ideas and suggestions for how to teach about seismic waves. The guide provides information on the types and properties of seismic waves and instructions for using the slinky to effectively demonstrate seismic wave characteristics and wave propagation. Most of the activities described in the guide are useful both as demonstrations for the teacher and as exploratory activities for students. A single slinky is used to demonstrate P and S waves, a Love wave on a floor or tabletop, and Rayleigh waves using three people and a long slinky. Five slinkys attached to a wood block show that waves propagate in all directions from the source and that the wave vibration for P and S sources differs with direction from the source.

Braile, Lawrence

178

Verifying Soft Deadlines with Probabilistic Timed Automata ?  

E-print Network

The design and analysis of many hardware and software systems, for example embedded systems, monitoring equipment, and communication and multimedia protocols, requires detailed modelling of soft deadlines.

Segala, Roberto

179

Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP  

USGS Publications Warehouse

The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway, and another is planned. Two of the port structures have been seismically strengthened. The primary goals of the strong motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise installation after construction is not possible. This paper summarizes the two seismically-strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

Huang, M. J.; Shakal, A. F.

2009-01-01

180

A fiber optic Bragg grating seismic sensor  

NASA Astrophysics Data System (ADS)

Here we present a fiber optic seismic wave sensor based on in-fiber Bragg gratings. Fiber Bragg Grating (FBG) sensors have been demonstrated to have very high sensitivity to dynamic strain in the sub-micro-strain range, with a dynamic response extending from static loads to very high frequencies. The seismic sensing system integrates three FBG dynamic strain sensors into a mechanical structure acting as an inverted pendulum. The polar symmetry of the mechanical system and the 120° placement of the FBG sensors give the seismic sensor directional capability. Design, manufacturing, and preliminary dynamic testing of the seismic sensor are discussed.
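The directional capability provided by the 120° sensor layout can be illustrated with the standard delta-rosette equations, which recover the full in-plane strain state from three gauge readings. This is a generic sketch; the variable names and the simulated strain state are invented, not taken from the sensor design:

```python
import math

def strain_state(e1, e2, e3):
    """Recover the in-plane strain tensor from three strain gauges
    (here, FBGs) at 0, 120 and 240 degrees, as in a delta rosette.
    Returns (eps_x, eps_y, gamma_xy)."""
    eps_x = e1
    eps_y = (2.0 * (e2 + e3) - e1) / 3.0
    gamma_xy = 2.0 * (e3 - e2) / math.sqrt(3.0)
    return eps_x, eps_y, gamma_xy

def principal_angle(eps_x, eps_y, gamma_xy):
    """Direction of the principal strain axis (radians), i.e. the
    directional information the 120-degree layout provides."""
    return 0.5 * math.atan2(gamma_xy, eps_x - eps_y)

# forward-simulate gauge readings for a known strain state, then invert
ex, ey, gxy = 3e-6, 1e-6, 2e-6

def gauge(theta):
    # normal strain seen by a gauge oriented at angle theta
    c, s = math.cos(theta), math.sin(theta)
    return ex * c * c + ey * s * s + gxy * s * c

readings = [gauge(math.radians(a)) for a in (0, 120, 240)]
rx, ry, rxy = strain_state(*readings)
print(rx, ry, rxy)   # recovers the simulated strain state
```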

Laudati, A.; Mennella, F.; Esposito, M.; Cusano, A.; Giordano, M.; Breglio, G.; Sorge, S.; Calisti Tassini, C.; Torre, A.; D'Altrui, G.; Cutolo, A.

2007-07-01

181

Adaptive Tabulation for Verified Equations of State  

NASA Astrophysics Data System (ADS)

For over forty years, large hydrodynamic calculations have used tabulated equation of state (EOS) models to reduce the computation cost associated with complex EOS models. Ideally, these tables would be verified, in that values interpolated from them match the direct EOS model calculations within some level of accuracy. For typical rectangular-gridded tables, and associated interpolation schemes, the verification error is often found to be quite large. Outstanding issues include grid coarseness and difficulty in reproducing phase boundary topology. Decreasing the grid spacing quickly becomes inefficient, due to increasing storage requirements. Instead, a tabulation approach is demonstrated that naturally incorporates the phase boundary topology through a triangulated interpolation domain. A given verification level as well as thermodynamic consistency and stability are ensured through an adaptive refinement process. Improvements are demonstrated on a simple multi-phase EOS model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
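The benefit of a triangulated interpolation domain can be sketched with plain barycentric interpolation, which is exact for linear data inside each triangle and lets mesh edges follow phase boundaries. This minimal example is illustrative and is not the paper's actual tabulation scheme:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w_a = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    w_b = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return w_a, w_b, 1.0 - w_a - w_b

def interp(p, tri, values):
    """Linearly interpolate vertex-tabulated values at point p."""
    return sum(w * v for w, v in zip(barycentric(p, *tri), values))

# one triangle of a (density, temperature)-style table; the tabulated field
# is linear, so interpolation reproduces it exactly -- the "verified" property
tri = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
vals = (1.0, 3.0, 4.0)        # samples of f(x, y) = 1 + 2x + 3y at the vertices
print(interp((0.25, 0.25), tri, vals))   # 1 + 0.5 + 0.75 = 2.25
```

An adaptive refiner would split any triangle where the interpolant disagrees with the direct EOS evaluation by more than the verification tolerance.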

Carpenter, John H.

2011-06-01

182

Romanian Educational Seismic Network Project  

NASA Astrophysics Data System (ADS)

Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and understanding earthquake phenomena and their effects at the Earth's surface is therefore an important step in educating the population of earthquake-affected regions and raising awareness of earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN-INCERC" Bucharest, the Babeș-Bolyai University (Faculty of Environmental Sciences and Engineering), and the software firm "BETA Software". The project has educational, scientific, and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive; a large amount of such data will then be used by students and teachers for educational purposes. As for the social objectives, the project is an effective instrument for informing the public and creating awareness of seismic risk, for experimenting with effective scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers will be installed in several schools in the most important seismic areas (Vrancea, Dobrogea), vulnerable cities (Bucharest, Ploiesti, Iasi), and highly populated cities (Cluj, Sibiu, Timisoara, Zalău). All the elements of the seismic station are especially designed for educational purposes and can be operated independently by the students and teachers themselves. The first stage of the ROEDUSEIS project centered on preparing educational materials for all levels of pre-university education (kindergarten, primary, secondary, and high school). A needs-assessment study, conducted through a set of questionnaires sent to teachers and students at the participating schools, preceded the preparation of the materials; the responses served as feedback for editing the materials properly. The topics covered include: seismicity (general principles, characteristics of Romanian seismicity, historical local events), the structure of the Earth, the measurement of earthquakes, and seismic hazard and risk.

Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

2013-04-01

183

Verifying Soft Deadlines with Probabilistic Timed Automata  

E-print Network

The design and analysis of many hardware and software systems, for example embedded systems, monitoring equipment, and communication and multimedia protocols, requires detailed modelling of soft deadlines. In the case of safety-critical systems, such as hospital monitoring equipment and vehicle controllers, it is essential …

Oxford, University of

184

Mechanisms of seismic quiescences  

Microsoft Academic Search

In the past decade there have been major advances in understanding the seismic cycle in terms of the recognition of characteristic patterns of seismicity over the entire tectonic loading cycle. The most distinctive types of patterns are seismic quiescences, of which three types can be recognized: post-seismic quiescence, which occurs in the region of the rupture zone of an earthquake, and …

Christopher H. Scholz

1988-01-01

185

Verifying Parameterized taDOM+ Lock Managers  

Microsoft Academic Search

taDOM* protocols are designed to provide a lock-based approach to handling multiple accesses to XML databases. The notion of the taDOM+ protocol is formalized and generalized, and a formal model of a taDOM+ lock manager, parameterized in the number of transactions and in the size of the database, is presented. An important class of safety properties of taDOM+ lock managers was proven …

Antti Siirtola; Michal Valenta

2008-01-01

186

Seismic sources  

DOEpatents

Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements for more than about one minute. 9 figs.

Green, M.A.; Cook, N.G.W.; McEvilly, T.V.; Majer, E.L.; Witherspoon, P.A.

1987-04-20

187

Seismic refraction exploration  

SciTech Connect

In seismic exploration, refracted seismic energy is detected by seismic receivers to produce seismograms of subsurface formations. The seismograms are produced by directing seismic energy from an array of sources at an angle to be refracted by the subsurface formations and detected by the receivers. The directivity of the array is obtained by delaying the seismic pulses produced by each source in the source array.
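The delay-based directivity described in this record amounts to beam steering: firing each source in the linear array slightly later tilts the combined wavefront toward the refractor. A minimal sketch, with illustrative parameter values:

```python
import math

def steering_delays(n_sources, spacing, angle_deg, velocity):
    """Firing delays that tilt the wavefront of a linear source array by
    `angle_deg` from vertical, so the combined energy leaves at an angle
    suitable for refraction along a subsurface interface.

    Delay for source i: t_i = i * d * sin(theta) / v
    """
    theta = math.radians(angle_deg)
    return [i * spacing * math.sin(theta) / velocity for i in range(n_sources)]

# 5 sources, 50 m apart, steered 30 degrees in 1500 m/s near-surface material
delays = steering_delays(5, 50.0, 30.0, 1500.0)
print([f"{t * 1e3:.2f} ms" for t in delays])
```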

Ruehle, W.H.

1980-12-30

188

LANL seismic screening method for existing buildings  

SciTech Connect

The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O. [and others]

1997-01-01

189

An investigation of the principles and practices of seismic isolation in bridge structures  

E-print Network

Within the past decade, seismic isolation systems have gained rapid popularity in the earthquake resistant design of bridge structures. This popularity has come in response to the inadequacy of earlier seismic design and ...

Lapointe, Evan M. (Evan McNeil), 1981-

2004-01-01

190

Seismic retrofitting of deficient Canadian buildings  

E-print Network

Many developed countries such as Canada and the United States are facing a significant infrastructure crisis. Most of their facilities have been built with little consideration of seismic design and durability issues. As ...

Gemme, Marie-Claude

2009-01-01

191

Integrated seismic monitoring in Slovakia  

NASA Astrophysics Data System (ADS)

Two seismic networks are operated on the territory of the Slovak Republic by two academic institutions. The Geophysical Institute of the Slovak Academy of Sciences operates the Slovak National Network of Seismic Stations (SNNSS, established in 2004), and the Faculty of Mathematics, Physics and Informatics, Comenius University Bratislava operates the Local Seismic Network Eastern Slovakia (LSNES, established in 2007). SNNSS is focused on regional seismicity and participates in international data exchange on a regular basis. LSNES, designed to be compatible and complementary with the existing SNNSS infrastructure, is focused on the seismicity of the eastern Slovakia source zone. The two networks share a database and archive, so the expenses and workload of operating the joint data center are split between the two institutions. The cooperation enhances the overall reliability of the data center while not interfering with the original scopes of the two networks. A relational database with a thin client based on a standard web browser is implemented; maintenance requirements of the clients are reduced to a minimum, and it is easier to manage the system's integrity. The database manages parametric data, macroseismic data, waveform data, inventory data, and geographic data. The database is not only a central part of the data processing of the two institutions; it also forms the core of the warning system. The warning-system functionality requires the development of modules beyond the standard seismic database functionality. Modules for editing, publishing, and automatically processing macroseismic questionnaires were implemented for the warning system, and the database integrates macroseismic data with other seismic data.

Bystrický, E.; Kristeková, M.; Moczo, P.; Cipciar, A.; Fojtíková, L.; Pažák, P.; Gális, M.

2009-04-01

192

Seismic Performance Requirements for WETF  

SciTech Connect

This report develops recommendations for requirements on the Weapons Engineering Tritium Facility (WETF) performance during seismic events. These recommendations are based on fragility estimates of WETF structures, systems, and components that were developed by LANL experts during facility walkdowns. They follow DOE guidance as set forth in standards DOE-STD-1021-93, ''Natural Phenomena Hazards Performance Categorization Guidelines for Structures, Systems, and Components'' and DOE-STD-1020-94, ''Natural Phenomena Hazards Design and Evaluation Criteria for Department of Energy Facilities''. Major recommendations are that WETF institute a stringent combustible loading control program and that additional seismic bracing and anchoring be provided for gloveboxes and heavy equipment.

Hans Jordan

2001-01-01

193

Specifying and Verifying Systems With TLA+  

E-print Network

Specifying and Verifying Systems With TLA+. Leslie Lamport, Microsoft Research; John Matthews, HP Labs Cambridge.

Tuttle, Mark R.

194

End-to-end verifiability for optical scan voting systems  

E-print Network

End-to-end verifiable voting systems allow voters to verify that their votes are cast as intended, collected as cast, and counted as collected. Essentially, end-to-end voting systems provide voters assurance that each step ...

Shen, Emily (Emily Huei-Yi)

2008-01-01

195

The ENAM Explosive Seismic Source Test  

NASA Astrophysics Data System (ADS)

We present the results of the pilot study conducted as part of the eastern North American margin (ENAM) community seismic experiment (CSE) to test an innovative design of land explosive seismic source for crustal-scale seismic surveys. The ENAM CSE is a community-based onshore-offshore controlled- and passive-source seismic experiment spanning a 400 km-wide section of the mid-Atlantic East Coast margin around Cape Hatteras. The experiment was designed to address prominent research questions such as the role of the pre-existing lithospheric grain in the structure and evolution of the ENAM margin, the distribution of magmatism, and the along-strike segmentation of the margin. In addition to a broadband OBS deployment, the CSE will acquire multichannel marine seismic data and two major onshore-offshore controlled-source seismic profiles recording both marine sources (airguns) and land explosions. The data acquired as part of the ENAM CSE will be available to the community immediately upon completion of the QC procedures required for archiving purposes. The ENAM CSE provides an opportunity to test a radically new and more economical design for the land explosive seismic sources used in crustal-scale seismic surveys. Over the years we have incrementally improved the performance and reduced the cost of shooting crustal seismic shots. These improvements have come from better explosives and more efficient configurations of those explosives. The improvements are largely intuitive: higher-velocity explosives and shorter, but larger-diameter, explosive configurations. Recent theoretical advances, however, allow us to model not only these incremental improvements but also more radical shot designs, which further enhance performance and reduce costs. Because some of these designs are so radical, they need experimental verification. To better engineer the shots for the ENAM experiment, we are conducting an explosives test in the region of the ENAM CSE.
The results of this test will guide engineering for the main ENAM experiment as well as other experiments in the future.

Harder, S. H.; Magnani, M. B.

2013-12-01

196

28 CFR 802.13 - Verifying your identity.  

Code of Federal Regulations, 2011 CFR

...2011-07-01 false Verifying your identity. 802.13 Section 802...802.13 Verifying your identity. (a) Requests for your...yourself, you must verify your identity. You must state your...your option, include your social security number. (b)...

2011-07-01

197

28 CFR 802.13 - Verifying your identity.  

Code of Federal Regulations, 2010 CFR

...2010-07-01 false Verifying your identity. 802.13 Section 802...802.13 Verifying your identity. (a) Requests for your...yourself, you must verify your identity. You must state your...your option, include your social security number. (b)...

2010-07-01

198

Seismic noise investigations: A method for seismic microzoning in areas of nuclear power plants  

Microsoft Academic Search

The noise method requiring exact study of a site relative to the position of noise sources is described. In Europe and the U.S. frequently standardized seismic data are used for designing nuclear power plants which are secure from effects of earthquakes. Not considered in the standardized data are, generally, the seismic soil amplifications, i.e. the amplification of the earthquake waves

M. Steinwachs

1976-01-01

199

Seismic sources  

DOEpatents

Apparatus is described for placement in a borehole in the earth, which enables the generation of closely controlled seismic waves from the borehole. Pure torsional shear waves are generated by an apparatus which includes a stator element fixed to the borehole walls and a rotor element which is electrically driven to rapidly oscillate on the stator element to cause reaction forces transmitted through the borehole walls to the surrounding earth. Longitudinal shear waves are generated by an armature that is driven to rapidly oscillate along the axis of the borehole relative to a stator that is clamped to the borehole, to cause reaction forces transmitted to the surrounding earth. Pressure waves are generated by electrically driving pistons that press against opposite ends of a hydraulic reservoir that fills the borehole. High power is generated by energizing the elements at a power level that causes heating to over 150 °C within one minute of operation, but energizing the elements for no more than about one minute.

Green, Michael A. (Oakland, CA); Cook, Neville G. W. (Lafayette, CA); McEvilly, Thomas V. (Berkeley, CA); Majer, Ernest L. (El Cerrito, CA); Witherspoon, Paul A. (Berkeley, CA)

1992-01-01

200

Seismic safety of high concrete dams  

NASA Astrophysics Data System (ADS)

China is a country of high seismicity with many hydropower resources. Recently, a series of high dams, most of them concrete, have either been completed or are being constructed in seismic regions. The evaluation of seismic safety often becomes a critical problem in dam design. In this paper, a brief introduction to major progress in the research on seismic aspects of large concrete dams, conducted mainly at the Institute of Water Resources and Hydropower Research (IWHR) during the past 60 years, is presented. The dam site-specific ground motion input, improved response analysis, dynamic model test verification, field experiment investigations, dynamic behavior of dam concrete, and seismic monitoring and observation are described. Methods to prevent collapse of high concrete dams under maximum credible earthquakes are discussed.

Chen, Houqun

2014-08-01

201

Verifying an interactive consistency circuit: A case study in the reuse of a verification technology  

NASA Technical Reports Server (NTRS)

The work done at ORA for NASA-LRC in the design and formal verification of a hardware implementation of a scheme for attaining interactive consistency (byzantine agreement) among four microprocessors is presented in view graph form. The microprocessors used in the design are an updated version of a formally verified 32-bit, instruction-pipelined, RISC processor, MiniCayuga. The 4-processor system, which is designed under the assumption that the clocks of all the processors are synchronized, provides software control over the interactive consistency operation. Interactive consistency computation is supported as an explicit instruction on each of the microprocessors. An identical user program executing on each of the processors decides when and on what data interactive consistency must be performed. This exercise also served as a case study to investigate the effectiveness of reusing the technology which was developed during the MiniCayuga effort for verifying synchronous hardware designs. MiniCayuga was verified using the verification system Clio which was also developed at ORA. To assist in reusing this technology, a computer-aided specification and verification tool was developed. This tool specializes Clio to synchronous hardware designs and significantly reduces the tedium involved in verifying such designs. The tool is presented and how it was used to specify and verify the interactive consistency circuit is described.
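The interactive consistency operation at the heart of the design can be sketched, in much-simplified form, as one oral-messages exchange with per-source majority voting among four nodes tolerating one fault. This toy model assumes the faulty node lies identically to everyone, unlike a true Byzantine adversary, and is illustrative only, not the verified circuit:

```python
from collections import Counter

def interactive_consistency(values, faulty=None, lie=0):
    """One exchange of a simplified 4-processor, 1-fault oral-messages
    scheme: every node broadcasts its value, every node relays what it
    heard, and each node majority-votes per source.
    Returns each node's agreed vector, indexed by source node."""
    n = len(values)
    vectors = []
    for me in range(n):
        agreed = []
        for src in range(n):
            # what `me` heard directly, plus relays from every other node
            reports = [values[src]]
            for relay in range(n):
                if relay in (me, src):
                    continue
                reports.append(lie if relay == faulty else values[src])
            agreed.append(Counter(reports).most_common(1)[0][0])
        vectors.append(agreed)
    return vectors

# four processors, processor 3 relays garbage; the rest still agree
vecs = interactive_consistency([10, 20, 30, 40], faulty=3, lie=-1)
print(vecs[0] == vecs[1] == vecs[2])   # non-faulty nodes hold identical vectors
```

With four nodes and one liar, each per-source vote sees at least two truthful reports out of three, which is why the non-faulty nodes converge.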

Bickford, Mark; Srivas, Mandayam

1990-01-01

202

Regional Seismic Methods of Identifying Explosions  

NASA Astrophysics Data System (ADS)

A lesson from the 2006, 2009 and 2013 DPRK declared nuclear explosion Ms:mb observations is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, we need to put our empirical methods on a firmer physical footing. Here we review two of the main identification methods: 1) P/S ratios and 2) moment tensor techniques, which can be applied at regional distances (200-1600 km) to very small events, improving nuclear explosion monitoring and confidence in verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes (e.g. Walter et al., 1995). However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. Calculated intermediate-period (10-100 s) waveforms from regional 1-D models can match data and provide moment tensor results that separate explosions from earthquakes and cavity collapses (e.g. Ford et al. 2009). However, it has long been observed that some nuclear tests produce large Love waves and reversed Rayleigh waves that complicate moment tensor modeling. Again, the physical basis for the generation of these effects from explosions remains incompletely understood. We are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative amplitude techniques can be employed to better quantify differences between explosions and to understand those differences in terms of depth, media and other properties. 
We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Our goal is to improve our explosion models and our ability to understand and predict where P/S and moment tensor methods of identifying explosions work, and any circumstances where they may not. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
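A minimal sketch of the high-frequency P/S discriminant described above, measuring band-limited spectral amplitudes in P and S windows of a single trace. The window indices, frequency band, and synthetic trace are invented for illustration and do not reproduce the cited studies:

```python
import numpy as np

def ps_ratio_db(trace, fs, p_window, s_window, band=(2.0, 8.0)):
    """High-frequency P/S amplitude ratio in dB, a classic regional
    discriminant: explosions are relatively S-poor, so their ratio is high.

    trace              : 1-D seismogram
    fs                 : sampling rate (Hz)
    p_window, s_window : (start, end) sample indices of the P and S arrivals
    band               : frequency band (Hz) over which to measure amplitudes
    """
    def band_amp(segment):
        spec = np.abs(np.fft.rfft(segment))
        freqs = np.fft.rfftfreq(len(segment), 1.0 / fs)
        sel = (freqs >= band[0]) & (freqs <= band[1])
        return spec[sel].mean()

    p_amp = band_amp(trace[slice(*p_window)])
    s_amp = band_amp(trace[slice(*s_window)])
    return 20.0 * np.log10(p_amp / s_amp)

# synthetic sanity check: P twice as strong as S in the 2-8 Hz band
fs = 100.0
t = np.arange(0, 1.0, 1.0 / fs)
p = 1.0 * np.sin(2 * np.pi * 4.0 * t)
s = 0.5 * np.sin(2 * np.pi * 4.0 * t)
trace = np.concatenate([p, s])
print(ps_ratio_db(trace, fs, (0, 100), (100, 200)))   # ~6 dB
```

In practice the windows come from picked or predicted P and S travel times, and the ratio is corrected for distance and site effects before being compared against earthquake populations.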

Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Hauk, T. F.

2013-12-01

203

Seismic qualification of unanchored equipment  

SciTech Connect

This paper describes procedures used to design and qualify unanchored equipment to survive seismic events to the PC-4 level in a moderate seismic area. The need for flexibility to move experimental equipment, together with the requirements for remote handling in a highly-radioactive non-reactor nuclear facility, precluded normal equipment anchorage. Instead, equipment was designed to remain stable under anticipated DBE floor motions with sufficient margin to achieve the performance goal. The equipment was also designed to accommodate anticipated sliding motions with sufficient margin. The simplified design criteria used to achieve these goals were based on extensive time-history simulations of sliding, rocking, and overturning of generic equipment models. The entire process was subject to independent peer review and accepted in a Safety Evaluation Report. The process provides a model suitable for adaptation to similar applications and for assessment of the potential for seismic damage of existing, unanchored equipment. In particular, the paper describes: (1) Two-dimensional sliding studies of deformable equipment subject to 3-D floor excitation as the basis for simplified sliding-radius and sliding-velocity design criteria. (2) Two-dimensional rocking and overturning simulations of rigid equipment used to establish design criteria for minimum base dimensions and equipment rigidity to prevent overturning. (3) Assumed-mode rocking analyses of deformable equipment models used to establish uplift magnitudes and subsequent impacts during stable rocking motions. The model used for these dynamic impact studies is reported elsewhere.

Moran, T.J.

1995-12-01

204

Linking the Meaning of Programs to What the Compiler Can Verify  

Microsoft Academic Search

Abstract. We formulate some research and development challenges that relate what a verifying compiler can verify to the definition and analysis of the application content of programs, where the analysis comprises both experimental validation and mathematical verification. We also point to a practical framework for dealing with these challenges, namely the Abstract State Machines (ASM) method for high-level system design and analysis.

Egon Börger

2005-01-01

205

Seismic data acquisition method  

SciTech Connect

The field locations of seismic shot points are chosen to produce partial multifold data, the static correction equations of which are at least partially coupled. The seismic cross sections resulting therefrom are substantially improved.

Johnson, P.W.

1984-08-21

206

Automatic Generation of the C# Code for Security Protocols Verified with Casper\\/FDR  

Microsoft Academic Search

Formal methods techniques offer a means of verifying the correctness of the design process used to create a security protocol. Notwithstanding the successful verification of the design of security protocols, their implementation code may still contain security flaws, due to mistakes made by the programmers or bugs in the programming language itself. We propose an ACG-C# tool, which …

Chul-wuk Jeon; Il-gon Kim; Jin-young Choi

2005-01-01

207

Seismic-Scale Rock Physics of Methane Hydrate  

SciTech Connect

We quantify natural methane hydrate reservoirs by generating synthetic seismic traces and comparing them to real seismic data: if the synthetic matches the observed data, then the reservoir properties and conditions used in synthetic modeling might be the same as the actual, in-situ reservoir conditions. This approach is model-based: it uses rock physics equations that link the porosity and mineralogy of the host sediment, pressure, and hydrate saturation, and the resulting elastic-wave velocity and density. One result of such seismic forward modeling is a catalogue of seismic reflections of methane hydrate which can serve as a field guide to hydrate identification from real seismic data. We verify this approach using field data from known hydrate deposits.
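The model-based chain described above, from sediment properties to velocity to reflectivity, can be caricatured with a simple time-average velocity relation and a normal-incidence reflection coefficient. The Wyllie mixing rule and all numerical values here are illustrative stand-ins for the paper's actual rock-physics equations:

```python
def wyllie_vp(phi, v_matrix, v_fluid):
    """Wyllie time-average P velocity for porosity phi (a crude stand-in
    for a proper rock-physics model)."""
    return 1.0 / ((1.0 - phi) / v_matrix + phi / v_fluid)

def reflection_coeff(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient from the impedance contrast."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# hydrate partially replacing pore water stiffens the sediment; model it
# (very crudely) by mixing the pore velocity toward a hydrate velocity
phi, v_qtz, v_wat, v_hyd = 0.4, 5500.0, 1500.0, 3650.0   # illustrative, SI units
for sat in (0.0, 0.5):
    v_pore = (1 - sat) * v_wat + sat * v_hyd
    vp = wyllie_vp(phi, v_qtz, v_pore)
    r = reflection_coeff(1800.0, 1600.0, 1950.0, vp)
    print(f"hydrate saturation {sat:.0%}: Vp = {vp:.0f} m/s, R = {r:.2f}")
```

The same logic, with calibrated equations, is what lets a synthetic trace computed from an assumed saturation be compared against the observed reflection amplitudes.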

Amos Nur

2009-01-08

208

Seismic Refraction Lab  

NSDL National Science Digital Library

The following lab will introduce students to the basic concepts of seismic refraction as well as some actual data collected during seismic refraction surveys. You will use your knowledge of seismic refraction to calculate various parameters of interest. The two travel time data sets come from an existing SERC activity by Robert Cicerone at Bridgewater State College.

Marshall, Scott

209

Micromachined silicon seismic transducers  

SciTech Connect

Batch-fabricated silicon seismic transducers could revolutionize the discipline of CTBT monitoring by providing inexpensive, easily deployable sensor arrays. Although our goal is to fabricate seismic sensors that provide the same performance level as the current state-of-the-art ``macro`` systems, if necessary one could deploy a larger number of these small sensors at closer proximity to the location being monitored in order to compensate for lower performance. We have chosen a modified pendulum design and are manufacturing prototypes in two different silicon micromachining fabrication technologies. The first set of prototypes, fabricated in our advanced surface-micromachining technology, are currently being packaged for testing in servo circuits -- we anticipate that these devices, which have masses in the 1-10 µg range, will resolve sub-mG signals. Concurrently, we are developing a novel ``mold`` micromachining technology that promises to make proof masses in the 1-10 mg range possible -- our calculations indicate that devices made in this new technology will resolve down to at least sub-µG signals, and may even approach the 10^-10 G/√Hz acceleration levels found in the low-earth-noise model.

Barron, C.C.; Fleming, J.G.; Sniegowski, J.J.; Armour, D.L.; Fleming, R.P.

1995-08-01

210

Marine Seismic Data Center  

NSDL National Science Digital Library

This is the homepage of the Marine Seismic Data Center (MSDC) of the University of Texas Institute for Geophysics (UTIG). MSDC's purpose is to organize seismic reflection and refraction data into a modern relational database management system accessible through the Internet. The web site provides access to metadata, SEG-Y (seismic shot record conversion) files, navigation files, seismic profile images, processing histories and more. The main features of the web site include a geographic search engine, a metadata search engine, and metadata pages for the cruises. A tool for plotting seismic sections is being tested and will be added in the future.

211

Revolutionary seismic vessel sees first action in North Sea  

SciTech Connect

This paper reviews the design of a new seismic surveying vessel, developed in response to the increased need for 3D seismic data acquisition. To help distribute the seismic equipment in tow, the ship was given a wide, continuous beam that allows uniform distribution of a large number of seismic streamers. This width-to-length ratio also provides better stability as the ship moves through the water, increasing the quality of the seismic data. The propulsion systems have also been constructed to better tow the arrays through the water. Specifications and a cost-benefit analysis of this new seismic vessel are reviewed and compared to conventional seismic survey methods and vessels.

Greenway, J. [PGS Exploration AS, Oslo (Norway)

1995-08-28

212

SEISMIC GEOTECHNICAL INVESTIGATIONS OF BRIDGES IN NEW YORK CITY  

Microsoft Academic Search

Seismic vulnerability assessment of a critical bridge is a major undertaking. Such an investigation may lead to requirements with respect to seismic retrofitting of an existing bridge or enhancement of the design of a new bridge, often at considerable cost. A safe and cost-effective new design or retrofit of a bridge requires the application of realistic evaluations at every step

M. K. Yegian

1998-01-01

213

Design of a potential long-term test of gas production from a hydrate deposit at the PBU-L106 site in North Slope, Alaska: Geomechanical system response and seismic monitoring  

NASA Astrophysics Data System (ADS)

In an effort to optimize the design of a potential long-term production test at the PBU-L106 site in North Slope, Alaska, we have developed a coupled modeling framework that includes the simulation of (1) large-scale production at the test site, (2) the corresponding geomechanical changes in the system caused by production, and (3) time-lapse geophysical (seismic) surveys. The long-term test is to be conducted within the deposit of the C-layer, which extends from a depth of 2226 to 2374 ft, and is characterized by two hydrate-bearing strata separated by a 30 ft shale interlayer. In this study we examine the expected geomechanical response of the permafrost-associated hydrate deposit (C-Layer) at the PBU-L106 site during depressurization-induced production, and assess the potential for monitoring the system response with seismic measurements. Gas hydrates increase the strength of the sediments (often unconsolidated) they impregnate. Thus hydrate dissociation in the course of gas production could affect the geomechanical stability of such deposits, leading to sediment failure and potentially affecting wellbore stability and integrity at the production site and/or at neighboring conventional production facilities. For the geomechanical analysis we use a coupled hydraulic, thermodynamic and geomechanical model (TOUGH+HYDRATE+FLAC3D, T+H+F for short) simulating production from a single vertical well at the center of an infinite-acting hydrate deposit. We investigate the geomechanical stability of the C-Layer, well stability and possible interference (due to production) with pre-existing wells in the vicinity, as well as the system sensitivity to important parameters (saturation, permeability, porosity and heterogeneity). The time-lapse seismic surveys are simulated using a finite-difference elastic wave propagation model that is linked to the T+H+F code. 
The seismic properties, such as the elastic and shear moduli, are a function of the simulated time- and space-varying pressure and temperature, the aqueous-, gas-, and hydrate-phase saturation, and the porosity. We examine a variety of seismic measurement configurations and survey parameters to determine the optimal approach for detecting changes occurring in the hydrate deposit during production that can be used as the basis for monitoring hydrate dissociation, and the corresponding hydrate saturation and geomechanical status. The general approach we are developing (involving the coupled simulation of production, the geomechanical response, and the evolution of geophysical properties in hydrate accumulations under production) will be a valuable tool that can be used to maximize production potential, minimize risks of geomechanical instabilities, and ensure that the system can be adequately monitored using remote sensing techniques.
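The dependence of seismic velocity on phase saturations that such time-lapse surveys exploit can be illustrated with standard Gassmann fluid substitution. This is a one-equation sketch, far simpler than the T+H+F coupling; all moduli, densities, and porosity below are hypothetical values, not PBU-L106 parameters.

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Gassmann equation: saturated bulk modulus from dry-frame properties (Pa)."""
    b = 1.0 - k_dry / k_min
    return k_dry + b * b / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)

def vp(k_sat, mu, rho):
    """P-wave velocity from bulk modulus, shear modulus, and bulk density."""
    return np.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

# Hypothetical unconsolidated-sediment frame and fluids (Pa, kg/m^3):
k_min, k_dry, mu, phi = 36e9, 2.0e9, 1.5e9, 0.35
k_water, k_gas = 2.5e9, 0.05e9
rho_w, rho_g, rho_min = 1000.0, 100.0, 2650.0

def model(sw):
    """Vp for water saturation sw (the remainder is free gas)."""
    k_fl = 1.0 / (sw / k_water + (1.0 - sw) / k_gas)  # Wood/Reuss fluid mix
    rho = (1.0 - phi) * rho_min + phi * (sw * rho_w + (1.0 - sw) * rho_g)
    return vp(gassmann_ksat(k_dry, k_min, k_fl, phi), mu, rho)

# Even a little free gas released by dissociation drops Vp sharply:
vp_wet, vp_gassy = model(1.0), model(0.9)
```

The sharp Vp drop at low gas saturation is what makes time-lapse seismic sensitive to the onset of dissociation.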

Chiaramonte, L.; Kowalsky, M. B.; Rutqvist, J.; Moridis, G. J.

2009-12-01

214

Verifying the Dependence of Fractal Coefficients on Different Spatial Distributions  

SciTech Connect

A fractal distribution requires that the number of objects larger than a specific size r has a power-law dependence on the size: N(r) = C/r^D ∝ r^-D, where D is the fractal dimension. Usually the correlation integral is calculated to estimate the correlation fractal dimension of epicentres. A 'box-counting' procedure could also be applied, giving the 'capacity' fractal dimension. The fractal dimension can be an integer, in which case it is equivalent to a Euclidean dimension (zero for a point, one for a segment, two for a square, three for a cube). In general, however, the fractal dimension is a fractional dimension, and that is the origin of the term 'fractal'. The use of a power law to statistically describe a set of events or phenomena reveals the lack of a characteristic length scale; that is, fractal objects are scale invariant. Scale invariance and chaotic behavior underlie many natural hazard phenomena. Many studies of earthquakes reveal that their occurrence exhibits scale-invariant properties, so the fractal dimension can characterize them. It has first been confirmed that both aftershock rate decay in time and earthquake size distribution follow a power law. Recently many other earthquake distributions have been found to be scale-invariant. The spatial distribution of both regional seismicity and aftershocks shows some fractal features. Earthquake spatial distributions are considered fractal, but indirectly. There are two possible models which result in fractal earthquake distributions. The first model considers that a fractal distribution of faults leads to a fractal distribution of earthquakes, because each earthquake is characteristic of the fault on which it occurs. The second assumes that each fault has a fractal distribution of earthquakes. 
Observations strongly favour the first hypothesis. The fractal coefficients analysis provides some important advantages in examining earthquake spatial distribution: it is a simple way to quantify scale-invariant distributions of complex objects or phenomena by a small number of parameters, and it is becoming evident that the applicability of fractal distributions to geological problems could have a more fundamental basis, since chaotic behaviour could underlie the geotectonic processes and the applicable statistics could often be fractal. The application of fractal distribution analysis has, however, some specific aspects: it is usually difficult to present an adequate interpretation of the obtained values of fractal coefficients for earthquake epicenter or hypocenter distributions. That is why in this paper we aimed at another goal - to verify how a fractal coefficient depends on different spatial distributions. We simulated earthquake spatial data by generating points randomly, first in a 3D space - a cube, then in a parallelepiped, diminishing one of its sides. We then continued this procedure in 2D and 1D space. For each simulated data set we calculated the points' fractal coefficient (correlation fractal dimension of epicentres) and then checked for correlation between the coefficient values and the type of spatial distribution. In that way one can obtain a set of standard fractal coefficient values for varying spatial distributions. These can then be used when real earthquake data are analyzed, by comparing the real data coefficient values to the standard fractal coefficients. Such an approach can help in interpreting the fractal analysis results through different types of spatial distributions.
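The 'box-counting' capacity dimension mentioned above can be estimated for simulated epicentre sets in a few lines. This is a minimal sketch; the scales and point counts are arbitrary illustration choices, and a point cloud filling a square recovers D ≈ 2 while points along a single line recover D ≈ 1, mirroring the paper's simulation idea.

```python
import numpy as np

def box_counting_dimension(points, scales=(2, 4, 8, 16)):
    """Capacity dimension: slope of log N(eps) versus log(1/eps),
    where N(eps) counts occupied boxes of side 1/s for each scale s."""
    counts = []
    for s in scales:
        boxes = np.unique(np.floor(points * s).astype(int), axis=0)
        counts.append(len(boxes))
    return np.polyfit(np.log(scales), np.log(counts), 1)[0]

rng = np.random.default_rng(0)
square = rng.random((5000, 2))                 # epicentres filling the unit square
segment = np.column_stack([rng.random(5000),   # epicentres along a single "fault"
                           np.full(5000, 0.5)])
d2 = box_counting_dimension(square)            # close to 2
d1 = box_counting_dimension(segment)           # close to 1
```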

Gospodinov, Dragomir [Plovdiv University 'Paisii Hilendarski', 24, Tsar Asen Str., Plovdiv (Bulgaria); Geophysical Institute of Bulgarian Academy of Sciences, Akad. G. Bonchev Str., bl.3, Sofia (Bulgaria); Marekova, Elisaveta; Marinov, Alexander [Plovdiv University 'Paisii Hilendarski', 24, Tsar Asen Str., Plovdiv (Bulgaria)

2010-01-21

215

A Practical Scheme for Non-interactive Verifiable Secret Sharing  

Microsoft Academic Search

This paper presents an extremely efficient, non-interactive protocol for verifiable secret sharing. Verifiable secret sharing (VSS) is a way of bequeathing information to a set of processors such that a quorum of processors is needed to access the information. VSS is a fundamental tool of cryptography and distributed computing. Seemingly difficult problems such as secret bidding, fair voting, leader election,
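The flavor of Feldman-style non-interactive verification can be shown with toy parameters: the dealer publishes commitments to the polynomial coefficients, and anyone can check a share against them. The 23-element group and threshold-2 polynomial below are hopelessly insecure sizes chosen only so the arithmetic is visible; this is a sketch of the idea, not the paper's protocol.

```python
# Toy group: q divides p-1 and g has order q (here 2^11 = 2048 ≡ 1 mod 23).
P, Q, G = 23, 11, 2

def deal(secret, coeffs):
    """Dealer: shares f(i) mod Q plus public commitments g^{a_j} mod P."""
    poly = [secret] + list(coeffs)
    shares = {i: sum(a * i**j for j, a in enumerate(poly)) % Q
              for i in range(1, 5)}
    commitments = [pow(G, a, P) for a in poly]
    return shares, commitments

def verify(i, share, commitments):
    """Anyone can check share i: g^share must equal prod_j c_j^(i^j) mod P."""
    rhs = 1
    for j, c in enumerate(commitments):
        rhs = rhs * pow(c, i**j, P) % P
    return pow(G, share, P) == rhs

shares, commits = deal(secret=7, coeffs=[3])      # f(x) = 7 + 3x mod 11
ok = all(verify(i, s, commits) for i, s in shares.items())
bad = verify(1, (shares[1] + 1) % Q, commits)     # a corrupted share fails
```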

Paul Feldman

1987-01-01

216

Verifying Red-Black Trees Paolo Baldan1  

E-print Network

Verifying Red-Black Trees. Paolo Baldan, Andrea Corradini, Javier Esparza, Tobias Heindel. Abstract. We show how to verify the correctness of insertion of elements into red-black trees--a form of balanced search trees--using analysis techniques developed for graph rewriting. We first model red

Baldan, Paolo

217

Bragg grating seismic monitoring system  

NASA Astrophysics Data System (ADS)

A theoretical concept and its experimental realization of a fiber-optic Bragg grating strain sensor system to measure dynamic deformations in rock masses are presented. The system has been designed in order to monitor strain variations in the range of 10-9 within a bandwidth of 0.1 to 2 kHz. First promising results from field experiments are shown where seismic signals have been detected, in comparison with conventional geophone registrations.

Schmidt-Hattenberger, Cornelia; Borm, Gunter; Amberg, F.

1999-12-01

218

Simple method to verify OPC data based on exposure condition  

NASA Astrophysics Data System (ADS)

With sub-100 nm lithography tools now everyday household items for device makers, devices are shrinking at a rate no one ever imagined. This shrinkage places unprecedented demands on Optical Proximity Correction (OPC), and meeting them requires more aggressive OPC tactics. Aggressive OPC is a must for sub-100 nm lithography technology, but it leaves greater room for OPC error and increases the complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC data, but each of these methods has its pros and cons. ORC verification of OPC data is accurate process-wise, but inspection of a full-chip device demands a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but its process-wise accuracy is a total downfall. In this study, we created a new method for OPC data verification that combines the best of both ORC and DRC: it inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new verification method was applied to the 80 nm tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.
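A condition-aware bias check of the kind described can be sketched as a simple rule table that flags edges whose applied bias falls outside a window tied to the local pitch. The pitch classes and bias windows below are invented placeholders for illustration, not actual 80 nm process rules.

```python
# Toy illustration: each edge carries (pitch_nm, applied_bias_nm); the allowed
# bias window is derived from the pitch as a stand-in for the exposure condition.
def allowed_bias_window(pitch_nm):
    """Hypothetical rule: dense pitches tolerate less bias than isolated ones."""
    if pitch_nm < 200:        # dense features
        return (-5.0, 5.0)
    return (-12.0, 12.0)      # semi-isolated / isolated features

def check_opc_biases(edges):
    """Return indices of edges whose bias violates its condition-derived window."""
    violations = []
    for idx, (pitch, bias) in enumerate(edges):
        lo, hi = allowed_bias_window(pitch)
        if not (lo <= bias <= hi):
            violations.append(idx)
    return violations

edges = [(160, 4.0), (160, 9.0), (400, -10.0), (400, -15.0)]
bad = check_opc_biases(edges)   # the over-biased dense and isolated edges
```

Because this is a table lookup per edge rather than a full optical simulation, it runs at DRC-like speed while still reflecting the exposure condition.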

Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

2006-03-01

219

Seismic vulnerability assessments in risk analysis  

NASA Astrophysics Data System (ADS)

The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on buildings behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to seismic scale MMSK-86. This verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to OSR-97B (return period T=1,000 years) within big cities and towns, they were divided into unit sites and their coordinates were represented as dots located in the centers of the unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. 
Taking into account the size of the oil pipeline systems located in the highly active seismic zones of the Russian Federation, corresponding procedures have been developed. They are based on mathematical modeling of the interaction of the system elements, the oil pipeline and the ground, under seismic loads. As a result, relationships between the probability of an oil pipeline system being damaged and the intensity of shaking in grades of seismic scales have been obtained. The following three damage states for oil pipeline systems have been considered: light damage - elastic deformation of the linear part, localized plastic deformation without breaching the pipeline; average damage - significant plastic deformation of the linear part, fistulas in some areas; complete destruction - large horizontal and vertical displacements of the linear part, mass fistulas, cracks, "guillotine break" of the pipeline in some areas.
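Empirical vulnerability relationships of this kind are often summarized as lognormal fragility curves giving the probability of reaching each damage state at a given shaking intensity. The sketch below uses that standard form; the medians and dispersions are hypothetical illustration values, not the verified MMSK-86 or pipeline parameters.

```python
from math import erf, log, sqrt

def fragility(intensity, median, beta):
    """P(damage state reached | shaking intensity): lognormal fragility curve,
    i.e. the standard normal CDF of ln(I/median)/beta."""
    return 0.5 * (1.0 + erf(log(intensity / median) / (beta * sqrt(2.0))))

# Hypothetical medians (in intensity grades) and dispersions for three
# pipeline damage states, mirroring the light/average/complete breakdown:
states = {"light": (6.0, 0.6), "average": (7.5, 0.5), "complete": (9.0, 0.4)}
probs = {name: fragility(8.0, m, b) for name, (m, b) in states.items()}
```

At a given intensity the probabilities are naturally ordered: lighter states are always at least as likely to be reached as more severe ones.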

Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

2013-04-01

220

Automating Shallow Seismic Imaging  

SciTech Connect

This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. 
In some easy-access environments, this device could make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. 
Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our

Steeples, Don W.

2004-12-09

221

Seismic rehabilitation of wood diaphragms in unreinforced masonry buildings  

E-print Network

objectives: (1) assessing the adequacy of current seismic rehabilitation guidelines for evaluating existing wood diaphragms in pre-1950's URM buildings and for designing necessary retrofits; and (2) evaluating the effect of diaphragm retrofits, as designed...

Grubbs, Amber Jo

2012-06-07

222

Mapping Europe's Seismic Hazard  

NASA Astrophysics Data System (ADS)

From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

2014-07-01

223

March 2000 Seismicity Animation  

NSDL National Science Digital Library

An animated map of seismic activity in southern California during March 2000, part of a series of monthly seismicity reports and images, is available from the Southern California Earthquake Data Center (SCEDC). This is particularly interesting because over half of the 1,626 earthquakes detected in the region were aftershocks from the 1999 Hector Mine earthquake. Color still maps and tables of seismic activity for April-August have also been added recently.

224

VISION User Guide - VISION (Verifiable Fuel Cycle Simulation) Model  

SciTech Connect

The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating “what if” scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level for U.S. nuclear power. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., “reactor types” not individual reactors and “separation types” not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separations or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. Note that recovered uranium is itself often partitioned: some RU flows with recycled transuranic elements, some flows with wastes, and the rest is designated RU. RU comes out of storage if needed to correct the U/TRU ratio in new recycled fuel. Neither RU nor DU is designated as waste. 
VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. We use Microsoft Excel 2003 and have not tested VISION with Microsoft Excel 2007. The VISION team uses both Powersim Studio 2005 and 2009 and it should work with either.
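The system-level, stock-and-flow bookkeeping that VISION performs can be caricatured in a few lines of time-stepped mass balance. The stocks, flow rates, and time step below are made up and stand in for VISION's many parameters; the point is only that material entering the system must be conserved across the stocks, which is exactly the kind of invariant such a model must maintain.

```python
# Minimal stock-and-flow sketch (stocks in arbitrary tHM; rates are made up).
def step(stocks, fresh_feed=10.0, burn=8.0, to_separations=5.0):
    """One time step: feed -> fabrication -> reactor -> storage -> separations."""
    stocks = dict(stocks)
    stocks["fabrication"] += fresh_feed
    loaded = min(burn, stocks["fabrication"])
    stocks["fabrication"] -= loaded
    stocks["reactor"] += loaded
    discharged = min(burn, stocks["reactor"])
    stocks["reactor"] -= discharged
    stocks["used_fuel_storage"] += discharged
    sep = min(to_separations, stocks["used_fuel_storage"])
    stocks["used_fuel_storage"] -= sep
    stocks["separations"] += sep
    stocks["natural_feed_used"] += fresh_feed   # running total of feed supplied
    return stocks

stocks = {"fabrication": 0.0, "reactor": 0.0, "used_fuel_storage": 0.0,
          "separations": 0.0, "natural_feed_used": 0.0}
for _ in range(12):
    stocks = step(stocks)
in_system = sum(v for k, v in stocks.items() if k != "natural_feed_used")
```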

Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern; Steven J. Piet; Benjamin A. Baker; Joseph Grimm

2009-08-01

225

1010 PERMANENT GENETIC RESOURCES NOTE PCR products was verified on agarose gels stained with  

E-print Network

PCR products were verified on agarose gels stained with ethidium bromide. The primer pairs designed for each of the 34 microsatellite markers amplified a single

Steve Kemp

226

IEEE TRANSACTIONS ON MOBILE COMPUTING 1 Verifying Delivered QoS in Multi-hop Wireless  

E-print Network

of statistical QoS for groups of packets. The protocols are proved to be cheat-proof. We also provide expressions for the minimum verifiable delay. Index Terms: C.2.8.a Mobile computing algorithm/protocol design and analysis. Other work has addressed the issue of cheating and payment in such schemes [17]. All

Singh, Suresh

227

xTune: Online Verifiable Cross-Layer Adaptation for Distributed Real-Time Embedded Systems  

E-print Network

xTune: Online Verifiable Cross-Layer Adaptation for Distributed Real-Time Embedded Systems. Minyoung [figure labels: Adaptation, Cross Layer, Formal Executable Specification, Controller, Monitor & Analysis] system optimization is needed [12]. Therefore, this thesis proposes a unified framework that enables system design

Venkatasubramanian, Nalini

228

Using Timestamping and History Variables to Verify Sequential Consistency  

Microsoft Academic Search

In this paper we propose a methodology for verifying the sequential consistency of caching algorithms. The scheme combines timestamping and an auxiliary history table to construct a serial execution matching

Tamarah Arons

2001-01-01

229

Optimizing and verifying an ensemble-based rainfall model  

E-print Network

In this thesis, I modified, optimized, and verified the stochastic Recursive Cluster-point Rainfall model of Chatdarong (2006). A novel error metric allows comparison of the stochastic ensemble of rainfall image forecasts ...

Friedman, Sara Hargrove

2007-01-01

230

Towards verifiable adaptive control for safety critical applications  

E-print Network

To be implementable in safety critical applications, adaptive controllers must be shown to behave strictly according to predetermined specifications. This thesis presents two tools for verifying specifications relevant to ...

Schwager, Mac

2005-01-01

231

Verified Calculation of Multiscale Combustion in Gaseous Mixtures  

E-print Network

· Verification: solving the equations right -- a math exercise. · Validation: solving the right equations -- a physics exercise. · DNS: a verified ... [slide title: Fundamental Linear Analysis of Length Scales] · Motivation: To achieve DNS, the interplay between chemistry

232

Manufacture of a lightweight COBRA seal camera/verifier  

SciTech Connect

Still video recording and display technologies were developed by Sandia National Laboratories and approved by the IAEA for use in recording and verifying the COBRA containment seal. The subsequent emergence of a widely used, commercially available still video product offered the opportunity to significantly reduce the weight and size of the COBRA Seal Camera/Verifier System, while producing a more manufacturable product. This paper summarizes the experience gained in the performance of the manufacturing engineering.

Kadner, S.; Bozone, J. (Aquila Technologies Group Inc., Albuquerque, NM (US))

1991-01-01

233

Seismic Catalogue and Seismic Network in Haiti  

NASA Astrophysics Data System (ADS)

The destructive earthquake that occurred on January 12, 2010 in Haiti highlighted the lack of preparedness of the country to address seismic phenomena. At the moment of the earthquake, there was no seismic network operating in the country, and only partial control of the past seismicity was possible, due to the absence of a national catalogue. After the 2010 earthquake, some advances were made towards the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out covering both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events that occurred in the Hispaniola Island and surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30,000 events recorded in the area from 1551 till 2011 were compiled in a first catalogue, among them 7,700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were given by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by important heterogeneity in the size parameter. It was then homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government. The data were sent by telemetry through the Canadian system CARINA. 
In 2012, the Spanish IGN together with ONEV and BME, installed 4 seismic stations with financial support from the Inter-American Development Bank and the Haitian Government. The 4 stations include strong motion and broad-band sensors, complementing the 8 sensors initially installed. The stations communicate via SATMEX5 with the Canadian HUB, which sends the data back to Haiti with minimum delay. In the immediate future, data transfer will be improved with the installation of a main antenna for data reception and the Seismic Warning Center of Port-au-Prince. A bidirectional satellite communication is considered of fundamental importance for robust real-time data transmission that is not affected in the case of a catastrophic event.
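Catalogue homogenization of the kind described reduces to applying a per-scale conversion relation to each record to obtain a common Mw column. The linear coefficients below are illustrative placeholders only, not the Bonzoni et al (2011) regressions cited in the abstract.

```python
# Placeholder linear conversions to Mw (hypothetical coefficients):
CONVERSIONS = {
    "Mw": lambda m: m,                 # already moment magnitude
    "Ms": lambda m: 0.67 * m + 2.07,   # hypothetical Ms -> Mw relation
    "mb": lambda m: 0.85 * m + 1.03,   # hypothetical mb -> Mw relation
}

def homogenize(catalogue):
    """Map (scale, magnitude) records onto a common Mw column."""
    return [round(CONVERSIONS[scale](mag), 2) for scale, mag in catalogue]

catalogue = [("Ms", 6.0), ("mb", 5.0), ("Mw", 7.1)]
mw = homogenize(catalogue)
```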

Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

2013-05-01

234

Seismic data denoising based on the fractional Fourier transformation  

NASA Astrophysics Data System (ADS)

Seismic data may suffer from noise contamination too severe to carry out further processing and interpretation. In this paper, a new scheme is proposed based on the fractional Fourier transform (FrFT) in the time-frequency domain to mitigate noise. The scheme consists of two steps. In the first step, the seismic signal is filtered with an ordinary Butterworth filter in the frequency domain. The residual noise after frequency filtering occupies the same frequencies as the filtered seismic signal. In order to mitigate this residual noise further, the FrFT filter is applied in the second step. The results from simulated seismic signals and measured data verify the validity of the proposed scheme in both the frequency and time-frequency domains.
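The first (Butterworth) step of the two-step scheme can be sketched as a zero-phase frequency-domain filter. This numpy-only version applies the Butterworth magnitude response via the FFT (a stand-in for a standard IIR implementation) and omits the FrFT second step entirely; the signal and noise frequencies are made-up test values.

```python
import numpy as np

def butterworth_lowpass(signal, dt, fc, order=4):
    """Zero-phase low-pass: Butterworth magnitude response
    |H(f)| = 1/sqrt(1 + (f/fc)^(2*order)) applied in the frequency domain."""
    freqs = np.fft.rfftfreq(len(signal), dt)
    h = 1.0 / np.sqrt(1.0 + (freqs / fc) ** (2 * order))
    return np.fft.irfft(np.fft.rfft(signal) * h, n=len(signal))

dt = 0.001
t = np.arange(2000) * dt
clean = np.sin(2 * np.pi * 20 * t)                   # 20 Hz "seismic" signal
noisy = clean + 0.5 * np.sin(2 * np.pi * 200 * t)    # 200 Hz noise
filtered = butterworth_lowpass(noisy, dt, fc=60.0)
```

Noise that shares the signal band survives this step, which is precisely why the paper adds the FrFT filter afterwards.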

Zhai, Ming-Yue

2014-10-01

235

Seismic Waves and the Slinky: A Guide For Teachers  

NSDL National Science Digital Library

This teaching guide is designed to introduce the concepts of waves and seismic waves that propagate within the Earth, and to provide ideas and suggestions for how to teach about seismic waves. The guide provides information on the types and properties of seismic waves and instructions for using some simple materials, especially the slinky, to effectively demonstrate seismic wave characteristics and wave propagation. Most of the activities described in the guide are useful both as demonstrations for the teacher and as exploratory activities for students.

Braile, Lawrence

236

Statistical classification methods applied to seismic discrimination  

SciTech Connect

To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within ~2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
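As a minimal illustration of the discriminant-analysis family surveyed in the report, the following sketch applies Fisher's linear discriminant to two synthetic event classes; the two features and the class parameters are invented for illustration, not drawn from real seismic data.

```python
import numpy as np

# Toy linear discriminant analysis on two synthetic event classes,
# e.g. earthquakes vs. explosions described by two discriminant features.
rng = np.random.default_rng(0)
n = 200
quakes = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(n, 2))
blasts = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(n, 2))

mu_q, mu_b = quakes.mean(axis=0), blasts.mean(axis=0)
# Pooled within-class scatter matrix
Sw = np.cov(quakes.T) * (n - 1) + np.cov(blasts.T) * (n - 1)
w = np.linalg.solve(Sw, mu_b - mu_q)          # Fisher discriminant direction
threshold = w @ (mu_q + mu_b) / 2.0           # midpoint decision boundary

X = np.vstack([quakes, blasts])
labels = np.array([0] * n + [1] * n)          # 0 = quake, 1 = blast
pred = (X @ w > threshold).astype(int)
accuracy = (pred == labels).mean()
```

The quadratic, kernel, and nearest-neighbor methods listed in the report replace this single linear projection with richer decision boundaries, at the cost of robustness to small samples.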

Ryan, F.M. [ed.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

1996-06-11

237

Simple Procedure for Seismic Analysis of Liquid-Storage Tanks  

Microsoft Academic Search

Summary: This paper provides the theoretical background of a simplified seismic design procedure for cylindrical ground-supported tanks. The procedure takes into account impulsive and convective (sloshing) actions of the liquid in flexible steel or concrete tanks fixed to rigid foundations. Seismic responses - base shear, overturning moment, and sloshing wave height - are calculated by using the site

Praveen K. Malhotra; Thomas Wenk; Martin Wieland

2000-01-01

238

SEISMIC EARLY WARNING SYSTEM FOR A NUCLEAR POWER PLANT  

Microsoft Academic Search

Summary: Reviews of several Soviet-built nuclear power plants have shown that the earthquake safety of most is unknown or that they are seismically under-designed. In cases where seismic strengthening of the buildings and equipment is not feasible for economic, political, or scheduling reasons, an active reactor protection system based on an earthquake early warning system may be the answer,

Martin WIELAND; Lothar GRIESSER; Christoph KUENDIG

239

Seismic analysis of wind turbines in the time domain  

Microsoft Academic Search

The analysis of wind turbine loading associated with earthquakes is clearly important when designing for and assessing the feasibility of wind farms in seismically active regions. The approach taken for such analysis is generally based on codified methods which have been developed for the assessment of seismic loads acting on buildings. These methods are not able to deal properly with

D. Witcher

2005-01-01

240

SEISMIC GEOTECHNICAL INVESTIGATIONS FOR BRIDGES M. K. Yegian  

E-print Network

Seismic geotechnical investigations for a bridge involve several types of analyses, including: establishment of design computations; assessment of liquefaction and its impact on bridge foundations; soil-foundation interaction

Yegian, Mishac

241

Borehole seismic unit  

SciTech Connect

Fracture orientation can be measured by using a triaxial geophone package located at the fracture interval within the wellbore. Seismic signals produced by the fracture can be recorded and measured to determine the direction of the fracture. Reported herein is a description of a borehole seismic unit and procedures to accomplish this task.

Seavey, R.W.

1982-05-01

242

Evolution of optically nondestructive and data-non-intrusive credit card verifiers  

NASA Astrophysics Data System (ADS)

Since the deployment of the credit card, the number of credit card fraud cases has grown rapidly, with losses amounting to millions of US dollars. Instead of requesting more information from the cardholder or taking on risk through payment approval, a nondestructive and data-non-intrusive credit card verifier is highly desirable before a transaction begins. In this paper, we review optical techniques that have been proposed and invented to make a genuine credit card more distinguishable from a counterfeit one. Several optical approaches for the implementation of credit card verifiers are also included. In particular, we highlight our invention of a hyperspectral-imaging based portable credit card verifier structure that offers a very low false error rate of 0.79%. Other key features include low cost, simplicity of design and implementation, no moving parts, no need for an additional decoding key, and adaptive learning.

Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

2010-04-01

243

Enhanced Seismic Performance of Hybrid Bridge Systems: Comparison with Traditional Monolithic Solutions  

Microsoft Academic Search

Remarkable accomplishments have been achieved in seismic engineering in the recent past with the definition and development of high-performance seismic-resistant systems, able to sustain major ground motions with a limited level of structural damage. Following the introduction and further development of jointed ductile connections for the seismic design of precast concrete buildings, the concept of the hybrid system, where self-centering and

Alessandro Palermo; Stefano Pampanin

2008-01-01

244

The SCALE Verified, Archived Library of Inputs and Data - VALID  

SciTech Connect

The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. 
The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

Marshall, William BJ [ORNL]; Rearden, Bradley T [ORNL]

2013-01-01

245

10 CFR 36.39 - Design requirements.  

Code of Federal Regulations, 2010 CFR

...seismic areas, the licensee shall design the reinforced concrete radiation shields to retain their integrity in the event of an earthquake by designing to the seismic requirements of an appropriate source such as American Concrete Institute Standard ACI...

2010-01-01

246

10 CFR 36.39 - Design requirements.  

...seismic areas, the licensee shall design the reinforced concrete radiation shields to retain their integrity in the event of an earthquake by designing to the seismic requirements of an appropriate source such as American Concrete Institute Standard ACI...

2014-01-01

247

Seismic exploration fundamentals. Second edition  

SciTech Connect

This book includes discussions of the new techniques in seismic exploration including vertical seismic profiles, shear wave exploration, seismic stratigraphy and interactive interpretation. The book is not about theory, but describes the techniques actually used in seismic exploration, from program planning to recommendations for drilling. Figures are used to illustrate various points throughout the book, and photographs of equipment and field work are included.

Coffeen, J.A.

1986-01-01

248

Eddy-Current Testing of Welded Stainless Steel Storage Containers to Verify Integrity and Identity  

SciTech Connect

An eddy-current scanning system is being developed to allow the International Atomic Energy Agency (IAEA) to verify the integrity of nuclear material storage containers. Such a system is necessary to detect attempts to remove material from the containers in facilities where continuous surveillance of the containers is not practical. Initial tests have shown that the eddy-current system is also capable of verifying the identity of each container using the electromagnetic signature of its welds. The DOE-3013 containers proposed for use in some US facilities are made of an austenitic stainless steel alloy, which is nonmagnetic in its normal condition. When the material is cold worked by forming or by local stresses experienced in welding, it loses its austenitic grain structure and its magnetic permeability increases. This change in magnetic permeability can be measured using an eddy-current probe specifically designed for this purpose. Initial tests have shown that variations of magnetic permeability and material conductivity in and around welds can be detected, and form a pattern unique to the container. The changes in conductivity that are present around a mechanically inserted plug can also be detected. Further development of the system is currently underway to adapt the system to verifying the integrity and identity of sealable, tamper-indicating enclosures designed to prevent unauthorized access to measurement equipment used to verify international agreements.

Tolk, Keith M.; Stoker, Gerald C.

1999-07-20

249

Seismic fragility test of a 6-inch diameter pipe system  

SciTech Connect

This report contains the test results and assessments of seismic fragility tests performed on a 6-inch diameter piping system. The test was funded by the US Nuclear Regulatory Commission (NRC) and conducted by ETEC. The objective of the test was to investigate the ability of a representative nuclear piping system to withstand high level dynamic seismic and other loadings. Levels of loadings achieved during seismic testing were 20 to 30 times larger than normal elastic design evaluations to ASME Level D limits would permit. Based on failure data obtained during seismic and other dynamic testing, it was concluded that nuclear piping systems are inherently able to withstand much larger dynamic seismic loadings than permitted by current design practice criteria or predicted by the probabilistic risk assessment (PRA) methods and several proposed nonlinear methods of failure analysis.

Chen, W. P.; Onesto, A. T.; DeVita, V.

1987-02-01

250

On the Complexity of Verifying Cyber-Physical Security Protocols  

E-print Network

Max Kanovich, Tajana Ban Kirigin. We classify such security protocols as Cyber-Physical. The key elements of such protocols include an important class of Bounded Memory Cyber-Physical Security Protocols with a Memory Bounded Intruder

Nigam, Vivek

251

An Executable Formal Model for Specifying and Verifying Clinical Trials  

E-print Network

Vivek Nigam, Carolyn. Experiments that involve human subjects are called Clinical Trials (CTs). This paper takes the first steps towards that direction. We model a clinical trial by using a partial order

Nigam, Vivek

252

Using symbolic execution for verifying safety-critical systems  

Microsoft Academic Search

Safety-critical systems are required to be highly reliable, and thus special care is taken when verifying them in order to increase confidence in their behavior. This paper addresses the problem of formal verification of safety-critical systems by providing empirical evidence of the practical applicability of symbolic execution and of its usefulness for checking safety-related properties. In this paper,

Alberto Coen-Porisini; Giovanni Denaro; Carlo Ghezzi

2001-01-01

253

Developing a flexible and verifiable integrated dose assessment capability  

Microsoft Academic Search

Due to the ever-changing regulatory environment, there is a need for a flexible yet verifiable system for computing and recording personnel doses. Recent directions in Chapter XI requirements, dictated by earlier advances forwarded by ICRP 26, establish the trend of combining internal and external doses. We are currently developing a Health Physics Information Management System (HPIMS) which will: (1) centralize

D. C. Parzyck; T. A. Rhea; E. D. Copenhaver; J. S. Bogard

1986-01-01

254

A verifiable SSA program representation for aggressive compiler optimization  

Microsoft Academic Search

We present a verifiable low-level program representation to embed, propagate, and preserve safety information in high-performance compilers for safe languages such as Java and C#. Our representation precisely encodes safety information via static single-assignment (SSA) [11, 3] proof variables that are first-class constructs in the program. We argue that our representation allows a compiler

Vijay Menon; Neal Glew; Brian R. Murphy; Andrew Mccreight; Tatiana Shpeisman; Ali-reza Adl-tabatabai; Leaf Petersen

2006-01-01

255

Mostly Sound Type System Improves a Foundational Program Verifier  

E-print Network

transformations. If the C program is already in the Clight subset, the first phase of CompCert will leave it unchanged. We built a usable program logic for C, proved sound with respect to the operational semantics of the CompCert verified optimizing C compiler (a

Appel, Andrew W.

256

Verifiable secret sharing and multiparty protocols with honest majority  

Microsoft Academic Search

Under the assumption that each participant can broadcast a message to all other participants and that each pair of participants can communicate secretly, we present a verifiable secret sharing protocol, and show that any multiparty protocol, or game with incomplete information, can be achieved if a majority of the players are honest. The secrecy achieved is unconditional and does not

Tal Rabin

1989-01-01

257

Radiative transfer theory verified by controlled laboratory experiments  

E-print Network

Michael I. Mishchenko. Our results indicate that the VRTE can be applied safely to random particulate media with packing densities of particles from 2% to 10%. The derivation of the RT equation from the Maxwell equations [1,2] has finally made the RT theory (RTT) a legitimate branch of physical optics. Yet

258

Verified Calculation of Multiscale Combustion in Gaseous Mixtures  

E-print Network

Verification: solving the equations right (a math exercise). Validation: solving the right equations (a physics exercise). DNS: a verified and validated computation that resolves all ranges of relevant continuum scales in the unstable regime. Part II: Fundamental Linear Analysis of Length Scales. Motivation: to achieve DNS

259

Verifying Program Optimizations in Agda Case Study: List Deforestation  

E-print Network

Andreas Abel, 3 July 2012. The removal of intermediate data structures is called deforestation, since data structures are tree-shaped in the general case. As a result we show that the summation of the first n natural numbers, implemented by producing
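The summation example named in the snippet can be illustrated in Python (the paper itself works in Agda): the fused and closed-form variants avoid materializing the intermediate list that the naive version builds.

```python
# Deforestation sketch: summing the first n natural numbers with and
# without building an intermediate data structure.
def sum_unfused(n):
    return sum([i for i in range(1, n + 1)])   # materializes a list first

def sum_fused(n):
    return sum(i for i in range(1, n + 1))     # generator: no intermediate list

def sum_closed(n):
    return n * (n + 1) // 2                    # fully "deforested" closed form

results = {f(100) for f in (sum_unfused, sum_fused, sum_closed)}
```

The paper's contribution is proving such a transformation correct in a dependently typed setting, which this untyped sketch does not attempt.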

Abel, Andreas

260

Verifying Program Optimizations in Agda Case Study: List Deforestation  

E-print Network

Andreas Abel, 16 July 2009. The removal of intermediate data structures is called deforestation, since data structures are tree-shaped in the general case. As a result we show that the summation of the first n natural numbers, implemented by producing

Abel, Andreas

261

Elements of a system for verifying a Comprehensive Test Ban  

SciTech Connect

The paper discusses the goals of a monitoring system for a CTB, its functions, the challenges to verification, discrimination techniques, and some recent developments. It is concluded that technical, military, and political efforts are required to establish and verify test ban treaties that will contribute to stability in the long term. It currently appears that there will be a significant number of unidentified events. (ACR)

Hannon, W.J.

1987-03-06

262

Providers Do Not Verify Patient Identity during Computer Order Entry  

E-print Network

Emergency Medicine. Keywords: patient identification, computer physician order entry, CPOE, medical errors. To reduce the frequency of medical errors, the Joint Commission created national patient safety goals. Conclusions: Medical providers often miss ID errors and infrequently verify patient ID with two

Massachusetts at Amherst, University of

263

From Operating-System Correctness to Pervasively Verified Applications  

E-print Network

Based on the formal correctness of our real-time operating system Olos, this paper describes a pervasively verified distributed real-time system consisting of hardware, a real-time operating system, and application programs. We have implemented the operating system

Paris-Sud XI, Université de

264

Implementing a Formally Verifiable Security Protocol in Java Card  

E-print Network

Engelbert Hubbers, Martijn. The paper describes an implementation on a Java Card smart card. The aim is to consider the decisions that have to be made; state-exploration based analysis using model checkers [12]. Still, there is a big gap between the abstract level

Poll, Erik

265

A Formally Verified Calculus for Full Java Card Kurt Stenzel  

E-print Network

Java Card has the same constructs (i.e. expressions and statements) as Java, but omits all features that make the JVM big and slow. Abstract: We present a calculus for the verification of sequential Java programs. It supports all Java

Reif, Wolfgang

266

Implementing a Formally Verifiable Security Protocol in Java Card  

E-print Network

Engelbert Hubbers, Martijn. The paper describes an implementation on a Java Card smart card. The aim is to consider the decisions that have to be made; state-exploration based analysis using model checkers [2]. Still, there is a big gap between the abstract level

Hubbers, Engelbert

267

Verifying Stiffness Parameters Of Filament-Wound Cylinders  

NASA Technical Reports Server (NTRS)

Predicted engineering stiffness parameters of filament-wound composite-material cylinders verified with respect to experimental data, by use of equations developed straightforwardly from applicable formulation of Hooke's law. Equations derived in engineering study of filament-wound rocket-motor cases, also applicable to other cylindrical pressure vessels made of orthotropic materials.

Verderaime, V.; Rheinfurth, M.

1994-01-01

268

The verifying compiler: A grand challenge for computing research  

Microsoft Academic Search

This contribution proposes a set of criteria that distinguish a grand challenge in science or engineering from the many other kinds of short-term or long-term research problems that engage the interest of scientists and engineers. As an example drawn from Computer Science, it revives an old challenge: the construction and application of a verifying compiler that guarantees correctness of a

Tony Hoare

2003-01-01

269

A credit card verifier structure using diffraction and spectroscopy concepts  

Microsoft Academic Search

We propose and experimentally demonstrate an angle-multiplexing based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate and therefore its key color features can be used for distinguishing between the real and counterfeit ones. As the embossed hologram

Sarun Sumriddetchkajorn; Yuttana Intaravanne

2008-01-01

270

Verifying High-Confidence Interactive Systems: Electronic Voting and Beyond  

E-print Network

Sanjit A. Seshia, EECS. Systems that require a high level of assurance we term high-confidence interactive systems; examples include systems with self-driving features (interacting with a driver) and medical devices (interacting with a doctor

Seshia, Sanjit A.

271

Verified Computational Differential Privacy with Applications to Smart Metering  

E-print Network

Gilles Barthe and César Kunz, IMDEA Software Institute, Spain; Benjamin Grégoire, INRIA Sophia Antipolis - Méditerranée, France; George Danezis, Microsoft Research, UK.

Danezis, George

272

Verifiably Truthful Mechanisms SIMINA BR ^ANZEI, Aarhus University  

E-print Network

Simina Brânzei, Aarhus University; Ariel D. Procaccia, Carnegie Mellon University. It is typically expected that if a mechanism is truthful, then the agents would, indeed, truthfully report their private information. But why would an agent believe that the mechanism is truthful

Procaccia, Ariel

273

Third Quarter Hanford Seismic Report for Fiscal Year 2005  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the Hanford Seismic Network, there were 337 triggers during the third quarter of fiscal year 2005. Of these triggers, 20 were earthquakes within the Hanford Seismic Network. The largest earthquake within the Hanford Seismic Network was a magnitude 1.3 event May 25 near Vantage, Washington. During the third quarter, stratigraphically 17 events (85%) occurred in the Columbia River basalt (approximately 0-5 km), no events in the pre-basalt sediments (approximately 5-10 km), and three (15%) in the crystalline basement (approximately 10-25 km). Geographically, five earthquakes (25%) occurred in swarm areas, 10 (50%) were associated with a major geologic structure, and five (25%) were classified as random events.

Reidel, Steve P.; Rohay, Alan C.; Hartshorn, Donald C.; Clayton, Ray E.; Sweeney, Mark D.

2005-09-01

274

Discussing Seismic Data  

USGS Multimedia Gallery

USGS scientists Debbie Hutchinson and Jonathan Childs discuss collected seismic data. This image was taken aboard the U.S. Coast Guard Cutter Healy during a scientific expedition to map the Arctic seafloor.

2009-01-28

275

Seismicity, 1980-86  

SciTech Connect

Tens of thousands of small earthquakes occur in California each year, reflecting brittle deformation of the margins of the Pacific and North American plates as they grind inexorably past one another along the San Andreas fault system. The deformational patterns revealed by this ongoing earthquake activity provide a wealth of information on the tectonic processes along this major transform boundary that, every few hundred years, culminate in rupture of the San Andreas fault in a great (M {approx} 8) earthquake. This chapter describes the regional seismicity and the San Andreas transform boundary; seismicity along the San Andreas Fault system; and focal mechanisms and transform-boundary kinematics. Seismicity patterns and the earthquake cycle and distributed seismicity and deformation of the plate margins are discussed.

Hill, D.P.; Eaton, J.P.; Jones, L.M.

1990-01-01

276

3-D Seismic Methods for Shallow Imaging Beneath Pavement  

E-print Network

The research presented in this dissertation focuses on survey design and acquisition of near-surface 3D seismic reflection and surface wave data on pavement. Increased efficiency for mapping simple subsurface interfaces ...

Miller, Brian

2013-05-31

277

Acoustic and seismic signal processing for footstep detection  

E-print Network

The problem of detecting footsteps using acoustic and seismic sensors is approached from three different angles in this thesis. First, accelerometer data processing systems are designed to make footsteps more apparent to ...

Bland, Ross E. (Ross Edward)

2006-01-01

278

The retrofitting of existing buildings for seismic criteria  

E-print Network

This thesis describes the process for retrofitting a building for seismic criteria. It explains the need for a new, performance-based design code to provide a range of acceptable building behavior. It then outlines the ...

Besing, Christa, 1978-

2004-01-01

279

Tunable marine seismic source  

SciTech Connect

The disclosed device is a marine seismic source that emits a continuously varying FM signal in the 10 to 100 Hz range. The seismic source utilizes an adjustable-length cantilever spring rotatably attached to stiff acoustic radiators, which create a signal in the water. Varying the length of the cantilever spring as a function of frequency permits the device to be continuously tuned for maximum power output.
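The tuning principle (natural frequency rising as the cantilever spring is shortened) can be sketched as follows, with illustrative material and mass values that are not taken from the patent.

```python
import math

# Cantilever-spring tuning sketch with illustrative values:
# a steel rectangular cantilever of adjustable length L carrying a tip mass.
E = 200e9                 # Young's modulus of steel, Pa (assumed)
b, h = 0.05, 0.01         # cross-section width and thickness, m (assumed)
I = b * h ** 3 / 12.0     # second moment of area, m^4
m = 25.0                  # effective oscillating mass, kg (assumed)

def natural_frequency(L):
    """Natural frequency (Hz) of a tip mass on a cantilever of length L (m)."""
    k = 3.0 * E * I / L ** 3              # cantilever tip stiffness, N/m
    return math.sqrt(k / m) / (2.0 * math.pi)
```

Because stiffness scales as 1/L**3, shortening the spring raises the resonance, which is what lets the device sweep its output frequency for maximum power.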

Mifsud, J. F.

1985-12-10

280

Passive seismic experiment  

NASA Technical Reports Server (NTRS)

The establishment of a network of seismic stations on the lunar surface as a result of equipment installed by Apollo 12, 14, and 15 flights is described. Four major discoveries obtained by analyzing seismic data from the network are discussed. The use of the system to detect vibrations of the lunar surface and the use of the data to determine the internal structure, physical state, and tectonic activity of the moon are examined.

Latham, G. V.; Ewing, M.; Press, F.; Sutton, G.; Dorman, J.; Nakamura, Y.; Toksoz, N.; Lammlein, D.; Duennebier, F.

1972-01-01

281

Advanced fiber optic seismic sensors (geophone) research  

NASA Astrophysics Data System (ADS)

This thesis presents systematic research on fiber optic seismic sensors based on optical Fiber Bragg Grating (FBG) sensing technology. Optical fiber sensors using fiber Bragg gratings have a number of advantages, such as immunity to electromagnetic interference, light weight, and low power consumption. The FBG sensor is intrinsically sensitive to dynamic strain signals, and its strain sensitivity can approach the sub-microstrain level. Furthermore, FBG sensors are inherently suited for multiplexing, which makes networked/arrayed deployment on a large scale possible. The basic principle of the FBG geophone is that it transforms the acceleration of ground motion into a strain signal on the FBG sensor through its mechanical design; after optical demodulation it generates an analog voltage output proportional to the strain changes. The customized eight-channel FBG seismic sensor prototype described here consists of FBG sensor/demodulation grating pairs attached to a spring-mass mechanical system. The sensor performance is evaluated systematically in the laboratory using a conventional accelerometer and geophone as benchmarks. Two major applications of the FBG seismic sensor are demonstrated. One is a battlefield remote monitoring system to detect the presence of personnel, wheeled vehicles, and tracked vehicles. The other is seismic reflection surveying in oilfield exploration to collect seismic waves from the earth. Field tests were carried out at an air force base and in an oilfield, respectively. It is shown that the FBG geophone has a higher frequency-response bandwidth and sensitivity than both the conventional moving-coil electromagnetic geophone and the military Rembass-II S/A sensor. Our objective is to develop a distributed FBG seismic sensor network that recognizes and locates seismic sources with high inherent detection capability and a low false alarm rate in an integrated system.
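The strain-to-wavelength sensing principle can be sketched with the standard Bragg-grating strain relation; the fiber parameters below are typical assumed values for silica fiber, not taken from the thesis.

```python
# FBG strain-sensing sketch with typical silica-fiber values (assumed).
n_eff = 1.468              # effective refractive index
period = 530e-9            # grating period, m
lambda_b = 2 * n_eff * period          # Bragg wavelength, ~1.556 um
p_e = 0.22                 # effective photo-elastic coefficient

def wavelength_shift(strain):
    """Bragg wavelength shift (m) for a given axial strain."""
    return lambda_b * (1.0 - p_e) * strain

shift_pm = wavelength_shift(1e-6) * 1e12   # shift per microstrain, picometres
```

A shift on the order of a picometre per microstrain is what the demodulation gratings in the prototype must resolve to reach sub-microstrain sensitivity.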

Zhang, Yan

282

Seismic Crystals And Earthquake Shield Application  

E-print Network

We theoretically demonstrate that an earthquake shield made of a seismic crystal can damp surface waves, which are the most destructive wave type for structures. The seismic crystal is introduced in terms of band gaps (stop bands), and design concepts for earthquake and tsunami shielding are discussed theoretically. Our FDTD-based 2D elastic wave simulations show that the proposed earthquake shield could reduce the magnitude of surface waves by about 0.5 on the Richter scale. This reduction in magnitude can considerably reduce destruction in the case of an earthquake.
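Because Richter magnitude is logarithmic in ground-motion amplitude, the reported 0.5-magnitude reduction corresponds to roughly a threefold reduction in wave amplitude:

```python
# Richter magnitude is proportional to log10 of ground-motion amplitude,
# so a magnitude reduction dM corresponds to an amplitude factor 10**dM.
delta_m = 0.5
amplitude_factor = 10 ** delta_m   # ~3.16x amplitude reduction
```

This conversion is the standard logarithmic magnitude-amplitude relation, not a result specific to the paper's simulations.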

B. Baykant Alagoz; Serkan Alagoz

2009-02-09

283

Seismic Consequence Abstraction  

SciTech Connect

The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

M. Gross

2004-10-25

284

Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model  

SciTech Connect

The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

J. J. Jacobson; D. E. Shropshire; W. B. West

2005-11-01

285

Optimizing Seismic Monitoring Networks for EGS and Conventional Geothermal Projects  

NASA Astrophysics Data System (ADS)

In the past several years, geological energy technologies have received growing attention and have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small- to moderate-sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, now pose a considerable risk for the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential for the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring at an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic-light systems and is therefore an important confidence-building measure toward the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that makes it possible to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts. 
The algorithm is based on D-optimal experimental design, which aims to minimize the error ellipsoid of the linearized location problem. Optimization for additional criteria (e.g., focal-mechanism determination or installation costs) can be included. We consider a 3D seismic velocity model, a European ambient seismic noise model derived from high-resolution land-use data, and existing seismic stations in the vicinity of the geotechnical site. Additionally, we account for the attenuation of the seismic signal with travel time and of ambient seismic noise with depth, so that borehole station networks are handled correctly. Using this algorithm we are able to find the optimal geometry and size of a seismic monitoring network that meets predefined, application-oriented performance criteria. This talk will focus on optimal network geometries for deep geothermal projects of the EGS and hydrothermal types, and discuss the requirements for basic seismic surveillance and for high-resolution reservoir monitoring and characterization.
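The D-optimality criterion for the linearized location problem can be sketched in a few lines. This toy version assumes a homogeneous velocity model and hypothetical station/source coordinates; the actual algorithm additionally folds in the 3D velocity model, the noise model, and existing stations described above.

```python
import itertools
import math

# Minimal sketch of D-optimal network design for the linearized earthquake
# location problem. For each candidate station set, build the Jacobian G of
# travel-time derivatives with respect to the hypocentre (x, y, z) and pick
# the set maximizing det(G^T G), which minimizes the volume of the location
# error ellipsoid. Velocity and geometry are illustrative assumptions.

V = 3500.0  # m/s, assumed homogeneous S-wave velocity

def jacobian_row(station, source):
    dx = [source[i] - station[i] for i in range(3)]
    r = math.sqrt(sum(d * d for d in dx))
    return [d / (r * V) for d in dx]   # d(traveltime)/d(source coordinate)

def d_criterion(stations, source):
    G = [jacobian_row(s, source) for s in stations]
    # 3x3 normal matrix G^T G, then its determinant
    GtG = [[sum(G[k][i] * G[k][j] for k in range(len(G))) for j in range(3)]
           for i in range(3)]
    a, b, c = GtG
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

source = (0.0, 0.0, -4000.0)  # hypothetical reservoir depth
candidates = [(x, y, 0.0) for x in (-5000, 0, 5000) for y in (-5000, 0, 5000)]
best = max(itertools.combinations(candidates, 4),
           key=lambda nets: d_criterion(nets, source))
print(len(best))
```

A collinear station line gives a singular normal matrix (zero determinant), which is exactly the degenerate geometry the D-criterion penalizes.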

Kraft, Toni; Herrmann, Marcus; Bethmann, Falko; Stefan, Wiemer

2013-04-01

286

Real-Time Projection to Verify Plan Success During Execution  

NASA Technical Reports Server (NTRS)

The Mission Data System provides a framework for modeling complex systems in terms of system behaviors and goals that express intent. Complex activity plans can be represented as goal networks that express the coordination of goals on different state variables of the system. Real-time projection extends the ability of this system to verify plan achievability (all goals can be satisfied over the entire plan) into the execution domain, so that the system is able to continuously re-verify a plan as it is executed, and as the states of the system change in response to goals and the environment. Previous versions were able to detect and respond to goal violations only when they actually occurred during execution. This new capability enables the prediction of future goal failures; specifically, goals that were previously found to be achievable but are no longer achievable due to unanticipated faults or environmental conditions. Early detection of such situations enables operators or an autonomous fault response capability to deal with the problem at a point that maximizes the available options. For example, this system has been applied to the problem of managing battery energy on a lunar rover as it is used to explore the Moon. Astronauts drive the rover to waypoints and conduct science observations according to a plan that is scheduled and verified to be achievable with the energy resources available. As the astronauts execute this plan, the system uses this new capability to continuously re-verify the plan as energy is consumed to ensure that the battery will never be depleted below safe levels across the entire plan.
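The rover-battery example can be sketched as a simple forward projection. This is not the Mission Data System API: the activity names, energy costs, and safe floor are illustrative, and the point is only the idea of re-verifying the remaining plan against current state.

```python
# Minimal sketch of continuously re-verifying a plan's energy goal: project
# the battery state across the remaining activities and flag a future goal
# failure as soon as the projection dips below the safe floor. All names and
# numbers are hypothetical.

SAFE_FLOOR_WH = 200.0

def project(battery_wh, remaining_activities):
    """Yield (activity, projected battery level after it) across the plan."""
    level = battery_wh
    for name, cost_wh in remaining_activities:
        level -= cost_wh
        yield name, level

def verify(battery_wh, remaining_activities):
    """Return the first activity whose completion violates the energy goal,
    or None if the whole remaining plan is achievable."""
    for name, level in project(battery_wh, remaining_activities):
        if level < SAFE_FLOOR_WH:
            return name
    return None

plan = [("drive_to_wp1", 300.0), ("science_obs", 150.0), ("drive_to_wp2", 300.0)]
print(verify(1000.0, plan))   # achievable as originally scheduled
print(verify(900.0, plan))    # re-check after higher-than-expected consumption
```

Calling `verify` after every state update is what turns a one-time achievability check into the continuous re-verification the abstract describes.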

Wagner, David A.; Dvorak, Daniel L.; Rasmussen, Robert D.; Knight, Russell L.; Morris, John R.; Bennett, Matthew B.; Ingham, Michel D.

2012-01-01

287

VERIFYING THE MASTER SINTERING CURVE ON AN INDUSTRIAL FURNACE  

Microsoft Academic Search

The Master Sintering Curve is a simple means of predicting density evolution during sintering. This model relies on the work-of-sintering concept, a time-temperature integral, to predict the degree to which a compact has approached the theoretical density limit. The model is characterized through a series of constant heating rate dilatometry experiments. In this paper, we verify that although the model
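The work-of-sintering time-temperature integral that the model relies on can be evaluated numerically. The activation energy and heating schedule below are illustrative assumptions, not values from the paper.

```python
import math

# Sketch of the work-of-sintering integral underlying the Master Sintering
# Curve: Theta(t) = integral of (1/T) * exp(-Q / (R*T)) dt along the thermal
# history T(t). Q and the heating schedule are assumed, illustrative values.

R = 8.314   # J/(mol K)
Q = 300e3   # J/mol, assumed apparent activation energy

def work_of_sintering(temps_k, dt_s):
    """Trapezoidal approximation of Theta over samples of T(t) spaced dt_s apart."""
    vals = [math.exp(-Q / (R * T)) / T for T in temps_k]
    return dt_s * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# Constant heating rate of 10 K/min from 300 K to 1500 K, sampled every second:
rate = 10.0 / 60.0
dt = 1.0
temps = [300.0 + rate * i * dt for i in range(int(1200 / rate) + 1)]
theta = work_of_sintering(temps, dt)
print(theta > 0)
```

Because a slower ramp spends more time at every temperature, it accumulates a larger Θ by the same final temperature, which is why constant-heating-rate dilatometry runs at several rates suffice to characterize the curve.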

Deborah C. Blaine; Seong-Jin Park; Randall M. German; Jerry LaSalle; Hill Nandi

288

Seismic exploration for water on Mars  

NASA Technical Reports Server (NTRS)

It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio-controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of the timed explosions, telemetered to the orbiter, will be used to detect subsurface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each lander rests firmly on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation.
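The basic geometry of a timed-shot survey can be illustrated with a single-layer reflection: given the shot-to-seismometer offset, an assumed layer velocity, and a measured two-way reflection time, the interface depth follows from the hyperbolic moveout equation. All numbers below are illustrative, not mission values.

```python
import math

# Sketch of recovering a layer depth from a timed-shot reflection:
# t = 2 * sqrt(h**2 + (x/2)**2) / v, solved for depth h given two-way time t,
# source-receiver offset x, and layer velocity v (all assumed).

def layer_depth(t_s, offset_m, v_mps):
    half = v_mps * t_s / 2.0
    return math.sqrt(half * half - (offset_m / 2.0) ** 2)

# Forward-model a 1 km deep interface, then invert the time back to depth:
h_true, x, v = 1000.0, 400.0, 2500.0
t = 2.0 * math.sqrt(h_true ** 2 + (x / 2.0) ** 2) / v
print(round(layer_depth(t, x, v), 1))
```

With several shots at known, orbiter-located positions, the same relation constrains both layer depth and velocity, which is how a volatile-saturated (low-velocity) layer would be distinguished.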

Page, Thornton

1987-01-01

289

Nuclear archaeology: Verifying declarations of fissile-material production  

SciTech Connect

Controlling the production of fissile material is an essential element of nonproliferation policy. Similarly, accounting for the past production of fissile material should be an important component of nuclear disarmament. This paper describes two promising techniques that make use of physical evidence at reactors and enrichment facilities to verify the past production of plutonium and highly enriched uranium. In the first technique, the concentrations of long-lived radionuclides in permanent components of the reactor core are used to estimate the neutron fluence in various regions of the reactor, and thereby verify declarations of plutonium production in the reactor. In the second technique, the ratio of the concentration of U-235 to that of U-234 in the tails is used to determine whether a given container of tails was used in the production of low-enriched uranium, which is suitable for reactor fuel, or highly enriched uranium, which can be used in nuclear weapons. Both techniques belong to the new field of "nuclear archaeology," in which the authors attempt to document past nuclear weapons activities and thereby lay a firm foundation for verifiable nuclear disarmament. 11 refs., 1 fig., 3 tabs.
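The scale of the tails involved can be illustrated with the standard enrichment material balance. This is a textbook mass-balance relation, not the paper's U-235/U-234 ratio technique itself; the feed and tails assays below are typical assumed values.

```python
# Sketch of the enrichment material balance used when reasoning about tails:
# feed F and tails W needed to produce P kg of product at U-235 weight
# fraction x_p, from feed assay x_f and tails assay x_t (assumed values).

def material_balance(p_kg, x_p, x_f=0.00711, x_t=0.003):
    f_kg = p_kg * (x_p - x_t) / (x_f - x_t)
    return f_kg, f_kg - p_kg   # (feed, tails)

leu_feed, leu_tails = material_balance(1000.0, 0.04)   # LEU for reactor fuel
heu_feed, heu_tails = material_balance(1000.0, 0.90)   # highly enriched uranium
print(round(leu_feed), round(heu_feed))
```

Producing a tonne of HEU leaves more than an order of magnitude more tails than a tonne of LEU, which is why an accounting of tails containers carries so much information about past production.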

Fetter, S. (Univ. of Maryland, College Park (United States))

1993-01-01

290

Characterization of the Virgo Seismic Environment  

E-print Network

The Virgo gravitational wave detector is an interferometer (ITF) with 3 km arms located in Pisa, Italy. From July to October 2010, Virgo performed its third science run (VSR3) in coincidence with the LIGO detectors. Despite the several techniques adopted to isolate the interferometer from the environment, seismic noise remains an important issue for Virgo. Vibrations produced by the detector infrastructure (such as air conditioning units, water chillers/heaters, and pumps) are found to affect Virgo's sensitivity, with the main coupling mechanisms being beam jitter and scattered-light processes. The Advanced Virgo (AdV) design seeks to reduce ITF couplings to environmental noise by having most vibration-sensitive components suspended and in vacuum, as well as by muffling and relocating loud machines. During June and July 2010, a Guralp-3TD seismometer was stationed at various locations around the Virgo site that host major infrastructure machines. Seismic data were examined using spectral and coherence analysis with seismic probes close to the detector. The primary aim of this study was to identify noisy machines which seismically affect the ITF environment and thus require mitigation attention. The machines analyzed are located at various distances from the experimental halls, ranging from 10 m to 100 m. An attempt is made to measure the attenuation of the emitted noise at the ITF and to correlate it with the distance from the source and with seismic attenuation models in soil.

The Virgo Collaboration; T. Accadia; F. Acernese; P. Astone; G. Ballardin; F. Barone; M. Barsuglia; A. Basti; Th. S. Bauer; M. Bebronne; M. G. Beker; A. Belletoile; M. Bitossi; M. A. Bizouard; M. Blom; F. Bondu; L. Bonelli; R. Bonnand; V. Boschi; L. Bosi; B. Bouhou; S. Braccini; C. Bradaschia; M. Branchesi; T. Briant; A. Brillet; V. Brisson; T. Bulik; H. J. Bulten; D. Buskulic; C. Buy; G. Cagnoli; E. Calloni; B. Canuel; F. Carbognani; F. Cavalier; R. Cavalieri; G. Cella; E. Cesarini; O. Chaibi; E. Chassande-Mottin; A. Chincarini; A. Chiummo; F. Cleva; E. Coccia; P. -F. Cohadon; C. N. Colacino; J. Colas; A. Colla; M. Colombini; A. Conte; M. Coughlin; J. -P. Coulon; E. Cuoco; S. DAntonio; V. Dattilo; M. Davier; R. Day; R. De Rosa; G. Debreczeni; W. Del Pozzo; M. del Prete; L. Di Fiore; A. Di Lieto; M. Di Paolo Emilio; A. Di Virgilio; A. Dietz; M. Drago; G. Endroczi; V. Fafone; I. Ferrante; F. Fidecaro; I. Fiori; R. Flaminio; L. A. Forte; J. -D. Fournier; J. Franc; S. Frasca; F. Frasconi; M. Galimberti; L. Gammaitoni; F. Garufi; M. E. Gaspar; G. Gemme; E. Genin; A. Gennai; A. Giazotto; R. Gouaty; M. Granata; C. Greverie; G. M. Guidi; J. -F. Hayau; A. Heidmann; H. Heitmann; P. Hello; P. Jaranowski; I. Kowalska; A. Krolak; N. Leroy; N. Letendre; T. G. F. Li; N. Liguori; M. Lorenzini; V. Loriette; G. Losurdo; E. Majorana; I. Maksimovic; N. Man; M. Mantovani; F. Marchesoni; F. Marion; J. Marque; F. Martelli; A. Masserot; C. Michel; L. Milano; Y. Minenkov; M. Mohan; N. Morgado; A. Morgia; S. Mosca; B. Mours; L. Naticchioni; F. Nocera; G. Pagliaroli; L. Palladino; C. Palomba; F. Paoletti; M. Parisi; A. Pasqualetti; R. Passaquieti; D. Passuello; G. Persichetti; F. Piergiovanni; M. Pietka; L. Pinard; R. Poggiani; M. Prato; G. A. Prodi; M. Punturo; P. Puppo; D. S. Rabeling; I. Racz; P. Rapagnani; V. Re; T. Regimbau; F. Ricci; F. Robinet; A. Rocchi; L. Rolland; R. Romano; D. Rosinska; P. Ruggi; B. Sassolas; D. Sentenac; L. Sperandio; R. Sturani; B. Swinkels; M. Tacca; L. 
Taffarello; A. Toncelli; M. Tonelli; O. Torre; E. Tournefier; F. Travasso; G. Vajente; J. F. J. van den Brand; C. Van Den Broeck; S. van der Putten; M. Vasuth; M. Vavoulidis; G. Vedovato; D. Verkindt; F. Vetrano; A. Vicere; J. -Y. Vinet; S. Vitale; H. Vocca; R. L. Ward; M. Was; M. Yvert; A. Zadrozny; J. -P. Zendri

2011-08-08

291

USGS National Seismic Hazard Maps  

NSDL National Science Digital Library

This set of resources provides seismic hazard assessments and information on design values and mitigation for the U.S. and areas around the world. Map resources include the U.S. National and Regional probabilistic ground motion map collection, which covers the 50 states, Puerto Rico, and selected countries. These maps display peak ground acceleration (PGA) values and are used as the basis for seismic provisions in building codes and for new construction. There is also a custom mapping and analysis tool, which enables users to re-plot these maps for an area of interest, get hazard values using latitude/longitude or zip code, find predominant magnitudes and distances, and map the probability of a given magnitude within a certain distance from a site. The ground motion calculator, a Java application, determines hazard curves, uniform hazard response spectra, and design parameters for sites in the 50 states and most territories. There is also a two-part earthquake hazards primer, which provides links to hazard maps and frequently asked questions, as well as more detailed information for building and safety planners.

292

Regional seismic discrimination research at LLNL  

SciTech Connect

The ability to verify a Comprehensive Test Ban Treaty (CTBT) depends in part on the ability to seismically detect and discriminate between potential clandestine underground nuclear tests and other seismic sources, including earthquakes and mining activities. Regional techniques are necessary to push detection and discrimination levels down to small magnitudes, but existing methods of event discrimination are mainly empirical and show much variability from region to region. The goals of Lawrence Livermore National Laboratory's (LLNL's) regional discriminant research are to evaluate the most promising discriminants, improve the understanding of their physical basis and use this information to develop new and more effective discriminants that can be transported to new regions of high monitoring interest. In this report the authors discuss preliminary efforts to geophysically characterize the Middle East and North Africa. They show that the remarkable stability of coda allows one to develop physically based, stable single-station magnitude scales in new regions. They then discuss progress to date on evaluating and improving physical understanding and ability to model regional discriminants, focusing on the comprehensive NTS dataset. The authors apply this modeling ability to develop improved discriminants including slopes of P to S ratios. They find combining disparate discriminant techniques is particularly effective in identifying consistent outliers such as shallow earthquakes and mine seismicity. Finally they discuss development and use of new coda and waveform modeling tools to investigate special events.
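The P/S amplitude-ratio idea behind such discriminants can be sketched simply: explosions couple relatively more energy into P waves than earthquakes do, so a high log10(P/S) flags a suspected explosion. The threshold below is illustrative; in practice it is calibrated per region, frequency band, and distance, which is exactly the regional transportability problem the abstract describes.

```python
import math

# Sketch of a regional P/S amplitude-ratio discriminant. Amplitudes would be
# measured from bandpass-filtered regional phases (e.g., Pn/Lg); the 0.2
# threshold is an assumed, uncalibrated placeholder.

def log_ps_ratio(p_amp, s_amp):
    return math.log10(p_amp / s_amp)

def classify(p_amp, s_amp, threshold=0.2):
    if log_ps_ratio(p_amp, s_amp) > threshold:
        return "explosion-like"
    return "earthquake-like"

print(classify(5.0, 2.0))   # P enriched relative to S
print(classify(1.0, 3.0))   # S-rich, as typical earthquakes are
```

Combining this ratio with independent discriminants (depth, moment-to-magnitude, coda measures) is what isolates the consistent outliers mentioned above.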

Walter, W.R.; Mayeda, K.M.; Goldstein, P.; Patton, H.J.; Jarpe, S.; Glenn, L. [Lawrence Livermore National Lab., CA (United States). Earth Sciences Div.

1995-10-01

293

Gravity of the New Madrid seismic zone; a preliminary study  

USGS Publications Warehouse

In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Mo. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/Central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

Langenheim, V.E.

1995-01-01

294

Shake It Up! Engineering for Seismic Waves  

NSDL National Science Digital Library

Students learn about how engineers design and build shake tables to test the ability of buildings to withstand the various types of seismic waves generated by earthquakes. Just like engineers, students design and build shake tables to test their own model buildings made of toothpicks and mini marshmallows. Once students are satisfied with the performance of their buildings, they put them through a one-minute simulated earthquake challenge.

Integrated Teaching And Learning Program

295

Community Seismic Network (CSN)  

NASA Astrophysics Data System (ADS)

The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion produced within seconds of significant local earthquakes that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that data collection and processing are done in the "cloud" (the Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e., station locations are part of the waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.
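The privacy rule for map display can be sketched as cell-level aggregation with a minimum count. The cell size and coordinates below are illustrative assumptions, not CSN parameters.

```python
import math
from collections import Counter

# Sketch of CSN-style privacy aggregation: sensors are shown only as per-cell
# counts, and a cell is displayed only when at least two sensors occupy it,
# so no single device can be tracked. CELL_DEG is an assumed cell size.

CELL_DEG = 0.001  # roughly block-sized cells in degrees; illustrative

def cell_of(lat, lon):
    return (math.floor(lat / CELL_DEG), math.floor(lon / CELL_DEG))

def displayable_cells(sensor_locations, min_count=2):
    counts = Counter(cell_of(lat, lon) for lat, lon in sensor_locations)
    return {cell: n for cell, n in counts.items() if n >= min_count}

# Two sensors share a block; a third sits alone and is therefore suppressed:
sensors = [(34.1478, -118.1445), (34.1479, -118.1446), (34.1500, -118.1600)]
print(displayable_cells(sensors))
```

The same `min_count` threshold applied to mobile sensors is what implements the "at least two per cell" anti-tracking requirement.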

Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

2011-12-01

296

Seismic Hazard analysis of Adjaria Region in Georgia  

NASA Astrophysics Data System (ADS)

The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic-hazard methodology in which seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., a fixed site-source distance that excludes sources at great distance from the calculation) allow the program to balance precision and efficiency during hazard calculation. 
Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions (MFDs): a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in a tabular format. Our study shows that, in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values, and the two attenuation laws considered give quite different PGA and SA values.
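The core bookkeeping of this kind of PSHA can be sketched in a few lines: a truncated Gutenberg-Richter distribution gives the annual rate of events above a magnitude, and the Poisson occurrence model converts an annual exceedance rate into a probability of exceedance over a design life. The a, b, and magnitude bounds below are illustrative, not the study's values.

```python
import math

# Sketch of a truncated exponential Gutenberg-Richter rate model plus the
# Poisson conversion from annual rate to probability of exceedance in t years.
# All parameter values are assumed for illustration.

def gr_rate_above(m, a=4.0, b=1.0, m_min=4.0, m_max=7.5):
    """Annual rate of events with magnitude >= m, truncated at m_max."""
    if m >= m_max:
        return 0.0
    beta = b * math.log(10.0)
    n_min = 10.0 ** (a - b * m_min)        # annual rate of events >= m_min
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return n_min * num / den

def poisson_prob_exceed(annual_rate, t_years):
    return 1.0 - math.exp(-annual_rate * t_years)

rate_m6 = gr_rate_above(6.0)
print(round(poisson_prob_exceed(rate_m6, 50.0), 3))
```

Repeating this for a grid of ground-motion levels, with a GMPE mapping each source's magnitudes and distances to PGA or SA, is what produces the hazard curve described above.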

Jorjiashvili, Nato; Elashvili, Mikheil

2014-05-01

297

Seismic hazard assessment in Greece: Revisited  

NASA Astrophysics Data System (ADS)

Greece is the most earthquake-prone country in the eastern Mediterranean territory and one of the most seismically active areas globally. Seismic Hazard Assessment (SHA) is a useful procedure to estimate the expected earthquake magnitude and the strong ground-motion parameters which are necessary for earthquake-resistant design. Several studies on the SHA of Greece are available, constituting the basis of the National Seismic Code. However, the recently available, more complete, accurate and homogeneous seismological data (the new earthquake catalogue of Makropoulos et al., 2012), the revised seismic zones determined within the framework of the SHARE project (2012), new empirical attenuation formulas extracted for several regions in Greece, and new SHA algorithms are the innovations that motivated the present study. Herein, the expected earthquake magnitude for Greece is evaluated by applying the zone-free, upper-bounded Gumbel's third asymptotic distribution of extreme values. The peak ground acceleration (PGA), velocity (PGV) and displacement (PGD) are calculated at the seismic bedrock using two methods: (a) Gumbel's first asymptotic distribution of extreme values, which is valid for initial open-end distributions, and (b) the Cornell-McGuire approach, using the CRISIS2007 (Ordaz et al., 2007) software. The latter takes into account seismic source zones for which seismicity parameters are assigned following a Poisson recurrence model. Thus, each source is characterized by a series of seismic parameters, such as the magnitude recurrence and the recurrence rate for a threshold magnitude, while different predictive equations can be assigned to different seismic source zones. Recently published attenuation parameters were considered. Moreover, new attenuation parameters for the very seismically active Corinth Gulf, deduced during this study from recordings of the RASMON accelerometric array, were also used. 
The hazard parameters, such as the most probable annual maximum earthquake magnitude (mode) and the maximum expected earthquake magnitude with 70% and 90% probability of not being exceeded in 50 and 100 years, are determined and compiled into a GIS mapping scheme. The data quality allowed the estimation of strong ground-motion parameters (PGA, PGV and PGD) within cells of small dimensions, 0.25° × 0.25°. The results are discussed and compared with those obtained by other studies.
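The Gumbel first-asymptote bookkeeping behind such "probability of not being exceeded in T years" figures can be sketched directly: if annual-maximum magnitudes follow F(m) = exp(-exp(-(m-u)/s)), the magnitude not exceeded with probability p over T years is m = u + s·ln(T / (-ln p)). The location u and scale s below are illustrative fits, not the study's values.

```python
import math

# Sketch of the Gumbel type-I extreme-value calculation for the magnitude
# not exceeded with probability p over t_years, assuming independent annual
# maxima with location u and scale s (both assumed here).

def magnitude_not_exceeded(p, t_years, u=5.2, s=0.45):
    return u + s * math.log(t_years / (-math.log(p)))

m70_50 = magnitude_not_exceeded(0.70, 50.0)
m90_50 = magnitude_not_exceeded(0.90, 50.0)
print(round(m70_50, 2), round(m90_50, 2))
```

Demanding a higher non-exceedance probability (90% rather than 70%) over the same window necessarily pushes the design magnitude upward, which is the trade-off the mapped hazard parameters encode.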

Makropoulos, Kostas; Chousianitis, Kostas; Kaviris, George; Kassaras, Ioannis

2013-04-01

298

Seismic source parameters  

SciTech Connect

The use of information contained on seismograms to infer the properties of an explosion source presents an interesting challenge, because the seismic waves recorded on the seismograms represent only small, indirect effects of the explosion. The essential physics of the problem includes the process by which these elastic waves are generated by the explosion and also the processes involved in propagating the seismic waves from the source region to the sites where the seismic data are collected. Interpretation of the seismic data in terms of source properties requires that the effects of these generation and propagation processes be taken into account. The propagation process involves linear mechanics, and a variety of standard seismological methods have been developed for handling this part of the problem. The generation process presents a more difficult problem, as it involves non-linear mechanics, but semi-empirical methods that appear to yield reasonable results have been developed for handling this part of the problem. These basic properties of the seismic method are illustrated with some of the results from the NPE.

Johnson, L.R.

1994-06-01

299

AUTOMATING SHALLOW SEISMIC IMAGING  

SciTech Connect

Our current EMSP project continues an effort begun in 1997 to develop ultrashallow seismic imaging as a cost-effective method applicable to DOE facilities. The objective of the present research is to refine and demonstrate the use of an automated method of conducting shallow seismic surveys--an approach that represents a significant departure from conventional seismic-survey field procedures. Recent tests involving a second-generation mechanical geophone-planting device have shown that large numbers of geophones can be placed quickly and automatically and can acquire good data. In some easy-access environments, this device is expected to make shallow seismic surveying considerably more efficient and less expensive. Another element of our research plan involves monitoring the cone of depression of a pumping well that serves as a proxy location for fluid-flow at a contaminated site. In May 2001, we collected data from a well site at which drawdown equilibrium had been reached. That information is being interpreted and evaluated. The development of noninvasive, in-situ methods such as placing geophones automatically and using near-surface seismic methods alone or in concert with ground-penetrating radar to identify and characterize the hydrologic flow regimes at contaminated sites supports the prospect of developing effective, cost-conscious cleanup strategies for DOE and others.

Steeples, Don W.

2002-06-01

300

Synthesis of artificial spectrum-compatible seismic accelerograms  

NASA Astrophysics Data System (ADS)

The Hilbert-Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures, such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis and requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, not many actual records of different seismic intensities are available for many regions. In addition, a large number of seismic accelerograms are required to perform the series of nonlinear dynamic analyses needed for a reliable statistical investigation of structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which are useful in earthquake engineering for the dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert-Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics, such as frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established industrial methods in the literature.
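This is not the paper's Hilbert-Huang approach, but the amplitude/phase decomposition idea underlying any spectrum-compatible synthesis can be sketched classically: pair a target one-sided Fourier amplitude spectrum with random phases and invert the DFT. Real spectral matching iterates against a *response* spectrum of an SDOF oscillator; this simplification matches Fourier amplitudes only, and the target values are arbitrary illustrations.

```python
import cmath
import math
import random

# Minimal sketch: build a real time series whose one-sided DFT amplitudes
# equal target_amps, with random phases (a simplification of spectrum-
# compatible synthesis; not the paper's HHT-based method).

def synthesize(target_amps, seed=0):
    rng = random.Random(seed)
    n = 2 * (len(target_amps) - 1)
    spec = [0j] * n
    for k, a in enumerate(target_amps):
        # DC and Nyquist bins must stay real for a real-valued signal
        phase = 0.0 if k in (0, len(target_amps) - 1) else rng.uniform(0, 2 * math.pi)
        spec[k] = a * cmath.exp(1j * phase)
        if 0 < k < len(target_amps) - 1:
            spec[n - k] = spec[k].conjugate()   # Hermitian symmetry
    # inverse DFT (naive O(n^2) for clarity)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

target = [0.0, 1.0, 2.0, 1.0, 0.5]   # illustrative one-sided amplitudes
accel = synthesize(target)
print(len(accel))
```

Different seeds yield different phase realizations with the identical amplitude spectrum, which is how a suite of statistically independent input motions is generated for the nonlinear-analysis ensembles described above.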

Vrochidou, E.; Alvanitopoulos, P. F.; Andreadis, I.; Elenas, A.; Mallousi, K.

2014-08-01

301

First Quarter Hanford Seismic Report for Fiscal Year 2011  

SciTech Connect

The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 16 local earthquakes during the first quarter of FY 2011. Six earthquakes were located at shallow depths (less than 4 km), seven at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and three at depths greater than 9 km, within the basement. Geographically, thirteen earthquakes were located in known swarm areas and three were classified as random events. The highest-magnitude event (1.8 Mc) was recorded on October 19, 2010 at a depth of 17.5 km, with its epicenter near the Yakima River between the Rattlesnake Mountain and Horse Heaven Hills swarm areas.
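The depth classification these quarterly reports use can be sketched as a simple binning function. The bin edges follow the report's description; the example catalogue is illustrative, not the actual FY 2011 event list.

```python
from collections import Counter

# Sketch of the Hanford quarterly-report depth classes: shallow (< 4 km),
# intermediate (4-9 km, sediments above the basement), and deep (> 9 km,
# within the crystalline basement). Sample depths are hypothetical.

def depth_class(depth_km):
    if depth_km < 4.0:
        return "shallow"
    if depth_km <= 9.0:
        return "intermediate"
    return "basement"

quarter_depths = [2.1, 17.5, 5.3, 8.8, 12.0, 3.0]   # illustrative catalogue
print(Counter(depth_class(d) for d in quarter_depths))
```

Tallying the classes per quarter, alongside the swarm-area versus random-event split, reproduces the kind of summary statistics the report presents.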

Rohay, Alan C.; Sweeney, Mark D.; Clayton, Ray E.; Devary, Joseph L.

2011-03-31

302

First Quarter Hanford Seismic Report for Fiscal Year 2009  

SciTech Connect

The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. This includes three recently acquired Transportable Array stations located at Cold Creek, Didier Farms, and Phinney Hill. For the Hanford Seismic Network, ten local earthquakes were recorded during the first quarter of fiscal year 2009. All earthquakes were considered “minor,” with magnitudes (Mc) less than 1.0. Two earthquakes were located at shallow depths (less than 4 km), most likely in the Columbia River basalts; five earthquakes at intermediate depths (between 4 and 9 km), most likely in the sub-basalt sediments; and three earthquakes were located at depths greater than 9 km, within the basement. Geographically, four earthquakes occurred in known swarm areas and six earthquakes were classified as random events.

Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

2009-03-15

303

Verified bites by the woodlouse spider, Dysdera crocata.  

PubMed

Bites by the woodlouse spider, Dysdera crocata, are virtually innocuous. The main symptom is minor pain, typically lasting less than 1 h, probably due mostly to mechanical puncture of the skin. However, because the spider has a strong proclivity to bite, has large fangs which it bares when threatened, and is commonly mistaken for the medically important brown recluse spider in the United States, documentation of the mild effects of its bites may prevent excessive, unwarranted and possibly harmful treatment. We present information on eight verified bites reported to us as well as eight additional bites recorded in the literature. PMID:16574180

Vetter, Richard S; Isbister, Geoffrey K

2006-06-01

304

From Operating-System Correctness to Pervasively Verified Applications  

NASA Astrophysics Data System (ADS)

Though program verification has been known and used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.

Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike

305

Magnitude correlations in global seismicity  

SciTech Connect

By employing natural time analysis, we analyze the worldwide seismicity and study the existence of correlations between earthquake magnitudes. We find that global seismicity exhibits nontrivial magnitude correlations for earthquake magnitudes greater than Mw 6.5.
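In natural time analysis, the k-th of N events is read at natural time χ_k = k/N and weighted by p_k, its fraction of the total released energy; correlations are then probed through moments of χ such as the variance κ1. A minimal sketch (the energy-magnitude proxy Q ∝ 10^(1.5 Mw) is a standard choice, but the function name and demo catalog are our assumptions, not this paper's code):

```python
import numpy as np

def natural_time_kappa1(magnitudes):
    # Event "energies" from magnitudes via the proxy Q_k ~ 10^(1.5 Mw)
    Q = 10.0 ** (1.5 * np.asarray(magnitudes, dtype=float))
    p = Q / Q.sum()                          # normalized energies p_k
    chi = np.arange(1, len(p) + 1) / len(p)  # natural time chi_k = k / N
    # kappa1 = <chi^2> - <chi>^2 under the weights p_k
    return float(np.sum(p * chi**2) - np.sum(p * chi)**2)

# For a catalog of equal magnitudes p_k is uniform and kappa1 -> 1/12
print(round(natural_time_kappa1([6.5] * 1000), 4))  # 0.0833
```

Deviations of κ1 from the uniform-catalog value are what analyses of this kind use to detect nontrivial ordering of large-magnitude events.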

Sarlis, N. V. [Solid State Section and Solid Earth Physics Institute, Physics Department, University of Athens, Panepistimiopolis, Zografos GR-157 84, Athens (Greece)

2011-08-15

306

Unraveling Megathrust Seismicity  

NASA Astrophysics Data System (ADS)

The majority of global seismicity originates at subduction zones, either within the converging plates or along the plate interface. In particular, events with Mw ≥ 8.0 usually occur at the subduction megathrust, which is the frictional interface between subducting and overriding plates. Consequently, seismicity at subduction megathrusts is responsible for most of the seismic energy globally released during the last century [Pacheco and Sykes, 1992]. What's more, during the last decade giant megathrust earthquakes occurred at an increased rate with respect to the last century [Ammon et al., 2010], often revealing unexpected characteristics and resulting in catastrophic effects. Determining the controlling factors of these events would have fundamental implications for earthquake and tsunami hazard assessment.

Funiciello, Francesca; Corbi, Fabio; van Dinther, Ylona; Heuret, Arnauld

2013-12-01

307

Induced seismicity. Final report  

SciTech Connect

The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Just-completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating in the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements, along with seismic data, are used to test and constrain thermo-mechanical models.

Segall, P.

1997-09-18

308

Application of the Neo-Deterministic Seismic Microzonation Procedure in Bulgaria and Validation of the Seismic Input Against Eurocode 8  

SciTech Connect

The earthquake record and the Code for design and construction in seismic regions in Bulgaria have shown that the territory of the Republic of Bulgaria is exposed to a high seismic risk due to local shallow and regional strong intermediate-depth seismic sources. The available strong-motion database is quite limited and therefore not representative of the real hazard. The application of the neo-deterministic seismic hazard assessment procedure for two main Bulgarian cities has supplied a significant database of synthetic strong motions for the target sites, applicable for earthquake engineering purposes. The main advantage of the applied deterministic procedure is the possibility of taking into account, simultaneously and consistently, the contributions of the seismic source and of seismic wave propagation through the traversed media to the earthquake ground motion at the target sites. We discuss in this study the results of some recent applications of the neo-deterministic seismic microzonation procedure to the cities of Sofia and Russe. The validation of the theoretically modeled seismic input against Eurocode 8 and the few available records at these sites is discussed.

Ivanka, Paskaleva [CLSMEE--BAS, 3 Acad G. Bonchev str, 1113 Sofia (Bulgaria); Mihaela, Kouteva [CLSMEE-BAS, 3 Acad G. Bonchev str, 1113 Sofia (Bulgaria); ESP-SAND, ICTP, Trieste (Italy); Franco, Vaccari [DST-University of Trieste, Via E. Weiss 4, 34127 Trieste (Italy); Panza, Giuliano F. [DST-University of Trieste, Via E. Weiss 4, 34127 Trieste (Italy); ESP-SAND, ICTP, Trieste (Italy)

2008-07-08

309

A verified minimal YAC contig for human chromosome 21  

SciTech Connect

The goal of this project is the construction of a verified YAC contig of the complete long arm of human chromosome 21 utilizing YACs from the CEPH and St. Louis libraries. The YACs in this contig have been analyzed for size by PFGE, tested for chimerism by FISH or end-cloning, and verified for STS content by PCR. This last analysis has revealed a number of cases of conflict with the published STS order. To establish correct order, we have utilized STS content analysis of somatic cell hybrids containing portions of chromosome 21. Additional problems being addressed include completeness of coverage and possible deletions or gaps. Questions of completeness of the CEPH 810 YAC set arose after screening with 57 independently derived probes failed to identify clones for 11 (19%). Ten of the 11, however, do detect chromosome 21 cosmids when used to screen Lawrence Livermore library LL21NC02"G", a cosmid library constructed from flow-sorted chromosome 21. Remaining gaps in the contig are being closed by several methods. These include YAC fingerprinting and conversion of YACs to cosmids. In addition, we are establishing the overlap between the physical NotI map and the YAC contig by testing YACs for NotI sites and screening the YACs in the contig for the presence of NotI-linking clones.

Graw, S.L.; Patterson, D.; Drabkin, H. [Eleanor Roosevelt Institute, Denver, CO (United States)] [and others]

1994-09-01

310

Seismic exploration system improvement  

SciTech Connect

This patent describes a seismic exploration system having geophone locations along a survey line with at least one geophone connected to separate circuits connected to corresponding terminals of a roll-along common depth point switch. A means is described for identifying a specific one of the geophone locations as the switch changes connections, comprising means for superimposing a signal outside the useful range of seismic energy signals generated by the geophones on the one of the separate circuits connected to the specific geophone location whereby the location may be identified on the changed connection side of the switch.

Bearden, J.M.

1987-01-06

311

Marine seismic sensor  

SciTech Connect

A hydrophone streamer that includes several arrays of optical fiber pressure sensors. Each array consists of at least three sensors symmetrically disposed around the inside of the streamer skin to form a vertically-disposed array. Each sensor modulates a coherent light beam in accordance with the instantaneous ambient water pressure. The output signals of the sensors include an AC component due to seismic waves and a DC component due to hydrostatic pressure difference between the sensors of an array. Means are provided to resolve the AC and DC components to determine the arrival direction of the received seismic waves.

Savit, C. H.

1985-10-15

312

Multi-purpose seismic transducer  

SciTech Connect

A multi-purpose seismic transducer includes a first seismic sensor having a first transfer function. A transfer-function shaping filter is coupled to the output of the first seismic sensor. The filter is adjustable to shape the first transfer function to match a plurality of different second transfer functions.

Hall, E.M.

1981-02-24

313

SEISMIC OBSERVATION IN IRRIGATION DAM  

Microsoft Academic Search

A large number of irrigation dams have been constructed in Japan. Strong earthquakes have often occurred, and seismic observations in fill dams are very important for safety and countermeasures. Seismometers have been installed in 156 high irrigation dams since 1954, and seismic accelerations have been observed. Dynamic dam behavior is investigated using seismic observation records at the National Institute for Rural Engineering

Tamotsu FURUYA

314

Development of core seismic analysis models for KNGR fuel assemblies associated with 0.3 g seismic loads  

Microsoft Academic Search

In order to evaluate the structural integrity of fuel assemblies associated with 0.3 g seismic loads in the Korean Next Generation Reactor (KNGR), a detailed fuel assembly model and core series models with seven and 17 assemblies have been developed using the super-element capability of the MSC/NASTRAN code. The detailed fuel assembly model has been verified by comparison with the analysis

H. K. Kim; J. S. Lee

2002-01-01

315

Compliant liquid column damper modified by shape memory alloy device for seismic vibration control  

NASA Astrophysics Data System (ADS)

Liquid column dampers (LCDs) have long been used for the seismic vibration control of flexible structures. In contrast, tuning LCDs to short-period structures poses difficulty. Various modifications to the original LCD configuration have been proposed to improve its performance in relatively stiff structures. One such system, referred to as a compliant LCD, has been proposed recently by connecting the LCD to the structure with a spring. In this study, an improvement is attempted in compliant LCDs by replacing the linear spring with a spring made of shape memory alloy (SMA). Considering the dissipative, superelastic force-deformation hysteresis of SMA, triggered by stress-induced microstructural phase transition, the performance is expected to improve further. The optimum parameters for the SMA-compliant LCD are obtained through design optimization, based on a nonlinear random vibration response analysis via stochastic linearization of the force-deformation hysteresis of the SMA and of the dissipation by liquid motion through an orifice. Substantially enhanced performance of the SMA-LCD over a conventional compliant LCD is demonstrated, and its consistency is further verified under recorded ground motions. The robustness of the improved performance is also validated by a parametric study of the anticipated variations in system parameters as well as variability in seismic loading.
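The tuning difficulty for short-period structures that motivates compliant LCDs follows from the liquid column's natural frequency, ω = sqrt(2g/L) for a U-shaped column of total liquid length L: matching a stiff structure forces an impractically short column. A quick numerical illustration (the target periods are our assumptions, not the paper's):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def lcd_frequency_hz(L):
    # Natural frequency of a U-shaped liquid column of total length L (m)
    return math.sqrt(2.0 * g / L) / (2.0 * math.pi)

# Column length needed to tune an LCD to structures of decreasing period:
# L = 2 g / omega^2 shrinks rapidly as the target period drops.
for period_s in (2.0, 1.0, 0.5):
    f = 1.0 / period_s
    L = 2.0 * g / (2.0 * math.pi * f) ** 2
    print(period_s, round(L, 3))
```

A compliant spring (or, here, an SMA element) in series relaxes this constraint by shifting the combined system frequency without shortening the column.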

Gur, Sourav; Mishra, Sudib Kumar; Bhowmick, Sutanu; Chakraborty, Subrata

2014-10-01

316

49 CFR 40.23 - What actions do employers take after receiving verified test results?  

Code of Federal Regulations, 2010 CFR

...employers take after receiving verified test results? 40.23 Section 40.23 Transportation...employers take after receiving verified test results? (a) As an employer who receives a verified positive drug test result, you must immediately remove...

2010-10-01

317

7 CFR 1780.57 - Design policies.  

Code of Federal Regulations, 2013 CFR

...intended for sheltering persons or property will be designed with appropriate seismic safety provisions in compliance with the Earthquake Hazards Reduction Act of 1977 (42 U.S.C. 7701 et seq.), and Executive Order 12699, Seismic Safety of...

2013-01-01

318

First quarter Hanford seismic report for fiscal year 2000  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 311 triggers on two parallel detection and recording systems during the first quarter of fiscal year (FY) 2000. Twelve seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 2 were earthquakes in the Columbia River Basalt Group, 3 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 1 was a quarry blast. Two earthquakes appear to be related to a major geologic structure, no earthquakes occurred in known swarm areas, and 9 earthquakes were random occurrences.
No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2000.

DC Hartshorn; SP Reidel; AC Rohay

2000-02-23

319

Third Quarter Hanford Seismic Report for Fiscal Year 2000  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 818 triggers on two parallel detection and recording systems during the third quarter of fiscal year (FY) 2000. Thirteen seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 7 were earthquakes in the Columbia River Basalt Group, 1 was an earthquake in the pre-basalt sediments, and 5 were earthquakes in the crystalline basement. Three earthquakes occurred in known swarm areas, and 10 earthquakes were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometers during the third quarter of FY 2000.

DC Hartshorn; SP Reidel; AC Rohay

2000-09-01

320

First Quarter Hanford Seismic Report for Fiscal Year 2008  

SciTech Connect

The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The Hanford Seismic Assessment Team locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. For the Hanford Seismic Network, forty-four local earthquakes were recorded during the first quarter of fiscal year 2008. A total of thirty-one micro earthquakes were recorded within the Rattlesnake Mountain swarm area at depths in the 5-8 km range, most likely within the pre-basalt sediments. The largest event recorded by the network during the first quarter (November 25, 2007 - magnitude 1.5 Mc) was located within this swarm area at a depth of 4.3 km. With regard to the depth distribution, three earthquakes occurred at shallow depths (less than 4 km, most likely in the Columbia River basalts), thirty-six earthquakes at intermediate depths (between 4 and 9 km, most likely in the pre-basalt sediments), and five earthquakes were located at depths greater than 9 km, within the crystalline basement. Geographically, thirty-eight earthquakes occurred in swarm areas and six earthquakes were classified as random events.

Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.; Clayton, Ray E.; Devary, Joseph L.

2008-03-21

321

Second Quarter Hanford Seismic Report for Fiscal Year 2000  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the US Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The HSN uses 21 sites and the EWRN uses 36 sites; both networks share 16 sites. The networks have 46 combined data channels because Gable Butte and Frenchman Hills East are three-component sites. The reconfiguration of the telemetry and recording systems was completed during the first quarter. All leased telephone lines have been eliminated and radio telemetry is now used exclusively. For the HSN, there were 506 triggers on two parallel detection and recording systems during the second quarter of fiscal year (FY) 2000. Twenty-seven seismic events were located by the Hanford Seismic Network within the reporting region of 46-47°N latitude and 119-120°W longitude; 12 were earthquakes in the Columbia River Basalt Group, 2 were earthquakes in the pre-basalt sediments, 9 were earthquakes in the crystalline basement, and 5 were quarry blasts. Three earthquakes appear to be related to geologic structures, eleven earthquakes occurred in known swarm areas, and seven earthquakes were random occurrences.
No earthquakes triggered the Hanford Strong Motion Accelerometers during the second quarter of FY 2000.

DC Hartshorn; SP Reidel; AC Rohay

2000-07-17

322

Korea Seismic Networks and Korea Integrated Seismic System (KISS)  

NASA Astrophysics Data System (ADS)

The modernization of the seismic network in Korea was motivated by the Youngweol (1996, Ml 4.5) and Gyeongju (1997, Ml 4.2) earthquakes. KMA (Korea Meteorological Agency) has built 45 digital seismic stations, which compose the National Seismic Network. KEPRI (Korea Electric Power Research Institute) and KINS (Korea Institute of Nuclear Safety) have also built 15 and 4 digital seismic stations, respectively. KIGAM (Korea Institute of Geoscience and Mineral Resources) had built 37 stations by 2008, including the Hyodongri complex seismic observatory, which hosts GPS, geomagnetic, and borehole seismic systems. Since 2002, the Korea Integrated Seismic System (KISS) has played the main role in real-time seismic data exchange between the different seismic networks operated by four earthquake monitoring institutes: KMA, KEPRI, KINS, and KIGAM. Seismic data from the different seismic networks are gathered into the data pool of KISS, from which clients can receive data in real time. Before expanding and modernizing the Korean seismic stations, the consortium of the four institutes established standard criteria for seismic observation, such as instrument, data format, and communication protocol, for the purpose of integrating the seismic networks. More than 150 digital stations (velocity or accelerometer) installed from 1998 to 2008 in Korea could be easily linked to KISS in real time thanks to these standard criteria. When a large earthquake occurs, the observed peak acceleration value can be used as the instrumental intensity at the local site, and the distribution of peak accelerations roughly shows the severity of the damaged area. Real Time Intensity Color Mapping (RTICOM) was developed to generate, every second, a contour map of the nationwide intensity based on the peak acceleration values retrieved through KISS from local stations. RTICOM can be used for rapid evaluation of the intensity and for decision making against earthquake damage.
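The peak-acceleration-to-intensity step that a system like RTICOM relies on can be sketched with a regression of instrumental intensity on the logarithm of peak ground acceleration. The Wald-et-al.-style coefficients below are illustrative assumptions on our part; the abstract does not state which relation RTICOM actually uses.

```python
import math

def pga_to_intensity(pga_cm_s2):
    # Instrumental intensity from peak ground acceleration (cm/s^2),
    # using an illustrative regression MMI = 3.66 log10(PGA) - 1.66,
    # clipped to the I-X range. Coefficients are assumptions, not RTICOM's.
    if pga_cm_s2 <= 0.0:
        return 1.0
    mmi = 3.66 * math.log10(pga_cm_s2) - 1.66
    return max(1.0, min(10.0, mmi))

for pga in (1.0, 10.0, 100.0):
    print(round(pga_to_intensity(pga), 2))  # 1.0, 2.0, 5.66
```

Applying such a mapping station by station, then interpolating between stations, yields the nationwide intensity contour map the abstract describes.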

Park, J. H.; Chi, H. C.; Lim, I. S.; Kim, G. Y.

2009-04-01

323

AUTOMATING SHALLOW SEISMIC IMAGING  

EPA Science Inventory

Our current EMSP project continues an effort begun in 1997 to develop ultrashallow seismic imaging as a cost-effective method applicable to DOE facilities. The objective of the present research is to refine and demonstrate the use of an automated method of conducting shallow seis...

324

Hanford Seismic Network  

SciTech Connect

This report describes the Hanford Seismic Network. The network consists of two instrument arrays: seismometers and strong motion accelerometers. The seismometers determine the location and magnitude of earthquakes, and the strong motion accelerometers determine ground motion. Together, these instrument arrays comply with the intent of DOE Order 5480.20, Natural Phenomena Hazards Mitigation.

Reidel, S.P.; Hartshorn, D.C.

1997-05-01

325

Seismic gaps and earthquakes  

Microsoft Academic Search

McCann et al. [1979] published a widely cited "seismic gap" model ascribing earthquake potential categories to 125 zones surrounding the Pacific Rim. Nishenko [1991] published an updated and revised version including probability estimates of characteristic earthquakes with specified magnitudes within each zone. These forecasts are now more than 20 and 10 years old, respectively, and sufficient data now exist to

Yufang Rong; David D. Jackson; Yan Y. Kagan

2003-01-01

327

Seismic on screen  

SciTech Connect

This book discusses methods for interpreting seismic data on computer screens and highlights various functions and the several ways they can be performed, as well as the types of hardware that can perform these functions. Vital information for the geophysicist, petroleum engineer, and geologist. Also an authoritative college text.

Coffeen, J.A.

1990-01-01

328

Sub-seismic Deformation Prediction of Potential Pathways and Seismic Validation - The Joint Project PROTECT  

NASA Astrophysics Data System (ADS)

The joint project PROTECT (PRediction Of deformation To Ensure Carbon Traps) aims to determine the existence and characteristics of sub-seismic structures that can potentially link deep reservoirs with the surface in the framework of CO2 underground storage. The research provides a new approach to assessing the long-term integrity of storage reservoirs. The objective is predicting and quantifying the distribution and the amount of sub-seismic and seismic strain caused by fault movement in the proximity of a CO2 storage reservoir. The study is developing tools and workflows which will be tested at the CO2CRC Otway Project Site in the Otway Basin in south-western Victoria, Australia. For this purpose, we are building a geometrical kinematic 3-D model based on 2-D and 3-D seismic data that are provided by the Australian project partner, the CO2CRC Consortium. By retro-deforming the modeled subsurface faults in the inspected subsurface volume, we can determine the accumulated sub-seismic deformation and thus the strain variation around the faults. Depending on lithology, the calculated strain magnitude and its orientation can be used as an indicator of fracture density. Furthermore, from the complete 3-D strain tensor we can predict the orientation of fractures at the sub-seismic scale. In areas where we have preliminarily predicted critical deformation, we will acquire new near-surface, high-resolution P- and S-wave 2-D seismic data in November of this year in order to verify and calibrate our model results. Here, novel and parameter-based model building will especially benefit from extracting velocities and elastic parameters from VSP and other seismic data. Our goal is to obtain a better overview of possible fluid migration pathways and communication between reservoir and overburden. Thereby, we will provide a tool for prediction and adapted time-dependent monitoring strategies for subsurface storage in general, including scientific visualization capabilities.
Acknowledgement This work was sponsored in part by the Australian Commonwealth Government through the Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC). PROTECT (PRediction Of deformation To Ensure Carbon Traps) is funded through the Geotechnologien Programme (grant 03G0797) of the German Ministry for Education and Research (BMBF). The PROTECT research group consists of Leibniz Institute for Applied Geophysics in Hannover, Technical University Darmstadt, Helmholtz-Zentrum für Umweltforschung in Leipzig, Trappe Erdöl Erdgas Consultant in Isernhagen (all Germany), and Curtin University in Perth, Australia.

Krawczyk, C. M.; Kolditz, O.

2013-12-01

329

A Hammer-Impact, Aluminum, Shear-Wave Seismic Source  

USGS Publications Warehouse

Near-surface seismic surveys often employ hammer impacts to create seismic energy. Shear-wave surveys using horizontally polarized waves require horizontal hammer impacts against a rigid object (the source) that is coupled to the ground surface. I have designed, built, and tested a source made out of aluminum and equipped with spikes to improve coupling. The source is effective in a variety of settings, and it is relatively simple and inexpensive to build.

Haines, Seth S.

2007-01-01

330

Autonomous, continuously recording broadband seismic stations at high-latitude  

Microsoft Academic Search

IRIS PASSCAL is in the third year of an NSF-funded development and acquisition effort to establish a pool of cold-hardened seismic stations specifically for high-latitude broadband deployments. We have two complete years of field trials and have successfully recorded continuous seismic data during both years, with data recovery rates of ~90%. Our design is premised on a 2W autonomous

B. Beaudoin; T. Parker; B. Bonnett; G. Tytgat; K. Anderson; J. Fowler

2009-01-01

331

An application of Marquardt's procedure to the seismic inverse problem  

SciTech Connect

The seismic inverse problem is to infer characteristics of the subsurface from measurements of the wave field at the surface. The Marquardt procedure offers one approach to this problem. In applying this procedure, a linear relationship is developed between the wave field and some parameter which describes a physical property of the subsurface. Then a selection criterion is designed to choose the subsurface parameter which provides the best match for the observed seismic data.
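The Marquardt (Levenberg-Marquardt) update alternates between Gauss-Newton and gradient-descent behavior through an adaptive damping factor. A generic sketch on a toy one-parameter model (the exponential "slowness" model, names, and data are our illustrations, not the paper's):

```python
import numpy as np

def marquardt(f, jac, p0, x, y, lam=1e-2, n_iter=50):
    # Minimal Levenberg-Marquardt loop: damped Gauss-Newton steps with an
    # adaptive damping factor lam (a generic sketch, not the paper's method).
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = y - f(x, p)                     # residuals against observed data
        J = jac(x, p)                       # Jacobian of the model
        A = J.T @ J + lam * np.eye(len(p))  # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        if np.sum((y - f(x, p + step)) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam * 0.5    # accept: act more like Gauss-Newton
        else:
            lam *= 2.0                      # reject: act more like gradient descent
    return p

# Toy "subsurface parameter" fit: recover s in y = exp(-s * x)
x = np.linspace(0.0, 4.0, 40)
y = np.exp(-1.5 * x)                        # synthetic, noise-free observations
f = lambda x, p: np.exp(-p[0] * x)
jac = lambda x, p: (-x * np.exp(-p[0] * x)).reshape(-1, 1)
print(round(float(marquardt(f, jac, [0.5], x, y)[0]), 3))  # 1.5
```

The selection criterion the abstract mentions plays the role of the residual comparison here: among candidate parameters, keep the one that best matches the observed data.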

Keys, R.G.

1986-03-01

332

A case study: Time-lapse seismic monitoring of a thin heavy oil reservoir  

NASA Astrophysics Data System (ADS)

This thesis presents a case study on time-lapse seismic monitoring. The target area, at East Senlac near the Alberta-Saskatchewan border, is a heavy oil reservoir in the Western Canadian Sedimentary Basin. To observe rock-property-related seismic anomalies, two perpendicular seismic lines were set up: one along the N-S direction, which is subject to Steam Assisted Gravity Drainage (SAGD), and one along the W-E direction, which is not affected. The case study covers feasibility analysis, processing strategy, repeatability evaluation, seismic attribute analysis, and impedance inversion. A systematic feasibility study is conducted through prediction of rock properties based on Gassmann's equation, technical risk assessment, forward modelling, and seismic survey design. The first-stage simulation of oil substitution by steam indicates that a time-lapse seismic monitoring project is feasible, though significant challenges may be encountered; continuous gas injection barely induces seismic variations. In seismic data processing, better seismic quality is obtained by employing the prestack simultaneous processing (PSP) strategy. Three metrics, Pearson correlation, normalized root-mean-square difference, and predictability, are employed to quantify post-stack seismic repeatability. Higher repeatability along the W-E direction than along the N-S direction reflects differences in the local geological environment, and the non-uniform CMP stack fold distribution is found to be the main factor affecting repeatability. Power spectra calculated from the N-S surveys demonstrate that higher-frequency energy tends to increase with time, consistent with a possible decrease in pore pressure and pore temperature. On the other hand, impedance inverted using the recently proposed hybrid data transformation shows mixed variations; continuous gas injection and the simultaneous drop in temperature and pressure are the most likely causes of these mixed impedance variations.
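The repeatability metrics named in the abstract are commonly defined as below (the Kragh and Christie convention for NRMS and predictability). This sketch assumes that convention rather than the thesis's exact definitions:

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(np.square(x)))

def nrms(a, b):
    """Normalized RMS difference (percent) between two traces.
    0% = identical traces; 200% = equal-energy, opposite-polarity traces."""
    return 200.0 * rms(a - b) / (rms(a) + rms(b))

def predictability(a, b):
    """PRED metric: cross-correlation energy relative to the
    product of autocorrelations (1.0 for perfectly repeatable data)."""
    cab = np.correlate(a, b, mode="full")
    caa = np.correlate(a, a, mode="full")
    cbb = np.correlate(b, b, mode="full")
    return np.sum(cab ** 2) / np.sum(caa * cbb)
```

In practice both metrics are evaluated in sliding time windows along baseline and monitor traces, not over whole traces at once.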

Zhang, Yajun

333

Implementational Issues for Verifying RISC Pipeline Conflicts in HOL  

E-print Network

Kumar. University of Karlsruhe, Institute of Computer Design and Fault Tolerance (Prof. D. Schmid); Department of Automation in Circuit Design, Haid-und-Neu-Straße 10-14, 76131 Karlsruhe, Germany; e-mail: kumar@fzi.de. Abstract: We outline a general methodology for the formal verification of instruction pipelines in RISC

Tahar, Sofiène

334

Hanford quarterly seismic report -- 97A seismicity on and near the Hanford Site, Pasco Basin, Washington, October 1, 1996 through December 31, 1996  

SciTech Connect

Seismic Monitoring is part of PNNL's Applied Geology and Geochemistry Group. The Seismic Monitoring Analysis and Repair Team (SMART) operates, maintains, and analyzes data from the Hanford Seismic Network (HSN), extending the site's historical seismic database and fulfilling US Department of Energy, Richland Operations Office requirements and orders. The SMART also maintains the Eastern Washington Regional Network (EWRN). The University of Washington uses the data from the EWRN and other seismic networks in the Northwest to provide the SMART with the regional input needed for seismic hazards analysis at the Hanford Site. The SMART is tasked to provide an uninterrupted collection of high-quality raw seismic data from the HSN located on and around the Hanford Site. These unprocessed data are permanently archived. SMART also is tasked to locate and identify sources of seismic activity, monitor changes in the historical pattern of seismic activity at the Hanford Site, and build a local earthquake database (processed data) that is permanently archived. Local earthquakes are defined as earthquakes that occur between 46 degrees and 47 degrees north latitude and 119 degrees and 120 degrees west longitude. The data are used by the Hanford contractor for waste management activities, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site.

Hartshorn, D.C.; Reidel, S.P.

1997-02-01

335

New seismic codes and their impact on the acoustician  

NASA Astrophysics Data System (ADS)

New seismic building codes for HVAC and electrical equipment, pipes, ducts, and conduits are being adopted nationwide. These codes affect the way acousticians practice their profession. Recently published model codes (such as IBC, NFPA, ASCE and NBC T1809-4) specify systems that require documented seismic protection. Specific performance and prescriptive code provisions that affect acoustical system applications, and how they can be made to comply, are included. Key terms in these codes (life safety, essential, seismic use group, category, and importance factor) are explained and illustrated. A table listing major code seismic demand formulas (the horizontal static seismic force acting at the center of gravity of the equipment, pipe, duct, or conduit) is a useful reference. A table defining which HVAC systems require static or dynamic analysis, based on seismic use group, design category, and importance factor, is provided. A discussion of code-mandated Certificates of Compliance for both mountings and equipment is included and may affect acoustical decisions. New codes may require that engineers, architects, and acousticians use seismic restraints with acoustical ceilings, floating floors, resilient pipe and duct supports, HVAC equipment, and architectural items. The "how to" for all of this is presented with tables, details, and graphs.
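As an illustration of the kind of horizontal static seismic demand formula such tables list, here is a sketch following the ASCE 7-style component force equation with its upper and lower bounds. The default coefficients (`ap`, `Rp`, `Ip`) are assumptions for illustration; actual values must come from the governing code for the specific component.

```python
def horizontal_seismic_force(Wp, SDS, z, h, ap=1.0, Rp=2.5, Ip=1.0):
    """ASCE 7-style horizontal static seismic force on a nonstructural
    component, acting at its center of gravity.

    Wp:  component operating weight
    SDS: short-period design spectral acceleration
    z:   attachment height; h: roof height of the structure
    ap:  component amplification factor (code-tabulated)
    Rp:  component response modification factor (code-tabulated)
    Ip:  component importance factor
    """
    Fp = 0.4 * ap * SDS * Wp * (1.0 + 2.0 * z / h) / (Rp / Ip)
    # Code-prescribed bounds on the design force
    return min(max(Fp, 0.3 * SDS * Ip * Wp), 1.6 * SDS * Ip * Wp)
```

For a roof-mounted unit (z = h) with SDS = 1.0 and the default coefficients, the formula gives Fp = 0.48 Wp, inside the 0.3-1.6 SDS·Ip·Wp bounds.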

Lama, Patrick J.

2005-09-01

336

First Quarter Hanford Seismic Report for Fiscal Year 1999  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. It also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the first quarter of FY99 for stations in the HSN was 99.8%. There were 121 triggers during the first quarter of fiscal year 1999. Fourteen triggers were local earthquakes: seven (50%) were in the Columbia River Basalt Group, none occurred in the pre-basalt sediments, and seven (50%) were in the crystalline basement. One earthquake (7%) occurred near or along the Horn Rapids anticline, seven (50%) occurred in a known swarm area, and six (43%) were random occurrences. No earthquakes triggered the Hanford Strong Motion Accelerometer during the first quarter of FY99.

DC Hartshorn; SP Reidel; AC Rohay

1999-05-26

337

Verifying and Validating Proposed Models for FSW Process Optimization  

NASA Technical Reports Server (NTRS)

This slide presentation reviews Friction Stir Welding (FSW) and attempts to model the process in order to optimize and improve it. Studies are ongoing to validate and refine the model of metal flow in the FSW process. Slides show the conventional FSW process, a couple of weld tool designs, and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure Features, (2) Flow Streamlines, (3) Steady-State Nature, and (4) Grain Refinement Mechanisms.

Schneider, Judith

2008-01-01

338

Verifying operator fitness - an imperative not an option  

SciTech Connect

In the early morning hours of April 26, 1986, whatever credence those who operate nuclear power plants around the world could then muster suffered a jarring reversal. Through an incredible series of personal errors, the operators at what had been termed one of the best-operated plants in the USSR systematically stripped away the physical and procedural safeguards inherent to their installation and precipitated the worst reactor accident the world has yet seen. This challenge to the adequacy of nuclear operators comes at a time when many companies throughout the world, not only those involved with nuclear power, are grappling with how to assure the fitness for duty of those in their employ, specifically users of substances that impair the ability to function safely and productively in the workplace. In actuality, operator fitness for duty is far more than the absence of impairment from substance abuse, which is how many today define it. Full fitness for duty implies mental and moral fitness as well, and physical fitness in a more general sense. If we are to earn the confidence of the public, credible ways to verify total fitness on an operator-by-operator basis must be considered.

Scott, A.B. Jr.

1987-01-01

339

A credit card verifier structure using diffraction and spectroscopy concepts  

NASA Astrophysics Data System (ADS)

We propose and experimentally demonstrate an angle-multiplexing-based optical structure for verifying a credit card. Our key idea comes from the fact that the fine detail of the embossed hologram stamped on the credit card is hard to duplicate, and therefore its key color features can be used to distinguish real cards from counterfeit ones. As the embossed hologram is a diffractive optical element, we shine a number of broadband light sources, one at a time and each at a different incident angle, on the embossed hologram of the credit card, so that for each incident angle a different color spectrum is diffracted and separated in space. The number of pixels in each color plane is then measured. We then apply a feed-forward back-propagation neural network to separate counterfeit credit cards from real ones. Our experimental demonstration, using two off-the-shelf broadband white light-emitting diodes, one digital camera, a 3-layer neural network, and a notebook computer, identified all 69 counterfeit credit cards among eight real credit cards.

Sumriddetchkajorn, Sarun; Intaravanne, Yuttana

2008-04-01

340

Verifying Linearizability with Hindsight Peter W. O'Hearn  

E-print Network

For concreteness, the concurrent set is near the leading edge of concurrent programming: interactions between operations in different threads. By design, to be impervious to interference, every thread

Rinetzky, Noam

341

Seismic vulnerability of older reinforced concrete frame structures in Mid-America  

E-print Network

This research quantifies the seismic vulnerability of older reinforced concrete frame structures located in Mid-America. After designing a representative three-story gravity-load-designed reinforced concrete frame structure, a nonlinear analytical...

Beason, Lauren Rae

2004-09-30

342

Seismic hazard assessment in Aswan, Egypt  

NASA Astrophysics Data System (ADS)

The study of earthquake activity and seismic hazard assessment around Aswan is very important due to the proximity of the Aswan High Dam. The Aswan High Dam is founded on hard Precambrian bedrock and is considered the most important project in Egypt from the social, agricultural and electrical energy production points of view. The seismotectonic settings around Aswan strongly suggest that medium to large earthquakes are possible, particularly along the Kalabsha, Seiyal and Khor El-Ramla faults. The seismic hazard for Aswan is calculated utilizing the probabilistic approach within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for eight ground motion spectral periods and for a return period of 475 years, which is deemed appropriate for structural design standards in the Egyptian building codes. The results were also displayed as uniform hazard spectra for rock sites at the Aswan High Dam for return periods of 475 and 2475 years. In addition, the ground-motion levels are deaggregated at the dam site, in order to provide insight into which events are the most important for hazard estimation. The peak ground acceleration ranges between 36 and 152 cm/s^2 for a return period of 475 years (equivalent to a 90% probability of non-exceedance in 50 years). Spectral hazard values clearly indicate that, compared with countries of high seismic risk, the seismicity in the Aswan region can be described as low at most sites and moderate in the area between the Kalabsha and Seiyal faults.
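The equivalence between a 475-year return period and a 90% probability of non-exceedance in 50 years follows from a Poisson occurrence model. A minimal sketch (the function name is illustrative):

```python
import math

def return_period(p_exceed, t_years):
    """Return period T implied by an exceedance probability over a
    time window, assuming Poisson occurrence: p = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - p_exceed)
```

With p_exceed = 0.10 (i.e. 90% non-exceedance) and t_years = 50 this gives about 475 years, and p_exceed = 0.02 gives the 2475-year level also used in the abstract.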

Deif, A.; Hamed, H.; Ibrahim, H. A.; Abou Elenean, K.; El-Amin, E.

2011-12-01

343

Seismic Eruption Teaching Modules  

NSDL National Science Digital Library

This site presents educational modules for teaching about earthquakes, volcano eruptions and related plate tectonic concepts using an interactive computer program for mapping called Seismic/Eruption (also called SeisVolE). The program includes up-to-date earthquake and volcanic eruption catalogs and allows the user to display earthquake and volcanic eruption activity in "speeded up real time" on global, regional or local maps that also show the topography of the area in a shaded relief map image. SeisVolE is an interactive program that includes a number of tools that allow the user to analyze earthquake and volcanic eruption data and produce effective displays to illustrate seismicity and volcano patterns. The program can be used to sort data and provide results for statistical analysis, to generate detailed earthquake and volcano activity maps of specific areas or for specific purposes, to investigate earthquake sequences such as foreshocks and aftershocks, and to produce cross section or 3-D perspective views of earthquake locations. The Seismic/Eruption program can be a powerful and effective tool for teaching about plate tectonics and geologic hazards using earthquake and volcano locations, and for learning (or practicing) fundamental science skills such as statistical analysis, graphing, and map skills. The teaching modules describe and illustrate how to use the Seismic/Eruption program effectively in demonstrations, classroom presentations and interactive presentations, and independent study/research. Because the program has many useful options and can be used to examine earthquake activity and volcanic eruption data, the modules provide instructions and examples of quantitative analysis, graphing of results, creating useful maps and cross section diagrams, and performing in-depth exploration and research. 
The examples are intended to illustrate the features and capabilities of the program and stimulate interest in using the program for discovery learning in Earth science, especially earthquakes, volcanoes and plate tectonics.

Braile, Lawrence

344

Precursory seismic quiescence  

Microsoft Academic Search

Seventeen cases of precursory seismic quiescence to mainshocks with magnitudes from ML = 4.7 to MS = 8.0 are summarized. The amount of rate decrease ranges from 45% to 90%. The significance of these changes varies between 90% and 99.99%. The assumption that the background rate is approximately constant is fulfilled in most crustal volumes studied. All quiescence anomalies seem to have abrupt beginnings, and the

Max Wyss; R. E. Habermann

1988-01-01

345

The Practice of Seismic Management in Mines— How to Love your Seismic Monitoring System  

Microsoft Academic Search

Eleven years of seismic monitoring has been the foundation for the current understanding and management of seismicity at Mt Charlotte mine. Knowledge gained by linking seismic monitoring to seismic management has allowed the mine to survive. Seismic events possess elements of both fractal and chaotic behaviour. This gives a valid basis, firstly, to collect seismic data, and secondly, to expect that

P. A. Mikula (Kalgoorlie Consolidated Gold Mines)

346

The Algerian Seismic Network: Performance from data quality analysis  

NASA Astrophysics Data System (ADS)

Seismic monitoring in Algeria changed greatly after the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of the new Algerian Digital Seismic Network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66: 15 broadband, 2 very broadband, and 47 short-period sensors, plus 21 accelerometers, connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional and tele-seismic events have been located by automatic processing, revised, and archived in databases. This new data set is characterized by accurate automatic locations of local seismicity and the ability to determine focal mechanisms. Periodically, recorded data, including earthquakes, calibration pulses and cultural noise, are checked using PSD (Power Spectral Density) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass positions of the sensors, and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network supports many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messaging, moment tensor estimation, seismic source determination, shakemap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. Over the next two years, the acquisition of new seismic equipment to reach 50 new BB stations will densify the network and enhance the performance of the Algerian Digital Seismic Network.
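A minimal, NumPy-only sketch of the PSD computation underlying such noise-level checks (a single Hann-windowed periodogram standing in for the Welch/PDF analysis that tools like PQLX perform; the function and its normalization are illustrative assumptions, not ADSN code):

```python
import numpy as np

def psd(trace, fs):
    """One-sided power spectral density of a seismic trace via a
    Hann-windowed periodogram.

    trace: 1-D array of samples; fs: sampling frequency in Hz.
    Returns (frequencies, PSD) with PSD in units^2/Hz.
    """
    trace = trace - np.mean(trace)          # remove the DC offset
    n = len(trace)
    window = np.hanning(n)
    spec = np.fft.rfft(trace * window)
    # Normalize by the window power so PSD levels stay comparable
    # across window choices.
    p = (np.abs(spec) ** 2) / (fs * np.sum(window ** 2))
    p[1:-1] *= 2.0                          # fold in negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, p
```

Averaging such periodograms over many overlapping segments (Welch's method), and then histogramming them over days of data, yields the probability density functions used for station quality control.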

Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

2013-04-01

347

Alternate approaches to verifying the structural adequacy of the Defense High Level Waste Shipping Cask  

SciTech Connect

In the early 1980s, the US Department of Energy/Defense Programs (DOE/DP) initiated a project to develop a safe and efficient transportation system for defense high level waste (DHLW). A long-standing objective of the DHLW transportation project is to develop a truck cask that represents the leading edge of cask technology as well as one that fully complies with all applicable DOE, Nuclear Regulatory Commission (NRC), and Department of Transportation (DOT) regulations. General Atomics (GA) designed the DHLW Truck Shipping Cask using state-of-the-art analytical techniques verified by model testing performed by Sandia National Laboratories (SNL). The analytical techniques include two approaches, inelastic analysis and elastic analysis. This topical report presents the results of the two analytical approaches and the model testing results. The purpose of this work is to show that there are two viable analytical alternatives to verify the structural adequacy of a Type B package and to obtain an NRC license. In addition, these data will help to support the future acceptance by the NRC of inelastic analysis as a tool in packaging design and licensing.

Zimmer, A.; Koploy, M.

1991-12-01

348

Automating Shallow Seismic Imaging  

SciTech Connect

Our primary research focus during the current three-year period of funding has been to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Recent tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' have shown that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under all conditions (please see the Status and Results of Experiments sections for details). In some easy-access environments, this device is expected to make shallow seismic surveying considerably more efficient and less expensive. Another element of our research plan involved monitoring the cone of depression around a pumping well, with the well serving as a proxy location for fluid-flow at a contaminated DOE site. To try to achieve that goal, we collected data from a well site at which drawdown equilibrium had been reached and at another site during a pumping test. Data analysis disclosed that although we were successful in imaging the water table using seismic reflection techniques (Johnson, 2003), we were not able to explicitly delineate the cone of depression (see Status and Results of Experiments).

Steeples, Don W.

2003-06-01

349

Software for Verifying Image-Correlation Tie Points  

NASA Technical Reports Server (NTRS)

A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes imperfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present program relies on the availability of a left-to-right correlation map in addition to the usual right-to-left correlation map. The additional map must be generated, which doubles the processing time. This increased time can now be afforded in the data-processing pipeline, since map generation has been reduced from about 60 minutes to 3 minutes by the parallelization discussed in the previous article; parallel cluster processing thus enabled this better science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x",y") in the left image. If (x,y) and (x",y") are identical, the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that admits round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.
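The forward-backward consistency check described above can be sketched as follows, with each correlation map represented as a dictionary from pixel coordinates to pixel coordinates (an illustrative data structure, not the program's actual representation):

```python
def verify_tie_points(left_to_right, right_to_left, tol=1):
    """Keep only tie points whose left -> right -> left round trip
    returns within `tol` pixels of the starting point.

    `tol` plays the role of the error window that admits round-off
    error and a small amount of noise; tol=0 is the perfect-match
    criterion.
    """
    verified = {}
    for (x, y), (xp, yp) in left_to_right.items():
        back = right_to_left.get((xp, yp))
        if back is None:
            continue                 # point has no reverse mapping
        xb, yb = back
        if abs(xb - x) <= tol and abs(yb - y) <= tol:
            verified[(x, y)] = (xp, yp)
    return verified
```

Repeating the check after pruning, as the abstract notes, removes any remaining points in either image that are not connected to points in the other.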

Klimeck, Gerhard; Yagi, Gary

2008-01-01

350

Verifying and Postprocesing the Ensemble Spread-Error Relationship  

NASA Astrophysics Data System (ADS)

With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well calibrated? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Second, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit provides a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member western US temperature forecasts for the U.S. Army Test and Evaluation Command, and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships. We describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecasts, and present verification statistics using error-spread metrics, along with figures from operational ensemble forecasts before and after calibration.
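A minimal sketch of the basic spread-error correlation these diagnostics build on (illustrative only; the paper's new metrics relating the correlation to its theoretical upper limit are not reproduced here):

```python
import numpy as np

def spread_error_stats(ensemble, obs):
    """Compute the spread-error relationship for an ensemble archive.

    ensemble: array of shape (n_forecasts, n_members)
    obs:      array of shape (n_forecasts,) of verifying observations
    Returns (spread, error, corr): the per-forecast ensemble spread,
    the absolute error of the ensemble mean, and their Pearson
    correlation.
    """
    spread = ensemble.std(axis=1, ddof=1)            # dispersion
    error = np.abs(ensemble.mean(axis=1) - obs)      # skill proxy
    corr = np.corrcoef(spread, error)[0, 1]
    return spread, error, corr
```

A high correlation indicates that forecasts with larger spread really do tend to have larger errors, which is the property a variable-dispersion EPS must exhibit to justify its cost.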

Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli

2013-04-01

351

Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran  

SciTech Connect

This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It provides probabilistic estimates of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), the most common criterion in building design. A catalogue of seismic events that includes both historical and instrumental events was developed, covering the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps were prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels, using SEISRISK III software.

Amiri, G. Ghodrati [Iran University of Science and Technology--Islamic Azad University of Shahrekord, Narmak, Tehran 16846 (Iran, Islamic Republic of); Dehkordi, M. Raeisi [Department of Civil Engineering, Islamic Azad University of Shahrekord (Iran, Islamic Republic of); Amrei, S. A. Razavian [College of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Kamali, M. Koohi [Department of Civil Engineering, Islamic Azad University of Shahrekord (Iran, Islamic Republic of)

2008-07-08

352

IDMS: A System to Verify Component Interface Completeness and Compatibility for Product Integration  

NASA Astrophysics Data System (ADS)

The growing adoption of Component-Based Software Development has had a great impact on today's system architectural design. However, subsystem designs that lack interoperability and reusability can cause problems during product integration; at worst, this may result in project failure. The literature suggests that verification of interface descriptions and management of interface changes are essential to the success of the product integration process. This paper therefore presents an automated approach to facilitate reviewing component interfaces for completeness and compatibility. The Interface Descriptions Management System (IDMS) has been implemented to ease and speed up interface review activities, using UML component diagrams as input. Interface compatibility is verified by traversing a component dependency graph called the Component Compatibility Graph (CCG), a visualization in which each node represents a component and each edge represents communication between associated components. Three case studies were conducted to subjectively evaluate the correctness and usefulness of IDMS.
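An interface completeness and compatibility check over a component dependency graph can be sketched as follows. The dictionary schema and the signature-string matching are illustrative assumptions, not the IDMS implementation:

```python
from collections import defaultdict

def build_ccg(components):
    """Build a component compatibility graph and report problems.

    components: name -> {"provides": {iface: signature},
                         "requires": {iface: signature}}
    Returns (edges, problems): edges maps each component to the set of
    components it can legally depend on; problems lists missing
    providers (incompleteness) and signature mismatches
    (incompatibility).
    """
    providers = {}
    for name, c in components.items():
        for iface, sig in c.get("provides", {}).items():
            providers[iface] = (name, sig)

    edges, problems = defaultdict(set), []
    for name, c in components.items():
        for iface, sig in c.get("requires", {}).items():
            if iface not in providers:
                problems.append(f"{name}: no provider for {iface}")
            elif providers[iface][1] != sig:
                problems.append(f"{name}: signature mismatch on {iface}")
            else:
                edges[name].add(providers[iface][0])
    return dict(edges), problems
```

In a real tool the required/provided interfaces would be extracted from UML component diagrams, and signature comparison would be structural rather than string equality.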

Areeprayolkij, Wantana; Limpiyakorn, Yachai; Gansawat, Duangrat

353

Seismic databases of The Caucasus  

NASA Astrophysics Data System (ADS)

The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring for a better understanding of the tectonic processes under way there. The Seismic Monitoring Center of Georgia (Ilia State University) operates the country's digital seismic network and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily reachable, and convenient for scientists to use. The seismological database includes information on more than 100,000 earthquakes from the whole Caucasus, drawn from both analog and digital seismic networks. The first analog seismic station in the Caucasus was installed in 1899 in Tbilisi. The number of analog stations increased over the following decades, and in the 1980s about 100 analog stations operated across the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 just two analog instruments were operating. A new digital seismic network has been developed in Georgia since 2003; the number of digital stations has grown, and today more than 25 digital stations operate in the country. The database includes detailed information about all equipment installed at the seismic stations and is available online. This provides a convenient interface for seismic data exchange between the neighboring Caucasus countries. It also simplifies both processing seismic data and transferring them to the database, and reduces operator mistakes during routine work. The database was created using php, MySQL, Javascript, Ajax, GMT, Gmap, and Hypoinverse.

Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.

2012-12-01

354

UNIVERSIT AT AUGSBURG Verifying a Stack with Hazard Pointers  

E-print Network

Reuse of objects that have been removed from the data structure imposes significant additional challenges on design. Concurrent reuse of locations introduces a further fundamental problem of lock-free algorithms, the ABA problem. Memory reclamation and ABA-avoidance for a stack with hazard pointers has been set up as a challenge

Reif, Wolfgang

355

Verifying Communication Protocols Using Live Sequence Chart Specifications  

Microsoft Academic Search

The need for a formal verification process in System on Chip (SoC) design and Intellectual Property (IP) integration has been recognized and investigated significantly in the past. A major drawback is the lack of a suitable specification language against which definitive and efficient verification of inter-core communication can be performed to prove compliance of an IP block against the protocol

Rahul Kumar; Eric G. Mercer

2009-01-01

356

Verifying Multicast-Based Security Protocols Using the Inductive Method  

E-print Network

Multicast, designed as an efficient way of broadcasting content, is increasingly used in security protocols. Multicast is a general way of representing message casting in protocol verification, with Unicast, Anycast, and protocols that involve Byzantine Agreement [19] taking advantage of the message casting framework.

Paulson, Lawrence C.

357

Towards a uniform method for formally verifying client-server protocols  

Microsoft Academic Search

This paper describes research on creating a uniform method for the formal verification of client-server software protocols. As technology and the need for communication keep increasing, so does the need for formal verification of protocols. If a protocol is poorly designed, the consequences can range from minor inconvenience to possible loss of human life. The problem this paper discusses is

Anne Franssens

2005-01-01

358

Training patients to ask information verifying questions in medical interviews  

Microsoft Academic Search

Purpose – The main purpose of the paper was to examine whether a short patient training session on various ways of requesting physicians to clarify a piece of previously elicited information during medical consultation would improve information communication, thus increasing patient satisfaction. Design/methodology/approach – A total of 114 adult patients voluntarily participated in the study which was carried out at

Han Z. Li; Juanita Lundgren

2005-01-01

359

Using Multiple Representations to Make and Verify Conjectures  

ERIC Educational Resources Information Center

This article reports on the results of research, the objective of which was to document and analyze the manner in which students relate different representations when solving problems. A total of 20 students attending their first year of university studies took part in the study. In order to design the problem, the underlying information in each…

Garcia, Martha; Benitez, Alma

2011-01-01

360

Quiet Clean Short-haul Experimental Engine (QCSEE) Under-The-Wing (UTW) composite nacelle subsystem test report. [to verify strength of selected composite materials  

NASA Technical Reports Server (NTRS)

The element and subcomponent testing conducted to verify the under the wing composite nacelle design is reported. This composite nacelle consists of an inlet, outer cowl doors, inner cowl doors, and a variable fan nozzle. The element tests provided the mechanical properties used in the nacelle design. The subcomponent tests verified that the critical panel and joint areas of the nacelle had adequate structural integrity.

Stotler, C. L., Jr.; Johnston, E. A.; Freeman, D. S.

1977-01-01

361

Comment on "How can seismic hazard around the New Madrid seismic zone be similar to that in California?" by Arthur Frankel  

USGS Publications Warehouse

PSHA is the method used most to assess seismic hazards for input into various aspects of public and financial policy. For example, PSHA was used by the U.S. Geological Survey to develop the National Seismic Hazard Maps (Frankel et al., 1996, 2002). These maps are the basis for many national, state, and local seismic safety regulations and design standards, such as the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, the International Building Code, and the International Residential Code. Adoption and implementation of these regulations and design standards would have significant impacts on many communities in the New Madrid area, including Memphis, Tennessee and Paducah, Kentucky. Although "mitigating risks to society from earthquakes involves economic and policy issues" (Stein, 2004), seismic hazard assessment is the basis. Seismologists should provide the best information on seismic hazards and communicate them to users and policy makers. There is a lack of effort in communicating the uncertainties in seismic hazard assessment in the central U.S., however. Use of 10%, 5%, and 2% PE in 50 years causes confusion in communicating seismic hazard assessment. It would be easy to discuss and understand the design ground motions if the true meaning of the ground motion derived from PSHA were presented, i.e., the ground motion with the estimated uncertainty or the associated confidence level.
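The 10%, 5%, and 2% probability-of-exceedance (PE) levels discussed above map to return periods through the standard Poisson occurrence model used in PSHA; a minimal sketch of the conversion:

```python
import math

def return_period(pe, years=50.0):
    """Return period (years) for a given probability of exceedance
    over an exposure time, assuming Poisson occurrence:
    PE = 1 - exp(-years / T)  =>  T = -years / ln(1 - PE)."""
    return -years / math.log(1.0 - pe)

# The three code levels mentioned in the abstract:
for pe in (0.10, 0.05, 0.02):
    print(f"{pe:.0%} PE in 50 yr -> return period ~{return_period(pe):.0f} yr")
```

This recovers the familiar values: 10% in 50 years is roughly a 475-year return period, 5% roughly 975 years, and 2% roughly 2475 years.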

Wang, Z.; Shi, B.; Kiefer, J.D.

2005-01-01

362

Seismic data used to predict formation pressures  

SciTech Connect

A new set of equations helps estimate formation fluid pressures and minimum fracture pressures in liquid-filled, overpressured, soft-rock areas before any wells are drilled in the area. This paper reports on the calculation method, which uses reflection seismic data to make the estimates; these should be helpful for the initial design of mud weight and casing programs. These equations are suited for soft-rock areas containing layers of shale and friable sands, such as those in the Gulf Coast or offshore West Africa. Standard seismic interpretation procedures are used to obtain the data. The equations assume that the stronger reflection signals occur where sand bodies exist and are based on the mechanical behavior of uncemented sand.

Stein, N. (Sans Co., Boston, MA (United States))

1992-11-30

363

Verifying Stability of Dynamic Soft-Computing Systems  

NASA Technical Reports Server (NTRS)

Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that a certain class of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundation as well as building practical tools for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results from its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root-locus plots have helped conventional control design and validation.

Wen, Wu; Napolitano, Marcello; Callahan, John

1997-01-01

364

Seismic hazard from induced seismicity: effect of time-dependent hazard variables  

NASA Astrophysics Data System (ADS)

Geothermal systems are drawing large attention worldwide as an alternative source of energy. Although geothermal energy is beneficial, field operations can produce induced seismicity whose effects can range from light and unfelt to severely damaging. In a recent paper (Convertito et al., 2012), we investigated the effect of time-dependent seismicity parameters on seismic hazard from induced seismicity. The analysis considered the time variation of the b-value of the Gutenberg-Richter relationship and of the seismicity rate, and assumed a non-homogeneous Poisson model to solve the hazard integral. The procedure was tested in The Geysers geothermal area in Northern California, where commercial exploitation started in the 1960s. The analyzed dataset consists of earthquakes recorded during the period 2007 through 2010 by the LBNL Geysers/Calpine network. To test the reliability of the analysis, we applied a simple forecasting procedure which compares the estimated hazard values, expressed as ground-motion values with a fixed probability of exceedance, against the observed ground-motion values. The procedure is feasible for monitoring purposes and for calibrating the production/extraction rate to avoid adverse consequences. However, one of the main assumptions we made is that both the median predictions and the standard deviation of the ground-motion prediction equation (GMPE) are stationary. Particularly for geothermal areas, where the number of recorded earthquakes can change rapidly with time, we want to investigate how variations of the coefficients of the GMPE used, and of its standard deviation, influence the hazard estimates. Basically, we hypothesize that the physical-mechanical properties of a highly fractured medium that is continuously perturbed by field operations can produce variations of both source and medium properties that cannot be captured by a stationary GMPE. 
We assume a standard GMPE which accounts for the main effects that modify the scaling of the peak ground-motion parameters (e.g., magnitude, geometrical spreading, and anelastic attenuation). Moreover, we consider both the inter-event and intra-event components of the standard deviation. For comparison, we use the same dataset analyzed by Convertito et al. (2012), and for successive time windows we perform the regression analysis to infer the time-dependent coefficients of the GMPE. After having tested the statistical significance of the new coefficients and verified a reduction in the total standard deviation, we introduce the new model into the hazard integral. Hazard maps and site-specific analyses in terms of a uniform hazard spectrum are used to compare the new results with those obtained in our previous study, to investigate which coefficients and which components of the total standard deviation really matter for refining seismic hazard estimates for induced seismicity. Convertito et al. (2012). From Induced Seismicity to Direct Time-Dependent Seismic Hazard, BSSA 102(6), doi:10.1785/0120120036.
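The windowed regression described above can be illustrated with a toy ordinary-least-squares fit. The functional form ln Y = c0 + c1·M + c2·ln R + c3·R is a generic stand-in for a GMPE with magnitude, geometric-spreading, and anelastic-attenuation terms (the study's actual GMPE and coefficients differ), and the events here are synthetic and noise-free so the fit recovers the known coefficients:

```python
import math
import random

def design_row(M, R):
    # Generic GMPE predictors: intercept, magnitude, geometric
    # spreading ln(R), anelastic attenuation R
    return [1.0, M, math.log(R), R]

def fit_gmpe(events):
    """OLS via normal equations, solved by Gaussian elimination
    with partial pivoting. events: list of (M, R, lnY) tuples."""
    n = 4
    A = [[0.0] * n for _ in range(n)]  # X^T X
    b = [0.0] * n                      # X^T y
    for M, R, lnY in events:
        row = design_row(M, R)
        for i in range(n):
            b[i] += row[i] * lnY
            for j in range(n):
                A[i][j] += row[i] * row[j]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    c = [0.0] * n
    for i in reversed(range(n)):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c

# Synthetic events generated from known coefficients (no noise)
true_c = [-2.0, 1.5, -1.0, -0.005]
random.seed(0)
events = []
for _ in range(50):
    M = random.uniform(1.0, 4.0)
    R = random.uniform(1.0, 30.0)
    lnY = sum(t * v for t, v in zip(true_c, design_row(M, R)))
    events.append((M, R, lnY))
print([round(ci, 3) for ci in fit_gmpe(events)])
```

Refitting per time window, as in the study, would simply mean calling `fit_gmpe` on each window's events and testing whether the coefficient changes are statistically significant.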

Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

2012-12-01

365

Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992  

SciTech Connect

Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°-39°N and longitudes 87°-92°W. Most of these earthquakes occur within a 1.5° x 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used in complement with regional seismic network data and, in addition, in verifying the correctness of previously published focal mechanism solutions.

Herrmann, R.B.; Nguyen, B. [Saint Louis Univ., MO (US). Dept. of Earth and Atmospheric Sciences

1993-08-01

366

Reservoir Characterization Using Seismic Inversion Data.  

E-print Network

Reservoir architecture may be inferred from analogs and geologic concepts, seismic surveys, and well data. Stochastically inverted seismic data are uninformative about meter-scale features,…

Kalla, Subhash

2008-01-01

367

Development of a HT seismic downhole tool.  

SciTech Connect

Enhanced Geothermal Systems (EGS) require the stimulation of the drilled well, likely through hydraulic fracturing. Whether fracturing of the rock occurs by shear destabilization of natural fractures or by extensional failure of weaker zones, control of the fracture process will be required to create the flow paths necessary for effective heat mining. As such, microseismic monitoring provides one method for real-time mapping of the fractures created during the hydraulic fracturing process. This monitoring is necessary to help assess stimulation effectiveness and provide the information necessary to properly create the reservoir. In addition, reservoir monitoring of the microseismic activity can provide information on reservoir performance and evolution over time. To our knowledge, no seismic tool exists that will operate above 125 C for the long monitoring durations that may be necessary. Replacing failed tools is costly and introduces potential errors such as depth variance, etc. Sandia has designed a high temperature seismic tool for long-term deployment in geothermal applications. It is capable of detecting microseismic events and operating continuously at temperatures up to 240 C. This project includes the design and fabrication of two High Temperature (HT) seismic tools that will have the capability to operate in both temporary and long-term monitoring modes. To ensure the developed tool meets industry requirements for high sampling rates (>2ksps) and high resolution (24-bit Analog-to-Digital Converter) two electronic designs will be implemented. One electronic design will utilize newly developed 200 C electronic components. The other design will use qualified Silicon-on-Insulator (SOI) devices and will have a continuous operating temperature of 240 C.

Maldonado, Frank P.; Greving, Jeffrey J.; Henfling, Joseph Anthony; Chavira, David J.; Uhl, James Eugene; Polsky, Yarom

2009-06-01

368

Seismic Accelerogram Compatible with Design Response Spectrum.  

National Technical Information Service (NTIS)

The limitations and paucity of recorded accelerograms in Taiwan together with the widespread use of time history dynamic analysis for obtaining structural and secondary system's response are the primary motivation of this report for the development of sim...

C. C. Lin, H. H. Hung

1984-01-01

369

Seismic Design of Monolithic Bridge Abutments,  

National Technical Information Service (NTIS)

The objective of the research was to investigate the soil-structure interaction characteristics between monolithic bridge abutments and the surrounding soil. The investigation consisted of: (1) vibration tests on the Horsethief Bridge, a single span struc...

B. Hushmand, C. B. Crouse, G. Liang, G. Martin, J. Wood

1986-01-01

370

One-dimensional Seismic Analysis of a Solid-Waste Landfill  

SciTech Connect

Analysis of the seismic performance of a solid-waste landfill generally follows the same procedures used for the design of embankment dams, even though the methods and safety requirements should differ. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid-waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strengths of liner materials due to the effect of inertial forces in the refuse mass. In this paper the numerical results of a dynamic analysis are reported and analyzed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations in waste properties, fill heights, site conditions, and design rock motions.

Castelli, Francesco; Lentini, Valentina; Maugeri, Michele [Department of Civil and Environmental Engineering, University of Catania, Viale Andrea Doria no. 6, 95125, Catania (Italy)

2008-07-08

371

Elastic-Wavefield Seismic Stratigraphy: A New Seismic Imaging Technology  

SciTech Connect

The purpose of our research has been to develop and demonstrate a seismic technology that will provide the oil and gas industry a better methodology for understanding reservoir and seal architectures and for improving interpretations of hydrocarbon systems. Our research goal was to expand the valuable science of seismic stratigraphy beyond the constraints of compressional (P-P) seismic data by using all modes (P-P, P-SV, SH-SH, SV-SV, SV-P) of a seismic elastic wavefield to define depositional sequences and facies. Our objective was to demonstrate that one or more modes of an elastic wavefield may image stratal surfaces across some stratigraphic intervals that are not seen by companion wave modes and thus provide different, but equally valid, information regarding depositional sequences and sedimentary facies within that interval. We use the term elastic wavefield stratigraphy to describe the methodology we use to integrate seismic sequences and seismic facies from all modes of an elastic wavefield into a seismic interpretation. We interpreted both onshore and marine multicomponent seismic surveys to select the data examples that we use to document the principles of elastic wavefield stratigraphy. We have also used examples from published papers that illustrate some concepts better than did the multicomponent seismic data that were available for our analysis. In each interpretation study, we used rock physics modeling to explain how and why certain geological conditions caused differences in P and S reflectivities that resulted in P-wave seismic sequences and facies being different from depth-equivalent S-wave sequences and facies across the targets we studied.

Bob A. Hardage; Milo M. Backus; Michael V. DeAngelo; Sergey Fomel; Khaled Fouad; Robert J. Graebner; Paul E. Murray; Randy Remington; Diana Sava

2006-07-31

372

Geophysics I. Seismic Methods  

SciTech Connect

During the past two decades, the technology of geophysics has exploded. At the same time, the petroleum industry has been forced to look for more and more subtle traps in more and more difficult terrain. The choice of papers in this geophysics reprint volume reflects this evolution. The papers were chosen to help geologists, not geophysicists, enhance their knowledge of geophysics. Math-intensive papers were excluded because those papers are relatively esoteric and have limited applicability for most geologists. This volume concentrates on different seismic survey methods. Each of the 38 papers was abstracted and indexed for the U.S. Department of Energy's Energy Data Base.

Beaumont, E.A.; Foster, N.H. (comps.)

1989-01-01

373

MERCURIO: An Interaction-oriented Framework for Designing, Verifying and Programming  

E-print Network

the conformance of an agent specification (or of its run-time behavior) to a protocol. In open environments to FIPA ACL [3]. According to the FIPA ACL mentalistic approach, the semantics of messages is given in terms

Mascardi, Viviana

374

Probabilistic versus deterministic seismic hazard analysis: an integrated approach for siting problems  

Microsoft Academic Search

A methodology is proposed to determine design earthquakes for site-specific studies such as the siting of critical structures (power plants, waste disposals, large dams, etc.), strategic structures (fire stations, military commands, hospitals, etc.), or for seismic microzoning studies, matching the results of probabilistic seismic hazard analyses. This goal is achieved by calculating the source contribution to hazard and the magnitude–distance

R Romeo; A Prestininzi

2000-01-01

375

BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM  

Microsoft Academic Search

The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design

G. DEGRASSI; C. HOFMAYER; C. MURPHY; K. SUZUKI; Y. NAMITA

2003-01-01

376

Seismic viscoelastic attenuation Submitted to  

E-print Network

the resultant elastic deformation (strain) in the material lags in time behind the applied stress induced by the wave. The amplitude decay over a travel time T of a seismic body wave is given by the expression exp(-πfT/Q). The apparent Q combines the energy lost to heat… Why do seismic waves attenuate? The attenuation of seismic waves is due to three effects: geometric spreading, intrinsic
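The decay factor quoted in this snippet is, in its standard form, exp(-πfT/Q) for intrinsic attenuation of a body wave; it is straightforward to evaluate. A minimal sketch (the example frequency, travel time, and Q value are illustrative, not from the paper):

```python
import math

def attenuation_factor(f, travel_time, Q):
    """Amplitude reduction factor exp(-pi * f * T / Q) for a body wave
    of frequency f (Hz) after traveling for T seconds through a medium
    with quality factor Q (intrinsic attenuation only; geometric
    spreading and scattering are separate effects)."""
    return math.exp(-math.pi * f * travel_time / Q)

# Example: a 1 Hz wave, 100 s travel time, Q = 600
print(attenuation_factor(1.0, 100.0, 600.0))
```

Higher frequencies decay faster for the same path, which is why distant or deep arrivals look low-pass filtered.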

Cormier, Vernon F.

377

Shallow subsurface applications of high-resolution seismic reflection  

NASA Astrophysics Data System (ADS)

Shallow seismic reflection surveys have been applied to a wide variety of problems. For example, in many geologic settings, variations and discontinuities on the surface of bedrock can influence the transport and eventual fate of contaminants introduced at or near the ground surface. Using seismic methods to determine the nature and location of anomalous bedrock can be an essential component of hydrologic characterization. Shallow seismic surveys can also be used to detect earthquake faults and to image underground voids. During the early 1980s, the advent of digital engineering seismographs designed for shallow, high-resolution surveying spurred significant improvements in engineering and environmental reflection seismology. Commonly, shallow seismic reflection methods are used in conjunction with other geophysical and geological methods, supported by a well-planned drilling-verification effort. To the extent that seismic reflection, refraction, and surface-wave methods can constrain shallow stratigraphy, geologic structure, engineering properties, and relative permeability, these methods are useful in civil-engineering applications and in characterizing environmental sites. Case histories from Kansas, California, and Texas illustrate how seismic reflection can be used to map bedrock beneath alluvium at hazardous waste sites, detect abandoned coal mines, follow the top of the saturated zone during an alluvial aquifer pumping test, and map shallow faults that serve as contaminant flowpaths.

Steeples, Don

2002-11-01

378

Seismic vulnerability and risk assessment of Kolkata City, India  

NASA Astrophysics Data System (ADS)

The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the Eastern and Northeastern region of India. In order to classify the seismic risk zones of Kolkata, we used seismic hazard exposures on the vulnerability components, namely land use/land cover, population density, building typology, age, and height. We microzoned the seismic hazard of the City by integrating seismological, geological, and geotechnical themes in GIS, which in turn were integrated with the vulnerability components in a logic-tree framework to estimate both the socio-economic and structural risk of the City. In both risk maps, three broad zones have been demarcated as "severe", "high", and "moderate"; there is also a risk-free zone in the City. The damage distribution in the City due to the 1934 Bihar-Nepal earthquake of Mw 8.1 matches the risk regime well. The design horizontal seismic coefficients for the City have been worked out for all the predominant periods, which indicate the suitability of "A", "B", and "C" types of structures. The cumulative damage probabilities in terms of "slight", "moderate", "extensive", and "complete" have also been assessed for the four significant model building types, viz. RM2L, RM2M, URML, and URMM, for each structural seismic risk zone in the City. Both the seismic hazard and risk maps are expected to play vital roles in earthquake-inflicted disaster mitigation and management of the city of Kolkata.

Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

2014-04-01

379

Local seismic events in area of Poland based on data from PASSEQ 2006-2008 experiment  

NASA Astrophysics Data System (ADS)

PASSEQ 2006-2008 (Passive Seismic Experiment in TESZ; Wilde-Piórko et al., 2008) was the biggest passive seismic experiment so far in the area of Central Europe (Poland, Germany, Czech Republic, and Lithuania). 196 seismic stations (including 49 broadband seismometers) worked simultaneously for over two years. During the experiment, multiple types of data recorders and seismometers were used, making analysis more complex and time consuming. The dataset was unified and repaired before starting the detection of local seismic events. Two different detection approaches were applied for stations located in Poland. One used standard STA/LTA triggers (Carl Johnson's STA/LTA algorithm) and a grid search to classify and locate events; the results were manually verified. The other approach used Real Time Recurrent Network (RTRN) detection (Wiszniowski et al., 2014). Both methods gave similar results, showing four previously unknown seismic events located in the area of the Gulf of Gdańsk in the southern Baltic Sea. The investigation of local seismicity is a good opportunity to verify new seismic models of the lithosphere in the area. In this paper we discuss both detection methods with their pros and cons (accuracy, efficiency, manual work required, scalability). We also show details of all detected and previously unknown events in the discussed area. This work was partially supported by NCN grant UMO-2011/01/B/ST10/06653.
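An STA/LTA trigger of the general kind mentioned above compares a short-term average of the signal envelope to a long-term average and flags samples where the ratio exceeds a threshold. The sketch below is an illustrative textbook version, not the Carl Johnson variant actually used in the experiment, and the trace and window lengths are synthetic:

```python
def sta_lta(signal, nsta, nlta):
    """Return the STA/LTA ratio for each sample (0 where the LTA
    window is not yet filled). nsta/nlta are window lengths in samples,
    with nsta << nlta."""
    ratios = [0.0] * len(signal)
    for i in range(nlta, len(signal)):
        sta = sum(abs(x) for x in signal[i - nsta:i]) / nsta
        lta = sum(abs(x) for x in signal[i - nlta:i]) / nlta
        ratios[i] = sta / lta if lta > 0 else 0.0
    return ratios

# Synthetic trace: low-amplitude noise followed by a burst ("event")
trace = [0.1] * 200 + [2.0] * 20 + [0.1] * 80
ratios = sta_lta(trace, nsta=10, nlta=100)
triggers = [i for i, r in enumerate(ratios) if r > 3.0]
print(triggers[0])  # first sample exceeding the trigger threshold
```

In practice the envelope is usually formed from squared or characteristic-function values, the averages are computed recursively for speed, and a separate lower threshold de-triggers the detector; the declared triggers then feed a grid-search locator as in the abstract.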

Polkowski, Marcin; Plesiewicz, Beata; Wiszniowski, Jan; Wilde-Piórko, Monika; Passeq Working Group

2014-05-01

380

Key aspects governing induced seismicity  

NASA Astrophysics Data System (ADS)

In the past decades numerous examples of earthquakes induced by human-induced changes in subsurface fluid pressures have been reported. This poses a major threat to the future development of some of these operations and calls for an understanding and quantification of the seismicity generated. From geomechanical considerations and insights from laboratory experiments, the factors controlling induced seismicity may be grouped into four categories: the magnitude of the stress disturbance, the pre-existing stress conditions, the reservoir/fault rock properties, and the local geometry. We investigated whether the (relative) contributions of these factors and their influence on the magnitudes generated could be recognized by looking at the entire dataset of reported cases of induced seismicity as a whole, and what this might imply for future developments. An extensive database has been built out of over 160 known cases of induced seismicity worldwide, incorporating the relevant geological, seismological, and fluid-related parameters. The cases studied include hydrocarbon depletion and secondary recovery, waste water injection, (enhanced) geothermal systems, and hydraulic fracturing, with observed magnitudes ranging from less than -1.5 to 7. The parameters taken into account were based on the theoretical background of the mechanisms of induced seismicity and include the injection/depletion-related parameters, (spatial) characteristics of seismicity, lithological properties, and the local stress situation. Correlations between the seismic response and the geological/geomechanical characteristics of the various sites were investigated. The injected/depleted volumes and the scale of the activities are major controlling factors on the maximum magnitudes generated. Spatial signatures of seismicity such as the depth and lateral spread of the seismicity were observed to be distinct for different activities, which is useful when considering future operations. 
Where available the local stress situation is considered, as well as the influence of the natural seismicity. Finally, we related induced seismicity to several reservoir and fault rock properties, including fault rock stability as is observed from the laboratory. The combination of activities of different natures and associated seismicity occurring through distinct mechanisms in this dataset is very useful for a better understanding of the factors governing induced seismicity and the operation-specific seismic expressions.

Buijze, Loes; Wassing, Brecht; Fokker, Peter

2013-04-01

381

Seismicity of western Macedonia, Greece  

NASA Astrophysics Data System (ADS)

The seismicity of western Macedonia is examined in the present paper. On the basis of historical information as well as instrumental data, it is found that this area is characterized by low seismicity. The focal region of the Grevena-Kozani 1995 earthquake exhibits the highest seismicity in terms of probabilities for the generation of strong (Ms ≥ 6.0) earthquakes in a period of fifty years. Two other regions with relatively high seismicity were also distinguished (west of Edessa and around the Prespes lakes). Accurate determination of the focal parameters of all earthquakes that occurred in the area during October 1975-April 1995, using a 3-D crustal model, shows that the seismic activity is related to the graben structures of the studied area. Finally, evidence is presented that the triggering of the 1995 earthquake may be related to the impoundment of the Polyfytos artificial lake.

Karakaisis, G. F.; Hatzidimitriou, P. M.; Scordilis, E. M.; Panagiotopoulos, D. G.

382

First Quarter Hanford Seismic Report for Fiscal Year 2001  

SciTech Connect

Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. Hanford Seismic Monitoring also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 41 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. For the HSN, there were 477 triggers on the data acquisition system during the first quarter of fiscal year (FY) 2001. Of these triggers, 176 were earthquakes. Forty-five earthquakes were located in the HSN area: 1 occurred in the Columbia River Basalt Group, 43 in the pre-basalt sediments, and 1 in the crystalline basement. Geographically, 44 earthquakes occurred in swarm areas, 1 was on a major structure, and none were classified as random occurrences. The Horse Heaven Hills earthquake swarm area recorded all but one event during the first quarter of FY 2001; the peak of activity occurred over December 12th, 13th, and 14th, when 35 events occurred. No earthquakes triggered the Hanford Strong Motion Accelerometers during the first quarter of FY 2001.

Hartshorn, Donald C.; Reidel, Stephen P.; Rohay, Alan C.; Valenta, Michelle M.

2001-02-27

383

Illuminating Asset Value through New Seismic Technology  

NASA Astrophysics Data System (ADS)

The ability to reduce risk and uncertainty across the full life cycle of an asset is directly correlated to creating an accurate subsurface image that enhances our understanding of the geology. This presentation focuses on this objective in areas of complex overburden in deepwater. Marine 3D seismic surveys have been acquired in essentially the same way for the past decade. This configuration of towed streamer acquisition, where the boat acquires data in one azimuth has been very effective in imaging areas in fairly benign geologic settings. As the industry has moved into more complicated geologic settings these surveys no longer meet the imaging objectives for risk reduction in exploration through production. In shallow water, we have seen increasing use of ocean bottom cables to meet this challenge. For deepwater, new breakthroughs in technology were required. This will be highlighted through examples of imaging below large salt bodies in the deep water Gulf of Mexico. GoM - Mad Dog: The Mad Dog field is located approximately 140 miles south of the Louisiana coastline in the southern Green Canyon area in water depths between 4100 feet to 6000 feet. The complex salt canopy overlying a large portion of the field results in generally poor seismic data quality. Advanced processing techniques improved the image, but gaps still remained even after several years of effort. We concluded that wide azimuth acquisition was required to illuminate the field in a new way. Results from the Wide Azimuth Towed Streamer (WATS) survey deployed at Mad Dog demonstrated the anticipated improvement in the subsalt image. GoM - Atlantis Field: An alternative approach to wide azimuth acquisition, ocean bottom seismic (OBS) node technology, was developed and tested. In 2001 deepwater practical experience was limited to a few nodes owned by academic institutions and there were no commercial solutions either available or in development. 
BP embarked on a program of sea trials designed both to evaluate technologies and subsequently to encourage vendor activity to develop and deploy a commercial system. The 3D seismic method exploded into general usage in the 1990s. Our industry delivered 3D cheaper and faster, improving quality through improved acquisition specifications and new processing technology. The need to mitigate business risks in highly material subsalt plays led BP to explore the technical limits of the seismic method, testing novel acquisition techniques to improve illumination and signal-to-noise ratio. These were successful and are applicable to analogous seismic quality problems globally, providing breakthroughs in illuminating previously hidden geology and hydrocarbon reservoirs. A focused business challenge, smart risk taking, investment in people and computing capability, partnerships, and rapid implementation are key themes that will be touched on throughout the talk.

Brandsberg-Dahl, S.

2007-05-01

384

A Novel Approach for Verifiable Secret Sharing by using a One Way Hash Function  

E-print Network

Threshold secret sharing schemes do not prevent malicious behavior by the dealer or the shareholders, so we need verifiable secret sharing to detect and identify cheaters and to achieve fair reconstruction of a secret. The problem of verifiable secret sharing is to verify the shares distributed by the dealer. A novel approach for verifiable secret sharing is presented in this paper in which neither the dealer nor the shareholders are assumed to be honest. We extend the term verifiable secret sharing to cover verification of the shares distributed by the dealer, of the shares submitted by shareholders for secret reconstruction, and of the reconstructed secret. Our proposed scheme uses a one-way hash function and a probabilistic homomorphic encryption function to provide verifiability and fair reconstruction of a secret.
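The paper's scheme combines a one-way hash function with probabilistic homomorphic encryption. As a much simpler illustration of the hash-based half of that idea only, the sketch below layers SHA-256 share commitments over plain Shamir sharing; the field modulus, threshold values, and helper names are our own assumptions, not the paper's construction:

```python
import hashlib
import random

PRIME = 2**127 - 1  # illustrative field modulus for Shamir arithmetic

def make_shares(secret, k, n):
    """Split `secret` into n Shamir shares with threshold k."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def commit(share):
    """Dealer publishes a one-way hash of each share as a commitment."""
    x, y = share
    return hashlib.sha256(f"{x}:{y}".encode()).hexdigest()

def verify(share, commitment):
    """Anyone can check a submitted share against the public commitment."""
    return commit(share) == commitment

def reconstruct(shares, k):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares[:k]):
        num = den = 1
        for j, (xj, _) in enumerate(shares[:k]):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

secret = 123456789
shares = make_shares(secret, k=3, n=5)
commitments = [commit(s) for s in shares]
assert all(verify(s, c) for s, c in zip(shares, commitments))
assert reconstruct(shares, 3) == secret
```

Because the commitments are public, a shareholder who submits a tampered share at reconstruction time is caught by `verify` before the secret is rebuilt; this sketch does not, however, let shareholders check that the dealer's shares are mutually consistent, which is where the homomorphic component of the actual scheme comes in.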

Parmar, Keyur

2012-01-01

385

20 CFR 10.527 - Does OWCP verify reports of earnings?  

Code of Federal Regulations, 2010 CFR

...2010-04-01 false Does OWCP verify reports of earnings? 10.527 Section...AMENDED Continuing Benefits Reports of Earnings from Employment and Self-Employment § 10.527 Does OWCP verify reports of earnings? To make...

2010-04-01

386

A Multithreaded Verified Method for Solving Linear Systems in Dual-Core Processors  

E-print Network

A Multithreaded Verified Method for Solving Linear Systems in Dual-Core Processors. Mariana Luderitz. ...a multithreaded approach for the problem of solving dense linear systems with verified results. We propose a new

Paris-Sud XI, Université de

387

48 CFR 227.7103-13 - Government right to review, verify, challenge and validate asserted restrictions.  

Code of Federal Regulations, 2010 CFR

...2010-10-01 2010-10-01 false Government right to review, verify, challenge and validate...REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Technical Data 227.7103-13 Government right to review, verify, challenge and...

2010-10-01

388

Seismic event classification system  

DOEpatents

In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
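The preprocessing chain described above (time-frequency distribution, binarization, magnitude of the 2-D FFT as a shift-invariant feature) can be sketched in a few lines of NumPy. The window length, hop size, and median threshold below are illustrative choices of ours, not values from the patent:

```python
import numpy as np

def shift_invariant_representation(trace, win=64, hop=16):
    """Time-frequency distribution -> binary map -> |2-D FFT|.
    Taking the magnitude of the 2-D FFT discards phase, so a time
    shift of the event leaves the feature (nearly) unchanged."""
    # Simple STFT magnitude as the time-frequency distribution
    frames = np.stack([trace[i:i + win] * np.hanning(win)
                       for i in range(0, len(trace) - win, hop)])
    tfd = np.abs(np.fft.rfft(frames, axis=1))
    # Binarize around the median energy level
    binary = (tfd > np.median(tfd)).astype(float)
    # Shift-invariant feature fed to the self-organizing network
    return np.abs(np.fft.fft2(binary))

rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 0.05 * np.arange(2048)) + 0.1 * rng.standard_normal(2048)
feature = shift_invariant_representation(trace)
print(feature.shape)  # (124, 33): time frames x frequency bins
```

The flattened feature matrix would then be presented to a Kohonen or ART network for unsupervised clustering of event classes.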

Dowla, F.U.; Jarpe, S.P.; Maurer, W.

1994-12-13

389

Seismic event classification system  

DOEpatents

In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

Dowla, Farid U. (Castro Valley, CA); Jarpe, Stephen P. (Brentwood, CA); Maurer, William (Livermore, CA)

1994-01-01

390

Seismic Survey Challenges and Solutions in Industrial And Urban Environments  

NASA Astrophysics Data System (ADS)

Carbon storage projects are often located in close proximity to anthropogenic sources of CO2. This means that the storage site may be near industrial power plants, mining activity, or urban centers. Proximity to these environments can present unique challenges for the seismic survey design, acquisition, and processing teams in terms of acquiring surface seismic data that meets the site characterization objectives for a CO2 storage site. Seismic surveys in urban and industrial environments may have acquisition footprints that are severely constrained by surrounding infrastructure. The acquisition crew and survey design team must work closely together in real time to add in-fill source and receiver locations to surveys in order to ensure that high fold coverage is maintained over the survey. High levels of seismic noise may be generated by the industrial plants themselves. Local and industrial traffic, as well as electrical noise, may also be causes for concern. Near-surface conditions, such as water-saturated soils, unconsolidated mine tailings, and mining cavities, may accelerate attenuation of the seismic signal and become sources of noise in the survey, further impacting data quality. When dealing with such conditions, the acquisition and survey design teams must stay in constant communication to optimize survey parameters to account for noise issues. In some cases, the raw data can be so contaminated with noise that no coherent signal can be seen in the data. However, the use of high-density single sensors is one of the most effective options for dealing with noisy acquisition environments, as this method allows the recorded noise to be sampled without aliasing so that it can be removed from the data without impacting the seismic signal. Removing noise and optimizing the final images obtained from the data is the job of the survey design and data processing teams. 
A final consideration when acquiring seismic surveys in urban areas is the visibility of the project to the public. It is important to have a member of the acquisition team that is in constant communication with the surrounding landowners; this is usually done in collaboration with the project communications and outreach team. Invitations to come observe seismic operations can go a long way towards allaying the concerns of landowners. Given that commercial CO2 storage projects are likely to span decades and will require a number of time-lapse seismic surveys to monitor CO2 plume movement, it is essential that the community is left with a positive feeling about seismic operations. One of the ultimate site characterization objectives of seismic datasets should be deriving information on rock lithologies and rock properties, such as porosity, using seismic inversion. The data trends in lithology and porosity that are identified through seismic inversion may have significant impacts on the geologic model and reservoir simulations. Despite the multitude of acquisition challenges that can be encountered at CO2 storage sites, high quality and high resolution seismic data can still be acquired, and the value of that data can be maximized through inversion analysis.
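The point about sampling noise without aliasing rests on the spatial Nyquist criterion: receivers must be spaced at no more than half the shortest noise wavelength. A minimal sketch, with illustrative ground-roll numbers of our choosing:

```python
def max_receiver_spacing(v_min, f_max):
    """Spatial Nyquist: the slowest (shortest-wavelength) noise mode
    must be sampled at least twice per wavelength, so
    dx <= v_min / (2 * f_max)."""
    return v_min / (2.0 * f_max)

# Illustrative values: ground roll at ~300 m/s with energy up to 30 Hz
dx = max_receiver_spacing(v_min=300.0, f_max=30.0)
print(f"max spacing to record the noise unaliased: {dx:.1f} m")  # 5.0 m
```

Spacings coarser than this alias the ground roll into the signal band, which is why high-density single-sensor layouts make the noise removable in processing.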

Coueslan, M. L.; El-Kaseeh, G.; Totten, S.

2011-12-01

391

Experimental study of seismic vibration effect on two-phase flow  

NASA Astrophysics Data System (ADS)

This study investigates the effects of seismic vibration on two-phase flow. Based on the seismic characteristics found in the literature, the properties for designing a test facility to simulate vibration and the test conditions for adiabatic and diabatic (subcooled boiling) two-phase flows have been chosen. In order to perform this experiment, an annulus test section has been built and attached to a vibration module. For experimental investigation and visualization of two-phase flow, Pyrex glass tubes have been utilized as a transparent test section, and stainless steel instrumentation ports are designed to acquire experimental data. In the design process, calculations considering the resonance, natural frequency, structural deflection, material properties and vibration conditions of the vibration structure have been performed to choose a suitable vibration beam. The motion equations of the eccentric cam are also analyzed with respect to displacement (vibration amplitude), velocity and acceleration. Each design step is aimed at an economical, reliable and controllable vibration condition for the two-phase flow test section. In addition, the scaling laws for geometric similarity, hydrodynamic similarity and thermal similarity are taken into account for the annulus test section to simulate a fuel assembly sub-channel of a prototypic boiling water reactor (BWR). Potential hydrodynamic and thermal effects on two-phase flow under seismic vibration are broken down and analyzed in detail. Based on the 1-D drift-flux model, the hydrodynamic effects are discussed with respect to possible variations of the distribution parameter, C0, and the drift velocity, ⟨v_gj⟩, caused by changes in the void distribution, bubble diameter and flow regimes. Sensitivity studies are carried out for analyzing these potential hydrodynamic effects. 
In addition, the void generation relations in a diabatic (subcooled boiling) two-phase flow system are taken into account for analyses of potential thermal effects, including the onset of nucleate boiling and onset of significant void (ONB-OSV) effect, the wall nucleation effect and the bulk phase change effect. Analyses of phenomenological changes and temperature distributions are performed to estimate void changes due to vibration. An extensive 1-D experimental database is assembled for adiabatic and subcooled boiling two-phase flow under stationary and vibration conditions. The adiabatic test results are used to examine and verify the potential hydrodynamic effects, whereas the subcooled boiling test results are compared and explained by the proposed thermal effects. Several vibration effect maps were made in terms of flow conditions (⟨j_g⟩-⟨j_f⟩), thermal conditions (N_Zu-N_Sub), operation conditions and vibration conditions (E-f and f-δ) for the adiabatic and subcooled boiling two-phase flow tests. Among these vibration effect maps, different effective vibration conditions and dominant effects can be seen by region. In the ⟨j_g⟩-⟨j_f⟩ plots of the adiabatic two-phase flow tests, the hydrodynamic effect is found to dominate. The void fraction is found to potentially decrease due to vibration in the wall-peak bubbly flow regime and to increase in the region close to the bubbly-to-slug transition boundary. No significant change in void fraction is found in the slug flow regime under vibration. In addition, the thermal effect due to vibration is presented on the N_Zu-N_Sub plots. Three regions representing void increase, no change and void decrease, corresponding to a thick thermal boundary layer (TBL), bulk saturation, and near saturation with low flow and high subcooling conditions, are presented for subcooled boiling flow under seismic vibration. Finally, the E-f and f-δ 
plots express the effective vibration conditions for adiabatic and subcooled boiling flows, and the acceleration values are compared with existing earthquake intensity records. In summary, a systematic database covering wide ranges of seismic vibration conditions
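The hydrodynamic sensitivity discussed above can be illustrated with the standard 1-D drift-flux void-fraction relation; the parameter values below are generic round numbers for bubbly flow, not the study's measurements:

```python
def void_fraction(jg, jf, c0=1.2, vgj=0.23):
    """1-D drift-flux void fraction:
        <alpha> = jg / (C0 * (jg + jf) + Vgj)
    jg, jf: gas/liquid superficial velocities (m/s)
    C0: distribution parameter, Vgj: drift velocity (m/s).
    Default C0 and Vgj are illustrative round numbers."""
    return jg / (c0 * (jg + jf) + vgj)

# Sensitivity: a vibration-induced change in C0 or Vgj shifts <alpha>
base = void_fraction(jg=0.1, jf=1.0)
flatter_profile = void_fraction(jg=0.1, jf=1.0, c0=1.0)
assert flatter_profile > base  # a flatter void profile (smaller C0) raises <alpha>
```

This is exactly the kind of calculation behind the sensitivity studies mentioned in the abstract: holding the superficial velocities fixed and perturbing C0 or Vgj shows how vibration-driven changes in void distribution or bubble size would appear as void-fraction changes.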

Chen, Shao-Wen

392

Seismic monitoring at Deception Island volcano (Antarctica): Recent advances  

NASA Astrophysics Data System (ADS)

Deception Island (South Shetland Islands, Antarctica) is an active volcano with recent eruptions (e.g. 1967, 1969 and 1970). It is also among the Antarctic sites most visited by tourists, and two scientific bases currently operate there during the austral summers, usually from late November to early March. For these reasons it is necessary to deploy a volcano monitoring system that is as complete as possible, designed specifically to endure the extreme conditions of the volcanic environment and the Antarctic climate. The Instituto Andaluz de Geofísica of the University of Granada, Spain (IAG-UGR) has performed seismic monitoring on Deception Island since 1994 during austral summer surveys. The seismicity basically includes volcano-tectonic earthquakes, long-period events and volcanic tremor, among other signals. The level of seismicity is moderate, except for a seismo-volcanic crisis in 1999. The seismic monitoring system has evolved over these years, following technological developments and software improvements. Recent advances have mainly focused on: (1) the improvement of the seismic network, introducing broadband stations and 24-bit data acquisition systems; (2) the development of a short-period seismic array with a 12-channel, 24-bit data acquisition system; (3) the implementation of wireless data transmission from the network stations and from the seismic array to a recording center, allowing for real-time monitoring; (4) the efficiency of the power supply systems and the monitoring of battery levels and power consumption; (5) the optimization of data analysis procedures, including database management, automated event recognition tools for the identification and classification of seismo-volcanic signals, and apparent slowness vector estimates using seismic array data; (6) the deployment of permanent seismic stations and the transmission of data during the winter using a satellite connection. 
A single permanent station has been operating at Deception Island since 2008. In the current survey we are collaborating with the Spanish Army to add another permanent station that will be able to send the IAG-UGR seismic information about the activity of the volcano during the winter, using a communications satellite (SPAINSAT). These advances simplify the field work and the data acquisition procedures, and allow us to obtain high-quality seismic data in real time. They are of great significance for a better and faster interpretation of the seismo-volcanic activity and assessment of volcanic hazards at Deception Island volcano.

Carmona, E.; Almendros, J.; Martín, R.; Cortés, G.; Alguacil, G.; Moreno, J.; Martín, B.; Martos, A.; Serrano, I.; Stich, D.; Ibáñez, J. M.

2012-04-01

393

Patterns of significant seismic quiescence in the Pacific Mexican coast  

NASA Astrophysics Data System (ADS)

Mexico is one of the countries with the highest seismicity. During the 20th century, 8% of all earthquakes in the world of magnitude greater than or equal to 7.0 took place in Mexico; on average, an earthquake of magnitude greater than or equal to 7.0 occurred in Mexico every two and a half years. Great earthquakes in Mexico have their epicenters on the Pacific coast, where several seismic gaps have been identified; for example, there is a mature gap along the coast of Guerrero State which can potentially produce an earthquake of magnitude 8.2. With the purpose of making prognoses, some researchers study the statistical behavior of certain physical parameters that could be related to the process of stress accumulation in the Earth's crust, while others study seismic catalogs in search of seismicity patterns that manifest before the occurrence of great earthquakes. Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence using the definition of seismic quiescence proposed by Schreider (1990); the algorithm delineates the quiet area where an earthquake of great magnitude will probably occur. We applied our algorithm to the earthquake catalog of the Mexican Pacific coast between 14 and 21 degrees North latitude and 94 and 106 degrees West longitude, for events with depths less than or equal to 60 km and magnitudes greater than or equal to 4.2 that occurred from September 1965 through December 2014. We found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). 
Fortunately, no earthquakes of great magnitude have occurred in Mexico in this century; however, we have identified well-defined seismic quiescences in the Guerrero seismic gap that are apparently correlated with the occurrence of silent earthquakes in 2002, 2006 and 2011, recently discovered by GPS technology. In fact, a possible silent earthquake with Mw = 7.6 occurred at this gap in 2002; it lasted approximately four months and was detected by continuous GPS receivers located over an area of ~550 x 250 square kilometers.
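A rate-based quiescence test in the spirit of the approach described above, though not Schreider's exact statistic, can be sketched as a Poisson comparison of a recent window against the long-term background rate:

```python
import math

def quiescence_score(times, t_now, window, t_start):
    """Probability, under the long-term Poisson rate, of observing as
    few events as were seen in the recent window (small value =
    significant quiescence). A generic rate test, not Schreider's
    exact statistic."""
    background = [t for t in times if t < t_now - window]
    rate = len(background) / ((t_now - window) - t_start)   # events per year
    expected = rate * window
    observed = sum(1 for t in times if t_now - window <= t < t_now)
    # Cumulative Poisson probability P(K <= observed)
    return sum(math.exp(-expected) * expected ** k / math.factorial(k)
               for k in range(observed + 1))

# Synthetic catalog: one event per year for 35 years, then 5 quiet years
times = list(range(35))
p = quiescence_score(times, t_now=40, window=5, t_start=0)
print(f"P(this few events | background rate) = {p:.3f}")  # 0.007
```

In a real application the catalog would first be declustered and the test swept over a grid of space-time windows, flagging cells where the probability falls below a chosen significance level.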

Muñoz-Diosdado, Alejandro; Rudolf-Navarro, Adolfo; Barrera-Ferrer, Amilcar; Angulo-Brown, Fernando

2014-05-01

394

Induced Seismicity of Kuznetsk Basin  

NASA Astrophysics Data System (ADS)

The natural seismicity of the Kuznetsk Basin is confined mainly to the mountain frame of the Kuznetsk hollow. This paper presents material from experimental work with local station networks within the sedimentary basin. Three different types of seismicity have been identified within the Kuznetsk hollow: first, mining-induced seismic processes confined to mine workings and concentrated at depths of up to one and a half kilometers; second, seismic activations at depths of 2-5 km that are not spatially associated with the coal mines; and third, induced seismicity in the neighborhood of strip mines. Each of the studied seismic activations consists of a large number of small-magnitude earthquakes (Ms = 1-3), with from one to several tens of earthquakes recorded per day. The earthquakes near a mine working migrate in space along with the working; the seismic process intensifies when the coal-plough machine is operating and slackens while preventive works are carried out. Reverse (uplift) faulting is the most typical focal mechanism. The activated zone near a mine working reaches 1-1.5 km in diameter. At present, earthquakes occur mainly beneath the mine workings, and while damage to the workings themselves has not occurred, the intensive shaking felt at the surface calls for close study of such dangerous phenomena. Spatio-temporal changes of technogenic activations not associated in plan with the mine workings are noted, and a spatial displacement of activation along with the workings has been found. Trigger effects in the development of mining-induced seismicity have been identified. It was demonstrated that industrial explosions in neighboring open-pit mines have no pronounced effect on the seismic process near longwall faces. Stopping shearer operation in the longwalls leads to an immediate change in the induced seismicity: the number of technogenic earthquakes drops severalfold, while the smallest events remain. Resumption of longwall coal production restores the characteristics of the seismic behavior almost instantly. 
Research on the induced seismicity in the area of the "Raspadskaya" coal mine immediately after the accident showed the existence of a seismically activated zone toward which four working longwall faces advanced from different sides. The effect of vibration at a distance of 500 m on the characteristics of the induced seismicity in the area of a working longwall has been experimentally established; this result offers hope for developing a method of controlling induced seismicity, which is important for protecting coal-production workings. Induced seismicity has also been studied in an open-pit area to a depth of 350 m over an area of 3 km by 12 km. The strongest earthquake in this area had magnitude 4. On the whole, the seismic energy of the induced earthquakes is an order of magnitude less than the seismic effect of the industrial explosions in the open pit. Such large events are rare but dangerous phenomena. The largest coal basin of Siberia, located in a zone of moderate natural seismic activity, is in a stressed state, and the development of intensive induced seismicity accompanies coal production.

Emanov, A.; Leskova, E.; Fateev, A.

2013-05-01

395

Workmanship Coupon Verifies and Validates the Remote Inspection System Used to Inspect Dry Shielded Canister Welds  

SciTech Connect

The Idaho National Engineering and Environmental Laboratory (INEEL) is operated by Bechtel-BWXT Idaho LLC (BBWI), which recently completed a very successful Three Mile Island-2 (TMI-2) program for the Department of Energy. This complex and challenging program loaded, welded, and transported an unprecedented 27 dry shielded canisters in seven months, and did so ahead of schedule. The program moved over 340 canisters of TMI-2 core debris that had been in wet storage into a dry storage facility at the INEEL. Welding flaws in the manually welded purge and vent ports, discovered in mid-campaign, had to be verified as not affecting previously completed seal welds. A portable workmanship coupon was designed and built to validate remote inspection of completed in-service seal welds. This document outlines the methodology and advantages of building and using workmanship coupons.

Custer, K. E.; Zirker, L. R.; Dowalo, J. A.; Kaylor, J. E.

2002-02-25

396

Calibration of Seismic Attributes for Reservoir Characterization  

SciTech Connect

This project has completed the initially scheduled third year of the contract and is beginning a fourth year, designed to expand upon the tech-transfer aspects of the project. From the Stratton data set, we demonstrated that apparent correlations between attributes derived along 'phantom' horizons are artifacts of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the Boonsville data set, we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into productive and non-productive subfacies, and we developed a method involving cross-correlation of seismic waveforms to provide a reliable map of the various facies present in the area. The Teal South data set provided a surprising result, leading us to develop a pressure-dependent velocity relationship and to conclude that nearby reservoirs are undergoing a pressure drop in response to the production of the main reservoir, implying that oil is being lost through their spill points, never to be produced. The Wamsutter data set led to the use of unconventional attributes, including lateral incoherence and horizon-dependent impedance variations, to indicate regions of former sand bars and current high pressure, respectively, and to the evaluation of various upscaling routines.
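The waveform cross-correlation idea used for the Boonsville facies mapping can be illustrated with a normalized zero-lag correlation of a trace against facies template waveforms; the templates and facies names below are synthetic stand-ins of ours, not the project's data:

```python
import numpy as np

def classify_by_waveform(trace, templates):
    """Assign the facies whose template waveform has the highest
    normalized (Pearson) zero-lag cross-correlation with the trace.
    A generic illustration, not the project's exact workflow."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() * len(a))
        b = (b - b.mean()) / b.std()
        return float(np.dot(a, b))   # Pearson correlation coefficient
    scores = {name: ncc(trace, tpl) for name, tpl in templates.items()}
    return max(scores, key=scores.get), scores

# Two synthetic template waveforms standing in for two facies
t = np.linspace(0, 1, 200)
templates = {
    "channel": np.sin(2 * np.pi * 25 * t) * np.exp(-5 * t),
    "overbank": np.sin(2 * np.pi * 10 * t) * np.exp(-2 * t),
}
rng = np.random.default_rng(1)
noisy = templates["channel"] + 0.3 * rng.standard_normal(t.size)
label, scores = classify_by_waveform(noisy, templates)
print(label)
```

Mapped over every trace in a survey, this kind of winner-take-all correlation against representative waveforms yields a facies map directly from the seismic data.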

Pennington, Wayne D.; Acevedo, Horacio; Green, Aaron; Len, Shawn; Minavea, Anastasia; Wood, James; Xie, Deyi

2002-01-29

397

Linking the Meaning of Programs to What the Compiler Can Verify  

E-print Network

and development challenges that relate what a verifying compiler can verify to the definition and analysis verifier challenge in [46]. By its definition, Hoare's challenge is focussed on the correctness of programs of their refinements to compilable code, using Abstract State Machine (ASM) ground models [11] (Sect. 1) and ASM refine

Börger, Egon

398

How to Generate Universally Verifiable Signatures in Ad-Hoc Networks  

E-print Network

How to Generate Universally Verifiable Signatures in Ad-Hoc Networks. KyungKeun Lee, JoongHyo Oh. ...domain (an ad-hoc network) available in another domain (the Internet). Universal verifiability in the Gap Diffie-Hellman groups. Keywords: Ad-hoc networks, Universal verifiability, Interoperability, Digital

399

49 CFR 40.149 - May the MRO change a verified drug test result?  

Code of Federal Regulations, 2010 CFR

...May the MRO change a verified drug test result? 40.149 Section 40.149 Transportation...May the MRO change a verified drug test result? (a) As the MRO, you may change a verified test result only in the following situations:...

2010-10-01

400

31 CFR 363.14 - How will you verify my identity?  

Code of Federal Regulations, 2010 CFR

...2010-07-01 false How will you verify my identity? 363.14 Section 363.14 Money... § 363.14 How will you verify my identity? (a) Individual. When you establish...a verification service to verify your identity using information you provide...

2010-07-01

401

Verifiable Secret Redistribution for Threshold Sharing. Theodore M. Wong, Chenxi Wang, Jeannette M. Wing  

E-print Network

, of DARPA or the U.S. Government. Keywords: non-interactive verifiable secret redistribution, threshold... Verifiable Secret Redistribution for Threshold Sharing Schemes. Theodore M. Wong, Chenxi Wang. Pittsburgh, PA 15213. Abstract: We present a new protocol for the verifiable redistribution of secrets from (m

402

Verifiable Secret Redistribution for Threshold Sharing. Theodore Wong, Chenxi Wang, Jeannette Wing  

E-print Network

, either expressed or implied, of DARPA or the Government. Keywords: non-interactive verifiable secret... Verifiable Secret Redistribution for Threshold Sharing Schemes. Theodore Wong, Chenxi Wang, Jeannette Wing. Abstract: We present a new protocol for the verifiable redistribution of secrets from (m,n) access structures

403

Verifiable Secret Redistribution for Threshold Sharing. Theodore M. Wong, Chenxi Wang  

E-print Network

or the U.S. Government. Keywords: non-interactive verifiable secret redistribution, threshold secret... Verifiable Secret Redistribution for Threshold Sharing Schemes. Theodore M. Wong, Chenxi Wang. Pittsburgh, PA 15213. Abstract: We present a new protocol for the verifiable redistribution of secrets from (m

404

Verifying a computational method for predicting extreme ground motion  

USGS Publications Warehouse

In situations where seismological data is rare or nonexistent, computer simulations may be used to predict ground motions caused by future earthquakes. This is particularly practical in the case of extreme ground motions, where engineers of special buildings may need to design for an event that has not been historically observed but which may occur in the far-distant future. Once the simulations have been performed, however, they still need to be tested. The SCEC-USGS dynamic rupture code verification exercise provides a testing mechanism for simulations that involve spontaneous earthquake rupture. We have performed this examination for the specific computer code that was used to predict maximum possible ground motion near Yucca Mountain. Our SCEC-USGS group exercises have demonstrated that the specific computer code that was used for the Yucca Mountain simulations produces similar results to those produced by other computer codes when tackling the same science problem. We also found that the 3D ground motion simulations produced smaller ground motions than the 2D simulations.

Harris, R.A.; Barall, M.; Andrews, D.J.; Duan, B.; Ma, S.; Dunham, E.M.; Gabriel, A.-A.; Kaneko, Y.; Kase, Y.; Aagaard, B.T.; Oglesby, D.D.; Ampuero, J.-P.; Hanks, T.C.; Abrahamson, N.

2011-01-01

405

Comprehensive seismic hazard assessment of Tripura and Mizoram states  

NASA Astrophysics Data System (ADS)

Northeast India is one of the most highly seismically active regions in the world, with more than seven earthquakes of magnitude 5.0 and above per year on average. Reliable seismic hazard assessment could provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been attempted for seismic hazard assessment of Tripura and Mizoram states at bedrock level. An updated earthquake catalogue was collected from various national and international seismological agencies for the period from 1731 to 2011. Homogenization, declustering and data completeness analysis of the events have been carried out before hazard evaluation. Seismicity parameters have been estimated using the G-R relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanism, the region was divided into six major subzones. Region-specific correlations were used for magnitude conversion in homogenizing earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against the observed PGA (peak ground acceleration) values before use in the hazard evaluation. In this study, the hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA using both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2 and 10% probability of exceedance in 50 years, and spectral acceleration (T = 0.2 s, 1.0 s) for both states (2% probability of exceedance in 50 years). The results are important inputs for planning risk reduction strategies, developing risk acceptance criteria and financial analysis of possible damages in the study area, with a comprehensive analysis and higher-resolution hazard mapping.
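The probabilistic side of such an assessment rests on two standard relations: the Gutenberg-Richter (G-R) recurrence law and the Poisson probability of exceedance. A minimal sketch with illustrative parameters, not the fitted values from this study:

```python
import math

def gr_annual_rate(m, a, b):
    """Gutenberg-Richter recurrence: log10 N(>= m) = a - b*m."""
    return 10 ** (a - b * m)

def prob_exceedance(annual_rate, years):
    """Poisson probability of at least one event in `years` years."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative a/b values for one source zone
rate = gr_annual_rate(m=6.0, a=4.0, b=1.0)   # 0.01 events/yr
p50 = prob_exceedance(rate, 50)
# The "2% in 50 years" hazard level quoted above corresponds to a
# return period of about 2475 years:
rp = -50 / math.log(1 - 0.02)
print(f"P(exceedance in 50 yr) = {p50:.3f}, return period = {rp:.0f} yr")
```

In a full PSHA the annual rates are combined with ground-motion equations and integrated over all sources and magnitudes to build a hazard curve, from which the PGA at the 2% and 10% in-50-years levels is read off.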

Sitharam, T. G.; Sil, Arjun

2014-06-01

406

Development of a hydraulic borehole seismic source  

SciTech Connect

This report describes a five-year, $10 million Sandia/industry project to develop an advanced borehole seismic source for use in oil and gas exploration and production. The development team included Sandia, Chevron, Amoco, Conoco, Exxon, Raytheon, Pelton, and GRI. The seismic source that was developed is a vertically oriented, axial point force, swept-frequency, clamped, reaction-mass vibrator design. It was based on an early Chevron prototype, but the new tool incorporates a number of improvements that make it far superior to the original. The system consists of surface control electronics, a special heavy-duty fiber optic wireline and draw works, a cablehead, a hydraulic motor/pump module, an electronics module, a clamp, and an axial vibrator module. The tool has a peak output of 7,000 lbs of force and a useful frequency range of 5 to 800 Hz. It can operate in fluid-filled wells with 5.5-inch or larger casing, to depths of 20,000 ft and operating temperatures of 170 °C. The tool includes fiber optic telemetry, force and phase control, provisions to add seismic receiver arrays below the source for single-well imaging, and provisions for adding other vibrator modules to the tool in the future. The project yielded four important deliverables: a complete advanced borehole seismic source system with all associated field equipment; field demonstration surveys funded by industry showing the utility of the system; industrial sources for all of the hardware; and a new service company, set up by an industrial partner, to provide commercial surveys.
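The swept-frequency drive signal of such a vibrator is conventionally a linear chirp. The sketch below spans the tool's stated 5-800 Hz useful band; the sweep length and sample rate are illustrative choices of ours:

```python
import numpy as np

def linear_sweep(f0, f1, duration, fs):
    """Swept-frequency (chirp) drive signal: instantaneous frequency
    rises linearly from f0 to f1 Hz over `duration` seconds."""
    t = np.arange(int(duration * fs)) / fs   # exact sample count
    # Phase is the integral of the instantaneous frequency f0 + (f1-f0)*t/T
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return t, np.sin(phase)

# 5-800 Hz band as stated for the tool; fs must exceed 2 * f1 (Nyquist)
t, s = linear_sweep(f0=5.0, f1=800.0, duration=10.0, fs=4000.0)
```

In processing, the recorded data are cross-correlated with this pilot sweep to compress each long sweep into an effective impulsive source signature.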

Cutler, R.P.

1998-04-01

407

DSOD Procedures for Seismic Hazard Analysis  

NASA Astrophysics Data System (ADS)

DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. 
Minimum shaking levels are provided for sites far from active faulting. Our procedures and standards are presented at the DSOD website http://damsafety.water.ca.gov/. We review our methods and tools periodically under the guidance of our Consulting Board for Earthquake Analysis (and expect to make changes pending NGA completion), mindful that frequent procedural changes can interrupt design evaluations.

Howard, J. K.; Fraser, W. A.

2005-12-01

408

Verifying likelihoods for low template DNA profiles using multiple replicates  

PubMed Central

To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140
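The upper bound discussed above follows from the random-match probability; a toy sketch under Hardy-Weinberg and locus-independence assumptions, with hypothetical allele frequencies (none of these numbers come from the study):

```python
def match_probability(genotype, freqs):
    """Random-match probability of one locus genotype under
    Hardy-Weinberg equilibrium: p^2 for a homozygote, 2pq for a
    heterozygote. `freqs` maps allele -> population frequency."""
    a, b = genotype
    if a == b:
        return freqs[a] ** 2
    return 2.0 * freqs[a] * freqs[b]

# Hypothetical three-locus profile with illustrative allele frequencies.
profile = [(("12", "14"), {"12": 0.1, "14": 0.2}),
           (("9", "9"),   {"9": 0.3}),
           (("7", "11"),  {"7": 0.05, "11": 0.25})]

mp = 1.0
for genotype, freqs in profile:
    mp *= match_probability(genotype, freqs)  # independence across loci

lr_upper_bound = 1.0 / mp  # no LR for this profile can exceed this
print(f"{lr_upper_bound:.3g}")
```

The abstract's validation check is that, with enough replicates, the low-template LR for a true contributor approaches (but never exceeds) this bound.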

Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

2014-01-01

410

Reservoir Characterization Using Intelligent Seismic Inversion  

E-print Network

(Fragmentary slide text.) Techniques that deal with uncertainty, imprecision, and partial truth motivate reservoir characterization studies. Inverse modeling of reservoir properties from seismic data is known as seismic inversion. Does a relationship exist between seismic data and reservoir characteristics?

Mohaghegh, Shahab

411

Updated Colombian Seismic Hazard Map  

NASA Astrophysics Data System (ADS)

The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with the destructive seismic activity in Colombia, has motivated a new seismic hazard assessment for the country. Taking advantage of new instrumental information sources, such as the new broadband stations of the National Seismological Network, new historical seismicity data, standardized global databases and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site and handles the uncertainties in the parameters and assumptions involved in this kind of study. First, the seismic source geometries and a complete, homogeneous seismic catalog were defined; the occurrence rate parameters of each seismic source were then calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds, with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated.
With these results it is possible to determine the environments and scenarios in which the seismic hazard is a function of distance and magnitude, and to identify the principal seismic sources contributing to the hazard at each site (disaggregation). The project was conducted by the Servicio Geológico Colombiano (Colombian Geological Survey) and the Universidad Nacional de Colombia (National University of Colombia), with the collaboration of national and foreign experts and the National System of Prevention and Attention of Disasters (SNPAD). Notably, this new seismic hazard map was used in the updated national building code (NSR-10). A new process is under way to improve the map and present the seismic hazard in terms of intensity. This requires new knowledge of site effects at both local and regional scales, as well as checking existing, and developing new, acceleration-to-intensity relationships, in order to obtain results that are more understandable and useful for a wider range of users: not only the engineering field, but also risk assessment and management institutions, researchers and the general community.

Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

2013-05-01

412

Experimental measurements of seismic attenuation in microfracture sedimentary rock  

SciTech Connect

In a previous paper (Peacock et al., 1994), the authors related ultrasonic velocities in water-saturated Carrara Marble to crack densities in polished sections to verify Hudson's (1980, 1981, 1986) theory for velocities in cracked rock. They describe the empirical relationships between attenuation and crack density that they established during these experiments in the hope of clarifying the mechanism of attenuation in rocks with fluid-filled cracks. Relating seismic velocity and attenuation to crack density is important in predicting the productivity of fractured petroleum reservoirs such as the North Sea Brent Field. It also allows cracks to be used as stress indicators throughout the shallow crust (Crampin and Lovell, 1991).

Peacock, S.; McCann, C.; Sothcott, J.; Astin, T.R. (Univ. of Reading (United Kingdom). Research Inst. for Sedimentology)

1994-09-01

413

Seismic imaging of sandbox experiments - laboratory hardware setup and first reflection seismic sections  

NASA Astrophysics Data System (ADS)

With the study and technical development introduced here, we combine analogue sandbox simulation techniques with seismic physical modelling of sandbox models. For that purpose, we designed and developed a new mini-seismic facility for laboratory use, comprising a seismic tank, a PC-driven control unit, a positioning system, and piezoelectric transducers used here for the first time in an array mode. To assess the possibilities and limits of seismic imaging of small-scale structures in sandbox models, different geometry setups were tested in the first 2-D experiments that also tested the proper functioning of the device and studied the seismo-elastic properties of the granular media used. Simple two-layer models of different materials and layer thicknesses as well as a more complex model comprising channels and shear zones were tested using different acquisition geometries and signal properties. We suggest using well sorted and well rounded grains with little surface roughness (glass beads). Source receiver-offsets less than 14 cm for imaging structures as small as 2.0-1.5 mm size have proven feasible. This is the best compromise between wide beam and high energy output, and is applicable with a consistent waveform. Resolution of the interfaces of layers of granular materials depends on the interface preparation rather than on the material itself. Flat grading of interfaces and powder coverage yields the clearest interface reflections. Finally, sandbox seismic sections provide images of high quality showing constant thickness layers as well as predefined channel structures and indications of the fault traces from shear zones. Since these were artificially introduced in our test models, they can be regarded as zones of disturbance rather than tectonic shear zones characterized by decompaction. 
The multiple-offset surveying introduced here improves the quality with respect to S/N ratio and source signature even more; the maximum depth penetration in glass-bead layers thereby amounts to 5 cm. Thus, the presented mini-seismic device is already able to resolve structures within simple models of saturated porous media, so that multiple-offset seismic imaging of structurally evolving shallow sandbox models is generally feasible.

Krawczyk, C. M.; Buddensiek, M.-L.; Oncken, O.; Kukowski, N.

2013-02-01

414

Seismic imaging of sandbox experiments - laboratory hardware setup and first reflection seismic sections  

NASA Astrophysics Data System (ADS)

With the study and technical development introduced here, we combine analogue sandbox simulation techniques with seismic physical modelling of sandbox models. For that purpose, we designed and developed a new mini-seismic facility for laboratory use, comprising a seismic tank, a PC-driven control unit, a positioning system, and piezoelectric transducers used here for the first time in an array mode. To assess the possibilities and limits of seismic imaging of small-scale structures in sandbox models, different geometry setups were tested in the first experiments, which also tested the proper functioning of the device and studied the seismo-elastic properties of the granular media used. Simple two-layer models of different materials and layer thicknesses as well as a more complex model comprising channels and shear zones were tested using different acquisition geometries and signal properties. We suggest using well sorted and well rounded grains with little surface roughness (glass beads). Source-receiver offsets of less than 14 cm for imaging structures as small as 2.0-1.5 mm have proven feasible; this is the best compromise between wide beam and high energy output, and is applicable with a consistent waveform. Resolution of the interfaces of layers of granular materials depends on the interface preparation rather than on the material itself. Flat grading of interfaces and powder coverage yields the clearest interface reflections. Finally, sandbox seismic sections provide images of very good quality showing constant-thickness layers as well as predefined channel structures and fault traces from shear zones. Since these can be regarded in sandbox models as zones of decompaction, they behave as reflectors and can be imaged. The multiple-offset surveying introduced here improves the quality with respect to S/N ratio and source signature even more; the maximum depth penetration in glass-bead layers thereby amounts to 5 cm. 
Thus, the presented mini-seismic device is already able to resolve structures within simple models of saturated porous media, so that multiple-offset seismic imaging of structurally evolving shallow sandbox models is generally feasible.

Krawczyk, C. M.; Buddensiek, M.-L.; Oncken, O.; Kukowski, N.

2012-10-01

415

Development of a seismic borehole sonde for high resolution geophysical exploration ahead and around the drill bit  

NASA Astrophysics Data System (ADS)

High-resolution exploration is becoming increasingly important because reservoirs, especially in geothermal fields, are characterized by small-scale geological structures. Today, surface seismic surveys are often combined with borehole seismic measurements such as VSP or SWD to improve the velocity model and to image structures with higher resolution. Since the accuracy of structure localization depends strongly on surveying depth, resolving such small-scale structures at the metre scale is needed to explore deeper structures with high resolution. The project "Seismic Prediction While Drilling" (SPWD) examines a new approach to seismic exploration in boreholes. SPWD combines the seismic sources and receivers in one device, allowing exploration with a resolution independent of depth and the development of a system for exploration ahead of and around the drill bit. First, a prototype borehole device for dry horizontal boreholes in a mine was developed and tested. The source unit consists of four magnetostrictive vibrators emitting sweep signals from 500 Hz to 5000 Hz. To achieve a radiation pattern that focuses the seismic wave energy in predefined directions, the signal of each vibrator must be independently controlled in amplitude and phase; adjusting the amplitudes and phases of the sweep signals produces constructive interference in a predefined direction. The emitted signals are monitored by 30 three-component receivers mounted along the surrounding galleries at distances of up to 50 m. In the measurements, several parameters were examined to control the radiation pattern. The enhancement and diminishment of the wave amplitudes in the predefined directions of the radiation pattern is clearly exhibited, as is a dependence on frequency.
Using a three-component Fresnel-volume migration to image the reflected wave field, the results clearly show the effect of the radiation pattern on the distribution of the seismic wave energy. The migration of the reflected wave field reveals an amplification of the reflected amplitudes at the galleries corresponding to the radiation pattern of the complex borehole source. Structures crossing the borehole can also be detected, with additional characterization by different radiation patterns. Further improvements in focusing the seismic energy were realized through advances in the technical devices and in the control of the vibrators. As a next step, a wireline prototype for borehole measurements was designed and constructed; manufacturing is currently in progress. This prototype will be used in vertical boreholes up to 2000 m deep. After completion, first measurements are planned to verify the exploration method for directional investigation in boreholes. The measurements will take place in different hard- and soft-rock geologies and at different depths. The mine has also been extended with a 70 m vertical borehole for further research. This project is funded by the German Federal Environment Ministry.
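Focusing energy by adjusting per-vibrator phase, as described above, is in essence delay-and-sum beam steering; a minimal sketch with an assumed linear geometry, spacing and P-wave velocity (none of these values are specified in the abstract):

```python
import math

C_P = 3000.0  # assumed P-wave velocity in m/s (illustrative only)

def steering_delays(n_sources: int, spacing_m: float, angle_deg: float,
                    velocity: float = C_P):
    """Time delays (s) for a linear array of n_sources, spaced
    spacing_m apart, so that their wavefronts interfere constructively
    in the direction angle_deg from broadside (delay-and-sum).
    For a sweep, each delay tau maps to a phase shift 2*pi*f*tau."""
    theta = math.radians(angle_deg)
    return [i * spacing_m * math.sin(theta) / velocity
            for i in range(n_sources)]

# Four vibrators 0.5 m apart, beam steered 30 degrees off broadside:
delays = steering_delays(4, 0.5, 30.0)
print([round(d * 1e6, 2) for d in delays])  # delays in microseconds
```

The frequency dependence noted in the abstract follows directly: a fixed time delay corresponds to a frequency-dependent phase shift across the 500-5000 Hz sweep band.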

Jaksch, K.; Giese, R.; Kopf, M.

2012-04-01

416

Downhole hydraulic seismic generator  

DOEpatents

A downhole hydraulic seismic generator system for transmitting energy wave vibrations into earth strata surrounding a borehole. The system contains an elongated, unitary housing operably connected to a well head aboveground by support and electrical cabling, and contains clamping apparatus for selectively clamping the housing to the walls of the borehole. The system further comprises a hydraulic oscillator containing a double-actuating piston whose movement is controlled by an electro-servovalve regulating a high pressure hydraulic fluid flow into and out of upper and lower chambers surrounding the piston. The spent hydraulic fluid from the hydraulic oscillator is stored and pumped back into the system to provide high pressure fluid for conducting another run at the same, or a different location within the borehole.

Gregory, Danny L. (Corrales, NM); Hardee, Harry C. (Albuquerque, NM); Smallwood, David O. (Albuquerque, NM)

1992-01-01

417

End to seismic quiescence  

SciTech Connect

Analysis of data of occurrence of earthquakes of Richter magnitude (M) >6 in California from about 1850 to 1980 indicates that the region is emerging from a period of seismic quiescence. Epicentral distribution of all earthquakes of M greater than or equal to 6 for this same period is shown graphically as is the temporal distribution of these earthquakes. The rise in occurrence of strong earthquakes (4 earthquakes of M greater than or equal to 6 from October, 1979 through November, 1980) after a period of quiescence can be viewed as representative of behavior observed before the occurrence of some very large shocks; but the authors feel that in light of California's short earthquake record (since about 1850) the current increase in activity represents a return to a more normal rate of earthquake occurrence than that during the period of quiescence. (BLM)

Bufe, C.G.; Toppozada, T.R.

1981-06-01

418

Seismicity of Sri Lanka  

NASA Astrophysics Data System (ADS)

Sri Lanka has been considered an aseismic region. After 2.5 years of continuous microearthquake recording in the Kotmale area, earthquakes with magnitudes up to 2.25 have been recorded, clearly indicating a measurable seismic risk. The data come from an array established in February 1982 surrounding the proposed Kotmale Reservoir in a geologically adverse area, where nine major lineaments have been identified. These major lineaments are either 70-90° dipping normal faults with a fraction of a metre displacement, fracture zones with little or no displacement, or master joints with no or unknown displacement. Forty-eight microearthquakes were recorded from various parts of the country from February 1983 to August 1984, with magnitudes varying from 0.2 to 2.25 on the Richter scale. These results are an outcome of the Kotmale Microseismic Network, where an 87 m high rock-filled dam has been constructed across the Kotmale Valley. The gross reservoir storage is 174 × 10⁶ m³ when the water level reaches 84.5 m above the valley bottom. This network is part of the Kotmale Hydro Power Project, which comes under the Accelerated Mahaweli Programme. So far no microearthquakes have been recorded from the nine major lineaments at Kotmale, and therefore no correlation can be made between these lineaments and seismicity. Microearthquake epicentres appear to be closely associated with major lineaments and escarpments of the central highlands of Sri Lanka. The north-south trending Mahaweli lineament and the Haputale escarpment are two examples where earthquakes with magnitudes of about 1.7 on the Richter scale have been located. This study supports the idea of slow movement of the central highlands, as suggested by several authors in the past from geomorphological evidence.

Fernando, M. J.; Kulasinghe, A. N. S.

1986-10-01

419

Tilt and seismicity changes in the Shumagin seismic gap  

SciTech Connect

Changes in the ground surface tilt and in the rate of seismicity indicate that an aseismic deformation event may have occurred between 1978 and 1980 along the plate boundary in the eastern Aleutians, Alaska, within the Shumagin seismic gap. Pavlof Volcano was unusually quiescent during this period. The proposed event would cause an increase of stress on the shallow locked portion of the plate boundary, bringing it closer to rupture in a great earthquake.

Beavan, J.; Hauksson, E.; McNutt, S.R.; Bilham, R.; Jacob, K.H.

1983-10-21

421

From Induced Seismicity to Direct Time-Dependent Seismic Hazard  

NASA Astrophysics Data System (ADS)

The growing installation of industrial facilities for subsurface exploration worldwide requires continuous refinements in understanding both the mechanisms by which seismicity is induced by field operations and the related seismic hazard. Particularly in proximity to densely populated areas, induced low-to-moderate magnitude seismicity characterized by high-frequency content can be clearly felt by the surrounding inhabitants and, in some cases, may produce damage. In this respect we propose a technique for time-dependent probabilistic seismic hazard analysis to be used in geothermal fields as a monitoring tool for the effects of ongoing field operations. The technique integrates the observed features of the seismicity induced by fluid injection and extraction with a local ground-motion prediction equation. The result of the analysis is the time-evolving probability of exceedance of peak ground acceleration (PGA), which can be compared with selected critical values to manage field operations. To evaluate the reliability of the proposed technique, we applied it to data collected in The Geysers geothermal field in northern California between 1 September 2007 and 15 November 2010. We show that, during the period considered, the seismic hazard at The Geysers was variable in time and space, a consequence of the field operations and of the variation of both the seismicity rate and the b-value. We conclude that, for the exposure period taken into account (i.e., two months), as a conservative limit, PGA values corresponding to the lowest probability of exceedance (e.g., 30%) must not be exceeded to ensure safe field operations. We suggest testing the proposed technique in other geothermal areas or in regions where seismicity is induced by, for example, hydrocarbon exploitation or carbon dioxide storage.
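The time-evolving probability of exceedance compared against a critical value can be sketched under a Poisson assumption, with a hypothetical rate of PGA-threshold exceedances (the study's actual model is richer, combining induced-seismicity rates, b-values and a ground-motion prediction equation):

```python
import math

def prob_exceedance(rate_per_day: float, exposure_days: float) -> float:
    """Probability of at least one exceedance of a PGA threshold in
    the exposure period, for a Poisson process with the given mean
    rate of exceeding events: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_day * exposure_days)

# Two-month exposure window as in the study; hypothetical exceedance rate.
p = prob_exceedance(rate_per_day=0.006, exposure_days=60.0)
safe = p <= 0.30  # compare against a 30% acceptance criterion
print(round(p, 3), safe)
```

Re-evaluating the rate from recent catalog data as operations proceed is what makes the resulting hazard estimate time-dependent.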

Convertito, V.; Maercklin, N.; Sharma, N.; Zollo, A.

2012-12-01

422

Seismic hazard assessment in Central Asia using smoothed seismicity approaches  

NASA Astrophysics Data System (ADS)

Central Asia has a long history of frequent moderate-to-large seismicity and is therefore considered one of the most seismically active, high-hazard regions in the world. In the hazard map produced at global scale by the GSHAP project in 1999 (Giardini, 1999), Central Asia is characterized by peak ground accelerations for a 475-year return period as high as 4.8 m/s². Central Asia was therefore selected as a target area for the EMCA project (Earthquake Model Central Asia), a regional project of GEM (Global Earthquake Model). In the framework of EMCA, a new generation of seismic hazard maps is foreseen in terms of macroseismic intensity, in turn to be used to obtain seismic risk maps for the region. An intensity prediction equation (IPE) was therefore developed for the region, based on the distribution of intensity data for earthquakes that have occurred in Central Asia since the end of the 19th century (Bindi et al. 2011). The same observed intensity distributions were used to assess the seismic hazard following the site approach (Bindi et al. 2012). In this study, we present a probabilistic seismic hazard assessment of Central Asia in terms of MSK-64 intensity, based on two kernel estimation methods: the smoothed seismicity approach of Frankel (1995), modified to use the adaptive kernel proposed by Stock and Smith (2002), and that of Woo (1996), modified to consider a grid of sites and to estimate a separate bandwidth for each site. Activity rate maps from the Frankel approach are shown, illustrating the effects of fixed and adaptive kernels. The hazard is estimated for rock site conditions for a 10% probability of exceedance in 50 years. A maximum intensity of about 9 is observed in the Hindukush region.
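The fixed-kernel smoothing of Frankel (1995) mentioned above can be sketched in one dimension (the published method works on a 2-D geographic grid; the bandwidth and counts here are illustrative):

```python
import math

def frankel_smooth(counts, cell_km: float, c_km: float):
    """Fixed-bandwidth Gaussian smoothing of gridded event counts
    (after Frankel, 1995):
    n_i = sum_j N_j * exp(-d_ij^2 / c^2) / sum_j exp(-d_ij^2 / c^2).
    `counts` is a 1-D grid of event counts for simplicity."""
    n = len(counts)
    smoothed = []
    for i in range(n):
        num = den = 0.0
        for j in range(n):
            d = abs(i - j) * cell_km          # inter-cell distance in km
            w = math.exp(-(d * d) / (c_km * c_km))
            num += counts[j] * w
            den += w
        smoothed.append(num / den)
    return smoothed

# A single cell with 10 events spreads into its neighbours:
print([round(x, 2) for x in frankel_smooth([0, 0, 10, 0, 0], 10.0, 15.0)])
```

An adaptive kernel, as in Stock and Smith (2002), would replace the fixed bandwidth `c_km` with a per-event bandwidth that shrinks where seismicity is dense.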

Ullah, Shahid; Bindi, Dino; Zuccolo, Elisa; Mikhailova, Natalia; Danciu, Laurentiu; Parolai, Stefano

2014-05-01

423

First level seismic microzonation map of Chennai city - a GIS approach  

NASA Astrophysics Data System (ADS)

Chennai, the capital of the State of Tamil Nadu, is the fourth largest metropolis in India and a focus of economic, social and cultural development. The city has seen multi-dimensional growth in infrastructure and population. The Chennai area has experienced moderate earthquakes in the historical past, and the Bureau of Indian Standards upgraded the seismic status of Chennai from Low Seismic Hazard (Zone II) to Moderate Seismic Hazard (Zone III) (BIS: 1893 (2001)). In this connection, a first-level seismic microzonation map of Chennai city has been produced on a GIS platform using the following themes: peak ground acceleration (PGA), shear wave velocity at 3 m depth, geology, groundwater fluctuation and bedrock depth. Potential nearby seismic sources were identified from remote-sensing study and from seismotectonic details in published literature. The peak ground acceleration for these seismic sources was estimated using an attenuation relationship; the maximum PGA for Chennai is 0.176 g. The groundwater level of the city fluctuates from 0 to 4 m below ground level. The depth-to-bedrock configuration shows troughs and ridges in the bedrock topography all over the city. The seismic microzonation analysis used grid datasets (the discrete datasets from the different themes were converted to grids) to compute the final seismic hazard grid through integration and weightage analysis of the source themes. Chennai city has thus been classified into three broad zones: high, moderate and low seismic hazard. High seismic hazard is concentrated in a few places in the western central part of the city; the moderate hazard areas are oriented in a NW-SE direction in the western part; and the southern and eastern parts have low seismic hazard. 
The results of the study may be used as first-hand information in selecting appropriate earthquake-resistant features when designing new buildings against seismic ground motion in the city.
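The grid integration and weightage analysis described above amounts to a weighted overlay of co-registered theme grids; a minimal sketch with hypothetical normalized values and weights (the study's actual normalizations and weights are not given in the abstract):

```python
def weighted_overlay(themes, weights):
    """Combine co-registered theme grids (lists of equal length,
    each value already normalized to 0-1, higher = more hazardous)
    into a single hazard index grid by weighted summation."""
    assert len(themes) == len(weights)
    total_w = sum(weights)
    n_cells = len(themes[0])
    return [sum(w * t[k] for t, w in zip(themes, weights)) / total_w
            for k in range(n_cells)]

# Hypothetical normalized grids (3 cells) for PGA, shear-wave velocity
# (inverted so low Vs = high hazard) and groundwater depth (inverted),
# with illustrative weights:
pga = [0.9, 0.5, 0.2]
vs3 = [0.8, 0.4, 0.3]
gw  = [0.7, 0.6, 0.1]
index = weighted_overlay([pga, vs3, gw], [0.5, 0.3, 0.2])
print([round(x, 3) for x in index])
```

The resulting index values would then be binned into the high/moderate/low classes used in the map.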

Ganapathy, G. P.

2011-02-01

424

Earthquake magnitude or seismic moment in seismic hazard evaluation?  

NASA Astrophysics Data System (ADS)

Seismic hazard analysis requires estimating the probability that earthquakes will take place within a region of interest, and the expected level of ground motion that will be experienced at a site during the next t years. The earthquake magnitude has commonly been used as the basic parameter, because it is readily available, under the assumption that earthquake occurrence is a compound Poisson process with an exponential or multinomial distribution of magnitude. To improve the hazard prediction, we instead used the seismic moment as the basic parameter, estimating the mean rate of occurrence of earthquakes, λ, as a function of the seismic moment rate and the slip rate released in a seismogenic region. As an illustration of the model, a seismic hazard analysis at different sites in and around the Gulf of Corinth, central Greece, is presented on the basis of both the earthquake magnitude and the seismic moment. Comparison of the results shows that determining the mean rate of earthquake occurrence with the conventional Gutenberg-Richter recurrence model underestimates the seismic hazard at a site.
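For converting between the two basic parameters discussed above, the standard Hanks-Kanamori moment-magnitude relation (not stated in the abstract, but the usual bridge between moment and magnitude) can be sketched as:

```python
import math

def moment_to_magnitude(m0_newton_m: float) -> float:
    """Moment magnitude from seismic moment (N*m), using the
    Hanks-Kanamori relation: Mw = (log10(M0) - 9.1) / 1.5."""
    return (math.log10(m0_newton_m) - 9.1) / 1.5

def magnitude_to_moment(mw: float) -> float:
    """Inverse relation: M0 = 10**(1.5*Mw + 9.1) in N*m."""
    return 10.0 ** (1.5 * mw + 9.1)

# Round trip: the two relations are exact inverses of each other.
print(round(moment_to_magnitude(magnitude_to_moment(6.5)), 6))  # 6.5
```

Note the 1.5 factor: one unit of magnitude corresponds to roughly a 32-fold increase in seismic moment, which is why moment-rate budgets and magnitude-frequency statistics can give different hazard estimates.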

Stavrakakis, Georgios N.

1988-03-01

425

A study on seismicity and seismic hazard for Karnataka State  

NASA Astrophysics Data System (ADS)

This paper presents a detailed study of the seismicity pattern of the state of Karnataka and quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km of the Karnataka political boundary) were compiled and the hazard analysis was based on these data. Geographically, Karnataka forms part of peninsular India, which is tectonically identified as an intraplate region of the Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements occur along major intraplate faults, resulting in seismic activity in the region; hence the hazard assessment of this region is very important. Apart from referring to the seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected up to 2009, and declustering was carried out to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating a logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state on a grid of 0.05° × 0.05°. Attenuation relations proposed for stable continental shield regions were used in evaluating the seismic hazard, with appropriate weightage factors. Response spectra at rock level were evaluated for Bangalore and for important Tier II cities. Contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
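The Gutenberg-Richter recurrence model that underlies this kind of probabilistic analysis can be sketched with hypothetical a- and b-values (the study's fitted values are not given in the abstract):

```python
def gr_annual_rate(a: float, b: float, m: float) -> float:
    """Annual rate of earthquakes with magnitude >= m from the
    Gutenberg-Richter relation log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * m)

# Illustrative (hypothetical) seismicity parameters for a source zone:
a_val, b_val = 3.5, 0.9
rate_m5 = gr_annual_rate(a_val, b_val, 5.0)        # events/yr with M >= 5
print(round(rate_m5, 4), round(1.0 / rate_m5, 1))  # rate, return period
```

In a logic-tree setting, alternative (a, b) pairs and attenuation relations each carry a weight, and the hazard curves are combined accordingly.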

Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha

2012-04-01

426

Induced seismicity in mines in Canada—An overview  

Microsoft Academic Search

Monitoring of mine-induced seismicity in Canada has improved with the expansion of regional seismograph networks into areas of active mining. However, the severity, and in some cases the frequency, of mine-induced tremors has increased as mining extends to greater depths and at accelerated rates of extraction. Because of the complex design and large areal extent of many mines (potash, coal

Henry S. Hasegawa; Robert J. Wetmiller; Don J. Gendzwill

1989-01-01

427

Seismic performance parameters of RC beams retrofitted by CARDIFRC ®  

Microsoft Academic Search

A new high performance fibre-reinforced cementitious composite material (designated CARDIFRC®), to be used for retrofitting concrete members, has been developed at Cardiff University. The material is compatible with concrete and possesses favourable strength and ductility properties desirable for seismic retrofit. It overcomes some of the problems associated with the current techniques based on externally bonded steel plates and fibre-reinforced polymer

Mahmoud R. Maheri; Bhushan Lal Karihaloo; Farshid Jandaghi Alaee

2004-01-01

428

An evaluation of the seismic-window theory for earthquake prediction.  

USGS Publications Warehouse

Reports studies designed to determine whether earthquakes in the San Francisco Bay area respond to a fortnightly fluctuation in tidal amplitude. It does not appear that the tide is capable of triggering earthquakes, and in particular the seismic window theory fails as a relevant method of earthqu