Sample records for computer aided seismic

  1. Aerospace technology can be applied to exploration 'back on earth'. [offshore petroleum resources]

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1977-01-01

    Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods and down-hole acoustic concepts, such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.

  2. Seismic expression of Red Fork channels in Major and Kay Counties, Oklahoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanoch, C.A.

    1987-08-01

    This paper investigates the application of regional seismic data to exploration and development of the Red Fork sands of the Cherokee Group in Major and Kay Counties, Oklahoma. A computer-aided exploration system (CAEX) was used to reconcile the subtle seismic expressions with the geological interpretation. Modeling shows that the low-velocity shales are the anomalous rock in the Cherokee package, which is mostly represented by siltstone and thin sands. Because the Red Fork channel sands were incised into or deposited with laterally time-equivalent siltstones, no strong reflection coefficient is associated with the top of the sands. The objective sands become a seismic anomaly only when they cut into and replace a low-velocity shale. This knowledge allows mapping of the channel thickness by interpreting the shale thickness from seismic data. A group-shoot line in Major County, Oklahoma, has been tied to the geologic control, and the channel thicknesses have been interpreted assuming a detectable vertical resolution of 10 ft. A personal computer-based geophysical workstation is used to construct velocity logs representative of the geology, to produce forward-modeled synthetic seismic sections, and to display, in color, the seismic trace attributes. These synthetic sections are used as tools to compare with and interpret the seismic line and to evaluate the interpretive value of lowest-cost, lesser-quality data versus reprocessing or new data acquisition.

  3. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not yet predictable, accurate seismic wave propagation analysis lets us simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. By integrating strong ground-motion sensor networks, an earthquake data center and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability of potential earthquake hazards. This application also demonstrates an e-Science approach to investigating unknown earth structure. Regional integration of earthquake sensor networks aids fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis improves the predictability of potential hazard impacts. Within the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of seismograms at any sensor point for a specified epicenter. To ease access to all the services based on users' workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflows, services and user communities will be implemented based on typical use cases. In the future, extension of earthquake wave propagation to tsunami mitigation would be feasible once the user community support is in place.

  4. A Kirchhoff approach to seismic modeling and prestack depth migration

    NASA Astrophysics Data System (ADS)

    Liu, Zhen-Yue

    1993-05-01

    The Kirchhoff integral provides a robust method for implementing seismic modeling and prestack depth migration that can handle lateral velocity variation and turning waves. With a little extra computational cost, Kirchhoff-type migration can obtain multiple outputs that have the same phase but different amplitudes, unlike other migration methods. The ratio of these amplitudes is helpful in computing quantities such as the reflection angle. I develop a seismic modeling and prestack depth migration method based on the Kirchhoff integral that handles both laterally varying velocity and dips beyond 90 degrees. The method uses a finite-difference algorithm to calculate travel times and WKBJ amplitudes for the Kirchhoff integral. Compared to ray-tracing algorithms, the finite-difference algorithm gives an efficient implementation and single-valued quantities (first arrivals) on output. In my finite-difference algorithm, an upwind scheme is used to calculate travel times and a Crank-Nicolson scheme is used to calculate amplitudes; interpolation is applied to save computational cost. The modeling and migration algorithms require a smooth velocity function, so I develop a velocity-smoothing technique based on damped least squares to aid in obtaining a successful migration.
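
    The upwind travel-time calculation mentioned above can be illustrated with a first-order Godunov fast-sweeping solver for the 2D eikonal equation, which yields the single-valued first arrivals that Kirchhoff summation needs. This is a minimal sketch, not the dissertation's scheme; the grid size, spacing, source position and constant-velocity model are illustrative assumptions.

    ```python
    import numpy as np

    def eikonal_fast_sweep(slowness, h, src, n_sweeps=8):
        """First-arrival travel times via Godunov upwind fast sweeping (2D).

        slowness : 2D array of 1/velocity values
        h        : grid spacing
        src      : (i, j) grid index of the point source
        """
        ny, nx = slowness.shape
        T = np.full((ny, nx), np.inf)
        T[src] = 0.0
        sweeps = [(range(ny), range(nx)),
                  (range(ny), range(nx - 1, -1, -1)),
                  (range(ny - 1, -1, -1), range(nx)),
                  (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
        for _ in range(n_sweeps):
            for rows, cols in sweeps:
                for i in rows:
                    for j in cols:
                        a = min(T[i - 1, j] if i > 0 else np.inf,
                                T[i + 1, j] if i < ny - 1 else np.inf)
                        b = min(T[i, j - 1] if j > 0 else np.inf,
                                T[i, j + 1] if j < nx - 1 else np.inf)
                        f = slowness[i, j] * h
                        if abs(a - b) >= f:      # update supported from one side only
                            t_new = min(a, b) + f
                        else:                    # two-sided upwind update
                            t_new = 0.5 * (a + b + np.sqrt(2.0 * f * f - (a - b) ** 2))
                        T[i, j] = min(T[i, j], t_new)
        return T

    # Illustrative use: uniform 2 km/s medium, 10 m grid, source in one corner.
    times = eikonal_fast_sweep(np.full((101, 101), 1.0 / 2000.0), 10.0, (0, 0))
    ```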

  5. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    NASA Astrophysics Data System (ADS)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies, including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches, are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity models and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for the prior model's 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic data, electrical logs and geochemical analyses of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative, practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we are confident that significant and practical advances in this direction have been accomplished.
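
    As a rough illustration of how seismic attributes can seed a prior conductivity model, the sketch below clusters synthetic attribute vectors into subvolumes with k-means and assigns each cluster an assumed conductivity. It is only a minimal stand-in for the statistical analysis described above; the attribute names, cluster count and conductivity values are invented for the example.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Hypothetical co-located attribute volumes flattened to (n_cells, n_attributes);
    # in practice these might be reflection strength, coherence, impedance, etc.
    n_cells = 50_000
    attributes = np.column_stack([rng.normal(size=n_cells),
                                  rng.normal(size=n_cells),
                                  rng.normal(size=n_cells)])

    # Cluster cells into a handful of seismically distinct subvolumes.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(attributes)

    # Assign an assumed prior conductivity (S/m) to each cluster to seed the MT prior model.
    prior_conductivity_per_cluster = np.array([0.001, 0.01, 0.1, 0.3])
    prior_model = prior_conductivity_per_cluster[labels]
    print(prior_model.shape, np.unique(prior_model))
    ```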

  6. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow structural and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
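
    The shifting-and-summation step is simple to demonstrate: segments of the continuous record are aligned on the recorded impact times and stacked, so a weak repeated impulse emerges from incoherent noise. The sketch below is a minimal synthetic illustration, not the Mini-Sosie field procedure; the impulse response, impact count and amplitudes are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_samples, impulse_len = 60_000, 400

    # Assumed earth impulse response seen by a geophone (two arrivals); illustrative only.
    earth = np.zeros(impulse_len)
    earth[50], earth[220] = 1.0, 0.4

    # Continuous record: many weak impacts at known trigger times, buried in noise.
    impact_times = np.sort(rng.integers(0, n_samples - impulse_len, size=300))
    record = rng.normal(scale=1.0, size=n_samples)
    for t0 in impact_times:
        record[t0:t0 + impulse_len] += 0.2 * earth   # each impact is far below the noise

    # Shift-and-sum: align the record on each trigger time and stack the segments.
    stack = np.zeros(impulse_len)
    for t0 in impact_times:
        stack += record[t0:t0 + impulse_len]
    stack /= len(impact_times)
    print("recovered first-arrival amplitude:", round(stack[50], 3))   # ~0.2 expected
    ```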

  7. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method that addresses node-traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect to identify bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).

  8. Seismic attributes and advanced computer algorithm to predict formation pore pressure: Qalibah formation of Northwest Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Nour, Abdoulshakour M.

    Oil and gas exploration professionals have long recognized the importance of predicting pore pressure before drilling wells. Pre-drill pore pressure estimation not only helps with drilling wells safely but also aids in the determination of formation fluid migration and seal integrity. With respect to hydrocarbon reservoirs, the appropriate drilling mud weight is directly related to the estimated pore pressure in the formation. If the mud weight is lower than the formation pressure, a blowout may occur; conversely, if it is higher than the formation pressure, the formation may suffer irreparable damage due to the invasion of drilling fluids into the formation. A simple definition of pore pressure is the pressure of the pore fluids in excess of the hydrostatic pressure. In this thesis, I investigated the utility of an advanced computer algorithm called the Support Vector Machine (SVM) to learn the pattern of the high pore pressure regime, using seismic attributes such as Instantaneous phase, t*Attenuation, Cosine of Phase, Vp/Vs ratio, P-Impedance, Reflection Acoustic Impedance, Dominant frequency and one well attribute (Mud Weight) as the learning dataset. I applied this technique to the overpressured Qalibah formation of Northwest Saudi Arabia. The results of my research revealed that in the Qalibah formation of Northwest Saudi Arabia, the pore pressure trend can be predicted using SVM with seismic and well attributes as the learning dataset. I was able to show the pore pressure trend at any given point within the geographical extent of the 3D seismic data from which the seismic attributes were derived. In addition, my results surprisingly showed the subtle variation of pressure within the thick succession of shale units of the Qalibah formation.
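
    The workflow described (seismic and well attributes in, pore-pressure trend out) maps naturally onto a support vector regression. The sketch below is a minimal, hedged example on synthetic placeholder features using scikit-learn; it is not the thesis workflow, and the feature columns only stand in for attributes such as instantaneous phase, Vp/Vs ratio and mud weight.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for eight attributes (seismic attributes plus mud weight).
    n = 2000
    X = rng.normal(size=(n, 8))
    # Hypothetical pore-pressure trend: a nonlinear mix of the attributes plus noise.
    y = 9.0 + 1.5 * X[:, 3] - 0.8 * X[:, 5] ** 2 + 0.1 * rng.normal(size=n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
    model.fit(X_tr, y_tr)
    print("R^2 on held-out samples:", round(model.score(X_te, y_te), 3))
    ```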

  9. GeoNetGIS: a Geodetic Network Geographical Information System to manage GPS networks in seismic and volcanic areas

    NASA Astrophysics Data System (ADS)

    Cristofoletti, P.; Esposito, A.; Anzidei, M.

    2003-04-01

    This paper presents the methodologies and issues involved in the use of GIS techniques to manage geodetic information derived from networks in seismic and volcanic areas. The organization and manipulation of different geodetic, geological and seismic databases present a new challenge in interpreting information that has several dimensions, including spatial and temporal variations; the flexibility and broad range of tools available in GeoNetGIS also make it an attractive platform for earthquake risk assessment. During the last decade the use of geodetic networks based on the Global Positioning System, devoted to geophysical applications, especially for crustal deformation monitoring in seismic and volcanic areas, increased dramatically. The large amount of data provided by these networks, combined with different and independent observations, such as the epicentre distribution of recent and historical earthquakes, geological and structural data, and photo interpretation of aerial and satellite images, can aid in the detection and parameterization of seismogenic sources. In particular, we applied our geodetically oriented GIS to a new GPS network recently set up and surveyed in the Central Apennine region: the CA-GeoNet. GeoNetGIS is designed to analyze GPS sources in three and four dimensions and to improve crustal deformation analysis and interpretation related to tectonic structures and seismicity. It manages several databases (DBMS) consisting of different classes, such as Geodesy, Topography, Seismicity, Geology, Geography and Raster Images, administrated according to Thematic Layers. GeoNetGIS represents a powerful research tool that joins the analysis of all data layers and integrates the different databases, aiding the identification of the activity of known faults or structures and suggesting new evidence of active tectonics. The new approach to data integration offered by GeoNetGIS allows us to create and deliver a wide range of maps and digital, 3-dimensional data-analysis applications for geophysical users and civil defense agencies, and to distribute them on the World Wide Web or over wireless connections to PDA computers. It runs on a powerful PC platform under the Win2000 Prof OS © and is based on ArcGIS 8.2 ESRI © software.

  10. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
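
    A heavily simplified sketch of the fingerprint-and-group idea behind FAST is shown below: windows of continuous data are reduced to compact binary fingerprints, fingerprints are bucketed by hashing bands of bits, and windows that share a bucket become candidate similar pairs. The spectral feature extraction and locality-sensitive hashing used in the published method are far more sophisticated; window length, bit counts and the synthetic data are arbitrary.

    ```python
    import numpy as np
    from collections import defaultdict

    def fingerprints(trace, win=200, step=100, k=64, seed=0):
        """Binary fingerprints: signs of random projections of each window's spectrum."""
        rng = np.random.default_rng(seed)
        proj = rng.normal(size=(win // 2 + 1, k))          # fixed random projection
        starts = list(range(0, len(trace) - win, step))
        fps = []
        for s in starts:
            spec = np.abs(np.fft.rfft(trace[s:s + win]))   # crude spectral feature vector
            fps.append((spec @ proj) > 0)                  # k-bit fingerprint
        return np.array(fps), starts

    def candidate_pairs(fps, bands=8):
        """Group windows whose fingerprints collide in any band of bits (LSH-style)."""
        width = fps.shape[1] // bands
        buckets = defaultdict(list)
        for idx, fp in enumerate(fps):
            for b in range(bands):
                buckets[(b, fp[b * width:(b + 1) * width].tobytes())].append(idx)
        pairs = set()
        for members in buckets.values():
            pairs.update((i, j) for i in members for j in members if i < j)
        return pairs

    # Illustrative use: noise with the same synthetic waveform hidden at two times.
    rng = np.random.default_rng(2)
    data = rng.normal(size=20_000)
    event = np.sin(np.linspace(0, 30, 200)) * np.hanning(200)
    data[3000:3200] += 5 * event
    data[12000:12200] += 5 * event
    fps, starts = fingerprints(data)
    print(len(candidate_pairs(fps)), "candidate similar-window pairs")
    ```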

  11. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
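
    The linearized update at the heart of this scheme (a tomographic matrix relating slowness adjustments to travel-time residuals, solved with regularization) can be written very compactly. The sketch below uses SciPy's LSQR with damping on a synthetic sparse matrix; building the matrix from computed ray paths or sensitivity kernels is not shown, and the damping value is arbitrary.

    ```python
    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(3)
    n_rays, n_cells = 800, 2500

    # Stand-in for the tomographic matrix G: each row holds the path lengths of one
    # ray (or the sensitivity kernel of one pick) through the slowness cells.
    G = sparse_random(n_rays, n_cells, density=0.02, random_state=3, data_rvs=rng.random)

    true_dm = np.zeros(n_cells)
    true_dm[1000:1100] = 0.05                                   # hidden slowness anomaly
    residuals = G @ true_dm + 1e-3 * rng.normal(size=n_rays)    # travel-time residuals

    # Damped least squares: minimize ||G dm - r||^2 + damp^2 ||dm||^2, then update the model.
    dm = lsqr(G, residuals, damp=0.1)[0]
    print("mean recovered anomaly:", round(dm[1000:1100].mean(), 4))
    ```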

  12. Processing grounded-wire TEM signal in time-frequency-pseudo-seismic domain: A new paradigm

    NASA Astrophysics Data System (ADS)

    Khan, M. Y.; Xue, G. Q.; Chen, W.; Huasen, Z.

    2017-12-01

    Grounded-wire TEM has received great attention in mineral, hydrocarbon and hydrogeological investigations over the last several years. Conventionally, TEM soundings have been presented as apparent resistivity curves as a function of time. With the development of sophisticated computational algorithms, it became possible to extract more realistic geoelectric information by applying inversion programs to 1-D and 3-D problems. Here, we analyze grounded-wire TEM data in the time, frequency and pseudo-seismic domains, supported by borehole information. First, H-, K-, A- and Q-type geoelectric models are processed using a proven inversion program (1-D Occam inversion). Second, a time-to-frequency transformation is applied, converting the TEM ρa(t) curves into magnetotelluric (MT) ρa(f) curves for the same models based on all-time apparent resistivity curves. Third, the 1-D Bostick algorithm is applied to the transformed resistivity. Finally, the EM diffusion field is transformed into a propagating wave field obeying the standard wave equation using a wavelet transformation technique, and a pseudo-seismic section is constructed. The transformed seismic-like wave shows that reflection and refraction phenomena appear where the EM wave field interacts with geoelectric interfaces at different depth intervals due to contrasts in resistivity. The resolution of the transformed TEM data is significantly improved in comparison with apparent resistivity plots. A case study illustrates the successful hydrogeophysical application of the proposed approach in recovering a water-filled mined-out area in a coal field located in Ye County, Henan Province, China. The results support the introduction of pseudo-seismic imaging technology in the short-offset version of TEM, which can also be a useful aid if integrated with the seismic reflection technique to explore possibilities for high-resolution EM imaging in the future.
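
    The depth-transformation step can be illustrated with the standard Niblett-Bostick approximation, which turns an apparent-resistivity-versus-frequency curve into resistivity versus depth. This is a textbook sketch only, with a synthetic input curve; it is not the authors' processing chain, and the formula below assumes the usual MT definitions.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

    def bostick_transform(freq, rho_a):
        """Niblett-Bostick approximation: apparent resistivity vs frequency (Hz, ohm-m)
        mapped to resistivity vs depth (m, ohm-m)."""
        period = 1.0 / freq
        depth = np.sqrt(rho_a * period / (2.0 * np.pi * MU0))
        m = np.gradient(np.log(rho_a), np.log(period))   # slope d(log rho_a)/d(log T)
        rho_b = rho_a * (1.0 + m) / (1.0 - m)
        return depth, rho_b

    # Synthetic, smoothly varying apparent-resistivity curve (illustrative only).
    f = np.logspace(3, -1, 40)                    # 1 kHz down to 0.1 Hz
    rho_app = 50.0 + 150.0 / (1.0 + f / 5.0)
    depth, rho_bostick = bostick_transform(f, rho_app)
    print(round(depth.min()), round(depth.max()))
    ```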

  13. An efficient implementation of 3D high-resolution imaging for large-scale seismic data with GPU/CPU heterogeneous parallel computing

    NASA Astrophysics Data System (ADS)

    Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng

    2018-02-01

    De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, thereby making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, the associated computational efficiency is still the main problem in the processing of 3D, high-resolution images for real large-scale seismic data. In this paper, we propose a division method for large-scale 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPUs). We then design an imaging-point parallel strategy to achieve optimal parallel computing performance and adopt an asynchronous double-buffering scheme for multi-stream GPU/CPU parallel computing. Moreover, several key optimization strategies for computation and storage based on the compute unified device architecture (CUDA) are adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, the implementation of the key optimization steps, including thread optimization, shared memory optimization, register optimization and special function units (SFU), greatly improved the efficiency. A numerical example employing real large-scale 3D seismic data showed that our scheme is nearly 80 times faster than the CPU-QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging for large-scale seismic data.

  14. DigiSeis—A software component for digitizing seismic signals using the PC sound card

    NASA Astrophysics Data System (ADS)

    Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar

    2012-06-01

    An innovative software-based approach to developing an inexpensive experimental seismic recorder is presented. This approach requires no additional hardware, as the built-in PC sound card is used for digitization of seismic signals. DigiSeis, an ActiveX component, is developed to capture the digitized seismic signals from the sound card and deliver them to applications for processing and display. A seismic recorder application, SeisWave, is developed on top of this component; it provides real-time monitoring and display of seismic events picked up by a pair of external geophones. This recorder can be used as an educational aid for conducting seismic experiments. It can also be connected to suitable seismic sensors to record earthquakes. The software application and the ActiveX component are available for download. The component can be used to develop seismic recording applications according to user-specific requirements.
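
    As an analogous illustration only (DigiSeis itself is an ActiveX component), the sketch below captures a two-channel signal from the PC sound card in Python using the third-party sounddevice package, which is assumed to be installed; the sampling rate, duration and channel wiring are arbitrary choices.

    ```python
    import numpy as np
    import sounddevice as sd   # third-party package, assumed installed (pip install sounddevice)

    FS = 44_100          # sound-card sampling rate, Hz
    DURATION = 5.0       # record length, seconds

    # Record two channels (e.g. a pair of geophones wired to the line-in jack).
    frames = sd.rec(int(DURATION * FS), samplerate=FS, channels=2, dtype="float32")
    sd.wait()            # block until the recording is finished

    left, right = frames[:, 0], frames[:, 1]
    print("peak amplitudes:", np.abs(left).max(), np.abs(right).max())
    ```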

  15. Mantle circulation models with variational data assimilation: Inferring past mantle flow and structure from plate motion histories and seismic tomography

    NASA Astrophysics Data System (ADS)

    Bunge, Hans-Peter

    2002-08-01

    Earth's mantle overturns itself about once every 200 million years (Myr). Prima facie evidence for this overturn is the motion of tectonic plates at the surface of the Earth, driving the geologic activity of our planet. Supporting evidence also comes from seismic tomograms of the Earth's interior that reveal the convective currents in remarkable clarity. Much has been learned about the physics of solid-state mantle convection over the past two decades, aided primarily by sophisticated computer simulations. Such simulations are reaching the threshold of fully resolving the convective system globally. In this talk we will review recent progress in mantle dynamics studies. We will then turn our attention to the fundamental question of whether it is possible to explicitly reconstruct mantle flow back in time. This is a classic problem of history matching, amenable to control theory and data assimilation. The technical advances that make such an approach feasible are dramatically increasing compute resources, represented for example by Beowulf clusters, and new observational initiatives, represented for example by the US-Array effort that should lead to an order-of-magnitude improvement in our ability to resolve Earth structure seismically below North America. In fact, new observational constraints on deep Earth structure illustrate the growing importance of improving our data assimilation skills in deep Earth models. We will explore data assimilation through high-resolution global adjoint models of mantle circulation and conclude that it is feasible to reconstruct mantle flow back in time for at least the past 100 Myr.

  16. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node-traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect to identify bad array elements through a jackknifing process that isolates the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
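
    A minimal numerical sketch of this idea is given below: compute the principal components of the array-wide multichannel time series, measure an effective dimension, then jackknife (leave one channel out at a time) to see whose removal changes the dimension most. The synthetic data, energy threshold and decision rule are illustrative assumptions, not the operational settings of the authors' system.

    ```python
    import numpy as np

    def effective_dimension(X, energy=0.95):
        """Number of principal components capturing `energy` of the variance.
        X has shape (n_channels, n_samples)."""
        Xc = X - X.mean(axis=1, keepdims=True)
        s = np.linalg.svd(Xc, compute_uv=False)
        frac = np.cumsum(s ** 2) / np.sum(s ** 2)
        return int(np.searchsorted(frac, energy) + 1)

    def jackknife_channels(X):
        """Change in effective dimension when each channel is left out."""
        base = effective_dimension(X)
        deltas = []
        for ch in range(X.shape[0]):
            keep = [i for i in range(X.shape[0]) if i != ch]
            deltas.append(base - effective_dimension(X[keep]))
        return base, np.array(deltas)

    # Synthetic array: 9 channels share a common signal, one channel is pure noise,
    # so removing the bad channel lowers the dimension of the remaining data.
    rng = np.random.default_rng(4)
    common = rng.normal(size=5000)
    X = common + 0.05 * rng.normal(size=(10, 5000))
    X[7] = rng.normal(size=5000)                 # malfunctioning element
    base_dim, deltas = jackknife_channels(X)
    print("effective dimension:", base_dim, "suspect channel:", int(np.argmax(deltas)))
    ```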

  17. Seismic Risk Studies in the United States.

    ERIC Educational Resources Information Center

    Algermissen, S.T.

    A new seismic risk map of the United States is presented, along with strain release and maximum Modified Mercalli intensity maps of the country. The frequency of occurrence of damaging earthquakes was not considered in the zone ratings, but the included frequency studies may aid interpretation. A discussion of methods is included, with a review of the calculations. (MH)

  18. New seismic study begins in Puerto Rico

    USGS Publications Warehouse

    Tarr, A.C.

    1974-01-01

    A new seismological project is now underway in Puerto Rico to provide information needed for accurate assessment of the island's seismic hazard. The project should also help to increase understanding of the tectonics and geologic evolution of the Caribbean region. The Puerto Rico Seismic Program is being conducted by the Geological Survey with support provided by the Puerto Rico Water Resources Authority, an agency responsible for generation and distribution of electric power throughout the Commonwealth. The Program will include the installation of a network of high-quality seismograph stations to monitor seismic activity on and around Puerto Rico. These stations will be distributed across the island to record the seismicity as uniformly as possible. The detection and accurate location of small earthquakes, as well as moderate-magnitude shocks, will aid in mapping active seismic zones and in compiling frequency-of-occurrence statistics which ultimately will be useful in seismic risk zoning of the island.

  19. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  20. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    NASA Astrophysics Data System (ADS)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology involves an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed from past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions along with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
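
    The evaluation step described above (training several classifiers and comparing them with McNemar's test) looks roughly like the sketch below. The features, labels and model settings are placeholders rather than the study's eight seismicity parameters, and only two of the listed models are shown.

    ```python
    import numpy as np
    from scipy.stats import chi2
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Placeholder data standing in for the selected seismicity parameters and labels.
    X, y = make_classification(n_samples=3000, n_features=6, n_informative=4, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    nn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    nn_ok = nn.predict(X_te) == y_te
    rf_ok = rf.predict(X_te) == y_te

    # McNemar's test on discordant predictions (one model right, the other wrong),
    # with continuity correction, compared against a chi-squared distribution (1 dof).
    b = int(np.sum(nn_ok & ~rf_ok))
    c = int(np.sum(~nn_ok & rf_ok))
    stat = (abs(b - c) - 1) ** 2 / (b + c) if (b + c) > 0 else 0.0
    p_value = chi2.sf(stat, df=1)
    print(f"accuracies: NN={nn_ok.mean():.3f} RF={rf_ok.mean():.3f}, McNemar p={p_value:.3f}")
    ```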

  1. Seismic wavefield modeling based on time-domain symplectic and Fourier finite-difference method

    NASA Astrophysics Data System (ADS)

    Fang, Gang; Ba, Jing; Liu, Xin-xin; Zhu, Kun; Liu, Guo-Chang

    2017-06-01

    Seismic wavefield modeling is important for improving seismic data processing and interpretation. Calculations of wavefield propagation are sometimes unstable when forward modeling of seismic waves uses large time steps for long simulation times. Based on the Hamiltonian expression of the acoustic wave equation, we propose a structure-preserving method for seismic wavefield modeling that applies the symplectic finite-difference method on time grids and the Fourier finite-difference method on space grids to solve the acoustic wave equation. The proposed method, called the symplectic Fourier finite-difference (symplectic FFD) method, offers high computational accuracy and improved computational stability. Using the acoustic approximation, we extend the method to anisotropic media. We discuss the calculations in the symplectic FFD method for seismic wavefield modeling of isotropic and anisotropic media, and use the BP salt model and BP TTI model to test the proposed method. The numerical examples suggest that the proposed method can be used in seismic modeling of strongly variable velocities, offering high computational accuracy and low numerical dispersion. The symplectic FFD method overcomes the residual qSV wave of seismic modeling in anisotropic media and maintains the stability of the wavefield propagation for large time steps.
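
    A 1D toy version of the general idea is easy to write down: spatial derivatives are computed spectrally with FFTs and time stepping uses a symplectic (velocity Verlet) scheme for the Hamiltonian form of the acoustic wave equation. The paper's symplectic FFD operators for 2D/3D and TTI media are considerably more elaborate; the model, step sizes and initial pulse below are illustrative.

    ```python
    import numpy as np

    def second_derivative_fft(u, dx):
        """Spectral second derivative of a periodic 1D field via FFT."""
        k = 2.0 * np.pi * np.fft.fftfreq(u.size, d=dx)
        return np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(u)))

    def propagate(c, dx, dt, n_steps, p0):
        """Velocity-Verlet (symplectic) time stepping of p_tt = c^2 p_xx."""
        p = p0.copy()
        v = np.zeros_like(p)                              # dp/dt
        a = c ** 2 * second_derivative_fft(p, dx)
        for _ in range(n_steps):
            v += 0.5 * dt * a
            p += dt * v
            a = c ** 2 * second_derivative_fft(p, dx)
            v += 0.5 * dt * a
        return p

    # Illustrative setup: ~2 km periodic model, two velocities, Gaussian initial pulse.
    nx, dx = 1024, 2.0
    x = np.arange(nx) * dx
    c = np.where(x < 1000.0, 2000.0, 3000.0)              # m/s
    p0 = np.exp(-((x - 600.0) ** 2) / (2.0 * 20.0 ** 2))
    dt = 0.4 * dx / c.max()                               # conservative time step
    snapshot = propagate(c, dx, dt, 800, p0)
    ```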

  2. Advanced Gas Hydrate Reservoir Modeling Using Rock Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McConnell, Daniel

    Prospecting for high-saturation gas hydrate deposits can be greatly aided by improved approaches to seismic interpretation, especially if sets of seismic attributes can be shown to be diagnostic or direct hydrocarbon indicators for high-saturation gas hydrates in sands, which would be of most interest for gas hydrate production. A large 3D seismic data set in the deepwater Eastern Gulf of Mexico was screened for gas hydrates using a set of techniques and seismic signatures that were developed and proven in the central deepwater Gulf of Mexico in the DOE Gulf of Mexico Joint Industry Project (JIP) Leg II in 2009 and recently confirmed with coring in 2017. A large gas hydrate deposit is interpreted in the data where gas has migrated from one of the few deep-seated faults plumbing the Jurassic hydrocarbon source into the gas hydrate stability zone. The gas hydrate deposit lies within a flat-lying Pliocene Mississippi Fan channel that was deposited outboard in a deep abyssal environment. The uniform architecture of the channel aided the evaluation of a set of seismic attributes that relate to attenuation and thin-bed energy that could be diagnostic of gas hydrates. Frequency attributes derived from spectral decomposition also proved to be direct hydrocarbon indicators through pseudo-thickness that could only be reconciled by substituting gas hydrate in the pore space. The study emphasizes that gas hydrate exploration and reservoir characterization benefit from a seismic thin-bed approach.

  3. Coal-seismic computer programs in BASIC: Part I; Store, plot, and edit array data

    USGS Publications Warehouse

    Hasbrouck, Wilfred P.

    1979-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in an extended BASIC language specially augmented for acceptance by the Tektronix 4051 Graphic System. This report presents five computer programs used to store, plot, and edit array data for the line, cross, and triangle arrays commonly employed in our coal-seismic investigations. * Use of brand names in this report is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey.

  4. An enhancement of NASTRAN for the seismic analysis of structures. [nuclear power plants

    NASA Technical Reports Server (NTRS)

    Burroughs, J. W.

    1980-01-01

    New modules, bulk data cards and DMAP sequence were added to NASTRAN to aid in the seismic analysis of nuclear power plant structures. These allow input consisting of acceleration time histories and result in the generation of acceleration floor response spectra. The resulting system contains numerous user convenience features, as well as being reasonably efficient.

  5. Research and development support for the Center for Seismic Studies

    NASA Astrophysics Data System (ADS)

    Romney, C. F.; Huszar, L.; Frazier, G. A.

    1984-07-01

    Work during the second and third quarters of FY1984 continued to focus on the development of the Center for Seismic Studies, and on planning and development to prepare for a test of seismic data exchange and event determination, as proposed by the Group of Scientific Experts of the UN Committee on Disarmament. A help system was designed and partially completed, and other aids for new users of the Center's data and facilities were developed. An introduction to Ingres was prepared, and new experimental databases were installed.

  6. Oklahoma's induced seismicity strongly linked to wastewater injection depth

    NASA Astrophysics Data System (ADS)

    Hincks, Thea; Aspinall, Willy; Cooke, Roger; Gernon, Thomas

    2018-03-01

    The sharp rise in Oklahoma seismicity since 2009 is due to wastewater injection. The role of injection depth is an open, complex issue, yet critical for hazard assessment and regulation. We developed an advanced Bayesian network to model joint conditional dependencies between spatial, operational, and seismicity parameters. We found that injection depth relative to crystalline basement most strongly correlates with seismic moment release. The joint effects of depth and volume are critical, as injection rate becomes more influential near the basement interface. Restricting injection depths to 200 to 500 meters above basement could reduce annual seismic moment release by a factor of 1.4 to 2.8. Our approach enables identification of subregions where targeted regulation may mitigate effects of induced earthquakes, aiding operators and regulators in wastewater disposal regions.
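
    The authors' Bayesian network is not reproduced here, but the kind of conditional summary such a model encodes can be sketched with a simple grouped table: mean seismic moment release given bins of injection depth above basement and injected volume. The data below are synthetic and constructed so that moment grows toward the basement; all numbers are placeholders.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    n = 5000

    # Synthetic wells: depth above basement (m), injected volume, and a seismic-moment
    # response that, by construction here, grows as injection approaches the basement.
    depth_above_basement = rng.uniform(0, 1500, n)
    volume = rng.lognormal(mean=10, sigma=0.5, size=n)
    log_moment = (12.0 - 0.002 * depth_above_basement + 0.3 * np.log(volume)
                  + rng.normal(scale=0.5, size=n))

    df = pd.DataFrame({
        "depth_bin": pd.cut(depth_above_basement, [0, 200, 500, 1000, 1500]),
        "volume_bin": pd.qcut(volume, 3, labels=["low", "mid", "high"]),
        "log_moment": log_moment,
    })

    # Conditional expectation of moment release given depth and volume bins.
    table = df.groupby(["depth_bin", "volume_bin"], observed=True)["log_moment"].mean().unstack()
    print(table.round(2))
    ```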

  7. Recent seismicity and crustal stress field in the Lucanian Apennines and surrounding areas (Southern Italy): Seismotectonic implications

    NASA Astrophysics Data System (ADS)

    Maggi, C.; Frepoli, A.; Cimini, G. B.; Console, R.; Chiappini, M.

    2009-01-01

    We analyzed the instrumental seismicity of Southern Italy in the area including the Lucanian Apennines and Bradano foredeep, making use of the most recent seismological database available so far. P- and S-wave arrival times, recorded by the Italian National Seismic Network (RSNC) operated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), were re-picked along with those of the SAPTEX temporary array deployed in the region in the period 2001-2004. For some events located in the upper Val d'Agri, we also used data from the Eni-Agip oil company seismic network. We examined the seismicity that occurred during the period between 2001 and 2006, considering 514 events with magnitudes M ≥ 2.0. We computed the VP/VS ratio, obtaining a value of 1.83, and we carried out an analysis for the one-dimensional (1D) velocity model that approximates the seismic structure of the study area. Earthquakes were relocated and, for well-recorded events, we also computed 108 fault plane solutions. Finally, using the 58 most constrained solutions, we computed the regional stress field in the study area. The earthquake distribution shows three main seismic regions: the westernmost (Lucanian Apennines), characterized by high background seismicity, mostly with shallow hypocenters; the easternmost, below the Bradano foredeep and the Murge, with deeper and more scattered seismicity; and finally the more isolated and sparse seismicity localized in the Sila Range and in the offshore area along the northeastern Calabrian coast. The focal mechanisms computed in this work are largely normal and strike-slip solutions, and their tensional axes (T-axes) have a generalized NE-SW orientation. The denser station coverage allowed us to improve hypocenter determinations compared to those obtained using only RSNC data, for a better characterization of the crustal and subcrustal seismicity in the study area.

  8. Seismic Characterization of EGS Reservoirs

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.

    2014-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  9. Recent development on computer aided tissue engineering--a review.

    PubMed

    Sun, Wei; Lal, Pallavi

    2002-02-01

    The utilization of computer-aided technologies in tissue engineering has evolved in the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent development and application of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly, in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation and prototype modeling assisted surgical planning and reconstruction.

  10. Method of migrating seismic records

    DOEpatents

    Ober, Curtis C.; Romero, Louis A.; Ghiglia, Dennis C.

    2000-01-01

    The present invention provides a method of migrating seismic records that retains the information in the seismic records and allows migration with significant reductions in computing cost. The present invention comprises phase encoding seismic records and combining the encoded seismic records before migration. Phase encoding can minimize the effect of unwanted cross terms while still allowing significant reductions in the cost to migrate a number of seismic records.
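
    A minimal frequency-domain sketch of the phase-encode-and-combine step is shown below: each shot record receives a distinct random phase factor before the encoded records are summed into one composite record for migration, so cross terms between shots tend to cancel. The specific encoding functions of the patent are not reproduced; the data and the single-phase-per-shot choice are illustrative.

    ```python
    import numpy as np

    def phase_encode_and_stack(records, seed=0):
        """Apply a random constant phase rotation to each shot record's spectrum and
        sum the encoded records into one composite record.

        records : array of shape (n_shots, n_traces, n_samples)
        """
        rng = np.random.default_rng(seed)
        n_shots, n_traces, n_samples = records.shape
        composite = np.zeros((n_traces, n_samples))
        for s in range(n_shots):
            phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi))   # one phase per shot
            spectra = np.fft.rfft(records[s], axis=-1) * phase
            composite += np.fft.irfft(spectra, n=n_samples, axis=-1)
        return composite

    # Illustrative use: 8 synthetic shot records combined into a single record to migrate.
    rng = np.random.default_rng(6)
    shots = rng.normal(size=(8, 48, 1000))
    combined = phase_encode_and_stack(shots)
    print(combined.shape)
    ```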

  11. Modeling Poroelastic Wave Propagation in a Real 2-D Complex Geological Structure Obtained via Self-Organizing Maps

    NASA Astrophysics Data System (ADS)

    Itzá Balam, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.

    2018-03-01

    Two main stages of seismic modeling are geological model building and numerical computation of seismic response for the model. The quality of the computed seismic response is partly related to the type of model that is built. Therefore, the model building approaches become as important as seismic forward numerical methods. For this purpose, three petrophysical facies (sands, shales and limestones) are extracted from reflection seismic data and some seismic attributes via the clustering method called Self-Organizing Maps (SOM), which, in this context, serves as a geological model building tool. This model with all its properties is the input to the Optimal Implicit Staggered Finite Difference (OISFD) algorithm to create synthetic seismograms for poroelastic, poroacoustic and elastic media. The results show a good agreement between observed and 2-D synthetic seismograms. This demonstrates that the SOM classification method enables us to extract facies from seismic data and allows us to integrate the lithology at the borehole scale with the 2-D seismic data.
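
    As a bare-bones stand-in for the SOM classification step (not the configuration used in the study), the sketch below trains a small rectangular self-organizing map on synthetic three-attribute data and assigns each sample to its best-matching node as a facies proxy; the grid size, neighborhood decay and synthetic clusters are all assumptions.

    ```python
    import numpy as np

    def train_som(data, grid=(6, 6), n_iter=3000, seed=0):
        """Minimal rectangular SOM; returns weights of shape (gy, gx, n_features)."""
        rng = np.random.default_rng(seed)
        gy, gx = grid
        w = rng.normal(size=(gy, gx, data.shape[1]))
        coords = np.stack(np.meshgrid(np.arange(gy), np.arange(gx), indexing="ij"), axis=-1)
        sigma0, lr0 = max(grid) / 2.0, 0.5
        for t in range(n_iter):
            x = data[rng.integers(len(data))]
            # best-matching unit
            bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(axis=-1)), (gy, gx))
            # linearly decaying neighborhood radius and learning rate
            frac = t / n_iter
            sigma = sigma0 * (1.0 - frac) + 0.5
            lr = lr0 * (1.0 - frac) + 0.01
            dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-dist2 / (2.0 * sigma ** 2))[..., None]
            w += lr * h * (x - w)
        return w

    def classify(data, w):
        """Assign each sample to its best-matching SOM node (a facies proxy)."""
        flat = w.reshape(-1, w.shape[-1])
        d = ((data[:, None, :] - flat[None, :, :]) ** 2).sum(axis=-1)
        return d.argmin(axis=1)

    # Synthetic three-facies attribute data (e.g. impedance and two other attributes).
    rng = np.random.default_rng(7)
    centers = np.array([[0, 0, 0], [3, 3, 0], [0, 3, 3]], dtype=float)
    data = np.vstack([c + 0.4 * rng.normal(size=(300, 3)) for c in centers])
    labels = classify(data, train_som(data))
    print("distinct SOM nodes used:", len(np.unique(labels)))
    ```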

  12. Seismic instrumentation plan for the Hawaiian Volcano Observatory

    USGS Publications Warehouse

    Thelen, Weston A.

    2014-01-01

    The installation of new seismic stations is only the first part of building a volcanic early warning capability for seismicity in the State of Hawaii. Additional personnel will likely be required to study the volcanic processes at work under each volcano, analyze the current seismic activity at a level sufficient for early warning, build new tools for monitoring, maintain seismic computing resources, and maintain the new seismic stations.

  13. Seismic hazard study for selected sites in New Mexico and Nevada

    NASA Astrophysics Data System (ADS)

    Johnston, J. C.

    1983-12-01

    Seismic hazard evaluations were conducted for specific sites in New Mexico and Nevada. For New Mexico, a model of seismicity was developed from historical accounts of medium to large shocks and the current microactivity record from local networks. Ninety percent confidence levels at Albuquerque and Roswell were computed to be 56 gals for a 10-year period and 77 gals for a 20-year period. Values of ground motion for Clovis were below these values. Peak velocity and displacement were also computed for each site. Deterministic spectra based on the estimated maximum credible earthquake for the zones which the sites occupy were also computed. For the sites in Nevada, the regionalizations used in Battis (1982) for the uniform seismicity model were slightly modified. For 10- and 20-year time periods, peak acceleration values for Indian Springs were computed to be 94 gals and 123 gals and for Hawthorne 206 gals and 268 gals. Deterministic spectra were also computed. The input parameters were well determined for the analysis for the Nevada sites because of the abundance of data. The values computed for New Mexico, however, are likely upper limits. As more data are collected from the area of the Rio Grande rift zone, the pattern of seismicity will become better understood. At this time a more detailed, and thus more accurate, model may emerge.
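
    Confidence levels of this kind follow from a Poisson exceedance calculation: find the ground motion whose annual exceedance rate gives a 10% chance of being exceeded over the exposure period. The sketch below assumes an invented annual exceedance-rate curve purely for illustration; it is not the seismicity model used in the study, so the printed numbers only demonstrate the arithmetic.

    ```python
    import numpy as np

    def acceleration_at_confidence(rate_fn, confidence, years, accel_grid):
        """Smallest peak acceleration whose exceedance probability over `years`
        is at most (1 - confidence), assuming Poisson occurrence of exceedances."""
        target_rate = -np.log(confidence) / years      # allowed annual exceedance rate
        rates = rate_fn(accel_grid)
        return accel_grid[np.argmax(rates <= target_rate)]

    # Assumed annual exceedance-rate curve for peak acceleration in gals (illustrative only).
    annual_rate = lambda a: 0.8 * np.exp(-a / 20.0)

    grid = np.linspace(1.0, 400.0, 4000)
    for t in (10, 20):
        a90 = acceleration_at_confidence(annual_rate, 0.90, t, grid)
        print(f"{t}-year, 90% confidence acceleration ~ {a90:.0f} gals")
    ```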

  14. Using Seismic and Infrasonic Data to Identify Persistent Sources

    NASA Astrophysics Data System (ADS)

    Nava, S.; Brogan, R.

    2014-12-01

    Data from seismic and infrasound sensors were combined to aid in the identification of persistent sources such as mining-related explosions. It is of interest to operators of seismic networks to identify these signals in their event catalogs. Acoustic signals below the threshold of human hearing, in the frequency range of ~0.01 to 20 Hz are classified as infrasound. Persistent signal sources are useful as ground truth data for the study of atmospheric infrasound signal propagation, identification of manmade versus naturally occurring seismic sources, and other studies. By using signals emanating from the same location, propagation studies, for example, can be conducted using a variety of atmospheric conditions, leading to improvements to the modeling process for eventual use where the source is not known. We present results from several studies to identify ground truth sources using both seismic and infrasound data.

  15. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  16. A seismic reflection velocity study of a Mississippian mud-mound in the Illinois basin

    NASA Astrophysics Data System (ADS)

    Ranaweera, Chamila Kumari

    Two mud-mounds have been reported in the Ullin limestone near, but not in, the Aden oil field in Hamilton County, Illinois. One mud-mound is in the Broughton oil field of Hamilton County 25 miles to the south of Aden. The second mud-mound is in the Johnsonville oil field in Wayne County 20 miles to the north of Aden. Seismic reflection profiles were shot in 2012 adjacent to the Aden oil field to evaluate the oil prospects and to investigate the possibility of detecting Mississippian mud-mounds near the Aden field. A feature on one of the seismic profiles was interpreted to be a mud-mound or carbonate buildup. A well drilled at the location of this interpreted structure provided digital geophysical logs and geological logs used to refine the interpretation of the seismic profiles. Geological data from the new well at Aden, in the form of drill cuttings, have been used to essentially confirm the existence of a mud-mound in the Ullin limestone at a depth of 4300 feet. Geophysical well logs from the new well near Aden were used to create 1-D computer models and synthetic seismograms for comparison to the seismic data. The reflection seismic method is widely used to aid in interpreting subsurface geology. Processing seismic data is an important step in the method, as a properly processed seismic section can give a better image of the subsurface geology, whereas a poorly processed section could mislead the interpretation. Seismic reflections will be more accurately depicted with careful determination of seismic velocities and by carefully choosing the processing steps and parameters. Various data processing steps have been applied and parameters refined to produce improved stacked seismic records. The resulting seismic records from the Aden field area indicate a seismic response similar to what is expected from a carbonate mud-mound. One-dimensional synthetic seismograms were created using the available sonic and density logs from the well drilled near the Aden seismic lines. The 1-D synthetics were used by Cory Cantrell of Royal Drilling and Producing Company to identify various reflections on the seismic records. Seismic data were compared with the modeled synthetic seismograms to identify what appears to be a carbonate mud-mound within the Aden study area. No mud-mounds have been previously found in the Aden oil field. Average and interval velocities obtained from the geophysical logs from the wells drilled in the Aden area were compared with the same type of well velocities from the known Broughton mud-mound area to assess the significance of velocity variations related to the unknown mud-mound in the Aden study area. The results of the velocity study show similar trends in the wells from both areas, with velocities higher at the bottoms of the wells. Another approach was used to compare the root-mean-square velocities calculated from the sonic log of the Aden-area well with the stacking velocities obtained from the seismic data adjacent to the well.

  17. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach where seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing together with advances in multi-core central processing units (CPUs) can greatly accelerate scientific applications. There are mainly two possible choices of language support for GPU cards, the CUDA programming environment and OpenCL language standard. CUDA software development targets NVIDIA graphic cards while OpenCL was adopted mainly by AMD graphic cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated a code generation tool BOAST into an existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both CUDA and OpenCL languages, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  18. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than those of the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number of segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was 1/2 to 2/3 that of semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
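
    The quantities compared in this study reduce to straightforward voxel arithmetic plus standard statistics, as in the minimal sketch below: lobar volume and emphysematous volume below -950 HU from a masked HU array, then Pearson correlation and relative error between two measurement methods. The HU volume, lobe mask, voxel size and per-lobe numbers are all fabricated placeholders for illustration.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def lobe_volumes(hu, lobe_mask, voxel_volume_ml, threshold=-950):
        """Total lobar volume and emphysematous volume (< threshold HU), in mL."""
        voxels = hu[lobe_mask]
        total = voxels.size * voxel_volume_ml
        emphysema = np.count_nonzero(voxels < threshold) * voxel_volume_ml
        return total, emphysema

    # Synthetic CT volume and a box-shaped "lobe" mask; 1 mm isotropic voxels = 0.001 mL.
    rng = np.random.default_rng(8)
    hu = rng.normal(-850, 80, size=(80, 128, 128))
    mask = np.zeros(hu.shape, dtype=bool)
    mask[10:70, 20:100, 20:100] = True
    total_ml, emph_ml = lobe_volumes(hu, mask, voxel_volume_ml=0.001)
    print(f"lobe volume {total_ml:.0f} mL, emphysema volume {emph_ml:.0f} mL")

    # Agreement between two methods on several lobes (placeholder numbers only).
    reference = np.array([1510.0, 1230.0, 980.0, 1410.0, 1120.0])
    candidate = np.array([1495.0, 1255.0, 1002.0, 1388.0, 1131.0])
    r, _ = pearsonr(reference, candidate)
    relative_error = np.abs(candidate - reference) / reference
    print(f"Pearson r = {r:.3f}, mean relative error = {relative_error.mean():.3%}")
    ```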

  19. Building Capacity for Earthquake Monitoring: Linking Regional Networks with the Global Community

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Lerner-Lam, A.

    2006-12-01

    Installing or upgrading a seismic monitoring network is often among the mitigation efforts after earthquake disasters, and this is happening in response to the events both in Sumatra during December 2004 and in Pakistan during October 2005. These networks can yield improved hazard assessment, more resilient buildings where they are most needed, and emergency relief directed more quickly to the worst hit areas after the next large earthquake. Several commercial organizations are well prepared for the fleeting opportunity to provide the instruments that comprise a seismic network, including sensors, data loggers, telemetry stations, and the computers and software required for the network center. But seismic monitoring requires more than hardware and software, no matter how advanced. A well-trained staff is required to select appropriate and mutually compatible components, install and maintain telemetered stations, manage and archive data, and perform the analyses that actually yield the intended benefits. Monitoring is more effective when network operators cooperate with a larger community through free and open exchange of data, sharing information about working practices, and international collaboration in research. As an academic consortium, a facility operator and a founding member of the International Federation of Digital Seismographic Networks, IRIS has access to a broad range of expertise with the skills that are required to help design, install, and operate a seismic network and earthquake analysis center, and stimulate the core training for the professional teams required to establish and maintain these facilities. But delivering expertise quickly when and where it is unexpectedly in demand requires advance planning and coordination in order to respond to the needs of organizations that are building a seismic network, either with tight time constraints imposed by the budget cycles of aid agencies following a disastrous earthquake, or as part of more informed national programs for hazard assessment and mitigation.

  20. Frozen Gaussian approximation for 3D seismic tomography

    NASA Astrophysics Data System (ADS)

    Chai, Lihui; Tong, Ping; Yang, Xu

    2018-05-01

    Three-dimensional (3D) wave-equation-based seismic tomography is computationally challenging at large scales and in the high-frequency regime. In this paper, we apply the frozen Gaussian approximation (FGA) method to compute 3D sensitivity kernels and high-frequency seismic tomography. Rather than the standard ray theory used in seismic inversion (e.g. Kirchhoff migration and Gaussian beam migration), FGA is used to compute the 3D high-frequency sensitivity kernels for travel-time or full waveform inversions. Specifically, we reformulate the equations of the forward and adjoint wavefields so that FGA can be applied conveniently; with this reformulation, one can efficiently compute the Green's functions whose convolutions with the source time function produce the wavefields needed for the construction of 3D kernels. Moreover, a fast summation method is proposed based on a local fast Fourier transform, which greatly improves the speed of reconstruction as the last step of the FGA algorithm. We apply FGA to both travel-time adjoint tomography and full waveform inversion (FWI) on synthetic crosswell seismic data with dominant frequencies as high as those of real crosswell data, and confirm again that FWI requires a more sophisticated initial velocity model for convergence than travel-time adjoint tomography. We also numerically test the accuracy of applying FGA to local earthquake tomography. This study paves the way to directly applying wave-equation-based seismic tomography methods to real data around their dominant frequencies.
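
    The abstract notes that the wavefields are obtained by convolving Green's functions with the source time function. As a rough illustration of that single step (not of the FGA method itself), the NumPy sketch below convolves a toy impulse response with a Ricker-like source time function; all names and values are hypothetical.

    ```python
    import numpy as np

    def wavefield_from_green(green, stf, dt):
        """Convolve a Green's function trace with a source time function.

        green : 1-D array, impulse response at a receiver (hypothetical, precomputed)
        stf   : 1-D array, source time function samples
        dt    : sample interval (s); the discrete convolution is scaled by dt to
                approximate the continuous convolution integral.
        """
        return dt * np.convolve(green, stf)[:green.size]

    # Toy usage: a Ricker-like source time function and a single impulse arrival.
    dt = 0.002
    t = np.arange(0, 1.0, dt)
    f0 = 25.0                                      # dominant frequency (Hz), assumed
    tau = t - 0.1
    stf = (1 - 2 * (np.pi * f0 * tau) ** 2) * np.exp(-(np.pi * f0 * tau) ** 2)
    green = np.zeros_like(t)
    green[200] = 1.0                               # impulse arriving at 0.4 s (toy)
    trace = wavefield_from_green(green, stf, dt)
    ```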

  1. The Shock and Vibration Digest. Volume 14, Number 11

    DTIC Science & Technology

    1982-11-01

    ...cooled reactor (1981) (HTGR) core under seismic excitation has been developed. N82-18644 The computer program can be used to predict the behavior (In...French) of the HTGR core under seismic excitation. Key Words: Computer programs, Modal analysis, Beams, Undamped structures. A computation method is... PROGRAMMING: Dale and Cohen [22] extended the method of McMunn and Plunkett [20] to continuous systems.

  2. Coal-seismic, desktop computer programs in BASIC; Part 6, Develop rms velocity functions and apply mute and normal movement

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
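
    As an illustration of the normal-moveout step this report implements in Tektronix BASIC, the sketch below applies an NMO correction to one trace with NumPy; the function name, parameters and sampling values are hypothetical and are not taken from the USGS programs.

    ```python
    import numpy as np

    def nmo_correct(trace, dt, offset, vrms):
        """Apply a normal-moveout correction to one trace (linear interpolation).

        The corrected sample at zero-offset time t0 is read from the input trace
        at the hyperbolic moveout time sqrt(t0^2 + (offset/vrms)^2).
        """
        n = trace.size
        t0 = np.arange(n) * dt                            # zero-offset times
        t_nmo = np.sqrt(t0 ** 2 + (offset / vrms) ** 2)   # hyperbolic moveout times
        return np.interp(t_nmo, t0, trace, left=0.0, right=0.0)

    # Toy usage: 12-trace seismogram, 1 ms sampling, constant 1800 m/s rms velocity.
    dt, vrms = 0.001, 1800.0
    offsets = np.arange(12) * 15.0 + 15.0
    data = 0.01 * np.random.randn(12, 1000)               # stand-in seismogram
    corrected = np.array([nmo_correct(tr, dt, x, vrms) for tr, x in zip(data, offsets)])
    ```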

  3. Seismic data are rich in information about subsurface formations and fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farfour, Mohammed; Yoon, Wang Jung; Kim, Dongshin

    2016-06-08

    Seismic attributes are defined as any measured or computed information derived from seismic data. Throughout the last decades, extensive work has been done in developing a variety of mathematical approaches to extract maximum information from seismic data. Nevertheless, geoscientists have found that seismic data remain rich in information yet to be exploited. In this paper a new seismic attribute is introduced. The instantaneous energy seismic attribute is an amplitude-based attribute that has the potential to emphasize anomalous amplitudes associated with hydrocarbons. Promising results have been obtained from applying the attribute to a seismic section traversing hydrocarbon-filled sand from Alberta, Canada.
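
    Instantaneous energy is commonly derived from the envelope of the analytic signal; the exact formulation used by the authors may differ. A minimal sketch, assuming the envelope-squared definition:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_energy(trace):
        """Envelope-squared of a trace via the analytic signal.

        A common definition of instantaneous energy; the exact attribute
        formulation used in the cited paper may differ.
        """
        return np.abs(hilbert(trace)) ** 2

    # Toy trace: a 30 Hz wavelet in weak noise; the attribute highlights its
    # high-amplitude portion.
    dt = 0.002
    t = np.arange(0, 1.0, dt)
    trace = np.exp(-200 * (t - 0.5) ** 2) * np.cos(2 * np.pi * 30 * t)
    trace += 0.05 * np.random.randn(t.size)
    energy = instantaneous_energy(trace)
    ```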

  4. GRACE gravity data help constraining seismic models of the 2004 Sumatran earthquake

    NASA Astrophysics Data System (ADS)

    Cambiotti, G.; Bordoni, A.; Sabadini, R.; Colli, L.

    2011-10-01

    The analysis of Gravity Recovery and Climate Experiment (GRACE) Level 2 data time series from the Center for Space Research (CSR) and GeoForschungsZentrum (GFZ) allows us to extract a new estimate of the co-seismic gravity signal due to the 2004 Sumatran earthquake. Using compressible self-gravitating Earth models, which include sea level feedback in a new self-consistent way and are designed to compute gravitational perturbations due to volume changes separately, we are able to show that the asymmetry in the co-seismic gravity pattern, in which the north-eastern negative anomaly is twice as large as the south-western positive anomaly, is not due to the previously overestimated dilatation in the crust. The overestimate was due to a large dilatation localized at the fault discontinuity, the gravitational effect of which is compensated by an opposite contribution from topography due to the uplifted crust. After this localized dilatation is removed, we instead predict compression in the footwall and dilatation in the hanging wall. The overall anomaly is then mainly due to the additional gravitational effects of the ocean after water is displaced away from the uplifted crust, as first indicated by de Linage et al. (2009). We also detail the differences between compressible and incompressible material properties. By focusing on the most robust estimates from GRACE data, consisting of the peak-to-peak gravity anomaly and an asymmetry coefficient, given by the ratio of the negative gravity anomaly over the positive anomaly, we show that they are quite sensitive to seismic source depths and dip angles. This allows us to exploit space gravity data for the first time to help constrain centroid-moment-tensor (CMT) source analyses of the 2004 Sumatran earthquake and to conclude that the seismic moment has been released mainly in the lower crust rather than the lithospheric mantle. Thus, GRACE data and CMT source analyses, as well as geodetic slip distributions aided by GPS, complement each other for a robust inference of the seismic source of large earthquakes. Particular care is devoted to the spatial filtering of the gravity anomalies estimated both from observations and models to make their comparison significant.
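
    The two robust quantities named above, the peak-to-peak gravity anomaly and the asymmetry coefficient (negative over positive anomaly), are simple to compute once a filtered anomaly map is available. A worked toy example with hypothetical numbers:

    ```python
    import numpy as np

    # Hypothetical filtered co-seismic gravity anomaly map (microGal), e.g. from
    # spherical-harmonic synthesis on a small lat/lon grid.
    anomaly = np.array([[ 3.1,  4.0,  2.2],
                        [-1.5, -8.3, -6.0],
                        [-2.0, -4.1,  1.0]])

    positive_peak = anomaly.max()                   # south-western positive lobe
    negative_peak = anomaly.min()                   # north-eastern negative lobe
    peak_to_peak = positive_peak - negative_peak    # robust amplitude measure
    asymmetry = abs(negative_peak) / positive_peak  # ratio used to constrain depth/dip
    print(peak_to_peak, asymmetry)
    ```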

  5. Theoretical computation of internal co- and post-seismic deformation fields caused by great earthquakes in a spherically stratified viscoelastic earth

    NASA Astrophysics Data System (ADS)

    Takagi, Y.; Okubo, S.

    2016-12-01

    Internal co- and post-seismic deformation fields such as strain and stress changes have been modelled in order to study their effects on subsequent earthquake and/or volcanic activity around the epicentre. When modelling strain or stress changes caused by great earthquakes (M>9.0), we should use a realistic earth model including the earth's curvature and stratification; according to Toda et al.'s (2011) result, the stress changes caused by the 2011 Tohoku-oki earthquake (Mw=9.0) exceed 0.1 bar (0.01 MPa) even at epicentral distances over 400 km. Although many studies have computed co- and post-seismic surface deformation fields using a spherically stratified viscoelastic earth (e.g. Piersanti et al. 1995; Pollitz 1996, 1997; Tanaka et al. 2006), less attention has been paid to 'internal' deformation fields. Tanaka et al. (2006) succeeded in computing post-seismic surface displacements in a continuously stratified compressible viscoelastic earth by evaluating the inverse Laplace integration numerically. However, their method cannot calculate internal deformation because they use Okubo's (1993) reciprocity theorem. We found that Okubo's (1993) reciprocity theorem can be extended to the computation of internal deformation fields. In this presentation, we show a method for computing internal co- and post-seismic deformation fields and discuss the effects of the earth's curvature and stratification on them.

  6. Evaluation of Ground Vibrations Induced by Military Noise Sources

    DTIC Science & Technology

    2006-08-01

    Task 2—Determine the acoustic-to-seismic coupling coefficients C1 and C2. Task 3—Computational modeling of acoustically induced ground motion (a simple model of blast sound interaction with the ground under varying ground conditions).

  7. Coal-seismic, desktop computer programs in BASIC; Part 5, Perform X-square T-square analysis and plot normal moveout lines on seismogram overlay

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
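
    The X²-T² method fits the linearized hyperbolic moveout relation T² = T0² + X²/V_rms², so the rms velocity follows from the slope of a straight-line fit. The sketch below is a generic least-squares version in Python, not a transcription of the USGS BASIC program:

    ```python
    import numpy as np

    def x2t2_velocity(offsets_m, times_s):
        """Estimate rms velocity and zero-offset time from reflection picks.

        Fits the hyperbolic moveout equation in linearized form:
            T^2 = T0^2 + X^2 / Vrms^2
        """
        x2 = np.asarray(offsets_m, float) ** 2
        t2 = np.asarray(times_s, float) ** 2
        slope, intercept = np.polyfit(x2, t2, 1)   # t2 = slope * x2 + intercept
        vrms = 1.0 / np.sqrt(slope)
        t0 = np.sqrt(intercept)
        return vrms, t0

    # Toy picks consistent with Vrms = 2000 m/s and T0 = 0.5 s.
    offsets = np.array([50, 100, 150, 200, 250.0])
    times = np.sqrt(0.5 ** 2 + offsets ** 2 / 2000.0 ** 2)
    print(x2t2_velocity(offsets, times))
    ```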

  8. The availability of hydrogeologic data associated with areas identified by the US Geological Survey as experiencing potentially induced seismicity resulting from subsurface injection

    NASA Astrophysics Data System (ADS)

    Barnes, Caitlin; Halihan, Todd

    2018-05-01

    A critical need exists for site-specific hydrogeologic data in order to determine potential hazards of induced seismicity and to manage risk. By 2015, the United States Geological Survey (USGS) had identified 17 locations in the USA that are experiencing an increase in seismicity, which may be potentially induced through industrial subsurface injection. These locations span seven states, which vary in geological setting, industrial exposure and seismic history. Comparing the research across the 17 locations revealed patterns for addressing induced seismicity concerns, despite the differences between geographical locations. Most induced seismicity studies evaluate geologic structure and seismic data from areas experiencing changes in seismic activity levels, but the inherent triggering mechanism is the transmission of hydraulic pressure pulses. This study conducted a systematic review of whether the data needed to generate accurate hydrogeologic predictions, which could aid in managing seismicity, are available in these locations. After analyzing peer-reviewed research within the 17 locations, we confirm a lack of site-specific hydrogeologic data availability for at-risk areas. Commonly, formation geology data are available for these sites, but hydraulic parameters for the seismically active injection and basement zones are not available to researchers conducting peer-reviewed research. Obtaining hydrogeologic data would lead to better risk management for injection areas and provide additional scientific support for determining whether a seismic area is potentially induced.

  9. Processing of single channel air and water gun data for imaging an impact structure at the Chesapeake Bay

    USGS Publications Warehouse

    Lee, Myung W.

    1999-01-01

    Processing of 20 seismic profiles acquired in the Chesapeake Bay area aided in analysis of the details of an impact structure and allowed more accurate mapping of the depression caused by a bolide impact. Particular emphasis was placed on enhancement of seismic reflections from the basement. Application of wavelet deconvolution after a second zero-crossing predictive deconvolution improved the resolution of shallow reflections, and application of a match filter enhanced the basement reflections. The use of deconvolution and match filtering with a two-dimensional signal enhancement technique (F-X filtering) significantly improved the interpretability of seismic sections.

  10. Quake Final Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Critical infrastructure around the world is at constant risk from earthquakes. Most of these critical structures are designed using archaic seismic simulation methods developed for early digital computers in the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.

  11. Seismic data restoration with a fast L1 norm trust region method

    NASA Astrophysics Data System (ADS)

    Cao, Jingjie; Wang, Yanfei

    2014-08-01

    Seismic data restoration is a major strategy for providing a reliable wavefield when field data do not satisfy the Shannon sampling theorem. Recovery by sparsity-promoting inversion often yields sparse solutions of seismic data in a transformed domain; however, most methods for sparsity-promoting inversion are line-search methods, which are efficient but inclined to converge to local solutions. Using a trust region method, which can provide globally convergent solutions, is a good choice to overcome this shortcoming. A trust region method for sparse inversion has been proposed previously; however, its efficiency needs to be improved to be suitable for large-scale computation. In this paper, a new L1 norm trust region model is proposed for seismic data restoration, and a robust gradient projection method for solving the sub-problem is utilized. Numerical results on synthetic and field data demonstrate that the proposed trust region method achieves excellent computational speed and is a viable alternative for large-scale computation.
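
    The paper's L1 norm trust region algorithm is not reproduced here; as a generic illustration of sparsity-promoting restoration, the sketch below runs an iterative soft-thresholding scheme in the Fourier domain on a trace with missing samples. All parameter values are assumptions.

    ```python
    import numpy as np

    def soft_threshold(coeffs, lam):
        """Shrink the magnitude of (possibly complex) coefficients by lam."""
        mag = np.abs(coeffs)
        return coeffs * np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)

    def restore(observed, mask, lam=0.1, n_iter=200):
        """Recover missing samples of a trace assumed sparse in the Fourier domain.

        observed : trace with zeros at the missing samples
        mask     : 1 where a sample was recorded, 0 where it is missing
        A generic iterative soft-thresholding sketch, not the L1 trust region
        algorithm of the cited paper.
        """
        x = observed.astype(float).copy()
        for _ in range(n_iter):
            x = x - (mask * x - observed)      # data-consistency gradient step
            x = np.real(np.fft.ifft(soft_threshold(np.fft.fft(x), lam)))
        return x
    ```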

  12. Connecting an Ocean-Bottom Broadband Seismometer to a Seafloor Cabled Observatory: A Prototype System in Monterey Bay

    NASA Astrophysics Data System (ADS)

    McGill, P.; Neuhauser, D.; Romanowicz, B.

    2008-12-01

    The Monterey Ocean-Bottom Broadband (MOBB) seismic station was installed in April 2003, 40 km offshore from the central coast of California at a seafloor depth of 1000 m. It comprises a three-component broadband seismometer system (Guralp CMG-1T), installed in a hollow PVC caisson and buried under the seafloor; a current meter; and a differential pressure gauge. The station has been operating continuously since installation with no connection to the shore. Three times each year, the station is serviced with the aid of a Remotely Operated Vehicle (ROV) to change the batteries and retrieve the seismic data. In February 2009, the MOBB system will be connected to the Monterey Accelerated Research System (MARS) seafloor cabled observatory. The NSF-funded MARS observatory comprises a 52 km electro-optical cable that extends from a shore facility in Moss Landing out to a seafloor node in Monterey Bay. Once installation is completed in November 2008, the node will provide power and data to as many as eight science experiments through underwater electrical connectors. The MOBB system is located 3 km from the MARS node, and the two will be connected with an extension cable installed by an ROV with the aid of a cable-laying toolsled. The electronics module in the MOBB system is being refurbished to support the connection to the MARS observatory. The low-power autonomous data logger has been replaced with a PC/104 computer stack running embedded Linux. This new computer will run an Object Ring Buffer (ORB), which will collect data from the various MOBB sensors and forward it to another ORB running on a computer at the MARS shore station. There, the data will be archived and then forwarded to a third ORB running at the UC Berkeley Seismological Laboratory. Timing will be synchronized among MOBB's multiple acquisition systems using NTP, GPS clock emulation, and a precise timing signal from the MARS cable. The connection to the MARS observatory will provide real-time access to the MOBB data and eliminate the need for frequent servicing visits. The new system uses off-the-shelf hardware and open-source software, and will serve as a prototype for future instruments connected to seafloor cabled observatories.

  13. Asymptotic co- and post-seismic displacements in a homogeneous Maxwell sphere

    NASA Astrophysics Data System (ADS)

    Tang, He; Sun, Wenke

    2018-07-01

    The deformations of the Earth caused by internal and external forces are usually expressed through Green's functions or the superposition of normal modes, that is, via numerical methods, which are applicable for computing both co- and post-seismic deformations. It is difficult to express these deformations in an analytical form, even for a uniform viscoelastic sphere. In this study, we present a set of asymptotic solutions for computing co- and post-seismic displacements; these solutions can be further applied to solving co- and post-seismic geoid, gravity and strain changes. Expressions are derived for a uniform Maxwell Earth by combining the reciprocity theorem, which links earthquake, tidal, shear and loading deformations, with the asymptotic solutions of these three external forces (tidal, shear and loading) and analytical inverse Laplace transformation formulae. Since the asymptotic solutions are given in a purely analytical form without series summations or extra convergence skills, they can be practically applied in an efficient way, especially when computing post-seismic deformations and glacial isostatic adjustments of the Earth over long timescales.

  14. Asymptotic Co- and Post-seismic displacements in a homogeneous Maxwell sphere

    NASA Astrophysics Data System (ADS)

    Tang, He; Sun, Wenke

    2018-05-01

    The deformations of the Earth caused by internal and external forces are usually expressed through Green's functions or the superposition of normal modes, i.e. via numerical methods, which are applicable for computing both co- and post-seismic deformations. It is difficult to express these deformations in an analytical form, even for a uniform viscoelastic sphere. In this study, we present a set of asymptotic solutions for computing co- and post-seismic displacements; these solutions can be further applied to solving co- and post-seismic geoid, gravity, and strain changes. Expressions are derived for a uniform Maxwell Earth by combining the reciprocity theorem, which links earthquake, tidal, shear and loading deformations, with the asymptotic solutions of these three external forces (tidal, shear and loading) and analytical inverse Laplace transformation formulae. Since the asymptotic solutions are given in a purely analytical form without series summations or extra convergence skills, they can be practically applied in an efficient way, especially when computing post-seismic deformations and glacial isostatic adjustments of the Earth over long timescales.

  15. Seismpol_ a visual-basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either in interactive earthquake analysis or in automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to decompose a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
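
    The core of the Covariance Matrix Decomposition approach is an eigen-decomposition of the 3 x 3 covariance matrix of a three-component window. A minimal sketch (not the Seismpol code, which adds filtering, hodogram display and automatic phase discrimination):

    ```python
    import numpy as np

    def polarization_cmd(z, n, e):
        """Eigen-analysis of the covariance matrix of a three-component window.

        Returns rectilinearity and a backazimuth estimate (degrees); a minimal
        sketch of the CMD idea only.
        """
        data = np.vstack([z, n, e])                 # rows: Z, N, E; columns: samples
        cov = np.cov(data)                          # 3 x 3 covariance matrix
        eigval, eigvec = np.linalg.eigh(cov)        # eigenvalues in ascending order
        l1, l2, l3 = eigval[2], eigval[1], eigval[0]
        rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
        principal = eigvec[:, 2]                    # dominant polarization direction
        # Note the 180-degree ambiguity coming from the eigenvector sign.
        backazimuth = np.degrees(np.arctan2(principal[2], principal[1])) % 360.0
        return rectilinearity, backazimuth
    ```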

  16. An alternative approach for computing seismic response with accidental eccentricity

    NASA Astrophysics Data System (ADS)

    Fan, Xuanhua; Yin, Jiacong; Sun, Shuli; Chen, Pu

    2014-09-01

    Accidental eccentricity is a non-standard assumption for seismic design of tall buildings. Taking it into consideration requires reanalysis of seismic resistance, which requires either time consuming computation of natural vibration of eccentric structures or finding a static displacement solution by applying an approximated equivalent torsional moment for each eccentric case. This study proposes an alternative modal response spectrum analysis (MRSA) approach to calculate seismic responses with accidental eccentricity. The proposed approach, called the Rayleigh Ritz Projection-MRSA (RRP-MRSA), is developed based on MRSA and two strategies: (a) a RRP method to obtain a fast calculation of approximate modes of eccentric structures; and (b) an approach to assemble mass matrices of eccentric structures. The efficiency of RRP-MRSA is tested via engineering examples and compared with the standard MRSA (ST-MRSA) and one approximate method, i.e., the equivalent torsional moment hybrid MRSA (ETM-MRSA). Numerical results show that RRP-MRSA not only achieves almost the same precision as ST-MRSA, and is much better than ETM-MRSA, but is also more economical. Thus, RRP-MRSA can be in place of current accidental eccentricity computations in seismic design.
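
    For context, modal response spectrum analysis ends by combining peak modal responses; the square-root-of-sum-of-squares (SRSS) rule is one standard combination. The sketch below shows only that generic step, not the Rayleigh Ritz Projection that distinguishes RRP-MRSA:

    ```python
    import numpy as np

    def srss(peak_modal_responses):
        """Square-root-of-sum-of-squares combination of peak modal responses.

        A standard final step of modal response spectrum analysis; the
        Rayleigh Ritz Projection used by RRP-MRSA to approximate the modes of
        the eccentric structure is not reproduced here.
        """
        r = np.asarray(peak_modal_responses, float)
        return np.sqrt(np.sum(r ** 2))

    # Toy usage: peak roof displacements (m) contributed by three modes.
    print(srss([0.012, 0.004, 0.001]))
    ```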

  17. Introducing Seismic Tomography with Computational Modeling

    NASA Astrophysics Data System (ADS)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students the possibility to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students the possibility to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes taking advantage of basic scientific computation methods and tools.

  18. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
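
    A minimal sketch of the feature extraction described above, assuming a Hilbert-transform envelope followed by a Fourier amplitude spectrum reduced to coarse bands; the statistical classifier itself is not reproduced:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def envelope_spectrum_features(signal, n_bands=16):
        """Coarse amplitude spectrum of the signal envelope as a feature vector.

        Sketch of the idea only (Hilbert envelope, then Fourier spectrum averaged
        into bands); the paper's classifier is not reproduced.
        """
        envelope = np.abs(hilbert(signal))
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        bands = np.array_split(spectrum, n_bands)
        return np.array([band.mean() for band in bands])

    # Toy usage: a noise burst whose envelope is modulated at about 2 Hz.
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    signal = (1 + np.sin(2 * np.pi * 2 * t)) * np.random.randn(t.size)
    features = envelope_spectrum_features(signal)
    ```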

  19. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface-selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. The analysis of the travel-time data consists of two parts: the identification of layer interfaces, and the inversion for the velocity of each layer. The analyst usually picks layer interfaces by visual inspection of the travel-time data. I have also developed an algorithm that automatically finds boundaries, which can save a significant amount of time when analyzing a large number of sites. The results of the automatic routines should be reviewed to check that they are reasonable. The interactivity of these scripts allows the user to add and remove layers quickly, thus allowing rapid feedback on how the residuals are affected by each additional parameter in the inversion. In addition, the script allows many models to be compared at the same time.
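
    The original routines are written in R; as a language-neutral illustration of the inversion idea (travel time versus depth fitted piecewise between picked interfaces, with the segment slope giving the layer slowness), here is a simplified Python sketch that omits the surface-source offset correction and the pick weighting described above:

    ```python
    import numpy as np

    def layer_slownesses(depths_m, traveltimes_s, interfaces_m):
        """Fit travel time versus depth piecewise between picked interfaces.

        The slope of each segment is the layer slowness (s/m); velocity is its
        reciprocal. Offset correction and weighting are omitted in this sketch.
        """
        depths = np.asarray(depths_m, float)
        times = np.asarray(traveltimes_s, float)
        edges = np.concatenate(([0.0], np.asarray(interfaces_m, float), [depths.max()]))
        slownesses = []
        for top, bottom in zip(edges[:-1], edges[1:]):
            in_layer = (depths >= top) & (depths <= bottom)
            if np.count_nonzero(in_layer) >= 2:
                slope, _ = np.polyfit(depths[in_layer], times[in_layer], 1)
                slownesses.append(slope)
            else:
                slownesses.append(np.nan)
        return np.array(slownesses)

    # Toy usage: two layers (400 m/s above 10 m depth, 800 m/s below).
    depths = np.arange(1.0, 21.0)
    times = np.where(depths <= 10, depths / 400.0, 10 / 400.0 + (depths - 10) / 800.0)
    print(layer_slownesses(depths, times, interfaces_m=[10.0]))
    ```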

  20. Python Waveform Cross-Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, Dennise

    PyWCC is a tool to compute seismic waveform cross-correlation coefficients on single-component or multiple-component seismic data across a network of seismic sensors. PyWCC compares waveform data templates with continuous seismic data, associates the resulting detections, identifies the template with the highest cross-correlation coefficient, and outputs a catalog of detections above a user-defined absolute cross-correlation threshold value.
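
    The core operation is a sliding normalized cross-correlation of a template against continuous data, followed by thresholding. The sketch below illustrates that operation in plain NumPy; it is not PyWCC's actual interface, and the threshold value is an assumption:

    ```python
    import numpy as np

    def sliding_ncc(template, continuous):
        """Normalized cross-correlation of a template against continuous data."""
        nt = template.size
        tpl = (template - template.mean()) / (template.std() * nt)
        ncc = np.empty(continuous.size - nt + 1)
        for i in range(ncc.size):
            win = continuous[i:i + nt]
            std = win.std()
            ncc[i] = 0.0 if std == 0 else np.dot(tpl, (win - win.mean()) / std)
        return ncc

    def detect(template, continuous, threshold=0.7):
        """Return sample indices where |NCC| exceeds a user-defined threshold."""
        ncc = sliding_ncc(template, continuous)
        return np.flatnonzero(np.abs(ncc) >= threshold), ncc
    ```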

  1. Development of High-speed Visualization System of Hypocenter Data Using CUDA-based GPU computing

    NASA Astrophysics Data System (ADS)

    Kumagai, T.; Okubo, K.; Uchida, N.; Matsuzawa, T.; Kawada, N.; Takeuchi, N.

    2014-12-01

    After the Great East Japan Earthquake on March 11, 2011, intelligent visualization of seismic information has become important for understanding earthquake phenomena. At the same time, the quantity of seismic data has become enormous with the progress of high-accuracy observation networks; we need to treat many parameters (e.g., positional information, origin time, magnitude, etc.) to efficiently display the seismic information. Therefore, high-speed processing of data and image information is necessary to handle enormous amounts of seismic data. Recently, the GPU (Graphics Processing Unit) has been used as an acceleration tool for data processing and calculation in various fields of study. This movement is called GPGPU (General-Purpose computing on GPUs). In the last few years the performance of GPUs has kept improving rapidly. GPU computing gives us a high-performance computing environment at a lower cost than before. Moreover, the use of GPUs has an advantage for visualization of processed data, because the GPU architecture was originally designed for graphics processing. In GPU computing, the processed data are always stored in the video memory. Therefore, we can directly write drawing information to the VRAM on the video card by combining CUDA and a graphics API. In this study, we employ CUDA and OpenGL and/or DirectX to realize a full-GPU implementation. This method makes it possible to write drawing information to the VRAM on the video card without PCIe bus data transfer, enabling high-speed processing of seismic data. The present study examines GPU computing-based high-speed visualization and its feasibility for a high-speed visualization system for hypocenter data.

  2. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of the logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.

  3. MSNoise: a Python Package for Monitoring Seismic Velocity Changes using Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Lecocq, T.; Caudron, C.; Brenguier, F.

    2013-12-01

    Earthquakes occur every day all around the world and are recorded by thousands of seismic stations. In between earthquakes, stations are recording "noise". In the last 10 years, the understanding of this noise and its potential usage have been increasing rapidly. The method, called "seismic interferometry", uses the principle that seismic waves travel between two recorders and are multiple-scattered in the medium. By cross-correlating the two records, one obtains information about the medium below/between the stations. The cross-correlation function (CCF) is a proxy for the Green's function of the medium. Recent developments of the technique have shown that these CCFs can be used to image the earth at depth (3D seismic tomography) or to study changes in the medium with time. We present MSNoise, a complete software suite to compute relative seismic velocity changes under a seismic network using ambient seismic noise. The whole suite is written in Python, from the monitoring of data archives to the production of high-quality figures. All steps have been optimized to compute only the necessary steps and to use 'job'-based processing. We present a validation of the software on a dataset acquired during the UnderVolc[1] project on the Piton de la Fournaise Volcano, La Réunion Island, France, for which precursory relative changes of seismic velocity are visible for three eruptions between 2009 and 2011.
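
    Relative velocity changes are measured from a reference and a current cross-correlation function; MSNoise implements its own workflow (by default based on moving-window cross-spectral analysis), so the simpler stretching technique below is shown only as an illustration of the measurement idea:

    ```python
    import numpy as np

    def stretching_dvv(reference, current, dt, max_eps=0.01, n_trials=201):
        """Estimate a relative velocity change by the stretching technique.

        The current CCF is interpolated on a stretched time axis t*(1+eps) and
        the eps maximizing the correlation with the reference is returned,
        together with that correlation. The sign convention relating eps to
        dv/v varies in the literature; MSNoise's workflow is more elaborate.
        """
        t = np.arange(reference.size) * dt
        best_eps, best_cc = 0.0, -np.inf
        for eps in np.linspace(-max_eps, max_eps, n_trials):
            stretched = np.interp(t * (1.0 + eps), t, current)
            cc = np.corrcoef(reference, stretched)[0, 1]
            if cc > best_cc:
                best_eps, best_cc = eps, cc
        return best_eps, best_cc
    ```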

  4. Application of Adjoint Method and Spectral-Element Method to Tomographic Inversion of Regional Seismological Structure Beneath Japanese Islands

    NASA Astrophysics Data System (ADS)

    Tsuboi, S.; Miyoshi, T.; Obayashi, M.; Tono, Y.; Ando, K.

    2014-12-01

    Recent progress in large-scale computing, using waveform modeling techniques and high-performance computing facilities, has demonstrated the possibility of performing full-waveform inversion of three-dimensional (3D) seismological structure inside the Earth. We apply the adjoint method (Liu and Tromp, 2006) to obtain 3D structure beneath the Japanese Islands. First, we implemented the Spectral-Element Method on the K computer in Kobe, Japan. We optimized SPECFEM3D_GLOBE (Komatitsch and Tromp, 2002) using OpenMP so that the code fits the hybrid architecture of the K computer. We can now use 82,134 nodes of the K computer (657,072 cores) to compute synthetic waveforms with about 1 s accuracy for a realistic 3D Earth model, with a performance of 1.2 PFLOPS. We use this optimized SPECFEM3D_GLOBE code, take one chunk around the Japanese Islands from the global mesh, and compute synthetic seismograms with an accuracy of about 10 s. We use the GAP-P2 mantle tomography model (Obayashi et al., 2009) as an initial 3D model and use as many broadband seismic stations available in this region as possible to perform the inversion. We then use the time windows for body waves and surface waves to compute adjoint sources and calculate adjoint kernels for seismic structure. We have performed several iterations and obtained an improved 3D structure beneath the Japanese Islands. The result demonstrates that waveform misfits between observed and theoretical seismograms improve as the iterations proceed. We are now preparing to use much shorter periods in our synthetic waveform computation and to obtain seismic structure for basin-scale models, such as the Kanto basin, where there is a dense seismic network and high seismic activity. Acknowledgements: This research was partly supported by the MEXT Strategic Program for Innovative Research. We used F-net seismograms of the National Research Institute for Earth Science and Disaster Prevention.

  5. Continuous Seismic Threshold Monitoring

    DTIC Science & Technology

    1992-05-31

    Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic

  6. Fast principal component analysis for stacking seismic data

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Bai, Min

    2018-04-01

    Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is not sensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm to make the proposed method readily applicable for industrial applications. Two numerically designed examples and one real seismic data set are used to demonstrate the performance of the presented method.
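
    A basic PCA (SVD-based) stack of an NMO-corrected gather can be sketched as follows; the fast PCA algorithm proposed in the paper is not reproduced, and the output scaling is an assumption:

    ```python
    import numpy as np

    def pca_stack(gather):
        """Stack an NMO-corrected gather using its first principal component.

        gather : 2-D array of shape (n_traces, n_samples)
        A plain SVD-based sketch of PCA stacking only.
        """
        centered = gather - gather.mean(axis=1, keepdims=True)  # remove trace means
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        principal_waveform = vt[0]              # dominant waveform across traces
        trace_weights = u[:, 0] * s[0]          # how strongly each trace carries it
        return principal_waveform * trace_weights.mean()

    # Toy usage: 30 noisy copies of the same wavelet.
    t = np.linspace(0, 1, 500)
    wavelet = np.exp(-200 * (t - 0.5) ** 2) * np.cos(2 * np.pi * 30 * t)
    gather = wavelet + 0.5 * np.random.randn(30, t.size)
    stacked = pca_stack(gather)
    ```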

  7. Automatic Seismic-Event Classification with Convolutional Neural Networks.

    NASA Astrophysics Data System (ADS)

    Bueno Rodriguez, A.; Titos Luzón, M.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Active volcanoes exhibit a wide range of seismic signals, providing vast amounts of unlabelled volcano-seismic data that can be analyzed through the lens of artificial intelligence. However, obtaining high-quality labelled data is time-consuming and expensive. Deep neural networks can process data in their raw form, compute high-level features and provide a better representation of the input data distribution. These systems can be deployed to classify seismic data at scale, enhance current early-warning systems and build extensive seismic catalogs. In this research, we aim to classify spectrograms from seven different seismic events registered at "Volcán de Fuego" (Colima, Mexico) during four eruptive periods. Our approach is based on convolutional neural networks (CNNs), a sub-type of deep neural networks that can exploit the grid structure of the data. Volcano-seismic signals can be mapped into a grid-like structure using the spectrogram: a representation of the temporal evolution in terms of time and frequency. Spectrograms were computed from the data using Hamming windows of 4 seconds length, 2.5 seconds overlapping and 128-point FFT resolution. Results are compared to deep neural networks, random forest and SVMs. Experiments show that CNNs can exploit temporal and frequency information, attaining a classification accuracy of 93%, similar to deep networks (91%) but outperforming SVM and random forest. These results empirically show that CNNs are powerful models for classifying a wide range of volcano-seismic signals and achieve good generalization. Furthermore, volcano-seismic spectrograms contain useful discriminative information for the CNN, as higher layers of the network combine high-level features computed for each frequency band, helping to detect simultaneous events in time. Being at the intersection of deep learning and geophysics, this research enables future studies of how CNNs can be used in volcano monitoring to accurately determine the detection and location of seismic events.
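
    The spectrogram parameters quoted above (4 s Hamming windows, 2.5 s overlap, 128-point FFT) translate directly into a SciPy call; the sketch below only prepares such a grid-like input and assumes the FFT length is padded when the window holds more than 128 samples. The CNN itself is not reproduced:

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def volcano_spectrogram(signal, fs):
        """Spectrogram with the windowing quoted in the abstract.

        4 s Hamming windows, 2.5 s overlap, 128-point FFT; the FFT length is
        padded up to the window length when the window holds more than 128
        samples (an assumption made so the sketch stays runnable).
        """
        nperseg = int(4.0 * fs)
        noverlap = int(2.5 * fs)
        nfft = max(128, nperseg)
        f, t, sxx = spectrogram(signal, fs=fs, window="hamming",
                                nperseg=nperseg, noverlap=noverlap, nfft=nfft)
        return f, t, 10.0 * np.log10(sxx + 1e-12)   # power in dB as CNN input

    # Toy usage with a 25 Hz sampling rate (so 4 s = 100 samples, 128-point FFT).
    fs = 25.0
    signal = np.random.randn(int(120 * fs))
    f, t, sxx_db = volcano_spectrogram(signal, fs)
    ```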

  8. 3D Seismic Experimentation and Advanced Processing/Inversion Development for Investigations of the Shallow Subsurface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levander, Alan Richard; Zelt, Colin A.

    2015-03-17

    The work plan for this project was to develop and apply advanced seismic reflection and wide-angle processing and inversion techniques to high resolution seismic data for the shallow subsurface to seismically characterize the shallow subsurface at hazardous waste sites as an aid to containment and cleanup activities. We proposed to continue work on seismic data that we had already acquired under a previous DoE grant, as well as to acquire additional new datasets for analysis. The project successfully developed and/or implemented the use of 3D reflection seismology algorithms, waveform tomography and finite-frequency tomography using compressional and shear waves for high-resolution characterization of the shallow subsurface at two waste sites. These two sites have markedly different near-surface structures, groundwater flow patterns, and hazardous waste problems. This is documented in the list of refereed documents, conference proceedings, and Rice graduate theses, listed below.

  9. Computer-assisted versus conventional free fibula flap technique for craniofacial reconstruction: an outcomes comparison.

    PubMed

    Seruya, Mitchel; Fisher, Mark; Rodriguez, Eduardo D

    2013-11-01

    There has been rising interest in computer-aided design/computer-aided manufacturing for preoperative planning and execution of osseous free flap reconstruction. The purpose of this study was to compare outcomes between computer-assisted and conventional fibula free flap techniques for craniofacial reconstruction. A two-center, retrospective review was carried out on patients who underwent fibula free flap surgery for craniofacial reconstruction from 2003 to 2012. Patients were categorized by the type of reconstructive technique: conventional (between 2003 and 2009) or computer-aided design/computer-aided manufacturing (from 2010 to 2012). Demographics, surgical factors, and perioperative and long-term outcomes were compared. A total of 68 patients underwent microsurgical craniofacial reconstruction: 58 conventional and 10 computer-aided design and manufacturing fibula free flaps. By demographics, patients undergoing the computer-aided design/computer-aided manufacturing method were significantly older and had a higher rate of radiotherapy exposure compared with conventional patients. Intraoperatively, the median number of osteotomies was significantly higher (2.0 versus 1.0, p=0.002) and the median ischemia time was significantly shorter (120 minutes versus 170 minutes, p=0.004) for the computer-aided design/computer-aided manufacturing technique compared with conventional techniques; operative times were shorter for patients undergoing the computer-aided design/computer-aided manufacturing technique, although this did not reach statistical significance. Perioperative and long-term outcomes were equivalent for the two groups, notably, hospital length of stay, recipient-site infection, partial and total flap loss, and rate of soft-tissue and bony tissue revisions. Microsurgical craniofacial reconstruction using a computer-assisted fibula flap technique yielded significantly shorter ischemia times amidst a higher number of osteotomies compared with conventional techniques. Therapeutic, III.

  10. Motorized Activity on Legacy Seismic Lines: A Predictive Modeling Approach to Prioritize Restoration Efforts.

    PubMed

    Hornseth, M L; Pigeon, K E; MacNearney, D; Larsen, T A; Stenhouse, G; Cranston, J; Finnegan, L

    2018-05-11

    Natural regeneration of seismic lines, cleared for hydrocarbon exploration, is slow and often hindered by vegetation damage, soil compaction, and motorized human activity. There is an extensive network of seismic lines in western Canada which is known to impact forest ecosystems, and seismic lines have been linked to declines in woodland caribou (Rangifer tarandus caribou). Seismic line restoration is costly, but necessary for caribou conservation to reduce cumulative disturbance. Understanding where motorized activity may be impeding regeneration of seismic lines will aid in prioritizing restoration. Our study area in west-central Alberta encompassed five caribou ranges where restoration is required under federal species at risk recovery strategies; hence, prioritizing seismic lines for restoration is of immediate conservation value. To understand patterns of motorized activity on seismic lines, we evaluated five a priori hypotheses using a predictive modeling framework and Geographic Information System variables across three landscapes in the foothills and northern boreal regions of Alberta. In the northern boreal landscape, motorized activity was most common in dry areas with a large industrial footprint. In highly disturbed areas of the foothills, motorized activity on seismic lines increased with low vegetation heights, relatively dry soils, and greater distance from forest cutblocks, while in less disturbed areas of the foothills, motorized activity on seismic lines decreased with increasing seismic line density, slope steepness, and white-tailed deer abundance, and increased with distance to roads. We generated predictive maps of high motorized activity, identifying 21,777 km of seismic lines where active restoration could expedite forest regeneration.

  11. Computer-Aided Facilities Management Systems (CAFM).

    ERIC Educational Resources Information Center

    Cyros, Kreon L.

    Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…

  12. Integrated system for well-to-well correlation with geological knowledge base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, K.; Doi, E.; Uchiyama, T.

    1987-05-01

    A task of well-to-well correlation is an essential part of the reservoir description study. Since the task involves diverse data such as logs, dipmeter, seismic, and reservoir engineering, a system with simultaneous access to such data is desirable. A system was developed to aid stratigraphic correlation under a Xerox 1108 workstation, written in INTERLISP-D. The system uses log, dipmeter, seismic, and computer-processed results such as Litho-Analysis and LSA (Log Shape Analyzer). The system first defines zones, which are segmentations of log data into consistent layers, using Litho-Analysis and LSA results. Each zone is defined as a minimum unit for correlation with slot values of lithology, thickness, log values, and log shape such as bell, cylinder, and funnel. Using a user's input of local geological knowledge such as depositional environment, the system selects marker beds and performs correlation among the wells chosen from the base map. Correlation is performed first with markers and then with sandstones of lesser lateral extent. Structural dip and seismic horizon are guides for seeking a correlatable event. Knowledge of sand body geometry such as the ratio of thickness and width is also used to provide a guide on how far a correlation should be made. Correlation results performed by the system are displayed on the screen for the user to examine and modify. The system has been tested with data sets from several depositional settings and has been shown to be a useful tool for correlation work. The results are stored as a data base for structural mapping and reservoir engineering study.

  13. The utility of petroleum seismic exploration data in delineating structural features within salt anticlines

    USGS Publications Warehouse

    Stockton, S.L.; Balch, Alfred H.

    1978-01-01

    The Salt Valley anticline, in the Paradox Basin of southeastern Utah, is under investigation for use as a location for storage of solid nuclear waste. Delineation of thin, nonsalt interbeds within the upper reaches of the salt body is extremely important because the nature and character of any such fluid- or gas-saturated horizons would be critical to the mode of emplacement of wastes into the structure. Analysis of 50 km of conventional seismic-reflection data, in the vicinity of the anticline, indicates that mapping of thin beds at shallow depths may well be possible using a specially designed adaptation of state-of-the-art seismic oil-exploration procedures. Computer ray-trace modeling of thin beds in salt reveals that the frequency and spatial resolution required to map the details of interbeds at shallow depths (less than 750 m) may be on the order of 500 Hz, with surface-spread lengths of less than 350 m. Consideration should be given to the burial of sources and receivers in order to attenuate surface noise and to record the desired high frequencies. Correlation of the seismic-reflection data with available well data and surface geology reveals the complex, structurally initiated diapir, whose upward flow was maintained by rapid contemporaneous deposition of continental clastic sediments on its flanks. Severe collapse faulting near the crests of these structures has distorted the seismic response. Evidence exists, however, that intrasalt thin beds of anhydrite, dolomite, and black shale are mappable on seismic record sections either as short, discontinuous reflected events or as amplitude anomalies that result from focusing of the reflected seismic energy by the thin beds; computer modeling of the folded interbeds confirms both of these as possible causes of seismic response from within the salt diapir. Prediction of the seismic signatures of the interbeds can be made from computer-model studies. Petroleum seismic-reflection data are unsatisfactory for mapping the thin beds because of the lack of sufficient resolution to provide direct evidence of the presence of the thin beds. However, indirect evidence, present in these data as discontinuous seismic events, suggests that two geophysical techniques designed for this specific problem would allow direct detection of the interbeds in salt. These techniques are vertical seismic profiling and shallow, short-offset, high-frequency, seismic-reflection recording.

  14. A multimethod Global Sensitivity Analysis to aid the calibration of geomechanical models via time-lapse seismic data

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Angus, D. A.; Garcia, A.; Fisher, Q. J.; Parsons, S.; Kato, J.

    2018-03-01

    Time-lapse seismic attributes are used extensively in the history matching of production simulator models. However, although proven to contain information regarding production-induced stress change, they are typically only loosely (i.e. qualitatively) used to calibrate geomechanical models. In this study we conduct a multimethod Global Sensitivity Analysis (GSA) to assess the feasibility of, and aid, the quantitative calibration of geomechanical models via near-offset time-lapse seismic data, specifically the calibration of the mechanical properties of the overburden. Via the GSA, we analyse the near-offset overburden seismic traveltimes from over 4000 perturbations of a Finite Element (FE) geomechanical model of a typical High Pressure High Temperature (HPHT) reservoir in the North Sea. We find that, out of an initially large set of material properties, the near-offset overburden traveltimes are primarily affected by Young's modulus and the effective stress (i.e. Biot) coefficient. The unexpected significance of the Biot coefficient highlights the importance of modelling fluid flow and pore pressure outside of the reservoir. The FE model is complex and highly nonlinear. Multiple combinations of model parameters can yield equally possible model realizations. Consequently, numerical calibration via a large number of random model perturbations is unfeasible. However, the significant differences in traveltime results suggest that more sophisticated calibration methods could potentially be feasible for finding numerous suitable solutions. The results of the time-varying GSA demonstrate how acquiring multiple vintages of time-lapse seismic data can be advantageous. However, they also suggest that significant overburden near-offset seismic time-shifts, useful for model calibration, may take up to 3 yrs after the start of production to manifest. Due to the nonlinearity of the model behaviour, similar uncertainty in the reservoir mechanical properties appears to influence overburden traveltime to a much greater extent. Therefore, reservoir properties must be known to a suitable degree of accuracy before the calibration of the overburden can be considered.

  15. On the Value of Computer-aided Instruction: Thoughts after Teaching Sales Writing in a Computer Classroom.

    ERIC Educational Resources Information Center

    Hagge, John

    1986-01-01

    Focuses on problems encountered with computer-aided writing instruction. Discusses conflicts caused by the computer classroom concept, some general paradoxes and ethical implications of computer-aided instruction. (EL)

  16. Project-Based Teaching-Learning Computer-Aided Engineering Tools

    ERIC Educational Resources Information Center

    Simoes, J. A.; Relvas, C.; Moreira, R.

    2004-01-01

    Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering and rapid prototyping are tools that play an important key role within product design. These are areas of technical knowledge that must be part of engineering and industrial design courses' curricula. This paper describes our teaching experience of…

  17. The Collaborative Seismic Earth Model: Generation 1

    NASA Astrophysics Data System (ADS)

    Fichtner, Andreas; van Herwaarden, Dirk-Philip; Afanasiev, Michael; Simutė, Saulė; Krischer, Lion; Çubuk-Sabuncu, Yeşim; Taymaz, Tuncay; Colli, Lorenzo; Saygin, Erdinc; Villaseñor, Antonio; Trampert, Jeannot; Cupillard, Paul; Bunge, Hans-Peter; Igel, Heiner

    2018-05-01

    We present a general concept for evolutionary, collaborative, multiscale inversion of geophysical data, specifically applied to the construction of a first-generation Collaborative Seismic Earth Model. This is intended to address the limited resources of individual researchers and the often limited use of previously accumulated knowledge. Model evolution rests on a Bayesian updating scheme, simplified into a deterministic method that honors today's computational restrictions. The scheme is able to harness distributed human and computing power. It furthermore handles conflicting updates, as well as variable parameterizations of different model refinements or different inversion techniques. The first-generation Collaborative Seismic Earth Model comprises 12 refinements from full seismic waveform inversion, ranging from regional crustal- to continental-scale models. A global full-waveform inversion ensures that regional refinements translate into whole-Earth structure.
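    A minimal sketch of what a deterministic, conflict-tolerant update step on a common grid could look like is given below; the grid dimensions, taper and velocity values are invented for illustration and do not represent the actual CSEM updating scheme.

```python
import numpy as np

# Conceptual sketch of a deterministic model update: a regional refinement is
# blended into the current collective model only where the refinement is
# defined, with a smooth taper to avoid artificial jumps at the region edge.
nx, ny = 200, 100
collective = np.full((nx, ny), 4.5)            # current S-velocity model (km/s)
refinement = 4.5 + 0.2 * np.random.default_rng(1).standard_normal((60, 40))

mask = np.zeros((nx, ny))                      # 1 inside the refined region
mask[70:130, 30:70] = 1.0

# Simple separable moving-average taper to smooth the region boundary.
kernel = np.ones(9) / 9.0
for axis in (0, 1):
    mask = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"),
                               axis, mask)

update = np.zeros_like(collective)
update[70:130, 30:70] = refinement - collective[70:130, 30:70]
collective = collective + mask * update        # blended, conflict-free update
```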

  18. The Effect of Boiling on Seismic Properties of Water-Saturated Fractured Rock

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Quintal, Beatriz; Caspari, Eva; Deuber, Claudia; Maurer, Hansruedi; Greenhalgh, Stewart

    2017-11-01

    Seismic campaigns for exploring geothermal systems aim at detecting permeable formations in the subsurface and evaluating the energy state of the pore fluids. High-enthalpy geothermal resources are known to contain fluids ranging from liquid water up to liquid-vapor mixtures in regions where boiling occurs and, ultimately, to vapor-dominated fluids, for instance, if hot parts of the reservoir get depressurized during production. In this study, we implement the properties of single- and two-phase fluids into a numerical poroelastic model to compute frequency-dependent seismic velocities and attenuation factors of a fractured rock as a function of fluid state. Fluid properties are computed while considering that thermodynamic interaction between the fluid phases takes place. This leads to frequency-dependent fluid properties and fluid internal attenuation. As shown in a first example, if the fluid contains very small amounts of vapor, fluid internal attenuation is of similar magnitude as attenuation in fractured rock due to other mechanisms. In a second example, seismic properties of a fractured geothermal reservoir with spatially varying fluid properties are calculated. Using the resulting seismic properties as an input model, the seismic response of the reservoir is then computed while the hydrothermal structure is assumed to vary over time. The resulting seismograms demonstrate that anomalies in the seismic response due to fluid state variability are small compared to variations caused by geological background heterogeneity. However, the hydrothermal structure in the reservoir can be delineated from amplitude anomalies when the variations due to geology can be ruled out such as in time-lapse experiments.

  19. Comparison of the Structurally Controlled Landslides Numerical Model Results to the M 7.2 2013 Bohol Earthquake Co-seismic Landslides

    NASA Astrophysics Data System (ADS)

    Macario Galang, Jan Albert; Narod Eco, Rodrigo; Mahar Francisco Lagmay, Alfredo

    2015-04-01

    The M 7.2 October 15, 2013 Bohol earthquake is the most destructive earthquake to hit the Philippines since 2012. The epicenter was located in Sagbayan municipality, central Bohol, and the earthquake was generated by a previously unmapped reverse fault called the "Inabanga Fault", named after the barangay (village) where the fault is best exposed and was first observed. The earthquake resulted in 209 fatalities and over 57 billion USD worth of damages. The earthquake generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, co-seismic landslides are triggered without warning. Preparedness against this type of landslide therefore relies heavily on the identification of fracture-related unstable slopes. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations or discontinuity sets were mapped in the field with the aid of a 2012 IFSAR Digital Terrain Model (DTM) with 5-meter pixel resolution and < 0.5 meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dip and dip directions. The chosen discontinuity sets were then keyed into Matterocking software to identify potential rock slide zones due to planar or wedged discontinuities. After identifying the structurally controlled unstable slopes, the rock mass propagation extent of the possible rock slides was simulated using Conefall. The results were compared to a post-earthquake landslide inventory of 456 landslides. Of the total number of landslides identified from post-earthquake high-resolution imagery, 366 or 80% intersect the structurally controlled hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods for simulating shallow landslides and debris flow paths, the located structurally controlled unstable zones can be used to mark areas unsafe for settlement. The method can be further improved with the use of Lidar DTMs, which have better accuracy than the IFSAR DTM. A nationwide effort under DOST-Project NOAH (DREAM-LIDAR) is underway to map the Philippine archipelago using Lidar.

  20. Full Waveform Adjoint Seismic Tomography of the Antarctic Plate

    NASA Astrophysics Data System (ADS)

    Lloyd, A. J.; Wiens, D.; Zhu, H.; Tromp, J.; Nyblade, A.; Anandakrishnan, S.; Aster, R. C.; Huerta, A. D.; Winberry, J. P.; Wilson, T. J.; Dalziel, I. W. D.; Hansen, S. E.; Shore, P.

    2017-12-01

    Recent studies investigating the response and influence of the solid Earth on the evolution of the cryosphere demonstrate the need to account for 3D rheological structure to better predict ice sheet dynamics, stability, and future sea level impact, as well as to improve glacial isostatic adjustment models and more accurately measure ice mass loss. Critical rheological properties like mantle viscosity and lithospheric thickness may be estimated from shear wave velocity models that, for Antarctica, would ideally possess regional-scale resolution extending down to at least the base of the transition zone (i.e. 670 km depth). However, current global- and continental-scale seismic velocity models are unable to obtain both the resolution and spatial coverage necessary, do not take advantage of the full set of available Antarctic data, and, in most instances, employ traditional seismic imaging techniques that utilize limited seismogram information. We utilize 3-component earthquake waveforms from almost 300 Antarctic broadband seismic stations and 26 southern mid-latitude stations from 270 earthquakes (5.5 ≤ Mw ≤ 7.0) between 2001-2003 and 2007-2016 to conduct a full-waveform adjoint inversion for Antarctica and surrounding regions of the Antarctic plate. Necessary forward and adjoint wavefield simulations are performed utilizing SPECFEM3D_GLOBE with the aid of the Texas Advanced Computing Center. We utilize phase observations from seismogram segments containing P, S, Rayleigh, and Love waves, including reflections and overtones, which are autonomously identified using FLEXWIN. The FLEXWIN analysis is carried out over a short (15-50 s) and long (initially 50-150 s) period band that target body waves, or body and surface waves, respectively. As our model is iteratively refined, the short-period corner of the long period band is gradually reduced to 25 s as the model converges over 20 linearized inversion iterations. We will briefly present this new high-resolution transverse isotropic seismic model of the Antarctic upper mantle and transition zone, which will be broadly valuable to advance cryosphere studies and improve understanding of the tectonic structure and geodynamic processes of Antarctica.

  1. Matching pursuit parallel decomposition of seismic data

    NASA Astrophysics Data System (ADS)

    Li, Chuanhui; Zhang, Fanchang

    2017-07-01

    To improve the computation speed of matching pursuit decomposition of seismic data, a parallel matching pursuit algorithm is designed in this paper. In every iteration we pick a fixed number of envelope peaks from the current signal, according to the number of compute nodes, and distribute them evenly among the compute nodes, which search for the optimal Morlet wavelets in parallel. Built on parallel computer systems and the Message Passing Interface, the parallel algorithm exploits the advantages of parallel computing to significantly improve the computation speed of the matching pursuit decomposition and also scales well. Moreover, having each compute node search for only one optimal Morlet wavelet per iteration proves to be the most efficient implementation.
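    The sketch below illustrates the idea on a single synthetic trace: envelope peaks of the current residual are distributed among workers, each worker searches for its best-fitting Morlet wavelet, and the globally best atom is subtracted. A thread pool stands in for the MPI compute nodes of the paper, and all wavelet parameters and search ranges are assumptions.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from scipy.signal import hilbert, find_peaks

def morlet(t, t0, f0, scale):
    x = (t - t0) / scale
    return np.exp(-0.5 * x**2) * np.cos(2.0 * np.pi * f0 * (t - t0))

def best_atom_for_peak(args):
    """Search frequencies/scales for the atom that best matches the residual."""
    t, residual, t0 = args
    best = (0.0, np.zeros_like(residual))
    for f0 in np.linspace(10.0, 60.0, 26):          # Hz
        for scale in np.linspace(0.01, 0.1, 10):    # s
            atom = morlet(t, t0, f0, scale)
            coeff = np.dot(residual, atom) / np.dot(atom, atom)
            if abs(coeff) > abs(best[0]):
                best = (coeff, coeff * atom)
    return best

dt = 0.002
t = np.arange(0, 2.0, dt)
trace = morlet(t, 0.7, 30.0, 0.03) + 0.5 * morlet(t, 1.2, 20.0, 0.05)
residual = trace.copy()

for _ in range(2):                                   # two matching-pursuit iterations
    envelope = np.abs(hilbert(residual))
    peaks, _ = find_peaks(envelope)
    top = peaks[np.argsort(envelope[peaks])[-4:]]    # fixed number of peaks per iteration
    with ThreadPoolExecutor(max_workers=4) as pool:  # stands in for MPI compute nodes
        results = list(pool.map(best_atom_for_peak,
                                [(t, residual, t[p]) for p in top]))
    coeff, atom = max(results, key=lambda r: abs(r[0]))
    residual = residual - atom
    print("removed atom with coefficient", round(coeff, 3))
```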

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busbey, A.B.

    Seismic Processing Workshop, a program by Parallel Geosciences of Austin, TX, is discussed in this column. The program is a high-speed, interactive seismic processing and computer analysis system for the Apple Macintosh II family of computers. Also reviewed in this column are three products from Wilkerson Associates of Champaign, IL. SubSide is an interactive program for basin subsidence analysis; MacFault and MacThrustRamp are programs for modeling faults.

  3. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to the smaller computation overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
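    The clustering step can be illustrated with a toy single-station example in which candidate windows are grouped by the similarity of their amplitude spectra; the synthetic waveforms, feature choice and distance threshold below are assumptions and not the actual RSD implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy sketch of the agglomerative-clustering step of a repeating-signal
# detector: candidate windows (already detected on one station) are grouped by
# the similarity of their amplitude spectra.
rng = np.random.default_rng(42)
dt, npts = 0.01, 512
t = np.arange(npts) * dt

def event(f0):
    return np.exp(-((t - 2.0) ** 2) / 0.1) * np.sin(2 * np.pi * f0 * t)

windows = [event(5.0) + 0.1 * rng.standard_normal(npts) for _ in range(5)] + \
          [event(12.0) + 0.1 * rng.standard_normal(npts) for _ in range(4)] + \
          [rng.standard_normal(npts) for _ in range(3)]          # unrelated noise

# Feature vector: normalized amplitude spectrum of each window.
spectra = np.abs(np.fft.rfft(np.array(windows), axis=1))
spectra /= np.linalg.norm(spectra, axis=1, keepdims=True)

# Agglomerative clustering with correlation distance; windows with highly
# correlated spectra end up in the same cluster (threshold is an assumption).
Z = linkage(spectra, method="average", metric="correlation")
labels = fcluster(Z, t=0.2, criterion="distance")
print("cluster labels:", labels)
```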

  4. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
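    A loose time-domain analogue of the matched-field detection idea is sketched below: continuous array data are scanned for times at which they match the "replica" waveforms of a known master event. The synthetic data, array size and detection statistic are assumptions for illustration, not the MFP implementation used in the study.

```python
import numpy as np

# Scan continuous array data for hidden repeats of a master event by stacking
# normalized correlations over all stations (all inputs are synthetic).
rng = np.random.default_rng(7)
nsta, npts, nwin = 4, 2000, 200

replicas = rng.standard_normal((nsta, nwin))              # master-event templates
data = 0.2 * rng.standard_normal((nsta, npts))
data[:, 1000:1000 + nwin] += 0.5 * replicas               # hidden repeat of the master

def detection_statistic(data, replicas, shift):
    score = 0.0
    for s in range(replicas.shape[0]):
        d = data[s, shift:shift + replicas.shape[1]]
        r = replicas[s]
        score += np.dot(d, r) / (np.linalg.norm(d) * np.linalg.norm(r))
    return score / replicas.shape[0]

scores = np.array([detection_statistic(data, replicas, k)
                   for k in range(npts - nwin)])
print("best detection at sample", int(np.argmax(scores)),
      "score", round(float(scores.max()), 2))
```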

  5. Seismic modeling of Earth's 3D structure: Recent advancements

    NASA Astrophysics Data System (ADS)

    Ritsema, J.

    2008-12-01

    Global models of Earth's seismic structure continue to improve due to the growth of seismic data sets, the implementation of advanced wave propagation theories, and increased computational power. In my presentation, I will summarize seismic tomography results from the past 5-10 years. I will compare the most recent P and S velocity models, discuss model resolution and model interpretation, and present an admittedly biased list of research directions required to develop the next generation of 3D models.

  6. Sensitivity Kernels of Seismic Traveltimes and Amplitudes for Quality Factor and Boundary Topography

    NASA Astrophysics Data System (ADS)

    Hsieh, M.; Zhao, L.; Ma, K.

    2010-12-01

    The finite-frequency approach enables seismic tomography to fully utilize the spatial and temporal distributions of the seismic wavefield to improve resolution. In achieving this goal, one of the most important tasks is to compute efficiently and accurately the (Fréchet) sensitivity kernels of finite-frequency seismic observables such as traveltime and amplitude to the perturbations of model parameters. In the scattering-integral approach, the Fréchet kernels are expressed in terms of the strain Green tensors (SGTs), and a pre-established SGT database is necessary to achieve practical efficiency for a three-dimensional reference model in which the SGTs must be calculated numerically. Methods for computing Fréchet kernels for seismic velocities have long been established. In this study, we develop algorithms based on the finite-difference method for calculating Fréchet kernels for the quality factor Qμ and seismic boundary topography. Kernels for the quality factor can be obtained in a way similar to those for seismic velocities with the help of the Hilbert transform. The effects of seismic velocities and quality factor on either traveltime or amplitude are coupled. Kernels for boundary topography involve the spatial gradient of the SGTs, and they also exhibit interesting finite-frequency characteristics. Examples of quality factor and boundary topography kernels will be shown for a realistic model for the Taiwan region with three-dimensional velocity variation as well as surface and Moho discontinuity topography.

  7. Engineering Technology Programs Courses Guide for Computer Aided Design and Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Div. of Vocational Education.

    This guide describes the requirements for courses in computer-aided design and computer-aided manufacturing (CAD/CAM) that are part of engineering technology programs conducted in vocational-technical schools in Georgia. The guide is organized in five sections. The first section provides a rationale for occupations in design and in production,…

  8. Employment Opportunities for the Handicapped in Programmable Automation.

    ERIC Educational Resources Information Center

    Swift, Richard; Leneway, Robert

    A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…

  9. Post-processing of seismic parameter data based on valid seismic event determination

    DOEpatents

    McEvilly, Thomas V.

    1985-01-01

    An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFT's) for both P- and S- waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P- wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys allowing flexibility in experimental procedures, with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation, but rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.
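    The per-channel processing chain described in the patent (detect, measure, reduce to parameters) can be caricatured as follows; the STA/LTA trigger, window lengths and thresholds below are generic choices for illustration, not the patented implementation.

```python
import numpy as np

# Detect a candidate event on one channel, then measure peak amplitude and a
# spectrum so that only parameters, not raw waveforms, need to be passed on.
rng = np.random.default_rng(3)
dt = 0.01
trace = 0.05 * rng.standard_normal(6000)
trace[3000:3200] += np.sin(2 * np.pi * 8.0 * np.arange(200) * dt) * \
                    np.exp(-np.arange(200) / 80.0)          # synthetic P arrival

def sta_lta(x, nsta=20, nlta=500):
    """Short-term over long-term average of signal power."""
    power = x ** 2
    sta = np.convolve(power, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(power, np.ones(nlta) / nlta, mode="same")
    return sta / (lta + 1e-12)

ratio = sta_lta(trace)
trigger = np.argmax(ratio > 4.0)                            # first sample above threshold
window = trace[trigger:trigger + 256]
params = {
    "trigger_time_s": trigger * dt,
    "peak_amplitude": float(np.max(np.abs(window))),
    "dominant_freq_hz": float(np.fft.rfftfreq(window.size, dt)
                              [np.argmax(np.abs(np.fft.rfft(window)))]),
}
print(params)
```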

  10. Numerical modeling of landslides and generated seismic waves: The Bingham Canyon Mine landslides

    NASA Astrophysics Data System (ADS)

    Miallot, H.; Mangeney, A.; Capdeville, Y.; Hibert, C.

    2016-12-01

    Landslides are important natural hazards and key erosion processes. They create long period surface waves that can be recorded by regional and global seismic networks. The seismic signals are generated by acceleration/deceleration of the mass sliding over the topography. These signals constitute a unique and powerful tool to detect, characterize and quantify the landslide dynamics. We investigate here the processes at work during the two massive landslides that struck the Bingham Canyon Mine on 10 April 2013. We carry out a combined analysis of the generated seismic signals and the landslide processes computed with a 3D model on complex topography. Forces computed by broadband seismic waveform inversion are used to constrain the study, in particular the force source and the bulk dynamics. The source time functions are obtained with a 3D model (Shaltop) in which rheological parameters can be adjusted. We first investigate the influence of the initial shape of the sliding mass, which strongly affects the whole landslide dynamics. We also see that the initial shape of the source mass of the first landslide constrains the source mass of the second landslide reasonably well. We then investigate the effect of a rheological parameter, the friction angle, which strongly influences the resulting computed seismic source function. We test several friction laws, such as the Coulomb friction law and a velocity-weakening friction law. Our results show that the force waveform fitting the observed data is highly variable depending on these different choices.

  11. A first step to compare geodynamical models and seismic observations of the inner core

    NASA Astrophysics Data System (ADS)

    Lasbleis, M.; Waszek, L.; Day, E. A.

    2016-12-01

    Seismic observations have revealed a complex inner core, with lateral and radial heterogeneities at all observable scales. The dominant feature is the east-west hemispherical dichotomy in seismic velocity and attenuation. Several geodynamical models have been proposed to explain the observed structure: convective instabilities, external forces, crystallisation processes or influence of outer core convection. However, interpreting such geodynamical models in terms of the seismic observations is difficult, and has been performed only for very specific models (Geballe 2013, Lincot 2014, 2016). Here, we propose a common framework to make such comparisons. We have developed a Python code that propagates seismic ray paths through kinematic geodynamical models for the inner core, computing a synthetic seismic data set that can be compared to seismic observations. Following the method of Geballe 2013, we start with the simple model of translation. For this, the seismic velocity is proposed to be a function of the age or initial growth rate of the material (since there is no deformation included in our models); the assumption is reasonable when considering translation, growth and super rotation of the inner core. Using both artificial (random) seismic ray data sets and a real inner core data set (from Waszek et al. 2011), we compare these different models. Our goal is to determine the model which best matches the seismic observations. Preliminary results show that super rotation successfully creates an eastward shift in properties with depth, as has been observed seismically. Neither the growth rate of inner core material nor the relationship between crystal size and seismic velocity are well constrained. Consequently our method does not directly compute the seismic travel times. Instead, here we use age, growth rate and other parameters as proxies for the seismic properties, which represents a good first step towards comparing geodynamical models and seismic observations. Ultimately, we aim to release our codes to the broader scientific community, allowing researchers from all disciplines to test their models of inner core growth against seismic observations or create a kinematic model for the evolution of the inner core which matches new geophysical observations.

  12. Time-Independent Annual Seismic Rates, Based on Faults and Smoothed Seismicity, Computed for Seismic Hazard Assessment in Italy

    NASA Astrophysics Data System (ADS)

    Murru, M.; Falcone, G.; Taroni, M.; Console, R.

    2017-12-01

    In 2015 the Italian Department of Civil Protection started a project to upgrade the official Italian seismic hazard map (MPS04), inviting the Italian scientific community to participate in a joint effort for its realization. We participated by providing spatially variable, time-independent (Poisson) long-term annual occurrence rates of seismic events over the entire Italian territory, considering cells of 0.1°x0.1° from M4.5 up to M8.1 in magnitude bins of 0.1 units. Our final model was composed of two different models, merged into one ensemble model with equal weights: the first was realized with a smoothed seismicity approach, the second using the seismogenic faults. The smoothed seismicity was obtained using the smoothing method introduced by Frankel (1995) applied to the historical and instrumental seismicity. In this approach we adopted a tapered Gutenberg-Richter relation with a b-value fixed to 1 and a corner magnitude estimated from the largest events in the catalogs. For each seismogenic fault provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate (for each 0.1°x0.1° cell) in magnitude bins of 0.1 units, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the same tapered Gutenberg-Richter relation as in the smoothed seismicity model. The annual rate for the final model was determined in the following way: if a cell falls within one of the seismic sources, we merge, with equal weight, the rate determined from the seismic moments of the earthquakes generated by the fault and the rate from the smoothed seismicity model; if instead a cell falls outside any seismic source, we take the rate obtained from the spatially smoothed seismicity. Here we present the final results of our study to be used for the new Italian seismic hazard map.
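    The two ingredients of the smoothed-seismicity branch, Frankel-style spatial smoothing and a tapered Gutenberg-Richter magnitude distribution with b = 1, are sketched below on a synthetic catalogue; the grid size, correlation distance, corner magnitude and catalogue duration are assumptions for illustration.

```python
import numpy as np

# (1) Gaussian smoothing of epicentre counts on a 0.1-degree grid, and
# (2) a tapered Gutenberg-Richter (tapered Pareto in moment) distribution used
#     to spread each cell's total rate over 0.1-magnitude bins.
rng = np.random.default_rng(0)
nlat, nlon = 60, 80
counts = np.zeros((nlat, nlon))
for _ in range(300):                                  # synthetic epicentres
    i, j = rng.integers(0, nlat), rng.integers(0, nlon)
    counts[i, j] += 1

def gaussian_smooth(grid, corr_cells=5.0):
    """Isotropic Gaussian kernel smoothing (correlation distance in cells)."""
    n = int(3 * corr_cells)
    y, x = np.mgrid[-n:n + 1, -n:n + 1]
    kernel = np.exp(-(x**2 + y**2) / (2.0 * corr_cells**2))
    kernel /= kernel.sum()
    out = np.zeros_like(grid)
    for di in range(-n, n + 1):
        for dj in range(-n, n + 1):
            out += kernel[di + n, dj + n] * np.roll(np.roll(grid, di, 0), dj, 1)
    return out

rate_per_cell = gaussian_smooth(counts) / 50.0        # events/yr for a 50-yr catalogue

def magnitude_to_moment(m):
    return 10.0 ** (1.5 * m + 9.1)                    # N m

def tapered_gr_fraction(m, m_min=4.5, m_corner=7.5, b=1.0):
    """Fraction of events with magnitude >= m (tapered Pareto in moment)."""
    beta = 2.0 * b / 3.0
    M, Mt, Mc = map(magnitude_to_moment, (m, m_min, m_corner))
    return (Mt / M) ** beta * np.exp((Mt - M) / Mc)

mags = np.arange(4.5, 8.1, 0.1)
bin_fractions = tapered_gr_fraction(mags) - tapered_gr_fraction(mags + 0.1)
annual_rates = rate_per_cell[:, :, None] * bin_fractions   # (lat, lon, mag bin)
print(annual_rates.shape)
```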

  13. Computer power fathoms the depths: billion-bit data processors illuminate the subsurface. [3-D Seismic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, J.J.

    Some of the same space-age signal technology being used to track events 200 miles above the earth is helping petroleum explorationists track down oil and natural gas two miles and more down into the earth. The breakthroughs, which have come in a technique called three-dimensional seismic work, could change the complexion of exploration for oil and natural gas. Thanks to this 3-D seismic approach, explorationists can make dynamic maps of sites miles beneath the surface. Then explorationists can throw these maps on space-age computer systems and manipulate them every which way - homing in sharply on salt domes, faults, sands, and traps associated with oil and natural gas. "The 3-D seismic scene has exploded within the last two years," says Peiter Tackenberg, Marathon technical consultant who deals with both domestic and international exploration. The 3-D technique has been around for more than a decade, he notes, but recent achievements in space-age computer hardware and software have unlocked its full potential.

  14. The use of vertical seismic profiles in seismic investigations of the earth

    USGS Publications Warehouse

    Balch, Alfred H.; Lee, M.W.; Miller, J.J.; Ryder, Robert T.

    1982-01-01

    During the past 8 years, the U.S. Geological Survey has conducted an extensive investigation on the use of vertical seismic profiles (VSP) in a variety of seismic exploration applications. Seismic sources used were surface air guns, vibrators, explosives, marine air guns, and downhole air guns. Source offsets have ranged from 100 to 7800 ft. Well depths have been from 1200 to over 10,000 ft. We have found three specific ways in which VSPs can be applied to seismic exploration. First, seismic events observed at the surface of the ground can be traced, level by level, to their point of origin within the earth. Thus, one can tie a surface profile to a well log with an extraordinarily high degree of confidence. Second, one can establish the detectability of a target horizon, such as a porous zone. One can determine (either before or after surface profiling) whether or not a given horizon or layered sequence returns a detectable reflection to the surface. The amplitude and character of the reflection can also be observed. Third, acoustic properties of a stratigraphic sequence can be measured and sometimes correlated to important exploration parameters. For example, sometimes a relationship between apparent attenuation and sand percentage can be established. The technique shows additional promise of aiding surface exploration indirectly through studies of the evolution of the seismic pulse, studies of ghosts and multiples, and studies of seismic trace inversion techniques. Nearly all current seismic data‐processing techniques are adaptable to the processing of VSP data, such as normal moveout (NMO) corrections, stacking, single‐and multiple‐channel filtering, deconvolution, and wavelet shaping.

  15. Advanced computational tools for 3-D seismic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  16. CAD/CAM (Computer Aided Design/Computer Aided Manufacture). A Brief Guide to Materials in the Library of Congress.

    ERIC Educational Resources Information Center

    Havas, George D.

    This brief guide to materials in the Library of Congress (LC) on computer aided design and/or computer aided manufacturing lists reference materials and other information sources under 13 headings: (1) brief introductions; (2) LC subject headings used for such materials; (3) textbooks; (4) additional titles; (5) glossaries and handbooks; (6)…

  17. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators such as AMD graphics cards, ARM-based processors, and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both CUDA and OpenCL languages within the source code package. Seismic wave simulations are thus now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  18. Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation

    NASA Astrophysics Data System (ADS)

    Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco

    2017-11-01

    Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows for a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveforms modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology evolved with respect to the original formulation used by Parvez et al. (Geophys J Int 155:489-508, 2003): the computer codes were improved to better fit the need of producing realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant pertinent progresses in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.

  19. Student Achievement in Computer Programming: Lecture vs Computer-Aided Instruction

    ERIC Educational Resources Information Center

    Tsai, San-Yun W.; Pohl, Norval F.

    1978-01-01

    This paper discusses a study of the differences in student learning achievement, as measured by four different types of common performance evaluation techniques, in a college-level computer programming course under three teaching/learning environments: lecture, computer-aided instruction, and lecture supplemented with computer-aided instruction.…

  20. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are far from suitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, allow nowadays for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both national and local scales, the latter accounting for the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to HPC dedicated clusters up to Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates the access to the NDSHA methodology and the related outputs by end-users, who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large scale hazard maps of ground shaking at bedrock to the definition of local scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  1. Array seismological investigation of the South Atlantic 'Superplume'

    NASA Astrophysics Data System (ADS)

    Hempel, Stefanie; Gassmöller, Rene; Thomas, Christine

    2015-04-01

    We apply the axisymmetric, spherical Earth spectral-element code AxiSEM to model seismic compressional waves which sample complex `superplume' structures in the lower mantle. High-resolution array seismological stacking techniques are evaluated regarding their capability to resolve large-scale high-density low-velocity bodies including interior structure such as inner upwellings, high density lenses, ultra-low velocity zones (ULVZs), neighboring remnant slabs and adjacent small-scale uprisings. Synthetic seismograms are also computed and processed for models of the Earth resulting from geodynamic modelling of the South Atlantic mantle including plate reconstruction. We discuss the interference and suppression of the resulting seismic signals and implications for a seismic data study in terms of visibility of the South Atlantic `superplume' structure. This knowledge is used to process, invert and interpret our data set of seismic sources from the Andes and the South Sandwich Islands detected at seismic arrays spanning from Ethiopia across Cameroon to South Africa, mapping the South Atlantic `superplume' structure including its interior structure. In order to present the model of the South Atlantic `superplume' structure that best fits the seismic data set, we iteratively compute synthetic seismograms while adjusting the model according to the dependencies found in the parameter study.

  2. The Quake Catcher Network: Cyberinfrastructure Bringing Seismology into Schools and Homes

    NASA Astrophysics Data System (ADS)

    Lawrence, J. F.; Cochran, E. S.

    2007-12-01

    We propose to implement a high density, low cost strong-motion network for rapid response and early warning by placing sensors in schools, homes, and offices. The Quake Catcher Network (QCN) will employ existing networked laptops and desktops to form the world's largest high-density, distributed computing seismic network. Costs for this network will be minimal because the QCN will use 1) strong motion sensors (accelerometers) already internal to many laptops and 2) nearly identical low-cost universal serial bus (USB) accelerometers for use with desktops. The Berkeley Open Infrastructure for Network Computing (BOINC!) provides a free, proven paradigm for involving the public in large-scale computational research projects. As evidenced by the SETI@home program and others, individuals are especially willing to donate their unused computing power to projects that they deem relevant, worthwhile, and educational. The client- and server-side software will rapidly monitor incoming seismic signals, detect the magnitudes and locations of significant earthquakes, and may even provide early warnings to other computers and users before they can feel the earthquake. The software will provide the client-user with a screen-saver displaying seismic data recorded on their laptop, recently detected earthquakes, and general information about earthquakes and the geosciences. Furthermore, this project will install USB sensors in K-12 classrooms as an educational tool for teaching science. Through a variety of interactive experiments students will learn about earthquakes and the hazards earthquakes pose. For example, students can learn how the vibrations of an earthquake decrease with distance by jumping up and down at increasing distances from the sensor and plotting the decreased amplitude of the seismic signal measured on their computer. We hope to include an audio component so that students can hear and better understand the difference between low and high frequency seismic signals. The QCN will provide a natural way to engage students and the public in earthquake detection and research.
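    The classroom amplitude-versus-distance exercise mentioned above boils down to fitting a decay curve to a handful of peak readings, as in the toy example below; the sensor readings are invented for illustration.

```python
import numpy as np

# Peak amplitude recorded by one sensor versus the distance of each jump,
# compared against a simple power-law decay fitted in log-log space.
distances_m = np.array([1.0, 2.0, 4.0, 8.0])
peak_amplitudes = np.array([0.80, 0.42, 0.19, 0.11])      # pretend sensor readings

slope, log_a0 = np.polyfit(np.log(distances_m), np.log(peak_amplitudes), 1)
print(f"estimated decay exponent: {-slope:.2f} "
      "(values near 1 are consistent with geometric spreading)")
```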

  3. Silicon Wafer Advanced Packaging (SWAP). Multichip Module (MCM) Foundry Study. Version 2

    DTIC Science & Technology

    1991-04-08

    Next Layer Dielectric Spacing - Additional Metal Thickness Impact on Dielectric Uniformity/Adhesion. The first step in the experimental design would be... [OCR fragment; the remainder of the recoverable excerpt is an acronym glossary:] CAM - computer aided manufacturing; CAE - computer aided engineering; CALCE - computer aided life cycle engineering center; CARMA - computer aided...; CVD - chemical vapor deposition; DA - design automation; DEC - Digital Equipment Corporation; DFT - design for testability.

  4. The application of computer-aided technologies in automotive styling design

    NASA Astrophysics Data System (ADS)

    Zheng, Ze-feng; Zhang, Ji; Zheng, Ying

    2012-04-01

    In the automotive industry, styling design is its lifeblood and creative design its soul. Computer-aided technologies are widely used in the automotive industry and are receiving increasing attention. This paper chiefly introduces the application of computer-aided technologies including CAD, CAM and CAE, analyses the process of automotive structural design and describes development trends in computer-aided design.

  5. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    NASA Astrophysics Data System (ADS)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
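    The core measurement behind such source maps, cross-correlating two stations' noise records and taking a logarithmic ratio of the energies in the two branches of the correlation, can be sketched as follows; the synthetic records and window lengths are assumptions, and the simplified processing omits the sensitivity kernels used in the actual imaging.

```python
import numpy as np

# Frequency-domain cross-correlation of two noise records, followed by the
# logarithmic ratio of causal/acausal energies, whose sign reflects the
# dominant direction of noise propagation between the stations.
rng = np.random.default_rng(1)
npts = 20000
src = rng.standard_normal(npts)
station_a = src + 0.3 * rng.standard_normal(npts)
station_b = np.roll(src, 50) + 0.3 * rng.standard_normal(npts)   # time-shifted common signal

n = 2 * npts
cc = np.fft.irfft(np.fft.rfft(station_a, n) * np.conj(np.fft.rfft(station_b, n)), n)
cc = np.concatenate((cc[-npts:], cc[:npts]))                      # lags -npts .. npts-1

lags = np.arange(-npts, npts)
window = 200                                                      # samples around the expected lag
causal = cc[(lags > 0) & (lags <= window)]
acausal = cc[(lags < 0) & (lags >= -window)]
log_ratio = np.log(np.sum(causal**2) / np.sum(acausal**2))
print("logarithmic energy ratio:", round(float(log_ratio), 2))
```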

  6. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
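    The kind of uniform query layer described above can be sketched schematically as below; this is not the actual UCVM API, and the model names, coverage box and material values are placeholders.

```python
# Conceptual sketch of a uniform velocity-model query layer: each registered
# model answers the same query, and a 1D background model fills in points the
# regional 3D model does not cover.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Material:
    vp: float       # m/s
    vs: float       # m/s
    density: float  # kg/m^3

class Background1D:
    """Trivially layered background model (depths and values are assumptions)."""
    def query(self, lon, lat, depth_m) -> Material:
        if depth_m < 1000.0:
            return Material(3000.0, 1500.0, 2200.0)
        return Material(5500.0, 3200.0, 2700.0)

class Regional3D:
    """Stands in for a regional 3D model with a limited coverage box."""
    def query(self, lon, lat, depth_m) -> Optional[Material]:
        if -119.0 <= lon <= -116.0 and 33.0 <= lat <= 35.0 and depth_m <= 50000.0:
            return Material(4800.0, 2700.0, 2500.0)
        return None            # outside model coverage

class UnifiedQuery:
    def __init__(self, models, background):
        self.models, self.background = models, background
    def query(self, lon, lat, depth_m) -> Material:
        for model in self.models:              # first model that covers the point wins
            result = model.query(lon, lat, depth_m)
            if result is not None:
                return result
        return self.background.query(lon, lat, depth_m)

ucvm_like = UnifiedQuery([Regional3D()], Background1D())
print(ucvm_like.query(-118.2, 34.0, 2000.0))   # inside the regional model
print(ucvm_like.query(-100.0, 40.0, 2000.0))   # falls back to the background
```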

  7. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples for such applications are risk mitigation, disaster management, post disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain with high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides practical facility for better capture of spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples for such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. The proposed methodology makes the best use of the recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.

  8. Geologic and seismic investigation for southeast expressway, stations 600-603 in Quincy, Mass.

    USGS Publications Warehouse

    May, James E.

    1954-01-01

    At this site the southbound lane of the proposed highway will be located approximately 75 feet to the left (south) of the base line. This will place it close to the base of a mound of granite quarry waste with very steep slopes. As a cut of considerable depth will be required for the road, the mound of waste with its unstable slope constitutes a very hazardous condition, especially with respect to the possibility of rock-falls and slides. Seismic work was performed at the site with two aims in view: first, to obtain information on depths to bedrock that would aid in estimating the quantities of material to be removed from the proposed cut; second, to obtain data that might aid in estimating the quantity of material in the mound of quarry waste with the object of obtaining estimates for its removal. Traverses A-13 and C-D were made for this latter purpose. Additional traverses would have been of value, but they were not made because of the possibility of starting rock-falls or slides, a situation that would have exposed personnel to unwarranted danger, and equipment to avoidable risk. Mr. M. E. Chandler and Mr. W. L. Carney, Massachusetts Department of Public Works engineers, performed pertinent survey work required for this project, and prepared the essential plans and profiles. Mr. Chandler also operated the seismic equipment and assisted in the preparation of the seismic velocity data. The work was performed in June 1953 as part of a cooperative program of the Massachusetts Department of Public Works and the United States Geological Survey.

  9. Source Inversion of Seismic Events Associated with the Sinkhole at Napoleonville Salt Dome, Louisiana using a 3D Velocity Model

    NASA Astrophysics Data System (ADS)

    Nayak, Avinash; Dreger, Douglas S.

    2018-05-01

    The formation of a large sinkhole at the Napoleonville salt dome (NSD), Assumption Parish, Louisiana, caused by the collapse of a brine cavern, was accompanied by an intense and complex sequence of seismic events. We implement a grid-search approach to compute centroid locations and point-source moment tensor (MT) solutions of these seismic events using ˜0.1-0.3 Hz displacement waveforms and synthetic Green's functions computed using a 3D velocity model of the western edge of the NSD. The 3D model incorporates the currently known approximate geometry of the salt dome and the overlying anhydrite-gypsum cap rock, and features a large velocity contrast between the high velocity salt dome and low velocity sediments overlying and surrounding it. For each possible location on the source grid, Green's functions (GFs) to each station were computed using source-receiver reciprocity and the finite-difference seismic wave propagation software SW4. We also establish an empirical method to rigorously assess uncertainties in the centroid location, MW and source type of these events under evolving network geometry, using the results of synthetic tests with hypothetical events and real seismic noise. We apply the methods to the entire duration of data (˜6 months) recorded by the temporary US Geological Survey network. During an energetic phase of the sequence from 24-31 July 2012 when 4 stations were operational, the events with the best waveform fits are primarily located at the western edge of the salt dome at most probable depths of ˜0.3-0.85 km, close to the horizontal positions of the cavern and the future sinkhole. The data are fit nearly equally well by opening crack MTs in the high velocity salt medium or by isotropic volume-increase MTs in the low velocity sediment layers. We find that data recorded by 6 stations during 1-2 August 2012, right before the appearance of the sinkhole, indicate that some events are likely located in the lower velocity media just outside the salt dome at slightly shallower depth ˜0.35-0.65 km, with preferred isotropic volume-increase MT solutions. We find that GFs computed using the 3D velocity model generally result in better fits to the data than GFs computed using 1D velocity models, especially for the smaller amplitude tangential and vertical components, and result in better resolution of event locations. The dominant seismicity during 24-30 July 2012 is characterized by the steady occurrence of seismic events with similar locations and MT solutions at a near-characteristic inter-event time. The steady activity is sometimes interrupted by tremor-like sequences of multiple events in rapid succession, followed by quiet periods of little or no seismic activity, in turn followed by the resumption of seismicity with a reduced seismic moment-release rate. The dominant volume-increase MT solutions and the steady features of the seismicity indicate a crack-valve-type source mechanism possibly driven by pressurized natural gas.
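    The grid-search strategy, fitting low-frequency waveforms at each candidate node by a linear combination of precomputed Green's functions and keeping the solution with the highest variance reduction, can be sketched as follows; the Green's functions and data below are random placeholders rather than output of the 3D SW4 simulations.

```python
import numpy as np

# Schematic grid-search moment-tensor inversion with synthetic placeholders.
rng = np.random.default_rng(5)
nchan, npts, nmt = 9, 400, 6          # 3 stations x 3 components, 6 MT elements

def greens_for_location(loc_index):
    """Placeholder for reciprocity-based Green's functions at one grid node."""
    return rng.standard_normal((nchan, nmt, npts))

true_loc, true_mt = 3, np.array([1.0, 1.0, 1.0, 0.1, 0.0, 0.2])   # volumetric-like
G_true = greens_for_location(true_loc)
data = np.einsum("ckt,k->ct", G_true, true_mt) + 0.05 * rng.standard_normal((nchan, npts))

best = (-np.inf, None, None)
for loc in range(8):                                   # candidate grid nodes
    G = G_true if loc == true_loc else greens_for_location(loc)
    A = G.transpose(0, 2, 1).reshape(-1, nmt)          # (nchan*npts, nmt)
    d = data.reshape(-1)
    mt, *_ = np.linalg.lstsq(A, d, rcond=None)         # best-fitting MT at this node
    resid = d - A @ mt
    vr = 1.0 - np.dot(resid, resid) / np.dot(d, d)     # variance reduction
    if vr > best[0]:
        best = (vr, loc, mt)

print(f"best node {best[1]}, variance reduction {best[0]:.2f}")
```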

  10. ASDF - A Modern Data Format for Seismology

    NASA Astrophysics Data System (ADS)

    Krischer, Lion; Smith, James; Lei, Wenjie; Lefebvre, Matthieu; Ruan, Youyi; Sales de Andrade, Elliot; Podhorszki, Norbert; Bozdag, Ebru; Tromp, Jeroen

    2017-04-01

    Seismology as a science is driven by observing and understanding data, and it is thus vital to make this as easy and accessible as possible. The growing volume of freely available data coupled with ever expanding computational power enables scientists to take on new and bigger problems. This evolution is in part hindered because existing data formats have not been designed with it in mind. We present ASDF (http://seismic-data.org), the Adaptable Seismic Data Format, a novel, modern, and especially practical data format for all branches of seismology, with particular focus on how it is incorporated into seismic full waveform inversion workflows. The format aims to solve five key issues: (1) efficiency: fast I/O operations, particularly in high-performance computing environments, in part by limiting the total number of files; (2) data organization: different types of data are needed for a variety of tasks, which results in ad hoc data organization and formats that are hard to maintain, integrate, reproduce, and exchange; (3) data exchange: we want to exchange complex and complete data sets; (4) reproducibility: often simply missing, but crucial to advancing our science; and (5) mining, visualization, and understanding of data: as data volumes grow and become more complex, new techniques to query and visualize large datasets are needed. ASDF tackles these by defining a structure on top of HDF5, reusing as many existing standards (QuakeML, StationXML, PROV) as possible. An essential trait of ASDF is that it empowers the construction of completely self-describing data sets including waveform, station, and event data together with non-waveform data and a provenance description of everything. This, for example, enables for the first time the proper archival and exchange of processed or synthetic waveforms. To aid community adoption we developed mature tools in Python as well as in C and Fortran. Additionally, we provide a formal definition of the format, a validation tool, and integration into widely used tools like ObsPy (http://obspy.org), SPECFEM3D_GLOBE (https://geodynamics.org/cig/software/specfem3d_globe/), and Salvus (http://salvus.io).
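    The notion of a self-describing container can be illustrated with a small HDF5 example; the layout below only mimics the spirit of ASDF and is not the actual ASDF structure, for which the pyasdf library and the formal format definition should be used.

```python
import numpy as np
import h5py

# Conceptual illustration of a self-describing container that keeps waveforms,
# station metadata, an event description, and provenance together in one file.
with h5py.File("toy_container.h5", "w") as f:
    f.attrs["format_name"] = "toy-seismic-container"

    wf = f.create_group("Waveforms/XX.STA1")
    trace = wf.create_dataset("BHZ", data=np.random.randn(3600).astype("float32"))
    trace.attrs["sampling_rate_hz"] = 20.0
    trace.attrs["starttime"] = "2017-01-01T00:00:00"

    f.create_dataset("StationXML/XX.STA1",
                     data=np.bytes_("<FDSNStationXML>...</FDSNStationXML>"))
    f.create_dataset("QuakeML/event_1",
                     data=np.bytes_("<quakeml>...</quakeml>"))
    f.create_dataset("Provenance/processing",
                     data=np.bytes_("bandpass 0.01-0.1 Hz; instrument removed"))

# Everything needed to interpret the waveform travels in the same file.
with h5py.File("toy_container.h5", "r") as f:
    print(list(f["Waveforms/XX.STA1"].keys()),
          f["Waveforms/XX.STA1/BHZ"].attrs["sampling_rate_hz"])
```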

  11. Instrument-Aided Assessment of the Effect of Natural and Technogenic Factors on the Geomechanical State of a Massif Enclosing an HPP Turbine Room

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramov, N. N., E-mail: Abramov@goi.kolasc.net.ru; Epimakhov, Yu. A.

    2016-05-15

    A package of geophysical criteria has been developed using seismic spatiotemporal tomography (SST) of a rock massif to perform an instrument-aided assessment of the effect of natural and technogenic factors on the geomechanical state of a rock massif enclosing an underground turbine room at an HPP. Results of a detailed assessment are presented for the underground turbine room at the Verkhnyaya Tuloma HPP on the Kola Peninsula.

  12. Screening guide for rapid assessment of liquefaction hazard at highway bridge sites

    DOT National Transportation Integrated Search

    1998-06-16

    As an aid to seismic hazard assessment, this report provides a "screening guide" for systematic evaluation of liquefaction hazard at bridge sites and a guide for prioritizing sites for further investigation or mitigation. The guide presents a systemat...

  13. Seismic wavefield propagation in 2D anisotropic media: Ray theory versus wave-equation simulation

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; Hu, Guang-yi; Zhang, Yan-teng; Li, Zhong-sheng

    2014-05-01

    Although ray theory is based on the high-frequency approximation of the elastic wave equation, ray-theoretical and wave-equation simulation methods should serve as mutual checks and hence be developed jointly; in practice, however, they have progressed largely in parallel and independently. For this reason, in this paper we take an alternative route and mutually verify and test the computational accuracy and solution correctness of both the ray theory (the multistage irregular shortest-path method) and the wave-equation simulation methods (both the staggered finite-difference method and the pseudo-spectral method) in anisotropic VTI and TTI media. Through the analysis and comparison of wavefield snapshots, common-source gather profiles, and synthetic seismograms, we are able not only to verify the accuracy and correctness of each of the methods, at least for kinematic features, but also to thoroughly understand the kinematic and dynamic features of wave propagation in anisotropic media. The results show that both the staggered finite-difference method and the pseudo-spectral method are able to yield the same results even for complex anisotropic media (such as a fault model), and that the multistage irregular shortest-path method predicts kinematic features similar to those of the wave-equation simulation methods, so the two approaches can be used to mutually test each other for methodological accuracy and solution correctness. In addition, with the aid of the ray-tracing results, it is easy to identify the multiple phases (or multiples) in the wavefield snapshots, common-source-point gather seismic sections, and synthetic seismograms predicted by the wave-equation simulation method, which is a key issue for later seismic applications.
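    For readers unfamiliar with the wave-equation side of this comparison, the following simplified sketch propagates a 2D isotropic acoustic wavefield with a second-order finite-difference scheme. It is only a conceptual stand-in: the study itself uses anisotropic (VTI/TTI) elastic staggered-grid and pseudo-spectral solvers, and all parameters below are arbitrary.

```python
# Simplified 2D isotropic acoustic finite-difference propagation (periodic edges,
# no absorbing boundaries); purely illustrative of the wave-equation approach.
import numpy as np

nx = nz = 201
dx = 10.0                       # grid spacing (m)
c = np.full((nz, nx), 2000.0)   # homogeneous velocity model (m/s)
dt = 0.4 * dx / c.max()         # well within the 2D CFL stability limit
nt = 250
src_z, src_x = nz // 2, nx // 2

def ricker(t, f0=15.0, t0=0.08):
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

u_prev = np.zeros((nz, nx))
u = np.zeros((nz, nx))
for it in range(nt):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx**2
    u_next = 2.0 * u - u_prev + (c * dt) ** 2 * lap
    u_next[src_z, src_x] += ricker(it * dt) * dt**2   # point-source injection
    u_prev, u = u, u_next

print(u.shape, float(np.abs(u).max()))   # 'u' holds the final wavefield snapshot
```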

  14. A Serviced-based Approach to Connect Seismological Infrastructures: Current Efforts at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Ahern, Tim; Trabant, Chad

    2014-05-01

    As part of the COOPEUS initiative to build infrastructure that connects European and US research infrastructures, IRIS has advocated for the development of Federated services based upon internationally recognized standards using web services. By deploying International Federation of Digital Seismograph Networks (FDSN) endorsed web services at multiple data centers in the US and Europe, we have shown that integration within the seismological domain can be realized. By deploying identical methods to invoke the web services at multiple centers, this approach can significantly ease the methods through which a scientist can access seismic data (time series, metadata, and earthquake catalogs) from distributed federated centers. IRIS has developed a federator that helps a user identify where seismic data from global seismic networks can be accessed. The web services based federator can build the appropriate URLs and return them to client software running on the scientist's own computer. These URLs are then used to directly pull data from the distributed center in a very peer-based fashion. IRIS is also involved in deploying web services across horizontal domains. As part of the US National Science Foundation's (NSF) EarthCube effort, an IRIS-led EarthCube Building Blocks project is underway. When completed, this project will aid in the discovery, access, and usability of data across multiple geoscience domains. This presentation will summarize current IRIS efforts in building vertical integration infrastructure within seismology, working closely with 5 centers in Europe and 2 centers in the US, as well as how we are taking first steps toward horizontal integration of data from 14 different domains in the US, in Europe, and around the world.
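    A brief sketch of how such federated FDSN web services are consumed in practice is given below, using ObsPy's FDSN clients; the station, channel, and time window are arbitrary examples, and the routing client requires a reasonably recent ObsPy release.

```python
# Sketch of consuming FDSN-standard web services with ObsPy; station, channel and
# time window are arbitrary, and RoutingClient needs a reasonably recent ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client, RoutingClient

t0 = UTCDateTime("2014-01-01T00:00:00")

# Direct request to a single FDSN data centre.
iris = Client("IRIS")
st = iris.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 600)
inv = iris.get_stations(network="IU", station="ANMO", level="channel")

# Federated request: the IRIS federator resolves which centre holds the data.
fed = RoutingClient("iris-federator")
st_fed = fed.get_waveforms(network="IU", station="ANMO", channel="BHZ",
                           starttime=t0, endtime=t0 + 600)
print(st, inv, st_fed)
```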

  15. Code for Calculating Regional Seismic Travel Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BALLARD, SANFORD; HIPP, JAMES; & BARKER, GLENN

    The RSTT software computes predictions of the travel time of seismic energy traveling from a source to a receiver through 2.5D models of the seismic velocity distribution within the Earth. The two primary applications for the RSTT library are tomographic inversion studies and seismic event location calculations. In tomographic inversion studies, a seismologist begins with a number of source-receiver travel time observations and an initial starting model of the velocity distribution within the Earth. A forward travel time calculator, such as the RSTT library, is used to compute predictions of each observed travel time, and all of the residuals (observed minus predicted travel time) are calculated. The Earth model is then modified in some systematic way with the goal of minimizing the residuals. The Earth model obtained in this way is assumed to be a better model than the starting model if it has lower residuals. The other major application for the RSTT library is seismic event location. Given an Earth model, an initial estimate of the location of a seismic event, and some number of observations of seismic travel time thought to have originated from that event, location codes systematically modify the estimate of the location of the event with the goal of minimizing the difference between the observed and predicted travel times. The second application, seismic event location, is routinely implemented by the military as part of its effort to monitor the Earth for nuclear tests conducted by foreign countries.
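    Both applications described above reduce to comparing observed with predicted travel times. The sketch below shows that bookkeeping with a toy constant-velocity predictor standing in for a forward travel-time calculator; it is not the RSTT API, and all names and values are hypothetical.

```python
# Generic residual/location sketch; predict_tt is a toy constant-velocity predictor
# standing in for a forward travel-time calculator (it is not the RSTT API).
import numpy as np

def predict_tt(src, rcv, v=6.0):
    """Straight-ray travel time at constant velocity v (km/s); coordinates in km."""
    return np.linalg.norm(np.asarray(src, float) - np.asarray(rcv, float)) / v

def locate(obs_times, receivers, grid):
    """Return (rms residual, node, origin time) for the best-fitting grid node."""
    best = None
    for node in grid:
        pred = np.array([predict_tt(node, r) for r in receivers])
        t0 = np.mean(obs_times - pred)                    # best origin time for node
        rms = np.sqrt(np.mean((obs_times - (pred + t0)) ** 2))
        if best is None or rms < best[0]:
            best = (rms, node, t0)
    return best

receivers = [(0, 0, 0), (100, 0, 0), (0, 120, 0), (80, 90, 0)]
true_src, true_t0 = (40.0, 30.0, 10.0), 5.0
obs = np.array([true_t0 + predict_tt(true_src, r) for r in receivers])
grid = [(x, y, z) for x in range(0, 101, 10) for y in range(0, 101, 10) for z in (0, 10, 20)]
print(locate(obs, receivers, grid))   # should recover the true node and origin time
```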

  16. Seismic site-response characterization of high-velocity sites using advanced geophysical techniques: application to the NAGRA-Net

    NASA Astrophysics Data System (ADS)

    Poggi, V.; Burjanek, J.; Michel, C.; Fäh, D.

    2017-08-01

    The Swiss Seismological Service (SED) has recently finalised the installation of ten new seismological broadband stations in northern Switzerland. The project was led in cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra) and Swissnuclear to monitor microseismicity at potential locations of nuclear-waste repositories. To further improve the quality and usability of the seismic recordings, an extensive characterization of the sites surrounding the installation area was performed following a standardised investigation protocol. State-of-the-art geophysical techniques have been used, including advanced active and passive seismic methods. The results of all analyses converged to the definition of a set of best-representative 1-D velocity profiles for each site, which are the input for the computation of engineering soil proxies (travel-time averaged velocity and quarter-wavelength parameters) and numerical amplification models. Computed site response is then validated through comparison with empirical site amplification, which is currently available for any station connected to the Swiss seismic networks. With the goal of a high-sensitivity network, most of the NAGRA stations have been installed on stiff-soil sites of rather high seismic velocity. Seismic characterization of such sites has always been considered challenging, due to the lack of a relevant velocity contrast and the large wavelengths required to investigate the frequency range of engineering interest. We describe how ambient vibration techniques can successfully be applied in these particular conditions, providing practical recommendations for best practice in seismic site characterization of high-velocity sites.
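    One of the engineering soil proxies mentioned above, the travel-time averaged shear-wave velocity down to a depth z (e.g. Vs30), follows directly from a 1-D layered profile, as in the sketch below; the example profile is made up and does not correspond to any NAGRA site.

```python
# Travel-time averaged shear-wave velocity over the top z metres of a 1-D profile
# (Vs30 for z = 30 m); the layered profile below is an arbitrary example.
def travel_time_averaged_vs(thicknesses_m, vs_m_s, z=30.0):
    remaining, tt = z, 0.0
    for h, v in zip(thicknesses_m, vs_m_s):
        dz = min(h, remaining)
        tt += dz / v                     # vertical travel time through this layer
        remaining -= dz
        if remaining <= 0.0:
            break
    if remaining > 0.0:                  # profile shallower than z: extend last layer
        tt += remaining / vs_m_s[-1]
    return z / tt

print(round(travel_time_averaged_vs([5, 10, 40], [400, 900, 1800]), 1))  # m/s
```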

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parra, J.; Collier, H.; Angstman, B.

    In low porosity, low permeability zones, natural fractures are the primary source of permeability which affect both production and injection of fluids. The open fractures do not contribute much to porosity, but they provide an increased drainage network to any porosity. An important approach to characterizing the fracture orientation and fracture permeability of reservoir formations is one based upon the effects of such conditions on the propagation of acoustic and seismic waves in the rock. We present the feasibility of using seismic measurement techniques to map the fracture zones between wells spaced 2400 ft at depths of about 1000 ft. For this purpose we constructed computer models (which include azimuthal anisotropy) using Lodgepole reservoir parameters to predict seismic signatures recorded at the borehole scale, crosswell scale, and 3D seismic scale. We have integrated well logs with existing 2D surface seismic to produce petrophysical and geological cross sections to determine the reservoir parameters and geometry for the computer models. In particular, the model responses are used to evaluate if surface seismic and crosswell seismic measurements can capture the anisotropy due to vertical fractures. Preliminary results suggested that seismic waves transmitted between two wells will propagate in carbonate fracture reservoirs, and the signal can be received above the noise level at a distance of 2400 ft. In addition, the large velocity contrast between the main fracture zone and the underlying unfractured Boundary Ridge Member suggested that borehole reflection imaging may be appropriate to map fracture zone thickness variation and fracture distributions in the reservoir.

  18. Enhancing an appointment diary on a pocket computer for use by people after brain injury.

    PubMed

    Wright, P; Rogers, N; Hall, C; Wilson, B; Evans, J; Emslie, H

    2001-12-01

    People with memory loss resulting from brain injury benefit from purpose-designed memory aids such as appointment diaries on pocket computers. The present study explores the effects of extending the range of memory aids and including games. For 2 months, 12 people who had sustained brain injury were loaned a pocket computer containing three purpose-designed memory aids: diary, notebook and to-do list. A month later they were given another computer with the same memory aids but a different method of text entry (physical keyboard or touch-screen keyboard). Machine order was counterbalanced across participants. Assessment was by interviews during the loan periods, rating scales, performance tests and computer log files. All participants could use the memory aids and ten people (83%) found them very useful. Correlations among the three memory aids were not significant, suggesting individual variation in how they were used. Games did not increase use of the memory aids, nor did loan of the preferred pocket computer (with physical keyboard). Significantly more diary entries were made by people who had previously used other memory aids, suggesting that a better understanding of how to use a range of memory aids could benefit some people with brain injury.

  19. The Computation of Global Viscoelastic Co- and Post-seismic Displacement in a Realistic Earth Model by Straightforward Numerical Inverse Laplace Integration

    NASA Astrophysics Data System (ADS)

    Tang, H.; Sun, W.

    2016-12-01

    The theoretical computation of dislocation theory in a given earth model is necessary to explain observations of the co- and post-seismic deformation of earthquakes. For this purpose, computation theories based on layered or pure half space [Okada, 1985; Okubo, 1992; Wang et al., 2006] and on spherically symmetric earth [Piersanti et al., 1995; Pollitz, 1997; Sabadini & Vermeersen, 1997; Wang, 1999] have been proposed. It is indicated that the compressibility, curvature and the continuous variation of the radial structure of Earth should be simultaneously taken into account for modern high precision displacement-based observations like GPS. Therefore, Tanaka et al. [2006; 2007] computed global displacement and gravity variation by combining the reciprocity theorem (RPT) [Okubo, 1993] and numerical inverse Laplace integration (NIL) instead of the normal mode method [Peltier, 1974]. Without using RPT, we follow the straightforward numerical integration of co-seismic deformation given by Sun et al. [1996] to present a straightforward numerical inverse Laplace integration method (SNIL). This method is used to compute the co- and post-seismic displacement of point dislocations buried in a spherically symmetric, self-gravitating viscoelastic and multilayered earth model and is easily extended to applications involving the geoid and gravity. Compared with pre-existing methods, this method is more straightforward and time-saving, mainly because we sum associated Legendre polynomials and dislocation Love numbers before using the Riemann-Mellin formula to implement SNIL.

  20. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, R.L.; Gross, D.; Pearson, D.C.

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international, worldwide, Comprehensive Test Ban Treaty (CTBT, no nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments to produce results with mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics which were covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in radiation of seismic energy from overburden casting shots; (4) identification of as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements leading to determination of the relationship of local and regional seismic amplitude to explosive yield for overburden cast, coal bulking and single fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  1. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  2. Computer Instructional Aids for Undergraduate Control Education.

    ERIC Educational Resources Information Center

    Volz, Richard A.; And Others

    Engineering is coming to rely more and more heavily upon the computer for computations, analyses, and graphic displays which aid the design process. A general purpose simulation system, the Time-shared Automatic Control Laboratory (TACL), and a set of computer-aided design programs, Control Oriented Interactive Graphic Analysis and Design…

  3. Development of a Web Based Simulating System for Earthquake Modeling on the Grid

    NASA Astrophysics Data System (ADS)

    Seber, D.; Youn, C.; Kaiser, T.

    2007-12-01

    Existing cyberinfrastructure-based information, data and computational networks now allow development of state-of-the-art, user-friendly simulation environments that democratize access to high-end computational environments and provide new research opportunities for many research and educational communities. Within the Geosciences cyberinfrastructure network, GEON, we have developed the SYNSEIS (SYNthetic SEISmogram) toolkit to enable efficient computations of 2D and 3D seismic waveforms for a variety of research purposes, especially for helping to analyze the EarthScope USArray seismic data in a speedy and efficient environment. The underlying simulation software in SYNSEIS is a finite difference code, E3D, developed by LLNL (S. Larsen). The code is embedded within the SYNSEIS portlet environment and it is used by our toolkit to simulate seismic waveforms of earthquakes at regional distances (<1000 km). Architecturally, SYNSEIS uses both Web Service and Grid computing resources in a portal-based work environment and has a built-in access mechanism to connect to national supercomputer centers as well as to a dedicated, small-scale compute cluster for its runs. Even though Grid computing is well-established in many computing communities, its use among domain scientists is still not trivial because of the multiple levels of complexity encountered. We grid-enabled E3D using our own XML input dialect, which includes geological models that are accessible through standard Web services within the GEON network. The XML inputs for this application contain structural geometries, source parameters, seismic velocity, density, attenuation values, number of time steps to compute, and number of stations. By enabling portal-based access to such a computational environment, coupled with a dynamic user interface, we enable a large user community to take advantage of such high-end calculations in their research and educational activities. Our system can be used to promote an efficient and effective modeling environment to help scientists as well as educators in their daily activities and speed up the scientific discovery process.

  4. Evaluation of Microcomputer-Based Operation and Maintenance Management Systems for Army Water/Wastewater Treatment Plant Operation.

    DTIC Science & Technology

    1986-07-01

    [Report front-matter excerpt; only section and figure titles are recoverable.] Contents include: Computer-Aided Operation Management System; Functions of an Off-Line Computer-Aided Operation Management System; Applications of ...; System Comparisons. Figures include: Hardware Components; Basic Functions of a Computer-Aided Operation Management System; Plant Visits; Computer-Aided Operation Management Systems Reviewed for Analysis of Basic Functions; Progress of Software System Installation and ...

  5. Seismic performance for vertical geometric irregularity frame structures

    NASA Astrophysics Data System (ADS)

    Ismail, R.; Mahmud, N. A.; Ishak, I. S.

    2018-04-01

    This research presents results for frame structures with vertical geometric irregularity. The finite element analysis software LUSAS was used to assess seismic performance, focusing in particular on a type of irregular frame with differences in floor heights continued at the middle of the building. Malaysia’s building structures have been affected when earthquakes took place in neighbouring countries such as Indonesia (Sumatera Island). In Malaysia, concrete is widely used in building construction, although it has limited tension resistance. Structural behaviour under horizontal and vertical static loads is commonly analysed using plane frame analysis. The aim of this case study is to determine the stress and displacement in the seismic response of this type of irregular frame structure. The study is based on the seven-storey building of the Clinical Training Centre located in Sungai Buloh, Selayang, Selangor. Data recorded from the large earthquake that occurred in Aceh, Indonesia, on December 26, 2004, were used in conducting this research. The stress and displacement results obtained using IMPlus seismic analysis in the LUSAS Modeller software, for the seismic response of a formwork frame system, indicate that the building can safely withstand the ground motion and remains in good condition under the variation of seismic performance considered.

  6. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  7. Reducing disk storage of full-3D seismic waveform tomography (F3DT) through lossy online compression

    NASA Astrophysics Data System (ADS)

    Lindstrom, Peter; Chen, Po; Lee, En-Jui

    2016-08-01

    Full-3D seismic waveform tomography (F3DT) is the latest seismic tomography technique that can assimilate broadband, multi-component seismic waveform observations into high-resolution 3D subsurface seismic structure models. The main drawback in the current F3DT implementation, in particular the scattering-integral implementation (F3DT-SI), is the high disk storage cost and the associated I/O overhead of archiving the 4D space-time wavefields of the receiver- or source-side strain tensors. The strain tensor fields are needed for computing the data sensitivity kernels, which are used for constructing the Jacobian matrix in the Gauss-Newton optimization algorithm. In this study, we have successfully integrated a lossy compression algorithm into our F3DT-SI workflow to significantly reduce the disk space for storing the strain tensor fields. The compressor supports a user-specified tolerance for bounding the error, and can be integrated into our finite-difference wave-propagation simulation code used for computing the strain fields. The decompressor can be integrated into the kernel calculation code that reads the strain fields from the disk and computes the data sensitivity kernels. During the wave-propagation simulations, we compress the strain fields before writing them to the disk. To compute the data sensitivity kernels, we read the compressed strain fields from the disk and decompress them before using them in kernel calculations. Experiments using a realistic dataset in our California statewide F3DT project have shown that we can reduce the strain-field disk storage by at least an order of magnitude with acceptable loss, and also improve the overall I/O performance of the entire F3DT-SI workflow significantly. The integration of the lossy online compressor may potentially open up the possibilities of the wide adoption of F3DT-SI in routine seismic tomography practices in the near future.
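    The sketch below illustrates the general idea of tolerance-bounded lossy compression with a simple quantize-then-deflate scheme; it is a conceptual stand-in only and is not the compressor integrated into the authors' F3DT-SI workflow.

```python
# Conceptual tolerance-bounded lossy compression (quantize, then lossless DEFLATE);
# not the compressor used in the study, just an illustration of the error bound.
import zlib
import numpy as np

def compress(field, tol):
    q = np.round(field / (2.0 * tol)).astype(np.int32)   # |field - decoded| <= tol
    return zlib.compress(q.tobytes(), level=6), field.shape

def decompress(blob, shape, tol):
    q = np.frombuffer(zlib.decompress(blob), dtype=np.int32).reshape(shape)
    return q * (2.0 * tol)

strain = 1e-6 * np.random.default_rng(1).standard_normal((64, 64, 64))
tol = 1e-9
blob, shape = compress(strain, tol)
rec = decompress(blob, shape, tol)
print(float(np.abs(rec - strain).max()), strain.nbytes / len(blob))  # error, ratio
```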

  8. Reducing Disk Storage of Full-3D Seismic Waveform Tomography (F3DT) Through Lossy Online Compression

    DOE PAGES

    Lindstrom, Peter; Chen, Po; Lee, En-Jui

    2016-05-05

    Full-3D seismic waveform tomography (F3DT) is the latest seismic tomography technique that can assimilate broadband, multi-component seismic waveform observations into high-resolution 3D subsurface seismic structure models. The main drawback in the current F3DT implementation, in particular the scattering-integral implementation (F3DT-SI), is the high disk storage cost and the associated I/O overhead of archiving the 4D space-time wavefields of the receiver- or source-side strain tensors. The strain tensor fields are needed for computing the data sensitivity kernels, which are used for constructing the Jacobian matrix in the Gauss-Newton optimization algorithm. In this study, we have successfully integrated a lossy compression algorithm into our F3DT-SI workflow to significantly reduce the disk space for storing the strain tensor fields. The compressor supports a user-specified tolerance for bounding the error, and can be integrated into our finite-difference wave-propagation simulation code used for computing the strain fields. The decompressor can be integrated into the kernel calculation code that reads the strain fields from the disk and computes the data sensitivity kernels. During the wave-propagation simulations, we compress the strain fields before writing them to the disk. To compute the data sensitivity kernels, we read the compressed strain fields from the disk and decompress them before using them in kernel calculations. Experiments using a realistic dataset in our California statewide F3DT project have shown that we can reduce the strain-field disk storage by at least an order of magnitude with acceptable loss, and also improve the overall I/O performance of the entire F3DT-SI workflow significantly. The integration of the lossy online compressor may potentially open up the possibilities of the wide adoption of F3DT-SI in routine seismic tomography practices in the near future.

  9. The Feasibility of Classifying Breast Masses Using a Computer-Assisted Diagnosis (CAD) System Based on Ultrasound Elastography and BI-RADS Lexicon.

    PubMed

    Fleury, Eduardo F C; Gianini, Ana Claudia; Marcomini, Karem; Oliveira, Vilmar

    2018-01-01

    To determine the applicability of a computer-aided diagnosis (CAD) system based on strain elastography for the classification of breast masses diagnosed by ultrasound and scored using the criteria proposed by the Breast Imaging Reporting and Data System (BI-RADS) ultrasound lexicon, and to determine the diagnostic accuracy and interobserver variability. This prospective study was conducted between March 1, 2016, and May 30, 2016. A total of 83 breast masses subjected to percutaneous biopsy were included. Ultrasound elastography images obtained before biopsy were interpreted by 3 radiologists with and without the aid of the CAD system for strain elastography. The parameters evaluated for each radiologist were sensitivity, specificity, and diagnostic accuracy, with and without the CAD system. Interobserver variability was assessed using a weighted κ test and an intraclass correlation coefficient. The areas under the receiver operating characteristic (ROC) curves were also calculated. The areas under the ROC curve were 0.835, 0.801, and 0.765 for readers 1, 2, and 3, respectively, without the CAD system, and 0.900, 0.926, and 0.868, respectively, with it. The intraclass correlation coefficient between the 3 readers was 0.6713 without the CAD system and 0.811 with it. The proposed CAD system for strain elastography has the potential to improve the diagnostic performance of radiologists in breast examination using ultrasound associated with elastography.

  10. Multifractal Analysis of Seismically Induced Soft-Sediment Deformation Structures Imaged by X-Ray Computed Tomography

    NASA Astrophysics Data System (ADS)

    Nakashima, Yoshito; Komatsubara, Junko

    Unconsolidated soft sediments deform and mix complexly by seismically induced fluidization. Such geological soft-sediment deformation structures (SSDSs) recorded in boring cores were imaged by X-ray computed tomography (CT), which enables visualization of the inhomogeneous spatial distribution of iron-bearing mineral grains as strong X-ray absorbers in the deformed strata. Multifractal analysis was applied to the two-dimensional (2D) CT images with various degrees of deformation and mixing. The results show that the distribution of the iron-bearing mineral grains is multifractal for less deformed/mixed strata and almost monofractal for fully mixed (i.e. almost homogenized) strata. Computer simulations of deformation of real and synthetic digital images were performed using the egg-beater flow model. The simulations successfully reproduced the transformation from the multifractal spectra into almost monofractal spectra (i.e. almost convergence on a single point) with an increase in deformation/mixing intensity. The present study demonstrates that multifractal analysis coupled with X-ray CT and the mixing flow model is useful to quantify the complexity of seismically induced SSDSs, standing as a novel method for the evaluation of cores for seismic risk assessment.
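    The kind of multifractal measure used here can be sketched with a box-counting estimate of the generalized dimensions D_q of a 2D grey-level image: a nearly constant D_q across q indicates an (almost) monofractal distribution, while strong q-dependence indicates multifractality. The image below is random noise, used purely as a placeholder for a CT slice.

```python
# Box-counting estimate of generalized dimensions D_q for a 2D image treated as a
# measure; near-constant D_q suggests a monofractal, strong q-dependence multifractality.
import numpy as np

def generalized_dimensions(img, qs=(-2, 0, 2, 4), sizes=(2, 4, 8, 16, 32)):
    img = img / img.sum()                          # normalize grey levels to a measure
    n = img.shape[0]
    dims = {}
    for q in qs:
        if q == 1:
            continue                               # D_1 needs the entropy form; omitted
        logeps, logZ = [], []
        for s in sizes:
            trimmed = img[:n - n % s, :n - n % s]
            boxes = trimmed.reshape(n // s, s, n // s, s).sum(axis=(1, 3))
            p = boxes[boxes > 0]
            logZ.append(np.log(np.sum(p ** q)))    # partition function Z(q, eps)
            logeps.append(np.log(s / n))
        tau = np.polyfit(logeps, logZ, 1)[0]       # tau(q): slope of log Z vs log eps
        dims[q] = tau / (q - 1)
    return dims

rng = np.random.default_rng(0)
print(generalized_dimensions(rng.random((256, 256))))   # ~2.0 for all q (homogeneous)
```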

  11. Computer Aided Design: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Cheng, Wan-Lee

    This instructional manual contains 12 learning activity packets for use in a workshop in computer-aided design and drafting (CADD). The lessons cover the following topics: introduction to computer graphics and computer-aided design/drafting; coordinate systems; advance space graphics hardware configuration and basic features of the IBM PC…

  12. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  13. A seismic study of Yucca Mountain and vicinity, southern Nevada; data report and preliminary results

    USGS Publications Warehouse

    Hoffman, L.R.; Mooney, W.D.

    1983-01-01

    From 1980 to 1982, the U.S. Geological Survey conducted seismic refraction studies at the Nevada Test Site to aid in an investigation of the regional crustal structure at a possible nuclear waste repository site near Yucca Mountain. Two regionally distributed deployments and one north-south deployment recorded nuclear events. First arrival times from these deployments were plotted on a location map and contoured to determine traveltime delays. The results indicate delays as large as 0.5 s in the Yucca Mountain and Crater Flat areas relative to the Jackass Flats area. A fourth east-west deployment recorded a chemical explosion and was interpreted using a two-dimensional computer raytracing technique. Delays as high as 0.7 s were observed over Crater Flat and Yucca Mountain. The crustal model derived from this profile indicates that Paleozoic rocks, which outcrop to the east at Skull Mountain and the Calico Hills, and to the west at Bare Mountain, lie at a minimum depth of 3 km beneath part of Yucca Mountain. These results confirm earlier estimates based on the modeling of detailed gravity data. A mid-crustal boundary at 15 ± 2 km beneath Yucca Mountain is evidenced by a prominent reflection recorded beyond 43 km range at 1.5 s reduced time. Other mid-crustal boundaries have been identified at 24 and 30 km and the total crustal thickness is 35 km.

  14. Effect of strong elastic contrasts on the propagation of seismic wave in hard-rock environments

    NASA Astrophysics Data System (ADS)

    Saleh, R.; Zheng, L.; Liu, Q.; Milkereit, B.

    2013-12-01

    Understanding the propagation of seismic waves in the presence of strong elastic contrasts, such as topography, tunnels and ore bodies, is still a challenge. Safety in mining is a major concern and seismic monitoring is the main tool here. For engineering purposes, amplitudes (peak particle velocity/acceleration) and travel times of seismic events (mostly blasts or microseismic events) are critical parameters that have to be determined at various locations in a mine. These parameters are useful for preparing risk maps or for better understanding the process of spatial and temporal stress distributions in a mine. Simple constant-velocity models used for monitoring studies in mining cannot explain the observed complexities in scattered seismic waves. In hard-rock environments, modeling of the elastic seismic wavefield requires detailed 3D petrophysical, infrastructure and topographical data to simulate the propagation of seismic waves with frequencies up to a few kilohertz. With the development of efficient numerical techniques and parallel computation facilities, a solution to such a problem is achievable. In this study, the effects of strong elastic contrasts such as ore bodies, rough topography and tunnels are illustrated using 3D modeling methods. The main tools here are the finite difference code SOFI3D [1], which has been benchmarked for engineering studies, and the spectral element code SPECFEM [2], which was developed for global seismology problems. The modeling results show locally enhanced peak particle velocity due to the presence of strong elastic contrasts and topography in the models. [1] Bohlen, T., Parallel 3-D viscoelastic finite difference seismic modeling, Computers & Geosciences, 28 (2002), 887-899. [2] Komatitsch, D., and J. Tromp, Introduction to the spectral-element method for 3-D seismic wave propagation, Geophys. J. Int., 139, 806-822, 1999.

  15. Detailed Velocity and Density models of the Cascadia Subduction Zone from Prestack Full-Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Fortin, W.; Holbrook, W. S.; Mallick, S.; Everson, E. D.; Tobin, H. J.; Keranen, K. M.

    2014-12-01

    Understanding the geologic composition of the Cascadia Subduction Zone (CSZ) is critically important in assessing seismic hazards in the Pacific Northwest. Despite being a potential earthquake and tsunami threat to millions of people, key details of the structure and fault mechanisms remain poorly understood in the CSZ. In particular, the position and character of the subduction interface remain elusive due to its relative aseismicity and low seismic reflectivity, making imaging difficult for both passive and active source methods. Modern active-source reflection seismic data acquired as part of the COAST project in 2012 provide an opportunity to study the transition from the Cascadia basin, across the deformation front, and into the accretionary prism. Coupled with advances in seismic inversion methods, these new data allow us to produce detailed velocity models of the CSZ and accurate pre-stack depth migrations for studying geologic structure. Although still computationally expensive, seismic inversions can now be performed on current computing clusters at resolutions that match that of the seismic image itself. Here we present pre-stack full waveform inversions of the central seismic line of the COAST survey offshore Washington state. The resultant velocity model is produced by inversion at every CMP location, 6.25 m laterally, with vertical resolution of 0.2 times the dominant seismic frequency. We report a good average correlation value above 0.8 across the entire seismic line, determined by comparing synthetic gathers to the real pre-stack gathers. These detailed velocity models, both Vp and Vs, along with the density model, are a necessary step toward a detailed porosity cross section to be used to determine the role of fluids in the CSZ. Additionally, the P-velocity model is used to produce a pre-stack depth migration image of the CSZ.

  16. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    NASA Astrophysics Data System (ADS)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementations from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of being strongly typed and of easy control over memory resources, while R has the advantage of numerous available functions for statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
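    A much-simplified, purely temporal version of the ETAS log-likelihood can convey the estimation problem being solved; the study's own implementations (in Java and R, with an EM algorithm and spatial terms) are more sophisticated, and the catalog and optimizer below are placeholders.

```python
# Simplified temporal ETAS (no spatial component): conditional intensity
# lambda(t) = mu + sum K*exp(alpha*(M_i - m0)) * (t - t_i + c)^(-p) over t_i < t.
# The catalog is synthetic and the optimizer is generic (not the EM scheme cited).
import numpy as np
from scipy.optimize import minimize

def etas_neg_loglik(log_params, t, m, T, m0):
    mu, K, c, alpha, p = np.exp(log_params)        # log-parameters keep values positive
    lam = np.empty_like(t)
    for j in range(len(t)):
        dt = t[j] - t[:j]
        lam[j] = mu + np.sum(K * np.exp(alpha * (m[:j] - m0)) * (dt + c) ** (-p))
    integral = mu * T + np.sum(K * np.exp(alpha * (m - m0)) *
                               (c ** (1 - p) - (T - t + c) ** (1 - p)) / (p - 1))
    return -(np.sum(np.log(lam)) - integral)

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 300))         # event times in days (placeholder)
m = 3.0 + rng.exponential(1.0 / 2.3, 300)          # magnitudes above completeness m0 = 3
x0 = np.log([0.2, 0.05, 0.01, 1.0, 1.2])           # initial mu, K, c, alpha, p
fit = minimize(etas_neg_loglik, x0, args=(t, m, 1000.0, 3.0), method="Nelder-Mead")
print(np.exp(fit.x))                               # MLEs of (mu, K, c, alpha, p)
```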

  17. Digital seismic-reflection data from western Rhode Island Sound, 1980

    USGS Publications Warehouse

    McMullen, K.Y.; Poppe, L.J.; Soderberg, N.K.

    2009-01-01

    During 1980, the U.S. Geological Survey (USGS) conducted a seismic-reflection survey in western Rhode Island Sound aboard the Research Vessel Neecho. Data from this survey were recorded in analog form and archived at the USGS Woods Hole Science Center's Data Library. Due to recent interest in the geology of Rhode Island Sound and in an effort to make the data more readily accessible while preserving the original paper records, the seismic data from this cruise were scanned and converted to Tagged Image File Format (TIFF) images and SEG-Y data files. Navigation data were converted from U.S. Coast Guard Long Range Aids to Navigation (LORAN-C) time delays to latitudes and longitudes, which are available in Environmental Systems Research Institute, Inc. (ESRI) shapefile format and as eastings and northings in space-delimited text format.

  18. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes. Obtaining synthetic waveforms through numerical simulation receives an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data-processing skills. Training users to use the numerical packages and to correctly access and utilize the computational resources is a troublesome task. In addition, access to HPC is also a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC resources and a dedicated pipeline for them form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources over the cloud, the platform allows users to customize simulations at an expert level and to submit and run jobs through it.

  19. Crowd-Sourcing Seismic Data for Education and Research Opportunities with the Quake-Catcher Network

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; DeGroot, R. M.; Benthien, M. L.; Cochran, E. S.; Taber, J. J.

    2016-12-01

    The Quake Catcher Network (QCN; quakecatcher.net) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records, including the 2010 M8.8 Maule, Chile, the 2010 M7.1 Darfield, New Zealand, and 2015 M7.8 Gorkha, Nepal earthquakes. In 2016, the QCN in the United States transitioned to the Incorporated Research Institutions for Seismology (IRIS) Consortium and the Southern California Earthquake Center (SCEC), which are facilities funded through the National Science Foundation and the United States Geological Survey, respectively. The transition has allowed for an influx of new ideas and new education related efforts, which include focused installations in several school districts in southern California, on Native American reservations in North Dakota, and in the most seismically active state in the contiguous U.S. - Oklahoma. We present and describe these recent educational opportunities, and highlight how QCN has engaged a wide sector of the public in scientific data collection, particularly through the QCN-EPIcenter Network and NASA Mars InSight teacher programs. QCN provides the public with information and insight into how seismic data are collected, and how researchers use these data to better understand and characterize seismic activity. Lastly, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate felt earthquakes, and look towards the bright future of the network.

  20. Insights into crustal structure of the Eastern North American Margin from community multichannel seismic and potential field data

    NASA Astrophysics Data System (ADS)

    Davis, J. K.; Becel, A.; Shillington, D. J.; Buck, W. R.

    2017-12-01

    In the fall of 2014, the R/V Marcus Langseth collected gravity, magnetic, and reflection seismic data as part of the Eastern North American Margin Community Seismic Experiment. The dataset covers a 500 km wide section of the Mid-Atlantic passive margin offshore North Carolina, which formed after the Mesozoic breakup of the supercontinent Pangaea. Using these seismic and potential field data, we present observations and interpretations along two cross margin and one along-margin profiles. Analyses and interpretations are conducted using pre-stack depth migrated reflection seismic profiles in conjunction with forward modeling of shipboard gravity and magnetic anomalies. Preliminary interpretations of the data reveal variations in basement character and structure across the entire transition between continental and oceanic domains. These interpretations help provide insight into the origin and nature of the prominent East Coast and Blake Spur magnetic anomalies, as well as the Inner Magnetic Quiet Zone which occupies the domain between the anomalies. Collectively, these observations can aid in deciphering the rift-to-drift transition during the breakup of North America and West Africa and formation of the Central Atlantic.

  1. Signal Quality and the Reliability of Seismic Observations

    NASA Astrophysics Data System (ADS)

    Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.

    2009-12-01

    The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst’s familiarity with a seismogenic zone and with the seismic stations that record the energy. Quantification and qualification of an analyst’s ability to detect, time and measure seismic signals has not been calculated or fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; however, the ratio of a short-term average to a long-term average has been adopted as the signal-to-noise ratio (SNR). While the standard SNR is an easy and computationally inexpensive measure, its overall statistical significance has not been computed for seismic measurement analysis. The prospect of canonizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in canonizing phase measurements has been taken by the IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual for Seismological Observatory Practices (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement would not completely remove bias introduced by different techniques used by analysts to time seismic arrivals. The general guideline to time a seismic arrival is to record the time where a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beam forming. However, these enhancements can alter the characteristics of the arrival and how the arrival will be measured. Furthermore, each enhancement has user-specified parameters that can vary between analysts, and this results in a reduced ability to repeat measurements between analysts. The SPEAR project (Zeiler and Velasco, 2009) has started to explore the effects of comparing measurements from the same seismograms. Initial results showed that experience and signal quality are the leading contributors to pick differences. However, the traditional SNR method of measuring signal quality was replaced by a Wide-band Spectral Ratio (WSR) due to a decrease in scatter. This observation raises an important question: what is the best way to measure signal quality? We compare various methods (traditional SNR, WSR, power spectral density plots, Allan variance) that have been proposed to measure signal quality and discuss which method provides the best tool to compare arrival time uncertainty.
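    The conventional signal-quality measure discussed above, a short-term average divided by a long-term average, can be sketched as follows on a synthetic trace; ObsPy's trigger module provides optimized STA/LTA routines, and the windowing here is a simplified, centered variant.

```python
# Simplified, centered STA/LTA on a synthetic trace; ObsPy's trigger module offers
# optimized classic/recursive implementations of the same idea.
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=10.0):
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(trace, float) ** 2
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="same")
    lta[lta <= 0] = np.finfo(float).tiny           # avoid division by zero
    return sta / lta

fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
noise = np.random.default_rng(0).standard_normal(t.size)
onset = 30.0                                       # arrival hidden in the noise
signal = np.where(t > onset,
                  5.0 * np.sin(2 * np.pi * 5 * (t - onset)) * np.exp(-(t - onset)), 0.0)
cf = sta_lta(noise + signal, fs)
print(round(float(cf.max()), 1), round(float(t[cf.argmax()]), 2))   # peak near t = 30 s
```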

  2. Continuous, Large-Scale Processing of Seismic Archives for High-Resolution Monitoring of Seismic Activity and Seismogenic Properties

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.

    2012-12-01

    Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
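    The correlation-detector idea referred to above can be sketched as template matching: a known waveform is slid along continuous data and windows whose normalized cross-correlation exceeds a threshold are flagged. The data and threshold below are synthetic illustrations, not the DDRT implementation.

```python
# Template matching: slide a known waveform along continuous data and flag windows
# whose normalized cross-correlation exceeds a threshold (synthetic data only).
import numpy as np

def correlation_detector(data, template, threshold=0.6):
    n = len(template)
    tpl = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        win = data[i:i + n]
        std = win.std()
        cc[i] = 0.0 if std == 0 else float(np.sum(tpl * (win - win.mean())) / std)
    return np.flatnonzero(cc > threshold), cc

rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * np.arange(200) / 40.0) * np.hanning(200)
data = 0.2 * rng.standard_normal(5000)
data[1200:1400] += template            # hidden repeat of the template
data[3300:3500] += 0.5 * template      # weaker hidden repeat
hits, cc = correlation_detector(data, template)
print(hits, round(float(cc.max()), 2))  # detections cluster near samples 1200 and 3300
```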

  3. A Percolation Perspective for Gutenberg-Richter Scaling and b-values for Fracking-Associated Seismicity

    NASA Astrophysics Data System (ADS)

    Norris, J. Q.

    2016-12-01

    Published 60 years ago, the Gutenberg-Richter law provides a universal frequency-magnitude distribution for natural and induced seismicity. The GR law is a two-parameter power-law with the b-value specifying the relative frequency of small and large events. For large catalogs of natural seismicity, the observed b-values are near one, while fracking-associated seismicity has observed b-values near two, indicating relatively fewer large events. We have developed a computationally inexpensive percolation model for fracking that allows us to generate large catalogs of fracking-associated seismicity. Using these catalogs, we show that different power-law fitting procedures produce different b-values for the same data set. This shows that care must be taken when determining and comparing b-values for fracking-associated seismicity.
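    One standard way to estimate the b-value discussed above is the Aki (1965) maximum-likelihood estimator; the sketch below recovers a known b from a synthetic Gutenberg-Richter catalog and is independent of the percolation model itself.

```python
# Aki (1965) maximum-likelihood b-value estimate, checked on a synthetic
# Gutenberg-Richter catalog with a known b-value near 2.
import numpy as np

def b_value_mle(mags, m_c, dm=0.0):
    """dm is the magnitude bin width (Utsu correction); 0 for continuous magnitudes."""
    m = np.asarray(mags, float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

rng = np.random.default_rng(0)
b_true = 2.0                                       # fracking-like catalogs: b near 2
mags = 1.0 + rng.exponential(1.0 / (b_true * np.log(10)), 50000)   # GR above m_c = 1
print(round(b_value_mle(mags, m_c=1.0), 3))        # should recover ~2.0
```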

  4. Computer-Aided Drug Discovery: Molecular Docking of Diminazene Ligands to DNA Minor Groove

    ERIC Educational Resources Information Center

    Kholod, Yana; Hoag, Erin; Muratore, Katlynn; Kosenkov, Dmytro

    2018-01-01

    The reported project-based laboratory unit introduces upper-division undergraduate students to the basics of computer-aided drug discovery as a part of a computational chemistry laboratory course. The students learn to perform model binding of organic molecules (ligands) to the DNA minor groove with computer-aided drug discovery (CADD) tools. The…

  5. The Collaborative Seismic Earth Model Project

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Herwaarden, D. P.; Afanasiev, M.

    2017-12-01

    We present the first generation of the Collaborative Seismic Earth Model (CSEM). This effort is intended to address grand challenges in tomography that currently inhibit imaging the Earth's interior across the seismically accessible scales: [1] For decades to come, computational resources will remain insufficient for the exploitation of the full observable seismic bandwidth. [2] With the man power of individual research groups, only small fractions of available waveform data can be incorporated into seismic tomographies. [3] The limited incorporation of prior knowledge on 3D structure leads to slow progress and inefficient use of resources. The CSEM is a multi-scale model of global 3D Earth structure that evolves continuously through successive regional refinements. Taking the current state of the CSEM as initial model, these refinements are contributed by external collaborators, and used to advance the CSEM to the next state. This mode of operation allows the CSEM to [1] harness the distributed man and computing power of the community, [2] to make consistent use of prior knowledge, and [3] to combine different tomographic techniques, needed to cover the seismic data bandwidth. Furthermore, the CSEM has the potential to serve as a unified and accessible representation of tomographic Earth models. Generation 1 comprises around 15 regional tomographic refinements, computed with full-waveform inversion. These include continental-scale mantle models of North America, Australasia, Europe and the South Atlantic, as well as detailed regional models of the crust beneath the Iberian Peninsula and western Turkey. A global-scale full-waveform inversion ensures that regional refinements are consistent with whole-Earth structure. This first generation will serve as the basis for further automation and methodological improvements concerning validation and uncertainty quantification.

  6. An Integrated Approach for the Large-Scale Simulation of Sedimentary Basins to Study Seismic Wave Amplification

    NASA Astrophysics Data System (ADS)

    Poursartip, B.

    2015-12-01

    Seismic hazard assessment to predict the behavior of infrastructure subjected to earthquakes relies on ground motion numerical simulation, because the analytical solution of seismic waves is limited to only a few simple geometries. Recent advances in numerical methods and computer architectures make it ever more practical to reliably and quickly obtain the near-surface response to seismic events. The key motivation stems from the need to assess the performance of sensitive components of the civil infrastructure (nuclear power plants, bridges, lifelines, etc.) when subjected to realistic scenarios of seismic events. We discuss an integrated approach that deploys best-practice tools for simulating seismic events in arbitrarily heterogeneous formations, while also accounting for topography. Specifically, we describe an explicit forward wave solver based on a hybrid formulation that couples a single-field formulation for the computational domain with an unsplit mixed-field formulation for Perfectly-Matched-Layers (PMLs and/or M-PMLs) used to limit the computational domain. Due to the material heterogeneity and the contrasting discretization needs it imposes, an adaptive time solver is adopted. We use a Runge-Kutta-Fehlberg time-marching scheme that optimally adjusts the time step such that the local truncation error stays below a predefined tolerance. We use spectral elements for spatial discretization, and the Domain Reduction Method, in combination with a double-couple source representation, to allow for the efficient prescription of the input seismic motion. Of particular interest to this development is the study of the effects idealized topographic features have on the surface motion when compared against motion results that are based on a flat-surface assumption. We discuss the components of the integrated approach we followed, and report the results of parametric studies in two and three dimensions, for various idealized topographic features, which show motion amplification that depends, as expected, on the relation between the topographic feature's characteristics and the dominant wavelength. Lastly, we report results involving three-dimensional simulations.
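    The adaptive time-stepping idea is illustrated below with SciPy's embedded Runge-Kutta integrator (RK45, a Dormand-Prince pair rather than the Fehlberg pair used in the paper) applied to a damped oscillator standing in for a semi-discretized wave equation; step sizes shrink or grow so the local error estimate stays below the tolerances.

```python
# Adaptive embedded Runge-Kutta stepping (SciPy's RK45 Dormand-Prince pair, used here
# in place of the Runge-Kutta-Fehlberg pair) on a damped oscillator standing in for a
# semi-discretized wave equation: the step size adapts to keep the local error small.
import numpy as np
from scipy.integrate import solve_ivp

omega, zeta = 2.0 * np.pi * 2.0, 0.05       # 2 Hz oscillator, light damping

def rhs(t, y):
    u, v = y
    return [v, -2.0 * zeta * omega * v - omega**2 * u]

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], method="RK45", rtol=1e-6, atol=1e-9)
steps = np.diff(sol.t)
print(len(sol.t), float(steps.min()), float(steps.max()))   # adaptively chosen steps
```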

  7. Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework

    NASA Astrophysics Data System (ADS)

    Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.

    2015-12-01

    Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s, and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to make use of a modular approach, minimizing the effort of developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), which is a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, Numpy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project, and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half dimensional) viscoacoustic Helmholtz equation at a range of modelling frequencies, generating forward solutions for a given source behaviour, and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link to enable full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase. This makes it straightforward to compare results from Zephyr directly with FULLWV. The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward. This enables comparison and integration of new efforts with existing results.
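
    To make the frequency-domain forward-modelling step more concrete, here is a hedged sketch of a viscoacoustic Helmholtz solve on a 2D grid using NumPy/SciPy sparse matrices. It is not Zephyr's or FULLWV's implementation: the 2.5D wavenumber integration, absorbing boundaries, and attenuation model used by those codes are replaced here by the simplest possible choices, and all names and parameters are illustrative.

      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      def helmholtz_2d(c, q, freq, dx, src_idx, amplitude=1.0):
          """One frequency-domain forward solve: (laplacian + omega^2/c^2) u = source."""
          nz, nx = c.shape
          omega = 2.0 * np.pi * freq
          # Crude viscoacoustic behaviour via a complex velocity with quality factor q.
          k2 = (omega / (c * (1.0 + 1j / (2.0 * q)))) ** 2
          n = nz * nx
          # 5-point Laplacian; boundary handling (absorbing layers, edge wrap-around)
          # is deliberately ignored to keep the sketch short.
          lap = sp.diags([1.0, 1.0, -4.0, 1.0, 1.0], [-nx, -1, 0, 1, nx],
                         shape=(n, n)) * (1.0 / dx**2)
          A = (lap + sp.diags(k2.ravel())).tocsc()
          rhs = np.zeros(n, dtype=complex)
          rhs[np.ravel_multi_index(src_idx, c.shape)] = amplitude / dx**2
          return spla.spsolve(A, rhs).reshape(nz, nx)

      # Hypothetical usage: a homogeneous 2 km/s model with a low-velocity anomaly.
      c = np.full((101, 201), 2000.0)
      c[40:60, 80:120] = 1600.0
      q = np.full_like(c, 50.0)
      u = helmholtz_2d(c, q, freq=10.0, dx=10.0, src_idx=(5, 100))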

  8. A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.

    2012-04-01

    Cloud computing is becoming established worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source, state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We replaced demanding user-side hardware and software requirements by remote access to high-performance grid-computing facilities. As a result, data processing can be done in quasi-real time, controlled ubiquitously via the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface stack, a data-driven imaging method that requires no user interaction at run time such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting output, post-stack section, coherence, and NMO-velocity panels are used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time-migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step-by-step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of the submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters with a total of 70 nodes of 4 cores each. They are shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; (3) a multi-offset Ground-Penetrating-Radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.

  9. Rubber airplane: Constraint-based component-modeling for knowledge representation in computer-aided conceptual design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1990-01-01

    Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.

  10. Seismic Signatures of Brine Release at Blood Falls, Taylor Glacier, Antarctica

    NASA Astrophysics Data System (ADS)

    Carr, C. G.; Pettit, E. C.; Carmichael, J.

    2017-12-01

    Blood Falls is created by the release of subglacially-sourced, iron-rich brine at the surface of Taylor Glacier, McMurdo Dry Valleys, Antarctica. The supraglacial portion of this hydrological feature is episodically active. Englacial liquid brine flow occurs despite ice temperatures of -17°C, and we document supraglacial liquid brine release despite ambient air temperatures averaging -20°C. In this study, we use data from a seismic network, time-lapse cameras, and publicly available weather station data to address the questions: what are the characteristics of seismic events that occur during Blood Falls brine release and how do these compare with seismic events that occur during times of Blood Falls quiescence? How are different processes observable in the time-lapse imagery represented in the seismic record? Time-lapse photography constrains the timing of brine release events during the austral winter of 2014. We use a noise-adaptive digital power detector to identify seismic events and cluster analysis to identify repeating events based on waveform similarity across the network. During the 2014 wintertime brine release, high-energy repeated seismic events occurred proximal to Blood Falls. We investigate the ground motions associated with these clustered events, as well as their spatial distribution. We see evidence of possible tremor during the brine release periods, an indicator of fluid movement. If distinctive seismic signatures are associated with Blood Falls brine release, they could be identified based solely on seismic data without any aid from time-lapse cameras. Passive seismologic monitoring has the benefit of continuity during the polar night and other poor visibility conditions, which make time-lapse imagery unusable.
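
    For readers unfamiliar with the clustering step, the sketch below shows one common way to group detected events into repeating families, using peak normalized cross-correlation as a similarity measure followed by hierarchical clustering. It is an assumption-laden illustration in Python, not the authors' detector or clustering code, and the threshold and synthetic data are hypothetical.

      import numpy as np
      from scipy.signal import correlate
      from scipy.cluster.hierarchy import linkage, fcluster

      def max_normalized_cc(a, b):
          """Peak of the normalized cross-correlation between two traces."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return np.max(correlate(a, b, mode='full'))

      def cluster_events(waveforms, cc_threshold=0.7):
          """waveforms: (n_events, n_samples) array of detected event traces."""
          n = len(waveforms)
          dist = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  dist[i, j] = dist[j, i] = 1.0 - max_normalized_cc(waveforms[i],
                                                                    waveforms[j])
          condensed = dist[np.triu_indices(n, k=1)]     # pdist-style ordering
          tree = linkage(condensed, method='average')
          return fcluster(tree, t=1.0 - cc_threshold, criterion='distance')

      # Hypothetical usage: two synthetic repeating-waveform families plus noise.
      rng = np.random.default_rng(0)
      templates = rng.standard_normal((2, 500))
      waveforms = np.vstack([templates[i % 2] + 0.3 * rng.standard_normal(500)
                             for i in range(20)])
      family_labels = cluster_events(waveforms)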

  11. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    PubMed

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

    To review recent advances in the research and application of computer aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer aided forming techniques for bone scaffold construction using various scaffold materials, based on computer aided design (CAD) and bone scaffold rapid prototyping (RP). CAD approaches include medical CAD, STL, and reverse design. Reverse design can fully simulate normal bone tissue and could be very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  12. The Impact of Machine Translation and Computer-aided Translation on Translators

    NASA Astrophysics Data System (ADS)

    Peng, Hao

    2018-03-01

    In the context of globalization, communications between countries and cultures are becoming increasingly frequent, which makes it imperative to use techniques that assist translation. This paper explores the influence of computer-aided translation on translators, drawing on the fields of computer-aided translation (CAT) and machine translation (MT). Following an introduction to the development of machine and computer-aided translation, it describes the technologies available to translators, analyzes the demands placed on the design of computer-aided translation tools in translation practice, considers how the design of such tools can be optimized, and examines their operability in translation. The findings underline the advantages and disadvantages of MT and CAT tools, and the serviceability and future development of MT and CAT technologies. Finally, this paper probes into the impact of these new technologies on translators, in the hope that more translators and translation researchers can learn to use such tools to improve their productivity.

  13. Real-time seismic monitoring of instrumented hospital buildings

    USGS Publications Warehouse

    Kalkan, Erol; Fletcher, Jon Peter B.; Leith, William S.; McCarthy, William S.; Banga, Krishna

    2012-01-01

    In collaboration with the Department of Veterans Affairs (VA), the U.S. Geological Survey's National Strong Motion Project has recently installed sophisticated seismic monitoring systems to monitor the structural health of two hospital buildings at the Memphis VA Medical Center in Tennessee. The monitoring systems in the Bed Tower and Spinal Cord Injury buildings combine sensing technologies with an on-site computer to capture and analyze seismic performance of buildings in near-real time.

  14. Comparison of Structurally Controlled Landslide Hazard Simulation to the Co-seismic Landslides Caused by the M 7.2 2013 Bohol Earthquake.

    NASA Astrophysics Data System (ADS)

    Galang, J. A. M. B.; Eco, R. C.; Lagmay, A. M. A.

    2014-12-01

    The M_w 7.2 October 15, 2013 Bohol earthquake is one of the more destructive earthquakes to hit the Philippines in the 21st century. The epicenter was located in Sagbayan municipality, central Bohol, and the earthquake was generated by a previously unmapped reverse fault called the "Inabanga Fault". The earthquake resulted in 209 fatalities and over 57 million USD worth of damages. The earthquake generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, the trigger for co-seismic landslides happens without warning. Preparations for this type of landslide rely heavily on the identification of fracture-related slope instability. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations of discontinuity sets were mapped using remote sensing techniques with the aid of a Digital Terrain Model (DTM) obtained in 2012. The DTM used is an IFSAR-derived image with a 5-meter pixel resolution and approximately 0.5 meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dip and dip directions. The chosen discontinuity sets were then keyed into Matterocking software to identify potential rock slide zones due to planar or wedge discontinuities. After identifying the structurally controlled unstable slopes, the rock mass propagation extent of the possible rock slides was simulated using Conefall. Separately, a manually derived landslide inventory was produced using post-earthquake satellite images and LIDAR. The results were compared to the landslide inventory, which identified at least 873 landslides. Out of the 873 landslides identified through the inventory, 786 or 90% intersect the simulated structurally controlled landslide hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods that simulate shallow landslides and debris-flow paths, the located structurally controlled unstable zones can be used to mark unsafe areas for settlement. The method can be further improved with the use of Lidar DTMs, which have better spatial resolution than the IFSAR DTM. A nationwide effort under DOST-Project NOAH (DREAM-LIDAR) is underway to map the Philippine archipelago using Lidar.

  15. Seismic vulnerability of Oregon state highway bridges : mitigation strategies to reduce major mobility risks.

    DOT National Transportation Integrated Search

    2009-11-01

    The Oregon Department of Transportation and Portland State University evaluated the seismic : vulnerability of state highway bridges in western Oregon. The study used a computer program : called REDARS2 that simulated the damage to bridges within a t...

  16. Effects of Permafrost and Seasonally Frozen Ground on the Seismic Response of Transportation Infrastructure Sites

    DOT National Transportation Integrated Search

    2010-02-01

    This interdisciplinary project combined seismic data recorded at bridge sites with computer models to identify how highway bridges built on permanently and seasonally frozen ground behave during an earthquake. Two sites, one in Anchorage and one in...

  17. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
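
    The step of combining simulated intensity measures with rupture-forecast probabilities can be illustrated with a small sketch. The code below is a generic PSHA hazard-curve computation in Python, not the CyberShake codebase, and the rupture rates and lognormal intensity measures in the usage example are invented for illustration.

      import numpy as np

      def hazard_curve(im_by_rupture, annual_rates, im_levels):
          """Annual rate of exceeding each ground-motion level at one site.

          im_by_rupture : list of 1D arrays of simulated intensity measures for the
                          rupture variations of each rupture.
          annual_rates  : annual occurrence rate of each rupture (rupture forecast).
          im_levels     : ground-motion levels at which to evaluate the curve.
          """
          exceed_rate = np.zeros_like(im_levels, dtype=float)
          for ims, rate in zip(im_by_rupture, annual_rates):
              # Exceedance probability estimated from this rupture's simulated seismograms.
              p_exceed = (ims[:, None] > im_levels[None, :]).mean(axis=0)
              exceed_rate += rate * p_exceed
          return exceed_rate

      def prob_of_exceedance(exceed_rate, years=50.0):
          """Poisson probability of at least one exceedance in a time window."""
          return 1.0 - np.exp(-exceed_rate * years)

      # Hypothetical usage with two invented ruptures.
      rng = np.random.default_rng(1)
      im_by_rupture = [rng.lognormal(-2.0, 0.6, 100), rng.lognormal(-1.5, 0.5, 100)]
      annual_rates = np.array([1e-3, 4e-4])
      levels = np.linspace(0.01, 1.0, 50)
      curve = prob_of_exceedance(hazard_curve(im_by_rupture, annual_rates, levels))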

  18. H-fractal seismic metamaterial with broadband low-frequency bandgaps

    NASA Astrophysics Data System (ADS)

    Du, Qiujiao; Zeng, Yi; Xu, Yang; Yang, Hongwu; Zeng, Zuoxun

    2018-03-01

    The application of metamaterials in civil engineering to achieve isolation of a building by controlling the propagation of seismic waves is a substantial challenge because seismic waves, a superposition of longitudinal and shear waves, are more complex than electromagnetic and acoustic waves. In this paper, we design a broadband seismic metamaterial based on H-shaped fractal pillars and report numerical simulations of band structures for propagating seismic surface waves. A comparative study of the band structures of H-fractal seismic metamaterials with different levels shows that a new level of fractal structure creates new band gaps, widens the total band gaps, and shifts corresponding band gaps towards lower frequencies. Moreover, the vibration modes of the H-fractal seismic metamaterials are computed and analyzed to clarify the mechanism of band-gap widening. A numerical investigation of seismic surface-wave propagation on a 2D array of fractal unit cells on the surface of a semi-infinite substrate is presented to show the efficiency of earthquake shielding in multiple complete band gaps.

  19. Attributes Affecting Computer-Aided Decision Making--A Literature Survey.

    ERIC Educational Resources Information Center

    Moldafsky, Neil I; Kwon, Ik-Whan

    1994-01-01

    Reviews current literature about personal, demographic, situational, and cognitive attributes that affect computer-aided decision making. The effectiveness of computer-aided decision making is explored in relation to decision quality, effectiveness, and confidence. Studies of the effects of age, anxiety, cognitive type, attitude, gender, and prior…

  20. User-Centered Computer Aided Language Learning

    ERIC Educational Resources Information Center

    Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.

    2006-01-01

    In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…

  1. Hearing Impairments. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on advances in electronic aids, computers, telecommunications, and videodiscs to assist students with hearing impairments. Electronic aids include hearing aids, telephone devices for the deaf, teletypes, closed captioning systems for television, and…

  2. Bayesian Estimation of the Spatially Varying Completeness Magnitude of Earthquake Catalogs

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Werner, M.; Wiemer, S.; Chen, C.; Wu, Y.

    2010-12-01

    Assessing the completeness magnitude Mc of earthquake catalogs is an essential prerequisite for any seismicity analysis. We employ a simple model to compute Mc in space, based on the proximity to seismic stations in a network. We show that a relationship of the form Mc_pred(d) = a d^b + c, with d the distance to the 5th-nearest seismic station, fits the observations well. We then propose a new Mc mapping approach, the Bayesian Magnitude of Completeness (BMC) method, based on a 2-step procedure: (1) a spatial resolution optimization to minimize spatial heterogeneities and uncertainties in Mc estimates and (2) a Bayesian approach that merges prior information about Mc, based on the proximity to seismic stations, with locally observed values weighted by their respective uncertainties. This new methodology eliminates most weaknesses associated with current Mc mapping procedures: the radius that defines which earthquakes to include in the local magnitude distribution is chosen according to an objective criterion, and there are no gaps in the spatial estimation of Mc. The method solely requires the coordinates of seismic stations. Here, we investigate the Taiwan Central Weather Bureau (CWB) earthquake catalog by computing an Mc map for the period 1994-2010.
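
    The two ingredients of the BMC procedure, calibrating the distance-based prior Mc_pred(d) = a d^b + c and merging it with a locally observed Mc weighted by uncertainty, can be sketched as follows. This is an illustrative Python reconstruction under a Gaussian-uncertainty assumption, not the authors' code, and the numbers in the usage example are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def mc_pred(d, a, b, c):
          """Prior completeness magnitude versus distance d to the 5th-nearest station."""
          return a * d**b + c

      def fit_prior(d_obs, mc_obs):
          """Calibrate the distance-based prior model and its residual uncertainty."""
          params, _ = curve_fit(mc_pred, d_obs, mc_obs, p0=[1.0, 0.5, 0.5])
          sigma_prior = np.std(mc_obs - mc_pred(d_obs, *params))
          return params, sigma_prior

      def bayesian_merge(mc_prior, sigma_prior, mc_local, sigma_local):
          """Precision-weighted (Gaussian) combination of prior and local observation."""
          w_p, w_o = 1.0 / sigma_prior**2, 1.0 / sigma_local**2
          mc_post = (w_p * mc_prior + w_o * mc_local) / (w_p + w_o)
          return mc_post, np.sqrt(1.0 / (w_p + w_o))

      # Hypothetical usage with synthetic calibration data and one local estimate.
      rng = np.random.default_rng(2)
      d = rng.uniform(2.0, 100.0, 200)                        # km to 5th-nearest station
      mc = 0.6 * d**0.35 + 0.5 + rng.normal(0, 0.15, d.size)  # "observed" Mc values
      params, s_prior = fit_prior(d, mc)
      mc_node, s_node = bayesian_merge(mc_pred(25.0, *params), s_prior, 1.9, 0.25)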

  3. Can We Estimate Injected Carbon Dioxide Prior to the Repeat Survey in 4D Seismic Monitoring Scheme?

    NASA Astrophysics Data System (ADS)

    Sakai, A.

    2005-12-01

    To mitigate global climate change, geologic sequestration of carbon dioxide by injection into aquifers and other formations is one of the most promising scenarios. Monitoring is required to verify the long-term safe storage of carbon dioxide in the subsurface. As evidenced in the oil industry, monitoring by time-lapse 3D seismic surveys is the most effective way to spatially detect fluid movements and changes of pore pressure. We have conducted 3D seismic surveys onshore Japan surrounding the RITE/METI Iwanohara carbon dioxide injection test site. The target aquifer zone is at a depth of 1100 m in a 60-m-thick Pleistocene layer, and the most permeable zone is approximately 12 m thick. A baseline 3D seismic survey was conducted in July-August 2003 and a monitor 3D seismic survey in July-August 2005, using a vibrating source with a 10-120 Hz sweep frequency band. Prior to the monitor survey, we evaluated the seismic data by integrating wireline logging data. As the target carbon dioxide injection layer is thin, high-resolution seismic data are required to estimate the potential spreading of the injected carbon dioxide. To increase seismic resolution, a spectral enhancement method was used. The procedure consists of smoothing a number of seismic amplitude spectra, computing the well-log spectrum, and constructing a matching filter between the seismic and well spectra. The filter was then applied to all seismic traces after evaluation on test traces. Synthetic seismograms were computed from logging data by extracting optimal wavelets. Fitting between spectrally enhanced seismic traces and synthetic seismograms was excellent, even for deviated monitor wells. Acoustic impedance was estimated by inversion of these 3D seismic traces. In analyzing sonic, density, CMR, and other logging data, the elastic wave velocity was reconstructed by a rock-physics approach after estimating compositions. Based on these models, velocity changes caused by carbon dioxide injection were evaluated. The correlation of acoustic impedance with porosity and logarithmic permeability was good, and relying on this relation, geological constraints, and inversion techniques, porosity and permeability were estimated in a 3D volume. If carbon dioxide movement were controlled solely by permeability, the estimated permeability volume might predict the time-lapse seismic response prior to a repeat survey. We compare the estimate with the actual 4D changes and discuss related variations.

  4. Computer Aided Design in Engineering Education.

    ERIC Educational Resources Information Center

    Gobin, R.

    1986-01-01

    Discusses the use of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) systems in an undergraduate engineering education program. Provides a rationale for CAD/CAM use in the already existing engineering program. Describes the methods used in choosing the systems, some initial results, and warnings for first-time users. (TW)

  5. Prosthetic rehabilitation with an implant-supported fixed prosthesis using computer-aided design and computer-aided manufacturing dental technology for a patient with a mandibulectomy: A clinical report.

    PubMed

    Yoon, Hyung-In; Han, Jung-Suk

    2016-02-01

    The fabrication of dental prostheses with computer-aided design and computer-aided manufacturing shows acceptable marginal fits and favorable treatment outcomes. This clinical report describes the management of a patient who had undergone a mandibulectomy and received an implant-supported fixed prosthesis by using additive manufacturing for the framework and subtractive manufacturing for the monolithic zirconia restorations.

  6. Bridges Dynamic Parameters Identification Based On Experimental and Numerical Method Comparison in Regard with Traffic Seismicity

    NASA Astrophysics Data System (ADS)

    Krkošková, Katarína; Papán, Daniel; Papánová, Zuzana

    2017-10-01

    Technical seismicity negatively affects the environment, buildings and structures. Technical seismicity refers to seismic shaking of unnatural origin caused by force impulses and random processes. In the Slovak Republic, the influence of vibration on buildings is evaluated according to Eurocode 8; in addition, the Slovak Technical Standard STN 73 0036 addresses technical seismicity. This standard also classes bridges into the group of structures that are significant with respect to technical seismicity, the group "U". Because the standard treats this issue only briefly, case-study analysis using FEM simulation and comparison is necessary. In this article, dynamic parameters determined by experimental measurement and by numerical methods are compared for two real bridges. The first bridge (D201-00) is a scaffold bridge on road I/11 leading to the city of Čadca and is situated in the city of Žilina. It is an eleven-span concrete road bridge that spans a railway. The second bridge (M5973 Brodno) is situated in a part of Žilina City on road I/11. It is a concrete three-span road bridge built as a box girder. The computational part includes 3D models of the bridges. The first bridge (D201-00) was modelled in IDA Nexis software as a slab-wall model; the model outputs are natural frequencies and natural vibration modes. The second bridge (M5973 Brodno) was modelled in VisualFEA software; the technical seismicity corresponds to the force impulse applied to this model, and the model outputs are vibration displacements, velocities and accelerations. The aim of the experiments was to measure the vibration acceleration time records of the bridges, which required systematic placement of accelerometers. For the first bridge (D201-00), the vibration acceleration time record during an under-bridge train crossing is of interest; for the second bridge (M5973 Brodno), the vibration acceleration time history during the application of the force impulse under the bridge is of interest. The analysis was done in Sigview software. For the first bridge (D201-00), the analysis output was power spectral density values at the corresponding frequencies; these frequencies were compared with the natural frequencies from the computational model, whereby the influence of technical seismicity on the bridge's natural frequencies was determined. For the second bridge (M5973 Brodno), the recorded vibration velocity time history displayed in Sigview was compared with the vibration velocity time history from the computational model, and the results coincided.
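
    As a small illustration of the spectral analysis step, the sketch below estimates power spectral density from an acceleration time history with Welch's method so that spectral peaks can be compared against natural frequencies from an FEM model. It is a generic Python/SciPy example, not the Sigview workflow used by the authors, and the synthetic record and its frequencies are hypothetical.

      import numpy as np
      from scipy.signal import welch

      def dominant_frequencies(acceleration, fs, n_peaks=5):
          """Frequencies of the largest PSD values (adjacent bins of one strong peak
          may appear; a real workflow would apply a proper peak-picking routine)."""
          freqs, pxx = welch(acceleration, fs=fs, nperseg=4096)
          order = np.argsort(pxx)[::-1][:n_peaks]
          return freqs[order], pxx[order]

      # Hypothetical usage: a synthetic record with 2.4 Hz and 6.1 Hz components.
      fs = 200.0                                  # sampling frequency, Hz
      t = np.arange(0, 120, 1 / fs)
      acc = (np.sin(2 * np.pi * 2.4 * t) + 0.5 * np.sin(2 * np.pi * 6.1 * t)
             + 0.2 * np.random.default_rng(3).standard_normal(t.size))
      freqs, power = dominant_frequencies(acc, fs)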

  7. Application of seismic interpretation in the development of Jerneh Field, Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusoff, Z.

    1994-07-01

    Development of the Jerneh gas field has been significantly aided by the use of 3-D and site survey seismic interpretations. The two aspects that have been of particular importance are identification of sea-floor and near-surface safety hazards for safe platform installation/development drilling and mapping of reservoirs/hydrocarbons within gas-productive sands of the Miocene groups B, D, and E. Choice of platform location as well as casing design require detailed analysis of sea-floor and near-surface safety hazards. At Jerneh, sea-floor pockmarks, near-surface high amplitudes, distributary channels, and minor faults were recognized as potential operational safety hazards. The integration of conventional 3-D and site survey seismic data enabled comprehensive understanding of the occurrence and distribution of potential hazards to platform installation and development well drilling. Three-dimensional seismic interpretation has been instrumental not only in the field structural definition but also in recognition of reservoir trends and hydrocarbon distribution. Additional gas reservoirs were identified by their DHI characteristics and subsequently confirmed by development wells. The innovative use of seismic attribute mapping techniques has been very important in defining both fluid and reservoir distribution in groups B and D. Integration of 3-D seismic data and well-log interpretations has helped in optimal field development, including the planning of well locations and drilling sequence.

  8. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  9. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description from a user's and programmer's perspective of the highly modular, flexible and extendable software package ASKI-Analysis of Sensitivity and Kernel Inversion-recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low costs applying different kinds of model regularization or re-selecting/weighting the inverted dataset without need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under terms of the GNU General Public License (http://www.rub.de/aski).
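
    The claim that model updates can be re-derived cheaply once kernels are stored can be illustrated with a damped least-squares sketch. This is a generic Python example of the idea, not ASKI's implementation, and the damping scheme, weighting, and synthetic kernels are assumptions made for illustration.

      import numpy as np

      def model_update(kernels, residuals, damping=1.0, weights=None):
          """Damped least-squares update from stored sensitivity kernels.

          kernels   : (n_data, n_model) sensitivity matrix read from file.
          residuals : (n_data,) data misfit vector (observed minus synthetic).
          damping   : Tikhonov regularization strength, cheap to vary because the
                      forward problem and kernels are not recomputed.
          weights   : optional (n_data,) weights for re-selecting/weighting the data.
          """
          G, d = np.asarray(kernels, dtype=float), np.asarray(residuals, dtype=float)
          if weights is not None:
              w = np.sqrt(np.asarray(weights, dtype=float))
              G, d = G * w[:, None], d * w
          lhs = G.T @ G + damping * np.eye(G.shape[1])
          return np.linalg.solve(lhs, G.T @ d)

      # Hypothetical usage: two regularization strengths with the same stored kernels.
      rng = np.random.default_rng(4)
      G = rng.standard_normal((500, 60))
      dm_true = np.zeros(60)
      dm_true[10:20] = 0.02
      d = G @ dm_true + 0.01 * rng.standard_normal(500)
      dm_smooth = model_update(G, d, damping=10.0)
      dm_rough = model_update(G, d, damping=0.1)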

  10. Tectonic styles of future earthquakes in Italy as input data for seismic hazard

    NASA Astrophysics Data System (ADS)

    Pondrelli, S.; Meletti, C.; Rovida, A.; Visini, F.; D'Amico, V.; Pace, B.

    2017-12-01

    In a recent elaboration of a new seismogenic zonation and hazard model for Italy, we tried to understand how many indications we have on the tectonic style of future earthquakes/ruptures. Using all available or recomputed seismic moment tensors for relevant seismic events (Mw starting from 4.5) of the last 100 yrs, first-arrival focal mechanisms for less recent earthquakes, and also geological data on past activated faults, we collected a database gathering about a thousand data points covering the Italian peninsula and the regions around it. After several summations of seismic moment tensors over regular grids of different dimensions and different thicknesses of the seismogenic layer, we applied the same procedure to each of the 50 area sources that were designed in the seismogenic zonation. The results for several seismic zones are very stable; e.g., along the southern Apennines we expect future earthquakes to be mostly extensional, although in the outer part of the chain strike-slip events are possible. In the northern part of the Apennines we also expect different, opposite tectonic styles at different hypocentral depths. In several zones characterized by a low seismic moment release, defined for the study region using 1000 yrs of catalog, the next possible tectonic style of future earthquakes is less clear. It is worth noting that for some zones the largest possible earthquake may not be represented in the available observations. We also add to our analysis the computation of the seismic release rate, computed using a distributed completeness identified for single great events of the historical seismic catalog for Italy. All these information layers, overlapped and compared, may be used to characterize each new seismogenic zone.
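
    One simple way to turn a collection of moment tensors into a dominant tectonic style for a zone is a Kostrov-style summation followed by a classification based on P- and T-axis plunges. The sketch below is a hedged illustration of that general idea in Python, not the authors' procedure, and the coordinate convention, plunge thresholds, and toy tensors are assumptions made for the example.

      import numpy as np

      def dominant_style(moment_tensors):
          """Classify the cumulative tensor of a zone by P- and T-axis plunges.

          moment_tensors : (n, 3, 3) symmetric tensors in an (east, north, up) frame.
          """
          m_sum = np.sum(moment_tensors, axis=0)
          _, vecs = np.linalg.eigh(m_sum)            # eigenvalues in ascending order
          p_axis, t_axis = vecs[:, 0], vecs[:, 2]    # compression / tension axes

          def plunge(v):                             # angle from horizontal, degrees
              return np.degrees(np.arcsin(abs(v[2]) / np.linalg.norm(v)))

          p_pl, t_pl = plunge(p_axis), plunge(t_axis)
          if p_pl > 60:
              return 'normal'       # near-vertical P axis: extensional faulting
          if t_pl > 60:
              return 'thrust'       # near-vertical T axis: compressional faulting
          if p_pl < 35 and t_pl < 35:
              return 'strike-slip'  # both axes sub-horizontal
          return 'oblique'

      # Hypothetical usage: two normal-faulting tensors dominate one strike-slip tensor.
      normal = np.diag([1.0, 0.0, -1.0])       # T east, P vertical
      strike_slip = np.diag([1.0, -1.0, 0.0])  # P and T horizontal
      style = dominant_style(np.stack([2.0 * normal, 2.0 * normal, strike_slip]))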

  11. Seismic velocity change and slip rate during the 2006 Guerrero (Mexico) slow slip event

    NASA Astrophysics Data System (ADS)

    Rivet, Diane; Radiguet, Mathilde; Campillo, Michel; Cotton, Fabrice; Shapiro, Nikolai; Krishna Singh, Shri; Kostoglodov, Vladimir

    2010-05-01

    We measure the temporal change of the seismic velocity in the crust below the Guerrero region during the 2006 slow slip event (SSE). We use repeated cross-correlations of ambient seismic noise recorded at 26 broad-band stations of the MesoAmerica Seismic Experiment (MASE). The cross-correlations are computed over 90 days with a moving window of 10 days from January 2005 to July 2007. To ensure measurements independent of noise source variations, we only take into account the travel time change within the coda. For periods of 8 to 20 s, we observe a decrease in velocity starting in April 2006, with a maximum change of -0.3% of the initial velocity in June 2006. At these periods, the Rayleigh waves are sensitive to velocity changes down to the lower crust. On the other hand, we compute the deformation rate below the MASE array from a slip propagation model of the SSE observed by means of the displacement time series of 15 continuous GPS stations. Slip initiates in the western part of the Guerrero Gap and propagates southeastward. The propagation velocity is of the order of 1 km/day. We then compare the seismic velocity change measured from continuous seismological data with the deformation rate inferred from geodetic measurements below the MASE array. We obtain a good agreement between the time of maximal seismic velocity change (July 2006) and the time of maximum deformation associated with the SSE (July to August 2006). This result shows that the long-term velocity change associated with the SSE can be detected using continuous seismic recordings. Since the SSE does not emit seismic waves that interact with the superficial layers, the result indicates that the velocity change is due to deformation at depth.
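
    A common way to measure a relative velocity change from the coda of repeated noise cross-correlations is the stretching technique. The following Python sketch illustrates that general approach under simple assumptions and is not necessarily the processing used by the authors; the coda window, trial range, and synthetic traces are chosen arbitrarily.

      import numpy as np

      def stretching_dvv(ref, cur, dt, coda_window, trials=np.linspace(-0.01, 0.01, 401)):
          """Grid search for the relative velocity change dv/v.

          ref, cur    : reference and current cross-correlation functions.
          dt          : sample interval in seconds.
          coda_window : (t_min, t_max) lapse-time window restricted to the coda so
                        that changes in the noise sources matter less.
          Returns (dv/v, correlation coefficient at the optimum).
          """
          t = np.arange(len(ref)) * dt
          mask = (t >= coda_window[0]) & (t <= coda_window[1])
          best = (0.0, -1.0)
          for eps in trials:
              # A homogeneous change dv/v stretches lapse times by roughly (1 + dv/v).
              ref_stretched = np.interp(t * (1.0 + eps), t, ref)
              cc = np.corrcoef(ref_stretched[mask], cur[mask])[0, 1]
              if cc > best[1]:
                  best = (eps, cc)
          return best

      # Hypothetical usage: the "current" correlation mimics a 0.3% velocity decrease.
      dt = 0.05
      t = np.arange(0, 120, dt)
      ref = np.exp(-t / 40.0) * np.sin(2 * np.pi * 0.5 * t)
      cur = np.interp(t * (1.0 - 0.003), t, ref)
      dvv, cc = stretching_dvv(ref, cur, dt, coda_window=(30.0, 100.0))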

  12. Measurement of near-surface seismic compressional wave velocities using refraction tomography at a proposed construction site on the Presidio of Monterey, California

    USGS Publications Warehouse

    Powers, Michael H.; Burton, Bethany L.

    2012-01-01

    The U.S. Army Corps of Engineers is determining the feasibility of constructing a new barracks building on the U.S. Army Presidio of Monterey in Monterey, California. Due to the presence of an endangered orchid in the proposed area, invasive techniques such as exploratory drill holes are prohibited. To aid in determining the feasibility, budget, and design of this building, a compressional-wave seismic refraction survey was proposed by the U.S. Geological Survey as an alternative means of investigating the depth to competent bedrock. Two sub-parallel profiles were acquired along an existing foot path and a fence line to minimize impacts on the endangered flora. The compressional-wave seismic refraction tomography data for both profiles indicate that no competent rock classified as non-rippable or marginally rippable exists within the top 30 feet beneath the ground surface.

  13. Probabilistic safety analysis of earth retaining structures during earthquakes

    NASA Astrophysics Data System (ADS)

    Grivas, D. A.; Souflis, C.

    1982-07-01

    A procedure is presented for determining the probability of failure of earth retaining structures under static or seismic conditions. Four possible modes of failure (overturning, base sliding, bearing capacity, and overall sliding) are examined and their combined effect is evaluated with the aid of combinatorial analysis. The probability of failure is shown to be a more adequate measure of safety than the customary factor of safety. As earth retaining structures may fail in four distinct modes, a system analysis can provide a single estimate for the possibility of failure. A Bayesian formulation of the safety of retaining walls is found to provide an improved measure of the predicted probability of failure under seismic loading. The presented Bayesian analysis can account for the damage incurred by a retaining wall during an earthquake to provide an improved estimate of its probability of failure during future seismic events.
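
    The combinatorial step of merging the four failure modes into a single probability of failure can be sketched in a few lines. The code below either assumes independent modes or returns classical series-system bounds; this is a simplification for illustration rather than the paper's exact formulation, and the mode probabilities in the usage example are invented.

      import numpy as np

      def system_failure_probability(p_modes, independent=True):
          """Combine the four mode probabilities into one system value.

          p_modes : probabilities of [overturning, base sliding, bearing capacity,
                    overall sliding]; the wall fails if any single mode occurs.
          If independence is assumed, return a single value; otherwise return the
          classical series-system (lower, upper) bounds.
          """
          p = np.asarray(p_modes, dtype=float)
          if independent:
              return 1.0 - np.prod(1.0 - p)       # assumption: independent modes
          return p.max(), min(1.0, p.sum())       # bounds without independence

      # Hypothetical usage for a wall under seismic loading.
      p_modes = [0.02, 0.05, 0.01, 0.03]
      p_failure = system_failure_probability(p_modes)
      p_bounds = system_failure_probability(p_modes, independent=False)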

  14. The Effects of Computer-Aided Design Software on Engineering Students' Spatial Visualisation Skills

    ERIC Educational Resources Information Center

    Kösa, Temel; Karakus, Fatih

    2018-01-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations…

  15. Teaching Computer-Aided Design of Fluid Flow and Heat Transfer Engineering Equipment.

    ERIC Educational Resources Information Center

    Gosman, A. D.; And Others

    1979-01-01

    Describes a teaching program for fluid mechanics and heat transfer which contains both computer aided learning (CAL) and computer aided design (CAD) components and argues that the understanding of the physical and numerical modeling taught in the CAL course is essential to the proper implementation of CAD. (Author/CMV)

  16. Computer-Presented Organizational/Memory Aids as Instruction for Solving Pico-Fomi Problems.

    ERIC Educational Resources Information Center

    Steinberg, Esther R.; And Others

    1985-01-01

    Describes investigation of effectiveness of computer-presented organizational/memory aids (matrix and verbal charts controlled by computer or learner) as instructional technique for solving Pico-Fomi problems, and the acquisition of deductive inference rules when such aids are present. Results indicate chart use control should be adapted to…

  17. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  18. Computer-aided design development transition for IPAD environment

    NASA Technical Reports Server (NTRS)

    Owens, H. G.; Mock, W. D.; Mitchell, J. C.

    1980-01-01

    The relationship of federally sponsored computer-aided design/computer-aided manufacturing (CAD/CAM) programs to the aircraft life cycle design process, an overview of NAAD's CAD development program, an evaluation of the CAD design process, a discussion of the current computing environment within which NAAD is developing its CAD system, some of the advantages/disadvantages of the NAAD-IPAD approach, and CAD developments during transition into the IPAD system are discussed.

  19. CAD/CAE Integration Enhanced by New CAD Services Standard

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2002-01-01

    A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.

  20. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  1. Computer Numerical Control: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Sinn, John W.

    This instructional manual contains five learning activity packets for use in a workshop on computer numerical control for computer-aided manufacturing. The lessons cover the following topics: introduction to computer-aided manufacturing, understanding the lathe, using the computer, computer numerically controlled part programming, and executing a…

  2. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  3. Status of emerging standards for data definitions and transfer in the petroleum industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winczewski, L.M.

    1991-03-01

    Leading-edge hardware and software to store, retrieve, process, analyze, visualize, and interpret geoscience and petroleum data are improving continuously. A babel of definitions and formats for common industry data items limits the overall effectiveness of these computer-aided exploration and production tools. Custom data conversion required to load applications causes delays and exposes data content to error and degradation. Emerging industry-wide standards for management of geoscience and petroleum-related data are poised to overcome long-standing internal barriers to the full exploitation of these high-tech hardware/software systems. Industry technical organizations, such as AAPG, SEG, and API, have been actively pursuing industry-wide standards for data transfer, data definitions, and data models. These standards-defining groups charge no fees and solicit active participation from the entire petroleum community. The status of the most active of these groups is presented here. Data transfer standards are being pursued within AAPG (AAPG-B Data Transfer Standard), API (DLIS, for log data) and SEG (SEG-DEF, for seismic data). Converging data definitions, models, and glossaries are coming from the Petroleum Industry Data Dictionary Group (PIDD) and from subcommittees of the AAPG Computer Applications Committee. The National Computer Graphics Association is promoting development of standards for transfer of geographically oriented data. The API Well-Number standard is undergoing revision.

  4. [Key points for esthetic rehabilitation of anterior teeth using chair-side computer aided design and computer aided manufacture technique].

    PubMed

    Yang, J; Feng, H L

    2018-04-09

    With the rapid development of chair-side computer-aided design and computer-aided manufacture (CAD/CAM) technology, its accuracy and operability have been greatly improved in recent years. Chair-side CAD/CAM systems can produce all kinds of indirect restorations and have the advantages of rapid, accurate and stable production. The technology has become the future development direction of stomatology. This paper describes the clinical application of chair-side CAD/CAM technology for anterior aesthetic restorations from the aspects of shade and shape.

  5. [Clinical skills and outcomes of chair-side computer aided design and computer aided manufacture system].

    PubMed

    Yu, Q

    2018-04-09

    Computer aided design and computer aided manufacture (CAD/CAM) technology is an oral digital system applied to clinical diagnosis and treatment. It overturns the traditional pattern and provides a solution for restoring a defective tooth quickly and efficiently. In this paper we mainly discuss the clinical skills required by chair-side CAD/CAM systems, including tooth preparation, digital impressions, the three-dimensional design of the prosthesis, numerical control machining, clinical bonding and so on, and at the same time review the outcomes of several common kinds of materials.

  6. Quantifying the similarity of seismic polarizations

    NASA Astrophysics Data System (ADS)

    Jones, Joshua P.; Eaton, David W.; Caffagni, Enrico

    2016-02-01

    Assessing the similarities of seismic attributes can help identify tremor, low signal-to-noise (S/N) signals and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values or known seismic sources. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in S/N ratio. Measuring polarization similarity allows easy identification of site noise and sensor misalignment and can help identify coherent noise and emergent or low S/N phase arrivals. Dissimilar azimuths during phase arrivals indicate misaligned horizontal components, dissimilar incidence angles during phase arrivals indicate misaligned vertical components and dissimilar linear polarization may indicate a secondary noise source. Using records of the Mw = 8.3 Sea of Okhotsk earthquake, from Canadian National Seismic Network broad-band sensors in British Columbia and Yukon Territory, Canada, and a vertical borehole array at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Time-frequency polarization similarities of borehole data suggest that a coherent noise source may have persisted above 8 Hz several months after peak resource extraction from a `flowback' type hydraulic fracture.
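
    To make the windowed, weighted-histogram comparison more tangible, here is a hedged Python sketch that estimates polarization azimuths from the three-component covariance matrix in short overlapping windows, weights them by rectilinearity, and compares two stations with a histogram-intersection score. The attribute choice, the folding of the 180-degree ambiguity, the window lengths, and the synthetic data are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def polarization_azimuths(z, n, e, win=200, step=50):
          """Azimuth (degrees) and rectilinearity weight for each short window."""
          azimuths, weights = [], []
          for i0 in range(0, len(z) - win, step):
              x = np.vstack([z[i0:i0 + win], n[i0:i0 + win], e[i0:i0 + win]])
              vals, vecs = np.linalg.eigh(np.cov(x))       # ascending eigenvalues
              v = vecs[:, -1]                              # principal polarization
              az = np.degrees(np.arctan2(v[2], v[1])) % 180.0  # fold 180-deg ambiguity
              rectilinearity = 1.0 - (vals[0] + vals[1]) / (2.0 * vals[2])
              azimuths.append(az)
              weights.append(max(rectilinearity, 0.0))
          return np.array(azimuths), np.array(weights)

      def histogram_similarity(az1, w1, az2, w2, bins=36):
          """Histogram intersection of two weighted azimuth distributions (1 = identical)."""
          h1, _ = np.histogram(az1, bins=bins, range=(0, 180), weights=w1)
          h2, _ = np.histogram(az2, bins=bins, range=(0, 180), weights=w2)
          return np.minimum(h1 / h1.sum(), h2 / h2.sum()).sum()

      # Hypothetical usage with synthetic three-component noise at two "stations".
      rng = np.random.default_rng(5)
      z, n, e = rng.standard_normal((3, 20000))
      az1, w1 = polarization_azimuths(z, n, e)
      az2, w2 = polarization_azimuths(z, 0.7 * n + 0.3 * e, e)   # "misaligned" sensor
      similarity = histogram_similarity(az1, w1, az2, w2)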

  7. Investigation of seismicity and related effects at NASA Ames-Dryden Flight Research Facility, Computer Center, Edwards, California

    NASA Technical Reports Server (NTRS)

    Cousineau, R. D.; Crook, R., Jr.; Leeds, D. J.

    1985-01-01

    This report discusses a geological and seismological investigation of the NASA Ames-Dryden Flight Research Facility site at Edwards, California. Results are presented as seismic design criteria, with design values of the pertinent ground motion parameters, probability of recurrence, and recommended analogous time-history accelerograms with their corresponding spectra. The recommendations apply specifically to the Dryden site and should not be extrapolated to other sites with varying foundation and geologic conditions or different seismic environments.

  8. Discovering new events beyond the catalogue—application of empirical matched field processing to Salton Sea geothermal field seismicity

    DOE PAGES

    Wang, Jingbo; Templeton, Dennise C.; Harris, David B.

    2015-07-30

    Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that could then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2–M0.8. Here, the increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in long-term sustainability and monitoring of managed underground reservoirs.

  9. Pattern recognition in volcano seismology - Reducing spectral dimensionality

    NASA Astrophysics Data System (ADS)

    Unglert, K.; Radic, V.; Jellinek, M.

    2015-12-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we evaluate whether a machine learning technique called Self-Organizing Maps (SOMs) can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. This could reduce the dimensions of the spectral space typically analyzed by orders of magnitude, and enable rapid processing and visualization. Preliminary results suggest that the temporal evolution of volcano seismicity at Kilauea Volcano, Hawai`i, can be reduced to as few as 2 spectral components by using a combination of SOMs and cluster analysis. We will further refine our methodology with several datasets from Hawai`i and Alaska, among others, and compare it to other techniques.

  10. Geophysical remote sensing of water reservoirs suitable for desalinization.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldridge, David Franklin; Bartel, Lewis Clark; Bonal, Nedra

    2009-12-01

    In many parts of the United States, as well as other regions of the world, competing demands for fresh water or water suitable for desalination are outstripping sustainable supplies. In these areas, new water supplies are necessary to sustain economic development and agricultural uses, as well as support expanding populations, particularly in the Southwestern United States. Increasing the supply of water will more than likely come through desalinization of water reservoirs that are not suitable for present use. Surface-deployed seismic and electromagnetic (EM) methods have the potential for addressing these critical issues within large volumes of an aquifer at a lower cost than drilling and sampling. However, for detailed analysis of the water quality, some sampling utilizing boreholes would be required with geophysical methods being employed to extrapolate these sampled results to non-sampled regions of the aquifer. The research in this report addresses using seismic and EM methods in two complementary ways to aid in the identification of water reservoirs that are suitable for desalinization. The first method uses the seismic data to constrain the earth structure so that detailed EM modeling can estimate the pore water conductivity, and hence the salinity. The second method utilizes the coupling of seismic and EM waves through the seismo-electric effect (conversion of seismic energy to electrical energy) and the electro-seismic effect (conversion of electrical energy to seismic energy) to estimate the salinity of the target aquifer. Analytic 1D solutions to coupled pressure and electric wave propagation demonstrate the types of waves one expects when using a seismic or electric source. A 2D seismo-electric/electro-seismic model is developed to demonstrate the coupled seismic and EM system. For finite-difference modeling, the seismic and EM wave propagation algorithms are on different spatial and temporal scales. We present a method to solve multiple, finite-difference physics problems that has application beyond the present use. A limited field experiment was conducted to assess the seismo-electric effect. Due to a variety of problems, the observation of the electric field due to a seismic source is not definitive.

  11. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms, and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.

  12. Seismic activity monitoring in the Izvorul Muntelui dam region

    NASA Astrophysics Data System (ADS)

    Borleanu, Felix; Otilia Placinta, Anca; Popa, Mihaela; Adelin Moldovan, Iren; Popescu, Emilia

    2016-04-01

    Earthquake occurrences near artificial water reservoirs are caused by stress variations due to the weight of the water, the weakening of fractures or faults, and increases in pore pressure in crustal rocks. In the present study we aim to investigate how the Izvorul Muntelui dam, located in the Eastern Carpathians, influences local seismicity. For this purpose we selected, from the seismic bulletins computed at the National Data Center of the National Institute for Earth Physics, Romania, crustal events that occurred between 984 and 2015 within 0.3 deg of the artificial lake. Subsequently, to improve the seismic monitoring of the region, we applied a cross-correlation detector to the continuous recordings of the Bicaz (BIZ) seismic station. Besides the tectonic events, we detected sources within this region that periodically generate artificial events. We could not establish the existence of a direct correlation between the water level variations and the natural seismicity of the investigated area.

  13. Strategies for the Creation, Design and Implementation of Effective Interactive Computer-Aided Learning Software in Numerate Business Subjects--The Byzantium Experience.

    ERIC Educational Resources Information Center

    Wilkinson-Riddle, G. J.; Patel, Ashok

    1998-01-01

    Discusses courseware development, including intelligent tutoring systems, under the Teaching and Learning Technology Programme and the Byzantium project that was designed to define computer-aided learning performance standards suitable for numerate business subjects; examine reasons to use computer-aided learning; and improve access to educational…

  14. Enhancing Engineering Computer-Aided Design Education Using Lectures Recorded on the PC

    ERIC Educational Resources Information Center

    McGrann, Roy T. R.

    2006-01-01

    Computer-Aided Engineering (CAE) is a course that is required during the third year in the mechanical engineering curriculum at Binghamton University. The primary objective of the course is to educate students in the procedures of computer-aided engineering design. The solid modeling and analysis program Pro/Engineer[TM] (PTC[R]) is used as the…

  15. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zucca, J J; Walter, W R; Rodgers, A J

    2008-11-19

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of Earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D Earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes two specific paths by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas. Seismic monitoring agencies are tasked with detection, location, and characterization of seismic activity in near real time. In the case of nuclear explosion monitoring or seismic hazard, decisions to further investigate a suspect event or to launch disaster relief efforts may rely heavily on real-time analysis and results. Because these are weighty decisions, monitoring agencies are regularly called upon to meticulously document and justify every aspect of their monitoring system. In order to meet this level of scrutiny and maintain operational robustness requirements, only mature technologies are considered for operational monitoring systems, and operational technology necessarily lags contemporary research. Current monitoring practice is to use relatively simple Earth models that generally afford analytical prediction of seismic observables (see Examples of Current Monitoring Practice below). Empirical relationships or corrections to predictions are often used to account for unmodeled phenomena, such as the generation of S-waves from explosions or the effect of 3-dimensional Earth structure on wave propagation. This approach produces fast and accurate predictions in areas where empirical observations are available. However, accuracy may diminish away from empirical data. Further, much of the physics is wrapped into an empirical relationship or correction, which limits the ability to fully understand the physical processes underlying the seismic observation. Every generation of seismology researchers works toward quantitative results, with leaders who are active at or near the forefront of what has been computationally possible. While recognizing that only a 3-dimensional model can capture the full physics of seismic wave generation and propagation in the Earth, computational seismology has, until recently, been limited to simplifying model parameterizations (e.g. 1D Earth models) that lead to efficient algorithms. What is different today is the fact that the largest and fastest machines are at last capable of evaluating the effects of generalized 3D Earth structure, at levels of detail that improve significantly over past efforts, with potentially wide application. Advances in numerical methods to compute travel times and complete seismograms for 3D models are enabling new ways to interpret available data. This includes algorithms such as the Fast Marching Method (Rawlinson and Sambridge, 2004) for travel time calculations and full waveform methods such as the spectral element method (SEM; Komatitsch et al., 2002, Tromp et al., 2005), higher order Galerkin methods (Kaser and Dumbser, 2006; Dumbser and Kaser, 2006) and advances in more traditional Cartesian finite difference methods (e.g. Pitarka, 1999; Nilsson et al., 2007). The ability to compute seismic observables using a 3D model is only half of the challenge; models must be developed that accurately represent true Earth structure. Indeed, advances in seismic imaging have followed improvements in 3D computing capability (e.g. Tromp et al., 2005; Rawlinson and Urvoy, 2006). Advances in seismic imaging methods have been fueled in part by theoretical developments and the introduction of novel approaches for combining different seismological observables, both of which can increase the sensitivity of observations to Earth structure. Examples of such developments are finite-frequency sensitivity kernels for body-wave tomography (e.g. Marquering et al., 1998; Montelli et al., 2004) and joint inversion of receiver functions and surface wave group velocities (e.g. Julia et al., 2000).

  16. Key Issues in Instructional Computer Graphics.

    ERIC Educational Resources Information Center

    Wozny, Michael J.

    1981-01-01

    Addresses key issues facing universities which plan to establish instructional computer graphics facilities, including computer-aided design/computer aided manufacturing systems, role in curriculum, hardware, software, writing instructional software, faculty involvement, operations, and research. Thirty-seven references and two appendices are…

  17. An automatic tsunami warning system: TREMORS application in Europe

    NASA Astrophysics Data System (ADS)

    Reymond, D.; Robert, S.; Thomas, Y.; Schindelé, F.

    1996-03-01

    An integrated system named TREMORS (Tsunami Risk Evaluation through seismic Moment of a Real-time System) has been installed at the EVORA station in Portugal, a country which has been affected by historical tsunamis. The system is based on a three-component long-period seismic station linked to an IBM-PC compatible computer running specific software. The goals of this system are the following: detect earthquakes, locate them, compute their seismic moment, and issue a seismic warning. The warnings are based on the seismic moment estimation, and all processing is performed automatically. The aim of this study is to check the quality of the estimates of the main parameters of interest for tsunami warning: the location, which depends on azimuth and distance, and finally the seismic moment, M 0, which controls the earthquake size. The sine qua non condition for obtaining an automatic location is that the three main seismic phases P, S, and R must be visible. This study gives satisfying results (automatic analysis): ± 5° errors in azimuth and epicentral distance, and a standard deviation of less than a factor of 2 for the seismic moment M 0.
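    A single-station location of the kind described relies on simple relations between phase arrival times and distance. The toy calculation below is purely illustrative and not the TREMORS algorithm itself: it converts an assumed S-minus-P time into an epicentral distance using assumed average P and S velocities.

```python
# Assumed average velocities (km/s) and hypothetical arrival-time picks (s).
vp, vs = 8.0, 4.5
t_p, t_s = 120.0, 215.0

# t_s - t_p = D/vs - D/vp  =>  D = (t_s - t_p) * vp * vs / (vp - vs)
distance_km = (t_s - t_p) * vp * vs / (vp - vs)
print(f"Estimated epicentral distance: {distance_km:.0f} km")
```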

  18. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but this technique also applies to a broad class of acceleration excitations which are applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis. They are: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing RIGID Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options are then invoked by setting parameter values in the bulk data.

  19. Scenarios for Evolving Seismic Crises: Possible Communication Strategies

    NASA Astrophysics Data System (ADS)

    Steacy, S.

    2015-12-01

    Recent advances in operational earthquake forecasting mean that we are very close to being able to confidently compute changes in earthquake probability as seismic crises develop. For instance, we now have statistical models such as ETAS and STEP which demonstrate considerable skill in forecasting earthquake rates, and recent advances in Coulomb-based models are also showing much promise. Communicating changes in earthquake probability is likely to be very difficult, however, as the absolute probability of a damaging event is likely to remain quite small despite a significant increase in the relative value. Here, we use a hybrid Coulomb/statistical model to compute probability changes for a series of earthquake scenarios in New Zealand. We discuss the strengths and limitations of the forecasts and suggest a number of possible mechanisms that might be used to communicate results in an actual developing seismic crisis.
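    Statistical forecast models of the ETAS family express the evolving earthquake rate as a background term plus an Omori-type contribution from every prior event. The sketch below evaluates that conditional intensity for a handful of hypothetical events; every parameter value is an illustrative assumption, not part of the scenarios described above.

```python
import numpy as np

# Illustrative ETAS parameters: background rate, productivity, Omori c and p, alpha, reference magnitude.
mu, K, c, p, alpha, m0 = 0.02, 0.05, 0.01, 1.1, 1.0, 3.0

def etas_rate(t, event_times, event_mags):
    """Conditional intensity lambda(t), in events per day, given past events (times in days)."""
    past = event_times < t
    dt = t - event_times[past]
    productivity = K * np.exp(alpha * (event_mags[past] - m0))
    return mu + np.sum(productivity * (dt + c) ** (-p))

events_t = np.array([0.0, 0.3, 1.2])       # hypothetical event times (days)
events_m = np.array([5.5, 4.0, 4.2])       # hypothetical magnitudes
print(etas_rate(2.0, events_t, events_m))  # forecast rate at t = 2 days
```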

  20. Seismic low-frequency-based calculation of reservoir fluid mobility and its applications

    NASA Astrophysics Data System (ADS)

    Chen, Xue-Hua; He, Zhen-Hua; Zhu, Si-Xin; Liu, Wei; Zhong, Wen-Li

    2012-06-01

    Low frequency content of seismic signals contains information related to the reservoir fluid mobility. Based on the asymptotic analysis theory of frequency-dependent reflectivity from a fluid-saturated poroelastic medium, we derive the computational implementation of reservoir fluid mobility and present the determination of optimal frequency in the implementation. We then calculate the reservoir fluid mobility using the optimal frequency instantaneous spectra at the low-frequency end of the seismic spectrum. The methodology is applied to synthetic seismic data from a permeable gas-bearing reservoir model and real land and marine seismic data. The results demonstrate that the fluid mobility shows excellent quality in imaging the gas reservoirs. It is feasible to detect the location and spatial distribution of gas reservoirs and reduce the non-uniqueness and uncertainty in fluid identification.

  1. New approach to detect seismic surface waves in 1Hz-sampled GPS time series

    PubMed Central

    Houlié, N.; Occhipinti, G.; Blanchard, T.; Shapiro, N.; Lognonné, P.; Murakami, M.

    2011-01-01

    Recently, co-seismic source characterization based on GPS measurements has been completed in the near- and far-field with remarkable results. However, the accuracy of the ground displacement measurement inferred from GPS phase residuals still depends on the distribution of satellites in the sky. We test here a method, based on double difference (DD) computations of the Line of Sight (LOS), that allows the detection of 3D co-seismic ground shaking. The DD method is quasi-analytically free of most of the intrinsic errors affecting GPS measurements. The seismic waves presented in this study produced DD amplitudes 4 and 7 times stronger than the background noise. The method is benchmarked using the GEONET GPS stations recording the Hokkaido Earthquake (2003 September 25th, Mw = 8.3). PMID:22355563
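    The double-difference combination itself is simple arithmetic: differencing line-of-sight residuals across two satellites and then across two stations cancels clock errors and common-mode terms. The station names, satellite IDs, and numbers below are arbitrary placeholders used only to show the combination.

```python
# Hypothetical LOS phase residuals (metres) for two stations and two satellites.
los = {("STA1", "G05"): 0.0123, ("STA1", "G12"): 0.0085,
       ("STA2", "G05"): 0.0104, ("STA2", "G12"): 0.0079}

# Single differences across satellites, then a double difference across stations.
sd_sta1 = los[("STA1", "G05")] - los[("STA1", "G12")]
sd_sta2 = los[("STA2", "G05")] - los[("STA2", "G12")]
dd = sd_sta1 - sd_sta2
print(f"Double-difference residual: {dd * 1000:.2f} mm")
```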

  2. Using 3D Visualization to Communicate Scientific Results to Non-scientists

    NASA Astrophysics Data System (ADS)

    Whipple, S.; Mellors, R. J.; Sale, J.; Kilb, D.

    2002-12-01

    If "a picture is worth a thousand words" then an animation is worth millions. 3D animations and visualizations are useful for geoscientists but are perhaps even more valuable for rapidly illustrating standard geoscience ideas and concepts (such as faults, seismicity patterns, and topography) to non-specialists. This is useful not only for purely educational needs but also in rapidly briefing decision makers where time may be critical. As a demonstration of this we juxtapose large geophysical datasets (e.g., Southern California seismicity and topography) with other large societal datasets (such as highways and urban areas), which allows an instant understanding of the correlations. We intend to work out a methodology to aid other datasets such as hospitals and bridges, for example, in an ongoing fashion. The 3D scenes we create from the separate datasets can be "flown" through and individual snapshots that emphasize the concepts of interest are quickly rendered and converted to formats accessible to all. Viewing the snapshots and scenes greatly aids non-specialists comprehension of the problems and tasks at hand. For example, seismicity clusters (such as aftershocks) and faults near urban areas are clearly visible. A simple "fly-by" through our Southern California scene demonstrates simple concepts such as the topographic features due to plate motion along faults, and the demarcation of the North American/Pacific Plate boundary by the complex fault system (e.g., Elsinore, San Jacinto and San Andreas faults) in Southern California.

  3. Computer Programming Languages and Expertise Needed by Practicing Engineers.

    ERIC Educational Resources Information Center

    Doelling, Irvin

    1980-01-01

    Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…

  4. Ground-motion signature of dynamic ruptures on rough faults

    NASA Astrophysics Data System (ADS)

    Mai, P. Martin; Galis, Martin; Thingbaijam, Kiran K. S.; Vyas, Jagdish C.

    2016-04-01

    Natural earthquakes occur on faults characterized by large-scale segmentation and small-scale roughness. This multi-scale geometrical complexity controls the dynamic rupture process, and hence strongly affects the radiated seismic waves and near-field shaking. For a fault system with given segmentation, the question arises as to what the conditions are for producing large-magnitude multi-segment ruptures, as opposed to smaller single-segment events. Similarly, for variable degrees of roughness, ruptures may be arrested prematurely or may break the entire fault. In addition, fault roughness induces rupture incoherence that determines the level of high-frequency radiation. Using HPC-enabled dynamic-rupture simulations, we generate physically self-consistent rough-fault earthquake scenarios (M~6.8) and their associated near-source seismic radiation. Because these computations are too expensive to be conducted routinely for simulation-based seismic hazard assessment, we strive to develop an effective pseudo-dynamic source characterization that produces (almost) the same ground-motion characteristics. Therefore, we examine how variable degrees of fault roughness affect rupture properties and the seismic wavefield, and develop a planar-fault kinematic source representation that emulates the observed dynamic behaviour. We propose an effective workflow for improved pseudo-dynamic source modelling that incorporates rough-fault effects and its associated high-frequency radiation in broadband ground-motion computation for simulation-based seismic hazard assessment.

  5. Seismic Propagation in the Kuriles/Kamchatka Region

    DTIC Science & Technology

    1980-07-25

    model the final profile is well-represented by a spline interpolation. Figure 7 shows the sampling grid used to input velocity perturbations due to the...A modification of Cagniard's method for solving seismic pulse problems, Appl. Sci. Res. B., 8, p. 349, 1960. Fuchs, K. and G. Muller, Computation of

  6. Seismic and Restoration Assessment of Monumental Masonry Structures

    PubMed Central

    Asteris, Panagiotis G.; Douvika, Maria G.; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-01-01

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally-demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a byzantine church that was built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained. PMID:28767073
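    Fragility analysis of the kind described above is usually summarized by fragility curves: the probability of reaching or exceeding a damage state, expressed as a lognormal function of a ground-motion intensity measure. The sketch below evaluates such a curve; the median capacity and dispersion are illustrative assumptions, not results for the Kaisariani Monastery.

```python
import numpy as np
from scipy.stats import lognorm

median_pga, beta = 0.35, 0.5        # assumed median capacity (g) and lognormal dispersion
curve = lognorm(s=beta, scale=median_pga)

# P(damage state reached | PGA) = Phi( ln(PGA / median) / beta )
for pga in (0.10, 0.35, 0.70):      # example intensity levels in g
    print(f"PGA = {pga:.2f} g -> exceedance probability {curve.cdf(pga):.2f}")
```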

  7. Seismic and Restoration Assessment of Monumental Masonry Structures.

    PubMed

    Asteris, Panagiotis G; Douvika, Maria G; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-08-02

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally-demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a byzantine church that was built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained.

  8. Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    NASA Astrophysics Data System (ADS)

    Traverso, A.; Lopez Torres, E.; Fantacci, M. E.; Cerello, P.

    2017-05-01

    Lung cancer is one of the most lethal types of cancer, largely because it is rarely diagnosed early enough. In fact, the detection of pulmonary nodules, potential lung cancers, in Computed Tomography scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest Computed Tomography scans. Despite the high level of technological development and the proven benefits on overall detection performance, the usage of Computer-Aided Diagnosis in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: the M5L WEB- and Cloud-based on-demand Computer-Aided Diagnosis. In addition, we show how the combination of traditional image processing techniques with state-of-the-art classification algorithms makes it possible to build a system whose performance could be considerably better than that of any Computer-Aided Diagnosis system developed so far. This outcome opens the possibility to use the CAD as clinical decision support for radiologists.

  9. Post-seismic velocity changes following the 2010 Mw 7.1 Darfield earthquake, New Zealand, revealed by ambient seismic field analysis

    NASA Astrophysics Data System (ADS)

    Heckels, R. EG; Savage, M. K.; Townend, J.

    2018-05-01

    Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.
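    The study above uses the moving-window cross-spectral method; a closely related and easier-to-sketch estimator is the "stretching" method, which finds the relative velocity change dv/v that best maps a reference correlation function onto the current one. Everything in the sketch below (the synthetic correlation functions, the imposed 0.14 per cent change, the trial grid) is an illustrative assumption.

```python
import numpy as np

def stretching_dvv(reference, current, t, trial_dvv):
    """Return the trial dv/v whose stretched reference best matches `current`.

    A relative velocity increase dv/v moves a feature at lag t to lag t*(1 - dv/v),
    so current(t) ~ reference(t*(1 + dv/v)) to first order.
    """
    best_cc, best_dvv = -np.inf, 0.0
    for dvv in trial_dvv:
        stretched = np.interp(t * (1.0 + dvv), t, reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_cc, best_dvv = cc, dvv
    return best_dvv, best_cc

t = np.linspace(0.0, 20.0, 2001)                       # lag time (s)
ref = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 10.0)  # synthetic reference correlation function
cur = np.interp(t * (1.0 + 0.0014), t, ref)            # current function with dv/v = +0.14 per cent

dvv, cc = stretching_dvv(ref, cur, t, np.linspace(-0.01, 0.01, 401))
print(f"recovered dv/v = {dvv * 100:.2f} per cent (cc = {cc:.3f})")
```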

  10. Improving Seismic Data Accessibility and Performance Using HDF Containers

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Wang, J.; Yang, R.

    2017-12-01

    The performance of computational geophysical data processing and forward modelling relies on both computational power and data. Significant efforts on developing new data formats and libraries have been made by the community, such as IRIS/PASSCAL and ASDF for data, and programs and utilities such as ObsPy and SPECFEM. The National Computational Infrastructure hosts a nationally significant geophysical data collection that is co-located with a high performance computing facility, providing an opportunity to investigate how to improve the data formats from both a data management and a performance point of view. This paper investigates how to enhance data usability from several perspectives: 1) propose a convention for the seismic (both active and passive) community to improve data accessibility and interoperability; 2) recommend the convention be used in the HDF container when data are made available in PH5 or ASDF formats; 3) provide tools to convert between various seismic data formats; 4) provide performance benchmark cases using the ObsPy library and SPECFEM3D to demonstrate how different data organizations, in terms of chunking size and compression, impact performance, by comparing new data formats such as PH5 and ASDF to traditional formats such as SEGY, SEED, and SAC. In this work we apply our knowledge and experience of data standards and conventions, such as CF and ACDD from the climate community, to the seismology community. The generic global attributes widely used in the climate community are combined with the existing conventions in the seismology community, such as CMT, QuakeML, StationXML, and the SEGY header convention. We also extend the convention by including provenance and benchmarking records so that the user can learn the footprint of the data together with its baseline performance. In practice, we convert example wide-angle reflection seismic data from SEGY to PH5 or ASDF by using the ObsPy and pyasdf libraries. This quantitatively demonstrates how accessibility can be improved if the seismic data are stored in an HDF container.
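    A minimal sketch of the SEGY-to-ASDF step mentioned above, assuming ObsPy and pyasdf are installed. The file names are placeholders, and real SEG-Y trace headers usually lack the network and station codes that ASDF expects, so dummy codes are assigned here; this is not the exact conversion tool described in the abstract.

```python
from obspy import read
import pyasdf

# Read an active-source SEG-Y line (hypothetical file name).
st = read("line01.segy", format="SEGY")

# SEG-Y headers rarely carry SEED-style ids, so assign placeholder codes.
for i, tr in enumerate(st):
    tr.stats.network = "XX"
    tr.stats.station = f"R{i:04d}"

# Write the traces into an ASDF (HDF5) container with gzip compression.
ds = pyasdf.ASDFDataSet("line01.h5", compression="gzip-3")
ds.add_waveforms(st, tag="raw_recording")
print(ds)
```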

  11. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    ERIC Educational Resources Information Center

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  12. Imaging Critical Zone Using High Frequency Rayleigh Wave Group Velocity Measurements Extracted from Ambient Seismic Fields Gathered With 2400 Seismic Nodes in Southeastern Wyoming.

    NASA Astrophysics Data System (ADS)

    Keifer, I. S.; Dueker, K. G.

    2016-12-01

    In an effort to characterize critical zone development in varying regions, seismologists conduct seismic surveys to help constrain critical zone properties, e.g. porosity and regolith thickness. A limitation of traditional critical zone seismology is that data are normally collected along lines to generate two-dimensional transects of the subsurface seismic velocity, even though the critical zone structure is 3D. Hence, we deployed six 2D seismic arrays in southeastern Wyoming to gather ambient seismic fields so that 3D shear velocity models could be produced. The arrays were made up of nominally 400 seismic stations arranged in a 200-meter square grid layout. Each array produced a half-terabyte data volume, so a premium was placed on computational efficiency throughout this study to handle the roughly 65 billion samples recorded by each array. The ambient fields were cross-correlated on the Yellowstone supercomputer using the pSIN code (Chen et al., 2016), which decreased correlation run times by a factor of 300 with respect to workstation computers. Group delay times extracted from cross-correlations using 8 Hz frequency bands from 10 Hz to 100 Hz show frequency dispersion at sites with shallow regolith underlain by granite bedrock. Dimensionally, the group velocity map inversion is overdetermined, even after extensive culling of spurious group delay times. Model resolution matrices for our six arrays show values > 0.7 for most of the model domain, approaching unity at the center of the model domain; we are therefore confident that we have an adequate number of rays covering our array space and should experience minimal smearing of the resulting model when the inverse solution is applied to the data. After inverting for the group velocity maps, a second inversion of the group velocity maps is performed for the 3D shear velocity model. This inversion is underdetermined and a second-order Tikhonov regularization is used to obtain stable inverse images. Results will be presented.
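    The regularized inversion step mentioned above amounts to solving a damped least-squares system with a roughness penalty: (G^T G + lambda^2 L^T L) m = G^T d, with L a second-difference operator. The sketch below sets up a fully synthetic version; the kernel matrix, noise level, and regularization weight are all illustrative assumptions rather than the actual tomographic system.

```python
import numpy as np

rng = np.random.default_rng(2)
n_data, n_model = 300, 80
G = rng.standard_normal((n_data, n_model))          # stand-in sensitivity (ray-path) matrix
m_true = np.sin(np.linspace(0, 3 * np.pi, n_model)) # smooth synthetic model
d = G @ m_true + 0.05 * rng.standard_normal(n_data) # noisy synthetic data

# Second-difference (roughness) operator for second-order Tikhonov regularization.
L = np.diff(np.eye(n_model), n=2, axis=0)
lam = 5.0                                           # assumed regularization weight

m_est = np.linalg.solve(G.T @ G + lam**2 * (L.T @ L), G.T @ d)
print("relative model misfit:", np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))
```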

  13. Big Data and High-Performance Computing in Global Seismology

    NASA Astrophysics Data System (ADS)

    Bozdag, Ebru; Lefebvre, Matthieu; Lei, Wenjie; Peter, Daniel; Smith, James; Komatitsch, Dimitri; Tromp, Jeroen

    2014-05-01

    Much of our knowledge of Earth's interior is based on seismic observations and measurements. Adjoint methods provide an efficient way of incorporating 3D full wave propagation in iterative seismic inversions to enhance tomographic images and thus our understanding of processes taking place inside the Earth. Our aim is to take adjoint tomography, which has been successfully applied to regional and continental scale problems, further to image the entire planet. This is one of the extreme imaging challenges in seismology, mainly due to the intense computational requirements and vast amount of high-quality seismic data that can potentially be assimilated. We have started low-resolution inversions (T > 30 s and T > 60 s for body and surface waves, respectively) with a limited data set (253 carefully selected earthquakes and seismic data from permanent and temporary networks) on Oak Ridge National Laboratory's Cray XK7 "Titan" system. Recent improvements in our 3D global wave propagation solvers, such as a GPU version of the SPECFEM3D_GLOBE package, will enable us to perform higher-resolution (T > 9 s) and longer duration (~180 m) simulations to take advantage of high-frequency body waves and major-arc surface waves, thereby improving imbalanced ray coverage resulting from the uneven global distribution of sources and receivers. Our ultimate goal is to use all earthquakes in the global CMT catalogue within the magnitude range of our interest and data from all available seismic networks. To take full advantage of computational resources, we need a solid framework to manage big data sets during numerical simulations, pre-processing (i.e., data requests and quality checks, processing data, window selection, etc.) and post-processing (i.e., pre-conditioning and smoothing kernels, etc.). We address the bottlenecks in our global seismic workflow, which mainly come from heavy I/O traffic during simulations and the pre- and post-processing stages, by defining new data formats for seismograms and outputs of our 3D solvers (i.e., meshes, kernels, seismic models, etc.) based on ORNL's ADIOS libraries. We will discuss our global adjoint tomography workflow on HPC systems as well as the current status of our global inversions.

  14. High-fidelity simulation capability for virtual testing of seismic and acoustic sensors

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Moran, Mark L.; Ketcham, Stephen A.; Lacombe, James; Anderson, Thomas S.; Symons, Neill P.; Aldridge, David F.; Marlin, David H.; Collier, Sandra L.; Ostashev, Vladimir E.

    2005-05-01

    This paper describes development and application of a high-fidelity, seismic/acoustic simulation capability for battlefield sensors. The purpose is to provide simulated sensor data so realistic that they cannot be distinguished by experts from actual field data. This emerging capability provides rapid, low-cost trade studies of unattended ground sensor network configurations, data processing and fusion strategies, and signatures emitted by prototype vehicles. There are three essential components to the modeling: (1) detailed mechanical signature models for vehicles and walkers, (2) high-resolution characterization of the subsurface and atmospheric environments, and (3) state-of-the-art seismic/acoustic models for propagating moving-vehicle signatures through realistic, complex environments. With regard to the first of these components, dynamic models of wheeled and tracked vehicles have been developed to generate ground force inputs to seismic propagation models. Vehicle models range from simple, 2D representations to highly detailed, 3D representations of entire linked-track suspension systems. Similarly detailed models of acoustic emissions from vehicle engines are under development. The propagation calculations for both the seismics and acoustics are based on finite-difference, time-domain (FDTD) methodologies capable of handling complex environmental features such as heterogeneous geologies, urban structures, surface vegetation, and dynamic atmospheric turbulence. Any number of dynamic sources and virtual sensors may be incorporated into the FDTD model. The computational demands of 3D FDTD simulation over tactical distances require massively parallel computers. Several example calculations of seismic/acoustic wave propagation through complex atmospheric and terrain environments are shown.
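    As a toy illustration of the FDTD propagation engines described above, the sketch below advances the 1D acoustic wave equation on a two-layer model with a second-order stencil and a Ricker source. The grid spacing, time step, velocities, and source parameters are illustrative assumptions chosen to satisfy the CFL stability condition, not values from the simulation capability itself.

```python
import numpy as np

nx, dx, dt, nt = 600, 1.0, 2.0e-4, 2500        # grid points, spacing (m), time step (s), steps
c = np.full(nx, 1500.0)                        # sound speed (m/s), upper layer
c[300:] = 2000.0                               # faster lower layer
assert c.max() * dt / dx < 1.0                 # CFL stability check

src_ix, f0 = 100, 25.0                         # source grid index and dominant frequency (Hz)
t = np.arange(nt) * dt
tau = np.pi * f0 * (t - 1.0 / f0)
src = (1.0 - 2.0 * tau**2) * np.exp(-tau**2)   # Ricker wavelet

p_old = np.zeros(nx)
p_cur = np.zeros(nx)
for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = (p_cur[2:] - 2.0 * p_cur[1:-1] + p_cur[:-2]) / dx**2
    p_new = 2.0 * p_cur - p_old + (c * dt) ** 2 * lap
    p_new[src_ix] += src[it] * dt**2           # inject the source term
    p_old, p_cur = p_cur, p_new

print("peak |pressure| on the grid after the run:", np.abs(p_cur).max())
```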

  15. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    PubMed

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the rapid development of computer technologies, computer-aided surgery has been widely used to minimize the risks and improve the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on current and future developments in computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  16. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  17. NREL Kicks Off Next Phase of Advanced Computer-Aided Battery Engineering |

    Science.gov Websites

    A news item dated March 16, 2016, announcing the next phase of NREL's advanced computer-aided battery engineering work, which involves its multi-scale multi-domain (GH-MSMD) model framework for lithium-ion (Li-ion) batteries.

  18. Defense Acquisitions Acronyms and Terms

    DTIC Science & Technology

    2012-12-01

    Excerpt from the acronym list: Computer-Aided Design; CADD Computer-Aided Design and Drafting; CAE Component Acquisition Executive; Computer-Aided Engineering; CAIV Cost As an... Radiation to Ordnance; HFE Human Factors Engineering; HHA Health Hazard Assessment; HNA Host-Nation Approval; HNS Host-Nation Support; HOL High-Order... Engineering Change Proposal; VHSIC Very High Speed Integrated Circuit; VLSI Very Large Scale Integration; VOC Volatile Organic Compound; W; WAN Wide

  19. [The automatic iris map overlap technology in computer-aided iridiagnosis].

    PubMed

    He, Jia-feng; Ye, Hu-nian; Ye, Miao-yuan

    2002-11-01

    In the paper, iridology and computer-aided iridiagnosis technologies are briefly introduced and the extraction method of the collarette contour is then investigated. The iris map can be overlapped on the original iris image based on collarette contour extraction. The research on collarette contour extraction and iris map overlap is of great importance to computer-aided iridiagnosis technologies.

  20. New Paradigms for Computer Aids to Invention.

    ERIC Educational Resources Information Center

    Langston, M. Diane

    Many people are interested in computer aids to rhetorical invention and want to know how to evaluate an invention aid, what the criteria are for a good one, and how to assess the trade-offs involved in buying one product or another. The frame of reference for this evaluation is an "old paradigm," which treats the computer as if it were…

  1. Multimedia Image Technology and Computer Aided Manufacturing Engineering Analysis

    NASA Astrophysics Data System (ADS)

    Nan, Song

    2018-03-01

    Since the reform and opening up, science and technology in China have developed continuously, and more and more advanced technologies have emerged amid a trend toward diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China. From the perspective of scientific and technological advancement and development, multimedia image technology has a very positive influence on the application and development of computer-aided manufacturing engineering, both in what it can do and in how that capability is put to use. Therefore, this paper starts from the concept of multimedia image technology to analyze its application in computer-aided manufacturing engineering.

  2. Three-Dimensional Computational Fluid Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haworth, D.C.; O'Rourke, P.J.; Ranganathan, R.

    1998-09-01

    Computational fluid dynamics (CFD) is one discipline falling under the broad heading of computer-aided engineering (CAE). CAE, together with computer-aided design (CAD) and computer-aided manufacturing (CAM), comprise a mathematical-based approach to engineering product and process design, analysis and fabrication. In this overview of CFD for the design engineer, our purposes are three-fold: (1) to define the scope of CFD and motivate its utility for engineering, (2) to provide a basic technical foundation for CFD, and (3) to convey how CFD is incorporated into engineering product and process design.

  3. SEISRISK II; a computer program for seismic hazard estimation

    USGS Publications Warehouse

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
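    The Poisson assumption above leads directly to the standard hazard arithmetic: if a ground-motion level is exceeded at an annual rate r, the probability of at least one exceedance in T years is 1 - exp(-rT). The sketch below applies that relation; the rates are illustrative assumptions, not SEISRISK II output.

```python
import numpy as np

annual_rate = np.array([1e-2, 2e-3, 4e-4])   # assumed exceedance rates for three ground-motion levels
T = 50.0                                     # exposure time in years

p_exceed = 1.0 - np.exp(-annual_rate * T)
for r, p in zip(annual_rate, p_exceed):
    print(f"rate {r:.0e}/yr -> {p * 100:.1f} % chance of at least one exceedance in {T:.0f} yr")
```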

  4. Viscoelastic Finite Difference Modeling Using Graphics Processing Units

    NASA Astrophysics Data System (ADS)

    Fabien-Ouellet, G.; Gloaguen, E.; Giroux, B.

    2014-12-01

    Full waveform seismic modeling requires a huge amount of computing power that still challenges today's technology. This limits the applicability of powerful processing approaches in seismic exploration like full-waveform inversion. This paper explores the use of Graphics Processing Units (GPU) to compute a time-based finite-difference solution to the viscoelastic wave equation. The aim is to investigate whether the adoption of GPU technology can significantly reduce the computing time of simulations. The code presented herein is based on the freely accessible 2D software of Bohlen (2002), provided under the GNU General Public License. This implementation is based on a second-order centred-difference scheme to approximate time derivatives and staggered-grid schemes with centred differences of order 2, 4, 6, 8, and 12 for spatial derivatives. The code is fully parallel and is written using the Message Passing Interface (MPI), and it thus supports simulations of vast seismic models on a cluster of CPUs. To port the code from Bohlen (2002) to GPUs, the OpenCL framework was chosen for its ability to work on both CPUs and GPUs and its adoption by most GPU manufacturers. In our implementation, OpenCL works in conjunction with MPI, which allows computations on a cluster of GPUs for large-scale model simulations. We tested our code for model sizes between 100² and 6000² elements. Comparison shows a decrease in computation time of more than two orders of magnitude between the GPU implementation run on an AMD Radeon HD 7950 and the CPU implementation run on a 2.26 GHz Intel Xeon Quad-Core. The speed-up varies depending on the order of the finite-difference approximation and generally increases for higher orders. Increasing speed-ups are also obtained for increasing model size, which can be explained by kernel overheads and delays introduced by memory transfers to and from the GPU through the PCI-E bus. These tests indicate that the GPU memory size and the slow memory transfers are the limiting factors of our GPU implementation. Those results show the benefits of using GPUs instead of CPUs for time-based finite-difference seismic simulations. The reductions in computation time and in hardware costs are significant and open the door for new approaches in seismic inversion.

  5. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  6. Buying CAM.

    ERIC Educational Resources Information Center

    Meloy, Jim; And Others

    1990-01-01

    The relationship between computer-aided design (CAD), computer-aided manufacturing (CAM), and computer numerical control (CNC) computer applications is described. Tips for helping educate the CAM buyer on what to look for and what to avoid when searching for the most appropriate instructional CAM package are provided. (KR)

  7. Computer-Aided Design and Computer-Aided Manufacturing Hydroxyapatite/Epoxide Acrylate Maleic Compound Construction for Craniomaxillofacial Bone Defects.

    PubMed

    Zhang, Lei; Shen, Shunyao; Yu, Hongbo; Shen, Steve Guofang; Wang, Xudong

    2015-07-01

    The aim of this study was to investigate the use of computer-aided design and computer-aided manufacturing hydroxyapatite (HA)/epoxide acrylate maleic (EAM) compound construction artificial implants for craniomaxillofacial bone defects. Computed tomography, computer-aided design/computer-aided manufacturing and three-dimensional reconstruction, as well as rapid prototyping were performed in 12 patients between 2008 and 2013. The customized HA/EAM compound artificial implants were manufactured through selective laser sintering using a rapid prototyping machine into the exact geometric shapes of the defect. The HA/EAM compound artificial implants were then implanted during surgical reconstruction. Color-coded superimpositions demonstrated the discrepancy between the virtual plan and achieved results using Geomagic Studio. As a result, the HA/EAM compound artificial bone implants were perfectly matched with the facial areas that needed reconstruction. The postoperative aesthetic and functional results were satisfactory. The color-coded superimpositions demonstrated good consistency between the virtual plan and achieved results. The three-dimensional maximum deviation is 2.12 ± 0.65  mm and the three-dimensional mean deviation is 0.27 ± 0.07  mm. No facial nerve weakness or pain was observed at the follow-up examinations. Only 1 implant had to be removed 2 months after the surgery owing to severe local infection. No other complication was noted during the follow-up period. In conclusion, computer-aided, individually fabricated HA/EAM compound construction artificial implant was a good craniomaxillofacial surgical technique that yielded improved aesthetic results and functional recovery after reconstruction.

  8. Accuracy evaluation of metal copings fabricated by computer-aided milling and direct metal laser sintering systems

    PubMed Central

    Lee, Wan-Sun; Kim, Woong-Chul

    2015-01-01

    PURPOSE To assess the marginal and internal gaps of the copings fabricated by computer-aided milling and direct metal laser sintering (DMLS) systems in comparison to casting method. MATERIALS AND METHODS Ten metal copings were fabricated by casting, computer-aided milling, and DMLS. Seven mesiodistal and labiolingual positions were then measured, and each of these were divided into the categories; marginal gap (MG), cervical gap (CG), axial wall at internal gap (AG), and incisal edge at internal gap (IG). Evaluation was performed by a silicone replica technique. A digital microscope was used for measurement of silicone layer. Statistical analyses included one-way and repeated measure ANOVA to test the difference between the fabrication methods and categories of measured points (α=.05), respectively. RESULTS The mean gap differed significantly with fabrication methods (P<.001). Casting produced the narrowest gap in each of the four measured positions, whereas CG, AG, and IG proved narrower in computer-aided milling than in DMLS. Thus, with the exception of MG, all positions exhibited a significant difference between computer-aided milling and DMLS (P<.05). CONCLUSION Although the gap was found to vary with fabrication methods, the marginal and internal gaps of the copings fabricated by computer-aided milling and DMLS fell within the range of clinical acceptance (<120 µm). However, the statistically significant difference to conventional casting indicates that the gaps in computer-aided milling and DMLS fabricated restorations still need to be further reduced. PMID:25932310

  9. Accuracy evaluation of metal copings fabricated by computer-aided milling and direct metal laser sintering systems.

    PubMed

    Park, Jong-Kyoung; Lee, Wan-Sun; Kim, Hae-Young; Kim, Woong-Chul; Kim, Ji-Hwan

    2015-04-01

    To assess the marginal and internal gaps of the copings fabricated by computer-aided milling and direct metal laser sintering (DMLS) systems in comparison to casting method. Ten metal copings were fabricated by casting, computer-aided milling, and DMLS. Seven mesiodistal and labiolingual positions were then measured, and each of these were divided into the categories; marginal gap (MG), cervical gap (CG), axial wall at internal gap (AG), and incisal edge at internal gap (IG). Evaluation was performed by a silicone replica technique. A digital microscope was used for measurement of silicone layer. Statistical analyses included one-way and repeated measure ANOVA to test the difference between the fabrication methods and categories of measured points (α=.05), respectively. The mean gap differed significantly with fabrication methods (P<.001). Casting produced the narrowest gap in each of the four measured positions, whereas CG, AG, and IG proved narrower in computer-aided milling than in DMLS. Thus, with the exception of MG, all positions exhibited a significant difference between computer-aided milling and DMLS (P<.05). Although the gap was found to vary with fabrication methods, the marginal and internal gaps of the copings fabricated by computer-aided milling and DMLS fell within the range of clinical acceptance (<120 µm). However, the statistically significant difference to conventional casting indicates that the gaps in computer-aided milling and DMLS fabricated restorations still need to be further reduced.

  10. Instantaneous Frequency Attribute Comparison

    NASA Astrophysics Data System (ADS)

    Yedlin, M. J.; Margrave, G. F.; Ben Horin, Y.

    2013-12-01

    The instantaneous seismic data attribute provides a different means of seismic interpretation for all types of seismic data. It first came to the fore in exploration seismology in the classic paper of Taner et al (1979), entitled "Complex seismic trace analysis". Subsequently, a vast literature has accumulated on the subject, which was given an excellent review by Barnes (1992). In this research we will compare two different methods of computation of the instantaneous frequency. The first method is based on the original idea of Taner et al (1979) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method is based on the computation of the power centroid of the time-frequency spectrum, obtained using either the Gabor Transform as computed by Margrave et al (2011) or the Stockwell Transform as described by Stockwell et al (1996). We will apply both methods to exploration seismic data and the DPRK events recorded in 2006 and 2013. In applying the classical analytic signal technique, which is known to be unstable due to the division by the square of the envelope, we will incorporate the stabilization and smoothing method proposed in the two papers of Fomel (2007). This method employs linear inverse theory regularization coupled with the application of an appropriate data smoother. The centroid method application is straightforward and is based on the very complete theoretical analysis provided in elegant fashion by Cohen (1995). While the results of the two methods are very similar, noticeable differences are seen at the data edges. This is most likely due to the edge effects of the smoothing operator in the Fomel method, which is more computationally intensive when an optimal search of the regularization parameter is done. An advantage of the centroid method is the intrinsic smoothing of the data, which is inherent in the sliding window application used in all Short-Time Fourier Transform methods. The Fomel technique has a larger CPU run-time, resulting from the necessary matrix inversion. Barnes, Arthur E. "The calculation of instantaneous frequency and instantaneous bandwidth." Geophysics, 57.11 (1992): 1520-1524. Fomel, Sergey. "Local seismic attributes." Geophysics, 72.3 (2007): A29-A33. Fomel, Sergey. "Shaping regularization in geophysical-estimation problems." Geophysics, 72.2 (2007): R29-R36. Stockwell, Robert Glenn, Lalu Mansinha, and R. P. Lowe. "Localization of the complex spectrum: the S transform." Signal Processing, IEEE Transactions on, 44.4 (1996): 998-1001. Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics, 44.6 (1979): 1041-1063. Cohen, Leon. "Time frequency analysis theory and applications." USA: Prentice Hall, (1995). Margrave, Gary F., Michael P. Lamoureux, and David C. Henley. "Gabor deconvolution: Estimating reflectivity by nonstationary deconvolution of seismic data." Geophysics, 76.3 (2011): W15-W30.
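    A minimal sketch of the two estimators being compared, applied to a synthetic chirp rather than real exploration or DPRK data: (1) the derivative of the analytic-signal phase (the Taner-style attribute, shown here without the Fomel-style regularization), and (2) the power centroid of a short-time Fourier spectrogram standing in for a Gabor or Stockwell transform. The sampling rate and window parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert, spectrogram

fs = 500.0
t = np.arange(0.0, 4.0, 1.0 / fs)
x = np.sin(2.0 * np.pi * (5.0 * t + 5.0 * t**2))          # chirp with f(t) = 5 + 10 t Hz

# (1) Analytic-signal estimate: f_inst = (1 / (2*pi)) * d(phase)/dt
phase = np.unwrap(np.angle(hilbert(x)))
f_hilbert = np.gradient(phase, 1.0 / fs) / (2.0 * np.pi)

# (2) Power-centroid estimate from a short-time Fourier spectrogram
f, tau, Sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)
f_centroid = (f[:, None] * Sxx).sum(axis=0) / Sxx.sum(axis=0)

print("Hilbert estimate near t = 2 s:  %.1f Hz" % f_hilbert[int(2.0 * fs)])
print("Centroid estimate near t = 2 s: %.1f Hz" % f_centroid[np.argmin(np.abs(tau - 2.0))])
```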

  11. Identifying High Potential Well Targets with 3D Seismic and Mineralogy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellors, R. J.

    2015-10-30

    Seismic reflection is the primary tool used in petroleum exploration and production, but its use in geothermal exploration is less standard, in part due to cost but also due to the challenges in identifying the highly permeable zones essential for economic hydrothermal systems [e.g. Louie et al., 2011; Majer, 2003]. Newer technology, such as wireless sensors and low-cost high-performance computing, has helped reduce the cost and effort needed to conduct 3D surveys. The second difficulty, identifying permeable zones, has been less tractable so far. Here we report on the use of seismic attributes from a 3D seismic survey to identify and map permeable zones in a hydrothermal area.

  12. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, Doug; Walter, William; Myers, Steve; Ford, Sean; Harris, Dave; Ruppert, Stan; Buttler, Dave; Hauk, Terri

    2013-04-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011), we computed the waveform correlation for event pairs in the LLNL database in two ways. First, we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second, we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in process and we will report on our findings in more detail at the meeting.
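
    A minimal sketch of the basic measurement behind this workflow (an assumed implementation, not the LLNL production code): band-pass two waveforms, then take the peak of their normalized cross-correlation as the similarity measure and the corresponding lag as a relative pick time.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt, correlate

        def bandpass(x, fs, lo, hi):
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            return sosfiltfilt(sos, x)

        def xcorr_max(a, b, fs):
            # Normalize so the peak approximates a Pearson correlation coefficient
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            cc = correlate(a, b, mode="full")
            k = np.argmax(np.abs(cc))
            lag = (k - (len(b) - 1)) / fs       # seconds; positive means a lags b
            return cc[k], lag

        fs = 40.0
        rng = np.random.default_rng(0)
        w1 = rng.standard_normal(2000)
        w2 = np.roll(w1, 12) + 0.3 * rng.standard_normal(2000)   # shifted, noisy copy

        c, lag = xcorr_max(bandpass(w1, fs, 1.0, 8.0), bandpass(w2, fs, 1.0, 8.0), fs)
        print(f"corr = {c:.2f}, lag = {lag:.3f} s")

    A pair whose coefficient exceeds a threshold such as the 0.6 used above would be retained for clustering or relative relocation.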

  13. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, D.; Walter, W. R.; Myers, S. C.; Ford, S. R.; Harris, D.; Ruppert, S.; Buttler, D.; Hauk, T. F.

    2012-12-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011) we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in process and we will report on our findings in more detail at the meeting.

  14. A Review of Developments in Computer-Based Systems to Image Teeth and Produce Dental Restorations

    PubMed Central

    Rekow, E. Dianne; Erdman, Arthur G.; Speidel, T. Michael

    1987-01-01

    Computer-aided design and manufacturing (CAD/CAM) make it possible to automate the creation of dental restorations. Currently practiced techniques are described. Three automated systems currently under development are described and compared. Advances in computer-aided design and computer-aided manufacturing (CAD/CAM) provide a new option for dentistry, creating an alternative technique for producing dental restorations. It is possible to create dental restorations that are automatically produced and meet or exceed current requirements for fit and occlusion.

  15. Photogrammetry and computer-aided piping design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keneflick, J.F.; Chirillo, R.D.

    1985-02-18

    Three-dimensional measurements taken from photographs of a plant model can be digitized and linked with computer-aided piping design. This can short-cut the design and construction of new plants and expedite repair and retrofitting projects. Some designers bridge the gap between model and computer by digitizing from orthographic prints obtained by orthophotography or by laser scanning of model sections. How the digitized valve and fitting data are then processed is described in this paper. The marriage of photogrammetry and computer-aided piping design can economically produce such drawings in numerical form.

  16. Integration of the Execution Support System for the Computer-Aided Prototyping System (CAPS)

    DTIC Science & Technology

    1990-09-01

    Integration of the Execution Support System for the Computer-Aided Prototyping System (CAPS), by Frank V. Palazzo, September 1990; thesis advisor: Luqi; Computer Science Department. Approved for public release.

  17. Influence of Computer-Aided Detection on Performance of Screening Mammography

    PubMed Central

    Fenton, Joshua J.; Taplin, Stephen H.; Carney, Patricia A.; Abraham, Linn; Sickles, Edward A.; D'Orsi, Carl; Berns, Eric A.; Cutter, Gary; Hendrick, R. Edward; Barlow, William E.; Elmore, Joann G.

    2011-01-01

    Background Computer-aided detection identifies suspicious findings on mammograms to assist radiologists. Since the Food and Drug Administration approved the technology in 1998, it has been disseminated into practice, but its effect on the accuracy of interpretation is unclear. Methods We determined the association between the use of computer-aided detection at mammography facilities and the performance of screening mammography from 1998 through 2002 at 43 facilities in three states. We had complete data for 222,135 women (a total of 429,345 mammograms), including 2351 women who received a diagnosis of breast cancer within 1 year after screening. We calculated the specificity, sensitivity, and positive predictive value of screening mammography with and without computer-aided detection, as well as the rates of biopsy and breast-cancer detection and the overall accuracy, measured as the area under the receiver-operating-characteristic (ROC) curve. Results Seven facilities (16%) implemented computer-aided detection during the study period. Diagnostic specificity decreased from 90.2% before implementation to 87.2% after implementation (P<0.001), the positive predictive value decreased from 4.1% to 3.2% (P = 0.01), and the rate of biopsy increased by 19.7% (P<0.001). The increase in sensitivity from 80.4% before implementation of computer-aided detection to 84.0% after implementation was not significant (P = 0.32). The change in the cancer-detection rate (including invasive breast cancers and ductal carcinomas in situ) was not significant (4.15 cases per 1000 screening mammograms before implementation and 4.20 cases after implementation, P = 0.90). Analyses of data from all 43 facilities showed that the use of computer-aided detection was associated with significantly lower overall accuracy than was nonuse (area under the ROC curve, 0.871 vs. 0.919; P = 0.005). Conclusions The use of computer-aided detection is associated with reduced accuracy of interpretation of screening mammograms. The increased rate of biopsy with the use of computer-aided detection is not clearly associated with improved detection of invasive breast cancer. PMID:17409321

  18. Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench

    NASA Astrophysics Data System (ADS)

    Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan

    2016-04-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with the help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space based on smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
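
    For the rate component of an SiS-type forecast, a minimal sketch of the commonly used seismogenic-index relation is shown below (the parameter values are purely illustrative, and the published model additionally includes a post-injection exponential decay and spatial smoothing):

        import numpy as np

        def expected_events(cum_volume_m3, sigma, b, m_min):
            """Seismogenic-index forecast: N(>= m_min) = V * 10**(sigma - b * m_min)."""
            return cum_volume_m3 * 10.0 ** (sigma - b * m_min)

        hours = np.arange(0, 120)
        rate_m3_per_hr = 50.0
        V = np.cumsum(np.full(hours.shape, rate_m3_per_hr))   # cumulative injected volume

        sigma, b = -2.0, 1.3          # illustrative seismogenic index and b-value
        print(expected_events(V[-1], sigma, b, m_min=1.0))    # expected number of M >= 1 events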

  19. Seismic waveform sensitivity to global boundary topography

    NASA Astrophysics Data System (ADS)

    Colombi, Andrea; Nissen-Meyer, Tarje; Boschi, Lapo; Giardini, Domenico

    2012-09-01

    We investigate the implications of lateral variations in the topography of global seismic discontinuities, in the framework of high-resolution forward modelling and seismic imaging. We run 3-D wave-propagation simulations accurate at periods of 10 s and longer, with Earth models including core-mantle boundary topography anomalies of ~1000 km spatial wavelength and up to 10 km height. We obtain very different waveform signatures for PcP (reflected) and Pdiff (diffracted) phases, supporting the theoretical expectation that the latter are sensitive primarily to large-scale structure, whereas the former only to small scale, where large and small are relative to the frequency. PcP at 10 s seems to be well suited to map such a small-scale perturbation, whereas Pdiff at the same frequency carries faint signatures that do not allow any tomographic reconstruction. Only at higher frequencies does the signature become stronger. We present a new algorithm to compute sensitivity kernels relating seismic traveltimes (measured by cross-correlation of observed and theoretical seismograms) to the topography of seismic discontinuities at any depth in the Earth using full 3-D wave propagation. Calculation of accurate finite-frequency sensitivity kernels is notoriously expensive, but we reduce computational costs drastically by limiting ourselves to spherically symmetric reference models, and exploiting the axial symmetry of the resulting propagating wavefield that collapses to a 2-D numerical domain. We compute and analyse a suite of kernels for upper and lower mantle discontinuities that can be used for finite-frequency waveform inversion. The PcP and Pdiff sensitivity footprints are in good agreement with the results obtained by cross-correlating perturbed and unperturbed seismograms, validating our approach against full 3-D modelling to invert for such structures.

  20. Combining mineral physics with seismic observations: What can we deduce about the thermochemical structure of the Earth's deep interior?

    NASA Astrophysics Data System (ADS)

    Cobden, L. J.

    2017-12-01

    Mineral physics provides the essential link between seismic observations of the Earth's interior, and laboratory (or computer-simulated) measurements of rock properties. In this presentation I will outline the procedure for quantitative conversion from thermochemical structure to seismic structure (and vice versa) using the latest datasets from seismology and mineralogy. I will show examples of how this method can allow us to infer major chemical and dynamic properties of the deep mantle. I will also indicate where uncertainties and limitations in the data require us to exercise caution, in order not to "over-interpret" seismic observations. Understanding and modelling these uncertainties serves as a useful guide for mineralogists to ascertain which mineral parameters are most useful in seismic interpretation, and enables seismologists to optimise their data assembly and inversions for quantitative interpretations.

  1. Computers at the Albuquerque Seismological Laboratory

    USGS Publications Warehouse

    Hoffman, J.

    1979-01-01

    The Worldwide Standardized Seismograph Network (WWSSN) is managed by the U.S. Geological Survey in Albuquerque, N. Mex. It consists of a global network of seismographs housed in seismic observatories throughout the world. Important recent additions to this network are the Seismic Research Observatories (SRO), which combine a borehole seismometer with a modern digital data recording system.

  2. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  3. Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 1

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr. (Compiler)

    1989-01-01

    Control/Structures Integration program software needs, computer aided control engineering for flexible spacecraft, computer aided design, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software for flexible structures and robots are among the topics discussed.

  4. Computer aided field editing in the DHS context: the Turkey experiment.

    PubMed

    Cushing, J; Loaiza, E

    1994-01-01

    "In this study two types of field editing used during the Turkey Demographic and Health Survey are compared. These two types of editing are computer aided field editing and manual editing. It is known that manual editing by field editors is a tedious job in which errors especially on skip questions can be missed; however, with the aid of computers field editors could quickly find all occasions on which an interviewer incorrectly followed a skip instruction. At the end of the experiment it has been found...that the field editing done with the aid of a notebook computer was consistently better than that done in the standard manual manner." (SUMMARY IN TUR) excerpt

  5. Survey evaluation and design (SED): A case study in Garden Banks, Gulf of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, G.; Hannan, A.; Mann, A.D.

    1995-12-31

    Hydrocarbon exploration in the Gulf of Mexico has reached its mature stages. Exploration objectives such as deep stratigraphic and pre-salt traps are becoming more dominant. As the exploration targets change, earlier 3D seismic surveys, designed for different objectives, become less able to meet the demands of present-day exploration. Some areas of the Gulf of Mexico will require reacquisition of new 3D seismic data, redesigned to meet new objectives. Garden Banks is one such area. A major advantage of performing a survey evaluation and design (SED) in a mature area is the amount and diversity of available data. Geological profiles, reservoir characterizations, borehole wireline and surface seismic data all serve to aid in the survey design. Given the exploration history and geological objectives, the geophysical analyses of resolution, signal loss, noise, fold, acquisition geometry, migration aperture, velocity anisotropy and others may now be carried out in a much more specific manner. A thorough SED ensures that overall survey objectives will be met and reduces the possibility of overdesign on critical parameters. This generates the highest quality seismic survey for the most reasonable cost.

  6. Digital Seismic-Reflection Data from Eastern Rhode Island Sound and Vicinity, 1975-1980

    USGS Publications Warehouse

    McMullen, K.Y.; Poppe, L.J.; Soderberg, N.K.

    2009-01-01

    During 1975 and 1980, the U.S. Geological Survey (USGS) conducted two seismic-reflection surveys in Rhode Island Sound (RIS) aboard the research vessel Asterias: cruise ASTR75-June surveyed eastern RIS in 1975 and cruise AST-80-6B surveyed southern RIS in 1980. Data from these surveys were recorded in analog form and archived at the USGS Woods Hole Coastal and Marine Science Center's Data Library. In response to recent interest in the geology of RIS and in an effort to make the data more readily accessible while preserving the original paper records, the seismic data from these cruises were scanned and converted to black and white Tagged Image File Format and grayscale Portable Network Graphics images and SEG-Y data files. Navigation data were converted from U.S. Coast Guard Long Range Aids to Navigation time delays to latitudes and longitudes that are available in Environmental Systems Research Institute, Inc., shapefile format and as eastings and northings in space-delimited text format. This report complements two others that contain analog seismic-reflection data from RIS (McMullen and others, 2009) and from Long Island and Block Island Sounds (Poppe and others, 2002) that were converted into digital form.

  7. Quality indexing with computer-aided lexicography

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.

  8. COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS

    EPA Science Inventory

    The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...

  9. Computer-aided decision support systems for endoscopy in the gastrointestinal tract: a review.

    PubMed

    Liedlgruber, Michael; Uhl, Andreas

    2011-01-01

    Today, medical endoscopy is a widely used procedure to inspect the inner cavities of the human body. The advent of endoscopic imaging techniques, allowing the acquisition of images or videos, created the possibility for the development of a whole new branch of computer-aided decision support systems. Such systems aim at helping physicians to identify possibly malignant abnormalities more accurately. At the beginning of this paper, we give a brief introduction to the history of endoscopy, followed by introducing the main types of endoscopes that have emerged so far (flexible endoscope, wireless capsule endoscope, and confocal laser endomicroscope). We then give a brief introduction to computer-aided decision support systems specifically targeted at endoscopy in the gastrointestinal tract. Then we present general facts and figures concerning computer-aided decision support systems and summarize work specifically targeted at computer-aided decision support in the gastrointestinal tract. This summary is followed by a discussion of some common issues concerning the approaches reviewed and suggestions of possible ways to resolve them.

  10. Application of seismic-refraction techniques to hydrologic studies

    USGS Publications Warehouse

    Haeni, F.P.

    1986-01-01

    During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations, and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for applying seismic-refraction methods. These methods allow the economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of this technique in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.
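
    For the simplest configuration covered by refraction theory, a horizontal two-layer model, the depth to a high-velocity refractor (for example, the water table) follows from the intercept time of the refracted arrival. The sketch below is a textbook illustration with assumed velocities, not material from the manual itself:

        import math

        def refractor_depth(t_intercept, v1, v2):
            """z = (t_i / 2) * v1 * v2 / sqrt(v2**2 - v1**2)  (horizontal two-layer case)."""
            return 0.5 * t_intercept * v1 * v2 / math.sqrt(v2 ** 2 - v1 ** 2)

        v1, v2 = 500.0, 1800.0      # m/s: unsaturated vs. saturated deposits (assumed values)
        t_i = 0.020                 # s: intercept time read from a travel-time plot
        print(f"depth to refractor ~ {refractor_depth(t_i, v1, v2):.1f} m")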

  11. Application of seismic-refraction techniques to hydrologic studies

    USGS Publications Warehouse

    Haeni, F.P.

    1988-01-01

    During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high-seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for seismic-refraction methods. These methods allow economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of these techniques in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Furthermore, examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.

  12. A Thermo-Hydro-Mechanical coupled Numerical modeling of Injection-induced seismicity on a pre-existing fault

    NASA Astrophysics Data System (ADS)

    Kim, Jongchan; Archer, Rosalind

    2017-04-01

    In terms of energy development (oil, gas and geothermal fields) and environmental improvement (carbon dioxide sequestration), fluid injection into the subsurface has increased dramatically. As a side effect of these operations, the number of injection-induced seismic events has also risen significantly. It is known that the main causes of induced seismicity are changes in local shear and normal stresses as well as in pore pressure. This mechanism predominantly increases the probability of earthquake occurrence on permeable pre-existing fault zones. In this 2D fully coupled THM geothermal reservoir numerical simulation of injection-induced seismicity, we investigate the thermal, hydraulic and mechanical behavior of the fracture zone, considering a variety of 1) fault permeabilities, 2) injection rates and 3) injection temperatures, in order to identify the parameters contributing most to induced seismic activity. We also calculate the spatiotemporal variation of the Coulomb stress, which combines shear stress, normal stress and pore pressure, and finally forecast the seismicity rate on the fault zone using the seismicity rate model of Dieterich (1994).
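
    A minimal sketch of the Coulomb stress calculation referred to above (a generic formulation with an assumed friction coefficient, not the paper's coupled THM implementation):

        def coulomb_stress_change(d_tau, d_sigma_n, d_pore_pressure, mu=0.6):
            """
            delta_CFS = d_tau - mu * (d_sigma_n - d_pore_pressure)
            Convention here: compression positive for normal stress, so a pore-pressure
            increase reduces effective normal stress and pushes the fault toward failure.
            """
            return d_tau - mu * (d_sigma_n - d_pore_pressure)

        # Example: modest shear loading plus a 1 MPa pore-pressure increase from injection (Pa)
        print(coulomb_stress_change(d_tau=0.2e6, d_sigma_n=0.1e6, d_pore_pressure=1.0e6))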

  13. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

    This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movements of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movements of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps, and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database, http://earthquake.usgs.gov/regional/qfaults/. CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  14. Ambient seismic noise interferometry in Hawai'i reveals long-range observability of volcanic tremor

    USGS Publications Warehouse

    Ballmer, Silke; Wolfe, Cecily; Okubo, Paul G.; Haney, Matt; Thurber, Clifford H.

    2013-01-01

    The use of seismic noise interferometry to retrieve Green's functions and the analysis of volcanic tremor are both useful in studying volcano dynamics. Whereas seismic noise interferometry allows long-range extraction of interpretable signals from a relatively weak noise wavefield, the characterization of volcanic tremor often requires a dense seismic array close to the source. We here show that standard processing of seismic noise interferometry yields volcanic tremor signals observable over large distances exceeding 50 km. Our study comprises 2.5 yr of data from the U.S. Geological Survey Hawaiian Volcano Observatory short period seismic network. Examining more than 700 station pairs, we find anomalous and temporally coherent signals that obscure the Green's functions. The time windows and frequency bands of these anomalous signals correspond well with the characteristics of previously studied volcanic tremor sources at Pu'u 'Ō'ō and Halema'uma'u craters. We use the derived noise cross-correlation functions to perform a grid-search for source location, confirming that these signals are surface waves originating from the known tremor sources. A grid-search with only distant stations verifies that useful tremor signals can indeed be recovered far from the source. Our results suggest that the specific data processing in seismic noise interferometry—typically used for Green's function retrieval—can aid in the study of both the wavefield and source location of volcanic tremor over large distances. In view of using the derived Green's functions to image heterogeneity and study temporal velocity changes at volcanic regions, however, our results illustrate how care should be taken when contamination by tremor may be present.

  15. Manipulating the Geometric Computer-aided Design of the Operational Requirements-based Casualty Assessment Model within BRL-CAD

    DTIC Science & Technology

    2018-03-30

    ARL-TR-8336, March 2018, US Army Research Laboratory: Manipulating the Geometric Computer-aided Design of the Operational Requirements-based Casualty Assessment Model within BRL-CAD.

  16. Increasing productivity of the McAuto CAD/CAE system by user-specific applications programming

    NASA Technical Reports Server (NTRS)

    Plotrowski, S. M.; Vu, T. H.

    1985-01-01

    Significant improvements in the productivity of the McAuto Computer-Aided Design/Computer-Aided Engineering (CAD/CAE) system were achieved by applications programming using the system's own Graphics Interactive Programming language (GRIP) and the interface capabilities with the main computer on which the system resides. The GRIP programs for creating springs, bar charts, finite element model representations and aiding management planning are presented as examples.

  17. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  18. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    PubMed

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness.

  19. Computer Skill Acquisition and Retention: The Effects of Computer-Aided Self-Explanation

    ERIC Educational Resources Information Center

    Chi, Tai-Yin

    2016-01-01

    This research presents an experimental study to determine to what extent computer skill learners can benefit from generating self-explanation with the aid of different computer-based visualization technologies. Self-explanation was stimulated with dynamic visualization (Screencast), static visualization (Screenshot), or verbal instructions only,…

  20. Variscan deformation along the Teisseyre-Tornquist Zone in SE Poland: Thick-skinned structural inheritance or thin-skinned thrusting?

    NASA Astrophysics Data System (ADS)

    Krzywiec, P.; Gągała, Ł.; Mazur, S.; Słonka, Ł.; Kufrasa, M.; Malinowski, M.; Pietsch, K.; Golonka, J.

    2017-10-01

    Recently acquired seismic reflection data provide better insight into the structural style of the extensive sedimentary series overlying the SW slope of the East European Craton (EEC) in Poland. The two main seismic datasets - the POLCRUST-01 profile and the PolandSPAN survey - yielded contrasting thick- and thin-skinned structural models for the same structural units in SE Poland. We reattempt an interpretation of the POLCRUST-01 profile using techniques of cross-section balancing and restoration aided by 2D forward seismic modelling. The outcome is a thin-skinned structural model. This solution relies on a continuous top of the EEC crystalline basement, well represented in the seismic data, as well as on fragmentary yet conclusive seismic geometries in shallow depth intervals proving the Ediacaran-Palaeozoic series to be thrust and folded. A Variscan (late Carboniferous) compressional regime is consequently invoked to explain the thin-skinned structuring of the pre-Permian sedimentary pile and the >20 km of calculated shortening. We demonstrate the ambiguous nature of the top-basement irregularities previously used as indicators of basement-rooted vertical faulting. The tilt and abrupt increase of the top-basement taper under the thin-skinned belt are attributed to pre-Ordovician tectonic processes operating along the SW margin of the EEC. Post-rift subsidence and/or flexural loading giving rise to a broken foreland plate are invoked.

  1. Classifying elephant behaviour through seismic vibrations.

    PubMed

    Mortimer, Beth; Rees, William Lake; Koelemeijer, Paula; Nissen-Meyer, Tarje

    2018-05-07

    Seismic waves - vibrations within and along the Earth's surface - are ubiquitous sources of information. During propagation, physical factors can obscure information transfer via vibrations and influence propagation range [1]. Here, we explore how terrain type and background seismic noise influence the propagation of seismic vibrations generated by African elephants. In Kenya, we recorded the ground-based vibrations of different wild elephant behaviours, such as locomotion and infrasonic vocalisations [2], as well as natural and anthropogenic seismic noise. We employed techniques from seismology to transform the geophone recordings into source functions - the time-varying seismic signature generated at the source. We used computer modelling to constrain the propagation ranges of elephant seismic vibrations for different terrains and noise levels. Behaviours that generate a high force on a sandy terrain with low noise propagate the furthest, over the kilometre scale. Our modelling also predicts that specific elephant behaviours can be distinguished and monitored over a range of propagation distances and noise levels. We conclude that seismic cues have considerable potential for both behavioural classification and remote monitoring of wildlife. In particular, classifying the seismic signatures of specific behaviours of large mammals remotely in real time, such as elephant running, could inform on poaching threats. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Pattern Informatics Approach to Earthquake Forecasting in 3D

    NASA Astrophysics Data System (ADS)

    Toya, Y.; Tiampo, K. F.; Rundle, J. B.; Chen, C.; Li, H.; Klein, W.

    2009-05-01

    Natural seismicity is correlated across multiple spatial and temporal scales, but correlations in seismicity prior to a large earthquake are locally subtle (e.g. seismic quiescence) and often prominent at broad scale (e.g. seismic activation), resulting in local and regional seismicity patterns, e.g. a Mogi donut. Recognizing that patterns in seismicity rate reflect the regional dynamics of the directly unobservable crustal stresses, the Pattern Informatics (PI) approach was introduced by Tiampo et al. (2002) [Europhys. Lett., 60(3), 481-487] and Rundle et al. (2002) [PNAS, 99, suppl. 1, 2514-2521]. In this study, we expand the PI approach to forecasting earthquakes into the third, or vertical, dimension, and illustrate its further improvement in forecasting performance through case studies of both natural and synthetic data. The PI approach characterizes rapidly evolving spatio-temporal seismicity patterns as angular drifts of a unit state vector in a high-dimensional correlation space, and systematically identifies anomalous shifts in seismic activity with respect to the regional background. 3D PI analysis is particularly advantageous over 2D analysis in resolving vertically overlapped seismicity anomalies in a highly complex tectonic environment. Case studies will help to illustrate some important properties of the PI forecasting tool. [Submitted to: Concurrency and Computation: Practice and Experience, Wiley, Special Issue: ACES2008.]

  3. What's your relationship with computerized manufacturing technologies -- functional, dysfunctional or non-existent?

    Treesearch

    Jan Wiedenbeck; Jeff Parsons; Bruce Beeken

    2009-01-01

    Computer-aided manufacturing (CAM), in which computer-aided design (CAD) and computer numerically controlled (CNC) machining are integrated for the production of parts, became a viable option for the woodworking industry in the 1980s.

  4. WINCADRE (COMPUTER-AIDED DATA REVIEW AND EVALUATION)

    EPA Science Inventory

    WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows-based program designed for computer-assisted data validation. WinCADRE is a powerful tool which significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed ...

  5. Automated Fault Interpretation and Extraction using Improved Supplementary Seismic Datasets

    NASA Astrophysics Data System (ADS)

    Bollmann, T. A.; Shank, R.

    2017-12-01

    During the interpretation of seismic volumes, it is necessary to interpret faults along with horizons of interest. With the improvement of technology, the interpretation of faults can be expedited with the aid of different algorithms that create supplementary seismic attributes, such as semblance and coherency. These products highlight discontinuities, but still need a large amount of human interaction to interpret faults and are plagued by noise and stratigraphic discontinuities. Hale (2013) presents a method to improve on these datasets by creating what is referred to as a Fault Likelihood volume. In general, these volumes contain less noise and do not emphasize stratigraphic features. Instead, planar features within a specified strike and dip range are highlighted. Once a satisfactory Fault Likelihood Volume is created, extraction of fault surfaces is much easier. The extracted fault surfaces are then exported to interpretation software for QC. Numerous software packages have implemented this methodology with varying results. After investigating these platforms, we developed a preferred Automated Fault Interpretation workflow.
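
    As a minimal illustration of the kind of discontinuity attribute mentioned above (a generic semblance measure, assumed here for illustration, not Hale's fault-likelihood algorithm), low semblance across neighbouring traces within a sliding window flags candidate faults:

        import numpy as np

        def semblance(window):
            """window: (n_traces, n_samples). Semblance = energy of stack / sum of trace energies."""
            stack = window.sum(axis=0)
            num = (stack ** 2).sum()
            den = window.shape[0] * (window ** 2).sum()
            return num / (den + 1e-12)

        rng = np.random.default_rng(1)
        coherent = np.tile(rng.standard_normal(64), (5, 1))       # identical neighbouring traces
        faulted = coherent.copy()
        faulted[2:] = np.roll(faulted[2:], 8, axis=1)             # offset across a "fault"

        print(semblance(coherent), semblance(faulted))            # ~1.0 vs. a lower value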

  6. Geophysical examination of coal deposits

    NASA Astrophysics Data System (ADS)

    Jackson, L. J.

    1981-04-01

    Geophysical techniques for the solution of mining problems and as an aid to mine planning are reviewed. Techniques of geophysical borehole logging are discussed. The responses of the coal seams to logging tools are easily recognized on the logging records. Cores for laboratory analysis are cut from selected sections of the borehole. In addition, information about the density and chemical composition of the coal may be obtained. Surface seismic reflection surveys using two-dimensional arrays of seismic sources and detectors detect faults with throws as small as 3 m at depths of 800 m. In geologically disturbed areas, good results have been obtained from three-dimensional surveys. Smaller faults as far as 500 m in advance of the working face may be detected using in-seam seismic surveying conducted from a roadway or working face. Small disturbances are detected by pulse radar and continuous-wave electromagnetic methods, either from within boreholes or from underground. Other geophysical techniques, which exploit the electrical, magnetic, gravitational and geothermal properties of rocks, are also described.

  7. Spectral-Element Seismic Wave Propagation Codes for both Forward Modeling in Complex Media and Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Peter, D. B.; Tromp, J.; Komatitsch, D.; Lefebvre, M. P.

    2015-12-01

    We present both SPECFEM3D_Cartesian and SPECFEM3D_GLOBE open-source codes, representing high-performance numerical wave solvers simulating seismic wave propagation for local-, regional-, and global-scale application. These codes are suitable for both forward propagation in complex media and tomographic imaging. Both solvers compute highly accurate seismic wave fields using the continuous Galerkin spectral-element method on unstructured meshes. Lateral variations in compressional- and shear-wave speeds, density, as well as 3D attenuation Q models, topography and fluid-solid coupling are all readily included in both codes. For global simulations, effects due to rotation, ellipticity, the oceans, 3D crustal models, and self-gravitation are additionally included. Both packages provide forward and adjoint functionality suitable for adjoint tomography on high-performance computing architectures. We highlight the most recent release of the global version which includes improved performance, simultaneous MPI runs, OpenCL and CUDA support via an automatic source-to-source transformation library (BOAST), parallel I/O readers and writers for databases using ADIOS and seismograms using the recently developed Adaptable Seismic Data Format (ASDF) with built-in provenance. This makes our spectral-element solvers current state-of-the-art, open-source community codes for high-performance seismic wave propagation on arbitrarily complex 3D models. Together with these solvers, we provide full-waveform inversion tools to image the Earth's interior at unprecedented resolution.

  8. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting is spatially coherent across large regions of the continent.

  9. Broad-band seismic analysis and modeling of the 2015 Taan Fjord, Alaska landslide using Instaseis

    NASA Astrophysics Data System (ADS)

    Gualtieri, Lucia; Ekström, Göran

    2018-06-01

    We carry out a broad-band analysis of the seismic signals generated by a massive landslide that occurred near Icy Bay (Alaska) on 2015 October 17. The event generated seismic signals recorded globally. Using Instaseis, a recently developed tool for rapid computation of complete broad-band synthetic seismograms, we simulate the seismic wave propagation between the event and five seismic stations located around the landslide. By modeling the broad-band seismograms in the period band 5-200 s, we reconstruct by inversion a time-varying point force to characterize the landslide time history. We compute the broad-band spectrum of the landslide force history and find that it has a corner period of about 100 s, corresponding to the duration of sliding. In contrast with standard earthquakes, the landslide force spectrum below the corner frequency decays as ω, while the spectral amplitudes at higher frequencies are proportional to ω^-2, similar to the rate of spectral decay seen in earthquakes. From the inverted force history and an estimate of the final run-out distance, we deduce the mass, the trajectory and characteristics of the landslide dynamics associated with the centre of mass, such as acceleration, velocity, displacement and friction. Inferring an effective run-out distance of ~900 m from a satellite image, we estimate a landslide mass of ~150 million metric tons.
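
    A deliberately simplified 1-D sketch of the mass estimate described above (assumed toy numbers; the actual inversion uses the full three-component force history and trajectory): because the displacement obtained by twice integrating F(t)/m scales as 1/m, the mass can be chosen so that the modelled displacement of the centre of mass matches the observed run-out distance.

        import numpy as np

        dt = 1.0                                    # s, assumed sampling of the force history
        t = np.arange(0, 101) * dt
        F = 8.5e10 * np.sin(2 * np.pi * t / 100.0)  # toy force history (N): acceleration then deceleration

        def runout_for_mass(force, mass, dt):
            a = force / mass                        # Newton's second law for the sliding block
            v = np.cumsum(a) * dt                   # crude time integration
            d = np.cumsum(v) * dt
            return d[-1]

        observed_runout = 900.0                     # m, e.g. read from satellite imagery
        trial_mass = 1.0e11                         # kg, arbitrary trial value
        mass = trial_mass * runout_for_mass(F, trial_mass, dt) / observed_runout
        print(f"estimated mass ~ {mass:.2e} kg")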

  10. Assessment of seismic loading on structures based on airborne LiDAR data from the Kalochori urban area (N. Greece)

    NASA Astrophysics Data System (ADS)

    Rovithis, Emmanouil; Kirtas, Emmanouil; Marini, Eleftheria; Bliziotis, Dimitris; Maltezos, Evangelos; Pitilakis, Dimitris; Makra, Konstantia; Savvaidis, Alexandros

    2016-08-01

    Airborne LiDAR monitoring integrated with field data is employed to assess the fundamental period and the seismic loading of structures composing an urban area under prescribed earthquake scenarios. A piecewise workflow is adopted, combining geometrical data of the building stock derived from a LiDAR-based 3D city model, structural data from in-situ inspections of representative city blocks, and results of soil response analyses. The procedure is implemented in the residential area of Kalochori (west of Thessaloniki in Northern Greece). Special attention is paid to the in-situ inspection of the building stock in order to discriminate between actual buildings and man-made constructions that do not conform to seismic design codes, and to acquire additional building-stock data on structural materials, typologies and number of stories, which is not feasible with the LiDAR process alone. The processed LiDAR and field data are employed to compute the fundamental period of each building by means of code-defined formulas. Knowledge of soil conditions in the Kalochori area allows for soil response analyses to obtain the free-field response at the ground surface under earthquake scenarios with varying return periods. Upon combining the computed vibrational characteristics of the structures with the free-field response spectra, the seismic loading imposed on the structures of the urban area under investigation is derived for each of the prescribed seismic motions. Results are presented in a GIS environment in the form of spatially distributed spectral accelerations, with direct implications for seismic vulnerability studies of an urban area.
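
    As an example of the kind of code-defined formula mentioned above (a sketch using the Eurocode 8 approximate-period expression; the coefficient shown is the EC8 value for reinforced-concrete moment frames and is only illustrative of the approach):

        def fundamental_period(height_m, ct=0.075):
            """T = Ct * H**0.75; Ct = 0.075 is the EC8 value for RC moment frames (illustrative)."""
            return ct * height_m ** 0.75

        for h in (6.0, 9.0, 15.0):      # e.g. 2-, 3- and 5-storey buildings from the 3D city model
            print(h, round(fundamental_period(h), 2))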

  11. FAST TRACK PAPER: A construct of internal multiples from surface data only: the concept of virtual seismic events

    NASA Astrophysics Data System (ADS)

    Ikelle, Luc T.

    2006-02-01

    We here describe one way of constructing internal multiples from surface seismic data only. The key feature of our construct of internal multiples is the introduction of the concept of virtual seismic events. Virtual events here are events, which are not directly recorded in standard seismic data acquisition, but their existence allows us to construct internal multiples with scattering points at the sea surface; the standard construct of internal multiples does not include any scattering points at the sea surface. The mathematical and computational operations invoked in our construction of virtual events and internal multiples are similar to those encountered in the construction of free-surface multiples based on the Kirchhoff or Born scattering theory. For instance, our construct operates on one temporal frequency at a time, just like free-surface demultiple algorithms; other internal multiple constructs tend to require all frequencies for the computation of an internal multiple at a given frequency. It does not require any knowledge of the subsurface nor an explicit knowledge of specific interfaces that are responsible for the generation of internal multiples in seismic data. However, our construct requires that the data be divided into two, three or four windows to avoid generating primaries. This segmentation of the data also allows us to select a range of periods of internal multiples that one wishes to construct because, in the context of the attenuation of internal multiples, it is important to avoid generating short-period internal multiples that may constructively average to form primaries at the seismic scale.

  12. An automated cross-correlation based event detection technique and its application to surface passive data set

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

    In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
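
    A minimal sketch of the conventional detector and of the cross-trace similarity idea (an assumed implementation, not the authors' code): compute a running STA/LTA energy ratio per trace, then correlate the ratios from two stations so that only disturbances seen coherently across traces are flagged.

        import numpy as np

        def sta_lta(x, fs, sta_win=0.5, lta_win=10.0):
            """Centered running short-term / long-term average of trace energy."""
            e = x ** 2
            sta = np.convolve(e, np.ones(int(sta_win * fs)) / (sta_win * fs), mode="same")
            lta = np.convolve(e, np.ones(int(lta_win * fs)) / (lta_win * fs), mode="same")
            return sta / (lta + 1e-12)

        fs = 100.0
        rng = np.random.default_rng(2)
        event = np.zeros(6000)
        event[3000:3100] = 5 * rng.standard_normal(100)          # weak common "microseismic event"
        trace1 = rng.standard_normal(6000) + event
        trace2 = rng.standard_normal(6000) + event

        r1, r2 = sta_lta(trace1, fs), sta_lta(trace2, fs)
        similarity = np.corrcoef(r1, r2)[0, 1]                   # high when both ratios peak together
        print(r1.max(), r2.max(), round(similarity, 2))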

  13. A computer program to trace seismic ray distribution in complex two-dimensional geological models

    USGS Publications Warehouse

    Yacoub, Nazieh K.; Scott, James H.

    1970-01-01

    A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program include problem identification, control parameters, model coordinates and elastic parameters for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries, and computes the total travel time, total travel distance and other parameters for rays arriving at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in the FORTRAN IV language. The listing of the program is shown in the Appendix, with an example output from a CDC-6600 computer.

  14. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

    RISS S.r.l. is a spin-off company recently created by the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, built on the decade-long experience of its members in earthquake monitoring systems and seismic data analysis, whose major goal is to transform the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started the development of new software that offers an elegant solution for managing and analysing seismic data and for creating automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the 23 November 1980, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of modules, each aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time data stream, and the software then performs phase association and event binding. As soon as an event is detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated using a probabilistic, non-linear exploration algorithm. The software then automatically provides three different magnitude estimates. First, the local magnitude (Ml) is computed from the peak-to-peak amplitude of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed from a simple, automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant quantities are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or as a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the software automatically discriminates between local earthquakes that occurred within the network and regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available information is stored in a local database, and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may be a relevant tool not only for seismologists but also for non-expert external users interested in seismological data. The software is a valid tool for the automatic analysis of background seismicity at different time scales and can be relevant for the monitoring of both natural and induced seismicity.
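
    The abstract does not state which attenuation law the software applies in its Ml module, so the following sketch uses the widely cited Hutton and Boore (1987) southern-California form purely as a stand-in, to make the Wood-Anderson amplitude-to-magnitude step concrete.

    ```python
    # Hedged illustration of the Ml step only: the attenuation law below is the
    # Hutton & Boore (1987) form, used here as a placeholder, not necessarily the
    # relation implemented in the software described above.
    import math

    def local_magnitude(peak_wa_amplitude_mm, hypocentral_distance_km):
        """Ml from the peak Wood-Anderson amplitude (mm) and hypocentral distance (km)."""
        a, r = peak_wa_amplitude_mm, hypocentral_distance_km
        return (math.log10(a)
                + 1.110 * math.log10(r / 100.0)
                + 0.00189 * (r - 100.0)
                + 3.0)

    print(round(local_magnitude(10.0, 100.0), 2))   # 10 mm at 100 km -> Ml 4.0
    ```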

  15. Helical gears with circular arc teeth: Generation, geometry, precision and adjustment to errors, computer aided simulation of conditions of meshing and bearing contact

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Tsay, Chung-Biau

    1987-01-01

    The authors have proposed a method for the generation of circular arc helical gears which is based on the application of standard equipment, worked out all aspects of the geometry of the gears, proposed methods for the computer aided simulation of conditions of meshing and bearing contact, investigated the influence of manufacturing and assembly errors, and proposed methods for the adjustment of gears to these errors. The results of computer aided solutions are illustrated with computer graphics.

  16. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    NASA Astrophysics Data System (ADS)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (US$23 million) of damage to buildings, roads, and infrastructure from shaking, as well as flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic non-linear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding the delivery of outreach activities regarding seismic hazard within local communities and improving understanding of the seismo-tectonic processes taking place in Sabah.

  17. Computer-aided drug discovery.

    PubMed

    Bajorath, Jürgen

    2015-01-01

    Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.

  18. An Expedient but Fascinating Geophysical Chimera: The Pinyon Flat Seismic Strain Point Array

    NASA Astrophysics Data System (ADS)

    Langston, C. A.

    2016-12-01

    The combination of a borehole Gladwin Tensor Strain Meter (GTSM) and a co-located three-component broadband seismometer (BB) can theoretically be used to determine the propagation attributes of P-SV waves in vertically inhomogeneous media, such as horizontal phase velocity and azimuth of propagation, through application of wave gradiometry. A major requirement for this to be successful is to have well-calibrated strain and seismic sensors, so that absolute wave amplitudes from both systems can be relied upon. A "point" seismic array is constructed using the PBO GTSM station B084 and co-located BB seismic stations from an open array experiment deployed by UCSD, as well as the PFO station at the Pinyon Flat facility. Site amplitude statics for all three ground-motion components are found for the 14-element (13 PY stations plus PFO) small-aperture seismic array using data from 47 teleseisms recorded from 2014 to the present. Precision of the amplitude measurement at each site is better than 0.2% for vertical components, 0.5% for EW components, and 1% for NS components. Relative amplitudes among sites of the array are often better than 1%, attesting to the high quality of the instrumentation and installation. The wavefield and related horizontal strains are computed for the location of B084 using a second-order Taylor expansion of observed waveforms from moderate (M4) regional events. The computed seismic-array areal, differential, and shear strains show excellent correlation in both phase and amplitude with those recorded by B084 when using the calibration matrix previously determined using teleseismic strains from the entire ANZA seismic network. Use of the GTSM-BB "point" array significantly extends the bandwidth of gradiometry calculations over the small-aperture seismic array alone, by nearly two orders of magnitude, from 0.5 Hz to 0.01 Hz. In principle, a seismic strain point array could be constructed at every PBO GTSM with a co-located seismometer to help serve earthquake early warning for large regional events on North America's west coast.

  19. Computer-Aided Engineering Education at the K.U. Leuven.

    ERIC Educational Resources Information Center

    Snoeys, R.; Gobin, R.

    1987-01-01

    Describes some recent initiatives and developments in the computer-aided design program in the engineering faculty of the Katholieke Universiteit Leuven (Belgium). Provides a survey of the engineering curriculum, the computer facilities, and the main software packages available. (TW)

  20. Characterizing 6 August 2007 Crandall Canyon mine collapse from ALOS PALSAR InSAR

    USGS Publications Warehouse

    Lu, Zhong; Wicks, Charles

    2010-01-01

    same as the moment of the collapse source, with each larger than the seismically computed moment. Our InSAR results, including the location of the event, the extent of the collapsed area, and constraints on the shearing component of the deformation source, all confirm and extend recent seismic studies of the 6 August 2007 event.

  1. Possible artifacts in inferring seismic properties from X-ray data

    NASA Astrophysics Data System (ADS)

    Bosak, A.; Krisch, M.; Chumakov, A.; Abrikosov, I. A.; Dubrovinsky, L.

    2016-11-01

    We consider the experimental and computational artifacts relevant for the extraction of aggregate elastic properties of polycrystalline materials with particular emphasis on the derivation of seismic velocities. We use the case of iron as an example, and show that the improper use of definitions and neglecting the crystalline anisotropy can result in unexpectedly large errors up to a few percent.

  2. Joint probabilistic determination of earthquake location and velocity structure: application to local and regional events

    NASA Astrophysics Data System (ADS)

    Beucler, E.; Haugmard, M.; Mocquet, A.

    2016-12-01

    The most widely used inversion schemes to locate earthquakes are based on iterative, linearized least-squares algorithms that rely on a priori knowledge of the propagation medium. When only a small number of observations is available, for instance for moderate events, these methods may exhibit large trade-offs between the solution and both the assumed velocity model and the initial set of hypocentral parameters. We present a joint structure-source determination approach using Bayesian inference. Monte Carlo continuous sampling, using Markov chains, generates models within a broad range of parameters, distributed according to the unknown posterior distributions. The non-linear exploration of both the seismic structure (velocity and thickness) and the source parameters relies on a fast forward problem using 1-D travel time computations. The a posteriori covariances between parameters (hypocentre depth, origin time and seismic structure, among others) are computed and explicitly documented. This method reduces the influence of the surrounding seismic network geometry (sparse and/or azimuthally inhomogeneous) and of an overly constrained velocity structure by inferring realistic distributions of the hypocentral parameters. Our algorithm is successfully used to accurately locate events in the Armorican Massif (western France), which is characterized by moderate and apparently diffuse local seismicity.
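
    As a rough illustration of the probabilistic machinery (not the authors' implementation), the sketch below runs a random-walk Metropolis sampler over origin time and depth with a deliberately trivial constant-velocity 1-D travel-time forward model; all names, priors and proposal widths are assumptions.

    ```python
    # Schematic Metropolis sampler for hypocentral uncertainty, assuming a
    # constant-velocity half-space so that the forward problem is trivially fast.
    import numpy as np

    def travel_time(depth_km, epi_dist_km, v_km_s=6.0):
        return np.hypot(depth_km, epi_dist_km) / v_km_s

    def sample_posterior(obs_times, epi_dists, sigma=0.2, n_iter=20000, seed=0):
        rng = np.random.default_rng(seed)
        depth, t0 = 10.0, 0.0                      # starting model
        def loglike(d, t):
            resid = obs_times - (t + travel_time(d, epi_dists))
            return -0.5 * np.sum((resid / sigma) ** 2)
        ll = loglike(depth, t0)
        samples = []
        for _ in range(n_iter):
            d_new = depth + rng.normal(0, 1.0)     # random-walk proposals
            t_new = t0 + rng.normal(0, 0.1)
            if d_new > 0:
                ll_new = loglike(d_new, t_new)
                if np.log(rng.uniform()) < ll_new - ll:
                    depth, t0, ll = d_new, t_new, ll_new
            samples.append((depth, t0))
        return np.array(samples)                   # marginal histograms give the posteriors
    ```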

  3. Software Tools for Shipbuilding Productivity

    DTIC Science & Technology

    1984-12-01

    shipbuilding, is that design, manufacturing and robotic technology applications to shipbuilding have been proven. all aspects of shipbuilding is now a task...technical information about the process of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) effectively has been a problem of serious and...Design (CAD) 3.4.1 CAD System Components 3.4.2 CAD System Benefits 3.4.3 New and Future CAD Technologies Computer Aided Manufacturing (CAM) 3.5.1 CAM

  4. Digital data acquisition for a CAD/CAM-fabricated titanium framework and zirconium oxide restorations for an implant-supported fixed complete dental prosthesis.

    PubMed

    Lin, Wei-Shao; Metz, Michael J; Pollini, Adrien; Ntounis, Athanasios; Morton, Dean

    2014-12-01

    This dental technique report describes a digital workflow comprising digital data acquisition at the implant level; a computer-aided design and computer-aided manufacturing fabricated, tissue-colored, anodized titanium framework; individually luted zirconium oxide restorations; and autopolymerizing injection-molded acrylic resin, used to fabricate an implant-supported, metal-ceramic-resin fixed complete dental prosthesis in an edentulous mandible. The 1-step computer-aided design and computer-aided manufacturing fabrication of the titanium framework and zirconium oxide restorations can provide a cost-effective alternative to the conventional metal-resin fixed complete dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  5. Correction of facial and mandibular asymmetry using a computer aided design/computer aided manufacturing prefabricated titanium implant.

    PubMed

    Watson, Jason; Hatamleh, Muhanad; Alwahadni, Ahed; Srinivasan, Dilip

    2014-05-01

    Patients with significant craniofacial asymmetry may have functional problems associated with their occlusion and aesthetic concerns related to the imbalance in soft and hard tissue profiles. This report details a case of facial asymmetry secondary to a left mandibular angle deficiency caused by previous radiotherapy. We describe the correction of the bony deformity using a computer-aided design/computer-aided manufacturing custom-made titanium onlay fabricated by novel direct metal laser sintering. The direct metal laser sintering onlay proved a very accurate operative fit and gave a good aesthetic correction of the bony defect, with no reported postoperative complications. It is a useful low-morbidity technique, with no resorption or associated donor-site complications.

  6. Classification of Computer-Aided Design-Computer-Aided Manufacturing Applications for the Reconstruction of Cranio-Maxillo-Facial Defects.

    PubMed

    Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y

    2015-11-01

    To gain insight into the methodology of different computer-aided design-computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects. We reviewed and analyzed the available literature pertaining to CAD-CAM for use in CMF reconstruction. We proposed a classification system of the techniques of implant and cutting, drilling, and/or guiding template design and manufacturing. The system consisted of 4 classes (I-IV). These classes combine techniques used for both the implant and template to most accurately describe the methodology used. Our classification system can be widely applied. It should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.

  7. Dual-scan technique for the customization of zirconia computer-aided design/computer-aided manufacturing frameworks.

    PubMed

    Andreiuolo, Rafael Ferrone; Sabrosa, Carlos Eduardo; Dias, Katia Regina H Cervantes

    2013-09-01

    The use of bi-layered all-ceramic crowns has continuously grown since the introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) zirconia cores. Unfortunately, despite the outstanding mechanical properties of zirconia, problems related to porcelain cracking or chipping remain. One of the reasons for this is that ceramic copings are usually milled to uniform thicknesses of 0.3-0.6 mm around the whole tooth preparation. This may not provide uniform thickness or appropriate support for the veneering porcelain. To prevent these problems, the dual-scan technique demonstrates an alternative that allows the restorative team to customize zirconia CAD/CAM frameworks with adequate porcelain thickness and support in a simple manner.

  8. Some Probabilistic and Statistical Properties of the Seismic Regime of Zemmouri (Algeria) Seismoactive Zone

    NASA Astrophysics Data System (ADS)

    Baddari, Kamel; Bellalem, Fouzi; Baddari, Ibtihel; Makdeche, Said

    2016-10-01

    Statistical tests have been used to fit a distribution function to the Zemmouri seismic data. The Pareto law has been used and the probabilities of various expected earthquakes were computed. A mathematical expression giving the quantiles was established, and the limiting law of extreme values confirmed the accuracy of the fitting method. Using the moment magnitude scale, a probabilistic model was built to predict the occurrence of strong earthquakes. The seismic structure has been characterized by the slope of the recurrence plot γ, the fractal dimension D, the concentration parameter K_sr, and the Hurst exponents H_r and H_t. The values of D, γ, K_sr, H_r, and H_t diminished many months before the principal seismic shock (M = 6.9) of the studied seismoactive zone occurred. Three stages of deformation of the geophysical medium are manifested in the variation of the coefficient G% of the clustering of minor seismic events.
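
    The paper's Pareto-law adjustment is not reproduced here, but the closely related maximum-likelihood estimate of the magnitude-recurrence slope (Aki, 1965) gives a flavour of the computation; the catalogue and completeness magnitude below are made up for illustration.

    ```python
    # Hedged sketch of a standard recurrence-slope computation (Aki's maximum-
    # likelihood b-value), equivalent to fitting an exponential/Pareto-type law
    # to the magnitudes above a completeness threshold.
    import numpy as np

    def b_value(magnitudes, completeness_mag):
        """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value."""
        m = np.asarray(magnitudes)
        m = m[m >= completeness_mag]
        return np.log10(np.e) / (m.mean() - completeness_mag)

    catalogue = np.array([3.1, 3.4, 3.0, 4.2, 3.7, 3.2, 5.0, 3.3, 3.9, 3.5])  # hypothetical
    print(round(b_value(catalogue, 3.0), 2))
    ```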

  9. Computer Simulated Visual and Tactile Feedback as an Aid to Manipulator and Vehicle Control,

    DTIC Science & Technology

    1981-05-08

    STATEMENT ........................ 8 Artificial Intellegence Versus Supervisory Control ....... 8 Computer Generation of Operator Feedback...operator. Artificial Intelligence Versus Supervisory Control The use of computers to aid human operators can be divided into two catagories: artificial ...operator. Artificial intelligence ( A. I. ) attempts to give the computer maximum intelligence and to replace all operator functions by the computer

  10. Computational sciences in the upstream oil and gas industry

    PubMed Central

    Halsey, Thomas C.

    2016-01-01

    The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785

  11. Virtual Reality versus Computer-Aided Exposure Treatments for Fear of Flying

    ERIC Educational Resources Information Center

    Tortella-Feliu, Miquel; Botella, Cristina; Llabres, Jordi; Breton-Lopez, Juana Maria; del Amo, Antonio Riera; Banos, Rosa M.; Gelabert, Joan M.

    2011-01-01

    Evidence is growing that two modalities of computer-based exposure therapies--virtual reality and computer-aided psychotherapy--are effective in treating anxiety disorders, including fear of flying. However, they have not yet been directly compared. The aim of this study was to analyze the efficacy of three computer-based exposure treatments for…

  12. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…

  13. Artificial Intelligence and Computer Assisted Instruction. CITE Report No. 4.

    ERIC Educational Resources Information Center

    Elsom-Cook, Mark

    The purpose of the paper is to outline some of the major ways in which artificial intelligence research and techniques can affect usage of computers in an educational environment. The role of artificial intelligence is defined, and the difference between Computer Aided Instruction (CAI) and Intelligent Computer Aided Instruction (ICAI) is…

  14. Electronic Circuit Analysis Language (ECAL)

    NASA Astrophysics Data System (ADS)

    Chenghang, C.

    1983-03-01

    The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated CACD and CACA) of simulated circuits. Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use circuit-analysis language which uses the FORTRAN language to carry out its interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
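
    ECAL itself is not available here, but a dc analysis in any such language ultimately reduces to solving the nodal conductance equations G v = i; the toy two-node divider below (component values ours) shows that step and why its solution can seed the ac and transient analyses.

    ```python
    # A toy illustration (not ECAL itself) of what a dc analysis reduces to:
    # assemble the nodal conductance equations G v = i and solve them.
    import numpy as np

    # Two-node resistive divider driven by a 1 mA current source into node 1:
    # R1 = 1 kOhm from node 1 to node 2, R2 = 2 kOhm from node 2 to ground.
    g1, g2 = 1.0 / 1e3, 1.0 / 2e3
    G = np.array([[g1, -g1],
                  [-g1, g1 + g2]])      # nodal conductance matrix
    i = np.array([1e-3, 0.0])           # injected currents
    v = np.linalg.solve(G, i)           # dc node voltages: [3.0, 2.0] V
    print(v)                            # these also serve as initial conditions
                                        # for the ac and transient analyses
    ```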

  15. WINCADRE INORGANIC (WINDOWS COMPUTER-AIDED DATA REVIEW AND EVALUATION)

    EPA Science Inventory

    WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows-based program designed for computer-assisted data validation. WinCADRE is a powerful tool which significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed in...

  16. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting the real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French national distribution centre (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows the quality control to benefit from high-end checks based on the national and worldwide seismicity. Here we present, first, the real-time seismic data flow from the stations of the French national broadband network to EOST and, second, the data quality control procedures that were recently installed, including some new developments. The data quality control consists of a variety of subprocesses that check the consistency of the whole system, from the stations to the data centre. This allows us to verify that the instruments and the data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbance. The deployed quality control is a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, timing quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and checks for station-magnitude discrepancies. The results of the quality control are visualized through a web interface, which gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control; among them, we will deploy a seismic moment tensor inversion tool for amplitude, timing and polarity control, and a noise correlation procedure for the detection of timing drifts.
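
    A minimal sketch of two of the low-level checks listed above, written with standard ObsPy calls (the file name is hypothetical; window lengths are illustrative, not those of the EOST pipeline).

    ```python
    # Sketch of two low-level QC checks: gap/overlap detection and an STA/LTA
    # characteristic function, using ObsPy. The miniSEED path is hypothetical.
    from obspy import read
    from obspy.signal.trigger import classic_sta_lta

    st = read("example.mseed")               # hypothetical daily miniSEED file

    # 1) waveform statistics: gaps/overlaps and basic amplitude metrics
    for gap in st.get_gaps():
        print("gap:", gap)
    for tr in st:
        data = tr.data.astype(float)
        print(tr.id, "mean=", data.mean(), "rms=", (data ** 2).mean() ** 0.5)

    # 2) STA/LTA characteristic function, to be compared against known seismicity
    tr = st[0]
    df = tr.stats.sampling_rate
    cft = classic_sta_lta(tr.data.astype(float), int(1 * df), int(30 * df))
    print("max STA/LTA:", cft.max())
    ```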

  17. A frozen Gaussian approximation-based multi-level particle swarm optimization for seismic inversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jinglai, E-mail: jinglaili@sjtu.edu.cn; Lin, Guang, E-mail: lin491@purdue.edu; Computational Sciences and Mathematics Division, Pacific Northwest National Laboratory, Richland, WA 99352

    2015-09-01

    In this paper, we propose a frozen Gaussian approximation (FGA)-based multi-level particle swarm optimization (MLPSO) method for seismic inversion of high-frequency wave data. The method addresses two challenges. First, the optimization problem is highly non-convex, which makes it hard for gradient-based methods to reach global minima; this is tackled by MLPSO, which can escape from undesired local minima. Second, the high-frequency character of seismic waves requires a large number of grid points in direct computational methods, and thus places an extremely high computational demand on the simulation of each sample in MLPSO. We overcome this difficulty in three steps: first, we use FGA to compute high-frequency wave propagation based on asymptotic analysis on the phase plane; then we design a constrained full-waveform inversion problem to prevent the optimization search from entering velocity regions where FGA is not accurate; last, we solve the constrained optimization problem by MLPSO employing FGA solvers with different fidelity. The performance of the proposed method is demonstrated by a two-dimensional full-waveform inversion example of the smoothed Marmousi model.
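
    The multi-level scheme with FGA forward solvers is well beyond a short example, but a generic single-level particle swarm optimizer shows the search loop that MLPSO builds on; the toy misfit below stands in for the (far more expensive) waveform misfit.

    ```python
    # Generic single-level PSO sketch, not the paper's MLPSO: particles track
    # personal and global bests while exploring a bounded parameter space.
    import numpy as np

    def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        dim = lo.size
        x = rng.uniform(lo, hi, size=(n_particles, dim))     # positions
        v = np.zeros_like(x)                                 # velocities
        pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
        gbest = pbest[np.argmin(pbest_f)].copy()
        for _ in range(n_iter):
            r1, r2 = rng.uniform(size=(2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([objective(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)].copy()
        return gbest, pbest_f.min()

    # toy misfit standing in for a full-waveform misfit
    best, best_f = pso(lambda m: np.sum((m - 2.5) ** 2), (np.zeros(3), np.full(3, 5.0)))
    print(best, best_f)
    ```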

  18. Coseismic Excitation of the Earth's Polar Motion

    NASA Technical Reports Server (NTRS)

    Chao, B. F.; Gross, R. S.

    2000-01-01

    Apart from the "shaking" near the epicenter that is the earthquake, a seismic event creates a permanent field of dislocation in the entire Earth. This redistribution of mass changes (slightly) the Earth's inertia tensor, and the Earth's rotation changes in accordance with the conservation of angular momentum. Similar to this seismic excitation of Earth rotation variations, the same mass redistribution causes (slight) changes in the Earth's gravitational field, expressible in terms of changes in the Stokes coefficients of its harmonic expansion. In this paper, we give a historical background of the subject and discuss the related physics; we then compute the geodynamic effects caused by earthquakes based on a normal-mode summation scheme. The effects are computed using the centroid moment tensor (CMT) solutions for 15,814 major earthquakes from January 1977 through February 1999, as provided in the Harvard CMT catalog. The computational results further strengthen these findings and conclusions: (i) the strong tendency for earthquakes to make the Earth rounder and more compact (however slightly) continues; (ii) so does the trend in the seismic "nudging" of the rotation pole toward the general direction of approximately 140°E, roughly opposite to that of the observed polar drift, but two orders of magnitude smaller in drift speed.

  19. Computer-aided US diagnosis of breast lesions by using cell-based contour grouping.

    PubMed

    Cheng, Jie-Zhi; Chou, Yi-Hong; Huang, Chiun-Sheng; Chang, Yeun-Chung; Tiu, Chui-Mei; Chen, Kuei-Wu; Chen, Chung-Ming

    2010-06-01

    To develop a computer-aided diagnostic algorithm with automatic boundary delineation for differential diagnosis of benign and malignant breast lesions at ultrasonography (US) and investigate the effect of boundary quality on the performance of a computer-aided diagnostic algorithm. This was an institutional review board-approved retrospective study with waiver of informed consent. A cell-based contour grouping (CBCG) segmentation algorithm was used to delineate the lesion boundaries automatically. Seven morphologic features were extracted. The classifier was a logistic regression function. Five hundred twenty breast US scans were obtained from 520 subjects (age range, 15-89 years), including 275 benign (mean size, 15 mm; range, 5-35 mm) and 245 malignant (mean size, 18 mm; range, 8-29 mm) lesions. The newly developed computer-aided diagnostic algorithm was evaluated on the basis of boundary quality and differentiation performance. The segmentation algorithms and features in two conventional computer-aided diagnostic algorithms were used for comparative study. The CBCG-generated boundaries were shown to be comparable with the manually delineated boundaries. The area under the receiver operating characteristic curve (AUC) and differentiation accuracy were 0.968 +/- 0.010 and 93.1% +/- 0.7, respectively, for all 520 breast lesions. At the 5% significance level, the newly developed algorithm was shown to be superior to the use of the boundaries and features of the two conventional computer-aided diagnostic algorithms in terms of AUC (0.974 +/- 0.007 versus 0.890 +/- 0.008 and 0.788 +/- 0.024, respectively). The newly developed computer-aided diagnostic algorithm that used a CBCG segmentation method to measure boundaries achieved a high differentiation performance. Copyright RSNA, 2010

  20. Bibliographical search for reliable seismic moments of large earthquakes during 1900-1979 to compute MW in the ISC-GEM Global Instrumental Reference Earthquake Catalogue

    NASA Astrophysics Data System (ADS)

    Lee, William H. K.; Engdahl, E. Robert

    2015-02-01

    Moment magnitude (MW) determinations from the online GCMT Catalogue of seismic moment tensor solutions (GCMT Catalog, 2011) have provided the bulk of MW values in the ISC-GEM Global Instrumental Reference Earthquake Catalogue (1900-2009) for almost all moderate-to-large earthquakes occurring after 1975. This paper describes an effort to determine MW of large earthquakes that occurred prior to the start of the digital seismograph era, based on credible assessments of thousands of seismic moment (M0) values published in the scientific literature by hundreds of individual authors. MW computed from the published M0 values (for a time period more than twice that of the digital era) are preferable to proxy MW values, especially for earthquakes with MW greater than about 8.5, for which MS is known to be underestimated or "saturated". After examining 1,123 papers, we compile a database of seismic moments and related information for 1,003 earthquakes with published M0 values, of which 967 were included in the ISC-GEM Catalogue. The remaining 36 earthquakes were not included in the Catalogue due to difficulties in their relocation because of inadequate arrival time information. However, 5 of these earthquakes with bibliographic M0 (and thus MW) are included in the Catalogue's Appendix. A search for reliable seismic moments was not successful for earthquakes prior to 1904. For each of the 967 earthquakes a "preferred" seismic moment value (if there is more than one) was selected and its uncertainty was estimated according to the data and method used. We used the IASPEI formula (IASPEI, 2005) to compute direct moment magnitudes (MW[M0]) based on the seismic moments (M0), and assigned their errors based on the uncertainties of M0. From 1900 to 1979, there are 129 great or near great earthquakes (MW ⩾ 7.75) - the bibliographic search provided direct MW values for 86 of these events (or 67%), the GCMT Catalog provided direct MW values for 8 events (or 6%), and the remaining 35 (or 27%) earthquakes have empirically determined proxy MW estimates. An electronic supplementary file is included with this paper in order to provide our M0/MW catalogue of earthquakes (1904-1978) from the published literature, and a reference list of the 1,123 papers that we examined.
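
    For reference, the IASPEI (2005) standard relation the authors cite converts a seismic moment M0 in N·m to moment magnitude as Mw = (2/3)(log10 M0 - 9.1); a one-line implementation is sketched below (the example value is illustrative).

    ```python
    # IASPEI (2005) standard Mw from a published seismic moment M0 (N·m).
    import math

    def mw_from_m0(m0_newton_metre):
        return (2.0 / 3.0) * (math.log10(m0_newton_metre) - 9.1)

    print(round(mw_from_m0(4.0e22), 2))   # M0 = 4e22 N·m -> Mw ~ 9.0
    ```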

  1. Estimation of the failure risk of a maxillary premolar with different crack depths with endodontic treatment by computer-aided design/computer-aided manufacturing ceramic restorations.

    PubMed

    Lin, Chun-Li; Chang, Yen-Hsiang; Hsieh, Shih-Kai; Chang, Wen-Jen

    2013-03-01

    This study evaluated the risk of failure for an endodontically treated premolar with cracks of different depths shearing toward the pulp chamber, restored using 3 different computer-aided design/computer-aided manufacturing ceramic restoration configurations. Three 3-dimensional finite element models, designed with computer-aided design/computer-aided manufacturing ceramic onlay, endocrown, and conventional crown restorations, were constructed to perform the simulations. The Weibull function was incorporated with finite element analysis to calculate the long-term failure probability under different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for the endocrown restoration were the lowest of the 3 restoration methods. Weibull analysis revealed that the overall failure probabilities in a shallowly cracked premolar were 27%, 2%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, under normal occlusal conditions. The corresponding values were 70%, 10%, and 2% for the deeply cracked premolar. This numerical investigation suggests that the endocrown provides sufficient fracture resistance only in a shallowly cracked premolar with endodontic treatment. The conventional crown treatment can immobilize the premolar for different crack depths with lower failure risk. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  2. Advances in computer-aided well-test interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, R.N.

    1994-07-01

    Despite the feeling, expressed several times over the past 40 years, that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with further improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
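
    Of the separate areas listed, the pressure derivative is the most compact to illustrate: the diagnostic quantity is t·dP/dt (equivalently dP/d ln t). The sketch below estimates it with simple differences in log time; production tools add smoothing windows, and the synthetic data here are made up.

    ```python
    # Sketch of the pressure-derivative diagnostic used in well-test analysis:
    # estimate dP/d(ln t), i.e. t * dP/dt, from a drawdown/buildup record.
    import numpy as np

    def pressure_derivative(t, delta_p):
        """Return dP/d ln t at the sample times of a test."""
        ln_t = np.log(np.asarray(t, dtype=float))
        dp = np.asarray(delta_p, dtype=float)
        return np.gradient(dp, ln_t)      # dP/d ln t == t * dP/dt

    t = np.logspace(-2, 2, 50)            # hours (synthetic example)
    delta_p = 10.0 * np.log(t + 1.0)      # toy pressure change
    print(pressure_derivative(t, delta_p)[:5])
    ```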

  3. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    PubMed Central

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than others. Integration of content control improved quality of decision making (SMD 0.59 vs 0.23 for knowledge; SMD 0.39 vs 0.29 for decisional conflict). In contrast, tailoring reduced quality of decision making (SMD 0.40 vs 0.71 for knowledge; SMD 0.25 vs 0.52 for decisional conflict). Similarly, patient narratives also reduced quality of decision making (SMD 0.43 vs 0.65 for knowledge; SMD 0.17 vs 0.46 for decisional conflict). Results were varied for different types of explicit values clarification, feedback, and social support. Conclusions Integration of media rich or interactive features into computer-based decision aids can improve quality of preference-sensitive decision making. However, this is an emerging field with limited evidence to guide use. The systematic review and thematic synthesis identified features that have been integrated into available computer-based decision aids, in an effort to facilitate reporting of these features and to promote integration of such features into decision aids. The meta-analyses and associated subgroup analyses provide preliminary evidence to support integration of specific features into future decision aids. Further research can focus on clarifying independent contributions of specific features through experimental designs and refining the designs of features to improve effectiveness. PMID:26813512

  4. Multicomponent ensemble models to forecast induced seismicity

    NASA Astrophysics Data System (ADS)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become an increasingly relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate forecast testing as currently the best method for ascertaining whether or not models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore-pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels of seismicity days before the occurrence of felt events.
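
    A schematic of the ensemble idea under stated assumptions: weights derived from each model's past-performance score, a weighted forecast rate, and a Poisson conversion of that rate into the probability of at least one M ≥ 3 event in the window. The numbers are invented; the paper's actual Bayesian weighting is richer.

    ```python
    # Schematic only: combine individual induced-seismicity forecasts into an
    # ensemble rate and convert it to an occurrence probability of M >= 3 events.
    import numpy as np

    log_likelihoods = np.array([-120.0, -118.5, -125.0])     # past-performance scores (made up)
    weights = np.exp(log_likelihoods - log_likelihoods.max())
    weights /= weights.sum()                                  # normalized model weights

    forecast_rates = np.array([0.8, 1.4, 0.3])                # expected M>=3 counts per model
    ensemble_rate = float(weights @ forecast_rates)

    p_at_least_one = 1.0 - np.exp(-ensemble_rate)             # Poisson exceedance probability
    print(weights, round(ensemble_rate, 2), round(p_at_least_one, 2))
    ```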

  5. Analysis of the Pre-stack Split-Step Migration Operator Using Ritz Values

    NASA Astrophysics Data System (ADS)

    Kaplan, S. T.; Sacchi, M. D.

    2009-05-01

    The Born approximation for the acoustic wave-field is often used as a basis for developing algorithms in seismic imaging (migration). The approximation is linear and, as such, can be written as a matrix-vector multiplication (Am = d). In the seismic imaging problem, d is the seismic data (the recorded wave-field), and we aim to find the seismic reflectivity m (a representation of earth structure and properties) so that Am = d is satisfied. This is the often-studied inverse problem of seismic migration, where, given A and d, we solve for m. This can be done in a least-squares sense, so that the equation of interest is A^H A m = A^H d. Hence, the solution m depends largely on the properties of A^H A. The imaging Jacobian J provides an approximation to A^H A, so that J^{-1} A^H A is, in a broad sense, better behaved than A^H A. We attempt to quantify this last statement by providing an analysis of A^H A and J^{-1} A^H A using their Ritz values, for the particular case where A is built using a pre-stack split-step migration algorithm. Typically, one might try to analyze the behaviour of these matrices using their eigenvalue spectra. The difficulty in the analysis of A^H A and J^{-1} A^H A lies in their size. For example, a subset of the relatively small Marmousi data set makes A^H A a complex-valued matrix of dimensions roughly 45 million by 45 million (requiring, in single precision, about 16 petabytes of computer memory). In short, the size of the matrix makes its eigenvalues difficult to compute. Instead, we compute the leading principal minors of the similar tridiagonal matrices B_k = V_k^{-1} A^H A V_k and C_k = U_k^{-1} J^{-1} A^H A U_k, which can be constructed using, for example, the Lanczos decomposition. Up to some value of k it is feasible to compute the eigenvalues of B_k and C_k, which, in turn, are the Ritz values of A^H A and J^{-1} A^H A, respectively, and may allow us to make quantitative statements about their behaviour.
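
    A generic Lanczos sketch (a small dense toy operator in place of the enormous implicit migration normal operator) showing how such Ritz values are obtained in practice: build a k x k tridiagonal matrix from matrix-vector products with A^H A and take its eigenvalues.

    ```python
    # Lanczos-based Ritz values for a Hermitian operator available only through
    # matrix-vector products (no reorthogonalization; sketch only).
    import numpy as np

    def ritz_values(apply_op, n, k, seed=0):
        rng = np.random.default_rng(seed)
        q = rng.standard_normal(n)
        q /= np.linalg.norm(q)
        alphas, betas = [], []
        q_prev, beta = np.zeros(n), 0.0
        for _ in range(k):
            w = apply_op(q) - beta * q_prev    # three-term Lanczos recurrence
            alpha = q @ w
            w -= alpha * q
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            q_prev, q = q, w / beta
        T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
        return np.linalg.eigvalsh(T)           # Ritz values approximate extreme eigenvalues

    A = np.random.randn(200, 50)
    normal_op = lambda x: A.T @ (A @ x)        # stands in for A^H A applied implicitly
    print(ritz_values(normal_op, 50, 10))
    ```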

  6. Data and Workflow Management Challenges in Global Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Lei, W.; Ruan, Y.; Smith, J. A.; Modrak, R. T.; Orsvuran, R.; Krischer, L.; Chen, Y.; Balasubramanian, V.; Hill, J.; Turilli, M.; Bozdag, E.; Lefebvre, M. P.; Jha, S.; Tromp, J.

    2017-12-01

    It is crucial to take the complete physics of wave propagation into account in seismic tomography to further improve the resolution of tomographic images. The adjoint method is an efficient way of incorporating 3D wave simulations in seismic tomography. However, global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Through our collaboration with the Oak Ridge National Laboratory (ORNL) computing group and an allocation on Titan, ORNL's GPU-accelerated supercomputer, we are now performing our global inversions by assimilating waveform data from over 1,000 earthquakes. The first challenge we encountered is dealing with the sheer amount of seismic data. Data processing based on conventional data formats and processing tools (such as SAC), which are not designed for parallel systems, became our major bottleneck. To facilitate the data processing procedures, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of Python-based processing tools to replace legacy FORTRAN-based software. These tools greatly enhance reproducibility and accountability while taking full advantage of highly parallel systems and showing superior scaling on modern computational platforms. The second challenge is that the data processing workflow contains more than 10 sub-procedures, making it delicate to handle and prone to human error. To reduce human intervention as much as possible, we are developing a framework specifically designed for seismic inversion based on state-of-the-art workflow management research, specifically the Ensemble Toolkit (EnTK), in collaboration with the RADICAL team from Rutgers University. Using the initial developments of the EnTK, we are able to utilize the full computing power of the data processing cluster RHEA at ORNL while keeping human interaction to a minimum and greatly reducing the data processing time. Thanks to these improvements, we are now able to iterate quickly enough on a dataset of more than 1,000 earthquakes. Starting from model GLAD-M15 (Bozdag et al., 2016), an elastic 3D model with a transversely isotropic upper mantle, we have successfully performed 5 iterations. Our goal is to finish 10 iterations, i.e., to generate GLAD-M25, by the end of this year.

  7. Low-Cost Computer-Aided Instruction/Computer-Managed Instruction (CAI/CMI) System: Feasibility Study. Final Report.

    ERIC Educational Resources Information Center

    Lintz, Larry M.; And Others

    This study investigated the feasibility of a low cost computer-aided instruction/computer-managed instruction (CAI/CMI) system. Air Force instructors and training supervisors were surveyed to determine the potential payoffs of various CAI and CMI functions. Results indicated that a wide range of capabilities had potential for resident technical…

  8. Computer Aided Drafting Packages for Secondary Education. Edition 1. Apple II and Macintosh. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews software packages for Apple Macintosh and Apple II computers available to secondary schools to teach computer-aided drafting (CAD). Products for the report were gathered through reviews of CAD periodicals, computers in education periodicals, advertisements, and teacher recommendations. The first section lists the primary…

  9. Prediction of Reservoir Properties for Geomechanical Analysis Using 3-D Seismic Data and Rock Physics Modeling in the Vaca Muerta Formation, Neuquen Basin, Argentina

    NASA Astrophysics Data System (ADS)

    Convers-Gomez, Carlos E.

    The Vaca Muerta Formation in the Neuquen Basin has recently received a lot of attention from oil companies interested in developing its shale resources. Early identification of zones with good production potential is extremely important to optimize the return on capital investment. Developing a workflow in shale plays that associates an effective hydraulic fracture response with the presence of hydrocarbons is crucial for economic success. The vertical and lateral heterogeneity of rock properties are critical factors that impact production. The integration of 3D seismic and well data is necessary to predict rock properties and identify their distribution, which can then be integrated with geomechanical properties to model the rock response to hydraulic stimulation. This study includes a 3D seismic survey and six vertical wells with full log suites in each well. The well logs allowed for the computation of a pre-stack model-based inversion, which uses seismic data to estimate rock-property volumes. An inverse relationship between P-impedance and Total Organic Content (TOC) was observed and quantified. Likewise, a direct relationship between P-impedance and volume of carbonate was observed. The volume of kerogen, type of clay, type of carbonate and fluid pressure all control the geomechanical properties of the formation when subject to hydraulic fracturing. Probabilistic Neural Networks were then used to predict the lateral and vertical heterogeneity of rock properties. TOC and volume of kerogen behaved as adequate indicators of possible zones with a high presence of hydrocarbons, while the volume of carbonate was a valid indicator of brittle versus ductile rock. The predicted density volume was used to estimate geomechanical properties (Young's Modulus and Poisson's Ratio) and to identify the zones that respond best to hydraulic stimulation. During the analysis of geomechanical properties, Young's Modulus was observed to have a direct relationship with volume of carbonate and an inverse relationship with TOC, enabling the identification of brittle and ductile rock zones. The analysis detected zones with a good presence of hydrocarbons and brittle rock; this information was integrated with the analysis of geomechanical properties to generate a model of the most likely zones of good production. This model will aid in the future exploration and development of the Vaca Muerta Formation.

  10. Computer Aided Drug Design: Success and Limitations.

    PubMed

    Baig, Mohammad Hassan; Ahmad, Khurshid; Roy, Sudeep; Ashraf, Jalaluddin Mohammad; Adil, Mohd; Siddiqui, Mohammad Haris; Khan, Saif; Kamal, Mohammad Amjad; Provazník, Ivo; Choi, Inho

    2016-01-01

    Over the last few decades, computer-aided drug design has emerged as a powerful technique playing a crucial role in the development of new drug molecules. Structure-based drug design and ligand-based drug design are two methods commonly used in computer-aided drug design. In this article, we discuss the theory behind both methods, as well as their successful applications and limitations. To accomplish this, we reviewed structure-based and ligand-based virtual screening processes. Molecular dynamics simulation, which has become one of the most influential tools for predicting the conformation of small molecules and changes in their conformation within the biological target, has also been taken into account. Finally, we discuss the principles and concepts of molecular docking, pharmacophores and other methods used in computer-aided drug design.

  11. Removal of Supernumerary Teeth Utilizing a Computer-Aided Design/Computer-Aided Manufacturing Surgical Guide.

    PubMed

    Jo, Chanwoo; Bae, Doohwan; Choi, Byungho; Kim, Jihun

    2017-05-01

    Supernumerary teeth need to be removed because they can cause various complications. Caution is needed because their removal can damage permanent teeth or tooth germs in the local vicinity. Surgical guides have recently been used in maxillofacial surgery. Because surgical guides are designed through preoperative analysis with computer-aided design software and fabricated using a 3-dimensional printer applying computer-aided manufacturing technology, they increase the accuracy and predictability of surgery. This report describes 2 cases of mesiodens removal (1 from a child and 1 from an adolescent) using a surgical guide; these teeth would have been difficult to remove with conventional surgical methods. Copyright © 2016 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  12. Seismograms live from around the world

    USGS Publications Warehouse

    Woodward, Robert L.; Shedlock, Kaye M.; Bolton, Harold F.

    1999-01-01

    You can view earthquakes as they happen! Seismograms from seismic stations around the world are broadcast live, via the Internet, and are updated every 30 minutes, With an Internet connection and a web browser, you can view current seismograms and earthquake locations on your own computer. With special software also available via the Internet, you can obtain seismic data as it arrives from a global network of seismograph stations.

  13. Measuring the size of an earthquake

    USGS Publications Warehouse

    Spence, William; Sipkin, Stuart A.; Choy, George L.

    1989-01-01

    Today, state-of-the-art seismic systems transmit data from the seismograph via telephone line and satellite directly to a central digital computer. A preliminary location, depth-of-focus, and magnitude can now be obtained within minutes of onset of an earthquake.  The only limiting factor is how long the seismic waves take to travel from the epicenter to the stations--usually less than 10 minutes.

  14. Seismic Barrier Protection of Critical Infrastructure from Earthquakes

    DTIC Science & Technology

    2017-05-01

    structure composed of opposing boreholes or trenches to mitigate seismic waves from diffracting and traveling in the vertical plane. Computational...dams, etc., pose significant risk to civilians while adding tremendous cost and recovery time to regain their functionality. Lower energy earthquakes...the most destructive are surface waves (Rayleigh, Love, shear) which can travel great distances in the far field from the earthquake hypocenter and

  15. Measuring the size of an earthquake

    USGS Publications Warehouse

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Today, state-of-the-art seismic systems transmit data from the seismograph via telephone line and satellite directly to a central digital computer. A preliminary location, depth-of-focus, and magnitude can now be obtained within minutes of the onset of an earthquake. The only limiting factor is how long the seismic waves take to travel from the epicenter to the stations--usually less than 10 minutes.

  16. Crowd-Sourcing Seismic Data: Lessons Learned from the Quake-Catcher Network

    NASA Astrophysics Data System (ADS)

    Cochran, E. S.; Sumy, D. F.; DeGroot, R. M.; Clayton, R. W.

    2015-12-01

    The Quake Catcher Network (QCN; qcn.caltech.edu) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, when the first citizen scientists joined the QCN project, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records. We present and describe the rapid installations of very dense sensor networks that have been undertaken following several large earthquakes including the 2010 M8.8 Maule Chile, the 2010 M7.1 Darfield, New Zealand, and the 2015 M7.8 Gorkha, Nepal earthquake. These large data sets allowed seismologists to develop new rapid earthquake detection capabilities and closely examine source, path, and site properties that impact ground shaking at a site. We show how QCN has engaged a wide sector of the public in scientific data collection, providing the public with insights into how seismic data are collected and used. Furthermore, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate earthquakes that they felt, as part of 'teachable moment' exercises.

  17. The Quake-Catcher Network: An Innovative Community-Based Seismic Network

    NASA Astrophysics Data System (ADS)

    Saltzman, J.; Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.

    2009-12-01

    The Quake-Catcher Network (QCN) is a volunteer computing seismic network that engages citizen scientists, teachers, and museums to participate in the detection of earthquakes. In less than two years, the network has grown to over 1000 participants globally and continues to expand. QCN utilizes Micro-Electro-Mechanical System (MEMS) accelerometers, in laptops and external to desktop computers, to detect moderate to large earthquakes. One goal of the network is to involve K-12 classrooms and museums by providing sensors and software to introduce participants to seismology and community-based scientific data collection. The Quake-Catcher Network provides a unique opportunity to engage participants directly in the scientific process, through hands-on activities that link activities and outcomes to their daily lives. Partnerships with teachers and museum staff are critical to growth of the Quake Catcher Network. Each participating institution receives a MEMS accelerometer to connect, via USB, to a computer that can be used for hands-on activities and to record earthquakes through a distributed computing system. We developed interactive software (QCNLive) that allows participants to view sensor readings in real time. Participants can also record earthquakes and download earthquake data that was collected by their sensor or other QCN sensors. The Quake-Catcher Network combines research and outreach to improve seismic networks and increase awareness and participation in science-based research in K-12 schools.

  18. Web-Based Learning in the Computer-Aided Design Curriculum.

    ERIC Educational Resources Information Center

    Sung, Wen-Tsai; Ou, S. C.

    2002-01-01

    Applies principles of constructivism and virtual reality (VR) to computer-aided design (CAD) curriculum, particularly engineering, by integrating network, VR and CAD technologies into a Web-based learning environment that expands traditional two-dimensional computer graphics into a three-dimensional real-time simulation that enhances user…

  19. Integrated Computer-Aided Drafting Instruction (ICADI).

    ERIC Educational Resources Information Center

    Chen, C. Y.; McCampbell, David H.

    Until recently, computer-aided drafting and design (CAD) systems were almost exclusively operated on mainframes or minicomputers and their cost prohibited many schools from offering CAD instruction. Today, many powerful personal computers are capable of performing the high-speed calculation and analysis required by the CAD application; however,…

  20. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    NASA Astrophysics Data System (ADS)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post- seismic events to the main earthquake following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique along with energy release equations dependent on Richter's scale [8,9] allow for an estimate to be drawn regarding the amount of the energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable to dynamically approximate the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: ‘Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and Ben-Zion Y.: ‘Techniques and parameters to analyze seismicity patterns associated with large earthquakes', Geophysics Res., vol. 102, pp. 17785-17795, 1997a [3] Habermann R. E.: ‘Precursory seismic quiescence: past, present and future', Pure Applied Geophysics, vol. 126, pp. 279-318, 1988 [4] Matthews M. V. and Reasenberg P. A.: ‘Statistical methods for investigating quiescence and other temporal seismicity patterns', Pure Applied Geophysics, vol. 126, pp. 357-372, 1988 [5] Zubkov S. I.: ‘The appearance times of earthquake precursors', Izv. Akad. Nauk SSSR Fiz. Zemli (Solid Earth), No. 5, pp. 87-91, 1987 [6] Dobrovolsky I. P., Zubkov S. I. and Miachkin V. I.: ‘Estimation of the size of earthquake preparation zones', Pageoph, vol. 117, pp. 1025-1044, 1979 [7] Dobrovolsky I. P., Gershenzon N. I. And Gokhberg M. B.: ‘Theory of electrokinetic effects occurring at the final stage in the preparation of a tectonic earthquake', Physics of the Earth and Planetary Interiors, vol. 57, pp. 144-156, 1989 [8] Richter C. F.: ‘Elementary Seismology', W.H.Freeman and Co., San Francisco, 1958 [9] Choy G. L. and Boatwright J. L.: ‘Global patterns of radiated seismic energy and apparent stress', Journal of Geophysical Research, vol. 84 (B5), pp. 2348-2350, 1995 [10] Haykin S.: ‘Neural Networks', 2nd Edition, Prentice Hall, 1999 [11] Jang J., Sun T. and Mizutany E.: ‘Neuro-fuzzy and soft computing', Prentice Hall, Upper Saddle River, NJ, 1997 [12] Konstantaras A., Varley M.R., Vallianatos F., Collins G. 
and Holifield P.: ‘Detection of weak seismo-electric signals upon the recordings of the electrotelluric field by means of neuron-fuzzy technology', IEEE Geoscience and Remote Sensing Letters, vol. 4 (1), 2007 [13] Konstantaras A., Varley M.R., Vallianatos F., Collins G. and Holifield P.: ‘Neuro-fuzzy prediction-based adaptive filtering applied to severely distorted magnetic field recordings', IEEE Geoscience and Remote Sensing Letters, vol. 3 (4), 2006 [14] Maravelakis E., Bilalis N., Keith J. and Antoniadis A.: ‘Measuring and Benchmarking the Innovativeness of SME's: a three dimensional Fuzzy Logic Approach', Production Planning and Control Journal, vol. 17 (3), pp. 283-292, 2006 [15] Bodri B.: ‘A neural-network model for earthquake occurrence', Geodynamics, vol. 32, pp. 289-310, 2001 [16] Skounakis E., Karagiannis V. and Vlissidis A.: ‘A Versatile System for Real-time Analyzing and Testing Objects Quality', Proceedings-CD of the 4th International Conference on "New Horizons in Industry, Business and Education" (NHIBE 2005), Corfu, Greece, pp. 701-708, 2005
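
    As a rough illustration of the energy-release bookkeeping mentioned in the abstract above (the authors' own clustering routine and formulation are not reproduced here), the standard Gutenberg-Richter energy-magnitude relation converts the magnitudes of a clustered sequence into radiated energy; the magnitudes below are hypothetical:

```python
import math

def radiated_energy_joules(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10(E) = 4.8 + 1.5*M, E in joules."""
    return 10 ** (4.8 + 1.5 * magnitude)

def sequence_energy(magnitudes):
    """Total energy released by a clustered pre-/main-/post-event sequence."""
    return sum(radiated_energy_joules(m) for m in magnitudes)

if __name__ == "__main__":
    cluster = [3.2, 3.8, 5.6, 4.1]   # hypothetical magnitudes of one clustered sequence
    print(f"Sequence energy: {sequence_energy(cluster):.3e} J")
    # The largest event dominates: each unit of magnitude is ~31.6x more energy.
```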

  1. A Computer-Aided Diagnosis System for Breast Cancer Combining Digital Mammography and Genomics

    DTIC Science & Technology

    2006-05-01

    Huang, "Breast cancer diagnosis using self-organizing map for sonography." Ultrasound Med. Biol. 26, 405 (2000). 20 K. Horsch, M.L. Giger, L.A. Venta ...L.A. Venta , "Performance of computer-aided diagnosis in the interpretation of lesions on breast sonography." Acad Radiol 11, 272 (2004). 22 W. Chen...418. 27. Horsch K, Giger ML, Vyborny CJ, Venta LA. Performance of computer-aided diagnosis in the interpretation of lesions on breast sonography

  2. Computer-aided design of the RF-cavity for a high-power S-band klystron

    NASA Astrophysics Data System (ADS)

    Kant, D.; Bandyopadhyay, A. K.; Pal, D.; Meena, R.; Nangru, S. C.; Joshi, L. M.

    2012-08-01

    This article describes the computer-aided design of the RF-cavity for an S-band klystron operating at 2856 MHz. State-of-the-art electromagnetic simulation tools SUPERFISH, CST Microwave Studio, HFSS and MAGIC have been used for the cavity design. After finalising the geometrical details of the cavity through simulation, it was fabricated and characterised through cold testing. Detailed results of the computer-aided simulation and cold measurements are presented in this article.

  3. Computer aided design of monolithic microwave and millimeter wave integrated circuits and subsystems

    NASA Astrophysics Data System (ADS)

    Ku, Walter H.; Gang, Guan-Wan; He, J. Q.; Ichitsubo, I.

    1988-05-01

    This final technical report presents results on the computer aided design of monolithic microwave and millimeter wave integrated circuits and subsystems. New results include analytical and computer aided device models of GaAs MESFETs and HEMTs or MODFETs, new synthesis techniques for monolithic feedback and distributed amplifiers and a new nonlinear CAD program for MIMIC called CADNON. This program incorporates the new MESFET and HEMT model and has been successfully applied to the design of monolithic millimeter-wave mixers.

  4. An Interactive Computer Aided Design and Analysis Package.

    DTIC Science & Technology

    1986-03-01

    Al-A167 114 AN INTERACTIVE COMPUTER AIDED DESIGN MUD ANAILYSIS 1/𔃼 PACKAGE(U) NAVAL POSTGRADUATE SCHOOL NONTEREY CA T L EUALD "AR 86 UNCLSSIFIED F... SCHOOL Monterey, California DTIC .LECTE MAYOS THESIS AN INTERACTIVE COMPUTER AIDED DESIGN AND ANALYSIS PACKAGE by Terrence L. Ewald March 1986 jThesis...ORGANIZATION Naval Postgraduate School (if dAp90h81111) Naval Postgraduate School . 62A 6C. ADDRESS (0ty. State, and ZIP Code) 7b. ADDRESS (City State. and

  5. Use of microcomputer in mapping depth of stratigraphic horizons in National Petroleum Reserve in Alaska

    USGS Publications Warehouse

    Payne, Thomas G.

    1982-01-01

    REGIONAL MAPPER is a menu-driven system in the BASIC language for computing and plotting (1) time, depth, and average velocity to geologic horizons, (2) interval time, thickness, and interval velocity of stratigraphic intervals, and (3) subcropping and onlapping intervals at unconformities. The system consists of three programs: FILER, TRAVERSER, and PLOTTER. A control point is a shot point with velocity analysis or a shot point at or near a well with a velocity check-shot survey. Reflection time to and code number of seismic horizons are filed by digitizing tablet from record sections. TRAVERSER starts at a point of geologic control and, in traversing to another, parallels seismic events, records loss of horizons by onlap and truncation, and stores reflection time for geologic horizons at traversed shot points. TRAVERSER is basically a phantoming procedure. Permafrost thickness and velocity variations, buried canyons with low-velocity fill, and error in seismically derived velocity cause velocity anomalies that complicate depth mapping. Two depths to the top of the pebble shale are computed for each control point. One depth (Zs) is based on seismically derived velocity. The other (Zw) is based on interval velocity interpolated linearly between wells and multiplied by interval time (isochron) to give interval thickness; Zw is computed for all geologic horizons by downward summation of interval thickness. Unknown true depth (Z) to the pebble shale may be expressed as Z = Zs + es and Z = Zw + ew, where the e terms represent error. Equating the two expressions gives the depth difference D = Zs - Zw = ew - es. A plot of D for the top of the pebble shale is readily contourable, but smoothing is required to produce a reasonably simple surface. Seismically derived velocity used in computing Zs includes the effect of velocity anomalies but is subject to some large, randomly distributed errors resulting in depth errors (es). Well-derived velocity used in computing Zw does not include the effect of velocity anomalies, but the error (ew) should reflect these anomalies and should be contourable (non-random). The D surface as contoured with smoothing is assumed to represent ew, that is, the depth effect of variations in permafrost thickness and velocity and buried canyon depth. Estimated depth (Zest) to each geologic horizon is the sum of Zw for that horizon and ew as contoured for the pebble shale, which is the first highly continuous seismic horizon below the zone of anomalous velocity. Results of this 'depthing' procedure are compared with those of Tetra Tech, Inc., the subcontractor responsible for geologic and geophysical interpretation and mapping.
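
    A minimal numerical sketch of the depth-correction scheme described above, using hypothetical control-point values; the smoothing step here stands in for the hand contouring of the D surface:

```python
import numpy as np

# Hypothetical per-control-point depths (km): seismically derived (Zs) and
# well-interpolated (Zw) depths to the top of the pebble shale.
Zs = np.array([2.41, 2.55, 2.48, 2.60])
Zw = np.array([2.35, 2.50, 2.46, 2.52])

# Depth difference attributed to velocity anomalies plus random error: D = Zs - Zw = ew - es.
D = Zs - Zw

# Smoothing D (a simple moving average stands in for contouring with smoothing)
# is assumed to isolate the contourable, non-random part ew.
ew = np.convolve(D, np.ones(3) / 3, mode="same")

# Estimated depth to any horizon: its well-based depth plus the pebble-shale correction.
Zw_horizon = np.array([1.90, 2.02, 1.98, 2.05])   # hypothetical shallower horizon
Z_est = Zw_horizon + ew
print(Z_est)
```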

  6. Rupture Dynamics and Seismic Radiation on Rough Faults for Simulation-Based PSHA

    NASA Astrophysics Data System (ADS)

    Mai, P. M.; Galis, M.; Thingbaijam, K. K. S.; Vyas, J. C.; Dunham, E. M.

    2017-12-01

    Simulation-based ground-motion predictions may augment PSHA studies in data-poor regions or provide additional shaking estimations, incl. seismic waveforms, for critical facilities. Validation and calibration of such simulation approaches, based on observations and GMPE's, is important for engineering applications, while seismologists push to include the precise physics of the earthquake rupture process and seismic wave propagation in 3D heterogeneous Earth. Geological faults comprise both large-scale segmentation and small-scale roughness that determine the dynamics of the earthquake rupture process and its radiated seismic wavefield. We investigate how different parameterizations of fractal fault roughness affect the rupture evolution and resulting near-fault ground motions. Rupture incoherence induced by fault roughness generates realistic ω-2 decay for high-frequency displacement amplitude spectra. Waveform characteristics and GMPE-based comparisons corroborate that these rough-fault rupture simulations generate realistic synthetic seismogram for subsequent engineering application. Since dynamic rupture simulations are computationally expensive, we develop kinematic approximations that emulate the observed dynamics. Simplifying the rough-fault geometry, we find that perturbations in local moment tensor orientation are important, while perturbations in local source location are not. Thus, a planar fault can be assumed if the local strike, dip, and rake are maintained. The dynamic rake angle variations are anti-correlated with local dip angles. Based on a dynamically consistent Yoffe source-time function, we show that the seismic wavefield of the approximated kinematic rupture well reproduces the seismic radiation of the full dynamic source process. Our findings provide an innovative pseudo-dynamic source characterization that captures fault roughness effects on rupture dynamics. Including the correlations between kinematic source parameters, we present a new pseudo-dynamic rupture modeling approach for computing broadband ground-motion time-histories for simulation-based PSHA

  7. Seismic performance assessment of base-isolated safety-related nuclear structures

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2010-01-01

    Seismic or base isolation is a proven technology for reducing the effects of earthquake shaking on buildings, bridges and infrastructure. The benefit of base isolation has been presented in terms of reduced accelerations and drifts on superstructure components but never quantified in terms of either a percentage reduction in seismic loss (or percentage increase in safety) or the probability of an unacceptable performance. Herein, we quantify the benefits of base isolation in terms of increased safety (or smaller loss) by comparing the safety of a sample conventional and base-isolated nuclear power plant (NPP) located in the Eastern U.S. Scenario- and time-based assessments are performed using a new methodology. Three base isolation systems are considered, namely, (1) Friction Pendulum™ bearings, (2) lead-rubber bearings and (3) low-damping rubber bearings together with linear viscous dampers. Unacceptable performance is defined by the failure of key secondary systems because these systems represent much of the investment in a new build power plant and ensure the safe operation of the plant. For the scenario-based assessments, the probability of unacceptable performance is computed for an earthquake with a magnitude of 5.3 at a distance of 7.5 km from the plant. For the time-based assessments, the annual frequency of unacceptable performance is computed considering all potential earthquakes that may occur. For both assessments, the implementation of base isolation reduces the probability of unacceptable performance by approximately four orders of magnitude for the same NPP superstructure and secondary systems. The increase in NPP construction cost associated with the installation of seismic isolators can be offset by substantially reducing the required seismic strength of secondary components and systems and potentially eliminating the need to seismically qualify many secondary components and systems. © 2010 John Wiley & Sons, Ltd.
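
    The time-based assessment described above amounts to integrating a component fragility against a seismic hazard curve. The sketch below illustrates that computation with an assumed power-law hazard curve and an assumed lognormal fragility; the numbers are purely illustrative and are not those of the study:

```python
import numpy as np
from scipy.stats import lognorm

# Assumed hazard curve: annual frequency of exceedance vs. spectral acceleration (g).
sa = np.logspace(-2, 1, 200)                       # 0.01 g to 10 g
haz = 1e-3 * (sa / 0.1) ** -2.2                    # hypothetical power-law hazard

# Assumed lognormal fragility of the key secondary systems (median capacity, dispersion).
median, beta = 1.2, 0.45
p_fail = lognorm(s=beta, scale=median).cdf(sa)

# Annual frequency of unacceptable performance: integrate fragility against |dH/dSa|.
dhaz = -np.gradient(haz, sa)
annual_freq = np.trapz(p_fail * dhaz, sa)
print(f"Annual frequency of unacceptable performance ~ {annual_freq:.2e} per year")
```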

  8. Seismic source inversion using Green's reciprocity and a 3-D structural model for the Japanese Islands

    NASA Astrophysics Data System (ADS)

    Simutė, S.; Fichtner, A.

    2015-12-01

    We present a feasibility study for seismic source inversions using a 3-D velocity model for the Japanese Islands. The approach involves numerically calculating 3-D Green's tensors, which is made efficient by exploiting Green's reciprocity. The rationale for 3-D seismic source inversion has several aspects. For structurally complex regions, such as the Japan area, it is necessary to account for 3-D Earth heterogeneities to prevent unknown structure polluting source solutions. In addition, earthquake source characterisation can serve as a means to delineate existing faults. Source parameters obtained for more realistic Earth models can then facilitate improvements in seismic tomography and early warning systems, which are particularly important for seismically active areas, such as Japan. We have created a database of numerically computed 3-D Green's reciprocals for a 40°× 40°× 600 km size area around the Japanese Archipelago for >150 broadband stations. For this we used a regional 3-D velocity model, recently obtained from full waveform inversion. The model includes attenuation and radial anisotropy and explains seismic waveform data for periods between 10 - 80 s generally well. The aim is to perform source inversions using the database of 3-D Green's tensors. As preliminary steps, we present initial concepts to address issues that are at the basis of our approach. We first investigate to which extent Green's reciprocity works in a discrete domain. Considering substantial amounts of computed Green's tensors we address storage requirements and file formatting. We discuss the importance of the initial source model, as an intelligent choice can substantially reduce the search volume. Possibilities to perform a Bayesian inversion and ways to move to finite source inversion are also explored.
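
    Because displacement is linear in the moment tensor, a point-source inversion built on a database of precomputed Green's tensors reduces, for a fixed location and origin time, to linear least squares. The sketch below illustrates only that reduction, with placeholder arrays standing in for the Green's tensor database and the observed waveforms:

```python
import numpy as np

# For each of the 6 independent moment-tensor components, the precomputed Green's
# functions give the waveform it would produce at every station (hypothetical shapes).
n_samples, n_stations, n_mt = 2000, 150, 6
G = np.random.randn(n_samples * n_stations, n_mt)   # stands in for the database lookup
d = np.random.randn(n_samples * n_stations)         # observed waveforms, flattened

# Point-source inversion for a fixed location and origin time: linear least squares.
m, *_ = np.linalg.lstsq(G, d, rcond=None)
print("Moment-tensor components (Mxx, Myy, Mzz, Mxy, Mxz, Myz):", m)
```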

  9. Seismic Window Selection and Misfit Measurements for Global Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Lei, W.; Bozdag, E.; Lefebvre, M.; Podhorszki, N.; Smith, J. A.; Tromp, J.

    2013-12-01

    Global Adjoint Tomography requires fast parallel processing of large datasets. After obtaining the preprocessed observed and synthetic seismograms, we use the open-source software packages FLEXWIN (Maggi et al. 2007) to select time windows and MEASURE_ADJ to make measurements. These measurements define adjoint sources for data assimilation. Previous versions of these tools work on a pair of SAC files (observed and synthetic seismic data for the same component and station) and loop over all seismic records associated with one earthquake. Given the large number of stations and earthquakes, the frequent read and write operations create severe I/O bottlenecks on modern computing platforms. We present new versions of these tools utilizing a new seismic data format, namely the Adaptive Seismic Data Format (ASDF). This new format shows superior scalability for applications on high-performance computers and accommodates various types of data, including earthquake, industry and seismic interferometry datasets. ASDF also provides user-friendly APIs, which can be easily integrated into the adjoint tomography workflow and combined with other data processing tools. In addition to solving the I/O bottleneck, we are making several improvements to these tools. For example, FLEXWIN is tuned to select windows for different types of earthquakes. To capture their distinct features, we categorize earthquakes by their depths and frequency bands. Moreover, instead of only picking phases between the first P arrival and the surface-wave arrivals, our aim is to select and assimilate many other later prominent phases in adjoint tomography. For example, in the body-wave band (17 s - 60 s), we include SKS, sSKS and their multiples, while in the surface-wave band (60 s - 120 s) we incorporate major-arc surface waves.
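
    One of the basic measurements made inside a selected window is a cross-correlation traveltime anomaly. The sketch below shows that type of measurement on a pair of hypothetical windowed traces; it is an illustration of the measurement, not the MEASURE_ADJ code:

```python
import numpy as np

def cc_time_shift(obs, syn, dt):
    """Traveltime anomaly from the cross-correlation maximum within one selected window."""
    cc = np.correlate(obs, syn, mode="full")
    lag = np.argmax(cc) - (len(syn) - 1)
    return lag * dt

# Hypothetical windowed records: a synthetic pulse and an observed arrival 0.8 s later.
dt = 0.05
t = np.arange(0, 60, dt)
syn = np.exp(-((t - 30.0) / 2.0) ** 2)
obs = np.exp(-((t - 30.8) / 2.0) ** 2)
print(f"dT = {cc_time_shift(obs, syn, dt):+.2f} s")   # ~ +0.80 s
```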

  10. Stochastic Seismic Inversion and Migration for Offshore Site Investigation in the Northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Son, J.; Medina-Cetina, Z.

    2017-12-01

    We compare deterministic and stochastic optimization approaches to the nonlinear geophysical full-waveform inverse problem, based on seismic survey data from Mississippi Canyon in the Northern Gulf of Mexico. Because subsea engineering and offshore construction projects require reliable ground models from site investigations, the primary goal of this study is to reconstruct accurate subsurface information on the soil and rock profiles under the seafloor. The shallow sediment layers are naturally heterogeneous formations that may cause marine landslides or foundation failures of underwater infrastructure. We chose quasi-Newton and simulated annealing as the deterministic and stochastic optimization algorithms, respectively. Seismic forward modeling based on the finite-difference method with absorbing boundary conditions drives the iterative simulations in the inverse modeling. We briefly report on numerical experiments using a synthetic data set as an offshore ground model that contains shallow artificial target profiles of geomaterials under the seafloor. We apply seismic migration processing and generate a Voronoi tessellation in the two-dimensional space domain to improve the computational efficiency of the stratigraphic velocity model reconstruction. We then report on the details of a field data implementation, which shows the complex geologic structures of the Northern Gulf of Mexico. Lastly, we compare the new inverted image of subsurface site profiles in the space domain with the previously processed seismic image in the time domain at the same location. Overall, stochastic optimization for seismic inversion with migration and Voronoi tessellation shows significant promise for improving the subsurface imaging of ground models and the computational efficiency required for full-waveform inversion. We anticipate that improving the inversion of the shallow layers from geophysical data will better support offshore site investigation.
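
    The stochastic branch of the comparison rests on simulated annealing. The sketch below shows the Metropolis acceptance rule and cooling loop on a toy layered-velocity misfit; in the actual workflow the misfit function would run the finite-difference forward solver, and every parameter here is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def misfit(v):
    """Placeholder data misfit; in the real workflow this would run the finite-difference
    forward model with absorbing boundaries and compare synthetics with the shot gathers."""
    target = np.array([1.6, 1.9, 2.4, 3.1])          # hypothetical 'true' layer velocities (km/s)
    return np.sum((v - target) ** 2)

v = np.full(4, 2.0)                                  # initial layered model
T, cooling = 1.0, 0.995
best_v, best_m = v.copy(), misfit(v)

for _ in range(20000):
    trial = v + rng.normal(scale=0.05, size=v.size)  # random perturbation of layer velocities
    dm = misfit(trial) - misfit(v)
    if dm < 0 or rng.random() < np.exp(-dm / T):     # Metropolis acceptance rule
        v = trial
        if misfit(v) < best_m:
            best_v, best_m = v.copy(), misfit(v)
    T *= cooling                                     # geometric cooling schedule

print(best_v, best_m)
```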

  11. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (Central Processing Unit) and GPUs (Graphics Processing Unit) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively-parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware including faster CPU's, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system. Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic speedup in execution time. NRM is sufficiently generic to support applications in any domain, as long as the application is parallelizable (i.e., can be subdivided into multiple individual processing tasks). At present, NRM has been effective in decreasing the overall runtime of several algorithms: 1) the generation of a global 3D model of the compressional velocity distribution in the Earth using tomographic inversion, 2) the calculation of the model resolution matrix, model covariance matrix, and travel time uncertainty for the aforementioned velocity model, and 3) the correlation of waveforms with archival data on a massive scale for seismic event detection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
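
    NRM and JPPF are Java frameworks, and their APIs are not reproduced here; the sketch below only illustrates the underlying idea of decomposing an embarrassingly parallel job (waveform-template correlation, one of the cited use cases) into independent tasks spread across local cores, using Python's standard multiprocessing module and hypothetical data sizes:

```python
from multiprocessing import Pool
import numpy as np

def correlate_pair(args):
    """One independent task: correlate a continuous waveform with one archival template."""
    waveform, template = args
    c = np.correlate(waveform, template, mode="valid")
    return float(np.max(np.abs(c)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    waveform = rng.standard_normal(3600 * 20)                 # hypothetical hour of 20 Hz data
    templates = [rng.standard_normal(60 * 20) for _ in range(32)]
    tasks = [(waveform, t) for t in templates]
    with Pool() as pool:                                      # one worker per local core
        peaks = pool.map(correlate_pair, tasks)               # tasks run in parallel
    print(max(peaks))
```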

  12. Computer Aided Creativity.

    ERIC Educational Resources Information Center

    Proctor, Tony

    1988-01-01

    Explores the conceptual components of a computer program designed to enhance creative thinking and reviews software that aims to stimulate creative thinking. Discusses BRAIN and ORACLE, programs intended to aid in creative problem solving. (JOW)

  13. Prerequisites for Computer-Aided Cognitive Rehabilitation.

    ERIC Educational Resources Information Center

    Legrand, Colette

    1989-01-01

    This paper describes computer-aided cognitive rehabilitation for mentally deficient persons. It lists motor, cognitive, emotional, and educational prerequisites to such rehabilitation and states advantages and disadvantages in using the prerequisites. (JDD)

  14. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
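
    A minimal sketch of the preprocessing chain described above (time-frequency distribution, binary representation, magnitude of the 2-D FFT), using a spectrogram as the time-frequency distribution and an assumed fixed threshold; the SONN classification stage is not included:

```python
import numpy as np
from scipy.signal import spectrogram

def sonn_input(trace, fs, threshold_db=-30.0):
    """Time-frequency distribution -> binary image -> |2-D FFT| (shift-invariant input)."""
    f, t, S = spectrogram(trace, fs=fs, nperseg=128, noverlap=96)
    S_db = 10.0 * np.log10(S / S.max() + 1e-12)
    binary = (S_db > threshold_db).astype(float)     # assumed fixed threshold
    return np.abs(np.fft.fft2(binary))               # magnitude discards time/frequency shifts

# Hypothetical event: a short 8 Hz transient buried in noise, sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
trace = np.random.randn(t.size) + 5 * np.exp(-((t - 30) / 2) ** 2) * np.sin(2 * np.pi * 8 * t)
features = sonn_input(trace, fs)
print(features.shape)
```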

  15. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Chandy, M.; Krause, A.

    2010-12-01

    In collaboration with computer science and earthquake engineering, we are developing a dense network of low-cost accelerometers that send their data via the Internet to a cloud-based center. The goal is to make block-by-block measurements of ground shaking in urban areas, which will provide emergency response information in the case of large earthquakes, and an unprecedented high-frequency seismic array to study structure and the earthquake process with moderate shaking. When deployed in high-rise buildings they can be used to monitor the state of health of the structure. The sensors are capable of a resolution of approximately 80 micro-g, connect via USB ports to desktop computers, and cost about $100 each. The network will adapt to its environment by using network-wide machine learning to adjust the picking sensitivity. We are also looking into using other motion sensing devices such as cell phones. For a pilot project, we plan to deploy more than 1000 sensors in the greater Pasadena area. The system is easily adaptable to other seismically vulnerable urban areas.

  16. Cloud Computing Services for Seismic Networks

    NASA Astrophysics Data System (ADS)

    Olson, Michael

    This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. The thesis describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed by the framework are (1) the CSN---the Community Seismic Network---which uses relatively low-cost sensors deployed by members of the community, and (2) SAF---the Situation Awareness Framework---which integrates data from multiple sources, including the CSN, CISN---the California Integrated Seismic Network, a network consisting of high-quality seismometers deployed carefully by professionals in the CISN organization and spread across Southern California---and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust and radiation sensors.

  17. Regional seismic wavefield computation on a 3-D heterogeneous Earth model by means of coupled traveling wave synthesis

    USGS Publications Warehouse

    Pollitz, F.F.

    2002-01-01

    I present a new algorithm for calculating seismic wave propagation through a three-dimensional heterogeneous medium using the framework of mode coupling theory originally developed to perform very low frequency (f < ~0.01-0.05 Hz) seismic wavefield computation. It is a Green's function approach for multiple scattering within a defined volume and employs a truncated traveling wave basis set using the locked mode approximation. Interactions between incident and scattered wavefields are prescribed by mode coupling theory and account for the coupling among surface waves, body waves, and evanescent waves. The described algorithm is, in principle, applicable to global and regional wave propagation problems, but I focus on higher frequency (typically f ≲ 0.25 Hz) applications at regional and local distances where the locked mode approximation is best utilized and which involve wavefields strongly shaped by propagation through a highly heterogeneous crust. Synthetic examples are shown for P-SV-wave propagation through a semi-ellipsoidal basin and SH-wave propagation through a fault zone.

  18. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.

  19. Micro and Mainframe Computer Models for Improved Planning in Awarding Financial Aid to Disadvantaged Students.

    ERIC Educational Resources Information Center

    Attinasi, Louis C., Jr.; Fenske, Robert H.

    1988-01-01

    Two computer models used at Arizona State University recognize the tendency of students from low-income and minority backgrounds to apply for assistance late in the funding cycle. They permit administrators to project the amount of aid needed by such students. The Financial Aid Computerized Tracking System is described. (Author/MLW)

  20. A Computer-Based, Interactive Videodisc Job Aid and Expert System for Electron Beam Lithography Integration and Diagnostic Procedures.

    ERIC Educational Resources Information Center

    Stevenson, Kimberly

    This master's thesis describes the development of an expert system and interactive videodisc computer-based instructional job aid used for assisting in the integration of electron beam lithography devices. Comparable to all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…

  1. Cluster Computing For Real Time Seismic Array Analysis.

    NASA Astrophysics Data System (ADS)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied to the investigation of seismic sources on volcanoes, where they can be successfully used for studying volcanic microtremor and long-period events, which are critical for obtaining information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute, in near real time, the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III dual-processor PCs working at 550 MHz and has 4 gigabytes of RAM. It runs under the Linux operating system. The analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment, an open-source implementation of the Message Passing Interface (MPI). The software system includes modules devoted to receiving data over the Internet and graphical applications for continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real-time continuous acquisition system was simulated by a program that reads data from disk files and sends them to a remote host using Internet protocols.
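
    A minimal narrowband MUSIC sketch for slowness estimation on a small array, to illustrate the kind of computation the cluster parallelizes; the array geometry, frequency, noise level, and single-source assumption are all hypothetical, and the production code is the authors' Fortran/MPI implementation, not this Python snippet:

```python
import numpy as np

def music_slowness(X, coords, freq, s_grid, n_sources=1):
    """Narrowband MUSIC: scan a 2-D slowness grid with the noise-subspace projector."""
    R = X @ X.conj().T / X.shape[1]                  # array covariance (stations x snapshots)
    w, V = np.linalg.eigh(R)
    En = V[:, : X.shape[0] - n_sources]              # noise subspace (smallest eigenvalues)
    P = np.empty((len(s_grid), len(s_grid)))
    for i, sx in enumerate(s_grid):
        for j, sy in enumerate(s_grid):
            delay = coords @ np.array([sx, sy])      # plane-wave delays for this slowness
            a = np.exp(-2j * np.pi * freq * delay)   # steering vector
            P[i, j] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return P                                         # the peak gives the slowness vector

# Hypothetical 5-station array (km) and a synthetic 1 Hz plane wave with slowness (0.3, 0.1) s/km.
coords = np.array([[0, 0], [1.0, 0], [0, 1.0], [-1.0, 0], [0, -1.0]])
rng = np.random.default_rng(2)
freq, s_true = 1.0, np.array([0.3, 0.1])
delays = coords @ s_true
X = np.exp(-2j * np.pi * freq * delays)[:, None] * np.exp(2j * np.pi * rng.random((1, 50)))
X += 0.1 * (rng.standard_normal((5, 50)) + 1j * rng.standard_normal((5, 50)))
s_grid = np.linspace(-0.5, 0.5, 101)
P = music_slowness(X, coords, freq, s_grid)
print(s_grid[np.argmax(P.max(axis=1))], s_grid[np.argmax(P.max(axis=0))])   # ~0.3, ~0.1
```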

  2. Seismic imaging: From classical to adjoint tomography

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Gu, Y. J.

    2012-09-01

    Seismic tomography has been a vital tool in probing the Earth's internal structure and enhancing our knowledge of dynamical processes in the Earth's crust and mantle. While various tomographic techniques differ in data types utilized (e.g., body vs. surface waves), data sensitivity (ray vs. finite-frequency approximations), and choices of model parameterization and regularization, most global mantle tomographic models agree well at long wavelengths, owing to the presence and typical dimensions of cold subducted oceanic lithospheres and hot, ascending mantle plumes (e.g., in central Pacific and Africa). Structures at relatively small length scales remain controversial, though, as will be discussed in this paper, they are becoming increasingly resolvable with the fast expanding global and regional seismic networks and improved forward modeling and inversion techniques. This review paper aims to provide an overview of classical tomography methods, key debates pertaining to the resolution of mantle tomographic models, as well as to highlight recent theoretical and computational advances in forward-modeling methods that spearheaded the developments in accurate computation of sensitivity kernels and adjoint tomography. The first part of the paper is devoted to traditional traveltime and waveform tomography. While these approaches established a firm foundation for global and regional seismic tomography, data coverage and the use of approximate sensitivity kernels remained as key limiting factors in the resolution of the targeted structures. In comparison to classical tomography, adjoint tomography takes advantage of full 3D numerical simulations in forward modeling and, in many ways, revolutionizes the seismic imaging of heterogeneous structures with strong velocity contrasts. For this reason, this review provides details of the implementation, resolution and potential challenges of adjoint tomography. Further discussions of techniques that are presently popular in seismic array analysis, such as noise correlation functions, receiver functions, inverse scattering imaging, and the adaptation of adjoint tomography to these different datasets highlight the promising future of seismic tomography.

  3. In-situ Planetary Subsurface Imaging System

    NASA Astrophysics Data System (ADS)

    Song, W.; Weber, R. C.; Dimech, J. L.; Kedar, S.; Neal, C. R.; Siegler, M.

    2017-12-01

    Geophysical and seismic instruments are considered the most effective tools for studying the detailed global structures of planetary interiors. A planet's interior bears the geochemical markers of its evolutionary history, as well as its present state of activity, which has direct implications to habitability. On Earth, subsurface imaging often involves massive data collection from hundreds to thousands of geophysical sensors (seismic, acoustic, etc) followed by transfer by hard links or wirelessly to a central location for post processing and computing, which will not be possible in planetary environments due to imposed mission constraints on mass, power, and bandwidth. Emerging opportunities for geophysical exploration of the solar system from Venus to the icy Ocean Worlds of Jupiter and Saturn dictate that subsurface imaging of the deep interior will require substantial data reduction and processing in-situ. The Real-time In-situ Subsurface Imaging (RISI) technology is a mesh network that senses and processes geophysical signals. Instead of data collection then post processing, the mesh network performs the distributed data processing and computing in-situ, and generates an evolving 3D subsurface image in real-time that can be transmitted under bandwidth and resource constraints. Seismic imaging algorithms (including traveltime tomography, ambient noise imaging, and microseismic imaging) have been successfully developed and validated using both synthetic and real-world terrestrial seismic data sets. The prototype hardware system has been implemented and can be extended as a general field instrumentation platform tailored specifically for a wide variety of planetary uses, including crustal mapping, ice and ocean structure, and geothermal systems. The team is applying the RISI technology to real off-world seismic datasets. For example, the Lunar Seismic Profiling Experiment (LSPE) deployed during the Apollo 17 Moon mission consisted of four geophone instruments spaced up to 100 meters apart, which in essence forms a small aperture seismic network. A pattern recognition technique based on Hidden Markov Models was able to characterize this dataset, and we are exploring how the RISI technology can be adapted for this dataset.

  4. Geophysical Analysis of Major Geothermal Anomalies in Romania

    NASA Astrophysics Data System (ADS)

    Panea, Ionelia; Mocanu, Victor

    2017-11-01

    The Romanian segment of the Eastern Pannonian Basin and the Moesian Platform are known for their geothermal and hydrocarbon-bearing structures. We used seismic, gravity, and geothermal data to analyze the geothermal behavior in the Oradea and Timisoara areas, from the Romanian segment of Eastern Pannonian Basin, and the Craiova-Bals-Optasi area, from the Moesian Platform. We processed 22 seismic reflection data sets recorded in the Oradea and Timisoara areas to obtain P-wave velocity distributions and time seismic sections. The P-wave velocity distributions correlate well with the structural trends observed along the seismic lines. We observed a good correlation between the high areas of crystalline basement seen on the time seismic sections and the high heat flow and gravity-anomaly values. For the Craiova-Bals-Optasi area, we computed a three-dimensional (3D) temperature model using calculated and measured temperature and geothermal gradient values in wells with an irregular distribution on the territory. The high temperatures from the Craiova-Bals-Optasi area correlate very well with the uplifted basement blocks seen on the time seismic sections and high gravity-anomaly values.

  5. Kinematics of the New Madrid seismic zone, central United States, based on stepover models

    USGS Publications Warehouse

    Pratt, Thomas L.

    2012-01-01

    Seismicity in the New Madrid seismic zone (NMSZ) of the central United States is generally attributed to a stepover structure in which the Reelfoot thrust fault transfers slip between parallel strike-slip faults. However, some arms of the seismic zone do not fit this simple model. Comparison of the NMSZ with an analog sandbox model of a restraining stepover structure explains all of the arms of seismicity as only part of the extensive pattern of faults that characterizes stepover structures. Computer models show that the stepover structure may form because differences in the trends of lower crustal shearing and inherited upper crustal faults make a step between en echelon fault segments the easiest path for slip in the upper crust. The models predict that the modern seismicity occurs only on a subset of the faults in the New Madrid stepover structure, that only the southern part of the stepover structure ruptured in the A.D. 1811–1812 earthquakes, and that the stepover formed because the trends of older faults are not the same as the current direction of shearing.

  6. Fast 3D elastic micro-seismic source location using new GPU features

    NASA Astrophysics Data System (ADS)

    Xue, Qingfeng; Wang, Yibo; Chang, Xu

    2016-12-01

    In this paper, we describe new GPU features and their application to passive seismic (micro-seismic) location. Locating micro-seismic events is important in seismic exploration, especially when searching for unconventional oil and gas resources. Unlike traditional ray-based methods, wave-equation methods, such as the one used in this paper, have a remarkable advantage in adapting to low signal-to-noise-ratio conditions and do not require manual data selection. However, because of their high computational cost, these methods are not widely used in industry. To make the method practical, we implement imaging-like wave-equation micro-seismic location in a 3D elastic medium and use GPUs to accelerate our algorithm. We also introduce some new GPU features into the implementation to address data-transfer and GPU-utilization problems. Numerical and field-data experiments show that our method achieves a performance improvement of more than 30% in the GPU implementation simply by using these new features.

  7. Use of Multibeam-Bathymetry and Seismic-Reflection Data to Investigate the Origin of Seafloor Depressions Along the Southeastern Carbonate Florida Platform

    NASA Astrophysics Data System (ADS)

    Cunningham, K. J.; Kluesner, J.; Westcott, R. L.; Ebuna, D. R.; Walker, C.

    2016-12-01

    Numerous large, semicircular, deep submarine depressions on the seafloor of the Miami Terrace (a bathymetric bench that interrupts the Atlantic continental slope on the southeastern carbonate Florida Platform) have been described as submarine sinkholes resulting from freshwater discharge at the seafloor and dissolution of carbonate rock. Multibeam-bathymetry and marine, high-resolution, multichannel 2D and 3D seismic-reflection data acquired over two of these depressions at water depths of about 250 m ("Miami sinkhole") and 336 m ("Key Biscayne sinkhole") indicate the depressions are pockmarks. Seafloor pockmarks are concave, crater-like depressions that form through the outburst or venting of fluid (gas, liquid) at the sea floor and are important seabed features that provide information about fluid flow on continental margins. Both the "Miami sinkhole" and "Key Biscayne sinkhole" (about 25 and 48m deep, respectively) have a seismic-chimney structure beneath them that indicates an origin related to seafloor fluid expulsion, as supported by multi-attribute analysis of the "Key Biscayne sinkhole". Further, there is no widening of the depressions with depth, as in the Fort Worth Basin, where downward widening of seismic, sub-circular, karst-collapse structures is common. However, hypogenic karst dissolution is not ruled out as part of the evolution of the two depressions. Indeed, a hypogenic karst pipe plausibly extends downward from the bottom of "Key Biscayne sinkhole", providing a passageway for focused upward flow of fluids to the seafloor. In "Key Biscayne sinkhole", the proposed karst pipe occurs above the underlying seismic chimney that contains flat bright spots (a hydrocarbon indicator) in the seismic data plausibly showing fluids are currently trapped beneath the pockmark within a tightly folded popup structure. The Miami Terrace depressions have seismic-reflection features similar to modern pockmarks imaged on the Maldives carbonate platform. The seismic-reflection data also show that ancient satellite expulsions formed buried pockmarks, slumps, and paleo-collapse structures in the carbonate sediments near the "Key Biscayne sinkhole". Additional processing of the 3D seismic data will aid in elucidation of the origin of these seafloor depressions.

  8. Seismic-Reflection Technology Defines Potential Vertical Bypass in Hydrogeologic Confinement within Tertiary Carbonates of the Southeastern Florida Platform

    NASA Astrophysics Data System (ADS)

    Cunningham, K. J.; Walker, C.; Westcott, R. L.

    2011-12-01

    Continuous improvements in shallow-focused, high-resolution, marine seismic-reflection technology has provided the opportunity to evaluate geologic structures that breach confining units of the Floridan aquifer system within the southeastern Florida Platform. The Floridan aquifer system is comprised mostly of Tertiary platform carbonates. In southeastern Florida, hydrogeologic confinement is important to sustainable use of the Floridan aquifer system, where the saline lower part is used for injection of wastewater and the brackish upper part is an alternative source of drinking water. Between 2007 and 2011, approximately 275 km of 24- and 48-channel seismic-reflection profiles were acquired in canals of peninsular southeastern Florida, Biscayne Bay, present-day Florida shelf margin, and the deeply submerged Miami Terrace. Vertical to steeply dipping offsets in seismic reflections indicate faults, which range from Eocene to possible early Pliocene age. Most faults are associated with karst collapse structures; however, a few tectonic faults of early Miocene to early Pliocene age are present. The faults may serve as a pathway for vertical groundwater flow across relatively low-permeability carbonate strata that separate zones of regionally extensive high-permeability in the Floridan aquifer system. The faults may collectively produce a regional confinement bypass system. In early 2011, twenty seismic-reflection profiles were acquired near the Key Biscayne submarine sinkhole located on the seafloor of the Miami Terrace. Here the water depth is about 365 m. A steeply dipping (eastward) zone of mostly deteriorated quality of seismic-reflection data underlies the sinkhole. Correlation of coherent seismic reflections within and adjacent to the disturbed zone indicates a series of faults occur within the zone. It is hypothesized that upward movement of groundwater within the zone contributed to development of a hypogenic karst system and the resultant overlying sinkhole. Study of this modern seafloor sinkhole may provide clues to the genesis of the more deeply buried Tertiary karst collapse structures. Three-dimensional geomodeling of the seismic-reflection data from the Key Biscayne sinkhole further aids visualization of the seismic stratigraphy and structural system that underlies the sinkhole.

  9. Incorporation of CAD/CAM Restoration Into Navy Dentistry

    DTIC Science & Technology

    2017-09-26

    CAD/CAM Computer-aided design /Computer-assisted manufacturing CDT Common Dental Terminology DENCAS Dental Common Access System DTF Dental...to reduce avoidable dental emergencies for deployed sailors and marines. Dental Computer-aided design /Computer-assisted manufacturing (CAD/CAM...report will review and evaluate the placement rate by Navy dentists of digitally fabricated in-office ceramic restorations compared to traditional direct

  10. The Role of Computer-Aided Instruction in Science Courses and the Relevant Misconceptions of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Aksakalli, Ayhan; Turgut, Umit; Salar, Riza

    2016-01-01

    This research aims to investigate the ways in which pre-service physics teachers interact with computers, which, as an indispensable means of today's technology, are of major value in education and training, and to identify any misconceptions said teachers may have about computer-aided instruction. As part of the study, computer-based physics…

  11. Student Performance in Computer-Assisted Instruction in Programming.

    ERIC Educational Resources Information Center

    Friend, Jamesine E.; And Others

    A computer-assisted instructional system to teach college students the computer language, AID (Algebraic Interpretive Dialogue), two control programs, and data collected by the two control programs are described. It was found that although first response errors were often those of AID syntax, such errors were easily corrected. Secondly, while…

  12. Cooperation Support in Computer-Aided Authoring and Learning.

    ERIC Educational Resources Information Center

    Muhlhauser, Max; Rudebusch, Tom

    This paper discusses the use of Computer Supported Cooperative Work (CSCW) techniques for computer-aided learning (CAL); the work was started in the context of project Nestor, a joint effort of German universities about cooperative multimedia authoring/learning environments. There are four major categories of cooperation for CAL: author/author,…

  13. Computer-Assisted Instruction: One Aid for Teachers of Reading.

    ERIC Educational Resources Information Center

    Rauch, Margaret; Samojeden, Elizabeth

    Computer assisted instruction (CAI), an instructional system with direct interaction between the student and the computer, can be a valuable aid for presenting new concepts, for reinforcing of selective skills, and for individualizing instruction. The advantages CAI provides include self-paced learning, more efficient allocation of classroom time,…

  14. Supercomputing resources empowering superstack with interactive and integrated systems

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2012-09-01

    This paper presents results from the development and implementation of Superstack algorithms to be used dynamically with integrated systems and supercomputing resources. Processing of geophysical data, here called geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms and their practical application on modern computing architectures were inspired by developments in the processing of seismic data, first on mainframes and, in recent years, in high-end scientific computing applications. Several stacking algorithms are known, but for seismic data with a low signal-to-noise ratio, iterative algorithms such as the Superstack can support analysis and interpretation. The new Superstack algorithms are used with wave theory and optical phenomena on highly performant computing resources, for huge data sets as well as for sophisticated application scenarios in the geosciences and archaeology.
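
    The Superstack formulation itself is not reproduced here; the sketch below illustrates only the general idea of an iterative, re-weighted stack that emphasizes traces consistent with the emerging stack, which is the property that helps at low signal-to-noise ratio. The weighting scheme, the toy gather, and all parameters are assumptions:

```python
import numpy as np

def iterative_stack(traces, n_iter=5, eps=1e-12):
    """Iteratively re-weighted stack: traces that agree with the current stack gain weight."""
    stack = traces.mean(axis=0)
    for _ in range(n_iter):
        # Semblance-like weight per trace: correlation with the current stack estimate.
        w = np.array([np.dot(tr, stack) / (np.linalg.norm(tr) * np.linalg.norm(stack) + eps)
                      for tr in traces])
        w = np.clip(w, 0.0, None)                    # ignore anti-correlated traces
        stack = (w[:, None] * traces).sum(axis=0) / (w.sum() + eps)
    return stack

# Hypothetical gather: 40 noisy copies of the same wavelet.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
wavelet = np.exp(-((t - 0.5) / 0.02) ** 2)
gather = wavelet + 2.0 * rng.standard_normal((40, t.size))
print(np.corrcoef(iterative_stack(gather), wavelet)[0, 1])
```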

  15. Real-time seismic monitoring and functionality assessment of a building

    USGS Publications Warehouse

    Celebi, M.; ,

    2005-01-01

    This paper presents recent developments and approaches (using GPS technology and real-time double-integration) to obtain displacements and, in turn, drift ratios, in real-time or near real-time to meet the needs of the engineering and user community in seismic monitoring and assessing the functionality and damage condition of structures. Drift ratios computed in near real-time allow technical assessment of the damage condition of a building. Relevant parameters, such as the type of connections and story structural characteristics (including geometry) are used in computing drifts corresponding to several pre-selected threshold stages of damage. Thus, drift ratios determined from real-time monitoring can be compared to pre-computed threshold drift ratios. The approaches described herein can be used for performance evaluation of structures and can be considered as building health-monitoring applications.
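
    The drift-ratio comparison described above can be illustrated with a minimal Python sketch; the threshold values and function names below are hypothetical placeholders, not values from the paper.

      import numpy as np

      # Hypothetical threshold drift ratios for pre-selected damage stages
      # (placeholder values, not taken from the publication).
      THRESHOLDS = {"minor": 0.002, "moderate": 0.005, "severe": 0.010}

      def drift_ratios(disp_upper, disp_lower, story_height_m):
          """Interstory drift ratio time series from two displacement records (m)."""
          return np.abs(np.asarray(disp_upper) - np.asarray(disp_lower)) / story_height_m

      def damage_stage(drift):
          """Compare the peak drift ratio against pre-computed thresholds."""
          peak = float(np.max(drift))
          stage = "none"
          for name, level in sorted(THRESHOLDS.items(), key=lambda kv: kv[1]):
              if peak >= level:
                  stage = name
          return peak, stage

      # Usage: peak, stage = damage_stage(drift_ratios(u_floor2, u_floor1, 3.5))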

  16. Low frequency amplification in deep alluvial basins: an example in the Po Plain (Northern Italy) and consequences for site specific SHA

    NASA Astrophysics Data System (ADS)

    Mascandola, Claudia; Massa, Marco; Barani, Simone; Lovati, Sara; Santulin, Marco

    2016-04-01

    This work deals with the problem of long-period seismic site amplification that may affect large, deep alluvial basins in the case of strong earthquakes. In particular, a case study from the Po Plain (Northern Italy), one of the largest and deepest sedimentary basins worldwide, is presented here. Although the study area shows a low annual seismicity rate with rare strong events (Mw > 6.0) and is characterized by low-to-medium seismic hazard, the seismic risk is significant because of the high density of civil and strategic infrastructure (i.e., a high degree of exposure) and the unfavourable geological conditions. The aim of this work is to provide general considerations about the seismic site response of the Po Plain, with particular attention to deep discontinuities (i.e., the geological bedrock), their potential for low-frequency amplification, and their impact on PSHA. The current results were obtained through active and passive geophysical investigations performed near Castelleone, where a seismic station belonging to the INGV (National Institute for Geophysics and Volcanology) National Seismic Network has been installed since 2009. In particular, the active analyses consisted of a MASW and a refraction survey, whereas the passive ones consisted of seismic ambient noise acquisitions with single stations and arrays of increasing aperture. The results in terms of noise HVSR indicate two main peaks, the first around 0.17 Hz and the second, as already noted in the recent literature, around 0.7 Hz. In order to correlate the amplified frequencies with the geological discontinuities, the array acquisitions were processed to obtain a shear-wave velocity profile, computed with a joint inversion considering the experimental dispersion curves and the HVSR results. The obtained velocity profile shows two main discontinuities: the shallower at ~165 m depth, which can be correlated with the seismic bedrock (i.e., Vs > 800 m/s), and the deeper at ~1350 m depth, which can be associated with the geological bedrock, given the transition between the Pliocene loose sediments and the Miocene marls observed in the available stratigraphy. Numerical 1D analyses, computed to obtain the theoretical transfer function at the site, support the correlation between the experimental amplification peak around 0.17 Hz and the hypothesized geological bedrock. In terms of site-specific SHA, the UHS expressed in displacement (MRP: 475 years) shows a significant increase if the seismic input is located at the geological bedrock (~1350 m) instead of the seismic bedrock (~165 m). Even though this increase is not relevant for the studied site, since the seismic hazard is low, it could be significant in other parts of the Po Plain, where the seismic hazard is medium-high. According to the HVSR results obtained for other available Po Plain broadband stations, the considerations of this work could represent a warning for future seismic hazard investigations in other areas of the basin.
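
    A minimal sketch of a single-window H/V spectral-ratio computation of the kind used for such noise measurements is given below; the windowing, smoothing, and function names used in the actual processing are not specified in the abstract and are assumptions here.

      import numpy as np

      def hvsr(north, east, vertical, fs, nfft=None):
          """Single-window H/V spectral ratio from three-component ambient noise.
          Returns frequencies and the ratio of the mean horizontal amplitude
          spectrum to the vertical amplitude spectrum."""
          n = nfft or len(vertical)
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          spec = lambda x: np.abs(np.fft.rfft(np.asarray(x) - np.mean(x), n))
          h = np.sqrt(0.5 * (spec(north) ** 2 + spec(east) ** 2))  # quadratic mean of horizontals
          v = spec(vertical) + 1e-12
          return freqs, h / v

      # In practice the ratio is averaged over many windows and smoothed
      # (e.g. Konno-Ohmachi) before peaks such as ~0.17 Hz and ~0.7 Hz are picked.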

  17. Improving Correlation Algorithms to Detect and Characterize Smaller Magnitude Induced Seismicity Swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, R.; Brudzinski, M.; Currie, B.

    2015-12-01

    Induced seismic sequences often occur as swarms that can include thousands of small (< M 2) earthquakes. While identification of this microseismicity would invariably aid the characterization and modeling of induced sequences, traditional earthquake detection techniques often provide incomplete catalogs, even when local networks are deployed. Because induced sequences often include scores of micro-earthquakes that precede larger magnitude events, identifying these small-magnitude events is crucial for the early recognition of induced sequences. By taking advantage of the repeating, swarm-like nature of induced seismicity, a more robust catalog can be created using complementary correlation algorithms in near real time, without reliance on traditional earthquake detection and association routines. Since traditional earthquake catalog methodologies using regional networks have a relatively high detection threshold (M 2+), we have sought to develop correlation routines that can detect smaller magnitude sequences. While short-term/long-term amplitude-average detection algorithms require a significant signal-to-noise ratio at multiple stations for confident identification, a correlation detector is capable of identifying earthquakes with high confidence using just a single station. The result is an embarrassingly parallel task that can be deployed across a network as an early warning system for potentially induced seismicity, while also characterizing tectonic sequences better than traditional methods allow.
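
    The single-station correlation detector described above can be sketched as a normalized cross-correlation scan; this is a generic illustration rather than the authors' code, and the threshold value is an assumption.

      import numpy as np

      def correlation_detector(template, continuous, threshold=0.7):
          """Normalized cross-correlation of one template against a continuous
          single-station record; returns sample indices where the correlation
          coefficient exceeds the threshold."""
          template = np.asarray(template, float)
          continuous = np.asarray(continuous, float)
          t = (template - template.mean()) / (template.std() + 1e-12)
          n = len(t)
          cc = np.empty(len(continuous) - n + 1)
          for i in range(len(cc)):          # plain loop for clarity; use FFT in production
              w = continuous[i:i + n]
              w = (w - w.mean()) / (w.std() + 1e-12)
              cc[i] = np.dot(t, w) / n
          return np.where(cc >= threshold)[0], cc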

  18. Internal friction quality-factor Q under confining pressure. [of lunar rocks

    NASA Technical Reports Server (NTRS)

    Tittmann, B. R.; Ahlberg, L.; Nadler, H.; Curnow, J.; Smith, T.; Cohen, E. R.

    1977-01-01

    It has been found in previous studies that small amounts of adsorbed volatiles can have a profound effect on the internal friction quality-factor Q of rocks and other porous media. Pandit and Tozer (1970) have suggested that the laboratory-measured Q of volatile-free rocks should be similar to the in situ seismic Q values of near-surface lunar rocks, which according to Latham et al. (1970) are in the range of 3000-5000. Observations of dramatic increases in Q with outgassing, up to values approaching 2000 in the seismic frequency range, confirm this supposition. Measurements under confining pressures with the sample encapsulated under hard vacuum are reported to aid in the interpretation of seismic data obtained below the lunar surface. In the experiments, Q values just under 2000 were achieved at about 1 kbar for a terrestrial analog of lunar basalt. It was found that a well-outgassed sample maintains a high Q, whereas one exposed to moisture maintains a low Q, as the confining pressure is raised to 2.5 kbar. This result suggests that volatiles can indeed affect Q when cracks are partially closed and that the high lunar seismic Q values reported are consistent with very dry rock down to depths of at least 50 km.

  19. Computer-aided design of high-frequency transistor amplifiers.

    NASA Technical Reports Server (NTRS)

    Hsieh, C.-C.; Chan, S.-P.

    1972-01-01

    A systematic step-by-step computer-aided procedure for designing high-frequency transistor amplifiers is described. The technique makes it possible to determine the optimum source impedance which gives a minimum noise figure.

  20. High-resolution seismicity catalog of Italian peninsula in the period 1981-2015

    NASA Astrophysics Data System (ADS)

    Michele, M.; Latorre, D.; Castello, B.; Di Stefano, R.; Chiaraluce, L.

    2017-12-01

    In order to provide an updated reference catalog of Italian seismicity, the absolute locations of the last 35 years (1981-2015) of seismic activity were computed with a three-dimensional VP and VS velocity model covering the whole Italian territory. The NonLinLoc code (Lomax et al., 2000), which is based on a probabilistic approach, was used to provide a complete and robust description of the uncertainties associated with the locations, corresponding to the hypocentral solutions with the highest probability density. Moreover, the code, which uses a finite-difference approximation of the eikonal equation (Podvin and Lecomte, 1991), can handle strongly contrasting velocity models in the arrival-time computation. To optimize the earthquake locations, we included station corrections in the inverse problem. For each year, the number of available earthquakes depends on both the network detection capability and the occurrence of major seismic sequences. The starting earthquake catalog was based on 2.6 million P and 1.9 million S arrival-time picks for 278,607 selected earthquakes, each recorded by at least 3 stations of the Italian seismic network. Compared to previous catalogs, which consist of hypocentral locations retrieved with linearized location methods, the new catalog shows a marked improvement, as indicated by the location parameters assessing the quality of the solutions (i.e., RMS, azimuthal gap, formal error on the horizontal and vertical components). In addition, we used the distance between the expected and the maximum-likelihood hypocenter locations to establish the unimodal (well-resolved location) or multimodal (poorly resolved location) character of the probability distribution. We used these parameters to classify the resulting locations into four classes (A, B, C and D), considering the simultaneous goodness of the previous parameters. The upper classes (A and B) include 65% of the relocated earthquakes, while the lowest class (D) includes only 7% of the seismicity. We present the new catalog, consisting of 272,847 events, showing some examples of earthquake locations related to the background seismicity as well as to small to large seismic sequences that occurred in Italy during the last 35 years.
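
    The four-class (A-D) quality scheme can be illustrated with a minimal sketch that screens each solution against limits on RMS, azimuthal gap, formal errors, and the expected-versus-maximum-likelihood distance; the specific limit values below are illustrative assumptions, not those used for the catalog.

      def quality_class(rms_s, gap_deg, herr_km, verr_km, dx_km,
                        limits=((0.3, 180, 1.0, 2.0, 2.0),
                                (0.5, 220, 2.0, 4.0, 5.0),
                                (0.8, 270, 5.0, 10.0, 10.0))):
          """Assign a location-quality class (A-D) from RMS residual, azimuthal gap,
          horizontal/vertical formal errors and the distance between the expected
          and the maximum-likelihood hypocentre (illustrative limit values)."""
          params = (rms_s, gap_deg, herr_km, verr_km, dx_km)
          for label, lims in zip("ABC", limits):
              if all(p <= l for p, l in zip(params, lims)):
                  return label
          return "D"

      # Usage: quality_class(0.25, 150, 0.8, 1.5, 1.2)  ->  'A'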

  1. Seismic hazard assessment of the Kivu rift segment based on a new sismo-tectonic zonation model (Western Branch of the East African Rift system)

    NASA Astrophysics Data System (ADS)

    Havenith, Hans-Balder; Delvaux, Damien

    2015-04-01

    In the frame of the Belgian GeoRisCA multi-risk assessment project focused on the Kivu and Northern Tanganyika region, a seismic hazard map has been produced for this area. It is based on a recently re-compiled catalogue drawing on various local and global earthquake catalogues. The use of macroseismic epicenters determined from felt earthquakes made it possible to extend the time range back to the beginning of the 20th century, thus spanning about 100 years. The magnitudes have been homogenized to Mw, and the coherence of the catalogue has been checked and validated. The seismo-tectonic zonation includes 10 seismic source areas that have been defined on the basis of the regional geological structure, neotectonic fault systems, basin architecture, and the distribution of earthquake epicenters. The seismic catalogue was filtered by removing obvious aftershocks, and Gutenberg-Richter laws were determined for each zone. On the basis of this seismo-tectonic information and the existing attenuation laws established by Twesigomwe (1997) and Mavonga et al. (2007) for this area, the seismic hazard has been computed with the Crisis 2012 software (Ordaz et al., 2012). The outputs of this assessment clearly show higher PGA values (for a 475-year return period) along the Rift than the previous estimates by Twesigomwe (1997) and Mavonga (2007), even though the same attenuation laws were used. The main reason for these higher PGA values is likely the more detailed zonation of the Rift structure, marked by a strong gradient of seismicity from outside the rift zone to the inside. Mavonga, T. (2007). An estimate of the attenuation relationship for the strong ground motion in the Kivu Province, Western Rift Valley of Africa. Physics of the Earth and Planetary Interiors 62, 13-21. Ordaz M, Martinelli F, Aguilar A, Arboleda J, Meletti C, D'Amico V. (2012). CRISIS 2012, Program for computing seismic hazard. Instituto de Ingeniería, Universidad Nacional Autónoma de México. Twesigomwe, E. (1997). Probabilistic seismic hazard assessment of Uganda, Ph.D. Thesis, Dept. of Physics, Makerere University, Uganda.
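
    The per-zone Gutenberg-Richter fit mentioned above is commonly done with the Aki/Utsu maximum-likelihood estimator; a minimal sketch follows, with the completeness magnitude and magnitude binning treated as assumed inputs.

      import numpy as np

      def gutenberg_richter(mags, mc, dm=0.1):
          """Maximum-likelihood a- and b-values (Aki/Utsu) for one source zone,
          using only events with magnitude >= completeness magnitude mc."""
          m = np.asarray(mags, float)
          m = m[m >= mc]
          b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
          a = np.log10(len(m)) + b * mc        # so that log10 N(>= mc) = a - b * mc
          return a, b

      # Usage (per zone): a, b = gutenberg_richter(zone_magnitudes, mc=3.0)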

  2. Three-dimensional surgical simulation.

    PubMed

    Cevidanes, Lucia H C; Tucker, Scott; Styner, Martin; Kim, Hyungmin; Chapuis, Jonas; Reyes, Mauricio; Proffit, William; Turvey, Timothy; Jaskolka, Michael

    2010-09-01

    In this article, we discuss the development of methods for computer-aided jaw surgery, which allows us to incorporate the high level of precision necessary for transferring virtual plans into the operating room. We also present a complete computer-aided surgery system developed in close collaboration with surgeons. Surgery planning and simulation include construction of 3-dimensional surface models from cone-beam computed tomography, dynamic cephalometry, semiautomatic mirroring, interactive cutting of bone, and bony segment repositioning. A virtual setup can be used to manufacture positioning splints for intraoperative guidance. The system provides further intraoperative assistance with a computer display showing jaw positions and 3-dimensional positioning guides updated in real time during the surgical procedure. The computer-aided surgery system aids in dealing with complex cases with benefits for the patient, with surgical practice, and for orthodontic finishing. Advanced software tools for diagnosis and treatment planning allow preparation of detailed operative plans, osteotomy repositioning, bone reconstructions, surgical resident training, and assessing the difficulties of the surgical procedures before the surgery. Computer-aided surgery can make the elaboration of the surgical plan a more flexible process, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance documentation of cases. 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  3. Monitoring of seismic time-series with advanced parallel computational tools and complex networks

    NASA Astrophysics Data System (ADS)

    Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.

    2012-04-01

    Earthquakes have been a focus of human and research interest for centuries because of their catastrophic effects on everyday life; they occur almost all over the world and exhibit behaviour that is hard to model and to predict. On the other hand, their monitoring with more or less up-to-date instruments has been nearly continuous, and thanks to this several mathematical models have been proposed to describe possible connections and patterns found in the resulting seismological time series. In Greece, one of the most seismically active territories on Earth, detailed instrumental seismological data have been available since the beginning of the past century, providing researchers with valuable knowledge about seismicity levels across the country. Using powerful parallel computational tools, such as Cellular Automata, these data can be further analysed and, most importantly, modelled to reveal possible connections between different parameters of the seismic time series under study. More specifically, Cellular Automata have proven very effective for composing and modelling nonlinear complex systems, leading to several corresponding models proposed as analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time series with the help of Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology aims to reveal hidden relations in the examined time series and to distinguish their intrinsic characteristics, transforming the time series into complex networks and graphically representing their evolution in time and space. Based on the presented results, the proposed model could eventually serve as an efficient and flexible computational tool providing a generic understanding of possible triggering mechanisms, as derived from adequate monitoring and modelling of regional earthquake phenomena.
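
    One simple way to turn a seismic time series into a network, in the spirit of the approach described above (though not necessarily the authors' construction), is to bin magnitudes into states and count transitions between consecutive events; a minimal sketch follows, with the bin edges as assumptions.

      from collections import defaultdict
      import numpy as np

      def transition_network(magnitudes, bins=(2.0, 3.0, 4.0, 5.0)):
          """Map a magnitude time series onto a directed transition graph:
          nodes are magnitude bins, edge weights count observed transitions."""
          states = np.digitize(magnitudes, bins)      # state label per event
          edges = defaultdict(int)
          for a, b in zip(states[:-1], states[1:]):
              edges[(int(a), int(b))] += 1
          return dict(edges)

      # Usage: transition_network([2.1, 3.4, 2.8, 5.1, 3.0])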

  4. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  5. A Novel Computer-Aided Design/Computer-Assisted Manufacture Method for One-Piece Removable Partial Denture and Evaluation of Fit.

    PubMed

    Ye, Hongqiang; Li, Xinxin; Wang, Guanbo; Kang, Jing; Liu, Yushu; Sun, Yuchun; Zhou, Yongsheng

    2018-02-15

    To investigate a computer-aided design/computer-aided manufacturing (CAD/CAM) process for producing one-piece removable partial dentures (RPDs) and to evaluate their fits in vitro. A total of 15 one-piece RPDs were designed using dental CAD and reverse engineering software and then fabricated with polyetheretherketone (PEEK) using CAM. The gaps between RPDs and casts were measured and compared with traditional cast framework RPDs. Gaps were lower for one-piece PEEK RPDs compared to traditional RPDs. One-piece RPDs can be manufactured by CAD/CAM, and their fits were better than those of traditional RPDs.

  6. Development of Procedures for Computing Site Seismicity

    DTIC Science & Technology

    1993-02-01

    surface wave magnitude when in the range of 5 to 7.5. REFERENCES: Ambraseys, N.N. (1970). "Some characteristic features of the Anatolian fault zone"... geology, seismicity and environmental impact, Association of Engineering Geologists, Special Publication, Los Angeles, CA, University Publishers, 1973... [The remainder of this excerpt is a garbled table of fault slip rates (cm/yr), fault lengths (km), and recurrence intervals (yr) at a point on the fault and over the fault length.]

  7. Computation of dynamic seismic responses to viscous fluid of digitized three-dimensional Berea sandstones with a coupled finite-difference method.

    PubMed

    Zhang, Yang; Toksöz, M Nafi

    2012-08-01

    The seismic response of saturated porous rocks is studied numerically using microtomographic images of three-dimensional digitized Berea sandstones. A stress-strain calculation is employed to compute the velocities and attenuations of rock samples whose sizes are much smaller than the seismic wavelength of interest. To compensate for the contributions of small cracks lost in the imaging process to the total velocity and attenuation, a hybrid method is developed to recover the crack distribution, in which the differential effective medium theory, the Kuster-Toksöz model, and a modified squirt-flow model are utilized in a two-step Monte Carlo inversion. In the inversion, the velocities of P- and S-waves measured for the dry and water-saturated cases, and the measured attenuation of P-waves for different fluids are used. By using such a hybrid method, both the velocities of saturated porous rocks and the attenuations are predicted accurately when compared to laboratory data. The hybrid method is a practical way to model numerically the seismic properties of saturated porous rocks until very high resolution digital data are available. Cracks lost in the imaging process are critical for accurately predicting velocities and attenuations of saturated porous rocks.

  8. Fast kinematic ray tracing of first- and later-arriving global seismic phases

    NASA Astrophysics Data System (ADS)

    Bijwaard, Harmen; Spakman, Wim

    1999-11-01

    We have developed a ray tracing algorithm that traces first- and later-arriving global seismic phases precisely (traveltime errors of the order of 0.1 s), and with great computational efficiency (15 rays per second). To achieve this, we have extended and adapted two existing ray tracing techniques: a graph method and a perturbation method. The two resulting algorithms are able to trace (critically) refracted, (multiply) reflected, some diffracted (Pdiff), and (multiply) converted seismic phases in a 3-D spherical geometry, thus including the largest part of seismic phases that are commonly observed on seismograms. We have tested and compared the two methods in 2-D and 3-D Cartesian and spherical models, for which both algorithms have yielded precise paths and traveltimes. These tests indicate that only the perturbation method is computationally efficient enough to perform 3-D ray tracing on global data sets of several million phases. To demonstrate its potential for non-linear tomography, we have applied the ray perturbation algorithm to a data set of 7.6 million P and pP phases used by Bijwaard et al. (1998) for linearized tomography. This showed that the expected heterogeneity within the Earth's mantle leads to significant non-linear effects on traveltimes for 10 per cent of the applied phases.
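
    The graph method referred to above is, at its core, a shortest-path computation on a gridded slowness model; the following minimal Dijkstra-based sketch illustrates the principle (it is not the authors' algorithm and omits their extensions to later arrivals and spherical geometry).

      import heapq
      import numpy as np

      def graph_traveltimes(slowness, h, source):
          """First-arrival traveltimes on a 2-D grid by Dijkstra's shortest path
          (the essence of graph-based ray tracing). `slowness` is s/m per cell,
          `h` the grid spacing in m, `source` an (i, j) index."""
          nz, nx = slowness.shape
          tt = np.full((nz, nx), np.inf)
          tt[source] = 0.0
          heap = [(0.0, source)]
          steps = [(-1, 0), (1, 0), (0, -1), (0, 1),
                   (-1, -1), (-1, 1), (1, -1), (1, 1)]   # 8-connected stencil
          while heap:
              t, (i, j) = heapq.heappop(heap)
              if t > tt[i, j]:
                  continue
              for di, dj in steps:
                  ni, nj = i + di, j + dj
                  if 0 <= ni < nz and 0 <= nj < nx:
                      d = h * np.hypot(di, dj)
                      cost = 0.5 * (slowness[i, j] + slowness[ni, nj]) * d
                      if t + cost < tt[ni, nj]:
                          tt[ni, nj] = t + cost
                          heapq.heappush(heap, (t + cost, (ni, nj)))
          return tt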

  9. Anomalous Induced Seismicity due to Hydraulic Fracturing. Case of study in the Montney Formation, Northeast British Columbia.

    NASA Astrophysics Data System (ADS)

    Longobardi, M.; Bustin, A. M. M.; Johansen, K.; Bustin, R. M.

    2017-12-01

    Our goal is to investigate the variables and processes controlling anomalous induced seismicity (AIS) and its associated ground motions, in order to better understand AIS due to hydraulic fracturing in Northeast British Columbia. Our other main objective is to optimize completions and well design. Although the vast majority of earthquakes that occur in the world each year have natural causes, some earthquakes and a number of lesser magnitude seismic events are induced by human activities. The recorded induced seismicity resulting from fluid injection during hydraulic fracturing is generally small in magnitude (< M 1). Shale gas operations in Northeast British Columbia (BC) have induced the largest recorded occurrence and magnitude of AIS attributed to hydraulic fracturing. Anomalous induced seismicity has been recorded in seven clusters within the Montney area, with magnitudes up to ML 4.6. Five of these clusters have been linked to hydraulic fracturing. To analyse our AIS data, we first calculated the earthquake hypocenters. The data were recorded on an array of real-time accelerometers. We built the array based on our modified design of the early earthquake detectors installed in BC schools for the Earthquake Early Warning System for British Columbia. We have developed a new technique for locating hypocenters and applied it to our dataset. The technique will enable near real-time event location, aiding both in mitigating induced events and in adjusting completions to optimize the stimulation. Our hypocenter program assumes an S-wave speed, fits the arrival times to the hypocenter, and uses a multivariate "amoeba" (simplex) method. We use this method because it is well suited to minimizing the chi-squared function of the arrival-time deviations. We show some preliminary results for the Montney dataset.
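
    The "amoeba" search described above corresponds to a derivative-free simplex minimization of the arrival-time misfit; a minimal sketch using the Nelder-Mead option of scipy is shown below, with a uniform S-wave speed and the function names as assumptions.

      import numpy as np
      from scipy.optimize import minimize

      def locate_hypocenter(stations, picks, vs_kms, x0):
          """Grid-free hypocentre search with the Nelder-Mead ("amoeba") simplex:
          minimize the chi-squared misfit of S-wave arrival times assuming a
          uniform S velocity. stations: (n, 3) km; picks: observed arrival times (s);
          x0: starting guess (x, y, z, origin_time)."""
          stations = np.asarray(stations, float)
          picks = np.asarray(picks, float)

          def chi2(params):
              xyz, t0 = params[:3], params[3]
              pred = t0 + np.linalg.norm(stations - xyz, axis=1) / vs_kms
              return np.sum((picks - pred) ** 2)

          res = minimize(chi2, x0, method="Nelder-Mead")
          return res.x, res.fun

      # Usage: (x, y, z, t0), misfit = locate_hypocenter(sta_xyz, t_obs, 3.5,
      #                                                  x0=[0, 0, 5, 0])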

  10. Crustal seismic anisotropy: A localized perspective from surface waves at the Ruby Mountains Core Complex

    NASA Astrophysics Data System (ADS)

    Wilgus, J. T.; Schmandt, B.; Jiang, C.

    2017-12-01

    The relative importance of potential controls on crustal seismic anisotropy, such as deformational fabrics in polycrystalline crustal rocks and the contemporary state of stress, remains poorly constrained. Recent regional western US lithospheric seismic anisotropy studies have concluded that the distribution of strain in the lower crust is diffuse throughout the Basin and Range (BR) and that deformation in the crust and mantle are largely uncoupled. To further contribute to our understanding of crustal anisotropy, we are conducting a detailed local study of seismic anisotropy within the BR using surface waves at the Ruby Mountains Core Complex (RMCC), located in northeast Nevada. The RMCC is one of many distinctive uplifts within the North American cordillera called metamorphic core complexes, which consist of rocks exhumed from middle to lower crustal depths adjacent to mylonitic shear zones. The RMCC records exhumation depths up to 30 km, indicating an anomalously high degree of extension relative to the BR average. This exhumation, the geologic setting of the RMCC, and the availability of dense broadband data from the Transportable Array (TA) and the Ruby Mountain Seismic Experiment (RMSE) coalesce to form an ideal opportunity to characterize seismic anisotropy as a function of depth beneath the RMCC and evaluate the degree to which anisotropy deviates from regional-scale properties of the BR. Preliminary azimuthal anisotropy results using Rayleigh waves reveal clear anisotropic signals at periods between 5-40 s, and demonstrate significant rotations of fast orientations relative to prior regional-scale results. Moving forward we will focus on quantification of depth-dependent radial anisotropy from inversion of Rayleigh and Love waves. These results will be relevant to identification of the deep crustal distribution of strain associated with RMCC formation and may aid interpretation of controls on crustal anisotropy in other regions.

  11. Effect of Computer-Presented Organizational/Memory Aids on Problem Solving Behavior.

    ERIC Educational Resources Information Center

    Steinberg, Esther R.; And Others

    This research studied the effects of computer-presented organizational/memory aids on problem solving behavior. The aids were either matrix or verbal charts shown on the display screen next to the problem. The 104 college student subjects were randomly assigned to one of the four conditions: type of chart (matrix or verbal chart) and use of charts…

  12. Programming Design Guide for Computer Implementation of Job Aid for Selecting Instructional Setting. Final Report.

    ERIC Educational Resources Information Center

    Schulz, Russel E.; And Others

    This Programming Design Guide (PDG) was developed to permit the offline Job Aid for Selecting Instructional Setting, which is one of 13 job aids presently available for use with the Instructional Systems Development (ISD) model, to be available in an inquiry-type, online version. It is intended to provide computer programmers with all of the…

  13. Tectonic characterization of a potential high-level nuclear waste repository at Yucca Mountain, Nevada

    USGS Publications Warehouse

    Whitney, John W.; O'Leary, Dennis W.

    1993-01-01

    Tectonic characterization of a potential high-level nuclear waste repository at Yucca Mountain, Nevada, is needed to assess seismic and possible volcanic hazards that could affect the site during the preclosure (next 100 years) and the behavior of the hydrologic system during the postclosure (the following 10,000 years) periods. Tectonic characterization is based on assembling mapped geological structures in their chronological order of development and activity, and interpreting their dynamic interrelationships. Addition of mechanistic models and kinematic explanations for the identified tectonic processes provides one or more tectonic models having predictive power. Proper evaluation and application of tectonic models can aid in seismic design and help anticipate probable occurrence of future geologic events of significance to the repository and its design.

  14. Teaching Advance Care Planning to Medical Students with a Computer-Based Decision Aid

    PubMed Central

    Levi, Benjamin H.

    2013-01-01

    Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n=60) outperformed the Standard Group (n=61) in terms of students' knowledge (p<0.01), confidence in helping patients with advance care planning (p<0.01), knowledge of what matters to patients (p=0.05), and satisfaction with their learning experience (p<0.01). Likewise, patients in the Decision Aid Group were more satisfied with the advance care planning method (p<0.01) and with several aspects of student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients. PMID:20632222

  15. Evaluation of direct and indirect additive manufacture of maxillofacial prostheses.

    PubMed

    Eggbeer, Dominic; Bibb, Richard; Evans, Peter; Ji, Lu

    2012-09-01

    The efficacy of computer-aided technologies in the design and manufacture of maxillofacial prostheses has not been fully proven. This paper presents research into the evaluation of direct and indirect additive manufacture of a maxillofacial prosthesis against conventional laboratory-based techniques. An implant/magnet-retained nasal prosthesis case from a UK maxillofacial unit was selected as a case study. A benchmark prosthesis was fabricated using conventional laboratory-based techniques for comparison against additive manufactured prostheses. For the computer-aided workflow, photogrammetry, computer-aided design and additive manufacture (AM) methods were evaluated in direct prosthesis body fabrication and indirect production using an additively manufactured mould. Qualitative analysis of position, shape, colour and edge quality was undertaken. Mechanical testing to ISO standards was also used to compare the silicone rubber used in the conventional prosthesis with the AM material. Critical evaluation has shown that utilising a computer-aided work-flow can produce a prosthesis body that is comparable to that produced using existing best practice. Technical limitations currently prevent the direct fabrication method demonstrated in this paper from being clinically viable. This research helps prosthesis providers understand the application of a computer-aided approach and guides technology developers and researchers to address the limitations identified.

  16. Computational modelling of the impact of AIDS on business.

    PubMed

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality on a company, based on the anonymous HIV testing of company employees, and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and then the ASSA model projection for each category of employees is adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.
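
    The matching step described above (rescaling a modelled prevalence curve so that it agrees with the prevalence measured in the year of testing) can be illustrated with a toy sketch; this is not the APM itself, and the function name and data layout are assumptions.

      def adjust_projection(assa_prevalence, measured_prevalence, test_year):
          """Rescale an ASSA-style prevalence projection for one employee category
          so that it matches the prevalence measured in the year of testing.
          `assa_prevalence` maps year -> modelled prevalence (0-1)."""
          factor = measured_prevalence / assa_prevalence[test_year]
          return {year: min(p * factor, 1.0) for year, p in assa_prevalence.items()}

      # Usage: adjusted = adjust_projection({2005: 0.12, 2006: 0.13}, 0.10, 2005)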

  17. Exploration of a physiologically-inspired hearing-aid algorithm using a computer model mimicking impaired hearing.

    PubMed

    Jürgens, Tim; Clark, Nicholas R; Lecluyse, Wendy; Meddis, Ray

    2016-01-01

    To use a computer model of impaired hearing to explore the effects of a physiologically-inspired hearing-aid algorithm on a range of psychoacoustic measures. A computer model of a hypothetical impaired listener's hearing was constructed by adjusting parameters of a computer model of normal hearing. Absolute thresholds, estimates of compression, and frequency selectivity (summarized to a hearing profile) were assessed using this model with and without pre-processing the stimuli by a hearing-aid algorithm. The influence of different settings of the algorithm on the impaired profile was investigated. To validate the model predictions, the effect of the algorithm on hearing profiles of human impaired listeners was measured. A computer model simulating impaired hearing (total absence of basilar membrane compression) was used, and three hearing-impaired listeners participated. The hearing profiles of the model and the listeners showed substantial changes when the test stimuli were pre-processed by the hearing-aid algorithm. These changes consisted of lower absolute thresholds, steeper temporal masking curves, and sharper psychophysical tuning curves. The hearing-aid algorithm affected the impaired hearing profile of the model to approximate a normal hearing profile. Qualitatively similar results were found with the impaired listeners' hearing profiles.

  18. Program Aids Specification Of Multiple-Block Grids

    NASA Technical Reports Server (NTRS)

    Sorenson, R. L.; Mccann, K. M.

    1993-01-01

    3DPREP computer program aids specification of multiple-block computational grids. Highly interactive graphical preprocessing program designed for use on powerful graphical scientific computer workstation. Divided into three main parts, each corresponding to principal graphical-and-alphanumerical display. Relieves user of some burden of collecting and formatting many data needed to specify blocks and grids, and prepares input data for NASA's 3DGRAPE grid-generating computer program.

  19. Radiated Seismic Energy of Earthquakes in the South-Central Region of the Gulf of California, Mexico

    NASA Astrophysics Data System (ADS)

    Castro, Raúl R.; Mendoza-Camberos, Antonio; Pérez-Vertti, Arturo

    2018-05-01

    We estimated the radiated seismic energy (ES) of 65 earthquakes located in the south-central region of the Gulf of California. Most of these events occurred along active transform faults that define the Pacific-North America plate boundary and have magnitudes between M3.3 and M5.9. We corrected the spectral records for attenuation using nonparametric S-wave attenuation functions determined with the whole data set. The path effects were isolated from the seismic source using a spectral inversion. We computed the radiated seismic energy of the earthquakes by integrating the square velocity source spectrum and estimated their apparent stresses. We found that most events have apparent stress between 3 × 10⁻⁴ and 3 MPa. Model-independent estimates of the ratio between seismic energy and moment (ES/M0) indicate that this ratio is independent of earthquake size. We conclude that in general the apparent stress is low (σa < 3 MPa) in the south-central and southern Gulf of California.
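
    A minimal sketch of the final step (integrating the squared velocity source spectrum and forming the apparent stress σa = μ·ES/M0) is given below; the energy prefactor, which bundles density, wave-speed and radiation-pattern terms, and the rigidity value are treated as assumed inputs rather than the authors' exact formulation.

      import numpy as np

      MU_PA = 3.0e10   # nominal crustal rigidity in Pa (assumed)

      def apparent_stress(freqs, vel_spectrum, moment_nm, energy_prefactor):
          """Toy estimate: radiated energy taken proportional to the integral of the
          squared velocity source spectrum (the prefactor is treated as a given
          constant), then apparent stress = mu * Es / M0."""
          es = energy_prefactor * np.trapz(np.asarray(vel_spectrum) ** 2, freqs)
          return MU_PA * es / moment_nm, es

      # Usage: sigma_a, es = apparent_stress(f, v_src, moment_nm=1e15,
      #                                      energy_prefactor=1.0)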

  20. Evaluation of seismic spatial interaction effects through an impact testing program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, B.D.; Driesen, G.E.

    The consequences of non-seismically qualified objects falling and striking essential, seismically qualified objects are analytically difficult to assess. Analytical solutions to impact problems are conservative and only available for simple situations. In a nuclear facility, the numerous "sources" and "targets" requiring evaluation often have complex geometric configurations, which makes calculations and computer modeling difficult. Few industry or regulatory rules are available for this specialized assessment. A drop test program was recently conducted to "calibrate" the judgment of seismic qualification engineers who perform interaction evaluations and to further develop seismic interaction criteria. Impact tests on varying combinations of sources and targets were performed by dropping the sources from various heights onto targets that were connected to instruments. This paper summarizes the scope, test configurations, and some results of the drop test program. Force and acceleration time history data and general observations are presented on the ruggedness of various targets when subjected to impacts from different types of sources.

  2. Procedures for Computing Site Seismicity

    DTIC Science & Technology

    1994-02-01

    Fourth World Conference on Earthquake Engineering, Santiago, Chile, 1969. Schnabel, P.B., J. Lysmer, and H.B. Seed (1972). SHAKE, a computer program for... This fault system is composed of the Elsinore and Whittier fault zones, the Agua Caliente fault, and the Earthquake Valley fault. Five recent earthquakes of

  3. Earthquake Source Parameter Estimates for the Charlevoix and Western Quebec Seismic Zones in Eastern Canada

    NASA Astrophysics Data System (ADS)

    Onwuemeka, J.; Liu, Y.; Harrington, R. M.; Peña-Castro, A. F.; Rodriguez Padilla, A. M.; Darbyshire, F. A.

    2017-12-01

    The Charlevoix Seismic Zone (CSZ), located in eastern Canada, experiences a high rate of intraplate earthquakes, hosting more than six M > 6 events since the 17th century. The seismicity rate is similarly high in the Western Quebec seismic zone (WQSZ), where an MN 5.2 event was reported on May 17, 2013. A good understanding of seismicity and its relation to the St-Lawrence paleorift system requires information about event source properties, such as static stress drop and fault orientation (via focal mechanism solutions). In this study, we conduct a systematic estimate of event source parameters using 1) hypoDD to relocate event hypocenters, 2) spectral analysis to derive corner frequency, magnitude, and hence static stress drops, and 3) first arrival polarities to derive focal mechanism solutions of selected events. We use a combined dataset of 817 earthquakes cataloged between June 2012 and May 2017 from the Canadian National Seismograph Network (CNSN), and temporary deployments from the QM-III Earthscope FlexArray and McGill seismic networks. We first relocate 450 events using P- and S-wave differential travel times refined with waveform cross-correlation, and compute focal mechanism solutions for all events with impulsive P-wave arrivals at a minimum of 8 stations using the hybridMT moment tensor inversion algorithm. We then determine corner frequency and seismic moment values by fitting S-wave spectra on transverse components at all stations for all events. We choose the final corner frequency and moment values for each event using the median estimate at all stations. We use the corner frequency and moment estimates to calculate moment magnitudes, static stress-drop values and rupture radii, assuming a circular rupture model. We also investigate scaling relationships between parameters and directivity, and compute apparent source dimensions and source time functions of 15 M 2.4+ events from second-degree moment estimates. To first order, source-dimension estimates from both methods generally agree. We observe higher corner frequencies and higher stress drops (ranging from 20 to 70 MPa), typical of intraplate seismicity in comparison with interplate seismicity. We follow similar approaches to study 25 MN 3+ events reported in the WQSZ using data recorded by the CNSN and USArray Transportable Array.
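
    A minimal sketch of the corner-frequency/stress-drop step, assuming a Brune omega-square source model and a circular rupture with r = k·β/fc and Δσ = 7·M0/(16·r³), is given below; the constant k, the moment calibration, and the function names are assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      def brune(f, omega0, fc):
          """Brune (1970) omega-square model for the displacement source spectrum."""
          return omega0 / (1.0 + (f / fc) ** 2)

      def source_parameters(freqs, disp_spectrum, beta_ms=3500.0, k=0.37,
                            moment_scale=1.0):
          """Fit corner frequency and low-frequency plateau, then compute a circular
          rupture radius r = k*beta/fc and static stress drop = 7*M0/(16*r^3).
          `moment_scale` converts the fitted plateau to seismic moment (N*m) and
          depends on distance, radiation pattern and density -- assumed known here."""
          p0 = [np.max(disp_spectrum), np.median(freqs)]
          (omega0, fc), _ = curve_fit(brune, freqs, disp_spectrum, p0=p0)
          m0 = moment_scale * omega0
          radius = k * beta_ms / fc
          stress_drop = 7.0 * m0 / (16.0 * radius ** 3)
          return {"fc": fc, "M0": m0, "radius_m": radius, "stress_drop_pa": stress_drop}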

  4. One Decade of Induced Seismicity in Basel, Switzerland: A Consistent High-Resolution Catalog Obtained by Template Matching

    NASA Astrophysics Data System (ADS)

    Herrmann, M.; Kraft, T.; Tormann, T.; Scarabello, L.; Wiemer, S.

    2017-12-01

    Induced seismicity at the site of the Basel Enhanced Geothermal System (EGS) continuously decayed for six years after injection had been stopped in December 2006. Starting in May 2012, the Swiss Seismological Service detected a renewed increase of induced seismicity in the EGS reservoir to levels last seen in 2007, reaching magnitudes up to ML 2.0. Seismic monitoring at this EGS site has now been running for more than ten years, but the details of the long-term behavior of its induced seismicity remained unexplored because a seismic event catalog that is consistent in detection sensitivity and magnitude estimation did not exist. We have created such a catalog by applying our matched-filter detector to the 11-year-long seismic recordings of a borehole station at 2.7 km depth. Based on 3,600 located earthquakes of the operator's borehole-network catalog, we selected about 2,500 reasonably dissimilar templates using waveform clustering. This large template set ensures adequate coverage of the diversity of event waveforms, which is due to the reservoir's highly complex fault system and the close observation distance. To cope with the increased computational demand of scanning 11 years of data with 2,500 templates, we parallelized our detector to run on a high-performance computer of the Swiss National Supercomputing Centre. We detect more than 200,000 events down to ML -2.5 during the six-day-long stimulation in December 2006 alone. Previously, only 13,000 detections found by an amplitude-threshold-based detector were known for this period. The high temporal and spatial resolution of this new catalog allows us to analyze the statistics of the induced Basel earthquakes in great detail. We resolve spatio-temporal variations of the seismicity parameters (a- and b-value) that have not been identified before and derive the first high-resolution temporal evolution of the seismic hazard for the Basel EGS reservoir. In summer 2017, our detector monitored the 10-week pressure reduction operation at the Basel-1 borehole, during which the well was periodically opened. The detections drove a traffic light system based on magnitude thresholds and earthquake rates. For future EGS projects in Switzerland, our detector is planned to run in near real time and provide the basis for an advanced traffic light system.

  5. Computer Aided Phenomenography: The Role of Leximancer Computer Software in Phenomenographic Investigation

    ERIC Educational Resources Information Center

    Penn-Edwards, Sorrel

    2010-01-01

    The qualitative research methodology of phenomenography has traditionally required a manual sorting and analysis of interview data. In this paper I explore a potential means of streamlining this procedure by considering a computer aided process not previously reported upon. Two methods of lexicological analysis, manual and automatic, were examined…

  6. Computer-Aided College Algebra: Learning Components that Students Find Beneficial

    ERIC Educational Resources Information Center

    Aichele, Douglas B.; Francisco, Cynthia; Utley, Juliana; Wescoatt, Benjamin

    2011-01-01

    A mixed-method study was conducted during the Fall 2008 semester to better understand the experiences of students participating in computer-aided instruction of College Algebra using the software MyMathLab. The learning environment included a computer learning system for the majority of the instruction, a support system via focus groups (weekly…

  7. Computer Aided Instruction: A Study of Student Evaluations and Academic Performance

    ERIC Educational Resources Information Center

    Collins, David; Deck, Alan; McCrickard, Myra

    2008-01-01

    Computer aided instruction (CAI) encompasses a broad range of computer technologies that supplement the classroom learning environment and can dramatically increase a student's access to information. Criticism of CAI generally focuses on two issues: it lacks an adequate foundation in educational theory and the software is difficult to implement…

  8. Using Computer Graphics in the 90's.

    ERIC Educational Resources Information Center

    Towne, Violet A.

    Computer-Aided Design, a hands-on program for public school teachers, was first offered in the summer of 1987 as an outgrowth of a 1986 robotics training program. Area technology teachers needed computer-aided design (CAD) training because of a New York State Education system transition from the industrial arts curriculum to a new curriculum in…

  9. Using Computer-Aided Instruction to Support the Systematic Practice of Phonological Skills in Beginning Readers

    ERIC Educational Resources Information Center

    Wild, Mary

    2009-01-01

    The paper reports the results of a randomised control trial investigating the use of computer-aided instruction (CAI) for practising phonological awareness skills with beginning readers. Two intervention groups followed the same phonological awareness programme: one group undertook practice exercises using a computer and the other group undertook…

  10. The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.

    ERIC Educational Resources Information Center

    Hata, David M.

    1986-01-01

    Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…

  11. Computer Code Aids Design Of Wings

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1993-01-01

    AERO2S computer code developed to aid design engineers in selection and evaluation of aerodynamically efficient wing/canard and wing/horizontal-tail configurations that includes simple hinged-flap systems. Code rapidly estimates longitudinal aerodynamic characteristics of conceptual airplane lifting-surface arrangements. Developed in FORTRAN V on CDC 6000 computer system, and ported to MS-DOS environment.

  12. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

    NASA maintains applications-oriented computational fluid dynamics (CFD) efforts complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the CAGI: Computer Aided Grid Interface system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  13. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

    NASA maintains applications-oriented computational fluid dynamics (CFD) efforts complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface system (CAGI). The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  14. Manufacturing engineering: Principles for optimization

    NASA Astrophysics Data System (ADS)

    Koenig, Daniel T.

    Various subjects in the area of manufacturing engineering are addressed. The topics considered include: manufacturing engineering organization concepts and management techniques, factory capacity and loading techniques, capital equipment programs, machine tool and equipment selection and implementation, producibility engineering, methods, planning and work management, and process control engineering in job shops. Also discussed are: maintenance engineering, numerical control of machine tools, fundamentals of computer-aided design/computer-aided manufacture, computer-aided process planning and data collection, group technology basis for plant layout, environmental control and safety, and the Integrated Productivity Improvement Program.

  15. Extending the life of mature basins in the North Sea and imaging sub-basalt and sub-intrusive structures using seismic intensity monitoring.

    NASA Astrophysics Data System (ADS)

    De Siena, Luca; Rawlinson, Nicholas

    2016-04-01

    Non-standard seismic imaging (velocity, attenuation, and scattering tomography) of the North Sea basins, using unexploited seismic intensities from previous passive and active surveys, is key to better imaging and monitoring of fluids in the subsurface. These intensities provide unique solutions to the problem of locating and tracking gas/fluid movements in the crust and of depicting sub-basalt and sub-intrusive structures in volcanic reservoirs. The proposed techniques have been tested on volcanic islands (Deception Island) and have proved effective at monitoring fracture opening, imaging buried fluid-filled bodies, and tracking water/gas interfaces. These novel seismic attributes are modelled in space and time and connected with the lithology of the sampled medium, specifically density and permeability, with a novel computational code with strong commercial potential as a key output.

  16. Geophysical-geological studies of possible extensions of the New Madrid Fault Zone. Annual report, 1982. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinze, W.J.; Braile, L.W.; Keller, G.R.

    1983-05-01

    An integrated geophysical/geologic program is being conducted to evaluate the rift complex hypothesis as an explanation for the earthquake activity in the New Madrid Seismic Zone and its extensions, to refine our knowledge of the rift complex, and to investigate the possible northern extensions of the New Madrid Fault Zone, especially its possible connection to the Anna, Ohio seismogenic region. Drillhole basement lithologies are being investigated to aid in tectonic analysis and geophysical interpretation, particularly in the Anna, Ohio area. Gravity and magnetic modeling combined with limited seismic reflection studies in southwest Indiana are interpreted as confirming speculation that an arm of the New Madrid Rift Complex extends northeasterly into Indiana. The geologic and geophysical evidence confirms that the basement lithology in the Anna, Ohio area is highly variable, reflecting a complex geologic history. The data indicate that as many as three major Late Precambrian tectonic features intersect within the basement of the Anna area, suggesting that the seismicity may be related to basement zones of weakness.

  17. Vulnerability of populations and man-made facilities to seismic hazards

    NASA Astrophysics Data System (ADS)

    Badal, J.; Vazquez-Prada, M.; Gonzalez, A.; Chourak, M.; Samardzhieva, E.; Zhang, Z.

    2003-04-01

    Earthquakes become major societal risks when they impinge on vulnerable populations. According to the available worldwide data for the twentieth century (NEIC Catalog of Earthquakes 1980-1999), almost five hundred earthquakes resulted in more than 1,615,000 human victims. Besides human casualties, destructive earthquakes frequently inflict huge economic losses. An additional problem of a very different nature, but also worth considering in a damage and loss analysis, is the direct cost associated with the damage caused by a strong seismic impact. We focus our attention on both aspects, aiming at their rapid quantitative assessment and at lessening the earthquake disaster in areas affected by relatively strong earthquakes. Our final goal is knowledge of potential losses from earthquakes to support national programs in emergency management, and consequently to minimize the loss of life due to earthquakes and to aid in response and recovery tasks. For this purpose we follow a suitable and comprehensible methodology for risk-based loss analysis, and simulate the occurrence of a seismic event in densely populated areas of Spain.

  18. Seismic risk management solution for nuclear power plants

    DOE PAGES

    Coleman, Justin; Sabharwall, Piyush

    2014-12-01

    Nuclear power plants should operate safely during normal operations and maintain core-cooling capabilities during off-normal events, including external hazards (such as flooding and earthquakes). Management of external hazards to acceptable levels of risk is critical to maintaining nuclear facility and nuclear power plant safety. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components). Seismic isolation (SI) is one protective measure showing promise for minimizing seismic risk. Current SI designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in American Society of Civil Engineers Standard 4 (ASCE-4), to be released in the winter of 2014, for light water reactor facilities using commercially available technology. The intent of ASCE-4 is to provide criteria for the seismic analysis of safety-related nuclear structures such that the responses to design basis seismic events, computed in accordance with this standard, will have a small likelihood of being exceeded. The U.S. nuclear industry has not implemented SI to date; a seismic isolation gap analysis meeting was convened on August 19, 2014, to determine progress on implementing SI in the U.S. nuclear industry. The meeting focused on the systems and components that could benefit from SI. As a result, this article highlights the gaps identified at this meeting.
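
    The hazard-fragility convolution mentioned above can be sketched numerically as the integral of the fragility curve against the derivative of the hazard (annual exceedance) curve; the following minimal sketch assumes both curves are already defined on a common PGA grid, and the function name is an assumption.

      import numpy as np

      def annual_failure_rate(pga, hazard_rate, fragility):
          """Convolve a seismic hazard curve with a component fragility curve:
          lambda_fail = integral of P(fail | a) * |d lambda(a) / da| da,
          evaluated numerically on a common PGA grid."""
          pga = np.asarray(pga, float)
          hazard_rate = np.asarray(hazard_rate, float)   # annual exceedance rate
          fragility = np.asarray(fragility, float)       # P(failure | PGA)
          d_lambda = -np.gradient(hazard_rate, pga)      # occurrence density (positive)
          return np.trapz(fragility * d_lambda, pga)

      # Usage: lam = annual_failure_rate(a_grid, haz_curve, frag_curve)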

  19. Hanford Quarterly Seismic Report - 98C Seismicity On and Near the Hanford Site, Pasco Basin, Washington: April 1, 1998 Through June 30, 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DC Hartshorn, SP Reidel, AC Rohay

    1998-10-23

    Hanford Seismic Monitoring provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) for the U.S. Department of Energy and its contractors. The staff also locates and identifies sources of seismic activity and monitors changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, Natural Phenomena Hazards assessments, and engineering design and construction. In addition, the seismic monitoring organization works with the Hanford Site Emergency Services Organization to provide assistance in the event of an earthquake on the Hanford Site. The HSN and the Eastern Washington Regional Network (EWRN) consist of 42 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Monitoring staff. The operational rate for the third quarter of FY 1998 for stations in the HSN was 99.99%. The operational rate for the third quarter of FY 1998 for stations of the EWRN was 99.95%. For the third quarter of FY 1998, the acquisition computer triggered 133 times. Of these triggers, 11 were local earthquakes: 5 (45%) in the Columbia River Basalt Group, 2 (18%) in the pre-basalt sediments, and 4 (36%) in the crystalline basement. The geologic and tectonic environments where these earthquakes occurred are discussed in this report.

  1. Extreme magnitude earthquakes and their economical impact: The Mexico City case

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Mario, C.

    2005-12-01

    The consequences (estimated by the human and economic losses) of the recent occurrence (worldwide) of extreme magnitude (for the region under consideration) earthquakes, such as the 19 09 1985 event in Mexico (Richter magnitude Ms 8.1, moment magnitude Mw 8.01) or the 26 12 2004 event in Indonesia (Ms 9.4, Mw 9.3), stress the importance of performing seismic hazard analyses that specifically incorporate this possibility. Herewith, we present and apply a methodology, based on plausible extreme seismic scenarios and the computation of their associated synthetic accelerograms, to estimate the seismic hazard on Mexico City (MC) stiff and compressible surficial soils. The uncertainties about the characteristics of the potential finite seismic sources, as well as those related to the dynamic properties of MC compressible soils, are taken into account. The economic consequences (i.e., the seismic risk = seismic hazard x economic cost) implicit in the seismic coefficients proposed in MC seismic codes before (1976) and after the 1985 earthquake (2004) are analyzed. Based on the latter and on an acceptable risk criterion, a maximum seismic coefficient (MSC) of 1.4g (g = 9.81 m/s2) of the elastic acceleration design spectra (5 percent damping), which has a probability of exceedance of 2.4 x 10-4, seems to be appropriate for analyzing the seismic behavior of infrastructure located on MC compressible soils, if extreme Mw 8.5 subduction thrust mechanism earthquakes (similar to the one that occurred on 19 09 1985, with an observed, equivalent MSC of 1g) occurred in the next 50 years.
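
    The exceedance probability quoted above can be related to an annual occurrence rate through the usual Poisson assumption. A minimal sketch of that conversion follows; the rate used below is illustrative (the common 2%-in-50-years design level), not a value taken from the study.

    ```python
    import math

    def poisson_exceedance_prob(annual_rate: float, years: float) -> float:
        """Probability of at least one exceedance in `years`, assuming a Poisson process."""
        return 1.0 - math.exp(-annual_rate * years)

    # Illustrative values only: an annual rate of 1/2475 corresponds to the
    # common "2% in 50 years" design level.
    print(poisson_exceedance_prob(1.0 / 2475.0, 50.0))   # ~0.02
    ```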

  2. Seismic activity offshore Martinique and Dominica islands (Central Lesser Antilles subduction zone) from temporary onshore and offshore seismic networks

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Galve, A.; Monfret, T.; Sapin, M.; Charvis, P.; Laigle, M.; Evain, M.; Hirn, A.; Flueh, E.; Gallart, J.; Diaz, J.; Lebrun, J. F.

    2013-09-01

    This work focuses on the analysis of a unique set of seismological data recorded by two temporary networks of seismometers deployed onshore and offshore in the Central Lesser Antilles Island Arc from Martinique to Guadeloupe islands. During the whole recording period, extending from January to the end of August 2007, more than 1300 local seismic events were detected in this area. A subset of 769 earthquakes was located precisely by using HypoEllipse. We also computed focal mechanisms using P-wave polarities of the best azimuthally constrained earthquakes. We detected earthquakes beneath the Caribbean forearc and in the Atlantic oceanic plate as well. At depth seismicity delineates the Wadati-Benioff Zone down to 170 km depth. The main seismic activity is concentrated in the lower crust and in the mantle wedge, close to the island arc beneath an inner forearc domain in comparison to an outer forearc domain where little seismicity is observed. We propose that the difference of the seismicity beneath the inner and the outer forearc is related to a difference of crustal structure between the inner forearc interpreted as a dense, thick and rigid crustal block and the lighter and more flexible outer forearc. Seismicity is enhanced beneath the inner forearc because it likely increases the vertical stress applied to the subducting plate.

  3. Seismic swarms and diffuse fracturing within Triassic evaporites fed by deep degassing along the low-angle Alto Tiberina normal fault (central Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, Nicola; Giacomuzzi, Genny; Chiarabba, Claudio

    2017-01-01

    We present high-resolution elastic models and relocated seismicity of a very active segment of the Apennines normal faulting system, computed via transdimensional local earthquake tomography (trans-D LET). Trans-D LET, a fully nonlinear approach to seismic tomography, robustly constrains high-velocity anomalies and inversions of P wave velocity, i.e., decreases of VP with depth, without introducing bias due to, e.g., a starting model, and giving the possibility to investigate the relation between fault structure, seismicity, and fluids. Changes in seismicity rate and recurring seismic swarms are frequent in the Apennines extensional belt. Deep fluids, upwelling from the delaminating continental lithosphere, are thought to be responsible for seismicity clustering in the upper crust and lubrication of normal faults during swarms and large earthquakes. We focus on the tectonic role played by the Alto Tiberina low-angle normal fault (ATF), finding displacements across the fault consistent with long-term accommodation of deformation. Our results show that recent seismic swarms affecting the area occur within a 3 km thick, high VP/VS, densely cracked, and overpressurized evaporitic layer, composed of dolostones and anhydrites. A persistent low VP, low VP/VS volume, present on top of and along the ATF low-angle detachment, traces the location of mantle-derived CO2, the upward flux of which contributes to cracking within the evaporitic layer.

  4. Regional seismic lines reprocessed using post-stack processing techniques; National Petroleum Reserve, Alaska

    USGS Publications Warehouse

    Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.

    2000-01-01

    This CD-ROM contains stacked, migrated, 2-dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve - Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small-scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.

  5. Prosthetically guided maxillofacial surgery: evaluation of the accuracy of a surgical guide and custom-made bone plate in oncology patients after mandibular reconstruction.

    PubMed

    Mazzoni, Simona; Marchetti, Claudio; Sgarzani, Rossella; Cipriani, Riccardo; Scotti, Roberto; Ciocca, Leonardo

    2013-06-01

    The aim of the present study was to evaluate the accuracy of prosthetically guided maxillofacial surgery in reconstructing the mandible with a free vascularized flap using custom-made bone plates and a surgical guide to cut the mandible and fibula. The surgical protocol was applied in a study group of seven consecutive mandibular-reconstructed patients who were compared with a control group treated using the standard preplating technique on stereolithographic models (indirect computer-aided design/computer-aided manufacturing method). The precision of both surgical techniques (prosthetically guided maxillofacial surgery and indirect computer-aided design/computer-aided manufacturing procedure) was evaluated by comparing preoperative and postoperative computed tomographic data and assessment of specific landmarks. With regard to midline deviation, no significant difference was documented between the test and control groups. With regard to mandibular angle shift, only one left angle shift on the lateral plane showed a statistically significant difference between the groups. With regard to angular deviation of the body axis, the data showed a significant difference in the arch deviation. All patients in the control group registered greater than 8 degrees of deviation, determining a facial contracture of the external profile at the lower margin of the mandible. With regard to condylar position, the postoperative condylar position was better in the test group than in the control group, although no significant difference was detected. The new protocol for mandibular reconstruction using computer-aided design/computer-aided manufacturing prosthetically guided maxillofacial surgery to construct custom-made guides and plates may represent a viable method of reproducing the patient's anatomical contour, giving the surgeon better procedural control and reducing procedure time. Therapeutic, III.

  6. Seismic Hazard Maps for the Maltese Archipelago: Preliminary Results

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Panzera, F.; Galea, P. M.

    2013-12-01

    The Maltese islands form an archipelago of three major islands lying in the Sicily channel, about 140 km south of Sicily and 300 km north of Libya. So far very few investigations have been carried out on seismicity around the Maltese islands, and no maps of seismic hazard for the archipelago are available. Assessing the seismic hazard for the region is currently of prime interest for the near-future development of industrial and touristic facilities as well as for urban expansion. A culture of seismic risk awareness has never really been developed in the country, and the public perception is that the islands are relatively safe and that any earthquake phenomena are mild and infrequent. However, the archipelago has been struck by several moderate/large events. Although recent constructions of a certain structural and strategic importance have been built according to high engineering standards, the same probably cannot be said for all residential buildings, many higher than 3 storeys, which have mushroomed rapidly in recent years. Such buildings are mostly of unreinforced masonry, with heavy concrete floor slabs, which are known to be highly vulnerable to even moderate ground shaking. In this context, planning and design should be based on available national hazard maps. Unfortunately, such maps are not available for the Maltese islands. In this paper we attempt to compute a first and preliminary probabilistic seismic hazard assessment of the Maltese islands in terms of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at different periods. Seismic hazard has been computed using the Esteva-Cornell (1968) approach, which is the most widely utilized probabilistic method. It is a zone-dependent approach: seismotectonic and geological data are coupled with earthquake catalogues to identify seismogenic zones within which earthquakes occur at certain rates. The earthquake catalogues can therefore be reduced to the activity rate, the b-value of the Gutenberg-Richter relationship, and an estimate of the maximum magnitude. In this article we also define a new seismogenic zone in the central Mediterranean never considered before. In order to determine the ground motion parameters related to a specified probability of exceedance, the above statistical parameters are combined with ground motion prediction equations. Seismic hazard computations have been performed within the island boundaries. The preliminary maps of PGA distribution on rock sites obtained for a 10% probability of exceedance show values ranging between 0.09 and 0.18 g, whereas SA for 0.2, 0.4, and 1.0 s shows values of about 0.21-0.40 g, 0.14-0.24 g, and 0.05-0.08 g, respectively.
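
    A Cornell-type hazard integration of the kind described (Gutenberg-Richter activity rates per seismogenic zone combined with a ground motion prediction equation) can be sketched as follows. All parameters here, including the single source at a fixed distance, the a- and b-values, and the GMPE coefficients, are hypothetical placeholders rather than the values used for the Maltese maps.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical source-zone parameters (Gutenberg-Richter, truncated at Mmax).
    a_value, b_value, m_min, m_max = 3.0, 1.0, 4.5, 7.0
    mags = np.arange(m_min, m_max, 0.1)
    # Annual rate of events in each 0.1-magnitude bin (incremental G-R).
    rates = 10 ** (a_value - b_value * mags) - 10 ** (a_value - b_value * (mags + 0.1))

    # Hypothetical GMPE: ln(PGA[g]) = c0 + c1*M + c2*ln(R), sigma in ln units.
    def gmpe_ln_pga(m, r_km, c0=-3.5, c1=0.9, c2=-1.2, sigma=0.6):
        return c0 + c1 * m + c2 * np.log(r_km), sigma

    site_distance_km = 60.0
    pga_levels = np.array([0.05, 0.1, 0.2, 0.3])

    annual_exceedance = np.zeros_like(pga_levels)
    for m, rate in zip(mags, rates):
        mu, sigma = gmpe_ln_pga(m, site_distance_km)
        # Probability that ground motion exceeds each PGA level, given this event.
        p_exceed = 1.0 - norm.cdf(np.log(pga_levels), loc=mu, scale=sigma)
        annual_exceedance += rate * p_exceed

    for level, rate in zip(pga_levels, annual_exceedance):
        print(f"PGA > {level:.2f} g: annual exceedance rate {rate:.2e}")
    ```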

  7. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

    CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate reliability of complex, redundant, fault-tolerant systems. Program specifically designed for evaluation of fault-tolerant avionics systems. However, CARE III general enough for use in evaluation of other systems as well.

  8. Effectiveness of educational technology to improve patient care in pharmacy curricula.

    PubMed

    Smith, Michael A; Benedict, Neal

    2015-02-17

    A review of the literature on the effectiveness of educational technologies to teach patient care skills to pharmacy students was conducted. Nineteen articles met inclusion criteria for the review. Seven of the articles included computer-aided instruction, 4 utilized human-patient simulation, 1 used both computer-aided instruction and human-patient simulation, and 7 utilized virtual patients. Educational technology was employed with more than 2700 students at 12 colleges and schools of pharmacy in courses including pharmacotherapeutics, skills and patient care laboratories, drug diversion, and advanced pharmacy practice experience (APPE) orientation. Students who learned by means of human-patient simulation and virtual patients reported enjoying the learning activity, whereas the results with computer-aided instruction were mixed. Moreover, the effect on learning was significant in the human-patient simulation and virtual patient studies, while conflicting data emerged on the effectiveness of computer-aided instruction.

  9. Assessment of Chair-side Computer-Aided Design and Computer-Aided Manufacturing Restorations: A Review of the Literature

    PubMed Central

    Baroudi, Kusai; Ibraheem, Shukran Nasser

    2015-01-01

    Background: This paper aimed to evaluate the application of computer-aided design and computer-aided manufacturing (CAD-CAM) technology and the factors that affect the survival of restorations. Materials and Methods: A thorough literature search using PubMed, Medline, Embase, Science Direct, Wiley Online Library, and grey literature was performed from the year 2004 up to June 2014. Only relevant research was considered. Results: The use of chair-side CAD/CAM systems is promising in all dental branches in terms of minimizing the time and effort required of dentists, technicians and patients for restoring and maintaining patient oral function and aesthetics, while providing a high-quality outcome. Conclusion: The way of producing and placing restorations made with chair-side CAD/CAM (CEREC and E4D) devices is better than that of restorations made by conventional laboratory procedures. PMID:25954082

  10. Computer-aided design/computer-aided manufacturing skull base drill.

    PubMed

    Couldwell, William T; MacDonald, Joel D; Thomas, Charles L; Hansen, Bradley C; Lapalikar, Aniruddha; Thakkar, Bharat; Balaji, Alagar K

    2017-05-01

    The authors have developed a simple device for computer-aided design/computer-aided manufacturing (CAD-CAM) that uses an image-guided system to define a cutting tool path that is shared with a surgical machining system for drilling bone. Information from 2D images (obtained via CT and MRI) is transmitted to a processor that produces a 3D image. The processor generates code defining an optimized cutting tool path, which is sent to a surgical machining system that can drill the desired portion of bone. This tool has applications for bone removal in both cranial and spine neurosurgical approaches. Such applications have the potential to reduce surgical time and associated complications such as infection or blood loss. The device enables rapid removal of bone within 1 mm of vital structures. The validity of such a machining tool is exemplified in the rapid (< 3 minutes machining time) and accurate removal of bone for transtemporal (for example, translabyrinthine) approaches.

  11. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    NASA Astrophysics Data System (ADS)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    Along with the development of the agricultural economy, the variety of farm machinery products has gradually increased, and ergonomics questions have become more and more prominent. The widespread application of computer-aided machinery design makes it possible for farm machinery design to be intuitive, flexible and convenient. At present, because existing computer-aided ergonomics software lacks a human body database suited to farm machinery design in China, ergonomics analyses of farm machinery designs show deviations. This article proposes using the open database interface in CATIA to establish a human body database aimed at farm machinery design; by reading the human body data into the ergonomics module of CATIA, a practical virtual body can be produced, and the human posture analysis and human activity analysis modules can be used to analyze the ergonomics of farm machinery. In this way, a computer-aided farm machinery design method based on ergonomics can be realized.

  12. An esthetics rehabilitation with computer-aided design/ computer-aided manufacturing technology.

    PubMed

    Mazaro, José Vitor Quinelli; de Mello, Caroline Cantieri; Zavanelli, Adriana Cristina; Santiago, Joel Ferreira; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza

    2014-07-01

    This paper describes a case of rehabilitation involving the Computer Aided Design/Computer Aided Manufacturing (CAD-CAM) system in implant-supported and tooth-supported prostheses using zirconia as the framework. CAD-CAM technology has developed considerably over the last few years, becoming a reality in dental practice. Among the widely used systems are those based on zirconia, which demonstrate important physical and mechanical properties, including high strength, adequate fracture toughness, biocompatibility and esthetics, and are indicated for unitary prosthetic restorations and posterior and anterior frameworks. All the modeling was performed using the CAD-CAM system, and the prostheses were cemented using the resin cement best suited for each situation. The rehabilitation of the maxillary arch using a zirconia framework demonstrated satisfactory esthetic and functional results after a 12-month control and revealed no biological or technical complications. This article shows the importance of using CAD/CAM technology in the manufacture of dental and implant-supported prostheses.

  13. Computer-aided detection and quantification of endolymphatic hydrops within the mouse cochlea in vivo using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, George S.; Kim, Jinkyung; Applegate, Brian E.; Oghalai, John S.

    2017-07-01

    Diseases that cause hearing loss and/or vertigo in humans such as Meniere's disease are often studied using animal models. The volume of endolymph within the inner ear varies with these diseases. Here, we used a mouse model of increased endolymph volume, endolymphatic hydrops, to develop a computer-aided objective approach to measure endolymph volume from images collected in vivo using optical coherence tomography. The displacement of Reissner's membrane from its normal position was measured in cochlear cross sections. We validated our computer-aided measurements with manual measurements and with trained observer labels. This approach allows for computer-aided detection of endolymphatic hydrops in mice, with test performance showing sensitivity of 91% and specificity of 87% using a running average of five measurements. These findings indicate that this approach is accurate and reliable for classifying endolymphatic hydrops and quantifying endolymph volume.

  14. Automated seismic detection of landslides at regional scales: a Random Forest based detection algorithm

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Michéa, D.; Provost, F.; Malet, J. P.; Geertsema, M.

    2017-12-01

    Detection of landslide occurrences and measurement of their dynamic properties during run-out is a high research priority but a logistical and technical challenge. Seismology has started to help in several important ways. Taking advantage of the densification of global, regional and local networks of broadband seismic stations, recent advances now permit the seismic detection and location of landslides in near-real-time. This seismic detection could potentially greatly increase the spatio-temporal resolution at which we study landslide triggering, which is critical to better understand the influence of external forcings such as rainfall and earthquakes. However, automatically detecting the seismic signals generated by landslides still represents a challenge, especially for events with small mass. The low signal-to-noise ratio classically observed for landslide-generated seismic signals and the difficulty of discriminating these signals from those generated by regional earthquakes or anthropogenic and natural noise are some of the obstacles that have to be circumvented. We present a new method for automatically constructing instrumental landslide catalogues from continuous seismic data. We developed a robust and versatile solution, which can be implemented in any context where a seismic detection of landslides or other mass movements is relevant. The method is based on a spectral detection of the seismic signals and the identification of the sources with a Random Forest machine learning algorithm. The spectral detection allows detecting signals with low signal-to-noise ratio, while the Random Forest algorithm achieves a high rate of positive identification of the seismic signals generated by landslides and other seismic sources. The processing chain is implemented to run at a High Performance Computing centre, which permits exploring years of continuous seismic data rapidly. We present here the preliminary results of the application of this processing chain to years of continuous seismic records from the Alaskan permanent seismic network and the Hi-Climb trans-Himalayan seismic network. The processing chain we developed also opens the possibility of near-real-time seismic detection of landslides, in association with remote-sensing automated detection from Sentinel 2 images, for example.
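
    The classification stage of such a pipeline (spectral features fed to a Random Forest classifier) can be sketched with scikit-learn. The band-averaged spectrogram features and the synthetic training traces below are crude stand-ins, not the authors' feature set or data.

    ```python
    import numpy as np
    from scipy.signal import spectrogram
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    FS = 100.0  # sampling rate in Hz (assumed)

    def spectral_features(trace, n_bands=8):
        """Crude feature vector: mean log power in a few frequency bands."""
        f, _, Sxx = spectrogram(trace, fs=FS, nperseg=256)
        edges = np.linspace(0, len(f), n_bands + 1, dtype=int)
        return [np.log10(Sxx[lo:hi].mean() + 1e-20) for lo, hi in zip(edges[:-1], edges[1:])]

    rng = np.random.default_rng(0)

    def synthetic_trace(label):
        """Label 1: emergent low-frequency signal in noise; label 0: noise only."""
        noise = rng.normal(size=6000)
        if label == 1:
            t = np.arange(6000) / FS
            noise += 3.0 * np.hanning(6000) * np.sin(2 * np.pi * 2.0 * t)
        return noise

    labels = rng.integers(0, 2, size=300)
    X = np.array([spectral_features(synthetic_trace(y)) for y in labels])

    X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
    ```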

  15. Real-time Seismic Amplitude Measurement (RSAM): a volcano monitoring and prediction tool

    USGS Publications Warehouse

    Endo, E.T.; Murray, T.

    1991-01-01

    Seismicity is one of the most commonly monitored phenomena used to determine the state of a volcano and for the prediction of volcanic eruptions. Although several real-time earthquake-detection and data acquisition systems exist, few continuously measure seismic amplitude in circumstances where individual events are difficult to recognize or where volcanic tremor is prevalent. Analog seismic records provide a quick visual overview of activity; however, continuous rapid quantitative analysis to define the intensity of seismic activity for the purpose of predicting volcanic eruptions is not always possible because of clipping that results from the limited dynamic range of analog recorders. At the Cascades Volcano Observatory, an inexpensive 8-bit analog-to-digital system controlled by a laptop computer is used to provide 1-min average-amplitude information from eight telemetered seismic stations. The absolute voltage level for each station is digitized, averaged, and appended in near real-time to a data file on a multiuser computer system. Raw real-time seismic amplitude measurement (RSAM) data or transformed RSAM data are then plotted on a common time base with other available volcano-monitoring information such as tilt. Changes in earthquake activity associated with dome-building episodes, weather, and instrumental difficulties are recognized as distinct patterns in the RSAM data set. RSAM data for dome-building episodes gradually develop into exponential increases that terminate just before the time of magma extrusion. Mount St. Helens crater earthquakes show up as isolated spikes on amplitude plots for crater seismic stations but seldom for more distant stations. Weather-related noise shows up as low-level, long-term disturbances on all seismic stations, regardless of distance from the volcano. Implemented in mid-1985, the RSAM system has proved valuable in providing up-to-date information on seismic activity for three Mount St. Helens eruptive episodes from 1985 to 1986 (May 1985, May 1986, and October 1986). Tiltmeter data, the only other telemetered geophysical information available for the three dome-building episodes, are compared to RSAM data to show that the increase in RSAM data was related to the transport of magma to the surface. Thus, if tiltmeter data are not available, RSAM data can be used to predict future magmatic eruptions at Mount St. Helens. We also recognize the limitations of RSAM data. Two examples of RSAM data associated with phreatic or shallow phreatomagmatic explosions were not preceded by the same increases in RSAM data or changes in tilt associated with the three dome-building eruptions. © 1991 Springer-Verlag.
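
    The core RSAM quantity, a one-minute average of absolute seismic amplitude, is straightforward to reproduce; a minimal sketch on a synthetic trace, assuming a 100-Hz sampling rate, is shown below.

    ```python
    import numpy as np

    FS = 100           # samples per second (assumed)
    WINDOW = 60 * FS   # one-minute window

    def rsam(trace, window=WINDOW):
        """Return one value per full window: mean absolute amplitude (RSAM-style)."""
        n = len(trace) // window
        return np.abs(trace[:n * window]).reshape(n, window).mean(axis=1)

    # Synthetic hour of data with gradually increasing tremor-like amplitude.
    t = np.arange(3600 * FS) / FS
    trace = np.random.default_rng(1).normal(size=t.size) * (1.0 + t / 1800.0)
    r = rsam(trace)
    print(r[:3], "...", r[-3:])   # amplitude level rises through the hour
    ```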

  16. Computer Aided Dosimetry and Verification of Exposure to Radiation

    DTIC Science & Technology

    2002-06-01

    Computer Aided Dosimetry and Verification of Exposure to Radiation. Edward Waller, SAIC Canada; Robert Z. Stodilka, Radiation Effects Group, Space Systems and ... Defence Research and Development Canada / Recherche et développement pour la défense Canada. Indexed report topics include an event matrix; hematopoietic indicators (absolute and relative blood counts); and dosimetry (TLD, EPD, radiation survey, whole body ...).

  17. Interval-type and affine arithmetic-type techniques for handling uncertainty in expert systems

    NASA Astrophysics Data System (ADS)

    Ceberio, Martine; Kreinovich, Vladik; Chopra, Sanjeev; Longpre, Luc; Nguyen, Hung T.; Ludascher, Bertram; Baral, Chitta

    2007-02-01

    Expert knowledge consists of statements Sj (facts and rules). The facts and rules are often only true with some probability. For example, if we are interested in oil, we should look at seismic data. If in 90% of the cases the seismic data were indeed helpful in locating oil, then we can say that if we are interested in oil, then with probability 90% it is helpful to look at the seismic data. In more formal terms, we can say that the implication "if oil then seismic" holds with probability 90%. Another example: a bank A trusts a client B, so if we trust the bank A, we should trust B too; if statistically this trust was justified in 99% of the cases, we can conclude that the corresponding implication holds with probability 99%. If a query Q is deducible from facts and rules, what is the resulting probability p(Q) of Q? We can describe the truth of Q as a propositional formula F in terms of Sj, i.e., as a combination of statements Sj linked by operators like &, ∨, and ¬; computing p(Q) exactly is NP-hard, so heuristics are needed. Traditionally, expert systems use a technique similar to straightforward interval computations: we parse F and replace each computation step with the corresponding probability operation. Problem: at each step, we ignore the dependence between the intermediate results Fj; hence the intervals are too wide. Example: the estimate for P(A ∨ ¬A) is not 1. Solution: similar to affine arithmetic, besides P(Fj), we also compute P(Fj & Fi) (or P(Fj1 & ... & Fjd)), and on each step use all combinations of l such probabilities to get new estimates. Results: e.g., P(A ∨ ¬A) is estimated as 1.
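
    The point about dependence can be illustrated with the classical Fréchet bounds, which are what one obtains when only the marginal probabilities are known. The sketch below is a generic illustration of why naive step-by-step propagation yields wide intervals; it is not the affine-arithmetic-style algorithm the paper proposes.

    ```python
    def and_bounds(pa, pb):
        """Frechet bounds for P(A & B) given only P(A) and P(B)."""
        return max(0.0, pa + pb - 1.0), min(pa, pb)

    def or_bounds(pa, pb):
        """Frechet bounds for P(A or B) given only P(A) and P(B)."""
        return max(pa, pb), min(1.0, pa + pb)

    p_a = 0.9
    p_not_a = 1.0 - p_a
    # Treating A and not-A as unrelated statements gives a wide interval...
    print(or_bounds(p_a, p_not_a))      # (0.9, 1.0) -- not pinned to 1
    # ...whereas tracking the joint information P(A & not-A) = 0 recovers the exact value:
    print(p_a + p_not_a - 0.0)          # 1.0
    ```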

  18. An Annotated Bibliography of Patents Related to Coastal Engineering. Volume I. 1967-1970. Appendix.

    DTIC Science & Technology

    1979-11-01

    Subject index terms include: hydraulic model basin; pile, structure connection; ice protection; pile, wood; pollutant absorption; seismic acoustic transmitter array; seabed pipeline placement; wind measurement; seabed property measurement; wood preservative; seabed scour protection; seabed site survey; seabed soil ... Glossary fragments include: "... concrete, wood, or thin steel piling to aid driving." PILE EXTRACTOR - A means of removing a pile from the earth. PILE FOOTING - A means of increasing a ...

  19. Modernization of the Slovenian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Vidrih, R.; Godec, M.; Gosar, A.; Sincic, P.; Tasic, I.; Zivcic, M.

    2003-04-01

    The Environmental Agency of the Republic of Slovenia, Seismology Office, is responsible for fast and reliable information about earthquakes originating in the area of Slovenia and nearby. In the year 2000 the project Modernization of the Slovenian National Seismic Network started. The purpose of the modernized seismic network is to enable fast and accurate automatic location of earthquakes, to determine earthquake parameters, and to collect data on local, regional and global earthquakes. The modernized network will be finished in the year 2004 and will consist of 25 remote broadband station subsystems based on Q730 data loggers, transmitting data in real time to the Data Center in Ljubljana, where the Seismology Office is located. The remote broadband station subsystems include 16 surface broadband seismometers CMG-40T, 5 broadband seismometers CMG-40T with strong-motion accelerographs EpiSensor, and 4 borehole broadband seismometers CMG-40T, all with accurate timing provided by GPS receivers. The seismic network will cover the entire Slovenian territory, an area of 20,256 km2. The network is planned so that more seismic stations are located around larger urban centres and in regions with greater vulnerability (NW Slovenia, Krsko-Brezice region). By the end of the year 2002, three old seismic stations had been modernized and ten new seismic stations had been built. All seismic stations transmit data to UNIX-based computers running Antelope system software. The data are transmitted in real time using TCP/IP protocols over the Government Wide Area Network. Real-time data are also exchanged with seismic networks in the neighbouring countries, where data are collected from the seismic stations close to the Slovenian border. A typical seismic station consists of the seismic shaft with the sensor and the data acquisition system, and the service shaft with communication equipment (modem, router) and a power supply with a battery box, which provides energy in case of mains failure. The data acquisition systems record continuous time-series sampled at 200 sps, 20 sps and 1 sps.

  20. Seismic hazard estimation of northern Iran using smoothed seismicity

    NASA Astrophysics Data System (ADS)

    Khoshnevis, Naeem; Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Cramer, Chris H.

    2017-07-01

    This article presents a seismic hazard assessment for northern Iran, where a smoothed seismicity approach has been used in combination with an updated seismic catalog and a ground motion prediction equation recently found to yield good fit with data. We evaluate the hazard over a geographical area including the seismic zones of Azerbaijan, the Alborz Mountain Range, and Kopeh-Dagh, as well as parts of other neighboring seismic zones that fall within our region of interest. In the chosen approach, seismic events are not assigned to specific faults but assumed to be potential seismogenic sources distributed within regular grid cells. After performing the corresponding magnitude conversions, we decluster both historical and instrumental seismicity catalogs to obtain earthquake rates based on the number of events within each cell, and smooth the results to account for the uncertainty in the spatial distribution of future earthquakes. Seismicity parameters are computed for each seismic zone separately, and for the entire region of interest as a single uniform seismotectonic region. In the analysis, we consider uncertainties in the ground motion prediction equation, the seismicity parameters, and combine the resulting models using a logic tree. The results are presented in terms of expected peak ground acceleration (PGA) maps and hazard curves at selected locations, considering exceedance probabilities of 2 and 10% in 50 years for rock site conditions. According to our results, the highest levels of hazard are observed west of the North Tabriz and east of the North Alborz faults, where expected PGA values are between about 0.5 and 1 g for 10 and 2% probability of exceedance in 50 years, respectively. We analyze our results in light of similar estimates available in the literature and offer our perspective on the differences observed. We find our results to be helpful in understanding seismic hazard for northern Iran, but recognize that additional efforts are necessary to obtain more robust estimates at specific areas of interest and different site conditions.
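
    The smoothed-seismicity step described (counting declustered events per grid cell and smoothing to represent the spatial uncertainty of future earthquakes) is commonly implemented with a Gaussian kernel, as in Frankel-type models. A minimal sketch with made-up catalogue coordinates:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Hypothetical declustered catalogue: epicentre longitudes and latitudes.
    rng = np.random.default_rng(42)
    lons = rng.uniform(44.0, 60.0, size=500)
    lats = rng.uniform(34.0, 40.0, size=500)

    # Count events on a 0.1-degree grid.
    lon_edges = np.arange(44.0, 60.01, 0.1)
    lat_edges = np.arange(34.0, 40.01, 0.1)
    counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])

    # Gaussian smoothing with a correlation distance of ~0.5 degrees (5 cells).
    smoothed_rate = gaussian_filter(counts, sigma=5.0)
    print(counts.sum(), smoothed_rate.sum())   # total rate is approximately preserved
    ```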

  1. Diffraction of seismic waves from 3-D canyons and alluvial basins modeled using the Fast Multipole-accelerated BEM

    NASA Astrophysics Data System (ADS)

    Chaillat, S.; Bonnet, M.; Semblat, J.

    2007-12-01

    Seismic wave propagation and amplification in complex media is a major issue in the field of seismology. To compute seismic wave propagation in complex geological structures such as alluvial basins, various numerical methods have been proposed. The main advantage of the Boundary Element Method (BEM) is that only the domain boundaries (and possibly interfaces) are discretized, leading to a reduction of the number of degrees of freedom. The main drawback of the standard BEM is that the governing matrix is full and non-symmetric, which gives rise to high computational and memory costs. In other areas where the BEM is used (electromagnetism, acoustics), considerable speedup of solution time and decrease of memory requirements have been achieved through the development, over the last decade, of the Fast Multipole Method (FMM). The goal of the FMM is to speed up the matrix-vector product computation needed at each iteration of the GMRES iterative solver. Moreover, the governing matrix is never explicitly formed, which leads to a storage requirement well below the memory necessary for holding the complete matrix. The FMM-accelerated BEM therefore achieves substantial savings in both CPU time and memory. In this work, the FMM is extended to 3-D frequency-domain elastodynamics and applied to the computation of seismic wave propagation in 3-D. The efficiency of the present FMM-BEM is demonstrated on seismology-oriented examples. First, the diffraction of a plane wave or a point source by a 3-D canyon is studied. The influence of the size of the meshed part of the free surface is studied, and computations are performed for non-dimensional frequencies higher than those considered in other studies (thanks to the use of the FM-BEM), with which comparisons are made whenever possible. The method is also applied to analyze the diffraction of a plane wave or a point source by a 3-D alluvial basin. A parametric study is performed on the effect of the shape of the basin, and the interaction of the wavefield with the basin edges is analyzed.
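
    The point that the FMM-accelerated BEM never forms the governing matrix explicitly, but only supplies fast matrix-vector products to the GMRES solver, can be illustrated with SciPy's matrix-free interface. The operator below is a dense toy stand-in; in a real FMM-BEM code the matvec would evaluate the boundary integral operator through multipole expansions.

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    n = 500
    rng = np.random.default_rng(0)
    A_dense = np.eye(n) + 0.01 * rng.normal(size=(n, n))   # toy, well-conditioned system
    b = rng.normal(size=n)

    # In an FMM-BEM code, `matvec` would evaluate the integral operator via
    # multipole expansions instead of touching a stored matrix.
    def matvec(x):
        return A_dense @ x

    A_op = LinearOperator((n, n), matvec=matvec, dtype=float)
    x, info = gmres(A_op, b)
    print("converged" if info == 0 else f"gmres returned {info}",
          "| residual:", np.linalg.norm(A_dense @ x - b))
    ```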

  2. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    Few automated data acquisition and processing systems operate on mainframes, some run on UNIX-based workstations and others on personal computers, equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems). Some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and a real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station-analysis), for signal detection, phase grouping, and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetry analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) by the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, both automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
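
    Automatic pickers of this kind are typically built on energy-ratio triggers. As a generic illustration only (this is a textbook STA/LTA detector, not the MSA procedure implemented in ASDP), a single-station trigger can be written as:

    ```python
    import numpy as np

    def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
        """Classic STA/LTA ratio on the squared trace, windows ending at the same sample."""
        sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
        energy = trace.astype(float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
        lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
        n = min(len(sta), len(lta))
        return sta[-n:] / (lta[:n] + 1e-12)

    fs = 100.0
    rng = np.random.default_rng(3)
    trace = rng.normal(size=int(120 * fs))
    trace[int(60 * fs):int(62 * fs)] += 8.0 * rng.normal(size=int(2 * fs))  # synthetic event
    ratio = sta_lta(trace, fs)
    print("max STA/LTA:", ratio.max(), "| trigger:", ratio.max() > 3.0)
    ```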

  3. Comprehensive seismic monitoring of the Cascadia megathrust with real-time GPS

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C. W.; Webb, F.

    2013-12-01

    We have developed a comprehensive real-time GPS-based seismic monitoring system for the Cascadia subduction zone based on 1- and 5-second point position estimates computed within the ITRF08 reference frame. A Kalman-filter stream editor pre-cleans the raw satellite measurements, using a geometry-free combination of phase and range observables to speed convergence while also producing independent estimates of carrier phase biases and ionosphere delay. These are then analyzed with GIPSY-OASIS using satellite clock and orbit corrections streamed continuously from the International GNSS Service (IGS) and the German Aerospace Center (DLR). The resulting RMS position scatter is less than 3 cm, and typical latencies are under 2 seconds. Currently 31 coastal Washington, Oregon, and northern California stations from the combined PANGA and PBO networks are analyzed. We are now ramping up to include all of the remaining 400+ stations currently operating throughout the Cascadia subduction zone, all of which are high-rate and telemetered in real time to CWU. These receivers span the M9 megathrust, M7 crustal faults beneath population centers, several active Cascades volcanoes, and a host of other hazard sources. To use the point position streams for seismic monitoring, we have developed an inter-process client communication package that captures, buffers and re-broadcasts real-time positions and covariances to a variety of seismic estimation routines running on distributed hardware. An aggregator ingests, re-streams and can rebroadcast up to 24 hours of point positions and resultant seismic estimates derived from the point positions to application clients distributed across the web. A suite of seismic monitoring applications has also been written, which includes position time series analysis, instantaneous displacement vectors, and peak ground displacement contouring and mapping. We have also implemented a continuous estimation of finite-fault slip along the Cascadia megathrust using a NIF-type approach. This currently operates on the terrestrial GPS data streams, but could readily be expanded to use real-time offshore geodetic measurements as well. The continuous slip distributions are used in turn to compute tsunami excitation and, when convolved with pre-computed hydrodynamic Green functions calculated using the COMCOT tsunami modeling software, run-up estimates for the entire Cascadia coastal margin. Finally, a suite of data visualization tools has been written to allow interaction with the real-time position streams and the seismic estimates based on them, including time series plotting, instantaneous offset vectors, peak ground deformation contouring, finite-fault inversions, and tsunami run-up. This suite is currently bundled within a single client written in JAVA, called 'GPS Cockpit,' which is available for download.
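
    One of the simpler downstream products mentioned, peak ground displacement, reduces to differencing each position stream against a pre-event reference. A sketch on a synthetic 1-Hz east/north/up series with made-up noise and offset values:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(0, 600)                                    # 1-Hz epochs, 10 minutes
    enu = rng.normal(scale=0.02, size=(3, t.size))           # ~2 cm position scatter (m)
    enu[:, 300:] += np.array([[0.15], [-0.05], [0.02]])      # synthetic coseismic offset (m)

    reference = enu[:, :60].mean(axis=1, keepdims=True)      # pre-event reference position
    displacement = enu - reference
    pgd = np.max(np.linalg.norm(displacement, axis=0))       # peak 3-D ground displacement
    print(f"PGD = {pgd:.3f} m")
    ```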

  4. The Quake-Catcher Network: Improving Earthquake Strong Motion Observations Through Community Engagement

    NASA Astrophysics Data System (ADS)

    Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.; Chung, A. I.; Neighbors, C.; Saltzman, J.

    2010-12-01

    The Quake-Catcher Network (QCN) involves the community in strong motion data collection by utilizing volunteer computing techniques and low-cost MEMS accelerometers. Volunteer computing provides a mechanism to expand strong-motion seismology with minimal infrastructure costs, while promoting community participation in science. Micro-Electro-Mechanical Systems (MEMS) triaxial accelerometers can be attached to a desktop computer via USB and are internal to many laptops. Preliminary shake table tests show the MEMS accelerometers can record high-quality seismic data with instrument response similar to research-grade strong-motion sensors. QCN began distributing sensors and software to K-12 schools and the general public in April 2008 and has grown to roughly 1500 stations worldwide. We also recently tested whether sensors could be quickly deployed as part of a Rapid Aftershock Mobilization Program (RAMP) following the 2010 M8.8 Maule, Chile earthquake. Volunteers are recruited through media reports, web-based sensor request forms, as well as social networking sites. Using data collected to date, we examine whether a distributed sensing network can provide valuable seismic data for earthquake detection and characterization while promoting community participation in earthquake science. We utilize client-side triggering algorithms to determine when significant ground shaking occurs and this metadata is sent to the main QCN server. On average, trigger metadata are received within 1-10 seconds from the observation of a trigger; the larger data latencies are correlated with greater server-station distances. When triggers are detected, we determine if the triggers correlate to others in the network using spatial and temporal clustering of incoming trigger information. If a minimum number of triggers are detected then a QCN-event is declared and an initial earthquake location and magnitude is estimated. Initial analysis suggests that the estimated locations and magnitudes are similar to those reported in regional and global catalogs. As the network expands, it will become increasingly important to provide volunteers access to the data they collect, both to encourage continued participation in the network and to improve community engagement in scientific discourse related to seismic hazard. In the future, we hope to provide access to both images and raw data from seismograms in formats accessible to the general public through existing seismic data archives (e.g. IRIS, SCSN) and/or through the QCN project website. While encouraging community participation in seismic data collection, we can extend the capabilities of existing seismic networks to rapidly detect and characterize strong motion events. In addition, the dense waveform observations may provide high-resolution ground shaking information to improve source imaging and seismic risk assessment.
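
    The server-side event declaration described (spatial and temporal clustering of incoming station triggers) can be sketched as a simple coincidence test. The thresholds and the flat-earth distance approximation below are arbitrary illustrative choices, not the QCN settings.

    ```python
    from dataclasses import dataclass
    from math import cos, radians, sqrt

    @dataclass
    class Trigger:
        time: float   # seconds
        lat: float
        lon: float

    def km_between(a: Trigger, b: Trigger) -> float:
        """Rough flat-earth distance, adequate for a coincidence test over ~100 km."""
        dlat = (a.lat - b.lat) * 111.0
        dlon = (a.lon - b.lon) * 111.0 * cos(radians(0.5 * (a.lat + b.lat)))
        return sqrt(dlat ** 2 + dlon ** 2)

    def declare_event(triggers, window_s=10.0, radius_km=100.0, min_stations=4):
        """True if at least min_stations triggers cluster within the time window and radius."""
        for anchor in triggers:
            cluster = [t for t in triggers
                       if abs(t.time - anchor.time) <= window_s
                       and km_between(t, anchor) <= radius_km]
            if len(cluster) >= min_stations:
                return True
        return False

    demo = [Trigger(0.0, 34.0, -118.0), Trigger(2.1, 34.1, -118.2),
            Trigger(3.0, 33.9, -117.9), Trigger(4.5, 34.2, -118.1)]
    print(declare_event(demo))   # True
    ```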

  5. A Metric for Reducing False Positives in the Computer-Aided Detection of Breast Cancer from Dynamic Contrast-Enhanced Magnetic Resonance Imaging Based Screening Examinations of High-Risk Women.

    PubMed

    Levman, Jacob E D; Gallego-Ortiz, Cristina; Warner, Ellen; Causer, Petrina; Martel, Anne L

    2016-02-01

    Magnetic resonance imaging (MRI)-enabled cancer screening has been shown to be a highly sensitive method for the early detection of breast cancer. Computer-aided detection systems have the potential to improve the screening process by standardizing radiologists to a high level of diagnostic accuracy. This retrospective study was approved by the institutional review board of Sunnybrook Health Sciences Centre. This study compares the performance of a proposed method for computer-aided detection (based on the second-order spatial derivative of the relative signal intensity) with the signal enhancement ratio (SER) on MRI-based breast screening examinations. Comparison is performed using receiver operating characteristic (ROC) curve analysis as well as free-response receiver operating characteristic (FROC) curve analysis. A modified computer-aided detection system combining the proposed approach with the SER method is also presented. The proposed method provides improvements in the rates of false positive markings over the SER method in the detection of breast cancer (as assessed by FROC analysis). The modified computer-aided detection system that incorporates both the proposed method and the SER method yields ROC results equal to that produced by SER while simultaneously providing improvements over the SER method in terms of false positives per noncancerous exam. The proposed method for identifying malignancies outperforms the SER method in terms of false positives on a challenging dataset containing many small lesions and may play a useful role in breast cancer screening by MRI as part of a computer-aided detection system.
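
    The two voxel-wise quantities compared in the study can be written compactly: the signal enhancement ratio (SER) from a three-time-point dynamic series, and a second-order spatial derivative of the relative signal intensity, here approximated with a Laplacian-of-Gaussian filter. This is only an indicative sketch on synthetic volumes; the exact formulation used by the authors is not reproduced.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_laplace

    rng = np.random.default_rng(7)
    pre, early, late = (rng.random((64, 64, 32)) + 0.5 for _ in range(3))  # synthetic volumes

    eps = 1e-6
    # Signal enhancement ratio: early enhancement relative to late enhancement.
    ser = (early - pre) / (late - pre + eps)

    # Relative signal intensity at the early time point, and its second-order
    # spatial derivative approximated with a Laplacian-of-Gaussian filter.
    relative_intensity = (early - pre) / (pre + eps)
    second_derivative = gaussian_laplace(relative_intensity, sigma=1.5)

    print(ser.shape, second_derivative.shape)
    ```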

  6. RMT focal plane sensitivity to seismic network geometry and faulting style

    USGS Publications Warehouse

    Johnson, Kendra L.; Hayes, Gavin; Herrmann, Robert B.; Benz, Harley M.; McNamara, Daniel E.; Bergman, Eric A.

    2016-01-01

    Modern tectonic studies often use regional moment tensors (RMTs) to interpret the seismotectonic framework of an earthquake or earthquake sequence; however, despite extensive use, little existing work addresses RMT parameter uncertainty. Here, we quantify how network geometry and faulting style affect RMT sensitivity. We examine how data-model fits change with fault plane geometry (strike and dip) for varying station configurations. We calculate the relative data fit for incrementally varying geometries about a best-fitting solution, applying our workflow to real and synthetic seismograms for both real and hypothetical station distributions and earthquakes. Initially, we conduct purely observational tests, computing RMTs from synthetic seismograms for hypothetical earthquakes and a series of well-behaved network geometries. We then incorporate real data and station distributions from the International Maule Aftershock Deployment (IMAD), which recorded aftershocks of the 2010 MW 8.8 Maule earthquake, and a set of regional stations capturing the ongoing earthquake sequence in Oklahoma and southern Kansas. We consider RMTs computed under three scenarios: (1) real seismic records selected for high data quality; (2) synthetic seismic records with noise computed for the observed source-station pairings and (3) synthetic seismic records with noise computed for all possible station-source pairings. To assess RMT sensitivity for each test, we observe the ‘fit falloff’, which portrays how relative fit changes when strike or dip varies incrementally; we then derive the ranges of acceptable strikes and dips by identifying the span of solutions with relative fits larger than 90 per cent of the best fit. For the azimuthally incomplete IMAD network, Scenario 3 best constrains fault geometry, with average ranges of 45° and 31° for strike and dip, respectively. In Oklahoma, Scenario 3 best constrains fault dip with an average range of 46°; however, strike is best constrained by Scenario 1, with a range of 26°. We draw two main conclusions from this study. (1) Station distribution impacts our ability to constrain RMTs using waveform time-series; however, in some tectonic settings, faulting style also plays a significant role and (2) increasing station density and data quantity (both the number of stations and the number of individual channels) does not necessarily improve RMT constraint. These results may be useful when organizing future seismic deployments (e.g. by concentrating stations in alignment with anticipated nodal planes), and in computing RMTs, either by guiding a more rigorous data selection process for input data or informing variable weighting among the selected data (e.g. by eliminating the transverse component when strike-slip mechanisms are expected).
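
    The 'fit falloff' analysis reduces to scanning a grid of trial geometries and reporting the span whose relative fit stays above 90 per cent of the best fit; a sketch with a synthetic fit curve (strike wrap-around at 0/360 degrees is ignored for simplicity):

    ```python
    import numpy as np

    def acceptable_range(params, relative_fit, threshold=0.9):
        """Span of the parameter values whose fit exceeds threshold x best fit."""
        good = params[relative_fit >= threshold * relative_fit.max()]
        return good.min(), good.max(), good.max() - good.min()

    strikes = np.arange(0.0, 360.0, 2.0)
    # Synthetic falloff: a broad Gaussian peak around a "true" strike of 210 degrees.
    fit = np.exp(-0.5 * ((strikes - 210.0) / 25.0) ** 2)
    print(acceptable_range(strikes, fit))   # (200.0, 220.0, 20.0) on this 2-degree grid
    ```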

  7. Computer Aided Drafting Packages for Secondary Education. Edition 2. PC DOS Compatible Programs. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Pollard, Jim

    This report reviews eight IBM-compatible software packages that are available to secondary schools to teach computer-aided drafting (CAD). Software packages to be considered were selected following reviews of CAD periodicals, computers in education periodicals, advertisements, and recommendations of teachers. The packages were then rated by…

  8. Siren Songs and a Skeptic.

    ERIC Educational Resources Information Center

    O'Brien, George M.

    This paper reports on an on-going experiment in computer-aided language instruction. In 1972, a class of beginning German students at the Duluth campus of the University of Minnesota volunteered to test two pedagogic theories: (1) Could a computer-aided course be used by a class and an instructor who knew nothing of computers and who had to rely…

  9. PEO Integration Acronym Book

    DTIC Science & Technology

    2011-02-01

    ... Command; CASE - Computer Aided Software Engineering; CASEVAC - Casualty Evacuation; CASTFOREM - Combined Arms and Support Task Force Evaluation Model; CAT - Center for Advanced Technologies; CAT - Civil Affairs Team; CAT - Combined Arms Training; CAT - Crew Integration; CAT - Crisis Action Team; CATIA - Computer-Aided Three-Dimensional Interactive Application; CATOX - Catalytic Oxidation; CATS - Combined Arms Training Strategy; CATT - Combined Arms Tactical Trainer; CATT - Computer ...

  10. No fault of their own: Increasing public awareness of earthquakes in aseismic regions

    NASA Astrophysics Data System (ADS)

    Galvin, J. L.; Pickering, R. A.; Wetzel, L. R.

    2011-12-01

    EarthScope's Transportable Array (TA) project is installing seismographs across the US, progressing from North America's seismically active West Coast to the passive Atlantic margin. The array consists of 400 seismic stations spaced ~70 km apart for a continental-scale experiment lasting 15 years. A student/faculty team from Eckerd College participated by using computer-based tools to identify potential seismograph sites; conducting field investigations to confirm site suitability; initiating contact with landowners; and preparing reconnaissance reports for future earthquake recording stations in Florida. An ideal seismograph site is in a quiet, dry, unshaded, open area that is remote yet accessible, with cellular network coverage and a willing private landowner. Scouting for site locations presented many challenges, including land use and ownership patterns; low-lying, flooded topography; noisy Atlantic and Gulf coastal regions; extensive river and lake systems; environmentally protected areas; road patterns with high traffic; urban population centers; and a populace unfamiliar with earthquakes. While many of these factors were unavoidable, developing the public's interest in seismology was a crucial step in gaining landowner participation. The majority of those approached were unfamiliar with the importance of earthquake research in an aseismic location. Being presented with this challenge encouraged the team to formulate different approaches to promote public interest and understanding of earthquake research in locations indirectly affected by seismic activity. Throughout the project, landowners expressed greater interest or were more likely to participate for a variety of reasons. For instance, landowners that had personal experience with earthquakes, were involved with the scientific community, or had previously collaborated with other research projects were most receptive to participating in the TA program. From this observation, it became clear that relating potential site hosts to earthquake events or the scientific research process was beneficial for gaining citizen support. For example, many landowners expressed interest in seismic research if they or their family members had experienced an earthquake. For residents lacking a personal association with earthquakes or science in general, it was important to explain why recording earthquakes in a seismically inactive area could be beneficial. For instance, explaining that data collected from the TA project could aid in research of other events including hurricanes and sink holes made the program seem more pertinent to Florida citizens. After spending the summer in contact with Florida residents, the team established that the most effective route to cultivate public interest in seismology was to make the study's purpose applicable to their everyday lives. In doing so, citizens felt directly connected to the project, and were therefore more enthusiastic to participate and become educated on the topic of seismology.

  11. Cyberinfrastructure for the Unified Study of Earth Structure and Earthquake Sources in Complex Geologic Environments

    NASA Astrophysics Data System (ADS)

    Zhao, L.; Chen, P.; Jordan, T. H.; Olsen, K. B.; Maechling, P.; Faerman, M.

    2004-12-01

    The Southern California Earthquake Center (SCEC) is developing a Community Modeling Environment (CME) to facilitate the computational pathways of physics-based seismic hazard analysis (Maechling et al., this meeting). Major goals are to facilitate the forward modeling of seismic wavefields in complex geologic environments, including the strong ground motions that cause earthquake damage, and the inversion of observed waveform data for improved models of Earth structure and fault rupture. Here we report on a unified approach to these coupled inverse problems that is based on the ability to generate and manipulate wavefields in densely gridded 3D Earth models. A main element of this approach is a database of receiver Green tensors (RGT) for the seismic stations, which comprises all of the spatial-temporal displacement fields produced by the three orthogonal unit impulsive point forces acting at each of the station locations. Once the RGT database is established, synthetic seismograms for any earthquake can be simply calculated by extracting a small, source-centered volume of the RGT from the database and applying the reciprocity principle. The partial derivatives needed for point- and finite-source inversions can be generated in the same way. Moreover, the RGT database can be employed in full-wave tomographic inversions launched from a 3D starting model, because the sensitivity (Fréchet) kernels for travel-time and amplitude anomalies observed at seismic stations in the database can be computed by convolving the earthquake-induced displacement field with the station RGTs. We illustrate all elements of this unified analysis with an RGT database for 33 stations of the California Integrated Seismic Network in and around the Los Angeles Basin, which we computed for the 3D SCEC Community Velocity Model (SCEC CVM3.0) using a fourth-order staggered-grid finite-difference code. For a spatial grid spacing of 200 m and a time resolution of 10 ms, the calculations took ~19,000 node-hours on the Linux cluster at USC's High-Performance Computing Center. The 33-station database with a volume of ~23.5 TB was archived in the SCEC digital library at the San Diego Supercomputer Center using the Storage Resource Broker (SRB). From a laptop, anyone with access to this SRB collection can compute synthetic seismograms for an arbitrary source in the CVM in a matter of minutes. Efficient approaches have been implemented to use this RGT database in the inversions of waveforms for centroid and finite moment tensors and tomographic inversions to improve the CVM. Our experience with these large problems suggests areas where the cyberinfrastructure currently available for geoscience computation needs to be improved.
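    A minimal sketch of the reciprocity idea behind such an RGT database, in Python; the array names, shapes and the restriction to a point-force source are illustrative assumptions, not the SCEC/CME implementation:

```python
import numpy as np

# rgt[j, i, :] holds the i-component of displacement at a candidate source
# location, produced by a unit impulse force in the j-direction applied at the
# station.  By reciprocity, the j-component of motion recorded at the station
# for a point force f applied at that location follows directly from the stored
# fields -- no new wavefield simulation is required.  (Moment-tensor sources
# would instead use spatial derivatives of the RGT, i.e. strain fields.)

def synthetic_from_rgt(rgt, force, stf, dt):
    """rgt: (3, 3, nt) receiver Green tensor at the source point,
    force: (3,) point-force vector, stf: source time function samples."""
    nt = rgt.shape[-1]
    impulse = np.einsum("jit,i->jt", rgt, force)            # (3, nt) impulse response
    # Convolve each station component with the source time function.
    return np.array([np.convolve(impulse[j], stf)[:nt] * dt for j in range(3)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rgt = rng.standard_normal((3, 3, 500))        # stand-in for a stored RGT block
    stf = np.exp(-np.linspace(-3, 3, 50) ** 2)    # simple Gaussian source pulse
    syn = synthetic_from_rgt(rgt, np.array([0.0, 0.0, 1.0]), stf, dt=0.01)
    print(syn.shape)                              # (3, 500) three-component synthetics
```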

  12. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Lampasi, A.; Revellino, P.; Guerriero, L.; Sappa, G.; Guadagno, F. M.

    2015-06-01

    SiSeRHMap is a computerized methodology capable of drawing up prediction maps of seismic response. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (Geographic Information System) Cubic Model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A metamodeling process confers a hybrid nature to the methodology. In this process, the one-dimensional linear equivalent analysis produces acceleration response spectra of shear wave velocity-thickness profiles, defined as trainers, which are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated Evolutionary Algorithm (EA) and the Levenberg-Marquardt Algorithm (LMA) as the final optimizer. In the final step, the GCM Maps Executor module produces a serial map-set of the stratigraphic seismic response at different periods, grid-solving the calibrated Spectra model. In addition, the topographic amplification of the spectra is also computed by means of a numerical prediction model. The latter is built to match the results of numerical simulations related to isolated reliefs, using GIS topographic attributes. In this way, different sets of seismic response maps are developed, from which maps of seismic design response spectra are also defined by means of an enveloping technique.
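    A minimal sketch of the final calibration step, assuming a placeholder spectral shape in place of the actual Spectra model; the Levenberg-Marquardt optimization uses SciPy's least_squares with method="lm":

```python
import numpy as np
from scipy.optimize import least_squares

def spectral_model(T, pga, amp, Tc, k):
    """Toy acceleration response spectrum as a function of period T
    (a placeholder for the adaptive Spectra model)."""
    return pga * (1.0 + (amp - 1.0) * (T / Tc) * np.exp(1.0 - T / Tc)) / (1.0 + (T / Tc) ** k)

def residuals(p, T, sa_trainer):
    return spectral_model(T, *p) - sa_trainer

# Synthetic "trainer" spectrum standing in for a 1-D linear-equivalent result.
T = np.linspace(0.02, 2.0, 100)
rng = np.random.default_rng(1)
sa_trainer = spectral_model(T, 0.3, 2.5, 0.4, 2.0) + 0.01 * rng.standard_normal(T.size)

# Levenberg-Marquardt calibration of the model parameters on the trainer spectrum.
fit = least_squares(residuals, x0=[0.2, 2.0, 0.3, 1.5], args=(T, sa_trainer), method="lm")
print("calibrated parameters:", np.round(fit.x, 3))
```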

  13. Investigating Segmentation in Cascadia: Anisotropic Crustal Structure and Mantle Wedge Serpentinization from Receiver Functions

    NASA Astrophysics Data System (ADS)

    Krueger, Hannah E.; Wirth, Erin A.

    2017-10-01

    The Cascadia subduction zone exhibits along-strike segmentation in structure, processes, and seismogenic behavior. While characterization of seismic anisotropy can constrain deformation processes at depth, the character of seismic anisotropy in Cascadia remains poorly understood. This is primarily due to a lack of seismicity in the subducting Juan de Fuca slab, which limits shear wave splitting and other seismological analyses that interrogate the fine-scale anisotropic structure of the crust and mantle wedge. We investigate lower crustal anisotropy and mantle wedge structure by computing P-to-S receiver functions at 12 broadband seismic stations along the Cascadia subduction zone. We observe P-to-SV converted energy consistent with previously estimated Moho depths. Several stations exhibit evidence of an "inverted Moho" (i.e., a downward velocity decrease across the crust-mantle boundary), indicative of a serpentinized mantle wedge. Stations with an underlying hydrated mantle wedge appear prevalent from northern Washington to central Oregon, but sparse in southern Oregon and northern California. Transverse component receiver functions are complex, suggesting anisotropic and/or dipping crustal structure. To constrain the orientation of crustal anisotropy we compute synthetic receiver functions using manual forward modeling. We determine that the lower crust shows variable orientations of anisotropy along-strike, with highly complex anisotropy in northern Cascadia, and generally NW-SE and NE-SW orientations of slow-axis anisotropy in central and southern Cascadia, respectively. The orientations of anisotropy from this work generally agree with those inferred from shear wave splitting of tremor studies at similar locations, lending confidence to this relatively new method of inferring seismic anisotropy from slow earthquakes.
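    For illustration, a generic water-level frequency-domain deconvolution of the kind commonly used to compute P-to-S receiver functions (a textbook formulation, not necessarily the exact processing of this study):

```python
import numpy as np

def receiver_function(radial, vertical, dt, water_level=0.01, gauss_width=2.5):
    """Deconvolve the radial component by the vertical component."""
    n = len(radial)
    R = np.fft.rfft(radial, n)
    Z = np.fft.rfft(vertical, n)
    freqs = np.fft.rfftfreq(n, dt)
    denom = (Z * np.conj(Z)).real
    # The water level stabilizes the spectral division where |Z| is small.
    denom = np.maximum(denom, water_level * denom.max())
    gauss = np.exp(-(2 * np.pi * freqs) ** 2 / (4 * gauss_width ** 2))  # low-pass filter
    return np.fft.irfft(R * np.conj(Z) / denom * gauss, n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.standard_normal(1024)                                   # stand-in vertical trace
    r = 0.4 * np.roll(z, 30) + 0.05 * rng.standard_normal(1024)     # delayed Ps-like arrival
    rf = receiver_function(r, z, dt=0.05)
    print("peak of receiver function at sample:", int(np.argmax(rf)))   # ~ 30
```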

  14. Determination and uncertainty of moment tensors for microearthquakes at Okmok Volcano, Alaska

    USGS Publications Warehouse

    Pesicek, J.D.; Sileny, J.; Prejean, S.G.; Thurber, C.H.

    2012-01-01

    Efforts to determine general moment tensors (MTs) for microearthquakes in volcanic areas are often hampered by small seismic networks, which can lead to poorly constrained hypocentres and inadequate modelling of seismic velocity heterogeneity. In addition, noisy seismic signals can make it difficult to identify phase arrivals correctly for small magnitude events. However, small volcanic earthquakes can have source mechanisms that deviate from brittle double-couple shear failure due to magmatic and/or hydrothermal processes. Thus, determining reliable MTs in such conditions is a challenging but potentially rewarding pursuit. We pursued such a goal at Okmok Volcano, Alaska, which erupted recently in 1997 and in 2008. The Alaska Volcano Observatory operates a seismic network of 12 stations at Okmok and routinely catalogues recorded seismicity. Using these data, we have determined general MTs for seven microearthquakes recorded between 2004 and 2007 by inverting peak amplitude measurements of P and S phases. We computed Green's functions using precisely relocated hypocentres and a 3-D velocity model. We thoroughly assessed the quality of the solutions by computing formal uncertainty estimates, conducting a variety of synthetic and sensitivity tests, and by comparing the MTs to solutions obtained using alternative methods. The results show that MTs are sensitive to station distribution and errors in the data, velocity model and hypocentral parameters. Although each of the seven MTs contains a significant non-shear component, we judge several of the solutions to be unreliable. However, several reliable MTs are obtained for a group of previously identified repeating events, and are interpreted as compensated linear-vector dipole events.
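    A schematic of the linear amplitude inversion d = Gm for the six moment-tensor components, with a random placeholder for the excitation matrix G (in the study, G is built from relocated hypocentres and a 3-D velocity model):

```python
import numpy as np

rng = np.random.default_rng(42)
n_obs = 24                                               # e.g. P and S peak amplitudes at 12 stations
m_true = np.array([1.0, -0.6, -0.4, 0.3, 0.1, -0.2])     # Mxx, Myy, Mzz, Mxy, Mxz, Myz
G = rng.standard_normal((n_obs, 6))                      # placeholder Green's-function kernels
d = G @ m_true + 0.05 * rng.standard_normal(n_obs)       # noisy "observed" amplitudes

# Least-squares moment-tensor estimate.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)

# Formal covariance of the solution (scaled by the assumed data standard deviation)
# gives a first-order uncertainty estimate of the kind discussed above.
cov = np.linalg.inv(G.T @ G)
print("estimated MT components:", np.round(m_est, 3))
print("1-sigma uncertainties:  ", np.round(0.05 * np.sqrt(np.diag(cov)), 3))
```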

  15. Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum

    NASA Astrophysics Data System (ADS)

    Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.

    2017-09-01

    Passive imaging techniques from ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition. It is generated in preferential areas (in the deep ocean or near continental shores), and some highly coherent pulse-like signals may be present in the data, such as those generated by earthquakes. Several pre-processing techniques have been developed in order to attenuate the directional and deterministic behaviour of this real ambient noise. Most of them are applied to individual seismograms before the cross-correlation computation. The most widely used techniques are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing to be used together with the classical ones, which is based on the spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, leading to the data covariance matrix. We apply a one-bit normalization to the covariance matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to the data collected by the USArray, when the M8.8 Maule earthquake occurred on 2010 February 27. The method shows a clear improvement over the classical equalization in attenuating the highly energetic and coherent waves incoming from the earthquake, and allows reliable traveltime measurements to be performed even in the presence of the earthquake.
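    A minimal sketch of the covariance-matrix eigenspectrum equalization at a single frequency, under the assumption that fewer windows than stations are averaged so that the nonzero eigenvalues can be set to one; the full method loops over frequencies and transforms the result back to time-domain correlations:

```python
import numpy as np

def equalized_covariance(windows, tol=1e-10):
    """windows: complex (n_windows, n_stations) Fourier coefficients at one
    frequency, with n_windows < n_stations so the covariance is rank-deficient."""
    C = np.einsum("wi,wj->ij", windows, np.conj(windows)) / windows.shape[0]
    w, v = np.linalg.eigh(C)                    # Hermitian eigen-decomposition
    w_eq = (w > tol * w.max()).astype(float)    # "one-bit" eigenvalues: 1 where nonzero
    return (v * w_eq) @ v.conj().T              # equalized cross-spectral matrix

rng = np.random.default_rng(3)
n_win, n_sta = 5, 8
# Diffuse noise plus one strong coherent arrival (e.g. earthquake surface waves).
noise = rng.standard_normal((n_win, n_sta)) + 1j * rng.standard_normal((n_win, n_sta))
coherent = 10.0 * np.exp(1j * rng.uniform(0, 2 * np.pi, n_sta))
C_eq = equalized_covariance(noise + coherent)

# Off-diagonal terms are the equalized cross-spectra between station pairs; the
# energetic coherent arrival no longer dominates a single eigenvalue.
print(np.round(np.abs(C_eq[0, 1:4]), 3))
```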

  16. Miscellaneous Topics in Computer-Aided Drug Design: Synthetic Accessibility and GPU Computing, and Other Topics.

    PubMed

    Fukunishi, Yoshifumi; Mashimo, Tadaaki; Misoo, Kiyotaka; Wakabayashi, Yoshinori; Miyaki, Toshiaki; Ohta, Seiji; Nakamura, Mayu; Ikeda, Kazuyoshi

    2016-01-01

    Computer-aided drug design is still a state-of-the-art process in medicinal chemistry, and the main topics in this field have been extensively studied and well reviewed. These topics include compound databases, ligand-binding pocket prediction, protein-compound docking, virtual screening, target/off-target prediction, physical property prediction, molecular simulation and pharmacokinetics/pharmacodynamics (PK/PD) prediction. Message and Conclusion: However, there are also a number of secondary or miscellaneous topics that have been less well covered. For example, methods for synthesizing and predicting the synthetic accessibility (SA) of designed compounds are important in practical drug development, and hardware/software resources for performing the computations in computer-aided drug design are crucial. Cloud computing and general purpose graphics processing unit (GPGPU) computing have been used in virtual screening and molecular dynamics simulations. Not surprisingly, there is a growing demand for computer systems which combine these resources. In the present review, we summarize and discuss these various topics of drug design.

  17. Miscellaneous Topics in Computer-Aided Drug Design: Synthetic Accessibility and GPU Computing, and Other Topics

    PubMed Central

    Fukunishi, Yoshifumi; Mashimo, Tadaaki; Misoo, Kiyotaka; Wakabayashi, Yoshinori; Miyaki, Toshiaki; Ohta, Seiji; Nakamura, Mayu; Ikeda, Kazuyoshi

    2016-01-01

    Abstract: Background Computer-aided drug design is still a state-of-the-art process in medicinal chemistry, and the main topics in this field have been extensively studied and well reviewed. These topics include compound databases, ligand-binding pocket prediction, protein-compound docking, virtual screening, target/off-target prediction, physical property prediction, molecular simulation and pharmacokinetics/pharmacodynamics (PK/PD) prediction. Message and Conclusion: However, there are also a number of secondary or miscellaneous topics that have been less well covered. For example, methods for synthesizing and predicting the synthetic accessibility (SA) of designed compounds are important in practical drug development, and hardware/software resources for performing the computations in computer-aided drug design are crucial. Cloud computing and general purpose graphics processing unit (GPGPU) computing have been used in virtual screening and molecular dynamics simulations. Not surprisingly, there is a growing demand for computer systems which combine these resources. In the present review, we summarize and discuss these various topics of drug design. PMID:27075578

  18. An Investigation of Seismicity for the West Sumatra Region Indonesia

    NASA Astrophysics Data System (ADS)

    Syafriani, S.

    2018-04-01

    The purpose of this research was to investigate the seismicity of the West Sumatra region in the coordinate area of 94° E – 104° E and 2° N - 4° S. The Gutenberg-Richter magnitude-frequency relation and seismic risk have been computed. Historical earthquake data from 1970 to 2017 with magnitudes higher than 4 were used. The study area was divided into 8 sub-regions based on seismotectonic characteristics, plate tectonics and geological models. The determination of seismotectonic characteristics was based on the level of seismic activity in a region (a value) and the rock stress condition (b value). A high a value is associated with high seismic activity, whereas a high b value is associated with low-stress rock conditions, and vice versa. Based on the calculation results, a and b values were obtained in the intervals of 5.5-11.3 and 0.7-2, respectively. The highest b value was obtained in sub-region 5 (the Nias islands), while the lowest b value was obtained in sub-region 7 (the Mentawai islands). Sub-region 7, the Mentawai Islands, was therefore indicated as a potential seismic risk area.
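    For illustration, a Gutenberg-Richter a and b value estimate using the standard Aki maximum-likelihood formula on synthetic magnitudes (not the West Sumatra catalogue):

```python
import numpy as np

def gutenberg_richter(mags, m_min):
    """Aki (1965) maximum-likelihood b value and corresponding a value for
    log10 N(>=M) = a - b*M, for a catalogue complete above m_min.
    (Binned catalogues would subtract half the bin width from m_min.)"""
    m = np.asarray(mags)
    m = m[m >= m_min]
    b = np.log10(np.e) / (m.mean() - m_min)
    a = np.log10(m.size) + b * m_min          # so that N(>= m_min) = 10**(a - b*m_min)
    return a, b

rng = np.random.default_rng(7)
# Synthetic magnitudes with a true b value of about 1.0, complete above M4.
mags = 4.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=2000)
a, b = gutenberg_richter(mags, m_min=4.0)
print(f"a = {a:.2f}, b = {b:.2f}")
```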

  19. 1D Seismic reflection technique to increase depth information in surface seismic investigations

    NASA Astrophysics Data System (ADS)

    Camilletti, Stefano; Fiera, Francesco; Umberto Pacini, Lando; Perini, Massimiliano; Prosperi, Andrea

    2017-04-01

    1D seismic methods, such as MASW, Re.Mi. and HVSR, have been extensively used in engineering investigations, bedrock research, Vs profiling and, to some extent, hydrologic applications during the past 20 years. Recent advances in equipment, sound sources and computer interpretation techniques make 1D seismic methods highly effective in shallow subsoil modeling. Classical 1D seismic surveys allow economical collection of subsurface data; however, they fail to return accurate information for depths greater than 50 meters. Using a particular acquisition technique, it is possible to collect data that can be quickly processed with the reflection technique in order to obtain more accurate velocity information at depth. Furthermore, data processing returns a narrow stratigraphic section, alongside the 1D velocity model, where lithological boundaries are represented. This work will show how to collect a single CMP to determine: (1) the depth of bedrock; (2) gravel layers in clayey domains; (3) an accurate Vs profile. Seismic traces were processed by means of new software developed in collaboration with the SARA electronics instruments S.r.l. company, Perugia, Italy. This software has the great advantage that it can be used directly in the field, reducing the time elapsing between acquisition and processing.

  20. Numerical Modeling of 3D Seismic Wave Propagation around Yogyakarta, the Southern Part of Central Java, Indonesia, Using Spectral-Element Method on MPI-GPU Cluster

    NASA Astrophysics Data System (ADS)

    Sudarmaji; Rudianto, Indra; Eka Nurcahya, Budi

    2018-04-01

    A strong tectonic earthquake with a magnitude of 5.9 on the Richter scale occurred in Yogyakarta and Central Java on May 26, 2006. The earthquake caused severe damage in Yogyakarta and the southern part of Central Java, Indonesia. Understanding the seismic response of the earthquake in terms of ground shaking and the level of building damage is important. We present numerical modeling of 3D seismic wave propagation around Yogyakarta and the southern part of Central Java using the spectral-element method on an MPI-GPU (Graphics Processing Unit) computer cluster to observe its seismic response due to the earthquake. A homogeneous, realistic 3D model is generated with a detailed topographic surface. The influences of the free-surface topography and the layer discontinuity of the 3D model on the seismic response are observed. The seismic wave field is discretized using the spectral-element method. The spectral-element method is solved on a mesh of hexahedral elements that is adapted to the free-surface topography and the internal discontinuity of the model. To increase the data processing capabilities, the simulation is performed on a GPU cluster with an implementation of MPI (Message Passing Interface).

  1. Seismic passive earth resistance using modified pseudo-dynamic method

    NASA Astrophysics Data System (ADS)

    Pain, Anindya; Choudhury, Deepankar; Bhattacharyya, S. K.

    2017-04-01

    In earthquake prone areas, understanding of the seismic passive earth resistance is very important for the design of different geotechnical earth retaining structures. In this study, the limit equilibrium method is used for estimation of critical seismic passive earth resistance for an inclined wall supporting horizontal cohesionless backfill. A composite failure surface is considered in the present analysis. Seismic forces are computed assuming the backfill soil as a viscoelastic material overlying a rigid stratum and the rigid stratum is subjected to a harmonic shaking. The present method satisfies the boundary conditions. The amplification of acceleration depends on the properties of the backfill soil and on the characteristics of the input motion. The acceleration distribution along the depth of the backfill is found to be nonlinear in nature. The present study shows that the horizontal and vertical acceleration distribution in the backfill soil is not always in-phase for the critical value of the seismic passive earth pressure coefficient. The effect of different parameters on the seismic passive earth pressure is studied in detail. A comparison of the present method with other theories is also presented, which shows the merits of the present study.

  2. Ischia Island: Historical Seismicity and Dynamics

    NASA Astrophysics Data System (ADS)

    Carlino, S.; Cubellis, E.; Iannuzzi, R.; Luongo, G.; Obrizzo, F.

    2003-04-01

    The seismic energy release in volcanic areas is a complex process, and the island of Ischia provides a significant scenario of historical seismicity. This is characterized by the occurrence of earthquakes with low energy and high intensity. Information on the seismicity of the island spans about eight centuries, starting from 1228. With regard to effects, the most recent earthquake of 1883 is extensively documented both in the literature and in unpublished sources. The earthquake caused 2333 deaths and the destruction of the historical and environmental heritage of some areas of the island. The most severe damage occurred in Casamicciola. This event, which was the first great catastrophe after the unification of Italy in the 1860s (Imax = XI degree MCS), represents an important date in the prevention of natural disasters, in that it was after this earthquake that the first Seismic Safety Act in Italy was passed, by which lower risk zones were identified for new settlements. Thanks to such detailed analysis, reliable modelling of the seismic source was also obtained. The historical data make it possible to identify the epicentral area of all known earthquakes as the northern slope of Monte Epomeo, while analysis of the effects of earthquakes and the geological structures allows us to evaluate the stress fields that generate the earthquakes. In a volcanic area, interpretation of the mechanisms of release and propagation of seismic energy is made even more complex because the stress field that acts at a regional level is compounded by that generated from the migration of magmatic masses towards the surface, as well as by the rheological properties of the rocks, which depend on the high geothermal gradient. Such structural and dynamic conditions make the island of Ischia a seismic area of considerable interest. It appears necessary to evaluate the expected damage caused by a new event linked to the renewal of the island's dynamics, where high population density and the high economic value at stake (the island is a tourist destination and holiday resort) increase the seismic risk. A seismic hazard map of the island is proposed according to a comparative analysis of various types of data: the geology, tectonics, historical seismicity and damage caused by the 28 July 1883 Casamicciola earthquake. The analysis was essentially based on a GIS-aided cross-correlation of these data. The GIS is thus able to provide support both for in-depth analysis of the dynamic processes on the island and for extending the assessment to other natural risks (volcanic, landslides, flooding, etc.).

  3. Forecasting induced seismicity rate and Mmax using calibrated numerical models

    NASA Astrophysics Data System (ADS)

    Dempsey, D.; Suckale, J.

    2016-12-01

    At Groningen, The Netherlands, several decades of induced seismicity from gas extraction have culminated in a M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes in the seismicity rate in response to some proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential of individual earthquakes face the difficulty of characterizing specific heterogeneity - stress, strength, roughness, etc. - at locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, location and MFD are extracted. A probability distribution for Mmax - the largest event in some specified time window - is also computed. Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a model at a specific site (or, ideally, many models). This restricts analysis of future seismicity to likely parameter sets and provides physical justification for linking operational changes to subsequent seismicity. To illustrate these concepts, a recent study of prior and forecast seismicity at Groningen will be presented.
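    A toy illustration of the final step, turning an ensemble of simulated catalogues into an exceedance probability for Mmax; the catalogues here are random Gutenberg-Richter samples, not output of the calibrated Groningen model:

```python
import numpy as np

rng = np.random.default_rng(11)
n_catalogues, rate, b_value = 1000, 40, 1.0   # events per window and b value are assumptions

mmax = np.empty(n_catalogues)
for k in range(n_catalogues):
    n_events = max(rng.poisson(rate), 1)
    mags = 1.5 + rng.exponential(np.log10(np.e) / b_value, size=n_events)
    mmax[k] = mags.max()                      # largest event in this synthetic catalogue

# Empirical exceedance probability P(Mmax >= M) over the ensemble.
for M in (3.0, 3.5, 4.0):
    print(f"P(Mmax >= {M}) = {np.mean(mmax >= M):.3f}")
```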

  4. Seismic signal processing on heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Centre (CSCS), which we used in this research. Heterogeneous supercomputers offer the potential for manifold increases in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype for such a library, suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that require dedicated HPC solutions. The chosen application uses a wide range of common signal processing methods, which include various IIR filter designs, amplitude and phase correlation, computing the analytic signal, and discrete Fourier transforms. Furthermore, various processing methods specific to seismology, like rotation of seismic traces, are used. Efficient implementation of all these methods on GPU-accelerated systems presents several challenges. In particular, it requires a careful distribution of work between the sequential processors and accelerators. Furthermore, since the application is designed to process very large volumes of data, special attention had to be paid to the efficient use of the available memory and networking hardware resources in order to reduce the intensity of data input and output. In our contribution we will explain the software architecture as well as the principal engineering decisions used to address these challenges. We will also describe the programming model based on C++ and CUDA that we used to develop the software. Finally, we will demonstrate the performance improvements achieved by using the heterogeneous computing architecture. This work was supported by a grant from the Swiss National Supercomputing Centre (CSCS) under project ID d26.
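    As a CPU reference for one of the kernels discussed above, a sketch of spectral whitening followed by frequency-domain cross-correlation of two traces; the GPU offloading and the library design itself are not reproduced here:

```python
import numpy as np

def whiten(trace, eps=1e-10):
    """Flatten the amplitude spectrum while keeping the phase."""
    spec = np.fft.rfft(trace)
    return np.fft.irfft(spec / (np.abs(spec) + eps), len(trace))

def cross_correlate(a, b):
    """Frequency-domain cross-correlation; returns the correlation and the
    index of zero lag after fftshift."""
    nfft = 1 << (len(a) + len(b) - 2).bit_length()
    A = np.fft.rfft(a, nfft)
    B = np.fft.rfft(b, nfft)
    cc = np.fft.fftshift(np.fft.irfft(A * np.conj(B), nfft))
    return cc, nfft // 2

rng = np.random.default_rng(5)
common = rng.standard_normal(4096)
tr1 = common + 0.5 * rng.standard_normal(4096)
tr2 = np.roll(common, 25) + 0.5 * rng.standard_normal(4096)     # 25-sample delay

cc, zero_lag = cross_correlate(whiten(tr2), whiten(tr1))
print("estimated delay (samples):", np.argmax(cc) - zero_lag)   # ~ +25
```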

  5. The Shock and Vibration Digest. Volume 18, Number 6

    DTIC Science & Technology

    1986-06-01

    linear, quadratic, or cubic. Bessel function solutions were obtained for a truncated pyramid... Reed [124] reported a method for computing amplitudes of a... 86-1198 A. Ragab, Chung C. Fu, Seismic Analysis of a Large LMFBR with Fluid-Structure Interactions, Cairo Univ., Giza, Egypt. Computers Struc

  6. Modeling Regional Seismic Waves

    DTIC Science & Technology

    1992-06-29

    the computation of the Green's functions is rather time consuming. They are computed for each of the fundamental faults, at I1(H) km intervals from 21... this record was very small. Station GEO displays similar behavior in that the overall features of the waveform are matched, but the fit in detail is not

  7. Next Generation Seismic Imaging; High Fidelity Algorithms and High-End Computing

    NASA Astrophysics Data System (ADS)

    Bevc, D.; Ortigosa, F.; Guitton, A.; Kaelin, B.

    2007-05-01

    The rich oil reserves of the Gulf of Mexico are buried in deep and ultra-deep waters up to 30,000 feet from the surface. Minerals Management Service (MMS), the federal agency in the U.S. Department of the Interior that manages the nation's oil, natural gas and other mineral resources on the outer continental shelf in federal offshore waters, estimates that the Gulf of Mexico holds 37 billion barrels of "undiscovered, conventionally recoverable" oil, which, at $50/barrel, would be worth approximately $1.85 trillion. These reserves are very difficult to find and reach due to the extreme depths. Technological advances in seismic imaging represent an opportunity to overcome this obstacle by providing more accurate models of the subsurface. Among these technological advances, Reverse Time Migration (RTM) yields the best possible images. RTM is based on the solution of the two-way acoustic wave equation. This technique relies on the velocity model to image turning waves. These turning waves are particularly important to unravel subsalt reservoirs and delineate salt flanks, a natural trap for oil and gas. Because it relies on an accurate velocity model, RTM opens a new frontier in designing better velocity estimation algorithms. RTM has been widely recognized as the next chapter in seismic exploration, as it can overcome the limitations of current migration methods in imaging complex geologic structures that exist in the Gulf of Mexico. The chief impediment to the large-scale, routine deployment of RTM has been a lack of sufficient computer power. RTM needs thirty times the computing power used in exploration today to be commercially viable and widely usable. Therefore, advancing seismic imaging to the next level of precision poses a multi-disciplinary challenge. To overcome these challenges, the Kaleidoscope project, a partnership between Repsol YPF, Barcelona Supercomputing Center, 3DGeo Inc., and IBM, brings together the necessary components of modeling, algorithms and the uniquely powerful computing power of the MareNostrum supercomputer in Barcelona to realize the promise of RTM, incorporate it into daily processing flows, and to help solve exploration problems in a highly cost-effective way. Uniquely, the Kaleidoscope Project is simultaneously integrating software (algorithms) and hardware (Cell BE), steps that are traditionally taken sequentially. This unique integration of software and hardware will accelerate seismic imaging by several orders of magnitude compared to conventional solutions running on standard Linux clusters.
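    To make the computational core concrete, a minimal 2-D constant-density acoustic finite-difference propagator of the kind underlying RTM (forward modeling only; toy grid, toy velocity model, and periodic boundaries for brevity):

```python
import numpy as np

# Second-order-in-time, second-order-in-space solution of the two-way acoustic
# wave equation on a small grid.  RTM would additionally back-propagate the
# recorded data and apply an imaging condition; that part is omitted here.
nx, nz, dx, dt, nt = 200, 200, 10.0, 0.001, 800
v = np.full((nz, nx), 2500.0)
v[120:, :] = 3500.0                                # a single faster layer

src_z, src_x, f0 = 100, 100, 15.0                  # source position and peak frequency
t = np.arange(nt) * dt
ricker = (1 - 2 * (np.pi * f0 * (t - 0.1)) ** 2) * np.exp(-(np.pi * f0 * (t - 0.1)) ** 2)

p_old = np.zeros((nz, nx))
p = np.zeros((nz, nx))
for it in range(nt):
    # Five-point Laplacian; np.roll makes the boundaries periodic (real codes
    # use absorbing boundary conditions instead).
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / dx ** 2
    p_new = 2 * p - p_old + (v * dt) ** 2 * lap    # time update
    p_new[src_z, src_x] += ricker[it] * dt ** 2    # inject the source wavelet
    p_old, p = p, p_new

print("peak pressure amplitude at final step:", np.abs(p).max())
```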

  8. Shaded-Color Picture Generation of Computer-Defined Arbitrary Shapes

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.; Hermstad, D. L.; Mccoy, D. S.; Clark, J.

    1986-01-01

    SHADE computer program generates realistic color-shaded pictures from computer-defined arbitrary shapes. Objects defined for computer representation are displayed as smooth, color-shaded surfaces, including varying degrees of transparency. Results also used for presentation of computational results. By performing color mapping, SHADE colors the model surface to display analysis results such as pressures, stresses, and temperatures. NASA has used SHADE extensively in design and analysis of high-performance aircraft. Industry should find applications for SHADE in computer-aided design and computer-aided manufacturing. SHADE written in VAX FORTRAN and MACRO Assembler for either interactive or batch execution.

  9. Implementation of Information Technology in the Free Trade Era for Indonesia

    DTIC Science & Technology

    1998-06-01

    computer usage, had been organized before Thailand, Malaysia, and China. Also, use of computers for crude oil process applications, and marketing and...seismic computing in Pertamina had been installed and in operation ahead of Taiwan, Malaysia, and Brunei. There are many examples of computer usage at...such as: Malaysia, Thailand, USA, China, Germany, and many others. Although IT development is utilized in Indonesia’s development program, it should

  10. Steep-dip seismic imaging of the shallow San Andreas Fault near Parkfield

    USGS Publications Warehouse

    Hole, J.A.; Catchings, R.D.; St. Clair, K.C.; Rymer, M.J.; Okaya, D.A.; Carney, B.J.

    2001-01-01

    Seismic reflection and refraction images illuminate the San Andreas Fault to a depth of 1 kilometer. The prestack depth-migrated reflection image contains near-vertical reflections aligned with the active fault trace. The fault is vertical in the upper 0.5 kilometer, then dips about 70° to the southwest to at least 1 kilometer subsurface. This dip reconciles the difference between the computed locations of earthquakes and the surface fault trace. The seismic velocity cross section shows strong lateral variations. Relatively low velocity (10 to 30%), high electrical conductivity, and low density indicate a 1-kilometer-wide vertical wedge of porous sediment or fractured rock immediately southwest of the active fault trace.

  11. Influence of seismic anisotropy on the cross correlation tensor: numerical investigations

    NASA Astrophysics Data System (ADS)

    Saade, M.; Montagner, J. P.; Roux, P.; Cupillard, P.; Durand, S.; Brenguier, F.

    2015-05-01

    Temporal changes in seismic anisotropy can be interpreted as variations in the orientation of cracks in seismogenic zones, and thus as variations in the stress field. Such temporal changes have been observed in seismogenic zones before and after earthquakes, although they are still not well understood. In this study, we investigate the azimuthal polarization of surface waves in anisotropic media with respect to the orientation of anisotropy, from a numerical point of view. This technique is based on the observation of the signature of anisotropy on the nine-component cross-correlation tensor (CCT) computed from seismic ambient noise recorded on pairs of three-component sensors. If noise sources are spatially distributed in a homogeneous medium, the CCT allows the reconstruction of the surface wave Green's tensor between the station pairs. In a homogeneous, isotropic medium, four off-diagonal terms of the surface wave Green's tensor are null, but this is not the case in an anisotropic medium. This technique is applied to three-component synthetic seismograms computed in a transversely isotropic medium with a horizontal symmetry axis, using a spectral element code. The CCT is computed between each pair of stations and then rotated to approximate the surface wave Green's tensor by minimizing the off-diagonal components. This procedure allows the calculation of the azimuthal variation of quasi-Rayleigh and quasi-Love waves. In an anisotropic medium, in some cases, the azimuth of seismic anisotropy can induce a large variation in the horizontal polarization of surface waves. This variation depends on the relative angle between a pair of stations and the direction of anisotropy, the amplitude of the anisotropy, the frequency band of the signal and the depth of the anisotropic layer.
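    A schematic of the rotation step, reduced to the 2 x 2 horizontal block of the CCT with a single scalar per component standing in for the full lag-time traces; the angle minimizing the off-diagonal energy is found by grid search:

```python
import numpy as np

def best_rotation(C_hh, n_angles=360):
    """C_hh: 2x2 array of (N, E) x (N, E) cross-correlation energies.
    Returns the rotation angle that minimizes the off-diagonal energy."""
    best = None
    for th in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        R = np.array([[np.cos(th), np.sin(th)],
                      [-np.sin(th), np.cos(th)]])
        C_rot = R @ C_hh @ R.T                 # same rotation applied at both stations
        off = C_rot[0, 1] ** 2 + C_rot[1, 0] ** 2
        if best is None or off < best[0]:
            best = (off, th, C_rot)
    return best[1], best[2]

# Synthetic example: a radial/transverse-diagonal tensor observed in a frame
# rotated by 35 degrees.
theta_true = np.deg2rad(35.0)
R = np.array([[np.cos(theta_true), np.sin(theta_true)],
              [-np.sin(theta_true), np.cos(theta_true)]])
C_true = np.diag([2.0, 0.7])                   # Rayleigh (RR) and Love (TT) terms
C_obs = R.T @ C_true @ R                       # same tensor expressed in the N/E frame

theta_est, C_min = best_rotation(C_obs)
print("recovered angle (deg):", round(np.rad2deg(theta_est), 1))   # ~ 35.0
```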

  12. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    DOT National Transportation Integrated Search

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  13. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    DOT National Transportation Integrated Search

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  14. Denver RTD's computer aided dispatch/automatic vehicle location system : the human factors consequences

    DOT National Transportation Integrated Search

    1999-09-01

    This report documents what happened to employees' work procedures when their employer installed Computer Aided Dispatch/Automatic Vehicle Locator (CAD/AVL) technology to provide real-time surveillance of vehicles and to upgrade ra...

  15. The University of Michigan's Computer-Aided Engineering Network.

    ERIC Educational Resources Information Center

    Atkins, D. E.; Olsen, Leslie A.

    1986-01-01

    Presents an overview of the Computer-Aided Engineering Network (CAEN) of the University of Michigan. Describes its arrangement of workstations, communication networks, and servers. Outlines the factors considered in hardware and software decision making. Reviews the program's impact on students. (ML)

  16. Computer-aided engineering of semiconductor integrated circuits

    NASA Astrophysics Data System (ADS)

    Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.

    1980-07-01

    Economical procurement of small quantities of high performance custom integrated circuits for military systems is impeded by inadequate process, device and circuit models that handicap low cost computer aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, (4) device simulation and analytic measurements. This report discusses the fourth year of the program.

  17. CAD system for footwear design based on whole real 3D data of last surface

    NASA Astrophysics Data System (ADS)

    Song, Wanzhong; Su, Xianyu

    2000-10-01

    Two major parts of the application of CAD in footwear design are studied: the development of the last surface and the computer-aided design of the planar shoe-template. A new quasi-experiential development algorithm for the last surface, based on triangulation approximation, is presented. This algorithm consumes less time and does not need any interactive operation for precise development compared with other development algorithms for the last surface. Based on this algorithm, a software package, SHOEMAKERTM, which contains computer-aided automatic measurement, automatic development of the last surface and computer-aided design of the shoe-template, has been developed.

  18. Training Aids for Online Instruction: An Analysis.

    ERIC Educational Resources Information Center

    Guy, Robin Frederick

    This paper describes a number of different types of training aids currently employed in online training: non-interactive audiovisual presentations; interactive computer-based aids; partially interactive aids based on recorded searches; print-based materials; and kits. The advantages and disadvantages of each type of aid are noted, and a table…

  19. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismically resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
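    A sketch of the Green's-function summation described above, with a random placeholder standing in for the pre-computed unit-slip subfault waveform database:

```python
import numpy as np

rng = np.random.default_rng(2024)
n_subfaults, n_coastal_points, nt = 40, 100, 2048

# unit_waveforms[i, j, :]: sea-surface time series at coastal point j for 1 m of
# slip on subfault i (in practice read from a pre-computed database).
unit_waveforms = rng.standard_normal((n_subfaults, n_coastal_points, nt)) * 0.01

def scenario_waveforms(slip):
    """slip: (n_subfaults,) slip distribution in metres for one scenario."""
    return np.einsum("i,ijt->jt", slip, unit_waveforms)

slip = rng.uniform(0.0, 5.0, n_subfaults)          # one of thousands of scenarios
eta = scenario_waveforms(slip)
peak_amplitude = np.abs(eta).max(axis=1)           # peak tsunami height per coastal point
print("max peak amplitude along the coast:", round(float(peak_amplitude.max()), 3))
```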

  20. Interactive Visualization of Complex Seismic Data and Models Using Bokeh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, Chengping; Ammon, Charles J.; Maceira, Monica

    Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach places minimal requirements on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide us an effective and efficient approach to explore large data sets and models.

  1. Interactive Visualization of Complex Seismic Data and Models Using Bokeh

    DOE PAGES

    Chai, Chengping; Ammon, Charles J.; Maceira, Monica; ...

    2018-02-14

    Visualizing multidimensional data and models becomes more challenging as the volume and resolution of seismic data and models increase. But thanks to the development of powerful and accessible computer systems, a modern web browser can be used to visualize complex scientific data and models dynamically. In this paper, we present four examples of seismic model visualization using the open-source Python package Bokeh. One example is a visualization of a surface-wave dispersion data set, another presents a view of three-component seismograms, and two illustrate methods to explore a 3D seismic-velocity model. Unlike other 3D visualization packages, our visualization approach places minimal requirements on users and is relatively easy to develop, provided you have reasonable programming skills. Finally, utilizing familiar web browsing interfaces, the dynamic tools provide us an effective and efficient approach to explore large data sets and models.
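    A small illustration of the kind of Bokeh-based plot described in the two records above (not the authors' code; the plotting calls follow Bokeh's documented basic interface):

```python
import numpy as np
from bokeh.plotting import figure, output_file, show

# A toy decaying sinusoid stands in for a seismogram.
t = np.linspace(0, 60, 3000)                              # time in seconds
trace = np.sin(2 * np.pi * 0.2 * t) * np.exp(-0.05 * t)

output_file("seismogram.html")                            # rendered as an HTML page
p = figure(title="Synthetic seismogram", x_axis_label="Time (s)",
           y_axis_label="Amplitude", width=800, height=300)
p.line(t, trace, line_width=1)
show(p)                                                   # opens in the default web browser
```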

  2. Surface uplift and time-dependent seismic hazard due to fluid injection in eastern Texas.

    PubMed

    Shirzaei, Manoochehr; Ellsworth, William L; Tiampo, Kristy F; González, Pablo J; Manga, Michael

    2016-09-23

    Observations that unequivocally link seismicity and wastewater injection are scarce. Here we show that wastewater injection in eastern Texas causes uplift, detectable in radar interferometric data up to >8 kilometers from the wells. Using measurements of uplift, reported injection data, and a poroelastic model, we computed the crustal strain and pore pressure. We infer that an increase of >1 megapascal in pore pressure in rocks with low compressibility triggers earthquakes, including the 4.8-moment magnitude event that occurred on 17 May 2012, the largest earthquake recorded in eastern Texas. Seismic activity increased even while injection rates declined, owing to diffusion of pore pressure from earlier periods with higher injection rates. Induced seismicity potential is suppressed where tight confining formations prevent pore pressure from propagating into crystalline basement rocks. Copyright © 2016, American Association for the Advancement of Science.

  3. Time-frequency domain SNR estimation and its application in seismic data processing

    NASA Astrophysics Data System (ADS)

    Zhao, Yan; Liu, Yang; Li, Xuxuan; Jiang, Nansen

    2014-08-01

    Based on an approach estimating frequency domain signal-to-noise ratio (FSNR), we propose a method to evaluate time-frequency domain signal-to-noise ratio (TFSNR). This method adopts short-time Fourier transform (STFT) to estimate instantaneous power spectrum of signal and noise, and thus uses their ratio to compute TFSNR. Unlike FSNR describing the variation of SNR with frequency only, TFSNR depicts the variation of SNR with time and frequency, and thus better handles non-stationary seismic data. By considering TFSNR, we develop methods to improve the effects of inverse Q filtering and high frequency noise attenuation in seismic data processing. Inverse Q filtering considering TFSNR can better solve the problem of amplitude amplification of noise. The high frequency noise attenuation method considering TFSNR, different from other de-noising methods, distinguishes and suppresses noise using an explicit criterion. Examples of synthetic and real seismic data illustrate the correctness and effectiveness of the proposed methods.
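    For illustration, a generic time-frequency SNR map built from the STFT of a noisy trace, with the noise power estimated from a window assumed to be signal-free (a sketch in the spirit of the method, not the paper's exact estimator):

```python
import numpy as np
from scipy.signal import stft

fs = 500.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(9)
noise = 0.3 * rng.standard_normal(t.size)
signal = np.exp(-((t - 2.0) / 0.2) ** 2) * np.sin(2 * np.pi * 30 * t)   # burst at 2 s, 30 Hz
data = noise + signal

# Instantaneous power spectrum via the short-time Fourier transform.
f, tau, Z = stft(data, fs=fs, nperseg=128)
power = np.abs(Z) ** 2

# Noise power per frequency estimated from the first second (assumed signal-free).
noise_power = power[:, tau < 1.0].mean(axis=1, keepdims=True)
tfsnr_db = 10.0 * np.log10(power / (noise_power + 1e-12))

imax = np.unravel_index(np.argmax(tfsnr_db), tfsnr_db.shape)
print(f"max TFSNR = {tfsnr_db[imax]:.1f} dB at {f[imax[0]]:.0f} Hz, t = {tau[imax[1]]:.2f} s")
```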

  4. Financial Aid for Full-Time Undergraduates. Higher Education Panel Report Number 60.

    ERIC Educational Resources Information Center

    Andersen, Charles J.

    The level and composition of student financial aid for undergraduate students were estimated, with attention to estimated number of aid recipients, the total amount they received, the distribution of aided students by their families' income level, the composition of their aid packages, and the use of computers in the administration of aid. In…

  5. The Spherical Brazil Nut Effect and its Significance to Asteroids

    NASA Astrophysics Data System (ADS)

    Perera, Viranga; Jackson, Alan P.; Asphaug, Erik; Ballouz, Ronald-Louis

    2015-11-01

    Asteroids are intriguing remnant objects from the early solar system. They can inform us on how planets formed, they could possibly impact the earth in the future, and they likely contain precious metals; for those reasons, there will be future exploration and mining space missions to them. Telescopic observations and spacecraft data have helped us understand basic properties such as their size, mass, spin rate, orbital elements, and their surface properties. However, their interior structures have remained elusive. In order to fully characterize the interiors of these bodies, seismic data will be necessary. However, we can infer their interior structures by combining several key factors that we know about them: 1). Past work has shown that asteroids between 150 m to 10 km in size are rubble-piles that are a collection of particles held together by gravity and possibly cohesion. 2). Asteroid surfaces show cratering that suggests that past impacts would have seismically shaken these bodies. 3). Spacecraft images show that some asteroids have large protruding boulders on their surfaces. A rubble-pile object made of particles of different sizes and that undergoes seismic shaking will experience granular flow. Specifically, a size sorting effect known as the Brazil Nut Effect will lead larger particles to move towards the surface while smaller particles will move downwards. Previous work has suggested that this effect could possibly explain not only why there are large boulders on the surfaces of some asteroids but also might suggest that the interior particles of these bodies would be organized by size. Previous works have conducted computer simulations and lab experiments; however, all the particle configurations used have been either cylindrical or rectangular boxes. In this work we present a spherical configuration of self-gravitating particles that is a better representation of asteroids. Our results indicate that while friction is not necessary for the Brazil Nut Effect to take place, it aids the sorting process after a certain energy threshold is met. Even though we find that the outer layers of asteroids could possibly be size sorted, the inner regions are likely mixed.

  6. ASDF: A New Adaptable Data Format for Seismology Suitable for Large-Scale Workflows

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Smith, J. A.; Spinuso, A.; Tromp, J.

    2014-12-01

    Increases in the amount of available data as well as computational power open the possibility to tackle ever larger and more complex problems. This comes with a slew of new problems, two of which are the need for a more efficient use of available resources and a sensible organization and storage of the data. Both need to be satisfied in order to properly scale a problem, and both are frequent bottlenecks in large seismic inversions using ambient noise or more traditional techniques. We present recent developments and ideas regarding a new data format, named ASDF (Adaptable Seismic Data Format), for all branches of seismology, aiding with the aforementioned problems. The key idea is to store all information necessary to fully understand a set of data in a single file. This enables the construction of self-explaining and exchangeable data sets, facilitating collaboration on large-scale problems. We incorporate the existing metadata standards FDSN StationXML and QuakeML together with waveform and auxiliary data into a common container based on the HDF5 standard. A further critical component of the format is the storage of provenance information as an extension of W3C PROV, meaning information about the history of the data, assisting with the general problem of reproducibility. Applications of the proposed new format are numerous. In the context of seismic tomography it enables the full description and storage of synthetic waveforms, including information about the model used, the solver, the parameters, and other variables that influenced the final waveforms. Furthermore, intermediate products like adjoint sources, cross correlations, and receiver functions can be described and, most importantly, exchanged with others. Usability and tool support are crucial for any new format to gain acceptance, and we additionally present a fully functional implementation of this format based on Python and ObsPy. It offers a convenient way to discover and analyze data sets as well as making it trivial to execute processing functionality on modern high performance machines utilizing parallel I/O, even for users not familiar with the details. An open-source development and design model as well as a wiki aim to involve the community.
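    A brief usage sketch of the Python/ObsPy implementation mentioned above; the package and method names follow the pyasdf documentation and should be treated as assumptions rather than the authors' exact code:

```python
# Assumed interface: pyasdf.ASDFDataSet with add_waveforms/add_stationxml;
# ObsPy supplies bundled example data.
import obspy
import pyasdf

ds = pyasdf.ASDFDataSet("example.h5")                 # HDF5-based ASDF container
ds.add_waveforms(obspy.read(), tag="raw_recording")   # ObsPy's example traces (BW.RJOB)
ds.add_stationxml(obspy.read_inventory())             # ObsPy's example inventory

print(ds.waveforms.list())                            # station codes stored in the file
st = ds.waveforms.BW_RJOB.raw_recording               # retrieve the Stream by station and tag
print(st)
```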

  7. Optimal implicit 2-D finite differences to model wave propagation in poroelastic media

    NASA Astrophysics Data System (ADS)

    Itzá, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.

    2016-08-01

    Numerical modeling of seismic waves in heterogeneous porous reservoir rocks is an important tool for the interpretation of seismic surveys in reservoir engineering. We apply globally optimal implicit staggered-grid finite differences (FD) to model 2-D wave propagation in heterogeneous poroelastic media at a low-frequency range (<10 kHz). We validate the numerical solution by comparing it to an analytical transient solution, obtaining clear seismic wavefields including fast P and slow P and S waves (for a porous medium saturated with fluid). The numerical dispersion and stability conditions are derived using von Neumann analysis, showing that over a wide range of porous materials the Courant condition governs the stability and that this optimal implicit scheme improves the stability of explicit schemes. High-order explicit FD can be replaced by lower-order optimal implicit FD, so that the computational cost is reduced while the accuracy is maintained. Here, we compute weights for the optimal implicit FD scheme to attain an accuracy of γ = 10⁻⁸. The implicit spatial differentiation involves solving tridiagonal linear systems of equations through Thomas' algorithm.
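    For reference, a generic implementation of the Thomas algorithm used to solve the tridiagonal systems that arise from the implicit spatial derivatives (not the authors' optimized FD code):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (all length n; a[0], c[-1] unused)."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve on a diagonally dominant system.
n = 6
rng = np.random.default_rng(1)
a = rng.uniform(0.5, 1.0, n); b = rng.uniform(4.0, 5.0, n); c = rng.uniform(0.5, 1.0, n)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
d = rng.standard_normal(n)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))   # True
```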

  8. RESIF Seismology Datacentre : Recently Released Data and New Services. Computing with Dense Seisimic Networks Data.

    NASA Astrophysics Data System (ADS)

    Volcke, P.; Pequegnat, C.; Grunberg, M.; Lecointre, A.; Bzeznik, B.; Wolyniec, D.; Engels, F.; Maron, C.; Cheze, J.; Pardo, C.; Saurel, J. M.; André, F.

    2015-12-01

    RESIF is a nationwide French project aimed at building a high-quality observation system to observe and understand the Earth's interior. RESIF deals with permanent seismic network data as well as mobile network data, including dense/semi-dense arrays. The RESIF project is distributed among different nodes providing qualified data to the main datacentre at Université Grenoble Alpes, France. Data control and qualification are performed by each individual node: the poster will provide some insights into the quality control of RESIF broadband seismic data. We will then present data that have recently been made publicly available. Data are distributed through the worldwide FDSN and European EIDA standard protocols. A new web portal is now open to explore and download seismic data and metadata. The RESIF datacentre is also now connected to the Grenoble University High Performance Computing (HPC) facility: a typical use case will be presented using iRODS technologies. The use of dense observation networks is increasing, bringing challenges in data growth and handling: we will present an example where the HDF5 data format was used as an alternative to the usual seismology data formats.

  9. Empirical Approach in Developing Vs/Vp Ratio for Predicting S-Wave Velocity, Study Case; Sungai Batu, Kedah

    NASA Astrophysics Data System (ADS)

    Sabrian, T. A.; Saad, R.; Saidin, M.; Muhammad, S. B.; Yusoh, R.

    2018-04-01

    In recognition of the difficulties in acquiring seismic refraction shear wave data and the ambiguities involved in its processing, Vs/Vp ratios for the sedimentary areas of Sungai Batu have been developed and assessed in this study. Two seismic refraction survey lines, L1 and L2, were acquired using both P- and S-waves and processed along the same lines in the study area. The resulting velocities were extracted from the seismic tomography profiles to compute specific ratios after linearity and correlation checks. It is found that Vs is linearly related to Vp, with coefficients of determination (R²) of about 0.74 and 0.52 for L1 and L2, respectively. The specific ratios were computed as 0.3 and 0.4 for L1 and L2, respectively. Other data sets acquired along different lines were used to validate the ratios. The mean absolute percentage errors were calculated for both the modelling and validation data sets, and the percentage differences between measured Vs and calculated Vs were found to be 20.7% and 22%, respectively.
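    A sketch of the empirical regression step on synthetic velocities (not the Sungai Batu data), including the R² and mean-absolute-percentage-error checks mentioned above:

```python
import numpy as np

rng = np.random.default_rng(4)
vp = rng.uniform(400.0, 2000.0, 60)                       # m/s, tomography samples
vs_true_ratio = 0.35                                      # assumed "true" Vs/Vp ratio
vs = vs_true_ratio * vp + rng.normal(0.0, 40.0, vp.size)  # noisy "measured" Vs

# Linear fit Vs = slope*Vp + intercept and its coefficient of determination.
slope, intercept = np.polyfit(vp, vs, 1)
vs_pred = slope * vp + intercept
r2 = 1.0 - np.sum((vs - vs_pred) ** 2) / np.sum((vs - vs.mean()) ** 2)

# Validation on an independent line using the fitted relation.
vp_val = rng.uniform(400.0, 2000.0, 30)
vs_val = vs_true_ratio * vp_val + rng.normal(0.0, 40.0, vp_val.size)
mape = np.mean(np.abs((vs_val - (slope * vp_val + intercept)) / vs_val)) * 100.0

print(f"Vs ≈ {slope:.2f}·Vp + {intercept:.1f},  R² = {r2:.2f},  MAPE = {mape:.1f}%")
```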

  10. Volume Based Curvature Attributes Illuminate Stress Effects in Contiguous Fault Blocks, Central Basin Platform, West Texas

    NASA Astrophysics Data System (ADS)

    Blumentritt, C. H.; Marfurt, K. J.

    2005-05-01

    We compute curvatures for 3-D seismic volumes covering 200+ mi² of the Central Basin Platform in West Texas and find that these attributes illuminate lineations not seen on other displays of the seismic data. We analyze the preferred orientations of the lineations defined by well-imaged faults and fault zones and find that the patterns vary according to the nature of the faults bounding the blocks, mostly strike-slip, high-angle reverse, or oblique slip. We perform the analysis in the pre-Mississippian section, which is decoupled from the overburden by a Permian-age unconformity. Our technique differs from that of previous workers in that we compute curvatures at each sample of a seismic volume using a moving subvolume rather than along surfaces interpreted from the data. In this way, we minimize high-frequency variations in the results that arise from picking errors in the interpretation or noise in the data. We are able to extract and display values of curvature along time or depth slices, along horizon slices, and along poorly imaged horizons.
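
    For orientation only, the curvature quantities themselves can be illustrated on a gridded surface by estimating derivatives in a small moving window (central differences, equivalent to a local quadratic fit over a 3x3 neighbourhood). This simplified, surface-style sketch is not the authors' volumetric, per-sample algorithm:

        import numpy as np

        def surface_curvature(z, dx=1.0):
            """Mean and Gaussian curvature of a gridded surface z(x, y),
            from central-difference first and second derivatives."""
            zxx = (z[1:-1, 2:] - 2 * z[1:-1, 1:-1] + z[1:-1, :-2]) / dx ** 2
            zyy = (z[2:, 1:-1] - 2 * z[1:-1, 1:-1] + z[:-2, 1:-1]) / dx ** 2
            zxy = (z[2:, 2:] - z[2:, :-2] - z[:-2, 2:] + z[:-2, :-2]) / (4 * dx ** 2)
            zx = (z[1:-1, 2:] - z[1:-1, :-2]) / (2 * dx)
            zy = (z[2:, 1:-1] - z[:-2, 1:-1]) / (2 * dx)
            denom = 1 + zx ** 2 + zy ** 2
            k_mean = ((1 + zy ** 2) * zxx - 2 * zx * zy * zxy
                      + (1 + zx ** 2) * zyy) / (2 * denom ** 1.5)
            k_gauss = (zxx * zyy - zxy ** 2) / denom ** 2
            return k_mean, k_gauss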

  11. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    DTIC Science & Technology

    2017-08-08

    Usability Studies In Virtual And Traditional Computer Aided Design Environments For Fault Identification Dr. Syed Adeel Ahmed, Xavier University...virtual environment with wand interfaces compared directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse. In...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  12. A State-of-the-Art Review of the Real-Time Computer-Aided Study of the Writing Process

    ERIC Educational Resources Information Center

    Abdel Latif, Muhammad M.

    2008-01-01

    Writing researchers have developed various methods for investigating the writing process since the 1970s. The early 1980s saw the occurrence of the real-time computer-aided study of the writing process that relies on the protocols generated by recording the computer screen activities as writers compose using the word processor. This article…

  13. Computer-Aided Transcription in the Courts. Executive Summary. National Center Publication No. R-0058.

    ERIC Educational Resources Information Center

    National Center for State Courts, Williamsburg, VA.

    This report summarizes the findings of the Computer-Aided Transcription (CAT) Project, which conducted a 14-month study of the technology and use of computer systems for translating into English the shorthand notes taken by court reporters on stenotype machines. Included are the state of the art of CAT at the end of 1980 and anticipated future…

  14. Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.

    ERIC Educational Resources Information Center

    Carr, Brian

    Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…

  15. Forecasting volcanic unrest using seismicity: The good, the bad and the time consuming

    NASA Astrophysics Data System (ADS)

    Salvage, Rebecca; Neuberg, Jurgen W.

    2013-04-01

    Volcanic eruptions are inherently unpredictable in nature, with scientists struggling to forecast the type and timing of events, particularly in real-time scenarios. Current understanding suggests that statistical patterns within precursory datasets of seismicity prior to eruptive events could be used as real-time forecasting tools. They allow us to determine times of clear deviation in the data, which might be indicative of volcanic unrest. The identification of low-frequency seismic swarms and the acceleration of this seismicity prior to observed volcanic unrest may be key to developing forecasting tools. The development of real-time forecasting models that can be implemented at volcano observatories is of particular importance, since the identification of early warning signals allows danger to the proximal population to be minimized. We concentrate on understanding the significance and development of these seismic swarms as unrest develops at the volcano. In particular, analysis of accelerations in event rate, amplitude, and energy release rate prior to eruption suggests that these are important indicators of developing unrest. Analyzing these parameters simultaneously in real time allows possible improvements to forecasting models. Although more time-consuming and computationally intensive, cross-correlation techniques applied to continuous seismicity prior to volcanic unrest allow all significant seismic events to be analysed, rather than only those detected by an automated identification system. This may allow a more accurate forecast, since all precursory seismicity can be taken into account. In addition, the classification of seismic events based on spectral characteristics may allow us to isolate individual types of signals that are responsible for certain types of unrest. In this way, we may be able to better forecast the type of eruption that may ensue, or at least some of its prevailing characteristics.
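
    A minimal sketch of the template-matching step implied by the cross-correlation approach (a single-channel trace, one known template waveform, and an arbitrary detection threshold are assumed):

        import numpy as np

        def match_template(trace, template, threshold=0.7):
            """Slide a normalized cross-correlation of `template` along `trace`
            and return the sample indices where it exceeds `threshold`."""
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            cc = np.empty(len(trace) - n + 1)
            for i in range(len(cc)):
                win = trace[i:i + n]
                cc[i] = np.sum(t * (win - win.mean())) / (win.std() + 1e-12)
            return np.where(cc > threshold)[0], cc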

  16. A Novel Approach to Constrain Near-Surface Seismic Wave Speed Based on Polarization Analysis

    NASA Astrophysics Data System (ADS)

    Park, S.; Ishii, M.

    2016-12-01

    Understanding the seismic responses of cities around the world is essential for the risk assessment of earthquake hazards. One of the important parameters is the elastic structure of the sites, in particular the near-surface seismic wave speed, which influences the level of ground shaking. Many methods have been developed to constrain the elastic structure of populated sites or urban basins; here, we introduce a new technique based on analyzing the polarization content, or three-dimensional particle motion, of seismic phases arriving at the sites. Polarization analysis of three-component seismic data was widely used up to about two decades ago to detect signals and identify different types of seismic arrivals. Today, we have a good understanding of the expected polarization direction and ray parameter of seismic wave arrivals, calculated from a reference seismic model. The polarization of a given phase is also strongly sensitive to the elastic wave speed immediately beneath the station. This allows us to compare the observed and predicted polarization directions of incoming body waves and infer the near-surface wave speed. The approach is applied to the High-Sensitivity Seismograph Network in Japan, where we benchmark the results against the well-log data available at most stations. There is good agreement between our estimates of seismic wave speeds and those from well logs, confirming the efficacy of the new method. In most urban environments, where well logging is not a practical option for measuring seismic wave speeds, this method can provide a reliable, non-invasive, and computationally inexpensive estimate of near-surface elastic properties.
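
    The polarization (dominant particle-motion direction) of a windowed three-component arrival is conventionally estimated from the eigen-decomposition of the data covariance matrix; a small sketch under that standard assumption:

        import numpy as np

        def polarization_direction(z, n, e):
            """Dominant particle-motion direction (unit vector in Z, N, E) and
            rectilinearity for one windowed three-component arrival."""
            cov = np.cov(np.vstack([z, n, e]))     # 3 x 3 covariance matrix
            evals, evecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
            direction = evecs[:, -1]               # eigenvector of largest eigenvalue
            rectilinearity = 1.0 - (evals[0] + evals[1]) / (2.0 * evals[-1])
            return direction, rectilinearity

    Comparing the apparent incidence angle implied by this direction with the angle predicted from a reference model and the ray parameter is the kind of contrast the abstract exploits to infer the near-surface wave speed.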

  17. 3D Acoustic Full Waveform Inversion for Engineering Purpose

    NASA Astrophysics Data System (ADS)

    Lim, Y.; Shin, S.; Kim, D.; Kim, S.; Chung, W.

    2017-12-01

    Seismic waveform inversion is one of the most actively researched data-processing techniques. In recent years, with an increase in marine development projects, seismic surveys are commonly conducted for engineering purposes; however, research on applying waveform inversion to such surveys remains limited. Waveform inversion updates the subsurface physical properties by minimizing the difference between modeled and observed data, and it can be used to generate an accurate subsurface image; however, the technique consumes substantial computational resources. Its most compute-intensive step is the calculation of the gradient and Hessian values, and this cost is far greater in 3D than in 2D. This paper introduces a new method for calculating gradient and Hessian values in an effort to reduce the computational burden. In conventional waveform inversion, the calculation area covers all sources and receivers. In seismic surveys for engineering purposes, the number of receivers is limited, so it is inefficient to construct the Hessian and gradient for the entire region (Figure 1). To tackle this problem, we calculate the gradient and the Hessian for a single shot only within the range of the relevant source and receivers, and then sum these contributions over all shots (Figure 2). We demonstrate that restricting the calculation area of the Hessian and gradient for each shot reduces the overall amount of computation and therefore the computation time, and that waveform inversion can be suitably applied for engineering purposes. In future research, we propose to ascertain an effective calculation range. This research was supported by the Basic Research Project (17-3314) of the Korea Institute of Geoscience and Mineral Resources (KIGAM), funded by the Ministry of Science, ICT and Future Planning of Korea.
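
    A schematic of the per-shot restriction described above (the grid layout, window padding, and the per-shot gradient routine are assumed placeholders, not the authors' implementation):

        import numpy as np

        def accumulate_gradient(model_shape, shots, per_shot_gradient, pad=10):
            """Sum per-shot gradients computed only inside a window spanning each
            shot's source and receivers (plus `pad` cells) instead of the full grid."""
            grad = np.zeros(model_shape)
            for shot in shots:
                xs = [shot["source_ix"]] + list(shot["receiver_ix"])
                zs = [shot["source_iz"]] + list(shot["receiver_iz"])
                ix0, ix1 = max(min(xs) - pad, 0), min(max(xs) + pad, model_shape[1])
                iz0, iz1 = max(min(zs) - pad, 0), min(max(zs) + pad, model_shape[0])
                # per_shot_gradient is assumed to return the gradient restricted
                # to the (iz0:iz1, ix0:ix1) window for this shot.
                grad[iz0:iz1, ix0:ix1] += per_shot_gradient(shot, (iz0, iz1, ix0, ix1))
            return grad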

  18. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  19. Seismic attribute analysis for reservoir and fluid prediction, Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansor, M.N.; Rudolph, K.W.; Richards, F.B.

    1994-07-01

    The Malay Basin is characterized by excellent seismic data quality but complex clastic reservoir architecture. With these characteristics, seismic attribute analysis is a very important tool in exploration and development geoscience and is routinely used for mapping fluids and reservoirs, recognizing and risking traps, assessment, depth conversion, well placement, and field development planning. Attribute analysis can be successfully applied to both 2-D and 3-D data, as demonstrated by comparisons of 2-D and 3-D amplitude maps of the same area. There are many different methods of extracting amplitude information from seismic data, including amplitude mapping, horizon slice, summed horizon slice, isochron slice, and horizon slice from an AVO (amplitude versus offset) cube. Within the Malay Basin, horizon/isochron slice techniques have several advantages over simply extracting amplitudes from a picked horizon: they are much faster, permit examination of the amplitude structure of the entire cube, yield better results for weak or variable signatures, and aid summation of amplitudes. Summation in itself often yields improved results because it incorporates the signature from the entire reservoir interval, reducing any effects due to noise, mispicking, or waveform variations. Dip and azimuth attributes have been widely applied by industry for fault identification. In addition, these attributes can also be used to map signature variations associated with hydrocarbon contacts or stratigraphic changes, and this must be considered when using these attributes for structural interpretation.
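
    A horizon (or summed horizon) slice can be sketched as extracting amplitudes from a 3-D volume along a picked two-way-time surface and optionally summing over a vertical window; the function below is an illustrative sketch, not the workflow used in the study:

        import numpy as np

        def horizon_slice(volume, horizon_twt, dt, sum_window=0):
            """Extract amplitude along a picked horizon from a seismic volume of
            shape (n_inline, n_xline, n_samples). `horizon_twt` is two-way time
            in seconds per trace; `sum_window` sums +/- that many samples."""
            n_il, n_xl, n_t = volume.shape
            idx = np.clip(np.round(horizon_twt / dt).astype(int), 0, n_t - 1)
            out = np.zeros((n_il, n_xl))
            for k in range(-sum_window, sum_window + 1):
                kk = np.clip(idx + k, 0, n_t - 1)
                out += np.take_along_axis(volume, kk[..., None], axis=2)[..., 0]
            return out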

  20. Results and implications of new regional seismic lines in the Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leslie, W.; Ho, W.K.; Ghani, M.A.

    1994-07-01

    Regional seismic data, previously acquired between 1968 and 1971 by early operators in the Malay Basin, have limitations because sophisticated modern acquisition and processing techniques were not then available. These old data do not permit confident mapping below 3 s (TWT), equivalent to approximately 3000 m subsea, but aeromagnetic data indicate that the total sedimentary thickness exceeds 13,000 m. Hence, the existing regional seismic data, with a record length of 5 s (TWT), are neither adequate to map deeper play opportunities nor able to aid in understanding the geological history of the basin. New plays at deeper levels may exist. (1) Geochemical modeling results now predict the top of the oil generation window at depths greater than previously thought. (2) Existing gas fields occur in the upper section in areas of thickest sedimentary fill, but underlying targets have not been tested. (3) Past exploration has focused on oil rather than gas in deeper structures. Because of Malaysia's rapid development and its dedication to the protection of the environment, gas is becoming an increasingly important energy source; hence, ample internal markets for additional gas discoveries are being created. A better understanding of the Malay Basin's geological history will assist in locating these potential plays. To do this, Petronas acquired approximately 3000 line km of high-quality regional seismic data to further exploration efforts in this prospective region.

  1. [Earthquakes--a historical review, environmental and health effects, and health care measures].

    PubMed

    Nola, Iskra Alexandra; Doko Jelinić, Jagoda; Žuškin, Eugenija; Kratohvil, Mladen

    2013-06-01

    Earthquakes are natural disasters that can occur at any time, regardless of location. Their frequency is higher in the Circum-Pacific and Mediterranean/Trans-Asian seismic belts. A number of sophisticated methods define their magnitude using the Richter scale and their intensity using the Mercalli-Cancani-Sieberg scale. Recorded data show a number of devastating earthquakes that have killed many people and changed the environment dramatically. Croatia is located in a seismically active area that has endured a series of historical earthquakes, several of which occurred in the Zagreb area. The consequences of an earthquake depend mostly on the population density and the seismic resistance of buildings in the affected area. Environmental consequences often include air, water, and soil pollution, and this kind of pollution can have long-term health effects. The most dramatic health consequences result from the demolition of buildings. Therefore, quick and efficient aid depends on well-organized health professionals as well as on the readiness of the civil defence, fire department, and Mountain Rescue Service members. Good coordination among these services can save many lives. Public health interventions must include effective control measures in the environment as secondary prevention of health problems caused by unfavourable environmental factors. The identification and control of long-term hazards can reduce chronic health effects. The reduction of earthquake-induced damage includes setting priorities in building seismically safe buildings.

  2. Geomorphic and Geologic Controls of Geohazards induced by Nepal's 2015 Gorkha Earthquake

    NASA Technical Reports Server (NTRS)

    Kargel, J. S.; Leonard, G. J.; Shugar, D. H.; Haritashya, U.K.; Bevington, A.; Fielding, E. J.; Fujita, K.; Geertsema, M.; Miles, E. S.; Steiner, J.

    2015-01-01

    The Gorkha earthquake (magnitude 7.8) of 25 April 2015 and its later aftershocks struck South Asia, killing approximately 9,000 people and damaging a large region. Supported by a large campaign of responsive satellite data acquisitions over the earthquake disaster zone, our team undertook a satellite image survey of the earthquake-induced geohazards in Nepal and China and an assessment of the geomorphic, tectonic, and lithologic controls on quake-induced landslides. Timely analysis and communication aided response and recovery and informed decision makers. We mapped 4,312 co-seismic and post-seismic landslides. We also surveyed 491 glacier lakes for earthquake damage but found only 9 landslide-impacted lakes and no visible satellite evidence of outbursts. Landslide densities correlate with slope, peak ground acceleration, surface downdrop, and specific metamorphic lithologies and large plutonic intrusions.

  3. Unit cell-based computer-aided manufacturing system for tissue engineering.

    PubMed

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.

  4. Permanent Seismically Induced Displacement of Rock-Founded Structures Computed by the Newmark Program

    DTIC Science & Technology

    2009-02-01

    solved, great care is exercised by the seismic engineer to size the mesh so that moderate to high wave frequencies are not artificially excluded in...buttressing effect of a reinforced concrete slab (Figure 1.7) is represented in this simplified dynamic model by the user-specified force Presist...retaining wall that is buttressed by an invert spill- way slab (which is a reinforced concrete slab), exemplify a category of Corps retaining walls that may

  5. a Novel Approach to Support Majority Voting in Spatial Group Mcdm Using Density Induced Owa Operator for Seismic Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    As one of the most frightening natural disasters, earthquakes frequently cause huge damage to buildings, facilities, and human beings. Although the characteristics of an earthquake cannot be predicted, its losses and damage can be estimated in advance. Seismic loss estimation models evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas to earthquakes, including the age and height of buildings, the quality of materials, population density, and the location of flammable facilities. Seismic vulnerability assessment is therefore a multi-criteria problem. A number of multi-criteria decision-making models have been proposed that rely on a single expert. The main objective of this paper is to propose a model that facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is to provide a computational tool that measures the degree to which different experts support each other's opinions and to make a decision based on this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that discounting the experts who receive lower degrees of support from the others enables the decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
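
    An ordered weighted averaging (OWA) aggregation, the building block behind the density-induced operator used here, can be sketched as follows (the weights and scores are illustrative; this is not the paper's density-induced variant):

        import numpy as np

        def owa(scores, weights):
            """Ordered weighted averaging: sort the scores in descending order
            and take the weighted sum with position-dependent weights."""
            scores = np.sort(np.asarray(scores, dtype=float))[::-1]
            weights = np.asarray(weights, dtype=float)
            return float(np.dot(scores, weights / weights.sum()))

        # Five experts' vulnerability scores for one urban block, aggregated
        # with weights that discount the most extreme opinions.
        print(owa([0.9, 0.4, 0.55, 0.6, 0.5], [0.1, 0.2, 0.4, 0.2, 0.1]))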

  6. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    DOT National Transportation Integrated Search

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  7. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-Aided Process Engineering has become established in industry as a design tool. With the establishment of the CAPE-OPEN software specifications for process simulation environments, CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  8. Using CAD Programs in CAL.

    ERIC Educational Resources Information Center

    Boardman, D.

    1979-01-01

    Practical experience has shown that computer aided design programs can provide an invaluable aid in the learning process when integrated into the syllabus in lecture and laboratory periods. This should be a major area of future development of computer assisted learning in engineering education. (Author/CMV)

  9. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    DOT National Transportation Integrated Search

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  10. Computer-Aided Geometry Modeling

    NASA Technical Reports Server (NTRS)

    Shoosmith, J. N. (Compiler); Fulton, R. E. (Compiler)

    1984-01-01

    Techniques in computer-aided geometry modeling and their application are addressed. Mathematical modeling, solid geometry models, management of geometric data, development of geometry standards, and interactive and graphic procedures are discussed. The applications include aeronautical and aerospace structures design, fluid flow modeling, and gas turbine design.

  11. APPLICATION OF COMPUTER AIDED TOMOGRAPHY (CAT) TO THE STUDY OF MARINE BENTHIC COMMUNITIES

    EPA Science Inventory

    Sediment cores were imaged using a Computer-Aided Tomography (CT) scanner at Massachusetts General Hospital, Boston, Massachusetts, United States. Procedures were developed, using the attenuation of X-rays, to differentiate between sediment and the water contained in macrobenthic...

  12. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa, within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard in order to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic-tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic-tree formulation to compute the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for 100- and 475-year return periods, for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. Uniform hazard spectra are estimated at several important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
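
    The core of a probabilistic hazard calculation of this kind integrates a magnitude-recurrence model against a ground-motion attenuation relation. The toy sketch below (a single source, made-up attenuation coefficients, lognormal ground-motion scatter) shows the structure only and is not the logic-tree computation of the study:

        import numpy as np
        from scipy.stats import norm

        def hazard_curve(pga_levels, mags, annual_rates, distance_km):
            """Annual rate of exceeding each PGA level for one source zone:
            sum over magnitudes of rate(m) * P[PGA > a | m, r], assuming a
            lognormal ground-motion model with toy coefficients."""
            lam = np.zeros_like(pga_levels)
            for m, rate in zip(mags, annual_rates):
                # toy attenuation: ln PGA[g] = -3.5 + 0.9 M - 1.2 ln(R + 10), sigma = 0.6
                ln_mean = -3.5 + 0.9 * m - 1.2 * np.log(distance_km + 10.0)
                lam += rate * (1.0 - norm.cdf(np.log(pga_levels), loc=ln_mean, scale=0.6))
            return lam   # exceedance probability in T years: 1 - exp(-lam * T)

        mags = np.arange(5.0, 7.6, 0.5)
        annual_rates = 10 ** (3.0 - 1.0 * mags)       # Gutenberg-Richter-style rates (toy)
        print(hazard_curve(np.linspace(0.05, 1.0, 20), mags, annual_rates, distance_km=30.0))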

  13. Finite-Difference Modeling of Seismic Reflection Data in a Hard Rock Environment: An Example from the Mineralized Otago Schist, New Zealand

    NASA Astrophysics Data System (ADS)

    Leslie, A.; Gorman, A. R.

    2004-12-01

    The interpretation of seismic reflection data in non-sedimentary environments is problematic. In the Macraes Flat region near Dunedin (South Island, New Zealand), ongoing mining of mineralized schist has prompted the development of a seismic interpretation scheme capable of imaging a gold-bearing shear zone and associated mineralized structures accurately to the meter scale. The anisotropic and structurally complex nature of this geological environment necessitates a cost-effective, computer-based modeling technique that can provide information on the physical characteristics of the schist. Such a method has been tested on seismic data acquired in 1993 over a region that has since been excavated and logged. Correlation with measured structural data permits a direct comparison between the seismic data and the actual geology. Synthetic modeling utilizes a 2-D visco-elastic finite-difference routine to constrain the interpretation of observed seismic characteristics, including the velocity, anisotropy, and contrast of the shear zone structures. Iterative refinement produces a synthetic model that more closely matches the observed seismic response. The comparison between the actual and synthetic seismic sections provides promising results that will be tested by new data acquisition over the summer of 2004/2005 to identify structures and zones of potential mineralization. As a downstream benefit, this research could also contribute to earthquake risk assessment analyses at active faults with similar characteristics.
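
    The forward modeling relies on a 2-D finite-difference routine; a stripped-down acoustic (not visco-elastic) time-stepping kernel, under standard second-order assumptions and with absorbing boundaries omitted, looks like this:

        import numpy as np

        def acoustic_fd_2d(vel, src, src_iz, src_ix, dt, dx, nt):
            """Second-order 2-D acoustic finite-difference modeling (constant
            density, periodic edges; absorbing boundaries omitted for brevity)."""
            nz, nx = vel.shape
            p_old = np.zeros((nz, nx))
            p = np.zeros((nz, nx))
            c2 = (vel * dt / dx) ** 2                  # squared Courant number per cell
            snapshots = []
            for it in range(nt):
                lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                       np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
                p_new = 2.0 * p - p_old + c2 * lap
                if it < len(src):
                    p_new[src_iz, src_ix] += src[it]   # inject the source wavelet
                p_old, p = p, p_new
                snapshots.append(p.copy())
            return snapshots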

  14. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-05-01

    Egypt is located in the northeastern corner of Africa, within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard in order to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic-tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic-tree formulation to compute the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods, for 100- and 475-year return periods, for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. Uniform hazard spectra are estimated at several important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.

  15. Seismic lateral prediction in chalky limestone reservoirs offshore Qatar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubbens, I.B.H.M.; Murat, R.C.; Vankeulen, J.

    Following the discovery of non-structurally trapped oil accumulations in Cretaceous chalky reservoirs on the northern flank of the North Dome offshore QATAR, a seismic lateral prediction study was carried out for QATAR GENERAL PETROLEUM CORPORATION (Offshore Operations). The objectives of this study were to assist in the appraisal of these oil accumulations by predicting their possible lateral extent and to investigate whether the technique applied could be used as a basis for further exploration of similar oil prospects in the area. Wireline logs of eight wells and some 1000 km of high-quality seismic data were processed into acoustic impedance (A.I.) logs and seismic A.I. sections. Having obtained a satisfactory match between the A.I. well logs and the A.I. of the seismic traces at the well locations, relationships were established from the well log data that allowed the interpretation of the seismic A.I. in terms of reservoir quality. Measurements of the relevant A.I. characteristics were then carried out by computer along all seismic lines, and porosity distribution maps were prepared for some of the reservoirs. These maps, combined with detailed seismic depth contour maps at reservoir tops, led to the definition of good reservoir development areas downdip from poor reservoir-quality zones, i.e., of the stratigraphic trap areas, and drilling locations could thus be proposed. The system remains to be adequately calibrated when core material becomes available in the area of study.

  16. Salton Trough Post-seismic Afterslip, Viscoelastic Response, and Contribution to Regional Hazard

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G. A.

    2012-12-01

    The M7.2 El Mayor-Cucapah earthquake of 4 April 2010 in Baja California may have affected the accumulated hazard to Southern California cities by loading regional faults, including the Elsinore, San Jacinto, and southern San Andreas, which already carry over a century of tectonic loading. We examine changes observed via multiple seismic and geodetic techniques, including microseismicity and proposed seismicity-based indicators of hazard, high-quality fault models, the Plate Boundary Observatory GNSS array (with 174 stations showing post-seismic transients with greater than 1 mm amplitude), and interferometric radar maps from UAVSAR (aircraft) flights, which show a network of aseismic fault slip events at distances up to 60 km from the end of the surface rupture. Finite element modeling is used to compute the expected coseismic motions at GPS stations, with general agreement, including coseismic uplift at sites ~200 km north of the rupture. The postseismic response is also compared with GNSS observations and with the CIG software "RELAX." An initial examination of hazard is made by comparing microseismicity-based metrics, fault models, and changes to Coulomb stress on nearby faults computed with the finite element model. Comparison of seismicity with interferograms and historic earthquakes shows that aseismic slip occurs on fault segments that have had earthquakes in the last 70 years, while other segments show no slip at the surface but do show high triggered seismicity. UAVSAR-based estimates of fault slip can be incorporated into the finite element model to correct the computed Coulomb stress changes.
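
    The Coulomb stress change referred to here is conventionally defined as ΔCFS = Δτ + μ'Δσn, with the normal-stress change taken positive for unclamping; a one-line sketch with illustrative numbers:

        def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
            """Coulomb failure stress change (MPa): delta_shear is the shear-stress
            change resolved in the slip direction, delta_normal the normal-stress
            change (positive = unclamping), mu_eff an effective friction coefficient."""
            return delta_shear + mu_eff * delta_normal

        # Example: 0.05 MPa of added shear stress and 0.02 MPa of clamping.
        print(coulomb_stress_change(0.05, -0.02))      # 0.042 MPa, toward failure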

  17. Precision Seismic Monitoring of Volcanic Eruptions at Axial Seamount

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Wilcock, W. S. D.; Tolstoy, M.; Baillard, C.; Tan, Y. J.; Schaff, D. P.

    2017-12-01

    Seven permanent ocean bottom seismometers of the Ocean Observatories Initiative's real-time cabled observatory at Axial Seamount, off the coast of the western United States, have recorded seismic activity since 2014. The array captured the April 2015 eruption, shedding light on the detailed structure and dynamics of the volcano and the Juan de Fuca mid-ocean ridge system (Wilcock et al., 2016). After a period of continuously increasing seismic activity, primarily associated with the reactivation of caldera ring faults, and the subsequent seismic crisis on April 24, 2015, with 7000 recorded events that day, seismicity rates steadily declined, and the array currently records an average of 5 events per day. Here we present results from ongoing efforts to automatically detect and precisely locate seismic events at Axial in real time, providing the computational framework and fundamental data that will allow rapid characterization and analysis of spatio-temporal changes in seismogenic properties. We combine a kurtosis-based P- and S-phase onset picker and time-domain cross-correlation detection and phase-delay timing algorithms with single-event and double-difference location methods to rapidly and precisely (to tens of meters) compute the locations and magnitudes of new events with respect to a 2-year-long, high-resolution background catalog that includes nearly 100,000 events within a 5×5 km region. We extend the real-time double-difference location software DD-RT to efficiently handle the anticipated high-rate and high-density earthquake activity during future eruptions. The modular monitoring framework will allow real-time tracking of other seismic events, such as tremors and sea-floor lava explosions, that enable the timing and location of lava flows and thus guide response research cruises to the most interesting sites. Finally, rapid detection of eruption precursors and initiation will allow adaptive sampling by the OOI instruments for optimal recording of future eruptions. With a higher eruption recurrence rate than land-based volcanoes, the Axial OOI observatory offers the opportunity to monitor and study volcanic eruptions throughout multiple cycles.
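
    A kurtosis-based onset picker of the kind mentioned here can be sketched as tracking the running kurtosis of the trace and picking the point of steepest increase (the window length and function names are assumptions, not the observatory's implementation):

        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_pick(trace, win=200):
            """Sample index of the steepest rise in running kurtosis, a simple
            proxy for an impulsive P-wave onset."""
            k = np.array([kurtosis(trace[i - win:i]) for i in range(win, len(trace))])
            return int(np.argmax(np.diff(k))) + win    # steepest positive jump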

  18. A new approach to global seismic tomography based on regularization by sparsity in a novel 3D spherical wavelet basis

    NASA Astrophysics Data System (ADS)

    Loris, Ignace; Simons, Frederik J.; Daubechies, Ingrid; Nolet, Guust; Fornasier, Massimo; Vetter, Philip; Judd, Stephen; Voronin, Sergey; Vonesch, Cédric; Charléty, Jean

    2010-05-01

    Global seismic wavespeed models are routinely parameterized in terms of spherical harmonics, networks of tetrahedral nodes, rectangular voxels, or spherical splines. Up to now, Earth model parametrizations by wavelets on the three-dimensional ball have remained uncommon. Here we propose such a procedure with the following three goals in mind: (1) The multiresolution character of a wavelet basis allows the models to be represented with an effective spatial resolution that varies as a function of position within the Earth. (2) This property can be used to great advantage in the regularization of seismic inversion schemes by seeking the most sparse solution vector, in wavelet space, through iterative minimization of a combination of the ℓ2 norm (to fit the data) and the ℓ1 norm (to promote sparsity in wavelet space). (3) With the continuing increase in high-quality seismic data, our focus is also on numerical efficiency and the ability to use parallel computing in reconstructing the model. In this presentation we propose a new wavelet basis to take advantage of these three properties. To form the numerical grid we begin with a surface tessellation known as the 'cubed sphere', a construction popular in fluid dynamics and computational seismology, coupled with a semi-regular radial subdivision that honors the major seismic discontinuities between the core-mantle boundary and the surface. This mapping first divides the volume of the mantle into six portions. In each 'chunk', two angular and one radial variable are used for the parametrization. In the new variables, standard 'cartesian' algorithms can more easily be used to perform the wavelet transform (or other common transforms). Edges between chunks are handled by special boundary filters. We highlight the benefits of this construction and use it to analyze the information present in several published seismic compressional-wavespeed models of the mantle, paying special attention to the statistics of wavelet and scaling coefficients across scales. We also focus on the likely gains of future inversions of finite-frequency seismic data using a sparsity-promoting penalty in combination with our new wavelet approach.
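
    The ℓ2 + ℓ1 minimization described in point (2) is commonly solved by iterative soft thresholding; a generic sketch (the forward operator A, data vector d, and step size are placeholders, not the authors' solver):

        import numpy as np

        def soft_threshold(x, tau):
            """Component-wise soft thresholding, the proximal map of tau*||x||_1."""
            return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

        def ista(A, d, lam, step, n_iter=200):
            """Minimize 0.5*||A w - d||_2^2 + lam*||w||_1 over wavelet coefficients
            w by iterative soft thresholding (step should be <= 1/||A||^2)."""
            w = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ w - d)               # gradient of the misfit term
                w = soft_threshold(w - step * grad, lam * step)
            return w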

  19. A tutorial on the use of ROC analysis for computer-aided diagnostic systems.

    PubMed

    Scheipers, Ulrich; Perrey, Christian; Siebers, Stefan; Hansen, Christian; Ermert, Helmut

    2005-07-01

    The application of the receiver operating characteristic (ROC) curve to computer-aided diagnostic systems is reviewed. A statistical framework is presented, and different methods of evaluating the classification performance of computer-aided diagnostic systems, in particular systems for ultrasonic tissue characterization, are derived. Most classifiers in use today depend on a separation threshold, which in many cases can be chosen freely. The separation threshold divides the range of output values of the classification system into different target groups, thus conducting the actual classification process. In the first part of this paper, threshold-specific performance measures, e.g., sensitivity and specificity, are presented. In the second part, a threshold-independent performance measure, the area under the ROC curve, is reviewed. Only the use of threshold-independent performance measures provides classification results that are representative of computer-aided diagnostic systems overall. The following text was motivated by the lack of a complete and definitive discussion of the underlying subject in available textbooks, references, and publications. Most manuscripts published so far address performance evaluation using ROC analysis in a manner too general to be practical for everyday use in the development of computer-aided diagnostic systems. Nowadays, the user of computer-aided diagnostic systems typically handles huge amounts of numerical data, not always normally distributed. Many assumptions made in more or less theoretical works on ROC analysis are no longer valid for real-life data. The paper aims at closing the gap between theoretical works and real-life data. The review provides the interested scientist with the information needed to conduct ROC analysis and to integrate algorithms performing ROC analysis into classification systems while understanding the basic principles of classification.
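
    The threshold-independent measure discussed here, the area under the ROC curve, can be computed directly from classifier scores; a small sketch (the scores and labels are illustrative):

        import numpy as np

        def roc_auc(scores, labels):
            """Sweep all thresholds over the classifier scores, build the ROC
            curve (false-positive vs. true-positive rate), and return its area."""
            order = np.argsort(scores)[::-1]
            labels = np.asarray(labels)[order]
            tpr = np.cumsum(labels) / labels.sum()
            fpr = np.cumsum(1 - labels) / (len(labels) - labels.sum())
            return fpr, tpr, np.trapz(tpr, fpr)

        fpr, tpr, auc = roc_auc([0.9, 0.8, 0.7, 0.6, 0.4, 0.3], [1, 1, 0, 1, 0, 0])
        print(round(auc, 3))                           # 0.889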

  20. Towards Exascale Seismic Imaging and Inversion

    NASA Astrophysics Data System (ADS)

    Tromp, J.; Bozdag, E.; Lefebvre, M. P.; Smith, J. A.; Lei, W.; Ruan, Y.

    2015-12-01

    Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a host of concerns tied to obtaining optimum performance. Several issues are currently being investigated by the HPC community, including energy consumption, fault resilience, scalability of the current parallel paradigms, workflow management, I/O performance, and feature extraction from large datasets. In this presentation, we focus on the last three issues. In the context of seismic imaging and inversion, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and post-processing of seismic traces). The greater goal is to reduce the time to solution, that is, to obtain a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts comprising it. The usual approach is to speed up the purely computational parts through code optimization in order to reach higher FLOPS and better memory management. This remains an important concern, but larger-scale experiments show that the imaging workflow suffers from severe I/O bottlenecks. Such limitations occur both for purely computational data and for seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). Parallel I/O libraries, namely HDF5 and ADIOS, are used to drastically reduce the cost of disk access. Parallel visualization tools, such as VisIt, are able to take advantage of ADIOS metadata to extract features and display massive datasets. Because large parts of the workflow are embarrassingly parallel, we are investigating the possibility of automating the imaging process by integrating scientific workflow management tools, specifically Pegasus.
