Sample records for large-scale shaking table

  1. A study on seismic behavior of pile foundations of bridge abutment on liquefiable ground through shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakata, Mitsuhiko; Tanimoto, Shunsuke; Ishida, Shuichi; Ohsumi, Michio; Hoshikuma, Jun-ichi

    2017-10-01

    Bridge foundations are at risk of being damaged by liquefaction-induced lateral spreading of the ground, and once a foundation has been damaged, restoration takes a long time. It is therefore important to assess the seismic behavior of foundations on liquefiable ground appropriately. In this study, shaking table tests of 1/10-scale models were conducted on the large-scale shaking table at the Public Works Research Institute, Japan, to investigate the seismic behavior of a pile-supported bridge abutment on liquefiable ground. Tests were conducted on three types of model: two represent an existing bridge built without any design provision for liquefaction, and the third represents a bridge designed to the current Japanese design specifications for highway bridges. The bending strains in the piles of the abutment designed to the current specifications were smaller than those of the existing bridge.

  2. Large-Scale Biaxial Friction Experiments with an Assistance of the NIED Shaking Table

    NASA Astrophysics Data System (ADS)

    Fukuyama, E.; Mizoguchi, K.; Yamashita, F.; Togo, T.; Kawakata, H.; Yoshimitsu, N.; Shimamoto, T.; Mikoshiba, T.; Sato, M.; Minowa, C.

    2012-12-01

    We constructed a large-scale biaxial friction apparatus using the large shaking table operating at NIED (table dimensions 15 m x 15 m). The actuator of the shaking table serves as the engine for constant-speed loading. We used a 1.5 m long rock sample overlaid on a 2 m long one; both are 0.5 m in height and width, so the slip area is 1.5 m x 0.5 m. The 2 m sample moves with the shaking table while the 1.5 m sample is fixed to the basement of the shaking table, so the table displacement controls the dislocation between the two rock samples. The shaking table can generate 0.4 m of displacement at velocities ranging between 0.0125 mm/s and 1 m/s. We used Indian gabbro for the rock samples in the present experiments. The original sliding surface was ground flat to less than 0.024 mm of undulation using a large-scale plane grinder, and the surface roughness evolved as subsequent experiments were done. Wear material was generated during each experiment, and its grain size became larger as the experiments proceeded, which might suggest damage evolution on the sliding surface. In some experiments we did not remove the gouge material before sliding, in order to examine the effect of a gouge layer. Normal stress can be applied up to 1.3 MPa. The stiffness of the apparatus was measured experimentally and was of the order of 0.1 GN/m. We first measured the coefficient of friction at low sliding velocity (0.1-1 mm/s), where a steady state was achieved after about 5 mm of slip. The coefficient of friction was about 0.75 under normal stresses between 0.13 and 1.3 MPa, consistent with estimates from previous work using smaller rock samples. We observed that the coefficient of friction decreased gradually with increasing slip velocity, but the friction curves at higher velocities were simultaneously characterized by stick-slip vibration. The main aim of the experiments is to understand rupture propagation from slow nucleation to fast unstable rupture during the loading of two contacting surfaces. We recorded many unstable slip events that nucleated inside the sliding surface but did not reach its edge before slip terminated. These slip events simulate the full rupture process during an earthquake, including nucleation, propagation, and termination of the rupture. We monitored this rupture progress through the propagation of strain changes measured by 16 semiconductor strain gauges recorded at a sampling rate of 1 MHz. In addition, high-frequency waves emitted by AE events were continuously observed by 8 piezoelectric transducers (PZTs) at a sampling rate of 20 MHz; these sensors were attached at the edge of the slipping area. AE events started to occur where slip nucleated and the slip area started to expand. Unfortunately, we could not locate all AE events during unstable rupture because signals from multiple events overlapped in the PZT records. We also monitored the amplitudes of waves transmitted across the sliding surface. The amplitudes decreased just after a stick-slip event and recovered gradually, suggesting that the transmitted wave amplitudes might reflect the slipped area on the interface.
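
    The friction coefficient of about 0.75 quoted above is, in essence, the ratio of the measured shear stress to the applied normal stress once sliding reaches steady state. The short Python sketch below illustrates that reduction for a synthetic force record; the area is the 1.5 m x 0.5 m interface from the abstract, but the force histories and the 10 s run-in window are invented for illustration.

```python
import numpy as np

# Hypothetical illustration of computing a friction coefficient record from
# measured shear and normal forces.  The numbers are placeholders, not data
# from the NIED apparatus.
AREA = 1.5 * 0.5                              # nominal contact area [m^2]
t = np.linspace(0.0, 60.0, 6001)              # time [s]
normal_force = np.full_like(t, 1.0e6)         # ~1.33 MPa x 0.75 m^2 ~ 1e6 N
shear_force = 0.75 * normal_force * (1 - np.exp(-t / 2.0))   # synthetic ramp to steady state

shear_stress = shear_force / AREA             # [Pa]
normal_stress = normal_force / AREA           # [Pa]
mu = shear_stress / normal_stress             # friction coefficient time series

# Steady-state friction taken as the mean after the initial run-in
steady = mu[t > 10.0].mean()
print(f"steady-state friction coefficient ~ {steady:.2f}")
```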

  3. Shake Table Testing of an Elevator System in a Full-Scale Five-Story Building

    PubMed Central

    Wang, Xiang; Hutchinson, Tara C.; Astroza, Rodrigo; Conte, Joel P.; Restrepo, José I.; Hoehler, Matthew S.; Ribeiro, Waldir

    2016-01-01

    This paper investigates the seismic performance of a functional traction elevator as part of a full-scale five-story building shake table test program. The test building was subjected to a suite of earthquake input motions of increasing intensity, first while the building was isolated at its base, and subsequently while it was fixed to the shake table platen. In addition, low-amplitude white noise base excitation tests were conducted while the elevator system was placed in three different configurations, namely, by varying the vertical location of its cabin and counterweight, to study the acceleration amplifications of the elevator components due to dynamic excitations. During the earthquake tests, detailed observation of the physical damage and operability of the elevator as well as its measured response are reported. Although the cabin and counterweight sustained large accelerations due to impact during these tests, the use of well-restrained guide shoes demonstrated its effectiveness in preventing the cabin and counterweight from derailment during high-intensity earthquake shaking. However, differential displacements induced by the building imposed undesirable distortion of the elevator components and their surrounding support structure, which caused damage and inoperability of the elevator doors. It is recommended that these aspects be explicitly considered in elevator seismic design. PMID:28242957

  4. Shake Table Testing of an Elevator System in a Full-Scale Five-Story Building.

    PubMed

    Wang, Xiang; Hutchinson, Tara C; Astroza, Rodrigo; Conte, Joel P; Restrepo, José I; Hoehler, Matthew S; Ribeiro, Waldir

    2017-03-01

    This paper investigates the seismic performance of a functional traction elevator as part of a full-scale five-story building shake table test program. The test building was subjected to a suite of earthquake input motions of increasing intensity, first while the building was isolated at its base, and subsequently while it was fixed to the shake table platen. In addition, low-amplitude white noise base excitation tests were conducted while the elevator system was placed in three different configurations, namely, by varying the vertical location of its cabin and counterweight, to study the acceleration amplifications of the elevator components due to dynamic excitations. During the earthquake tests, detailed observation of the physical damage and operability of the elevator as well as its measured response are reported. Although the cabin and counterweight sustained large accelerations due to impact during these tests, the use of well-restrained guide shoes demonstrated its effectiveness in preventing the cabin and counterweight from derailment during high-intensity earthquake shaking. However, differential displacements induced by the building imposed undesirable distortion of the elevator components and their surrounding support structure, which caused damage and inoperability of the elevator doors. It is recommended that these aspects be explicitly considered in elevator seismic design.

  5. Experimental/analytical approaches to modeling, calibrating and optimizing shaking table dynamics for structural dynamic applications

    NASA Astrophysics Data System (ADS)

    Trombetti, Tomaso

    This thesis presents an experimental/analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by an MTS 407 controller, which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and the dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques. The power spectral density of the system input and the cross power spectral density of the table input and output were estimated using Bartlett's spectral estimation method. The experimentally estimated table acceleration transfer functions obtained for different working conditions are correlated with their analytical counterparts. As a result of this comprehensive correlation study, a thorough understanding of the shaking table dynamics and its sensitivities to control and payload parameters is obtained. Moreover, the correlation study leads to a calibrated analytical model of the shaking table with high predictive ability. It is concluded that, in its present condition, the Rice shaking table is able to reproduce, with a high degree of accuracy, model earthquake acceleration time histories in the frequency bandwidth from 0 to 75 Hz. Furthermore, the exhaustive analysis performed indicates that the table transfer function is not significantly affected by the presence of a large (in terms of weight) payload with a fundamental frequency up to 20 Hz. Payloads having a higher fundamental frequency do significantly affect the shaking table performance and require a modification of the table control gain settings, which can easily be obtained using the predictive analytical model of the shaking table. The complete description of a structural dynamic experiment performed using the Rice shaking table facility is also reported herein.
    The object of this experiment was twofold: (1) to verify the testing capability of the shaking table, and (2) to experimentally validate a simplified theory developed by the author, which predicts the maximum rotational response developed by seismically isolated building structures characterized by non-coincident centers of mass and rigidity when subjected to strong earthquake ground motions.
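
    The spectral identification step described above (input power spectral density and input-output cross power spectral density combined into a transfer-function estimate) can be sketched as follows. This is a hedged illustration, not the author's code: it uses SciPy's welch and csd routines with non-overlapping boxcar windows to approximate Bartlett's averaging, and the "target" and "measured" accelerations are synthetic signals standing in for the Rice table data.

```python
import numpy as np
from scipy import signal

# Sketch of the H1 transfer-function estimate H(f) = Pxy(f) / Pxx(f), where
# x is the target (commanded) table acceleration and y the measured table
# acceleration.  Bartlett's method is approximated by Welch averaging with
# non-overlapping boxcar windows.  Signals here are synthetic placeholders.
fs = 500.0                                    # sampling rate [Hz]
t = np.arange(0, 60, 1 / fs)
x = np.random.default_rng(0).standard_normal(t.size)   # target acceleration (white)

# Pretend the table behaves like a lightly damped 20 Hz second-order system
wn = 2 * np.pi * 20.0
b, a = signal.bilinear([wn**2], [1.0, 2 * 0.3 * wn, wn**2], fs)
y = signal.lfilter(b, a, x)                   # "measured" table acceleration

nper = 1024
f, Pxx = signal.welch(x, fs=fs, window="boxcar", nperseg=nper, noverlap=0)
_, Pxy = signal.csd(x, y, fs=fs, window="boxcar", nperseg=nper, noverlap=0)
H = Pxy / Pxx                                 # H1 transfer-function estimate

print("gain at 10 Hz ~", abs(H[np.argmin(abs(f - 10.0))]))
```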

  6. Damage Assessment of a Full-Scale Six-Story Wood-Frame Building Following Triaxial Shake Table Tests

    Treesearch

    John W. van de Lindt; Rakesh Gupta; Shiling Pei; Kazuki Tachibana; Yasuhiro Araki; Douglas Rammer; Hiroshi Isoda

    2012-01-01

    In the summer of 2009, a full-scale midrise wood-frame building was tested under a series of simulated earthquakes on the world's largest shake table in Miki City, Japan. The objective of this series of tests was to validate a performance-based seismic design approach by qualitatively and quantitatively examining the building's seismic performance in terms of...

  7. Stick-slip behavior of Indian gabbro as studied using a NIED large-scale biaxial friction apparatus

    NASA Astrophysics Data System (ADS)

    Togo, Tetsuhiro; Shimamoto, Toshihiko; Yamashita, Futoshi; Fukuyama, Eiichi; Mizoguchi, Kazuo; Urata, Yumi

    2015-04-01

    This paper reports stick-slip behavior of Indian gabbro as studied using a new large-scale biaxial friction apparatus built at the National Research Institute for Earth Science and Disaster Prevention (NIED), Tsukuba, Japan. The apparatus consists of the existing shaking table as the shear-loading device (up to 3,600 kN), the main frame for holding two large rectangular prismatic specimens with a sliding area of 0.75 m^2 and for applying normal stresses σn up to 1.33 MPa, and a reaction force unit holding the stationary specimen to the ground. The shaking table can produce loading rates v up to 1.0 m/s, accelerations up to 9.4 m/s^2, and displacements d up to 0.44 m, using four servo-controlled actuators. We report results from eight preliminary experiments conducted at room humidity on the same gabbro specimens at v = 0.1-100 mm/s and σn = 0.66-1.33 MPa, with d of about 0.39 m. The peak and steady-state friction coefficients were about 0.8 and 0.6, respectively, consistent with Byerlee friction. The axial force drop, or shear stress drop, during an abrupt slip is linearly proportional to the amount of displacement, and the slope of this relationship gives the stiffness of the apparatus as 1.15 × 10^8 N/m, or 153 MPa/m, for the specimens we used. This low stiffness makes the fault motion very unstable, and overshooting of the shear stress to a negative value was recognized in some violent stick-slip events. An abrupt slip occurred within a nearly constant rise time of 16-18 ms despite wide variation of the stress drop, and the average velocity during an abrupt slip is linearly proportional to the stress drop. The use of a large-scale shaking table has great potential for increasing the slip rate and total displacement in biaxial friction experiments with large specimens.
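
    The stiffness estimate described above follows from fitting the force drop of each stick-slip event against the slip in that event; the slope of the fit is the apparatus stiffness, and dividing by the 0.75 m^2 sliding area gives the stress-drop gradient. The sketch below reproduces that arithmetic with invented event data chosen to sit near the reported values.

```python
import numpy as np

# Minimal sketch of the stiffness estimate: fit force drop against slip for
# a set of stick-slip events; the slope is the apparatus stiffness.  Event
# values below are made up for illustration.
slip = np.array([0.5, 1.1, 2.0, 3.2, 4.4, 6.0]) * 1e-3          # slip per event [m]
force_drop = 1.15e8 * slip + np.array([2, -3, 1, -2, 3, -1]) * 1e3   # [N], with scatter

stiffness, intercept = np.polyfit(slip, force_drop, 1)
print(f"apparatus stiffness ~ {stiffness:.3e} N/m")

# Per unit fault area (0.75 m^2 sliding surface) this corresponds to the
# stress-drop gradient quoted in the paper (~153 MPa/m):
print(f"stress-drop slope ~ {stiffness / 0.75 / 1e6:.0f} MPa/m")
```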

  8. Shake table test of soil-pile groups-bridge structure interaction in liquefiable ground

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Ling, Xianzhang; Xu, Pengju; Gao, Xia; Wang, Dongsheng

    2010-03-01

    This paper describes a shake table study on the seismic response of low-cap pile groups and a bridge structure in liquefiable ground. The soil profile, contained in a large-scale laminar shear box, consisted of a saturated sand layer overlaid by a silty clay layer, with the simulated low-cap pile groups embedded in it. The container was excited by three El Centro earthquake motions of different intensity levels. Test results indicate that excess pore pressure (EPP) accumulated only slightly during weak shaking, with most of the accumulation occurring during strong shaking. The EPP grew as the amplitude and duration of the input acceleration increased. The acceleration response of the sand was strongly influenced by soil liquefaction. As the soil liquefied, the peak sand displacement gradually lagged behind the input acceleration; meanwhile, the sand displacement had an increasing effect on the bending moment of the piles, and the acceleration responses of the pile and the sand layer gradually changed from decreasing to increasing in the vertical direction from bottom to top. An abrupt change in the pile bending moment was observed near the soil-layer interface in all three input events. These shake table tests are intended to provide the groundwork for further seismic performance studies of low-cap pile groups used in bridges on liquefiable ground.

  9. Shaking table test and dynamic response prediction on an earthquake-damaged RC building

    NASA Astrophysics Data System (ADS)

    Xianguo, Ye; Jiaru, Qian; Kangning, Li

    2004-12-01

    This paper presents the results of shaking table tests of a one-tenth-scale reinforced concrete (RC) building model. The test model represents a prototype building that was seriously damaged during the 1985 Mexico earthquake. The input ground excitations used during the tests were taken from records obtained near the site of the prototype building during the 1985 and 1995 Mexico earthquakes. The tests showed that the damage pattern of the test model agreed well with that of the prototype building. An analytical prediction of the earthquake response was conducted for the prototype building using a sophisticated 3-D frame model. The input motion used for the dynamic analysis was the shaking table test measurements transformed according to the similitude relations. The comparison of the analytical results and the shaking table test results indicates that the response of the RC building to minor and moderate earthquakes can be predicted well; however, there are differences between the prediction and the actual response to the major earthquake.
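
    The "similarity transformation" of the recorded motions mentioned above is not spelled out in the abstract; a common choice for 1-g shake table tests of a reduced-scale model is to keep acceleration amplitudes at prototype level while compressing the time axis by the square root of the length scale. The sketch below applies that assumed rule to a placeholder record; the actual scale factors used in the study may differ.

```python
import numpy as np

# Sketch of one common similitude transformation for 1-g shake table tests
# of a 1/10-scale model: accelerations kept at prototype amplitude, time
# axis compressed by sqrt(length scale).  The scaling law actually used in
# the paper is not stated in the abstract; this is only illustrative.
length_scale = 1.0 / 10.0                     # model / prototype
time_scale = np.sqrt(length_scale)            # Froude-type (1-g) similitude
accel_scale = 1.0                             # accelerations unchanged

dt_prototype = 0.02                           # recorded-motion time step [s]
record = np.sin(2 * np.pi * 1.0 * np.arange(0, 20, dt_prototype))   # placeholder record [g]

dt_model = dt_prototype * time_scale          # compressed time step for the table
model_record = accel_scale * record           # same amplitudes, played back faster

print(f"model time step: {dt_model:.4f} s (prototype {dt_prototype:.2f} s)")
print(f"dominant frequency scales from 1.0 Hz to {1.0 / time_scale:.2f} Hz")
```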

  10. Applied research of shaking table for scandium concentration from a silicate ore

    NASA Astrophysics Data System (ADS)

    Yan, P.; Zhang, G. F.; Gao, L.; Shi, B. H.; Shi, Z.; Yang, Y. D.

    2018-03-01

    A low-grade magnetite iron ore hosts a super-large independent scandium deposit with a potential utilizable value in the multi-billions. Shaking table separation is very useful for removing impurities and increasing the scandium content as a follow-up step to high-intensity magnetic separation. In the present study, a satisfactory result, namely a scandium content of 83.10 g/t at a recovery of 79.45 wt%, was obtained by shaking table separation. This result was achieved with a feed solids concentration of 18 wt%, a feed rate of 11 L/min, a stroke frequency of 275 strokes/min, and a stroke length of 17 mm.
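
    The recovery figure quoted above follows from a standard two-product mass balance on the feed, concentrate and tailings grades. The sketch below shows that calculation; the feed and tailings grades are assumed values for illustration, and only the 83.10 g/t concentrate grade is taken from the abstract.

```python
# Minimal sketch of the two-product recovery calculation behind figures like
# "83.10 g/t at 79.45% recovery".  Feed and tailings grades below are
# hypothetical; only the formula itself is the point.
feed_grade = 50.0          # Sc in table feed [g/t]   (assumed)
conc_grade = 83.10         # Sc in concentrate [g/t]  (from the abstract)
tail_grade = 15.0          # Sc in tailings [g/t]     (assumed)

# Mass split of feed reporting to concentrate, from the grade balance
mass_yield = (feed_grade - tail_grade) / (conc_grade - tail_grade)

# Recovery = (concentrate mass x concentrate grade) / (feed mass x feed grade)
recovery = mass_yield * conc_grade / feed_grade
print(f"mass yield to concentrate: {mass_yield:.1%}")
print(f"scandium recovery:         {recovery:.1%}")
```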

  11. The quest for better quality-of-life - learning from large-scale shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of "learning from actual earthquakes and earthquake damage." That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., "learning from actual damage," was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten the society also increases. In such an era, the approach of "learning from actual earthquake damage" becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to "learn from quasi-actual earthquake damage." One tool for experiencing earthquake damage without the attendant catastrophe is the large shaking table. E-Defense, the largest such table, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests on a variety of structural systems. The tests supply detailed data on the actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of structures and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermath of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural damage, business continuity, public health, quickness of damage assessment, infrastructure, data and communication networks, and other issues, and not enough useful empirical data have emerged about these issues from the experience of actual earthquakes. To provide quantitative data that can be used to reduce earthquake risk to our quality of life, E-Defense has recently been implementing two comprehensive research projects in which a base-isolated hospital and a steel high-rise building were tested on the E-Defense shaking table and their seismic performance was examined, particularly in terms of nonstructural damage, damage to building contents and furniture, and operability, functionality, and business-continuity capability. The paper presents an overview of the two projects, together with major findings obtained from them.

  12. Experimental seismic behavior of a full-scale four-story soft-story wood-frame building with retrofits II: shake table test results

    Treesearch

    John W. van de Lindt; Pouria Bahmani; Gary Mochizuki; Steven E. Pryor; Mikhail Gershfeld; Jingjing Tian; Michael D. Symans; Douglas Rammer

    2016-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. The majority of these buildings were constructed from the 1920s to the 1960s and are prone to collapse during moderate to large earthquakes due to a characteristic deficiency in strength and stiffness in their first story. In order to propose and validate retrofit...

  13. Lunar regolith densification

    NASA Technical Reports Server (NTRS)

    Ko, Hon-Yim; Sture, Stein

    1991-01-01

    Core tube samples of the lunar regolith obtained during the Apollo missions showed a rapid increase in the density of the regolith with depth. Various hypotheses have been proposed for the possible cause of this phenomenon, including the densification of the loose regolith material by repeated shaking from the seismic tremors which have been found to occur at regular monthly intervals when the moon and earth are closest to one another. A test bed was designed to study regolith densification. This test bed uses Minnesota Lunar Simulant (MLS) to conduct shaking experiments in the geotechnical centrifuge with an inflight shake table system. By reproducing realistic in-situ regolith properties, the experiment also serves to test penetrator concepts. The shake table system was designed and used for simulation experiments to study effects of earthquakes on terrestrial soil structures. It is mounted on a 15 g-ton geotechnical centrifuge in which the self-weight induced stresses are replicated by testing an n-th scale model in a gravity field which is n times larger than Earth's gravity. A similar concept applies when dealing with lunar prototypes, where the gravity ratio required for proper simulation of lunar gravity effects is that between the centrifugal acceleration and the lunar gravity. Records of lunar seismic tremors, or moonquakes, were obtained. While these records are being prepared for use as the input data to drive the shake table system, records from the El Centro earthquake of 1940 are being used to perform preliminary tests, using a soil container which was previously used for earthquake studies. This container has a laminar construction, with the layers free to slide on each other, so that the soil motion during the simulated earthquake will not be constrained by the otherwise rigid boundaries. The soil model is prepared by pluviating the MLS from a hopper into the laminar container to a depth of 6 in. The container is mounted on the shake table and the centrifuge is operated to generate an acceleration of 10 times Earth's gravity or 60 times the lunar gravity, thus simulating a lunar regolith thickness of 30 ft. The shake table is then operated using the scaled 'moonquake' as the input motion. One or more model moonquakes are used in each experiment, after which the soil is analyzed for its density profile with depth. This is accomplished by removing from the soil bed a column of soil contained within a thin rubber sleeve which has been previously embedded vertically in the soil during pluviation. This column of soil is transferred to a gamma ray device, in which the gamma ray transmission transversely through the soil is measured and compared with standard calibration samples. In this manner, the density profile can be determined. Preliminary results to date are encouraging, and the Center plans to study the effects of duration of shaking, intensity of the shaking motion, and the frequency of the motion.
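
    The scaling arithmetic in this record (10 times Earth's gravity corresponding to roughly 60 times lunar gravity, so that a 6 in soil column represents about 30 ft of regolith) follows directly from the gravity-ratio rule for centrifuge modeling. A minimal sketch, with standard values assumed for Earth and lunar gravity:

```python
# Sketch of the centrifuge scaling arithmetic described above: the gravity
# ratio N is the centrifugal acceleration divided by the gravity being
# simulated, and prototype depth = N x model depth.
G_EARTH = 9.81             # m/s^2
G_MOON = 1.62              # m/s^2 (assumed standard value)

centrifugal_accel = 10 * G_EARTH             # the test spins at 10 Earth gravities
N = centrifugal_accel / G_MOON               # gravity ratio for lunar simulation
model_depth_in = 6.0                         # pluviated soil depth in the container [in]
prototype_depth_ft = N * model_depth_in / 12.0

print(f"gravity ratio N ~ {N:.0f}")                                   # ~60, as quoted
print(f"simulated regolith thickness ~ {prototype_depth_ft:.0f} ft")  # ~30 ft
```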

  14. Seismic isolation of small modular reactors using metamaterials

    NASA Astrophysics Data System (ADS)

    Witarto, Witarto; Wang, S. J.; Yang, C. Y.; Nie, Xin; Mo, Y. L.; Chang, K. C.; Tang, Yu; Kassawara, Robert

    2018-04-01

    Adaptation of metamaterials from micro- and nanometer scales to metastructures at much larger scales offers a new alternative for seismic isolation systems. These new isolation systems, known as periodic foundations, function both as a structural foundation that supports the gravitational weight of the superstructure and as a seismic isolator that isolates the superstructure from incoming seismic waves. Here we describe the application of periodic foundations to the seismic protection of nuclear power plants, in particular small modular reactors (SMR). For this purpose, a large-scale shake table test of a one-dimensional (1D) periodic foundation supporting an SMR building model was conducted. The 1D periodic foundation was designed and fabricated using reinforced concrete and synthetic rubber (polyurethane) materials. The 1D periodic foundation structural system was tested under various input waves, including white noise, stepped sine, and seismic waves, in the horizontal and vertical directions as well as in the torsional mode. The shake table test results show that the 1D periodic foundation can reduce the acceleration response (transmissibility) of the SMR building by up to 90%. In addition, the periodic-foundation-isolated structure also exhibited smaller displacements than the non-isolated SMR building. This study indicates that the challenges faced in developing metastructures can be overcome and that periodic foundations can be applied to isolate the vibration response of engineering structures.
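
    The "up to 90%" figure above is a transmissibility statement: the ratio of the superstructure's acceleration response to the table input. A minimal sketch of that ratio computed from two acceleration records is given below; the signals are synthetic placeholders, not data from the test described.

```python
import numpy as np

# Minimal sketch of the transmissibility measure quoted above: ratio of peak
# superstructure acceleration to peak table acceleration.  Synthetic signals.
rng = np.random.default_rng(1)
table_accel = rng.standard_normal(4000) * 0.2                        # table input [g]
roof_accel = 0.1 * table_accel + 0.02 * rng.standard_normal(4000)    # isolated response [g]

transmissibility = np.max(np.abs(roof_accel)) / np.max(np.abs(table_accel))
reduction = 1.0 - transmissibility
print(f"transmissibility ~ {transmissibility:.2f}  (response reduced by ~{reduction:.0%})")
```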

  15. Successful Demonstration of New Isolated Bridge System at UCB Shaking Table

    Science.gov Websites

    On May 26, 2010, a new isolated bridge system was successfully demonstrated at the PEER Earthquake Simulator Laboratory at UC Berkeley.

  16. Response of Global Navigation Satellite System receivers to known shaking between 0.2 and 20 Hertz

    USGS Publications Warehouse

    Langbein, John; Evans, John R.; Blume, Fredrick; Johanson, Ingrid

    2014-01-01

    Similar to Wang and others (2012), we also examined the GPS displacement records using standard spectral techniques. However, we extended their work by evaluating several models of GNSS receivers using a variety of input frequencies. Because our shake table was limited in acceleration and displacement, we did not attempt to duplicate the strong shaking associated with large-magnitude earthquakes. However, because our shake table could measure the table displacement, we could directly compare the measured GPS displacements with the true displacements.

  17. Operation SNAPPER, Project 3.1. Vulnerability of Parked Aircraft to Atomic Bombs

    DTIC Science & Technology

    1953-02-01

    Portable Calibrator which was used at the Nevada Proving Grounds. The 6-101A consisted of a shake table which generated a sinusoidal motion having a... calibrator was similar to the 6-101A, with the exception that it was smaller and had a fixed shake table amplitude. The calibration procedure was to... mount the accelerometer to be calibrated on the table and shake it at various frequencies. The output of the accelerometer, which was channeled

  18. Structural assessment of highway "N" power substation under earthquake loads.

    DOT National Transportation Integrated Search

    2009-10-01

    In this study, the Highway N Substation was analyzed with a finite element model (FEM) for its vulnerability. The rigid bus and electric switch components were characterized with full scale shake table tests. Each component of the substation wa...

  19. Non-Linear Material Three Degree of Freedom Analysis of Submarine Drydock Blocking System

    DTIC Science & Technology

    1988-05-01

    drydock blocking materials laminates should be used. For example, if laminated oak timbers are judged to be suitable they would exhibit a minimum of... 1/4 of the strength variation of solid timbers. They would not have the inherent defects of large sawn timbers such as grain slope, checks, shakes, and... experiments on this concept have been carried out over the last few years using the shaking table at the Earthquake Engineering Research Center

  20. Dynamics of the McDonnell Douglas Large Scale Dynamic Rig and Dynamic Calibration of the Rotor Balance

    DOT National Transportation Integrated Search

    1994-10-01

    A shake test was performed on the Large Scale Dynamic Rig in the 40- by 80-Foot Wind Tunnel in support of the McDonnell Douglas Advanced Rotor Technology (MDART) Test Program. The shake test identifies the hub modes and the dynamic calibration matrix...

  1. Shake-table testing of a self-centering precast reinforced concrete frame with shear walls

    NASA Astrophysics Data System (ADS)

    Lu, Xilin; Yang, Boya; Zhao, Bin

    2018-04-01

    The seismic performance of a self-centering precast reinforced concrete (RC) frame with shear walls was investigated in this paper. The lateral force resistance was provided by self-centering precast RC shear walls (SPCW), which utilize a combination of unbonded prestressed post-tensioned (PT) tendons and mild steel reinforcing bars for flexural resistance across base joints. The structures concentrated deformations at the bottom joints and the unbonded PT tendons provided the self-centering restoring force. A 1/3-scale model of a five-story self-centering RC frame with shear walls was designed and tested on a shake-table under a series of bi-directional earthquake excitations with increasing intensity. The acceleration response, roof displacement, inter-story drifts, residual drifts, shear force ratios, hysteresis curves, and local behaviour of the test specimen were analysed and evaluated. The results demonstrated that seismic performance of the test specimen was satisfactory in the plane of the shear wall; however, the structure sustained inter-story drift levels up to 2.45%. Negligible residual drifts were recorded after all applied earthquake excitations. Based on the shake-table test results, it is feasible to apply and popularize a self-centering precast RC frame with shear walls as a structural system in seismic regions.

  2. Estimation of Stresses in a Dry Sand Layer Tested on Shaking Table

    NASA Astrophysics Data System (ADS)

    Sawicki, Andrzej; Kulczykowski, Marek; Jankowski, Robert

    2012-12-01

    A theoretical analysis of shaking table experiments simulating the earthquake response of a dry sand layer is presented. The aim of such experiments is to study seismically induced compaction of soil and the resulting settlements. In order to determine the soil compaction, the cyclic stresses and strains should be calculated first. These stresses are caused by the cyclic horizontal acceleration at the base of the soil layer, so it is important to determine the stress field as a function of the base acceleration. This is particularly important for a proper interpretation of shaking table tests, where the base acceleration is controlled but the stresses are hard to measure and can only be deduced. Preliminary experiments have shown that small accelerations do not lead to significant settlements, whilst large accelerations cause phenomena typical of limit states, including a visible appearance of slip lines. All these problems should be well understood for rational planning of experiments, and their analysis is presented in this paper. First, some heuristic considerations about the dynamics of the experimental system are presented. Then, the analysis of boundary conditions, expressed as resultants of the respective stresses, is shown. A particular form of boundary conditions has been chosen which satisfies the macroscopic boundary conditions and the equilibrium equations. Considerations are then presented in order to obtain a statically admissible stress field that does not exceed the Coulomb-Mohr yield condition. Such an approach leads to determination of the limit base accelerations that do not cause a plastic state in the soil. It is shown that larger accelerations lead to an increase in the lateral stresses, and a corresponding method, which may replace complex plasticity analyses, is proposed. It is shown that it is the lateral stress coefficient K0 that controls the statically admissible stress field during the shaking table experiments.
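
    The admissibility check discussed above can be illustrated for a single point in the layer: assemble the vertical stress, the lateral stress through K0, and the shear stress induced by the base acceleration, then compare the mobilised stress ratio with sin(phi) as required by the Coulomb-Mohr condition for cohesionless soil. The numbers below are assumed, not taken from the paper.

```python
import numpy as np

# Sketch of a Coulomb-Mohr admissibility check for a dry sand layer shaken
# horizontally with acceleration a, at depth z.  All numbers illustrative.
gamma = 16.0e3             # unit weight of dry sand [N/m^3]
K0 = 0.45                  # lateral stress coefficient (assumed)
phi = np.radians(33.0)     # friction angle (assumed)
g = 9.81
a = 2.0                    # horizontal base acceleration [m/s^2]
z = 0.5                    # depth below surface [m]

sigma_v = gamma * z                        # vertical stress
sigma_h = K0 * sigma_v                     # lateral stress
tau = (gamma / g) * a * z                  # shear stress from base shaking

# Principal-stress combination of the 2-D stress state (sigma_v, sigma_h, tau)
centre = 0.5 * (sigma_v + sigma_h)
radius = np.hypot(0.5 * (sigma_v - sigma_h), tau)
mobilised = radius / centre                # = (s1 - s3) / (s1 + s3)

print(f"mobilised ratio {mobilised:.2f} vs sin(phi) = {np.sin(phi):.2f}")
print("elastic (statically admissible)" if mobilised <= np.sin(phi)
      else "at/over the Coulomb-Mohr limit")
```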

  3. Application of multivariate analysis and mass transfer principles for refinement of a 3-L bioreactor scale-down model--when shake flasks mimic 15,000-L bioreactors better.

    PubMed

    Ahuja, Sanjeev; Jain, Shilpa; Ram, Kripa

    2015-01-01

    Characterization of manufacturing processes is key to understanding the effects of process parameters on process performance and product quality. These studies are generally conducted using small-scale model systems, and because of the importance of the results derived from them, the small-scale model should be predictive of large scale. Typically, small-scale bioreactors, which are considered superior to shake flasks in simulating large-scale bioreactors, are used as the scale-down models for characterizing mammalian cell culture processes. In this article, we describe a case study in which a cell culture unit operation in bioreactors using one-sided pH control, together with its satellite runs (small-scale runs conducted using the same post-inoculation cultures and nutrient feeds) in 3-L bioreactors and shake flasks, indicated that shake flasks mimicked the large-scale performance better than the 3-L bioreactors did. We detail here how multivariate analysis was used to make the pertinent assessment and to generate the hypothesis for refining the existing 3-L scale-down model. Relevant statistical techniques such as principal component analysis, partial least squares, orthogonal partial least squares, and discriminant analysis were used to identify the outliers and to determine the discriminatory variables responsible for performance differences at different scales. The resulting analysis, in combination with mass transfer principles, led to the hypothesis that the observed similarities between 15,000-L and shake flask runs, and the differences between 15,000-L and 3-L runs, were due to pCO2 and pH values. This hypothesis was confirmed by changing the aeration strategy at the 3-L scale: by reducing the initial sparge rate in the 3-L bioreactor, process performance and product quality data moved closer to those of the large scale. © 2015 American Institute of Chemical Engineers.
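
    A minimal sketch of the first multivariate step described above, PCA on per-batch process descriptors, is given below using scikit-learn. The descriptor set (pCO2, pH, titer, viable cell density) and the data are hypothetical, and the PLS/OPLS-DA and discriminant analyses used in the study are not reproduced; the point is only how loadings flag the variables that separate the scales.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Sketch: run PCA on per-batch process descriptors and inspect the loadings
# to see which variables separate the scales.  Data are synthetic.
rng = np.random.default_rng(0)
# columns: pCO2 [mmHg], pH, titer [g/L], viable cell density [1e6 cells/mL]
large_scale = rng.normal([60, 6.95, 3.0, 12.0], [5, 0.03, 0.2, 1.0], size=(6, 4))
flasks      = rng.normal([62, 6.94, 2.9, 11.5], [5, 0.03, 0.2, 1.0], size=(6, 4))
bench_3L    = rng.normal([35, 7.05, 3.4, 13.5], [5, 0.03, 0.2, 1.0], size=(6, 4))

X = np.vstack([large_scale, flasks, bench_3L])
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))

print("explained variance:", pca.explained_variance_ratio_.round(2))
print("PC1 loadings (pCO2, pH, titer, VCD):", pca.components_[0].round(2))
```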

  4. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
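
    For readers unfamiliar with the underlying product, the hazard curves mentioned above are built by combining each rupture's annual rate with the fraction of its simulated seismograms that exceed a given ground-motion level, then converting the total exceedance rate to a probability over an exposure time. The toy sketch below shows that aggregation for one site with invented rates and intensities; it is not CyberShake code.

```python
import numpy as np

# Toy sketch of hazard-curve aggregation for one site: for each rupture, the
# fraction of its simulated seismograms exceeding a ground-motion level is
# weighted by the rupture's annual rate; a Poisson assumption converts the
# total rate to a probability of exceedance.  All numbers are invented.
rng = np.random.default_rng(0)
annual_rates = np.array([1e-3, 4e-4, 2e-4])               # per rupture [1/yr]
# peak spectral accelerations from each rupture's variations [g]
sims = [rng.lognormal(m, 0.5, size=200) for m in (np.log(0.1), np.log(0.3), np.log(0.5))]

levels = np.logspace(-2, 0, 30)                            # ground-motion levels [g]
rate_exceed = sum(r * np.mean(s[:, None] > levels, axis=0)
                  for r, s in zip(annual_rates, sims))
prob_50yr = 1.0 - np.exp(-rate_exceed * 50.0)              # Poisson, 50-year exposure

i = np.argmin(np.abs(prob_50yr - 0.02))                    # ~2%-in-50-years level
print(f"2%-in-50-year ground motion ~ {levels[i]:.2f} g")
```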

  5. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex, scientific analyses can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  6. SHAKING TABLE TESTS ON SEISMIC DEFORMATION OF PILE SUPPORTED PIER

    NASA Astrophysics Data System (ADS)

    Fujita, Daiki; Kohama, Eiji; Takenobu, Masahiro; Yoshida, Makoto; Kiku, Hiroyoshi

    The seismic deformation characteristics of a pile-supported pier were examined with shake table tests, focusing in particular on the pier after it had deformed during an earthquake. A model based on the similitude of the fully plastic moment in the piles was prepared to examine the deformation and stress characteristics after the fully plastic moment is reached. Moreover, assuming the transportation of emergency supplies and the occurrence of aftershocks in the post-disaster period, the pile-supported pier was loaded with additional weight after reaching the fully plastic moment and then excited with the shaking table. As a result, it was found that the displacement of the pile-supported pier remains comparatively small if the bending strength of the piles does not decrease after the fully plastic moment is reached, owing to the absence of local buckling or to strain hardening.

  7. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

    A full-scale, two-story, two-by-one bay, steel braced-frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter story drift, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds were assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and installed nonstructural systems are investigated.

  8. Development of the monitoring technique on the damage of piles using the biggest shaking table "E-defense"

    NASA Astrophysics Data System (ADS)

    Hayashi, Kazuhiro; Hachimori, Wataru; Kaneda, Shogo; Tamura, Shuji; Saito, Taiki

    2017-10-01

    When a building is damaged by an earthquake, the damage to the superstructure is visible, but damage to the foundation, e.g. the underground piles, is difficult to detect. In this study, the authors aim to develop a monitoring technique for pile damage due to earthquakes. The world's biggest shaking table, E-Defense, was used to reproduce damage to RC pile models embedded in soil inside a large-scale shear box (8 m in diameter and 6.5 m in height). The diameter of the RC pile model was 154 mm. It consisted of mortar (27.2 N/mm^2 in compressive strength), 6 main reinforcing bars (6.35 mm in diameter) and hard steel wire shear reinforcement (2 mm in diameter at 20 mm intervals). The natural period of the superstructure above the pile models is around 0.12 sec. The soil consisted of two layers: the lower layer is Albany sand of 80% relative density, while the upper layer, extending 2 m down from the ground surface, is Kaketsu sand of 60% relative density. Four primary excitations were scaled from the JMA Kobe wave to different amplitudes, with maximum accelerations of 31 gal, 67 gal, 304 gal, and 458 gal, respectively. In the tests, the reinforcing steel at the pile head of the RC model yielded when the maximum acceleration was 304 gal. After that, the mortar at the pile head spalled off and a bending-shear failure occurred when the maximum acceleration was 458 gal. The peak frequency of the rotational spectrum at the foundation did not change while the piles remained in the elastic range; however, the peak frequency fell after the plastic hinge occurred.
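
    The damage indicator used above, the peak frequency of the foundation's rotational spectrum, can be sketched as a simple Fourier-spectrum peak pick applied to each excitation's record. The code below does this for two synthetic records standing in for the pre- and post-hinge states; the sampling rate, frequencies and band limits are assumed for illustration.

```python
import numpy as np

# Sketch of peak-frequency tracking: take the Fourier amplitude spectrum of
# the foundation rotation record for each excitation and report the peak
# frequency; a drop indicates pile damage.  Records below are synthetic.
fs = 200.0                                       # sampling rate [Hz]
t = np.arange(0, 20, 1 / fs)

def peak_frequency(record, fs):
    spec = np.abs(np.fft.rfft(record))
    freqs = np.fft.rfftfreq(record.size, 1 / fs)
    band = (freqs > 0.5) & (freqs < 20.0)        # ignore DC and very high frequencies
    return freqs[band][np.argmax(spec[band])]

rng = np.random.default_rng(2)
intact  = np.sin(2 * np.pi * 8.3 * t) + 0.3 * rng.standard_normal(t.size)   # elastic piles
damaged = np.sin(2 * np.pi * 5.1 * t) + 0.3 * rng.standard_normal(t.size)   # after plastic hinge

for name, rec in [("pre-hinge run", intact), ("post-hinge run", damaged)]:
    print(f"{name}: peak frequency ~ {peak_frequency(rec, fs):.1f} Hz")
```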

  9. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the parallel performance of the TS-AWP on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
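
    The TS-AWP itself is a 3-D, anelastic, staggered-grid, massively parallel code; as a hedged, textbook-level illustration of the finite-difference family it belongs to, the sketch below advances the 1-D elastic wave equation with a second-order scheme and a CFL-limited time step.

```python
import numpy as np

# Textbook 1-D elastic finite-difference toy (second order in space and
# time).  The real TS-AWP is 3-D, anelastic, staggered-grid, and parallel;
# this only illustrates the family of methods.
c = 3000.0                     # wave speed [m/s]
dx = 100.0                     # grid spacing [m]
dt = 0.9 * dx / c              # time step satisfying the CFL condition
nx, nt = 400, 200

u_prev = np.zeros(nx)          # displacement at t - dt
u_curr = np.zeros(nx)          # displacement at t
u_curr[nx // 2] = 1.0          # impulsive source in the middle of the grid

r2 = (c * dt / dx) ** 2
for _ in range(nt):
    u_next = np.zeros(nx)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next            # fixed (zero) boundaries

print("wavefront has travelled ~", c * nt * dt / 1000.0, "km from the source")
```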

  10. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  11. Seismic performance of geosynthetic-soil retaining wall structures

    NASA Astrophysics Data System (ADS)

    Zarnani, Saman

    Vertical inclusions of expanded polystyrene (EPS) placed behind rigid retaining walls were investigated as geofoam seismic buffers to reduce earthquake-induced loads. A numerical model was developed using the program FLAC, and the model was validated against 1-g shaking table test results for EPS geofoam seismic buffer models. Two constitutive models for the component materials were examined: elastic-perfectly plastic with the Mohr-Coulomb (M-C) failure criterion and a non-linear hysteretic damping model with the equivalent linear method (ELM) approach. It was judged that the M-C model was sufficiently accurate for practical purposes. The mechanical property of interest for attenuating dynamic loads using a seismic buffer was the buffer stiffness, defined as K = E/t (E = buffer elastic modulus, t = buffer thickness). For the range of parameters investigated in this study, K ≤ 50 MN/m^3 was observed to be the practical range for the optimal design of these systems. Parametric numerical analyses were performed to generate design charts that can be used for the preliminary design of these systems. A new high-capacity shaking table facility was constructed at RMC that can be used to study the seismic performance of earth structures. Reduced-scale models of geosynthetic reinforced soil (GRS) walls were built on this shaking table and then subjected to simulated earthquake loading conditions. In some shaking table tests, the combined use of EPS geofoam and horizontal geosynthetic reinforcement layers was investigated. Numerical models were developed using the program FLAC together with the ELM and M-C constitutive models. Physical and numerical results were compared against values predicted using analysis methods found in the journal literature and in current North American design guidelines. The comparison shows that current Mononobe-Okabe (M-O) based analysis methods could not consistently and satisfactorily predict the measured reinforcement connection load distributions at all elevations under both static and dynamic loading conditions. The results from the GRS model wall tests with combined EPS geofoam and geosynthetic reinforcement layers show that the inclusion of an EPS geofoam layer behind the GRS wall face can reduce the earth loads acting on the wall facing to values well below those recorded for conventional GRS wall model configurations.
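
    The K = E/t definition and the K ≤ 50 MN/m^3 practical range quoted above make the preliminary buffer screening a one-line calculation. The sketch below runs it for a few candidate EPS grades and thicknesses; the moduli are assumed representative values, not the properties used in the thesis.

```python
# Minimal sketch of the buffer-stiffness screening implied by K = E / t and
# the reported practical limit K <= 50 MN/m^3.  The EPS moduli below are
# assumed, representative values, not the ones used in the thesis.
K_LIMIT = 50e6                     # [N/m^3]

candidates = [
    ("EPS15, t = 0.15 m", 4.0e6, 0.15),    # (label, elastic modulus E [Pa], thickness t [m])
    ("EPS29, t = 0.10 m", 10.0e6, 0.10),
    ("EPS29, t = 0.30 m", 10.0e6, 0.30),
]

for label, E, t in candidates:
    K = E / t
    verdict = "within" if K <= K_LIMIT else "exceeds"
    print(f"{label}: K = {K / 1e6:.0f} MN/m^3 ({verdict} the 50 MN/m^3 guideline)")
```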

  12. Development of 1-D Shake Table Testing Facility for Liquefaction Studies

    NASA Astrophysics Data System (ADS)

    Unni, Kartha G.; Beena, K. S.; Mahesh, C.

    2018-04-01

    One of the major challenges researchers face in the field of earthquake geotechnical engineering in India is the high cost of laboratory infrastructure. Developing a reliable and low-cost experimental setup is attempted in this research. The paper details the design and development of a uniaxial shake table and of a data acquisition system with accelerometers and pore water pressure sensors that can be used for liquefaction studies.
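
    One quantity such a facility's pore-water-pressure channels are typically reduced to is the excess pore pressure ratio, r_u = Δu / σ'v0, with initial liquefaction conventionally flagged as r_u approaches 1. A minimal sketch with invented numbers:

```python
import numpy as np

# Sketch of the liquefaction indicator derived from pore-pressure sensors:
# r_u = excess pore pressure / initial vertical effective stress, with
# r_u -> 1 indicating initial liquefaction.  All numbers are invented.
gamma_sat = 19.0e3           # saturated unit weight of the sand [N/m^3]
gamma_w = 9.81e3             # unit weight of water [N/m^3]
depth = 0.4                  # sensor depth below the soil surface [m]

sigma_v0_eff = (gamma_sat - gamma_w) * depth        # initial effective stress [Pa]

# hypothetical excess pore pressure record during shaking [Pa]
excess_u = np.array([0.0, 0.5e3, 1.5e3, 2.8e3, 3.5e3, 3.65e3])
r_u = excess_u / sigma_v0_eff

print("r_u during the test:", np.round(r_u, 2))
print("initial liquefaction reached" if r_u.max() >= 0.95 else "no liquefaction")
```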

  13. Optimization of gold ore Sumbawa separation using gravity method: Shaking table

    NASA Astrophysics Data System (ADS)

    Ferdana, Achmad Dhaefi; Petrus, Himawan Tri Bayu Murti; Bendiyasa, I. Made; Prijambada, Irfan Dwidya; Hamada, Fumio; Sachiko, Takahi

    2018-04-01

    Most artisanal small-scale gold mining in Indonesia uses the amalgamation method, which has a negative impact on the environment around ore processing areas due to the use of mercury. One of the more environmentally friendly methods for gold processing is the gravity method. The shaking table is a piece of gravity separation equipment used to upgrade the concentrate based on differences in specific gravity. The optimum concentration result is influenced by several variables, such as shaking speed, particle size, and deck slope. In this research, the shaking speed ranged between 100 rpm and 200 rpm, the particle size between -100 + 200 mesh and -200 + 300 mesh, and the deck slope between 3° and 7°. The gold concentration in the concentrate was measured by EDX. The results show that the optimum condition is obtained at a shaking speed of 200 rpm, a deck slope of 7°, and a particle size of -100 + 200 mesh.

  14. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-faulting dimensions are added. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
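
    The letter-grade logic described above can be illustrated with a small decision function keyed to magnitude, finite-fault constraint, and station coverage. This is only a sketch of the qualitative scheme as summarized in the abstract, with invented thresholds; it is not the USGS ShakeMap implementation.

```python
# Illustrative sketch of the qualitative grading logic described above: the
# grade improves when finite-fault dimensions are constrained and when
# ground-motion observations are available, and degrades for large events
# represented only as point sources.  Thresholds are invented; this is NOT
# the USGS implementation.
def shakemap_grade(magnitude, finite_fault_constrained, n_stations):
    if finite_fault_constrained and n_stations >= 50:
        return "A"
    if finite_fault_constrained and n_stations >= 10:
        return "B"
    if magnitude <= 6.0:
        return "C"                             # moderate event, point source adequate
    return "D" if n_stations >= 10 else "F"    # large event, source unconstrained

print(shakemap_grade(5.8, False, 4))     # moderate point-source event -> "C"
print(shakemap_grade(7.1, False, 3))     # large, unconstrained, sparse -> "F"
print(shakemap_grade(7.1, True, 120))    # constrained fault, dense data -> "A"
```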

  15. 117. VIEW, LOOKING NORTHWEST, OF DIESTER MODEL 6 CONCENTRATING (SHAKING) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    117. VIEW, LOOKING NORTHWEST, OF DIESTER MODEL 6 CONCENTRATING (SHAKING) TABLE, USED FOR PRIMARY, MECHANICAL SEPARATION OF GOLD FROM ORE. - Shenandoah-Dives Mill, 135 County Road 2, Silverton, San Juan County, CO

  16. Mass timber rocking panel retrofit of a four-story soft-story building with full-scale shake table validation

    Treesearch

    Pouria Bahmani; John van de Lindt; Asif Iqbal; Douglas Rammer

    2017-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multi-family three- and four-story structures throughout California and the United States. The majority were constructed between 1920 and 1970, with many being prevalent in the San Francisco Bay Area in California. The NEES Soft...

  17. Shake table tests of suspended ceilings to simulate the observed damage in the Ms 7.0 Lushan earthquake, China

    NASA Astrophysics Data System (ADS)

    Wang, Duozhi; Dai, Junwu; Qu, Zhe; Ning, Xiaoqing

    2016-06-01

    Severe damage to suspended ceilings consisting of metal grids and lay-in panels was observed in public buildings during the 2013 Ms 7.0 Lushan earthquake in China. Over the past several years, suspended ceilings have come into widespread use in public buildings throughout China, including government offices, schools and hospitals. To investigate the damage mechanism of suspended ceilings, a series of three-dimensional shake table tests was conducted to reproduce the observed damage. A full-scale reinforced concrete frame was constructed as the testing frame for the ceiling; it was single-story and infilled with brick masonry walls to represent the local construction of low-rise buildings. In general, the ceiling in the tests exhibited damage phenomena similar to the field observations, such as the higher vulnerability of perimeter elements and extensive damage to the cross runners. However, it exhibited lower fragility in terms of the peak ground/roof accelerations at the initiation of damage. Further investigations are needed to clarify the reasons for this behavior.

  18. Shake Test Results and Dynamic Calibration Efforts for the Large Rotor Test Apparatus

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.

    2014-01-01

    Prior to the full-scale wind tunnel test of the UH-60A Airloads rotor, a shake test was completed on the Large Rotor Test Apparatus. The goal of the shake test was to characterize the oscillatory response of the test rig and provide a dynamic calibration of the balance to accurately measure vibratory hub loads. This paper provides a summary of the shake test results, including balance, shaft bending gauge, and accelerometer measurements. Sensitivities to hub mass and angle of attack were investigated during the shake test. Hub mass was found to have an important impact on the vibratory forces and moments measured at the balance, especially near the UH-60A 4/rev frequency. Comparisons were made between the accelerometer data and an existing finite-element model, showing agreement on mode shapes, but not on natural frequencies. Finally, the results of a simple dynamic calibration are presented, showing the effects of changes in hub mass. The results show that the shake test data can be used to correct in-plane loads measurements up to 10 Hz and normal loads up to 30 Hz.

  19. Recovery of PET from packaging plastics mixtures by wet shaking table.

    PubMed

    Carvalho, M T; Agante, E; Durão, F

    2007-01-01

    Recycling requires the separation of materials appearing in a mass of wastes of heterogeneous composition and characteristics into single, almost pure, component/material flows. The separation of materials (e.g., some types of plastics) with similar physical properties (e.g., specific gravity) is often accomplished by human sorting. This is the case for the separation of packaging plastics in municipal solid wastes (MSW). The low cost of virgin plastics and the low value of recycled plastics necessitate the use of low-cost techniques and processes in the recycling of packaging plastics. An experimental study was conducted to evaluate the feasibility of producing a PET product, cleaned of PVC and PS, using a wet shaking table. The wet shaking table is an environmentally friendly process, widely used to separate minerals, which has low capital and operational costs. Some operational variables of the equipment, as well as different feed characteristics, were considered. The results show that the separation of these plastics is feasible, although, as in the mineral field, it requires somewhat complex flow sheets.

  20. Shake Warning: Helping People Stay Safe With Lots of Small Boxes in the Ground to Warn Them About Strong Shaking

    NASA Astrophysics Data System (ADS)

    Reusch, M.

    2017-12-01

    A group of people at schools are joining with the group of people in control of making pictures of the state of rocks on the ground and water in our land. They are working on a plan to help all people be safe in the case of very big ground shaking (when ground breaks in sight or under ground). They will put many small boxes all over the states in the direction of where the sun sets to look for the first shake that might be a sign of an even bigger shake to come. They tell a big computer (with much power) in several large cities in those states. These computers will decide if the first shake is a sign of a very large and close ground shake, a far-away ground shake, a small but close ground shake, or even just a sign of a shake that people wanted to make. If it is a sign of a close and really big shake, then the computers will tell the phones and computers of many people to help them take safe steps before the big shaking arrives where they are. This warning might be several seconds or maybe a couple of minutes. People will be able to hide, take cover, and hold on under tables and desks in case things fall from walls and places up high in their home and work. Doctors will be able to pause hard work and boxes that move people up and down in homes, businesses, and stores will be able to stop on the next floor and open their doors to let people out and not get stuck. It will help slow down trains to be safe and not fly off of the track as well as it will help to shut off water and air that warms homes and is used for when you make food hot. To make this plan become real, people who work for these groups are putting more small boxes in areas where there are not enough and that there are many people. They are also putting small boxes in places where there are no boxes but the big shake might come from that direction. There are problems to get past such as needing many more small boxes, more people to help with this plan, and getting all people who live in these areas to learn what to do when the warning comes about the big shake, but this year there was good news when in month number four they were able to get all of the computers to talk to each other and run the same plan with the same news of the first shaking.

  1. Mathematical modeling and full-scale shaking table tests for multi-curve buckling restrained braces

    NASA Astrophysics Data System (ADS)

    Tsai, C. S.; Lin, Yungchang; Chen, Wenshin; Su, H. C.

    2009-09-01

    Buckling restrained braces (BRBs) have been widely applied in seismic mitigation since they were introduced in the 1970s. However, traditional BRBs have several disadvantages caused by using a steel tube to envelop the mortar that prevents the core plate from buckling, such as complex interfaces between the materials used, uncertain precision, and time consumption during the manufacturing processes. In this study, a new device called the multi-curve buckling restrained brace (MC-BRB) is proposed to overcome these disadvantages. The new device consists of a core plate with multiple neck portions assembled to form multiple energy dissipation segments, an enlarged segment, and lateral support and constraining elements to prevent the BRB from buckling. The enlarged segment, located in the middle of the core plate, can be welded to the lateral support and constraining elements to increase buckling resistance and to prevent them from sliding during earthquakes. Component tests and a series of shaking table tests on a full-scale steel structure equipped with MC-BRBs were carried out to investigate the behavior and capability of this new BRB design for seismic mitigation. The experimental results illustrate that the MC-BRB exhibits stable mechanical behavior under cyclic loading and provides good protection to structures during earthquakes. Also, a mathematical model has been developed to simulate the mechanical characteristics of BRBs.
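
    The abstract states that a mathematical model was developed for the brace's mechanical characteristics but does not reproduce it; as a stand-in, the sketch below implements a generic bilinear kinematic-hardening hysteresis, the simplest model commonly used for brace-like devices. The function name and all parameters are hypothetical and are not taken from the paper.

        def bilinear_hysteresis(disp_history, k1, k2, fy):
            """Force history of a bilinear kinematic-hardening spring.

            k1 : elastic stiffness, k2 : post-yield stiffness, fy : yield force.
            A generic brace sketch, not the MC-BRB formulation from the paper.
            """
            q = fy * (1.0 - k2 / k1)            # offset of the two bounding lines
            force, x_prev, out = 0.0, 0.0, []
            for x in disp_history:
                trial = force + k1 * (x - x_prev)               # elastic predictor
                force = min(max(trial, k2 * x - q), k2 * x + q) # clip to yield bounds
                out.append(force)
                x_prev = x
            return out

    Driving this spring with a cyclic displacement history traces the familiar parallelogram-shaped loops whose enclosed area is the energy dissipated per cycle.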

  2. An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alcik, H. A.; Tanircan, G.; Kaya, Y.

    2015-12-01

    Vast amounts of high quality strong motion data are indispensable inputs for analyses in the fields of geotechnical and earthquake engineering; however, the high cost of installing strong motion systems constitutes the biggest obstacle to their worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research-oriented studies as well as earthquake engineering oriented projects, basically due to the precision obtained in downsized instruments. In this research our primary goal is to ensure the usage of these low-cost instruments in the creation of shake-maps immediately after a strong earthquake. Our second goal is to develop software that will automatically process the real-time data coming from the rapid response network and create the shake-map. For those purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembler language and embedded into the operating system of each instrument to create MiniSEED files with 1-second packets instead of 512-byte packages. The Matlab-based software calculates the strong motion (SM) parameters every second, and they are compared with user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small-scale and affordable rapid response network is created using four MEMS sensors, and the functionality of the software has been tested and validated using shake table tests. The entire system is tested together with a reference sensor under real strong ground motion recordings as well as a series of sine waves with varying amplitude and frequency. The successful realization of this software allowed us to set up a test network at Tekirdağ Province, the closest coastal point to the moderate-size earthquake activity in the Marmara Sea, Turkey.
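
    The abstract describes per-second strong-motion parameters compared against user-defined thresholds with a voting scheme, but does not list the parameters, thresholds or vote weights. The Python sketch below (the original software is Matlab-based) shows one plausible arrangement; every constant, parameter name and the choice of PGA/PGV/CAV as the voted quantities are assumptions for illustration only.

        import numpy as np

        # Hypothetical per-parameter thresholds and vote weights.
        THRESHOLDS = {"pga": 0.05 * 9.81,   # m/s^2
                      "pgv": 0.02,          # m/s
                      "cav": 0.10}          # m/s (cumulative absolute velocity)
        VOTES = {"pga": 2, "pgv": 2, "cav": 1}
        VOTE_TRIGGER = 3

        def sm_parameters(acc, dt):
            """Strong-motion parameters from one second of acceleration (m/s^2)."""
            vel = np.cumsum(acc) * dt      # crude integration to velocity
            vel -= np.mean(vel)            # remove drift within this window
            return {"pga": np.max(np.abs(acc)),
                    "pgv": np.max(np.abs(vel)),
                    "cav": np.sum(np.abs(acc)) * dt}

        def declare_event(acc_window, dt):
            """Return True if the voting system would capture an event."""
            params = sm_parameters(acc_window, dt)
            total = sum(VOTES[k] for k, v in params.items() if v >= THRESHOLDS[k])
            return total >= VOTE_TRIGGER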

  3. GNSS seismometer: Seismic phase recognition of real-time high-rate GNSS deformation waves

    NASA Astrophysics Data System (ADS)

    Nie, Zhaosheng; Zhang, Rui; Liu, Gang; Jia, Zhige; Wang, Dijin; Zhou, Yu; Lin, Mu

    2016-12-01

    High-rate global navigation satellite systems (GNSS) can potentially be used as seismometers to capture short-period instantaneous dynamic deformation waves from earthquakes. However, the performance and seismic phase recognition of the GNSS seismometer in the real-time mode, which plays an important role in GNSS seismology, are still uncertain. By comparing the accuracy and precision of real-time solutions in a shake table test, we found the real-time solutions to be consistent with post-processed solutions and independent of sampling rate. In addition, we analyzed the time series of real-time solutions for shake table tests and recent large earthquakes. The results demonstrated that high-rate GNSS have the ability to retrieve most types of seismic waves, including P-, S-, Love, and Rayleigh waves. The main factor limiting its performance in recording seismic phases is the widely used 1-Hz sampling rate. The noise floor also makes recognition of some weak seismic phases difficult. We concluded that the propagation velocities and paths of seismic waves, macro characteristics of the high-rate GNSS array, spatial traces of seismic phases, and incorporation of seismographs are all useful in helping to retrieve seismic phases from the high-rate GNSS time series.

  4. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record the acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high quality accelerometer while located on controlled shake tables for a variety of tests. The results show that the accelerometer in the smartphone can reproduce the characteristics of the shaking very well, even when the phone is left unsecured on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of other daily activities. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
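
    The abstract mentions an artificial neural network that separates earthquake records from everyday activities but gives no architecture or feature set. The sketch below pairs a few simple waveform features with scikit-learn's MLPClassifier as one possible realization; the feature list, network size and every identifier are assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def waveform_features(acc, dt):
            """A few illustrative features computed from one acceleration record."""
            spectrum = np.abs(np.fft.rfft(acc))
            freqs = np.fft.rfftfreq(len(acc), dt)
            return [np.max(np.abs(acc)),                        # peak amplitude
                    np.std(acc),                                # energy proxy
                    freqs[np.argmax(spectrum)],                 # dominant frequency
                    np.sum(np.abs(np.diff(np.sign(acc)))) / 2]  # zero crossings

        def train_classifier(X, y):
            """X: feature rows from earthquake and human-activity records;
            y: 1 = earthquake, 0 = human activity (labels supplied by the user)."""
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            clf.fit(X, y)
            return clf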

  5. Raspberry Shake- A World-Wide Citizen Seismograph Network

    NASA Astrophysics Data System (ADS)

    Christensen, B. C.; Blanco Chia, J. F.

    2017-12-01

    Raspberry Shake was conceived as an inexpensive plug-and-play solution to satisfy the need for universal, quick and accurate earthquake detections. First launched on Kickstarter's crowdfunding platform in July of 2016, the Raspberry Shake project was funded within hours of the launch date and, by the end of the campaign, reached more than 1000% of its initial funding goal. This demonstrated for the first time that there exists a strong interest in personal seismographs among Makers, Hobbyists and Do-It-Yourselfers. From here, a citizen scientist network was created and it has been growing steadily. The Raspberry Shake network is currently being used in conjunction with publicly available broadband data from the GSN and other state-run seismic networks available through the IRIS, Geoscope and GEOFON data centers to detect and locate earthquakes large and small around the globe. Raspberry Shake looks well positioned to improve local monitoring of earthquakes on a global scale, deepen communities' understanding of earthquakes, and serve as a formidable teaching tool. We present the main results of the project, the current state of the network, and the new Raspberry Shake models that are being built.

  6. Enhanced production of natural yellow pigments from Monascus purpureus by liquid culture: The relationship between fermentation conditions and mycelial morphology.

    PubMed

    Lv, Jun; Zhang, Bo-Bo; Liu, Xiao-Dong; Zhang, Chan; Chen, Lei; Xu, Gan-Rong; Cheung, Peter Chi Keung

    2017-10-01

    Natural yellow pigments produced by submerged fermentation of Monascus purpureus have potential economic value and application in the food industry. In the present study, the relationships among fermentation conditions (in terms of pH and shaking/agitation speed), mycelial morphology and the production of Monascus yellow pigments were investigated in both shake-flask and scale-up bioreactor experiments. In the shake-flask fermentation, the highest yield of the Monascus yellow pigments was obtained at pH 5.0 and a shaking speed of 180 rpm. Microscopic images revealed that these results were associated with the formation of freely dispersed small mycelial pellets with shorter, thicker and multi-branched hyphae. Further investigation indicated that the hyphal diameter was highly correlated with the biosynthesis of the Monascus yellow pigments. In a scaled-up fermentation experiment, a yellow pigment yield of 401 U was obtained in a 200-L bioreactor, which is, to the best of our knowledge, the highest yield reported. The present findings can advance our knowledge of the conditions for enhancing the production of Monascus yellow pigments in submerged fermentation and facilitate large-scale production of these natural pigments. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  7. ARC-2007-ACD07-0073-046

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - Total Luminance Photometer shake test in N-244 (EEL): Metal shake table close up. Shows two units bolted on. The left one is the lens, sensor electronics and photometer sensor. The right is the digital electronics unit for the instrument. The two units, along with their cabling, form one of the LCROSS science instruments.

  8. High-resolution flying-PIV with optical fiber laser delivery

    NASA Astrophysics Data System (ADS)

    Weichselbaum, Noah A.; André, Matthieu A.; Rahimi-Abkenar, Morteza; Manzari, Majid T.; Bardet, Philippe M.

    2016-05-01

    Implementation of non-intrusive optical measurement techniques, such as particle image velocimetry (PIV), in harsh environments requires specialized techniques for introducing controlled laser sheets to the region of interest. Large earthquake shake tables are a particularly challenging environment. Lasers must be mounted away from the table, and the laser sheet has to be delivered precisely and stably to the measurement station. Here, high-power multi-mode step-index fiber optics enable introduction of light from an Nd:YLF pulsed laser to a remote test section. Such lasers are suitable for coupling to optical fibers, which presents a portable, flexible, and safe manner to deliver a PIV light sheet. Best practices for their implementation are reviewed. Particular attention is focused on obtaining a collimated beam of acceptable quality at the output of the fiber. To achieve high spatial resolution, the PIV camera is directly mounted on the moving shake table with care to minimize its vibrations. A special arrangement of PIV planes is deployed for precise in-situ PIV alignment and to monitor and account for residual structure vibrations and beam wandering. The design of the instruments is detailed. Here, an experimental facility for the study of nuclear fuel bundle response to seismic forcing near prototypical conditions is instrumented. Only through integration of a high-resolution flying-PIV system can velocity fields be acquired. Data indicate that in the presence of a mean axial flow, a secondary oscillatory flow develops as the bundle oscillates. Instantaneous, phase-averaged, and fluctuating velocity fields illustrate this phenomenon.

  9. Landslides in the New Madrid seismic zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jibson, R.W.; Keefer, D.K.

    1985-01-01

    During the New Madrid earthquakes of 1811-12, bluffs bordering the Mississippi alluvial plain in the epicentral region underwent large-scale landsliding. Between Cairo, Illinois and Memphis, Tennessee, the authors mapped 221 large landslides of three types: (1) old, eroded, coherent block slides and slumps; (2) old earth flows; and (3) young, fresh slumps that occur only along near-river bluffs and are the only landslides present along such bluffs. Historical accounts and field evidence indicate that most or all old coherent slides and earth flows date to the 1811-12 earthquakes and that the only currently active, large-scale landsliding in the area occurs along bluffs bordering the river. Analysis of old coherent slides and earth flows indicates that landslide distribution is most strongly affected by slope height, but that proximity to the hypocenters of the 1811-12 earthquakes also has a significant effect. Slope-stability analyses of an old coherent slide and an earth flow selected as representative of the principal kinds of landslides present indicate that both were stable in aseismic conditions even when water tables were at their highest possible levels. However, a dynamic Newmark displacement analysis shows that ground shaking such as that in 1811-12 would cause large displacements leading to catastrophic failure in both slides. These results indicate that in large earthquakes, landsliding in much of the study area is likely. Moderate earthquakes may also trigger landslides at some locations.

  10. System identification of timber masonry walls using shaking table test

    NASA Astrophysics Data System (ADS)

    Roy, Timir B.; Guerreiro, Luis; Bagchi, Ashutosh

    2017-04-01

    Dynamic studies are important for the design, repair and rehabilitation of structures. They have played an important role in characterizing the behavior of structures such as bridges, dams and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and Time Domain Decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping and mode shape. The focus of the present research is to study the dynamic characteristics of typical timber masonry walls commonly used in Portugal. For that purpose, a multi-storey structural prototype of such a wall has been tested on a seismic shake table at the National Laboratory for Civil Engineering, Portugal (LNEC). Signal processing has been performed on the output response of the prototype, collected using accelerometers during the shaking table experiment. In the present work, signal processing of the output response, based on the input response, has been done in two ways: FDD and Stochastic Subspace Identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD are formulated and parametric functions for the SSI are computed. Finally, the estimated values from both methods are compared to assess the accuracy of the two techniques.
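
    As a rough illustration of the FDD step described above, the sketch below builds the cross-power spectral density matrix of the measured accelerations and takes its singular value decomposition at each frequency line; peaks of the first singular value indicate natural frequencies and the corresponding singular vectors approximate mode shapes. The function name, the nperseg default and the overall structure are assumptions, not the formulation used by the authors.

        import numpy as np
        from scipy.signal import csd

        def fdd(acc, fs, nperseg=1024):
            """Frequency Domain Decomposition sketch.

            acc : (n_channels, n_samples) array of measured output accelerations
            fs  : sampling frequency in Hz
            Returns frequencies, the first singular value at each frequency
            (peak-pick these for natural frequencies) and the first singular
            vectors (mode shape estimates).
            """
            n_ch = acc.shape[0]
            f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
            G = np.zeros((len(f), n_ch, n_ch), dtype=complex)   # CPSD matrix G(f)
            for i in range(n_ch):
                for j in range(n_ch):
                    _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
            s1 = np.zeros(len(f))
            u1 = np.zeros((len(f), n_ch), dtype=complex)
            for k in range(len(f)):
                U, S, _ = np.linalg.svd(G[k])   # SVD at every frequency line
                s1[k], u1[k] = S[0], U[:, 0]
            return f, s1, u1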

  11. Large-scale culture of a megakaryocytic progenitor cell line with a single-use bioreactor system.

    PubMed

    Nurhayati, Retno Wahyu; Ojima, Yoshihiro; Dohda, Takeaki; Kino-Oka, Masahiro

    2018-03-01

    The increasing application of regenerative medicine has generated a growing demand for stem cells and their derivatives. Single-use bioreactors offer an attractive platform for stem cell expansion owing to their scalability for large-scale production and feasibility of meeting clinical-grade standards. The current work evaluated the capacity of a single-use bioreactor system (1 L working volume) for expanding Meg01 cells, a megakaryocytic (MK) progenitor cell line. Oxygen supply was provided by surface aeration to minimize foaming, and orbital shaking was used to promote oxygen transfer. Oxygen transfer rates (kLa) at shaking speeds of 50, 100, and 125 rpm were estimated to be 0.39, 1.12, and 10.45 h⁻¹, respectively. Shaking speed was a critical factor for optimizing cell growth. At 50 rpm, Meg01 cells exhibited restricted growth due to insufficient mixing. A negative effect occurred when the shaking speed was increased to 125 rpm, likely caused by high hydrodynamic shear stress. The bioreactor culture achieved the highest growth profile when shaken at 100 rpm, achieving a total expansion of up to 5.7-fold with a total cell number of 1.2 ± 0.2 × 10⁹ cells L⁻¹. In addition, cells expanded using the bioreactor system maintained their potency to differentiate along the MK lineage, as analyzed from specific surface proteins and morphological similarity with cells grown in the conventional culturing system. Our study reports the impact of operational variables such as shaking speed on the growth profile and MK differentiation potential of a progenitor cell line in a single-use bioreactor. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:362-369, 2018. © 2017 American Institute of Chemical Engineers.
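
    The abstract reports kLa values at each shaking speed but not how they were estimated. One common approach is the dynamic gassing-out method, in which dissolved oxygen is first stripped and its re-aeration curve is fitted to dC/dt = kLa (C* - C); the sketch below performs that fit. The function name and arguments are illustrative, not taken from the paper.

        import numpy as np

        def kla_from_gassing_out(time_h, dissolved_o2, c_star):
            """Estimate kLa (h^-1) from a re-aeration curve.

            dC/dt = kLa (C* - C)  =>  ln[(C* - C0)/(C* - C(t))] = kLa * t,
            so kLa is the slope of that straight line.
            time_h       : sample times in hours
            dissolved_o2 : measured dissolved oxygen concentrations
            c_star       : saturation concentration under the culture conditions
            """
            dissolved_o2 = np.asarray(dissolved_o2, dtype=float)
            y = np.log((c_star - dissolved_o2[0]) / (c_star - dissolved_o2))
            kla, _ = np.polyfit(np.asarray(time_h, dtype=float), y, 1)
            return kla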

  12. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
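
    To make the hazard-curve step concrete, the sketch below combines per-rupture annual rates from a rupture forecast with the distribution of simulated intensity measures for each rupture, in the spirit of the calculation described above. It is a simplified stand-in for illustration, not CyberShake code; all names and the 50-year example are assumptions.

        import numpy as np

        def hazard_curve(im_levels, rupture_rates, rupture_ims):
            """Annual exceedance rates at a single site.

            im_levels     : intensity-measure values to evaluate (e.g. SA in g)
            rupture_rates : annual occurrence rate of each rupture in the forecast
            rupture_ims   : list, per rupture, of simulated IMs from its variations
            """
            rates = np.zeros(len(im_levels))
            for rate, ims in zip(rupture_rates, rupture_ims):
                ims = np.asarray(ims)
                for k, x in enumerate(im_levels):
                    rates[k] += rate * np.mean(ims > x)   # P(IM > x | rupture)
            return rates

        def prob_of_exceedance(annual_rate, t_years=50.0):
            """Poisson probability of at least one exceedance in t_years."""
            return 1.0 - np.exp(-annual_rate * t_years)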

  13. A new wireless system for decentralised measurement of physiological parameters from shake flasks

    PubMed Central

    Vasala, Antti; Panula, Johanna; Bollók, Monika; Illmann, Lutz; Hälsig, Christian; Neubauer, Peter

    2006-01-01

    Background Shake flasks are widely used because of their low price and simple handling. Many researchers are, however, not aware of the physiological consequences of oxygen limitation and substrate overflow metabolism that occur in shake flasks. The availability of a wireless measuring system opens up possibilities for quality control and the design of cultivation conditions. Results Here we present a new wireless solution for the measurement of pH and oxygen from shake flasks with standard sensors, which allows data transmission over a distance of more than 100 metres in laboratory environments. This new system was applied to the monitoring of cultivation conditions in shake flasks. Real-time monitoring of the growth conditions became possible by simple means. Here we demonstrate that with typical protocols E. coli shake flask cultures run into severe oxygen limitation and the medium is strongly acidified. Additionally, the strength of the new system is demonstrated by continuous monitoring of the oxygen level in methanol-fed Pichia pastoris shake flask cultures, which allows the optimisation of substrate feeding to prevent starvation or methanol overfeed. A 40% higher cell density was obtained by adding methanol when the respiration activity in the cultures decreased, thereby preventing the starvation phases that occur in standard shake flask protocols. Conclusion The wireless system introduced here can read parallel sensor data over long distances from shake flasks that are under vigorous shaking in cultivation rooms or closed incubators. The presented technology allows centralised monitoring of decentralised targets. It is useful for the monitoring of pH and dissolved oxygen in shake flask cultures. It is not limited to standard sensors, but can easily be adapted to new types of sensors and measurement places (e.g., new sensor points in large-scale bioreactors). PMID:16504107

  14. ShakeMapple : tapping laptop motion sensors to map the felt extents of an earthquake

    NASA Astrophysics Data System (ADS)

    Bossu, Remy; McGilvary, Gary; Kamb, Linus

    2010-05-01

    There is a significant pool of untapped sensor resources available in portable computers' embedded motion sensors. Included primarily to detect sudden strong motion in order to park the disk heads and prevent damage to the disks in the event of a fall or other severe motion, these sensors may also be tapped for other uses. We have developed a system that takes advantage of the Apple Macintosh laptops' embedded Sudden Motion Sensors to record earthquake strong motion data and rapidly build maps of where and to what extent an earthquake has been felt. After an earthquake, it is vital to understand the damage caused, especially in urban environments, where damage is often concentrated. Gathering as much information as possible about these impacts, to determine which areas are likely to be most affected, can aid in distributing emergency services effectively. The ShakeMapple system operates in the background, continuously saving the most recent data from the motion sensors. After an earthquake has occurred, the ShakeMapple system calculates the peak acceleration within a time window around the expected arrival and sends that to servers at the EMSC. A map plotting the felt responses is then generated and presented on the web. Because large-scale testing of such an application is inherently difficult, we propose to organize a broadly distributed "simulated event" test. The software will be available for download in April, after which we plan to organize a large-scale test by the summer. At a specified time, participating testers will be asked to create their own strong motion to be registered and submitted by the ShakeMapple client. From these responses, a felt map will be produced representing the broadly-felt effects of the simulated event.
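
    The peak-picking step described above is simple enough to sketch directly: given the sensor time series and the predicted wave arrival at the laptop, take the largest absolute acceleration inside a window around that arrival. The function name and the 30-second half-window are illustrative choices, not values taken from the ShakeMapple system.

        import numpy as np

        def peak_acceleration(times, acc, expected_arrival, half_window=30.0):
            """Peak |acceleration| within a window centred on the expected arrival.

            times, acc       : sample times (s) and accelerations from the motion sensor
            expected_arrival : predicted wave arrival time at the laptop (s)
            half_window      : window half-width in seconds (illustrative value)
            """
            times, acc = np.asarray(times), np.asarray(acc)
            mask = np.abs(times - expected_arrival) <= half_window
            if not np.any(mask):
                return None
            return float(np.max(np.abs(acc[mask])))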

  15. Shake Test Results and Dynamic Calibration Efforts for the Large Rotor Test Apparatus

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.

    2014-01-01

    A shake test of the Large Rotor Test Apparatus (LRTA) was performed in an effort to enhance NASA's capability to measure dynamic hub loads for full-scale rotor tests. This paper documents the results of the shake test as well as efforts to calibrate the LRTA balance system to measure dynamic loads. Dynamic rotor loads are the primary source of vibration in helicopters and other rotorcraft, leading to passenger discomfort and damage due to fatigue of aircraft components. There are novel methods being developed to reduce rotor vibrations, but measuring the actual vibration reductions on full-scale rotors remains a challenge. In order to measure rotor forces on the LRTA, a balance system in the non-rotating frame is used. The forces at the balance can then be translated to the hub reference frame to measure the rotor loads. Because the LRTA has its own dynamic response, the balance system must be calibrated to include the natural frequencies of the test rig.

  16. Accelerated Fermentation of Brewer's Wort by Saccharomyces carlsbergensis

    PubMed Central

    Porter, Sookie C.

    1975-01-01

    A rapid procedure for wort fermentation with Saccharomyces carlsbergensis at 12°C is described. Fermentation time was reduced from 7 to 4 days with normal inoculum by shaking. Increasing the inoculation to 5 to 10 times normal and shaking resulted in complete fermentation in 3 days. Maximum yeast population was reached rapidly with the large inocula, but fermentation proceeded at approximately the same rate when inoculations in excess of four times the normal were used. Similar results were obtained with both small-scale (100 ml) and microbrew (2.4 liters) fermentations. PMID:16350046

  17. Failure behavior of concrete pile and super-structure dynamic response as a result of soil liquefaction during earthquake

    NASA Astrophysics Data System (ADS)

    Kaneda, Shogo; Hayashi, Kazuhiro; Hachimori, Wataru; Tamura, Shuji; Saito, Taiki

    2017-10-01

    In past earthquake disasters, numerous building structure piles were damaged by soil liquefaction occurring during the earthquake. Damage to these piles, because they are underground, is difficult to find. The authors aim to develop a monitoring method for pile damage based on the superstructure dynamic response. This paper investigated the relationship between the damage of large cross-section cementitious piles and the dynamic response of the superstructure using a centrifuge test apparatus. The dynamic specimen used simple cross-section pile models consisting of aluminum rods and mortar, saturated soil (Toyoura sand) at a relative density of 40%, and a superstructure model with a natural period of 0.63 s. In the shaking table test under a 50G field (length scale of 1/50), the excitation consisted of a total of 3 motions scaled from the Rinkai wave at different amplitudes. The maximum acceleration of each of the excitations was 602 gal, 336 gal and 299 gal. The centrifuge test demonstrated the liquefaction of saturated soil and the failure behavior of the piles. The test results showed that the damage to the piles affected the predominant period of the acceleration response spectrum at the footing of the superstructure.

  18. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.

    2011-12-01

    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current Federally sponsored tools- the USGS hazard maps and ShakeMap, and FEMA HAZUS- were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology to define earthquake shaking hazards, rather than statistics. It follows theoretical and computational developments made over the past 20 years, to capitalize on detailed and specific local data sets to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's very first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt® ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite affecting only the upper 30 meters, the Vs30 geotechnical shear-velocity from the Parcel Map shows clear effects on 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map on even the 0.1-Hz waves is prominent despite the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are highly dependent on the particular scenario. As well, Parcel Map amplification effects extend into areas not characterized in the Parcel Map. The fully 3-d Next-Level ShakeZoning scenarios show many areas of shaking amplification and de-amplification that USGS ShakeMap scenarios cannot predict. For example, the Frenchman Mountain scenario shows PGV of the two approaches within 15% of each other near the source, but upwards of 200% relative amplification or de-amplification, depending on location, throughout Las Vegas Valley.

  19. High cell density media for Escherichia coli are generally designed for aerobic cultivations – consequences for large-scale bioprocesses and shake flask cultures

    PubMed Central

    Soini, Jaakko; Ukkonen, Kaisa; Neubauer, Peter

    2008-01-01

    Background For the cultivation of Escherichia coli in bioreactors, trace element solutions are generally designed for optimal growth under aerobic conditions. They normally do not contain selenium or nickel, and molybdenum is contained in only a few of them. These elements are part of the formate hydrogen lyase (FHL) complex which is induced under anaerobic conditions. As it is generally known that oxygen limitation appears in shake flask cultures and locally in large-scale bioreactors, the function of the FHL complex may influence the process behaviour. Formate has been described to accumulate in large-scale cultures and may have toxic effects on E. coli. Although the anaerobic metabolism of E. coli is well studied, reference data which estimate the impact of the FHL complex on bioprocesses of E. coli with oxygen limitation have so far not been published, but are important for a better process understanding. Results Two sets of fed-batch cultures with conditions triggering oxygen limitation and formate accumulation were performed. Permanent oxygen limitation, which is typical for shake flask cultures, was caused in a bioreactor by reduction of the agitation rate. Transient oxygen limitation, which has been described to eventually occur in the feed-zone of large-scale bioreactors, was mimicked in a two-compartment scale-down bioreactor consisting of a stirred tank reactor and a plug flow reactor (PFR) with continuous glucose feeding into the PFR. In both models, formate accumulated up to about 20 mM in the culture medium without addition of selenium, molybdenum and nickel. By addition of these trace elements the formate accumulation decreased below the level observed in well-mixed laboratory-scale cultures. Interestingly, addition of the extra trace elements caused accumulation of large amounts of lactate and reduced biomass yield in the simulator with permanent oxygen limitation, but not in the scale-down two-compartment bioreactor. Conclusion The accumulation of formate in oxygen-limited cultivations of E. coli can be fully prevented by addition of the trace elements selenium, nickel and molybdenum, which are necessary for the function of the FHL complex. For large-scale cultivations, if glucose gradients are likely, the results from the two-compartment scale-down bioreactor indicate that the addition of the extra trace elements is beneficial. No negative effects on the biomass yield or on any other bioprocess parameters could be observed in cultures with the extra trace elements when the cells were repeatedly exposed to transient oxygen limitation. PMID:18687130

  20. SHAKING TABLE TEST AND EFFECTIVE STRESS ANALYSIS ON SEISMIC PERFORMANCE WITH SEISMIC ISOLATION RUBBER TO THE INTERMEDIATE PART OF PILE FOUNDATION IN LIQUEFACTION

    NASA Astrophysics Data System (ADS)

    Uno, Kunihiko; Otsuka, Hisanori; Mitou, Masaaki

    Pile foundations are heavily damaged at the boundary between liquefied and non-liquefied ground during an earthquake, and there is a possibility of pile collapse. In this study, we conduct a shaking table test and effective stress analysis of the influence of soil liquefaction and the seismic inertial force exerted on the pile foundation. When the intermediate part of the pile, located at this boundary, is subjected to section forces, these forces in certain instances become larger than those at the pile head. Further, we develop a seismic resistance method for pile foundations in liquefiable ground using seismic isolation rubber, and show that the mid-pile seismic isolation system is very effective.

  1. Finite element modeling of a shaking table test to evaluate the dynamic behaviour of a soil-foundation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abate, G.; Massimino, M. R.; Maugeri, M.

    The deep investigation of soil-foundation interaction behaviour during earthquakes represents one of the key points for the proper seismic design of structures, so that they can truly behave well during earthquakes, avoiding dangerous conditions such as weak foundations supporting the superstructure. The paper presents the results of the FEM modeling of a shaking table test involving a concrete shallow foundation resting on a Leighton Buzzard sand deposit. The numerical simulation is performed using a cap-hardening elasto-plastic constitutive model for the soil and specific soil-foundation contacts to allow slipping and up-lifting phenomena. Thanks to the comparison between experimental and numerical results, the power and the limits of the proposed numerical model are highlighted. Some aspects of the dynamic soil-foundation interaction are also pointed out.

  2. The boundary conditions for simulations of a shake-table experiment on the seismic response of 3D slope

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Cong, Shengyi; Ling, Xianzhang; Ju, Nengpan

    2017-01-01

    Boundary conditions can significantly affect a slope's behavior under strong earthquakes. To evaluate the importance of boundary conditions for finite element (FE) simulations of a shake-table experiment on the slope response, a validated three-dimensional (3D) nonlinear FE model is presented, and the numerical and experimental results are compared. For that purpose, the robust graphical user-interface "SlopeSAR", based on the open-source computational platform OpenSees, is employed, which simplifies the effort-intensive pre- and post-processing phases. The mesh resolution effect is also addressed. A parametric study is performed to evaluate the influence of boundary conditions on the FE model, involving the boundary extent and three types of boundary conditions at the end faces. Generally, variations in the boundary extent produce inconsistent slope deformations. For the two end faces, fixing the y-direction displacement is not appropriate for simulating the shake-table experiment, in which the end walls are rigid and rough. In addition, the length of the 3D slope's top face and the width of the slope play an important role in the difference between two types of boundary conditions at the end faces (fixing the y-direction displacement and fixing the (y, z) direction displacement). Overall, this study highlights that the assessment of a comparison between a simulation and an experimental result should be performed with due consideration of the effect of the boundary conditions.

  3. Shaking Table Experiment of Trampoline Effect

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Kunugi, T.; Fujiwara, H.

    2010-12-01

    It has been widely thought that the soil response to ground shaking does not produce asymmetry in ground motion. An extreme vertical acceleration near four times gravity was recorded during the 2008 Iwate-Miyagi earthquake at IWTH25 station. This record is distinctly asymmetric in shape; the waveform envelope amplitude is about 1.6 times larger in the upward direction than in the downward direction. To explain this phenomenon, Aoi et al. (2008) proposed a simple model of a mass bouncing on a trampoline. In this study we perform a shaking table experiment on a soil prototype to try to reproduce the asymmetric ground motion and to investigate the physics of this asymmetric behavior. A soil chamber made of an acrylic resin cylinder, 200 mm in diameter and 500 mm in height, was tightly anchored to the shaking table and shaken vertically. We used four different sample materials: Toyoura standard sand, glass beads (particle sizes of 0.1 and 0.4 mm) and sawdust. Each sample was uniformly stacked to a depth of 450 mm and, to measure the vertical motions, accelerometers were installed inside the material (at depths of 50, 220, and 390 mm) and on the frame of the chamber. Pictures were taken from the side by a high speed camera (1000 frames/sec) to capture the motions of the particles. The chamber was shaken by sinusoidal waves (5, 10, and 20 Hz) with maximum amplitudes from 0.1 to 4.0 g. When the accelerations roughly exceeded gravity, granular behavior of the sample materials became dominant for all samples and the asymmetric motions were successfully reproduced. Pictures taken by the high speed camera showed that the motions of the particles are clearly different from the motion of the chamber, which is identical to the sinusoidal motion of the shaking table (input motion). Particles are rapidly flung up and freely pulled down by gravity, and the downward motion of the particles is slower than the upward motion. It was also observed that the timing difference of the falling motions indicates a dependence on depth. Our results show that the time histories recorded by the accelerometers within the sample become increasingly different from the input sinusoidal wave for sensors at shallower depths. When sand or glass beads are used as the fill material, the observed waveforms under large accelerations are the summation of a warped sine-like function and one or a few sharp pulses, which might be caused by the shocks generated by the 'landing' of the free-falling material. For sawdust, the observed waveforms have much smoother shapes which are also asymmetric: larger and narrower in the upward direction and smaller and broader in the downward direction. The waveforms of the sawdust experiments differ from the sand and glass bead cases mainly because of the different elastic deformation characteristics of each material. The impacts of the 'landing' are reduced by the resilience of the sawdust and the sharp pulses become blunt. Our experiments show that, of all the tested materials, sawdust reproduces waveforms most similar to the observed asymmetric waveform at IWTH25. This shows that both granularity and elasticity may play an important role when the vertical ground motions become asymmetric.
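
    A simple way to quantify the asymmetry described above is to compare the upward and downward envelope amplitudes of the vertical acceleration record; values near 1 indicate symmetric shaking, while IWTH25-like records give roughly 1.6. The measure below (99th percentiles of the positive and negative excursions) is an illustrative choice, not necessarily the one used by Aoi et al. (2008).

        import numpy as np

        def vertical_asymmetry_ratio(acc):
            """Ratio of upward to downward envelope amplitude of a vertical record."""
            acc = np.asarray(acc, dtype=float)
            up = acc[acc > 0.0]          # upward excursions
            down = -acc[acc < 0.0]       # downward excursions (made positive)
            return np.percentile(up, 99) / np.percentile(down, 99)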

  4. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are largely unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely the NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined, at both national and local scales, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to HPC dedicated clusters up to Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large scale hazard maps of ground shaking at bedrock, to the definition of local scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  5. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
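
    The simulation exercise described above can be illustrated with a toy model: earthquakes at a site occur as a Poisson process, each produces shaking drawn from a lognormal distribution standing in for a GMPE, and the "observed" peaks are compared with a mapped value. All parameter values below are illustrative, not those used in the study.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_exceedances(t_years, annual_rate, mapped_value,
                                 median_log10_shaking=-1.0, sigma_log10=0.3):
            """Count exceedances of a mapped shaking value in one simulated history."""
            n_events = rng.poisson(annual_rate * t_years)
            shaking = 10.0 ** rng.normal(median_log10_shaking, sigma_log10, n_events)
            return int(np.sum(shaking > mapped_value)), int(n_events)

        # e.g. a 50-year window at a low-rate continental-interior site
        print(simulate_exceedances(t_years=50, annual_rate=0.2, mapped_value=0.2))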

  6. ARC-2007-ACD07-0073-050

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - Total Luminance Photometer lens and electronics units on shake table in N-2444 EEL Laboratory: Kim Ennico and Gi Kojima check electronics

  7. Experimental and numerical investigations of higher mode effects on seismic inelastic response of reinforced concrete shear walls

    NASA Astrophysics Data System (ADS)

    Ghorbanirenani, Iman

    This thesis presents two experimental programs together with companion numerical studies that were carried out on reinforced concrete shear walls: static tests and dynamic (shake table) tests. The first series of experiments were monotonic and cyclic quasi-static testing on ductile reinforced concrete shear wall specimens designed and detailed according to the seismic provisions of NBCC 2005 and CSA-A23.3-04 standard. The tests were carried out on full-scale and 1:2.37 reduced scale wall specimens to evaluate the seismic design provisions and similitude law and determine the appropriate scaling factor that could be applied for further studies such as dynamic tests. The second series of experiments were shake table tests conducted on two identical 1:2.33 scaled, 8-storey moderately ductile reinforced concrete shear wall specimens to investigate the effects of higher modes on the inelastic response of slender walls under high frequency ground motions expected in Eastern North America. The walls were designed and detailed according to the seismic provisions of NBCC 2005 and CSA-A23.3-04 standard. The objectives were to validate and understand the inelastic response and interaction of shear, flexure and axial loads in plastic hinge zones of the walls considering the higher mode effects and to investigate the formation of second hinge in upper part of the wall due to higher mode responses. Second mode response significantly affected the response of the walls. This caused inelastic flexural response to develop at the 6th level with approximately the same rotation ductility compared to that observed at the base. Dynamic amplification of the base shear forces was also observed in both walls. Numerical modeling of these two shake table tests was performed to evaluate the test results and validate current modeling approaches. Nonlinear time history analyses were carried out by the reinforced concrete fibre element (OpenSees program) and finite element (VecTor2 program) methods using the shake table feedback signals as input. Good agreement was generally obtained between numerical and experimental results. Both computer programs were able to predict the natural frequency of the walls in the undamaged and damaged conditions. Both modeling techniques could predict that the maximum bending moment at the base of the walls reached the actual wall moment capacity. The inelastic response and the dual plastic hinge behaviour of the walls could be adequately reproduced using the fibre element and finite element analysis programs. The fibre element method is a good alternative in terms of computing time. It produces reasonable results in comparison with the finite element method, although particular attention needs to be given to the selection of the damping ratios. The different parametric analyses performed in this thesis showed that, for both models, adding a small amount of global viscous damping in combination with a refined reinforced concrete hysteretic model could predict better the seismic behaviour of the tested structures. For the VecTor2 program, a viscous damping of 1% led to reasonable results for the studied RC walls. For the OpenSees program, 2% damping resulted in a good match between test and predictions for the 100% EQ test on the initially undamaged wall. When increasing the earthquake intensities, the damping had to be reduced between 1.5% and 1% to achieve good results for a damaged wall with elongated vibration periods. 
According to the experimental results and numerical analyses on reinforced concrete shear walls subjected to ground motions from Eastern North America earthquakes, there is a high possibility of a second plastic hinge forming in the upper part of walls in addition to the one assumed in design at the base. This second hinge could dissipate the earthquake energy more effectively and decrease the force demand on the wall. A dual plastic hinge design approach in which the structures become plastic in the upper wall segment as well as at the base could therefore be more appropriate. Preliminary design recommendations considering higher mode effects on dual hinge response and base shear forces for ductile slender shear walls are given in this thesis. (Abstract shortened by UMI.)

  8. Scale-up from shake flasks to bioreactor, based on power input and Streptomyces lividans morphology, for the production of recombinant APA (45/47 kDa protein) from Mycobacterium tuberculosis.

    PubMed

    Gamboa-Suasnavart, Ramsés A; Marín-Palacio, Luz D; Martínez-Sotelo, José A; Espitia, Clara; Servín-González, Luis; Valdez-Cruz, Norma A; Trujillo-Roldán, Mauricio A

    2013-08-01

    Culture conditions in shake flasks affect filamentous Streptomyces lividans morphology, as well as the productivity and O-mannosylation of the recombinant Ala-Pro-rich O-glycoprotein (known as the 45/47 kDa or APA antigen) from Mycobacterium tuberculosis. In order to scale up from previously reported shake flasks to a bioreactor, data from the literature on the effect of agitation on the morphology of Streptomyces strains were used to obtain gassed volumetric power input values intended to reproduce, in the bioreactor, the S. lividans morphology previously reported by our group in coiled/baffled shake flasks. The morphology of S. lividans was successfully scaled up, with similar mycelial sizes at both scales: diameters of 0.21 ± 0.09 mm in baffled and coiled shake flasks, and 0.15 ± 0.01 mm in the bioreactor. Moreover, the specific growth rate was successfully scaled up (0.09 ± 0.02 and 0.12 ± 0.01 h(-1) for bioreactors and flasks, respectively), as was the recombinant protein productivity measured by densitometry. More interestingly, the quality of the recombinant glycoprotein, measured as the amount of mannoses attached to the C-terminal of APA, was also scaled up, with up to five mannose residues in cultures carried out in shake flasks and six in the bioreactor. However, the final biomass concentration was not similar, indicating that although the process can be scaled up using the power input, other factors like the oxygen transfer rate, tip speed or energy dissipation/circulation function can influence bacterial metabolism.
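
    A minimal sketch of the constant power-per-volume scale-up rule underlying the abstract above: the ungassed impeller power draw is P = Np·ρ·N³·D⁵, and the agitation speed of the larger vessel is chosen to reproduce a target P/V. All numbers below (power number, vessel size, target P/V) are hypothetical, and the study itself used gassed power input, which requires an additional gassing correction not shown here.

```python
def ungassed_power(Np, rho, N, D):
    """Ungassed impeller power draw P = Np * rho * N^3 * D^5 (W)."""
    return Np * rho * N**3 * D**5

# Hypothetical bioreactor geometry and target power input per volume.
Np, rho = 5.0, 1000.0      # impeller power number, broth density (kg/m^3)
V2, D2 = 0.007, 0.06       # 7 L working volume (m^3), impeller diameter (m)
PV_target = 500.0          # target power input per volume (W/m^3)

# Impeller speed that reproduces the target P/V in the larger vessel
N2 = (PV_target * V2 / (Np * rho * D2**5)) ** (1.0 / 3.0)
print(f"required speed ~ {N2:.1f} 1/s = {N2*60:.0f} rpm")
print(f"check: P/V = {ungassed_power(Np, rho, N2, D2)/V2:.0f} W/m^3")
```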

  9. ARC-2007-ACD07-0073-048

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - Total Luminance Photometer lens and electronics units on shake table in N-2444 EEL Laboratory with (l) Gi Kojima (bk - middle) Damon Flansburg (r) Dana Lynch

  10. Occupant Motion Sensors : Rotational Accelerometer Development

    DOT National Transportation Integrated Search

    1972-04-01

    A miniature mouthpiece rotational accelerometer has been developed to measure the angular acceleration of a head during vehicle crash or impact conditions. The device has been tested in the laboratory using a shake table and in the field using dummie...

  11. iShake: Mobile Phones as Seismic Sensors (Invited)

    NASA Astrophysics Data System (ADS)

    Dashti, S.; Reilly, J.; Bray, J. D.; Bayen, A. M.; Glaser, S. D.; Mari, E.

    2010-12-01

    Emergency responders must “see” the effects of an earthquake clearly and rapidly so that they can respond effectively to the damage it has produced. Great strides have been made recently in developing methodologies that deliver rapid and accurate post-earthquake information. However, shortcomings still exist. The iShake project is an innovative use of cell phones and information technology to bridge the gap between the high quality, but sparse, ground motion instrument data that are used to help develop ShakeMap and the low quality, but large quantity, human observational data collected to construct a “Did You Feel It?” (DYFI)-based map. Rather than using people as measurement “devices” as is being done through DYFI, the iShake project is using their cell phones to measure ground motion intensity parameters and automatically deliver the data to the U.S. Geological Survey (USGS) for processing and dissemination. In this participatory sensing paradigm, quantitative shaking data from numerous cellular phones will enable the USGS to produce shaking intensity maps more accurately than presently possible. The phone sensor, however, is an imperfect device with performance variations among phones of a given model as well as between models. The sensor is the entire phone, not just the micro-machined transducer inside. A series of 1-D and 3-D shaking table tests were performed at UC San Diego and UC Berkeley, respectively, to evaluate the performance of a class of cell phones. In these tests, seven iPhones and iPod Touch devices that were mounted at different orientations were subjected to 124 earthquake ground motions to characterize their response and reliability as seismic sensors. The testing also provided insight into the seismic response of unsecured and falling instruments. The cell phones measured seismic parameters such as peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), and 5% damped spectral accelerations well. In general, iPhone and iPod Touch sensors slightly over-estimated ground motion energy (i.e., Arias Intensity, Ia). However, the mean acceleration response spectrum of the seven iPhones compared remarkably well with that of the reference high quality accelerometers. The error in the recorded intensity parameters was dependent on the characteristics of the input ground motion, particularly its PGA and Ia, and increased for stronger motions. The use of a high-friction device cover (e.g., rubber iPhone covers) on unsecured phones yielded substantially improved data by minimizing independent phone movement. Useful information on the ground motion characteristics was even extracted from unsecured phones during intense shaking events. The insight gained from these experiments is valuable in distilling information from a large number of imperfect signals from phones that may not be rigidly connected to the ground. With these ubiquitous measurement devices, a more accurate and rapid portrayal of the damage distribution during an earthquake can be provided to emergency responders and to the public.
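
    A minimal sketch of how two of the intensity measures named above, PGA and Arias intensity, are computed from an acceleration record; the record below is synthetic and purely illustrative, not iShake test data.

```python
import numpy as np

def pga(acc):
    """Peak ground acceleration, in the same units as the input record."""
    return np.max(np.abs(acc))

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity Ia = pi/(2g) * integral(a^2 dt); acc in m/s^2, dt in s."""
    return np.pi / (2.0 * g) * np.trapz(acc**2, dx=dt)

# Synthetic example record: 20 s of decaying 2 Hz shaking sampled at 100 Hz
dt = 0.01
t = np.arange(0.0, 20.0, dt)
acc = 0.5 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)   # m/s^2

print(f"PGA = {pga(acc):.3f} m/s^2, Ia = {arias_intensity(acc, dt):.4f} m/s")
```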

  12. Effects of the March 1964 Alaska earthquake on glaciers: Chapter D in The Alaska earthquake, March 27, 1964: effects on hydrologic regimen

    USGS Publications Warehouse

    Post, Austin

    1967-01-01

    The 1964 Alaska earthquake occurred in a region where there are many hundreds of glaciers, large and small. Aerial photographic investigations indicate that no snow and ice avalanches of large size occurred on glaciers despite the violent shaking. Rockslide avalanches extended onto the glaciers in many localities, seven very large ones occurring in the Copper River region 160 kilometers east of the epicenter. Some of these avalanches traveled several kilometers at low gradients; compressed air may have provided a lubricating layer. If long-term changes in glaciers due to tectonic changes in altitude and slope occur, they will probably be very small. No evidence of large-scale dynamic response of any glacier to earthquake shaking or avalanche loading was found in either the Chugach or Kenai Mountains 16 months after the 1964 earthquake, nor was there any evidence of surges (rapid advances) as postulated by the Earthquake-Advance Theory of Tarr and Martin.

  13. Citizen sensors for SHM: use of accelerometer data from smartphones.

    PubMed

    Feng, Maria; Fukuda, Yoshio; Mizuta, Masato; Ozer, Ekin

    2015-01-29

    Ubiquitous smartphones have created a significant opportunity to form a low-cost wireless Citizen Sensor network and produce big data for monitoring structural integrity and safety under operational and extreme loads. Such data are particularly useful for rapid assessment of structural damage in a large urban setting after a major event such as an earthquake. This study explores the utilization of smartphone accelerometers for measuring structural vibration, from which structural health and post-event damage can be diagnosed. Widely available smartphones are tested under sinusoidal wave excitations with frequencies in the range relevant to civil engineering structures. Large-scale seismic shaking table tests, observing input ground motion and response of a structural model, are carried out to evaluate the accuracy of smartphone accelerometers under operational, white-noise and earthquake excitations of different intensity. Finally, the smartphone accelerometers are tested on a dynamically loaded bridge. The extensive experiments show satisfactory agreements between the reference and smartphone sensor measurements in both time and frequency domains, demonstrating the capability of the smartphone sensors to measure structural responses ranging from low-amplitude ambient vibration to high-amplitude seismic response. Encouraged by the results of this study, the authors are developing a citizen-engaging and data-analytics crowdsourcing platform towards a smartphone-based Citizen Sensor network for structural health monitoring and post-event damage assessment applications.
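
    One simple way to compare smartphone and reference records in the frequency domain is to extract the dominant spectral peak, a crude estimate of the monitored structure's fundamental frequency. The sketch below uses a synthetic record and is not the authors' processing pipeline.

```python
import numpy as np

def dominant_frequency(acc, fs):
    """Frequency (Hz) of the largest amplitude-spectrum peak, a crude
    estimate of a structure's fundamental frequency."""
    acc = acc - np.mean(acc)                 # remove the sensor's DC offset
    spectrum = np.abs(np.fft.rfft(acc))
    freqs = np.fft.rfftfreq(len(acc), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic 1.5 Hz structural vibration sampled at a phone-like 100 Hz
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
acc = 0.05 * np.sin(2 * np.pi * 1.5 * t) + 0.01 * np.random.randn(t.size)

print(f"identified frequency ~ {dominant_frequency(acc, fs):.2f} Hz")
```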

  14. ARC-2007-ACD07-0073-052

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - Total Luminance Photometer lens and electronics units on shake table in N-2444 EEL Laboratory: Gi Kojima, Dana Lynch and Lynn Hofland check electronics. A data analyzer is in the foreground.

  15. Application of laser scanning technique in earthquake protection of Istanbul's historical heritage buildings

    NASA Astrophysics Data System (ADS)

    Çaktı, Eser; Ercan, Tülay; Dar, Emrullah

    2017-04-01

    Istanbul's vast historical and cultural heritage is under constant threat of earthquakes. Historical records report repeated damages to the city's landmark buildings. Our efforts towards earthquake protection of several buildings in Istanbul involve earthquake monitoring via structural health monitoring systems, linear and non-linear structural modelling and analysis in search of past and future earthquake performance, shake-table testing of scaled models and non-destructive testing. More recently we have been using laser technology in monitoring structural deformations and damage in five monumental buildings which are Hagia Sophia Museum and Fatih, Sultanahmet, Süleymaniye and Mihrimah Sultan Mosques. This presentation is about these efforts with special emphasis on the use of laser scanning in monitoring of edifices.

  16. 40 CFR 440.141 - Specialized definitions and provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... shaking tables. (7) “Infiltration water” means that water which permeates through the earth into the plant... drainage, and infiltration and drainage waters which commingle with mine drainage or waters resulting from... increase in volume from precipitation or infiltration, plus the maximum volume of water runoff resulting...

  17. 40 CFR 440.141 - Specialized definitions and provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., hydrocyclones, or shaking tables. (7) “Infiltration water” means that water which permeates through the earth... drainage, and infiltration and drainage waters which commingle with mine drainage or waters resulting from... increase in volume from precipitation or infiltration, plus the maximum volume of water runoff resulting...

  18. 40 CFR 440.141 - Specialized definitions and provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., hydrocyclones, or shaking tables. (7) “Infiltration water” means that water which permeates through the earth... drainage, and infiltration and drainage waters which commingle with mine drainage or waters resulting from... increase in volume from precipitation or infiltration, plus the maximum volume of water runoff resulting...

  19. 40 CFR 440.141 - Specialized definitions and provisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., hydrocyclones, or shaking tables. (7) “Infiltration water” means that water which permeates through the earth... drainage, and infiltration and drainage waters which commingle with mine drainage or waters resulting from... increase in volume from precipitation or infiltration, plus the maximum volume of water runoff resulting...

  20. 40 CFR 440.141 - Specialized definitions and provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... shaking tables. (7) “Infiltration water” means that water which permeates through the earth into the plant... drainage, and infiltration and drainage waters which commingle with mine drainage or waters resulting from... increase in volume from precipitation or infiltration, plus the maximum volume of water runoff resulting...

  1. Evaluation of ground motion scaling methods for analysis of structural systems

    USGS Publications Warehouse

    O'Donnell, A. P.; Beltsar, O.A.; Kurama, Y.C.; Kalkan, E.; Taflanidis, A.A.

    2011-01-01

    Ground motion selection and scaling is undoubtedly the most important component of any seismic risk assessment study that involves time-history analysis. Ironically, this is also the single parameter with the least guidance provided in current building codes, resulting in the use of mostly subjective choices in design. The relevant research to date has been primarily on single-degree-of-freedom systems, with only a few studies using multi-degree-of-freedom systems. Furthermore, the previous research is based solely on numerical simulations with no experimental data available for the validation of the results. By contrast, the research effort described in this paper focuses on an experimental evaluation of selected ground motion scaling methods based on small-scale shake-table experiments on re-configurable linear-elastic and nonlinear multi-story building frame structure models. Ultimately, the experimental results will lead to the development of guidelines and procedures to achieve reliable demand estimates from nonlinear response history analysis in seismic design. In this paper, an overview of this research effort is discussed and preliminary results based on linear-elastic dynamic response are presented. © ASCE 2011.
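
    One of the simpler scaling methods in this family is amplitude scaling to match the target 5%-damped spectral acceleration at the structure's fundamental period; the abstract does not detail which methods were evaluated, so the sketch below is only a generic illustration with hypothetical spectra.

```python
import numpy as np

def scale_factor(periods, sa_record, sa_target, t1):
    """Amplitude scale factor matching the record's spectral acceleration
    to the target spectrum at the fundamental period t1 (seconds)."""
    sa_rec_t1 = np.interp(t1, periods, sa_record)
    sa_tgt_t1 = np.interp(t1, periods, sa_target)
    return sa_tgt_t1 / sa_rec_t1

# Hypothetical 5%-damped spectra (g) on a common period grid (s)
periods   = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
sa_record = np.array([0.80, 0.95, 0.60, 0.30, 0.12])
sa_target = np.array([0.70, 0.90, 0.75, 0.45, 0.20])

f = scale_factor(periods, sa_record, sa_target, t1=0.5)
print(f"scale factor = {f:.2f}")   # multiply the acceleration record by this
```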

  2. Combined state and parameter identification of nonlinear structural dynamical systems based on Rao-Blackwellization and Markov chain Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Abhinav, S.; Manohar, C. S.

    2018-03-01

    The problem of combined state and parameter estimation in nonlinear state space models, based on Bayesian filtering methods, is considered. A novel approach, which combines Rao-Blackwellized particle filters for state estimation with Markov chain Monte Carlo (MCMC) simulations for parameter identification, is proposed. In order to ensure successful performance of the MCMC samplers in situations involving a large amount of dynamic measurement data and (or) low measurement noise, the study employs a modified measurement model combined with an importance sampling based correction. The parameters of the process noise covariance matrix are also included as quantities to be identified. The study employs the Rao-Blackwellization step at two stages: first, in the state estimation problem within the particle filtering step, and second, in the evaluation of the ratio of likelihoods in the MCMC run. The satisfactory performance of the proposed method is illustrated on three dynamical systems: (a) a computational model of a nonlinear beam-moving oscillator system, (b) a laboratory scale beam traversed by a loaded trolley, and (c) an earthquake shake table study on a bending-torsion coupled nonlinear frame subjected to uniaxial support motion.

  3. The 21 May 2014 Mw 5.9 Bay of Bengal earthquake: macroseismic data suggest a high‐stress‐drop event

    USGS Publications Warehouse

    Martin, Stacey; Hough, Susan E.

    2015-01-01

    A modest but noteworthy Mw 5.9 earthquake occurred in the Bay of Bengal beneath the central Bengal fan at 21:51 Indian Standard Time (16:21 UTC) on 21 May 2014. Centered over 300 km from the eastern coastline of India (Fig. 1), it caused modest damage by virtue of its location and magnitude. However, shaking was very widely felt in parts of eastern India where earthquakes are uncommon. Media outlets reported as many as four fatalities. Although most deaths were blamed on heart attacks, the death of one woman was attributed by different sources to either a roof collapse or a stampede (see Table S1, available in the electronic supplement to this article). Across the state of Odisha, as many as 250 people were injured (see Table S1), most after jumping from balconies or terraces. Light damage was reported from a number of towns on coastal deltaic sediments, including collapsed walls and damage to pukka and thatched dwellings. Shaking was felt well inland into east‐central India and was perceptible in multistoried buildings as far as Chennai, Delhi, and Jaipur at distances of ≈1600  km (Table 1).

  4. Large-Scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) Simulations of the Molecular Crystal alphaRDX

    DTIC Science & Technology

    2013-08-01

    This work models dislocations in the energetic molecular crystal RDX using the Large-Scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) molecular dynamics code, with dispersion and electrostatic interactions described by the SB potential for HMX/RDX; constants for the SB potential are given in table 1 of the report.

  5. Integration of Host Strain Bioengineering and Bioprocess Development Using Ultra-Scale Down Studies to Select the Optimum Combination: An Antibody Fragment Primary Recovery Case Study

    PubMed Central

    Aucamp, Jean P; Davies, Richard; Hallet, Damien; Weiss, Amanda; Titchener-Hooker, Nigel J

    2014-01-01

    An ultra scale-down primary recovery sequence was established for a platform E. coli Fab production process. It was used to evaluate the process robustness of various bioengineered strains. Centrifugal discharge in the initial dewatering stage was determined to be the major cause of cell breakage. The ability of cells to resist breakage was dependent on a combination of factors including host strain, vector, and fermentation strategy. Periplasmic extraction studies were conducted in shake flasks and it was demonstrated that key performance parameters such as Fab titre and nucleic acid concentrations were mimicked. The shake flask system also captured particle aggregation effects seen in a large-scale stirred vessel, reproducing the fine particle size distribution that impacts the final centrifugal clarification stage. Scale-down primary recovery process sequences can be used to screen a larger number of engineered strains. This can lead to closer integration with and better feedback between strain development, fermentation development, and primary recovery studies. Biotechnol. Bioeng. 2014;111: 1971–1981. © 2014 Wiley Periodicals, Inc. PMID:24838387

  6. Citizen Sensors for SHM: Use of Accelerometer Data from Smartphones

    PubMed Central

    Feng, Maria; Fukuda, Yoshio; Mizuta, Masato; Ozer, Ekin

    2015-01-01

    Ubiquitous smartphones have created a significant opportunity to form a low-cost wireless Citizen Sensor network and produce big data for monitoring structural integrity and safety under operational and extreme loads. Such data are particularly useful for rapid assessment of structural damage in a large urban setting after a major event such as an earthquake. This study explores the utilization of smartphone accelerometers for measuring structural vibration, from which structural health and post-event damage can be diagnosed. Widely available smartphones are tested under sinusoidal wave excitations with frequencies in the range relevant to civil engineering structures. Large-scale seismic shaking table tests, observing input ground motion and response of a structural model, are carried out to evaluate the accuracy of smartphone accelerometers under operational, white-noise and earthquake excitations of different intensity. Finally, the smartphone accelerometers are tested on a dynamically loaded bridge. The extensive experiments show satisfactory agreements between the reference and smartphone sensor measurements in both time and frequency domains, demonstrating the capability of the smartphone sensors to measure structural responses ranging from low-amplitude ambient vibration to high-amplitude seismic response. Encouraged by the results of this study, the authors are developing a citizen-engaging and data-analytics crowdsourcing platform towards a smartphone-based Citizen Sensor network for structural health monitoring and post-event damage assessment applications. PMID:25643056

  7. Experimental Evaluation of a Device Prototype Based on Shape Memory Alloys for the Retrofit of Historical Buildings

    NASA Astrophysics Data System (ADS)

    Cardone, Donatello; Sofia, Salvatore

    2012-12-01

    Metallic tie-rods are currently used in many historical buildings for absorbing the out-of-plane horizontal forces of arches, vaults and roof trusses, although they exhibit several limitations under service and seismic conditions. In this paper, a post-tensioned system based on the superelastic properties of Ni-Ti shape memory alloys is proposed for improving the structural performance of traditional metallic tie-rods. First, the thermal behavior under service conditions is investigated based on the results of numerical and experimental studies. Subsequently, the seismic performance under strong earthquakes is verified through a number of shaking table tests on a 1:4-scale timber roof truss model. The outcomes of these studies fully confirm the achievement of the design objectives of the proposed prototype device.

  8. Field observations of seismic velocity changes caused by shaking-induced damage and healing due to mesoscopic nonlinearity

    NASA Astrophysics Data System (ADS)

    Gassenmeier, M.; Sens-Schönfelder, C.; Eulenfeld, T.; Bartsch, M.; Victor, P.; Tilmann, F.; Korn, M.

    2016-03-01

    To investigate temporal seismic velocity changes due to earthquake related processes and environmental forcing in Northern Chile, we analyse 8 yr of ambient seismic noise recorded by the Integrated Plate Boundary Observatory Chile (IPOC). By autocorrelating the ambient seismic noise field measured on the vertical components, approximations of the Green's functions are retrieved and velocity changes are measured with Coda Wave Interferometry. At station PATCX, we observe seasonal changes in seismic velocity caused by thermal stress as well as transient velocity reductions in the frequency range of 4-6 Hz. Sudden velocity drops occur at the time of mostly earthquake-induced ground shaking and recover over a variable period of time. We present an empirical model that describes the seismic velocity variations based on continuous observations of the local ground acceleration. The model assumes that not only the shaking of large earthquakes causes velocity drops, but any small vibrations continuously induce minor velocity variations that are immediately compensated by healing in the steady state. We show that the shaking effect is accumulated over time and best described by the integrated envelope of the ground acceleration over the discretization interval of the velocity measurements, which is one day. In our model, the amplitude of the velocity reduction as well as the recovery time are proportional to the size of the excitation. This model with two free scaling parameters fits the data of the shaking induced velocity variation in remarkable detail. Additionally, a linear trend is observed that might be related to a recovery process from one or more earthquakes before our measurement period. A clear relationship between ground shaking and induced velocity reductions is not visible at other stations. We attribute the outstanding sensitivity of PATCX to ground shaking and thermal stress to the special geological setting of the station, where the subsurface material consists of relatively loose conglomerate with high pore volume leading to a stronger nonlinearity compared to the other IPOC stations.

  9. A Field-Shaking System to Reduce the Screening Current-Induced Field in the 800-MHz HTS Insert of the MIT 1.3-GHz LTS/HTS NMR Magnet: A Small-Model Study.

    PubMed

    Lee, Jiho; Park, Dongkeun; Michael, Philip C; Noguchi, So; Bascuñán, Juan; Iwasa, Yukikazu

    2018-04-01

    In this paper, we present experimental results of a small-model study, from which we plan to develop and apply a full-scale field-shaking system to reduce the screening current-induced field (SCF) in the 800-MHz HTS Insert (H800) of the MIT 1.3-GHz LTS/HTS NMR magnet (1.3G) currently under construction; the H800 is composed of 3 nested coils, each a stack of no-insulation (NI) REBCO double-pancakes. In 1.3G, H800 is the chief source of a large error field generated by its own SCF. To study the effectiveness of the field-shaking technique, we used two NI REBCO double-pancakes, one from Coil 2 (HCoil2) and one from Coil 3 (HCoil3) of the 3 H800 coils, and placed them in the bore of a 5-T/300-mm room-temperature bore low-temperature superconducting (LTS) background magnet. The background magnet is used not only to induce the SCF in the double-pancakes but also to reduce it by the field-shaking technique. For each run, we induced the SCF in the double-pancakes at an axial location where the external radial field Br > 0, then for the field-shaking, moved them to another location where the external axial field Bz ≫ Br. Due to the geometry of the H800 and L500, the top double-pancakes of the 3 H800 coils will experience a considerable radial magnetic field perpendicular to the REBCO tape surface. To examine the effect of the field-shaking on the SCF, we tested each NI REBCO DP in the absence or presence of a radial field. In this paper, we report 77-K experimental results and analysis of the effect and a few significant remarks on the field-shaking.

  10. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.
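
    A schematic of the final combination step described above, in which simulated intensity measures for each rupture's variations are combined with rupture occurrence rates into a hazard (exceedance) curve. This is not SCEC's CyberShake code; the rates and intensity values below are illustrative only.

```python
import numpy as np

def hazard_curve(im_levels, rupture_rates, rupture_ims):
    """Annual exceedance rate at each intensity level.

    rupture_rates : annual occurrence rate per rupture
    rupture_ims   : one array of simulated intensity measures (e.g. Sa)
                    per rupture, over its hypocentre/slip variations
    """
    rates = np.zeros_like(im_levels, dtype=float)
    for nu, ims in zip(rupture_rates, rupture_ims):
        ims = np.asarray(ims)
        for k, x in enumerate(im_levels):
            rates[k] += nu * np.mean(ims > x)   # rate * P(IM > x | rupture)
    return rates

# Toy example with two ruptures (values are invented for illustration)
im_levels = np.linspace(0.05, 1.0, 20)          # spectral acceleration in g
rates = hazard_curve(im_levels,
                     rupture_rates=[0.01, 0.002],
                     rupture_ims=[np.random.lognormal(-2.0, 0.6, 500),
                                  np.random.lognormal(-1.0, 0.6, 500)])
print(np.round(rates[:5], 5))
```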

  11. Introducing Students to Structural Dynamics and Earthquake Engineering

    ERIC Educational Resources Information Center

    Anthoine, Armelle; Marazzi, Francesco; Tirelli, Daniel

    2010-01-01

    The European Laboratory for Structural Assessment (ELSA) is one of the world's main laboratories for seismic studies. Besides its research activities, it also aims to bring applied science closer to the public. This article describes teaching activities based on a demonstration shaking table which is used to introduce the structural dynamics of…

  12. 40 CFR 799.6755 - TSCA partition coefficient (n-octanol/water), shake flask method.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Qualifying statements. This method applies only to pure, water soluble substances which do not dissociate or... applies to a pure substance dispersed between two pure solvents. If several different solutes occur in one... applied. The values presented in table 1 of this section are not necessarily representative of the results...

  13. 40 CFR 799.6755 - TSCA partition coefficient (n-octanol/water), shake flask method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Qualifying statements. This method applies only to pure, water soluble substances which do not dissociate or... applied. The values presented in table 1 of this section are not necessarily representative of the results... Law applies only at constant temperature, pressure, and pH for dilute solutions. It strictly applies...

  14. ARC-2007-ACD07-0073-044

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - (l to r) Kim Ennico, Damon Flansburg and Gi Kojima check out the LCROSS Total Luminance Photometer lens and electronics attached to a metal plate in preparation for a vibe (vibration) test on the shake table in N-2444 EEL Laboratory

  15. California's forest products industry: 1985.

    Treesearch

    James O. Howard; Franklin R. Ward

    1988-01-01

    This report presents the findings of a 100-percent survey of the primary forest products industry in California for 1985. The survey included the following sectors: lumber; veneer and plywood; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry,...

  16. California's forest products industry: 1992.

    Treesearch

    Franklin R. Ward

    1995-01-01

    This report presents the findings of a survey of primary forest products industries in California for 1992. The survey included the following sectors: lumber; pulp and board; shake and shingle; export; and post, pole, and piling. Veneer and plywood mills are not included because they could not be presented without disclosing critical details. Tables, presented by...

  17. California's forest products industry: 1994.

    Treesearch

    Franklin R. Ward

    1997-01-01

    This report presents the findings of a survey of primary forest products industries in California for 1994. The survey included the following sectors: lumber; veneer; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature and flow of logs...

  18. Building a Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2013-12-01

    We are exploring the possibility of building a new type of seismic network using smartphones. The accelerometers in smartphones can be used to record earthquakes, the GPS unit can give an accurate location, and the built-in communication unit makes communication easier for this network. In the future, these smartphones may work as a supplementary network to the current traditional networks for scientific research and real-time applications. In order to build this network, we developed an application for Android phones and a server to record the acceleration in real time. These records can be sent back to the server in real time and analyzed there. We evaluated the performance of the smartphone as a seismic recording instrument by comparing it with a high-quality accelerometer while mounted on controlled shake tables for a variety of tests, including a noise-floor test. Based on the daily human activity data recorded by the volunteers and the shake table test data, we also developed an algorithm for the smartphones to distinguish earthquakes from daily human activities. These all form the basis for setting up a new prototype smartphone seismic network in the near future.
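
    A common baseline for separating earthquake shaking from everyday phone handling is a short-term/long-term average (STA/LTA) trigger on the acceleration record; the abstract does not specify the authors' detection algorithm, so the sketch below is only a generic illustration on a synthetic record.

```python
import numpy as np

def sta_lta(acc, fs, sta_win=1.0, lta_win=30.0):
    """Causal short-term / long-term average ratio of the squared signal."""
    x = (acc - np.mean(acc)) ** 2
    n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(x)))
    ratio = np.empty_like(x)
    for i in range(len(x)):
        m_sta, m_lta = min(i + 1, n_sta), min(i + 1, n_lta)
        sta = (csum[i + 1] - csum[i + 1 - m_sta]) / m_sta
        lta = (csum[i + 1] - csum[i + 1 - m_lta]) / m_lta
        ratio[i] = sta / (lta + 1e-12)
    return ratio

def trigger_indices(ratio, threshold=5.0):
    """Sample indices where the ratio first crosses above the threshold."""
    above = ratio > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

# Synthetic phone record: background jitter plus a burst of shaking at t = 60 s
fs = 50.0
t = np.arange(0.0, 120.0, 1.0 / fs)
acc = 0.005 * np.random.randn(t.size)
burst = t > 60.0
acc[burst] += 0.2 * np.sin(2 * np.pi * 3.0 * t[burst]) * np.exp(-(t[burst] - 60.0))

print("trigger times (s):", t[trigger_indices(sta_lta(acc, fs))][:3])
```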

  19. Experimental and analytical studies on multiple tuned mass dampers for seismic protection of porcelain electrical equipment

    NASA Astrophysics Data System (ADS)

    Bai, Wen; Dai, Junwu; Zhou, Huimeng; Yang, Yongqiang; Ning, Xiaoqing

    2017-10-01

    Porcelain electrical equipment (PEE), such as current transformers, is critical to power supply systems, but its seismic performance during past earthquakes has not been satisfactory. This paper studies the seismic performance of two typical types of PEE and proposes a damping method for PEE based on multiple tuned mass dampers (MTMD). An MTMD damping device involving three mass units, named a triple tuned mass damper (TTMD), is designed and manufactured. Through shake table tests and finite element analysis, the dynamic characteristics of the PEE are studied and the effectiveness of the MTMD damping method is verified. The adverse influence of MTMD redundant mass on damping efficiency is studied and relevant equations are derived. MTMD robustness is verified by adjusting TTMD control frequencies. The damping effectiveness of TTMD, when the peak ground acceleration far exceeds the design value, is studied. Both shake table tests and finite element analysis indicate that MTMD is effective and robust in attenuating PEE seismic responses. TTMD remains effective when the PGA far exceeds the design value and when control deviations are considered.

  20. Performance of sand and shredded rubber tire mixture as a natural base isolator for earthquake protection

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Srijit; Sengupta, Aniruddha; Reddy, G. R.

    2015-12-01

    The performance of a well-designed layer of sand, and of composites such as a layer of sand mixed with shredded rubber tire (RSM), as low-cost base isolators is studied in shake table tests in the laboratory. The building foundation is modeled by a 200 mm by 200 mm and 40 mm thick rigid plexi-glass block. The block is placed in the middle of a 1 m by 1 m tank filled with sand. The selected base isolator is placed between the block and the sand foundation. Accelerometers are placed on top of the footing and the foundation sand layer. The displacement of the footing is also measured by an LVDT. The whole setup is mounted on a shake table and subjected to sinusoidal motions with varying amplitude and frequency. Sand is found to be effective only at very high amplitudes of motion (> 0.65 g). The performance of a composite consisting of sand and 50% shredded rubber tire placed under the footing is found to be the most promising as a low-cost, effective base isolator.

  1. Optimization of wet shaking table process using response surface methodology applied to the separation of copper and aluminum from the fine fraction of shredder ELVs.

    PubMed

    Jordão, Helga; Sousa, António Jorge; Carvalho, M Teresa

    2016-02-01

    With the purpose of reducing the waste generated by end-of-life vehicles (ELVs) by enhancing the recovery and recycling of nonferrous metals, an experimental study was conducted with the finest size fraction of the nonferrous stream produced at an ELV shredder plant. The aim of this work was to characterize the nonferrous stream and to evaluate the efficiency of a gravity concentration process, easily integrated in an ELV shredder plant, in separating light and heavy nonferrous metal particles (in this case study the separation explicitly addressed copper and aluminum separation). The characterization of a sample of the 0-10 mm particle size fraction showed a mixture of nonferrous metals with a certain degree of impurity due to the presence of contaminants such as plastics. The majority of the particles exhibited a wire shape, preventing an efficient separation of materials without prior fragmentation. The gravity concentration process selected for this study was the wet shaking table, and three operating parameters of the equipment were manipulated. A full factorial design in combination with a central composite design was employed to model metals recovery. Two second-order polynomial equations were successfully fitted to describe the process and predict the recovery of copper and aluminum in the Cu concentrate under the conditions of the present study. The optimum conditions were determined to be 11.1° of inclination, 2.8 L/min of feed water flow and 4.9 L/min of wash water flow. All three final products of the wet shaking table had a content higher than 90% of one of the metals; the Cu concentrate was obtained with a Cu content of 96%, a Cu recovery of 78% and an Al recovery of 2%. Copyright © 2015 Elsevier Ltd. All rights reserved.
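
    The response-surface step described above amounts to fitting a full second-order polynomial to the recoveries measured over a central composite design and then optimizing it. A minimal fitting sketch follows; the factor settings are in coded units and the recovery values are invented for illustration, not the paper's data.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Columns: intercept, linear, pure quadratic and two-factor
    interaction terms of a full second-order (response surface) model."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    return np.column_stack(cols)

# Central composite design in coded units for the three manipulated factors
# (inclination, feed water flow, wash water flow).
alpha = 1.682
factorial = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                      [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1]], float)
axial = np.array([[alpha, 0, 0], [-alpha, 0, 0], [0, alpha, 0],
                  [0, -alpha, 0], [0, 0, alpha], [0, 0, -alpha]], float)
center = np.zeros((2, 3))
X = np.vstack([factorial, axial, center])
y = np.array([62, 70, 58, 75, 64, 72, 60, 77,
              78, 55, 63, 71, 66, 69, 74, 73], float)   # illustrative Cu recoveries (%)

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted second-order coefficients:", np.round(beta, 2))
```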

  2. Use of Solid Waste (Foundry Slag) Mortar and Bamboo Reinforcement in Seismic Analysis for Single Storey Masonry Building

    NASA Astrophysics Data System (ADS)

    Ahmad, S.; Husain, A.; Ghani, F.; Alam, M. N.

    2013-11-01

    The conversion of a large amount of solid waste (foundry slag) into an alternate source of building material will contribute not only a solution to the growing waste problem, but will also conserve the natural resources of other building materials and thereby reduce the cost of construction. The present work investigates the safe and economical use of recycled mortar (1:6) as a supplementary material. Twelve conventional and recycled prisms were cast with varying percentages of solid waste (foundry slag) added (0, 10, 20, 30 %), replacing cement by weight, and tested under a compression testing machine. As the replacement increases, the strength decreases. The 10 % replacement curve is very close to the 0 % curve, whereas 20 % is farther and 30 % is farthest. The 20 % replacement was chosen for dynamic testing, as its strength is within the permissible limit as per the IS code. A 1:4 scale single storey brick model with half-size bricks was fabricated on a shake table in the lab for dynamic testing using a pure friction isolation system (coarse sand as friction material, µ = 0.34). The pure friction isolation technique can be adopted economically in developing countries where low-rise buildings prevail, due to its low cost. The superstructure was separated from the foundation at plinth level, so as to permit sliding of the superstructure during a severe earthquake. The observed values of acceleration and displacement responses compare fairly well with the values from the analytical model. It is also concluded that 20 % replacement of cement by solid waste (foundry slag) could be safely adopted without endangering the safety of masonry structures under seismic load. To estimate how much energy is dissipated through this isolation, the same model with a fixed base was tested and the results were compared with the isolated free-sliding model; it was observed that more than 60 % of the energy is dissipated through this pure friction isolation technique. In the case of base isolation, no visible cracks were observed up to a table force of 4.25 kN (1,300 rpm), whereas for the fixed base, failure started at 800 rpm. To strengthen the fixed-base model, bamboo reinforcement was used from an economical point of view. Another model of the same dimensions with the same mortar ratio was fabricated on the shake table with bamboo reinforcement as plinth and lintel bands. In addition, another four round bamboo bars of 3 mm diameter were placed at the four corners of the model. The building model was tested and gave very encouraging results. Failure of this model started at 1,600 rpm, which means that it survived double the force compared with the model without bamboo reinforcement.
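
    The benefit of the pure friction (sliding) isolation described above can be seen from a rigid-block idealization: the interface cannot transmit a base shear larger than µW, so the superstructure acceleration saturates near µg. The sketch below uses the quoted µ = 0.34; it ignores stick-slip dynamics and vertical acceleration and is only a back-of-the-envelope check.

```python
G = 9.81  # gravitational acceleration, m/s^2

def transmitted_acceleration(ground_acc, mu):
    """Peak acceleration a pure-friction interface can transmit: once the
    ground demand exceeds mu*g the base slides and the superstructure
    acceleration saturates (idealised rigid-block view)."""
    cap = mu * G
    return min(abs(ground_acc), cap)

mu = 0.34                    # coarse-sand friction coefficient quoted above
for a_g in (2.0, 4.0, 6.0):  # ground accelerations in m/s^2
    a_s = transmitted_acceleration(a_g, mu)
    print(f"ground {a_g:.1f} m/s^2 -> structure ~{a_s:.2f} m/s^2")
```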

  3. Stratification and segregation features of pulverized electronic waste in flowing film concentration.

    PubMed

    Vidyadhar, A; Chalavadi, G; Das, A

    2013-03-30

    Gravity separation of metals from plastics in pulverized e-waste using flowing film concentration in a shaking table was investigated. Over 51% rejection of plastics in a single stage operation was achieved under optimum conditions. The shaking table was shown to be suitable for processing ground PCBs. Pulverized e-waste containing 22% metals was enriched to around 40% metals in a single pass. Statistical models for the mass yield of metal-rich stream and its grade were developed by design of experiments. Optimization was carried out to maximize the mass yield at a target product grade and preferred operating regimes were established. Experiments were designed to prevent metal loss and over 95% recovery values were obtained under all conditions. Settling distances of metals and plastics were computed and shown to be good indicators of separation performance. Particle morphology and stratification in the troughs in between the riffles were shown to influence the separation significantly. Water flow-assisted motion of the plastics was captured and its role in determining the effectiveness of separation was described. The efficacy of tabling was well established for treating ground PCBs. The wet process was shown to be environment friendly and sustainable. It is also relatively cheap and has good potential for industrial application. However, rigorous cost estimates will be required before commercial application. Copyright © 2013 Elsevier Ltd. All rights reserved.
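
    The settling-distance comparison mentioned above rests on the contrast in terminal settling velocity between dense metal and light plastic particles. The sketch below uses Stokes' law, which is only valid for fine particles at low Reynolds number; the coarser fractions treated in the paper would need a Newton-regime drag law, and the particle size and densities shown are illustrative assumptions.

```python
def stokes_velocity(d, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) in the Stokes (laminar) regime:
    v = g * d^2 * (rho_p - rho_f) / (18 * mu), for particle diameter d (m)."""
    return g * d**2 * (rho_p - rho_f) / (18.0 * mu)

d = 50e-6  # 50-micron particle; coarser e-waste needs a Newton-regime drag law
for name, rho in (("copper", 8960.0), ("plastic", 1200.0)):
    print(f"{name}: {stokes_velocity(d, rho) * 1000:.2f} mm/s")
```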

  4. Introducing ShakeMap to potential users in Puerto Rico using scenarios of damaging historical and probable earthquakes

    NASA Astrophysics Data System (ADS)

    Huerfano, V. A.; Cua, G.; von Hillebrandt, C.; Saffar, A.

    2007-12-01

    The island of Puerto Rico has a long history of damaging earthquakes. Major earthquakes from off-shore sources have affected Puerto Rico in 1520, 1615, 1670, 1751, 1787, 1867, and 1918 (Mueller et al, 2003; PRSN Catalogue). Recent trenching has also yielded evidence of possible M7.0 events inland (Prentice, 2000). The high seismic hazard, large population, high tsunami potential and relatively poor construction practice can result in a potentially devastating combination. Efficient emergency response in the event of a large earthquake will be crucial to minimizing the loss of life and disruption of lifeline systems in Puerto Rico. The ShakeMap system (Wald et al, 2004) developed by the USGS to rapidly display and disseminate information about the geographical distribution of ground shaking (and hence potential damage) following a large earthquake has proven to be a vital tool for post-earthquake emergency response efforts, and is being adopted/emulated in various seismically active regions worldwide. Implementing a robust ShakeMap system is among the top priorities of the Puerto Rico Seismic Network. However, the ultimate effectiveness of ShakeMap in post-earthquake response depends not only on its rapid availability, but also on the effective use of the information it provides. We developed ShakeMap scenarios of a suite of damaging historical and probable earthquakes that severely impact San Juan, Ponce, and Mayagüez, the 3 largest cities in Puerto Rico. Earthquake source parameters were obtained from McCann and Mercado (1998) and Huérfano (2004). For historical earthquakes that generated tsunamis, tsunami inundation maps were generated using the TIME method (Shuto, 1991). The ShakeMap ground shaking maps were presented to local and regional governmental and emergency response agencies at the 2007 Annual Conference of the Puerto Rico Emergency Management and Disaster Administration in San Juan, PR, and at numerous other emergency management talks and training sessions. Economic losses are estimated using the ShakeMap scenario ground motions (Saffar, 2007). The calibration tasks necessary in generating these scenarios (developing Vs30 maps, attenuation relationships) complement the on-going efforts of the Puerto Rico Seismic Network to generate ShakeMaps in real-time.

  5. Maybe Some Big Ground Shakes One Hundred Years Ago in a Big State Near the Ocean Were Caused by People

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Tsai, V. C.; Walker, R.; Page, M. T.; Aminzadeh, F.

    2016-12-01

    Sometimes people put water deep into the ground to make it go away and sometimes this causes the ground to shake. Sometimes people take other stuff out of the ground because a lot of people buy this stuff to power cars. Usually when people take this stuff out of the ground it does not cause ground shakes. At least this is what we used to believe. For our study, we looked at ground shakes that caused houses to fall down almost 100 years ago in a big state near the water. They were large ground shakes. One was close to a big city where people make movies and one was a really big shake in another city in the same state. We asked the question, is it possible that these ground shakes happened because people took stuff out of the ground? We considered the places where the ground shakes happened and the places where people took a lot of stuff out of the ground. We show there is a pretty good chance that taking stuff out of the ground caused some pretty big ground shakes. We explain how ground shakes can happen when people take stuff out of the ground. Ground shakes happen on things called faults. When you take stuff out of the ground, usually that makes it harder for the fault to move. This is a good thing. But when the stuff is still deep under the ground, sometimes it also pushes against faults that are close by and helps keep them from moving. So when you take stuff out, it does not push on faults as much, and so sometimes that close-by fault can move and cause ground shakes. We use a computer to show that our idea can explain some of what we see. The idea is not perfect but we think it is a pretty good idea. Our idea explains why it does not usually cause ground shakes when people take stuff out of the ground, but sometimes big ground shakes happen. Our idea suggests that ground shakes caused by people can sometimes be very large. So if people take stuff out of the ground or put stuff in the ground, they need to know if there are faults close by.

  6. Oregon's forest products industry: 1988.

    Treesearch

    James O. Howard; Franklin R. Ward

    1991-01-01

    This report presents the findings of a survey of all primary forest products industries in Oregon for 1988. The survey included the following sectors: lumber; veneer and plywood; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature and flow of...

  7. California's forest products industry: 1988.

    Treesearch

    James O. Howard; Franklin R. Ward

    1991-01-01

    This report presents the findings of a survey of all primary forest products industries in California for 1988. The survey included the following sectors: lumber; veneer and plywood; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature and flow...

  8. Oregon's forest products industry: 1985.

    Treesearch

    James O. Howard; Franklin R. Ward

    1988-01-01

    This report presents the findings of a 100-percent survey of the primary forest products industry in Oregon for 1985. The survey included the following sectors: lumber; veneer and plywood; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature...

  9. Oregon's forest products industry: 1992.

    Treesearch

    Franklin R. Ward

    1995-01-01

    This report presents the findings of a survey of primary forest products industries in Oregon for 1992. The survey included the following sectors: lumber; veneer and plywood; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature and flow of...

  10. Oregon's forest products industry: 1994.

    Treesearch

    Franklin R. Ward

    1997-01-01

    This report presents the findings of a survey of primary forest products industries in Oregon for 1994. The survey included the following sectors: lumber; veneer; pulp and board; shake and shingle; export; and post, pole, and piling. Tables, presented by sector and for the industry as a whole, include characteristics of the industry, nature and flow of logs consumed,...

  11. Expression and purification of ELP-intein-tagged target proteins in high cell density E. coli fermentation.

    PubMed

    Fong, Baley A; Wood, David W

    2010-10-19

    Elastin-like polypeptides (ELPs) are useful tools that can be used to non-chromatographically purify proteins. When paired with self-cleaving inteins, they can be used as economical self-cleaving purification tags. However, ELPs and ELP-tagged target proteins have been traditionally expressed using highly enriched media in shake flask cultures, which are generally not amenable to scale-up. In this work, we describe the high cell-density expression of self-cleaving ELP-tagged targets in a supplemented minimal medium at a 2.5 liter fermentation scale, with increased yields and purity compared to traditional shake flask cultures. This demonstration of ELP expression in supplemented minimal media is juxtaposed to previous expression of ELP tags in extract-based rich media. We also describe several sets of fed-batch conditions and their impact on ELP expression and growth medium cost. By using fed-batch E. coli fermentation at high cell density, ELP-intein-tagged proteins can be expressed and purified at high yield with low cost. Further, media components and fermentation design can significantly impact the overall process cost, particularly at large scale. This work thus demonstrates an important advance in the scale-up of self-cleaving ELP tag-mediated processes.

  12. Expression and purification of ELP-intein-tagged target proteins in high cell density E. coli fermentation

    PubMed Central

    2010-01-01

    Background Elastin-like polypeptides (ELPs) are useful tools that can be used to non-chromatographically purify proteins. When paired with self-cleaving inteins, they can be used as economical self-cleaving purification tags. However, ELPs and ELP-tagged target proteins have been traditionally expressed using highly enriched media in shake flask cultures, which are generally not amenable to scale-up. Results In this work, we describe the high cell-density expression of self-cleaving ELP-tagged targets in a supplemented minimal medium at a 2.5 liter fermentation scale, with increased yields and purity compared to traditional shake flask cultures. This demonstration of ELP expression in supplemented minimal media is juxtaposed to previous expression of ELP tags in extract-based rich media. We also describe several sets of fed-batch conditions and their impact on ELP expression and growth medium cost. Conclusions By using fed-batch E. coli fermentation at high cell density, ELP-intein-tagged proteins can be expressed and purified at high yield with low cost. Further, media components and fermentation design can significantly impact the overall process cost, particularly at large scale. This work thus demonstrates an important advance in the scale-up of self-cleaving ELP tag-mediated processes. PMID:20959011
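
    The fed-batch conditions discussed above are often designed around an exponential feed profile that holds a preset specific growth rate. A minimal sketch of that standard profile follows; all parameter values (growth rate, biomass, yield, feed concentration) are hypothetical and the papers do not state their actual feeding strategy.

```python
import numpy as np

def exponential_feed(t_h, mu_set, x0, v0, y_xs, s_feed):
    """Feed rate (L/h) holding the specific growth rate at mu_set, ignoring
    maintenance: F(t) = mu_set * X0 * V0 * exp(mu_set * t) / (Y_xs * S_feed)."""
    return mu_set * x0 * v0 * np.exp(mu_set * t_h) / (y_xs * s_feed)

# Hypothetical numbers for a small E. coli fed-batch:
# mu_set 1/h, X0 g/L, V0 L, Y_xs g biomass / g glucose, S_feed g/L glucose
for t in (0.0, 2.0, 4.0, 6.0):
    f = exponential_feed(t, mu_set=0.15, x0=20.0, v0=1.5, y_xs=0.45, s_feed=500.0)
    print(f"t = {t:>3.0f} h  feed ~ {f * 1000:.1f} mL/h")
```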

  13. Decoupling the structure from the ground motion during earthquakes by employing friction pendulums

    NASA Astrophysics Data System (ADS)

    Gillich, G. R.; Iancu, V.; Gillich, N.; Korka, Z. I.; Chioncel, C. P.; Hatiegan, C.

    2018-01-01

    Limiting dynamic loads on structures during earthquakes is a pressing issue, since seismic actions can harm or destroy the built environment. Several approaches are possible, the essence being to decouple the structure from the ground motion during earthquakes and prevent in this way large deflections and high accelerations. A common approach is the use of friction pendulums, with cylindrical or spherical surfaces among other shapes, inserted between the ground and the structure, or between the pillar and the superstructure. This type of bearing permits a small pendulum motion, and in this way, earthquake-induced displacements that occur in the bearings are not integrally transmitted to the structure. The consequence is that the structure is subject to greatly reduced lateral loads and shaking movements. In the experiments, conducted to prove the efficiency of the friction pendulums, we made use of a shaking table of our own design and manufacture. Two types of sliding surfaces are analyzed, one polynomial of second order (i.e. circular) and one of higher order. For both pendulum types, analytical models were developed. The results have shown that the structure is effectively decoupled from the ground motion and behaves similarly to the prediction of the analytical model.
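
    For the classical spherical (second-order) sliding surface mentioned above, the isolation period depends only on the effective radius of curvature, T = 2π√(R/g), independent of the supported mass; the higher-order surface studied in the paper gives a displacement-dependent period instead. A small worked check with assumed radii:

```python
import numpy as np

def pendulum_period(radius, g=9.81):
    """Isolation period of a single spherical friction pendulum,
    T = 2*pi*sqrt(R/g); independent of the supported mass."""
    return 2.0 * np.pi * np.sqrt(radius / g)

for R in (1.0, 2.0, 4.0):   # assumed effective radii of curvature, metres
    print(f"R = {R:.1f} m -> T = {pendulum_period(R):.2f} s")
```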

  14. Experiments in randomly agitated granular assemblies close to the jamming transition

    NASA Astrophysics Data System (ADS)

    Caballero, Gabriel; Lindner, Anke; Ovarlez, Guillaume; Reydellet, Guillaume; Lanuza, José; Clément, Eric

    2004-11-01

    We present the results obtained for two experiments on randomly agitated granular assemblies using a novel way of shaking. First we discuss the transport properties of a 2D model system undergoing classical shaking that show the importance of large scale dynamics for this type of agitation and offer a local view of the microscopic motions of a grain. We then develop a new way of vibrating the system allowing for random accelerations smaller than gravity. Using this method we study the evolution of the free surface as well as results from a light scattering method for a 3D model system. The final aim of these experiments is to investigate the ideas of effective temperature on the one hand as a function of inherent states and on the other hand using fluctuation dissipation relations.

  15. Experiments in randomly agitated granular assemblies close to the jamming transition

    NASA Astrophysics Data System (ADS)

    Caballero, Gabriel; Lindner, Anke; Ovarlez, Guillaume; Reydellet, Guillaume; Lanuza, José; Clément, Eric

    2004-03-01

    We present the results obtained for two experiments on randomly agitated granular assemblies using a novel way of shaking. First we discuss the transport properties of a 2D model system undergoing classical shaking that show the importance of large scale dynamics for this type of agitation and offer a local view of the microscopic motions of a grain. We then develop a new way of vibrating the system allowing for random accelerations smaller than gravity. Using this method we study the evolution of the free surface as well as results from a light scattering method for a 3D model system. The final aim of these experiments is to investigate the ideas of effective temperature on the one hand as a function of inherent states and on the other hand using fluctuation dissipation relations.

  16. 9. EAST ELEVATION OF SKIDOO MILL, LOOKING WEST. THE LEVELS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. EAST ELEVATION OF SKIDOO MILL, LOOKING WEST. THE LEVELS OF THE MILL CAN BE CLEARLY SEEN HERE. THE UPPERMOST LEVEL CONSISTS OF A CONVEYOR THAT BROUGHT ORE TO A JAW CRUSHER. THE CRUSHED ORE WAS CHANNELED DIRECTLY INTO A LARGE ORE BIN LOCATED BEHIND THE COVERED WALL (CENTER). THE NEXT LEVEL SHOWS THE BULL (DRIVE) WHEEL ON THE UPPER PART OF THE STAMP BATTERIES. THE NEXT LEVEL DOWN (STAIRS) IS THE LOWER PORTION OF THE STAMP BATTERIES WITH THE MORTAR BLOCKS AND APRONS. THE NEXT LEVEL DOWN (LOWER RIGHT) HELD CONCENTRATION (SHAKING) TABLES AND A CLASSIFIER. MOST EXTERIOR WALL COVERING, TIMBERS, AND ROOF IS MISSING FROM THE MILL. SEE CA-290-42 (CT) FOR IDENTICAL COLOR TRANSPARENCY - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  17. 42. EAST ELEVATION OF SKIDOO MILL, LOOKING WEST. THE LEVELS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    42. EAST ELEVATION OF SKIDOO MILL, LOOKING WEST. THE LEVELS OF THE MILL CAN BE CLEARLY SEEN HERE. THE UPPERMOST LEVEL CONSISTS OF A CONVEYOR THAT BROUGHT ORE TO A JAW CRUSHER. THE CRUSHED ORE WAS CHANNELED DIRECTLY INTO A LARGE ORE BIN LOCATED BEHIND THE COVERED WALL (CENTER). THE NEXT LEVEL SHOWS THE BULL (DRIVE) WHEEL ON THE UPPER PART OF THE STAMP BATTERIES. THE NEXT LEVEL DOWN (STAIRS) IS THE LOWER PORTION OF THE STAMP BATTERIES WITH MORTAR BLOCKS AND APRONS. THE NEXT LEVEL DOWN (LOWER RIGHT) HELD CONCENTRATION (SHAKING) TABLES AND A CLASSIFIER. MOST EXTERIOR WALL COVERING, TIMBERS, AND ROOF ARE MISSING FROM THE MILL. SEE CA-290-9 FOR IDENTICAL B&W NEGATIVE. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  18. Test and evaluation of the attic temperature reduction potential of plastic roof shakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holton, J.K.; Beggs, T.R.

    1999-07-01

    While monitoring the comparative performance of two test houses in Pittsburgh, Pennsylvania, it was noticed that the attic air temperature of one house with a plastic shake roof was consistently 20 F (11 C) cooler than its twin with asphalt shingles during peak summer cooling periods. More detailed monitoring of the temperatures on the plastic shake, the roof deck, and the attic showed this effect to be largely due to the plastic shake and not to better roof venting or other heat loss mechanisms.

  19. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests.

    PubMed

    Yang, Changwei; Zhang, Jianjing; Liu, Feicheng; Bi, Junwei; Jun, Zhang

    2015-08-06

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) multi-aspect terrain with free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. To explain these disaster phenomena, the causes are analyzed based on shaking table tests of one-sided, two-sided and four-sided slopes. The analysis results show that: (1) the amplification of peak acceleration on four-sided slopes is stronger than on two-sided slopes, while that on one-sided slopes is the weakest, which indirectly explains why the damage to such terrain was most serious; (2) the amplification of peak acceleration gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, which explains why landslide hazards mainly occurred on slopes steeper than 45°. The amplification along the slope strike direction is essentially uniform and varies smoothly.

  20. Study on soil-pile-structure-TMD interaction system by shaking table model test

    NASA Astrophysics Data System (ADS)

    Lou, Menglin; Wang, Wenjian

    2004-06-01

    The success of the tuned mass damper (TMD) in reducing wind-induced structural vibrations has been well established. However, from most of the recent numerical studies, it appears that for a structure situated on very soft soil, soil-structure interaction (SSI) could render a damper on the structure totally ineffective. In order to experimentally verify the SSI effect on the seismic performance of TMD, a series of shaking table model tests have been conducted and the results are presented in this paper. It has been shown that the TMD is not as effective in controlling the seismic responses of structures built on soft soil sites due to the SSI effect. Some test results also show that a TMD device might have a negative impact if the SSI effect is neglected and the structure is built on a soft soil site. For structures constructed on a soil foundation, this research verifies that the SSI effect must be carefully understood before a TMD control system is designed to determine if the control is necessary and if the SSI effect must be considered when choosing the optimal parameters of the TMD device.

  1. Monitoring of Engineering Buildings Behaviour Within the Disaster Management System

    NASA Astrophysics Data System (ADS)

    Oku Topal, G.; Gülal, E.

    2017-11-01

    Disaster management aims to prevent events that result in disaster or to reduce their losses. Monitoring engineering buildings, identifying unusual movements, and taking the necessary precautions are crucial for assessing disaster risk so that preventive measures can be taken to reduce major losses. Advancing technology, growing populations, increasing construction, and the large economic value concentrated in these areas all motivate damage detection strategies, of which Structural Health Monitoring (SHM) is the most effective. SHM research is essential for keeping this building stock safe; the purpose of structural monitoring is to detect possible failures in advance and to take the necessary precautions. In this paper, determining the behaviour of structures using the Global Positioning System (GPS) is investigated. For this purpose, shaking table tests were performed: the shaking table was moved at different amplitudes and frequencies, and these movements were measured with a GPS system. The obtained data were evaluated by time series analysis and Fast Fourier Transform techniques, and the frequency and amplitude values were calculated. By examining the test results, it is determined whether the GPS measurement method can accurately detect the movements of engineering structures.
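    As an illustration of the kind of spectral analysis described above (not the authors' code), the dominant frequency and amplitude of a displacement time series can be estimated with a discrete Fourier transform. A minimal sketch, assuming a synthetic 10 Hz GPS displacement record generated here only for demonstration:

```python
# Minimal sketch: estimate the dominant frequency and amplitude of a
# displacement record with an FFT, as one might do for GPS data recorded
# during a shaking table test. The signal below is synthetic.
import numpy as np

fs = 10.0                                   # assumed GPS sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)          # 60 s record
x = 0.02 * np.sin(2 * np.pi * 0.8 * t)      # 2 cm table motion at 0.8 Hz
x += 0.002 * np.random.randn(t.size)        # measurement noise

spec = np.fft.rfft(x * np.hanning(x.size))
freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
peak = np.argmax(np.abs(spec[1:])) + 1      # skip the DC bin

# Amplitude correction: 2/N for the one-sided spectrum, and the Hann
# window's coherent gain of 0.5.
amp = 2.0 * np.abs(spec[peak]) / x.size / 0.5
print(f"Dominant frequency: {freqs[peak]:.2f} Hz, amplitude: {amp * 100:.2f} cm")
```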

  2. Development of Arduino based wireless control system

    NASA Astrophysics Data System (ADS)

    Sun, Zhuoxiong; Dyke, Shirley J.; Pena, Francisco; Wilbee, Alana

    2015-03-01

    Over the past few decades, considerable attention has been given to structural control systems to mitigate structural vibration under natural hazards such as earthquakes and extreme weather conditions. Traditional wired structural control systems often employ a large amount of cables for communication among sensors, controllers and actuators. In such systems, implementation of wired sensors is usually quite complicated and expensive, especially on large scale structures such as bridges and buildings. To reduce the laborious installation and maintenance cost, wireless control systems (WCSs) are considered as a novel approach for structural vibration control. In this work, a WCS is developed based on the open source Arduino platform. Low cost, low power wireless sensing and communication components are built on the Arduino platform. Structural control algorithms are embedded within the wireless sensor board for feedback control. The developed WCS is first validated through a series of tests. Next, numerical simulations are performed simulating wireless control of a 3-story shear structure equipped with a semi-active control device (MR damper). Finally, experimental studies are carried out implementing the WCS on the 3-story shear structure in the Intelligent Infrastructure Systems Lab (IISL). A hydraulic shake table is used to generate seismic ground motions. The control performance is evaluated with the impact of modeling uncertainties, measurement noises as well as time delay and data loss induced by the wireless network. The developed WCS is shown to be effective in controlling structural vibrations under several historical earthquake ground motions.

  3. Determination of cadmium and lead in table salt by sequential multi-element flame atomic absorption spectrometry.

    PubMed

    Amorim, Fábio A C; Ferreira, Sérgio L C

    2005-02-28

    In the present paper, a simultaneous pre-concentration procedure for the sequential determination of cadmium and lead in table salt samples using flame atomic absorption spectrometry is proposed. The method is based on the liquid-liquid extraction of cadmium(II) and lead(II) ions as dithizone complexes and direct aspiration of the organic phase into the spectrometer. The sequential determination of cadmium and lead is made possible by a computer program. The optimization step was performed with a two-level fractional factorial design involving the variables pH, dithizone mass, shaking time after addition of dithizone, and shaking time after addition of solvent; at the levels studied, these variables were not significant. The established experimental conditions use a sample volume of 250 mL and extraction with 4.0 mL of methyl isobutyl ketone. In this way, the procedure allows the determination of cadmium and lead in table salt samples with a pre-concentration factor higher than 80 and detection limits of 0.3 ng g⁻¹ for cadmium and 4.2 ng g⁻¹ for lead. The precision, expressed as relative standard deviation (n = 10), was 5.6 and 2.6% for cadmium concentrations of 2 and 20 ng g⁻¹, respectively, and 3.2 and 1.1% for lead concentrations of 20 and 200 ng g⁻¹, respectively. Recoveries of cadmium and lead in several samples, measured by the standard addition technique, also proved that the procedure is not affected by the matrix and can be applied satisfactorily to the determination of cadmium and lead in saline samples. The method was applied to evaluate the concentration of cadmium and lead in table salt samples consumed in Salvador City, Bahia, Brazil.

  4. Experimental evaluation of four ground-motion scaling methods for dynamic response-history analysis of nonlinear structures

    USGS Publications Warehouse

    O'Donnell, Andrew P.; Kurama, Yahya C.; Kalkan, Erol; Taflanidis, Alexandros A.

    2017-01-01

    This paper experimentally evaluates four methods to scale earthquake ground-motions within an ensemble of records to minimize the statistical dispersion and maximize the accuracy in the dynamic peak roof drift demand and peak inter-story drift demand estimates from response-history analyses of nonlinear building structures. The scaling methods that are investigated are based on: (1) ASCE/SEI 7–10 guidelines; (2) spectral acceleration at the fundamental (first mode) period of the structure, Sa(T1); (3) maximum incremental velocity, MIV; and (4) modal pushover analysis. A total of 720 shake-table tests of four small-scale nonlinear building frame specimens with different static and dynamic characteristics are conducted. The peak displacement demands from full suites of 36 near-fault ground-motion records as well as from smaller “unbiased” and “biased” design subsets (bins) of ground-motions are included. Out of the four scaling methods, ground-motions scaled to the median MIV of the ensemble resulted in the smallest dispersion in the peak roof and inter-story drift demands. Scaling based on MIV also provided the most accurate median demands as compared with the “benchmark” demands for structures with greater nonlinearity; however, this accuracy was reduced for structures exhibiting reduced nonlinearity. The modal pushover-based scaling (MPS) procedure was the only method to conservatively overestimate the median drift demands.
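    For illustration only (not the authors' implementation): the maximum incremental velocity is commonly taken as the largest area under the acceleration trace between successive zero crossings, and scaling to the ensemble median MIV then reduces to a simple ratio. A minimal sketch under that assumed definition, using synthetic stand-in records:

```python
# Minimal sketch: compute a maximum incremental velocity (MIV) for each
# acceleration record and scale every record to the ensemble median MIV.
import numpy as np

def miv(acc, dt):
    """Largest incremental velocity between successive zero crossings of acc."""
    zero_idx = np.where(np.diff(np.sign(acc)) != 0)[0]
    best, start = 0.0, 0
    for z in np.append(zero_idx, acc.size - 1):
        segment = acc[start:z + 1]
        best = max(best, abs(segment.sum()) * dt)
        start = z + 1
    return best

def scale_to_median_miv(records, dt):
    """Scale factors that bring every record to the ensemble median MIV."""
    values = np.array([miv(a, dt) for a in records])
    return np.median(values) / values

# Usage with synthetic records (real studies would use recorded motions).
dt = 0.01
records = [np.random.randn(3000) * s for s in (0.5, 1.0, 2.0)]
print(scale_to_median_miv(records, dt))
```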

  5. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1Hz deterministic simulation results with 10Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites including simulations that use the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid-computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA maps that contained more than 1.6 million hazard curves.

  6. Mass Culture of a Slime Mold, Physarum polycephalum1

    PubMed Central

    Brewer, E. N.; Kuraishi, S.; Garver, J. C.; Strong, F. M.

    1964-01-01

    The slime mold, Physarum polycephalum, was cultivated in a soluble natural medium in shake flasks and in 30-liter and 50-gal conventional baffled fermentors. Yields of 6 to 10 g (dry weight) per liter were obtained in the large-scale fermentations. Because of the slow growth of the myxomycete, particular attention had to be paid to aseptic technique. The inability of this organism to withstand the normal degree of agitation employed with most aerobic fermentations made it difficult to obtain adequate aeration. Conditions for growth of the organism on a pilot-plant scale are presented. PMID:14131366

  7. Effectiveness of educational materials designed to change knowledge and behaviors regarding crying and shaken-baby syndrome in mothers of newborns: a randomized, controlled trial.

    PubMed

    Barr, Ronald G; Rivara, Frederick P; Barr, Marilyn; Cummings, Peter; Taylor, James; Lengua, Liliana J; Meredith-Benitz, Emily

    2009-03-01

    Infant crying is an important precipitant for shaken-infant syndrome. The objective was to determine if parent education materials (The Period of PURPLE Crying [PURPLE]) change maternal knowledge and behavior relevant to infant shaking. This study was a randomized, controlled trial conducted in prenatal classes, maternity wards, and pediatric practices. There were 1374 mothers of newborns randomly assigned to the PURPLE intervention and 1364 mothers to the control group. Primary outcomes were measured by telephone 2 months after delivery. These included 2 knowledge scales about crying and the dangers of shaking; 3 scales about behavioral responses to crying generally and to unsoothable crying, and caregiver self-talk in response to unsoothable crying; and 3 questions concerning the behaviors of sharing of information with others about crying, walking away if frustrated, and the dangers of shaking. The mean infant crying knowledge score was greater in the intervention group (69.5) compared with controls (63.3). Mean shaking knowledge was greater for intervention subjects (84.8) compared with controls (83.5). For reported maternal behavioral responses to crying generally, responses to unsoothable crying, and for self-talk responses, mean scores for intervention mothers were similar to those for controls. For the behaviors of information sharing, more intervention mothers reported sharing information about walking away if frustrated and the dangers of shaking, but there was little difference in sharing information about infant crying. Intervention mothers also reported increased infant distress. Use of the PURPLE education materials seems to lead to higher scores in knowledge about early infant crying and the dangers of shaking, and in sharing of information behaviors considered to be important for the prevention of shaking.

  8. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.

  9. pH measurement and a rational and practical pH control strategy for high throughput cell culture system.

    PubMed

    Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli

    2010-01-01

    The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and-6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3−, and the buffer system, i.e., HEPES, we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work shed light on mini-bioreactor scale down model construction and paved the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
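    In the spirit of the CO2/bicarbonate equilibrium model described above (but not the authors' calibrated model), the relationship between incubator %CO2, bicarbonate concentration, and pH can be sketched with a Henderson-Hasselbalch style calculation. The pKa, CO2 solubility, and vapour-pressure values below are typical textbook numbers at 37 °C, included only for illustration:

```python
# Minimal Henderson-Hasselbalch style sketch of the CO2/HCO3- buffer system.
# Constants are generic physiological values, not the paper's parameters.
import math

PKA = 6.1            # apparent pKa of the CO2/HCO3- system (assumed)
S_CO2 = 0.0308       # CO2 solubility, mM per mmHg at 37 degC (assumed)
P_ATM = 760.0        # total pressure, mmHg
P_H2O = 47.0         # water vapour pressure at 37 degC, mmHg

def ph_from_bicarb(hco3_mM, pct_co2):
    """Estimate medium pH from bicarbonate concentration and incubator %CO2."""
    p_co2 = pct_co2 / 100.0 * (P_ATM - P_H2O)
    return PKA + math.log10(hco3_mM / (S_CO2 * p_co2))

def bicarb_for_target_ph(target_ph, pct_co2):
    """Bicarbonate (mM) needed to reach a target pH at a given incubator %CO2."""
    p_co2 = pct_co2 / 100.0 * (P_ATM - P_H2O)
    return S_CO2 * p_co2 * 10.0 ** (target_ph - PKA)

print(ph_from_bicarb(24.0, 5.0))         # roughly pH 7.4 at 24 mM HCO3-, 5% CO2
print(bicarb_for_target_ph(7.0, 5.0))    # mM of HCO3- needed for pH 7.0
```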

  10. Mobile Phones and Social Media Empower the Citizen Seismologist

    NASA Astrophysics Data System (ADS)

    Bray, J.; Dashti, S.; Reilly, J.; Bayen, A. M.; Glaser, S. D.

    2014-12-01

    Emergency responders must "see" the effects of an earthquake clearly and rapidly for effective response. Mobile phone and information technology can be used to measure ground motion intensity parameters and relay that information to emergency responders. However, the phone sensor is an imperfect device and has a limited operational range. Thus, shake table tests were performed to evaluate their reliability as seismic monitoring instruments. Representative handheld devices, either rigidly connected to the table or free to move, measured shaking intensity parameters well. Bias in 5%-damped spectral accelerations measured by phones was less than 0.05 and 0.2 [log(g)] during one-dimensional (1-D) and three-dimensional (3-D) shaking in frequencies ranging from 1 Hz to 10 Hz. They did tend to over-estimate the Arias Intensity, but this error declined for stronger motions with larger signal-to-noise ratios. Additionally, much of the data about infrastructure performance and geotechnical effects of an earthquake are lost soon after an earthquake occurs as efforts move to the recovery phase. A better methodology for reliable and rapid collection of perishable hazards data will enhance scientific inquiry and accelerate the building of disaster-resilient cities. Post-earthquake reconnaissance efforts can be aided through the strategic collection and reuse of social media data and other remote sources of information. This is demonstrated through their use following the NSF-sponsored GEER response to the September 2013 flooding in Colorado. With these ubiquitous measurement devices in the hands of the citizen seismologist, a more accurate and rapid portrayal of the damage distribution during an earthquake may be provided to emergency responders and to the public.
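    For reference (not part of the study), the Arias intensity mentioned above is defined as Ia = (π / 2g) ∫ a(t)² dt, so a phone-recorded and a reference accelerogram can be compared with a few lines of code. A minimal sketch, assuming acceleration in m/s² and using synthetic stand-in records:

```python
# Minimal sketch: Arias intensity, Ia = (pi / 2g) * integral(a^2 dt), used here
# to compare a noisier "phone" record against a reference instrument record.
import numpy as np

G = 9.81  # m/s^2

def arias_intensity(acc, dt):
    """Arias intensity (m/s) of an acceleration time series given in m/s^2."""
    return np.pi / (2.0 * G) * np.sum(acc ** 2) * dt

# Synthetic stand-ins for a reference record and a noisier phone record.
dt = 0.005
ref = np.random.randn(6000) * 0.5
phone = ref + np.random.randn(6000) * 0.05
print(arias_intensity(ref, dt), arias_intensity(phone, dt))
```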

  11. ARC-2007-ACD07-0073-047

    NASA Image and Video Library

    2007-04-14

    Lunar CRater Observation and Sensing Satellite (LCROSS) and P.I. at NASA Ames Research Center - close up of Total Luminance Photometer: Metal shake table close up. Shows two units bolted on. The left one is the lens, sensor electronics and photometer sensor. The right is the digital electronics unit for the instrument. The two units, along with their cabling, are one of the LCROSS science instruments.

  12. Direct Lattice Shaking of Bose Condensates: Finite Momentum Superfluids

    DOE PAGES

    Anderson, Brandon M.; Clark, Logan W.; Crawford, J

    2017-05-31

    Here, we address band engineering in the presence of periodic driving by numerically shaking a lattice containing a bosonic condensate. By not restricting to simplified band structure models we are able to address arbitrary values of the shaking frequency, amplitude, and interaction strengths g. For "near-resonant" shaking frequencies with moderate g, a quantum phase transition to a finite momentum superfluid is obtained with Kibble-Zurek scaling and quantitative agreement with experiment. We use this successful calibration as a platform to support a more general investigation of the interplay between (one particle) Floquet theory and the effects associated with arbitrary g. Band crossings lead to superfluid destabilization, but where this occurs depends on g in a complicated fashion.

  13. Shake, Rattle and Roles: Lessons from Experimental Earthquake Engineering for Incorporating Remote Users in Large-Scale E-Science Experiments

    DTIC Science & Technology

    2007-01-01

    Only fragmentary index excerpts are available for this record. They cite "Mechanical Turk: Artificial Artificial Intelligence" (retrieved May 15, 2006, from http://www.mturk.com/mturk/welcome), describe the Amazon Mechanical Turk site as paying people to do Human Intelligence Tasks that are difficult for computers, and mention geographically distributed scientific collaboration and the use of videogame technology for training. Address: U.S. Army Research Institute, 2511 Jefferson

  14. Shaking video stabilization with content completion

    NASA Astrophysics Data System (ADS)

    Peng, Yi; Ye, Qixiang; Liu, Yanmei; Jiao, Jianbin

    2009-01-01

    A new stabilization algorithm to counterbalance shaking motion in a video, based on the classical Kanade-Lucas-Tomasi (KLT) method, is presented in this paper. Feature points are evaluated with the law of large numbers and a clustering algorithm to reduce the side effect of the moving foreground. An analysis of the change of motion direction is also carried out to detect the existence of shaking. For video clips with detected shaking, an affine transformation is performed to warp the current frame to the reference one. In addition, the missing content of a frame during stabilization is completed with optical flow analysis and a mosaicking operation. Experiments on video clips demonstrate the effectiveness of the proposed algorithm.
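    A minimal sketch of the warping step described above, using OpenCV's KLT tracker to follow feature points from a reference frame into the current frame and fitting an affine transform to undo the shake. This illustrates the general technique, not the authors' implementation; the foreground rejection, shake detection, and content completion steps are omitted, and the input file name is hypothetical.

```python
# Minimal KLT-based stabilization sketch (illustrative, not the paper's method):
# track corners from a reference frame into the current frame, fit an affine
# transform, and warp the current frame back onto the reference.
import cv2
import numpy as np

def stabilize_frame(ref_gray, cur_gray, cur_color):
    pts_ref = cv2.goodFeaturesToTrack(ref_gray, maxCorners=400,
                                      qualityLevel=0.01, minDistance=8)
    pts_cur, status, _ = cv2.calcOpticalFlowPyrLK(ref_gray, cur_gray,
                                                  pts_ref, None)
    good = status.ravel() == 1
    # A robust (RANSAC) affine fit discards points on moving foreground objects.
    M, _ = cv2.estimateAffinePartial2D(pts_cur[good], pts_ref[good],
                                       method=cv2.RANSAC)
    h, w = ref_gray.shape
    return cv2.warpAffine(cur_color, M, (w, h))

# Usage: read two frames from a shaky clip and stabilize the second one.
cap = cv2.VideoCapture("shaky.mp4")       # hypothetical input file
ok1, ref = cap.read()
ok2, cur = cap.read()
if ok1 and ok2:
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur, cv2.COLOR_BGR2GRAY)
    stabilized = stabilize_frame(ref_gray, cur_gray, cur)
    cv2.imwrite("stabilized.png", stabilized)
```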

  15. How well should probabilistic seismic hazard maps work?

    NASA Astrophysics Data System (ADS)

    Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.

    2016-12-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed that mapped is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. This implies that shaking in large earthquakes is typically greater than shown on hazard maps, as has occurred in a number of cases. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
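    A minimal sketch of the kind of verification exercise described above (an illustration, not the authors' simulation): exceedances of the mapped ground motion at many sites are drawn from a Poisson model, and the observed fraction of exceeded sites over t years is compared with p = 1 - exp(-t/T).

```python
# Minimal verification sketch: if exceedances of the mapped ground motion are
# Poissonian with return period T, the fraction of sites exceeded during an
# observation window t should scatter around p = 1 - exp(-t/T).
import numpy as np

rng = np.random.default_rng(0)
T = 475.0            # map return period, years
t = 50.0             # observation window, years
n_sites = 10_000

# Number of exceedances at each site in t years (rate 1/T per year).
exceedances = rng.poisson(t / T, size=n_sites)
observed_fraction = np.mean(exceedances > 0)
predicted = 1.0 - np.exp(-t / T)
print(f"observed {observed_fraction:.4f} vs predicted {predicted:.4f}")
```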

  16. The Sidebar Computer Program, a seismic-shaking intensity meter: users' manual and software description

    USGS Publications Warehouse

    Evans, John R.

    2003-01-01

    The SideBar computer program provides a visual display of seismic shaking intensity as recorded at one specific seismograph. This software allows a user to tap into the seismic data recorded on that specific seismograph and to display the overall level of shaking at the single location where that seismograph resides (usually the same place the user is). From this shaking level, SideBar also estimates the potential for damage nearby. SideBar cannot tell you the “Richter magnitude” of the earthquake (see box), only how hard the ground shook locally and this estimate of how much damage is likely in the neighborhood. This combination of local effects is called the “seismic intensity”. SideBar runs on a standard desktop or laptop PC, and is intended for the media, schools, emergency responders, and any other group hosting a seismograph and who want to know immediately after an earthquake the levels of shaking measured by that instrument. These local values can be used to inform the public and help initiate appropriate local emergency response activities in the minutes between the earthquake and availability of the broader coverage provided by the USGS over the Web, notably by ShakeMap. For example, for instruments installed in schools, the level of shaking and likely damage at the school could immediately be Web broadcast and parents could quickly determine the likely safety of their children—their biggest postearthquake concern. Also, in the event of a Web outage, SideBar may be a continuing primary source of local emergency response information for some additional minutes. Specifically, SideBar interprets the peak level of acceleration (that is, the force of shaking, as a percentage of the force of gravity) as well as the peak velocity, or highest speed, at which the ground moves. Using these two basic measurements, SideBar computes what is called Instrumental Intensity—a close approximation of the Modified Mercalli Intensity scale, or “MMI” (using the Wald et al., 1999a, relationships between acceleration, velocity, and shaking intensity). Intensity is a measure of local shaking strength and the potential for damage—of how bad the earthquake effects were locally. The intensity level is what SideBar displays most prominently on the PC monitor. Intensity is shown as a large, colored bar that gets taller and changes color up a rainbow from blues toward reds as the shaking level increases. As opposed to earthquake magnitudes, which are reported as decimal values (like “7.6”), intensity is traditionally given as a Roman numeral, with “I” to “X+” assigned to levels of potential damage and perceived shaking strength. For good measure, SideBar shows the actual values of the force of shaking (peak ground acceleration as a percentage of gravity) and the speed of ground motion (peak ground velocity in inches per second, by default, or in centimeters per second, if you wish), both these values as decimal numbers. SideBar also remembers the most recent earthquakes (for up to one week), and can store as many of these previous earthquakes as the user allows (and as the user’s PC has room for)—typically thousands. SideBar also remembers forever the three largest earthquakes it has seen and all earthquakes over intensity IV so that one never loses particularly important events.
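    A minimal sketch of the peak-motion-to-intensity conversion described above. The regression coefficients shown are the commonly quoted Wald et al. (1999) instrumental intensity relations for moderate-to-strong shaking; they are included here as assumptions for illustration and should be checked against the original relationships (and against SideBar itself) before use.

```python
# Illustrative sketch of an instrumental-intensity estimate from peak ground
# motion, in the spirit of SideBar. Coefficients are assumed (Wald et al. 1999
# style regressions for MMI >= V), not taken from the SideBar source code.
import math

def mmi_from_pga(pga_pct_g):
    """Instrumental intensity from peak acceleration given in percent of g."""
    pga_cms2 = pga_pct_g / 100.0 * 981.0     # convert %g to cm/s^2
    return 3.66 * math.log10(pga_cms2) - 1.66

def mmi_from_pgv(pgv_cms):
    """Instrumental intensity from peak velocity given in cm/s."""
    return 3.47 * math.log10(pgv_cms) + 2.35

ROMAN = ["I", "II", "III", "IV", "V", "VI", "VII", "VIII", "IX", "X"]

def roman_intensity(mmi):
    """Round to the nearest intensity level and report it as a Roman numeral."""
    return ROMAN[min(max(int(round(mmi)), 1), 10) - 1]

# Example: peak motions of 20 %g and 15 cm/s.
print(roman_intensity(mmi_from_pga(20.0)), roman_intensity(mmi_from_pgv(15.0)))
```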

  17. On-line prediction of the glucose concentration of CHO cell cultivations by NIR and Raman spectroscopy: Comparative scalability test with a shake flask model system.

    PubMed

    Kozma, Bence; Hirsch, Edit; Gergely, Szilveszter; Párta, László; Pataki, Hajnalka; Salgó, András

    2017-10-25

    In this study, near-infrared (NIR) and Raman spectroscopy were compared in parallel to predict the glucose concentration of Chinese hamster ovary cell cultivations. A shake flask model system was used to quickly generate spectra similar to bioreactor cultivations, therefore accelerating the development of a working model prior to actual cultivations. Automated variable selection and several pre-processing methods were tested iteratively during model development using spectra from six shake flask cultivations. The target was to achieve the lowest error of prediction for the glucose concentration in two independent shake flasks. The best model was then used to test the scalability of the two techniques by predicting spectra of a 10 L and a 100 L scale bioreactor cultivation. The NIR spectroscopy based model could follow the trend of the glucose concentration but it was not sufficiently accurate for bioreactor monitoring. On the other hand, the Raman spectroscopy based model predicted the concentration of glucose in both cultivation scales sufficiently accurately, with an error around 4 mM (0.72 g/L), which is satisfactory for the on-line bioreactor monitoring purposes of the biopharma industry. Therefore, the shake flask model system was proven to be suitable for scalable spectroscopic model development. Copyright © 2017 Elsevier B.V. All rights reserved.
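    As an illustration of the modelling workflow described above (not the authors' pipeline), a partial least squares regression can map preprocessed spectra to glucose concentration and be evaluated on held-out data. A minimal sketch using scikit-learn, with random arrays standing in for real NIR or Raman spectra:

```python
# Minimal sketch of a spectra-to-glucose calibration: standard-normal-variate
# style preprocessing followed by PLS regression. Synthetic arrays stand in
# for real shake flask / bioreactor spectra; this is not the authors' model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n_train, n_test, n_channels = 120, 30, 500
X_train = rng.normal(size=(n_train, n_channels))
y_train = rng.uniform(0.0, 30.0, size=n_train)        # glucose, mM
X_test = rng.normal(size=(n_test, n_channels))
y_test = rng.uniform(0.0, 30.0, size=n_test)

def snv(X):
    """Row-wise standard normal variate correction of the spectra."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

model = PLSRegression(n_components=5)
model.fit(snv(X_train), y_train)
pred = model.predict(snv(X_test)).ravel()
rmsep = mean_squared_error(y_test, pred) ** 0.5
print(f"RMSEP: {rmsep:.2f} mM")
```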

  18. Performance-Based Seismic Retrofit of Soft-Story Woodframe Buildings Using Energy-Dissipation Systems

    NASA Astrophysics Data System (ADS)

    Tian, Jingjing

    Low-rise woodframe buildings with disproportionately flexible ground stories represent a significant percentage of the building stock in seismically vulnerable communities in the Western United States. These structures have a readily identifiable structural weakness at the ground level due to an asymmetric distribution of large openings in the perimeter wall lines and to a lack of interior partition walls, resulting in a soft story condition that makes the structure highly susceptible to severe damage or collapse under design-level earthquakes. The conventional approach to retrofitting such structures is to increase the ground story stiffness. An alternate approach is to increase the energy dissipation capacity of the structure via the incorporation of supplemental energy dissipation devices (dampers), thereby relieving the energy dissipation demands on the framing system. Such a retrofit approach is consistent with a Performance-Based Seismic Retrofit (PBSR) philosophy through which multiple performance levels may be targeted. The effectiveness of such a retrofit is presented via examination of the seismic response of a full-scale four-story building that was tested on the outdoor shake table at NEES-UCSD and a full-scale three-story building that was tested using slow pseudo-dynamic hybrid testing at NEES-UB. In addition, a Direct Displacement Design (DDD) methodology was developed as an improvement over current DDD methods by considering torsion, with or without the implementation of damping devices, in an attempt to avoid the computational expense of nonlinear time-history analysis (NLTHA) and thus facilitating widespread application of PBSR in engineering practice.

  19. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    USGS Publications Warehouse

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and resulting losses are one realistic outcome, deliberately not a worst-case scenario, rather one worth preparing for and mitigating against. Decades of improving the life-safety requirements in building codes have greatly reduced the risk of death in earthquakes, yet southern California's economic and social systems are still vulnerable to large-scale disruptions. Because of this, the ShakeOut Scenario earthquake would dramatically alter the nature of the southern California community. Fortunately, steps can be taken now that can change that outcome and repay any costs many times over. The ShakeOut Scenario is the first public product of the USGS Multi-Hazards Demonstration Project, created to show how hazards science can increase a community's resiliency to natural disasters through improved planning, mitigation, and response.

  20. ShakeNet: a portable wireless sensor network for instrumenting large civil structures

    USGS Publications Warehouse

    Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert

    2015-08-03

    We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields like civil engineering and the study of earthquakes. The emergence of wireless sensor networks provides a promising means to such applications. However, while most wireless sensor networks are still in the experimentation stage, very few take into consideration the realistic earthquake engineering application requirements. To collect comprehensive data for structural health monitoring for civil engineers, high-resolution vibration sensors and sufficient sampling rates should be adopted, which makes it challenging for current wireless sensor network technology in the following ways: processing capabilities, storage limit, and communication bandwidth. The wireless sensor network has to meet expectations set by wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for instrumentation of large civil structures, especially for buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of the building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation, and helped to refine both hardware and software. 

  1. EXPERIMENTAL STUDY ON THE VIBRATION CHARACTERISTICS AND FAILURE MECHANISM OF KENCHI BLOCK MASONRY WALL BY SHAKING TABLE TEST

    NASA Astrophysics Data System (ADS)

    Ikemoto, Toshikazu; Mori, Masashi; Miyajima, Masakatsu; Hashimoto, Takao; Murata, Akira

    Kenchi block masonry walls have suffered extensive earthquake damage, so we carried out experimental studies on the collapse mechanism of kenchi block masonry walls during earthquakes. From the experimental data, i.e. acceleration response magnification, displacement, and soil pressure, the central part of the wall was found to fail under vibration, driven by subsidence of the embankment.

  2. A comparative theoretical study on core-hole excitation spectra of azafullerene and its derivatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Yunfeng; Gao, Bin

    2014-03-28

    The core-hole excitation spectra—near-edge x-ray absorption spectroscopy (NEXAFS), x-ray emission spectroscopy (XES), and x-ray photoelectron spectroscopy (XPS) shake-up satellites—have been simulated at the level of density functional theory for the azafullerene C59N and its derivatives (C59N)+, C59HN, (C59N)2, and C59N–C60, in which the XPS shake-up satellites were simulated using our developed equivalent core hole Kohn-Sham (ECH-KS) density functional theory approach [B. Gao, Z. Wu, and Y. Luo, J. Chem. Phys. 128, 234704 (2008)], which aims at the study of XPS shake-up satellites of large-scale molecules. Our calculated spectra are generally in good agreement with available experimental results, which validates the use of the ECH-KS method in the present work. The nitrogen K-edge NEXAFS, XES, and XPS shake-up satellite spectra in general can be used as fingerprints to distinguish the azafullerene C59N and its different derivatives. Meanwhile, different carbon K-edge spectra could also provide detailed information on the (local) electronic structures of different molecules. In particular, a peak (at around 284.5 eV) in the carbon K-edge NEXAFS spectrum of the heterodimer C59N–C60 is confirmed to be related to the electron transfer from the C59N part to the C60 part in this charge-transfer complex.

  3. Rapid exposure and loss estimates for the May 12, 2008 Mw 7.9 Wenchuan earthquake provided by the U.S. Geological Survey's PAGER system

    USGS Publications Warehouse

    Earle, P.S.; Wald, D.J.; Allen, T.I.; Jaiswal, K.S.; Porter, K.A.; Hearne, M.G.

    2008-01-01

    One half-hour after the May 12th Mw 7.9 Wenchuan, China earthquake, the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) system distributed an automatically generated alert stating that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater). It was immediately clear that a large-scale disaster had occurred. These alerts were widely distributed and referenced by the major media outlets and used by governments, scientific, and relief agencies to guide their responses. The PAGER alerts and Web pages included predictive ShakeMaps showing estimates of ground shaking, maps of population density, and a list of estimated intensities at impacted cities. Manual, revised alerts were issued in the following hours that included the dimensions of the fault rupture. Within a half-day, PAGER’s estimates of the population exposed to strong shaking levels stabilized at 5.2 million people. A coordinated research effort is underway to extend PAGER’s capability to include estimates of the number of casualties. We are pursuing loss models that will allow PAGER the flexibility to use detailed inventory and engineering results in regions where these data are available while also calculating loss estimates in regions where little is known about the type and strength of the built infrastructure. Prototype PAGER fatality estimates are currently implemented and can be manually triggered. In the hours following the Wenchuan earthquake, these models predicted fatalities in the tens of thousands.
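    As an illustration of the exposure calculation underlying such alerts (not the PAGER code itself), population exposure can be estimated by overlaying a grid of predicted shaking intensity with a co-registered population grid and summing population in each intensity band. A minimal numpy sketch with synthetic grids:

```python
# Minimal exposure sketch: count population exposed at each shaking intensity
# level by combining a ShakeMap-like MMI grid with a co-registered population
# grid. Both grids below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(2)
mmi = np.clip(rng.normal(6.0, 1.5, size=(200, 200)), 1.0, 10.0)   # intensity grid
population = rng.integers(0, 500, size=(200, 200))                # people per cell

def exposure_by_intensity(mmi, population):
    """Population exposed at each integer intensity level (rounded MMI)."""
    rounded = np.clip(np.rint(mmi).astype(int), 1, 10)
    return {level: int(population[rounded == level].sum()) for level in range(1, 11)}

exposure = exposure_by_intensity(mmi, population)
severe = sum(v for k, v in exposure.items() if k >= 8)
print(exposure)
print(f"Population exposed to MMI VIII or greater: {severe}")
```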

  4. Analysis on Two Typical Landslide Hazard Phenomena in The Wenchuan Earthquake by Field Investigations and Shaking Table Tests

    PubMed Central

    Yang, Changwei; Zhang, Jianjing; Liu, Feicheng; Bi, Junwei; Jun, Zhang

    2015-01-01

    Based on our field investigations of landslide hazards in the Wenchuan earthquake, some findings can be reported: (1) multi-aspect terrain with free faces, such as isolated mountains and thin ridges, reacted intensely to the earthquake and was seriously damaged; (2) the slope angles of most landslides were larger than 45°. To explain these disaster phenomena, the causes are analyzed based on shaking table tests of one-sided, two-sided and four-sided slopes. The analysis results show that: (1) the amplification of peak acceleration on four-sided slopes is stronger than on two-sided slopes, while that on one-sided slopes is the weakest, which indirectly explains why the damage to such terrain was most serious; (2) the amplification of peak acceleration gradually increases as the slope angle increases, with two inflection points at slope angles of 45° and 50°, which explains why landslide hazards mainly occurred on slopes steeper than 45°. The amplification along the slope strike direction is essentially uniform and varies smoothly. PMID:26258785

  5. Bioprocessing Data for the Production of Marine Enzymes

    PubMed Central

    Sarkar, Sreyashi; Pramanik, Arnab; Mitra, Anindita; Mukherjee, Joydeep

    2010-01-01

    This review is a synopsis of different bioprocess engineering approaches adopted for the production of marine enzymes. Three major modes of operation: batch, fed-batch and continuous have been used for production of enzymes (such as protease, chitinase, agarase, peroxidase) mainly from marine bacteria and fungi on a laboratory bioreactor and pilot plant scales. Submerged, immobilized and solid-state processes in batch mode were widely employed. The fed-batch process was also applied in several bioprocesses. Continuous processes with suspended cells as well as with immobilized cells have been used. Investigations in shake flasks were conducted with the prospect of large-scale processing in reactors. PMID:20479981

  6. Combination of High Rate, Real-Time GNSS and Accelerometer Observations and Rapid Seismic Event Notification for Earthquake Early Warning and Volcano Monitoring with a Focus on the Pacific Rim.

    NASA Astrophysics Data System (ADS)

    Zimakov, L. G.; Passmore, P.; Raczka, J.; Alvarez, M.; Jackson, M.

    2014-12-01

    Scientific GNSS networks are moving towards a model of real-time data acquisition, epoch-by-epoch storage integrity, and on-board real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 sps) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies, volcano monitoring, and critical infrastructure monitoring applications. Our presentation will focus on the characteristics of GNSS, seismic, and strong motion sensors in high dynamic environments, including historic earthquakes in Southern California and the Pacific Rim, replicated on a shake table, over a range of displacements and frequencies. We will explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios. In addition we will discuss implementation of a Rapid Seismic Event Notification System that provides quick delivery of digital data from seismic stations to the acquisition and processing center and a full data integrity model for real-time earthquake notification that provides warning prior to significant ground shaking.
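    One common way to merge the two data streams described above is a complementary filter: the accelerometer record is double-integrated and high-pass filtered to supply the high-frequency displacement, while GNSS supplies the low-frequency and static displacement. The sketch below is a simplified illustration of that idea under assumed sampling rates (10 Hz GNSS, 200 Hz accelerometer), not the presenters' processing chain:

```python
# Simplified complementary-filter sketch for combining GNSS displacement
# (low rate, drift-free) with accelerometer data (high rate, drifting after
# double integration). Illustrative only; all signals below are synthetic.
import numpy as np
from scipy import signal

fs_acc, fs_gnss, fc = 200.0, 10.0, 0.2        # sample rates (Hz), crossover (Hz)
step = int(fs_acc / fs_gnss)
t = np.arange(0.0, 30.0, 1.0 / fs_acc)

# Synthetic "true" displacement: slow drift plus 1 Hz shaking.
true_disp = 0.01 * t + 0.05 * np.sin(2 * np.pi * 1.0 * t)
acc = np.gradient(np.gradient(true_disp, t), t) + 0.02 * np.random.randn(t.size)
gnss = true_disp[::step] + 0.005 * np.random.randn(t[::step].size)

# High-frequency branch: double-integrate acceleration, then high-pass filter.
vel = np.cumsum(acc) / fs_acc
disp_acc = np.cumsum(vel) / fs_acc
b_hp, a_hp = signal.butter(2, fc / (fs_acc / 2.0), btype="highpass")
high = signal.filtfilt(b_hp, a_hp, disp_acc)

# Low-frequency branch: upsample GNSS to the accelerometer rate, then low-pass.
gnss_up = np.interp(t, t[::step], gnss)
b_lp, a_lp = signal.butter(2, fc / (fs_acc / 2.0), btype="lowpass")
low = signal.filtfilt(b_lp, a_lp, gnss_up)

combined = low + high
print(f"RMS error: {np.sqrt(np.mean((combined - true_disp) ** 2)):.4f} m")
```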

  7. Water-table and discharge changes associated with the 2016-2017 seismic sequence in central Italy: hydrogeological data and a conceptual model for fractured carbonate aquifers

    NASA Astrophysics Data System (ADS)

    Petitta, Marco; Mastrorillo, Lucia; Preziosi, Elisabetta; Banzato, Francesca; Barberio, Marino Domenico; Billi, Andrea; Cambi, Costanza; De Luca, Gaetano; Di Carlo, Giuseppe; Di Curzio, Diego; Di Salvo, Cristina; Nanni, Torquato; Palpacelli, Stefano; Rusi, Sergio; Saroli, Michele; Tallini, Marco; Tazioli, Alberto; Valigi, Daniela; Vivalda, Paola; Doglioni, Carlo

    2018-01-01

    A seismic sequence in central Italy from August 2016 to January 2017 affected groundwater dynamics in fractured carbonate aquifers. Changes in spring discharge, water-table position, and streamflow were recorded for several months following nine Mw 5.0-6.5 seismic events. Data from 22 measurement sites, located within 100 km of the epicentral zones, were analyzed. The intensity of the induced changes was correlated with seismic magnitude and distance to epicenters. The additional post-seismic discharge from rivers and springs was found to be higher than 9 m3/s, totaling more than 0.1 km3 of groundwater release over 6 months. This huge and unexpected contribution increased streamflow in narrow mountainous valleys to previously unmeasured peak values. Analogously to the L'Aquila 2009 post-earthquake phenomenon, these hydrogeological changes might reflect an increase of bulk hydraulic conductivity at the aquifer scale, which would increase hydraulic heads in the discharge zones and lower them in some recharge areas. The observed changes may also be partly due to other mechanisms, such as shaking and/or squeezing effects related to intense subsidence in the core of the affected area, where effects had maximum extent, or breaching of hydraulic barriers.

  8. Water-table and discharge changes associated with the 2016-2017 seismic sequence in central Italy: hydrogeological data and a conceptual model for fractured carbonate aquifers

    NASA Astrophysics Data System (ADS)

    Petitta, Marco; Mastrorillo, Lucia; Preziosi, Elisabetta; Banzato, Francesca; Barberio, Marino Domenico; Billi, Andrea; Cambi, Costanza; De Luca, Gaetano; Di Carlo, Giuseppe; Di Curzio, Diego; Di Salvo, Cristina; Nanni, Torquato; Palpacelli, Stefano; Rusi, Sergio; Saroli, Michele; Tallini, Marco; Tazioli, Alberto; Valigi, Daniela; Vivalda, Paola; Doglioni, Carlo

    2018-06-01

    A seismic sequence in central Italy from August 2016 to January 2017 affected groundwater dynamics in fractured carbonate aquifers. Changes in spring discharge, water-table position, and streamflow were recorded for several months following nine Mw 5.0-6.5 seismic events. Data from 22 measurement sites, located within 100 km of the epicentral zones, were analyzed. The intensity of the induced changes was correlated with seismic magnitude and distance to epicenters. The additional post-seismic discharge from rivers and springs was found to be higher than 9 m3/s, totaling more than 0.1 km3 of groundwater release over 6 months. This huge and unexpected contribution increased streamflow in narrow mountainous valleys to previously unmeasured peak values. Analogously to the L'Aquila 2009 post-earthquake phenomenon, these hydrogeological changes might reflect an increase of bulk hydraulic conductivity at the aquifer scale, which would increase hydraulic heads in the discharge zones and lower them in some recharge areas. The observed changes may also be partly due to other mechanisms, such as shaking and/or squeezing effects related to intense subsidence in the core of the affected area, where effects had maximum extent, or breaching of hydraulic barriers.

  9. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented becomes too wide to be useful. The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at facilitating rapid and proportionate earthquake response. For uncertainty representation, PAGER employs an Earthquake Impact Scale (EIS) that provides simple alerting thresholds, derived from systematic analyses of past earthquake impact and response levels. The alert levels are characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). We made a conscious attempt at both simple and intuitive color-coded alerting criteria; yet, we preserve the necessary uncertainty measures (with simple histograms) by which one can gauge the likelihood for the alert to be over- or underestimated. In these hazard and loss modeling examples, both products are widely used across a range of technical as well as general audiences. Ironically, ShakeMap uncertainties--rigorously reported and portrayed for the primarily scientific portion of the audience--are rarely employed and are routinely misunderstood; for PAGER, uncertainties aimed at a wider user audience seem to be more easily digested. We discuss how differences in the way these uncertainties are portrayed may play into their acceptance and uptake, or lack thereof.

  10. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of the Earth’s surface that the fault rupture and shaking will activate.

  11. Compact 3D Camera for Shake-the-Box Particle Tracking

    NASA Astrophysics Data System (ADS)

    Hesseling, Christina; Michaelis, Dirk; Schneiders, Jan

    2017-11-01

    Time-resolved 3D-particle tracking usually requires the time-consuming optical setup and calibration of 3 to 4 cameras. Here, a compact four-camera housing has been developed. The performance of the system using Shake-the-Box processing (Schanz et al. 2016) is characterized. It is shown that the stereo-base is large enough for sensible 3D velocity measurements. Results from successful experiments in water flows using LED illumination are presented. For large-scale wind tunnel measurements, an even more compact version of the system is mounted on a robotic arm. Once the system is calibrated for a specific measurement volume, no recalibration is needed even when it moves around. Co-axial illumination is provided through an optical fiber in the middle of the housing, illuminating the full measurement volume from one viewing direction. Helium-filled soap bubbles are used to ensure sufficient particle image intensity. This way, the measurement probe can be moved around complex 3D-objects. By automatic scanning and stitching of recorded particle tracks, the detailed time-averaged flow field of a full volume, cubic meters in size, is recorded and processed. Results from an experiment at TU-Delft of the flow field around a cyclist are shown.

  12. CENTRIFUGAL VIBRATION TEST OF RC PILE FOUNDATION

    NASA Astrophysics Data System (ADS)

    Higuchi, Shunichi; Tsutsumiuchi, Takahiro; Otsuka, Rinna; Ito, Koji; Ejiri, Joji

    Nonlinear responses of structures must be clarified through soil-structure interaction analysis in order to evaluate the seismic performance of underground and foundation structures. In this research, centrifuge shake table tests of a reinforced concrete pile foundation installed in the liquefied ground were conducted. Finite element analyses of the tests were then carried out to confirm the applicability of the analytical method by comparing the experimental and analytical results.

  13. Characterising large scenario earthquakes and their influence on NDSHA maps

    NASA Astrophysics Data System (ADS)

    Magrin, Andrea; Peresan, Antonella; Panza, Giuliano F.

    2016-04-01

    The neo-deterministic approach to seismic zoning, NDSHA, relies on physically sound modelling of ground shaking from a large set of credible scenario earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g. morphostructural features and present day deformation processes identified by Earth observations). NDSHA is based on the calculation of complete synthetic seismograms; hence it does not make use of empirical attenuation models (i.e. ground motion prediction equations). From the set of synthetic seismograms, maps of seismic hazard that describe the maximum of different ground shaking parameters at the bedrock can be produced. As a rule, NDSHA defines the hazard as the envelope of ground shaking at the site, computed from all of the defined seismic sources; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated to each site. In this way, the standard NDSHA maps account for the largest observed or credible earthquake sources identified in the region in a quite straightforward manner. This study aims to assess the influence of unavoidable uncertainties in the characterisation of large scenario earthquakes on the NDSHA estimates. The treatment of uncertainties is performed by sensitivity analyses for key modelling parameters and accounts for the uncertainty in the prediction of fault radiation and in the use of Green's functions for a given medium. Results from sensitivity analyses with respect to the definition of possible seismic sources are discussed. A key parameter is the magnitude of the seismic sources used in the simulation, which is based on information from earthquake catalogues, seismogenic zones and seismogenic nodes. The largest part of the existing Italian catalogues is based on macroseismic intensities; a rough estimate of the error in peak values of ground motion can therefore be a factor of two, intrinsic in MCS and other discrete scales. A simple test supports this hypothesis: an increase of 0.5 in the magnitude, i.e. one degree in epicentral MCS, of all sources used in the national-scale seismic zoning produces a doubling of the maximum ground motion. The analysis of uncertainty in ground motion maps, due to random catalogue errors in magnitude and localization, shows a non-uniform distribution of ground shaking uncertainty. The available information from catalogues of past events, which is incomplete and may well not be representative of future earthquakes, can be substantially supplemented using independent indicators of the seismogenic potential of a given area, such as active faulting data and the seismogenic nodes.

  14. The ShakeOut earthquake source and ground motion simulations

    USGS Publications Warehouse

    Graves, R.W.; Houston, Douglas B.; Hudnut, K.W.

    2011-01-01

    The ShakeOut Scenario is premised upon the detailed description of a hypothetical Mw 7.8 earthquake on the southern San Andreas Fault and the associated simulated ground motions. The main features of the scenario, such as its endpoints, magnitude, and gross slip distribution, were defined through expert opinion and incorporated information from many previous studies. Slip at smaller length scales, rupture speed, and rise time were constrained using empirical relationships and experience gained from previous strong-motion modeling. Using this rupture description and a 3-D model of the crust, broadband ground motions were computed over a large region of Southern California. The largest simulated peak ground acceleration (PGA) and peak ground velocity (PGV) generally range from 0.5 to 1.0 g and 100 to 250 cm/s, respectively, with the waveforms exhibiting strong directivity and basin effects. Use of a slip-predictable model results in a high static stress drop event and produces ground motions somewhat higher than median level predictions from NGA ground motion prediction equations (GMPEs).

  15. U.S. Geological Survey's ShakeCast: A cloud-based future

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Turner, Loren; Bekiri, Nebi

    2014-01-01

    When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap portrays the extent of potentially damaging shaking. In turn, the ShakeCast system, a freely-available, post-earthquake situational awareness application, automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. ShakeCast is particularly suitable for earthquake planning and response purposes by Departments of Transportation (DOTs), critical facility and lifeline utilities, large businesses, engineering and financial services, and loss and risk modelers. Recent important developments to the ShakeCast system and its user base are described. The newly-released Version 3 of the ShakeCast system encompasses advancements in seismology, earthquake engineering, and information technology applicable to the legacy ShakeCast installation (Version 2). In particular, this upgrade includes a full statistical fragility analysis framework for general assessment of structures as part of the near real-time system, direct access to additional earthquake-specific USGS products besides ShakeMap (PAGER, DYFI?, tectonic summary, etc.), significant improvements in the graphical user interface, including a console view for operations centers, and custom, user-defined hazard and loss modules. The release also introduces a new adaptation option to port ShakeCast to the "cloud". Employing Amazon Web Services (AWS), users now have a low-cost alternative to local hosting, by fully offloading hardware, software, and communication obligations to the cloud. Other advantages of the "ShakeCast Cloud" strategy include (1) Reliability and robustness of offsite operations, (2) Scalability, naturally accommodated, (3) Serviceability, problems reduced due to software and hardware uniformity, (4) Testability, freely available for new users, (5) Remote support, allowing expert-facilitated maintenance, (6) Adoptability, simplified with disk images, and (7) Security, built in at the very high level associated with AWS. The ShakeCast user base continues to expand and broaden. For example, Caltrans, the prototypical ShakeCast user and development supporter, has been providing guidance to other DOTs on the use of the National Bridge Inventory (NBI) database to implement fully-functional ShakeCast systems in their states. A long-term goal underway is to further "connect the DOTs" via a Transportation Pooled Fund (TPF) with participating state DOTs. We also review some of the many other users and uses of ShakeCast. Lastly, on the hazard input front, we detail related ShakeMap improvements and ongoing advancements in estimating the likelihood of shaking-induced secondary hazards at structures, facilities, bridges, and along roadways due to landslides and liquefaction, as implemented within the ShakeCast framework.

  16. The dependence of PGA and PGV on distance and magnitude inferred from Northern California ShakeMap data

    USGS Publications Warehouse

    Boatwright, J.; Bundock, H.; Luetgert, J.; Seekins, L.; Gee, L.; Lombard, P.

    2003-01-01

    We analyze peak ground velocity (PGV) and peak ground acceleration (PGA) data from 95 moderate (3.5 ≤ M < 5.5) and large (M ≥ 5.5) Northern California earthquakes. At distances greater than 100 km, the peak motions attenuate more rapidly than a simple power law (that is, r^-γ) can fit. Instead, we use an attenuation function that combines a fixed power law (r^-0.7) with a fitted exponential dependence on distance, which is estimated as exp(-0.0063r) and exp(-0.0073r) for PGV and PGA, respectively, for moderate earthquakes. We regress log(PGV) and log(PGA) as functions of distance and magnitude. We assume that the scaling of log(PGV) and log(PGA) with magnitude can differ for moderate and large earthquakes, but must be continuous. Because the frequencies that carry PGV and PGA can vary with earthquake size for large earthquakes, the regression for large earthquakes incorporates a magnitude dependence in the exponential attenuation function. We fix the scaling break between moderate and large earthquakes at M 5.5; log(PGV) and log(PGA) scale as 1.06M and 1.00M, respectively, for moderate earthquakes and 0.58M and 0.31M for large earthquakes.
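
    The sketch below evaluates only the moderate-magnitude branch of the functional form quoted above (fixed r^-0.7 power law times the fitted exponential, with magnitude scaling of 1.06M for PGV and 1.00M for PGA); the constant term c0 is not given in the abstract and is a hypothetical placeholder, and the magnitude-dependent exponential term for large earthquakes is omitted.

```python
import numpy as np

# Sketch of the moderate-magnitude (M < 5.5) branch of the attenuation form
# described above. The constant terms c0 are placeholders (not given in the
# abstract); the large-earthquake branch is omitted.

def log_pgv_moderate(M, r_km, c0=0.0):
    """log10(PGV): 1.06*M scaling, r^-0.7 power law, exp(-0.0063r) term."""
    return c0 + 1.06 * M - 0.7 * np.log10(r_km) - 0.0063 * r_km / np.log(10)

def log_pga_moderate(M, r_km, c0=0.0):
    """log10(PGA): 1.00*M scaling, r^-0.7 power law, exp(-0.0073r) term."""
    return c0 + 1.00 * M - 0.7 * np.log10(r_km) - 0.0073 * r_km / np.log(10)

# Relative comparison only (absolute levels need the true constants):
# how much log10(PGV) drops between 10 km and 100 km for an M 4.5 event.
print(log_pgv_moderate(4.5, 10.0) - log_pgv_moderate(4.5, 100.0))
```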

  17. Strong ground motion in Port-au-Prince, Haiti, during the M7.0 12 January 2010 Haiti earthquake

    USGS Publications Warehouse

    Hough, Susan E; Given, Doug; Taniguchi, Tomoyo; Altidor, J.R.; Anglade, Dieuseul; Mildor, S-L.

    2011-01-01

    No strong motion records are available for the 12 January 2010 M7.0 Haiti earthquake. We use aftershock recordings as well as detailed considerations of damage to estimate the severity and distribution of mainshock shaking in Port-au-Prince. Relative to ground motions at a hard-rock reference site, peak accelerations are amplified by a factor of approximately 2 at sites on low-lying deposits in central Port-au-Prince and by a factor of 2.5-3.5 on a steep foothill ridge in the southern Port-au-Prince metropolitan region. The observed amplification along the ridge cannot be explained by sediment-induced amplification, but is consistent with predicted topographic amplification by a steep, narrow ridge. Although damage was largely a consequence of poor construction, the damage pattern inferred from analysis of remote sensing imagery provides evidence for a correspondence between small-scale (0.1-1.0 km) topographic relief and high damage. Mainshock shaking intensity can be estimated crudely from a consideration of macroseismic effects. We further present detailed, quantitative analysis of the marks left on a tile floor by an industrial battery rack displaced during the mainshock, at the location where we observed the highest weak motion amplifications. Results of this analysis indicate that mainshock shaking was significantly higher at this location (~0.5 g, MMI VIII) relative to the shaking in parts of Port-au-Prince that experienced light damage. Our results further illustrate how observations of rigid body horizontal displacement during earthquakes can be used to estimate peak ground accelerations in the absence of instrumental data.

  18. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    PubMed Central

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

    Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was with standard statistics based on the null hypothesis, and separation test for early phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Results of statistical analysis using the separation test and intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  19. Specific Signature of Seismic Shaking in Landslide Inventories: Case of the Chichi Earthquake

    NASA Astrophysics Data System (ADS)

    Meunier, P.; Rault, C.; Marc, O.; Hovius, N.

    2017-12-01

    The 1999 Chichi earthquake triggered 10 000 landslides in its epicentral area. In addition to coseismic landsliding, directly induced by the shaking, the hillslope response extended over several years after the main shock, during which landslide susceptibility remained higher than during the pre-seismic period. We attribute this elevated rate to weakening effects caused by the shaking. The characteristics of the coseismic landslide catalogues (clustering, slope and azimuth distribution) bear the signature of the seismic triggering. Extended landslide mapping (1994-2004) allows us to track changes in these signatures in order to better interpret them. We present a summary of the change of these signatures through time and space. At the scale of the epicentral area, we show that coseismic landslide clustering did clearly occur along the fault where the shaking is strong. In 3 sub-catchments of the Choshui river, a finer analysis of the landslide time series reveals a mixed signature of both geology and shaking. Pre-quake rain-induced landslides preferentially occurred downslope and along the bedding planes, while coseismic landslides are located higher in the landscape, on slopes strongly affected by site effects. However, during the post-seismic period, the signature of the shaking is not present while the landslide rate remains high, suggesting that weakening effects were homogeneously distributed in the landscape.

  20. Specific signature of seismic shaking in landslide catalogues: Case of the Chichi earthquake

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Rault, Claire; Marc, Odin; Hovius, Niels

    2017-04-01

    The 1999 Chichi earthquake triggered 10 000 landslides in its epicentral area. In addition to coseismic landsliding, directly induced by the shaking, the hillslope response extended over several years after the main shock, during which landslide susceptibility remained higher than during the pre-seismic period. We attribute this elevated rate to weakening effects caused by the shaking. The characteristics of the coseismic landslide catalogues (clustering, slope and azimuth distribution) bear the signature of the seismic triggering. Extended landslide mapping (1994-2004) allows us to track changes in these signatures in order to better interpret them. We present a summary of the change of these signatures through time and space. At the scale of the epicentral area, we show that coseismic landslide clustering did clearly occur along the fault where the shaking is strong. In 3 sub-catchments of the Choshui river, a finer analysis of the landslide time series reveals a mixed signature of both geology and shaking. Pre-quake rain-induced landslides preferentially occurred downslope and along the bedding planes, while coseismic landslides are located higher in the landscape, on slopes strongly affected by site effects. However, during the post-seismic period, the signature of the shaking is not present while the landslide rate remains high, suggesting that weakening effects were homogeneously distributed in the landscape.

  1. Experimental study on control performance of tuned liquid column dampers considering different excitation directions

    NASA Astrophysics Data System (ADS)

    Altunişik, Ahmet Can; Yetişken, Ali; Kahya, Volkan

    2018-03-01

    This paper presents experimental results on the control performance of Tuned Liquid Column Dampers (TLCDs) installed on a prototype structure exposed to ground motions from different directions. The prototype structure designed in the laboratory consists of top and bottom plates with four columns. Finite element analyses and ambient vibration tests are first performed to extract the natural frequencies and mode shapes of the structure. Then, the damping ratio of the structure, as well as the resonant frequency, head-loss coefficient, damping ratio, and water height-frequency diagram of the designed TLCD, are obtained experimentally by the shaking table tests. To investigate the effect of TLCDs on the structural response, the prototype structure-TLCD coupled system is considered later, and its natural frequencies and related mode shapes are obtained numerically. The acceleration and displacement time-histories are obtained by the shaking table tests to evaluate its damping ratio. To consider different excitation directions, the measurements are repeated for directions between 0° and 90° with 15° increments. It can be concluded from the study that the TLCD decreases the resonant frequency of the structure by increasing the total mass. The damping ratio increases considerably when the TLCD is installed on the structure. This is more pronounced for the angles of 0°, 15°, 30° and 45°.
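
    As a small worked example of the tuning step implied above, the sketch below uses the classical TLCD natural-frequency relation f = (1/2π)√(2g/L) to pick the liquid-column length for a target structural frequency; the relation is a standard textbook result, and the target frequency is a hypothetical value, not a parameter reported from these tests.

```python
import math

# Minimal TLCD tuning sketch, assuming the classical natural-frequency
# relation f = (1/2*pi)*sqrt(2g/L) for a liquid column of total length L.
# The target frequency below is hypothetical.

g = 9.81  # m/s^2

def tlcd_frequency(column_length_m: float) -> float:
    """Natural frequency (Hz) of a tuned liquid column damper."""
    return math.sqrt(2.0 * g / column_length_m) / (2.0 * math.pi)

def column_length_for(target_hz: float) -> float:
    """Total liquid-column length (m) that tunes the TLCD to target_hz."""
    return 2.0 * g / (2.0 * math.pi * target_hz) ** 2

f_structure = 2.0  # Hz, hypothetical first-mode frequency of the frame
print(column_length_for(f_structure))  # ~0.124 m of liquid column
```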

  2. Wing walls for enhancing the seismic performance of reinforced concrete frame structures

    NASA Astrophysics Data System (ADS)

    Yang, Weisong; Guo, Xun; Xu, Weixiao; Yuan, Xin

    2016-06-01

    A building retrofitted with wing walls in the bottom story, which was damaged during the 2008 M8.0 Wenchuan earthquake in China, is introduced, and a corresponding 1/4-scale wing wall-frame model was subjected to shake table motions to study the seismic behavior of this retrofitted structural system. The results show that wing walls can effectively protect columns from damage by moving the areas that bear reciprocating tension and compression to the sections of the wing walls, thus achieving an extra measure of seismic fortification. A `strong column-weak beam' mechanism was realized, the flexural rigidity of the vertical member was strengthened, and a more uniform distribution of deformation among all the stories was measured. In addition, the joint between the wing walls and the beams suffered severe damage during the tests, due to an area of local stress concentration. A longer region of closely spaced stirrups is suggested at the ends of the beams.

  3. Optimal Scaling of Aftershock Zones using Ground Motion Forecasts

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.

    2018-02-01

    The spatial distribution of aftershocks following major earthquakes has received significant attention due to the shaking hazard these events pose for structures and populations in the affected region. Forecasting the spatial distribution of aftershock events is an important part of the estimation of future seismic hazard. A simple spatial shape for the zone of activity has often been assumed in the form of an ellipse having semimajor axis to semiminor axis ratio of 2.0. However, since an important application of these calculations is the estimation of ground shaking hazard, an effective criterion for forecasting future aftershock impacts is to use ground motion prediction equations (GMPEs) in addition to the more usual approach of using epicentral or hypocentral locations. Based on these ideas, we present an aftershock model that uses self-similarity and scaling relations to constrain parameters as an option for such hazard assessment. We fit the spatial aspect ratio to previous earthquake sequences in the studied regions, and demonstrate the effect of the fitting on the likelihood of post-disaster ground motion forecasts for eighteen recent large earthquakes. We find that the forecasts in most geographic regions studied benefit from this optimization technique, while some are better suited to the use of the a priori aspect ratio.
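
    A minimal geometric sketch of the elliptical aftershock zone discussed above is given below: it converts a zone area and a semimajor-to-semiminor aspect ratio into axis lengths and tests which epicenters fall inside the rotated ellipse. The area, strike, and epicenters are hypothetical, and the fitting of the aspect ratio to past sequences (and the GMPE-based forecasting itself) is not reproduced here.

```python
import numpy as np

# Elliptical aftershock-zone geometry sketch: given a zone area, an aspect
# ratio q (semimajor/semiminor), and a strike angle, test which epicenters
# fall inside the ellipse. All numerical inputs below are hypothetical.

def inside_ellipse(xy_km, center_km, area_km2, aspect_ratio, strike_deg):
    a = np.sqrt(area_km2 * aspect_ratio / np.pi)    # semimajor axis
    b = np.sqrt(area_km2 / (np.pi * aspect_ratio))  # semiminor axis
    theta = np.radians(strike_deg)
    d = np.asarray(xy_km) - np.asarray(center_km)
    # rotate offsets into the ellipse frame (x' along strike)
    xp = d[:, 0] * np.cos(theta) + d[:, 1] * np.sin(theta)
    yp = -d[:, 0] * np.sin(theta) + d[:, 1] * np.cos(theta)
    return (xp / a) ** 2 + (yp / b) ** 2 <= 1.0

epicenters = np.array([[5.0, 2.0], [30.0, 1.0], [0.0, 12.0]])
print(inside_ellipse(epicenters, center_km=(0, 0), area_km2=600.0,
                     aspect_ratio=2.0, strike_deg=0.0))
```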

  4. Seismic shaking in the North China Basin expected from ruptures of a possible seismic gap

    NASA Astrophysics Data System (ADS)

    Duan, Benchun; Liu, Dunyu; Yin, An

    2017-05-01

    A 160 km long seismic gap, which has not ruptured in over 8000 years, was identified recently in North China. In this study, we use a dynamic source model and a newly available high-resolution 3-D velocity structure to simulate long-period ground motion (up to 0.5 Hz) from possible worst-case rupture scenarios of the seismic gap. We find that the characteristics of the earthquake source and the local geologic structure play a critical role in controlling the amplitude and distribution of the simulated strong ground shaking. Rupture directivity and slip asperities can result in large-amplitude (i.e., >1 m/s) ground shaking near the fault, whereas long-duration shaking may occur within sedimentary basins. In particular, a deep and closed Quaternary basin between Beijing and Tianjin can lead to ground shaking of several tens of cm/s for more than 1 min. These results may provide a sound basis for seismic mitigation in one of the most populated regions in the world.

  5. Nonlinear attenuation of S-waves and Love waves within ambient rock

    NASA Astrophysics Data System (ADS)

    Sleep, Norman H.; Erickson, Brittany A.

    2014-04-01

    We obtain scaling relationships for nonlinear attenuation of S-waves and Love waves within sedimentary basins to assist numerical modeling. These relationships constrain the past peak ground velocity (PGV) of strong 3-4 s Love waves from San Andreas events within Greater Los Angeles, as well as the maximum PGV of future waves that can propagate without strong nonlinear attenuation. During each event, the shaking episode cracks the stiff, shallow rock. Over multiple events, this repeated damage in the upper few hundred meters leads to self-organization of the shear modulus. Dynamic strain is PGV divided by phase velocity, and dynamic stress is strain times the shear modulus. The frictional yield stress is proportional to depth times the effective coefficient of friction. At the eventual quasi-steady self-organized state, the shear modulus increases linearly with depth, allowing inference of past typical PGV where rock over the damaged depth range barely reaches frictional failure. Still greater future PGV would cause frictional failure throughout the damaged zone, nonlinearly attenuating the wave. Assuming self-organization has taken place, the estimated maximum past PGV within the Greater Los Angeles basins is 0.4-2.6 m s-1. The upper part of this range includes regions of accumulating sediments with low S-wave velocity that may have not yet compacted, rather than having been damaged by strong shaking. Published numerical models indicate that strong Love waves from the San Andreas Fault pass through Whittier Narrows. Within this corridor, deep drawdown of the water table from its currently shallow and preindustrial levels would nearly double the PGV of Love waves reaching Downtown Los Angeles.
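
    The sketch below restates the yield criterion implied by the abstract: dynamic stress (shear modulus times PGV over phase velocity) is compared against a frictional yield stress proportional to depth. The density, friction coefficient, phase velocity, and modulus gradient are assumed values for illustration only, not parameters from the study.

```python
import numpy as np

# Yield-criterion sketch: a Love wave with peak ground velocity PGV and phase
# velocity c imposes dynamic strain PGV/c and dynamic stress G(z)*PGV/c; the
# frictional yield stress is roughly mu_eff*rho*g*z. All values are assumed.

rho = 2000.0     # kg/m^3, assumed shallow-rock density
g = 9.8          # m/s^2
mu_eff = 0.6     # assumed effective coefficient of friction
c = 600.0        # m/s, assumed Love-wave phase velocity
dGdz = 20.0e6    # Pa/m, assumed linear shear-modulus gradient, G(z) = dGdz*z

def fails(pgv, z):
    """True where dynamic stress exceeds the frictional yield stress."""
    dynamic_stress = (dGdz * z) * pgv / c
    yield_stress = mu_eff * rho * g * z
    return dynamic_stress > yield_stress

# With G linear in depth, both sides scale with z, so failure depends only on
# PGV: the threshold is PGV_max = mu_eff*rho*g*c/dGdz.
pgv_max = mu_eff * rho * g * c / dGdz
print(pgv_max)                                  # ~0.35 m/s for these values
print(fails(0.5, np.array([50.0, 200.0])))      # 0.5 m/s exceeds it everywhere
```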

  6. SeismoGeodesy: Combination of High Rate, Real-time GNSS and Accelerometer Observations and Rapid Seismic Event Notification for Earthquake Early Warning and Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Zimakov, Leonid; Moessmer, Matthias

    2015-04-01

    Scientific GNSS networks are moving towards a model of real-time data acquisition, epoch-by-epoch storage integrity, and on-board real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 Hz) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies, volcano monitoring, and critical infrastructure monitoring applications. Our presentation will focus on the characteristics of GNSS, seismic, and strong motion sensors in high dynamic environments, including historic earthquakes replicated on a shake table over a range of displacements and frequencies. We will explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios. We will also explore the tradeoffs between various GNSS processing schemes including real-time precise point positioning (PPP) and real-time kinematic (RTK) as applied to seismogeodesy. In addition we will discuss implementation of a Rapid Seismic Event Notification System that provides quick delivery of digital data from seismic stations to the acquisition and processing center and a full data integrity model for real-time earthquake notification that provides warning prior to significant ground shaking.
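
    As an illustration of the sensor combination described above, the sketch below fuses high-rate acceleration with GNSS displacement using a simple complementary filter: the integrated acceleration supplies the high-frequency motion while the drift-free GNSS solution anchors the low frequencies. The sample rate, blending factor, and synthetic signals are assumptions; operational seismogeodetic combinations are typically formulated as Kalman filters rather than this simplified scheme.

```python
import numpy as np

# Complementary-filter sketch for combining drift-free but noisier GNSS
# displacement with high-rate accelerometer data. All parameters are
# illustrative assumptions.

dt = 1.0 / 200.0   # 200 Hz combined output, as mentioned above
alpha = 0.995      # blending factor: trust the integrated accel. at high freq.

def fuse(acc, gnss_disp):
    """acc: 200 Hz acceleration (m/s^2); gnss_disp: GNSS displacement (m)
    resampled to 200 Hz. Returns a fused 200 Hz displacement estimate."""
    vel, disp = 0.0, 0.0
    fused = np.empty_like(acc)
    for k in range(len(acc)):
        vel += acc[k] * dt                                 # integrate accel.
        pred = disp + vel * dt                             # predicted disp.
        disp = alpha * pred + (1 - alpha) * gnss_disp[k]   # pull toward GNSS
        fused[k] = disp
    return fused

# Synthetic check: a 1 Hz, 1 cm harmonic motion and its exact acceleration.
t = np.arange(0.0, 10.0, dt)
gnss = 0.01 * np.sin(2 * np.pi * 1.0 * t)
acc = -(2 * np.pi * 1.0) ** 2 * 0.01 * np.sin(2 * np.pi * 1.0 * t)
print(np.max(np.abs(fuse(acc, gnss) - gnss)))
```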

  7. Methodology for enabling high-throughput simultaneous saccharification and fermentation screening of yeast using solid biomass as a substrate.

    PubMed

    Elliston, Adam; Wood, Ian P; Soucouri, Marie J; Tantale, Rachelle J; Dicks, Jo; Roberts, Ian N; Waldron, Keith W

    2015-01-01

    High-throughput (HTP) screening is becoming an increasingly useful tool for collating biological data which would otherwise require the employment of excessive resources. Second generation biofuel production is one such process. HTP screening allows the investigation of large sample sets to be undertaken with increased speed and cost effectiveness. This paper outlines a methodology that will enable solid lignocellulosic substrates to be hydrolyzed and fermented at a 96-well plate scale, facilitating HTP screening of ethanol production, whilst maintaining repeatability similar to that achieved at a larger scale. The results showed that utilizing sheets of biomass of consistent density (handbills), for paper, and slurries of pretreated biomass that could be pipetted allowed standardized and accurate transfers to 96-well plates to be achieved (±3.1 and 1.7%, respectively). Processing these substrates by simultaneous saccharification and fermentation (SSF) at various volumes showed no significant difference in final ethanol yields, whether at standard shake flask (200 mL), universal bottle (10 mL) or 96-well plate (1 mL) scales. Substrate concentrations of up to 10% (w/v) were trialed successfully for SSFs at 1 mL volume. The methodology was successfully tested by showing the effects of steam explosion pretreatment on both oilseed rape and wheat straws. This methodology could be used to replace large shake flask reactions with comparatively fast 96-well plate SSF assays, allowing for HTP experimentation. Additionally, this method is compatible with a number of standardized assay techniques, such as simple colorimetric assays, high-performance liquid chromatography (HPLC) and nuclear magnetic resonance (NMR) spectroscopy. Furthermore, this research has practical uses in the biorefining of biomass substrates for second generation biofuels and novel biobased chemicals by allowing HTP SSF screening, which should allow selected samples to be scaled up or studied in more detail.

  8. The Relationship between Cranial Structure, Biomechanical Performance and Ecological Diversity in Varanoid Lizards

    PubMed Central

    McCurry, Matthew R.; Mahony, Michael; Clausen, Phillip D.; Quayle, Michelle R.; Walmsley, Christopher W.; Jessop, Tim S.; Wroe, Stephen; Richards, Heather; McHenry, Colin R.

    2015-01-01

    Skull structure is intimately associated with feeding ability in vertebrates, both in terms of specific performance measures and general ecological characteristics. This study quantitatively assessed variation in the shape of the cranium and mandible in varanoid lizards, and its relationship to structural performance (von Mises strain) and interspecific differences in feeding ecology. Geometric morphometric and linear morphometric analyses were used to evaluate morphological differences, and finite element analysis was used to quantify variation in structural performance (strain during simulated biting, shaking and pulling). These data were then integrated with ecological classes compiled from relevant scientific literature on each species in order to establish structure-function relationships. Finite element modelling results showed that variation in cranial morphology resulted in large differences in the magnitudes and locations of strain in biting, shaking and pulling load cases. Gracile species such as Varanus salvadorii displayed high strain levels during shaking, especially in the areas between the orbits. All models exhibit less strain during pull-back loading compared to shake loading, even though a larger force was applied (pull = 30 N, shake = 20 N). Relationships were identified between the morphology, performance, and ecology. Species that did not feed on hard prey clustered in the gracile region of cranial morphospace and exhibited significantly higher levels of strain during biting (P = 0.0106). Species that fed on large prey clustered in the elongate area of mandible morphospace. This relationship differs from those that have been identified in other taxonomic groups such as crocodiles and mammals. This difference may be due to a combination of the open ‘space-frame’ structure of the varanoid lizard skull, and the ‘pull back’ behaviour that some species use for processing large prey. PMID:26106889

  9. Optically-based Sensor System for Critical Nuclear Facilities Post-Event Seismic Structural Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCallen, David; Petrone, Floriana; Buckle, Ian

    The U.S. Department of Energy (DOE) has ownership and operational responsibility for a large enterprise of nuclear facilities that provide essential functions to DOE missions ranging from national security to discovery science and energy research. These facilities support a number of DOE programs and offices including the National Nuclear Security Administration, Office of Science, and Office of Environmental Management. With many unique and “one of a kind” functions, these facilities represent a tremendous national investment, and assuring their safety and integrity is fundamental to the success of a breadth of DOE programs. Many DOE critical facilities are located in regions with significant natural phenomenon hazards including major earthquakes and DOE has been a leader in developing standards for the seismic analysis of nuclear facilities. Attaining and sustaining excellence in nuclear facility design and management must be a core competency of the DOE. An important part of nuclear facility management is the ability to monitor facilities and rapidly assess the response and integrity of the facilities after any major upset event. Experience in the western U.S. has shown that understanding facility integrity after a major earthquake is a significant challenge which, lacking key data, can require extensive effort and significant time. In the work described in the attached report, a transformational approach to earthquake monitoring of facilities is described and demonstrated. An entirely new type of optically-based sensor that can directly and accurately measure the earthquake-induced deformations of a critical facility has been developed and tested. This report summarizes large-scale shake table testing of the sensor concept on a representative steel frame building structure, and provides quantitative data on the accuracy of the sensor measurements.

  10. Communication during copulation in the sex-role reversed wolf spider Allocosa brasiliensis: Female shakes for soliciting new ejaculations?

    PubMed

    Garcia Diaz, Virginia; Aisenberg, Anita; Peretti, Alfredo V

    2015-07-01

    Traditional studies on sexual communication have focused on the exchange of signals during courtship. However, communication between the sexes can also occur during or after copulation. Allocosa brasiliensis is a wolf spider that shows a reversal of typical sex roles and of the usual sexual size dimorphism expected for spiders. Females are smaller than males and they are the roving sex that initiates courtship. Occasional previous observations suggested that females performed body shaking behaviors during copulation. Our objective was to analyze whether female body shaking is associated with male copulatory behavior in A. brasiliensis, and to determine if this female behavior has a communicatory function in this species. For that purpose, we performed fine-scaled analysis of fifteen copulations under laboratory conditions. We video-recorded all the trials and looked for associations between female and male copulatory behaviors. The significant difference between the time intervals before and after female shaking, in favor of a subsequent ejaculation, is analyzed. We discuss whether shaking could be acting as a signal to accelerate and motivate palpal insertion and ejaculation, and/or inhibiting male cannibalistic tendencies in this species. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. On the selection of user-defined parameters in data-driven stochastic subspace identification

    NASA Astrophysics Data System (ADS)

    Priori, C.; De Angelis, M.; Betti, R.

    2018-02-01

    The paper focuses on the time domain output-only technique called Data-Driven Stochastic Subspace Identification (DD-SSI); in order to identify modal models (frequencies, damping ratios and mode shapes), the role of its user-defined parameters is studied, and rules to determine their minimum values are proposed. Such investigation is carried out using, first, the time histories of structural responses to stationary excitations, with a large number of samples, satisfying the hypothesis on the input imposed by DD-SSI. Then, the case of non-stationary seismic excitations with a reduced number of samples is considered. In this paper, partitions of the data matrix different from the one proposed in the SSI literature are investigated, together with the influence of different choices of the weighting matrices. The study is carried out considering two different applications: (1) data obtained from vibration tests on a scaled structure and (2) in-situ tests on a reinforced concrete building. Referring to the former, the identification of a steel frame structure tested on a shaking table is performed using its responses in terms of absolute accelerations to a stationary (white noise) base excitation and to non-stationary seismic excitations of low intensity. Black-box and modal models are identified in both cases and the results are compared with those from an input-output subspace technique. With regards to the latter, the identification of a complex hospital building is conducted using data obtained from ambient vibration tests.
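
    To make the role of the user-defined parameters concrete, the sketch below implements a compact covariance-driven variant of output-only subspace identification, in which the number of block rows i and the model order n are exactly the user choices at issue; note that the paper itself studies the data-driven formulation (DD-SSI), which works on weighted projections of the raw data Hankel matrix rather than on the output correlations assembled here.

```python
import numpy as np

# Covariance-driven subspace identification sketch (simplified stand-in for
# the data-driven formulation studied in the paper). User-defined parameters:
# i = number of block rows, n = model order.

def ssi_cov(y, dt, i=20, n=4):
    """y: (channels, samples) array of measured outputs.
    Returns natural frequencies (Hz) and damping ratios."""
    l, N = y.shape
    # output correlation matrices R_k for lags k = 0 .. 2i-1
    R = [y[:, k:] @ y[:, :N - k].T / (N - k) for k in range(2 * i)]
    # block Toeplitz matrix built from lags 1 .. 2i-1
    T = np.block([[R[i + p - q] for q in range(i)] for p in range(i)])
    U, s, _ = np.linalg.svd(T)
    O = U[:, :n] * np.sqrt(s[:n])                 # observability matrix
    A = np.linalg.pinv(O[:-l, :]) @ O[l:, :]      # discrete-time state matrix
    mu = np.log(np.linalg.eigvals(A).astype(complex)) / dt  # continuous poles
    return np.abs(mu) / (2 * np.pi), -mu.real / np.abs(mu)
```

    In either formulation, i bounds the longest identifiable period and n controls how many (possibly spurious) poles must later be screened, which is the kind of selection problem the paper examines.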

  12. An offline approach for output-only Bayesian identification of stochastic nonlinear systems using unscented Kalman filtering

    NASA Astrophysics Data System (ADS)

    Erazo, Kalil; Nagarajaiah, Satish

    2017-06-01

    In this paper an offline approach for output-only Bayesian identification of stochastic nonlinear systems is presented. The approach is based on a re-parameterization of the joint posterior distribution of the parameters that define a postulated state-space stochastic model class. In the re-parameterization the state predictive distribution is included, marginalized, and estimated recursively in a state estimation step using an unscented Kalman filter, bypassing state augmentation as required by existing online methods. In applications expectations of functions of the parameters are of interest, which requires the evaluation of potentially high-dimensional integrals; Markov chain Monte Carlo is adopted to sample the posterior distribution and estimate the expectations. The proposed approach is suitable for nonlinear systems subjected to non-stationary inputs whose realization is unknown, and that are modeled as stochastic processes. Numerical verification and experimental validation examples illustrate the effectiveness and advantages of the approach, including: (i) an increased numerical stability with respect to augmented-state unscented Kalman filtering, avoiding divergence of the estimates when the forcing input is unmeasured; (ii) the ability to handle arbitrary prior and posterior distributions. The experimental validation of the approach is conducted using data from a large-scale structure tested on a shake table. It is shown that the approach is robust to inherent modeling errors in the description of the system and forcing input, providing accurate prediction of the dynamic response when the excitation history is unknown.
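
    The core of the state-estimation step described above is the unscented transform; the sketch below shows a generic version of it (sigma-point generation, propagation through a nonlinearity, and recovery of the transformed mean and covariance) with conventional default parameters, not the specific model, settings, or re-parameterization used in the paper.

```python
import numpy as np

# Generic unscented transform: sigma points of a Gaussian (mean x, cov P) are
# propagated through a nonlinear function f, and the transformed mean and
# covariance are recovered from weighted averages. Parameter values are
# conventional defaults, not values from the paper.

def unscented_transform(f, x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)        # matrix square root
    sigma = np.vstack([x, x + S.T, x - S.T])     # 2n + 1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])          # propagate each sigma point
    y_mean = Wm @ Y
    d = Y - y_mean
    y_cov = (Wc[:, None] * d).T @ d
    return y_mean, y_cov

# Example: a 2-D state pushed through a mild nonlinearity.
x0 = np.array([0.1, -0.2])
P0 = np.diag([0.05, 0.02])
print(unscented_transform(lambda s: np.array([np.sin(s[0]), s[1]**2 + s[0]]),
                          x0, P0))
```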

  13. Study Gradation and Moisture Content of Sand Embankment on Peat Subjected Vibration Potential Liquefaction

    NASA Astrophysics Data System (ADS)

    Agus Nugroho, Soewignjo; Ika Putra, Agus; Yusa, Muhamad

    2018-03-01

    In recent years, large earthquakes have often occurred on the island of Sumatra. One phenomenon of damage that occurs during an earthquake is the loss of soil strength due to vibration, called liquefaction. Cases of liquefaction have occurred in areas of Aceh, Nias Island, Padang and Pariaman. Pekanbaru is located close to the fault area, which exposes it to earthquake wave propagation, and is therefore also at risk of earthquake-related geotechnical problems such as liquefaction. Liquefaction potential can be evaluated by in-situ tests and by laboratory tests; among the laboratory methods is the shaking table experiment. In this study, the liquefaction phenomenon was investigated by creating a laboratory-scale physical model using a one-way vibration machine, to examine the influence of sand gradation, grain shape and size, and water level in the sand on liquefaction potential. Liquefaction potential was evaluated from surface readings of the soil movement, the elapsed time to final settlement, and the excess pore water dissipation (EPD) during testing. Based on the results of the tests performed, fine sand under fully saturated conditions showed a maximum settlement of 20.67% and a maximum rise of pore water of 46.67%. This means that poorly graded fine sand under fully saturated conditions has greater liquefaction potential than medium sand, coarse sand, and well-graded sand.

  14. Seismic Rehabilitation of RC Frames by Using Steel Panels

    NASA Astrophysics Data System (ADS)

    Mowrtage, Waiel

    2008-07-01

    Every major earthquake in Turkey causes a large number of buildings to suffer moderate damage due to poor construction. If a proper and fast retrofit is not applied, the aftershocks, which may sometimes come days or weeks after the main shock, can push a moderately damaged building into major damage or even total collapse. This paper presents a practical retrofit method for moderately damaged buildings, which increases the seismic performance of the structural system by reducing the displacement demand. Fabricated steel panels are used for the retrofit. They are light-weight, easy to handle, and can be constructed very quickly. Moreover, they are cheap, and do not need formwork or skilled workers. They can be designed to compensate for the stiffness and strength degradation, and to fit easily inside a moderately damaged reinforced concrete frame. To test the concept, a half-scale, single-story 3D reinforced concrete frame specimen was constructed at the shake-table laboratories of the Kandilli Observatory and Earthquake Research Institute of Bogazici University, and subjected to recorded real earthquake base accelerations. The amplitudes of base accelerations were increased until a moderate damage level was reached. Then, the damaged RC frame was retrofitted by means of steel panels and tested under the same earthquake. The seismic performance of the specimen before and after the retrofit was evaluated using FEMA356 standards, and the results were compared in terms of stiffness, strength, and deformability. The results confirmed the effectiveness of the proposed retrofit scheme.

  15. Sloshing of a bubbly magma reservoir as a mechanism of triggered eruptions

    NASA Astrophysics Data System (ADS)

    Namiki, Atsuko; Rivalta, Eleonora; Woith, Heiko; Walter, Thomas R.

    2016-06-01

    Large earthquakes sometimes activate volcanoes both in the near field as well as in the far field. One possible explanation is that shaking may increase the mobility of the volcanic gases stored in magma reservoirs and conduits. Here we investigate experimentally and theoretically how sloshing, the oscillatory motion of fluids contained in a shaking tank, may affect the presence and stability of bubbles and foams, with important implications for magma conduits and reservoirs. We adopt this concept from engineering: severe earthquakes are known to induce sloshing and damage petroleum tanks. Sloshing occurs in a partially filled tank or a fully filled tank with density-stratified fluids. These conditions are met at open summit conduits or at sealed magma reservoirs where a bubbly magma layer overlays a newly injected denser magma layer. We conducted sloshing experiments by shaking a rectangular tank partially filled with liquids or bubbly fluids (foams), or fully filled with density-stratified fluids, i.e., a foam layer overlying a liquid layer. In experiments with foams, we find that foam collapse occurs for oscillations near the resonance frequency of the fluid layer. Low viscosity and large bubble size favor foam collapse during sloshing. In the layered case, the collapsed foam mixes with the underlying liquid layer. Based on scaling considerations, we constrain the conditions for the occurrence of foam collapse in natural magma reservoirs. We find that seismic waves with frequencies < 1 Hz, usually excited by large earthquakes, can resonate with magma reservoirs whose width is > 0.5 m. Strong ground motion > 0.1 m s-1 can excite sloshing with sufficient amplitude to collapse a magma foam in an open conduit or a foam overlying basaltic magma in a closed magma reservoir. The gas released from the collapsed foam may infiltrate the rock or diffuse through pores, enhancing heat transfer, or may generate a gas slug to cause a magmatic eruption. The overturn in the magma reservoir provides new nucleation sites which may help to prepare a following/delayed eruption. Mt. Fuji erupted both dacitic and basaltic magmas 49 days after the large Hoei earthquake (1707). The eruption might have been triggered by magma mixing through sloshing.
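
    The resonance argument above can be illustrated with the classical first-mode sloshing frequency of a rectangular reservoir of width L filled to depth h, ω² = (πg/L)·tanh(πh/L); the sketch below evaluates it for a narrow and a wide reservoir, with depth-to-width ratios assumed purely for illustration.

```python
import math

# First-mode sloshing frequency of a rectangular reservoir, using the
# classical relation omega^2 = (pi*g/L)*tanh(pi*h/L). The example geometries
# are assumptions for illustration.

g = 9.81  # m/s^2

def sloshing_frequency(width_m: float, depth_m: float) -> float:
    """First-mode sloshing frequency (Hz) of a rectangular reservoir."""
    k = math.pi / width_m
    return math.sqrt(g * k * math.tanh(k * depth_m)) / (2.0 * math.pi)

# A deeply filled reservoir only ~0.5 m wide already resonates near 1 Hz,
# consistent with excitation by long-period (< 1 Hz) seismic waves.
print(sloshing_frequency(0.5, 5.0))    # ~1.2 Hz
print(sloshing_frequency(10.0, 50.0))  # ~0.28 Hz
```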

  16. High resolution measurement of earthquake impacts on rock slope stability and damage using pre- and post-earthquake terrestrial laser scans

    NASA Astrophysics Data System (ADS)

    Hutchinson, Lauren; Stead, Doug; Rosser, Nick

    2017-04-01

    Understanding the behaviour of rock slopes in response to earthquake shaking is instrumental in response and relief efforts following large earthquakes as well as to ongoing risk management in earthquake affected areas. Assessment of the effects of seismic shaking on rock slope kinematics requires detailed surveys of the pre- and post-earthquake condition of the slope; however, at present, there is a lack of high resolution monitoring data from pre- and post-earthquake to facilitate characterization of seismically induced slope damage and validate models used to back-analyze rock slope behaviour during and following earthquake shaking. Therefore, there is a need for additional research where pre- and post- earthquake monitoring data is available. This paper presents the results of a direct comparison between terrestrial laser scans (TLS) collected in 2014, the year prior to the 2015 earthquake sequence, with that collected 18 months after the earthquakes and two monsoon cycles. The two datasets were collected using Riegl VZ-1000 and VZ-4000 full waveform laser scanners with high resolution (c. 0.1 m point spacing as a minimum). The scans cover the full landslide affected slope from the toe to the crest. The slope is located in Sindhupalchok District, Central Nepal which experienced some of the highest co-seismic and post-seismic landslide intensities across Nepal due to the proximity to the epicenters (<20 km) of both of the main aftershocks on April 26, 2015 (M 6.7) and May 12, 2015 (M7.3). During the 2015 earthquakes and subsequent 2015 and 2016 monsoons, the slope experienced rockfall and debris flows which are evident in satellite imagery and field photographs. Fracturing of the rock mass associated with the seismic shaking is also evident at scales not accessible through satellite and field observations. The results of change detection between the TLS datasets with an emphasis on quantification of seismically-induced slope damage is presented. Patterns in the distribution and expression of rock mass damage are also explored. The findings presented herein provide insight into the response of rock slopes to seismic shaking and highlight the application of remote sensing to understand slope behaviour.
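
    A minimal cloud-to-cloud comparison in the spirit of the change detection described above is sketched below: each post-earthquake point is flagged if its nearest pre-earthquake neighbor lies farther than a threshold. Real workflows add careful registration and more robust change operators (e.g., M3C2); the threshold and the stand-in point clouds here are arbitrary illustration values.

```python
import numpy as np
from scipy.spatial import cKDTree

# Nearest-neighbor change-detection sketch between two registered scans.
# Distances above a threshold are flagged as candidate change (e.g., rockfall
# scars or deposits). All numerical values are illustrative.

def flag_change(pre_xyz, post_xyz, threshold_m=0.1):
    """pre_xyz, post_xyz: (N, 3) arrays of registered scan coordinates."""
    tree = cKDTree(pre_xyz)
    dist, _ = tree.query(post_xyz, k=1)
    return dist > threshold_m, dist

# Hypothetical usage with random stand-in clouds:
pre = np.random.rand(10000, 3) * 50.0
post = pre + np.random.normal(scale=0.02, size=pre.shape)
changed, d = flag_change(pre, post)
print(changed.mean(), d.max())
```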

  17. Upper Mississippi embayment shallow seismic velocities measured in situ

    USGS Publications Warehouse

    Liu, Huaibao P.; Hu, Y.; Dorman, J.; Chang, T.-S.; Chiu, J.-M.

    1997-01-01

    Vertical seismic compressional- and shear-wave (P- and S-wave) profiles were collected from three shallow boreholes in sediment of the upper Mississippi embayment. The site of the 60-m hole at Shelby Forest, Tennessee, is on bluffs forming the eastern edge of the Mississippi alluvial plain. The bluffs are composed of Pleistocene loess, Pliocene-Pleistocene alluvial clay and sand deposits, and Tertiary deltaic-marine sediment. The 36-m hole at Marked Tree, Arkansas, and the 27-m hole at Risco, Missouri, are in Holocene Mississippi river floodplain sand, silt, and gravel deposits. At each site, impulsive P- and S-waves were generated by man-made sources at the surface while a three-component geophone was locked downhole at 0.91-m intervals. Consistent with their very similar geology, the two floodplain locations have nearly identical S-wave velocity (VS) profiles. The lowest VS values are about 130 m s-1, and the highest values are about 300 m s-1 at these sites. The shear-wave velocity profile at Shelby Forest is very similar within the Pleistocene loess (12 m thick); in deeper, older material, VS exceeds 400 m s-1. At Marked Tree, and at Risco, the compressional-wave velocity (VP) values above the water table are as low as about 230 m s-1, and rise to about 1.9 km s-1 below the water table. At Shelby Forest, VP values in the unsaturated loess are as low as 302 m s-1. VP values below the water table are about 1.8 km s-1. For the two floodplain sites, the VP/VS ratio increases rapidly across the water table depth. For the Shelby Forest site, the largest increase in the VP/VS ratio occurs at ~20-m depth, the boundary between the Pliocene-Pleistocene clay and sand deposits and the Eocene shallow-marine clay and silt deposits. Until recently, seismic velocity data for the embayment basin came from earthquake studies, crustal-scale seismic refraction and reflection profiles, sonic logs, and from analysis of dispersed earthquake surface waves. Since 1991, seismic data for shallow sediment obtained from reflection, refraction, crosshole and downhole techniques have been obtained for sites at the northern end of the embayment basin. The present borehole data, however, are measured from sites representative of large areas in the Mississippi embayment. Therefore, they fill a gap in information needed for modeling the response of the embayment to destructive seismic shaking.

  18. NUTRIENT CHANNELS AND STIRRING ENHANCED THE COMPOSITION AND STIFFNESS OF LARGE CARTILAGE CONSTRUCTS

    PubMed Central

    Cigan, Alexander D.; Nims, Robert J.; Albro, Michael B.; Vunjak-Novakovic, Gordana; Hung, Clark T.; Ateshian, Gerard A.

    2014-01-01

    A significant challenge in cartilage tissue engineering is to successfully culture functional tissues that are sufficiently large to treat osteoarthritic joints. Transport limitations due to nutrient consumption by peripheral cells produce heterogeneous constructs with matrix-deficient centers. Incorporation of nutrient channels into large constructs is a promising technique for alleviating transport limitations, in conjunction with simple yet effective methods for enhancing media flow through channels. Cultivation of cylindrical channeled constructs flat in culture dishes, with or without orbital shaking, produced asymmetric constructs with poor tissue properties. We therefore explored a method for exposing the entire construct surface to the culture media, while promoting flow through the channels. To this end, chondrocyte-seeded agarose constructs (Ø10 mm, 2.34 mm thick), with zero or three nutrient channels (Ø1 mm), were suspended on their sides in custom culture racks and subjected to three media stirring modes for 56 days: uniaxial rocking, orbital shaking, or static control. Orbital shaking led to the highest construct EY, glycosaminoglycan (GAG), and collagen contents, whereas rocking had detrimental effects on GAG and collagen versus static control. Nutrient channels increased EY as well as GAG homogeneity, and the beneficial effects of channels were most marked in orbitally shaken samples. Under these conditions, the constructs developed symmetrically and reached or exceeded native levels of EY (~400 kPa) and glycosaminoglycans (GAG; ~9%/ww). These results suggest that the cultivation of channeled constructs in culture racks with orbital shaking is a promising method for engineering mechanically competent large cartilage constructs. PMID:25458579

  19. Nutrient channels and stirring enhanced the composition and stiffness of large cartilage constructs.

    PubMed

    Cigan, Alexander D; Nims, Robert J; Albro, Michael B; Vunjak-Novakovic, Gordana; Hung, Clark T; Ateshian, Gerard A

    2014-12-18

    A significant challenge in cartilage tissue engineering is to successfully culture functional tissues that are sufficiently large to treat osteoarthritic joints. Transport limitations due to nutrient consumption by peripheral cells produce heterogeneous constructs with matrix-deficient centers. Incorporation of nutrient channels into large constructs is a promising technique for alleviating transport limitations, in conjunction with simple yet effective methods for enhancing media flow through channels. Cultivation of cylindrical channeled constructs flat in culture dishes, with or without orbital shaking, produced asymmetric constructs with poor tissue properties. We therefore explored a method for exposing the entire construct surface to the culture media, while promoting flow through the channels. To this end, chondrocyte-seeded agarose constructs (∅10mm, 2.34mm thick), with zero or three nutrient channels (∅1mm), were suspended on their sides in custom culture racks and subjected to three media stirring modes for 56 days: uniaxial rocking, orbital shaking, or static control. Orbital shaking led to the highest construct EY, sulfated glycosaminoglycan (sGAG), and collagen contents, whereas rocking had detrimental effects on sGAG and collagen versus static control. Nutrient channels increased EY as well as sGAG homogeneity, and the beneficial effects of channels were most marked in orbitally shaken samples. Under these conditions, the constructs developed symmetrically and reached or exceeded native levels of EY (~400kPa) and sGAG (~9%/ww). These results suggest that the cultivation of channeled constructs in culture racks with orbital shaking is a promising method for engineering mechanically competent large cartilage constructs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Earthquake early Warning ShakeAlert system: West coast wide production prototype

    USGS Publications Warehouse

    Kohler, Monica D.; Cochran, Elizabeth S.; Given, Douglas; Guiwits, Stephen; Neuhauser, Doug; Hensen, Ivan; Hartog, Renate; Bodin, Paul; Kress, Victor; Thompson, Stephen; Felizardo, Claude; Brody, Jeff; Bhadha, Rayo; Schwarz, Stan

    2017-01-01

    Earthquake early warning (EEW) is an application of seismological science that can give people, as well as mechanical and electrical systems, up to tens of seconds to take protective actions before peak earthquake shaking arrives at a location. Since 2006, the U.S. Geological Survey has been working in collaboration with several partners to develop EEW for the United States. The goal is to create and operate an EEW system, called ShakeAlert, for the highest risk areas of the United States, starting with the West Coast states of California, Oregon, and Washington. In early 2016, the Production Prototype v.1.0 was established for California; then, in early 2017, v.1.2 was established for the West Coast, with earthquake notifications being distributed to a group of beta users in California, Oregon, and Washington. The new ShakeAlert Production Prototype was an outgrowth from an earlier demonstration EEW system that began sending test notifications to selected users in California in January 2012. ShakeAlert leverages the considerable physical, technical, and organizational earthquake monitoring infrastructure of the Advanced National Seismic System, a nationwide federation of cooperating seismic networks. When fully implemented, the ShakeAlert system may reduce damage and injury caused by large earthquakes, improve the nation’s resilience, and speed recovery.

  1. Research on multi-parameter monitoring of steel frame shaking-table test using smartphone

    NASA Astrophysics Data System (ADS)

    Han, Ruicong; Loh, Kenneth J.; Zhao, Xuefeng; Yu, Yan

    2017-04-01

    Numerical simulation is an effective method for assessing the seismic damage of high-rise structures, but it is difficult to determine the input parameters, and the simulation results are not completely consistent with real conditions. A more direct approach to evaluating seismic damage is structural health monitoring (SHM), which relies on a complex set of sensors, devices and software and typically requires specialist operators. SHM systems have developed considerably in recent years, especially for bridge structures; they remain less common on high-rise buildings because of the difficulty of implementation. A low-cost and convenient monitoring technique would therefore be helpful for the safety maintenance of high-rise buildings. Smartphones, with their embedded sensors, network transmission, data storage and processing systems, are evolving towards crowdsourced sensing, and their popularity presents an opportunity to implement portable SHM systems on buildings. In this paper, multi-parameter monitoring of a three-story steel frame on a shaking table under earthquake excitations was conducted with smartphones, and the smartphone measurements were compared with those from traditional sensors. First, the iOS monitoring applications Orion-CC and D-viewer are introduced. Then the experimental details are presented, including the three-story frame model, sensor placement, and viscous dampers. Finally, the acceleration and displacement time-history curves from the smartphones and from traditional sensors are compared to demonstrate the feasibility of smartphone-based monitoring of a frame under earthquake excitations.

  2. Semi-active tuned liquid column damper implementation with real-time hybrid simulations

    NASA Astrophysics Data System (ADS)

    Riascos, Carlos; Marulanda Casas, Johannio; Thomson, Peter

    2016-04-01

    Real-time hybrid simulation (RTHS) is a modern cyber-physical technique used for the experimental evaluation of complex systems, which treats the system components with predictable behavior as a numerical substructure and the components that are difficult to model as an experimental substructure. It is therefore an attractive method for evaluating the response of civil structures under earthquake, wind and anthropic loads. In this paper, the response of a three-story shear frame controlled by a tuned liquid column damper (TLCD) and subject to base excitation is considered. Both passive and semi-active control strategies were implemented and compared. While the passive TLCD achieved a reduction of 50% in the acceleration response of the main structure in comparison with the uncontrolled structure, the semi-active TLCD achieved a reduction of 70% and was robust to variations in the dynamic properties of the main structure. In addition, an RTHS was implemented with the main structure modeled as a linear, time-invariant (LTI) system through a state-space representation, and the TLCD, with both control strategies, was evaluated on a shake table that reproduced the displacement of the virtual structure. Current assessment measures for RTHS were used to quantify the performance with parameters such as generalized amplitude, equivalent time delay between the target and measured displacement of the shake table, and energy error using the measured force, and show that the RTHS described in this paper is an accurate method for the experimental evaluation of structural control systems.
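    To make the time-delay metric concrete, the minimal sketch below estimates an equivalent time delay between the target and measured table displacement as the lag that maximizes their cross-correlation. This is only one possible definition; the function name and signal handling are illustrative assumptions, not the formulation used in the paper.

    ```python
    import numpy as np

    def equivalent_time_delay(target, measured, fs):
        """Estimate the equivalent time delay (s) between target and measured
        shake-table displacement as the lag maximizing their cross-correlation.
        Positive values mean the table lags the target (uniform sampling at fs)."""
        target = np.asarray(target) - np.mean(target)
        measured = np.asarray(measured) - np.mean(measured)
        xcorr = np.correlate(measured, target, mode="full")
        lag = np.argmax(xcorr) - (len(target) - 1)   # samples of delay
        return lag / fs
    ```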

  3. Test of FBG sensors for monitoring high pressure pipes

    NASA Astrophysics Data System (ADS)

    Paolozzi, Antonio; Paris, Claudio; Vendittozzi, Cristian; Felli, Ferdinando; Mongelli, Marialuisa; De Canio, Gerardo; Colucci, Alessandro; Asanuma, Hiroshi

    2017-04-01

    Fibre Bragg Grating (FBG) sensors are increasingly being used on a wide range of civil, industrial and aerospace structures. The sensors are created inside optical fibres (usually standard telecommunication fibres); optical fibre technology makes it possible to install the sensors on structures operating in harsh environments, since the materials are almost insensitive to corrosion, the monitoring system can be positioned far away from the sensors without appreciable signal losses, and there is no risk of electric discharge. FBG sensors can be used to create strain gauges, thermometers or accelerometers, depending on the coating on the grating, on the way the grating is fixed to the structure, and on the presence of a specifically designed interface that can act as a transducer. This paper describes a test of several different FBG sensors to monitor a high-pressure pipe that feeds the hydraulic actuators of a 6 degrees-of-freedom shaking table at the ENEA Casaccia research centre. A bare FBG sensor and a copper-coated FBG sensor were glued on the pipe. A third sensor was mounted on a special interface to amplify the vibrations; this last sensor can be placed on the steel pipe by a magnetic mounting system, which also allows its removal. All the sensors are placed parallel to the axis of the pipe. Analysis of the data recorded while the shaking table is operated will determine which kind of sensor is best suited for structural monitoring of high-pressure pipelines.

  4. Improving Drive Files for Vehicle Road Simulations

    NASA Astrophysics Data System (ADS)

    Cherng, John G.; Goktan, Ali; French, Mark; Gu, Yi; Jacob, Anil

    2001-09-01

    Shaker tables are commonly used in laboratories for automotive vehicle component testing to study durability and acoustic performance. An example is development testing of car seats. However, it is difficult to reproduce the measured road data perfectly with the response of a shaker table, as there are basic differences in dynamic characteristics between a flexible vehicle and a substantially rigid shaker table. In addition, there are performance limits in the shaker table drive systems that can limit correlation. In practice, an optimal drive signal for the actuators is created iteratively. During each iteration, the error between the road data and the response data is minimised by an optimising algorithm, which is generally part of the feedback loop of the shake table controller. This study presents a systematic investigation of the errors in the time and frequency domains, as well as in the joint time-frequency domain, and an evaluation of different digital signal processing techniques that have been used in previous work. In addition, we present an innovative approach that integrates the dynamic characteristics of car seats and the human body into the error-minimising iteration process. We found that the iteration process can be shortened and the error reduced by using a weighting function created by normalising the frequency response function of the car seat. Two road data test sets were used in the study.
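    As a rough illustration of one such iteration with a seat-based weighting function, the sketch below updates the drive signal in the frequency domain. The update gain, the function names and the exact form of the weighting are illustrative assumptions, not the algorithm used in the study.

    ```python
    import numpy as np

    def update_drive(drive, target, measured, seat_frf, gain=0.5):
        """One iteration of a frequency-domain drive-file correction: the error
        between the target (road) data and the measured table response is
        weighted by the normalized magnitude of the car-seat frequency response
        function (FRF) and added onto the current drive signal.
        seat_frf must be sampled at the rfft frequencies (length n//2 + 1)."""
        n = len(drive)
        D = np.fft.rfft(drive)
        E = np.fft.rfft(target) - np.fft.rfft(measured)   # frequency-domain error
        W = np.abs(seat_frf) / np.max(np.abs(seat_frf))   # normalized FRF weighting
        return np.fft.irfft(D + gain * W * E, n=n)
    ```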

  5. Evaluation of an Immobilized Cell Bioreactor for Degradation of Meta- and Para-Nitrobenzoate

    DTIC Science & Technology

    1994-01-18

    AFB IWTP. 4 Shake flask tests and continuous flow, bench-scale bioreactor tests were conducted using EDA or spent CLEPO 204 as the substrate. It was...found that the shake flask cultures completely degraded EDA when it was the sole substrate. However, using spent CLEPO 204 as the substrate caused a...microorganisms isolated, Kelly 4. Erlenmeyer flasks (250 mL) were used in studies to determine the maximal growth rate of Kelly 4 at 30 °C in SMSB

  6. The O-mannosylation and production of recombinant APA (45/47 KDa) protein from Mycobacterium tuberculosis in Streptomyces lividans is affected by culture conditions in shake flasks.

    PubMed

    Gamboa-Suasnavart, Ramsés A; Valdez-Cruz, Norma A; Cordova-Dávalos, Laura E; Martínez-Sotelo, José A; Servín-González, Luis; Espitia, Clara; Trujillo-Roldán, Mauricio A

    2011-12-20

    The Ala-Pro-rich O-glycoprotein known as the 45/47 kDa or APA antigen from Mycobacterium tuberculosis is an immunodominant adhesin restricted to the Mycobacterium genus and has been proposed as an alternative candidate for a new vaccine against tuberculosis or for diagnostic kits. In this work, the recombinant O-glycoprotein APA was produced by the non-pathogenic filamentous bacterium Streptomyces lividans under three different culture conditions. This strain is known for its ability to produce heterologous proteins in a shorter time compared to M. tuberculosis. Three different shake flask geometries were used to provide different shear and oxygenation conditions, and the impact of those conditions on the morphology of S. lividans and the production of rAPA was characterized and evaluated. Small unbranched free filaments and mycelial clumps were found in baffled and coiled shake flasks, but pellets one order of magnitude larger were found in conventional shake flasks. The production of rAPA is around 3 times higher in small mycelia than in larger pellets, most probably due to difficulties in mass transfer inside pellets. Moreover, there are four putative sites of O-mannosylation in native APA, one of which is located at the carboxy-terminal region. The carbohydrate composition of this site was determined for rAPA by mass spectrometry analysis and was found to contain different glycoforms depending on culture conditions. Up to two mannose residues were found in cultures carried out in conventional shake flasks, and up to five mannose residues were found in coiled and baffled shake flasks. The shear and/or oxygenation parameters determine the bacterial morphology, the productivity, and the O-mannosylation of rAPA in S. lividans. As demonstrated here, culture conditions have to be carefully controlled in order to obtain recombinant O-glycosylated proteins of consistent "quality" in bacteria, particularly if the protein activity depends on the glycosylation pattern. Furthermore, it will be worthwhile to determine the effect of shear and oxygen in shake flasks, to obtain evidence that may be useful in scaling up these processes to bioreactors. Another approach will be to use lab-scale bioreactors under well-controlled conditions and to study their impact on rAPA productivity and quality.

  7. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
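    A stripped-down numerical check of the exceedance fraction is sketched below. It is a simplification that treats site exceedances as independent Poisson processes, so it reproduces the predicted mean p = 1 - exp(-t/T) but not the full scatter caused by shared large earthquakes; all parameter values are arbitrary illustrations.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    T = 475.0        # map return period in years (assumed value)
    t = 50.0         # observation window in years (assumed value)
    n_sites = 200    # number of sites in the synthetic region (assumed value)
    n_histories = 1000

    # Under a Poisson occurrence model, exceedance of the map level at a site is
    # itself a Poisson process with rate 1/T, so the expected fraction of sites
    # exceeding the map in time t is p = 1 - exp(-t/T).
    p_expected = 1.0 - np.exp(-t / T)

    # Simplified simulation: independent Poisson exceedance counts per site
    # (ignores the spatial correlation from shared earthquakes that widens the
    # scatter in the full simulation).
    counts = rng.poisson(t / T, size=(n_histories, n_sites))
    fractional_exceedance = (counts > 0).mean(axis=1)

    print(f"predicted p = {p_expected:.3f}")
    print(f"simulated mean = {fractional_exceedance.mean():.3f}, "
          f"std = {fractional_exceedance.std():.3f}")
    ```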

  8. Evaluation of knowledge regarding Shaken Baby Syndrome among parents and medical staff.

    PubMed

    Marcinkowska, Urszula; Tyrala, Kinga; Paniczek, Monika; Ledwon, Martyna; Josko-Ochojska, Jadwiga

    2016-06-08

    Shaken Baby Syndrome (SBS), now usually termed Abusive Head Trauma (AHT), is a form of violence against children, mainly those under 2 years of age. The number of SBS cases may be underestimated, as many cases of violence remain unreported. The aim of the study was to evaluate the state of knowledge about the SBS phenomenon, its scale and its diagnostic methods among parents, medical staff and medical students. 639 people were examined: 39% were parents, 32.5% medical staff members and 28.5% medical students; 82% were women. The average age was 34.9 years (SD=9.78), and 70% had children. The research tool was an anonymous survey whose 34 questions concerned numerous aspects of violence against children as well as knowledge about SBS. According to 90% of the interviewees shaking a baby may be dangerous, but only 43% had ever heard of shaken baby syndrome. 88% of respondents stated that SBS is a form of violence, but only 57% realized that one-time shaking can lead to death and only 19% indicated men as aggressors. 16% of medical staff members did not know how long it takes for the consequences of shaking a baby to become apparent. The majority of the medical staff members working with children had never heard about SBS, and only half of those surveyed understood the connection between shaking and vision loss or a child's death. Among the long-term consequences of shaking a baby, knowledge was greatest for the emotional consequences.

  9. Ground-motion modeling of the 1906 San Francisco earthquake, part I: Validation using the 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; Zoback, M.L.

    2008-01-01

    We compute ground motions for the Beroza (1991) and Wald et al. (1991) source models of the 1989 magnitude 6.9 Loma Prieta earthquake using four different wave-propagation codes and recently developed 3D geologic and seismic velocity models. In preparation for modeling the 1906 San Francisco earthquake, we use this well-recorded earthquake to characterize how well our ground-motion simulations reproduce the observed shaking intensities and the amplitudes and durations of recorded motions throughout the San Francisco Bay Area. All of the simulations generate ground motions consistent with the large-scale spatial variations in shaking associated with rupture directivity and the geologic structure. We attribute the small variations among the synthetics to the minimum shear-wave speed permitted in the simulations and how they accommodate topography. Our long-period simulations, on average, underpredict shaking intensities by about one-half modified Mercalli intensity (MMI) units (25%-35% in peak velocity), while our broadband simulations, on average, underpredict the shaking intensities by one-fourth MMI units (16% in peak velocity). Discrepancies with observations arise due to errors in the source models and geologic structure. The consistency in the synthetic waveforms across the wave-propagation codes for a given source model suggests the uncertainty in the source parameters tends to exceed the uncertainty in the seismic velocity structure. In agreement with earlier studies, we find that a source model with slip more evenly distributed northwest and southeast of the hypocenter would be preferable to both the Beroza and Wald source models. Although the new 3D seismic velocity model improves upon previous velocity models, we identify two areas needing improvement. Nevertheless, we find that the seismic velocity model and the wave-propagation codes are suitable for modeling the 1906 earthquake and scenario events in the San Francisco Bay Area.

  10. Development of a wireless displacement measurement system using acceleration responses.

    PubMed

    Park, Jong-Woong; Sim, Sung-Han; Jung, Hyung-Jo; Spencer, Billie F

    2013-07-01

    Displacement measurements are useful information for various engineering applications such as structural health monitoring (SHM), earthquake engineering and system identification. Most existing displacement measurement methods are costly, labor-intensive, and have difficulties particularly when applying to full-scale civil structures because the methods require stationary reference points. Indirect estimation methods converting acceleration to displacement can be a good alternative as acceleration transducers are generally cost-effective, easy to install, and have low noise. However, the application of acceleration-based methods to full-scale civil structures such as long span bridges is challenging due to the need to install cables to connect the sensors to a base station. This article proposes a low-cost wireless displacement measurement system using acceleration. Developed with smart sensors that are low-cost, wireless, and capable of on-board computation, the wireless displacement measurement system has significant potential to impact many applications that need displacement information at multiple locations of a structure. The system implements an FIR-filter type displacement estimation algorithm that can remove low frequency drifts typically caused by numerical integration of discrete acceleration signals. To verify the accuracy and feasibility of the proposed system, laboratory tests are carried out using a shaking table and on a three storey shear building model, experimentally confirming the effectiveness of the proposed system.
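    The drift-removal idea can be sketched as follows, assuming a uniformly sampled acceleration record. This double-integration-plus-high-pass-FIR scheme only mimics the spirit of the paper's FIR-type estimator; the cutoff, tap count and function name are illustrative choices.

    ```python
    import numpy as np
    from scipy.signal import firwin, filtfilt

    def estimate_displacement(acc, fs, fc=0.1, ntaps=255):
        """Double-integrate acceleration to displacement and remove the
        low-frequency drift with a zero-phase FIR high-pass filter.
        The record should be much longer than ~3*ntaps samples for filtfilt."""
        dt = 1.0 / fs
        vel = np.cumsum(acc) * dt                # acceleration -> velocity
        disp = np.cumsum(vel) * dt               # velocity -> displacement (drifts)
        hp = firwin(ntaps, fc, fs=fs, pass_zero=False)   # high-pass FIR (ntaps odd)
        return filtfilt(hp, [1.0], disp)         # zero-phase filtering removes drift
    ```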

  12. Review of microfluidic microbioreactor technology for high-throughput submerged microbiological cultivation

    PubMed Central

    Hegab, Hanaa M.; ElMekawy, Ahmed; Stakenborg, Tim

    2013-01-01

    Microbial fermentation process development pursues high production yields. This requires high-throughput screening and optimization of the microbial strains, which is nowadays commonly achieved by slow and labor-intensive submerged cultivation in shake flasks or microtiter plates. These methods are also limited to end-point measurements, provide little analytical data, and offer limited control over the fermentation process. These drawbacks could be overcome by means of scaled-down microfluidic microbioreactors (μBR) that allow for online control of cultivation data and automation, hence reducing cost and time. This review goes beyond previous work not only by providing a detailed update on current μBR fabrication techniques but also by comparing the operation and control of μBRs with large-scale fermentation reactors. PMID:24404006

  13. Determination of Ultramicro Quantities of Elemental Phosphorus in Water by Neutron Activation Analysis.

    DTIC Science & Technology

    1977-06-10

    [Garbled fragment of Table 3, "Recovery of Phosphorus in Nitric Acid" (NSWC/WOL TR 77-49), listing recovery values for hypophosphite, phosphite, phosphate and sodium salts.] ...of the benzene extract by shaking with aqueous nitric acid resulted in nitric acid oxidation of P4 to phosphate ion, which then passed into the...aqueous phase. The treatment was carried out in a mechanical shaker or with a magnetic stirrer. The aqueous layer, containing phosphate, was isolated in a

  14. Imaging the 2016 Mw 7.8 Kaikoura, New Zealand, earthquake with teleseismic P waves: A cascading rupture across multiple faults

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Koper, Keith D.; Pankow, Kristine; Ge, Zengxi

    2017-05-01

    The 13 November 2016 Mw 7.8 Kaikoura, New Zealand, earthquake was investigated using teleseismic P waves. Backprojection of high-frequency P waves from two regional arrays shows unilateral rupture of at least two southwest-northeast striking faults with an average rupture speed of 1.4-1.6 km/s and total duration of 100 s. Guided by these backprojection results, 33 globally distributed low-frequency P waves were inverted for a finite fault model (FFM) of slip. The FFM showed evidence of several subevents; however, it lacked significant moment release near the epicenter, where a large burst of high-frequency energy was observed. A local strong-motion network recorded strong shaking near the epicenter; hence, for this earthquake the distribution of backprojection energy is superior to the FFM as a guide of strong shaking. For future large earthquakes that occur in regions without strong-motion networks, initial shaking estimates could benefit from backprojection constraints.

  15. Earthquake Facts

    MedlinePlus

    ... recordings of large earthquakes, scientists built large spring-pendulum seismometers in an attempt to record the long- ... are moving away from one another. The first “pendulum seismoscope” to measure the shaking of the ground ...

  16. Distributed cable sensors with memory feature for post-disaster damage assessment

    NASA Astrophysics Data System (ADS)

    Chen, Genda; McDaniel, Ryan D.; Pommerenke, David J.; Sun, Shishuang

    2005-05-01

    A new design of distributed crack sensors is presented for the condition assessment of reinforced concrete (RC) structures during and immediately after an earthquake event. This study is mainly focused on the performance of cable sensors under dynamic loading, particularly their ability to memorize the crack history of an RC member. This unique memory feature enables the post-earthquake condition assessment of structural members such as RC columns, in which the earthquake-induced cracks are closed immediately after an earthquake event due to gravity loads and they are visually undetectable. Factors affecting the onset of the memory feature were investigated experimentally with small-scale RC beams under cyclic loading. Test results indicated that both crack width and the number of loading cycles were instrumental in the onset of the memory feature of cable sensors. Practical issues related to dynamic acquisition with the sensors were discussed. The sensors were proven to be fatigue resistant from the shake table tests of RC columns. They continued to show useful signal after the columns could no longer support additional loads.

  17. Analytical Prediction of the Seismic Response of a Reinforced Concrete Containment Vessel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, R.J.; Rashid, Y.R.; Cherry, J.L.

    Under the sponsorship of the Ministry of International Trade and Industry (MITI) of Japan, the Nuclear Power Engineering Corporation (NUPEC) is investigating the seismic behavior of a Reinforced Concrete Containment Vessel (RCCV) through scale-model testing using the high-performance shaking table at the Tadotsu Engineering Laboratory. A series of tests representing design-level seismic ground motions was initially conducted to gather valuable experimental measurements for use in design verification. Additional tests will be conducted with increasing amplifications of the seismic input until a structural failure of the test model occurs. In a cooperative program with NUPEC, the US Nuclear Regulatory Commission (USNRC), through Sandia National Laboratories (SNL), is conducting analytical research on the seismic behavior of RCCV structures. As part of this program, pretest analytical predictions of the model tests are being performed. The dynamic time-history analysis utilizes a highly detailed concrete constitutive model applied to a three-dimensional finite element representation of the test structure. This paper describes the details of the analysis model and provides analysis results.

  18. Spring tube braces for seismic isolation of buildings

    NASA Astrophysics Data System (ADS)

    Karayel, V.; Yuksel, Ercan; Gokce, T.; Sahin, F.

    2017-01-01

    A new low-cost seismic isolation system based on spring tube bracings has been proposed and studied at the Structural and Earthquake Engineering Laboratory of Istanbul Technical University. Multiple compression-type springs are positioned in a special cylindrical tube to obtain a symmetrical response under both tensile and compressive axial loading. An isolation floor, which consists of pin-ended steel columns and spring tube bracings, is constructed at the foundation level or any intermediate level of the building. A three-stage experimental campaign was completed to evaluate the capability of the system. First, the behavior of the spring tubes subjected to axial displacement reversals at varying frequencies was determined. In the second phase, the isolation floor was assessed in quasi-static tests. Finally, a ¼-scale 3D steel frame was tested on the shake table using actual acceleration records. The acceleration transmitted to the floor levels is greatly diminished by the isolation story, which provides a longer period and higher damping. There are no stability or self-centering problems in the isolation floor.

  19. A user-friendly tool to transform large scale administrative data into wide table format using a MapReduce program with a Pig Latin based script.

    PubMed

    Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko

    2012-12-22

    Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
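    For readers unfamiliar with the wide-table transform itself, a tiny pandas sketch of the same idea (long, one-row-per-event records pivoted to one row per subject) is shown below. This is only a conceptual illustration in Python, not the Pig Latin/GroupFilterFormat implementation described in the paper, and the column names are made up.

    ```python
    import pandas as pd

    # Toy long-format activity log: one row per (patient, item) event.
    long_df = pd.DataFrame({
        "patient_id": [1, 1, 2, 2, 2],
        "item":       ["age", "drug_A", "age", "drug_A", "drug_B"],
        "value":      [63, 1, 55, 0, 1],
    })

    # Pivot to wide format: one row per patient, one column per item.
    wide_df = long_df.pivot_table(index="patient_id", columns="item",
                                  values="value", aggfunc="first")
    print(wide_df)
    ```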

  20. Combination of High Rate, Real-time GNSS and Accelerometer Observations - Preliminary Results Using a Shake Table and Historic Earthquake Events.

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Passmore, Paul; Zimakov, Leonid; Raczka, Jared

    2014-05-01

    One of the fundamental requirements of an Earthquake Early Warning (EEW) system (and other mission critical applications) is to quickly detect and process the information from the strong motion event, i.e. event detection and location, magnitude estimation, and the peak ground motion estimation at the defined targeted site, thus allowing the civil protection authorities to provide pre-programmed emergency response actions: Slow down or stop rapid transit trains and high-speed trains; shutoff of gas pipelines and chemical facilities; stop elevators at the nearest floor; send alarms to hospitals, schools and other civil institutions. An important question associated with the EEW system is: can we measure displacements in real time with sufficient accuracy? Scientific GNSS networks are moving towards a model of real-time data acquisition, storage integrity, and real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 Hz) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies and other mission critical applications, such as volcano monitoring, building, bridge and dam monitoring systems. REF TEK a Division of Trimble has developed the integrated GNSS/Accelerograph system, model 160-09SG, which consists of REF TEK's fourth generation electronics, a 147-01 high-resolution ANSS Class A accelerometer, and Trimble GNSS receiver and antenna capable of real time, on board Precise Point Positioning (PPP) techniques with satellite clock and orbit corrections delivered to the receiver directly via L-band satellite communications. The test we conducted with the 160-09SG Recorder is focused on the characteristics of GNSS and seismic sensors in high dynamic environments, including historic earthquakes replicated on a shake table, over a range of displacements and frequencies. The main goals of the field tests are to explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios.
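    One common way to combine the two observables is a complementary filter, sketched below under the assumption that the GNSS displacement has already been interpolated onto the accelerometer's sample times. The cutoff frequency, filter order and function name are illustrative choices, not the processing implemented in the 160-09SG.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def fuse_displacement(gnss_disp, acc, fs, fc=0.2):
        """Complementary-filter fusion: keep the low-frequency part of the GNSS
        displacement and the high-frequency part of the doubly integrated
        acceleration, then sum them into one broadband displacement record."""
        dt = 1.0 / fs
        acc_disp = np.cumsum(np.cumsum(acc) * dt) * dt        # drifts at low frequency
        b_lo, a_lo = butter(4, fc, fs=fs, btype="low")
        b_hi, a_hi = butter(4, fc, fs=fs, btype="high")
        return filtfilt(b_lo, a_lo, gnss_disp) + filtfilt(b_hi, a_hi, acc_disp)
    ```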

  1. Liquid films on shake flask walls explain increasing maximum oxygen transfer capacities with elevating viscosity.

    PubMed

    Giese, Heiner; Azizan, Amizon; Kümmel, Anne; Liao, Anping; Peter, Cyril P; Fonseca, João A; Hermann, Robert; Duarte, Tiago M; Büchs, Jochen

    2014-02-01

    In biotechnological screening and production, oxygen supply is a crucial parameter. Even though oxygen transfer is well documented for viscous cultivations in stirred tanks, little is known about the gas/liquid oxygen transfer in shake flask cultures that become increasingly viscous during cultivation. In particular, the oxygen transfer into the liquid film adhering to the shake flask wall has not yet been described for such cultivations. In this study, the oxygen transfer of chemical and microbial model experiments was measured and the suitability of the widely applied film theory of Higbie was studied. With numerical simulations of Fick's law of diffusion, it was demonstrated that Higbie's film theory does not apply to cultivations at viscosities up to 10 mPa s. For the first time, it was experimentally shown that the maximum oxygen transfer capacity OTRmax increases in shake flasks when viscosity is increased from 1 to 10 mPa s, leading to an improved oxygen supply for microorganisms. Additionally, even at elevated viscosities of up to 80 mPa s, the OTRmax does not fall significantly below its value at water-like viscosities. In this range, a shake flask is, to some extent, a self-regulating system with respect to oxygen supply. This is in contrast to stirred tanks, where the oxygen supply is steadily reduced, to only 5% at 80 mPa s. Since the liquid film formed on shake flask walls inherently promotes the oxygen supply at moderate and elevated viscosities, these results have significant implications for scale-up. © 2013 Wiley Periodicals, Inc.
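    The kind of diffusion calculation referred to can be sketched with a simple explicit finite-difference solution of Fick's second law for a stagnant liquid film. The film thickness, time step and boundary conditions below are arbitrary assumptions for illustration, not the simulation set-up of the study.

    ```python
    import numpy as np

    # Explicit finite-difference solution of dC/dt = D * d2C/dx2 for oxygen
    # diffusing into a liquid film of thickness L (all values illustrative).
    D = 2.0e-9        # oxygen diffusivity in water, m^2/s (approximate)
    L = 100e-6        # film thickness, m (assumed)
    nx, dt, t_end = 101, 1e-4, 0.5
    x = np.linspace(0.0, L, nx)
    dx = x[1] - x[0]
    C = np.zeros(nx)  # dimensionless concentration profile
    C[0] = 1.0        # gas-side surface held at saturation

    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    for _ in range(int(t_end / dt)):
        C[1:-1] += D * dt / dx**2 * (C[2:] - 2 * C[1:-1] + C[:-2])
        C[-1] = C[-2]  # no-flux condition at the flask wall
    print(f"mean film saturation after {t_end} s: {C.mean():.2f}")
    ```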

  2. India's pharmaceutical industry: hype or high tech take-off?

    PubMed

    Malhotra, Prabodh; Lofgren, Hans

    2004-11-08

    India has built a large pharmaceutical industry through an array of measures in support of domestic firms. The absence of product patents enabled Indian companies to become world leading producers of generic versions of patented drugs. Low costs and a strong engineering tradition continue to sustain competitive strength. The implementation of the World Trade Organization patent regime in 2005 is driving a transformation of the industry. Key elements of the present shake-up include the return of 'big pharma' companies on a large scale and the emergence of several Indian firms that aim to become fully-fledged research-based multinationals. This article provides a description of the development and structure of the Indian pharmaceutical industry and explores questions and challenges arising from its integration into global markets.

  3. Geological control of earthquake induced landslide in El Salvador

    NASA Astrophysics Data System (ADS)

    Tsige Aga, Meaza

    2010-05-01

    Geological control of earthquake induced landslides in El Salvador. M. Tsige (1), I. Garcia-Flórez (1), R. Mateos (2); (1) Universidad Complutense de Madrid, Facultad de Geología, Madrid, Spain (meaza@geo.ucm.es); (2) IGME, Mallorca. El Salvador is located in one of the most seismically active areas in Central America and has suffered severe damage and loss of life in historical and recent earthquakes as a consequence of earthquake-induced landslides. The most common landslides were shallow disrupted soil slides on steep slopes, and these were particularly dense in the central part of the country. Most of them are sited in the recent, mechanically weak volcanic pyroclastic deposits known as "Tierra Blanca" and "Tierra Color Café", which are prone to seismic wave amplification and are thought to have contributed to the triggering of some of the hundreds of landslides related to the 2001 (Mw = 7.6 and Mw = 6.7) seismic events. The earthquakes also triggered numerous deep, large-scale landslides responsible for the enormous devastation of villages and towns, and these remain a source of high seismic hazard today. Many of these landslides are located at focal distances of more than 50 and even 100 km, although some occurred in the near field. Until now there has been little effort to explain the causes and concentration of the deep large-scale landslides, especially their distribution, failure mechanism and the post-rupture behavior of the landslide mass (long run-out). A field investigation of the landslides and geological materials was carried out, together with interpretation of aerial photographs taken before and after the two 2001 (Mw = 7.6 and Mw = 6.7) El Salvador earthquakes. The results show that most of the large-scale landslides occurred as coherent block slides with the sliding surface parallel to pre-existing fractures and fault planes (La Leona, Barriolera, El Desague, Jiboa landslides). Besides the pre-existing fractures being weak zones that control the mechanism and size of the slide, they may also act as seismic wave guides and therefore trap seismic energy, producing larger ground movement. On the other hand, the flow-like behavior of the landslide mass after failure appears to be controlled by the geological and geotechnical nature of the materials. After seismic shaking the landslide mass mobilizes downslope for up to hundreds of meters. This mobilization seems to be due to large deformation as a consequence of structural collapse during seismic shaking. The materials involved are generally Miocene to Quaternary thick volcanic pyroclastic fall deposits and brecciated tuffs, frequently inter-bedded with thin volcanic ash layers. They consist of 50-60 per cent silt and sand particles with a small amount of clay, enveloping large andesitic blocks, and have a very open texture with a high void ratio and low density, which confers an anomalous post-failure deformation. In their in situ state these materials possess high apparent strength due to weak primary chemical and silty-clay cementation, but they are susceptible to large reductions in strength due to shaking and then flow like a semi-liquid mass (quick silt), so that the mass undergoes a long run-out.

  4. 5. Credit BG. This interior view shows the weigh room, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Credit BG. This interior view shows the weigh room, looking west (240°): Electric lighting and scale read-outs (boxes with circular windows on the wall) are fitted with explosion-proof enclosures; these enclosures prevent malfunctioning electrical parts from sparking and starting fires or explosions. One marble table and scale have been removed at the extreme left of the view. Two remaining scales handle small and large quantities of propellants and additives. Marble tables do not absorb chemicals or conduct electricity; their mass also prevents vibration from upsetting the scales. The floor has an electrically conductive coating to dissipate static electric charges, thus preventing sparks which might ignite propellants. - Jet Propulsion Laboratory Edwards Facility, Weigh & Control Building, Edwards Air Force Base, Boron, Kern County, CA

  5. Large-Scale Transient Transfection of Chinese Hamster Ovary Cells in Suspension.

    PubMed

    Rajendra, Yashas; Balasubramanian, Sowmya; Hacker, David L

    2017-01-01

    We describe a one-liter transfection of suspension-adapted Chinese hamster ovary (CHO-DG44) cells using polyethyleneimine (PEI) for DNA delivery. The method involves transfection at a high cell density (5 × 10^6 cells/mL) by direct addition of plasmid DNA (pDNA) and PEI to the culture and subsequent incubation at 31 °C with agitation by orbital shaking. We also describe an alternative method in which 90% of the pDNA is replaced by nonspecific (filler) DNA, and the production phase is performed at 31 °C in the presence of 0.25% N,N-dimethylacetamide (DMA).

  6. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    NASA Astrophysics Data System (ADS)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbits/sec (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
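    The per-cell MapReduce idea can be illustrated with a few lines of plain Python. This is a conceptual sketch, not the LSD API; the cell representation and the example kernel are invented.

    ```python
    from collections import defaultdict

    def mapreduce_over_cells(cells, mapper, reducer):
        """Run the mapper independently on the rows of each spatial/temporal cell,
        collect the emitted (key, value) pairs, and combine values per key."""
        buckets = defaultdict(list)
        for cell_rows in cells:                 # each cell is a list of row dicts
            for key, value in mapper(cell_rows):
                buckets[key].append(value)
        return {key: reducer(values) for key, values in buckets.items()}

    # Example kernel: count rows per integer magnitude bin across all cells.
    cells = [[{"mag": 15.2}, {"mag": 16.7}], [{"mag": 15.9}]]
    counts = mapreduce_over_cells(
        cells,
        mapper=lambda rows: [(int(r["mag"]), 1) for r in rows],
        reducer=sum)
    print(counts)   # {15: 2, 16: 1}
    ```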

  7. Electromagnetic tracking (EMT) technology for improved treatment quality assurance in interstitial brachytherapy.

    PubMed

    Kellermeier, Markus; Herbolzheimer, Jens; Kreppner, Stephan; Lotter, Michael; Strnad, Vratislav; Bert, Christoph

    2017-01-01

    Electromagnetic Tracking (EMT) is a novel technique for error detection and quality assurance (QA) in interstitial high dose rate brachytherapy (HDR-iBT). The purpose of this study is to provide a concept for data acquisition developed as part of a clinical evaluation study on the use of EMT during interstitial treatment of breast cancer patients. The stability, accuracy, and precision of EMT-determined dwell positions were quantified. Dwell position reconstruction based on EMT was investigated on a CT table, an HDR table and a PDR bed to examine the influence on precision and accuracy in a typical clinical workflow. All investigations were performed using a precise PMMA phantom. The track of catheters inserted in that phantom was measured by manually inserting a 5 degree-of-freedom (5DoF) sensor while recording the position of three 6DoF fiducial sensors on the phantom surface to correct motion influences. From the corrected data, dwell positions were reconstructed along the catheter's track. The accuracy of the EMT-determined dwell positions was quantified by the residual distances to reference dwell positions after using a rigid registration. Precision and accuracy were investigated for different phantom-table and sensor-field generator (FG) distances. The measured precision of the EMT-determined dwell positions was ≤ 0.28 mm (95th percentile). Stability tests showed a drift of 0.03 mm in the first 20 min of use. Sudden shaking of the FG or (large) metallic objects close to the FG degrade the precision. The accuracy with respect to the reference dwell positions was < 1 mm on all clinical tables at a 200 mm FG distance and a 120 mm phantom-table distance. Phantom measurements showed that EMT-determined localization of dwell positions in HDR-iBT is stable, precise, and sufficiently accurate for clinical assessment. The presented method may be viable for clinical applications in HDR-iBT, like implant definition, error detection or quantification of uncertainties. Further clinical investigations are needed. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  8. The Long-term Impacts of Earthquakes on Economic Growth

    NASA Astrophysics Data System (ADS)

    Lackner, S.

    2016-12-01

    The social science literature has so far not reached a consensus on whether and how earthquakes actually impact economic growth in the long-run. Several hypotheses have been suggested and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. earthquake dummy or number of events), the use of magnitude instead of a measure for surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data is used. Here we use peak ground acceleration (PGA) as the ground motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of global relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and other economic variables such as employment and education are also investigated. This will on one hand help to identify the mechanism of how earthquakes impact long-run growth and also show potential impacts on other welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.

  9. Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search.

    PubMed

    Liu, Xianglong; Huang, Lei; Deng, Cheng; Lang, Bo; Tao, Dacheng

    2016-10-01

    Hash-based nearest neighbor search has become attractive in many applications. However, the quantization in hashing usually degenerates the discriminative power when using Hamming distance ranking. Besides, for large-scale visual search, existing hashing methods cannot directly support efficient search over data with multiple sources, even though the literature has shown that adaptively incorporating complementary information from diverse sources or views can significantly boost search performance. To address these problems, this paper proposes a novel and generic approach to building multiple hash tables with multiple views and generating fine-grained ranking results at bitwise and tablewise levels. For each hash table, a query-adaptive bitwise weighting is introduced to alleviate the quantization loss by simultaneously exploiting the quality of hash functions and their complement for nearest neighbor search. From the tablewise aspect, multiple hash tables are built for different data views as a joint index, over which a query-specific rank fusion is proposed to rerank all results from the bitwise ranking by diffusing in a graph. Comprehensive experiments on image search over three well-known benchmarks show that the proposed method achieves up to 17.11% and 20.28% performance gains on single and multiple table search over the state-of-the-art methods.
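    A minimal sketch of bitwise-weighted Hamming ranking is given below; the random weights stand in for whatever query-adaptive bit reliabilities a real system would compute, and none of the names come from the paper.

    ```python
    import numpy as np

    def weighted_hamming_rank(query_code, db_codes, bit_weights):
        """Rank database codes by a weighted Hamming distance: instead of counting
        mismatched bits equally, each bit contributes a query-adaptive weight."""
        mismatches = db_codes != query_code          # boolean array (n, n_bits)
        dist = (mismatches * bit_weights).sum(axis=1)
        return np.argsort(dist)                      # indices, ascending distance

    rng = np.random.default_rng(1)
    db = rng.integers(0, 2, size=(1000, 64), dtype=np.uint8)
    q = rng.integers(0, 2, size=64, dtype=np.uint8)
    w = rng.random(64)          # stand-in for query-adaptive bit reliabilities
    print(weighted_hamming_rank(q, db, w)[:10])
    ```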

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonzon, L. L.; Hente, D. B.; Kukreti, B. M.

    The seismic-fragility response of naturally-aged, nuclear station, safety-related batteries is of interest for two reasons: (1) to determine actual failure modes and thresholds; and (2) to determine the validity of using the electrical capacity of individual cells as an indicator of the end-of-life of a battery, given a seismic event. This report covers the first test series of an extensive program using 12-year old, lead-calcium, Gould NCX-2250 cells, from the James A. Fitzpatrick Nuclear Power Station operated by the New York Power Authority. Seismic tests with three cell configurations were performed using a triaxial shake table: single-cell tests, rigidly mounted; multi-cell (three) tests, mounted in a typical battery rack; and single-cell tests specifically aimed towards examining propagation of pre-existing case cracks. In general the test philosophy was to monitor the electrical properties including discharge capacity of cells through a graduated series of g-level step increases until either the shake-table limits were reached or until electrical failure of the cells occurred. Of nine electrically active cells, six failed during seismic testing over a range of imposed g-level loads in excess of a 1-g ZPA. Post-test examination revealed a common failure mode, the cracking at the abnormally brittle, positive lead bus-bar/post interface; further examination showed that the failure zone was extremely coarse grained and extensively corroded. Presently accepted accelerated-aging methods for qualifying batteries, per IEEE Std. 535-1979, are based on plate growth, but these naturally-aged 12-year old cells showed no significant plate growth.

  11. An updated geospatial liquefaction model for global application

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.

    2017-01-01

    We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
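    The general form of such a geospatial logistic model can be sketched as below. The coefficient values are placeholders chosen only to make the example run, not the published regression coefficients, and the variable transformations are simplified.

    ```python
    import numpy as np

    def liquefaction_probability(pgv, vs30, wtd, dist_water, precip, coeffs):
        """Generic geospatial logistic model P = 1 / (1 + exp(-X)), with X a
        linear combination of (partly log-transformed) proxies. Coefficients
        are illustrative placeholders, not the values fitted in the paper."""
        X = (coeffs["intercept"]
             + coeffs["ln_pgv"] * np.log(pgv)
             + coeffs["ln_vs30"] * np.log(vs30)
             + coeffs["wtd"] * wtd
             + coeffs["dw"] * dist_water
             + coeffs["precip"] * precip)
        return 1.0 / (1.0 + np.exp(-X))

    demo_coeffs = {"intercept": 8.0, "ln_pgv": 0.3, "ln_vs30": -2.0,
                   "wtd": -0.03, "dw": -0.5, "precip": 0.0005}  # illustrative only
    print(liquefaction_probability(20.0, 250.0, 3.0, 0.5, 1500.0, demo_coeffs))
    ```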

  12. Ground motion modeling of the 1906 San Francisco earthquake II: Ground motion estimates for the 1906 earthquake and scenario events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aagaard, B; Brocher, T; Dreger, D

    2007-02-09

    We estimate the ground motions produced by the 1906 San Francisco earthquake making use of the recently developed Song et al. (2008) source model that combines the available geodetic and seismic observations and recently constructed 3D geologic and seismic velocity models. Our estimates of the ground motions for the 1906 earthquake are consistent across five ground-motion modeling groups employing different wave propagation codes and simulation domains. The simulations successfully reproduce the main features of the Boatwright and Bundock (2005) ShakeMap, but tend to overpredict the intensity of shaking by 0.1-0.5 modified Mercalli intensity (MMI) units. Velocity waveforms at sites throughout the San Francisco Bay Area exhibit characteristics consistent with rupture directivity, local geologic conditions (e.g., sedimentary basins), and the large size of the event (e.g., durations of strong shaking lasting tens of seconds). We also compute ground motions for seven hypothetical scenarios rupturing the same extent of the northern San Andreas fault, considering three additional hypocenters and an additional, random distribution of slip. Rupture directivity exerts the strongest influence on the variations in shaking, although sedimentary basins do consistently contribute to the response in some locations, such as Santa Rosa, Livermore, and San Jose. These scenarios suggest that future large earthquakes on the northern San Andreas fault may subject the current San Francisco Bay urban area to stronger shaking than a repeat of the 1906 earthquake. Ruptures propagating southward towards San Francisco appear to expose more of the urban area to a given intensity level than do ruptures propagating northward.

  13. Reconstituting botulinum toxin drugs: shaking, stirring or what?

    PubMed

    Dressler, Dirk; Bigalke, Hans

    2016-05-01

    Most botulinum toxin (BT) drugs are stored as powders which need to be reconstituted with normal saline before clinical use. As botulinum neurotoxin (BNT), the therapeutically active ingredient, is a large double-stranded protein, the process of reconstitution should be performed with special attention to the mechanical stress applied. We wanted to test the mechanical stability of BNT during the reconstitution process. For this, 100 MU onabotulinumtoxinA (Botox(®), Irvine, CA, USA) was reconstituted with 2.0 ml of NaCl/H2O. Gentle reconstitution (GR) was performed with a 5 ml syringe, a 0.90 × 70 mm injection needle, one cycle of injection-aspiration-injection and two gentle shakes of the vial. Aggressive reconstitution (AR) was performed with a 5 ml syringe, a 0.40 × 40 mm injection needle, ten injection-aspiration-injection cycles and 30 s of continuous shaking of the vial. AR increased the time to paralysis in the mouse hemidiaphragm assay (HDA) from 72.0 ± 4.6 to 106.0 ± 16.0 min (*p = 0.002, two-tailed t test after Kolmogorov-Smirnov test with Lilliefors correction for normal distribution). Construction of a calibration curve revealed that the increase in the time to paralysis was correlated with a loss of potency from 100 to 58 MU (-42%). BT users should use large-diameter injection needles for reconstitution, apply two or three injection-aspiration-injection cycles and, maybe, shake the vials a few times to rinse the entire glass wall. Aggressive reconstitution with small-diameter needles, prolonged injection-aspiration-injection and violent shaking should be avoided.

  14. Ground-motion modeling of the 1906 San Francisco Earthquake, part II: Ground-motion estimates for the 1906 earthquake and scenario events

    USGS Publications Warehouse

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; McCandless, K.; Nilsson, S.; Petersson, N.A.; Rodgers, A.; Sjogreen, B.; Zoback, M.L.

    2008-01-01

    We estimate the ground motions produced by the 1906 San Francisco earthquake making use of the recently developed Song et al. (2008) source model that combines the available geodetic and seismic observations and recently constructed 3D geologic and seismic velocity models. Our estimates of the ground motions for the 1906 earthquake are consistent across five ground-motion modeling groups employing different wave propagation codes and simulation domains. The simulations successfully reproduce the main features of the Boatwright and Bundock (2005) ShakeMap, but tend to overpredict the intensity of shaking by 0.1-0.5 modified Mercalli intensity (MMI) units. Velocity waveforms at sites throughout the San Francisco Bay Area exhibit characteristics consistent with rupture directivity, local geologic conditions (e.g., sedimentary basins), and the large size of the event (e.g., durations of strong shaking lasting tens of seconds). We also compute ground motions for seven hypothetical scenarios rupturing the same extent of the northern San Andreas fault, considering three additional hypocenters and an additional, random distribution of slip. Rupture directivity exerts the strongest influence on the variations in shaking, although sedimentary basins do consistently contribute to the response in some locations, such as Santa Rosa, Livermore, and San Jose. These scenarios suggest that future large earthquakes on the northern San Andreas fault may subject the current San Francisco Bay urban area to stronger shaking than a repeat of the 1906 earthquake. Ruptures propagating southward towards San Francisco appear to expose more of the urban area to a given intensity level than do ruptures propagating northward.

  15. MyShake: Building a smartphone seismic network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2014-12-01

    We are in the process of building a smartphone seismic network. In order to build this network, we conducted shake table tests to evaluate the performance of smartphones as seismic recording instruments. We also conducted noise floor tests to find the minimum earthquake signal we can record using smartphones. We added phone noise to strong-motion data from past earthquakes and used these records as an analog dataset to test algorithms and to understand the differences between using the smartphone network and the traditional seismic network. We also built a prototype system that triggers the smartphones from our server to record signals, which can be sent back to the server in near real time. The phones can also be triggered by our algorithm running locally on the phone: if an earthquake triggers the phones, the signals recorded by the phones are sent back to the server. We expect to turn the prototype system into a real smartphone seismic network that works as a supplementary network to the existing traditional seismic network.

  16. A Simple Algorithm for Predicting Bacteremia Using Food Consumption and Shaking Chills: A Prospective Observational Study.

    PubMed

    Komatsu, Takayuki; Takahashi, Erika; Mishima, Kentaro; Toyoda, Takeo; Saitoh, Fumihiro; Yasuda, Akari; Matsuoka, Joe; Sugita, Manabu; Branch, Joel; Aoki, Makoto; Tierney, Lawrence; Inoue, Kenji

    2017-07-01

    Predicting the presence of true bacteremia based on clinical examination is unreliable. We aimed to construct a simple algorithm for predicting true bacteremia by using food consumption and shaking chills. This was a prospective multicenter observational study conducted at three hospital centers in a large Japanese city. In total, 1,943 hospitalized patients aged 14 to 96 years who underwent blood culture acquisitions between April 2013 and August 2014 were enrolled. Patients with anorexia-inducing conditions were excluded. We assessed the patients' oral food intake based on the meal immediately prior to the blood culture, defining "normal food consumption" as consumption of >80% of the meal and "poor food consumption" as consumption of <80%. We also concurrently evaluated for a history of shaking chills. We calculated the statistical characteristics of food consumption and shaking chills for the presence of true bacteremia, and subsequently built the algorithm by using recursive partitioning analysis. Among 1,943 patients, 223 cases were true bacteremia. Among patients with normal food consumption and without shaking chills, the incidence of true bacteremia was 2.4% (13/552). Among patients with poor food consumption and shaking chills, the incidence of true bacteremia was 47.7% (51/107). The presence of poor food consumption had a sensitivity of 93.7% (95% confidence interval [CI], 89.4%-97.9%) for true bacteremia, and the absence of poor food consumption (ie, normal food consumption) had a negative likelihood ratio (LR) of 0.18 (95% CI, 0.17-0.19) for excluding true bacteremia. Conversely, the presence of shaking chills had a specificity of 95.1% (95% CI, 90.7%-99.4%) and a positive LR of 4.78 (95% CI, 4.56-5.00) for true bacteremia. A 2-item screening checklist for food consumption and shaking chills had excellent statistical properties as a brief screening instrument for predicting true bacteremia. © 2017 Society of Hospital Medicine
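    The screening rule described above reduces to two binary findings. The following sketch mirrors the decision logic as stated in the abstract; the branch for discordant findings and the function name are assumptions for illustration, not part of the published algorithm.

```python
def bacteremia_risk(normal_food_consumption: bool, shaking_chills: bool) -> str:
    """Toy two-item screen following the figures quoted in the abstract.

    The risk percentages come from the study cohort cited above; the
    intermediate branch is an assumption, since the published recursive
    partitioning tree may split differently.
    """
    if normal_food_consumption and not shaking_chills:
        return "low risk (~2.4% true bacteremia in the study cohort)"
    if not normal_food_consumption and shaking_chills:
        return "high risk (~47.7% true bacteremia in the study cohort)"
    return "intermediate risk (interpret with clinical context)"

print(bacteremia_risk(normal_food_consumption=True, shaking_chills=False))
```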

  17. Bouncing ball problem: stability of the periodic modes.

    PubMed

    Barroso, Joaquim J; Carneiro, Marcus V; Macau, Elbert E N

    2009-02-01

    Exploring all its ramifications, we give an overview of the simple yet fundamental bouncing ball problem, which consists of a ball bouncing vertically on a sinusoidally vibrating table under the action of gravity. The dynamics is modeled with a discrete map of difference equations which, when solved numerically, reveals a rich variety of nonlinear behaviors, encompassing irregular nonperiodic orbits, subharmonic and chaotic motions, chattering mechanisms, and also unbounded nonperiodic orbits. For periodic motions, the corresponding conditions for stability and bifurcation are determined from analytical considerations of a reduced map. Through numerical examples, it is shown that a slight change in the initial conditions makes the ball motion switch from periodic to chaotic orbits bounded by a velocity strip v = ±Γ(1 − ε), where Γ is the nondimensionalized shaking acceleration and ε is the coefficient of restitution, which quantifies the amount of energy lost in the ball-table collision.
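    For readers unfamiliar with the map-based formulation, the sketch below iterates the standard high-bounce (Holmes) approximation of the bouncing-ball map, using the abstract's Γ and ε; the exact map and parameter values used in the paper may differ.

```python
import numpy as np

def bouncing_ball_map(theta0, v0, gamma=1.2, eps=0.8, n_steps=2000):
    """Iterate the high-bounce approximation of the bouncing-ball map.

    theta : table phase at impact
    v     : nondimensional ball velocity at impact
    gamma : nondimensionalized shaking acceleration (Gamma in the abstract)
    eps   : coefficient of restitution (epsilon in the abstract)
    """
    theta, v = theta0, v0
    orbit = []
    for _ in range(n_steps):
        theta = (theta + v) % (2.0 * np.pi)   # phase advances by the flight time
        v = eps * v + gamma * np.cos(theta)   # impact with the moving table
        orbit.append((theta, v))
    return np.array(orbit)

# A small change in the initial velocity can switch the orbit between
# periodic and chaotic behaviour, as described in the abstract.
orbit_a = bouncing_ball_map(0.1, 0.50)
orbit_b = bouncing_ball_map(0.1, 0.51)
```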

  18. Optical interconnect for large-scale systems

    NASA Astrophysics Data System (ADS)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  19. California spotted owl habitat characteristics and use

    Treesearch

    Susan L. Roberts

    2017-01-01

    California spotted owls (Strix occidentalis occidentalis) establish large home ranges averaging about 1279 ha (3,160 ac) (table 3-1), and within these home ranges individual owls select habitat at different scales, depending on their activity. At the smallest spatial scale, the nest tree, it appears there is very limited flexibility in the...

  20. ShakeMap fed by macroseismic data in France: feedbacks and contribution for improving SHA.

    NASA Astrophysics Data System (ADS)

    Schlupp, A.

    2016-12-01

    We are using the USGS ShakeMap software V3.5, which allows intensity to be included as input alongside instrumental data. We have been collecting citizen testimonies for 17 years in France, a region of moderate seismicity in its metropolitan part and in a subduction context in its West Indies part. We frequently collect several thousand testimonies after events of Mw ≈ 4.5 or larger. Thanks to the selection of "intensity characteristic thumbnails", we can provide in real time a single questionnaire intensity (SQI), averaged at the city scale, as a preliminary EMS98 intensity. We observed that about 65% of these "thumbnail SQIs" are identical to the "final expert SQI", and the remainder is shifted by only one intensity degree. With about 36,000 cities (one per 14 square km), we are able to sample the territory in detail, whereas the roughly 400 seismic stations provide irreplaceable, precise ground-motion parameters that are very local and most of the time at larger epicentral distances. Since 2012, we have contributed as an intensity provider for ShakeMap in the Pyrenees range (www.SisPyr.eu). Since spring 2016, we have run ShakeMap V3.5 in a beta version for the whole territory of France, with several adaptations for regions with moderate-size events. The BCSF provides intensities (www.franceseisme.fr), and RESIF provides the instrumental data (www.resif.fr), together with the West Indies observatories (OVSG-OVSM) and a few stations from bordering countries. The main feedbacks are: a large improvement at all distances from including intensities, the need to use a regional attenuation law, the detection of significant ML overestimation in a few regions, a strong dependence on the epicenter location, recently published GMICEs that are well adapted, and difficulty in representing non-circular isoseismals. What we learn from ShakeMap is also a valuable contribution to hazard assessment. We aim to continuously improve the results toward a reference ShakeMap through a specific "ShakeMap transverse action" and its working group in the frame of RESIF.

  1. Earthquake Early Warning: User Education and Designing Effective Messages

    NASA Astrophysics Data System (ADS)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental information to increase the public's understanding of earthquake shaking and protective behaviors.

  2. Solving the problem of Trans-Genomic Query with alignment tables.

    PubMed

    Parker, Douglass Stott; Hsiao, Ruey-Lung; Xing, Yi; Resch, Alissa M; Lee, Christopher J

    2008-01-01

    The trans-genomic query (TGQ) problem--enabling the free query of biological information, even across genomes--is a central challenge facing bioinformatics. Solutions to this problem can alter the nature of the field, moving it beyond the jungle of data integration and expanding the number and scope of questions that can be answered. An alignment table is a binary relationship on locations (sequence segments). An important special case of alignment tables is the hit table: a table of pairs of highly similar segments produced by alignment tools like BLAST. However, alignment tables also include general binary relationships, and can represent any useful connection between sequence locations. They can be curated, and provide a high-quality queryable backbone of connections between biological information. Alignment tables thus can be a natural foundation for TGQ, as they permit a central part of the TGQ problem to be reduced to purely technical problems involving tables of locations. Key challenges in implementing alignment tables include efficient representation and indexing of sequence locations. We define a location datatype that can be incorporated naturally into common off-the-shelf database systems. We also describe an implementation of alignment tables in BLASTGRES, an extension of the open-source POSTGRESQL database system that provides the indexing and operators on locations required for querying alignment tables. This paper also reviews several successful large-scale applications of alignment tables for trans-genomic query. Tables with millions of alignments have been used in queries about alternative splicing, an area of genomic analysis concerning the way in which a single gene can yield multiple transcripts. Comparative genomics is a large potential application area for TGQ and alignment tables.
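    To make the alignment-table idea concrete, here is a minimal relational sketch. It uses SQLite via Python and flattens locations into plain columns; the BLASTGRES location datatype, interval indexing, and all table, column, and gene names below are illustrative assumptions, not the paper's schema.

```python
import sqlite3

# Locations are flattened into (sequence, start, stop) columns; BLASTGRES
# instead adds a native location datatype and interval indexing to
# PostgreSQL. Sequence and gene names here are made up.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE alignment (          -- a binary relation on locations
    q_seq TEXT, q_start INTEGER, q_stop INTEGER,   -- query location
    t_seq TEXT, t_start INTEGER, t_stop INTEGER    -- target location
);
CREATE TABLE exon (
    seq TEXT, ex_start INTEGER, ex_stop INTEGER, gene TEXT
);
""")
con.executemany("INSERT INTO alignment VALUES (?,?,?,?,?,?)",
                [("hg_chr1", 100, 250, "mm_chr4", 900, 1050)])
con.executemany("INSERT INTO exon VALUES (?,?,?,?)",
                [("hg_chr1", 120, 200, "GENE_X")])

# A trans-genomic style query: which exons fall inside an aligned query
# segment, and what is the corresponding target-genome location?
rows = con.execute("""
    SELECT e.gene, a.t_seq, a.t_start, a.t_stop
    FROM exon e
    JOIN alignment a
      ON e.seq = a.q_seq AND e.ex_start >= a.q_start AND e.ex_stop <= a.q_stop
""").fetchall()
print(rows)   # [('GENE_X', 'mm_chr4', 900, 1050)]
```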

  3. Cache Coherence Protocols for Large-Scale Multiprocessors

    DTIC Science & Technology

    1990-09-01

    [Record excerpt is fragmentary. Recoverable content: a cache coherence method designated by the acronym OCPD is compared with other protocols for large-scale machines; the excerpt references Table 4.2 (transaction types and costs, including private read and write misses) and Figure 4-2 (processor utilizations of the Weather program).]

  4. Variable Pitch Darrieus Water Turbines

    NASA Astrophysics Data System (ADS)

    Kirke, Brian; Lazauskas, Leo

    In recent years the Darrieus wind turbine concept has been adapted for use in water, either as a hydrokinetic turbine converting the kinetic energy of a moving fluid in open flow like an underwater wind turbine, or in a low head or ducted arrangement where flow is confined, streamtube expansion is controlled and efficiency is not subject to the Betz limit. Conventional fixed pitch Darrieus turbines suffer from two drawbacks, (i) low starting torque and (ii) shaking due to cyclical variations in blade angle of attack. Ventilation and cavitation can also cause problems in water turbines when blade velocities are high. Shaking can be largely overcome by the use of helical blades, but these do not produce large starting torque. Variable pitch can produce high starting torque and high efficiency, and by suitable choice of pitch regime, shaking can be minimized but not entirely eliminated. Ventilation can be prevented by avoiding operation close to a free surface, and cavitation can be prevented by limiting blade velocities. This paper summarizes recent developments in Darrieus water turbines, some problems and some possible solutions.

  5. The Transuranium Elements.

    ERIC Educational Resources Information Center

    Seaborg, Glenn T.

    1985-01-01

    Discusses the unusual chemistry of the transuranium elements as well as their impact on the periodic table. Also considers the practical applications of transuranium isotopes, such as their use in nuclear fuel for the large-scale generation of electricity. (JN)

  6. Investigations in site response from ground motion observations in vertical arrays

    NASA Astrophysics Data System (ADS)

    Baise, Laurie Gaskins

    The aim of the research is to improve the understanding of earthquake site response and to improve the techniques available to investigate issues in this field. Vertical array ground motion data paired with the empirical transfer function (ETF) methodology is shown to accurately characterize site response. This manuscript draws on methods developed in the field of signal processing and statistical time series analysis to parameterize the ETF as an autoregressive moving-average (ARMA) system which is justified theoretically, historically, and by example. Site response is evaluated at six sites in California, Japan, and Taiwan using ETF estimates, correlation analysis, and full waveform modeling. Correlation analysis is proposed as a required data quality evaluation imperative to any subsequent site response analysis. ETF estimates and waveform modeling are used to decipher the site response at sites with simple and complex geologic structure, which provide simple time-invariant and time-variant methods for evaluating both linear site transfer functions and nonlinear site response for sites experiencing liquefaction of the soils. The Treasure and Yerba Buena Island sites, however, require 2-D waveform modeling to accurately evaluate the effects of the shallow sedimentary basin. ETFs are used to characterize the Port Island site and corresponding shake table tests before, during, and after liquefaction. ETFs derived from the shake table tests were demonstrated to consistently predict the linear field ground response below 16 m depth and the liquefied behavior above 15 m depth. The liquefied interval response was demonstrated to gradually return to pre-liquefied conditions within several weeks of the 1995 Hyogo-ken Nanbu earthquake. Both the site's and the shake table test's response were shown to be effectively linear up to 0.5 g in the native materials below 16 m depth. The effective linearity of the site response at GVDA, Chiba, and Lotung up to 0.1 g, 0.33 g, and 0.49 g, respectively, further confirms that site response in the field may be more linear than expected from laboratory tests. Strong motions were predicted at these sites with normalized mean square error less than 0.10 using ETFs generated from weak motions. The Treasure Island site response was shown to be dominated by surface waves propagating in the shallow sediments of the San Francisco Bay. Low correlation of the ground motions recorded on rock at Yerba Buena Island and in rock beneath the Treasure Island site intimates that the Yerba Buena site is an inappropriate reference site for Treasure Island site response studies. Accurate simulation of the Treasure Island site response was achieved using a 2-D velocity structure comprised of a 100 m uniform soil basin (Vs = 400 m/s) over a weathered rock veneer (Vs = 1.5 km/s) to 200 m depth.
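    As a rough illustration of the input-output idea behind an empirical transfer function, the sketch below forms a smoothed spectral ratio between a surface and a downhole record; the dissertation itself parameterizes the ETF as an ARMA system, and the synthetic records here are placeholders.

```python
import numpy as np

def empirical_transfer_function(surface, downhole, dt, smooth=11):
    """Crude ETF estimate as a smoothed spectral ratio surface/downhole.

    This spectral-ratio form only illustrates the input-output idea; the
    dissertation fits an ARMA model rather than a raw ratio.
    """
    n = len(surface)
    freqs = np.fft.rfftfreq(n, dt)
    ratio = np.abs(np.fft.rfft(surface)) / (np.abs(np.fft.rfft(downhole)) + 1e-12)
    kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
    return freqs, np.convolve(ratio, kernel, mode="same")

# Hypothetical records: 100 s of 100 Hz noise standing in for weak motions.
rng = np.random.default_rng(0)
down = rng.standard_normal(10000)
surf = 2.0 * down + rng.standard_normal(10000)   # pretend site amplification
f, etf = empirical_transfer_function(surf, down, dt=0.01)
```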

  7. Improvement of ε-poly-L-lysine production through seed stage development based on in situ pH monitoring.

    PubMed

    Sun, Qi-Xing; Chen, Xu-Sheng; Ren, Xi-Dong; Mao, Zhong-Gui

    2015-01-01

    Nisin, natamycin, and ε-poly-L-lysine (ε-PL) are three safe, microbially produced food preservatives used today in the food industry. However, current industrial production of ε-PL is only performed in several countries. In order to realize large-scale ε-PL production by fermentation, the effects of seed stage on cell growth and ε-PL production were investigated by monitoring of pH in situ in a 5-L laboratory-scale fermenter. A significant increase in ε-PL production in fed-batch fermentation by Streptomyces sp. M-Z18 was achieved, at 48.9 g/L, through the optimization of several factors associated with seed stage, including spore pretreatment, inoculum age, and inoculum level. Compared with conventional fermentation approaches using 24-h-old shake-flask seed broth as inoculum, the maximum ε-PL concentration and productivity were enhanced by 32.3 and 36.6 %, respectively. The effect of optimized inoculum conditions on ε-PL production on a large scale was evaluated using a 50-L pilot-scale fermenter, attaining a maximum ε-PL production of 36.22 g/L in fed-batch fermentation, constituting the first report of ε-PL production at pilot scale. These results will be helpful for efficient ε-PL production by Streptomyces at pilot and plant scales.

  8. Investigating the Mercalli Intensity Scale through "Lived Experience"

    ERIC Educational Resources Information Center

    Jones, Richard

    2012-01-01

    The modified Mercalli (MM) intensity scale is composed of 12 increasing levels of intensity that range from imperceptible shaking to catastrophic destruction and is designated by Roman numerals I through XII. Although qualitative in nature, it can provide a more concrete model for middle and high school students striving to understand the dynamics…

  9. Seismogeodesy and Rapid Earthquake and Tsunami Source Assessment

    NASA Astrophysics Data System (ADS)

    Melgar Moctezuma, Diego

    This dissertation presents an optimal combination algorithm for strong motion seismograms and regional high rate GPS recordings. This seismogeodetic solution produces estimates of ground motion that recover the whole seismic spectrum, from the permanent deformation to the Nyquist frequency of the accelerometer. This algorithm will be demonstrated and evaluated through outdoor shake table tests and recordings of large earthquakes, notably the 2010 Mw 7.2 El Mayor-Cucapah earthquake and the 2011 Mw 9.0 Tohoku-oki events. This dissertation will also show that strong motion velocity and displacement data obtained from the seismogeodetic solution can be instrumental in quickly determining basic parameters of the earthquake source. We will show how GPS and seismogeodetic data can produce rapid estimates of centroid moment tensors, static slip inversions, and most importantly, kinematic slip inversions. Throughout the dissertation special emphasis will be placed on how to compute these source models with minimal interaction from a network operator. Finally we will show that the incorporation of off-shore data such as ocean-bottom pressure and RTK-GPS buoys can better constrain the shallow slip of large subduction events. We will demonstrate through numerical simulations of tsunami propagation that the earthquake sources derived from the seismogeodetic and ocean-based sensors are detailed enough to provide a timely and accurate assessment of expected tsunami intensity immediately following a large earthquake.

  10. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app has the function to distinguish earthquake shaking from daily human activities based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movements, it sends the trigger information, which contains the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm an earthquake. Therefore, the app provides the basis to form a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
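    The trigger flow described above can be pictured with a small sketch: each phone reports only the trigger time and location, and the server declares an event when enough phones trigger coherently in space and time. The message fields, thresholds, and distance approximation below are assumptions for illustration and do not reflect the app's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Trigger:
    """Per-phone trigger report: only time and location are sent immediately;
    waveforms stay on the phone and are uploaded later. Field names are
    illustrative, not the app's actual message format."""
    device_id: str
    t: float        # trigger time, seconds
    lat: float
    lon: float

def confirm_event(triggers, window_s=20.0, radius_km=50.0, min_phones=4):
    """Toy server-side coherence check: declare an event if enough distinct
    phones trigger within a space-time window around the earliest trigger."""
    if not triggers:
        return False
    first = min(triggers, key=lambda tr: tr.t)

    def dist_km(a, b):
        # small-angle flat-earth approximation, adequate for a sketch
        dx = (a.lon - b.lon) * 111.0 * math.cos(math.radians(a.lat))
        dy = (a.lat - b.lat) * 111.0
        return math.hypot(dx, dy)

    nearby = {tr.device_id for tr in triggers
              if tr.t - first.t <= window_s and dist_km(tr, first) <= radius_km}
    return len(nearby) >= min_phones
```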

  11. Time-resolved large-scale volumetric pressure fields of an impinging jet from dense Lagrangian particle tracking

    NASA Astrophysics Data System (ADS)

    Huhn, F.; Schanz, D.; Manovski, P.; Gesemann, S.; Schröder, A.

    2018-05-01

    Time-resolved volumetric pressure fields are reconstructed from Lagrangian particle tracking with high seeding concentration using the Shake-The-Box algorithm in a perpendicular impinging jet flow with exit velocity U=4 m/s (Re ~ 36,000) and nozzle-plate spacing H/D=5. Helium-filled soap bubbles are used as tracer particles which are illuminated with pulsed LED arrays. A large measurement volume has been covered (cloud of tracked particles in a volume of 54 L, ~180,000 particles). The reconstructed pressure field has been validated against microphone recordings at the wall with high correlation coefficients up to 0.88. In a reduced measurement volume (13 L), dense Lagrangian particle tracking is shown to be feasible up to the maximal possible jet velocity of U=16 m/s.

  12. Experiments on shells under base excitation

    NASA Astrophysics Data System (ADS)

    Pellicano, Francesco; Barbieri, Marco; Zippo, Antonio; Strozzi, Matteo

    2016-05-01

    The aim of the present paper is a deep experimental investigation of the nonlinear dynamics of circular cylindrical shells. The specific problem regards the response of circular cylindrical shells subjected to base excitation. The shells are mounted on a shaking table that furnishes a vertical vibration parallel to the cylinder axis; a heavy rigid disk is mounted on the top of the shells. The base vibration induces a rigid body motion, which mainly causes huge inertia forces exerted by the top disk to the shell. In-plane stresses due to the aforementioned inertias give rise to impressively large vibration on the shell. An extremely violent dynamic phenomenon suddenly appears as the excitation frequency varies up and down close to the linear resonant frequency of the first axisymmetric mode. The dynamics are deeply investigated by varying excitation level and frequency. Moreover, in order to generalise the investigation, two different geometries are analysed. The paper furnishes a complete dynamic scenario by means of: (i) amplitude frequency diagrams, (ii) bifurcation diagrams, (iii) time histories and spectra, (iv) phase portraits and Poincaré maps. It is to be stressed that all the results presented here are experimental.

  13. Fixed Base Modal Testing Using the NASA GRC Mechanical Vibration Facility

    NASA Technical Reports Server (NTRS)

    Staab, Lucas D.; Winkel, James P.; Suarez, Vicente J.; Jones, Trevor M.; Napolitano, Kevin L.

    2016-01-01

    The Space Power Facility at NASA's Plum Brook Station houses the world's largest and most powerful space environment simulation facilities, including the Mechanical Vibration Facility (MVF), which offers the world's highest-capacity multi-axis spacecraft shaker system. The MVF was designed to perform sine vibration testing of a Crew Exploration Vehicle (CEV)-class spacecraft with a total mass of 75,000 pounds, center of gravity (cg) height above the table of 284 inches, diameter of 18 feet, and capability of 1.25 gravity units peak acceleration in the vertical and 1.0 gravity units peak acceleration in the lateral directions. The MVF is a six-degree-of-freedom, servo-hydraulic, sinusoidal base-shake vibration system that has the advantage of being able to perform single-axis sine vibration testing of large structures in the vertical and two lateral axes without the need to reconfigure the test article for each axis. This paper discusses efforts to extend the MVF's capabilities so that it can also be used to determine fixed base modes of its test article without the need for an expensive test-correlated facility simulation.

  14. Modeling of a viscoelastic damper and its application in structural control.

    PubMed

    Mehrabi, M H; Suhatril, Meldi; Ibrahim, Zainah; Ghodsi, S S; Khatibi, Hamed

    2017-01-01

    Conventional seismic rehabilitation methods may not be suitable for some buildings owing to their high cost and time-consuming foundation work. In recent years, viscoelastic dampers (VEDs) have been widely used in many mid- and high-rise buildings. This study introduces a viscoelastic passive control system called the rotary rubber braced damper (RRBD). The RRBD is an economical, lightweight, and easy-to-assemble device. A finite element model considering nonlinearity, large deformation, and material damage is developed to conduct a parametric study on different damper sizes under pushover cyclic loading. The fundamental characteristics of this VED system are clarified by analyzing building structures under cyclic loading. The results show excellent energy absorption and stable hysteresis loops in all specimens. Additionally, using a sinusoidal shaking table test, the effectiveness of the RRBD in managing the response displacement and acceleration of steel frames is considered. The RRBD functioned at early stages of lateral displacement, indicating that the system is effective for all levels of vibration. Moreover, the proposed damper shows significantly better performance in terms of the column compression force resulting from the brace action compared to chevron bracing (CB).

  15. Simplified dynamic analysis to evaluate liquefaction-induced lateral deformation of earth slopes: a computational fluid dynamics approach

    NASA Astrophysics Data System (ADS)

    Jafarian, Yaser; Ghorbani, Ali; Ahmadi, Omid

    2014-09-01

    Lateral deformation of liquefiable soil is a cause of much damage during earthquakes, reportedly more than other forms of liquefaction-induced ground failure. Researchers have presented studies in which the liquefied soil is treated as a viscous fluid. In this manner, the liquefied soil behaves as a non-Newtonian fluid, whose viscosity decreases as the shear strain rate increases. The current study incorporates computational fluid dynamics to propose a simplified dynamic analysis for the liquefaction-induced lateral deformation of earth slopes. The numerical procedure involves a quasi-linear elastic model for small to moderate strains and a Bingham fluid model for large strain states during liquefaction. An iterative procedure is considered to estimate the strain-compatible shear stiffness of soil. The post-liquefaction residual strength of soil is considered as the initial Bingham viscosity. Performance of the numerical procedure is examined by using the results of centrifuge model and shaking table tests together with some field observations of lateral ground deformation. The results demonstrate that the proposed procedure predicts the time history of lateral ground deformation with a reasonable degree of precision.
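    The shear-thinning behaviour invoked above follows from the generic Bingham relation, in which the apparent viscosity falls as the shear strain rate grows. The sketch below shows that relation only; the parameter values are hypothetical, and the mapping of the post-liquefaction residual strength onto the Bingham parameters is the modelling choice described in the abstract.

```python
def bingham_apparent_viscosity(shear_strain_rate, tau_y, mu_b, rate_floor=1e-6):
    """Apparent viscosity of a Bingham fluid: mu_app = mu_B + tau_y / rate.

    tau_y : yield stress (Pa)
    mu_b  : plastic (Bingham) viscosity (Pa*s)
    The apparent viscosity decreases as the shear strain rate increases,
    which is the non-Newtonian behaviour attributed to liquefied soil.
    """
    rate = max(shear_strain_rate, rate_floor)   # avoid division by zero
    return mu_b + tau_y / rate

# Example: the apparent viscosity drops by orders of magnitude as straining
# speeds up (strain rates in 1/s; all values hypothetical).
for rate in (1e-3, 1e-2, 1e-1, 1.0):
    print(rate, bingham_apparent_viscosity(rate, tau_y=5.0e3, mu_b=1.0e2))
```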

  16. [The ecological and epidemiological principles of prevention of ascariasis under the conditions of large-scale solid waste storage].

    PubMed

    Kas'ianov, V I

    2005-01-01

    The paper presents the results of a study of the impact of large-scale solid waste storage on ascariasis morbidity in the population. The use of sewage sediments as an organic soil fertilizer to grow strawberries and table greens is shown to substantially increase the risk of Ascaris infection in the population. Storage of solid domestic garbage on specialized dumping grounds does not lead to mass environmental pollution with geohelminthic eggs.

  17. Finite-Fault and Other New Capabilities of CISN ShakeAlert

    NASA Astrophysics Data System (ADS)

    Boese, M.; Felizardo, C.; Heaton, T. H.; Hudnut, K. W.; Hauksson, E.

    2013-12-01

    Over the past 6 years, scientists at Caltech, UC Berkeley, the Univ. of Southern California, the Univ. of Washington, the US Geological Survey, and ETH Zurich (Switzerland) have developed the 'ShakeAlert' earthquake early warning demonstration system for California and the Pacific Northwest. We have now started to transform this system into a stable end-to-end production system that will be integrated into the daily routine operations of the CISN and PNSN networks. To quickly determine the earthquake magnitude and location, ShakeAlert currently processes and interprets real-time data streams from several hundred seismic stations within the California Integrated Seismic Network (CISN) and the Pacific Northwest Seismic Network (PNSN). Based on these parameters, the 'UserDisplay' software predicts and displays the arrival and intensity of shaking at a given user site. Real-time ShakeAlert feeds are currently being shared with around 160 individuals, companies, and emergency response organizations to gather feedback about the system performance, to educate potential users about EEW, and to identify needs and applications of EEW in a future operational warning system. To improve the performance during large earthquakes (M>6.5), we have started to develop, implement, and test a number of new algorithms for the ShakeAlert system: the 'FinDer' (Finite Fault Rupture Detector) algorithm provides real-time estimates of locations and extents of finite-fault ruptures from high-frequency seismic data. The 'GPSlip' algorithm estimates the fault slip along these ruptures using high-rate real-time GPS data. And, third, a new type of ground-motion prediction model derived from over 415,000 rupture simulations along active faults in southern California improves MMI intensity predictions for large earthquakes with consideration of finite-fault, rupture directivity, and basin response effects. FinDer and GPSlip are currently being tested in real time and offline in a separate internal ShakeAlert installation at Caltech. Real-time position and displacement time series from around 100 GPS sensors are obtained in JSON format from RTK/PPP(AR) solutions using the RTNet software at USGS Pasadena. However, we have also started to investigate the usage of onsite (in-receiver) processing using NetR9 with RTX and tracebuf2 output format. A number of changes to the ShakeAlert processing, XML message format, and the usage of this information in the UserDisplay software were necessary to handle the new finite-fault and slip information from the FinDer and GPSlip algorithms. In addition, we have developed a framework for end-to-end off-line testing with archived and simulated waveform data using the Earthworm tankplayer. Detailed background information about the algorithms, processing, and results from these test runs will be presented.

  18. Shaking table experimentation on adjacent structures controlled by passive and semi-active MR dampers

    NASA Astrophysics Data System (ADS)

    Basili, M.; De Angelis, M.; Fraraccio, G.

    2013-06-01

    This paper presents the results of shaking table tests on adjacent structures controlled by passive and semi-active MR dampers. The aim was to demonstrate experimentally the effectiveness of passive and semi-active strategies in reducing structural vibrations due to seismic excitation. The physical model consisted of two adjacent steel structures, of 4 and 2 levels respectively, connected at the second level by an MR damper. When the device operated in semi-active mode, an ON-OFF control algorithm, derived from Lyapunov stability theory, was implemented and experimentally validated. Since the experimentation concerned adjacent structures, two control objectives were pursued: global and selective protection. In the case of global protection, the attention was focused on protecting both structures, whereas, in the case of selective protection, the attention was focused on protecting only one structure. For each objective the effectiveness of passive control was compared with the uncontrolled case, and then the effectiveness of semi-active control was compared with the passive one. The quantities directly compared were the measured displacements, the accelerations and the force-displacement response of the MR damper; moreover, some global response quantities were estimated from the experimental measurements: the base shear force, the base bending moment, the input energy and the energy dissipated by the device. In order to evaluate the effectiveness of the control action in both the passive and semi-active cases, an energy index (EDI), previously defined and already often applied numerically, was utilized. The aspects investigated in the experimentation were the implementation and validation of the control algorithm for selective and global protection, the influence of the MR damper input voltage, and the kind of seismic input and its intensity.
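    One common form of a Lyapunov-derived ON-OFF law for MR dampers, in the spirit of Jansen and Dyke (2000), commands maximum voltage only when doing so makes the Lyapunov function decrease faster. The sketch below uses a hypothetical one-degree-of-freedom stand-in for the identified model of the two frames; the actual switching law, system matrices, and voltage level used in the experiments may differ.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical 1-DOF state-space stand-in for the coupled adjacent frames.
A = np.array([[0.0, 1.0],
              [-100.0, -0.4]])        # state matrix (displacement, velocity)
B = np.array([[0.0], [1.0]])          # influence vector of the damper force
P = solve_continuous_lyapunov(A.T, -np.eye(2))   # solves A'P + P A = -Q

def on_off_voltage(state, measured_damper_force, v_max=5.0):
    """Command maximum voltage only when it makes dV/dt more negative,
    with V = 0.5 * state' P state as the Lyapunov function."""
    switching = (-(state @ P @ B) * measured_damper_force).item()
    return v_max if switching > 0.0 else 0.0

print(on_off_voltage(np.array([0.01, -0.2]), measured_damper_force=150.0))
```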

  19. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning

    USGS Publications Warehouse

    Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.

    2014-01-01

    Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) and broad-band (0–10 Hz) data sets. CyberShake encompasses 3-D wave-propagation simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) for various active faults in southern California that earthquakes need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking in the Los Angeles (LA) basin. These thresholds are used to construct a simple and robust EEW algorithm: to declare a warning, the algorithm only needs to locate the earthquake and to verify that the corresponding magnitude threshold is exceeded. The models predict that a relatively moderate M6.5–7 earthquake along the Palos Verdes, Newport-Inglewood/Rose Canyon, Elsinore or San Jacinto faults with a rupture propagating towards LA could cause ‘very strong’ to ‘severe’ shaking in the LA basin; however, warning times for these events could exceed 30 s.
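    The regression setup can be sketched with off-the-shelf tools: predict an intensity value at one site from the hypocentre, magnitude, and rupture ratio. The kernel choice, hyperparameters, and the synthetic training data below are placeholders, not the CyberShake-trained models.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: features are hypocentre longitude, latitude,
# depth, magnitude, and rupture ratio; labels are fake MMI values.
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.uniform(-118.6, -117.0, 500),   # hypocentre longitude
    rng.uniform(33.5, 34.5, 500),       # hypocentre latitude
    rng.uniform(0.0, 20.0, 500),        # hypocentre depth (km)
    rng.uniform(6.5, 8.5, 500),         # magnitude
    rng.uniform(0.0, 1.0, 500),         # rupture ratio (uni-/bilateral)
])
y = 2.0 + 0.8 * X[:, 3] + rng.normal(0.0, 0.4, 500)   # placeholder MMI labels

# Support vector regression with feature scaling, one model per site.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.2))
model.fit(X, y)
print(model.predict(X[:3]))
```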

  20. An Overview of Mesoscale Modeling Software for Energetic Materials Research

    DTIC Science & Technology

    2010-03-01

    [Record excerpt is fragmentary. Recoverable content: the report surveys mesoscale modeling software for energetic materials, including the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), summarized in Table 10; it notes that extensive reviews, lectures and workshops are available on multiscale modeling of materials applications (76-78) and mentions multi-phase mixtures.]

  1. Pattern-based, multi-scale segmentation and regionalization of EOSD land cover

    NASA Astrophysics Data System (ADS)

    Niesterowicz, Jacek; Stepinski, Tomasz F.

    2017-10-01

    The Earth Observation for Sustainable Development of Forests (EOSD) map is a 25 m resolution thematic map of Canadian forests. Because of its large spatial extent and relatively high resolution, the EOSD is difficult to analyze using standard GIS methods. In this paper we propose multi-scale segmentation and regionalization of EOSD as new methods for analyzing it on large spatial scales. Segments, which we refer to as forest land units (FLUs), are delineated as tracts of forest characterized by cohesive patterns of EOSD categories; we delineated from 727 to 91,885 FLUs within the spatial extent of EOSD, depending on the selected scale of a pattern. The pattern of EOSD categories within each FLU is described by 1037 landscape metrics. A shapefile containing the boundaries of all FLUs together with an attribute table listing the landscape metrics makes up an SQL-searchable spatial database providing detailed information on the composition and pattern of land cover types in the Canadian forest. The shapefile format and the extensive attribute table pertaining to the entire legend of EOSD are designed to facilitate a broad range of investigations in which assessment of the composition and pattern of forest over large areas is needed. We calculated four such databases using different spatial scales of pattern. We illustrate the use of the FLU database for producing forest regionalization maps of two Canadian provinces, Quebec and Ontario. Such maps capture the broad-scale variability of forest at the spatial scale of the entire province. We also demonstrate how the FLU database can be used to map the variability of landscape metrics, and thus the character of the landscape, over all of Canada.

  2. Strong Effects of Vs30 Heterogeneity on Physics-Based Scenario Ground-Shaking Computations

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Pullammanappallil, S. K.

    2014-12-01

    Hazard mapping and building codes worldwide use the vertically time-averaged shear-wave velocity between the surface and 30 meters depth, Vs30, as one predictor of earthquake ground shaking. Intensive field campaigns a decade ago in Reno, Los Angeles, and Las Vegas measured urban Vs30 transects with 0.3-km spacing. The Clark County, Nevada, Parcel Map includes urban Las Vegas and comprises over 10,000 site measurements over 1500 km2, completed in 2010. All of these data demonstrate fractal spatial statistics, with a fractal dimension of 1.5-1.8 at scale lengths from 0.5 km to 50 km. Vs measurements in boreholes up to 400 m deep show very similar statistics at 1 m to 200 m lengths. When included in physics-based earthquake-scenario ground-shaking computations, the highly heterogeneous Vs30 maps exhibit unexpectedly strong influence. In sensitivity tests, low-frequency computations at 0.1 Hz display amplifications (as well as de-amplifications) of 20% due solely to Vs30. In 0.5-1.0 Hz computations, the amplifications are a factor of two or more. At 0.5 Hz and higher frequencies the amplifications can be larger than what the 1-d Building Code equations would predict from the Vs30 variations. Vs30 heterogeneities at one location have strong influence on amplifications at other locations, stretching out in the predominant direction of wave propagation for that scenario. The sensitivity tests show that shaking and amplifications are highly scenario-dependent. Animations of computed ground motions and how they evolve with time suggest that the fractal Vs30 variance acts to trap wave energy and increases the duration of shaking. Validations of the computations against recorded ground motions, possible in Las Vegas Valley due to the measurements of the Clark County Parcel Map, show that ground motion levels and amplifications match, while recorded shaking has longer duration than computed shaking. Several mechanisms may explain the amplification and increased duration of shaking in the presence of heterogeneous spatial distributions of Vs: conservation of wave energy across velocity changes; geometric focusing of waves by low-velocity lenses; vertical resonance and trapping; horizontal resonance and trapping; and multiple conversion of P- to S-wave energy.

  3. Evaluating the intensity of U.S. earthquakes

    USGS Publications Warehouse

    Simon, R.; Stover, C.

    1977-01-01

    The effects of seismic shaking are objective. All observers can agree these are real occurrences and not subjective speculation. Reliable intensity evaluations are based not on a single factor on any scale but on consistent combinations.

  4. The Classroom Sandbox: A Physical Model for Scientific Inquiry

    ERIC Educational Resources Information Center

    Feldman, Allan; Cooke, Michele L.; Ellsworth, Mary S.

    2010-01-01

    For scientists, the sandbox serves as an analog for faulting in Earth's crust. Here, the large, slow processes within the crust can be scaled to the size of a table, and time scales are directly observable. This makes it a useful tool for demonstrating the role of inquiry in science. For this reason, the sandbox is also helpful for learning…

  5. VLSI (Very Large Scale Integrated Circuits) Device Reliability Models.

    DTIC Science & Technology

    1984-12-01

    [Record excerpt is fragmentary. Recoverable content: Table 5.1.2.5-19 gives C1 and C2 circuit complexity failure rates for MOS SSI/MSI devices and for linear devices, in failures per 10^6 hours; the excerpt also lists semiconductor manufacturers including National Semiconductor, Nitron, Raytheon, Sprague, Synertek, Teledyne Crystalonics, TRW Semiconductor, and Zilog.]

  6. Behavior of braced excavation in sand under a seismic condition: experimental and numerical studies

    NASA Astrophysics Data System (ADS)

    Konai, Sanku; Sengupta, Aniruddha; Deb, Kousik

    2018-04-01

    The behavior of braced excavation in dry sand under a seismic condition is investigated in this paper. A series of shake table tests on a reduced scale model of a retaining wall with one level of bracing were conducted to study the effect of different design parameters such as excavation depth, acceleration amplitude and wall stiffness. Numerical analyses using FLAC 2D were also performed considering one level of bracing. The strut forces, lateral displacements and bending moments in the wall at the end of earthquake motion were compared with experimental results. The study showed that in a post-seismic condition, when other factors were constant, lateral displacement, bending moment, strut forces and maximum ground surface displacement increased with excavation depth and the amplitude of base acceleration. The study also showed that as wall stiffness decreased, the lateral displacement of the wall and ground surface displacement increased, but the bending moment of the wall and strut forces decreased. The net earth pressure behind the walls was influenced by excavation depth and the peak acceleration amplitude, but did not change significantly with wall stiffness. Strut force was the least affected parameter when compared with others under a seismic condition.

  7. Experimental investigation of jet pulse control on flexible vibrating structures

    NASA Astrophysics Data System (ADS)

    Karaiskos, Grigorios; Papanicolaou, Panos; Zacharopoulos, Dimitrios

    2016-08-01

    The feasibility of applying on-line fluid jet pulses to actively control the vibrations of flexible structures subjected to harmonic and earthquake-like base excitations provided by a shake table is explored. The operating principles and capabilities of the control system applied have been investigated in a simplified small-scale laboratory model that is a mass attached at the top free end of a vertical flexible slender beam with rectangular cross-section, the other end of which is mounted on an electrodynamic shaker. A pair of opposite jets placed on the mass at the top of the cantilever beam applied the appropriate forces by ejecting pressurized air pulses controlled by on/off solenoid electro-valves via in house developed control software, in order to control the vibration caused by harmonic, periodic and random excitations at pre-selected frequency content provided by the shaker. The dynamics of the structure was monitored by accelerometers and the jet impulses by pressure sensors. The experimental results have demonstrated the effectiveness and reliability of Jet Pulse Control Systems (JPCS). It was verified that the measured root mean square (RMS) vibration levels of the controlled structure from harmonic and earthquake base excitations, could be reduced by approximately 50% and 33% respectively.

  8. Seismic Response of Steel Braced Building Frame Considering Soil Structure Interaction (SSI): An Experimental Study

    NASA Astrophysics Data System (ADS)

    Hirave, Vivek; Kalyanshetti, Mahesh

    2018-02-01

    Conventional fixed-base analysis ignoring the effect of soil flexibility may result in an unsafe design. Therefore, to evaluate the realistic behavior of a structure, the soil-structure interaction (SSI) effect should be incorporated in the analysis. In seismic analysis, the provision of a bracing system is one of the important options for giving a structure sufficient strength and adequate stiffness to resist lateral forces. Different configurations of these bracing systems alter the response of buildings, and therefore it is important to evaluate the most effective bracing systems from the viewpoint of stability against SSI effects. In the present study, three RC building frames, G+3, G+5 and G+7, and their respective scaled-down steel models with two types of steel bracing system, incorporating the effect of soil flexibility, are considered for experimental and analytical study. The analytical study is carried out using the elastic continuum approach and the experimental study is carried out using a shake table. The influence of SSI on various seismic parameters is presented. The study reveals that a steel bracing system is beneficial in controlling the SSI effect, and V bracing is observed to be more effective in resisting seismic load when SSI is considered.

  9. Strategies for Maximizing Successful Drug Substance Technology Transfer Using Engineering, Shake-Down, and Wet Test Runs.

    PubMed

    Abraham, Sushil; Bain, David; Bowers, John; Larivee, Victor; Leira, Francisco; Xie, Jasmina

    2015-01-01

    The technology transfer of biological products is a complex process requiring control of multiple unit operations and parameters to ensure product quality and process performance. To achieve product commercialization, the technology transfer sending unit must successfully transfer knowledge about both the product and the process to the receiving unit. A key strategy for maximizing successful scale-up and transfer efforts is the effective use of engineering and shake-down runs to confirm operational performance and product quality prior to embarking on good manufacturing practice runs such as process performance qualification runs. We discuss key factors to consider in making the decision to perform shake-down or engineering runs. We also present industry benchmarking results on how engineering runs are used in drug substance technology transfers, alongside the main themes and best practices that have emerged. Our goal is to provide companies with a framework for ensuring "right first time" technology transfers with effective deployment of resources within increasingly aggressive timeline constraints. © PDA, Inc. 2015.

  10. [Synthesis of vitamin K2 by isopentenyl transferase NovA in Pichia pastoris Gpn12].

    PubMed

    Wu, Xihua; Li, Zhemin; Liu, Hui; Wang, Peng; Wang, Li; Fang, Xue; Sun, Xiaowen; Ni, Wenfeng; Yang, Qiang; Zheng, Zhiming; Zhao, Genhai

    2018-01-25

    The effect of methanol addition on the heterologous expression of isoprenyl transferase NovQ was studied in Pichia pastoris Gpn12, with menadione and isopentenol as precursors to catalyze vitamin K2 (MK-3) synthesis. The expression of NovQ increased by 36% when 2% methanol was added every 24 h. The influence of initial pH, temperature, methanol addition, precursors (menadione, isopentenol) addition, catalytic time and cetyltrimethyl-ammonium bromide (CTAB) addition were explored in the P. pastoris whole-cell catalytic synthesis process of MK-3 in shaking flask. Three significant factors were then studied by response surface method. The optimal catalytic conditions obtained were as follows: catalytic temperature 31.56 ℃, menadione 295.54 mg/L, catalytic time 15.87 h. Consistent with the response surface prediction results, the optimized yield of MK-3 reached 98.47 mg/L in shaking flask, 35% higher than that of the control group. On this basis, the production in a 30-L fermenter reached 189.67 mg/L when the cell catalyst of 220 g/L (dry weight) was used to catalyze the synthesis for 24 h. This method laid the foundation for the large-scale production of MK-3 by P. pastoris Gpn12.

  11. Overexpression of Human Bone Alkaline Phosphatase in Pichia Pastoris

    NASA Technical Reports Server (NTRS)

    Karr, Laurel; Malone, Christine, C.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The Pichia pastoris expression system was utilized to produce functionally active human bone alkaline phosphatase in gram quantities. Bone alkaline phosphatase is a key enzyme in bone formation and biomineralization, yet important questions about its structural chemistry and interactions with other cellular enzymes in mineralizing tissues remain unanswered. A soluble form of human bone alkaline phosphatase was constructed by deletion of the 25 amino acid hydrophobic C-terminal region of the encoding cDNA and inserted into the Pichia pastoris X-33 strain. An overexpression system was developed in shake flasks and converted to large-scale fermentation. Alkaline phosphatase was secreted into the medium to a level of 32 mg/L when cultured in shake flasks. Enzyme activity was 12 U/mg measured by a spectrophotometric assay. Fermentation yielded 880 mg/L with enzymatic activity of 968 U/mg. Gel electrophoresis analysis indicates that greater than 50% of the total protein in the fermentation is alkaline phosphatase. A purification scheme has been developed using ammonium sulfate precipitation followed by hydrophobic interaction chromatography. We are currently screening crystallization conditions of the purified recombinant protein for subsequent X-ray diffraction analyses. Structural data should provide additional information on the role of alkaline phosphatase in normal bone mineralization and in certain bone mineralization anomalies.

  12. A fuzzy model of superelastic shape memory alloys for vibration control in civil engineering applications

    NASA Astrophysics Data System (ADS)

    Ozbulut, O. E.; Mir, C.; Moroni, M. O.; Sarrazin, M.; Roschke, P. N.

    2007-06-01

    Two experimental test programs are conducted to collect data and simulate the dynamic behavior of CuAlBe shape memory alloy (SMA) wires. First, in order to evaluate the effect of temperature changes on superelastic SMA wires, a large number of cyclic, sinusoidal, tensile tests are performed at 1 Hz. These tests are conducted in a controlled environment at 0, 25 and 50 °C with three different strain amplitudes. Second, in order to assess the dynamic effects of the material, a series of laboratory experiments is conducted on a shake table with a scale model of a three-story structure that is stiffened with SMA wires. Data from these experiments are used to create fuzzy inference systems (FISs) that can predict hysteretic behavior of CuAlBe wire. Both fuzzy models employ a total of three input variables (strain, strain-rate, and temperature or pre-stress) and an output variable (predicted stress). Gaussian membership functions are used to fuzzify data for each of the input and output variables. Values of the initially assigned membership functions are adjusted using a neural-fuzzy procedure to more accurately predict the correct stress level in the wires. Results of the trained FISs are validated using test results from experimental records that had not been previously used in the training procedure. Finally, a set of numerical simulations is conducted to illustrate practical use of these wires in a civil engineering application. The results reveal the applicability for structural vibration control of pseudoelastic CuAlBe wire whose highly nonlinear behavior is modeled by a simple, accurate, and computationally efficient FIS.
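    A zero-order Sugeno-style evaluation with Gaussian membership functions illustrates how such an FIS maps strain, strain-rate and temperature to a stress prediction. The rule table and parameter values below are invented for illustration; the paper's FISs are trained from the wire tests with a neuro-fuzzy procedure and may use a different structure.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function with centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def fis_stress(strain, strain_rate, temperature, rules):
    """Tiny zero-order Sugeno-style FIS: each rule carries Gaussian membership
    functions on the three inputs and a constant stress consequent; the output
    is the firing-strength-weighted average of the consequents."""
    weights, outputs = [], []
    for (c_e, s_e), (c_r, s_r), (c_t, s_t), stress_out in rules:
        w = gauss(strain, c_e, s_e) * gauss(strain_rate, c_r, s_r) * gauss(temperature, c_t, s_t)
        weights.append(w)
        outputs.append(stress_out)
    weights = np.array(weights)
    return float(np.dot(weights, outputs) / (weights.sum() + 1e-12))

# Two hypothetical rules (strain in %, rate in %/s, temperature in deg C, stress in MPa).
rules = [
    ((1.0, 1.0), (2.0, 2.0), (25.0, 15.0), 250.0),
    ((4.0, 1.5), (2.0, 2.0), (25.0, 15.0), 450.0),
]
print(fis_stress(strain=2.5, strain_rate=2.0, temperature=25.0, rules=rules))
```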

  13. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid 2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimation of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. 
This report describes the development of a country- or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently operating PAGER system. PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of the earthquake) to implement on a global basis and will thus take more time to develop and implement within the PAGER system.
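    The loss calculation described above chains three quantities: the population exposed at each intensity level, a proxy economic exposure (per-capita GDP times exposed population times a wealth correction factor), and a country-specific lognormal loss ratio. A minimal sketch of that chain follows; the parameter names and example values (GDP, correction factor, lognormal median and dispersion) are hypothetical, not PAGER's calibrated coefficients.

    ```python
    import numpy as np
    from scipy.stats import lognorm   # lognormal CDF for the loss-ratio model

    def economic_exposure(pop_by_intensity, gdp_per_capita, correction_factor):
        """Proxy economic exposure at each shaking intensity level:
        per-capita GDP x exposed population, scaled by a wealth/GDP correction."""
        return {mmi: gdp_per_capita * pop * correction_factor
                for mmi, pop in pop_by_intensity.items()}

    def loss_ratio(mmi, theta, beta):
        """Loss ratio as a lognormal CDF of shaking intensity, with median theta
        and log-standard deviation beta calibrated on past earthquake losses."""
        return lognorm.cdf(mmi, s=beta, scale=theta)

    def expected_loss(pop_by_intensity, gdp_per_capita, correction_factor, theta, beta):
        exposure = economic_exposure(pop_by_intensity, gdp_per_capita, correction_factor)
        return sum(loss_ratio(mmi, theta, beta) * value
                   for mmi, value in exposure.items())

    # Hypothetical example: population exposed at MMI VI-IX, per-capita GDP in USD
    pop = {6: 2.0e6, 7: 8.0e5, 8: 1.5e5, 9: 2.0e4}
    print(f"{expected_loss(pop, 9000.0, 3.0, theta=10.5, beta=0.3):,.0f} USD")
    ```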

  14. Analyzing the dynamic response of rotating blades in small-scale wind turbines

    NASA Astrophysics Data System (ADS)

    Hsiung, Wan-Ying; Huang, Yu-Ting; Loh, Chin-Hsiung; Loh, Kenneth J.; Kamisky, Robert J.; Nip, Danny; van Dam, Cornelis

    2014-03-01

    The objective of this study was to validate modal analysis, system identification and damage detection of small-scale rotating wind turbine blades in the laboratory and in the field. Here, wind turbine blades were instrumented with accelerometers and strain gages, and data acquisition was achieved using a prototype wireless sensing system. In the first portion of this study conducted in the laboratory, sensors were installed onto metallic structural elements that were fabricated to be representative of an actual wind blade. In order to control the excitation (rotation of the wind blade), a motor was used to spin the blades at controlled angular velocities. The wind turbine was installed on a shaking table for testing under rotation of turbine blades. Data measured by the sensors were recorded while the blade was operated at different speeds. On the other hand, the second part of this study utilized a small-scale wind turbine system mounted on the rooftop of a building. The main difference, as compared to the lab tests, was that the field tests relied on actual wind excitations (as opposed to a controlled motor). The raw data from both tests were analyzed using signal processing and system identification techniques for deriving the modal response of the blades. The multivariate singular spectrum analysis (MSSA) and covariance-driven stochastic subspace identification method (SSI-COV) were used to identify the dynamic characteristics of the system. A damage case, in which the bolted connection of one turbine blade was loosened, was also tested in the laboratory. The extracted modal properties for both the undamaged and damaged cases under different ambient or forced excitations (earthquake loading) were compared. These tests confirmed that dynamic characterization of rotating wind turbines was feasible, and the results will guide future monitoring studies planned for larger-scale systems.
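    Of the two identification techniques named above, the covariance-driven stochastic subspace method lends itself to a compact illustration. The sketch below is a bare-bones SSI-COV pass over multi-channel acceleration data, assuming output-only (ambient) excitation; the window lengths, model order, and variable names are illustrative choices, not the settings used in the study, and a practical implementation would add stabilization diagrams to reject spurious poles.

    ```python
    import numpy as np

    def ssi_cov(y, fs, order, n_lags):
        """Covariance-driven stochastic subspace identification (SSI-COV) sketch.
        y: (n_channels, n_samples) output-only measurements.
        Returns natural frequencies (Hz) and damping ratios for the chosen order."""
        l, N = y.shape
        # output covariance matrices R_i = E[y_{k+i} y_k^T], i = 0 .. 2*n_lags-1
        R = [y[:, i:] @ y[:, :N - i].T / (N - i) for i in range(2 * n_lags)]
        # block Hankel matrix of covariances, which factors into observability
        # and (reversed) controllability matrices
        H = np.block([[R[i + j + 1] for j in range(n_lags)] for i in range(n_lags)])
        U, s, Vt = np.linalg.svd(H)
        O = U[:, :order] * np.sqrt(s[:order])        # observability matrix
        A = np.linalg.pinv(O[:-l, :]) @ O[l:, :]     # shift invariance gives A
        mu = np.linalg.eigvals(A).astype(complex)    # discrete-time poles
        lam = np.log(mu) * fs                        # continuous-time poles
        freqs = np.abs(lam) / (2.0 * np.pi)
        damping = -np.real(lam) / np.abs(lam)
        return freqs, damping

    # Hypothetical usage on a 100 Hz, 3-channel blade record:
    # freqs, zeta = ssi_cov(acc_3xN, fs=100.0, order=8, n_lags=40)
    ```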

  15. Coping Behavior of International Late Adolescent Students in Selected Australian Educational Institutions

    PubMed Central

    Shahrill, Masitah; Mundia, Lawrence

    2014-01-01

    Using the Adolescent Coping Scale, ACS (Frydenberg & Lewis, 1993) we surveyed 45 randomly selected foreign adolescents in Australian schools. The coping strategies used most by the participants were: focus on solving the problem; seeking relaxing diversions; focusing on the positive; seeking social support; worry; seeking to belong; investing in close friends; wishful thinking; and keep to self (Table 4). With regard to coping styles, the most widely used was the productive coping followed by non-productive coping while the least used style was reference to others (Table 4). In terms of both genders the four coping strategies used most often were: work hard to achieve; seeking relaxing diversions; focus on solving the problem; and focus on the positive (Table 5). The most noticeable gender difference was the use of the physical recreation coping strategy in which male students engaged more (Fig 1). The usage of four coping strategies (solving problem; work hard; focus on positive; and social support) was higher for students who have been away from family more than once as compared to those who have been away once only while the usage of seeking relaxing diversions was higher for the first timers (Table 6). No significant differences were obtained on the sample’s performance on the ACS subscales by gender (Table 7), frequency of leaving own country (Table 8), country of origin (Table 9), and length of stay in Australia (Table 11). However, foundation students scored significantly higher on the reference to others variable than their secondary school peers (Table 10). We recommended counseling for students with high support needs and further large-scale mixed-methods research to gain additional insights. PMID:24373267

  16. Bringing Policy and Practice to the Table: Young Women's Nutritional Experiences in an Ontario Secondary School

    ERIC Educational Resources Information Center

    Gray, Sarah K.

    2015-01-01

    In recent years, media, health organizations and researchers have raised concern over the health of Canadian children and adolescents. Stakeholders have called on the government to confront the problem. Schools are seen as an ideal location for developing and implementing large-scale interventions because of the ease of access to large groups of…

  17. Investigation of aeroelastic stability phenomena of a helicopter by in-flight shake test

    NASA Technical Reports Server (NTRS)

    Miao, W. L.; Edwards, T.; Brandt, D. E.

    1976-01-01

    The analytical capability of the helicopter stability program is discussed. The parameters which are found to be critical to the air resonance characteristics of the soft in-plane hingeless rotor systems are detailed. A summary of two model test programs, a 1/13.8 Froude-scaled BO-105 model and a 1.67 meter (5.5 foot) diameter Froude-scaled YUH-61A model, are presented with emphasis on the selection of the final parameters which were incorporated in the full scale YUH-61A helicopter. Model test data for this configuration are shown. The actual test results of the YUH-61A air resonance in-flight shake test stability are presented. Included are a concise description of the test setup, which employs the Grumman Automated Telemetry System (ATS), the test technique for recording in-flight stability, and the test procedure used to demonstrate favorable stability characteristics with no in-plane damping augmentation (lag damper removed). The data illustrating the stability trend of air resonance with forward speed and the stability trend of ground resonance for percent airborne are presented.

  18. Assessing the Challenges in the Application of Potential Probiotic Lactic Acid Bacteria in the Large-Scale Fermentation of Spanish-Style Table Olives.

    PubMed

    Rodríguez-Gómez, Francisco; Romero-Gil, Verónica; Arroyo-López, Francisco N; Roldán-Reyes, Juan C; Torres-Gallardo, Rosa; Bautista-Gallego, Joaquín; García-García, Pedro; Garrido-Fernández, Antonio

    2017-01-01

    This work studies the inoculation conditions for allowing the survival/predominance of a potential probiotic strain ( Lactobacillus pentosus TOMC-LAB2) when used as a starter culture in large-scale fermentations of green Spanish-style olives. The study was performed in two successive seasons (2011/2012 and 2012/2013), using about 150 tons of olives. Inoculation immediately after brining (to prevent wild initial microbiota growth) followed by re-inoculation 24 h later (to improve competitiveness) was essential for inoculum predominance. Processing early in the season (September) showed a favorable effect on fermentation and strain predominance on olives (particularly when using acidified brines containing 25 L HCl/vessel) but caused the disappearance of the target strain from both brines and olives during the storage phase. On the contrary, processing in October slightly reduced the target strain predominance on olives (70-90%) but allowed longer survival. The type of inoculum used (laboratory vs. industry pre-adapted) never had significant effects. Thus, this investigation discloses key issues for the survival and predominance of starter cultures in large-scale industrial fermentations of green Spanish-style olives. Results can be of interest for producing probiotic table olives and open new research challenges on the causes of inoculum vanishing during the storage phase.

  19. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682

  20. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    PubMed

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.
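    The on-phone step described above must separate earthquake shaking from everyday phone motion. MyShake does this with a trained classifier; purely for illustration, the sketch below shows a classical STA/LTA trigger, a much simpler stand-in for that detection step, with hypothetical window lengths and threshold.

    ```python
    import numpy as np

    def sta_lta_trigger(acc, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
        """Classical STA/LTA trigger on a smartphone acceleration magnitude trace.
        A generic detector for flagging transient shaking against background;
        NOT the neural-network classifier MyShake actually uses."""
        x = np.abs(acc - np.mean(acc))
        n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
        sta = np.convolve(x, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(x, np.ones(n_lta) / n_lta, mode="same") + 1e-12
        ratio = sta / lta
        return np.where(ratio > threshold)[0]     # sample indices exceeding threshold

    # Hypothetical usage on a 50 Hz phone record:
    # picks = sta_lta_trigger(acc_magnitude, fs=50.0)
    ```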

  1. Pollutant dispersion in a large indoor space: Part 1 -- Scaled experiments using a water-filled model with occupants and furniture.

    PubMed

    Thatcher, T L; Wilson, D J; Wood, E E; Craig, M J; Sextro, R G

    2004-08-01

    Scale modeling is a useful tool for analyzing complex indoor spaces. Scale model experiments can reduce experimental costs, improve control of flow and temperature conditions, and provide a practical method for pretesting full-scale system modifications. However, changes in physical scale and working fluid (air or water) can complicate interpretation of the equivalent effects in the full-scale structure. This paper presents a detailed scaling analysis of a water tank experiment designed to model a large indoor space, and experimental results obtained with this model to assess the influence of furniture and people in the pollutant concentration field at breathing height. Theoretical calculations are derived for predicting the effects from losses of molecular diffusion, small scale eddies, turbulent kinetic energy, and turbulent mass diffusivity in a scale model, even without Reynolds number matching. Pollutant dispersion experiments were performed in a water-filled 30:1 scale model of a large room, using uranine dye injected continuously from a small point source. Pollutant concentrations were measured in a plane, using laser-induced fluorescence techniques, for three interior configurations: unobstructed, table-like obstructions, and table-like and figure-like obstructions. Concentrations within the measurement plane varied by more than an order of magnitude, even after the concentration field was fully developed. Objects in the model interior had a significant effect on both the concentration field and fluctuation intensity in the measurement plane. PRACTICAL IMPLICATION: This scale model study demonstrates both the utility of scale models for investigating dispersion in indoor environments and the significant impact of turbulence created by furnishings and people on pollutant transport from floor level sources. In a room with no furniture or occupants, the average concentration can vary by about a factor of 3 across the room. Adding furniture and occupants can increase this spatial variation by another factor of 3.

  2. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses because of potential earthquakes and also for preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for different opinions on how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of USGS and three Next Generation Ground Motion Attenuation (NGA) models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils defined by a map of the average shear wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake-induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage of buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario.
The process for defining zones of potential landslide and rockfall incorporates rock strength, surface slope, and existing landslides, with ground motions caused by the scenario earthquake. Each scenario is illustrated with maps of seismic shaking potential and fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage. The extent of strong shaking is influenced by subsurface fault dip and near-surface materials. Strong shaking is more widespread in the hanging wall regions of a normal fault. Larger ground motions also occur where young alluvial sediments amplify the shaking. Both of these effects can lead to strong shaking that extends farther from the fault on the valley side than on the hill side. The effect of fault rupture displacements may be localized along the surface trace of the mapped earthquake fault if fault geometry is simple and the fault traces are accurately located. However, surface displacement hazards can spread over a few hundred meters to a few kilometers if the earthquake fault has numerous splays or branches, such as the Hilton Creek Fault. Faulting displacements are estimated to be about 1 meter along normal faults in the study area and close to 2 meters along the White Mountains Fault Zone. All scenarios show the possibility of widespread ground failure. Liquefaction damage would likely occur in the areas of higher ground shaking near the faults where there are sandy/silty sediments and the depth to groundwater is 6.1 meters (20 feet) or less. Generally, this means damage is most common near lakes and streams in the areas of strongest shaking. Landslide potential exists throughout the study region. All steep slopes (>30 degrees) present a potential hazard at any level of shaking. Lesser slopes may have landslides within the areas of the higher ground shaking. The landslide hazard zones also are likely sources for snow avalanches during winter months and for large boulders that can be shaken loose and roll hundreds of feet downhill, as happened during the 1980 Mammoth Lakes earthquakes. Whereas the methodologies used in estimating ground shaking, liquefaction, and landslides are well developed and have been applied in published hazard maps, methodologies used in estimating surface fault displacement are still being developed. Therefore, this report provides a more in-depth and detailed discussion of the methodologies used for deterministic and probabilistic fault displacement hazard analyses for this project.

  3. Ground-motion signature of dynamic ruptures on rough faults

    NASA Astrophysics Data System (ADS)

    Mai, P. Martin; Galis, Martin; Thingbaijam, Kiran K. S.; Vyas, Jagdish C.

    2016-04-01

    Natural earthquakes occur on faults characterized by large-scale segmentation and small-scale roughness. This multi-scale geometrical complexity controls the dynamic rupture process, and hence strongly affects the radiated seismic waves and near-field shaking. For a fault system with given segmentation, the question arises as to what conditions produce large-magnitude multi-segment ruptures, as opposed to smaller single-segment events. Similarly, for variable degrees of roughness, ruptures may be arrested prematurely or may break the entire fault. In addition, fault roughness induces rupture incoherence that determines the level of high-frequency radiation. Using HPC-enabled dynamic-rupture simulations, we generate physically self-consistent rough-fault earthquake scenarios (M~6.8) and their associated near-source seismic radiation. Because these computations are too expensive to be conducted routinely for simulation-based seismic hazard assessment, we strive to develop an effective pseudo-dynamic source characterization that produces (almost) the same ground-motion characteristics. Therefore, we examine how variable degrees of fault roughness affect rupture properties and the seismic wavefield, and develop a planar-fault kinematic source representation that emulates the observed dynamic behaviour. We propose an effective workflow for improved pseudo-dynamic source modelling that incorporates rough-fault effects and the associated high-frequency radiation in broadband ground-motion computation for simulation-based seismic hazard assessment.

  4. Practices of shake-flask culture and advances in monitoring CO2 and O2.

    PubMed

    Takahashi, Masato; Aoyagi, Hideki

    2018-05-01

    About 85 years have passed since shaking culture was devised. Since then, various monitoring devices have been developed to measure culture parameters. The O2 consumed and CO2 produced by the respiration of cells in shaking cultures are of paramount importance because of their presence in both the culture broth and the headspace of the shake flask. Monitoring in situ conditions during shake-flask culture is useful for analysing the behaviour of O2 and CO2, which interact according to Henry's law, and is more convenient than conventional sampling that requires interruption of shaking. In situ monitoring devices for shake-flask cultures are classified as direct or the recently developed bypass type. It is important to understand the characteristics of each type along with their unintended effects on shake-flask cultures, in order to improve the existing devices and culture conditions. Technical developments in bypass monitoring devices are strongly desired in the future. It is also necessary to understand the mechanism underlying conventional shake-flask culture. The existing shaking culture methodology can be expanded into next-generation shake-flask cultures constituting a novel culture environment through a judicious selection of monitoring devices, depending on the intended purpose of the shake-flask culture. Constructing and sharing databases compatible with the various types of monitoring devices and measurement instruments adapted to shaking culture can provide a valuable resource for broadening the application of cells in shake-flask culture.

  5. Relative performance of several inexpensive accelerometers

    USGS Publications Warehouse

    Evans, John R.; Rogers, John A.

    1995-01-01

    We examined the performance of several low-cost accelerometers for highly cost-driven applications in recording earthquake strong motion. We anticipate applications for such sensors in providing the lifeline and emergency-response communities with an immediate, comprehensive picture of the extent and characteristics of likely damage. We also foresee their use as 'filler' instruments sited between research-grade instruments to provide spatially detailed and near-field records of large earthquakes (on the order of 1000 stations at 600-m intervals in San Fernando Valley, population 1.2 million, for example). The latter applications would provide greatly improved attenuation relationships for building codes and design, the first examples of mainshock information (that is, potentially nonlinear regime) for microzonation, and a suite of records for structural engineers. We also foresee possible applications in monitoring structural inter-story drift during earthquakes, possibly leading to local and remote alarm functions as well as design criteria. This effort appears to be the first of its type at the USGS. It is spurred by rapid advances in sensor technology and the recognition of potential non-classical applications. In this report, we estimate sensor noise spectra, relative transfer functions and cross-axis sensitivity of six inexpensive sensors. We tested three micromachined ('silicon-chip') sensors in addition to classical force-balance and piezoelectric examples. This sample of devices is meant to be representative, not comprehensive. Sensor noise spectra were estimated by recording system output with the sensor mounted on a pneumatically supported 545-kg optical-bench isolation table. This isolation table appears to limit ground motion to below our system noise level. These noise estimates include noise introduced by signal-conditioning circuitry, the analog-to-digital converter (ADC), and noise induced in connecting wiring by ambient electromagnetic fields in our suburban laboratory. These latter sources are believed to dominate sensor noise in the quieter sensors we tested. Transfer functions were obtained relative to a research-grade force-balance accelerometer (a Kinemetrics™ FBA-11) by shaking the sensors simultaneously on the same shake table and taking spectral ratios with the output of the FBA-11. This reference sensor is said to have 120 dB dynamic range (±20 bits, though we only digitized it at 16-bit resolution and drove it with relatively small signals). We did not test temperature sensitivity, which is thought to be a significant issue at least for the silicon devices. Though these tests were not designed to be definitive (our anticipated applications do not demand research-grade precision), our tests do appear to have been successful in estimating relative transfer functions from about 0.3 to 50 Hz. Most sensors performed adequately in this range, with essentially flat relative transfer functions. Noise tests appear to measure sensor noise well for the noisier (generally less expensive) instruments from about 0.1 to 50 Hz.
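    The relative transfer functions mentioned above come from shaking the test and reference sensors together and ratioing their spectra. The sketch below shows one common way to estimate such a relative response; it uses an H1 cross-spectral estimator rather than the plain spectral ratio described in the report, and the sensor names and sampling rate in the usage comment are hypothetical.

    ```python
    import numpy as np
    from scipy.signal import welch, csd

    def relative_transfer_function(test, reference, fs, nperseg=4096):
        """Relative transfer function of a test accelerometer with respect to a
        reference sensor recorded simultaneously on the same shake table.
        Uses the cross-spectrum / reference auto-spectrum (H1) estimator."""
        f, Pxx = welch(reference, fs=fs, nperseg=nperseg)
        _, Pxy = csd(reference, test, fs=fs, nperseg=nperseg)
        H = Pxy / Pxx                      # complex frequency response, test/reference
        return f, np.abs(H), np.angle(H)

    # Hypothetical usage with two co-located 200 Hz records:
    # f, mag, phase = relative_transfer_function(cheap_sensor, fba11, fs=200.0)
    ```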

  6. Oscillating potential well in the complex plane and the adiabatic theorem

    NASA Astrophysics Data System (ADS)

    Longhi, Stefano

    2017-10-01

    A quantum particle in a slowly changing potential well V(x,t) = V(x − x₀(εt)), periodically shaken in time at a slow frequency ε, provides an important quantum mechanical system where the adiabatic theorem fails to predict the asymptotic dynamics over time scales longer than ~1/ε. Specifically, we consider a double-well potential V(x) sustaining two bound states spaced in frequency by ω₀ and periodically shaken in the complex plane. Two different spatial displacements x₀(t) are assumed: the real spatial displacement x₀(εt) = A sin(εt), corresponding to ordinary Hermitian shaking, and the complex one x₀(εt) = A − A exp(−iεt), corresponding to non-Hermitian shaking. When the particle is initially prepared in the ground state of the potential well, breakdown of adiabatic evolution is found for both Hermitian and non-Hermitian shaking whenever the oscillation frequency ε is close to an odd resonance of ω₀. However, a different physical mechanism underlying nonadiabatic transitions is found in the two cases. For the Hermitian shaking, an avoided crossing of quasienergies is observed at odd resonances and nonadiabatic transitions between the two bound states, resulting in Rabi flopping, can be explained as a multiphoton resonance process. For the complex oscillating potential well, breakdown of adiabaticity arises from the appearance of Floquet exceptional points at exact quasienergy crossing.

  7. Mapping PetaSHA Applications to TeraGrid Architectures

    NASA Astrophysics Data System (ADS)

    Cui, Y.; Moore, R.; Olsen, K.; Zhu, J.; Dalguer, L. A.; Day, S.; Cruz-Atienza, V.; Maechling, P.; Jordan, T.

    2007-12-01

    The Southern California Earthquake Center (SCEC) has a science program in developing an integrated cyberfacility - PetaSHA - for executing physics-based seismic hazard analysis (SHA) computations. The NSF has awarded PetaSHA 15 million allocation service units this year on the fastest supercomputers available within the NSF TeraGrid. However, one size does not fit all; a range of systems is needed to support this effort at different stages of the simulations. Enabling PetaSHA simulations on those TeraGrid architectures to solve both dynamic rupture and seismic wave propagation has been a challenge at both the hardware and software levels. This is an adaptation procedure to meet the specific requirements of each architecture. It is important to determine how fundamental system attributes affect application performance. We present an adaptive approach in our PetaSHA application that enables the simultaneous optimization of both computation and communication at run-time using flexible settings. These techniques optimize initialization, source/media partition and MPI-IO output in different ways to achieve optimal performance on the target machines. The resulting code is a factor of four faster than the original version. New MPI-IO capabilities have been added for the accurate Staggered-Grid Split-Node (SGSN) method for dynamic rupture propagation in the velocity-stress staggered-grid finite difference scheme (Dalguer and Day, JGR, 2007). We use execution workflows across TeraGrid sites for managing the resulting data volumes. Our lessons learned indicate that minimizing time to solution is most critical, in particular when scheduling large-scale simulations across supercomputer sites. The TeraShake platform has been ported to multiple architectures including the TACC Dell Lonestar and Abe, Cray XT3 BigBen, and Blue Gene/L. Parallel efficiency of 96% with the PetaSHA application Olsen-AWM has been demonstrated on 40,960 Blue Gene/L processors at the IBM TJ Watson Center. Notable accomplishments using the optimized code include the M7.8 ShakeOut rupture scenario, as part of the southern San Andreas Fault evaluation SoSAFE. The ShakeOut simulation domain is the same as that used for the SCEC TeraShake simulations (600 km by 300 km by 80 km). However, the higher resolution of 100 m with frequency content up to 1 Hz required 14.4 billion grid points, eight times more than the TeraShake scenarios. The simulation used 2000 TACC Dell Linux Lonestar processors and took 56 hours to compute 240 seconds of wave propagation. The pre-processing input partition, as well as post-processing analysis, has been performed on the SDSC IBM DataStar p655 and p690. In addition, as part of the SCEC DynaShake computational platform, the SGSN capability was used to model dynamic rupture propagation for a ShakeOut scenario that matches the proposed surface slip and size of the event. Mapping applications to different architectures requires coordination of many areas of expertise at the hardware and application levels, an outstanding challenge in the current petascale computing effort. We believe our techniques, as well as distributed data management through data grids, have provided a practical example of how to effectively use multiple compute resources, and our results will benefit other geoscience disciplines as well.

  8. A Study on the Performance of Low Cost MEMS Sensors in Strong Motion Studies

    NASA Astrophysics Data System (ADS)

    Tanırcan, Gulum; Alçık, Hakan; Kaya, Yavuz; Beyen, Kemal

    2017-04-01

    Recent advances in sensors have helped the growth of local networks. In recent years, many Micro Electro Mechanical System (MEMS)-based accelerometers have been successfully used in seismology and earthquake engineering projects. This is basically due to the increased precision obtained in these downsized instruments. Moreover, they are cheaper alternatives to force-balance type accelerometers. In Turkey, though MEMS-based accelerometers have been used in various individual applications such as magnitude and location determination of earthquakes, structural health monitoring, and earthquake early warning systems, MEMS-based strong motion networks are not currently available in other populated areas of the country. The motivation of this study comes from the fact that, if MEMS sensors are qualified to record strong motion parameters of large earthquakes, a dense network can be formed at an affordable price in highly populated areas. The goals of this study are 1) to test, through shake table tests, the performance of MEMS sensors available in the inventory of the Institute, and 2) to set up a small-scale network for observing online data transfer speed to a trusted in-house routine. In order to evaluate the suitability of the sensors in strong-motion-related studies, the MEMS sensors and a reference sensor are tested under excitations of sweeping waves as well as scaled earthquake recordings. Amplitude response and correlation coefficients versus frequency are compared. As for earthquake recordings, comparisons are carried out in terms of strong motion (SM) parameters (PGA, PGV, AI, CAV) and the elastic response of structures (Sa). Furthermore, this paper also focuses on the sensitivity and selectivity of sensor performance in the time-frequency domain to compare different sensing characteristics, and analyzes the basic strong motion parameters that influence the design. Results show that the cheapest MEMS sensors under investigation are able to record the mid-frequency dominant SM parameters PGV and CAV with high correlation. PGA and AI, the high frequency components of the ground motion, are underestimated. Such a difference, on the other hand, does not manifest itself in intensity estimations. PGV and CAV values from the reference and MEMS sensors converge to the same seismic intensity level. Hence a strong motion network with MEMS sensors could be a modest option to produce PGV-based damage impact of an urban area under large magnitude earthquake threats in the immediate vicinity.
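    The comparison above is framed in terms of standard strong-motion parameters. As a reference for how those quantities are computed from a single acceleration trace, the sketch below gives textbook definitions of PGA, PGV, Arias intensity and CAV; the record name and sampling interval in the usage comment are hypothetical, and a real workflow would detrend and filter the record before integrating.

    ```python
    import numpy as np

    def strong_motion_parameters(acc, dt):
        """Basic strong-motion parameters from a single-component acceleration
        record `acc` (m/s^2) sampled at interval `dt` (s): PGA, PGV, Arias
        intensity (AI) and cumulative absolute velocity (CAV)."""
        vel = np.cumsum(acc) * dt                 # crude integration to velocity
        g = 9.81
        return {
            "PGA": np.max(np.abs(acc)),
            "PGV": np.max(np.abs(vel)),
            "AI": np.pi / (2.0 * g) * np.sum(acc ** 2) * dt,   # Arias intensity, m/s
            "CAV": np.sum(np.abs(acc)) * dt,                   # cumulative absolute velocity
        }

    # Hypothetical usage with a 100 Hz record:
    # params = strong_motion_parameters(acc_record, dt=0.01)
    ```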

  9. The ShakeOut Scenario

    USGS Publications Warehouse

    Jones, Lucile M.; Bernknopf, Richard; Cox, Dale; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Perry, Suzanne; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    This is the initial publication of the results of a cooperative project to examine the implications of a major earthquake in southern California. The study comprised eight counties: Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura. Its results will be used as the basis of an emergency response and preparedness exercise, the Great Southern California ShakeOut, and for this purpose we defined our earthquake as occurring at 10:00 a.m. on November 13, 2008. As members of the southern California community use the ShakeOut Scenario to plan and execute the exercise, we anticipate discussion and feedback. This community input will be used to refine our assessment and will lead to a formal publication in early 2009. Our goal in the ShakeOut Scenario is to identify the physical, social and economic consequences of a major earthquake in southern California and in so doing, enable the users of our results to identify what they can change now, before the earthquake, to avoid catastrophic impact after the inevitable earthquake occurs. To do so, we had to determine the physical damages (casualties and losses) caused by the earthquake and the impact of those damages on the region's social and economic systems. To do this, we needed to know about the earthquake ground shaking and fault rupture. So we first constructed an earthquake, taking all available earthquake research information, from trenching and exposed evidence of prehistoric earthquakes, to analysis of instrumental recordings of large earthquakes and the latest theory in earthquake source physics. We modeled a magnitude (M) 7.8 earthquake on the southern San Andreas Fault, a plausible event on the fault most likely to produce a major earthquake. This information was then fed forward into the rest of the ShakeOut Scenario. The damage impacts of the scenario earthquake were estimated using both HAZUS-MH and expert opinion through 13 special studies and 6 expert panels, and fall into four categories: building damages, non-structural damages, damage to lifelines and infrastructure, and fire losses. The magnitude 7.8 ShakeOut earthquake is modeled to cause about 1800 deaths and $213 billion of economic losses. These numbers are as low as they are because of aggressive retrofitting programs that have increased the seismic resistance of buildings, highways and lifelines, and economic resiliency. These numbers are as large as they are because much more retrofitting could still be done. The earthquake modeled here may never happen. Big earthquakes on the San Andreas Fault are inevitable, and by geologic standards extremely common, but probably will not be exactly like this one. The next very damaging earthquake could easily be on another fault. However, lessons learned from this particular event apply to many other events and could provide benefits in many possible future events.

  10. Improvements to a global-scale groundwater model to estimate the water table across New Zealand

    NASA Astrophysics Data System (ADS)

    Westerhoff, Rogier; Miguez-Macho, Gonzalo; White, Paul

    2017-04-01

    Groundwater models at the global scale have become increasingly important in recent years to assess the effects of climate change and groundwater depletion. However, these global-scale models are typically not used for studies at the catchment scale, because they are simplified and too spatially coarse. In this study, we improved the global-scale Equilibrium Water Table (EWT) model, so it could better assess water table depth and water table elevation at the national scale for New Zealand. The resulting National Water Table (NWT) model used improved input data (i.e., national input data of terrain, geology, and recharge) and model equations (e.g., a hydraulic conductivity - depth relation). The NWT model produced maps of the water table that identified the main alluvial aquifers with fine spatial detail. Two regional case studies at the catchment scale demonstrated excellent correlation between the water table elevation and observations of hydraulic head. The NWT water tables are an improved water table estimation over the EWT model. In two case studies the NWT model provided a better approximation to observed water table for deep aquifers and the improved resolution of the model provided the capability to fill the gaps in data-sparse areas. This national model calculated water table depth and elevation across regional jurisdictions. Therefore, the model is relevant where trans-boundary issues, such as source protection and catchment boundary definition, occur. The NWT model also has the potential to constrain the uncertainty of catchment-scale models, particularly where data are sparse. Shortcomings of the NWT model are caused by the inaccuracy of input data and the simplified model properties. Future research should focus on improved estimation of input data (e.g., hydraulic conductivity and terrain). However, more advanced catchment-scale groundwater models should be used where groundwater flow is dominated by confining layers and fractures.
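    One of the model-equation improvements named above is a hydraulic conductivity-depth relation. The snippet below sketches one commonly assumed form for such a relation in large-scale water table models, an exponential decay of conductivity with depth; the functional form and parameter names are illustrative assumptions, not the specific relation implemented in the NWT model.

    ```python
    import numpy as np

    def hydraulic_conductivity(z, k0, f):
        """One commonly assumed conductivity-depth relation in large-scale water
        table models: near-surface conductivity k0 decaying exponentially with
        depth z over an e-folding depth f (both parameters hypothetical here)."""
        return k0 * np.exp(-z / f)

    def transmissivity_below(d, k0, f):
        """Transmissivity of the column below water table depth d, i.e. the
        integral of k(z) from d to infinity for the exponential profile above."""
        return k0 * f * np.exp(-d / f)
    ```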

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality-aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters.

  12. Distributed database kriging for adaptive sampling (D²KAS)

    DOE PAGES

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; ...

    2015-03-18

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality-aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters.
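    The prediction step described above, kriging over neighboring points fetched from the database, can be illustrated compactly. The sketch below is a generic simple-kriging predictor with a Gaussian covariance model; the kernel choice, parameter values, and example points are hypothetical and not taken from the D²KAS implementation.

    ```python
    import numpy as np

    def simple_kriging(X, y, x_star, sill=1.0, length=1.0, nugget=1e-8):
        """Simple-kriging sketch: predict the value and variance at x_star from
        neighboring samples (X, y) pulled from the database, using a Gaussian
        covariance model. The weighted average of neighbors mirrors the idea above."""
        def cov(a, b):
            d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
            return sill * np.exp(-0.5 * d2 / length ** 2)

        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        x_star = np.atleast_2d(x_star).astype(float)
        K = cov(X, X) + nugget * np.eye(len(X))   # covariance among neighbors
        k = cov(X, x_star)                        # covariance neighbors vs. query
        w = np.linalg.solve(K, k)                 # kriging weights
        mean = (w.T @ (y - y.mean()) + y.mean()).item()
        var = (sill - w.T @ k).item()             # prediction uncertainty
        return mean, var

    # Hypothetical usage: three neighbors in a 2-D parameter space
    X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
    y = [2.0, 3.0, 1.5]
    print(simple_kriging(X, y, [0.5, 0.5]))
    ```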

  13. Laboratory validation of MEMS-based sensors for post-earthquake damage assessment

    NASA Astrophysics Data System (ADS)

    Pozzi, Matteo; Zonta, Daniele; Santana, Juan; Colin, Mikael; Saillen, Nicolas; Torfs, Tom; Amditis, Angelos; Bimpas, Matthaios; Stratakos, Yorgos; Ulieru, Dumitru; Bairaktaris, Dimitirs; Frondistou-Yannas, Stamatia; Kalidromitis, Vasilis

    2011-04-01

    The evaluation of seismic damage is today almost exclusively based on visual inspection, as building owners are generally reluctant to install permanent sensing systems, due to their high installation, management and maintenance costs. To overcome this limitation, the EU-funded MEMSCON project aims to produce small-size sensing nodes for measurement of strain and acceleration, integrating Micro-Electro-Mechanical Systems (MEMS) based sensors and Radio Frequency Identification (RFID) tags in a single package that will be attached to reinforced concrete buildings. To reduce the impact of installation and management, data will be transmitted to a remote base station using a wireless interface. During the project, sensor prototypes were produced by assembling pre-existing components and by developing ex-novo miniature devices with ultra-low power consumption and sensing performance beyond that offered by sensors available on the market. The paper outlines the device operating principles, the production scheme, and operation at both the unit and network levels. It also reports on validation campaigns conducted in the laboratory to assess system performance. Accelerometer sensors were tested on a reduced-scale metal frame mounted on a shaking table, back to back with reference devices, while strain sensors were embedded in both reduced- and full-scale reinforced concrete specimens undergoing increasing deformation cycles up to extensive damage and collapse. The paper assesses the economic sustainability and performance of the sensors developed for the project and discusses their applicability to long-term seismic monitoring.

  14. EPA Facilities and Regional Boundaries Service, US, 2012, US EPA, SEGS

    EPA Pesticide Factsheets

    This SEGS web service contains EPA facilities, EPA facilities labels, small- and large-scale versions of EPA region boundaries, and EPA region boundaries extended to the 200nm Exclusive Economic Zone (EEZ). Small-scale EPA boundaries and boundaries extended to the EEZ render at scales of less than 5 million; large-scale EPA boundaries draw at scales greater than or equal to 5 million. EPA facilities labels draw at scales greater than 2 million. Data used to create this web service are available as a separate download at the Secondary Linkage listed above. Full FGDC metadata records for each layer may be found by clicking the layer name in the web service table of contents (available through the online link provided above) and viewing the layer description. This SEGS dataset was produced by EPA through the Office of Environmental Information.

  15. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements of the ShakeMap and the ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
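    The fragility-based priority logic described above can be summarized with a small numerical example. The sketch below evaluates lognormal fragility curves whose total dispersion folds in a ground-motion uncertainty term and then applies a simple worst-component rule; the component names, parameter values, and priority thresholds are hypothetical and do not reproduce ShakeCast's pre-defined logic.

    ```python
    import numpy as np
    from scipy.stats import norm

    def damage_probability(im, median, beta_c, beta_gm=0.0):
        """Lognormal fragility: probability of reaching a damage state given a
        ground-motion intensity measure `im` (e.g., PGA in g). `median` and
        `beta_c` are the fragility median and log-std; `beta_gm` folds the
        spatially varying ground-motion uncertainty into the total dispersion."""
        beta_total = np.sqrt(beta_c ** 2 + beta_gm ** 2)
        return norm.cdf(np.log(im / median) / beta_total)

    def inspection_priority(im, components):
        """Overall priority from component fragilities using a simple
        'worst component governs' rule (one possible pre-defined logic)."""
        probs = {name: damage_probability(im, *params) for name, params in components.items()}
        worst = max(probs.values())
        if worst > 0.5:
            return "high", probs
        if worst > 0.1:
            return "medium", probs
        return "low", probs

    # Hypothetical facility: component -> (median PGA in g, beta_c, beta_gm)
    components = {"columns": (0.45, 0.4, 0.25), "bearings": (0.30, 0.5, 0.25)}
    print(inspection_priority(0.35, components))
    ```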

  16. The ShakeMap Atlas for the City of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Pierdominici, Simona; Faenza, Licia; Camassi, Romano; Michelini, Alberto; Ercolani, Emanuela; Lauciani, Valentino

    2016-04-01

    Naples is one of the most vulnerable cities in the world because it is threatened by several natural and man-made hazards: earthquakes, volcanic eruptions, tsunamis, landslides, hydrogeological disasters, and morphologic alterations due to human interference. In addition, the risk is increased by the high density of population (Naples and the surrounding area are among the most populated in Italy), and by the type and condition of buildings and monuments. In light of this, it is crucial to assess the ground shaking suffered by the city. We integrate information from five Italian databases and catalogues (DBMI11; CPTI11; CAMAL11; MOLAL08; ITACA) to build a reliable ShakeMap atlas for the area and to recreate the seismic history of the city from historical to recent times (1293 to 1999). This large amount of data gives the opportunity to explore several sources of information, expanding the completeness of our data set in both time and magnitude. 84 earthquakes have been analyzed, and for each event a ShakeMap set has been computed using an ad hoc implementation developed for this application: (1) specific ground-motion prediction equations (GMPEs) accounting for the different attenuation properties in volcanic areas compared with tectonic ones, and (2) detailed local microzonation to include the site effects. The ShakeMap atlas has two main applications: a) it is an important instrument in seismic risk management. It quantifies the level of shaking suffered by the city during its history, and it could be extended to quantify the number of people exposed to certain degrees of shaking. Intensity data provide an evaluation of the damage caused by earthquakes; the damage is closely linked with the ground shaking, building type, and vulnerability, and it is not possible to separate these contributions; b) the atlas can be used as a starting point for Bayesian estimation of seismic hazard. This technique allows for merging with the more standard approach adopted in the compilation of the national hazard map of Italy. These ShakeMaps are provided in terms of Mercalli-Cancani-Sieberg intensity (MCS hereinafter) and peak ground acceleration (PGA).

  17. Artificial diets for life tables bioassays of TPB in Mississippi

    USDA-ARS?s Scientific Manuscript database

    Two artificial diets for mass rearing and bioassay of the tarnished plant bug (TPB), Lygus lineolaris Palisot de Beauvois (Hemiptera: Miridae), were modified and developed, respectively. The first diet is a modification of a semisolid artificial diet (NI diet), which permits large scale rearing of ...

  18. Nonlinear instability and convection in a vertically vibrated granular bed

    NASA Astrophysics Data System (ADS)

    Shukla, Priyanka; Ansari, I. H.; van der Meer, D.; Lohse, Detlef; Alam, Meheboob

    2015-11-01

    The nonlinear instability of the density-inverted granular Leidenfrost state and the resulting convective motion in strongly shaken granular matter are analysed via a weakly nonlinear analysis. Under a quasi-steady ansatz, the base state temperature decreases with increasing height away from the vibrating plate, but the density profile consists of three distinct regions: (i) a collisional dilute layer at the bottom, (ii) a levitated dense layer at some intermediate height and (iii) a ballistic dilute layer at the top of the granular bed. For the nonlinear stability analysis, the nonlinearities up to cubic order in the perturbation amplitude are retained, leading to the Landau equation. The genesis of granular convection is shown to be tied to a supercritical pitchfork bifurcation from the Leidenfrost state. Near the bifurcation point the equilibrium amplitude is found to follow a square-root scaling law, Ae ∝ √Δ, with the distance Δ from the bifurcation point. The strength of convection is maximal at some intermediate value of the shaking strength, with weaker convection at both weaker and stronger shaking. Our theory predicts a novel floating-convection state at very strong shaking.
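    As a reminder of what the cited Landau equation and scaling encode, the display below writes the standard weakly nonlinear amplitude equation for a supercritical pitchfork; the coefficients a₀ and ℓ₀ are generic positive constants, not the values derived in the paper.

    ```latex
    % Landau amplitude equation near the supercritical pitchfork bifurcation
    % from the Leidenfrost state (Delta = distance from the bifurcation point):
    \begin{equation}
      \frac{\mathrm{d}A}{\mathrm{d}t} = a_0\,\Delta\,A - \ell_0\,A^{3},
      \qquad
      A_e = \sqrt{\frac{a_0\,\Delta}{\ell_0}} \propto \sqrt{\Delta}.
    \end{equation}
    ```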

  19. Comments on potential geologic and seismic hazards affecting coastal Ventura County, California

    USGS Publications Warehouse

    Ross, Stephanie L.; Boore, David M.; Fisher, Michael A.; Frankel, Arthur D.; Geist, Eric L.; Hudnut, Kenneth W.; Kayen, Robert E.; Lee, Homa J.; Normark, William R.; Wong, Florence L.

    2004-01-01

    This report examines the regional seismic and geologic hazards that could affect proposed liquefied natural gas (LNG) facilities in coastal Ventura County, California. Faults throughout this area are thought to be capable of producing earthquakes of magnitude 6.5 to 7.5, which could produce surface fault offsets of as much as 15 feet. Many of these faults are sufficiently well understood to be included in the current generation of the National Seismic Hazard Maps; others may become candidates for inclusion in future revisions as research proceeds. Strong shaking is the primary hazard that causes damage from earthquakes and this area is zoned with a high level of shaking hazard. The estimated probability of a magnitude 6.5 or larger earthquake (comparable in size to the 2003 San Simeon quake) occurring in the next 30 years within 30 miles of Platform Grace is 50-60%; for Cabrillo Port, the estimate is a 35% likelihood. Combining these probabilities of earthquake occurrence with relationships that give expected ground motions yields the estimated seismic-shaking hazard. In parts of the project area, the estimated shaking hazard is as high as along the San Andreas Fault. The combination of long-period basin waves and LNG installations with large long-period resonances potentially increases this hazard.

  20. Interplanetary field and plasma during initial phase of geomagnetic storms

    NASA Technical Reports Server (NTRS)

    Patel, V. L.; Wiskerchen, M. J.

    1975-01-01

    A study has been conducted of a large number of geomagnetic storms occurring during the period from 1966 to 1970. Questions of data selection are discussed and the large-scale interplanetary magnetic field during the initial phase is examined. Small-scale interplanetary fields during the initial phase are also considered, taking into account important features of small-scale variations in the interplanetary field and plasma for three storms. Details concerning 23 geomagnetic storms and the interplanetary magnetic field are presented in a table. A study of the initial phase of these storms indicates that in most of these events, the solar-ecliptic Z component of the interplanetary magnetic field turns southward when the main phase decrease begins.

  1. Some problems of control of dynamical conditions of technological vibrating machines

    NASA Astrophysics Data System (ADS)

    Kuznetsov, N. K.; Lapshin, V. L.; Eliseev, A. V.

    2017-10-01

    The possibility of controlling the dynamical condition of shakers designed for vibration treatment of parts interacting with granular media is discussed. The aim of this article is to develop a methodological basis for creating mathematical models of shake tables, and to develop principles for the formation of vibrational fields, the estimation of their parameters, and the control of the structure of vibration fields. Approaches to building mathematical models that account for unilateral constraints and for the relationships of elements with the vibrating surface are developed. Methods for constructing mathematical models of linear mechanical oscillatory systems are used, assuming small oscillations about the position of static equilibrium. An original method of correcting vibration fields by introducing additional ties into the structure of the oscillating system is proposed. The additional ties are implemented as a mass-inertial device that changes the inertial parameters of the working body of the vibration table by moving mass-inertial elements. A concept for monitoring the dynamic state of the vibration table based on original measuring devices is proposed, and possible changes in dynamic properties are estimated. The article is of interest to specialists in the field of vibration technology machines and equipment.

  2. Modeling of a viscoelastic damper and its application in structural control

    PubMed Central

    Ibrahim, Zainah; Ghodsi, S. S.; Khatibi, Hamed

    2017-01-01

    Conventional seismic rehabilitation methods may not be suitable for some buildings owing to their high cost and time-consuming foundation work. In recent years, viscoelastic dampers (VEDs) have been widely used in many mid- and high-rise buildings. This study introduces a viscoelastic passive control system called the rotary rubber braced damper (RRBD). The RRBD is an economical, lightweight, and easy-to-assemble device. A finite element model considering nonlinearity, large deformation, and material damage is developed to conduct a parametric study on different damper sizes under pushover cyclic loading. The fundamental characteristics of this VED system are clarified by analyzing building structures under cyclic loading. The results show excellent energy absorption and stable hysteresis loops in all specimens. Additionally, using a sinusoidal shaking table test, the effectiveness of the RRBD in managing the response displacement and acceleration of steel frames is evaluated. The RRBD functioned at early stages of lateral displacement, indicating that the system is effective for all levels of vibration. Moreover, the proposed damper shows significantly better performance in terms of the column compression force resulting from the brace action compared to chevron bracing (CB). PMID:28570657

  3. Landslides triggered by the 2002 Denali fault, Alaska, earthquake and the inferred nature of the strong shaking

    USGS Publications Warehouse

    Jibson, R.W.; Harp, E.L.; Schulz, W.; Keefer, D.K.

    2004-01-01

    The 2002 M7.9 Denali fault, Alaska, earthquake triggered thousands of landslides, primarily rock falls and rock slides, that ranged in volume from rock falls of a few cubic meters to rock avalanches having volumes as great as 15 × 10⁶ m³. The pattern of landsliding was unusual; the number of slides was less than expected for an earthquake of this magnitude, and the landslides were concentrated in a narrow zone 30-km wide that straddled the fault rupture over its entire 300-km length. The large rock avalanches all clustered along the western third of the rupture zone where acceleration levels and ground-shaking frequencies are thought to have been the highest. Inferences about near-field strong shaking characteristics drawn from the interpretation of the landslide distribution are consistent with results of recent inversion modeling that indicate high-frequency energy generation was greatest in the western part of the fault rupture zone and decreased markedly to the east. © 2004, Earthquake Engineering Research Institute.

  4. Helping Students Interpret Large-Scale Data Tables

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2016-01-01

    New technologies have completely altered the ways that citizens can access data. Indeed, emerging online data sources give citizens access to an enormous amount of numerical information that provides new sorts of evidence used to influence public opinion. In this new environment, two trends have had a significant impact on our increasingly…

  5. 33 CFR 164.33 - Charts and publications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Ocean Service, U.S. Army Corps of Engineers, or a river authority that— (i) Are of a large enough scale...) For the area to be transited, the current edition of, or applicable current extract from: (i) Tide tables published by private entities using data provided by the National Ocean Service. (ii) Tidal...

  6. SCHOOL LUNCH, SUGGESTED GUIDES FOR SELECTING LARGE EQUIPMENT.

    ERIC Educational Resources Information Center

    South Carolina State Dept. of Education, Columbia.

    THE TYPE AND CAPACITY OF A WIDE RANGE OF SCHOOL KITCHEN EQUIPMENT IS RECOMMENDED WITH RESPECT TO THE NUMBER OF MEALS SERVED PER DAY. THESE RECOMMENDATIONS ARE GIVEN FOR RANGES, SINKS, ELECTRIC HEATING, GAS HEATING, REFRIGERATION, TABLES, KITCHEN MACHINES, TRUCK DOLLIES, SCALES, STORAGE CABINETS, OFFICE SPACES, LOUNGES, GARBAGE AND CAN WASHING…

  7. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2014-12-01

    systems can only be realized if the quality of emerging large medical databases can be characterized and the meaning of the data understood. For this...Designed and tested an evaluation procedure for health data visualization system. This visualization framework offers a real time and web-based solution...rule is shown in the table, with the quality measures of each rule including the support, confidence, Laplace, Gain, p-s, lift and Conviction. We

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weimar, Mark R.; Daly, Don S.; Wood, Thomas W.

    Both nuclear power and nuclear weapons programs should have (related) economic signatures which are detectible at some scale. We evaluated this premise in a series of studies using national economic input/output (IO) data. Statistical discrimination models using economic IO tables predict with a high probability whether a country with an unknown predilection for nuclear weapons proliferation is in fact engaged in nuclear power development or nuclear weapons proliferation. We analyzed 93 IO tables, spanning the years 1993 to 2005 for 37 countries that are either members or associates of the Organization for Economic Cooperation and Development (OECD). The 2009 OECD input/output tables featured 48 industrial sectors based on International Standard Industrial Classification (ISIC) Revision 3, and described the respective economies in current country-of-origin valued currency. We converted and transformed these reported values to US 2005 dollars using appropriate exchange rates and implicit price deflators, and addressed discrepancies in reported industrial sectors across tables. We then classified countries with Random Forest using either the adjusted or industry-normalized values. Random Forest, a classification tree technique, separates and categorizes countries using a very small, select subset of the 2304 individual cells in the IO table. A nation’s efforts in nuclear power, be it for electricity or nuclear weapons, are an enterprise with a large economic footprint -- an effort so large that it should discernibly perturb coarse country-level economics data such as that found in yearly input-output economic tables. The neoclassical economic input-output model describes a country’s or region’s economy in terms of the requirements of industries to produce the current level of economic output. An IO table row shows the distribution of an industry’s output to the industrial sectors while a table column shows the input required of each industrial sector by a given industry.
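
    As a rough illustration of the classification step described above (not the authors' code), a Random Forest can be trained on the flattened IO-table cells with scikit-learn; the feature matrix X (one row per country-year table, one column per IO cell) and the labels y below are random placeholders.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # Placeholder data: 93 flattened 48x48 IO tables (2304 cells each),
      # labeled 1 for nuclear-program countries and 0 otherwise.
      rng = np.random.default_rng(0)
      X = rng.random((93, 48 * 48))      # stand-in for normalized IO cells
      y = rng.integers(0, 2, size=93)    # stand-in labels

      clf = RandomForestClassifier(n_estimators=500, random_state=0)
      print(cross_val_score(clf, X, y, cv=5).mean())  # out-of-sample accuracy

      # The trees separate the classes using a small subset of cells;
      # feature importances indicate which IO cells those are.
      clf.fit(X, y)
      top_cells = np.argsort(clf.feature_importances_)[::-1][:10]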

  9. Incorporating water table dynamics in climate modeling: 1. Water table observations and equilibrium water table simulations

    NASA Astrophysics Data System (ADS)

    Fan, Ying; Miguez-Macho, Gonzalo; Weaver, Christopher P.; Walko, Robert; Robock, Alan

    2007-05-01

    Soil moisture is a key participant in land-atmosphere interactions and an important determinant of terrestrial climate. In regions where the water table is shallow, soil moisture is coupled to the water table. This paper is the first of a two-part study to quantify this coupling and explore its implications in the context of climate modeling. We examine the observed water table depth in the lower 48 states of the United States in search of salient spatial and temporal features that are relevant to climate dynamics. As a means to interpolate and synthesize the scattered observations, we use a simple two-dimensional groundwater flow model to construct an equilibrium water table as a result of long-term climatic and geologic forcing. Model simulations suggest that the water table depth exhibits spatial organization at watershed, regional, and continental scales, which may have implications for the spatial organization of soil moisture at similar scales. The observations suggest that water table depth varies at diurnal, event, seasonal, and interannual scales, which may have implications for soil moisture memory at these scales.
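
    The equilibrium water table idea can be illustrated with a minimal two-dimensional steady-state sketch (this is not the model used in the study): uniform recharge R is balanced by lateral flow through a transmissivity T, and iterating the discrete balance to convergence yields an equilibrium head field. The grid size, T and R below are arbitrary illustrative values.

      import numpy as np

      # Solve T * laplacian(h) + R = 0 on a square grid with fixed head
      # (e.g., rivers at sea level) on the boundary.
      n, dx = 50, 1000.0      # 50 x 50 cells, 1 km spacing (illustrative)
      T, R = 1e-2, 1e-8       # transmissivity (m^2/s) and recharge (m/s), illustrative
      h = np.zeros((n, n))    # boundary head fixed at 0 m

      for _ in range(20000):  # Jacobi iteration toward steady state
          h[1:-1, 1:-1] = 0.25 * (h[2:, 1:-1] + h[:-2, 1:-1] +
                                  h[1:-1, 2:] + h[1:-1, :-2] + R * dx**2 / T)
      print(h.max())          # height of the equilibrium water-table mound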

  10. Assessment of Cultivation Factors that Affect Biomass and Geraniol Production in Transgenic Tobacco Cell Suspension Cultures

    PubMed Central

    Vasilev, Nikolay; Schmitz, Christian; Grömping, Ulrike; Fischer, Rainer; Schillberg, Stefan

    2014-01-01

    A large-scale statistical experimental design was used to determine essential cultivation parameters that affect biomass accumulation and geraniol production in transgenic tobacco (Nicotiana tabacum cv. Samsun NN) cell suspension cultures. The carbohydrate source played a major role in determining the geraniol yield and factors such as filling volume, inoculum size and light were less important. Sucrose, filling volume and inoculum size had a positive effect on geraniol yield by boosting growth of plant cell cultures whereas illumination of the cultures stimulated the geraniol biosynthesis. We also found that the carbohydrates sucrose and mannitol showed polarizing effects on biomass and geraniol accumulation. Factors such as shaking frequency, the presence of conditioned medium and solubilizers had minor influence on both plant cell growth and geraniol content. When cells were cultivated under the screened conditions for all the investigated factors, the cultures produced ∼5.2 mg/l geraniol after 12 days of cultivation in shaking flasks which is comparable to the yield obtained in microbial expression systems. Our data suggest that industrial experimental designs based on orthogonal arrays are suitable for the selection of initial cultivation parameters prior to the essential medium optimization steps. Such designs are particularly beneficial in the early optimization steps when many factors must be screened, increasing the statistical power of the experiments without increasing the demand on time and resources. PMID:25117009

  11. Assessment of cultivation factors that affect biomass and geraniol production in transgenic tobacco cell suspension cultures.

    PubMed

    Vasilev, Nikolay; Schmitz, Christian; Grömping, Ulrike; Fischer, Rainer; Schillberg, Stefan

    2014-01-01

    A large-scale statistical experimental design was used to determine essential cultivation parameters that affect biomass accumulation and geraniol production in transgenic tobacco (Nicotiana tabacum cv. Samsun NN) cell suspension cultures. The carbohydrate source played a major role in determining the geraniol yield and factors such as filling volume, inoculum size and light were less important. Sucrose, filling volume and inoculum size had a positive effect on geraniol yield by boosting growth of plant cell cultures whereas illumination of the cultures stimulated the geraniol biosynthesis. We also found that the carbohydrates sucrose and mannitol showed polarizing effects on biomass and geraniol accumulation. Factors such as shaking frequency, the presence of conditioned medium and solubilizers had minor influence on both plant cell growth and geraniol content. When cells were cultivated under the screened conditions for all the investigated factors, the cultures produced ∼ 5.2 mg/l geraniol after 12 days of cultivation in shaking flasks which is comparable to the yield obtained in microbial expression systems. Our data suggest that industrial experimental designs based on orthogonal arrays are suitable for the selection of initial cultivation parameters prior to the essential medium optimization steps. Such designs are particularly beneficial in the early optimization steps when many factors must be screened, increasing the statistical power of the experiments without increasing the demand on time and resources.

  12. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.
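
    The interaction-based, time-dependent calculation in the study is more involved than a simple Poisson model, but the time-independent baseline it departs from is just the usual conversion from an annual rate to a multi-year probability; the rate below is illustrative, not a value from the paper.

      import math

      def poisson_prob(annual_rate, years):
          """Time-independent probability of at least one event in `years` years."""
          return 1.0 - math.exp(-annual_rate * years)

      # Illustrative only: an annual rate of 1/44 per year gives roughly a 49%
      # chance in 30 years and a 20% chance in 10 years under the Poisson model.
      print(poisson_prob(1 / 44, 30), poisson_prob(1 / 44, 10))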

  13. Assessing the Challenges in the Application of Potential Probiotic Lactic Acid Bacteria in the Large-Scale Fermentation of Spanish-Style Table Olives

    PubMed Central

    Rodríguez-Gómez, Francisco; Romero-Gil, Verónica; Arroyo-López, Francisco N.; Roldán-Reyes, Juan C.; Torres-Gallardo, Rosa; Bautista-Gallego, Joaquín; García-García, Pedro; Garrido-Fernández, Antonio

    2017-01-01

    This work studies the inoculation conditions for allowing the survival/predominance of a potential probiotic strain (Lactobacillus pentosus TOMC-LAB2) when used as a starter culture in large-scale fermentations of green Spanish-style olives. The study was performed in two successive seasons (2011/2012 and 2012/2013), using about 150 tons of olives. Inoculation immediately after brining (to prevent wild initial microbiota growth) followed by re-inoculation 24 h later (to improve competitiveness) was essential for inoculum predominance. Processing early in the season (September) showed a favorable effect on fermentation and strain predominance on olives (particularly when using acidified brines containing 25 L HCl/vessel) but caused the disappearance of the target strain from both brines and olives during the storage phase. On the contrary, processing in October slightly reduced the target strain predominance on olives (70–90%) but allowed longer survival. The type of inoculum used (laboratory vs. industry pre-adapted) never had significant effects. Thus, this investigation discloses key issues for the survival and predominance of starter cultures in large-scale industrial fermentations of green Spanish-style olives. Results can be of interest for producing probiotic table olives and open new research challenges on the causes of inoculum vanishing during the storage phase. PMID:28567038

  14. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM, the global ANSS Comprehensive Catalogues, the Seismological Bureau, Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes into the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of seismogenic sources. To evaluate the ground-shaking behaviours in different tectonic regimes, we conducted a series of tests by matching the felt intensities of historical earthquakes to the modelled ground motions using ground motion prediction equations (GMPEs). By incorporating the best-fitting GMPEs and site conditions, we accounted for site effects and assessed the probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially cause an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. In addition, we find a notable hazard level in northern Vietnam and at the border between Myanmar, Thailand and Laos, due to a series of strike-slip faults that could potentially cause moderate to large earthquakes. Note that although much of the region has a low probability of damaging shaking, low-probability events have resulted in much destruction recently in SE Asia (e.g. the 2008 Wenchuan and 2015 Sabah earthquakes).
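
    As a sketch of how a Gutenberg-Richter recurrence model (one of the ingredients named above) feeds a hazard estimate, the annual rate of events at or above magnitude M follows from the a- and b-values; the numbers below are placeholders, not the catalogue-derived parameters of this study.

      import math

      def gr_annual_rate(m, a=4.0, b=1.0):
          """Gutenberg-Richter recurrence: log10 N(>=m) = a - b*m, events per year."""
          return 10.0 ** (a - b * m)

      for m in (5.0, 6.0, 7.0):
          rate = gr_annual_rate(m)               # placeholder a and b values
          poe_50 = 1.0 - math.exp(-rate * 50.0)  # Poisson probability of exceedance in 50 yr
          print(f"M>={m}: {rate:.3f}/yr, 50-yr PoE {poe_50:.2f}")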

  15. The Wireless Data Acquisition System for the Vibration Table

    NASA Astrophysics Data System (ADS)

    Teng, Y. T.; Hu, X.

    2014-12-01

    The vibration table is a large-scale tool used for inspecting the performance of seismometers. The output from a seismometer on the table can be directly monitored when the vibration table moves in a given pattern. Compared with other inspection methods, inspecting seismometers' performance indicators (frequency response, degree of linearity, sensitivity, lateral inhibition, dynamic range, etc.) using vibration tables is more intuitive. Therefore, vibration tables are an essential testing tool in developing new seismometers and in seismometer quality control. In practice, however, a cable is needed to connect the seismometer to the ground equipment for its signal output and power supply, which effectively adds a time-varying nonlinear spring between the vibration table and the ground. The cable adds nonlinear behaviour to the table, distorts the table-board movement, and introduces extra errors into the inspection, affecting testing accuracy and precision. To address this problem, we developed a wireless acquisition system for the vibration table. The system consists of three-channel analog-to-digital conversion, an acquisition control part, local data storage, a network interface, a wireless router, and power management. The analog-to-digital conversion part uses a 24-bit high-precision converter with a programmable amplifier at its analog front end, so that it can match output signals of different amplitudes from the vibration table. The acquisition control part uses a 32-bit ARM processor with low power dissipation, small size, and high performance. The application software platform runs on Linux to make the system convenient for multitasking work. Large-volume local digital storage is provided by a 32 GB SD card, which is used for saving the data acquired in real time. Data transmission is achieved by the network interface and wireless router, which simplify the application software through the supported TCP/IP protocol. In addition, the acquisition system uses a built-in power supply, a high-capacity rechargeable Li-ion battery, so all cable links between the vibration table and the ground equipment are removed. With these changes, the whole packaged system is mounted directly on the board of the vibration table.

  16. The Chesapeake Bay impact structure

    USGS Publications Warehouse

    Powars, David S.; Edwards, Lucy E.; Gohn, Gregory S.; Horton, J. Wright

    2015-10-28

    About 35 million years ago, during late Eocene time, a 2-mile-wide asteroid or comet smashed into Earth in what is now the lower Chesapeake Bay in Virginia. The oceanic impact vaporized, melted, fractured, and (or) displaced the target rocks and sediments and sent billions of tons of water, sediments, and rocks into the air. Glassy particles of solidified melt rock rained down as far away as Texas and the Caribbean. Models suggest that even up to 50 miles away the velocity of the intensely hot air blast was greater than 1,500 miles per hour, and ground shaking was equivalent to an earthquake greater than magnitude 8.0 on the Richter scale. Large tsunamis affected most of the North Atlantic basin. The Chesapeake Bay impact structure is among the 20 largest known impact structures on Earth.

  17. Cross-section perimeter is a suitable parameter to describe the effects of different baffle geometries in shaken microtiter plates

    PubMed Central

    2014-01-01

    Background: Biotechnological screening processes have been performed for more than eight decades in small-scale shaken bioreactors such as shake flasks or microtiter plates. One of the major issues with such reactors is providing sufficient oxygen to suspended microorganisms. Oxygen transfer into the bulk liquid can in general be increased by introducing suitable baffles at the reactor wall. However, a comprehensive and systematic characterization of baffled shaken bioreactors has not been carried out so far. Baffles often differ in number, size and shape. The exact geometry of baffles in glass labware like shake flasks is very difficult to reproduce from piece to piece due to the hard-to-control flow behavior of molten glass during manufacturing. Thus, reproducibility of the maximum oxygen transfer capacity in such baffled shake flasks is poor. Results: As a first step toward systematically elucidating the general effect of different baffle geometries on shaken bioreactor performance, the maximum oxygen transfer capacity (OTRmax) in baffled 48-well microtiter plates, used as shaken model reactors, was characterized. This type of bioreactor, made of plastic material, was chosen because the exact geometry of the baffles can be fabricated by highly reproducible laser cutting. Thirty different geometries were investigated regarding their OTRmax and liquid distribution during shaking. The relative perimeter of the cross-sectional area is introduced as a new fundamental geometric key parameter. An empirical correlation for the OTRmax as a function of the relative perimeter, shaking frequency and filling volume is derived. For the first time, this correlation allows a systematic description of the maximum oxygen transfer capacity in baffled microtiter plates. Conclusions: Calculated and experimentally determined OTRmax values agree within ±30%. Furthermore, undesired out-of-phase operating conditions can be identified by using the relative perimeter as a key parameter. Finally, an optimum well geometry, characterized by a perimeter 10% larger than that of the unbaffled round geometry, is identified. This study may also help to comprehensively describe and optimize the baffles of shake flasks in the future. PMID:25093039

  18. USGS ShakeMap Developments, Implementation, and Derivative Tools

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Lin, K.; Quitoriano, V.; Worden, B.

    2007-12-01

    We discuss ongoing development and enhancements of ShakeMap, a system for automatically generating maps of ground shaking and intensity in the minutes following an earthquake. The rapid availability of these maps is of particular value to emergency response organizations, utilities, insurance companies, government decision-makers, the media, and the general public. ShakeMap Version 3.2 was released in March 2007, on a download site which allows ShakeMap developers to track operators' updates and provide follow-up information; V3.2 has now been downloaded in 15 countries. The V3.2 release supports LINUX in addition to other UNIX operating systems and adds enhancements to XML, KML, metadata, and other products. We have also added an uncertainty measure, quantified as a function of spatial location. Uncertainty is essential for evaluating the range of possible losses. Though not released in V3.2, we will describe a new quantitative uncertainty letter grading for each ShakeMap produced, allowing users to gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of their post-earthquake critical decision-making process. Since the V3.2 release, several new ground-motion prediction equations have also been added to the prediction equation modules. ShakeMap is implemented in several new regions as reported in this Session. Within the U.S., robust systems serve California, Nevada, Utah, Washington and Oregon, Hawaii, and Anchorage. Additional systems are in development and efforts to provide backup capabilities for all Advanced National Seismic System (ANSS) regions at the National Earthquake Information Center are underway. Outside the U.S., this Session has descriptions of ShakeMap systems in Italy, Switzerland, Romania, and Turkey, among other countries. We also describe our predictive global ShakeMap system for the rapid evaluation of significant earthquakes globally for the Prompt Assessment of Global Earthquakes for Response (PAGER) system. These global ShakeMaps are constrained by rapidly gathered intensity data via the Internet and by finite fault and aftershock analyses for portraying fault rupture dimensions. As part of the PAGER loss calibration process we have produced an Atlas of ShakeMaps for significant earthquakes around the globe since 1973 (Allen and others, this Session); these Atlas events have additional constraints provided by archival strong motion, faulting dimensions, and macroseismic intensity data. We also describe derivative tools for further utilizing ShakeMap, including ShakeCast, a fully automated system for delivering specific ShakeMap products to critical users and triggering established post-earthquake response protocols. We have released ShakeCast Version 2.0 (Lin and others, this Session), which allows RSS feeds for automatically receiving ShakeMap files, auto-launching of post-download processing scripts, and delivering notifications based on users' likely facility damage states derived from ShakeMap shaking parameters. As part of our efforts to produce estimated ShakeMaps globally, we have developed a procedure for deriving Vs30 estimates from correlations with topographic slope, and we have now implemented a global Vs30 Server, allowing users to generate Vs30 maps for custom user-selected regions around the globe (Allen and Wald, this Session). Finally, as a further derivative product of the ShakeMap Atlas project, we will present a shaking hazard map for the past 30 years based on approximately 3,900 earthquake ShakeMaps of historic earthquakes.

  19. ShakeCast: Automating and Improving the Use of ShakeMap for Post-Earthquake Decision- Making and Response

    NASA Astrophysics Data System (ADS)

    Lin, K.; Wald, D. J.

    2007-12-01

    ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for emergency managers and responders. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, provides overall information regarding the affected areas. When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. To this end, ShakeCast estimates the potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps showing structures or facilities most likely impacted. All ShakeMap and ShakeCast files and products are non-proprietary to simplify interfacing with existing users' response tools and to encourage user-made enhancements to the software. ShakeCast uses standard RSS and HTTP requests to communicate with the USGS Web servers that host ShakeMaps, which are widely distributed and heavily mirrored. The RSS approach allows ShakeCast users to initiate and receive selected ShakeMap products and information on software updates. To assess facility damage estimates, ShakeCast users can combine measured or estimated ground-motion parameters with pre-computed damage relationships that take one of these ground-motion parameters as input and produce a multi-state discrete output of damage likelihood. Presently three common approaches are being used to provide users with an indication of damage: HAZUS-based, intensity-based, and customized damage functions. Intensity-based thresholds are for locations with poorly established damage relationships; custom damage levels are for advanced ShakeCast users such as Caltrans, which produces its own set of damage functions that correspond to the specific details of each California bridge or overpass in its jurisdiction. For users whose portfolio of structures comprises common, standard designs, ShakeCast offers a simplified structural damage-state estimation capability adapted from the HAZUS-MH earthquake module (NIBS and FEMA, 2003). Currently the simplified fragility settings consist of 128 combinations of HAZUS model building types, construction materials, building heights, and building-code eras.
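
    The HAZUS-style damage-state estimation mentioned above is commonly expressed as a lognormal fragility curve; the sketch below shows that generic form, with median capacities and dispersion chosen arbitrarily rather than taken from the HAZUS-MH tables.

      from math import log
      from statistics import NormalDist

      def p_exceed(pga, median, beta):
          """Lognormal fragility: P(damage >= state | PGA), generic form."""
          return NormalDist().cdf(log(pga / median) / beta)

      # Arbitrary illustrative medians (in g) and dispersion for four damage states.
      medians = {"slight": 0.15, "moderate": 0.30, "extensive": 0.60, "complete": 1.00}
      beta = 0.6
      pga = 0.35   # shaking at a facility, in g (e.g., read from a ShakeMap grid)
      for state, med in medians.items():
          print(state, round(p_exceed(pga, med, beta), 2))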

  20. A bridge column with superelastic NiTi SMA and replaceable rubber hinge for earthquake damage mitigation

    NASA Astrophysics Data System (ADS)

    Varela, Sebastian; ‘Saiid' Saiidi, M.

    2016-07-01

    This paper reports a unique concept for resilient bridge columns that can undergo intense earthquake loading and remain functional with minimal damage and residual drift. In this concept, the column is designed so that its components can be easily disassembled and reassembled to facilitate material recycling and component reuse. This is meant to foster sustainability of bridge systems while minimizing monetary losses from earthquakes. Self-centering and energy dissipation in the column were provided by unbonded superelastic nickel-titanium (NiTi) shape memory alloy bars placed inside a plastic hinge element made of rubber. This replaceable plastic hinge was in turn attached to a concrete-filled carbon fiber-reinforced polymer tube and a precast concrete footing that were designed to behave elastically. The proposed concept was evaluated experimentally by testing a ¼-scale column model under simulated near-fault earthquake motions on a shake table. After testing, the model was disassembled, reassembled and tested again. The seismic performance of the reassembled model was found to be comparable to that of the ‘virgin’ model. A relatively simple computational model of the column tested that was developed in OpenSees was able to match some of the key experimental response parameters.

  1. Evaluating the relationship between topography and groundwater using outputs from a continental-scale integrated hydrology model

    NASA Astrophysics Data System (ADS)

    Condon, Laura E.; Maxwell, Reed M.

    2015-08-01

    We study the influence of topography on groundwater fluxes and water table depths across the contiguous United States (CONUS). Groundwater tables are often conceptualized as subdued replicas of topography. While it is well known that groundwater configuration is also controlled by geology and climate, nonlinear interactions between these drivers within large real-world systems are not well understood and are difficult to characterize given sparse groundwater observations. We address this limitation using the fully integrated physical hydrology model ParFlow to directly simulate groundwater fluxes and water table depths within a complex heterogeneous domain that incorporates all three primary groundwater drivers. Analysis is based on a first of its kind, continental-scale, high-resolution (1 km), groundwater-surface water simulation spanning more than 6.3 million km2. Results show that groundwater fluxes are most strongly driven by topographic gradients (as opposed to gradients in pressure head) in humid regions with small topographic gradients or low conductivity. These regions are generally consistent with the topographically controlled groundwater regions identified in previous studies. However, we also show that areas where topographic slopes drive groundwater flux do not generally have strong correlations between water table depth and elevation. Nonlinear relationships between topography and water table depth are consistent with groundwater flow systems that are dominated by local convergence and could also be influenced by local variability in geology and climate. One of the strengths of the numerical modeling approach is its ability to evaluate continental-scale groundwater behavior at a high resolution not possible with other techniques.

  2. ShakeCast: Automating and improving the use of ShakeMap for post-earthquake decision-making and response

    USGS Publications Warehouse

    Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren

    2008-01-01

    When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008, Earthquake Engineering Research Institute.

  3. A revised ground-motion and intensity interpolation scheme for shakemap

    USGS Publications Warehouse

    Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.

    2010-01-01

    We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
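
    The uncertainty-weighted combination described above can be illustrated at a single grid point by an inverse-variance average of the different estimate types; this is a generic sketch of the principle, not the ShakeMap code, and the values and uncertainties are made up.

      import numpy as np

      # One grid point: a direct observation, an intensity-converted estimate,
      # and a GMPE estimate, each with its own (made-up) standard deviation.
      values = np.array([0.22, 0.30, 0.18])   # candidate PGA estimates, in g
      sigmas = np.array([0.10, 0.35, 0.55])   # larger sigma = less trusted

      weights = 1.0 / sigmas**2
      combined = np.sum(weights * values) / np.sum(weights)
      combined_sigma = np.sqrt(1.0 / np.sum(weights))   # by-product uncertainty
      print(combined, combined_sigma)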

  4. Effect of culture medium, host strain and oxygen transfer on recombinant Fab antibody fragment yield and leakage to medium in shaken E. coli cultures.

    PubMed

    Ukkonen, Kaisa; Veijola, Johanna; Vasala, Antti; Neubauer, Peter

    2013-07-29

    Fab antibody fragments in E. coli are usually directed to the oxidizing periplasmic space for correct folding. From periplasm Fab fragments may further leak into extracellular medium. Information on the cultivation parameters affecting this leakage is scarce, and the unpredictable nature of Fab leakage is problematic regarding consistent product recovery. To elucidate the effects of cultivation conditions, we investigated Fab expression and accumulation into either periplasm or medium in E. coli K-12 and E. coli BL21 when grown in different types of media and under different aeration conditions. Small-scale Fab expression demonstrated significant differences in yield and ratio of periplasmic to extracellular Fab between different culture media and host strains. Expression in a medium with fed-batch-like glucose feeding provided highest total and extracellular yields in both strains. Unexpectedly, cultivation in baffled shake flasks at 150 rpm shaking speed resulted in higher yield and accumulation of Fabs into culture medium as compared to cultivation at 250 rpm. In the fed-batch medium, extracellular fraction in E. coli K-12 increased from 2-17% of total Fab at 250 rpm up to 75% at 150 rpm. This was partly due to increased lysis, but also leakage from intact cells increased at the lower shaking speed. Total Fab yield in E. coli BL21 in glycerol-based autoinduction medium was 5 to 9-fold higher at the lower shaking speed, and the extracellular fraction increased from ≤ 10% to 20-90%. The effect of aeration on Fab localization was reproduced in multiwell plate by variation of culture volume. Yield and leakage of Fab fragments are dependent on expression strain, culture medium, aeration rate, and the combination of these parameters. Maximum productivity in fed-batch-like conditions and in autoinduction medium is achieved under sufficiently oxygen-limited conditions, and lower aeration also promotes increased Fab accumulation into extracellular medium. These findings have practical implications for screening applications and small-scale Fab production, and highlight the importance of maintaining consistent aeration conditions during scale-up to avoid changes in product yield and localization. On the other hand, the dependency of Fab leakage on cultivation conditions provides a practical way to manipulate Fab localization.

  5. Effect of culture medium, host strain and oxygen transfer on recombinant Fab antibody fragment yield and leakage to medium in shaken E. coli cultures

    PubMed Central

    2013-01-01

    Background Fab antibody fragments in E. coli are usually directed to the oxidizing periplasmic space for correct folding. From periplasm Fab fragments may further leak into extracellular medium. Information on the cultivation parameters affecting this leakage is scarce, and the unpredictable nature of Fab leakage is problematic regarding consistent product recovery. To elucidate the effects of cultivation conditions, we investigated Fab expression and accumulation into either periplasm or medium in E. coli K-12 and E. coli BL21 when grown in different types of media and under different aeration conditions. Results Small-scale Fab expression demonstrated significant differences in yield and ratio of periplasmic to extracellular Fab between different culture media and host strains. Expression in a medium with fed-batch-like glucose feeding provided highest total and extracellular yields in both strains. Unexpectedly, cultivation in baffled shake flasks at 150 rpm shaking speed resulted in higher yield and accumulation of Fabs into culture medium as compared to cultivation at 250 rpm. In the fed-batch medium, extracellular fraction in E. coli K-12 increased from 2-17% of total Fab at 250 rpm up to 75% at 150 rpm. This was partly due to increased lysis, but also leakage from intact cells increased at the lower shaking speed. Total Fab yield in E. coli BL21 in glycerol-based autoinduction medium was 5 to 9-fold higher at the lower shaking speed, and the extracellular fraction increased from ≤ 10% to 20-90%. The effect of aeration on Fab localization was reproduced in multiwell plate by variation of culture volume. Conclusions Yield and leakage of Fab fragments are dependent on expression strain, culture medium, aeration rate, and the combination of these parameters. Maximum productivity in fed-batch-like conditions and in autoinduction medium is achieved under sufficiently oxygen-limited conditions, and lower aeration also promotes increased Fab accumulation into extracellular medium. These findings have practical implications for screening applications and small-scale Fab production, and highlight the importance of maintaining consistent aeration conditions during scale-up to avoid changes in product yield and localization. On the other hand, the dependency of Fab leakage on cultivation conditions provides a practical way to manipulate Fab localization. PMID:23895637

  6. Slip pulse and resonance of Kathmandu basin during the 2015 Mw 7.8 Gorkha earthquake, Nepal imaged with space geodesy

    USGS Publications Warehouse

    Galetzka, John; Melgar, D.; Genrich, J.F.; Geng, J.; Owen, S.; Lindsey, E. O.; Xu, X.; Bock, Y.; Avouac, J.-P.; Adhikari, L. B.; Upreti, B. N.; Pratt-Sitaula, B.; Bhattarai, T. N.; Sitaula, B. P.; Moore, A.; Hudnut, Kenneth W.; Szeliga, W.; Normandeau, J.; Fend, M.; Flouzat, M; Bollinger, L.; Shrestha, P.; Koirala, B.; Gautam, U.; Bhatterai, M.; Gupta, R.; Kandel, T.; Timsina, C.; Sapkota, S.N.; Rajaure, S.; Maharjan, N.

    2015-01-01

    Detailed geodetic imaging of earthquake rupture enhances our understanding of earthquake physics and induced ground shaking. The April 25, 2015 Mw 7.8 Gorkha, Nepal earthquake is the first example of a large continental megathrust rupture beneath a high-rate (5 Hz) GPS network. We use GPS and InSAR data to model the earthquake rupture as a slip pulse of ~20 km width, ~6 s duration, and with peak sliding velocity of 1.1 m/s that propagated toward Kathmandu basin at ~3.3 km/s over ~140 km. The smooth slip onset, indicating a large ~5 m slip-weakening distance, caused moderate ground shaking at high >1Hz frequencies (~16% g) and limited damage to regular dwellings. Whole basin resonance at 4-5 s period caused collapse of tall structures, including cultural artifacts.

  7. The Early Warning System(EWS) as First Stage to Generate and Develop Shake Map for Bucharest to Deep Vrancea Earthquakes

    NASA Astrophysics Data System (ADS)

    Marmureanu, G.; Ionescu, C.; Marmureanu, A.; Grecu, B.; Cioflan, C.

    2007-12-01

    EWS, developed by NIEP, is the first European system for real-time early detection and warning of seismic waves in the case of strong deep earthquakes. EWS uses the time interval (28-32 seconds) between the moment when an earthquake is detected by the local network of borehole and surface accelerometers installed in the epicentral area (Vrancea) and the arrival time of the seismic waves in the protected area to deliver timely integrated information, enabling actions to be taken before the main destructive shaking takes place. The early warning system is viewed as part of a real-time information system that provides rapid information about an impending earthquake hazard to the public and disaster relief organizations before (early warning) and after a strong earthquake (shake map). This product complements another new product under development at the National Institute for Earth Physics, namely the shake map, which is a representation of the ground shaking produced by an event and will be generated automatically following large Vrancea earthquakes. Bucharest City is located in the central part of the Moesian platform (age: Precambrian and Paleozoic) in the Romanian Plain, about 140 km from the Vrancea area. Above Cretaceous and Miocene deposits (with the bottom at roughly 1,400 m depth), a Pliocene shallow-water deposit (~700 m thick) was laid down. The surface geology consists mainly of Quaternary alluvial deposits. Later, loess covered these deposits, and the two rivers crossing the city (Dambovita and Colentina) carved the present landscape. During the last century Bucharest suffered heavy damage and casualties due to the 1940 (Mw = 7.7) and 1977 (Mw = 7.4) Vrancea earthquakes. For example, 32 tall buildings collapsed and more than 1,500 people died during the 1977 event. The innovation compared with related systems worldwide is that NIEP will use the EWS to generate a virtual shake map for Bucharest (140 km from the epicentre) immediately after the magnitude is estimated (3-4 seconds after detection in the epicentral area) and later make corrections using the real-time data flow from each K2 accelerometer installed in the Bucharest area, including nonlinear effects. Thus, the development of a near real-time shake map for the Bucharest urban area is of the highest interest, providing valuable information to civil defense, decision makers and the general public on the areas where the ground motion is most severe. The EWS built by NIEP can be considered the first stage in generating and developing the shake map for Bucharest for deep Vrancea earthquakes.
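
    The 28-32 second window quoted above is essentially the difference between the S-wave travel time from the deep Vrancea source to Bucharest and the time needed to detect the event at the epicentral stations; the sketch below reproduces that order of magnitude with rough assumed distances and velocities, not NIEP's actual system parameters.

      import math

      # Rough assumed geometry and velocities (illustrative only).
      epi_dist_km = 140.0    # Vrancea epicentre to Bucharest
      depth_km = 110.0       # typical deep Vrancea hypocentre
      vp, vs = 8.0, 4.0      # crude average P- and S-wave speeds, km/s

      hypo_dist = math.hypot(epi_dist_km, depth_km)   # ~178 km to the city
      t_detect = depth_km / vp                        # P wave reaches epicentral stations
      t_arrive = hypo_dist / vs                       # S wave reaches Bucharest
      print(round(t_arrive - t_detect, 1), "s of warning before strong shaking")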

  8. MOIL-opt: Energy-Conserving Molecular Dynamics on a GPU/CPU system

    PubMed Central

    Ruymgaart, A. Peter; Cardenas, Alfredo E.; Elber, Ron

    2011-01-01

    We report an optimized version of the molecular dynamics program MOIL that runs on a shared-memory system with OpenMP and exploits the power of a Graphics Processing Unit (GPU). The model is a heterogeneous computing system on a single node with several cores sharing the same memory and a GPU. This is a typical laboratory tool, which provides excellent performance at minimal cost. Besides performance, emphasis is placed on the accuracy and stability of the algorithm, probed by energy conservation for explicit-solvent, atomically detailed models. Energy conservation is especially critical for long simulations due to the phenomenon known as “energy drift”, in which energy errors accumulate linearly as a function of simulation time. To achieve long-time dynamics with acceptable accuracy, the drift must be particularly small. We identify several means of controlling long-time numerical accuracy while maintaining excellent speedup. To maintain a high level of energy conservation, SHAKE and the Ewald reciprocal summation are run in double precision. Double-precision summation of real-space non-bonded interactions improves energy conservation. In our best option, the energy drift, using a 1 fs time step while constraining the distances of all bonds, is undetectable in a 10 ns simulation of solvated DHFR (dihydrofolate reductase). Faster options, shaking only bonds with hydrogen atoms, are also very well behaved and have drifts of less than 1 kcal/mol per nanosecond for the same system. CPU/GPU implementations require changes in programming models. We consider the use of a list of neighbors and quadratic versus linear interpolation in lookup tables of different sizes. Quadratic interpolation with a smaller number of grid points is faster than linear lookup tables (with finer representation) without loss of accuracy. Atomic neighbor lists were found most efficient. Typical speedups are about a factor of 10 compared to a single-core single-precision code. PMID:22328867
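
    The lookup-table comparison mentioned above (quadratic interpolation on a coarse grid versus linear interpolation on a finer one) can be sketched as follows; the tabulated function and grid size are arbitrary stand-ins for MOIL's non-bonded interaction tables.

      import numpy as np

      # Tabulate f(r) = 1/r (a stand-in for a non-bonded term) on a coarse grid.
      r_min, r_max, n = 1.0, 10.0, 64
      grid = np.linspace(r_min, r_max, n)
      table = 1.0 / grid
      dr = grid[1] - grid[0]

      def lookup_quadratic(r):
          """Three-point Lagrange (quadratic) interpolation on the coarse table."""
          i = min(max(int((r - r_min) / dr), 1), n - 2)
          x0, x1, x2 = grid[i - 1], grid[i], grid[i + 1]
          y0, y1, y2 = table[i - 1], table[i], table[i + 1]
          return (y0 * (r - x1) * (r - x2) / ((x0 - x1) * (x0 - x2))
                  + y1 * (r - x0) * (r - x2) / ((x1 - x0) * (x1 - x2))
                  + y2 * (r - x0) * (r - x1) / ((x2 - x0) * (x2 - x1)))

      r = 3.37
      print(lookup_quadratic(r), 1.0 / r)   # interpolated value vs exact value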

  9. MyShake - Smartphone seismic network powered by citizen scientists

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Strauss, J. A.

    2017-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It is driven by the citizen scientists who run MyShake on their personal smartphones. It has two components: an Android application running on the smartphones to detect earthquake-like motion, and a network detection algorithm that aggregates results from multiple smartphones to confirm when an earthquake occurs. The MyShake application was released to the public on Feb 12th, 2016. Within the first year, more than 250,000 people downloaded the MyShake app around the world. More than 500 earthquakes were recorded by the smartphones in this period, including events in Chile, Argentina, Mexico, Morocco, Greece, Nepal, New Zealand, Taiwan, Japan, and across North America. Currently, we are working on earthquake early warning with the MyShake network, and the shaking data provided by MyShake form a unique dataset that can be used by the research community.

  10. Stability with large step sizes for multistep discretizations of stiff ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Majda, George

    1986-01-01

    One-leg and multistep discretizations of variable-coefficient linear systems of ODEs having both slow and fast time scales are investigated analytically. The stability properties of these discretizations are obtained independent of ODE stiffness and compared. The results of numerical computations are presented in tables, and it is shown that for large step sizes the stability of one-leg methods is better than that of the corresponding linear multistep methods.

  11. Seismic shaking scenarios in realistic 3D crustal model of Northern Italy

    NASA Astrophysics Data System (ADS)

    Molinari, I.; Morelli, A.; Basini, P.; Berbellini, A.

    2013-12-01

    Simulation of seismic wave propagation in realistic crustal structures is a fundamental tool to evaluate earthquake-generated ground shaking and assess seismic hazard. Current-generation numerical codes and modern HPC infrastructures allow for realistic simulations in complex 3D geologic structures. We apply this methodology to the Po Plain in Northern Italy -- a region with relatively rare earthquakes but large property and industrial exposure, as became clear during the two M~6 events of May 20 and 29, 2012. Historical seismicity is well known in this region, with maximum magnitude estimates reaching M~7, and wave field amplitudes may be significantly amplified by the presence of the very thick sedimentary basin. Our goal is to produce estimates of expected ground shaking in Northern Italy through detailed deterministic simulations of ground motion due to expected earthquakes. We defined a three-dimensional model of the earth's crust using geo-statistical tools to merge the abundant information existing in the form of borehole data and seismic reflection profiles that had been shot in the '70s and the '80s for hydrocarbon exploration. Such information, which has been used by geologists to infer the deep structural setup, had never been merged to build a 3D model to be used for seismological simulations. We implement the model in SPECFEM3D_Cartesian with a hexahedral mesh with ~2 km elements, which allows us to simulate waves with a minimum period of ~2 seconds. The model was then optimized through comparison between simulated and recorded seismograms for the ~20 moderate-magnitude events (Mw > 4.5) that have been instrumentally recorded in the last 15 years. Realistic simulations in the frequency band of most common engineering relevance -- say, ~1 Hz -- at such a large scale would require an extremely detailed structural model, currently not available, and prohibitive computational resources. However, interest is growing in longer-period ground motion -- which impacts the seismic response of taller structures (Cauzzi and Faccioli, 2008) -- and it is not unusual to consider the wave field up to 20 s. In this period range, our Po Plain structural model has been shown to reproduce well the basin resonance and amplification effects at stations bordering the sedimentary plain. We then simulate seismic shaking scenarios for possible sources tied to devastating historical earthquakes that are known to have occurred in the region, such as the M~6 event that hit Modena in 1501 and the M~6.7 Verona earthquake of 1117, which caused well-documented strong effects over an unusually wide area with a radius of hundreds of kilometers. We explore different source geometries and rupture histories for each earthquake. We mainly focus our attention on the synthesis of the prominent surface waves that are highly amplified in deep sedimentary basin structures (e.g., Smerzini et al., 2011; Koketsu and Miyage, 2008). Such simulations hold high relevance because of the large local property exposure, due to extensive industrial and touristic infrastructure. We show that deterministic ground motion calculations can indeed provide information to be actively used to mitigate the effects of destructive earthquakes on critical infrastructures.

  12. Use of liquefaction-induced features for paleoseismic analysis - An overview of how seismic liquefaction features can be distinguished from other features and how their regional distribution and properties of source sediment can be used to infer the location and strength of Holocene paleo-earthquakes

    USGS Publications Warehouse

    Obermeier, S.F.

    1996-01-01

    Liquefaction features can be used in many field settings to estimate the recurrence interval and magnitude of strong earthquakes through much of the Holocene. These features include dikes, craters, vented sand, sills, and laterally spreading landslides. The relatively high seismic shaking level required for their formation makes them particularly valuable as records of strong paleo-earthquakes. This state-of-the-art summary for using liquefaction-induced features for paleoseismic interpretation and analysis takes into account both geological and geotechnical engineering perspectives. The driving mechanism for formation of the features is primarily the increased pore-water pressure associated with liquefaction of sand-rich sediment. The role of this mechanism is often supplemented greatly by the direct action of seismic shaking at the ground surface, which strains and breaks the clay-rich cap that lies immediately above the sediment that liquefied. Discussed in the text are the processes involved in formation of the features, as well as their morphology and characteristics in field settings. Whether liquefaction occurs is controlled mainly by sediment grain size, sediment packing, depth to the water table, and strength and duration of seismic shaking. Formation of recognizable features in the field generally requires a low-permeability cap above the sediment that liquefied. Field manifestations are controlled largely by the severity of liquefaction and the thickness and properties of the low-permeability cap. Criteria are presented for determining whether observed sediment deformation in the field originated by seismically induced liquefaction. These criteria have been developed mainly by observing historic effects of liquefaction in varied field settings. The most important criterion is that a seismic liquefaction origin requires widespread, regional development of features around a core area where the effects are most severe. In addition, the features must have a morphology that is consistent with a very sudden application of a large hydraulic force. This article discusses case studies in widely separated and different geological settings: coastal South Carolina, the New Madrid seismic zone, the Wabash Valley seismic zone, and coastal Washington State. These studies encompass most of the range of settings and the types of liquefaction-induced features likely to be encountered anywhere. The case studies describe the observed features and the logic for assigning a seismic liquefaction origin to them. Also discussed are some types of sediment deformations that can be misinterpreted as having a seismic origin. Two independent methods for estimating prehistoric magnitude are discussed briefly. One method is based on determination of the maximum distance from the epicenter over which liquefaction-induced effects have formed. The other method is based on use of geotechnical engineering techniques at sites of marginal liquefaction, in order to bracket the peak accelerations as a function of epicentral distance; these accelerations can then be compared with predictions from seismological models.

  13. Demonstration of Regenerable, Large-Scale Ion Exchange System Using WBA Resin in Rialto, CA (Drinking Water Treatment - Pilot Scale)

    DTIC Science & Technology

    2008-08-01

    Administration NDBA N-nitrosodi-n-butylamine NDEA N-nitrosodiethylamine NDMA N-nitrosodimethylamine NDPA N-nitrosodi-n-propylamine v ACRONYMS...spectrometry (IC-MS/MS). Nitrosamines were analyzed using EPA Method 521. N-nitrosodimethylamine (NDMA) was 2.6 parts per trillion (ppt) with a detection...and metals (Ca, Cu, Fe, Mg, Mn, K, Na, and Zn). Specific methods are listed in Table 5. ** N-nitrosodimethylamine (NDMA), N-nitrosodiethylamine

  14. Incorporating Learning Theory into Existing Systems Engineering Models

    DTIC Science & Technology

    2013-09-01

    3. Social Cognition 22 Table 1. Classification of learning theories Behaviorism Cognitivism Constructivism Connectivism...Introduction to design of large scale systems. New York: McGraw-Hill. Grusec, J. (1992). Social learning theory and developmental psychology: The... LEARNING THEORY INTO EXISTING SYSTEMS ENGINEERING MODELS by Valentine Leo September 2013 Thesis Advisor: Gary O. Langford Co-Advisor

  15. Characteristics of a Sensitive Well Showing Pre-Earthquake Water-Level Changes

    NASA Astrophysics Data System (ADS)

    King, Chi-Yu

    2018-04-01

    Water-level data recorded at a sensitive well next to a fault in central Japan between 1989 and 1998 showed many coseismic water-level drops and a large (60 cm) and long (6-month) pre-earthquake drop before a rare local earthquake of magnitude 5.8 on 17 March 1997, as well as 5 smaller pre-earthquake drops during a 7-year period prior to this earthquake. The pre-earthquake changes were previously attributed to leakage through the fault-gouge zone caused by small but broad-scale crustal-stress increments. These increments now seem to be induced by some large slow-slip events. The coseismic changes are attributed to seismic shaking-induced fissures in the adjacent aquitards, in addition to leakage through the fault. The well's high sensitivity is attributed to its tapping a highly permeable aquifer, which is connected to the fractured side of the fault, and to its near-critical condition for leakage, especially during the 7 years before the magnitude 5.8 earthquake.

  16. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify the population and infrastructure within the conterminous U.S. that are exposed to varying levels of earthquake ground motion, by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008 and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g shaking at relatively frequent intervals (annual rate of 1 in 72 years, or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years, or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years, or 2% PE in 50 years). We also show that a significant number of critical infrastructure facilities are located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with a moderately frequent recurrence interval).
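
    The three hazard levels quoted above are linked by the standard Poisson relation between probability of exceedance in an exposure window and return period; the short Python sketch below reproduces the ~72-, ~475-, and ~2,475-year values from the 50%, 10%, and 2% probabilities of exceedance in 50 years.

    import math

    def return_period_from_pe(pe, window_years=50.0):
        """Return period T such that P(at least one exceedance in window_years) = pe, assuming a Poisson process."""
        return -window_years / math.log(1.0 - pe)

    for pe in (0.50, 0.10, 0.02):
        print(f"{pe:.0%} PE in 50 yr -> ~{return_period_from_pe(pe):.0f}-yr return period")
    # ~72, ~475 and ~2475 years, matching the hazard levels quoted above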

  17. Lake sediment records as earthquake catalogues: A compilation from Swiss lakes - Limitations and possibilities

    NASA Astrophysics Data System (ADS)

    Kremer, Katrina; Reusch, Anna; Wirth, Stefanie B.; Anselmetti, Flavio S.; Girardclos, Stéphanie; Strasser, Michael

    2016-04-01

    Intraplate settings are characterized by low deformation rates and recurrence intervals of strong earthquakes that often exceed the time span covered by instrumental records. Switzerland, as an example of such settings, shows low instrumentally recorded seismicity, in contrast to the strong earthquakes (e.g. the 1356 Basel earthquake, Mw = 6.6, and the 1601 Unterwalden earthquake, Mw = 5.9) mentioned in the historical archives. As such long recurrence intervals do not allow instrumental identification of the sources of these strong events, and as intense geomorphologic alteration prevents preservation of surface expressions of faults, knowledge of active faults is very limited. Lake sediments are sensitive to seismic shaking and thus can be used to extend the regional earthquake catalogue if the sedimentary deposits or deformation structures can be linked to an earthquake. Single lake records allow local shaking intensities to be estimated, while multiple lake records can furthermore be used to compare the temporal and spatial distribution of earthquakes. In this study, we compile a large dataset of dated sedimentary event deposits recorded in Swiss lakes, available from peer-reviewed publications and unpublished master's theses. We combine these data in order to detect large prehistoric regional earthquakes or periods of intense shaking that might have affected multiple lake settings. In a second step, using empirical seismic attenuation equations, we test whether lake records can be used to reconstruct the magnitudes and epicentres of the identified earthquakes.
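
    As a hedged illustration of the second step, the sketch below shows how a grid search over candidate epicentres and magnitudes could invert lake-derived shaking intensities, assuming a generic intensity attenuation law of the form I = a + b*M - c*log10(R); the functional form and coefficients are placeholders, not the calibrated Swiss relations used by the authors.

    import itertools, math

    def predicted_intensity(M, R_km, a=1.5, b=1.5, c=3.0):
        """Hypothetical intensity attenuation law (placeholder coefficients)."""
        return a + b * M - c * math.log10(max(R_km, 1.0))

    def grid_search(observations, epicentre_grid, magnitudes):
        """observations: list of (x_km, y_km, observed_intensity) for individual lakes."""
        best = None
        for (ex, ey), M in itertools.product(epicentre_grid, magnitudes):
            misfit = sum((I_obs - predicted_intensity(M, math.hypot(x - ex, y - ey))) ** 2
                         for x, y, I_obs in observations)
            if best is None or misfit < best[0]:
                best = (misfit, (ex, ey), M)
        return best  # (misfit, best epicentre, best magnitude)

    obs = [(0.0, 10.0, 7.0), (20.0, 35.0, 6.0), (-15.0, 5.0, 6.5)]   # made-up lake intensities
    grid = [(x, y) for x in range(-50, 51, 10) for y in range(-50, 51, 10)]
    print(grid_search(obs, grid, magnitudes=[5.5, 6.0, 6.5, 7.0]))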

  18. ShakeCast Manual

    USGS Publications Warehouse

    Lin, Kuo-Wan; Wald, David J.

    2008-01-01

    ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, and generates potential damage assessment notifications, facility damage maps, and other Web-based products for emergency managers and responders.

  19. Fan Blade Shake Test Results for the 40- by 80-/80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Warmbrodt, W.; Graham, T.

    1983-01-01

    This report documents the shake tests performed on the first set of hydulignum fan blades for the 40- by 80-/80- by 120-Foot Wind Tunnel. The purpose of the shake test program is described. The test equipment and test procedures are reviewed. Results from each shake test are presented and the overall findings of the shake test program are discussed.

  20. Electromagnetically levitated vibration isolation system for the manufacturing process of silicon monocrystals

    NASA Technical Reports Server (NTRS)

    Kanemitsu, Yoichi; Watanabe, Katsuhide; Yano, Kenichi; Mizuno, Takayuki

    1994-01-01

    This paper introduces a study on an Electromagnetically Levitated Vibration Isolation System (ELVIS) for isolation control of large-scale vibration. This system features no mechanical contact between the isolation table and the installation floor, using a total of four electromagnetic actuators which generate magnetic levitation force in the vertical and horizontal directions. The configuration of the magnet for the vertical direction is designed to prevent any generation of restoring vibratory force in the horizontal direction. The isolation system is designed so that, for small earthquakes, table vibration can be regulated to below 5 gal against horizontal vibration levels of the installation floor of up to 25 gal and horizontal relative displacements of up to 30 mm between the floor and the levitated isolation table. In particular, studies on the relative displacement between the installation floor and the levitated isolation table have been made for vibration control in the horizontal direction. In the case of small-scale earthquakes (scaled Taft wave: max. 25 gal), the present system has been confirmed to achieve vibration isolation to a level below 5 gal. A vibration transmission ratio below 1/10 has been achieved for continuous micro-vibration (approximately 1 gal) in the horizontal direction on the installation floor.

  1. An analytical study on nested flow systems in a Tóthian basin with a periodically changing water table

    NASA Astrophysics Data System (ADS)

    Zhao, Ke-Yu; Jiang, Xiao-Wei; Wang, Xu-Sheng; Wan, Li; Wang, Jun-Zhi; Wang, Heng; Li, Hailong

    2018-01-01

    Classical understanding of basin-scale groundwater flow patterns is based on Tóth's findings of a single flow system in a unit basin (Tóth, 1962) and nested flow systems in a complex basin (Tóth, 1963), both of which were based on steady-state models. Vandenberg (1980) extended Tóth (1962) by deriving a transient solution under a periodically changing water table in a unit basin and examined the flow-field distortion under different dimensionless response times, τ∗. Following Vandenberg's (1980) approach, we extended Tóth (1963) by deriving the transient solution under a periodically changing water table in a complex basin and examined the transient behavior of nested flow systems. Due to the effect of specific storage, the flow field is asymmetric with respect to the midline, and the trajectory of internal stagnation points constitutes a non-enclosed loop whose width decreases as τ∗ decreases. The distribution of the relative magnitude of hydraulic head fluctuation, Δh∗, depends on the horizontal distance from a divide and the depth below the land surface. In the shallow part, Δh∗ decreases from 1 at the divide to 0 at its neighboring valley under all τ∗, while in the deep part Δh∗ reaches a threshold whose value decreases as τ∗ increases. The zones with flowing wells are also found to change periodically. As the water table falls, there is a general trend of shrinkage in the area of the zones with flowing wells, which lags the declining water table under a large τ∗. Although fluxes have not been assigned in our model, the recharge/discharge flux across the top boundary can be obtained. This study is critical for understanding a series of periodically changing hydrogeological phenomena in large-scale basins.

  2. Design of an efficient medium for heterologous protein production in Yarrowia lipolytica: case of human interferon alpha 2b.

    PubMed

    Gasmi, Najla; Ayed, Atef; Nicaud, Jean-Marc; Kallel, Héla

    2011-05-20

    The non-conventional yeast Yarrowia lipolytica has attracted strong industrial interest for heterologous protein production. However, most studies describing recombinant protein production by this yeast rely on complex media; such media are not convenient for large-scale production, particularly for products intended for pharmaceutical applications. In addition, medium composition can also affect the production yield. Hence it is necessary to design an efficient medium for therapeutic protein expression by this host. Five different media, including four minimal media and a complex medium, were assessed in shake flasks for the production of human interferon alpha 2b (hIFN α2b) by Y. lipolytica under the control of the POX2 promoter, inducible with oleic acid. The chemically defined medium SM4, formulated by Invitrogen for Pichia pastoris growth, was the most suitable. Using statistical experimental design, this medium was further optimized. The selected minimal medium, consisting of SM4 supplemented with 10 mg/l FeCl₃, 1 g/l glutamate, 5 ml/l PTM1 (Pichia Trace Metals) solution and a vitamin solution composed of myo-inositol, thiamin and biotin, was called GNY medium. Compared to shake flask culture, bioreactor culture in GNY medium resulted in a 416-fold increase of hIFN α2b production and a 2-fold increase of the biological activity. Furthermore, SM4 enrichment with 5 ml/l PTM1 solution helped protect hIFN α2b against degradation by the 28 kDa protease identified by zymography gel in the culture supernatant. The screening of the inhibitory effect of the trace elements present in the PTM1 solution on the activity of this protease was achieved using a Box-Behnken design. Statistical data analysis showed that FeCl₃ and MnSO₄ had the strongest inhibitory effect. We have designed an efficient medium for large-scale production of heterologous proteins by Y. lipolytica. The optimized GNY medium is suitable for the production of hIFN α2b, with the advantage that no complex nitrogen sources of undefined composition are required.
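
    For readers unfamiliar with the screening design mentioned above, the sketch below constructs a coded Box-Behnken design in plain Python; the three factor labels are hypothetical stand-ins for trace elements of the PTM1 solution, not the exact factors or levels used in the study.

    from itertools import combinations, product

    def box_behnken(n_factors, n_center=3):
        """Coded (-1, 0, +1) Box-Behnken design: each pair of factors takes the four
        +/-1 combinations while the remaining factors stay at their centre level."""
        runs = []
        for i, j in combinations(range(n_factors), 2):
            for a, b in product((-1, 1), repeat=2):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
        runs.extend([[0] * n_factors for _ in range(n_center)])
        return runs

    # e.g. three hypothetical trace-element factors: FeCl3, MnSO4, ZnSO4
    for run in box_behnken(3):
        print(run)   # 12 edge points + 3 centre replicates = 15 runs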

  3. Practical Applications for Earthquake Scenarios Using ShakeMap

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.

    2001-12-01

    In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations -- ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html) and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools to make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) Assume a particular fault or fault segment will (or did) rupture over a certain length, 2) Determine the magnitude of the earthquake based on the assumed rupture dimensions, 3) Estimate the ground shaking at all locations in the chosen area around the fault, and 4) Represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships for peak ground motions on rock conditions. We then correct the amplitude at each location based on the local site soil (NEHRP) conditions, as we do in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date. Achieving fully synthetic ground motion estimates that substantially improve over empirical relations at these frequencies will require developing cost-effective numerical tools for proper theoretical inclusion of known complex ground motion effects. Current efforts underway must continue in order to obtain site, basin, and deeper crustal structure, and to characterize and test 3D earth models (including attenuation and nonlinearity). In contrast, longer period synthetics (>2 sec) are currently being generated in a deterministic fashion to include 3D and shallow site effects, an improvement on empirical estimates alone. As progress is made, we will naturally incorporate such advances into the ShakeMap scenario earthquake and processing methodology. Our scenarios are currently used heavily in emergency response planning and loss estimation. Primary users include city, county, state and federal government agencies (e.g., the California Office of Emergency Services, FEMA, the County of Los Angeles) as well as emergency response planners and managers for utilities, businesses, and other large organizations. We have found the scenarios are also of fundamental interest to many in the media and the general community interested in the nature of the ground shaking likely experienced in past earthquakes, as well as in the effects of rupture on known faults in the future.
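
    A minimal sketch of the four scenario steps is given below. The magnitude-versus-rupture-length regression is the Wells and Coppersmith (1994) all-fault-type relation quoted from memory, while the rock attenuation relation and the soil amplification factor are illustrative placeholders, not the empirical relations actually used by ShakeMap.

    import math

    def magnitude_from_rupture_length(srl_km):
        """Wells & Coppersmith (1994) all-fault-type surface-rupture-length regression."""
        return 5.08 + 1.16 * math.log10(srl_km)

    def rock_pga_g(M, r_km, c1=-1.0, c2=0.3, c3=-1.1):
        """Placeholder attenuation relation for PGA on rock (illustrative only)."""
        return 10 ** (c1 + c2 * M + c3 * math.log10(r_km + 10.0))

    def site_corrected_pga(pga_rock_g, amplification=1.6):
        """Apply a hypothetical NEHRP soil-class amplification factor."""
        return pga_rock_g * amplification

    M = magnitude_from_rupture_length(45.0)            # step 2: magnitude from rupture size
    print(M, site_corrected_pga(rock_pga_g(M, 20.0)))  # steps 3-4: rock motion, then site correction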

  4. Three-dimensional (3D) evaluation of liquid distribution in shake flask using an optical fluorescence technique.

    PubMed

    Azizan, Amizon; Büchs, Jochen

    2017-01-01

    Biotechnological development in shake flasks requires knowledge of vital engineering parameters, e.g. volumetric power input, mixing time, gas-liquid mass transfer coefficient, hydromechanical stress and effective shear rate. Determination and optimization of these parameters through experiments are labor-intensive and time-consuming. Computational Fluid Dynamics (CFD) provides the ability to predict and validate these parameters in bioprocess engineering. This work provides ample experimental data, easily accessible for future validations, representing the hydrodynamics of the fluid flow in the shake flask. A non-invasive measuring technique using an optical fluorescence method was developed for shake flasks containing a fluorescent solution with a water-like viscosity, at varying filling volume (VL = 15 to 40 mL) and shaking frequency (n = 150 to 450 rpm) and at a constant shaking diameter (do = 25 mm). The method detected the leading edge (LB) and tail of the rotating bulk liquid (TB) relative to the direction of the centrifugal acceleration at varying circumferential heights from the base of the shake flask. The determined LB and TB points were translated into three-dimensional (3D) circumferential liquid distribution plots. The maximum liquid height (Hmax) of the bulk liquid increased with increasing filling volume and shaking frequency of the shake flask, as expected. The toroidal shapes of LB and TB are clearly asymmetrical, and the measured TB differed owing to the elongation of the liquid, particularly towards the torus part of the shake flask. The 3D liquid distribution data collected at varying filling volume and shaking frequency, comprising LB and TB values relative to the direction of the centrifugal acceleration, are essential for validating future numerical solutions using CFD to predict vital engineering parameters in shake flasks.

  5. Estimating economic losses from earthquakes using an empirical approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
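
    The exposure-scaling idea can be sketched as follows; the country multiplier, per-capita GDP, and intensity-dependent loss ratios below are made-up placeholders rather than calibrated PAGER values.

    def economic_exposure(population_by_intensity, gdp_per_capita, country_multiplier):
        """Exposure (USD) per shaking-intensity bin: people x per-capita GDP x country factor."""
        return {mmi: pop * gdp_per_capita * country_multiplier
                for mmi, pop in population_by_intensity.items()}

    def expected_loss(exposure_by_intensity, loss_ratio_by_intensity):
        """Sum exposure times an intensity-dependent loss ratio over all bins."""
        return sum(exposure_by_intensity[mmi] * loss_ratio_by_intensity.get(mmi, 0.0)
                   for mmi in exposure_by_intensity)

    pop = {6: 200_000, 7: 50_000, 8: 10_000}   # people exposed per MMI bin (hypothetical)
    ratios = {6: 0.001, 7: 0.01, 8: 0.05}      # loss ratio per MMI bin (hypothetical)
    print(expected_loss(economic_exposure(pop, 35_000, 2.5), ratios))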

  6. Experimental Study on Steel Tank Model Using Shaking Table/ Badania Eksperymentalne Modelu Zbiornika Stalowego Na Stole Sejsmicznym

    NASA Astrophysics Data System (ADS)

    Burkacki, Daniel; Jankowski, Robert

    2014-09-01

    Cylindrical steel tanks are very popular structures used for the storage of products of the chemical and petroleum industries. Earthquakes are the most dangerous and also the most unpredictable dynamic loads acting on such structures. On the other hand, mining tremors are usually considered to be less severe due to the lower acceleration levels observed. The aim of the present paper is to present the results of an experimental study conducted on a scaled model of a real tank located in Poland. The investigation was carried out under different dynamic excitations (earthquakes and mining tremors) using a shaking table. The results of the study indicate that the stored product may significantly influence the values of the dynamic parameters and confirm that the level of liquid filling is essential in structural analysis. The comparison of the responses to moderate earthquakes and mining tremors indicates that the latter excitation may be more severe in some cases. Cylindrical steel tanks are very popular structures used to store products of the chemical and petroleum industries. Their safety and reliability are crucial, since any damage may have very serious consequences. Earthquakes are the most dangerous and, at the same time, the most unpredictable dynamic loads that can act on such structures. On the other hand, ground motions associated with mining tremors are considered less severe because of the lower acceleration levels reached. The aim of this article is to present the results of experimental tests carried out on a scale model of a real tank located in Poland. The tests were performed using a shaking table and covered harmonic tests of the dynamic properties as well as the behaviour of the cylindrical steel tank during earthquakes and mining tremors at different levels of liquid filling. The results show that the stored product may significantly affect the values of the dynamic parameters and confirm that the level of liquid filling is important in structural analysis. A comparison of the responses during earthquakes and mining tremors indicates that the latter excitation may, in some cases, be more unfavourable.

  7. Empty calories and phantom fullness: a randomized trial studying the relative effects of energy density and viscosity on gastric emptying determined by MRI and satiety.

    PubMed

    Camps, Guido; Mars, Monica; de Graaf, Cees; Smeets, Paul Am

    2016-07-01

    Stomach fullness is a determinant of satiety. Although both the viscosity and energy content have been shown to delay gastric emptying, their relative importance is not well understood. We compared the relative effects of and interactions between the viscosity and energy density on gastric emptying and perceived satiety. A total of 15 healthy men [mean ± SD age: 22.6 ± 2.4 y; body mass index (in kg/m²): 22.6 ± 1.8] participated in an experiment with a randomized 2 × 2 crossover design. Participants received dairy-based shakes (500 mL; 50% carbohydrate, 20% protein, and 30% fat) that differed in viscosity (thin and thick) and energy density [100 kcal (corresponding to 0.2 kcal/mL) compared with 500 kcal (corresponding to 1 kcal/mL)]. After ingestion, participants entered an MRI scanner where abdominal scans and oral appetite ratings on a 100-point scale were obtained every 10 min until 90 min after ingestion. From the scans, gastric content volumes were determined. Overall, the gastric emptying half-time (GE t50) was 54.7 ± 3.8 min. The thin 100-kcal shake had the lowest GE t50 of 26.5 ± 3.0 min, followed by the thick 100-kcal shake with a GE t50 of 41 ± 3.9 min and the thin 500-kcal shake with a GE t50 of 69.5 ± 5.9 min, and the thick 500-kcal shake had the highest GE t50 of 81.9 ± 8.3 min. With respect to appetite, the thick 100-kcal shake led to higher fullness (58 points at 40 min) than the thin 500-kcal shake (48 points at 40 min). Our results show that increasing the viscosity is less effective than increasing the energy density in slowing gastric emptying. However, the viscosity is more important to increase the perceived fullness. These results underscore the lack of the satiating efficiency of empty calories in quickly ingested drinks such as sodas. The increase in perceived fullness that is due solely to the increased viscosity, which is a phenomenon that we refer to as phantom fullness, may be useful in lowering energy intake. This trial was registered at www.trialregister.nl as NTR4573. © 2016 American Society for Nutrition.
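
    The abstract does not state how GE t50 was extracted from the MRI volume series, so the sketch below simply interpolates the time at which gastric content first falls to half its initial volume, as one plausible reading of the half-time; the example series is invented.

    def ge_t50(times_min, volumes_ml):
        """Time at which gastric content first falls to half its initial volume (linear interpolation)."""
        target = volumes_ml[0] / 2.0
        for (t0, v0), (t1, v1) in zip(zip(times_min, volumes_ml),
                                      zip(times_min[1:], volumes_ml[1:])):
            if v0 >= target >= v1:   # bracketing interval found
                return t0 + (v0 - target) * (t1 - t0) / (v0 - v1)
        return None                  # half-emptying not reached within the scan window

    print(ge_t50([0, 10, 20, 30, 40], [500, 400, 300, 200, 150]))  # 25.0 min for this made-up series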

  8. Remotely-Sensed Regional-Scale Evapotranspiration of a Semi-Arid Great Basin Desert and its Relationship to Geomorphology, Soils, and Vegetation

    NASA Technical Reports Server (NTRS)

    Laymon, C.; Quattrochi, D.; Malek, E.; Hipps, L.; Boettinger, J.; McCurdy, G.

    1998-01-01

    Landsat thematic mapper data are used to estimate instantaneous regional-scale surface water and energy fluxes in a semi-arid Great Basin desert of the western United States. Results suggest that it is possible to scale from point measurements of environmental state variables to regional estimates of water and energy exchange. This research characterizes the unifying thread in the classical climate-topography-soil-vegetation relation -- the surface water and energy balance -- through maps of the partitioning of energy throughout the landscape. The study was conducted in Goshute Valley of northeastern Nevada, which is characteristic of most faulted graben valleys of the Basin and Range Province of the western United States. The valley comprises a central playa and lake plain bordered by alluvial fans emanating from the surrounding mountains. The distribution of evapotranspiration (ET) is lowest in the middle reaches of the fans where the water table is deep and plants are small, resulting in low evaporation and transpiration. Highest ET occurs in the center of the valley, particularly in the playa, where limited to no vegetation occurs, but evaporation is relatively high because of a shallow water table and silty clay soil capable of large capillary movement. Intermediate values of ET are associated with large shrubs and are dominated by transpiration.

  9. Remotely-Sensed Regional-Scale Evapotranspiration of a Semi-Arid Great Basin Desert and its Relationship to Geomorphology, Soils, and Vegetation

    NASA Technical Reports Server (NTRS)

    Laymon, C.; Quattrochi, D.; Malek, E.; Hipps, L.; Boettinger, J.; McCurdy, G.

    1997-01-01

    Landsat Thematic Mapper data are used to estimate instantaneous regional-scale surface water and energy fluxes in a semi-arid Great Basin desert of the western United States. Results suggest that it is possible to scale from point measurements of environmental state variables to regional estimates of water and energy exchange. This research characterizes the unifying thread in the classical climate-topography-soil-vegetation relation -- the surface water and energy balance -- through maps of the partitioning of energy throughout the landscape. The study was conducted in Goshute Valley of northeastern Nevada, which is characteristic of most faulted graben valleys of the Basin and Range Province of the western United States. The valley comprises a central playa and lake plain bordered by alluvial fans emanating from the surrounding mountains. The distribution of evapotranspiration (ET) is lowest in the middle reaches of the fans where the water table is deep and plants are small, resulting in low evaporation and transpiration. Highest ET occurs in the center of the valley, particularly in the playa, where limited to no vegetation occurs, but evaporation is relatively high because of a shallow water table and silty clay soil capable of large capillary movement. Intermediate values of ET are associated with large shrubs and are dominated by transpiration.

  10. An Overview and Parametric Evaluation of the CGS ShakeMap Automated System in CISN

    NASA Astrophysics Data System (ADS)

    Hagos, L. Z.; Haddadi, H. R.; Shakal, A. F.

    2014-12-01

    In recent years, ShakeMap has been extensively used in California for earthquake rapid response. Serving as a backup to the Northern and Southern seismic regions of the California Integrated Seismic Network (CISN), the California Geological Survey (CGS) runs a ShakeMap system configured so that it can effectively produce ShakeMaps for earthquakes occurring in both regions. To achieve this goal, CGS has worked to improve the robustness of its ShakeMap system and the quality of its products. Peak ground motion amplitude data are exchanged between the CISN data centers to provide robust generation of ShakeMap. Most exchanged ground motion packets arrive already associated with an earthquake by the authoritative network; for packets that arrive unassociated, CGS employs an event association scheme to associate them with the corresponding earthquake. The generated ShakeMap products are published to the CGS server, which can also be accessed through the CISN website. The backup function is designed to publish ShakeMap products to the USGS NEIC server without collision with the regional networks, acting only in cases where the authoritative region encounters a system failure. Depending on the size, location and significance of the earthquake, review of ShakeMap products by a seismologist may involve changes to ShakeMap parameters from the default. We present an overview of the CGS ShakeMap system and highlight some of the parameters a seismologist may adjust, including parameters related to basin effects, directivity effects when finite fault models are available, site corrections, etc. We also analyze the sensitivity and dependence of the ShakeMap intensity and ground motion maps on the number of observed data included in the computation. In light of the available strong motion amplitude data, we attempt to address the question of what constitutes an adequate-quality ShakeMap in the tradeoff between rapidity and completeness. We also present a brief comparative study of the available Ground Motion to Intensity Conversion Equations (GMICE) by studying selected earthquakes in the California region. Results of these studies can be used as a tool in ShakeMap generation for California earthquakes when the use of non-default parameters is required.

  11. Contributions of algae to GPP and DOC production in an Alaskan fen: effects of historical water table manipulations on ecosystem responses to a natural flood.

    PubMed

    Wyatt, Kevin H; Turetsky, Merritt R; Rober, Allison R; Giroldo, Danilo; Kane, Evan S; Stevenson, R Jan

    2012-07-01

    The role of algae in the metabolism of northern peatlands is largely unknown, as is how algae will respond to the rapid climate change being experienced in this region. In this study, we examined patterns in algal productivity, nutrients, and dissolved organic carbon (DOC) during an uncharacteristically wet summer in an Alaskan rich fen. Our sampling was conducted in three large-scale experimental plots where water table position had been manipulated (including both drying and wetting plots and a control) for the previous 4 years. This study allowed us to explore how much ecosystem memory of the antecedent water table manipulations governed algal responses to natural flooding. Despite no differences in water table position between the manipulated plots at the time of sampling, algal primary productivity was consistently higher in the lowered water table plot compared to the control or raised water table plots. In all plots, algal productivity peaked immediately following seasonal maxima in nutrient concentrations. We found a positive relationship between algal productivity and water-column DOC concentrations (r² = 0.85, P < 0.001). Using these data, we estimate that algae released approximately 19% of fixed carbon into the water column. Algal exudates were extremely labile in biodegradability assays, decreasing by more than 55% within the first 24 h of incubation. We suggest that algae can be an important component of the photosynthetic community in boreal peatlands and may become increasingly important for energy flow in a more variable climate with more intense droughts and flooding.

  12. Speak Simply When Warning About After Shocks

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Hardebeck, J.; Page, M. T.; van der Elst, N.; Wein, A. M.

    2016-12-01

    When a fault in the ground slips, the ground moves fast and can shake hard. After a big ground shake, there are more shakes. We call them after shocks and these can happen over a long time, for many years. An after shock can shake the ground more than it shook the first time. These shocks can shake and break places where people live and work, make rocks fall down and the ground go soft and wet, and hurt or kill people. After shocks also make people worry. If people are scared, then they may leave the area and not come back. To help people be safe and feel calm we want to tell them what may happen. We often use big words and lots of numbers to give the chances for the number of shakes over days, weeks, and years. That helps some people fix things and do their jobs such as those who work on roads, power, water, phones, hospitals, schools or in the money business. But big words and too many numbers can confuse a lot of people and make them worry more. Studies of talking about the ground shake problem show that it is best to speak simply to people. What if we only use the ten hundred most often used words to talk about these ground shakes. Would that work? Here is a possible warning: Last week's huge ground shake will probably make more ground shakes. This week expect to feel three to ten ground shakes and maybe one big ground shake that could break things. That big ground shake has a chance of 1 in 10. This is normal. Be safe. Stay out of broken houses, shops, and work places. When you feel the ground shake: drop, cover, and hold on. People may feel afraid or be hurt, so check on friends and family. Get some more food and water. Over time there will be fewer ground shakes, but always be ready for them. That warning gives a lot of key ideas: what may happen, whether houses could get broken, that what is happening is normal, and what people may feel and should do. These are the key parts of a good warning. Maybe we should use the most often used words all the time.

  13. Seismic damage diagnosis of a masonry building using short-term damping measurements

    NASA Astrophysics Data System (ADS)

    Kouris, Leonidas Alexandros S.; Penna, Andrea; Magenes, Guido

    2017-04-01

    It is of considerable importance to perform dynamic identification and detect damage in existing structures. This paper describes a new and practical method for damage diagnosis of masonry buildings that requires minimal computational effort. The method is based on the relative variation of modal damping and is validated against experimental data from a full-scale two-storey shake table test. The experiment involves a building subjected to uniaxial vibrations of progressively increasing intensity, up to a near-collapse damage state, at the facilities of the EUCENTRE laboratory (Pavia, Italy). Five time histories are applied by scaling the Montenegro (1979) accelerogram. These strong-motion tests are preceded by random vibration tests (RVTs) which are used to perform modal analysis. Two deterministic identification methods are applied: the single-degree-of-freedom (SDOF) assumption together with the peak-picking method in the discrete frequency domain, and the Eigensystem Realisation Algorithm with data correlations (ERA-DC) in the discrete time domain. In the former procedure, some improvements are incorporated to rigorously locate the natural frequencies and estimate the modal damping. The progressive evolution of the modal damping is used as a key indicator to characterise damage to the building. Modal damping is connected to the structural mass and stiffness. A squared, two-component expression for proportional (classical) damping is proposed to better fit the experimental measurements of modal damping ratios. Using this Rayleigh-type formulation, the contribution of each damping component is evaluated. The stiffness-component coefficient is proposed as an effective index to detect damage and quantify its intensity.
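
    As a rough illustration of the damping decomposition, the sketch below fits the classical two-term Rayleigh form xi_i = alpha/(2*w_i) + beta*w_i/2 to identified modal damping ratios and returns the stiffness-proportional coefficient beta, the quantity tracked here as a damage index; the paper's modified squared two-component expression is not reproduced, and the frequencies and damping values are invented for illustration.

    import numpy as np

    def fit_rayleigh(frequencies_hz, damping_ratios):
        """Least-squares fit of alpha (mass term) and beta (stiffness term) to modal damping ratios."""
        w = 2.0 * np.pi * np.asarray(frequencies_hz)
        A = np.column_stack([1.0 / (2.0 * w), w / 2.0])   # [mass term, stiffness term]
        (alpha, beta), *_ = np.linalg.lstsq(A, np.asarray(damping_ratios), rcond=None)
        return alpha, beta

    alpha, beta = fit_rayleigh([3.2, 7.9, 12.5], [0.030, 0.042, 0.060])
    print(alpha, beta)   # beta increasing between shaking runs would indicate stiffness degradation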

  14. The Commercial TREMOR Strong-Motion Seismograph

    NASA Astrophysics Data System (ADS)

    Evans, J. R.; Hamstra, R. H.; Kuendig, C.; Camina, P.

    2001-12-01

    The emergence of major seismological and earthquake-engineering problems requiring large, dense instrument arrays led several of us to investigate alternate solutions. Evans and Rogers (USGS Open File Report 95-555, 1995) and Evans (USGS Open File Report 98-109, 1998) demonstrated the efficacy of low-cost robust silicon accelerometers in strong-motion seismology, making possible a vast increase in the spatial density of such arrays. The 1998 design displays true 16-bit performance and excellent robustness and linearity -- 13 of these prototype near-real-time instruments are deployed in Oakland, California, and have recorded data from seven small events (up to 5.7 %g). Since this technology is a radical departure from past efforts, it was necessary for the USGS to develop the sensor and demonstrate its efficacy thoroughly. Since it is neither practical nor appropriate for the USGS to produce instrumentation beyond a demonstration phase, the US Geological Survey and GeoSIG Ltd undertook a collaborative effort (a "CRAD") to commercialize the new technology. This effort has resulted in a fully temperature-compensated 16-bit system, the GeoSIG GT-316, announced in April 2001, combining the ICS-3028-based USGS sensor, temperature compensation technique, and peak ground velocity (PGV) computation with a highly customized 16-bit GeoSIG recorder. The price has not been set but is likely to be around $2,000 in large quantities. The result is a near-real-time instrument telemetering peak ground acceleration (PGA) and PGV about 90 s after onset of the P wave, then minutes later transmitting the waveform. The receiving software, "HomeBase()", also computes spectral acceleration, Sa. PGA, PGV, Sa, and waveforms are forwarded immediately by HomeBase() for ShakeMap generation and other uses. Shaking metrics from the prototypes in Oakland are consistently among the first to arrive for the northern California ShakeMap. For telemetry we use a low-cost always-connected cell-phone-based Internet technology (CDPD), but any RS-232 connected telemetry system is a viable candidate (spread spectrum, CDMA, GSM, POT). The instruments can be synchronized via CDPD to a few tenths of a second, or to full precision with an optional GPS receiver. Sensor RMS noise is 33 μg over the band 0.1 to 35 Hz and 11 μg over the band 0.1 to 1.0 Hz; the sensor is extremely linear (far better than 1% of full scale); temperature compensation is to better than 1% of full scale. TREMOR-class instruments are intended to fill the niche of high spatial resolution with an economical low-maintenance device, while conventional instruments continue to pursue maximum amplitude resolution. The TREMOR instrument also has applications in areas where budget or access limitations require lower purchase, installation, or maintenance cost (commercial ANSS partners, remote sites, on-call aftershock arrays, extremely dense arrays, and organizations with limited budgets). However, we primarily envision large, mixed arrays of conventional and TREMOR instruments in urban areas, the former providing better early information from small events and the TREMOR instruments guaranteeing better spatial resolution and more near-field recording of large events. Together, they would meet the ANSS goal of dense near-real-time urban monitoring and the collection of requisite data for risk mitigation.

  15. Why the Long Face? The Mechanics of Mandibular Symphysis Proportions in Crocodiles

    PubMed Central

    Walmsley, Christopher W.; Smits, Peter D.; Quayle, Michelle R.; McCurry, Matthew R.; Richards, Heather S.; Oldfield, Christopher C.; Wroe, Stephen; Clausen, Phillip D.; McHenry, Colin R.

    2013-01-01

    Background Crocodilians exhibit a spectrum of rostral shape from long snouted (longirostrine), through to short snouted (brevirostrine) morphologies. The proportional length of the mandibular symphysis correlates consistently with rostral shape, forming as much as 50% of the mandible’s length in longirostrine forms, but 10% in brevirostrine crocodilians. Here we analyse the structural consequences of an elongate mandibular symphysis in relation to feeding behaviours. Methods/Principal Findings Simple beam and high resolution Finite Element (FE) models of seven species of crocodile were analysed under loads simulating biting, shaking and twisting. Using beam theory, we statistically compared multiple hypotheses of which morphological variables should control the biomechanical response. Brevi- and mesorostrine morphologies were found to consistently outperform longirostrine types when subject to equivalent biting, shaking and twisting loads. The best predictors of performance for biting and twisting loads in FE models were overall length and symphyseal length respectively; for shaking loads symphyseal length and a multivariate measurement of shape (PC1 -- which is strongly but not exclusively correlated with symphyseal length) were equally good predictors. Linear measurements were better predictors than multivariate measurements of shape in biting and twisting loads. For both biting and shaking loads but not for twisting, simple beam models agree with best performance predictors in FE models. Conclusions/Significance Combining beam and FE modelling allows a priori hypotheses about the importance of morphological traits on biomechanics to be statistically tested. Short mandibular symphyses perform well under loads used for feeding upon large prey, but elongate symphyses incur high strains under equivalent loads, underlining the structural constraints to prey size in the longirostrine morphotype. The biomechanics of the crocodilian mandible are largely consistent with beam theory and can be predicted from simple morphological measurements, suggesting that crocodilians are a useful model for investigating the palaeobiomechanics of other aquatic tetrapods. PMID:23342027

  16. Large-scale breeder reactor prototype mechanical pump conceptual design study, hot leg

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-09-01

    Due to the extensive nature of this study, the report is presented as a series of small reports. The complete design analysis is placed in a separate section. The drawings and tabulations are in the back portion of the report. Other topics are enumerated and located as shown in the table of contents.

  17. Supports for libraries' restoration from the Great East Japan Earthquake: Challenges we address at Miyagi Prefectural Library

    NASA Astrophysics Data System (ADS)

    Kumagai, Shinichiro

    This article overviews the situations of damage and reconstruction of mainly public libraries in Miyagi Prefecture about 9 months after the Great East Japan Earthquake. Serious damage to library buildings was due not only to the tsunami, or seismic sea wave, but also to violent shaking, the latter being less reported by the media. We at the Miyagi Prefectural Library implemented reconstruction assistance for regional public libraries in both direct and indirect ways. Among them, we report in detail on the support we offered until the Minami-sanriku Town Library reopened its service. We highlight the prefectural library's role as an intermediary between supporters and supportees, in order to consider the necessity of such middle organizations. We clarify what challenges we face and examine how best to provide assistance in case of large-scale disasters.

  18. Flotation-separation of aluminum from some water samples using powdered marble waste and oleic acid.

    PubMed

    Ghazy, Shaban el-Sayed; Samra, Salem el-Sayed; Mahdy, Abd el-Fattah Mohammed; el-Morsy, Sherin Mohammed

    2003-10-01

    Bench-scale experiments were conducted in the laboratory, aiming to remove aluminum from water. They were based on the use of powdered marble wastes (PMW), which are inexpensive and produced in large quantity, and thus potentially cause environmental problems, as an effective inorganic sorbent and oleic acid (HOL) as surfactant. The main parameters (solution pHs, sorbent, surfactant and aluminum concentrations, shaking time, ionic strength and the presence of foreign ions) that influence the sorptive-flotation process were examined. Good results were obtained under the optimum conditions, for which nearly 100% of aluminum at pH 7 and at room temperature (approximately 25 degrees C) was removed. The procedure was successfully applied to the recovery of aluminum spiked to some natural water samples. Moreover, a sorption and flotation mechanism is suggested.

  19. Erratum to "Large-scale mitochondrial COI gene sequence variability reflects the complex colonization history of the invasive soft-shell clam, Mya arenaria (L.) (Bivalvia)" [Estuar. Coast. Shelf Sci. 181 (2016) 256-265]

    NASA Astrophysics Data System (ADS)

    Lasota, Rafal; Pierscieniak, Karolina; Garcia, Pascale; Simon-Bouhet, Benoit; Wolowicz, Maciej

    2017-03-01

    The publisher regrets a printing error in the last paragraph in the Results section. The correct text should read as follows: Tajima's D, Fu and Li's D* and F*, and Fu's Fs were negative for all American populations, and statistically significant in most cases (Table 3). In most of the European populations the values of neutrality tests were positive, but not statistically significant. The highest positive values of neutrality tests were noted in the populations from Reykjavik (Iceland) and Dublin (Ireland) (Table 3).

  20. Filamentous fungal biofilm for production of human drug metabolites.

    PubMed

    Amadio, Jessica; Casey, Eoin; Murphy, Cormac D

    2013-07-01

    In drug development, access to drug metabolites is essential for assessment of toxicity and pharmacokinetic studies. Metabolites are usually acquired via chemical synthesis, although biological production is potentially more efficient with fewer waste management issues. A significant problem with the biological approach is the effective half-life of the biocatalyst, which can be resolved by immobilisation. The fungus Cunninghamella elegans is well established as a model of mammalian metabolism, although it has not yet been used to produce metabolites on a large scale. Here, we describe immobilisation of C. elegans as a biofilm, which can transform drugs to important human metabolites. The biofilm was cultivated on hydrophilic microtiter plates and in shake flasks containing a steel spring in contact with the glass. Fluorescence and confocal scanning laser microscopy revealed that the biofilm was composed of a dense network of hyphae, and biochemical analysis demonstrated that the matrix was predominantly polysaccharide. The medium composition was crucial for both biofilm formation and biotransformation of flurbiprofen. In shake flasks, the biofilm transformed 86% of the flurbiprofen added to hydroxylated metabolites within 24 h, which was slightly more than planktonic cultures (76%). The biofilm had a longer effective lifetime than the planktonic cells, which underwent lysis after 2×72 h cycles, and diluting the Sabouraud dextrose broth enabled the thickness of the biofilm to be controlled while retaining transformation efficiency. Thus, C. elegans biofilm has the potential to be applied as a robust biocatalyst for the production of human drug metabolites required for drug development.

  1. Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.

    2013-04-01

    A reliable and comprehensive characterization of expected seismic ground shaking, possibly including the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for seismic hazard assessment must demonstrate its capability in anticipating the ground shaking associated with large earthquake occurrences, a result that can be attained only through a rigorous verification and validation process. So far, the major problems in classical probabilistic methods for seismic hazard assessment, PSHA, have consisted in the adequate description of the earthquake recurrence, particularly for the largest and sporadic events, and of the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of the seismic wave generation and propagation processes nowadays allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows considering a wide range of possible seismic sources as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking, through the routine updating of formally defined earthquake predictions. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of ground motion. Accordingly, a set of deterministic scenarios of ground motion at bedrock, which refers to the time interval when a strong event is likely to occur within the alerted area, can be defined by means of full waveform modeling, both at regional and local scale. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been regularly updated every two months since 2006. The routine application of the time-dependent NDSHA approach provides information that can be useful in assigning priorities for timely mitigation actions and, at the same time, allows for a rigorous prospective testing and validation of the proposed methodology. As an example, for sites where ground shaking values greater than 0.2 g are estimated at bedrock, further investigations can be performed taking into account the local soil conditions, to assess the performance of relevant structures, such as historical and strategic buildings. The issues related to prospective testing and validation of the time-dependent NDSHA scenarios are discussed, illustrating the results obtained for the recent strong earthquakes in Italy, including the May 20, 2012 Emilia earthquake.

  2. Diversity of head shaking nystagmus in peripheral vestibular disease.

    PubMed

    Kim, Min-Beom; Huh, Se Hyung; Ban, Jae Ho

    2012-06-01

    To evaluate the characteristics of head shaking nystagmus in various peripheral vestibular diseases. Retrospective case series. Tertiary referral center. Data of 235 patients with peripheral vestibular diseases, including vestibular neuritis, Ménière's disease, and benign paroxysmal positional vertigo, were retrospectively analyzed. All subjects presented between August 2009 and July 2010. Patients underwent vestibular function testing, including head shaking nystagmus and caloric testing. For vestibular neuritis, all tests were performed again at the 1-month follow-up. Head shaking nystagmus was classified as monophasic or biphasic and, according to the affected ear, was categorized as ipsilesional or contralesional. Of the 235 patients, 87 showed positive head shaking nystagmus. By disease, the positive rates of head shaking nystagmus were as follows: 35 (100%) of 35 cases of vestibular neuritis, 11 (68.8%) of 16 cases of Ménière's disease, and 41 (22.2%) of 184 cases of benign paroxysmal positional vertigo. All cases of vestibular neuritis initially presented with a monophasic, contralesional-beating head shaking nystagmus. However, 1 month after the first visit, the direction of nystagmus had changed to biphasic (contralesional first, then ipsilesional beating) in 25 cases (72.5%) but not in 10 cases (27.5%). There was a significant correlation between the degree of initial caloric weakness and the biphasic conversion of head shaking nystagmus (p = 0.02). In 72.5% of vestibular neuritis cases, head shaking nystagmus converted to biphasic during the subacute period. The larger the initial canal paresis, the more frequently the biphasic conversion of head shaking nystagmus occurred. However, Ménière's disease and benign paroxysmal positional vertigo did not show specific patterns of head shaking nystagmus.

  3. The transboundary non-renewable Nubian Aquifer System of Chad, Egypt, Libya and Sudan: classical groundwater questions and parsimonious hydrogeologic analysis and modeling

    USGS Publications Warehouse

    Voss, Clifford I.; Soliman, Safaa M.

    2014-01-01

    Parsimonious groundwater modeling provides insight into hydrogeologic functioning of the Nubian Aquifer System (NAS), the world’s largest non-renewable groundwater system (belonging to Chad, Egypt, Libya, and Sudan). Classical groundwater-resource issues exist (magnitude and lateral extent of drawdown near pumping centers) with joint international management questions regarding transboundary drawdown. Much of NAS is thick, containing a large volume of high-quality groundwater, but receives insignificant recharge, so water-resource availability is time-limited. Informative aquifer data are lacking regarding large-scale response, providing only local-scale information near pumps. Proxy data provide primary underpinning for understanding regional response: Holocene water-table decline from the previous pluvial period, after thousands of years, results in current oasis/sabkha locations where the water table still intersects the ground. Depletion is found to be controlled by two regional parameters, hydraulic diffusivity and vertical anisotropy of permeability. Secondary data that provide insight are drawdowns near pumps and isotope-groundwater ages (million-year-old groundwaters in Egypt). The resultant strong simply structured three-dimensional model representation captures the essence of NAS regional groundwater-flow behavior. Model forecasts inform resource management that transboundary drawdown will likely be minimal—a nonissue—whereas drawdown within pumping centers may become excessive, requiring alternative extraction schemes; correspondingly, significant water-table drawdown may occur in pumping centers co-located with oases, causing oasis loss and environmental impacts.

  4. The transboundary non-renewable Nubian Aquifer System of Chad, Egypt, Libya and Sudan: classical groundwater questions and parsimonious hydrogeologic analysis and modeling

    NASA Astrophysics Data System (ADS)

    Voss, Clifford I.; Soliman, Safaa M.

    2014-03-01

    Parsimonious groundwater modeling provides insight into hydrogeologic functioning of the Nubian Aquifer System (NAS), the world's largest non-renewable groundwater system (belonging to Chad, Egypt, Libya, and Sudan). Classical groundwater-resource issues exist (magnitude and lateral extent of drawdown near pumping centers) with joint international management questions regarding transboundary drawdown. Much of NAS is thick, containing a large volume of high-quality groundwater, but receives insignificant recharge, so water-resource availability is time-limited. Informative aquifer data are lacking regarding large-scale response, providing only local-scale information near pumps. Proxy data provide primary underpinning for understanding regional response: Holocene water-table decline from the previous pluvial period, after thousands of years, results in current oasis/sabkha locations where the water table still intersects the ground. Depletion is found to be controlled by two regional parameters, hydraulic diffusivity and vertical anisotropy of permeability. Secondary data that provide insight are drawdowns near pumps and isotope-groundwater ages (million-year-old groundwaters in Egypt). The resultant strong simply structured three-dimensional model representation captures the essence of NAS regional groundwater-flow behavior. Model forecasts inform resource management that transboundary drawdown will likely be minimal—a nonissue—whereas drawdown within pumping centers may become excessive, requiring alternative extraction schemes; correspondingly, significant water-table drawdown may occur in pumping centers co-located with oases, causing oasis loss and environmental impacts.
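
    Since the regional response is said to be controlled by hydraulic diffusivity, a minimal sketch of that control is given below: diffusivity D = K/Ss and an order-of-magnitude response time t ~ L^2/D for a drawdown signal to propagate a distance L. The parameter values are illustrative placeholders, not NAS calibration results.

    def hydraulic_diffusivity(K_m_per_s, Ss_per_m):
        """Hydraulic diffusivity D = K / Ss, in m^2/s."""
        return K_m_per_s / Ss_per_m

    def response_time_years(L_m, D_m2_per_s):
        """Order-of-magnitude time for a head perturbation to diffuse a distance L."""
        return (L_m ** 2 / D_m2_per_s) / (365.25 * 24 * 3600)

    D = hydraulic_diffusivity(K_m_per_s=1e-5, Ss_per_m=1e-6)   # placeholder aquifer properties
    print(D, response_time_years(100_000.0, D))                # ~100 km propagation distance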

  5. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel

    2017-01-01

    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection priority, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical-user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users—while still utilizing advanced users’ technical and engineering background—we have developed a “ShakeCast Workbook”, a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export XML files requisite for operating the ShakeCast system. Users will be able to select structure classifications based on a minimum set of user-specified facility attributes (building location, size, height, use, construction age, etc.). “Expert” users will be able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential impact and inspection metrics (i.e., green, yellow, orange and red priority ratings) to allow users to institute customized earthquake response protocols. Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 and 1s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap’s multi-period spectra in lieu of the assumed three-domain design spectrum (at 0.3s for constant acceleration; 1s or 3s for constant velocity and constant displacement at very long response periods). As part of ongoing ShakeCast research and development, we will also explore the use of ShakeMap IM uncertainty estimates and evaluate the assumption of employing multiple response spectral damping values rather than the single 5%-damped value currently employed. Developing and incorporating advanced fragility assignments into the ShakeCast Workbook requires related software modifications and database improvements; these enhancements are part of an extensive rewrite of the ShakeCast application.
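
    As a schematic illustration of the priority-rating idea described above (not ShakeCast's actual fragility logic; the thresholds below are hypothetical), a minimal Python sketch mapping a ShakeMap intensity measure at a facility to an inspection-priority color might look like this:

      # Minimal sketch: map a ShakeMap intensity measure at a facility to an
      # inspection-priority color. Thresholds are hypothetical placeholders,
      # not HAZUS/AEBM fragility values.
      def inspection_priority(sa_0p3s_g, thresholds=(0.1, 0.25, 0.5)):
          """sa_0p3s_g: 0.3-s spectral acceleration at the facility, in g."""
          slight, moderate, extensive = thresholds
          if sa_0p3s_g < slight:
              return "green"
          if sa_0p3s_g < moderate:
              return "yellow"
          if sa_0p3s_g < extensive:
              return "orange"
          return "red"

      print(inspection_priority(0.32))   # -> "orange"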

  6. Validation of the shake test for detecting freeze damage to adsorbed vaccines.

    PubMed

    Kartoglu, Umit; Ozgüler, Nejat Kenan; Wolfson, Lara J; Kurzatkowski, Wiesław

    2010-08-01

    To determine the validity of the shake test for detecting freeze damage in aluminium-based, adsorbed, freeze-sensitive vaccines. A double-blind crossover design was used to compare the performance of the shake test conducted by trained health-care workers (HCWs) with that of phase-contrast microscopy as a "gold standard". A total of 475 vials of 8 different types of World Health Organization prequalified freeze-sensitive vaccines from 10 different manufacturers were used. Vaccines were kept at 5 °C. Selected numbers of vials from each type were then exposed to -25 °C and -2 °C for 24-hour periods. There was complete concordance between HCWs and phase-contrast microscopy in identifying freeze-damaged vials and non-frozen samples. Non-frozen samples showed a fine-grain structure under phase-contrast microscopy, but freeze-damaged samples showed large conglomerates of massed precipitates with amorphous, crystalline, solid and needle-like structures. Particles in the non-frozen samples measured from 1 μm (vaccines against diphtheria-tetanus-pertussis; Haemophilus influenzae type b; hepatitis B; diphtheria-tetanus-pertussis-hepatitis B) to 20 μm (diphtheria and tetanus vaccines, alone or in combination). By contrast, aggregates in the freeze-damaged samples measured up to 700 μm (diphtheria-tetanus-pertussis) and 350 μm on average. The shake test had 100% sensitivity, 100% specificity and 100% positive predictive value in this study, which confirms its validity for detecting freeze damage to aluminium-based freeze-sensitive vaccines.
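
    For reference, the reported diagnostic statistics follow directly from a 2x2 confusion matrix; a minimal Python sketch (with hypothetical vial counts, since the abstract does not break down the 475 vials) is:

      # Minimal sketch: sensitivity, specificity and positive predictive value
      # from a 2x2 confusion matrix. Counts are hypothetical.
      tp, fn = 120, 0    # freeze-damaged vials classified positive / negative by the shake test
      tn, fp = 355, 0    # non-frozen vials classified negative / positive

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      ppv = tp / (tp + fp)
      print(sensitivity, specificity, ppv)   # 1.0 1.0 1.0 for a perfectly concordant test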

  7. Development of stiffer and ductile glulam portal frame

    NASA Astrophysics Data System (ADS)

    Komatsu, Kohei

    2017-11-01

    Portal frame structures, which consist of straight glulam beams and columns connected semi-rigidly by steel insert gusset plates with many drift pins, were the first successful glulam structures widely used in Japan. In addition to this connection system, the author also invented a new type of jointing device for glulam structures, named the "Lagscrewbolt", which has a fully threaded inner portion to grip the wooden member and a second threaded portion at the end of the shank to connect to another member. The initial type of "Lagscrewbolt" was successfully applied to various types of glulam buildings that could be rapidly erected on the construction site. Its strength performance, however, was rather brittle, so improving the ductility was a crucial research subject. To give sufficient ductility to the "Lagscrewbolted" joint system, the so-called "Slotted Bolted Connection" concept was adopted to exploit the large energy dissipation of high-tension bolted steel connections with slotted bolt holes. The static and dynamic performance of glulam portal frame specimens was evaluated by static cyclic loading tests as well as shaking table tests. The latest form of the jointing system shows very high ductility as well as stable hysteretic loops when brass shims are inserted between the steel-to-steel friction interfaces.

  8. Change in Weather Research and Forecasting (WRF) Model Accuracy with Age of Input Data from the Global Forecast System (GFS)

    DTIC Science & Technology

    2016-09-01

    Laboratory report "Change in Weather Research and Forecasting (WRF) Model Accuracy with Age of Input Data from the Global Forecast System (GFS)" by JL Cogan ... analysis. As expected, accuracy generally tended to decline as the large-scale data aged, but appeared to improve slightly as the age of the large... Table 7: Minimum and maximum mean RMDs for each WRF time (or GFS data age) category.

  9. Stable isotope and noble gas constraints on the source and residence time of spring water from the Table Mountain Group Aquifer, Paarl, South Africa and implications for large scale abstraction

    NASA Astrophysics Data System (ADS)

    Miller, J. A.; Dunford, A. J.; Swana, K. A.; Palcsu, L.; Butler, M.; Clarke, C. E.

    2017-08-01

    Large scale groundwater abstraction is increasingly being used to support large urban centres, especially in areas of low rainfall, but presents particular challenges in the management and sustainability of the groundwater system. The Table Mountain Group (TMG) Aquifer is one of the largest and most important aquifer systems in South Africa and is currently being considered as an alternative source of potable water for the City of Cape Town, a metropolis of over four million people. The TMG aquifer is a fractured rock aquifer hosted primarily in super mature sandstones, quartzites and quartz arenites. The groundwater naturally emanates from numerous springs throughout the Cape region. One set of springs was examined to assess the source and residence time of the spring water. Oxygen and hydrogen isotopes indicate that the spring water has not been subject to evaporation and, in combination with Na/Cl ratios, imply that recharge to the spring systems is via coastal precipitation. Although rainfall in the Cape is usually modelled on orographic rainfall, δ18O and δ2H values of some rainfall samples are strongly positive, indicating a stratiform component as well. Comparing the spring water δ18O and δ2H values with those of local rainfall indicates that the springs are likely derived from continuous bulk recharge over the immediate hinterland to the springs and not through large and/or heavy downpours. Noble gas concentrations, combined with tritium and radiocarbon activities, indicate that the residence time of the TMG groundwater in this area is decadal in age with a probable maximum upper limit of ∼40 years. This residence time is probably a reflection of the slow flow rate through the fractured rock aquifer and hence indicates that the interconnectedness of the fractures is the most important factor controlling groundwater flow. The short residence time of the groundwater suggests that recharge to the springs and the Table Mountain Group Aquifer as a whole is vulnerable to climate change and reductions in regional precipitation. Any plans for large scale abstraction to supplement the City of Cape Town water supply would need to factor this into models of maximum sustainable yield.
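
    To illustrate the kind of residence-time reasoning summarized above, a minimal Python sketch of an apparent tritium age under a simple piston-flow assumption (the activities are hypothetical, and this is not the study's actual lumped-parameter analysis):

      # Minimal sketch: apparent groundwater age from tritium decay (piston-flow assumption).
      import math

      half_life_yr = 12.32                 # tritium half-life in years
      input_tu, measured_tu = 3.0, 0.9     # hypothetical recharge and sample activities (TU)
      age_yr = half_life_yr / math.log(2) * math.log(input_tu / measured_tu)
      print(f"apparent age ~ {age_yr:.0f} years")   # ~21 years, i.e. decadal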

  10. Leaching of metals from large pieces of printed circuit boards using citric acid and hydrogen peroxide.

    PubMed

    Jadhav, Umesh; Su, C; Hocheng, Hong

    2016-12-01

    In the present study, the leaching of metals from large pieces of computer printed circuit boards (CPCBs) was studied. A combination of citric acid (0.5 M) and 1.76 M hydrogen peroxide (H2O2) was used to leach the metals from the CPCB pieces. The influence of system variables such as H2O2 concentration, concentration of citric acid, shaking speed, and temperature on the metal leaching process was investigated. Complete metal leaching was achieved in 4 h from a 4 × 4 cm CPCB piece. The presence of citric acid and H2O2 together in the leaching solution is essential for complete metal leaching. The optimum addition amount of H2O2 was 5.83%. The citric acid concentration and shaking speed had an insignificant effect on the leaching of metals. The increase in the temperature above 30 °C showed a drastic effect on the metal leaching process.

  11. Slip pulse and resonance of the Kathmandu basin during the 2015 Gorkha earthquake, Nepal

    NASA Astrophysics Data System (ADS)

    Galetzka, J.; Melgar, D.; Genrich, J. F.; Geng, J.; Owen, S.; Lindsey, E. O.; Xu, X.; Bock, Y.; Avouac, J.-P.; Adhikari, L. B.; Upreti, B. N.; Pratt-Sitaula, B.; Bhattarai, T. N.; Sitaula, B. P.; Moore, A.; Hudnut, K. W.; Szeliga, W.; Normandeau, J.; Fend, M.; Flouzat, M.; Bollinger, L.; Shrestha, P.; Koirala, B.; Gautam, U.; Bhatterai, M.; Gupta, R.; Kandel, T.; Timsina, C.; Sapkota, S. N.; Rajaure, S.; Maharjan, N.

    2015-09-01

    Detailed geodetic imaging of earthquake ruptures enhances our understanding of earthquake physics and associated ground shaking. The 25 April 2015 moment magnitude 7.8 earthquake in Gorkha, Nepal was the first large continental megathrust rupture to have occurred beneath a high-rate (5-hertz) Global Positioning System (GPS) network. We used GPS and interferometric synthetic aperture radar data to model the earthquake rupture as a slip pulse ~20 kilometers in width, ~6 seconds in duration, and with a peak sliding velocity of 1.1 meters per second, which propagated toward the Kathmandu basin at ~3.3 kilometers per second over ~140 kilometers. The smooth slip onset, indicating a large (~5-meter) slip-weakening distance, caused moderate ground shaking at high frequencies (>1 hertz; peak ground acceleration, ~16% of Earth’s gravity) and minimized damage to vernacular dwellings. Whole-basin resonance at a period of 4 to 5 seconds caused the collapse of tall structures, including cultural artifacts.

  12. Hydrometallurgical Recovery of Metals from Large Printed Circuit Board Pieces.

    PubMed

    Jadhav, U; Hocheng, H

    2015-09-29

    The recovery of precious metals from waste printed circuit boards (PCBs) is an effective recycling process. This paper presents a promising hydrometallurgical process to recover precious metals from waste PCBs. To simplify the metal leaching process, large pieces of PCBs were used instead of a pulverized sample. The chemical coating present on the PCBs was removed by sodium hydroxide (NaOH) treatment prior to the hydrometallurgical treatment. Among the leaching reagents examined, hydrochloric acid (HCl) showed great potential for the recovery of metals. The HCl-mediated leaching of waste PCBs was investigated over a range of conditions. Increasing the acid concentration decreased the time required for complete metal recovery. The shaking speed showed a pronounced positive effect on metal recovery, but the temperature showed an insignificant effect. The results showed that 1 M HCl recovered all of the metals from 4 cm × 4 cm PCBs at room temperature and 150 rpm shaking speed in 22 h.

  13. Hydrometallurgical Recovery of Metals from Large Printed Circuit Board Pieces

    PubMed Central

    Jadhav, U.; Hocheng, H.

    2015-01-01

    The recovery of precious metals from waste printed circuit boards (PCBs) is an effective recycling process. This paper presents a promising hydrometallurgical process to recover precious metals from waste PCBs. To simplify the metal leaching process, large pieces of PCBs were used instead of a pulverized sample. The chemical coating present on the PCBs was removed by sodium hydroxide (NaOH) treatment prior to the hydrometallurgical treatment. Among the leaching reagents examined, hydrochloric acid (HCl) showed great potential for the recovery of metals. The HCl-mediated leaching of waste PCBs was investigated over a range of conditions. Increasing the acid concentration decreased the time required for complete metal recovery. The shaking speed showed a pronounced positive effect on metal recovery, but the temperature showed an insignificant effect. The results showed that 1 M HCl recovered all of the metals from 4 cm × 4 cm PCBs at room temperature and 150 rpm shaking speed in 22 h. PMID:26415827

  14. Slip pulse and resonance of the Kathmandu basin during the 2015 Gorkha earthquake, Nepal.

    PubMed

    Galetzka, J; Melgar, D; Genrich, J F; Geng, J; Owen, S; Lindsey, E O; Xu, X; Bock, Y; Avouac, J-P; Adhikari, L B; Upreti, B N; Pratt-Sitaula, B; Bhattarai, T N; Sitaula, B P; Moore, A; Hudnut, K W; Szeliga, W; Normandeau, J; Fend, M; Flouzat, M; Bollinger, L; Shrestha, P; Koirala, B; Gautam, U; Bhatterai, M; Gupta, R; Kandel, T; Timsina, C; Sapkota, S N; Rajaure, S; Maharjan, N

    2015-09-04

    Detailed geodetic imaging of earthquake ruptures enhances our understanding of earthquake physics and associated ground shaking. The 25 April 2015 moment magnitude 7.8 earthquake in Gorkha, Nepal was the first large continental megathrust rupture to have occurred beneath a high-rate (5-hertz) Global Positioning System (GPS) network. We used GPS and interferometric synthetic aperture radar data to model the earthquake rupture as a slip pulse ~20 kilometers in width, ~6 seconds in duration, and with a peak sliding velocity of 1.1 meters per second, which propagated toward the Kathmandu basin at ~3.3 kilometers per second over ~140 kilometers. The smooth slip onset, indicating a large (~5-meter) slip-weakening distance, caused moderate ground shaking at high frequencies (>1 hertz; peak ground acceleration, ~16% of Earth's gravity) and minimized damage to vernacular dwellings. Whole-basin resonance at a period of 4 to 5 seconds caused the collapse of tall structures, including cultural artifacts. Copyright © 2015, American Association for the Advancement of Science.

  15. Association of adverse childhood experiences with shaking and smothering behaviors among Japanese caregivers.

    PubMed

    Isumi, Aya; Fujiwara, Takeo

    2016-07-01

    Shaking and smothering in response to infant crying are life-threatening forms of child abuse. A parental history of childhood abuse is known to be one of the most robust risk factors for parents abusing their own offspring. In addition to childhood abuse history, other adverse childhood experiences (ACEs) need to be considered due to co-occurrence. However, few studies have investigated the impact of ACEs on caregivers' shaking and smothering of their infants. This study aims to investigate the association of ACEs with shaking and smothering among caregivers of infants in Japan. A questionnaire was administered to caregivers participating in a four-month health checkup between September 2013 and August 2014 in Chiba City, Japan, to assess their ACEs (parental death, parental divorce, mentally ill parents, witnessing of intimate partner violence (IPV), physical abuse, neglect, psychological abuse and economic hardship), and shaking and smothering toward their infants (N=4297). Logistic regression analysis was used to examine the cumulative and individual impacts of ACEs on shaking and smothering. Analyses were conducted in 2015. A total of 28.3% reported having experienced at least one ACE during their childhood. We found that only witnessing of IPV had a significant association with shaking of the infant (OR=1.93, 95% CI: 1.03-3.61). The total number of ACEs was not associated with either shaking or smothering. Our findings suggest that shaking and smothering in response to crying can occur regardless of ACEs. Population-based strategies that target all caregivers to prevent shaking and smothering of infants are needed. Copyright © 2016 Elsevier Ltd. All rights reserved.
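
    As background for the reported effect size, a minimal Python sketch of how an odds ratio and its 95% confidence interval (Woolf method) are computed from a 2x2 exposure-outcome table; the counts below are hypothetical, not the study's data:

      # Minimal sketch: odds ratio with 95% CI from a 2x2 table (hypothetical counts).
      import math

      a, b = 57, 1158    # witnessed IPV:     shook infant yes / no
      c, d = 77, 3005    # did not witness:   shook infant yes / no

      or_est = (a * d) / (b * c)
      se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
      ci_low = math.exp(math.log(or_est) - 1.96 * se_log_or)
      ci_high = math.exp(math.log(or_est) + 1.96 * se_log_or)
      print(f"OR = {or_est:.2f}, 95% CI: {ci_low:.2f}-{ci_high:.2f}")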

  16. Towards large scale modelling of wetland water dynamics in northern basins.

    NASA Astrophysics Data System (ADS)

    Pedinotti, V.; Sapriza, G.; Stone, L.; Davison, B.; Pietroniro, A.; Quinton, W. L.; Spence, C.; Wheater, H. S.

    2015-12-01

    Understanding the hydrological behaviour of low-topography, wetland-dominated sub-arctic areas is a major requirement for improving large-scale hydrological models. These wet organic soils cover a large extent of northern North America and have a considerable impact on the rainfall-runoff response of a catchment. Moreover, their strong interactions with the lower atmosphere and the carbon cycle make these areas a noteworthy component of the regional climate system. In the framework of the Changing Cold Regions Network (CCRN), this study aims to provide a model for wetland water dynamics that can be used for large-scale applications in cold regions. The modelling system has two main components: (a) the simulation of surface runoff using the Modélisation Environmentale Communautaire - Surface and Hydrology (MESH) land surface model driven with several gridded atmospheric datasets, and (b) the routing of surface runoff using the WATROUTE channel scheme. As a preliminary study, we focus on two small representative study basins in Northern Canada: Scotty Creek in the lower Liard River valley of the Northwest Territories and Baker Creek, located a few kilometers north of Yellowknife. Both areas present characteristic landscapes dominated by a series of peat plateaus, channel fens, small lakes and bogs. Moreover, they constitute important fieldwork sites with detailed data to support our modelling study. The challenge of our new wetland model is to represent the hydrological functioning of the various landscape units encountered in those watersheds and their interactions, using simple numerical formulations that can later be extended to larger basins such as the Mackenzie River basin. Using observed datasets, the ability of the model to simulate the temporal evolution of hydrological variables such as water table depth, frost table depth and discharge is assessed.

  17. Ring Shake in Eastern Hemlock: Frequency and Relationship to Tree Attributes

    Treesearch

    John E. Baumgras; Paul E. Sendak; David L. Sonderman; David L. Sonderman

    2000-01-01

    Ring shake is a barrier to improved utilization of eastern hemlock, an important component of the total softwood timber resource in the Eastern United States and Canada. Ring shake is the lengthwise separation of wood that occurs between and parallel to growth rings, diminishing lumber yields and values. Evaluating the potential for ring shake is essential to improving...

  18. Ring shake in eastern hemlock: frequency and relationship to tree attributes

    Treesearch

    John E. Baumgras; Paul E. Sendak; David L. Sonderman

    2000-01-01

    Ring shake is a barrier to improved utilization of eastern hemlock, an important component of the total softwood timber resource in the Eastern United States and Canada. Ring shake is the lengthwise separation of wood that occurs between and parallel to growth rings, diminishing lumber yields and values. Evaluating the potential for ring shake is essential to improving...

  19. Radial shakes and "frost cracks" in living oak trees

    Treesearch

    Heinz Butin; Alex L. Shigo

    1981-01-01

    Dissections of hundreds of living, mature oak trees over a 25-year period revealed that radial shakes (or "frost cracks") and ring shakes are associated with a variety of wounds and stubs of branches and basal sprouts. A more intensive study of radial shakes that included dissections of more than 30 oaks confirmed the earlier findings, and provided additional...

  20. High-Resolution Assimilation of GRACE Terrestrial Water Storage Observations to Represent Local-Scale Water Table Depths

    NASA Astrophysics Data System (ADS)

    Stampoulis, D.; Reager, J. T., II; David, C. H.; Famiglietti, J. S.; Andreadis, K.

    2017-12-01

    Despite the numerous advances in hydrologic modeling and improvements in Land Surface Models, an accurate representation of the water table depth (WTD) still does not exist. Data assimilation of observations from the joint NASA and DLR Gravity Recovery and Climate Experiment (GRACE) mission leads to statistically significant improvements in the accuracy of hydrologic models, ultimately resulting in more reliable estimates of water storage. However, the usually shallow groundwater compartment of the models presents a problem with GRACE assimilation techniques, as these satellite observations account for much deeper aquifers. To improve the accuracy of groundwater estimates and allow the representation of the WTD at fine spatial scales we implemented a novel approach that enables a large-scale data integration system to assimilate GRACE data. This was achieved by augmenting the Variable Infiltration Capacity (VIC) hydrologic model, which is the core component of the Regional Hydrologic Extremes Assessment System (RHEAS), a high-resolution modeling framework developed at the Jet Propulsion Laboratory (JPL) for hydrologic modeling and data assimilation. The model has insufficient subsurface characterization and therefore, to reproduce groundwater variability not only in shallow depths but also in deep aquifers, as well as to allow GRACE assimilation, a fourth soil layer of varying depth (~1000 meters) was added in VIC as the bottom layer. To initialize a water table in the model we used gridded global WTD data at 1 km resolution which were spatially aggregated to match the model's resolution. Simulations were then performed to test the augmented model's ability to capture seasonal and inter-annual trends of groundwater. The 4-layer version of VIC was run with and without assimilating GRACE Total Water Storage anomalies (TWSA) over the Central Valley in California. This is the first-ever assimilation of GRACE TWSA for the determination of realistic water table depths, at the fine scales that are required for local water management. In addition, Open Loop and GRACE-assimilation simulations of water table depth were compared to in-situ data over the state of California, derived from observation wells operated/maintained by the U.S. Geological Survey.
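
    The spatial aggregation step mentioned above (coarsening a ~1 km water-table-depth grid to the model resolution) can be illustrated with a minimal Python sketch; the grid sizes and block factor are hypothetical, not those of the RHEAS/VIC configuration:

      # Minimal sketch: block-average a fine water-table-depth grid to a coarser model grid.
      import numpy as np

      fine_wtd = np.random.default_rng(1).uniform(0.5, 200.0, size=(240, 240))  # depths in meters
      block = 24                                    # fine cells per coarse cell (hypothetical)
      coarse_wtd = fine_wtd.reshape(240 // block, block, 240 // block, block).mean(axis=(1, 3))
      print(coarse_wtd.shape)                       # (10, 10) coarse-grid water-table depths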

  1. An Oceanographic and Climatological Atlas of Bristol Bay

    DTIC Science & Technology

    1987-10-01

    Fragmentary search snippets from the atlas: table-of-contents entries (Forecasting Method; Superstructure Icing; Wind); "slicks and risk analysis to coastal regions were computed"; "general advection of oil by large-scale ice movement, and specific advection of oil by the ... tide"; "1) Fetch wind (speed and direction) from tables or other sources"; "a surface map analysis of pressure"; "Forecast time of highest range based on loss of ..."

  2. A Comparative Study of Handicap-Free Life Expectancy of China in 1987 and 2006

    ERIC Educational Resources Information Center

    Lai, Dejian

    2009-01-01

    After the first large scale national sampling survey on handicapped persons in 1987, China conducted its second national sampling survey in 2006. Using the data from these two surveys and the national life tables, we computed and compared the expected years of life free of handicapped condition by the Sullivan method. The expected years of life…
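
    The Sullivan method referred to above combines life-table person-years with handicap prevalence by age; a minimal Python sketch with hypothetical abridged life-table values (not the Chinese survey data) is:

      # Minimal sketch of the Sullivan method (hypothetical abridged life-table values).
      # Age groups: 0-14, 15-44, 45-64, 65-79, 80+ years.
      nLx = [1_480_000, 2_940_000, 1_870_000, 1_050_000, 350_000]  # person-years lived per group
      pi  = [0.005,     0.010,     0.040,     0.100,     0.250]    # handicap prevalence per group
      l0  = 100_000                                                 # survivors at age 0 (radix)

      life_expectancy = sum(nLx) / l0
      handicap_free_le = sum(L * (1 - p) for L, p in zip(nLx, pi)) / l0
      print(f"LE = {life_expectancy:.1f} y, handicap-free LE = {handicap_free_le:.1f} y")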

  3. VizieR Online Data Catalog: REFLEX Galaxy Cluster Survey catalogue (Boehringer+, 2004)

    NASA Astrophysics Data System (ADS)

    Boehringer, H.; Schuecker, P.; Guzzo, L.; Collins, C. A.; Voges, W.; Cruddace, R. G.; Ortiz-Gil, A.; Chincarini, G.; de Grandi, S.; Edge, A. C.; MacGillivray, H. T.; Neumann, D. M.; Schindler, S.; Shaver, P.

    2004-05-01

    The following tables provide the catalogue as well as several data files necessary to reproduce the sample preparation. These files are also required for the cosmological modeling of these observations in e.g. the study of the statistics of the large-scale structure of the matter distribution in the Universe and related cosmological tests. (13 data files).

  4. Biomass Production of Hairy Roots of Artemisia annua and Arachis hypogaea in a Scaled-Up Mist Bioreactor

    PubMed Central

    Sivakumar, Ganapathy; Liu, Chunzhao; Towler, Melissa J.

    2014-01-01

    Hairy roots have the potential to produce a variety of valuable small and large molecules. The mist reactor is a gas phase bioreactor that has shown promise for low-cost culture of hairy roots. Using a newer, disposable culture bag, mist reactor performance was studied with two species, Artemisia annua L. and Arachis hypogaea (peanut), at scales from 1 to 20 L. Both species of hairy roots when grown at 1 L in the mist reactor showed growth rates that surpassed those in shake flasks. From the information gleaned at 1 L, Arachis was scaled further to 4 and then 20 L. Misting duty cycle, culture medium flow rate, and timing of when flow rate was increased were varied. In a mist reactor, increasing the misting cycle or increasing the medium flow rate are the two alternatives for increased delivery of liquid nutrients to the root bed. Longer misting cycles beyond 2–3 min were generally deemed detrimental to growth. On the other hand, increasing the medium flow rate to the sonic nozzle, especially during the exponential phase of root growth (weeks 2–3), was the most important factor for increasing growth rates and biomass yields in the 20 L reactors. A. hypogaea growth in 1 L reactors was μ = 0.173 day⁻¹ with a biomass yield of 12.75 g DW L⁻¹. This exceeded that in shake flasks at μ = 0.166 day⁻¹ and 11.10 g DW L⁻¹. The best growth rate and biomass yield at 20 L were μ = 0.147 day⁻¹ and 7.77 g DW L⁻¹, which were mainly achieved when medium flow rate delivery was increased. The mist deposition model was further evaluated using this newer reactor design and, when the apparent thickness of roots (+hairs) was taken into account, the empirical data correlated with model predictions. Together these results establish the most important conditions to explore for future optimization of the mist bioreactor for culture of hairy roots. PMID:20687140
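
    The specific growth rates quoted above follow from two biomass measurements under an exponential-growth assumption; a minimal Python sketch (the initial density and culture duration are hypothetical values chosen only to reproduce the reported rate) is:

      # Minimal sketch: specific growth rate from two dry-weight measurements (exponential growth).
      import math

      x0, x1 = 0.6, 12.75      # g DW per liter at the start and end of culture (x0 is hypothetical)
      days = 17.7              # culture duration in days (hypothetical)
      mu = math.log(x1 / x0) / days
      print(f"mu = {mu:.3f} per day")   # ~0.173 per day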

  5. Real-time Shakemap implementation in Austria

    NASA Astrophysics Data System (ADS)

    Weginger, Stefan; Jia, Yan; Papi Isaba, Maria; Horn, Nikolaus

    2017-04-01

    ShakeMaps provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. They are automatically generated within a few minutes of an earthquake's occurrence. We tested and integrated the Python-based USGS ShakeMap 4.0 (experimental code) into the Antelope real-time system, with locally modified GMPEs and site effects based on conditions in Austria. The ShakeMaps are provided in terms of intensity, PGA, PGV and PSA. Future work includes the presentation of ShakeMap contour lines and ground-motion parameters on interactive maps, with data exchange via web services.

  6. Ground motions from the 2015 Mw 7.8 Gorkha, Nepal, earthquake constrained by a detailed assessment of macroseismic data

    USGS Publications Warehouse

    Martin, Stacey; Hough, Susan E.; Hung, Charleen

    2015-01-01

    To augment limited instrumental recordings of the Mw 7.8 Gorkha, Nepal, earthquake on 25 April 2015 (Nepali calendar: 12 Baisakh 2072, Bikram Samvat), we collected 3831 detailed media and first-person accounts of macroseismic effects that include sufficiently detailed information to assign intensities. The resulting intensity map reveals the distribution of shaking within and outside of Nepal, with the key result that shaking intensities throughout the near-field region only exceeded intensity 8 on the 1998 European Macroseismic Scale (EMS-98) in rare instances. Within the Kathmandu Valley, intensities were generally 6–7 EMS. This surprising (and fortunate) result can be explained by the nature of the mainshock ground motions, which were dominated by energy at periods significantly longer than the resonant periods of vernacular structures throughout the Kathmandu Valley. Outside of the Kathmandu Valley, intensities were also generally lower than 8 EMS, but the earthquake took a heavy toll on a number of remote villages, where many especially vulnerable masonry houses collapsed catastrophically in 7–8 EMS shaking. We further reconsider intensities from the 1833 earthquake sequence and conclude that it occurred on the same fault segment as the Gorkha earthquake.

  7. Acetification of rice wine by Acetobacter aceti using loofa sponge in a low-cost reciprocating shaker.

    PubMed

    Krusong, W; Tantratian, S

    2014-11-01

    To maximize acetification rate (ETA) by adsorption of acetic acid bacteria (AAB) on loofa sponge matrices (LSM). AAB were adsorbed on LSM, and the optimal shaking rate was determined for maximized AAB growth and oxygen availability. Results confirm that the 1 Hz reciprocating shaking rate with 40% working volume (liquid volume 24 l, tank volume 60 l) achieved a high oxygen transfer coefficient (kLa). The highest ETA was obtained at 50% (w:v) LSM-AAB:culture medium at 30 ± 2°C (P ≤ 0.05). To test process consistency, nine sequential acetification cycles were run using LSM-AAB and compared with no LSM. The highest ETA (1.701-2.401 g l⁻¹ d⁻¹) was with LSM-AAB and was associated with the highest biomass of AAB, confirmed by SEM images. Results confirm that LSM works well as an inert substrate for AAB. High oxygenation was maintained by a reciprocating shaker. Both shaking and LSM were important in increasing ETA. High cell biomass in LSM-AAB provides good conditions for higher ETAs of quick acetification under adequate oxygen transfer by a reciprocating shaker. It is a sustainable process for a small-scale vinegar production system requiring minimal set-up cost. © 2014 The Society for Applied Microbiology.
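
    The acetification rate (ETA) reported above is simply the acid produced per unit volume per day; a minimal Python sketch with hypothetical start and end acidities is:

      # Minimal sketch: acetification rate from total acidity at the start and end of a cycle.
      acid_start, acid_end = 10.0, 58.0   # g/l total acidity (hypothetical)
      cycle_days = 20.0                   # cycle length in days (hypothetical)
      eta = (acid_end - acid_start) / cycle_days
      print(f"ETA = {eta:.2f} g/l per day")   # 2.40 g/l per day, within the reported range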

  8. Evaluation of cysteine ethyl ester as efficient inducer for glutathione overproduction in Saccharomyces spp.

    PubMed

    Lorenz, Eric; Schmacht, Maximilian; Senz, Martin

    2016-11-01

    Economical yeast-based glutathione (GSH) production is influenced by several factors, such as raw material and production costs, biomass production and efficient biotransformation of adequate precursors into the final product GSH. Nowadays the use of cysteine for the microbial conversion into GSH is the industrial state of practice. In the following study, the potential of different inducers to increase the GSH content was evaluated by means of a design-of-experiments methodology. Investigations were carried out in three natural Saccharomyces strains, S. cerevisiae, S. bayanus and S. boulardii, in a well-suited 50 ml shake-tube system. Results of the shake-tube experiments were confirmed in traditional baffled shake flasks and finally via batch cultivation in lab-scale bioreactors under controlled conditions. Comprehensive studies showed that the use of cysteine ethyl ester (CEE) for the batch-wise biotransformation into GSH led to a yield more than 2.2 times higher than with cysteine as the inducer. Additionally, the intracellular GSH content in bioreactors increased significantly for all strains, from 2.29±0.29% with cysteine to 3.65±0.23% with CEE. Thus, the use of CEE provides a highly attractive induction strategy for GSH overproduction. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Why Colleges Can't Shake the Feds

    ERIC Educational Resources Information Center

    Fain, Paul

    2008-01-01

    This article reports that Congress is cranky about how colleges spend money. Over the last three years, regulation-minded lawmakers have investigated university endowments, intercollegiate athletics, and presidential pay, but that grilling has largely ceased. A presidential election has dulled legislative ambitions, and Congress has its hands full…

  10. Reevaluation of the macroseismic effects of the 1887 Sonora, Mexico earthquake and its magnitude estimation

    USGS Publications Warehouse

    Suárez, Gerardo; Hough, Susan E.

    2008-01-01

    The Sonora, Mexico, earthquake of 3 May 1887 occurred a few years before the start of the instrumental era in seismology. We revisit all available accounts of the earthquake and assign Modified Mercalli Intensities (MMI), interpreting and analyzing macroseismic information using the best available modern methods. We find that earlier intensity assignments for this important earthquake were unjustifiably high in many cases. High intensity values were assigned based on accounts of rock falls, soil failure or changes in the water table, which are now known to be very poor indicators of shaking severity and intensity. Nonetheless, reliable accounts reveal that light damage (intensity VI) occurred at distances of up to ~200 km in both Mexico and the United States. The resulting set of 98 reevaluated intensity values is used to draw an isoseismal map of this event. Using the attenuation relation proposed by Bakun (2006b), we estimate an optimal moment magnitude of Mw7.6. Assuming this magnitude is correct, a fact supported independently by documented rupture parameters assuming standard scaling relations, our results support the conclusion that northern Sonora as well as the Basin and Range province are characterized by lower attenuation of intensities than California. However, this appears to be at odds with recent results that Lg attenuation in the Basin and Range province is comparable to that in California.

  11. Evaluating the value of ENVISAT ASAR Data for the mapping and monitoring of peatland water table depths

    NASA Astrophysics Data System (ADS)

    Bechtold, Michel; Schlaffer, Stefan

    2015-04-01

    The Advanced Synthetic Aperture Radar (ASAR) onboard ENVISAT collected C-Band microwave backscatter data from 2005 to 2012. Backscatter in the C-Band depends to a large degree on the roughness and the moisture status of vegetation and soil surface with a penetration depth of ca. 3 cm. In wetlands with stable high water levels, the annual soil surface moisture dynamics are very distinct compared to the surrounding areas, which allows the monitoring of such environments with ASAR data (Reschke et al. 2012). Also in drained peatlands, moisture status of vegetation and soil surface strongly depends on water table depth due to high hydraulic conductivities of many peat soils in the low suction range (Dettmann et al. 2014). We hypothesize that this allows the characterization of water table depths with ASAR data. Here we analyze whether ASAR data can be used for the spatial and temporal estimation of water table depths in different peatlands (natural, near-natural, agriculturally-used and rewetted). Mapping and monitoring of water table depths is of crucial importance, e.g. for upscaling greenhouse gas emissions and evaluating the success of peatland rewetting projects. Here, ASAR data is analyzed with a new map of water table depths for the organic soils in Germany (Bechtold et al. 2014) as well as with a comprehensive data set of monitored peatland water levels from 1100 dip wells and 54 peatlands. ASAR time series from the years 2005-2012 with irregular temporal sampling intervals of 3-14 days were processed. Areas covered by snow were masked. Primary results about the accuracy of spatial estimates show significant correlations between long-term backscatter statistics and spatially-averaged water table depths extracted from the map at the resolution of the ASAR data. Backscatter also correlates with long-term averages of point-scale water table depth data of the monitoring wells. For the latter, correlation is highest between the dry reference backscatter values and summer mean water table depth. Using the boosted regression tree model of Bechtold et al., we evaluate whether the ASAR data can improve prediction accuracy and/or replace parts of ancillary data that is often not available in other countries. In the temporal domain primary results often show a better dependency between backscatter and water table depths compared to the spatial domain. For a variety of vegetation covers the temporal monitoring potential of ASAR data is evaluated at the level of annual water table depth statistics. Bechtold, M., Tiemeyer, B., Laggner, A., Leppelt, T., Frahm, E., and Belting, S., 2014. Large-scale regionalization of water table depth in peatlands optimized for greenhouse gas emission upscaling, Hydrol. Earth Syst. Sci., 18, 3319-3339. Dettmann, U., Bechtold, M., Frahm, E., Tiemeyer, B., 2014. On the applicability of unimodal and bimodal van Genuchten-Mualem based models to peat and other organic soils under evaporation conditions. Journal of Hydrology, 515, 103-115. Reschke, J., Bartsch, A., Schlaffer, S., Schepaschenko, D., 2012. Capability of C-Band SAR for Operational Wetland Monitoring at High Latitudes. Remote Sens. 4, 2923-2943.

  12. Correlation between mass transfer coefficient kLa and relevant operating parameters in cylindrical disposable shaken bioreactors on a bench-to-pilot scale

    PubMed Central

    2013-01-01

    Background Among disposable bioreactor systems, cylindrical orbitally shaken bioreactors show important advantages. They provide a well-defined hydrodynamic flow combined with excellent mixing and oxygen transfer for mammalian and plant cell cultivations. Since there is no known universal correlation between the volumetric mass transfer coefficient for oxygen kLa and relevant operating parameters in such bioreactor systems, the aim of this current study is to experimentally determine a universal kLa correlation. Results A Respiration Activity Monitoring System (RAMOS) was used to measure kLa values in cylindrical disposable shaken bioreactors and Buckingham’s π-Theorem was applied to define a dimensionless equation for kLa. In this way, a scale- and volume-independent kLa correlation was developed and validated in bioreactors with volumes from 2 L to 200 L. The final correlation was used to calculate cultivation parameters at different scales to allow a sufficient oxygen supply of tobacco BY-2 cell suspension cultures. Conclusion The resulting equation can be universally applied to calculate the mass transfer coefficient for any of seven relevant cultivation parameters such as the reactor diameter, the shaking frequency, the filling volume, the viscosity, the oxygen diffusion coefficient, the gravitational acceleration or the shaking diameter within an accuracy range of +/− 30%. To our knowledge, this is the first kLa correlation that has been defined and validated for the cited bioreactor system on a bench-to-pilot scale. PMID:24289110

  13. Correlation between mass transfer coefficient kLa and relevant operating parameters in cylindrical disposable shaken bioreactors on a bench-to-pilot scale.

    PubMed

    Klöckner, Wolf; Gacem, Riad; Anderlei, Tibor; Raven, Nicole; Schillberg, Stefan; Lattermann, Clemens; Büchs, Jochen

    2013-12-02

    Among disposable bioreactor systems, cylindrical orbitally shaken bioreactors show important advantages. They provide a well-defined hydrodynamic flow combined with excellent mixing and oxygen transfer for mammalian and plant cell cultivations. Since there is no known universal correlation between the volumetric mass transfer coefficient for oxygen kLa and relevant operating parameters in such bioreactor systems, the aim of this current study is to experimentally determine a universal kLa correlation. A Respiration Activity Monitoring System (RAMOS) was used to measure kLa values in cylindrical disposable shaken bioreactors and Buckingham's π-Theorem was applied to define a dimensionless equation for kLa. In this way, a scale- and volume-independent kLa correlation was developed and validated in bioreactors with volumes from 2 L to 200 L. The final correlation was used to calculate cultivation parameters at different scales to allow a sufficient oxygen supply of tobacco BY-2 cell suspension cultures. The resulting equation can be universally applied to calculate the mass transfer coefficient for any of seven relevant cultivation parameters such as the reactor diameter, the shaking frequency, the filling volume, the viscosity, the oxygen diffusion coefficient, the gravitational acceleration or the shaking diameter within an accuracy range of +/- 30%. To our knowledge, this is the first kLa correlation that has been defined and validated for the cited bioreactor system on a bench-to-pilot scale.
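
    The published correlation's coefficients are not given in the abstract, but the general form of such a dimensionless kLa correlation can be sketched in Python; the constant and exponents below are hypothetical placeholders, and the dimensionless groups are one plausible choice, not necessarily those used by the authors:

      # Minimal sketch: dimensionless power-law correlation for kLa in a shaken bioreactor.
      # Constant C and exponents a, b, c, e are hypothetical placeholders.
      import math

      def kla_estimate(d, n, V_L, nu, D_O2, g, d0, C=1e-3, a=0.8, b=0.2, c=0.5, e=-0.3):
          """d: vessel diameter [m], n: shaking frequency [1/s], V_L: filling volume [m^3],
          nu: kinematic viscosity [m^2/s], D_O2: oxygen diffusivity [m^2/s],
          g: gravitational acceleration [m/s^2], d0: shaking diameter [m]."""
          Re = n * d**2 / nu                        # shaking Reynolds number
          Fr = (2 * math.pi * n)**2 * d0 / g        # Froude number
          Sc = nu / D_O2                            # Schmidt number
          fill = V_L / d**3                         # dimensionless filling-volume ratio
          Sh = C * Re**a * Fr**b * Sc**c * fill**e  # Sherwood-type group
          return Sh * D_O2 / d**2                   # kLa in 1/s

      print(kla_estimate(d=0.3, n=2.0, V_L=0.01, nu=1e-6, D_O2=2e-9, g=9.81, d0=0.05))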

  14. An atlas of ShakeMaps for selected global earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  15. Combining Multiple Rupture Models in Real-Time for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Wu, S.; Beck, J. L.; Heaton, T. H.

    2015-12-01

    The ShakeAlert earthquake early warning system for the west coast of the United States is designed to combine information from multiple independent earthquake analysis algorithms in order to provide the public with robust predictions of shaking intensity at each user's location before they are affected by strong shaking. The current contributing analyses come from algorithms that determine the origin time, epicenter, and magnitude of an earthquake (On-site, ElarmS, and Virtual Seismologist). A second generation of algorithms will provide seismic line source information (FinDer), as well as geodetically-constrained slip models (BEFORES, GPSlip, G-larmS, G-FAST). These new algorithms will provide more information about the spatial extent of the earthquake rupture and thus improve the quality of the resulting shaking forecasts. Each of the contributing algorithms exploits different features of the observed seismic and geodetic data, and thus each algorithm may perform differently for different data availability and earthquake source characteristics. Thus the ShakeAlert system requires a central mediator, called the Central Decision Module (CDM). The CDM acts to combine disparate earthquake source information into one unified shaking forecast. Here we will present a new design for the CDM that uses a Bayesian framework to combine earthquake reports from multiple analysis algorithms and compares them to observed shaking information in order to both assess the relative plausibility of each earthquake report and to create an improved unified shaking forecast complete with appropriate uncertainties. We will describe how these probabilistic shaking forecasts can be used to provide each user with a personalized decision-making tool that can help decide whether or not to take a protective action (such as opening fire house doors or stopping trains) based on that user's distance to the earthquake, vulnerability to shaking, false alarm tolerance, and time required to act.
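
    As a much-simplified illustration of combining independent source estimates into one forecast (this is not the actual CDM algorithm, and the values are hypothetical), a minimal Python sketch of inverse-variance fusion of magnitude reports under a Gaussian assumption:

      # Minimal sketch: fuse independent magnitude estimates by inverse-variance weighting.
      estimates = [(6.8, 0.4), (7.1, 0.3), (6.9, 0.5)]   # hypothetical (magnitude, sigma) per algorithm

      weights = [1.0 / sigma**2 for _, sigma in estimates]
      combined_m = sum(m * w for (m, _), w in zip(estimates, weights)) / sum(weights)
      combined_sigma = (1.0 / sum(weights)) ** 0.5
      print(f"combined M = {combined_m:.2f} +/- {combined_sigma:.2f}")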

  16. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk is high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best available science.

  17. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes - such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow to formulate cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case of PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) to an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveforms modeling. From the synthetic seismograms the estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and it does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and it is applied to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. multi-scale seismicity model), so that a recurrence estimate is associated to each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  18. Process optimization involving critical evaluation of oxygen transfer, oxygen uptake and nitrogen limitation for enhanced biomass and lipid production by oleaginous yeast for biofuel application.

    PubMed

    Chopra, Jayita; Sen, Ramkrishna

    2018-04-20

    Lipid accumulation in oleaginous yeast is generally induced by nitrogen starvation, while oxygen saturation can influence biomass growth. Systematic shake flask studies that help in identifying the right nitrogen source and relate its uptake kinetics to lipid biosynthesis under varying oxygen saturation conditions are essential for addressing the bioprocessing-related issues, which are envisaged to occur in fermenter-scale production. In the present study, lipid bioaccumulation by P. guilliermondii at varying C:N ratios and oxygen transfer conditions (assessed in terms of kLa) was investigated in shake flasks using a pre-optimized N-source and a two-stage inoculum formulated in a hybrid medium. A maximum lipid concentration of 10.8 ± 0.5 g L⁻¹ was obtained in the shake flask study at the optimal condition, with an initial C:N and kLa of 60:1 and 0.6 min⁻¹, respectively, at a biomass specific growth rate of 0.11 h⁻¹. Translating these optimal shake flask conditions to a 3.7 L stirred tank reactor resulted in biomass and lipid concentrations of 16.74 ± 0.8 and 8 ± 0.4 g L⁻¹. The fatty acid methyl ester (FAME) profile of lipids obtained by gas chromatography was found to be suitable for biodiesel application. We strongly believe that the rationalistic approach-based design of experiments adopted in the study would help in achieving high cell density with improved lipid accumulation and also minimize the efforts towards process optimization during bioreactor level operations, consequently reducing the research and development-associated costs.

  19. Does knowledge signify protection? The SEISMOPOLIS centre for improvement of behavior in case of an earthquake

    NASA Astrophysics Data System (ADS)

    Dandoulaki, M.; Kourou, A.; Panoutsopoulou, M.

    2009-04-01

    It is widely accepted that earthquake education is the way to earthquake protection. Nonetheless, experience demonstrates that knowing what to do does not necessarily result in better behaviour in case of a real earthquake. A research project titled "Seismopolis - Pilot Integrated System for Public Familiarization with Earthquakes and Information on Earthquake Protection" aimed to improve people's behaviour through an appropriate amalgamation of knowledge transfer and virtually experiencing an earthquake situation. Seismopolis combines well-established educational means such as books and leaflets with new technologies like earthquake simulation and virtual reality. It comprises a series of 5 main spaces that the visitor passes one-by-one. Space 1. Reception and introductory information: Visitors are given fundamental information on earthquakes and earthquake protection, as well as on the appropriate behaviour in case of an earthquake. Space 2. Earthquake simulation room: Visitors experience an earthquake in a room. A typical kitchen is set on a shake table area (3 m x 6 m planar triaxial shake table) and is shaken in both horizontal and vertical directions by introducing seismograms of real or virtual earthquakes. Space 3. Virtual reality room: Visitors may have the opportunity to virtually move around in the building or in the city after an earthquake disaster and take action as in a real-life situation, wearing stereoscopic glasses and using navigation tools. Space 4. Information and resources library: Visitors are offered the opportunity to know more about earthquake protection. A series of means are available for this, some developed especially for Seismopolis (3 books, 2 CDs, a website and an interactive table game). Space 5. De-briefing area: Visitors may be subjected to a pedagogical and psychological evaluation at the end of their visit and offered support if needed. For the evaluation of the "Seismopolis" Centre, a pilot application of the complete complex took place with the participation of different groups (schoolchildren, university students, adults, elderly persons, emigrants and persons with special needs). This test period recorded positive impressions and reactions from the visitors and indicated the pedagogical and psychological appropriateness of the system. Seismopolis is the outcome of collaboration between public, academic and private partners and of a range of disciplines, namely seismologists, geologists, structural engineers, geographers, sociologists and psychologists. It is currently hosted by the Municipality of Rendis in Athens. More information on Seismopolis can be found at www.seismopolis.org.

  20. Wood Shakes and Shingles for Roof Applications: Tips for Longer Life

    Treesearch

    Mark T. Knaebe

    2013-01-01

    Many wood shakes and shingles have been replaced by composition or asphalt-based shingles. Nevertheless, wood shakes and shingles are still widely used on commercial structures and residential houses.

  1. Seismic hazard of American Samoa and neighboring South Pacific Islands--methods, data, parameters, and results

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen C.; Rukstales, Kenneth S.; Mueller, Charles S.; McNamara, Daniel E.; Luco, Nicolas; Walling, Melanie

    2012-01-01

    American Samoa and the neighboring islands of the South Pacific lie near active tectonic-plate boundaries that host many large earthquakes which can result in strong earthquake shaking and tsunamis. To mitigate earthquake risks from future ground shaking, the Federal Emergency Management Agency requested that the U.S. Geological Survey prepare seismic hazard maps that can be applied in building-design criteria. This Open-File Report describes the data, methods, and parameters used to calculate the seismic shaking hazard as well as the output hazard maps, curves, and deaggregation (disaggregation) information needed for building design. Spectral acceleration hazard for 1 Hertz having a 2-percent probability of exceedance on a firm rock site condition (Vs30=760 meters per second) is 0.12 acceleration of gravity (1 second, 1 Hertz) and 0.32 acceleration of gravity (0.2 seconds, 5 Hertz) on American Samoa, 0.72 acceleration of gravity (1 Hertz) and 2.54 acceleration of gravity (5 Hertz) on Tonga, 0.15 acceleration of gravity (1 Hertz) and 0.55 acceleration of gravity (5 Hertz) on Fiji, and 0.89 acceleration of gravity (1 Hertz) and 2.77 acceleration of gravity (5 Hertz) on the Vanuatu Islands.

  2. Detection on vehicle vibration induced by the engine shaking based on the laser triangulation

    NASA Astrophysics Data System (ADS)

    Chen, Wenxue; Yang, Biwu; Ni, Zhibin; Hu, Xinhan; Han, Tieqiang; Hu, Yaocheng; Zhang, Wu; Wang, Yunfeng

    2017-10-01

    The magnitude of engine shaking is chosen as an indicator of vehicle performance, and the engine shaking is itself evaluated through the vehicle vibration. Based on laser triangulation, the vehicle vibration is measured by detecting the variation in distance between the bodywork and the road surface, and the results represent the magnitude of engine shaking. The principle and configuration of the laser triangulation system are also introduced in this paper.
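    The abstract does not spell out the sensor geometry. For a simple parallel-axis triangulation arrangement (an illustrative assumption, not necessarily the configuration used in the paper), the distance to the bodywork follows from similar triangles, and small distance variations map to shifts of the imaged laser spot:

```latex
% Illustrative parallel-axis laser triangulation relation: D is the distance to
% the target, b the laser-to-lens baseline, f the lens focal length, and x the
% position of the imaged spot on the detector.
\[
  D = \frac{f\,b}{x},
  \qquad
  \Delta D \approx -\frac{f\,b}{x^{2}}\,\Delta x = -\frac{D^{2}}{f\,b}\,\Delta x .
\]
```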

  3. Process development for the production of 15β-hydroxycyproterone acetate using Bacillus megaterium expressing CYP106A2 as whole-cell biocatalyst.

    PubMed

    Kiss, Flora M; Lundemo, Marie T; Zapp, Josef; Woodley, John M; Bernhardt, Rita

    2015-03-05

    CYP106A2 from Bacillus megaterium ATCC 13368 was first identified as a regio- and stereoselective 15β-hydroxylase of 3-oxo-∆4-steroids. Recently, it was shown that besides 3-oxo-∆4-steroids, 3-hydroxy-∆5-steroids as well as di- and triterpenes can also serve as substrates for this biocatalyst. It is highly selective towards the 15β position, but the 6β, 7α/β, 9α, 11α and 15α positions have also been described as targets for hydroxylation. Based on its broad substrate spectrum and hydroxylating capacity, it is an excellent candidate for the production of human drug metabolites and drug precursors. In this work, we demonstrate the conversion of a synthetic testosterone derivative, cyproterone acetate, by CYP106A2 under in vitro and in vivo conditions. Using a Bacillus megaterium whole-cell system overexpressing CYP106A2, sufficient amounts of product for structure elucidation by nuclear magnetic resonance spectroscopy were obtained. The product was characterized as 15β-hydroxycyproterone acetate, the main human metabolite. Since the product is of pharmaceutical interest, our aim was to intensify the process by increasing the substrate concentration and to scale up the reaction from shake flasks to bioreactors to demonstrate an efficient, yet green and cost-effective production. Using a bench-top bioreactor and the recombinant Bacillus megaterium system, both a fermentation and a transformation process were successfully implemented. To improve the yield and product titers for future industrial application, the main bottlenecks of the reaction were addressed. Using 2-hydroxypropyl-β-cyclodextrin, an effective bioconversion of 98% was achieved at a 1 mM substrate concentration, corresponding to a product formation of 0.43 g/L, at a 400 mL scale. Here we describe the successful scale-up of cyproterone acetate conversion from shake flasks to bioreactors, using the CYP106A2 enzyme in a whole-cell system. The substrate was converted to its main human metabolite, 15β-hydroxycyproterone acetate, a highly interesting drug candidate due to its retained antiandrogen activity but significantly lower progestogenic properties than the parent compound. Optimization of the process led to an improvement from 55% to 98% overall conversion, with a product formation of 0.43 g/L, approaching industrial process requirements and a future large-scale application.
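    As a rough consistency check (not taken from the paper), the reported 0.43 g/L product titre agrees with 98% conversion of a 1 mM substrate if the product's molar mass is approximated as that of cyproterone acetate (about 416.9 g/mol) plus one oxygen atom:

```python
# Hypothetical mass-balance check; the molar masses are approximate values,
# not figures quoted in the paper.
mw_product = 416.9 + 16.0          # ~432.9 g/mol for the hydroxylated product
substrate_molar = 1.0e-3           # 1 mM substrate, in mol/L
conversion = 0.98

titre_g_per_L = substrate_molar * conversion * mw_product
print(f"expected product titre ~ {titre_g_per_L:.2f} g/L")   # ~0.42 g/L
```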

  4. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor

    NASA Astrophysics Data System (ADS)

    Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim

    2016-03-01

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational-specific protein-DNA interactions.

  5. Deformation from the 1989 Loma Prieta earthquake near the southwest margin of the Santa Clara Valley, California

    USGS Publications Warehouse

    Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.

    2014-01-01

    To gain additional measurements of any permanent ground deformation that accompanied this damage, we compiled and conducted post-earthquake surveys along two 5-km lines of horizontal control and a 15-km level line. Measurements of horizontal distortion indicate approximately 0.1 m of shortening in a NE-SW direction across the valley margin, similar to the amount measured in the channel lining. Evaluation of precise leveling by the National Geodetic Survey showed a downwarp, with an amplitude of >0.1 m over a span of >12 km, that resembled regional geodetic models of coseismic deformation. Although the leveling indicates broad, regional warping, abrupt discontinuities typical of faulting are evident in both the broad-scale distribution of damage and the local deformation of the channel lining. Reverse movement, largely along preexisting faults and probably enhanced significantly by warping combined with amplified ground shaking, produced the documented coseismic ground deformation.

  6. KSC-2009-2396

    NASA Image and Video Library

    2009-03-28

    CAPE CANAVERAL, Fla. – STS-119 Commander Lee Archambault shakes hands with NASA Acting Administrator Chris Scolese as NASA Deputy Associate Administrator Charles Scales, left, also prepares to welcome him home. Pilot Tony Antonelli approaches the group, at right. Space shuttle Discovery’s landing completed the 13-day, 5.3-million mile journey of the STS-119 mission to the International Space Station. Main gear touchdown was at 3:13:17 p.m. EDT. Nose gear touchdown was at 3:13:40 p.m. and wheels stop was at 3:14:45 p.m. Discovery delivered the final pair of large power-generating solar array wings and the S6 truss segment. The mission was the 28th flight to the station, the 36th flight of Discovery and the 125th in the Space Shuttle Program, as well as the 70th landing at Kennedy. Photo credit: NASA/Kim Shiflett

  7. Estimation of hectare-scale soil-moisture characteristics from aquifer-test data

    USGS Publications Warehouse

    Moench, A.F.

    2003-01-01

    Analysis of a 72-h, constant-rate aquifer test conducted in a coarse-grained and highly permeable glacial outwash deposit on Cape Cod, Massachusetts, revealed that drawdowns measured in 20 piezometers located at various depths below the water table and distances from the pumped well were significantly influenced by effects of drainage from the vadose zone. The influence was greatest in piezometers located close to the water table and diminished with increasing depth. The influence of the vadose zone was evident from a gap, in the intermediate-time zone, between measured drawdowns and drawdowns computed under the assumption that drainage from the vadose zone occurred instantaneously in response to a decline in the elevation of the water table. By means of an analytical model that was designed to account for time-varying drainage, simulated drawdowns could be closely fitted to measured drawdowns regardless of the piezometer locations. Because of the exceptional quality and quantity of the data and the relatively small aquifer heterogeneity, it was possible by inverse modeling to estimate all relevant aquifer parameters and a set of three empirical constants used in the upper-boundary condition to account for the dynamic drainage process. The empirical constants were used to define a one-dimensional (1D) drainage versus time curve that is assumed to be representative of the bulk material overlying the water table. The curve was inverted with a parameter estimation algorithm and a 1D numerical model for variably saturated flow to obtain soil-moisture retention curves and unsaturated hydraulic conductivity relationships defined by the Brooks and Corey equations. Direct analysis of the aquifer-test data using a parameter estimation algorithm and a two-dimensional, axisymmetric numerical model for variably saturated flow yielded similar soil-moisture characteristics. Results suggest that hectare-scale soil-moisture characteristics are different from core-scale predictions and that even relatively small amounts of fine-grained material and heterogeneity can dominate the large-scale soil-moisture characteristics and aquifer response. © 2003 Elsevier B.V. All rights reserved.
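    The Brooks and Corey relationships mentioned above are not reproduced in the abstract; in their common form (the exact parameterization used in the paper may differ) they read:

```latex
% Standard Brooks and Corey (1964) relations: S_e effective saturation, \theta
% volumetric water content with residual \theta_r and saturated \theta_s values,
% \psi suction head with air-entry value \psi_b, \lambda pore-size-distribution
% index, K_s saturated hydraulic conductivity.
\[
  S_e = \frac{\theta - \theta_r}{\theta_s - \theta_r}
      = \left(\frac{\psi_b}{\psi}\right)^{\lambda} \quad (\psi > \psi_b),
  \qquad S_e = 1 \quad (\psi \le \psi_b),
\]
\[
  K(S_e) = K_s \, S_e^{\,3 + 2/\lambda}.
\]
```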

  8. ShakeMap manual: technical manual, user's guide, and software guide

    USGS Publications Warehouse

    Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.

    2005-01-01

    ShakeMap (http://earthquake.usgs.gov/shakemap)--rapidly, automatically generated shaking and intensity maps--combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap and its associated Web and data products are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but necessary changes to operational systems will inevitably take precedence over producing and publishing documentation.

  9. The Movement to Transform High School. Forum Report

    ERIC Educational Resources Information Center

    Frey, Susan

    2005-01-01

    Although society has changed exponentially over the past 100 years, secondary schools have remained largely static, according to Gerald Hayward, who moderated EdSource's 28th Annual Forum, "Shaking up the Status Quo: The Movement to Transform High School," held in March 2005. Calling high schools difficult, complicated, and expensive,…

  10. Scale translation from shaken to diffused bubble aerated systems for lycopene production by Blakeslea trispora under stimulated conditions.

    PubMed

    Mantzouridou, Fani Th; Naziri, Eleni

    2017-03-01

    This study deals with the scale-up of Blakeslea trispora culture from successful surface-aerated shake flasks to a dispersed-bubble aerated column reactor for lycopene production in the presence of the lycopene cyclase inhibitor 2-methyl imidazole. Controlling the initial volumetric oxygen mass transfer coefficient (kLa) via the airflow rate contributes to increasing cell mass and lycopene accumulation. Inhibitor effectiveness seems to decrease under conditions of high cell mass. Optimization of crude soybean oil (CSO), airflow rate, and 2-methyl imidazole was arranged according to a central composite statistical design. The optimized levels of the factors were 110.5 g/L, 2.3 vvm, and 29.5 mg/L, respectively. At this optimum setting, the maximum lycopene yield (256 mg/L) was comparable to, or even higher than, those reported in shake flasks and stirred tank reactors. The use of 2-methyl imidazole at levels significantly lower than those reported for other inhibitors in the literature was successful in terms of process selectivity. CSO provides economic benefits to the process through its ability to stimulate lycopene synthesis, serving as an inexpensive carbon source and oxygen vector at the same time.
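    The abstract does not describe how kLa was measured; as a generic sketch, the dynamic gassing-out method estimates kLa from the re-aeration of dissolved oxygen, since dC/dt = kLa (C* - C) implies that ln(C* - C) decays linearly in time. The readings below are invented for illustration, not data from the paper.

```python
# Minimal sketch of a dynamic gassing-out estimate of kLa (illustrative data).
import numpy as np

c_star = 7.5                                           # assumed DO saturation, mg/L
t_s = np.array([0, 30, 60, 90, 120, 150, 180], float)  # time since aeration resumed, s
do = np.array([0.5, 2.6, 4.1, 5.1, 5.9, 6.4, 6.7])     # measured dissolved oxygen, mg/L

y = np.log(c_star - do)          # linearized re-aeration curve
kla = -np.polyfit(t_s, y, 1)[0]  # slope magnitude gives kLa in 1/s
print(f"kLa ~ {kla * 3600:.0f} per hour")
```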

  11. Vascular and inflammatory high fat meal responses in young healthy men; a discriminative role of IL-8 observed in a randomized trial.

    PubMed

    Esser, Diederik; Oosterink, Els; op 't Roodt, Jos; Henry, Ronald M A; Stehouwer, Coen D A; Müller, Michael; Afman, Lydia A

    2013-01-01

    High-fat meal challenges are known to induce postprandial low-grade inflammation and endothelial dysfunction. This assumption is largely based on studies performed in older populations or in populations with a progressed disease state, and an appropriate control meal is often lacking. Young healthy individuals might be more resilient to such challenges. We therefore aimed to characterize the vascular and inflammatory response to a high-fat meal in young healthy individuals. In a double-blind randomized cross-over intervention study, we used a comprehensive phenotyping approach to determine the vascular and inflammatory response after consumption of a high-fat shake and after an average-breakfast shake in 20 young healthy subjects. Both interventions were performed three times. Many features of the vascular postprandial response, such as FMD, arterial stiffness and micro-vascular skin blood flow, were not different between shakes. High-fat/high-energy shake consumption was associated with a more pronounced increase in blood pressure, heart rate, plasma concentrations of IL-8 and PBMC gene expression of IL-8 and CD54 (ICAM-1), whereas plasma concentrations of sVCAM1 were decreased compared with the average breakfast. Whereas no differences in postprandial response were observed for classical markers of endothelial function, we did observe differences between consumption of a HF/HE and an average-breakfast meal in blood pressure and IL-8 in young healthy volunteers. IL-8 might play an important role in dealing with high-fat challenges and might be an early marker for endothelial stress, a stage preceding endothelial dysfunction.

  12. Development of small scale cell culture models for screening poloxamer 188 lot-to-lot variation.

    PubMed

    Peng, Haofan; Hall, Kaitlyn M; Clayton, Blake; Wiltberger, Kelly; Hu, Weiwei; Hughes, Erik; Kane, John; Ney, Rachel; Ryll, Thomas

    2014-01-01

    Shear protectants such as poloxamer 188 play a critical role in protecting cells during cell culture bioprocessing. Lot-to-lot variation of poloxamer 188 was experienced during a routine technology transfer across sites of similar scale and equipment. Cell culture medium containing a specific poloxamer 188 lot resulted in an unusual drop in cell growth, viability, and titer during manufacturing runs. After switching poloxamer lots, culture performance returned to the expected level. In order to control the quality of poloxamer 188 and thus maintain better consistency in manufacturing, multiple small-scale screening models were developed. Initially, a 5-L bioreactor model was established to evaluate cell damage at high sparge rates with different poloxamer 188 lots. Subsequently, a more robust, simple, and efficient baffled shake flask model was developed. The baffled shake flask model can be run in a high-throughput manner to investigate cell damage in a bubbling environment. The main cause of the poor performance was a loss of protection, rather than toxicity. It was also suggested that suspect lots can be identified using different cell lines and media. The screening methods provide easy yet remarkable models for understanding and controlling cell damage due to raw-material lot variation, as well as for studying the interaction between poloxamer 188 and cells. © 2014 American Institute of Chemical Engineers.

  13. The 2010-2011 Canterbury Earthquake Sequence: Environmental effects, seismic triggering thresholds and geologic legacy

    NASA Astrophysics Data System (ADS)

    Quigley, Mark C.; Hughes, Matthew W.; Bradley, Brendon A.; van Ballegooy, Sjoerd; Reid, Catherine; Morgenroth, Justin; Horton, Travis; Duffy, Brendan; Pettinga, Jarg R.

    2016-03-01

    Seismic shaking and tectonic deformation during strong earthquakes can trigger widespread environmental effects. The severity and extent of a given effect relates to the characteristics of the causative earthquake and the intrinsic properties of the affected media. Documentation of earthquake environmental effects in well-instrumented, historical earthquakes can enable seismologic triggering thresholds to be estimated across a spectrum of geologic, topographic and hydrologic site conditions, and implemented into seismic hazard assessments, geotechnical engineering designs, palaeoseismic interpretations, and forecasts of the impacts of future earthquakes. The 2010-2011 Canterbury Earthquake Sequence (CES), including the moment magnitude (Mw) 7.1 Darfield earthquake and Mw 6.2, 6.0, 5.9, and 5.8 aftershocks, occurred on a suite of previously unidentified, primarily blind, active faults in the eastern South Island of New Zealand. The CES is one of Earth's best recorded historical earthquake sequences. The location of the CES proximal to and beneath a major urban centre enabled rapid and detailed collection of vast amounts of field, geospatial, geotechnical, hydrologic, biologic, and seismologic data, and allowed incremental and cumulative environmental responses to seismic forcing to be documented throughout a protracted earthquake sequence. The CES caused multiple instances of tectonic surface deformation (≥ 3 events), surface manifestations of liquefaction (≥ 11 events), lateral spreading (≥ 6 events), rockfall (≥ 6 events), cliff collapse (≥ 3 events), subsidence (≥ 4 events), and hydrological (10s of events) and biological shifts (≥ 3 events). The terrestrial area affected by strong shaking (e.g. peak ground acceleration (PGA) ≥ 0.1-0.3 g), and the maximum distances between earthquake rupture and environmental response (Rrup), both generally increased with increased earthquake Mw, but were also influenced by earthquake location and source characteristics. However, the severity of a given environmental response at any given site related predominantly to ground shaking characteristics (PGA, peak ground velocities) and site conditions (water table depth, soil type, geomorphic and topographic setting) rather than earthquake Mw. In most cases, the most severe liquefaction, rockfall, cliff collapse, subsidence, flooding, tree damage, and biologic habitat changes were triggered by proximal, moderate magnitude (Mw ≤ 6.2) earthquakes on blind faults. CES environmental effects will be incompletely preserved in the geologic record and variably diagnostic of spatial and temporal earthquake clustering. Liquefaction feeder dikes in areas of severe and recurrent liquefaction will provide the best preserved and potentially most diagnostic CES features. Rockfall talus deposits and boulders will be well preserved and potentially diagnostic of the strong intensity of CES shaking, but challenging to decipher in terms of single versus multiple events. Most other phenomena will be transient (e.g., distal groundwater responses), not uniquely diagnostic of earthquakes (e.g., flooding), or more ambiguous (e.g. biologic changes). Preliminary palaeoseismic investigations in the CES region indicate recurrence of liquefaction in susceptible sediments of 100 to 300 yr, recurrence of severe rockfall event(s) of ca. 6000 to 8000 yr, and recurrence of surface rupturing on the largest CES source fault of ca. 20,000 to 30,000 yr. 
These data highlight the importance of utilising multiple proxy datasets in palaeoearthquake studies. The severity of environmental effects triggered during the strongest CES earthquakes was as great as or equivalent to any historic or prehistoric effects recorded in the geologic record. We suggest that the shaking caused by rupture of local blind faults in the CES comprised a 'worst case' seismic shaking scenario for parts of the Christchurch urban area. Moderate Mw blind fault earthquakes may contribute the highest proportion of seismic hazard, be the most important drivers of landscape evolution, and dominate the palaeoseismic record in some locations on Earth, including locations distal from any identified active faults. A high scientific priority should be placed on improving the spatial extent and quality of 'off-fault' shaking records of strong earthquakes, particularly near major urban centres.

  14. Query-Adaptive Reciprocal Hash Tables for Nearest Neighbor Search.

    PubMed

    Liu, Xianglong; Deng, Cheng; Lang, Bo; Tao, Dacheng; Li, Xuelong

    2016-02-01

    Recent years have witnessed the success of binary hashing techniques in approximate nearest neighbor search. In practice, multiple hash tables are usually built to cover more of the desired results in the hit buckets of each table. However, little work has studied a unified approach to constructing multiple informative hash tables with arbitrary hashing algorithms. Meanwhile, multiple-table search also lacks a generic query-adaptive and fine-grained ranking scheme that can alleviate the binary quantization loss suffered by standard hashing techniques. To solve these problems, in this paper we first regard table construction as a selection problem over a set of candidate hash functions. With a graph representation of the function set, we propose an efficient solution that sequentially applies a normalized dominant set to find the most informative and independent hash functions for each table. To further reduce the redundancy between tables, we explore reciprocal hash tables in a boosting manner, where the hash function graph is updated with high weights placed on the misclassified neighbor pairs of previous hash tables. To refine the ranking of the buckets retrieved within a certain Hamming radius of the query, we propose a query-adaptive bitwise weighting scheme that enables fine-grained bucket ranking in each hash table, exploiting the discriminative power of its hash functions and their complementarity for nearest neighbor search. Moreover, we integrate this scheme into multiple-table search using a fast, reciprocal table lookup algorithm within the adaptive weighted Hamming radius. Both the construction method and the query-adaptive search method are general and compatible with different types of hashing algorithms using different feature spaces and/or parameter settings. Our extensive experiments on several large-scale benchmarks demonstrate that the proposed techniques can significantly outperform both naive construction methods and state-of-the-art hashing algorithms.
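    A minimal sketch of the two ideas the abstract combines, multiple-table bucket lookup within a Hamming radius and a query-adaptive bitwise-weighted ranking of the candidates, is shown below. It is not the authors' algorithm: random hyperplane projections stand in for their dominant-set-selected hash functions, and the per-bit weights (projection margins) are an illustrative choice.

```python
# Illustrative multi-table binary hashing with query-adaptive bitwise-weighted
# ranking. All parameter choices here are assumptions made for the sketch.
import itertools
import numpy as np

rng = np.random.default_rng(0)
DIM, N_BITS, N_TABLES, RADIUS = 32, 12, 4, 1

data = rng.normal(size=(1000, DIM))                       # database vectors
projections = [rng.normal(size=(DIM, N_BITS)) for _ in range(N_TABLES)]

def hash_code(x, W):
    """Binary code of x: the sign pattern of its projections."""
    return tuple((x @ W > 0).astype(np.uint8))

# One bucket table per projection: code -> list of item ids.
tables = []
for W in projections:
    buckets = {}
    for i, x in enumerate(data):
        buckets.setdefault(hash_code(x, W), []).append(i)
    tables.append(buckets)

def codes_within_radius(code, radius):
    """All binary codes within the given Hamming radius of `code`."""
    base = np.array(code, dtype=np.uint8)
    for r in range(radius + 1):
        for flips in itertools.combinations(range(len(base)), r):
            c = base.copy()
            c[list(flips)] ^= 1
            yield tuple(c)

def search(query, k=10):
    # Stage 1: union of candidates from buckets within the Hamming radius.
    candidates = set()
    for W, buckets in zip(projections, tables):
        for code in codes_within_radius(hash_code(query, W), RADIUS):
            candidates.update(buckets.get(code, []))
    # Stage 2: query-adaptive weighted Hamming ranking; bits with a large
    # projection magnitude are treated as more reliable for this query.
    def score(i):
        s = 0.0
        for W in projections:
            weights = np.abs(query @ W)
            weights = weights / (weights.sum() + 1e-12)
            mismatch = np.asarray(hash_code(query, W)) != np.asarray(hash_code(data[i], W))
            s += float(weights @ mismatch)
        return s
    return sorted(candidates, key=score)[:k]

print(search(rng.normal(size=DIM)))
```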

  15. Deep Neural Network Based Supervised Speech Segregation Generalizes to Novel Noises through Large-scale Training

    DTIC Science & Technology

    2015-01-01

    Table 2 of the report gives segregation results in terms of STOI on a variety of novel noises at SNR = -2 dB (Babble-20, Cafeteria, Factory, Babble-100, Living Room, Cafe and Park), with the noises drawn from the NOISEX-92 corpus [13] and from living-room, cafe and park recordings in the DEMAND corpus [12], to put the performance of the noise-independent model in context.

  16. High Frequency Infrasonic Radiation from the 11 March 2011 Tohoku Mw 9.0 Earthquake

    NASA Astrophysics Data System (ADS)

    Walker, K. T.; Le Pichon, A.; Degroot-Hedlin, C. D.; Che, I.

    2011-12-01

    The tragic March 11 Mw 9.0 Tohoku earthquake ruptured the Wadati-Benioff zone beneath northeast Japan, generating a damaging seismic wavetrain and triggering a tsunami that devastated the nearby coastal areas. Centroid moment tensors, aftershocks, and the geometry of the trench suggest the rupture occurred on a plane roughly 400 km long by 200 km wide. Because the Earth's surface is effectively a speaker, the seismic wavetrain generated infrasonic emissions from northeast Japan that were recorded by seven infrasound arrays within 5600 km of the epicenter. Using a time progressive beamforming method and the Progressive Multi-Channel Correlation method, we detect and calculate back azimuths for the 0.3 to 3 Hz infrasonic signals at these stations. After application of predicted wind corrections, these back azimuths point to Honshu and Hokkaido, with the majority of detections illuminating a north-south elongated area near Sendai, where the USGS ShakeMap predicts the greatest intensity of surface shaking. An array near Tokyo (IS30) provides the first recording of locally generated infrasound from a very large dip-slip earthquake. At IS30 a six-minute arrival in the 0.3 to 1.5 Hz band is observed from northeast Japan spanning an 18° back azimuth range. Two shorter events originate from a source to the west, likely Mt. Fuji. Using constraints from propagation modeling, we back project the infrasonic amplitudes recorded at IS30 to a relatively localized area. The maximum amplitude of 1 Pa originates from surface shaking along the coast. This location is also just west of the epicenter and adjacent to the location of maximum P-wave radiation from back projection studies. Noise at IS30 after the mainshock limits the detection of additional signals. A more pronounced infrasonic wavetrain at IS44 (Kamchatka) illuminates the entire Honshu and Hokkaido region, especially along the east coast near Sendai. In agreement with propagation modeling predictions using global atmospheric velocity models, far-field stratospheric and thermospheric arrivals are detected to the northeast and west of the epicenter along the two approximate source-receiver planes. The northeast stations demonstrate a remarkably consistent amplitude decay with range (R) typical of spherical spreading (20log10{R}) whereas the western stations also experience a remarkably consistent decay, but one that is much greater (35log10{R}), likely due to thermospheric attenuation. Our results show that infrasonic arrays listening in regions of very large earthquakes can provide direct measurements of surface shaking, which is pertinent to the timely creation of accurate ShakeMaps.

  17. A new probabilistic seismic hazard assessment for greater Tokyo

    USGS Publications Warehouse

    Stein, R.S.; Toda, S.; Parsons, T.; Grunewald, E.; Blong, R.; Sparks, S.; Shah, H.; Kennedy, J.

    2006-01-01

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105 000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M ≈ 8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10 000 observations in the past 400 years), the dense modern seismic network (300 000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M ≈ 7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M ≈ 7.3 Ansei-Edo event have a ca 20% likelihood in an average 30 year period. In contrast, our renewal (time-dependent) probability for the great M ≥ 7.9 plate boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30 year probability of ca 10%. The resulting net likelihood for severe shaking (ca 0.9 g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama for the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner exclusively from intensity observations. This approach permits robust estimates for the spatial distribution of expected shaking, even for sites with few observations. The resulting probability of severe shaking is ca 35% in Tokyo, Kawasaki and Yokohama and ca 10% in Chiba for an average 30 year period, in good agreement with our independent estimate, and thus bolstering our view that Tokyo's hazard looms large. Given $1 trillion estimates for the cost of an M ≈ 7.3 shock beneath Tokyo, our probability implies a $13 billion annual probable loss. © 2006 The Royal Society.

  18. A new probabilistic seismic hazard assessment for greater Tokyo.

    PubMed

    Stein, Ross S; Toda, Shinji; Parsons, Tom; Grunewald, Elliot

    2006-08-15

    Tokyo and its outlying cities are home to one-quarter of Japan's 127 million people. Highly destructive earthquakes struck the capital in 1703, 1855 and 1923, the last of which took 105,000 lives. Fuelled by greater Tokyo's rich seismological record, but challenged by its magnificent complexity, our joint Japanese-US group carried out a new study of the capital's earthquake hazards. We used the prehistoric record of great earthquakes preserved by uplifted marine terraces and tsunami deposits (17 M approximately 8 shocks in the past 7000 years), a newly digitized dataset of historical shaking (10,000 observations in the past 400 years), the dense modern seismic network (300,000 earthquakes in the past 30 years), and Japan's GeoNet array (150 GPS vectors in the past 10 years) to reinterpret the tectonic structure, identify active faults and their slip rates and estimate their earthquake frequency. We propose that a dislodged fragment of the Pacific plate is jammed between the Pacific, Philippine Sea and Eurasian plates beneath the Kanto plain on which Tokyo sits. We suggest that the Kanto fragment controls much of Tokyo's seismic behaviour for large earthquakes, including the damaging 1855 M approximately 7.3 Ansei-Edo shock. On the basis of the frequency of earthquakes beneath greater Tokyo, events with magnitude and location similar to the M approximately 7.3 Ansei-Edo event have a ca 20% likelihood in an average 30 year period. In contrast, our renewal (time-dependent) probability for the great M ≥ 7.9 plate boundary shocks such as struck in 1923 and 1703 is 0.5% for the next 30 years, with a time-averaged 30 year probability of ca 10%. The resulting net likelihood for severe shaking (ca 0.9 g peak ground acceleration (PGA)) in Tokyo, Kawasaki and Yokohama for the next 30 years is ca 30%. The long historical record in Kanto also affords a rare opportunity to calculate the probability of shaking in an alternative manner exclusively from intensity observations. This approach permits robust estimates for the spatial distribution of expected shaking, even for sites with few observations. The resulting probability of severe shaking is ca 35% in Tokyo, Kawasaki and Yokohama and ca 10% in Chiba for an average 30 year period, in good agreement with our independent estimate, and thus bolstering our view that Tokyo's hazard looms large. Given estimates of US$1 trillion for the cost of an M approximately 7.3 shock beneath Tokyo, our probability implies an annual probable loss of US$13 billion.
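    For orientation only, a multi-decade probability can be annualized under the usual Poisson assumption and multiplied by a loss estimate; the authors' US$13 billion figure comes from their full hazard and loss model, which the generic relation below does not attempt to reproduce:

```latex
% Generic Poisson annualization of a 30-year probability P_{30} and an event
% loss L (illustrative only; not the paper's loss model).
\[
  \lambda = -\frac{\ln\!\left(1 - P_{30}\right)}{30\ \text{yr}},
  \qquad
  \text{expected annual loss} \approx \lambda\, L .
\]
```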

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
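    The derived statistics listed above follow directly from a contingency table of counts; a minimal single-process sketch (unrelated to the paper's parallel implementation, and with made-up counts) is:

```python
# Statistics derivable from a contingency table: joint and marginal
# probabilities, pointwise mutual information, entropy, and the chi-squared
# independence statistic. The 2x3 table of counts below is invented.
import numpy as np

counts = np.array([[30.0, 10.0, 5.0],
                   [15.0, 25.0, 15.0]])       # co-occurrence counts

n = counts.sum()
joint = counts / n                            # joint probability P(x, y)
p_x = joint.sum(axis=1, keepdims=True)        # marginal P(x)
p_y = joint.sum(axis=0, keepdims=True)        # marginal P(y)

# Pointwise mutual information per cell (0 where the cell is empty).
pmi = np.where(joint > 0, np.log2(joint / (p_x * p_y), where=joint > 0), 0.0)

# Shannon entropy of the joint distribution, in bits.
entropy = -np.sum(np.where(joint > 0, joint * np.log2(joint, where=joint > 0), 0.0))

# Chi-squared statistic against the independence model E = n * P(x) * P(y).
expected = n * (p_x * p_y)
chi2 = np.sum((counts - expected) ** 2 / expected)

print(f"entropy = {entropy:.3f} bits, chi2 = {chi2:.2f}")
print("PMI:\n", np.round(pmi, 3))
```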

  20. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    USGS Publications Warehouse

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the US Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically derived set of ShakeMap hazard outputs. We illustrate two example uses of EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time-of-day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time-of-day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years. We subsequently compare this "composite ShakeMap" with existing global hazard models, calculating the spatial area of the existing hazard maps exceeded by the combined ShakeMap ground motions. In general, these analyses suggest that existing global and regional hazard maps tend to overestimate hazard. Both the Atlas of ShakeMaps and EXPO-CAT have many potential uses for examining earthquake risk and epidemiology. All of the datasets discussed herein are available for download on the PAGER Web page (http://earthquake.usgs.gov/eqcenter/pager/prodandref/). © 2009 Springer Science+Business Media B.V.
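    The exposure calculation described above amounts to overlaying a gridded shaking field on a co-registered population grid and summing population per intensity bin. The sketch below illustrates that step with synthetic grids; it is not PAGER code.

```python
# Illustrative population exposure per Modified Mercalli Intensity (MMI) bin,
# computed from synthetic stand-in grids for a ShakeMap and a population count.
import numpy as np

rng = np.random.default_rng(1)
mmi = np.clip(rng.normal(6.0, 1.5, size=(200, 200)), 1.0, 10.0)   # MMI grid
population = rng.integers(0, 500, size=(200, 200)).astype(float)  # people per cell

exposure = {}
for lo in range(1, 11):                       # integer MMI bins I..X
    mask = (mmi >= lo) & (mmi < lo + 1)
    exposure[lo] = float(population[mask].sum())

severe = sum(v for k, v in exposure.items() if k >= 8)   # exposure at MMI VIII+
print({k: int(v) for k, v in exposure.items()})
print(f"population exposed to MMI VIII or greater: {int(severe)}")
```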

  1. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 1. Baseline Studies. Volume VIII. Summary of Baseline Studies and Data.

    DTIC Science & Technology

    1982-05-01

    in May 1976, and, by July 1976, all sampling techniques were employed. In addition to routine displays of data analysis such as frequency tables and...amphibian and reptile communities in large aquatic habitats in Florida, comparison with similar herpetofaunal assemblages or populations is not possible... field environment was initiated at Lake Conway near Orlando, Fla., to study the effectiveness of the fish as a biological macrophyte control agent. A

  2. West-Coast Wide Expansion and Testing of the Geodetic Alarm System (G-larmS)

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Grapenthin, R.; Melgar, D.; Aranha, M. A.; Allen, R. M.

    2016-12-01

    The Geodetic Alarm System (G-larmS) was developed in collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech for real-time Earthquake Early Warning (EEW). G-larmS has been in continuous operation at the BSL since 2014 using event triggers from the ShakeAlert EEW system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California (CA). G-larmS has been extended to include southern CA and Cascadia, providing continuous west-coast wide coverage. G-larmS currently uses high rate (1 Hz), low latency (< 5 s), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from the ShakeAlert EEW system. It extracts static offsets from real-time GPS time series upon S-wave arrival and performs a least squares inversion on these offsets to determine slip on a finite fault. A key issue with geodetic EEW approaches is that unlike seismology-based algorithms that are routinely tested using frequent small-magnitude events, geodetic systems are not regularly exercised. Scenario ruptures are therefore important for testing the performance of G-larmS. We discuss results from scenario events on several large faults (capable of M>6.5) in CA and Cascadia built from realistic 3D geometries. Synthetic long-period 1Hz displacement waveforms were obtained from a new stochastic kinematic slip distribution generation method. Waveforms are validated by direct comparison to peak P-wave displacement scaling laws and to PGD GMPEs obtained from high-rate GPS observations of large events worldwide. We run the scenarios on real-time streams to systematically test the recovery of slip and magnitude by G-larmS. In addition to presenting these results, we will discuss new capabilities, such as implementing 2D geometry and the applicability of these results to GPS enhanced tsunami warning systems.
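    The core inversion step described above, least squares on GPS static offsets to recover slip on a finite fault, can be sketched as follows. The Green's functions and offsets here are synthetic placeholders; a real system such as G-larmS would build G from an elastic dislocation model for the actual fault geometry, and the rigidity and subfault area are assumed values.

```python
# Minimal damped least-squares slip inversion sketch (synthetic G and data;
# not the G-larmS implementation).
import numpy as np

rng = np.random.default_rng(2)
n_stations, n_subfaults = 30, 12
G = rng.normal(size=(3 * n_stations, n_subfaults))          # E, N, U offsets per station
true_slip = np.abs(rng.normal(1.0, 0.5, size=n_subfaults))
d = G @ true_slip + rng.normal(scale=0.01, size=3 * n_stations)  # observed offsets (m)

# Damped least squares: minimize ||G m - d||^2 + alpha^2 ||m||^2.
alpha = 0.1
m_hat, *_ = np.linalg.lstsq(
    np.vstack([G, alpha * np.eye(n_subfaults)]),
    np.concatenate([d, np.zeros(n_subfaults)]),
    rcond=None,
)

# Moment and magnitude from the recovered slip (assumed rigidity and subfault area).
mu, subfault_area = 30e9, (10e3) ** 2        # Pa, m^2
M0 = mu * subfault_area * m_hat.sum()        # seismic moment, N m
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.1)
print(f"recovered Mw ~ {Mw:.2f}")
```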

  3. Assessing the Utility of Strong Motion Data to Determine Static Ground Displacements During Great Megathrust Earthquakes: Tohoku and Iquique

    NASA Astrophysics Data System (ADS)

    Herman, M. W.; Furlong, K. P.; Hayes, G. P.; Benz, H.

    2014-12-01

    Strong motion accelerometers can record large amplitude shaking on-scale in the near-field of large earthquake ruptures; however, numerical integration of such records to determine displacement is typically unstable due to baseline changes (i.e., distortions in the zero value) that occur during strong shaking. We use datasets from the 2011 Mw 9.0 Tohoku earthquake to assess whether a relatively simple empirical correction scheme (Boore et al., 2002) can return accurate displacement waveforms useful for constraining details of the fault slip. The coseismic deformation resulting from the Tohoku earthquake was recorded by the Kiban Kyoshin network (KiK-net) of strong motion instruments as well as by a dense network of high-rate (1 Hz) GPS instruments. After baseline correcting the KiK-net records and integrating to displacement, over 85% of the KiK-net borehole instrument waveforms and over 75% of the KiK-net surface instrument waveforms match collocated 1 Hz GPS displacement time series. Most of the records that do not match the GPS-derived displacements following the baseline correction have large, systematic drifts that can be automatically identified by examining the slopes in the first 5-10 seconds of the velocity time series. We apply the same scheme to strong motion records from the 2014 Mw 8.2 Iquique earthquake. Close correspondence in both direction and amplitude between coseismic static offsets derived from the integrated strong motion time series and those predicted from a teleseismically-derived finite fault model, as well as displacement amplitudes consistent with InSAR-derived results, suggest that the correction scheme works successfully for the Iquique event. In the absence of GPS displacements, these strong motion-derived offsets provide constraints on the overall distribution of slip on the fault. In addition, the coseismic strong motion-derived displacement time series (50-100 s long) contain a near-field record of the temporal evolution of the rupture, supplementing teleseismic data and improving resolution of the location and timing of moment in finite fault models.
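    A simplified version of the workflow the abstract describes (remove the pre-event mean, integrate to velocity, fit and remove the linear ramp that an uncorrected baseline shift leaves in the late velocity, then integrate to displacement) is sketched below with a synthetic record. It is not the Boore et al. (2002) scheme itself, and the record and correction times are invented.

```python
# Synthetic illustration of baseline correction and double integration of a
# strong-motion acceleration record.
import numpy as np
from scipy.integrate import cumulative_trapezoid

dt = 0.01
t = np.arange(0.0, 120.0, dt)
# Synthetic acceleration: a shaking pulse plus a small step-like baseline shift.
acc = 0.5 * np.exp(-((t - 40.0) / 3.0) ** 2) * np.sin(2.0 * np.pi * (t - 40.0))
acc += 0.001 * (t > 45.0)                      # baseline offset appearing mid-record

acc -= acc[t < 30.0].mean()                    # remove the pre-event mean
vel = cumulative_trapezoid(acc, t, initial=0.0)

# An uncorrected acceleration offset appears as a linear ramp in late velocity;
# fit that ramp and subtract it from the point where the fitted line crosses zero.
slope, intercept = np.polyfit(t[t > 60.0], vel[t > 60.0], 1)
t1 = -intercept / slope
vel_corr = vel - np.where(t > t1, slope * (t - t1), 0.0)

disp = cumulative_trapezoid(vel_corr, t, initial=0.0)
print(f"estimated static offset: {disp[-1]:.4f} m")
```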

  4. Treatment of table olive washing water using trickling filters, constructed wetlands and electrooxidation.

    PubMed

    Tatoulis, Triantafyllos; Stefanakis, Alexandros; Frontistis, Zacharias; Akratos, Christos S; Tekerlekopoulou, Athanasia G; Mantzavinos, Dionissios; Vayenas, Dimitrios V

    2017-01-01

    The production of table olives is a significant economic activity in Mediterranean countries. Table olive processing generates large volumes of rinsing water that are characterized by high organic matter and phenol contents. Due to these characteristics, a combination of more than one technology is imperative to ensure efficient treatment with low operational cost. Previously, biological filters were combined with electrooxidation to treat table olive washing water. Although this combination was successful in reducing pollutant loads, its cost could be further reduced. Constructed wetlands could be an eligible treatment method for integrated table olive washing water treatment as they have proved tolerant to high organic matter and phenol loads. Two pilot-scale horizontal subsurface constructed wetlands, one planted and one unplanted, were combined with a biological filter and electrooxidation over a boron-doped diamond anode to treat table olive washing water. In the biological filter inlet, chemical oxygen demand (COD) concentrations ranged from 5500 to 15,000 mg/L, while mean COD influent concentration in the constructed wetlands was 2800 mg/L. The wetlands proved to be an efficient intermediate treatment stage, since COD removal levels for the planted unit reached 99 % (mean 70 %), while the unplanted unit presented removal rates of around 65 %. Moreover, the concentration of phenols in the effluent was typically below 100 mg/L. The integrated trickling filter-constructed wetland-electrooxidation treatment system examined here could mineralize and decolorize table olive washing water and fully remove its phenolic content.

  5. Multisite comparison of drivers of methane emissions from wetlands in the European Arctic: influence of vegetation community and water table.

    NASA Astrophysics Data System (ADS)

    Dinsmore, Kerry; Drewer, Julia; Leeson, Sarah; Skiba, Ute; Levy, Pete; George, Charles

    2014-05-01

    Arctic and sub-arctic wetlands are a major source of atmospheric CH4 and therefore have the potential to be important in controlling global radiative forcing. Furthermore, the strong links between wetland CH4 emissions and vegetation community, hydrology and temperature suggest potentially large feedbacks between climate change and future emissions. Quantifying current emissions over large spatial scales and predicting future climatic feedbacks requires a fundamental understanding of the ground-based drivers of plot-scale emissions. The MAMM project (Methane in the Arctic: Measurements and Modelling) aims to understand and quantify current CH4 emissions and future climatic impacts by combining ground and aircraft measurements across the European Arctic with regional computer modelling. Here we present results from the ground-based MAMM measurement campaigns, analysing chamber-measured CH4 emissions from two sites in the European Arctic/sub-Arctic region (Sodankylä, Finland; Stordalen Mire, Sweden) during the growing seasons of 2012 and 2013. A total of 85 wetland static chambers were deployed across the two field sites: 39 at Sodankylä (67°22'01"N, 26°03'06"E) in 2012 and 46 at Stordalen Mire (68°21'20"N, 19°02'56"E) in 2013. Chamber design, protocol and deployment were the same across both sites. Chambers were located at sites chosen strategically to cover the local range of water table depths and vegetation communities. A total of 18 and 15 repeated measurements were made at each chamber in Sodankylä and Stordalen Mire, respectively, over the snow-free season. Preliminary results show a large range of CH4 fluxes across both sites, from uptake of up to 0.07 and 0.06 mg CH4-C m-2 hr-1 to emissions of 17.3 and 44.2 mg CH4-C m-2 hr-1 in Sodankylä and Stordalen Mire, respectively. Empirical models based on vegetation community, water table depth, temperature and soil nutrient availability (Plant Root Simulator probes, PRS™) have been constructed with the aim of understanding the drivers of chamber-scale fluxes. By combining measurements made at two different sites, >300 km apart, using the same experimental setup, we are uniquely able to investigate whether CH4 emissions are driven by common parameters. Furthermore, we are able to determine whether plot-scale empirical models and parameterisations can be used effectively to upscale emissions to the landscape and whole-Arctic scale.
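    For context, a static-chamber CH4 flux is commonly obtained by fitting the rise in headspace mixing ratio and converting the slope with the chamber geometry and the ideal gas law. The sketch below uses invented readings and assumed chamber dimensions, not MAMM data or the project's actual processing chain.

```python
# Illustrative static-chamber CH4 flux calculation (made-up readings and an
# assumed chamber geometry).
import numpy as np

# Headspace CH4 mixing ratio (ppm) sampled every 5 minutes over 30 minutes.
t_s = np.arange(0, 31, 5) * 60.0
ppm = np.array([1.95, 2.30, 2.62, 2.97, 3.28, 3.61, 3.90])

slope_ppm_per_s = np.polyfit(t_s, ppm, 1)[0]      # linear fit to the rise

# Chamber geometry and ambient conditions (assumed values).
volume_m3, area_m2 = 0.030, 0.12                  # 30 L chamber over 0.12 m^2
pressure_pa, temp_k = 101325.0, 283.15
R = 8.314                                         # J mol-1 K-1

molar_density = pressure_pa / (R * temp_k)        # mol of air per m^3
dC_dt = slope_ppm_per_s * 1e-6 * molar_density    # mol CH4 m^-3 s^-1
flux_mol = dC_dt * volume_m3 / area_m2            # mol CH4 m^-2 s^-1
flux_mg_C = flux_mol * 12.01 * 1e3 * 3600.0       # mg CH4-C m^-2 hr^-1
print(f"CH4 flux ~ {flux_mg_C:.2f} mg CH4-C m-2 hr-1")
```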

  6. Shaking Table Tests Validating Two Strengthening Interventions on Masonry Buildings

    NASA Astrophysics Data System (ADS)

    De Canio, Gerardo; Muscolino, Giuseppe; Palmeri, Alessandro; Poggi, Massimo; Clemente, Paolo

    2008-07-01

    Masonry buildings quite often constitute a precious cultural heritage for our cities. So that future generations can enjoy this heritage, effective protection measures should be developed against all the anthropic and natural actions which may irreparably damage old masonry buildings. However, strengthening interventions on these constructions have to respect their authenticity, without altering the original conception, not only functionally and aesthetically, of course, but also statically. These issues are of central interest in the Messina area, where the seismic protection of new and existing constructions is a primary demand. It is well known, in fact, that the city of Messina lies in a highly seismic zone, and has been subjected to two destructive earthquakes in slightly more than one century: the 1783 Calabria earthquake and the more famous 1908 Messina-Reggio Calabria earthquake. It follows that retrofitting projects on buildings which survived these two events should be designed with the aim of saving the lives of occupants while operating with "light" techniques, i.e. respecting the original structural scheme. On the other hand, recent earthquakes, and in particular the 1997 Umbria-Marche sequence, unequivocally demonstrated that some of the most popular retrofitting interventions adopted in the second half of the last century are absolutely ineffective, or even unsafe. Over those years, in fact, a number of "heavy" techniques proliferated, and old masonry buildings therefore suffered, among others, the substitution of existing timber slabs with more ponderous concrete slabs and/or the insertion of RC and steel members coupled with the original masonry elements (walls, arches, vaults). As a result, these buildings have been transformed by unwise engineers into hybrid structures, having a mixed behaviour (which frequently proved to be unpredictable) between those of historic masonry and new members. Starting from these considerations, a numerical and experimental research programme has been carried out, aimed at validating two different strengthening interventions on masonry buildings: (i) the substitution of the existing roof with timber-concrete composite slabs, which are able to improve the dynamic behaviour of the structure without excessively increasing the mass, and (ii) the reinforcement of masonry walls with FRP materials, which increases both the stiffness and the strength of the construction. The experimental tests have been performed on a 1:2 scale model of a masonry building resembling a special type, the so-called "tipo misto messinese", which is characteristic of the reconstruction of the city of Messina after the 1783 Calabria earthquake. The model, incorporating a novel timber-concrete composite slab, has been tested on the main shaking table available at the ENEA Research Centre "Casaccia," both before and after the reinforcement with FRP materials. Some aspects related to the definition of the model and to the selection of an appropriate seismic input will be discussed, and numerical results confirming the effectiveness of the interventions mentioned above will be presented.

  7. Shaking Table Tests Validating Two Strengthening Interventions on Masonry Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Canio, Gerardo; Poggi, Massimo; Clemente, Paolo

    2008-07-08

    Masonry buildings quite often constitute a precious cultural heritage for our cities. So that future generations can enjoy this heritage, effective protection measures should be developed against all the anthropic and natural actions which may irreparably damage old masonry buildings. However, strengthening interventions on these constructions have to respect their authenticity, without altering the original conception, not only functionally and aesthetically, of course, but also statically. These issues are of central interest in the Messina area, where the seismic protection of new and existing constructions is a primary demand. It is well known, in fact, that the city of Messina lies in a highly seismic zone, and has been subjected to two destructive earthquakes in slightly more than one century: the 1783 Calabria earthquake and the more famous 1908 Messina-Reggio Calabria earthquake. It follows that retrofitting projects on buildings which survived these two events should be designed with the aim of saving the lives of occupants while operating with 'light' techniques, i.e. respecting the original structural scheme. On the other hand, recent earthquakes, and in particular the 1997 Umbria-Marche sequence, unequivocally demonstrated that some of the most popular retrofitting interventions adopted in the second half of the last century are absolutely ineffective, or even unsafe. Over those years, in fact, a number of 'heavy' techniques proliferated, and old masonry buildings therefore suffered, among others, the substitution of existing timber slabs with more ponderous concrete slabs and/or the insertion of RC and steel members coupled with the original masonry elements (walls, arches, vaults). As a result, these buildings have been transformed by unwise engineers into hybrid structures, having a mixed behaviour (which frequently proved to be unpredictable) between those of historic masonry and new members. Starting from these considerations, a numerical and experimental research programme has been carried out, aimed at validating two different strengthening interventions on masonry buildings: (i) the substitution of the existing roof with timber-concrete composite slabs, which are able to improve the dynamic behaviour of the structure without excessively increasing the mass, and (ii) the reinforcement of masonry walls with FRP materials, which increases both the stiffness and the strength of the construction. The experimental tests have been performed on a 1:2 scale model of a masonry building resembling a special type, the so-called 'tipo misto messinese', which is characteristic of the reconstruction of the city of Messina after the 1783 Calabria earthquake. The model, incorporating a novel timber-concrete composite slab, has been tested on the main shaking table available at the ENEA Research Centre 'Casaccia', both before and after the reinforcement with FRP materials. Some aspects related to the definition of the model and to the selection of an appropriate seismic input will be discussed, and numerical results confirming the effectiveness of the interventions mentioned above will be presented.

  8. Soft-sediment deformation in New Zealand: Structures resulting from the 2010/11 Christchurch earthquakes and comparison with Pleistocene sediments of the Taupo Volcanic Zone (TVZ)

    NASA Astrophysics Data System (ADS)

    Scholz, C.; Downs, D. T.; Gravley, D.; Quigley, M.; Rowland, J. V.

    2011-12-01

    The distinction between seismites and other event-related soft-sediment deformation is a challenging problem. Recognition and interpretation are aided by comparison of recent examples produced during known seismic events with those generated experimentally. Once recognized in a rock, seismites are important features for interpreting the paleotectonic environment, the tectonic relationships of sediments in basins, and sedimentary facies, and for evaluating earthquake frequency and hazard and the consequent land management. Two examples of soft-sediment deformation, potentially generated through ground shaking and associated liquefaction, are described from within the TVZ: 1) Near Matata on the western margin of the Whakatane Graben. This location has a complicated en-echelon fault history and large earthquakes occur from time to time (e.g., the 1987 ML6.3 Edgecumbe event). The structures occur in ~550 ka volcanic sediments and represent soft-sediment deformation within stratigraphically bounded layers. Based on paleoenvironment, appearance, and diagnostic criteria described by other authors (Sims 1975; Hempton and Dewey 1983), we interpret these features to have formed by ground shaking related to an earthquake and/or possibly accompanying large volcanic eruptions, rather than by slope failure. 2) Near Taupo, 3 km from the active Kaiapo fault. Lakeward-dipping, nearly horizontal lacustrine sediments overlie the Taupo Ignimbrite (1.8 ka). At one outcrop the lake beds have subsided into the underlying substrate, resulting in kidney-shaped features. These structures formed as a result of liquefaction of the underlying substrate, which may have been caused by ground shaking related to either seismic or volcanic activity. However, inferred time relationships are more consistent with seismically induced ground shaking. We compare and contrast the form and geometry of the above structures with seismites generated during the recent Christchurch earthquakes (Sep. 2010 and Feb. 2011). Hempton, M. R. and J. F. Dewey (1983). "Earthquake-induced deformational structures in young lacustrine sediments, East Anatolian Fault, southeast Turkey." Tectonophysics 98(3-4): T7-T14. Sims, J. D. (1975). "Determining earthquake recurrence intervals from deformational structures in young lacustrine sediments." Tectonophysics 29(1-4): 141-152.

  9. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  10. Building a Communication, Education, and Outreach Program for the ShakeAlert National Earthquake Early Warning Program - Recommendations for Public Alerts Via Cell Phones

    NASA Astrophysics Data System (ADS)

    DeGroot, R. M.; Long, K.; Strauss, J. A.

    2017-12-01

    The United States Geological Survey (USGS) and its partners are developing the ShakeAlert Earthquake Early Warning System for the West Coast of the United States. To be an integral part of successful implementation, ShakeAlert engagement programs and materials must integrate with and leverage broader earthquake risk programs. New methods and products for dissemination must be multidisciplinary, cost effective, and consistent with existing hazards education and communication efforts. The ShakeAlert Joint Committee for Communication, Education, and Outreach (JCCEO) is identifying, developing, and cultivating partnerships with ShakeAlert stakeholders, including Federal and State agencies, academic partners, private companies, policy makers, and local organizations. Efforts include developing materials and methods for delivery and reaching stakeholders with information on ShakeAlert, earthquake preparedness, and emergency protective actions. It is essential to develop standards to ensure that information communicated via the alerts is consistent across the public and private sectors and to achieve a common understanding of what actions users should take when they receive a ShakeAlert warning. In February 2017, the JCCEO convened the Warning Message Focus Group (WMFG) to provide findings and recommendations to the Alliance for Telecommunications Industry Solutions on the use of earthquake early warning message content standards for public alerts via cell phones. The WMFG represents communications, education, and outreach stakeholders from various sectors, including ShakeAlert regional coordinators, industry, emergency managers, and subject matter experts from the social sciences. The group's knowledge was combined with an in-depth literature review to ensure that all groups who could receive the message were taken into account. The USGS and the participating states and agencies acknowledge that the implementation of ShakeAlert is a collective effort requiring the participation of hundreds of stakeholders committed to ensuring public accessibility.

  11. [A Case of Middle Cerebral Artery Stenosis Presented with Limb-Shaking TIA].

    PubMed

    Uno, Junji; Mineta, Haruyuki; Ren, Nice; Takagishi, Sou; Nagaoka, Shintarou; Kameda, Katsuharu; Maeda, Kazushi; Ikai, Yoshiaki; Gi, Hidefuku

    2016-07-01

    Involuntary movement is a rare clinical manifestation of transient ischemic attack (TIA). However, limb-shaking TIA is a well-described presentation of carotid occlusive disease. We present the case of a patient who developed limb-shaking TIA associated with high-grade stenosis of the middle cerebral artery (M1), which was treated with percutaneous transluminal angioplasty (PTA). The procedure was performed successfully without complications, and the symptoms disappeared immediately after the procedure. The patient remained free of symptoms at the 38-month follow-up. There was no tendency toward restenosis of M1. In this case, PTA was technically feasible and beneficial for limb-shaking TIA with M1 stenosis. Limb-shaking TIA can be a symptom of high-grade stenosis of M1.

  12. Relations between some horizontal‐component ground‐motion intensity measures used in practice

    USGS Publications Warehouse

    Boore, David; Kishida, Tadahiro

    2017-01-01

    Various measures using the two horizontal components of recorded ground motions have been used in a number of studies that derive ground‐motion prediction equations and construct maps of shaking intensity. We update relations between a number of these measures, including those in Boore et al. (2006) and Boore (2010), using the large and carefully constructed global database of ground motions from crustal earthquakes in active tectonic regions developed as part of the Pacific Earthquake Engineering Research Center–Next Generation Attenuation‐West2 project. The ratios from the expanded datasets generally agree to within a few percent of the previously published ratios. We also provide some ratios that were not considered before, some of which will be useful in applications such as constructing ShakeMaps. Finally, we compare two important ratios with those from a large central and eastern North American database and from many records from subduction earthquakes in Japan and Taiwan. In general, the ratios from these regions are within several percent of those from crustal earthquakes in active tectonic regions.
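
    The abstract above concerns ratios between alternative ways of combining the two horizontal components into a single intensity measure. As a purely illustrative sketch (not the authors' code or data), the snippet below computes two such measures from a pair of synthetic acceleration traces: the geometric mean of the two as-recorded peaks, and RotD50, the median over rotation angles of the peak of the rotated horizontal component.

    ```python
    import numpy as np

    def geometric_mean_peak(a1, a2):
        """Geometric mean of the peak absolute amplitudes of the two as-recorded components."""
        return np.sqrt(np.max(np.abs(a1)) * np.max(np.abs(a2)))

    def rotd50_peak(a1, a2, dtheta_deg=1.0):
        """RotD50: median, over rotation angles, of the peak of the rotated horizontal component."""
        angles = np.deg2rad(np.arange(0.0, 180.0, dtheta_deg))
        peaks = [np.max(np.abs(a1 * np.cos(t) + a2 * np.sin(t))) for t in angles]
        return np.median(peaks)

    # Synthetic two-component record, for illustration only (not real data).
    t = np.linspace(0.0, 20.0, 2001)
    a_ns = np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.5 * t)
    a_ew = 0.8 * np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.5 * t + 0.7)

    gm = geometric_mean_peak(a_ns, a_ew)
    rotd50 = rotd50_peak(a_ns, a_ew)
    print(f"GM peak = {gm:.3f}, RotD50 peak = {rotd50:.3f}, ratio RotD50/GM = {rotd50 / gm:.3f}")
    ```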

  13. Behavioral Response in the Immediate Aftermath of Shaking: Earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan

    PubMed Central

    Jon, Ihnji; Lindell, Michael K.; Prater, Carla S.; Huang, Shih-Kai; Wu, Hao-Che; Johnston, David M.; Becker, Julia S.; Shiroshita, Hideyuki; Doyle, Emma E.H.; Potter, Sally H.; McClure, John; Lambie, Emily

    2016-01-01

    This study examines people’s response actions in the first 30 min after shaking stopped following earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch, 332 respondents in Hitachi, and 204 respondents in Wellington revealed notable similarities in some response actions immediately after the shaking stopped. In all four events, people were most likely to contact family members and seek additional information about the situation. However, there were notable differences among events in the frequency of resuming previous activities. Actions taken in the first 30 min were weakly related to: demographic variables, earthquake experience, contextual variables, and actions taken during the shaking, but were significantly related to perceived shaking intensity, risk perception and affective responses to the shaking, and damage/infrastructure disruption. These results have important implications for future research and practice because they identify promising avenues for emergency managers to communicate seismic risks and appropriate responses to risk area populations. PMID:27854306

  15. B-CAN: a resource sharing platform to improve the operation, visualization and integrated analysis of TCGA breast cancer data.

    PubMed

    Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu

    2017-12-12

    Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genomic and clinical data via web portals and FTP services. Researchers are able to gain new insights into their related fields and evaluate experimental discoveries with TCGA. However, TCGA is difficult to access and operate on for researchers who have little experience with databases and bioinformatics, because of its complex data format and diverse files. For ease of use, we built the breast cancer (B-CAN) platform, which provides data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with a MySQL database back end through PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of B-CAN is that it provides both single-table and multiple-table displays; customized data with one barcode corresponding to many records, as well as processed customized data, are handled in the multiple-table display. B-CAN is an intuitive and highly efficient data-sharing platform.
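
    As a rough illustration of the table-combination step described above (joining tables from the TCGA-derived database on the patient barcode and selecting variables from each), the sketch below uses an in-memory SQLite database; the table names, columns and values are invented for illustration and do not reflect B-CAN's actual schema.

    ```python
    import sqlite3

    # Tiny in-memory stand-in for two TCGA-style tables keyed by patient barcode.
    # Table names, columns and values are hypothetical, not the actual B-CAN/TCGA schema.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE clinical (barcode TEXT PRIMARY KEY, age INTEGER, stage TEXT);
        CREATE TABLE expression (barcode TEXT, gene TEXT, value REAL);
        INSERT INTO clinical VALUES ('TCGA-01', 52, 'II'), ('TCGA-02', 61, 'III');
        INSERT INTO expression VALUES ('TCGA-01', 'BRCA1', 7.2), ('TCGA-02', 'BRCA1', 5.9);
    """)

    # "Customized data": pick variables from each table and combine them on the barcode.
    rows = con.execute("""
        SELECT c.barcode, c.stage, e.gene, e.value
        FROM clinical AS c JOIN expression AS e ON c.barcode = e.barcode
        WHERE e.gene = 'BRCA1'
    """).fetchall()
    for row in rows:
        print(row)
    ```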

  16. Modeling continuous seismic velocity changes due to ground shaking in Chile

    NASA Astrophysics Data System (ADS)

    Gassenmeier, Martina; Richter, Tom; Sens-Schönfelder, Christoph; Korn, Michael; Tilmann, Frederik

    2015-04-01

    In order to investigate temporal seismic velocity changes due to earthquake-related processes and environmental forcing, we analyze 8 years of ambient seismic noise recorded by the Integrated Plate Boundary Observatory Chile (IPOC) network in northern Chile between 18° and 25° S. The Mw 7.7 Tocopilla earthquake in 2007 and the Mw 8.1 Iquique earthquake in 2014 as well as numerous smaller events occurred in this area. By autocorrelation of the ambient seismic noise field, approximations of the Green's functions are retrieved. The recovered function represents backscattered or multiply scattered energy from the immediate neighborhood of the station. To detect relative changes of the seismic velocities we apply the stretching method, which compares individual autocorrelation functions to stretched or compressed versions of a long-term averaged reference autocorrelation function. We use time windows in the coda of the autocorrelations that contain scattered waves which are highly sensitive to minute changes in the velocity. At station PATCX we observe seasonal changes in seismic velocity as well as temporary velocity reductions in the frequency range of 4-6 Hz. The seasonal changes can be attributed to thermal stress changes in the subsurface related to variations of the atmospheric temperature. This effect can be modeled well by a sine curve and is subtracted for further analysis of short-term variations. Temporary velocity reductions occur at times of ground shaking, usually caused by earthquakes, and are followed by a recovery. We present an empirical model that describes the seismic velocity variations based on continuous observations of the local ground acceleration. Our hypothesis is that not only the shaking of earthquakes provokes velocity drops, but any small vibrations continuously induce minor velocity variations that are immediately compensated by healing in the steady state. We show that the shaking effect is accumulated over time and best described by the integrated envelope of the ground acceleration over 1 day, which is the discretization interval of the velocity measurements. In our model the amplitude of the velocity reduction as well as the recovery time are proportional to the size of the excitation. This model, with two free scaling parameters for the shaking-induced velocity variation, fits the data in remarkable detail. Additionally, a linear trend is observed that might be related to a recovery process from one or more earthquakes before our measurement period. For the Tocopilla earthquake in 2007 and the Iquique earthquake in 2014 velocity reductions are also observed at other stations of the IPOC network. However, a clear relationship between the ground shaking and the induced velocity reductions is not visible at other stations. We attribute the outstanding sensitivity of PATCX to ground shaking to the special geological setting of the station, where the material consists of relatively loose conglomerate with high pore volume.
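
    A minimal sketch of the stretching technique described above, under simplifying assumptions (synthetic traces, a single coda window, no weighting): the reference autocorrelation is resampled with a range of stretch factors and correlated with the current trace, and with the sign convention used here the best-fitting stretch factor approximates dv/v. This is illustrative only, not the authors' implementation.

    ```python
    import numpy as np

    def stretching_dvv(ref, cur, t, eps_grid):
        """Grid-search the stretch factor eps that best maps the reference trace onto the
        current trace; with this convention, the best eps approximates dv/v."""
        best_eps, best_cc = 0.0, -np.inf
        for eps in eps_grid:
            ref_stretched = np.interp(t * (1.0 + eps), t, ref)
            cc = np.corrcoef(ref_stretched, cur)[0, 1]
            if cc > best_cc:
                best_eps, best_cc = eps, cc
        return best_eps, best_cc

    # Synthetic coda-like signals: the "current" trace corresponds to a 0.5% velocity increase.
    t = np.linspace(0.0, 30.0, 6001)
    ref = np.exp(-0.1 * t) * np.sin(2.0 * np.pi * 5.0 * t)
    true_dvv = 0.005
    cur = np.exp(-0.1 * t * (1.0 + true_dvv)) * np.sin(2.0 * np.pi * 5.0 * t * (1.0 + true_dvv))

    eps, cc = stretching_dvv(ref, cur, t, np.linspace(-0.02, 0.02, 401))
    print(f"estimated dv/v = {eps:.4f} (true value {true_dvv}), correlation = {cc:.3f}")
    ```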

  17. A Memory-Based Programmable Logic Device Using Look-Up Table Cascade with Synchronous Static Random Access Memories

    NASA Astrophysics Data System (ADS)

    Nakamura, Kazuyuki; Sasao, Tsutomu; Matsuura, Munehiro; Tanaka, Katsumasa; Yoshizumi, Kenichi; Nakahara, Hiroki; Iguchi, Yukihiro

    2006-04-01

    A large-scale memory-technology-based programmable logic device (PLD) using a look-up table (LUT) cascade is developed in the 0.35-μm standard complementary metal oxide semiconductor (CMOS) logic process. Eight 64 K-bit synchronous SRAMs are connected to form an LUT cascade with a few additional circuits. The features of the LUT cascade include: 1) a flexible cascade connection structure, 2) multi-phase pseudo-asynchronous operation with synchronous static random access memory (SRAM) cores, and 3) LUT-bypass redundancy. The chip operates at 33 MHz in an 8-LUT cascade configuration while dissipating 122 mW. Benchmark results show that it achieves performance comparable to field-programmable gate arrays (FPGAs).
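
    As a rough functional sketch of the LUT-cascade idea (not the fabricated chip's architecture, sizes or memory map), the example below evaluates a multi-input Boolean function as a chain of small look-up tables, where each stage consumes a few new primary inputs together with the 'rail' value produced by the previous stage.

    ```python
    # Toy LUT cascade: each stage is a dict mapping (rail_value, new_input_bits) -> rail_value.
    # Stage sizes and the target function are illustrative, not the device's configuration.

    def eval_cascade(stages, inputs, inputs_per_stage):
        rails = 0  # rail value entering the first LUT
        idx = 0
        for lut in stages:
            chunk = tuple(inputs[idx: idx + inputs_per_stage])
            rails = lut[(rails, chunk)]
            idx += inputs_per_stage
        return rails

    def make_parity_stage():
        """One LUT stage with a 1-bit rail and 2 new inputs, computing running parity."""
        return {(r, (a, b)): r ^ a ^ b for r in (0, 1) for a in (0, 1) for b in (0, 1)}

    # Two-stage cascade computing the parity of 4 primary inputs.
    stages = [make_parity_stage(), make_parity_stage()]
    for x in [(0, 0, 0, 0), (1, 0, 1, 0), (1, 1, 1, 0), (1, 1, 1, 1)]:
        print(x, "->", eval_cascade(stages, x, inputs_per_stage=2))
    ```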

  18. Medium-high frequency FBG accelerometer with integrative matrix structure.

    PubMed

    Dai, Yutang; Yin, Guanglin; Liu, Bin; Xu, Gang; Karanja, Joseph Muna

    2015-04-10

    To meet the requirements of medium-high frequency vibration monitoring, a new type of fiber Bragg grating (FBG) accelerometer with an integrative matrix structure is proposed. Two symmetrical flexible gemels are used as elastic elements; when an exciting vibration is present, they drive their respective inertial masses in opposite directions, doubling the wavelength shift of the FBG. A mechanics model and a numerical method are presented in this paper, by which the influence of the structural parameters on the sensitivity and eigenfrequency is discussed. A sensitivity higher than 200 pm/g and an eigenfrequency larger than 3000 Hz can each be realized separately, but not simultaneously. Aiming for a broader measuring frequency range, a prototype accelerometer with an eigenfrequency near 3000 Hz is designed, and results from a shake table test are also demonstrated.
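
    The sensitivity/eigenfrequency trade-off noted above follows from simple spring-mass behaviour: the static displacement of the seismic mass per unit acceleration equals 1/omega_0^2, so raising the eigenfrequency necessarily lowers the displacement-driven wavelength sensitivity. The masses and stiffnesses below are generic illustrations, not the sensor's actual design parameters.

    ```python
    import math

    def eigenfrequency_hz(k, m):
        """Natural frequency of a simple spring-mass oscillator (Hz)."""
        return math.sqrt(k / m) / (2.0 * math.pi)

    def static_sensitivity(k, m):
        """Displacement per unit acceleration, m per (m/s^2); equals m/k = 1/omega_0^2."""
        return m / k

    # Generic example: doubling the stiffness raises f0 by sqrt(2) but halves the sensitivity.
    m = 5.0e-3                # kg, illustrative seismic mass (not from the paper)
    for k in (2.0e3, 4.0e3):  # N/m, illustrative stiffnesses (not from the paper)
        print(f"k = {k:.0f} N/m: f0 = {eigenfrequency_hz(k, m):.0f} Hz, "
              f"sensitivity = {static_sensitivity(k, m):.2e} m/(m/s^2)")
    ```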

  19. Metadata and network API aspects of a framework for storing and retrieving civil infrastructure monitoring data

    NASA Astrophysics Data System (ADS)

    Wong, John-Michael; Stojadinovic, Bozidar

    2005-05-01

    A framework has been defined for storing and retrieving civil infrastructure monitoring data over a network. The framework consists of two primary components: metadata and network communications. The metadata component provides the descriptions and data definitions necessary for cataloging and searching monitoring data. The communications component provides Java classes for remotely accessing the data. Packages of Enterprise JavaBeans and data handling utility classes are written to use the underlying metadata information to build real-time monitoring applications. The utility of the framework was evaluated using wireless accelerometers on a shaking table earthquake simulation test of a reinforced concrete bridge column. The NEESgrid data and metadata repository services were used as a backend storage implementation. A web interface was created to demonstrate the utility of the data model and provides an example health monitoring application.

  20. MyEEW: A Smartphone App for the ShakeAlert System

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Allen, S.; Allen, R. M.; Hellweg, M.

    2015-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The UC Berkeley Seismological Laboratory has created a smartphone app called MyEEW, which interfaces with the ShakeAlert system to deliver early warnings to individual users. Many critical facilities (transportation, police, and fire) have control rooms, which could run a centralized interface, but our ShakeAlert Beta Testers have also expressed their need for mobile options. This app augments the basic ShakeAlert Java desktop applet by allowing workers off-site (or merely out of hearing range) to be informed of coming hazards. MyEEW receives information from the ShakeAlert system to provide users with real-time information about shaking that is about to happen at their individual location. It includes a map, timer, and earthquake information similar to the Java desktop User Display. The app will also feature educational material to help users craft their own response and resiliency strategies. The app will be open to UC Berkeley Earthquake Research Affiliates members for testing in the near future.

  1. JPRS Report, Science & Technology, USSR: Chemistry.

    DTIC Science & Technology

    1987-07-15

    dust and gas emissions from ferrous and nonferrous metallurgical facilities on vegetable crops of 42 collective farms within a 10-15 km radius...the dust and gas wastes were determined to have adverse effects on vegetable crops. ...academy's Institute of Physical-Organic Chemistry, head of the republic large-scale program "Membrana"] [Abstract] The author assesses progress in

  2. VHSIC Electronics and the Cost of Air Force Avionics in the 1990s

    DTIC Science & Technology

    1990-11-01

    circuit. LRM Line replaceable module. LRU Line replaceable unit. LSI Large-scale integration. LSTTL Low-power Schottky Transistor-to-Transistor Logic...displays, communications/navigation/identification, electronic combat equipment, dispensers, and computers. These CERs, which statistically relate the...some of the reliability numbers, and adding the F-15 and F-16 to obtain the data sample shown in Table 6. Both suite costs and reliability statistics

  3. Validation of a modified table to map the 1998 Abbreviated Injury Scale to the 2008 scale and the use of adjusted severities.

    PubMed

    Tohira, Hideo; Jacobs, Ian; Mountain, David; Gibson, Nick; Yeo, Allen; Ueno, Masato; Watanabe, Hiroaki

    2011-12-01

    The Abbreviated Injury Scale 2008 (AIS 2008) is the most recent injury coding system. A mapping table from the previous AIS 98 to AIS 2008 is available; however, this table contains AIS 98 codes that cannot be mapped to AIS 2008 codes, and some AIS 98 codes map to multiple candidate AIS 2008 codes with different severities. We aimed to modify the original table to adjust the severities and to validate these changes. We modified the original table by adding links from unmappable AIS 98 codes to AIS 2008 codes. We applied the original table and our modified table to AIS 98 codes for major trauma patients. For candidate codes with different severities, we assigned the weighted average of those severities as an adjusted severity. The proportion of cases whose injury severity scores (ISSs) could be computed was compared. We also compared the agreement of the ISS and New ISS (NISS) between manually determined AIS 2008 codes (MAN) and codes mapped using our table (MAP), with unadjusted or adjusted severities. ISSs could be computed for all cases with our modified table, compared with 72.3% of cases with the original table. The agreement between MAN and MAP with respect to the ISS and NISS was substantial (intraclass correlation coefficient = 0.939 for ISS and 0.943 for NISS). Using adjusted severities, the agreements for the ISS and NISS improved to 0.953 (p = 0.11) and 0.963 (p = 0.007), respectively. Our modified mapping table seems to allow more ISSs to be computed than the original table. Severity scores exhibited substantial agreement between MAN and MAP, and the use of adjusted severities improved these agreements further.
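
    As a schematic of the severity-adjustment and scoring steps described above, the sketch below maps an AIS 98 code to several candidate AIS 2008 severities, takes their weighted average as the adjusted severity, and computes an ISS as the sum of squares of the three highest severities in distinct body regions. The mapping entries, weights and codes are invented for illustration; they are not the authors' table.

    ```python
    # Hypothetical mapping: AIS 98 code -> list of (candidate AIS 2008 severity, weight).
    MAPPED_SEVERITIES = {
        "450212.3": [(3, 0.7), (2, 0.3)],  # illustrative entries only
        "854461.2": [(2, 1.0)],
    }

    def adjusted_severity(ais98_code):
        """Weighted average of candidate AIS 2008 severities (the 'adjusted severity')."""
        candidates = MAPPED_SEVERITIES[ais98_code]
        return sum(sev * w for sev, w in candidates) / sum(w for _, w in candidates)

    def iss(region_severities):
        """ISS: sum of squares of the three highest severities, one per body region."""
        top3 = sorted(region_severities.values(), reverse=True)[:3]
        return sum(s ** 2 for s in top3)

    # One injury per body region; adjusted severities are rounded before ISS computation.
    severities = {
        "head": round(adjusted_severity("450212.3")),
        "chest": 4,
        "extremities": round(adjusted_severity("854461.2")),
    }
    print("adjusted head severity:", adjusted_severity("450212.3"))
    print("ISS:", iss(severities))
    ```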

  4. Compiling and using input-output frameworks through collaborative virtual laboratories.

    PubMed

    Lenzen, Manfred; Geschke, Arne; Wiedmann, Thomas; Lane, Joe; Anderson, Neal; Baynes, Timothy; Boland, John; Daniels, Peter; Dey, Christopher; Fry, Jacob; Hadjikakou, Michalis; Kenway, Steven; Malik, Arunima; Moran, Daniel; Murray, Joy; Nettleton, Stuart; Poruschi, Lavinia; Reynolds, Christian; Rowley, Hazel; Ugon, Julien; Webb, Dean; West, James

    2014-07-01

    Compiling, deploying and utilising large-scale databases that integrate environmental and economic data have traditionally been labour- and cost-intensive processes, hindered by the large amount of disparate and misaligned data that must be collected and harmonised. The Australian Industrial Ecology Virtual Laboratory (IELab) is a novel, collaborative approach to compiling large-scale environmentally extended multi-region input-output (MRIO) models. The utility of the IELab product is greatly enhanced by avoiding the need to lock in an MRIO structure at the time the MRIO system is developed. The IELab advances the idea of the "mother-daughter" construction principle, whereby a regionally and sectorally very detailed "mother" table is set up, from which "daughter" tables are derived to suit specific research questions. By introducing a third tier - the "root classification" - IELab users are able to define their own mother-MRIO configuration, at no additional cost in terms of data handling. Customised mother-MRIOs can then be built, which maximise disaggregation in aspects that are useful to a family of research questions. The second innovation in the IELab system is to provide a highly automated collaborative research platform in a cloud-computing environment, greatly expediting workflows and making these computational benefits accessible to all users. Combining these two aspects realises many benefits. The collaborative nature of the IELab development project allows significant savings in resources. Timely deployment is possible by coupling automation procedures with the comprehensive input from multiple teams. User-defined MRIO tables, coupled with high performance computing, mean that MRIO analysis will be useful and accessible for a great many more research applications than would otherwise be possible. By ensuring that a common set of analytical tools such as for hybrid life-cycle assessment is adopted, the IELab will facilitate the harmonisation of fragmented, dispersed and misaligned raw data for the benefit of all interested parties. Copyright © 2014 Elsevier B.V. All rights reserved.
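
    A minimal sketch of the 'mother-daughter' construction described above, assuming the simplest case: a detailed transactions table defined on a fine classification is aggregated into a coarser, user-defined daughter classification through a concordance (aggregation) matrix applied to both the selling and buying dimensions. The sectors and numbers are made up; this is not IELab code.

    ```python
    import numpy as np

    # Detailed "mother" transactions table on 4 fine sectors (rows sell to columns).
    T_mother = np.array([[10.0,  2.0,  0.0,  1.0],
                         [ 3.0, 20.0,  4.0,  0.0],
                         [ 0.0,  5.0, 15.0,  2.0],
                         [ 1.0,  0.0,  3.0,  8.0]])

    # Concordance: entry (i, j) = 1 if daughter sector i contains mother sector j.
    C = np.array([[1.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0]])

    # Aggregate both dimensions: T_daughter = C * T_mother * C^T.
    T_daughter = C @ T_mother @ C.T
    print(T_daughter)  # 2x2 daughter table; sector totals are preserved by the aggregation
    ```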

  5. Changes in hot spring temperature and hydrogeology of the Alpine Fault hanging wall, New Zealand, induced by distal South Island earthquakes

    NASA Astrophysics Data System (ADS)

    Cox, S.; Menzies, C. D.; Sutherland, R.; Denys, P. H.; Chamberlain, C. J.; Teagle, D. A. H.

    2014-12-01

    In response to large distant earthquakes, Copland hot spring cooled c. 1 °C and changed fluid chemistry. Thermal springs in the Southern Alps, New Zealand, originate through penetration of fluids into a thermal anomaly generated by rapid uplift and exhumation on the Alpine Fault. Copland hot spring (43.629S, 169.946E) is one of the most vigorously flowing and hottest of the springs, discharging strongly effervescent CO2-rich 56-58 °C water at 6 ± 1 L/s. Shaking from the Mw7.8 Dusky Sound (Fiordland) 2009 and Mw7.1 Darfield (Canterbury) 2010 earthquakes, 350 and 180 km from the spring respectively, resulted in a characteristic c. 1 °C delayed cooling over five days. The cooling responses occurred at low shaking intensities (MM III-IV) and seismic energy densities (~10⁻¹ J m⁻³) from intermediate-field distances, independent of variations in spectral frequency, and without the need for post-seismic recovery before the next cooling occurred. Such shaking can be expected approximately every 1-10 years in the central Southern Alps. Observed temperature and fluid chemistry responses are inferred to reflect subtle changes in the fracture permeability of the schist mountains adjacent to the spring. Relatively low-intensity shaking induced small permanent 10⁻⁷-10⁻⁶ strains across the Southern Alps, opening fractures which enhance mixing of relatively cool near-surface groundwater with upwelling hot water. Hydrothermal systems situated in places of active deformation and tectonic and topographic stress may be particularly susceptible to earthquake-induced transience that, if monitored, may provide important information on difficult-to-measure hydrogeological properties within active orogens.

  6. Detection of ground motions using high-rate GPS time-series

    NASA Astrophysics Data System (ADS)

    Psimoulis, Panos A.; Houlié, Nicolas; Habboub, Mohammed; Michel, Clotaire; Rothacher, Markus

    2018-05-01

    Monitoring surface deformation in real time helps in planning and protecting infrastructures and populations, managing sensitive production (i.e. SEVESO-type facilities) and mitigating the long-term consequences of modifications already implemented. We present RT-SHAKE, an algorithm developed to detect ground motions associated with landslides, sub-surface collapses, subsidence, earthquakes or rock falls. RT-SHAKE first detects transient changes in individual GPS time series, then investigates the spatial correlation of observations made at neighbouring GPS sites and eventually issues a motion warning. To assess our algorithm on fast (seconds to a minute), large (from 1 cm to metres) and spatially consistent surface motions, we use the 1 Hz GEONET GNSS network data of the 2011 MW9.0 Tohoku-Oki earthquake as a test scenario. We show that the delay in detecting the seismic-wave arrival from GPS records is ~10 seconds with respect to an identical analysis based on strong-motion data, and that this delay depends on the level of the time-variable noise. Nevertheless, based on the analysis of the GPS network noise level and a ground-motion stochastic model, we show that RT-SHAKE, if associated with a real-time automatic earthquake location system, can narrow the range of earthquake magnitude by setting a lower detection threshold of MW6.5-7.
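
    A highly simplified sketch of the two-step logic described for RT-SHAKE, assuming per-station displacement time series: a station is flagged when a short-term average of the signal amplitude exceeds a threshold times its long-term average, and a motion warning is issued only when several stations are flagged at the same time (a stand-in for the spatial-correlation step across neighbouring sites). Window lengths, thresholds and the synthetic data are placeholders, not the algorithm's actual parameters.

    ```python
    import numpy as np

    def trailing_mean(x, n):
        """Trailing moving average of length n (shorter windows at the start of the series)."""
        c = np.cumsum(x)
        out = np.empty_like(x)
        out[:n] = c[:n] / np.arange(1, n + 1)
        out[n:] = (c[n:] - c[:-n]) / n
        return out

    def sta_lta_flags(x, n_sta=5, n_lta=100, threshold=4.0):
        """Flag samples where the short-term mean of |x| exceeds threshold * long-term mean."""
        abs_x = np.abs(x)
        return trailing_mean(abs_x, n_sta) / (trailing_mean(abs_x, n_lta) + 1e-12) > threshold

    def network_warning(station_series, min_stations=3):
        """Return the first sample at which at least min_stations are flagged simultaneously."""
        flags = np.array([sta_lta_flags(x) for x in station_series])
        hits = np.where(flags.sum(axis=0) >= min_stations)[0]
        return int(hits[0]) if hits.size else None

    # Synthetic 1 Hz displacement series for 5 stations: noise plus a common offset at sample 300.
    rng = np.random.default_rng(0)
    series = []
    for _ in range(5):
        x = rng.normal(0.0, 0.005, 600)  # ~5 mm noise level
        x[300:] += 0.15                  # 15 cm coseismic-style offset
        series.append(x)

    print("first warning sample:", network_warning(series))
    ```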

  7. Installation, care, and maintenance of wood shake and shingle siding

    Treesearch

    Jack Dwyer; Tony Bonura; Arnie Nebelsick; Sam Williams; Christopher G. Hunt

    2011-01-01

    This article gives general guidelines for selection, installation, finishing, and maintenance of wood shakes and shingles. The authors gathered information from a variety of sources: research publications on wood finishing, technical data sheets from paint manufacturers, installation instructions for shake and shingle siding, and interviews with experts having...

  8. Oxygen transfer phenomena in 48-well microtiter plates: determination by optical monitoring of sulfite oxidation and verification by real-time measurement during microbial growth.

    PubMed

    Kensy, Frank; Zimmermann, Hartmut F; Knabben, Ingo; Anderlei, Tibor; Trauthwein, Harald; Dingerdissen, Uwe; Büchs, Jochen

    2005-03-20

    Oxygen limitation is one of the most frequent problems associated with the application of shaking bioreactors. The gas-liquid oxygen transfer properties of shaken 48-well microtiter plates (MTPs) were analyzed at different filling volumes, shaking diameters, and shaking frequencies. On the one hand, an optical method based on sulfite oxidation was used as a chemical model system to determine the maximum oxygen transfer capacity (OTR(max)). On the other hand, the Respiration Activity Monitoring System (RAMOS) was applied for online measurement of the oxygen transfer rate (OTR) during growth of the methylotrophic yeast Hansenula polymorpha. A proportionality constant between the OTR(max) of the biological system and the OTR(max) of the chemical system was indicated by these data, offering the possibility of transforming the whole set of chemical data to biologically relevant conditions. The results revealed "out of phase" shaking conditions at a shaking diameter of 1 mm, which were confirmed by theoretical considerations using the phase number (Ph). At larger shaking diameters (2-50 mm) the oxygen transfer rate in MTPs shaken at high frequencies reached values of up to 0.28 mol/L/h, corresponding to a volumetric mass transfer coefficient (k(L)a) of 1,600 1/h. The specific mass transfer area (a) increases exponentially with the shaking frequency up to values of 2,400 1/m. In contrast, the mass transfer coefficient (k(L)) is constant at a level of about 0.15 m/h over a wide range of shaking frequencies and shaking diameters. However, at high shaking frequencies, when the complete liquid volume forms a thin film on the cylindrical wall of the well, the mass transfer coefficient (k(L)) increases linearly to values of up to 0.76 m/h. Essentially, the present investigation demonstrates that the 48-well plate outperforms the 96-well MTP and shake flasks at widely used operating conditions with respect to oxygen supply. The 48-well plates emerge, therefore, as an excellent alternative for microbial cultivation and expression studies, combining the advantages of both the high-throughput 96-well MTP and the classical shaken Erlenmeyer flask.
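
    For orientation, the maximum oxygen transfer capacity and the volumetric mass transfer coefficient quoted above are related by OTRmax = kLa * c*, where c* is the oxygen saturation concentration of the medium. The back-of-envelope check below assumes a saturation concentration of roughly 0.2 mmol/L for an air-sparged aqueous medium; the exact value depends on medium composition and temperature and is not taken from the paper.

    ```python
    # Consistency check: OTRmax = kLa * c_sat, with an assumed saturation concentration.
    otr_max = 0.28          # mol/(L h), reported maximum oxygen transfer capacity
    kla = 1600.0            # 1/h, reported volumetric mass transfer coefficient
    c_sat_assumed = 2.0e-4  # mol/L, assumed oxygen saturation concentration (not from the paper)

    print(f"implied c_sat   = {otr_max / kla:.2e} mol/L")
    print(f"implied OTR_max = {kla * c_sat_assumed:.2f} mol/(L h)")
    ```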

  9. San Andreas fault geometry at Desert Hot Springs, California, and its effects on earthquake hazards and groundwater

    USGS Publications Warehouse

    Catchings, R.D.; Rymer, M.J.; Goldman, M.R.; Gandhok, G.

    2009-01-01

    The Mission Creek and Banning faults are two of the principal strands of the San Andreas fault zone in the northern Coachella Valley of southern California. Structural characteristics of the faults affect both regional earthquake hazards and local groundwater resources. We use seismic, gravity, and geological data to characterize the San Andreas fault zone in the vicinity of Desert Hot Springs. Seismic images of the upper 500 m of the Mission Creek fault at Desert Hot Springs show multiple fault strands distributed over a 500 m wide zone, with concentrated faulting within a central 200 m wide area of the fault zone. High-velocity (up to 5000 m/sec) rocks on the northeast side of the fault are juxtaposed against low-velocity basin deposits southwest of the fault. Moderate-magnitude (about M 6.0) earthquakes in the area (in 1948 and 1986) occurred at or near the depths (~10 to 12 km) of the merged (San Andreas) fault. Large-magnitude earthquakes that nucleate at or below the merged fault will likely generate strong shaking from guided waves along both fault zones and from amplified seismic waves in the low-velocity basin between the two fault zones. The Mission Creek fault zone is a groundwater barrier, with the top of the water table varying by 60 m in depth and the aquifer varying by about 50 m in thickness across a 200 m wide zone of concentrated faulting.

  10. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  11. Applying Rasch Model and Generalizability Theory to Study Modified-Angoff Cut Scores

    ERIC Educational Resources Information Center

    Arce, Alvaro J.; Wang, Ze

    2012-01-01

    The traditional approach to scale modified-Angoff cut scores transfers the raw cuts to an existing raw-to-scale score conversion table. Under the traditional approach, cut scores and conversion table raw scores are not only seen as interchangeable but also as originating from a common scaling process. In this article, we propose an alternative…

  12. VIEW OF SHEAR (ELECTRIC POWERED), SCALE HOUSE TO LEFT. BARS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF SHEAR (ELECTRIC POWERED), SCALE HOUSE TO LEFT. BARS ARE PLACED ON WEIGHING SCALE SHOWN LOWER LEFT. 15-TON CLEVELAND CRANE HANDLES BARS FOR FINAL LOADING INTO RAILROAD CARS (12" BAY) AND FOR MOVING FROM TABLE TO SHEAR TABLE. - Cambria Iron Company, Gautier Works, 12" Mill, Clinton Street & Little Conemaugh River, Johnstown, Cambria County, PA

  13. Real-Time Science on Social Media: The Example of Twitter in the Minutes, Hours, Days after the 2015 M7.8 Nepal Earthquake

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Bossu, R.; Mazet-Roux, G.

    2015-12-01

    Scientific information on disasters such as earthquakes typically comes firstly from official organizations, news reports and interviews with experts, and later from scientific presentations and peer-reviewed articles. With the advent of the Internet and social media, this information is available in real-time from automated systems and within a dynamic, collaborative interaction between scientific experts, responders and the public. After the 2015 M7.8 Nepal earthquake, Twitter Tweets from earth scientists* included information, analysis, commentary and discussion on earthquake parameters (location, size, mechanism, rupture extent, high-frequency radiation, …), earthquake effects (distribution of felt shaking and damage, triggered seismicity, landslides, …), earthquake rumors (e.g. the imminence of a larger event) and other earthquake information and observations (aftershock forecasts, statistics and maps, source and regional tectonics, seismograms, GPS, InSAR, photos/videos, …). In the future (while taking into account security, false or erroneous information and identity verification), collaborative, real-time science on social media after a disaster will give earlier and better scientific understanding and dissemination of public information, and enable improved emergency response and disaster management. *A sample of scientific Tweets after the 2015 Nepal earthquake: In the first minutes: "mb5.9 Mwp7.4 earthquake Nepal 2015.04.25-06:11:25UTC", "Major earthquake shakes Nepal 8 min ago", "Epicenter between Pokhara and Kathmandu", "Major earthquake shakes Nepal 18 min ago. Effects derived from witnesses' reports". In the first hour: "shallow thrust faulting to North under Himalayas", "a very large and shallow event ... Mw7.6-7.7", "aftershocks extend east and south of Kathmandu, so likely ruptured beneath city", "Valley-blocking landslides must be a very real worry". In the first day: "M7.8 earthquake in Nepal 2hr ago: destructive in Kathmandu Valley and widely felt in India", "USGS pager v.3 contains initial fatality & economic loss estimates", "analysis of seismic waves … shows fault rupture lasted 80 sec, shaking longer", "aftershocks suggests rupture zone, directivity and shaking intensity".

  14. Co-Seismic Mass Displacement and its Effect on Earth's Rotation and Gravity

    NASA Technical Reports Server (NTRS)

    Chao, B. F.; Gross, R. S.

    2004-01-01

    Mantle processes often involve large-scale mass transport, ranging from mantle convection, tectonic motions, and glacial isostatic adjustment to tides, atmospheric and oceanic loadings, volcanism and seismicity. On very short time scales of less than an hour, a co-seismic event, apart from the "shaking" that is the earthquake, leaves behind permanent (step-function-like) displacements in the crust and mantle. This redistribution of mass changes the Earth's inertia tensor (and hence Earth's rotation in both length-of-day and polar motion), and the gravity field. The question is whether these effects are large enough to be of any significance. In this paper we report updated calculation results based on Chao & Gross. The calculation uses the normal mode summation scheme, applied to over twenty thousand major earthquakes that occurred during 1976-2002, according to source mechanism solutions given by the Harvard Centroid Moment Tensor catalog. Compared to the truly large ones earlier in the century, the earthquakes we study are individually all too small to have left any discernible signature in geodetic records of Earth rotation or the global gravity field. However, their collective effects continue to exhibit extremely strong statistical tendencies, conspiring to decrease J2 and J22 while shortening the LOD, resulting in a rounder and more compact Earth. A strong tendency is also seen in the earthquakes trying to "nudge" the Earth rotation pole towards approximately 140 degrees E, roughly opposite to the observed polar drift direction. Currently, the Gravity Recovery And Climate Experiment (GRACE) is measuring the time-variable gravity to high degree and order with unprecedented accuracy. Our results show that great earthquakes such as the 1960 Chilean or 1964 Alaskan events cause gravitational field changes that are large enough to be detected by GRACE.

  15. Hotspots, Lifelines, and the Safrr Haywired Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  16. Identifying areas of basin-floor recharge in the Trans-Pecos region and the link to vegetation

    USGS Publications Warehouse

    Walvoord, Michelle Ann; Phillips, Fred M.

    2004-01-01

    Comparative water potential and chloride profiles (∼10 m deep) collected from four vegetation communities in the Trans-Pecos region of the Chihuahuan Desert were assessed to evaluate the potential for using vegetation patterns as a means of efficiently improving large-scale estimates of basin-floor recharge in semiarid and arid regions. Analytical solutions and multiphase flow and transport modeling constrained flux histories and current fluxes across the water table at each site. Chloride bulge profiles containing ∼12–15 kyr of atmospheric deposition and long-term drying water potential profiles typified most desertscrub and grassland sites. In contrast, evidence of episodic sub-root zone percolation and chloride profiles containing <250 yr of atmospheric deposition characterized the woodland site. The results suggested that the desertscrub and grassland areas support small upward fluxes across the water table (nonrecharge), whereas the woodland site supports significant downward fluxes across the water table (recharge). A nonrecharge–recharge transition was identified to be collocated with a grassland–woodland ecotone. The establishment of vegetation–recharge relationships such as this will improve estimates of basin-scale recharge by identifying regions where no recharge is expected and regions where recharge is expected and point measurements should be concentrated. An approach integrating remotely sensed spatial distributions of vegetation and indicator relationships to recharge is both timely and warranted, although several caveats, as revealed in this study, should be noted. For example, the relative importance and distribution of vertical conduits that permit percolation to the water table merits future investigation.

  17. Multiple-methods investigation of recharge at a humid-region fractured rock site, Pennsylvania, USA

    USGS Publications Warehouse

    Heppner, C.S.; Nimmo, J.R.; Folmar, G.J.; Gburek, W.J.; Risser, D.W.

    2007-01-01

    Lysimeter-percolate and well-hydrograph analyses were combined to evaluate recharge for the Masser Recharge Site (central Pennsylvania, USA). In humid regions, aquifer recharge through an unconfined low-porosity fractured-rock aquifer can cause large magnitude water-table fluctuations over short time scales. The unsaturated hydraulic characteristics of the subsurface porous media control the magnitude and timing of these fluctuations. Data from multiple sets of lysimeters at the site show a highly seasonal pattern of percolate and exhibit variability due to both installation factors and hydraulic property heterogeneity. Individual event analysis of well hydrograph data reveals the primary influences on water-table response, namely rainfall depth, rainfall intensity, and initial water-table depth. Spatial and seasonal variability in well response is also evident. A new approach for calculating recharge from continuous water-table elevation records using a master recession curve (MRC) is demonstrated. The recharge estimated by the MRC approach when assuming a constant specific yield is seasonal to a lesser degree than the recharge estimate resulting from the lysimeter analysis. Partial reconciliation of the two recharge estimates is achieved by considering a conceptual model of flow processes in the highly-heterogeneous underlying fractured porous medium. © Springer-Verlag 2007.
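
    A bare-bones sketch of the water-table-fluctuation idea behind the MRC approach described above, assuming a constant specific yield: event recharge is estimated as the specific yield times the rise of the water table above the level to which the hydrograph would have receded along the master recession curve. The recession model and all numbers are illustrative placeholders, not the site's fitted curve or data.

    ```python
    # Event-based water-table-fluctuation estimate with an assumed (illustrative) recession.
    specific_yield = 0.01    # assumed constant specific yield (dimensionless)
    h_pre_event = 12.40      # water-table elevation before the rain event (m)
    h_peak = 13.10           # observed peak elevation after the event (m)
    recession_rate = 0.05    # assumed master-recession-curve decline (m/day)
    days_to_peak = 2.0       # time from the start of the event to the observed peak (days)

    # Level the hydrograph would have reached had it kept receding along the MRC.
    h_projected = h_pre_event - recession_rate * days_to_peak

    # Recharge = specific yield * (observed peak - projected recession level).
    recharge_m = specific_yield * (h_peak - h_projected)
    print(f"estimated event recharge ~ {recharge_m * 1000:.1f} mm")
    ```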

  18. Shake, Rattle and Roll--Can Music Be Used by Parents and Practitioners to Support Communication, Language and Literacy within a Pre-School Setting?

    ERIC Educational Resources Information Center

    Harris, Deborah Jayne

    2011-01-01

    The aim of this small-scale study was to evaluate whether music could support communication, language and literacy development within a pre-school setting. The research focused on a music specialist who provided a range of musical activities that engaged both parents and children over a 20-week period. Initial interviews with parents indicated…

  19. Co-Seismic Mass Dislocation and its Effect on Earth's Rotation and Gravity

    NASA Technical Reports Server (NTRS)

    Chao, B. F.; Gross, R. S.

    2002-01-01

    Mantle processes often involve large-scale mass transport, ranging from mantle convection, tectonic motions, and glacial isostatic adjustment to tides, atmospheric and oceanic loadings, volcanism and seismicity. On very short time scales of less than an hour, a co-seismic event, apart from the shaking that is the earthquake, leaves behind permanent (step-function-like) dislocations in the crust and mantle. This redistribution of mass changes the Earth's inertia tensor (and hence Earth's rotation in both length-of-day and polar motion), and the gravity field (in terms of spherical harmonic Stokes coefficients). The question is whether these effects are large enough to be of any significance. In this paper we report updated calculation results based on Chao & Gross (1987). The calculation uses the normal mode summation scheme, applied to nearly twenty thousand major earthquakes that occurred during 1976-2002, according to source mechanism solutions given by the Harvard Centroid Moment Tensor catalog. Compared to the truly large ones earlier in the century, the earthquakes we study are individually all too small to have left any discernible signature in geodetic records of Earth rotation or the global gravity field. However, their collective effects continue to exhibit extremely strong statistical tendencies. For example, earthquakes conspire to decrease J2 and J22 while shortening the LOD, resulting in a rounder and more compact Earth. A strong tendency is also seen in the earthquakes trying to nudge the Earth rotation pole towards approximately 140 degrees E, roughly opposite to the observed polar drift direction. The geophysical significance and implications will be further studied.

  20. Co-Seismic Mass Dislocation and Its Effect on Earth's Rotation and Gravity

    NASA Technical Reports Server (NTRS)

    Chao, Benjamin F.

    1999-01-01

    Mantle processes often involve large-scale mass transport, ranging from mantle convection, tectonic motions, and glacial isostatic adjustment to tides, atmospheric and oceanic loadings, volcanism and seismicity. On very short time scales of less than an hour, a co-seismic event, apart from the "shaking" that is the earthquake, leaves behind permanent (step-function-like) dislocations in the crust and mantle. This redistribution of mass changes the Earth's inertia tensor (and hence Earth's rotation in both length-of-day and polar motion), and the gravity field (in terms of spherical harmonic Stokes coefficients). The question is whether these effects are large enough to be of any significance. In this paper we report updated calculation results. The calculation uses the normal mode summation scheme, applied to 15,814 major earthquakes that occurred during 1976-1998, according to source mechanism solutions given by the Harvard Centroid Moment Tensor catalog. Compared to the truly large ones earlier in the century, the earthquakes we study are individually all too small to have left any discernible signature in geodetic records of Earth rotation or the global gravity field. However, their collective effects continue to exhibit extremely strong statistical tendencies. For example, earthquakes conspire to decrease J2 and J22 while shortening the LOD, resulting in a rounder and more compact Earth. A strong tendency is also seen in the earthquakes trying to "nudge" the Earth rotation pole towards about 140 degrees E, roughly opposite to the observed polar drift direction. The geophysical significance and implications will be further studied.

  1. Installation, care, and maintenance of wood shake and shingle roofs

    Treesearch

    Tony Bonura; Jack Dwyer; Arnie Nebelsick; Brent Stuart; R. Sam Williams; Christopher Hunt

    2011-01-01

    This article gives general guidelines for selection, installation, finishing, and maintenance of wood shake and shingle roofs. The authors have gathered information from a variety of sources: research publications on wood finishing, technical data sheets from paint manufacturers, installation instructions for shake and shingle roofs, and interviews with experts having...

  2. ShakeMap Atlas 2.0: an improved suite of recent historical earthquake ShakeMaps for global hazard analyses and loss model calibration

    USGS Publications Warehouse

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Hearne, M.G.; Marano, K.D.; Lin, K.-W.; Wald, D.J.

    2012-01-01

    We introduce the second version of the U.S. Geological Survey ShakeMap Atlas, which is an openly available compilation of nearly 8,000 ShakeMaps of the most significant global earthquakes between 1973 and 2011. This revision of the Atlas includes: (1) a new version of the ShakeMap software that improves data usage and uncertainty estimations; (2) an updated earthquake source catalogue that includes regional locations and finite fault models; (3) a refined strategy to select prediction and conversion equations based on a new seismotectonic regionalization scheme; and (4) vastly more macroseismic intensity and ground-motion data from regional agencies. All these changes make the new Atlas a self-consistent, calibrated ShakeMap catalogue that constitutes an invaluable resource for investigating near-source strong ground motion, as well as for seismic hazard, scenario, risk, and loss-model development. To this end, the Atlas will provide a hazard base layer for PAGER loss calibration and for the Earthquake Consequences Database within the Global Earthquake Model initiative.

  3. The NASA modern technology rotors program

    NASA Technical Reports Server (NTRS)

    Watts, M. E.; Cross, J. L.

    1986-01-01

    Existing data bases regarding helicopters are based on work conducted on 'old-technology' rotor systems. The Modern Technology Rotors (MTR) Program is intended to provide extensive data bases on rotor systems using present and emerging technology. The MTR is concerned with modern, four-bladed rotor systems presently being manufactured or under development. Aspects of the MTR philosophy are considered along with instrumentation, the MTR test program, the BV 360 Rotor, and the UH-60 Black Hawk. The program phases include computer modelling, shake test, model-scale test, minimally instrumented flight test, extensively pressure-instrumented-blade flight test, and full-scale wind tunnel test.

  4. Interpreting plant responses to clinostating. I - Mechanical stresses and ethylene

    NASA Technical Reports Server (NTRS)

    Salisbury, Frank B.; Wheeler, Raymond M.

    1981-01-01

    The possibility that clinostat mechanical stresses (leaf flopping) induce ethylene production and, thus, the development of epinasty was tested by stressing vertical plants with constant gentle horizontal or vertical shaking or with a quick back-and-forth rotation (twisting). Clinostat leaf flopping was closely approximated by turning plants so that their stems were horizontal, rotating them quickly about the stem axis, and returning them to the vertical, with the treatment repeated every four minutes. It was found that horizontal and vertical shaking, twisting, intermittent horizontal rotating, and gentle hand shaking failed to induce epinasties that approached those observed on the slow clinostat. Minor epinasties were generated by vigorous hand-shaking (120 sec/day) and by daily application of Ag(+). Reducing leaf displacements by inverting plants did not significantly reduce the minor epinasty generated by vigorous hand-shaking.

  5. Liquefaction evidence for the strength of ground motions resulting from Late Holocene Cascadia subduction earthquakes, with emphasis on the event of 1700 A.D.

    USGS Publications Warehouse

    Obermeier, S.F.; Dickenson, S.E.

    2000-01-01

    During the past decade, paleoseismic studies done by many researchers in the coastal regions of the Pacific Northwest have shown that regional downdropping and subsequent tsunami inundation occurred in response to a major earthquake along the Cascadia subduction zone. This earthquake occurred almost certainly in 1700 A.D., and is believed by many to have been of M 8.5-9 or perhaps larger. In order to characterize the severity of ground motions from this earthquake, we report on a field search and analysis of seismically induced liquefaction features. The search was conducted chiefly along the banks of islands in the lowermost Columbia River of Oregon and Washington and in stream banks along smaller rivers throughout southwestern Washington. To a lesser extent, the investigation included rivers in central Oregon. Numerous small- to moderate-sized liquefaction features from the earthquake of 1700 A.D. were found in some regions, but there was a notable lack of liquefaction features in others. The regional distribution of liquefaction features is evaluated as a function of geologic and geotechnical factors in different field settings near the coast. Our use of widely different field settings, in each of which we independently assess the strength of shaking and arrive at the same conclusion, enhances the credibility of our interpretations. Our regional inventory of liquefaction features and preliminary geotechnical analysis of liquefaction potential provide substantial evidence for only moderate levels of ground shaking in coastal Washington and Oregon during the subduction earthquake of 1700 A.D. Additionally, it appears that a similar conclusion can be reached for an earlier subduction earthquake that occurred within the past 1100 years, which also has been characterized by others as being M 8 or greater. On the basis of more limited data for older events collected in our regional study, it appears that seismic shaking has been no stronger throughout Holocene time. Our interpreted levels of shaking are considerably lower than current estimates in the technical literature that use theoretical and statistical models to predict ground motions of subduction earthquakes in the Cascadia region. Because of the influence of estimated ground motions from Cascadia subduction-zone earthquakes on seismic hazard evaluations, more paleoliquefaction and geotechnical field studies are needed to definitively bracket the strength of shaking. With further work, it should be possible to extend the record of seismic shaking through much of Holocene time in large portions of Washington and Oregon.

  6. The Puerto Rico Seismic Network Broadcast System: A user friendly GUI to broadcast earthquake messages, to generate shakemaps and to update catalogues

    NASA Astrophysics Data System (ADS)

    Velez, J.; Huerfano, V.; von Hillebrandt, C.

    2007-12-01

    The Puerto Rico Seismic Network (PRSN) has historically provided locations and magnitudes for earthquakes in the Puerto Rico and Virgin Islands (PRVI) region. PRSN is the reporting authority for the region bounded by latitudes 17.0N to 20.0N, and longitudes 63.5W to 69.0W. The main objective of the PRSN is to record, process, analyze, provide information on, and conduct research into local, regional and teleseismic earthquakes, delivering high quality data and information to respond to the needs of the emergency management, academic and research communities, and the general public. The PRSN runs Earthworm software (Johnson et al., 1995) to acquire and write waveforms to disk for permanent archival. Automatic locations and alerts are generated for events in Puerto Rico, the Intra America Seas, and the Atlantic by the EarlyBird system (Whitmore and Sokolowski, 2002), which monitors PRSN stations as well as some 40 additional stations run by networks operating in North, Central and South America and other sites in the Caribbean. PRDANIS (Puerto Rico Data Analysis and Information System) software, developed by PRSN, supports manual locations and analyst review of automatic locations of events within the PRSN area of responsibility (AOR), using all the broadband, strong-motion and short-period waveforms. Rapidly available information regarding the geographic distribution of ground shaking in relation to the population and infrastructure at risk can assist emergency response communities in efficient and optimized allocation of resources following a large earthquake. The ShakeMap system developed by the USGS provides near real-time maps of instrumental ground motions and shaking intensity and has proven effective in rapid assessment of the extent of shaking and potential damage after significant earthquakes (Wald, 2004). In Northern and Southern California, the Pacific Northwest, and the states of Utah and Nevada, ShakeMaps are used for emergency planning and response, loss estimation, and communication of earthquake information to the public. We developed a tool to help the PRSN personnel on duty generate ShakeMaps for felt events in Puerto Rico and the Virgin Islands. Automatic or reviewed locations come from different sources, and the user can select among several ways to broadcast the message: direct email through service lists, a server/client tool that pushes messages to a remote display client, generation of ShakeMap web pages, and updating of the catalogues.

  7. Acoustic emission energy b-value for local damage evaluation in reinforced concrete structures subjected to seismic loadings

    NASA Astrophysics Data System (ADS)

    Sagasta, Francisco; Zitto, Miguel E.; Piotrkowski, Rosa; Benavent-Climent, Amadeo; Suarez, Elisabet; Gallego, Antolino

    2018-03-01

    A modification of the original b-value (Gutenberg-Richter parameter) is proposed to evaluate local damage of reinforced concrete structures subjected to dynamical loads via the acoustic emission (AE) method. The modification, called the energy b-value for short, is based on the use of the true energy of the AE signals instead of their peak amplitude, traditionally used for the calculation of the b-value. The proposal is physically supported by the strong correlation between the plastic strain energy dissipated by the specimen and the true energy of the AE signals released during its deformation and cracking process, previously demonstrated by the authors in several publications. AE data analysis consisted of the use of guard sensors and the Continuous Wavelet Transform in order to separate primary and secondary emissions as much as possible according to particular frequency bands. The approach has been experimentally applied to the AE signals coming from a scaled reinforced concrete frame structure, which was subjected to sequential seismic loads of increasing peak acceleration by means of a 3 × 3 m² shaking table. For this specimen two beam-column connections, one exterior and one interior, were instrumented with wide-band low-frequency sensors properly attached to the structure. Evolution of the energy b-value along the loading process accompanies the evolution of the severe damage at the critical regions of the structure (beam-column connections), thus making its use promising for structural health monitoring purposes.
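
    A minimal sketch of how a Gutenberg-Richter-type exponent can be estimated from AE signal energies is shown below (Python). It simply fits the slope of the cumulative distribution log10 N(>=E) versus log10 E by least squares; the wavelet-based separation used by the authors and their precise definition of the energy b-value may differ, so this is an illustrative assumption rather than their algorithm.

    ```python
    import numpy as np

    def energy_b_value(energies, n_bins=20):
        """Fit a power-law exponent to AE signal energies: N(>=E) ~ E**(-b).

        Illustrative sketch only; assumes strictly positive energies.
        """
        e = np.sort(np.asarray(energies, dtype=float))
        thresholds = np.logspace(np.log10(e[0]), np.log10(e[-1]), n_bins, endpoint=False)
        counts = np.array([(e >= t).sum() for t in thresholds])
        slope, _ = np.polyfit(np.log10(thresholds), np.log10(counts), 1)
        return -slope  # b-like exponent

    # Example with synthetic power-law energies (true exponent ~1)
    rng = np.random.default_rng(0)
    E = rng.pareto(a=1.0, size=2000) + 1.0
    print(energy_b_value(E))
    ```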

  8. BlueSeis3A - performance, laboratory tests and applications

    NASA Astrophysics Data System (ADS)

    Bernauer, F.; Wassermann, J. M.; de Toldi, E.; Guattari, F.; Ponceau, D.; Ripepe, M.; Igel, H.

    2017-12-01

    One of the most notable emerging developments in seismic instrumentation is the application of fiber optic gyroscopes as portable rotational ground motion sensors. In the framework of the European Research Council project ROMY (ROtational Motions in seismologY), BlueSeis3A was developed in a collaboration between researchers from Ludwig-Maximilians University of Munich, Germany, and the fiber optic sensors manufacturer iXblue, France. With its high sensitivity (20 nrad/s/√Hz) in a broad frequency range (0.001 Hz to 50 Hz), BlueSeis3A opens a variety of applications which were up to now hampered by the lack of such an instrument. In this contribution, we will first present performance characteristics of BlueSeis3A with a focus on timing stability and scale factor linearity. In a second part we demonstrate the benefit of directly measured rotational motion for dynamic tilt correction of measurements made with a classical seismometer. A well-known tilt signal was produced with a shake table and recorded simultaneously with a classical seismometer and BlueSeis3A. The seismometer measurement could be improved significantly by subtracting the coherent tilt signal which was measured directly with the rotational motion sensor. As a last part we show the advantage of directly measured rotational motion for applications in civil engineering. Results from a measurement campaign in the Giotto bell tower in the city of Florence, Italy, show the possibility of direct observation of torsional modes by deploying a rotational motion sensor inside the structure.
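
    The dynamic tilt correction mentioned above can be sketched as follows (Python). It assumes the simple small-angle model in which a horizontal accelerometer sees the true acceleration minus a gravity term g·θ(t) from tilt; sign conventions, instrument responses and the exact processing used in the study are omitted.

    ```python
    import numpy as np

    def tilt_correct(acc, rot_rate, dt, g=9.81):
        """Remove tilt-induced gravity contamination from a horizontal
        acceleration record using a collocated rotation-rate record.

        Small-angle sketch: acc_observed(t) = acc_true(t) - g * theta(t),
        with theta obtained by integrating the measured rotation rate.
        """
        theta = np.cumsum(np.asarray(rot_rate)) * dt   # tilt angle in radians
        return np.asarray(acc) + g * theta             # restore the gravity term
    ```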

  9. Earthquake Response of Reinforced Concrete Building Retrofitted with Geopolymer Concrete and X-shaped Metallic Damper

    NASA Astrophysics Data System (ADS)

    Madheswaran, C. K.; Prakash vel, J.; Sathishkumar, K.; Rao, G. V. Rama

    2017-06-01

    A three-storey half-scale reinforced concrete (RC) building fitted with an X-shaped metallic damper at the ground floor level is designed and fabricated to study its seismic response characteristics. Experimental studies are carried out using the (4 m × 4 m) tri-axial shake-table facility to evaluate the seismic response of a retrofitted RC building with an open ground storey (OGS) structure, using yielding type X-shaped metallic dampers (also called Added Damping and Stiffness (ADAS) elements) and repairing the damaged ground storey columns using geopolymer concrete composites. This elasto-plastic device is normally incorporated within the frame structure between adjacent floors through chevron bracing, so that it efficiently enhances the overall energy dissipation ability of the seismically deficient frame structure under earthquake loading. Free vibration tests on the RC building without and with the yielding type X-shaped metallic damper are carried out. The natural frequencies and mode shapes of the RC building without and with the yielding type X-shaped metallic damper are determined. The retrofitted reinforced concrete building is subjected to earthquake excitations and the response from the structure is recorded. This work discusses the preparation of the test specimen, the experimental set-up, instrumentation, the method of testing of the RC building and the response of the structure. The metallic damper reduces the time period of the structure and the displacement demands on the OGS columns of the structure. Nonlinear time history analysis is performed using the structural analysis package SAP2000.

  10. Large-Scale Physical Models of Thermal Remediation of DNAPL Source Zones in Aquitards

    DTIC Science & Technology

    2009-05-01

    pressure at the bottom of the tank. The higher pressure is reflected in higher measured water levels in external gauges. Figure 63: 3D Cross...than atmospheric. This higher pressure can raise the apparent water level in a sight gauge or external overflow and can even drive more fluid through...the water table. All met or exceeded their goals. Typical turnkey unit costs (including design, permitting, fabrication, mobilization, drilling

  11. Evaluating earthquake hazards in the Los Angeles region; an earth-science perspective

    USGS Publications Warehouse

    Ziony, Joseph I.

    1985-01-01

    Potentially destructive earthquakes are inevitable in the Los Angeles region of California, but hazards prediction can provide a basis for reducing damage and loss. This volume identifies the principal geologically controlled earthquake hazards of the region (surface faulting, strong shaking, ground failure, and tsunamis), summarizes methods for characterizing their extent and severity, and suggests opportunities for their reduction. Two systems of active faults generate earthquakes in the Los Angeles region: northwest-trending, chiefly horizontal-slip faults, such as the San Andreas, and west-trending, chiefly vertical-slip faults, such as those of the Transverse Ranges. Faults in these two systems have produced more than 40 damaging earthquakes since 1800. Ninety-five faults have slipped in late Quaternary time (approximately the past 750,000 yr) and are judged capable of generating future moderate to large earthquakes and displacing the ground surface. Average rates of late Quaternary slip or separation along these faults provide an index of their relative activity. The San Andreas and San Jacinto faults have slip rates measured in tens of millimeters per year, but most other faults have rates of about 1 mm/yr or less. Intermediate rates of as much as 6 mm/yr characterize a belt of Transverse Ranges faults that extends from near Santa Barbara to near San Bernardino. The dimensions of late Quaternary faults provide a basis for estimating the maximum sizes of likely future earthquakes in the Los Angeles region: moment magnitude (M) 8 for the San Andreas, M 7 for the other northwest-trending elements of that fault system, and M 7.5 for the Transverse Ranges faults. Geologic and seismologic evidence along these faults, however, suggests that, for planning and designing noncritical facilities, appropriate sizes would be M 8 for the San Andreas, M 7 for the San Jacinto, M 6.5 for other northwest-trending faults, and M 6.5 to 7 for the Transverse Ranges faults. The geologic and seismologic record indicates that parts of the San Andreas and San Jacinto faults have generated major earthquakes having recurrence intervals of several tens to a few hundred years. In contrast, the geologic evidence at points along other active faults suggests recurrence intervals measured in many hundreds to several thousands of years. The distribution and character of late Quaternary surface faulting permit estimation of the likely location, style, and amount of future surface displacements. An extensive body of geologic and geotechnical information is used to evaluate areal differences in future levels of shaking. Bedrock and alluvial deposits are differentiated according to the physical properties that control shaking response; maps of these properties are prepared by analyzing existing geologic and soils maps, the geomorphology of surficial units, and geotechnical data obtained from boreholes. The shear-wave velocities of near-surface geologic units must be estimated for some methods of evaluating shaking potential. Regional-scale maps of highly generalized shear-wave velocity groups, based on the age and texture of exposed geologic units and on a simple two-dimensional model of Quaternary sediment distribution, provide a first approximation of the areal variability in shaking response. More accurate depictions of near-surface shear-wave velocity useful for predicting ground-motion parameters take into account the thickness of the Quaternary deposits, vertical variations in sediment type, and the correlation of shear-wave velocity with standard penetration resistance of different sediments. A map of the upper Santa Ana River basin showing shear-wave velocities to depths equal to one-quarter wavelength of a 1-s shear wave demonstrates the three-dimensional mapping procedure. Four methods for predicting the distribution and strength of shaking from future earthquakes are presented. These techniques use different measures of strong-motion

  12. Characterization of the Physical Stability of a Lyophilized IgG1 mAb After Accelerated Shipping-like Stress

    PubMed Central

    Telikepalli, Srivalli; Kumru, Ozan S.; Kim, Jae Hyun; Joshi, Sangeeta B.; O'Berry, Kristin B.; Blake-Haskins, Angela W.; Perkins, Melissa D.; Middaugh, C. Russell; Volkin, David B.

    2014-01-01

    Upon exposure to shaking stress, an IgG1 mAb formulation in both liquid and lyophilized state formed subvisible particles. Since freeze-drying is expected to minimize protein physical instability under these conditions, the extent and nature of aggregate formation in the lyophilized preparation was examined using a variety of particle characterization techniques. The effect of formulation variables such as residual moisture content, reconstitution rate, and reconstitution medium were examined. Upon reconstitution of shake-stressed lyophilized mAb, differences in protein particle size and number were observed by Microflow Digital Imaging (MFI), with the reconstitution medium having the largest impact. Shake-stress had minor effects on the structure of protein within the particles as shown by SDS-PAGE and FTIR analysis. The lyophilized mAb was shake-stressed to different extents and stored for 3 months at different temperatures. Both extent of cake collapse and storage temperature affected the physical stability of the shake-stressed lyophilized mAb upon subsequent storage. These findings demonstrate that physical degradation upon shaking of a lyophilized IgG1 mAb formulation includes not only cake breakage, but also results in an increase in subvisible particles and turbidity upon reconstitution. The shaking-induced cake breakage of the lyophilized IgG1 mAb formulation also resulted in decreased physical stability upon storage. PMID:25522000

  13. Performance of Earthquake Early Warning Systems during the Major Events of the 2016-2017 Central Italy Seismic Sequence.

    NASA Astrophysics Data System (ADS)

    Festa, G.; Picozzi, M.; Alessandro, C.; Colombelli, S.; Cattaneo, M.; Chiaraluce, L.; Elia, L.; Martino, C.; Marzorati, S.; Supino, M.; Zollo, A.

    2017-12-01

    Earthquake early warning systems (EEWS) now contribute to seismic risk mitigation, both in terms of losses and societal resilience, by issuing an alert promptly after the earthquake origin and before the ground shaking reaches the targets to be protected. EEWS can be grouped into two main classes: network-based and stand-alone systems. Network-based EEWS make use of dense seismic networks (e.g. Near Fault Observatories; NFO) surrounding the fault that generates the event. Rapid processing of the early portion of the P wave allows the location and magnitude of the event to be estimated and then used to predict the shaking through ground motion prediction equations. Stand-alone systems instead analyze the early P-wave signal to predict, at the recording site itself, the ground shaking carried by the later S or surface waves through empirically calibrated scaling relationships. We compared the network-based (PRESTo, PRobabilistic and Evolutionary early warning SysTem, www.prestoews.org, Satriano et al., 2011) and the stand-alone (SAVE, on-Site-Alert-leVEl, Caruso et al., 2017) systems by analyzing their performance during the 2016-2017 Central Italy sequence. We analyzed 9 earthquakes with magnitude 5.0 < M < 6.5 at about 200 stations located within 200 km of the epicentral area, including stations of the Altotiberina NFO (TABOO). Performance is evaluated in terms of the rate of success of ground shaking intensity prediction and the available lead-time, i.e. the time available for security actions. For PRESTo we also evaluated the accuracy of location and magnitude. Both systems predict the ground shaking near the event source well, with a success rate of around 90% within the potential damage zone. The lead-time is significantly larger for the network-based system, increasing to more than 10 s at 40 km from the event epicentre. The stand-alone system performs better in the near-source region, showing a positive albeit small lead-time (<3 s). Far away from the source, the performance degrades slightly, mostly owing to uncertain calibration of attenuation relationships. This study opens up the possibility of making EEWS operational in Italy, based on the available acceleration networks, provided the lead-time lost to data telemetry can be reduced.

  14. A 5 Year Study of Carbon Fluxes from a Restored English Blanket Bog

    NASA Astrophysics Data System (ADS)

    Worrall, F.; Dixon, S.; Evans, M.

    2014-12-01

    This study aimed to measure the effects of ecological restoration on blanket peat water table depths, DOC concentrations and CO2 fluxes. In April 2003 the Bleaklow Plateau, an extensive area of deep blanket peat in the Peak District National Park, northern England, was devegetated by a wildfire. As a result the area was selected for large scale restoration. In this study we report a 5-year comparison of four restored sites with both an unrestored, bare peat control and a vegetated control that did not require restoration. Results suggested that sites with revegetation alongside slope stabilisation had the highest rates of photosynthesis and were the largest net (daylight hours) sinks of CO2. Bare sites were the largest net sources of CO2 and had the deepest water table depths. Sites with gully wall stabilisation were 5-8 times more likely to be net CO2 sinks than the bare sites. Revegetation without gully flow blocking using plastic dams did not have a large effect on water table depths in and around the gullies investigated, whereas a blocked gully had water table depths comparable to a naturally revegetating gully. A ten centimetre lowering in water table depth decreased the probability of observing a net CO2 sink, on a given site, by up to 30%. With respect to DOC, the study showed that the average soil porewater DOC concentration on the restored sites rose significantly over the 5 year study, representing a 34% increase relative to the vegetated control and an 11% increase relative to the unrestored, bare control. Soil pore water concentrations were not significantly different from surface runoff DOC concentrations, and therefore restoration as conducted by this study would have contributed to water quality deterioration in the catchment. The most important conclusion of this research was that restoration interventions were apparently effective at increasing the likelihood of net CO2 sink behaviour and raising water tables on degraded, climatically marginal blanket bog. However, had water table restoration been conducted alongside revegetation then a significant decline in DOC concentrations could have also been realised.

  15. Total height volume tables for western hemlock, sitka spruce and young-growth Douglas-fir (based on 32-foot logs and an 8-inch top).

    Treesearch

    Harold A. Rapraeger

    1952-01-01

    In the Pacific Northwest logs are often scaled in lengths which average about 32 feet to facilitate logging. Although several excellent Western hemlock, Sitka spruce and Douglas-fir volume tables based on a 32-foot scaling length have been available for some time, they provide for a larger top diameter than is now used in actual practice. Other tables specify a...

  16. The Shaking Torch: Another Variation on the Inductive Force

    ERIC Educational Resources Information Center

    Thompson, Frank

    2010-01-01

    A recent article showed how the influx of neodymium magnets has provided striking demonstrations of the interactions between magnets and conductors. The "shaking torch" is yet another example. Many of these torches require no batteries and can be submerged in water--indeed, a light for life. In this article, the author disassembles a shaking torch…

  17. Perpetrator Accounts in Infant Abusive Head Trauma Brought about by a Shaking Event

    ERIC Educational Resources Information Center

    Biron, Dean; Shelton, Doug

    2005-01-01

    Objective: To analyze perpetrator and medical evidence collected during investigations of infant abusive head trauma (IAHT), with a view to (a) identifying cases where injuries were induced by shaking in the absence of any impact and (b) documenting the response of infant victims to a violent shaking event. Method: A retrospective study was…

  18. Multi-Agent Framework for the Fair Division of Resources and Tasks

    DTIC Science & Technology

    2006-01-01

    [Table-of-contents excerpt] B.1.2 Application of Shake Out Algorithm to JFK Airport Test Data; B.2 Generalization; Figure B–2: Available Aircraft Inventory at JFK Airport; Figure B–3: Available Aircraft Inventory at JFK Airport after the first shake out; Figure B–4: Inventory Vectors for Second and Third Shake Outs.

  19. Summary of the First High-Altitude, Supersonic Flight Dynamics Test for the Low-Density Supersonic Decelerator Project

    NASA Technical Reports Server (NTRS)

    Clark, Ian G.; Adler, Mark; Manning, Rob

    2015-01-01

    NASA's Low-Density Supersonic Decelerator Project is developing and testing the next generation of supersonic aerodynamic decelerators for planetary entry. A key element of that development is the testing of full-scale articles in conditions relevant to their intended use, primarily the tenuous Mars atmosphere. To achieve this testing, the LDSD project developed a test architecture similar to that used by the Viking Project in the early 1970s for the qualification of their supersonic parachute. A large, helium-filled scientific balloon is used to hoist a 4.7 m blunt body test vehicle to an altitude of approximately 32 kilometers. The test vehicle is released from the balloon, spun up for gyroscopic stability, and accelerated to over four times the speed of sound and an altitude of 50 kilometers using a large solid rocket motor. Once at those conditions, the vehicle is despun and the test period begins. The first flight of this architecture occurred on June 28, 2014. Though primarily a shakeout flight of the new test system, the flight was also able to achieve an early test of two of the LDSD technologies, a large 6 m diameter Supersonic Inflatable Aerodynamic Decelerator (SIAD) and a large, 30.5 m nominal diameter supersonic parachute. This paper summarizes this first flight.

  20. Earth Impact Effects Program: Estimating the Regional Environmental Consequences of Impacts On Earth

    NASA Astrophysics Data System (ADS)

    Collins, G. S.; Melosh, H. J.; Marcus, R. A.

    2009-12-01

    The Earth Impact Effects Program (www.lpl.arizona.edu/impacteffects) is a popular web-based calculator for estimating the regional environmental consequences of a comet or asteroid impact on Earth. It is widely used, both by inquisitive members of the public as an educational device and by scientists as a simple research tool. It applies a variety of scaling laws, based on theory, nuclear explosion test data, observations from terrestrial and extraterrestrial craters and the results of small-scale impact experiments and numerical modelling, to quantify the principal hazards that might affect the people, buildings and landscape in the vicinity of an impact. The program requires six inputs: impactor diameter, impactor density, impact velocity prior to atmospheric entry, impact angle, and the target type (sedimentary rock, crystalline rock, or a water layer above rock), as well as the distance from the impact at which the environmental effects are to be calculated. The program includes simple algorithms for estimating the fate of the impactor during atmospheric traverse, the thermal radiation emitted by the impact plume (fireball) and the intensity of seismic shaking. The program also approximates various dimensions of the impact crater and ejecta deposit, as well as estimating the severity of the air blast in both crater-forming and airburst impacts. We illustrate the strengths and limitations of the program by comparing its predictions (where possible) against known impacts, such as Carancas, Peru (2007); Tunguska, Siberia (1908); Barringer (Meteor) crater, Arizona (ca 49 ka). These tests demonstrate that, while adequate for large impactors, the simple approximation of atmospheric entry in the original program does not properly account for the disruption and dispersal of small impactors as they traverse Earth's atmosphere. We describe recent improvements to the calculator to better describe atmospheric entry of small meteors; the consequences of oceanic impacts; and the recurrence interval between impacts of a given size. In addition, we assess the potential regional hazard of hypothetical impact scenarios of different scales. Our simple calculator suggests that the most wide-reaching regional hazard is seismic shaking: both ejecta-deposit thickness and airblast pressure decay much more rapidly with distance than seismic ground motion. Close to the impact site the most severe hazard is from thermal radiation; however, the curvature of the Earth implies that distant localities are shielded from direct thermal radiation because the fireball is below the horizon.
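
    The calculator's scaling laws all start from the impact energy implied by the first three inputs. A minimal sketch of that first step (spherical impactor, kinetic energy, TNT equivalent) is given below; the program's own hazard relations for thermal radiation, seismic shaking, ejecta and air blast are far more elaborate and are not reproduced here.

    ```python
    import math

    def impact_energy(diameter_m, density_kgm3, velocity_ms):
        """Kinetic energy (J) of a spherical impactor: E = (pi/12) * rho * d**3 * v**2."""
        mass = density_kgm3 * math.pi * diameter_m**3 / 6.0   # sphere mass
        return 0.5 * mass * velocity_ms**2

    # Hypothetical example: a 50 m stony impactor arriving at 17 km/s
    E = impact_energy(50.0, 3000.0, 17000.0)
    print(f"{E:.2e} J  (~{E / 4.184e15:.1f} Mt TNT)")        # 1 Mt TNT = 4.184e15 J
    ```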

  1. Large scale database scrubbing using object oriented software components.

    PubMed

    Herting, R L; Barnes, M R

    1998-01-01

    Now that case managers, quality improvement teams, and researchers use medical databases extensively, the ability to share and disseminate such databases while maintaining patient confidentiality is paramount. A process called scrubbing addresses this problem by removing personally identifying information while keeping the integrity of the medical information intact. Scrubbing entire databases, containing multiple tables, requires that the implicit relationships between data elements in different tables of the database be maintained. To address this issue we developed DBScrub, a Java program that interfaces with any JDBC compliant database and scrubs the database while maintaining the implicit relationships within it. DBScrub uses a small number of highly configurable object-oriented software components to carry out the scrubbing. We describe the structure of these software components and how they maintain the implicit relationships within the database.
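
    DBScrub itself is a Java program built from configurable object-oriented components; the short Python sketch below only illustrates the core idea of scrubbing while preserving implicit relationships: identifying values are replaced by deterministic surrogates, so rows in different tables that referred to the same patient still match after scrubbing. The column names and hashing scheme are illustrative assumptions, not the authors' implementation.

    ```python
    import hashlib

    def pseudonym(value, salt="demo-salt"):
        """Deterministic surrogate for an identifying value (illustrative only)."""
        return hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]

    def scrub_tables(tables, id_columns):
        """Replace identifying columns consistently across every table so that
        implicit relationships (shared identifiers) are preserved."""
        for rows in tables.values():
            for row in rows:
                for col in id_columns:
                    if col in row:
                        row[col] = pseudonym(row[col])
        return tables

    patients = [{"mrn": "12345", "dob": "1970-01-01"}]
    visits   = [{"mrn": "12345", "diagnosis": "J45.909"}]
    scrubbed = scrub_tables({"patients": patients, "visits": visits},
                            id_columns=["mrn", "dob"])
    print(scrubbed["patients"][0]["mrn"] == scrubbed["visits"][0]["mrn"])  # True
    ```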

  2. The 17 July 2006 Tsunami earthquake in West Java, Indonesia

    USGS Publications Warehouse

    Mori, J.; Mooney, W.D.; Afnimar,; Kurniawan, S.; Anaya, A.I.; Widiyantoro, S.

    2007-01-01

    A tsunami earthquake (Mw = 7.7) occurred south of Java on 17 July 2006. The event produced relatively low levels of high-frequency radiation, and local felt reports indicated only weak shaking in Java. There was no ground motion damage from the earthquake, but there was extensive damage and loss of life from the tsunami along 250 km of the southern coasts of West Java and Central Java. An inspection of the area a few days after the earthquake showed extensive damage to wooden and unreinforced masonry buildings that were located within several hundred meters of the coast. Since there was no tsunami warning system in place, efforts to escape the large waves depended on how people reacted to the earthquake shaking, which was only weakly felt in the coastal areas. This experience emphasizes the need for adequate tsunami warning systems for the Indian Ocean region.

  3. New constraints on mechanisms of remotely triggered seismicity at Long Valley Caldera

    USGS Publications Warehouse

    Brodsky, E.E.; Prejean, S.G.

    2005-01-01

    Regional-scale triggering of local earthquakes in the crust by seismic waves from distant main shocks has now been robustly documented for over a decade. Some of the most thoroughly recorded examples of repeated triggering of a single site from multiple, large earthquakes are measured in geothermal fields of the western United States like Long Valley Caldera. As one of the few natural cases where the causality of an earthquake sequence is apparent, triggering provides fundamental constraints on the failure processes in earthquakes. We show here that the observed triggering by seismic waves is inconsistent with any mechanism that depends on cumulative shaking as measured by integrated energy density. We also present evidence for a frequency-dependent triggering threshold. On the basis of the seismic records of 12 regional and teleseismic events recorded at Long Valley Caldera, long-period waves (>30 s) are more effective at generating local seismicity than short-period waves of comparable amplitude. If the properties of the system are stationary over time, the failure threshold for long-period waves is ~0.05 cm/s vertical shaking. Assuming a phase velocity of 3.5 km/s and an elastic modulus of 3.5 × 10¹⁰ Pa, the threshold in terms of stress is 5 kPa. The frequency dependence is due in part to the attenuation of the surface waves with depth. Fluid flow through a porous medium can produce the rest of the observed frequency dependence of the threshold. If the threshold is not stationary with time, pore pressures that are >99.5% of lithostatic and vary over time by a factor of 4 could explain the observations with no frequency dependence of the triggering threshold. Copyright 2005 by the American Geophysical Union.
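
    The stress figure quoted above follows from the plane-wave relation σ ≈ μ·v/c between peak ground velocity, shear modulus and phase velocity. A one-line check (Python) using the values in the abstract:

    ```python
    def dynamic_stress(pgv_ms, shear_modulus_pa=3.5e10, phase_velocity_ms=3500.0):
        """Peak dynamic stress carried by a propagating wave: sigma ~ mu * v / c."""
        return shear_modulus_pa * pgv_ms / phase_velocity_ms

    print(dynamic_stress(0.05e-2))  # 0.05 cm/s of shaking -> 5000 Pa (5 kPa)
    ```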

  4. The ShakeOut earthquake scenario: Verification of three simulation sets

    USGS Publications Warehouse

    Bielak, J.; Graves, R.W.; Olsen, K.B.; Taborda, R.; Ramirez-Guzman, L.; Day, S.M.; Ely, G.P.; Roten, D.; Jordan, T.H.; Maechling, P.J.; Urbanic, J.; Cui, Y.; Juve, G.

    2010-01-01

    This paper presents a verification of three simulations of the ShakeOut scenario, an Mw 7.8 earthquake on a portion of the San Andreas fault in southern California, conducted by three different groups at the Southern California Earthquake Center using the SCEC Community Velocity Model for this region. We conducted two simulations using the finite difference method, and one by the finite element method, and performed qualitative and quantitative comparisons between the corresponding results. The results are in good agreement with each other; only small differences occur both in amplitude and phase between the various synthetics at ten observation points located near and away from the fault, as far as 150 km away from the fault. Using an available goodness-of-fit criterion all the comparisons scored above 8, with most above 9.2. This score would be regarded as excellent if the measurements were between recorded and synthetic seismograms. We also report results of comparisons based on time-frequency misfit criteria. Results from these two criteria can be used for calibrating the two methods for comparing seismograms. In those cases in which noticeable discrepancies occurred between the seismograms generated by the three groups, we found that they were the product of inherent characteristics of the various numerical methods used and their implementations. In particular, we found that the major source of discrepancy lies in the difference between mesh and grid representations of the same material model. Overall, however, even the largest differences in the synthetic seismograms are small. Thus, given the complexity of the simulations used in this verification, it appears that the three schemes are consistent, reliable and sufficiently accurate and robust for use in future large-scale simulations. © 2009 The Authors. Journal compilation © 2009 RAS.

  5. Are the traditional large-scale drought indices suitable for shallow water wetlands? An example in the Everglades.

    PubMed

    Zhao, Dehua; Wang, Penghe; Zuo, Jie; Zhang, Hui; An, Shuqing; Ramesh, Reddy K

    2017-08-01

    Numerous drought indices have been developed over the past several decades. However, few studies have focused on the suitability of indices for studies of ephemeral wetlands. The objective is to answer the following question: can the traditional large-scale drought indices characterize drought severity in shallow water wetlands such as the Everglades? The question was approached from two perspectives: the available water quantity and the response of wetland ecosystems to drought. The results showed the unsuitability of traditional large-scale drought indices for characterizing the actual available water quantity based on two findings. (1) Large spatial variations in precipitation (P), potential evapotranspiration (PE), water table depth (WTD) and the monthly water storage change (SC) were observed in the Everglades; notably, the spatial variation in SC, which reflects the monthly water balance, was 1.86 and 1.62 times larger than the temporal variation between seasons and between years, respectively. (2) The large-scale water balance measured based on the water storage variation had an average indicating efficiency (IE) of only 60.01% due to the redistribution of interior water. The spatial distribution of variations in the Normalized Different Vegetation Index (NDVI) in the 2011 dry season showed significantly positive, significantly negative and weak correlations with the minimum WTD in wet prairies, graminoid prairies and sawgrass wetlands, respectively. The significant and opposite correlations imply the unsuitability of the traditional large-scale drought indices in evaluating the effect of drought on shallow water wetlands. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. An empirical model for global earthquake fatality estimation

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David

    2010-01-01

    We analyzed mortality rates of earthquakes worldwide and developed a country/region-specific empirical model for earthquake fatality estimation within the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is defined as the total killed divided by the total population exposed at a specific shaking intensity level. The total fatalities for a given earthquake are estimated by multiplying the number of people exposed at each shaking intensity level by the fatality rate for that level and then summing over all relevant shaking intensities. The fatality rate is expressed in terms of a two-parameter lognormal cumulative distribution function of shaking intensity. The parameters are obtained for each country or region by minimizing the residual error in hindcasting the total shaking-related deaths from earthquakes recorded between 1973 and 2007. A new global regionalization scheme is used to combine the fatality data across different countries with similar vulnerability traits.
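
    The model described above reduces to a lognormal CDF for the fatality rate and a sum over intensity bins for the expected deaths. A minimal sketch (Python) is shown below; the parameter values and the exposure figures are placeholders, not the published PAGER coefficients.

    ```python
    from math import log
    from statistics import NormalDist

    def fatality_rate(intensity, theta, beta):
        """Lognormal-CDF fatality rate at a given shaking intensity.
        theta and beta stand in for the two country-specific parameters."""
        return NormalDist().cdf(log(intensity / theta) / beta)

    def expected_fatalities(exposure_by_intensity, theta, beta):
        """Exposed population times fatality rate, summed over intensity levels."""
        return sum(pop * fatality_rate(mmi, theta, beta)
                   for mmi, pop in exposure_by_intensity.items())

    # Hypothetical exposure: population exposed at MMI VI-IX
    exposure = {6: 500_000, 7: 200_000, 8: 50_000, 9: 5_000}
    print(expected_fatalities(exposure, theta=12.0, beta=0.2))
    ```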

  7. B-CAN: a resource sharing platform to improve the operation, visualization and integrated analysis of TCGA breast cancer data

    PubMed Central

    Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu

    2017-01-01

    Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genome and clinical data via web portals and FTP services. Researchers are able to gain new insights into their fields and evaluate experimental discoveries with TCGA. However, it is difficult for researchers who have little experience with databases and bioinformatics to access and work with TCGA because of its complex data format and diverse files. For ease of use, we built the breast cancer (B-CAN) platform, which enables data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with a MySQL database back end via PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of the B-CAN is that it provides single table display and multiple table display. Customized data with one barcode corresponding to many records, as well as processed customized data, are allowed in the Multiple Tables Display. The B-CAN is an intuitive and highly efficient data-sharing platform. PMID:29312567

  8. Measuring relative-story displacement and local inclination angle using multiple position-sensitive detectors.

    PubMed

    Matsuya, Iwao; Katamura, Ryuta; Sato, Maya; Iba, Miroku; Kondo, Hideaki; Kanekawa, Kiyoshi; Takahashi, Motoichi; Hatada, Tomohiko; Nitta, Yoshihiro; Tanii, Takashi; Shoji, Shuichi; Nishitani, Akira; Ohdomari, Iwao

    2010-01-01

    We propose a novel sensor system for monitoring the structural health of a building. The system optically measures the relative-story displacement during earthquakes for detecting any deformations of building elements. The sensor unit is composed of three position sensitive detectors (PSDs) and lenses capable of measuring the relative-story displacement precisely, even if the PSD unit was inclined in response to the seismic vibration. For verification, laboratory tests were carried out using an Xθ-stage and a shaking table. The static experiment verified that the sensor could measure the local inclination angle as well as the lateral displacement. The dynamic experiment revealed that the accuracy of the sensor was 150 μm in the relative-displacement measurement and 100 μrad in the inclination angle measurement. These results indicate that the proposed sensor system has sufficient accuracy for the measurement of relative-story displacement in response to the seismic vibration.

  9. Semi-active friction damper for buildings subject to seismic excitation

    NASA Astrophysics Data System (ADS)

    Mantilla, Juan S.; Solarte, Alexander; Gomez, Daniel; Marulanda, Johannio; Thomson, Peter

    2016-04-01

    Structural control systems are considered an effective alternative for reducing vibrations in civil structures and are classified according to their energy supply requirement: passive, semi-active, active and hybrid. Commonly used structural control systems in buildings are passive friction dampers, which add energy dissipation through damping mechanisms induced by sliding friction between their surfaces. Semi-Active Variable Friction Dampers (SAVFD) allow the optimum efficiency range of friction dampers to be enhanced by controlling the clamping force in real time. This paper describes the development and performance evaluation of a low-cost SAVFD for the reduction of vibrations of structures subject to earthquakes. The SAVFD and a benchmark structural control test structure were experimentally characterized, and analytical models were developed and updated based on the dynamic characterization. Decentralized control algorithms were implemented and tested on a shaking table. Relative displacements and accelerations of the structure controlled with the SAVFD were 80% less than those of the uncontrolled structure.

  10. A strategy for clone selection under different production conditions.

    PubMed

    Legmann, Rachel; Benoit, Brian; Fedechko, Ronald W; Deppeler, Cynthia L; Srinivasan, Sriram; Robins, Russell H; McCormick, Ellen L; Ferrick, David A; Rodgers, Seth T; Russo, A Peter

    2011-01-01

    Top-performing clones have failed at the manufacturing scale, while the true best performer may have been rejected early in the screening process. Therefore, the ability to screen multiple clones in complex fed-batch processes using multiple process variations can be used to assess robustness and to identify critical factors. This 'dynamic ranking of clones' strategy requires the execution of many more parallel experiments than traditional approaches. Therefore, this approach is best suited for micro-bioreactor models, which can perform hundreds of experiments quickly and efficiently. In this study, a fully monitored and controlled small-scale platform was used to screen eight CHO clones producing a recombinant monoclonal antibody across several process variations, including different feeding strategies, temperature shifts and pH control profiles. The first screen utilized 240 micro-bioreactors run for two weeks for this assessment of the scale-down model as a high-throughput tool for clone evaluation. The richness of the outcome data enabled clear identification of the best and worst clones and processes in terms of maximum monoclonal antibody titer. The follow-up comparison study utilized 180 micro-bioreactors in a full factorial design, and a subset of 12 clone/process combinations was selected to be run in parallel in duplicate shake flasks. Good correlation was obtained between the micro-bioreactor predictions and those made in shake flasks, with a Pearson correlation value of 0.94. The results also demonstrate that this micro-scale system can perform clone screening and process optimization for gaining significant titer improvements simultaneously. This dynamic ranking strategy can support better choices of production clones. Copyright © 2011 American Institute of Chemical Engineers (AIChE).

  11. Applying Hillslope Hydrology to Bridge between Ecosystem and Grid-Scale Processes within an Earth System Model

    NASA Astrophysics Data System (ADS)

    Subin, Z. M.; Sulman, B. N.; Malyshev, S.; Shevliakova, E.

    2013-12-01

    Soil moisture is a crucial control on surface energy fluxes, vegetation properties, and soil carbon cycling. Its interactions with ecosystem processes are highly nonlinear across a large range, as both drought stress and anoxia can impede vegetation and microbial growth. Earth System Models (ESMs) generally only represent an average soil-moisture state in grid cells at scales of 50-200 km, and as a result are not able to adequately represent the effects of subgrid heterogeneity in soil moisture, especially in regions with large wetland areas. We addressed this deficiency by developing the first ESM-coupled subgrid hillslope-hydrological model, TiHy (Tiled-hillslope Hydrology), embedded within the Geophysical Fluid Dynamics Laboratory (GFDL) land model. In each grid cell, one or more representative hillslope geometries are discretized into land model tiles along an upland-to-lowland gradient. These geometries represent ~1 km hillslope-scale hydrological features and allow for flexible representation of hillslope profile and plan shapes, in addition to variation of subsurface properties among or within hillslopes. Each tile (which may represent ~100 m along the hillslope) has its own surface fluxes, vegetation state, and vertically-resolved state variables for soil physics and biogeochemistry. Resolution of water state in deep layers (~200 m) down to bedrock allows for physical integration of groundwater transport with unsaturated overlying dynamics. Multiple tiles can also co-exist at the same vertical position along the hillslope, allowing the simulation of ecosystem heterogeneity due to disturbance. The hydrological model is coupled to the vertically-resolved Carbon, Organisms, Respiration, and Protection in the Soil Environment (CORPSE) model, which captures non-linearity resulting from interactions between vertically-heterogeneous soil carbon and water profiles. We present comparisons of simulated water table depth to observations. We examine sensitivities to alternative parameterizations of hillslope geometry, macroporosity, and surface runoff / inundation, and to the choice of global topographic dataset and groundwater hydraulic conductivity distribution. Simulated groundwater dynamics among hillslopes tend to cluster into three regimes of wet and well-drained, wet but poorly-drained, and dry. In the base model configuration, near-surface gridcell-mean water tables exist in an excessively large area compared to observations, including large areas of the Eastern U.S. and Northern Europe. However, in better-drained areas, the decrease in water table depth along the hillslope gradient allows for realistic increases in ecosystem water availability and soil carbon downslope. The inclusion of subgrid hydrology can increase the equilibrium 0-2 m global soil carbon stock by a large factor, due to the nonlinear effect of anoxia. We conclude that this innovative modeling framework allows for the inclusion of hillslope-scale processes and the potential for wetland dynamics in an ESM without need for a high-resolution 3-dimensional groundwater model. Future work will include investigating the potential for future changes in land carbon fluxes caused by the effects of changing hydrological regime, particularly in peatland-rich areas poorly treated by current ESMs.

  12. Global patterns of groundwater table depth.

    PubMed

    Fan, Y; Li, H; Miguez-Macho, G

    2013-02-22

    Shallow groundwater affects terrestrial ecosystems by sustaining river base-flow and root-zone soil water in the absence of rain, but little is known about the global patterns of water table depth and where it provides vital support for land ecosystems. We present global observations of water table depth compiled from government archives and literature, and fill in data gaps and infer patterns and processes using a groundwater model forced by modern climate, terrain, and sea level. Patterns in water table depth explain patterns in wetlands at the global scale and vegetation gradients at regional and local scales. Overall, shallow groundwater influences 22 to 32% of global land area, including ~15% as groundwater-fed surface water features and 7 to 17% with the water table or its capillary fringe within plant rooting depths.

  13. A Comprehensive Analysis of Multiscale Field-Aligned Currents: Characteristics, Controlling Parameters, and Relationships

    NASA Astrophysics Data System (ADS)

    McGranaghan, Ryan M.; Mannucci, Anthony J.; Forsyth, Colin

    2017-12-01

    We explore the characteristics, controlling parameters, and relationships of multiscale field-aligned currents (FACs) using a rigorous, comprehensive, and cross-platform analysis. Our unique approach combines FAC data from the Swarm satellites and the Advanced Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) to create a database of small-scale (~10-150 km, <1° latitudinal width), mesoscale (~150-250 km, 1-2° latitudinal width), and large-scale (>250 km) FACs. We examine these data for the repeatable behavior of FACs across scales (i.e., the characteristics), the dependence on the interplanetary magnetic field orientation, and the degree to which each scale "departs" from nominal large-scale specification. We retrieve new information by utilizing magnetic latitude and local time dependence, correlation analyses, and quantification of the departure of smaller from larger scales. We find that (1) FAC characteristics and dependence on controlling parameters do not map between scales in a straightforward manner, (2) relationships between FAC scales exhibit local time dependence, and (3) the dayside high-latitude region is characterized by remarkably distinct FAC behavior when analyzed at different scales, and the locations of distinction correspond to "anomalous" ionosphere-thermosphere behavior. Comparing with nominal large-scale FACs, we find that differences are characterized by a horseshoe shape, maximizing across dayside local times, and that difference magnitudes increase when smaller-scale observed FACs are considered. We suggest that both new physics and increased resolution of models are required to address the multiscale complexities. We include a summary table of our findings to provide a quick reference for differences between multiscale FACs.

  14. ShakeAlert—An earthquake early warning system for the United States west coast

    USGS Publications Warehouse

    Burkett, Erin R.; Given, Douglas D.; Jones, Lucile M.

    2014-08-29

    Earthquake early warning systems use earthquake science and the technology of monitoring systems to alert devices and people when shaking waves generated by an earthquake are expected to arrive at their location. The seconds to minutes of advance warning can allow people and systems to take actions to protect life and property from destructive shaking. The U.S. Geological Survey (USGS), in collaboration with several partners, has been working to develop an early warning system for the United States. ShakeAlert, a system currently under development, is designed to cover the West Coast States of California, Oregon, and Washington.

  15. Probabilistic description of infant head kinematics in abusive head trauma.

    PubMed

    Lintern, T O; Nash, M P; Kelly, P; Bloomfield, F H; Taberner, A J; Nielsen, P M F

    2017-12-01

    Abusive head trauma (AHT) is a potentially fatal result of child abuse, but the mechanisms by which injury occur are often unclear. To investigate the contention that shaking alone can elicit the injuries observed, effective computational models are necessary. The aim of this study was to develop a probabilistic model describing infant head kinematics in AHT. A deterministic model incorporating an infant's mechanical properties, subjected to different shaking motions, was developed in OpenSim. A Monte Carlo analysis was used to simulate the range of infant kinematics produced as a result of varying both the mechanical properties and the type of shaking motions. By excluding physically unrealistic shaking motions, worst-case shaking scenarios were simulated and compared to existing injury criteria for a newborn, a 4.5 month-old, and a 12 month-old infant. In none of the three cases were head kinematics observed to exceed previously-estimated subdural haemorrhage injury thresholds. The results of this study provide no biomechanical evidence to demonstrate how shaking by a human alone can cause the injuries observed in AHT, suggesting either that additional factors, such as impact, are required, or that the current estimates of injury thresholds are incorrect.
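
    The Monte Carlo analysis described above can be sketched structurally as follows (Python). The deterministic head-kinematics model here is a trivial placeholder for the OpenSim model used in the study, and every parameter range and the injury threshold are hypothetical; the sketch only illustrates how sampled mechanical properties and shaking motions propagate to an exceedance probability.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def peak_angular_velocity(neck_stiffness, shake_freq_hz, shake_amplitude_rad):
        """Placeholder for the deterministic head-kinematics model (the study
        used an OpenSim model); a trivial stand-in so the Monte Carlo loop runs."""
        return 2 * np.pi * shake_freq_hz * shake_amplitude_rad / (1.0 + neck_stiffness)

    n = 10_000
    stiffness = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # hypothetical property range
    freq      = rng.uniform(2.0, 4.0, size=n)                # shakes per second (hypothetical)
    amplitude = rng.uniform(0.3, 0.8, size=n)                # rad (hypothetical)

    peaks = peak_angular_velocity(stiffness, freq, amplitude)
    threshold = 30.0   # hypothetical injury threshold, rad/s
    print("P(exceed threshold) =", np.mean(peaks > threshold))
    ```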

  16. A review on full-scale decentralized wastewater treatment systems: techno-economical approach.

    PubMed

    Singh, Nitin Kumar; Kazmi, A A; Starkl, M

    2015-01-01

    As a solution to the shortcomings of centralized systems, over the last two decades large numbers of decentralized wastewater treatment plants of different technology types have been installed all over the world. This paper aims at deriving lessons learned from existing decentralized wastewater treatment plants that are relevant for smaller towns (and peri-urban areas) as well as rural communities in developing countries, such as India. Only full-scale implemented decentralized wastewater treatment systems are reviewed in terms of performance, land area requirement, capital cost, and operation and maintenance costs. The results are presented in tables comparing different technology types with respect to those parameters.

  17. VizieR Online Data Catalog: Variables in Centaurus field F170 (Pietrukowicz+, 2012)

    NASA Astrophysics Data System (ADS)

    Pietrukowicz, P.; Minniti, D.; Alonso-Garcia, J.; Hempel, M.

    2011-10-01

    VJHKs photometry of stars in two VIMOS disc fields: F167 and F170. Data table with 333 variables detected in the field F170 in Centaurus. The optical observations were taken with the 8.2-m Unit Telescope 3 + VIMOS imager with a scale of 0.205"/pix at ESO Very Large Telescope at Paranal Observatory. Date of the observations: Apr 11-12, 2005. The infrared observations were obtained with the 4.1-m VISTA telescope + VIRCAM with a scale of 0.34"/pix also at Paranal Observatory. Date of the observations: Mar-Apr 2010. (4 data files).

  18. A rapid local singularity analysis algorithm with applications

    NASA Astrophysics Data System (ADS)

    Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits

    2015-04-01

    The local singularity model developed by Cheng is fast gaining popularity in characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, one of the conventional algorithms, which involves computing moving-average values at different scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called the integral image, is a fast algorithm used within the Viola-Jones object detection framework in computer vision. Historically, the principle of the SAT is well known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variation, the Rotated Summed Area Table, into isotropic, anisotropic and directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only 4 array accesses, in constant time and independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, MATLAB and C++ are implemented to serve different applications, especially big-data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear or log spacing) for the non-iterative and iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
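
    A minimal NumPy sketch of the summed area table idea described above is given below: after one cumulative-sum pass, any rectangular window sum is obtained from at most four table look-ups, and a moving average at any scale is just that sum divided by the window area. This is a generic illustration, not the authors' Python/Julia/MATLAB/C++ implementations.

    ```python
    import numpy as np

    def summed_area_table(img):
        """Integral image: sat[i, j] = sum of img[:i+1, :j+1]."""
        return img.cumsum(axis=0).cumsum(axis=1)

    def window_sum(sat, r0, c0, r1, c1):
        """Sum of img[r0:r1+1, c0:c1+1] using at most four table look-ups (O(1))."""
        total = sat[r1, c1]
        if r0 > 0:
            total -= sat[r0 - 1, c1]
        if c0 > 0:
            total -= sat[r1, c0 - 1]
        if r0 > 0 and c0 > 0:
            total += sat[r0 - 1, c0 - 1]
        return total

    img = np.arange(16, dtype=float).reshape(4, 4)
    sat = summed_area_table(img)
    assert window_sum(sat, 1, 1, 2, 2) == img[1:3, 1:3].sum()
    ```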

  19. Shaking It up: How to Run the Best Club and Chapter Program

    ERIC Educational Resources Information Center

    Peterson, Erin

    2012-01-01

    Alumni clubs and chapters are powerful tools for keeping alumni connected to each other and the institution, gathering insight into what alumni want from their alma mater, and even raising money for the institution. And while alumni leaders do not need to devote a large amount of their budget to create successful groups, they do need to ensure…

  20. Large rock avalanches triggered by the M 7.9 Denali Fault, Alaska, earthquake of 3 November 2002

    USGS Publications Warehouse

    Jibson, R.W.; Harp, E.L.; Schulz, W.; Keefer, D.K.

    2006-01-01

    The moment magnitude (M) 7.9 Denali Fault, Alaska, earthquake of 3 November 2002 triggered thousands of landslides, primarily rock falls and rock slides, that ranged in volume from rock falls of a few cubic meters to rock avalanches having volumes as great as 20 × 10⁶ m³. The pattern of landsliding was unusual: the number and concentration of triggered slides was much less than expected for an earthquake of this magnitude, and the landslides were concentrated in a narrow zone about 30-km wide that straddled the fault-rupture zone over its entire 300-km length. Despite the overall sparse landslide concentration, the earthquake triggered several large rock avalanches that clustered along the western third of the rupture zone where acceleration levels and ground-shaking frequencies are thought to have been the highest. Inferences about near-field strong-shaking characteristics drawn from interpretation of the landslide distribution are strikingly consistent with results of recent inversion modeling that indicate that high-frequency energy generation was greatest in the western part of the fault-rupture zone and decreased markedly to the east. © 2005 Elsevier B.V. All rights reserved.

  1. Childhood parental separation experiences and depressive symptomatology in acute major depression.

    PubMed

    Takeuchi, Hiroshi; Hiroe, Takahiro; Kanai, Takahiro; Morinobu, Shigeru; Kitamura, Toshinori; Takahashi, Kiyohisa; Furukawa, Toshiaki A

    2003-04-01

    The aim of this study was to examine the pathoplastic effects of childhood parental separation experiences on depressive symptoms. Patients with acute major depression were identified in a large 31-center study of affective disorders in Japan. Information regarding the patients' childhood losses was collected using a semistructured interview, and their depressive symptomatology was assessed by the Center for Epidemiologic Studies Depression Scale (CES-D). Patients reported significantly higher CES-D total scores when they had experienced early object loss of the same-sex parent. In terms of the CES-D subscores derived by factor analysis, early object loss significantly aggravated symptoms that people normally could cope with but could no longer cope with when depressed (e.g. 'poor appetite', 'cannot shake off the blues' and 'everything an effort'). Once depression develops, early object loss may act as a pathoplastic factor by making it more severe, especially by rendering people less able to perform what they normally could do.

  2. Removal of nickel and cadmium from battery waste by a chemical method using ferric sulphate.

    PubMed

    Jadhav, Umesh U; Hocheng, Hong

    2014-01-01

    The removal of nickel (Ni) and cadmium (Cd) from spent batteries was studied by a chemical method. A novel leaching system using ferric sulphate hydrate was introduced to dissolve heavy metals in batteries. Ni-Cd batteries are classified as hazardous waste because Ni and Cd are suspected carcinogens. More efficient technologies are required to recover metals from spent batteries to minimize capital outlay and environmental impact and to respond to increased demand. The results obtained demonstrate that the optimal conditions, including pH, concentration of ferric sulphate, shaking speed and temperature for the metal removal, were 2.5, 60 g/L, 150 rpm and 30 degrees C, respectively. More than 88 (+/- 0.9)% and 84 (+/- 2.8)% of the nickel and cadmium were recovered, respectively. These results suggest that ferric ions oxidized the Ni and Cd present in battery waste. This novel process offers a possibility for recycling waste Ni-Cd batteries on a large industrial scale.

  3. CISN ShakeAlert: Using early warnings for earthquakes in California

    NASA Astrophysics Data System (ADS)

    Vinci, M.; Hellweg, M.; Jones, L. M.; Khainovski, O.; Schwartz, K.; Lehrer, D.; Allen, R. M.; Neuhauser, D. S.

    2009-12-01

    Educated users who have developed response plans and procedures are just as important for an earthquake early warning (EEW) system as are the algorithms and computers that process the data and produce the warnings. In Japan, for example, the implementation of the EEW system which now provides advance alerts of ground shaking included intense outreach efforts to both institutional and individual recipients. Alerts are now used in automatic control systems that stop trains, place sensitive equipment in safe mode and isolate hazards while the public takes cover. In California, the California Integrated Seismic Network (CISN) is now developing and implementing components of a prototype system for EEW, ShakeAlert. As this processing system is developed, we invite a suite of prospective users from critical industries and institutions throughout California to partner with us in developing useful ShakeAlert products and procedures. At the same time, we will support their efforts to determine and implement appropriate responses to an early warning of earthquake shaking. As a first step, in a collaboration with BART, we have developed a basic system allowing BART's operation center to receive realtime ground shaking information from more than 150 seismic stations operating in the San Francisco Bay Area. BART engineers are implementing a display system for this information. Later phases will include the development of improved response procedures utilizing this information. We plan to continue this collaboration to include more sophisticated information from the prototype CISN ShakeAlert system.

  4. Freezing Injury in Onion Bulb Cells

    PubMed Central

    Palta, Jiwan P.; Levitt, Jacob; Stadelmann, Eduard J.

    1977-01-01

    Onion (Allium cepa L.) bulbs were frozen to −4 and −11 C and kept frozen for up to 12 days. After slow thawing, a 2.5-cm square from a bulb scale was transferred to 25 ml deionized H2O. After shaking for standard times, measurements were made on the effusate and on the effused cells. The results obtained were as follows. Even when the scale tissue was completely infiltrated, and when up to 85% of the ions had diffused out, all of the cells were still alive, as revealed by cytoplasmic streaming and ability to plasmolyze. The osmotic concentration of the cell sap, as measured plasmolytically, decreased in parallel to the rise in conductivity of the effusate. The K+ content of the effusate, plus its assumed counterion, accounted for only 20% of the total solutes, but for 100% of the conductivity. A large part of the nonelectrolytes in the remaining 80% of the solutes was sugars. The increased cell injury and infiltration in the −11 C treatment, relative to the −4 C and control (unfrozen) treatments, were paralleled by increases in conductivity, K+ content, sugar content, and pH of the effusate. In spite of the 100% infiltration of the tissue and the large increase in conductivity of the effusate following freezing, no increase in permeability of the cells to water could be detected. The above observations may indicate that freezing or thawing involves a disruption of the active transport system before the cells reveal any injury microscopically. PMID:16660100

  5. Extraction of MOS VLSI (Very-Large-Scale-Integrated) Circuit Models Including Critical Interconnect Parasitics.

    DTIC Science & Technology

    1987-09-01

    can be reduced substantially, compared to using numerical methods to model interconnect parasitics. Although some accuracy might be lost with ... conductor widths and spacings listed in Table 2.1 have been employed for simulation. In the first set of the simulations, planar dielectric inter... model, there are no restrictions on the number of dielectrics and conductors, and the shape of the conductors and the dielectric inter...

  6. Life Enhancement of Naval Systems through Advanced Materials.

    DTIC Science & Technology

    1982-05-12

    sulfate (eutectic at 575°C) and nickel sulfate-sodium sulfate (eutectic at 670°C) systems. Cobalt and nickel sulfate are thermally unstable and undergo a ... large-scale commercial usage. Table IV-1 - Ion implantation parameters: Implanted Elements - virtually any element from hydrogen to uranium can be ... readily attainable by oxidation of the up to 1% sulfur allowed in Navy fuel. Therefore, cobalt and nickel sulfate are formed by reaction of the ...

  7. Grief support in accident and emergency nursing: a literature review 1985-1993.

    PubMed

    McDonald, L; Butterworth, T; Yates, D W

    1995-07-01

    On completing a wide ranging review of literature related to Accident and Emergency (A & E) nursing, the authors chose to focus upon grief support. The literature ranges from personal experiences to large scale research. A table of studies is included to clarify major research findings in this area. The article concludes by recommending long term support for bereaved relatives and research to demonstrate the value of support for relatives in the community.

  8. W-SHAKE - the Waves that SHAKE our Earth, a Parents-in-Science Initiative

    NASA Astrophysics Data System (ADS)

    Rocha, Francisco; Silveira, Graça; Moreira, Guilherme; Afonso, Isabel; Custódio, Susana; Matias, Luís

    2014-05-01

    The catastrophes induced by earthquakes are among the most devastating, causing large numbers of human losses and extensive economic damage. In spite of being exposed to moderate and large earthquakes, and as a result of slow tectonic rates, Portuguese society is, in general, little aware of the seismic risk. Earthquakes can't be predicted, and the only way of mitigating their consequences is to understand their genesis, their propagation and their effects on society. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. As adults they will have a greater knowledge of natural phenomena as well as of the effects of human actions and their consequences. Parents and other caregivers have a critical function in encouraging and supporting their learning at home, in school, and throughout the community. Teachers also play an important role in this effort and can be valuable partners with parents in cultivating learning confidence and skills in school-age youth. In this presentation we will give an overview of the project W-Shake, a Parents-in-Science Initiative to promote the study of seismology and related subjects. This project, supported by the Portuguese "Ciência Viva" program, results from a direct cooperation between the parents association, science school-teachers and the seismology research group at Instituto Dom Luíz.

  9. Public Release of Estimated Impact-Based Earthquake Alerts - An Update to the U.S. Geological Survey PAGER System

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Jaiswal, K. S.; Marano, K.; Hearne, M.; Earle, P. S.; So, E.; Garcia, D.; Hayes, G. P.; Mathias, S.; Applegate, D.; Bausch, D.

    2010-12-01

    The U.S. Geological Survey (USGS) has begun publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses. These estimates should significantly enhance the utility of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system that has been providing estimated ShakeMaps and computing population exposures to specific shaking intensities since 2007. Quantifying earthquake impacts and communicating loss estimates (and their uncertainties) to the public has been the culmination of several important new and evolving components of the system. First, the operational PAGER system now relies on empirically-based loss models that account for estimated shaking hazard, population exposure, and employ country-specific fatality and economic loss functions derived using analyses of losses due to recent and past earthquakes. In some countries, our empirical loss models are informed in part by PAGER’s semi-empirical and analytical loss models, and building exposure and vulnerability data sets, all of which are being developed in parallel to the empirical approach. Second, human and economic loss information is now portrayed as a supplement to existing intensity/exposure content on both PAGER summary alert (available via cell phone/email) messages and web pages. Loss calculations also include estimates of the economic impact with respect to the country’s gross domestic product. Third, in order to facilitate rapid and appropriate earthquake responses based on our probable loss estimates, in early 2010 we proposed a four-level Earthquake Impact Scale (EIS). Instead of simply issuing median estimates for losses—which can be easily misunderstood and misused—this scale provides ranges of losses from which potential responders can gauge expected overall impact from strong shaking. EIS is based on two complementary criteria: the estimated cost of damage, which is most suitable for U.S. domestic events; and estimated ranges of fatalities, which are generally more appropriate for global events, particularly in earthquake-vulnerable countries. Alert levels are characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered when estimated US dollar losses reach 1 million, 100 million, and 1 billion+ levels, respectively. Finally, alerting protocols now explicitly support EIS-based alerts. Critical users can receive PAGER alerts i) based on the EIS-based alert level, in addition to or as an alternative to magnitude and population/intensity exposure-based alerts, and ii) optionally, based on user-selected regions of the world. The essence of PAGER’s impact-based alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide based on quantifiable, albeit uncertain, loss estimates provided by the USGS.
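
    As a worked illustration of the EIS thresholds quoted above (1/100/1,000 fatalities and $1 million/$100 million/$1 billion in losses), the Python sketch below maps loss estimates to an alert color; the function name and the rule of taking the higher of the two criteria are assumptions for this sketch, not necessarily the operational PAGER logic.

```python
def eis_alert_level(est_fatalities: float, est_loss_usd: float) -> str:
    """Map loss estimates to a four-level Earthquake Impact Scale color.

    Thresholds follow the abstract: yellow/orange/red at 1/100/1,000 fatalities
    or $1M/$100M/$1B in damage. Taking the higher of the two criteria is an
    assumption for this sketch, not necessarily the operational PAGER rule.
    """
    levels = ["green", "yellow", "orange", "red"]

    def level_from(value: float, thresholds: tuple) -> int:
        # Count how many thresholds the estimate meets or exceeds (0..3).
        return sum(value >= t for t in thresholds)

    fatality_level = level_from(est_fatalities, (1, 100, 1_000))
    loss_level = level_from(est_loss_usd, (1e6, 1e8, 1e9))
    return levels[max(fatality_level, loss_level)]

# Example: ~40 estimated fatalities but ~$500M estimated losses -> "orange".
print(eis_alert_level(est_fatalities=40, est_loss_usd=5e8))
```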

  10. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    The December 22, 2003, San Simeon, California, (M6.5) earthquake caused damage to houses, road surfaces, and underground utilities in Oceano, California. The community of Oceano is approximately 50 miles (80 km) from the earthquake epicenter. Damage at this distance from a M6.5 earthquake is unusual. To understand the causes of this damage, the U.S. Geological Survey conducted extensive subsurface exploration and monitoring of aftershocks in the months after the earthquake. The investigation included 37 seismic cone penetration tests, 5 soil borings, and aftershock monitoring from January 28 to March 7, 2004. The USGS investigation identified two earthquake hazards in Oceano that explain the San Simeon earthquake damage: site amplification and liquefaction. Site amplification is a phenomenon observed in many earthquakes where the strength of the shaking increases abnormally in areas where the seismic-wave velocity of shallow geologic layers is low. As a result, earthquake shaking is felt more strongly than in surrounding areas without similar geologic conditions. Site amplification in Oceano is indicated by the physical properties of the geologic layers beneath Oceano and was confirmed by monitoring aftershocks. Liquefaction, which is also commonly observed during earthquakes, is a phenomenon where saturated sands lose their strength during an earthquake and become fluid-like and mobile. As a result, the ground may undergo large permanent displacements that can damage underground utilities and well-built surface structures. The type of displacement of major concern associated with liquefaction is lateral spreading because it involves displacement of large blocks of ground down gentle slopes or towards stream channels. The USGS investigation indicates that the shallow geologic units beneath Oceano are very susceptible to liquefaction. They include young sand dunes and clean sandy artificial fill that was used to bury and convert marshes into developable lots. Most of the 2003 damage was caused by lateral spreading in two separate areas, one near Norswing Drive and the other near Juanita Avenue. The areas coincided with areas with the highest liquefaction potential found in Oceano. Areas with site amplification conditions similar to those in Oceano are particularly vulnerable to earthquakes. Site amplification may cause shaking from distant earthquakes, which normally would not cause damage, to increase locally to damaging levels. The vulnerability in Oceano is compounded by the widespread distribution of highly liquefiable soils that will reliquefy when ground shaking is amplified as it was during the San Simeon earthquake. The experience in Oceano can be expected to repeat because the region has many active faults capable of generating large earthquakes. In addition, liquefaction and lateral spreading will be more extensive for moderate-size earthquakes that are closer to Oceano than was the 2003 San Simeon earthquake. Site amplification and liquefaction can be mitigated. Shaking is typically mitigated in California by adopting and enforcing up-to-date building codes. Although not a guarantee of safety, application of these codes ensures that the best practice is used in construction. Building codes, however, do not always require the upgrading of older structures to new code requirements. Consequently, many older structures may not be as resistant to earthquake shaking as new ones. For older structures, retrofitting is required to bring them up to code.
Seismic provisions in codes also generally do not apply to nonstructural elements such as drywall, heating systems, and shelving. Frequently, nonstructural damage dominates the earthquake loss. Mitigation of potential liquefaction in Oceano presently is voluntary for existing buildings, but required by San Luis Obispo County for new construction. Multiple mitigation procedures are available to individual property owners. These procedures typically involve either

  11. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    NASA Astrophysics Data System (ADS)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that is regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake response technologies by Los Angeles Unified School District and a top-to-bottom examination of Los Angeles County Fire Department's earthquake response strategies.

  12. Earthquake Early Warning ShakeAlert System: Testing and certification platform

    USGS Publications Warehouse

    Cochran, Elizabeth S.; Kohler, Monica D.; Given, Douglas; Guiwits, Stephen; Andrews, Jennifer; Meier, Men-Andrin; Ahmad, Mohammad; Henson, Ivan; Hartog, Renate; Smith, Deborah

    2017-01-01

    Earthquake early warning systems provide warnings to end users of incoming moderate to strong ground shaking from earthquakes. An earthquake early warning system, ShakeAlert, is providing alerts to beta end users in the western United States, specifically California, Oregon, and Washington. An essential aspect of the earthquake early warning system is the development of a framework to test modifications to code to ensure functionality and assess performance. In 2016, a Testing and Certification Platform (TCP) was included in the development of the Production Prototype version of ShakeAlert. The purpose of the TCP is to evaluate the robustness of candidate code that is proposed for deployment on ShakeAlert Production Prototype servers. TCP consists of two main components: a real‐time in situ test that replicates the real‐time production system and an offline playback system to replay test suites. The real‐time tests of system performance assess code optimization and stability. The offline tests comprise a stress test of candidate code to assess if the code is production ready. The test suite includes over 120 events including local, regional, and teleseismic historic earthquakes, recentering and calibration events, and other anomalous and potentially problematic signals. Two assessments of alert performance are conducted. First, point‐source assessments are undertaken to compare magnitude, epicentral location, and origin time with the Advanced National Seismic System Comprehensive Catalog, as well as to evaluate alert latency. Second, we describe assessment of the quality of ground‐motion predictions at end‐user sites by comparing predicted shaking intensities to ShakeMaps for historic events and implement a threshold‐based approach that assesses how often end users initiate the appropriate action, based on their ground‐shaking threshold. TCP has been developed to be a convenient streamlined procedure for objectively testing algorithms, and it has been designed with flexibility to accommodate significant changes in development of new or modified system code. It is expected that the TCP will continue to evolve along with the ShakeAlert system, and the framework we describe here provides one example of how earthquake early warning systems can be evaluated.
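
    The threshold-based assessment of end-user actions described above can be illustrated with a small Python sketch that compares predicted and observed (ShakeMap) intensities against a site's action threshold; the classification labels and MMI values are hypothetical and do not reproduce the ShakeAlert TCP's actual scoring code.

```python
def classify_alert(predicted_mmi: float, observed_mmi: float, user_threshold: float) -> str:
    """Classify a single alert at an end-user site against its action threshold.

    A hypothetical scheme: an alert is 'useful' if both the prediction and the
    observed shaking exceed the threshold, a 'false alert' if only the
    prediction does, 'missed' if only the observation does, and a
    'correct non-alert' otherwise.
    """
    predicted_action = predicted_mmi >= user_threshold
    shaking_occurred = observed_mmi >= user_threshold
    if predicted_action and shaking_occurred:
        return "useful alert"
    if predicted_action:
        return "false alert"
    if shaking_occurred:
        return "missed alert"
    return "correct non-alert"

# Example: a site that takes action at MMI IV -> this case is a false alert.
print(classify_alert(predicted_mmi=5.1, observed_mmi=3.8, user_threshold=4.0))
```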

  13. Simulating cyanobacterial phenotypes by integrating flux balance analysis, kinetics, and a light distribution function

    DOE PAGES

    He, Lian; Wu, Stephen G.; Wan, Ni; ...

    2015-12-24

    Genome-scale models (GSMs) are widely used to predict cyanobacterial phenotypes in photobioreactors (PBRs). However, stoichiometric GSMs mainly focus on fluxomes that result in maximal yields. Cyanobacterial metabolism is controlled by both intracellular enzymes and photobioreactor conditions. To connect intracellular and extracellular information and achieve a better understanding of PBR productivities, this study integrates a genome-scale metabolic model of Synechocystis 6803 with growth kinetics, cell movements, and a light distribution function. The hybrid platform not only maps flux dynamics in cells of sub-populations but also predicts overall production titer and rate in PBRs. Analysis of the integrated GSM demonstrates several results. First, cyanobacteria are capable of reaching high biomass concentrations (>20 g/L in 21 days) in PBRs without light and CO2 mass transfer limitations. Second, the fluxome in a single cyanobacterium may show stochastic changes due to random cell movements in PBRs. Third, insufficient light due to cell self-shading can activate the oxidative pentose phosphate pathway in subpopulation cells. Fourth, the model indicates that removal of the glycogen synthesis pathway may not improve cyanobacterial bio-production in large-size PBRs, because glycogen can support cell growth in the dark zones. Based on experimental data, the integrated GSM estimates that Synechocystis 6803 in shake-flask conditions has a photosynthesis efficiency of ~2.7%. Conclusions: The multiple-scale integrated GSM, which examines both intracellular and extracellular domains, can be used to predict production yield/rate/titer in large-size PBRs. More importantly, genetic engineering strategies predicted by a traditional GSM may work well only in optimal growth conditions. In contrast, the integrated GSM may reveal mutant physiologies in diverse bioreactor conditions, leading to the design of robust strains with high chances of success in industrial settings.
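
    A light distribution function of the kind mentioned above is commonly represented with Beer-Lambert attenuation through the culture; the Python sketch below computes local irradiance as a function of depth and biomass concentration under an assumed biomass-specific extinction coefficient, not the paper's calibrated light model.

```python
import numpy as np

def local_irradiance(i0: float, biomass_g_per_l: float, depth_cm: np.ndarray,
                     k_per_g_per_cm: float = 0.02) -> np.ndarray:
    """Beer-Lambert style attenuation: I(z) = I0 * exp(-k * X * z).

    i0 .............. incident irradiance at the reactor surface (umol photons/m^2/s)
    biomass_g_per_l . cell dry weight concentration
    depth_cm ........ distance(s) from the illuminated surface
    k_per_g_per_cm .. biomass-specific extinction coefficient (assumed value)
    """
    return i0 * np.exp(-k_per_g_per_cm * biomass_g_per_l * depth_cm)

# Self-shading example: irradiance drops sharply within a few cm at 5 g/L biomass.
depths = np.linspace(0.0, 5.0, 6)   # cm into the reactor
print(local_irradiance(i0=200.0, biomass_g_per_l=5.0, depth_cm=depths))
```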

  14. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass roots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several years. Another critical lesson that has been learned is to employ K-12 education professionals and utilize undergrad and graduate student workers in the University's Department of Education. Such staff members are keenly aware of the pressures and needs in diverse communities such as Shelby County, Tennessee and are uniquely suited to design and implement new and innovative programs that provide substantive short-term user benefits and promote long-term relationships with the K-12 teachers, students, and teacher's organizations.

  15. Dynamics of water-table fluctuations in an upland between two prairie-pothole wetlands in North Dakota

    USGS Publications Warehouse

    Rosenberry, Donald O.; Winter, Thomas C.

    1997-01-01

    Data from a string of instrumented wells located on an upland of 55 m width between two wetlands in central North Dakota, USA, indicated frequent changes in water-table configuration following wet and dry periods during 5 years of investigation. A seasonal wetland is situated about 1.5 m higher than a nearby semipermanent wetland, suggesting an average ground water-table gradient of 0.02. However, water had the potential to flow as ground water from the upper to the lower wetland during only a few instances. A water-table trough adjacent to the lower semipermanent wetland was the most common water-table configuration during the first 4 years of the study, but it is likely that severe drought during those years contributed to the longevity and extent of the water-table trough. Water-table mounds that formed in response to rainfall events caused reversals of direction of flow that frequently modified the more dominant water-table trough during the severe drought. Rapid and large water-table rise to near land surface in response to intense rainfall was aided by the thick capillary fringe. One of the wettest summers on record ended the severe drought during the last year of the study, and caused a larger-scale water-table mound to form between the two wetlands. The mound was short in duration because it was overwhelmed by rising stage of the higher seasonal wetland which spilled into the lower wetland. Evapotranspiration was responsible for generating the water-table trough that formed between the two wetlands. Estimation of evapotranspiration based on diurnal fluctuations in wells yielded rates that averaged 3–5 mm day−1. On many occasions water levels in wells closer to the semipermanent wetland indicated a direction of flow that was different from the direction indicated by water levels in wells farther from the wetland. Misinterpretation of direction and magnitude of gradients between ground water and wetlands could result from poorly placed or too few observation wells, and also from infrequent measurement of water levels in wells.
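
    The estimation of evapotranspiration from diurnal water-table fluctuations mentioned above is often done with the White (1932) method; the Python sketch below is a minimal version using an assumed specific yield and a hypothetical hourly water-table record, not the study's data or exact procedure.

```python
def white_method_et(levels_m: list, specific_yield: float = 0.2) -> float:
    """Daily ET (m/day) from hourly water-table elevations via White (1932).

    ET = Sy * (24 * r + delta_s), where r is the recharge rate estimated from
    the night-time rise (00:00-04:00) and delta_s is the net decline of the
    water table over the 24 h period. Specific yield and inputs are illustrative.
    """
    if len(levels_m) != 25:                      # hourly readings, midnight to midnight
        raise ValueError("expected 25 hourly water-table readings")
    r = (levels_m[4] - levels_m[0]) / 4.0        # night-time recovery rate (m/hr)
    delta_s = levels_m[0] - levels_m[24]         # net water-table decline over the day (m)
    return specific_yield * (24.0 * r + delta_s)

# Hypothetical record: slight night-time rise, daytime drawdown from plant uptake.
levels = [10.000 + 0.0005 * h for h in range(5)] + \
         [10.002 - 0.001 * (h - 4) for h in range(5, 25)]
print(f"estimated ET: {white_method_et(levels) * 1000:.1f} mm/day")
```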

  16. Preparing for a "Big One": The great southern California shakeout

    USGS Publications Warehouse

    Jones, L.M.; Benthien, M.

    2011-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million Southern Californians pretended that the magnitude-7.8 ShakeOut scenario earthquake was occurring and practiced actions derived from results of the ShakeOut Scenario, to reduce the impact of a real, San Andreas Fault event. The communications campaign was based on four principles: 1) consistent messaging from multiple sources; 2) visual reinforcement; 3) encouragement of "milling"; and 4) focus on concrete actions. The goals of the ShakeOut established in spring 2008 were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in Southern California; and 3) to reduce earthquake losses in Southern California. Over 90% of the registrants surveyed the next year reported improvement in earthquake preparedness at their organization as a result of the ShakeOut. © 2011, Earthquake Engineering Research Institute.

  17. MyShake: Initial observations from a global smartphone seismic network

    NASA Astrophysics Data System (ADS)

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis

    2016-09-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. In the first 6 months since the release of the MyShake app, there were almost 200,000 downloads. On a typical day about 8000 phones provide acceleration waveform data to the MyShake archive. The on-phone app can detect and trigger on P waves and is capable of recording magnitude 2.5 and larger events. More than 200 seismic events have been recorded so far, including events in Chile, Argentina, Mexico, Morocco, Nepal, New Zealand, Taiwan, Japan, and across North America. The largest number of waveforms from a single earthquake to date comes from the M5.2 Borrego Springs earthquake in Southern California, for which MyShake collected 103 useful three-component waveforms. The network continues to grow with new downloads from the Google Play store every day and expands rapidly when public interest in earthquakes peaks, such as during an earthquake sequence.
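
    The on-phone P-wave triggering described above can be illustrated with a generic short-term/long-term average (STA/LTA) detector on an acceleration trace; the window lengths and threshold below are assumptions for illustration and are not claimed to be MyShake's actual triggering algorithm.

```python
import numpy as np

def sta_lta_trigger(accel: np.ndarray, fs: float, sta_s: float = 0.5,
                    lta_s: float = 10.0, threshold: float = 4.0):
    """Return the first sample index where STA/LTA exceeds the threshold, else None.

    A generic short-term/long-term average detector on signal energy; window
    lengths and the trigger threshold are illustrative assumptions.
    """
    nsta, nlta = int(sta_s * fs), int(lta_s * fs)
    energy = accel.astype(np.float64) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))   # prefix sums for fast window means
    for i in range(nlta, len(energy) - nsta):
        lta = (csum[i] - csum[i - nlta]) / nlta          # long-term average before i
        sta = (csum[i + nsta] - csum[i]) / nsta          # short-term average after i
        if lta > 0 and sta / lta >= threshold:
            return i
    return None

# Synthetic test: background noise with a step-up in amplitude at t = 30 s.
fs = 50.0
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.01, int(60 * fs))
trace[int(30 * fs):] += rng.normal(0, 0.1, int(30 * fs))
idx = sta_lta_trigger(trace, fs)
print(None if idx is None else f"triggered at t = {idx / fs:.1f} s")
```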

  18. Group Cohesion DEOCS 4.1 Construct Validity Summary

    DTIC Science & Technology

    2017-08-01

    ... See Table 4 for more information regarding item reliabilities. The relationship between the original four-point scale (Organizational Cohesion) and ... future analyses, including those using the seven-point scale. Tables 4 and 5 provide additional information regarding the reliability and descriptive ...

  19. Potential groundwater contribution to Amazon evapotranspiration

    NASA Astrophysics Data System (ADS)

    Fan, Y.; Miguez-Macho, G.

    2010-07-01

    Climate and land ecosystem models simulate a dry-season vegetation stress in the Amazon forest, but observations show enhanced growth in response to higher radiation under less cloudy skies, indicating an adequate water supply. Proposed mechanisms include larger soil water store and deeper roots in nature and the ability of roots to move water up and down (hydraulic redistribution). Here we assess the importance of the upward soil water flux from the groundwater driven by capillarity. We present a map of water table depth from observations and groundwater modeling, and a map of potential capillary flux these water table depths can sustain. The maps show that the water table beneath the Amazon can be quite shallow in lowlands and river valleys (<5 m in 36% and <10 m in 60% of Amazonia). The water table can potentially sustain a capillary flux of >2.1 mm day-1 to the land surface averaged over Amazonia, but varies from 0.6 to 3.7 mm day-1 across nine study sites. Current models simulate a large-scale reduction in dry-season photosynthesis under today's climate and a possible dieback under projected future climate with a longer dry season, converting the Amazon from a net carbon sink to a source and accelerating warming. The inclusion of groundwater and capillary flux may modify the model results.

  20. Preface (Volume 51, Issue 7-8, Pages 639-896, July 2003)

    NASA Astrophysics Data System (ADS)

    Andreev, O.

    2003-07-01

    We briefly review a possible scheme for getting the known QCD scaling laws within string theory. In particular, we consider amplitudes for exclusive scattering of hadrons at large momentum transfer and hadronic form factors.
