Sample records for constructing modal base-shake

  1. Comparison of NASTRAN analysis with ground vibration results of UH-60A NASA/AEFA test configuration

    NASA Technical Reports Server (NTRS)

    Idosor, Florentino; Seible, Frieder

    1990-01-01

    Prior to program flight tests, a ground vibration test and modal analysis of a UH-60A Black Hawk helicopter were conducted by Sikorsky Aircraft to complement the UH-60A test plan and the NASA/Army Modern Technology Rotor Airloads Program. The 'NASA/AEFA' shake-test configuration was tested for modal frequencies and shapes, and the results were compared with those of its NASTRAN finite element model counterpart to give correlative results. Based upon previous findings, significant differences in modal data existed and were attributed to assumptions regarding the influence of secondary structure contributions in the preliminary NASTRAN modeling. Analysis of an updated finite element model including several secondary structural additions has confirmed that the inclusion of specific secondary components produces a significant effect on modal frequencies and free-response shapes and improves correlation with shake-test data at lower frequencies.

  2. System identification of timber masonry walls using shaking table test

    NASA Astrophysics Data System (ADS)

    Roy, Timir B.; Guerreiro, Luis; Bagchi, Ashutosh

    2017-04-01

    Dynamic studies are important for the design, repair, and rehabilitation of structures, and have played an important role in characterizing the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and Time Domain Decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is to study the dynamic characteristics of the typical timber masonry walls commonly used in Portugal. For that purpose, a multi-storey structural prototype of such a wall was tested on a seismic shake table at the National Laboratory for Civil Engineering (LNEC), Portugal. Signal processing was performed on the output response, collected with accelerometers during the shake-table experiment on the prototype. In the present work, signal processing of the output response relative to the input was done in two ways: FDD and Stochastic Subspace Identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD are formulated and parametric functions for the SSI are computed. Finally, the estimated values from both methods are compared to assess the accuracy of the two techniques.
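
    As an illustration of the FDD step described in this record, the following minimal Python sketch forms the cross-spectral density matrix of the measured responses, takes its singular value decomposition at each frequency, and picks peaks of the first singular value as candidate natural frequencies. The function name, segment length, and peak-picking logic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import signal

def fdd_peaks(acc, fs, n_peaks=3):
    """Frequency Domain Decomposition sketch: build the cross-spectral
    density matrix of the measured accelerations (channels x samples),
    take its SVD at each frequency line, and pick peaks of the first
    singular value as natural-frequency candidates."""
    n_ch = acc.shape[0]
    f, _ = signal.csd(acc[0], acc[0], fs=fs, nperseg=1024)
    G = np.empty((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = signal.csd(acc[i], acc[j], fs=fs, nperseg=1024)
    # First singular value spectrum; its peaks indicate modes, and the
    # corresponding left singular vectors approximate the mode shapes.
    s1 = np.array([np.linalg.svd(Gf, compute_uv=False)[0] for Gf in G])
    peaks, _ = signal.find_peaks(s1)
    order = peaks[np.argsort(s1[peaks])[::-1]][:n_peaks]
    return np.sort(f[order])
```

    For ambient-vibration data such as that collected here, the peaks would normally be screened by inspecting the singular-value plot rather than taken blindly.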

  3. Dynamic response of NASA Rotor Test Apparatus and Sikorsky S-76 hub mounted in the 80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Peterson, Randall L.; Hoque, Muhammed S.

    1994-01-01

    A shake test was conducted in the 80- by 120-Foot Wind Tunnel at NASA Ames Research Center, using the NASA Ames Rotor Test Apparatus (RTA) and the Sikorsky S-76 rotor hub. The primary objective of this shake test was to determine the modal properties of the RTA, the S-76 rotor hub, and the model support system installed in the wind tunnel. Random excitation was applied at the rotor hub, and vibration responses were measured using accelerometers mounted at various critical locations on the model and the model support system. Transfer functions were computed using the load cell data and the accelerometer responses. The transfer function data were used to compute the system modal parameters with the aid of modal analysis software.
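
    Transfer functions from a measured excitation force and accelerometer responses, as described in this record, are commonly computed with an H1 estimator (cross-spectrum over input auto-spectrum). A minimal sketch, assuming SciPy; the function name and segment length are illustrative choices, not details from the report:

```python
import numpy as np
from scipy import signal

def h1_frf(force, acc, fs, nperseg=1024):
    """H1 frequency-response estimate: cross-spectral density of the
    input force and response acceleration divided by the input
    auto-spectral density. Welch averaging over segments suppresses
    uncorrelated noise on the response channel."""
    f, Pxy = signal.csd(force, acc, fs=fs, nperseg=nperseg)
    _, Pxx = signal.welch(force, fs=fs, nperseg=nperseg)
    return f, Pxy / Pxx
```

    With random excitation of the kind applied at the rotor hub, the averaged H1 estimate converges to the underlying transfer function wherever the coherence is high.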

  4. Shake test results of the MDHC test stand in the 40- by 80-foot wind tunnel

    NASA Technical Reports Server (NTRS)

    Lau, Benton H.; Peterson, Randall

    1994-01-01

    A shake test was conducted to determine the modal properties of the MDHC (McDonnell Douglas Helicopter Company) test stand installed in the 40- by 80-Foot Wind Tunnel at Ames Research Center. The shake test was conducted for three wind-tunnel balance configurations: with and without balance dampers, and with the snubber engaged to lock the balance frame. A hydraulic shaker was used to apply random excitation at the rotor hub in the longitudinal and lateral directions. A GenRad 2515 computer-aided test system computed the frequency response functions at the rotor hub and support struts. From these response functions, the modal properties, including the natural frequencies, damping ratios, and mode shapes, were calculated. The critical modes with low damping ratios are identified as the test-stand second longitudinal mode for the dampers-off configuration, the test-stand yaw mode for the dampers-on configuration, and the test-stand first longitudinal mode for the balance-frame-locked configuration.
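
    Damping ratios of the kind reported in this record are often read off a frequency response function with the classical half-power (-3 dB) bandwidth method. The sketch below is a generic illustration, not the GenRad procedure; the interpolation details are assumptions.

```python
import numpy as np

def half_power_damping(f, frf_mag):
    """Half-power bandwidth estimate of the viscous damping ratio from
    a single FRF magnitude peak: zeta ~ (f2 - f1) / (2 * fn), where f1
    and f2 are the frequencies at peak / sqrt(2)."""
    k = int(np.argmax(frf_mag))
    half = frf_mag[k] / np.sqrt(2.0)
    lo = np.where(frf_mag[:k] < half)[0][-1]       # last point below, left side
    hi = k + np.where(frf_mag[k:] < half)[0][0]    # first point below, right side
    # Linear interpolation of the two half-power crossing frequencies.
    f1 = np.interp(half, [frf_mag[lo], frf_mag[lo + 1]], [f[lo], f[lo + 1]])
    f2 = np.interp(half, [frf_mag[hi], frf_mag[hi - 1]], [f[hi], f[hi - 1]])
    return (f2 - f1) / (2.0 * f[k])
```

    The approximation is accurate for light, well-separated modes; closely spaced modes such as those on a test stand generally call for a curve-fitting modal analysis instead.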

  5. ShakeCast: Automating and Improving the Use of ShakeMap for Post-Earthquake Decision- Making and Response

    NASA Astrophysics Data System (ADS)

    Lin, K.; Wald, D. J.

    2007-12-01

    ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for emergency managers and responders. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, provides overall information regarding the affected areas. When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. To this end, ShakeCast estimates the potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory, providing a simple, hierarchical list and maps showing the structures or facilities most likely to be impacted. All ShakeMap and ShakeCast files and products are non-proprietary, to simplify interfacing with existing users' response tools and to encourage user-made enhancements to the software. ShakeCast uses standard RSS and HTTP requests to communicate with the USGS Web servers that host ShakeMaps, which are widely distributed and heavily mirrored. The RSS approach allows ShakeCast users to initiate and receive selected ShakeMap products and information on software updates. To assess facility damage, ShakeCast users can combine measured or estimated ground-motion parameters with pre-computed damage relationships that take one of these parameters as input and produce a discrete, multi-state estimate of damage likelihood.
Presently, three common approaches are used to provide users with an indication of damage: HAZUS-based, intensity-based, and customized damage functions. Intensity-based thresholds are for locations with poorly established damage relationships; custom damage levels are for advanced ShakeCast users such as Caltrans, which produces its own set of damage functions corresponding to the specific details of each California bridge or overpass in its jurisdiction. For users whose portfolio of structures comprises common, standard designs, ShakeCast offers a simplified structural damage-state estimation capability adapted from the HAZUS-MH earthquake module (NIBS and FEMA, 2003). Currently, the simplified fragility settings consist of 128 combinations of HAZUS model building types, construction materials, building heights, and building-code eras.
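
    A toy version of the intensity-based approach described in this record, mapping a ShakeMap intensity measure to a discrete inspection priority, might look like the following. The threshold values are purely illustrative assumptions, not the relationships shipped with ShakeCast.

```python
# Hypothetical intensity-based inspection-priority rule in the spirit of
# ShakeCast's simplest fragility option: compare a ShakeMap intensity
# measure (here, MMI) against per-facility thresholds, highest first.
# These threshold numbers are illustrative, not ShakeCast defaults.
def inspection_priority(mmi, thresholds=((9.0, "red"), (7.5, "orange"),
                                         (6.0, "yellow"), (0.0, "green"))):
    for tmin, state in thresholds:
        if mmi >= tmin:
            return state
    return "green"
```

    In practice each facility would carry its own thresholds, reflecting how poorly or well its damageability is known.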

  6. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel

    2017-01-01

    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators, and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection prioritization, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical-user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users, while still utilizing advanced users’ technical and engineering background, we have developed a “ShakeCast Workbook”, a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files requisite for operating the ShakeCast system. Users are able to select structure types based on a minimum set of user-specified facility attributes (building location, size, height, use, construction age, etc.). “Expert” users are able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential impact and inspection metrics (i.e., green, yellow, orange, and red priority ratings) that allow users to institute customized earthquake response protocols.
Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and the 0.3 s and 1.0 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage-state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap’s multi-period spectra in lieu of the assumed three-domain design spectrum (constant acceleration at 0.3 s; constant velocity at 1 s or 3 s; constant displacement at very long response periods). As part of ongoing ShakeCast research and development, we will also explore the use of ShakeMap IM uncertainty estimates and evaluate the assumption of employing multiple response spectral damping values rather than the single 5%-damped value currently employed. Developing and incorporating advanced fragility assignments into the ShakeCast Workbook requires related software modifications and database improvements; these enhancements are part of an extensive rewrite of the ShakeCast application.
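
    The three-domain design spectrum referred to in this record can be sketched from the two ShakeMap spectral ordinates. In the sketch below the corner-period choices are illustrative assumptions, not ShakeCast's actual values.

```python
def three_domain_spectrum(T, sa03, sa10):
    """Three-domain design spectrum sketch: constant acceleration
    (anchored at Sa(0.3 s)), constant velocity (Sa proportional to 1/T,
    anchored at Sa(1.0 s)), and constant displacement (Sa proportional
    to 1/T^2) beyond an assumed long-period corner."""
    Ts = sa10 / sa03   # acceleration/velocity corner period
    Tl = 4.0           # velocity/displacement corner period (assumed)
    if T <= Ts:
        return sa03
    if T <= Tl:
        return sa10 / T
    return sa10 * Tl / T**2
```

    Importing ShakeMap's multi-period spectra directly, as proposed above, removes the need for these assumed corner periods altogether.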

  7. Radiated emissions comparison of seven-stage modal filter constructions for Ethernet 100Base-T network protection

    NASA Astrophysics Data System (ADS)

    Khazhibekov, R. R.; Zabolotsky, A. M.

    2018-05-01

    The authors consider Ethernet protection devices based on modal filtering. Radiated emission measurement results for three modal filter constructions are presented. It is shown that the improved construction of a non-resistive filter has lower emission levels than the original one.

  8. Damage detection of rotating wind turbine blades using local flexibility method and long-gauge fiber Bragg grating sensors

    NASA Astrophysics Data System (ADS)

    Hsu, Ting-Yu; Shiao, Shen-Yuan; Liao, Wen-I.

    2018-01-01

    Wind turbines are a cost-effective alternative energy source; however, their blades are susceptible to damage. Therefore, damage detection of wind turbine blades is of great importance for condition monitoring of wind turbines. Many vibration-based structural damage detection techniques have been proposed in the last two decades. The local flexibility method, which can determine local stiffness variations of beam-like structures by using measured modal parameters, is one of the most promising vibration-based approaches. The local flexibility method does not require a finite element model of the structure. A few structural modal parameters identified from the ambient vibration signals both before and after damage are required for this method. In this study, we propose a damage detection approach for rotating wind turbine blades using the local flexibility method based on the dynamic macro-strain signals measured by long-gauge fiber Bragg grating (FBG)-based sensors. A small wind turbine structure was constructed and excited using a shaking table to generate vibration signals. The structure was designed to have natural frequencies as close as possible to those of a typical 1.5 MW wind turbine in real scale. The optical fiber signal of the rotating blades was transmitted to the data acquisition system through a rotary joint fixed inside the hollow shaft of the wind turbine. Reversible damage was simulated by aluminum plates attached to some sections of the wind turbine blades. The damaged locations of the rotating blades were successfully detected using the proposed approach, with the extent of damage somewhat over-estimated. 
Nevertheless, although the wind turbine blade specimen cannot fully represent a real blade, the results demonstrate that FBG-based macro-strain measurement has the potential to capture the modal parameters of rotating wind turbines, and that blade segments with a change of rigidity can then be located effectively using these identified parameters.

  9. Solar collector cell and roof flashing assembly and method of constructing a roof with such an assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayerovitch, M.D.

    1980-03-25

    A solar collector cell formed as an integral portion of a roof flashing is disclosed as comprising a flashing base having a dihedral surface including a larger base portion and a smaller ramp portion, and a solar collector cell container built integrally with the base portion of the flashing. The combination is designed to be installed in the roof of a dwelling or other building structure. The container portion of the flashing is substantially shorter in height above the roof line than conventional solar collector cell structures added to a roof subsequent to its construction. As a result, the invention gives the building constructor or owner the option of either including the solar cell components at the time of construction of the roof to provide a solar heating device, or filling the solar collector cell container with a temporary support structure, such as roof shakes or tiles. The shape of the solar collector cell and flashing assembly permits the solar collector cell structure to be camouflaged by overlying shakes or tiles of which the roof is constructed.

  10. NUTRIENT CHANNELS AND STIRRING ENHANCED THE COMPOSITION AND STIFFNESS OF LARGE CARTILAGE CONSTRUCTS

    PubMed Central

    Cigan, Alexander D.; Nims, Robert J.; Albro, Michael B.; Vunjak-Novakovic, Gordana; Hung, Clark T.; Ateshian, Gerard A.

    2014-01-01

    A significant challenge in cartilage tissue engineering is to successfully culture functional tissues that are sufficiently large to treat osteoarthritic joints. Transport limitations due to nutrient consumption by peripheral cells produce heterogeneous constructs with matrix-deficient centers. Incorporation of nutrient channels into large constructs is a promising technique for alleviating transport limitations, in conjunction with simple yet effective methods for enhancing media flow through channels. Cultivation of cylindrical channeled constructs flat in culture dishes, with or without orbital shaking, produced asymmetric constructs with poor tissue properties. We therefore explored a method for exposing the entire construct surface to the culture media, while promoting flow through the channels. To this end, chondrocyte-seeded agarose constructs (Ø10 mm, 2.34 mm thick), with zero or three nutrient channels (Ø1 mm), were suspended on their sides in custom culture racks and subjected to three media stirring modes for 56 days: uniaxial rocking, orbital shaking, or static control. Orbital shaking led to the highest construct EY, glycosaminoglycan (GAG), and collagen contents, whereas rocking had detrimental effects on GAG and collagen versus static control. Nutrient channels increased EY as well as GAG homogeneity, and the beneficial effects of channels were most marked in orbitally shaken samples. Under these conditions, the constructs developed symmetrically and reached or exceeded native levels of EY (~400 kPa) and glycosaminoglycans (GAG; ~9%/ww). These results suggest that the cultivation of channeled constructs in culture racks with orbital shaking is a promising method for engineering mechanically competent large cartilage constructs. PMID:25458579

  11. Nutrient channels and stirring enhanced the composition and stiffness of large cartilage constructs.

    PubMed

    Cigan, Alexander D; Nims, Robert J; Albro, Michael B; Vunjak-Novakovic, Gordana; Hung, Clark T; Ateshian, Gerard A

    2014-12-18

    A significant challenge in cartilage tissue engineering is to successfully culture functional tissues that are sufficiently large to treat osteoarthritic joints. Transport limitations due to nutrient consumption by peripheral cells produce heterogeneous constructs with matrix-deficient centers. Incorporation of nutrient channels into large constructs is a promising technique for alleviating transport limitations, in conjunction with simple yet effective methods for enhancing media flow through channels. Cultivation of cylindrical channeled constructs flat in culture dishes, with or without orbital shaking, produced asymmetric constructs with poor tissue properties. We therefore explored a method for exposing the entire construct surface to the culture media, while promoting flow through the channels. To this end, chondrocyte-seeded agarose constructs (∅10 mm, 2.34 mm thick), with zero or three nutrient channels (∅1 mm), were suspended on their sides in custom culture racks and subjected to three media stirring modes for 56 days: uniaxial rocking, orbital shaking, or static control. Orbital shaking led to the highest construct EY, sulfated glycosaminoglycan (sGAG), and collagen contents, whereas rocking had detrimental effects on sGAG and collagen versus static control. Nutrient channels increased EY as well as sGAG homogeneity, and the beneficial effects of channels were most marked in orbitally shaken samples. Under these conditions, the constructs developed symmetrically and reached or exceeded native levels of EY (~400 kPa) and sGAG (~9%/ww). These results suggest that the cultivation of channeled constructs in culture racks with orbital shaking is a promising method for engineering mechanically competent large cartilage constructs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Conditioning to colors: a population assay for visual learning in Drosophila.

    PubMed

    van Swinderen, Bruno

    2011-11-01

    Vision is a major sensory modality in Drosophila behavior, with more than one-half of the Drosophila brain devoted to visual processing. The mechanisms of vision in Drosophila can be studied in individuals and in populations of flies by using various paradigms. Although there has never been a widely used population assay for visual learning in Drosophila, some population paradigms have shown significant visual learning. These studies use colors as conditioned stimuli (CS) and shaking as the unconditioned stimulus (US). A simple version of the paradigm, conditioning to colors using a shaking device, is described here. A conditioning chamber, called a crab, is designed to center the flies after shaking by having them tumble down to the lowest point between joined glass tubes forming a V. Thus, vibration should be just strong enough to center most flies. After shaking, flies display a geotactic response and climb up either side of the V, and their choice of which side to climb is influenced by color displays on either side. The proportion of flies on either side determines the flies' natural preference or their learned avoidance of a color associated with shaking.

  13. A Simple Algorithm for Predicting Bacteremia Using Food Consumption and Shaking Chills: A Prospective Observational Study.

    PubMed

    Komatsu, Takayuki; Takahashi, Erika; Mishima, Kentaro; Toyoda, Takeo; Saitoh, Fumihiro; Yasuda, Akari; Matsuoka, Joe; Sugita, Manabu; Branch, Joel; Aoki, Makoto; Tierney, Lawrence; Inoue, Kenji

    2017-07-01

    Predicting the presence of true bacteremia based on clinical examination is unreliable. We aimed to construct a simple algorithm for predicting true bacteremia by using food consumption and shaking chills. This was a prospective multicenter observational study conducted at three hospital centers in a large Japanese city. In total, 1,943 hospitalized patients aged 14 to 96 years who underwent blood culture acquisition between April 2013 and August 2014 were enrolled. Patients with anorexia-inducing conditions were excluded. We assessed the patients' oral food intake based on the meal immediately prior to the blood culture, defined as "normal food consumption" when >80% of the meal was consumed and "poor food consumption" when <80% was consumed. We also concurrently evaluated for a history of shaking chills. We calculated the statistical characteristics of food consumption and shaking chills for the presence of true bacteremia, and subsequently built the algorithm by using recursive partitioning analysis. Among the 1,943 patients, 223 cases were true bacteremia. Among patients with normal food consumption and without shaking chills, the incidence of true bacteremia was 2.4% (13/552). Among patients with poor food consumption and shaking chills, the incidence of true bacteremia was 47.7% (51/107). The presence of poor food consumption had a sensitivity of 93.7% (95% confidence interval [CI], 89.4%-97.9%) for true bacteremia, and its absence (i.e., normal food consumption) had a negative likelihood ratio (LR) of 0.18 (95% CI, 0.17-0.19) for excluding true bacteremia. Conversely, the presence of shaking chills had a specificity of 95.1% (95% CI, 90.7%-99.4%) and a positive LR of 4.78 (95% CI, 4.56-5.00) for true bacteremia. A 2-item screening checklist for food consumption and shaking chills had excellent statistical properties as a brief screening instrument for predicting true bacteremia. © 2017 Society of Hospital Medicine
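
    The resulting two-item screen can be expressed as a simple decision rule. The sketch below only encodes the incidences reported in the abstract; the handling of the two mixed branches is a placeholder, not the authors' published algorithm.

```python
def bacteremia_screen(normal_food_intake, shaking_chills):
    """Coarse risk tier for true bacteremia from the two-item checklist.
    Branch labels follow the incidences reported in the study; the
    'intermediate' branch is an illustrative placeholder."""
    if normal_food_intake and not shaking_chills:
        return "low"           # ~2.4% incidence (13/552) in the study
    if not normal_food_intake and shaking_chills:
        return "high"          # ~47.7% incidence (51/107) in the study
    return "intermediate"      # mixed findings: defer to clinical judgment
```

    A rule this coarse is a triage aid, not a substitute for blood cultures, which the study itself obtained in every enrolled patient.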

  14. Modal space three-state feedback control for electro-hydraulic servo plane redundant driving mechanism with eccentric load decoupling.

    PubMed

    Zhao, Jinsong; Wang, Zhipeng; Zhang, Chuanbi; Yang, Chifu; Bai, Wenjie; Zhao, Zining

    2018-06-01

    The shaking table based on an electro-hydraulic servo parallel mechanism has the advantage of a strong carrying capacity. However, the strong coupling caused by an eccentric load not only degrades control precision in degree-of-freedom space but also complicates system control. A novel decoupling control strategy based on modal space is proposed to solve the coupling problem for a parallel mechanism with eccentric load. The phenomenon of strong dynamic coupling among the degree-of-freedom channels is demonstrated experimentally, and its influence on control design is discussed. Considering the particularity of plane motion, the dynamic model is built by the Lagrangian method to avoid complex calculations. The dynamic equations of the coupled physical space are transformed into the dynamic equations of the decoupled modal space by using the weighted orthogonality of the mode shapes with respect to the mass and stiffness matrices. In the modal space, the adjustments of the modal channels are independent of each other. Moreover, the paper discusses making the closed-loop dynamic characteristics of the modal channels identical, which realizes decoupling in degree-of-freedom space; thus a modal-space three-state feedback control is proposed to expand the frequency bandwidth of each modal channel and ensure near-identical responses over a larger frequency range. Experimental results show that the modal-space three-state feedback control proposed in this paper can effectively reduce the strong coupling among degree-of-freedom channels, verifying the effectiveness of the proposed strategy for improving the control performance of the electro-hydraulic servo plane redundant driving mechanism. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
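
    The modal-space transformation described in this record rests on the weighted orthogonality of the mode shapes with respect to the mass and stiffness matrices. A minimal numerical sketch of that decoupling step (not the authors' controller) follows; mass-normalized mode shapes render the transformed equations independent single-degree-of-freedom channels.

```python
import numpy as np
from scipy.linalg import eigh

def modal_decouple(M, K):
    """Decouple M x'' + K x = f via the generalized eigenproblem
    K phi = w^2 M phi. scipy.linalg.eigh returns M-orthonormal
    eigenvectors, so Phi.T @ M @ Phi = I and Phi.T @ K @ Phi =
    diag(w^2); the transformed equations q'' + diag(w^2) q = Phi.T f
    are then independent modal channels."""
    w2, Phi = eigh(K, M)
    return np.sqrt(w2), Phi
```

    Each decoupled channel can then be given its own (e.g., three-state) feedback loop, which is the structure the strategy above exploits.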

  15. Beside the point: Mothers' head nodding and shaking gestures during parent-child play.

    PubMed

    Fusaro, Maria; Vallotton, Claire D; Harris, Paul L

    2014-05-01

    Understanding the context for children's social learning and language acquisition requires consideration of caregivers' multi-modal (speech, gesture) messages. Though young children can interpret both manual and head gestures, little research has examined the communicative input that children receive via parents' head gestures. We longitudinally examined the frequency and communicative functions of mothers' head nodding and head shaking gestures during laboratory play sessions for 32 mother-child dyads, when the children were 14, 20, and 30 months of age. The majority of mothers produced head nods more frequently than head shakes. Both gestures contributed to mothers' verbal attempts at behavior regulation and dialog. Mothers' head nods primarily conveyed agreement with, and attentiveness to, children's utterances, and accompanied affirmative statements and yes/no questions. Mothers' head shakes primarily conveyed prohibitions and statements with negations. Changes over time appeared to reflect corresponding developmental changes in social and communicative dimensions of caregiver-child interaction. Directions for future research are discussed regarding the role of head gesture input in socialization and in supporting language development. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Practices of shake-flask culture and advances in monitoring CO2 and O2.

    PubMed

    Takahashi, Masato; Aoyagi, Hideki

    2018-05-01

    About 85 years have passed since the shaking culture was devised. Since then, various monitoring devices have been developed to measure culture parameters. The O2 consumed and CO2 produced by the respiration of cells in shaking cultures are of paramount importance due to their presence in both the culture broth and the headspace of the shake flask. Monitoring in situ conditions during shake-flask culture is useful for analysing the behaviour of O2 and CO2, which interact according to Henry's law, and is more convenient than conventional sampling, which requires interruption of shaking. In situ monitoring devices for shake-flask cultures are classified as direct or the recently developed bypass type. It is important to understand the characteristics of each type, along with their unintended effects on shake-flask cultures, in order to improve the existing devices and culture conditions. Technical developments in bypass monitoring devices are strongly desired in the future. It is also necessary to understand the mechanism underlying conventional shake-flask culture. The existing shaking-culture methodology can be expanded into next-generation shake-flask cultures constituting a novel culture environment through a judicious selection of monitoring devices depending on the intended purpose of the shake-flask culture. Constructing and sharing databases compatible with the various types of monitoring devices and measurement instruments adapted for shaking culture can provide a valuable resource for broadening the application of cells in shake-flask culture.

  17. Static strain and vibration characteristics of a metal semimonocoque helicopter tail cone of moderate size

    NASA Technical Reports Server (NTRS)

    Bielawa, Richard L.; Hefner, Rachel E.; Castagna, Andre

    1991-01-01

    The results are presented of an analytic and experimental research program involving a Sikorsky S-55 helicopter tail cone directed ultimately to the improved structural analysis of airframe substructures typical of moderate sized helicopters of metal semimonocoque construction. Experimental static strain and dynamic shake-testing measurements are presented. Correlation studies of each of these tests with a PC-based finite element analysis (COSMOS/M) are described. The tests included static loadings at the end of the tail cone supported in the cantilever configuration as well as vibrational shake-testing in both the cantilever and free-free configurations.

  18. Vibration Modal Characterization of a Stirling Convertor via Base-Shake Excitation

    NASA Technical Reports Server (NTRS)

    Suarez, Vicente J.; Goodnight, Thomas W.; Hughes, William O.; Samorezov, Sergey

    2003-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin (LM), Stirling Technology Company (STC), and NASA John H. Glenn Research Center (GRC) are currently developing a high-efficiency Stirling convertor for use in a Stirling Radioisotope Generator (SRG). NASA and DOE have identified the SRG for potential use as an advanced power system for future NASA Space Science missions, providing spacecraft onboard electric power for deep space missions and power for unmanned Mars rovers. Low-level, base-shake sine vibration tests were conducted on the Stirling Technology Demonstration Convertor (TDC) at NASA GRC's Structural Dynamics Laboratory in February 2001, as part of the development of this Stirling technology. The purpose of these tests was to provide a better understanding of the TDC's internal dynamic response to external vibratory base excitations. The knowledge obtained can thereby be used to help explain the success that the TDC enjoyed in its previous random vibration qualification tests (December 1999). This explanation focuses on the TDC's internal dynamic characteristics in the 50 to 250 Hz frequency range, which corresponds to the maximum input levels of its qualification random vibration test specification. The internal dynamic structural characteristics of the TDC have now been measured in two separate tests under different motoring and dynamic loading conditions: (1) with the convertor being electrically motored, under a vibratory base-shake excitation load, and (2) with the convertor turned off and its alternator internals undergoing dynamic excitation via hammer impact loading. This paper addresses the test setup, procedure, and results of the base-shake vibration testing conducted on the motored TDC, and compares these results with those obtained from the dynamic impact tests (May 2001) on the non-motored TDC.

  19. Engineering of a Stable Whole-Cell Biocatalyst Capable of (S)-Styrene Oxide Formation for Continuous Two-Liquid-Phase Applications

    PubMed Central

    Panke, Sven; de Lorenzo, Víctor; Kaiser, Arnë; Witholt, Bernard; Wubbolts, Marcel G.

    1999-01-01

    Recombinant strains of Pseudomonas putida KT2440 carrying genetic expression cassettes with xylene oxygenase- and styrene monooxygenase-encoding genes on their chromosomes could be induced in shaking-flask experiments to specific activities that rivaled those of multicopy-plasmid-based Escherichia coli recombinants. Such strains maintained the introduced styrene oxidation activity in continuous two-liquid-phase cultures for at least 100 generations, although at a lower level than in the shaking-flask experiments. The data suggest that placement of target genes on the chromosome might be a suitable route for the construction of segregationally stable and highly active whole-cell biocatalysts. PMID:10584030

  20. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements to the ShakeMap and ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
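
    Component fragility curves of the kind described above are conventionally modeled as lognormal CDFs of a ground-motion intensity measure. The sketch below illustrates that convention only; the function name and the median/dispersion values are assumptions for illustration, not ShakeCast's actual implementation.

```python
from math import erf, log, sqrt

def fragility(im, median, beta):
    """P(damage state reached | intensity measure im):
    lognormal CDF with median `median` and log-dispersion `beta`."""
    return 0.5 * (1.0 + erf(log(im / median) / (beta * sqrt(2.0))))

# Hypothetical component curve: median PGA 0.4 g, dispersion 0.6.
p_damage = fragility(0.5, median=0.4, beta=0.6)
```

    In a component-based scheme, each facility carries several such curves (structural, nonstructural, equipment), and a pre-defined logic combines their probabilities into a single inspection priority.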

  1. Construction and identification of a D-Vine model applied to the probability distribution of modal parameters in structural dynamics

    NASA Astrophysics Data System (ADS)

    Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.

    2018-01-01

    This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameters probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of considered modes in our context. In this respect, a mode selection preprocessing step is proposed. It allows the selection of the relevant random modes for a given transfer function. The second point, addressed in this study, concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared. The first one is based on the context of the study whereas the second one is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.
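
    As an illustration of the pair-copula building block that a D-Vine chains together, the snippet below evaluates the density of a bivariate Gaussian copula, one common pair-copula family. Treat it as a generic sketch: the abstract does not state which copula families the authors use.

```python
from math import exp, sqrt
from statistics import NormalDist

def gaussian_pair_copula_density(u, v, rho):
    """Density c(u, v; rho) of the bivariate Gaussian copula on uniform
    margins u, v in (0, 1); rho is the Gaussian correlation parameter."""
    x = NormalDist().inv_cdf(u)  # map back to standard-normal scores
    y = NormalDist().inv_cdf(v)
    r2 = rho * rho
    return exp(-(r2 * (x * x + y * y) - 2.0 * rho * x * y)
               / (2.0 * (1.0 - r2))) / sqrt(1.0 - r2)
```

    In a D-Vine, the joint density of the modal parameters factorizes into the marginal densities times a product of such pair-copula densities, one per edge of the vine; the likelihood maximized during identification is exactly that product evaluated on the data.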

  2. Seismic damage diagnosis of a masonry building using short-term damping measurements

    NASA Astrophysics Data System (ADS)

    Kouris, Leonidas Alexandros S.; Penna, Andrea; Magenes, Guido

    2017-04-01

    It is of considerable importance to perform dynamic identification and detect damage in existing structures. This paper describes a new and practical method for damage diagnosis of masonry buildings requiring minimum computational effort. The method is based on the relative variation of modal damping and validated against experimental data from a full-scale, two-storey shake table test. The experiment involves a building subjected to uniaxial vibrations of progressively increasing intensity at the facilities of the EUCENTRE laboratory (Pavia, Italy) up to a near-collapse damage state. Five time-histories are applied scaling the Montenegro (1979) accelerogram. These strong motion tests are preceded by random vibration tests (RVTs) which are used to perform modal analysis. Two deterministic methods are applied: the single degree of freedom (SDOF) assumption together with the peak-picking method in the discrete frequency domain, and the eigensystem realisation algorithm with data correlations (ERA-DC) in the discrete time domain. Regarding the former procedure, some improvements are incorporated to locate rigorously the natural frequencies and estimate the modal damping. The progressive evolution of the modal damping is used as a key indicator to characterise damage on the building. Modal damping is connected to the structural mass and stiffness. A proportional (classical) damping expression restricted to two components is proposed to better fit the experimental measurements of modal damping ratios. Using this Rayleigh-type formulation, the contribution of each damping component is evaluated. The stiffness component coefficient is proposed as an effective index to detect damage and quantify its intensity.
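
    The mass- and stiffness-proportional split behind such a classical (Rayleigh) damping model can be sketched as follows. The 3 Hz / 10 Hz modes and the 5% damping ratios are hypothetical values for illustration, not the paper's measured data.

```python
from math import pi

def rayleigh_coefficients(w1, z1, w2, z2):
    """Solve zeta_i = a/(2*w_i) + b*w_i/2 for the mass-proportional (a)
    and stiffness-proportional (b) coefficients of Rayleigh damping,
    given target damping ratios z1, z2 at circular frequencies w1, w2."""
    b = 2.0 * (z2 * w2 - z1 * w1) / (w2 ** 2 - w1 ** 2)
    a = 2.0 * z1 * w1 - b * w1 ** 2
    return a, b

def modal_damping(w, a, b):
    """Modal damping ratio predicted by Rayleigh damping at frequency w."""
    return a / (2.0 * w) + b * w / 2.0

# Hypothetical modes at 3 Hz and 10 Hz, both measured at 5% damping.
a, b = rayleigh_coefficients(2 * pi * 3.0, 0.05, 2 * pi * 10.0, 0.05)
```

    Refitting (a, b) after each shaking run and tracking the stiffness-proportional coefficient b is the kind of damage-index computation the abstract alludes to: stiffness loss changes how damping scales with frequency.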

  3. U.S. Geological Survey's ShakeCast: A cloud-based future

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Turner, Loren; Bekiri, Nebi

    2014-01-01

    When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap portrays the extent of potentially damaging shaking. In turn, the ShakeCast system, a freely-available, post-earthquake situational awareness application, automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. ShakeCast is particularly suitable for earthquake planning and response purposes by Departments of Transportation (DOTs), critical facility and lifeline utilities, large businesses, engineering and financial services, and loss and risk modelers. Recent important developments to the ShakeCast system and its user base are described. The newly-released Version 3 of the ShakeCast system encompasses advancements in seismology, earthquake engineering, and information technology applicable to the legacy ShakeCast installation (Version 2). In particular, this upgrade includes a full statistical fragility analysis framework for general assessment of structures as part of the near real-time system, direct access to additional earthquake-specific USGS products besides ShakeMap (PAGER, DYFI?, tectonic summary, etc.), significant improvements in the graphical user interface, including a console view for operations centers, and custom, user-defined hazard and loss modules. The release also introduces a new adaptation option to port ShakeCast to the "cloud". Employing Amazon Web Services (AWS), users now have a low-cost alternative to local hosting, by fully offloading hardware, software, and communication obligations to the cloud.
Other advantages of the "ShakeCast Cloud" strategy include (1) Reliability and robustness of offsite operations, (2) Scalability, naturally accommodated, (3) Serviceability, with problems reduced due to software and hardware uniformity, (4) Testability, freely available for new users, (5) Remote support, allowing expert-facilitated maintenance, (6) Adoptability, simplified with disk images, and (7) Security, built in at the very high level associated with AWS. The ShakeCast user base continues to expand and broaden. For example, Caltrans, the prototypical ShakeCast user and development supporter, has been providing guidance to other DOTs on the use of the National Bridge Inventory (NBI) database to implement fully-functional ShakeCast systems in their states. A long-term goal underway is to further "connect the DOTs" via a Transportation Pooled Fund (TPF) with participating state DOTs. We also review some of the many other users and uses of ShakeCast. Lastly, on the hazard input front, we detail related ShakeMap improvements and ongoing advancements in estimating the likelihood of shaking-induced secondary hazards at structures, facilities, bridges, and along roadways due to landslides and liquefaction, and implemented within the ShakeCast framework.

  4. ShakeCast Manual

    USGS Publications Warehouse

    Lin, Kuo-Wan; Wald, David J.

    2008-01-01

    ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, and generates potential damage assessment notifications, facility damage maps, and other Web-based products for emergency managers and responders.

  5. Detection on vehicle vibration induced by the engine shaking based on the laser triangulation

    NASA Astrophysics Data System (ADS)

    Chen, Wenxue; Yang, Biwu; Ni, Zhibin; Hu, Xinhan; Han, Tieqiang; Hu, Yaocheng; Zhang, Wu; Wang, Yunfeng

    2017-10-01

    The magnitude of engine shaking is chosen to evaluate vehicle performance, and the engine shaking is in turn assessed from the vehicle vibration it induces. Based on laser triangulation, the vehicle vibration is measured by detecting the variation in distance between the bodywork and the road surface; the results represent the magnitude of engine shaking. The principle and configuration of the laser triangulation system are also introduced in this paper.
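
    The similar-triangles relation underlying laser triangulation can be sketched as below. The symbols and numerical values (baseline, lens focal length, spot offset) are illustrative assumptions, not the paper's hardware parameters.

```python
def triangulation_distance(baseline, focal_length, image_offset):
    """Range of the laser spot by similar triangles: the spot's lateral
    offset on the image sensor shrinks as the target distance grows."""
    return baseline * focal_length / image_offset

# e.g. 10 cm laser-camera baseline, 50 mm lens, 5 mm spot offset -> 1 m range
d = triangulation_distance(0.10, 0.050, 0.005)
```

    Vibration measurement then amounts to sampling image_offset over time: the oscillation of the recovered distance between bodywork and road surface is the vibration signal attributed to engine shaking.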

  6. Real-time Shakemap implementation in Austria

    NASA Astrophysics Data System (ADS)

    Weginger, Stefan; Jia, Yan; Papi Isaba, Maria; Horn, Nikolaus

    2017-04-01

    ShakeMaps provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. They are automatically generated within a few minutes of an earthquake's occurrence. We tested and integrated the Python-based USGS ShakeMap 4.0 (experimental code) into the Antelope real-time system, with locally modified GMPEs and site effects based on conditions in Austria. The ShakeMaps are provided in terms of Intensity, PGA, PGV and PSA. We also show future plans for presenting ShakeMap contour lines and ground-motion parameters with interactive maps and data exchange over Web services.

  7. Minority shareholder claims foul at Stone & Webster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krizan, W.G.

    1994-05-09

    An activist minority shareholder is trying to shake up the management of one of the industry's oldest firms, New York City-based engineering-constructor Stone & Webster Inc. The shareholder has filed a lawsuit against the 105-year-old firm, claiming that it is understating losses from construction activities and is using the voting rights of employee-owned stock to perpetuate current management to the detriment of all shareholders.

  8. The Development of the Acoustic Design of NASA Glenn Research Center's New Reverberant Acoustic Test Facility

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Mark E.; Hozman, Aron D.; McNelis, Anne M.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is leading the design and build of the new world-class vibroacoustic test capabilities at the NASA GRC's Plum Brook Station in Sandusky, Ohio. Benham Companies, LLC is currently constructing modal, base-shake sine and reverberant acoustic test facilities to support the future testing needs of NASA's space exploration program. The large Reverberant Acoustic Test Facility (RATF) will be approximately 101,000 ft3 in volume and capable of achieving an empty chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. The key to achieving the expected acoustic test spectra for a range of many NASA space flight environments in the RATF is the knowledge gained from a series of ground acoustic tests. Data was obtained from several NASA-sponsored test programs, including testing performed at the National Research Council of Canada's acoustic test facility in Ottawa, Ontario, Canada, and at the Redstone Technical Test Center acoustic test facility in Huntsville, Alabama. The majority of these tests were performed to characterize the acoustic performance of the modulators (noise generators) and representative horns that would be required to meet the desired spectra, as well as to evaluate possible supplemental gas jet noise sources. The knowledge obtained in each of these test programs enabled the design of the RATF sound generation system to confidently advance to its final acoustic design and subsequent on-going construction.

  9. The Testing Behind The Test Facility: The Acoustic Design of the NASA Glenn Research Center's World-Class Reverberant Acoustic Test Facility

    NASA Technical Reports Server (NTRS)

    Hozman, Aron D.; Hughes, William O.; McNelis, Mark E.; McNelis, Anne M.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is leading the design and build of the new world-class vibroacoustic test capabilities at the NASA GRC's Plum Brook Station in Sandusky, Ohio, USA. Benham Companies, LLC is currently constructing modal, base-shake sine and reverberant acoustic test facilities to support the future testing needs of NASA's space exploration program. The large Reverberant Acoustic Test Facility (RATF) will be approximately 101,000 cu ft in volume and capable of achieving an empty chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. The key to achieving the expected acoustic test spectra for a range of many NASA space flight environments in the RATF is the knowledge gained from a series of ground acoustic tests. Data was obtained from several NASA-sponsored test programs, including testing performed at the National Research Council of Canada's acoustic test facility in Ottawa, Ontario, Canada, and at the Redstone Technical Test Center acoustic test facility in Huntsville, Alabama, USA. The majority of these tests were performed to characterize the acoustic performance of the modulators (noise generators) and representative horns that would be required to meet the desired spectra, as well as to evaluate possible supplemental gas jet noise sources. The knowledge obtained in each of these test programs enabled the design of the RATF sound generation system to confidently advance to its final acoustic design and subsequent on-going construction.

  10. The Development of the Acoustic Design of NASA Glenn Research Center's New Reverberant Acoustic Test Facility

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Mark E.; Hozman, Aron D.; McNelis, Anne M.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is leading the design and build of the new world-class vibroacoustic test capabilities at the NASA GRC's Plum Brook Station in Sandusky, Ohio, USA. Benham Companies, LLC is currently constructing modal, base-shake sine and reverberant acoustic test facilities to support the future testing needs of NASA's space exploration program. The large Reverberant Acoustic Test Facility (RATF) will be approximately 101,000 ft3 in volume and capable of achieving an empty chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. The key to achieving the expected acoustic test spectra for a range of many NASA space flight environments in the RATF is the knowledge gained from a series of ground acoustic tests. Data was obtained from several NASA-sponsored test programs, including testing performed at the National Research Council of Canada's acoustic test facility in Ottawa, Ontario, Canada, and at the Redstone Technical Test Center acoustic test facility in Huntsville, Alabama, USA. The majority of these tests were performed to characterize the acoustic performance of the modulators (noise generators) and representative horns that would be required to meet the desired spectra, as well as to evaluate possible supplemental gas jet noise sources. The knowledge obtained in each of these test programs enabled the design of the RATF sound generation system to confidently advance to its final acoustic design and subsequent on-going construction.

  11. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground‐motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site‐specific factors based on VS30 shear‐wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk is high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking.
National models should be developed by scientists and engineers in each country using the best available science.
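
    Under the usual Poisson assumption, the exceedance probabilities quoted in hazard maps map directly to mean return periods; a quick sketch of that arithmetic:

```python
from math import log

def return_period(p_exceed, exposure_years):
    """Mean return period (years) implied by a Poisson probability of
    exceedance p_exceed over an exposure window of exposure_years."""
    return -exposure_years / log(1.0 - p_exceed)

# 2% and 10% in 50 years give the familiar ~2475- and ~475-year hazard levels.
t_rare = return_period(0.02, 50.0)
t_design = return_period(0.10, 50.0)
```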

  12. Wood Shakes and Shingles for Roof Applications: Tips for Longer Life

    Treesearch

    Mark T. Knaebe

    2013-01-01

    Many wood shakes and shingles have been replaced by composition or asphalt-based shingles. Nevertheless, wood shakes and shingles are still widely used on commercial structures and residential houses.

  13. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    USGS Publications Warehouse

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the US Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically-derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time-of-day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time-of-day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years.
We subsequently compare this "composite ShakeMap" with existing global hazard models, calculating the spatial area of the existing hazard maps exceeded by the combined ShakeMap ground motions. In general, these analyses suggest that existing global, and regional, hazard maps tend to overestimate hazard. Both the Atlas of ShakeMaps and EXPO-CAT have many potential uses for examining earthquake risk and epidemiology. All of the datasets discussed herein are available for download on the PAGER Web page ( http://earthquake.usgs.gov/eqcenter/pager/prodandref/ ). © 2009 Springer Science+Business Media B.V.

  14. Construction of a quartz spherical analyzer: application to high-resolution analysis of the Ni Kα emission spectrum

    DOE PAGES

    Honnicke, Marcelo Goncalves; Bianco, Leonardo M.; Ceppi, Sergio A.; ...

    2016-08-10

    The construction and characterization of a focusing X-ray spherical analyzer based on α-quartz $(4\bar{4}04)$ are presented. For this study, the performance of the analyzer was demonstrated by applying it to a high-resolution X-ray spectroscopy study of the Kα1,2 emission spectrum of Ni. An analytical representation based on physical grounds was assumed to model the shape of the X-ray emission lines. Satellite structures assigned to 3d spectator hole transitions were resolved and determined, as well as their relative contribution to the emission spectrum. The present results on 1s⁻¹3d⁻¹ shake probabilities support a recently proposed calculation framework based on a multi-configuration atomic model.

  15. Construction of a quartz spherical analyzer: application to high-resolution analysis of the Ni Kα emission spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honnicke, Marcelo Goncalves; Bianco, Leonardo M.; Ceppi, Sergio A.

    The construction and characterization of a focusing X-ray spherical analyzer based on α-quartz $(4\bar{4}04)$ are presented. For this study, the performance of the analyzer was demonstrated by applying it to a high-resolution X-ray spectroscopy study of the Kα1,2 emission spectrum of Ni. An analytical representation based on physical grounds was assumed to model the shape of the X-ray emission lines. Satellite structures assigned to 3d spectator hole transitions were resolved and determined, as well as their relative contribution to the emission spectrum. The present results on 1s⁻¹3d⁻¹ shake probabilities support a recently proposed calculation framework based on a multi-configuration atomic model.

  16. Construction of a quartz spherical analyzer: application to high-resolution analysis of the Ni K α emission spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honnicke, Marcelo Goncalves; Bianco, Leonardo M.; Ceppi, Sergio A.

    The construction and characterization of a focusing X-ray spherical analyzer based on α-quartz $(4\bar{4}04)$ are presented. The performance of the analyzer was demonstrated by applying it to a high-resolution X-ray spectroscopy study of the Kα1,2 emission spectrum of Ni. An analytical representation based on physical grounds was assumed to model the shape of the X-ray emission lines. Satellite structures assigned to 3d spectator hole transitions were resolved and determined, as well as their relative contribution to the emission spectrum. The present results on 1s⁻¹3d⁻¹ shake probabilities support a recently proposed calculation framework based on a multi-configuration atomic model.

  17. Introducing ShakeMap to potential users in Puerto Rico using scenarios of damaging historical and probable earthquakes

    NASA Astrophysics Data System (ADS)

    Huerfano, V. A.; Cua, G.; von Hillebrandt, C.; Saffar, A.

    2007-12-01

    The island of Puerto Rico has a long history of damaging earthquakes. Major earthquakes from off-shore sources have affected Puerto Rico in 1520, 1615, 1670, 1751, 1787, 1867, and 1918 (Mueller et al, 2003; PRSN Catalogue). Recent trenching has also yielded evidence of possible M7.0 events inland (Prentice, 2000). The high seismic hazard, large population, high tsunami potential and relatively poor construction practice can result in a potentially devastating combination. Efficient emergency response in the event of a large earthquake will be crucial to minimizing the loss of life and disruption of lifeline systems in Puerto Rico. The ShakeMap system (Wald et al, 2004), developed by the USGS to rapidly display and disseminate information about the geographical distribution of ground shaking (and hence potential damage) following a large earthquake, has proven to be a vital tool for post-earthquake emergency response efforts, and is being adopted/emulated in various seismically active regions worldwide. Implementing a robust ShakeMap system is among the top priorities of the Puerto Rico Seismic Network. However, the ultimate effectiveness of ShakeMap in post-earthquake response depends not only on its rapid availability, but also on the effective use of the information it provides. We developed ShakeMap scenarios for a suite of damaging historical and probable earthquakes that severely impact San Juan, Ponce, and Mayagüez, the 3 largest cities in Puerto Rico. Earthquake source parameters were obtained from McCann and Mercado (1998) and Huérfano (2004). For historical earthquakes that generated tsunamis, tsunami inundation maps were generated using the TIME method (Shuto, 1991).
The ShakeMap ground shaking maps were presented to local and regional governmental and emergency response agencies at the 2007 Annual Conference of the Puerto Rico Emergency Management and Disaster Administration in San Juan, PR, and at numerous other emergency management talks and training sessions. Economic losses are estimated using the ShakeMap scenario ground motions (Saffar, 2007). The calibration tasks necessary in generating these scenarios (developing Vs30 maps, attenuation relationships) complement the on-going efforts of the Puerto Rico Seismic Network to generate ShakeMaps in real-time.

  18. The Testing Behind the Test Facility: the Acoustic Design of the NASA Glenn Research Center's World-Class Reverberant Acoustic Test Facility

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Mark E.; Hozman, Aron D.; McNelis, Anne M.

    2010-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is leading the design and build of the new world-class vibroacoustic test capabilities at the NASA GRC's Plum Brook Station in Sandusky, Ohio, U.S.A. Benham Companies, LLC is currently constructing modal, base-shake sine and reverberant acoustic test facilities to support the future testing needs of NASA's space exploration program. The large Reverberant Acoustic Test Facility (RATF) will be approximately 101,000 ft3 in volume and capable of achieving an empty chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. The key to achieving the expected acoustic test spectra for a range of many NASA space flight environments in the RATF is the knowledge gained from a series of ground acoustic tests. Data was obtained from several NASA-sponsored test programs, including testing performed at the National Research Council of Canada's acoustic test facility in Ottawa, Ontario, Canada, and at the Redstone Technical Test Center acoustic test facility in Huntsville, Alabama, U.S.A. The majority of these tests were performed to characterize the acoustic performance of the modulators (noise generators) and representative horns that would be required to meet the desired spectra, as well as to evaluate possible supplemental gas jet noise sources. The knowledge obtained in each of these test programs enabled the design of the RATF sound generation system to confidently advance to its final acoustic design and subsequent ongoing construction.

  19. The Testing Behind The Test Facility: The Acoustic Design of the NASA Glenn Research Center's World-Class Reverberant Acoustic Test Facility

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Mark E.; McNelis, Anne M.

    2011-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) is leading the design and build of the new world-class vibroacoustic test capabilities at the NASA GRC's Plum Brook Station in Sandusky, Ohio, USA. Benham Companies, LLC is currently constructing modal, base-shake sine and reverberant acoustic test facilities to support the future testing needs of NASA's space exploration program. The large Reverberant Acoustic Test Facility (RATF) will be approximately 101,000 ft3 in volume and capable of achieving an empty chamber acoustic overall sound pressure level (OASPL) of 163 dB. This combination of size and acoustic power is unprecedented amongst the world's known active reverberant acoustic test facilities. The key to achieving the expected acoustic test spectra for a range of many NASA space flight environments in the RATF is the knowledge gained from a series of ground acoustic tests. Data was obtained from several NASA-sponsored test programs, including testing performed at the National Research Council of Canada's acoustic test facility in Ottawa, Ontario, Canada, and at the Redstone Technical Test Center acoustic test facility in Huntsville, Alabama, USA. The majority of these tests were performed to characterize the acoustic performance of the modulators (noise generators) and representative horns that would be required to meet the desired spectra, as well as to evaluate possible supplemental gas jet noise sources. The knowledge obtained in each of these test programs enabled the design of the RATF sound generation system to confidently advance to its final acoustic design and subsequent on-going construction.

  20. ShakeCast: Automating and improving the use of ShakeMap for post-earthquake decision-making and response

    USGS Publications Warehouse

    Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren

    2008-01-01

    When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008 Earthquake Engineering Research Institute.

  1. The design and performance of a low-cost strong-motion sensor using the ICS-3028 micromachined accelerometer

    USGS Publications Warehouse

    Evans, J.R.

    1998-01-01

    The severity of earthquake ground shaking varies tremendously over very short distances (Figures 1a-c). Within a distance of as little as 1 km from the nearest station, one knows little more than what can be obtained from an attenuation relation, given only distance from the fault rupture and the geology of the site. For example, if some station measures 0.5 g peak ground acceleration (PGA), then at a distance of 1 km from that site, under otherwise identical conditions, the shaking has one chance in three of being under 0.36 g or over 0.70 g, based on the curve shown in Figures 1a, c. Similarly, pseudovelocity (PSV) response spectra have a 5% chance of differing by a factor of two at 1 km distance (Figure 1b). This variance can be the difference between moderate and severe damage. Hence, there are critical needs, both in emergency response and in mitigation (prediction of shaking strength, building codes, structural engineering), to sample ground shaking densely enough to identify individual neighborhoods suffering localized, strong shaking. These needs imply a spatially dense network of strong-motion seismographs, probably numbering thousands of sites in an urban region the size of the San Francisco Bay Area, California (Figure 1c). It has not been economically feasible to field that many instruments, since existing ones cost many thousands of dollars apiece. For example, there are currently just a few dozen digital free-field instruments in the Bay Area. This paper is one step toward a solution to this conundrum. I demonstrate that a recently developed class of accelerometers, those constructed from silicon by 'micromachining' (a process similar to integrated circuit fabrication), is now capable of resolving ground motion with the necessary accuracy while greatly lowering both acquisition and maintenance costs.
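
    The quoted odds are consistent with the usual assumption that ground-motion variability about an attenuation relation is lognormal. A minimal check, where the log-standard-deviation sigma_ln = 0.34 is back-calculated from the numbers in the abstract rather than taken from the paper:

```python
import math

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_outside(median_g, lo_g, hi_g, sigma_ln):
    # P(PGA < lo_g or PGA > hi_g) for a lognormal distribution with the
    # given median (in g) and log-standard-deviation.
    z_lo = (math.log(lo_g) - math.log(median_g)) / sigma_ln
    z_hi = (math.log(hi_g) - math.log(median_g)) / sigma_ln
    return norm_cdf(z_lo) + (1.0 - norm_cdf(z_hi))

# Median 0.5 g at 1 km: chance of falling below 0.36 g or above 0.70 g.
p = prob_outside(0.5, 0.36, 0.70, 0.34)
```

    With sigma_ln near 0.34 the combined tail probability comes out close to the one-in-three figure quoted above.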

  2. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map.
Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-faulting dimensions are added. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
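
    A minimal sketch of such a letter grading, assuming the map-wide uncertainty has been summarized as a single mean factor scaling the GMPE sigma (1.0 meaning no added uncertainty). The thresholds are illustrative placeholders, not the published calibration:

```python
def shakemap_grade(mean_sigma_ratio):
    # 'A' (best) through 'F' (poorest), keyed to how much the map-mean
    # ground-motion sigma is inflated relative to the GMPE's nominal
    # sigma. The bin edges below are hypothetical.
    if mean_sigma_ratio < 1.0:
        return "A"  # dense station coverage pulls uncertainty down
    if mean_sigma_ratio < 1.1:
        return "B"
    if mean_sigma_ratio < 1.3:
        return "C"  # moderate event, point source adequate
    if mean_sigma_ratio < 1.6:
        return "D"  # large event, finite source not yet constrained
    return "F"
```

    Under a scheme like this a map's grade improves automatically as observations or finite-fault constraints are added and the mean sigma ratio drops.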

  3. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.

    2012-12-01

    We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal and spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms: one that tessellates the surface so as to fuse data from a large region around Pasadena, and another that uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that connects directly to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events.
Visualization models of instrumented buildings ranging between five and 22 stories tall have been constructed using Google SketchUp. Ambient vibration records are used to identify the first set of horizontal vibrational modal frequencies of the buildings. These frequencies are used to compute the response on every floor of the building, given either observed data or scenario ground motion input at the buildings' base.
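
    The per-geocell fusion step described above can be sketched as a simple rule: flag a geocell only when enough of its live sensors report anomalous acceleration in the same short time window, which suppresses single-sensor false positives from slammed doors or passing trucks. The function name, thresholds, and data layout are assumptions for illustration, not the CSN production algorithm:

```python
from collections import defaultdict

def detect_geocell_events(picks, live_sensors, min_fraction=0.3, min_count=5):
    # picks: (geocell_id, sensor_id) anomaly reports within one window.
    # live_sensors: geocell_id -> number of currently reporting sensors.
    reporters = defaultdict(set)
    for cell, sensor in picks:
        reporters[cell].add(sensor)
    events = []
    for cell, sensors in reporters.items():
        n = live_sensors.get(cell, 0)
        if n and len(sensors) >= min_count and len(sensors) / n >= min_fraction:
            events.append(cell)
    return sorted(events)

# Six of ten sensors in cell "a" pick; a lone sensor in cell "b" does
# not clear the thresholds.
picks = [("a", i) for i in range(6)] + [("b", 0)]
events = detect_geocell_events(picks, {"a": 10, "b": 10})
```

    A regional detector can then look for temporal-spatial patterns across the flagged geocells, as the abstract describes.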

  4. USGS ShakeMap Developments, Implementation, and Derivative Tools

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Lin, K.; Quitoriano, V.; Worden, B.

    2007-12-01

    We discuss ongoing development and enhancements of ShakeMap, a system for automatically generating maps of ground shaking and intensity in the minutes following an earthquake. The rapid availability of these maps is of particular value to emergency response organizations, utilities, insurance companies, government decision-makers, the media, and the general public. ShakeMap Version 3.2 was released in March 2007 on a download site which allows ShakeMap developers to track operators' updates and provide follow-up information; V3.2 has now been downloaded in 15 countries. The V3.2 release supports LINUX in addition to other UNIX operating systems and adds enhancements to XML, KML, metadata, and other products. We have also added an uncertainty measure, quantified as a function of spatial location. Uncertainty is essential for evaluating the range of possible losses. Though not released in V3.2, we will describe a new quantitative uncertainty letter grading for each ShakeMap produced, allowing users to gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of their post-earthquake critical decision-making process. Since the V3.2 release, several new ground motion prediction equations have also been added to the prediction equation modules. ShakeMap is implemented in several new regions as reported in this Session. Within the U.S., robust systems serve California, Nevada, Utah, Washington and Oregon, Hawaii, and Anchorage. Additional systems are in development, and efforts to provide backup capabilities for all Advanced National Seismic System (ANSS) regions at the National Earthquake Information Center are underway. Outside the U.S., this Session has descriptions of ShakeMap systems in Italy, Switzerland, Romania, and Turkey, among other countries. 
We also describe our predictive global ShakeMap system for the rapid evaluation of significant earthquakes globally for the Prompt Assessment of Global Earthquakes for Response (PAGER) system. These global ShakeMaps are constrained by rapidly gathered intensity data via the Internet and by finite fault and aftershock analyses for portraying fault rupture dimensions. As part of the PAGER loss calibration process we have produced an Atlas of ShakeMaps for significant earthquakes around the globe since 1973 (Allen and others, this Session); these Atlas events have additional constraints provided by archival strong motion, faulting dimensions, and macroseismic intensity data. We also describe derivative tools for further utilizing ShakeMap, including ShakeCast, a fully automated system for delivering specific ShakeMap products to critical users and triggering established post-earthquake response protocols. We have released ShakeCast Version 2.0 (Lin and others, this Session), which allows RSS feeds for automatically receiving ShakeMap files, auto-launching of post-download processing scripts, and delivery of notifications based on users' likely facility damage states derived from ShakeMap shaking parameters. As part of our efforts to produce estimated ShakeMaps globally, we have developed a procedure for deriving Vs30 estimates from correlations with topographic slope, and we have now implemented a global Vs30 Server, allowing users to generate Vs30 maps for custom user-selected regions around the globe (Allen and Wald, this Session). Finally, as a further derivative product of the ShakeMap Atlas project, we will present a shaking hazard map for the past 30 years based on approximately 3,900 ShakeMaps of historic earthquakes.
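
    The slope-to-Vs30 step can be sketched as a lookup from topographic gradient to a representative site velocity; the break points below are illustrative placeholders, not the published correlation coefficients:

```python
def vs30_from_slope(slope_m_per_m):
    # Flatter terrain tends to hold soft sediments (low Vs30); steeper
    # terrain tends to expose rock (high Vs30). Bin edges are hypothetical.
    bins = [
        (1e-4, 180),  # flat basins: soft soil
        (2e-3, 270),  # gentle slopes: stiff soil
        (2e-2, 400),  # moderate slopes: very dense soil / soft rock
        (1e-1, 620),  # steep slopes: rock
    ]
    for max_slope, vs30 in bins:
        if slope_m_per_m <= max_slope:
            return vs30
    return 760  # steepest terrain: firm rock
```

    Applying such a lookup to a digital elevation model yields a first-order Vs30 map anywhere on the globe, which is what makes the approach attractive for predictive ShakeMaps in regions without site surveys.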

  5. Experimental evaluation of four ground-motion scaling methods for dynamic response-history analysis of nonlinear structures

    USGS Publications Warehouse

    O'Donnell, Andrew P.; Kurama, Yahya C.; Kalkan, Erol; Taflanidis, Alexandros A.

    2017-01-01

    This paper experimentally evaluates four methods to scale earthquake ground motions within an ensemble of records to minimize the statistical dispersion and maximize the accuracy in the dynamic peak roof drift demand and peak inter-story drift demand estimates from response-history analyses of nonlinear building structures. The scaling methods that are investigated are based on: (1) ASCE/SEI 7-10 guidelines; (2) spectral acceleration at the fundamental (first mode) period of the structure, Sa(T1); (3) maximum incremental velocity, MIV; and (4) modal pushover analysis. A total of 720 shake-table tests of four small-scale nonlinear building frame specimens with different static and dynamic characteristics are conducted. The peak displacement demands from full suites of 36 near-fault ground-motion records as well as from smaller “unbiased” and “biased” design subsets (bins) of ground motions are included. Out of the four scaling methods, ground motions scaled to the median MIV of the ensemble resulted in the smallest dispersion in the peak roof and inter-story drift demands. Scaling based on MIV also provided the most accurate median demands as compared with the “benchmark” demands for structures with greater nonlinearity; however, this accuracy was reduced for structures exhibiting reduced nonlinearity. The modal pushover-based scaling (MPS) procedure was the only method to conservatively overestimate the median drift demands.
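
    MIV is commonly defined as the largest area under the acceleration history between consecutive zero crossings. A minimal sketch of computing it and of deriving per-record scale factors toward a target MIV (the helper names are assumptions; the paper's exact scaling procedure may differ):

```python
import math

def max_incremental_velocity(acc, dt):
    # Largest area under the acceleration trace between consecutive
    # zero crossings: a measure of pulse severity.
    best, area, prev_pos = 0.0, 0.0, acc[0] >= 0.0
    for a in acc:
        pos = a >= 0.0
        if pos != prev_pos:  # zero crossing closes a pulse
            best = max(best, abs(area))
            area = 0.0
        area += a * dt
        prev_pos = pos
    return max(best, abs(area))

def scale_factors(records, dt, target_miv):
    # One multiplicative factor per record so each matches target_miv.
    return [target_miv / max_incremental_velocity(r, dt) for r in records]

# Check: a unit half-sine acceleration pulse of duration pi seconds has
# incremental velocity equal to the integral of sin(t) over [0, pi] = 2.
dt = 0.001
acc = [math.sin(i * dt) for i in range(int(2 * math.pi / dt))]
miv = max_incremental_velocity(acc, dt)
factors = scale_factors([acc], dt, 1.0)
```

    Scaling each record so its MIV equals the ensemble median is one plausible reading of "scaled to the median MIV of the ensemble" above.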

  6. Operational modal analysis of a high-rise multi-function building with dampers by a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Ni, Yanchun; Lu, Xilin; Lu, Wensheng

    2017-03-01

    The field non-destructive vibration test plays an important role in the area of structural health monitoring. It assists in monitoring the health status and reducing the risk caused by the poor performance of structures. As the most economic field test among the various vibration tests, the ambient vibration test is the most popular and is widely used to assess the physical condition of a structure under operational service. Based on ambient vibration data, modal identification provides a significant basis for model updating and damage detection during the service life of a structure, and it has proven effective in investigating the dynamic performance of many kinds of structures. In this paper, the objective structure is a high-rise multi-function office building. The whole building is composed of seven three-story structural units. Each unit comprises one complete floor and two L-shaped floors to form large spaces along the vertical direction. There are 56 viscous dampers installed in the building to improve the energy dissipation capacity. Due to the special features of the structure, field vibration tests and further modal identification were performed to investigate its dynamic performance. Twenty-nine setups were designed to cover all the degrees of freedom of interest. About two years later, another field test was carried out to measure the building for 48 h to investigate the variance and distribution of the modal parameters. A Fast Bayesian FFT method was employed to perform the modal identification. This Bayesian method not only provides the most probable values of the modal parameters but also assesses the associated posterior uncertainty analytically, which is especially relevant in field vibration tests, where uncertainty arises from measurement noise, sensor alignment error, modelling error, etc. 
A shaking table test was also performed, including cases with and without dampers, to investigate the effect of the dampers. The modal parameters obtained from the different tests were investigated separately and then compared with each other.

  7. ShakeMap Atlas 2.0: an improved suite of recent historical earthquake ShakeMaps for global hazard analyses and loss model calibration

    USGS Publications Warehouse

    Garcia, D.; Mah, R.T.; Johnson, K.L.; Hearne, M.G.; Marano, K.D.; Lin, K.-W.; Wald, D.J.

    2012-01-01

    We introduce the second version of the U.S. Geological Survey ShakeMap Atlas, which is an openly available compilation of nearly 8,000 ShakeMaps of the most significant global earthquakes between 1973 and 2011. This revision of the Atlas includes: (1) a new version of the ShakeMap software that improves data usage and uncertainty estimations; (2) an updated earthquake source catalogue that includes regional locations and finite fault models; (3) a refined strategy to select prediction and conversion equations based on a new seismotectonic regionalization scheme; and (4) vastly more macroseismic intensity and ground-motion data from regional agencies. All these changes make the new Atlas a self-consistent, calibrated ShakeMap catalogue that constitutes an invaluable resource for investigating near-source strong ground-motion, as well as for seismic hazard, scenario, risk, and loss-model development. To this end, the Atlas will provide a hazard base layer for PAGER loss calibration and for the Earthquake Consequences Database within the Global Earthquake Model initiative.

  8. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. 
The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  9. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. 
We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
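
    The factor-of-16 figure above follows from grid-resolution scaling: doubling the maximum resolved frequency halves the required grid spacing along each of the three spatial axes and halves the time step, so the work grows with the fourth power of the frequency ratio. A one-line check:

```python
def complexity_ratio(f_new, f_old, spatial_dims=3):
    # Work scales as (f_new / f_old) ** (spatial_dims + 1): one factor
    # per spatial axis plus one for the time step.
    return (f_new / f_old) ** (spatial_dims + 1)

ratio = complexity_ratio(1.0, 0.5)  # 0.5 Hz -> 1.0 Hz
```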

  10. Strong ground motions generated by earthquakes on creeping faults

    USGS Publications Warehouse

    Harris, Ruth A.; Abrahamson, Norman A.

    2014-01-01

    A tenet of earthquake science is that faults are locked in position until they abruptly slip during the sudden strain-relieving events that are earthquakes. While locked faults are expected to produce noticeable ground shaking when they finally slip, it is uncertain how the ground shakes during earthquakes on creeping faults. Creeping faults are rare throughout much of the Earth's continental crust, but there is a group of them in the San Andreas fault system. Here we evaluate the strongest ground motions from the largest well-recorded earthquakes on creeping faults. We find that the peak ground motions generated by the creeping fault earthquakes are similar to the peak ground motions generated by earthquakes on locked faults. Our findings imply that buildings near creeping faults need to be designed to withstand the same level of shaking as those constructed near locked faults.

  11. The Long-term Impacts of Earthquakes on Economic Growth

    NASA Astrophysics Data System (ADS)

    Lackner, S.

    2016-12-01

    The social science literature has so far not reached a consensus on whether and how earthquakes actually impact economic growth in the long run. Several hypotheses have been suggested, and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. an earthquake dummy or number of events), the use of magnitude instead of a measure of surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data are used. Here we use peak ground acceleration (PGA) as the ground motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of globally relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and other economic variables such as employment and education are also investigated. This will, on the one hand, help to identify the mechanisms by which earthquakes impact long-run growth and, on the other, show potential impacts on welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.

  12. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes, such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow the formulation of cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is typically not the case with PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms, the estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  13. Validation of in vitro assays in three-dimensional human dermal constructs.

    PubMed

    Idrees, Ayesha; Chiono, Valeria; Ciardelli, Gianluca; Shah, Siegfried; Viebahn, Richard; Zhang, Xiang; Salber, Jochen

    2018-05-01

    Three-dimensional cell culture systems are urgently needed for cytocompatibility testing of biomaterials. This work aimed at the development of three-dimensional in vitro dermal skin models and their optimization for cytocompatibility evaluation. Initially a "murine in vitro dermal construct" based on L929 cells was generated, leading to the development of a "human in vitro dermal construct" consisting of normal human dermal fibroblasts in rat tail tendon collagen type I. To assess the viability of the cells, different assays (CellTiter-Blue®, RealTime-Glo™ MT, and CellTiter-Glo®; Promega) were evaluated to identify the assay best suited to the respective cell type and three-dimensional system. Z-stack imaging (Live/Dead and Phalloidin/DAPI; Promokine) was performed to visualize normal human dermal fibroblasts inside the matrix, revealing filopodia-like morphology and a uniform distribution of normal human dermal fibroblasts in the matrix. CellTiter-Glo was found to be the optimal cell viability assay among those analyzed. CellTiter-Blue reagent affected the cell morphology of normal human dermal fibroblasts (unlike L929), suggesting an interference with cell biological activity and resulting in less reliable viability data. On the other hand, RealTime-Glo provided a linear signal only with a very low cell density, which made this assay unsuitable for this system. CellTiter-Glo was adapted to the three-dimensional dermal construct by optimizing the "shaking time" to enhance reagent penetration and maximum adenosine triphosphate release, indicating a 2.4 times higher viability value with shaking for 60 min than for 5 min. In addition, viability results showed that cells were viable inside the matrix. This model could be further advanced with more layers of skin to make a full-thickness model.

  14. Three-dimensional (3D) evaluation of liquid distribution in shake flask using an optical fluorescence technique.

    PubMed

    Azizan, Amizon; Büchs, Jochen

    2017-01-01

    Biotechnological development in shake flasks necessitates knowledge of vital engineering parameters, e.g. volumetric power input, mixing time, gas-liquid mass transfer coefficient, hydromechanical stress and effective shear rate. Determination and optimization of these parameters through experiments are labor-intensive and time-consuming. Computational Fluid Dynamics (CFD) provides the ability to predict and validate these parameters in bioprocess engineering. This work provides ample experimental data which are easily accessible for future validations to represent the hydrodynamics of the fluid flow in the shake flask. A non-invasive measuring technique using an optical fluorescence method was developed for shake flasks containing a fluorescent solution with a waterlike viscosity at varying filling volume (VL = 15 to 40 mL) and shaking frequency (n = 150 to 450 rpm) at a constant shaking diameter (d0 = 25 mm). The method detected the leading edge (LB) and tail of the rotating bulk liquid (TB) relative to the direction of the centrifugal acceleration at varying circumferential heights from the base of the shake flask. The determined LB and TB points were translated into three-dimensional (3D) circumferential liquid distribution plots. The maximum liquid height (Hmax) of the bulk liquid increased with increasing filling volume and shaking frequency, as expected. The toroidal shapes of LB and TB are clearly asymmetrical, and the measured TB differed by the elongation of the liquid, particularly towards the torus part of the shake flask. The 3D liquid distribution data collected at varying filling volume and shaking frequency, comprising LB and TB values relative to the direction of the centrifugal acceleration, are essential for validating future numerical solutions using CFD to predict vital engineering parameters in shake flasks.
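
    As a rough check on the operating range studied above, the centrifugal acceleration imposed on the bulk liquid follows from the textbook relation a = omega^2 * r with omega = 2*pi*n and r = d0/2 (a standard physics formula; the derived numbers below are not values reported in the paper):

```python
import math

def centrifugal_acceleration(n_rpm, shaking_diameter_m):
    # a = (2*pi*n)^2 * (d0 / 2), with n converted to revolutions/second.
    n = n_rpm / 60.0
    return (2.0 * math.pi * n) ** 2 * (shaking_diameter_m / 2.0)

# Most intense condition in the study: 450 rpm at d0 = 25 mm.
a_max = centrifugal_acceleration(450, 0.025)
g_ratio = a_max / 9.81  # roughly 2.8 g
```

    This is the acceleration that shapes the rotating liquid torus whose leading edge and tail the optical method tracks.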

  15. BE, DO, and Modal Auxiliaries of 3-Year-Old African American English Speakers

    ERIC Educational Resources Information Center

    Newkirk-Turner, Brandi L.; Oetting, Janna B.; Stockman, Ida J.

    2014-01-01

    Purpose: This study examined African American English--speaking children's use of BE, DO, and modal auxiliaries. Method: The data were based on language samples obtained from 48 three-year-olds. Analyses examined rates of marking by auxiliary type, auxiliary surface form, succeeding element, and syntactic construction and by a number of child…

  16. A Field-Shaking System to Reduce the Screening Current-Induced Field in the 800-MHz HTS Insert of the MIT 1.3-GHz LTS/HTS NMR Magnet: A Small-Model Study.

    PubMed

    Lee, Jiho; Park, Dongkeun; Michael, Philip C; Noguchi, So; Bascuñán, Juan; Iwasa, Yukikazu

    2018-04-01

    In this paper, we present experimental results of a small-model study, from which we plan to develop and apply a full-scale field-shaking system to reduce the screening current-induced field (SCF) in the 800-MHz HTS Insert (H800) of the MIT 1.3-GHz LTS/HTS NMR magnet (1.3G) currently under construction; the H800 is composed of 3 nested coils, each a stack of no-insulation (NI) REBCO double-pancakes. In 1.3G, H800 is the chief source of a large error field generated by its own SCF. To study the effectiveness of the field-shaking technique, we used two NI REBCO double-pancakes, one from Coil 2 (HCoil2) and one from Coil 3 (HCoil3) of the 3 H800 coils, and placed them in the bore of a 5-T/300-mm room-temperature bore low-temperature superconducting (LTS) background magnet. The background magnet is used not only to induce the SCF in the double-pancakes but also to reduce it by the field-shaking technique. For each run, we induced the SCF in the double-pancakes at an axial location where the external radial field Br > 0, then, for the field-shaking, moved them to another location where the external axial field Bz ≫ Br. Due to the geometry of the H800 and L500, the top double-pancakes of the 3 H800 coils will experience a considerable radial magnetic field perpendicular to the REBCO tape surface. To examine the effect of the field-shaking on the SCF, we tested each NI REBCO double-pancake in the absence or presence of a radial field. In this paper, we report 77-K experimental results and analysis of the effect, along with a few significant remarks on the field-shaking technique.

  17. Shaking Alone Induces De Novo Conversion of Recombinant Prion Proteins to β-Sheet Rich Oligomers and Fibrils

    PubMed Central

    Ladner-Keay, Carol L.; Griffith, Bethany J.; Wishart, David S.

    2014-01-01

    The formation of β-sheet rich prion oligomers and fibrils from native prion protein (PrP) is thought to be a key step in the development of prion diseases. Many methods are available to convert recombinant prion protein into β-sheet rich fibrils using various chemical denaturants (urea, SDS, GdnHCl), high temperature, phospholipids, or mildly acidic conditions (pH 4). Many of these methods also require shaking or another form of agitation to complete the conversion process. We have identified that shaking alone causes the conversion of recombinant PrP to β-sheet rich oligomers and fibrils at near physiological pH (pH 5.5 to pH 6.2) and temperature. This conversion does not require any denaturant, detergent, or any other chemical cofactor. Interestingly, this conversion does not occur when the water-air interface is eliminated in the shaken sample. We have analyzed shaking-induced conversion using circular dichroism, resolution enhanced native acidic gel electrophoresis (RENAGE), electron microscopy, Fourier transform infrared spectroscopy, thioflavin T fluorescence and proteinase K resistance. Our results show that shaking causes the formation of β-sheet rich oligomers with a population distribution ranging from octamers to dodecamers and that further shaking causes a transition to β-sheet fibrils. In addition, we show that shaking-induced conversion occurs for a wide range of full-length and truncated constructs of mouse, hamster and cervid prion proteins. We propose that this method of conversion provides a robust, reproducible and easily accessible model for scrapie-like amyloid formation, allowing the generation of milligram quantities of physiologically stable β-sheet rich oligomers and fibrils. These results may also have interesting implications regarding our understanding of prion conversion and propagation both within the brain and via techniques such as protein misfolding cyclic amplification (PMCA) and quaking induced conversion (QuIC). PMID:24892647

  18. Relations between some horizontal‐component ground‐motion intensity measures used in practice

    USGS Publications Warehouse

    Boore, David; Kishida, Tadahiro

    2017-01-01

    Various measures using the two horizontal components of recorded ground motions have been used in a number of studies that derive ground‐motion prediction equations and construct maps of shaking intensity. We update relations between a number of these measures, including those in Boore et al. (2006) and Boore (2010), using the large and carefully constructed global database of ground motions from crustal earthquakes in active tectonic regions developed as part of the Pacific Earthquake Engineering Research Center–Next Generation Attenuation‐West2 project. The ratios from the expanded datasets generally agree to within a few percent of the previously published ratios. We also provide some ratios that were not considered before, some of which will be useful in applications such as constructing ShakeMaps. Finally, we compare two important ratios with those from a large central and eastern North American database and from many records from subduction earthquakes in Japan and Taiwan. In general, the ratios from these regions are within several percent of those from crustal earthquakes in active tectonic regions.
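The kinds of conversions such ratios enable can be illustrated with a minimal sketch, using hypothetical peak amplitudes rather than the study's empirical regression results; the geometric mean and larger-component measures shown here are two common horizontal-component conventions:

```python
import numpy as np

# Hypothetical peak amplitudes (e.g., PGA in g) of the two horizontal
# components of a single recorded ground motion.
h1, h2 = 0.30, 0.20

gm = np.sqrt(h1 * h2)          # geometric mean of the two components
am = 0.5 * (h1 + h2)           # arithmetic mean
larger = max(h1, h2)           # larger horizontal component

# Ratios of this kind (e.g., larger-component / geometric-mean) are what
# such studies tabulate, so that GMPEs and shaking maps built with
# different conventions can be converted to a common measure.
ratio_larger_to_gm = larger / gm
print(round(gm, 4), round(am, 4), round(ratio_larger_to_gm, 4))
```

In practice the published ratios are averages over many records, with region-dependent scatter of a few percent, which is why the paper compares active tectonic, stable continental, and subduction datasets.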

  19. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning

    USGS Publications Warehouse

    Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.

    2014-01-01

    Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) and broad-band (0–10 Hz) data sets. CyberShake encompasses 3-D wave-propagation simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) for various active faults in southern California that earthquakes need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking in the Los Angeles (LA) basin. 
These thresholds are used to construct a simple and robust EEW algorithm: to declare a warning, the algorithm only needs to locate the earthquake and to verify that the corresponding magnitude threshold is exceeded. The models predict that a relatively moderate M6.5–7 earthquake along the Palos Verdes, Newport-Inglewood/Rose Canyon, Elsinore or San Jacinto faults with a rupture propagating towards LA could cause ‘very strong’ to ‘severe’ shaking in the LA basin; however, warning times for these events could exceed 30 s.
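The data-driven idea can be sketched in miniature, with ordinary least squares standing in for the paper's support vector regression and synthetic values standing in for the CyberShake simulations; the feature set (magnitude, distance, rupture ratio) mirrors the inputs named above, but the coefficients are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a training set: features are magnitude, log10
# hypocentral distance (km), and rupture ratio; the target is
# instrumental intensity (MMI) at one hypothetical site.
n = 500
mag = rng.uniform(6.5, 8.5, n)
logr = np.log10(rng.uniform(10.0, 200.0, n))
ratio = rng.uniform(0.0, 1.0, n)
mmi = 1.5 * mag - 3.0 * logr + 0.5 * ratio + rng.normal(0.0, 0.4, n)

# Least squares used here as a simple stand-in for the SVR; both learn
# the mapping directly from simulations, without a pre-defined GMPE
# functional form or assumed dependencies among input parameters.
X = np.column_stack([np.ones(n), mag, logr, ratio])
coef, *_ = np.linalg.lstsq(X, mmi, rcond=None)

# Predict MMI for a hypothetical M7.5 rupture 50 km away, bilateral (0.5).
pred = coef @ np.array([1.0, 7.5, np.log10(50.0), 0.5])
print(coef.round(2), round(pred, 2))
```

The fitted model then plays the role described in the abstract: given only hypocentre, magnitude, and rupture ratio, it returns an intensity forecast that can be thresholded for early warning decisions.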

  20. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. 
Minimum shaking levels are provided for sites far from active faulting. Our procedures and standards are presented at the DSOD website http://damsafety.water.ca.gov/. We review our methods and tools periodically under the guidance of our Consulting Board for Earthquake Analysis (and expect to make changes pending NGA completion), mindful that frequent procedural changes can interrupt design evaluations.

  1. A gantry-based tri-modality system for bioluminescence tomography

    PubMed Central

    Yan, Han; Lin, Yuting; Barber, William C.; Unlu, Mehmet Burcin; Gulsen, Gultekin

    2012-01-01

    A gantry-based tri-modality system that combines bioluminescence (BLT), diffuse optical (DOT), and x-ray computed tomography (XCT) in the same setting is presented here. The purpose of this system is to perform bioluminescence tomography using a multi-modality imaging approach. As part of this hybrid system, XCT and DOT provide anatomical information and background optical property maps. This structural and functional a priori information is used to guide and constrain the bioluminescence reconstruction algorithm and ultimately improve the BLT results. The performance of the combined system is evaluated using multi-modality phantoms. In particular, a cylindrical heterogeneous multi-modality phantom that contains regions with higher optical absorption and x-ray attenuation is constructed. We showed that a 1.5 mm diameter bioluminescence inclusion can be localized accurately with the functional a priori information, while its source strength can be recovered more accurately using both the structural and the functional a priori information. PMID:22559540

  2. Rotating Shake Test and Modal Analysis of a Model Helicopter Rotor Blade

    NASA Technical Reports Server (NTRS)

    Wilkie, W. Keats; Mirick, Paul H.; Langston, Chester W.

    1997-01-01

    Rotating blade frequencies for a model generic helicopter rotor blade mounted on an articulated hub were experimentally determined. Testing was conducted using the Aeroelastic Rotor Experimental System (ARES) testbed in the Helicopter Hover Facility (HBF) at Langley Research Center. The measured data were compared to pretest analytical predictions of the rotating blade frequencies made using the MSC/NASTRAN finite-element computer code. The MSC/NASTRAN solution sequences used to analyze the model were modified to account for differential stiffening effects caused by the centrifugal force acting on the blade and rotating system dynamic effects. The correlation of the MSC/NASTRAN-derived frequencies with the experimental data is, in general, very good although discrepancies in the blade torsional frequency trends and magnitudes were observed. The procedures necessary to perform a rotating system modal analysis of a helicopter rotor blade with MSC/NASTRAN are outlined, and complete sample data deck listings are provided.

  3. Responses of a tall building with U.S. code-type instrumentation in Tokyo, Japan, to events before, during and after the Tohoku earthquake of 11 March 2011

    USGS Publications Warehouse

    Çelebi, Mehmet; Kashima, Toshihide; Ghahari, S. Farid; Abazarsa, Fariba; Taciroglu, Ertugrul

    2016-01-01

    The 11 March 2011 M 9.0 Tohoku earthquake generated long-duration shaking that propagated hundreds of kilometers from the epicenter and affected tall buildings in urban areas several hundred kilometers from the epicenter of the main shock. Recorded responses show that tall buildings were affected by long-period motions. This study presents the behavior and performance of a 37-story building in the Tsukuda area of Tokyo, Japan, as inferred from modal analyses of records retrieved for a time interval covering a few days before, during, and for several months after the main shock. The U.S. “code-type” array comprises three triaxial accelerometers deployed at three levels in the superstructure. Such a sparse array in a tall structure limits a reliable assessment, because its performance must be based on only the average drift ratios. Based on the inferred values of this parameter, the subject building was not structurally damaged.

  4. iShake: Mobile Phones as Seismic Sensors (Invited)

    NASA Astrophysics Data System (ADS)

    Dashti, S.; Reilly, J.; Bray, J. D.; Bayen, A. M.; Glaser, S. D.; Mari, E.

    2010-12-01

    Emergency responders must “see” the effects of an earthquake clearly and rapidly so that they can respond effectively to the damage it has produced. Great strides have been made recently in developing methodologies that deliver rapid and accurate post-earthquake information. However, shortcomings still exist. The iShake project is an innovative use of cell phones and information technology to bridge the gap between the high quality, but sparse, ground motion instrument data that are used to help develop ShakeMap and the low quality, but large quantity, human observational data collected to construct a “Did You Feel It?” (DYFI)-based map. Rather than using people as measurement “devices” as is being done through DYFI, the iShake project is using their cell phones to measure ground motion intensity parameters and automatically deliver the data to the U.S. Geological Survey (USGS) for processing and dissemination. In this participatory sensing paradigm, quantitative shaking data from numerous cellular phones will enable the USGS to produce shaking intensity maps more accurately than presently possible. The phone sensor, however, is an imperfect device with performance variations among phones of a given model as well as between models. The sensor is the entire phone, not just the micro-machined transducer inside. A series of 1-D and 3-D shaking table tests were performed at UC San Diego and UC Berkeley, respectively, to evaluate the performance of a class of cell phones. In these tests, seven iPhones and iPod Touch devices that were mounted at different orientations were subjected to 124 earthquake ground motions to characterize their response and reliability as seismic sensors. The testing also provided insight into the seismic response of unsecured and falling instruments. The cell phones measured seismic parameters such as peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), and 5% damped spectral accelerations well. 
In general, iPhone and iPod Touch sensors slightly over-estimated ground motion energy (i.e., Arias Intensity, Ia). However, the mean acceleration response spectrum of the seven iPhones compared remarkably well with that of the reference high quality accelerometers. The error in the recorded intensity parameters was dependent on the characteristics of the input ground motion, particularly its PGA and Ia, and increased for stronger motions. The use of a high-friction device cover (e.g., rubber iPhone covers) on unsecured phones yielded substantially improved data by minimizing independent phone movement. Useful information on the ground motion characteristics was even extracted from unsecured phones during intense shaking events. The insight gained from these experiments is valuable in distilling information from a large number of imperfect signals from phones that may not be rigidly connected to the ground. With these ubiquitous measurement devices, a more accurate and rapid portrayal of the damage distribution during an earthquake can be provided to emergency responders and to the public.
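Two of the intensity parameters named above have simple definitions that can be sketched directly; the accelerogram below is a synthetic decaying sinusoid standing in for a recorded trace, not iShake data:

```python
import numpy as np

# Synthetic accelerogram (hypothetical): 20 s at 100 Hz, in m/s^2.
dt = 0.01
t = np.arange(0.0, 20.0, dt)
acc = 2.0 * np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.5 * t)

g = 9.81
# Peak ground acceleration, expressed in units of g.
pga = np.max(np.abs(acc)) / g

# Arias intensity: Ia = (pi / 2g) * integral of a(t)^2 dt  (m/s),
# approximated here by a Riemann sum over the sampled record.
ia = (np.pi / (2.0 * g)) * np.sum(acc**2) * dt

print(round(pga, 3), round(ia, 4))
```

Because Ia integrates the squared acceleration over the whole record, it is sensitive to both amplitude and duration, which is consistent with the observation above that phone errors grew with PGA and Ia of the input motion.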

  5. Association of adverse childhood experiences with shaking and smothering behaviors among Japanese caregivers.

    PubMed

    Isumi, Aya; Fujiwara, Takeo

    2016-07-01

    Shaking and smothering in response to infant crying are life-threatening forms of child abuse. A parental history of childhood abuse is known to be one of the most robust risk factors for abusing one's offspring. In addition to childhood abuse history, other adverse childhood experiences (ACEs) need to be considered due to their co-occurrence. However, few studies have investigated the impact of ACEs on caregivers' shaking and smothering of their infants. This study aims to investigate the association of ACEs with shaking and smothering among caregivers of infants in Japan. A questionnaire was administered to caregivers participating in a four-month health checkup between September 2013 and August 2014 in Chiba City, Japan, to assess their ACEs (parental death, parental divorce, mentally ill parents, witnessing intimate partner violence (IPV), physical abuse, neglect, psychological abuse, and economic hardship) and shaking and smothering of their infants (N=4297). Logistic regression analysis was used to examine the cumulative and individual impacts of ACEs on shaking and smothering. Analyses were conducted in 2015. A total of 28.3% reported having experienced at least one ACE during their childhood. We found that only witnessing IPV had a significant association with shaking of the infant (OR=1.93, 95% CI: 1.03-3.61). The total number of ACEs was not associated with either shaking or smothering. Our findings suggest that shaking and smothering in response to crying can occur regardless of ACEs. Population-based strategies that target all caregivers to prevent shaking and smothering of infants are needed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Eigensystem realization algorithm user's guide forVAX/VMS computers: Version 931216

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.

    1994-01-01

    The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
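The core of the ERA can be sketched in a few lines: build block-Hankel matrices from impulse-response (Markov parameter) samples, truncate the SVD to the model order, and read modal frequency and damping from the eigenvalues of the realized state matrix. This is a minimal single-mode sketch on synthetic data, not the documented FORTRAN implementation:

```python
import numpy as np

# Impulse response of a single 2.0-Hz mode with 2% damping, sampled at
# 50 Hz, standing in for measured free-decay data.
dt, fn, zeta = 0.02, 2.0, 0.02
wn = 2.0 * np.pi * fn
wd = wn * np.sqrt(1.0 - zeta**2)
t = np.arange(0.0, 8.0, dt)
y = np.exp(-zeta * wn * t) * np.sin(wd * t)   # Markov parameters

# Hankel matrices from the samples and their one-step shift.
r, c = 40, 100
H0 = np.array([[y[i + j] for j in range(c)] for i in range(r)])
H1 = np.array([[y[i + j + 1] for j in range(c)] for i in range(r)])

# SVD-truncate to the model order (2 states for one mode) and form the
# realized state matrix A = S^{-1/2} U^T H1 V S^{-1/2}.
U, s, Vt = np.linalg.svd(H0)
n = 2
Si = np.diag(1.0 / np.sqrt(s[:n]))
A = Si @ U[:, :n].T @ H1 @ Vt[:n, :].T @ Si

# Discrete eigenvalues -> continuous poles -> frequency and damping.
lam = np.log(np.linalg.eigvals(A)) / dt
freq = np.abs(lam[0]) / (2.0 * np.pi)
damp = -lam[0].real / np.abs(lam[0])
print(round(freq, 3), round(damp, 3))
```

With noise-free data and a matched model order the identified frequency and damping recover the simulated values; the production software's many pre- and post-processors exist largely to handle the noisy, multi-output case this sketch omits.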

  7. Shaking video stabilization with content completion

    NASA Astrophysics Data System (ADS)

    Peng, Yi; Ye, Qixiang; Liu, Yanmei; Jiao, Jianbin

    2009-01-01

    A new stabilization algorithm to counterbalance the shaking motion in a video, based on the classical Kanade-Lucas-Tomasi (KLT) method, is presented in this paper. Feature points are evaluated with the law of large numbers and a clustering algorithm to reduce the side effect of the moving foreground. Analysis of the change of motion direction is also carried out to detect the existence of shaking. For video clips with detected shaking, an affine transformation is performed to warp the current frame to the reference one. In addition, the missing content of a frame during stabilization is completed with optical flow analysis and a mosaicking operation. Experiments on video clips demonstrate the effectiveness of the proposed algorithm.
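The warping step can be sketched as a least-squares affine fit to matched feature points; the tracks below are synthetic (a known jitter applied to random points), standing in for correspondences that a real pipeline would obtain from KLT tracking:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature tracks: points in the reference frame and their
# positions in the shaken current frame (here generated by a known
# 1-degree rotation plus a (+3, +2) pixel translation).
ref = rng.uniform(0, 640, size=(30, 2))
th = np.deg2rad(1.0)
R = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])
cur = ref @ R.T + np.array([3.0, 2.0])

# Least-squares affine fit mapping current-frame points back to the
# reference frame; warping the current frame with this transform
# cancels the estimated shake.
ones = np.ones((len(cur), 1))
Xc = np.hstack([cur, ones])                  # [x y 1] design matrix
params, *_ = np.linalg.lstsq(Xc, ref, rcond=None)

back = Xc @ params                           # stabilized point positions
print(round(float(np.max(np.abs(back - ref))), 6))
```

With exact correspondences the residual is at floating-point level; with real tracks, the clustering and law-of-large-numbers filtering described above serve to reject foreground points before this fit.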

  8. Using structural damage statistics to derive macroseismic intensity within the Kathmandu valley for the 2015 M7.8 Gorkha, Nepal earthquake

    NASA Astrophysics Data System (ADS)

    McGowan, S. M.; Jaiswal, K. S.; Wald, D. J.

    2017-09-01

    We make and analyze structural damage observations from within the Kathmandu valley following the 2015 M7.8 Gorkha, Nepal earthquake to derive macroseismic intensities at several locations including some located near ground motion recording sites. The macroseismic intensity estimates supplement the limited strong ground motion data in order to characterize the damage statistics. This augmentation allows for direct comparisons between ground motion amplitudes and structural damage characteristics and ultimately produces a more constrained ground shaking hazard map for the Gorkha earthquake. For systematic assessments, we focused on damage to three specific building categories: (a) low/mid-rise reinforced concrete frames with infill brick walls, (b) unreinforced brick masonry bearing walls with reinforced concrete slabs, and (c) unreinforced brick masonry bearing walls with partial timber framing. Evaluating dozens of photos of each construction type, assigning each building in the study sample to a European Macroseismic Scale (EMS)-98 Vulnerability Class based upon its structural characteristics, and then individually assigning an EMS-98 Damage Grade to each building allows a statistically derived estimate of macroseismic intensity for each of nine study areas in and around the Kathmandu valley. This analysis concludes that EMS-98 macroseismic intensities for the study areas from the Gorkha mainshock typically were in the VII-IX range. The intensity assignment process described is more rigorous than the informal approach of assigning intensities based upon anecdotal media or first-person accounts of felt-reports, shaking, and their interpretation of damage. Detailed EMS-98 macroseismic assessments in urban areas are critical for quantifying relations between shaking and damage as well as for calibrating loss estimates. We show that the macroseismic assignments made herein result in fatality estimates consistent with the overall and district-wide reported values.

  9. Using structural damage statistics to derive macroseismic intensity within the Kathmandu valley for the 2015 M7.8 Gorkha, Nepal earthquake

    USGS Publications Warehouse

    McGowan, Sean; Jaiswal, Kishor; Wald, David J.

    2017-01-01

    We make and analyze structural damage observations from within the Kathmandu valley following the 2015 M7.8 Gorkha, Nepal earthquake to derive macroseismic intensities at several locations including some located near ground motion recording sites. The macroseismic intensity estimates supplement the limited strong ground motion data in order to characterize the damage statistics. This augmentation allows for direct comparisons between ground motion amplitudes and structural damage characteristics and ultimately produces a more constrained ground shaking hazard map for the Gorkha earthquake. For systematic assessments, we focused on damage to three specific building categories: (a) low/mid-rise reinforced concrete frames with infill brick walls, (b) unreinforced brick masonry bearing walls with reinforced concrete slabs, and (c) unreinforced brick masonry bearing walls with partial timber framing. Evaluating dozens of photos of each construction type, assigning each building in the study sample to a European Macroseismic Scale (EMS)-98 Vulnerability Class based upon its structural characteristics, and then individually assigning an EMS-98 Damage Grade to each building allows a statistically derived estimate of macroseismic intensity for each of nine study areas in and around the Kathmandu valley. This analysis concludes that EMS-98 macroseismic intensities for the study areas from the Gorkha mainshock typically were in the VII–IX range. The intensity assignment process described is more rigorous than the informal approach of assigning intensities based upon anecdotal media or first-person accounts of felt-reports, shaking, and their interpretation of damage. Detailed EMS-98 macroseismic assessments in urban areas are critical for quantifying relations between shaking and damage as well as for calibrating loss estimates. We show that the macroseismic assignments made herein result in fatality estimates consistent with the overall and district-wide reported values.

  10. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. 
As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
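The verification idea can be reproduced in miniature: set a map cell's shaking level at the common 10%-in-50-years convention, simulate many independent shaking histories, and check that the mapped level is exceeded in about 10% of them on average, even though any single history may or may not contain an exceedance. This is a toy Bernoulli/Poisson model, not the paper's simulation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Annual exceedance probability implied by "10% probability of
# exceedance in 50 years" (about 0.0021 per year, a ~475-yr return).
p = 1.0 - 0.9 ** (1.0 / 50.0)

# Simulate many independent 50-year histories for the cell and count
# those in which the mapped level is exceeded at least once.
histories, years = 20000, 50
exceeded = rng.random((histories, years)) < p
frac = np.mean(exceeded.any(axis=1))
print(round(float(frac), 3))   # should scatter around 0.10
```

Averaged over histories the map is consistent, yet roughly one history in ten contains an exceedance, which is the point above: a single observed history that exceeds the map does not by itself demonstrate that the map is wrong.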

  11. Loanwords and Vocabulary Size Test Scores: A Case of Different Estimates for Different L1 Learners

    ERIC Educational Resources Information Center

    Laufer, Batia; McLean, Stuart

    2016-01-01

    The article investigated how the inclusion of loanwords in vocabulary size tests affected the test scores of two L1 groups of EFL learners: Hebrew and Japanese. New BNC- and COCA-based vocabulary size tests were constructed in three modalities: word form recall, word form recognition, and word meaning recall. Depending on the test modality, the…

  12. An atlas of ShakeMaps for selected global earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. 
The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  13. Combining Multiple Rupture Models in Real-Time for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Wu, S.; Beck, J. L.; Heaton, T. H.

    2015-12-01

    The ShakeAlert earthquake early warning system for the west coast of the United States is designed to combine information from multiple independent earthquake analysis algorithms in order to provide the public with robust predictions of shaking intensity at each user's location before they are affected by strong shaking. The current contributing analyses come from algorithms that determine the origin time, epicenter, and magnitude of an earthquake (On-site, ElarmS, and Virtual Seismologist). A second generation of algorithms will provide seismic line source information (FinDer), as well as geodetically-constrained slip models (BEFORES, GPSlip, G-larmS, G-FAST). These new algorithms will provide more information about the spatial extent of the earthquake rupture and thus improve the quality of the resulting shaking forecasts. Each of the contributing algorithms exploits different features of the observed seismic and geodetic data, and thus each algorithm may perform differently for different data availability and earthquake source characteristics. Thus the ShakeAlert system requires a central mediator, called the Central Decision Module (CDM). The CDM acts to combine disparate earthquake source information into one unified shaking forecast. Here we will present a new design for the CDM that uses a Bayesian framework to combine earthquake reports from multiple analysis algorithms and compares them to observed shaking information in order to both assess the relative plausibility of each earthquake report and to create an improved unified shaking forecast complete with appropriate uncertainties. We will describe how these probabilistic shaking forecasts can be used to provide each user with a personalized decision-making tool that can help decide whether or not to take a protective action (such as opening fire house doors or stopping trains) based on that user's distance to the earthquake, vulnerability to shaking, false alarm tolerance, and time required to act.
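One simple version of the Bayesian combination idea: if each algorithm's magnitude report is treated as an independent Gaussian likelihood, then under a flat prior the fused posterior is Gaussian with an inverse-variance-weighted mean. The numbers below are hypothetical, and the actual CDM design is considerably more elaborate:

```python
import numpy as np

# Hypothetical magnitude reports from three independent EEW algorithms,
# each with its own assumed Gaussian uncertainty.
estimates = np.array([6.8, 7.1, 6.9])
sigmas = np.array([0.3, 0.4, 0.2])

# Inverse-variance weighting: more certain reports count for more.
w = 1.0 / sigmas**2
m_post = np.sum(w * estimates) / np.sum(w)   # fused magnitude
s_post = np.sqrt(1.0 / np.sum(w))            # fused uncertainty
print(round(float(m_post), 2), round(float(s_post), 3))
```

The fused uncertainty is smaller than any individual report's, illustrating how combining algorithms can tighten the shaking forecast that a user's protective-action threshold is then compared against.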

  14. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  15. Fixed Base Modal Testing Using the NASA GRC Mechanical Vibration Facility

    NASA Technical Reports Server (NTRS)

    Staab, Lucas D.; Winkel, James P.; Suarez, Vicente J.; Jones, Trevor M.; Napolitano, Kevin L.

    2016-01-01

    The Space Power Facility at NASA's Plum Brook Station houses the world's largest and most powerful space environment simulation facilities, including the Mechanical Vibration Facility (MVF), which offers the world's highest-capacity multi-axis spacecraft shaker system. The MVF was designed to perform sine vibration testing of a Crew Exploration Vehicle (CEV)-class spacecraft with a total mass of 75,000 pounds, center of gravity (cg) height above the table of 284 inches, diameter of 18 feet, and capability of 1.25 gravity units peak acceleration in the vertical and 1.0 gravity units peak acceleration in the lateral directions. The MVF is a six-degree-of-freedom, servo-hydraulic, sinusoidal base-shake vibration system that has the advantage of being able to perform single-axis sine vibration testing of large structures in the vertical and two lateral axes without the need to reconfigure the test article for each axis. This paper discusses efforts to extend the MVF's capabilities so that it can also be used to determine fixed base modes of its test article without the need for an expensive test-correlated facility simulation.

  16. Earthquake Early Warning ShakeAlert System: Testing and certification platform

    USGS Publications Warehouse

    Cochran, Elizabeth S.; Kohler, Monica D.; Given, Douglas; Guiwits, Stephen; Andrews, Jennifer; Meier, Men-Andrin; Ahmad, Mohammad; Henson, Ivan; Hartog, Renate; Smith, Deborah

    2017-01-01

    Earthquake early warning systems provide warnings to end users of incoming moderate to strong ground shaking from earthquakes. An earthquake early warning system, ShakeAlert, is providing alerts to beta end users in the western United States, specifically California, Oregon, and Washington. An essential aspect of the earthquake early warning system is the development of a framework to test modifications to code to ensure functionality and assess performance. In 2016, a Testing and Certification Platform (TCP) was included in the development of the Production Prototype version of ShakeAlert. The purpose of the TCP is to evaluate the robustness of candidate code that is proposed for deployment on ShakeAlert Production Prototype servers. TCP consists of two main components: a real-time in situ test that replicates the real-time production system and an offline playback system to replay test suites. The real-time tests of system performance assess code optimization and stability. The offline tests comprise a stress test of candidate code to assess whether the code is production ready. The test suite includes more than 120 events, including local, regional, and teleseismic historic earthquakes; recentering and calibration events; and other anomalous and potentially problematic signals. Two assessments of alert performance are conducted. First, point-source assessments are undertaken to compare magnitude, epicentral location, and origin time with the Advanced National Seismic System Comprehensive Catalog, as well as to evaluate alert latency. Second, we assess the quality of ground-motion predictions at end-user sites by comparing predicted shaking intensities to ShakeMaps for historic events, and we implement a threshold-based approach that assesses how often end users initiate the appropriate action, based on their ground-shaking threshold.
The TCP has been developed as a convenient, streamlined procedure for objectively testing algorithms, and it has been designed with flexibility to accommodate significant changes in development of new or modified system code. It is expected that the TCP will continue to evolve along with the ShakeAlert system, and the framework we describe here provides one example of how earthquake early warning systems can be evaluated.
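A minimal sketch of the threshold-based end-user assessment described above might look like the following; the function name and category labels are illustrative, not the actual TCP code.

```python
def classify_alert(predicted_mmi, observed_mmi, user_threshold):
    """Classify one alert outcome for one end user.

    A user acts when the predicted intensity at their site reaches
    their personal threshold; the action was appropriate if the
    observed intensity also reached it. (Simplified sketch of the
    approach described above, not ShakeAlert TCP code.)
    """
    alerted = predicted_mmi >= user_threshold
    shaken = observed_mmi >= user_threshold
    if alerted and shaken:
        return "true positive"   # user acted and strong shaking occurred
    if alerted and not shaken:
        return "false positive"  # unnecessary action
    if not alerted and shaken:
        return "missed alert"
    return "true negative"
```

Tallying these outcomes over a test suite of historic events gives the hit and false-alarm rates for each user threshold.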

  17. Aging and perceived event structure as a function of modality

    PubMed Central

    Magliano, Joseph; Kopp, Kristopher; McNerney, M. Windy; Radvansky, Gabriel A.; Zacks, Jeffrey M.

    2012-01-01

    The majority of research on situation model processing in older adults has focused on narrative texts. Much of this research has shown that many important aspects of constructing a situation model for a text are preserved and may even improve with age. However, narratives need not be text-based, and little is known as to whether these findings generalize to visually-based narratives. The present study assessed the impact of story modality on event segmentation, which is a basic component of event comprehension. Older and younger adults viewed picture stories or read text versions of them and segmented them into events. There was comparable alignment between the segmentation judgments and a theoretically guided analysis of shifts in situational features across modalities for both populations. These results suggest that situation models provide older adults with a stable basis for event comprehension across different modalities of experience. PMID:22182344

  18. A comparison of "Train-the-Trainer" and expert training modalities for hearing protection use in construction.

    PubMed

    Trabeau, Maggie; Neitzel, Richard; Meischke, Hendrika; Daniell, William E; Seixas, Noah S

    2008-02-01

    Few assessments have been conducted on the impact of a "Train-the-Trainer" (T3) approach for training delivery. The present study compared the effectiveness of a noise-induced hearing loss (NIHL) prevention training delivered using "Train-the-Trainer" and expert trainer modalities. Participating construction companies were assigned to the Train-the-Trainer or expert trainer modalities. Workers were recruited from each company and then trained. The effectiveness of the modalities was assessed through the use of surveys. The accuracy of self-reported hearing protection device (HPD) use was also evaluated through on-site observation. Post-training scores for hearing conservation knowledge, perceived barriers, and current and intended future use of HPDs improved significantly for both training modalities. Subjects trained by T3 trainers significantly increased their beliefs regarding general susceptibility to NIHL, desire to prevent NIHL, and ability to recognize and control hazardous noise exposures. The expert-trained groups significantly increased their beliefs regarding the benefits of HPD use and ability to ask for help with HPDs. The only changes that were significantly different between modalities were in general susceptibility to NIHL and effective use of HPDs. However, these beliefs differed significantly between subjects in the two modality groups prior to training. Self-reported HPD use was poorly correlated with observed use, calling into question the validity of survey-based HPD use measures in this context. The training improved beliefs regarding HPD use, increased workers' hearing conservation knowledge, and increased self-reported HPD use. The effectiveness of the training was not found to be dependent on training modality.

  19. The Great California ShakeOut: Science-Based Preparedness Advocacy

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2009-12-01

    The Great Southern California ShakeOut in November 2008 was the largest earthquake drill in U.S. history, involving over 5 million southern Californians through a broad-based outreach program, media partnerships, and public advocacy by hundreds of partners. The basis of the drill was a comprehensive scenario for a magnitude 7.8 earthquake on the southern San Andreas fault, which would cause broad devastation. In early 2009 the decision was made to hold the drill statewide on the third Thursday of October each year (October 15 in 2009). Results of the 2008 and 2009 drills will be shared in this session. In addition, the prospects for early warning systems will be described; such systems will one day provide the needed seconds before strong shaking arrives in which critical systems can be shut down and people can do what they've been practicing in the ShakeOut drills: drop, cover, and hold on. A key aspect of the ShakeOut is the integration of a comprehensive earthquake scenario (incorporating earth science, engineering, policy, economics, public health, and other disciplines) and the lessons learned from decades of social science research about why people get prepared. The result is a "teachable moment" on par with having an actual earthquake (often followed by increased interest in getting ready for earthquakes). ShakeOut creates the sense of urgency that is needed for people, organizations, and communities to get prepared, to practice what to do to be safe, and to learn what plans need to be improved.

  20. NMRI Measurements of Flow of Granular Mixtures

    NASA Technical Reports Server (NTRS)

    Nakagawa, Masami; Waggoner, R. Allen; Fukushima, Eiichi

    1996-01-01

    We investigate complex 3D behavior of granular mixtures in shaking and shearing devices. NMRI can non-invasively measure concentration, velocity, and velocity fluctuations of flows of suitable particles. We investigate the origins of wall-shear-induced convection flow of single-component particles by measuring the flow and fluctuating motion of particles near rough boundaries. We also investigate whether a mixture of different-size particles segregates into its component species under the influence of external shaking and shearing disturbances. These non-invasive measurements will reveal the true nature of convecting flow properties and wall disturbances. For experiments in a reduced gravity environment, we will design a lightweight NMR imager. The proof-of-principle development will prepare for the construction of a complete spaceborne system to perform experiments in space.

  1. Ground motion modeling of the 1906 San Francisco earthquake II: Ground motion estimates for the 1906 earthquake and scenario events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aagaard, B; Brocher, T; Dreger, D

    2007-02-09

    We estimate the ground motions produced by the 1906 San Francisco earthquake making use of the recently developed Song et al. (2008) source model that combines the available geodetic and seismic observations and recently constructed 3D geologic and seismic velocity models. Our estimates of the ground motions for the 1906 earthquake are consistent across five ground-motion modeling groups employing different wave propagation codes and simulation domains. The simulations successfully reproduce the main features of the Boatwright and Bundock (2005) ShakeMap, but tend to overpredict the intensity of shaking by 0.1-0.5 modified Mercalli intensity (MMI) units. Velocity waveforms at sites throughout the San Francisco Bay Area exhibit characteristics consistent with rupture directivity, local geologic conditions (e.g., sedimentary basins), and the large size of the event (e.g., durations of strong shaking lasting tens of seconds). We also compute ground motions for seven hypothetical scenarios rupturing the same extent of the northern San Andreas fault, considering three additional hypocenters and an additional, random distribution of slip. Rupture directivity exerts the strongest influence on the variations in shaking, although sedimentary basins do consistently contribute to the response in some locations, such as Santa Rosa, Livermore, and San Jose. These scenarios suggest that future large earthquakes on the northern San Andreas fault may subject the current San Francisco Bay urban area to stronger shaking than a repeat of the 1906 earthquake. Ruptures propagating southward towards San Francisco appear to expose more of the urban area to a given intensity level than do ruptures propagating northward.

  2. Reconstituting botulinum toxin drugs: shaking, stirring or what?

    PubMed

    Dressler, Dirk; Bigalke, Hans

    2016-05-01

    Most botulinum toxin (BT) drugs are stored as powders which need to be reconstituted with normal saline before clinical use. As botulinum neurotoxin (BNT), the therapeutically active ingredient, is a large double-stranded protein, the process of reconstitution should be performed with special attention to the mechanical stress applied. We wanted to test the mechanical stability of BNT during the reconstitution process. For this, 100 MU onabotulinumtoxinA (Botox(®), Irvine, CA, USA) was reconstituted with 2.0 ml of NaCl/H2O. Gentle reconstitution (GR) was performed with a 5 ml syringe, a 0.90 × 70 mm injection needle, one cycle of injection-aspiration-injection and two gentle shakes of the vial. Aggressive reconstitution (AR) was performed with a 5 ml syringe, a 0.40 × 40 mm injection needle, ten injection-aspiration-injection cycles and 30 s of continuous shaking of the vial. AR increased the time to paralysis in the mouse hemidiaphragm assay (HDA) from 72.0 ± 4.6 to 106.0 ± 16.0 min (*p = 0.002, two-tailed t test after Kolmogorov-Smirnov test with Lilliefors correction for normal distribution). Construction of a calibration curve revealed that the increase in the time to paralysis was correlated with a loss of potency from 100 to 58 MU (-42 %). BT users should use large-diameter injection needles for reconstitution, apply two or three injection-aspiration-injection cycles and, maybe, shake the vials a few times to rinse the entire glass wall. Aggressive reconstitution with small-diameter needles, prolonged injection-aspiration-injection and violent shaking should be avoided.

  3. Ground-motion modeling of the 1906 San Francisco Earthquake, part II: Ground-motion estimates for the 1906 earthquake and scenario events

    USGS Publications Warehouse

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; McCandless, K.; Nilsson, S.; Petersson, N.A.; Rodgers, A.; Sjogreen, B.; Zoback, M.L.

    2008-01-01

    We estimate the ground motions produced by the 1906 San Francisco earthquake making use of the recently developed Song et al. (2008) source model that combines the available geodetic and seismic observations and recently constructed 3D geologic and seismic velocity models. Our estimates of the ground motions for the 1906 earthquake are consistent across five ground-motion modeling groups employing different wave propagation codes and simulation domains. The simulations successfully reproduce the main features of the Boatwright and Bundock (2005) ShakeMap, but tend to overpredict the intensity of shaking by 0.1-0.5 modified Mercalli intensity (MMI) units. Velocity waveforms at sites throughout the San Francisco Bay Area exhibit characteristics consistent with rupture directivity, local geologic conditions (e.g., sedimentary basins), and the large size of the event (e.g., durations of strong shaking lasting tens of seconds). We also compute ground motions for seven hypothetical scenarios rupturing the same extent of the northern San Andreas fault, considering three additional hypocenters and an additional, random distribution of slip. Rupture directivity exerts the strongest influence on the variations in shaking, although sedimentary basins do consistently contribute to the response in some locations, such as Santa Rosa, Livermore, and San Jose. These scenarios suggest that future large earthquakes on the northern San Andreas fault may subject the current San Francisco Bay urban area to stronger shaking than a repeat of the 1906 earthquake. Ruptures propagating southward towards San Francisco appear to expose more of the urban area to a given intensity level than do ruptures propagating northward.

  4. A study on seismic behavior of pile foundations of bridge abutment on liquefiable ground through shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakata, Mitsuhiko; Tanimoto, Shunsuke; Ishida, Shuichi; Ohsumi, Michio; Hoshikuma, Jun-ichi

    2017-10-01

    There is a risk that bridge foundations will be damaged by liquefaction-induced lateral spreading of ground. Once bridge foundations have been damaged, restoration takes considerable time. Therefore, it is important to appropriately assess the seismic behavior of foundations on liquefiable ground. In this study, shaking table tests of 1/10-scale models were conducted on the large-scale shaking table at the Public Works Research Institute, Japan, to investigate the seismic behavior of a pile-supported bridge abutment on liquefiable ground. The shaking table tests were conducted for three types of model: two are models of an existing bridge that was built without design for liquefaction, and the third is a model of a bridge designed to the current Japanese design specifications for highway bridges. As a result, the bending strains of the piles of the abutment designed to the current specifications were less than those of the existing bridge.

  5. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
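The short-term-average/long-term-average trigger applied to a per-minute tweet-count series can be sketched as follows; the window lengths, threshold, and synthetic data here are illustrative, not the tuned USGS values.

```python
import numpy as np

def sta_lta_detect(counts, n_sta=3, n_lta=30, threshold=5.0):
    """Flag minute i when the short-term average of the tweet-count
    series jumps well above the long-term background average.
    Window lengths and threshold are illustrative choices."""
    detections = []
    for i in range(n_lta, len(counts)):
        sta = np.mean(counts[i - n_sta:i])
        lta = np.mean(counts[i - n_lta:i])
        if lta > 0 and sta / lta >= threshold:
            detections.append(i)
    return detections

# Synthetic per-minute counts: steady background chatter with a burst
# of "earthquake" tweets starting at minute 40
series = np.full(60, 2.0)
series[40:44] += 50.0
hits = sta_lta_detect(series)  # flags minutes 41-44
```

The long window adapts the trigger to the background tweet rate, so the same threshold works across quiet and busy periods, which is the same reason this detector is standard for seismic waveforms.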

  6. PageRank versatility analysis of multilayer modality-based network for exploring the evolution of oil-water slug flow.

    PubMed

    Gao, Zhong-Ke; Dang, Wei-Dong; Li, Shan; Yang, Yu-Xuan; Wang, Hong-Tao; Sheng, Jing-Ran; Wang, Xiao-Fan

    2017-07-14

    Numerous irregular flow structures exist in complicated multiphase flows and give rise to diverse spatial dynamical flow behaviors. The vertical oil-water slug flow continues to attract substantial research interest on account of its practical importance. Based on the spatial transient flow information acquired through our designed double-layer distributed-sector conductance sensor, we construct a multilayer modality-based network to encode the intricate spatial flow behavior. In particular, we calculate the PageRank versatility and the multilayer weighted clustering coefficient to quantitatively explore the inferred multilayer modality-based networks. Our analysis allows characterizing the complicated evolution of oil-water slug flow, from the initial formation of oil slugs, through the subsequent collision and coalescence among oil slugs, to the dispersed oil bubbles. These properties render our developed method particularly powerful for mining the essential flow features from the multilayer sensor measurements.
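One common way to compute PageRank versatility is power iteration on the supra-adjacency matrix of the multilayer network. The sketch below follows that general recipe on a toy two-layer network; the uniform inter-layer coupling `omega` and all numbers are assumptions for illustration, not the authors' exact formulation or data.

```python
import numpy as np

def pagerank_versatility(layers, omega=1.0, damping=0.85, tol=1e-10):
    """PageRank versatility via the supra-adjacency matrix.

    `layers` is a list of n x n adjacency matrices (one per layer);
    omega couples each node to its replicas in the other layers.
    Returns one score per physical node, summed over layers."""
    L, n = len(layers), layers[0].shape[0]
    supra = np.zeros((L * n, L * n))
    for a in range(L):
        supra[a*n:(a+1)*n, a*n:(a+1)*n] = layers[a]
        for b in range(L):
            if a != b:
                supra[a*n:(a+1)*n, b*n:(b+1)*n] = omega * np.eye(n)
    # Column-stochastic transition matrix (no dangling columns here,
    # since every node-replica has at least its coupling links)
    P = supra / supra.sum(axis=0)
    v = np.full(L * n, 1.0 / (L * n))
    while True:
        v_new = damping * P @ v + (1 - damping) / (L * n)
        if np.abs(v_new - v).sum() < tol:
            break
        v = v_new
    # Versatility of node i: its PageRank summed over all layers
    return v_new.reshape(L, n).sum(axis=0)

# Toy example: node 0 is the best-connected node in both layers
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
B = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
versatility = pagerank_versatility([A, B])
```

Summing a node's score over layers, rather than ranking each layer separately, is what lets versatility reward nodes that are important in several layers at once.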

  7. Construction of specific magnetic resonance imaging/optical dual-modality molecular probe used for imaging angiogenesis of gastric cancer.

    PubMed

    Yan, Xuejie; Song, Xiaoyan; Wang, Zhenbo

    2017-05-01

    The purpose of the study was to construct a specific magnetic resonance imaging (MRI)/optical dual-modality molecular probe. Tumor-bearing animal models were established. The MRI/optical dual-modality molecular probe was constructed by coupling polyethylene glycol (PEG)-modified nano-Fe3O4 with the specific targeted cyclopeptide GX1 and the near-infrared fluorescent dye Cy5.5. The MRI/optical imaging effects of the probe were observed and the feasibility of in vivo dual-modality imaging was discussed. It was found that the dual-modality probe was highly stable; the tumor signal of the experimental group tended to weaken after injection of the probe, but rose to a level close to the previous one after 18 h (p > 0.05). We successfully constructed an ideal MRI/optical dual-modality molecular probe. An MRI/optical dual-modality molecular probe that selectively accumulates in gastric cancer is expected to be a novel probe for diagnosing gastric cancer at an early stage.

  8. Optimization of gold ore Sumbawa separation using gravity method: Shaking table

    NASA Astrophysics Data System (ADS)

    Ferdana, Achmad Dhaefi; Petrus, Himawan Tri Bayu Murti; Bendiyasa, I. Made; Prijambada, Irfan Dwidya; Hamada, Fumio; Sachiko, Takahi

    2018-04-01

    Most artisanal small-scale gold mining in Indonesia has used the amalgamation method, which negatively impacts the environment around the ore processing area owing to the use of mercury. One of the more environmentally friendly methods for gold processing is the gravity method. The shaking table is a gravity-separation device used to upgrade concentrate based on differences in specific gravity. The optimum concentration result is influenced by several variables, such as shaking speed, particle size, and deck slope. In this research, the shaking speed ranged from 100 rpm to 200 rpm, the particle size from -100 + 200 mesh to -200 + 300 mesh, and the deck slope from 3° to 7°. Gold concentration in the concentrate was measured by EDX. The results show that the optimum condition is obtained at a shaking speed of 200 rpm, a deck slope of 7°, and a particle size of -100 + 200 mesh.

  9. Preparing for a "Big One": The great southern California shakeout

    USGS Publications Warehouse

    Jones, L.M.; Benthien, M.

    2011-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million Southern Californians pretended that the magnitude-7.8 ShakeOut scenario earthquake was occurring and practiced actions derived from results of the ShakeOut Scenario, to reduce the impact of a real, San Andreas Fault event. The communications campaign was based on four principles: 1) consistent messaging from multiple sources; 2) visual reinforcement; 3) encouragement of "milling"; and 4) focus on concrete actions. The goals of the ShakeOut, established in spring 2008, were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in Southern California; and 3) to reduce earthquake losses in Southern California. Over 90% of the registrants surveyed the next year reported improvement in earthquake preparedness at their organization as a result of the ShakeOut. © 2011, Earthquake Engineering Research Institute.

  10. TriNet "ShakeMaps": Rapid generation of peak ground motion and intensity maps for earthquakes in southern California

    USGS Publications Warehouse

    Wald, D.J.; Quitoriano, V.; Heaton, T.H.; Kanamori, H.; Scrivner, C.W.; Worden, C.B.

    1999-01-01

    Rapid (3-5 minutes) generation of maps of instrumental ground motion and shaking intensity is accomplished through advances in real-time seismographic data acquisition combined with newly developed relationships between recorded ground-motion parameters and expected shaking intensity values. Estimation of shaking over the entire regional extent of southern California is obtained by the spatial interpolation of the measured ground motions with geologically based, frequency- and amplitude-dependent site corrections. Production of the maps is automatic, triggered by any significant earthquake in southern California. Maps are now made available within several minutes of the earthquake for public and scientific consumption via the World Wide Web; they will be made available with dedicated communications for emergency response agencies and critical users.
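The relationship between recorded peak ground motion and instrumental intensity can be sketched with the commonly cited Wald et al. (1999) regression; the coefficients below are quoted from the literature, so treat this as an illustration of the mapping rather than the production ShakeMap code.

```python
import math

def pga_to_mmi(pga_cm_s2):
    """Instrumental Modified Mercalli intensity from peak ground
    acceleration (cm/s^2), using the commonly cited Wald et al. (1999)
    regressions: the high-intensity fit where it predicts MMI >= V,
    and a separate low-intensity fit below that."""
    if pga_cm_s2 <= 0:
        raise ValueError("PGA must be positive")
    mmi = 3.66 * math.log10(pga_cm_s2) - 1.66
    if mmi < 5.0:  # below MMI V, the low-intensity regression applies
        mmi = 2.20 * math.log10(pga_cm_s2) + 1.00
    return min(max(mmi, 1.0), 10.0)  # clamp to the usable MMI range
```

For example, a recorded PGA of 100 cm/s^2 maps to roughly MMI V-VI; applying this pointwise to the interpolated, site-corrected ground-motion grid yields the instrumental intensity layer of a ShakeMap.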

  11. Practical Applications for Earthquake Scenarios Using ShakeMap

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.

    2001-12-01

    In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations, ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html) and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools to make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) Assume a particular fault or fault segment will (or did) rupture over a certain length, 2) Determine the magnitude of the earthquake based on assumed rupture dimensions, 3) Estimate the ground shaking at all locations in the chosen area around the fault, and 4) Represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships to estimate peak ground motions on rock conditions. We then correct the amplitude at that location based on the local site soil (NEHRP) conditions as we do in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date.
Achieving full-synthetic ground motion estimates that will substantially improve over empirical relations at these frequencies will require developing cost-effective numerical tools for proper theoretical inclusion of known complex ground motion effects. Current efforts underway must continue in order to obtain site, basin, and deeper crustal structure, and to characterize and test 3D earth models (including attenuation and nonlinearity). In contrast, longer period synthetics (>2 sec) are currently being generated in a deterministic fashion to include 3D and shallow site effects, an improvement on empirical estimates alone. As progress is made, we will naturally incorporate such advances into the ShakeMap scenario earthquake and processing methodology. Our scenarios are currently used heavily in emergency response planning and loss estimation. Primary users include city, county, state and federal government agencies (e.g., the California Office of Emergency Services, FEMA, the County of Los Angeles) as well as emergency response planners and managers for utilities, businesses, and other large organizations. We have found the scenarios are also of fundamental interest to many in the media and the general community interested in the nature of the ground shaking likely experienced in past earthquakes as well as effects of rupture on known faults in the future.
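Steps 2 and 3 of the scenario recipe above can be sketched numerically. The rupture-length-to-magnitude regression is the widely used Wells and Coppersmith (1994) surface-rupture relation (coefficients quoted from the literature); the attenuation coefficients, by contrast, are illustrative placeholders rather than any published relation.

```python
import math

def scenario_magnitude(rupture_length_km):
    """Step 2: magnitude from assumed rupture dimensions, via the
    Wells & Coppersmith (1994) surface-rupture-length regression
    (all slip types); coefficients quoted from the literature."""
    return 5.08 + 1.16 * math.log10(rupture_length_km)

def rock_pga_g(magnitude, distance_km, a=0.0016, b=1.04, c=10.0, d=1.3):
    """Step 3: a schematic empirical attenuation form, with PGA growing
    with magnitude and decaying with distance. The coefficients here
    are illustrative placeholders, not a published relation."""
    return a * math.exp(b * magnitude) / (distance_km + c) ** d

def site_corrected_pga(pga_rock, amplification=1.0):
    """Step 3b: apply a site-condition amplification factor
    (NEHRP-style soil correction)."""
    return pga_rock * amplification

# A hypothetical long strike-slip rupture
m = scenario_magnitude(300.0)
```

Evaluating `site_corrected_pga(rock_pga_g(m, r), amp)` over a grid of sites, then contouring, is the essence of step 4.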

  12. Shake table tests of suspended ceilings to simulate the observed damage in the Ms 7.0 Lushan earthquake, China

    NASA Astrophysics Data System (ADS)

    Wang, Duozhi; Dai, Junwu; Qu, Zhe; Ning, Xiaoqing

    2016-06-01

    Severe damage to suspended ceilings of metal grids and lay-in panels was observed in public buildings during the 2013 Ms 7.0 Lushan earthquake in China. Over the past several years, suspended ceilings have been widely used in public buildings throughout China, including government offices, schools and hospitals. To investigate the damage mechanism of suspended ceilings, a series of three-dimensional shake table tests was conducted to reproduce the observed damage. A full-scale reinforced concrete frame was constructed as the testing frame for the ceiling, which was single-story and infilled with brick masonry walls to represent the local construction of low-rise buildings. In general, the ceiling in the tests exhibited similar damage phenomena as the field observations, such as higher vulnerability of perimeter elements and extensive damage to the cross runners. However, it exhibited lower fragility in terms of peak ground/roof accelerations at the initiation of damage. Further investigations are needed to clarify the reasons for this behavior.

  13. CyberShake Physics-Based PSHA in Central California

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2017-12-01

    The Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, which performs physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a wavefield of Strain Green Tensors. An earthquake rupture forecast (ERF) is then extended by varying hypocenters and slips on finite faults, generating about 500,000 events per site of interest. Seismic reciprocity is used to calculate synthetic seismograms, which are processed to obtain intensity measures (IMs) such as RotD100. These are combined with ERF probabilities to produce hazard curves. PSHA results from hundreds of locations across a region are interpolated to produce a hazard map. CyberShake simulations with SCEC 3D Community Velocity Models have shown how the site and path effects vary with differences in upper crustal structure, and they are particularly informative about epistemic uncertainties in basin effects, which are not well parameterized by depths to iso-velocity surfaces, common inputs to GMPEs. In 2017, SCEC performed CyberShake Study 17.3, expanding into Central California for the first time. Seismic hazard calculations were performed at 1 Hz at 438 sites, using both a 3D tomographically-derived central California velocity model and a regionally averaged 1D model. Our simulation volumes extended outside of Central California, so we included other SCEC velocity models and developed a smoothing algorithm to minimize reflection and refraction effects along interfaces. CyberShake Study 17.3 ran for 31 days on NCSA's Blue Waters and ORNL's Titan supercomputers, burning 21.6 million core-hours and producing 285 million two-component seismograms and 43 billion IMs. These results demonstrate that CyberShake can be successfully expanded into new regions, and lend insights into the effects of directivity-basin coupling associated with basins near major faults such as the San Andreas.
In particular, we observe in the 3D results that basin amplification for sites in the southern San Joaquin Valley is less than for sites in smaller basins such as around Ventura. We will present CyberShake hazard estimates from the 1D and 3D models, compare results to those from previous CyberShake studies and GMPEs, and describe our future plans.
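    The hazard-curve step described above, combining per-event probabilities from the rupture forecast with simulated intensity measures, can be sketched as follows. This is a minimal illustration, not CyberShake's implementation; the event rates and IM values are invented.

```python
# Hedged sketch of hazard-curve assembly: per-event annual rates are combined
# with the fraction of that event's rupture variations whose simulated IM
# exceeds each threshold. All numbers below are illustrative.
def hazard_curve(events, im_thresholds):
    """events: list of (annual_rate, [IM per rupture variation]).
    Returns the annual rate of exceeding each threshold."""
    curve = []
    for x in im_thresholds:
        rate = 0.0
        for annual_rate, ims in events:
            frac = sum(1 for im in ims if im > x) / len(ims)  # P(IM > x | event)
            rate += annual_rate * frac
        curve.append(rate)
    return curve

events = [(0.01, [0.2, 0.35, 0.5]),  # frequent, moderate hypothetical event
          (0.002, [0.6, 0.8, 1.1])]  # rarer, stronger hypothetical event
curve = hazard_curve(events, [0.1, 0.3, 0.5, 1.0])
```

The resulting curve is non-increasing in the threshold, as expected of an exceedance-rate curve.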

  14. Robust independent modal space control of a coupled nano-positioning piezo-stage

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Yang, Fufeng; Rui, Xiaoting

    2018-06-01

    In order to accurately control a coupled 3-DOF nano-positioning piezo-stage, this paper designs a hybrid controller. In this controller, a hysteresis observer based on a Bouc-Wen model is first established to compensate for the hysteresis nonlinearity of the piezoelectric actuator. Compared to hysteresis compensation using the Preisach or Prandtl-Ishlinskii models, the observer-based compensation method is computationally lighter. Then, based on the proposed dynamics model, a robust H∞ independent modal space controller is designed by constructing a modal filter and is used to decouple the piezo-stage and handle the unmodeled dynamics, disturbances, and hysteresis compensation error. The effectiveness of the proposed controller is demonstrated experimentally; the results show that it achieves high-precision positioning.
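    The Bouc-Wen hysteresis model named above can be sketched with a simple Euler integration of its internal state. The parameters A, beta, gamma, n and the sinusoidal input are illustrative defaults, not the identified values for the paper's piezo-stage.

```python
import math

# Hedged sketch of the Bouc-Wen hysteresis state z(t) driven by x(t):
#   dz/dt = A*x' - beta*|x'|*|z|**(n-1)*z - gamma*x'*|z|**n
# Parameters and input are illustrative only.
def bouc_wen_z(x_series, dt, A=1.0, beta=0.5, gamma=0.5, n=1):
    z, out = 0.0, []
    for i in range(1, len(x_series)):
        xdot = (x_series[i] - x_series[i - 1]) / dt
        dz = (A * xdot
              - beta * abs(xdot) * abs(z) ** (n - 1) * z
              - gamma * xdot * abs(z) ** n)
        z += dz * dt
        out.append(z)
    return out

dt = 1e-3
x = [0.5 * math.sin(2 * math.pi * i * dt) for i in range(2000)]  # two cycles
z = bouc_wen_z(x, dt)  # z(x) traces a hysteresis loop rather than a line
```

Plotting z against x (not done here) would show the loop that the paper's observer compensates for.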

  15. Photoacoustic-Based Multimodal Nanoprobes: from Constructing to Biological Applications.

    PubMed

    Gao, Duyang; Yuan, Zhen

    2017-01-01

    Multimodal nanoprobes have attracted intensive attention because they integrate several imaging modalities and thereby combine the complementary merits of each single modality. Meanwhile, interest in laser-induced photoacoustic imaging is growing rapidly owing to its unique advantages in visualizing tissue structure and function with high spatial resolution and satisfactory imaging depth. In this review, we summarize multimodal nanoprobes involving photoacoustic imaging, focusing on the methods used to construct them. We divide the synthetic methods into two types: the "one for all" concept, which exploits the intrinsic multimodal properties of the elements in a single particle, and the "all in one" concept, which integrates different functional blocks into one particle. We then briefly introduce applications of these multifunctional nanoprobes to in vivo imaging and imaging-guided tumor therapy. Finally, we discuss the advantages and disadvantages of the present construction methods and share our viewpoints on this area.

  16. Overview of the Orion Vibroacoustic Test Capability at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; Hozman, Aron D.; McNelis, Mark E.; Otten, Kim D.

    2008-01-01

    In order to support the environmental test needs of the new Orion and Constellation programs, NASA is developing unique world-class test facilities. To optimize the testing of spaceflight hardware while minimizing transportation issues, a one-stop, under-one-roof test capability is being developed at the Space Power Facility at NASA Glenn Research Center's Plum Brook Station. This facility will provide the capability to perform the following environmental testing: (1) reverberant acoustic testing, (2) mechanical base-shake sine testing, (3) modal testing, (4) thermal-vacuum testing, and (5) EMI/EMC (electromagnetic interference and compatibility) testing. An overview of this test capability will be provided in this presentation, with special focus on the two new vibroacoustic test facilities currently being designed and built: the Reverberant Acoustic Test Facility (RATF) and the Mechanical Vibration Facility (MVF). Testing of the engineering development hardware and qualification hardware of the Orion (Crew Exploration Vehicle) will commence shortly after the facilities are commissioned.

  17. A novel multigene expression construct for modification of glycerol metabolism in Yarrowia lipolytica

    PubMed Central

    2013-01-01

    Background High supply of raw, residual glycerol from biodiesel production plants promotes the search for novel biotechnological methods of its utilization. In this study we attempted modification of glycerol catabolism in the nonconventional yeast Yarrowia lipolytica through a genetic engineering approach. Results To address this, we developed a novel genetic construct which allows transferring three heterologous genes, encoding glycerol dehydratase, its reactivator, and a wide-spectrum alcohol oxidoreductase, under the control of a glycerol-induced promoter. The three genes, tandemly arrayed in an expression cassette with a marker gene ura3 and regulatory and targeting sequences (G3P dh promoter and XPR-like terminator, 28S rDNA as a target locus), were transferred into Yarrowia lipolytica cells. The resulting recombinant strain NCYC3825 was characterized at the molecular level and with respect to its biotechnological potential. Our experiments indicated that the novel recombinant strain stably carried one copy of the expression cassette and efficiently expressed the heterologous alcohol oxidoreductase, while glycerol dehydratase and its reactivator were expressed at lower levels. Comparative shake flask cultivations in glucose- and glycerol-based media demonstrated higher biomass production by the recombinant strain when glycerol was the main carbon source. During bioreactor (5 L) fed-batch cultivation in glycerol-based medium, the recombinant strain was characterized by relatively high biomass and lipid accumulation (up to 42 gDCW L-1 and a peak value of 38%LIPIDS of DCW, respectively), and production of high titers of citric acid (59 g L-1) and 2-phenylethanol (up to 1 g L-1 in shake flask cultivation), which are industrially attractive bioproducts.
Conclusions Due to heterogeneous nature of the observed alterations, we postulate that the main driving force of the modified phenotype was faster growth in glycerol-based media, triggered by modifications in the red-ox balance brought by the wide spectrum oxidoreductase. Our results demonstrate the potential multidirectional use of a novel Yarrowia lipolytica strain as a microbial cell factory. PMID:24188724

  18. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. 
We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.

  19. Concise Review: Bioprinting of Stem Cells for Transplantable Tissue Fabrication.

    PubMed

    Leberfinger, Ashley N; Ravnic, Dino J; Dhawan, Aman; Ozbolat, Ibrahim T

    2017-10-01

    Bioprinting is a quickly progressing technology which holds the potential to generate replacement tissues and organs. Stem cells offer several advantages over differentiated cells as starting materials, including the potential for autologous tissue and differentiation into multiple cell lines. The three most commonly used stem cell types are embryonic, induced pluripotent, and adult stem cells. Cells are combined with various natural and synthetic materials to form bioinks, which are used to fabricate scaffold-based or scaffold-free constructs. Computer-aided design technology is combined with various bioprinting modalities, including droplet-, extrusion-, or laser-based bioprinting, to create tissue constructs. Each bioink and modality has its own advantages and disadvantages, and materials and techniques are combined to maximize the benefits. Researchers have been successful in bioprinting cartilage, bone, cardiac, nervous, liver, and vascular tissues. However, a major limitation to clinical translation is building large-scale vascularized constructs. Many challenges must be overcome before this technology is used routinely in a clinical setting. Stem Cells Translational Medicine 2017;6:1940-1948. © 2017 The Authors Stem Cells Translational Medicine published by Wiley Periodicals, Inc. on behalf of AlphaMed Press.

  20. Development and validation of an automated operational modal analysis algorithm for vibration-based monitoring and tensile load estimation

    NASA Astrophysics Data System (ADS)

    Rainieri, Carlo; Fabbrocino, Giovanni

    2015-08-01

    In the last few decades large research efforts have been devoted to the development of methods for automated detection of damage and degradation phenomena at an early stage. Modal-based damage detection techniques are well-established methods, whose effectiveness for Level 1 (existence) and Level 2 (location) damage detection is demonstrated by several studies. The indirect estimation of tensile loads in cables and tie-rods is another attractive application of vibration measurements. It provides interesting opportunities for cheap and fast quality checks in the construction phase, as well as for safety evaluations and structural maintenance over the structure's lifespan. However, the lack of automated modal identification and tracking procedures has long been a significant obstacle to the widespread application of the above-mentioned techniques in engineering practice. An increasing number of field applications of modal-based structural health and performance assessment are appearing following the development of several automated output-only modal identification procedures in recent years. Nevertheless, additional efforts are still needed to enhance the robustness of automated modal identification algorithms, control the computational effort, and improve the reliability of modal parameter estimates (in particular, damping). This paper deals with an original algorithm for automated output-only modal parameter estimation. Particular emphasis is given to the extensive validation of the algorithm on simulated and real datasets in view of continuous monitoring applications. The results point out that the algorithm is fairly robust and demonstrate its ability to provide accurate and precise estimates of the modal parameters, including damping ratios. As a result, it has been used to develop systems for vibration-based estimation of tensile loads in cables and tie-rods.
Promising results have been achieved for non-destructive testing as well as continuous monitoring purposes. They are documented in the last sections of the paper.
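    One ingredient of automated modal tracking like that described above is matching newly identified modes to baseline modes by frequency deviation and the Modal Assurance Criterion (MAC). The sketch below is illustrative; the thresholds and mode data are invented, not taken from the paper.

```python
# Hedged sketch of mode tracking for automated OMA: pair identified modes
# with baseline modes when both the relative frequency deviation and the MAC
# pass (illustrative) thresholds.
def mac(phi1, phi2):
    """MAC between two real mode-shape vectors (1.0 = perfectly correlated)."""
    num = sum(a * b for a, b in zip(phi1, phi2)) ** 2
    den = sum(a * a for a in phi1) * sum(b * b for b in phi2)
    return num / den

def track_modes(baseline, identified, df_max=0.05, mac_min=0.8):
    """baseline, identified: lists of (frequency_hz, mode_shape) tuples.
    Returns matched index pairs (baseline_idx, identified_idx)."""
    pairs = []
    for i, (f0, s0) in enumerate(baseline):
        for j, (f1, s1) in enumerate(identified):
            if abs(f1 - f0) / f0 <= df_max and mac(s0, s1) >= mac_min:
                pairs.append((i, j))
                break
    return pairs

baseline = [(1.20, [1, 2, 3]), (3.40, [1, -1, 1])]
identified = [(1.22, [1.1, 2.0, 3.1]), (3.50, [0.9, -1.0, 1.1])]
pairs = track_modes(baseline, identified)  # both modes re-found: [(0, 0), (1, 1)]
```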

  1. On-line prediction of the glucose concentration of CHO cell cultivations by NIR and Raman spectroscopy: Comparative scalability test with a shake flask model system.

    PubMed

    Kozma, Bence; Hirsch, Edit; Gergely, Szilveszter; Párta, László; Pataki, Hajnalka; Salgó, András

    2017-10-25

    In this study, near-infrared (NIR) and Raman spectroscopy were compared in parallel for predicting the glucose concentration of Chinese hamster ovary cell cultivations. A shake flask model system was used to quickly generate spectra similar to those of bioreactor cultivations, thereby accelerating the development of a working model prior to actual cultivations. Automated variable selection and several pre-processing methods were tested iteratively during model development using spectra from six shake flask cultivations. The target was to achieve the lowest error of prediction for the glucose concentration in two independent shake flasks. The best model was then used to test the scalability of the two techniques by predicting spectra of 10 L and 100 L scale bioreactor cultivations. The NIR spectroscopy based model could follow the trend of the glucose concentration but was not sufficiently accurate for bioreactor monitoring. The Raman spectroscopy based model, on the other hand, predicted the glucose concentration at both cultivation scales with an error of around 4 mM (0.72 g/L), which is satisfactory for the on-line bioreactor monitoring purposes of the biopharma industry. The shake flask model system was thus proven suitable for scalable spectroscopic model development. Copyright © 2017 Elsevier B.V. All rights reserved.
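    A typical spectral pre-processing step of the kind iterated over in such model development is standard normal variate (SNV) scatter correction. The sketch below is a generic chemometrics illustration, not the paper's selected pre-processing; the spectrum values are invented.

```python
import math

# Hedged sketch of SNV scatter correction: center each spectrum to zero mean
# and scale to unit (sample) standard deviation before regression modeling.
def snv(spectrum):
    m = sum(spectrum) / len(spectrum)
    sd = math.sqrt(sum((v - m) ** 2 for v in spectrum) / (len(spectrum) - 1))
    return [(v - m) / sd for v in spectrum]

corrected = snv([0.12, 0.15, 0.40, 0.22, 0.18])  # mean ~0, sample std ~1
```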

  2. Oxygen transfer phenomena in 48-well microtiter plates: determination by optical monitoring of sulfite oxidation and verification by real-time measurement during microbial growth.

    PubMed

    Kensy, Frank; Zimmermann, Hartmut F; Knabben, Ingo; Anderlei, Tibor; Trauthwein, Harald; Dingerdissen, Uwe; Büchs, Jochen

    2005-03-20

    Oxygen limitation is one of the most frequent problems associated with the application of shaking bioreactors. The gas-liquid oxygen transfer properties of shaken 48-well microtiter plates (MTPs) were analyzed at different filling volumes, shaking diameters, and shaking frequencies. On the one hand, an optical method based on sulfite oxidation was used as a chemical model system to determine the maximum oxygen transfer capacity (OTR(max)). On the other hand, the Respiration Activity Monitoring System (RAMOS) was applied for online measurement of the oxygen transfer rate (OTR) during growth of the methylotrophic yeast Hansenula polymorpha. A proportionality constant between the OTR(max) of the biological system and the OTR(max) of the chemical system was derived from these data, offering the possibility of transforming the whole set of chemical data to biologically relevant conditions. The results exposed "out of phase" shaking conditions at a shaking diameter of 1 mm, which were confirmed by theoretical considerations with the phase number (Ph). At larger shaking diameters (2-50 mm) the oxygen transfer rate in MTPs shaken at high frequencies reached values of up to 0.28 mol/L/h, corresponding to a volumetric mass transfer coefficient (k(L)a) of 1,600 1/h. The specific mass transfer area (a) increases exponentially with the shaking frequency up to values of 2,400 1/m. In contrast, the mass transfer coefficient (k(L)) is constant at a level of about 0.15 m/h over a wide range of shaking frequencies and shaking diameters. However, at high shaking frequencies, when the complete liquid volume forms a thin film on the cylindrical wall of the well, the mass transfer coefficient (k(L)) increases linearly to values of up to 0.76 m/h. Essentially, the present investigation demonstrates that the 48-well plate outperforms the 96-well MTP and shake flasks at widely used operating conditions with respect to oxygen supply.
The 48-well plates emerge, therefore, as an excellent alternative for microbial cultivation and expression studies combining the advantages of both the high-throughput 96-well MTP and the classical shaken Erlenmeyer flask.
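    The relation between the quantities reported above is OTR_max = kLa · c*, where c* is the oxygen saturation concentration. The sketch below back-computes c* from the abstract's own figures (0.28 mol/L/h and 1,600 1/h); that solubility value is implied, not stated, so treat it as illustrative.

```python
# Hedged sketch: volumetric mass transfer coefficient from the maximum oxygen
# transfer capacity, assuming a fully depleted liquid phase (c = 0) so that
# the driving force equals the saturation concentration c*.
def kla_from_otr(otr_max, c_star):
    """otr_max in mol/L/h, c_star in mol/L -> kLa in 1/h."""
    return otr_max / c_star

kla = kla_from_otr(0.28, 1.75e-4)  # -> 1600.0 1/h, matching the reported kLa
```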

  3. Temporal and spatial heterogeneity of rupture process application in shakemaps of Yushu Ms7.1 earthquake, China

    NASA Astrophysics Data System (ADS)

    Kun, C.

    2015-12-01

    Studies have shown that estimates of ground-motion parameters from ground-motion attenuation relationships are often greater than the observed values, mainly because the multiple ruptures of a large earthquake reduce the pulse height of the source time function. In the absence of real-time station data after an earthquake, this paper attempts to constrain shakemaps from the source side to improve their accuracy. The causative fault of the Yushu Ms 7.1 earthquake is nearly vertical (dip 83°), and its source process was distinctly dispersed in time and space. The main shock can therefore be divided into several sub-events based on the source process: the magnitude of each sub-event is derived from the area under its pulse in the source time function, and its location from the source process. We use the ShakeMap method, with site effects included, to generate a shakemap for each sub-event, and the mainshock ShakeMap is then obtained by superposing the sub-event shakemaps in space. For comparison, a single-magnitude mainshock shakemap was also derived from the surface rupture of the causative fault mapped in the field survey. Comparing both methods against the investigated intensities shows that the sub-event decomposition more accurately reflects near-field shaking; in the far field, where shaking is controlled by the weakening influence of the source, the estimated intensity VI area was smaller than that of the actual investigation, perhaps because far-field intensity is related to the increased duration of shaking from the two events. In general, the decomposition of the main shock based on its source process, with a shakemap for each sub-event, is feasible for disaster emergency response, decision-making, and rapid disaster assessment after an earthquake.
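    The spatial superposition step described above can be sketched as taking, at each grid point, the maximum ground-motion value over the sub-event shakemaps. This is a minimal illustration of one plausible combination rule; the grids are toy data, not the Yushu results.

```python
# Hedged sketch of sub-event shakemap superposition: element-wise maximum
# over per-sub-event ground-motion grids (here flattened to lists of sites).
def superpose(grids):
    """grids: list of equal-length lists of ground-motion values per site."""
    return [max(site_values) for site_values in zip(*grids)]

sub_event_1 = [0.10, 0.30, 0.05]
sub_event_2 = [0.20, 0.10, 0.25]
mainshock = superpose([sub_event_1, sub_event_2])  # [0.2, 0.3, 0.25]
```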

  4. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  5. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, differing methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removing duplicate events and using moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases compiled for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporated the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the databases and appropriate GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano, and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually require less seismic resilience, seismic risk is likely to be greater.
Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
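    The GMPE evaluation step described above, comparing observed ground motions with published predictions, is commonly summarized with natural-log residuals. The sketch below is a generic illustration with invented values, not the study's actual comparison procedure or data.

```python
import math

# Hedged sketch of GMPE goodness-of-fit: log residuals ln(obs/pred), then
# their mean (bias) and sample standard deviation (scatter).
def ln_residuals(observed, predicted):
    return [math.log(o / p) for o, p in zip(observed, predicted)]

def bias_and_sigma(res):
    mean = sum(res) / len(res)
    sigma = math.sqrt(sum((r - mean) ** 2 for r in res) / (len(res) - 1))
    return mean, sigma

# a GMPE that matches the observations perfectly has zero bias and zero scatter
bias, sigma = bias_and_sigma(ln_residuals([0.10, 0.25, 0.40], [0.10, 0.25, 0.40]))
```

In practice the best-fitting GMPE would be the one minimizing both the bias and the scatter over many recorded events.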

  6. Using safety inspection data to estimate shaking intensity for the 1994 Northridge earthquake

    USGS Publications Warehouse

    Thywissen, K.; Boatwright, J.

    1998-01-01

    We map the shaking intensity suffered in Los Angeles County during the 17 January 1994, Northridge earthquake using municipal safety inspection data. The intensity is estimated from the number of buildings given red, yellow, or green tags, aggregated by census tract. Census tracts contain from 200 to 4000 residential buildings and have an average area of 6 km², but are as small as 2 and 1 km² in the most densely populated areas of the San Fernando Valley and downtown Los Angeles, respectively. In comparison, the zip code areas on which standard MMI intensity estimates are based are six times larger, on average, than the census tracts. We group the buildings by age (before and after 1940 and 1976), by number of housing units (one, two to four, and five or more), and by construction type, and we normalize the tags by the total number of similar buildings in each census tract. We analyze the seven most abundant building categories. The fragilities (the fraction of buildings in each category tagged within each intensity level) for these seven building categories are adjusted so that the intensity estimates agree. We calibrate the shaking intensity to correspond with the modified Mercalli intensities (MMI) estimated and compiled by Dewey et al. (1995); the shapes of the resulting isoseismals are similar, although we underestimate the extent of the MMI = 6 and 7 areas. The fragility varies significantly between different building categories (by factors of 10 to 20) and building ages (by factors of 2 to 6). The post-1940 wood-frame multi-family (≥5 units) dwellings make up the most fragile building category, and the post-1940 wood-frame single-family dwellings make up the most resistant building category.
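    The normalization step described above, converting tag counts per census tract and building category into tagged fractions, can be sketched as follows. The counts are invented; the study's calibration against per-category fragilities is not reproduced here.

```python
# Hedged sketch of safety-tag normalization: for one building category in one
# census tract, divide each tag count by the total number of similar buildings
# so tracts of different sizes become comparable.
def tag_fractions(tags, total_similar_buildings):
    """tags: e.g. {'red': 3, 'yellow': 12, 'green': 30};
    total_similar_buildings: all buildings of this category in the tract."""
    return {color: n / total_similar_buildings for color, n in tags.items()}

tract = tag_fractions({'red': 3, 'yellow': 12, 'green': 30}, 600)
# tract['red'] -> 0.005, i.e. 0.5% of this category was red-tagged
```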

  7. Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing

    NASA Technical Reports Server (NTRS)

    Tinker, M. L.

    2003-01-01

    A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes due to the friction and clearance effects in the bearings. As a result, a major redesign effort for the interface mechanisms was undertaken. The total cost of the fixture design, construction and checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars as opposed to millions, for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities. For example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.

  8. Lunar regolith densification

    NASA Technical Reports Server (NTRS)

    Ko, Hon-Yim; Sture, Stein

    1991-01-01

    Core tube samples of the lunar regolith obtained during the Apollo missions showed a rapid increase in the density of the regolith with depth. Various hypotheses have been proposed for the possible cause of this phenomenon, including the densification of the loose regolith material by repeated shaking from the seismic tremors which have been found to occur at regular monthly intervals when the moon and earth are closest to one another. A test bed was designed to study regolith densification. This test bed uses Minnesota Lunar Simulant (MLS) to conduct shaking experiments in the geotechnical centrifuge with an inflight shake table system. By reproducing realistic in-situ regolith properties, the experiment also serves to test penetrator concepts. The shake table system was designed and used for simulation experiments to study effects of earthquakes on terrestrial soil structures. It is mounted on a 15 g-ton geotechnical centrifuge in which the self-weight induced stresses are replicated by testing an n-th scale model in a gravity field which is n times larger than Earth's gravity. A similar concept applies when dealing with lunar prototypes, where the gravity ratio required for proper simulation of lunar gravity effects is that between the centrifugal acceleration and the lunar gravity. Records of lunar seismic tremors, or moonquakes, were obtained. While these records are being prepared for use as the input data to drive the shake table system, records from the El Centro earthquake of 1940 are being used to perform preliminary tests, using a soil container which was previously used for earthquake studies. This container has a laminar construction, with the layers free to slide on each other, so that the soil motion during the simulated earthquake will not be constrained by the otherwise rigid boundaries. The soil model is prepared by pluviating the MLS from a hopper into the laminar container to a depth of 6 in. 
The container is mounted on the shake table and the centrifuge is operated to generate an acceleration of 10 times Earth's gravity or 60 times the lunar gravity, thus simulating a lunar regolith thickness of 30 ft. The shake table is then operated using the scaled 'moonquake' as the input motion. One or more model moonquakes are used in each experiment, after which the soil is analyzed for its density profile with depth. This is accomplished by removing from the soil bed a column of soil contained within a thin rubber sleeve which has been previously embedded vertically in the soil during pluviation. This column of soil is transferred to a gamma ray device, in which the gamma ray transmission transversely through the soil is measured and compared with standard calibration samples. In this manner, the density profile can be determined. Preliminary results to date are encouraging, and the Center plans to study the effects of duration of shaking, intensity of the shaking motion, and the frequency of the motion.
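    The centrifuge scaling used above (a model tested at N times the target gravity represents a prototype N times deeper) can be checked against the abstract's own numbers: 6 in of simulant spun at 60 times lunar gravity models 30 ft of regolith.

```python
# Hedged sketch of the geotechnical centrifuge scaling law: prototype depth
# equals model depth times the gravity ratio. Units: inches in, feet out.
def prototype_depth_ft(model_depth_in, gravity_ratio):
    return model_depth_in * gravity_ratio / 12.0  # inches -> feet

depth = prototype_depth_ft(6, 60)  # 30.0 ft, matching the abstract
```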

  9. What Do We Tell People to Do during Earthquake Shaking in Emerging Countries? A Multidisciplinary Approach

    NASA Astrophysics Data System (ADS)

    Tucker, B. E.; Rodgers, J. E.; Tobin, L. T.; Cedillos, V.; Tshering, K. D.; Kumar, H.; Jomo, J.

    2015-12-01

    Providing advice on what to do during shaking in developing nations with high earthquake hazard and vulnerable buildings is a daunting task at the intersection of public policy and science. In areas where most buildings are masonry, earthen or concrete and are at high risk of collapse in strong shaking, advice may differ from the "Drop, Cover and Hold On" message given where residential construction is light timber (less lethal) and building codes are enforced. People in emerging countries are often unsure whether to evacuate or "run out" of the building, or to remain inside and protect oneself with Drop, Cover and Hold On, going to a marked "Safe Zone" or one of several other actions. GeoHazards International approached this problem by bringing together scientific research from multiple disciplines: seismology, epidemiology, structural engineering, risk communication, and sociology. We brought together researchers and practitioners, who applied scientific principles and professional judgment to the limited data on effectiveness of various protective actions in emerging countries. We developed guidance on what message creators and policymakers should consider; the process for developing message content and forms; and the people to involve. A responsible message must account for not only the local tectonic environment and site conditions, but also building vulnerability, the presence of safe open space, how people are killed and injured in earthquakes, population exposure, and the beliefs, customs and social context that affect how messages are received and acted upon. We found that local agencies should make a policy decision on the appropriate action, based on local scientific and technical information, because no one protective action will protect the majority in every context. 
The safest specific action varies according to where people are located, whether they will be safer where they are or by moving, and whether they can make it to the safer place before shaking becomes too strong. We plan to conduct a field test of our guidance process in Haiti, where we will help local agencies create and disseminate messages. GHI's protective actions guidance illustrates how scientific research from multiple disciplines can combine to achieve broader impact, in this case via public safety messaging.

  10. Seismic performance of geosynthetic-soil retaining wall structures

    NASA Astrophysics Data System (ADS)

    Zarnani, Saman

Vertical inclusions of expanded polystyrene (EPS) placed behind rigid retaining walls were investigated as geofoam seismic buffers to reduce earthquake-induced loads. A numerical model was developed using the program FLAC, and the model was validated against 1-g shaking table test results of EPS geofoam seismic buffer models. Two constitutive models for the component materials were examined: elastic-perfectly plastic with a Mohr-Coulomb (M-C) failure criterion, and a non-linear hysteresis damping model with an equivalent linear method (ELM) approach. It was judged that the M-C model was sufficiently accurate for practical purposes. The mechanical property of interest for attenuating dynamic loads using a seismic buffer was the buffer stiffness, defined as K = E/t (E = buffer elastic modulus, t = buffer thickness). For the range of parameters investigated in this study, K ≤ 50 MN/m3 was observed to be the practical range for the optimal design of these systems. Parametric numerical analyses were performed to generate design charts that can be used for the preliminary design of these systems. A new high capacity shaking table facility was constructed at RMC that can be used to study the seismic performance of earth structures. Reduced-scale models of geosynthetic reinforced soil (GRS) walls were built on this shaking table and then subjected to simulated earthquake loading conditions. In some shaking table tests, the combined use of EPS geofoam and horizontal geosynthetic reinforcement layers was investigated. Numerical models were developed using the program FLAC together with the ELM and M-C constitutive models. Physical and numerical results were compared against values predicted using analysis methods found in the journal literature and in current North American design guidelines. 
The comparison shows that current Mononobe-Okabe (M-O) based analysis methods could not consistently predict the measured reinforcement connection load distributions satisfactorily at all elevations under both static and dynamic loading conditions. The results from GRS model wall tests with combined EPS geofoam and geosynthetic reinforcement layers show that the inclusion of an EPS geofoam layer behind the GRS wall face can reduce earth loads acting on the wall facing to values well below those recorded for conventional GRS wall model configurations.
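The stiffness criterion above reduces to a one-line check. A minimal sketch, where the EPS modulus and buffer thickness are hypothetical illustration values (the study reports only the design bound K ≤ 50 MN/m3, not these particular numbers):

```python
def buffer_stiffness(E_pa, t_m):
    """Seismic buffer stiffness K = E/t, in N/m^3 (Pa per metre of thickness)."""
    return E_pa / t_m

# Hypothetical EPS geofoam buffer: E = 5 MPa, t = 0.15 m (illustrative only)
K = buffer_stiffness(5e6, 0.15) / 1e6   # convert N/m^3 to MN/m^3
print(K)                                # ~33.3 MN/m^3
print(K <= 50)                          # inside the reported practical range
```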

  11. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, reusable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1Hz deterministic simulation results with 10Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. 
Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA hazard maps that contained more than 1.6 million hazard curves.

  12. The optimal hormonal replacement modality selection for multiple organ procurement from brain-dead organ donors

    PubMed Central

    Mi, Zhibao; Novitzky, Dimitri; Collins, Joseph F; Cooper, David KC

    2015-01-01

The management of brain-dead organ donors is complex. The use of inotropic agents and replacement of depleted hormones (hormonal replacement therapy) is crucial for successful multiple organ procurement, yet the optimal hormonal replacement has not been identified, and the statistical adjustment to determine the best selection is not trivial. Traditional comparisons between every pair of treatments, and multiple comparisons to all (MCA), are statistically conservative. Hsu’s multiple comparisons with the best (MCB) – adapted from Dunnett’s multiple comparisons with control (MCC) – has been used for selecting the best treatment based on continuous variables. We selected the best hormonal replacement modality for successful multiple organ procurement using a two-step approach. First, we estimated the predicted margins by constructing generalized linear models (GLM) or generalized linear mixed models (GLMM), and then we applied the multiple comparison methods to identify the best hormonal replacement modality given that the testing of hormonal replacement modalities is independent. Based on 10-year data from the United Network for Organ Sharing (UNOS), among 16 hormonal replacement modalities, and using the 95% simultaneous confidence intervals, we found that the combination of thyroid hormone, a corticosteroid, antidiuretic hormone, and insulin was the best modality for multiple organ procurement for transplantation. PMID:25565890
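A stripped-down version of the multiple-comparisons-with-the-best screen can be sketched as follows. This is only an illustration under a normal approximation: the margins, standard errors, and critical constant are hypothetical, and the real procedure derives its critical constant from Hsu's method applied to GLM/GLMM predicted margins.

```python
import math

def mcb_best(margins, std_errs, d_crit):
    """Simplified multiple-comparisons-with-the-best screen: modality i stays
    in the 'best' set if the upper simultaneous confidence bound on
    (theta_i - max_{j != i} theta_j) is >= 0. d_crit stands in for Hsu's
    critical constant (a hypothetical value is used below, not a quantile
    computed as in the actual MCB procedure)."""
    best = []
    for i, (m, se) in enumerate(zip(margins, std_errs)):
        rival = max(mm for j, mm in enumerate(margins) if j != i)
        # conservative SE for the difference: combine with the largest rival SE
        rival_se = max(s for j, s in enumerate(std_errs) if j != i)
        upper = (m - rival) + d_crit * math.hypot(se, rival_se)
        if upper >= 0:
            best.append(i)
    return best

# Hypothetical predicted margins (e.g. organ-yield rates) for 4 modalities
margins = [0.42, 0.55, 0.51, 0.38]
ses = [0.02, 0.02, 0.02, 0.02]
print(mcb_best(margins, ses, d_crit=2.5))   # modalities not ruled out as best
```

With these illustrative numbers, modalities 1 and 2 cannot be distinguished from the best, while 0 and 3 are eliminated.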

  13. Strong Effects of Vs30 Heterogeneity on Physics-Based Scenario Ground-Shaking Computations

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Pullammanappallil, S. K.

    2014-12-01

    Hazard mapping and building codes worldwide use the vertically time-averaged shear-wave velocity between the surface and 30 meters depth, Vs30, as one predictor of earthquake ground shaking. Intensive field campaigns a decade ago in Reno, Los Angeles, and Las Vegas measured urban Vs30 transects with 0.3-km spacing. The Clark County, Nevada, Parcel Map includes urban Las Vegas and comprises over 10,000 site measurements over 1500 km2, completed in 2010. All of these data demonstrate fractal spatial statistics, with a fractal dimension of 1.5-1.8 at scale lengths from 0.5 km to 50 km. Vs measurements in boreholes up to 400 m deep show very similar statistics at 1 m to 200 m lengths. When included in physics-based earthquake-scenario ground-shaking computations, the highly heterogeneous Vs30 maps exhibit unexpectedly strong influence. In sensitivity tests (image below), low-frequency computations at 0.1 Hz display amplifications (as well as de-amplifications) of 20% due solely to Vs30. In 0.5-1.0 Hz computations, the amplifications are a factor of two or more. At 0.5 Hz and higher frequencies the amplifications can be larger than what the 1-d Building Code equations would predict from the Vs30 variations. Vs30 heterogeneities at one location have strong influence on amplifications at other locations, stretching out in the predominant direction of wave propagation for that scenario. The sensitivity tests show that shaking and amplifications are highly scenario-dependent. Animations of computed ground motions and how they evolve with time suggest that the fractal Vs30 variance acts to trap wave energy and increases the duration of shaking. Validations of the computations against recorded ground motions, possible in Las Vegas Valley due to the measurements of the Clark County Parcel Map, show that ground motion levels and amplifications match, while recorded shaking has longer duration than computed shaking. 
Several mechanisms may explain the amplification and increased duration of shaking in the presence of heterogeneous spatial distributions of Vs: conservation of wave energy across velocity changes; geometric focusing of waves by low-velocity lenses; vertical resonance and trapping; horizontal resonance and trapping; and multiple conversion of P- to S-wave energy.
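Vs30, the predictor discussed above, is the travel-time-averaged shear-wave velocity over the top 30 m. A minimal sketch with a hypothetical three-layer profile (the layer values are illustrative, not Parcel Map data):

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the top 30 m:
    Vs30 = 30 / sum(h_i / Vs_i), truncating the profile at 30 m depth."""
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        h_used = min(h, 30.0 - depth)
        if h_used <= 0:
            break
        travel_time += h_used / v
        depth += h_used
    if depth < 30.0:                      # extend the last layer to 30 m
        travel_time += (30.0 - depth) / velocities_mps[-1]
    return 30.0 / travel_time

# Hypothetical 3-layer profile: 5 m at 180 m/s, 10 m at 350 m/s, 20 m at 760 m/s
print(vs30([5, 10, 20], [180, 350, 760]))   # ~394 m/s
```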

  14. A simple hand‐held magnet array for efficient and reproducible SABRE hyperpolarisation using manual sample shaking

    PubMed Central

    Richardson, Peter M.; Jackson, Scott; Parrott, Andrew J.; Nordon, Alison; Duckett, Simon B.

    2018-01-01

    Signal amplification by reversible exchange (SABRE) is a hyperpolarisation technique that catalytically transfers nuclear polarisation from parahydrogen, the singlet nuclear isomer of H2, to a substrate in solution. The SABRE exchange reaction is carried out in a polarisation transfer field (PTF) of tens of gauss before transfer to a stronger magnetic field for nuclear magnetic resonance (NMR) detection. In the simplest implementation, polarisation transfer is achieved by shaking the sample in the stray field of a superconducting NMR magnet. Although convenient, this method suffers from limited reproducibility and cannot be used with NMR spectrometers that do not have appreciable stray fields, such as benchtop instruments. Here, we use a simple hand‐held permanent magnet array to provide the necessary PTF during sample shaking. We find that the use of this array provides a 25% increase in SABRE enhancement over the stray field approach, while also providing improved reproducibility. Arrays with a range of PTFs were tested, and the PTF‐dependent SABRE enhancements were found to be in excellent agreement with comparable experiments carried out using an automated flow system where an electromagnet is used to generate the PTF. We anticipate that this approach will improve the efficiency and reproducibility of SABRE experiments carried out using manual shaking and will be particularly useful for benchtop NMR, where a suitable stray field is not readily accessible. The ability to construct arrays with a range of PTFs will also enable the rapid optimisation of SABRE enhancement as function of PTF for new substrate and catalyst systems. PMID:29193324

  15. Imaging Strategies for Tissue Engineering Applications

    PubMed Central

    Nam, Seung Yun; Ricles, Laura M.; Suggs, Laura J.

    2015-01-01

Tissue engineering has evolved with multifaceted research being conducted using advanced technologies, and it is progressing toward clinical applications. As tissue engineering technology advances, it grows increasingly sophisticated, encompassing nanoscale strategies for material construction and synergistic methods for combining materials with cells, growth factors, or other macromolecules. Therefore, to assess advanced tissue-engineered constructs, tissue engineers need versatile imaging methods capable of monitoring not only morphological but also functional and molecular information. However, there is no single imaging modality that is suitable for all tissue-engineered constructs. Each imaging method has its own range of applications and provides information based on the specific properties of the imaging technique. Therefore, according to the requirements of the tissue engineering studies, the most appropriate tool should be selected among a variety of imaging modalities. The goal of this review article is to describe available biomedical imaging methods to assess tissue engineering applications and to provide tissue engineers with criteria and insights for determining the best imaging strategies. Commonly used biomedical imaging modalities, including X-ray and computed tomography, positron emission tomography and single photon emission computed tomography, magnetic resonance imaging, ultrasound imaging, optical imaging, and emerging techniques and multimodal imaging, will be discussed, focusing on the latest trends of their applications in recent tissue engineering studies. PMID:25012069

  16. Monitoring of Engineering Buildings Behaviour Within the Disaster Management System

    NASA Astrophysics Data System (ADS)

    Oku Topal, G.; Gülal, E.

    2017-11-01

Disaster management aims to prevent events that result in disaster, or to reduce the losses they cause. Monitoring engineering structures, identifying unusual movements, and taking the necessary precautions are crucial for assessing disaster risk, so that preventive measures can be taken before large losses occur. Advancing technology, growing populations, the resulting increase in construction, and the large share of the economy these assets represent all motivate damage detection strategies. Structural Health Monitoring (SHM) is the most effective of these strategies, and SHM research is essential for keeping such structures safe. The purpose of structural monitoring is to detect possible failures in advance and to take the necessary precautions. In this paper, determining the behaviour of a structure using the Global Positioning System (GPS) is investigated. For this purpose, shaking table tests were performed: the shaking table was driven at different amplitudes and frequencies, with the aim of detecting these movements with a GPS measuring system. The recorded data were evaluated using time series analysis and Fast Fourier Transformation techniques, and the frequency and amplitude values were calculated. The test results indicate whether the GPS measurement method can accurately detect the movements of engineering structures.
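The Fast Fourier Transformation step can be sketched as below. The 2 Hz, 5 mm sinusoid and the 100 Hz sampling rate are illustrative stand-ins for the GPS displacement records from the shaking table:

```python
import numpy as np

# Hypothetical shake-table displacement record: 2 Hz motion, 5 mm amplitude,
# sampled at 100 Hz for 20 s (values are illustrative only).
fs, T = 100.0, 20.0
t = np.arange(0, T, 1.0 / fs)
x = 5.0 * np.sin(2 * np.pi * 2.0 * t)     # displacement in mm

# FFT-based estimate of the dominant frequency and its amplitude
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
amp = 2.0 * np.abs(X) / len(x)            # single-sided amplitude spectrum
peak = np.argmax(amp[1:]) + 1             # skip the DC bin
print(freqs[peak], amp[peak])             # ≈ 2.0 Hz, ≈ 5.0 mm
```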

  17. Topology Design for Directional Range Extension Networks with Antenna Blockage

    DTIC Science & Technology

    2017-03-19

introduced by pod-based antenna blockages. Using certain modeling approximations, the paper presents a quantitative analysis showing design trade-offs...parameters. Section IV develops quantitative relationships among key design elements and performance metrics. Section V considers some implications of the...Topology Design for Directional Range Extension Networks with Antenna Blockage Thomas Shake MIT Lincoln Laboratory shake@ll.mit.edu Abstract

  18. ShakeMap-based prediction of earthquake-induced mass movements in Switzerland calibrated on historical observations

    USGS Publications Warehouse

    Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan

    2018-01-01

In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Although calibrated on historical events of moderate magnitude, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework. This study has high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.
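The statistical model in this family of approaches is logistic regression on shaking intensity plus geospatial susceptibility proxies. A minimal sketch with hypothetical coefficients and predictors (not the calibrated Swiss model or its actual proxy set):

```python
import math

def ground_failure_probability(pga_g, slope_deg, wetness_index, coeffs):
    """Logistic regression combining peak ground acceleration with
    geospatial susceptibility proxies. Coefficients and predictor names
    are hypothetical, chosen only to illustrate the model form."""
    b0, b_pga, b_slope, b_wet = coeffs
    z = b0 + b_pga * math.log(pga_g) + b_slope * slope_deg + b_wet * wetness_index
    return 1.0 / (1.0 + math.exp(-z))     # probability of ground failure

coeffs = (-3.0, 1.5, 0.05, 0.2)           # illustrative values only
p = ground_failure_probability(0.3, 25.0, 4.0, coeffs)
print(round(p, 3))
```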

  19. Linking ShakeMap and Emergency Managers in the Utah Region

    NASA Astrophysics Data System (ADS)

    Pankow, K.; Bausch, D.; Carey, B.

    2007-12-01

    In 2001, the University of Utah Seismograph Stations (UUSS) locally customized and began producing automatic ShakeMaps in Utah's Wasatch Front urban corridor as part of a new real-time earthquake information system developed under the Advanced National Seismic System. In 2005, motivated by requests from Utah's Division of Homeland Security and FEMA, ShakeMap capabilities were expanded to cover the entire Utah region. Now in 2007, ShakeMap capabilities throughout the region will again be enhanced by increased station coverage. The increased station coverage comes both from permanent stations funded by a state initiative and from the temporary deployment of EarthScope USArray stations. The state initiative will add ~22 strong-motion instruments and ~10 broadband instruments to the UUSS network. The majority of these stations will be located in southwestern Utah--one of the fastest growing regions in the U.S. EarthScope will evenly distribute 70 broadband stations in the region during 2007 that will be removed after 18 to 24 months. In addition to the enhanced station coverage for producing ShakeMaps in the Utah region, the transfer of information to the emergency response community is also being enhanced. First, tools are being developed that will link ShakeMap data with HAZUS loss-estimation software in near-real-time for rapid impact assessment. Second, ShakeMap scenarios are being used in conjunction with HAZUS loss-estimation software to produce customized maps for planning and preparedness exercises and also for developing templates that can be used following a significant regional earthquake. With the improvements to ShakeMap and the improved dialogue with the emergency managers, a suite of maps and information products were developed based on scenario earthquakes for training and exercise purposes. These products will be available in a timely fashion following a significant earthquake in the Utah region.

  20. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites can be used to calculate a seismic hazard map representing the seismic hazard across a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. 
For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
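The final step, combining per-rupture intensity measures with rupture probabilities into a hazard curve, can be sketched as follows. The rupture probabilities and ground-motion values are hypothetical, and the linear sum is a small-probability simplification of the full PSHA formulation:

```python
import numpy as np

def hazard_curve(im_levels, rupture_probs, event_ims):
    """Annual exceedance probability at each intensity-measure (IM) level,
    combining an earthquake rupture forecast with per-event IMs:
    P(IM > x) = sum_k P(rup_k) * P(IM > x | rup_k), assuming rare,
    independent ruptures (a simplified sketch of the PSHA sum)."""
    curve = []
    for x in im_levels:
        p = 0.0
        for p_rup, ims in zip(rupture_probs, event_ims):
            # fraction of this rupture's realizations exceeding level x
            p += p_rup * np.mean(np.asarray(ims) > x)
        curve.append(p)
    return np.array(curve)

# Hypothetical two-rupture forecast with a few ground-motion realizations each
levels = [0.1, 0.3, 0.5]                   # spectral acceleration, g
probs = [0.01, 0.002]                      # annual rupture probabilities
ims = [[0.2, 0.4, 0.15], [0.6, 0.7, 0.5]]
print(hazard_curve(levels, probs, ims))    # decreasing with IM level
```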

  1. Seismic isolation of nuclear power plants using elastomeric bearings

    NASA Astrophysics Data System (ADS)

    Kumar, Manish

Seismic isolation using low damping rubber (LDR) and lead-rubber (LR) bearings is a viable strategy for mitigating the effects of extreme earthquake shaking on safety-related nuclear structures. Although seismic isolation has been deployed in nuclear structures in France and South Africa, it has not seen widespread use because of limited new build nuclear construction in the past 30 years and a lack of guidelines, codes and standards for the analysis, design and construction of isolation systems specific to nuclear structures. The nuclear accident at Fukushima Daiichi in March 2011 has led the nuclear community to consider seismic isolation for new large light water and small modular reactors to withstand the effects of extreme earthquakes. The mechanical properties of LDR and LR bearings are not expected to change substantially in design basis shaking. However, under shaking more intense than design basis, the properties of the lead cores in lead-rubber bearings may degrade due to heating associated with energy dissipation, some bearings in an isolation system may experience net tension, and the compression and tension stiffness may be affected by the horizontal displacement of the isolation system. The effects of intra-earthquake changes in mechanical properties on the response of base-isolated nuclear power plants (NPPs) were investigated using an advanced numerical model of a lead-rubber bearing that has been verified and validated, and implemented in OpenSees and ABAQUS. A series of experiments was conducted at the University at Buffalo to characterize the behavior of elastomeric bearings in tension. The test data were used to validate a phenomenological model of an elastomeric bearing in tension. Three times the shear modulus of the rubber was found to be a reasonable estimate of the cavitation stress of an elastomeric bearing. 
The sequence of loading did not change the behavior of an elastomeric bearing under cyclic tension, and there was no significant change in the shear modulus, compressive stiffness, and buckling load of a bearing following cavitation. Response-history analysis of base-isolated NPPs was performed using a two-node macro model and a lumped-mass stick model. A comparison of responses obtained from analysis using simplified and advanced isolator models showed that the variation in buckling load due to horizontal displacement and strength degradation due to heating of lead cores affect the responses of a base-isolated NPP most significantly. The two-node macro model can be used to estimate the horizontal displacement response of a base-isolated NPP, but a three-dimensional model that explicitly considers all of the bearings in the isolation system will be required to estimate demands on individual bearings, and to investigate rocking and torsional responses. The use of the simplified LR bearing model underestimated the torsional and rocking response of the base-isolated NPP. The vertical spectral response at the top of the containment building was very sensitive to how damping was defined for the response-history analysis.
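The cavitation estimate quoted above is a one-line rule of thumb; the shear modulus used here is a hypothetical illustration value, not one from the Buffalo tests:

```python
def cavitation_stress(G_mpa):
    """Cavitation stress of an elastomeric bearing, estimated as three
    times the shear modulus of the rubber (the finding reported above)."""
    return 3.0 * G_mpa

# Hypothetical low-damping rubber with G = 0.65 MPa
print(round(cavitation_stress(0.65), 2))   # 1.95 MPa
```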

  2. Finite element model updating of a prestressed concrete box girder bridge using subproblem approximation

    NASA Astrophysics Data System (ADS)

    Chen, G. W.; Omenzetter, P.

    2016-04-01

This paper presents the implementation of an updating procedure for the finite element model (FEM) of a prestressed concrete continuous box-girder highway off-ramp bridge. Ambient vibration testing was conducted to excite the bridge, assisted by linear chirp sweeps induced by two small electrodynamic shakers deployed to enhance the excitation levels, since the bridge was closed to traffic. The data-driven stochastic subspace identification method was executed to recover the modal properties from measurement data. An initial FEM was developed, and the correlation between the experimental modal results and their analytical counterparts was studied. Modelling of the pier and abutment bearings was carefully adjusted to reflect the real operational conditions of the bridge. The subproblem approximation method was subsequently utilized to automatically update the FEM. For this purpose, the influences of bearing stiffness, and the mass density and Young's modulus of materials, were examined as uncertain parameters using sensitivity analysis. The updating objective function was defined as a summation of squared relative errors of natural frequencies between the FEM and experimentation. All the identified modes were used as target responses with the purpose of imposing more constraints on the optimization process and decreasing the number of potentially feasible combinations of parameter changes. The updated FEM of the bridge was able to produce sufficient improvements in natural frequencies in most modes of interest, and can serve for more precise dynamic response prediction or future investigation of the bridge health.
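The updating objective described above, a sum of squared relative natural-frequency errors, can be sketched directly; the mode frequencies below are hypothetical:

```python
def updating_objective(f_fem_hz, f_exp_hz):
    """Sum of squared relative natural-frequency errors between the
    finite element model and the experimentally identified modes."""
    return sum(((fa - fe) / fe) ** 2 for fa, fe in zip(f_fem_hz, f_exp_hz))

# Hypothetical identified vs. model frequencies for three modes (Hz)
f_exp = [2.10, 3.45, 5.80]
f_fem = [2.25, 3.30, 6.10]
print(updating_objective(f_fem, f_exp))   # value the optimizer drives toward 0
```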

  3. On the selection of user-defined parameters in data-driven stochastic subspace identification

    NASA Astrophysics Data System (ADS)

    Priori, C.; De Angelis, M.; Betti, R.

    2018-02-01

The paper focuses on the time domain output-only technique called Data-Driven Stochastic Subspace Identification (DD-SSI); in order to identify modal models (frequencies, damping ratios and mode shapes), the role of its user-defined parameters is studied, and rules to determine their minimum values are proposed. This investigation is carried out using, first, the time histories of structural responses to stationary excitations, with a large number of samples, satisfying the hypothesis on the input imposed by DD-SSI. Then, the case of non-stationary seismic excitations with a reduced number of samples is considered. In this paper, partitions of the data matrix different from the one proposed in the SSI literature are investigated, together with the influence of different choices of the weighting matrices. The study is carried out considering two different applications: (1) data obtained from vibration tests on a scaled structure and (2) in-situ tests on a reinforced concrete building. Referring to the former, the identification of a steel frame structure tested on a shaking table is performed using its responses, in terms of absolute accelerations, to a stationary (white noise) base excitation and to non-stationary seismic excitations of low intensity. Black-box and modal models are identified in both cases and the results are compared with those from an input-output subspace technique. With regard to the latter, the identification of a complex hospital building is conducted using data obtained from ambient vibration tests.
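DD-SSI starts by assembling the measured outputs into a block-Hankel matrix whose block-row count i is exactly the kind of user-defined parameter the paper studies. A minimal sketch of that first step (channel count, sample count, and i are hypothetical illustration values):

```python
import numpy as np

def block_hankel(y, i):
    """Output block-Hankel matrix used by data-driven SSI: 2*i block rows,
    j = n_samples - 2*i + 1 columns, scaled by 1/sqrt(j).
    y: (n_channels, n_samples) array of measured outputs;
    i: user-defined number of block rows per (past/future) half."""
    l, s = y.shape
    j = s - 2 * i + 1
    rows = [y[:, k:k + j] for k in range(2 * i)]
    return np.vstack(rows) / np.sqrt(j)

# Hypothetical 2-channel record with 12 samples and i = 3
y = np.arange(24, dtype=float).reshape(2, 12)
H = block_hankel(y, 3)
print(H.shape)                             # (12, 7): 2*i*l rows, j columns
```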

  4. Vortex shaking study of REBCO tape with consideration of anisotropic characteristics

    NASA Astrophysics Data System (ADS)

    Liang, Fei; Qu, Timing; Zhang, Zhenyu; Sheng, Jie; Yuan, Weijia; Iwasa, Yukikazu; Zhang, Min

    2017-09-01

The second generation high temperature superconductor, specifically REBCO, has become a new research focus in the development of a new generation of high-field (>25 T) magnets. One of the main challenges in the application of these magnets is the screening current problem. Previous research shows that for magnetized superconducting stacks and bulks, the application of an AC field in plane with the circulating current will lead to demagnetization due to vortex shaking, which provides a possible solution for removing the screening current. This paper provides an in-depth study, both experimental and numerical, to unveil the vortex shaking mechanism of REBCO stacks. A new experiment was carried out to measure the demagnetization rate of REBCO stacks exposed to an in-plane AC magnetic field. Meanwhile, 2D finite element models, based on the E-J power law, were developed for simulating the vortex shaking effect of the AC magnetic field. Qualitative agreement was obtained between the experimental and simulation results. Our results show that the applied in-plane magnetic field leads to a sudden decay of the trapped magnetic field in the first half shaking cycle, which is caused by the magnetic field dependence of the critical current. Furthermore, the decline of the demagnetization rate with increasing tape number is mainly due to the cross-magnetic field being screened by the top and bottom stacks during the shaking process, which leads to a lower demagnetization rate in the inner layers. We also demonstrate that the frequency of the applied AC magnetic field has little impact on the demagnetization process. Our modeling tool and findings refine vortex shaking theory and provide helpful guidance for eliminating screening current in the new generation of REBCO magnets.
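The E-J power law underlying the finite element models is simple to state. A minimal sketch with typical (not paper-specific) parameter values:

```python
import numpy as np

def e_field(J, Jc, n=25, Ec=1e-4):
    """E-J power law for a superconductor: E = Ec * (J/Jc)**n.
    Ec = 1e-4 V/m is the usual critical-field criterion; n is the
    flux-creep exponent (both are typical values, not from the paper)."""
    return Ec * np.sign(J) * (np.abs(J) / Jc) ** n

Jc = 1e8                                   # critical current density, A/m^2
print(e_field(1.0e8, Jc))                  # at J = Jc, E equals Ec
print(e_field(0.9e8, Jc) < 1e-4)           # sharply nonlinear below Jc
```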

  5. [Research on non-rigid registration of multi-modal medical image based on Demons algorithm].

    PubMed

    Hao, Peibo; Chen, Zhen; Jiang, Shaofeng; Wang, Yang

    2014-02-01

Non-rigid medical image registration is a popular subject in medical image research and has important clinical value. In this paper we put forward an improved Demons algorithm that combines a grey-level conservation model with a local structure tensor conservation model to construct a new energy function for the multi-modal registration problem. We then applied the L-BFGS algorithm to optimize the energy function and solve the complex three-dimensional data optimization problem. Finally, we used a multi-scale hierarchical refinement strategy to handle large-deformation registration. The experimental results showed that the proposed algorithm performed well for large-deformation and multi-modal three-dimensional medical image registration.
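For reference, the classic mono-modal Demons update that the paper builds on can be sketched in a few lines. This is Thirion's original displacement force, not the improved multi-modal energy with grey-level and structure-tensor conservation terms proposed here:

```python
import numpy as np

def demons_update(fixed, moving, eps=1e-12):
    """Classic Demons displacement update on 2D images:
    u = (m - f) * grad(f) / (|grad(f)|^2 + (m - f)^2), per pixel.
    Returns (ux, uy), the update components along axis 1 and axis 0."""
    diff = moving - fixed
    gy, gx = np.gradient(fixed)            # gradients along axis 0, axis 1
    denom = gx ** 2 + gy ** 2 + diff ** 2 + eps
    return diff * gx / denom, diff * gy / denom

# Hypothetical toy images: a vertical ramp and a shifted copy
f = np.outer(np.arange(5.0), np.ones(5))
m = f + 0.5
ux, uy = demons_update(f, m)
print(ux.shape, uy.shape)                  # (5, 5) (5, 5)
```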

  6. Multi-modality endoscopic imaging for the detection of colorectal cancer

    NASA Astrophysics Data System (ADS)

    Wall, Richard Andrew

    Optical coherence tomography (OCT) is an imaging method considered the optical analog to ultrasound, using optical interferometry to construct two-dimensional depth-resolved images of tissue microstructure. With a resolution on the order of 10 µm, a penetration depth of 1-2 mm in highly scattering tissue, and its miniaturization capabilities, fiber optic-coupled OCT is an ideal modality for inspection of the mouse colon. In the present study, the complementary modalities of laser-induced fluorescence (LIF), which offers information on the biochemical makeup of the tissue, and surface magnifying chromoendoscopy (SMC), which offers high-contrast surface visualization, are combined with OCT in endoscopic imaging systems for greater specificity and sensitivity in differentiating between normal and neoplastic tissue, and for visualizing biomarkers indicative of early events in colorectal carcinogenesis. Oblique incidence reflectometry (OIR) also offers advantages, allowing the calculation of bulk tissue optical properties for use as a diagnostic tool. The study comprised three specific sections. First, a dual-modality OCT-LIF imaging system was designed, capable of focusing light over 325-1300 nm using a reflective distal optics design. A dual-modality fluorescence-based SMC-OCT system was then designed and constructed, capable of resolving the stained mucosal crypt structure of the in vivo mouse colon. The SMC-OCT instrument's OIR capabilities were then modeled, and a modified version of the probe was used to measure tissue scattering and absorption coefficients.

  7. Bob Meyer (right), acting deputy director of NASA Dryden, shakes hands with Les Bordelon, executive

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Bob Meyer (on the right), acting deputy director of NASA's Dryden Flight Research Center, Edwards, California, shakes hands with Les Bordelon, executive director of Edwards Air Force Base. The handshake represents Dryden's acceptance of an Air Force C-20A delivered from Ramstein Air Base, Germany. The aircraft will be modified to carry equipment and experiments in support of both NASA and U.S. Air Force projects. The joint use of this aircraft is a result of the NASA Dryden/Edwards Air Force Base Alliance which shares some resources as cost-cutting measures.

  8. Damage Assessment of a Full-Scale Six-Story Wood-Frame Building Following Triaxial Shake Table Tests

    Treesearch

    John W. van de Lindt; Rakesh Gupta; Shiling Pei; Kazuki Tachibana; Yasuhiro Araki; Douglas Rammer; Hiroshi Isoda

    2012-01-01

    In the summer of 2009, a full-scale midrise wood-frame building was tested under a series of simulated earthquakes on the world's largest shake table in Miki City, Japan. The objective of this series of tests was to validate a performance-based seismic design approach by qualitatively and quantitatively examining the building's seismic performance in terms of...

  9. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, now allow for viable numerical and analytical alternatives to PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex, laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both national and local scales, the latter accounting for the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale, even on a modern laptop computer. 
At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to HPC dedicated clusters up to Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users, who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock, to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  10. MyShake: Smartphone-based detection and analysis of Oklahoma earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2016-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing (myshake.berkeley.edu). It uses accelerometer data from phones to detect earthquake-like motion, and then uploads triggers and waveform data to a server for aggregation of the results. Since the public release in February 2016, more than 200,000 Android phone owners have installed the app, and the global network has recorded more than 300 earthquakes. In Oklahoma, there are about 200 active users each day, providing enough data for the network to detect earthquakes and for us to perform analysis of the events. MyShake has recorded waveform data for M2.6 to M5.8 earthquakes in the state. For the September 3, 2016, M5.8 earthquake, 14 phones detected the event, and we can use the waveforms to determine event characteristics. MyShake data provides a location 3.95 km from the ANSS location and a magnitude of 5.7. We can also use MyShake data to estimate a stress drop of 7.4 MPa. MyShake is still a rapidly expanding network that has the ability to grow by thousands of stations/phones in a matter of hours as public interest increases. These initial results suggest that the data will be useful for a variety of scientific studies of induced seismicity phenomena in Oklahoma, as well as having the potential to provide earthquake early warning in the future.

  11. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). 
Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
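    The final aggregation step described in this abstract, combining peak intensity measures with rupture probabilities into a hazard curve, reduces to a simple exceedance sum. A toy sketch, with entirely synthetic rupture rates and intensity measures standing in for the simulated values:

```python
import numpy as np

# Toy site-based hazard curve: each rupture variation contributes its
# annual rate to the exceedance total for every intensity level it exceeds.
# Rates and peak spectral accelerations below are synthetic illustrations.
rng = np.random.default_rng(0)
annual_rate = rng.uniform(1e-5, 1e-3, size=500)          # per-rupture rates
peak_sa = rng.lognormal(mean=-1.0, sigma=0.8, size=500)  # peak SA at site, g

def hazard_curve(levels, rates, ims):
    """Annual rate of exceeding each ground-motion level at the site."""
    return np.array([rates[ims > lev].sum() for lev in levels])

levels = np.array([0.1, 0.2, 0.4, 0.8])
curve = hazard_curve(levels, annual_rate, peak_sa)
```

Because CyberShake simulates many rupture variations per fault, the curve directly reflects the site-specific ground-motion variability rather than an ergodic empirical scatter term.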

  12. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). 
Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process. © 2010 Springer Basel AG.

  13. A fuel-based approach for emission factor development for highway paving construction equipment in China.

    PubMed

    Li, Zhen; Zhang, Kaishan; Pang, Kaili; Di, Baofeng

    2016-12-01

    The objective of this paper is to develop and demonstrate a fuel-based approach for emission factor estimation for highway paving construction equipment in China for better accuracy. A highway construction site in Chengdu was selected for this study, with NO emissions being characterized and demonstrated. Four commonly used pieces of paving equipment, i.e., three rollers and one paver, were selected in this study. A portable emission measurement system (PEMS) was developed and used for emission measurements of the selected equipment during real-world highway construction duties. Three duty modes were defined to characterize the NO emissions, i.e., idling, moving, and working. In order to develop a representative emission factor for this highway construction equipment, composite emission factors were estimated using modal emission rates and the corresponding modal durations in the process of typical construction duties. Depending on duty mode and equipment type, the NO emission rate ranged from 2.6 to 63.7 mg/s and 6.0 to 55.6 g/kg-fuel, with the fuel consumption ranging from 0.31 to 4.52 g/s correspondingly. The NO composite emission factor was estimated to be 9 to 41 mg/s, with the single-drum roller being the highest and the double-drum roller the lowest, and 6 to 30 g/kg-fuel, with the pneumatic tire roller being the highest and the double-drum roller the lowest. For the paver, both the time-based and fuel consumption-based NO composite emission rates are higher than those of all the rollers, at 56 mg/s and 30 g/kg-fuel, respectively. In terms of the time-based quantity, the working mode contributes more than the other modes, with idling being the least for both emissions and fuel consumption. In contrast, the fuel-based emission rate appears to have less variability. Thus, in order to estimate emission factors for emission inventory development, the fuel-based emission factor may be selected for better accuracy. 
The fuel-based composite emission factors will be less variable and more accurate than time-based emission factors. As a consequence, an emissions inventory developed using this approach will be more accurate and practical.
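    The composite-factor arithmetic described above is a duration-weighted average of the modal rates. A small worked sketch, using made-up modal rates and durations (not the paper's measurements):

```python
# Composite emission factors from modal rates weighted by duty-mode time.
# All numbers are illustrative, not the measured values from the study.
modes = {                 # mode: (NO emission rate mg/s, duration s)
    "idling":  (3.0,  600.0),
    "moving":  (20.0, 900.0),
    "working": (45.0, 1800.0),
}
fuel_rates = {"idling": 0.4, "moving": 1.5, "working": 3.5}   # g fuel / s

total_time = sum(t for _, t in modes.values())
total_no_mg = sum(r * t for r, t in modes.values())
total_fuel_g = sum(fuel_rates[m] * t for m, (_, t) in modes.items())

composite_mg_per_s = total_no_mg / total_time     # time-based factor
composite_g_per_kg = total_no_mg / total_fuel_g   # mg/g fuel == g/kg fuel
```

Because fuel burn and emissions rise together across duty modes, normalizing by fuel rather than time damps the mode-to-mode variability, which is the paper's argument for the fuel-based factor.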

  14. Learning Discriminative Binary Codes for Large-scale Cross-modal Retrieval.

    PubMed

    Xu, Xing; Shen, Fumin; Yang, Yang; Shen, Heng Tao; Li, Xuelong

    2017-05-01

    Hashing-based methods have attracted considerable attention for efficient cross-modal retrieval on large-scale multimedia data. The core problem of cross-modal hashing is how to learn compact binary codes that preserve the underlying correlations between heterogeneous features from different modalities. A majority of recent approaches aim at learning hash functions to preserve the pairwise similarities defined by given class labels. However, these methods fail to explicitly explore the discriminative property of class labels during hash function learning. In addition, they usually discard the discrete constraints imposed on the to-be-learned binary codes, and compromise to solve a relaxed problem with quantization to obtain the approximate binary solution. Therefore, the binary codes generated by these methods are suboptimal and less discriminative to different classes. To overcome these drawbacks, we propose a novel cross-modal hashing method, termed discrete cross-modal hashing (DCH), which directly learns discriminative binary codes while retaining the discrete constraints. Specifically, DCH learns modality-specific hash functions for generating unified binary codes, and these binary codes are viewed as representative features for discriminative classification with class labels. An effective discrete optimization algorithm is developed for DCH to jointly learn the modality-specific hash function and the unified binary codes. Extensive experiments on three benchmark data sets highlight the superiority of DCH under various cross-modal scenarios and show its state-of-the-art performance.
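    The retrieval pipeline this abstract assumes can be sketched schematically: modality-specific hash functions map heterogeneous features into a shared binary space, and retrieval ranks by Hamming distance. This is not the DCH optimization itself; the random projection matrices below are stand-ins for the learned hash functions:

```python
import numpy as np

# Schematic cross-modal hashing: project each modality into a shared
# n_bits-dimensional space, binarise with sign(), rank by Hamming distance.
rng = np.random.default_rng(1)
d_img, d_txt, n_bits = 64, 32, 16
W_img = rng.normal(size=(d_img, n_bits))  # stand-in for learned image hash fn
W_txt = rng.normal(size=(d_txt, n_bits))  # stand-in for learned text hash fn

def hash_codes(x, W):
    """Binarise projected features into {0, 1}^n_bits."""
    return (x @ W > 0).astype(np.uint8)

def hamming(a, b):
    return np.count_nonzero(a != b, axis=-1)

imgs = rng.normal(size=(100, d_img))      # image-modality database features
query = rng.normal(size=(1, d_txt))       # text-modality query features
db = hash_codes(imgs, W_img)
q = hash_codes(query, W_txt)
ranking = np.argsort(hamming(db, q))      # nearest binary codes first
```

DCH's contribution lies in how W_img and W_txt and the unified codes are learned jointly under discrete constraints, rather than in this lookup stage.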

  15. Feasibility study of earthquake early warning (EEW) in Hawaii

    USGS Publications Warehouse

    Thelen, Weston A.; Hotovec-Ellis, Alicia J.; Bodin, Paul

    2016-09-30

    The effects of earthquake shaking on the population and infrastructure across the State of Hawaii could be catastrophic, and the high seismic hazard in the region emphasizes the likelihood of such an event. Earthquake early warning (EEW) has the potential to give several seconds of warning before strong shaking starts, and thus reduce loss of life and damage to property. The two approaches to EEW are (1) a network approach (such as ShakeAlert or ElarmS), where the regional seismic network is used to detect the earthquake and distribute the alarm, and (2) a local approach, where a critical facility has a single seismometer (or small array) and a warning system on the premises. The network approach, also referred to here as ShakeAlert or ElarmS, uses the closest stations within a regional seismic network to detect and characterize an earthquake. Most parameters used for a network approach require observations on multiple stations (typically 3 or 4), which slows down the alarm time slightly, but the alarms are generally more reliable than with single-station EEW approaches. The network approach also benefits from having stations closer to the source of any potentially damaging earthquake, so that alarms can be sent ahead to anyone who subscribes to receive the notification. Thus, a fully implemented ShakeAlert system can provide seconds of warning for both critical facilities and general populations ahead of damaging earthquake shaking. The cost to implement and maintain a fully operational ShakeAlert system is high compared to a local approach or single-station solution, but the benefits of a ShakeAlert system would be felt statewide: the warning times for strong shaking are potentially longer for most sources at most locations. The local approach, referred to herein as “single station,” uses measurements from a single seismometer to assess whether strong earthquake shaking can be expected. 
Because of the reliance on a single station, false alarms are more common than when using a regional network of seismometers. Given the current network, a single-station approach provides more warning for damaging earthquakes that occur close to the station, but it would have limited benefit compared to a fully implemented ShakeAlert system. For Honolulu, for example, the single-station approach provides an advantage over ShakeAlert only for earthquakes that occur in a narrow zone extending northeast and southwest of O‘ahu. Instrumentation and alarms associated with the single-station approach are typically maintained and assessed within the target facility, and thus no outside connectivity is required. A single-station approach, then, is unlikely to help broader populations beyond the individuals at the target facility, but it has the benefit of being commercially available for relatively little cost. The USGS Hawaiian Volcano Observatory (HVO) is the Advanced National Seismic System (ANSS) regional seismic network responsible for locating and characterizing earthquakes across the State of Hawaii. During 2014 and 2015, HVO tested a network-based EEW algorithm within the current seismic network in order to assess the suitability for building a full EEW system. Using the current seismic instrumentation and processing setup at HVO, it is possible for a network approach to release an alarm a little more than 3 seconds after the earthquake is recorded on the fourth seismometer. Presently, earthquakes having M≥3 detected with the ElarmS algorithm have an average location error of approximately 4.5 km and an average magnitude error of -0.3 compared to the reviewed catalog locations from the HVO. Additional stations and upgrades to existing seismic stations would serve to improve solution precision and warning times, and additional staffing would be required to provide support for a robust, network-based EEW system. 
For a critical facility on the Island of Hawaiʻi, such as the telescopes atop Mauna Kea, one phased approach to mitigate losses could be to immediately install a single-station system to establish some level of warning. Subsequently, supporting the implementation of a full network-based EEW system on the Island of Hawaiʻi would provide additional benefit in the form of improved warning times once the system is fully installed and operational, which may take several years. Distributed populations across the Hawaiian Islands, including those outside the major cities and far from the likely earthquake source areas, would likely only benefit from a network approach such as ShakeAlert to provide warnings of strong shaking.

  16. Strong ground motion in Port-au-Prince, Haiti, during the M7.0 12 January 2010 Haiti earthquake

    USGS Publications Warehouse

    Hough, Susan E; Given, Doug; Taniguchi, Tomoyo; Altidor, J.R.; Anglade, Dieuseul; Mildor, S-L.

    2011-01-01

    No strong motion records are available for the 12 January 2010 M7.0 Haiti earthquake. We use aftershock recordings as well as detailed considerations of damage to estimate the severity and distribution of mainshock shaking in Port-au-Prince. Relative to ground motions at a hard-rock reference site, peak accelerations are amplified by a factor of approximately 2 at sites on low-lying deposits in central Port-au-Prince and by a factor of 2.5-3.5 on a steep foothill ridge in the southern Port-au-Prince metropolitan region. The observed amplification along the ridge cannot be explained by sediment-induced amplification, but is consistent with predicted topographic amplification by a steep, narrow ridge. Although damage was largely a consequence of poor construction, the damage pattern inferred from analysis of remote sensing imagery provides evidence for a correspondence between small-scale (0.1-1.0 km) topographic relief and high damage. Mainshock shaking intensity can be estimated crudely from a consideration of macroseismic effects. We further present detailed, quantitative analysis of the marks left on a tile floor by an industrial battery rack displaced during the mainshock, at the location where we observed the highest weak motion amplifications. Results of this analysis indicate that mainshock shaking was significantly higher at this location (~0.5 g, MMI VIII) relative to the shaking in parts of Port-au-Prince that experienced light damage. Our results further illustrate how observations of rigid body horizontal displacement during earthquakes can be used to estimate peak ground accelerations in the absence of instrumental data.

  17. A simple hand-held magnet array for efficient and reproducible SABRE hyperpolarisation using manual sample shaking.

    PubMed

    Richardson, Peter M; Jackson, Scott; Parrott, Andrew J; Nordon, Alison; Duckett, Simon B; Halse, Meghan E

    2018-07-01

    Signal amplification by reversible exchange (SABRE) is a hyperpolarisation technique that catalytically transfers nuclear polarisation from parahydrogen, the singlet nuclear isomer of H2, to a substrate in solution. The SABRE exchange reaction is carried out in a polarisation transfer field (PTF) of tens of gauss before transfer to a stronger magnetic field for nuclear magnetic resonance (NMR) detection. In the simplest implementation, polarisation transfer is achieved by shaking the sample in the stray field of a superconducting NMR magnet. Although convenient, this method suffers from limited reproducibility and cannot be used with NMR spectrometers that do not have appreciable stray fields, such as benchtop instruments. Here, we use a simple hand-held permanent magnet array to provide the necessary PTF during sample shaking. We find that the use of this array provides a 25% increase in SABRE enhancement over the stray field approach, while also providing improved reproducibility. Arrays with a range of PTFs were tested, and the PTF-dependent SABRE enhancements were found to be in excellent agreement with comparable experiments carried out using an automated flow system where an electromagnet is used to generate the PTF. We anticipate that this approach will improve the efficiency and reproducibility of SABRE experiments carried out using manual shaking and will be particularly useful for benchtop NMR, where a suitable stray field is not readily accessible. The ability to construct arrays with a range of PTFs will also enable the rapid optimisation of SABRE enhancement as a function of PTF for new substrate and catalyst systems. © 2017 The Authors Magnetic Resonance in Chemistry Published by John Wiley & Sons Ltd.

  18. Relation of landslides triggered by the Kiholo Bay earthquake to modeled ground motion

    USGS Publications Warehouse

    Harp, Edwin L.; Hartzell, Stephen H.; Jibson, Randall W.; Ramirez-Guzman, L.; Schmitt, Robert G.

    2014-01-01

    The 2006 Kiholo Bay, Hawaii, earthquake triggered high concentrations of rock falls and slides in the steep canyons of the Kohala Mountains along the north coast of Hawaii. Within these mountains and canyons a complex distribution of landslides was triggered by the earthquake shaking. In parts of the area, landslides were preferentially located on east‐facing slopes, whereas in other parts of the canyons no systematic pattern prevailed with respect to slope aspect or vertical position on the slopes. The geology within the canyons is homogeneous, so we hypothesize that the variable landslide distribution is the result of localized variation in ground shaking; therefore, we used a state‐of‐the‐art, high‐resolution ground‐motion simulation model to see if it could reproduce the landslide‐distribution patterns. We used a 3D finite‐element analysis to model earthquake shaking using a 10 m digital elevation model and slip on a finite‐fault model constructed from teleseismic records of the mainshock. Ground velocity time histories were calculated up to a frequency of 5 Hz. Dynamic shear strain also was calculated and compared with the landslide distribution. Results were mixed for the velocity simulations, with some areas showing correlation of landslide locations with peak modeled ground motions but many other areas showing no such correlation. Results were much improved for the comparison with dynamic shear strain. This suggests that (1) rock falls and slides are possibly triggered by higher frequency ground motions (velocities) than those in our simulations, (2) the ground‐motion velocity model needs more refinement, or (3) dynamic shear strain may be a more fundamental measurement of the decoupling process of slope materials during seismic shaking.

  19. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered by arguing that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that what factors control map performance is complicated, and that learning more about how maps perform and why would be valuable in making more effective policy.
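    The two map-performance metrics contrasted in this abstract are easy to state concretely. A sketch on synthetic data (the observed and predicted shaking values are random illustrations, not the Japanese record):

```python
import numpy as np

# Two metrics for scoring a hazard map against observed maximum shaking:
# (1) fractional exceedance: fraction of sites where observed shaking
#     exceeded the map value (the metric implicit in probabilistic maps);
# (2) squared misfit between observed and predicted maxima.
rng = np.random.default_rng(2)
predicted = rng.uniform(0.2, 0.8, size=1000)               # map values, g
observed = predicted * rng.lognormal(0.0, 0.3, size=1000)  # synthetic maxima

def fractional_exceedance(obs, pred):
    return np.mean(obs > pred)

def squared_misfit(obs, pred):
    return np.mean((obs - pred) ** 2)

f = fractional_exceedance(observed, predicted)
m = squared_misfit(observed, predicted)
```

The paper's finding is that a map can score well on one metric and poorly on the other: a uniform map can match the stated exceedance fraction while badly misfitting the spatial pattern of shaking, and vice versa.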

  20. Multi-Modality Phantom Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huber, Jennifer S.; Peng, Qiyu; Moses, William W.

    2009-03-20

    Multi-modality imaging has an increasing role in the diagnosis and treatment of a large number of diseases, particularly if both functional and anatomical information are acquired and accurately co-registered. Hence, there is a resulting need for multi-modality phantoms in order to validate image co-registration and calibrate the imaging systems. We present our PET-ultrasound phantom development, including PET and ultrasound images of a simple prostate phantom. We use agar and gelatin mixed with a radioactive solution. We also present our development of custom multi-modality phantoms that are compatible with PET, transrectal ultrasound (TRUS), MRI and CT imaging. We describe both our selection of tissue-mimicking materials and phantom construction procedures. These custom PET-TRUS-CT-MRI prostate phantoms use agar-gelatin radioactive mixtures with additional contrast agents and preservatives. We show multi-modality images of these custom prostate phantoms, as well as discuss phantom construction alternatives. Although we are currently focused on prostate imaging, this phantom development is applicable to many multi-modality imaging applications.

  1. Shake Table Testing of an Elevator System in a Full-Scale Five-Story Building

    PubMed Central

    Wang, Xiang; Hutchinson, Tara C.; Astroza, Rodrigo; Conte, Joel P.; Restrepo, José I.; Hoehler, Matthew S.; Ribeiro, Waldir

    2016-01-01

    This paper investigates the seismic performance of a functional traction elevator as part of a full-scale five-story building shake table test program. The test building was subjected to a suite of earthquake input motions of increasing intensity, first while the building was isolated at its base, and subsequently while it was fixed to the shake table platen. In addition, low-amplitude white noise base excitation tests were conducted while the elevator system was placed in three different configurations, namely, by varying the vertical location of its cabin and counterweight, to study the acceleration amplifications of the elevator components due to dynamic excitations. During the earthquake tests, detailed observation of the physical damage and operability of the elevator as well as its measured response are reported. Although the cabin and counterweight sustained large accelerations due to impact during these tests, the use of well-restrained guide shoes demonstrated its effectiveness in preventing the cabin and counterweight from derailment during high-intensity earthquake shaking. However, differential displacements induced by the building imposed undesirable distortion of the elevator components and their surrounding support structure, which caused damage and inoperability of the elevator doors. It is recommended that these aspects be explicitly considered in elevator seismic design. PMID:28242957

  2. Shake Table Testing of an Elevator System in a Full-Scale Five-Story Building.

    PubMed

    Wang, Xiang; Hutchinson, Tara C; Astroza, Rodrigo; Conte, Joel P; Restrepo, José I; Hoehler, Matthew S; Ribeiro, Waldir

    2017-03-01

    This paper investigates the seismic performance of a functional traction elevator as part of a full-scale five-story building shake table test program. The test building was subjected to a suite of earthquake input motions of increasing intensity, first while the building was isolated at its base, and subsequently while it was fixed to the shake table platen. In addition, low-amplitude white noise base excitation tests were conducted while the elevator system was placed in three different configurations, namely, by varying the vertical location of its cabin and counterweight, to study the acceleration amplifications of the elevator components due to dynamic excitations. During the earthquake tests, detailed observation of the physical damage and operability of the elevator as well as its measured response are reported. Although the cabin and counterweight sustained large accelerations due to impact during these tests, the use of well-restrained guide shoes demonstrated its effectiveness in preventing the cabin and counterweight from derailment during high-intensity earthquake shaking. However, differential displacements induced by the building imposed undesirable distortion of the elevator components and their surrounding support structure, which caused damage and inoperability of the elevator doors. It is recommended that these aspects be explicitly considered in elevator seismic design.

  3. A Cross-Modal Perspective on the Relationships between Imagery and Working Memory

    PubMed Central

    Likova, Lora T.

    2013-01-01

    Mapping the distinctions and interrelationships between imagery and working memory (WM) remains challenging. Although each of these major cognitive constructs is defined and treated in various ways across studies, most accept that both imagery and WM involve a form of internal representation available to our awareness. In WM, there is a further emphasis on goal-oriented, active maintenance, and use of this conscious representation to guide voluntary action. Multicomponent WM models incorporate representational buffers, such as the visuo-spatial sketchpad, plus central executive functions. If there is a visuo-spatial “sketchpad” for WM, does imagery involve the same representational buffer? Alternatively, does WM employ an imagery-specific representational mechanism to occupy our awareness? Or do both constructs utilize a more generic “projection screen” of an amodal nature? To address these issues, in a cross-modal fMRI study, I introduce a novel Drawing-Based Memory Paradigm, and conceptualize drawing as a complex behavior that is readily adaptable from the visual to non-visual modalities (such as the tactile modality), which opens intriguing possibilities for investigating cross-modal learning and plasticity. Blindfolded participants were trained through our Cognitive-Kinesthetic Method (Likova, 2010a, 2012) to draw complex objects guided purely by the memory of felt tactile images. If this WM task had been mediated by transfer of the felt spatial configuration to the visual imagery mechanism, the response-profile in visual cortex would be predicted to have the “top-down” signature of propagation of the imagery signal downward through the visual hierarchy. Remarkably, the pattern of cross-modal occipital activation generated by the non-visual memory drawing was essentially the inverse of this typical imagery signature. 
    The sole visual hierarchy activation was isolated to the primary visual area (V1), and accompanied by deactivation of the entire extrastriate cortex, thus 'cutting off' any signal propagation from/to V1 through the visual hierarchy. The implications of these findings for the debate on the interrelationships between the core cognitive constructs of WM and imagery and the nature of internal representations are evaluated. PMID:23346061

  4. MyShake - Smartphone seismic network powered by citizen scientists

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Strauss, J. A.

    2017-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It is driven by the citizen scientists who run MyShake on their personal smartphones. It has two components: an Android application running on the smartphone to detect earthquake-like motion, and a network detection algorithm to aggregate results from multiple smartphones to confirm when an earthquake occurs. The MyShake application was released to the public on Feb 12th, 2016. Within the first year, more than 250,000 people around the world downloaded the MyShake app. More than 500 earthquakes were recorded by the smartphones in this period, including events in Chile, Argentina, Mexico, Morocco, Greece, Nepal, New Zealand, Taiwan, Japan, and across North America. Currently, we are working on earthquake early warning with the MyShake network, and the shaking data provided by MyShake is a unique dataset that can be used by the research community.
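The network-detection idea described here (confirming an earthquake only when many phones trigger close together in space and time) can be sketched as follows. This is a toy aggregation rule of our own, not the deployed MyShake algorithm; the window sizes and minimum phone count are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    t: float    # trigger time, seconds
    lat: float  # degrees
    lon: float  # degrees

def confirm_event(triggers, t_window=10.0, min_phones=4, max_deg=1.0):
    """Toy aggregation rule: declare an earthquake if at least `min_phones`
    triggers fall inside a short time window and a small geographic box.
    Isolated triggers from daily human activity do not cluster this way."""
    triggers = sorted(triggers, key=lambda tr: tr.t)
    for i, anchor in enumerate(triggers):
        cluster = [tr for tr in triggers[i:]
                   if tr.t - anchor.t <= t_window
                   and abs(tr.lat - anchor.lat) <= max_deg
                   and abs(tr.lon - anchor.lon) <= max_deg]
        if len(cluster) >= min_phones:
            return True
    return False

# Four phones triggering within seconds of each other near Berkeley:
quake = [Trigger(0.0, 37.87, -122.27), Trigger(1.2, 37.90, -122.30),
         Trigger(2.5, 37.80, -122.20), Trigger(3.1, 37.85, -122.25)]
print(confirm_event(quake))  # True
```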

  5. The shakeout scenario: Meeting the needs for construction aggregates, asphalt, and concrete

    USGS Publications Warehouse

    Langer, W.H.

    2011-01-01

    An Mw 7.8 earthquake as described in the ShakeOut Scenario would cause significant damage to buildings and infrastructure. Over 6 million tons of newly mined aggregate would be used for emergency repairs and for reconstruction in the five years following the event. This aggregate would be applied mostly in the form of concrete for buildings and bridges, asphalt or concrete for pavement, and unbound gravel for applications such as base course that goes under highway pavement and backfilling for foundations and pipelines. There are over 450 aggregate, concrete, and asphalt plants in the affected area, some of which would be heavily damaged. Meeting the increased demand for construction materials would require readily available permitted reserves, functioning production facilities, a supply of cement and asphalt, a source of water, gas, and electricity, and a trained workforce. Prudent advance preparations would facilitate a timely emergency response and reconstruction following such an earthquake. © 2011, Earthquake Engineering Research Institute.

  6. Mental Programs and Social Behavior Patterns in Russian Society

    ERIC Educational Resources Information Center

    Lubsky, Anatoly Vladimirovich; Kolesnykova, Elena Yuryevna; Lubsky, Roman Anatolyevich

    2016-01-01

    The objective of the article is to reconstruct the mental programs, their cognitive, axiological and connotative structures, and construction on this basis of various modal patterns of social behavior in Russian society. Methodology of the article is based on an interdisciplinary scientific approach making it possible to conceptually disclose the…

  7. Perceptions of Saudi Students towards Electronic and Traditional Writing Groups

    ERIC Educational Resources Information Center

    Alqurashi, Fahad

    2008-01-01

    This paper reports the findings of an experiment that investigated the reactions of Saudi college students to collaborative learning techniques introduced in two modalities: face-to-face and web-based learning. Quantitative data were collected with a questionnaire that examined the changes of three constructs: attitudes toward collaboration,…

  8. Blind identification of full-field vibration modes of output-only structures from uniformly-sampled, possibly temporally-aliased (sub-Nyquist), video measurements

    NASA Astrophysics Data System (ADS)

    Yang, Yongchao; Dorn, Charles; Mancini, Tyler; Talken, Zachary; Nagarajaiah, Satish; Kenyon, Garrett; Farrar, Charles; Mascareñas, David

    2017-03-01

    Enhancing the spatial and temporal resolution of vibration measurements and modal analysis could significantly benefit dynamic modelling, analysis, and health monitoring of structures. For example, spatially high-density mode shapes are critical for accurate vibration-based damage localization. In experimental or operational modal analysis, higher (frequency) modes, which may be outside the frequency range of the measurement, contain local structural features that can improve damage localization as well as the construction and updating of the modal-based dynamic model of the structure. In general, the resolution of vibration measurements can be increased by enhanced hardware. Traditional vibration measurement sensors such as accelerometers have high-frequency sampling capacity; however, they are discrete point-wise sensors providing only sparse, low-spatial-resolution measurements, while dense deployment to achieve high spatial resolution is expensive and results in the mass-loading effect and modification of the structure's surface. Non-contact measurement methods such as scanning laser vibrometers provide high spatial and temporal resolution sensing capacity; however, they make measurements sequentially, which requires considerable acquisition time. As an alternative non-contact method, digital video cameras are relatively low-cost and agile, and provide simultaneous measurements at high spatial resolution. Combined with vision-based algorithms (e.g., image correlation or template matching, optical flow, etc.), video-camera-based measurements have been successfully used for experimental and operational vibration measurement and subsequent modal analysis. However, the sampling frequency of most affordable digital cameras is limited to 30-60 Hz, while high-speed cameras for higher-frequency vibration measurements are extremely costly.
This work develops a computational algorithm capable of performing vibration measurement at a uniform sampling frequency lower than what is required by the Shannon-Nyquist sampling theorem for output-only modal analysis. In particular, the spatio-temporal uncoupling property of the modal expansion of structural vibration responses enables a direct modal decoupling of the temporally-aliased vibration measurements by existing output-only modal analysis methods, yielding (full-field) mode shape estimates directly. Then the signal aliasing properties in modal analysis are exploited to estimate the modal frequencies and damping ratios. The proposed method is validated by laboratory experiments where output-only modal identification is conducted on temporally-aliased acceleration responses and, in particular, the temporally-aliased video measurements of bench-scale structures, including a three-story building structure and a cantilever beam.
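The aliasing relationship this method exploits is simple to state: a mode above the camera's Nyquist frequency folds to a predictable lower frequency, and a measured alias corresponds to a discrete set of candidate true frequencies. A minimal sketch follows (our illustration; the 30 Hz frame rate and 47 Hz mode are assumed values):

```python
def aliased_frequency(f_true, fs):
    """Apparent frequency of a tone at f_true when uniformly sampled at fs
    (folding about the Nyquist frequency fs/2)."""
    f = f_true % fs
    return min(f, fs - f)

def candidate_true_frequencies(f_alias, fs, f_max):
    """All true frequencies up to f_max consistent with an observed alias:
    k*fs +/- f_alias for integer k. Picking the right candidate in practice
    requires extra information (e.g. a model of the structure)."""
    candidates = set()
    k = 0
    while k * fs - f_alias <= f_max:
        for f in (k * fs - f_alias, k * fs + f_alias):
            if 0 < f <= f_max:
                candidates.add(round(f, 6))
        k += 1
    return sorted(candidates)

fs = 30.0      # assumed camera frame rate (Hz), Nyquist = 15 Hz
f_mode = 47.0  # assumed structural mode above Nyquist
fa = aliased_frequency(f_mode, fs)
print(fa)                                         # 13.0
print(candidate_true_frequencies(fa, fs, 100.0))  # includes the true 47.0
```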

  9. An antibody based approach for multi-coloring osteogenic and chondrogenic proteins in tissue engineered constructs.

    PubMed

    Leferink, Anne M; Reis, Diogo Santos; van Blitterswijk, Clemens A; Moroni, Lorenzo

    2018-04-11

    When tissue engineering strategies rely on the combination of three-dimensional (3D) polymeric or ceramic scaffolds with cells to culture implantable tissue constructs in vitro, it is desirable to monitor tissue growth and cell fate to be able to more rationally predict the quality and success of the construct upon implantation. Such a 3D construct is often referred to as a 'black box' since the properties of the scaffold material limit the applicability of most imaging modalities to assess important construct parameters. These parameters include the number of cells, the amount and type of tissue formed, and the distribution of cells and tissue throughout the construct. Immunolabeling enables the spatial and temporal identification of multiple tissue types within one scaffold without the need to sacrifice the construct. In this report, we concisely review the applicability of antibodies (Abs) and their conjugation chemistries in tissue engineered constructs. With some preliminary experiments, we show an efficient conjugation strategy to couple extracellular matrix Abs to fluorophores. The conjugated probes proved to be effective in determining the presence of collagen type I and type II on electrospun and additive-manufactured 3D scaffolds seeded with adult human bone marrow-derived mesenchymal stromal cells. The conjugation chemistry applied in our proof-of-concept study is expected to be applicable in the coupling of any other fluorophore or particle to the Abs. This could ultimately lead to a library of probes to permit high-contrast imaging by several imaging modalities.

  10. Integrating modal-based NDE techniques and bridge management systems using quality management

    NASA Astrophysics Data System (ADS)

    Sikorsky, Charles S.

    1997-05-01

    The intent of bridge management systems is to help engineers and managers determine when and where to spend bridge funds such that the needs of commerce and the motoring public are satisfied. A major shortcoming states are experiencing is that the available NBIS data are insufficient to perform certain functions required by new bridge management systems, such as modeling bridge deterioration and predicting costs. This paper will investigate how modal-based nondestructive damage evaluation techniques can be integrated into bridge management using quality management principles. First, quality from the manufacturing perspective will be summarized. Next, the implementation of quality management in design and construction will be reinterpreted for bridge management. Based on this, a theory of approach will be formulated to improve the productivity of a highway transportation system.

  11. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
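The exposure-times-loss-rate aggregation PAGER performs can be sketched in a line of arithmetic. All numbers below are hypothetical illustrations; PAGER's actual fatality rates are country-specific empirical models fit to past earthquakes.

```python
# People exposed to each Modified Mercalli Intensity (MMI) level
# (hypothetical exposure figures for a single event):
exposure = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}

# Hypothetical fatality rates (deaths per exposed person) at each MMI level:
fatality_rate = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3}

# Expected fatalities: sum over intensity bins of exposure times loss rate.
estimated_fatalities = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)
print(round(estimated_fatalities))  # 18
```

Alert levels can then be thresholded on this estimate (and on an analogous economic-loss estimate) rather than on magnitude alone.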

  12. The ShakeOut Scenario

    USGS Publications Warehouse

    Jones, Lucile M.; Bernknopf, Richard; Cox, Dale; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Perry, Suzanne; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    This is the initial publication of the results of a cooperative project to examine the implications of a major earthquake in southern California. The study comprised eight counties: Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura. Its results will be used as the basis of an emergency response and preparedness exercise, the Great Southern California ShakeOut, and for this purpose we defined our earthquake as occurring at 10:00 a.m. on November 13, 2008. As members of the southern California community use the ShakeOut Scenario to plan and execute the exercise, we anticipate discussion and feedback. This community input will be used to refine our assessment and will lead to a formal publication in early 2009. Our goal in the ShakeOut Scenario is to identify the physical, social and economic consequences of a major earthquake in southern California and, in so doing, enable the users of our results to identify what they can change now, before the earthquake, to avoid catastrophic impact after the inevitable earthquake occurs. To do so, we had to determine the physical damages (casualties and losses) caused by the earthquake and the impact of those damages on the region's social and economic systems. To do this, we needed to know about the earthquake ground shaking and fault rupture. So we first constructed an earthquake, taking all available earthquake research information, from trenching and exposed evidence of prehistoric earthquakes, to analysis of instrumental recordings of large earthquakes and the latest theory in earthquake source physics. We modeled a magnitude (M) 7.8 earthquake on the southern San Andreas Fault, a plausible event on the fault most likely to produce a major earthquake. This information was then fed forward into the rest of the ShakeOut Scenario.
The damage impacts of the scenario earthquake were estimated using both HAZUS-MH and expert opinion through 13 special studies and 6 expert panels, and fall into four categories: building damages, non-structural damages, damage to lifelines and infrastructure, and fire losses. The magnitude 7.8 ShakeOut earthquake is modeled to cause about 1800 deaths and $213 billion of economic losses. These numbers are as low as they are because of aggressive retrofitting programs that have increased the seismic resistance of buildings, highways and lifelines, and economic resiliency. These numbers are as large as they are because much more retrofitting could still be done. The earthquake modeled here may never happen. Big earthquakes on the San Andreas Fault are inevitable, and by geologic standards extremely common, but probably will not be exactly like this one. The next very damaging earthquake could easily be on another fault. However, lessons learned from this particular event apply to many other events and could provide benefits in many possible future events.

  13. Fan Blade Shake Test Results for the 40- by 80-/80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Warmbrodt, W.; Graham, T.

    1983-01-01

    This report documents the shake tests performed on the first set of hydulignum fan blades for the 40- by 80-/80- by 120-Foot Wind Tunnel. The purpose of the shake test program is described. The test equipment and test procedures are reviewed. Results from each shake test are presented and the overall findings of the shake test program are discussed.

  14. Design, fabrication and actuation of a MEMS-based image stabilizer for photographic cell phone applications

    NASA Astrophysics Data System (ADS)

    Chiou, Jin-Chern; Hung, Chen-Chun; Lin, Chun-Ying

    2010-07-01

    This work presents a MEMS-based image stabilizer providing an anti-shake function for photographic cell phones. The proposed stabilizer is designed as a two-axis decoupled XY stage, 1.4 × 1.4 × 0.1 mm³ in size, strong enough to suspend an image sensor for anti-shake photography. The stabilizer is fabricated by complex processes, including inductively coupled plasma (ICP) processes and a flip-chip bonding technique. Based on the special designs of a hollow handle layer and a corresponding wire-bonding-assisted holder, electrical signals of the suspended image sensor can be successfully routed out through 32 signal springs without incurring damage during wire-bonding packaging. The longest calculated traveling distance of the stabilizer is 25 µm, which is sufficient to resolve the anti-shake problem in a three-megapixel image sensor. The applied voltage for the 25 µm moving distance is 38 V. Moreover, the resonant frequency of the actuating device with the image sensor is 1.123 kHz.

  15. Isolating social influences on vulnerability to earthquake shaking: identifying cost-effective mitigation strategies.

    NASA Astrophysics Data System (ADS)

    Bhloscaidh, Mairead Nic; McCloskey, John; Pelling, Mark; Naylor, Mark

    2013-04-01

    Until expensive engineering solutions become more universally available, the objective targeting of resources at demonstrably effective, low-cost interventions might help reverse the trend of increasing mortality in earthquakes. Death tolls in earthquakes are the result of complex interactions between physical effects, such as the exposure of the population to strong shaking, and the resilience of the exposed population along with supporting critical infrastructures and institutions. The identification of socio-economic factors that contribute to earthquake mortality is crucial to identifying and developing successful risk management strategies. Here we develop a quantitative methodology to assess more objectively the ability of communities to withstand earthquake shaking, focusing in particular on those cases where risk management performance appears to exceed or fall below expectations based on economic status. Using only published estimates of the shaking intensity and population exposure for each earthquake, data that are available irrespective of a country's level of economic development, we develop a model for mortality based on the contribution of population exposure to shaking only. This represents an attempt to remove, as far as possible, the physical causes of mortality from our analysis (where we consider earthquake engineering to reduce building collapse among the socio-economic influences). The systematic part of the variance with respect to this model can therefore be expected to be dominated by socio-economic factors. We find, as expected, that this purely physical analysis partitions countries in terms of basic socio-economic measures, for example GDP, focusing analytical attention on the power of economic measures to explain variance in observed distributions of earthquake risk.
The model allows the definition of a vulnerability index which, although broadly it demonstrates the expected income-dependence of vulnerability to strong shaking, also identifies both anomalously resilient and anomalously vulnerable countries. We argue that this approach has the potential to direct sociological investigations to expose the underlying causes of the observed non-economic differentiation of vulnerability. At one level, closer study of the earthquakes represented by these data points might expose local or national interventions which are increasing resilience of communities to strong shaking in the absence of major national investment. Ultimately it may contribute to the development of a quantitative evaluation of risk management effectiveness at the national level that can be used better to target and track risk management investments.
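One way to realize such a vulnerability index, in the spirit of this abstract, is to fit an exposure-only mortality baseline and read the index off the residuals. The data and the log-linear form below are our illustrative assumptions, not the authors' fitted model:

```python
import numpy as np

# Synthetic earthquake records (illustrative, not real data): population
# exposed to strong shaking, and the resulting deaths.
exposure = np.array([1e4, 5e4, 2e5, 1e6, 3e6])
deaths = np.array([3.0, 20.0, 40.0, 600.0, 800.0])

# Exposure-only "physical" baseline: a log-linear model of mortality.
X, Y = np.log10(exposure), np.log10(deaths)
slope, intercept = np.polyfit(X, Y, 1)

# Residual from the physical baseline as a crude vulnerability index:
# positive = more deaths than exposure alone explains (anomalously vulnerable),
# negative = fewer deaths than expected (anomalously resilient).
index = Y - (slope * X + intercept)
print(np.round(index, 2))
```

The systematic, non-physical part of the variance lives in these residuals, which is where the abstract proposes to look for socio-economic explanations.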

  16. Mass timber rocking panel retrofit of a four-story soft-story building with full-scale shake table validation

    Treesearch

    Pouria Bahmani; John van de Lindt; Asif Iqbal; Douglas Rammer

    2017-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. There are tens of thousands of these multi-family three- and four-story structures throughout California and the United States. The majority were constructed between 1920 and 1970, with many being prevalent in the San Francisco Bay Area in California. The NEES Soft...

  17. Modal Damping Ratio and Optimal Elastic Moduli of Human Body Segments for Anthropometric Vibratory Model of Standing Subjects.

    PubMed

    Gupta, Manoj; Gupta, T C

    2017-10-01

    The present study aims to accurately estimate inertial, physical, and dynamic parameters of a human body vibratory model consistent with the physical structure of the human body that also replicates its dynamic response. A 13-degree-of-freedom (DOF) lumped-parameter model for a standing person subjected to support excitation is established. Model parameters are determined from anthropometric measurements, uniform mass density, elastic moduli of individual body segments, and modal damping ratios. Elastic moduli of ellipsoidal body segments are initially estimated by comparing the stiffness of spring elements, calculated from a detailed scheme, with values available in the literature. These values are further optimized by minimizing the difference between the theoretically calculated platform-to-head transmissibility ratio (TR) and experimental measurements. Modal damping ratios are estimated from the experimental transmissibility response using the two dominant peaks in the frequency range of 0-25 Hz. From the comparison between the dynamic response determined from modal analysis and experimental results, a set of elastic moduli for different segments of the human body and a novel scheme to determine modal damping ratios from TR plots are established. An acceptable match between transmissibility values calculated from the vibratory model and experimental measurements for the 50th percentile U.S. male, except at very low frequencies, validates the human body model developed. Also, the reasonable agreement obtained between the theoretical response curve and the experimental response envelope for the average Indian male affirms the technique used for constructing the vibratory model of a standing person. The present work attempts to develop an effective technique for constructing a subject-specific damped vibratory model based on physical measurements.
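The platform-to-head transmissibility ratio central to this study has a closed form in the single-DOF base-excitation case, which a multi-DOF model like the paper's generalizes segment by segment. A sketch (the 5 Hz natural frequency and the damping ratios are arbitrary illustration values, not the paper's identified parameters):

```python
import numpy as np

def transmissibility(f, fn, zeta):
    """Absolute transmissibility of a base-excited single-DOF oscillator
    with natural frequency fn (Hz) and damping ratio zeta:
    TR = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2)), r = f/fn."""
    r = f / fn
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2
    return np.sqrt(num / den)

f = np.linspace(0.1, 25.0, 500)             # excitation frequency sweep (Hz)
tr = transmissibility(f, fn=5.0, zeta=0.3)  # assumed segment parameters
print(round(float(f[np.argmax(tr)]), 2))    # resonant peak sits just below fn
```

Fitting damping ratios from the height and width of such peaks in measured TR plots is the kind of scheme the abstract describes.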

  18. Students' Multimodal Construction of the Work-Energy Concept

    NASA Astrophysics Data System (ADS)

    Tang, Kok-Sing; Chee Tan, Seng; Yeo, Jennifer

    2011-09-01

    This article examines the role of multimodalities in representing the concept of work-energy by studying the collaborative discourse of a group of ninth-grade physics students engaging in an inquiry-based instruction. Theorising a scientific concept as a network of meaning relationships across semiotic modalities situated in human activity, this article analyses the students' interactions through their use of natural language, mathematical symbolism, depiction, and gestures, and examines the intertextual meanings made through the integration of these modalities. Results indicate that the thematic integration of multimodalities is both difficult and necessary for students in order to construct a scientific understanding that is congruent with the physics curriculum. More significantly, the difficulties in multimodal integration stem from the subtle differences in the categorical, quantitative, and spatial meanings of the work-energy concept whose contrasts are often not made explicit to the students. The implications of these analyses and findings for science teaching and educational research are discussed.

  19. Field observations of seismic velocity changes caused by shaking-induced damage and healing due to mesoscopic nonlinearity

    NASA Astrophysics Data System (ADS)

    Gassenmeier, M.; Sens-Schönfelder, C.; Eulenfeld, T.; Bartsch, M.; Victor, P.; Tilmann, F.; Korn, M.

    2016-03-01

    To investigate temporal seismic velocity changes due to earthquake related processes and environmental forcing in Northern Chile, we analyse 8 yr of ambient seismic noise recorded by the Integrated Plate Boundary Observatory Chile (IPOC). By autocorrelating the ambient seismic noise field measured on the vertical components, approximations of the Green's functions are retrieved and velocity changes are measured with Coda Wave Interferometry. At station PATCX, we observe seasonal changes in seismic velocity caused by thermal stress as well as transient velocity reductions in the frequency range of 4-6 Hz. Sudden velocity drops occur at the time of mostly earthquake-induced ground shaking and recover over a variable period of time. We present an empirical model that describes the seismic velocity variations based on continuous observations of the local ground acceleration. The model assumes that not only the shaking of large earthquakes causes velocity drops, but any small vibrations continuously induce minor velocity variations that are immediately compensated by healing in the steady state. We show that the shaking effect is accumulated over time and best described by the integrated envelope of the ground acceleration over the discretization interval of the velocity measurements, which is one day. In our model, the amplitude of the velocity reduction as well as the recovery time are proportional to the size of the excitation. This model with two free scaling parameters fits the data of the shaking induced velocity variation in remarkable detail. Additionally, a linear trend is observed that might be related to a recovery process from one or more earthquakes before our measurement period. A clear relationship between ground shaking and induced velocity reductions is not visible at other stations. 
We attribute the outstanding sensitivity of PATCX to ground shaking and thermal stress to the special geological setting of the station, where the subsurface material consists of relatively loose conglomerate with high pore volume leading to a stronger nonlinearity compared to the other IPOC stations.
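The empirical model described above (velocity drops proportional to the daily integrated envelope of ground acceleration, followed by healing) can be sketched as a discrete-time recursion. This is a simplified toy with a single fixed recovery timescale, whereas the paper scales recovery time with excitation size; all parameter values are assumptions:

```python
import numpy as np

def velocity_variation(shaking, a=1e-3, tau=20.0):
    """Toy version of the empirical model: each day's integrated ground-
    acceleration envelope shaking[i] produces a velocity drop a*shaking[i],
    and the accumulated deficit heals toward zero with timescale tau (days)."""
    dv = np.zeros(len(shaking))
    for i in range(1, len(shaking)):
        dv[i] = dv[i - 1] * np.exp(-1.0 / tau) - a * shaking[i]
    return dv

shaking = np.zeros(200)       # daily integrated envelope (arbitrary units)
shaking[50] = 500.0           # one strong shaking day, e.g. an earthquake
dv = velocity_variation(shaking)
print(round(float(dv[50]), 3), round(float(dv[150]), 3))  # -0.5 -0.003
```

Because every day's vibration contributes, weak continuous shaking in this model produces small, immediately healed variations, matching the steady-state assumption in the abstract.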

  20. An Overview and Parametric Evaluation of the CGS ShakeMap Automated System in CISN

    NASA Astrophysics Data System (ADS)

    Hagos, L. Z.; Haddadi, H. R.; Shakal, A. F.

    2014-12-01

    In recent years, ShakeMap has been used extensively in California for earthquake rapid response. Serving as a backup to the Northern and Southern seismic regions of the California Integrated Seismic Network (CISN), the California Geological Survey (CGS) runs a ShakeMap system configured so that it can produce ShakeMaps for earthquakes occurring in both regions. In achieving this goal, CGS has worked to improve the robustness of its ShakeMap system and the quality of its products. Peak ground motion amplitude data are exchanged between the CISN data centers to provide robust generation of ShakeMaps. Most exchanged ground motion packets arrive associated with an earthquake by the authoritative network; for packets that arrive unassociated, CGS employs an event association scheme to associate them with the corresponding earthquake. The generated ShakeMap products are published to the CGS server and can also be accessed through the CISN website. The backup function is designed to publish ShakeMap products to the USGS NEIC server without collision with the regional networks, acting only when the authoritative region encounters a system failure. Depending on the size, location, and significance of the earthquake, review of ShakeMap products by a seismologist may involve changes to ShakeMap parameters from the defaults. We present an overview of the CGS ShakeMap system and highlight some of the parameters a seismologist may adjust, including parameters related to basin effects, directivity effects when finite-fault models are available, site corrections, etc. We also analyze the sensitivity and dependence of the ShakeMap intensity and ground motion maps on the number of observed data included in the computation. In light of the available strong motion amplitude data, we attempt to address the question of what constitutes an adequate-quality ShakeMap in the tradeoff between rapidity and completeness.
We also present a brief comparative study of the available Ground Motion to Intensity Conversion Equations (GMICE) by studying selected earthquakes in the California region. Results of these studies can be used as a tool in ShakeMap generation for California earthquakes when the use of non-default parameters is required.
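As a sketch of what a GMICE does, the bilinear form common in the literature maps log10 of peak ground acceleration to Modified Mercalli Intensity. The default coefficients below are illustrative placeholders patterned after published California relations, not the specific equations evaluated in this study:

```python
import math

def gmice_mmi(pga, c1=1.78, c2=1.55, c3=-1.60, c4=3.70, log_pga_break=1.57):
    """Bilinear GMICE: convert PGA (cm/s^2) to Modified Mercalli Intensity.

    Coefficients are illustrative defaults; substitute those of whichever
    published model applies to the region of interest.
    """
    log_pga = math.log10(pga)
    if log_pga <= log_pga_break:
        return c1 + c2 * log_pga       # weak-shaking branch
    return c3 + c4 * log_pga           # strong-shaking branch
```

Comparing several such equations against observed intensities for selected earthquakes, as the study does, amounts to evaluating which coefficient set minimizes the misfit.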

  1. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app can distinguish earthquake shaking from everyday human activity based on the different patterns underlying the movements. It can also be triggered by a traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information (time and location) back to our server; at the same time, it stores the waveform data locally on the phone and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm an earthquake. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
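MyShake's actual earthquake/human-activity discrimination uses a trained classifier, which is not reproduced here. As a minimal illustration of accelerometer-based triggering, the classic STA/LTA scheme flags samples where a short-term average of shaking amplitude suddenly exceeds the long-term background:

```python
def sta_lta_trigger(samples, sta_n=10, lta_n=100, threshold=4.0):
    """Return indices where the STA/LTA ratio of |acceleration| exceeds
    threshold. A textbook seismic trigger, shown only as an illustrative
    stand-in for MyShake's pattern classifier."""
    triggers = []
    for i in range(lta_n, len(samples)):
        sta = sum(abs(x) for x in samples[i - sta_n:i]) / sta_n   # short window
        lta = sum(abs(x) for x in samples[i - lta_n:i]) / lta_n   # long window
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers
```

On steady low-level noise the ratio stays near 1 and nothing triggers; a sudden sustained amplitude jump trips the detector within a few samples.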

  2. Combining multiple earthquake models in real time for earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Wu, Stephen; Beck, James L.; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
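Under the simplifying assumptions of independent Gaussian predictions (e.g. of log PGA) and a flat prior, Bayesian combination of several ground-motion estimates reduces to precision weighting. This toy sketch omits the correlation modeling and user-specific decision theory of the full framework:

```python
def fuse_predictions(means, sigmas):
    """Combine independent Gaussian ground-motion predictions into one
    posterior mean and standard deviation via precision weighting.
    Illustrative only; the full Bayesian framework is richer."""
    weights = [1.0 / s ** 2 for s in sigmas]   # precision of each predictor
    w_sum = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / w_sum
    sigma = (1.0 / w_sum) ** 0.5               # posterior is tighter than any input
    return mean, sigma
```

Note that the fused uncertainty is always smaller than the best individual one, which is the sense in which combining diverse EEW algorithms improves the prediction.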

  3. Real-time 3-D space numerical shake prediction for earthquake early warning

    NASA Astrophysics Data System (ADS)

    Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang

    2017-12-01

    In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that the wave propagates on the 2-D surface of the earth. In fact, since seismic waves propagate in the 3-D sphere of the earth, 2-D modeling of wave propagation results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method, which simulates wave propagation in 3-D space using radiative transfer theory and incorporates a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model. The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and that overprediction is alleviated when the 3-D space model is used.

  4. pH measurement and a rational and practical pH control strategy for high throughput cell culture system.

    PubMed

    Zhou, Haiying; Purdie, Jennifer; Wang, Tongtong; Ouyang, Anli

    2010-01-01

    The number of therapeutic proteins produced by cell culture in the pharmaceutical industry continues to increase. During the early stages of manufacturing process development, hundreds of clones and various cell culture conditions are evaluated to develop a robust process to identify and select cell lines with high productivity. It is highly desirable to establish a high throughput system to accelerate process development and reduce cost. Multiwell plates and shake flasks are widely used in the industry as the scale down model for large-scale bioreactors. However, one of the limitations of these two systems is the inability to measure and control pH in a high throughput manner. As pH is an important process parameter for cell culture, this could limit the applications of these scale down model vessels. An economical, rapid, and robust pH measurement method was developed at Eli Lilly and Company by employing SNARF-4F 5-(and-6)-carboxylic acid. The method demonstrated the ability to measure the pH values of cell culture samples in a high throughput manner. Based upon the chemical equilibrium of CO2, HCO3-, and the buffer system, i.e., HEPES, we established a mathematical model to regulate pH in multiwell plates and shake flasks. The model calculates the required %CO2 from the incubator and the amount of sodium bicarbonate to be added to adjust pH to a preset value. The model was validated by experimental data, and pH was accurately regulated by this method. The feasibility of studying the pH effect on cell culture in 96-well plates and shake flasks was also demonstrated in this study. This work shed light on mini-bioreactor scale down model construction and paved the way for cell culture process development to improve productivity or product quality using high throughput systems. Copyright 2009 American Institute of Chemical Engineers
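The CO2/bicarbonate equilibrium underlying such a model can be sketched with the Henderson-Hasselbalch relation. The pKa and CO2 solubility below are generic textbook values for aqueous media at 37 °C, and the paper's actual model additionally accounts for the HEPES buffer:

```python
import math

def culture_ph(bicarb_mM, pct_co2, pka=6.1, co2_solubility=0.03, p_atm=760.0):
    """Henderson-Hasselbalch estimate of medium pH from bicarbonate (mM)
    and incubator %CO2. Constants are generic textbook values, not the
    paper's fitted parameters."""
    p_co2 = p_atm * pct_co2 / 100.0          # CO2 partial pressure, mmHg
    dissolved_co2 = co2_solubility * p_co2   # dissolved CO2, mM
    return pka + math.log10(bicarb_mM / dissolved_co2)

def required_pct_co2(target_ph, bicarb_mM, pka=6.1, co2_solubility=0.03,
                     p_atm=760.0):
    """Invert the relation: the %CO2 setpoint needed to hold target_ph
    at a given bicarbonate concentration."""
    dissolved_co2 = bicarb_mM / 10 ** (target_ph - pka)
    return 100.0 * dissolved_co2 / (co2_solubility * p_atm)
```

For example, 24 mM bicarbonate under 5% CO2 lands near physiological pH, and the inverse function recovers the 5% setpoint, which is the kind of calculation the paper's model automates for plates and flasks.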

  5. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios, and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM and global ANSS Comprehensive catalogues, the Seismological Bureau, Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify magnitudes to the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of seismogenic sources. To evaluate ground shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to the modelled ground motions using ground motion prediction equations (GMPEs). Incorporating the best-fitting GMPEs and site conditions, we included site effects and assessed the probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially generate an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. 
In addition, we find a notable hazard level in northern Vietnam and at the boundary between Myanmar, Thailand and Laos, due to a series of strike-slip faults that could potentially cause moderate-to-large earthquakes. Note that although much of the region has a low probability of damaging shaking, low-probability events have recently caused much destruction in SE Asia (e.g. the 2008 Wenchuan and 2015 Sabah earthquakes).
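The Gutenberg-Richter relationship mentioned above gives, for assumed a- and b-values, the annual rate of events at or above a magnitude. A minimal truncated form (the values passed in are illustrative, not the study's fitted parameters) might look like:

```python
def gr_annual_rate(m, a, b, m_max):
    """Annual rate of earthquakes with magnitude >= m from a truncated
    Gutenberg-Richter relation log10 N(>=m) = a - b*m, with no events
    above m_max. Inputs are illustrative, not fitted values."""
    if m >= m_max:
        return 0.0
    # Subtract the rate above m_max so the distribution is truncated there.
    return 10 ** (a - b * m) - 10 ** (a - b * m_max)
```

With a = 4 and b = 1, for instance, M >= 5 events occur at roughly 0.1 per year (a ~10-year return period), which is the kind of recurrence estimate that feeds the hazard integration.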

  6. Maybe Some Big Ground Shakes One Hundred Years Ago in a Big State Near the Ocean Were Caused by People

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Tsai, V. C.; Walker, R.; Page, M. T.; Aminzadeh, F.

    2016-12-01

    Sometimes people put water deep into the ground to make it go away and sometimes this causes the ground to shake. Sometimes people take other stuff out of the ground because a lot of people buy this stuff to power cars. Usually when people take this stuff out of the ground it does not cause ground shakes. At least this is what we used to believe. For our study, we looked at ground shakes that caused houses to fall down almost 100 years ago in a big state near the water. They were large ground shakes. One was close to a big city where people make movies and one was a really big shake in another city in the same state. We asked the question, is it possible that these ground shakes happened because people took stuff out of the ground? We considered the places where the ground shakes happened and the places where people took a lot of stuff out of the ground. We show there is a pretty good chance that taking stuff out of the ground caused some pretty big ground shakes. We explain how ground shakes can happen when people take stuff out of the ground. Ground shakes happen on things called faults. When you take stuff out of the ground, usually that makes it harder for the fault to move. This is a good thing. But when the stuff is still deep under the ground, sometimes it also pushes against faults that are close by and helps keep them from moving. So when you take stuff out, it does not push on faults as much, and so sometimes that close-by fault can move and cause ground shakes. We use a computer to show that our idea can explain some of what we see. The idea is not perfect but we think it is a pretty good idea. Our idea explains why it does not usually cause ground shakes when people take stuff out of the ground, but sometimes big ground shakes happen. Our idea suggests that ground shakes caused by people can sometimes be very large. So if people take stuff out of the ground or put stuff in the ground, they need to know if there are faults close by.

  7. Speak Simply When Warning About After Shocks

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Hardebeck, J.; Page, M. T.; van der Elst, N.; Wein, A. M.

    2016-12-01

    When a fault in the ground slips, the ground moves fast and can shake hard. After a big ground shake, there are more shakes. We call them after shocks and these can happen over a long time, for many years. An after shock can shake the ground more than it shook the first time. These shocks can shake and break places where people live and work, make rocks fall down and the ground go soft and wet, and hurt or kill people. After shocks also make people worry. If people are scared, then they may leave the area and not come back. To help people be safe and feel calm we want to tell them what may happen. We often use big words and lots of numbers to give the chances for the number of shakes over days, weeks, and years. That helps some people fix things and do their jobs such as those who work on roads, power, water, phones, hospitals, schools or in the money business. But big words and too many numbers can confuse a lot of people and make them worry more. Studies of talking about the ground shake problem show that it is best to speak simply to people. What if we only use the ten hundred most often used words to talk about these ground shakes. Would that work? Here is a possible warning: Last week's huge ground shake will probably make more ground shakes. This week expect to feel three to ten ground shakes and maybe one big ground shake that could break things. That big ground shake has a chance of 1 in 10. This is normal. Be safe. Stay out of broken houses, shops, and work places. When you feel the ground shake: drop, cover, and hold on. People may feel afraid or be hurt, so check on friends and family. Get some more food and water. Over time there will be fewer ground shakes, but always be ready for them. That warning gives a lot of key ideas: what may happen, whether houses could get broken, that what is happening is normal, and what people may feel and should do. These are the key parts of a good warning. Maybe we should use the most often used words all the time.

  8. SHAKING TABLE TESTS ON SEISMIC DEFORMATION OF PILE SUPPORTED PIER

    NASA Astrophysics Data System (ADS)

    Fujita, Daiki; Kohama, Eiji; Takenobu, Masahiro; Yoshida, Makoto; Kiku, Hiroyoshi

    The seismic deformation characteristics of a pile-supported pier were examined with shaking table tests, focusing especially on the pier after its deformation during earthquakes. A model based on the similitude of the fully-plastic moment in the piles was prepared to confirm the deformation and stress characteristics after the fully-plastic moment is reached. Moreover, assuming the transportation of emergency supplies and the occurrence of aftershocks in the post-disaster period, the pile-supported pier was loaded with weight after reaching the fully-plastic moment and excited with the shaking table. As a result, it was found that the displacement of the pile-supported pier is comparatively small if the bending strength of the piles does not decrease after reaching the fully-plastic moment, owing to the non-occurrence of local buckling or to strain hardening.

  9. Instrumental shaking thresholds for seismically induced landslides and preliminary report on landslides triggered by the October 17, 1989, Loma Prieta, California earthquake

    USGS Publications Warehouse

    Harp, E.L.

    1993-01-01

    The generation of seismically induced landslides depends on the characteristics of shaking as well as the mechanical properties of geologic materials. A very important parameter in the study of seismically induced landslides is an intensity based on a strong-motion accelerogram: the Arias intensity, which is proportional to the duration of the shaking record as well as its amplitude. Given a theoretical relationship between Arias intensity, magnitude and distance, it is possible to predict how far from the seismic source landslides are likely to occur for a given magnitude earthquake. Field investigations have established that the threshold level of Arias intensity also depends on site effects, particularly the fracture characteristics of the outcrops present. -from Author
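Arias intensity is defined as Ia = (pi / 2g) times the time integral of the squared acceleration over the whole record. A minimal implementation using the trapezoidal rule (units and sampling interval are the caller's responsibility) might be:

```python
import math

def arias_intensity(accel_m_s2, dt):
    """Arias intensity (m/s) of an accelerogram sampled every dt seconds:
    Ia = (pi / 2g) * integral of a(t)^2 dt, via the trapezoidal rule."""
    g = 9.81                                  # gravitational acceleration, m/s^2
    sq = [a * a for a in accel_m_s2]          # squared acceleration trace
    integral = sum((sq[i] + sq[i + 1]) * 0.5 * dt for i in range(len(sq) - 1))
    return math.pi / (2.0 * g) * integral
```

Because the integrand is squared acceleration accumulated over time, both a longer shaking duration and a larger amplitude raise Ia, which is exactly the property the abstract highlights.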

  10. Damage detection of building structures under ambient excitation through the analysis of the relationship between the modal participation ratio and story stiffness

    NASA Astrophysics Data System (ADS)

    Park, Hyo Seon; Oh, Byung Kwan

    2018-03-01

    This paper presents a new approach for the damage detection of building structures under ambient excitation based on the inherent modal characteristics. In this study, without the extraction of modal parameters widely utilized in the previous studies on damage detection, a new index called the modal participation ratio (MPR), which is a representative value of the modal response extracted from dynamic responses measured in ambient vibration tests, is proposed to evaluate the change of the system of a structure according to the reduction of the story stiffness. The relationship between the MPR, representing a modal contribution for a specific mode and degree of freedom in buildings, and the story stiffness damage factor (SSDF), representing the extent of reduction in the story stiffness, is analyzed in various damage scenarios. From the analyses with three examples, several rules for the damage localization of building structures are found based on the characteristics of the MPR variation for the first mode subject to change in the SSDF. In addition, a damage severity function, derived from the relationship between the MPR for the first mode in the lowest story and the SSDF, is constructed to identify the severity of story stiffness reduction. Furthermore, the locations and severities of multiple damages are identified via the superposition of the presented damage severity functions. The presented method was applied to detect damage in a three-dimensional reinforced concrete (RC) structure.

  11. Dual-modality single particle orientation and rotational tracking of intracellular transport of nanocargos.

    PubMed

    Sun, Wei; Gu, Yan; Wang, Gufeng; Fang, Ning

    2012-01-17

    The single particle orientation and rotational tracking (SPORT) technique was recently introduced to follow the rotational motion of plasmonic gold nanorods under a differential interference contrast (DIC) microscope. In biological studies, however, cellular activities usually involve a multiplicity of molecules; thus, tracking the motion of a single molecule/object is insufficient. Fluorescence-based techniques have long been used to follow the spatial and temporal distributions of biomolecules of interest, thanks to the availability of multiplexing fluorescent probes. To know the type and number of molecules and the timing of their involvement in a biological process under investigation by SPORT, we constructed a dual-modality DIC/fluorescence microscope to simultaneously image fluorescently tagged biomolecules and plasmonic nanoprobes in living cells. With the dual-modality SPORT technique, microtubule-based intracellular transport can be unambiguously identified while the dynamic orientation of nanometer-sized cargos is monitored at video rate. Furthermore, active transport on the microtubule can easily be separated from diffusion before the nanocargo docks on the microtubule or after it undocks. The potential of dual-modality SPORT is demonstrated for shedding new light on unresolved questions in intracellular transport.

  12. Image barcodes

    NASA Astrophysics Data System (ADS)

    Damera-Venkata, Niranjan; Yen, Jonathan

    2003-01-01

    The visually significant two-dimensional barcode (VSB) developed by Shaked et al. is a method for designing an information-carrying two-dimensional barcode that has the appearance of a given graphical entity, such as a company logo. The encoding and decoding of information using the VSB relies on a base image with very few gray levels (typically only two), which in turn requires the image histogram to be bi-modal. For continuous-tone images such as digital photographs of individuals, the representation of tone, or "shades of gray", is not only important for a pleasing rendition of the face; in most cases the VSB renders these images unrecognizable due to its inability to represent true gray-tone variations. This paper extends the concept of a VSB to an image barcode (IBC). We enable the encoding and subsequent decoding of information embedded in the hardcopy version of continuous-tone base images such as those acquired with a digital camera. The encoding-decoding process is modeled as robust data transmission through a noisy print-scan channel that is explicitly modeled. The IBC supports a high information capacity that differentiates it from common hardcopy watermarks. The reason for the improved image quality over the VSB is a joint encoding/halftoning strategy based on a modified version of block error diffusion. Encoder stability, image quality vs. information capacity tradeoffs, and decoding issues with and without explicit knowledge of the base image are discussed.

  13. The Effects of Infrared-Blocking Pigments and Deck Venting on Stone-Coated Metal Residential Roofs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, William A

    2006-01-01

    Field data show that stone-coated metal shakes and S-mission tile, which exploit the use of infrared-blocking color pigments (IrBCPs), along with underside venting, reduce the heat flow penetrating the conditioned space of a residence by 70% compared with the heat flow penetrating roofs with conventional asphalt shingles. Stone-coated metal roof products are typically placed on battens and counter-battens and nailed through the battens to the roof deck. The design provides venting on the underside of the metal roof that reduces the heat flow penetrating a home. The Metal Construction Association (MCA) and its affiliate members installed stone-coated metal roofs with shake and S-mission tile profiles, along with a painted metal shake roof, on a fully instrumented attic test assembly at Oak Ridge National Laboratory (ORNL). Measurements of roof, deck, attic, and ceiling temperatures; heat flows; solar reflectance; thermal emittance; and ambient weather were recorded for each of the test roofs and also for an adjacent attic cavity covered with a conventionally pigmented, direct-nailed asphalt shingle roof. All attic assemblies had ridge and soffit venting; the ridge was open to the underside of the stone-coated metal roofs. A control assembly with a conventional asphalt shingle roof was used for comparing deck and ceiling heat transfer rates.

  14. Bi-orthogonality relations for fluid-filled elastic cylindrical shells: Theory, generalisations and application to construct tailored Green's matrices

    NASA Astrophysics Data System (ADS)

    Ledet, Lasse S.; Sorokin, Sergey V.

    2018-03-01

    The paper addresses the classical problem of time-harmonic forced vibrations of a fluid-filled cylindrical shell considered as a multi-modal waveguide carrying infinitely many waves. The forced vibration problem is solved using tailored Green's matrices formulated in terms of eigenfunction expansions. The formulation of Green's matrix is based on special (bi-)orthogonality relations between the eigenfunctions, which are derived here for the fluid-filled shell. Further, the relations are generalised to any multi-modal symmetric waveguide. Using the orthogonality relations the transcendental equation system is converted into algebraic modal equations that can be solved analytically. Upon formulation of Green's matrices the solution space is studied in terms of completeness and convergence (uniformity and rate). Special features and findings exposed only through this modal decomposition method are elaborated and the physical interpretation of the bi-orthogonality relation is discussed in relation to the total energy flow which leads to derivation of simplified equations for the energy flow components.

  15. Topology of Awareness: Therapeutic Implications of Logical Modalities of Multiple Levels of Awareness.

    ERIC Educational Resources Information Center

    Levine, Shellie

    2000-01-01

    Describes a theory of a topology of awareness, in which higher levels organize reality through dialectical logic, whereas lower levels construct reality based on Aristotelian logic, binary oppositions, and experiencing entities as discrete and independent. Argues that metaphor, poetry, and narrative are linguistic tools that enable clients to…

  16. Modal Representations and Their Role in the Learning Process: A Theoretical and Pragmatic Analysis

    ERIC Educational Resources Information Center

    Gunel, Murat; Yesildag-Hasancebi, Funda

    2016-01-01

    In the construction and sharing of scientific knowledge, modal representations such as text, graphics, pictures, and mathematical expressions are commonly used. Due to the increasing importance of their role in the production and communication of science, modal representations have become a topic of growing interest in science education research…

  17. Contemporary Multi-Modal Historical Representations and the Teaching of Disciplinary Understandings in History

    ERIC Educational Resources Information Center

    Donnelly, Debra J.

    2018-01-01

    Traditional privileging of the printed text has been considerably eroded by rapid technological advancement and in Australia, as elsewhere, many History teaching programs feature an array of multi-modal historical representations. Research suggests that engagement with the visual and multi-modal constructs has the potential to enrich the pedagogy…

  18. Development of piezoelectric bistable energy harvester based on buckled beam with axially constrained end condition for human motion

    NASA Astrophysics Data System (ADS)

    Eltanany, Ali M.; Yoshimura, Takeshi; Fujimura, Norifumi; Ebied, Mohamed R.; Ali, Mohamed G. S.

    2017-10-01

    In this study, we examine the triggering force for an efficient snap-through response of a piezoelectric bistable energy harvester under hand-shaking vibration. The proposed structure works at very low frequencies with nearly continuous periodic vibrations. The static characterization is presented, as well as the dynamic characterization based on phase diagrams of velocity vs. displacement, voltage vs. displacement, and voltage vs. input acceleration. The mass attached to the bistable harvester plays an important role in determining the acceleration needed for the snap-through action, and the explanation of this role is complicated by the dependence of the mass effect on vibration frequency and amplitude. Various hand-shaking vibration tests are performed to demonstrate the advantage of the proposed structure in harvesting energy from hand-shaking vibration. The minimum input acceleration for snap-through action was 11.59 m/s² with peaks of 15.76 and 2 m/s² in the frequency range of 1.3-2.7 Hz when an attached mass of 14.6 g was used. The maximum generated power at a buckling state of 0.5 mm was 11.3 µW for the test structure at 26 g. The experimental results indicate that slow hand-shaking vibrations can be harvested at a power output of about 10 µW across a load resistance of 1 MΩ.

  19. Application of an Online-Biomass Sensor in an Optical Multisensory Platform Prototype for Growth Monitoring of Biotechnical Relevant Microorganism and Cell Lines in Single-Use Shake Flasks

    PubMed Central

    Ude, Christian; Schmidt-Hager, Jörg; Findeis, Michael; John, Gernot Thomas; Scheper, Thomas; Beutel, Sascha

    2014-01-01

    In the context of this work we evaluated a multisensory, noninvasive prototype platform for shake flask cultivations by monitoring three basic parameters (pH, pO2 and biomass). The focus lies on the evaluation of the biomass sensor based on backward light scattering. The application spectrum was expanded to four new organisms in addition to E. coli K12 and S. cerevisiae [1]. It could be shown that the sensor is appropriate for a wide range of standard microorganisms, e.g., L. zeae, K. pastoris, A. niger and CHO-K1. The biomass sensor signal could successfully be correlated and calibrated with well-known measurement methods like OD600, cell dry weight (CDW) and cell concentration. Logarithmic and Bleasdale-Nelder derived functions were adequate for data fitting. Measurements at low cell concentrations proved to be critical in terms of signal-to-noise ratio, but the integration of a custom-made light shade in the shake flask improved these measurements significantly. This sensor-based measurement method has a high potential to initiate a new generation of online bioprocess monitoring. Metabolic studies will particularly benefit from the multisensory data acquisition. The sensor is already used in lab-scale experiments for shake flask cultivations. PMID:25232914

  20. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being put to appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPEs) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which permits a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which warrants the results of the analysis. By analysing seismicity in the Vrancea region in some detail, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  1. Scale-up from shake flasks to bioreactor, based on power input and Streptomyces lividans morphology, for the production of recombinant APA (45/47 kDa protein) from Mycobacterium tuberculosis.

    PubMed

    Gamboa-Suasnavart, Ramsés A; Marín-Palacio, Luz D; Martínez-Sotelo, José A; Espitia, Clara; Servín-González, Luis; Valdez-Cruz, Norma A; Trujillo-Roldán, Mauricio A

    2013-08-01

    Culture conditions in shake flasks affect filamentous Streptomyces lividans morphology, as well as the productivity and O-mannosylation of the recombinant Ala-Pro-rich O-glycoprotein (known as the 45/47 kDa or APA antigen) from Mycobacterium tuberculosis. To scale up from previously reported shake flasks to a bioreactor, data from the literature on the effect of agitation on the morphology of Streptomyces strains were used to derive gassed volumetric power input values expected to reproduce, in the bioreactor, the S. lividans morphology previously reported by our group in coiled/baffled shake flasks. The morphology of S. lividans was successfully scaled up, with similar mycelial sizes at both scales: diameters of 0.21 ± 0.09 mm in baffled and coiled shake flasks and 0.15 ± 0.01 mm in the bioreactor. Moreover, the specific growth rate was successfully scaled up (0.09 ± 0.02 and 0.12 ± 0.01 h(-1) for bioreactors and flasks, respectively), as was the recombinant protein productivity measured by densitometry. More interestingly, the quality of the recombinant glycoprotein, measured as the number of mannoses attached to the C-terminus of APA, was also scaled up, with up to five mannose residues in shake flask cultures and six in the bioreactor. However, the final biomass concentration was not similar, indicating that although the process can be scaled up using the power input, other factors such as the oxygen transfer rate, tip speed or energy dissipation/circulation function may influence bacterial metabolism.
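    The constant power-per-volume criterion used for such scale-ups can be sketched with the classical ungassed stirred-tank relation P = Np·ρ·N³·D⁵. This is a generic approximation only: the power number and broth density below are illustrative placeholders, and the study matches gassed (aerated) power input, which is lower than the ungassed value computed here.

```python
def power_per_volume(n_rps, d_m, volume_m3, power_number=5.0, rho=1000.0):
    """Ungassed power draw per unit volume (W/m^3) for a stirred vessel,
    P = Np * rho * N^3 * D^5, valid in the turbulent regime."""
    return power_number * rho * n_rps**3 * d_m**5 / volume_m3

def matched_speed(target_pv, d_m, volume_m3, power_number=5.0, rho=1000.0):
    """Impeller speed (rev/s) that reproduces a target P/V at a new scale."""
    return (target_pv * volume_m3 / (power_number * rho * d_m**5)) ** (1.0 / 3.0)
```

    Holding P/V constant while the vessel diameter grows implies a lower impeller speed at the large scale, which is consistent with matching shear-sensitive mycelial morphology across scales.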

  2. ShakeNet: a portable wireless sensor network for instrumenting large civil structures

    USGS Publications Warehouse

    Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert

    2015-08-03

    We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields such as civil engineering and the study of earthquakes, and the emergence of wireless sensor networks provides a promising means for such applications. However, most wireless sensor networks are still in the experimental stage, and few take realistic earthquake engineering requirements into consideration. To collect comprehensive structural health monitoring data for civil engineers, high-resolution vibration sensors and sufficient sampling rates must be adopted, which challenges current wireless sensor network technology in three ways: processing capability, storage, and communication bandwidth. The wireless sensor network has to meet expectations set by the wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for instrumenting large civil structures, especially buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of a building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation and helped to refine both hardware and software.
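    A back-of-the-envelope budget shows why bandwidth and storage are binding constraints for such a network; the 200 Hz sampling rate and 24-bit resolution used below are illustrative assumptions, not ShakeBox specifications:

```python
def node_data_rate_bps(channels=3, sample_rate_hz=200, bits_per_sample=24):
    """Raw payload rate for one triaxial node, before protocol overhead."""
    return channels * sample_rate_hz * bits_per_sample

def network_storage_gb_per_day(nodes, rate_bps):
    """Aggregate storage needed for continuous recording by all nodes."""
    return nodes * rate_bps * 86400 / 8 / 1e9

rate = node_data_rate_bps()                     # 14,400 bit/s per node
total = network_storage_gb_per_day(40, rate)    # ~6.2 GB/day for 40 nodes
```

    Even these modest per-node rates add up across 40 nodes sharing one radio channel, which is why wired-system expectations are hard to meet wirelessly.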

  3. Development of a Low Cost Earthquake Early Warning System in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Y. M.

    2017-12-01

    The National Taiwan University (NTU) has been developing an earthquake early warning (EEW) system for research purposes using low-cost accelerometers (P-Alert) since 2010. As of 2017, a total of 650 stations have been deployed and configured. The NTU system can provide earthquake information within 15 s of an earthquake's occurrence and may therefore provide early warnings for cities located more than 50 km from the epicenter. Additionally, the NTU system has an onsite alert function that triggers a warning when the incoming P wave exceeds a certain threshold, providing a 2-3 s lead time before peak ground acceleration (PGA) for regions close to an epicenter. Detailed shaking maps are produced by the NTU system within one or two minutes after an earthquake. Recently, a new module named ShakingAlarm has been developed. Equipped with real-time acceleration signals and a time-dependent anisotropic attenuation relationship for the PGA, ShakingAlarm can provide an accurate PGA estimate immediately before the arrival of the observed PGA. This unique advantage yields sufficient lead time for hazard assessment and emergency response, which is unavailable with traditional shake maps based only on the PGA observed in real time. The performance of ShakingAlarm was tested with six M > 5.5 inland earthquakes from 2013 to 2016. Taking the 2016 M6.4 Meinong earthquake simulation as an example, the predicted PGA converges to a stable value and produces a predicted shake map and an isocontour map of the predicted PGA within 16 seconds of earthquake occurrence. Compared with traditional regional EEW systems, ShakingAlarm can effectively identify possible damage regions and provide valuable early warning information (magnitude and PGA) for risk mitigation.
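    The geometry behind the "15 s report time, warnings beyond roughly 50 km" figures can be sketched with a simple S-wave travel-time estimate; the average shear-wave speed used here is an assumed value for illustration, not a parameter of the NTU system:

```python
def warning_lead_time_s(epicentral_km, vs_kms=3.3, report_s=15.0):
    """Approximate warning lead time (s) before the damaging S waves reach
    a site, given the delay between origin time and the system's report.
    Negative values mean the shaking arrives before the warning."""
    return epicentral_km / vs_kms - report_s
```

    With these assumptions the lead time turns positive only for sites beyond about 50 km, matching the regional-warning range quoted in the abstract; the onsite P-wave function covers the near-epicenter blind zone.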

  4. Estimation of Stresses in a Dry Sand Layer Tested on Shaking Table

    NASA Astrophysics Data System (ADS)

    Sawicki, Andrzej; Kulczykowski, Marek; Jankowski, Robert

    2012-12-01

    A theoretical analysis of shaking table experiments simulating the earthquake response of a dry sand layer is presented. The aim of such experiments is to study seismically induced compaction of soil and the resulting settlements. In order to determine the soil compaction, the cyclic stresses and strains must be calculated first. These stresses are caused by the cyclic horizontal acceleration at the base of the soil layer, so it is important to determine the stress field as a function of the base acceleration. This is particularly important for a proper interpretation of shaking table tests, where the base acceleration is controlled but the stresses are hard to measure and can only be deduced. Preliminary experiments have shown that small accelerations do not lead to significant settlements, whilst large accelerations cause phenomena typical of limit states, including the visible appearance of slip lines. All these problems should be well understood for rational planning of experiments, and their analysis is presented in this paper. First, some heuristic considerations on the dynamics of the experimental system are presented. Then the boundary conditions, expressed as resultants of the respective stresses, are analysed. A particular form of boundary conditions is chosen that satisfies the macroscopic boundary conditions and the equilibrium equations. Considerations are then presented for obtaining a statically admissible stress field that does not violate the Coulomb-Mohr yield condition. This approach leads to the determination of the limit base accelerations that do not cause a plastic state in the soil. It is shown that larger accelerations lead to an increase in the lateral stresses, and a corresponding method that may replace complex plasticity analyses is proposed. It is shown that it is the lateral stress coefficient K0 that controls the statically admissible stress field during the shaking table experiments.
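    As a crude stand-in for the paper's statically admissible stress analysis, a rigid-block Coulomb friction bound on the limit base acceleration, together with Jaky's approximation for the at-rest lateral stress coefficient K0, can be sketched as follows (the friction angle is hypothetical, and the analysis in the paper is considerably more refined than this single-plane bound):

```python
import math

def k0_jaky(phi_deg):
    """At-rest lateral stress coefficient via Jaky's formula, K0 = 1 - sin(phi)."""
    return 1.0 - math.sin(math.radians(phi_deg))

def limit_base_acceleration_g(phi_deg):
    """Rigid-block Coulomb bound: horizontal base acceleration (in g) at which
    a dry frictional layer reaches the sliding limit, a/g = tan(phi)."""
    return math.tan(math.radians(phi_deg))
```

    For a typical dry sand with phi around 30-35 degrees this bound already shows why "small" table accelerations produce no slip lines while larger ones push the layer toward a limit state.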

  5. Experimental seismic behavior of a full-scale four-story soft-story wood-frame building with retrofits II: shake table test results

    Treesearch

    John W. van de Lindt; Pouria Bahmani; Gary Mochizuki; Steven E. Pryor; Mikhail Gershfeld; Jingjing Tian; Michael D. Symans; Douglas Rammer

    2016-01-01

    Soft-story wood-frame buildings have been recognized as a disaster preparedness problem for decades. The majority of these buildings were constructed from the 1920s to the 1960s and are prone to collapse during moderate to large earthquakes due to a characteristic deficiency in strength and stiffness in their first story. In order to propose and validate retrofit...

  6. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed through central Oita, another M6-class event occurred there, more than 80 km away from the Mw7.0 event. The M6 event is interpreted as an induced earthquake, and it brought stronger shaking to central Oita than the Mw7.0 event did. We discuss the induced earthquake from the viewpoint of Earthquake Early Warning. In terms of ground shaking such as PGA and PGV, the Mw7.0 event produced much smaller values at central Oita than the M6 induced earthquake (for example, about 1/8 of the PGA at station OIT009), so the two events are easy to discriminate. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is quite difficult to recognize the induced earthquake from displacement waveforms alone, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many EEW methods (including the current JMA EEW system), magnitude is used to predict ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, displacement magnitude is not necessarily the best quantity for predicting ground shaking such as PGA and PGV. In the case of the induced earthquake during the Kumamoto sequence, displacement magnitude could not be estimated because of the strong contamination; indeed, the JMA EEW system did not recognize the induced earthquake. One of the important lessons learned from eight years of EEW operation is the issue of multiple simultaneous earthquakes, such as the aftershocks of the 2011 Mw9.0 Tohoku earthquake.
    Based on this lesson, we have proposed enhancing the real-time monitoring of ground shaking itself instead of rapidly estimating hypocenter location and magnitude. Because the goal of EEW is to predict ground shaking, the focus should be on monitoring the shaking directly. The experience of the induced earthquake likewise indicates the importance of real-time shake monitoring for making EEW more rapid and precise.
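    The shaking-based triggering advocated here can be caricatured in a few lines: alert as soon as the observed acceleration crosses a threshold, with no hypocenter or magnitude estimation in the loop, so contamination of displacement records by a preceding event is irrelevant. The threshold value is an illustrative choice, not a JMA parameter:

```python
def shaking_alert(acc_stream_gal, threshold_gal=80.0):
    """Return the index of the first sample whose absolute acceleration
    exceeds the alert threshold, or None if it is never exceeded.
    Mimics a purely observation-driven (magnitude-free) trigger."""
    for i, a in enumerate(acc_stream_gal):
        if abs(a) >= threshold_gal:
            return i
    return None
```

    A real shaking-monitoring system would additionally propagate the observed wavefield forward to predict shaking at yet-unaffected sites, but the key design choice, triggering on observed motion rather than on an estimated source, is captured by this sketch.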

  7. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements exposed to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as subcontractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the earthquake source using a regional seismotectonic data base and basic source parameters and, when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimating the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and shear wave velocity distributions (Shake Mapping). 4. Incorporating strong ground motion and other empirical macroseismic data to improve the Shake Map. 5. Estimating the losses (damage, casualty and economic) at levels of sophistication (0, 1 and 2) commensurate with the available inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macroeconomic loss quantifiers) in urban areas.
    The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and for improvement of the shake map through incorporation of accelerometric and other macroseismic data (similar to the USGS ShakeMap system). The building inventory data for the Level 2 analysis consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-UE project is used. The building vulnerability/fragility relationships can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study; both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated from the number of buildings in different damage states and the casualty rates for each building type and damage level; the casualty rates can be modified if necessary. The Level 2 analysis also includes calculation of direct monetary losses resulting from building damage, which allows for repair-cost estimation and for specific investigations associated with earthquake insurance applications (PML and AAL estimations). Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons against other earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to the relevant agencies for the planning and coordination of post-earthquake emergency response. The same software can also be used for scenario earthquake loss estimation, related Monte Carlo type simulations and earthquake insurance applications.
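    The Level 2 casualty logic described above (buildings per damage state times per-state casualty rates) can be sketched as follows; the damage states, counts, and rates are hypothetical placeholders, not ELER calibration values:

```python
def expected_casualties(buildings_by_state, casualty_rates):
    """HAZUS-style casualty estimate for one building type:
    sum over damage states of (number of buildings in state x casualty rate)."""
    return sum(buildings_by_state[s] * casualty_rates[s]
               for s in buildings_by_state)

# Hypothetical geo-cell: counts of buildings per damage state and
# illustrative per-building casualty rates for one building typology.
damage = {"moderate": 1200, "extensive": 300, "complete": 50}
rates = {"moderate": 0.001, "extensive": 0.01, "complete": 0.1}
cell_casualties = expected_casualties(damage, rates)
```

    In a full Level 2 run this sum would be repeated per geo-cell and per building typology, with rates that depend on both typology and damage level.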

  8. A new wireless system for decentralised measurement of physiological parameters from shake flasks

    PubMed Central

    Vasala, Antti; Panula, Johanna; Bollók, Monika; Illmann, Lutz; Hälsig, Christian; Neubauer, Peter

    2006-01-01

    Background: Shake flasks are widely used because of their low price and simple handling. Many researchers, however, are not aware of the physiological consequences of oxygen limitation and substrate overflow metabolism that occur in shake flasks. The availability of a wireless measuring system opens possibilities for quality control and the design of cultivation conditions. Results: Here we present a new wireless solution for the measurement of pH and oxygen from shake flasks with standard sensors, which allows data transmission over a distance of more than 100 metres in laboratory environments. This new system was applied to the monitoring of cultivation conditions in shake flasks, making real-time monitoring of the growth conditions possible by simple means. We demonstrate that with typical protocols E. coli shake flask cultures run into severe oxygen limitation and the medium is strongly acidified. The strength of the new system is further demonstrated by continuous monitoring of the oxygen level in methanol-fed Pichia pastoris shake flask cultures, which allows substrate feeding to be optimised to prevent starvation or methanol overfeed. A 40% higher cell density was obtained by adding methanol whenever the respiration activity of the cultures decreased, thereby preventing the starvation phases that occur in standard shake flask protocols. Conclusion: The wireless system introduced here can read sensor data in parallel, over long distances, from shake flasks under vigorous shaking in cultivation rooms or closed incubators. The presented technology allows centralised monitoring of decentralised targets. It is useful for monitoring pH and dissolved oxygen in shake flask cultures, and it is not limited to standard sensors but can easily be adapted to new types of sensors and measurement locations (e.g., new sensor points in large-scale bioreactors). PMID:16504107

  9. Diversity of head shaking nystagmus in peripheral vestibular disease.

    PubMed

    Kim, Min-Beom; Huh, Se Hyung; Ban, Jae Ho

    2012-06-01

    To evaluate the characteristics of head shaking nystagmus in various peripheral vestibular diseases. Retrospective case series. Tertiary referral center. Data from 235 patients with peripheral vestibular diseases, including vestibular neuritis, Ménière's disease, and benign paroxysmal positional vertigo, were retrospectively analyzed. All subjects presented between August 2009 and July 2010. Patients underwent vestibular function testing, including head shaking nystagmus and caloric testing. For vestibular neuritis, all tests were repeated at the 1-month follow-up. Head shaking nystagmus was classified as monophasic or biphasic and, according to the affected ear, as ipsilesional or contralesional. Of the 235 patients, 87 showed positive head shaking nystagmus. By disease, the positive rates of head shaking nystagmus were as follows: 35 (100%) of 35 cases of vestibular neuritis, 11 (68.8%) of 16 cases of Ménière's disease, and 41 (22.2%) of 184 cases of benign paroxysmal positional vertigo. All cases of vestibular neuritis initially presented with a monophasic, contralesionally beating head shaking nystagmus. One month after the first visit, however, the direction of nystagmus had changed to biphasic (contralesional first, then ipsilesional beating) in 25 cases (72.5%) but not in the remaining 10 cases (27.5%). There was a significant correlation between the degree of initial caloric weakness and the biphasic conversion of head shaking nystagmus (p = 0.02). In 72.5% of vestibular neuritis cases, head shaking nystagmus converted to biphasic during the subacute period; the larger the initial canal paresis, the more frequently this biphasic conversion occurred. Ménière's disease and benign paroxysmal positional vertigo, however, did not show specific patterns of head shaking nystagmus.

  10. Regional Earthquake Shaking and Loss Estimation

    NASA Astrophysics Data System (ADS)

    Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements exposed to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as subcontractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the earthquake source using a regional seismotectonic data base and basic source parameters and, when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimating the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and shear wave velocity distributions (Shake Mapping). 4. Incorporating strong ground motion and other empirical macroseismic data to improve the Shake Map. 5. Estimating the losses (damage, casualty and economic) at levels of sophistication (0, 1 and 2) commensurate with the available inventory of the human-built environment (Loss Mapping). Both the Level 0 and Level 1 analyses of the ELER routine are based on obtaining intensity distributions analytically and estimating the total number of casualties and their geographic distribution, either using regionally adjusted intensity-casualty or magnitude-casualty correlations (Level 0, similar to the PAGER system being developed by the USGS) or using regional building inventory data bases (Level 1).
    For given basic source parameters, the intensity distributions can be computed using: a) regional intensity attenuation relationships; b) intensity correlations with attenuation-relationship-based PGV, PGA and spectral amplitudes; and c) intensity correlations with a synthetic Fourier amplitude spectrum. In the Level 1 analysis, EMS98-based building vulnerability relationships are used for regional estimates of building damage and casualty distributions. Results obtained from pilot applications of the Level 0 and Level 1 analysis modes of the ELER software to the 1999 M 7.4 Kocaeli, 1995 M 6.1 Dinar, and 2007 M 5.4 Bingol earthquakes, in terms of ground shaking and losses, are presented and compared with the observed losses. The regional earthquake shaking and loss information is intended for dissemination in a timely manner to the relevant agencies for the planning and coordination of post-earthquake emergency response. The same software can also be used for scenario earthquake loss estimation and related Monte Carlo type simulations.
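    Option a), a regional intensity attenuation relationship, typically takes a simple magnitude-and-distance form; the sketch below uses placeholder coefficients for illustration, not those calibrated inside ELER:

```python
import math

def macroseismic_intensity(magnitude, dist_km, c1=1.5, c2=1.5, c3=3.0):
    """Generic intensity attenuation form I = c1 + c2*M - c3*log10(R).
    The coefficients c1..c3 are hypothetical placeholders; a regional
    relationship would use values fitted to local macroseismic data."""
    return c1 + c2 * magnitude - c3 * math.log10(dist_km)
```

    Evaluating such a relationship over a grid of site distances yields the analytical intensity distribution that the Level 0/Level 1 casualty correlations are then applied to.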

  11. Recorded motions of the 6 April 2009 Mw 6.3 L'Aquila, Italy, earthquake and implications for building structural damage: Overview

    USGS Publications Warehouse

    Celebi, M.; Bazzurro, P.; Chiaraluce, L.; Clemente, P.; Decanini, L.; Desortis, A.; Ellsworth, W.; Gorini, A.; Kalkan, E.; Marcucci, S.; Milana, G.; Mollaioli, F.; Olivieri, M.; Paolucci, R.; Rinaldis, D.; Rovelli, A.; Sabetta, F.; Stephens, C.

    2010-01-01

    The normal-faulting earthquake of 6 April 2009 in the Abruzzo Region of central Italy caused heavy loss of life and substantial damage both to centuries-old buildings of significant cultural importance and to modern reinforced-concrete framed buildings with hollow masonry infill walls. Although structural deficiencies were significant and widespread, study of the characteristics of strong-motion data from the heavily affected area indicated that the short duration of strong shaking may have spared many more damaged buildings from collapsing. It is recognized that, with this caveat of short-duration shaking, the infill walls may have played a very important role in preventing further deterioration or collapse of many buildings. It is concluded that better new or retrofit construction practices that include reinforced-concrete shear walls may prove helpful in reducing risks in such seismic areas of Italy, other Mediterranean countries, and even the United States, where there are large inventories of deficient structures. © 2010, Earthquake Engineering Research Institute.

  12. Urban Observation and Sentiment in James Parkinson’s Essay on the Shaking Palsy (1817)

    PubMed Central

    Hurwitz, Brian

    2014-01-01

    James Parkinson’s Essay on the Shaking Palsy (1817) has long been considered the foundational text of the disease which now bears the author’s name. This paper shows how the Essay radically re-formulated a diverse array of human dysmobilities as a “species” of disease. Parkinson incorporated medical observation with a clear focus on patient experience and subjectivity in a deeply affecting narrative, fusing clinical and urban case-descriptions within the genre of a sentimental natural history. His detailed, diagnostic portrayal of the malady recast earlier descriptions of trembling, posture and gait disorder within a new narrative order, simultaneously recruiting reader involvement to the plight of sufferers. Hardly any clinical examination as we know it today undergirds what remains an exemplary account of disciplined medical witness. The Essay demonstrates the potential of case construction and powerful, sympathetic case writing to transform clinical understanding of a complex medical condition of long duration. PMID:25055707

  13. The Chemical and Physical Properties of Pyrrole-Based Conducting Polymers: The Characterization of As-Grown Films by X-Ray Photoemission Spectroscopy.

    DTIC Science & Technology

    1983-04-07

    has been the subject of the most extensive experimental and theoretical investigations because in this particular polymer bond-alternation defects...systems that such structures can arise from simultaneous core-electron photoionization and valence-electron excitation ("shake up") or ionization ("shake off"). While structures on the high-energy side of the direct photoionization peak could also arise from characteristic energy losses...

  14. Blind identification of full-field vibration modes of output-only structures from uniformly-sampled, possibly temporally-aliased (sub-Nyquist), video measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yongchao; Dorn, Charles; Mancini, Tyler

    Enhancing the spatial and temporal resolution of vibration measurements and modal analysis could significantly benefit dynamic modelling, analysis, and health monitoring of structures. For example, spatially high-density mode shapes are critical for accurate vibration-based damage localization. In experimental or operational modal analysis, higher-frequency modes, which may be outside the frequency range of the measurement, contain local structural features that can improve damage localization as well as the construction and updating of the modal-based dynamic model of the structure. In general, the resolution of vibration measurements can be increased by enhanced hardware. Traditional vibration sensors such as accelerometers have high-frequency sampling capacity; however, they are discrete, point-wise sensors providing only sparse, low-spatial-resolution measurements, while dense deployment to achieve high spatial resolution is expensive and introduces mass loading and modification of the structure's surface. Non-contact methods such as scanning laser vibrometers provide high spatial and temporal resolution; however, they make measurements sequentially, which requires considerable acquisition time. As an alternative non-contact method, digital video cameras are relatively low-cost and agile and provide simultaneous, high-spatial-resolution measurements. Combined with vision-based algorithms (e.g., image correlation or template matching, optical flow, etc.), video camera based measurements have been successfully used for experimental and operational vibration measurement and subsequent modal analysis. However, the sampling frequency of most affordable digital cameras is limited to 30–60 Hz, while high-speed cameras for higher-frequency vibration measurements are extremely costly.
    This work develops a computational algorithm capable of performing vibration measurement at a uniform sampling frequency lower than that required by the Shannon-Nyquist sampling theorem for output-only modal analysis. In particular, the spatio-temporal uncoupling property of the modal expansion of structural vibration responses enables direct modal decoupling of the temporally-aliased vibration measurements by existing output-only modal analysis methods, yielding (full-field) mode shape estimates directly. The signal aliasing properties in modal analysis are then exploited to estimate the modal frequencies and damping ratios. The proposed method is validated by laboratory experiments in which output-only modal identification is conducted on temporally-aliased acceleration responses and, in particular, on temporally-aliased video measurements of bench-scale structures, including a three-story building structure and a cantilever beam.
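    The fold-back relation that the aliasing-exploitation step must invert, mapping a true modal frequency to its apparent frequency under sub-Nyquist sampling, can be sketched as follows (this is generic sampling theory, not the paper's full algorithm):

```python
def aliased_frequency(f_true_hz, fs_hz):
    """Apparent frequency of a sinusoid of frequency f_true_hz when
    sampled at fs_hz: fold f_true into the baseband [0, fs/2]."""
    f = f_true_hz % fs_hz
    return fs_hz - f if f > fs_hz / 2 else f
```

    For a 30 Hz camera, a 45 Hz mode appears at 15 Hz and a 25 Hz mode at 5 Hz; recovering the true modal frequencies requires resolving which fold each identified peak came from, which is the ambiguity the proposed method addresses.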

  15. Blind identification of full-field vibration modes of output-only structures from uniformly-sampled, possibly temporally-aliased (sub-Nyquist), video measurements

    DOE PAGES

    Yang, Yongchao; Dorn, Charles; Mancini, Tyler; ...

    2016-12-05

    Enhancing the spatial and temporal resolution of vibration measurements and modal analysis could significantly benefit dynamic modelling, analysis, and health monitoring of structures. For example, spatially high-density mode shapes are critical for accurate vibration-based damage localization. In experimental or operational modal analysis, higher-frequency modes, which may be outside the frequency range of the measurement, contain local structural features that can improve damage localization as well as the construction and updating of the modal-based dynamic model of the structure. In general, the resolution of vibration measurements can be increased by enhanced hardware. Traditional vibration sensors such as accelerometers have high-frequency sampling capacity; however, they are discrete, point-wise sensors providing only sparse, low-spatial-resolution measurements, while dense deployment to achieve high spatial resolution is expensive and introduces mass loading and modification of the structure's surface. Non-contact methods such as scanning laser vibrometers provide high spatial and temporal resolution; however, they make measurements sequentially, which requires considerable acquisition time. As an alternative non-contact method, digital video cameras are relatively low-cost and agile and provide simultaneous, high-spatial-resolution measurements. Combined with vision-based algorithms (e.g., image correlation or template matching, optical flow, etc.), video camera based measurements have been successfully used for experimental and operational vibration measurement and subsequent modal analysis. However, the sampling frequency of most affordable digital cameras is limited to 30–60 Hz, while high-speed cameras for higher-frequency vibration measurements are extremely costly.
    This work develops a computational algorithm capable of performing vibration measurement at a uniform sampling frequency lower than that required by the Shannon-Nyquist sampling theorem for output-only modal analysis. In particular, the spatio-temporal uncoupling property of the modal expansion of structural vibration responses enables direct modal decoupling of the temporally-aliased vibration measurements by existing output-only modal analysis methods, yielding (full-field) mode shape estimates directly. The signal aliasing properties in modal analysis are then exploited to estimate the modal frequencies and damping ratios. The proposed method is validated by laboratory experiments in which output-only modal identification is conducted on temporally-aliased acceleration responses and, in particular, on temporally-aliased video measurements of bench-scale structures, including a three-story building structure and a cantilever beam.

  16. Performance of Earthquake Early Warning Systems during the Major Events of the 2016-2017 Central Italy Seismic Sequence.

    NASA Astrophysics Data System (ADS)

    Festa, G.; Picozzi, M.; Alessandro, C.; Colombelli, S.; Cattaneo, M.; Chiaraluce, L.; Elia, L.; Martino, C.; Marzorati, S.; Supino, M.; Zollo, A.

    2017-12-01

    Earthquake early warning systems (EEWS) nowadays contribute to seismic risk mitigation, both in terms of losses and societal resilience, by issuing an alert promptly after the earthquake origin and before the ground shaking impacts the targets to be protected. EEWS can be grouped into two main classes: network-based and stand-alone systems. Network-based EEWS make use of dense seismic networks surrounding the fault generating the event (e.g. a Near Fault Observatory, NFO). Rapid processing of the early portion of the P wave yields the location and magnitude of the event, which are then used to predict the shaking through ground motion prediction equations. Stand-alone systems instead analyze the early P-wave signal at the recording site itself to predict, through empirically calibrated scaling relationships, the ground shaking carried by the later S or surface waves. We compared the network-based (PRESTo, PRobabilistic and Evolutionary early warning SysTem, www.prestoews.org, Satriano et al., 2011) and the stand-alone (SAVE, on-Site-Alert-leVEl, Caruso et al., 2017) systems by analyzing their performance during the 2016-2017 Central Italy sequence. We analyzed 9 earthquakes with magnitude 5.0 < M < 6.5 at about 200 stations located within 200 km of the epicentral area, including stations of the Altotiberina NFO (TABOO). Performance is evaluated in terms of the success rate of ground shaking intensity prediction and the available lead-time, i.e. the time available for security actions. For PRESTo we also evaluated the accuracy of location and magnitude. Both systems predict the ground shaking near the event source well, with a success rate of around 90% within the potential damage zone. The lead-time is significantly larger for the network-based system, increasing to more than 10 s at 40 km from the event epicentre. The stand-alone system performs better in the near-source region, showing a positive albeit small lead-time (<3 s). 
Far away from the source, the performance slightly degrades, mostly owing to the uncertain calibration of attenuation relationships. This study opens the possibility of making EEWS operational in Italy, based on the available acceleration networks, by reducing the delays related to data telemetry.
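The lead-time geometry described above can be sketched numerically. All values below (wave speeds, station spacing, processing delay) are illustrative assumptions, not parameters of PRESTo or SAVE: warning time is the S-wave arrival minus the alert issuance time.

```python
import math

def lead_time(epi_km, depth_km=8.0, vp=6.0, vs=3.3,
              sta_km=15.0, t_proc=3.0):
    """Warning time (s) at a target a given epicentral distance away.

    vp/vs are typical crustal P/S speeds in km/s; sta_km is the distance
    from the epicentre to the nearest detecting station; t_proc lumps the
    P-window, processing, and telemetry delays. All values are illustrative.
    """
    t_s = math.hypot(epi_km, depth_km) / vs               # damaging S waves
    t_alert = math.hypot(sta_km, depth_km) / vp + t_proc  # detect + process
    return t_s - t_alert
```

Near the source the lead time is negative (the "blind zone" where the stand-alone approach helps), and it grows with epicentral distance, as in the network-based results reported above.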

  17. Generalization and modularization of two-dimensional adaptive coordinate transformations for the Fourier modal method.

    PubMed

    Küchenmeister, Jens

    2014-04-21

    The Fourier modal method (FMM) has advanced greatly by using adaptive coordinates and adaptive spatial resolution. The convergence characteristics were shown to be improved significantly, a construction principle for suitable meshes was demonstrated and a guideline for the optimal choice of the coordinate transformation parameters was found. However, the construction guidelines published so far rely on a certain restriction that is overcome with the formulation presented in this paper. Moreover, a modularization principle is formulated that significantly eases the construction of coordinate transformations in unit cells with reappearing shapes and complex sub-structures.

  18. Ring Shake in Eastern Hemlock: Frequency and Relationship to Tree Attributes

    Treesearch

    John E. Baumgras; Paul E. Sendak; David L. Sonderman

    2000-01-01

    Ring shake is a barrier to improved utilization of eastern hemlock, an important component of the total softwood timber resource in the Eastern United States and Canada. Ring shake is the lengthwise separation of wood that occurs between and parallel to growth rings, diminishing lumber yields and values. Evaluating the potential for ring shake is essential to improving...

  20. Radial shakes and "frost cracks" in living oak trees

    Treesearch

    Heinz Butin; Alex L. Shigo

    1981-01-01

    Dissections of hundreds of living, mature oak trees over a 25-year period revealed that radial shakes (or "frost cracks") and ring shakes are associated with a variety of wounds and stubs of branches and basal sprouts. A more intensive study of radial shakes that included dissections of more than 30 oaks confirmed the earlier findings, and provided additional...

  1. High-speed shaking of frozen blood clots for extraction of human and malaria parasite DNA.

    PubMed

    Lundblom, Klara; Macharia, Alex; Lebbad, Marianne; Mohammed, Adan; Färnert, Anna

    2011-08-08

    Frozen blood clots remaining after serum collection are an often disregarded source of host and pathogen DNA, due to troublesome handling and suboptimal outcome. High-speed shaking of clot samples in a cell disruptor manufactured for homogenization of tissue and faecal specimens was evaluated for processing frozen blood clots for DNA extraction. The method was compared to two commercial clot protocols, one based on a chemical kit and one on centrifugation through a plastic sieve, each followed by the same DNA extraction protocol. Blood clots with different levels of parasitaemia (1-1,000 p/μl) were prepared from parasite cultures to assess the sensitivity of PCR detection. In addition, clots retrieved from serum samples collected within two epidemiological studies in Kenya (n = 630) were processed by high-speed shaking and analysed by PCR for detection of malaria parasites and the human α-thalassaemia gene. High-speed shaking succeeded in fully dispersing the clots and generated the highest DNA yield. The level of PCR detection of P. falciparum parasites and the human thalassaemia gene was the same as for samples optimally collected with an anticoagulant. The commercial clot protocol and centrifugation through a sieve failed to fully dissolve the clots and resulted in lower sensitivity of PCR detection. High-speed shaking was a simple and efficacious method for homogenizing frozen blood clots before DNA purification and resulted in PCR templates of high quality from both humans and malaria parasites. This novel method enables genetic studies from stored blood clots.

  2. Shake Warning: Helping People Stay Safe With Lots of Small Boxes in the Ground to Warn Them About Strong Shaking

    NASA Astrophysics Data System (ADS)

    Reusch, M.

    2017-12-01

    A group of people at schools are joining with the group of people in control of making pictures of the state of rocks on the ground and water in our land. They are working on a plan to help all people be safe in the case of very big ground shaking (when ground breaks in sight or under ground). They will put many small boxes all over the states in the direction of where the sun sets to look for the first shake that might be a sign of an even bigger shake to come. They tell a big computer (with much power) in several large cities in those states. These computers will decide if the first shake is a sign of a very large and close ground shake, a far-away ground shake, a small but close ground shake, or even just a sign of a shake that people wanted to make. If it is a sign of a close and really big shake, then the computers will tell the phones and computers of many people to help them take safe steps before the big shaking arrives where they are. This warning might be several seconds or maybe a couple of minutes. People will be able to hide, take cover, and hold on under tables and desks in case things fall from walls and places up high in their home and work. Doctors will be able to pause hard work and boxes that move people up and down in homes, businesses, and stores will be able to stop on the next floor and open their doors to let people out and not get stuck. It will help slow down trains to be safe and not fly off of the track as well as it will help to shut off water and air that warms homes and is used for when you make food hot. To make this plan become real, people who work for these groups are putting more small boxes in areas where there are not enough and that there are many people. They are also putting small boxes in places where there are no boxes but the big shake might come from that direction. 
There are problems to get past such as needing many more small boxes, more people to help with this plan, and getting all people who live in these areas to learn what to do when the warning comes about the big shake, but this year there was good news when in month number four they were able to get all of the computers to talk to each other and run the same plan with the same news of the first shaking.

  3. A new method to extract modal parameters using output-only responses

    NASA Astrophysics Data System (ADS)

    Kim, Byeong Hwa; Stubbs, Norris; Park, Taehyo

    2005-04-01

    This work proposes a new output-only modal analysis method to extract mode shapes and natural frequencies of a structure. The proposed method is based on a single-degree-of-freedom approach in the time domain. For a set of given mode-isolated signals, the undamped mode shapes are extracted utilizing the singular value decomposition of the output energy correlation matrix with respect to sensor locations. The natural frequencies are extracted from a noise-free signal that is projected on the estimated modal basis. The proposed method is particularly efficient when a high resolution of the mode shape is essential. The accuracy of the method is numerically verified using a set of time histories simulated using a finite-element method. The feasibility and practicality of the method are verified using experimental data collected at the newly constructed King Storm Water Bridge in California, United States.
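A toy version of the two steps in this abstract, with an assumed 5-sensor mode shape and a 12 Hz modal coordinate standing in for a real mode-isolated signal: the shape comes from the SVD of the output data, the frequency from the projection onto that shape.

```python
import numpy as np

# Mode-isolated signal: one mode shape modulated by a damped oscillation,
# plus measurement noise (a synthetic stand-in for the paper's test data).
rng = np.random.default_rng(0)
fs, n = 256.0, 4096
t = np.arange(n) / fs
phi = np.array([0.30, 0.55, 0.75, 0.90, 1.00])       # assumed "true" shape
phi = phi / np.linalg.norm(phi)
q = np.exp(-0.5 * t) * np.cos(2 * np.pi * 12.0 * t)  # modal coordinate, 12 Hz
Y = np.outer(phi, q) + 0.01 * rng.standard_normal((5, n))

# Mode shape: leading left singular vector of the output data matrix
# (equivalently, of the output energy correlation matrix Y @ Y.T).
U, s, _ = np.linalg.svd(Y, full_matrices=False)
shape = U[:, 0] * np.sign(U[0, 0] * phi[0])          # fix the arbitrary sign

# Natural frequency: spectral peak of the signal projected on that shape.
q_hat = shape @ Y
freqs = np.fft.rfftfreq(n, 1 / fs)
f_est = freqs[np.argmax(np.abs(np.fft.rfft(q_hat)))]
```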

  4. Advances in Modal Analysis Using a Robust and Multiscale Method

    NASA Astrophysics Data System (ADS)

    Picard, Cécile; Frisson, Christian; Faure, François; Drettakis, George; Kry, Paul G.

    2010-12-01

    This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.
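The audible impulse response mentioned above is, in the classic modal-synthesis formulation, a sum of damped sinusoids. The sketch below uses that textbook form with made-up modal parameters; it is not the paper's FEM-tuned model.

```python
import numpy as np

fs = 44100
t = np.arange(int(0.5 * fs)) / fs      # half a second of audio

def modal_impulse_response(modes):
    """Sum of damped sinusoids: the audible response of a struck object.

    `modes` is a list of (frequency_hz, damping_per_s, amplitude) triples;
    the values used below are invented for illustration.
    """
    y = np.zeros_like(t)
    for f, d, a in modes:
        y += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return y

# A plausible small modal model of a struck metallic object.
y = modal_impulse_response([(440.0, 6.0, 1.0), (1245.0, 9.0, 0.5),
                            (2710.0, 14.0, 0.25)])
```

Lower-resolution approximations of the kind the paper constructs would simply supply fewer, slightly shifted (f, d, a) triples for the same object.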

  5. Validation of the shake test for detecting freeze damage to adsorbed vaccines.

    PubMed

    Kartoglu, Umit; Ozgüler, Nejat Kenan; Wolfson, Lara J; Kurzatkowski, Wiesław

    2010-08-01

    To determine the validity of the shake test for detecting freeze damage in aluminium-based, adsorbed, freeze-sensitive vaccines. A double-blind crossover design was used to compare the performance of the shake test conducted by trained health-care workers (HCWs) with that of phase contrast microscopy as a "gold standard". A total of 475 vials of 8 different types of World Health Organization prequalified freeze-sensitive vaccines from 10 different manufacturers were used. Vaccines were kept at 5 degrees C. Selected numbers of vials from each type were then exposed to -25 degrees C and -2 degrees C for 24-hour periods. There was complete concordance between HCWs and phase-contrast microscopy in identifying freeze-damaged vials and non-frozen samples. Non-frozen samples showed a fine-grain structure under phase contrast microscopy, but freeze-damaged samples showed large conglomerates of massed precipitates with amorphous, crystalline, solid and needle-like structures. Particles in the non-frozen samples measured from 1 microm (vaccines against diphtheria-tetanus-pertussis; Haemophilus influenzae type b; hepatitis B; diphtheria-tetanus-pertussis-hepatitis B) to 20 microm (diphtheria and tetanus vaccines, alone or in combination). By contrast, aggregates in the freeze-damaged samples measured up to 700 microm (diphtheria-tetanus-pertussis) and 350 microm on average. The shake test had 100% sensitivity, 100% specificity and 100% positive predictive value in this study, which confirms its validity for detecting freeze damage to aluminium-based freeze-sensitive vaccines.

  6. Analysis of structural response data using discrete modal filters. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.

    1991-01-01

    The application of reciprocal modal vectors to the analysis of structural response data is described. Reciprocal modal vectors are constructed using an existing experimental modal model and an existing frequency response matrix of a structure, and can be assembled into a matrix that effectively transforms the data from the physical space to a modal space within a particular frequency range. In other words, the weighting matrix necessary for modal vector orthogonality (typically the mass matrix) is contained within the reciprocal modal matrix. The underlying goal of this work is mostly directed toward observing the modal state responses in the presence of unknown, possibly closed-loop forcing functions, thus having an impact on both operating data analysis techniques and independent modal space control techniques. This study investigates the behavior of reciprocal modal vectors as modal filters with respect to certain calculation parameters and their performance with perturbed system frequency response data.
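In the simplest setting (ignoring the mass weighting discussed above), reciprocal modal vectors are the rows of the pseudo-inverse of the mode-shape matrix: each one passes exactly one mode and rejects the rest. A minimal sketch with an invented 4-sensor, 2-mode model:

```python
import numpy as np

# Mode-shape matrix Phi (4 sensors x 2 modes), not orthogonal in sensor space.
Phi = np.array([[1.0, 1.0],
                [2.0, -1.0],
                [3.0, 0.5],
                [4.0, -2.0]])

# Reciprocal modal vectors: rows of pinv(Phi), so that Psi_T @ Phi = I.
# Each row acts as a modal filter on physical response data.
Psi_T = np.linalg.pinv(Phi)

# Physical responses generated from two arbitrary modal coordinate histories.
t = np.linspace(0.0, 1.0, 200)
q = np.vstack([np.sin(2 * np.pi * 3 * t), np.cos(2 * np.pi * 7 * t)])
x = Phi @ q                      # sensors x samples

q_rec = Psi_T @ x                # modal filtering back to modal space
```

With noisy or perturbed frequency response data, as the thesis investigates, the recovery is only approximate; this sketch shows the ideal case.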

  7. Molecular Platform for Design and Synthesis of Targeted Dual-Modality Imaging Probes

    PubMed Central

    2015-01-01

    We report a versatile dendritic structure based platform for construction of targeted dual-modality imaging probes. The platform contains multiple copies of 1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid (DOTA) branching out from a 1,4,7-triazacyclononane-N,N′,N″-triacetic acid (NOTA) core. The specific coordination chemistries of the NOTA and DOTA moieties offer specific loading of ⁶⁸/⁶⁷Ga³⁺ and Gd³⁺, respectively, into a common molecular scaffold. The platform also contains three amino groups which can potentiate targeted dual-modality imaging of PET/MRI or SPECT/MRI (PET: positron emission tomography; SPECT: single photon emission computed tomography; MRI: magnetic resonance imaging) when further functionalized by targeting vectors of interest. To validate this design concept, a bimetallic complex was synthesized with six peripheral Gd-DOTA units and one Ga-NOTA core at the center, whose ion T1 relaxivity per gadolinium atom was measured to be 15.99 mM⁻¹ s⁻¹ at 20 MHz. Further, the bimetallic agent demonstrated its anticipated in vivo stability, tissue distribution, and pharmacokinetic profile when labeled with ⁶⁷Ga. When conjugated with a model targeting peptide sequence, the trivalent construct was able to visualize tumors in a mouse xenograft model by both PET and MRI via a single-dose injection. PMID:25615011

  8. Examination of a modified cell cycle synchronization method and bovine nuclear transfer using synchronized early G1 phase fibroblast cells.

    PubMed

    Urakawa, Manami; Ideta, Atsushi; Sawada, Tokihiko; Aoyagi, Yoshito

    2004-08-01

    Somatic cell nuclear transfer has a low success rate, due to a high incidence of fetal loss and increased perinatal morbidity/mortality. One factor that may affect the successful development of nuclear transfer embryos is the cell cycle stage of the donor cell. In order to establish a cell cycle synchronization method that can consistently produce cloned embryos and offspring, we examined the effects of different combinations of three cell treatments on the recovery rate of mitotic phase cells using bovine fetal fibroblasts. In the first experiment, we examined the recovery rate of mitotic phase cells by a combination of treatment with a metaphase arrestant (1 microM 2-methoxyestradiol), shaking the plate and selecting cells with a diameter of 20 microns. As a result, 99% of mitotic phase cells were recovered by repeating the combined treatment of metaphase arrestant and shaking, and collection of cells with a specific diameter. In the second experiment, nuclear transfer was carried out using early G1 phase cells by choosing pairs of bridged cells derived from mitotic phase cells recovered by the combined treatment of 1 microM 2-methoxyestradiol and shaking, and collection of cells with a diameter of 20 microns. The reconstructed embryos were transferred to recipient heifers to determine post-implantation development. Development of embryos reconstructed from early G1 phase cells from the ≥6-cell stage on Day 3 to the morula-blastocyst stage on Day 6 was 100%. Ten blastocysts constructed from two cell lines were transferred into 10 recipient heifers. Nine of the 10 recipients delivered single live calves. In conclusion, mitotic phase bovine fibroblast cells were easily recovered by the combined treatments of 1 microM 2-methoxyestradiol, shaking, and selecting cells of the appropriate diameter. Furthermore, nuclear transfer using cells in the early G1 phase as donor cells gave a high rate of offspring production.

  9. PSMA-Targeted Nano-Conjugates as Dual-Modality (MRI/PET) Imaging Probes for the Non-Invasive Detection of Prostate Cancer

    DTIC Science & Technology

    2009-10-01

    be made. Currently, iodine based compounds are used to enhance contrast of CT which have the limitations of short imaging window due to rapid...number compared to conventionally used iodine compounds . Nanoparticle based CT contrast agents have been demonstrated for vascular imaging, which...constructs with gamma or positron emitting isotopes through a covalent attachment of a bifunctional chelator to the nanoparticles surface. However, in

  10. The efficacy of aerobic exercise and resistance training as transdiagnostic interventions for anxiety-related disorders and constructs: A randomized controlled trial.

    PubMed

    LeBouthillier, Daniel M; Asmundson, Gordon J G

    2017-12-01

    Evidence supports exercise as an intervention for many mental health concerns; however, randomized controlled investigations of the efficacy of different exercise modalities and predictors of change are lacking. The purposes of the current trial were to: (1) quantify the effects of aerobic exercise and resistance training on anxiety-related disorder (including anxiety disorders, obsessive-compulsive disorder, and posttraumatic stress disorder) status, symptoms, and constructs, (2) evaluate whether both modalities of exercise were equivalent, and (3) determine whether exercise enjoyment and physical fitness are associated with symptom reduction. A total of 48 individuals with anxiety-related disorders were randomized to aerobic exercise, resistance training, or a waitlist. Symptoms of anxiety-related disorders, related constructs, and exercise enjoyment were assessed at pre-intervention and weekly during the 4-week intervention. Participants were further assessed 1-week and 1-month post-intervention. Both exercise modalities were efficacious in improving disorder status. As well, aerobic exercise improved general psychological distress and anxiety, while resistance training improved disorder-specific symptoms, anxiety sensitivity, distress tolerance, and intolerance of uncertainty. Physical fitness predicted reductions in general psychological distress for both types of exercise and reductions in stress for aerobic exercise. Results highlight the efficacy of different exercise modalities in uniquely addressing anxiety-related disorder symptoms and constructs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Constraint-Based Abstract Semantics for Temporal Logic: A Direct Approach to Design and Implementation

    NASA Astrophysics Data System (ADS)

    Banda, Gourinath; Gallagher, John P.

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. Then we show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
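The Galois-connection soundness condition at the heart of this construction can be shown in miniature with the interval domain (the paper itself works with linear-constraint domains and the modal μ-calculus; the transfer function below is an invented example):

```python
# Interval abstract domain: alpha maps a finite set of integer states to the
# least interval covering it; gamma maps an interval back to the states it
# denotes (restricted to a finite universe so the sketch stays executable).
def alpha(states):
    return (min(states), max(states))

def gamma(interval, universe=range(-100, 101)):
    lo, hi = interval
    return {s for s in universe if lo <= s <= hi}

def f(state):            # concrete successor relation: x := 2*x + 1
    return 2 * state + 1

def f_sharp(interval):   # abstract successor on intervals (f is monotone)
    lo, hi = interval
    return (2 * lo + 1, 2 * hi + 1)

# Soundness: the abstract step over-approximates the concrete step,
# i.e. {f(s) | s in S} is contained in gamma(f_sharp(alpha(S))).
S = {1, 3, 5}
concrete_next = {f(s) for s in S}
abstract_next = f_sharp(alpha(S))
```

The over-approximation is visible here: gamma((3, 11)) contains states such as 4 that no concrete successor reaches, which is exactly the price paid for a finite, checkable abstraction.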

  12. ShakeMap manual: technical manual, user's guide, and software guide

    USGS Publications Warehouse

    Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.

    2005-01-01

    ShakeMap (http://earthquake.usgs.gov/shakemap), a system that rapidly and automatically generates shaking and intensity maps, combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap, and associated Web and data products, are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but undoubtedly necessary changes in operational systems take precedence over producing and making documentation publishable.

  13. Development of the mathematical model for design and verification of acoustic modal analysis methods

    NASA Astrophysics Data System (ADS)

    Siner, Alexander; Startseva, Maria

    2016-10-01

    To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analysing the sound field generated by the blade machinery. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. This work presents a model that allows single modes to be set in the channel and the generated sound field to be analysed. Modal analysis of the sound generated by a ring array of point sound sources is performed, and a comparison of experimental and numerical modal analysis results is presented.
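A standard way to realize "setting a single mode and analysing the field" for a ring of sources or microphones is azimuthal decomposition by DFT around the ring; the sketch below assumes this standard approach and invented mode numbers, since the abstract does not give the model's details:

```python
import numpy as np

M = 16                                  # points on the duct-wall ring
theta = 2 * np.pi * np.arange(M) / M    # angular positions around the ring

def set_mode(m, amplitude):
    """Complex pressure of a single azimuthal mode m sampled on the ring."""
    return amplitude * np.exp(1j * m * theta)

def modal_amplitudes(p):
    """Azimuthal modal analysis: a DFT across the ring recovers each A_m."""
    return np.fft.fft(p) / M            # index m holds A_m for 0 <= m < M

# Set two modes (m = 3 and m = 5) and analyse the resulting sound field.
p = set_mode(3, 2.0) + set_mode(5, 0.5)
A = modal_amplitudes(p)
```

With M ring points, modes up to |m| < M/2 are separated exactly; higher mode orders alias onto lower ones, which is one reason such synthetic models are useful for testing analysis algorithms.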

  14. Engineering Escherichia coli for poly-(3-hydroxybutyrate) production guided by genome-scale metabolic network analysis.

    PubMed

    Zheng, Yangyang; Yuan, Qianqian; Yang, Xiaoyan; Ma, Hongwu

    2017-11-01

    Poly-(3-hydroxybutyrate) (P3HB) is a promising biodegradable plastic synthesized from acetyl-CoA. One important factor affecting the P3HB production cost is the P3HB yield. Through flux balance analysis of an extended genome-scale metabolic network of E. coli, we found that the introduction of the non-oxidative glycolysis (NOG) pathway, a previously reported pathway enabling complete carbon conservation, can increase the theoretical carbon yield from 67% to 89%, equivalent to raising the theoretical mass yield from 0.48g P3HB/g glucose to 0.64g P3HB/g glucose. Based on this analysis result, we introduced phosphoketolase and enhanced the NOG pathway in E. coli. The mass yield in the engineered strain was increased from 0.16g P3HB/g glucose to 0.24g P3HB/g glucose. We further overexpressed pntAB to enhance the NADPH availability and down-regulated the TCA cycle to divert more acetyl-CoA toward P3HB. The final construct accumulated 5.7g/L P3HB and reached a carbon yield of 0.43 (a mass yield of 0.31g P3HB/g glucose) in shake flask cultures. The introduction of the NOG pathway could also be useful for improving yields of many other biochemicals derived from acetyl-CoA. Copyright © 2017 Elsevier Inc. All rights reserved.
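The carbon-to-mass yield conversions quoted above can be checked with back-of-envelope stoichiometry; the 89% figure is taken from the abstract, the molecular weights are standard:

```python
# Atomic masses (g/mol) and the two molecules involved.
C, H, O = 12.011, 1.008, 15.999
glucose = 6 * C + 12 * H + 6 * O        # C6H12O6, ~180 g/mol
monomer = 4 * C + 6 * H + 2 * O         # 3-hydroxybutyrate unit, C4H6O2

# Classical glycolysis: 2 acetyl-CoA (4 carbons) per 6-carbon glucose,
# i.e. one P3HB monomer per glucose.
carbon_yield_emp = 4 / 6                              # ~67% carbon yield
mass_yield_emp = monomer / glucose                    # ~0.48 g/g

# With NOG carbon conservation the reported theoretical carbon yield is 89%;
# converting carbon yield to mass yield: monomers = 0.89 * 6 C / 4 C.
carbon_yield_nog = 0.89
mass_yield_nog = carbon_yield_nog * 6 / 4 * monomer / glucose  # ~0.64 g/g
```

The arithmetic reproduces both mass yields in the abstract, confirming that the 67%→89% carbon yield gain and the 0.48→0.64 g/g mass yield gain are the same statement.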

  15. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records.

    PubMed

    Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.

  16. Composite Bending Box Section Modal Vibration Fault Detection

    NASA Technical Reports Server (NTRS)

    Werlink, Rudy

    2002-01-01

    One of the primary concerns with composite construction in critical structures such as wings and stabilizers is that hidden faults and cracks can develop operationally. In the real world, catastrophic sudden failure can result from these undetected faults in composite structures. Vibration data incorporating a broad-frequency modal approach could detect significant changes prior to failure. The purpose of this report is to investigate the usefulness of frequency mode testing before and after bending and torsion loading on a composite bending box test section. This test article is representative of construction techniques being developed for the recent NASA Blended Wing Body Low Speed Vehicle Project. The box section represents the construction technique on the proposed blended wing aircraft. Modal testing using an impact hammer provides a frequency fingerprint before and after bending and torsional loading. If a significant structural discontinuity develops, the vibration response is expected to change. The limitations of the data will be evaluated for future use as a non-destructive in-situ method of assessing hidden damage in similarly constructed composite wing assemblies. Modal vibration fault detection sensitivity to bandwidth, location, and axis will be investigated. Do the sensor accelerometers need to be near the fault and/or in the same axis? The response data used in this report were recorded at 17 locations using tri-axial accelerometers. The modal tests were conducted following 5 independent loading conditions before load to failure and 2 following load to failure, over a period of 6 weeks. Redundant data were used to minimize effects from uncontrolled variables which could lead to incorrect interpretations. It will be shown that vibrational modes detected failure at many locations when skin de-bonding failures occurred near the center section. Important considerations are the axis selected and the frequency range.
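The "frequency fingerprint" idea above reduces to comparing spectral peaks before and after loading: a stiffness loss such as skin de-bonding lowers the modal frequency. The sketch below is synthetic (invented frequencies and damping, not the report's data):

```python
import numpy as np

fs, n = 1024.0, 4096
t = np.arange(n) / fs

def impact_response(f_n, zeta=0.01):
    """Free decay of one structural mode after a hammer tap (synthetic)."""
    wn = 2 * np.pi * f_n
    return np.exp(-zeta * wn * t) * np.sin(wn * np.sqrt(1 - zeta**2) * t)

def peak_frequency(x):
    """Dominant frequency of the response spectrum (the modal fingerprint)."""
    spec = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec)]

baseline = peak_frequency(impact_response(40.0))
# A stiffness loss (e.g. skin de-bonding) lowers the modal frequency.
damaged = peak_frequency(impact_response(36.0))
shift = baseline - damaged
```

In practice each of the 17 tri-axial accelerometer channels would yield such a fingerprint, and the report's questions about bandwidth, location, and axis amount to asking which channels show a resolvable shift.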

  17. Reducing the Salt Added to Takeaway Food: Within-Subjects Comparison of Salt Delivered by Five and 17 Holed Salt Shakers in Controlled Conditions

    PubMed Central

    Goffe, Louis; Wrieden, Wendy; Penn, Linda; Hillier-Brown, Frances; Lake, Amelia A.; Araujo-Soares, Vera; Summerbell, Carolyn; White, Martin; Adamson, Ashley J.

    2016-01-01

    Objectives To determine if the amount of salt delivered by standard salt shakers commonly used in English independent takeaways varies between those with five and 17 holes; and to determine if any differences are robust to variations in: the amount of salt in the shaker, the length of time spent shaking, and the person serving. Design Four laboratory experiments comparing the amount of salt delivered by shakers. Independent variables considered were: type of shaker used (five or 17 holes), amount of salt in the shaker before shaking commences (shaker full, half full or nearly empty), time spent shaking (3s, 5s or 10s), and individual serving. Setting Controlled, laboratory, conditions. Participants A quota-based convenience sample of 10 participants (five women) aged 18–59 years. Main Outcome Measures Amount of salt delivered by salt shakers. Results Across all trials, the 17 holed shaker delivered a mean (SD) of 7.86g (4.54) per trial, whilst the five holed shaker delivered 2.65g (1.22). The five holed shaker delivered a mean of 33.7% of the salt of the 17 holed shaker. There was a significant difference in salt delivered between the five and 17 holed salt shakers when time spent shaking, amount of salt in the shaker and participant were all kept constant (p<0.001). This difference was robust to variations in the starting weight of shakers, time spent shaking and participant shaking (ps

  18. An FEM-based AI approach to model parameter identification for low vibration modes of wind turbine composite rotor blades

    NASA Astrophysics Data System (ADS)

    Navadeh, N.; Goroshko, I. O.; Zhuk, Y. A.; Fallah, A. S.

    2017-11-01

    An approach to the construction of a beam-type simplified model of a horizontal-axis wind turbine composite blade, based on the finite element method, is proposed. The model allows effective and accurate description of low-frequency bending vibration modes, taking into account the coupling between flapwise and lead-lag modes of vibration arising from the non-uniform distribution of twist angle along the blade's length. The identification of model parameters is carried out on the basis of modal data obtained by more detailed finite element simulations and subsequent application of the 'DIRECT' optimisation algorithm. Stable identification results were obtained by using absolute deviations in frequencies and in modal displacements in the objective function, together with additional a priori information (boundedness and monotonicity) on the solution properties.
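    The identification step can be illustrated with a deliberately simplified stand-in. The sketch below fits a single stiffness parameter EI of an Euler-Bernoulli cantilever to target modal frequencies by minimising the sum of absolute frequency deviations over a bounded interval. A golden-section search replaces the paper's 'DIRECT' global optimiser (which handles many parameters at once), and all numerical values are hypothetical.

```python
import math

# Standard Euler-Bernoulli cantilever eigenvalue roots for the first three
# bending modes; f_n = (lambda_n^2 / 2*pi) * sqrt(EI / (rho_A * L^4)).
LAMBDA = [1.8751, 4.6941, 7.8548]

def modal_freqs(EI, rho_A, L):
    return [(lam**2 / (2 * math.pi)) * math.sqrt(EI / (rho_A * L**4))
            for lam in LAMBDA]

def objective(EI, target, rho_A, L):
    # Sum of absolute frequency deviations, as in the objective described above
    return sum(abs(f - ft) for f, ft in zip(modal_freqs(EI, rho_A, L), target))

def identify_EI(target, rho_A, L, lo, hi, iters=60):
    # Golden-section search over a bounded interval (boundedness prior);
    # the objective is unimodal in this single-parameter toy problem.
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    for _ in range(iters):
        if objective(c, target, rho_A, L) < objective(d, target, rho_A, L):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

rho_A, L = 15.0, 20.0        # hypothetical mass per unit length, blade length
true_EI = 2.0e7
target = modal_freqs(true_EI, rho_A, L)   # "measured" modal data
est = identify_EI(target, rho_A, L, 1e6, 1e8)
print(est)
```

    The recovered EI matches the value used to generate the target frequencies; in the paper's setting the search runs over several beam-section parameters against FEM-derived modal data rather than one closed-form stiffness.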

  19. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.

    2011-12-01

    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current federally sponsored tools (the USGS hazard maps and ShakeMap, and FEMA HAZUS) were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology, rather than statistics, to define earthquake shaking hazards. It follows theoretical and computational developments made over the past 20 years to capitalize on detailed and specific local data sets, and thus to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada, has completed the nation's first effort to map earthquake hazard class systematically through an entire urban area, using Optim's SeisOpt® ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley using the Next-Level ShakeZoning process. Although it characterizes only the upper 30 meters, the Vs30 geotechnical shear velocity from the Parcel Map shows clear effects on the 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map is prominent even for the 0.1-Hz waves, despite the large mismatch between their wavelength and geotechnical depths.
Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two and are highly dependent on the particular scenario. Parcel Map amplification effects also extend into areas not characterized in the Parcel Map. The fully 3-d Next-Level ShakeZoning scenarios show many areas of shaking amplification and de-amplification that USGS ShakeMap scenarios cannot predict. For example, the Frenchman Mountain scenario shows PGV from the two approaches within 15% of each other near the source, but upwards of 200% relative amplification or de-amplification, depending on location, throughout Las Vegas Valley.

  20. Shaking Takete and Flowing Maluma. Non-Sense Words Are Associated with Motion Patterns

    PubMed Central

    Koppensteiner, Markus; Stephan, Pia; Jäschke, Johannes Paul Michael

    2016-01-01

    People assign the artificial words takete and kiki to spiky, angular figures and the artificial words maluma and bouba to rounded figures. We examined whether such a cross-modal correspondence could also be found for human body motion. We transferred the body movements of speakers onto two-dimensional coordinates and created animated stick-figures based on these data. Then we invited people to judge these stimuli using the word pairs takete-maluma and bouba-kiki, and several verbal descriptors that served as measures of angularity/smoothness. In addition, we extracted the quantity of motion, the velocity of motion, and the average angle between motion vectors from the coordinate data. Judgments of takete (and kiki) were related to verbal descriptors of angularity, a high quantity of motion, high velocity, and sharper angles. Judgments of maluma (or bouba) were related to smooth movements, a low velocity, a lower quantity of motion, and blunter angles. A forced-choice experiment, in which we presented subsets of stimuli ranking low and high on our motion measures, revealed that people preferably assigned stimuli displaying fast movements with sharp angles in motion vectors to takete and stimuli displaying slow movements with blunter angles in motion vectors to maluma. Results indicated that body movements share features with information inherent in words such as takete and maluma and that people perceive the body movements of speakers on the level of changes in motion direction (e.g., body moves to the left and then back to the right). Follow-up studies are needed to clarify whether impressions of angularity and smoothness have similar communicative values across different modalities and how this affects social judgments and person perception. PMID:26939013

  1. High-speed shaking of frozen blood clots for extraction of human and malaria parasite DNA

    PubMed Central

    2011-01-01

    Background Frozen blood clots remaining after serum collection are an often-disregarded source of host and pathogen DNA because of troublesome handling and suboptimal outcomes. Methods High-speed shaking of clot samples in a cell disruptor manufactured for homogenization of tissue and faecal specimens was evaluated for processing frozen blood clots for DNA extraction. The method was compared to two commercial clot protocols, based on a chemical kit and centrifugation through a plastic sieve, followed by the same DNA extraction protocol. Blood clots with different levels of parasitaemia (1-1,000 p/μl) were prepared from parasite cultures to assess the sensitivity of PCR detection. In addition, clots retrieved from serum samples collected within two epidemiological studies in Kenya (n = 630) were processed by high-speed shaking and analysed by PCR for detection of malaria parasites and the human α-thalassaemia gene. Results High-speed shaking succeeded in fully dispersing the clots, and the method generated the highest DNA yield. The level of PCR detection of P. falciparum parasites and the human thalassaemia gene was the same as for samples optimally collected with an anticoagulant. The commercial clot protocol and centrifugation through a sieve failed to fully dissolve the clots and resulted in lower sensitivity of PCR detection. Conclusions High-speed shaking was a simple and efficacious method for homogenizing frozen blood clots before DNA purification and resulted in PCR templates of high quality both from humans and malaria parasites. This novel method enables genetic studies from stored blood clots. PMID:21824391

  2. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring.

    PubMed

    Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros

    2017-11-22

    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to deciding how well the driver will be able to take over control of the vehicle. One limitation of the commonly used camera-based face tracking systems is that sufficient features of the face must be visible, which limits the detectable angle of head movement, and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation-sensor-based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while the error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurements of the shaking and nodding angles produced by the proposed system can effectively characterise drivers' behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.

  3. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products, despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction, developed with our collaborators, in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area not only by improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is built on the foundation of an expanding, openly available case-history database we are compiling, along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary-hazard products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.

  4. MODAL TRACKING of A Structural Device: A Subspace Identification Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J. V.; Franco, S. N.; Ruggiero, E. L.

    Mechanical devices operating in an environment contaminated by noise, uncertainties, and extraneous disturbances lead to low signal-to-noise ratios, creating an extremely challenging processing problem. To detect/classify a device subsystem from noisy data, it is necessary to identify unique signatures or particular features. An obvious feature would be the resonant (modal) frequencies emitted during normal operation. In this report, we discuss a model-based approach to incorporating these physical features into a dynamic structure that can be used for such an identification. The approach we take, after pre-processing the raw vibration data and removing any extraneous disturbances, is to obtain a representation of the structurally unknown device, along with its subsystems, that captures these salient features. One approach is to recognize that unique modal frequencies (sinusoidal lines) appear in the estimated power spectrum that are solely characteristic of the device under investigation. Therefore, the objective of this effort is to construct a black-box model of the device that captures the physical features that can be exploited to "diagnose" whether or not a particular device subsystem (track/detect/classify) is operating normally from noisy vibrational data. Here we discuss the application of a modern system identification approach based on stochastic subspace realization techniques capable of both (1) identifying the underlying black-box structure, thereby enabling the extraction of structural modes that can be used for analysis and modal tracking, and (2) providing indicators of condition and possible changes from normal operation.
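    The subspace-realization idea can be sketched minimally for a noise-free, single-channel free-decay record: build a block-Hankel matrix of the output, take an SVD to get an order-2 observability matrix, recover the state matrix from its shift invariance, and read the modal frequency and damping off its eigenvalues. This is a textbook ERA/SSI-style reconstruction with invented parameters, not the report's implementation.

```python
import numpy as np

# Simulated free vibration of a 1-DOF system: f_n = 5 Hz, zeta = 0.02
fs = 200.0
fn, zeta = 5.0, 0.02
wn = 2 * np.pi * fn
wd = wn * np.sqrt(1 - zeta**2)
t = np.arange(2000) / fs
y = np.exp(-zeta * wn * t) * np.cos(wd * t)

# Block-Hankel matrix of the output sequence
rows = 20
cols = len(y) - rows
H = np.array([y[i:i + cols] for i in range(rows)])

# SVD -> extended observability matrix (model order 2 for a single mode)
U, s, Vt = np.linalg.svd(H, full_matrices=False)
n = 2
Obs = U[:, :n] * np.sqrt(s[:n])

# Shift invariance: Obs[:-1] @ A = Obs[1:], solved by least squares
A, *_ = np.linalg.lstsq(Obs[:-1], Obs[1:], rcond=None)

# Discrete eigenvalues -> continuous poles -> modal frequency and damping
lam = np.linalg.eigvals(A)
mu = np.log(lam) * fs
freq = np.abs(mu[0]) / (2 * np.pi)
damp = -mu[0].real / np.abs(mu[0])
print(freq, damp)
```

    The recovered pole matches the simulated 5 Hz, 2% damped mode. The covariance-driven variants used in practice apply the same Hankel/SVD/shift-invariance machinery to output correlation sequences, which is what makes them usable on noisy operational data.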

  5. Participatory action inquiry using baccalaureate nursing students: The inclusion of integrative health care modalities in nursing core curriculum.

    PubMed

    Chan, Roxane Raffin; Schaffrath, Michelle

    2017-01-01

    Nurses, nursing educators, and students support the inclusion of integrative health care (IHC) in the nursing core curriculum as a way to create nurses who deliver nursing care to the full extent of their scope of practice and advance evidence-based IHC. Because of the holistic nature of IHC modalities, research to investigate appropriate teaching strategies and the potential efficacy of learning IHC in the baccalaureate core curriculum requires a holistic approach. Therefore, a phenomenological exploration using participatory action inquiry was conducted at a large Midwestern university. Eighteen first-year nursing students were selected as co-researchers. Their experiences in learning and delivering three 15-minute IHC interventions (foot reflexology, lavender aromatherapy, and mindful breathing) in an acute care setting were captured using reflexive journaling and participation in structured and organic communicative spaces. Of the patients approached, 67% (147/219) agreed to receive one or more IHC modalities. Using van Manen's model for holistic data reduction, three themes emerged: the experience of presence, competency, and unexpected results. Learning IHC modalities is best supported by a self-reflective process that is constructed and modeled by a nurse faculty member with experience in delivering IHC modalities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  7. Seismic Rehabilitation of RC Frames by Using Steel Panels

    NASA Astrophysics Data System (ADS)

    Mowrtage, Waiel

    2008-07-01

    Every major earthquake in Turkey causes a large number of buildings to suffer moderate damage due to poor construction. If a proper and fast retrofit is not applied, the aftershocks, which may come days or weeks after the main shock, can push a moderately damaged building into major damage or even total collapse. This paper presents a practical retrofit method for moderately damaged buildings that increases the seismic performance of the structural system by reducing the displacement demand. Fabricated steel panels are used for the retrofit. They are lightweight, easy to handle, and can be erected very quickly. Moreover, they are cheap and need neither formwork nor skilled workers. They can be designed to compensate for stiffness and strength degradation and to fit easily inside a moderately damaged reinforced concrete frame. To test the concept, a half-scale, single-story 3D reinforced concrete frame specimen was constructed at the shake-table laboratory of the Kandilli Observatory and Earthquake Research Institute of Bogazici University and subjected to recorded real earthquake base accelerations. The amplitudes of the base accelerations were increased until a moderate damage level was reached. The damaged RC frame was then retrofitted by means of steel panels and tested under the same earthquake. The seismic performance of the specimen before and after the retrofit was evaluated using FEMA 356 standards, and the results were compared in terms of stiffness, strength, and deformability. The results confirmed the effectiveness of the proposed retrofit scheme.

  8. Building a Communication, Education, and Outreach Program for the ShakeAlert National Earthquake Early Warning Program - Recommendations for Public Alerts Via Cell Phones

    NASA Astrophysics Data System (ADS)

    DeGroot, R. M.; Long, K.; Strauss, J. A.

    2017-12-01

    The United States Geological Survey (USGS) and its partners are developing the ShakeAlert Earthquake Early Warning System for the West Coast of the United States. To be an integral part of successful implementation, ShakeAlert engagement programs and materials must integrate with and leverage broader earthquake risk programs. New methods and products for dissemination must be multidisciplinary, cost-effective, and consistent with existing hazards education and communication efforts. The ShakeAlert Joint Committee for Communication, Education, and Outreach (JCCEO) is identifying, developing, and cultivating partnerships with ShakeAlert stakeholders, including Federal and State agencies, academic partners, private companies, policy makers, and local organizations. Efforts include developing materials and methods for delivery, and reaching stakeholders with information on ShakeAlert, earthquake preparedness, and emergency protective actions. It is essential to develop standards that ensure information communicated via the alerts is consistent across the public and private sectors and to achieve a common understanding of what actions users should take when they receive a ShakeAlert warning. In February 2017, the JCCEO convened the Warning Message Focus Group (WMFG) to provide findings and recommendations to the Alliance for Telecommunications Industry Solutions on the use of earthquake early warning message content standards for public alerts via cell phones. The WMFG represents communications, education, and outreach stakeholders from various sectors, including ShakeAlert regional coordinators, industry, emergency managers, and subject matter experts from the social sciences. The group's knowledge was combined with an in-depth literature review to ensure that all groups who could receive the message would be taken into account.
The USGS and the participating states and agencies acknowledge that the implementation of ShakeAlert is a collective effort requiring the participation of hundreds of stakeholders committed to ensuring public accessibility.

  9. The evaluation of sources of knowledge underlying different conceptual categories.

    PubMed

    Gainotti, Guido; Spinelli, Pietro; Scaricamazza, Eugenia; Marra, Camillo

    2013-01-01

    According to the "embodied cognition" theory and the "sensory-motor model of semantic knowledge": (a) concepts are represented in the brain in the same format in which they are constructed by the sensory-motor system and (b) various conceptual categories differ according to the weight of different kinds of information in their representation. In this study, we tried to check the second assumption by asking normal elderly subjects to subjectively evaluate the role of various perceptual, motor and language-mediated sources of knowledge in the construction of different semantic categories. Our first aim was to rate the influence of different sources of knowledge in the representation of animals, plant life and artifact categories, rather than in living and non-living beings, as many previous studies on this subject have done. We also tried to check the influence of age and stimulus modality on these evaluations of the "sources of knowledge" underlying different conceptual categories. The influence of age was checked by comparing results obtained in our group of elderly subjects with those obtained in a previous study, conducted with a similar methodology on a sample of young students. And the influence of stimulus modality was assessed by presenting the stimuli in the verbal modality to 50 subjects and in the pictorial modality to 50 other subjects. The distinction between "animals" and "plant life" in the "living" categories was confirmed by analyzing their prevalent sources of knowledge and by a cluster analysis, which allowed us to distinguish "plant life" items from animals. Furthermore, results of the study showed: (a) that our subjects considered the visual modality as the main source of knowledge for all categories taken into account; and (b) that in biological categories the next most important source of information was represented by other perceptual modalities, whereas in artifacts it was represented by the actions performed with them. 
Finally, age and stimulus modality did not significantly influence judgment of relevance of the sources of knowledge involved in the construction of different conceptual categories.

  10. Ground motion values for use in the seismic design of the Trans-Alaska Pipeline system

    USGS Publications Warehouse

    Page, Robert A.; Boore, D.M.; Joyner, W.B.; Coulter, H.W.

    1972-01-01

    The proposed trans-Alaska oil pipeline, which would traverse the state north to south from Prudhoe Bay on the Arctic coast to Valdez on Prince William Sound, will be subject to serious earthquake hazards over much of its length. To be acceptable from an environmental standpoint, the pipeline system is to be designed to minimize the potential of oil leakage resulting from seismic shaking, faulting, and seismically induced ground deformation. The design of the pipeline system must accommodate the effects of earthquakes with magnitudes ranging from 5.5 to 8.5 as specified in the 'Stipulations for Proposed Trans-Alaskan Pipeline System.' This report characterizes ground motions for the specified earthquakes in terms of peak levels of ground acceleration, velocity, and displacement and of duration of shaking. Published strong motion data from the Western United States are critically reviewed to determine the intensity and duration of shaking within several kilometers of the slipped fault. For magnitudes 5 and 6, for which sufficient near-fault records are available, the adopted ground motion values are based on data. For larger earthquakes the values are based on extrapolations from the data for smaller shocks, guided by simplified theoretical models of the faulting process.

  11. [A Case of Middle Cerebral Artery Stenosis Presented with Limb-Shaking TIA].

    PubMed

    Uno, Junji; Mineta, Haruyuki; Ren, Nice; Takagishi, Sou; Nagaoka, Shintarou; Kameda, Katsuharu; Maeda, Kazushi; Ikai, Yoshiaki; Gi, Hidefuku

    2016-07-01

    Involuntary movement is a rare clinical manifestation of transient ischemic attack (TIA); however, limb-shaking TIA is a well-described presentation of carotid occlusive disease. We present the case of a patient who developed limb-shaking TIA associated with high-grade stenosis of the middle cerebral artery (M1), which was treated with percutaneous transluminal angioplasty (PTA). The procedure was performed successfully without complication, and the symptoms disappeared immediately afterward. The patient remained free of symptoms at the 38-month follow-up, with no tendency toward restenosis of M1. In this case, PTA was technically feasible and beneficial for limb-shaking TIA with M1 stenosis. Limb-shaking TIA can be a symptom of high-grade stenosis of M1.

  12. Behavioral Response in the Immediate Aftermath of Shaking: Earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan

    PubMed Central

    Jon, Ihnji; Lindell, Michael K.; Prater, Carla S.; Huang, Shih-Kai; Wu, Hao-Che; Johnston, David M.; Becker, Julia S.; Shiroshita, Hideyuki; Doyle, Emma E.H.; Potter, Sally H.; McClure, John; Lambie, Emily

    2016-01-01

    This study examines people’s response actions in the first 30 min after shaking stopped following earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch, 332 respondents in Hitachi, and 204 respondents in Wellington revealed notable similarities in some response actions immediately after the shaking stopped. In all four events, people were most likely to contact family members and seek additional information about the situation. However, there were notable differences among events in the frequency of resuming previous activities. Actions taken in the first 30 min were weakly related to: demographic variables, earthquake experience, contextual variables, and actions taken during the shaking, but were significantly related to perceived shaking intensity, risk perception and affective responses to the shaking, and damage/infrastructure disruption. These results have important implications for future research and practice because they identify promising avenues for emergency managers to communicate seismic risks and appropriate responses to risk area populations. PMID:27854306

  13. Behavioral Response in the Immediate Aftermath of Shaking: Earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan.

    PubMed

    Jon, Ihnji; Lindell, Michael K; Prater, Carla S; Huang, Shih-Kai; Wu, Hao-Che; Johnston, David M; Becker, Julia S; Shiroshita, Hideyuki; Doyle, Emma E H; Potter, Sally H; McClure, John; Lambie, Emily

    2016-11-15

    This study examines people's response actions in the first 30 min after shaking stopped following earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch, 332 respondents in Hitachi, and 204 respondents in Wellington revealed notable similarities in some response actions immediately after the shaking stopped. In all four events, people were most likely to contact family members and seek additional information about the situation. However, there were notable differences among events in the frequency of resuming previous activities. Actions taken in the first 30 min were weakly related to: demographic variables, earthquake experience, contextual variables, and actions taken during the shaking, but were significantly related to perceived shaking intensity, risk perception and affective responses to the shaking, and damage/infrastructure disruption. These results have important implications for future research and practice because they identify promising avenues for emergency managers to communicate seismic risks and appropriate responses to risk area populations.

  14. Construction, characterization and application of molecular tools for metabolic engineering of Synechocystis sp.

    PubMed

    Qi, Fengxia; Yao, Lun; Tan, Xiaoming; Lu, Xuefeng

    2013-10-01

    An integrative gene expression system has been constructed for the directional assembly of biological components in Synechocystis sp. PCC 6803. We characterized 11 promoter parts with various expression efficiencies for genetic engineering of Synechocystis for the production of fatty alcohols. This was achieved by integrating several genetic modifications, including the expression of multiple copies of fatty acyl-CoA reductase (FAR) under the control of strong promoters, disruption of the competing pathways for poly-β-hydroxybutyrate and glycogen synthesis, and peptide truncation of the FAR. In shake-flask cultures, the production of fatty alcohols was significantly improved, with a yield of 761 ± 216 μg/g cell dry weight in Synechocystis, the highest reported to date.

  15. MyEEW: A Smartphone App for the ShakeAlert System

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Allen, S.; Allen, R. M.; Hellweg, M.

    2015-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The purpose of such a system is to reduce or minimize the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, the University of Washington, the USGS, and beta users in California and the Pacific Northwest. The UC Berkeley Seismological Laboratory has created a smartphone app called MyEEW, which interfaces with the ShakeAlert system to deliver early warnings to individual users. Many critical facilities (transportation, police, and fire) have control rooms that could run a centralized interface, but our ShakeAlert beta testers have also expressed a need for mobile options. This app augments the basic ShakeAlert Java desktop applet by allowing workers off-site (or merely out of hearing range) to be informed of coming hazards. MyEEW receives information from the ShakeAlert system to provide users with real-time information about shaking that is about to happen at their individual location. It includes a map, a timer, and earthquake information similar to the Java desktop User Display. The app will also feature educational material to help users craft their own response and resiliency strategies. The app will be open to UC Berkeley Earthquake Research Affiliates members for testing in the near future.

  16. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses because of potential earthquakes and also for preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM).
Alternatives to the NSHM scenario were developed for the Hilton Creek and Hartley Springs Faults to account for differing opinions on how far these two faults extend into Long Valley Caldera. For each scenario, ground motions were calculated using the current standard practice: the deterministic seismic hazard analysis program developed by Art Frankel of the USGS and three Next Generation Attenuation (NGA) ground-motion models. Ground motion calculations incorporated the potential amplification of seismic shaking by near-surface soils, defined by a map of the average shear wave velocity in the uppermost 30 m (VS30) developed by CGS. In addition to ground shaking and shaking-related ground failure such as liquefaction and earthquake-induced landslides, earthquakes cause surface rupture displacement, which can lead to severe damage of buildings and lifelines. For each earthquake scenario, potential surface fault displacements are estimated using deterministic and probabilistic approaches. Liquefaction occurs when saturated sediments lose their strength because of ground shaking. Zones of potential liquefaction are mapped by incorporating areas where loose sandy sediments, shallow groundwater, and strong earthquake shaking coincide in the earthquake scenario. The process for defining zones of potential landslide and rockfall incorporates rock strength, surface slope, and existing landslides, together with ground motions caused by the scenario earthquake. Each scenario is illustrated with maps of seismic shaking potential and of fault displacement, liquefaction, and landslide potential. Seismic shaking is depicted by the distribution of shaking intensity, peak ground acceleration, and 1.0-second spectral acceleration. One-second spectral acceleration correlates well with structural damage to surface facilities. Acceleration greater than 0.2 g is often associated with strong ground shaking and may cause moderate to heavy damage. 
The extent of strong shaking is influenced by subsurface fault dip and near-surface materials. Strong shaking is more widespread in the hanging-wall regions of a normal fault. Larger ground motions also occur where young alluvial sediments amplify the shaking. Both of these effects can lead to strong shaking that extends farther from the fault on the valley side than on the hill side. The effect of fault rupture displacements may be localized along the surface trace of the mapped earthquake fault if the fault geometry is simple and the fault traces are accurately located. However, surface displacement hazards can spread over a few hundred meters to a few kilometers if the earthquake fault has numerous splays or branches, such as the Hilton Creek Fault. Faulting displacements are estimated to be about 1 meter along normal faults in the study area and close to 2 meters along the White Mountains Fault Zone. All scenarios show the possibility of widespread ground failure. Liquefaction damage would likely occur in the areas of higher ground shaking near the faults where there are sandy or silty sediments and the depth to groundwater is 6.1 meters (20 feet) or less. Generally, this means damage is most common near lakes and streams in the areas of strongest shaking. Landslide potential exists throughout the study region. All steep slopes (>30 degrees) present a potential hazard at any level of shaking. Lesser slopes may have landslides within the areas of higher ground shaking. The landslide hazard zones are also likely sources of snow avalanches during winter months and of large boulders that can be shaken loose and roll hundreds of feet downhill, as happened during the 1980 Mammoth Lakes earthquakes. Whereas the methodologies used in estimating ground shaking, liquefaction, and landslides are well developed and have been applied in published hazard maps, the methodologies used in estimating surface fault displacement are still being developed. 
Therefore, this report provides a more in-depth and detailed discussion of methodologies used for deterministic and probabilistic fault displacement hazard analyses for this project.

  17. Imaging-based biomarkers of cognitive performance in older adults constructed via high-dimensional pattern regression applied to MRI and PET.

    PubMed

    Wang, Ying; Goh, Joshua O; Resnick, Susan M; Davatzikos, Christos

    2013-01-01

    In this study, we used high-dimensional pattern regression methods based on structural (gray and white matter; GM and WM) and functional (positron emission tomography of regional cerebral blood flow; PET) brain data to identify cross-sectional imaging biomarkers of cognitive performance in cognitively normal older adults from the Baltimore Longitudinal Study of Aging (BLSA). We focused on specific components of executive and memory domains known to decline with aging, including manipulation, semantic retrieval, long-term memory (LTM), and short-term memory (STM). For each imaging modality, brain regions associated with each cognitive domain were generated by adaptive regional clustering. A relevance vector machine was adopted to model the nonlinear continuous relationship between brain regions and cognitive performance, with cross-validation to select the most informative brain regions (using recursive feature elimination) as imaging biomarkers and to optimize model parameters. Predicted cognitive scores using our regression algorithm based on the resulting brain regions correlated well with actual performance. Also, regression models obtained using combined GM, WM, and PET imaging modalities outperformed models based on single modalities. Imaging biomarkers related to memory performance included the orbito-frontal and medial temporal cortical regions, with LTM showing stronger correlation with the temporal lobe than STM. Brain regions predicting executive performance included orbito-frontal and occipito-temporal areas. The PET modality contributed more to most cognitive domains except manipulation, which drew a higher WM contribution from the superior longitudinal fasciculus and the genu of the corpus callosum. These findings based on machine-learning methods demonstrate the importance of combining structural and functional imaging data in understanding complex cognitive mechanisms, and also their potential use as biomarkers that predict cognitive status.
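
    The recursive feature elimination step can be illustrated with a minimal sketch. This is not the paper's pipeline: ordinary least squares stands in for the relevance vector machine, and random numbers stand in for the regional imaging features; only the eliminate-the-weakest-feature loop is the point.

```python
import numpy as np

def rfe_linear(X, y, n_keep):
    """Recursive feature elimination with an OLS fit: repeatedly refit
    the model and drop the feature with the smallest-magnitude coefficient."""
    feats = list(range(X.shape[1]))
    while len(feats) > n_keep:
        coef, *_ = np.linalg.lstsq(X[:, feats], y, rcond=None)
        feats.pop(int(np.argmin(np.abs(coef))))   # discard weakest feature
    return sorted(feats)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))               # stand-ins for regional features
y = 3.0 * X[:, 1] - 2.0 * X[:, 4] + 0.1 * rng.standard_normal(200)
selected = rfe_linear(X, y, n_keep=2)           # retains the informative features
```

Because the synthetic features share a common scale, raw coefficient magnitudes are comparable; with real imaging features one would standardize each column first.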

  18. Using a Redox Modality to Connect Synthetic Biology to Electronics: Hydrogel-Based Chemo-Electro Signal Transduction for Molecular Communication.

    PubMed

    Liu, Yi; Tsao, Chen-Yu; Kim, Eunkyoung; Tschirhart, Tanya; Terrell, Jessica L; Bentley, William E; Payne, Gregory F

    2017-01-01

    A hydrogel-based dual film coating is electrofabricated for transducing bio-relevant chemical information into electronical output. The outer film has a synthetic biology construct that recognizes an external molecular signal and transduces this input into the expression of an enzyme that converts redox-inactive substrate into a redox-active intermediate, which is detected through an amplification mechanism of the inner redox-capacitor film. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Research on target tracking algorithm based on spatio-temporal context

    NASA Astrophysics Data System (ADS)

    Li, Baiping; Xu, Sanmei; Kang, Hongjuan

    2017-07-01

    In this paper, a novel target tracking algorithm based on spatio-temporal context is proposed. During tracking, camera shake or occlusion may cause tracking to fail; the proposed algorithm solves this problem effectively. The method uses the spatio-temporal context algorithm as its core. The target region in the first frame is selected with the mouse, and the spatio-temporal context algorithm is then used to track the target through the subsequent frames. During this process, a similarity measure function based on a perceptual hash algorithm is used to judge the tracking results. If tracking fails, the initial value of the Mean Shift algorithm is reset for subsequent target tracking. Experimental results show that the proposed algorithm achieves real-time, stable tracking under camera shake or target occlusion.
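
    The similarity check can be sketched with an average hash, one common perceptual-hash variant; the abstract does not specify which variant the authors used, so the 8×8 hash size and the acceptance threshold below are assumptions for illustration.

```python
import numpy as np

def average_hash(img, hash_size=8):
    """Average hash: block-average the image down to hash_size x hash_size,
    then threshold each cell at the global mean to get a bit string."""
    h, w = img.shape
    h_c, w_c = h - h % hash_size, w - w % hash_size   # crop to even blocks
    img = img[:h_c, :w_c].astype(float)
    blocks = img.reshape(hash_size, h_c // hash_size,
                         hash_size, w_c // hash_size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).ravel()

def hash_similarity(a, b):
    """Fraction of matching bits (1.0 = identical hashes)."""
    return float(np.mean(a == b))

rng = np.random.default_rng(0)
frame = rng.random((64, 64))
tracked = frame + 0.05                 # same content, slight illumination shift
unrelated = rng.random((64, 64))

s_ok = hash_similarity(average_hash(frame), average_hash(tracked))
s_bad = hash_similarity(average_hash(frame), average_hash(unrelated))
# a tracking result would be accepted when similarity exceeds a threshold, e.g. 0.9
```

Because the hash thresholds at the image's own mean, a uniform brightness change leaves the bits unchanged, which is what makes it a useful tracking-failure detector.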

  20. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records

    PubMed Central

    Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    Objective There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and methods We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176

  1. Revisiting Molecular Dynamics on a CPU/GPU system: Water Kernel and SHAKE Parallelization.

    PubMed

    Ruymgaart, A Peter; Elber, Ron

    2012-11-13

    We report Graphics Processing Unit (GPU) and OpenMP parallel implementations of water-specific force calculations and of bond constraints for use in Molecular Dynamics simulations. We focus on a typical laboratory computing environment in which a CPU with a few cores is attached to a GPU. We discuss the design of the code in detail and illustrate performance comparable to highly optimized codes such as GROMACS. Besides speed, our code shows excellent energy conservation. Utilization of water-specific lists allows the efficient calculation of non-bonded interactions that include water molecules and results in a speed-up factor of more than 40 on the GPU compared to code optimized on a single CPU core for systems larger than 20,000 atoms. This is up four-fold from the factor of 10 reported in our initial GPU implementation, which did not include water-specific code. Another optimization is the implementation of constrained dynamics entirely on the GPU. The routine, which enforces constraints on all bonds, runs in parallel on multiple OpenMP cores or entirely on the GPU. It is based on a Conjugate Gradient solution for the Lagrange multipliers (CG SHAKE). The GPU implementation is partially in double precision and requires no communication with the CPU during the execution of the SHAKE algorithm. The (parallel) implementation of SHAKE allows an increase of the time step to 2.0 fs while maintaining excellent energy conservation. Interestingly, CG SHAKE is faster than the usual bond-relaxation algorithm even on a single core if high accuracy is required. The significant speedup of the optimized components transfers the computational bottleneck of the MD calculation to the reciprocal part of Particle Mesh Ewald (PME).
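
    To make the constraint step concrete, here is a minimal NumPy sketch of the classic serial SHAKE iteration for bond-length constraints. This is not the paper's CG SHAKE, which solves for all Lagrange multipliers simultaneously by conjugate gradient; it only illustrates the underlying constraint equations that any SHAKE variant enforces.

```python
import numpy as np

def shake_bond(r_new, r_old, pair, d0, inv_m, tol=1e-10, max_iter=100):
    """Classic iterative SHAKE for bond-length constraints.
    r_new : positions after an unconstrained integration step
    r_old : previous positions (constraints satisfied)
    pair  : (n_c, 2) atom indices per constraint
    d0    : (n_c,) target bond lengths
    inv_m : (n,) inverse masses
    """
    r = r_new.copy()
    for _ in range(max_iter):
        converged = True
        for k, (i, j) in enumerate(pair):
            s = r[i] - r[j]
            diff = s @ s - d0[k] ** 2          # constraint violation
            if abs(diff) > tol:
                converged = False
                s_old = r_old[i] - r_old[j]
                # Lagrange multiplier from linearizing the constraint
                g = diff / (2.0 * (inv_m[i] + inv_m[j]) * (s @ s_old))
                # mass-weighted corrections along the old bond direction
                r[i] -= g * inv_m[i] * s_old
                r[j] += g * inv_m[j] * s_old
        if converged:
            return r
    raise RuntimeError("SHAKE did not converge")

# example: restore a unit-length bond after a perturbing step
r_old = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
r_new = np.array([[0.0, 0.02, 0.0], [1.08, -0.03, 0.0]])
r_fixed = shake_bond(r_new, r_old, np.array([[0, 1]]),
                     np.array([1.0]), np.array([1.0, 1.0]))
```

Because the corrections are mass-weighted and directed along the old bond vector, linear momentum is conserved by construction.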

  2. A new physical method to assess handle properties of fabrics made from wood-based fibers

    NASA Astrophysics Data System (ADS)

    Abu-Rous, M.; Liftinger, E.; Innerlohinger, J.; Malengier, B.; Vasile, S.

    2017-10-01

    In this work, the handfeel of fabrics made of wood-based fibers such as viscose, modal, and Lyocell was investigated relative to cotton fabrics, applying the Tissue Softness Analyzer (TSA) method in comparison to other classical methods. Two different construction groups of textiles were investigated, and the validity of the TSA in assessing the textile softness of these constructions was tested. TSA results were compared to human hand evaluation as well as to classical physical measurements such as drape coefficient, ring pull-through, and Handle-o-meter, and to a newer device, the Fabric Touch Tester (FTT). The physical methods and the human hand assessments mostly agreed on the softest and smoothest range, but showed different rankings for the harder and rougher fabrics. The TSA ranking of softness and smoothness corresponded to the rankings of the other physical methods as well as to human hand feel for the basic textile constructions.

  3. Y-junctions based on circular depressed-cladding waveguides fabricated with femtosecond pulses in Nd:YAG crystal: A route to integrate complex photonic circuits in crystals

    NASA Astrophysics Data System (ADS)

    Ajates, Javier G.; Romero, Carolina; Castillo, Gabriel R.; Chen, Feng; Vázquez de Aldana, Javier R.

    2017-10-01

    We have designed and fabricated photonic structures such as Y-junctions (one of the basic building blocks for constructing integrated photonic devices) and Mach-Zehnder interferometers, based on circular depressed-cladding waveguides written by direct femtosecond laser irradiation in Nd:YAG crystal. The waveguides were optically characterized at 633 nm, showing nearly mono-modal behaviour for the selected waveguide radius (9 μm). The effect of the splitting angle in the Y structures was investigated, finding good preservation of the modal profiles up to more than 2°, with 1 dB of additional losses in comparison with straight waveguides. The polarization dependence of these splitters remains at a reasonably low level. Our designs pave the way for the fabrication of arbitrarily complex 3D photonic circuits in crystals with cladding waveguides.

  4. Frequency-dependent seismic attenuation in the eastern United States as observed from the 2011 central Virginia earthquake and aftershock sequence

    USGS Publications Warehouse

    McNamara, Daniel E.; Gee, Lind; Benz, Harley M.; Chapman, Martin

    2014-01-01

    Ground shaking due to earthquakes in the eastern United States (EUS) is felt at significantly greater distances than in the western United States (WUS) and for some earthquakes it has been shown to display a strong preferential direction. Shaking intensity variation can be due to propagation path effects, source directivity, and/or site amplification. In this paper, we use S and Lg waves recorded from the 2011 central Virginia earthquake and aftershock sequence, in the Central Virginia Seismic Zone, to quantify attenuation as frequency‐dependent Q(f). In support of observations based on shaking intensity, we observe high Q values in the EUS relative to previous studies in the WUS with especially efficient propagation along the structural trend of the Appalachian mountains. Our analysis of Q(f) quantifies the path effects of the northeast‐trending felt distribution previously inferred from the U.S. Geological Survey (USGS) “Did You Feel It” data, historic intensity data, and the asymmetrical distribution of rockfalls and landslides.
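
    The kind of Q(f) estimate described above can be sketched in a few lines. The numbers below are synthetic; the geometrical-spreading exponent and shear velocity are illustrative assumptions, not values from the paper. The model is A(r, f) = A0(f) · r^(−γ) · exp(−π·f·r / (Q(f)·β)), so after correcting for geometrical spreading, ln A is linear in distance and Q falls out of the slope.

```python
import numpy as np

def estimate_q(freq, dist_km, amps, beta=3.5, gamma=0.5):
    """Recover Q at one frequency from amplitudes at several distances,
    assuming A(r) = A0 * r**(-gamma) * exp(-pi * freq * r / (Q * beta)).
    Correcting for spreading leaves ln A linear in r with slope
    -pi * freq / (Q * beta)."""
    y = np.log(amps) + gamma * np.log(dist_km)
    slope, _ = np.polyfit(dist_km, y, 1)
    return -np.pi * freq / (slope * beta)

# synthetic amplitudes generated with Q = 800 at 1 Hz (high-Q, EUS-like crust)
freq, q_true, beta, gamma = 1.0, 800.0, 3.5, 0.5
r = np.array([50.0, 100.0, 150.0, 200.0, 300.0, 400.0])
amps = r ** (-gamma) * np.exp(-np.pi * freq * r / (q_true * beta))
q_est = estimate_q(freq, r, amps)
```

Repeating the fit across frequency bands gives the frequency dependence Q(f), often summarized as a power law Q0·f^η.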

  5. Installation, care, and maintenance of wood shake and shingle siding

    Treesearch

    Jack Dwyer; Tony Bonura; Arnie Nebelsick; Sam Williams; Christopher G. Hunt

    2011-01-01

    This article gives general guidelines for selection, installation, finishing, and maintenance of wood shakes and shingles. The authors gathered information from a variety of sources: research publications on wood finishing, technical data sheets from paint manufacturers, installation instructions for shake and shingle siding, and interviews with experts having...

  6. Response of Global Navigation Satellite System receivers to known shaking between 0.2 and 20 Hertz

    USGS Publications Warehouse

    Langbein, John; Evans, John R.; Blume, Fredrick; Johanson, Ingrid

    2014-01-01

    Similar to Wang and others (2012), we also examined the GPS displacement records using standard spectral techniques. However, we extended their work by evaluating several models of GNSS receivers using a variety of input frequencies. Because our shake table was limited in acceleration and displacement, we did not attempt to duplicate the strong shaking associated with large-magnitude earthquakes. However, because our shake table could measure the table displacement, we could directly compare the measured GPS displacements with the true displacements.
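
    A toy version of such a spectral comparison looks like this; the signals are entirely synthetic and the 0.8 amplitude ratio is invented for illustration. The receiver's displacement response at the drive frequency is the ratio of FFT amplitudes of the recorded and true table motions.

```python
import numpy as np

fs, f0, dur = 10.0, 0.5, 200.0          # sample rate (Hz), drive freq (Hz), seconds
t = np.arange(0, dur, 1 / fs)
true_disp = 5.0 * np.sin(2 * np.pi * f0 * t)   # measured table displacement, mm
meas_disp = 4.0 * np.sin(2 * np.pi * f0 * t)   # hypothetical GNSS-recorded motion

def amp_at(x, f, fs):
    """Single-sided FFT amplitude at frequency f."""
    spec = np.abs(np.fft.rfft(x)) * 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return spec[np.argmin(np.abs(freqs - f))]

response = amp_at(meas_disp, f0, fs) / amp_at(true_disp, f0, fs)
```

Sweeping f0 across the 0.2-20 Hz band and repeating the ratio yields the receiver's frequency-response curve.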

  7. Test and evaluation of the attic temperature reduction potential of plastic roof shakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holton, J.K.; Beggs, T.R.

    1999-07-01

    While monitoring the comparative performance of two test houses in Pittsburgh, Pennsylvania, it was noticed that the attic air temperature of one house with a plastic shake roof was consistently 20 F (11 C) cooler than its twin with asphalt shingles during peak summer cooling periods. More detailed monitoring of the temperatures on the plastic shake, the roof deck, and the attic showed this effect to be largely due to the plastic shake and not to better roof venting or other heat loss mechanisms.

  8. Biphasic Finite Element Modeling Reconciles Mechanical Properties of Tissue-Engineered Cartilage Constructs Across Testing Platforms.

    PubMed

    Meloni, Gregory R; Fisher, Matthew B; Stoeckl, Brendan D; Dodge, George R; Mauck, Robert L

    2017-07-01

    Cartilage tissue engineering is emerging as a promising treatment for osteoarthritis, and the field has progressed toward utilizing large animal models for proof of concept and preclinical studies. Mechanical testing of the regenerative tissue is an essential outcome for functional evaluation. However, testing modalities and constitutive frameworks used to evaluate in vitro grown samples differ substantially from those used to evaluate in vivo derived samples. To address this, we developed finite element (FE) models (using FEBio) of unconfined compression and indentation testing, modalities commonly used for such samples. We determined the model sensitivity to tissue radius and subchondral bone modulus, as well as its ability to estimate material parameters using the built-in parameter optimization tool in FEBio. We then sequentially tested agarose gels of 4%, 6%, 8%, and 10% weight/weight using a custom indentation platform, followed by unconfined compression. Similarly, we evaluated the ability of the model to generate material parameters for living constructs by evaluating engineered cartilage. Juvenile bovine mesenchymal stem cells were seeded (2 × 10⁷ cells/mL) in 1% weight/volume hyaluronic acid hydrogels and cultured in a chondrogenic medium for 3, 6, and 9 weeks. Samples were planed and tested sequentially in indentation and unconfined compression. The model successfully completed parameter optimization routines for each testing modality for both acellular and cell-based constructs. Traditional outcome measures and the FE-derived outcomes showed significant changes in material properties during the maturation of engineered cartilage tissue, capturing dynamic changes in functional tissue mechanics. These outcomes were significantly correlated with one another, establishing this FE modeling approach as a singular method for the evaluation of functional engineered and native tissue regeneration, both in vitro and in vivo.

  9. A modal radar cross section of thin-wire targets via the singularity expansion method

    NASA Technical Reports Server (NTRS)

    Richards, M. A.; Shumpert, T. H.; Riggs, L. S.

    1992-01-01

    A modal radar cross section (RCS) of arbitrary wire scatterers is constructed in terms of SEM parameters. Numerical results are presented for both straight and L-shaped wire targets and are compared to computations performed in the frequency domain using the method of moments.

  10. An Evaluation of Multimodal Interactions with Technology while Learning Science Concepts

    ERIC Educational Resources Information Center

    Anastopoulou, Stamatina; Sharples, Mike; Baber, Chris

    2011-01-01

    This paper explores the value of employing multiple modalities to facilitate science learning with technology. In particular, it is argued that when multiple modalities are employed, learners construct strong relations between physical movement and visual representations of motion. Body interactions with visual representations, enabled by…

  11. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    NASA Astrophysics Data System (ADS)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. 
Interviews reveal varying degrees of shifts away from slippage in both modalities, with students who built models not incorporating slippage explanations in responses. Study 3 compares these modalities with a control using traditional activities. Pre and posttests reveal that the two modalities manifested greater facility with accessing and assembling rules than the control. The dissertation offers implications for the design of learning environments for evolutionary change, design of the two modalities based on their strengths and weaknesses, and teacher training for the same.

  12. Vibration control of rotor shaft

    NASA Technical Reports Server (NTRS)

    Nonami, K.

    1985-01-01

    The suppression of flexural forced vibration and of self-excited vibration in a rotating shaft system by active rather than passive elements is described. The distinctive feature of this method is not to dissipate the vibration energy but to apply, through the bearing housing during rotation, a force cancelling the vibration displacement and the vibration velocity. Bearings of this kind are therefore appropriately named Active Control Bearings. A simple rotor system with one disk at the center of the span on flexible supports is investigated in this paper. The actuators of the electrodynamic transducer are inserted in the sections of the bearing housing. First, applying the optimal regulator of optimal control theory, flexural vibration control of the rotating shaft and vibration control of the support system are performed by optimal state feedback using these actuators. Next, quasi-modal control based on a modal analysis is applied to this rotor system. The quasi-modal control system is constructed by means of optimal velocity feedback loops. The differences between optimal control and quasi-modal control are discussed, and their merits and demerits are made clear. Finally, experiments concerning only the optimal regulator method are described.
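
    The effect of a velocity feedback loop of the kind used in the quasi-modal scheme can be illustrated on a single-mode model; all parameter values below are invented for illustration. Feeding back a force proportional to −ẋ through the bearing simply adds damping, which shrinks the free-response envelope.

```python
import numpy as np

# single vibration mode: m*x'' + c*x' + k*x = f_control, with f_control = -g*x'
m, k, c = 1.0, 100.0, 0.2        # mass, stiffness, (light) passive damping
g = 4.0                          # velocity feedback gain, hypothetical
dt, n = 1e-3, 20000              # semi-implicit Euler integration, 20 s

def late_peak(gain):
    """Peak |x| over the second half of a free response from x(0)=1."""
    x, v, peak = 1.0, 0.0, 0.0
    for i in range(n):
        a = (-k * x - c * v - gain * v) / m
        v += a * dt
        x += v * dt
        if i > n // 2:
            peak = max(peak, abs(x))
    return peak

open_loop = late_peak(0.0)
closed_loop = late_peak(g)       # added damping suppresses the late-time motion
```

The real system is multi-modal, so the quasi-modal controller applies such loops in modal coordinates, but the damping mechanism per mode is the same.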

  13. Semi-Supervised Tripled Dictionary Learning for Standard-dose PET Image Prediction using Low-dose PET and Multimodal MRI

    PubMed Central

    Wang, Yan; Ma, Guangkai; An, Le; Shi, Feng; Zhang, Pei; Lalush, David S.; Wu, Xi; Pu, Yifei; Zhou, Jiliu; Shen, Dinggang

    2017-01-01

    Objective To obtain high-quality positron emission tomography (PET) image with low-dose tracer injection, this study attempts to predict the standard-dose PET (S-PET) image from both its low-dose PET (L-PET) counterpart and corresponding magnetic resonance imaging (MRI). Methods It was achieved by patch-based sparse representation (SR), using the training samples with a complete set of MRI, L-PET and S-PET modalities for dictionary construction. However, the number of training samples with complete modalities is often limited. In practice, many samples generally have incomplete modalities (i.e., with one or two missing modalities) that thus cannot be used in the prediction process. In light of this, we develop a semi-supervised tripled dictionary learning (SSTDL) method for S-PET image prediction, which can utilize not only the samples with complete modalities (called complete samples) but also the samples with incomplete modalities (called incomplete samples), to take advantage of the large number of available training samples and thus further improve the prediction performance. Results Validation was done on a real human brain dataset consisting of 18 subjects, and the results show that our method is superior to the SR and other baseline methods. Conclusion This work proposed a new S-PET prediction method, which can significantly improve the PET image quality with low-dose injection. Significance The proposed method is favorable in clinical application since it can decrease the potential radiation risk for patients. PMID:27187939
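
    The coupled-dictionary idea at the core of patch-based sparse representation can be sketched as follows. This is not the paper's semi-supervised tripled dictionary learning: plain orthogonal matching pursuit on synthetic random dictionaries stands in for it, only to show how a sparse code computed against an L-PET/MRI dictionary is reused with a paired S-PET dictionary to synthesize the target patch.

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal matching pursuit: find a sparse x with D @ x ≈ y."""
    residual = y.astype(float).copy()
    support, coef = [], np.array([])
    for _ in range(n_nonzero):
        k = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        if k not in support:
            support.append(k)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n_feat, n_patch, n_atoms = 30, 16, 50
D_lpet = rng.standard_normal((n_feat, n_atoms))      # L-PET/MRI feature dictionary
D_lpet /= np.linalg.norm(D_lpet, axis=0)
D_spet = rng.standard_normal((n_patch, n_atoms))     # paired S-PET patch dictionary

x_true = np.zeros(n_atoms)
x_true[[3, 17, 41]] = [1.5, -2.0, 1.0]               # synthetic sparse code
y = D_lpet @ x_true                                  # observed input features

x = omp(D_lpet, y, n_nonzero=5)
s_patch = D_spet @ x          # predicted standard-dose patch via the shared code
```

In the actual method the two dictionaries are learned jointly so that paired patches share the same code; here they are random, so only the mechanics of the prediction step are shown.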

  14. A Locally Modal B-Spline Based Full-Vector Finite-Element Method with PML for Nonlinear and Lossy Plasmonic Waveguide

    NASA Astrophysics Data System (ADS)

    Karimi, Hossein; Nikmehr, Saeid; Khodapanah, Ehsan

    2016-09-01

    In this paper, we develop a B-spline finite-element method (FEM) based on locally modal wave propagation with anisotropic perfectly matched layers (PMLs), for the first time, to simulate nonlinear and lossy plasmonic waveguides. Conventional approaches such as the beam propagation method inherently omit the wave spectrum and do not provide physical insight into nonlinear modes, especially in plasmonic applications, where nonlinear modes are constructed from linear modes with very close propagation constants. Our locally modal B-spline finite-element method (LMBS-FEM) does not suffer from this weakness of the conventional approaches. To validate our method, wave propagation in metal-insulator plasmonic structures with various linear, nonlinear, lossless, and lossy materials is first simulated using LMBS-FEM in MATLAB, and comparisons are made with the FEM-BPM module of the COMSOL Multiphysics simulator and with the B-spline finite-element finite-difference wide-angle beam propagation method (BSFEFD-WABPM). The comparisons show that our numerical approach is not only more accurate and computationally efficient than conventional approaches but also provides physical insight into the nonlinear nature of the propagation modes.

  15. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    The December 22, 2003, San Simeon, California, (M6.5) earthquake caused damage to houses, road surfaces, and underground utilities in Oceano, California. The community of Oceano is approximately 50 miles (80 km) from the earthquake epicenter. Damage at this distance from an M6.5 earthquake is unusual. To understand the causes of this damage, the U.S. Geological Survey conducted extensive subsurface exploration and monitoring of aftershocks in the months after the earthquake. The investigation included 37 seismic cone penetration tests, 5 soil borings, and aftershock monitoring from January 28 to March 7, 2004. The USGS investigation identified two earthquake hazards in Oceano that explain the San Simeon earthquake damage: site amplification and liquefaction. Site amplification is a phenomenon observed in many earthquakes where the strength of the shaking increases abnormally in areas where the seismic-wave velocity of shallow geologic layers is low. As a result, earthquake shaking is felt more strongly than in surrounding areas without similar geologic conditions. Site amplification in Oceano is indicated by the physical properties of the geologic layers beneath Oceano and was confirmed by monitoring aftershocks. Liquefaction, which is also commonly observed during earthquakes, is a phenomenon where saturated sands lose their strength during an earthquake and become fluid-like and mobile. As a result, the ground may undergo large permanent displacements that can damage underground utilities and well-built surface structures. The type of displacement of major concern associated with liquefaction is lateral spreading because it involves displacement of large blocks of ground down gentle slopes or towards stream channels. The USGS investigation indicates that the shallow geologic units beneath Oceano are very susceptible to liquefaction. They include young sand dunes and clean sandy artificial fill that was used to bury and convert marshes into developable lots. 
Most of the 2003 damage was caused by lateral spreading in two separate areas, one near Norswing Drive and the other near Juanita Avenue. The areas coincided with areas with the highest liquefaction potential found in Oceano. Areas with site amplification conditions similar to those in Oceano are particularly vulnerable to earthquakes. Site amplification may cause shaking from distant earthquakes, which normally would not cause damage, to increase locally to damaging levels. The vulnerability in Oceano is compounded by the widespread distribution of highly liquefiable soils that will reliquefy when ground shaking is amplified as it was during the San Simeon earthquake. The experience in Oceano can be expected to repeat because the region has many active faults capable of generating large earthquakes. In addition, liquefaction and lateral spreading will be more extensive for moderate-size earthquakes that are closer to Oceano than was the 2003 San Simeon earthquake. Site amplification and liquefaction can be mitigated. Shaking is typically mitigated in California by adopting and enforcing up-to-date building codes. Although not a guarantee of safety, application of these codes ensures that the best practice is used in construction. Building codes, however, do not always require the upgrading of older structures to new code requirements. Consequently, many older structures may not be as resistant to earthquake shaking as new ones. For older structures, retrofitting is required to bring them up to code. Seismic provisions in codes also generally do not apply to nonstructural elements such as drywall, heating systems, and shelving. Frequently, nonstructural damage dominates the earthquake loss. Mitigation of potential liquefaction in Oceano presently is voluntary for existing buildings, but required by San Luis Obispo County for new construction. Multiple mitigation procedures are available to individual property owners. These procedures typically involve either

  16. Regional patterns of earthquake-triggered landslides and their relation to ground motion

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Hovius, Niels; Haines, A. John

    2007-10-01

    We have documented patterns of landsliding associated with large earthquakes on three thrust faults: the Northridge earthquake in California, the Chi-Chi earthquake in Taiwan, and two earthquakes on the Ramu-Markham fault bounding the Finisterre Mountains of Papua New Guinea. In each case, landslide densities are shown to be greatest in the area of strongest ground acceleration and to decay with distance from the epicenter. In California and Taiwan, the density of co-seismic landslides is linearly and highly correlated with both the vertical and horizontal components of measured peak ground acceleration. Based on this observation, we derive an expression for the spatial variation of landslide density analogous to regional seismic attenuation laws. In its general form, this expression applies to our three examples, and we determine best-fit values for individual cases. Our findings open a window on the construction of shake maps from geomorphic observations for earthquakes in non-instrumented regions.
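The reported linear correlation between co-seismic landslide density and peak ground acceleration can be sketched as an ordinary least-squares fit. The PGA and density values below are hypothetical placeholders, not data from the three study areas:

```python
import numpy as np

# Hypothetical zone-averaged values: peak ground acceleration (g) and
# mapped landslide density (landslides per km^2). Not the paper's data.
pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
density = np.array([0.5, 1.1, 1.4, 2.1, 2.4, 3.0])

# Least-squares fit of the linear density-PGA relation described above,
# plus the correlation coefficient quantifying how linear the trend is.
slope, intercept = np.polyfit(pga, density, 1)
r = np.corrcoef(pga, density)[0, 1]
print(f"density = {slope:.2f} * PGA + {intercept:.2f}  (r = {r:.3f})")
```

With real data, the fitted relation could be inverted to estimate PGA (and hence a shake map) from mapped landslide densities, as the abstract suggests.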

  17. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.

  18. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    USGS Publications Warehouse

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. 
All of these goals were met. The final registration at www.shakeout.org for the 2008 ShakeOut was 5.47 million people, or one-quarter of the population of the region. A survey conducted with the registered participants showed that the messages they took from the ShakeOut were the concepts intended, including the importance of “Drop, Cover, Hold On”, the interdependency of earthquake risk (“We are all in this together”) and the possibility of reducing losses through preparation and mitigation. Sales data from the Home Depot hardware stores in southern California showed a 260% increase in the sale of earthquake safety products during the month of the ShakeOut, November 2008.

  19. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. 
This scatter is due to the variability of earthquake recurrence, and it decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such a misfit arises by chance or reflects a bias in the map. More specifically, for a broad range of Gutenberg-Richter a-values we determined theoretical confidence intervals on the allowed misfits in fractional exceedance, and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we have only one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
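The Poisson prediction p = 1 - exp(-t/T) can be checked with a minimal simulation. This sketch treats sites as independent, unlike the spatially correlated earthquake histories in the study, so it reproduces only the mean fractional exceedance, not the scatter discussed above:

```python
import numpy as np

rng = np.random.default_rng(42)
T = 475.0         # hazard-map return period (years)
t = 50.0          # observation window (years)
n_sites = 100_000

# Under the Poisson model, exceedances of the map's shaking level at each
# site occur at rate 1/T; count sites with at least one exceedance in t years.
counts = rng.poisson(t / T, size=n_sites)
frac_exceed = (counts > 0).mean()

predicted = 1.0 - np.exp(-t / T)
print(f"simulated {frac_exceed:.4f} vs predicted {predicted:.4f}")
```

For t = 50 years against a 475-year map, both values come out near 0.10, i.e. about 10% of sites are expected to see shaking above the map level even when the map is unbiased.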

  20. Finite-Fault and Other New Capabilities of CISN ShakeAlert

    NASA Astrophysics Data System (ADS)

    Boese, M.; Felizardo, C.; Heaton, T. H.; Hudnut, K. W.; Hauksson, E.

    2013-12-01

    Over the past 6 years, scientists at Caltech, UC Berkeley, the Univ. of Southern California, the Univ. of Washington, the US Geological Survey, and ETH Zurich (Switzerland) have developed the 'ShakeAlert' earthquake early warning demonstration system for California and the Pacific Northwest. We have now started to transform this system into a stable end-to-end production system that will be integrated into the daily routine operations of the CISN and PNSN networks. To quickly determine the earthquake magnitude and location, ShakeAlert currently processes and interprets real-time data streams from several hundred seismic stations within the California Integrated Seismic Network (CISN) and the Pacific Northwest Seismic Network (PNSN). Based on these parameters, the 'UserDisplay' software predicts and displays the arrival and intensity of shaking at a given user site. Real-time ShakeAlert feeds are currently being shared with around 160 individuals, companies, and emergency response organizations to gather feedback about system performance, to educate potential users about EEW, and to identify needs and applications of EEW in a future operational warning system. To improve performance during large earthquakes (M>6.5), we have started to develop, implement, and test a number of new algorithms for the ShakeAlert system: the 'FinDer' (Finite Fault Rupture Detector) algorithm provides real-time estimates of the locations and extents of finite-fault ruptures from high-frequency seismic data. The 'GPSlip' algorithm estimates the fault slip along these ruptures using high-rate real-time GPS data. And, third, a new type of ground-motion prediction model, derived from over 415,000 rupture simulations along active faults in southern California, improves MMI intensity predictions for large earthquakes by accounting for finite-fault, rupture directivity, and basin response effects. 
FinDer and GPSlip are currently being tested, both in real time and offline, in a separate internal ShakeAlert installation at Caltech. Real-time position and displacement time series from around 100 GPS sensors are obtained in JSON format from RTK/PPP(AR) solutions using the RTNet software at USGS Pasadena. However, we have also started to investigate the use of onsite (in-receiver) processing using NetR9 with RTX and tracebuf2 output format. A number of changes to the ShakeAlert processing, the XML message format, and the usage of this information in the UserDisplay software were necessary to handle the new finite-fault and slip information from the FinDer and GPSlip algorithms. In addition, we have developed a framework for end-to-end offline testing with archived and simulated waveform data using the Earthworm tankplayer. Detailed background information about the algorithms, processing, and results from these test runs will be presented.

  1. pH-metric solubility. 2: correlation between the acid-base titration and the saturation shake-flask solubility-pH methods.

    PubMed

    Avdeef, A; Berger, C M; Brownell, C

    2000-01-01

    The objective of this study was to compare the results of a normal saturation shake-flask method to a new potentiometric acid-base titration method for determining the intrinsic solubility and the solubility-pH profiles of ionizable molecules, and to report the solubility constants determined by the latter technique. The solubility-pH profiles of twelve generic drugs (atenolol, diclofenac.Na, famotidine, flurbiprofen, furosemide, hydrochlorothiazide, ibuprofen, ketoprofen, labetolol.HCl, naproxen, phenytoin, and propranolol.HCl), with solubilities spanning over six orders of magnitude, were determined both by the new pH-metric method and by a traditional approach (24 hr shaking of saturated solutions, followed by filtration, then HPLC assaying with UV detection). The 212 separate saturation shake-flask solubility measurements and those derived from 65 potentiometric titrations agreed well. The analysis produced the correlation equation: log(1/S)titration = -0.063(+/- 0.032) + 1.025(+/- 0.011) log(1/S)shake-flask, s = 0.20, r2 = 0.978. The potentiometrically-derived intrinsic solubilities of the drugs were: atenolol 13.5 mg/mL, diclofenac.Na 0.82 microg/mL, famotidine 1.1 mg/mL, flurbiprofen 10.6 microg/mL, furosemide 5.9 microg/mL, hydrochlorothiazide 0.70 mg/mL, ibuprofen 49 microg/mL, ketoprofen 118 microg/mL, labetolol.HCl 128 microg/mL, naproxen 14 microg/mL, phenytoin 19 microg/mL, and propranolol.HCl 70 microg/mL. The new potentiometric method was shown to be reliable for determining the solubility-pH profiles of uncharged ionizable drug substances. Its speed compared to conventional equilibrium measurements, its sound theoretical basis, its ability to generate the full solubility-pH profile from a single titration, and its dynamic range (currently estimated to be seven orders of magnitude) make the new pH-metric method an attractive addition to traditional approaches used by preformulation and development scientists. 
It may be useful even to discovery scientists in critical decision situations (such as calibrating computational prediction methods).
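The reported correlation can be applied directly to map a shake-flask measurement onto the value the titration method would be expected to report. A minimal sketch using the published coefficients (the function name and example input are illustrative):

```python
import math

# Coefficients of the correlation reported above:
# log(1/S)_titration = -0.063 + 1.025 * log(1/S)_shake-flask
INTERCEPT, SLOPE = -0.063, 1.025

def predicted_titration_solubility(s_shake_flask):
    """Map a shake-flask intrinsic solubility (mol/L) to the solubility
    the pH-metric titration method would be expected to report (mol/L)."""
    log_inv_s = INTERCEPT + SLOPE * math.log10(1.0 / s_shake_flask)
    return 10.0 ** (-log_inv_s)

# A 1e-4 M shake-flask result maps to roughly 9.2e-5 M.
print(predicted_titration_solubility(1e-4))
```

The near-unit slope (1.025) and small intercept are what justify the abstract's conclusion that the two methods agree well across six orders of magnitude.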

  2. Local Tsunami Warnings using GNSS and Seismic Data.

    NASA Astrophysics Data System (ADS)

    Hirshorn, B. F.

    2017-12-01

    Tsunami Warning Centers (TWCs) must issue warnings based on imperfect and limited data. Uncertainties increase in the near field, where a tsunami reaches the closest coastal populations to the causative earthquake in a half hour or less. In the absence of a warning, the usual advice is "When the ground shakes so severely that it's difficult to stand, move uphill and away from the coast." But what if the shaking is not severe? If, for example, the earthquake ruptures slowly (producing very little perceived shaking), this advice will fail. Unfortunately these "tsunami earthquakes" are not rare: tsunamis from slow earthquakes off of Nicaragua in 1992, and Java in 1994 and 2006, killed 179, 250, and 637 people, respectively, even though very few nearby coastal residents felt any strong ground shaking. TWCs must therefore warn the coastal populations closest to the causative earthquake, where over 80% of tsunami casualties typically occur, as soon as possible after earthquake rupture begins. The NWS Tsunami Warning Centers currently issue local tsunami warnings for the US West Coast, Hawaii, and the Puerto Rico - Virgin Islands region within 2-4 minutes after origin time. However, our initial short-period magnitude estimates saturate above about Mw 6.5, and Mwp underestimates Mw for events larger than about Mw 7.5 when using data in the 0 to 3 degree epicentral distance range, severely underestimating the danger of a potential tsunami in the near field. Coastal GNSS networks complement seismic monitoring networks and enable unsaturated estimates of Mw within 2-3 minutes of earthquake origin time. NASA/JPL, SIO, USGS, CWU, UCB and UW, with funding and guidance from NASA, and leveraging the USGS-funded ShakeAlert development, have been working with the National Weather Service TWCs to incorporate real-time GNSS and seismogeodetic data into their operations. 
These data will soon provide unsaturated estimates of moment magnitude, Centroid Moment Tensor solutions, coseismic crustal deformation, and fault slip models within a few minutes after earthquake initiation. The sea floor deformation associated with the earthquake slip can then be used as an initial condition for an automatically generated tsunami propagation and coastal inundation model for coastal warnings.

  3. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring

    PubMed Central

    Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros

    2017-01-01

    Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to deciding how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking systems, which use cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation-sensor-based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while the error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles, produced by the proposed system, can effectively characterise the drivers’ behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone. PMID:29165331
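The twin-device idea can be illustrated in a deliberately simplified form: subtract the vehicle sensor's yaw/pitch/roll from the head sensor's absolute angles to get head motion relative to the vehicle. Real systems should compose quaternions rather than subtract Euler angles (subtraction is only valid for small vehicle tilt), so this is a sketch, not the paper's algorithm:

```python
import numpy as np

def relative_head_angles(head_ypr_deg, vehicle_ypr_deg):
    """Head yaw (shaking), pitch (nodding), roll relative to the vehicle,
    by per-axis subtraction of two orientation-sensor readings (degrees).
    A small-tilt approximation; proper fusion would use quaternions."""
    rel = np.asarray(head_ypr_deg, float) - np.asarray(vehicle_ypr_deg, float)
    # wrap each angle into (-180, 180]
    return (rel + 180.0) % 360.0 - 180.0

# Head turned 40° right while the vehicle itself yawed 5° in a curve:
# the head is 35° right of the vehicle on the shaking axis.
print(relative_head_angles([40.0, -10.0, 1.0], [5.0, -8.0, 0.5]))
```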

  4. Installation, care, and maintenance of wood shake and shingle roofs

    Treesearch

    Tony Bonura; Jack Dwyer; Arnie Nebelsick; Brent Stuart; R. Sam Williams; Christopher Hunt

    2011-01-01

    This article gives general guidelines for selection, installation, finishing, and maintenance of wood shake and shingle roofs. The authors have gathered information from a variety of sources: research publications on wood finishing, technical data sheets from paint manufacturers, installation instructions for shake and shingle roofs, and interviews with experts having...

  5. Raspberry Shake- A World-Wide Citizen Seismograph Network

    NASA Astrophysics Data System (ADS)

    Christensen, B. C.; Blanco Chia, J. F.

    2017-12-01

    Raspberry Shake was conceived as an inexpensive plug-and-play solution to satisfy the need for universal, quick, and accurate earthquake detection. First launched on Kickstarter's crowdfunding platform in July 2016, the Raspberry Shake project was funded within hours of the launch date and, by the end of the campaign, reached more than 1000% of its initial funding goal. This demonstrated for the first time that there exists a strong interest among Makers, Hobbyists, and Do-It-Yourselfers for personal seismographs. From there, a citizen-scientist network was created, and it has been growing steadily. The Raspberry Shake network is currently being used in conjunction with publicly available broadband data from the GSN and other state-run seismic networks available through the IRIS, Geoscope, and GEOFON data centers to detect and locate earthquakes large and small around the globe. Raspberry Shake looks well positioned to improve local monitoring of earthquakes on a global scale, deepen communities' understanding of earthquakes, and serve as a formidable teaching tool. We present the main results of the project, the current state of the network, and the new Raspberry Shake models that are being built.

  6. Shake Test Results and Dynamic Calibration Efforts for the Large Rotor Test Apparatus

    NASA Technical Reports Server (NTRS)

    Russell, Carl R.

    2014-01-01

    Prior to the full-scale wind tunnel test of the UH-60A Airloads rotor, a shake test was completed on the Large Rotor Test Apparatus. The goal of the shake test was to characterize the oscillatory response of the test rig and provide a dynamic calibration of the balance to accurately measure vibratory hub loads. This paper provides a summary of the shake test results, including balance, shaft bending gauge, and accelerometer measurements. Sensitivity to hub mass and angle of attack was investigated during the shake test. Hub mass was found to have an important impact on the vibratory forces and moments measured at the balance, especially near the UH-60A 4/rev frequency. Comparisons were made between the accelerometer data and an existing finite-element model, showing agreement on mode shapes but not on natural frequencies. Finally, the results of a simple dynamic calibration are presented, showing the effects of changes in hub mass. The results show that the shake test data can be used to correct in-plane loads measurements up to 10 Hz and normal loads up to 30 Hz.
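The idea behind such a dynamic calibration, correcting balance readings for the rig's own oscillatory response, can be sketched in the frequency domain: divide the measured spectrum by the shake-test transfer function to recover the applied load. The single-resonance transfer function below is a made-up stand-in, not the measured LRTA response:

```python
import numpy as np

fs = 256                                  # Hz, sampling rate
t = np.arange(0, 4, 1 / fs)               # 4 s record
true_load = np.sin(2 * np.pi * 4.0 * t)   # 4 Hz applied hub load

# Made-up transfer function H(f): the rig amplifies content near 4 Hz
# by 1.5, standing in for a resonance identified in the shake test.
freqs = np.fft.fftfreq(t.size, 1 / fs)
H = np.ones(t.size)
H[np.abs(np.abs(freqs) - 4.0) < 0.5] = 1.5

# What the balance would record through the rig's dynamics.
measured = np.fft.ifft(np.fft.fft(true_load) * H).real

# Dynamic calibration: divide out H(f) to recover the applied load.
corrected = np.fft.ifft(np.fft.fft(measured) / H).real
print(np.max(np.abs(corrected - true_load)))   # ~0 (machine precision)
```

In practice the correction is only trustworthy in bands where H(f) is well characterized, which is consistent with the abstract's 10 Hz / 30 Hz limits for in-plane and normal loads.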

  7. Interpreting plant responses to clinostating. I - Mechanical stresses and ethylene

    NASA Technical Reports Server (NTRS)

    Salisbury, Frank B.; Wheeler, Raymond M.

    1981-01-01

    The possibility that clinostat mechanical stresses (leaf flopping) induce ethylene production and, thus, the development of epinasty was tested by stressing vertical plants with constant gentle horizontal or vertical shaking or with a quick back-and-forth rotation (twisting). Clinostat leaf flopping was closely approximated by turning plants so that their stems were horizontal, rotating them quickly about the stem axis, and returning them to the vertical, with the treatment repeated every four minutes. It was found that horizontal and vertical shaking, twisting, intermittent horizontal rotating, and gentle hand shaking failed to induce epinasties approaching those observed on the slow clinostat. Minor epinasties were generated by vigorous hand-shaking (120 sec/day) and by daily application of Ag(+). Reducing leaf displacements by inverting plants did not significantly reduce the minor epinasty generated by vigorous hand-shaking.

  8. The Use of GIS for the Application of the Phenomenological Approach to the Seismic Risk Analysis: the Case of the Italian Fortified Architecture

    NASA Astrophysics Data System (ADS)

    Lenticchia, E.; Coïsson, E.

    2017-05-01

    The present paper proposes the use of GIS for the application of the so-called phenomenological approach to the analysis of the seismic behaviour of historical buildings. This approach is based on the awareness that different masonry building typologies are characterized by different, recurring vulnerabilities. Thus, the observation and classification of real damage is seen as the first step in recognizing and classifying these vulnerabilities, in order to plan focused preventive interventions. For these purposes, GIS has proven to be a powerful instrument to collect and manage this type of information on a large number of cases. This paper specifically focuses on the application of the phenomenological approach to the analysis of the seismic behaviour of fortified buildings, including castles, fortresses, citadels, and all the typical historical constructions characterized by the presence of massive towers and defensive walls. The main earthquakes that struck Italy in the last 40 years (up to the recent Central Italy seismic swarm) were taken into consideration and described by means of shake maps. A previously published work has been continued with the addition of new data and several improvements, including a specific symbology for describing building typologies and conservation status on the maps, indications of damage levels, and a comparison between shake maps in terms of PGA and in terms of pseudo-acceleration. The knowledge gained and the broader frame provided by the analysis of these data serve the primary aim of cultural heritage preservation.

  9. Aristotelian syllogisms

    NASA Astrophysics Data System (ADS)

    Ollongren, Alexander

    2011-02-01

    Aristotelian assertive syllogistic logic (without modalities) is embedded in the author's Lingua Cosmica. The well-known basic structures of assertions and the conversions between them in this logic are represented in LINCOS. Since these representations correspond with set-theoretic operations, the latter are embedded in LINCOS as well. Based on this, valid argumentation in Aristotle's sense is obtained for the four important so-called perfect figures. Their constructive (intuitionistic) verifications are of a surprisingly elegant simplicity.
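The correspondence between assertive syllogisms and set-theoretic operations can be illustrated outside LINCOS with ordinary finite sets: "All M are P" becomes M ⊆ P, and the first perfect figure (Barbara) becomes transitivity of inclusion. The universe below is a made-up example:

```python
# Barbara: All M are P, All S are M  =>  All S are P,
# i.e. M <= P and S <= M imply S <= P (transitivity of set inclusion).
men = {"socrates", "plato"}
mortals = {"socrates", "plato", "rex"}
philosophers = {"socrates"}

premise_1 = men <= mortals            # All men are mortal
premise_2 = philosophers <= men       # All philosophers are men
conclusion = philosophers <= mortals  # All philosophers are mortal

assert premise_1 and premise_2
print(conclusion)  # True: the conclusion holds whenever the premises do
```

This checks the figure only on one concrete universe; the constructive verifications the abstract mentions establish it for all interpretations.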

  10. Android Based Behavioral Biometric Authentication via Multi-Modal Fusion

    DTIC Science & Technology

    2014-06-12

    such as the way he or she uses the mouse, or interacts with the Graphical User Interface (GUI) [9]. Described simply, standard biometrics is determined...as a login screen on a standard computer. Active authentication is authentication that occurs dynamically throughout interaction with the device. A...because they are higher level constructs in themselves. The Android framework was specifically used for capturing the multitouch gestures: pinch and zoom

  11. Small Sample Studies of Food Habits: I. The Relationship between Food Preference and Food Choice in Naval Enlisted Personnel at the Naval Construction Battalion Center, Davisville, Rhode Island

    DTIC Science & Technology

    1974-10-01

    6-44 102. French onion soup 6-48 103. Strawberry milk shake 6-52 104. Sprite 6-56 105. Broccoli w/mock hollandaise sauce 6-60 106. Parmesan...Sandwich with Brown Gravy Grilled Cheeseburger on Toasted Roll Grilled Hamburger on Toasted Roll. • • < Broccoli with Mock Hollandaise Sauce ...MEAL APPENDIX B NAME/NUMBER ♦Beef Barley Soup _ Croutons > _ ♦Baked Virginia Ham with Pineapple Raisin Sauce _ Grilled Frankfurter on Toasted

  12. Large-Scale Biaxial Friction Experiments with an Assistance of the NIED Shaking Table

    NASA Astrophysics Data System (ADS)

    Fukuyama, E.; Mizoguchi, K.; Yamashita, F.; Togo, T.; Kawakata, H.; Yoshimitsu, N.; Shimamoto, T.; Mikoshiba, T.; Sato, M.; Minowa, C.

    2012-12-01

    We constructed a large-scale biaxial friction apparatus using a large shaking table at NIED (table dimensions 15 m x 15 m). The actuator of the shaking table serves as the engine for constant-speed loading. We used a 1.5 m long rock sample overlaid on a 2 m one. Their height and width are both 0.5 m, so the slip area is 1.5 m x 0.5 m. The 2 m long sample moves with the shaking table, and the 1.5 m sample is fixed to the basement of the shaking table. Thus, the shaking table displacement controls the dislocation between the two rock samples. The shaking table can generate 0.4 m of displacement at velocities ranging between 0.0125 mm/s and 1 m/s. We used Indian gabbro for the rock samples in the present experiments. The sliding surface was originally ground flat to less than 0.024 mm undulation using a large-scale plane grinder. Surface roughness evolved as subsequent experiments were done. Wear material was generated during each experiment, and its grain size became larger as the experiments proceeded. This might suggest damage evolution on the sliding surface. In some experiments we did not remove the gouge material before sliding, to examine the effect of the gouge layer. Normal stress can be applied up to 1.3 MPa. The stiffness of the apparatus was measured experimentally and was of the order of 0.1 GN/m. We first measured the coefficient of friction at low sliding velocity (0.1-1 mm/s), where steady state was achieved after ~5 mm of slip. The coefficient of friction was about 0.75 under normal stresses between 0.13 and 1.3 MPa. This is consistent with estimates from previous work using smaller rock samples. We observed that the coefficient of friction decreased gradually with increasing slip velocity, but the friction curves at the higher velocities are simultaneously characterized by stick-slip vibration. 
The main aim of our experiments is to understand the rupture propagation from slow nucleation to fast unstable rupture during the loading of two contact surfaces. We recorded many unstable slip events that nucleated inside the sliding surface but did not reach the edge of the sliding surface until the termination of slip. These slip events simulate the full rupture process of an earthquake, including nucleation, propagation, and termination of the rupture. We monitored the rupture progress using the strain-change propagation measured by 16 semiconductor strain gauges recorded at a sampling rate of 1 MHz. In addition, high-frequency waves emitted by AE events were continuously observed by 8 piezoelectric transducers (PZTs) at a sampling rate of 20 MHz. These sensors were attached at the edge of the slipping area. AE events started to occur where the slip nucleated and the slip area started to expand. Unfortunately, we could not locate all AE events during the unstable rupture because of the overprinting of signals from multiple events in the PZT records. We also monitored the amplitudes of waves transmitted across the sliding surface. The amplitudes decreased just after each stick-slip event and recovered gradually, suggesting that the transmitted-wave amplitudes might reflect the slipped area on the interface.
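The stick-slip vibration reported at higher load-point velocities can be caricatured with a single-degree-of-freedom spring-slider. The stiffness matches the apparatus (~0.1 GN/m) and the normal stress and steady-state friction come from the abstract, but the dynamic friction level and load velocity are assumed values, and slip is treated as instantaneous rather than resolved in time:

```python
K = 1.0e8                  # N/m, apparatus stiffness (from the experiment)
AREA = 1.5 * 0.5           # m^2, slip area (1.5 m x 0.5 m)
SIGMA_N = 1.3e6            # Pa, applied normal stress
MU_S, MU_D = 0.75, 0.60    # static / dynamic friction (MU_D is assumed)
V_LOAD = 1.0e-3            # m/s, load-point (shaking-table) velocity

F_STATIC = MU_S * SIGMA_N * AREA    # shear force needed to initiate slip
F_DYNAMIC = MU_D * SIGMA_N * AREA   # resisting force during sliding

load_point = slider = t = 0.0
dt = 1.0e-2
events = []                          # (time, slip) for each stick-slip event
for _ in range(200_000):             # 2000 s of constant-velocity loading
    load_point += V_LOAD * dt
    t += dt
    force = K * (load_point - slider)
    if force >= F_STATIC:
        # Instantaneous stress drop: the slider jumps forward until the
        # spring force relaxes to the dynamic friction level.
        slip = (force - F_DYNAMIC) / K
        slider += slip
        events.append((t, slip))

print(f"{len(events)} stick-slip events, slip per event ~ {events[0][1]:.2e} m")
```

The event slip is set by the friction drop over the stiffness, (MU_S - MU_D) * SIGMA_N * AREA / K, which is why a stiffer apparatus produces smaller, more frequent stick-slip events.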

  13. Characterization of the Physical Stability of a Lyophilized IgG1 mAb After Accelerated Shipping-like Stress

    PubMed Central

    Telikepalli, Srivalli; Kumru, Ozan S.; Kim, Jae Hyun; Joshi, Sangeeta B.; O'Berry, Kristin B.; Blake-Haskins, Angela W.; Perkins, Melissa D.; Middaugh, C. Russell; Volkin, David B.

    2014-01-01

    Upon exposure to shaking stress, an IgG1 mAb formulation in both the liquid and lyophilized states formed subvisible particles. Since freeze-drying is expected to minimize protein physical instability under these conditions, the extent and nature of aggregate formation in the lyophilized preparation were examined using a variety of particle characterization techniques. The effects of formulation variables such as residual moisture content, reconstitution rate, and reconstitution medium were examined. Upon reconstitution of the shake-stressed lyophilized mAb, differences in protein particle size and number were observed by Microflow Digital Imaging (MFI), with the reconstitution medium having the largest impact. Shake-stress had minor effects on the structure of the protein within the particles, as shown by SDS-PAGE and FTIR analysis. The lyophilized mAb was shake-stressed to different extents and stored for 3 months at different temperatures. Both the extent of cake collapse and the storage temperature affected the physical stability of the shake-stressed lyophilized mAb upon subsequent storage. These findings demonstrate that physical degradation upon shaking of a lyophilized IgG1 mAb formulation includes not only cake breakage but also an increase in subvisible particles and turbidity upon reconstitution. The shaking-induced cake breakage of the lyophilized IgG1 mAb formulation also resulted in decreased physical stability upon storage. PMID:25522000

  14. Optimal Co-segmentation of Tumor in PET-CT Images with Context Information

    PubMed Central

    Song, Qi; Bai, Junjie; Han, Dongfeng; Bhatia, Sudershan; Sun, Wenqing; Rockey, William; Bayouth, John E.; Buatti, John M.

    2014-01-01

    PET-CT images have been widely used in clinical practice for radiotherapy treatment planning. Many existing segmentation approaches work on only a single imaging modality and therefore suffer from the low spatial resolution of PET or the low contrast of CT. In this work we propose a novel method for co-segmentation of the tumor in both PET and CT images, which makes use of the advantages of each modality: the functional information from PET and the anatomical structure information from CT. The approach formulates the segmentation problem as the minimization of a Markov Random Field (MRF) model that encodes the information from both modalities. The optimization is solved using a graph-cut based method. Two sub-graphs are constructed for the segmentation of the PET and CT images, respectively. To achieve consistent results across the two modalities, an adaptive context cost is enforced by adding context arcs between the two sub-graphs. An optimal solution can be obtained by solving a single maximum flow problem, which leads to simultaneous segmentation of the tumor volumes in both modalities. The proposed algorithm was validated for robust delineation of lung tumors on 23 PET-CT datasets and two head-and-neck cancer subjects. Both qualitative and quantitative results show significant improvement compared to graph-cut methods using PET or CT alone. PMID:23693127
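The graph-cut machinery can be shown at toy scale: a 1D binary MRF with intensity-based terminal (unary) costs and smoothness arcs between neighbors, solved exactly by a from-scratch Edmonds-Karp max-flow. This is a single-modality stand-in for the paper's two-sub-graph construction with context arcs, and all costs are illustrative:

```python
from collections import deque

def max_flow_source_side(cap, s, t):
    """Edmonds-Karp on a dense capacity matrix; returns the set of nodes
    on the source side of a minimum s-t cut."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break
        bottleneck, v = float("inf"), t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
    # nodes still reachable from s form the source side of the min cut
    seen, q = {s}, deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in seen and cap[u][v] - flow[u][v] > 0:
                seen.add(v)
                q.append(v)
    return seen

# 1D toy "image": bright pixels should be labeled tumor; smoothness arcs
# of weight LAM penalize label changes between neighbors. Integer costs
# (intensities scaled by 100) keep the max-flow arithmetic exact.
intensity = [0.1, 0.2, 0.9, 0.8, 0.15]
n = len(intensity)
S, T = n, n + 1
LAM = 50
cap = [[0] * (n + 2) for _ in range(n + 2)]
for i, x in enumerate(intensity):
    cap[S][i] = int(round(100 * x))        # cost of labeling i background
    cap[i][T] = 100 - int(round(100 * x))  # cost of labeling i tumor
    if i + 1 < n:
        cap[i][i + 1] = cap[i + 1][i] = LAM

source_side = max_flow_source_side(cap, S, T)
labels = [1 if i in source_side else 0 for i in range(n)]
print(labels)  # -> [0, 0, 1, 1, 0]: the bright pair is segmented together
```

The paper's co-segmentation adds a second such sub-graph for the other modality plus inter-graph context arcs, but the optimum is still found by one max-flow computation, as above.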

  15. Response of the Laprak Landslide to the 2015 Nepal Earthquake and Implications for the Utility of Simple Infinite Slope Models in Regional Landslide Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Haneberg, W. C.; Gurung, N.

    2016-12-01

    The village of Laprak, located in the Gorkha District of western Nepal, was built on a large colluvial landslide about 10 km from the epicenter of the 25 April 2015 M 7.8 Nepal earthquake. Recent episodic movement began during a wet period in 1999 and continued in at least 2002, 2006, and 2007, destroying 24 homes, removing 23 hectares of land from agricultural production, and claiming one life. Reconnaissance mapping, soil sampling and testing, and slope stability analyses undertaken before the 2015 earthquake suggested that the hillside should be stable under dry conditions, unstable to marginally stable under static wet conditions, and wholly unstable under wet seismic conditions. Most of the buildings in Laprak, which were predominantly of dry-fitted stone masonry, were destroyed by Intensity IX shaking during the 2015 earthquake. Interpretation of remotely sensed imagery and published photographs shows new landslide features; hence, some downslope movement occurred, but the landslide did not mobilize into a long run-out flow. Monte Carlo simulations based upon a pseudostatic infinite slope model, constrained by reasonable distributions of soil shear strength, pore pressure, and slope angle from the earlier work and by seismic coefficients consistent with the observed Intensity IX shaking (and inferred PGA), yield high probabilities of failure for the steep portions of the slope above and below the village, but only moderate probabilities of failure for the gentler portion of the slope upon which most of the village was constructed. In retrospect, the seismic coefficient selected for the pre-earthquake analysis proved to be remarkably prescient. Similar results were obtained using a first-order, second-moment (FOSM) approach that is convenient for GIS-based regional analyses. 
Predictions of permanent displacement made using a variety of published empirical formulae based upon sliding block analyses range from about 10 cm to about 200 cm, also broadly consistent with the observed earthquake effects. Thus, at least in the case of Laprak, the conceptually simple infinite slope models used in many GIS based regional analyses appear to provide robust results if reasonable ranges of soil strength, pore pressure, slope geometry, and seismic loading are used.
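
    The screening analysis described above can be sketched with a pseudostatic infinite-slope factor of safety inside a Monte Carlo loop. The stress expressions are the standard infinite-slope forms; the unit weights, parameter ranges, and the k = 0.3 seismic coefficient below are illustrative assumptions, not the distributions constrained by the field work.

```python
import math, random

def pseudostatic_fs(c, phi_deg, beta_deg, z, k, m,
                    gamma=19.0, gamma_w=9.81):
    """Pseudostatic factor of safety for an infinite slope.
    c [kPa], phi/beta [degrees], z = slab thickness [m],
    k = horizontal seismic coefficient, m = saturated fraction of z."""
    b, phi = math.radians(beta_deg), math.radians(phi_deg)
    sigma = gamma * z * math.cos(b) ** 2           # static normal stress
    tau = gamma * z * math.sin(b) * math.cos(b)    # static shear stress
    u = m * gamma_w * z * math.cos(b) ** 2         # pore-water pressure
    # the horizontal inertial force k*gamma*z adds shear and reduces
    # the normal stress on the failure plane
    num = c + (sigma - k * gamma * z * math.sin(b) * math.cos(b) - u) \
        * math.tan(phi)
    den = tau + k * gamma * z * math.cos(b) ** 2
    return num / den

def failure_probability(n=20000, seed=1):
    """Monte Carlo estimate of P(FS < 1); all sampling ranges are
    illustrative placeholders."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        fs = pseudostatic_fs(c=rng.uniform(2.0, 10.0),
                             phi_deg=rng.uniform(28.0, 38.0),
                             beta_deg=rng.uniform(25.0, 35.0),
                             z=rng.uniform(1.0, 4.0),
                             k=0.3,
                             m=rng.uniform(0.0, 1.0))
        fails += fs < 1.0
    return fails / n
```

    The same mean and variance bookkeeping is what the FOSM variant propagates analytically instead of by sampling.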

  16. The Shaking Torch: Another Variation on the Inductive Force

    ERIC Educational Resources Information Center

    Thompson, Frank

    2010-01-01

    A recent article showed how the influx of neodymium magnets has provided striking demonstrations of the interactions between magnets and conductors. The "shaking torch" is yet another example. Many of these torches require no batteries and can be submerged in water--indeed, a light for life. In this article, the author disassembles a shaking torch…

  17. Perpetrator Accounts in Infant Abusive Head Trauma Brought about by a Shaking Event

    ERIC Educational Resources Information Center

    Biron, Dean; Shelton, Doug

    2005-01-01

    Objective: To analyze perpetrator and medical evidence collected during investigations of infant abusive head trauma (IAHT), with a view to (a) identifying cases where injuries were induced by shaking in the absence of any impact and (b) documenting the response of infant victims to a violent shaking event. Method: A retrospective study was…

  18. Multi-Agent Framework for the Fair Division of Resources and Tasks

    DTIC Science & Technology

    2006-01-01

    Excerpt from the report's table of contents: B.1.2 Application of Shake Out Algorithm to JFK Airport Test Data (p. 144); B.2 Generalization (p. 145); Figure B–2: Available Aircraft Inventory at JFK Airport (p. 148); Figure B–3: Available Aircraft Inventory at JFK Airport after the first shake out (p. 148); Figure B–4: Inventory Vectors for Second and Third Shake Outs.

  19. Students' Multi-Modal Re-Presentations of Scientific Knowledge and Creativity

    ERIC Educational Resources Information Center

    Koren, Yitzhak; Klavir, Rama; Gorodetsky, Malka

    2005-01-01

    The paper presents the results of a project that gave students the opportunity to re-present their acquired knowledge via the construction of multi-modal "learning resources". These "learning resources" substituted for lectures and books and became the official learning sources in the classroom. The rationale for the…

  20. Exploring the Future of Lifelong Learning: Advocacy, Research and Footprinting

    ERIC Educational Resources Information Center

    Chisholm, Lynne

    2013-01-01

    This reflective think-tank contribution begins by comparing advocacy and research as distinct modalities of professional and social action. In practice they frequently elide and merge into one another. While alliance and complementarity between the two modalities are constructive for shaping policy and practice, this poses risks when governments and…

  1. The Impact of Multimedia Effect on Science Learning: Evidence from Eye Movements

    ERIC Educational Resources Information Center

    She, Hsiao-Ching; Chen, Yi-Zen

    2009-01-01

    This study examined how middle school students constructed their understanding of the mitosis and meiosis processes at a molecular level through multimedia learning materials presented in different interaction and sensory modality modes. A two (interaction modes: animation/simulation) by two (sensory modality modes: narration/on-screen text)…

  2. Earthquake Ground Motion Selection

    DOT National Transportation Integrated Search

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  3. Shake-table testing of a self-centering precast reinforced concrete frame with shear walls

    NASA Astrophysics Data System (ADS)

    Lu, Xilin; Yang, Boya; Zhao, Bin

    2018-04-01

    The seismic performance of a self-centering precast reinforced concrete (RC) frame with shear walls was investigated in this paper. The lateral force resistance was provided by self-centering precast RC shear walls (SPCW), which utilize a combination of unbonded prestressed post-tensioned (PT) tendons and mild steel reinforcing bars for flexural resistance across base joints. The structures concentrated deformations at the bottom joints, and the unbonded PT tendons provided the self-centering restoring force. A 1/3-scale model of a five-story self-centering RC frame with shear walls was designed and tested on a shake-table under a series of bi-directional earthquake excitations with increasing intensity. The acceleration response, roof displacement, inter-story drifts, residual drifts, shear force ratios, hysteresis curves, and local behaviour of the test specimen were analysed and evaluated. The results demonstrated that the seismic performance of the test specimen was satisfactory in the plane of the shear wall, even though the structure sustained inter-story drift levels of up to 2.45%. Negligible residual drifts were recorded after all applied earthquake excitations. Based on the shake-table test results, it is feasible to apply and popularize a self-centering precast RC frame with shear walls as a structural system in seismic regions.

  4. Microarray platform affords improved product analysis in mammalian cell growth studies

    PubMed Central

    Li, Lingyun; Migliore, Nicole; Schaefer, Eugene; Sharfstein, Susan T.; Dordick, Jonathan S.; Linhardt, Robert J.

    2014-01-01

    High-throughput (HT) platforms serve as cost-efficient and rapid screening methods for evaluating the effects of cell culture conditions and chemicals. The aim of the current study was to develop a high-throughput cell-based microarray platform to assess the effect of culture conditions on Chinese hamster ovary (CHO) cells. Specifically, the growth, transgene expression, and metabolism of a GS/MSX CHO cell line, which produces a therapeutic monoclonal antibody, were examined using the microarray system in conjunction with a conventional shake flask platform in a non-proprietary medium. The microarray system consists of 60 nl spots of cells encapsulated in alginate and separated into groups via an 8-well chamber system attached to the chip. Results show that the non-proprietary medium developed allows cell growth, production and normal glycosylation of the recombinant antibody, and metabolism of the recombinant CHO cells in both the microarray and shake flask platforms. In addition, 10.3 mM glutamate addition to the defined base media results in a lactate metabolism shift in the recombinant GS/MSX CHO cells in the shake flask platform. Ultimately, the results demonstrate that the high-throughput microarray platform has the potential to be utilized for evaluating the impact of media additives on cellular processes such as cell growth, metabolism, and productivity. PMID:24227746

  5. Detection of ground motions using high-rate GPS time-series

    NASA Astrophysics Data System (ADS)

    Psimoulis, Panos A.; Houlié, Nicolas; Habboub, Mohammed; Michel, Clotaire; Rothacher, Markus

    2018-05-01

    Monitoring surface deformation in real time helps in planning and protecting infrastructure and populations, managing sensitive production facilities (e.g. SEVESO-type sites), and mitigating the long-term consequences of the modifications implemented. We present RT-SHAKE, an algorithm developed to detect ground motions associated with landslides, sub-surface collapses, subsidence, earthquakes or rock falls. RT-SHAKE first detects transient changes in individual GPS time series, then searches for spatial correlation among observations made at neighbouring GPS sites, and eventually issues a motion warning. In order to assess our algorithm on fast (seconds to minutes), large (1 cm to metres) and spatially consistent surface motions, we use the 1 Hz GEONET GNSS network data of the MW 9.0 2011 Tohoku-Oki earthquake as a test scenario. We show that the delay in detecting the seismic wave arrival from GPS records is ~10 seconds with respect to an identical analysis based on strong-motion data, and that this time delay depends on the level of the time-variable noise. Nevertheless, based on the analysis of the GPS network noise level and a ground motion stochastic model, we show that RT-SHAKE can narrow the range of earthquake magnitude, by setting a lower threshold of detected earthquakes to MW 6.5-7, if associated with a real-time automatic earthquake location system.
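
    The two-stage logic, a per-station transient test followed by a network-level coincidence check, can be illustrated with a toy detector. The window length, displacement threshold, and coincidence rule below are assumptions for illustration, not RT-SHAKE's actual parameters.

```python
def station_triggers(series, dt=1.0, window=30, thresh=0.05):
    """Flag epochs where displacement departs from the trailing mean by
    more than `thresh` (metres): a stand-in for the per-station
    transient test (illustrative only)."""
    triggers = []
    for i in range(window, len(series)):
        mean = sum(series[i - window:i]) / window   # mean of prior samples
        if abs(series[i] - mean) > thresh:
            triggers.append(i * dt)
    return triggers

def network_alert(trigger_times, min_sites=3, coincidence=10.0):
    """Issue a motion warning when at least `min_sites` stations trigger
    within a `coincidence`-second window (a crude spatial-correlation
    test; real neighbour geometry is ignored here)."""
    firsts = sorted(t[0] for t in trigger_times.values() if t)
    for i in range(len(firsts) - min_sites + 1):
        if firsts[i + min_sites - 1] - firsts[i] <= coincidence:
            return firsts[i + min_sites - 1]        # alert time [s]
    return None
```

    With a synthetic displacement step arriving at neighbouring stations a second apart, the alert fires when the third station triggers, mimicking the suppression of single-station noise transients.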

  6. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years, and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.

  7. An empirical model for global earthquake fatality estimation

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David

    2010-01-01

    We analyzed mortality rates of earthquakes worldwide and developed a country/region-specific empirical model for earthquake fatality estimation within the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is defined as the total number killed divided by the total population exposed at a specific shaking intensity level. The total fatalities for a given earthquake are estimated by multiplying the number of people exposed at each shaking intensity level by the fatality rate for that level and then summing over all relevant shaking intensities. The fatality rate is expressed in terms of a two-parameter lognormal cumulative distribution function of shaking intensity. The parameters are obtained for each country or region by minimizing the residual error in hindcasting the total shaking-related deaths from earthquakes recorded between 1973 and 2007. A new global regionalization scheme is used to combine the fatality data across different countries with similar vulnerability traits.
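
    The model's two steps, a lognormal fatality-rate function of intensity and a sum over the exposed population at each intensity level, can be written directly. The theta and beta values in the test below are placeholders; the actual country-specific parameters come from the PAGER regression, and the exact parameterization should be checked against the paper.

```python
import math

def fatality_rate(intensity, theta, beta):
    """Two-parameter lognormal CDF of shaking intensity S:
    nu(S) = Phi(ln(S / theta) / beta), with Phi the standard normal CDF
    evaluated via math.erf."""
    return 0.5 * (1.0 + math.erf(math.log(intensity / theta)
                                 / (beta * math.sqrt(2.0))))

def estimated_fatalities(exposure, theta, beta):
    """Sum over intensity levels of exposed population times the
    fatality rate at that level; `exposure` maps intensity -> people."""
    return sum(pop * fatality_rate(mmi, theta, beta)
               for mmi, pop in exposure.items())
```

    By construction the rate is 0.5 at S = theta and grows monotonically with intensity, which is what the hindcasting regression exploits when fitting theta and beta per country.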

  8. Biocompatible and high-performance amino acids-capped MnWO4 nanocasting as a novel non-lanthanide contrast agent for X-ray computed tomography and T1-weighted magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Dong, Kai; Liu, Zhen; Liu, Jianhua; Huang, Sa; Li, Zhenhua; Yuan, Qinghai; Ren, Jinsong; Qu, Xiaogang

    2014-01-01

    In the present work, a novel non-lanthanide dual-modality contrast agent, manganese tungstate (MnWO4), has been successfully constructed by a facile and versatile hydrothermal route. With the merits of a high atomic number and a well-positioned K-edge energy of tungsten, our well-prepared non-lanthanide nanoprobes provide a higher contrast efficacy than routine iodine-based agents in clinics. Additionally, the presence of Mn in these nanoparticles endow them with excellent T1-weighted MR imaging capabilities. As an alternative to T2-weighted MRI and CT dual-modality contrast agents, the nanoprobes can provide a positive contrast signal, which prevents confusion with the dark signals from hemorrhage and blood clots. To the best of our knowledge, this is the first report that a non-lanthanide imaging nanoprobe is applied for CT and T1-weighted MRI simultaneously. Moreover, comparing with gadolinium-based T1-weighted MRI and CT dual-modality contrast agents that were associated with nephrogenic systemic fibrosis (NSF), our contrast agents have superior biocompatibility, which is proved by a detailed study of the pharmacokinetics, biodistribution, and in vivo toxicology. Together with excellent dispersibility, high biocompatibility and superior contrast efficacy, these nanoprobes provide detailed and complementary information from dual-modality imaging over traditional single-mode imaging and bring more opportunities to the new generation of non-lanthanide nanoparticulate-based contrast agents.In the present work, a novel non-lanthanide dual-modality contrast agent, manganese tungstate (MnWO4), has been successfully constructed by a facile and versatile hydrothermal route. With the merits of a high atomic number and a well-positioned K-edge energy of tungsten, our well-prepared non-lanthanide nanoprobes provide a higher contrast efficacy than routine iodine-based agents in clinics. 
Additionally, the presence of Mn in these nanoparticles endow them with excellent T1-weighted MR imaging capabilities. As an alternative to T2-weighted MRI and CT dual-modality contrast agents, the nanoprobes can provide a positive contrast signal, which prevents confusion with the dark signals from hemorrhage and blood clots. To the best of our knowledge, this is the first report that a non-lanthanide imaging nanoprobe is applied for CT and T1-weighted MRI simultaneously. Moreover, comparing with gadolinium-based T1-weighted MRI and CT dual-modality contrast agents that were associated with nephrogenic systemic fibrosis (NSF), our contrast agents have superior biocompatibility, which is proved by a detailed study of the pharmacokinetics, biodistribution, and in vivo toxicology. Together with excellent dispersibility, high biocompatibility and superior contrast efficacy, these nanoprobes provide detailed and complementary information from dual-modality imaging over traditional single-mode imaging and bring more opportunities to the new generation of non-lanthanide nanoparticulate-based contrast agents. Electronic supplementary information (ESI) available: TEM images of MnWO4 nanoparticles synthesized at pH = 7, 180 °C pH = 9, 180 °C pH = 6, 200 °C with various amino acid molecules as capped agents, survey XPS spectra, FTIR spectrum of glycine capped MnWO4 nanorods, photos of glycine capped MnWO4 nanorods in various solutions including PBS, DMEM cell medium, and FBS, in vivo coronal view CT images of a rat before and after intravenous injection of iobitridol at different timed intervals, in vivo CT imaging of the rat one month after intravenous injection of MnWO4 nanorods, CT values of the heart, liver, spleen and kidney of a rat before and after intravenous administration of MnWO4 nanorods and iobitridol at different time intervals, hematology analysis and blood biochemical assay. See DOI: 10.1039/c3nr05455a

  9. A method for producing digital probabilistic seismic landslide hazard maps

    USGS Publications Warehouse

    Jibson, R.W.; Harp, E.L.; Michael, J.A.

    2000-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include: (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24 000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10 m grid spacing using ARC/INFO GIS software on a UNIX computer. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure. © 2000 Elsevier Science B.V. All rights reserved.
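
    The displacement-to-probability chain described above can be sketched per grid cell with one published pair of fits: Jibson's Arias-intensity regression for Newmark displacement and the Weibull probability curve calibrated on the Northridge inventory. The coefficients below are quoted from memory of those publications and should be verified against the papers before any real use.

```python
import math

def newmark_displacement_cm(arias_m_s, ac_g):
    """Empirical Newmark displacement (cm) from Arias intensity (m/s)
    and critical acceleration (g), using the regression
    log Dn = 1.521 log Ia - 1.993 log ac - 1.546."""
    return 10.0 ** (1.521 * math.log10(arias_m_s)
                    - 1.993 * math.log10(ac_g)
                    - 1.546)

def failure_probability(dn_cm):
    """Weibull curve fitted to the Northridge landslide inventory:
    P(f) = 0.335 * (1 - exp(-0.048 * Dn**1.565)), Dn in cm.
    The 0.335 plateau is the maximum observed failure fraction."""
    return 0.335 * (1.0 - math.exp(-0.048 * dn_cm ** 1.565))
```

    Applied cell by cell over rasterized strength and shaking grids, this pair of functions is the core of the probabilistic hazard-map recipe.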

  10. A method for producing digital probabilistic seismic landslide hazard maps; an example from the Los Angeles, California, area

    USGS Publications Warehouse

    Jibson, Randall W.; Harp, Edwin L.; Michael, John A.

    1998-01-01

    The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.

  11. Empty calories and phantom fullness: a randomized trial studying the relative effects of energy density and viscosity on gastric emptying determined by MRI and satiety.

    PubMed

    Camps, Guido; Mars, Monica; de Graaf, Cees; Smeets, Paul Am

    2016-07-01

    Stomach fullness is a determinant of satiety. Although both the viscosity and energy content have been shown to delay gastric emptying, their relative importance is not well understood. We compared the relative effects of, and interactions between, the viscosity and energy density on gastric emptying and perceived satiety. A total of 15 healthy men [mean ± SD age: 22.6 ± 2.4 y; body mass index (in kg/m²): 22.6 ± 1.8] participated in an experiment with a randomized 2 × 2 crossover design. Participants received dairy-based shakes (500 mL; 50% carbohydrate, 20% protein, and 30% fat) that differed in viscosity (thin and thick) and energy density [100 kcal (corresponding to 0.2 kcal/mL) compared with 500 kcal (corresponding to 1 kcal/mL)]. After ingestion, participants entered an MRI scanner where abdominal scans and oral appetite ratings on a 100-point scale were obtained every 10 min until 90 min after ingestion. From the scans, gastric content volumes were determined. Overall, the gastric emptying half-time (GE t50) was 54.7 ± 3.8 min. The thin 100-kcal shake had the lowest GE t50 of 26.5 ± 3.0 min, followed by the thick 100-kcal shake with a GE t50 of 41 ± 3.9 min and the thin 500-kcal shake with a GE t50 of 69.5 ± 5.9 min; the thick 500-kcal shake had the highest GE t50 of 81.9 ± 8.3 min. With respect to appetite, the thick 100-kcal shake led to higher fullness (58 points at 40 min) than the thin 500-kcal shake (48 points at 40 min). Our results show that increasing the viscosity is less effective than increasing the energy density in slowing gastric emptying. However, the viscosity is more important for increasing perceived fullness. These results underscore the lack of satiating efficiency of empty calories in quickly ingested drinks such as sodas. The increase in perceived fullness that is due solely to the increased viscosity, a phenomenon that we refer to as phantom fullness, may be useful in lowering energy intake. 
This trial was registered at www.trialregister.nl as NTR4573. © 2016 American Society for Nutrition.

  12. Successful Demonstration of New Isolated Bridge System at UCB Shaking Table

    Science.gov Websites

    On May 26, 2010, over 100 people attended the demonstration of a new isolated bridge system at the PEER Earthquake Simulator Laboratory at UC Berkeley.

  13. ShakeAlert—An earthquake early warning system for the United States west coast

    USGS Publications Warehouse

    Burkett, Erin R.; Given, Douglas D.; Jones, Lucile M.

    2014-08-29

    Earthquake early warning systems use earthquake science and the technology of monitoring systems to alert devices and people when shaking waves generated by an earthquake are expected to arrive at their location. The seconds to minutes of advance warning can allow people and systems to take actions to protect life and property from destructive shaking. The U.S. Geological Survey (USGS), in collaboration with several partners, has been working to develop an early warning system for the United States. ShakeAlert, a system currently under development, is designed to cover the West Coast States of California, Oregon, and Washington.

  14. Probabilistic description of infant head kinematics in abusive head trauma.

    PubMed

    Lintern, T O; Nash, M P; Kelly, P; Bloomfield, F H; Taberner, A J; Nielsen, P M F

    2017-12-01

    Abusive head trauma (AHT) is a potentially fatal result of child abuse, but the mechanisms by which injuries occur are often unclear. To investigate the contention that shaking alone can elicit the injuries observed, effective computational models are necessary. The aim of this study was to develop a probabilistic model describing infant head kinematics in AHT. A deterministic model incorporating an infant's mechanical properties, subjected to different shaking motions, was developed in OpenSim. A Monte Carlo analysis was used to simulate the range of infant kinematics produced as a result of varying both the mechanical properties and the type of shaking motions. By excluding physically unrealistic shaking motions, worst-case shaking scenarios were simulated and compared to existing injury criteria for a newborn, a 4.5-month-old, and a 12-month-old infant. In none of the three cases were head kinematics observed to exceed previously estimated subdural haemorrhage injury thresholds. The results of this study provide no biomechanical evidence to demonstrate how shaking by a human alone can cause the injuries observed in AHT, suggesting either that additional factors, such as impact, are required, or that the current estimates of injury thresholds are incorrect.
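
    The Monte Carlo structure of the study, sampling mechanical parameters and shaking motions, computing peak head kinematics, and comparing against an injury threshold, can be sketched with a deliberately crude surrogate. The pendulum-style kinematics, parameter ranges, and threshold below are illustrative stand-ins only, not the OpenSim model or the published injury criteria.

```python
import math, random

def peak_head_ang_vel(freq_hz, amplitude_m, neck_len_m):
    """Crude rigid-pendulum surrogate: peak head angular velocity
    (rad/s) for sinusoidal shaking; the peak torso velocity is mapped
    through the neck lever arm (purely illustrative kinematics)."""
    return 2.0 * math.pi * freq_hz * amplitude_m / neck_len_m

def exceedance_probability(threshold, n=10000, seed=7):
    """Fraction of sampled shaking scenarios whose peak angular
    velocity exceeds an injury threshold (all ranges illustrative)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        w = peak_head_ang_vel(rng.uniform(2.0, 4.0),    # shake frequency
                              rng.uniform(0.05, 0.15),  # arm excursion
                              rng.uniform(0.04, 0.06))  # neck length
        hits += w > threshold
    return hits / n
```

    Replacing the surrogate with a full musculoskeletal simulation and the sampled ranges with measured property distributions recovers the study's actual workflow.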

  15. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, the increase in population and industrial density has significantly increased the earthquake exposure of urban areas. The potential impact of a strong earthquake on a town center can be reduced by timely and correct actions of the emergency management centers. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by promptly providing information about the distribution of the ground shaking level. Emergency management centers, with functions in the immediate post-earthquake period, could use this information to allocate and prioritize resources to minimize loss of human life. However, due to the high cost of seismological instrumentation, the realization of an urban seismic network dense enough to reduce the rate of fatalities has not been achieved. Recent developments in MEMS (Micro Electro-Mechanical Systems) technology could today allow the realization of a high-density urban seismic network for post-earthquake rapid disaster assessment, suitable for earthquake effects mitigation. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry and are today widely used in laptops, game controllers and mobile phones. Due to their great commercial success, research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamic range of these sensors are sufficient to allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed for the realization of high-density seismic networks. 
The MEMS accelerometers could be installed in sensitive places of high vulnerability and exposure, such as schools, hospitals, public buildings and places of worship. The recorded waveforms could be promptly used to determine ground-shaking parameters, such as peak ground acceleration/velocity/displacement and the Arias and Housner intensities, which could all be used to create, a few seconds after a strong earthquake, shaking maps at the urban scale. These shaking maps would allow quick identification of the areas of the town center that experienced the strongest shaking. When a strong seismic event occurs, the beginning of the ground motion observed at a site could be used to predict the ensuing ground motion at the same site, and thus to realize a short-term earthquake early warning system. The data acquired after a moderate-magnitude earthquake would also provide valuable information for detailed seismic microzonation of the area, based on direct earthquake shaking observations rather than on model-based or indirect methods. In this work, we evaluate the feasibility and effectiveness of such a seismic network, taking into account technological, scientific and economic issues. For this purpose, we have simulated the creation of a MEMS-based urban seismic network in a medium-size city. For the selected town, taking into account the instrument specifications, the array geometry and the environmental noise, we investigated the ability of the planned network to detect and measure earthquakes of different magnitudes generated from realistic nearby seismogenic sources.
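
    Two of the ground-shaking parameters named above can be computed directly from a recorded accelerogram. A minimal sketch for PGA and Arias intensity, with Ia = pi/(2g) times the integral of a(t) squared, using trapezoidal integration (acceleration in m/s²):

```python
import math

def shaking_parameters(accel, dt, g=9.81):
    """Return (PGA in m/s^2, Arias intensity in m/s) for an
    accelerogram `accel` sampled at a fixed interval `dt` seconds."""
    pga = max(abs(a) for a in accel)
    # trapezoidal rule for the integral of a(t)^2 over the record
    integral = sum((accel[i] ** 2 + accel[i + 1] ** 2) * 0.5 * dt
                   for i in range(len(accel) - 1))
    arias = math.pi / (2.0 * g) * integral
    return pga, arias
```

    For a 1 Hz sinusoid of unit amplitude over 10 s, the squared signal integrates to 5.0, so the routine can be checked against the closed-form value pi/(2g) * 5.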

  16. Operational Modal Analysis of the Cable-Stayed Footbridge

    NASA Astrophysics Data System (ADS)

    Kortiš, Ján; Daniel, Ľuboš; Farbák, Matúš; Maliar, Lukáš; Škarupa, Milan

    2017-12-01

    Modern architecture leads to the design of slender bridge structures that are more sensitive to increased dynamic loading than massive ones. This phenomenon can especially be observed in lightweight steel structures such as suspended footbridges. As a result, it is necessary to know their dynamic characteristics precisely, such as the natural frequencies, mode shapes and damping of the structure. This information can be used for further analysis such as damage detection, system identification, health monitoring, etc., or for the design of new types of structures. For this purpose, classical modal analysis using an impact (trigger) load or a harmonic vibration exciter in combination with acceleration sensors is used in practice. However, there are many situations where it is not possible to stop the traffic or operation of the bridge. The article presents an experimental measurement of the dynamic parameters of the structure under operating load using operational modal analysis.

  17. Feasibility of Twitter Based Earthquake Characterization From Analysis of 32 Million Tweets: There's Got to be a Pony in Here Somewhere!

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M. R.; Smoczyk, G. M.; Horvath, S. R.; Jessica, T. S.; Bausch, D. B.

    2014-12-01

    The U.S. Geological Survey (USGS) operates a real-time system that detects earthquakes using only data from Twitter—a service for sending and reading public text-based messages of up to 140 characters. The detector algorithm scans for significant increases in tweets containing the word "earthquake" in several languages and sends internal alerts with the detection time, representative tweet texts, and the location of the population center where most of the tweets originated. It has been running in real-time for over two years and finds, on average, two or three felt events per day, with a false detection rate of 9%. The main benefit of the tweet-based detections is speed, with most detections occurring between 20 and 120 seconds after the earthquake origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. The detections have reasonable coverage of populated areas globally. The number of Twitter-based detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter-based detections are generally caused by widely felt events in populated urban areas that are of more immediate interest than those with no human impact. We will present a technical overview of the system and investigate the potential for rapid characterization of earthquake damage and effects using the 32 million "earthquake" tweets that the system has so far amassed. Initial results show potential for a correlation between characteristic responses and shaking level. For example, tweets containing the word "terremoto" were common following the MMI VII shaking produced by the April 1, 2014 M8.2 Iquique, Chile earthquake whereas a widely-tweeted deep-focus M5.2 north of Santiago, Chile on April 4, 2014 produced MMI VI shaking and almost exclusively "temblor" tweets. 
We are also investigating the use of other social media such as Instagram to obtain rapid images of earthquake-related damage. An Instagram search following the damaging M6.9 earthquake near the Mexico-Guatemala border on July 7, 2014 revealed half a dozen unconfirmed images of damage, the first posted 15 minutes after the event.
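The detector described above is essentially a keyword burst detector. A minimal sketch, assuming a simple short-window vs. long-window rate comparison (the window lengths, ratio, and minimum count below are hypothetical illustrations, not the USGS system's actual parameters):

```python
from collections import deque

def make_burst_detector(short_win=60, long_win=600, ratio=5.0, min_count=10):
    """Flag a burst when the recent per-second rate of "earthquake" tweets
    greatly exceeds the long-window background rate."""
    counts = deque(maxlen=long_win)  # per-second tweet counts

    def update(tweets_this_second):
        counts.append(tweets_this_second)
        short = list(counts)[-short_win:]
        short_rate = sum(short) / max(len(short), 1)
        long_rate = sum(counts) / len(counts)
        return sum(short) >= min_count and short_rate > ratio * max(long_rate, 1e-9)

    return update

detect = make_burst_detector()
for _ in range(600):          # quiet background: no "earthquake" tweets
    detect(0)
fired = [detect(5) for _ in range(30)]  # sudden sustained burst of tweets
```

The minimum-count guard keeps a single stray tweet from triggering during quiet periods; the real system additionally resolves the location of the tweeting population center, which this sketch omits.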

  18. CISN ShakeAlert: Using early warnings for earthquakes in California

    NASA Astrophysics Data System (ADS)

    Vinci, M.; Hellweg, M.; Jones, L. M.; Khainovski, O.; Schwartz, K.; Lehrer, D.; Allen, R. M.; Neuhauser, D. S.

    2009-12-01

Educated users who have developed response plans and procedures are just as important for an earthquake early warning (EEW) system as are the algorithms and computers that process the data and produce the warnings. In Japan, for example, the implementation of the EEW system, which now provides advance alerts of ground shaking, included intense outreach efforts to both institutional and individual recipients. Alerts are now used in automatic control systems that stop trains, place sensitive equipment in safe mode, and isolate hazards while the public takes cover. In California, the California Integrated Seismic Network (CISN) is now developing and implementing components of a prototype system for EEW, ShakeAlert. As this processing system is developed, we invite a suite of prospective users from critical industries and institutions throughout California to partner with us in developing useful ShakeAlert products and procedures. At the same time, we will support their efforts to determine and implement appropriate responses to an early warning of earthquake shaking. As a first step, in a collaboration with BART, we have developed a basic system allowing BART's operations center to receive real-time ground shaking information from more than 150 seismic stations operating in the San Francisco Bay Area. BART engineers are implementing a display system for this information. Later phases will include the development of improved response procedures utilizing this information. We plan to continue this collaboration to include more sophisticated information from the prototype CISN ShakeAlert system.

  19. Changes of the directional brain networks related with brain plasticity in patients with long-term unilateral sensorineural hearing loss.

    PubMed

    Zhang, G-Y; Yang, M; Liu, B; Huang, Z-C; Li, J; Chen, J-Y; Chen, H; Zhang, P-P; Liu, L-J; Wang, J; Teng, G-J

    2016-01-28

Previous studies often report that early auditory deprivation or congenital deafness contributes to cross-modal reorganization in the auditory-deprived cortex, and that this cross-modal reorganization limits the clinical benefit of cochlear prosthetics. However, there are inconsistencies among study results on cortical reorganization in subjects with long-term unilateral sensorineural hearing loss (USNHL). It is also unclear whether a similar cross-modal plasticity of the auditory cortex exists for acquired monaural deafness as for early or congenital deafness. To address this issue, we constructed directional brain functional networks based on entropy connectivity of resting-state functional MRI and examined changes in these networks. Thirty-four long-term USNHL individuals and seventeen normally hearing individuals participated in the study; all USNHL patients had acquired deafness. We found that certain brain regions of the sensorimotor and visual networks presented enhanced synchronous output entropy connectivity with the left primary auditory cortex in left long-term USNHL individuals as compared with normally hearing individuals. In particular, the left USNHL group showed more significant changes in entropy connectivity than the right USNHL group; no significant plastic changes were observed in the right USNHL group. Our results indicate that the left primary auditory cortex (the non-auditory-deprived cortex) in patients with left USNHL has been reorganized by visual and sensorimotor modalities through cross-modal plasticity. Furthermore, this cross-modal reorganization also alters the directional brain functional networks. Auditory deprivation from the left or right side generates different influences on the human brain. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  20. About the structure of cellulose: debating the Lindman hypothesis

    USDA-ARS?s Scientific Manuscript database

    The hypothesis advanced in this issue of Cellulose, that the solubility or insolubility characteristics of cellulose are significantly based upon amphiphilic and hydrophobic molecular interactions, is bound to shake the roots of (some of) our textbook wisdom. The hypothesis is based on the considera...

  1. Nonlinear instability and convection in a vertically vibrated granular bed

    NASA Astrophysics Data System (ADS)

    Shukla, Priyanka; Ansari, I. H.; van der Meer, D.; Lohse, Detlef; Alam, Meheboob

    2015-11-01

The nonlinear instability of the density-inverted granular Leidenfrost state and the resulting convective motion in strongly shaken granular matter are analysed via a weakly nonlinear analysis. Under a quasi-steady ansatz, the base state temperature decreases with increasing height away from the vibrating plate, but the density profile consists of three distinct regions: (i) a collisional dilute layer at the bottom, (ii) a levitated dense layer at some intermediate height and (iii) a ballistic dilute layer at the top of the granular bed. For the nonlinear stability analysis, nonlinearities up to cubic order in the perturbation amplitude are retained, leading to the Landau equation. The genesis of granular convection is shown to be tied to a supercritical pitchfork bifurcation from the Leidenfrost state. Near the bifurcation point the equilibrium amplitude is found to follow a square-root scaling law, A_e ∝ √Δ, where Δ is the distance from the bifurcation point. The strength of convection is maximal at some intermediate value of the shaking strength, with weaker convection both at weaker and stronger shaking. Our theory predicts a novel floating-convection state at very strong shaking.
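The square-root law follows directly from the Landau equation for a supercritical pitchfork. With generic coefficients (placeholders for illustration, not the values computed in the paper):

```latex
\frac{dA}{dt} = \sigma A - \ell A^{3},
\qquad \sigma \propto \Delta,\quad \ell > 0 .
```

Setting \(dA/dt = 0\) for the nontrivial equilibrium gives \(A_e = \sqrt{\sigma/\ell} \propto \sqrt{\Delta}\), i.e. the equilibrium amplitude grows as the square root of the distance from the bifurcation point, and the bifurcation is supercritical because \(\ell > 0\) saturates the growth.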

  2. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    NASA Astrophysics Data System (ADS)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center, and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario: a description of the impacts and consequences of a M7.8 earthquake on the southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools, and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments.
Actions taken as a result of the ShakeOut include the adoption of earthquake response technologies by Los Angeles Unified School District and a top to bottom examination of Los Angeles County Fire Department's earthquake response strategies.

  3. Study on soil-pile-structure-TMD interaction system by shaking table model test

    NASA Astrophysics Data System (ADS)

    Lou, Menglin; Wang, Wenjian

    2004-06-01

    The success of the tuned mass damper (TMD) in reducing wind-induced structural vibrations has been well established. However, from most of the recent numerical studies, it appears that for a structure situated on very soft soil, soil-structure interaction (SSI) could render a damper on the structure totally ineffective. In order to experimentally verify the SSI effect on the seismic performance of TMD, a series of shaking table model tests have been conducted and the results are presented in this paper. It has been shown that the TMD is not as effective in controlling the seismic responses of structures built on soft soil sites due to the SSI effect. Some test results also show that a TMD device might have a negative impact if the SSI effect is neglected and the structure is built on a soft soil site. For structures constructed on a soil foundation, this research verifies that the SSI effect must be carefully understood before a TMD control system is designed to determine if the control is necessary and if the SSI effect must be considered when choosing the optimal parameters of the TMD device.
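For context on "choosing the optimal parameters of the TMD device": the classical fixed-base Den Hartog tuning rules can be sketched as below. The study's point is precisely that SSI can shift these fixed-base optima, so this is background, not the paper's method; the mass ratio used in the example is an arbitrary illustration.

```python
import math

def den_hartog_tmd(mu):
    """Classical optimal TMD tuning for a fixed-base SDOF structure under
    harmonic excitation: returns (frequency ratio, TMD damping ratio) as
    functions of the TMD-to-structure mass ratio mu."""
    f_opt = 1.0 / (1.0 + mu)
    zeta_opt = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    return f_opt, zeta_opt

f, z = den_hartog_tmd(0.02)  # a 2% mass ratio, typical for building TMDs
```

On a soft-soil site the effective system frequency and damping seen by the TMD change with the soil, which is why a design based on these fixed-base formulas can end up detuned.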

  4. The Earthquake Closet: Making Early-Warning Useful

    NASA Astrophysics Data System (ADS)

    Wyss, M.; Trendafiloski, G.

    2009-12-01

Early warning of approaching strong shaking that could have fatal consequences is a research field that has made great progress. It makes it possible to reduce the impact on dangerous processes in critical facilities and on trains. However, its potential to save lives has a serious Achilles heel: in many cities, the time for getting to safety is only five to 10 seconds. Occupants of upper floors cannot get out of their buildings, and narrow streets are not a safe place in strong earthquakes for people who might be able to exit. Thus, only about 10% of a city's population can benefit from early warnings, unless they have access to their own earthquake closet that is strong enough to remain intact in a collapsing building. Such an Earthquake Protection Unit (EPU) may be installed in the structurally strongest part of an existing apartment at low cost. In new constructions, we propose that an earthquake shelter be constructed for each floor, large enough to accommodate all occupants of that floor. These EPUs should be constructed on top of each other, forming a strong tower next to the elevator shaft and the staircase, at the center of the building. If an EPU with structural properties equivalent to an E-class building is placed into a B-class building in South America, for example, we estimate that the chances of surviving shaking of intensity VII are about 30,000 times better inside the closet. We estimate the probability of escaping injury inside, compared to outside, to be about 1,500 times better. Educating the population regarding the usefulness of EPUs will be essential, and P-waves can be used as the early warning signal. The owner of an earthquake closet can easily be motivated to take protective measures when these involve simply stepping into the closet, rather than attempting to exit the building by running down many flights of stairs. 
Our intention is to start a discussion of how best to construct EPUs and how to introduce legislation that will require earthquake shelters in new multistory buildings.

  5. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer during a variety of controlled shake table tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left lying freely on the shake table. The nature of these datasets is also quite different from traditional networks because smartphones move around with their owners. Therefore, we must distinguish earthquake signals from other everyday motions. In addition to the shake table tests, which accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
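The abstract does not specify the network's input features, but waveform features of the following kind are commonly used to separate periodic human motion from broadband earthquake shaking. This is a hypothetical illustration with synthetic signals, not the authors' feature set:

```python
import math

def features(accel, dt=0.01):
    """Two simple waveform features such a classifier might use:
    interquartile range (amplitude spread) and zero-crossing rate (per second)."""
    xs = sorted(accel)
    n = len(xs)
    iqr = xs[(3 * n) // 4] - xs[n // 4]
    crossings = sum(1 for a, b in zip(accel, accel[1:]) if a * b < 0)
    zcr = crossings / (len(accel) * dt)
    return iqr, zcr

# synthetic records: periodic "walking" (~2 Hz) vs. broadband "shaking"
t = [i * 0.01 for i in range(1000)]
walking = [math.sin(2 * math.pi * 2.0 * ti) for ti in t]
shaking = [sum(math.sin(2 * math.pi * f * ti + f) for f in (1, 3, 7, 13)) / 4
           for ti in t]

iqr_w, zcr_w = features(walking)
iqr_s, zcr_s = features(shaking)
# the broadband signal crosses zero far more often than the 2 Hz walking signal
```

Feature vectors like these would then be fed to a trained classifier (in the paper, an artificial neural network) to label each record as earthquake or human activity.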

  6. Tissue engineering of cartilage using a mechanobioreactor exerting simultaneous mechanical shear and compression to simulate the rolling action of articular joints.

    PubMed

    Shahin, Kifah; Doran, Pauline M

    2012-04-01

    The effect of dynamic mechanical shear and compression on the synthesis of human tissue-engineered cartilage was investigated using a mechanobioreactor capable of simulating the rolling action of articular joints in a mixed fluid environment. Human chondrocytes seeded into polyglycolic acid (PGA) mesh or PGA-alginate scaffolds were precultured in shaking T-flasks or recirculation perfusion bioreactors for 2.5 or 4 weeks prior to mechanical stimulation in the mechanobioreactor. Constructs were subjected to intermittent unconfined shear and compressive loading at a frequency of 0.05 Hz using a peak-to-peak compressive strain amplitude of 2.2% superimposed on a static axial compressive strain of 6.5%. The mechanical treatment was carried out for up to 2.5 weeks using a loading regime of 10 min duration each day with the direction of the shear forces reversed after 5 min and release of all loading at the end of the daily treatment period. Compared with shaking T-flasks and mechanobioreactor control cultures without loading, mechanical treatment improved the amount and quality of cartilage produced. On a per cell basis, synthesis of both major structural components of cartilage, glycosaminoglycan (GAG) and collagen type II, was enhanced substantially by up to 5.3- and 10-fold, respectively, depending on the scaffold type and seeding cell density. Levels of collagen type II as a percentage of total collagen were also increased after mechanical treatment by up to 3.4-fold in PGA constructs. Mechanical treatment had a less pronounced effect on the composition of constructs precultured in perfusion bioreactors compared with perfusion culture controls. This work demonstrates that the quality of tissue-engineered cartilage can be enhanced significantly by application of simultaneous dynamic mechanical shear and compression, with the greatest benefits evident for synthesis of collagen type II. Copyright © 2011 Wiley Periodicals, Inc.

  7. A digital 3D atlas of the marmoset brain based on multi-modal MRI.

    PubMed

    Liu, Cirong; Ye, Frank Q; Yen, Cecil Chern-Chyi; Newman, John D; Glen, Daniel; Leopold, David A; Silva, Afonso C

    2018-04-01

    The common marmoset (Callithrix jacchus) is a New-World monkey of growing interest in neuroscience. Magnetic resonance imaging (MRI) is an essential tool to unveil the anatomical and functional organization of the marmoset brain. To facilitate identification of regions of interest, it is desirable to register MR images to an atlas of the brain. However, currently available atlases of the marmoset brain are mainly based on 2D histological data, which are difficult to apply to 3D imaging techniques. Here, we constructed a 3D digital atlas based on high-resolution ex-vivo MRI images, including magnetization transfer ratio (a T1-like contrast), T2w images, and multi-shell diffusion MRI. Based on the multi-modal MRI images, we manually delineated 54 cortical areas and 16 subcortical regions on one hemisphere of the brain (the core version). The 54 cortical areas were merged into 13 larger cortical regions according to their locations to yield a coarse version of the atlas, and also parcellated into 106 sub-regions using a connectivity-based parcellation method to produce a refined atlas. Finally, we compared the new atlas set with existing histology atlases and demonstrated its applications in connectome studies, and in resting state and stimulus-based fMRI. The atlas set has been integrated into the widely-distributed neuroimaging data analysis software AFNI and SUMA, providing a readily usable multi-modal template space with multi-level anatomical labels (including labels from the Paxinos atlas) that can facilitate various neuroimaging studies of marmosets. Published by Elsevier Inc.

  8. The Sidebar Computer Program, a seismic-shaking intensity meter: users' manual and software description

    USGS Publications Warehouse

    Evans, John R.

    2003-01-01

The SideBar computer program provides a visual display of seismic shaking intensity as recorded at one specific seismograph. This software allows a user to tap into the seismic data recorded on that seismograph and to display the overall level of shaking at the single location where the seismograph resides (usually the same place the user is). From this shaking level, SideBar also estimates the potential for damage nearby. SideBar cannot tell you the “Richter magnitude” of the earthquake (see box), only how hard the ground shook locally and this estimate of how much damage is likely in the neighborhood. This combination of local effects is called the “seismic intensity”. SideBar runs on a standard desktop or laptop PC, and is intended for the media, schools, emergency responders, and any other group that hosts a seismograph and wants to know, immediately after an earthquake, the levels of shaking measured by that instrument. These local values can be used to inform the public and help initiate appropriate local emergency response activities in the minutes between the earthquake and the availability of the broader coverage provided by the USGS over the Web, notably by ShakeMap. For example, for instruments installed in schools, the level of shaking and likely damage at the school could immediately be broadcast on the Web, and parents could quickly determine the likely safety of their children—their biggest postearthquake concern. Also, in the event of a Web outage, SideBar may remain a primary source of local emergency response information for some additional minutes. Specifically, SideBar interprets the peak level of acceleration (that is, the force of shaking, as a percentage of the force of gravity) as well as the peak velocity, or highest speed, at which the ground moves. 
Using these two basic measurements, SideBar computes what is called Instrumental Intensity—a close approximation of the Modified Mercalli Intensity scale, or “MMI” (using the Wald et al., 1999a, relationships between acceleration, velocity, and shaking intensity). Intensity is a measure of local shaking strength and the potential for damage—of how bad the earthquake effects were locally. The intensity level is what SideBar displays most prominently on the PC monitor. Intensity is shown as a large, colored bar that gets taller and changes color up a rainbow from blues toward reds as the shaking level increases. As opposed to earthquake magnitudes, which are reported as decimal values (like “7.6”), intensity is traditionally given as a Roman numeral, with “I” to “X+” assigned to levels of potential damage and perceived shaking strength. For good measure, SideBar shows the actual values of the force of shaking (peak ground acceleration as a percentage of gravity) and the speed of ground motion (peak ground velocity in inches per second, by default, or in centimeters per second, if you wish), both these values as decimal numbers. SideBar also remembers the most recent earthquakes (for up to one week), and can store as many of these previous earthquakes as the user allows (and as the user’s PC has room for)—typically thousands. SideBar also remembers forever the three largest earthquakes it has seen and all earthquakes over intensity IV so that one never loses particularly important events.
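The PGA/PGV-to-intensity conversion SideBar applies can be sketched as follows. The coefficients are the Wald et al. (1999) instrumental-intensity regressions as commonly quoted (bilinear in log PGA or log PGV, with a flatter branch below MMI V); treat the exact values as an assumption and consult the paper or the ShakeMap documentation for the authoritative form:

```python
import math

def mmi_from_pga(pga_cms2):
    """Instrumental intensity from peak ground acceleration (cm/s^2),
    using the Wald et al. (1999) regressions as commonly quoted."""
    i = 3.66 * math.log10(pga_cms2) - 1.66
    if i < 5.0:  # below MMI V the flatter low-intensity branch applies
        i = 2.20 * math.log10(pga_cms2) + 1.00
    return i

def mmi_from_pgv(pgv_cms):
    """Instrumental intensity from peak ground velocity (cm/s)."""
    i = 3.47 * math.log10(pgv_cms) + 2.35
    if i < 5.0:
        i = 2.10 * math.log10(pgv_cms) + 3.40
    return i

# e.g. a PGA of 10% g (~98 cm/s^2) maps to roughly MMI VI-
print(round(mmi_from_pga(98.0), 1))  # prints 5.6
```

ShakeMap-style systems combine both estimates (typically preferring PGV at higher intensities), and the result is then rounded to the Roman-numeral MMI level that SideBar displays as its colored bar.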

  9. Seismic shaking in the North China Basin expected from ruptures of a possible seismic gap

    NASA Astrophysics Data System (ADS)

    Duan, Benchun; Liu, Dunyu; Yin, An

    2017-05-01

A 160 km long seismic gap, which has not ruptured in over 8000 years, was identified recently in North China. In this study, we use a dynamic source model and a newly available high-resolution 3-D velocity structure to simulate long-period ground motion (up to 0.5 Hz) from possible worst-case rupture scenarios of the seismic gap. We find that the characteristics of the earthquake source and the local geologic structure play a critical role in controlling the amplitude and distribution of the simulated strong ground shaking. Rupture directivity and slip asperities can result in large-amplitude (i.e., >1 m/s) ground shaking near the fault, whereas long-duration shaking may occur within sedimentary basins. In particular, a deep and closed Quaternary basin between Beijing and Tianjin can lead to ground shaking of several tens of cm/s for more than 1 min. These results may provide a sound basis for seismic mitigation in one of the most populated regions in the world.

  10. MyShake: Initial observations from a global smartphone seismic network

    NASA Astrophysics Data System (ADS)

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis

    2016-09-01

MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. In the first 6 months since the release of the MyShake app, there were almost 200,000 downloads. On a typical day about 8000 phones provide acceleration waveform data to the MyShake archive. The on-phone app can detect and trigger on P waves and is capable of recording magnitude 2.5 and larger events. More than 200 seismic events have been recorded so far, including events in Chile, Argentina, Mexico, Morocco, Nepal, New Zealand, Taiwan, Japan, and across North America. The largest number of waveforms from a single earthquake to date comes from the M5.2 Borrego Springs earthquake in Southern California, for which MyShake collected 103 useful three-component waveforms. The network continues to grow with new downloads from the Google Play store every day, and it expands rapidly when public interest in earthquakes peaks, such as during an earthquake sequence.

  11. Successful ShakeAlert Performance for the Napa Quake

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Given, D. D.; Heaton, T. H.; Vidale, J. E.

    2014-12-01

ShakeAlert, the demonstration earthquake early warning system developed by the USGS, UC Berkeley, Caltech, ETH, and the University of Washington, functioned as expected for the August 24, 2014, M6.0 Napa earthquake. The first ShakeAlert was generated by the ElarmS algorithm 5.1 sec after the origin time of the earthquake, and 3.3 sec after the P-wave arrived at the closest station, 6.5 km from the epicenter. This initial alert, based on P-wave triggers from four stations, estimated the magnitude to be 5.7. The warning was received at the UC Berkeley Seismological Laboratory 5 seconds before the S-wave and about 10 sec prior to the onset of the strongest shaking. ShakeAlert beta-testers across the San Francisco Bay Area simultaneously received the alert, including the San Francisco 911 center, with 8 sec of warning, and the BART train system. BART has implemented an automated train-stopping system that was activated (although no trains were running at 3:20 am). With the available network geometry and communications, the blind zone of the first alert had a radius of 16 km. The four stations that contributed to the first alert all encapsulate data into 1-second packets, but the latency in transmitting data to the processing center ranged from 0.27 to 2.62 seconds. If all the stations were to deliver data in 0.27 seconds, then the alert would have been available 2.3 sec sooner and the blind zone would be reduced to about 8 km. This would also mean that the city of Napa would have received about 1 second of warning. The magnitude estimate and event location were accurate from the initial alert onwards. The magnitude estimate first increased to 5.8, dipped to 5.4 at 2.6 sec after the initial alert, stayed at that level for 2 sec, and then returned to 5.7. The final magnitude estimate was 6.0, consistent with the ANSS catalog.
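The relationship between alert latency and blind-zone radius quoted above can be sketched with simple wavefront geometry. The S-wave speed and source depth below are illustrative placeholders, not values from the abstract:

```python
import math

def blind_zone_radius(alert_delay_s, depth_km=5.0, vs_km_s=3.5):
    """Epicentral radius inside which the S-wave arrives before the alert.
    When the alert goes out, the S front has travelled vs*t along the ray
    path; converting that hypocentral distance to an epicentral radius
    requires subtracting the source depth."""
    ray = vs_km_s * alert_delay_s            # hypocentral distance of S front
    if ray <= depth_km:
        return 0.0                           # S-wave has not reached the surface
    return math.sqrt(ray ** 2 - depth_km ** 2)

r_actual = blind_zone_radius(5.1)        # alert 5.1 s after origin time
r_faster = blind_zone_radius(5.1 - 2.3)  # with uniform 0.27 s telemetry
```

With these placeholder values the two radii come out near the 16 km and 8 km figures in the abstract, illustrating why shaving ~2 s off data latency roughly halves the blind zone.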

  12. Science Resulting from U.S. Geological Survey's "Did You Feel It?" Citizen Science Portal

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Dewey, J. W.; Atkinson, G. M.; Worden, C. B.; Quitoriano, V. P. R.

    2016-12-01

The U.S. Geological Survey (USGS) "Did You Feel It?" (DYFI) system, in operation since 1999, is an automated approach for rapidly collecting macroseismic intensity data from internet users' shaking and damage reports and generating intensity maps immediately following earthquakes felt around the globe. As with any citizen science project, a significant component of the DYFI system is public awareness and participation in the immediate aftermath of any widely felt earthquake, allowing the public and the USGS to exchange valuable post-earthquake information. The data collected are remarkably robust and useful, as indicated by the range of peer-reviewed literature that relies on these citizen-science intensity reports. A Google Scholar search returns 14,700 articles citing DYFI, a number of which rely exclusively on these data. Though focused on topics of earthquake seismology (including shaking attenuation and relationships with damage), other studies cover social media use in disasters, human risk perception, earthquake-induced landslides, rapid impact assessment, emergency response, and science education. DYFI data have also been analyzed for non-earthquake events, including explosions, aircraft sonic booms, and even bolides, and DYFI is now one of the best data sources from which to study induced earthquakes. Yet DYFI was designed primarily as an operational system to rapidly assess the effects of earthquakes for situational awareness. Oftentimes, DYFI data are the only data available pertaining to shaking levels for much of the United States. As such, DYFI provides site-specific constraints on the shaking levels that feed directly into ShakeMap; thus, these data are readily available to emergency managers and responders, the media, and the general public. 
As an early adopter of web-based citizen science and having worked out many kinks in the process, DYFI developers have provided guidance on many other citizen-science endeavors across a wide range of disciplines.

  13. Evaluating glacier movement fluctuations using remote sensing: A case study of the Baird, Patterson, LeConte, and Shakes glaciers in central Southeastern Alaska

    NASA Astrophysics Data System (ADS)

    Davidson, Robert Howard

Global Land Survey (GLS) data encompassing Landsat Multispectral Scanner (MSS), Landsat 5's Thematic Mapper (TM), and Landsat 7's Enhanced Thematic Mapper Plus (ETM+) were used to determine the terminus locations of Baird, Patterson, LeConte, and Shakes Glaciers in Alaska over the period 1975-2010. The sequences of terminus locations were investigated to determine the movement rates of these glaciers with respect to specific physical and environmental conditions. GLS data from 1975, 1990, 2000, 2005, and 2010 in false-color composite images enhancing ice-snow differentiation and Iterative Self-Organizing (ISO) Data Cluster Unsupervised Classifications were used to 1) quantify the movement rates of Baird, Patterson, LeConte, and Shakes Glaciers; 2) analyze the movement rates for glaciers with similar terminal terrain conditions; and 3) analyze the movement rates for glaciers with dissimilar terminal terrain conditions. From the established sequence of terminus locations, movement distances between glacier positions were quantified. Movement distances were then compared to see if any correlation existed between glaciers with similar or dissimilar terminal terrain conditions. The Global Land Ice Measurements from Space (GLIMS) data were used as a starting point from which glacier movement was measured for Baird, Patterson, and LeConte Glaciers only, as the Shakes Glacier is currently not included in the GLIMS database. National Oceanic and Atmospheric Administration (NOAA) temperature data collected at the Petersburg, Alaska, meteorological station (from January 1, 1973 to December 31, 2009) were used to help understand the climatic conditions in this area and their potential impact on the glaciers' terminuses. Results show that glaciers with similar terminal terrain conditions (Patterson and Shakes Glaciers) and glaciers with dissimilar terminal terrain conditions (Baird, Patterson, and LeConte Glaciers) did not exhibit similar movement rates. 
Glacier movement rates were greatest for glaciers whose terminuses were in fresh water (Patterson and Shakes Glaciers), less for those with terminuses in salt water (LeConte Glacier), and least for glaciers with terminuses on dry land (Baird Glacier). Based upon these findings, the presence of water, especially fresh water, at the terminal end of the Patterson and Shakes Glaciers had a greater effect on glacier movement than slope. Possible explanations for this effect might include a heat sink effect or tidal motions that hasten glacier disintegration in the ablation zone. In a heat sink scenario, the water bodies in which the Patterson and Shakes Glaciers' terminuses are located could act as a thermal energy transfer medium that increases glacier melting and subsequent retreat. On the other hand, tidal motions could act as horizontal and vertical push/pull forces, which increase the fracturing rate, calving, and subsequent retreat of glacier terminuses that are in salt water, as with the LeConte Glacier. Over the length of the study period, 1975 through 2010, there was a 0.85°C increase in annual air temperatures that, although it may seem small, may prove important when determining glacial mass balance rates. Further studies are necessary to test these hypotheses and to determine whether a heat sink effect and tidal motions significantly affected the movement rates of the glaciers in this study area. An additional significant result of this study was the creation of shapefiles delineating the positions of the Shakes Glacier, which are being submitted to the Global Land Ice Measurements from Space (GLIMS) program for inclusion in its master worldwide glacier database.

  14. Borders and Modal Articulations. Semiotic Constructs of Sensemaking Processes Enabling a Fecund Dialogue Between Cultural Psychology and Clinical Psychology.

    PubMed

    De Luca Picione, Raffaele; Freda, Maria Francesca

    2016-03-01

    The notion of the border is an interesting advance in research on processes of meaning making within cultural psychology. Developing this notion in a semiotic key makes it possible to handle, with adequate complexity, the construction, transformation, stability, and breakup of the relationship between person/world/otherness. These semiotic implications have already been widely discussed by authors such as Valsiner (2007, 2014), Neuman (2003, 2008), and Simão (Culture & Psychology, 9, 449-459, 2003; Theory & Psychology, 15, 549-574, 2005; 2015), with respect to issues of identity/relatedness, inside/outside, and stability/change in the irreversible flow of time. In this work, after presenting some basics of this semiotic notion of the border, we discuss the processes of construction and transformation of borders through modal articulation, defined as the contextual positioning that the person assumes with respect to the establishment of a boundary in terms of necessity, obligation, willingness, possibility, permission, and ability. This modal subjective positioning is of considerable clinical interest, since its degree of plasticity versus rigidity underlies processes of development or stiffening of the relations between person/world/otherness.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kundu, B.K.; Stolin, A.V.; Pole, J.

    Our group is developing a scanner that combines x-ray, single gamma, and optical imaging on the same rotating gantry. Two functional modalities (SPECT and optical) are included because they have different strengths and weaknesses in terms of spatial and temporal decay lengths in the context of in vivo imaging, and because of the recent advent of multiple reporter gene constructs. The effect of attenuation by biological tissue on the detected intensity of the emitted signal was measured for both gamma and optical imaging. Attenuation by biological tissue was quantified both for the bioluminescent emission of luciferase and for the emission light of the near-infrared fluorophore cyanine 5.5, using a fixed excitation light intensity. Experiments were performed to test the feasibility of using either single gamma or x-ray imaging to make depth-dependent corrections to the measured optical signal. Our results suggest that significant improvements in quantitation of optical emission are possible using straightforward correction techniques based on information from other modalities. Development of an integrated scanner in which data from each modality are obtained with the animal in a common configuration will greatly simplify this process.
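
    A depth-dependent correction of the kind described can be sketched with a Beer-Lambert attenuation model; the effective attenuation coefficient and source depth below are illustrative values, not measurements from this work.

```python
import math

def attenuate(intensity_at_source, depth_mm, mu_eff_per_mm):
    """Beer-Lambert model: signal decays as exp(-mu_eff * depth) in tissue."""
    return intensity_at_source * math.exp(-mu_eff_per_mm * depth_mm)

def depth_corrected(measured_intensity, depth_mm, mu_eff_per_mm):
    """Invert the attenuation once the source depth is known, e.g. from
    coregistered x-ray or SPECT data: I0 = I_measured * exp(mu_eff * depth)."""
    return measured_intensity * math.exp(mu_eff_per_mm * depth_mm)

# Round trip: an optical source of strength 1000 at an assumed 8 mm depth
# with an assumed effective attenuation of 0.23 per mm
measured = attenuate(1000.0, 8.0, 0.23)
recovered = depth_corrected(measured, 8.0, 0.23)
```

    In practice heterogeneous tissue would require a spatially varying attenuation coefficient; the single-coefficient inversion here only illustrates why a depth estimate from another modality makes the optical signal quantifiable.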

  16. Specific Signature of Seismic Shaking in Landslide Inventories: Case of the Chichi Earthquake

    NASA Astrophysics Data System (ADS)

    Meunier, P.; Rault, C.; Marc, O.; Hovius, N.

    2017-12-01

    The 1999 Chichi earthquake triggered 10 000 landslides in its epicentral area. In addition to coseismic landsliding, directly induced by the shaking, the hillslope response extended to several years after the main shock, during which landslide susceptibility remained higher than during the pre-seismic period. We attribute this elevated rate to weakening effects caused by the shaking. The characteristics of the coseismic landslide catalogues (clustering, slope and azimuth distributions) bear the signature of the seismic triggering. Extended landslide mapping (1994-2004) allows us to track changes in these signatures in order to better interpret them. We present a summary of the change of these signatures through time and space. At the scale of the epicentral area, we show that coseismic landslide clustering clearly occurred along the fault where the shaking was strong. In 3 sub-catchments of the Choshui river, a finer analysis of the landslide time series reveals a mixed signature of both geology and shaking. Pre-quake rain-induced landslides preferentially occurred downslope and along the bedding planes, while coseismic landslides locate higher in the landscape, on slopes strongly affected by site effects. However, during the post-seismic period, the signature of the shaking is not present while the landslide rate remains high, suggesting that weakening effects were distributed homogeneously in the landscape.

  17. Earthquake early Warning ShakeAlert system: West coast wide production prototype

    USGS Publications Warehouse

    Kohler, Monica D.; Cochran, Elizabeth S.; Given, Douglas; Guiwits, Stephen; Neuhauser, Doug; Hensen, Ivan; Hartog, Renate; Bodin, Paul; Kress, Victor; Thompson, Stephen; Felizardo, Claude; Brody, Jeff; Bhadha, Rayo; Schwarz, Stan

    2017-01-01

    Earthquake early warning (EEW) is an application of seismological science that can give people, as well as mechanical and electrical systems, up to tens of seconds to take protective actions before peak earthquake shaking arrives at a location. Since 2006, the U.S. Geological Survey has been working in collaboration with several partners to develop EEW for the United States. The goal is to create and operate an EEW system, called ShakeAlert, for the highest risk areas of the United States, starting with the West Coast states of California, Oregon, and Washington. In early 2016, the Production Prototype v.1.0 was established for California; then, in early 2017, v.1.2 was established for the West Coast, with earthquake notifications being distributed to a group of beta users in California, Oregon, and Washington. The new ShakeAlert Production Prototype was an outgrowth from an earlier demonstration EEW system that began sending test notifications to selected users in California in January 2012. ShakeAlert leverages the considerable physical, technical, and organizational earthquake monitoring infrastructure of the Advanced National Seismic System, a nationwide federation of cooperating seismic networks. When fully implemented, the ShakeAlert system may reduce damage and injury caused by large earthquakes, improve the nation’s resilience, and speed recovery.

  18. Specific signature of seismic shaking in landslide catalogues: Case of the Chichi earthquake

    NASA Astrophysics Data System (ADS)

    Meunier, Patrick; Rault, Claire; Marc, Odin; Hovius, Niels

    2017-04-01

    The 1999 Chichi earthquake triggered 10 000 landslides in its epicentral area. In addition to coseismic landsliding, directly induced by the shaking, the hillslope response extended to several years after the main shock, during which landslide susceptibility remained higher than during the pre-seismic period. We attribute this elevated rate to weakening effects caused by the shaking. The characteristics of the coseismic landslide catalogues (clustering, slope and azimuth distributions) bear the signature of the seismic triggering. Extended landslide mapping (1994-2004) allows us to track changes in these signatures in order to better interpret them. We present a summary of the change of these signatures through time and space. At the scale of the epicentral area, we show that coseismic landslide clustering clearly occurred along the fault where the shaking was strong. In 3 sub-catchments of the Choshui river, a finer analysis of the landslide time series reveals a mixed signature of both geology and shaking. Pre-quake rain-induced landslides preferentially occurred downslope and along the bedding planes, while coseismic landslides locate higher in the landscape, on slopes strongly affected by site effects. However, during the post-seismic period, the signature of the shaking is not present while the landslide rate remains high, suggesting that weakening effects were distributed homogeneously in the landscape.

  19. Recorded earthquake responses from the integrated seismic monitoring network of the Atwood Building, Anchorage, Alaska

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in the 20-story steel-framed Atwood Building, and at a nearby free-field site, in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depth from 15 to 200 feet (~5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away trace the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency at approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes demonstrate the repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.
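
    The site-building resonance concern can be expressed as a simple frequency-proximity check using the values reported above; the 20% fractional tolerance is an illustrative assumption, not a criterion from this study.

```python
def near_resonance(f_site_hz, f_modal_hz, tol=0.20):
    """True when the site frequency lies within a fractional tolerance
    of a building modal frequency (an assumed proximity criterion)."""
    return abs(f_site_hz - f_modal_hz) / f_modal_hz <= tol

# Frequencies reported for the Atwood Building and its site array (Hz)
f_site = 1.5
second_modes = {"NS": 1.83, "EW": 1.43}
fundamental_modes = {"NS": 0.58, "EW": 0.47}

flags = {d: near_resonance(f_site, f) for d, f in second_modes.items()}
```

    With these numbers the site frequency flags both second modes as nearby while the fundamental modes are far from it, consistent with the resonance concern stated in the abstract.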

  20. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

    Correct assessment of probabilistic seismic hazard and risk maps is the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Because of this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods, and its coefficients were then used in the probabilistic seismic hazard assessment. The obtained seismic hazard maps show that there were gaps in previous seismic hazard assessments and that the present normative seismic hazard map needs a careful recalculation.

  1. Enhanced production of astaxanthin by Chromochloris zofingiensis in a microplate-based culture system under high light irradiation.

    PubMed

    Chen, Jun-Hui; Liu, Lu; Wei, Dong

    2017-12-01

    The green microalga Chromochloris zofingiensis is a promising producer of natural astaxanthin. In the present study, C. zofingiensis was first cultivated in shake flasks under low light irradiation and then subjected to continuous high light irradiation, which effectively promoted astaxanthin production. In addition, a microplate-based culture system in concert with high light irradiation from blue light and white light above 150 μmol m⁻² s⁻¹ was constructed and applied to improve astaxanthin production. Blue light exerted more positive influences on astaxanthin accumulation, but when the light intensity was increased to 300 μmol m⁻² s⁻¹, astaxanthin biosynthesis was substantially inhibited. Conversely, in a nitrogen-deprived culture under white light, the highest astaxanthin content for C. zofingiensis, 7.1 mg/g, was obtained. The highest astaxanthin yield achieved was 38.9 mg/L in a culture with 0.1 g/L nitrate under the same culture conditions. This study demonstrates that C. zofingiensis has great potential for natural astaxanthin production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Construction and fed-batch cultivation of Candida famata with enhanced riboflavin production.

    PubMed

    Dmytruk, Kostyantyn; Lyzak, Oleksy; Yatsyshyn, Valentyna; Kluz, Maciej; Sibirny, Vladimir; Puchalski, Czeslaw; Sibirny, Andriy

    2014-02-20

    Riboflavin (vitamin B2) is an essential nutritional component serving as a precursor of the coenzymes FMN and FAD, which are involved mostly in reactions of oxidative metabolism. Riboflavin is produced at commercial scale and is used in the feed and food industries and in medicine. The yeast Candida famata (Candida flareri) belongs to the group of so-called "flavinogenic yeasts", which overproduce riboflavin under iron limitation. Three genes, SEF1, RIB1 and RIB7, coding for a putative transcription factor, GTP cyclohydrolase II, and riboflavin synthase, respectively, were simultaneously overexpressed in the background of a non-reverting riboflavin-producing mutant, AF-4, obtained earlier in our laboratory using methods of classical selection (Dmytruk et al. (2011), Metabolic Engineering 13, 82-88). Cultivation conditions of the constructed strain were optimized for shake-flask and bioreactor cultivations. The constructed strain accumulated up to 16.4 g/L of riboflavin in optimized medium in a 7 L laboratory bioreactor during fed-batch fermentation. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. An Atlas of ShakeMaps for Landslide and Liquefaction Modeling

    NASA Astrophysics Data System (ADS)

    Johnson, K. L.; Nowicki, M. A.; Mah, R. T.; Garcia, D.; Harp, E. L.; Godt, J. W.; Lin, K.; Wald, D. J.

    2012-12-01

    The human consequences of a seismic event are often a result of subsequent hazards induced by the earthquake, such as landslides. While the United States Geological Survey (USGS) ShakeMap and Prompt Assessment of Global Earthquakes for Response (PAGER) systems are, in conjunction, capable of estimating the damage potential of earthquake shaking in near-real time, they do not currently provide estimates for the potential of further damage by secondary processes. We are developing a sound basis for providing estimates of the likelihood and spatial distribution of landslides for any global earthquake under the PAGER system. Here we discuss several important ingredients in this effort. First, we report on the development of a standardized hazard layer from which to calibrate observed landslide distributions; in contrast, prior studies have used a wide variety of means for estimating the hazard input. This layer now takes the form of a ShakeMap, a standardized approach for computing geospatial estimates for a variety of shaking metrics (both peak ground motions and shaking intensity) from any well-recorded earthquake. We have created ShakeMaps for about 20 historical landslide "case history" events, significant in terms of their landslide occurrence, as part of an updated release of the USGS ShakeMap Atlas. We have also collected digitized landslide data from open-source databases for many of the earthquake events of interest. When these are combined with up-to-date topographic and geologic maps, we have the basic ingredients for calibrating landslide probabilities for a significant collection of earthquakes. In terms of modeling, rather than focusing on mechanistic models of landsliding, we adopt a strictly statistical approach to quantify landslide likelihood. 
We incorporate geology, slope, peak ground acceleration, and landslide data as variables in a logistic regression, selecting the best explanatory variables given the standardized new hazard layers (see Nowicki et al., this meeting, for more detail on the regression). To make the ShakeMap and PAGER systems more comprehensive in terms of secondary losses, we are working to calibrate a similarly constrained regression for liquefaction estimation using a suite of well-studied earthquakes for which detailed, digitized liquefaction datasets are available; here variants of wetness index and soil strength replace geology and slope. We expect that this Atlas of ShakeMaps for landslide and liquefaction case history events, which will soon be publicly available via the internet, will aid in improving the accuracy of loss-modeling systems such as PAGER, as well as allow for a common framework for numerous other mechanistic and empirical studies.
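
    The statistical approach can be sketched as a logistic regression over gridded predictors. The predictors below (slope, PGA, a weak-geology flag) follow the variables named above, but the synthetic data, coefficients, and fitting routine are purely illustrative, not the calibrated model used by the PAGER system.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=3000):
    """Fit logistic-regression weights by plain gradient descent.
    X: (n, p) standardized predictors; y: (n,) binary landslide occurrence."""
    Xb = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = sigmoid(Xb @ w)
        w += lr * Xb.T @ (y - p) / len(y)        # log-likelihood gradient step
    return w

def predict(w, X):
    return sigmoid(np.column_stack([np.ones(len(X)), X]) @ w)

# Synthetic grid cells: slope (deg), PGA (g), weak-geology indicator.
rng = np.random.default_rng(0)
n = 5000
slope = rng.uniform(0, 45, n)
pga = rng.uniform(0, 1, n)
weak = rng.integers(0, 2, n).astype(float)
z_true = -6.0 + 0.08 * slope + 4.0 * pga + 1.0 * weak   # assumed "true" model
y = (rng.uniform(size=n) < sigmoid(z_true)).astype(float)

X = np.column_stack([slope, pga, weak])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)               # standardize predictors
w = fit_logistic(Xs, y)
prob = predict(w, Xs)
```

    The fitted weights recover positive effects for slope, PGA, and weak geology, mirroring the qualitative expectation that steep, strongly shaken, weak-rock cells are most landslide-prone; variable selection against the standardized hazard layers is described in the companion regression work cited above.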

  4. Sensor Level Functional Connectivity Topography Comparison Between Different References Based EEG and MEG.

    PubMed

    Huang, Yunzhi; Zhang, Junpeng; Cui, Yuan; Yang, Gang; Liu, Qi; Yin, Guangfu

    2018-01-01

    Sensor-level functional connectivity topography (sFCT) contributes significantly to our understanding of brain networks. sFCT can be constructed using either electroencephalography (EEG) or magnetoencephalography (MEG). Here, we compared sFCT within the EEG modality and between the EEG and MEG modalities. We first used simulations to examine how different EEG references, including the Reference Electrode Standardization Technique (REST), average reference (AR), linked mastoids (LM), and left mastoid reference (LR), affect EEG-based sFCT. The results showed that REST decreased the reference effects on scalp EEG recordings, making REST-based sFCT closer to the ground truth (sFCT based on ideal recordings). For the inter-modality simulation comparisons, we compared each type of EEG-sFCT with MEG-sFCT using three metrics to quantify the differences: Relative Error (RE), Overlap Rate (OR), and Hamming Distance (HD). When two sFCTs are similar, RE and HD are low, while OR is high. Results showed that among all reference schemes, EEG- and MEG-sFCT were most similar when the EEG was REST-based and the EEG and MEG were recorded simultaneously. Next, we analyzed simultaneously recorded MEG and EEG data from publicly available face-recognition experiments using a similar procedure as in the simulations. The results showed (1) if MEG-sFCT is the standard, REST- and LM-based sFCT provided results closer to this standard in terms of HD; (2) REST-based sFCT and MEG-sFCT had the highest similarity in terms of RE; (3) REST-based sFCT had the most overlapping edges with MEG-sFCT in terms of OR. This study thus provides new insights into the effect of different reference schemes on sFCT and the similarity between MEG and EEG in terms of sFCT.

  5. Development of seismic fragility curves for low-rise masonry infilled reinforced concrete buildings by a coefficient-based method

    NASA Astrophysics Data System (ADS)

    Su, Ray Kai Leung; Lee, Chien-Liang

    2013-06-01

    This study presents a seismic fragility analysis and ultimate spectral displacement assessment of regular low-rise masonry infilled (MI) reinforced concrete (RC) buildings using a coefficient-based method. The coefficient-based method does not require a complicated finite element analysis; instead, it is a simplified procedure for assessing the spectral acceleration and displacement of buildings subjected to earthquakes. A regression analysis was first performed to obtain the best-fitting equations for the inter-story drift ratio (IDR) and the period shift factor of low-rise MI RC buildings in response to the peak ground acceleration of earthquakes, using published results obtained from shaking table tests. Both spectral acceleration- and spectral displacement-based fragility curves under various damage states (in terms of IDR) were then constructed using the coefficient-based method. Finally, the spectral displacements of low-rise MI RC buildings at the ultimate (or near-collapse) state obtained from this paper and the literature were compared. The simulation results indicate that the fragility curves obtained from this study and previous work correspond well. Furthermore, most of the spectral displacements of low-rise MI RC buildings at the ultimate state from the literature fall within the bounds of spectral displacement predicted by the coefficient-based method.
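
    Fragility curves of this kind are conventionally expressed as lognormal CDFs in an intensity measure. The sketch below assumes that standard lognormal form; the median and dispersion values are illustrative placeholders, not the fitted coefficients of this study.

```python
import math

def fragility(im, theta, beta):
    """P(damage state reached or exceeded | IM = im), lognormal form:
    Phi(ln(im / theta) / beta), with median theta and log-dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

# Illustrative curve for an ultimate state in spectral displacement (mm),
# with an assumed median of 60 mm and dispersion of 0.5
theta, beta = 60.0, 0.5
points = [fragility(sd, theta, beta) for sd in (20.0, 60.0, 180.0)]
```

    By construction the curve passes through 0.5 at the median intensity and increases monotonically, which is what lets a damage-state threshold in IDR be mapped onto an exceedance probability for any spectral demand.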

  6. A Framework and Toolkit for the Construction of Multimodal Learning Interfaces

    DTIC Science & Technology

    1998-04-29

    human communication modalities in the context of a broad class of applications, specifically those that support state manipulation via parameterized actions. The multimodal semantic model is also the basis for a flexible, domain independent, incrementally trainable multimodal interpretation algorithm based on a connectionist network. The second major contribution is an application framework consisting of reusable components and a modular, distributed system architecture. Multimodal application developers can assemble the components in the framework into a new application,

  7. Liquefaction under drained condition, from the lab to reality ?

    NASA Astrophysics Data System (ADS)

    Clément, Cécile; Aharonov, Einat; Stojanova, Menka; Toussaint, Renaud

    2015-04-01

    Liquefaction constitutes a significant natural hazard in relation to earthquakes and landslides. It can cause buildings to tilt or sink into the soil, and can produce mud volcanoes, flotation of buried objects, long-runout landslides, etc. In this work we present a new understanding of the mechanism by which buildings sink and tilt during earthquake-induced liquefaction. The conventional understanding of liquefaction explains most observed cases as occurring in an undrained, under-compacted layer of sandy soil saturated with water [1]: according to that understanding, the under-compacted sandy layer has a tendency to compact when a load is applied; in our case the load comes from ground shaking during an earthquake. When the soil compacts, the fluid pore pressure rises. Because in undrained conditions the fluid cannot flow out, the pore pressure builds up. The weight of buildings is in this case transferred from the grains of the soil to the pore water. The soil loses its rigidity and flows like a liquid. From this model, theoretical and empirical laws were derived for geotechnical use and building construction. Despite the success of this conventional model in many cases, liquefied soils have also been observed under drained conditions, and in previously compacted soils, which does not agree with the assumptions of the model quoted above. One of the most famous liquefaction events is the destruction of the Kobe port during the 1995 earthquake. A simple calculation of the Deborah number following Goren et al. ([2][3]) shows that the undrained constraint was not met below the Kobe port during the 1995 earthquake. We propose another model, of liquefaction in drained granular media. According to our model, the mere presence of water in granular media is enough to cause liquefaction during an earthquake, provided that the water reaches close to the surface. Our computations are based on the buoyancy force, and we take into account the static fluid pressure only. 
For small horizontal shaking our model predicts that the soil remains rigid. Under stronger accelerations, some of the particles that constitute the medium slide past each other, and the medium slowly rearranges. Yet, in this regime of shaking, the shaking is insufficient to cause the building to slide. The building sinks simply due to hydrostatic considerations, since it is a static object in a dynamically rearranging medium. This is the case we call liquefaction. Eventually, for even stronger accelerations, both the particles and the building can slide, and we predict convective movement. To test this model we ran numerical simulations (a granular dynamics DEM algorithm) and laboratory experiments. The numerical experiments do not include pore pressure, and only simulate the buoyancy effects of water. The controlling parameters are the amplitude and frequency of the shaking, and the water level. With a saturated medium, experiments and simulations display three different behaviors: rigid, liquefaction, and convection, in agreement with our theoretical model. The peak ground acceleration (PGA) is the decisive parameter. It is important to note that for dry media, and for the case when the building is fully submerged underwater, both in experiments and in simulations, the liquefaction effect disappears. Based on our work we suggest that elevated pore pressure conditions are not necessary for inducing liquefaction, and that liquefaction can occur in well-drained and highly compacted soils, in situations previously considered to be safe from liquefaction. References [1] Chi-Yuen Wang and Michael Manga. Earthquakes and Water, volume 114. Springer Verlag, 2010. [2] L. Goren, E. Aharonov, D. Sparks, and R. Toussaint. Pore pressure evolution in deforming granular material: A general formulation and the infinitely stiff approximation. Journal of Geophysical Research, 115(B9), Sep 2010. [3] Liran Goren, Einat Aharonov, David Sparks, and Renaud Toussaint. 
The mechanical coupling of fluid-filled granular material under shear. Pure and Applied Geophysics, 168(12):2289-2323, 2011.
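
    The hydrostatic sinking argument can be illustrated with a minimal calculation that treats fully liquefied soil as a heavy fluid; the unit weight used is a typical value assumed for liquefied sandy soil, not a figure from this study.

```python
def settlement_depth(bearing_pressure_pa, gamma_liq=19000.0):
    """Equilibrium embedment depth d (m) of a building whose foundation exerts
    bearing_pressure_pa (Pa), supported hydrostatically by liquefied soil of
    unit weight gamma_liq (N/m^3): gamma_liq * d = bearing pressure."""
    return bearing_pressure_pa / gamma_liq

# A light structure with an assumed 50 kPa bearing pressure over liquefied soil
d = settlement_depth(50_000.0)
```

    Because typical buildings exert far less pressure than the weight of the soil they displace, this balance predicts partial embedment and tilting rather than complete disappearance, consistent with the static-object-in-a-rearranging-medium picture above.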

  8. Field Telemetry of Blade-rotor Coupled Torsional Vibration at Matuura Power Station Number 1 Unit

    NASA Technical Reports Server (NTRS)

    Isii, Kuniyoshi; Murakami, Hideaki; Otawara, Yasuhiko; Okabe, Akira

    1991-01-01

    The quasi-modal reduction technique and a finite element model (FEM) were used to construct an analytical model for the blade-rotor coupled torsional vibration of a steam turbine generator at the Matuura Power Station. A single-rotor test was executed in order to evaluate umbrella vibration characteristics. Based on the single-rotor test results and the quasi-modal procedure, the total rotor system was analyzed to predict coupled torsional frequencies. Finally, field measurement of the vibration of the last stage buckets was made, which confirmed that the double synchronous resonance was 124.2 Hz, meaning that the machine can be safely operated. The measured eigenvalues are very close to the predicted values. The single-rotor test and this analytical procedure thus proved to be a valid technique for estimating coupled torsional vibration.

  9. Direct Lattice Shaking of Bose Condensates: Finite Momentum Superfluids

    DOE PAGES

    Anderson, Brandon M.; Clark, Logan W.; Crawford, J

    2017-05-31

    Here, we address band engineering in the presence of periodic driving by numerically shaking a lattice containing a bosonic condensate. By not restricting to simplified band structure models we are able to address arbitrary values of the shaking frequency, amplitude, and interaction strengths g. For "near-resonant" shaking frequencies with moderate g, a quantum phase transition to a finite momentum superfluid is obtained with Kibble-Zurek scaling and quantitative agreement with experiment. We use this successful calibration as a platform to support a more general investigation of the interplay between (one particle) Floquet theory and the effects associated with arbitrary g. Band crossings lead to superfluid destabilization, but where this occurs depends on g in a complicated fashion.

  10. Molecular Imaging of Pancreatic Cancer with Antibodies

    PubMed Central

    2015-01-01

    Development of novel imaging probes for cancer diagnostics remains critical for early detection of disease, yet most imaging agents are hindered by suboptimal tumor accumulation. To overcome these limitations, researchers have adapted antibodies for imaging purposes. As cancerous malignancies express atypical patterns of cell surface proteins in comparison to noncancerous tissues, novel antibody-based imaging agents can be constructed to target individual cancer cells or surrounding vasculature. Using molecular imaging techniques, these agents may be utilized for detection of malignancies and monitoring of therapeutic response. Currently, there are several imaging modalities commonly employed for molecular imaging. These imaging modalities include positron emission tomography (PET), single-photon emission computed tomography (SPECT), magnetic resonance (MR) imaging, optical imaging (fluorescence and bioluminescence), and photoacoustic (PA) imaging. While antibody-based imaging agents may be employed for a broad range of diseases, this review focuses on the molecular imaging of pancreatic cancer, as there are limited resources for imaging and treatment of pancreatic malignancies. Additionally, pancreatic cancer remains the most lethal cancer with an overall 5-year survival rate of approximately 7%, despite significant advances in the imaging and treatment of many other cancers. In this review, we discuss recent advances in molecular imaging of pancreatic cancer using antibody-based imaging agents. This task is accomplished by summarizing the current progress in each type of molecular imaging modality described above. Also, several considerations for designing and synthesizing novel antibody-based imaging agents are discussed. Lastly, the future directions of antibody-based imaging agents are discussed, emphasizing the potential applications for personalized medicine. PMID:26620581

  11. An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alcik, H. A.; Tanircan, G.; Kaya, Y.

    2015-12-01

    Vast amounts of high-quality strong motion data are indispensable inputs for analyses in geotechnical and earthquake engineering; however, the high cost of installing strong motion systems constitutes the biggest obstacle to their worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research-oriented studies as well as earthquake engineering oriented projects, largely because of the precision obtained in these downsized instruments. In this research our primary goal is to enable the use of these low-cost instruments in the creation of shake maps immediately after a strong earthquake. The second goal is to develop software that will automatically process the real-time data coming from the rapid response network and create the shake map. For these purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembler language and embedded into the operating system of each instrument to create MiniSEED files with packages of 1 second instead of 512-byte packages. The Matlab-based software calculates the strong motion (SM) parameters every second, and they are compared with user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small-scale and affordable rapid response network was created using four MEMS sensors, and the functionality of the software has been tested and validated using shake table tests. The entire system was tested together with a reference sensor under real strong ground motion recordings as well as series of sine waves with varying amplitude and frequency. 
The successful realization of this software allowed us to set up a test network at Tekirdağ Province, the closest coastal point to the moderate size earthquake activities in the Marmara Sea, Turkey.
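
    The per-second parameter computation and the vote-based event declaration can be sketched as follows (in Python rather than the Matlab of the actual software). The choice of parameters (PGA and RMS acceleration), the thresholds, the vote weights, and the network-wide trigger level are all illustrative assumptions; the abstract does not give the deployed values.

```python
import numpy as np

# Illustrative thresholds (m/s^2) and vote weights, not the deployed values
THRESHOLDS = {"pga": 0.05 * 9.81, "rms": 0.01 * 9.81}
VOTE_WEIGHTS = {"pga": 2, "rms": 1}
EVENT_VOTE_LEVEL = 4   # assumed network-wide total needed to declare an event

def sm_parameters(accel_1s):
    """Strong-motion parameters from one second of acceleration samples."""
    a = np.asarray(accel_1s, dtype=float)
    return {"pga": float(np.max(np.abs(a))), "rms": float(np.sqrt(np.mean(a ** 2)))}

def sensor_votes(params):
    """Each exceeded threshold contributes its weight to the sensor's vote."""
    return sum(VOTE_WEIGHTS[k] for k, v in params.items() if v > THRESHOLDS[k])

def network_event(per_sensor_windows):
    """Declare an event when votes summed over all sensors reach the level."""
    total = sum(sensor_votes(sm_parameters(w)) for w in per_sensor_windows)
    return total >= EVENT_VOTE_LEVEL

# Example: quiet noise vs. a 1 m/s^2, 5 Hz sine sampled at 100 Hz for 1 s
t = np.arange(100) / 100.0
quiet = 0.001 * np.sin(2 * np.pi * 5 * t)
strong = 1.0 * np.sin(2 * np.pi * 5 * t)
```

    Summing weighted votes across sensors before triggering is what makes the scheme robust to a single noisy MEMS channel, at the cost of a slightly higher detection threshold for very local events.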

  12. Process Analysis of Variables for Standardization of Antifungal Susceptibility Testing of Nonfermentative Yeasts ▿

    PubMed Central

    Zaragoza, Oscar; Mesa-Arango, Ana C.; Gómez-López, Alicia; Bernal-Martínez, Leticia; Rodríguez-Tudela, Juan Luis; Cuenca-Estrella, Manuel

    2011-01-01

    Nonfermentative yeasts, such as Cryptococcus spp., have emerged as fungal pathogens during the last few years. However, standard methods to measure their antifungal susceptibility (antifungal susceptibility testing [AST]) are not completely reliable due to the impaired growth of these yeasts in standard media. In this work, we have compared the growth kinetics and the antifungal susceptibilities of representative species of nonfermentative yeasts such as Cryptococcus neoformans, Cryptococcus gattii, Cryptococcus albidus, Rhodotorula spp., Yarrowia lipolytica, Geotrichum spp., and Trichosporon spp. The effects of the growth medium (RPMI medium versus yeast nitrogen base [YNB]), glucose concentration (0.2% versus 2%), nitrogen source (ammonium sulfate), temperature (30°C versus 35°C), shaking, and inoculum size (10³, 10⁴, and 10⁵ cells) were analyzed. The growth rate, lag phase, and maximum optical density were obtained from each growth experiment, and after multivariate analysis, YNB-based media demonstrated a significant improvement in the growth of the yeasts. Shaking, an inoculum size of 10⁵ CFU/ml, and incubation at 30°C also improved the growth kinetics of the organisms. Supplementation with ammonium sulfate and with 2% glucose did not have any effect on growth. We also tested the antifungal susceptibilities of all the isolates by the reference methods of the CLSI and EUCAST, the EUCAST method with shaking, YNB under static conditions, and YNB with shaking. MIC values obtained under different conditions showed high percentages of agreement and significant correlation coefficients between them. MIC determinations according to the CLSI and EUCAST standards were rather complicated, since more than half of the isolates tested showed a limited growth index, hampering endpoint determinations. 
We conclude that AST conditions including YNB as the assay medium, agitation of the plates, reading after 48 h of incubation, an inoculum size of 10⁵ CFU/ml, and incubation at 30°C made MIC determinations easier without overestimating MIC values. PMID:21245438

  13. Evaluating the intensity of U.S. earthquakes

    USGS Publications Warehouse

    Simon, R.; Stover, C.

    1977-01-01

    The effects of seismic shaking are objective: all observers can agree that they are real occurrences and not subjective speculation. Reliable intensity evaluations are based not on a single factor of any scale but on consistent combinations of factors. 

  14. Effect of structural mount dynamics on a pair of operating Stirling Convertors

    NASA Astrophysics Data System (ADS)

    Goodnight, Thomas W.; Suárez, Vicente J.; Hughes, William O.; Samorezov, Sergey

    2002-01-01

    The U.S. Department of Energy (DOE), in conjunction with the NASA John H. Glenn Research Center and the Stirling Technology Company, is currently developing a Stirling convertor for a Stirling Radioisotope Generator (SRG). NASA Headquarters and DOE have identified the SRG for potential use as an advanced spacecraft power system for future NASA deep-space and Mars surface missions. Low-level dynamic base-shake tests were conducted on a dynamic simulation of the structural mount for a pair of operating Stirling convertors. These tests were conducted at NASA Glenn Research Center's Structural Dynamics Laboratory as part of the development of this technology. The purpose of these tests was to identify the changes in transmissibility and the effect on the structural dynamic response of a pair of operating Stirling Technology Demonstration Convertors (TDCs). This paper addresses the base-shake test setup, procedure, and results from the tests conducted on the Stirling TDC mount simulator in April 2001.

  15. Building an EEG-fMRI Multi-Modal Brain Graph: A Concurrent EEG-fMRI Study

    PubMed Central

    Yu, Qingbao; Wu, Lei; Bridwell, David A.; Erhardt, Erik B.; Du, Yuhui; He, Hao; Chen, Jiayu; Liu, Peng; Sui, Jing; Pearlson, Godfrey; Calhoun, Vince D.

    2016-01-01

    The topological architecture of brain connectivity has been well characterized by graph theory based analysis. However, previous studies have primarily built brain graphs based on a single modality of brain imaging data. Here we develop a framework to construct multi-modal brain graphs using concurrent EEG-fMRI data which are simultaneously collected during eyes open (EO) and eyes closed (EC) resting states. FMRI data are decomposed into independent components with associated time courses by group independent component analysis (ICA). EEG time series are segmented, and then spectral power time courses are computed and averaged within 5 frequency bands (delta; theta; alpha; beta; low gamma). EEG-fMRI brain graphs, with EEG electrodes and fMRI brain components serving as nodes, are built by computing correlations within and between fMRI ICA time courses and EEG spectral power time courses. Dynamic EEG-fMRI graphs are built using a sliding window method, versus static ones treating the entire time course as stationary. At the global level, static graph measures and properties of dynamic graph measures differ across frequency bands and are mainly higher during eyes closed than eyes open. Nodal-level graph measures of a few brain components are also higher during eyes closed in specific frequency bands. Overall, these findings incorporate fMRI spatial localization and EEG frequency information which could not be obtained by examining only one modality. This work provides a new approach to examine EEG-fMRI associations within a graph theoretic framework with potential application to many topics. PMID:27733821
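
The graph-construction step described here (nodes are EEG electrodes and fMRI components, edges are correlations between their time courses) can be sketched as follows. This is an illustrative reconstruction with synthetic data, not the authors' pipeline; node names and the correlation threshold are assumptions.

```python
# Sketch of a static multi-modal graph: compute Pearson correlations between
# all node time courses (EEG band-power and fMRI ICA time courses alike) and
# keep edges whose |r| exceeds a threshold. Data and names are synthetic.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def build_graph(timecourses, threshold=0.5):
    """timecourses: dict node -> time series; returns the set of edges (u, v)."""
    nodes = sorted(timecourses)
    edges = set()
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            if abs(pearson(timecourses[u], timecourses[v])) >= threshold:
                edges.add((u, v))
    return edges

tc = {
    "EEG:Oz_alpha":   [1, 2, 3, 4, 5, 6],
    "fMRI:IC_visual": [2, 4, 6, 8, 10, 12],  # strongly correlated with Oz alpha
    "fMRI:IC_motor":  [5, 1, 4, 2, 6, 3],    # weakly related
}
print(build_graph(tc))
```

The dynamic variant in the abstract would repeat this over sliding windows of the time courses instead of the full series.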

  16. Modality-specificity of Selective Attention Networks.

    PubMed

    Stewart, Hannah J; Amitay, Sygal

    2015-01-01

    To establish the modality specificity and generality of selective attention networks. Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess underlying attention constructs. The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled "general attention." The third component was labeled "auditory attention," as it only contained measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled "spatial orienting" and "spatial conflict," respectively; they comprised orienting and conflict resolution measures from the vANT, aANT, and TAiL attend-location task, all tasks based upon spatial judgments (e.g., the direction of a target arrow or sound location). These results do not support our a priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selectively attending to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality specific.

  17. Effectiveness of educational materials designed to change knowledge and behaviors regarding crying and shaken-baby syndrome in mothers of newborns: a randomized, controlled trial.

    PubMed

    Barr, Ronald G; Rivara, Frederick P; Barr, Marilyn; Cummings, Peter; Taylor, James; Lengua, Liliana J; Meredith-Benitz, Emily

    2009-03-01

    Infant crying is an important precipitant for shaken-infant syndrome. The objective was to determine whether parent education materials (The Period of PURPLE Crying [PURPLE]) change maternal knowledge and behavior relevant to infant shaking. This study was a randomized, controlled trial conducted in prenatal classes, maternity wards, and pediatric practices. There were 1374 mothers of newborns randomly assigned to the PURPLE intervention and 1364 mothers to the control group. Primary outcomes were measured by telephone 2 months after delivery. These included 2 knowledge scales about crying and the dangers of shaking; 3 scales about behavioral responses to crying generally, responses to unsoothable crying, and caregiver self-talk in response to unsoothable crying; and 3 questions concerning the behaviors of sharing information with others about crying, walking away if frustrated, and the dangers of shaking. The mean infant crying knowledge score was greater in the intervention group (69.5) compared with controls (63.3). Mean shaking knowledge was greater for intervention subjects (84.8) compared with controls (83.5). For reported maternal behavioral responses to crying generally, responses to unsoothable crying, and for self-talk responses, mean scores for intervention mothers were similar to those for controls. For the behaviors of information sharing, more intervention mothers reported sharing information about walking away if frustrated and the dangers of shaking, but there was little difference in sharing information about infant crying. Intervention mothers also reported increased infant distress. Use of the PURPLE education materials seems to lead to higher scores in knowledge about early infant crying and the dangers of shaking, and in the information-sharing behaviors considered important for the prevention of shaking.

  18. Icilin-evoked behavioral stimulation is attenuated by alpha2-adrenoceptor activation

    PubMed Central

    Kim, Jae; Cowan, Alan; Lisek, Renata; Raymondi, Natalie; Rosenthal, Aaron; Hirsch, Daniel D.; Rawls, Scott M.

    2011-01-01

    Icilin is a transient receptor potential cation channel subfamily M (TRPM8) agonist that produces behavioral activation in rats and mice. Its hallmark overt pharmacological effect is wet-dog shakes (WDS) in rats. The vigorous shaking associated with icilin is dependent on NMDA receptor activation and nitric oxide production, but little else is known about the biological systems that modulate the behavioral phenomenon. The present study investigated the hypothesis that alpha2-adrenoceptor activation inhibits icilin-induced WDS. Rats injected with icilin (0.5, 1, 2.5, 5 mg/kg, i.p.) displayed dose-related WDS that were inhibited by pretreatment with a fixed dose of clonidine (0.15 mg/kg, s.c.). Shaking behavior caused by a fixed dose (2.5 mg/kg) of icilin was also inhibited in a dose-related manner by clonidine pretreatment (0.03–0.15 mg/kg, s.c.) and reduced by clonidine posttreatment (0.15 mg/kg, s.c.). Pretreatment with a peripherally restricted alpha2-adrenoceptor agonist, ST91 (0.075, 0.15 mg/kg), also decreased the incidence of shaking elicited by 2.5 mg/kg of icilin. Pretreatment with yohimbine (2 mg/kg, i.p.) enhanced the shaking induced by a low dose of icilin (0.5 mg/kg). The imidazoline site agonists, agmatine (150 mg/kg, i.p.) and 2-BFI (7 mg/kg, i.p.), did not affect icilin-evoked shaking. These results suggest that alpha2-adrenoceptor activation inhibits shaking induced by icilin and that increases in peripheral, as well as central, alpha2-adrenoceptor signaling oppose the behavioral stimulant effect of icilin. PMID:21315691

  19. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  20. Forecasting probabilistic seismic shaking for greater Tokyo from 400 years of intensity observations

    USGS Publications Warehouse

    Bozkurt, S.B.; Stein, R.S.; Toda, S.

    2007-01-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past shaking. We calculate the time-averaged (Poisson) probability of severe shaking by using more than 10,000 intensity observations recorded since AD 1600 in a 350 km-wide box centered on Tokyo. Unlike other hazard-assessment methods, source and site effects are included without modeling, and we do not need to know the size or location of any earthquake, or the location and slip rate of any fault. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct here suggest that both assumptions are sound. The resulting 30-year probability of I_JMA ≥ 6 shaking (approximately PGA ≥ 0.4 g or MMI ≥ IX) is 30%-40% in Tokyo, Kawasaki, and Yokohama, and 10%-15% in Chiba and Tsukuba. This result means that there is a 30% chance that 4 million people will be subjected to I_JMA ≥ 6 shaking during an average 30-year period. We also produce exceedance maps of PGA for building-code regulations, and calculate short-term hazard associated with a hypothetical catastrophe bond. Our results resemble an independent assessment developed from conventional seismic hazard analysis for greater Tokyo. © 2007, Earthquake Engineering Research Institute.
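
The time-averaged (Poisson) forecast used in this record reduces to a simple exceedance formula: with N observations of shaking at or above the target intensity over a T-year record, the annual rate is N/T and the probability of at least one exceedance in a horizon of H years is 1 - exp(-NH/T). The sketch below illustrates the arithmetic; the event count is invented for the example, not a figure from the paper.

```python
# Poisson exceedance probability from an observed intensity history.
import math

def poisson_prob(n_events, record_years, horizon_years=30):
    """Probability of at least one exceedance in horizon_years, given
    n_events observations over record_years (time-averaged Poisson model)."""
    rate = n_events / record_years          # annual rate of I >= target shaking
    return 1.0 - math.exp(-rate * horizon_years)

# e.g. 5 observations of I_JMA >= 6 shaking at a site in a 400-year record:
p = poisson_prob(5, 400.0)
print(round(p, 3))
```

With these illustrative numbers the 30-year probability comes out near the 30%-40% range the abstract reports for Tokyo.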

  1. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: Sikorsky Aircraft: Advances toward interacting with the airframe design process

    NASA Technical Reports Server (NTRS)

    Twomey, William J.

    1993-01-01

    A short history is traced of the work done at Sikorsky Aircraft under the NASA/industry DAMVIBS program. This includes both work directly funded by the program as well as work which was internally funded but which received its initial impetus from DAMVIBS. The development of a finite element model of the UH-60A airframe having a marked improvement in vibration-predicting ability is described. A new program, PAREDYM, developed at Sikorsky, which automatically adjusts an FEM so that its modal characteristics match test values, is described, as well as the part this program played in the improvement of the UH-60A model. Effects of the bungee suspension system on the shake test data used for model verification are described. The impetus given by the modeling improvement, as well as the recent availability of PAREDYM, has brought for the first time the introduction of low-vibration design into the design cycle at Sikorsky.

  2. Aftershocks, earthquake effects, and the location of the large 14 December 1872 earthquake near Entiat, central Washington

    USGS Publications Warehouse

    Brocher, Thomas M.; Hopper, Margaret G.; Algermissen, S.T. Ted; Perkins, David M.; Brockman, Stanley R.; Arnold, Edouard P.

    2017-01-01

    Reported aftershock durations, earthquake effects, and other observations from the large 14 December 1872 earthquake in central Washington are consistent with an epicenter near Entiat, Washington. Aftershocks were reported for more than 3 months only near Entiat. Modal intensity data described in this article are consistent with an Entiat area epicenter, where the largest modified Mercalli intensities, VIII, were assigned between Lake Chelan and Wenatchee. Although ground failures and water effects were widespread, there is a concentration of these features along the Columbia River and its tributaries in the Entiat area. Assuming linear ray paths, misfits from 23 reports of the directions of horizontal shaking have a local minimum at Entiat, assuming the reports are describing surface waves, but the region having comparable misfit is large. Broadband seismograms recorded for comparable ray paths provide insight into why possible S–P times estimated from felt reports at two locations are several seconds too small to be consistent with an Entiat area epicenter.
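
The S–P consistency check mentioned above rests on a standard relation: an S–P time dt maps to epicentral distance d = dt·Vp·Vs/(Vp − Vs). The velocities below are typical crustal values assumed for illustration, not values from the paper.

```python
# Epicentral distance from an S-P arrival-time difference, assuming straight
# ray paths and uniform crustal velocities (typical values, not the paper's).
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Distance implied by an S-P time, in km: d = dt * Vp*Vs / (Vp - Vs)."""
    return sp_seconds * vp * vs / (vp - vs)

# A few seconds of S-P time corresponds to tens of kilometers:
print(round(sp_distance_km(5.0), 1))  # 42.0 km with these velocities
```

An S–P time "several seconds too small" therefore implies a source tens of kilometers closer than Entiat, which is the tension the abstract describes.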

  3. Vascular and inflammatory high fat meal responses in young healthy men; a discriminative role of IL-8 observed in a randomized trial.

    PubMed

    Esser, Diederik; Oosterink, Els; op 't Roodt, Jos; Henry, Ronald M A; Stehouwer, Coen D A; Müller, Michael; Afman, Lydia A

    2013-01-01

    High fat meal challenges are known to induce postprandial low-grade inflammation and endothelial dysfunction. This assumption is largely based on studies performed in older populations or in populations with a progressed disease state, and an appropriate control meal is often lacking. Young healthy individuals might be more resilient to such challenges. We therefore aimed to characterize the vascular and inflammatory response after a high fat meal in young healthy individuals. In a double-blind randomized cross-over intervention study, we used a comprehensive phenotyping approach to determine the vascular and inflammatory response after consumption of a high fat/high energy (HF/HE) shake and after an average breakfast shake in 20 young healthy subjects. Both interventions were performed three times. Many features of the vascular postprandial response, such as FMD, arterial stiffness and micro-vascular skin blood flow, were not different between shakes. HF/HE shake consumption was associated with a more pronounced increase in blood pressure, heart rate, plasma concentrations of IL-8 and PBMC gene expression of IL-8 and CD54 (ICAM-1), whereas plasma concentrations of sVCAM1 were decreased compared to the average breakfast. Whereas no differences in the postprandial response were observed for classical markers of endothelial function, we did observe differences between consumption of a HF/HE and an average breakfast meal on blood pressure and IL-8 in young healthy volunteers. IL-8 might play an important role in dealing with high fat challenges and might be an early marker for endothelial stress, a stage preceding endothelial dysfunction.

  4. Drosophila Shaking-B protein forms gap junctions in paired Xenopus oocytes.

    PubMed

    Phelan, P; Stebbings, L A; Baines, R A; Bacon, J P; Davies, J A; Ford, C

    1998-01-08

    In most multicellular organisms direct cell-cell communication is mediated by the intercellular channels of gap junctions. These channels allow the exchange of ions and molecules that are believed to be essential for cell signalling during development and in some differentiated tissues. Proteins called connexins, which are products of a multigene family, are the structural components of vertebrate gap junctions. Surprisingly, molecular homologues of the connexins have not been described in any invertebrate. A separate gene family, which includes the Drosophila genes shaking-B and l(1)ogre, and the Caenorhabditis elegans genes unc-7 and eat-5, encodes transmembrane proteins with a predicted structure similar to that of the connexins. shaking-B and eat-5 are required for the formation of functional gap junctions. To test directly whether Shaking-B is a channel protein, we expressed it in paired Xenopus oocytes. Here we show that Shaking-B localizes to the membrane, and that its presence induces the formation of functional intercellular channels. To our knowledge, this is the first structural component of an invertebrate gap junction to be characterized.

  5. Modeling of growth and laccase production by Pycnoporus sanguineus.

    PubMed

    Saat, Muhammad Naziz; Annuar, Mohamad Suffian Mohamad; Alias, Zazali; Chuan, Ling Tau; Chisti, Yusuf

    2014-05-01

    Production of extracellular laccase by the white-rot fungus Pycnoporus sanguineus was examined in batch submerged cultures in shake flasks, baffled shake flasks and a stirred tank bioreactor. The biomass growth in the various culture systems closely followed a logistic growth model. The production of laccase followed a Luedeking-Piret model. A modified Luedeking-Piret model incorporating logistic growth effectively described the consumption of glucose. Biomass productivity, enzyme productivity and substrate consumption were enhanced in baffled shake flasks relative to the cases for the conventional shake flasks. This was associated with improved oxygen transfer in the presence of the baffles. The best results were obtained in the stirred tank bioreactor. At 28 °C, pH 4.5, an agitation speed of 600 rpm and a dissolved oxygen concentration of ~25% of air saturation, the laccase productivity in the bioreactor exceeded 19 U L⁻¹ day⁻¹, or 1.5-fold better than the best case for the baffled shake flask. The final concentration of the enzyme was about 325 U L⁻¹.
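
The two models named in this record combine into a small ODE system: logistic biomass growth dX/dt = μX(1 − X/Xmax) and Luedeking-Piret product formation dP/dt = α·dX/dt + βX. The sketch below integrates them with a simple Euler step; all parameter values are illustrative assumptions, not fitted values from the study.

```python
# Logistic growth coupled to Luedeking-Piret product formation, integrated
# with a forward-Euler step. Parameters are illustrative, not the paper's fits.

def simulate(mu=0.1, xmax=10.0, alpha=2.0, beta=0.01,
             x0=0.1, p0=0.0, dt=0.01, hours=100):
    """Return (biomass, product) after `hours` of simulated culture time."""
    x, p = x0, p0
    for _ in range(int(hours / dt)):
        dxdt = mu * x * (1.0 - x / xmax)          # logistic growth
        p += (alpha * dxdt + beta * x) * dt       # growth- and non-growth-associated terms
        x += dxdt * dt
    return x, p

x_final, laccase = simulate()
print(round(x_final, 2), round(laccase, 2))
```

The same structure extends to substrate consumption by adding a third state with its own yield and maintenance coefficients, as in the modified Luedeking-Piret model the abstract mentions.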

  6. Role of nuclear charge change and nuclear recoil on shaking processes and their possible implication on physical processes

    NASA Astrophysics Data System (ADS)

    Sharma, Prashant

    2017-12-01

    The probable role of the sudden nuclear charge change and nuclear recoil in the shaking processes during the neutron- or heavy-ion-induced nuclear reactions and weakly interacting massive particle-nucleus scattering has been investigated in the present work. Using hydrogenic wavefunctions, general analytical expressions of survival, shakeup/shakedown, and shakeoff probability have been derived for various subshells of hydrogen-like atomic systems. These expressions are employed to calculate the shaking, shakeup/shakedown, and shakeoff probabilities in some important cases of interest in the nuclear astrophysics and the dark matter search experiments. The results underline that the shaking processes are one of the probable channels of electronic transitions during the weakly interacting massive particle-nucleus scattering, which can be used to probe the dark matter in the sub-GeV regime. Further, it is found that the shaking processes initiating due to nuclear charge change and nuclear recoil during the nuclear reactions may influence the electronic configuration of the participating atomic systems and thus may affect the nuclear reaction measurements at astrophysically relevant energies.

  7. Meal replacements as a weight loss tool in a population with severe mental illness.

    PubMed

    Gelberg, Hollie A; Kwan, Crystal L; Mena, Shirley J; Erickson, Zachary D; Baker, Matthew R; Chamberlin, Valery; Nguyen, Charles; Rosen, Jennifer A; Shah, Chandresh; Ames, Donna

    2015-12-01

    Weight gain and worsening metabolic parameters are often side effects of antipsychotic medications used by individuals with severe mental illness. To address this, a randomized, controlled research study of a behavioral weight management program for individuals with severe mental illness was undertaken to assess its efficacy. Patients unable to meet weight loss goals during the first portion of the year-long study were given the option of using meal replacement shakes in an effort to assist with weight loss. Specific requirements for use of meal replacement shakes were specified in the study protocol; only five patients were able to use the shakes in accordance with the protocol and lose weight while improving metabolic parameters. Case studies of two subjects are presented, illustrating the challenges and obstacles they faced, as well as their successes. Taking responsibility for their own weight loss, remaining motivated through the end of the study, and incorporating the meal replacement shakes into a daily routine were factors found in common with these patients. Use of meal replacements shakes with this population may be effective. Published by Elsevier Ltd.

  8. Measurement of the electron shake-off in the β-decay of laser-trapped 6He atoms

    NASA Astrophysics Data System (ADS)

    Hong, Ran; Bagdasarova, Yelena; Garcia, Alejandro; Storm, Derek; Sternberg, Matthew; Swanson, Erik; Wauters, Frederik; Zumwalt, David; Bailey, Kevin; Leredde, Arnaud; Mueller, Peter; O'Connor, Thomas; Flechard, Xavier; Liennard, Etienne; Knecht, Andreas; Naviliat-Cuncic, Oscar

    2016-03-01

    Electron shake-off is an important process in many high-precision nuclear β-decay measurements searching for physics beyond the standard model. 6He, being one of the lightest β-decaying isotopes, has a simple atomic structure and is thus well suited for testing calculations of shake-off effects. Shake-off probabilities from the 2³S₁ and 2³P₂ initial states of laser-trapped 6He matter for the ongoing beta-neutrino correlation study at the University of Washington. These probabilities are obtained by analyzing the time-of-flight distribution of the recoil ions detected in coincidence with the beta particles. A β-neutrino-correlation-independent analysis approach was developed. The measured upper limit of the double shake-off probability is 2 × 10⁻⁴ at the 90% confidence level. This result is ~100 times lower than the most recent calculation by Schulhoff and Drake. This work is supported by DOE, Office of Nuclear Physics, under Contract Nos. DE-AC02-06CH11357 and DE-FG02-97ER41020.

  9. MyShake: Initial Observations from a Global Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2016-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It has two components: an Android application running on personal smartphones to detect earthquake-like motion, and a network detection algorithm that aggregates results from multiple smartphones to detect earthquakes. The MyShake application was released to the public on Feb 12th, 2016. Within the first 5 months, more than 200 earthquakes were recorded by smartphones all over the world, including events in Chile, Argentina, Mexico, Morocco, Greece, Nepal, New Zealand, Taiwan, Japan, and across North America. In this presentation, we will show the waveforms recorded by the smartphones for different earthquakes, and the evidence for using these data as a supplement to current earthquake early warning systems. We will also show the performance of the MyShake system during some earthquakes in the US. In short, the MyShake smartphone seismic network can be a complement to traditional seismic networks and, at the same time, a standalone system in places where few seismic stations are installed, to reduce earthquake hazards.
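
The network-level aggregation step can be sketched as a space-time clustering rule: declare an earthquake only when enough phones in a small region trigger within a short window. This is assumed logic for illustration, not the actual MyShake detection algorithm; the thresholds and window sizes are invented.

```python
# Illustrative aggregation of single-phone triggers into a network detection:
# an event is declared when at least min_phones trigger within window_s seconds
# and within a lat/lon box of radius_deg degrees of some trigger.

def detect_event(triggers, min_phones=4, window_s=10.0, radius_deg=1.0):
    """triggers: list of (time_s, lat, lon) phone triggers; returns True/False."""
    for t0, lat0, lon0 in triggers:
        cluster = [
            t for (t, lat, lon) in triggers
            if abs(t - t0) <= window_s
            and abs(lat - lat0) <= radius_deg
            and abs(lon - lon0) <= radius_deg
        ]
        if len(cluster) >= min_phones:
            return True
    return False

# Four phones near Berkeley triggering within seconds -> event declared:
phones = [(0.0, 37.87, -122.27), (1.2, 37.88, -122.26),
          (2.5, 37.86, -122.30), (3.1, 37.90, -122.25)]
print(detect_event(phones))  # True
```

Requiring spatial and temporal coincidence is what suppresses the many single-phone false triggers (pocket motion, drops) that a crowdsourced network must tolerate.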

  10. Finite-momentum Bose-Einstein condensates in shaken two-dimensional square optical lattices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Liberto, M.; Tieleman, O.

    2011-07-15

    We consider ultracold bosons in a two-dimensional square optical lattice described by the Bose-Hubbard model. In addition, an external time-dependent sinusoidal force is applied to the system, which shakes the lattice along one of the diagonals. The effect of the shaking is to renormalize the nearest-neighbor-hopping coefficients, which can be arbitrarily reduced, can vanish, or can even change sign, depending on the shaking parameter. Therefore, it is necessary to account for higher-order-hopping terms, which are renormalized differently by the shaking, and to introduce anisotropy into the problem. We show that the competition between these different hopping terms leads to finite-momentum condensates with a momentum that may be tuned via the strength of the shaking. We calculate the boundaries between the Mott insulator and the different superfluid phases and present the time-of-flight images expected to be observed experimentally. Our results open up possibilities for the realization of bosonic analogs of the Fulde, Ferrell, Larkin, and Ovchinnikov phase describing inhomogeneous superconductivity.
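
The hopping renormalization described here follows the standard high-frequency result for sinusoidal shaking, J_eff = J·J₀(K₀), where J₀ is the zeroth-order Bessel function and K₀ a dimensionless shaking strength; the reduction, vanishing, and sign change correspond to J₀ crossing its first zero near K₀ ≈ 2.405. The sketch below evaluates J₀ from its power series to show this behavior (the series implementation is illustrative; scipy.special.j0 would serve equally well).

```python
# Effective nearest-neighbor hopping under sinusoidal shaking:
# J_eff = J * J0(K0), with J0 the zeroth Bessel function of the first kind,
# computed here from its power series J0(x) = sum_m (-1)^m (x/2)^(2m) / (m!)^2.
import math

def bessel_j0(x, terms=30):
    return sum((-1) ** m * (x / 2) ** (2 * m) / math.factorial(m) ** 2
               for m in range(terms))

def effective_hopping(j, k0):
    return j * bessel_j0(k0)

# The hopping shrinks, vanishes near K0 ~ 2.405 (first zero of J0),
# and changes sign beyond it:
for k0 in (0.0, 1.0, 2.405, 3.0):
    print(k0, round(effective_hopping(1.0, k0), 3))
```

Because higher-order hoppings pick up different Bessel-function arguments, they survive where the nearest-neighbor term vanishes, which is what drives the finite-momentum condensation the abstract reports.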

  11. Different treatment modalities of fusiform basilar trunk aneurysm: study on computational hemodynamics.

    PubMed

    Wu, Chen; Xu, Bai-Nan; Sun, Zheng-Hui; Wang, Fu-Yu; Liu, Lei; Zhang, Xiao-Jun; Zhou, Ding-Biao

    2012-01-01

    Unclippable fusiform basilar trunk aneurysm is a formidable condition for surgical treatment. The aim of this study was to establish a computational model and to investigate the hemodynamic characteristics of a fusiform basilar trunk aneurysm. The three-dimensional digital model of a fusiform basilar trunk aneurysm was constructed using MIMICS, ANSYS and CFX software. Different hemodynamic modalities and boundary conditions were assigned to the model. Thirty points were selected randomly on the wall and within the aneurysm. Wall total pressure (WTP), wall shear stress (WSS), and blood flow velocity of each point were calculated, and hemodynamic status was compared between the different modalities. The quantitative average values of the 30 points on the wall and within the aneurysm were obtained by computational calculation point by point. The velocity and WSS in modalities A and B were different from those of the remaining 5 modalities, and the WTP in modalities A, E and F was higher than in the remaining 4 modalities. The digital model of a fusiform basilar artery aneurysm is feasible and reliable. This model could provide important information for clinical treatment options.

  12. Comparing solutions to the expectancy-value muddle in the theory of planned behaviour.

    PubMed

    O' Sullivan, B; McGee, H; Keegan, O

    2008-11-01

    The authors of the Theories of Reasoned Action (TRA) and Planned Behaviour (TPB) recommended a method for statistically analysing the relationship between the indirect belief-based measures and the direct measures of attitude, subjective norm, and perceived behavioural control (PBC). However, there is a growing awareness that this yields statistically uninterpretable results. This study's objective was to compare two solutions to what has been called the 'expectancy-value muddle'. These solutions were (i) optimal scoring of modal beliefs and (ii) individual beliefs without multiplicative composites. Cross-sectional data were collected by telephone interview. Participants were 110 first-degree relatives (FDRs) of patients diagnosed with colorectal cancer (CRC), who were offered CRC screening in the study hospital (83% response rate). Participants were asked to rate the TPB constructs in relation to attending for CRC screening. There was no significant difference in the correlation between behavioural beliefs and attitude for rescaled modal and individual beliefs. This was also the case for control beliefs and PBC. By contrast, there was a large correlation between rescaled modal normative beliefs and subjective norm, whereas individual normative beliefs did not correlate with subjective norm. Using individual beliefs without multiplicative composites allows for a fairly unproblematic interpretation of the relationship between the indirect and direct TPB constructs (French & Hankins, 2003). Therefore, it is recommended that future studies consider using individual measures of behavioural and control beliefs without multiplicative composites and examine a different way of measuring individual normative beliefs without multiplicative composites to that used in this study.

  13. Experimental/analytical approaches to modeling, calibrating and optimizing shaking table dynamics for structural dynamic applications

    NASA Astrophysics Data System (ADS)

    Trombetti, Tomaso

    This thesis presents an experimental/analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by an MTS 407 controller, which employs a proportional-integral-derivative-feedforward-differential pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and the dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques. 
The power spectral density of the system input and the cross power spectral density of the table input and output were estimated using Bartlett's spectral estimation method. The experimentally estimated table acceleration transfer functions obtained for different working conditions are correlated with their analytical counterparts. As a result of this comprehensive correlation study, a thorough understanding of the shaking table dynamics and its sensitivities to control and payload parameters is obtained. Moreover, the correlation study leads to a calibrated analytical model of the shaking table with high predictive ability. It is concluded that, in its present condition, the Rice shaking table is able to reproduce, with a high degree of accuracy, model earthquake acceleration time histories in the frequency bandwidth from 0 to 75 Hz. Furthermore, the exhaustive analysis performed indicates that the table transfer function is not significantly affected by the presence of a large (in terms of weight) payload with a fundamental frequency up to 20 Hz. Payloads with a higher fundamental frequency do significantly affect the shaking table performance and require a modification of the table control gain settings, which can easily be obtained using the predictive analytical model of the shaking table. The complete description of a structural dynamic experiment performed using the Rice shaking table facility is also reported herein. The objective of this experiment was twofold: (1) to verify the testing capability of the shaking table and (2) to experimentally validate a simplified theory developed by the author, which predicts the maximum rotational response developed by seismically isolated building structures characterized by non-coincident centers of mass and rigidity when subjected to strong earthquake ground motions.
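
    The transfer-function identification step described above can be sketched numerically. The snippet below is a minimal illustration, not the thesis's actual code: it estimates H(f) = Pxy(f)/Pxx(f) from input and output records using SciPy's Welch-style averaged periodograms (a windowed variant of the Bartlett estimates mentioned in the abstract); the signals, sampling rate, and segment length are arbitrary choices.

```python
import numpy as np
from scipy import signal

def estimate_transfer_function(x, y, fs, nperseg=256):
    """Estimate H(f) = Pxy(f) / Pxx(f) from input x and output y
    using averaged-periodogram (Welch) spectral estimates."""
    f, pxx = signal.welch(x, fs=fs, nperseg=nperseg)
    _, pxy = signal.csd(x, y, fs=fs, nperseg=nperseg)
    return f, pxy / pxx

# Sanity check: the output is the input through a known memoryless gain of 2,
# so the estimated |H(f)| should be 2 at every frequency.
rng = np.random.default_rng(0)
x = rng.standard_normal(8192)
y = 2.0 * x
f, H = estimate_transfer_function(x, y, fs=100.0)
```

    For a real shaking table, x and y would be the commanded and measured table accelerations, and the estimate would be repeated for each working condition.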

  14. Phase congruency map driven brain tumour segmentation

    NASA Astrophysics Data System (ADS)

    Szilágyi, Tünde; Brady, Michael; Berényi, Ervin

    2015-03-01

    Computer Aided Diagnostic (CAD) systems are already of proven value in healthcare, especially for surgical planning; nevertheless, much remains to be done. Gliomas are the most common brain tumours (70%) in adults, with a survival time of just 2-3 months if detected at WHO grade III or higher. Such tumours are extremely variable, necessitating multi-modal Magnetic Resonance Imaging (MRI). The use of Gadolinium-based contrast agents is only relevant at later stages of the disease, where it highlights the enhancing rim of the tumour. Currently, there is no single accepted method that can be used as a reference. There are three main challenges with such images: to decide whether a tumour is present and, if so, to localize it; to construct a mask that separates healthy and diseased tissue; and to differentiate between the tumour core and the surrounding oedema. This paper presents two contributions. First, we develop tumour seed selection based on multiscale multi-modal texture feature vectors. Second, we develop a method based on a local phase congruency feature map to drive level-set segmentation. The segmentations achieved with our method are more accurate than those of previously presented methods, particularly for challenging low grade tumours.

  15. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.

  16. Comparison of Modal Analysis Methods Applied to a Vibro-Acoustic Test Article

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn; Pappa, Richard; Buehrle, Ralph; Grosveld, Ferdinand

    2001-01-01

    Modal testing of a vibro-acoustic test article referred to as the Aluminum Testbed Cylinder (ATC) has provided frequency response data for the development of validated numerical models of complex structures for interior noise prediction and control. The ATC is an all aluminum, ring and stringer stiffened cylinder, 12 feet in length and 4 feet in diameter. The cylinder was designed to represent typical aircraft construction. Modal tests were conducted for several different configurations of the cylinder assembly under ambient and pressurized conditions. The purpose of this paper is to present results from dynamic testing of different ATC configurations using two modal analysis software methods: Eigensystem Realization Algorithm (ERA) and MTS IDEAS Polyreference method. The paper compares results from the two analysis methods as well as the results from various test configurations. The effects of pressurization on the modal characteristics are discussed.

  17. Estimating economic losses from earthquakes using an empirical approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
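
    The exposure-scaling idea described above can be illustrated with a toy calculation. In the sketch below, all numbers are hypothetical placeholders, not PAGER's calibrated values: the loss-ratio function and country factor are made up purely to show how GDP-based exposure and a country-specific multiplier combine into a loss estimate.

```python
# Toy illustration of GDP-scaled economic exposure; the loss ratios and the
# country factor below are made-up placeholders, not PAGER's calibrated values.
def economic_loss(pop_by_intensity, gdp_per_capita, country_factor, loss_ratio):
    """Sum, over shaking-intensity bins, the exposed population times its
    GDP-based per-person economic exposure times the fractional loss
    assumed at that intensity."""
    exposure_per_person = country_factor * gdp_per_capita
    return sum(loss_ratio[i] * pop * exposure_per_person
               for i, pop in pop_by_intensity.items())

# Hypothetical event: population exposed at two shaking-intensity levels
pop = {"VII": 200_000, "VIII": 50_000}
ratios = {"VII": 0.01, "VIII": 0.05}      # assumed fractional losses (made up)
loss = economic_loss(pop, gdp_per_capita=5_000, country_factor=3.0,
                     loss_ratio=ratios)
```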

  18. Parkinson's disease - the story of an eponym.

    PubMed

    Goedert, Michel; Compston, Alastair

    2018-01-01

    One of the most prevalent neurodegenerative diseases worldwide is still referred to as 'Parkinson's disease'. The condition is named after James Parkinson who, in 1817, described the shaking palsy (paralysis agitans). In the bicentennial year of this publication, we trace when and why the shaking palsy became Parkinson's disease. The term was coined by William Rutherford Sanders of Edinburgh in 1865 and later entered general usage through the influence of Jean-Martin Charcot and the school that he nurtured at the Salpêtrière Hospital in Paris. Despite a move towards more mechanism-based nosology for many medical conditions in recent years, the Parkinson's disease eponym remains in place, celebrating the life and work of this doctor, palaeontologist and political activist.

  19. Operation SNAPPER, Project 3.1. Vulnerability of Parked Aircraft to Atomic Bombs

    DTIC Science & Technology

    1953-02-01

    Portable Calibrator which was used at the Nevada Proving Grounds. The 6-101A consisted of a shake table which generated a sinusoidal motion having a...calibrator was similar to the 6-101A, with the exception that it was smaller and had a fixed shake table amplitude. The calibration procedure was to...mount the accelerometer to be calibrated on the table and shake it at various frequencies. The output of the accelerometer, which was channeled

  20. Performance of sand and shredded rubber tire mixture as a natural base isolator for earthquake protection

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Srijit; Sengupta, Aniruddha; Reddy, G. R.

    2015-12-01

    The performance of a well-designed layer of sand, and composites such as a layer of sand mixed with shredded rubber tire (RSM), as low-cost base isolators is studied in shake table tests in the laboratory. The building foundation is modeled by a 200 mm by 200 mm, 40 mm thick rigid plexiglass block. The block is placed in the middle of a 1 m by 1 m tank filled with sand. The selected base isolator is placed between the block and the sand foundation. Accelerometers are placed on top of the footing and the foundation sand layer. The displacement of the footing is also measured by an LVDT. The whole setup is mounted on a shake table and subjected to sinusoidal motions with varying amplitude and frequency. Sand is found to be effective only at very high amplitudes (> 0.65 g) of motion. The performance of a composite consisting of sand and 50% shredded rubber tire placed under the footing is found to be the most promising as an effective low-cost base isolator.

  1. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

    PubMed Central

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629
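
    The dynamic time-warping step used above for gesture recognition can be sketched in a few lines. The snippet below is a minimal textbook DTW distance, not the authors' implementation (which adds real-time automatic segmentation); the sequences are hypothetical 1-D gesture traces.

```python
import numpy as np

def dtw_distance(a, b):
    """Minimal dynamic time warping distance between two 1-D sequences,
    using absolute difference as the local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)   # accumulated-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# A time-stretched repetition of a gesture template aligns at zero cost,
# which is what makes DTW robust to variations in gesture speed.
template  = [0.0, 1.0, 2.0, 1.0, 0.0]
stretched = [0.0, 1.0, 1.0, 2.0, 2.0, 1.0, 0.0]
```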

  2. Facile mechanical shaking method is an improved isolation approach for islet preparation and transplantation.

    PubMed

    Yin, Nina; Chen, Tao; Yu, Yuling; Han, Yongming; Yan, Fei; Zheng, Zhou; Chen, Zebin

    2016-12-01

    Successful islet isolation is crucial for islet transplantation and cell treatment for type 1 diabetes. Current isolation methods are able to obtain 500-1,000 islets per rat, which results in a waste of ≥50% of total islets. In the present study, a facile mechanical shaking method for improving islet yield (up to 1,500 per rat) was developed and summarized, which was demonstrated to be more effective than the existing well-established stationary method. The present results showed that isolated islets have a maximum yield of 1,326±152 when shaking for 15 min for the fully-cannulated pancreas. For both fully-cannulated and half-cannulated pancreas in the presence of rat DNAse inhibitor, the optimal shaking time was amended to 20 min with a further increased yield of 1,344±134 and 1,286±124 islets, respectively. Furthermore, the majority of the isolated islets were morphologically intact with a well-defined surface and almost no central necrotic zone, which suggested that the condition of islets obtained via the mechanical shaking method was consistent with the stationary method. Islet size distribution was also calculated, and it was demonstrated that islets from the stationary method exhibited the same size distribution as the non-cannulated group, which had more large islets than the fully-cannulated and half-cannulated groups isolated via the shaking method. In addition, the results of the glucose challenge showed that the refraction index of each group was >2.5, which indicated the well-preserved function of isolated islets. Furthermore, the transplanted islets exhibited a therapeutic effect after 1 day of transplantation; however, they failed to control blood glucose levels after ~7 days of transplantation. 
In conclusion, these results demonstrated that the facile mechanical shaking method may markedly improve the yield of rat islet isolation, and in vitro and in vivo investigation demonstrated the well-preserved function of isolated islets in the control of blood glucose. Therefore, the facile mechanical shaking method may be an alternative improved procedure to obtain higher islet yield for islet preparation and transplantation in the treatment of type 1 diabetes.

  3. How well should probabilistic seismic hazard maps work?

    NASA Astrophysics Data System (ADS)

    Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.

    2016-12-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with an assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed that mapped is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. This implies that shaking in large earthquakes is typically greater than shown on hazard maps, as has occurred in a number of cases. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. 
Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
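
    The verification exercise described above can be sketched with a simple Monte Carlo check of the fractional-exceedance prediction p = 1 - exp(-t/T). The snippet below is an illustrative simplification of such a simulation, not the authors' code: exceedances of the mapped shaking level at each site are modeled as a Poisson process with rate 1/T, and the return period and observation window are arbitrary example values.

```python
import numpy as np

# Monte Carlo check of p = 1 - exp(-t/T): with Poissonian exceedances of the
# mapped level at rate 1/T, the expected exceedance count per site over an
# observation window t is t/T.
rng = np.random.default_rng(42)
T = 475.0            # map return period, years (illustrative)
t = 50.0             # observation window, years (illustrative)
n_sites = 200_000
counts = rng.poisson(lam=t / T, size=n_sites)    # exceedances at each site
observed_fraction = np.mean(counts >= 1)         # sites with >= 1 exceedance
predicted = 1.0 - np.exp(-t / T)
# observed_fraction should scatter closely around `predicted`
```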

  4. Evaluating Alignment of Shapes by Ensemble Visualization

    PubMed Central

    Raj, Mukund; Mirzargar, Mahsa; Preston, J. Samuel; Kirby, Robert M.; Whitaker, Ross T.

    2016-01-01

    The visualization of variability in surfaces embedded in 3D, which is a type of ensemble uncertainty visualization, provides a means of understanding the underlying distribution of a collection or ensemble of surfaces. Although ensemble visualization for isosurfaces has been described in the literature, we conduct an expert-based evaluation of various ensemble visualization techniques in a particular medical imaging application: the construction of atlases or templates from a population of images. In this work, we extend contour boxplot to 3D, allowing us to evaluate it against an enumeration-style visualization of the ensemble members and other conventional visualizations used by atlas builders, namely examining the atlas image and the corresponding images/data provided as part of the construction process. We present feedback from domain experts on the efficacy of contour boxplot compared to other modalities when used as part of the atlas construction and analysis stages of their work. PMID:26186768

  5. Transfer matrix method for dynamics modeling and independent modal space vibration control design of linear hybrid multibody system

    NASA Astrophysics Data System (ADS)

    Rong, Bao; Rui, Xiaoting; Lu, Kun; Tao, Ling; Wang, Guoping; Ni, Xiaojun

    2018-05-01

    In this paper, an efficient method for dynamics modeling and vibration control design of a linear hybrid multibody system (MS) is studied based on the transfer matrix method. The natural vibration characteristics of a linear hybrid MS are solved by using low-order transfer equations. Then, by constructing a new body dynamics equation, augmented operator and augmented eigenvector, the orthogonality of the augmented eigenvectors of a linear hybrid MS is satisfied, and its state space model, expressed in each independent modal space, is obtained easily. Based on this dynamics model, a robust independent modal space fuzzy controller is designed for vibration control of a general MS, and the genetic optimization of some critical control parameters of the fuzzy tuners is also presented. Two illustrative examples are performed, whose results show that this method is computationally efficient and achieves good control performance.

  6. Verticality perception during and after galvanic vestibular stimulation.

    PubMed

    Volkening, Katharina; Bergmann, Jeannine; Keller, Ingo; Wuehr, Max; Müller, Friedemann; Jahn, Klaus

    2014-10-03

    The human brain constructs verticality perception by integrating vestibular, somatosensory, and visual information. Here we investigated whether galvanic vestibular stimulation (GVS) has an effect on verticality perception both during and after application, by assessing the subjective verticals (visual, haptic and postural) in healthy subjects at those times. During stimulation the subjective visual vertical and the subjective haptic vertical shifted towards the anode, whereas this shift was reversed towards the cathode in all modalities once stimulation was turned off. Overall, the effects were strongest for the haptic modality. Additional investigation of the time course of GVS-induced changes in the haptic vertical revealed that anodal shifts persisted for the entire 20-min stimulation interval in the majority of subjects. Aftereffects exhibited different types of decay, with a preponderance for an exponential decay. The existence of such reverse effects after stimulation could have implications for GVS-based therapy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Shaking Eden: Voyages, Bodies and Change in the Social Construction of South American Skies

    NASA Astrophysics Data System (ADS)

    López, Alejandro Martín

    2015-05-01

    South America presents a clear example of the importance of displacements and exchanges in shaping human societies. Nevertheless, academic works, following the ideas of the first European visitors, have tended to see it as an undisturbed Eden in a 'state of nature'. For too long, South American societies were thought of as small units without history, isolated from each other. The opposition to the excesses of diffusionism helped to reinforce that image. However, in recent years this static and 'natural' representation has collapsed. New works from the most varied perspectives show us a changing and interconnected South America, where the notions of body, person and territory are complex social constructions and not the expression of an 'unmediated' experience of the world. We discuss the implications of these new ways of thinking about South America for the study of ways of perceiving and representing the sky in this region.

  8. Evaluation of knowledge regarding Shaken Baby Syndrome among parents and medical staff.

    PubMed

    Marcinkowska, Urszula; Tyrala, Kinga; Paniczek, Monika; Ledwon, Martyna; Josko-Ochojska, Jadwiga

    2016-06-08

    Shaken Baby Syndrome (SBS), currently termed Abusive Head Trauma (AHT), is a form of violence against children, mainly those under 2 years of age. The number of SBS cases might be underestimated, as many cases of violence remain unreported. The aim of the study was to evaluate the state of knowledge of the SBS phenomenon, its scale and diagnostic methods among parents, medical staff and medical students. 639 people were examined: 39% parents, 32.5% medical staff members and 28.5% medical students. 82% were women. The average age was 34.9 years (SD=9.78). 70% of them had children. The research tool was an anonymous survey. The 34 questions concerned numerous aspects of violence against children as well as knowledge about SBS. According to 90% of the interviewees shaking a baby may be dangerous, but only 43% had ever heard of Shaken Baby Syndrome. 88% of respondents agreed that 'SBS is a form of violence', but only 57% realized that one-time shaking can lead to death, and only 19% indicated men as aggressors. 16% of medical staff members did not know how long it takes for the consequences of shaking a baby to be revealed. The majority of the medical staff members working with children had never heard of SBS. Only half of the surveyed understood the connection of shaking with vision loss or a child's death. Among the long-term consequences of shaking a baby, the greatest knowledge concerns its emotional consequences.

  9. Seismomorphogenesis: a novel approach to acclimatization of tissue culture regenerated plants.

    PubMed

    Sarmast, Mostafa Khoshhal; Salehi, Hassan; Khosh-Khui, Morteza

    2014-12-01

    Plantlets transferred from in vitro to ex vivo conditions are exposed to biotic and abiotic stresses. Furthermore, in vitro regenerated plants are typically frail and sometimes difficult to handle, which increases their risk of damage and disease; hence acclimatization of these plantlets is the most important step in tissue culture techniques. An experiment was conducted under in vitro conditions to study the effects of shaking duration (twice daily at 6:00 a.m. and 9:00 p.m. for 2, 4, 8, and 16 min at 250 rpm for 14 days) on Sansevieria trifasciata L. as a model plant. Results showed that shaking improved handling, total plant height, and leaf characteristics of the model plant. Forty-eight hours after the 14 days of shaking treatments, leaf length decreased with increasing shaking time, but the proline content of leaves increased. However, 6 months after starting the experiment different results were observed. In explants that received 16 min of shaking treatment, leaf length, leaf area and photosynthesis rate were increased compared with control plantlets. Six months after starting the experiment, control plantlets had 12.5% mortality; however, no mortality was observed in the other treated explants. The results demonstrated that shaking improved the explants' root length and number, and, as a simple, cost-effective, non-chemical approach, may substitute for other prevalent acclimatization techniques used for tissue culture regenerated plantlets. Further studies with sensitive plants are needed to establish this hypothesis.

  10. Overexpression of Human Bone Alkaline Phosphatase in Pichia Pastoris

    NASA Technical Reports Server (NTRS)

    Karr, Laurel; Malone, Christine, C.; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The Pichia pastoris expression system was utilized to produce functionally active human bone alkaline phosphatase in gram quantities. Bone alkaline phosphatase is a key enzyme in bone formation and biomineralization, yet important questions about its structural chemistry and interactions with other cellular enzymes in mineralizing tissues remain unanswered. A soluble form of human bone alkaline phosphatase was constructed by deletion of the 25 amino acid hydrophobic C-terminal region of the encoding cDNA and inserted into the Pichia pastoris X-33 strain. An overexpression system was developed in shake flasks and converted to large-scale fermentation. Alkaline phosphatase was secreted into the medium to a level of 32 mg/L when cultured in shake flasks. Enzyme activity was 12 U/mg measured by a spectrophotometric assay. Fermentation yielded 880 mg/L with enzymatic activity of 968 U/mg. Gel electrophoresis analysis indicates that greater than 50% of the total protein in the fermentation is alkaline phosphatase. A purification scheme has been developed using ammonium sulfate precipitation followed by hydrophobic interaction chromatography. We are currently screening crystallization conditions of the purified recombinant protein for subsequent X-ray diffraction analyses. Structural data should provide additional information on the role of alkaline phosphatase in normal bone mineralization and in certain bone mineralization anomalies.

  11. One-pot DNA construction for synthetic biology: the Modular Overlap-Directed Assembly with Linkers (MODAL) strategy

    PubMed Central

    Casini, Arturo; MacDonald, James T.; Jonghe, Joachim De; Christodoulou, Georgia; Freemont, Paul S.; Baldwin, Geoff S.; Ellis, Tom

    2014-01-01

    Overlap-directed DNA assembly methods allow multiple DNA parts to be assembled together in one reaction. These methods, which rely on sequence homology between the ends of DNA parts, have become widely adopted in synthetic biology, despite being incompatible with a key principle of engineering: modularity. To answer this, we present MODAL: a Modular Overlap-Directed Assembly with Linkers strategy that brings modularity to overlap-directed methods, allowing assembly of an initial set of DNA parts into a variety of arrangements in one-pot reactions. MODAL is accompanied by a custom software tool that designs overlap linkers to guide assembly, allowing parts to be assembled in any specified order and orientation. The in silico design of synthetic orthogonal overlapping junctions allows for much greater efficiency in DNA assembly for a variety of different methods compared with using non-designed sequence. In tests with three different assembly technologies, the MODAL strategy gives assembly of both yeast and bacterial plasmids, composed of up to five DNA parts in the kilobase range with efficiencies of between 75 and 100%. It also seamlessly allows mutagenesis to be performed on any specified DNA parts during the process, allowing the one-step creation of construct libraries valuable for synthetic biology applications. PMID:24153110

  12. Gold standards and expert panels: a pulmonary nodule case study with challenges and solutions

    NASA Astrophysics Data System (ADS)

    Miller, Dave P.; O'Shaughnessy, Kathryn F.; Wood, Susan A.; Castellino, Ronald A.

    2004-05-01

    Comparative evaluations of reader performance using different modalities, e.g. CT with computer-aided detection (CAD) vs. CT without CAD, generally require a "truth" definition based on a gold standard. There are many situations in which a true invariant gold standard is impractical or impossible to obtain. For instance, small pulmonary nodules are generally not assessed by biopsy or resection. In such cases, it is common to use a unanimous consensus or majority agreement from an expert panel as a reference standard for actionability in lieu of the unknown gold standard for disease. Nonetheless, there are three major concerns about expert panel reference standards: (1) actionability is not synonymous with disease; (2) it may be possible to obtain different conclusions about which modality is better using different rules (e.g. majority vs. unanimous consensus); and (3) the variability associated with the panelists is not formally captured in the p-values or confidence intervals that are generally produced for estimating the extent to which one modality is superior to the other. A multi-reader-multi-case (MRMC) receiver operating characteristic (ROC) study was performed using 90 cases, 15 readers, and a reference truth based on 3 experienced panelists. The primary analyses were conducted using a reference truth of unanimous consensus regarding actionability (3 out of 3 panelists). To assess the three concerns noted above: (1) additional data from the original radiology reports were compared to the panel; (2) the complete analysis was repeated using different definitions of truth; and (3) bootstrap analyses were conducted in which new truth panels were constructed by picking 1, 2, or 3 panelists at random. The definition of the reference truth affected the results for each modality (CT with CAD and CT without CAD) considered by itself, but the effects were similar, so the primary analysis comparing the modalities was robust to the choice of the reference truth.
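
    The panelist-bootstrap idea described above can be sketched as follows. The snippet is an illustrative simplification, not the study's analysis code: the panel votes are randomly generated placeholders, and only the resampling of panelists and the two truth rules (unanimous vs. majority) are shown.

```python
import numpy as np

# Hypothetical panel data: each row is a case, each column a panelist's
# binary "actionable" call (randomly generated for illustration only).
rng = np.random.default_rng(7)
panel = rng.integers(0, 2, size=(90, 3))       # 90 cases, 3 panelists

def truth_under_rule(panel, rule):
    """Reference truth per case under a unanimous or majority-vote rule."""
    votes = panel.sum(axis=1)
    n = panel.shape[1]
    return votes == n if rule == "unanimous" else votes > n / 2

def bootstrap_truths(panel, n_boot=1000, rule="majority"):
    """Resample panelists with replacement to expose the variability of
    the panel-derived reference truth."""
    truths = []
    for _ in range(n_boot):
        cols = rng.integers(0, panel.shape[1], size=panel.shape[1])
        truths.append(truth_under_rule(panel[:, cols], rule))
    return np.array(truths)

boot = bootstrap_truths(panel)
per_case_positive_rate = boot.mean(axis=0)     # how often each case is "true"
```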

  13. Causal Inference for Cross-Modal Action Selection: A Computational Study in a Decision Making Framework.

    PubMed

    Daemi, Mehdi; Harris, Laurence R; Crawford, J Douglas

    2016-01-01

Animals try to make sense of sensory information from multiple modalities by categorizing it into perceptions of individual or multiple external objects or internal concepts. For example, the brain constructs sensory, spatial representations of the locations of visual and auditory stimuli in the visual and auditory cortices based on retinal and cochlear stimulation. Currently, it is not known how the brain compares the temporal and spatial features of these sensory representations to decide whether they originate from the same or separate sources in space. Here, we propose a computational model of how the brain might solve such a task. We reduce the visual and auditory information to time-varying, finite-dimensional signals. We introduce controlled, leaky integrators as working memory that retains the sensory information for the limited time-course of task implementation. We propose our model within an evidence-based, decision-making framework, where the alternative plan units are saliency maps of space. A spatiotemporal similarity measure, computed directly from the unimodal signals, is suggested as the criterion to infer common or separate causes. We provide simulations that (1) validate our model against behavioral experimental results in tasks where the participants were asked to report common or separate causes for cross-modal stimuli presented with arbitrary spatial and temporal disparities; (2) predict the behavior in novel experiments where stimuli have different combinations of spatial, temporal, and reliability features; and (3) illustrate the dynamics of the proposed internal system. These results confirm our spatiotemporal similarity measure as a viable criterion for causal inference, and our decision-making framework as a viable mechanism for target selection, which may be used by the brain in cross-modal situations.
Further, we suggest that a similar approach can be extended to other cognitive problems where working memory is a limiting factor, such as target selection among higher numbers of stimuli and selections among other modality combinations.
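The model's two core ingredients, controlled leaky integrators as working memory and a spatiotemporal similarity criterion, can be illustrated with a toy one-dimensional version. The leak rate, the pulse signals, and the normalized-inner-product similarity below are illustrative stand-ins for the paper's formulation, not its actual equations:

```python
import math

def leaky_integrate(signal, leak=0.1, dt=0.01):
    """Controlled leaky integrator: dm/dt = -leak * m + input."""
    m, trace = 0.0, []
    for x in signal:
        m += dt * (-leak * m + x)
        trace.append(m)
    return trace

def similarity(mem_a, mem_b):
    """Similarity as a normalized inner product of the working-memory
    traces (a stand-in for the paper's spatiotemporal measure)."""
    dot = sum(a * b for a, b in zip(mem_a, mem_b))
    na = math.sqrt(sum(a * a for a in mem_a))
    nb = math.sqrt(sum(b * b for b in mem_b))
    return dot / (na * nb) if na and nb else 0.0

# Two pulses: temporally coincident vs disparate cross-modal stimuli.
t = [i * 0.01 for i in range(500)]
visual   = [1.0 if 1.0 <= ti < 1.2 else 0.0 for ti in t]
aud_same = [1.0 if 1.0 <= ti < 1.2 else 0.0 for ti in t]
aud_late = [1.0 if 3.0 <= ti < 3.2 else 0.0 for ti in t]

s_same = similarity(leaky_integrate(visual), leaky_integrate(aud_same))
s_late = similarity(leaky_integrate(visual), leaky_integrate(aud_late))
# s_same exceeds s_late, so a threshold on the similarity can infer
# a common cause for the coincident pair and separate causes otherwise.
```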

  14. Hybrid optical acoustic seafloor mapping

    NASA Astrophysics Data System (ADS)

    Inglis, Gabrielle

The oceanographic research and industrial communities have a persistent demand for detailed three-dimensional seafloor maps that convey both shape and texture. Such data products are used for archeology, geology, ship inspection, biology, and habitat classification. A variety of sensing modalities and processing techniques are available to produce these maps, and each has its own potential benefits and related challenges. Multibeam sonar and stereo vision are two such sensors, with complementary strengths that make them ideally suited for data fusion. Data fusion approaches, however, have seen only limited application to underwater mapping, and there are no established methods for creating hybrid 3D reconstructions from two underwater sensing modalities. This thesis develops a processing pipeline to synthesize hybrid maps from multi-modal survey data. It is helpful to think of this processing pipeline as having two distinct phases: Navigation Refinement and Map Construction. This thesis extends existing work in underwater navigation refinement by incorporating methods that increase measurement consistency between the multibeam sonar and the camera. The result is a self-consistent 3D point cloud comprising camera and multibeam measurements. In the map-construction phase, a subset of the multi-modal point cloud retaining the best characteristics of each sensor is selected to be part of the final map. To quantify the desired traits of a map, several characteristics of a useful map are distilled into specific criteria. The different ways that hybrid maps can address these criteria provide justification for producing them as an alternative to current methodologies. The processing pipeline implements multi-modal data fusion and outlier rejection with emphasis on different aspects of map fidelity. The resulting point cloud is evaluated in terms of how well it addresses the map criteria.
The final hybrid maps retain the strengths of both sensors and show significant improvement over the single modality maps and naively assembled multi-modal maps.
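One way to picture the map-construction step, selecting the subset of the multi-modal point cloud that retains the best characteristics of each sensor, is a per-cell rule like the one below. The grid size, threshold, and preference for camera points where optical coverage is dense are assumptions for illustration, not the thesis's actual criteria:

```python
from collections import defaultdict

def fuse(points, cell=0.5, min_camera_pts=10):
    """points: (x, y, z, sensor) tuples, sensor in {'camera', 'sonar'}.
    Keep dense optical texture where available, sonar shape elsewhere."""
    cells = defaultdict(list)
    for x, y, z, sensor in points:
        cells[(int(x // cell), int(y // cell))].append((x, y, z, sensor))
    fused = []
    for pts in cells.values():
        cam = [p for p in pts if p[3] == "camera"]
        if len(cam) >= min_camera_pts:
            fused.extend(cam)                          # fine texture wins
        else:
            fused.extend(p for p in pts if p[3] == "sonar")
    return fused

# A densely imaged cell (12 camera points) and a sparsely imaged one:
dense  = [(0.01 * i, 0.1, -5.0, "camera") for i in range(12)]
dense += [(0.2, 0.2, -5.1, "sonar")]
sparse = [(5.1, 5.1, -6.0, "camera"), (5.2, 5.2, -6.0, "camera"),
          (5.3, 5.3, -6.1, "sonar")]
fused_cloud = fuse(dense + sparse)  # 12 camera points + 1 sonar point
```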

  15. Communication during copulation in the sex-role reversed wolf spider Allocosa brasiliensis: Female shakes for soliciting new ejaculations?

    PubMed

    Garcia Diaz, Virginia; Aisenberg, Anita; Peretti, Alfredo V

    2015-07-01

Traditional studies on sexual communication have focused on the exchange of signals during courtship. However, communication between the sexes can also occur during or after copulation. Allocosa brasiliensis is a wolf spider that shows a reversal of the typical sex roles and of the usual sexual size dimorphism expected for spiders. Females are smaller than males, and they are the roving sex that initiates courtship. Occasional previous observations suggested that females perform body shaking behaviors during copulation. Our objective was to analyze whether female body shaking is associated with male copulatory behavior in A. brasiliensis and to determine whether this female behavior has a communicatory function in this species. For that purpose, we performed a fine-scaled analysis of fifteen copulations under laboratory conditions. We video-recorded all the trials and looked for associations between female and male copulatory behaviors. We also analyze the significant difference between the intervals preceding and following female shaking, which favors the occurrence of a subsequent ejaculation. We discuss whether shaking could be acting as a signal to accelerate and motivate palpal insertion and ejaculation, and/or to inhibit male cannibalistic tendencies in this species. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. The effect of multiple internal representations on context-rich instruction

    NASA Astrophysics Data System (ADS)

    Lasry, Nathaniel; Aulls, Mark W.

    2007-11-01

We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities, such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed, as was conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.

  17. 117. VIEW, LOOKING NORTHWEST, OF DIESTER MODEL 6 CONCENTRATING (SHAKING) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    117. VIEW, LOOKING NORTHWEST, OF DIESTER MODEL 6 CONCENTRATING (SHAKING) TABLE, USED FOR PRIMARY, MECHANICAL SEPARATION OF GOLD FROM ORE. - Shenandoah-Dives Mill, 135 County Road 2, Silverton, San Juan County, CO

  18. Public Release of Estimated Impact-Based Earthquake Alerts - An Update to the U.S. Geological Survey PAGER System

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Jaiswal, K. S.; Marano, K.; Hearne, M.; Earle, P. S.; So, E.; Garcia, D.; Hayes, G. P.; Mathias, S.; Applegate, D.; Bausch, D.

    2010-12-01

The U.S. Geological Survey (USGS) has begun publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses. These estimates should significantly enhance the utility of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which has been providing estimated ShakeMaps and computing population exposures to specific shaking intensities since 2007. Quantifying earthquake impacts and communicating loss estimates (and their uncertainties) to the public has been the culmination of several important new and evolving components of the system. First, the operational PAGER system now relies on empirically based loss models that account for estimated shaking hazard and population exposure, and that employ country-specific fatality and economic loss functions derived from analyses of losses due to recent and past earthquakes. In some countries, our empirical loss models are informed in part by PAGER’s semi-empirical and analytical loss models, and building exposure and vulnerability data sets, all of which are being developed in parallel to the empirical approach. Second, human and economic loss information is now portrayed as a supplement to existing intensity/exposure content on both PAGER summary alert (available via cell phone/email) messages and web pages. Loss calculations also include estimates of the economic impact with respect to the country’s gross domestic product. Third, in order to facilitate rapid and appropriate earthquake responses based on our probable loss estimates, in early 2010 we proposed a four-level Earthquake Impact Scale (EIS). Instead of simply issuing median estimates for losses—which can be easily misunderstood and misused—this scale provides ranges of losses from which potential responders can gauge expected overall impact from strong shaking. EIS is based on two complementary criteria: the estimated cost of damage, which is most suitable for U.S.
domestic events; and estimated ranges of fatalities, which are generally more appropriate for global events, particularly in earthquake-vulnerable countries. Alert levels are characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered when estimated US dollar losses reach 1 million, 100 million, and 1 billion+ levels, respectively. Finally, alerting protocols now explicitly support EIS-based alerts. Critical users can receive PAGER alerts i) based on the EIS-based alert level, in addition to or as an alternative to magnitude and population/intensity exposure-based alerts, and ii) optionally, based on user-selected regions of the world. The essence of PAGER’s impact-based alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide based on quantifiable, albeit uncertain, loss estimates provided by the USGS.
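The EIS thresholds quoted above reduce to a small lookup. Combining the fatality and economic criteria by taking the more severe level is an assumption made here for illustration, not a statement of PAGER's exact rule:

```python
# Thresholds as stated in the abstract: fatalities 1 / 100 / 1000 and
# economic losses $1M / $100M / $1B for yellow / orange / red.
FATALITY_THRESHOLDS = [(1000, "red"), (100, "orange"), (1, "yellow")]
LOSS_THRESHOLDS_USD = [(1e9, "red"), (1e8, "orange"), (1e6, "yellow")]

def alert_level(estimate, thresholds):
    """Map a median loss estimate onto the four-level scale."""
    for threshold, color in thresholds:
        if estimate >= threshold:
            return color
    return "green"

def pager_alert(est_fatalities, est_loss_usd):
    """Overall alert taken as the more severe of the two criteria."""
    order = ["green", "yellow", "orange", "red"]
    return max(alert_level(est_fatalities, FATALITY_THRESHOLDS),
               alert_level(est_loss_usd, LOSS_THRESHOLDS_USD),
               key=order.index)
```

For example, an event with negligible expected fatalities but roughly $500M in estimated losses would map to an orange alert under this sketch.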

  19. Logistic Regression for Seismically Induced Landslide Predictions: Using Uniform Hazard and Geophysical Layers as Predictor Variables

    NASA Astrophysics Data System (ADS)

    Nowicki, M. A.; Hearne, M.; Thompson, E.; Wald, D. J.

    2012-12-01

Seismically induced landslides present a costly and often fatal threat in many mountainous regions. Substantial effort has been invested to understand where seismically induced landslides may occur in the future. Both slope-stability methods and, more recently, statistical approaches to the problem are described throughout the literature. Though some regional efforts have succeeded, no uniformly agreed-upon method is available for predicting the likelihood and spatial extent of seismically induced landslides. For use in the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, we would like to routinely make such estimates, in near-real time, around the globe. Here we use the recently produced USGS ShakeMap Atlas of historic earthquakes to develop an empirical landslide probability model. We focus on recent events, yet include any digitally mapped landslide inventories for which well-constrained ShakeMaps are also available. We combine these uniform estimates of the input shaking (e.g., peak acceleration and velocity) with broadly available susceptibility proxies, such as topographic slope and surface geology. The resulting database is used to build a predictive model of the probability of landslide occurrence with logistic regression. The landslide database includes observations from the Northridge, California (1994); Wenchuan, China (2008); Chi-Chi, Taiwan (1999); and Chuetsu, Japan (2004) earthquakes; we also provide ShakeMaps for moderate-sized events without landslides for proper model testing and training. The performance of the regression model is assessed with both statistical goodness-of-fit metrics and a qualitative review of whether or not the model is able to capture the spatial extent of landslides for each event.
Part of our goal is to determine which variables can be employed based on globally available data or proxies, and whether or not modeling results from one region are transferable to geomorphologically similar regions that lack proper calibration events. Combined with near-real-time ShakeMaps, we anticipate using our model to make generalized predictions of whether or not (and if so, where) landslides are likely to occur for earthquakes around the globe; we also intend to incorporate this functionality into the USGS PAGER system.
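The regression step described above reduces to a standard logistic model of landslide probability given shaking and susceptibility predictors. The coefficients below are placeholders for illustration, not the fitted values:

```python
import math

def landslide_probability(pga_g, slope_deg, coeffs=(-6.0, 4.0, 0.08)):
    """Logistic model P = 1 / (1 + exp(-(b0 + b1*PGA + b2*slope))).
    coeffs are illustrative placeholders, not the study's fit."""
    b0, b1, b2 = coeffs
    z = b0 + b1 * pga_g + b2 * slope_deg
    return 1.0 / (1.0 + math.exp(-z))

# Weak shaking on gentle terrain vs strong shaking on a steep slope:
p_low  = landslide_probability(0.05, 5)   # small probability
p_high = landslide_probability(0.8, 35)   # much larger probability
```

In practice the coefficients are estimated from the landslide/no-landslide database, and a probability threshold is chosen to compare predicted against mapped landslide extents.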

  20. Real-time earthquake shake, damage, and loss mapping for Istanbul metropolitan area

    NASA Astrophysics Data System (ADS)

    Zülfikar, A. Can; Fercan, N. Özge Zülfikar; Tunç, Süleyman; Erdik, Mustafa

    2017-01-01

The past devastating earthquakes in densely populated urban centers, such as the 1994 Northridge; 1995 Kobe; 1999 series of Kocaeli, Düzce, and Athens; and 2011 Van-Erciş events, showed that substantial social and economic losses can be expected. Previous studies indicate that inadequate emergency response can increase the number of casualties by up to a factor of 10, which underscores the need for research on the rapid estimation of earthquake shaking, damage, and loss. The number of casualties in urban areas immediately following an earthquake can be reduced if the location and severity of damage can be rapidly assessed using information from rapid response systems. In this context, a research project (TUBITAK-109M734) titled "Real-time Information of Earthquake Shaking, Damage, and Losses for Target Cities of Thessaloniki and Istanbul" was conducted during 2011-2014 to establish the rapid estimation of ground motion shaking and the related earthquake damage and casualties for the target cities. In the present study, the application to the Istanbul metropolitan area is presented. In order to fulfill this objective, the earthquake hazard and risk assessment methodology known as the Earthquake Loss Estimation Routine, which was developed for the Euro-Mediterranean region within the Network of Research Infrastructures for European Seismology EC-FP6 project, was used. The current application to the Istanbul metropolitan area provides real-time ground motion information obtained by strong motion stations distributed throughout the densely populated areas of the city. From this ground motion information, building damage is estimated using a grid-based building inventory, and the related loss is then estimated. Through this application, the rapidly estimated information enables public and private emergency management authorities to take action and to allocate and prioritize resources to minimize casualties in urban areas during the immediate post-earthquake period.
Moreover, it is expected that during an earthquake, rapid information of ground shaking, damage, and loss estimations will provide vital information to allow appropriate emergency agencies to take immediate action, which will help to save lives. In general terms, this study can be considered as an example for application to metropolitan areas under seismic risk.

  1. Modeling continuous seismic velocity changes due to ground shaking in Chile

    NASA Astrophysics Data System (ADS)

    Gassenmeier, Martina; Richter, Tom; Sens-Schönfelder, Christoph; Korn, Michael; Tilmann, Frederik

    2015-04-01

In order to investigate temporal seismic velocity changes due to earthquake-related processes and environmental forcing, we analyze 8 years of ambient seismic noise recorded by the Integrated Plate Boundary Observatory Chile (IPOC) network in northern Chile between 18° and 25° S. The Mw 7.7 Tocopilla earthquake in 2007 and the Mw 8.1 Iquique earthquake in 2014, as well as numerous smaller events, occurred in this area. By autocorrelation of the ambient seismic noise field, approximations of the Green's functions are retrieved. The recovered function represents backscattered or multiply scattered energy from the immediate neighborhood of the station. To detect relative changes of the seismic velocities we apply the stretching method, which compares individual autocorrelation functions to stretched or compressed versions of a long-term averaged reference autocorrelation function. We use time windows in the coda of the autocorrelations that contain scattered waves, which are highly sensitive to minute changes in the velocity. At station PATCX we observe seasonal changes in seismic velocity as well as temporary velocity reductions in the frequency range of 4-6 Hz. The seasonal changes can be attributed to thermal stress changes in the subsurface related to variations of the atmospheric temperature. This effect can be modeled well by a sine curve and is subtracted for further analysis of short-term variations. Temporary velocity reductions occur at the time of ground shaking, usually caused by earthquakes, and are followed by a recovery. We present an empirical model that describes the seismic velocity variations based on continuous observations of the local ground acceleration. Our hypothesis is that not only the shaking of earthquakes provokes velocity drops, but that any small vibrations continuously induce minor velocity variations that are immediately compensated by healing in the steady state.
We show that the shaking effect is accumulated over time and best described by the integrated envelope of the ground acceleration over 1 day which is the discretization interval of the velocity measurements. In our model the amplitude of the velocity reduction as well as the recovery time are proportional to the size of the excitation. This model with the two free scaling parameters for the shaking induced velocity variation fits the data in remarkable detail. Additionally, a linear trend is observed that might be related to a recovery process from one or more earthquakes before our measurement period. For the Tocopilla earthquake in 2007 and the Iquique earthquake in 2014 velocity reductions are also observed at other stations of the IPOC network. However, a clear relationship between the ground shaking and the induced velocity reductions is not visible at other stations. We attribute the outstanding sensitivity of PATCX to ground shaking to the special geological setting of the station, where the material consists of relatively loose conglomerate with high pore volume.
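The stretching method used above can be sketched as a grid search over stretch factors: each candidate stretched version of the long-term reference is correlated with the current coda window, and the best-matching factor gives the relative velocity change (here with the common convention dv/v = -ε). The synthetic trace and search grid are illustrative:

```python
import math

def correlate(a, b):
    """Zero-mean normalized cross-correlation of two equal-length traces."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def stretched(trace, dt, epsilon):
    """Resample trace at times t/(1+epsilon); epsilon > 0 dilates the
    coda, as a velocity drop dv/v = -epsilon would."""
    out = []
    for i in range(len(trace)):
        t = i * dt / (1.0 + epsilon)
        j = int(t / dt)
        if j + 1 < len(trace):
            frac = t / dt - j
            out.append(trace[j] * (1 - frac) + trace[j + 1] * frac)
        else:
            out.append(trace[-1])
    return out

def dv_over_v(reference, current, dt, grid=None):
    """Grid-search the stretch factor that best matches the reference
    to the current window; return the implied velocity change."""
    grid = grid or [e / 10000.0 for e in range(-100, 101)]  # +/- 1 %
    best = max(grid, key=lambda e: correlate(stretched(reference, dt, e),
                                             current))
    return -best

# Synthetic check: a decaying 5 Hz coda, then a 0.5 % dilated copy of it.
dt = 0.01
ref = [math.sin(2 * math.pi * 5 * i * dt) * math.exp(-0.5 * i * dt)
       for i in range(400)]
cur = stretched(ref, dt, 0.005)
# dv_over_v(ref, cur, dt) recovers approximately -0.005
```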

  2. Examples of Communicating Uncertainty Applied to Earthquake Hazard and Risk Products

    NASA Astrophysics Data System (ADS)

    Wald, D. J.

    2013-12-01

    When is communicating scientific modeling uncertainty effective? One viewpoint is that the answer depends on whether one is communicating hazard or risk: hazards have quantifiable uncertainties (which, granted, are often ignored), yet risk uncertainties compound uncertainties inherent in the hazard with those of the risk calculations, and are thus often larger. Larger, yet more meaningful: since risk entails societal impact of some form, consumers of such information tend to have a better grasp of the potential uncertainty ranges for loss information than they do for less-tangible hazard values (like magnitude, peak acceleration, or stream flow). I present two examples that compare and contrast communicating uncertainty for earthquake hazard and risk products. The first example is the U.S. Geological Survey's (USGS) ShakeMap system, which portrays the uncertain, best estimate of the distribution and intensity of shaking over the potentially impacted region. The shaking intensity is well constrained at seismograph locations yet is uncertain elsewhere, so shaking uncertainties are quantified and presented spatially. However, with ShakeMap, it seems that users tend to believe what they see is accurate in part because (1) considering the shaking uncertainty complicates the picture, and (2) it would not necessarily alter their decision-making. In contrast, when it comes to making earthquake-response decisions based on uncertain loss estimates, actions tend to be made only after analysis of the confidence in (or source of) such estimates. Uncertain ranges of loss estimates instill tangible images for users, and when such uncertainties become large, intuitive reality-check alarms go off, for example, when the range of losses presented become too wide to be useful. 
The USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, which in near-real time alerts users to the likelihood of ranges of potential fatalities and economic impact, is aimed at facilitating rapid and proportionate earthquake response. For uncertainty representation, PAGER employs an Earthquake Impact Scale (EIS) that provides simple alerting thresholds, derived from systematic analyses of past earthquake impact and response levels. The alert levels are characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). We made a conscious attempt at simple and intuitive color-coded alerting criteria; yet we preserve the necessary uncertainty measures (with simple histograms) by which one can gauge the likelihood that the alert is over- or underestimated. In these hazard and loss modeling examples, both products are widely used across a range of technical as well as general audiences. Ironically, ShakeMap uncertainties--rigorously reported and portrayed for the primarily scientific portion of the audience--are rarely employed and are routinely misunderstood; for PAGER, uncertainties aimed at a wider user audience seem to be more easily digested. We discuss how differences in the way these uncertainties are portrayed may play into their acceptance and uptake, or lack thereof.

  3. Ambient Vibration and Earthquake-Data Analyses of a 62-STORY Building Using System Identification and Seismic Interferometry

    NASA Astrophysics Data System (ADS)

    Kalkan, E.; Fletcher, J. B.; Ulusoy, H. S.; Baker, L. A.

    2014-12-01

A 62-story residential tower in San Francisco—the tallest all-residential building in California—was recently instrumented by the USGS's National Strong Motion Project in collaboration with the Strong Motion Instrumentation Program of the California Geological Survey to monitor the motion of a tall building built with specifically engineered features (including buckling-restrained braces, outrigger columns and a tuned liquid damper) to reduce its sway from seismic and wind loads. This 641-ft tower has been outfitted with 72 uni-axial accelerometers, spanning 26 different levels of the building. For damage detection and localization through structural health monitoring, we use local micro-earthquake and ambient monitoring (background noise) to define the linear-elastic (undamaged) dynamic properties of the superstructure, including its modal parameters (fundamental frequencies, mode shapes and modal damping values), the shear-wave propagation profile, and the wave attenuation inside the building, all of which need to be determined in advance of strong shaking. In order to estimate the baseline modal parameters, we applied a frequency domain decomposition method. Using this method, the first three bending modes in the reference east-west direction, the first two bending modes in the reference north-south direction, and the first two torsional modes were identified. The shear-wave propagation and wave attenuation inside the building were computed using deconvolution interferometry. The data used for the analyses are ambient vibrations of 20 minutes duration and earthquake records from a local M4.5 event located just northeast of Geyserville, California. We show that application of deconvolution interferometry to data recorded inside a building is a powerful technique for monitoring structural parameters, such as velocities of traveling waves, frequencies of normal modes, and intrinsic attenuation (i.e., damping).
The simplicity and similarity of the deconvolved waveforms from ambient vibrations and a small magnitude event also suggest that a one-dimensional shear velocity model is sufficiently accurate to represent the wave propagation characteristics inside the building.
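Deconvolution interferometry itself is, at its core, a spectral division of an upper-floor record by the base record. A minimal sketch with a water-level regularization and a purely synthetic record, where a 10-sample delay stands in for the shear-wave travel time through the structure:

```python
import numpy as np

def deconvolve(upper, base, water=0.01):
    """Spectral division upper/base with water-level regularization,
    the core operation of deconvolution interferometry."""
    U, B = np.fft.rfft(upper), np.fft.rfft(base)
    eps = water * np.max(np.abs(B)) ** 2
    D = U * np.conj(B) / (np.abs(B) ** 2 + eps)
    return np.fft.irfft(D, n=len(upper))

# Synthetic building: the roof record is the base record delayed by
# 10 samples (a stand-in for the shear-wave travel time).
dt, delay = 0.01, 10
rng = np.random.default_rng(0)
base = rng.standard_normal(1024)
roof = np.roll(base, delay)  # pure circular delay, no attenuation

impulse = deconvolve(roof, base)
travel_time = np.argmax(impulse) * dt  # peak lag gives the travel time
```

On real records the deconvolved pulse also broadens and decays, which is what carries the wave-speed and intrinsic-attenuation information discussed above.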

  4. Gold nanoclusters as contrast agents for fluorescent and X-ray dual-modality imaging.

    PubMed

    Zhang, Aili; Tu, Yu; Qin, Songbing; Li, Yan; Zhou, Juying; Chen, Na; Lu, Qiang; Zhang, Bingbo

    2012-04-15

Multimodal imaging is an alternative approach for improving the sensitivity of early cancer diagnosis. In this study, highly fluorescent gold nanoclusters (Au NCs) with a strong X-ray absorption coefficient are synthesized as contrast agents (CAs) for fluorescent and X-ray dual-modality imaging. The experimental results show that the as-prepared Au NCs combine ultrasmall size, reliable fluorescence emission, high computed tomography (CT) value and good biocompatibility. In vivo imaging results indicate that the obtained Au NCs are capable of fluorescent and X-ray contrast-enhanced imaging. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    NASA Astrophysics Data System (ADS)

    Allen, R. M.

    2014-12-01

Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US, illustrate how smartphones can contribute to the system, and review applications of the information to reduce future losses.

  6. Process optimization involving critical evaluation of oxygen transfer, oxygen uptake and nitrogen limitation for enhanced biomass and lipid production by oleaginous yeast for biofuel application.

    PubMed

    Chopra, Jayita; Sen, Ramkrishna

    2018-04-20

Lipid accumulation in oleaginous yeast is generally induced by nitrogen starvation, while oxygen saturation can influence biomass growth. Systematic shake flask studies that help in identifying the right nitrogen source and relate its uptake kinetics to lipid biosynthesis under varying oxygen saturation conditions are essential for addressing the bioprocessing-related issues that are envisaged to occur in fermenter-scale production. In the present study, lipid bioaccumulation by P. guilliermondii at varying C:N ratios and oxygen transfer conditions (assessed in terms of the volumetric oxygen transfer coefficient, kLa) was investigated in shake flasks using a pre-optimized N-source and a two-stage inoculum formulated in a hybrid medium. A maximum lipid concentration of 10.8 ± 0.5 g L⁻¹ was obtained in the shake flask study at the optimal condition, with an initial C:N ratio of 60:1 and a kLa of 0.6 min⁻¹, at a biomass specific growth rate of 0.11 h⁻¹. Translating these optimal shake flask conditions to a 3.7 L stirred tank reactor resulted in biomass and lipid concentrations of 16.74 ± 0.8 and 8 ± 0.4 g L⁻¹, respectively. The fatty acid methyl ester (FAME) profile of the lipids obtained by gas chromatography was found to be suitable for biodiesel application. We strongly believe that the rationalistic approach-based design of experiments adopted in this study would help in achieving high cell density with improved lipid accumulation, minimize the effort needed for process optimization during bioreactor-level operations, and consequently reduce the research and development-associated costs.

  7. Dietary Soy Supplement on Fibromyalgia Symptoms: A Randomized, Double-Blind, Placebo-Controlled, Early Phase Trial

    PubMed Central

    Wahner-Roedler, Dietlind L.; Thompson, Jeffrey M.; Luedtke, Connie A.; King, Susan M.; Cha, Stephen S.; Elkin, Peter L.; Bruce, Barbara K.; Townsend, Cynthia O.; Bergeson, Jody R.; Eickhoff, Andrea L.; Loehrer, Laura L.; Sood, Amit; Bauer, Brent A.

    2011-01-01

Most patients with fibromyalgia use complementary and alternative medicine (CAM). Properly designed controlled trials are necessary to assess the effectiveness of these practices. This study was a randomized, double-blind, placebo-controlled, early phase trial. Fifty patients seen at a fibromyalgia outpatient treatment program were randomly assigned to a daily soy or placebo (casein) shake. Outcome measures were scores of the Fibromyalgia Impact Questionnaire (FIQ) and the Center for Epidemiologic Studies Depression Scale (CES-D) at baseline and after 6 weeks of intervention. Analysis was performed with standard statistics based on the null hypothesis and with a separation test for early-phase CAM comparative trials. Twenty-eight patients completed the study. Use of standard statistics with intent-to-treat analysis showed that total FIQ scores decreased by 14% in the soy group (P = .02) and by 18% in the placebo group (P < .001). The difference in change in scores between the groups was not significant (P = .16). With the same analysis, CES-D scores decreased in the soy group by 16% (P = .004) and in the placebo group by 15% (P = .05). The change in scores was similar in the groups (P = .83). Statistical analysis using the separation test with intent-to-treat analysis revealed no benefit of soy compared with placebo. Shakes that contain soy and shakes that contain casein, when combined with a multidisciplinary fibromyalgia treatment program, both provide a decrease in fibromyalgia symptoms. Separation between the effects of soy and casein (control) shakes did not favor the intervention. Therefore, large-sample studies using soy for patients with fibromyalgia are probably not indicated. PMID:18990724

  8. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

    GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, these algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare predicted peak ground acceleration (PGA) from first-alert solutions with accelerations recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.

  9. ShakeMap fed by macroseismic data in France: feedbacks and contribution for improving SHA.

    NASA Astrophysics Data System (ADS)

    Schlupp, A.

    2016-12-01

    We use the USGS ShakeMap software V3.5, which allows intensity to be included as input alongside instrumental data. We have been collecting citizen testimonies for 17 years in France, a region of moderate seismicity in its metropolitan part and in a subduction context in its West Indies part. We frequently collect several thousand testimonies after events of Mw >≈ 4.5. Thanks to the selection of "intensity characteristic thumbnails", we can provide in real time a single questionnaire intensity (SQI), averaged at the city scale, as a preliminary EMS98 intensity. We observed that about 65% of these thumbnail SQIs are identical to the final expert SQI, and the remainder differ by only one intensity degree. With about 36,000 cities (one per 14 square km), we can sample the territory in detail, whereas the roughly 400 seismic stations provide irreplaceably precise but very local ground-motion parameters, most often at larger epicentral distances. Since 2012, we have contributed intensities for ShakeMap in the Pyrenees range (www.SisPyr.eu). Since spring 2016, we have run ShakeMap V3.5 in a beta version for the whole territory of France, with several adaptations for regions with moderate-size events. The BCSF provides intensities (www.franceseisme.fr) and RESIF the instrumental data (www.resif.fr), together with the West Indies observatories (OVSG-OVSM) and a few stations in bordering countries. The feedback so far: a large improvement at all distances from including intensities; the need to use regional attenuation laws; detection of significant ML overestimation in a few regions; strong dependence on the epicenter location; recently published GMICEs that are well adapted; and difficulty representing non-circular isoseismals. What we learn from ShakeMap is also a valuable contribution to hazard assessment. We aim to continuously improve the results toward a state-reference ShakeMap through a dedicated "ShakeMap transverse action" and its working group within RESIF.
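
The real-time, city-scale averaging of individual testimonies described above can be sketched in a few lines; the function name, the median aggregation rule, and the sample reports below are illustrative assumptions, not the BCSF's actual procedure:

```python
from collections import defaultdict
from statistics import median

def city_sqi(testimonies):
    """Aggregate individual thumbnail-based intensity reports into one
    preliminary city-scale intensity (illustrative sketch only; the
    actual BCSF aggregation rules are not described in the abstract)."""
    by_city = defaultdict(list)
    for city, intensity in testimonies:
        by_city[city].append(intensity)
    # median is robust to isolated outlier testimonies
    return {city: median(vals) for city, vals in by_city.items()}

# hypothetical reports: (city, individual thumbnail intensity)
reports = [("Strasbourg", 5), ("Strasbourg", 4), ("Strasbourg", 5),
           ("Mulhouse", 3), ("Mulhouse", 4)]
print(city_sqi(reports))  # {'Strasbourg': 5, 'Mulhouse': 3.5}
```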

  10. Earthquake Early Warning: User Education and Designing Effective Messages

    NASA Astrophysics Data System (ADS)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. 
Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental information to increase the public's understanding of earthquake shaking and protective behaviors.
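
The pairing of predicted intensity with differentiated instructions described above can be sketched as a simple lookup; the thresholds, wording, and function name below are illustrative assumptions, not ShakeAlert's actual message logic:

```python
def alert_instruction(intensity):
    """Map a predicted shaking-intensity level to a short protective-action
    instruction, differentiated by expected shaking strength.
    Thresholds and wording are illustrative only."""
    if intensity >= 7:            # strong to violent shaking expected
        return "Drop, cover, hold on"
    elif intensity >= 4:          # light to moderate shaking expected
        return "Be aware"
    return "No action needed"     # weak or imperceptible shaking

print(alert_instruction(8))  # Drop, cover, hold on
print(alert_instruction(5))  # Be aware
```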

  11. Bioprinting for vascular and vascularized tissue biofabrication.

    PubMed

    Datta, Pallab; Ayan, Bugra; Ozbolat, Ibrahim T

    2017-03-15

    Bioprinting is a promising technology for fabricating design-specific tissue constructs due to its ability to create complex, heterocellular structures with anatomical precision. Bioprinting enables the deposition of various biologics including growth factors, cells, genes, neo-tissues and extracellular-matrix-like hydrogels. The benefits of bioprinting have started to make a mark in the fields of tissue engineering, regenerative medicine and pharmaceutics. In tissue engineering specifically, the creation of vascularized tissue constructs has remained a principal challenge to date. Given its many advantages over other biofabrication methods, it is natural to expect that bioprinting can provide a viable solution to the vascularization problem and facilitate the clinical translation of tissue-engineered constructs. This article provides a comprehensive account of the bioprinting of vascular and vascularized tissue constructs. The review first introduces the scope of bioprinting in tissue engineering applications and key vascular anatomical features, then thoroughly covers extrusion-, droplet- and laser-based bioprinting for the fabrication of vascular tissue constructs. It then discusses the use of bioprinting to obtain thick vascularized tissues using sacrificial bioink materials. Current challenges are discussed, a comparative evaluation of the different bioprinting modalities is presented, and future prospects are outlined. Biofabrication of living tissues and organs at clinically relevant volumes depends vitally on the integration of a vascular network. Despite great progress in traditional biofabrication approaches, building a perfusable hierarchical vascular network remains a major challenge.
Bioprinting is an emerging technology for fabricating design-specific tissue constructs with anatomical precision, and it holds great promise for the fabrication of vascular or vascularized tissues for transplantation. Although considerable progress has recently been made on building perfusable tissues and branched vascular networks, a comprehensive review of the state of the art in vascular and vascularized tissue bioprinting has not been reported so far. This contribution is thus significant because it discusses the use of the three major bioprinting modalities in vascular tissue biofabrication for the first time in the literature and compares their strengths and limitations in detail. Moreover, the use of scaffold-based and scaffold-free bioprinting is expounded within the domain of vascular tissue fabrication. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  12. Modality-specificity of Selective Attention Networks

    PubMed Central

    Stewart, Hannah J.; Amitay, Sygal

    2015-01-01

    Objective: To establish the modality specificity and generality of selective attention networks. Method: Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess the underlying attention constructs. Results: The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled “general attention.” The third component was labeled “auditory attention,” as it contained only measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled “spatial orienting” and “spatial conflict,” respectively; they comprised orienting and conflict resolution measures from the vANT, aANT, and the TAiL attend-location task—all tasks based upon spatial judgments (e.g., the direction of a target arrow or a sound's location). Conclusions: These results do not support our a priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selective attention to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality-specific. PMID:26635709
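
As an illustration of how such a component solution emerges from correlated measures, the toy decomposition below applies a principal-component step (a common precursor to rotated factor solutions) to synthetic data with two built-in latent factors; the sample size, number of measures, and noise level are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a test battery: 48 participants x 6 measures,
# constructed so that measures 0-2 and 3-5 load on two separate factors.
latent = rng.normal(size=(48, 2))
loadings_true = np.array([[1, 1, 1, 0, 0, 0],
                          [0, 0, 0, 1, 1, 1]], dtype=float)
scores = latent @ loadings_true + 0.3 * rng.normal(size=(48, 6))

# Principal components of the correlation matrix: eigenvalues well above
# the noise floor indicate how many components to retain.
z = (scores - scores.mean(0)) / scores.std(0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)        # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
print(eigvals[:2])  # two dominant components recover the two latent factors
```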

  13. Blind identification of the Millikan Library from earthquake data considering soil–structure interaction

    USGS Publications Warehouse

    Ghahari, S. F.; Abazarsa, F.; Avci, O.; Çelebi, Mehmet; Taciroglu, E.

    2016-01-01

    The Robert A. Millikan Library is a reinforced concrete building with a basement level and nine stories above ground. Located on the campus of the California Institute of Technology (Caltech) in Pasadena, California, it is among the most densely instrumented buildings in the U.S. From the early days of its construction, it has been the subject of many investigations, especially regarding soil–structure interaction effects. It is well accepted that the structure interacts significantly with the surrounding soil, which implies that the true foundation input motions cannot be directly recorded during earthquakes because of inertial effects. Because of this limitation, input–output modal identification methods are not applicable to this soil–structure system. On the other hand, conventional output-only methods typically assume the unknown input signals to be stationary white noise, which is not the case for earthquake excitations. Through the use of recently developed blind identification (i.e., output-only) methods, it has become possible to extract such information from the response signals to earthquake excitations alone. In the present study, we employ such a blind identification method to extract the modal properties of the Millikan Library. We present some modes that have not been identified in the forced vibration tests of several studies to date. Then, to quantify the contribution of soil–structure interaction effects, we first create a detailed Finite Element (FE) model using available information about the superstructure, and subsequently update the soil–foundation system's dynamic stiffnesses at each mode such that the modal properties of the entire soil–structure system agree well with those obtained via output-only modal identification.

  14. Integrated trimodal SSEP experimental setup for visual, auditory and tactile stimulation

    NASA Astrophysics Data System (ADS)

    Kuś, Rafał; Spustek, Tomasz; Zieleniewska, Magdalena; Duszyk, Anna; Rogowski, Piotr; Suffczyński, Piotr

    2017-12-01

    Objective. Steady-state evoked potentials (SSEPs), the brain responses to repetitive stimulation, are commonly used in both clinical practice and scientific research. Particular brain mechanisms underlying SSEPs in different modalities (i.e. visual, auditory and tactile) are very complex and still not completely understood. Each response has distinct resonant frequencies and exhibits a particular brain topography. Moreover, the topography can be frequency-dependent, as in case of auditory potentials. However, to study each modality separately and also to investigate multisensory interactions through multimodal experiments, a proper experimental setup appears to be of critical importance. The aim of this study was to design and evaluate a novel SSEP experimental setup providing a repetitive stimulation in three different modalities (visual, tactile and auditory) with a precise control of stimuli parameters. Results from a pilot study with a stimulation in a particular modality and in two modalities simultaneously prove the feasibility of the device to study SSEP phenomenon. Approach. We developed a setup of three separate stimulators that allows for a precise generation of repetitive stimuli. Besides sequential stimulation in a particular modality, parallel stimulation in up to three different modalities can be delivered. Stimulus in each modality is characterized by a stimulation frequency and a waveform (sine or square wave). We also present a novel methodology for the analysis of SSEPs. Main results. Apart from constructing the experimental setup, we conducted a pilot study with both sequential and simultaneous stimulation paradigms. EEG signals recorded during this study were analyzed with advanced methodology based on spatial filtering and adaptive approximation, followed by statistical evaluation. Significance. We developed a novel experimental setup for performing SSEP experiments. In this sense our study continues the ongoing research in this field. 
On the other hand, the described setup, along with the presented methodology, is a considerable improvement on and extension of the methods constituting the state of the art in this field. The flexibility of the device, together with the developed analysis methodology, can lead to further development of diagnostic methods and provide deeper insight into information processing in the human brain.
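
A minimal sketch of the stimulus generation such a setup requires, synthesizing sine- or square-wave drive signals at a given stimulation frequency; the sampling rate, amplitudes, frequencies, and function name are illustrative assumptions, not the device's actual specifications:

```python
import numpy as np

def stimulus(freq_hz, duration_s, fs=1000, waveform="sine"):
    """Repetitive stimulation signal at an SSEP driving frequency, as a
    unipolar sine (0..1) or an on/off square wave. The sampling rate fs
    and the function name are illustrative assumptions."""
    t = np.arange(int(duration_s * fs)) / fs
    phase = 2.0 * np.pi * freq_hz * t
    if waveform == "sine":
        return 0.5 * (1.0 + np.sin(phase))
    if waveform == "square":
        return (np.sin(phase) >= 0).astype(float)
    raise ValueError(f"unknown waveform: {waveform}")

flicker = stimulus(15, 2.0)                      # e.g. 15 Hz visual flicker
clicks = stimulus(40, 2.0, waveform="square")    # e.g. 40 Hz auditory train
```

In a parallel-stimulation paradigm, one such signal per modality would be generated at a distinct frequency so the steady-state responses can be separated in the EEG spectrum.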

  15. Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria

    NASA Astrophysics Data System (ADS)

    Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang

    2016-04-01

    Target-oriented prevention and effective crisis management can reduce or avoid damage and save lives in the case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. More information and the status of our project will be given in this presentation.

  16. On the feasibility of a transient dynamic design analysis

    NASA Astrophysics Data System (ADS)

    Cunniff, Patrick F.; Pohland, Robert D.

    1993-05-01

    The Dynamic Design Analysis Method has been used for the past 30 years as part of the Navy's efforts to shock-harden heavy shipboard equipment. This method, which has been validated several times, employs normal mode theory and design shock values. This report examines the degree of success that may be achieved by using simple equipment-vehicle models that produce time history responses equivalent to the responses that would be achieved using the spectral design values employed by the Dynamic Design Analysis Method. These transient models are constructed by attaching the equipment's modal oscillators to a vehicle composed of rigid masses and elastic springs. Two methods have been developed for constructing these transient models. Each method generates the parameters of the vehicle so as to approximate the required damaging effects, such that the transient model is excited by an idealized impulse applied to the vehicle mass to which the equipment's modal oscillators are attached. The first method, called the Direct Modeling Method, is limited to equipment with at most three degrees of freedom, and its vehicle consists of a single lumped mass and spring. The Optimization Modeling Method, which is based on the simplex method for optimization, has been used successfully with a variety of vehicle models and equipment sizes.
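
The building block of such transient models, a single modal oscillator excited by an idealized impulse, has a closed-form free response; the frequency, damping ratio, and function name below are illustrative, not Navy design-shock values:

```python
import numpy as np

def modal_impulse_response(fn_hz, zeta, v0=1.0, dt=1e-4, t_end=0.5):
    """Closed-form free response x(t) of one underdamped modal oscillator
    after an idealized velocity impulse v0 (with x(0) = 0).
    All parameter values are illustrative, not design values."""
    wn = 2.0 * np.pi * fn_hz              # undamped natural frequency (rad/s)
    wd = wn * np.sqrt(1.0 - zeta**2)      # damped natural frequency (rad/s)
    t = np.arange(0.0, t_end, dt)
    return t, (v0 / wd) * np.exp(-zeta * wn * t) * np.sin(wd * t)

t, x = modal_impulse_response(fn_hz=10.0, zeta=0.05)
peak = float(np.max(np.abs(x)))           # peak transient displacement
```

Summing several such responses, each driven through the vehicle mass, is the kind of superposition the transient models above rely on.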

  17. "Shake It Baby, Shake It": Media Preferences, Sexual Attitudes and Gender Stereotypes Among Adolescents.

    PubMed

    Ter Bogt, Tom F M; Engels, Rutger C M E; Bogers, Sanne; Kloosterman, Monique

    2010-12-01

    In this study, exposure to and preferences for three important youth media (TV, music styles/music TV, internet) were examined in relation to adolescents' permissive sexual attitudes and gender stereotypes (i.e., views of men as sex-driven and tough, and of women as sex objects). Multivariate structural analysis of data from a school-based sample of 480 Dutch students aged 13 to 16 revealed that preferences, rather than exposure, were associated with attitudes and stereotypes. For both girls and boys, preferences for hip-hop and hard-house music were positively associated with gender stereotypes, and a preference for classical music was negatively associated with gender stereotypes. Particularly for boys, using the internet to find explicit sexual content emerged as a powerful indicator of all attitudes and stereotypes.

  18. Bridge Structure, Foundation and Approach Embankment Performance for the October-November 2002 Earthquake Sequence on the Denali Fault, Alaska

    NASA Astrophysics Data System (ADS)

    Vinson, T. S.; Hulsey, L.; Ma, J.; Connor, B.; Brooks, T. E.

    2002-12-01

    More than two dozen major bridges were subjected to severe ground motions during the October-November 2002 Earthquake Sequence on the Denali Fault, Alaska. The bridges represented a number of conventional designs constructed over the past three to four decades. The objective of the field investigation presented herein was to determine the extent of the damage, if any, to the bridge structures, foundations and approach embankments. This was accomplished by direct inspection of the bridges by the authors (or employees of their organizations) along the Richardson, Alaska, Parks, and Denali Highways and the Tok Cutoff, and of the railroad bridges along the railroad alignment between Trapper Creek and Fairbanks. More specifically, the members of the investigation team (represented by the authors) conducted more than three days of field inspections of bridges within the zone of severe ground shaking during the M6.7 and M7.9 Denali fault events. The primary conclusion was that, while a substantial number of bridges were subjected to intense shaking, they all performed very well and were not damaged to the extent that remedial repairs to the bridge structures were necessary. There were occurrences of lateral spreading/liquefaction-related damage to the approach embankments and slight separation of the approach embankments from the abutment foundation systems. Overall, considering the severity of ground shaking, much greater damage to the bridge structures, foundations and approach embankments would have been predicted. Had the earthquakes occurred during winter, when the ground was frozen and the ductility of the structures substantially reduced, events comparable to the October-November 2002 Earthquake Sequence could have caused significant damage to bridges. This reconnaissance was supported by the National Science Foundation, the Alaska Dept. of Transportation and Public Facilities, and the Alaska Railroad Corporation.

  19. Comparison of Human Response against Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Arikawa, T.; Güler, H. G.; Yalciner, A. C.

    2017-12-01

    The evacuation response to earthquakes and tsunamis is very important for reducing human losses from tsunamis, but it is very difficult to predict human behavior after earthquake shaking. The purpose of this research is to clarify differences in human response after an earthquake shock across countries and to consider the relation between that response and feelings of safety, knowledge, and education. To this end, a questionnaire survey was conducted after the 21 July 2017 Gokova earthquake and tsunami, and differences in human behavior were considered by comparison with the 2015 Chilean earthquake and tsunami and the 2011 Japan earthquake and tsunami. The seismic intensity at the survey points was about 6 to 7. The questions covered the feeling of shaking, recall of the tsunami, behavior after the shock, and so on. The questionnaire was administered to more than 20 people in 10 areas. The results are as follows: 1) most people felt shaking so strong that they could not stand; 2) none of the respondents recalled the tsunami; 3) depending on the area, respondents felt that after the earthquake the beach was safer than being at home; 4) after they saw the sea withdrawing, they thought a tsunami would come and ran away. Fig. 1 compares the evacuation rates within 10 minutes in 2011 Japan, 2015 Chile and 2017 Turkey. From the education point of view, tsunami education is limited in Turkey. From the protection-facilities point of view, high sea walls are constructed only in Japan. From the warning point of view, there is no warning system against tsunamis in the Mediterranean Sea. This survey shows the importance of tsunami education and that evacuation tends to be delayed if dependency on facilities and alarms is too high.

  20. Cross-Modality Image Synthesis via Weakly Coupled and Geometry Co-Regularized Joint Dictionary Learning.

    PubMed

    Huang, Yawen; Shao, Ling; Frangi, Alejandro F

    2018-03-01

    Multi-modality medical imaging is increasingly used for the comprehensive assessment of complex diseases, in either diagnostic examinations or as part of medical research trials. Different imaging modalities provide complementary information about living tissues. However, multi-modal examinations are not always possible due to adverse factors such as patient discomfort, increased cost, prolonged scanning time, and scanner unavailability. Additionally, in large imaging studies, incomplete records are not uncommon owing to image artifacts, data corruption or data loss, which compromise the potential of multi-modal acquisitions. In this paper, we propose a weakly coupled and geometry co-regularized joint dictionary learning method to address the problem of cross-modality synthesis while considering that collecting large amounts of training data is often impractical. Our learning stage requires only a few registered multi-modality image pairs as training data. To exploit both paired images and a large set of unpaired data, a cross-modality image matching criterion is proposed. We then propose a unified model that integrates this criterion into joint dictionary learning and the observed common feature space for associating cross-modality data for the purpose of synthesis. Furthermore, two regularization terms are added to construct robust sparse representations. Our experimental results demonstrate the superior performance of the proposed model over state-of-the-art methods.

  1. Liquid films on shake flask walls explain increasing maximum oxygen transfer capacities with elevating viscosity.

    PubMed

    Giese, Heiner; Azizan, Amizon; Kümmel, Anne; Liao, Anping; Peter, Cyril P; Fonseca, João A; Hermann, Robert; Duarte, Tiago M; Büchs, Jochen

    2014-02-01

    In biotechnological screening and production, oxygen supply is a crucial parameter. Even though oxygen transfer is well documented for viscous cultivations in stirred tanks, little is known about gas/liquid oxygen transfer in shake flask cultures that become increasingly viscous during cultivation. In particular, oxygen transfer into the liquid film adhering to the shake flask wall has not yet been described for such cultivations. In this study, the oxygen transfer of chemical and microbial model experiments was measured and the suitability of the widely applied film theory of Higbie was studied. Numerical simulations of Fick's law of diffusion demonstrated that Higbie's film theory does not apply for cultivations at viscosities up to 10 mPa s. For the first time, it was experimentally shown that the maximum oxygen transfer capacity (OTRmax) increases in shake flasks when viscosity is increased from 1 to 10 mPa s, leading to an improved oxygen supply for microorganisms. Additionally, the OTRmax does not fall significantly below the value at water-like viscosity, even at elevated viscosities of up to 80 mPa s. In this range, a shake flask is a somewhat self-regulating system with respect to oxygen supply. This is in contrast to stirred tanks, where the oxygen supply is steadily reduced, to only 5% at 80 mPa s. Since the liquid film forming on shake flask walls inherently promotes oxygen supply at moderate and elevated viscosities, these results have significant implications for scale-up. © 2013 Wiley Periodicals, Inc.
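
A minimal numerical version of the Fick's-law simulations mentioned above, stepping diffusion into a stagnant liquid film with explicit finite differences; the film thickness, diffusivity, contact time, and boundary conditions are illustrative assumptions, not the study's parameters:

```python
import numpy as np

# 1-D diffusion into a stagnant liquid film (Fick's second law, explicit
# finite differences). All values below are illustrative assumptions.
D = 2e-9                      # m^2/s, O2 diffusivity in water (order of magnitude)
L = 50e-6                     # m, film thickness (~50 um)
nx = 51
dx = L / (nx - 1)
dt = 0.2 * dx**2 / D          # satisfies the stability limit dt <= dx^2/(2D)

c = np.zeros(nx)              # dimensionless O2 concentration
c[0] = 1.0                    # gas/liquid interface held at saturation

for _ in range(2000):         # ~0.2 s of simulated contact time
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[-1] = c[-2]             # no-flux condition at the flask wall
```

The resulting profile decays monotonically from the saturated interface into the film; repeating such runs for varying diffusivities mimics the comparison against Higbie's penetration picture.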

  2. Development of a circulation direct sampling and monitoring system for O2 and CO2 concentrations in the gas-liquid phases of shake-flask systems during microbial cell culture.

    PubMed

    Takahashi, Masato; Sawada, Yoshisuke; Aoyagi, Hideki

    2017-08-23

    Monitoring environmental factors during shake-flask culture of microorganisms can help to optimise the initial steps of bioprocess development. Herein, we developed a circulation direct monitoring and sampling system (CDMSS) that can monitor the behaviour of CO2 and O2 in the gas and liquid phases and obtain samples without interrupting the shaking of cultures in Erlenmeyer flasks capped with breathable culture plugs. Shake-flask culturing of Escherichia coli using this set-up indicated that a high concentration of CO2 accumulated not only in the headspace (maximum ~100 mg/L) but also in the culture broth (maximum ~85 mg/L) during the logarithmic phase (4.5-9.0 h). By packing a CO2 absorbent into the gas circulation unit of the CDMSS, a specialised shake-flask culture was developed to remove CO2 from the headspace. It was posited that removing CO2 from the headspace would suppress increases in the dissolved CO2 concentration in the culture broth (maximum ~15 mg/L). Furthermore, the logarithmic growth phase (4.5-12.0 h) was extended, the U.O.D. 580 and pH values increased, and the acetic acid concentration was reduced, compared with the control. To our knowledge, this is the first report of a method aimed at improving the growth of E. coli cells without changing the composition of the medium, temperature, or shaking conditions.
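
As a rough plausibility check on the reported headspace and dissolved concentrations, Henry's law links the two phases at equilibrium; the Henry constant, temperature, and ideal-gas treatment below are textbook approximations chosen for illustration, not values from the study:

```python
# Rough equilibrium check linking headspace and dissolved CO2 via Henry's law.
# kH and T are textbook approximations (not from the study); the headspace
# value is the ~100 mg/L reported in the abstract.
R = 0.08206                 # L*atm/(mol*K), gas constant
T = 310.0                   # K, ~37 degC cultivation temperature (assumed)
M_CO2 = 44.01               # g/mol, molar mass of CO2
kH = 0.026                  # mol/(L*atm), approx. CO2 solubility near 37 degC

headspace_mg_per_l = 100.0
p_co2 = (headspace_mg_per_l / 1000.0 / M_CO2) * R * T   # partial pressure, atm
dissolved_mg_per_l = kH * p_co2 * M_CO2 * 1000.0
# lands in the same order of magnitude as the ~85 mg/L dissolved CO2 reported
```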

  3. Cascadia Onshore-Offshore Site Response, Submarine Sediment Mobilization, and Earthquake Recurrence

    NASA Astrophysics Data System (ADS)

    Gomberg, J.

    2018-02-01

    Local geologic structure and topography may modify arriving seismic waves. This inherent variation in shaking, or "site response," may affect the distribution of slope failures and redistribution of submarine sediments. I used seafloor seismic data from the 2011 to 2015 Cascadia Initiative and permanent onshore seismic networks to derive estimates of site response, denoted Sn, in low- and high-frequency (0.02-1 and 1-10 Hz) passbands. For three shaking metrics (peak velocity and acceleration and energy density) Sn varies similarly throughout Cascadia and changes primarily in the direction of convergence, roughly east-west. In the two passbands, Sn patterns offshore are nearly opposite and range over an order of magnitude or more across Cascadia. Sn patterns broadly may be attributed to sediment resonance and attenuation. This and an abrupt step in the east-west trend of Sn suggest that changes in topography and structure at the edge of the continental margin significantly impact shaking. These patterns also correlate with gravity lows diagnostic of marginal basins and methane plumes channeled within shelf-bounding faults. Offshore Sn exceeds that onshore in both passbands, and the steepest slopes and shelf coincide with the relatively greatest and smallest Sn estimates at low and high frequencies, respectively; these results should be considered in submarine shaking-triggered slope stability failure studies. Significant north-south Sn variations are not apparent, but sparse sampling does not permit rejection of the hypothesis that the southerly decrease in intervals between shaking-triggered turbidites and great earthquakes inferred by Goldfinger et al. (2012, 2013, 2016) and Priest et al. (2017) is due to inherently stronger shaking southward.

  4. Post-Earthquake Assessment of Nevada Bridges Using ShakeMap/ShakeCast

    DOT National Transportation Integrated Search

    2016-01-01

    Post-earthquake capacity of Nevada highway bridges is examined through a combination of engineering study and scenario earthquake evaluation. The study was undertaken by the University of Nevada Reno Department of Civil and Environmental Engineering ...

  5. Seismological Field Observation of Mesoscopic Nonlinearity

    NASA Astrophysics Data System (ADS)

    Sens-Schönfelder, Christoph; Gassenmeier, Martina; Eulenfeld, Tom; Tilmann, Frederik; Korn, Michael; Niederleithinger, Ernst

    2016-04-01

    Noise based observations of seismic velocity changes have been made in various environments. We know of seasonal changes of velocities related to ground water or temperature changes, co-seismic changes originating from shaking or stress redistribution and changes related to volcanic activity. Is is often argued that a decrease of velocity is related to the opening of cracks while the closure of cracks leads to a velocity increase if permanent stress changes are invoked. In contrast shaking induced changes are often related to "damage" and subsequent "healing" of the material. The co-seismic decrease and transient recovery of seismic velocities can thus be explained with both - static stress changes or damage/healing processes. This results in ambiguous interpretations of the observations. Here we present the analysis of one particular seismic station in northern Chile that shows very strong and clear velocity changes associated with several earthquakes ranging from Mw=5.3 to Mw=8.1. The fact that we can observe the response to several events of various magnitudes from different directions offers the unique possibility to discern the two possible causative processes. We test the hypothesis, that the velocity changes are related to shaking rather than stress changes by developing an empirical model that is based on the local ground acceleration at the sensor site. The eight year of almost continuous observations of velocity changes are well modeled by a daily drop of the velocity followed by an exponential recovery. Both, the amplitude of the drop as well as the recovery time are proportional to the integrated acceleration at the seismic station. Effects of consecutive days are independent and superimposed resulting in strong changes after earthquakes and constantly increasing velocities during quiet days thereafter. 
    This model describes the continuous observations of the velocity changes solely on the basis of the acceleration time series, without individually defined event dates associated with separately inverted parameters. As the local ground acceleration is not correlated with static stress changes, we can exclude static stress changes as the causative process. The shaking sensitivity and healing process are well known from laboratory experiments on composite materials as mesoscopic nonlinearity. The sensitive behavior at this station is related to the particular near-surface material, a conglomerate cemented with gypsum (so-called gypcrete). However, mesoscopic nonlinearity with different parameters may be a key to understanding velocity changes at other sites as well.
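
    The empirical model described above (a daily velocity drop with exponential recovery, both scaled by the integrated daily acceleration, with independent superimposed contributions) can be sketched as follows. The coupling constants a and b and the synthetic acceleration series are illustrative assumptions, not the values inverted in the study.

```python
import numpy as np

def velocity_change(accel_daily, a=1e-3, b=5.0):
    """Relative velocity change dv/v from daily integrated ground acceleration.

    Each day d with integrated acceleration I_d contributes an instantaneous
    drop a*I_d that recovers exponentially with time constant b*I_d.
    Contributions from consecutive days are independent and superimposed.
    a and b are illustrative coupling constants (assumptions, not the paper's).
    """
    n = len(accel_daily)
    dv = np.zeros(n)
    for d, I in enumerate(accel_daily):
        if I <= 0:
            continue
        t = np.arange(n - d)              # days elapsed since day d
        dv[d:] -= a * I * np.exp(-t / (b * I))
    return dv

# Quiet background shaking with one strong event on day 20 (synthetic data).
I = np.full(100, 0.1)
I[20] = 10.0
dv = velocity_change(I)
print(dv[20], dv[-1])                     # sharp co-seismic drop, then partial recovery
```

    Because every daily contribution decays independently, quiet periods show a slow velocity increase as earlier drops heal, matching the behavior the abstract describes.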

  6. Ultrasonic energy enhances ionic liquid-based dual microextraction to preconcentrate lead in ground and stored rain water samples as compared to the conventional shaking method.

    PubMed

    Nizamani, Sooraj; Kazi, Tasneem G; Afridi, Hassan I

    2018-01-01

    An efficient preconcentration technique based on ultrasonic-assisted ionic liquid-based dual microextraction (UA-ILDµE) has been developed to preconcentrate lead (Pb²⁺) in ground and stored rain water. In the proposed method, Pb²⁺ was complexed with a chelating agent (dithizone), whereas an ionic liquid (1-butyl-3-methylimidazolium hexafluorophosphate) was used for extraction. Ultrasonic irradiation and an electrical shaking system were applied to enhance the dispersion and extraction of the Pb²⁺ complex in aqueous samples. In the second phase (dual microextraction, DµE), the enriched Pb²⁺ complex in the ionic liquid was back-extracted into an acidic aqueous solution and finally determined by flame atomic absorption spectrometry. The major analytical parameters influencing the extraction efficiency of the developed method, such as pH, ligand concentration, volumes of ionic liquid and sample, shaking time in the thermostatic electrical shaker and ultrasonic bath, volume of back-extracting HNO₃, matrix effect, and centrifugation time and rate, were optimized. At a sample volume of 25 mL, the calculated preconcentration factor was 62.2. The limit of detection of the proposed procedure for Pb²⁺ ions was found to be 0.54 μg L⁻¹. The developed method was validated by analysis of the certified water sample SRM 1643e and by the standard addition method in a real water sample. The extraction recovery of Pb²⁺ was enhanced by ≥2% with a shaking time of 80 s in the ultrasonic bath as compared to the thermostatic electrical shaker, for which up to 10 min was required for optimum recovery. The developed procedure was successfully used for the enrichment of Pb²⁺ in ground and stored rain water (surface water) samples from an endemic region of Pakistan. 
    The resulting data indicated that the ground water samples were highly contaminated with Pb²⁺, while some of the surface water samples also had Pb²⁺ values above the permissible limit of the WHO. The concentrations of Pb²⁺ in surface and ground water samples were found in the ranges of 17.5-24.5 and 25.6-99.1 μg L⁻¹, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Kohler, M. D.; Heaton, T. H.; Massari, A.; Guy, R.; Bunn, J.; Chandy, M.

    2015-12-01

    The CSN now has approximately 600 stations in the northern Los Angeles region. The sensors are class-C MEMS accelerometers that are packaged with backup power and data memory and are connected to a cloud-based processing system through the Internet. Most of the sensors are located in a two-dimensional spatial network with an average minimum station spacing of 800 m. This density allows the lateral variations in ground motion to be determined, which will lead to detailed microzonation maps of the region. Approximately 100 of the sensors are located on campuses of the Los Angeles Unified School District (LAUSD), as part of a plan to provide schools with critical earthquake information immediately following an earthquake using the ShakeCast system. The software system in the sensors is being upgraded to allow on-site measurements of PGA and PVA to be sent directly to the ShakeMap and earthquake early warning systems. More than 160 of the sensor packages are located on multiple floors of buildings, with typically one or two 3-component sensors per floor. With these data we can identify traveling waves in the building, as well as determine the eigenfrequencies and mode shapes. By monitoring these quantities with high spatial density before, during, and after a major shaking event, we hope to determine the state of health of the structure.

  8. Topography and geology site effects from the intensity prediction model (ShakeMap) for Austria

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Jia, Yan; Weginger, Stefan

    2017-04-01

    The seismicity in Austria can be categorized as moderate. Although the hazard seems rather low, earthquakes can cause great damage and losses, especially in densely populated and industrialized areas. It is well known that equations which predict intensity as a function of magnitude and distance, among other parameters, are a useful tool for hazard and risk assessment. Therefore, this study aims to determine an empirical model of the ground shaking intensities (ShakeMap) for a series of earthquakes that occurred in Austria between 1000 and 2014. Furthermore, the obtained empirical model will support further interpretation of both contemporary and historical earthquakes. A total of 285 events whose epicenters were located in Austria, together with 22,739 reported macroseismic data points from Austria and adjoining countries, were used. These events span the period 1000-2014 and have local magnitudes greater than 3. In the first stage of the model development, the data were carefully selected; e.g., only intensities equal to or greater than III were used. In a second stage, the data were fitted to the selected empirical model. Finally, geology and topography corrections were obtained by means of the model residuals in order to derive intensity-based site amplification effects.
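
    The two-stage procedure described above, fitting an intensity prediction equation and then reading site effects off the residuals, can be sketched with a common functional form. The form I = c0 + c1*M + c2*log10(R) and all numbers below are illustrative assumptions, not the coefficients derived in the study.

```python
import numpy as np

# Synthetic macroseismic dataset: magnitudes, distances, and noisy intensities
# generated from an assumed "true" intensity prediction equation.
rng = np.random.default_rng(0)
n = 500
M = rng.uniform(3.0, 6.5, n)                 # local magnitude
R = rng.uniform(5.0, 200.0, n)               # hypocentral distance, km
I_obs = 1.5 + 1.2 * M - 2.0 * np.log10(R) + rng.normal(0, 0.3, n)

# Stage 2 of the abstract: least-squares fit of the empirical model.
X = np.column_stack([np.ones(n), M, np.log10(R)])
coef, *_ = np.linalg.lstsq(X, I_obs, rcond=None)

# Stage 3: residuals carry what the model cannot explain; grouping them by
# geology or topography class would yield the site amplification terms.
resid = I_obs - X @ coef
print(np.round(coef, 2))
```

    In the study the residuals are aggregated per geology and topography class to derive intensity-based site corrections; here they are simply computed.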

  9. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  10. Evaluation of cysteine ethyl ester as efficient inducer for glutathione overproduction in Saccharomyces spp.

    PubMed

    Lorenz, Eric; Schmacht, Maximilian; Senz, Martin

    2016-11-01

    Economical yeast-based glutathione (GSH) production is influenced by several factors, such as raw material and production costs, biomass production, and efficient biotransformation of adequate precursors into the final product GSH. Nowadays the use of cysteine for the microbial conversion into GSH is industrial state of practice. In the following study, the potential of different inducers to increase the GSH content was evaluated by means of design-of-experiments methodology. Investigations were carried out in three natural Saccharomyces strains, S. cerevisiae, S. bayanus and S. boulardii, in a well-suited 50 mL shake-tube system. Results of the shake-tube experiments were confirmed in traditional baffled shake flasks and finally via batch cultivation in lab-scale bioreactors under controlled conditions. Comprehensive studies showed that the use of cysteine ethyl ester (CEE) for the batch-wise biotransformation into GSH led to a more than 2.2-fold higher yield compared to cysteine as inducer. Additionally, the intracellular GSH content in bioreactors increased significantly for all strains, from 2.29±0.29% with cysteine to 3.65±0.23% with CEE. Thus, the use of CEE provides a highly attractive induction strategy for GSH overproduction. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Colic

    MedlinePlus

    ... prompted parents to shake or otherwise harm their child. Shaking a baby can cause serious damage to the brain and death. The risk of these uncontrolled reactions is greater if parents don't have information about soothing a crying child, education about colic and the support needed for ...

  12. A versatile computer package for mechanism analysis, part 2: Dynamics and balance

    NASA Astrophysics Data System (ADS)

    Davies, T.

    The algorithms required for the shaking force components, the shaking moment about the crankshaft axis, and the input torque and bearing load components are discussed using the textile machine as a focus for the discussion. The example is also used to provide illustrations of the output for options on the hodograph of the shaking force vector. This provides estimates of the optimum contrarotating masses and their locations for a generalized primary Lanchester balancer. The suitability of generalized Lanchester balancers particularly for textile machinery, and the overall strategy used during the development of the package are outlined.

  13. Comparison of static and shake culture in the decolorization of textile dyes and dye effluents by Phanerochaete chrysosporium.

    PubMed

    Sani, R K; Azmi, W; Banerjee, U C

    1998-01-01

    Decolorization of several dyes (Red HE-8B, Malachite Green, Navy Blue HE-2R, Magenta, Crystal Violet) and an industrial effluent with growing cells of Phanerochaete chrysosporium in shake and static culture was demonstrated. All the dyes and the industrial effluent were decolorized to some extent, with varying percentages of decolorization (20-100%). The rate of decolorization was very rapid with Red HE-8B, an industrial dye. Decolorization rates for all the dyes under static conditions were found to be lower than in shake culture and were also dependent on biomass concentration.

  14. Friability Testing as a New Stress-Stability Assay for Biopharmaceuticals.

    PubMed

    Torisu, Tetsuo; Maruno, Takahiro; Yoneda, Saki; Hamaji, Yoshinori; Honda, Shinya; Ohkubo, Tadayasu; Uchiyama, Susumu

    2017-10-01

    A cycle of dropping and shaking a vial containing antibody solution has been reported to induce aggregation. In this study, antibody solutions in glass prefillable syringes, with or without silicone oil lubrication, were subjected to the combined stresses of dropping and shaking using a friability testing apparatus. Larger numbers of subvisible particles were generated, regardless of silicone oil lubrication, upon combination stress exposure than with shaking stress alone. Nucleation of antibody molecules upon perturbation by the impact of dropping, and adsorption of antibody molecules to the syringe surface followed by film formation and antibody film desorption, were considered key steps in the particle formation promoted by combination stress. A larger number of silicone oil droplets was released when silicone oil-lubricated glass syringes containing phosphate-buffered saline were exposed to combination stress than with shaking stress alone. Polysorbate 20, a non-ionic surfactant, effectively reduced the number of protein particles but failed to prevent silicone oil release upon combination stress exposure. This study indicates that stress-stability assays using the friability testing apparatus are effective for assessing the stability of biopharmaceuticals under the combined stresses of dropping and shaking, which have not been tested in conventional stress-stability assays. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  15. Modeling Signal-Noise Processes Supports Student Construction of a Hierarchical Image of Sample

    ERIC Educational Resources Information Center

    Lehrer, Richard

    2017-01-01

    Grade 6 (modal age 11) students invented and revised models of the variability generated as each measured the perimeter of a table in their classroom. To construct models, students represented variability as a linear composite of true measure (signal) and multiple sources of random error. Students revised models by developing sampling…

  16. Sensor Level Functional Connectivity Topography Comparison Between Different References Based EEG and MEG

    PubMed Central

    Huang, Yunzhi; Zhang, Junpeng; Cui, Yuan; Yang, Gang; Liu, Qi; Yin, Guangfu

    2018-01-01

    Sensor-level functional connectivity topography (sFCT) contributes significantly to our understanding of brain networks. sFCT can be constructed using either electroencephalography (EEG) or magnetoencephalography (MEG). Here, we compared sFCT within the EEG modality and between the EEG and MEG modalities. We first used simulations to examine how different EEG references, including the Reference Electrode Standardization Technique (REST), average reference (AR), linked mastoids (LM), and the left mastoid reference (LR), affect EEG-based sFCT. The results showed that REST decreased the reference effects on scalp EEG recordings, making REST-based sFCT closer to the ground truth (sFCT based on ideal recordings). For the inter-modality simulation comparisons, we compared each type of EEG-sFCT with MEG-sFCT using three metrics to quantify the differences: Relative Error (RE), Overlap Rate (OR), and Hamming Distance (HD). When two sFCTs are similar, RE and HD are low, while OR is high. Results showed that among all reference schemes, EEG- and MEG-sFCT were most similar when the EEG was REST-based and the EEG and MEG were recorded simultaneously. Next, we analyzed simultaneously recorded MEG and EEG data from publicly available face-recognition experiments using a procedure similar to that of the simulations. The results showed (1) if MEG-sFCT is taken as the standard, REST- and LM-based sFCT provided results closer to this standard in terms of HD; (2) REST-based sFCT and MEG-sFCT had the highest similarity in terms of RE; (3) REST-based sFCT had the most overlapping edges with MEG-sFCT in terms of OR. This study thus provides new insights into the effect of different reference schemes on sFCT and the similarity between MEG and EEG in terms of sFCT. PMID:29867395
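
    The three comparison metrics can be illustrated on a toy pair of connectivity matrices. The definitions below (matrix-norm relative error, Jaccard-style overlap of binarized edges, and edge-wise Hamming distance) are plausible reconstructions for weighted and binarized networks, not the paper's exact formulas.

```python
import numpy as np

def sfct_metrics(A, B, thresh=0.5):
    """Compare two connectivity matrices A and B.

    RE: relative error of connection weights (low = similar).
    OR: fraction of binarized edges shared between the networks (high = similar).
    HD: number of edges present in one binarized network but not the other
        (low = similar). Definitions are assumptions for illustration.
    """
    re = np.linalg.norm(A - B) / np.linalg.norm(B)
    a, b = A > thresh, B > thresh            # binarize at an edge threshold
    union = np.logical_or(a, b).sum()
    or_ = np.logical_and(a, b).sum() / union if union else 1.0
    hd = np.logical_xor(a, b).sum()
    return re, or_, hd

# Two nearly identical 2-node networks: same edges, slightly different weights.
A = np.array([[0.0, 0.9], [0.9, 0.0]])
B = np.array([[0.0, 0.8], [0.8, 0.0]])
re, or_, hd = sfct_metrics(A, B)
```

    As the abstract states, similar topographies yield low RE and HD and high OR; here the binarized edge sets coincide, so only RE registers the weight difference.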

  17. Modal characterization of the ASCIE segmented optics testbed: New algorithms and experimental results

    NASA Technical Reports Server (NTRS)

    Carrier, Alain C.; Aubrun, Jean-Noel

    1993-01-01

    New frequency response measurement procedures, on-line modal tuning techniques, and off-line modal identification algorithms are developed and applied to the modal identification of the Advanced Structures/Controls Integrated Experiment (ASCIE), a generic segmented optics telescope test-bed representative of future complex space structures. The frequency response measurement procedure uses all the actuators simultaneously to excite the structure and all the sensors to measure the structural response, so that all the transfer functions are measured simultaneously. Structural responses to sinusoidal excitations are measured and analyzed to calculate spectral responses. The spectral responses in turn are analyzed as the spectral data become available and, which is new, the results are used to maintain high quality measurements. Data acquisition, processing, and checking procedures are fully automated. As the acquisition of the frequency response progresses, an on-line algorithm keeps track of the actuator force distribution that maximizes the structural response to automatically tune to a structural mode when approaching a resonant frequency. This tuning is insensitive to delays, ill-conditioning, and nonproportional damping. Experimental results show that it is useful for modal surveys even in high-modal-density regions. For thorough modeling, a constructive procedure is proposed to identify the dynamics of a complex system from its frequency response, with the minimization of a least-squares cost function as a desirable objective. This procedure relies on off-line modal separation algorithms to extract modal information and on least-squares parameter subset optimization to combine the modal results and globally fit the modal parameters to the measured data. The modal separation algorithms resolved a modal density of 5 modes/Hz in the ASCIE experiment. They promise to be useful in many challenging applications.

  18. Fiche Pratique: Pour en finir avec les histoires de modes; Faire parler les bruits; FDM Frequence Plus: La revue de presse; Construire l'absurde (Practical Ideas: Finishing the Modal Story; Making Noises Speak; FDM Frequence Plus: The Press Review; Constructing the Absurd).

    ERIC Educational Resources Information Center

    Courally, Sylvie; And Others

    1993-01-01

    Four ideas for French language instruction are described, including an exercise on modals, an activity focusing on the use of noises for expression, a listening comprehension exercise, and a lesson on humorous possibilities in language using material from the theater of the absurd. (MSE)

  19. Primary treatments for clinically localized prostate cancer: a comprehensive lifetime cost-utility analysis

    PubMed Central

    Cooperberg, Matthew R.; Ramakrishna, Naren R.; Duff, Steven B.; Hughes, Kathleen E.; Sadownik, Sara; Smith, Joseph A.; Tewari, Ashutosh K.

    2012-01-01

    Objectives To characterize the costs and outcomes associated with radical prostatectomy (open, laparoscopic, or robot-assisted) and radiation therapy (dose-escalated 3-dimensional conformal radiation, intensity-modulated radiation, brachytherapy, or combination), using a comprehensive, lifetime decision analytic model. Patients and Methods A Markov model was constructed to follow hypothetical men with low-, intermediate-, and high-risk prostate cancer over their lifetimes following primary treatment; probabilities of outcomes were based on an exhaustive literature search yielding 232 unique publications. Patients could experience remission, recurrence, salvage treatment, metastasis, death from prostate cancer, and death from other causes. Utilities for each health state were determined, and disutilities were applied for complications and toxicities of treatment. Costs were determined from the U.S. payer perspective, with incorporation of patient costs in a sensitivity analysis. Results Differences in quality-adjusted life years across modalities were modest, ranging from 10.3 to 11.3 for low-risk patients, 9.6 to 10.5 for intermediate-risk patients, and 7.8 to 9.3 for high-risk patients. There were no statistically significant differences among surgical modalities, which tended to be more effective than radiation modalities, with the exception of combination external beam + brachytherapy for high-risk disease. Radiation modalities were consistently more expensive than surgical modalities; costs ranged from $19,901 (robot-assisted prostatectomy for low-risk disease) to $50,276 (combination radiation for high-risk disease). These findings were robust to an extensive set of sensitivity analyses. Conclusions Our analysis found small differences in outcomes and substantial differences in payer and patient costs across treatment alternatives. 
These findings may inform future policy discussions regarding strategies to improve efficiency of treatment selection for localized prostate cancer. PMID:23279038
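
    The lifetime Markov structure described above can be sketched as a simple cohort simulation. The states follow the abstract, but the annual transition probabilities, utilities, discount rate, and horizon below are illustrative placeholders, not the paper's parameterization from its 232 sources.

```python
# Hypothetical annual transition probabilities between health states.
states = ["remission", "recurrence", "metastasis", "dead"]
P = {
    "remission":  {"remission": 0.95, "recurrence": 0.04, "metastasis": 0.00, "dead": 0.01},
    "recurrence": {"remission": 0.00, "recurrence": 0.90, "metastasis": 0.07, "dead": 0.03},
    "metastasis": {"remission": 0.00, "recurrence": 0.00, "metastasis": 0.80, "dead": 0.20},
    "dead":       {"remission": 0.00, "recurrence": 0.00, "metastasis": 0.00, "dead": 1.00},
}
# Hypothetical utilities per health state (1.0 = perfect health).
utility = {"remission": 0.90, "recurrence": 0.75, "metastasis": 0.50, "dead": 0.0}

def qalys(cycles=40, discount=0.03):
    """Discounted quality-adjusted life years for a cohort starting in remission."""
    cohort = {s: 0.0 for s in states}
    cohort["remission"] = 1.0
    total = 0.0
    for t in range(cycles):
        total += sum(cohort[s] * utility[s] for s in states) / (1 + discount) ** t
        # advance the cohort one annual cycle through the transition matrix
        cohort = {s: sum(cohort[f] * P[f][s] for f in states) for s in states}
    return total

print(round(qalys(), 2))
```

    A full cost-utility model would attach costs and disutilities for complications to each state and compare parameter sets per treatment modality; this sketch only shows how QALYs accumulate over the Markov trace.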

  20. President Nixon at Hickam AFB congratulates Astronaut James Lovell

    NASA Technical Reports Server (NTRS)

    1970-01-01

    President Richard M. Nixon and Astronaut James A. Lovell Jr., Apollo 13 commander, shake hands at special ceremonies at Hickam Air Force Base, Hawaii. President Nixon was in Hawaii to present the Apollo 13 crew with the Presidential Medal of Freedom, the nation's highest civilian honor.

  1. Mango Shake

    MedlinePlus

    ... this page: https://medlineplus.gov/recipe/mangoshake.html Mango Shake To use the sharing features on this page, please enable JavaScript. Prep time: 5 minutes Cook time: 0 minutes ... cup low-fat (1 percent) milk 4 Tbsp frozen mango juice (or 1 fresh pitted mango) 1 small ...

  2. Grounded Learning Experience: Helping Students Learn Physics through Visuo-Haptic Priming and Instruction

    NASA Astrophysics Data System (ADS)

    Huang, Shih-Chieh Douglas

    In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation (visual modality and gestures) and visuo-haptic simulation (visual modality, gestures, and somatosensory information). A pilot study involving N = 23 college students examined how using different types of visuo-haptic representation in instruction affected people's mental model construction for physics systems. Participants' abilities to construct mental models were operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Findings from this pilot study revealed that, while both simulations significantly improved participants' mental model construction for physics systems, visuo-haptic simulation was significantly better than visuo-gestural simulation. In addition, clinical interviews suggested that participants' mental model construction for physics systems benefited from receiving visuo-haptic simulation in a tutorial prior to the instruction stage. A dissertation study involving N = 96 college students examined how types of visuo-haptic representation in different applications support participants' mental model construction for physics systems. Participants' abilities to construct mental models were again operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Participants' physics misconceptions were also measured before and after the grounded learning experience. 
    Findings from this dissertation study revealed not only that visuo-haptic simulation was significantly more effective in promoting mental model construction and remedying participants' physics misconceptions than visuo-gestural simulation, but also that visuo-haptic simulation was more effective during the priming stage than during the instruction stage. Interestingly, the effects of visuo-haptic simulation in priming and visuo-haptic simulation in instruction on participants' pretest-to-posttest gain scores for a basic physics system appeared additive. These results suggest that visuo-haptic simulation is effective in physics learning, especially when it is used during the priming stage.

  3. A procedure for damage detection and localization of framed buildings based on curvature variation

    NASA Astrophysics Data System (ADS)

    Ditommaso, Rocco; Carlo Ponzo, Felice; Auletta, Gianluca; Iacovino, Chiara; Mossucca, Antonello; Nigro, Domenico; Nigro, Antonella

    2014-05-01

    Structural Health Monitoring and Damage Detection are topics of current interest in civil, mechanical and aerospace engineering. Damage detection approaches based on dynamic monitoring of structural properties over time have received considerable attention in the recent scientific literature. The basic idea arises from the observation that spectral properties, described in terms of the so-called modal parameters (eigenfrequencies, mode shapes, and modal damping), are functions of the physical properties of the structure (mass, energy dissipation mechanisms and stiffness). Structural damage exhibits its main effects in terms of stiffness and damping variation. As a consequence, a permanent dynamic monitoring system makes it possible to detect and, if sensors are suitably distributed on the structure, to localize structural and non-structural damage that occurred during a strong earthquake. In recent years many researchers have been working to set up new methodologies for Non-destructive Damage Evaluation (NDE) based on the variation of the dynamic behaviour of structures under seismic loads. Pandey et al. (1991) highlighted the possibility of using structural mode shapes to extract useful information for structural damage localization. In this paper a new procedure for damage detection on framed structures based on changes in modal curvature is proposed. The proposed approach is based on the Stockwell Transform, a special kind of integral transformation that has become a powerful tool for nonlinear signal analysis and hence for analysing the nonlinear behaviour of a general structure. Using this kind of approach, it is possible to apply a band-variable filter (Ditommaso et al., 2012) to extract, from a signal recorded on a structure excited by an earthquake, the response related to a single mode of vibration whose frequency changes over time (if the structure is being damaged). 
    In general, by acting simultaneously in both the frequency and time domains, the band-variable filter can be used to extract the dynamic characteristics of a system that evolves over time. The aim of this paper is to show, through practical examples, how it is possible to identify and localize damage on a structure by comparing mode shapes and the related curvature variations over time. It can be demonstrated that modal curvature variation is strongly related to the damage that occurred on a structure. This paper summarizes the main outcomes retrieved from many numerical nonlinear dynamic models of reinforced concrete framed structures characterized by different geometric configurations and designed for gravity loads only. The numerical campaign was conducted using both natural and artificial accelerograms compatible with the Italian code. The main results of experimental shaking table tests carried out on a steel framed model are also shown, confirming the effectiveness of the proposed procedure. REFERENCES: Ditommaso R., Mucciarelli M., Ponzo F. C. (2012). Analysis of non-stationary structural systems by using a band-variable filter. Bulletin of Earthquake Engineering, Vol. 10, No. 3, pp. 895-911. DOI: 10.1007/s10518-012-9338-y. Pandey AK, Biswas M, Samman MM (1991). Damage detection from changes in curvature mode shapes. Journal of Sound and Vibration, Vol. 145, Issue 2, pp. 321-332.
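
    The curvature-based localization idea can be sketched numerically: approximate the mode-shape curvature with central differences, as in Pandey et al. (1991), and look for the largest curvature change between the undamaged and damaged shapes. The mode shape and the local perturbation below are synthetic illustrations, not data from the paper.

```python
import numpy as np

def modal_curvature(phi, h=1.0):
    """Second derivative of a mode shape by central differences:
    phi''_i ~ (phi[i-1] - 2*phi[i] + phi[i+1]) / h**2."""
    return (phi[:-2] - 2 * phi[1:-1] + phi[2:]) / h**2

# Illustrative 11-point first-mode-like shape of a framed building; the
# 'damaged' copy has a small kink (local stiffness loss) at point 5.
x = np.linspace(0, 1, 11)
undamaged = np.sin(np.pi * x / 2)
damaged = undamaged.copy()
damaged[5] += 0.03                        # synthetic local distortion

# Curvature change peaks where the mode shape kinks, localizing the damage.
delta = np.abs(modal_curvature(damaged) - modal_curvature(undamaged))
print(np.argmax(delta) + 1)               # -> 5: damage located at the kink
```

    Because curvature is a second difference, even a small local change in the mode shape produces a sharp, spatially concentrated spike in the curvature difference, which is what makes this index attractive for localization.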

  4. Revisiting chlorophyll extraction methods in biological soil crusts - methodology for determination of chlorophyll a and chlorophyll a + b as compared to previous methods

    NASA Astrophysics Data System (ADS)

    Caesar, Jennifer; Tamm, Alexandra; Ruckteschler, Nina; Lena Leifke, Anna; Weber, Bettina

    2018-03-01

    Chlorophyll concentrations of biological soil crust (biocrust) samples are commonly determined to quantify the relevance of photosynthetically active organisms within these surface soil communities. Whereas chlorophyll extraction methods for freshwater algae and leaf tissues of vascular plants are well established, there is still some uncertainty regarding the optimal extraction method for biocrusts, where organism composition is highly variable and samples comprise major amounts of soil. In this study we analyzed the efficiency of two different chlorophyll extraction solvents, the effect of grinding the soil samples prior to the extraction procedure, and the impact of shaking as an intermediate step during extraction. The analyses were conducted on four different types of biocrusts. Our results show that for all biocrust types chlorophyll contents obtained with ethanol were significantly lower than those obtained using dimethyl sulfoxide (DMSO) as a solvent. Grinding of biocrust samples prior to analysis caused a highly significant decrease in chlorophyll content for green algal lichen- and cyanolichen-dominated biocrusts, and a tendency towards lower values for moss- and algae-dominated biocrusts. Shaking of the samples after each extraction step had a significant positive effect on the chlorophyll content of green algal lichen- and cyanolichen-dominated biocrusts. Based on our results we confirm a DMSO-based chlorophyll extraction method without grinding pretreatment and suggest the addition of an intermediate shaking step for complete chlorophyll extraction (see Supplement S6 for a detailed manual). Determination of a universal chlorophyll extraction method for biocrusts is essential for the inter-comparability of studies conducted across all continents.

  5. Ocular motor characteristics of different subtypes of spinocerebellar ataxia: distinguishing features.

    PubMed

    Kim, Ji Sun; Kim, Ji Soo; Youn, Jinyoung; Seo, Dae-Won; Jeong, Yuri; Kang, Ji-Hoon; Park, Jeong Ho; Cho, Jin Whan

    2013-08-01

    Because of frequent involvement of the cerebellum and brainstem, ocular motor abnormalities are key features of spinocerebellar ataxias and may aid in differential diagnosis. Our objective for this study was to distinguish the subtypes by ophthalmologic features after head-shaking and positional maneuvers, which are not yet recognized as differential diagnostic tools in most common forms of spinocerebellar ataxias. Of the 302 patients with a diagnosis of cerebellar ataxia in 3 Korean University Hospitals from June 2011 to June 2012, 48 patients with spinocerebellar ataxia types 1, 2, 3, 6, 7, or 8 or with undetermined spinocerebellar ataxias were enrolled. All patients underwent a video-oculographic recording of fixation abnormalities, gaze-evoked nystagmus, positional and head-shaking nystagmus, and dysmetric saccades. Logistic regression analysis controlling for disease duration revealed that spontaneous and positional downbeat nystagmus and perverted head-shaking nystagmus were strong predictors for spinocerebellar ataxia 6, whereas saccadic intrusions and oscillations were identified as positive indicators of spinocerebellar ataxia 3. In contrast, the presence of gaze-evoked nystagmus and dysmetric saccades was a negative predictor of spinocerebellar ataxia 2. Positional maneuvers and horizontal head shaking occasionally induced or augmented saccadic intrusions/oscillations in patients with spinocerebellar ataxia types 1, 2, and 3 and undetermined spinocerebellar ataxia. The results indicated that perverted head-shaking nystagmus may be the most sensitive parameter for SCA6, whereas saccadic intrusions/oscillations are the most sensitive for spinocerebellar ataxia 3. In contrast, a paucity of gaze-evoked nystagmus and dysmetric saccades is more indicative of spinocerebellar ataxia 2. Head-shaking and positional maneuvers aid in defining ocular motor characteristics in spinocerebellar ataxias. © 2013 Movement Disorder Society. 

  6. Construction of Halomonas bluephagenesis capable of high cell density growth for efficient PHA production.

    PubMed

    Ren, Yilin; Ling, Chen; Hajnal, Ivan; Wu, Qiong; Chen, Guo-Qiang

    2018-05-01

    High-cell-density cultivation is an effective way to improve the productivity of microbial fermentations and in turn reduce the cost of the final products, especially in the case of intracellular products. Halomonas bluephagenesis TD01 is a halophilic platform bacterium for the next generation of industrial biotechnology with a native PHA synthetic pathway, able to grow under non-sterile continuous fermentation conditions. A selection strategy for mutant strains that can grow to a high cell density was developed. Based on an error-prone DNA polymerase III ε subunit, a genome-wide random mutagenesis system was established and used in conjunction with an artificial high cell density culture environment during the selection process. A high-cell-density H. bluephagenesis TDHCD-R3 obtained after 3 rounds of selection showed an obvious enhancement of resistance to toxic metabolites including acetate, formate, lactate and ethanol compared to wild-type. H. bluephagenesis TDHCD-R3-8-3 constructed from H. bluephagenesis TDHCD-R3 by overexpressing an optimized phaCAB operon was able to grow to 15 g/L cell dry weight (CDW) containing 94% PHA in shake flask studies. H. bluephagenesis TDHCD-R3-8-3 was grown to more than 90 g/L CDW containing 79% PHA compared with only 81 g/L with 70% PHA by the wild type when incubated in a 7-L fermentor under the same conditions.

  7. “Shake It Baby, Shake It”: Media Preferences, Sexual Attitudes and Gender Stereotypes Among Adolescents

    PubMed Central

    Engels, Rutger C. M. E.; Bogers, Sanne; Kloosterman, Monique

    2010-01-01

    In this study, exposure to and preferences for three important youth media (TV, music styles/music TV, internet) were examined in relation to adolescents’ permissive sexual attitudes and gender stereotypes (i.e., views of men as sex-driven and tough, and of women as sex objects). Multivariate structural analysis of data from a school-based sample of 480 13- to 16-year-old Dutch students revealed that preferences, rather than exposure, were associated with attitudes and stereotypes. For both girls and boys, preferences for hip-hop and hard-house music were positively associated with gender stereotypes, and preference for classical music was negatively associated with gender stereotypes. Particularly for boys, using the internet to find explicit sexual content emerged as a powerful indicator of all attitudes and stereotypes. PMID:21212809

  8. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    NASA Astrophysics Data System (ADS)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most frightening events of nature due to their unexpected occurrence, against which no spiritual means offer protection. The only way of preserving life and property is to apply earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during the frequent earthquakes of the Aegean region. Elastic steel provided strength, while the plastic lead casing absorbed minor shifts of blocks without fracturing the rigid stone. The Romans invented concrete and built buildings of all sizes as single, inflexible units. The masonry surrounding and decorating the concrete core of a wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than its fracture limit. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction compared with masonry. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii may be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods during the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8th-9th century Hindu masonry shrines around Yogyakarta would have allowed the scattering of blocks during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. 
Only the outermost layer was treated this way; the core of the shrines was made of simple rectangular blocks. The system resisted both in-plane and out-of-plane shaking quite well, as proven by the survival of many shrines for more than a millennium, and by the fracturing of blocks instead of displacement during the 2006 Yogyakarta earthquake. Systematic use or disuse of known earthquake-resistant techniques in any one society depends on the perception of earthquake risk and on available financial resources. Earthquake-resistant construction practice is significantly more expensive than regular construction. Perception is influenced mostly by short individual and longer social memory. If earthquake recurrence time is longer than the preservation of social memory, and damaging quakes fade into the past, societies commit the same construction mistakes again and again. The length of this memory is possibly about a generation's lifetime. Events occurring less frequently than every 25-30 years can be readily forgotten, and the risk of recurrence considered negligible, not worth the costs of safe construction practices (an example being recurring flash floods in Hungary). Frequent earthquakes maintain safe construction practices, like the Java masonry technique throughout at least two centuries, and like the Fachwerk tradition on modern Aegean Samos throughout 500 years of political and technological development. (OTKA K67583)

  9. Shaking table test and dynamic response prediction on an earthquake-damaged RC building

    NASA Astrophysics Data System (ADS)

    Xianguo, Ye; Jiaru, Qian; Kangning, Li

    2004-12-01

    This paper presents the results from shaking table tests of a one-tenth-scale reinforced concrete (RC) building model. The test model was based on a prototype building that was seriously damaged during the 1985 Mexico earthquake. The input ground excitation used during the test was from the records obtained near the site of the prototype building during the 1985 and 1995 Mexico earthquakes. The tests showed that the damage pattern of the test model agreed well with that of the prototype building. Analytical prediction of earthquake response has been conducted for the prototype building using a sophisticated 3-D frame model. The input motion used for the dynamic analysis was the shaking table test measurements with similarity transformation. The comparison of the analytical results and the shaking table test results indicates that the response of the RC building to minor and moderate earthquakes can be predicted well. However, there are differences between the prediction and the actual response to the major earthquake.
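The "similarity transformation" mentioned above follows from dynamic similitude between the scale model and the prototype. A minimal sketch of the scale factors, assuming the common case where lengths are scaled and accelerations are reproduced at full amplitude (the abstract does not state the actual scaling laws used in this test):

```python
import math

def similitude_factors(length_scale: float, accel_scale: float = 1.0) -> dict:
    """Scale factors (model/prototype) derived from length and acceleration scales.

    From dimensional analysis: t ~ sqrt(L/a), f ~ 1/t, v ~ a*t.
    """
    t = math.sqrt(length_scale / accel_scale)
    return {
        "length": length_scale,
        "acceleration": accel_scale,
        "time": t,
        "frequency": 1.0 / t,
        "velocity": accel_scale * t,
    }

# 1/10-scale model with accelerations reproduced at full amplitude:
factors = similitude_factors(1 / 10)
# a period measured on the model is divided by factors["time"]
# to obtain the corresponding period at prototype scale
```

Under this scaling, time on the table runs faster than at prototype scale by a factor of sqrt(10), so recorded motions must be compressed accordingly before they are fed to the analytical model.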

  10. Effects of oxcarbazepine on monoamines content in hippocampus and head and body shakes and sleep patterns in kainic acid-treated rats.

    PubMed

    Alfaro-Rodríguez, Alfonso; González-Piña, Rigoberto; Bueno-Nava, Antonio; Arch-Tirado, Emilio; Ávila-Luna, Alberto; Uribe-Escamilla, Rebeca; Vargas-Sánchez, Javier

    2011-09-01

    The aim of this work was to analyze the effect of oxcarbazepine (OXC) on sleep patterns, "head and body shakes" and monoamine neurotransmitter levels in a model of kainic acid-induced seizures. Adult Wistar rats were administered kainic acid (KA), OXC or OXC + KA. A polysomnographic study showed that KA induced animals to stay awake for the whole initial 10 h. OXC administration 30 min prior to KA diminished the effect of KA on the sleep parameters. As a measure of the effects of the drug treatments on behavior, head and body shakes were visually recorded for 4 h after administration of KA, OXC + KA or saline. The presence of OXC diminished the frequency of shakes. Four hours after drug application, the hippocampus was dissected out, and the content of monoamines was analyzed. The presence of OXC further increased the KA-induced levels of serotonin, 5-hydroxyindoleacetic acid, dopamine, and homovanillic acid.

  11. Correlation between shaking behaviors and seizure severity in five animal models of convulsive seizures.

    PubMed

    Rodrigues, Marcelo Cairrão Araújo; Rossetti, Franco; Foresti, Maira Licia; Arisi, Gabriel Maisonnave; Furtado, Márcio Araújo; Dal-Cól, Maria Luiza Cleto; Bertti, Poliana; Fernandes, Artur; Santos, Francisco Leite; Del Vecchio, Flávio; Garcia-Cairasco, Norberto

    2005-05-01

    Wet dog shakes (WDS) and head shakes (HS) are associated with experimentally induced convulsive seizures. We sought to determine whether these behaviors are correlated or not with major (status epilepticus (SE) or fully kindled animals) or minor (non-SE or partially kindled animals) seizure severity. WDS are directly correlated with SE induced by intracerebral star fruit extract (Averrhoa carambola) injection and with kindled animals in the amygdala fast kindling model. On the other hand, WDS are inversely correlated with SE induced by intracerebral bicuculline and pilocarpine injections. Systemic pilocarpine in animals pretreated with methyl-scopolamine barely induced WDS or HS. The role of shaking behaviors may vary from ictal to anticonvulsant depending on the experimental seizure model, circuitries involved, and stimulus intensity. The physical presence of acrylic helmets may per se inhibit the HS response. Also, methyl-scopolamine, a drug incapable of crossing the blood-brain barrier, can induce HS in animals without acrylic helmets.

  12. Anomalies and contradictions in an airport construction project: a historical analysis based on Cultural-Historical Activity Theory.

    PubMed

    Lopes, Manoela Gomes Reis; Vilela, Rodolfo Andrade de Gouveia; Querol, Marco Antônio Pereira

    2018-02-19

    Large construction projects involve the functioning of a complex activity system (AS) in network format. Anomalies such as accidents, delays, reworks, etc., can be explained by contradictions that emerge historically in the system. The aim of this study was to analyze the history of an airport construction project to understand the current contradictions and anomalies in the AS and how they emerged. A case study was conducted for this purpose, combining Collective Work Analysis, interviews, observations, and analysis of documents that provided the basis for sessions in the Change Laboratory, where a participant timeline was elaborated with the principal events during the construction project. Based on the timeline, a historical analysis of the airport's AS revealed critical historical events and contradictions that explained the anomalies that occurred during the project. The analysis showed that the airport had been planned for construction with politically determined deadlines that were insufficient and inconsistent with the project's complexity. The choice of the contract modality, which assigned responsibility to a joint venture for all of the project's phases, was another critical historical event, because it allowed launching the construction before a definitive executive project had been drafted. There were also different cultures in companies working together for the first time in the context of a project with time pressures and outsourcing of activities without the necessary coordination. Identifying these contradictions and their historical origins proved essential for understanding the current situation and efforts to prevent similar situations in the future.

  13. The ShakeMap Atlas for the City of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Pierdominici, Simona; Faenza, Licia; Camassi, Romano; Michelini, Alberto; Ercolani, Emanuela; Lauciani, Valentino

    2016-04-01

    Naples is one of the most vulnerable cities in the world because it is threatened by several natural and man-made hazards: earthquakes, volcanic eruptions, tsunamis, landslides, hydrogeological disasters, and morphologic alterations due to human interference. In addition, the risk is increased by the high density of population (Naples and the surrounding area are among the most populated in Italy), and by the type and condition of buildings and monuments. In light of this, it is crucial to assess the ground shaking suffered by the city. We take into account and integrate information from five Italian databases and catalogues (DBMI11; CPTI11; CAMAL11; MOLAL08; ITACA) to build a reliable ShakeMap atlas for the area and to recreate the seismic history of the city from historical to recent times (1293 to 1999). This large amount of data gives the opportunity to explore several sources of information, expanding the completeness of our data set in both time and magnitude. 84 earthquakes have been analyzed, and for each event a ShakeMap set has been computed using an ad hoc implementation developed for this application: (1) specific ground-motion prediction equations (GMPEs) accounting for the different attenuation properties of volcanic areas compared with tectonic ones, and (2) detailed local microzonation to include site effects. The ShakeMap atlas has two main applications: a) it is an important instrument in seismic risk management. It quantifies the level of shaking suffered by the city during its history, and it could be extended to quantify the number of people exposed to given degrees of shaking. Intensity data provide an evaluation of the damage caused by earthquakes; the damage is closely linked with the ground shaking, building type, and vulnerability, and it is not possible to separate these contributions; b) the atlas can be used as a starting point for Bayesian estimation of seismic hazard. 
This technique allows merging with the more standard approach adopted in the compilation of the national hazard map of Italy. These ShakeMaps are provided in terms of Mercalli-Cancani-Sieberg intensity (MCS hereinafter) and peak ground acceleration (PGA).
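A GMPE maps magnitude and distance to expected shaking, and the microzonation enters as a site-specific amplification term. A toy sketch of that two-step pipeline, with placeholder coefficients rather than the volcanic- and tectonic-region GMPEs actually used for the Naples atlas:

```python
import math

def toy_gmpe_pga(mag: float, dist_km: float,
                 c0: float = -1.5, c1: float = 0.5, c2: float = -1.2,
                 h_km: float = 6.0) -> float:
    """Toy ground-motion prediction equation: ln(PGA) = c0 + c1*M + c2*ln(R).

    Coefficients c0..c2 and the depth term h_km are illustrative placeholders,
    not the calibrated GMPEs of the Naples ShakeMap atlas.
    """
    r = math.sqrt(dist_km ** 2 + h_km ** 2)  # effective distance with depth term
    return math.exp(c0 + c1 * mag + c2 * math.log(r))

def apply_site_amplification(pga: float, amp_factor: float) -> float:
    """Microzonation enters as a multiplicative site amplification factor."""
    return pga * amp_factor

# shaking grows with magnitude and attenuates with distance;
# a soft-soil site amplifies the rock-site estimate
pga_rock = toy_gmpe_pga(mag=6.0, dist_km=30.0)
pga_site = apply_site_amplification(pga_rock, amp_factor=2.0)
```

A real implementation would also separate volcanic-area attenuation (typically faster) from tectonic attenuation by swapping coefficient sets, which is the distinction the abstract highlights.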

  14. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  15. Applied research of shaking table for scandium concentration from a silicate ore

    NASA Astrophysics Data System (ADS)

    Yan, P.; Zhang, G. F.; Gao, L.; Shi, B. H.; Shi, Z.; Yang, Y. D.

    2018-03-01

    A low-grade magnetite iron ore constitutes a super-large independent scandium deposit with a potential utilizable value of several billion. Shaking table separation is very useful for removing impurities and increasing scandium content as a follow-up step to high-intensity magnetic separation. In the present study, a satisfactory result, namely a scandium content of 83.10 g/t at a recovery rate of 79.45 wt%, was obtained by shaking table separation. This result was achieved under the following conditions: feed concentration of 18 wt%, feed rate of 11 L/min, stroke frequency of 275 strokes/min, and stroke length of 17 mm.
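Grade and recovery figures like those above are linked by the standard two-product mass balance. A small sketch of that calculation; the feed and tailings grades below are hypothetical, since the abstract reports only the concentrate grade and recovery:

```python
def recovery_percent(feed_grade: float, conc_grade: float, tail_grade: float) -> float:
    """Two-product recovery (%) from grades alone (all in g/t), via mass balance:

        R = 100 * c * (f - t) / (f * (c - t))

    where f, c, t are feed, concentrate and tailings grades.
    """
    return 100.0 * conc_grade * (feed_grade - tail_grade) / (
        feed_grade * (conc_grade - tail_grade)
    )

# hypothetical feed and tailings grades for illustration,
# paired with the concentrate grade of 83.10 g/t reported above
r = recovery_percent(feed_grade=50.0, conc_grade=83.10, tail_grade=15.0)
```

The formula follows from conserving both total mass and scandium mass across the concentrate and tailings streams, so recovery can be audited from assays alone without weighing the products.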

  16. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.
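Workflows of this kind are directed acyclic graphs of jobs, and a scheduler must release each job only after its dependencies finish. A minimal stand-in for that idea using Python's standard-library topological sorter; the job names are illustrative, not the project's actual component names:

```python
from graphlib import TopologicalSorter

# A toy CyberShake-style workflow: hazard-curve jobs depend on
# ground-motion simulations, which depend on preparatory stages.
workflow = {
    "velocity_mesh": set(),
    "rupture_variations": set(),
    "ground_motion_sim": {"velocity_mesh", "rupture_variations"},
    "hazard_curve": {"ground_motion_sim"},
}

def schedule(dag: dict) -> list:
    """Return jobs in an order that respects every dependency edge."""
    return list(TopologicalSorter(dag).static_order())

order = schedule(workflow)
```

Production systems such as the one described add resource provisioning, retries, and provenance tracking on top of this ordering step, but the dependency-respecting schedule is the core invariant.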

  17. Fixed Base Modal Survey of the MPCV Orion European Service Module Structural Test Article

    NASA Technical Reports Server (NTRS)

    Winkel, James P.; Akers, J. C.; Suarez, Vicente J.; Staab, Lucas D.; Napolitano, Kevin L.

    2017-01-01

    Recently, the MPCV Orion European Service Module Structural Test Article (E-STA) underwent sine vibration testing using the multi-axis shaker system at NASA GRC Plum Brook Station Mechanical Vibration Facility (MVF). An innovative approach using measured constraint shapes at the interface of E-STA to the MVF allowed high-quality fixed base modal parameters of the E-STA to be extracted, which have been used to update the E-STA finite element model (FEM), without the need for a traditional fixed base modal survey. This innovative approach provided considerable program cost and test schedule savings. This paper documents this modal survey, which includes the modal pretest analysis sensor selection, the fixed base methodology using measured constraint shapes as virtual references and measured frequency response functions, and post-survey comparison between measured and analysis fixed base modal parameters.

  18. Multimodal neural correlates of cognitive control in the Human Connectome Project.

    PubMed

    Lerman-Sinkoff, Dov B; Sui, Jing; Rachakonda, Srinivas; Kandala, Sridhar; Calhoun, Vince D; Barch, Deanna M

    2017-12-01

    Cognitive control is a construct that refers to the set of functions that enable decision-making and task performance through the representation of task states, goals, and rules. The neural correlates of cognitive control have been studied in humans using a wide variety of neuroimaging modalities, including structural MRI, resting-state fMRI, and task-based fMRI. The results from each of these modalities independently have implicated the involvement of a number of brain regions in cognitive control, including dorsal prefrontal cortex, and frontal parietal and cingulo-opercular brain networks. However, it is not clear how the results from a single modality relate to results in other modalities. Recent developments in multimodal image analysis methods provide an avenue for answering such questions and could yield more integrated models of the neural correlates of cognitive control. In this study, we used multiset canonical correlation analysis with joint independent component analysis (mCCA + jICA) to identify multimodal patterns of variation related to cognitive control. We used two independent cohorts of participants from the Human Connectome Project, each of which had data from four imaging modalities. We replicated the findings from the first cohort in the second cohort using both independent and predictive analyses. The independent analyses identified a component in each cohort that was highly similar to the other and significantly correlated with cognitive control performance. The replication by prediction analyses identified two independent components that were significantly correlated with cognitive control performance in the first cohort and significantly predictive of performance in the second cohort. 
These components identified positive relationships across the modalities in neural regions related to both dynamic and stable aspects of task control, including regions in both the frontal-parietal and cingulo-opercular networks, as well as regions hypothesized to be modulated by cognitive control signaling, such as visual cortex. Taken together, these results illustrate the potential utility of multi-modal analyses in identifying the neural correlates of cognitive control across different indicators of brain structure and function. Copyright © 2017 Elsevier Inc. All rights reserved.
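The multiset CCA at the heart of mCCA + jICA generalizes classical two-set canonical correlation, which finds maximally correlated linear combinations across modalities. A much-simplified two-set sketch (not the authors' pipeline) computing the first canonical correlation via orthonormal bases:

```python
import numpy as np

def cca_first_correlation(X: np.ndarray, Y: np.ndarray) -> float:
    """First canonical correlation between two data sets (samples x features).

    Classical two-set CCA: center each set, take orthonormal bases of the
    column spaces via thin SVD, and read canonical correlations off the
    singular values of the cross-product of the bases.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Ux, _, _ = np.linalg.svd(X, full_matrices=False)
    Uy, _, _ = np.linalg.svd(Y, full_matrices=False)
    s = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return float(s[0])

# two synthetic "modalities" driven by one shared latent source
rng = np.random.default_rng(0)
shared = rng.normal(size=(200, 1))
X = np.hstack([shared + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])
Y = np.hstack([shared + 0.1 * rng.normal(size=(200, 1)) for _ in range(2)])
rho = cca_first_correlation(X, Y)  # near 1 for a strong shared source
```

The multimodal patterns reported in the abstract are, loosely, the feature weights behind such correlated components, with joint ICA then unmixing them into interpretable maps.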

  19. Explaining errors in children's questions.

    PubMed

    Rowland, Caroline F

    2007-07-01

    The ability to explain the occurrence of errors in children's speech is an essential component of successful theories of language acquisition. The present study tested some generativist and constructivist predictions about error on the questions produced by ten English-learning children between 2 and 5 years of age. The analyses demonstrated that, as predicted by some generativist theories [e.g. Santelmann, L., Berk, S., Austin, J., Somashekar, S. & Lust, B. (2002). Continuity and development in the acquisition of inversion in yes/no questions: dissociating movement and inflection, Journal of Child Language, 29, 813-842], questions with auxiliary DO attracted higher error rates than those with modal auxiliaries. However, in wh-questions, questions with modals and DO attracted equally high error rates, and these findings could not be explained in terms of problems forming questions with why or negated auxiliaries. It was concluded that the data might be better explained in terms of a constructivist account that suggests that entrenched item-based constructions may be protected from error in children's speech, and that errors occur when children resort to other operations to produce questions [e.g. Dabrowska, E. (2000). From formula to schema: the acquisition of English questions. Cognitive Linguistics, 11, 83-102; Rowland, C. F. & Pine, J. M. (2000). Subject-auxiliary inversion errors and wh-question acquisition: What children do know? Journal of Child Language, 27, 157-181; Tomasello, M. (2003). Constructing a language: A usage-based theory of language acquisition. Cambridge, MA: Harvard University Press]. However, further work on constructivist theory development is required to allow researchers to make predictions about the nature of these operations.

  20. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    USGS Publications Warehouse

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. 
The earthquake, its damages, and resulting losses are one realistic outcome, deliberately not a worst-case scenario but rather one worth preparing for and mitigating against. Decades of improving the life-safety requirements in building codes have greatly reduced the risk of death in earthquakes, yet southern California's economic and social systems are still vulnerable to large-scale disruptions. Because of this, the ShakeOut Scenario earthquake would dramatically alter the nature of the southern California community. Fortunately, steps can be taken now that can change that outcome and repay any costs many times over. The ShakeOut Scenario is the first public product of the USGS Multi-Hazards Demonstration Project, created to show how hazards science can increase a community's resiliency to natural disasters through improved planning, mitigation, and response.

  1. The Early Warning System(EWS) as First Stage to Generate and Develop Shake Map for Bucharest to Deep Vrancea Earthquakes

    NASA Astrophysics Data System (ADS)

    Marmureanu, G.; Ionescu, C.; Marmureanu, A.; Grecu, B.; Cioflan, C.

    2007-12-01

    The EWS developed by NIEP is the first European system for real-time early detection and warning of seismic waves in the case of strong deep earthquakes. The EWS uses the time interval (28-32 seconds) between the moment when an earthquake is detected by the borehole and surface local accelerometer network installed in the epicentral area (Vrancea) and the arrival time of the seismic waves in the protected area to deliver timely integrated information, enabling actions to be taken before the main destructive shaking takes place. The early warning system is viewed as part of a real-time information system that provides rapid information about an impending earthquake hazard to the public and disaster relief organizations before (early warning) and after (shake map) a strong earthquake. This product complements another new product under development at the National Institute for Earth Physics: the shake map, a representation of the ground shaking produced by an event, which will be generated automatically following large Vrancea earthquakes. Bucharest is located in the central part of the Moesian platform (age: Precambrian and Paleozoic) in the Romanian Plain, about 140 km from the Vrancea area. Above Cretaceous and Miocene deposits (with the bottom at roughly 1,400 m depth), a Pliocene shallow-water deposit (~700 m thick) was settled. The surface geology consists mainly of Quaternary alluvial deposits. Later loess covered these deposits, and the two rivers crossing the city (Dambovita and Colentina) carved the present landscape. During the last century Bucharest suffered heavy damage and casualties due to the 1940 (Mw = 7.7) and 1977 (Mw = 7.4) Vrancea earthquakes. For example, 32 tall buildings collapsed and more than 1,500 people died during the 1977 event. 
The innovation compared with related systems worldwide is that NIEP will use the EWS to generate a virtual shake map for Bucharest (140 km from the epicenter) immediately after the magnitude is estimated (3-4 seconds after detection at the epicenter) and later make corrections using the real-time data flow from each K2 accelerometer installed in the Bucharest area, including nonlinear effects. Thus, the development of a near real-time shake map for the Bucharest urban area is of highest interest, providing valuable information to civil defense, decision makers, and the general public on the areas where the ground motion is most severe. The EWS built by NIEP can be considered the first stage in generating and developing the shake map for Bucharest for deep Vrancea earthquakes.
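The 28-32 second warning window arises from simple travel-time arithmetic: detection happens when P waves reach the epicentral sensors, while the window closes when the slower S waves reach the city. A back-of-the-envelope sketch; the wave speeds and processing delay below are illustrative assumptions, not NIEP's actual parameters:

```python
import math

def warning_time_s(epicentral_km: float, focal_depth_km: float,
                   vp_km_s: float = 6.5, vs_km_s: float = 3.5,
                   processing_s: float = 4.0) -> float:
    """Approximate lead time before destructive S waves reach a protected city.

    Detection time = P-wave travel from hypocenter to epicentral sensors
    (depth / vp) plus a processing delay; the window closes at the S-wave
    arrival over the hypocentral distance to the city. All parameter values
    here are illustrative.
    """
    hypo_km = math.sqrt(epicentral_km ** 2 + focal_depth_km ** 2)
    t_s_arrival = hypo_km / vs_km_s
    t_detection = focal_depth_km / vp_km_s + processing_s
    return t_s_arrival - t_detection

# Bucharest: ~140 km epicentral distance; Vrancea events are
# intermediate-depth, here taken as ~100 km
lead = warning_time_s(140.0, 100.0)
```

With these assumed values the lead time comes out on the order of tens of seconds, consistent with the 28-32 s interval quoted in the abstract; the true figure depends on the regional velocity model and actual processing latency.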

  2. Shake table test of soil-pile groups-bridge structure interaction in liquefiable ground

    NASA Astrophysics Data System (ADS)

    Tang, Liang; Ling, Xianzhang; Xu, Pengju; Gao, Xia; Wang, Dongsheng

    2010-03-01

    This paper describes a shake table test study on the seismic response of low-cap pile groups and a bridge structure in liquefiable ground. The soil profile, contained in a large-scale laminar shear box, consisted of a horizontally saturated sand layer overlaid with a silty clay layer, with the simulated low-cap pile groups embedded. The container was excited by three El Centro earthquake motions of different levels. Test results indicate that excess pore pressure (EPP) accumulated only slightly during slight shaking, with the accumulation occurring mainly during strong shaking. The EPP was gradually enhanced as the amplitude and duration of the input acceleration increased. The acceleration response of the sand was remarkably influenced by soil liquefaction. As soil liquefaction occurred, the peak sand displacement gradually lagged behind the input acceleration; meanwhile, the sand displacement exhibited an increasing effect on the bending moment of the pile, and the acceleration responses of the pile and the sand layer gradually changed from decreasing to increasing in the vertical direction from bottom to top. A jump variation of the bending moment of the pile was observed near the soil interface in all three input earthquake events. It is thought that these shake table tests could provide the groundwork for further seismic performance studies of low-cap pile groups used in bridges located on liquefiable ground.

  3. Quantitative x-ray photoelectron spectroscopy: Quadrupole effects, shake-up, Shirley background, and relative sensitivity factors from a database of true x-ray photoelectron spectra

    NASA Astrophysics Data System (ADS)

    Seah, M. P.; Gilmore, I. S.

    2006-05-01

    An analysis is provided of the x-ray photoelectron spectroscopy (XPS) intensities measured in the National Physical Laboratory (NPL) XPS database for 46 solid elements. This present analysis does not change our previous conclusions concerning the excellent correlation between experimental intensities, after deconvolution of the spectra with angle-averaged reflection electron energy loss data, and the theoretical intensities involving the dipole approximation using Scofield’s cross sections. Here, more recent calculations for cross sections by Trzhaskovskaya involving quadrupole terms are evaluated, and it is shown that their cross sections diverge from the experimental database results by up to a factor of 5. The quadrupole angular terms lead to small corrections that are close to our measurement limit but do appear to be supported in the present analysis. Measurements of the extent of shake-up for the 46 elements broadly agree with the calculations of Yarzhemsky but not in detail. The constancy in the shake-up contribution predicted by Yarzhemsky implies that the use of the Shirley background will lead to a peak area that is a constant fraction of the true peak area including the shake-up intensities. However, the measured variability of the shake-up contribution makes the Shirley background invalid for quantification except for situations where the sensitivity factors are from reference samples similar to those being analyzed.
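The Shirley background discussed above is conventionally computed by iteration: the background at each energy point is taken proportional to the background-subtracted peak area on one side of it, anchored at the spectrum endpoints. A minimal sketch of that iteration (orientation conventions vary between implementations; this one assumes the first sample sits on the high-background side):

```python
import numpy as np

def shirley_background(y: np.ndarray, n_iter: int = 50) -> np.ndarray:
    """Iterative Shirley background for a photoemission peak.

    Assumes y[0] lies on the high-background side and y[-1] on the low side;
    the two endpoints anchor the background.
    """
    y = np.asarray(y, dtype=float)
    y_high, y_low = y[0], y[-1]
    b = np.full_like(y, y_low)
    for _ in range(n_iter):
        peak = y - b
        # background-subtracted peak area to the right of each point
        right_area = np.cumsum(peak[::-1])[::-1]
        b = y_low + (y_high - y_low) * right_area / right_area[0]
    return b

# synthetic check: Gaussian peak plus a self-consistent Shirley-type step
x = np.linspace(-3.0, 3.0, 121)
g = 5.0 * np.exp(-x ** 2)
step = 0.3 * np.cumsum(g[::-1])[::-1] / g.sum()  # true Shirley-shaped background
spectrum = g + step
recovered = shirley_background(spectrum)
```

Because the iteration ties the background shape to the peak area, any shake-up intensity excluded from the integration window shifts the recovered peak area, which is exactly why variable shake-up contributions undermine Shirley-based quantification.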

  4. Vision 20/20: Simultaneous CT-MRI — Next chapter of multimodality imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ge, E-mail: wangg6@rpi.edu; Xi, Yan; Gjesteby, Lars

    Multimodality imaging systems such as positron emission tomography-computed tomography (PET-CT) and MRI-PET are widely available, but a simultaneous CT-MRI instrument has not been developed. Synergies between independent modalities, e.g., CT, MRI, and PET/SPECT, can be realized with image registration, but such postprocessing suffers from registration errors that can be avoided with synchronized data acquisition. The clinical potential of simultaneous CT-MRI is significant, especially in cardiovascular and oncologic applications where studies of the vulnerable plaque, response to cancer therapy, and kinetic and dynamic mechanisms of targeted agents are limited by current imaging technologies. The rationale, feasibility, and realization of simultaneous CT-MRI are described in this perspective paper. The enabling technologies include interior tomography, unique gantry designs, open magnet and RF sequences, and source and detector adaptation. Based on the experience with PET-CT, PET-MRI, and MRI-LINAC instrumentation, where hardware innovation and performance optimization were instrumental in constructing commercial systems, the authors provide top-level concepts for simultaneous CT-MRI to meet clinical requirements and new challenges. Simultaneous CT-MRI fills a major gap in modality coupling and represents a key step toward the so-called “omnitomography” defined as the integration of all relevant imaging modalities for systems biology and precision medicine.

  5. Gold nanoshelled liquid perfluorocarbon nanocapsules for combined dual modal ultrasound/CT imaging and photothermal therapy of cancer.

    PubMed

    Ke, Hengte; Yue, Xiuli; Wang, Jinrui; Xing, Sen; Zhang, Qian; Dai, Zhifei; Tian, Jie; Wang, Shumin; Jin, Yushen

    2014-03-26

    The integration of multimodal contrast-enhanced diagnostic imaging and therapeutic capabilities enables imaging-guided therapy: the treatment strategy is planned from the diagnostic results, and the therapeutic procedures are guided and monitored. Herein, gold nanoshelled perfluorooctylbromide (PFOB) nanocapsules with PEGylation (PGsP NCs) are constructed by an oil-in-water emulsion method to form polymeric PFOB nanocapsules, followed by the formation of a PEGylated gold nanoshell on the surface. PGsP NCs not only provide excellent contrast enhancement for dual modal ultrasound and CT imaging in vitro and in vivo, but also serve as efficient photoabsorbers for photothermal ablation of tumors in a xenografted nude mouse model. To the best of our knowledge, this is the first report of a gold nanoshell serving as both a CT contrast agent and a photoabsorber for photothermal therapy. This multifunctional nanomedicine could offer more comprehensive diagnostic information to guide more accurate and effective cancer therapy. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Stakeholders' perceptions of transferability criteria for health promotion interventions: a case study.

    PubMed

    Trompette, Justine; Kivits, Joëlle; Minary, Laetitia; Cambon, Linda; Alla, François

    2014-11-04

    The effects of health promotion interventions are the result not only of the interventions themselves, but also of the contexts in which they unfold. The objective of this study was to analyze, through stakeholders' discourse, the characteristics of an intervention that can influence its outcomes. This case study was based on semi-structured interviews with health promotion stakeholders involved in a regional program (PRALIMAP). General hypotheses on transferability and on how the intervention is presumed to produce its effects were used to construct an interview guide. Interviews were analyzed using thematic coding. Twenty-three stakeholders were interviewed. Results showed that stakeholders made few references to population and environment characteristics. Three themes emerged as significant for the stakeholders: implementation modalities and methodology; modalities used to mobilize actors; and transferability-promoting factors and barriers. Our work contributes to a better understanding not only of transferability factors, but also of stakeholders' perceptions of them, which are just as important, because those perceptions themselves are a factor in mobilization of actors, implementation, and transferability.

  7. Regression-based pediatric norms for the brief visuospatial memory test: revised and the symbol digit modalities test.

    PubMed

    Smerbeck, A M; Parrish, J; Yeh, E A; Hoogs, M; Krupp, Lauren B; Weinstock-Guttman, B; Benedict, R H B

    2011-04-01

    The Brief Visuospatial Memory Test - Revised (BVMTR) and the Symbol Digit Modalities Test (SDMT) oral-only administration are known to be sensitive to cerebral disease in adult samples, but pediatric norms are not available. A demographically balanced sample of healthy control children (N = 92) ages 6-17 was tested with the BVMTR and SDMT. Multiple regression analysis (MRA) was used to develop demographically controlled normative equations. This analysis provided equations that were then used to construct demographically adjusted z-scores for the BVMTR Trial 1, Trial 2, Trial 3, Total Learning, and Delayed Recall indices, as well as the SDMT total correct score. To demonstrate the utility of this approach, a comparison group of children with acute disseminated encephalomyelitis (ADEM) or multiple sclerosis (MS) were also assessed. We find that these visual processing tests discriminate neurological patients from controls. As the tests are validated in adult multiple sclerosis, they are likely to be useful in monitoring pediatric onset multiple sclerosis patients as they transition into adulthood.
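    The MRA-based norming described above amounts to regressing raw scores on demographic predictors in the control sample and scaling the residuals by the residual standard deviation. A hypothetical sketch with age as the sole predictor (the sample values and coefficients below are invented, not the published norms):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical normative sample: ages 6-17, raw scores improving with age
age = rng.uniform(6.0, 17.0, 92)
score = 10.0 + 2.0 * age + rng.normal(0.0, 3.0, 92)

# fit the normative regression (intercept + age term)
X = np.column_stack([np.ones_like(age), age])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
resid = score - X @ beta
sd = resid.std(ddof=2)          # residual SD used to scale z-scores

def adjusted_z(raw, child_age):
    """Demographically adjusted z-score for a new child."""
    expected = beta[0] + beta[1] * child_age
    return (raw - expected) / sd

z = adjusted_z(20.0, 10.0)      # a 10-year-old scoring 20 -> below expectation
```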

  8. Application of a Terrestrial LIDAR System for Elevation Mapping in Terra Nova Bay, Antarctica.

    PubMed

    Cho, Hyoungsig; Hong, Seunghwan; Kim, Sangmin; Park, Hyokeun; Park, Ilsuk; Sohn, Hong-Gyoo

    2015-09-16

    A terrestrial Light Detection and Ranging (LIDAR) system has high productivity and accuracy for topographic mapping, but the harsh conditions of Antarctica make LIDAR operation difficult. Low temperatures cause malfunctioning of the LIDAR system, and unpredictable strong winds can deteriorate data quality by irregularly shaking co-registration targets. For stable and efficient LIDAR operation in Antarctica, this study proposes and demonstrates the following practical solutions: (1) a lagging cover with a heating pack to maintain the temperature of the terrestrial LIDAR system; (2) co-registration using square planar targets and two-step point-merging methods based on extracted feature points and the Iterative Closest Point (ICP) algorithm; and (3) a georeferencing module consisting of an artificial target and a Global Navigation Satellite System (GNSS) receiver. The solutions were used to produce a topographic map for construction of the Jang Bogo Research Station in Terra Nova Bay, Antarctica. Co-registration and georeferencing precision reached 5 and 45 mm, respectively, and the accuracy of the Digital Elevation Model (DEM) generated from the LIDAR scanning data was ±27.7 cm.
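    The point-merging step relies on estimating a rigid transform between overlapping scans, which is the inner step of each ICP iteration. A sketch of the standard SVD-based (Kabsch) solution on synthetic feature points (this illustrates the general technique, not the authors' implementation):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch/Procrustes solution, the core step of each ICP iteration)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflections
    t = c_dst - R @ c_src
    return R, t

# hypothetical matched feature points from two scan positions
rng = np.random.default_rng(1)
pts = rng.normal(size=(20, 3))
a = np.deg2rad(30.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -1.0, 2.0])
moved = pts @ R_true.T + t_true

R, t = best_rigid_transform(pts, moved)
rms = np.sqrt(np.mean(np.sum((pts @ R.T + t - moved) ** 2, axis=1)))
```

With exact, noiseless correspondences the recovered transform matches the true one to machine precision; real scans would iterate this step with nearest-neighbour matching.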

  9. Spring tube braces for seismic isolation of buildings

    NASA Astrophysics Data System (ADS)

    Karayel, V.; Yuksel, Ercan; Gokce, T.; Sahin, F.

    2017-01-01

    A new low-cost seismic isolation system based on spring tube bracings has been proposed and studied at the Structural and Earthquake Engineering Laboratory of Istanbul Technical University. Multiple compression-type springs are positioned in a special cylindrical tube to obtain a symmetrical response under tension and compression-type axial loading. An isolation floor, which consists of pin-ended steel columns and spring tube bracings, is constructed at the foundation level or any intermediate level of the building. An experimental campaign with three stages was completed to evaluate the capability of the system. First, the behavior of the spring tubes subjected to axial displacement reversals with varying frequencies was determined. In the second phase, the isolation floor was assessed in quasi-static tests. Finally, a ¼ scaled 3D steel frame was tested on the shake table using actual acceleration records. The acceleration transmitted to the floor levels is greatly diminished by the isolation story, which provides a longer period and higher damping. There are no stability or self-centering problems in the isolation floor.
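    The benefit of a longer period and higher damping can be illustrated with the classical single-degree-of-freedom transmissibility formula (the frequencies and damping ratios below are assumed round numbers, not measured values from these tests):

```python
import numpy as np

def transmissibility(f_excite, f_n, zeta):
    """Steady-state acceleration transmissibility of a base-excited
    SDOF system at frequency ratio r = f_excite / f_n."""
    r = f_excite / f_n
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2
    return np.sqrt(num / den)

f_quake = 2.0   # dominant excitation frequency, Hz (assumed)
T_fixed = transmissibility(f_quake, f_n=2.5, zeta=0.05)      # stiff, unisolated
T_isolated = transmissibility(f_quake, f_n=0.5, zeta=0.15)   # long-period isolation
```

Near resonance the stiff structure amplifies the input, while shifting the natural frequency well below the excitation (r > √2) strongly attenuates it.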

  10. FISH-Based Analysis of Clonally Derived CHO Cell Populations Reveals High Probability for Transgene Integration in a Terminal Region of Chromosome 1 (1q13).

    PubMed

    Li, Shengwei; Gao, Xiaoping; Peng, Rui; Zhang, Sheng; Fu, Wei; Zou, Fangdong

    A basic goal in the development of recombinant proteins is the generation of cell lines that express the desired protein stably over many generations. Here, we constructed engineered Chinese hamster ovary cell lines (CHO-S) with a pCHO-hVR1 vector that carried an extracellular domain of a VEGF receptor (VR) fusion gene. Forty-five clones with high hVR1 expression were selected for karyotype analysis. Using fluorescence in situ hybridization (FISH) and G-banding, we found that pCHO-hVR1 was integrated into three chromosomes: chromosomes 1, Z3, and Z4. Four clones were selected to evaluate their productivity under non-fed, non-optimized shake flask conditions. The results showed that clones 1 and 2, with integration sites on chromosome 1, produced high levels of hVR1 (approximately 800 mg/L in shake flasks), whereas clones 3 and 4, with integration sites on chromosomes Z3 or Z4, produced lower levels. Furthermore, clones 1 and 2 maintained stable productivity over a continuous period of 80 generations, while clones 3 and 4 showed significant declines in productivity even in the presence of selection pressure. Finally, pCHO-hVR1 localized to the same region, chromosome 1q13, the telomere region of normal chromosome 1. These results demonstrate that integration of the exogenous hVR1 gene on chromosome 1, band q13, can create a high protein-producing CHO-S cell line, suggesting that chromosome 1q13 may contain a useful target site for high expression of exogenous proteins. Targeted integration at chromosome 1q13 may avoid the gene silencing and position effects caused by random integration, facilitating exogenous gene expression in CHO-S cells.

  11. Protons -- The Future of Radiation Therapy?

    NASA Astrophysics Data System (ADS)

    Avery, Steven

    2007-03-01

    Cancer is the second leading cause of death in the United States, and the challenge of controlling the disease grows as the population lives longer. Proton therapy offers another choice in the management of cancer care. Proton therapy has existed since the late 1950s, and the first hospital-based center in the United States opened in 1990. Since that time, four hospital-based proton centers have been treating patients, with other centers either under construction or under consideration. This talk will give an introduction to proton therapy: its medical advantages over current treatment modalities, accelerators and beam delivery systems, applications to clinical radiation oncology, and the future outlook for proton therapy.

  12. Explicit reference governor for linear systems

    NASA Astrophysics Data System (ADS)

    Garone, Emanuele; Nicotra, Marco; Ntogramatzidis, Lorenzo

    2018-06-01

    The explicit reference governor is a constrained control scheme that was originally introduced for generic nonlinear systems. This paper presents two explicit reference governor strategies that are specifically tailored for the constrained control of linear time-invariant systems subject to linear constraints. Both strategies are based on the idea of maintaining the system states within an invariant set which is entirely contained in the constraints. This invariant set can be constructed by exploiting either the Lyapunov inequality or modal decomposition. To improve the performance, we show that the two strategies can be combined by choosing at each time instant the least restrictive set. Numerical simulations illustrate that the proposed scheme achieves performances that are comparable to optimisation-based reference governors.
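    The idea of keeping the state inside a constraint-respecting invariant level set while steering an applied reference toward the desired one can be sketched on a toy first-order system. This is an illustrative reduction of the Lyapunov-inequality construction, not the paper's formulation; the gain, constraint, and update law are assumed:

```python
import numpy as np

# Toy pre-stabilized plant x' = -x + v with constraint x <= 1.
# Lyapunov function V = (x - v)^2; the level set {x : V <= Gamma(v)}
# with Gamma(v) = (1 - v)^2 lies entirely inside the constraint.
dt, k, r = 0.01, 5.0, 0.95
x, v = 0.0, 0.0                    # state and governed (applied) reference
xs = []
for _ in range(4000):
    gamma = (1.0 - v) ** 2         # largest safe Lyapunov level at v
    margin = gamma - (x - v) ** 2  # dynamic safety margin
    v += dt * k * max(margin, 0.0) * np.sign(r - v)
    v = min(v, r)                  # do not run past the desired reference
    x += dt * (-x + v)             # plant step (forward Euler)
    xs.append(x)
```

The applied reference advances only while the safety margin is positive, so the state converges to the desired reference without ever violating the constraint.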

  13. USGS earthquake hazards program (EHP) GPS use case : earthquake early warning (EEW) and shake alert

    DOT National Transportation Integrated Search

    2017-03-30

    GPS Adjacent Band Workshop VI RTCA Inc., Washington D.C., 30 March 2017. USGS GPS receiver use case - Real-Time GPS for EEW -Continued: CRITICAL EFFECT - The GNSS component of the Shake Alert system augments the inertial sensors and is especial...

  14. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  15. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  16. Modal-space reference-model-tracking fuzzy control of earthquake excited structures

    NASA Astrophysics Data System (ADS)

    Park, Kwan-Soon; Ok, Seung-Yong

    2015-01-01

    This paper describes an adaptive modal-space reference-model-tracking fuzzy control technique for the vibration control of earthquake-excited structures. In the proposed approach, the fuzzy logic is introduced to update optimal control force so that the controlled structural response can track the desired response of a reference model. For easy and practical implementation, the reference model is constructed by assigning the target damping ratios to the first few dominant modes in modal space. The numerical simulation results demonstrate that the proposed approach successfully achieves not only the adaptive fault-tolerant control system against partial actuator failures but also the robust performance against the variations of the uncertain system properties by redistributing the feedback control forces to the available actuators.
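    Constructing a reference model by assigning target damping ratios to the dominant modes can be sketched with a small eigenanalysis. The stiffness values and target ratios below are assumed for illustration; this is not the authors' controller:

```python
import numpy as np

# 3-story shear frame with unit floor masses (stiffness values assumed)
k = 1000.0
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])

w2, Phi = np.linalg.eigh(K)        # K phi = w^2 phi (mass matrix = identity)
wn = np.sqrt(w2)                   # natural frequencies, rad/s (ascending)

# reference model: target damping ratios assigned to the dominant modes
zeta_target = np.array([0.20, 0.15, 0.10])
C = Phi @ np.diag(2.0 * zeta_target * wn) @ Phi.T   # damping matrix realizing them
modal_damping = np.diag(Phi.T @ C @ Phi) / (2.0 * wn)
```

Because the mode shapes are mass-orthonormal here, transforming C back to modal space recovers exactly the assigned damping ratios; with a general mass matrix the shapes would first be mass-normalized.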

  17. Structural Brain Atlases: Design, Rationale, and Applications in Normal and Pathological Cohorts

    PubMed Central

    Mandal, Pravat K.; Mahajan, Rashima; Dinov, Ivo D.

    2015-01-01

    Structural magnetic resonance imaging (MRI) provides anatomical information about the brain in healthy as well as in diseased conditions. Functional MRI (fMRI), on the other hand, provides information on brain activity during performance of a specific task. Analysis of fMRI data requires registration of the data to a reference brain template in order to identify the activated brain regions. Brain templates also find application in other neuroimaging modalities, such as diffusion tensor imaging and multi-voxel spectroscopy. Further, because there are differences (e.g., in brain shape and size) between the brains of populations of different origin, and in diseased conditions such as Alzheimer’s disease (AD), population- and disease-specific brain templates may be crucial for accurate registration and subsequent analysis of fMRI and other neuroimaging data. This manuscript provides a comprehensive review of the history, construction, and application of brain atlases. A chronological outline of the development of brain template design, from the Talairach and Tournoux atlas to the Chinese brain template (to date), along with their respective detailed construction protocols, provides the backdrop to this manuscript. The manuscript also provides an automated workflow-based protocol for designing a population-specific brain atlas from structural MRI data using the LONI Pipeline graphical workflow environment. We conclude by discussing the scope of brain templates as a research tool and their application in various neuroimaging modalities. PMID:22647262

  18. Combined GPS and seismic monitoring of a 12-story structure in a region of induced seismicity in Oklahoma

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Soliman, M.; Kim, H.; Jaiswal, P.; Saunders, J. K.; Vernon, F.; Zhang, W.

    2017-12-01

    This work focuses on quantifying ground motions and their effects in Oklahoma near the location of the 2016 Mw 5.8 Pawnee earthquake, where seismicity has been increasing due to wastewater injection related to oil and natural gas production. Much of the building inventory in Oklahoma was constructed before the increase in seismicity and before the implementation of earthquake design and detailing provisions for reinforced concrete (RC) structures. We will use combined GPS/seismic monitoring techniques to measure ground motion in the field and the response of structures to this ground motion. Several Oklahoma State University buildings experienced damage due to the Pawnee earthquake. The USGS Shake Map product estimated peak ground acceleration (PGA) ranging from 0.12g to 0.15g at campus locations. We are deploying a high-rate GPS sensor and accelerometer on the roof and another accelerometer at ground level of a 12-story RC structure and at selected field sites in order to collect ambient noise data and nearby seismicity. The longer period recording characteristics of the GPS/seismic system are particularly well adapted to monitoring these large structures in the event of a significant earthquake. Gross characteristics of the structural system are described, which consists of RC columns and RC slabs in all stories. We conducted a preliminary structural analysis including modal analysis and response spectrum analysis based on a finite element (FE) simulation, which indicated that the period associated with the first X-axis bending, first torsional, and first Y-axis bending modes are 2.2 s, 2.1 s, and 1.8 s, respectively. Next, a preliminary analysis was conducted to estimate the range of expected deformation at the roof level for various earthquake excitations. 
The earthquake analysis shows a maximum roof displacement of 5 and 7 cm in the horizontal directions resulting from earthquake loads with PGA of 0.2g, well above the noise level of the combined GPS/seismic displacements. Another earthquake comparable to the Pawnee earthquake should be well recorded by the system. Recordings of ambient vibration data collected to date describing noise characteristics and measurement error levels will be presented. Any recordings of seismic motions will be discussed, should a significant event occur.
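    The fundamental period reported from the FE modal analysis is the kind of quantity that ambient-vibration records can verify by spectral peak picking. A minimal sketch on a synthetic roof record (the sampling rate, noise level, and record length are assumed, not the deployment's actual settings):

```python
import numpy as np

# synthetic ambient roof record: first-mode response at T1 = 2.2 s
# buried in noise, sampled at 20 Hz for 10 minutes (assumed values)
fs, T1 = 20.0, 2.2
t = np.arange(0.0, 600.0, 1.0 / fs)
rng = np.random.default_rng(2)
acc = np.sin(2.0 * np.pi * t / T1) + 0.5 * rng.normal(size=t.size)

# basic spectral peak picking on the periodogram
spec = np.abs(np.fft.rfft(acc)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
f_peak = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
T_est = 1.0 / f_peak                      # estimated fundamental period, s
```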

  19. Modeling the Interaction between Fluid Pressure and Faulting in an Earthquake Swarm at Long Valley Caldera

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Soliman, M.; Kim, H.; Jaiswal, P.; Saunders, J. K.; Vernon, F.; Zhang, W.

    2016-12-01


  20. Learning outcomes of in-person and virtual field-based geoscience instruction at Grand Canyon National Park: complementary mixed-methods analyses

    NASA Astrophysics Data System (ADS)

    Semken, S. C.; Ruberto, T.; Mead, C.; Bruce, G.; Buxner, S.; Anbar, A. D.

    2017-12-01

    Students with limited access to field-based geoscience learning can benefit from immersive, student-centered virtual-reality and augmented-reality field experiences. While no digital modalities currently envisioned can truly supplant field-based learning, they afford students access to geologically illustrative but inaccessible places on Earth and beyond. As leading producers of immersive virtual field trips (iVFTs), we investigate complementary advantages and disadvantages of iVFTs and in-person field trips (ipFTs). Settings for our mixed-methods study were an intro historical-geology class (n = 84) populated mostly by non-majors and an advanced Southwest geology class (n = 39) serving mostly majors. Both represent the diversity of our urban Southwestern research university. For the same credit, students chose either an ipFT to the Trail of Time (ToT) Exhibition at Grand Canyon National Park (control group) or an online Grand Canyon iVFT (experimental group), in the same time interval. Learning outcomes for each group were identically drawn from elements of the ToT and assessed using pre/post concept sketching and inquiry exercises. Student attitudes and cognitive-load factors for both groups were assessed pre/post using the PANAS instrument (Watson et al., 1998) and with affective surveys. Analysis of pre/post concept sketches indicated improved knowledge in both groups and classes, but more so in the iVFT group. PANAS scores from the intro class showed the ipFT students having significantly stronger (p = .004) positive affect immediately prior to the experience than the iVFT students, possibly reflecting their excitement about the trip to come. Post-experience, the two groups were no longer significantly different, possibly due to the fatigue associated with a full-day ipFT. Two lines of evidence suggest that the modalities were comparable in expected effectiveness. 
First, the information relevant for the concept sketch was specifically covered in both modalities. Second, coding using the ICAP Framework (Chi & Wylie, 2014) suggests that the modalities are qualitatively similar, with each being predominantly active or passive and rarely reaching the constructive or interactive levels. This leaves other factors such as cognitive load to explain the differential learning outcomes by modality.

  1. Semiautomatic tumor segmentation with multimodal images in a conditional random field framework.

    PubMed

    Hu, Yu-Chi; Grossberg, Michael; Mageras, Gikas

    2016-04-01

    Volumetric medical images of a single subject can be acquired using different imaging modalities, such as computed tomography, magnetic resonance imaging (MRI), and positron emission tomography. In this work, we present a semiautomatic segmentation algorithm that can leverage the synergies between different image modalities while integrating interactive human guidance. The algorithm provides a statistical segmentation framework partly automating the segmentation task while still maintaining critical human oversight. The statistical models presented are trained interactively using simple brush strokes to indicate tumor and nontumor tissues and using intermediate results within a patient's image study. To accomplish the segmentation, we construct the energy function in the conditional random field (CRF) framework. For each slice, the energy function is set using the estimated probabilities from both user brush stroke data and prior approved segmented slices within a patient study. The progressive segmentation is obtained using a graph-cut-based minimization. Although no similar semiautomated algorithm is currently available, we evaluated our method with an MRI data set from the Medical Image Computing and Computer Assisted Intervention Society multimodal brain segmentation challenge (BRATS 2012 and 2013) against a similar fully automatic method based on CRF and a semiautomatic method based on grow-cut, and our method shows superior performance.
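    The CRF energy described above combines unary terms from estimated label probabilities with a pairwise smoothness term. A toy sketch in which exhaustive search stands in for the graph cut (the probabilities and smoothness weight are invented, and the one-dimensional "slice" is far smaller than a real image):

```python
import numpy as np
from itertools import product

# toy 1-D "slice" of six pixels with tumor probabilities estimated from
# brush strokes (values invented); label 1 = tumor, 0 = background
p_tumor = np.array([0.9, 0.8, 0.7, 0.3, 0.2, 0.1])
lam = 0.5   # pairwise smoothness weight (assumed)

def energy(labels):
    labels = np.asarray(labels)
    # unary term: negative log-likelihood of each pixel's label
    unary = np.where(labels == 1, -np.log(p_tumor), -np.log(1.0 - p_tumor))
    # pairwise Potts term: penalize label changes between neighbours
    return unary.sum() + lam * np.sum(labels[:-1] != labels[1:])

# exhaustive minimization stands in for the graph cut on this tiny problem
best = min(product([0, 1], repeat=6), key=energy)
```

The minimizer places a single boundary where the probabilities cross 0.5, exactly the trade-off a graph cut resolves at scale.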

  2. Violent Interaction Detection in Video Based on Deep Learning

    NASA Astrophysics Data System (ADS)

    Zhou, Peipei; Ding, Qinghai; Luo, Haibo; Hou, Xinglin

    2017-06-01

    Violent interaction detection is of vital importance in video surveillance scenarios such as railway stations, prisons, or psychiatric centres. Existing vision-based methods rely mainly on hand-crafted features, such as statistical features of motion regions, leading to poor adaptability to other datasets. Inspired by the success of convolutional networks on general activity recognition, we construct a FightNet to represent complicated visual violent interactions. In this paper, a new input modality, the image acceleration field, is proposed to better capture motion attributes. First, each video is decomposed into RGB frames. Second, the optical flow field is computed from consecutive frames, and the acceleration field is obtained from the optical flow field. Third, FightNet is trained with three input modalities: RGB images for the spatial network, and optical flow images and acceleration images for the temporal networks. By fusing the results from the different inputs, we determine whether a video contains a violent event. To provide researchers a common ground for comparison, we have collected a violent interaction dataset (VID) containing 2314 videos: 1077 with fights and 1237 without. Comparison with other algorithms demonstrates that the proposed model achieves higher accuracy and better robustness for violent interaction detection.
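    The proposed acceleration field is derived from consecutive optical flow fields. A minimal sketch of that temporal-difference step on synthetic flow arrays (the array shapes and values are assumed; real flow would come from an optical flow estimator):

```python
import numpy as np

# hypothetical dense optical-flow fields for consecutive frame pairs,
# shape (H, W, 2) holding the (dx, dy) displacement per pixel
rng = np.random.default_rng(3)
flow_t0 = rng.normal(size=(4, 4, 2))
flow_t1 = flow_t0 + 0.1            # uniformly accelerating motion

# acceleration field: temporal difference of consecutive flow fields
accel = flow_t1 - flow_t0
accel_mag = np.linalg.norm(accel, axis=-1)   # magnitude per pixel
```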

  3. Structural system identification based on variational mode decomposition

    NASA Astrophysics Data System (ADS)

    Bagheri, Abdollah; Ozbulut, Osman E.; Harris, Devin K.

    2018-03-01

    In this paper, a new structural identification method is proposed to identify the modal properties of engineering structures based on dynamic response decomposition using the variational mode decomposition (VMD). The VMD approach is a decomposition algorithm that has been developed as a means to overcome some of the drawbacks and limitations of the empirical mode decomposition method. The VMD-based modal identification algorithm decomposes the acceleration signal into a series of distinct modal responses and their respective center frequencies, such that when combined their cumulative modal responses reproduce the original acceleration response. The decaying amplitude of the extracted modal responses is then used to identify the modal damping ratios using a linear fitting function on modal response data. Finally, after extracting modal responses from available sensors, the mode shape vector for each of the decomposed modes in the system is identified from all obtained modal response data. To demonstrate the efficiency of the algorithm, a series of numerical, laboratory, and field case studies were evaluated. The laboratory case study utilized the vibration response of a three-story shear frame, whereas the field study leveraged the ambient vibration response of a pedestrian bridge to characterize the modal properties of the structure. The modal properties of the shear frame were computed using analytical approach for a comparison with the experimental modal frequencies. Results from these case studies demonstrated that the proposed method is efficient and accurate in identifying modal data of the structures.
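    The damping-identification step, fitting a line to the logarithm of the decaying modal amplitude, can be sketched on a synthetic single-mode response (the frequency, damping ratio, and sampling rate are assumed; a real signal would first be decomposed by VMD):

```python
import numpy as np

# synthetic decomposed modal response: 2 Hz mode with 3% damping (assumed)
fn, zeta_true, fs = 2.0, 0.03, 200.0
wn = 2.0 * np.pi * fn
wd = wn * np.sqrt(1.0 - zeta_true ** 2)      # damped natural frequency
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.exp(-zeta_true * wn * t) * np.cos(wd * t)

# log decrement: linear fit to the log of successive positive peaks
peaks = [i for i in range(1, t.size - 1)
         if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0.0]
delta = -np.polyfit(np.arange(len(peaks)), np.log(x[peaks]), 1)[0]
zeta_est = delta / np.sqrt(4.0 * np.pi ** 2 + delta ** 2)
```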

  4. The Integrated Taxonomy of Health Care: Classifying Both Complementary and Biomedical Practices Using a Uniform Classification Protocol

    PubMed Central

    Porcino, Antony; MacDougall, Colleen

    2009-01-01

    Background: Since the late 1980s, several taxonomies have been developed to help map and describe the interrelationships of complementary and alternative medicine (CAM) modalities. In these taxonomies, several issues are often incompletely addressed: (1) a simple categorization process that clearly assigns a modality to a single conceptual category; (2) clear delineation of verticality, that is, a differentiation of the scale being observed, from individually applied techniques, through modalities (therapies), to whole medical systems; and (3) recognition of CAM as part of the general field of health care. Methods: Development of the Integrated Taxonomy of Health Care (ITHC) involved three stages: (1) development of a precise, uniform health glossary; (2) analysis of the extant taxonomies; and (3) use of an iterative process of classifying modalities and medical systems into categories until a failure to singularly classify a modality occurred, requiring a return to the glossary and adjustment of the classifying protocol. Results: A full vertical taxonomy was developed that includes and clearly differentiates between techniques, modalities, domains (clusters of similar modalities), systems of health care (coordinated care systems involving multiple modalities), and integrative health care. Domains are the classical primary focus of taxonomies. The ITHC has eleven domains: chemical/substance-based work, device-based work, soft tissue–focused manipulation, skeletal manipulation, fitness/movement instruction, mind–body integration/classical somatics work, mental/emotional–based work, bio-energy work based on physical manipulation, bio-energy modulation, spiritual-based work, and unique assessments. Modalities are assigned to the domains based on the primary mode of interaction with the client, according to the literature of the practitioners. 
Conclusions: The ITHC has several strengths: little interpretation is used while successfully assigning modalities to single domains; the issue of taxonomic verticality is fully resolved; and the design fully integrates the complementary health care fields of biomedicine and CAM. PMID:21589735
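    The single-assignment criterion described above (each modality must land in exactly one domain, or the protocol is revised) can be sketched as a simple loop. The `classify` helper and the example entries below are hypothetical illustrations, not the authors' actual protocol:

    ```python
    def assign_domains(modalities, classify):
        """Assign each modality to exactly one domain.

        `classify(modality)` returns the set of candidate domains under the
        current classifying protocol. Zero or multiple candidates counts as a
        classification failure, which in the ITHC method signals a return to
        the glossary and an adjustment of the protocol.
        """
        assigned, failures = {}, []
        for modality in modalities:
            candidates = classify(modality)
            if len(candidates) == 1:
                assigned[modality] = next(iter(candidates))
            else:
                failures.append(modality)
        return assigned, failures

    # Toy protocol, for illustration only:
    protocol = {
        "massage": {"soft tissue-focused manipulation"},
        "reiki": {"bio-energy modulation", "spiritual-based work"},  # ambiguous
    }
    assigned, failures = assign_domains(protocol, protocol.get)
    ```

    In this sketch the ambiguous entry is returned as a failure rather than forced into a category, mirroring the iterative refine-and-retry process the abstract describes.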

  5. High-level extracellular protein production in Bacillus subtilis using an optimized dual-promoter expression system.

    PubMed

    Zhang, Kang; Su, Lingqia; Duan, Xuguo; Liu, Lina; Wu, Jing

    2017-02-20

    We recently constructed a Bacillus subtilis strain (CCTCC M 2016536) from which we had deleted the srfC, spoIIAC, nprE, aprE and amyE genes. This strain is capable of robust recombinant protein production and amenable to high-cell-density fermentation. Because the promoter is among the factors that influence the production of target proteins, optimization of the initial promoter, PamyQ from Bacillus amyloliquefaciens, should improve protein expression in this strain. This study was undertaken to develop a new, high-level expression system in B. subtilis CCTCC M 2016536. Using the enzyme β-cyclodextrin glycosyltransferase (β-CGTase) as a reporter protein and B. subtilis CCTCC M 2016536 as the host, nine plasmids equipped with single promoters were screened using shake-flask cultivation. The plasmid containing the PamyQ' promoter produced the greatest extracellular β-CGTase activity, 24.1 U/mL. Subsequently, six plasmids equipped with dual promoters were constructed and evaluated using the same method. The plasmid containing the dual promoter PHpaII-PamyQ' produced the highest extracellular β-CGTase activity (30.5 U/mL) and was relatively glucose repressed. The dual promoter PHpaII-PamyQ' also mediated substantial extracellular pullulanase (90.7 U/mL) and α-CGTase (9.5 U/mL) expression during shake-flask cultivation, demonstrating the general applicability of this system. Finally, the production of β-CGTase using the dual-promoter PHpaII-PamyQ' system was investigated in a 3-L fermenter. Extracellular expression of β-CGTase reached 571.2 U/mL (2.5 mg/mL), demonstrating the potential of this system for industrial applications. The dual-promoter PHpaII-PamyQ' system was found to support superior expression of extracellular proteins in B. subtilis CCTCC M 2016536. This system appears generally applicable and is amenable to scale-up.

  6. Time-dependent neo-deterministic seismic hazard scenarios for the 2016 Central Italy earthquakes sequence

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Kossobokov, Vladimir; Romashkova, Leontina; Panza, Giuliano F.

    2017-04-01

    Predicting earthquakes and related ground shaking is widely recognized as among the most challenging scientific problems, owing both to its societal relevance and to its intrinsic complexity. The development of reliable forecasting tools requires their rigorous formalization and testing, first in retrospect and then in an experimental real-time mode, which implies a careful application of statistics to data sets of limited size and varying accuracy. Accordingly, the operational issues of prospective validation and use of time-dependent neo-deterministic seismic hazard scenarios are discussed, reviewing the results of their application in Italy and surrounding regions. Long-term practice and the results obtained for the Italian territory over about two decades of rigorous prospective testing support the feasibility of earthquake forecasting based on the analysis of seismicity patterns at the intermediate-term, middle-range scale. Italy is the only country worldwide where two independent, globally tested algorithms, CN and M8S, are applied simultaneously; these draw on multiple sets of seismic precursors to diagnose the intervals of time when a strong event is likely to occur inside a given region. Based on the routinely updated space-time information provided by the CN and M8S forecasts, an integrated procedure has been developed that allows for the definition of time-dependent seismic hazard scenarios through realistic modeling of ground motion by the neo-deterministic approach (NDSHA). This scenario-based methodology makes it possible to construct, at both regional and local scales, scenarios of ground motion for the time interval when a strong event is likely to occur within the alerted areas. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been routinely updated since 2006. 
The issues and results from real-time testing of the integrated NDSHA scenarios are illustrated, with special emphasis on the sequence of destructive earthquakes that struck Central Italy starting in August 2016. The results obtained so far demonstrate the validity of the proposed methodology in anticipating ground shaking from approaching strong earthquakes and show that the information provided by time-dependent NDSHA can be useful in assigning priorities for timely and effective mitigation actions.

  7. The need for supplemental breast cancer screening modalities: a perspective of population-based breast cancer screening programs in Japan.

    PubMed

    Uematsu, Takayoshi

    2017-01-01

    This article discusses possible supplemental breast cancer screening modalities for younger women with dense breasts from the perspective of population-based breast cancer screening programs in Japan. Supplemental breast cancer screening modalities have been proposed to increase the sensitivity and detection rates of early-stage breast cancer in women with dense breasts; however, no global guidelines recommend the use of supplemental screening modalities in such women, and no criterion standard exists for breast density assessment. Based on the current situation of breast imaging in Japan, the possible supplemental breast cancer screening modalities are ultrasonography, digital breast tomosynthesis, and breast magnetic resonance imaging. An appropriate population-based breast cancer screening program based on the balance between cost and benefit should be a high priority, and further research based on evidence-based medicine is encouraged. The ethnicity, workforce, workflow, and resources for breast cancer screening in each country should be taken into account when evaluating supplemental breast cancer screening modalities for women with dense breasts.

  8. Using Comprehensive Science-based Disaster Scenarios to Support Seismic Safety Policy: A Case Study in Los Angeles, California

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2014-12-01

    In 2014, the USGS entered a technical assistance agreement with the City of Los Angeles to apply the results of the 2008 ShakeOut Scenario of a M7.8 earthquake on the southern San Andreas fault to develop a comprehensive plan to increase the seismic resilience of the City. The results of this project are to be submitted to the Mayor of Los Angeles at the Great ShakeOut on October 16, 2014. The ShakeOut Scenario detailed how the expected cascade of failures in a big earthquake could lead to significant delays in disaster recovery and to financial losses that greatly exceed the direct losses in the event. The goals of the seismic resilience plan are to: (1) protect the lives of residents during earthquakes; (2) improve the capacity of the City to respond to the earthquake; and (3) prepare the City to recover quickly after the earthquake so as to protect the economy of the City and of all of southern California. To accomplish these goals, the project addresses three areas of seismic vulnerability identified in the original ShakeOut Scenario: (1) pre-1980 buildings that present an unacceptable risk to the lives of residents, including "non-ductile reinforced concrete" and "soft-first-story" buildings; (2) water system infrastructure (including the impact on firefighting capability); and (3) communications infrastructure. The critical science needed to support policy decisions is understanding the probable consequences for the regional long-term economy of decisions to undertake (or not) different levels of mitigation. The arguments against mitigation are the immediate financial costs, so a better understanding of the eventual benefit is required. However, the direct savings rarely justify the mitigation costs, so the arguments in favor of mitigation are driven by the potential for cascading failures and the potential to trigger the type of long-term reduction in population and economic activity that has occurred in New Orleans since Hurricane Katrina.

  9. Arsenic and fluoride removal from contaminated drinking water with Haix-Fe-Zr and Haix-Zr resin beads.

    PubMed

    Phillips, Debra H; Sen Gupta, Bhaskar; Mukhopadhyay, Soumyadeep; Sen Gupta, Arup K

    2018-06-01

    The objective of the study was to carry out batch tests to examine the effectiveness of Haix-Fe-Zr and Haix-Zr resin beads in the removal of As(III), As(V) and F- from groundwater with a geochemistry similar to that of a site in West Bengal, India, where a community-based drinking water plant has been installed. The groundwater was spiked separately with ∼200 μg/L As(III) and As(V) and 5 mg/L F-. Haix-Zr resin beads were more effective than Haix-Fe-Zr resin beads in removing As(III) and As(V), and showed higher removal of As(V) than of As(III). Haix-Zr resin beads reduced As(V) below the WHO drinking water standard (10 μg/L), to 8.79 μg/L, after 4 h of shaking, while As(III) was reduced to 7.72 μg/L after 8 h of shaking. Haix-Fe-Zr resin beads were more effective than Haix-Zr resin beads in removing F- from the spiked groundwater. After 15 min of shaking with Haix-Fe-Zr resin beads, F- concentrations decreased from 6.27 mg/L to 1.26 mg/L, below the WHO drinking water standard for F- (1.5 mg/L). After 20 min of shaking with Haix-Zr resin beads, F- concentrations decreased from 6.27 mg/L to 1.43 mg/L. For the removal of As(III), As(V), and F- from the groundwater, both Haix-Fe-Zr and Haix-Zr resin beads fit the parabolic diffusion equation (PDE), suggesting that adsorption of these contaminants was consistent with inter-particle diffusion. Copyright © 2018 Elsevier Ltd. All rights reserved.
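    The parabolic diffusion equation mentioned in the abstract is commonly written q(t) = a + R·t^(1/2), i.e. uptake is linear in the square root of contact time. A minimal sketch of fitting it to uptake data follows; the numbers are synthetic, not the study's measurements:

    ```python
    import math

    def fit_parabolic_diffusion(t, q):
        """Least-squares fit of the parabolic diffusion model q = a + R*sqrt(t).

        Performs a linear regression of uptake q against sqrt(t) and
        returns the intercept a and rate constant R.
        """
        x = [math.sqrt(ti) for ti in t]
        n = len(t)
        mean_x, mean_q = sum(x) / n, sum(q) / n
        R = sum((xi - mean_x) * (qi - mean_q) for xi, qi in zip(x, q)) \
            / sum((xi - mean_x) ** 2 for xi in x)
        a = mean_q - R * mean_x
        return a, R

    # Synthetic uptake data that follows the model exactly (minutes, mg/g):
    t = [5.0, 10.0, 15.0, 20.0, 30.0, 60.0]
    q = [0.2 + 0.8 * math.sqrt(ti) for ti in t]
    a, R = fit_parabolic_diffusion(t, q)
    ```

    A straight-line fit of q against √t with a high coefficient of determination is what "fit the parabolic diffusion equation" typically refers to in sorption kinetics studies.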

  10. Secondary analysis of the "Love Me...Never Shake Me" SBS education program.

    PubMed

    Deyo, Grace; Skybo, Theresa; Carroll, Alisa

    2008-11-01

    Shaken baby syndrome (SBS) is preventable; however, an estimated 21-74 per 100,000 children worldwide are victims annually. This study examined the effectiveness of an SBS prevention program in the US. A descriptive, secondary analysis of the Prevent Child Abuse Ohio (PCAO) "Love Me...Never Shake Me" SBS education program database included 7,051 women who completed a commitment statement, pre- and post-test, and follow-up survey. Participants were mostly White (76%), had at least some college education (62%), were privately insured (62%), and lived with the father and infant (63%). Mothers knew of the dangers of shaking (96%) and recommended SBS education for all parents (98%) because they found it helpful (97%). Scores on the pre- and post-tests were significantly different, but there was no difference based on education site or demographics. There was a significant increase on a pre/post-test item pertaining to infant crying. At follow-up, participants remembered postpartum SBS education (98%), but most (62%) did not receive SBS education from their primary care provider after discharge. Most mothers practiced the infant soothing techniques (79%) provided in the education; however, few women practiced self-coping techniques (36%) or accessed community support services (9%). Postpartum SBS prevention education should continue, and development of SBS programs should draw on these study findings, focusing on education content and program evaluation. Mothers report that SBS education is important for all parents and memorable at follow-up; postpartum education should continue because the hospital is the primary place they receive it, and mothers report that they less frequently receive education from healthcare sources post-discharge. Diligence by primary care providers in incorporating SBS prevention education into well-child visits will increase parental exposure to this information. Education may need to place greater emphasis on infant crying and soothing, as well as parent support and self-coping techniques, rather than only the dangers of shaking.

  11. Introducing Students to Structural Dynamics and Earthquake Engineering

    ERIC Educational Resources Information Center

    Anthoine, Armelle; Marazzi, Francesco; Tirelli, Daniel

    2010-01-01

    The European Laboratory for Structural Assessment (ELSA) is one of the world's main laboratories for seismic studies. Besides its research activities, it also aims to bring applied science closer to the public. This article describes teaching activities based on a demonstration shaking table which is used to introduce the structural dynamics of…

  12. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard

    NASA Astrophysics Data System (ADS)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.

    2016-12-01

    The Cascadia Subduction Zone (CSZ) is capable of generating earthquakes as powerful as moment magnitude 9, which would cause great damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4 and M8.1 scenarios, which present persistent, long-lasting ground shaking together with other geological threats such as landslides, liquefaction-induced ground deformations, fault rupture vertical displacement, and tsunamis. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines and other lifelines. Lifeline providers in Oregon, including the private and public entities responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three-to-five-minute-long earthquake scenario expected in Oregon necessitates a very different method of risk mitigation for these major lifelines than those created for the shorter shaking of crustal earthquakes. A web-based geographic information system tool was developed to fully assess the potential hazard from the multiple threats impending from Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access over the web to the latest and best available hazard information, including work completed for the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to make appropriate decisions, and it requires only minimal knowledge of GIS to use.

  13. Comparison of two self-directed weight loss interventions: Limited weekly support vs. no outside support.

    PubMed

    Smith, Bryan K; Van Walleghen, Emily L; Cook-Wiens, Galen; Martin, Rachael N; Curry, Chelsea R; Sullivan, Debra K; Gibson, Cheryl A; Donnelly, Joseph E

    2009-08-01

    The purpose of this study was to compare the efficacy of two home-based weight loss interventions that differed only in the amount of outside support provided. This was a 12-week, randomized, controlled trial. One group received limited support (LWS, n = 35) via a single 10-min phone call each week, while another group received no weekly support (NWS, n = 28). Both the LWS and NWS groups received pre-packaged meals (PM) and shakes. A third group served as a control (CON, n = 30) and received no components of the intervention. Weight loss at 12 weeks was the primary outcome. Diet (PM, shake, and fruit/vegetable (F/V) intake) and physical activity (PA) were self-monitored, recorded daily, and reported weekly. An exit survey was completed by participants in the intervention groups upon completion of the study. Weight loss and percent weight loss in the LWS, NWS, and CON groups were 7.7 ± 4.4 kg (8.5 ± 4.2%), 5.9 ± 4.1 kg (6.0 ± 4.2%), and 0.3 ± 1.9 kg (0.4 ± 1.2%), respectively. The decrease in body weight and percent weight loss was significantly greater in the LWS and NWS groups than in the CON group, and percent weight loss was significantly greater in the LWS group than in both the NWS and CON groups. A home-based weight loss program utilizing PM and shakes results in clinically significant percent weight loss, and the addition of a brief weekly call promotes additional percent weight loss. © 2009 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  14. Moral injury: A new challenge for complementary and alternative medicine.

    PubMed

    Kopacz, Marek S; Connery, April L; Bishop, Todd M; Bryan, Craig J; Drescher, Kent D; Currier, Joseph M; Pigeon, Wilfred R

    2016-02-01

    Moral injury represents an emerging clinical construct recognized as a source of morbidity in current and former military personnel. Finding effective ways to support those affected by moral injury remains a challenge for both biomedical and complementary and alternative medicine. This paper introduces the concept of moral injury and suggests two complementary and alternative medicine modalities, pastoral care and mindfulness, which may prove useful in supporting military personnel thought to be dealing with moral injury. Research strategies for developing an evidence base for applying these, and other, complementary and alternative medicine modalities to moral injury are discussed. Published by Elsevier Ltd.

  15. Dynamics of the McDonnell Douglas Large Scale Dynamic Rig and Dynamic Calibration of the Rotor Balance

    DOT National Transportation Integrated Search

    1994-10-01

    A shake test was performed on the Large Scale Dynamic Rig in the 40- by 80-Foot Wind Tunnel in support of the McDonnell Douglas Advanced Rotor Technology (MDART) Test Program. The shake test identifies the hub modes and the dynamic calibration matrix...

  16. Shaking up Expectations: The OCLS Shake It! App

    ERIC Educational Resources Information Center

    Shivers, Cassandra

    2012-01-01

    The author, a digital access architect in the information systems department of the Orange County Library System in Florida, was given the challenge of creating a library mobile app around the 2009 holiday season. At that time, Sheri Chambers, digital content manager in the information systems department, and Debbie Moss, assistant director of the…

  17. PHYSICAL AND BIOLOGICAL PARAMETERS THAT DETERMINE THE FATE OF 'P'-CHLOROPHENOL IN LABORATORY TEST SYSTEMS

    EPA Science Inventory

    Shake flask and microcosm studies were conducted to determine the fate of parachlorophenol (p-CP) in water and sediment systems and the role of sediment and nonsediment surfaces in the biodegradation process. Biodegradation of p-CP in estuarine water samples in shake flasks was s...

  18. Shaking the Tree, Making a Rhizome: Towards a Nomadic Geophilosophy of Science Education

    ERIC Educational Resources Information Center

    Gough, Noel

    2006-01-01

    This essay enacts a philosophy of science education inspired by Gilles Deleuze and Felix Guattari's figurations of rhizomatic and nomadic thought. It imagines rhizomes shaking the tree of modern Western science and science education by destabilising arborescent conceptions of knowledge as hierarchically articulated branches of a central stem or…

  19. Bucket shaking stops bunch dancing in Tevatron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A.; Tan, C.Y.; /Fermilab

    2011-03-01

    Bunches in the Tevatron are known to be longitudinally unstable: their collective oscillations, also called dancing bunches, persist without any signs of decay. Typically, a damper is used to stop these oscillations, but recently it was theoretically predicted that the oscillations can be stabilized by means of small bucket shaking. Dedicated measurements in the Tevatron have shown that this method does stop the dancing. According to the predictions of Refs. [2,3], flattening the bunch distribution at low amplitudes should make the bunch more stable against LLD. An experiment has been devised to flatten the distribution by modulating the RF phase at the low-amplitude synchrotron frequency for a few degrees of amplitude. These beam studies show that stabilisation really happens. After several consecutive shakings, the dancing disappears and the resulting bunch profile becomes smoother at the top. Although not shown in this report, sometimes a little divot forms at the centre of the distribution. These experiments confirm that resonant RF shaking flattens the bunch distribution at low amplitudes, and the dancing stops.
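    The flattening procedure described, modulating the RF phase at the low-amplitude synchrotron frequency with an amplitude of a few degrees, amounts to applying a sinusoidal phase offset to the RF drive. A sketch of generating such a drive signal follows; the frequency and amplitude values are illustrative only, not Tevatron machine parameters:

    ```python
    import math

    def rf_phase_modulation(f_s, amp_deg, duration, rate):
        """Sinusoidal RF phase offset phi(t) = A * sin(2*pi*f_s*t).

        f_s: modulation frequency in Hz (set near the low-amplitude
        synchrotron frequency); amp_deg: modulation amplitude in degrees
        (the report quotes 'a few degrees'); duration in s; rate in
        samples/s. Returns the sampled phase-offset waveform.
        """
        n = int(duration * rate)
        return [amp_deg * math.sin(2 * math.pi * f_s * i / rate)
                for i in range(n)]

    # Illustrative drive: 2-degree modulation at an assumed 85 Hz
    # synchrotron frequency, sampled at 10 kHz for 0.1 s:
    drive = rf_phase_modulation(f_s=85.0, amp_deg=2.0, duration=0.1, rate=10000)
    ```

    Driving at the synchrotron frequency resonantly diffuses the low-amplitude particles, which is what flattens the core of the bunch distribution.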

  20. Abusive head trauma and the triad: a critique on behalf of RCPCH of 'Traumatic shaking: the role of the triad in medical investigations of suspected traumatic shaking'.

    PubMed

    Debelle, Geoffrey David; Maguire, Sabine; Watts, Patrick; Nieto Hernandez, Rosa; Kemp, Alison Mary

    2018-06-01

    The Swedish Agency for Health Technology Assessment and Assessment of Social Services (SBU) has recently published what it presents as a systematic review of the literature on 'isolated traumatic shaking' in infants, concluding that 'there is limited evidence that the so-called triad (encephalopathy, subdural haemorrhage, retinal haemorrhage) and therefore its components can be associated with traumatic shaking'. This flawed report, from a national body, demands a robust response, as its conclusions have the potential to undermine medico-legal practice. We have conducted a critique of the methodology used in the SBU review and have found it to be flawed, to the extent that children's lives may be put at risk. We therefore call for this review to be withdrawn or subjected to international scrutiny. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. MyShake: A smartphone seismic network for earthquake early warning and beyond

    PubMed Central

    Kong, Qingkai; Allen, Richard M.; Schreier, Louis; Kwon, Young-Woo

    2016-01-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics. PMID:26933682
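    A classic way to separate impulsive, earthquake-like shaking from steady background motion in an acceleration stream is a short-term-average / long-term-average (STA/LTA) trigger. The sketch below is a generic seismological heuristic offered for illustration; it is not the actual MyShake on-phone classifier, which the authors developed specifically to reject everyday shakes:

    ```python
    def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
        """Flag sample indices where the short-term average amplitude
        jumps well above the long-term average (STA/LTA detector)."""
        triggers = []
        for i in range(lta_len, len(samples)):
            sta = sum(abs(s) for s in samples[i - sta_len:i]) / sta_len
            lta = sum(abs(s) for s in samples[i - lta_len:i]) / lta_len
            if lta > 0 and sta / lta > threshold:
                triggers.append(i)
        return triggers

    # Quiet background followed by a sudden strong burst at index 200:
    stream = [0.01] * 200 + [1.0] * 50
    hits = sta_lta_trigger(stream)
    ```

    The short window reacts quickly to the burst while the long window still reflects the quiet background, so the ratio spikes at the onset; phone-based systems must additionally reject steps, drops, and pocket motion, which is why MyShake uses a trained classifier rather than a simple ratio.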

  2. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    PubMed

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.

  3. Evaluation of the shaking technique for the economic management of American foulbrood disease of honey bees (Hymenoptera: Apidae).

    PubMed

    Pernal, Stephen F; Albright, Robert L; Melathopoulos, Andony P

    2008-08-01

    Shaking is a nonantibiotic management technique for the bacterial disease American foulbrood (AFB) (Paenibacillus larvae sensu Genersch et al.), in which infected nesting comb is destroyed and the adult honey bees, Apis mellifera L. (Hymenoptera: Apidae), are transferred onto uncontaminated nesting material. We hypothesized that colonies shaken onto frames of uninfected drawn comb would show reductions in AFB symptoms and bacterial spore loads similar to those of colonies shaken onto frames of foundation, but would attain higher levels of production. We observed that colonies shaken onto drawn comb, or a combination of foundation and drawn comb, exhibited light transitory AFB infections, whereas colonies shaken onto frames containing only foundation failed to exhibit clinical symptoms. Furthermore, concentrations of P. larvae spores in honey and adult worker bees sampled from colonies shaken onto all comb and foundation treatments declined over time and were undetectable in adult bee samples 3 mo after shaking. In contrast, colonies that were reestablished on the original infected comb remained heavily infected, resulting in consistently high levels of spores and, eventually, their death. In a subsequent experiment, the production of colonies shaken onto foundation was compared with that of colonies established from package (bulk) bees and that of overwintered colonies. Economic analysis showed shaking to be 24% more profitable than using package bees. These results suggest that shaking bees onto frames of foundation in the spring is a feasible option for managing AFB in commercial beekeeping operations where antibiotic use is undesirable or prohibited.

  4. Effect of the consumption of a new symbiotic shake on glycemia and cholesterol levels in elderly people with type 2 diabetes mellitus

    PubMed Central

    2012-01-01

    Background: The consumption of foods containing probiotic and prebiotic ingredients is growing consistently every year, yet few studies have investigated their effects in the elderly. Objective: The objective of this study was to evaluate the effect of the consumption of a symbiotic shake containing Lactobacillus acidophilus, Bifidobacterium bifidum and fructooligosaccharides on glycemia and cholesterol levels in elderly people. Methods: A randomized, double-blind, placebo-controlled study was conducted on twenty volunteers (ten in the placebo group and ten in the symbiotic group), aged 50 to 60 years. The criteria for inclusion in the study were: total cholesterol > 200 mg/dL; triglycerides > 200 mg/dL; and glycemia > 110 mg/dL. Over a total test period of 30 days, 10 individuals (the symbiotic group) consumed a daily dose of 200 mL of a symbiotic shake containing 10^8 UFC/mL Lactobacillus acidophilus, 10^8 UFC/mL Bifidobacterium bifidum and 2 g oligofructose, while the 10 other volunteers (the placebo group) drank daily the same amount of a shake that did not contain any symbiotic bacteria. Blood samples were collected 15 days prior to the start of the experiment and at 10-day intervals after the beginning of the shake intake. The standard lipid profile (total cholesterol, triglycerides and HDL cholesterol) and glycemia were evaluated by an enzyme colorimetric assay. Results: The symbiotic group showed a non-significant reduction (P > 0.05) in total cholesterol and triglycerides, a significant increase (P < 0.05) in HDL cholesterol and a significant reduction (P < 0.05) in fasting glycemia. No significant changes were observed in the placebo group. Conclusion: The consumption of the symbiotic shake resulted in a significant increase in HDL and a significant decrease in glycemia. Trial Registration: ClinicalTrials.gov: NCT00123456 PMID:22356933

  5. Effect of the consumption of a new symbiotic shake on glycemia and cholesterol levels in elderly people with type 2 diabetes mellitus.

    PubMed

    Moroti, Camila; Souza Magri, Loyanne Francine; de Rezende Costa, Marcela; Cavallini, Daniela C U; Sivieri, Katia

    2012-02-22

    The consumption of foods containing probiotic and prebiotic ingredients is growing consistently every year, yet few studies have investigated their effects in the elderly. The objective of this study was to evaluate the effect of the consumption of a symbiotic shake containing Lactobacillus acidophilus, Bifidobacterium bifidum and fructooligosaccharides on glycemia and cholesterol levels in elderly people. A randomized, double-blind, placebo-controlled study was conducted on twenty volunteers (ten in the placebo group and ten in the symbiotic group), aged 50 to 60 years. The criteria for inclusion in the study were: total cholesterol > 200 mg/dL; triglycerides > 200 mg/dL; and glycemia > 110 mg/dL. Over a total test period of 30 days, 10 individuals (the symbiotic group) consumed a daily dose of 200 mL of a symbiotic shake containing 10^8 UFC/mL Lactobacillus acidophilus, 10^8 UFC/mL Bifidobacterium bifidum and 2 g oligofructose, while the 10 other volunteers (the placebo group) drank daily the same amount of a shake that did not contain any symbiotic bacteria. Blood samples were collected 15 days prior to the start of the experiment and at 10-day intervals after the beginning of the shake intake. The standard lipid profile (total cholesterol, triglycerides and HDL cholesterol) and glycemia were evaluated by an enzyme colorimetric assay. The symbiotic group showed a non-significant reduction (P > 0.05) in total cholesterol and triglycerides, a significant increase (P < 0.05) in HDL cholesterol and a significant reduction (P < 0.05) in fasting glycemia. No significant changes were observed in the placebo group. The consumption of the symbiotic shake resulted in a significant increase in HDL and a significant decrease in glycemia.

  6. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings with high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregation of the population exposure within different building types, and (3) estimation of the casualties from the collapse of vulnerable buildings.
Thus, the composition of the building stock, its relative vulnerability, and its distribution are vital components in determining the extent of casualties during an earthquake. It is evident from large, deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the 80 percent adobe and/or non-engineered masonry building stock with poor lateral load resisting systems in Iran succumbs even to moderate levels of ground shaking. Consequently, the heavy death toll of the 2003 Bam, Iran earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia earthquake (Bertero, 1989); weak masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007). 
Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and that unreinforced masonry buildings are one of the mos
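The three-step casualty chain described in this record (shaking hazard, exposure per building type, collapse casualties) can be sketched as a simple aggregation. All rates and populations below are invented for illustration; they are not PAGER's actual values.

```python
# Hypothetical PAGER-style casualty estimate:
# casualties = (people exposed in each building type)
#            x (collapse fraction at the estimated shaking level)
#            x (fatality rate given collapse).

def expected_casualties(exposure, collapse_frac, fatality_given_collapse):
    """Aggregate expected casualties over building types."""
    total = 0.0
    for btype, people in exposure.items():
        total += people * collapse_frac[btype] * fatality_given_collapse[btype]
    return total

# Illustrative numbers only
exposure = {"adobe": 50_000, "unreinforced_masonry": 30_000, "wood_frame": 100_000}
collapse_frac = {"adobe": 0.30, "unreinforced_masonry": 0.15, "wood_frame": 0.01}
fatality_given_collapse = {"adobe": 0.10, "unreinforced_masonry": 0.08, "wood_frame": 0.02}

print(expected_casualties(exposure, collapse_frac, fatality_given_collapse))
```

The dominance of the adobe term in this toy example mirrors the record's point: the casualty total is controlled by the most vulnerable fraction of the inventory, which is why a global building inventory matters.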

  7. Developing a MATLAB®-Based Tool for Visualization and Transformation

    NASA Technical Reports Server (NTRS)

    Anderton, Blake J.

    2003-01-01

    An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure or analyzing the range of input frequencies a particular structure can handle. When setting up such a representative model, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made at many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the modal shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, analytic and experimental results must be compared between the two models. However, such comparisons become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. This problem points to the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows a transformation of the image's coordinate geometry so that one model's coordinate orientation can be made to match the orientation of another model. 
Such a tool should also be designed so that it can construct coordinate geometry from many different listings of node locations and can transform the wireframe coordinate orientation to match almost any possible orientation (i.e., it should not be a problem-specific application) if it is to be of much value in modal analysis. Also, since universal files are used to store modal parameters and wireframe geometry, the tool must be able to read and extract information from universal files and use these files to exchange model data. The purpose of this project is to develop such a tool as a computer graphical user interface (GUI) capable of performing the following tasks: 1) browsing for a particular universal file within the computer directory and displaying the name of this file on the screen; 2) plotting each of the nodes within the universal file in a useful, descriptive, and easily understood figure; 3) reading the node numbers from the selected file and listing them for user selection in an easily accessible format; 4) allowing user selection of a new model orientation defined by three selected nodes; and 5) allowing the user to specify a directory to which the transformed model's node locations will be saved, and saving the transformed node locations to the specified file.
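The core of task 4 — defining a new orientation from three selected nodes and re-expressing all node coordinates in that frame — can be sketched in a few lines. This is a minimal pure-Python illustration of one common convention (origin at the first node, x-axis toward the second, z-axis normal to the plane of the three); the record does not specify which convention the actual tool uses.

```python
# Re-orient wireframe nodes into a frame defined by three selected nodes.
def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def unit(a):
    n = (a[0]**2 + a[1]**2 + a[2]**2) ** 0.5
    return (a[0]/n, a[1]/n, a[2]/n)
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def frame_from_nodes(p0, p1, p2):
    """Orthonormal frame: origin at p0, x-axis toward p1, z normal to plane (p0,p1,p2)."""
    x = unit(sub(p1, p0))
    z = unit(cross(x, sub(p2, p0)))
    y = cross(z, x)          # completes the right-handed triad
    return p0, (x, y, z)

def transform_node(p, origin, axes):
    """Coordinates of node p expressed in the new frame."""
    d = sub(p, origin)
    return tuple(dot(d, axis) for axis in axes)

origin, axes = frame_from_nodes((0, 0, 0), (2, 0, 0), (0, 3, 0))
print(transform_node((1, 1, 0), origin, axes))  # -> (1.0, 1.0, 0.0)
```

Applying `transform_node` to every node in one model's wireframe expresses it in the other model's orientation, which is exactly what makes analytic-vs-experimental overlays comparable.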

  8. Modal-Power-Based Haptic Motion Recognition

    NASA Astrophysics Data System (ADS)

    Kasahara, Yusuke; Shimono, Tomoyuki; Kuwahara, Hiroaki; Sato, Masataka; Ohnishi, Kouhei

    Motion recognition based on sensory information is important for robots that assist humans. Several studies have been carried out on motion recognition based on image information. However, human contact with an object cannot be evaluated precisely by image-based recognition, because force information is essential for describing contact motion. In this paper, modal-power-based haptic motion recognition is proposed; modal power is considered to reveal information on both position and force, and to be one of the defining features of human motion. A motion recognition algorithm based on linear discriminant analysis is proposed to distinguish between similar motions. Haptic information is extracted using a bilateral master-slave system. Then, the observed motion is decomposed in terms of primitive functions in a modal space. The experimental results show the effectiveness of the proposed method.
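The classification step can be illustrated with a minimal two-class Fisher linear discriminant on invented "modal power" feature vectors; the feature values and motion names below are hypothetical, not taken from the paper, and the real system derives its features from bilaterally measured haptic signals decomposed in a modal space.

```python
# Minimal two-class Fisher LDA on hypothetical modal-power features (2-D).
def mean(xs):
    n = len(xs)
    return [sum(x[i] for x in xs) / n for i in range(len(xs[0]))]

def within_scatter(xs, m):
    """2x2 within-class scatter matrix."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x in xs:
        d = [x[0] - m[0], x[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(class_a, class_b):
    """Projection direction w = Sw^-1 (mean_a - mean_b)."""
    ma, mb = mean(class_a), mean(class_b)
    sa, sb = within_scatter(class_a, ma), within_scatter(class_b, mb)
    sw = [[sa[i][j] + sb[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

# Two similar motions, distinguished by their modal-power signatures
push = [[1.0, 0.2], [1.2, 0.1], [0.9, 0.3]]
slide = [[0.3, 1.1], [0.2, 0.9], [0.4, 1.0]]
w = fisher_direction(push, slide)
score = lambda x: w[0] * x[0] + w[1] * x[1]
print(score([1.1, 0.2]) > score([0.3, 1.0]))  # True: push scores higher along w
```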

  9. Fine-scale delineation of the location of and relative ground shaking within the San Andreas Fault zone at San Andreas Lake, San Mateo County, California

    USGS Publications Warehouse

    Catchings, R.D.; Rymer, M.J.; Goldman, M.R.; Prentice, C.S.; Sickler, R.R.

    2013-01-01

    The San Francisco Public Utilities Commission is seismically retrofitting the water delivery system at San Andreas Lake, San Mateo County, California, where the reservoir intake system crosses the San Andreas Fault (SAF). The near-surface fault location and geometry are important considerations in the retrofit effort. Because the SAF trends through highly distorted Franciscan mélange and beneath much of the reservoir, the exact trace of the 1906 surface rupture is difficult to determine from surface mapping at San Andreas Lake. Based on surface mapping, it also is unclear if there are additional fault splays that extend northeast or southwest of the main surface rupture. To better understand the fault structure at San Andreas Lake, the U.S. Geological Survey acquired a series of seismic imaging profiles across the SAF at San Andreas Lake in 2008, 2009, and 2011, when the lake level was near historical lows and the surface traces of the SAF were exposed for the first time in decades. We used multiple seismic methods to locate the main 1906 rupture zone and fault splays within about 100 meters northeast of the main rupture zone. Our seismic observations are internally consistent, and our seismic indicators of faulting generally correlate with fault locations inferred from surface mapping. We also tested the accuracy of our seismic methods by comparing our seismically located faults with surface ruptures mapped by Schussler (1906) immediately after the April 18, 1906 San Francisco earthquake of approximate magnitude 7.9; our seismically determined fault locations were highly accurate. Near the reservoir intake facility at San Andreas Lake, our seismic data indicate the main 1906 surface rupture zone consists of at least three near-surface fault traces. 
Movement on multiple fault traces can have appreciable engineering significance because, unlike movement on a single strike-slip fault trace, differential movement on multiple fault traces may exert compressive and extensional stresses on built structures within the fault zone. Such differential movement and resulting distortion of built structures appear to have occurred between fault traces at the gatewell near the southern end of San Andreas Lake during the 1906 San Francisco earthquake (Schussler, 1906). In addition to the three fault traces within the main 1906 surface rupture zone, our data indicate at least one additional fault trace (or zone) about 80 meters northeast of the main 1906 surface rupture zone. Because ground shaking also can damage structures, we used fault-zone guided waves to investigate ground shaking within the fault zones relative to ground shaking outside the fault zones. Peak ground velocity (PGV) measurements from our guided-wave study indicate that ground shaking is greater at each of the surface fault traces, varying with the frequency of the seismic data and the wave type (P versus S). S-wave PGV increases by as much as 5–6 times at the fault traces relative to areas outside the fault zone, and P-wave PGV increases by as much as 3–10 times. Assuming shaking increases linearly with increasing earthquake magnitude, these data suggest strong shaking may pose a significant hazard to built structures that extend across the fault traces. Similarly complex fault structures likely underlie other strike-slip faults (such as the Hayward, Calaveras, and Silver Creek Faults) that intersect structures of the water delivery system, and these fault structures similarly should be investigated.
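The PGV comparison described above reduces to a simple computation: PGV is the maximum absolute value of a ground velocity time series, and the fault-zone amplification is the ratio of PGV on a fault trace to PGV outside the zone. The traces below are synthetic stand-ins, not the study's data.

```python
# Peak ground velocity (PGV) and fault-zone amplification, sketched on
# invented velocity records (units: cm/s).

def pgv(velocity_series):
    """Peak ground velocity: max absolute value of the series."""
    return max(abs(v) for v in velocity_series)

outside = [0.0, 0.4, -0.6, 0.5, -0.3]    # hypothetical S-wave record, off-fault
on_trace = [0.0, 2.1, -3.0, 2.4, -1.2]   # same event, on a fault trace

amplification = pgv(on_trace) / pgv(outside)
print(amplification)  # ~5x, within the 5-6x S-wave range reported above
```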

  10. Mapping PetaSHA Applications to TeraGrid Architectures

    NASA Astrophysics Data System (ADS)

    Cui, Y.; Moore, R.; Olsen, K.; Zhu, J.; Dalguer, L. A.; Day, S.; Cruz-Atienza, V.; Maechling, P.; Jordan, T.

    2007-12-01

    The Southern California Earthquake Center (SCEC) has a science program developing an integrated cyberfacility - PetaSHA - for executing physics-based seismic hazard analysis (SHA) computations. The NSF has awarded PetaSHA 15 million allocation service units this year on the fastest supercomputers available within the NSF TeraGrid. However, one size does not fit all: a range of systems is needed to support this effort at different stages of the simulations. Enabling PetaSHA simulations on these TeraGrid architectures to solve both dynamic rupture and seismic wave propagation has been a challenge at both the hardware and software levels, requiring an adaptation procedure to meet the specific requirements of each architecture. It is important to determine how fundamental system attributes affect application performance. We present an adaptive approach in our PetaSHA application that enables the simultaneous optimization of both computation and communication at run-time using flexible settings. These techniques optimize initialization, source/media partitioning, and MPI-IO output in different ways to achieve optimal performance on the target machines. The resulting code is a factor of four faster than the original version. New MPI-IO capabilities have been added for the accurate Staggered-Grid Split-Node (SGSN) method for dynamic rupture propagation in the velocity-stress staggered-grid finite difference scheme (Dalguer and Day, JGR, 2007). We use execution workflows across TeraGrid sites for managing the resulting data volumes. Our lessons learned indicate that minimizing time to solution is most critical, in particular when scheduling large-scale simulations across supercomputer sites. The TeraShake platform has been ported to multiple architectures including the TACC Dell clusters Lonestar and Abe, the Cray XT3 BigBen, and Blue Gene/L. Parallel efficiency of 96% with the PetaSHA application Olsen-AWM has been demonstrated on 40,960 Blue Gene/L processors at the IBM T.J. Watson Center. 
Notable accomplishments using the optimized code include the M7.8 ShakeOut rupture scenario, part of the southern San Andreas Fault evaluation (SoSAFE). The ShakeOut simulation domain is the same as that used for the SCEC TeraShake simulations (600 km by 300 km by 80 km). However, the higher resolution of 100 m, with frequency content up to 1 Hz, required 14.4 billion grid points, eight times more than the TeraShake scenarios. The simulation used 2,000 processors of the TACC Dell Linux cluster Lonestar and took 56 hours to compute 240 seconds of wave propagation. The pre-processing input partitioning, as well as post-processing analysis, was performed on the SDSC IBM DataStar p655 and p690. In addition, as part of the SCEC DynaShake computational platform, the SGSN capability was used to model dynamic rupture propagation for a ShakeOut scenario that matches the proposed surface slip and size of the event. Mapping applications to different architectures requires coordination of many areas of expertise at the hardware and application levels, an outstanding challenge in the current petascale computing effort. We believe our techniques, as well as distributed data management through data grids, provide a practical example of how to effectively use multiple compute resources, and our results will benefit other geoscience disciplines as well.
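The grid-point figures above can be checked with simple bookkeeping (ignoring the one-extra-point fencepost per dimension). The 200 m TeraShake spacing used for comparison is inferred from the stated eightfold factor, not given explicitly in the text.

```python
# Grid-point bookkeeping for the ShakeOut domain: 600 km x 300 km x 80 km.
def grid_points(x_km, y_km, z_km, spacing_m):
    per_km = 1000 // spacing_m
    return (x_km * per_km) * (y_km * per_km) * (z_km * per_km)

shakeout = grid_points(600, 300, 80, 100)
print(shakeout)  # 14_400_000_000, i.e. 14.4 billion points at 100 m

# Halving the spacing (assumed 200 m for TeraShake) in all three
# dimensions multiplies the point count by 2**3 == 8.
terashake = grid_points(600, 300, 80, 200)
print(shakeout // terashake)  # 8
```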

  11. The O-mannosylation and production of recombinant APA (45/47 KDa) protein from Mycobacterium tuberculosis in Streptomyces lividans is affected by culture conditions in shake flasks.

    PubMed

    Gamboa-Suasnavart, Ramsés A; Valdez-Cruz, Norma A; Cordova-Dávalos, Laura E; Martínez-Sotelo, José A; Servín-González, Luis; Espitia, Clara; Trujillo-Roldán, Mauricio A

    2011-12-20

    The Ala-Pro-rich O-glycoprotein known as the 45/47 kDa or APA antigen from Mycobacterium tuberculosis is an immunodominant adhesin restricted to the genus Mycobacterium and has been proposed as a candidate for a new vaccine against tuberculosis or for diagnostic kits. In this work, the recombinant O-glycoprotein APA was produced by the non-pathogenic filamentous bacterium Streptomyces lividans under three different culture conditions. This strain is known for its ability to produce heterologous proteins in a shorter time than M. tuberculosis. Three different shake flask geometries were used to provide different shear and oxygenation conditions, and the impact of those conditions on the morphology of S. lividans and the production of rAPA was characterized and evaluated. Small unbranched free filaments and mycelial clumps were found in baffled and coiled shake flasks, but pellets an order of magnitude larger were found in conventional shake flasks. The production of rAPA is around 3 times higher in small mycelia than in larger pellets, most probably due to difficulties in mass transfer inside pellets. Moreover, there are four putative sites of O-mannosylation in native APA, one of which is located at the carboxy-terminal region. The carbohydrate composition of this site was determined for rAPA by mass spectrometry analysis and was found to contain different glycoforms depending on culture conditions. Up to two mannose residues were found in cultures carried out in conventional shake flasks, and up to five mannose residues in coiled and baffled shake flasks. The shear and/or oxygenation parameters determine the bacterial morphology, the productivity, and the O-mannosylation of rAPA in S. lividans. 
As demonstrated here, culture conditions have to be carefully controlled in order to obtain recombinant O-glycosylated proteins of similar "quality" in bacteria, particularly if the protein activity depends on the glycosylation pattern. Furthermore, it will be an interesting exercise to determine the effect of shear and oxygen in shake flasks, to obtain evidence that may be useful in scaling these processes up to bioreactors. Another approach will be to use lab-scale bioreactors under well-controlled conditions and to study their impact on rAPA productivity and quality.

  12. Kinematic and Dynamic Source Rupture Scenario for Potential Megathrust Event along the Southernmost Ryukyu Trench

    NASA Astrophysics Data System (ADS)

    Lin, T. C.; Hu, F.; Chen, X.; Lee, S. J.; Hung, S. H.

    2017-12-01

    Kinematic source models are widely used for earthquake simulation because of their simplicity and ease of application. Dynamic source models, on the other hand, are more complex but important tools that can help us understand the physics of earthquake initiation, propagation, and healing. In this study, we focus on the southernmost Ryukyu Trench, which is extremely close to northern Taiwan. Interseismic GPS data in northeast Taiwan show a pattern of strain accumulation, which suggests the maximum magnitude of a potential future earthquake in this area is probably about 8.7. We develop dynamic rupture models for hazard estimation of the potential megathrust event based on kinematic rupture scenarios inverted from the interseismic GPS data. In addition, several kinematic source rupture scenarios with different characterized slip patterns are considered to better constrain the dynamic rupture process. The initial stresses and friction properties are tested by trial and error, together with the plate coupling and tectonic features. An analysis of the dynamic stress field associated with the slip prescribed in the kinematic models can indicate possible inconsistencies with the physics of faulting. Furthermore, the dynamic and kinematic rupture models are used to simulate ground shaking with a 3-D spectral-element method. We analyze ShakeMap and ShakeMovie products from the simulation results to evaluate the influence of the different source models across the island. A dispersive tsunami-propagation simulation is also carried out to evaluate the maximum tsunami wave height along the coastal areas of Taiwan due to coseismic seafloor deformation under the different source models. The results of this numerical simulation study can provide physics-based information on megathrust earthquake scenarios so that emergency response agencies can take appropriate action before the really big one happens.

  13. Characterising large scenario earthquakes and their influence on NDSHA maps

    NASA Astrophysics Data System (ADS)

    Magrin, Andrea; Peresan, Antonella; Panza, Giuliano F.

    2016-04-01

    The neo-deterministic approach to seismic zoning, NDSHA, relies on physically sound modelling of ground shaking from a large set of credible scenario earthquakes, which can be defined based on seismic history and seismotectonics, as well as by incorporating information from a wide set of geological and geophysical data (e.g. morphostructural features and present-day deformation processes identified by Earth observations). NDSHA is based on the calculation of complete synthetic seismograms; hence it does not make use of empirical attenuation models (i.e. ground motion prediction equations). From the set of synthetic seismograms, maps of seismic hazard that describe the maximum of different ground shaking parameters at bedrock can be produced. As a rule, NDSHA defines the hazard as the envelope of ground shaking at the site, computed from all of the defined seismic sources; accordingly, the simplest outcome of this method is a map in which the maximum of a given seismic parameter is associated with each site. In this way, the standard NDSHA maps account for the largest observed or credible earthquake sources identified in the region in a quite straightforward manner. This study aims to assess the influence of unavoidable uncertainties in the characterisation of large scenario earthquakes on the NDSHA estimates. The treatment of uncertainties is performed by sensitivity analyses for key modelling parameters and accounts for the uncertainty in the prediction of fault radiation and in the use of Green's functions for a given medium. Results from sensitivity analyses with respect to the definition of possible seismic sources are discussed. A key parameter is the magnitude of the seismic sources used in the simulation, which is based on information from earthquake catalogues, seismogenic zones and seismogenic nodes. 
Since the largest part of the existing Italian catalogues is based on macroseismic intensities, a rough estimate of the error in peak values of ground motion is a factor of two, intrinsic in the MCS and other discrete scales. A simple test supports this hypothesis: an increase of 0.5 in the magnitude (i.e. one degree in epicentral MCS intensity) of all sources used in the national-scale seismic zoning produces a doubling of the maximum ground motion. The analysis of uncertainty in the ground motion maps, due to random catalogue errors in magnitude and localization, shows a non-uniform distribution of ground shaking uncertainty. The available information from catalogues of past events, which is incomplete and may well not be representative of future earthquakes, can be substantially complemented using independent indicators of the seismogenic potential of a given area, such as active faulting data and seismogenic nodes.
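The envelope operation at the heart of NDSHA, taking the per-site maximum of a ground shaking parameter over all scenario sources, can be sketched directly. Site names and per-source values below are invented for illustration.

```python
# NDSHA-style envelope map: for each site, keep the maximum of a ground
# shaking parameter (e.g. peak ground acceleration, in g) over all
# scenario sources. Values are illustrative only.

def envelope_map(per_source_maps):
    """per_source_maps: list of {site: value} dicts, one per scenario source."""
    sites = per_source_maps[0].keys()
    return {site: max(m[site] for m in per_source_maps) for site in sites}

source_a = {"site1": 0.12, "site2": 0.30, "site3": 0.05}
source_b = {"site1": 0.25, "site2": 0.10, "site3": 0.08}
print(envelope_map([source_a, source_b]))
```

Each site in the resulting map inherits its value from whichever credible source shakes it hardest, which is exactly how the standard NDSHA map accounts for the largest observed or credible sources.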

  14. Asymmetric cultural effects on perceptual expertise underlie an own-race bias for voices

    PubMed Central

    Perrachione, Tyler K.; Chiao, Joan Y.; Wong, Patrick C.M.

    2009-01-01

    The own-race bias in memory for faces has been a rich source of empirical work on the mechanisms of person perception. This effect is thought to arise because the face-perception system differentially encodes the relevant structural dimensions of features and their configuration based on experiences with different groups of faces. However, the effects of sociocultural experiences on person perception abilities in other identity-conveying modalities like audition have not been explored. Investigating an own-race bias in the auditory domain provides a unique opportunity for studying whether person identification is a modality-independent construct and how it is sensitive to asymmetric cultural experiences. Here we show that an own-race bias in talker identification arises from asymmetric experience with different spoken dialects. When listeners categorized voices by race (White or Black), a subset of the Black voices were categorized as sounding White, while the opposite case was unattested. Acoustic analyses indicated listeners' perceptions about race were consistent with differences in specific phonetic and phonological features. In a subsequent person-identification experiment, the Black voices initially categorized as sounding White elicited an own-race bias from White listeners, but not from Black listeners. These effects are inconsistent with person-perception models that strictly analogize faces and voices based on recognition from only structural features. Our results demonstrate that asymmetric exposure to spoken dialect, independent from talkers' physical characteristics, affects auditory perceptual expertise for talker identification. Person perception thus additionally relies on socioculturally-acquired dynamic information, which may be represented by different mechanisms in different sensory modalities. PMID:19782970

  15. RANZCR Body Systems Framework of diagnostic imaging examination descriptors.

    PubMed

    Pitman, Alexander G; Penlington, Lisa; Doromal, Darren; Slater, Gregory; Vukolova, Natalia

    2014-08-01

    A unified and logical system of descriptors for diagnostic imaging examinations and procedures is a desirable resource for radiology in Australia and New Zealand and is needed to support core activities of RANZCR. Existing descriptor systems available in Australia and New Zealand (including the Medicare DIST and the ACC Schedule) have significant limitations and are inappropriate for broader clinical application. An anatomically based grid was constructed, with anatomical structures arranged in rows and diagnostic imaging modalities arranged in columns (including nuclear medicine and positron emission tomography). The grid was segregated into five body systems. The cells at the intersection of an anatomical structure row and an imaging modality column were populated with short, formulaic descriptors of the applicable diagnostic imaging examinations. Clinically illogical or physically impossible combinations were 'greyed out'. Where the same examination applied to different anatomical structures, the descriptor was kept identical for the purposes of streamlining. The resulting Body Systems Framework of diagnostic imaging examination descriptors lists all the reasonably common diagnostic imaging examinations currently performed in Australia and New Zealand using a unified grid structure allowing navigation by both referrers and radiologists. The Framework has been placed on the RANZCR website and is available for access free of charge by registered users. The Body Systems Framework of diagnostic imaging examination descriptors is a system of descriptors based on relationships between anatomical structures and imaging modalities. The Framework is now available as a resource and reference point for the radiology profession and to support core College activities. © 2014 The Royal Australian and New Zealand College of Radiologists.

  16. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for the protection of society and for sustainable urban economic development, as it is an essential part of seismic risk reduction. Estimation of seismic risk and losses is a complicated task. There is always a deficiency of knowledge about the real seismic hazard, local site effects, the inventory of elements at risk, and infrastructure vulnerability, especially in developing countries. Recently, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, whose work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5, "City Scenario", additional work in this direction was carried out, including detailed investigation of local site conditions and of the active fault (in 3D) beneath Tbilisi. To estimate economic losses, an algorithm was prepared that takes the compiled inventory into account. The long-term usage of buildings is complex; it relates to their reliability and durability, which are captured by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of individual construction units' depreciation rates and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's useful life. We used this methodology to create a matrix that provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding value. Finally, losses were estimated resulting from shaking with 10%, 5% and 2% exceedance probability in 50 years. Losses resulting from a scenario earthquake (an earthquake with the maximum possible magnitude) were also estimated.
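The building-level depreciation rule stated above, summing each construction unit's depreciation rate weighted by that unit's share of the building's value, can be sketched directly. The unit names, shares, and rates below are invented for illustration; they are not values from the paper's matrix.

```python
# Building depreciation as described above: sum over construction units of
# (unit's value share within the building) x (unit's depreciation rate).

def building_depreciation(units):
    """units: list of (value_share, depreciation_rate); shares sum to 1."""
    return sum(share * rate for share, rate in units)

# Hypothetical breakdown for one building type/construction period
masonry_1950s = [
    (0.55, 0.60),  # load-bearing walls: 55% of value, 60% depreciated
    (0.25, 0.45),  # floors and roof
    (0.20, 0.80),  # finishes and services
]
print(f"{building_depreciation(masonry_1950s):.4f}")  # whole-building depreciation
```

Evaluating this sum for each building type and construction period is what fills one cell of the depreciation matrix the record describes.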

  17. What shakes the FX tree? Understanding currency dominance, dependence, and dynamics (Keynote Address)

    NASA Astrophysics Data System (ADS)

    Johnson, Neil F.; McDonald, Mark; Suleman, Omer; Williams, Stacy; Howison, Sam

    2005-05-01

    There is intense interest in understanding the stochastic and dynamical properties of the global Foreign Exchange (FX) market, whose daily transactions exceed one trillion US dollars. This is a formidable task since the FX market is characterized by a web of fluctuating exchange rates, with subtle inter-dependencies which may change in time. In practice, traders talk of particular currencies being 'in play' during a particular period of time -- yet there is no established machinery for detecting such important information. Here we apply the construction of Minimum Spanning Trees (MSTs) to the FX market, and show that the MST can capture important features of the global FX dynamics. Moreover, we show that the MST can help identify momentarily dominant and dependent currencies.
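The MST construction applied here maps pairwise correlations between exchange-rate returns to distances, commonly d = sqrt(2(1 - rho)), and then extracts the minimum spanning tree. A minimal sketch using Prim's algorithm follows; the currencies and correlation values are invented, and the distance formula is the standard Mantegna-style choice rather than one stated in this abstract.

```python
from math import sqrt

# Build an MST over currencies from a correlation matrix of their returns,
# using d = sqrt(2 * (1 - rho)) and Prim's algorithm. Correlations invented.

def mst_edges(labels, corr):
    n = len(labels)
    dist = [[sqrt(2.0 * (1.0 - corr[i][j])) for j in range(n)] for i in range(n)]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        # cheapest edge from the tree to a node outside it
        i, j = min(((a, b) for a in in_tree for b in range(n) if b not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        in_tree.add(j)
        edges.append((labels[i], labels[j]))
    return edges

labels = ["EUR", "GBP", "JPY"]
corr = [[1.0, 0.8, 0.3],
        [0.8, 1.0, 0.2],
        [0.3, 0.2, 1.0]]
print(mst_edges(labels, corr))  # the highly correlated EUR-GBP pair links first
```

In such a tree, a currency "in play" tends to show up as a hub whose links shorten as its correlations with the rest of the market strengthen.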

  18. Saturn Apollo Program

    NASA Image and Video Library

    1966-01-01

    Engineers and technicians at the Marshall Space Flight Center placed a Saturn V ground test booster (S-IC-D) into the dynamic test stand. The stand was constructed to test the integrity of the vehicle. Forces were applied to the tail of the vehicle to simulate the engines thrusting, and various other flight factors were fed to the vehicle to test reactions. The Saturn V launch vehicle, with the Apollo spacecraft, was subjected to more than 450 hours of shaking. The photograph shows the 300,000 pound S-IC stage being lifted from its transporter into place inside the 360-foot tall test stand. This dynamic test booster has one dummy F-1 engine and weight simulators are used at the other four engine positions.

  19. Bridge inspection / washing program : bridge drainage program

    DOT National Transportation Integrated Search

    2002-02-01

    The Rhode Island Department of Transportation, Operations Division is responsible for operation and maintenance of roads and bridges, and construction of highway and multi-modal projects to improve the transportation system of our state. One of the m...

  20. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    NASA Astrophysics Data System (ADS)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

Top