Young Children's Memory for the Times of Personal Past Events
Pathman, Thanujeni; Larkina, Marina; Burch, Melissa; Bauer, Patricia J.
2012-01-01
Remembering the temporal information associated with personal past events is critical for autobiographical memory, yet we know relatively little about the development of this capacity. In the present research, we investigated temporal memory for naturally occurring personal events in 4-, 6-, and 8-year-old children. Parents recorded unique events in which their children participated during a 4-month period. At test, children made relative recency judgments and estimated the time of each event using conventional time-scales (time of day, day of week, month of year, and season). Children also were asked to provide justifications for their time-scale judgments. Six- and 8-year-olds, but not 4-year-olds, accurately judged the order of two distinct events. There were age-related improvements in children's estimation of the time of events using conventional time-scales. Older children provided more justifications for their time-scale judgments compared to younger children. Relations between correct responding on the time-scale judgments and provision of meaningful justifications suggest that children may use that information to reconstruct the times associated with past events. The findings can be used to chart a developmental trajectory of performance in temporal memory for personal past events, and have implications for our understanding of autobiographical memory development. PMID:23687467
Contrasting scaling properties of interglacial and glacial climates
Shao, Zhi-Gang; Ditlevsen, Peter D.
2016-01-01
Understanding natural climate variability is essential for assessments of climate change. This is reflected in the scaling properties of climate records. The scaling exponents of the interglacial and the glacial climates are fundamentally different. The Holocene record is monofractal, with a scaling exponent H∼0.7. On the contrary, the glacial record is multifractal, with a significantly higher scaling exponent H∼1.2, indicating a longer persistence time and stronger nonlinearities in the glacial climate. The glacial climate is dominated by the strong multi-millennial Dansgaard–Oeschger (DO) events influencing the long-time correlation. However, by separately analysing the last glacial maximum lacking DO events, here we find the same scaling for that period as for the full glacial period. The unbroken scaling thus indicates that the DO events are part of the natural variability and not externally triggered. At glacial time scales, there is a scale break to a trivial scaling, contrasting the DO events from the similarly saw-tooth-shaped glacial cycles. PMID:26980084
Temporal Clustering of Regional-Scale Extreme Precipitation Events in Southern Switzerland
NASA Astrophysics Data System (ADS)
Barton, Yannick; Giannakaki, Paraskevi; Von Waldow, Harald; Chevalier, Clément; Pfahl, Stephan; Martius, Olivia
2017-04-01
Temporal clustering of extreme precipitation events on subseasonal time scales is a form of compound extremes and is of crucial importance for the formation of large-scale flood events. Here, the temporal clustering of regional-scale extreme precipitation events in southern Switzerland is studied. These precipitation events are relevant for the flooding of lakes in southern Switzerland and northern Italy. This research determines whether temporal clustering is present and then identifies the dynamics that are responsible for the clustering. An observation-based gridded precipitation dataset of Swiss daily rainfall sums and ECMWF reanalysis datasets are used. To analyze the clustering in the precipitation time series, a modified version of Ripley's K function is used. It determines the average number of extreme events in a time period, which is used to characterize temporal clustering on subseasonal time scales and to determine its statistical significance. Significant clustering of regional-scale precipitation extremes is found on subseasonal time scales during the fall season. Four high-impact clustering episodes are then selected and the dynamics responsible for the clustering are examined. During the four clustering episodes, all heavy precipitation events were associated with an upper-level breaking Rossby wave over western Europe, and in most cases strong diabatic processes upstream over the Atlantic played a role in the amplification of these breaking waves. Atmospheric blocking downstream over eastern Europe supported this wave breaking during two of the clustering episodes. During one of the clustering periods, several extratropical transitions of tropical cyclones in the Atlantic contributed to the formation of high-amplitude ridges over the Atlantic basin and downstream wave breaking. During another event, blocking over Alaska assisted the phase locking of the Rossby waves downstream over the Atlantic.
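The abstract does not spell out the modification of Ripley's K that was used; as a rough illustration of the underlying idea only, a minimal one-dimensional K estimator without edge correction might look like the following (the burst data, window, and normalization are illustrative assumptions, not the authors' method):

```python
import numpy as np

def temporal_ripley_k(event_times, window, period_length):
    """1-D analogue of Ripley's K: mean number of other events within
    +/- `window` of each event, normalized by the event rate.
    No edge correction; under a Poisson process K(window) ~ 2 * window."""
    t = np.sort(np.asarray(event_times, dtype=float))
    counts = [np.sum(np.abs(t - ti) <= window) - 1 for ti in t]  # exclude self
    rate = len(t) / period_length
    return np.mean(counts) / rate

rng = np.random.default_rng(0)
# Three tight five-event bursts over a 100-day season vs. uniform events
clustered = np.concatenate([c + rng.uniform(0, 2, 5) for c in (10, 50, 90)])
uniform = rng.uniform(0, 100, 15)
k_clustered = temporal_ripley_k(clustered, window=3, period_length=100)
k_uniform = temporal_ripley_k(uniform, window=3, period_length=100)
```

For the clustered series the statistic far exceeds the Poisson expectation of 2 × 3 = 6 days, which is the kind of excess that flags subseasonal clustering.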
A model of return intervals between earthquake events
NASA Astrophysics Data System (ADS)
Zhou, Yu; Chechkin, Aleksei; Sokolov, Igor M.; Kantz, Holger
2016-06-01
Application of the diffusion entropy analysis and the standard deviation analysis to the time sequence of the southern California earthquake events from 1976 to 2002 uncovered scaling behavior typical for anomalous diffusion. However, the origin of such behavior is still under debate. Some studies attribute the scaling behavior to the correlations in the return intervals, or waiting times, between aftershocks or mainshocks. To elucidate the nature of the scaling, we applied specific reshuffling techniques to eliminate correlations between different types of events and then examined how this affects the scaling behavior. We demonstrate that the origin of the observed scaling behavior is the interplay between the mainshock waiting time distribution and the structure of clusters of aftershocks, not correlations in waiting times between the mainshocks and aftershocks themselves. Our findings are corroborated by numerical simulations of a simple model showing a very similar behavior. The mainshocks are modeled by a renewal process with a power-law waiting time distribution between events, and aftershocks follow a nonhomogeneous Poisson process with the rate governed by Omori's law.
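The simple model described in the last two sentences can be sketched as follows; all parameter values and the thinning scheme are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_catalog(n_main, alpha=1.5, k=10.0, c=0.1, p=1.1, t_max=50.0):
    """Mainshocks: renewal process with power-law (Pareto) waiting times.
    Aftershocks: nonhomogeneous Poisson process with Omori-law rate
    n(t) = k / (t + c)**p after each mainshock, simulated by thinning."""
    main_times = np.cumsum(rng.pareto(alpha, n_main) + 0.01)
    events = list(main_times)
    for t0 in main_times:
        t, rate_max = 0.0, k / c**p               # Omori rate peaks at t = 0
        while t < t_max:
            t += rng.exponential(1.0 / rate_max)  # candidate from bounding rate
            if t < t_max and rng.random() < (k / (t + c)**p) / rate_max:
                events.append(t0 + t)             # accept (thinning step)
    return np.sort(events)

catalog = simulate_catalog(20)
```

The resulting catalog interleaves renewal-process mainshocks with Omori-decaying aftershock clusters, which is the structure the reshuffling analysis isolates.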
Contrasting scaling properties of interglacial and glacial climates
NASA Astrophysics Data System (ADS)
Ditlevsen, Peter; Shao, Zhi-Gang
2017-04-01
Understanding natural climate variability is essential for assessments of climate change. This is reflected in the scaling properties of climate records. The scaling exponents of the interglacial and the glacial climates are fundamentally different. The Holocene record is monofractal, with a scaling exponent H∼0.7. On the contrary, the glacial record is multifractal, with a significantly higher scaling exponent H∼1.2, indicating a longer persistence time and stronger nonlinearities in the glacial climate. The glacial climate is dominated by the strong multi-millennial Dansgaard-Oeschger (DO) events influencing the long-time correlation. However, by separately analysing the last glacial maximum lacking DO events, here we find the same scaling for that period as for the full glacial period. The unbroken scaling thus indicates that the DO events are part of the natural variability and not externally triggered. At glacial time scales, there is a scale break to a trivial scaling, contrasting the DO events from the similarly saw-tooth-shaped glacial cycles. Ref: Zhi-Gang Shao and Peter Ditlevsen, Nature Communications 7, 10951 (2016)
Visual search of cyclic spatio-temporal events
NASA Astrophysics Data System (ADS)
Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire
2018-05-01
The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space-time-thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams may highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations like days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not allow a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, in particular to identify localised cyclic events. The lack of means to represent cyclic time outside of the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map, in order to improve the analysis of relationships between space and the cyclic behavior of events.
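The core operation such a tool relies on, projecting event times onto a cyclic scale of arbitrary duration, reduces to modular arithmetic. A minimal sketch (the function name and event data are hypothetical, not from the paper):

```python
import numpy as np

def cyclic_coordinates(times, period):
    """Map event times onto a cycle of arbitrary duration `period`:
    returns the phase within the cycle (in [0, 1)) and the cycle index,
    i.e. the 'turn' of the coil the event falls on."""
    t = np.asarray(times, dtype=float)
    return (t % period) / period, (t // period).astype(int)

# Events reappearing roughly every 17 days, a period no calendar unit offers
events = np.array([3.0, 20.0, 37.0, 54.2])
phase, turn = cyclic_coordinates(events, period=17.0)
```

Near-constant phases across successive turns are exactly the alignment that a coil-style cyclic visualization would make visible.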
Relative Time-scale for Channeling Events Within Chaotic Terrains, Margaritifer Sinus, Mars
NASA Technical Reports Server (NTRS)
Janke, D.
1985-01-01
A relative time scale for ordering channel- and chaos-forming events was constructed for areas within the Margaritifer Sinus region of Mars. Transection and superposition relationships of channels, chaotic terrain, and the surfaces surrounding them were used to create the relative time scale; crater density studies were not used. Channels and chaos in contact with one another were treated as systems. These systems were in turn treated both separately (in order to understand internal relationships) and as members of the suite of Martian erosional forms (in order to produce a combined, master time scale). Channeling events associated with chaotic terrain development occurred over an extended geomorphic period. The channels can be divided into three convenient groups: those that pre-date intercrater plains development; post-plains, pre-chasma systems; and those associated with the development of the Valles Marineris chasmata. No correlations with cyclic climatic changes, major geologic events in other regions on Mars, or triggering phenomena (for example, specific impact events) were found.
Using Low-Frequency Earthquake Families on the San Andreas Fault as Deep Creepmeters
NASA Astrophysics Data System (ADS)
Thomas, A. M.; Beeler, N. M.; Bletery, Q.; Burgmann, R.; Shelly, D. R.
2018-01-01
The central section of the San Andreas Fault hosts tectonic tremor and low-frequency earthquakes (LFEs) similar to subduction zone environments. LFEs are often interpreted as persistent regions that repeatedly fail during the aseismic shear of the surrounding fault, allowing them to be used as creepmeters. We test this idea by using the recurrence intervals of individual LFEs within LFE families to estimate the timing, duration, recurrence interval, slip, and slip rate associated with inferred slow slip events. We formalize the definition of a creepmeter and determine whether this definition is consistent with our observations. We find that episodic families reflect surrounding creep over the interevent time, while the continuous families and the short time scale bursts that occur as part of the episodic families do not. However, when these families are evaluated on time scales longer than the interevent time, these events can also be used to meter slip. A straightforward interpretation of episodic families is that they define sections of the fault where slip is distinctly episodic, occurring in well-defined slow slip events that slip at 16 times the long-term rate. In contrast, the frequent short-term bursts of the continuous and short time scale episodic families likely do not represent individual creep events but rather are persistent asperities that are driven to failure by quasi-continuous creep on the surrounding fault. Finally, we find that the moment-duration scaling of our inferred creep events is inconsistent with the proposed linear moment-duration scaling. However, caution must be exercised when attempting to determine scaling with incomplete knowledge of scale.
NASA Astrophysics Data System (ADS)
Kenward, D. R.; Lessard, M.; Lynch, K. A.; Hysell, D. L.; Hampton, D. L.; Michell, R.; Samara, M.; Varney, R. H.; Oksavik, K.; Clausen, L. B. N.; Hecht, J. H.; Clemmons, J. H.; Fritz, B.
2017-12-01
The RENU2 sounding rocket (launched from Andoya rocket range on December 13th, 2015) observed Poleward Moving Auroral Forms within the dayside cusp. The ISINGLASS rockets (launched from Poker Flat rocket range on February 22, 2017 and March 2, 2017) both observed aurora during a substorm event. Despite observing very different events, both campaigns witnessed a high degree of small scale structuring within the larger auroral boundary, including Alfvenic signatures. These observations suggest a method of coupling large-scale energy input to fine scale structures within aurorae. During RENU2, small (sub-km) scale drivers persist for long (10s of minutes) time scales and result in large scale ionospheric (thermal electron) and thermospheric response (neutral upwelling). ISINGLASS observations show small scale drivers, but with short (minute) time scales, with ionospheric response characterized by the flight's thermal electron instrument (ERPA). The comparison of the two flights provides an excellent opportunity to examine ionospheric and thermospheric response to small scale drivers over different integration times.
Time functions of deep earthquakes from broadband and short-period stacks
Houston, H.; Benz, H.M.; Vidale, J.E.
1998-01-01
To constrain dynamic source properties of deep earthquakes, we have systematically constructed broadband time functions of deep earthquakes by stacking and scaling teleseismic P waves from U.S. National Seismic Network, TERRAscope, and Berkeley Digital Seismic Network broadband stations. We examined 42 earthquakes with depths from 100 to 660 km that occurred between July 1, 1992 and July 31, 1995. To directly compare time functions, or to group them by size, depth, or region, it is essential to scale them to remove the effect of moment, which varies by more than 3 orders of magnitude for these events. For each event we also computed short-period stacks of P waves recorded by west coast regional arrays. The comparison of broadband with short-period stacks yields a considerable advantage, enabling more reliable measurement of event duration. A more accurate estimate of the duration better constrains the scaling procedure to remove the effect of moment, producing scaled time functions with both correct timing and amplitude. We find only subtle differences in the broadband time-function shape with moment, indicating successful scaling and minimal effects of attenuation at the periods considered here. The average shape of the envelopes of the short-period stacks is very similar to the average broadband time function. The main variations seen with depth are (1) a mild decrease in duration with increasing depth, (2) greater asymmetry in the time functions of intermediate events compared to deep ones, and (3) unexpected complexity and late moment release for events between 350 and 550 km, with seven of the eight events in that depth interval displaying markedly more complicated time functions with more moment release late in the rupture than most events above or below. The first two results are broadly consistent with our previous studies, while the third is reported here for the first time. 
The greater complexity between 350 and 550 km suggests greater heterogeneity in the failure process in that depth range. Copyright 1998 by the American Geophysical Union.
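The scaling step the authors describe, removing the effect of moment so time functions can be compared directly, is not given in closed form in the abstract. Assuming the standard self-similar convention (duration ∝ M0^(1/3)), a rescaling to a reference moment could be sketched as follows (function name, reference moment, and the triangular source are illustrative assumptions):

```python
import numpy as np

def scale_time_function(t, moment_rate, m0, m0_ref=1.0e18):
    """Rescale a moment-rate time function to a reference moment, assuming
    self-similar scaling (duration ~ M0**(1/3)); the amplitude is adjusted
    so that the scaled function still integrates to m0_ref."""
    stretch = (m0_ref / m0) ** (1.0 / 3.0)   # time-axis contraction/stretch
    return t * stretch, moment_rate * m0_ref / (m0 * stretch)

# Illustrative triangular source time function with moment 8e18 N m
t = np.linspace(0.0, 10.0, 101)
rate = np.interp(t, [0.0, 5.0, 10.0], [0.0, 1.6e18, 0.0])
t_scaled, rate_scaled = scale_time_function(t, rate, m0=8.0e18)
# Duration is halved, since (1e18 / 8e18)**(1/3) = 1/2
```

After such scaling, residual shape differences between events reflect source properties rather than size, which is what permits the depth comparisons reported above.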
Long-term changes in regular and low-frequency earthquake inter-event times near Parkfield, CA
NASA Astrophysics Data System (ADS)
Wu, C.; Shelly, D. R.; Johnson, P. A.; Gomberg, J. S.; Peng, Z.
2012-12-01
The temporal evolution of earthquake inter-event time may provide important clues for the timing of future events and underlying physical mechanisms of earthquake nucleation. In this study, we examine inter-event times from 12-yr catalogs of ~50,000 earthquakes and ~730,000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault. We focus on the long-term evolution of inter-event times after the 2003 Mw6.5 San Simeon and 2004 Mw6.0 Parkfield earthquakes. We find that inter-event times decrease by ~4 orders of magnitude after the Parkfield and San Simeon earthquakes and are followed by a long-term recovery with time scales of ~3 years and more than 8 years for earthquakes along and to the southwest of the San Andreas fault, respectively. The differing long-term recovery of the earthquake inter-event times is likely a manifestation of different aftershock recovery time scales that reflect the different tectonic loading rates in the two regions. We also observe a possible decrease of LFE inter-event times in some LFE families, followed by a recovery with time scales of ~4 months to several years. The drop in the recurrence time of LFEs after the Parkfield earthquake is likely caused by a combination of the dynamic and positive static stress induced by the Parkfield earthquake, and the long-term recovery in LFE recurrence time could be due to post-seismic relaxation or gradual recovery of the fault zone material properties. Our on-going work includes better constraining and understanding the physical mechanisms responsible for the observed long-term recovery in earthquake and LFE inter-event times.
A Life Events Scale for Armed Forces personnel
Chaudhury, Suprakash; Srivastava, Kalpana; Raju, M.S.V. Kama; Salujha, S.K.
2006-01-01
Background: Armed Forces personnel are routinely exposed to a number of unique stressful life events. None of the available scales are relevant to service personnel. Aim: To construct a scale to measure life events in service personnel. Methods: In the first stage of the study, open-ended questions along with items generated by the expert group by consensus method were administered to 50 soldiers. During the second stage, a scale comprising 59 items and open-ended questions was administered to 165 service personnel. The final scale of 52 items was administered to 200 service personnel in a group setting. Weightage was assigned on a 0 to 100 range. For the normative study, the Armed Forces Medical College Life Events Scale (AFMC LES) was administered to 1200 Army, 100 Air Force and 100 Navy personnel. Results: Service personnel experience an average of 4 life events in the past year and 13 events in a lifetime. On average, service personnel experience 115 life change unit scores in the past year and 577 life change unit scores in a lifetime on the AFMC LES. The scale has concurrent validity when compared with the Presumptive Stressful Life Events Scale (PSLES). There is internal consistency in the scale, with the routine items being rated very low. There is a pattern of uniformity with the civilian counterparts along with differences in the items specific to service personnel. Conclusions: The AFMC LES includes the unique stresses of service personnel that are not included in any life events scale available in India or in the West and should be used to assess stressful life events in service personnel. PMID:20844647
Takahiro Sayama; Jeffrey J. McDonnell
2009-01-01
Hydrograph source components and stream water residence time are fundamental behavioral descriptors of watersheds but, as yet, are poorly represented in most rainfall-runoff models. We present a new time-space accounting scheme (T-SAS) to simulate the pre-event and event water fractions, mean residence time, and spatial source of streamflow at the watershed scale. We...
Escape and finite-size scaling in diffusion-controlled annihilation
Ben-Naim, Eli; Krapivsky, Paul L.
2016-12-16
In this paper, we study diffusion-controlled single-species annihilation with a finite number of particles. In this reaction-diffusion process, each particle undergoes ordinary diffusion, and when two particles meet, they annihilate. We focus on spatial dimensions d > 2, where a finite number of particles typically survive the annihilation process. Using scaling techniques we investigate the average number of surviving particles, M, as a function of the initial number of particles, N. In three dimensions, for instance, we find the scaling law M ~ N^(1/3) in the asymptotic regime N ≫ 1. We show that two time scales govern the reaction kinetics: the diffusion time scale, T ~ N^(2/3), and the escape time scale, τ ~ N^(4/3). The vast majority of annihilation events occur on the diffusion time scale, while no annihilation events occur beyond the escape time scale.
Simulating recurrent event data with hazard functions defined on a total time scale.
Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald
2015-03-08
In medical studies with recurrent event data, a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on the distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data.
Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
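The authors' implementation is an R script; the inversion idea behind it (the next event time conditional on the total time of the preceding event) can be sketched in Python, here for an assumed Weibull cumulative hazard with illustrative parameters, without covariates or risk-free intervals:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_total_time(H, H_inv, censor_time, n_subjects):
    """Recurrent events with the hazard defined on the *total* time scale:
    given the previous event at s, the next event time t solves
    H(t) - H(s) = -log(U), U ~ Uniform(0, 1)  (inverse-transform sampling)."""
    subjects = []
    for _ in range(n_subjects):
        times, s = [], 0.0
        while True:
            t = H_inv(H(s) - np.log(rng.random()))
            if t > censor_time:          # administrative censoring
                break
            times.append(t)
            s = t                        # hazard does NOT restart (total time)
        subjects.append(times)
    return subjects

# Weibull cumulative hazard H(t) = lam * t**k (illustrative parameters)
lam, k = 0.5, 1.5
H = lambda u: lam * u**k
H_inv = lambda y: (y / lam) ** (1.0 / k)
data = simulate_total_time(H, H_inv, censor_time=10.0, n_subjects=100)
```

The gap-time alternative would reset `s` to zero after each event; keeping the cumulative hazard evaluated at the total time of the preceding event is exactly what distinguishes the Andersen-Gill-style setting described above.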
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means of providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection; however, low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper applies permutation entropy and a support vector machine to detect low-SNR microseismic events. First, an extraction method for signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation on the collected vibration signals and constructing a feature vector set of the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which offers high classification accuracy and fast real-time performance, can meet the requirements of online, real-time extraction of microseismic events.
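The feature-extraction step, multi-scale permutation entropy, can be sketched under common conventions (normalized Shannon entropy of ordinal patterns; coarse-graining by non-overlapping means). The order, scale range, and test signals below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: Shannon entropy of the distribution
    of ordinal patterns of length `order`, divided by log(order!)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = np.array([np.argsort(x[i:i + order * delay:delay]) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(factorial(order)))

def multiscale_pe(x, order=3, max_scale=5):
    """Coarse-grain by non-overlapping means at each scale, then compute PE."""
    x = np.asarray(x, dtype=float)
    pes = []
    for s in range(1, max_scale + 1):
        m = len(x) // s
        coarse = x[:m * s].reshape(m, s).mean(axis=1)
        pes.append(permutation_entropy(coarse, order))
    return pes

rng = np.random.default_rng(3)
noise = rng.normal(size=2000)                    # irregular signal: PE near 1
tone = np.sin(np.linspace(0, 40 * np.pi, 2000))  # regular signal: much lower PE
```

The vector of entropies across scales is the kind of feature vector that a support vector machine can then classify into event versus noise.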
ERIC Educational Resources Information Center
Nam, Younkyeong; Karahan, Engin; Roehrig, Gillian
2016-01-01
Geologic time scale is a very important concept for understanding long-term earth system events such as climate change. This study examines the understanding of the relative ordering and absolute timing of Earth's significant geological and biological events among forty-three 4th-8th grade Native American students, particularly from the Ojibwe tribe. This study also…
Fast rise times and the physical mechanism of deep earthquakes
NASA Technical Reports Server (NTRS)
Houston, H.; Williams, Q.
1991-01-01
A systematic global survey of the rise times and stress drops of deep and intermediate earthquakes is reported. When the rise times are scaled to the seismic moment release of the events, their average is nearly twice as fast for events deeper than about 450 km as for shallower events.
Near bed suspended sediment flux by single turbulent events
NASA Astrophysics Data System (ADS)
Amirshahi, Seyed Mohammad; Kwoll, Eva; Winter, Christian
2018-01-01
The role of small scale single turbulent events in the vertical mixing of near bed suspended sediments was explored in a shallow shelf sea environment. High frequency velocity and suspended sediment concentration (SSC; calibrated from the backscatter intensity) data were collected using an Acoustic Doppler Velocimeter (ADV). Using quadrant analysis, the despiked velocity time series was divided into turbulent events and small background fluctuations. Reynolds stress and Turbulent Kinetic Energy (TKE) calculated from all velocity samples were compared to the same turbulent statistics calculated only from velocity samples classified as turbulent events (Re_events and TKE_events). The comparison showed that Re_events and TKE_events increased by factors of 3 and 1.6, respectively, when small background fluctuations were removed, and that the correlation between TKE and SSC could be improved through removal of the latter. The correlation between the instantaneous vertical turbulent flux (w′) and SSC fluctuations (SSC′) exhibits a tidal pattern, with the maximum correlation at peak ebb and flood currents, when strong turbulent events appear. Individual turbulent events were characterized by type, strength, duration and length. Cumulative vertical turbulent sediment fluxes and average SSC associated with individual turbulent events were calculated. Over the tidal cycle, ejections and sweeps were the most dominant events, transporting 50% and 36% of the cumulative vertical turbulent event sediment flux, respectively. Although the contribution of outward interactions to the vertical turbulent event sediment flux was low (11%), single outward interaction events were capable of inducing similar SSC′ as sweep events.
The results suggest that on time scales of tens of minutes to hours, TKE may be appropriate to quantify turbulence in sediment transport studies, but that event characteristics, particularly the upward turbulent flux, need to be accounted for when considering sediment transport on process time scales.
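Quadrant analysis as used above partitions velocity fluctuations by the signs of u′ and w′. A minimal sketch with a hole criterion to separate events from background fluctuations (the threshold, synthetic data, and naming are illustrative assumptions):

```python
import numpy as np

def quadrant_classify(u, w, hole=0.0):
    """Classify velocity samples by quadrant of (u', w'):
    1 = outward interaction, 2 = ejection, 3 = inward interaction,
    4 = sweep, 0 = background, i.e. inside the 'hole'
    |u'w'| <= hole * std(u') * std(w')."""
    up, wp = u - u.mean(), w - w.mean()          # fluctuations about the mean
    strong = np.abs(up * wp) > hole * up.std() * wp.std()
    quad = np.zeros(len(u), dtype=int)
    quad[strong & (up > 0) & (wp > 0)] = 1       # outward interaction
    quad[strong & (up < 0) & (wp > 0)] = 2       # ejection
    quad[strong & (up < 0) & (wp < 0)] = 3       # inward interaction
    quad[strong & (up > 0) & (wp < 0)] = 4       # sweep
    return quad

rng = np.random.default_rng(4)
u = rng.normal(1.0, 0.1, 5000)                       # streamwise velocity
w = -0.5 * (u - 1.0) + rng.normal(0.0, 0.05, 5000)   # shear-like anticorrelation
quad = quadrant_classify(u, w, hole=1.0)
```

With anticorrelated u′ and w′, ejections (Q2) and sweeps (Q4) dominate the classified samples, mirroring their dominance of the event sediment flux reported above.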
NASA Astrophysics Data System (ADS)
Lamb, Derek A.
2016-10-01
While sunspots follow a well-defined pattern of emergence in space and time, small-scale flux emergence is assumed to occur randomly at all times in the quiet Sun. HMI's full-disk coverage, high cadence, spatial resolution, and duty cycle allow us to probe that basic assumption. Some case studies of emergence suggest that temporal clustering on spatial scales of 50-150 Mm may occur. If clustering is present, it could serve as a diagnostic of large-scale subsurface magnetic field structures. We present the results of a manual survey of small-scale flux emergence events over a short time period, and a statistical analysis addressing the question of whether these events show spatio-temporal behavior that is anything other than random.
Large Scale Water Vapor Sources Relative to the October 2000 Piedmont Flood
NASA Technical Reports Server (NTRS)
Turato, Barbara; Reale, Oreste; Siccardi, Franco
2003-01-01
Very intense mesoscale or synoptic-scale rainfall events can occasionally be observed in the Mediterranean region without any deep cyclone developing over the areas affected by precipitation. In these perplexing cases the synoptic situation can superficially look similar to cases in which very little precipitation occurs. These situations could possibly baffle the operational weather forecasters. In this article, the major precipitation event that affected Piedmont (Italy) between 13 and 16 October 2000 is investigated. This is one of the cases in which no intense cyclone was observed within the Mediterranean region at any time, only a moderate system was present, and yet exceptional rainfall and flooding occurred. The emphasis of this study is on the moisture origin and transport. Moisture and energy balances are computed on different space- and time-scales, revealing that precipitation exceeds evaporation over an area inclusive of Piedmont and the northwestern Mediterranean region, on a time-scale encompassing the event and about two weeks preceding it. This is suggestive of an important moisture contribution originating from outside the region. A synoptic and dynamic analysis is then performed to outline the potential mechanisms that could have contributed to the large-scale moisture transport. The central part of the work uses a quasi-isentropic water-vapor back trajectory technique. The moisture sources obtained by this technique are compared with the results of the balances and with the synoptic situation, to unveil possible dynamic mechanisms and physical processes involved. It is found that moisture sources on a variety of atmospheric scales contribute to this event. First, an important contribution is caused by the extratropical remnants of former tropical storm Leslie. The large-scale environment related to this system allows a significant amount of moisture to be carried towards Europe. 
This happens on a time-scale of about 5-15 days preceding the Piedmont event. Second, water-vapor intrusions from the African Inter-Tropical Convergence Zone and evaporation from the eastern Atlantic contribute on the 2-5 day time-scale. The large-scale moist dynamics appears therefore to be one important factor enabling a moderate Mediterranean cyclone to produce heavy precipitation. Finally, local evaporation from the Mediterranean, water-vapor recycling, and orographically-induced low-level convergence enhance and concentrate the moisture over the area where heavy precipitation occurs. This happens on a 12-72 hour time-scale.
Dual effects of the winter monsoon on haze-fog variations in eastern China
NASA Astrophysics Data System (ADS)
Liu, Qian; Sheng, Lifang; Cao, Ziqi; Diao, Yina; Wang, Wencai; Zhou, Yang
2017-06-01
Previous studies have revealed a negative correlation between the East Asian winter monsoon and wintertime haze-fog events in China. The winter monsoon reduces haze-fog by advecting away aerosol particles and supplying clean air through cold waves. However, it is found that the frequency of haze-fog events on subseasonal time scales displays no correlation with typical winter monsoon indices. The results show that the accumulating and maintaining effects of calm weather related to the Siberian High, which is also a part of the monsoon circulation system, are equally important for the development of haze-fog events during winter. Correlation analysis indicates that subseasonal variations in haze-fog are closely related to the intensity of the Siberian High (r = 0.49). The Siberian High may increase the occurrence of haze-fog events by reducing the near surface wind speed and enhancing the stratification stability. To quantify the contribution of these diverse effects of the winter monsoon on the variations in haze-fog events, we analyzed haze-fog events during periods of cold wave activity and calm weather separately and contrasted the relative contributions of these two effects on different time scales. On the subseasonal scale, the effect of the Siberian High was 2.0 times that of cold waves; on the interannual scale, the effect of cold waves was 2.4 times that of the Siberian High. This study reveals the dual effects of the East Asian winter monsoon on wintertime haze-fog variations in eastern China and provides a more comprehensive understanding of the relationship between the monsoon and haze-fog events.
Evidence for the timing of sea-level events during MIS 3
NASA Astrophysics Data System (ADS)
Siddall, M.
2005-12-01
Four large sea-level peaks of millennial-scale duration occur during MIS 3. In addition, smaller peaks may exist close to the sensitivity limit of existing methods for deriving sea level during these periods. Millennial-scale changes in temperature during MIS 3 are well documented across much of the planet and are linked in some unknown, yet fundamental way to changes in ice volume / sea level. It is therefore highly likely that the timing of the sea-level events during MIS 3 will prove to be a `Rosetta Stone' for understanding millennial-scale climate variability. I will review observational and mechanistic arguments for the variation of sea level on Antarctic, Greenland and absolute time scales.
Cross scale interactions, nonlinearities, and forecasting catastrophic events
Peters, Debra P.C.; Pielke, Roger A.; Bestelmeyer, Brandon T.; Allen, Craig D.; Munson-McGee, Stuart; Havstad, Kris M.
2004-01-01
Catastrophic events share characteristic nonlinear behaviors that are often generated by cross-scale interactions and feedbacks among system elements. These events result in surprises that cannot easily be predicted based on information obtained at a single scale. Progress on catastrophic events has focused on one of the following two areas: nonlinear dynamics through time without an explicit consideration of spatial connectivity [Holling, C. S. (1992) Ecol. Monogr. 62, 447–502] or spatial connectivity and the spread of contagious processes without a consideration of cross-scale interactions and feedbacks [Zeng, N., Neelin, J. D., Lau, K. M. & Tucker, C. J. (1999) Science 286, 1537–1540]. These approaches rarely have ventured beyond traditional disciplinary boundaries. We provide an interdisciplinary, conceptual, and general mathematical framework for understanding and forecasting nonlinear dynamics through time and across space. We illustrate the generality and usefulness of our approach by using new data and recasting published data from ecology (wildfires and desertification), epidemiology (infectious diseases), and engineering (structural failures). We show that decisions that minimize the likelihood of catastrophic events must be based on cross-scale interactions, and such decisions will often be counterintuitive. Given the continuing challenges associated with global change, approaches that cross disciplinary boundaries to include interactions and feedbacks at multiple scales are needed to increase our ability to predict catastrophic events and develop strategies for minimizing their occurrence and impacts. Our framework is an important step in developing predictive tools and designing experiments to examine cross-scale interactions.
Measuring Cognitive and Affective Constructs in the Context of an Acute Health Event
Boudreaux, Edwin D.; Moon, Simon; Tappe, Karyn A.; Bock, Beth; Baumann, Brigitte; Chapman, Gretchen B.
2013-01-01
The latest recommendations for building dynamic health behavior theories emphasize that cognitions, emotions, and behaviors, and the nature of their inter-relationships, can change over time. This paper describes the development and psychometric validation of four scales created to measure smoking-related causal attributions, perceived illness severity, event-related emotions, and intention to quit smoking among patients experiencing acute cardiac symptoms. After completing qualitative work with a sample of 50 cardiac patients, we administered the scales to 300 patients presenting to the emergency department for cardiac-related symptoms. Factor analyses, alpha coefficients, ANOVAs, and Pearson correlation coefficients were used to establish the scales' reliability and validity. Factor analyses revealed a stable factor structure for each of the four constructs. The scales were internally consistent, with the majority having an alpha of >0.80 (range: 0.57 to 0.89). Mean differences in ratings of the perceived illness severity and event-related emotions were noted across the three time anchors. Significant increases in intention to quit at the time of enrollment, compared to retrospective ratings of intention to quit before the event, provide preliminary support for the sensitivity of this measure to the motivating impact of the event. Finally, smoking-related causal attributions, perceived illness severity, and event-related emotions correlated in the expected directions with intention to quit smoking, providing preliminary support for construct validity. PMID:22970703
Event management for large scale event-driven digital hardware spiking neural networks.
Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean
2013-09-01
The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
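The event-management problem the paper addresses can be illustrated in software. The sketch below is a minimal Python priority queue keyed on event timestamps, built on the standard-library binary heap (`heapq`); it is an illustrative software analogue, not the paper's pipelined hardware structured heap queue, and the neuron IDs and timestamps are made up.

```python
import heapq

class EventQueue:
    """Timestamp-ordered spike event queue for an event-driven SNN
    simulator (software sketch of the general idea, not the paper's
    hardware structured heap queue)."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps equal-time events stable

    def push(self, time, neuron_id):
        heapq.heappush(self._heap, (time, self._counter, neuron_id))
        self._counter += 1

    def pop(self):
        """Remove and return the earliest (time, neuron_id) event."""
        time, _, neuron_id = heapq.heappop(self._heap)
        return time, neuron_id

    def __len__(self):
        return len(self._heap)

# Spikes pushed out of order are delivered in timestamp order.
q = EventQueue()
for t, n in [(5.0, 2), (1.0, 7), (3.0, 4)]:
    q.push(t, n)
order = [q.pop() for _ in range(len(q))]
# order == [(1.0, 7), (3.0, 4), (5.0, 2)]
```

A software heap gives the same asymptotics the paper cites for its hardware queue: linear memory in the number of pending events and logarithmic insertion/removal cost.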
Track-based event recognition in a realistic crowded environment
NASA Astrophysics Data System (ADS)
van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.
2014-10-01
Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), or by the pickpocket following the victim or interacting with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos to be observed.
Recurrence and interoccurrence behavior of self-organized complex phenomena
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.
2007-08-01
The sandpile, forest-fire and slider-block models are said to exhibit self-organized criticality. Associated natural phenomena include landslides, wildfires, and earthquakes. In all cases the frequency-size distributions are well approximated by power laws (fractals). Another important aspect of both the models and the natural phenomena is the statistics of interval times. These statistics are particularly important for earthquakes. For earthquakes it is important to make a distinction between interoccurrence and recurrence times. Interoccurrence times are the interval times between earthquakes on all faults in a region, whereas recurrence times are the interval times between earthquakes on a single fault or fault segment. In many, but not all, cases interoccurrence time statistics are exponential (Poissonian) and the events occur randomly. However, the distributions of recurrence times are often Weibull to a good approximation. In this paper we study the interval statistics of slip events using a slider-block model. The behavior of this model is sensitive to the stiffness α of the system, α = kC/kL, where kC is the spring constant of the connector springs and kL is the spring constant of the loader plate springs. For a soft system (small α) there are no system-wide events and the interoccurrence time statistics of the larger events are Poissonian. For a stiff system (large α), system-wide events dominate the energy dissipation and the statistics of the recurrence times between these system-wide events satisfy the Weibull distribution to a good approximation. We argue that this applicability of the Weibull distribution is due to the power-law (scale-invariant) behavior of the hazard function, i.e. the probability that the next event will occur at a time t0 after the last event has a power-law dependence on t0. The Weibull distribution is the only distribution that has a scale-invariant hazard function.
We further show that the onset of system-wide events is a well defined critical point. We find that the number of system-wide events NSWE satisfies the scaling relation NSWE ∝(α-αC)δ where αC is the critical value of the stiffness. The system-wide events represent a new phase for the slider-block system.
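The closing claim about the Weibull hazard function is easy to check numerically for any given parameter choice: h(t) = (k/λ)(t/λ)^(k-1) is a pure power law, so it is scale invariant, h(c·t) = c^(k-1)·h(t). A minimal sketch (the parameter values are arbitrary, for illustration only):

```python
import math

def weibull_hazard(t, k, lam):
    """Hazard rate h(t) = f(t)/S(t) of a Weibull(shape=k, scale=lam)
    distribution: h(t) = (k/lam) * (t/lam)**(k - 1)."""
    return (k / lam) * (t / lam) ** (k - 1)

# Scale invariance: rescaling time by c multiplies the hazard by
# c**(k-1), independent of t (the power-law property cited above).
k, lam = 1.5, 2.0
t, c = 0.7, 3.0
lhs = weibull_hazard(c * t, k, lam)
rhs = c ** (k - 1) * weibull_hazard(t, k, lam)
assert math.isclose(lhs, rhs, rel_tol=1e-12)
```

For k = 1 the hazard is constant and the Weibull reduces to the exponential (Poissonian) case mentioned for interoccurrence times.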
Self-similarity of waiting times in fracture systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niccolini, G.; Bosia, F.; Carpinteri, A.
2009-08-15
Experimental and numerical results are presented for a fracture experiment carried out on a fiber-reinforced element under flexural loading, and a statistical analysis is performed for acoustic emission waiting-time distributions. By an optimization procedure, a recently proposed scaling law describing these distributions for different event magnitude scales is confirmed by both experimental and numerical data, thus reinforcing the idea that fracture of heterogeneous materials has scaling properties similar to those found for earthquakes. Analysis of the different scaling parameters obtained for experimental and numerical data leads us to formulate the hypothesis that the type of scaling function obtained depends on the level of correlation among fracture events in the system.
Inter-annual Variability of Temperature and Extreme Heat Events during the Nairobi Warm Season
NASA Astrophysics Data System (ADS)
Scott, A.; Misiani, H. O.; Zaitchik, B. F.; Ouma, G. O.; Anyah, R. O.; Jordan, A.
2016-12-01
Extreme heat events significantly stress all organisms in the ecosystem, and are likely to be amplified in peri-urban and urban areas. Understanding the variability and drivers behind these events is key to generating early warnings, yet in Equatorial East Africa, this information is currently unavailable. This study uses daily maximum and minimum temperature records from weather stations within Nairobi and its surroundings to characterize variability in daily minimum temperatures and the number of extreme heat events. ERA-Interim reanalysis is applied to assess the drivers of these events at event and seasonal time scales. At seasonal time scales, high temperatures in Nairobi are a function of large-scale climate variability associated with the Atlantic Multi-decadal Oscillation (AMO) and Global Mean Sea Surface Temperature (GMSST). Extreme heat events, however, are more strongly associated with the El Niño-Southern Oscillation (ENSO). The persistence of the AMO and ENSO, in particular, provides a basis for seasonal prediction of extreme heat events/days in Nairobi. It is also apparent that the temporal signal from extreme heat events in the tropics differs from classic heat wave definitions developed in the mid-latitudes, which suggests that a new approach for defining these events is necessary for tropical regions.
Λ(t)CDM model as a unified origin of holographic and agegraphic dark energy models
NASA Astrophysics Data System (ADS)
Chen, Yun; Zhu, Zong-Hong; Xu, Lixin; Alcaniz, J. S.
2011-04-01
Motivated by the fact that any nonzero Λ introduces a length scale or a time scale into Einstein's theory, r_Λ = c·t_Λ = √(3/|Λ|), and that, conversely, any cosmological length scale or time scale introduces a Λ(t), Λ(t) = 3/r_Λ²(t) = 3/(c²·t_Λ²(t)), in this Letter we investigate the time-varying Λ(t) corresponding to the length scales, including the Hubble horizon, the particle horizon and the future event horizon, and the time scales, including the age of the universe and the conformal time. It is found that, in this scenario, the Λ(t)CDM model can be taken as the unified origin of the holographic and agegraphic dark energy models with interaction between the matter and the dark energy, where the interacting term is determined by Q=-ρ. We place observational constraints on the Λ(t)CDM models originating from different cosmological length scales and time scales with the recently compiled “Union2 compilation” which consists of 557 Type Ia supernovae (SNIa) covering a redshift range 0.015⩽z⩽1.4. In conclusion, an accelerating expansion universe can be derived in the cases taking the Hubble horizon, the future event horizon, the age of the universe and the conformal time as the length scale or the time scale.
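Written out in display form, the correspondence the abstract invokes between a cosmological constant and a length or time scale is:

```latex
r_\Lambda = c\, t_\Lambda = \sqrt{3/|\Lambda|}
\qquad\Longleftrightarrow\qquad
\Lambda(t) = \frac{3}{r_\Lambda^2(t)} = \frac{3}{c^2\, t_\Lambda^2(t)}
```

Substituting a time-dependent horizon scale (Hubble horizon, particle horizon, future event horizon) or time scale (age of the universe, conformal time) for r_Λ or t_Λ then yields the corresponding Λ(t)CDM model.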
M.M. Cowden; J.L. Hart; C.J. Schweitzer; D.C. Dey
2014-01-01
Forest disturbances are discrete events in space and time that disrupt the biophysical environment and impart lasting legacies on forest composition and structure. Disturbances are often classified along a gradient of spatial extent and magnitude that ranges from catastrophic events where most of the overstory is removed to gap-scale events that modify local...
Extreme-volatility dynamics in crude oil markets
NASA Astrophysics Data System (ADS)
Jiang, Xiong-Fei; Zheng, Bo; Qiu, Tian; Ren, Fei
2017-02-01
Based on concepts and methods from statistical physics, we investigate extreme-volatility dynamics in the crude oil markets, using the high-frequency data from 2006 to 2010 and the daily data from 1986 to 2016. The dynamic relaxation of extreme volatilities is described by a power law, whose exponents usually depend on the magnitude of extreme volatilities. In particular, the relaxation before and after extreme volatilities is time-reversal symmetric at the high-frequency time scale, but time-reversal asymmetric at the daily time scale. This time-reversal asymmetry is mainly induced by exogenous events. However, the dynamic relaxation after exogenous events exhibits the same characteristics as that after endogenous events. An interacting herding model both with and without exogenous driving forces could qualitatively describe the extreme-volatility dynamics.
Review of subjective measures of human response to aircraft noise
NASA Technical Reports Server (NTRS)
Cawthorn, J. M.; Mayes, W. H.
1976-01-01
The development of aircraft noise rating scales and indexes is reviewed up to the present time. Single event scales, multiple event indexes, and their interrelation with each other, are considered. Research requirements for further refinement and development of aircraft noise rating quantification factors are discussed.
NASA Astrophysics Data System (ADS)
Tebbens, S. F.; Barton, C. C.; Scott, B. E.
2016-12-01
Traditionally, the size of natural disaster events such as hurricanes, earthquakes, tornadoes, and floods is measured in terms of wind speed (m/sec), energy released (ergs), or discharge (m3/sec) rather than by economic loss or fatalities. Economic loss and fatalities from natural disasters result from the intersection of the human infrastructure and population with the size of the natural event. This study investigates the size versus cumulative number distribution of individual natural disaster events for several disaster types in the United States. Economic losses are adjusted for inflation to 2014 USD. The cumulative number divided by the time over which the data ranges for each disaster type is the basis for making probabilistic forecasts in terms of the number of events greater than a given size per year and, its inverse, return time. Such forecasts are of interest to insurers/re-insurers, meteorologists, seismologists, government planners, and response agencies. Plots of size versus cumulative number distributions per year for economic loss and fatalities are well fit by power-law scaling functions of the form p(x) = C·x^(-β), where p(x) is the cumulative number of events with size equal to or greater than x, C is a constant (the activity level), x is the event size, and β is the scaling exponent. Economic loss and fatalities due to hurricanes, earthquakes, tornadoes, and floods are well fit by power functions over one to five orders of magnitude in size. Economic losses for hurricanes and tornadoes have greater scaling exponents, β = 1.1 and 0.9 respectively, whereas earthquakes and floods have smaller scaling exponents, β = 0.4 and 0.6 respectively. Fatalities for tornadoes and floods have greater scaling exponents, β = 1.5 and 1.7 respectively, whereas hurricanes and earthquakes have smaller scaling exponents, β = 0.4 and 0.7 respectively. The scaling exponents can be used to make probabilistic forecasts for time windows ranging from 1 to 1000 years.
Forecasts show that on an annual basis, in the United States, the majority of events with 10 fatalities and greater are related to floods and tornadoes; while events with 100 fatalities and greater are less frequent and are dominated by hurricanes and earthquakes. Disaster mitigation strategies need to account for these differences.
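The forecasting recipe described above, fitting p(x) = C·x^(-β) to the annual cumulative number of events and inverting it for return times, can be sketched as follows. The event sizes here are synthetic Pareto draws over a hypothetical 50-year record, not the study's loss or fatality data.

```python
import numpy as np

def fit_power_law(sizes, years_of_record):
    """Fit p(x) = C * x**(-beta), the cumulative number of events per
    year with size >= x, by least squares on log-log axes (sketch on
    synthetic data, not the study's records)."""
    x = np.sort(np.asarray(sizes, dtype=float))[::-1]
    cum_per_year = np.arange(1, x.size + 1) / years_of_record
    slope, log_c = np.polyfit(np.log(x), np.log(cum_per_year), 1)
    return np.exp(log_c), -slope

def return_time_years(x, c, beta):
    """Return time is the inverse of the annual exceedance rate p(x)."""
    return 1.0 / (c * x ** (-beta))

# Synthetic Pareto(alpha = 1) event sizes over a 50-year record;
# the fitted beta should come out roughly near 1 for this sample.
rng = np.random.default_rng(0)
sizes = rng.pareto(1.0, 500) + 1.0
c, beta = fit_power_law(sizes, 50.0)
```

A larger β means big events become rare faster, which is why the flood- and tornado-dominated fatality forecasts at the 10-fatality level give way to hurricanes and earthquakes at the 100-fatality level.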
The origins of multifractality in financial time series and the effect of extreme events
NASA Astrophysics Data System (ADS)
Green, Elena; Hanan, William; Heffernan, Daniel
2014-06-01
This paper presents the results of multifractal testing of two sets of financial data: daily data of the Dow Jones Industrial Average (DJIA) index and minutely data of the Euro Stoxx 50 index. Where multifractal scaling is found, the spectrum of scaling exponents is calculated via Multifractal Detrended Fluctuation Analysis. In both cases, further investigations reveal that the temporal correlations in the data are a more significant source of the multifractal scaling than are the distributions of the returns. It is also shown that the extreme events which make up the heavy tails of the distribution of the Euro Stoxx 50 log returns distort the scaling in the data set. The most extreme events are inimical to the scaling regime. This result is in contrast to previous findings that extreme events contribute to multifractality.
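The q = 2 backbone of Multifractal Detrended Fluctuation Analysis is ordinary DFA: integrate the series into a profile, remove a polynomial trend in windows of length s, and regress the RMS fluctuation F(s) against s on log-log axes. A minimal sketch on white noise (not the DJIA or Euro Stoxx data); a full MF-DFA would repeat this for a range of moment orders q and check whether the exponent varies with q.

```python
import numpy as np

def dfa_fluctuation(x, scales, order=1):
    """Detrended fluctuation function F(s): cumulative profile,
    windowed polynomial detrending, RMS residual per scale (the
    q = 2 special case of MF-DFA; an illustrative sketch, not the
    authors' full pipeline)."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n_seg = len(profile) // s
        segments = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        mse = [
            np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
            for seg in segments
        ]
        F.append(np.sqrt(np.mean(mse)))
    return np.asarray(F)

# For uncorrelated white noise the log-log slope H should be close
# to 0.5; correlated or multifractal data deviate from this.
rng = np.random.default_rng(3)
x = rng.normal(size=20_000)
scales = np.array([16, 32, 64, 128, 256])
F = dfa_fluctuation(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

Shuffling the input destroys temporal correlations while preserving the return distribution, which is how studies like this one separate the two sources of multifractality.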
Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E
2016-07-25
Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA to well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but that, by changing the maximum time scale of the WA, a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired.
Cascading events in linked ecological and socioeconomic systems
Peters, Debra P.C.; Sala, O.E.; Allen, Craig D.; Covich, A.; Brunson, M.
2007-01-01
Cascading events that start at small spatial scales and propagate non-linearly through time to influence larger areas often have major impacts on ecosystem goods and services. Events such as wildfires and hurricanes are increasing in frequency and magnitude as systems become more connected through globalization processes. We need to improve our understanding of these events in order to predict their occurrence, minimize potential impacts, and allow for strategic recovery. Here, we synthesize information about cascading events in systems located throughout the Americas. We discuss a variety of examples of cascading events that share a common feature: they are often driven by linked ecological and human processes across scales. In this era of globalization, we recommend studies that explicitly examine connections across scales and examine the role of connectivity among non-contiguous as well as contiguous areas.
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Bolvin, David T.; Gu, Guojun; Nelkin, Eric J.; Bowman, Kenneth P.; Stocker, Erich; Wolff, David B.
2006-01-01
The TRMM Multi-satellite Precipitation Analysis (TMPA) provides a calibration-based sequential scheme for combining multiple precipitation estimates from satellites, as well as gauge analyses where feasible, at fine scales (0.25 degrees x 0.25 degrees and 3-hourly). It is available both after and in real time, based on calibration by the TRMM Combined Instrument and TRMM Microwave Imager precipitation products, respectively. Only the after-real-time product incorporates gauge data at the present. The data set covers the latitude band 50 degrees N-S for the period 1998 to the delayed present. Early validation results are as follows: The TMPA provides reasonable performance at monthly scales, although it is shown to have precipitation rate dependent low bias due to lack of sensitivity to low precipitation rates in one of the input products (based on AMSU-B). At finer scales the TMPA is successful at approximately reproducing the surface-observation-based histogram of precipitation, as well as reasonably detecting large daily events. The TMPA, however, has lower skill in correctly specifying moderate and light event amounts on short time intervals, in common with other fine-scale estimators. Examples are provided of a flood event and diurnal cycle determination.
Estimating Rain Rates from Tipping-Bucket Rain Gauge Measurements
NASA Technical Reports Server (NTRS)
Wang, Jianxin; Fisher, Brad L.; Wolff, David B.
2007-01-01
This paper describes the cubic-spline-based operational system for the generation of the TRMM one-minute rain rate product 2A-56 from Tipping Bucket (TB) gauge measurements. Methodological issues associated with applying the cubic spline to the TB gauge rain rate estimation are closely examined. A simulated TB gauge from a Joss-Waldvogel (JW) disdrometer is employed to evaluate the effects of time scales and rain event definitions on errors of the rain rate estimation. The comparison between rain rates measured from the JW disdrometer and those estimated from the simulated TB gauge shows good overall agreement; however, the TB gauge suffers sampling problems, resulting in errors in the rain rate estimation. These errors are very sensitive to the time scale of rain rates. One-minute rain rates suffer substantial errors, especially at low rain rates. When one-minute rain rates are averaged to 4-7 minute or longer time scales, the errors reduce dramatically. The rain event duration is very sensitive to the event definition but the event rain total is rather insensitive, provided that events with less than 1 millimeter rain totals are excluded. Estimated lower rain rates are sensitive to the event definition whereas the higher rates are not. The median relative absolute errors are about 22% and 32% for 1-minute TB rain rates higher and lower than 3 mm per hour, respectively. These errors decrease to 5% and 14% when TB rain rates are used at the 7-minute scale. The radar reflectivity-rain rate (Ze-R) distributions drawn from a large amount of 7-minute TB rain rates and radar reflectivity data are mostly insensitive to the event definition.
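The cubic-spline idea, interpolating the cumulative rainfall at tip times and differentiating to obtain an instantaneous rain rate, can be sketched with SciPy. The 0.254 mm bucket size and the tip times below are illustrative assumptions, not the 2A-56 configuration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def rain_rate_from_tips(tip_times_min, tip_mm=0.254, eval_times=None):
    """Estimate rain rate (mm/h) from tipping-bucket tip times: fit a
    cubic spline to cumulative rainfall at each tip and evaluate its
    first derivative. The 0.254 mm/tip bucket size is a common choice
    used here for illustration only."""
    tip_times = np.asarray(tip_times_min, dtype=float)
    cumulative = tip_mm * np.arange(1, tip_times.size + 1)
    spline = CubicSpline(tip_times, cumulative)
    if eval_times is None:
        eval_times = tip_times
    # Spline derivative is in mm/min; convert to mm/h.
    return 60.0 * spline(np.asarray(eval_times), 1)

# One tip every 2 minutes is a steady 0.254 mm / 2 min = 7.62 mm/h.
tip_times = np.arange(2.0, 22.0, 2.0)
rates = rain_rate_from_tips(tip_times)
```

The sampling problem the abstract describes is visible in this framing: at low rain rates the tips are far apart, so the spline is constrained by very few points per minute and one-minute rates become unreliable.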
Synchronization Of Parallel Discrete Event Simulations
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S.
1992-01-01
Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Algorithm processes events optimistically in time cycles adapting while simulation in progress. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
Time-decreasing hazard and increasing time until the next earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corral, Alvaro
2005-01-01
The existence of a slowly, monotonically decreasing probability density for the recurrence times of earthquakes in the stationary case implies that the occurrence of an event at a given instant becomes more unlikely as the time since the previous event increases. Consequently, the expected waiting time to the next earthquake increases with the elapsed time; that is, the event moves fast away into the future. We have found direct empirical evidence of this counterintuitive behavior in two worldwide catalogs as well as in diverse regional catalogs. Universal scaling functions describe the phenomenon well.
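The counterintuitive effect is easy to reproduce by Monte Carlo: for a heavy-tailed recurrence-time density, the expected remaining waiting time grows with the time already elapsed. The Pareto distribution below is a generic stand-in for a slowly decaying density, not the catalog distributions analyzed in the paper.

```python
import numpy as np

def mean_residual_time(samples, elapsed):
    """Mean remaining waiting time, conditional on 'elapsed' time
    having already passed without an event, estimated from sampled
    recurrence times."""
    remaining = samples[samples > elapsed] - elapsed
    return remaining.mean()

# Heavy-tailed (Pareto, shape 1.5, minimum 1) recurrence times.
rng = np.random.default_rng(1)
recurrence = rng.pareto(1.5, 200_000) + 1.0
m1 = mean_residual_time(recurrence, 1.0)
m2 = mean_residual_time(recurrence, 5.0)
# m2 > m1: the longer we have waited, the longer we still expect to wait.
```

For an exponential (Poissonian) density the two estimates would coincide (memorylessness); the increasing residual time is precisely the signature of a decreasing hazard.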
Joint scale-change models for recurrent events and failure time.
Xu, Gongjun; Chiou, Sy Han; Huang, Chiung-Yu; Wang, Mei-Cheng; Yan, Jun
2017-01-01
Recurrent event data arise frequently in various fields such as biomedical sciences, public health, engineering, and social sciences. In many instances, the observation of the recurrent event process can be stopped by the occurrence of a correlated failure event, such as treatment failure and death. In this article, we propose a joint scale-change model for the recurrent event process and the failure time, where a shared frailty variable is used to model the association between the two types of outcomes. In contrast to the popular Cox-type joint modeling approaches, the regression parameters in the proposed joint scale-change model have marginal interpretations. The proposed approach is robust in the sense that no parametric assumption is imposed on the distribution of the unobserved frailty and that we do not need the strong Poisson-type assumption for the recurrent event process. We establish consistency and asymptotic normality of the proposed semiparametric estimators under suitable regularity conditions. To estimate the corresponding variances of the estimators, we develop a computationally efficient resampling-based procedure. Simulation studies and an analysis of hospitalization data from the Danish Psychiatric Central Register illustrate the performance of the proposed method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elber, Ron
Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.
NASA Technical Reports Server (NTRS)
Dryer, M. (Editor); Tandberg-Hanssen, E.
1980-01-01
The symposium focuses on solar phenomena as the source of transient events propagating through the solar system, and theoretical and observational assessments of the dynamic processes involved in these events. The topics discussed include the life history of coronal structures and fields, coronal and interplanetary responses to long time scale phenomena, solar transient phenomena affecting the corona and interplanetary medium, coronal and interplanetary responses to short time scale phenomena, and future directions.
AN AUTOMATIC DETECTION METHOD FOR EXTREME-ULTRAVIOLET DIMMINGS ASSOCIATED WITH SMALL-SCALE ERUPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alipour, N.; Safari, H.; Innes, D. E.
2012-02-10
Small-scale extreme-ultraviolet (EUV) dimming often surrounds sites of energy release in the quiet Sun. This paper describes a method for the automatic detection of these small-scale EUV dimmings using a feature-based classifier. The method is demonstrated using sequences of 171 Å images taken by the STEREO/Extreme UltraViolet Imager (EUVI) on 2007 June 13 and by the Solar Dynamics Observatory/Atmospheric Imaging Assembly on 2010 August 27. The feature identification relies on recognizing structure in sequences of space-time 171 Å images using the Zernike moments of the images. The Zernike moments of space-time slices with events and non-events are distinctive enough to be separated using a support vector machine (SVM) classifier. The SVM is trained using 150 event and 700 non-event space-time slices. We find a total of 1217 events in the EUVI images and 2064 events in the AIA images on the days studied. Most of the events are found between latitudes -35° and +35°. The sizes and expansion speeds of central dimming regions are extracted using a region-grow algorithm. The histograms of the sizes in both EUVI and AIA follow a steep power law with slope of about -5. The AIA slope extends to smaller sizes before turning over. The mean velocity of 1325 dimming regions seen by AIA is found to be about 14 km s⁻¹.
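The classification step, training an SVM on feature vectors from event and non-event space-time slices, can be sketched with scikit-learn. The Gaussian features below are synthetic stand-ins for the paper's Zernike moments, and the RBF kernel and class weighting are assumptions for illustration, not the published configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-ins for Zernike-moment feature vectors: 150 'event'
# and 700 'non-event' slices, mirroring the training-set sizes above
# (the real features come from 171 A image sequences).
rng = np.random.default_rng(42)
n_features = 16
events = rng.normal(loc=1.0, size=(150, n_features))
non_events = rng.normal(loc=0.0, size=(700, n_features))
X = np.vstack([events, non_events])
y = np.concatenate([np.ones(150), np.zeros(700)])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# class_weight="balanced" compensates for the 150-vs-700 imbalance.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

With well-separated feature distributions the classifier distinguishes the two classes readily; the hard part in practice is making the features (here, Zernike moments of the space-time slices) that separable.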
Scaling and intermittency of brain events as a manifestation of consciousness
NASA Astrophysics Data System (ADS)
Paradisi, P.; Allegrini, P.; Gemignani, A.; Laurino, M.; Menicucci, D.; Piarulli, A.
2013-01-01
We discuss the critical brain hypothesis and its relationship with intermittent renewal processes displaying power-law decay in the distribution of waiting times between two consecutive renewal events. In particular, studies on complex systems in a "critical" condition show that macroscopic variables, integrating the activities of many individual functional units, undergo fluctuations with an intermittent serial structure characterized by avalanches with inverse-power-law (scale-free) distribution densities of sizes and inter-event times. This condition, which is denoted as "fractal intermittency", was found in the electroencephalograms of subjects observed during a resting-state wake condition. It remained unsolved whether fractal intermittency correlates with the stream of consciousness or with a non-task-driven default mode activity, also present in non-conscious states, like deep sleep. After reviewing a method of scaling analysis of intermittent systems based on event-driven random walks, we show that during deep sleep fractal intermittency breaks down, and is reestablished during REM (Rapid Eye Movement) sleep, with essentially the same anomalous scaling as the pre-sleep wake condition. From the comparison of the pre-sleep wake, deep sleep and REM conditions we argue that the scaling features of intermittent brain events are related to the level of consciousness and, consequently, could be exploited as a possible indicator of consciousness in clinical applications.
USDA-ARS?s Scientific Manuscript database
Emergent properties and cross-scale interactions are important in driving landscape-scale dynamics during a disturbance event, such as wildfire. We used these concepts related to changing pattern-process relationships across scales to explain ecological responses following disturbance that resulted ...
NASA Astrophysics Data System (ADS)
Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina
2016-04-01
Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background, or long-term, PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event, as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty on the eruptive record, affecting the time of past events, the location of vents and the estimates of PDC areal extent. First probability maps of PDC invasion were produced by combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation, tested against 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of the eruption scale, vent location and time of forecast of the next event. Specific focus is given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. With the quantification of some sources of uncertainty in the system, we were also able to provide mean and percentile maps of PDC hazard levels.
NASA Astrophysics Data System (ADS)
Hixson, J.; Ward, A. S.; Schmadel, N.
2015-12-01
The exchange of water and solutes across the stream-hyporheic-riparian-hillslope continuum is controlled by the interaction of dynamic hydrological processes with the underlying geological setting. Our current understanding of exchange processes is primarily based on field observations collected during baseflow conditions, with few studies considering time-variable stream-aquifer interactions during storm events. We completed ten sets of four in-stream tracer slug injections during and after a large storm event in a headwater catchment at the H.J. Andrews Experimental Forest, Oregon. The injections were performed in three adjacent 50-meter study reaches, enabling comparison of spatial heterogeneity in transport processes. Reach-scale data demonstrate apparent trends with discharge in both transient storage and long-term storage (commonly "channel water balance"). Comparison of flowpath-scale observations from a network of monitoring wells with reach-scale observations showed that the advective timescale changed with discharge, making it difficult to infer process from simple, reach-scale tracer studies. Overall, our results highlight the opportunities and challenges for interpretation of multi-scale solute tracer data along the stream-hyporheic-riparian-hillslope continuum.
NASA Astrophysics Data System (ADS)
Koskelo, Antti I.; Fisher, Thomas R.; Utz, Ryan M.; Jordan, Thomas E.
2012-07-01
Baseflow separation methods are often impractical, requiring expensive materials and time-consuming procedures, and/or are not designed for individual events in small watersheds. To provide a simple baseflow separation method for small watersheds, we describe a new precipitation-based technique known as the Sliding Average with Rain Record (SARR). SARR uses rainfall data to justify each separation of the hydrograph. SARR has several advantages: it shows better consistency with the precipitation and discharge records, it is easier and more practical to implement, and it includes a method of event identification based on precipitation and quickflow response. SARR was derived from the United Kingdom Institute of Hydrology (UKIH) method with several key modifications to adapt it for small watersheds (<50 km²). We tested SARR on watersheds in the Choptank Basin on the Delmarva Peninsula (US Mid-Atlantic region) and compared the results with the UKIH method at the annual scale and with the hydrochemical method at the individual event scale. Annually, SARR calculated a baseflow index that was ~10% higher than the UKIH method due to the finer time step of SARR (1 d) compared to UKIH (5 d). At the watershed scale, hydric soils were an important driver of the annual baseflow index, likely due to increased groundwater retention in hydric areas. At the event scale, SARR calculated less baseflow than the hydrochemical method, again because of the differences in time step (hourly for hydrochemical) and different definitions of baseflow. Both SARR and hydrochemical baseflow increased with event size, suggesting that baseflow contributions are more important during larger storms. To make SARR easy to implement, we have written a MATLAB program to automate the calculations, which requires only daily rainfall and daily flow data as inputs.
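The UKIH block-minimum procedure that SARR modifies can be sketched as follows. This is a generic illustration of the turning-point idea, not the authors' MATLAB program: the 5-day block and 0.9 turning-point factor are the standard UKIH choices, and the rain-record check that distinguishes SARR is not reproduced.

```python
# Sketch of the UKIH block-minimum baseflow separation (the method SARR
# refines with a 1-day step and a rainfall check, not reproduced here).

def ukih_baseflow(flow, block=5, turn_factor=0.9):
    """Return a baseflow series via UKIH turning-point interpolation."""
    # 1. minima of non-overlapping blocks, with the index of each minimum
    minima = []
    for start in range(0, len(flow) - block + 1, block):
        seg = flow[start:start + block]
        i = min(range(len(seg)), key=seg.__getitem__)
        minima.append((start + i, seg[i]))
    # 2. turning points: scaled block minima lower than both neighbours
    turns = []
    for j in range(1, len(minima) - 1):
        t, q = minima[j]
        if (turn_factor * q <= minima[j - 1][1]
                and turn_factor * q <= minima[j + 1][1]):
            turns.append((t, q))
    # 3. linear interpolation between turning points, capped at total flow
    base = [0.0] * len(flow)
    for (t0, q0), (t1, q1) in zip(turns, turns[1:]):
        for t in range(t0, t1 + 1):
            b = q0 + (q1 - q0) * (t - t0) / (t1 - t0)
            base[t] = min(b, flow[t])
    return base

# A short synthetic daily hydrograph (arbitrary units)
flow = [5, 4, 3, 4, 6, 9, 5, 4, 3, 4, 7, 5, 4, 3, 4, 8, 6, 5, 4, 6]
base = ukih_baseflow(flow)
```

Between turning points the separated baseflow never exceeds the total hydrograph; outside the turning-point span it is left at zero, as in the classic method.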
Seismic Parameters of Mining-Induced Aftershock Sequences for Re-entry Protocol Development
NASA Astrophysics Data System (ADS)
Vallejos, Javier A.; Estay, Rodrigo A.
2018-03-01
A common characteristic of deep mines in hard rock is induced seismicity. This results from stress changes and rock failure around mining excavations. Following large seismic events, there is an increase in the levels of seismicity, which gradually decay with time. Restricting access to areas of a mine for enough time to allow this decay of seismic events is the main approach in re-entry strategies. The statistical properties of aftershock sequences can be studied with three scaling relations: (1) the Gutenberg-Richter frequency-magnitude relation, (2) the modified Omori law (MOL) for the temporal decay, and (3) Båth's law for the magnitude of the largest aftershock. In this paper, these three scaling relations, in addition to the stochastic Reasenberg-Jones model, are applied to study the characteristic parameters of 11 large-magnitude mining-induced aftershock sequences in four mines in Ontario, Canada. To provide guidelines for re-entry protocol development, the dependence of the scaling-relation parameters on the magnitude of the main event is studied. Some relations between the parameters and the magnitude of the main event are found. Using these relationships and the scaling relations, a space-time-magnitude re-entry protocol is developed. These findings provide a first approximation to concise and well-justified guidelines for re-entry protocol development applicable to the range of mining conditions found in Ontario, Canada.
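The first two scaling relations above can be written down directly. The sketch below evaluates them with made-up parameter values (K, c, p, a, b are illustrative, not the values fitted to the Ontario sequences).

```python
# Illustrative evaluation of two of the scaling relations named above.
# Parameter values are invented for the sketch, not fitted to mining data.

def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate at time t after the main event."""
    return K / (c + t) ** p

def gr_count(m, a=4.0, b=1.0):
    """Gutenberg-Richter: expected number of events with magnitude >= m."""
    return 10 ** (a - b * m)

# The rate decays with elapsed time; counts fall tenfold per unit magnitude
r1, r24 = omori_rate(1.0), omori_rate(24.0)
n2, n3 = gr_count(2.0), gr_count(3.0)
```

A simple re-entry rule of thumb then reads off the time at which `omori_rate` falls below a tolerable event rate, which is the kind of space-time-magnitude criterion the paper formalizes.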
Li, Yu; Wang, Nai'ang; Zhang, Chengqi
2014-01-01
The mid-latitudes of East Asia are characterized by the interaction between the Asian summer monsoon and the westerly winds. Understanding long-term climate change in the marginal regions of the Asian monsoon is critical for understanding the millennial-scale interactions between the Asian monsoon and the westerly winds. Abrupt climate events are always associated with changes in large-scale circulation patterns; therefore, investigations into abrupt climate changes provide clues for responses of circulation patterns to extreme climate events. In this paper, we examined the time scale and mid-Holocene climatic background of an abrupt dry mid-Holocene event in the Shiyang River drainage basin in the northwest margin of the Asian monsoon. Mid-Holocene lacustrine records were collected from the middle reaches and the terminal lake of the basin. Using radiocarbon and OSL ages, a centennial-scale drought event, which is characterized by a sand layer in lacustrine sediments both from the middle and lower reaches of the basin, was absolutely dated between 8.0–7.0 cal kyr BP. Grain size data suggest an abrupt decline in lake level and a dry environment in the middle reaches of the basin during the dry interval. Previous studies have shown mid-Holocene drought events in other places of monsoon marginal zones; however, their chronologies are not strong enough to study the mechanism. According to the absolutely dated records, we proposed a new hypothesis that the mid-Holocene dry interval can be related to the weakening Asian summer monsoon and the relatively arid environment in arid Central Asia. Furthermore, abrupt dry climatic events are directly linked to the basin-wide effective moisture change in semi-arid and arid regions. Effective moisture is affected by basin-wide precipitation, evapotranspiration, lake surface evaporation and other geographical settings. 
As a result, the time scales of the dry interval could vary according to locations due to different geographical features. PMID:24599259
Cytoskeletal dynamics in fission yeast: a review of models for polarization and division
Drake, Tyler; Vavylonis, Dimitrios
2010-01-01
We review modeling studies concerning cytoskeletal activity of fission yeast. Recent models vary in length and time scales, describing a range of phenomena from cellular morphogenesis to polymer assembly. The components of the cytoskeleton act in concert to mediate cell-scale events and interactions such as polarization. The mathematical models reduce these events and interactions to their essential ingredients, describing the cytoskeleton by its bulk properties. On a smaller scale, models describe cytoskeletal subcomponents and how bulk properties emerge. PMID:21119765
Testing for scale-invariance in extreme events, with application to earthquake occurrence
NASA Astrophysics Data System (ADS)
Main, I.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A.; McCloskey, J.
2009-04-01
We address the generic problem of testing for scale-invariance in extreme events, i.e. are the biggest events in a population simply a scaled model of those of smaller size, or are they in some way different? Are large earthquakes, for example, 'characteristic'; do they 'know' how big they will be before the event nucleates; or is the size of the event determined only in the avalanche-like process of rupture? In either case, what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in the Earth and environmental sciences. Using frequency data, however, introduces a number of problems in data analysis. The inevitably small number of data points for extreme events, and more generally the non-Gaussian statistical properties, strongly affect the validity of prior assumptions about the nature of uncertainties in the data. The simple use of traditional least squares (still common in the literature) introduces an inherent bias to the best-fit result. We show first that the sampled frequency in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converges to a central limit only very slowly due to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes. In this sense the confidence limits are scale-invariant. A systematic sample-bias effect due to counting whole numbers in a finite catalogue makes a 'characteristic'-looking extreme-event distribution a likely outcome of an underlying scale-invariant probability distribution. This highlights the tendency of 'eyeball' fits to assume, unconsciously but wrongly in this case, Gaussian errors. We develop methods to correct for these effects, and show that the current best-fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.
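The maximum-likelihood alternative to biased least-squares fitting of frequency data can be sketched with the standard Hill-type estimator for a power-law exponent. The synthetic event sizes below are drawn from a known Pareto law so the recovered exponent can be checked; the true exponent of 2.5 is illustrative only, not the global frequency-moment value.

```python
# Minimal maximum-likelihood fit of a power-law exponent to event sizes,
# the kind of estimator that avoids the least-squares bias discussed above.
import math, random

def hill_mle_exponent(sizes, s_min):
    """MLE exponent alpha for p(s) ~ s^-alpha, s >= s_min (Hill estimator)."""
    tail = [s for s in sizes if s >= s_min]
    return 1.0 + len(tail) / sum(math.log(s / s_min) for s in tail)

random.seed(42)
alpha_true, s_min = 2.5, 1.0
# Inverse-transform sampling from the Pareto law p(s) ~ s^-alpha
sizes = [s_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(20000)]
alpha_hat = hill_mle_exponent(sizes, s_min)
```

With 20,000 samples the estimator recovers the exponent to within a few percent; for the handful of extreme events in a real catalogue, the sampling uncertainty is far larger, which is the paper's central point.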
Chapter two: Phenomenology of tsunamis II: scaling, event statistics, and inter-event triggering
Geist, Eric L.
2012-01-01
Observations related to tsunami catalogs are reviewed and described in a phenomenological framework. An examination of scaling relationships between earthquake size (as expressed by scalar seismic moment and mean slip) and tsunami size (as expressed by mean and maximum local run-up and maximum far-field amplitude) indicates that scaling is significant at the 95% confidence level, although there is uncertainty in how well earthquake size can predict tsunami size (R² ~ 0.4-0.6). In examining tsunami event statistics, current methods used to estimate the size distribution of earthquakes and landslides and the inter-event time distribution of earthquakes are first reviewed. These methods are adapted to estimate the size and inter-event distribution of tsunamis at a particular recording station. Using a modified Pareto size distribution, the best-fit power-law exponents of tsunamis recorded at nine Pacific tide-gauge stations exhibit marked variation, in contrast to the approximately constant power-law exponent for inter-plate thrust earthquakes. With regard to the inter-event time distribution, significant temporal clustering of tsunami sources is demonstrated. For tsunami sources occurring in close proximity to other sources in both space and time, a physical triggering mechanism, such as static stress transfer, is a likely cause for the anomalous clustering. Mechanisms of earthquake-to-earthquake and earthquake-to-landslide triggering are reviewed. Finally, a modification of statistical branching models developed for earthquake triggering is introduced to describe triggering among tsunami sources.
Single-shot optical recording with sub-picosecond resolution spans record nanosecond lengths
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muir, Ryan; Heebner, John
2018-01-18
With the advent of electronics, oscilloscopes and photodiodes are now routinely capable of measuring events well below nanosecond resolution. However, these electronic instruments do not currently measure events below 10 ps resolution. From Walden’s observation that there is an engineering tradeoff between electronic bit depth and temporal resolution in analog-to-digital converters, this technique is projected to have extremely poor fidelity if it is extended to record single events with picosecond resolution. While this constraint may be circumvented with extensive signal averaging or other multiple-measurement approaches, rare events and nonrepetitive events cannot be observed with this technique. Techniques capable of measuring information in a single shot are often required. There is a general lack of available technologies that are easily scalable to long records with sub-picosecond resolution, and are simultaneously versatile in wavelength of operation. Since it is difficult to scale electronic methods to shorter resolutions, we instead aim to scale optical methods to longer records. Demonstrated optical recording methods that have achieved 1 ps resolution and long recording lengths rely on either time scaling to slow down the temporal information or, like Wien, perform time-to-space mapping so that fast events may be captured with a conventional camera.
Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.
2016-01-01
Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than did non-flying phobics. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes, but the nature of this relationship remains to be determined. PMID:27557054
NASA Astrophysics Data System (ADS)
Nippgen, F.; Ross, M. R. V.; Bernhardt, E. S.; McGlynn, B. L.
2017-12-01
Mountaintop mining (MTM) is an especially destructive form of surface coal mining. It is widespread in Central Appalachia and is practiced around the world. In the process of accessing coal seams up to several hundred meters below the surface, mountaintops and ridges are removed via explosives and heavy machinery, with the resulting overburden pushed into nearby valleys. This broken-up rock and soil material represents a largely unknown amount of storage for incoming precipitation that facilitates enhanced chemical weathering rates and increased dissolved-solids exports to streams. However, assessing the independent impact of MTM can be difficult in the presence of other forms of mining, especially underground mining. Here, we evaluate the effect of MTM on water quantity and quality on annual, seasonal, and event time scales in two sets of paired watersheds in southwestern West Virginia impacted by MTM. On an annual timescale, the mined watersheds sustained baseflow throughout the year, while the first-order watersheds ceased flowing during the latter parts of the growing season. In fractionally mined watersheds that continued to flow, the water in the stream was exclusively generated from mined portions of the watersheds, leading to elevated total dissolved solids in the stream water. On the event time scale, we analyzed 50 storm events over a water year for a range of hydrologic response metrics. The mined watersheds exhibited smaller runoff ratios and longer response times during the wet dormant season, but responded similarly to rainfall events during the growing season or even exceeded the runoff magnitude of the reference watersheds. Our research demonstrates clear differences in hydrologic response between mined and unmined watersheds during the growing season and the dormant season that are detectable at annual, seasonal, and event time scales. For larger spatial scales (up to 2,000 km²) the effect of MTM on water quantity is not as easily detectable: at these scales, other land uses can mask possible alterations in hydrology, or the percentage of MTM-disturbed area becomes negligible.
Wu, Sheng; Li, Hong; Petzold, Linda R.
2015-01-01
The inhomogeneous stochastic simulation algorithm (ISSA) is a fundamental method for spatial stochastic simulation. However, when diffusion events occur more frequently than reaction events, simulating the diffusion events with ISSA is quite costly. To reduce this cost, we propose to use a time-dependent propensity function in each step. In this way we can avoid simulating individual diffusion events, and can use the time interval between two adjacent reaction events as the simulation step size. We demonstrate that the new algorithm can achieve orders-of-magnitude efficiency gains over widely used exact algorithms, scales well with increasing grid resolution, and maintains a high level of accuracy. PMID:26609185
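For context, the event-by-event stepping that makes frequent diffusion so expensive can be sketched with a minimal Gillespie direct-method SSA for a single decay channel. This is the exact-algorithm baseline that the ISSA family extends, not the authors' time-dependent-propensity method; the rate constant and population are arbitrary.

```python
# Minimal Gillespie direct-method SSA for the reaction A -> 0 (rate k*A).
# Every event is simulated individually, which is exactly the cost the
# time-dependent-propensity approach above avoids for diffusion events.
import random

def ssa_decay(n0, k, t_end, rng):
    """Simulate A -> 0 with propensity k*A; return (times, counts)."""
    t, n = 0.0, n0
    times, counts = [0.0], [n0]
    while n > 0:
        a = k * n                    # propensity of the only channel
        t += rng.expovariate(a)      # exponential waiting time to next event
        if t > t_end:
            break
        n -= 1                       # fire one decay event
        times.append(t)
        counts.append(n)
    return times, counts

rng = random.Random(1)
times, counts = ssa_decay(n0=200, k=0.5, t_end=50.0, rng=rng)
```

Each firing costs one random draw and one update; with a diffusion channel firing thousands of times per reaction, this loop is what dominates the runtime.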
NASA Astrophysics Data System (ADS)
Lutoff, C.; Anquetin, S.; Ruin, I.; Chassande, M.
2009-09-01
Flash floods are complex phenomena. The atmospheric and hydrological mechanisms generating the phenomenon are not completely understood, leading to highly uncertain forecasts of and warnings for these events. On the other hand, warning and crisis response to such violent and fast events is not a straightforward process. In both the social and physical aspects of the problem, the space and time scales involved in hydrometeorology, human behavior and social organization are of crucial importance. Forecasters, emergency managers, mayors, school superintendents, school transportation managers, first responders and road users all have different time and space frameworks that they use to take emergency decisions for themselves, their group or community. The integration of the space and time scales of both the phenomenon and human activities is therefore a necessity to better address questions such as forecasting lead time and warning efficiency. The aim of this oral presentation is to focus on the spatio-temporal aspects of flash floods to improve our understanding of the event dynamics compared to the different scales of the social response. The authors propose a framework of analysis to compare the temporality of: i) the forecasts (from Météo-France and from EFAS (Thielen et al., 2008)); ii) the meteorological and hydrological parameters; iii) the social response at different scales. The September 2005 event is particularly interesting for such an analysis. The rainfall episode lasted nearly a week, with two distinct phases separated by low-intensity precipitation. As a result, the Météo-France vigilance bulletins were somewhat disconnected from the local flood impacts. Our analysis focuses on the timings of different types of local response, including the delicate issue of school transportation, in relation to the forecasts and the actual dynamics of the event.
Time Variations in Forecasts and Occurrences of Large Solar Energetic Particle Events
NASA Astrophysics Data System (ADS)
Kahler, S. W.
2015-12-01
The onsets and development of large solar energetic (E > 10 MeV) particle (SEP) events have been characterized in many studies. The statistics of SEP event onset delay times from associated solar flares and coronal mass ejections (CMEs), which depend on solar source longitudes, can be used to provide better predictions of whether a SEP event will occur following a large flare or fast CME. In addition, size distributions of peak SEP event intensities provide a means for a probabilistic forecast of the peak intensities attained in observed SEP increases. SEP event peak intensities have been compared with their rise and decay times for insight into the acceleration and transport processes. These two time scales are generally treated as independent parameters describing the development of a SEP event, but we can invoke an alternative two-parameter description based on the assumption that decay times exceed rise times for all events. These two parameters, from the well-known Weibull distribution, provide an event description in terms of its basic shape and duration. We apply this distribution to several large SEP events and ask what the characteristic parameters and their dependence on source longitudes can tell us about the origins of these important events.
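The two-parameter Weibull description can be sketched directly: a shape parameter k sets the rise/decay asymmetry of the intensity profile, and a scale parameter λ (written `lam` below) sets the event duration. The values used are illustrative, not fits to observed SEP events.

```python
# Sketch of a Weibull pdf used as a normalized SEP intensity-vs-time
# profile: shape k controls rise/decay asymmetry, lam the duration scale.
import math

def weibull_profile(t, k, lam):
    """Weibull pdf evaluated as intensity at time t (t > 0)."""
    if t <= 0.0:
        return 0.0
    x = t / lam
    return (k / lam) * x ** (k - 1.0) * math.exp(-(x ** k))

def weibull_peak_time(k, lam):
    """Time of maximum intensity for k > 1 (for k <= 1 the peak is at 0+)."""
    return lam * ((k - 1.0) / k) ** (1.0 / k)

k, lam = 2.0, 10.0          # illustrative shape and duration (hours, say)
t_pk = weibull_peak_time(k, lam)
```

For k > 1 the profile rises to a single peak before lam and then decays more slowly than it rose, which is the "decay exceeds rise" assumption behind the two-parameter description above.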
Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.
2017-10-01
Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
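The "smooth inverse power-law acceleration of AE rate to failure" described above underpins the classic inverse-rate forecast: when the precursor rate grows as 1/(t_f − t), the inverse rate falls linearly and the root of a straight-line fit gives the failure time. The sketch below applies this to noise-free synthetic data with a known t_f = 100; the paper's systematic-error correction for real, noisy AE data is not reproduced.

```python
# Minimal inverse-rate failure forecast on synthetic precursor data.
# The synthetic rate follows rate = 1/(t_f - t) exactly, so the linear
# fit of inverse rate vs time recovers t_f; real AE data are far noisier.

def linear_fit(xs, ys):
    """Ordinary least squares y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

t_fail = 100.0
ts = [10.0 * i for i in range(1, 10)]        # observation times 10..90
inv_rate = [t_fail - t for t in ts]          # 1/rate for rate = 1/(t_f - t)
a, b = linear_fit(ts, inv_rate)
t_forecast = -a / b                          # root of the fitted line
```

With noisy rates the fitted root scatters around the true failure time, and the paper's result says the scatter (and its bias) depends on whether precursor cracks span the inter-flaw distance.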
On the use of variability time-scales as an early classifier of radio transients and variables
NASA Astrophysics Data System (ADS)
Pietka, M.; Staley, T. D.; Pretorius, M. L.; Fender, R. P.
2017-11-01
We have shown previously that a broad correlation between the peak radio luminosity and the variability time-scale, approximately L ∝ τ⁵, exists for variable synchrotron-emitting sources, and that different classes of astrophysical sources occupy different regions of luminosity and time-scale space. Based on those results, we investigate whether the most basic information available for a newly discovered radio variable or transient - its rise and/or decline rate - can be used to set initial constraints on the class of events from which it originates. We have analysed a sample of ≈800 synchrotron flares, selected from light curves of ≈90 sources observed at 5-8 GHz, representing a wide range of astrophysical phenomena, from flare stars to supermassive black holes. Selection of outbursts from the noisy radio light curves has been done automatically in order to ensure reproducibility of results. The distribution of rise/decline rates for the selected flares is modelled as a Gaussian probability distribution for each class of object, and further convolved with the estimated areal density of that class in order to correct for the strong bias in our sample. We show in this way that the measured variability time-scale of a radio transient/variable of unknown origin can provide an early, albeit approximate, classification of the object, and could form part of a suite of measurements used to provide early categorization of such events. Finally, we also discuss the effect scintillating sources will have on our ability to classify events based on their variability time-scales.
Investigation of relationships between parameters of solar nano-flares and solar activity
NASA Astrophysics Data System (ADS)
Safari, Hossein; Javaherian, Mohsen; Kaki, Bardia
2016-07-01
Solar flares are important coronal events that originate in solar magnetic activity and release large amounts of energy into the interplanetary medium immediately after being triggered. Flare prediction can play a major role in avoiding eventual damage on Earth. Here, to interpret solar large-scale events (e.g., flares), we investigate relationships between small-scale events (nano-flares) and large-scale events. In our method, intensity time series of nano-flares are simulated using a Monte Carlo method. Then, full-disk solar images taken at 171 angstroms, recorded by SDO/AIA, are employed. Parts of the solar disk (quiet Sun (QS), coronal holes (CHs), and active regions (ARs)) are cropped and the time series of these regions are extracted. To compare the simulated nano-flare intensity time series with the intensity time series extracted from different parts of the Sun, artificial neural networks are employed. We are thereby able to extract physical parameters of nano-flares, such as kick and decay-rate lifetimes and the exponents of their power-law distributions. The variation of the power-law exponent within the QS and CHs is similar to that within ARs. Thus, by observing a small part of the Sun, we can follow the course of solar activity.
NASA Astrophysics Data System (ADS)
Müller, Eva; Pfister, Angela; Bürger, Gerd; Heistermann, Maik; Bronstert, Axel
2015-04-01
Hydrological extreme events can be triggered by rainfall on different spatiotemporal scales: river floods are typically caused by events lasting between hours and days, while urban flash floods, soil erosion, and contaminant transport result rather from storm events of very short duration (minutes). Still, the analysis of climate change impacts on rainfall-induced extreme events is usually carried out using daily precipitation data at best. Trend analyses of extreme rainfall at sub-daily or even sub-hourly time scales are rare. In this contribution, two lines of research are combined: first, we analyse sub-hourly rainfall data spanning several decades in three European regions. Second, we investigate the scaling behaviour of heavy short-term precipitation with temperature, i.e. the dependence of high-intensity rainfall on the atmospheric temperature at that particular time and location. The trend analysis of high-resolution rainfall data shows for the first time that the frequency of short, intensive storm events in the temperate lowland regions of Germany has increased by up to 0.5 events per year over recent decades, suggesting that the occurrence of such storms has multiplied within only a few decades. In parallel with these changes in the rainfall regime, increases in annual and seasonal average temperature and changes in the occurrence of circulation patterns responsible for generating high-intensity storms have been found. The analysis of temporally highly resolved rainfall records from the three European regions further indicates that extreme precipitation events are more intense at warmer event temperatures, partly following the Clausius-Clapeyron (CC) relation. From this relation one may derive a general rule for the maximum rainfall intensity associated with a given event temperature, which might be used for scenarios of future maximum rainfall intensities under a warming climate.
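The CC rule amounts to compound growth of rainfall intensity with temperature. A minimal sketch, assuming the widely quoted ~7 %/°C CC rate; the function and example values are illustrative, not fits from this contribution:

```python
def cc_scaled_intensity(i_ref, t_ref, t, rate=0.07):
    """Scale a reference rainfall intensity i_ref (observed at
    temperature t_ref, deg C) to an event temperature t, assuming
    Clausius-Clapeyron-like compound growth of ~7 % per deg C
    (rate=0.07). Sub-hourly extremes are sometimes reported to scale
    at up to roughly twice this rate (rate=0.14)."""
    return i_ref * (1.0 + rate) ** (t - t_ref)

# 20 mm/h observed at 15 deg C, scaled to a 25 deg C event temperature:
print(cc_scaled_intensity(20.0, 15.0, 25.0))  # ~39 mm/h
```

The +10 °C example roughly doubles the intensity, which is the practical content of the "general rule of maximum rainfall intensity" mentioned above.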
Summary of types of radiation belt electron precipitation observed by BARREL
NASA Astrophysics Data System (ADS)
Halford, Alexa
2016-07-01
The Balloon Array for Relativistic Radiation belt Electron Loss (BARREL) was able to infer precipitation of radiation belt electrons on multiple time scales and due to multiple loss mechanisms. One storm, which occurred on 26 January 2013 when a solar wind shock hit the Earth, will be specifically highlighted. Although MeV electrons were observed to be lost due to an EMIC wave event [Zhang et al., in prep.], and multiple periods of electron loss during substorms were observed [Rae et al., submitted to JGR; Mann et al., in prep.], we will consider an event period, 1000-1200 UT on 26 January 2013, during which loss associated with multiple time scales, and thus possibly different loss mechanisms, was observed. At about 1005 UT an injection of radiation belt electrons was observed, followed by drift echoes at energies of ~80-400 keV. BARREL observed X-rays with energies below 180 keV associated with multiple temporal structures during the drift echo event period. The Van Allen Probes were at similar L-values but upwards of 2 hours away in MLT. Upper-band chorus and ULF waves were observed during the event period. Throughout the beginning of the event period, microbursts were clearly observed. During this time, lower-band chorus waves as well as time domain structures were observed at Van Allen Probe A, located upwards of 2 hours away in MLT. This large difference in MLT meant that neither potential loss mechanism could be clearly associated with the microbursts. As the lower-band chorus and time domain structures receded, the microbursts also subsided. ULF time scale modulation of the X-rays was also observed throughout most of the event period. We will examine whether the ULF waves cause the precipitation themselves or modulate the loss of particles from a secondary loss mechanism [Brito et al., 2015, JGR; Rae et al., submitted to JGR].
Although the ~100-ms and ULF time scales are clearly observed, there is an ~20 minute overarching structure in the X-rays at BARREL. This longer time scale appears to match the drift period of the ~300 keV electrons observed by the Van Allen Probes. However, the inferred energy of the precipitating electrons is ~150 keV. It is unclear what may be causing the ~20 minute structure in the X-rays, and at the time of writing it is unclear whether the drifting 300 keV electrons are related to the precipitation of the lower-energy electrons (< 180 keV) or whether it is coincidence that they share the same temporal structure.
National Earthquake Information Center Seismic Event Detections on Multiple Scales
NASA Astrophysics Data System (ADS)
Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.
2017-12-01
The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges for automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local scale), a large aftershock sequence (regional scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker, a robust, broadband picker for real-time seismic monitoring and earthquake early warning. Seismological Research Letters, 83, 531-540, doi: 10.1785/gssrl.83.3.531.
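The kurtosis picker cited above can be sketched as a sliding-window characteristic function: kurtosis jumps when an impulsive arrival enters the window, and a pick is placed at the steepest rise. This is a simplified illustration of the idea, not the Baillard et al. implementation; the window length and synthetic trace are arbitrary:

```python
import numpy as np

def kurtosis_cf(x, win):
    """Trailing-window excess-kurtosis characteristic function: the
    heavy-tailed samples of an impulsive seismic arrival drive the
    windowed kurtosis up sharply at onset."""
    cf = np.zeros(len(x))
    for i in range(win, len(x)):
        w = x[i - win:i]
        s = w.std()
        cf[i] = ((w - w.mean()) ** 4).mean() / s ** 4 - 3.0 if s > 0 else 0.0
    return cf

# Gaussian noise followed by an impulsive decaying 'arrival' at sample 500:
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 1000)
x[500:520] += 20.0 * np.exp(-0.3 * np.arange(20))
pick = int(np.argmax(np.diff(kurtosis_cf(x, 100))))
print(pick)  # near sample 500
```

Production pickers add band filtering, multiple window lengths, and pick refinement; the core detection statistic is the kurtosis jump shown here.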
Kelvin-Helmholtz instability: the ``atom'' of geophysical turbulence?
NASA Astrophysics Data System (ADS)
Smyth, William
2017-11-01
Observations of small-scale turbulence in Earth's atmosphere and oceans have most commonly been interpreted in terms of the Kolmogorov theory of isotropic turbulence, despite the fact that the observed turbulence is significantly anisotropic due to density stratification and sheared large-scale flows. I will describe an alternative picture in which turbulence consists of distinct events that occur sporadically in space and time. The simplest model for an individual event is the ``Kelvin-Helmholtz (KH) ansatz'', in which turbulence relieves the dynamic instability of a localized shear layer. I will summarize evidence that the KH ansatz is a valid description of observed turbulence events, using microstructure measurements from the equatorial Pacific ocean as an example. While the KH ansatz has been under study for many decades and is reasonably well understood, the bigger picture is much less clear. How are the KH events distributed in space and time? How do different events interact with each other? I will describe some tentative steps toward a more thorough understanding.
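The linear-stability condition underlying the KH ansatz can be stated in one line: a stratified shear layer can go dynamically unstable only where the gradient Richardson number falls below 1/4 (the classical Miles-Howard criterion). A minimal sketch with illustrative oceanic numbers:

```python
def richardson(N2, shear):
    """Gradient Richardson number Ri = N^2 / (dU/dz)^2 for buoyancy
    frequency squared N2 (s^-2) and vertical shear dU/dz (s^-1).
    Linear theory permits Kelvin-Helmholtz instability only where
    Ri < 1/4 (Miles-Howard criterion)."""
    return N2 / shear ** 2

# A sheared, stratified layer: N^2 = 1e-4 s^-2, shear = 0.03 s^-1
ri = richardson(1e-4, 0.03)
print(ri, ri < 0.25)  # Ri ~0.11: KH billows can grow
```

Microstructure surveys such as the equatorial Pacific measurements mentioned above commonly use Ri (or its reduced-shear equivalent) to flag where KH-type turbulence events are expected.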
A Neural Network based Early Earthquake Warning model in the California region
NASA Astrophysics Data System (ADS)
Xiao, H.; MacAyeal, D. R.
2016-12-01
Early earthquake warning systems could reduce loss of life and other economic impacts resulting from natural disasters or man-made calamities. Current systems could be further enhanced by neural network methods. A 3-layer neural network model combined with an onsite method was deployed in this work to improve the recognition and detection times for large-scale earthquakes. The model adopted a vector feature design for sample events occurring within a 150 km radius of the epicentres. The dataset contained both destructive events and small-scale events, all extracted from the IRIS database to properly train the model. In the training process, the backpropagation algorithm was used to adjust the weight and bias matrices during each iteration. The information in all three channels of the seismometers served as the input to this model. In designed tests, the model identified approximately 90 percent of the events' scales correctly, and the early detection could provide informative evidence for public authorities to make further decisions. This indicates that a neural network model has the potential to strengthen current early warning systems, since the onsite method may greatly reduce response times and save more lives in such disasters.
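A 3-layer network trained by backpropagation, as described, can be sketched on a toy problem. Everything below is illustrative: the study's seismic feature vectors, layer sizes, and training data are not reproduced, so XOR stands in as the classification task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-layer (input-hidden-output) sigmoid network trained by
# full-batch backpropagation on XOR; a sketch of the technique only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sig(X @ W1 + b1)                     # forward pass
    out = sig(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)      # output-layer delta (MSE loss)
    d_h = (d_out @ W2.T) * h * (1 - h)       # hidden-layer delta
    W2 -= h.T @ d_out; b2 -= d_out.sum(axis=0)   # learning rate 1.0
    W1 -= X.T @ d_h;   b1 -= d_h.sum(axis=0)

print(np.round(out.ravel()))  # predictions for the four XOR inputs
```

The same delta-rule updates generalize to the three-channel seismic feature vectors described in the abstract; only the input dimension and output encoding change.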
Scale invariance of temporal order discrimination using complex, naturalistic events
Kwok, Sze Chai; Macaluso, Emiliano
2015-01-01
Recent demonstrations of scale invariance in cognitive domains prompted us to investigate whether a scale-free pattern might exist in retrieving the temporal order of events from episodic memory. We present four experiments using an encoding-retrieval paradigm with naturalistic stimuli (movies or video clips). Our studies show that temporal order judgement retrieval times were negatively correlated with the temporal separation between two events in the movie. This relation held, irrespective of whether temporal distances were on the order of tens of minutes (Exp 1−2) or just a few seconds (Exp 3−4). Using the SIMPLE model, we factored in the retention delays between encoding and retrieval (delays of 24 h, 15 min, 1.5–2.5 s, and 0.5 s for Exp 1–4, respectively) and computed a temporal similarity score for each trial. We found a positive relation between similarity and retrieval times; that is, the more temporally similar two events, the slower the retrieval of their temporal order. Using Bayesian analysis, we confirmed the equivalence of the RT/similarity relation across all experiments, which included a vast range of temporal distances and retention delays. These results provide evidence for scale invariance during the retrieval of temporal order of episodic memories. PMID:25909581
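The SIMPLE-style similarity score used above can be sketched in a few lines: events are placed on a logarithmic time axis by their elapsed time at retrieval, and similarity decays with log-distance. This is a simplified reading of the model; the decay parameter c and the example numbers are illustrative:

```python
import math

def simple_similarity(t1, t2, c=1.0):
    """Temporal similarity in the spirit of the SIMPLE model:
        eta = exp(-c * |ln t1 - ln t2|)
    for two events whose elapsed times at retrieval are t1 and t2
    (same units). Higher eta (closer to 1) predicts slower, more
    error-prone temporal order judgements."""
    return math.exp(-c * abs(math.log(t1) - math.log(t2)))

# Two events 40 min apart in the movie, judged after a 24 h retention
# delay versus a 15 min delay (elapsed times in minutes):
print(simple_similarity(24 * 60, 24 * 60 + 40))  # near 1: hard to order
print(simple_similarity(15, 15 + 40))            # smaller: easier
```

The log-ratio form is what makes the account scale-invariant: only the ratio of elapsed times matters, not their absolute magnitudes, matching the equivalence across experiments reported in the abstract.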
Alternating event processes during lifetimes: population dynamics and statistical inference.
Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng
2018-01-01
In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating the exacerbation process over the lifetime. By understanding the population dynamics and within-process structure, the paper provides a new and general way to study alternating event processes.
MMS Multipoint Electric Field Observations of Small-Scale Magnetic Holes
NASA Technical Reports Server (NTRS)
Goodrich, Katherine A.; Ergun, Robert E.; Wilder, Frederick; Burch, James; Torbert, Roy; Khotyaintsev, Yuri; Lindqvist, Per-Arne; Russell, Christopher; Strangeway, Robert; Magnus, Werner
2016-01-01
Small-scale magnetic holes (MHs), local depletions in magnetic field strength, have been observed multiple times in the Earth's magnetosphere in the bursty bulk flow (BBF) braking region. This particular subset of MHs has observed scale sizes perpendicular to the background magnetic field (B) less than the ambient ion Larmor radius (ρi). Previous observations by Time History of Events and Macroscale Interactions during Substorms (THEMIS) indicate that this subset of MHs can be supported by a current driven by the E x B drift of electrons. Ions do not participate in the E x B drift due to the small-scale size of the electric field. While in the BBF braking region, during its commissioning phase, the Magnetospheric Multiscale (MMS) spacecraft observed a small-scale MH. The electric field observations taken during this event suggest the presence of electron currents perpendicular to the magnetic field. These observations also suggest that these currents can evolve to smaller spatial scales.
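The electron current inferred here rests on the E x B drift. A minimal sketch of the drift velocity in SI units; the field values below are arbitrary round numbers, not MMS measurements:

```python
import numpy as np

def exb_drift(E, B):
    """E x B drift velocity, v = (E x B) / |B|^2, in SI units
    (E in V/m, B in T, v in m/s). Guiding-center drift shared by all
    species when the field varies slowly on their gyro-scale; here,
    sub-ion-scale E means only electrons drift, carrying a current."""
    E = np.asarray(E, dtype=float)
    B = np.asarray(B, dtype=float)
    return np.cross(E, B) / np.dot(B, B)

# Illustrative numbers: E = 1 mV/m, B = 10 nT gives a 100 km/s drift.
print(exb_drift([0.0, 1e-3, 0.0], [0.0, 0.0, 1e-8]))
```

Because the magnetic-hole electric field varies on scales below ρi, ions average it out over a gyro-orbit while electrons follow this drift, which is the origin of the perpendicular current discussed above.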
Appraisals of negative events by preadolescent children of divorce.
Sheets, V; Sandler, I; West, S G
1996-10-01
This study investigated the appraisals of the significance of negative events made by 256 preadolescent children of divorce. Appraisals were assessed by a 24-item self-report scale. Confirmatory factor analysis of this scale found support for a 3-dimensional model: negative self-appraisal, negative other-appraisal, and material loss. Differentiation between the dimensions of appraisal increased with age in both cross-sectional and over-time data. Evidence for convergent and discriminant validity of the self-report measure of appraisals was found with scores derived from children's open-ended descriptions of their appraisals. Cross-sectional structural equation models found significant paths between negative appraisal and psychological symptoms, over and above the direct effects of the traditional life event measure of stress. Structural equation modeling of longitudinal (5.5 months) data found a significant path from Time 1 appraisal to Time 2 anxiety for the older children.
Scale-invariant structure of energy fluctuations in real earthquakes
NASA Astrophysics Data System (ADS)
Wang, Ping; Chang, Zhe; Wang, Huanyu; Lu, Hong
2017-11-01
Earthquakes are obviously complex phenomena associated with complicated spatiotemporal correlations, and they are generally characterized by two power laws: the Gutenberg-Richter (GR) and the Omori-Utsu laws. However, an important challenge has been to explain two apparently contrasting features: the GR and Omori-Utsu laws are scale-invariant and unaffected by energy or time scales, whereas earthquakes occasionally exhibit a characteristic energy or time scale, such as with asperity events. In this paper, three high-quality datasets on earthquakes were used to calculate the earthquake energy fluctuations at various spatiotemporal scales, and the results reveal the correlations between seismic events regardless of their critical or characteristic features. The probability density functions (PDFs) of the fluctuations exhibit evidence of another scaling that behaves as a q-Gaussian rather than random process. The scaling behaviors are observed for scales spanning three orders of magnitude. Considering the spatial heterogeneities in a real earthquake fault, we propose an inhomogeneous Olami-Feder-Christensen (OFC) model to describe the statistical properties of real earthquakes. The numerical simulations show that the inhomogeneous OFC model shares the same statistical properties with real earthquakes.
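The q-Gaussian form found for the fluctuation PDFs can be written down directly via the q-exponential. A minimal sketch; the parameter values are illustrative, not the fitted values from the datasets:

```python
import math

def q_gaussian(x, q=1.5, beta=1.0):
    """Unnormalised q-Gaussian, e_q(-beta * x^2), with the
    q-exponential e_q(u) = [1 + (1 - q) * u]^(1 / (1 - q)).
    Recovers the ordinary Gaussian exp(-beta * x^2) as q -> 1;
    q > 1 gives the heavy power-law tails that distinguish the
    observed fluctuation PDFs from a random (Gaussian) process."""
    u = -beta * x * x
    base = 1.0 + (1.0 - q) * u
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# Much heavier tail than a Gaussian at |x| = 3:
print(q_gaussian(3.0, q=1.5), math.exp(-9.0))
```

For q = 1.5 the tail decays as x^-4 rather than exponentially, which is why rare large fluctuations remain visible across the three orders of magnitude of scales reported above.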
Cyclone-induced rapid creation of extreme Antarctic sea ice conditions
Wang, Zhaomin; Turner, John; Sun, Bo; Li, Bingrui; Liu, Chengyan
2014-01-01
Two polar vessels, Akademik Shokalskiy and Xuelong, were trapped by thick sea ice in the Antarctic coastal region just to the west of 144°E and between 66.5°S and 67°S in late December 2013. This event demonstrated the rapid establishment of extreme Antarctic sea ice conditions on synoptic time scales. The event was associated with cyclones that developed at lower latitudes. Near the event site, cyclone-enhanced strong southeasterly katabatic winds drove large westward drifts of ice floes. In addition, the cyclones also gave southward ice drift. The arrival and grounding of Iceberg B9B in Commonwealth Bay in March 2011 led to the growth of fast ice around it, forming a northward protruding barrier. This barrier blocked the westward ice drift and hence aided sea ice consolidation on its eastern side. Similar cyclone-induced events have occurred at this site in the past after the grounding of Iceberg B9B. Future events may be predictable on synoptic time scales, if cyclone-induced strong wind events can be predicted. PMID:24937550
NASA Astrophysics Data System (ADS)
Pedro, J. B.; Martin, T.; Steig, E. J.; Jochum, M.; Park, W.; Rasmussen, S.
2015-12-01
Antarctic Isotope Maxima (AIM) are centennial-to-millennial scale warming events observed in Antarctic ice core records from the last glacial period and deglaciation. Mounting evidence links AIM events to parallel variations in atmospheric CO2, Southern Ocean (SO) sea surface temperatures and Antarctic Bottom Water production. According to the prevailing view, AIM events are forced from the North Atlantic by melt-water discharge from ice sheets suppressing the production of North Atlantic Deep Water and associated northward heat transport in the Atlantic. However, observations and model studies increasingly suggest that melt-water fluxes have the wrong timing to be invoked as such a trigger. Here, drawing on results from the Kiel Climate Model, we present an alternative hypothesis in which AIM events are forced via internal oscillations in SO deep-convection. The quasi-periodic timescale of deep-convection events is set by heat (buoyancy) accumulation at SO intermediate depths and stochastic variability in sea ice conditions and freshening at the surface. Massive heat release from the SO convective zone drives Antarctic and large-scale southern hemisphere warming via a two-stage process involving changes in the location of Southern Ocean fronts, in the strength and intensity of the Westerlies and in meridional ocean and atmospheric heat flux anomalies. The potential for AIM events to be driven by internal Southern Ocean processes and the identification of time-lags internal to the southern high latitudes challenges conventional views on the North Atlantic as the pacemaker of millennial-scale climate variability.
Are extreme events (statistically) special? (Invited)
NASA Astrophysics Data System (ADS)
Main, I. G.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A. F.; McCloskey, J.
2009-12-01
We address the generic problem of testing for scale-invariance in extreme events, i.e. are the biggest events in a population simply a scaled model of those of smaller size, or are they in some way different? Are large earthquakes for example ‘characteristic’, do they ‘know’ how big they will be before the event nucleates, or is the size of the event determined only in the avalanche-like process of rupture? In either case what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in Earth and Environmental sciences. Using frequency data however introduces a number of problems in data analysis. The inevitably small number of data points for extreme events and more generally the non-Gaussian statistical properties strongly affect the validity of prior assumptions about the nature of uncertainties in the data. The simple use of traditional least squares (still common in the literature) introduces an inherent bias to the best fit result. We show first that the sampled frequency in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converge to a central limit only very slowly due to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes. In this sense the confidence limits are scale-invariant. A systematic sample bias effect due to counting whole numbers in a finite catalogue makes a ‘characteristic’-looking extreme event distribution a likely outcome of an underlying scale-invariant probability distribution.
This highlights the tendency of ‘eyeball’ fits to unconsciously (but wrongly in this case) assume Gaussian errors. We develop methods to correct for these effects, and show that the current best fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.
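The maximum-likelihood alternative to biased least-squares fitting of the Gutenberg-Richter law has a closed form, the Aki estimator. A minimal sketch on a synthetic catalogue (the abstract's global analysis is more involved; the numbers here are purely illustrative):

```python
import math
import random

def aki_b_value(mags, m_min):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value for magnitudes at or above the completeness magnitude m_min:
        b = log10(e) / (mean(M) - m_min).
    Avoids the bias that least-squares fits to binned frequency counts
    introduce, as discussed in the abstract."""
    above = [m for m in mags if m >= m_min]
    return math.log10(math.e) / (sum(above) / len(above) - m_min)

# Synthetic Gutenberg-Richter catalogue with true b = 1: magnitudes
# above m_min are exponentially distributed with rate b * ln(10).
random.seed(1)
mags = [4.0 + random.expovariate(math.log(10)) for _ in range(20000)]
print(aki_b_value(mags, 4.0))  # close to the true b = 1
```

Because the estimator uses the mean magnitude rather than binned counts, its uncertainty scales simply as b/sqrt(N), consistent with the scale-invariant confidence limits described above.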
Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah
Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
What is This Thing Called Tremor?
NASA Astrophysics Data System (ADS)
Rubin, A. M.; Bostock, M. G.
2017-12-01
Tremor has many enigmatic attributes. The LFEs that comprise it have a dearth of large events, implying a characteristic scale. Bostock et al. (2015) found LFE duration beneath Vancouver Island to be nearly independent of magnitude. That duration (~0.4 s), multiplied by a shear wave speed, defines a length scale far larger than the spatial separation between consecutive but non-colocated detections. If one LFE ruptures multiple brittle patches in a ductile matrix its propagation speed can be slowed to the extent that consecutive events don't overlap, but then why aren't there larger and smaller LFEs with larger and smaller durations? Perhaps there are. Tremor seismograms from Vancouver Island are often saturated with direct arrivals, by which we mean time lags between events shorter than typical event durations. Direct evidence of this, given the small coda amplitude of LFE stacks, is that seismograms at stations many kilometers apart often track each other wiggle for wiggle. We see this behavior over the full range of tremor amplitudes, from close to the noise level on a tremor-free day to 10 times larger. If the LFE magnitude-frequency relation is time-independent, this factor of 10 implies that the LFE occurrence rate during loud tremor is 10^2=100 times that during quiet tremor (>250 LFEs per second). We investigate the implications of this by comparing observed seismograms to synthetics made from the superposition of "LFEs" that are Poissonian in time over a range of average rates. We find that provided the LFEs have a characteristic scale (whether exponential or power law), saturation completely obscures the moment-duration scaling of the contributing events; that is, the moment-duration scaling of LFEs may be identical to that of regular earthquakes. Nonetheless, there are subtle differences between our synthetics and real seismograms, remarkably independent of tremor amplitude, that remain to be explained.
Foremost among these is a slightly greater affinity of tremor for the positive than the negative LFE template. In this respect tremor appears most similar to "slightly saturated" synthetics, implying a time-dependent moment-frequency distribution (larger LFEs when tremor is loud). One possibility is that tremor consists of aborted earthquakes quenched by reflections from the base of the high Vp/Vs layer.
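The Poissonian-superposition synthetics described above can be sketched in a few lines. The one-cycle sine wavelet below is an arbitrary stand-in for an LFE waveform, and the rates are chosen only to illustrate the amplitude argument:

```python
import numpy as np

rng = np.random.default_rng(2)

def synthetic_tremor(rate, duration=60.0, dt=0.01, lfe_dur=0.4):
    """Superpose 'LFEs' arriving as a Poisson process in time, each an
    identical zero-mean wavelet of ~0.4 s duration, to mimic a tremor
    seismogram at the given average rate (events per second)."""
    n = int(duration / dt)
    trace = np.zeros(n)
    onsets = rng.uniform(0.0, duration, rng.poisson(rate * duration))
    wav_n = int(lfe_dur / dt)
    wavelet = np.sin(2.0 * np.pi * np.arange(wav_n) / wav_n)
    for t0 in onsets:
        i = int(t0 / dt)
        j = min(i + wav_n, n)
        trace[i:j] += wavelet[: j - i]
    return trace

quiet, loud = synthetic_tremor(2.5), synthetic_tremor(250.0)
# For incoherent Poisson superposition, RMS amplitude grows as the
# square root of the rate, so a factor-10 amplitude increase implies
# a factor-100 rate increase, as argued in the abstract.
print(np.std(loud) / np.std(quiet))  # roughly 10
```

The square-root scaling (Campbell's theorem for shot noise) is exactly the step linking the observed factor-10 amplitude range to a factor-100 range in LFE occurrence rate.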
On the nonlinearity of spatial scales in extreme weather attribution statements
NASA Astrophysics Data System (ADS)
Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos
2018-04-01
In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
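The "change in probability of occurrence" that characterises these attribution statements is conventionally summarised by two standard quantities. A minimal sketch with illustrative numbers, not values from the four-model analysis:

```python
def attribution_stats(p_factual, p_counterfactual):
    """Standard summary statistics for probabilistic event attribution:
    the probability ratio PR = p1/p0 and the fraction of attributable
    risk FAR = 1 - p0/p1, where p1 is the event-class probability in
    the factual (with anthropogenic emissions) climate and p0 in the
    counterfactual climate."""
    pr = p_factual / p_counterfactual
    far = 1.0 - p_counterfactual / p_factual
    return pr, far

# An event class four times as likely with anthropogenic forcing:
print(attribution_stats(0.02, 0.005))  # PR ~4, FAR ~0.75
```

The nonlinearity reported above means PR itself varies with the spatial scale of the event definition, which is why scaling PR across nearby event definitions needs the sensitivity analysis the abstract describes.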
On the nonlinearity of spatial scales in extreme weather attribution statements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah
In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporalmore » scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.« less
2017-06-17
NASA Astrophysics Data System (ADS)
Lenderink, Geert; Barbero, Renaud; Loriaux, Jessica; Fowler, Hayley
2017-04-01
Present-day precipitation-temperature scaling relations indicate that hourly precipitation extremes may have a response to warming exceeding the Clausius-Clapeyron (CC) relation; for The Netherlands the dependency on surface dew point temperature follows two times the CC relation corresponding to 14 % per degree. Our hypothesis - as supported by a simple physical argument presented here - is that this 2CC behaviour arises from the physics of convective clouds. So, we think that this response is due to local feedbacks related to the convective activity, while other large scale atmospheric forcing conditions remain similar except for the higher temperature (approximately uniform warming with height) and absolute humidity (corresponding to the assumption of unchanged relative humidity). To test this hypothesis, we analysed the large-scale atmospheric conditions accompanying summertime afternoon precipitation events using surface observations combined with a regional re-analysis for the data in The Netherlands. Events are precipitation measurements clustered in time and space derived from approximately 30 automatic weather stations. The hourly peak intensities of these events again reveal a 2CC scaling with the surface dew point temperature. The temperature excess of moist updrafts initialized at the surface and the maximum cloud depth are clear functions of surface dew point temperature, confirming the key role of surface humidity on convective activity. Almost no differences in relative humidity and the dry temperature lapse rate were found across the dew point temperature range, supporting our theory that 2CC scaling is mainly due to the response of convection to increases in near surface humidity, while other atmospheric conditions remain similar. Additionally, hourly precipitation extremes are on average accompanied by substantial large-scale upward motions and therefore large-scale moisture convergence, which appears to accelerate with surface dew point. 
This increase in large-scale moisture convergence appears to be a consequence of latent heat release due to the convective activity, as estimated from the quasi-geostrophic omega equation. Consequently, most hourly extremes occur in precipitation events with considerable spatial extent. Importantly, event size appears to increase rapidly at the highest dew point temperatures, suggesting potentially strong impacts of climatic warming.
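The 14 % per degree figure quoted above is simply twice the ~7 % per degree Clausius-Clapeyron rate. A minimal sketch of that arithmetic (the exponential compounding form and the two rates are stated assumptions, not the study's fitted values):

```python
def cc_scaling_factor(delta_td, rate=0.07):
    """Multiplicative change in precipitation intensity for a dew point
    increase of delta_td (K), assuming (1 + rate)**delta_td scaling:
    rate = 0.07 for Clausius-Clapeyron (CC), 0.14 for the 2CC response."""
    return (1.0 + rate) ** delta_td

# One degree of dew point warming under CC vs. 2CC scaling:
print(cc_scaling_factor(1.0, rate=0.07))  # ~1.07, i.e. +7 % per degree
print(cc_scaling_factor(1.0, rate=0.14))  # ~1.14, i.e. +14 % per degree
```

Over, say, 2 K of dew point warming the 2CC response compounds to roughly +30 %, which is why the super-CC regime matters for hourly extremes.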
Beck, A; Bayeff-Filloff, M; Bischoff, M; Schneider, B M
2002-11-01
The growing number of mass casualty events during the early 1990s led, in January 1996, to the foundation of an honorary group of specially trained emergency physicians dealing primarily with the management of large-scale emergency events and mass casualties. The incidence and quantity of these casualties were analysed in order to be better prepared for such events in the future. All calls prospectively registered by the Augsburg Rescue Co-ordination Centre (ARCC) in the 5 years from July 1997 to June 2002 were analysed, distinguishing between the different types of damage, the number of patients involved, and the time of occurrence (time of day/season). The area served by the ARCC includes the city of Augsburg with its surrounding counties. An estimated 850,000 inhabitants live in this area of 4,100 square kilometers (1,600 square miles). Since 1998, more than 145,000 calls a year have been dealt with, of which 28,000 were covered by emergency physicians. In the 5-year period discussed here, 75 large-scale calls were registered, giving an average incidence of 1.25 calls/month. Most of the calls were fire alarms, followed by car accidents. In total, we were able to serve more than 800 patients. The lowest number per event was two people, during an emergency landing of a sport aircraft; the largest was about 150 patients, during a large open-air event in the city. While there was no difference in the time of day at which events happened, most occurred in November and December. Taking these results into account, the authors, supported by members of the emergency physician team of the German Trauma Society, developed an algorithm describing the optimal procedure for mass casualty events, which is presented here. In mass casualty or large-scale emergency events, an experienced emergency physician is necessary to co-ordinate the rescue brigades on site.
Earth System Stability Through Geologic Time
NASA Astrophysics Data System (ADS)
Rothman, D.; Bowring, S. A.
2015-12-01
Five times in the past 500 million years, mass extinctions have resulted in the loss of greater than three-fourths of living species. Each of these events is associated with significant environmental change recorded in the carbon-isotopic composition of sedimentary rocks. There are also many such environmental events in the geologic record that are not associated with mass extinctions. What makes them different? Two factors appear important: the size of the environmental perturbation, and the time scale over which it occurs. We show that the natural perturbations of Earth's carbon cycle during the past 500 million years exhibit a characteristic rate of change over two orders of magnitude in time scale. This characteristic rate is consistent with the maximum rate that limits quasistatic (i.e., near steady-state) evolution of the carbon cycle. We identify this rate with marginal stability, and show that mass extinctions occur on the fast, unstable side of the stability boundary. These results suggest that the great extinction events of the geologic past, and potentially a "sixth extinction" associated with modern environmental change, are characterized by common mechanisms of instability.
NASA Astrophysics Data System (ADS)
Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles
2017-04-01
The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extents and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them.
The Forecasting component enhances the predictive suite of tools by providing a local-scale view of the extent and depth of possible riverine flood events several days in advance by linking forecast river flow from a hydrological model to a global flood risk map. The Monitoring component provides a similar local-scale view of a flood inundation extent but in near real time, as an event unfolds, by combining the global flood risk map with observed river gauge telemetry. Immediately following an event, the maximum extent of the flood is also generated. Users of Flood Foresight will be able to receive current and forecast flood extents and depth information via API into their own GIS or analytics software. The set of tools is currently operational for the UK and Europe; the methods presented can be applied globally, allowing provision of service to any country or region. This project was supported by InnovateUK under the Solving Business Problems with Environmental Data competition.
NASA Astrophysics Data System (ADS)
Carvalho, S. C. P.; de Lima, M. I. P.; de Lima, J. L. M. P.
2012-04-01
Laser disdrometers can efficiently monitor rainfall characteristics at small temporal scales, providing data on rain intensity, raindrop diameter and fall speed, and raindrop counts over time. This type of data allows for an increased understanding of the rainfall structure at small time scales. Of particular interest for many hydrological applications is the characterization of the properties of extreme events, including the intra-event variability, which are affected by different factors (e.g. geographical location, rainfall generating mechanisms). These properties depend on the microphysical, dynamical and kinetic processes that interact to produce rain. In this study we explore rainfall data obtained during two years with a laser disdrometer installed in the city of Coimbra, in the central region of mainland Portugal. The equipment was developed by Thies Clima. The data temporal resolution is one minute. Descriptive statistics of time series of raindrop diameter (D), fall speed, kinetic energy, and rain rate were studied at the event scale; for the different variables, the average, maximum, minimum, median, variance, standard deviation, quartiles, coefficient of variation, skewness and kurtosis were determined. The empirical raindrop size distribution, N(D), was also calculated. Additionally, the parameterization of rainfall was attempted by investigating the applicability of different theoretical statistical distributions to the empirical data (e.g. exponential, gamma and lognormal distributions). As expected, preliminary results show that rainfall properties and structure vary with rainfall type and weather conditions over the year. Although only two years were investigated, some insight into the structure of different rain events was already obtained.
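Fitting a gamma distribution to drop-size data, as attempted above, can be done simply by the method of moments. A sketch with hypothetical diameters (the `drops_mm` values are invented, not the Coimbra data):

```python
def fit_gamma_moments(diameters):
    """Method-of-moments fit of a gamma distribution to drop diameters:
    shape k = mean^2 / variance, scale theta = variance / mean."""
    n = len(diameters)
    mean = sum(diameters) / n
    var = sum((d - mean) ** 2 for d in diameters) / (n - 1)
    k = mean ** 2 / var        # shape parameter
    theta = var / mean         # scale parameter
    return k, theta

# Hypothetical one-minute sample of drop diameters (mm):
drops_mm = [0.4, 0.6, 0.7, 0.9, 1.1, 1.3, 1.6, 2.0, 2.4, 3.1]
k, theta = fit_gamma_moments(drops_mm)
print(k, theta)  # shape and scale; k * theta recovers the sample mean
```

A maximum-likelihood fit would usually be preferred for comparing exponential, gamma and lognormal candidates, but moments make the parameterization idea explicit.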
El Niño-Southern Oscillation simulated in an MRI atmosphere-ocean coupled general circulation model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagai, T.; Tokioka, T.; Endoh, M.
A coupled atmosphere-ocean general circulation model (GCM) was time-integrated for 30 years to study interannual variability in the tropics. The atmospheric component is a global GCM with 5 levels in the vertical and 4° latitude × 5° longitude grids in the horizontal, including standard physical processes (e.g., interactive clouds). The oceanic component is a GCM for the Pacific with 19 levels in the vertical and 1° × 2.5° grids in the horizontal, including seasonally varying solar radiation as forcing. The model succeeded in reproducing interannual variations that resemble the El Niño-Southern Oscillation (ENSO), with realistic seasonal variations in the atmospheric and oceanic fields. The model ENSO cycle has a time scale of approximately 5 years, and the model El Niño (warm) events are locked roughly in phase to the seasonal cycle. The cold events, however, are less evident in comparison with the El Niño events. The time scale of the model ENSO cycle is determined by the propagation time of signals from the central-eastern Pacific to the western Pacific and back to the eastern Pacific. Seasonal timing is also important in the ENSO time scale: wind anomalies in the central-eastern Pacific occur in summer, and the atmosphere-ocean coupling in the western Pacific operates efficiently in the first half of the year.
Thresholds of catastrophe in the Earth system
Rothman, Daniel H.
2017-01-01
The history of the Earth system is a story of change. Some changes are gradual and benign, but others, especially those associated with catastrophic mass extinction, are relatively abrupt and destructive. What sets one group apart from the other? Here, I hypothesize that perturbations of Earth’s carbon cycle lead to mass extinction if they exceed either a critical rate at long time scales or a critical size at short time scales. By analyzing 31 carbon isotopic events during the past 542 million years, I identify the critical rate with a limit imposed by mass conservation. Identification of the crossover time scale separating fast from slow events then yields the critical size. The modern critical size for the marine carbon cycle is roughly similar to the mass of carbon that human activities will likely have added to the oceans by the year 2100. PMID:28948221
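The two-regime criterion hypothesized above (a critical rate at long time scales, a critical size at short ones, separated by a crossover time scale) can be sketched as follows; the numerical thresholds are purely illustrative, not the values derived in the paper:

```python
def exceeds_threshold(size, duration, critical_rate, critical_size):
    """Two-regime catastrophe criterion (illustrative numbers only):
    fast perturbations are judged by total size, slow ones by rate."""
    crossover = critical_size / critical_rate    # time scale separating regimes
    if duration < crossover:
        return size > critical_size              # short time scale: size matters
    return size / duration > critical_rate       # long time scale: rate matters

# Hypothetical units: size in Pg C, duration in kyr, rate in Pg C / kyr.
print(exceeds_threshold(size=400, duration=5, critical_rate=10, critical_size=300))    # True
print(exceeds_threshold(size=400, duration=100, critical_rate=10, critical_size=300))  # False
```

The same carbon mass is catastrophic when injected quickly but quasistatic when spread over a long interval, which is the asymmetry the abstract describes.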
How Do Novice and Expert Learners Represent, Understand, and Discuss Geologic Time?
NASA Astrophysics Data System (ADS)
Layow, Erica Amanda
This dissertation examined the representations novice and expert learners constructed for the geologic timescale. Learners engaged in a three-part activity. The purpose was to compare novice learners' representations to those of expert learners, providing insight into the similarities and differences between their strategies for event ordering and for assigning values and scale to the geologic timescale model, as well as their language and practices in completing the model. Learner responses, analyzed qualitatively within an expert-novice theoretical framework grounded in phenomenography, comprised the data. These data highlighted learners' metacognitive thoughts that might not otherwise be shared through lectures or laboratory activities. Learners' responses were analyzed using a discourse framework that positioned learners as knowers. Novice and expert learners both excelled at ordering and discussing events before the Phanerozoic, but were challenged by events during the Phanerozoic. Novice learners had difficulty assigning values to events and establishing a scale for their models. Expert learners expressed difficulty with determining a scale because of the size of the model, yet eventually used anchor points and unitized the model to establish a scale. Despite challenges constructing their models, novice learners spoke confidently, using claims and few hedging phrases, indicating their confidence in the statements made. Experts used more hedges than novices; however, their hedging comments concerned more complex conceptions. Using both phenomenographic and discourse analysis approaches foregrounded learners' discussions of how they perceived geologic time and their ways of knowing and doing.
This research is intended to enhance the geoscience community's understanding of the ways novice and expert learners think and discuss conceptions of geologic time, including the events and values of time, and the strategies used to determine accuracy of scale. This knowledge will provide a base from which to support geoscience curriculum development at the university level, specifically to design activities that will not only engage and express learners' metacognitive scientific practices, but to encourage their construction of scientific identities and membership in the geoscience community.
Short Personality and Life Event scale for detection of suicide attempters.
Artieda-Urrutia, Paula; Delgado-Gómez, David; Ruiz-Hernández, Diego; García-Vega, Juan Manuel; Berenguer, Nuria; Oquendo, Maria A; Blasco-Fontecilla, Hilario
2015-01-01
To develop a brief and reliable psychometric scale to identify individuals at risk for suicidal behaviour. Case-control study. 182 individuals (61 suicide attempters, 57 psychiatric controls, and 64 psychiatrically healthy controls) aged 18 or older, admitted to the Emergency Department at Puerta de Hierro University Hospital in Madrid, Spain. All participants completed a form including their socio-demographic and clinical characteristics, and the Personality and Life Events scale (27 items). To assess Axis I diagnoses, all psychiatric patients (including suicide attempters) were administered the Mini International Neuropsychiatric Interview. Descriptive statistics were computed for the socio-demographic factors. Additionally, χ² independence tests were applied to evaluate differences in socio-demographic and clinical variables and the Personality and Life Events scale between groups. A stepwise linear regression with backward variable selection was conducted to build the Short Personality and Life Event (S-PLE) scale. To evaluate its accuracy, a ROC analysis was conducted. Internal reliability was assessed using Cronbach's α, and external reliability was evaluated using a test-retest procedure. The S-PLE scale, composed of just 6 items, showed good performance in discriminating between medical controls, psychiatric controls, and suicide attempters in an independent sample. For instance, the S-PLE scale discriminated past suicide attempters from non-attempters with a sensitivity of 80% and a specificity of 75%. The area under the ROC curve was 88%. A factor analysis extracted only one factor, revealing the unidimensional structure of the S-PLE scale. Furthermore, the S-PLE scale provides internal and external reliability values in the poor (test-retest: 0.55) to acceptable (Cronbach's α: 0.65) range. Administration time is about one minute.
The S-PLE scale is a useful and accurate instrument for estimating the risk of suicidal behaviour in settings where time is scarce. Copyright © 2015 SEP y SEPB. Published by Elsevier España. All rights reserved.
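The sensitivity and specificity figures reported for the S-PLE follow mechanically from a 2×2 confusion table. A sketch with invented labels (not the study's data; 1 = attempter):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening results for 10 subjects:
truth = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(truth, pred)
print(sens, spec)  # 0.8 and 0.8
```

Sweeping the scale's cut-off score and plotting sensitivity against (1 - specificity) at each threshold yields the ROC curve whose area the abstract reports as 88%.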
Van Allen Probes Observations of Radiation Belt Acceleration associated with Solar Wind Shocks
NASA Astrophysics Data System (ADS)
Foster, J. C.; Wygant, J. R.; Baker, D. N.
2017-12-01
During a moderate solar wind shock event on 8 October 2013, the twin Van Allen Probes spacecraft observed the shock-induced electric field in the dayside magnetosphere and the response of the electron populations across a broad range of energies. Whereas other mechanisms populating the radiation belts close to Earth (L ~ 3-5) take place on time scales of months (diffusion) or hours (storm and substorm effects), acceleration during shock events occurs on a much faster (~1 minute) time scale. During this event the dayside equatorial magnetosphere experienced a strong dusk-dawn/azimuthal component of the electric field of ~1 min duration. This shock-induced pulse accelerates radiation belt electrons for the length of time they are exposed to it, creating "quasi-periodic pulse-like" enhancements in the relativistic (2-6 MeV) electron flux. Electron acceleration occurs on a time scale that is a fraction of their orbital drift period around the Earth. Those electrons whose drift velocity closely matches the azimuthal phase velocity of the shock-induced pulse stay in the accelerating wave as it propagates tailward and receive the largest increase in energy. Relativistic electron gradient drift velocities are energy-dependent, selecting a preferred range of energies (3-4 MeV) for the strongest enhancement. The time scale for shock acceleration is short with respect to the electron drift period (~5 min), but long with respect to bounce and gyro periodicities. As a result, the third invariant is broken and the affected electron populations are displaced earthward, experiencing an adiabatic energy gain. At radial distances tailward of the peak in phase space density, the impulsive inward displacement of the electron population produces a decrease in electron flux and a sequence of gradient-drifting "negative holes". Dual spacecraft coverage of the 8 October 2013 event provided a before/after time sequence documenting shock effects.
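For context on the quoted ~5 min drift period: a commonly used rule of thumb (an outside approximation, not stated in the abstract) gives the gradient-curvature drift period of equatorial relativistic electrons as roughly 44 / (L · E[MeV]) minutes:

```python
def drift_period_minutes(L, energy_mev):
    """Rough azimuthal drift period of an equatorial relativistic electron,
    using the common ~44 / (L * E[MeV]) minutes rule of thumb."""
    return 44.0 / (L * energy_mev)

# A 3 MeV electron near L = 4 circles the Earth in a few minutes,
# consistent with the ~5 min drift scale quoted for this event:
print(drift_period_minutes(4, 3))  # ~3.7 minutes
```

Because the drift period shortens with energy, only electrons in a narrow energy band stay phase-matched with the tailward-propagating pulse, which is the selection effect the abstract invokes for the 3-4 MeV enhancement.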
The influence of antecedent conditions on flood risk in sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Bischiniotis, Konstantinos; van den Hurk, Bart; Coughlan de Perez, Erin; Jongman, Brenden; Veldkamp, Ted; Aerts, Jeroen
2017-04-01
Traditionally, flood risk management has focused on long-term flood protection measures. However, many countries are not able to afford hard infrastructure that provides sufficient safety levels, due to the high investment costs. As a consequence, they rely more on post-disaster response and timely warning systems. Most early warning systems have predominantly focused on precipitation as the main predictive factor, usually with lead times of hours or days. However, other variables could also play a role. For instance, anomalous positive water storage, soil saturation and evapotranspiration are physical factors that may influence the length of the flood build-up period. This period can vary from some days to several months before the event, and it is particularly important in flood risk management, since longer flood warning lead times during this period could result in better flood preparation actions. This study addresses how the antecedent conditions of historical reported flood events over the period 1980 to 2010 in sub-Saharan Africa relate to flood generation. The seasonal-scale conditions are reflected in the Standardized Precipitation Evapotranspiration Index (SPEI), which is calculated using monthly precipitation and temperature data and accounts for the wetness/dryness of an area. Antecedent conditions are separated into a) a short-term 'weather-scale' period (0-7 days) and b) a 'seasonal-scale' period (up to 6 months) before the flood event, in such a way that they do not overlap. Total 7-day precipitation, based on daily meteorological data, was used to evaluate the short-term weather-scale conditions. Using a pair of coordinates derived from the NatCatSERVICE database on global flood losses, each flood event is positioned on a 0.5° × 0.5° grid cell. The antecedent SPEI conditions of the two periods, and their joint influence on flood generation, are compared to the same-period conditions of the other years of the dataset.
First results revealed that many floods were preceded by high SPEI for several months before the flooding event, showing that the area was saturated with a long lead-time. Those that were not preceded by high SPEI had very extreme short-term precipitation that caused the flood event. Furthermore, the importance of seasonal-scale conditions is quantified, which in turn might help humanitarian organizations and decision-makers extend the period of the preventive flood risk management planning.
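The real SPEI fits a log-logistic distribution to the monthly climatic water balance; as a rough stand-in, a z-score of precipitation minus potential evapotranspiration conveys the idea (the series below is invented, and the z-score simplification is mine):

```python
def simple_spei(series, current):
    """Crude stand-in for SPEI: z-score of the current climatic water
    balance (P - PET) against its historical distribution. The actual
    SPEI fits a log-logistic distribution instead of assuming normality."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / (n - 1)) ** 0.5
    return (current - mean) / std

# Hypothetical monthly P - PET balances (mm) for one 0.5-degree grid cell:
history = [12, -5, 3, 20, -14, 7, 1, -3, 9, 0]
print(simple_spei(history, 25))  # strongly positive: unusually wet antecedent season
```

A run of months with strongly positive values before a reported flood is the saturated-catchment signature the study looks for.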
3-Dimensional Root Cause Diagnosis via Co-analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Ziming; Lan, Zhiling; Yu, Li
2012-01-01
With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of a failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, the RAS log contains only limited diagnosis information. Moreover, the manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer, the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during a 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.
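The co-analysis idea (linking a failure to earlier events in other log layers by time and location) can be sketched as follows; the field names, sample entries, and the 72-hour window are assumptions for illustration, not the paper's actual algorithm:

```python
from datetime import datetime, timedelta

def find_triggers(failure, events, window_hours=72):
    """Toy log co-analysis: flag events from any log layer that precede
    the failure within a time window at the same location."""
    window = timedelta(hours=window_hours)
    return [e for e in events
            if e["location"] == failure["location"]
            and timedelta(0) <= failure["time"] - e["time"] <= window]

# Hypothetical entries from environment and job logs:
events = [
    {"layer": "environment", "location": "R12-M0", "time": datetime(2011, 5, 1, 2, 0)},
    {"layer": "job",         "location": "R12-M0", "time": datetime(2011, 5, 3, 9, 0)},
    {"layer": "job",         "location": "R30-M1", "time": datetime(2011, 5, 3, 9, 0)},
]
failure = {"layer": "hardware", "location": "R12-M0", "time": datetime(2011, 5, 3, 12, 0)}
for e in find_triggers(failure, events):
    print(e["layer"], e["time"])  # both R12-M0 events fall inside the 72 h window
```

Correlating across layers this way is what lets a diagnosis pinpoint layer, time, and location at once, rather than relying on the RAS log alone.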
Evolution of damage during deformation in porous granular materials (Louis Néel Medal Lecture)
NASA Astrophysics Data System (ADS)
Main, Ian
2014-05-01
'Crackling noise' occurs in a wide variety of systems that respond to external forcing in an intermittent way, leading to sudden bursts of energy release similar to those heard when crunching up a piece of paper or listening to a fire. In mineral magnetism ('Barkhausen') crackling noise occurs due to sudden changes in the size and orientation of microscopic ferromagnetic domains when the external magnetic field is changed. In rock physics sudden changes in internal stress associated with microscopically brittle failure events lead to acoustic emissions that can be recorded on the sample boundary, and used to infer the state of internal damage. Crackling noise is inherently stochastic, but the population of events often exhibits remarkably robust scaling properties, in terms of the source area, duration, energy, and in the waiting time between events. Here I describe how these scaling properties emerge and evolve spontaneously in a fully-dynamic discrete element model of sedimentary rocks subject to uniaxial compression at a constant strain rate. The discrete elements have structural disorder similar to that of a real rock, and this is the only source of heterogeneity. Despite the stationary loading and the lack of any time-dependent weakening processes, the results are all characterized by emergent power law distributions over a broad range of scales, in agreement with experimental observation. As deformation evolves, the scaling exponents change systematically in a way that is similar to the evolution of damage in experiments on real sedimentary rocks. The potential for real-time failure forecasting is examined by using synthetic and real data from laboratory tests and prior to volcanic eruptions. 
The combination of non-linearity and an irreducible stochastic component leads to significant variations in the precision and accuracy of the forecast failure time, leading to a significant proportion of 'false alarms' (forecasts too early) and 'missed events' (forecasts too late), as well as over-optimistic assessments of forecasting power and quality when the failure time is known (the 'benefit of hindsight'). The evolution becomes progressively more complex, and the forecasting power diminishes, in going from ideal synthetics to controlled laboratory tests to open natural systems at larger scales in space and time.
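Power-law exponents for event populations like those described above are typically estimated by maximum likelihood rather than by binning histograms. A sketch using the Hill/Aki estimator on synthetic data (the estimator choice is mine, not the lecture's):

```python
import math
import random

def ml_powerlaw_exponent(energies, e_min):
    """Hill/Aki maximum-likelihood estimate of the exponent b for a
    power-law tail P(E > e) ~ e**(-b) above a threshold e_min."""
    tail = [e for e in energies if e >= e_min]
    return len(tail) / sum(math.log(e / e_min) for e in tail)

# Synthetic event energies drawn from an exact b = 1 power law
# by inverse-transform sampling:
random.seed(42)
sample = [1.0 / (1.0 - random.random()) for _ in range(5000)]
print(ml_powerlaw_exponent(sample, 1.0))  # close to 1.0
```

Tracking how the fitted exponent drifts as damage accumulates is one way the "systematic change in scaling exponents" during deformation can be quantified.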
Recollection-Dependent Memory for Event Duration in Large-Scale Spatial Navigation
ERIC Educational Resources Information Center
Brunec, Iva K.; Ozubko, Jason D.; Barense, Morgan D.; Moscovitch, Morris
2017-01-01
Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, are encoded into memory, and if it matters whether the memory representation is based on recollection or…
Sadhasivam, Senthilkumar; Cohen, Lindsey L; Hosu, Liana; Gorman, Kristin L; Wang, Yu; Nick, Todd G; Jou, Jing Fang; Samol, Nancy; Szabova, Alexandra; Hagerman, Nancy; Hein, Elizabeth; Boat, Anne; Varughese, Anna; Kurth, Charles Dean; Willging, J Paul; Gunter, Joel B
2010-04-01
Behavior in response to distressful events during outpatient pediatric surgery can contribute to postoperative maladaptive behaviors, such as temper tantrums, nightmares, bed-wetting, and attention seeking. Currently available perioperative behavioral assessment tools have limited utility in guiding interventions to ameliorate maladaptive behaviors because they cannot be used in real time, are only intended to be used during 1 phase of the experience (e.g., perioperative), or provide only a static assessment of the child (e.g., level of anxiety). A simple, reliable, real-time tool is needed to appropriately identify children and parents whose behaviors in response to distressful events at any point in the perioperative continuum could benefit from timely behavioral intervention. Our specific aims were to (1) refine the Perioperative Adult Child Behavioral Interaction Scale (PACBIS) to improve its reliability in identifying perioperative behaviors and (2) validate the refined PACBIS against several established instruments. The PACBIS was used to assess the perioperative behaviors of 89 children aged 3 to 12 years presenting for adenotonsillectomy and their parents. Assessments using the PACBIS were made during perioperative events likely to prove distressing to children and/or parents (perioperative measurement of blood pressure, induction of anesthesia, and removal of the IV catheter before discharge). Static measurements of perioperative anxiety and behavioral compliance during anesthetic induction were made using the modified Yale Preoperative Anxiety Scale and the Induction Compliance Checklist (ICC). Each event was videotaped for later scoring using the Child-Adult Medical Procedure Interaction Scale-Short Form (CAMPIS-SF) and Observational Scale of Behavioral Distress (OSBD). Interrater reliability using linear weighted kappa (kappa(w)) and multiple validations using Spearman correlation coefficients were analyzed. 
The PACBIS demonstrated good to excellent interrater reliability, with κw ranging from 0.62 to 0.94. The Child Coping and Child Distress subscores of the PACBIS demonstrated strong concurrent correlations with the modified Yale Preoperative Anxiety Scale, ICC, CAMPIS-SF, and OSBD. The Parent Positive subscore of the PACBIS correlated strongly with the CAMPIS-SF and OSBD, whereas the Parent Negative subscore showed significant correlation with the ICC. The PACBIS has strong construct and predictive validity. The PACBIS is a simple, easy-to-use, real-time instrument to evaluate the perioperative behaviors of both children and parents. It has good to excellent interrater reliability and strong concurrent validity against currently accepted scales. The PACBIS offers a means to identify maladaptive child or parental behaviors in real time, making it possible to intervene to modify such behaviors in a timely fashion.
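Linear weighted kappa, the agreement statistic used for the PACBIS, down-weights disagreements in proportion to their ordinal distance. A self-contained sketch with two hypothetical raters (the scores are invented, not PACBIS data):

```python
def linear_weighted_kappa(r1, r2, categories):
    """Linear weighted kappa for two raters scoring ordinal categories
    0..categories-1: (observed - expected) / (1 - expected) weighted
    agreement, with weight 1 - |i - j| / (categories - 1)."""
    k, n = categories, len(r1)
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    obs = sum(w[a][b] for a, b in zip(r1, r2)) / n         # observed agreement
    p1 = [r1.count(c) / n for c in range(k)]               # marginal of rater 1
    p2 = [r2.count(c) / n for c in range(k)]               # marginal of rater 2
    exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (obs - exp) / (1 - exp)

# Two hypothetical raters scoring 8 interactions on a 3-point scale:
rater_a = [0, 1, 2, 2, 1, 0, 2, 1]
rater_b = [0, 1, 2, 1, 1, 0, 2, 2]
print(round(linear_weighted_kappa(rater_a, rater_b, 3), 2))  # 0.7 on this toy data
```

Values in the 0.6-0.9 range, as reported for the PACBIS, are conventionally read as good to excellent agreement.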
NASA Astrophysics Data System (ADS)
Anquetin, Sandrine; Vannier, Olivier; Ollagnier, Mélody; Braud, Isabelle
2015-04-01
This work contributes to the evaluation of the dynamics of human exposure during flash-flood events in the Mediterranean region. Understanding why and how commuters modify their daily mobility in the Cévennes - Vivarais area (France) is the long-term objective of the study. To reach this objective, the methodology relies on three steps: i) evaluation of daily travel patterns; ii) reconstitution of road-flooding events in the region based on hydrological simulation at the regional scale, in order to capture the time evolution and intensity of floods; and iii) identification of the daily fluctuation of exposure according to road-flooding scenarios and the time evolution of mobility patterns. This work deals with the second step. To do so, the physically based, non-calibrated hydrological model CVN (Vannier, 2013) is implemented to retrieve the hydrological signature of past flash-flood events in southern France. Four past events are analyzed (September 2002; September 2005, split into two different events; October 2008). Since the regional scale is investigated, the studied catchments range from a few km² to a few hundred km², and many of them are ungauged. The evaluation is based on a multi-scale approach using complementary observations from post-flood experiments (for small and/or ungauged catchments) and the operational hydrological network (for larger catchments). The scales of risk (time and location of road flooding) are also compared with observed road-cut data. The discussion aims at improving our understanding of the hydrological processes associated with road-flooding vulnerability. We specifically analyze the runoff coefficient and the ratio between surface and groundwater flows at the regional scale. The results show that, overall, the three regional simulations provide good scores for the probability of detection and false alarms concerning road flooding (1600 points are analyzed for the whole region).
Our evaluation procedure provides new insights into the active hydrological processes at small scales (catchment areas < 10 km²), since these small scales, distributed over the whole region, are analyzed through road-cut data and post-flood field investigations. As shown in Vannier (2013), the weathered (altered) geological layer leaves a significant signature on the simulated discharges: for catchments on schist geology, the simulated discharge, whatever the catchment size, is usually overestimated. Vannier, O., 2013, Apport de la modélisation hydrologique régionale à la compréhension des processus de crue en zone méditerranéenne, PhD thesis (in French), Grenoble University.
Moore, Sara; Wakam, Glenn; Hubbard, Alan E.; Cohen, Mitchell J.
2017-01-01
Introduction: Delayed notification and lack of early information hinder timely hospital-based activations in large-scale multiple-casualty events. We hypothesized that Twitter real-time data would produce a unique and reproducible signal within minutes of multiple-casualty events, and we investigated the timing of the signal compared with other hospital disaster-notification mechanisms. Methods: Using disaster-specific search terms, all relevant tweets from the event to 7 days post-event were analyzed for 5 recent US-based multiple-casualty events (Boston Bombing [BB], SF Plane Crash [SF], Napa Earthquake [NE], Sandy Hook [SH], and Marysville Shooting [MV]). Quantitative and qualitative analyses of tweet utilization were compared across events. Results: Over 3.8 million tweets were analyzed (SH 1.8M, BB 1.1M, SF 430K, MV 250K, NE 205K). Peak tweets per minute ranged from 209 to 3326. The mean number of followers per tweeter ranged from 3382 to 9992 across events. Retweets were tweeted a mean of 82 to 564 times per event. Tweets began very rapidly for all events (<2 min) and reached 1% of the total event-specific tweets within a median of 13 minutes of the first 911 calls. A 200 tweets/min threshold was reached fastest with NE (2 min), BB (7 min), and SF (18 min). Had this threshold been used as a signaling mechanism to place local hospitals on standby for possible large-scale events, the signal would have preceded patient arrival in all case studies. Importantly, this signal would also have preceded traditional disaster-notification mechanisms in SF and NE, and would have been simultaneous with them in BB and MV. Conclusions: Social media is a powerful, predictable, and potentially important resource for optimizing disaster response. Further investigation is warranted to assess the utility of prospective signaling thresholds for hospital-based activation. PMID:28982201
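The 200 tweets/min signaling mechanism described above amounts to a trailing-window rate threshold over a tweet-timestamp stream. A minimal sketch, with a hypothetical tweet stream and the threshold and window taken from the abstract:

```python
from collections import deque

def first_threshold_crossing(tweet_times, threshold=200, window=60.0):
    """Return the time (seconds since the event) at which the number of
    event-related tweets in the trailing window first reaches `threshold`,
    or None if it never does. `tweet_times` must be sorted ascending."""
    recent = deque()
    for t in tweet_times:
        recent.append(t)
        # Drop tweets that have fallen out of the trailing window.
        while recent[0] < t - window:
            recent.popleft()
        if len(recent) >= threshold:
            return t
    return None

# Hypothetical stream: 300 tweets spread evenly over the first 90 seconds.
stream = [i * 0.3 for i in range(300)]
print(round(first_threshold_crossing(stream, threshold=200), 1))  # → 59.7
```

In a real deployment the stream would come from a filtered search on disaster-specific terms, and the crossing time would trigger the hospital standby notification.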
The evolving block universe and the meshing together of times.
Ellis, George F R
2014-10-01
It has been proposed that spacetime should be regarded as an evolving block universe, bounded to the future by the present time, which continually extends to the future. This future boundary is defined at each time by measuring proper time along Ricci eigenlines from the start of the universe. A key point, then, is that physical reality can be represented at many different scales: hence, the passage of time may be seen as different at different scales, with quantum gravity determining the evolution of spacetime itself at the Planck scale, but quantum field theory and classical physics determining the evolution of events within spacetime at larger scales. The fundamental issue then arises as to how the effective times at different scales mesh together, leading to the concepts of global and local times. © 2014 New York Academy of Sciences.
NASA Astrophysics Data System (ADS)
Gavin, D. G.; Colombaroli, D.; Morey, A. E.
2015-12-01
The inclusion of paleo-flood events greatly affects estimates of peak magnitudes (e.g., Q100) in flood-frequency analysis. Likewise, peak events are also associated with certain synoptic climatic patterns that vary on all time scales. Geologic records preserved in lake sediments have the potential to capture non-stationarity in frequency-magnitude relationships, but few such records preserve a continuous history of event magnitudes. We present a 10-meter, 2000-yr record from Upper Squaw Lake, Oregon, that contains finely laminated silt layers reflecting landscape erosion events from the 40 km² watershed. CT scans of the core (<1 mm resolution) and a 14C-dated chronology yielded a pseudo-annual time series of erosion magnitudes. The most recent 80 years of the record correlate strongly with annual peak stream discharge and road construction. We examined the frequency-magnitude relationship for the entire pre-road period and show that the seven largest events fall above a strongly linear relationship, suggesting a distinct process (e.g., severe fires or earthquakes) operating at low frequency to generate large-magnitude events. Expressing the record as cumulative sediment-accumulation anomalies showed the importance of the large events in "returning the system" to the long-term mean rate. Applying frequency-magnitude analysis in a moving window showed that the Q100 and Q10 of watershed erosion varied by 1.7 and 1.0 orders of magnitude, respectively. The variations in watershed erosion are weakly correlated with temperature and precipitation reconstructions at decadal to centennial scales. This suggests that dynamics both internal (i.e., sediment production) and external (i.e., earthquakes) to the system, as well as more stochastic events (i.e., single severe wildfires), can at least partially override external climate forcing of watershed erosion at decadal to centennial time scales.
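The moving-window frequency-magnitude analysis above can be sketched with empirical plotting positions and a log-log fit. This is an illustrative simplification, not the authors' actual method; the function names, window lengths, and synthetic record are assumptions.

```python
import numpy as np

def magnitude_for_return_period(event_mags, return_period, record_years):
    """Estimate the magnitude with a given return period from event
    magnitudes observed over `record_years`, assuming an approximately
    log-linear frequency-magnitude relationship."""
    m = np.sort(np.asarray(event_mags, dtype=float))[::-1]
    # Empirical return period of the r-th largest event in the record.
    T = record_years / np.arange(1, len(m) + 1)
    slope, intercept = np.polyfit(np.log10(T), np.log10(m), 1)
    return 10 ** (intercept + slope * np.log10(return_period))

def moving_window_q100(times, mags, window_years=500, step_years=100):
    """Q100-style estimates in successive windows, exposing non-stationarity."""
    times, mags = np.asarray(times), np.asarray(mags, dtype=float)
    out = []
    for start in range(int(times.min()), int(times.max()) - window_years, step_years):
        sel = (times >= start) & (times < start + window_years)
        if sel.sum() >= 10:  # need enough events for a stable fit
            out.append((start, magnitude_for_return_period(mags[sel], 100, window_years)))
    return out

# Synthetic 50-yr record built so that magnitude = 2 * T exactly, hence Q100 = 200.
mags = [100.0 / r for r in range(1, 51)]
print(round(magnitude_for_return_period(mags, 100, 50), 1))  # → 200.0
```

Events falling well above the fitted line, like the seven largest in the abstract, would show up as systematic positive residuals from this fit.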
NASA Astrophysics Data System (ADS)
Poletti, Maria Laura; Pignone, Flavio; Rebora, Nicola; Silvestro, Francesco
2017-04-01
The exposure of urban areas to flash floods is particularly significant for Mediterranean coastal cities, which are generally densely inhabited. Severe rainfall events, often associated with intense and organized thunderstorms, produced flash floods and landslides during the last century, causing serious damage to urban areas and, in the worst events, human losses. The temporal scale of these events is closely linked to the size of the catchments involved: in the Mediterranean area, a great number of catchments that pass through coastal cities have a small drainage area (less than 100 km²) and a corresponding hydrologic response timescale on the order of a few hours. A suitable nowcasting chain is essential for the timely forecasting of this kind of event, since meteorological forecast systems are unable to predict precipitation at the scale of these events, which is small both spatially (a few km) and temporally (hourly). Nowcasting models, covering the time interval of the following two hours starting from the observation, try to extend the predictability limits of forecasting models in support of real-time flood-alert system operations. This work presents the use of hydrological models coupled with nowcasting techniques. The nowcasting model PhaSt furnishes an ensemble of equiprobable future precipitation scenarios on time horizons of 1-3 h starting from the most recent radar observations. Coupling PhaSt with the hydrological model Continuum allows floods to be forecast a few hours in advance. In this way it is possible to generate different discharge predictions for the following hours, together with associated return-period maps; these maps can be used to support the decisional process in the warning system.
Temporal and spatial scaling impacts on extreme precipitation
NASA Astrophysics Data System (ADS)
Eggert, B.; Berg, P.; Haerter, J. O.; Jacob, D.; Moseley, C.
2015-01-01
Both in the current climate and in the light of climate change, understanding of the causes and risk of precipitation extremes is essential for protection of human life and adequate design of infrastructure. Precipitation extreme events depend qualitatively on the temporal and spatial scales at which they are measured, in part due to the distinct types of rain formation processes that dominate extremes at different scales. To capture these differences, we first filter large datasets of high-resolution radar measurements over Germany (5 min temporally and 1 km spatially) using synoptic cloud observations, to distinguish convective and stratiform rain events. In a second step, for each precipitation type, the observed data are aggregated over a sequence of time intervals and spatial areas. The resulting matrix allows a detailed investigation of the resolutions at which convective or stratiform events are expected to contribute most to the extremes. We analyze where the statistics of the two types differ and discuss at which resolutions transitions occur between dominance of either of the two precipitation types. We characterize the scales at which the convective or stratiform events will dominate the statistics. For both types, we further develop a mapping between pairs of spatially and temporally aggregated statistics. The resulting curve is relevant when deciding on data resolutions where statistical information in space and time is balanced. Our study may hence also serve as a practical guide for modelers, and for planning the space-time layout of measurement campaigns. We also describe a mapping between different pairs of resolutions, possibly relevant when working with mismatched model and observational resolutions, such as in statistical bias correction.
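The aggregation step described above, pooling high-resolution data over a sequence of time intervals and spatial areas before computing extremes, can be sketched as follows. The array shapes, aggregation factors, and gamma-distributed synthetic field are illustrative assumptions, not the study's radar data.

```python
import numpy as np

def aggregate(field, t_fac, s_fac):
    """Mean-aggregate a (time, y, x) precipitation field by integer factors
    along the time axis (t_fac) and both spatial axes (s_fac)."""
    t, y, x = field.shape
    f = field[: t - t % t_fac, : y - y % s_fac, : x - x % s_fac]
    f = f.reshape(t // t_fac, t_fac, y // s_fac, s_fac, x // s_fac, s_fac)
    return f.mean(axis=(1, 3, 5))

def extreme_quantile(field, q=0.999):
    """A simple stand-in for an 'extreme' statistic of the field."""
    return float(np.quantile(field, q))

# Synthetic stand-in for one day of 5-min, 1-km radar data.
rng = np.random.default_rng(0)
radar = rng.gamma(0.1, 2.0, size=(288, 64, 64))
for t_fac, s_fac in [(1, 1), (12, 5), (288, 10)]:
    print(t_fac, s_fac, round(extreme_quantile(aggregate(radar, t_fac, s_fac)), 2))
```

Running each precipitation type (convective vs. stratiform) through such a matrix of (t_fac, s_fac) pairs yields the resolution-dependent extreme statistics that the study compares; extremes shrink as the averaging volume grows.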
Stochastic Generation of Spatiotemporal Rainfall Events for Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Diederen, D.; Liu, Y.; Gouldby, B.; Diermanse, F.
2017-12-01
Current flood risk analyses that consider only the peaks of hydrometeorological forcing variables have limitations in how well they represent reality: simplistic assumptions about antecedent conditions are required, different sources of flooding are often considered in isolation, and the complex temporal and spatial evolution of events is not considered. Mid-latitude storms, governed by large-scale climatic conditions, often exhibit, for example, a high degree of temporal dependency. For sustainable flood risk management that accounts appropriately for climate change, it is desirable for flood risk analyses to reflect reality more closely; analysis of risk mitigation measures and comparison of their relative performance is then likely to be more robust and to lead to improved solutions. We provide a new framework for the provision of boundary conditions to flood risk analyses that more appropriately reflects reality. The boundary conditions capture the temporal dependencies of complex storms whilst preserving the extreme values and associated spatial dependencies. We demonstrate the application of this framework by generating a synthetic rainfall-event time-series boundary condition set from reanalysis rainfall data (CFSR) at the continental scale. We define spatiotemporal clusters of rainfall as events, extract hydrological parameters for each event, generate synthetic parameter sets from a multivariate distribution with a focus on the joint tail probability [Heffernan and Tawn, 2004], and finally create synthetic events from the generated synthetic parameters. We highlight the stochastic integration of (a) spatiotemporal features, e.g. event occurrence intensity over space-time, or time to the previous event, which we use for the spatial placement and sequencing of the synthetic events, and (b) value-specific parameters, e.g. peak intensity and event extent.
We contrast this with more traditional approaches to highlight the significant improvements in representing the reality of extreme flood events.
Event Horizon Telescope observations as probes for quantum structure of astrophysical black holes
NASA Astrophysics Data System (ADS)
Giddings, Steven B.; Psaltis, Dimitrios
2018-04-01
The need for a consistent quantum evolution for black holes has led to proposals that their semiclassical description is modified not just near the singularity, but at horizon or larger scales. If such modifications extend beyond the horizon, they influence regions accessible to distant observation. Natural candidates for these modifications behave like metric fluctuations, with characteristic length scales and timescales set by the horizon radius. We investigate the possibility of using the Event Horizon Telescope to observe these effects, if they have a strength sufficient to make quantum evolution consistent with unitarity, without introducing new scales. We find that such quantum fluctuations can introduce a strong time dependence for the shape and size of the shadow that a black hole casts on its surrounding emission. For the black hole in the center of the Milky Way, detecting the rapid time variability of its shadow will require nonimaging timing techniques. However, for the much larger black hole in the center of the M87 galaxy, a variable black-hole shadow, if present with these parameters, would be readily observable in the individual snapshots that will be obtained by the Event Horizon Telescope.
NASA Astrophysics Data System (ADS)
Pistolesi, Marco; Cioni, Raffaello; Rosi, Mauro; Aguilera, Eduardo
2014-02-01
The ice-capped Cotopaxi volcano is known worldwide for the large-scale, catastrophic lahars that have occurred in connection with historical explosive eruptions. The most recent large-scale lahar event occurred in 1877, when scoria flows partially melted the ice and snow of the summit glacier, generating debris flows that severely impacted all the river valleys originating from the volcano. The 1877 lahars have been considered in recent years as a maximum expected event to define the hazard associated with lahar generation at Cotopaxi. Conversely, recent field-based studies have shown that such debris flows have occurred several times during the last 800 years of activity at Cotopaxi, and that the scale of lahars has been variable, including events much larger than that of 1877. Despite the rapid retreat of the summit ice cap over the past century, there are in fact no data clearly suggesting that future events will be smaller than those observed in the deposits of the last 800 years of activity. In addition, geological field data prove that the lahar triggering mechanism also has to be considered as a key input parameter and, under appropriate eruptive mechanisms, a hazard scenario of a lahar with a volume three times larger than the 1877 event is likely. In order to analyze the impact scenarios in the southern drainage system of the volcano, simulations of inundation areas were performed with a semi-empirical model (LAHARZ), using input parameters including variable water volume. Results indicate that a lahar three times larger than the 1877 event would invade much wider areas than those flooded by the 1877 lahars along the southern valley system, eventually impacting highly urbanized areas such as the city of Latacunga.
Flood events across the North Atlantic region - past development and future perspectives
NASA Astrophysics Data System (ADS)
Matti, Bettina; Dieppois, Bastien; Lawler, Damian; Dahlke, Helen E.; Lyon, Steve W.
2016-04-01
Flood events have a large social and economic impact on humans. An increase in winter and spring flooding across much of northern Europe in recent years has opened up the question of changing underlying hydro-climatic drivers of flood events. Predicting the manifestation of such changes is difficult because of the natural variability and fluctuations in northern hydrological systems caused by large-scale atmospheric circulations, especially under altered climate conditions. Improving knowledge of the complexity of these hydrological systems and their interactions with climate is essential to determine the drivers of flood events and to predict changes in these drivers under altered climate conditions. This is particularly true for the North Atlantic region, where both physical catchment properties and large-scale atmospheric circulations have a profound influence on floods. This study explores changes in streamflow across North Atlantic region catchments. An emphasis is placed on high-flow events, namely the timing and magnitude of past flood events, and selected flood percentiles were tested for stationarity by applying a flood frequency analysis. The issue of non-stationarity of flood return periods is important when linking streamflow to large-scale atmospheric circulations: natural fluctuations in these circulations are found to have a strong influence, causing natural variability in streamflow records. Long time series and a multi-temporal approach allow drivers of floods to be determined and streamflow to be linked to large-scale atmospheric circulations. Exploring changes in selected hydrological signatures, consistency was found across much of the North Atlantic region, suggesting a shift in flow regime. The lack of an overall regional pattern suggests that how catchments respond to changes in climatic drivers is strongly influenced by their physical characteristics.
A better understanding of hydrological response to climate drivers is essential, for example for forecasting purposes.
Generalizing a nonlinear geophysical flood theory to medium-sized river networks
Gupta, Vijay K.; Mantilla, Ricardo; Troutman, Brent M.; Dawdy, David; Krajewski, Witold F.
2010-01-01
The central hypothesis of a nonlinear geophysical flood theory postulates that, given space-time rainfall intensity for a rainfall-runoff event, solutions of coupled mass and momentum conservation differential equations governing runoff generation and transport in a self-similar river network produce spatial scaling, or a power law, relation between peak discharge and drainage area in the limit of large area. The excellent fit of a power law for the destructive flood event of June 2008 in the 32,400-km2 Iowa River basin over four orders of magnitude variation in drainage areas supports the central hypothesis. The challenge of predicting observed scaling exponent and intercept from physical processes is explained. We show scaling in mean annual peak discharges, and briefly discuss that it is physically connected with scaling in multiple rainfall-runoff events. Scaling in peak discharges would hold in a non-stationary climate due to global warming but its slope and intercept would change.
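The power-law (spatial scaling) relation between peak discharge and drainage area described above is typically estimated by a least-squares fit in log-log space. A minimal sketch; the coefficient values and synthetic gauge data below are hypothetical, not the Iowa River basin values.

```python
import numpy as np

def fit_peak_discharge_scaling(areas_km2, peaks_m3s):
    """Fit the power law Q_p = alpha * A**theta by least squares in
    log-log space; returns (alpha, theta)."""
    theta, log_alpha = np.polyfit(np.log10(areas_km2), np.log10(peaks_m3s), 1)
    return 10 ** log_alpha, theta

# Synthetic gauges obeying Q = 3 * A**0.6 over four orders of magnitude in area.
areas = np.logspace(0, 4, 30)
peaks = 3.0 * areas ** 0.6
alpha, theta = fit_peak_discharge_scaling(areas, peaks)
print(round(alpha, 2), round(theta, 2))  # → 3.0 0.6
```

The challenge the abstract points to is predicting `theta` (the scaling exponent) and `alpha` (the intercept) from physical runoff-generation and transport processes rather than merely fitting them to observations.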
NASA Astrophysics Data System (ADS)
Ficchì, Andrea; Perrin, Charles; Andréassian, Vazken
2016-07-01
Hydro-climatic data at short time steps are considered essential to model the rainfall-runoff relationship, especially for short-duration hydrological events, typically flash floods. Also, using fine time step information may be beneficial when using or analysing model outputs at larger aggregated time scales. However, the actual gain in prediction efficiency using short time-step data is not well understood or quantified. In this paper, we investigate the extent to which the performance of hydrological modelling is improved by short time-step data, using a large set of 240 French catchments, for which 2400 flood events were selected. Six-minute rain gauge data were available and the GR4 rainfall-runoff model was run with precipitation inputs at eight different time steps ranging from 6 min to 1 day. Then model outputs were aggregated at seven different reference time scales ranging from sub-hourly to daily for a comparative evaluation of simulations at different target time steps. Three classes of model performance behaviour were found for the 240 test catchments: (i) significant improvement of performance with shorter time steps; (ii) performance insensitivity to the modelling time step; (iii) performance degradation as the time step becomes shorter. The differences between these groups were analysed based on a number of catchment and event characteristics. A statistical test highlighted the most influential explanatory variables for model performance evolution at different time steps, including flow auto-correlation, flood and storm duration, flood hydrograph peakedness, rainfall-runoff lag time and precipitation temporal variability.
Large-scale weather dynamics during the 2015 haze event in Singapore
NASA Astrophysics Data System (ADS)
Djamil, Yudha; Lee, Wen-Chien; Tien Dat, Pham; Kuwata, Mikinori
2017-04-01
The 2015 haze event in South East Asia is widely considered the period of the worst air quality in the region in more than a decade. The source of the haze was forest and peatland fire on the islands of Sumatra and Kalimantan, Indonesia. The fires mostly came from the practice of forest clearance known as slash and burn, in which land is cleared for conversion to palm oil plantations. Although such clearance occurs seasonally, in 2015 it was made worse by the impact of a strong El Niño: the long period of drier atmosphere over the region made fires easier to ignite and spread, and more difficult to stop. The biomass-burning emissions from the forest and peatland fires caused a large-scale haze pollution problem on both islands that spread further into neighboring countries such as Singapore and Malaysia. In Singapore, for about two months (September-October 2015), air quality was at an unhealthy level. This caused socioeconomic losses such as school closures, cancellation of outdoor events, and health issues, with total losses estimated at S$700 million. The unhealthy level of Singapore's air quality is defined by an elevated Pollutant Standards Index (PSI > 120) due to the haze arrival; the index even reached a hazardous level (PSI = 300) for several days. The PSI is a metric of air quality in Singapore that aggregates six pollutants (SO2, PM10, PM2.5, NO2, CO and O3). In this study, we focused on PSI variability at weekly to biweekly time scales (periodicity < 30 days), since these are the least understood compared with the diurnal and seasonal scales. We identified three dominant time scales of PSI (~5, 10 and 20 days) using the wavelet method and investigated their large-scale atmospheric structures. The PSI-associated large-scale column-moisture horizontal structures over the Indo-Pacific basin are dominated by easterly propagating gyres at the synoptic (macro) scale for the ~5 day (~10 and 20 day) time scales.
The propagating gyres manifest as a cyclical column-moisture flux trajectory around the Singapore region, and some of their phases are identified as responsible for transporting the haze from its source to Singapore. The haze source was identified by compositing the number of hotspots in grid space based on the three PSI time scales. Further discussion of equatorial waves during the haze event will also be presented.
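The abstract identifies the dominant PSI time scales with a wavelet analysis; as a simpler stand-in for that idea, the sketch below recovers dominant periodicities from a plain FFT periodogram of a synthetic daily index. All names, parameters, and the synthetic series are illustrative assumptions, not the study's method or data.

```python
import numpy as np

def dominant_periods(series, dt_days=1.0, n_peaks=3, max_period=30.0):
    """Return the n strongest periodicities (in days) shorter than
    max_period, from an FFT periodogram of a daily index series."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                      # remove the mean before transforming
    power = np.abs(np.fft.rfft(x)) ** 2   # periodogram
    freqs = np.fft.rfftfreq(len(x), d=dt_days)
    valid = freqs > 1.0 / max_period      # keep sub-30-day periodicities only
    idx = np.argsort(power[valid])[::-1][:n_peaks]
    periods = [float(round(1.0 / f, 1)) for f in freqs[valid][idx]]
    return sorted(periods, reverse=True)

# Synthetic daily "PSI-like" index with 5-, 10- and 20-day components.
t = np.arange(360.0)
psi = np.sin(2*np.pi*t/5) + np.sin(2*np.pi*t/10) + np.sin(2*np.pi*t/20)
print(dominant_periods(psi))  # → [20.0, 10.0, 5.0]
```

A wavelet transform would additionally localize these periodicities in time, showing when each band was active during the September-October haze episode.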
Semi-supervised tracking of extreme weather events in global spatio-temporal climate datasets
NASA Astrophysics Data System (ADS)
Kim, S. K.; Prabhat, M.; Williams, D. N.
2017-12-01
Deep neural networks have been successfully applied to the problem of detecting extreme weather events in large-scale climate datasets, attaining performance that overshadows all previous hand-crafted methods. Recent work has shown that a multichannel spatiotemporal encoder-decoder CNN architecture is able to localize events with semi-supervised bounding boxes. Motivated by this work, we propose a new learning method based on Variational Auto-Encoders (VAE) and Long Short-Term Memory (LSTM) networks to track extreme weather events in spatio-temporal datasets. We treat spatio-temporal object tracking as learning the probabilistic distribution of continuous latent features of an auto-encoder using stochastic variational inference. For this, we assume that our datasets are i.i.d. and that the latent features can be modeled by a Gaussian distribution. In the proposed method, we first train the VAE to generate an approximate posterior given multichannel climate input containing an extreme climate event at a fixed time. Then, we predict the bounding box, location, and class of extreme climate events using convolutional layers whose input concatenates three features: the embedding and the sampled mean and standard deviation. Lastly, we train the LSTM on the concatenated input to learn the temporal structure of the dataset by recurrently feeding the output back into the next time step's VAE input. Our contribution is two-fold. First, we show the first semi-supervised end-to-end architecture based on a VAE for tracking extreme weather events, which can be applied to massive unlabeled climate datasets. Second, the temporal movement of events is taken into account for bounding-box prediction using the LSTM, which can improve localization accuracy. To our knowledge, this technique has not been explored in either the climate or the machine learning community.
NASA Astrophysics Data System (ADS)
Kossieris, Panagiotis; Makropoulos, Christos; Onof, Christian; Koutsoyiannis, Demetris
2018-01-01
Many hydrological applications, such as flood studies, require long rainfall records at fine time scales, varying from a daily down to a 1-min time step. However, in the real world there is limited availability of data at sub-hourly scales. To cope with this issue, stochastic disaggregation techniques are typically employed to produce possible, statistically consistent rainfall events that aggregate up to the field data collected at coarser scales. A methodology for the stochastic disaggregation of rainfall at fine time scales was recently introduced, combining the Bartlett-Lewis process to generate rainfall events with adjusting procedures that modify the lower-level variables (i.e., hourly) so as to be consistent with the higher-level one (i.e., daily). In the present paper, we extend the aforementioned scheme, initially designed and tested for the disaggregation of daily rainfall into hourly depths, to any sub-hourly time scale. In addition, we take advantage of recent developments in Poisson-cluster processes, incorporating into the methodology a Bartlett-Lewis model variant that introduces dependence between cell intensity and duration in order to capture the variability of rainfall at sub-hourly time scales. The disaggregation scheme is implemented in an R package, named HyetosMinute, to support disaggregation from daily down to the 1-min time scale. The applicability of the methodology was assessed on 5-min rainfall records collected in Bochum, Germany, comparing the performance of the above-mentioned model variant against the original Bartlett-Lewis process (non-random, with 5 parameters). The analysis shows that the disaggregation process adequately reproduces the most important statistical characteristics of rainfall over a wide range of time scales, while the introduction of the model with dependent intensity-duration results in better performance in terms of skewness, rainfall extremes and dry proportions.
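The core idea of disaggregation with an adjusting procedure, generating a candidate fine-scale sequence and then forcing it to be consistent with the observed coarser total, can be sketched as follows. The proportional rescaling and the toy exponential rainfall generator below are loud simplifications: the actual methodology uses the Bartlett-Lewis process and more elaborate adjusting procedures, and all names here are hypothetical.

```python
import numpy as np

def proportional_adjust(fine, coarse_total):
    """Rescale a candidate fine-scale rainfall sequence so that it sums
    exactly to the observed coarser-scale depth (a proportional adjusting
    procedure, simpler than those used with Bartlett-Lewis disaggregation)."""
    fine = np.asarray(fine, dtype=float)
    s = fine.sum()
    if s == 0:
        return fine  # a dry candidate cannot be rescaled to a wet total
    return fine * (coarse_total / s)

def disaggregate_day(daily_depth, n_steps, rng):
    """Draw a synthetic within-day profile (a toy intermittent exponential
    model, NOT the Bartlett-Lewis process) and adjust it to the daily depth."""
    candidate = rng.exponential(1.0, n_steps) * (rng.random(n_steps) < 0.3)
    if candidate.sum() == 0:
        candidate[rng.integers(n_steps)] = 1.0  # force at least one wet step
    return proportional_adjust(candidate, daily_depth)

rng = np.random.default_rng(42)
hourly = disaggregate_day(12.4, 24, rng)
print(round(hourly.sum(), 6))  # → 12.4
```

The consistency constraint is what distinguishes disaggregation from plain simulation: whatever fine-scale structure is generated, it must aggregate back to the observed daily depth.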
Earth History databases and visualization - the TimeScale Creator system
NASA Astrophysics Data System (ADS)
Ogg, James; Lugowski, Adam; Gradstein, Felix
2010-05-01
The "TimeScale Creator" team (www.tscreator.org) and the Subcommission on Stratigraphic Information (stratigraphy.science.purdue.edu) of the International Commission on Stratigraphy (www.stratigraphy.org) have worked with numerous geoscientists and geological surveys to prepare reference datasets for global and regional stratigraphy. All events are currently calibrated to Geologic Time Scale 2004 (Gradstein et al., 2004, Cambridge Univ. Press) and the Concise Geologic Time Scale (Ogg et al., 2008, Cambridge Univ. Press), but the array of intercalibrations enables dynamic adjustment to future numerical age scales and interpolation methods. The main "global" database contains over 25,000 events/zones from paleontology, geomagnetics, sea-level and sequence stratigraphy, igneous provinces, and bolide impacts, plus several stable isotope curves and image sets. Several regional datasets are provided in conjunction with geological surveys, with numerical ages interpolated using a similar flexible inter-calibration procedure. For example, a joint program with Geoscience Australia has compiled an extensive Australian regional biostratigraphy and a full array of basin lithologic columns, with each formation linked to public lexicons of all Proterozoic through Phanerozoic basins - nearly 500 columns of over 9,000 data lines plus hot-cursor links to oil-gas reference wells. Other datapacks include New Zealand biostratigraphy and basin transects (ca. 200 columns), Russian biostratigraphy, British Isles regional stratigraphy, Gulf of Mexico biostratigraphy and lithostratigraphy, high-resolution Neogene stable isotope curves and ice-core data, human cultural episodes, and Circum-Arctic stratigraphy sets. The growing library of datasets is designed for viewing and chart-making in the free "TimeScale Creator" JAVA package. This visualization system produces a screen display of the user-selected time-span and the selected columns of geologic time scale information.
The user can change the vertical-scale, column widths, fonts, colors, titles, ordering, range chart options and many other features. Mouse-activated pop-ups provide additional information on columns and events; including links to external Internet sites. The graphics can be saved as SVG (scalable vector graphics) or PDF files for direct import into Adobe Illustrator or other common drafting software. Users can load additional regional datapacks, and create and upload their own datasets. The "Pro" version has additional dataset-creation tools, output options and the ability to edit and re-save merged datasets. The databases and visualization package are envisioned as a convenient reference tool, chart-production assistant, and a window into the geologic history of our planet.
Inertial-Range Reconnection in Magnetohydrodynamic Turbulence and in the Solar Wind.
Lalescu, Cristian C; Shi, Yi-Kang; Eyink, Gregory L; Drivas, Theodore D; Vishniac, Ethan T; Lazarian, Alexander
2015-07-10
In situ spacecraft data on the solar wind show events identified as magnetic reconnection with wide outflows and extended "X lines," 10^3-10^4 times ion scales. To understand the role of turbulence at these scales, we make a case study of an inertial-range reconnection event in a magnetohydrodynamic simulation. We observe stochastic wandering of field lines in space, breakdown of standard magnetic flux freezing due to Richardson dispersion, and a broadened reconnection zone containing many current sheets. The coarse-grain magnetic geometry is, however, like large-scale reconnection in the solar wind, with a hyperbolic flux tube or apparent X line extending over integral length scales.
Bio-, Magneto- and event-stratigraphy across the K-T boundary
NASA Technical Reports Server (NTRS)
Preisinger, A.; Stradner, H.; Mauritsch, H. J.
1988-01-01
Determining the time and the time structure of rare events in geology can be accomplished by applying three different and independent stratigraphic methods: biostratigraphy, magneto-stratigraphy and event-stratigraphy. The optimal time resolution of the two former methods is about 1000 years, while by means of event-stratigraphy a resolution of approximately one year can be achieved. For biostratigraphy across the Cretaceous-Tertiary (K-T) boundary, micro- and nannofossils have been found best suited. The qualitative and quantitative analyses of minerals and trace elements across the K-T boundary show anomalies on a millimeter scale and permit conclusions regarding the time structure of the K-T event itself. The results of the analyses are most consistently explained by the assumption of an extraterrestrial impact. The main portion of the material rain from the atmosphere was evidently deposited within a short time. The long-time components consist of the finest portion of the material rain from the atmosphere and of the transported and redeposited fall-out.
Paleobiology, community ecology, and scales of ecological pattern.
Jablonski, D; Sepkoski, J J
1996-07-01
The fossil record provides a wealth of data on the role of regional processes and historical events in shaping biological communities over a variety of time scales. The Quaternary record with its evidence of repeated climatic change shows that both terrestrial and marine species shifted independently rather than as cohesive assemblages over scales of thousands of years. Larger scale patterns also show a strong individualistic component to taxon dynamics; assemblage stability, when it occurs, is difficult to separate from shared responses to low rates of environmental change. Nevertheless, the fossil record does suggest that some biotic interactions influence large-scale ecological and evolutionary patterns, albeit in more diffuse and protracted fashions than those generally studied by community ecologists. These include: (1) the resistance by incumbents to the establishment of new or invading taxa, with episodes of explosive diversification often appearing contingent on the removal of incumbents at extinction events; (2) steady states of within-habitat and global diversity at longer time scales (10^7-10^8 yr), despite enormous turnover of taxa; and (3) morphological and biogeographic responses to increased intensities of predation and substratum disturbance over similarly long time scales. The behavior of species and communities over the array of temporal and spatial scales in the fossil record takes on additional significance for framing conservation strategies, and for understanding recovery of species, lineages, and communities from environmental changes.
Paleobiology, community ecology, and scales of ecological pattern
NASA Technical Reports Server (NTRS)
Jablonski, D.; Sepkoski, J. J., Jr. (Principal Investigator)
1996-01-01
The fossil record provides a wealth of data on the role of regional processes and historical events in shaping biological communities over a variety of time scales. The Quaternary record with its evidence of repeated climatic change shows that both terrestrial and marine species shifted independently rather than as cohesive assemblages over scales of thousands of years. Larger scale patterns also show a strong individualistic component to taxon dynamics; assemblage stability, when it occurs, is difficult to separate from shared responses to low rates of environmental change. Nevertheless, the fossil record does suggest that some biotic interactions influence large-scale ecological and evolutionary patterns, albeit in more diffuse and protracted fashions than those generally studied by community ecologists. These include: (1) the resistance by incumbents to the establishment of new or invading taxa, with episodes of explosive diversification often appearing contingent on the removal of incumbents at extinction events; (2) steady states of within-habitat and global diversity at longer time scales (10^7-10^8 yr), despite enormous turnover of taxa; and (3) morphological and biogeographic responses to increased intensities of predation and substratum disturbance over similarly long time scales. The behavior of species and communities over the array of temporal and spatial scales in the fossil record takes on additional significance for framing conservation strategies, and for understanding recovery of species, lineages, and communities from environmental changes.
Multi-scale comparison of source parameter estimation using empirical Green's function approach
NASA Astrophysics Data System (ADS)
Chen, X.; Cheng, Y.
2015-12-01
Analysis of earthquake source parameters requires correction for path effects, site response, and instrument responses. The empirical Green's function (EGF) method is one of the most effective methods for removing path effects and station responses, by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimates for strictly selected events; however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatial-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs have been proposed, which utilize stacking to obtain systematic source parameter estimates for a large number of events at the same time. This allows us to examine large quantities of events systematically, facilitating analysis of spatial-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed during this process. In addition to the empirical Green's function calculation, choices of model parameters and fitting methods also introduce biases. Here, using two regionally focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, I compare results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods within completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimations, and the associated problems.
Downscaling large-scale circulation to local winter climate using neural network techniques
NASA Astrophysics Data System (ADS)
Cavazos Perez, Maria Tereza
1998-12-01
The severe impacts of climate variability on society reveal the increasing need for improving regional-scale climate diagnosis. A new downscaling approach for climate diagnosis is developed here. It is based on neural network techniques that derive transfer functions from the large-scale atmospheric controls to the local winter climate in northeastern Mexico and southeastern Texas during the 1985-93 period. A first neural network (NN) model employs time-lagged component scores from a rotated principal component analysis of SLP, 500-hPa heights, and 1000-500 hPa thickness as predictors of daily precipitation. The model is able to reproduce the phase and, to some degree, the amplitude of large rainfall events, reflecting the influence of the large-scale circulation. Large errors are found over the Sierra Madre, over the Gulf of Mexico, and during El Niño events, suggesting an increase in the importance of meso-scale rainfall processes. However, errors are also due to the lack of randomization of the input data and the absence of local atmospheric predictors such as moisture. Thus, a second NN model uses time-lagged specific humidity at the Earth's surface and at the 700 hPa level, SLP tendency, and 700-500 hPa thickness as input to a self-organizing map (SOM) that pre-classifies the atmospheric fields into different patterns. The results from the SOM classification document that negative (positive) anomalies of winter precipitation over the region are associated with: (1) weaker (stronger) Aleutian low; (2) stronger (weaker) North Pacific high; (3) negative (positive) phase of the Pacific North American pattern; and (4) La Niña (El Niño) events. The SOM atmospheric patterns are then used as input to a feed-forward NN that captures over 60% of the daily rainfall variance and 94% of the daily minimum temperature variance over the region. This demonstrates the ability of artificial neural network models to simulate realistic relationships on daily time scales.
The results of this research also reveal that the SOM pre-classification of days with similar atmospheric conditions succeeded in emphasizing the differences in atmospheric variance conducive to extreme events. This resulted in a downscaling NN model that is highly sensitive to local-scale weather anomalies associated with El Niño and extreme cold events.
Ramanathan, Arvind; Savol, Andrej J.; Agarwal, Pratul K.; Chennubhotla, Chakra S.
2012-01-01
Biomolecular simulations at millisecond and longer timescales can provide vital insights into functional mechanisms. Since post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to biological function online, that is, as simulations are progressing. Recently, we introduced a novel computational technique, quasi-anharmonic analysis (QAA) (PLoS One 6(1): e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity, in the form of fourth-order statistics, for characterizing atomic fluctuations. In this paper, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD - a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond time-scale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three sub-domains (LID, CORE and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. PMID:22733562
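The use of fourth-order statistics to flag anharmonic coordinates can be illustrated with a toy calculation: the excess kurtosis of a coordinate's fluctuations is near zero for a Gaussian (harmonic) coordinate and deviates from zero when the coordinate hops between sub-states. This is a minimal sketch on synthetic data, not the QAA/HOST4MD implementation.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3: near zero for Gaussian
    (harmonic) fluctuations, nonzero for anharmonic coordinates."""
    c = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.mean(c**4) / np.mean(c**2)**2 - 3.0)

rng = np.random.default_rng(0)
# A harmonic (single-well) coordinate: Gaussian fluctuations.
harmonic = rng.normal(size=100_000)
# A coordinate hopping between two sub-states: bimodal, non-Gaussian.
two_state = np.concatenate([rng.normal(-2.0, 0.3, 50_000),
                            rng.normal(+2.0, 0.3, 50_000)])
# excess_kurtosis(harmonic) is near 0, while excess_kurtosis(two_state)
# is strongly negative, flagging the sub-state-hopping coordinate.
```

Ranking coordinates by such higher-order moments is one simple way to prioritize degrees of freedom for sub-state analysis as trajectory data stream in.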
Rupture Complexities of Fluid Induced Microseismic Events at the Basel EGS Project
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn; Shapiro, Serge A.; Häring, Markus; Asanuma, Hiroshi
2016-04-01
Microseismic data sets of excellent quality, such as the seismicity recorded in the Basel-1 enhanced geothermal system, Switzerland, in 2006-2007, provide the opportunity to analyse induced seismic events in great detail. It is important to understand to what extent seismological insights into, e.g., source and rupture processes are scale dependent and how they can be transferred to fluid-induced micro-seismicity. We applied the empirical Green's function (EGF) method in order to reconstruct the relative source time functions of 195 suitable microseismic events from the Basel-1 reservoir. We found 93 solutions with a clear and consistent directivity pattern. The remaining events display either no measurable directivity, are unfavourably oriented, or exhibit inconsistent or complex relative source time functions. In this work we focus on selected events of M ˜ 1 which show possible rupture complexities. It is demonstrated that the EGF method allows complex rupture behaviour to be resolved even if it is not directly identifiable in the seismograms. We find clear evidence of rupture directivity and multi-phase rupturing in the analysed relative source time functions. The time delays between consecutive subevents lie on the order of 10 ms. Amplitudes of the relative source time functions of the subevents do not always show the same azimuthal dependence, indicating dissimilarity in the rupture directivity of the subevents. Our observations support the assumption that heterogeneity on fault surfaces persists down to small scales (a few tens of meters).
Weather observations on Whistler Mountain during five storms
NASA Astrophysics Data System (ADS)
Thériault, Julie M.; Rasmussen, Kristen L.; Fisico, Teresa; Stewart, Ronald E.; Joe, Paul; Gultepe, Ismail; Clément, Marilys; Isaac, George A.
2014-01-01
A greater understanding of precipitation formation processes over complex terrain near the west coast of British Columbia will contribute to many relevant applications, such as climate studies, local hydrology, transportation, and winter sport competition. The phase of precipitation is difficult to determine because of the warm and moist weather conditions experienced during the wintertime in coastal mountain ranges. The goal of this study is to investigate the wide range of meteorological conditions that generated precipitation on Whistler Mountain from 4-12 March 2010 during the SNOW-V10 field campaign. During this time period, five different storms were documented in detail and were associated with noticeably different meteorological conditions in the vicinity of Whistler Mountain. New measurement techniques, along with the SNOW-V10 instrumentation, were used to obtain in situ observations during precipitation events along the Whistler mountainside. The results demonstrate a high variability of weather conditions ranging from the synoptic-scale to the micro-scale. These weather events were associated with a variation of precipitation along the mountainside, such as events associated with snow, snow pellets, and rain. Only two events associated with a rain-snow transition along the mountainside were observed, even though above-freezing temperatures along the mountainside were recorded 90 % of the time. On a smaller scale, these events were also associated with a high variability of snowflake types that were observed simultaneously near the top of Whistler Mountain. Overall, these detailed observations demonstrate the importance of understanding small-scale processes to improve observational techniques, short-term weather prediction, and longer-term climate projections over mountainous regions.
Eastern Pacific cooling and Atlantic overturning circulation during the last deglaciation.
Kienast, Markus; Kienast, Stephanie S; Calvert, Stephen E; Eglinton, Timothy I; Mollenhauer, Gesine; François, Roger; Mix, Alan C
2006-10-19
Surface ocean conditions in the equatorial Pacific Ocean could hold the clue to whether millennial-scale global climate change during glacial times was initiated through tropical ocean-atmosphere feedbacks or by changes in the Atlantic thermohaline circulation. North Atlantic cold periods during Heinrich events and millennial-scale cold events (stadials) have been linked with climatic changes in the tropical Atlantic Ocean and South America, as well as the Indian and East Asian monsoon systems, but not with tropical Pacific sea surface temperatures. Here we present a high-resolution record of sea surface temperatures in the eastern tropical Pacific derived from alkenone unsaturation measurements. Our data show a temperature drop of approximately 1 °C, synchronous (within dating uncertainties) with the shutdown of the Atlantic meridional overturning circulation during Heinrich event 1, and a smaller temperature drop of approximately 0.5 °C synchronous with the smaller reduction in the overturning circulation during the Younger Dryas event. Both cold events coincide with maxima in surface ocean productivity as inferred from 230Th-normalized carbon burial fluxes, suggesting increased upwelling at the time. From the concurrence of equatorial Pacific cooling with the two North Atlantic cold periods during deglaciation, we conclude that these millennial-scale climate changes were probably driven by a reorganization of the oceans' thermohaline circulation, although possibly amplified by tropical ocean-atmosphere interaction as suggested before.
NASA Astrophysics Data System (ADS)
Hollander, R. W.; Bom, V. R.; van Eijk, C. W. E.; Faber, J. S.; Hoevers, H.; Kruit, P.
1994-09-01
The elemental composition of a sample at nanometer scale is determined by measurement of the characteristic energy of Auger electrons, emitted in coincidence with incoming primary electrons from a microbeam in a scanning transmission electron microscope (STEM). Single electrons are detected with position sensitive detectors, consisting of MicroChannel Plates (MCP) and MultiStrip Anodes (MSA), one for the energy of the Auger electrons (Auger-detector) and one for the energy loss of primary electrons (EELS-detector). The MSAs are sensed with LeCroy 2735DC preamplifiers. The fast readout is based on LeCroy's PCOS III system. On the detection of a coincidence (Event), energy data of Auger and EELS are combined with timing data into an Event word. Event words are stored in list mode in a VME memory module. Blocks of Event words are scanned by transputers in VME and two-dimensional energy histograms are filled, using the timing information to obtain a maximal true/accidental ratio. The resulting histograms are stored on the disk of a PC-386, which also controls data taking. The system is designed to handle 10^5 Events per second, 90% of which are accidental. In the histograms the "true" to "accidental" ratio will be 5. The dead time is 15%.
Large Scale Influences on Summertime Extreme Precipitation in the Northeastern United States.
Marquardt Collow, Allison B; Bosilovich, Michael G; Koster, Randal D
2016-12-01
Observations indicate that over the last few decades there has been a statistically significant increase in precipitation in the Northeastern United States and that this can be attributed to an increase in precipitation associated with extreme precipitation events. Here we use a state-of-the-art atmospheric reanalysis to examine such events in detail. Daily extreme precipitation events defined at the 75th and 95th percentiles from gridded gauge observations are identified for a selected region within the Northeast. Atmospheric variables from the Modern Era Retrospective Analysis for Research and Applications - Version 2 (MERRA-2) are then composited during these events to illustrate the time evolution of associated synoptic structures, with a focus on vertically integrated water vapor fluxes, sea level pressure, and 500 hPa heights. Anomalies of these fields move into the region from the northwest, with stronger anomalies present in the 95th percentile case. Although previous studies show tropical cyclones are responsible for the most intense extreme precipitation events, only 10% of the events in this study are caused by tropical cyclones. On the other hand, extreme events resulting from cutoff low pressure systems have increased. The time period of the study was divided in half to determine how the mean composite has changed over time. An arc of lower sea level pressure along the East Coast and a change in the vertical profile of equivalent potential temperature suggest a possible increase in the frequency or intensity of synoptic-scale baroclinic disturbances.
Large Scale Influences on Summertime Extreme Precipitation in the Northeastern United States
NASA Technical Reports Server (NTRS)
Collow, Allison B. Marquardt; Bosilovich, Michael G.; Koster, Randal Dean
2016-01-01
Observations indicate that over the last few decades there has been a statistically significant increase in precipitation in the northeastern United States and that this can be attributed to an increase in precipitation associated with extreme precipitation events. Here a state-of-the-art atmospheric reanalysis is used to examine such events in detail. Daily extreme precipitation events defined at the 75th and 95th percentile from gridded gauge observations are identified for a selected region within the Northeast. Atmospheric variables from the Modern-Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), are then composited during these events to illustrate the time evolution of associated synoptic structures, with a focus on vertically integrated water vapor fluxes, sea level pressure, and 500-hectopascal heights. Anomalies of these fields move into the region from the northwest, with stronger anomalies present in the 95th percentile case. Although previous studies show tropical cyclones are responsible for the most intense extreme precipitation events, only 10 percent of the events in this study are caused by tropical cyclones. On the other hand, extreme events resulting from cutoff low pressure systems have increased. The time period of the study was divided in half to determine how the mean composite has changed over time. An arc of lower sea level pressure along the East Coast and a change in the vertical profile of equivalent potential temperature suggest a possible increase in the frequency or intensity of synoptic-scale baroclinic disturbances.
The Effect of Inappropriate Calibration: Three Case Studies in Molecular Ecology
Ho, Simon Y. W.; Saarma, Urmas; Barnett, Ross; Haile, James; Shapiro, Beth
2008-01-01
Time-scales estimated from sequence data play an important role in molecular ecology. They can be used to draw correlations between evolutionary and palaeoclimatic events, to measure the tempo of speciation, and to study the demographic history of an endangered species. In all of these studies, it is paramount to have accurate estimates of time-scales and substitution rates. Molecular ecological studies typically focus on intraspecific data that have evolved on genealogical scales, but often these studies inappropriately employ deep fossil calibrations or canonical substitution rates (e.g., 1% per million years for birds and mammals) for calibrating estimates of divergence times. These approaches can yield misleading estimates of molecular time-scales, with significant impacts on subsequent evolutionary and ecological inferences. We illustrate this calibration problem using three case studies: avian speciation in the late Pleistocene, the demographic history of bowhead whales, and the Pleistocene biogeography of brown bears. For each data set, we compare the date estimates that are obtained using internal and external calibration points. In all three cases, the conclusions are significantly altered by the application of revised, internally-calibrated substitution rates. Collectively, the results emphasise the importance of judicious selection of calibrations for analyses of recent evolutionary events. PMID:18286172
The effect of inappropriate calibration: three case studies in molecular ecology.
Ho, Simon Y W; Saarma, Urmas; Barnett, Ross; Haile, James; Shapiro, Beth
2008-02-20
Time-scales estimated from sequence data play an important role in molecular ecology. They can be used to draw correlations between evolutionary and palaeoclimatic events, to measure the tempo of speciation, and to study the demographic history of an endangered species. In all of these studies, it is paramount to have accurate estimates of time-scales and substitution rates. Molecular ecological studies typically focus on intraspecific data that have evolved on genealogical scales, but often these studies inappropriately employ deep fossil calibrations or canonical substitution rates (e.g., 1% per million years for birds and mammals) for calibrating estimates of divergence times. These approaches can yield misleading estimates of molecular time-scales, with significant impacts on subsequent evolutionary and ecological inferences. We illustrate this calibration problem using three case studies: avian speciation in the late Pleistocene, the demographic history of bowhead whales, and the Pleistocene biogeography of brown bears. For each data set, we compare the date estimates that are obtained using internal and external calibration points. In all three cases, the conclusions are significantly altered by the application of revised, internally-calibrated substitution rates. Collectively, the results emphasise the importance of judicious selection of calibrations for analyses of recent evolutionary events.
Scale size-dependent characteristics of the nightside aurora
NASA Astrophysics Data System (ADS)
Humberset, B. K.; Gjerloev, J. W.; Samara, M.; Michell, R. G.
2017-02-01
We have determined the spatiotemporal characteristics of the magnetosphere-ionosphere (M-I) coupling using auroral imaging. Observations at fixed positions for an extended period of time are provided by a ground-based all-sky imager measuring the 557.7 nm auroral emissions. We report on a single event of nightside aurora (˜22 magnetic local time) preceding a substorm onset. To determine the spatiotemporal characteristics, we perform an innovative analysis of an all-sky imager movie (19 min duration, images at 3.31 Hz) that combines a two-dimensional spatial fast Fourier transform with a temporal correlation. We find a scale size-dependent variability where the largest scale sizes are stable on timescales of minutes while the small scale sizes are more variable. When comparing two smaller time intervals of different types of auroral displays, we find a variation in their characteristics. The characteristics averaged over the event are in remarkable agreement with the spatiotemporal characteristics of the nightside field-aligned currents during moderately disturbed times. Thus, two different electrodynamical parameters of the M-I coupling show similar behavior. This gives independent support to the claim of a system behavior that uses repeatable solutions to transfer energy and momentum from the magnetosphere to the ionosphere.
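The combination of a two-dimensional spatial FFT with a temporal correlation described above can be sketched as follows: each frame of an image sequence is band-passed into a spatial-scale band, and the frame-to-frame correlation of that band measures how stable that scale is in time. This is a schematic reconstruction with synthetic frames, not the authors' analysis code.

```python
import numpy as np

def scale_band(frame, k_lo, k_hi):
    """Keep only Fourier modes with radial wavenumber in [k_lo, k_hi)."""
    F = np.fft.fft2(frame)
    ky, kx = np.meshgrid(np.fft.fftfreq(frame.shape[0]),
                         np.fft.fftfreq(frame.shape[1]), indexing='ij')
    k = np.hypot(kx, ky)
    return np.real(np.fft.ifft2(F * ((k >= k_lo) & (k < k_hi))))

def lag1_correlation(movie, k_lo, k_hi):
    """Mean frame-to-frame correlation of one spatial-scale band."""
    bands = [scale_band(f, k_lo, k_hi).ravel() for f in movie]
    cs = [np.corrcoef(a, b)[0, 1] for a, b in zip(bands[:-1], bands[1:])]
    return float(np.mean(cs))

# Synthetic movie: a persistent large-scale pattern plus fresh
# small-scale noise in every frame.
rng = np.random.default_rng(1)
n, T = 64, 20
base = scale_band(rng.normal(size=(n, n)), 0.0, 0.05) * 50.0
movie = [base + rng.normal(size=(n, n)) for _ in range(T)]
r_large = lag1_correlation(movie, 0.0, 0.05)   # large scales: stable
r_small = lag1_correlation(movie, 0.2, 0.5)    # small scales: variable
```

By construction `r_large` stays near 1 while `r_small` hovers near 0, reproducing in miniature the scale size-dependent variability the event analysis reports.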
NASA Astrophysics Data System (ADS)
Alizee, D.; Bonamy, D.
2017-12-01
In inhomogeneous brittle solids like rocks, concrete or ceramics, one usually distinguishes nominally brittle fracture, driven by the propagation of a single crack, from quasi-brittle fracture, which results from the accumulation of many microcracks. The latter is accompanied by intermittent sharp noise, as revealed, e.g., by the acoustic emission observed in lab-scale compressive fracture experiments or, at geophysical scale, in the seismic activity. In both cases, statistical analyses have revealed a complex time-energy organization into aftershock sequences obeying a range of robust empirical scaling laws (the Omori-Utsu, productivity and Båth's laws) that help carry out seismic hazard analysis and damage mitigation. These laws are usually conjectured to emerge from the collective dynamics of microcrack nucleation. In the experiments presented at AGU, we will show that such a statistical organization is not specific to quasi-brittle multicracking situations, but also rules the acoustic events produced by a single crack slowly driven in an artificial rock made of sintered polymer beads. This simpler situation has advantageous properties (statistical stationarity in particular) that permitted us to uncover the origins of these seismic laws: both the productivity law and Båth's law result from the scale-free statistics of event energy, and the Omori-Utsu law results from the scale-free statistics of inter-event times. This yields predictions on how the associated parameters are related, which we derived analytically. Surprisingly, the so-obtained relations are also compatible with observations on lab-scale compressive fracture experiments, suggesting that, in these complex multicracking situations too, the organization into aftershock sequences and the associated seismic laws are ruled by the propagation of individual microcrack fronts, and not by the collective, stress-mediated microcrack nucleation.
Conversely, the relations are not fulfilled in seismology signals, suggesting that additional ingredients should be taken into account.
NASA Astrophysics Data System (ADS)
Butler, Rhett; Frazer, L. Neil; Templeton, William J.
2016-05-01
We use the global rate of Mw ≥ 9.0 earthquakes, and standard Bayesian procedures, to estimate the probability of such mega events in the Aleutian Islands, where they pose a significant risk to Hawaii. We find that the probability of such an earthquake along the Aleutians island arc is 6.5% to 12% over the next 50 years (50% credibility interval) and that the annualized risk to Hawai'i is about $30 M. Our method (the regionally scaled global rate method or RSGR) is to scale the global rate of Mw 9.0+ events in proportion to the fraction of global subduction (units of area per year) that takes place in the Aleutians. The RSGR method assumes that Mw 9.0+ events are a Poisson process with a rate that is both globally and regionally stationary on the time scale of centuries, and it follows the principle of Burbidge et al. (2008) who used the product of fault length and convergence rate, i.e., the area being subducted per annum, to scale the Poisson rate for the GSS to sections of the Indonesian subduction zone. Before applying RSGR to the Aleutians, we first apply it to five other regions of the global subduction system where its rate predictions can be compared with those from paleotsunami, paleoseismic, and geoarcheology data. To obtain regional rates from paleodata, we give a closed-form solution for the probability density function of the Poisson rate when event count and observation time are both uncertain.
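The core of the RSGR calculation is simple Poisson arithmetic: scale the global rate by the regional fraction of subduction, then convert the regional rate into the chance of at least one event within a time horizon. The sketch below uses illustrative round numbers (a global rate of five Mw 9+ events per century and a 4% Aleutian share of global subduction are assumptions for demonstration, not values taken from the paper).

```python
import math

def rsgr_probability(global_rate, regional_fraction, horizon_years):
    """Regionally scaled global rate: scale the global Poisson rate of
    Mw >= 9.0 events by the region's share of global subduction, then
    evaluate the chance of at least one event within the horizon."""
    regional_rate = global_rate * regional_fraction  # events per year
    return 1.0 - math.exp(-regional_rate * horizon_years)

# Illustrative (assumed) inputs: ~5 Mw 9+ events per century globally,
# ~4% of global subduction occurring along the Aleutians.
p50 = rsgr_probability(global_rate=0.05, regional_fraction=0.04,
                       horizon_years=50)
print(f"P(at least one Mw 9+ in 50 yr) = {p50:.1%}")  # 9.5%
```

With these round inputs the sketch lands inside the 6.5%-12% credibility interval quoted above, which is the kind of consistency check the paleodata comparisons in the paper provide more rigorously.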
A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events
NASA Astrophysics Data System (ADS)
Laurenza, M.; Alberti, T.; Cliver, E. W.
2018-04-01
The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.
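The verification scores quoted above follow the standard contingency-table definitions, and can be checked directly against the counts given in the abstract (41 hits out of 55 observed ≥S2 events; 13 false alarms out of 54 warnings issued):

```python
def pod(hits, misses):
    """Probability of Detection: fraction of observed events forecast."""
    return hits / (hits + misses)

def far(false_alarms, hits):
    """False Alarm Ratio: fraction of warnings not followed by an event."""
    return false_alarms / (false_alarms + hits)

# Counts quoted in the abstract for >=S2 events, 1995-2014:
pod_s2 = pod(hits=41, misses=55 - 41)        # 41/55
far_s2 = far(false_alarms=13, hits=54 - 13)  # 13/54
print(f"POD = {pod_s2:.0%}, FAR = {far_s2:.0%}")  # POD = 75%, FAR = 24%
```

The same two functions applied to the ≥S1 counts reproduce the 63% POD and 38% FAR reported for the original ESPERTA threshold.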
Gigante, Guido; Deco, Gustavo; Marom, Shimon; Del Giudice, Paolo
2015-01-01
Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural) is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed ‘quasi-orbits’, which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network’s firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms. PMID:26558616
Seasonal and ENSO Influences on the Stable Isotopic Composition of Galápagos Precipitation
NASA Astrophysics Data System (ADS)
Martin, N. J.; Conroy, J. L.; Noone, D.; Cobb, K. M.; Konecky, B. L.; Rea, S.
2018-01-01
The origin of stable isotopic variability in precipitation over time and space is critical to the interpretation of stable isotope-based paleoclimate proxies. In the eastern equatorial Pacific, modern stable isotope measurements in precipitation (δ18Op and δDp) are sparse and largely unevaluated in the literature, although insights from such analyses would benefit the interpretations of several regional isotope-based paleoclimate records. Here we present a new 3.5 year record of daily-resolved δ18Op and δDp from Santa Cruz, Galápagos. With a prior 13 year record of monthly δ18Op and δDp from the island, these new data reveal controls on the stable isotopic composition of regional precipitation on event to interannual time scales. Overall, we find Galápagos δ18Op is significantly correlated with precipitation amount on daily and monthly time scales. The majority of Galápagos rain events are drizzle, or garúa, derived from local marine boundary layer vapor, with corresponding high δ18Op values due to the local source and increased evaporation and equilibration of smaller drops with boundary layer vapor. On monthly time scales, only precipitation in very strong, warm season El Niño months has substantially lower δ18Op values, as the sea surface temperature threshold for deep convection (28°C) is only surpassed at these times. The 2015/2016 El Niño event did not produce strong precipitation or δ18Op anomalies due to the short period of warm SST anomalies, which did not extend into the peak of the warm season. Eastern Pacific proxy isotope records may be biased toward periods of high rainfall during strong to very strong El Niño events.
Modeling carbon production and transport during ELMs in DIII-D
NASA Astrophysics Data System (ADS)
Hogan, J.; Wade, M.; Coster, D.; Lasnier, C.
2004-11-01
Large-scale Type I ELM events could provide a significant C source in ITER, and C production rates depend on incident D flux density and surface temperature, quantities which can vary significantly during an ELM event. Recent progress on DIII-D has improved opportunities for code comparison. Fast time-scale measurements of divertor CIII evolution [1] and fast edge CER measurements of C profile evolution during low-density DIII-D LSN ELMy H-modes (type I) [2] have been modeled using the solps5.0/Eirene99 coupled edge code and time-dependent thermal analysis codes. An ELM model based on characteristics of MHD peeling-ballooning modes reproduces the pedestal evolution. Qualitative agreement for the CIII evolution during an ELM event is found using the Roth et al. annealing model for chemical sputtering, and the sensitivity to other models is described. Significant ELM-to-ELM variations in observed maximum divertor target IR temperature during nominally identical ELMs are investigated with models for C emission from micron-scale dust particles. [1] M. Groth, M. Fenstermacher et al., J. Nucl. Mater. (2003); [2] M. Wade, K. Burrell et al., PSI-16
A Large-Scale Search for Evidence of Quasi-Periodic Pulsations in Solar Flares
NASA Technical Reports Server (NTRS)
Inglis, A. R.; Ireland, J.; Dennis, B. R.; Hayes, L.; Gallagher, P.
2016-01-01
The nature of quasi-periodic pulsations (QPP) in solar flares is poorly constrained, and critically the general prevalence of such signals in solar flares is unknown. Therefore, we perform a large-scale search for evidence of signals consistent with QPP in solar flares, focusing on the 1-300 s timescale. We analyze 675 M- and X-class flares observed by the Geostationary Operational Environmental Satellite (GOES) series in 1-8 Å soft X-rays between 2011 February 1 and 2015 December 31. Additionally, over the same era we analyze Fermi/Gamma-ray Burst Monitor (GBM) 15-25 keV X-ray data for each of these flares associated with a Fermi/GBM solar flare trigger, a total of 261 events. Using a model comparison method, we determine whether there is evidence for a substantial enhancement in the Fourier power spectrum that may be consistent with a QPP signature, based on three tested models: a power law plus a constant, a broken power law plus a constant, and a power law plus a constant with an additional QPP signature component. From this, we determine that approx. 30% of GOES events and approx. 8% of Fermi/GBM events show strong signatures consistent with classical interpretations of QPP. For the remaining events, either two or more tested models cannot be strongly distinguished from each other, or the events are well described by single power-law or broken power-law Fourier power spectra. For both instruments, a preferred characteristic time-scale of approx. 5-30 s was found in the QPP-like events, with no dependence on flare magnitude in either GOES or GBM data. We also show that individual events in the sample show similar characteristic time-scales in both GBM and GOES data sets. We discuss the implications of these results for our understanding of solar flares and possible QPP mechanisms.
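The underlying idea — a QPP appears as excess Fourier power above a power-law-like background — can be sketched on synthetic data. This is a deliberate simplification (a straight-line fit in log-log space on a synthetic series), not the paper's Bayesian model comparison:

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 1024, 2.0                       # ~34 minutes at 2 s cadence
t = np.arange(n) * dt
# Synthetic "flare" series: white noise plus a 20 s QPP-like oscillation
x = rng.normal(size=n) + 3.0 * np.sin(2 * np.pi * t / 20.0)

freq = np.fft.rfftfreq(n, dt)[1:]       # drop the zero frequency
power = np.abs(np.fft.rfft(x - x.mean())[1:]) ** 2

# Background model: here a straight-line (power-law) fit in log-log space,
# a simplification of the tested spectral models
slope, intercept = np.polyfit(np.log(freq), np.log(power), 1)
background = np.exp(intercept + slope * np.log(freq))

# A QPP candidate shows up as a strong excess of power above background
excess = power / background
peak_freq = freq[np.argmax(excess)]     # expect ~1/(20 s) = 0.05 Hz
```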
The Hippocampus, Time, and Memory Across Scales
Howard, Marc W.; Eichenbaum, Howard
2014-01-01
A wealth of experimental studies with animals have offered insights about how neural networks within the hippocampus support the temporal organization of memories. These studies have revealed the existence of “time cells” that encode moments in time, much as the well-known “place cells” map locations in space. Another line of work inspired by human behavioral studies suggests that episodic memories are mediated by a state of temporal context that changes gradually over long time scales, up to at least a few thousand seconds. In this view, the “mental time travel” hypothesized to support the experience of episodic memory corresponds to a “jump back in time” in which a previous state of temporal context is recovered. We suggest that these 2 sets of findings could be different facets of a representation of temporal history that maintains a record of the last few thousand seconds of experience. The ability to represent long time scales comes at the cost of discarding precise information about when a stimulus was experienced; this uncertainty becomes greater for events further in the past. We review recent computational work that describes a mechanism that could construct such a scale-invariant representation. Taken as a whole, this suggests the hippocampus plays its role in multiple aspects of cognition by representing events embedded in a general spatiotemporal context. The representation of internal time can be useful across nonhippocampal memory systems. PMID:23915126
Evolution of Flow channels and Dipolarization Using THEMIS Observations and Global MHD Simulations
NASA Astrophysics Data System (ADS)
El-Alaoui, M.; McPherron, R. L.; Nishimura, Y.
2017-12-01
We have extensively analyzed a substorm on March 14, 2008 for which we have observations from THEMIS spacecraft located beyond 9 RE near 2100 local time. The available data include an extensive network of all sky cameras and ground magnetometers that establish the times of various auroral and magnetic events. This arrangement provided an excellent data set with which to investigate meso-scale structures in the plasma sheet. We have used a global magnetohydrodynamic simulation to investigate the structure and dynamics of the magnetotail current sheet during this substorm. Both earthward and tailward flows were found in the observations as well as the simulations. The simulation shows that the flow channels follow tortuous paths that are often reflected or deflected before arriving at the inner magnetosphere. The simulation shows a sequence of fast flows and dipolarization events similar to what is seen in the data, though not at precisely the same times or locations. We will use our simulation results combined with the observations to investigate the global convection systems and current sheet structure during this event, showing how meso-scale structures fit into the context of the overall tail dynamics during this event. Our study includes determining the location, timing and strength of several current wedges and expansion onsets during an 8-hour interval.
NASA Astrophysics Data System (ADS)
Choi, N.; Lee, M. I.; Lim, Y. K.; Kim, K. M.
2017-12-01
A heatwave is an extreme hot-weather event that can cause fatal damage to human health, and heatwave occurrence has a strong relationship with large-scale atmospheric teleconnection patterns. In this study, we examine the spatial pattern of heatwaves in East Asia using EOF analysis, together with the relationship between heatwave frequency and large-scale atmospheric teleconnection patterns. We also separate heatwave frequency into a time scale longer than a decade and an interannual time scale. The long-term variation of heatwave frequency in East Asia shows a linkage with sea surface temperature (SST) variability over the North Atlantic on a decadal time scale (a.k.a. the Atlantic Multidecadal Oscillation; AMO). On the other hand, the interannual variation of heatwave frequency is linked with two dominant spatial patterns associated with large-scale teleconnection patterns resembling the Scandinavian teleconnection (SCAND-like) pattern and the circumglobal teleconnection (CGT-like) pattern, respectively. Notably, the interannual variation of heatwave frequency in East Asia shows a remarkable change after the mid-1990s. While heatwave frequency was mainly associated with the CGT-like pattern before the mid-1990s, the SCAND-like pattern becomes the most dominant one afterward, relegating the CGT-like pattern to second. This study implies that large-scale atmospheric teleconnection patterns play a key role in developing heatwave events in East Asia. We further discuss possible mechanisms for the decadal change in the linkage between heatwave frequency and the large-scale teleconnection patterns in East Asia, such as early melting of snow cover and/or weakening of the East Asian jet stream due to global warming.
A new model for extinction and recolonization in two dimensions: quantifying phylogeography.
Barton, Nicholas H; Kelleher, Jerome; Etheridge, Alison M
2010-09-01
Classical models of gene flow fail in three ways: they cannot explain large-scale patterns; they predict much more genetic diversity than is observed; and they assume that loosely linked genetic loci evolve independently. We propose a new model that deals with these problems. Extinction events kill some fraction of individuals in a region. These are replaced by offspring from a small number of parents, drawn from the preexisting population. This model of evolution forwards in time corresponds to a backwards model, in which ancestral lineages jump to a new location if they are hit by an event, and may coalesce with other lineages that are hit by the same event. We derive an expression for the identity in allelic state, and show that, over scales much larger than the largest event, this converges to the classical value derived by Wright and Malécot. However, rare events that cover large areas cause low genetic diversity, large-scale patterns, and correlations in ancestry between unlinked loci. © 2010 The Author(s). Journal compilation © 2010 The Society for the Study of Evolution.
Capturing flood-to-drought transitions in regional climate model simulations
NASA Astrophysics Data System (ADS)
Anders, Ivonne; Haslinger, Klaus; Hofstätter, Michael; Salzmann, Manuela; Resch, Gernot
2017-04-01
In previous studies, atmospheric cyclones have been investigated in terms of related precipitation extremes in Central Europe. Mediterranean (Vb-like) cyclones are of special relevance, as they are frequently related to high atmospheric moisture fluxes leading to floods and landslides in the Alpine region. Another focus in this area is on droughts, which affect soil moisture as well as surface and sub-surface runoff. Such events develop differently depending on the pre-saturation of water available in the soil. In a first step, we investigated two time periods that each encompass a flood event and a subsequent drought on very different time scales: one long-lasting transition (2002/2003) and a rather short one between May and August 2013. In a second step, we extended the investigation to the long period 1950-2016. We focused on high spatial and temporal scales and assessed the currently achievable accuracy in simulating the Vb-events on one hand and the subsequent drought events on the other. The state-of-the-art regional climate model CCLM is applied in hindcast mode to simulate the single events described above, but also the period from 1948 to 2016, in order to check that results from the short runs remain valid for the long time period. Besides the conventional forcing of the regional climate model at its lateral boundaries, a spectral nudging technique is applied. In the simulations covering the European domain, different model parameters have been varied systematically. The resulting precipitation amounts have been compared to the E-OBS gridded European precipitation data set and a recent high-resolution precipitation data set for Austria (GPARD-6). For the drought events, the Standardized Precipitation Evapotranspiration Index (SPEI), soil moisture, and runoff have been investigated. Varying the spectral nudging setup helps us to understand the 3D processes during these events and to identify model deficiencies.
Improving the simulation of such past events also improves the ability to assess the climate change signal in the near and far future.
Characterization and prediction of extreme events in turbulence
NASA Astrophysics Data System (ADS)
Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.
2017-11-01
Extreme events in Nature such as tornadoes, large floods and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events at small scales that are intermittent in character. We examine events in the energy dissipation rate and enstrophy that are several tens to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence, with Taylor Reynolds numbers spanning a decade, computed with different small-scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).
Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity
NASA Astrophysics Data System (ADS)
Codano, C.; Alonzo, M. L.; Vilardo, G.
The temporal clustering of Vesuvian earthquakes is investigated by means of three statistical tools: the inter-event time distribution, the running mean, and multifractal analysis. The first cannot reliably distinguish a Poissonian process from a clustered one, owing to the difficulty of telling an exponential distribution apart from a power-law one. The running-mean test reveals the clustering of the earthquakes, but loses information about the structure of the distribution at global scales. The multifractal approach can reveal the clustering at small scales, while the global behaviour remains Poissonian. Subsequently, the clustering of the events is interpreted in terms of diffusive processes of the stress in the Earth's crust.
Radiation belt electron observations following the January 1997 magnetic cloud event
NASA Astrophysics Data System (ADS)
Selesnick, R. S.; Blake, J. B.
Relativistic electrons in the outer radiation belt associated with the January 1997 magnetic cloud event were observed by the HIST instrument on POLAR at kinetic energies from 0.7 to 7 MeV and L shells from 3 to 9. The electron enhancement occurred on a time scale of hours or less throughout the outer radiation belt, except for a more gradual rise in the higher energy electrons at the lower L values indicative of local acceleration and inward radial diffusion. At the higher L values, variations on a time scale of several days following the initial injection on January 10 are consistent with data from geosynchronous orbit and may be an adiabatic response.
Best Practices in the Evaluation of Large-scale STEM-focused Events: A Review of Recent Literature
NASA Astrophysics Data System (ADS)
Shebby, S.; Cobb, W. H.; Buxner, S.; Shipp, S. S.
2015-12-01
Each year, the National Aeronautics and Space Administration (NASA) sponsors a variety of educational events to share information with educators, students, and the general public. Intended outcomes of these events include increased interest in and awareness of the mission and goals of NASA. Events range in size from relatively small family science nights at a local school to large-scale mission and celestial event celebrations involving thousands of members of the general public. To support community members in designing event evaluations, the Science Mission Directorate (SMD) Planetary Science Forum sponsored the creation of a Best Practices Guide. The guide was generated by reviewing published large-scale event evaluation reports; however, the best practices described within are pertinent for all event organizers and evaluators regardless of event size. Each source included in the guide identified numerous challenges to conducting their event evaluation. These included difficulty in identifying extant instruments or items, collecting representative data, and disaggregating data to inform different evaluation questions. Overall, the guide demonstrates that evaluations of the large-scale events are generally done at a very basic level, with the types of data collected limited to observable demographic information and participant reactions collected via online survey. In addition to these findings, this presentation will describe evaluation best practices that will help practitioners move beyond these basic indicators and examine how to make the evaluation process an integral—and valuable—element of event planning, ultimately informing event outcomes and impacts. 
It will provide detailed information on five recommendations presented in the guide: 1) consider evaluation methodology, including data analysis, in advance; 2) design data collection instruments well in advance of the event; 3) collect data at different times and from multiple sources; 4) use technology to make the job easier; and 5) be aware of how challenging it is to measure impact.
Pan-European climate at convection-permitting scale: a model intercomparison study
NASA Astrophysics Data System (ADS)
Berthou, Ségolène; Kendon, Elizabeth J.; Chan, Steven C.; Ban, Nikolina; Leutwyler, David; Schär, Christoph; Fosser, Giorgia
2018-03-01
We investigate the effect of using convection-permitting models (CPMs) spanning a pan-European domain on the representation of the precipitation distribution at a climatic scale. In particular we compare two 2.2 km models with two 12 km models run by ETH Zürich (ETH-12 km and ETH-2.2 km) and the Met Office (UKMO-12 km and UKMO-2.2 km). The two CPMs yield qualitatively similar differences to the precipitation climatology compared to the 12 km models, despite using different dynamical cores and different parameterization packages. A quantitative analysis confirms that the CPMs give the largest differences compared to 12 km models in the hourly precipitation distribution in regions and seasons where convection is a key process: in summer across the whole of Europe, and in autumn over the Mediterranean Sea and coasts. Mean precipitation is increased over high orography, with an increased amplitude of the diurnal cycle. We highlight that both CPMs show an increased number of moderate-to-intense short-lasting events and a decreased number of longer-lasting low-intensity events everywhere, correcting (and often over-correcting) biases in the 12 km models. The overall hourly distribution and the intensity of the most intense events are improved in Switzerland and, to a lesser extent, in the UK, but deteriorate in Germany. The timing of the peak in the diurnal cycle of precipitation is improved. At the daily time-scale, differences in the precipitation distribution are less clear, but the greater Alpine region stands out with the largest differences. Also, Mediterranean autumnal intense events are better represented at the daily time-scale in both 2.2 km models, due to improved representation of mesoscale processes.
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.
Lilly, Jonathan M
2017-04-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
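The transform step at the heart of the method can be sketched in a few lines: a generalized Morse wavelet defined in the frequency domain, applied by FFT multiplication, with a time-localized event appearing as a maximum on the time/scale plane. This is only a sketch of that one step (none of the noise-significance or region-of-influence machinery), and the parameter choice beta = gamma = 3 is illustrative:

```python
import numpy as np

def morse_freq(om, beta=3.0, gamma=3.0):
    """Generalized Morse wavelet in the frequency domain: shape
    om**beta * exp(-om**gamma) for om > 0, zero otherwise (analytic)."""
    out = np.zeros_like(om)
    pos = om > 0
    out[pos] = om[pos] ** beta * np.exp(-om[pos] ** gamma)
    return out / out.max()

n, dt = 2048, 1.0
t = np.arange(n) * dt
# One isolated, time-localized "event": an oscillation (period 32 samples)
# under a Gaussian envelope centred at t = 800
x = np.cos(2 * np.pi * (t - 800) / 32) * np.exp(-((t - 800) / 60.0) ** 2)

om = 2 * np.pi * np.fft.fftfreq(n, dt)
X = np.fft.fft(x)
scales = np.geomspace(2.0, 40.0, 60)

# Analytic wavelet transform: one FFT multiplication per scale; the event
# appears as a single maximum on the time/scale plane
W = np.abs(np.array([np.fft.ifft(X * morse_freq(s * om)) for s in scales]))
i_scale, i_time = np.unravel_index(np.argmax(W), W.shape)
```

With beta = gamma = 3 the wavelet's peak frequency is 1, so the maximum should fall near t = 800 and near scale = period / (2*pi) ~ 5.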
Dautovich, Natalie D; Dzierzewski, Joseph M; Gum, Amber M
2014-11-01
The present study investigated the temporal association between life event stressors relevant to older adults and depressive symptoms using a micro-longitudinal design (i.e., monthly increments over a six-month period). Existing research on stress and depressive symptoms has not examined this association over shorter time periods (e.g., monthly), over multiple time increments, or within persons. An in-person initial interview was followed by six monthly interviews conducted by telephone. Data were drawn from a study of 144 community-dwelling older adults with depressive symptoms. Stressful life events were measured using the Geriatric Life Events Scale (GALES), and depressive symptoms were assessed with the Short Geriatric Depression Scale (S-GDS). Using multilevel modeling, 31% of the overall variance in the S-GDS and 39% in the GALES was due to within-person variability. Females and persons with worse health reported more depressive symptoms. Stressful life events predicted concurrent depressive symptoms, but not depressive symptoms one month later. The lack of a time-lagged relationship suggests that older adults with depressive symptoms may recover more quickly from life stressors than previously thought, although additional research using varying time frames is needed to pinpoint the timing of this recovery as well as to identify older adults at risk of long-term effects of life stressors. Copyright © 2014 American Association for Geriatric Psychiatry. All rights reserved.
Talbot, Karley-Dale S; Kerns, Kimberly A
2014-11-01
The current study examined prospective memory (PM, both time-based and event-based) and time estimation (TR, a time reproduction task) in children with and without attention deficit hyperactivity disorder (ADHD). This study also investigated the influence of task performance and TR on time-based PM in children with ADHD relative to controls. A sample of 69 children, aged 8 to 13 years, completed the CyberCruiser-II time-based PM task, a TR task, and the Super Little Fisherman event-based PM task. PM performance was compared with children's TR abilities, parental reports of daily prospective memory disturbances (Prospective and Retrospective Memory Questionnaire for Children, PRMQC), and ADHD symptomatology (Conners' rating scales). Children with ADHD scored more poorly on event-based PM, time-based PM, and TR; interestingly, TR did not appear related to performance on time-based PM. In addition, it was found that PRMQC scores and ADHD symptom severity were related to performance on the time-based PM task but not to performance on the event-based PM task. These results provide some limited support for theories that propose a distinction between event-based PM and time-based PM. Copyright © 2014 Elsevier Inc. All rights reserved.
Global Scale Solar Disturbances
NASA Astrophysics Data System (ADS)
Title, A. M.; Schrijver, C. J.; DeRosa, M. L.
2013-12-01
The combination of the STEREO and SDO missions has allowed, for the first time, imaging of the entire Sun. This, coupled with the high cadence, broad thermal coverage, and large dynamic range of the Atmospheric Imaging Assembly on SDO, has allowed the discovery of impulsive solar disturbances that can significantly affect a hemisphere or more of the solar volume. Such events are often, but not always, associated with M- and X-class flares. GOES C- and even B-class flares are also associated with these large-scale disturbances. Key to the recognition of the large-scale disturbances was the creation of log-difference movies: by taking the log of images before differencing, events in the corona become much more evident. Because such events cover such a large portion of the solar volume, their passage can affect the dynamics of the entire corona as it adjusts to and recovers from their passage. In some cases this may lead to another flare or filament ejection, but in general direct causal evidence of 'sympathetic' behavior is lacking. However, evidence is accumulating that these large-scale events create an environment that encourages other solar instabilities to occur. Understanding the source of these events and how the energy that drives them is built up, stored, and suddenly released is critical to understanding the origins of space weather. Example events and comments on their relevance will be presented.
NASA Astrophysics Data System (ADS)
Phillips, M.; Denning, A. S.; Randall, D. A.; Branson, M.
2016-12-01
Multi-scale models of the atmosphere provide an opportunity to investigate processes that are unresolved by traditional Global Climate Models while at the same time remaining computationally viable for climate-length time scales. The multi-scale modeling framework (MMF) represents a shift away from the large horizontal grid spacing in traditional GCMs, which leads to overabundant light precipitation and a lack of heavy events, toward a model where precipitation intensity is allowed to vary over a much wider range of values. Resolving atmospheric motions on the scale of 4 km makes it possible to recover features of precipitation, such as intense downpours, that were previously only obtained by computationally expensive regional simulations. These heavy precipitation events may have little impact on large-scale moisture and energy budgets, but are outstanding in terms of interaction with the land surface and potential impact on human life. Three versions of the Community Earth System Model were used in this study: the standard CESM; the multi-scale 'Super-Parameterized' CESM (SP-CESM), where large-scale parameterizations have been replaced with a 2D cloud-permitting model; and a multi-instance land version of the SP-CESM, where each column of the 2D CRM is allowed to interact with an individual land unit. These simulations were carried out using prescribed sea surface temperatures for the period 1979-2006, with daily precipitation saved for all 28 years. Comparisons of the statistical properties of precipitation between model architectures and against observations from rain gauges were made, with specific focus on the detection and evaluation of extreme precipitation events.
Analytically Solvable Model of Spreading Dynamics with Non-Poissonian Processes
NASA Astrophysics Data System (ADS)
Jo, Hang-Hyun; Perotti, Juan I.; Kaski, Kimmo; Kertész, János
2014-01-01
Non-Poissonian bursty processes are ubiquitous in natural and social phenomena, yet little is known about their effects on the large-scale spreading dynamics. In order to characterize these effects, we devise an analytically solvable model of susceptible-infected spreading dynamics in infinite systems for arbitrary inter-event time distributions and for the whole time range. Our model is stationary from the beginning, and the role of the lower bound of inter-event times is explicitly considered. The exact solution shows that for early and intermediate times, the burstiness accelerates the spreading as compared to a Poisson-like process with the same mean and same lower bound of inter-event times. Such behavior is opposite for late-time dynamics in finite systems, where the power-law distribution of inter-event times results in a slower and algebraic convergence to a fully infected state in contrast to the exponential decay of the Poisson-like process. We also provide an intuitive argument for the exponent characterizing algebraic convergence.
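The model's key ingredients — an arbitrary inter-event time distribution with an explicit lower bound, compared against a Poisson-like process with the same mean and same lower bound — are easy to set up numerically. This is a sketch of the sampling only (not the paper's analytic solution), with illustrative parameter values; it shows the burstiness of the power law as an excess of very short inter-event times despite equal means:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
t0, mean = 1.0, 3.0   # common lower bound and common mean (illustrative)

# Power-law inter-event times: Pareto with lower bound t0,
# pdf ~ alpha * t0**alpha / tau**(alpha + 1), mean = alpha*t0/(alpha - 1)
alpha = mean / (mean - t0)                        # = 1.5 for these numbers
tau_bursty = t0 * (1.0 - rng.random(n)) ** (-1.0 / alpha)

# Poisson-like reference: shifted exponential, same mean and lower bound
tau_poisson = t0 + rng.exponential(mean - t0, size=n)

# Burstiness: the power law produces far more very short inter-event
# times (and a heavy tail), even though both means are equal
frac_short_bursty = (tau_bursty < 1.2).mean()
frac_short_poisson = (tau_poisson < 1.2).mean()
```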
Frequency-dependent moment release of very low frequency earthquakes in the Cascadia subduction zone
NASA Astrophysics Data System (ADS)
Takeo, A.; Houston, H.
2014-12-01
Episodic tremor and slip (ETS) has been observed in the Cascadia subduction zone at two different time scales: tremor in a high-frequency range of 2-8 Hz and slow slip events on a geodetic time scale of days to months. The intermediate time scale is needed to understand the source spectrum of slow earthquakes. Ghosh et al. (2014, IRIS abs) recently reported the presence of very low frequency earthquakes (VLFEs) in Cascadia. In southwest Japan, VLFEs are usually observed in a period range of around 20-50 s, and coincide with tremors (e.g., Ito et al. 2007). In this study, we analyzed VLFEs in and around the Olympic Peninsula to confirm their presence and estimate their moment release. We first detected VLFE events by using broadband seismograms with a band-pass filter of 20-50 s. The preliminary result shows that there are at least 16 VLFE events with moment magnitudes of 3.2-3.7 during the M6.8 2010 ETS. The focal mechanisms are consistent with thrust earthquakes at the subducting plate interface. To detect signals of VLFEs below the noise level, we further stacked long-period waveforms at the peak timings of tremor amplitudes for tremors within a 10-15 km radius, using tremor catalogs from 2006-2010, and estimated the focal mechanisms for each tremor source region as done in southwest Japan (Takeo et al. 2010 GRL). As a result, VLFEs could be detected for almost the entire tremor source region at a period range of 20-50 s, with average moment magnitudes in each 5-min tremor window of 2.4-2.8. Although the region is limited, we could also detect VLFEs at a period range of 50-100 s, with average moment magnitudes of 3.0-3.2. The moment release at 50-100 s is 4-8 times larger than that at 20-50 s, roughly consistent with an omega-squared spectral model. Further study including tremor, slow slip events, and characteristic activities such as rapid tremor reversal and tremor streaks will reveal the source spectrum of slow earthquakes over a broader range of time scales, from 0.1 s to days.
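The quoted 4-8x moment factor is consistent with the standard Hanks-Kanamori magnitude-moment relation applied to the quoted magnitudes. A quick check (the pairing against the Mw 2.6 mid-point of the 2.4-2.8 range is an illustrative choice, not the authors' calculation):

```python
def moment_from_mw(mw):
    """Hanks-Kanamori: Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m,
    so M0 = 10**(1.5 * Mw + 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

# Average per-window magnitudes quoted above: Mw 2.4-2.8 (20-50 s band)
# versus Mw 3.0-3.2 (50-100 s band); compare against the 2.6 mid-point
ratio_low = moment_from_mw(3.0) / moment_from_mw(2.6)    # 10**0.6 ~ 4
ratio_high = moment_from_mw(3.2) / moment_from_mw(2.6)   # 10**0.9 ~ 8
```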
NASA Astrophysics Data System (ADS)
Flohr, Pascal; Fleitmann, Dominik; Matthews, Roger; Matthews, Wendy; Black, Stuart
2016-03-01
Climate change is often cited as a major factor in social change. The so-called 8.2 ka event was one of the most pronounced and abrupt Holocene cold and arid events. The 9.2 ka event was similar, albeit of smaller magnitude. Both events affected the Northern Hemisphere climate and caused cooling and aridification in Southwest Asia. Yet, the impacts of the 8.2 and 9.2 ka events on early farming communities in this region are not well understood. Current hypotheses for an effect of the 8.2 ka event vary from large-scale site abandonment and migration (including the Neolithisation of Europe) to continuation of occupation and local adaptation, while impacts of the 9.2 ka event have not previously been systematically studied. In this paper, we present a thorough assessment of available, quality-checked radiocarbon (14C) dates for sites from Southwest Asia covering the time interval between 9500 and 7500 cal BP, which we interpret in combination with archaeological evidence. In this way, the synchronicity between changes observed in the archaeological record and the rapid climate events is tested. It is shown that there is no evidence for a simultaneous and widespread collapse, large-scale site abandonment, or migration at the time of the events. However, there are indications for local adaptation. We conclude that early farming communities were resilient to the abrupt, severe climate changes at 9250 and 8200 cal BP.
NASA Astrophysics Data System (ADS)
Bunde, Armin; Eichner, Jan F.; Kantelhardt, Jan W.; Havlin, Shlomo
2005-01-01
We study the statistics of the return intervals between extreme events above a certain threshold in long-term persistent records. We find that the long-term memory leads (i) to a stretched exponential distribution of the return intervals, (ii) to a pronounced clustering of extreme events, and (iii) to an anomalous behavior of the mean residual time to the next event that depends on the history and increases with the elapsed time in a counterintuitive way. We present an analytical scaling approach and demonstrate that all these features can be seen in long climate records. The phenomena should also occur in heartbeat records, Internet traffic, and stock market volatility and have to be taken into account for an efficient risk evaluation.
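The clustering effect described above can be reproduced in a small numerical sketch: generate a long-term persistent record by Fourier filtering (imposing a power-law spectrum), extract the return intervals between threshold exceedances, and compare their spread with a shuffled (memoryless) version of the same data. The spectral exponent and threshold quantile below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2 ** 16

# Fourier-filtering method: impose a power-law spectrum S(f) ~ f^(-beta)
# to create a long-term persistent (long-range correlated) record.
beta = 0.8                                  # 0 < beta < 1 -> persistent
freqs = np.fft.rfftfreq(n)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-beta / 2.0)
spectrum = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, freqs.size))
x = np.fft.irfft(spectrum, n)
x = (x - x.mean()) / x.std()

def return_intervals(series, q=0.95):
    """Intervals between successive exceedances of the q-quantile."""
    idx = np.flatnonzero(series > np.quantile(series, q))
    return np.diff(idx)

r_corr = return_intervals(x)
r_shuf = return_intervals(rng.permutation(x))   # shuffling destroys memory

# Long-term memory broadens the interval distribution (clustering of
# extremes): the coefficient of variation exceeds the uncorrelated case.
cv = lambda r: r.std() / r.mean()
print(cv(r_corr), cv(r_shuf))
```

The mean return interval is the same in both cases (it is fixed by the threshold), but the persistent record mixes many short intervals (clusters) with occasional very long ones, which is the fingerprint of the stretched exponential distribution.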
Liu, Changxin; Gao, Jian; Li, Huiping; Xu, Demin
2018-05-01
The event-triggered control is a promising solution for cyber-physical systems, such as networked control systems, multiagent systems, and large-scale intelligent systems. In this paper, we propose an event-triggered model predictive control (MPC) scheme for constrained continuous-time nonlinear systems with bounded disturbances. First, a time-varying tightened state constraint is computed to achieve robust constraint satisfaction, and an event-triggered scheduling strategy is designed in the framework of dual-mode MPC. Second, sufficient conditions for ensuring feasibility and closed-loop robust stability are developed. We show that robust stability can be ensured and the communication load reduced with the proposed MPC algorithm. Finally, numerical simulations and comparison studies are performed to verify the theoretical results.
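The core idea of event-triggered control, transmitting only when a measurement error exceeds a state-dependent threshold, can be shown in a minimal scalar sketch. This is not the paper's MPC scheme; the plant, gain, and triggering threshold below are all illustrative assumptions.

```python
# Minimal sketch of event-triggered feedback: the controller holds the
# last transmitted state x_hat, and a new transmission is triggered only
# when the measurement error exceeds a state-dependent threshold.
a, k = 1.1, 0.6                 # open-loop unstable plant; stabilizing gain
x, x_hat = 5.0, 5.0
transmissions, history = 1, []
for _ in range(100):
    if abs(x - x_hat) > 0.2 * abs(x) + 0.01:   # event-triggering condition
        x_hat = x                              # transmit current state
        transmissions += 1
    x = a * x - k * x_hat                      # closed-loop update
    history.append(x)

# The state converges to a small neighborhood of the origin while
# communicating far less often than a periodic (every-step) scheme.
print(abs(history[-1]), transmissions)
```

The small absolute term in the threshold prevents chattering near the origin; the price is convergence to a small neighborhood rather than exactly to zero, which mirrors the robust (practical) stability guarantee of event-triggered schemes.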
NASA Astrophysics Data System (ADS)
Barros, G. P.; Marques, W. C.
2013-05-01
The aim of this study is to investigate the influence and importance of ENSO events in controlling the freshwater discharge pattern at Patos Lagoon at timescales longer than one year. The study used freshwater discharge, water level, and Southern Oscillation Index (SOI) data sets. The SOI gives an indication of the development and intensity of El Niño or La Niña events in the Pacific Ocean: sustained SOI values below -8 often indicate El Niño episodes, while sustained values above +8 are typical of La Niña episodes. The cross wavelet technique is applied to examine the coherence and phase between the interannual time series (SOI, freshwater discharge, and water levels). Over synoptic time scales, wind action is the most effective forcing of Patos Lagoon's circulation. However, at longer time scales (over one year), freshwater discharge becomes the most important forcing, controlling the water levels, circulation, and other processes. At these longer time scales, South America is affected by ENSO. El Niño is the Southern Oscillation phase in which the trade winds are weak and the pressure is low over the eastern tropical Pacific and high on the west side. The south of Brazil shows precipitation anomalies associated with ENSO occurrence. The most significant ENSO events show high temporal variability, occurring at near-biennial scales (1.5-3 years) or at lower frequencies (3-7 years). The freshwater discharge of the main tributaries and the water levels in Patos Lagoon are influenced by ENSO at interannual scales (cycles between 3.8 and 6 years). El Niño events are associated with high mean freshwater discharge and water levels above the mean; La Niña events, with low mean discharge and water levels below the mean.
These results are consistent with the SOI analysis and agree with previous results obtained by other authors in this region of South America. The cross wavelet analysis between the freshwater discharge and SOI time series indicates the dominant period of the ENSO cycles that control the discharge: between 1950 and 1965 the dominant period was 4-6 years, while from 1970 to 2000 it was below 4 years, indicating a change in the pattern of ENSO influence on the region. Further studies of the catchment characteristics (area, length, topography, vegetation, etc.) would be valuable for identifying the delay between an ENSO event, its associated precipitation anomaly, and the consequent increase in freshwater discharge, producing information that could support coastal management and flood prediction.
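The cross wavelet step above can be sketched with a bare-bones Morlet transform in numpy: two noisy series sharing a common oscillation produce a time-averaged cross-wavelet power that peaks at the shared period. The synthetic series stand in for the SOI and discharge records, and dedicated packages (e.g., pycwt) would normally be used for significance testing and the cone of influence.

```python
import numpy as np

def morlet_cwt(x, scales, omega0=6.0):
    """Morlet continuous wavelet transform via the FFT (a minimal sketch)."""
    n = x.size
    x_hat = np.fft.fft(x)
    w = 2.0 * np.pi * np.fft.fftfreq(n)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Torrence & Compo-style normalized Morlet in the Fourier domain
        psi_hat = ((np.pi ** -0.25) * np.sqrt(2.0 * np.pi * s)
                   * np.exp(-0.5 * (s * w - omega0) ** 2) * (w > 0))
        out[i] = np.fft.ifft(x_hat * psi_hat)
    return out

rng = np.random.default_rng(5)
n, period = 1024, 64.0                 # shared "ENSO-like" cycle (illustrative)
t = np.arange(n)
common = np.sin(2 * np.pi * t / period)
soi = common + 0.5 * rng.normal(size=n)            # SOI-like series
discharge = common + 0.5 * rng.normal(size=n)      # discharge-like series

scales = np.arange(8, 129, 4, dtype=float)
xw = morlet_cwt(soi, scales) * np.conj(morlet_cwt(discharge, scales))
power = np.abs(xw).mean(axis=1)        # time-averaged cross-wavelet power

# For omega0 = 6 the Fourier period is ~1.03 * scale, so the cross power
# should peak near scale ~ period / 1.03.
print(scales[np.argmax(power)])
```

The phase of `xw` at the peak scale gives the lead/lag between the two series, which is how the delay between SOI and discharge would be read off in the study's analysis.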
Efficient hemodynamic event detection utilizing relational databases and wavelet analysis
NASA Technical Reports Server (NTRS)
Saeed, M.; Mark, R. G.
2001-01-01
Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
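The descriptor idea behind the paper, storing a handful of wavelet coefficients instead of the raw trend, can be sketched with a plain Haar transform: a coarse-level detail coefficient spanning an abrupt change is far larger than its neighbors, giving a cheap, indexable event flag. The signal, levels, and thresholds below are illustrative; the paper's MySQL storage layer is not reproduced here.

```python
import numpy as np

def haar_level(x):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    x = x[: x.size // 2 * 2].reshape(-1, 2)
    s = np.sqrt(2.0)
    return (x[:, 0] + x[:, 1]) / s, (x[:, 0] - x[:, 1]) / s

def haar_descriptors(x, levels=3):
    """Coarse Haar coefficients as a compact descriptor of a trend."""
    details = []
    for _ in range(levels):
        x, d = haar_level(x)
        details.append(d)
    return x, details

# Synthetic mean-arterial-pressure trend with an abrupt hypotensive drop.
trend = np.concatenate([np.full(60, 90.0), np.full(68, 60.0)])
trend += np.random.default_rng(2).normal(0, 1.0, trend.size)

approx, details = haar_descriptors(trend, levels=3)
# The level-3 detail coefficient spanning the step dwarfs the rest,
# so a query need only scan 16 stored coefficients, not 128 samples.
d3 = np.abs(details[-1])
print(np.argmax(d3), d3.max())
```

A query for "hemodynamic drop" then reduces to a range predicate on stored coefficient magnitudes, which is why the paper's queries avoid expensive table joins over raw samples.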
Li, Huaqing; Chen, Guo; Huang, Tingwen; Dong, Zhaoyang; Zhu, Wei; Gao, Lan
2016-12-01
In this paper, we consider the event-triggered distributed average-consensus of discrete-time first-order multiagent systems with limited communication data rate and general directed network topology. In the framework of a digital communication network, each agent has a real-valued state but can only exchange a finite-bit binary symbolic data sequence with neighboring agents at each time step, due to the digital communication channels with energy constraints. Novel event-triggered dynamic encoders and decoders for each agent are designed, based on which a distributed control algorithm is proposed. A scheme that selects the number of channel quantization levels (number of bits) at each time step is developed, under which none of the quantizers in the network is ever saturated. The convergence rate of consensus is explicitly characterized and is related to the scale of the network, the maximum degree of nodes, the network structure, the scaling function, the quantization interval, the initial states of agents, the control gain, and the event gain. It is also found that under the designed event-triggered protocol, by selecting suitable parameters, for any directed digital network containing a spanning tree, distributed average consensus can always be achieved with an exponential convergence rate based on merely one bit of information exchange between each pair of adjacent agents at each time step. Two simulation examples are provided to illustrate the feasibility of the presented protocol and the correctness of the theoretical results.
A High-Resolution View of Global Seismicity
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2014-12-01
We present high-precision earthquake relocation results from our global-scale re-analysis of the combined seismic archives of parametric data for the years 1964 to present from the International Seismological Centre (ISC), the USGS's Earthquake Data Report (EDR), and selected waveform data from IRIS. We employed iterative, multistep relocation procedures that initially correct for large location errors present in standard global earthquake catalogs, followed by a simultaneous inversion of delay times formed from regional and teleseismic arrival times of first and later arriving phases. An efficient multi-scale double-difference (DD) algorithm is used to solve for relative event locations to the precision of a few km or less, while incorporating information on absolute hypocenter locations from catalogs such as EHB and GEM. We run the computations on both a 40-core cluster geared towards HTC problems (data processing) and a 500-core HPC cluster for data inversion. Currently, we are incorporating waveform correlation delay time measurements available for events in selected regions, but are continuously building up a comprehensive, global correlation database for densely distributed events recorded at stations with a long history of high-quality waveforms. The current global DD catalog includes nearly one million earthquakes, equivalent to approximately 70% of the number of events in the ISC/EDR catalogs initially selected for relocation. The relocations sharpen the view of seismicity in most active regions around the world, in particular along subduction zones where event density is high, but also along mid-ocean ridges where existing hypocenters are especially poorly located. The new data offers the opportunity to investigate earthquake processes and fault structures along entire plate boundaries at the ~km scale, and provides a common framework that facilitates analysis and comparisons of findings across different plate boundary systems.
Ramanathan, Arvind; Savol, Andrej J; Agarwal, Pratul K; Chennubhotla, Chakra S
2012-11-01
Biomolecular simulations at millisecond and longer time-scales can provide vital insights into functional mechanisms. Because post-simulation analyses of such large trajectory datasets can be a limiting factor in obtaining biological insights, there is an emerging need to identify key dynamical events and relate these events to biological function online, that is, as simulations are progressing. Recently, we introduced a novel computational technique, quasi-anharmonic analysis (QAA) (Ramanathan et al., PLoS One 2011;6:e15827), for partitioning the conformational landscape into a hierarchy of functionally relevant sub-states. The unique capabilities of QAA are enabled by exploiting anharmonicity in the form of fourth-order statistics for characterizing atomic fluctuations. In this article, we extend QAA for analyzing long time-scale simulations online. In particular, we present HOST4MD--a higher-order statistical toolbox for molecular dynamics simulations, which (1) identifies key dynamical events as simulations are in progress, (2) explores potential sub-states, and (3) identifies conformational transitions that enable the protein to access those sub-states. We demonstrate HOST4MD on microsecond timescale simulations of the enzyme adenylate kinase in its apo state. HOST4MD identifies several conformational events in these simulations, revealing how the intrinsic coupling between the three subdomains (LID, CORE, and NMP) changes during the simulations. Further, it also identifies an inherent asymmetry in the opening/closing of the two binding sites. We anticipate that HOST4MD will provide a powerful and extensible framework for detecting biophysically relevant conformational coordinates from long time-scale simulations. Copyright © 2012 Wiley Periodicals, Inc.
Dissolved oxygen transfer to sediments by sweep and eject motions in aquatic environments
O'Connor, B.L.; Hondzo, Miki
2008-01-01
Dissolved oxygen (DO) concentrations were quantified near the sediment-water interface to evaluate DO transfer to sediments in a laboratory recirculating flume and open channel under varying fluid-flow conditions. DO concentration fluctuations were observed within the diffusive sublayer, as defined by the time-averaged DO concentration gradient near the sediment-water interface. Evaluation of the DO concentration fluctuations along with detailed fluid-flow characterizations were used to quantify quasi-periodic sweep and eject motions (bursting events) near the sediments. Bursting events dominated the Reynolds shear stresses responsible for momentum and mass fluctuations near the sediment bed. Two independent methods for detecting bursting events using DO concentration and velocity data produced consistent results. The average time between bursting events was scaled with wall variables and was incorporated into a similarity model to describe the dimensionless mass transfer coefficient (Sherwood number, Sh) in terms of the Reynolds number, Re, and Schmidt number, Sc, which described transport in the flow. The scaling of bursting events was employed with the similarity model to quantify DO transfer to sediments and results showed a high degree of agreement with experimental data. © 2008, by the American Society of Limnology and Oceanography, Inc.
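The sweep/eject classification above is conventionally done by quadrant analysis of the velocity fluctuations: ejections are Q2 motions (u' < 0, w' > 0) and sweeps are Q4 (u' > 0, w' < 0). A small synthetic sketch, with an assumed correlation coefficient standing in for real boundary-layer statistics, shows why these two quadrants dominate the Reynolds stress.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
# Synthetic turbulent fluctuations: streamwise u' and vertical w' with the
# negative correlation typical of a boundary layer (rho is an assumption).
rho = -0.5
cov = [[1.0, rho], [rho, 1.0]]
u, w = rng.multivariate_normal([0.0, 0.0], cov, n).T

uw = u * w
total_stress = -uw.mean()                    # -<u'w'>, the Reynolds stress

eject = (u < 0) & (w > 0)                    # Q2: ejections
sweep = (u > 0) & (w < 0)                    # Q4: sweeps
burst_stress = -uw[eject | sweep].sum() / n  # contribution of bursting events

# Q1/Q3 motions counteract the stress, so sweeps + ejections together
# carry more than 100% of the net -<u'w'>.
print(burst_stress / total_stress)
```

In the paper's setting the same partitioning is applied jointly to velocity and DO fluctuation records, and the inter-burst times are then scaled with wall variables to enter the Sh(Re, Sc) similarity model.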
Extreme climatic events constrain space use and survival of a ground-nesting bird.
Tanner, Evan P; Elmore, R Dwayne; Fuhlendorf, Samuel D; Davis, Craig A; Dahlgren, David K; Orange, Jeremy P
2017-05-01
Two fundamental issues in ecology are understanding what influences the distribution and abundance of organisms through space and time. While it is well established that broad-scale patterns of abiotic and biotic conditions affect organisms' distributions and population fluctuations, discrete events may be important drivers of space use, survival, and persistence. These discrete extreme climatic events can constrain populations and space use at fine scales beyond that which is typically measured in ecological studies. Recently, a growing body of literature has identified thermal stress as a potential mechanism determining space use and survival. We sought to determine how ambient temperature at fine temporal scales affected survival and space use for a ground-nesting quail species (Colinus virginianus; northern bobwhite). We modeled space use across an ambient temperature gradient (ranging from -20 to 38 °C) using a maxent algorithm. We also used Andersen-Gill proportional hazard models to assess the influence of ambient temperature-related variables on survival through time. Estimated available useable space ranged from 18.6% to 57.1% of the landscape depending on ambient temperature. The lowest and highest ambient temperature categories (<-15 °C and >35 °C, respectively) were associated with the least amount of estimated useable space (18.6% and 24.6%, respectively). Range overlap analysis indicated dissimilarity in areas to which Colinus virginianus were restricted during times of thermal extremes (range overlap = 0.38). This suggests that habitat under a given condition is not necessarily habitat under alternative conditions. Further, we found survival was most influenced by weekly minimum ambient temperatures.
Our results demonstrate that ecological constraints can occur along a thermal gradient and that understanding the effects of these discrete events and how they change over time may be more important to conservation of organisms than are average and broad-scale conditions as typically measured in ecological studies. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
The reasons for Chinese nursing staff to report adverse events: a questionnaire survey.
Hong, Su; Li, QiuJie
2017-04-01
To investigate the impact of nurses' perception of patient safety culture and adverse event reporting, and demographic factors, on adverse event reporting in Chinese hospitals. Accurate and timely adverse event reporting is integral to promoting patient safety and professional learning around the incident. In a cross-sectional survey, a sample of 919 nurses completed a structured questionnaire composed of two validated instruments measuring nurses' perception of patient safety culture and adverse event reporting. Associations between the variables were examined using multiple linear regression analysis. The positive response rates of the five dimensions of the Patient Safety Culture Assessment Scale varied from 47.55% to 80.62%. The accuracy rate of the Adverse Event Reporting Perception Scale was 63.16%. Five hundred and thirty-one (58.03%) nurses did not report an adverse event in the past 12 months. Six variables were found to be associated with nurses' adverse event reporting: total work experience (P = 0.003), overall patient safety culture score (P < 0.001), safety climate (P < 0.001), teamwork climate (P < 0.001), overall Adverse Event Reporting Perception Scale score (P = 0.003), and importance of reporting (P = 0.002). The results confirmed that improvements in the patient safety culture and nurses' perception of adverse event reporting were related to an increase in voluntary adverse event reporting. The knowledge of adverse event reporting should be integrated into the patient safety curriculum. Interventions that target a specific domain are necessary to improve the safety culture. © 2017 John Wiley & Sons Ltd.
Geomagnetic polarity epochs: age and duration of the Olduvai normal polarity event
Gromme, C.S.; Hay, R.L.
1971-01-01
New data show that the Olduvai normal geomagnetic polarity event is represented in Olduvai Gorge, Tanzania, by rocks covering a time span of roughly 0.1 to 0.2 my and is no older than 2.0 my. Hence the long normal polarity event of this age that is seen in deep-sea sediment cores and in magnetic profiles over oceanic ridges should be called the Olduvai event. The lava from which the Gilsá event was defined may have been erupted during the Olduvai event and, if so, the term Gilsá should now be abandoned. Many dated lavas that were originally assigned to the Olduvai event represent one or two much shorter normal polarity events that preceded the Olduvai event; these are herein named the Réunion normal polarity events. This revision brings the geomagnetic reversal time scale into conformity with the one implied by assumptions of uniform sedimentation rates on the ocean floor and uniform rates of sea-floor spreading. © 1971.
NASA Astrophysics Data System (ADS)
Sebestyen, S. D.; Shanley, J. B.; Boyer, E. W.; Kendall, C.
2004-12-01
Our ability to assess how stream nutrient concentrations respond to biogeochemical transformations and stream flow dynamics is often limited by datasets that do not include all flow conditions that occur over event, monthly, seasonal, and yearly time scales. At the Sleepers River Research Watershed in northeastern Vermont, USA, nitrate, DOC (dissolved organic carbon), and major ion concentrations were measured on samples collected over a wide range of flow conditions from summer 2002 through summer 2004. Nutrient flushing occurred at the W-9 catchment, and high-frequency sampling revealed critical insights into seasonal and event-scale controls on nutrient concentrations. In this seasonally snow-covered catchment, the earliest stage of snowmelt introduced nitrogen directly to the stream from the snowpack. As snowmelt progressed, the source of stream nitrate shifted to flushing of soil nitrate along shallow subsurface flow paths. In the growing season, nitrogen flushing to streams varied with antecedent moisture conditions. More nitrogen was available to flush to streams when antecedent moisture was lowest, and mobile nitrogen stores in the landscape regenerated under baseflow conditions on time scales as short as 7 days. Leaf fall was another critical time when coupled hydrological and biogeochemical processes controlled nutrient fluxes. With the input of labile organic carbon from freshly decomposing leaves, nitrate concentrations declined sharply in response to in-stream immobilization or denitrification. These high-resolution hydrochemical data from multiple flow regimes are identifying "hot spots" and "hot moments" of biogeochemical and hydrological processes that control nutrient fluxes in streams.
DOT National Transportation Integrated Search
2015-09-23
This research project aimed to develop a remote sensing system capable of rapidly identifying fine-scale damage to critical transportation infrastructure following hazard events. Such a system must be pre-planned for rapid deployment, automate proces...
Extreme reaction times determine fluctuation scaling in human color vision
NASA Astrophysics Data System (ADS)
Medina, José M.; Díaz, José A.
2016-11-01
In modern mental chronometry, human reaction time defines the time elapsed from stimulus presentation until a response occurs and represents a reference paradigm for investigating stochastic latency mechanisms in color vision. Here we examine the statistical properties of extreme reaction times and whether they support fluctuation scaling in the skewness-kurtosis plane. Reaction times were measured for visual stimuli across the cardinal directions of the color space. For all subjects, the results show that very large reaction times deviate from the right tail of reaction time distributions, suggesting the existence of dragon-king events. The results also indicate that extreme reaction times are correlated and shape fluctuation scaling over a wide range of stimulus conditions. The scaling exponent was higher for achromatic than isoluminant stimuli, suggesting distinct generative mechanisms. Our findings open a new perspective for studying failure modes in sensory-motor communications and in complex networks.
NASA Astrophysics Data System (ADS)
Jennings, Keith; Jones, Julia A.
2015-09-01
This study tested multiple hydrologic mechanisms to explain snowpack dynamics in extreme rain-on-snow floods, which occur widely in the temperate and polar regions. We examined 26 large, 10-day storm events over the period 1992-2012 at the H.J. Andrews Experimental Forest in western Oregon, using statistical analyses (regression, ANOVA, and wavelet coherence) of hourly snowmelt lysimeter, air and dewpoint temperature, wind speed, precipitation, and discharge data. All events involved snowpack outflow, but only seven events had continuous net snowpack outflow, including three of the five top-ranked peak discharge events. Peak discharge was not related to precipitation rate, but it was related to the 10-day sum of precipitation and net snowpack outflow, indicating an increased flood response to continuously melting snowpacks. The two largest peak discharge events in the study had significant wavelet coherence at multiple time scales over several days; a distribution of phase differences between precipitation and net snowpack outflow at the 12-32 h time scale with a sharp peak at π/2 radians; and strongly correlated snowpack outflow among lysimeters representing 42% of basin area. The recipe for an extreme rain-on-snow event includes persistent, slow melt within the snowpack, which appears to produce a near-saturated zone within the snowpack throughout the landscape, such that the snowpack may transmit pressure waves of precipitation directly to streams, and this process is synchronized across the landscape. Further work is needed to understand the internal dynamics of a melting snowpack throughout a snow-covered landscape and its contribution to extreme rain-on-snow floods.
Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event
NASA Astrophysics Data System (ADS)
Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.
2009-04-01
We analyze the possibility of reducing uncertainty in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event, within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event in a plausible experimental setting and with relatively good accuracy in the timing of the event compared to other modeling exercises [1]. The simulated centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity, described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between different diffusivity values according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain the uncertainty in the model parameters, in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the likelihood numerically alongside the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event as a highly nonlinear effect precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis reduced the uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g., the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near and medium term.
References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the cold climate event 8200 years ago by meltwater outburst from Lake Agassiz. Paleoceanography 19:PA3014 (2004). [2] T. Schneider von Deimling, H. Held, A. Ganopolski, S. Rahmstorf: Climate sensitivity estimated from ensemble simulations of glacial climates. Climate Dynamics 27, 149-163, DOI 10.1007/s00382-006-0126-8 (2006). [3] A. Lorenz, Diploma Thesis, U Potsdam (2007).
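The inverse Bayesian step described above can be illustrated with a toy grid calculation: a likelihood built from the mismatch between a modeled and an observed event duration narrows a uniform prior on a diffusivity parameter. The forward model and every number below are illustrative stand-ins, not CLIMBER-2.3 output or the study's actual values.

```python
import numpy as np

# Toy stand-in forward model: the simulated duration of the 8.2 ka cold
# event shortens as vertical ocean diffusivity kappa increases.
def event_duration(kappa):
    return 160.0 / kappa                    # years, for kappa in cm^2/s

kappa = np.linspace(0.2, 2.0, 500)          # prior range (uniform)
prior = np.ones_like(kappa) / kappa.size

obs, sigma = 160.0, 30.0                    # proxy duration +/- error, years
like = np.exp(-0.5 * ((event_duration(kappa) - obs) / sigma) ** 2)
post = prior * like
post /= post.sum()                          # Bayes' rule on the grid

mean = (kappa * post).sum()
std = np.sqrt(((kappa - mean) ** 2 * post).sum())
prior_std = (2.0 - 0.2) / np.sqrt(12)       # std of the uniform prior
print(mean, std, prior_std)
```

Because the forward model is nonlinear in kappa, the posterior is skewed rather than Gaussian, which is the same reason the study could not assume a quasi-Gaussian likelihood a priori; the posterior spread here nonetheless ends up roughly half the prior spread, mirroring the factor-2 reduction reported.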
Memory effect in M ≥ 6 earthquakes of South-North Seismic Belt, Mainland China
NASA Astrophysics Data System (ADS)
Wang, Jeen-Hwa
2013-07-01
The M ≥ 6 earthquakes that occurred in the South-North Seismic Belt, Mainland China, during 1901-2008 are used to study the possible existence of a memory effect in large earthquakes. The fluctuation analysis technique is applied to the sequences of earthquake magnitude and inter-event time represented in the natural time domain. The calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for both the magnitude and inter-event time sequences. The migration of the earthquakes under study is used to discuss possible correlations between events. Phase portraits of two consecutive magnitudes and two consecutive inter-event times are also applied to explore whether large (or small) earthquakes are followed by large (or small) events. Together with all available information, we conclude that the earthquakes under study are short-term correlated and thus a short-term memory effect would be operative.
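The fluctuation analysis used above can be sketched on synthetic sequences: the rms increment of the cumulative profile grows as a power of the window length, with exponent ~0.5 for a memoryless sequence and below 0.5 for an anticorrelated one. The synthetic series here stand in for the catalog's magnitude and inter-event time sequences; the spectral exponent is an illustrative choice.

```python
import numpy as np

def fluctuation_exponent(x, windows):
    """Fluctuation analysis: scaling of F(w) = rms increment of the profile."""
    y = np.cumsum(x - x.mean())              # profile of the sequence
    f = []
    for w in windows:
        inc = y[w:] - y[:-w]
        f.append(np.sqrt(np.mean(inc ** 2)))
    slope, _ = np.polyfit(np.log(windows), np.log(f), 1)
    return slope

rng = np.random.default_rng(4)
n = 2 ** 15
windows = np.unique(np.logspace(0.5, 3, 12).astype(int))

white = rng.normal(size=n)                   # memoryless reference sequence

# Power-law anticorrelated sequence via Fourier filtering (beta < 0),
# mimicking exponents below 0.5 as reported for the catalog sequences.
beta = -0.6
freqs = np.fft.rfftfreq(n)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-beta / 2.0)
spec = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, freqs.size))
anti = np.fft.irfft(spec, n)

h_white = fluctuation_exponent(white, windows)
h_anti = fluctuation_exponent(anti, windows)
print(h_white, h_anti)
```

For a power-law spectrum S(f) ~ f^(-beta), the expected exponent is (1 + beta)/2, so beta = -0.6 gives ~0.2, well below the 0.5 of an uncorrelated sequence, which is the signature the abstract reports.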
Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas Fault
Shelly, David R.
2010-01-01
Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between ~3 and ~6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.
Peculiarity of Seismicity in the Balakend-Zagatal Region, Azerbaijan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ismail-Zadeh, Tahir T.
2006-03-23
The study of seismicity in the Balakend-Zagatal region demonstrates a temporal correlation of small events in the region with the moderate events in the Caucasus for the time interval of 1980 to 1990. It is shown that the processes resulting in deformation and tectonic movements of the main structural elements of the Caucasus region are internal and are not related to large-scale tectonic processes. A weak dependence of the regional movements on the large-scale motion of the lithospheric plates and microplates is also apparent from other geological and geodetic data.
A Framework of Simple Event Detection in Surveillance Video
NASA Astrophysics Data System (ADS)
Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao
Video surveillance is playing an increasingly important role in social life. Real-time alerting of threatening events and searching for interesting content in large-scale stored video footage require a human operator to pay full attention to monitors for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key-point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and easily checked rules enable the framework to work in real time. Future work is also discussed.
A large scale membrane-binding protein conformational change that initiates at small length scales
NASA Astrophysics Data System (ADS)
Grandpre, Trevor; Andorf, Matthew; Chakravarthy, Srinivas; Lamb, Robert; Poor, Taylor; Landahl, Eric
2013-03-01
The fusion (F) protein of parainfluenza virus 5 (PIV5) is a membrane-bound, homotrimeric glycoprotein located on the surface of PIV5 viral envelopes. Upon being triggered by the receptor-binding protein (HN), F undergoes a greater than 100 Å ATP-independent refolding event. This refolding event results in the insertion of a hydrophobic fusion peptide into the membrane of the target cell, followed by desolvation and a subsequent fusion event as the two membranes are brought together. Isothermal calorimetry and hydrophobic dye incorporation experiments indicate that the soluble construct of the F protein undergoes a conformational rearrangement at around 55 deg C. We present the results of an initial Time-Resolved Small-Angle X-Ray Scattering (TR-SAXS) study of this large-scale, entropically driven conformational change using a temperature jump. Although the measured radius of gyration of this protein changes on a 110 second timescale, we find that the x-ray scattering intensity at higher angles (corresponding to smaller length scales in the protein) changes nearly an order of magnitude faster. We believe this may be a signature of entropically driven conformational change.
Hierarchical Engine for Large-scale Infrastructure Co-Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-04-24
HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Chun-Yaung; Perez, Danny; Voter, Arthur F., E-mail: afv@lanl.gov
Nuclear quantum effects are important for systems containing light elements, and the effects are more prominent in the low-temperature regime, where the dynamics also becomes sluggish. We show that parallel replica (ParRep) dynamics, an accelerated molecular dynamics approach for infrequent-event systems, can be effectively combined with ring-polymer molecular dynamics, a semiclassical trajectory approach that gives a good approximation to zero-point and tunneling effects in activated escape processes. The resulting RP-ParRep method is a powerful tool for reaching long time scales in complex infrequent-event systems where quantum dynamics are important. Two illustrative examples, symmetric Eckart barrier crossing and interstitial helium diffusion in Fe and Fe–Cr alloy, are presented to demonstrate the accuracy and long-time-scale capability of this approach.
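The ParRep idea underlying this method can be illustrated with a toy model: because escape from a long-lived state is approximately a Poisson process, the first-escape time observed across N independent replicas, multiplied by N, is statistically equivalent to the escape time of one long trajectory, so wall-clock time is compressed by roughly a factor of N. A hedged numerical sketch (illustrative rates only, unrelated to the actual RP-ParRep implementation):

```python
import random

def parrep_escape_time(rate, n_replicas, rng):
    """Toy ParRep step: each replica independently attempts to escape a
    state with exponential (Poisson) escape statistics; the simulation
    clock advances by the total time accumulated over all replicas up to
    the first escape, giving a ~n_replicas wall-clock speedup."""
    first_escape = min(rng.expovariate(rate) for _ in range(n_replicas))
    return first_escape * n_replicas   # simulated time credited to the clock

rng = random.Random(42)
rate, n = 0.01, 8                      # hypothetical escape rate, replica count
samples = [parrep_escape_time(rate, n, rng) for _ in range(20000)]
mean_escape = sum(samples) / len(samples)
print(round(mean_escape))              # close to the true mean lifetime 1/rate = 100
```

The statistical equivalence holds because the minimum of N exponential variables with rate k is exponential with rate Nk; multiplying by N restores the original mean lifetime.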
Mass Extinctions and Biosphere-Geosphere Stability
NASA Astrophysics Data System (ADS)
Rothman, Daniel; Bowring, Samuel
2015-04-01
Five times in the past 500 million years, mass extinctions have resulted in the loss of greater than three-fourths of living species. Each of these events is associated with significant environmental change recorded in the carbon-isotopic composition of sedimentary rocks. There are also many such environmental events in the geologic record that are not associated with mass extinctions. What makes them different? Two factors appear important: the size of the environmental perturbation, and the time scale over which it occurs. We show that the natural perturbations of Earth's carbon cycle during the past 500 million years exhibit a characteristic rate of change over two orders of magnitude in time scale. This characteristic rate is consistent with the maximum rate that limits quasistatic (i.e., near steady-state) evolution of the carbon cycle. We identify this rate with marginal stability, and show that mass extinctions occur on the fast, unstable side of the stability boundary. These results suggest that the great extinction events of the geologic past, and potentially a "sixth extinction" associated with modern environmental change, are characterized by common mechanisms of instability.
NASA Astrophysics Data System (ADS)
Voter, Arthur
Many important materials processes take place on time scales that far exceed the roughly one microsecond accessible to molecular dynamics simulation. Typically, this long-time evolution is characterized by a succession of thermally activated infrequent events involving defects in the material. In the accelerated molecular dynamics (AMD) methodology, known characteristics of infrequent-event systems are exploited to make reactive events take place more frequently, in a dynamically correct way. For certain processes, this approach has been remarkably successful, offering a view of complex dynamical evolution on time scales of microseconds, milliseconds, and sometimes beyond. We have recently made advances in all three of the basic AMD methods (hyperdynamics, parallel replica dynamics, and temperature accelerated dynamics (TAD)), exploiting both algorithmic advances and novel parallelization approaches. I will describe these advances, present some examples of our latest results, and discuss what should be possible when exascale computing arrives in roughly five years. Funded by the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, and by the Los Alamos Laboratory Directed Research and Development program.
Particle field diagnosis using angular multiplexing volume holography
NASA Astrophysics Data System (ADS)
Zhao, Yu; Li, Zeren; Luo, Zhenxiong; Jun, Li; Zhong, Jie; Ye, Yan; Li, Shengfu; Zhu, Jianhua
2017-08-01
The problem of diagnosing particle fields using holography arises in many areas. A single-frame hologram, however, captures only one moment of a fast event and cannot reveal the evolution of an unrepeatable fast event. Events on different time scales require different solutions. In this work, we recorded a laser-induced particle field on a time scale of tens of microseconds. A laser operating in pulse-sequence mode provided 10 pulses with a pulse energy of 150 mJ and a time interval of 1 μs. Four Pockels cells were employed to pick off the last four pulses for holographic recording; the other pulses were used to pre-expose the photopolymer-based recording material, which enhances the photosensitivity of the photopolymer at the moment of holographic recording. Angular multiplexing and volume holography were adopted to avoid shifting the photopolymer between shots. Another Q-switched YAG laser (pulse energy 100 mJ, pulse width 10 ns) was used to produce the fast event. As a result, we successfully captured the motion of the laser-induced particle field. The time interval between frames is 1 μs, the angular range of the four reference beams is 14°, and the diffraction efficiency of each hologram is less than 2%. Analysis indicates that a more compact design would allow this optical system to capture more holograms.
Some aspects of large-scale travelling ionospheric disturbances
NASA Astrophysics Data System (ADS)
Bowman, G. G.
1992-06-01
On two occasions the speeds and directions of travel of large-scale traveling ionospheric disturbances (LS-TIDs) following geomagnetic substorm onsets, have been calculated for the propagation of these disturbances in both hemispheres of the earth. N(h) analyses have been used to produce height change profiles at a fixed frequency from which time shifts between stations (used for the speed and direction-of-travel values) have been calculated. Fixed-frequency phase path measurements at Bribie Island for two events reveal wavetrains with periodicities around 17 min associated with these disturbances. Another event recorded a periodicity of 19 min. Also, for two of the events additional periodicities around 30 min were found. These wavetrains along with the macroscale height changes and electron density depletions associated with these LS-TIDs are essentially the same as the ionospheric structure changes observed during the passage of night-time medium-scale traveling ionospheric disturbances (MS-TIDs). However, unlike these MS-TIDs, the LS-TIDs are generally not associated with the recording of spread-F on ionograms. Possible reasons for this difference are discussed as well as the special conditions which probably prevail on the few occasions when spread-F is associated with LS-TIDs.
Memory effect in M ≥ 7 earthquakes of Taiwan
NASA Astrophysics Data System (ADS)
Wang, Jeen-Hwa
2014-07-01
The M ≥ 7 earthquakes that occurred in the Taiwan region during 1906-2006 are used to study whether a memory effect exists in the sequence of those large earthquakes. Those events are all mainshocks. The fluctuation analysis technique is applied to analyze two sequences, earthquake magnitude and inter-event time, represented in the natural time domain. For both magnitude and inter-event time, the calculations are made for three data sets: the original-order data, the reverse-order data, and the mean values. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both magnitude and inter-event time data. In addition, phase portraits of two sequent magnitudes and two sequent inter-event times are used to explore whether large (or small) earthquakes are followed by large (or small) events. The results lead to a negative answer. Taking all of this information together, we conclude that the earthquake sequence under study is short-term correlated and thus that a short-term memory effect is operative.
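The fluctuation analysis used in such studies can be sketched as follows: build the profile (cumulative sum) of the mean-subtracted sequence, measure the rms of its increments over a range of window lengths, and take the slope of the log-log relation. An exponent near 0.5 indicates an uncorrelated sequence, with departures signaling memory. A minimal illustration on synthetic data (not the Taiwan catalog):

```python
import numpy as np

def fluctuation_exponent(x, windows):
    """Fluctuation analysis: slope of log F(n) versus log n, where F(n) is
    the rms increment of the profile (cumulative sum of the mean-subtracted
    series) over lag n; ~0.5 for an uncorrelated sequence."""
    y = np.cumsum(x - np.mean(x))
    f = [np.sqrt(np.mean((y[n:] - y[:-n]) ** 2)) for n in windows]
    slope, _ = np.polyfit(np.log(windows), np.log(f), 1)
    return slope

rng = np.random.default_rng(0)
mags = rng.normal(6.0, 0.5, 2000)      # hypothetical iid magnitude sequence
h = fluctuation_exponent(mags, [4, 8, 16, 32, 64])
print(round(h, 1))                     # ~0.5; a persistent series would give > 0.5
```

Exponents below 0.5, as reported in the abstract, indicate anti-persistent short-term structure rather than long-range persistence.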
Development of A Tsunami Magnitude Scale Based on DART Buoy Data
NASA Astrophysics Data System (ADS)
Leiva, J.; Polet, J.
2016-12-01
The quantification of tsunami energy has evolved through time, with a number of magnitude and intensity scales employed in the past century. Most of these scales rely on coastal measurements, which may be affected by complexities due to near-shore bathymetric effects and coastal geometries. Moreover, these datasets are generated by tsunami inundation, and thus cannot serve as a means of assessing potential tsunami impact prior to coastal arrival. With the introduction of a network of ocean buoys provided through the Deep-ocean Assessment and Reporting of Tsunamis (DART) project, a dataset has become available that can be exploited to further our current understanding of tsunamis and the earthquakes that excite them. The DART network consists of 39 stations that have produced estimates of sea-surface height as a function of time since 2003, and are able to detect deep-ocean tsunami waves. Data collected at these buoys over the past decade reveal that at least nine major tsunami events, such as the 2011 Tohoku and 2013 Solomon Islands events, produced substantial wave amplitudes across a large distance range that can be incorporated into a DART-data-based tsunami magnitude scale. We present preliminary results from the development of a tsunami magnitude scale that follows the methods used by Charles Richter in developing the local magnitude scale. Analogous to the use of seismic ground-motion amplitudes in the calculation of local magnitude, maximum ocean-height displacements due to the passage of tsunami waves are related to distance from the source in a least-squares exponential regression analysis. The regression produces attenuation curves based on the DART data, a site correction term, attenuation parameters, and an amplification factor. Initially, single-event-based regressions are used to constrain the attenuation parameters. Additional iterations use the parameters of these event-based fits as a starting point to obtain a stable solution, and include the calculation of station corrections, in order to obtain a final amplification factor for each event, which is used to calculate its tsunami magnitude.
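The Richter-style procedure described here can be sketched in miniature: fit a least-squares attenuation law to log amplitude versus log distance, then use the fitted curve to reduce each station's amplitude to a common reference distance so that every station yields the same magnitude for one event. The values below are illustrative, not DART measurements, and the simple power-law attenuation form is an assumption of the sketch:

```python
import numpy as np

# Hypothetical DART amplitudes (cm) versus source distance (km) for one event,
# assuming power-law attenuation A = A0 * d^(-b) (illustrative values only).
d = np.array([500.0, 1000.0, 2000.0, 4000.0, 8000.0])
a0_true, b_true = 40.0, 0.5
amp = a0_true * d ** (-b_true)

# Least-squares fit in log space: log10 A = log10 A0 - b * log10 d
coeffs = np.polyfit(np.log10(d), np.log10(amp), 1)
b_fit = -coeffs[0]

def tsunami_magnitude(a_obs, dist, b=b_fit, ref=1000.0):
    """Richter-style magnitude: observed amplitude corrected back to a
    reference distance (here 1000 km) using the fitted attenuation curve."""
    return np.log10(a_obs) + b * (np.log10(dist) - np.log10(ref))

print(round(b_fit, 2))  # recovers the injected attenuation exponent 0.5
```

In the real analysis, residuals from many events provide the station corrections, and iterating the fit stabilizes the attenuation parameters, as the abstract describes.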
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.
NASA Astrophysics Data System (ADS)
Chu, Qu-cheng; Wang, Qi-guang; Qiao, Shao-bo; Feng, Guo-lin
2018-01-01
When persistent rainfall occurs frequently over South China, meso-scale and micro-scale synoptic systems persist and expand in space and time and eventually form meso-scale and large-scale weather processes. The accumulation of multiple torrential-rain processes is defined as a "cumulative effect" of torrential rain (CETR) event. In this paper, daily reanalysis datasets collected by the National Centers for Environmental Prediction-Department of Energy (NCEP-DOE) during 1979-2014 are used to study the anomalous features and causes of heavy CETR events over South China. The results show a significant difference in the spatial distribution of heavy CETR events: based on the center position of the CETR, they are classified as middle-region-heavy or west-region-heavy events. El Niño events in the previous period (December, January, February, March (DJFM)) are the major external forcing factor of middle-region-heavy CETR events; they favor a persistent, anomalous Philippine Sea anticyclone and strengthen the West Pacific Subtropical High (WPSH), extending it farther westward than normal. The primary water vapor source for precipitation in middle-region-heavy CETR events is the Tropical Western Pacific Ocean. The major external forcing factor of west-region-heavy CETR events is the negative anomaly in the southern Tropical Indian Ocean (TIO) during the previous period (DJFM). This factor favors strengthening of the cross-equatorial flow and the westerly winds from the Bay of Bengal to the South China Sea (SCS), and an early SCS summer monsoon onset. The primary water vapor source of precipitation in west-region-heavy CETR events is the southern TIO.
Defining Tsunami Magnitude as Measure of Potential Impact
NASA Astrophysics Data System (ADS)
Titov, V. V.; Tang, L.
2016-12-01
The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, the difficulty of estimating tsunami energy from available tsunami measurements at coastal sea-level stations has carried significant uncertainties, and doing so has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of tsunami magnitude estimation, including collection of the vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations, and the uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since real-time seismic data are available for real-time processing and an ample amount of seismic data is available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between earthquake energy and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful in tsunami warning as a quick estimate of tsunami impact, and in post-event analysis as a universal scale for tsunami inter-comparison. We present a method for estimating tsunami magnitude based on tsunami energy, and present an application of the magnitude analysis to several historical events for inter-comparison with existing methods.
Particle acceleration in explosive relativistic reconnection events and Crab Nebula gamma-ray flares
NASA Astrophysics Data System (ADS)
Lyutikov, Maxim; Komissarov, Serguei; Sironi, Lorenzo
2018-04-01
We develop a model of gamma-ray flares of the Crab Nebula resulting from magnetic reconnection events in a highly magnetised relativistic plasma. We first discuss physical parameters of the Crab Nebula and review the theory of pulsar winds and termination shocks. We also review the principal points of particle acceleration in explosive reconnection events [Lyutikov et al., J. Plasma Phys., vol. 83(6), p. 635830601 (2017a); J. Plasma Phys., vol. 83(6), p. 635830602 (2017b)]. Particles producing flares must be accelerated in highly magnetised regions of the nebula. Flares originate from the poleward regions at the base of the Crab's polar outflow, where both the magnetisation and the magnetic field strength are sufficiently high. The post-termination-shock flow develops macroscopic (not related to the plasma properties on the skin-depth scale) kink-type instabilities. The resulting large-scale magnetic stresses drive explosive reconnection events on the light-crossing time of the reconnection region. Flares are produced at the initial stage of the current sheet development, during the X-point collapse. The model has all the ingredients needed for Crab flares: natural formation of highly magnetised regions, explosive dynamics on the light travel time, development of high electric fields on macroscopic scales and acceleration of particles to energies well exceeding the average magnetic energy per particle.
Characteristic time scales in the American dollar-Mexican peso exchange currency market
NASA Astrophysics Data System (ADS)
Alvarez-Ramirez, Jose
2002-06-01
Daily fluctuations of the American dollar-Mexican peso exchange market are studied using multifractal analysis methods. Evidence is found of multiaffinity of the daily fluctuations, in the sense that the qth-order (roughness) Hurst exponent Hq varies with q. It is also found that there exist several characteristic time scales ranging from a week to a year. Accordingly, the market exhibits persistence in the sense that instabilities introduced by market events acting around the characteristic time scales (mainly the quarter and the year) propagate through future market activity. Some implications of our results for the regulation of the dollar-peso market are discussed.
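The q-dependence of the Hurst exponent invoked here can be estimated with structure functions, S_q(τ) = ⟨|x(t+τ) − x(t)|^q⟩ ∝ τ^(qH_q): a monofractal series gives an H_q that is flat in q, while multiaffinity shows up as H_q varying with q. A minimal sketch on a synthetic monofractal series (not exchange-rate data):

```python
import numpy as np

def generalized_hurst(x, q, lags):
    """Structure-function estimate of the q-th order Hurst exponent H_q:
    S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q * H_q)."""
    s = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return slope / q

rng = np.random.default_rng(1)
walk = np.cumsum(rng.normal(size=5000))   # monofractal random walk, H = 0.5
lags = [1, 2, 4, 8, 16, 32]
h1 = generalized_hurst(walk, 1.0, lags)
h3 = generalized_hurst(walk, 3.0, lags)
print(round(h1, 2), round(h3, 2))         # both near 0.5: H_q flat for a monofractal
```

For a multiaffine series such as the exchange-rate fluctuations in the abstract, h1 and h3 would differ noticeably.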
Contributions of substorm injections to SYM-H depressions in the main phase of storms
NASA Astrophysics Data System (ADS)
He, Zhaohai; Dai, Lei; Wang, Chi; Duan, Suping; Zhang, Lingqian; Chen, Tao; Roth, I.
2016-12-01
Substorm injections bring energetic particles to the inner magnetosphere, but the role of the injected population in building up the storm-time ring current is not well understood. By surveying Los Alamos National Laboratory geosynchronous data during 34 storm main phases, we show evidence that at least some substorm injections can contribute to substorm-time-scale SYM-H/Dst depressions in the main phase of storms. For event studies, we analyze two typical events in which the main-phase SYM-H index exhibited stepwise depressions that are correlated with particle flux enhancements due to injections and with the AL index. A statistical study is performed based on 95 storm-time injection events. The flux increases of the injected population (50-400 keV) are found to be proportional to the sharp SYM-H depressions during the injection interval. By identifying dispersionless and dispersive injection signals, we estimate the azimuthal extent of the substorm injection. Statistical results show that the injection regions of these storm-time substorms are characterized by an azimuthal extent larger than 6 h in magnetic local time. These results suggest that at least some substorm injections may mimic the large-scale enhanced convection and contribute to sharp decreases of Dst in the storm main phase.
Besmer, Michael D.; Sigrist, Jürg A.; Props, Ruben; Buysschaert, Benjamin; Mao, Guannan; Boon, Nico; Hammes, Frederik
2017-01-01
Rapid contamination of drinking water in distribution and storage systems can occur due to pressure drop, backflow, cross-connections, accidents, and bio-terrorism. Small volumes of a concentrated contaminant (e.g., wastewater) can contaminate large volumes of water in a very short time, with potentially severe negative health impacts. The technical limitations of conventional, cultivation-based microbial detection methods allow neither timely detection of such contaminations nor real-time monitoring of subsequent emergency remediation measures (e.g., shock-chlorination). Here we applied a newly developed continuous, ultra-high-frequency flow cytometry approach to track a rapid pollution event and subsequent disinfection of drinking water in an 80-min laboratory-scale simulation. We quantified total (TCC) and intact (ICC) cell concentrations as well as flow cytometric fingerprints in parallel in real time with two different staining methods. The ingress of wastewater was detectable almost immediately (i.e., after a 0.6% volume change), significantly changing TCC, ICC, and the flow cytometric fingerprint. Shock chlorination was rapid and detected in real time, causing membrane damage in the vast majority of bacteria (i.e., a drop in ICC from more than 380 cells/μl to less than 30 cells/μl within 4 min). Both of these effects, as well as the final wash-in of fresh tap water, followed calculated predictions well. Detailed and highly quantitative tracking of microbial dynamics is feasible at very short time scales and for different characteristics (e.g., concentration, membrane integrity). This opens up multiple possibilities for targeted investigation of a myriad of bacterial short-term dynamics (e.g., disinfection, growth, detachment, operational changes) both in laboratory-scale research and in full-scale system investigations in practice. PMID:29085343
Understanding Smoking after Acute Illness: An Application of the Sentinel Event Method
Abar, Beau; Bock, Beth; Chapman, Gretchen; Boudreaux, Edwin D.
2016-01-01
The Sentinel Event Theory provides a stepwise approach for building models to understand how negative events can spark health behavior change. This study tested a preliminary model using the Sentinel Events Method in a sample (N = 300) of smokers who sought care for acute cardiac symptoms. Patients completed measures on: smoking-related causal attribution, perceived severity of the acute illness event, illness-related fear, and intentions to quit smoking. Patients were followed up one week after the health event and a 7 day time line follow back (TLFB) was completed to determine abstinence from tobacco. Structural equation models were performed using average predictor scale scores at baseline, as well as three different time anchors for ratings of illness severity and illness-related fear. Quit intentions, actual illness severity, and age were consistent, positive, independent predictors of 7 day point prevalence abstinence. Additional research on the influences of perceptions and emotional reactions is warranted. PMID:25563437
An event map of memory space in the hippocampus
Deuker, Lorena; Bellmund, Jacob LS; Navarro Schröder, Tobias; Doeller, Christian F
2016-01-01
The hippocampus has long been implicated in both episodic and spatial memory; however, these mnemonic functions have traditionally been investigated in separate research strands. Theoretical accounts and rodent data suggest a common mechanism for spatial and episodic memory in the hippocampus, which provides an abstract and flexible representation of the external world. Here, we monitor the de novo formation of such a representation of space and time in humans using fMRI. After learning spatio-temporal trajectories in a large-scale virtual city, subject-specific neural similarity in the hippocampus scaled with the remembered proximity of events in space and time. Crucially, the structure of the entire spatio-temporal network was reflected in neural patterns. Our results provide evidence for a common coding mechanism underlying spatial and temporal aspects of episodic memory in the hippocampus and shed new light on its role in interleaving multiple episodes in a neural event map of memory space. DOI: http://dx.doi.org/10.7554/eLife.16534.001 PMID:27710766
The Substructure of the Solar Corona Observed in the Hi-C Telescope
NASA Technical Reports Server (NTRS)
Winebarger, A.; Cirtain, J.; Golub, L.; DeLuca, E.; Savage, S.; Alexander, C.; Schuler, T.
2014-01-01
In the summer of 2012, the High-resolution Coronal Imager (Hi-C) flew aboard a NASA sounding rocket and collected the highest-spatial-resolution images ever obtained of the solar corona. One of the goals of the Hi-C flight was to characterize the substructure of the solar corona. We therefore calculate how the intensity scales from low-resolution (AIA) pixels to high-resolution (Hi-C) pixels for both dynamic events and "background" emission (that is, the steady emission over the 5 minutes of data acquisition time). We find no evidence of substructure in the background corona; the intensity scales smoothly from low-resolution AIA pixels to high-resolution Hi-C pixels. In transient events, however, the intensity observed with Hi-C is, on average, 2.6 times larger than that observed with AIA. This increase in intensity suggests that AIA is not resolving these events, implying a finely structured, dynamic corona embedded in a smoothly varying background.
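The low-resolution versus high-resolution comparison described here amounts to summing blocks of fine pixels into coarse pixels and asking whether compact features survive. A toy numpy sketch (synthetic image, not Hi-C or AIA data) showing how an unresolved transient is diluted when its flux is spread over a coarse pixel:

```python
import numpy as np

def rebin(img, factor):
    """Sum factor x factor blocks of high-res pixels into one low-res pixel,
    mimicking how a coarser detector would integrate the same scene."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

# Hypothetical scene: smooth background plus one compact bright event.
hires = np.full((64, 64), 10.0)
hires[30:32, 30:32] += 200.0              # unresolved 2x2-pixel transient
lores = rebin(hires, 4)                   # 4x coarser pixels (illustrative factor)

# Peak surface brightness per unit area drops when the feature is unresolved:
peak_hi = hires.max()                     # 210 per high-res pixel
peak_lo = lores.max() / 16.0              # per equivalent high-res pixel area
print(peak_hi, peak_lo)                   # 210.0 vs (160 + 800) / 16 = 60.0
```

Total flux is conserved by the rebinning; only the peak brightness of compact features is diluted, which is the signature the Hi-C analysis exploits.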
Relative timing of last glacial maximum and late-glacial events in the central tropical Andes
NASA Astrophysics Data System (ADS)
Bromley, Gordon R. M.; Schaefer, Joerg M.; Winckler, Gisela; Hall, Brenda L.; Todd, Claire E.; Rademaker, Kurt M.
2009-11-01
Whether or not tropical climate fluctuated in synchrony with global events during the Late Pleistocene is a key problem in climate research. However, the timing of past climate changes in the tropics remains controversial, with a number of recent studies reporting that tropical ice age climate is out of phase with global events. Here, we present geomorphic evidence and an in-situ cosmogenic 3He surface-exposure chronology from Nevado Coropuna, southern Peru, showing that glaciers underwent at least two significant advances during the Late Pleistocene prior to Holocene warming. Comparison of our glacial-geomorphic map at Nevado Coropuna to mid-latitude reconstructions yields a striking similarity between Last Glacial Maximum (LGM) and Late-Glacial sequences in tropical and temperate regions. Exposure ages constraining the maximum and end of the older advance at Nevado Coropuna range between 24.5 and 25.3 ka, and between 16.7 and 21.1 ka, respectively, depending on the cosmogenic production rate scaling model used. Similarly, the mean age of the younger event ranges from 10 to 13 ka. This implies that (1) the LGM and the onset of deglaciation in southern Peru occurred no earlier than at higher latitudes and (2) that a significant Late-Glacial event occurred, most likely prior to the Holocene, coherent with the glacial record from mid and high latitudes. The time elapsed between the end of the LGM and the Late-Glacial event at Nevado Coropuna is independent of scaling model and matches the period between the LGM termination and Late-Glacial reversal in classic mid-latitude records, suggesting that these events in both tropical and temperate regions were in phase.
Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo
2015-07-01
Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.
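The event-and-time-driven split described in this abstract can be caricatured in a few lines: a priority queue delivers sparse spikes at their exact times (the event-driven path, suited to low-activity neurons), while a dense population advances on a fixed step between deliveries (the time-driven path, suited to high-activity layers). A toy sketch, unrelated to the simulator's actual implementation:

```python
import heapq

def simulate_hybrid(t_end, dt, input_spikes):
    """Toy hybrid loop: a priority queue delivers sparse spikes at their
    exact times (event-driven) while a dense population is advanced with
    a fixed step dt between deliveries (time-driven)."""
    queue = list(input_spikes)            # (spike_time, neuron_id) pairs
    heapq.heapify(queue)
    n_steps = int(round(t_end / dt))
    delivered = []
    for step in range(n_steps):
        t_next = (step + 1) * dt
        # Event-driven path: pop every spike scheduled before the step edge.
        while queue and queue[0][0] <= t_next:
            delivered.append(heapq.heappop(queue))
        # Time-driven path: the dense (high-activity) layer would be
        # integrated by dt here.
    return n_steps, delivered

steps, spikes = simulate_hybrid(1.0, 0.1, [(0.71, 9), (0.25, 3)])
print(steps, spikes)                      # 10 fixed steps; spikes in time order
```

In a real simulator each delivered spike would update synaptic state at its exact timestamp, which is what gives the event-driven path its precision advantage for sparse activity.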
Portable real-time fluorescence cytometry of microscale cell culture analog devices
NASA Astrophysics Data System (ADS)
Kim, Donghyun; Tatosian, Daniel A.; Shuler, Michael L.
2006-02-01
A portable fluorescence cytometric system that provides a modular platform for quantitative real-time image measurements has been used to explore the applicability to investigating cellular events on multiple time scales. For a short time scale, we investigated the real-time dynamics of uptake of daunorubicin, a chemotherapeutic agent, in cultured mouse L-cells in a micro cell culture analog compartment using the fluorescent cytometric system. The green fluorescent protein (GFP) expression to monitor induction of pre-specified genes, which occurs on a much longer time scale, has also been measured. Here GFP fluorescence from a doxycycline inducible promoter in a mouse L-cell line was determined. Additionally, a system based on inexpensive LEDs showed performance comparable to a broadband light source based system and reduced photobleaching compared to microscopic examination.
NASA Astrophysics Data System (ADS)
Singh, A.; Tejedor, A.; Grimaud, J. L.; Zaliapin, I. V.; Foufoula-Georgiou, E.
2016-12-01
Knowledge of the dynamics of evolving landscapes, in terms of their geomorphic and topologic re-organization in response to changing climatic or tectonic forcing, is of scientific and practical interest. Although several studies have addressed the large-scale response (e.g., change in mean relief), studies of smaller-scale drainage-pattern re-organization and quantification of landscape vulnerability to the timing, magnitude, and frequency of changing forcing are lacking, owing to the absence of data for such an analysis. To that end, a series of controlled laboratory experiments was conducted at the St. Anthony Falls Laboratory of the University of Minnesota to study the effect of changing precipitation patterns on landscape evolution at short and long time scales. High-resolution digital elevation models (DEMs), in both space and time, were measured for a range of rainfall patterns and uplift rates. Results from our study show a distinct signature of the precipitation increase on the probabilistic and geometrical structure of landscape features, evident in the widening and deepening of channels and valleys, change in drainage patterns within sub-basins, and change in the space-time structure of erosional and depositional events. A spatially explicit analysis of the locus of these erosional and depositional events suggests a regime shift, during the onset of the transient state, from supply-limited to transport-limited fluvial channels. We document a characteristic scale-dependent signature of erosion at steady state (which we term the "E50-area curve") and show that during reorganization its evolving shape reflects the processes and scales of geomorphic change. Finally, we document changes in the longitudinal river profiles, in response to increased precipitation rate, with the formation of abrupt gradients (knickpoints) that migrate upstream as time proceeds.
Modulation of the SSTA decadal variation on ENSO events and relationships of SSTA with LOD, SOI, etc.
NASA Astrophysics Data System (ADS)
Liao, D. C.; Zhou, Y. H.; Liao, X. H.
2007-01-01
Interannual and decadal components of the length of day (LOD), the Southern Oscillation Index (SOI), and the sea surface temperature anomaly (SSTA) in the Niño regions are extracted by band-pass filtering and used to study the modulation of the SSTA on ENSO events. Results show that, besides the interannual components, the decadal components of the SSTA have strong impacts on the monitoring and representation of ENSO events. When the ENSO events are strong, the modulation of the decadal components of the SSTA tends to prolong the lifetime of the events and enlarge the extreme anomalies of the SST, while ENSO events that are too weak to be detected by the interannual components of the SSTA can still be detected with the help of the modulation of the SSTA decadal components. The study further draws attention to the relationships of the SSTA interannual and decadal components with those of LOD, SOI, the sea level pressure anomalies (SLPA) and the trade wind anomalies (TWA) in the tropical Pacific, and the axial components of the atmospheric angular momentum (AAM) and oceanic angular momentum (OAM). The squared coherence and coherent phases among them reveal close connections between the SSTA and almost all of the parameters mentioned above on interannual time scales; on the decadal time scale there are significant connections among the SSTA and SOI, SLPA, TWA, χ3w and χ3w+v as well, and slightly weaker connections between the SSTA and LOD, χ3pib and χ3bp.
Aab, Alexander
2015-03-30
In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angles between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the E > 8 EeV energy bin, with a first-harmonic amplitude in right ascension r1^α = (4.4 ± 1.0) × 10^-2 that has a chance probability P(≥ r1^α) = 6.4 × 10^-5, reinforcing the hint previously reported with vertical events alone.
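The first-harmonic Rayleigh analysis named in this abstract has a compact standard form: the amplitude and phase are obtained from the first Fourier coefficients of the right-ascension distribution, and the chance probability of an equal or larger amplitude arising from isotropy is exp(-N r^2 / 4). The sketch below implements that textbook estimator, not the Observatory's actual analysis chain (which also applies exposure corrections); the function name and the isotropic test sample are illustrative.

```python
import math
import random

def rayleigh_first_harmonic(alphas):
    """First-harmonic Rayleigh analysis of right ascensions (radians).

    Returns the amplitude r1, phase phi1, and the chance probability
    P(>= r1) = exp(-N * r1**2 / 4) of drawing an equal or larger
    amplitude from an isotropic distribution of N events.
    """
    n = len(alphas)
    a = (2.0 / n) * sum(math.cos(x) for x in alphas)  # cosine Fourier coefficient
    b = (2.0 / n) * sum(math.sin(x) for x in alphas)  # sine Fourier coefficient
    r1 = math.hypot(a, b)    # first-harmonic amplitude
    phi1 = math.atan2(b, a)  # phase of the modulation
    p_chance = math.exp(-n * r1 * r1 / 4.0)
    return r1, phi1, p_chance

# An isotropic sample should give a small amplitude and a large chance probability.
random.seed(0)
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(10000)]
r1, phi1, p = rayleigh_first_harmonic(iso)
```

For 10,000 isotropic events the expected amplitude is of order sqrt(pi/N) ≈ 0.02, far below the (4.4 ± 1.0) × 10^-2 signal reported above for a much larger exposure-weighted sample.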
Quantifying the Temporal Inequality of Nutrient Loads with a Novel Metric
NASA Astrophysics Data System (ADS)
Gall, H. E.; Schultz, D.; Rao, P. S.; Jawitz, J. W.; Royer, M.
2015-12-01
Inequality is an emergent property of many complex systems. For a given series of stochastic events, some events generate a disproportionately large contribution to system responses compared to others. In catchments, such responses cause streamflow and solute loads to exhibit strong temporal inequality, with the vast majority of discharge and solute loads exported during the short periods of time in which high-flow events occur. These periods are commonly referred to as "hot moments". Although this temporal inequality is widely recognized, there is currently no uniform metric for assessing it. We used a novel application of the Lorenz Inequality, a method commonly used in economics to measure income inequality, to quantify the spatial and temporal inequality of streamflow and nutrient (nitrogen and phosphorus) loads exported to the Chesapeake Bay. The Lorenz Inequality and the corresponding Gini coefficient provide an analytical tool for quantifying inequality that can be applied at any temporal or spatial scale. The Gini coefficient (G) is a formal measure of inequality that varies from 0 to 1, with a value of 0 indicating perfect equality (i.e., fluxes and loads are constant in time) and 1 indicating perfect inequality (i.e., all of the discharge and solute loads are exported during one instant in time). G is therefore a simple yet powerful tool for providing insight into the temporal inequality of nutrient transport. We will present the results of our detailed analysis of streamflow and nutrient time series data collected since the early 1980s at 30 USGS gauging stations in the Chesapeake Bay watershed. The analysis is conducted at an annual time scale, enabling trends and patterns to be assessed both temporally (over time at each station) and spatially (for the same period of time across stations).
The results of this analysis have the potential to create a transformative new framework for identifying "hot moments", improving our ability to temporally and spatially target implementation of best management practices to ultimately improve water quality in the Chesapeake Bay. This method also provides insight into the temporal scales at which hydrologic and biogeochemical variability dominate nutrient export dynamics.
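The Gini coefficient described above can be computed directly from a load series as one minus twice the area under the Lorenz curve (cumulative load share versus cumulative time share). A minimal sketch, with illustrative function and variable names; the abstract does not specify the authors' exact numerical formulation:

```python
def gini(loads):
    """Gini coefficient of temporal inequality for a series of non-negative
    loads (e.g. daily solute loads). 0 = perfectly uniform in time,
    1 = all load exported in a single time step."""
    x = sorted(loads)
    n = len(x)
    total = sum(x)
    if total == 0:
        return 0.0
    # Lorenz-curve formulation: G = 1 - 2 * (trapezoidal area under the curve).
    cum = 0.0
    area = 0.0
    for v in x:
        prev = cum
        cum += v / total            # cumulative share of total load
        area += (prev + cum) / 2.0 / n  # trapezoid over one 1/n time-share step
    return 1.0 - 2.0 * area

uniform = gini([1.0] * 365)           # constant flux -> G = 0
flashy = gini([0.0] * 364 + [100.0])  # one "hot moment" -> G near 1
```

Note that for an n-point series the discrete estimate tops out at 1 - 1/n, so a single "hot moment" carrying all of the annual load yields G ≈ 0.997 rather than exactly 1.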
Scaling of seismicity induced by nonlinear fluid-rock interaction after an injection stop
NASA Astrophysics Data System (ADS)
Johann, L.; Dinske, C.; Shapiro, S. A.
2016-11-01
Fluid injections into unconventional reservoirs, performed for fluid-mobility enhancement, are accompanied by microseismic activity that continues after the injection stops. Previous studies revealed that the triggering of seismic events can be effectively described by nonlinear diffusion of pore fluid pressure perturbations, where the hydraulic diffusivity becomes pressure dependent. The spatiotemporal distribution of postinjection-induced microseismicity has two important features: the triggering front, corresponding to early and distant events, and the back front, representing the time-dependent spatial envelope of the growing zone of seismic quiescence. Here, for the first time, we describe analytically the temporal behavior of these two fronts after the injection stop in the case of nonlinear pore fluid pressure diffusion. We propose a scaling law for the fronts and show that they are sensitive to the degree of nonlinearity and to the Euclidean dimension of the dominant growth of the seismicity clouds. To validate the theoretical finding, we numerically model nonlinear pore fluid pressure diffusion and generate synthetic catalogs of seismicity. Additionally, we apply the new scaling relation to several case studies of injection-induced seismicity. The derived scaling laws describe both synthetic and real data well.
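For orientation, both fronts have well-known closed forms in the linear-diffusion baseline (constant diffusivity D); the paper's contribution, the generalization to pressure-dependent diffusivity, is not reproduced here. A sketch assuming the standard linear-case expressions, with the back-front formula taken as r = sqrt(2 d D t (t/t0 - 1) ln(t/(t - t0))) for an injection stopping at time t0 and Euclidean growth dimension d:

```python
import math

def triggering_front(t, diffusivity):
    """Triggering front radius r(t) = sqrt(4*pi*D*t) for linear
    pore-pressure diffusion (Shapiro-type scaling; linear baseline only)."""
    return math.sqrt(4.0 * math.pi * diffusivity * t)

def back_front(t, t0, diffusivity, d=3):
    """Back front of the growing quiescence zone after an injection stop
    at t0, assumed linear-case formula; returns 0 before the stop."""
    if t <= t0:
        return 0.0
    return math.sqrt(2.0 * d * diffusivity * t
                     * (t / t0 - 1.0) * math.log(t / (t - t0)))
```

In the nonlinear case studied in the paper, the exponents of these power-law-type fronts change with the degree of nonlinearity and with d, which is what the proposed scaling law captures.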
NASA Astrophysics Data System (ADS)
Strikis, N. M.; Cruz, F. W.; Cheng, H.; Karmann, I.; Vuille, M.; Edwards, R.; Wang, X.; Paula, M. S.; Novello, V. F.; Auler, A.
2011-12-01
A paleoprecipitation reconstruction based on high-resolution, well-dated speleothem oxygen isotope records shows that monsoon precipitation over central eastern Brazil underwent strong variations on millennial to multi-centennial time scales during the Holocene. This new record indicates that abrupt events of increased monsoon precipitation are correlated with Bond events 6, 5, and 4, and also with the 8.2 ky event, during the early and mid-Holocene, with a mean amplitude of 1.5‰ (PDB). The pacing and structure of these events are generally consistent with variations in solar activity suggested by atmospheric Δ14C records. In the late Holocene, abrupt events of increased monsoon precipitation peaking at 3.2, 2.7 and 2.3 ky B.P. are approximately synchronous with solar activity minima; the most prominent late-Holocene event occurred at ~2.7 ky B.P. In addition, these positive precipitation anomalies recorded in central eastern Brazil are in good agreement with variations in the level of Lake Titicaca. The good correspondence between the speleothem and marine records implies that variations in North Atlantic sea surface temperature are the main forcing of abrupt millennial to multi-centennial precipitation variations within the region under the influence of the South American Monsoon.
NASA Astrophysics Data System (ADS)
Gallego, C.; Costa, A.; Cuerva, A.
2010-09-01
Since wind energy currently can be neither scheduled nor stored at large scale, wind power forecasting has been useful to minimize the impact of wind fluctuations. In particular, short-term forecasting (characterized by prediction horizons from minutes to a few days) is currently required by energy producers (in a daily electricity market context) and by TSOs (in order to keep the stability/balance of an electrical system). Within the short-term context, time-series based models (i.e., statistical models) have shown better performance than numerical weather prediction (NWP) models for horizons up to a few hours. These models try to learn and replicate the dynamics shown by the time series of a certain variable. When considering the power output of wind farms, ramp events are usually observed, characterized by a large positive gradient in the time series (ramp-up) or a large negative one (ramp-down) during relatively short time periods (a few hours). Ramp events may have many different causes, generally involving several spatial scales, from the large scale (fronts, low pressure systems) down to the local scale (wind turbine shut-down due to high wind speed, yaw misalignment due to fast changes of wind direction). Hence, the output power may show unexpected dynamics during ramp events depending on the underlying processes; consequently, traditional statistical models considering only one dynamic for the whole power time series may be inappropriate. This work proposes a Regime Switching (RS) model based on Artificial Neural Networks (ANNs). The RS-ANN model comprises as many ANNs as there are dynamics considered (called regimes); a certain ANN is selected to predict the output power, depending on the current regime. The current regime is updated on-line based on a gradient criterion applied to the past two values of the output power. Three regimes are established concerning ramp events: ramp-up, ramp-down, and no-ramp.
In order to assess the skill of the proposed RS-ANN model, a single-ANN model (without regime classification) is adopted as a reference. Both models are evaluated in terms of the Improvement over Persistence on a Mean Square Error basis (IoP%) for prediction horizons from 1 to 5 time-steps. The case of a wind farm located in the complex terrain of Alaiz (northern Spain) has been considered. Three years of available power output data with an hourly resolution have been employed: two years for training and validation of the model and the last year for assessing the accuracy. Results showed that the RS-ANN outperformed the single-ANN model for one-step-ahead forecasts: the overall IoP% was up to 8.66% for the RS-ANN model (depending on the gradient criterion selected to trigger the ramp regime) and 6.16% for the single-ANN. However, both models showed similar accuracy for longer horizons. A locally-weighted evaluation during ramp events for one-step-ahead forecasts was also performed. It was found that the IoP% during ramp-up events increased from 17.60% (single-ANN) to 22.25% (RS-ANN), while during ramp-down events it increased from 18.55% to 19.55%. Three main conclusions are derived from this case study. First, it highlights the importance of statistical models capable of differentiating the several regimes shown by the output power time series, in order to improve forecasting during extreme events like ramps. Second, on-line regime classification based on available power output data did not seem to improve forecasts for horizons beyond one step ahead. Third, taking into account other explanatory variables (local wind measurements, NWP outputs) could lead to a better understanding of ramp events, improving the regime assessment for further horizons as well. The RS-ANN model only slightly outperformed the single-ANN during ramp-down events.
If further research reinforces this effect, special attention should be paid to understanding the underlying processes during ramp-down events.
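The gradient criterion used above to select the active regime from the past two power values can be sketched in a few lines. The threshold value and the function names are illustrative; the actual criterion tuned in the study may differ in detail.

```python
def classify_regime(p_prev2, p_prev1, threshold):
    """Label the current regime from the last two power observations.

    threshold is a tunable fraction of rated power (illustrative value
    below). Returns 'ramp-up', 'ramp-down', or 'no-ramp'."""
    gradient = p_prev1 - p_prev2
    if gradient > threshold:
        return "ramp-up"
    if gradient < -threshold:
        return "ramp-down"
    return "no-ramp"

# One-step regime sequence for a short normalized power series:
series = [0.10, 0.15, 0.60, 0.65, 0.20]
regimes = [classify_regime(series[i - 1], series[i], threshold=0.3)
           for i in range(1, len(series))]
# regimes == ['no-ramp', 'ramp-up', 'no-ramp', 'ramp-down']
```

In the RS-ANN model each of these three labels selects a different network for the next prediction, which is what distinguishes it from the single-ANN reference.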
A networks-based discrete dynamic systems approach to volcanic seismicity
NASA Astrophysics Data System (ADS)
Suteanu, Mirela
2013-04-01
The detection and relevant description of pattern change concerning earthquake events is an important but challenging task. In this paper, earthquake events related to volcanic activity are considered manifestations of a dynamic system evolving over time. The system dynamics is seen as a succession of events with point-like appearance in both time and space. Each event is characterized by a position in three-dimensional space, a moment of occurrence, and an event size (magnitude). A weighted directed network is constructed to capture the effects of earthquakes on subsequent events. Each seismic event represents a node. Relations among events represent edges. Edge directions are given by the temporal succession of the events. Edges are also characterized by weights reflecting the strength of the relation between the nodes. Weights are calculated as a function of (i) the time interval separating the two events, (ii) the spatial distance between the events, and (iii) the magnitude of the earlier of the two events. Different ways of addressing the weight components are explored, and their implications for the properties of the produced networks are analyzed. The resulting networks are then characterized in terms of degree and weight distributions. Subsequently, the distribution of system transitions is determined for all the edges connecting related events in the network. Two- and three-dimensional diagrams are constructed to reflect transition distributions for each set of events. Networks are generated for successive temporal windows of different sizes, and the evolution of (a) network properties and (b) system transition distributions is followed over time and compared to the timeline of documented geologic processes. Applications concerning volcanic seismicity on the Big Island of Hawaii show that this approach is capable of revealing novel aspects of change occurring in the volcanic system on different scales in time and space.
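The edge-construction rule described above (time interval, spatial distance, magnitude of the earlier event) can be sketched as follows. The exponential-decay weighting and the two scale parameters are one illustrative choice among the several weight definitions the paper explores; they are not the authors' specific formula.

```python
import math

def build_event_network(events, time_scale, dist_scale):
    """Build a weighted directed network from point-like seismic events.

    Each event is a tuple (t, x, y, z, magnitude). An edge runs from every
    earlier event to every later one, weighted by decaying functions of the
    time interval and spatial distance, scaled by the earlier event's
    magnitude (illustrative exponential-decay weighting)."""
    edges = {}
    for i, (t1, x1, y1, z1, m1) in enumerate(events):
        for j, (t2, x2, y2, z2, _) in enumerate(events):
            if t2 <= t1:
                continue  # edge direction follows the temporal succession
            dt = t2 - t1
            dist = math.dist((x1, y1, z1), (x2, y2, z2))
            weight = m1 * math.exp(-dt / time_scale) * math.exp(-dist / dist_scale)
            edges[(i, j)] = weight
    return edges
```

Degree and weight distributions, and the transition statistics discussed above, can then be read off the returned edge dictionary for each temporal window.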
Devos, Nicolas; Szövényi, Péter; Weston, David J; Rothfels, Carl J; Johnson, Matthew G; Shaw, A Jonathan
2016-07-01
The goal of this research was to investigate whether there has been a whole-genome duplication (WGD) in the ancestry of Sphagnum (peatmoss) or the class Sphagnopsida, and to determine if the timing of any such duplication(s) and patterns of paralog retention could help explain the rapid radiation and current ecological dominance of peatmosses. RNA sequencing (RNA-seq) data were generated for nine taxa in Sphagnopsida (Bryophyta). Analyses of frequency plots for synonymous substitutions per synonymous site (Ks) between paralogous gene pairs and reconciliation of 578 gene trees were conducted to assess evidence of large-scale or genome-wide duplication events in each transcriptome. Both Ks frequency plots and gene tree-based analyses indicate multiple duplication events in the history of the Sphagnopsida. The most recent WGD event predates divergence of Sphagnum from the two other genera of Sphagnopsida. Duplicate retention is highly variable across species, which might be best explained by local adaptation. Our analyses indicate that the last WGD could have been an important factor underlying the diversification of peatmosses and facilitated their rise to ecological dominance in peatlands. The timing of the duplication events and their significance in the evolutionary history of peat mosses are discussed. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
Explosion Monitoring with Machine Learning: An LSTM Approach to Seismic Event Discrimination
NASA Astrophysics Data System (ADS)
Magana-Zook, S. A.; Ruppert, S. D.
2017-12-01
The streams of seismic data that analysts examine to discriminate natural from man-made events will soon grow from gigabytes of data per day to exponentially larger rates. This is an interesting problem, as the requirement for real-time answers to questions of non-proliferation will remain the same, while the analyst pool cannot grow as fast as the data volume and velocity will. Machine learning is a tool that can solve the problem of seismic explosion monitoring at scale. Using machine learning, and Long Short-Term Memory (LSTM) models in particular, analysts can become more efficient by focusing their attention on signals of interest. From a global dataset of earthquake and explosion events, a model was trained to recognize the different classes of events, given their spectrograms. Optimal recurrent node counts and training iterations were found, and cross validation was performed to evaluate model performance. A 10-fold mean accuracy of 96.92% was achieved on a balanced dataset of 30,002 instances. Given that the model is 446.52 MB, it can be used to simultaneously characterize all incoming signals, both by researchers looking at events in isolation on desktop machines and at scale on all the nodes of a real-time streaming platform. LLNL-ABS-735911
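The 10-fold mean accuracy quoted above comes from standard k-fold cross validation: the dataset is split into k disjoint folds, each fold serves once as the test set, and the k accuracies are averaged. A minimal index-splitting sketch of that generic procedure (this is not the authors' code, and the names are illustrative):

```python
def k_fold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k-fold cross validation
    over n items: each index appears in exactly one test fold."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    idx = list(range(n))
    start = 0
    for size in fold_sizes:
        test = idx[start:start + size]
        train = idx[:start] + idx[start + size:]
        yield train, test
        start += size
```

For the study's balanced dataset one would shuffle the 30,002 instance indices first, train the LSTM on each train split, score it on the held-out fold, and report the mean of the 10 fold accuracies.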
Vanschoenwinkel, Bram; Mergeay, Joachim; Pinceel, Tom; Waterkeyn, Aline; Vandewaerde, Hanne; Seaman, Maitland; Brendonck, Luc
2011-01-01
Recent findings suggest a convergence of time scales between ecological and evolutionary processes, which is usually explained in terms of rapid microevolution resulting in evolution on ecological time scales. A similar convergence, however, can also emerge when slow ecological processes take place on evolutionary time scales. A good example of such a slow ecological process is the colonization of remote aquatic habitats by passively dispersed zooplankton. Using variation at the protein-coding mitochondrial COI gene, we investigated the balance between mutation and migration as drivers of genetic diversity in two Branchipodopsis fairy shrimp species (Crustacea, Anostraca) endemic to remote temporary rock pool clusters at the summits of isolated mountaintops in central South Africa. We showed that both species colonized the region almost simultaneously c. 0.8 My ago, but exhibit contrasting patterns of regional genetic diversity and demographic history. The haplotype network of the common B. cf. wolfi showed clear evidence of 11 long-distance dispersal events (up to 140 km), with five haplotypes shared among distant inselbergs, as well as some more spatially isolated derivatives. Similar patterns were not observed for B. drakensbergensis, presumably because this rarer species experienced a genetic bottleneck. We conclude that the observed genetic patterns reflect rare historic colonization events rather than frequent ongoing gene flow. Moreover, the high regional haplotype diversity combined with a high degree of haplotype endemicity indicates that evolutionary (mutation) and ecological (migration) processes in this system operate on similar time scales. PMID:22102865
Reaching extended length-scales with temperature-accelerated dynamics
NASA Astrophysics Data System (ADS)
Amar, Jacques G.; Shim, Yunsic
2013-03-01
In temperature-accelerated dynamics (TAD) a high-temperature molecular dynamics (MD) simulation is used to accelerate the search for the next low-temperature activated event. While TAD has been quite successful in extending the time scales of simulations of non-equilibrium processes, because the computational work scales approximately as the cube of the number of atoms, until recently only simulations of relatively small systems had been carried out. Recently, we have shown that by combining spatial decomposition with our synchronous sublattice algorithm, significantly improved scaling is possible. However, in this approach the size of activated events is limited by the processor size, and the dynamics is not exact. Here we discuss progress in developing an alternate approach in which high-temperature parallel MD, along with localized saddle-point (LSAD) calculations, is used to carry out TAD simulations without restricting the size of activated events, while keeping the dynamics "exact" within the context of harmonic transition-state theory. In tests of our LSAD method applied to Ag/Ag(100) annealing and Cu/Cu(100) growth simulations, we find significantly improved scaling of TAD while maintaining a negligibly small error in the energy barriers. Supported by NSF DMR-0907399.
Experimental characterization of extreme events of inertial dissipation in a turbulent swirling flow
Saw, E. -W.; Kuzzay, D.; Faranda, D.; Guittonneau, A.; Daviaud, F.; Wiertel-Gasquet, C.; Padilla, V.; Dubrulle, B.
2016-01-01
The three-dimensional incompressible Navier–Stokes equations, which describe the motion of many fluids, are the cornerstones of many physical and engineering sciences. However, it is still unclear whether they are mathematically well posed, that is, whether their solutions remain regular over time or develop singularities. Even though it was shown that singularities, if they exist, could only be rare events, they may induce additional energy dissipation by inertial means. Here, using measurements at the dissipative scale of an axisymmetric turbulent flow, we report estimates of such inertial energy dissipation and identify local events of extreme values. We characterize the topology of these extreme events and identify several main types. Most of them appear as fronts separating regions of distinct velocities, whereas events corresponding to focusing spirals, jets and cusps are also found. Our results highlight the non-triviality of turbulent flows at sub-Kolmogorov scales as possible footprints of singularities of the Navier–Stokes equations. PMID:27578459
NASA Astrophysics Data System (ADS)
Collow, A.; Bosilovich, M. G.; Koster, R. D.
2016-12-01
Over the past two decades a statistically significant increase in the frequency of summertime extreme precipitation events has been observed over the northeastern United States, the largest such increase in the US in terms of area and magnitude. In an effort to characterize synoptic-scale patterns and changes in the atmospheric circulation associated with extreme precipitation events in this region, atmospheric fields from the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) are composited on days that exceed the 90th percentile of precipitation from the CPC-Unified daily gauge-based precipitation observations. Changes over time in composites of sea level pressure, 500 hPa height, and the vertical profile of equivalent potential temperature indicate that the observed increase in extreme precipitation events is associated with extratropical cyclones, including cut-off low pressure and frontal systems. Analysis of the Eady maximum growth rate, an indicator of storm tracks, shows that storm tracks in recent years have shifted southward. In addition, mean summertime transient meridional winds have decreased over time, slowing baroclinic systems and causing stationary systems to become more frequent, in agreement with previous studies examining blocking due to high pressure systems. The Atlantic Ocean provides a significant supply of moisture that converges over the region when a cyclonic circulation is situated to the south, and the statistically significant increase in the Eady maximum growth rate there over time provides an increasingly favorable thermodynamic environment for extreme precipitation events.
Improved Strength and Damage Modeling of Geologic Materials
NASA Astrophysics Data System (ADS)
Stewart, Sarah; Senft, Laurel
2007-06-01
Collisions and impact cratering events are important processes in the evolution of planetary bodies. The time and length scales of planetary collisions, however, are inaccessible in the laboratory and require the use of shock physics codes. We present the results from a new rheological model for geological materials implemented in the CTH code [1]. The 'ROCK' model includes pressure, temperature, and damage effects on strength, as well as acoustic fluidization during impact crater collapse. We demonstrate that the model accurately reproduces final crater shapes, tensile cracking, and damaged zones from laboratory to planetary scales. The strength model requires basic material properties; hence, the input parameters may be benchmarked to laboratory results and extended to planetary collision events. We show the effects of varying material strength parameters, which are dependent on both scale and strain rate, and discuss choosing appropriate parameters for laboratory and planetary situations. The results are a significant improvement in models of continuum rock deformation during large scale impact events. [1] Senft, L. E., Stewart, S. T. Modeling Impact Cratering in Layered Surfaces, J. Geophys. Res., submitted.
Extreme Precipitation and High-Impact Landslides
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Adler, Robert; Huffman, George; Peters-Lidard, Christa
2012-01-01
It is well known that extreme or prolonged rainfall is the dominant trigger of landslides; however, there remain large uncertainties in characterizing the distribution of these hazards and meteorological triggers at the global scale. Researchers have evaluated the spatiotemporal distribution of extreme rainfall and landslides at local and regional scales primarily using in situ data, yet few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This research uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from Tropical Rainfall Measuring Mission (TRMM) data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and rainfall-triggered landslides globally. The GLC, available from 2007 to the present, contains information on reported rainfall-triggered landslide events around the world, drawn from online media reports, disaster databases, etc. When evaluating this database, we observed that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology.
This research also considers the sources for this extreme rainfall, citing teleconnections from ENSO as likely contributors to regional precipitation variability. This work demonstrates the potential for using satellite-based precipitation estimates to identify potentially active landslide areas at the global scale in order to improve landslide cataloging and quantify landslide triggering at daily, monthly and yearly time scales.
NASA Technical Reports Server (NTRS)
Vanhollebeke, M. A.; Wang, J. R.; Mcdonald, F. B.
1974-01-01
This catalogue of solar cosmic ray events has been prepared for the use of solar physicists and other interested scientists. It contains some 185 solar particle events detected by the Goddard Space Flight Center Cosmic Ray Experiments on IMPs IV and V (Explorers 34 and 41) for the period May 1967 - December 1972. The data are presented in the form of hourly averages for three proton energy intervals: 0.9-1.6 MeV, 6-20 MeV, and 20-80 MeV. In addition, the time histories of 0.5-1.1 MeV electrons are shown on a separate scale. To assist in the identification of related solar events, the onset time of the electron event is indicated. The details of the instrumentation and detector techniques are described, and further descriptions of the data reduction procedure and the time-history plots are given.
NASA Astrophysics Data System (ADS)
Ruin, Isabelle; Boudevillain, Brice; Creutin, Jean-Dominique; Lutoff, Céline
2013-04-01
Western Mediterranean regions are favorable locations for heavy precipitating events. In recent years, many of them resulted in destructive flash floods with extended damage and loss of life: Nîmes 1988, Vaison-la-Romaine 1992, Aude 1999, and Gard 2002 and 2005. Because of the suddenness of the rise in water levels and the limited forecasting predictability, flash floods often surprise people in the midst of their daily activity and force them to react in a very limited amount of time. In such fast-evolving events, impacts depend not just on compositional variables such as the magnitude of the flood event and the vulnerability of those affected, but also on contextual factors such as its location and timing (night, rush hours, working hours...). Those contextual factors can alter the scale and social distribution of impacts and of vulnerability to them. In the case of flooding fatalities, for instance, the elderly are often said to be the most vulnerable, but when fatalities are mapped against basin size and response time, it has been shown that in fact young adults are the most likely to be killed in flash flooding of small catchments, whereas the elderly are the most frequent victims of large-scale fluvial flooding. Further investigations in the Gard region have shown that this tendency could be explained by a difference of attitude across ages with respect to mobility related to daily-life routines and constraints. According to a survey of intended behavior, professionals appear to be less prone to adapting their daily activities and mobility to rapidly changing environmental conditions than non-professionals. Nevertheless, even if this appears as a tendency in both the analysis of limited data on death circumstances and in intended-behavior surveys, behavioral verification is very much needed.
Understanding how many and why people decide to travel in hazardous weather conditions, and how they adapt (or not) their activities and schedules in response to environmental perturbations, requires an integrated approach, sensitive to the spatial and temporal dynamics of geophysical hazards and of responses to them. Such integrated approaches to the Coupled Human and Natural System have been more common in the environmental change arena than in risk studies. Nevertheless, examining interactions between routine activity-travel patterns and hydro-meteorological dynamics in the context of flash flood events resulted in the development of a space-time scale approach that brought new insights to vulnerability and risk studies. This scaling approach requires suitable data sets including information about the meteorological and local flooding dynamics, the perception of environmental cues, the changes in individuals' activity-travel patterns, and the social interactions at the place and time where the actions were performed. Even if these types of data are commonly collected in various disciplinary research contexts, they are seldom collected all together and in the context of post-disaster studies. This paper describes the methodological developments of our approach and applies our data collection method to the case of the June 15th, 2010 flash flood events in the Draguignan area (Var, France). This flash flood event offers a typical example for studying the relation between flood dynamics and social response in the context of a sudden degradation of the environment.
Short-term rainfall: its scaling properties over Portugal
NASA Astrophysics Data System (ADS)
de Lima, M. Isabel P.
2010-05-01
The characterization of rainfall at a variety of space and time scales usually demands that data of different origins and resolutions are explored. Different tools and methodologies can be used for this purpose. In regions where the spatial variation of rain is marked, the study of the scaling structure of rainfall can lead to a better understanding of the types of events affecting a specific area, which is essential for many engineering applications. The relevant factors affecting rain variability, in time and space, can lead to contrasting statistics which should be carefully taken into account in design procedures and decision-making processes. One such region is mainland Portugal; the territory is located in the transitional region between the sub-tropical anticyclone and the sub-polar depression zones and is characterized by strong north-south and east-west rainfall gradients. The spatial distribution and seasonal variability of rain are particularly influenced by the characteristics of the global circulation. One specific feature is the Atlantic origin of many synoptic disturbances in the context of the regional geography (e.g. latitude, orography, oceanic and continental influences). Thus, aiming to investigate the statistical signature of rain events of different origins, resulting from the large number of mechanisms and factors affecting the rainfall climate over Portugal, scale-invariant analyses of the temporal structure of rain at several locations in mainland Portugal were conducted. The study used short-term rainfall time series. Relevant scaling ranges were identified and characterized that help clarify the small-scale behaviour and statistics of this process.
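A scale-invariance analysis of the kind described above typically examines how the statistical moments of the aggregated rain-rate series vary with the aggregation scale: within a scaling range, the q-th moment follows a power law in scale, and the curvature of the fitted exponent K(q) in q separates monofractal from multifractal behaviour. A minimal sketch under that assumption (function names and the aggregation scheme are illustrative, not the author's specific method):

```python
import math

def moment_scaling(series, q, scales):
    """(log2 scale, log2 moment) points for the q-th moment of the series
    aggregated (averaged) over non-overlapping windows of each scale."""
    points = []
    for s in scales:
        n = len(series) // s
        coarse = [sum(series[i * s:(i + 1) * s]) / s for i in range(n)]
        moment = sum(abs(v) ** q for v in coarse) / n
        points.append((math.log2(s), math.log2(moment)))
    return points

def fitted_slope(points):
    """Least-squares slope of log-moment vs log-scale, i.e. K(q)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

# A constant series is trivially scale invariant: K(q) = 0 for all q.
flat = moment_scaling([1.0] * 1024, 2.0, [1, 2, 4, 8, 16])
```

For real rain-rate records, fitting K(q) over the scaling ranges identified in the study and checking whether it is linear or convex in q is one common way to characterize the small-scale statistics.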
Shock induced crystallization of amorphous Nickel powders
NASA Astrophysics Data System (ADS)
Cherukara, Mathew; Strachan, Alejandro
2015-06-01
Recent experimental work has shown the efficacy of amorphous Ni/crystalline Al composites as energetic materials, with flame velocities twice that of a comparable crystalline Ni/crystalline Al system. Of further interest are the recrystallization mechanisms in the pure amorphous Ni powders, both thermally induced and mechanically induced. We present large-scale molecular dynamics simulations of shock-induced recrystallization in loosely packed amorphous nickel powders. We study the time-dependent nucleation and growth processes by holding the shocked samples at the induced pressures and temperatures for extended periods following the passage of the shock (up to 6 ns). We find that the nanostructure of the recrystallized Ni and the time scales of recrystallization depend on the piston velocity. At low piston velocities, nucleation events are rare, leading to long incubation times and a relatively coarse nanostructure. At higher piston velocities, local variations in temperature due to jetting phenomena and void collapse give rise to multiple nucleation events on time scales comparable to the passage of the shock wave, leading to the formation of a fine-grained nanostructure. Interestingly, we observe that the nucleation and growth process occurs in two steps, with the first nuclei crystallizing into the BCC structure before evolving over time into the expected FCC structure. U.S. Defense Threat Reduction Agency, HDTRA1-10-1-0119 (Program Manager Suhithi Peiris).
Order parameter aided efficient phase space exploration under extreme conditions
NASA Astrophysics Data System (ADS)
Samanta, Amit
Physical processes in nature exhibit disparate time scales; for example, the time scales associated with processes like phase transitions, various manifestations of creep, and sintering of particles are often much longer than the time the system spends in the metastable states. The transition times associated with such events are also orders of magnitude longer than the time scales associated with the vibration of atoms. Thus, an atomistic simulation of such transition events is a challenging task. Consequently, efficient exploration of configuration space and identification of metastable structures in condensed-phase systems is challenging. In this talk I will illustrate how we can define a set of coarse-grained variables, or order parameters, and use these to systematically and efficiently steer a system containing thousands or millions of atoms over different parts of the configuration space. This order-parameter-aided sampling can be used to identify metastable states and transition pathways and to understand the mechanistic details of complex transition processes. I will illustrate how this sampling scheme can be used to study phase transition pathways and phase boundaries in prototypical materials, like SiO2 and Cu, under high-pressure conditions. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Studying Regional Wave Source Time Functions Using a Massive Automated EGF Deconvolution Procedure
NASA Astrophysics Data System (ADS)
Xie, J.; Schaff, D. P.
2010-12-01
Reliably estimated source time functions (STF) from high-frequency regional waveforms, such as Lg, Pn and Pg, provide important input for seismic source studies, explosion detection, and minimization of parameter trade-offs in attenuation studies. The empirical Green’s function (EGF) method can be used for estimating the STF, but it requires a strict recording condition: waveforms from pairs of events that are similar in focal mechanism but different in magnitude must be recorded on-scale at the same stations for the method to work. Searching for such waveforms can be very time-consuming, particularly for regional waves that contain complex path effects and have reduced S/N ratios due to attenuation. We have developed a massive, automated procedure to conduct inter-event waveform deconvolution calculations for many candidate event pairs. The procedure automatically evaluates the “spikiness” of the deconvolutions by calculating their “sdc”, which is defined as the peak divided by the background value. The background value is calculated as the mean absolute value of the deconvolution, excluding 10 s around the source time function. When the sdc values are about 10 or higher, the deconvolutions are found to be sufficiently spiky (pulse-like), indicating similar path Green’s functions and good estimates of the STF. We have applied this automated procedure to Lg waves and full regional wavetrains from 989 M ≥ 5 events in and around China, calculating about a million deconvolutions. Of these we found about 2700 deconvolutions with sdc greater than 9 which, if they have a sufficiently broad frequency band, can be used to estimate the STFs of the larger events. We are currently refining our procedure, as well as the estimated STFs. We will infer source scaling using the STFs. We will also explore the possibility that the deconvolution procedure could complement cross-correlation in a real-time event-screening process.
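The “sdc” spikiness measure this abstract defines (deconvolution peak divided by the mean absolute background, excluding 10 s around the source time function) is straightforward to compute. In this sketch the excluded window is assumed to be centred on the peak sample, a detail the abstract does not specify:

```python
import numpy as np

def sdc(decon, dt, stf_window=10.0):
    """Peak of |deconvolution| divided by the mean absolute background,
    excluding `stf_window` seconds around the peak (assumed STF location).
    `dt` is the sample interval in seconds."""
    peak_idx = int(np.argmax(np.abs(decon)))
    half = int(round(stf_window / (2.0 * dt)))
    mask = np.ones(len(decon), dtype=bool)
    mask[max(0, peak_idx - half): peak_idx + half + 1] = False
    background = np.mean(np.abs(decon[mask]))
    return float(np.abs(decon[peak_idx]) / background)
```

A pulse-like deconvolution sitting in low-level noise yields an sdc well above the abstract's threshold of about 10, while pure noise stays far below it.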
Agent based reasoning for the non-linear stochastic models of long-range memory
NASA Astrophysics Data System (ADS)
Kononovicius, A.; Gontis, V.
2012-02-01
We extend Kirman's model by introducing variable event time scale. The proposed flexible time scale is equivalent to the variable trading activity observed in financial markets. Stochastic version of the extended Kirman's agent based model is compared to the non-linear stochastic models of long-range memory in financial markets. The agent based model providing matching macroscopic description serves as a microscopic reasoning of the earlier proposed stochastic model exhibiting power law statistics.
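For readers unfamiliar with the underlying model, a minimal fixed-time-scale simulation of Kirman's herding dynamics (without the variable event time scale this paper introduces) might be sketched as follows; the values of N, eps and h are illustrative only:

```python
import numpy as np

def kirman(N=100, eps=0.01, h=0.05, steps=5000, seed=1):
    """Two-state herding model: an agent switches group either spontaneously
    (rate eps) or via pairwise recruitment (rate h * size of the other group).
    Returns the time series of the fraction of agents in state A."""
    rng = np.random.default_rng(seed)
    n = N // 2                       # agents currently in state A
    frac = np.empty(steps)
    for t in range(steps):
        # relative rates of a one-agent step up (join A) or down (leave A)
        rate_up = (N - n) * (eps + h * n) / N
        rate_down = n * (eps + h * (N - n)) / N
        u = rng.random() * (rate_up + rate_down)
        n += 1 if u < rate_up else -1
        n = min(max(n, 0), N)
        frac[t] = n / N
    return frac
```

For small eps relative to h, the occupancy fraction spends long stretches near 0 or 1 — the herding behaviour that stochastic models of long-range memory in financial markets build upon.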
Patterns of precipitation: Fine-scale rain dynamics in the South of England
NASA Astrophysics Data System (ADS)
Callaghan, Sarah
2010-05-01
The consensus in the climate change community is that one of the (many) effects of climate change will be that the nature of rain events will change, and in all likelihood, they will become more extreme. Currently, most long-term rain rate data sets are hourly (or longer) rain accumulations, so investigating the rain events that occur for less than 0.01% (52.5 minutes) of a year is not possible. Rain datasets do exist with smaller temporal resolution, but these are either not continuous, or simply have not been in operation long enough to investigate any trends in climate change. The Chilbolton Observatory in the south of England is one of the world's most advanced meteorological radar experimental facilities, and is home to the world's largest fully steerable meteorological radar, the Chilbolton Advanced Meteorological Radar (CAMRa). It also hosts a wide range of meteorological and atmospheric sensing instruments, including cameras, lidars, radiometers and a wide selection of different types of rain gauges. The UK atmospheric science, hydrology and Earth Observation communities use the instruments located at Chilbolton to conduct research in weather, flooding and climate. This often involves observations of meteorological phenomena operating below the current resolution of (forecasting and climate) models and work on their effective parameterisation. The Chilbolton datasets contain a continuous drop counting rain gauge time series at 10 seconds integration time, spanning from January 2001 to the present. Though the length of the time series is not sufficient to confidently identify any effects of climate change, the time resolution is sufficient to investigate the differences in the extreme values of rain events over the nine years of the dataset, characterising the inter-annual and seasonal variability. 
Changes in the occurrence of different rain events have also been investigated by looking at event and inter-event durations to determine if there is any change in the relative number of stratiform and convective events over the time period. Knowledge of the fine scale variability of rain (both in the spatial and temporal domains) is important for the development of accurate models for small-scale forecasting, as well as models for the implementation and operation of rain affected systems, such as microwave radio communications and flood mitigation. As the rain gauge measurements made at Chilbolton will continue for the foreseeable future, these datasets will become increasingly valuable, as they provide a "ground-truth" that can be compared with the results of climate and other models.
The Influence Of Antecedent Conditions On Flood Risk In Sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Bischiniotis, K.
2017-12-01
Flood risk management has traditionally focused on long-term flood protection measures. However, due to high investment costs many lower-income countries are not able to afford hard infrastructure that provides the desired safety levels. Consequently, timely warning of extreme events is crucial for risk mitigation in these places. Most flood warning systems have predominantly focused on precipitation as the main predictive factor, with lead times of hours or days. Nevertheless, other factors such as anomalous positive water storage, soil saturation and evapotranspiration also affect the flood build-up period. Gaining insight into the processes occurring during this period can increase warning lead times, resulting in more effective preparation. This study follows a pragmatic approach to analyse the hydro-meteorological pre-conditions of 501 damaging floods over the period 1980 to 2010 in sub-Saharan Africa. These are separated into a) weather-scale (0-7 days) and b) seasonal-scale conditions (up to 6 months) before each event, such that the two periods do not overlap. The 7-day preceding precipitation (PRE7) and the Standardized Precipitation Evapotranspiration Index (SPEI) reflect the conditions in the two time-scale domains, respectively. Using the flood onset date and the location derived from the NatCatSERVICE database, the antecedent conditions of each flood are systematically compared to the same conditions during the other years of the dataset, during which no floods were reported. Results indicate that high PRE7 does not by itself explain flood generation, since there were several cases where precipitation events of similar magnitude did not lead to flooding. The SPEI at the end of the flood onset month seems to be a good flood-monitoring tool, as in most cases (80% of the floods) it reflects the wet conditions well.
The SPEIs for different averaging times prior to the flood events also show that many floods were preceded by wet conditions (70%, 65%, and 57% for averaging times of 1, 3, and 6 months, respectively). Finally, we show that bringing together weather- and seasonal-scale conditions can result in an increased flooding likelihood, which in turn might help humanitarian organizations and decision-makers extend the period of preventive flood risk management planning.
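The weather-scale diagnostic used here, the 7-day preceding precipitation, reduces to a windowed sum. The percentile comparison below is a simplified, assumed stand-in for the study's systematic comparison against non-flood years, not its actual procedure:

```python
import numpy as np

def pre7(daily_precip, onset_index, days=7):
    """Total precipitation over the `days` days before the flood onset."""
    start = max(0, onset_index - days)
    return float(np.sum(daily_precip[start:onset_index]))

def pre7_percentile(daily_precip, onset_index, days=7):
    """Percent of all same-length windows in the record whose total is
    less than or equal to the pre-flood total (a crude unusualness rank)."""
    totals = np.convolve(daily_precip, np.ones(days), mode="valid")
    return float(np.mean(totals <= pre7(daily_precip, onset_index, days)) * 100.0)
```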
Role of dynamics in enzyme catalysis: substantial versus semantic controversies.
Kohen, Amnon
2015-02-17
CONSPECTUS: The role of the enzyme's dynamic motions in catalysis is at the center of heated contemporary debates among both theoreticians and experimentalists. Resolving these apparent disputes is of both intellectual and practical importance: incorporation of enzyme dynamics could be critical for any calculation of enzymatic function and may have profound implications for structure-based drug design and the design of biomimetic catalysts. Analysis of the literature suggests that while part of the dispute may reflect substantial differences between theoretical approaches, much of the debate is semantic. For example, the term "protein dynamics" is often used by some researchers when addressing motions that are in thermal equilibrium with their environment, while other researchers only use this term for nonequilibrium events. The latter cases are those in which thermal energy is "stored" in a specific protein mode and "used" for catalysis before it can dissipate to its environment (i.e., "nonstatistical dynamics"). This terminology issue aside, a debate has arisen among theoreticians around the roles of nonstatistical vs statistical dynamics in catalysis. However, the author knows of no experimental findings available today that examined this question in enzyme catalyzed reactions. Another source of perhaps nonsubstantial argument might stem from the varying time scales of enzymatic motions, which range from seconds to femtoseconds. Motions at different time scales play different roles in the many events along the catalytic cascade (reactant binding, reprotonation of reactants, structural rearrangement toward the transition state, product release, etc.). In several cases, when various experimental tools have been used to probe catalytic events at differing time scales, illusory contradictions seem to have emerged. In this Account, recent attempts to sort the merits of those questions are discussed along with possible future directions.
A possible summary of current studies could be that enzyme, substrate, and solvent dynamics contribute to enzyme catalyzed reactions in several ways: first via mutual "induced-fit" shifting of their conformational ensemble upon binding; then via thermal search of the conformational space toward the reaction's transition-state (TS) and the rare event of the barrier crossing toward products, which is likely to be on faster time scales than the first and following events; and finally via the dynamics associated with product release, which are rate-limiting for many enzymatic reactions. From a chemical perspective, close to the TS, enzymatic systems seem to stiffen, restricting motions orthogonal to the chemical coordinate and enabling dynamics along the reaction coordinate to occur selectively. Studies of how enzymes evolved to support those efficient dynamics at various time scales are still in their infancy, and further experiments and calculations are needed to reveal these phenomena in both enzymes and uncatalyzed reactions. PMID:25539442
ERIC Educational Resources Information Center
Kovalchik, Stephanie A.; Martino, Steven C.; Collins, Rebecca L.; Shadel, William G.; D'Amico, Elizabeth J.; Becker, Kirsten
2018-01-01
Ecological momentary assessment (EMA) is a popular assessment method in psychology that aims to capture events, emotions, and cognitions in real time, usually repeatedly throughout the day. Because EMA typically involves more intensive monitoring than traditional assessment methods, missing data are commonly an issue and this missingness may bias…
Extreme multi-basin flooding linked with extra-tropical cyclones
NASA Astrophysics Data System (ADS)
De Luca, Paolo; Hillier, John K.; Wilby, Robert L.; Quinn, Nevil W.; Harrigan, Shaun
2017-11-01
Fluvial floods are typically investigated as ‘events’ at the single basin-scale, hence flood management authorities may underestimate the threat of flooding across multiple basins driven by large-scale and nearly concurrent atmospheric event(s). We pilot a national-scale statistical analysis of the spatio-temporal characteristics of extreme multi-basin flooding (MBF) episodes, using peak river flow data for 260 basins in Great Britain (1975-2014), a sentinel region for storms impacting northwest and central Europe. During the most widespread MBF episode, 108 basins (~46% of the study area) recorded annual maximum (AMAX) discharge within a 16 day window. Such episodes are associated with persistent cyclonic and westerly atmospheric circulations, atmospheric rivers, and precipitation falling onto previously saturated ground, leading to hydrological response times <40 h and documented flood impacts. Furthermore, peak flows tend to occur after 0-13 days of very severe gales causing combined and spatially-distributed, yet differentially time-lagged, wind and flood damages. These findings have implications for emergency responders, insurers and contingency planners worldwide.
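The headline statistic above (108 basins recording their annual maximum flow within a 16-day window) corresponds to a sliding-window count over per-basin AMAX dates. A minimal version, assuming one day-of-record integer per basin, might be:

```python
import numpy as np

def widest_mbf_episode(amax_days, window=16):
    """Largest number of basins whose annual-maximum day falls inside
    any `window`-day period [d, d + window)."""
    days = np.sort(np.asarray(amax_days))
    best = 0
    for d in days:
        best = max(best, int(np.sum((days >= d) & (days < d + window))))
    return best
```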
NASA Technical Reports Server (NTRS)
Porro, A. Robert
2000-01-01
A series of dynamic flow field pressure probes were developed for use in large-scale supersonic wind tunnels at NASA Glenn Research Center. These flow field probes include pitot, static, and five-hole conical pressure probes that are capable of capturing fast-acting flow field pressure transients that occur on a millisecond time scale. The pitot and static probes can be used to determine local Mach number time histories during a transient event. The five-hole conical pressure probes are used primarily to determine local flow angularity, but can also determine local Mach number. These probes were designed, developed, and tested at the NASA Glenn Research Center. They were also used in a NASA Glenn 10-by 10-Foot Supersonic Wind Tunnel (SWT) test program where they successfully acquired flow field pressure data in the vicinity of a propulsion system during an engine compressor stall and inlet unstart transient event. Details of the design, development, and subsequent use of these probes are discussed in this report.
The Legacy of Episodic Climatic Events in Shaping Temperate, Broadleaf Forests
NASA Technical Reports Server (NTRS)
Pederson, Neil; Dyer, James M.; McEwan, Ryan W.; Hessl, Amy E.; Mock, Cary J.; Orwig, David A.; Rieder, Harald E.; Cook, Benjamin I.
2015-01-01
In humid, broadleaf-dominated forests where gap dynamics and partial canopy mortality appear to dominate the disturbance regime at local scales, paleoecological evidence shows alteration at regional scales associated with climatic change. Yet little evidence of these broad-scale events exists in extant forests. To evaluate the potential for the occurrence of large-scale disturbance, we used 76 tree-ring collections spanning approx. 840 000 sq km and 5327 tree recruitment dates spanning approx. 1.4 million sq km across the humid eastern United States. Rotated principal component analysis indicated a common growth pattern of a simultaneous reduction in competition in 22 populations across 61 000 sq km. Growth-release analysis of these populations reveals an intense and coherent canopy disturbance from 1775 to 1780, peaking in 1776. The resulting time series of canopy disturbance is so poorly described by a Gaussian distribution that it can be described as ''heavy tailed,'' with most of the years from 1775 to 1780 comprising the heavy-tail portion of the distribution. Historical documents provide no evidence that hurricanes or ice storms triggered the 1775-1780 event. Instead, we identify a significant relationship between prior drought and years with elevated rates of disturbance, with an intense drought occurring from 1772 to 1775. We further find that years with high rates of canopy disturbance have a propensity to create larger canopy gaps, indicating repeated opportunities for rapid change in species composition beyond the landscape scale. Evidence of elevated, regional-scale disturbance reveals how rare events can potentially alter system trajectory: a substantial portion of the old-growth forests examined here originated or were substantially altered more than two centuries ago following events lasting just a few years.
Our recruitment data, comprising at least 21 species including several shade-intolerant species, document a pulse of tree recruitment at the subcontinental scale during the late 1600s, suggesting that this event was severe enough to open large canopy gaps. These disturbances and their climatic drivers support the hypothesis that punctuated, episodic climatic events impart a legacy in broadleaf-dominated forests centuries after their occurrence. Given projections of future drought, these results also reveal the potential for abrupt, meso- to large-scale forest change in broadleaf-dominated forests over future decades.
NASA Astrophysics Data System (ADS)
Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.
2017-12-01
Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), for which adequate forecasting tools do not exist. Therefore, the goal of this newly funded project is an enhanced understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models and workshops that engage stakeholders (water resource managers, emergency managers and tribal environmental professionals) in the co-production of knowledge.
The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Other outcomes include statistical and co-production frameworks, which could be applied to other meteorological extremes, across all time scales, and in other parts of the world to increase resilience to extreme meteorological events.
Morphological response of a large-scale coastal blowout to a strong magnitude transport event
NASA Astrophysics Data System (ADS)
Delgado-Fernandez, Irene; Jackson, Derek; Smith, Alexander; Smyth, Thomas
2017-04-01
Large-scale blowouts are fundamental features of many coastal dune fields in temperate areas around the world. These distinctive erosional (mostly unvegetated) landform features are often characterised by a significant depression area and a connected depositional lobe at their downwind edges. These areas also provide important transport corridors to inland parts of the dune system and can provide ideal habitats for specialist flora and fauna as well as helping to enhance landscape diversity. The actual morphology and shape/size of blowouts can significantly modify the overlying atmospheric boundary layer of the wind, influencing wind flow steering and intensity within the blowout, and ultimately aeolian sediment transport. While investigations of morphological changes within blowouts have largely focused on the medium (months) to long (annual/decadal) temporal scale, studies of aeolian transport dynamics within blowouts have predominantly focused on the short-term (event) scale. Work on wind-transport processes in blowouts is still relatively rare, with ad-hoc studies providing only limited information on airflow and aeolian transport. Large-scale blowouts are characterised by elongated basins that can reach hundreds of meters, potentially resulting in airflow and transport dynamics that are very different from their smaller-scale counterparts. This research focuses on a short-term, strong wind event measured at the Devil's Hole blowout (Sefton dunes, NW England), a large-scale blowout feature approximately 300 m in length and 100 m in width. In situ measurements of airflow and aeolian transport were collected during a short-term experiment on the 22nd of October 2015. A total of twenty-three 3D ultrasonic anemometers, together with sand traps and Wenglor sensors, were deployed in a spatial grid covering the distal end of the basin, the walls, and the depositional lobe.
Terrestrial laser scanning (TLS) was used to quantify morphological changes within the blowout before and after the strong-magnitude transport event. This allowed, for the first time, examination of the morphological response as a direct result of a high-energy wind event as it passed through a large-scale blowout. Results indicate strong steering and acceleration of the wind along the blowout basin and up the south wall, opposite to the incident regional winds. These accelerated flows generated very strong transport rates of up to 3 g/s along the basin, and moderately strong transport rates of up to 1.5 g/s up the steep north wall. The coupling of high-frequency wind events and transport response, together with the topographic changes defined by the TLS data, allows the morphological evolution of a coastal blowout landform to be connected with the localised driving processes.
NASA Astrophysics Data System (ADS)
Lavely, Adam; Vijayakumar, Ganesh; Brasseur, James; Paterson, Eric; Kinzel, Michael
2011-11-01
Using large-eddy simulation (LES) of the neutral and moderately convective atmospheric boundary layers (NBL, MCBL), we analyze the impact of the coherent turbulence structure of the atmospheric surface layer on the short-time statistics that are commonly collected from wind turbines. The incoming winds are conditionally sampled, with a filtering and thresholding algorithm, into coherent events of high/low horizontal and vertical velocity fluctuations. The time scales of these events are ~5-20 blade rotations and are roughly twice as long in the MCBL as in the NBL. Horizontal velocity events are associated with greater variability in rotor power, lift and blade-bending moment than vertical velocity events. The variability in the industry-standard 10 minute average for rotor power, sectional lift and wind velocity had a standard deviation of ~5% relative to the ``infinite time'' statistics for the NBL and ~10% for the MCBL. We conclude that the turbulence structure associated with atmospheric stability state contributes considerable, quantifiable variability to wind turbine statistics. Supported by NSF and DOE.
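The conditional-sampling step described above, filtering and thresholding velocity fluctuations into coherent events, can be illustrated with a simple run-length detector; the 1.5-standard-deviation threshold and 5-sample minimum duration are placeholder values, not the study's settings:

```python
import numpy as np

def coherent_events(u, threshold_sd=1.5, min_len=5):
    """Flag contiguous runs where the velocity fluctuation u' = u - mean(u)
    exceeds `threshold_sd` standard deviations for at least `min_len` samples.
    Returns a list of (start, end) index pairs (end exclusive)."""
    up = u - u.mean()
    mask = up > threshold_sd * up.std()
    events, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            if i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(mask) - start >= min_len:
        events.append((start, len(mask)))
    return events
```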
Catchment dynamics and social response during flash floods
NASA Astrophysics Data System (ADS)
Creutin, J. D.; Lutoff, C.; Ruin, I.; Scolobig, A.; Créton-Cazanave, L.
2009-04-01
The objective of this study is to examine how the current techniques for flash-flood monitoring and forecasting can meet the requirements of the population at risk to evaluate the severity of the flood and anticipate its danger. To this end, we identify the social response times for different social actions in the course of two well-studied flash flood events which occurred in France and Italy. We introduce a broad characterization of the event management activities into three types according to their main objective (information, organisation and protection). The activities are also classified into three other types according to the scale and nature of the human group involved (individuals, communities and institutions). The conclusions reached relate to (i) the characterisation of the social responses according to watershed scale and to the information available, and (ii) the appropriateness of the existing surveillance and forecasting tools to support the social responses. Our results suggest that representing the dynamics of the social response with just one number representing the average time for warning a population is an oversimplification. It appears that the social response time exhibits a parallel with the hydrological response time, diminishing with decreasing size of the relevant watershed. A second result is that the human groups have different capabilities of anticipation, apparently based on the nature of the information they use. Comparing watershed response times and social response times shows clearly that at scales of less than 100 km2, a number of actions were taken with response times comparable to the catchment response time. The implications for adapting the warning processes to social scales (individual or organisational scales) are considerable.
At small scales and for the implied anticipation times, the reliable, high-resolution description of the actual rainfall field becomes the major source of information for decision-making processes such as deciding between evacuating or advising people to stay home. This points to the need to improve the accuracy and quality control of real-time radar rainfall data, especially for extreme flash-flood-generating storms.
Solar-Panel Dust Accumulation and Cleanings
NASA Technical Reports Server (NTRS)
2005-01-01
Air-fall dust accumulates on the solar panels of NASA's Mars Exploration Rovers, reducing the amount of sunlight reaching the solar arrays. Pre-launch models predicted steady dust accumulation. However, the rovers have been blessed with occasional wind events that clear significant amounts of dust from the solar panels. This graph shows the effects of those panel-cleaning events on the amount of electricity generated by Spirit's solar panels. The horizontal scale is the number of Martian days (sols) after Spirit's Jan. 4, 2004, (Universal Time) landing on Mars. The vertical scale indicates output from the rover's solar panels as a fraction of the amount produced when the clean panels first opened. Note that the gradual declines are interrupted by occasional sharp increases, such as a dust-cleaning event on sol 420.
Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.
Soleimani, Hossein; Hensman, James; Saria, Suchi
2017-08-21
Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades-off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
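The abstaining decision policy described above can be illustrated with a generic cost-sensitive reject-option rule (Chow's rule). This is a minimal sketch, not the policy derived in the paper: the function name and the cost values are illustrative, and the event probability would in practice come from the joint model's predictive distribution.

```python
def decide(p_event, c_fp=1.0, c_fn=5.0, c_abstain=0.5):
    """Cost-sensitive decision with a reject option (Chow's rule).

    p_event   : estimated probability that the event will occur
    c_fp      : cost of an incorrect assessment (false alarm)
    c_fn      : cost of a missed or delayed detection
    c_abstain : cost of deferring the decision (e.g. waiting for more data)
    """
    risk_alarm = (1.0 - p_event) * c_fp      # expected cost of raising an alarm
    risk_no_alarm = p_event * c_fn           # expected cost of declaring "no event"
    if c_abstain < min(risk_alarm, risk_no_alarm):
        return "abstain"                     # both immediate actions are too risky
    return "alarm" if risk_alarm <= risk_no_alarm else "no-alarm"
```

The rule abstains exactly when deferring is cheaper than the expected cost of the best immediate action, which captures the stated trade-off between delayed detection and incorrect assessment.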
Donald A. Falk
2013-01-01
Contemporary climate change is driving transitions in many Madrean ecosystems, but the time scale of these changes is accelerated greatly by severe landscape disturbances such as wildfires and insect outbreaks. Landscape-scale disturbance events such as wildfires interact with prior disturbance patterns and landscape structure to catalyze abrupt transitions to novel...
Aliri, Jone; Muela, Alexander; Gorostiaga, Arantxa; Balluerka, Nekane; Aritzeta, Aitor; Soroa, Goretti
2018-01-01
The occurrence of stressful life events is a risk factor for psychopathology in adolescence. Depression is a problem of notable clinical importance that has a negative psychosocial impact on adolescents and which has considerable social, educational, and economic costs. The aim of this study was to examine the relationship between stressful life events and depressive symptomatology in adolescence, taking into account the effect that attachment representations may have on this relation. Participants were 1653 adolescents (951 girls) aged between 13 and 18 years. The sample was selected by means of a random sampling procedure based on the availability of schools to participate. Data were collected at two time points: attachment and stressful life events were assessed first, and symptoms of depression were evaluated eight to nine months later. Two time points were used in order to better analyze the mediating role of attachment security. Stressful life events were recorded using the Inventory of Stressful Life Events, attachment was evaluated by the Inventory of Parent and Peer Attachment (mother, father, and peer versions), and depressive symptomatology was assessed through the Children's Depression Scale. In all cases, the Basque version of these scales was used. The results indicated that attachment to parents was a mediating variable in the relationship between stressful life events and depressive symptomatology. Contrary to what we expected, the results indicate that stressful life events did not have a negative effect on peer attachment, and neither did the latter variable act as a mediator of the relationship between stressful life events and depressive symptoms. It can be concluded that attachment-based interventions may be especially useful for reducing depression symptoms among adolescents. The findings also suggest a role for interventions that target parent-child attachment relationships.
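The mediation logic tested in this study (stressful life events -> parental attachment -> depressive symptoms) can be sketched with a simple product-of-coefficients estimate of the indirect effect. This is an illustrative simplification, not the structural equation model the authors fitted; the function names and the toy sign conventions are assumptions.

```python
def ols_slope(x, y):
    """OLS slope of y on x (single predictor, intercept handled implicitly
    by centring both variables)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sum((a - mx) ** 2 for a in x)

def indirect_effect(stress, attachment, depression):
    """Product-of-coefficients (a*b) estimate of the mediated path
    stress -> attachment -> depression.

    a = slope of the mediator on the predictor;
    b = slope of the outcome on the mediator after partialling out the
        predictor from both (Frisch-Waugh step; constants drop out because
        ols_slope centres its inputs)."""
    a = ols_slope(stress, attachment)
    resid_m = [m - a * s for m, s in zip(attachment, stress)]
    c = ols_slope(stress, depression)
    resid_y = [d - c * s for d, s in zip(depression, stress)]
    b = ols_slope(resid_m, resid_y)
    return a * b
```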
NASA Astrophysics Data System (ADS)
Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry
2016-04-01
Possibilities of associating information from the four pillars constituting the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have long been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Based on the concept of overlaying waveform events with the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate these time scales, a dynamic product is envisioned that evolves while the results of the different technologies are being processed and compiled. The product would be available through the Secure Web Portal (SWP). In this presentation we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.
Bushmakin, A G; Cappelleri, J C; Chandran, A B; Zlateva, G
2013-01-01
The Fibromyalgia Impact Questionnaire (FIQ) is a patient-reported outcome that evaluates the impact of fibromyalgia (FM) on daily life. This study evaluated the relationships between the functional status of FM patients, measured with the FIQ at baseline, and median time to a clinically relevant pain reduction. Data were derived from two randomised, placebo-controlled trials that evaluated pregabalin 300, 450 and 600 mg/day for the treatment of FM. The Kaplan-Meier (nonparametric) method was applied to estimate median times to 'transient' and 'stable' events. The transient event was defined as a ≥ 27.9% improvement on an 11-point daily pain diary scale (0 = no pain, 10 = worst possible pain), and the stable event was defined as the mean of the daily improvements ≥ 27.9% relative to baseline over the subsequent study duration starting on the day of the transient event. A parametric model using time-to-event analysis was developed for evaluating the relationship between baseline FIQ score and the median time to these events. Median time was longer among patients treated with placebo relative to pregabalin for the transient events (11-12 days vs. 5-7 days) and stable events (86 days vs. 13-29 days). A significant association was observed between baseline FIQ scores and median time to transient and stable events (p < 0.001). Median times to events were similar between the studies. For transient pain reduction events, median times ranged from 3.0 to 4.5 days for baseline FIQ scores of 10, and 9.1-9.6 days for FIQ scores of 100; for stable pain reduction events, the median time ranged from 11.0 to 13.0 days and from 27.0 to 28.5 days for baseline FIQ scores of 10 and 100 respectively. Time to a clinically relevant reduction in pain was significantly associated with FM severity at baseline as measured by the FIQ. Such an analysis can inform patient and physician expectations in clinical practice. © 2012 Blackwell Publishing Ltd.
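The Kaplan-Meier estimate of a median time-to-event used in this study can be sketched as follows. This is a generic product-limit estimator, assuming the usual convention that events at a given time are counted against the full risk set at that time; it is not the authors' analysis code.

```python
import itertools

def km_median(times, events):
    """Kaplan-Meier (product-limit) estimate of the median time-to-event.

    times  : follow-up times
    events : 1 if the event occurred at that time, 0 if censored
    Returns the first time at which S(t) drops to <= 0.5, or None if the
    median is never reached."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    for t, grp in itertools.groupby(data, key=lambda te: te[0]):
        grp = list(grp)
        d = sum(e for _, e in grp)            # events observed at time t
        if d:
            surv *= 1.0 - d / n_at_risk       # product-limit step
        n_at_risk -= len(grp)                 # events and censorings leave
        if surv <= 0.5:
            return t
    return None                               # median not reached
```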
A Structure of Experienced Time
NASA Astrophysics Data System (ADS)
Havel, Ivan M.
2005-10-01
The subjective experience of time will be taken as a primary motivation for an alternative, essentially discontinuous conception of time. Two types of such experience will be discussed, one based on personal episodic memory, the other on the theoretical fine texture of experienced time below the threshold of phenomenal awareness. The former case implies a discrete structure of temporal episodes on a large scale, while the latter case suggests endowing psychological time with a granular structure on a small scale, i.e. interpreting it as a semi-ordered flow of smeared (not point-like) subliminal time grains. Only on an intermediate temporal scale would the subjectively felt continuity and fluency of time emerge. Consequently, there is no locally smooth mapping of phenomenal time onto the real number continuum. Such a model has certain advantages; for instance, it avoids counterintuitive interpretations of some neuropsychological experiments (e.g. Libet's measurement) in which the temporal order of events is crucial.
Detecting TLEs using a massive all-sky camera network
NASA Astrophysics Data System (ADS)
Garnung, M. B.; Celestin, S. J.
2017-12-01
Transient Luminous Events (TLEs) are large-scale optical events occurring in the upper atmosphere, from the tops of thunderclouds up to the ionosphere. TLEs may have important effects on local, regional, and global scales, and many features of TLEs are not yet fully understood [e.g., Pasko, JGR, 115, A00E35, 2010]. Moreover, meteor events have been suggested to play a role in sprite initiation by producing ionospheric irregularities [e.g., Qin et al., Nat. Commun., 5, 3740, 2014]. The French Fireball Recovery and InterPlanetary Observation Network (FRIPON, https://www.fripon.org/?lang=en) is a national all-sky 30 fps camera network designed to continuously detect meteor events. We seek to make use of this network to observe TLEs over unprecedented space and time scales (~1000×1000 km with continuous acquisition). To do so, we had to significantly modify FRIPON's triggering software FreeTure (https://github.com/fripon/freeture) while leaving the meteor detection capability uncompromised. FRIPON has great potential in the study of TLEs. Not only could it produce new results on the spatial and temporal distributions of TLEs over a very large area, it could also be used to validate and complement observations from future space missions such as ASIM (ESA) and TARANIS (CNES). In this work, we present an original image processing algorithm that can detect sprites using all-sky cameras while strongly limiting the frequency of false positives, and our ongoing work on sprite triangulation using the FRIPON network.
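The abstract does not detail the detection algorithm itself; as a hedged illustration only, a triggering criterion for short-lived optical events in a continuous video stream can be sketched by flagging frames whose brightness change exceeds a threshold derived from the typical frame-to-frame variation. The function, the per-frame scalar brightness input, and the threshold rule are all assumptions of this sketch, not FreeTure's actual method.

```python
def detect_transients(frames, k=3.0):
    """Flag frames whose mean-brightness jump exceeds the typical
    frame-to-frame variation by k standard deviations.

    frames : per-frame scalar brightness values (an assumption of this
             sketch; a real detector works on full images)."""
    diffs = [abs(b - a) for a, b in zip(frames, frames[1:])]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    thresh = mean + k * var ** 0.5
    # report the index of the later frame in each outlying pair
    return [i + 1 for i, d in enumerate(diffs) if d > thresh]
```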
Concurrent systems and time synchronization
NASA Astrophysics Data System (ADS)
Burgin, Mark; Grathoff, Annette
2018-05-01
In the majority of scientific fields, system dynamics is described assuming the existence of a unique time for the whole system. However, it has been established theoretically, for example in relativity theory and in the system theory of time, and validated experimentally, that there are different times and time scales in a variety of real systems - physical, chemical, biological, social, etc. In spite of this, there are no wide-ranging scientific approaches to the exploration of such systems. Therefore, the goal of this paper is to study systems with this property. We call them concurrent systems because in them processes can run, events can happen, and actions can be performed on different time scales. The problem of time synchronization is specifically explored.
Li, Huanhuan; Zou, Yingmin; Wang, Jiaqi; Yang, Xuelin
2016-01-01
Online game addiction (OGA) is becoming a significant problem worldwide. The aim of this study was to explore the incidence of OGA and the roles of stressful life events, avoidant coping styles (ACSs), and neuroticism in OGA. A total of 651 Chinese college students were selected by random cluster sampling. Subjects completed the Chinese version of Young's eight-item Internet Addiction Scale (CIAS), Online Game Cognition Addiction Scale (OGCAS), Revised Eysenck Personality Questionnaire Short Scale in Chinese (EPQ-RSC), Chinese College-student Stress Questionnaire, and Coping Style Questionnaire. Structural equation modeling (SEM) was used to explore the interactive effects of stressful life events, ACSs, and neuroticism on OGA. Of the 651 participants in the sample, 31 (4.8%) were identified as addicts. The incidence of OGA was two times higher for males than females. The addicts had markedly higher scores on the neuroticism subscale of the EPQ-RSC than non-addicts. Compared to non-addicts, addicts were more apt to use ACSs. Having an avoidant coping strategy mediated the effect of stressful life events on OGA. Furthermore, neuroticism moderated the indirect effect of stressful life events on OGA via ACSs. Applications of these findings to etiological research and clinical treatment programs are discussed.
Comparing Shock geometry from MHD simulation to that from the Q/A-scaling analysis
NASA Astrophysics Data System (ADS)
Li, G.; Zhao, L.; Jin, M.
2017-12-01
In large SEP events, ions can be accelerated at CME-driven shocks to very high energies. Spectra of heavy ions in many large SEP events show features such as roll-overs or spectral breaks. In some events, when the spectra are plotted in energy/nucleon they can be shifted relative to each other so that the spectra align. The amount of shift is charge-to-mass ratio (Q/A) dependent and varies from event to event. In the work of Li et al. (2009), the Q/A dependence of the scaling is related to the shock geometry when the CME-driven shock is close to the Sun. For events where multiple in-situ spacecraft observations exist, one may expect that different spacecraft are connected to different portions of the CME-driven shock that have different shock geometries, therefore yielding different Q/A dependences. At the same time, shock geometry can also be obtained from MHD simulations. This means we can compare the shock geometry from two completely different approaches: one from MHD simulation and the other from in-situ spectral fitting. In this work, we examine this comparison for selected events.
The Ramifications of Meddling with Systems Governed by Self-organized Critical Dynamics
NASA Astrophysics Data System (ADS)
Carreras, B. A.; Newman, D. E.; Dobson, I.
2002-12-01
Complex natural, as well as man-made, systems often exhibit characteristics similar to those seen in self-organized critical (SOC) systems. The concept of self-organized criticality brings together ideas of self-organization of nonlinear dynamical systems with the often-observed near-critical behavior of many natural phenomena. These phenomena exhibit self-similarities over extended ranges of spatial and temporal scales. In such systems, scale lengths may be described by fractal geometry, and time scales lead to 1/f-like power spectra. Natural applications include modeling the motion of tectonic plates, forest fires, magnetospheric dynamics, spin glass systems, and turbulent transport. In man-made systems, applications have included traffic dynamics, power and communications networks, and financial markets, among many others. Simple cellular automata models such as the running sandpile model have been very useful in reproducing the complexity and characteristics of these systems. One characteristic property of SOC systems is that they relax through what we call events, which can happen over all scales of the system. Examples of such events are earthquakes in the case of plate tectonics, fires in forest evolution, extinctions in the coevolution of biological species, and blackouts in power transmission systems. In a time-averaged sense, these systems are subcritical (that is, they lie in an average state that should not trigger any events) and the relaxation events happen intermittently. The time spent in a subcritical state relative to the duration of the events varies from one system to another. For instance, the chance of finding a forest on fire is very low, with the frequency of fires being on the order of one fire every few years and with many of these fires small and inconsequential. Very large fires happen over time periods of decades or even centuries.
However, because of their consequences, these large but infrequent events are the important ones to understand, control and minimize. The main thrust of this research is to understand how and when global events occur in such systems when we apply mitigation techniques, and how this impacts risk assessment. As sample systems we investigate both forest fire models and electrical power transmission network models, though the results are probably applicable to a wide variety of systems. It is found, perhaps counterintuitively, that apparently sensible attempts to mitigate failures in such complex systems can have adverse effects and therefore must be approached with care. The success of mitigation efforts in SOC systems is strongly influenced by the dynamics of the system. Unless the mitigation efforts alter the self-organization forces driving the system, the system will in general be pushed toward criticality. Altering those forces through mitigation may be quite difficult because the forces are an intrinsic part of the system. Moreover, in many cases, efforts to mitigate small disruptions will increase the frequency of large disruptions. This occurs because the large and small disruptions are not independent but are strongly coupled by the dynamics. Before discussing this in the more complicated case of power systems, we illustrate this phenomenon with a forest fire model.
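The running sandpile model mentioned above can be sketched as a one-dimensional cellular automaton: grains are dropped at random, and any cell whose height reaches a critical value topples, passing grains to its neighbours until the pile is stable again. This is a minimal illustrative version; the cell count, threshold, and open-boundary rule are choices of this sketch, not the authors' model.

```python
import random

def run_sandpile(n_cells=50, n_steps=2000, z_crit=4, seed=1):
    """1-D running sandpile: drop grains at random cells; any cell reaching
    z_crit topples, sending z_crit//2 grains to each neighbour (grains
    toppled past the open boundaries are lost). Returns the avalanche sizes
    (number of topplings triggered by each drop)."""
    random.seed(seed)
    h = [0] * n_cells
    sizes = []
    for _ in range(n_steps):
        h[random.randrange(n_cells)] += 1          # slow, uniform random drive
        size = 0
        unstable = True
        while unstable:                            # relax until fully stable
            unstable = False
            for i in range(n_cells):
                if h[i] >= z_crit:
                    h[i] -= z_crit
                    if i > 0:
                        h[i - 1] += z_crit // 2
                    if i < n_cells - 1:
                        h[i + 1] += z_crit // 2
                    size += 1
                    unstable = True
        if size:
            sizes.append(size)
    return sizes
```

Despite the uniform slow drive, relaxation events of widely varying sizes emerge, illustrating the intermittent, multi-scale character of SOC relaxation described above.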
Spatial patterns of frequent floods in Switzerland
NASA Astrophysics Data System (ADS)
Schneeberger, Klaus; Rössler, Ole; Weingartner, Rolf
2017-04-01
Information about the spatial characteristics of high and extreme streamflow is often needed for an accurate analysis of flood risk and effective coordination of flood-related activities, such as flood defence planning. In this study we analyse the spatial dependence of frequent floods in Switzerland across different scales. Firstly, we determine the average length of high and extreme flow events for 56 runoff time series of Swiss rivers. Secondly, a dependence measure expressing the probability that streamflow peaks are as high as the peaks at a conditioning site is used to describe and map the spatial extent of the joint occurrence of frequent floods across Switzerland. Thirdly, we apply a cluster analysis to identify groups of sites that are likely to react similarly in terms of the joint occurrence of high flow events. The results indicate that a time interval of 3 days is most appropriate for characterising the average length of high streamflow events across spatial scales. In the main Swiss basins, high and extreme streamflows were found to be asymptotically independent. In contrast, at the meso-scale, distinct flood regions that react similarly in terms of the occurrence of frequent floods were found. Knowledge of these regions can help to optimise flood defence planning or to estimate regional flood risk properly.
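The dependence measure described above, the probability that peaks at one site are high given that peaks at a conditioning site are high, can be sketched as an empirical conditional exceedance probability. This is a schematic version; the study's actual estimator and threshold choice may differ.

```python
def cond_exceedance(x, y, q=0.9):
    """Empirical P(Y > its q-quantile | X > its q-quantile): the chance that
    site Y also records a high peak when the conditioning site X does."""
    assert len(x) == len(y)
    k = int(q * len(x))
    ux, uy = sorted(x)[k], sorted(y)[k]          # empirical q-quantiles
    cond = [(a, b) for a, b in zip(x, y) if a > ux]
    if not cond:
        return 0.0
    return sum(1 for _, b in cond if b > uy) / len(cond)
```

A value near 1 indicates that floods tend to occur jointly at the two sites; values near 0 correspond to the (asymptotic) independence reported for the main Swiss basins.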
Spatiotemporal Organization of Energy Release Events in the Quiet Solar Corona
NASA Technical Reports Server (NTRS)
Uritsky, Vadim M.; Davila, Joseph M.
2014-01-01
Using data from the STEREO and SOHO spacecraft, we show that the temporal organization of energy release events in the quiet solar corona is close to random, in contrast to the clustered behavior of flaring times in solar active regions. The locations of the quiet-Sun events follow the meso- and supergranulation pattern of the underlying photosphere. Together with earlier reports of scale-free event size statistics, our findings suggest that quiet solar regions responsible for bulk coronal heating operate in a driven self-organized critical state, possibly involving long-range Alfvenic interactions.
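One simple way to quantify whether event times are temporally random or clustered, in the spirit of the comparison above, is the coefficient of variation of the interevent (waiting) times: it is close to 1 for a Poisson (random) process, near 0 for periodic events, and above 1 for clustered sequences. This generic diagnostic is a sketch, not necessarily the statistic used by the authors.

```python
import statistics

def waiting_time_cv(event_times):
    """Coefficient of variation of interevent (waiting) times:
    ~1 for a Poisson (temporally random) process, ~0 for periodic events,
    >1 for clustered sequences."""
    t = sorted(event_times)
    waits = [b - a for a, b in zip(t, t[1:])]
    return statistics.pstdev(waits) / statistics.mean(waits)
```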
NASA Astrophysics Data System (ADS)
Ward, K.
2015-12-01
Hidden within the terabytes of imagery in NASA's Global Imagery Browse Services (GIBS) collection are hundreds of daily natural events. Some events are newsworthy, devastating, and visibly obvious at a global scale; others are merely regional curiosities. Regardless of the scope and significance of any one event, it is likely that multiple GIBS layers can be viewed to provide a multispectral, dataset-based view of the event. To facilitate linking between the discrete event and the representative dataset imagery, NASA's Earth Observatory Group has developed a prototype application programming interface (API): the Earth Observatory Natural Event Tracker (EONET). EONET supports an API model that allows users to retrieve event-specific metadata (date/time, location, and type, e.g. wildfire or storm) and web-service layer-specific metadata, which can be used to link to event-relevant dataset imagery in GIBS. GIBS' ability to ingest many near-real-time datasets, combined with its growing archive of past imagery, means that API users will be able to develop client applications that not only show ongoing events but can also look at imagery from before and after. In our poster, we will present the API and show examples of its use.
The role of storm scale, position and movement in controlling urban flood response
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-claire; Zhou, Zhengzheng; Yang, Long; Liu, Shuguang; Smith, James
2018-01-01
The impact of spatial and temporal variability of rainfall on hydrological response remains poorly understood, in particular in urban catchments due to their strong variability in land use, a high degree of imperviousness and the presence of stormwater infrastructure. In this study, we analyze the effect of storm scale, position and movement in relation to basin scale and flow-path network structure on urban hydrological response. A catalog of 279 peak events was extracted from a high-quality observational dataset covering 15 years of flow observations and radar rainfall data for five (semi)urbanized basins ranging from 7.0 to 111.1 km2 in size. Results showed that the largest peak flows in the event catalog were associated with storm core scales exceeding basin scale, for all except the largest basin. Spatial scale of flood-producing storm events in the smaller basins fell into two groups: storms of large spatial scales exceeding basin size or small, concentrated events, with storm core much smaller than basin size. For the majority of events, spatial rainfall variability was strongly smoothed by the flow-path network, increasingly so for larger basin size. Correlation analysis showed that position of the storm in relation to the flow-path network was significantly correlated with peak flow in the smallest and in the two more urbanized basins. Analysis of storm movement relative to the flow-path network showed that direction of storm movement, upstream or downstream relative to the flow-path network, had little influence on hydrological response. Slow-moving storms tend to be associated with higher peak flows and longer lag times. Unexpectedly, position of the storm relative to impervious cover within the basins had little effect on flow peaks. These findings show the importance of observation-based analysis in validating and improving our understanding of interactions between the spatial distribution of rainfall and catchment variability.
NASA Astrophysics Data System (ADS)
Mallakpour, Iman; Villarini, Gabriele; Jones, Michael P.; Smith, James A.
2017-08-01
The central United States is plagued by frequent catastrophic flooding, such as the flood events of 1993, 2008, 2011, 2013, 2014 and 2016. The goal of this study is to examine whether it is possible to describe the occurrence of flood and heavy precipitation events at the sub-seasonal scale in terms of variations in the climate system. Daily streamflow and precipitation time series over the central United States (defined here to include North Dakota, South Dakota, Nebraska, Kansas, Missouri, Iowa, Minnesota, Wisconsin, Illinois, West Virginia, Kentucky, Ohio, Indiana, and Michigan) are used in this study. We model the occurrence/non-occurrence of a flood and heavy precipitation event over time using regression models based on Cox processes, which can be viewed as a generalization of Poisson processes. Rather than assuming that an event (i.e., flooding or precipitation) occurs independently of the occurrence of the previous one (as in Poisson processes), Cox processes allow us to account for the potential presence of temporal clustering, which manifests itself in an alternation of quiet and active periods. Here we model the occurrence/non-occurrence of flood and heavy precipitation events using two climate indices as time-varying covariates: the Arctic Oscillation (AO) and the Pacific-North American pattern (PNA). We find that AO and/or PNA are important predictors in explaining the temporal clustering in flood occurrences in over 78% of the stream gages we considered. Similar results are obtained when working with heavy precipitation events. Analyses of the sensitivity of the results to different thresholds used to identify events lead to the same conclusions. The findings of this work highlight that variations in the climate system play a critical role in explaining the occurrence of flood and heavy precipitation events at the sub-seasonal scale over the central United States.
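The Cox-process regression used in this study accounts for temporal clustering; as a simplified, discrete-time sketch of the same idea (event occurrence modulated by time-varying climate covariates such as AO and PNA), one can fit a logistic regression of the 0/1 occurrence series on the covariates. The function and training scheme below are illustrative assumptions, not the authors' model.

```python
import math

def fit_occurrence_model(X, y, lr=0.1, n_iter=2000):
    """Logistic regression of event occurrence (0/1 per time step) on
    time-varying covariates, fitted by batch gradient descent.

    X : list of covariate vectors, e.g. [[AO_t, PNA_t], ...] (illustrative)
    y : list of 0/1 occurrence indicators, one per time step
    Returns [intercept, coef_1, ..., coef_p]."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)
    for _ in range(n_iter):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            pr = 1.0 / (1.0 + math.exp(-z))   # predicted occurrence probability
            err = pr - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w
```

A positive fitted coefficient would indicate that high values of the corresponding index coincide with active (event-rich) periods, the alternation of quiet and active periods described above.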
NASA Astrophysics Data System (ADS)
Allen, M. F.; Taggart, M. C.; Hernandez, R. R.; Harmon, T. C.; Rundel, P.
2017-12-01
Observation is essential for organizing outputs from sensor data to describe the dynamic phenomena regulating core processes. The rhizosphere is the region of the soil layer that regulates soil carbon acquisition, turnover, and sequestration, and it is the region most sensitive to rapid changes in soil moisture, temperature, and gases. Virtually every process regulating carbon and nutrient immobilization and mineralization occurs here at maximum rates. However, observations of root, microbial, and animal growth, movement, and mortality are rarely undertaken at the time scales of crucial events. While multiple cores or observations can be taken in space, replications in time are rarely undertaken. We coupled automated (AMR) and manual (MMR) minirhizotrons with soil and aboveground sensors for temperature (T), water content (q), CO2, and O2 to measure the short-term dynamics that regulate carbon cycling. AMRs imaged rhizospheres multiple times daily. From these images, we observed the timing of root and hyphal growth and mortality in response to changes in photosynthesis, diurnal temperature fluctuations, and precipitation and drought events. Replicate manual minirhizotron tubes describe the spatial structure of those events, and replicate core samples provide measurements of standing crop at known times. We present four examples showing how observation led to understanding unusual C flux patterns in mixed-conifer forest (belowground photosynthate allocation), hot desert (CaCO3 formation and weathering), grassland (root grazing), and tropical rainforest (soil gas flux patterns).
NASA Astrophysics Data System (ADS)
Yang, Jian; Sun, Shuaishuai; Tian, Tongfei; Li, Weihua; Du, Haiping; Alici, Gursel; Nakano, Masami
2016-03-01
Protecting civil engineering structures from uncontrollable events such as earthquakes while maintaining their structural integrity and serviceability is very important; this paper describes the performance of a stiffness softening magnetorheological elastomer (MRE) isolator in a scaled three storey building. In order to construct a closed-loop system, a scaled three storey building was designed and built according to the scaling laws, and then four MRE isolator prototypes were fabricated and utilised to isolate the building from the motion induced by a scaled El Centro earthquake. Fuzzy logic was used to output the current signals to the isolators, based on the real-time responses of the building floors, and then a simulation was used to evaluate the feasibility of this closed loop control system before carrying out an experimental test. The simulation and experimental results showed that the stiffness softening MRE isolator controlled by fuzzy logic could suppress structural vibration well.
Planetary Scale Impacts and Consequences for the Mars Hemispheric Dichotomy
NASA Astrophysics Data System (ADS)
Marinova, M. M.; Aharonson, O.; Asphaug, E.
2007-12-01
Planetary-scale impacts are events in which the resultant impact basin is a significant fraction of the planet's circumference. The curvature of the planet is expected to be important in the impact process, especially as it relates to the fate of downrange ejecta in off-axis events. Planetary-scale impacts are abundant in the Solar System, especially early in its evolution. A possible candidate planetary-scale impact basin is the Martian hemispheric dichotomy, expressed as a difference in surface elevation, crustal thickness, and surface age between the northern lowlands and the southern highlands. We investigate the characteristics of planetary-scale impacts, and in particular the effects of a mega-impact on Mars. We use a 3-dimensional self-gravitating Smoothed Particle Hydrodynamics (SPH) model to simulate the impacts, implementing an olivine equation of state derived for the Tillotson formulation, and use this to establish the initial pressure and internal energy profile of the planet. The parameter space of impactor energy, impactor size, and impact velocity is explored for Mars hemispheric impacts. We find that for a given impact energy, head-on, large but slow impacts produce more melt and cover more of the planet with melt than small, fast, and oblique events. Head-on impacts produce crustal blow-off and a melt pool at the antipode. Oblique impacts do not cover much of the planet with melt, but create sizable basins. Various degrees of crustal thickening are apparent around the crater over a length of ~1000 km; this crustal thickening could relax over geological time. Fast impacts eject many times their own mass at escape velocity. In all cases, less than 10% of the impactor's mass is placed in orbit. For oblique events, a significant fraction of the angular momentum in the system is carried away by escaping material, limiting the efficiency of angular momentum transfer to the planet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred-circulation-regime approach and with recently developed measures, the finite-amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence of extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
Suicidal events in the Treatment for Adolescents With Depression Study (TADS).
Vitiello, Benedetto; Silva, Susan G; Rohde, Paul; Kratochvil, Christopher J; Kennard, Betsy D; Reinecke, Mark A; Mayes, Taryn L; Posner, Kelly; May, Diane E; March, John S
2009-04-21
The Treatment for Adolescents with Depression Study (TADS) database was analyzed to determine whether suicidal events (attempts and ideation) occurred early in treatment, could be predicted by severity of depression or other clinical characteristics, and were preceded by clinical deterioration or symptoms of increased irritability, akathisia, sleep disruption, or mania. TADS was a 36-week randomized, controlled clinical trial of pharmacologic and psychotherapeutic treatments involving 439 youths with major depressive disorder (DSM-IV criteria). Suicidal events were defined according to the Columbia Classification Algorithm of Suicidal Assessment. Patients were randomly assigned into the study between spring 2000 and summer 2003. Forty-four patients (10.0%) had at least 1 suicidal event (no suicide occurred). Events occurred 0.4 to 31.1 weeks (mean +/- SD = 11.9 +/- 8.2) after starting TADS treatment, with no difference in event timing for patients receiving medication versus those not receiving medication. Severity of self-rated pretreatment suicidal ideation (Suicidal Ideation Questionnaire adapted for adolescents score > or = 31) and depressive symptoms (Reynolds Adolescent Depression Scale score > or = 91) predicted occurrence of suicidal events during treatment (P < .05). Patients with suicidal events were on average still moderately ill prior to the event (mean +/- SD Clinical Global Impressions-Severity of Illness scale score = 4.0 +/- 1.3) and only minimally improved (mean +/- SD Clinical Global Impressions-Improvement scale score = 3.2 +/- 1.1). Events were not preceded by increased irritability, akathisia, sleep disturbance, or manic signs. Specific interpersonal stressors were identified in 73% of cases (N = 44). Of the events, 55% (N = 24) resulted in overnight hospitalization. Most suicidal events occurred in the context of persistent depression and insufficient improvement without evidence of medication-induced behavioral activation as a precursor. 
Severity of self-rated suicidal ideation and depressive symptoms predicted emergence of suicidality during treatment. Risk for suicidal events did not decrease after the first month of treatment, suggesting the need for careful clinical monitoring for several months after starting treatment. Copyright 2009 Physicians Postgraduate Press, Inc.
Benchmarking worker nodes using LHCb productions and comparing with HEPSpec06
NASA Astrophysics Data System (ADS)
Charpentier, P.
2017-10-01
In order to estimate the capabilities of a computing slot with limited processing time, it is necessary to know its “power” with rather good precision. This allows, for example, pilot jobs to match a task for which the required CPU-work is known, or to define the number of events to be processed knowing the CPU-work per event. Otherwise one always runs the risk that the task is aborted because it exceeds the CPU capabilities of the resource. It also allows a better accounting of the consumed resources. The traditional way the CPU power has been estimated in WLCG since 2007 is the HEP-Spec06 benchmark (HS06) suite, which was verified at the time to scale properly with a set of typical HEP applications. However, the hardware architecture of processors has evolved, and all WLCG experiments have moved to 64-bit applications and use different compilation flags from those advertised for running HS06. It is therefore interesting to check the scaling of HS06 with the HEP applications. For this purpose, we have been using CPU-intensive massive simulation productions from the LHCb experiment and compared their event throughput to the HS06 rating of the worker nodes. We also compared it with a much faster benchmark script that is used by the DIRAC framework, used by LHCb, for evaluating at run time the performance of the worker nodes. This contribution reports on the findings of these comparisons: the main observation is that the scaling with HS06 is no longer fulfilled, while the fast benchmark scales better but is less precise. One can also clearly see that some hardware or software features, when enabled on the worker nodes, may enhance performance beyond what either benchmark predicts, depending on external factors.
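The slot-matching arithmetic described above can be sketched in a few lines. This is an illustrative sketch only; the function name, units, and safety margin are assumptions, not part of the DIRAC framework:

```python
def events_that_fit(slot_power, time_limit_s, work_per_event, safety=0.9):
    """Number of events a limited computing slot can safely process.

    slot_power: benchmark rating of the worker node (benchmark units)
    time_limit_s: wall-clock limit of the slot, in seconds
    work_per_event: CPU-work needed per event, in benchmark-unit-seconds
    safety: margin kept so the job is not killed at the limit
    """
    budget = slot_power * time_limit_s * safety  # total CPU-work available
    return int(budget // work_per_event)

# e.g. a 10-unit core, a 48-hour slot, 25 unit-seconds per simulated event:
n_events = events_that_fit(10.0, 48 * 3600, 25.0)
```

Matching the task to the slot this way avoids the aborted-job scenario the text describes: the pilot only requests as many events as the measured power allows.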
A scalable multi-photon coincidence detector based on superconducting nanowires.
Zhu, Di; Zhao, Qing-Yuan; Choi, Hyeongrak; Lu, Tsung-Ju; Dane, Andrew E; Englund, Dirk; Berggren, Karl K
2018-06-04
Coincidence detection of single photons is crucial in numerous quantum technologies and usually requires multiple time-resolved single-photon detectors. However, the electronic readout becomes a major challenge when the measurement basis scales to large numbers of spatial modes. Here, we address this problem by introducing a two-terminal coincidence detector that enables scalable readout of an array of detector segments based on a superconducting nanowire microstrip transmission line. Exploiting timing logic, we demonstrate a sixteen-element detector that resolves all 136 possible single-photon and two-photon coincidence events. We further explore the pulse shapes of the detector output and resolve up to four-photon events in a four-element device, giving the detector photon-number-resolving capability. This new detector architecture and operating scheme will be particularly useful for multi-photon coincidence detection in large-scale photonic integrated circuits.
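The count of 136 patterns for the sixteen-element device is simply the number of ways to place one or two photons on distinct segments (16 single-photon plus C(16,2) = 120 two-photon patterns). A quick combinatorial check, illustrative only and not the detector's readout logic:

```python
from math import comb

def coincidence_patterns(n_elements, max_photons=2):
    """Count distinct detection patterns on an n-element array:
    one photon on any of n segments, two photons on any unordered
    pair of segments, and so on up to max_photons."""
    return sum(comb(n_elements, k) for k in range(1, max_photons + 1))

# 16 single-photon + 120 two-photon patterns = 136, as quoted above
assert coincidence_patterns(16) == 136
```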
Chance, necessity and the origins of life: a physical sciences perspective
NASA Astrophysics Data System (ADS)
Hazen, Robert M.
2017-11-01
Earth's 4.5-billion-year history has witnessed a complex sequence of high-probability chemical and physical processes, as well as 'frozen accidents'. Most models of life's origins similarly invoke a sequence of chemical reactions and molecular self-assemblies in which both necessity and chance play important roles. Recent research adds two important insights into this discussion. First, in the context of chemical reactions, chance versus necessity is an inherently false dichotomy: a range of probabilities exists for many natural events. Second, given the combinatorial richness of early Earth's chemical and physical environments, events in molecular evolution that are unlikely at limited laboratory scales of space and time may, nevertheless, be inevitable on an Earth-like planet at time scales of a billion years. This article is part of the themed issue 'Reconceptualizing the origins of life'.
Chen, Zu-Yin; Chiang, Chia-Hung; Huang, Chin-Chou; Chung, Chia-Min; Chan, Wan-Leong; Huang, Po-Hsun; Lin, Shing-Jong; Chen, Jaw-Wen; Leu, Hsin-Bang
2012-06-01
Poor oral hygiene has been associated with an increased risk for cardiovascular disease. However, the association between preventive dentistry and cardiovascular risk reduction has remained undetermined. The aim of this study is to investigate the association between tooth scaling and the risk of cardiovascular events by using a nationwide, population-based study and a prospective cohort design. Our analyses were conducted using information from a random sample of 1 million persons enrolled in the nationally representative Taiwan National Health Insurance Research Database. Exposed individuals consisted of all subjects who were aged ≥ 50 years and who received at least 1 tooth scaling in 2000. The comparison group of non-exposed persons consisted of persons who did not undergo tooth scaling and were matched to exposed individuals using propensity score matching by the time of enrollment, age, gender, history of coronary artery disease, diabetes, hypertension, and hyperlipidemia. During an average follow-up period of 7 years, 10,887 subjects who had ever received tooth scaling (exposed group) and 10,989 age-, gender-, and comorbidity-matched subjects who had not received tooth scaling (non-exposed group) were enrolled. The exposed group had a lower incidence of acute myocardial infarction (1.6% vs 2.2%, P<.001), stroke (8.9% vs 10%, P=.03), and total cardiovascular events (10% vs 11.6%, P<.001) when compared with the non-exposed group. After multivariate analysis, tooth scaling was an independent factor associated with less risk of developing future myocardial infarction (hazard ratio [HR], 0.69; 95% confidence interval [CI], 0.57-0.85), stroke (HR, 0.85; 95% CI, 0.78-0.93), and total cardiovascular events (HR, 0.84; 95% CI, 0.77-0.91). Furthermore, when compared with the non-exposed group, increasing frequency of tooth scaling correlated with a higher risk reduction of acute myocardial infarction, stroke, and total cardiovascular events (P for trend<.001). 
Tooth scaling was associated with a decreased risk for future cardiovascular events. Copyright © 2012 Elsevier Inc. All rights reserved.
Soft Water Level Sensors for Characterizing the Hydrological Behaviour of Agricultural Catchments
Crabit, Armand; Colin, François; Bailly, Jean Stéphane; Ayroles, Hervé; Garnier, François
2011-01-01
An innovative soft water level sensor is proposed to characterize the hydrological behaviour of agricultural catchments by measuring rainfall and stream flows. This sensor works as a capacitor coupled with a capacitance-to-frequency converter and measures water level at an adjustable acquisition time step. It was designed to be handy, minimally invasive and optimized in terms of energy consumption and low-cost fabrication, so as to multiply its use on several catchments under natural conditions. It was used as a stage recorder to measure water level dynamics in a channel during a runoff event and as a rain gauge to measure rainfall amount and intensity. Based on the Manning equation, a method allowed estimation of water discharge with a given uncertainty and hence runoff volume at an event or annual scale. The sensor was tested under controlled conditions in the laboratory and under real conditions in the field. Comparisons of the sensor to reference devices (tipping bucket rain gauge, hydrostatic pressure transmitter limnimeter, Venturi channels…) showed accurate results: rainfall intensities and dynamic responses were accurately reproduced and discharges were estimated with an uncertainty usually acceptable in hydrology. Hence, it was used to monitor eleven small agricultural catchments located in the Mediterranean region. Both catchment reactivity and water budget have been calculated. Dynamic response of the catchments has been studied at the event scale through the rising time determination and at the annual scale by calculating the frequency of occurrence of runoff events. It provided significant insight into catchment hydrological behaviour which could be useful for agricultural management perspectives involving pollutant transport, flooding events and global water balance. PMID:22163868
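As a rough illustration of the Manning-based discharge estimation mentioned above, here is a minimal sketch for a hypothetical rectangular channel; the geometry, slope, and roughness coefficient are placeholder values, not the paper's calibration:

```python
def manning_discharge(h, width, slope, n_manning):
    """Discharge (m^3/s) in a rectangular channel from water level h (m),
    using the Manning equation Q = (1/n) * A * R^(2/3) * sqrt(S)."""
    area = width * h                     # wetted cross-section (m^2)
    perimeter = width + 2.0 * h          # wetted perimeter (m)
    radius = area / perimeter            # hydraulic radius (m)
    return (1.0 / n_manning) * area * radius ** (2.0 / 3.0) * slope ** 0.5

# e.g. 0.5 m of water in a 2 m wide channel, slope 0.001, n = 0.03
q = manning_discharge(0.5, 2.0, 0.001, 0.03)
```

Summing such discharges over a recorded event (discharge times the acquisition time step) gives the runoff volume at the event scale, which is how a stage record is turned into a water budget.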
NASA Astrophysics Data System (ADS)
Peña Gallardo, Marina; Vicente Serrano, Sergio Martín; Beguería Portugués, Santiago; Tomás Burguera, Miquel
2017-04-01
Drought leads to crop failures, reducing productivity. For this reason, appropriate tools to recognize dry periods and to evaluate the impact of drought on crop production are important. In this study, we provide an assessment of the relationship between drought episodes and crop failures in Spain, as one of the direct consequences of drought is the diminishing of crop yields. First, different drought indices [the Standardized Precipitation and Evapotranspiration Index (SPEI); the Standardized Precipitation Index (SPI); the self-calibrated Palmer Moisture Anomaly Index (Z-Index); the self-calibrated Crop Moisture Index (CMI); and the Standardized Palmer Drought Index (SPDI)] have been calculated at different time scales in order to identify the dry events that occurred in Spain and to determine the duration and intensity of each event. Second, the drought episodes have been correlated with estimated and final crop production data provided by the Spanish Crop Insurance System for the available period from 1995 to 2014 at the municipal spatial scale, with the purpose of determining whether the characteristics of the drought episodes are reflected in the agricultural losses. The analysis has been carried out in particular for two types of crop, wheat and barley. The results indicate the existence of an agreement between the most important drought events in Spain and the response of the crop productions and the proportion of insured hectares. Nevertheless, this agreement varies depending on the drought index applied. The authors found that the drought indices calculated at different time scales (SPEI, SPI and SPDI) performed better in identifying the beginning and end of the drought events and their correspondence with the crop failures.
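Standardized indices of the SPI family map an accumulated water-balance variable through a fitted distribution onto standard-normal quantiles, so that, say, SPI < -1.5 means a severe deficit regardless of the local climate. A minimal rank-based sketch of that transformation (the operational SPI fits a gamma distribution rather than using empirical plotting positions, and is computed per calendar month and accumulation scale):

```python
from statistics import NormalDist

def empirical_spi(accumulations):
    """Rank-based standardized index: each accumulation is mapped through
    its Gringorten plotting position to a standard-normal quantile."""
    n = len(accumulations)
    order = sorted(range(n), key=lambda i: accumulations[i])
    nd = NormalDist()
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        spi[i] = nd.inv_cdf((rank - 0.44) / (n + 0.12))  # Gringorten position
    return spi

# The wettest period gets the largest index, the driest the most negative.
values = empirical_spi([3, 1, 4, 1, 5, 9, 2, 6])
```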
Timescales of Massive Human Entrainment
Fusaroli, Riccardo; Perlman, Marcus; Mislove, Alan; Paxton, Alexandra; Matlock, Teenie; Dale, Rick
2015-01-01
The past two decades have seen an upsurge of interest in the collective behaviors of complex systems composed of many agents entrained to each other and to external events. In this paper, we extend the concept of entrainment to the dynamics of human collective attention. We conducted a detailed investigation of the unfolding of human entrainment—as expressed by the content and patterns of hundreds of thousands of messages on Twitter—during the 2012 US presidential debates. By time-locking these data sources, we quantify the impact of the unfolding debate on human attention at three time scales. We show that collective social behavior covaries second-by-second with the interactional dynamics of the debates: a candidate speaking induces rapid increases in mentions of his name on social media and decreases in mentions of the other candidate. Moreover, interruptions by an interlocutor increase the attention received. We also highlight a distinct time scale for the impact of salient content during the debates: across well-known remarks in each debate, mentions in social media start within 5–10 seconds after they occur, peak at approximately one minute, and slowly decay in a consistent fashion across well-known events during the debates. Finally, we show that public attention after an initial burst slowly decays through the course of the debates. Thus we demonstrate that large-scale human entrainment may hold across a number of distinct scales, in an exquisitely time-locked fashion. The methods and results pave the way for careful study of the dynamics and mechanisms of large-scale human entrainment. PMID:25880357
Archetypes, Causal Description and Creativity in Natural World
NASA Astrophysics Data System (ADS)
Chiatti, Leonardo
The idea, formulated for the first time by Pauli, of a "creativity" of natural processes on a quantum scale is briefly investigated, with particular reference to the phenomena, common throughout the biological world, involved in the amplification of microscopic "creative" events to the macroscopic level. The involvement of non-locality is also discussed with reference to the synordering of events, a concept introduced for the first time by Bohm. Some convergences are proposed between the metamorphic process envisaged by Bohm and that envisaged by Goethe, and some possible applications concerning known biological phenomena are briefly discussed.
Abramczyk, H; Brozek-Płuska, B; Kurczewski, K; Kurczewska, M; Szymczyk, I; Krzyczmonik, P; Błaszczyk, T; Scholl, H; Czajkowski, W
2006-07-20
Ultrafast time-resolved electronic spectra of the primary events induced in the copper tetrasulfonated phthalocyanine, Cu(tsPc)(4-), in aqueous solution have been measured by femtosecond pump-probe transient absorption spectroscopy. The primary events initiated by the absorption of a photon, occurring within the femtosecond time scale, are discussed on the basis of the electron transfer mechanism between the adjacent phthalocyanine rings proposed recently in our laboratory. The femtosecond transient absorption results are compared with the low-temperature emission spectra obtained with Raman spectroscopy and the voltammetric curves.
A dynamical systems approach to studying midlatitude weather extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
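The "simple dynamical systems metrics" referred to above include a local (instantaneous) dimension of the atmospheric flow, commonly computed from recurrences of the state vector. A sketch of the exceedance estimator used in this literature, with illustrative variable names and quantile choice:

```python
import numpy as np

def local_dimension(states, t, quantile=0.98):
    """Estimate the local dimension at state index t by the exceedance
    method: distances from state t to all other states are turned into
    g = -log(distance); the exceedances of g over a high quantile are
    approximately exponential, and the inverse of their mean estimates
    the local dimension of the attractor around state t."""
    dists = np.linalg.norm(states - states[t], axis=1)
    g = -np.log(dists[dists > 0])          # drop the zero self-distance
    thresh = np.quantile(g, quantile)
    exceedances = g[g > thresh] - thresh
    return 1.0 / exceedances.mean()
```

Low local dimension flags recurrent, well-constrained flow patterns, which is why sets of states with similar dimension and persistence can be grouped into the predictable large-scale patterns the abstract describes.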
Data-Intensive Discovery Methods for Seismic Monitoring
NASA Astrophysics Data System (ADS)
Richards, P. G.; Schaff, D. P.; Ammon, C. J.; Cleveland, M.; Young, C. J.; Slinkard, M.; Heck, S.
2012-12-01
Seismic events are still mostly located one-at-a-time by Geiger's method of 1909, which uses phase picks and minimizes differences between observed and modeled travel times. But methods that recognize and use seismogram archives as a major resource have been successfully demonstrated---especially for California, China, and for the mid-ocean ridge-transform system---where they enable new insights into earthquake physics and Earth structure, and have raised seismic monitoring to new levels. We report progress on a series of collaborative projects to evaluate such data-intensive methods on ever-larger scales. We use cross correlation (CC): (1) to improve estimates of the relative size of neighboring seismic events in regions of high seismicity; and (2) as a detector, to find new events in current data streams that are similar to events already in the archive, to add to the number of detections of an already known event, or to place a threshold on the size of undetected events occurring near a template event. Elsewhere at this meeting Schaff and Richards report on uses of non-normalized CC measurements to estimate relative event size---a procedure that may be as important as widely-used CC methods to improve the precision of relative location estimates. They have successfully modeled the degradation in CC value that is due to the spatial separation of similar events and can prevent this bias from seriously influencing estimates of relative event size for non-collocated events. Cleveland and Ammon report in more detail on cross-correlation used to measure Rayleigh-wave time shifts, and on improved epicentroid locations and relative origin-time shifts in remote oceanic transform regions. They seek to extend the correlation of R1 waveforms from vertical strike-slip transform-fault earthquakes with waveforms from normal faulting events at nearby ridges, to improve the locations of events offshore from the Pacific northwest and southwestern China. 
Finally our collaborating Sandia group has reported preliminary results using a 360-core distributed network that took about two hours to search a month-long continuous single channel (sampled at 40 sps) for the occurrence of one or more of 920 waveforms each lasting 40 s and previously recorded by the station. Speed scales with number of cores; and inversely with number of channels, sample rate, and window length. Orders-of-magnitude improvement in speed are anticipated, on these early results; and application to numerous channels. From diverse results such as these, it seems appropriate to consider the future possibility of radical improvement in monitoring virtually all seismically active areas, using archives of prior events as the major resource---though we recognize that such an approach does not directly help to characterize seismic events in inactive regions, or events in active regions which are dissimilar to previously recorded events.
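The core of the correlation detector discussed above is a normalized cross-correlation slid between a template waveform and the continuous stream. A naive single-channel sketch (production systems use FFT-based correlation, multi-channel stacking, and careful thresholding; names and values here are illustrative):

```python
import numpy as np

def correlation_detections(stream, template, threshold=0.8):
    """Return sample offsets where the normalized cross-correlation
    between the template and a sliding window of the stream exceeds
    the detection threshold."""
    m = len(template)
    tpl = (template - template.mean()) / (template.std() * m)
    hits = []
    for i in range(len(stream) - m + 1):
        win = stream[i:i + m]
        s = win.std()
        if s == 0:
            continue                      # flat window: undefined correlation
        cc = np.dot(tpl, (win - win.mean()) / s)  # Pearson correlation
        if cc > threshold:
            hits.append(i)
    return hits
```

Because the correlation is normalized, a detection says the new signal has the same shape as the archived event; the non-normalized variant mentioned above additionally carries relative-amplitude (and hence relative-size) information.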
Tiny Molybdenites Tell Diffusion Tales
NASA Astrophysics Data System (ADS)
Stein, H. J.; Hannah, J. L.
2014-12-01
Diffusion involves micron-scale exchange during crystal growth and dissolution in magma chambers on short time scales. Fundamental to interpreting such data are assumptions on magma-fluid dynamics at all scales. Nevertheless, elemental diffusion profiles are used to estimate time scales for magma storage, eruption, and recharge. An underutilized timepiece to evaluate diffusion and 3D mobility of magmatic fluids is high-precision Re-Os dating of molybdenite. With spatially unique molybdenite samples from a young ore system (e.g., 1 Ma) and a double Os spike, analytical errors of 1-3 ka unambiguously separate events in time. Re-Os ages show that hydrous shallow magma chambers locally recharge and expel Cu-Mo-Au-silica as superimposed stockwork vein networks at time scales of less than a few thousand years [1]. Re-Os ages provide diffusion rates controlled by a dynamic crystal mush, accumulation and expulsion of metalliferous fluid, and magma reorganization after explosive crystallization events. Importantly, this approach has broad application far from ore deposits. Here, we use Re-Os dating of molybdenite to assess time scales for generating and diffusing metals through the deep crust. To maximize the opportunity for chemical diffusion, we use a continental-scale Sveconorwegian mylonite zone for the study area. A geologically constrained suite of molybdenite samples was acquired from quarry exposures. Molybdenite, previously unreported, is extremely scarce. Tiny but telling molybdenites include samples from similar occurrences to ensure geologic accuracy in Re-Os ages. Ages range from mid-Mesoproterozoic to mid-Neoproterozoic, and correspond to early metamorphic dehydration of a regionally widespread biotite-rich gneiss, localized melting of gneiss to form cm-m-scale K-feldspar ± quartz pods, development of vapor-rich, vuggy mm stringers that serve as volatile collection surfaces in felsic leucosomes, and low-angle (relative to foliation) cross-cutting cm-scale quartz veins.
Re-Os ages and detailed geologic observation document a 200 m.y. history of metal liberation and diffusion through oxidation. [1] Stein, H.J. (2014) Dating and Tracing the History of Ore Formation, in Holland, H.D. & Turekian, K.K. (eds) Treatise on Geochemistry, 2nd Ed. 13: 87-118, Oxford: Elsevier.
Bersinger, T; Le Hécho, I; Bareille, G; Pigot, T
2015-01-01
Eroded sewer sediments are a significant source of the organic matter discharged by combined sewer overflows. Many authors have studied the erosion and sedimentation processes at the scale of a section of sewer pipe and over short time periods. The objective of this study was to assess these processes at the scale of an entire sewer network and over 1 month, to understand whether phenomena observed on a small scale of space and time are still valid on a larger scale. To achieve this objective, continuous monitoring of turbidity was used. First, the study of successive rain events allows observation of the reduction of the available sediment and highlights the widely different erosion resistance of the different sediment layers. Secondly, calculation of daily chemical oxygen demand (COD) fluxes during the entire month was performed, showing that sediment storage in the sewer pipe after a rain period is substantial and stops after 5 days. Nevertheless, during rainfall events, the eroded fluxes are larger than the whole sewer sediment accumulated during a dry-weather period. This means that the COD fluxes contributed by runoff are substantial. This work confirms, with online monitoring, most of the conclusions from other studies on a smaller scale.
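The daily COD flux computation described above amounts to integrating discharge times a turbidity-to-COD rating over the day. A minimal sketch, assuming a hypothetical linear site calibration (the paper's own rating and monitoring setup are not reproduced here):

```python
def daily_cod_flux(turbidity, discharge, dt_s, a=1.5, b=10.0):
    """Daily COD flux (kg) from paired turbidity (NTU) and discharge
    (m^3/s) records sampled every dt_s seconds, assuming a linear
    site-calibrated rating COD (mg/L, i.e. g/m^3) = a * turbidity + b."""
    flux_g = sum(q * (a * t + b) * dt_s          # m^3/s * g/m^3 * s = g
                 for t, q in zip(turbidity, discharge))
    return flux_g / 1000.0                       # grams -> kilograms
```

Comparing such daily fluxes across dry and wet periods is what lets the study separate in-sewer sediment storage from the runoff-borne contribution.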
White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R
2017-06-14
Despite growing concerns regarding increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1 in 25 yr extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying capacity-related asymptotes in persistence under extreme climate variability reveals how small populations affected by habitat loss or overharvesting, may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
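The interplay of carrying capacity, demographic noise, and environmental noise can be illustrated with a toy Ricker-type simulation. This is a generic sketch with arbitrary parameter values, not the authors' fish metapopulation model:

```python
import numpy as np

def time_to_extinction(K, sigma_env, r=0.5, t_max=1000, rng=None):
    """One realization of a Ricker population with environmental noise
    (Gaussian shocks of s.d. sigma_env on the log growth rate) and
    demographic noise (Poisson offspring numbers). Returns the first
    time step at which the population hits zero, or t_max if it persists."""
    rng = rng or np.random.default_rng(0)
    n = K
    for t in range(1, t_max + 1):
        log_growth = r * (1.0 - n / K) + rng.normal(0.0, sigma_env)
        n = rng.poisson(n * np.exp(log_growth))  # demographic stochasticity
        if n == 0:
            return t
    return t_max

# A large population rides out moderate environmental noise,
# while a tiny one is quickly lost to demographic chance.
long_lived = time_to_extinction(500, 0.1)
short_lived = time_to_extinction(1, 1.0)
```

In such models the demographic term dominates at small n, which is why time-to-extinction grows with K, while strong environmental shocks hit all population sizes alike and flatten that relationship, as the theory cited above predicts.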
NASA Astrophysics Data System (ADS)
Mallakpour, Iman; Villarini, Gabriele; Jones, Michael; Smith, James
2016-04-01
The central United States is a region of the country that has been plagued by frequent catastrophic flooding (e.g., flood events of 1993, 2008, 2013, and 2014), with large economic and social repercussions (e.g., fatalities, agricultural losses, flood losses, water quality issues). The goal of this study is to examine whether it is possible to describe the occurrence of flood events at the sub-seasonal scale in terms of variations in the climate system. Daily streamflow time series from 774 USGS stream gage stations over the central United States (defined here to include North Dakota, South Dakota, Nebraska, Kansas, Missouri, Iowa, Minnesota, Wisconsin, Illinois, West Virginia, Kentucky, Ohio, Indiana, and Michigan) with a record of at least 50 years and ending no earlier than 2011 are used for this study. We use a peak-over-threshold (POT) approach to identify flood peaks so that we have, on average, two events per year. We model the occurrence/non-occurrence of a flood event over time using regression models based on Cox processes. Cox processes are widely used in biostatistics and can be viewed as a generalization of Poisson processes. Rather than assuming that flood events occur independently of the occurrence of previous events (as in Poisson processes), Cox processes allow us to account for the potential presence of temporal clustering, which manifests itself in an alternation of quiet and active periods. Here we model the occurrence/non-occurrence of flood events using two climate indices as time-varying climate covariates: the North Atlantic Oscillation (NAO) and the Pacific-North American pattern (PNA). The results of this study show that NAO and/or PNA can explain the temporal clustering in flood occurrences in over 90% of the stream gage stations we considered. Analyses of the sensitivity of the results to different average numbers of flood events per year (from one to five) are also performed and lead to the same conclusions. 
The findings of this work highlight that variations in the climate system play a critical role in explaining the occurrence of flood events at the sub-seasonal scale over the central United States.
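The peak-over-threshold selection described above, tuned to an average of two events per year, can be sketched as follows. The declustering gap and the threshold scan are illustrative choices, not the authors' exact procedure:

```python
import numpy as np

def decluster(flow, thresh, min_gap=7):
    """Indices of independent peaks above thresh: exceedances closer
    together than min_gap days are merged, keeping the larger peak."""
    peaks = []
    for i in np.where(flow > thresh)[0]:
        if peaks and i - peaks[-1] < min_gap:
            if flow[i] > flow[peaks[-1]]:
                peaks[-1] = i            # same cluster: keep the larger peak
        else:
            peaks.append(i)
    return peaks

def pot_threshold(flow, years, events_per_year=2.0, min_gap=7):
    """Scan candidate thresholds (from high to low quantiles) until the
    declustered peak count reaches the target average events per year."""
    target = events_per_year * years
    for q in np.linspace(0.999, 0.8, 200):
        thresh = np.quantile(flow, q)
        if len(decluster(flow, thresh, min_gap)) >= target:
            return thresh
    return np.quantile(flow, 0.8)
```

The resulting binary occurrence series (event / no event per time window) is exactly the kind of input that the Cox-process regression with NAO and PNA covariates is fitted to.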
Scaling and Single Event Effects (SEE) Sensitivity
NASA Technical Reports Server (NTRS)
Oldham, Timothy R.
2003-01-01
This paper begins by discussing the potential for scaling down transistors and other components to fit more of them on chips in order to increase computer processing speed. It also addresses technical challenges to further scaling. Components have been scaled down enough to allow single particles to have an effect, known as a Single Event Effect (SEE). This paper explores the relationship between scaling and the following SEEs: Single Event Upsets (SEU) on DRAMs and SRAMs, Latch-up, Snap-back, Single Event Burnout (SEB), Single Event Gate Rupture (SEGR), and Ion-induced soft breakdown (SBD).
SEISMIC SOURCE SCALING AND DISCRIMINATION IN DIVERSE TECTONIC ENVIRONMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, R E; Mayeda, K; Walter, W R
2007-07-10
The objectives of this study are to improve low-magnitude regional seismic discrimination by performing a thorough investigation of earthquake source scaling using diverse, high-quality datasets from varied tectonic regions. Local-to-regional high-frequency discrimination requires an estimate of how earthquakes scale with size. Walter and Taylor (2002) developed the MDAC (Magnitude and Distance Amplitude Corrections) method to empirically account for these effects through regional calibration. The accuracy of these corrections has a direct impact on our ability to identify clandestine explosions in the broad regional areas characterized by low seismicity. Unfortunately, our knowledge of source scaling at small magnitudes (i.e., m_b < ~4.0) is poorly resolved. It is not clear whether different studies obtain contradictory results because they analyze different earthquakes, or because they use different methods. Even in regions that are well studied, such as test sites or areas of high seismicity, we still rely on empirical scaling relations derived from studies half-way around the world in inter-plate regions. We investigate earthquake sources and scaling from different tectonic settings, comparing direct and coda wave analysis methods. We begin by developing and improving the two different methods, and then in future years we will apply them both to each set of earthquakes. Analysis of locally recorded, direct waves from events is intuitively the simplest way of obtaining accurate source parameters, as these waves have been least affected by travel through the earth. But there are only a limited number of earthquakes that are recorded locally, by sufficient stations to give good azimuthal coverage, and have very closely located smaller earthquakes that can be used as an empirical Green's function (EGF) to remove path effects. 
In contrast, coda waves average radiation from all directions so single-station records should be adequate, and previous work suggests that the requirements for the EGF event are much less stringent. We can study more earthquakes using the coda-wave methods, while using direct wave methods for the best recorded subset of events so as to investigate any differences between the results of the two approaches. Finding 'perfect' EGF events for direct wave analysis is difficult, as is ascertaining the quality of a particular EGF event. We develop a multi-taper method to obtain time-domain source-time-functions by frequency division. If an earthquake and EGF event pair are able to produce a clear, time-domain source pulse then we accept the EGF event. We then model the spectral (amplitude) ratio to determine source parameters from both direct P and S waves. We use the well-recorded sequence of aftershocks of the M5 Au Sable Forks, NY, earthquake to test the method and also to obtain some of the first accurate source parameters for small earthquakes in eastern North America. We find that the stress drops are high, confirming previous work suggesting that intraplate continental earthquakes have higher stress drops than events at plate boundaries. We simplify and improve the coda wave analysis method by calculating spectral ratios between different sized earthquakes. We first compare spectral ratio performance between local and near-regional S and coda waves in the San Francisco Bay region for moderate-sized events. The average spectral ratio standard deviations using coda are ~0.05 to 0.12, roughly a factor of 3 smaller than direct S-waves for 0.2 < f < 15.0 Hz. Also, direct wave analysis requires collocated pairs of earthquakes whereas the event-pairs (Green's function and target events) can be separated by ~25 km for coda amplitudes without any appreciable degradation. 
We then apply the coda spectral ratio method to the 1999 Hector Mine mainshock (Mw 7.0, Mojave Desert) and its larger aftershocks. We observe a clear departure from self-similarity, consistent with previous studies using similar regional datasets.
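The path-cancellation at the heart of the spectral-ratio approach can be sketched numerically. The following is a minimal illustration, not the authors' code: two hypothetical co-located events share a common attenuation term, and dividing their observed spectra recovers the pure source ratio. The Brune omega-squared model, seismic moments, corner frequencies, and t* value are all invented for the sketch.

```python
import math

def brune_spectrum(f, moment, fc):
    # Omega-squared (Brune) source model: flat below the corner frequency fc,
    # falling off as f**-2 above it
    return moment / (1.0 + (f / fc) ** 2)

def path_attenuation(f, t_star=0.05):
    # Anelastic attenuation shared by co-located events at a common station
    return math.exp(-math.pi * f * t_star)

freqs = [0.5, 1.0, 2.0, 5.0, 10.0, 20.0]

# Hypothetical target event and a much smaller empirical Green's function (EGF)
target = [brune_spectrum(f, 1e17, 1.0) * path_attenuation(f) for f in freqs]
egf = [brune_spectrum(f, 1e14, 10.0) * path_attenuation(f) for f in freqs]

# Dividing the observed spectra cancels the shared path term, leaving only
# the ratio of the two source spectra, from which corner frequencies
# (and hence stress drops) can be estimated
ratio = [t / e for t, e in zip(target, egf)]
source_ratio = [brune_spectrum(f, 1e17, 1.0) / brune_spectrum(f, 1e14, 10.0)
                for f in freqs]
```

In this noise-free sketch the observed ratio equals the source ratio exactly; with real data the same division only approximately removes the path, which is why EGF quality control (the clear time-domain source pulse mentioned above) matters.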
Scaling of coupled dilatancy-diffusion processes in space and time
NASA Astrophysics Data System (ADS)
Main, I. G.; Bell, A. F.; Meredith, P. G.; Brantut, N.; Heap, M.
2012-04-01
Coupled dilatancy-diffusion processes resulting from microscopically brittle damage due to precursory cracking have been observed in the laboratory and suggested as a mechanism for earthquake precursors. One reason precursors have proven elusive may be the scaling in space: recent geodetic and seismic data placing strong limits on the spatial extent of the nucleation zone for recent earthquakes. Another may be the scaling in time: recent laboratory results on axi-symmetric samples show both a systematic decrease in circumferential extensional strain at failure and a delayed and a sharper acceleration of acoustic emission event rate as strain rate is decreased. Here we examine the scaling of such processes in time from laboratory to field conditions using brittle creep (constant stress loading) to failure tests, in an attempt to bridge part of the strain rate gap to natural conditions, and discuss the implications for forecasting the failure time. Dilatancy rate is strongly correlated to strain rate, and decreases to zero in the steady-rate creep phase at strain rates around 10-9 s-1 for a basalt from Mount Etna. The data are well described by a creep model based on the linear superposition of transient (decelerating) and accelerating micro-crack growth due to stress corrosion. The model produces good fits to the failure time in retrospect using the accelerating acoustic emission event rate, but in prospective tests on synthetic data with the same properties we find failure-time forecasting is subject to systematic epistemic and aleatory uncertainties that degrade predictability. The next stage is to use the technology developed to attempt failure forecasting in real time, using live streamed data and a public web-based portal to quantify the prospective forecast quality under such controlled laboratory conditions.
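Retrospective failure-time fits from an accelerating event rate are often done with an inverse-rate (Voight-type) extrapolation: if the rate diverges as a power law toward failure, its reciprocal falls linearly to zero at the failure time. The sketch below assumes that simple model (the paper's actual creep model is more elaborate) and uses noise-free synthetic numbers, so the forecast recovers the failure time exactly; the aleatory degradation discussed above appears as soon as noise is added.

```python
# Synthetic accelerating AE rate: rate(t) = k / (t_f - t), so 1/rate is
# linear in t and crosses zero at the failure time t_f (assumed t_f = 100)
t_f, k = 100.0, 50.0
times = [t_f - dt for dt in (50.0, 40.0, 30.0, 20.0, 10.0, 5.0)]
inv_rate = [(t_f - t) / k for t in times]

# Least-squares line through (t, 1/rate); forecast failure where it hits zero
n = len(times)
mt = sum(times) / n
mr = sum(inv_rate) / n
slope = (sum((t - mt) * (r - mr) for t, r in zip(times, inv_rate))
         / sum((t - mt) ** 2 for t in times))
intercept = mr - slope * mt
t_forecast = -intercept / slope   # recovers t_f for noise-free data
```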
NASA Astrophysics Data System (ADS)
Managave, S. R.; Jani, R. A.; Narayana Rao, T.; Sunilkumar, K.; Satheeshkumar, S.; Ramesh, R.
2016-08-01
Evaporation of rain is known to contribute water vapor, a potent greenhouse gas, to the atmosphere. Stable oxygen and hydrogen isotopic compositions (δ18O and δD, respectively) of precipitation, usually measured/presented as values integrated over rain events or as monthly means, are important tools for detecting evaporation effects. The slope of ~8 of the linear relationship between such time-averaged values of δD and δ18O (called the meteoric water line) is widely accepted as proof of condensation under isotopic equilibrium and of the absence of evaporation of rain during atmospheric fall. Here, through a simultaneous investigation of the isotopic and drop size distributions of seventeen rain events sampled on an intra-event scale at Gadanki (13.5°N, 79.2°E), southern India, we demonstrate that evaporation effects, not evident in the time-averaged data, are significantly manifested in the sub-samples of individual rain events. We detect this through (1) slopes significantly less than 8 for the δD-δ18O relation on the intra-event scale and (2) significant positive correlations between deuterium excess (d-excess = δD - 8*δ18O; lower values in rain indicate evaporation) and the mass-weighted mean diameter of the raindrops (Dm). An estimated ~44% of rain is influenced by evaporation. This study also reveals a signature of isotopic equilibration of rain with the cloud-base vapor, a process important for modeling the isotopic composition of precipitation. d-excess values of rain are modified by these post-condensation processes, and the present approach offers a way to identify the d-excess values least affected by them. Isotope-enabled global circulation models could be improved by incorporating intra-event isotopic data and raindrop-size-dependent isotopic effects.
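The two diagnostics used in this abstract, the δD-δ18O slope and the d-excess, are simple to compute. A toy example with invented sub-event samples (not the Gadanki data): one set lies on the global meteoric water line (slope 8, d-excess 10), while the other has the reduced slope and falling d-excess that signal evaporation.

```python
# (d18O, dD) pairs, per mille; invented illustrative values
samples_eq = [(-8.0, -54.0), (-6.0, -38.0), (-4.0, -22.0)]    # on dD = 8*d18O + 10
samples_evap = [(-7.0, -50.0), (-5.0, -38.0), (-3.0, -26.0)]  # evaporated: slope 6

def d_excess(d18o, dd):
    # Deuterium excess; values well below 10 suggest sub-cloud evaporation
    return dd - 8.0 * d18o

def slope(pairs):
    # Ordinary least-squares slope of dD against d18O
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    return (sum((x - mx) * (y - my) for x, y in pairs)
            / sum((x - mx) ** 2 for x, _ in pairs))

excess_evap = [d_excess(x, y) for x, y in samples_evap]
```

For the evaporated set the d-excess drops as the drops become isotopically enriched, which is the intra-event signature the study exploits.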
NASA Astrophysics Data System (ADS)
Bansah, S.; Ali, G.; Haque, M. A.; Tang, V.
2017-12-01
The proportion of precipitation that becomes streamflow is a function of internal catchment characteristics - which include geology, landscape characteristics and vegetation - that influence overall storage dynamics. The timing and quantity of water discharged by a catchment are indeed embedded in event hydrographs. Event hydrograph timing parameters, such as the response lag and time of concentration, are important descriptors of how long it takes the catchment to respond to input precipitation and how long it takes the latter to filter through the catchment. However, the extent to which hydrograph timing parameters relate to average response times derived from fitting transfer functions to annual hydrographs is unknown. In this study, we used a gamma transfer function to determine catchment average response times as well as event-specific hydrograph parameters across a network of eight nested prairie catchments, ranging from 0.19 km2 to 74.6 km2, located in south-central Manitoba (Canada). Various statistical analyses were then performed to correlate average response times - estimated using the parameters of the fitted gamma transfer function - with event-specific hydrograph parameters. Preliminary results show significant interannual variations in response times and hydrograph timing parameters: the former were on the order of a few hours to days, while the latter ranged from a few days to weeks. Some statistically significant relationships were detected between response times and event-specific hydrograph parameters. Future analyses will involve the comparison of statistical distributions of event-specific hydrograph parameters with those of runoff response times and baseflow transit times in order to quantify catchment storage dynamics across a range of temporal scales.
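A gamma transfer function of the kind fitted here has a mean response time equal to shape × scale, which is one way the "average response time" can be read off the fitted parameters. A small sketch with assumed shape and scale values (not the study's fits), verifying the moment numerically:

```python
import math

def gamma_iuh(t, shape, scale):
    # Gamma-shaped instantaneous unit hydrograph (the transfer function)
    if t <= 0.0:
        return 0.0
    return (t ** (shape - 1.0) * math.exp(-t / scale)
            / (scale ** shape * math.gamma(shape)))

dt = 0.1
t_grid = [i * dt for i in range(600)]      # 0 .. 59.9, assumed time units
h = [gamma_iuh(t, shape=2.0, scale=3.0) for t in t_grid]

# The transfer function integrates to ~1, and its first moment (the mean
# response time) is shape * scale = 6.0 for these assumed parameters
mass = sum(hi * dt for hi in h)
mean_response = sum(t * hi * dt for t, hi in zip(t_grid, h))
```

Convolving this kernel with an observed precipitation series would yield the modelled annual hydrograph; event-specific timing parameters are instead read off individual storm hydrographs.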
NASA Astrophysics Data System (ADS)
Frassinetti, L.; Dodt, D.; Beurskens, M. N. A.; Sirinelli, A.; Boom, J. E.; Eich, T.; Flanagan, J.; Giroud, C.; Jachmich, M. S.; Kempenaars, M.; Lomas, P.; Maddison, G.; Maggi, C.; Neu, R.; Nunes, I.; Perez von Thun, C.; Sieglin, B.; Stamp, M.; Contributors, JET-EFDA
2015-02-01
The baseline type-I ELMy H-mode scenario has been re-established in JET with the new tungsten MKII-HD divertor and beryllium on the main wall (hereafter called the ITER-like wall, JET-ILW). The first JET-ILW results show that the confinement is degraded by 20-30% in the baseline scenarios compared to the previous carbon wall JET (JET-C) plasmas. The degradation is mainly driven by the reduction in the pedestal temperature. Stored energies and pedestal temperature comparable to the JET-C have been obtained to date in JET-ILW baseline plasmas only in the high triangularity shape using N2 seeding. This work compares the energy losses during ELMs and the corresponding time scales of the temperature and density collapse in JET-ILW baseline plasmas with and without N2 seeding with similar JET-C baseline plasmas. ELMs in the JET-ILW differ from those with the carbon wall both in terms of time scales and energy losses. The ELM time scale, defined as the time to reach the minimum pedestal temperature soon after the ELM collapse, is ˜2 ms in the JET-ILW and lower than 1 ms in the JET-C. The energy losses are in the range ΔWELM/Wped ≈ 7-12% in the JET-ILW and ΔWELM/Wped ≈ 10-20% in JET-C, and fit relatively well with earlier multi-machine empirical scalings of ΔWELM/Wped with collisionality. The time scale of the ELM collapse seems to be related to the pedestal collisionality. Most of the non-seeded JET-ILW ELMs are followed by a further energy drop characterized by a slower time scale ˜8-10 ms (hereafter called slow transport events), that can lead to losses in the range ΔWslow/Wped ≈ 15-22%, slightly larger than the losses in JET-C. The N2 seeding in JET-ILW significantly affects the ELMs. The JET-ILW plasmas with N2 seeding are characterized by ELM energy losses and time scales similar to the JET-C and by the absence of the slow transport events.
Future climate risk from compound events
NASA Astrophysics Data System (ADS)
Zscheischler, Jakob; Westra, Seth; van den Hurk, Bart J. J. M.; Seneviratne, Sonia I.; Ward, Philip J.; Pitman, Andy; AghaKouchak, Amir; Bresch, David N.; Leonard, Michael; Wahl, Thomas; Zhang, Xuebin
2018-06-01
Floods, wildfires, heatwaves and droughts often result from a combination of interacting physical processes across multiple spatial and temporal scales. The combination of processes (climate drivers and hazards) leading to a significant impact is referred to as a `compound event'. Traditional risk assessment methods typically only consider one driver and/or hazard at a time, potentially leading to underestimation of risk, as the processes that cause extreme events often interact and are spatially and/or temporally dependent. Here we show how a better understanding of compound events may improve projections of potential high-impact events, and can provide a bridge between climate scientists, engineers, social scientists, impact modellers and decision-makers, who need to work closely together to understand these complex events.
Seismic and Infrasound Location
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowsmith, Stephen J.; Begnaud, Michael L.
2014-03-19
This presentation includes slides on Signal Propagation Through the Earth/Atmosphere Varies at Different Scales; 3D Seismic Models: RSTT; Ray Coverage (Pn); Source-Specific Station Corrections (SSSCs); RSTT Conclusions; SALSA3D (SAndia LoS Alamos) Global 3D Earth Model for Travel Time; Comparison of IDC SSSCs to RSTT Predictions; SALSA3D; Validation and Model Comparison; DSS Lines in the Siberian Platform; DSS Line CRA-4 Comparison; Travel Time Δak135; Travel Time Prediction Uncertainty; SALSA3D Conclusions; Infrasound Data Processing: An example event; Infrasound Data Processing: An example event; Infrasound Location; How does BISL work?; BISL: Application to the 2013 DPRK Test; and BISL: Ongoing Research.
The Spatial Scaling of Global Rainfall Extremes
NASA Astrophysics Data System (ADS)
Devineni, N.; Xi, C.; Lall, U.; Rahill-Marier, B.
2013-12-01
Floods associated with severe storms are a significant source of risk for property, life and supply chains. These property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. High-duration floods are typically induced by persistent rainfall (up to 30-day duration), as seen recently in Thailand, Pakistan, the Ohio and the Mississippi Rivers, France, and Germany. Events related to persistent and recurrent rainfall appear to correspond to the persistence of specific global climate patterns that may be identifiable from global, historical data fields, and also from climate models that project future conditions. A clear understanding of the space-time rainfall patterns for events or for a season will aid in assessing the spatial distribution of areas likely to have a high/low inundation potential for each type of rainfall forcing. In this paper, we investigate the statistical properties of the spatial manifestation of rainfall exceedances. We also investigate the connection of persistent rainfall events at different latitudinal bands to large-scale climate phenomena such as ENSO. Finally, we present the scaling phenomena of contiguous flooded areas that result from the large-scale organization of long-duration rainfall events. This can be used for spatially distributed flood risk assessment conditional on a particular rainfall scenario. Statistical models for spatio-temporal loss simulation, including model uncertainty, can be developed to support regional and portfolio analysis.
Nuclear Power - Post Fukushima
NASA Astrophysics Data System (ADS)
Reyes, Jose, Jr.
2011-10-01
The extreme events that led to the prolonged power outage at the Fukushima Daiichi nuclear plant have highlighted the importance of assuring a means for stable long-term cooling of the nuclear fuel and containment following a complete station blackout. Legislative bodies, regulatory agencies and industry are drawing lessons from those events and considering what changes, if any, are needed to nuclear power, post Fukushima. The enhanced safety of a new class of reactor designed by NuScale Power is drawing significant attention in light of the Fukushima events. During normal operation, each NuScale containment is fully immersed in a water-filled, stainless-steel-lined concrete pool that resides underground. The pool, housed in a Seismic Category I building, is large enough to provide 30 days of core and containment cooling without adding water. After 30 days, the reduced decay heat generation coupled with thermal radiation heat transfer is adequate to remove core decay heat for an unlimited period of time. These passive safety systems can perform their function without requiring an external supply of water or power. An assessment of the NuScale passive systems is being performed through a comprehensive test program that includes the NuScale integral system test facility at Oregon State University.
Identifying large scale structures at 1 AU using fluctuations and wavelets
NASA Astrophysics Data System (ADS)
Niembro, T.; Lara, A.
2016-12-01
The solar wind (SW) is inhomogeneous and is dominated by two types of flows: one quasi-stationary and one related to large-scale transients (such as coronal mass ejections and co-rotating interaction regions). The SW inhomogeneities can be studied as fluctuations characterized by a wide range of length and time scales. We are interested in the characteristic fluctuations caused by large-scale transient events. To do so, we define the vector space F with the normalized moving monthly/annual deviations as the orthogonal basis. We then compute the norm in this space of the fluctuations of the solar wind parameters (velocity, magnetic field, density and temperature) using WIND data from August 1992 to August 2015. This norm carries important information about the presence of a large-structure disturbance in the solar wind, and by applying a wavelet transform to it we are able to determine, without subjectivity, the duration of the compression regions of these large transient structures and, even more, to identify whether the structure corresponds to a single or a complex (or merged) event. With this method we have automatically detected most of the events identified and published by other authors.
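The norm-of-normalized-deviations idea can be caricatured with a toy two-parameter series. Everything here is invented (the numbers, the trailing-window deviation standing in for the paper's moving monthly/annual deviations, and the two-component norm in place of the four solar wind parameters); the point is only that the norm spikes at the onset of a transient.

```python
import math

# Toy solar-wind-like series: a quiet background with one large-scale transient
n = 200
velocity = [400.0] * n
density = [5.0] * n
for i in range(100, 120):        # hypothetical interval of a transient structure
    velocity[i] = 600.0
    density[i] = 15.0

def deviations(series, window=30):
    # Deviation from a trailing moving mean
    out = []
    for i in range(len(series)):
        lo = max(0, i - window)
        mean = sum(series[lo:i + 1]) / (i + 1 - lo)
        out.append(series[i] - mean)
    return out

def normalize(devs):
    top = max(abs(d) for d in devs)
    return [d / top for d in devs]

dv = normalize(deviations(velocity))
dn = normalize(deviations(density))

# Norm of the fluctuation vector: spikes mark large-structure disturbances
norm = [math.sqrt(a * a + b * b) for a, b in zip(dv, dn)]
peak_index = max(range(n), key=lambda i: norm[i])
```

In this toy run the norm peaks exactly at the transient's onset; in the paper a wavelet transform of the norm is then used to delimit the compression region objectively.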
Time Scale Hierarchies in the Functional Organization of Complex Behaviors
Perdikis, Dionysios; Huys, Raoul; Jirsa, Viktor K.
2011-01-01
Traditional approaches to cognitive modelling generally portray cognitive events in terms of ‘discrete’ states (point attractor dynamics) rather than in terms of processes, thereby neglecting the time structure of cognition. In contrast, more recent approaches explicitly address this temporal dimension, but typically provide no entry points into cognitive categorization of events and experiences. With the aim to incorporate both these aspects, we propose a framework for functional architectures. Our approach is grounded in the notion that arbitrary complex (human) behaviour is decomposable into functional modes (elementary units), which we conceptualize as low-dimensional dynamical objects (structured flows on manifolds). The ensemble of modes at an agent’s disposal constitutes his/her functional repertoire. The modes may be subjected to additional dynamics (termed operational signals), in particular, instantaneous inputs, and a mechanism that sequentially selects a mode so that it temporarily dominates the functional dynamics. The inputs and selection mechanisms act on faster and slower time scales than that inherent to the modes, respectively. The dynamics across the three time scales are coupled via feedback, rendering the entire architecture autonomous. We illustrate the functional architecture in the context of serial behaviour, namely cursive handwriting. Subsequently, we investigate the possibility of recovering the contributions of functional modes and operational signals from the output, which appears to be possible only when examining the output phase flow (i.e., not from trajectories in phase space or time). PMID:21980278
Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.
Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal
2012-09-01
While the value of geographic information systems (GIS) is widely recognized in public health, there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying and visualization. The present report addresses spatio-temporal analyses in a complex food distribution system and defines influence areas as travel-time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research, GIS is used to map the Canadian produce distribution system, analyze accessibility to contaminated product by consumers, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Intrinsic Patterns of Human Activity
NASA Astrophysics Data System (ADS)
Hu, Kun; Ivanov, Plamen Ch.; Chen, Zhi; Hilton, Michael; Stanley, H. Eugene; Shea, Steven
2003-03-01
Activity is one of the defining features of life. Control of human activity is complex, being influenced by many factors both extrinsic and intrinsic to the body. The most obvious extrinsic factors that affect activity are the daily schedule of planned events, such as work and recreation, as well as reactions to unforeseen or random events. These extrinsic factors may account for the apparently random fluctuations in human motion observed over short time scales. The most obvious intrinsic factors are the body clocks including the circadian pacemaker that influences our sleep/wake cycle and ultradian oscillators with shorter time scales [2, 3]. These intrinsic rhythms may account for the underlying regularity in average activity level over longer periods of up to 24 h. Here we ask if the known extrinsic and intrinsic factors fully account for all complex features observed in recordings of human activity. To this end, we measure activity over two weeks from forearm motion in subjects undergoing their regular daily routine. Utilizing concepts from statistical physics, we demonstrate that during wakefulness human activity possesses previously unrecognized complex dynamic patterns. These patterns of activity are characterized by robust fractal and nonlinear dynamics including a universal probability distribution and long-range power-law correlations that are stable over a wide range of time scales (from minutes to hours). Surprisingly, we find that these dynamic patterns are unaffected by changes in the average activity level that occur within individual subjects throughout the day and on different days of the week, and between subjects. Moreover, we find that these patterns persist when the same subjects undergo time-isolation laboratory experiments designed to account for the phase of the circadian pacemaker, and control the known extrinsic factors by restricting behaviors and manipulating scheduled events including the sleep/wake cycle. 
We attribute these newly discovered patterns to a robust intrinsic multi-scale dynamic regulation of human activity that is independent of known extrinsic factors, and independent from the circadian and ultradian rhythms.
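The long-range power-law correlations reported here are commonly estimated with detrended fluctuation analysis (DFA); that choice is an assumption of this sketch, since the abstract does not spell out the authors' exact pipeline. A compact DFA implementation, checked on white noise, for which the expected scaling exponent is ~0.5 (activity with long-range correlations would give an exponent nearer 1):

```python
import math
import random

def dfa_alpha(x, box_sizes):
    # Detrended fluctuation analysis: integrate the series, remove a linear
    # trend in each non-overlapping box, and fit the log-log slope of the
    # r.m.s. fluctuation F(n) against box size n
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)                      # cumulative (integrated) profile
    log_n, log_f = [], []
    for n in box_sizes:
        sq_sum, boxes = 0.0, 0
        for start in range(0, len(y) - n + 1, n):
            seg = y[start:start + n]
            t = range(n)
            mt = (n - 1) / 2.0
            ms = sum(seg) / n
            denom = sum((ti - mt) ** 2 for ti in t)
            b = sum((ti - mt) * (si - ms) for ti, si in zip(t, seg)) / denom
            a = ms - b * mt              # least-squares trend in this box
            sq_sum += sum((si - (a + b * ti)) ** 2
                          for ti, si in zip(t, seg)) / n
            boxes += 1
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sq_sum / boxes)))
    k = len(log_n)
    mn, mf = sum(log_n) / k, sum(log_f) / k
    return (sum((u - mn) * (v - mf) for u, v in zip(log_n, log_f))
            / sum((u - mn) ** 2 for u in log_n))

random.seed(0)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_alpha(white, [8, 16, 32, 64, 128])   # expect alpha near 0.5
```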
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series
2017-01-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325
Pizzuto, James; Schenk, Edward R.; Hupp, Cliff R.; Gellis, Allen; Noe, Greg; Williamson, Elyse; Karwan, Diana L.; O'Neal, Michael; Marquard, Julia; Aalto, Rolf E.; Newbold, Denis
2014-01-01
Watershed Best Management Practices (BMPs) are often designed to reduce loading from particle-borne contaminants, but the temporal lag between BMP implementation and improvement in receiving water quality is difficult to assess because particles are only moved downstream episodically, resting for long periods in storage between transport events. A theory is developed that describes the downstream movement of suspended sediment particles accounting for the time particles spend in storage given sediment budget data (by grain size fraction) and information on particle transit times through storage reservoirs. The theory is used to define a suspended sediment transport length scale that describes how far particles are carried during transport events, and to estimate a downstream particle velocity that includes time spent in storage. At 5 upland watersheds of the mid-Atlantic region, transport length scales for silt-clay range from 4 to 60 km, while those for sand range from 0.4 to 113 km. Mean sediment velocities for silt-clay range from 0.0072 km/yr to 0.12 km/yr, while those for sand range from 0.0008 km/yr to 0.20 km/yr, 4–6 orders of magnitude slower than the velocity of water in the channel. These results suggest lag times of 100–1000 years between BMP implementation and effectiveness in receiving waters such as the Chesapeake Bay (where BMPs are located upstream of the characteristic transport length scale). Many particles likely travel much faster than these average values, so further research is needed to determine the complete distribution of suspended sediment velocities in real watersheds.
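The headline lag times follow from dividing the transport distance by the storage-inclusive particle velocity. A trivial sketch with hypothetical values drawn from within the reported ranges (not specific study sites):

```python
# Lag = distance to receiving water / mean particle velocity (incl. storage);
# the distances and velocities below are illustrative, not measured values
def lag_years(distance_km, velocity_km_per_yr):
    return distance_km / velocity_km_per_yr

silt_clay_lag = lag_years(distance_km=50.0, velocity_km_per_yr=0.05)  # ~1000 yr
sand_lag = lag_years(distance_km=50.0, velocity_km_per_yr=0.01)       # ~5000 yr
```

Because storage-inclusive velocities are 4–6 orders of magnitude below the water velocity, even modest distances imply century-to-millennium lags between BMP implementation and downstream effect.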
NASA Astrophysics Data System (ADS)
Reddy, Ramakrushna; Nair, Rajesh R.
2013-10-01
This work deals with a methodology applied to seismic early warning systems, which are designed to provide real-time estimation of the magnitude of an event. We reappraise the work of Simons et al. (2006), who, on the basis of a wavelet approach, predicted a magnitude error of ±1. We verify and improve upon the methodology of Simons et al. (2006) by applying an SVM statistical learning machine to the time-scale wavelet decomposition methods. We used the data of 108 events in central Japan with magnitudes ranging from 3 to 7.4 recorded at KiK-net network stations, for source-receiver distances of up to 150 km during the period 1998-2011. We applied a wavelet transform to the seismogram data and calculated scale-dependent threshold wavelet coefficients. These coefficients were then classified into low-magnitude and high-magnitude events by constructing a maximum-margin hyperplane between the two classes, which forms the essence of SVMs. Further, the classified events from both classes were picked and linear regressions were fitted to determine the relationship between wavelet coefficient magnitude and earthquake magnitude, which in turn helped us to estimate the earthquake magnitude of an event given its threshold wavelet coefficient. At wavelet scale number 7, we predicted the earthquake magnitude of an event within 2.7 seconds, meaning that a magnitude determination is available within 2.7 s after the initial onset of the P-wave. These results shed light on the application of SVMs as a way to choose the optimal regression function to estimate the magnitude from a few seconds of an incoming seismogram. This improves on the approach of Simons et al. (2006), which uses an average of two regression functions to estimate the magnitude.
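The classify-then-regress pipeline can be caricatured in a few lines. This is a heavily simplified, pure-Python stand-in, not the paper's code: a one-feature linear SVM trained by sub-gradient descent on the hinge loss plays the role of the maximum-margin hyperplane, and a per-class linear regression then maps a wavelet coefficient to a magnitude. The event list and coefficients are invented.

```python
# (magnitude, threshold wavelet coefficient) pairs -- invented, with the
# coefficient assumed to grow linearly with magnitude
events = [(3.0, 0.30), (3.5, 0.35), (4.0, 0.40), (4.5, 0.45),
          (5.5, 0.55), (6.0, 0.60), (6.5, 0.65), (7.0, 0.70)]
labels = [1 if m >= 5.0 else -1 for m, _ in events]   # high vs low magnitude

# One-feature linear SVM via sub-gradient descent on the regularized hinge loss
w, b, lr, lam = 0.0, 0.0, 0.1, 0.01
for _ in range(2000):
    for (m, c), y in zip(events, labels):
        if y * (w * c + b) < 1.0:          # margin violated: hinge gradient
            w += lr * (y * c - lam * w)
            b += lr * y
        else:                              # only the regularizer acts
            w -= lr * lam * w

predicted = [1 if w * c + b >= 0.0 else -1 for _, c in events]

# Per-class linear regression: magnitude as a function of the coefficient
def fit(pairs):
    n = len(pairs)
    mc = sum(c for _, c in pairs) / n
    mm = sum(m for m, _ in pairs) / n
    s = (sum((c - mc) * (m - mm) for m, c in pairs)
         / sum((c - mc) ** 2 for _, c in pairs))
    return s, mm - s * mc

high = [pair for pair, y in zip(events, labels) if y == 1]
slope_hi, icpt_hi = fit(high)
estimated_mag = slope_hi * 0.60 + icpt_hi   # magnitude from one new coefficient
```

Because the SVM first routes a coefficient to the right class, only the matching regression is applied, rather than averaging two regression functions.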
Anticipating flash-floods: Multi-scale aspects of the social response
NASA Astrophysics Data System (ADS)
Lutoff, Céline; Creutin, Jean-Dominique; Ruin, Isabelle; Borga, Marco
2016-10-01
This paper explores the anticipation phase before a flash flood, corresponding to the time between the first climatic signs and the peak flow. We focus the analysis on people's behaviors, observing how they use this period to organize themselves to face the event. The analysis is made through the definition of three specific scales: the timeliness scale, an analytical scale of anticipatory actions, and the scale of the human response network. A cross-scale and cross-level analysis makes it possible to define different phases in the anticipation period, in which different kinds of environmental precursors are mobilized by the actors in order to make sense of the situation and adapt. Three main points deserve attention: firstly, the concepts of timeliness, anticipatory actions and crisis network scales enable a different understanding of what happens both physically and socially during an extreme event; secondly, analysis of the precursors shows that each level of the crisis network uses different kinds of signs for estimating the situation, organizing and reacting; thirdly, there is potential for improved observation of both social and physical processes at different scales, for verifying the theory of the anticipatory phases.
Lacour, C; Joannis, C; Gromaire, M-C; Chebbo, G
2009-01-01
Turbidity sensors can be used to continuously monitor the evolution of pollutant mass discharge. For two sites within the Paris combined sewer system, continuous turbidity, conductivity and flow data were recorded at one-minute time intervals over a one-year period. This paper is intended to highlight the variability in turbidity dynamics during wet weather. For each storm event, turbidity response aspects were analysed through different classifications. The correlation between classification and common parameters, such as the antecedent dry weather period, total event volume per impervious hectare and both the mean and maximum hydraulic flow for each event, was also studied. Moreover, the dynamics of flow and turbidity signals were compared at the event scale. No simple relation between turbidity responses, hydraulic flow dynamics and the chosen parameters was derived from this effort. Knowledge of turbidity dynamics could therefore potentially improve wet weather management, especially when using pollution-based real-time control (P-RTC) since turbidity contains information not included in hydraulic flow dynamics and not readily predictable from such dynamics.
Test of the efficiency of three storm water quality models with a rich set of data.
Ahyerre, M; Henry, F O; Gogien, F; Chabanel, M; Zug, M; Renaudet, D
2005-01-01
The objective of this article is to test the efficiency of three different Storm Water Quality Models (SWQM) on the same data set (34 rain events, SS measurements) sampled on a 42 ha watershed in the center of Paris. The models have been calibrated at the scale of the rain event. Considering the mass of pollution calculated per event, the results of the models are satisfactory, but they are of the same order of magnitude as a simple hydraulic approach combined with a constant concentration. Second, the mass of pollutant at the outlet of the catchment was calculated at the global scale of the 34 events. This approach shows that the simple hydraulic calculation gives better results than the SWQM. Finally, the pollutographs are analysed, showing that storm water quality models are interesting tools for representing the shape of the pollutographs and the dynamics of the phenomenon, which can be useful in some projects for managers.
Just-in-time connectivity for large spiking networks.
Lytton, William W; Omurtag, Ahmet; Neymotin, Samuel A; Hines, Michael L
2008-11-01
The scale of large neuronal network simulations is memory limited due to the need to store connectivity information: connectivity storage grows as the square of neuron number up to anatomically relevant limits. Using the NEURON simulator as a discrete-event simulator (no integration), we explored the consequences of avoiding the space costs of connectivity through regenerating connectivity parameters when needed: just in time after a presynaptic cell fires. We explored various strategies for automated generation of one or more of the basic static connectivity parameters: delays, postsynaptic cell identities, and weights, as well as run-time connectivity state: the event queue. Comparison of the JitCon implementation to NEURON's standard NetCon connectivity method showed substantial space savings, with associated run-time penalty. Although JitCon saved space by eliminating connectivity parameters, larger simulations were still memory limited due to growth of the synaptic event queue. We therefore designed a JitEvent algorithm that added items to the queue only when required: instead of alerting multiple postsynaptic cells, a spiking presynaptic cell posted a callback event at the shortest synaptic delay time. At the time of the callback, this same presynaptic cell directly notified the first postsynaptic cell and generated another self-callback for the next delay time. The JitEvent implementation yielded substantial additional time and space savings. We conclude that just-in-time strategies are necessary for very large network simulations but that a variety of alternative strategies should be considered whose optimality will depend on the characteristics of the simulation to be run.
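The JitEvent callback chain described above can be sketched with a binary heap. This is an illustrative toy, not NEURON code: the delays and target identities are invented, and in a real just-in-time scheme they would be regenerated on demand rather than stored.

```python
import heapq

# Per-source synaptic delays (sorted ascending) and corresponding targets
delays = {0: [1.0, 2.5, 4.0]}       # source neuron 0 projects to three targets
targets = {0: [10, 11, 12]}

queue = []        # event queue entries: (delivery_time, source, target_index)
delivered = []    # (time, postsynaptic cell) pairs, for inspection

def spike(source, t):
    # JitEvent: post ONE callback at the shortest delay, instead of
    # immediately enqueueing one event per postsynaptic target
    heapq.heappush(queue, (t + delays[source][0], source, 0))

def run():
    while queue:
        time, src, k = heapq.heappop(queue)
        delivered.append((time, targets[src][k]))       # notify this target
        if k + 1 < len(delays[src]):
            # self-callback for the next target's delivery time
            spike_time = time - delays[src][k]
            heapq.heappush(queue, (spike_time + delays[src][k + 1], src, k + 1))

spike(0, t=5.0)
run()
```

Note that the queue never holds more than one entry per spiking source at a time, which is the space saving over the eager scheme of enqueueing all three deliveries at once.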
Just in time connectivity for large spiking networks
Lytton, William W.; Omurtag, Ahmet; Neymotin, Samuel A; Hines, Michael L
2008-01-01
The scale of large neuronal network simulations is memory-limited due to the need to store connectivity information: connectivity storage grows as the square of neuron number up to anatomically-relevant limits. Using the NEURON simulator as a discrete-event simulator (no integration), we explored the consequences of avoiding the space costs of connectivity through regenerating connectivity parameters when needed – just-in-time after a presynaptic cell fires. We explored various strategies for automated generation of one or more of the basic static connectivity parameters: delays, postsynaptic cell identities and weights, as well as run-time connectivity state: the event queue. Comparison of the JitCon implementation to NEURON’s standard NetCon connectivity method showed substantial space savings, with associated run-time penalty. Although JitCon saved space by eliminating connectivity parameters, larger simulations were still memory-limited due to growth of the synaptic event queue. We therefore designed a JitEvent algorithm that only added items to the queue when required: instead of alerting multiple postsynaptic cells, a spiking presynaptic cell posted a callback event at the shortest synaptic delay time. At the time of the callback, this same presynaptic cell directly notified the first postsynaptic cell and generated another self-callback for the next delay time. The JitEvent implementation yielded substantial additional time and space savings. We conclude that just-in-time strategies are necessary for very large network simulations but that a variety of alternative strategies should be considered whose optimality will depend on the characteristics of the simulation to be run. PMID:18533821
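The just-in-time scheme described above lends itself to a compact sketch. The following Python (not the NEURON implementation; all names and parameter values are hypothetical) regenerates a cell's static connectivity from a seeded RNG instead of storing it, and posts a single self-renewing callback per spike rather than one queue entry per postsynaptic target:

```python
import heapq
import random

def targets_of(pre_id, n_cells=1000, fan_out=3, base_seed=42):
    # Regenerate, rather than store, cell pre_id's static connectivity:
    # an RNG seeded by pre_id reproduces the same (delay, target, weight)
    # triples every time, so nothing is kept in memory between spikes.
    rng = random.Random(base_seed * 1_000_003 + pre_id)
    conns = [(rng.uniform(1.0, 5.0), rng.randrange(n_cells), rng.gauss(0.5, 0.1))
             for _ in range(fan_out)]
    conns.sort()  # ascending delay, as the callback chain below requires
    return conns

def spike(queue, t, pre_id):
    # JitEvent idea: post ONE callback at the shortest synaptic delay,
    # instead of one queue entry per postsynaptic target.
    first_delay = targets_of(pre_id)[0][0]
    heapq.heappush(queue, (t + first_delay, pre_id, 0))

def run(queue):
    delivered = []
    while queue:
        t_ev, pre_id, idx = heapq.heappop(queue)
        conns = targets_of(pre_id)            # regenerated just in time
        delay, post, weight = conns[idx]
        delivered.append((t_ev, post, weight))
        if idx + 1 < len(conns):              # self-callback for the next delay
            t_spike = t_ev - delay
            heapq.heappush(queue, (t_spike + conns[idx + 1][0], pre_id, idx + 1))
    return delivered
```

Because the RNG is keyed by the presynaptic cell id, the same connectivity is recovered every time the cell fires, so no connectivity table is ever held, and the event queue holds at most one pending entry per spiking cell.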
NASA Astrophysics Data System (ADS)
Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.
2017-12-01
Remotely sensed data have transformed the field of large-scale hydraulic modelling. New digital elevation, hydrography and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental-scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national-scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge-to-gauge correlations.
We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in space. We undertake a number of quality checks of the stochastic model and compare real and simulated footprints to show that the method is able to re-create realistic patterns even at continental scales where there is large variation in flood-generating mechanisms. We then show how these patterns can be used to drive a large-scale 2D hydraulic model to predict regional-scale flooding.
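The footprint idea can be illustrated with a toy event-set generator. This sketch substitutes a Gaussian copula for the conditional multivariate extreme-value model actually used in the work above, so it is for illustration only:

```python
from math import erf, sqrt
import numpy as np

def simulate_footprints(gauge_corr, n_events, seed=0):
    # Sketch of stochastic event-set generation: draw spatially correlated
    # standard-normal values from a gauge-to-gauge correlation matrix, then
    # convert each gauge's value to a return period. NOTE: a Gaussian copula
    # is a deliberate simplification of the conditional multivariate
    # extremes approach, for illustration only.
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.asarray(gauge_corr, float))
    z = rng.standard_normal((n_events, L.shape[0])) @ L.T
    u = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))  # non-exceedance prob.
    return 1.0 / (1.0 - u)  # return period varies gauge to gauge: the 'footprint'
```

Each simulated row is one event whose return period differs from gauge to gauge, which is exactly the spatial variation a 'constant in space' hazard layer cannot represent.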
Environmental stochasticity controls soil erosion variability
Kim, Jongho; Ivanov, Valeriy Y.; Fatichi, Simone
2016-01-01
Understanding soil erosion by water is essential for a range of research areas, but the predictive skill of prognostic models has been repeatedly questioned because of scale limitations of empirical data and the high variability of soil loss across space and time scales. Improved understanding of the underlying processes and their interactions is needed to infer scaling properties of soil loss and better inform predictive methods. This study uses data from multiple environments to highlight the temporal-scale dependency of soil loss: erosion variability decreases at larger scales, but the reduction rate varies with environment. The reduction of variability of the geomorphic response is attributed to a ‘compensation effect’: temporal alternation of events that exhibit either source-limited or transport-limited regimes. The rate of reduction is related to environmental stochasticity, and a novel index is derived to reflect the level of variability of intra- and inter-event hydrometeorologic conditions. A higher stochasticity index implies a larger reduction of soil loss variability (enhanced predictability at aggregated temporal scales) with respect to the mean hydrologic forcing, offering a promising indicator for estimating the degree of uncertainty of erosion assessments. PMID:26925542
Orbital time scale and new C-isotope record for Cenomanian-Turonian boundary stratotype
NASA Astrophysics Data System (ADS)
Sageman, Bradley B.; Meyers, Stephen R.; Arthur, Michael A.
2006-02-01
Previous time scales for the Cenomanian-Turonian boundary (CTB) interval containing Oceanic Anoxic Event II (OAE II) vary by a factor of three. In this paper we present a new orbital time scale for the CTB stratotype established independently of radiometric, biostratigraphic, or geochemical data sets, update revisions of CTB biostratigraphic zonation, and provide a new detailed carbon isotopic record for the CTB study interval. The orbital time scale allows an independent assessment of basal biozone ages relative to the new CTB date of 93.55 Ma (GTS04). The δ13Corg data document the abrupt onset of OAE II, significant variability in δ13Corg values, and values enriched to almost -22‰. These new data underscore the difficulty in defining OAE II termination. Using the new isotope curve and time scale, estimates of OAE II duration can be determined and exported to other sites based on integration of well-established chemostratigraphic and biostratigraphic datums. The new data will allow more accurate calculations of biogeochemical and paleobiologic rates across the CTB.
A Web service-based architecture for real-time hydrologic sensor networks
NASA Astrophysics Data System (ADS)
Wong, B. P.; Zhao, Y.; Kerkez, B.
2014-12-01
Recent advances in web services and cloud computing provide new means by which to process and respond to real-time data. This is particularly true of platforms built for the Internet of Things (IoT). These enterprise-scale platforms have been designed to exploit the IP-connectivity of sensors and actuators, providing a robust means by which to route real-time data feeds and respond to events of interest. While powerful and scalable, these platforms have yet to be adopted by the hydrologic community, where the value of real-time data impacts both scientists and decision makers. We discuss the use of one such IoT platform for the purpose of large-scale hydrologic measurements, showing how rapid deployment and ease-of-use allows scientists to focus on their experiment rather than software development. The platform is hardware agnostic, requiring only IP-connectivity of field devices to capture, store, process, and visualize data in real-time. We demonstrate the benefits of real-time data through a real-world use case by showing how our architecture enables the remote control of sensor nodes, thereby permitting the nodes to adaptively change sampling strategies to capture major hydrologic events of interest.
Guerrier, Claire; Holcman, David
2016-10-18
Binding of molecules, ions or proteins to small target sites is a generic step of cell activation. This process relies on rare stochastic events in which a particle located in a large bulk has to find small and often hidden targets. We present here a hybrid discrete-continuum model that takes into account a stochastic regime governed by rare events and a continuous regime in the bulk. The rare discrete binding events are modeled by a Markov chain for the encounter of small targets by few Brownian particles, for which the arrival time is Poissonian. The large ensemble of particles is described by mass action laws. We use this novel model to predict the time distribution of vesicular release at neuronal synapses. Vesicular release is triggered by the binding of a few calcium ions that can originate either from the synaptic bulk or from entry through calcium channels. We report here that the distribution of release times is bimodal, although release is triggered by a single fast action potential. While the first peak follows the stimulation, the second corresponds to the arrival, over much longer times, of ions located in the synaptic terminal at small vesicular binding targets. To conclude, the present multiscale stochastic modeling approach allows the study of cellular events by integrating discrete molecular events over several time scales.
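The rare-event regime described above can be sketched as a tiny Markov chain with Poissonian waiting times. All rates and names below are illustrative, not the paper's calibrated model:

```python
import random

def release_time(n_ions, n_sites, k_bind, sites_needed, seed=1):
    # Minimal Markov-chain sketch: each binding is a Poissonian event whose
    # rate is proportional to the numbers of free ions and free sites
    # (mass-action style); vesicular release fires once `sites_needed`
    # sites are occupied.
    rng = random.Random(seed)
    t, bound = 0.0, 0
    while bound < sites_needed:
        rate = k_bind * (n_ions - bound) * (n_sites - bound)
        if rate <= 0.0:
            return float("inf")      # too few ions or sites: no release
        t += rng.expovariate(rate)   # exponential waiting time to next binding
        bound += 1
    return t
```

Sampling this over many trials gives a release-time distribution whose tail lengthens as the free-ion pool shrinks, echoing the slow second mode reported above.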
NASA Astrophysics Data System (ADS)
Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E.
2016-04-01
The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979-2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons.
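The kind of trend statistic involved can be sketched as follows; the percentile threshold and least-squares fit are illustrative choices, not necessarily the authors' exact methodology:

```python
import numpy as np

def exceedance_trend(hourly, hours_per_year, q=0.99):
    # Count hours above the q-th percentile of the full record in each
    # year, then fit a straight line to the annual counts. A positive
    # slope means more extreme hours per year over the record.
    hourly = np.asarray(hourly, float)
    thr = np.quantile(hourly, q)
    n_years = hourly.size // hours_per_year
    counts = (hourly[:n_years * hours_per_year]
              .reshape(n_years, hours_per_year) > thr).sum(axis=1)
    slope = np.polyfit(np.arange(n_years), counts, 1)[0]
    return counts, slope
```

Repeating this for accumulation windows of 1 h, 3 h, 6 h, 12 h and 24 h (by summing the hourly series over rolling windows first) would give the duration-resolved trends discussed above.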
NASA Astrophysics Data System (ADS)
2012-05-01
Education: Physics Education Networks meeting has global scale
Competition: Competition seeks the next Brian Cox
Experiment: New measurement of neutrino time-of-flight consistent with the speed of light
Event: A day for all those who teach physics
Conference: Students attend first Anglo-Japanese international science conference
Celebration: Will 2015 be the 'Year of Light'?
Teachers: Challenging our intuition in spectacular fashion: the fascinating world of quantum physics awaits
Research: Science sharpens up sport
Learning: Kittinger and Baumgartner: on a mission to the edge of space
International: London International Youth Science Forum calls for leading young scientists
Competition: Physics paralympian challenge needs inquisitive, analytical, artistic and eloquent pupils
Forthcoming events
NASA Astrophysics Data System (ADS)
Moebius, Franziska; Or, Dani
2014-05-01
The macroscopically smooth and regular motion of fluid fronts in porous media is composed of numerous rapid pore-scale interfacial jumps and pressure bursts that involve intense interfacial energy release in the form of acoustic emissions. The characteristics of these pore scale events affect residual phase entrapment and transport properties behind the front. We present experimental studies using acoustic emission technique (AE), rapid imaging, and liquid pressure measurements to characterize these processes during drainage and imbibition in simple porous media. Imbibition and drainage produce different AE signatures (AE amplitudes obey a power law). For rapid drainage, AE signals persist long after cessation of front motion, reflecting fluid redistribution and interfacial relaxation. Imaging revealed that the velocity of interfacial jumps often exceeds front velocity by more than 50 fold and has a highly inertial component (Re>1000). Pore invasion volumes deduced from pressure-fluctuation waiting times (for constant withdrawal rates) show remarkable agreement with geometrically-deduced pore volumes. Discrepancies between invaded volumes and geometrical pores increase with increasing capillary numbers due to constraints on evacuation opportunity times and simultaneous invasion events. A mechanistic model for interfacial motions in a pore-throat network was developed to investigate interfacial dynamics focusing on the role of inertia. Results suggest that while pore scale dynamics were sensitive to variations in pore geometry and boundary conditions, inertia exerted only a minor effect on phase entrapment. The study on pore scale invasion events paints a complex picture of rapid and inertial motions and provides new insights on mechanisms at displacement fronts that are essential for improved macroscopic description of multiphase flows in porous media.
Soil erosion under multiple time-varying rainfall events
NASA Astrophysics Data System (ADS)
Heng, B. C. Peter; Barry, D. Andrew; Jomaa, Seifeddine; Sander, Graham C.
2010-05-01
Soil erosion is a function of many factors and process interactions. An erosion event produces changes in surface soil properties such as texture and hydraulic conductivity. These changes in turn alter the erosion response to subsequent events. Laboratory-scale soil erosion studies have typically focused on single independent rainfall events with constant rainfall intensities. This study investigates the effect of multiple time-varying rainfall events on soil erosion using the EPFL erosion flume. The rainfall simulator comprises ten Veejet nozzles mounted on oscillating bars 3 m above a 6 m × 2 m flume. Spray from the nozzles is applied onto the soil surface in sweeps; rainfall intensity is thus controlled by varying the sweeping frequency. Freshly-prepared soil with a uniform slope was subjected to five rainfall events at daily intervals. In each 3-h event, rainfall intensity was ramped up linearly to a maximum of 60 mm/h and then stepped down to zero. Runoff samples were collected and analysed for particle size distribution (PSD) as well as total sediment concentration. We investigate whether there is a hysteretic relationship between sediment concentration and discharge within each event and how this relationship changes from event to event. Trends in the PSD of the eroded sediment are discussed and correlated with changes in sediment concentration. Close-up imagery of the soil surface following each event highlights changes in surface soil structure with time. This study enhances our understanding of erosion processes in the field, with corresponding implications for soil erosion modelling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmons, N. A.; Myers, S. C.; Johannesson, G.
We develop a global-scale P wave velocity model (LLNL-G3Dv3) designed to accurately predict seismic travel times at regional and teleseismic distances simultaneously. The model provides a new image of Earth's interior, but the underlying practical purpose of the model is to provide enhanced seismic event location capabilities. The LLNL-G3Dv3 model is based on ∼2.8 million P and Pn arrivals that are re-processed using our global multiple-event locator called Bayesloc. We construct LLNL-G3Dv3 within a spherical tessellation based framework, allowing for explicit representation of undulating and discontinuous layers including the crust and transition zone layers. Using a multiscale inversion technique, regional trends as well as fine details are captured where the data allow. LLNL-G3Dv3 exhibits large-scale structures including cratons and superplumes as well as numerous complex details in the upper mantle including within the transition zone. Particularly, the model reveals new details of a vast network of subducted slabs trapped within the transition zone beneath much of Eurasia, including beneath the Tibetan Plateau. We demonstrate the impact of Bayesloc multiple-event location on the resulting tomographic images through comparison with images produced without the benefit of multiple-event constraints (single-event locations). We find that the multiple-event locations allow for better reconciliation of the large set of direct P phases recorded at 0–97° distance and yield a smoother and more continuous image relative to the single-event locations. Travel times predicted from a 3-D model are also found to be strongly influenced by the initial locations of the input data, even when an iterative inversion/relocation technique is employed.
Networking at Conferences: Developing Your Professional Support System
ERIC Educational Resources Information Center
Kowalsky, Michelle
2012-01-01
The complexity and scale of any large library, education, or technology conference can sometimes be overwhelming. Therefore, spending time reviewing the conference program and perusing the workshop offerings in advance can help you stay organized and make the most of your time at the event. Planning in advance will help you manage potential time…
The Development of Temporal Metamemory
ERIC Educational Resources Information Center
Friedman, William J.
2007-01-01
In two studies of knowledge about the properties and processes of memory for the times of past events, 178 children from 5 through 13 years of age and 40 adults answered questions about how they would remember times on different scales, how temporal memory is affected by retention interval, and the usefulness of different methods. The adults…
USDA-ARS?s Scientific Manuscript database
A time-scale-free approach was developed for estimation of water fluxes at boundaries of monitoring soil profile using water content time series. The approach uses the soil water budget to compute soil water budget components, i.e. surface-water excess (Sw), infiltration less evapotranspiration (I-E...
Characterizing Relativistic Electrons Flux Enhancement Events using sensors onboard SAMPEX and POLAR
NASA Astrophysics Data System (ADS)
Kanekal, S. G.; Selesnick, R. S.; Baker, D. N.; Blake, J. B.
2004-12-01
Relativistic electron fluxes in the Earth's outer Van Allen belt are highly variable, with flux enhancements of several orders of magnitude occurring on time scales of a few days. Radiation belt electrons often are energized to relativistic energies when the magnetosphere is subjected to high solar wind speed and the southward turning of the interplanetary magnetic field. Characterization of electron acceleration properties such as electron spectra and flux isotropization is important in understanding acceleration models. We use sensors onboard SAMPEX and POLAR to systematically measure and survey these properties. SAMPEX measurements cover the entire outer zone for more than a decade from mid 1992 to mid 2004, and POLAR covers the time period from mid 1996 to the present. We use the pulse-height-analyzed data from the PET detector onboard SAMPEX to measure electron spectra. Fluxes measured by the HIST detector onboard POLAR together with the PET measurements are used to characterize isotropization times. This paper presents electron spectra and isotropization time scales for a few representative events. We will eventually extend these measurements and survey the entire solar cycle 23.
NASA Astrophysics Data System (ADS)
Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios
2016-12-01
The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
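The variability-aware comparison can be caricatured in a few lines: marginalize a Gaussian likelihood over an ensemble of simulation snapshots instead of scoring the data against one time-averaged image. This is schematic only; the real analysis works with interferometric observables, not pixel values:

```python
import numpy as np

def log_evidence(data, snapshots, sigma):
    # Treat each simulation snapshot as an equally likely realization of
    # the variable source and average the Gaussian likelihood over
    # snapshots (log-sum-exp for numerical stability).
    d = np.asarray(data, float).ravel()
    lls = []
    for m in snapshots:
        r = d - np.asarray(m, float).ravel()
        lls.append(-0.5 * np.sum((r / sigma) ** 2)
                   - r.size * np.log(sigma * np.sqrt(2.0 * np.pi)))
    return np.logaddexp.reduce(lls) - np.log(len(lls))
```

Comparing this marginalized evidence across model parameter sets penalizes models whose snapshot ensemble never resembles the data, without pretending the source is static.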
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan
2016-12-01
The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
NASA Astrophysics Data System (ADS)
Zheng, Dawei; Ding, Xiaoli; Zhou, Yonghong; Chen, Yongqi
2003-03-01
Time series of the length of day characterizing the rate of Earth rotation, the atmospheric angular momentum and the Southern Oscillation Index from 1962 to 2000 are used to reexamine the relationships between the ENSO events and the changes in the length of day, as well as the global atmospheric angular momentum. Particular attention is given to the different effects of the 1982-1983 and 1997-1998 ENSO events on the variations of Earth rotation. The combined effects of multiscale atmospheric oscillations (seasonal, quasi-biennial and ENSO time scales) on the anomalous variations of the interannual rates of Earth rotation are revealed in this paper by studying the wavelet spectra of the data series.
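A Morlet wavelet power spectrum of the kind used to separate seasonal, quasi-biennial and ENSO time scales can be sketched as follows (Torrence and Compo style; the normalization is schematic and this is not the authors' exact implementation):

```python
import numpy as np

def morlet_power(x, dt, periods, w0=6.0):
    # FFT-based continuous Morlet wavelet power of a series x sampled
    # every dt, evaluated at the requested periods. The analytic wavelet
    # keeps only positive frequencies.
    x = np.asarray(x, float)
    n = x.size
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    X = np.fft.fft(x - x.mean())
    power = np.empty((len(periods), n))
    for i, p in enumerate(periods):
        s = p * (w0 + np.sqrt(2.0 + w0 ** 2)) / (4.0 * np.pi)  # scale <-> period
        psi = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        w = np.fft.ifft(X * np.conj(psi) * np.sqrt(2.0 * np.pi * s / dt))
        power[i] = np.abs(w) ** 2
    return power
```

Applied to a length-of-day series, rows of the returned array show how power at each chosen period (e.g. 12, 26 and 48 months) waxes and wanes through ENSO events.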
Progress and challenges with Warn-on-Forecast
NASA Astrophysics Data System (ADS)
Stensrud, David J.; Wicker, Louis J.; Xue, Ming; Dawson, Daniel T.; Yussouf, Nusrat; Wheatley, Dustan M.; Thompson, Therese E.; Snook, Nathan A.; Smith, Travis M.; Schenkman, Alexander D.; Potvin, Corey K.; Mansell, Edward R.; Lei, Ting; Kuhlman, Kristin M.; Jung, Youngsun; Jones, Thomas A.; Gao, Jidong; Coniglio, Michael C.; Brooks, Harold E.; Brewster, Keith A.
2013-04-01
The current status and challenges associated with two aspects of Warn-on-Forecast (a National Oceanic and Atmospheric Administration research project exploring the use of a convective-scale ensemble analysis and forecast system to support hazardous weather warning operations) are outlined. These two project aspects are the production of a rapidly updating assimilation system to incorporate data from multiple radars into a single analysis, and the ability of short-range ensemble forecasts of hazardous convective weather events to provide guidance that could be used to extend warning lead times for tornadoes, hailstorms, damaging windstorms and flash floods. Results indicate that a three-dimensional variational assimilation system, which blends observations from multiple radars into a single analysis, shows utility when evaluated by forecasters in the Hazardous Weather Testbed and may help increase confidence in a warning decision. The ability of short-range convective-scale ensemble forecasts to provide guidance that could be used in warning operations is explored for five events: two tornadic supercell thunderstorms, a macroburst, a damaging windstorm and a flash flood. Results show that the ensemble forecasts of the three individual severe thunderstorm events are very good, while the forecasts from the damaging windstorm and flash flood events, associated with mesoscale convective systems, are mixed. Important interactions between mesoscale and convective-scale features occur for the mesoscale convective system events that strongly influence the quality of the convective-scale forecasts. The development of a successful Warn-on-Forecast system will take many years and require the collaborative efforts of researchers and operational forecasters to succeed.
Exploration of a High Luminosity 100 TeV Proton Antiproton Collider
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveros, Sandra J.; Summers, Don; Cremaldi, Lucien
New physics is being explored with the Large Hadron Collider at CERN and with Intensity Frontier programs at Fermilab and KEK. The energy scale for new physics is known to be in the multi-TeV range, signaling the need for a future collider which well surpasses this energy scale. We explore a $10^{34}$ cm$^{-2}$ s$^{-1}$ luminosity, 100 TeV $p\bar{p}$ collider with 7$\times$ the energy of the LHC but only 2$\times$ as much NbTi superconductor, motivating the choice of 4.5 T single bore dipoles. The cross section for many high mass states is 10 times higher in $p\bar{p}$ than $pp$ collisions. Antiquarks for production can come directly from an antiproton rather than indirectly from gluon splitting. The higher cross sections reduce the synchrotron radiation in superconducting magnets and the number of events per beam crossing, because lower beam currents can produce the same rare event rates. Events are more centrally produced, allowing a more compact detector with less space between quadrupole triplets and a smaller $\beta^{*}$ for higher luminosity. A Fermilab-like $\bar{p}$ source would disperse the beam into 12 momentum channels to capture more antiprotons. Because stochastic cooling time scales as the number of particles, 12 cooling ring sets would be used. Each set would include phase rotation to lower momentum spreads, equalize all momentum channels, and stochastically cool. One electron cooling ring would follow the stochastic cooling rings. Finally, antiprotons would be recycled during runs without leaving the collider ring by joining them to new bunches with synchrotron damping.
Ramírez, Alonso; Pringle, Catherine M.
2018-01-01
Understanding how environmental variables influence the distribution and density of organisms over relatively long temporal scales is a central question in ecology given increased climatic variability (e.g., precipitation, ENSO events). The primary goal of our study was to evaluate long-term (15y time span) patterns of climate, as well as environmental parameters in two Neotropical streams in lowland Costa Rica, to assess potential effects on aquatic macroinvertebrates. We also examined the relative effects of an 8y whole-stream P-enrichment experiment on macroinvertebrate assemblages against the backdrop of this long-term study. Climate, environmental variables and macroinvertebrate samples were measured monthly for 7y and then quarterly for an additional 8y in each stream. Temporal patterns in climatic and environmental variables showed high variability over time, without clear inter-annual or intra-annual patterns. Macroinvertebrate richness and abundance decreased with increasing discharge and was positively related to the number of days since the last high discharge event. Findings show that fluctuations in stream physicochemistry and macroinvertebrate assemblage structure are ultimately the result of large-scale climatic phenomena, such as ENSO events, while the 8y P-enrichment did not appear to affect macroinvertebrates. Our study demonstrates that Neotropical lowland streams are highly dynamic and not as stable as is commonly presumed, with high intra- and inter-annual variability in environmental parameters that change the structure and composition of freshwater macroinvertebrate assemblages. PMID:29420548
Long-term Variability of Beach Cusps
NASA Astrophysics Data System (ADS)
Pianca, C.; Holman, R. A.; Siegle, E.
2016-02-01
The most curious morphological features observed on beaches are cusps. Due to their rhythmic spacing, beach cusps have attracted many observers and many, often contradictory, theories as to their form. Moreover, most of the research about beach cusps has focused on their formation. Few studies have had long time series available to examine such things as the variability of alongshore and cross-shore position and spacing within the cusp field, the presence, longevity and interactions between higher and lower sets of cusps, and the processes by which cusp fields extend, shrink or change length scale. The purpose of this work is to use long-term data sets of video images from two study sites, an intermediate (Duck, USA, 26 years) and a reflective beach (Massaguaçu, Brazil, 3 years), to investigate the temporal and spatial changes in cusp conditions. Time-evolving shoreline data were first extracted using an algorithm called ASLIM (Pianca et al 2015). Cusps were then identified based on the band-passed variability of time exposure image data about this shoreline as a function of elevation relative to MSL. The identified beach cusps will be analyzed for cusp spacing, positions (upper or lower cusps), alongshore variability, merging events, percentage of cusp events, patterns of the events and time scales of variability. Finally, the relationship of these characteristics to environmental conditions (waves, tides, beach conditions) will be studied.
Input-output relationship in social communications characterized by spike train analysis
NASA Astrophysics Data System (ADS)
Aoki, Takaaki; Takaguchi, Taro; Kobayashi, Ryota; Lambiotte, Renaud
2016-10-01
We study the dynamical properties of human communication through different channels, i.e., short messages, phone calls, and emails, adopting techniques from neuronal spike train analysis in order to characterize the temporal fluctuations of successive interevent times. We first measure the so-called local variation (LV) of incoming and outgoing event sequences of users and find that these in- and out-LV values are positively correlated for short messages and uncorrelated for phone calls and emails. Second, we analyze the response-time distribution after receiving a message to focus on the input-output relationship in each of these channels. We find that the time scales and amplitudes of response differ between the three channels. To understand the effects of the response-time distribution on the correlations between the LV values, we develop a point process model whose activity rate is modulated by incoming and outgoing events. Numerical simulations of the model indicate that a quick response to incoming events and a refractory effect after outgoing events are key factors to reproduce the positive LV correlations.
NASA Technical Reports Server (NTRS)
Waight, Kenneth T., III; Zack, John W.; Karyampudi, V. Mohan
1989-01-01
Initial simulations of the June 28, 1986 Cooperative Huntsville Meteorological Experiment case illustrate the need for mesoscale moisture information in a summertime situation in which deep convection is organized by weak large-scale forcing. A methodology is presented for enhancing the initial moisture field from a combination of IR satellite imagery, surface-based cloud observations, and manually digitized radar data. The Mesoscale Atmospheric Simulation Model is utilized to simulate the events of June 28-29. This procedure ensures that areas known to have precipitation at the time of initialization will be nearly saturated on the grid scale, which should decrease the time needed by the model to produce the observed convection associated with Bonnie (a relatively weak hurricane that had moved onshore two days before). This method will also result in an initial distribution of model cloudiness (transmissivity) that is very similar to that of the IR satellite image.
Chance, necessity and the origins of life: a physical sciences perspective.
Hazen, Robert M
2017-12-28
Earth's 4.5-billion-year history has witnessed a complex sequence of high-probability chemical and physical processes, as well as 'frozen accidents'. Most models of life's origins similarly invoke a sequence of chemical reactions and molecular self-assemblies in which both necessity and chance play important roles. Recent research adds two important insights into this discussion. First, in the context of chemical reactions, chance versus necessity is an inherently false dichotomy; a range of probabilities exists for many natural events. Second, given the combinatorial richness of early Earth's chemical and physical environments, events in molecular evolution that are unlikely at limited laboratory scales of space and time may, nevertheless, be inevitable on an Earth-like planet at time scales of a billion years. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).
Critical Thresholds in Earth-System Dynamics
NASA Astrophysics Data System (ADS)
Rothman, D.
2017-12-01
The history of the Earth system is a story of change. Some changes are gradual and benign, but others, especially those associated with catastrophic mass extinction, are relatively abrupt and destructive. What sets one group apart from the other? Here I hypothesize that perturbations of Earth's carbon cycle lead to mass extinction if they exceed either a critical rate at long time scales or a critical size at short time scales. By analyzing 31 carbon-isotopic events during the last 542 million years, I identify the critical rate with a limit imposed by mass conservation. Further analysis identifies the crossover timescale separating fast from slow events with the timescale of the ocean's homeostatic response to a change in pH. The product of the critical rate and the crossover timescale then yields the critical size. The modern critical size for the marine carbon cycle is roughly similar to the mass of carbon that human activities will likely have added to the oceans by the year 2100.
NASA Astrophysics Data System (ADS)
Longobardi, Antonia; Diodato, Nazzareno; Mobilia, Mirka
2017-04-01
Extreme precipitation events are frequently associated with natural disasters falling within the broad spectrum of multiple damaging hydrological events (MDHEs), defined as the simultaneous triggering of different types of phenomena, such as landslides and floods. The power of rainfall (duration, magnitude, intensity), termed storm erosivity, is an important environmental indicator of multiple damaging hydrological phenomena. At the global scale, research interest is currently devoted to the investigation of non-stationary features of extreme events, and consequently of MDHEs, which appear to be increasing in frequency and severity. The Mediterranean basin appears among the most vulnerable regions, with an expected increase in damages of about 100% by the end of the century. A high concentration of high-magnitude and short-duration rainfall events is, in fact, responsible for the largest rainfall erosivity and erosivity density values within Europe. The aim of the reported work is to investigate the relationship between the temporal evolution of severe geomorphological events and combined precipitation indices as a tool to improve understanding of the hydro-geological hazard at the catchment scale. The case study is the Solofrana river basin, Southern Italy, which has been repeatedly and seriously affected by natural disasters. Data for about 45 MDH events, spanning the period 1951-2014, have been collected and analyzed for this purpose. A preliminary monthly-scale analysis of event occurrences highlights a pronounced seasonal characterization of the phenomenon, as about 60% of the total number of reported events took place during the period from September to November. Furthermore, a statistical analysis clearly indicates a significant increase in the frequency of occurrence of MDHEs during the last decades.
Such an increase appears to be related to non-stationary features of an average catchment-scale rainfall-runoff erosivity index, which combines maximum monthly, maximum daily, and a proxy of maximum hourly precipitation data. The main findings of the reported study are that evolving climate tendencies do not appear significant in most cases and that MDHEs occurred within the studied catchment even for rainfall events of very moderate intensity and/or severity. The illustrated results seem to indicate that climate variability has not played the main role in the large number of damaging events, and that the relative increase in hazardous hydro-geological events in the last decade is instead most likely caused by incorrect urban planning policies.
NASA Astrophysics Data System (ADS)
Giacco, Biagio; Hajdas, Irka; Isaia, Roberto; Deino, Alan; Nomade, Sebastien
2017-04-01
The Campanian Ignimbrite (CI) super-eruption (~40 ka, Southern Italy) is the largest known volcanic event of the Mediterranean area. The CI tephra is widely dispersed through western Eurasia and occurs in close stratigraphic association with significant Late Pleistocene paleoclimatic and Paleolithic cultural events. This makes the CI tephra one of the most important tools for investigating scientific issues ranging from volcanology and paleoclimatology to archaeology. Yet despite concerted attempts, the absolute age of the CI eruption is not well constrained. Here we present the first direct radiocarbon age for the CI obtained using accepted modern practices, from multiple 14C analyses of an exceptionally large charred tree branch embedded in the lithified Yellow Tuff facies of the CI pyroclastic flow deposits, as well as new high-precision 40Ar/39Ar dating for the CI. These data substantially improve upon previous age determinations and permit fuller exploitation of the chronological potential of the CI tephra marker. Specifically, the results of our study are twofold: they provide (i) a robust pair of 14C and 40Ar/39Ar ages for refining both the radiocarbon calibration curve and the Late Pleistocene time-scale in the narrow but significant time-span across the CI event, and (ii) compelling chronological evidence for the significance of the combined influence of the CI eruption and Heinrich Event 4 on European climate and potentially on evolutionary processes of the Early Upper Palaeolithic.
Characteristics of EUV Coronal Jets Observed with STEREO/SECCHI
NASA Astrophysics Data System (ADS)
Nisticò, G.; Bothmer, V.; Patsourakos, S.; Zimbardo, G.
2009-10-01
In this paper we present the first comprehensive statistical study of EUV coronal jets observed with the SECCHI (Sun Earth Connection Coronal and Heliospheric Investigation) imaging suites of the two STEREO spacecraft. A catalogue of 79 polar jets is presented, identified from simultaneous EUV and white-light coronagraph observations taken during the time period March 2007 to April 2008, when solar activity was at a minimum. The twin spacecraft angular separation increased during this time interval from 2 to 48 degrees. The appearances of the coronal jets were always correlated with underlying small-scale chromospheric bright points. A basic characterization of the morphology and identification of the presence of helical structure were established with respect to recently proposed models for their origin and temporal evolution. Though each jet appeared morphologically similar in the coronagraph field of view, in the sense of a narrow collimated outward flow of matter, at the source region in the low corona the jets showed different characteristics, which may correspond to different magnetic structures. A classification of the events with respect to previous jet studies shows that amongst the 79 events there were 37 Eiffel tower-type jet events, commonly interpreted as a small-scale (~35 arc sec) magnetic bipole reconnecting with the ambient unipolar open coronal magnetic fields at its loop tops, and 12 lambda-type jet events, commonly interpreted as reconnection with the ambient field happening at the bipole footpoints. Five events were termed micro-CME-type jet events because they resembled classical coronal mass ejections (CMEs) but on much smaller scales. The remaining 25 cases could not be uniquely classified. Thirty-one of the total number of events exhibited a helical magnetic field structure, indicative of a torsional motion of the jet around its axis of propagation. A few jets were also found in equatorial coronal holes.
In this study we present sample events for each of the jet types using both STEREO A and STEREO B perspectives. The typical lifetimes in the SECCHI/EUVI (Extreme UltraViolet Imager) field of view between 1.0 and 1.7 R⊙ and in the SECCHI/COR1 field of view between 1.4 and 4 R⊙ are obtained, and the corresponding speeds are roughly estimated. In summary, the observations support the assumption of continuous small-scale reconnection as an intrinsic feature of the solar corona, with its role in the heating of the corona, particle acceleration, and the structuring and acceleration of the solar wind remaining to be explored in more detail in further studies.
NASA Astrophysics Data System (ADS)
Gauduel, Y. A.
2017-05-01
A major challenge of spatio-temporal radiation biomedicine concerns the understanding of biophysical events triggered by an initial energy deposition inside confined ionization tracks. This contribution deals with an interdisciplinary approach concerning cutting-edge advances in real-time radiation events, considering the potential of innovative strategies based on ultrafast laser science, from femtosecond photon sources to advanced techniques of ultrafast TW laser-plasma accelerators. Recent advances in powerful TW laser sources (~10¹⁹ W cm⁻²) and laser-plasma interactions providing ultra-short relativistic particle beams in the 5-200 MeV energy domain open promising opportunities for the development of high energy radiation femtochemistry (HERF) in the prethermal regime of secondary low-energy electrons and for the real-time imaging of radiation-induced biomolecular alterations at the nanoscopic scale. New developments would permit the correlation of early radiation events triggered by ultrashort radiation sources with a molecular approach to Relative Biological Effectiveness (RBE). These emerging research developments are crucial to understanding simultaneously, at sub-picosecond and nanometric scales, the early consequences of ultra-short-pulsed radiation on biomolecular environments or integrated biological entities. This innovative approach could be applied to biomedically relevant concepts such as the emerging domain of real-time nanodosimetry for targeted pro-drug activation and pulsed radio-chemotherapy of cancers.
Atmospheric Diabatic Heating in Different Weather States and the General Circulation
NASA Technical Reports Server (NTRS)
Rossow, William B.; Zhang, Yuanchong; Tselioudis, George
2016-01-01
Analysis of multiple global satellite products identifies distinctive weather states of the atmosphere from the mesoscale pattern of cloud properties and quantifies the associated diabatic heating/cooling by radiative flux divergence, precipitation, and surface sensible heat flux. The results show that the forcing for the atmospheric general circulation is a very dynamic process, varying strongly at weather space-time scales, comprising relatively infrequent, strong heating events by "stormy" weather and more nearly continuous, weak cooling by "fair" weather. Such behavior undercuts the value of analyses of time-averaged energy exchanges in observations or numerical models. It is proposed that an analysis of the joint time-related variations of the global weather states and the general circulation on weather space-time scales might be used to establish useful "feedback-like" relationships between cloud processes and the large-scale circulation.
On simulating large earthquakes by Green's-function addition of smaller earthquakes
NASA Astrophysics Data System (ADS)
Joyner, William B.; Boore, David M.
Simulation of ground motion from large earthquakes has been attempted by a number of authors using small earthquakes (subevents) as Green's functions and summing them, generally in a random way. We present a simple model for the random summation of subevents to illustrate how seismic scaling relations can be used to constrain methods of summation. In the model, η identical subevents are added together with their start times randomly distributed over the source duration T and their waveforms scaled by a factor κ. The subevents can be considered to be distributed on a fault with later start times at progressively greater distances from the focus, simulating the irregular propagation of a coherent rupture front. For simplicity the distance between source and observer is assumed large compared to the source dimensions of the simulated event. By proper choice of η and κ the spectrum of the simulated event deduced from these assumptions can be made to conform at both low- and high-frequency limits to any arbitrary seismic scaling law. For the ω-squared model with similarity (that is, with constant Moƒo^3 scaling, where ƒo is the corner frequency), the required values are η = (Mo/Moe)^(4/3) and κ = (Mo/Moe)^(-1/3), where Mo is the moment of the simulated event and Moe is the moment of the subevent. The spectra resulting from other choices of η and κ will not conform at both high and low frequency. If η is determined by the ratio of the rupture area of the simulated event to that of the subevent and κ = 1, the simulated spectrum will conform at high frequency to the ω-squared model with similarity, but not at low frequency. Because the high-frequency part of the spectrum is generally the important part for engineering applications, however, this choice of values for η and κ may be satisfactory in many cases.
If η is determined by the ratio of the moment of the simulated event to that of the subevent and κ = 1, the simulated spectrum will conform at low frequency to the ω-squared model with similarity, but not at high frequency. Interestingly, the high-frequency scaling implied by this latter choice of η and κ corresponds to an ω-squared model with constant Moƒo^4, a scaling law proposed by Nuttli, although questioned recently by Haar and others. Simple scaling with κ equal to unity and η equal to the moment ratio would work if the high-frequency spectral decay were ω^(-1.5) instead of ω^(-2). Just the required decay is exhibited by the stochastic source model recently proposed by Joyner, if the dislocation-time function is deconvolved out of the spectrum. Simulated motions derived from such source models could be used as subevents rather than recorded motions, as is usually done. This strategy is a promising approach to the simulation of ground motion from an extended rupture.
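The summation parameters for the ω-squared similarity model can be sketched numerically; the check below is illustrative, not the authors' code. For a moment ratio Mo/Moe, the coherent (low-frequency) sum of η scaled subevents grows as ηκ, matching the moment ratio, while the incoherent (high-frequency) sum grows as sqrt(η)·κ, matching the (Mo/Moe)^(1/3) scaling of high-frequency spectral amplitude:

```python
def subevent_scaling(moment_ratio):
    """Random-summation parameters for eta identical subevents scaled
    by kappa, chosen to match the omega-squared similarity model
    (constant Mo*fo^3)."""
    eta = moment_ratio ** (4.0 / 3.0)     # number of subevents
    kappa = moment_ratio ** (-1.0 / 3.0)  # amplitude scale factor
    return eta, kappa

eta, kappa = subevent_scaling(1000.0)
# Coherent (low-frequency) sum: amplitudes add in phase -> eta * kappa,
# which reproduces the moment ratio itself.
low_f_gain = eta * kappa
# Incoherent (high-frequency) sum: random phases -> sqrt(eta) * kappa,
# which reproduces the (Mo/Moe)**(1/3) corner-frequency scaling.
high_f_gain = (eta ** 0.5) * kappa
```

For a moment ratio of 1000 this gives η = 10^4 subevents, κ = 0.1, a low-frequency gain of 1000, and a high-frequency gain of 10, consistent with the limits stated above.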
NASA Astrophysics Data System (ADS)
Turrin, B. D.; Turrin, M.
2012-12-01
After "What is this rock?", the most common question asked of geologists is "How old is this rock/fossil?" For geologists, considering ages of millions of years is routine. Sorting and cataloguing events into temporal sequences is a natural tendency for all humans; in fact, it is an everyday activity, i.e., keeping track of birthdays, anniversaries, appointments, meetings, AGU abstract deadlines, etc. However, the time frames most familiar to the non-scientist (seconds, minutes, hours, days, years) generally extend to only a few decades or at most centuries. Yet the vast length of time covered by Earth's history, 4.56 billion years, greatly exceeds these time frames and thus is commonly referred to as "Deep Time". This is a challenging concept for most students to comprehend, as it involves temporal and abstract thinking, yet it is key to their successful understanding of numerous geologic principles. We have developed an outdoor learning activity for general introductory Earth Science courses that incorporates several scientific and geologic concepts: linear distance or stratigraphic thickness representing time, learning about major events in Earth's history and locating them in a scaled temporal framework, field mapping, abstract thinking, scaling and dimensional analysis, and the principles of radio-isotopic dating. The only supplies needed are readily available in local hardware stores, i.e., a 300 ft surveyor's tape marked in feet and in tenths and hundredths of a foot, and the students' own introductory geology textbooks. The exercise employs a variety of pedagogical learning modalities, including traditional lecture, art/drawing, visualization, collaborative learning, and kinesthetic and experiential learning. Initially the students are exposed to the concept of "Deep Time" in a short conventional introductory lecture; this is followed by a 'field day'.
Prior to the field exercise, students work with their textbooks, selecting events in Earth history that they find interesting. Using the textbook and online resources, they then draw figures that represent these events. The drawing exercise reinforces the learning by having students visualize (imprint an image of) these geologic events. Once the students have produced their drawings, the outdoor field exercise follows. Working collaboratively, the students measure and lay out a scaled linear model representing 4.56 billion years of geologic time. They then organize and place their drawings in the proper sequence on the temporal model that they have created. Once all the drawings are in place, they are able to visualize the expanse of time in Earth's history. By comparing results from a pre-test to those from a post-test, we can show the gains in student understanding of Deep Time, a concept that is central to many geologic understandings.
NASA Technical Reports Server (NTRS)
Swanson, E. R.; Kugel, C. P.
1972-01-01
The report specifically discusses time dissemination techniques, including epoch determination, frequency determination, and ambiguity resolution. It also discusses operational considerations, including equipment, path selection, and adjustment procedures. Epoch (the actual location or timing of periodic events) is shown to be both maintainable and calibratable by the techniques described to better than 3-microsecond accuracy, and frequency (the uniformity of the time scale) to about 1 part in 10^12.
NASA Astrophysics Data System (ADS)
Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen
2017-04-01
Extreme rainfall events and resulting flash floods led to massive devastation in Germany during spring 2016. The study presented aims at the development of an early warning system that allows the simulation and assessment of negative effects on infrastructure from radar-based heavy rainfall predictions, which serve as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agriculturally used landscapes. In a first step, documented historical events were analyzed concerning the agreement between measured radar rainfall and large-scale erosion risk maps. A second step focused on small-scale erosion monitoring via UAV of the source areas of heavy flooding events and a model reconstruction of the processes involved. In all examples, damage was caused to local infrastructure. Both analyses are promising for detecting runoff- and sediment-delivering areas at high temporal and spatial resolution. Results prove the important role of late-covering crops such as maize, sugar beet, or potatoes in runoff generation. While winter wheat, for example, has a favorable effect on extensive runoff generation on undulating landscapes, massive soil loss and thus muddy flows are observed and depicted in model results. Future research aims at large-scale model parameterization and application in real time, uncertainty estimation of precipitation forecasts, and interface development.
Catastrophic ice lake collapse in Aram Chaos, Mars
NASA Astrophysics Data System (ADS)
Roda, Manuel; Kleinhans, Maarten G.; Zegers, Tanja E.; Oosthoek, Jelmer H. P.
2014-07-01
Hesperian chaotic terrains have been recognized as the source of outflow channels formed by catastrophic outflows. Four main scenarios have been proposed for the formation of chaotic terrains that involve different amounts of water and single or multiple outflow events. Here, we test these scenarios with morphological and structural analyses of imagery and elevation data for Aram Chaos in conjunction with numerical modeling of the morphological evolution of the catastrophic carving of the outflow valley. The morphological and geological analyses of Aram Chaos suggest large-scale collapse and subsidence (1500 m) of the entire area, which is consistent with a massive expulsion of liquid water from the subsurface in one single event. The combined observations suggest a complex process starting with the outflow of water from two small channels, followed by continuous groundwater sapping and headward erosion and ending with a catastrophic lake rim collapse and carving of the Aram Valley, which is synchronous with the 2.5 Ga stage of the Ares Vallis formation. The water volume and formative time scale required to carve the Aram channels indicate that a single, rapid (maximum tens of days) and catastrophic (flood volume of 9.3 × 104 km3) event carved the outflow channel. We conclude that a sub-ice lake collapse model can best explain the features of the Aram Chaos Valley system as well as the time scale required for its formation.
Maguire, Elizabeth M; Bokhour, Barbara G; Wagner, Todd H; Asch, Steven M; Gifford, Allen L; Gallagher, Thomas H; Durfee, Janet M; Martinello, Richard A; Elwy, A Rani
2016-11-11
Many healthcare organizations have developed disclosure policies for large-scale adverse events, including the Veterans Health Administration (VA). This study evaluated VA's national large-scale disclosure policy and identified gaps and successes in its implementation. Semi-structured qualitative interviews were conducted with leaders, hospital employees, and patients at nine sites to elicit their perceptions of recent large-scale adverse event notifications and the national disclosure policy. Data were coded using the constructs of the Consolidated Framework for Implementation Research (CFIR). We conducted 97 interviews. Insights included how to handle the communication of large-scale disclosures through multiple levels of a large healthcare organization and how to manage ongoing communications about the event with employees. Of the 5 CFIR constructs and 26 sub-constructs assessed, seven were prominent in interviews. Leaders and employees specifically mentioned key problem areas involving 1) networks and communications during disclosure, 2) organizational culture, 3) engagement of external change agents during disclosure, and 4) a need for reflecting on and evaluating the policy implementation and the disclosure itself. Patients shared 5) preferences for personal outreach by phone in place of the current use of certified letters. All interviewees discussed 6) issues with execution and 7) costs of the disclosure. CFIR analysis reveals key problem areas that need to be addressed during disclosure, including timely communication patterns throughout the organization, establishing a supportive culture prior to implementation, using patient-approved, effective communication strategies during disclosures, providing follow-up support for employees and patients, and sharing lessons learned.
Zhang, Mi; Wen, Xue Fa; Zhang, Lei Ming; Wang, Hui Min; Guo, Yi Wen; Yu, Gui Rui
2018-02-01
Extreme high temperature is one of the important extreme weather events that impact the forest ecosystem carbon cycle. In this study, applying CO2 flux and routine meteorological data measured during 2003-2012, we examined the impacts of extreme high temperature and extreme high temperature events on the net carbon uptake of a subtropical coniferous plantation in Qianyanzhou. Combining this with wavelet analysis, we analyzed environmental controls on net carbon uptake at different temporal scales when extreme high temperature and extreme high temperature events occurred. The results showed that mean daily cumulative NEE decreased by 51% on days with daily maximum air temperature between 35 ℃ and 40 ℃, compared with days in the 30-34 ℃ range. The effects of extreme high temperature and extreme high temperature events on monthly and annual NEE were related to the strength and duration of the event. In 2003, when a strong extreme high temperature event occurred, the sum of monthly cumulative NEE in July and August was only -11.64 g C·m⁻²·(2 months)⁻¹, a decrease of 90% compared with the multi-year average value; at the same time, the relative variation of annual NEE reached -6.7%. In July and August, when extreme high temperature and extreme high temperature events occurred, air temperature (Ta) and vapor pressure deficit (VPD) were the dominant controls on the daily variation of NEE; the wavelet coherency between NEE and Ta and between NEE and VPD was 0.97 and 0.95, respectively. At 8-, 16-, and 32-day periods, Ta, VPD, soil water content at 5 cm depth (SWC), and precipitation (P) controlled NEE; the coherency between NEE and SWC and between NEE and P was higher than 0.8 at the monthly scale. The results indicated that atmospheric water deficit impacted NEE at short temporal scales when extreme high temperature and extreme high temperature events occurred, while both atmospheric water deficit and soil drought stress impacted NEE at long temporal scales in this ecosystem.
Event-driven processing for hardware-efficient neural spike sorting
NASA Astrophysics Data System (ADS)
Liu, Yan; Pereira, João L.; Constandinou, Timothy G.
2018-02-01
Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements, and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide an efficient means of hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. We observe that considerably lower data rates are achievable when using 7 bits or fewer to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst requiring relatively few hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
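A minimal sketch of the level-crossing sampling idea (illustrative only; the paper's hardware implementation differs): the signal is represented purely by ±1 events emitted whenever it moves one quantization step δ away from the last emitted level, and a staircase reconstruction from those events is accurate to within δ:

```python
import numpy as np

def level_crossing_encode(sig, delta):
    """Emit (index, +1/-1) events whenever the signal moves one
    quantization step delta away from the last emitted level."""
    events, last = [], sig[0]
    for i, v in enumerate(sig):
        while v - last >= delta:
            last += delta
            events.append((i, +1))
        while last - v >= delta:
            last -= delta
            events.append((i, -1))
    return events

def level_crossing_decode(events, n, start, delta):
    """Staircase reconstruction: accumulate the signed steps."""
    steps = np.zeros(n)
    for i, s in events:
        steps[i] += s
    return start + delta * np.cumsum(steps)

sig = np.sin(np.linspace(0.0, 6.0 * np.pi, 2000))
events = level_crossing_encode(sig, 0.05)
recon = level_crossing_decode(events, sig.size, sig[0], 0.05)
# Reconstruction error is bounded by one step; the event count tracks
# the signal's total variation rather than a fixed sampling clock.
```

This is what makes the representation activity-dependent: quiescent stretches of the signal generate no events at all, so data rate and downstream computation scale with neural activity rather than with recording time.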
Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents.
Wheatley, Spencer; Sovacool, Benjamin; Sornette, Didier
2017-01-01
We perform a statistical study of risk in nuclear energy systems. This study provides and analyzes a data set that is twice the size of the previous best data set on nuclear incidents and accidents, comparing three measures of severity: the industry-standard International Nuclear Event Scale, the Nuclear Accident Magnitude Scale of radiation release, and cost in U.S. dollars. The rate of nuclear accidents with cost above 20 million 2013 USD, per reactor per year, has decreased from the 1970s until the present time. Along the way, the rate dropped significantly after Chernobyl (April 1986) and is expected to be roughly stable around a level of 0.003, suggesting an average of just over one event per year across the current global fleet. The distribution of costs appears to have changed following the Three Mile Island major accident (March 1979). The median cost became approximately 3.5 times smaller, but an extremely heavy tail emerged, well described by a Pareto distribution with parameter α = 0.5-0.6. For instance, the cost of the two largest events, Chernobyl and Fukushima (March 2011), is equal to nearly five times the sum of the 173 other events. We also document a significant runaway disaster regime in both radiation release and cost data, which we associate with the "dragon-king" phenomenon. Since the major accident at Fukushima (March 2011) occurred recently, we are unable to quantify the impact of the industry's response to this disaster. Excluding such improvements, in terms of costs, our range of models suggests that there is presently a 50% chance that (i) a Fukushima event (or larger) occurs every 60-150 years, and (ii) a Three Mile Island event (or larger) occurs every 10-20 years. Further, even assuming that it is no longer possible to suffer an event more costly than Chernobyl or Fukushima, the expected annual cost and its standard error bracket the cost of a new plant.
This highlights the importance not only of improvements immediately following Fukushima, but also of deeper improvements to effectively exclude the possibility of "dragon-king" disasters. Finally, we find that the International Nuclear Event Scale (INES) is inconsistent in terms of both cost and radiation released. To be consistent with cost data, the Chernobyl and Fukushima disasters would need an INES level of between 10 and 11, rather than the maximum of 7. © 2016 The Authors. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
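The heavy-tail behavior described above can be illustrated with a toy inverse-transform simulation of a Pareto cost distribution (the tail exponent and event count echo the values reported, but the sample itself is synthetic and not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha = 0.55   # tail exponent in the reported 0.5-0.6 range
n = 175        # same order as the number of events in the study
x_min = 1.0    # arbitrary cost unit

# Inverse-transform sampling from the Pareto survival function
# S(x) = (x_min / x) ** alpha: if U ~ Uniform(0, 1), then
# x = x_min * U ** (-1 / alpha) is Pareto-distributed.
costs = x_min * rng.uniform(size=n) ** (-1.0 / alpha)

top2 = np.sort(costs)[-2:].sum()
rest = costs.sum() - top2
ratio = top2 / rest  # with alpha < 1 a handful of events dominate the total
```

Because α < 1 implies an infinite theoretical mean, the ratio of the two largest costs to the remainder is typically large and highly variable between runs, which is precisely why total-cost estimates for such a process are dominated by the rare "dragon-king" events.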
Short time-scale optical pulsations in the night sky background
NASA Technical Reports Server (NTRS)
Bertsch, D. L.; Fisher, A.; Ogelman, H.
1971-01-01
A network of monitoring stations designed to detect large-scale fluorescence emission in the atmosphere has been in operation for over two years. The motivation for the search arises from the prediction that an energetic photon burst would be produced in a supernova and that this burst, when absorbed in the atmosphere, would produce fluorescence. This paper reports on observations up to February 1971. No supernova-like events have been found, although 4.4 were expected. One class of non-fluorescence events is described that the evidence suggests is related to electrical discharge in the atmosphere. Another type of non-fluorescence pulse appears to be related to particle precipitation in the atmosphere.
Horizontal and vertical structure of reactive bromine events probed by bromine monoxide MAX-DOAS
NASA Astrophysics Data System (ADS)
Simpson, William R.; Peterson, Peter K.; Frieß, Udo; Sihler, Holger; Lampel, Johannes; Platt, Ulrich; Moore, Chris; Pratt, Kerri; Shepson, Paul; Halfacre, John; Nghiem, Son V.
2017-08-01
Heterogeneous photochemistry converts bromide (Br-) to reactive bromine species (Br atoms and bromine monoxide, BrO) that dominate Arctic springtime chemistry. This phenomenon has many impacts, such as boundary-layer ozone depletion, mercury oxidation and deposition, and modification of the fate of hydrocarbon species. To study environmental controls on reactive bromine events, the BRomine, Ozone, and Mercury EXperiment (BROMEX) was carried out from early March to mid-April 2012 near Barrow (Utqiaġvik), Alaska. We measured horizontal and vertical gradients in BrO with multiple-axis differential optical absorption spectroscopy (MAX-DOAS) instrumentation at three sites, two mobile and one fixed. During the campaign, a large crack in the sea ice (an open lead) formed, pushing one instrument package ~250 km downwind from Barrow (Utqiaġvik). Convection associated with the open lead converted the BrO vertical structure from a surface-based event to a lofted event downwind of the lead influence. The column abundance of BrO downwind of the re-freezing lead was comparable to upwind amounts, indicating that direct reactions on frost flowers or open seawater were not a major reactive bromine source. When the three sites were separated by ~30 km length scales of unbroken sea ice, the BrO amounts and vertical distributions were highly correlated most of the time, indicating that the horizontal length scales of BrO events were typically larger than ~30 km in the absence of sea ice features. Although BrO amount and vertical distribution were similar between sites most of the time, rapid changes in BrO, with edges significantly smaller than this ~30 km length scale, episodically propagated between the sites, indicating that BrO events were large but had sharp edge contrasts. BrO was often found in shallow layers that recycled reactive bromine via heterogeneous reactions on snowpack.
Episodically, these surface-based events propagated aloft when aerosol extinction was higher (> 0.1 km-1); however, the presence of aerosol particles aloft was not sufficient to produce BrO aloft. Highly depleted ozone (< 1 nmol mol-1) repartitioned reactive bromine away from BrO and in some cases drove BrO events aloft. This work demonstrates the interplay between atmospheric mixing and heterogeneous chemistry that affects the vertical structure and horizontal extent of reactive bromine events.
Roman, C.T.; Peck, J.A.; Allen, J.R.; King, J.W.; Appleby, P.G.
1997-01-01
Sediment accumulation rates were determined at several sites throughout Nauset Marsh (Massachusetts, U.S.A.), a back-barrier lagoonal system, using feldspar marker horizons to evaluate short-term rates (1 to 2 year scales) and radiometric techniques (137Cs, 210Pb, 14C) to estimate rates over longer time scales. The barrier spit fronting the Spartina-dominated study site has a complex geomorphic history of inlet migration and overwash events. This study evaluates sediment accumulation rates in relation to inlet migration, storm events, and sea-level rise. The marker horizon technique displayed strong temporal and spatial variability in response to storm events and proximity to the inlet. Sediment accumulation rates of up to 24 mm year-1 were recorded in the immediate vicinity of the inlet during a period that included several major coastal storms, while feldspar sites remote from the inlet had substantially lower rates (trace accumulation to 2.2 mm year-1). During storm-free periods, accumulation rates did not exceed 6.7 mm year-1, but remained quite variable among sites. Based on 137Cs (3.8 to 4.5 mm year-1) and 210Pb (2.6 to 4.2 mm year-1) radiometric techniques, which integrate sediment accumulation over decadal time scales, the marsh appeared to be keeping pace with the relative rate of sea-level rise from 1921 to 1993 of 2.4 mm year-1. At one site, the 210Pb-based sedimentation rate and the rate of relative sea-level rise were nearly equal, and peat rhizome analysis revealed that Distichlis spicata recently replaced this once S. patens site, suggesting that this portion of Nauset Marsh may be getting wetter, thus representing an initial response to wetland submergence. Horizon markers are useful in evaluating the role of short-term events, such as storms or inlet migration, in marsh sedimentation processes.
However, sampling methods that integrate marsh sedimentation over decadal time scales are preferable when evaluating a system's response to sea-level rise.
Scaling relation between earthquake magnitude and the departure time from P wave similar growth
Noda, Shunta; Ellsworth, William L.
2016-01-01
We introduce a new scaling relation between earthquake magnitude (M) and a characteristic of the initial P wave displacement. By examining Japanese K-NET data averaged in bins partitioned by Mw and hypocentral distance, we demonstrate that the P wave displacement briefly displays similar growth at the onset of rupture and that the departure time (Tdp), defined as the time of departure from similarity of the absolute displacement after applying a band-pass filter, correlates with the final M in the range 4.5 ≤ Mw ≤ 7. The scaling relation between Mw and Tdp implies that useful information on the final M can be derived while the event is still in progress, because Tdp occurs before the completion of rupture. We conclude that the scaling relation is important not only for earthquake early warning but also for the source physics of earthquakes.
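A scaling relation of this kind is conventionally fit in log-linear form, log10(Tdp) = a·Mw + b. The sketch below fits such a line by least squares; the (Mw, Tdp) pairs and resulting coefficients are invented for illustration — the abstract reports only that Tdp correlates with Mw over 4.5 ≤ Mw ≤ 7, not specific values:

```python
import math

# Hypothetical data illustrating a log-linear scaling fit:
# log10(Tdp) = a * Mw + b. Numbers are invented for the sketch.
data = [(4.5, 0.6), (5.0, 0.9), (5.5, 1.4), (6.0, 2.2), (6.5, 3.3), (7.0, 5.0)]  # (Mw, Tdp in s)

xs = [mw for mw, _ in data]
ys = [math.log10(t) for _, t in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
a = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
b = ybar - a * xbar
# Predict Tdp for an Mw 6.2 event under this fitted (illustrative) relation
tdp_pred = 10 ** (a * 6.2 + b)
print(a, b, tdp_pred)
```

The practical point for early warning is the inversion: once a and b are calibrated, an observed Tdp yields an estimate of the final Mw before rupture completes.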
Exploring Low-Amplitude, Long-Duration Deformational Transients on the Cascadia Subduction Zone
NASA Astrophysics Data System (ADS)
Nuyen, C.; Schmidt, D. A.
2017-12-01
The absence of long-term slow slip events (SSEs) in Cascadia is enigmatic, given the diverse group of subduction zone systems that do experience long-term SSEs. In particular, long-term SSEs have been observed in southwest Japan, Alaska, New Zealand, and Mexico, with some of the larger events exhibiting centimeter-scale surface displacements over the course of multiple years. The conditions that encourage long-term slow slip are not well established, owing to the variability in thermal parameter and plate dip among the subduction zones that host long-term events. The Cascadia Subduction Zone likely has the capacity to host long-term SSEs, and the lack of such events motivates further exploration of the observational data. In order to search for long-duration transients in surface displacements, we examine Cascadia GPS time series from PANGA and PBO to determine whether or not Cascadia has hosted a long-term slow slip event in the past 20 years. A careful review of the time series does not reveal any large-scale multi-year transients. In order to more clearly recognize possible small-amplitude long-term SSEs in Cascadia, the GPS time series are reduced with two separate methods. The first method involves manually removing (1) continental water loading terms, (2) transient displacements of known short-term SSEs, and (3) common mode signals that span the network. The second method utilizes a seasonal-trend decomposition procedure (STL) to extract a long-term trend from the GPS time series. Manual inspection of both of these products reveals intriguing long-term changes in the longitudinal component of several GPS stations in central Cascadia. To determine whether these shifts could be due to long-term slow slip, we invert the reduced surface displacement time series for fault slip using a principal component analysis-based inversion method.
We also utilize forward fault models of various synthetic long-term SSEs to better understand how these events may appear in the time series for a range of magnitudes and durations. Results from this research have direct implications for the possible slip modes in Cascadia and how variations in slip over time can impact stress and strain accumulations along the margin.
Stick-slip behavior in a continuum-granular experiment.
Geller, Drew A; Ecke, Robert E; Dahmen, Karin A; Backhaus, Scott
2015-12-01
We report moment distribution results from a laboratory experiment, similar in character to an isolated strike-slip earthquake fault, consisting of sheared elastic plates separated by a narrow gap filled with a two-dimensional granular medium. Local measurement of strain displacements of the plates at 203 spatial points located adjacent to the gap allows direct determination of the event moments and their spatial and temporal distributions. We show that events consist of spatially coherent, larger motions and spatially extended (noncoherent), smaller events. The noncoherent events have a probability distribution of event moment consistent with an M^(-3/2) power-law scaling, with Poisson-distributed recurrence times. Coherent events have a log-normal moment distribution and a mean temporal recurrence. As the applied normal pressure increases, there are more coherent events, and their log-normal distribution broadens and shifts to larger average moment.
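A density p(M) ~ M^(-3/2) above a minimum moment implies a complementary CDF P(>M) ~ M^(-1/2). The sketch below samples such events by inverse-CDF and recovers the -1/2 tail exponent empirically; the sample size and reference moments are arbitrary choices for the demonstration:

```python
import math
import random

# Sketch: events with a p(M) ~ M**-1.5 moment density (as for the
# noncoherent events above) have CCDF P(>M) = (Mmin/M)**0.5.
# Inverse-CDF sampling: M = Mmin / u**2 with u uniform in (0, 1].
def sample_moments(mmin, n, rng):
    return [mmin / (1.0 - rng.random()) ** 2 for _ in range(n)]

rng = random.Random(0)
ms = sample_moments(1.0, 50000, rng)
# Estimate the empirical CCDF exponent between two reference moments:
m1, m2 = 10.0, 100.0
p1 = sum(m > m1 for m in ms) / len(ms)
p2 = sum(m > m2 for m in ms) / len(ms)
slope = (math.log(p2) - math.log(p1)) / (math.log(m2) - math.log(m1))
print(slope)  # close to -0.5 for a -3/2 density
```

The same log-log slope estimate applied to measured event moments is a quick consistency check for the M^(-3/2) claim.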
Polar process and world climate /A brief overview/
NASA Technical Reports Server (NTRS)
Goody, R.
1980-01-01
A review is presented of events relating polar regions to the world climate, the mechanisms of sea ice and polar ice sheets, and of two theories of the Pleistocene Ice Ages. The sea ice which varies over time scales of one or two years and the polar ice sheets with time changes measured in tens or hundreds of thousands of years introduce two distinct time constants into global time changes; the yearly Arctic sea ice variations affect northern Europe and have some effect over the entire Northern Hemisphere; the ice-albedo coupling in the polar ice sheets is involved in major climatic events such as the Pleistocene ice ages. It is concluded that climate problems require a global approach including the atmosphere, the oceans, and the cryosphere.
Negative Life Events Scale for Students (NLESS)
ERIC Educational Resources Information Center
Buri, John R.; Cromett, Cristina E.; Post, Maria C.; Landis, Anna Marie; Alliegro, Marissa C.
2015-01-01
Rationale is presented for the derivation of a new measure of stressful life events for use with students [Negative Life Events Scale for Students (NLESS)]. Ten stressful life events questionnaires were reviewed, and the more than 600 items mentioned in these scales were culled based on the following criteria: (a) only long-term and unpleasant…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Bernhard W.; Mane, Anil U.; Elam, Jeffrey W.
X-ray detectors that combine two-dimensional spatial resolution with a high time resolution are needed in numerous applications of synchrotron radiation. Most detectors with this combination of capabilities are based on semiconductor technology and are therefore limited in size. Furthermore, the time resolution is often realised through rapid time-gating of the acquisition, followed by a slower readout. Here, a detector technology based on relatively inexpensive microchannel plates is realised that uses GHz waveform sampling for a millimeter-scale spatial resolution and better than 100 ps time resolution. The technology is capable of continuous streaming of time- and location-tagged events at rates greater than 10^7 events per cm^2. Time-gating can be used for improved dynamic range.
Observation and analysis of abrupt changes in the interplanetary plasma velocity and magnetic field.
NASA Technical Reports Server (NTRS)
Martin, R. N.; Belcher, J. W.; Lazarus, A. J.
1973-01-01
This paper presents a limited study of the physical nature of abrupt changes in the interplanetary plasma velocity and magnetic field, based on 19 days of data from the Pioneer 6 spacecraft. The period was chosen to include a high-velocity solar wind stream and low-velocity wind. Abrupt events were accepted for study if the sum of the energy density in the magnetic field and velocity changes was above a specified minimum. A statistical analysis of the events in the high-velocity solar wind stream shows that Alfvenic changes predominate. This conclusion is independent of whether steady-state requirements are imposed on conditions before and after the event. Alfvenic changes do not dominate in the lower-speed wind. This study extends the plasma-field evidence for outwardly propagating Alfvenic changes to time scales as small as 1 min (scale lengths on the order of 20,000 km).
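The standard test for an Alfvenic change is whether the velocity change matches the field change converted to Alfven units, δv = ±δB/√(μ0ρ). A minimal sketch of that check; the event numbers (40 km/s, 4 nT, 5 cm⁻³) are invented for illustration and are not from the Pioneer 6 data:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m
PROTON_MASS = 1.6726e-27  # kg

def alfvenic_ratio(dv_kms, db_nt, proton_density_cm3):
    """Ratio of a velocity change to the Alfven-speed change implied by dB.
    Values near +1 or -1 indicate an Alfvenic change."""
    rho = proton_density_cm3 * 1e6 * PROTON_MASS  # mass density, kg/m^3
    dv_alfven = (db_nt * 1e-9) / math.sqrt(MU0 * rho)  # m/s
    return (dv_kms * 1e3) / dv_alfven

# Hypothetical event: a 40 km/s velocity jump with a 4 nT field change
# in a 5 cm^-3 proton plasma (numbers invented for illustration).
r = alfvenic_ratio(40.0, 4.0, 5.0)
print(r)  # near 1, consistent with an Alfvenic change
```

Applied event by event, this ratio is the kind of statistic that separates the Alfvenic high-speed-stream population from the non-Alfvenic low-speed events described above.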
Shenandoah National Park Phenology Project-Weather data collection, description, and processing
Jones, John W.; Aiello, Danielle P.; Osborne, Jesse D.
2010-01-01
The weather data described in this document are being collected as part of a U.S. Geological Survey (USGS) study of changes in Shenandoah National Park (SNP) landscape phenology (Jones and Osborne, 2008). Phenology is the study of the timing of biological events, such as annual plant flowering and seasonal bird migration. These events are partially driven by changes in temperature and precipitation; phenologists therefore study how these events may reflect changes in climate. Landscape phenology is the study of changes in biological events over broad areas and assemblages of vegetation. To study climate-change relations over broad areas (at landscape scale), the timing and amount of annual tree leaf emergence, maximum foliage, and leaf fall for forested areas are of interest. To better link vegetation changes with climate, weather data are necessary. This report documents the weather-station data collection and processing procedures used in the Shenandoah National Park Phenology Project.
NASA Astrophysics Data System (ADS)
Eichner, J. F.; Steuer, M.; Loew, P.
2016-12-01
Past natural catastrophes offer valuable information for present-day risk assessment. To make use of historic loss data, one has to find a setting that enables comparison (over place and time) of historic events happening under today's conditions. By means of loss data normalization, the influence of socio-economic development, the fundamental driver in this context, can be eliminated; the data then permit the deduction of risk-relevant information and allow the study of other driving factors, such as influences from climate variability and climate change or changes in vulnerability. Munich Re's NatCatSERVICE database includes, for each historic loss event, the geographic coordinates of all locations and regions that were affected in a relevant way. These locations form the basis for what is known as the loss footprint of an event. Here we introduce a state-of-the-art, robust method for global loss data normalization. The presented peril-specific loss footprint normalization method adjusts direct economic loss data for the influence of economic growth within each loss footprint (using gross cell product data as a proxy for local economic growth) and makes loss data comparable over time. To achieve a comparative setting for supra-regional economic differences, we categorize the normalized loss values (together with information on fatalities), based on the World Bank income groups, into five catastrophe classes, from minor to catastrophic. The data treated in such a way allow (a) studying the influence of improved reporting of small-scale loss events over time and (b) applying standard (stationary) extreme value statistics (here: the peaks-over-threshold method) to compile estimates for extreme and extrapolated loss magnitudes, such as a "100-year event" on a global scale. Examples of such results will be shown.
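The core of footprint normalization is scaling a historic loss by the growth of economic exposure inside the footprint. A minimal sketch; the growth factors stand in for gross-cell-product ratios and are invented, as is the example event:

```python
# Minimal sketch of loss normalization: express a historic direct loss
# in reference-year exposure terms by scaling with the growth of an
# economic proxy (e.g. gross cell product) inside the event footprint.
def normalize_loss(loss_at_event_year, gcp_event_year, gcp_reference_year):
    return loss_at_event_year * (gcp_reference_year / gcp_event_year)

# A historic event costing 100 (arbitrary units) in a footprint whose
# economic proxy tripled by the reference year:
norm = normalize_loss(100.0, gcp_event_year=1.0, gcp_reference_year=3.0)
print(norm)  # 300.0
```

Only after this adjustment are losses from different decades comparable, which is what makes the subsequent peaks-over-threshold statistics on the pooled record meaningful.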
Ostrander, Chadlin M.; Owens, Jeremy D.; Nielsen, Sune G.
2017-01-01
The rates of marine deoxygenation leading to Cretaceous Oceanic Anoxic Events are poorly recognized and constrained. If increases in primary productivity are the primary driver of these episodes, progressive oxygen loss from global waters should predate enhanced carbon burial in underlying sediments—the diagnostic Oceanic Anoxic Event relic. Thallium isotope analysis of organic-rich black shales from Demerara Rise across Oceanic Anoxic Event 2 reveals evidence of expanded sediment-water interface deoxygenation ~43 ± 11 thousand years before the globally recognized carbon cycle perturbation. This evidence for rapid oxygen loss leading to an extreme ancient climatic event has timely implications for the modern ocean, which is already experiencing large-scale deoxygenation. PMID:28808684
Extreme events as foundation of Lévy walks with varying velocity
NASA Astrophysics Data System (ADS)
Kutner, Ryszard
2002-11-01
In this work we study the role of extreme events [E.W. Montroll, B.J. West, in: J.L. Lebowitz, E.W. Montroll (Eds.), Fluctuation Phenomena, SSM, vol. VII, North-Holland, Amsterdam, 1979, p. 63; J.-P. Bouchaud, M. Potters, Theory of Financial Risks from Statistical Physics to Risk Management, Cambridge University Press, Cambridge, 2001; D. Sornette, Critical Phenomena in Natural Sciences. Chaos, Fractals, Selforganization and Disorder: Concepts and Tools, Springer, Berlin, 2000] in determining the scaling properties of Lévy walks with varying velocity. This model is an extension of the well-known Lévy walk model [J. Klafter, G. Zumofen, M.F. Shlesinger, in: M.F. Shlesinger, G.M. Zaslavsky, U. Frisch (Eds.), Lévy Flights and Related Topics in Physics, Lecture Notes in Physics, vol. 450, Springer, Berlin, 1995, p. 196; G. Zumofen, J. Klafter, M.F. Shlesinger, in: R. Kutner, A. Pȩkalski, K. Sznajd-Weron (Eds.), Anomalous Diffusion. From Basics to Applications, Lecture Notes in Physics, vol. 519, Springer, Berlin, 1999, p. 15] introduced in the context of chaotic dynamics, where a fixed value of the walker velocity is assumed for simplicity. Such an extension seems to be necessary when an open and/or complex system is studied. The model of Lévy walks with varying velocity is spanned on two coupled velocity-temporal hierarchies: the first consisting of velocities and the second of the corresponding time intervals which the walker spends between successive turning points. Both hierarchical structures are characterized by their own self-similar dimensions. The extreme event, which can appear within a given time interval, is defined as the single random step of the walker having the largest length. By finding power laws that describe the time dependence of this displacement and its statistics, we obtain two independent diffusion exponents, which are related to the above-mentioned dimensions and which characterize the extreme event kinetics.
In this work we show the principal influence of extreme events on the basic quantities (one-step distributions and moments as well as two-step correlation functions) of the continuous-time random walk formalism. Besides, we construct both the waiting-time distribution and the sojourn probability density directly in real space and time in scaling form, by a proper component analysis which takes into account all possible fluctuations of the walker steps, in contrast to the extreme event analysis. We focus here on the basic quantities, since the summarized multi-step ones were already discussed earlier [Physica A 264 (1999) 107; Comp. Phys. Commun. 147 (2002) 565]. Moreover, we study not only the scaling phenomena but also, assuming a finite number of hierarchy levels, the breaking of scaling and its dependence on control parameters. This seems important for studying empirical systems, especially since there are still no closed formulae describing this phenomenon except the one for truncated Lévy flights [Phys. Rev. Lett. 73 (1994) 2946]. Our formulation of the model made it possible to develop an efficient Monte Carlo algorithm [Physica A 264 (1999) 107; Comp. Phys. Commun. 147 (2002) 565] in which no MC step is lost.
Spectral analysis of temporal non-stationary rainfall-runoff processes
NASA Astrophysics Data System (ADS)
Chang, Ching-Min; Yeh, Hund-Der
2018-04-01
This study treats the catchment as a black-box system, considering the rainfall input and runoff output to be stochastic processes. The temporal rainfall-runoff relationship at the catchment scale is described by a convolution integral on a continuous time scale. Using the Fourier-Stieltjes representation approach, a frequency-domain solution to the convolution integral is developed for the spectral analysis of runoff processes generated by temporally non-stationary rainfall events. It is shown that the characteristic time scale of the rainfall process increases the runoff discharge variability, while the catchment mean travel time constant acts to reduce the variability of runoff discharge. Similar to the behavior of groundwater aquifers, catchments act as a low-pass filter in the frequency domain for the rainfall input signal.
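The low-pass behaviour can be made concrete under a linear-reservoir assumption (an exponential travel-time kernel with mean travel time tc), for which the squared gain between rainfall and runoff spectra is 1/(1 + (ω·tc)²). This is a standard illustration of the filtering idea, not the paper's specific solution, and tc below is an arbitrary choice:

```python
# Sketch of low-pass filtering by a catchment under a linear-reservoir
# assumption: squared spectral gain G(omega) = 1 / (1 + (omega * tc)**2),
# where tc is the mean travel time.
def gain_squared(omega, tc):
    return 1.0 / (1.0 + (omega * tc) ** 2)

tc = 10.0  # mean travel time (arbitrary units, assumed for the sketch)
low, high = gain_squared(0.01, tc), gain_squared(1.0, tc)
print(low, high)  # slow rainfall variability passes; fast variability is damped
```

The larger tc is, the lower the cutoff frequency, which is exactly the abstract's statement that a longer catchment mean travel time reduces runoff variability.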
NASA Astrophysics Data System (ADS)
Minihane, M. R.; Freyberg, D. L.
2011-08-01
Identifying the dominant mechanisms controlling recharge in shallow sandy soils in tropical climates has received relatively little attention. Given the expansion of coastal fill using marine sands and the growth of coastal populations throughout the tropics, there is a need to better understand the nature of water balances in these settings. We use time series of field observations at a coastal landfill in Singapore, coupled with numerical modeling using Richards' equation, to examine the impact of precipitation patterns on soil moisture dynamics, including percolation past the root zone and recharge, in such an environment. A threshold in total precipitation event depth, much more so than peak precipitation intensity, is the strongest event control on recharge. However, shallow antecedent moisture, and therefore the timing between events along with the seasonal depth to water table, also play significant roles in determining recharge amounts. For example, at our field site, precipitation events of less than 3 mm yield little to no direct recharge, but for larger events, moisture content changes below the root zone are linearly correlated with the product of the average antecedent moisture content and the total event precipitation. Water resources planners therefore need to consider threshold precipitation volumes, along with the multiple time scales that capture variability in event antecedent conditions and storm frequency, when assessing the role of recharge in coastal water balances in tropical settings.
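The event-scale relation described above (no recharge below ~3 mm; above it, a linear dependence on antecedent moisture times event depth) can be sketched as follows. The coefficient k is invented for illustration; the abstract reports only a linear correlation, not its value:

```python
# Sketch of the threshold-plus-linear recharge relation described above.
# k is a hypothetical proportionality constant (not from the paper).
def moisture_change(event_precip_mm, antecedent_moisture, k=0.05, threshold_mm=3.0):
    if event_precip_mm < threshold_mm:
        return 0.0  # small events wet the surface but do not recharge
    return k * antecedent_moisture * event_precip_mm

print(moisture_change(2.0, 0.20))   # 0.0  (below the ~3 mm threshold)
print(moisture_change(25.0, 0.20))  # 0.25
```

A model of this shape captures why both storm depth statistics and inter-storm timing (which sets antecedent moisture) matter for coastal water balances.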
SIGN SINGULARITY AND FLARES IN SOLAR ACTIVE REGION NOAA 11158
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorriso-Valvo, L.; De Vita, G.; Kazachenko, M. D.
Solar Active Region NOAA 11158 hosted a number of strong flares, including one X2.2 event. The complexity of the current density and current helicity is studied through cancellation analysis of their sign-singular measure, which features power-law scaling. Spectral analysis is also performed, revealing the presence of two separate scaling ranges with different spectral indices. The time evolution of the parameters is discussed. Sudden changes in the cancellation exponents at the time of large flares, and the presence of correlation with extreme-ultraviolet and X-ray flux, suggest that the eruption of large flares can be linked to the small-scale properties of the current structures.
NASA Astrophysics Data System (ADS)
Li, Xin; Babovic, Vladan
2016-04-01
Flood and drought are hydrologic extreme events that have significant impacts on human and natural systems. Characterization of flood and drought in terms of their start, duration, and strength, and investigation of the impact of natural climate variability (i.e., ENSO) and anthropogenic climate change on them, can help decision makers facilitate adaptations to mitigate potentially enormous economic costs. To date, numerous studies in this area have been conducted; however, they are primarily focused on extra-tropical regions. Therefore, this study presents a detailed framework to characterize flood and drought events in a tropical urban city-state (i.e., Singapore), based on daily data from 26 precipitation stations. Flood and drought events are extracted from standardized precipitation anomalies at monthly to seasonal time scales. The frequency, duration, and magnitude of flood and drought at all the stations are analyzed based on crossing theory. In addition, the spatial variation of flood and drought characteristics in Singapore is investigated using the ordinary kriging method. Lastly, the impact of ENSO conditions on flood and drought characteristics is analyzed using a regional regression method. The results show that Singapore can be prone to extreme flood and drought events at both monthly and seasonal time scales. ENSO has a significant influence on flood and drought characteristics in Singapore, but mainly during the South West Monsoon season. During the El Niño phase, drought can become more extreme. The results have implications for water management practices in Singapore.
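The extraction step (standardized anomalies plus crossing theory) amounts to standardizing the precipitation series and measuring runs that stay beyond a threshold. A minimal sketch with invented monthly data and an arbitrary threshold of z < -0.5 for a dry spell:

```python
import statistics

# Sketch: standardize monthly precipitation and extract drought "runs"
# (consecutive months below a z-score threshold), in the spirit of the
# crossing-theory analysis above. Data and threshold are invented.
precip = [180, 160, 90, 70, 60, 150, 200, 95, 80, 170, 190, 175]  # mm/month
mu, sigma = statistics.mean(precip), statistics.stdev(precip)
z = [(p - mu) / sigma for p in precip]

def run_lengths(z_scores, threshold=-0.5):
    runs, current = [], 0
    for v in z_scores:
        if v < threshold:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    return runs

print(run_lengths(z))  # durations of below-threshold (dry) spells
```

Run duration, run count per year, and the cumulative deficit within each run correspond directly to the duration, frequency, and magnitude statistics analyzed in the study.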
A space mission to detect imminent Earth impactors
NASA Astrophysics Data System (ADS)
Valsecchi, G. B.; Perozzi, E.; Rossi, A.
2015-03-01
One of the goals of NEO surveys is to discover Earth impactors before they hit. How much warning time is desirable depends on the size of the impactors: for the larger ones, more time is needed to mount effective mitigation measures. Initially, NEO surveys were aimed at large impactors, which can have significant global effects; however, their typical time scale is orders of magnitude longer than a human lifetime. At the other extreme, monthly and annual events, liberating energies of the order of 1 to 10 kilotons, are immaterial as a threat to mankind and do not justify substantial expenditure. Intermediate events are of more concern: in the megaton range, timescales are of the order of centuries, and the damage can be substantial. A classical example is the Tunguska event, in which a body with a diameter of about 30 to 50 m liberated about 5 megatons in the atmosphere, devastating 2000 square kilometers of Siberian forest.
Hierarchical Address Event Routing for Reconfigurable Large-Scale Neuromorphic Systems.
Park, Jongkil; Yu, Theodore; Joshi, Siddharth; Maier, Christoph; Cauwenberghs, Gert
2017-10-01
We present a hierarchical address-event routing (HiAER) architecture for scalable communication of neural and synaptic spike events between neuromorphic processors, implemented with five Xilinx Spartan-6 field-programmable gate arrays and four custom analog neuromorphic integrated circuits serving 262k neurons and 262M synapses. The architecture extends the single-bus address-event representation protocol to a hierarchy of multiple nested buses, routing events across increasing scales of spatial distance. The HiAER protocol provides individually programmable axonal delay in addition to strength for each synapse, lending itself toward biologically plausible neural network architectures, and scales across a range of hierarchies suitable for multichip and multiboard systems in reconfigurable large-scale neuromorphic systems. We show approximately linear scaling of net global synaptic event throughput with the number of routing nodes in the network, at 3.6×10^7 synaptic events per second per 16k-neuron node in the hierarchy.
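The scaling property comes from events climbing the bus hierarchy only as far as the lowest level shared by source and destination, so local traffic never loads the global bus. A toy sketch of that idea, with addresses as simple (board, chip, neuron) tuples; this illustrates the nesting principle only, not the HiAER protocol itself:

```python
# Toy sketch of hierarchical address-event routing: an event ascends the
# nested buses only to the lowest level common to source and destination.
# Addresses are hypothetical (board, chip, neuron) tuples.
def routing_level(src, dst):
    """Return how many hierarchy levels an event must ascend (0 = same unit)."""
    for level in range(len(src)):
        if src[:len(src) - level] == dst[:len(dst) - level]:
            return level
    return len(src)

print(routing_level((0, 1, 42), (0, 1, 7)))   # 1: same chip, different neuron
print(routing_level((0, 1, 42), (0, 2, 7)))   # 2: same board, different chip
print(routing_level((0, 1, 42), (3, 0, 7)))   # 3: crosses the top-level bus
```

Because most spike traffic in biologically inspired networks is local, most events terminate at low levels, which is what allows aggregate throughput to grow roughly linearly with the number of routing nodes.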
Salvalaglio, Matteo; Tiwary, Pratyush; Maggioni, Giovanni Maria; Mazzotti, Marco; Parrinello, Michele
2016-12-07
Condensation of a liquid droplet from a supersaturated vapour phase is initiated by a prototypical nucleation event. As such, it is challenging to compute its rate from atomistic molecular dynamics simulations. In fact, at realistic supersaturation conditions condensation occurs on time scales that far exceed what can be reached with conventional molecular dynamics methods. Another known problem in this context is the distortion of the free energy profile associated with nucleation due to the small, finite size of typical simulation boxes. In this work the problem of time scale is addressed with a recently developed enhanced sampling method, while contextually correcting for finite size effects. We demonstrate our approach by studying the condensation of argon, showing that characteristic nucleation times of the order of hours can be reliably calculated. Nucleation rates spanning a range of 10 orders of magnitude are computed at moderate supersaturation levels, thus bridging the gap between what standard molecular dynamics simulations can do and real physical systems.
NASA Astrophysics Data System (ADS)
Coppola, E.; Sobolowski, S.
2017-12-01
The joint EURO-CORDEX and Med-CORDEX Flagship Pilot Study, dedicated to the frontier research of using convection-permitting (CP) models to address the impact of human-induced climate change on convection, has recently been approved; the scientific community behind the project comprises 30 European research institutes. The motivations for such a challenge are the availability of large field campaigns dedicated to the study of heavy precipitation events; increased computing capacity and model developments; emerging trend signals in extreme precipitation at daily and especially sub-daily time scales in the Mediterranean and Alpine regions; and the priority given to convective extreme events under the WCRP Grand Challenge on climate extremes. The main objectives of this effort are to investigate convective-scale events, their processes, and their changes in a few key regions of Europe and the Mediterranean using CP RCMs, statistical models, and available observations, and to provide a collective assessment of modeling capacity at the CP scale together with a coherent assessment of the consequences of climate change for convective-event impacts at local to regional scales. The scientific aims are to investigate how convective events, and the damaging phenomena associated with them, will respond to changing climate conditions in different European climate zones; to understand whether an improved representation of convective phenomena at convection-permitting scales leads to upscaled added value; and, finally, to assess the possibility of replacing these costly convection-permitting experiments with statistical approaches such as "convection emulators". The common initial domain will be an extended Alpine domain, and all groups will simulate a minimum 10-year period with ERA-Interim boundary conditions, with the possibility of two additional sub-domains, one in northwestern continental Europe and another in the southeastern Mediterranean. 
The scenario simulations will be completed for three different 10-year time slices under the RCP8.5 scenario: one in the historical period, one in the near future, and one in the far future. The first target of this scientific community is to have an ensemble of 1-2-year ERA-Interim-driven simulations ready by late 2017, together with a set of test cases to use as a pilot study.
Slow Slip and Earthquake Nucleation in Meter-Scale Laboratory Experiments
NASA Astrophysics Data System (ADS)
Mclaskey, G.
2017-12-01
The initiation of dynamic rupture is thought to be preceded by a quasistatic nucleation phase. Observations of recent earthquakes sometimes support this by illuminating slow slip and foreshocks in the vicinity of the eventual hypocenter. I describe laboratory earthquake experiments conducted on two large-scale loading machines at Cornell University that provide insight into the way earthquake nucleation varies with normal stress, healing time, and loading rate. The larger of the two machines accommodates a 3 m long granite sample, and when it is loaded to 7 MPa stress levels, we observe dynamic rupture events that are preceded by a measurable nucleation zone with dimensions on the order of 1 m. The smaller machine accommodates a 0.76 m sample that is roughly the same size as the nucleation zone. On this machine, small variations in nucleation properties result in measurable differences in slip events, and we generate both dynamic rupture events (>0.1 m/s slip rates) and slow slip events (0.001 to 30 mm/s slip rates). Slow events occur when an instability cannot fully nucleate before reaching the sample ends. Dynamic events occur after long healing times or abrupt increases in loading rate, which suggests that these factors shrink the spatial and temporal extents of the nucleation zone. Arrays of slip, strain, and ground motion sensors installed on the sample allow us to quantify seismic coupling and study details of premonitory slip and afterslip. The slow slip events we observe are primarily aseismic (less than 1% of the seismic coupling of faster events) and produce swarms of very small M -6 to M -8 events. These mechanical and seismic interactions suggest that faults with transitional behavior—where creep, small earthquakes, and tremor are often observed—could become seismically coupled if loaded rapidly, either by a slow slip front or by dynamic rupture of an earthquake that nucleated elsewhere.
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Adler, Robert; Adler, David; Peters-Lidard, Christa; Huffman, George
2012-01-01
It is well known that extreme or prolonged rainfall is the dominant trigger of landslides worldwide. While research has evaluated the spatiotemporal distribution of extreme rainfall and landslides at local or regional scales using in situ data, few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This study uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from TRMM data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and landslides globally. Evaluation of the GLC indicates that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This study characterizes the variability of satellite precipitation data and reported landslide activity at the global scale in order to improve landslide cataloging and forecasting and to quantify potential triggering sources at daily, monthly, and yearly time scales.
Use of modeled and satellite soil moisture to estimate soil erosion in central and southern Italy.
NASA Astrophysics Data System (ADS)
Termite, Loris Francesco; Massari, Christian; Todisco, Francesca; Brocca, Luca; Ferro, Vito; Bagarello, Vincenzo; Pampalone, Vincenzo; Wagner, Wolfgang
2016-04-01
This study presents an accurate comparison between two different approaches aimed at enhancing the accuracy of the Universal Soil Loss Equation (USLE) in estimating soil loss at the single-event time scale. It is well known that including the observed event runoff in the USLE improves its soil loss estimation ability at the event scale. In particular, the USLE-M and USLE-MM models use the observed runoff coefficient to correct the rainfall erosivity factor: in the first case, soil loss depends linearly on rainfall erosivity; in the second, soil loss and erosivity are related by a power law. However, the measurement of event runoff is not straightforward or, in some cases, even possible. For this reason, the first approach used in this study is the Soil Moisture For Erosion (SM4E) model, a recent USLE-derived model in which the event runoff is replaced by the antecedent soil moisture. Three kinds of soil moisture datasets have been used separately: ERA-Interim/Land reanalysis data from the European Centre for Medium-range Weather Forecasts (ECMWF); satellite retrievals from the European Space Agency Climate Change Initiative (ESA-CCI); and data modeled using a Soil Water Balance Model (SWBM). The second approach is the use of an estimated rather than observed runoff. Specifically, the Simplified Continuous Rainfall-Runoff Model (SCRRM) is used to derive the runoff estimates. SCRRM requires soil moisture data as input, and to this end the same three soil moisture datasets used for SM4E have been used separately. All the examined models have been calibrated and tested at the plot scale, using data from the experimental stations for the monitoring of erosive processes at "Masse" (Central Italy) and "Sparacia" (Southern Italy). Climatic data and runoff and soil loss measurements at the event time scale are available for the period 2008-2013 at Masse and for the period 2002-2013 at Sparacia. 
The results show that both approaches can provide better results than the USLE. Specifically, the SM4E model has proven particularly effective at Masse, providing the best soil loss estimates, especially when the modeled soil moisture is used; in this case, the RSR index (the ratio between the root mean square error and the observed standard deviation) is equal to 0.94. The SCRRM estimates the event runoff better at Sparacia than at Masse, resulting in good performance of the USLE-derived models using the estimated runoff; however, even at Sparacia the SM4E with modeled soil moisture gives the best soil loss estimates, with RSR = 0.54. These results open an interesting scenario for the use of empirical models to determine soil loss at large scales, since soil moisture is not only a simple in situ measurement but also information widely available at the global scale from remote sensing.
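The RSR index quoted above has a simple definition: the root-mean-square error of the model estimates divided by the standard deviation of the observations (lower is better, 0 is perfect). A minimal sketch, with function and variable names of my own choosing rather than the study's:

```python
import math

def rsr(observed, simulated):
    """RSR: root-mean-square error normalized by the standard
    deviation of the observations (population form)."""
    n = len(observed)
    mean_obs = sum(observed) / n
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)
    std_obs = math.sqrt(sum((o - mean_obs) ** 2 for o in observed) / n)
    return rmse / std_obs
```

An RSR of 1.0 means the model errors are as large as the natural spread of the observations, which is why the 0.54 obtained at Sparacia is a markedly better result than the 0.94 at Masse.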
A crater and its ejecta: An interpretation of Deep Impact
NASA Astrophysics Data System (ADS)
Holsapple, Keith A.; Housen, Kevin R.
2007-03-01
We apply recently updated scaling laws for impact cratering and ejecta to interpret observations of the Deep Impact event. An important question is whether the cratering event was gravity or strength-dominated; the answer gives important clues about the properties of the surface material of Tempel 1. Gravity scaling was assumed in pre-event calculations and has been asserted in initial studies of the mission results. Because the gravity field of Tempel 1 is extremely weak, a gravity-dominated event necessarily implies a surface with essentially zero strength. The conclusion of gravity scaling was based mainly on the interpretation that the impact ejecta plume remained attached to the comet during its evolution. We address that feature here, and conclude that even strength-dominated craters would result in a plume that appeared to remain attached to the surface. We then calculate the plume characteristics from scaling laws for a variety of material types, and for gravity and strength-dominated cases. We find that no model of cratering alone can match the reported observation of plume mass and brightness history. Instead, comet-like acceleration mechanisms such as expanding vapor clouds are required to move the ejected mass to the far field in a few-hour time frame. With such mechanisms, and to within the large uncertainties, either gravity or strength craters can provide the levels of estimated observed mass. Thus, the observations are unlikely to answer the questions about the mechanical nature of the Tempel 1 surface.
NASA Astrophysics Data System (ADS)
Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.
2017-12-01
Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources, and impactful weather along the west coasts of North America and Europe. There is strong demand in the water management, societal infrastructure, and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events such as floods and droughts, so that actions to mitigate disastrous impacts can be taken with sufficient lead time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate ARs in these two basins, we use state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European west coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from a comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze the probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
NASA Astrophysics Data System (ADS)
Haruki, W.; Iseri, Y.; Takegawa, S.; Sasaki, O.; Yoshikawa, S.; Kanae, S.
2016-12-01
Natural disasters caused by heavy rainfall occur every year in Japan, and effective countermeasures against such events are important. In 2015, a catastrophic flood occurred in the Kinu river basin, located in the northern part of the Kanto region. The remarkable feature of this flood event lay not only in the intensity of the rainfall but also in the spatial characteristics of the heavy rainfall area: the flood was caused by the continuous overlapping of a heavy rainfall area over the Kinu river basin, suggesting that consideration of spatial extent is quite important for assessing the impacts of heavy rainfall events. However, the spatial extent of heavy rainfall events cannot be properly measured by rain gauges at observation points. Radar observations, on the other hand, provide spatially and temporally high-resolution rainfall data, which are useful for capturing the characteristics of heavy rainfall events. For effective long-term countermeasures, extreme heavy rainfall scenarios that account for rainfall area and distribution are required. In this study, a new method for generating extreme heavy rainfall events using Monte Carlo simulation has been developed in order to produce such scenarios. The study used AMeDAS analyzed precipitation data, a high-resolution gridded precipitation dataset produced by the Japan Meteorological Agency. Depth-area-duration (DAD) analysis was conducted to extract past extreme rainfall events, considering time and spatial scale. In the Monte Carlo simulation, extreme rainfall events are generated based on the events extracted by the DAD analysis. Extreme heavy rainfall events are generated in a specific region of Japan, and the types of generated events can be changed by varying the parameters. As an application of this method, we focused on the Kanto region; as a result, 3000 years of rainfall data were generated. 
The 100-year probable rainfall and the return period of the 2015 Kinu River flood are obtained using the generated data. We compared the 100-year probable rainfall calculated by this method with that from traditional methods. The newly developed method enables us to generate extreme rainfall events that account for time and spatial scale and to produce extreme rainfall scenarios.
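With a long synthetic series of annual maxima, a T-year event can be read off empirically via a plotting-position argument: a value of rank m (sorted descending) in n years has an empirical return period of roughly (n + 1)/m. The sketch below illustrates that estimate under the Weibull plotting position; the study's actual frequency-analysis method may differ.

```python
def return_period_value(annual_maxima, return_period_years):
    """Empirical T-year event from a series of annual maxima using the
    Weibull plotting position T = (n + 1) / m, where m is the rank of
    a value when the series is sorted in descending order."""
    xs = sorted(annual_maxima, reverse=True)
    n = len(xs)
    # pick the rank whose plotting-position return period is closest to T
    m = round((n + 1) / return_period_years)
    m = min(max(m, 1), n)
    return xs[m - 1]
```

With the 3000 years of generated data mentioned above, the 100-year rainfall corresponds roughly to the 30th-largest annual maximum, which is why such long synthetic series make the estimate far more robust than a few decades of observations.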
A Scale to Characterize the Strength and Impacts of Atmospheric Rivers
NASA Astrophysics Data System (ADS)
Ralph, F. M.; Rutz, J. J.; Cordeira, J. M.; Dettinger, M. D.; Anderson, M.; Schick, L. J.; Smallcomb, C.; Reynolds, D.
2017-12-01
A scale has been developed to categorize atmospheric river (AR) strength and duration. It is based on the maximum instantaneous vertically integrated water vapor transport (IVT) and the duration of the event at a given point (i.e., the duration of IVT ≥250 kg m-1 s-1, a minimal threshold for weak AR conditions). The AR Scale is intuitive, with five categories (AR Cats 1-5) arising as a function of maximum IVT intensity and the duration of at least minimal AR conditions. These categories provide a wide range of users with a baseline for gauging the potential impacts, both beneficial and hazardous, associated with an AR at their location. The scale also provides a basis for reporting the occurrence of past ARs and for tracking their frequency of occurrence over time. This presentation will focus on describing the AR Scale, its use and interpretation, and the spatiotemporal distribution of AR Cat 1-5 events in the western U.S. during the cool season (October-April) of 1980-2017.
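The abstract specifies only the 250 kg m-1 s-1 threshold. The specific IVT bins and duration adjustment in the sketch below follow the version of the AR scale later published by Ralph and co-authors (250-unit IVT bins, promotion for long events, demotion for brief ones); treat them as an illustrative reconstruction rather than the definitive scheme.

```python
def ar_category(max_ivt, duration_hours):
    """Assign an AR category (1-5) from the maximum instantaneous IVT
    (kg m-1 s-1) and the duration of AR conditions (hours of IVT >= 250).
    Bin edges and duration adjustment are assumptions based on the
    published AR scale. Returns None below AR strength."""
    if max_ivt < 250:
        return None
    # preliminary rank from peak IVT, in 250-unit bins capped at 5
    rank = min(int((max_ivt - 250) // 250) + 1, 5)
    # long-duration events are promoted; brief ones demoted
    if duration_hours >= 48:
        rank += 1
    elif duration_hours < 24:
        rank -= 1
    return min(max(rank, 1), 5)
```

For example, an event peaking at 600 kg m-1 s-1 for 30 hours would rate AR Cat 2, but the same peak sustained past 48 hours would be promoted to Cat 3, capturing the idea that duration drives much of an AR's hazard.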
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, J.J.
1976-01-01
Temporal changes of the Plymouth herring, Atlanto-Scandian herring, Norwegian cod, New York menhaden, Maine lobster, California sardine, anchovy, and red crab, and Japanese herring and sardine are considered in relation to oscillations of Peruvian anchovy and guano bird populations in response to variations of wind strength, of atmospheric and sea surface temperature anomalies, and of current speed for the Eastern Tropical Pacific. It is suggested that marine communities, either off Peru or throughout the ocean, respond in a similar manner to global oscillations at the climatic and El Niño scales by geographical relocation of their centers of abundance. It is further suggested that these two longer scales of variability are minor perturbations of marine ecosystems in comparison with an interaction of overfishing and natural oscillations at the event scale of variability, i.e., that the failure of most of the world's clupeid fisheries may be linked to the imposition of this additional stress and the local perturbations of the larval drift of an organism on a time scale of days to weeks.
Time-scale dependent sediment flux in the Tajik Pamir Mountains
NASA Astrophysics Data System (ADS)
Pohl, Eric; Gloaguen, Richard; Andermann, Christoff; Fuchs, Margret C.
2014-05-01
The Pamir Mountains (Pamirs) offer the unique possibility of observing landscape-shaping processes in a complex climatic environment. While the Westerlies provide most of the moisture as snow in winter, the Indian summer monsoon can also contribute significantly to the water budget in summer. Water from snow and ice melt induced by temperature and rainfall mobilizes sediments from hillslopes, debris fans, and moraine remnants. These sediments are transported, re-deposited, and eventually carried out of the orogen. Different approaches are available to assess and quantify erosion processes at different time scales. Recent studies applying cosmogenic nuclide (CN) dating suggest erosion rates of approximately 0.65 mm/yr for the last 1000 years. In this contribution we present modern erosion rates derived from historical archived suspended sediment yield (SSY) data and very recent in situ sampling data, including high-resolution turbidimeter measurements. The 10-day averaged SSY data recorded in the past show less erosion by a factor of 2 to 10 compared to CN-derived erosion rates for different catchments. The 10-day SSY data are based on measurements conducted in the morning and evening, thus not accounting for the entire diurnal variation. We installed a turbidimeter with a measuring interval of 10 minutes to better resolve these diurnal variations. We calibrate turbidity with in situ measurements carried out on a daily basis for 9 months to determine whether the differences between CN and SSY measurements are really due to diurnal variations or whether rare high-magnitude events, e.g. mudflows, landslides, or avalanches, explain this discrepancy. We present single high-magnitude SSY events, uncover periodic diurnal sediment variations that systematically lag diurnal temperature variations, and relate the sediment amount of such high-magnitude events to the smoothed annual cycle. 
We use the obtained results to discuss whether past changes in climate could explain the observed difference between millennial-scale CN and decadal-scale SSY measurements, or whether single high-magnitude events play the dominant role.
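Calibrating a turbidimeter against in situ samples, as described above, typically comes down to a regression of suspended sediment concentration (SSC) on the turbidity reading. A minimal ordinary-least-squares sketch (names are mine; a real calibration may prefer a power-law or log-transformed fit):

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y ~ a + b*x, used here to calibrate
    turbidity readings against daily in situ SSC samples.
    Returns (intercept, slope)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def turbidity_to_ssc(turbidity, a, b):
    """Convert a 10-minute turbidity reading to an SSC estimate
    using the fitted calibration coefficients."""
    return a + b * turbidity
```

Once calibrated, every 10-minute turbidity reading becomes an SSC estimate, which is what allows diurnal cycles and short-lived high-magnitude events to be resolved at all.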
NASA Astrophysics Data System (ADS)
Peethambaran, Rahul; Ghosh, Prosenjit
2017-04-01
The isotope ratios in rainwater are controlled by factors such as source water composition and the intensity of convective activity (Rahul et al., 2016). In this study, we investigate the atmospheric controls on rainwater δ18O values collected over a four-year period from two Indian stations, Thiruvananthapuram (TRV, n=222, average -2.58±3.06‰) and Bangalore (BLR, n=198, average -1.94±3.94‰), covering the southwest monsoon (SWM) and northeast monsoon (NEM). The samples were collected at daily intervals and, in some particular cases, at intra-event time scales (4 events). The seasonal variations are more pronounced over BLR, owing to its location in central peninsular India, than over TRV, which is a coastal station. The intra-event observations indicate that the amount effect is significant due to post-condensation evaporation during raindrop descent. This is supported by the observed low d-excess values of rainwater and their inverse correlation (r=0.5 to 0.8) with rainfall amount within events. The correlation between rainwater δ18O and the local rainfall amount was low (r=0.2 and 0.3) at both stations, whereas the isotope ratios respond to monsoonal convective systems on a regional scale. Significant negative correlations of isotope ratios with moisture convergence were obtained at spatio-temporal scales over parts of the Arabian Sea, as well as over the moisture pathways associated with synoptic-scale disturbances over the Bay of Bengal (BoB). We observe that the correlation pattern responds to seasonal changes at the moisture source regions during the SWM and NEM. References: Rahul, P., P. Ghosh, S. K. Bhattacharya, and K. Yoshimura (2016), Controlling factors of rainwater and water vapor isotopes at Bangalore, India: constraints from observations in 2013 Indian monsoon, J. Geophys. Res. Atmos., 121, doi:10.1002/2016JD025352.
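The d-excess diagnostic used above has a standard definition, d = δD - 8·δ18O (Dansgaard, 1964); low values in rain, anticorrelated with rainfall amount, are the fingerprint of sub-cloud evaporation of falling drops. A sketch of the two quantities involved (a bare-bones illustration, not the study's analysis code):

```python
import math

def d_excess(delta_d, delta_18o):
    """Deuterium excess d = dD - 8 * d18O (per mil), after Dansgaard
    (1964). Low d-excess in rain indicates post-condensation
    (sub-cloud) evaporation of falling raindrops."""
    return delta_d - 8.0 * delta_18o

def pearson_r(x, y):
    """Pearson correlation coefficient, as used to relate within-event
    d-excess to rainfall amount."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```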
Was millennial scale climate change during the Last Glacial triggered by explosive volcanism?
Baldini, James U.L.; Brown, Richard J.; McElwaine, Jim N.
2015-01-01
The mechanisms responsible for millennial scale climate change within glacial time intervals are equivocal. Here we show that all eight known radiometrically-dated Tambora-sized or larger NH eruptions over the interval 30 to 80 ka BP are associated with abrupt Greenland cooling (>95% confidence). Additionally, previous research reported a strong statistical correlation between the timing of Southern Hemisphere volcanism and Dansgaard-Oeschger (DO) events (>99% confidence), but did not identify a causative mechanism. Volcanic aerosol-induced asymmetrical hemispheric cooling over the last few hundred years restructured atmospheric circulation in a similar fashion as that associated with Last Glacial millennial-scale shifts (albeit on a smaller scale). We hypothesise that following both recent and Last Glacial NH eruptions, volcanogenic sulphate injections into the stratosphere cooled the NH preferentially, inducing a hemispheric temperature asymmetry that shifted atmospheric circulation cells southward. This resulted in Greenland cooling, Antarctic warming, and a southward shifted ITCZ. However, during the Last Glacial, the initial eruption-induced climate response was prolonged by NH glacier and sea ice expansion, increased NH albedo, AMOC weakening, more NH cooling, and a consequent positive feedback. Conversely, preferential SH cooling following large SH eruptions shifted atmospheric circulation to the north, resulting in the characteristic features of DO events. PMID:26616338
Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR
NASA Astrophysics Data System (ADS)
Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.
2017-12-01
Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of the large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research uses synoptic climatology as a tool to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in interpreting projections of future climate at impact-relevant scales.
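Composite analysis as described here amounts to averaging a gridded field (e.g. 500-hPa geopotential height) over the extreme-precipitation days and subtracting the all-day mean to expose the associated anomaly pattern. A dependency-free sketch, with nested Python lists standing in for gridded reanalysis data:

```python
def composite_anomaly(fields, extreme_days):
    """Composite anomaly: mean of a gridded field over the selected
    extreme days minus the mean over all days. `fields` is a list of
    equally shaped 2-D grids (list of lists), one per day."""
    def mean_fields(idx):
        acc = [[0.0] * len(fields[0][0]) for _ in fields[0]]
        for i in idx:
            for r, row in enumerate(fields[i]):
                for c, v in enumerate(row):
                    acc[r][c] += v
        return [[v / len(idx) for v in row] for row in acc]

    all_days = mean_fields(range(len(fields)))
    extremes = mean_fields(extreme_days)
    return [[e - a for e, a in zip(er, ar)]
            for er, ar in zip(extremes, all_days)]
```

The self-organizing maps step then clusters the individual extreme-day fields to reveal the distinct pattern types hidden inside this single composite; that step needs a dedicated library and is not sketched here.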
Carmena, Jose M.
2016-01-01
Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
A Long, Long Time Ago: Student Perceptions of Geologic Time Using a 45.6-foot-long Timeline
NASA Astrophysics Data System (ADS)
Gehman, J. R.; Johnson, E. A.
2008-12-01
In this study we investigated preconceptions of geologic time held by students in five large (50-115 students each) sections of introductory geology and Earth science courses. Students were randomly divided into groups of eleven individuals, and each group was assigned a separate timeline made from a roll of adding machine paper. Students were encouraged to work as a group to place the eleven geological or biological events where they thought they belonged on their timeline, based only on their previous knowledge of geologic time. Geologic events included "Oldest Known Earth Rock" and "The Colorado River Begins to Form the Grand Canyon," while biological events included such milestones as "First Fish," "Dinosaurs go Extinct," and "First Modern Humans." Students were asked in an anonymous survey how they decided to place the events on the timeline in this initial exercise. After the eleven event cards were clipped to the timeline and marks were made to record the initial location of each event, students returned to the classroom and were provided with a scale and the correct dates for the events. Each paper timeline was 45.6 ft long, representing the 4.56 billion years of Earth history (each one-foot-wide floor tile in the hallways outside the classroom equals 100 million years). Students then returned to their timelines and moved the event cards to the correct locations. At the end of the exercise, the survey questions and the paper timelines with the markings of the original positions of geologic events were collected and compiled. Analysis of the timeline data based on previous knowledge revealed that no group of students arranged all of the events in the proper sequence, although several groups misplaced only two events in relative order. Students consistently placed events further back in time than their correct locations based on absolute age dates. 
The survey revealed that several student groups used one "old" event such as the "First Dinosaurs Appear" or "Oldest Known Earth Rock" as a marker from which they based relative placement of other events on the timeline. The most recent events including "First Modern Humans" showed the greatest percentage error of placement.
NASA Astrophysics Data System (ADS)
Carolin, S.; Walker, R. T.; Henderson, G. M.; Maxfield, L.; Ersek, V.; Sloan, A.; Talebian, M.; Fattahi, M.; Nezamdoust, J.
2015-12-01
The influence of climate on the growth and development of ancient civilizations throughout the Holocene remains a topic of heated debate. The 4.2 ka BP global-scale mid-to-low latitude aridification event (Walker et al., 2012) in particular has incited various correlation proposals. Some authors suggest that this event may have led to the collapse of the Akkadian empire in Mesopotamia, one of the first empires in human history, as well as to changes among other Early Bronze Age societies dependent on cereal agriculture (e.g. Staubwasser and Weiss, 2006). Other authors remain doubtful of the impact of environmental factors on the collapse of past societies (e.g. Middleton, 2012). While the coincident timing of an environmental event with archeological evidence does not imply causation, a comprehensive understanding of climate variability in the ancient Near East is nonetheless an essential component of resolving the full history of early human settlements. Paleoclimate data on the Central Iranian Plateau, a region rich in ancient history, are exceptionally sparse compared to other areas. Many karst locations are found throughout the region, however, setting the stage for the development of several high-resolution, accurate and precisely-dated climate proxy records if a correlation between the chemistry of semi-arid speleothem samples and climate is resolved. Here we present a 5.1-3.7 ka BP record of decadal-scale stalagmite stable isotope and trace metal variability. The stalagmite was collected in Gol-e zard cave (35.8°N, 52.0°E), ~100 km NE of Tehran on the southern flank of the Alborz mountain range (2530 m a.s.l.). The area currently receives ~270 mm mean annual precipitation, with more than 90% of precipitation falling within the wet season (November-May). We use GNIP data from Tehran and local and regional meteorological data to resolve the large-scale mechanisms forcing isotopic variations in rainwater over Gol-e zard cave.
We discuss possible transformation of water isotopes during transition through the karst aquifer based on site properties and simple model experiments. Finally, we discuss the timing and magnitude of significant events in the stable isotope and trace metal records, particularly in relation to the 4.2 ka BP drought event apparent in certain other regional climate records.
Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand
NASA Astrophysics Data System (ADS)
Nekrasova, A.; Kossobokov, V. G.
2017-12-01
We consider seismic events as a sequence of avalanches in the self-organized system of blocks and faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter η = τ × 10^(B(5-M)) × L^C of the Unified Scaling Law for Earthquakes, USLE (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, C is the fractal dimension of the seismic locus, and L is the linear size of the locus). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different, rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. During such a transition, seismic activity may, in the short term, follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity before and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints for geophysicists modelling realistic earthquake sequences and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of the seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to draw any definitive conclusions on the level of seismic hazard, which is evidently high at this particular moment in both regions. This study was supported by Russian Science Foundation grant no. 16-17-00093.
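The USLE control parameter above is a simple product of the inter-event time, an exponential rescaling by magnitude, and a fractal size term. A minimal sketch of its computation; the function name and all event values (τ, M, B, L, C) are hypothetical illustrations, with the functional form η = τ × 10^(B(5-M)) × L^C taken from the abstract:

```python
def usle_eta(tau_days, magnitude, b_value, locus_size_km, fractal_dim):
    """Control parameter of the Unified Scaling Law for Earthquakes:
    eta = tau * 10**(B * (5 - M)) * L**C."""
    return tau_days * 10 ** (b_value * (5.0 - magnitude)) * locus_size_km ** fractal_dim

# Hypothetical illustration: an M5.0 event after a 10-day gap
# on a 100-km locus with B = 1 and C = 1.2.
eta = usle_eta(10.0, 5.0, 1.0, 100.0, 1.2)
```

Tracking how η clusters around distinct near-constant levels over time is then a matter of evaluating it for each consecutive event pair in a catalog.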
NASA Astrophysics Data System (ADS)
Reynen, Andrew; Audet, Pascal
2017-09-01
A new method using a machine learning technique is applied to event classification and detection at seismic networks. This method is applicable to a variety of network sizes and settings. The algorithm makes use of a small catalogue of known observations across the entire network. Two attributes, the polarization and frequency content, are used as input to regression. These attributes are extracted at predicted arrival times for P and S waves using only an approximate velocity model, as attributes are calculated over large time spans. This method of waveform characterization is shown to be able to distinguish between blasts and earthquakes with 99 per cent accuracy using a network of 13 stations located in Southern California. The combination of machine learning with generalized waveform features is further applied to event detection in Oklahoma, United States. The event detection algorithm makes use of a pair of unique seismic phases to locate events, with a precision directly related to the sampling rate of the generalized waveform features. Over a week of data from 30 stations in Oklahoma is used to automatically detect 25 times more events than are in the catalogue of the local geological survey, with a false detection rate of less than 2 per cent. This method provides a highly confident way of detecting and locating events. Furthermore, a large number of seismic events can be automatically detected with a low false alarm rate, allowing for a larger automatic event catalogue with a high degree of trust.
Examining Extreme Events Using Dynamically Downscaled 12-km WRF Simulations
Continued improvements in the speed and availability of computational resources have allowed dynamical downscaling of global climate model (GCM) projections to be conducted at increasingly finer grid scales and over extended time periods. The implementation of dynamical downscal...
Rare events in networks with internal and external noise
NASA Astrophysics Data System (ADS)
Hindes, J.; Schwartz, I. B.
2017-12-01
We study rare events in networks with both internal and external noise, and develop a general formalism for analyzing rare events that combines pair-quenched techniques and large-deviation theory. The probability distribution, shape, and time scale of rare events are considered in detail for extinction in the Susceptible-Infected-Susceptible model as an illustration. We find that when both types of noise are present, there is a crossover region as the network size is increased, where the probability exponent for large deviations no longer increases linearly with the network size. We demonstrate that the form of the crossover depends on whether the endemic state is localized near the epidemic threshold or not.
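The extinction events analyzed above can be illustrated with a bare-bones stochastic simulation. The sketch below uses a well-mixed (complete-graph) SIS model with a Gillespie-style update, not the pair-quenched network formalism of the paper; the population size, rates, and function name are arbitrary illustrative assumptions:

```python
import random

def sis_extinction_time(n=50, beta=0.8, gamma=1.0, i0=5, rng=None):
    """Simulate a well-mixed SIS epidemic until extinction and return the
    stochastic extinction time. beta is the infection rate (scaled by 1/n),
    gamma the recovery rate; i is the current number of infected nodes."""
    rng = rng or random.Random()
    t, i = 0.0, i0
    while i > 0:
        infect = beta * i * (n - i) / n   # total infection rate
        recover = gamma * i               # total recovery rate
        total = infect + recover
        t += rng.expovariate(total)       # waiting time to next event
        i += 1 if rng.random() < infect / total else -1
    return t

# With beta < gamma the endemic state is unstable and extinction is fast;
# near or above threshold, extinction becomes a rare large-deviation event.
t_ext = sis_extinction_time(rng=random.Random(42))
```

Estimating the distribution of such extinction times over many realizations is what the large-deviation analysis in the abstract characterizes analytically.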
NASA Astrophysics Data System (ADS)
Scheifinger, Helfried; Menzel, Annette; Koch, Elisabeth; Peter, Christian; Ahas, Rein
2002-11-01
A data set of 17 phenological phases from Germany, Austria, Switzerland and Slovenia spanning the time period from 1951 to 1998 has been made available for analysis together with a gridded temperature data set (1° × 1° grid) and the North Atlantic Oscillation (NAO) index time series. The disturbances of the westerlies constitute the main atmospheric source for the temporal variability of phenological events in Europe. The trend, the standard deviation and the discontinuity of the phenological time series at the end of the 1980s can, to a great extent, be explained by the NAO. A number of factors modulate the influence of the NAO in time and space. The seasonal northward shift of the westerlies overlaps with the sequence of phenological spring phases, thereby gradually reducing its influence on the temporal variability of phenological events with progression of spring (temporal loss of influence). This temporal process is reflected by a pronounced decrease in trend and standard deviation values and common variability with the NAO with increasing year-day. The reduced influence of the NAO with increasing distance from the Atlantic coast is not only apparent in studies based on the data set of the International Phenological Gardens, but also in the data set of this study with a smaller spatial extent (large-scale loss of influence). The common variance between phenological and NAO time series displays a discontinuous drop from the European Atlantic coast towards the Alps. On a local and regional scale, mountainous terrain reduces the influence of the large-scale atmospheric flow from the Atlantic via a proposed decoupling mechanism. Valleys in mountainous terrain have the inclination to harbour temperature inversions over extended periods of time during the cold season, which isolate the valley climate from the large-scale atmospheric flow at higher altitudes. 
Most phenological stations reside at valley bottoms and are thus largely decoupled in their temporal variability from the influence of the westerly flow regime (local-scale loss of influence). This study corroborates an increasing number of similar investigations that find that vegetation does react in a sensitive way to variations of its atmospheric environment across various temporal and spatial scales.
Modeling of copper sorption onto GFH and design of full-scale GFH adsorbers.
Steiner, Michele; Pronk, Wouter; Boller, Markus A
2006-03-01
During rain events, copper wash-off occurring from copper roofs results in environmental hazards. In this study, columns filled with granulated ferric hydroxide (GFH) were used to treat copper-containing roof runoff. It was shown that copper could be removed to a high extent. A model was developed to describe this removal process. The model was based on the Two Region Model (TRM), extended with an additional diffusion zone. The extended model was able to describe the copper removal in long-term experiments (up to 125 days) with variable flow rates reflecting realistic runoff events. The four parameters of the model were estimated based on data gained with specific column experiments according to maximum sensitivity for each parameter. After model validation, the parameter set was used for the design of full-scale adsorbers. These full-scale adsorbers show high removal rates during extended periods of time.
Santos-García, D; Catalán, M J; Puente, V; Valldeoriola, F; Regidor, I; Mir, P; Matías-Arbelo, J; Parra, J C; Grandas, F
2018-01-12
To compare the characteristics of patients undergoing treatment with continuous intestinal infusion of levodopa-carbidopa (CIILC) for advanced Parkinson's disease and the data on the effectiveness and safety of CIILC in the different autonomous communities (ACs) of Spain. A retrospective, longitudinal, observational study was carried out on 177 patients from 11 ACs who underwent CIILC between January 2006 and December 2011. We analysed data on patients' clinical and demographic characteristics, variables related to effectiveness (changes in off time/on time with or without disabling dyskinesia; changes in Hoehn and Yahr scale and Unified Parkinson's Disease Rating Scale scores; non-motor symptoms; and Clinical Global Impression scale scores) and safety (adverse events), and the rate of CIILC discontinuation. Significant differences were observed between ACs for several baseline variables: duration of disease progression prior to CIILC onset, off time (34.9-59.7%) and on time (2.6-48.0%; with or without disabling dyskinesia), Hoehn and Yahr score during on time, Unified Parkinson's Disease Rating Scale-III score during both on and off time, presence of ≥4 motor symptoms, and CIILC dose. Significant differences were observed during follow-up (> 24 months in 9 of the 11 ACs studied) for the percentage of off time and on time without disabling dyskinesia, adverse event frequency, and Clinical Global Impression scores. The rate of CIILC discontinuation was between 20 and 40% in 9 ACs (78 and 80% in the remaining 2 ACs). This study reveals marked variability between ACs in terms of patient selection and CIILC safety and effectiveness. These results may have been influenced by patients' baseline characteristics, the availability of multidisciplinary teams, and clinical experience. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Giaccio, Biagio; Hajdas, Irka; Isaia, Roberto; Deino, Alan; Nomade, Sebastien
2017-04-06
The Late Pleistocene Campanian Ignimbrite (CI) super-eruption (Southern Italy) is the largest known volcanic event in the Mediterranean area. The CI tephra is widely dispersed through western Eurasia and occurs in close stratigraphic association with significant palaeoclimatic and Palaeolithic cultural events. Here we present new high-precision 14C (34.29 ± 0.09 14C kyr BP, 1σ) and 40Ar/39Ar (39.85 ± 0.14 ka, 95% confidence level) dating results for the age of the CI eruption, which substantially improve upon or augment previous age determinations and permit fuller exploitation of the chronological potential of the CI tephra marker. These results provide a robust pair of 14C and 40Ar/39Ar ages for refining both the radiocarbon calibration curve and the Late Pleistocene time-scale at ca. 40 ka. In addition, these new age constraints provide compelling chronological evidence for the significance of the combined influence of the CI eruption and Heinrich Event 4 on European climate and potentially evolutionary processes of the Early Upper Palaeolithic. PMID:28383570
Attitudes Toward Medications and the Relationship to Outcomes in Patients with Schizophrenia.
Campbell, Angela H; Scalo, Julieta F; Crismon, M Lynn; Barner, Jamie C; Argo, Tami R; Lawson, Kenneth A; Miller, Alexander
The determinants of attitudes toward medication (ATM) are not well elucidated. In particular, literature remains equivocal regarding the influence of cognition, adverse events, and psychiatric symptomatology. This study evaluated relationships between those outcomes in schizophrenia and ATM. This is a retrospective analysis of data collected during the Texas Medication Algorithm Project (TMAP, n=307 with schizophrenia-related diagnoses), in outpatient clinics at baseline and every 3 months for ≥1 year (for cognition: 3rd and 9th month only). The Drug Attitude Inventory (DAI-30) measured ATM, and independent variables were: cognition (Trail Making Test [TMT], Verbal Fluency Test, Hopkins Verbal Learning Test), adverse events (Systematic Assessment for Treatment-Emergent Adverse Events, Barnes Akathisia Rating Scale), psychiatric symptomatology (Brief Psychiatric Rating Scale, Scale for Assessment of Negative Symptoms [SANS]), and medication adherence (Medication Compliance Scale). Analyses included binary logistic regression (cognition, psychiatric symptoms) and chi-square (adverse events, adherence) for baseline comparisons, and linear regression (cognition) or ANOVA (adverse events, adherence) for changes over time. Mean DAI-30 scores did not change over 12 months. Odds of positive ATM increased with higher TMT Part B scores (p=0.03) and lower SANS scores (p=0.02). Worsening of general psychopathology (p<0.001), positive symptoms (p<0.001), and negative symptoms (p=0.007) correlated with negative changes in DAI-30 scores. Relationships between cognition, negative symptoms, and ATM warrant further investigation. Studies evaluating therapies for cognitive deficits and negative symptoms should consider including ATM measures as endpoints. Patterns and inconsistencies in findings across studies raise questions about whether some factors thought to influence ATM have nonlinear relationships.
Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View
NASA Astrophysics Data System (ADS)
Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.
2017-09-01
Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.
Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.
Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung
2012-04-10
We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient depends on the states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis of a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is short compared to the relaxation time of the rate coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much longer than the relaxation time of the rate coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for a reaction system with rate coefficient fluctuations. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate coefficient fluctuations, which enables one to extract information about the magnitude and relaxation times of the fluctuating reaction rate coefficient without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate, for the reaction system with a nonequilibrium initial state distribution as well as for the system with the equilibrium initial state distribution.
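The short-window regime described above (variance growing with the square of the mean, violating renewal statistics) can be seen in a toy doubly stochastic Poisson process. This is not the authors' model: the sketch simply freezes one of two reaction rates within each measurement window, mimicking a rate coefficient whose relaxation time is much longer than the window, and all numbers are illustrative:

```python
import random

def counts_short_window(trials, k_slow, k_fast, window, rng):
    """Reaction-event counts when the rate coefficient is frozen within
    each measurement window: each trial uses one of two rates with equal
    probability, and events are generated by summing exponential waits."""
    counts = []
    for _ in range(trials):
        k = k_slow if rng.random() < 0.5 else k_fast
        t, n = rng.expovariate(k), 0
        while t < window:          # count arrivals before the window ends
            n += 1
            t += rng.expovariate(k)
        counts.append(n)
    return counts

rng = random.Random(7)
counts = counts_short_window(4000, 1.0, 20.0, 1.0, rng)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Overdispersion: var >> mean, unlike a renewal (pure Poisson) process,
# because the frozen-rate mixture contributes a term of order mean**2.
```

In the opposite limit (window much longer than the rate relaxation time) the same construction with fast in-window switching would return variance approximately equal to the mean, recovering renewal statistics.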
Statistic versus stochastic characterization of persistent droughts
NASA Astrophysics Data System (ADS)
Gonzalez-Perez, J.; Valdes, J. B.
2005-12-01
Droughts are among the most devastating natural disasters. A drought event is always associated with a precipitation deficit over a period of time. The longer the drought period, the larger the associated damage, following a power-law relationship. Additionally, the area covered by an event also increases its impact, because it makes it difficult to compensate for the deficit with water resources from neighbouring regions. Therefore, the characterization of a drought by its persistent deficit and by the area over which it extends are the main points to address. The Standardized Precipitation Index (SPI) provides a statistical characterization of the deficits. Its computation for different aggregation time scales allows an evaluation of persistence. Another, more recent statistic that may be applied in drought characterization is the extreme persistent probability function (e.p.f.), which characterizes the persistence of extreme realizations in a random sequence. This work presents an analysis of the differences in performance between the SPI and the e.p.f. in the statistical characterization of a drought event. The direct inclusion of persistence in the statistic gives the e.p.f. an advantage over the SPI. Furthermore, the relationship between the e.p.f. and its mean frequency of recurrence is known. Thus, the e.p.f. may be applied to provide either a statistical or a stochastic characterization of a drought event. Both criteria were compared, showing that the stochastic characterization produces a better drought indicator. The stochastic characterization using the e.p.f. as a criterion yields the new Drought Frequency Index (DFI). The index is applicable to any random water-related variable to identify drought events. Its main advantages over the SPI are the direct inclusion of persistence and its greater robustness to the choice of time scale.
To incorporate spatial extent in the characterization of a drought event, the new DFI may also be evaluated to characterize the drought's spatial-temporal development using DFI maps. Case studies in Spain and the USA support the advantages of the e.p.f.
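The SPI mentioned above is, at heart, a probability-integral transform of aggregated precipitation to a standard normal score. Operationally a gamma distribution is usually fitted; the sketch below instead uses an empirical (plotting-position) variant, which conveys the idea without a distribution fit. The function name and the synthetic precipitation series are illustrative assumptions:

```python
import random
from statistics import NormalDist

def spi_empirical(precip):
    """Empirical SPI: rank each aggregated precipitation total, convert the
    rank to a non-exceedance probability (Weibull plotting position
    i / (n + 1)), and map it through the inverse standard normal CDF."""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    spi = [0.0] * n
    for rank, idx in enumerate(order, start=1):
        spi[idx] = NormalDist().inv_cdf(rank / (n + 1))
    return spi

rng = random.Random(0)
monthly = [rng.gammavariate(2.0, 30.0) for _ in range(120)]  # 10 yr of totals
scores = spi_empirical(monthly)
# Negative SPI marks dry months; values below about -1 flag moderate drought,
# and runs of consecutive negative scores indicate persistent deficits.
```

Repeating the aggregation over 3-, 6-, or 12-month totals before the transform gives the multi-scale persistence evaluation the abstract refers to.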
DOES A SCALING LAW EXIST BETWEEN SOLAR ENERGETIC PARTICLE EVENTS AND SOLAR FLARES?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahler, S. W., E-mail: AFRL.RVB.PA@kirtland.af.mil
2013-05-20
Among many other natural processes, the size distributions of solar X-ray flares and solar energetic particle (SEP) events are scale-invariant power laws. The measured distributions of SEP events prove to be distinctly flatter, i.e., have smaller power-law slopes, than those of the flares. This has led to speculation that the two distributions are related through a scaling law, first suggested by Hudson, which implies a direct nonlinear physical connection between the processes producing the flares and those producing the SEP events. We present four arguments against this interpretation. First, a true scaling must relate SEP events to all flare X-ray events, and not to a small subset of the X-ray event population. We also show that the assumed scaling law is not mathematically valid and that although the flare X-ray and SEP event data are correlated, they are highly scattered and not necessarily related through an assumed scaling of the two phenomena. An interpretation of SEP events within the context of a recent model of fractal-diffusive self-organized criticality by Aschwanden provides a physical basis for why the SEP distributions should be flatter than those of solar flares. These arguments provide evidence against a close physical connection of flares with SEP production.
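Comparing the power-law slopes of flare and SEP size distributions, as discussed above, comes down to estimating the exponent from a sample of event sizes. A minimal sketch of the standard maximum-likelihood (Hill-type) estimator for a power-law density, run on synthetic data rather than real flare or SEP catalogs; the function name and parameters are illustrative:

```python
import math
import random

def powerlaw_mle(sizes, xmin=1.0):
    """MLE of the density exponent alpha for p(x) ~ x**(-alpha), x >= xmin:
    alpha_hat = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

rng = random.Random(3)
# paretovariate(a) samples p(x) ~ x**-(a+1) with xmin = 1, so the
# true density exponent here is 3.0.
events = [rng.paretovariate(2.0) for _ in range(5000)]
alpha_hat = powerlaw_mle(events)
```

Fitting both populations this way, and checking whether the slopes differ beyond the estimator's standard error of roughly (alpha - 1)/sqrt(n), is the kind of comparison underlying the flatness argument in the abstract.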
Greenville, Aaron C; Wardle, Glenda M; Dickman, Chris R
2012-01-01
Extreme climatic events, such as flooding rains, extended decadal droughts and heat waves have been identified increasingly as important regulators of natural populations. Climate models predict that global warming will drive changes in rainfall and increase the frequency and severity of extreme events. Consequently, to anticipate how organisms will respond we need to document how changes in extremes of temperature and rainfall compare to trends in the mean values of these variables and over what spatial scales the patterns are consistent. Using the longest historical weather records available for central Australia – 100 years – and quantile regression methods, we investigate if extreme climate events have changed at similar rates to median events, if annual rainfall has increased in variability, and if the frequency of large rainfall events has increased over this period. Specifically, we compared local (individual weather stations) and regional (Simpson Desert) spatial scales, and quantified trends in median (50th quantile) and extreme weather values (5th, 10th, 90th, and 95th quantiles). We found that median and extreme annual minimum and maximum temperatures have increased at both spatial scales over the past century. Rainfall changes have been inconsistent across the Simpson Desert; individual weather stations showed increases in annual rainfall, increased frequency of large rainfall events or more prolonged droughts, depending on the location. In contrast to our prediction, we found no evidence that intra-annual rainfall had become more variable over time. Using long-term live-trapping records (22 years) of desert small mammals as a case study, we demonstrate that irruptive events are driven by extreme rainfalls (>95th quantile) and that increases in the magnitude and frequency of extreme rainfall events are likely to drive changes in the populations of these species through direct and indirect changes in predation pressure and wildfires. PMID:23170202
NASA Astrophysics Data System (ADS)
Inoue, Y.; Tsuruoka, K.; Arikawa, M.
2014-04-01
In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories, representing causal relationships among events in time series. We have been developing an experimental software system for the spatial-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively through visual animations based on hierarchical structures of maps and timelines at different scales.
Space weather at Low Latitudes: Considerations to improve its forecasting
NASA Astrophysics Data System (ADS)
Chau, J. L.; Goncharenko, L.; Valladares, C. E.; Milla, M. A.
2013-05-01
In this work we present a summary of space weather events that are unique to low-latitude regions. Special emphasis will be devoted to events that occur during so-called (magnetically) quiet conditions. One of these events is the occurrence of nighttime F-region irregularities, also known as Equatorial Spread F (ESF). When such irregularities occur, navigation and communications systems are disrupted or perturbed. After more than 70 years of studies, many features of ESF irregularities (climatology, physical mechanisms, longitudinal dependence, time dependence, etc.) are well known, but so far they cannot be forecast on time scales of minutes to hours. We present a summary of some of these features and of some of the efforts being conducted to contribute to their forecasting. In addition to ESF, we have recently identified a clear connection between lower atmospheric forcing and low-latitude variability, particularly during so-called sudden stratospheric warming (SSW) events. During SSW events under magnetically quiet conditions, we have observed changes in total electron content (TEC) that are comparable to changes that occur during strongly magnetically disturbed conditions. We present results from recent events as well as outline potential efforts to forecast the ionospheric effects during these events.
Recollection-dependent memory for event duration in large-scale spatial navigation
Barense, Morgan D.
2017-01-01
Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, are encoded into memory, and if it matters whether the memory representation is based on recollection or familiarity. To investigate this issue, we used a real world virtual reality navigation paradigm where periods of navigation were interspersed with pauses of different durations. Crucially, participants were able to reliably distinguish the durations of events that were subjectively “reexperienced” (i.e., recollected), but not of those that were familiar. This effect was not found in temporal order (ordinal) judgments. We also show that the active experience of the passage of time (holding down a key while waiting) moderately enhanced duration memory accuracy. Memory for event duration, therefore, appears to rely on the hippocampally supported ability to recollect or reexperience an event enabling the reinstatement of both its duration and its spatial context, to distinguish it from other events in a sequence. In contrast, ordinal memory appears to rely on familiarity and recollection to a similar extent. PMID:28202714
NASA Astrophysics Data System (ADS)
Hawthorne, J. C.; Bartlow, N. M.; Ghosh, A.
2017-12-01
We estimate the normalized moment rate spectrum of a slow slip event in Cascadia and then attempt to reproduce it. Our goal is to further assess whether a single physical mechanism could govern slow slip and tremor events, with durations that span 6 orders of magnitude, so we construct the spectrum by parameterizing a large slow slip event as the sum of a number of subevents with various durations. The spectrum estimate uses data from three sources: the GPS-based slip inversion of Bartlow et al (2011), PBO borehole strain measurements, and beamforming-based tremor moment estimates of Ghosh et al (2009). We find that at periods shorter than 1 day, the moment rate power spectrum decays as frequency^(-n), where n is between 0.7 and 1.4 when measured from strain and between 1.2 and 1.4 when inferred from tremor. The spectrum appears roughly flat at periods of 1 to 10 days, as both the 1-day-period strain and tremor data and the 6-day-period slip inversion data imply a moment rate power of 0.02 times the total moment squared. We demonstrate one way to reproduce this spectrum: by constructing the large-scale slow slip event as the sum of a series of subevents. The shortest of these subevents could be interpreted as VLFEs or even LFEs, while longer subevents might represent the aseismic slip that drives rapid tremor reversals, streaks, or rapid tremor migrations. We pick the subevent magnitudes from a Gutenberg-Richter distribution and place the events randomly throughout a 30-day interval. Then we assign each subevent a duration that scales with its moment to a specified power. Finally, we create a moment rate function for each subevent and sum all of the moment rates. We compute the summed slow slip moment rate spectra with two approaches: a time-domain numerical computation and a frequency-domain analytical summation. Several sets of subevent parameters can allow the constructed slow slip event to match the observed spectrum.
One allowable set of parameters is of particular interest: a b-value of 1 coupled with subevent durations that scale linearly with their moments, as suggested by previous observations of slow earthquakes (Ide et al, 2007). Our work thus lends further plausibility to the existence of a single family of slow earthquakes, possibly governed by a single physical mechanism.
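The subevent-summation construction described above can be sketched numerically. The fragment below is an illustrative toy, not the authors' code: the magnitude range, moment-duration coefficient, and triangular pulse shape are all assumptions. It draws subevent magnitudes from a truncated Gutenberg-Richter distribution with b = 1, assigns durations that scale linearly with moment, places the subevents randomly in a 30-day window, sums their moment-rate pulses, and takes the power spectrum of the result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not the paper's exact values).
b = 1.0                    # Gutenberg-Richter b-value
n_events = 2000            # number of subevents
T = 30 * 86400.0           # 30-day interval, seconds
dt = 60.0                  # moment-rate sampling step, seconds

# Sample magnitudes from a truncated Gutenberg-Richter distribution.
m_min, m_max = 2.0, 5.0
u = rng.random(n_events)
mags = m_min - np.log10(1 - u * (1 - 10**(-b * (m_max - m_min)))) / b

# Moment (N m) from magnitude; assumed linear moment-duration scaling.
moments = 10**(1.5 * mags + 9.1)
durations = 1e-12 * moments

t = np.arange(0.0, T, dt)
rate = np.zeros_like(t)
# Start times placed so each pulse overlaps the sampling grid.
starts = rng.random(n_events) * (T - 2 * dt)
for M0, tau, t0 in zip(moments, durations, starts):
    # Triangular moment-rate pulse of width max(tau, dt), total moment M0.
    x = (t - t0) / max(tau, dt)
    pulse = np.clip(1 - np.abs(2 * x - 1), 0, None)
    s = pulse.sum() * dt
    if s > 0:
        rate += M0 * pulse / s      # normalize so the pulse integrates to M0

# Power spectrum of the summed moment-rate function.
spec = np.abs(np.fft.rfft(rate))**2
freqs = np.fft.rfftfreq(t.size, d=dt)
```

The high-frequency slope of `spec` can then be compared against the observed frequency^-n decay for different b-values and moment-duration exponents.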
The European 2015 drought from a hydrological perspective
NASA Astrophysics Data System (ADS)
Laaha, Gregor; Gauster, Tobias; Tallaksen, Lena M.; Vidal, Jean-Philippe; Stahl, Kerstin; Prudhomme, Christel; Heudorfer, Benedikt; Vlnas, Radek; Ionita, Monica; Van Lanen, Henny A. J.; Adler, Mary-Jeanne; Caillouet, Laurie; Delus, Claire; Fendekova, Miriam; Gailliez, Sebastien; Hannaford, Jamie; Kingston, Daniel; Van Loon, Anne F.; Mediero, Luis; Osuch, Marzena; Romanowicz, Renata; Sauquet, Eric; Stagge, James H.; Wong, Wai K.
2017-06-01
In 2015 large parts of Europe were affected by drought. In this paper, we analyze the hydrological footprint (dynamic development over space and time) of the drought of 2015 in terms of both severity (magnitude) and spatial extent and compare it to the extreme drought of 2003. Analyses are based on a range of low flow and hydrological drought indices derived for about 800 streamflow records across Europe, collected in a community effort based on a common protocol. We compare the hydrological footprints of both events with the meteorological footprints, in order to learn from similarities and differences of both perspectives and to draw conclusions for drought management. The region affected by hydrological drought in 2015 differed somewhat from the drought of 2003, with its center located more towards eastern Europe. In terms of low flow magnitude, a region surrounding the Czech Republic was the most affected, with summer low flows that exhibited return intervals of 100 years and more. In terms of deficit volumes, the geographical center of the event was in southern Germany, where the drought lasted a particularly long time. A detailed spatial and temporal assessment of the 2015 event showed that the particular behavior in these regions was partly a result of diverging wetness preconditions in the studied catchments. Extreme droughts emerged where preconditions were particularly dry. In regions with wet preconditions, low flow events developed later and tended to be less severe. For both the 2003 and 2015 events, the onset of the hydrological drought was well correlated with the lowest flow recorded during the event (low flow magnitude), pointing towards a potential for early warning of the severity of streamflow drought. Time series of monthly drought indices (both streamflow- and climate-based indices) showed that meteorological and hydrological events developed differently in space and time, both in terms of extent and severity (magnitude). 
These results emphasize that drought is a hazard which leaves different footprints on the various components of the water cycle at different spatial and temporal scales. The difference in the dynamic development of meteorological and hydrological drought also implies that impacts on various water-use sectors and river ecology cannot be informed by climate indices alone. Thus, an assessment of drought impacts on water resources requires hydrological data in addition to drought indices based solely on climate data. The transboundary scale of the event also suggests that additional efforts need to be undertaken to make timely pan-European hydrological assessments more operational in the future.
Preiksaitis, Aidanas; Krakauskaite, Solventa; Petkus, Vytautas; Rocka, Saulius; Chomskis, Romanas; Dagi, Teodoro Forcht; Ragauskas, Arminas
2016-07-01
Cerebrovascular autoregulation (CA) is an important hemodynamic mechanism that protects the brain against inappropriate fluctuations in cerebral blood flow in the face of changing cerebral perfusion pressure. Temporal CA failure is associated with worse outcomes in various acute neurological diseases. Under the existing paradigm, an integrative approach is used to associate series of temporal CA impairments with the outcomes of patients with traumatic brain injury (TBI). The objective was to explore the influence of the duration of CA impairment events on severe TBI patient outcomes. Patient age was also included in the analysis of the prospectively collected clinical data. CA was monitored in 33 prospectively enrolled severe TBI patients. The pressure reactivity index [PRx(t)] was continuously monitored to collect information on the dynamics of CA status and to analyze associations between the duration of the longest CA impairment event and patient outcomes. The Glasgow outcome scale and the duration of the longest CA impairment were negatively correlated. The duration of autoregulation impairment significantly correlated with worse outcomes. Multidimensional representation of Glasgow outcome scale plots showed that better outcomes were obtained for younger patients (age < 47 years) and those whose longest CA impairment event was shorter than 40 minutes when PRx(t) was above 0.7 during the impairment event. Unfavorable outcomes for TBI patients are more strongly associated with the duration of the single longest CA impairment episode at a high PRx(t) value than with averaged PRx(t) values or the average duration of all CA impairment episodes.
Abbreviations: ABP, arterial blood pressure; ABP(t), continuous reference arterial blood pressure; CA, cerebrovascular autoregulation; CBF, cerebral blood flow; CPP, cerebral perfusion pressure; GOS, Glasgow outcome scale; GOSHD, Glasgow outcome scale after hospital discharge; GOS6M, Glasgow outcome scale at 6 months after discharge; ICP, intracranial pressure; ICP(t), continuously monitored intracranial pressure; LCAI, longest CA impairment; optCPP, optimal cerebral perfusion pressure; PRx(t), pressure reactivity index; TBI, traumatic brain injury.
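The key quantity in the analysis above, the duration of the longest CA impairment episode, can be computed from a sampled PRx(t) series with a simple run-length scan. This is an illustrative sketch; the 0.7 threshold comes from the abstract, but the exact episode definition used in the study may differ.

```python
import numpy as np

def longest_impairment(prx, dt_minutes, threshold=0.7):
    """Duration (minutes) of the longest contiguous episode in which the
    pressure reactivity index PRx(t) exceeds `threshold`. `prx` is a
    regularly sampled series; `dt_minutes` is its sampling step."""
    longest = current = 0
    for v in prx:
        current = current + 1 if v > threshold else 0
        longest = max(longest, current)
    return longest * dt_minutes

# Example: 5-minute samples containing one 45-minute impaired episode.
prx = np.array([0.1, 0.2, 0.8, 0.9, 0.85, 0.75, 0.8, 0.9, 0.95, 0.8, 0.72, 0.3])
print(longest_impairment(prx, dt_minutes=5))  # → 45
```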
Bolide Airbursts as a Seismic Source for the 2018 Mars InSight Mission
NASA Astrophysics Data System (ADS)
Stevanović, J.; Teanby, N. A.; Wookey, J.; Selby, N.; Daubar, I. J.; Vaubaillon, J.; Garcia, R.
2017-10-01
In 2018, NASA will launch InSight, a single-station suite of geophysical instruments, designed to characterise the martian interior. We investigate the seismo-acoustic signal generated by a bolide entering the martian atmosphere and exploding in a terminal airburst, and assess this phenomenon as a potential observable for the SEIS seismic payload. Terrestrial analogue data from four recent events are used to identify diagnostic airburst characteristics in both the time and frequency domain. In order to estimate a potential number of detectable events for InSight, we first model the impactor source population from observations made on the Earth, scaled for planetary radius, entry velocity and source density. We go on to calculate a range of potential airbursts from the larger incident impactor population. We estimate there to be ~1000 events of this nature per year on Mars. To then derive a detectable number of airbursts for InSight, we scale this number according to atmospheric attenuation, air-to-ground coupling inefficiencies and by instrument capability for SEIS. We predict between 10 and 200 detectable events per year for InSight.
DynamO: a free O(N) general event-driven molecular dynamics simulator.
Bannerman, M N; Sargant, R; Lue, L
2011-11-30
Molecular dynamics algorithms for systems of particles interacting through discrete or "hard" potentials are fundamentally different to the methods for continuous or "soft" potential systems. Although many software packages have been developed for continuous potential systems, software for discrete potential systems based on event-driven algorithms are relatively scarce and specialized. We present DynamO, a general event-driven simulation package, which displays the optimal O(N) asymptotic scaling of the computational cost with the number of particles N, rather than the O(N log N) scaling found in most standard algorithms. DynamO provides reference implementations of the best available event-driven algorithms. These techniques allow the rapid simulation of both complex and large (>10^6 particles) systems for long times. The performance of the program is benchmarked for elastic hard sphere systems, homogeneous cooling and sheared inelastic hard spheres, and equilibrium Lennard-Jones fluids. This software and its documentation are distributed under the GNU General Public license and can be freely downloaded from http://marcusbannerman.co.uk/dynamo. Copyright © 2011 Wiley Periodicals, Inc.
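The elementary operation in any event-driven hard-particle code is the analytic prediction of the next collision time between a pair of spheres; the simulation then jumps from event to event rather than integrating fixed time steps. The sketch below shows that primitive for two hard spheres — an illustration of the technique, not DynamO's actual implementation.

```python
import numpy as np

def collision_time(r1, r2, v1, v2, sigma):
    """Time until two hard spheres of diameter `sigma` collide,
    or None if they never do. Solves |dr + dv*t| = sigma for the
    smallest positive root (basic event-driven MD primitive)."""
    dr = r2 - r1
    dv = v2 - v1
    b = dr @ dv
    if b >= 0:                      # moving apart or tangential
        return None
    dv2 = dv @ dv
    disc = b * b - dv2 * (dr @ dr - sigma**2)
    if disc < 0:                    # trajectories miss
        return None
    return (-b - np.sqrt(disc)) / dv2

# Head-on approach: centers 3 diameters apart, closing at unit speed.
t = collision_time(np.zeros(3), np.array([3.0, 0.0, 0.0]),
                   np.zeros(3), np.array([-1.0, 0.0, 0.0]), sigma=1.0)
print(t)  # → 2.0
```

An event-driven engine keeps such predicted times in a priority queue and always processes the earliest one, which is where the O(N log N) cost of standard schemes arises and where DynamO's bounded-queue techniques recover O(N).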
Reconciliation of Gene and Species Trees
Rusin, L. Y.; Lyubetskaya, E. V.; Gorbunov, K. Y.; Lyubetsky, V. A.
2014-01-01
The first part of the paper briefly overviews the problem of gene and species trees reconciliation with the focus on defining and algorithmic construction of the evolutionary scenario. Basic ideas are discussed for the aspects of mapping definitions, costs of the mapping and evolutionary scenario, imposing time scales on a scenario, incorporating horizontal gene transfers, binarization and reconciliation of polytomous trees, and construction of species trees and scenarios. The review does not intend to cover the vast diversity of literature published on these subjects. Instead, the authors strove to overview the problem of the evolutionary scenario as a central concept in many areas of evolutionary research. The second part provides detailed mathematical proofs for the solutions of two problems: (i) inferring a gene evolution along a species tree accounting for various types of evolutionary events and (ii) reconciling trees into a single species tree when only gene duplications and losses are allowed. All proposed algorithms have a cubic time complexity and are mathematically proved to find exact solutions. Solving algorithms for problem (ii) can be naturally extended to incorporate horizontal transfers, other evolutionary events, and time scales on the species tree. PMID:24800245
Long-time atomistic dynamics through a new self-adaptive accelerated molecular dynamics method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, N.; Yang, L.; Gao, F.
2017-02-27
A self-adaptive accelerated molecular dynamics method is developed to model infrequent atomic-scale events, especially those that occur on a rugged free-energy surface. Key in the new development is the use of the total displacement of the system at a given temperature to construct a boost potential, which is slowly increased to accelerate the dynamics. By allowing the system to evolve from one steady-state configuration to another by overcoming the transition state, this self-evolving approach makes it possible to explore the coupled motion of species that migrate on vastly different time scales. The migrations of single vacancies (V) and small He-V clusters, and the growth of nano-sized He-V clusters in Fe for times on the order of seconds, are studied with this new method. An interstitial-assisted mechanism is first explored for the migration of a helium-rich He-V cluster, while a new two-component Ostwald ripening mechanism is suggested for He-V cluster growth.
NASA Astrophysics Data System (ADS)
Herath, Sujeewa Malwila; Sarukkalige, Ranjan; Nguyen, Van Thanh Van
2018-01-01
Understanding the relationships between extreme daily and sub-daily rainfall events and their governing factors is important in order to analyse the properties of extreme rainfall events in a changing climate. Atmospheric temperature is one of the dominant climate variables with a strong relationship to extreme rainfall events. In this study, a temperature-rainfall binning technique is used to evaluate the dependency of extreme rainfall on daily maximum temperature. The Clausius-Clapeyron (C-C) relation was found to describe the relationship between daily maximum temperature and a range of rainfall durations from 6 min up to 24 h for seven Australian weather stations, located in Adelaide, Brisbane, Canberra, Darwin, Melbourne, Perth and Sydney. The analysis shows that the rainfall-temperature scaling varies with location, temperature and rainfall duration. The Darwin Airport station shows a negative scaling relationship, while the other six stations show a positive relationship. To identify the trend in the scaling relationship over time, the same analysis is conducted using data covering 10-year periods. Results indicate that the dependency of extreme rainfall on temperature also varies with the analysis period. Further, this dependency shows an increasing trend for more extreme short-duration rainfall and a decreasing trend for average long-duration rainfall events at most stations. Seasonal variations of the scaling trends were analysed by grouping the summer and autumn seasons together and the winter and spring seasons together. Most of the 99th-percentile trends for 6 min, 1 h and 24 h rainfall durations at the Perth, Melbourne and Sydney stations are increasing for both groups, while those at Adelaide and Darwin are decreasing. Furthermore, the majority of the 50th-percentile scaling trends are decreasing for both groups.
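The temperature-rainfall binning technique described above can be sketched as follows. This toy example uses synthetic intensities built to grow by roughly 7% per degree Celsius, the Clausius-Clapeyron rate; the bin width, percentile choice, and data are illustrative assumptions, not the study's station records.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily-maximum temperatures and rainfall intensities whose
# extremes scale at ~7% per deg C (illustrative, not observed data).
temps = rng.uniform(5, 35, 20000)
intensity = np.exp(0.07 * temps) * rng.lognormal(0, 0.5, temps.size)

# Temperature-rainfall binning: 2 deg C bins, 99th percentile per bin.
edges = np.arange(5, 37, 2)
centers = 0.5 * (edges[:-1] + edges[1:])
p99 = np.array([np.percentile(intensity[(temps >= lo) & (temps < hi)], 99)
                for lo, hi in zip(edges[:-1], edges[1:])])

# Log-linear fit: slope of ln(p99) vs temperature is the scaling rate.
slope = np.polyfit(centers, np.log(p99), 1)[0]
print(f"scaling: {100 * slope:.1f}% per deg C")   # close to the 7% C-C rate
```

A negative fitted slope, as reported for Darwin Airport, would indicate extreme intensities that decrease with daily maximum temperature.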
Aftershocks following crash of currency exchange rate: The case of RUB/USD in 2014
NASA Astrophysics Data System (ADS)
Usmanova, Vasilya; Lysogorskiy, Yury V.; Abe, Sumiyoshi
2018-02-01
The dynamical behavior of the currency exchange rate after its large-scale catastrophe is discussed through a case study of the rate of Russian rubles to US dollars after its crash in 2014. It is shown that, similarly to the case of the stock market crash, the relaxation is characterized by a power law, which is in analogy with the Omori-Utsu law for earthquake aftershocks. The waiting-time distribution is found to also obey a power law. Furthermore, the event-event correlation is discussed, and the aging phenomenon and scaling property are observed. Comments are made on (non-)Markovianity of the aftershock process and on a possible relevance of glassy dynamics to the market system after the crash.
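The Omori-Utsu analogy above amounts to fitting a power-law relaxation n(t) = K/(t + c)^p to post-crash activity rates. A minimal sketch with synthetic data follows; the parameter values are illustrative and are not fitted to actual RUB/USD records.

```python
import numpy as np

def omori_rate(t, K, c, p):
    """Omori-Utsu aftershock rate n(t) = K / (t + c)**p."""
    return K / (t + c)**p

# Synthetic "aftershock" counts per day after a crash at t = 0
# (illustrative parameters only).
K_true, c_true, p_true = 120.0, 0.5, 1.1
days = np.arange(1, 61, dtype=float)
counts = omori_rate(days, K_true, c_true, p_true)

# With c held fixed, log-log least squares recovers the exponent p:
# ln n = ln K - p * ln(t + c).
A = np.vstack([np.ones_like(days), -np.log(days + c_true)]).T
logK, p_fit = np.linalg.lstsq(A, np.log(counts), rcond=None)[0]
```

In practice c and p would be estimated jointly (e.g. by maximum likelihood on event times), but the log-linear form above shows where the power-law exponent enters.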
Hard X-Ray Constraints on Small-Scale Coronal Heating Events
NASA Astrophysics Data System (ADS)
Marsh, Andrew; Smith, David M.; Glesener, Lindsay; Klimchuk, James A.; Bradshaw, Stephen; Hannah, Iain; Vievering, Juliana; Ishikawa, Shin-Nosuke; Krucker, Sam; Christe, Steven
2017-08-01
A large body of evidence suggests that the solar corona is heated impulsively. Small-scale heating events known as nanoflares may be ubiquitous in quiet and active regions of the Sun. Hard X-ray (HXR) observations with unprecedented sensitivity at energies >3 keV have recently been enabled through the use of focusing optics. We analyze active region spectra from the FOXSI-2 sounding rocket and the NuSTAR satellite to constrain the physical properties of nanoflares simulated with the EBTEL field-line-averaged hydrodynamics code. We model a wide range of X-ray spectra by varying the nanoflare heating amplitude, duration, delay time, and filling factor. Additional constraints on the nanoflare parameter space are determined from energy constraints and EUV/SXR data.
Continuous catchment-scale monitoring of geomorphic processes with a 2-D seismological array
NASA Astrophysics Data System (ADS)
Burtin, A.; Hovius, N.; Milodowski, D.; Chen, Y.-G.; Wu, Y.-M.; Lin, C.-W.; Chen, H.
2012-04-01
The monitoring of geomorphic processes during extreme climatic events is of primary interest for estimating their impact on landscape dynamics. However, available techniques for surveying surface activity do not provide adequate temporal and/or spatial resolution. Furthermore, these methods can hardly investigate the dynamics of the events, since detection is made a posteriori. To increase our knowledge of landscape evolution and the influence of extreme climatic events on catchment dynamics, we need to develop new tools and procedures. Many past works have shown that seismic signals are relevant for detecting and locating surface processes (landslides, debris flows). During the 2010 typhoon season, we deployed a network of 12 seismometers dedicated to monitoring the surface processes of the Chenyoulan catchment in Taiwan. We test the ability of a two-dimensional array with small inter-station distances (~11 km) to map geomorphic activity continuously and at catchment scale. The spectral analysis of continuous records shows high-frequency (>1 Hz) seismic energy that is coherent with the occurrence of hillslope and river processes. Using a basic detection algorithm and a location approach based on the analysis of seismic amplitudes, we locate the catchment activity. We mainly observe short-duration events (>300 occurrences) associated with debris falls and bank collapses during daily convective storms, with 69% of occurrences coherent with the time distribution of precipitation. We also identify a couple of debris flows during a large tropical storm. In contrast, the FORMOSAT imagery does not detect any activity, which somewhat reflects the lack of extreme climatic conditions during the experiment. However, high-resolution pictures confirm the existence of links between most geomorphic events and existing structures (landslide scars, gullies...). We thus conclude that the activity is dominated by reactivation processes.
This highlights the major interest of seismic monitoring, since it allows a detailed spatial and temporal survey of events that classic approaches are not able to observe. In the future, dense two-dimensional seismological arrays will assess in real time the landscape dynamics of an entire catchment, tracking sediments from slopes to rivers.
ALSEP data processing: How we processed Apollo Lunar Seismic Data
NASA Technical Reports Server (NTRS)
Latham, G. V.; Nakamura, Y.; Dorman, H. J.
1979-01-01
The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years, until termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.
Fernández-Fernández, Virginia; Márquez-González, María; Losada-Baltar, Andrés; García, Pablo E; Romero-Moreno, Rosa
2013-01-01
Older people's emotional distress is often related to rumination processes focused on vital events that occurred during their lives. The specific coping strategies displayed to face those events may help explain older adults' current well-being: they can perceive that they have obtained personal growth after those events and/or they can show a tendency to have intrusive thoughts about them. This paper describes the development and analysis of the psychometric properties of the Scales for the Assessment of the Psychological Impact of Past Life Events (SAPIPLE): the past life events occurrence scale (LE-O), the ruminative thought scale (LE-R) and the personal growth scale (LE-PG). Participants were 393 community-dwelling older adults (mean age = 71.5 years; SD = 6.9). In addition to the SAPIPLE scales, depressive symptomatology, anxiety, psychological well-being, life satisfaction, physical function and vitality were assessed. The analysis of inter-rater agreement suggests the presence of two factors in the LE-O: positive and negative vital events. Confirmatory Factor Analysis (CFA) supported this two-dimensional structure for both the LE-R and the LE-PG. Good internal consistency indexes were obtained for each scale and subscale, as well as good criterion and concurrent validity indexes. Both ruminative thoughts about past life events and personal growth following those events are related to older adults' current well-being. The SAPIPLE presents good psychometric properties that justify its use with elderly people. Copyright © 2012 SEGG. Published by Elsevier Espana. All rights reserved.
Biological hierarchies and the nature of extinction.
Congreve, Curtis R; Falk, Amanda R; Lamsdell, James C
2018-05-01
Hierarchy theory recognises that ecological and evolutionary units occur in a nested and interconnected hierarchical system, with cascading effects occurring between hierarchical levels. Different biological disciplines have routinely come into conflict over the primacy of different forcing mechanisms behind evolutionary and ecological change. These disconnects arise partly from differences in perspective (with some researchers favouring ecological forcing mechanisms while others favour developmental/historical mechanisms), as well as differences in the temporal framework in which workers operate. In particular, long-term palaeontological data often show that large-scale (macro) patterns of evolution are predominantly dictated by shifts in the abiotic environment, while short-term (micro) modern biological studies stress the importance of biotic interactions. We propose that thinking about ecological and evolutionary interactions in a hierarchical framework is a fruitful way to resolve these conflicts. Hierarchy theory suggests that changes occurring at lower hierarchical levels can have unexpected, complex effects at higher scales due to emergent interactions between simple systems. In this way, patterns occurring on short- and long-term time scales are equally valid, as changes that are driven from lower levels will manifest in different forms at higher levels. We propose that the dual hierarchy framework fits well with our current understanding of evolutionary and ecological theory. Furthermore, we describe how this framework can be used to understand major extinction events better. Multi-generational attritional loss of reproductive fitness (MALF) has recently been proposed as the primary mechanism behind extinction events, whereby extinction is explainable solely through processes that result in extirpation of populations through a shutdown of reproduction. 
While not necessarily explicit, the push to explain extinction through solely population-level dynamics could be used to suggest that environmentally mediated patterns of extinction or slowed speciation across geological time are largely artefacts of poor preservation or a coarse temporal scale. We demonstrate how MALF fits into a hierarchical framework, showing that MALF can be a primary forcing mechanism at lower scales that still results in differential survivorship patterns at the species and clade level which vary depending upon the initial environmental forcing mechanism. Thus, even if MALF is the primary mechanism of extinction across all mass extinction events, the primary environmental cause of these events will still affect the system and result in differential responses. Therefore, patterns at both temporal scales are relevant. © 2017 Cambridge Philosophical Society.
Citizen journalism in a time of crisis: lessons from a large-scale California wildfire
S. Gillette; J. Taylor; D.J. Chavez; R. Hodgson; J. Downing
2007-01-01
The accessibility of news production tools through consumer communication technology has made it possible for media consumers to become media producers. The evolution of media consumer to media producer has important implications for the shape of public discourse during a time of crisis. Citizen journalists cover crisis events using camera cell phones and digital...
[Critical incidents and quality of life among rescue workers].
Prati, G; Pietrantoni, L
2009-01-01
Fire-fighters, paramedics and civil protection volunteers routinely confront potentially traumatic events in the course of their jobs. The frequency of exposure to critical incidents and the relationship between critical incident exposure and quality of life (Professional Quality of Life Scale, PROQOL; Stamm, 2005) were examined in a sample of 586 Italian emergency workers. The data indicated that the most frequent critical incidents were incidents involving multiple casualties (65% three or more times), prolonged extrication of a trapped victim with life-threatening injuries (64% three or more times), verbal or physical threat by the public while on duty (41% three or more times), and victims known to the fire-emergency worker (40% three or more times). Infrequent events included serious line-of-duty injury to self (76% never) and colleagues, and line-of-duty risk of injury or death to self (53% never) and colleagues (47% never). Emergency health workers were more exposed to critical incidents than fire-fighters. Results from non-parametric correlation analyses indicated that the most infrequent events showed the strongest association with compassion fatigue and burnout, while a failed mission after extensive effort was the event most strongly associated with compassion satisfaction.
On the reliable use of satellite-derived surface water products for global flood monitoring
NASA Astrophysics Data System (ADS)
Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.
2015-12-01
Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
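Whether a system "correctly indicates" reported flood events is commonly summarized with contingency-table scores. The abstract does not name its metrics, so the probability of detection (POD) and false-alarm ratio (FAR) below are an assumed, standard choice rather than the study's stated method.

```python
def pod_far(hits, misses, false_alarms):
    """Probability of detection and false-alarm ratio from a flood-event
    contingency table: hits = reported floods detected, misses = reported
    floods not detected, false_alarms = detections with no reported flood."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# e.g. 8 of 10 reported floods detected, with 4 false alarms:
pod, far = pod_far(hits=8, misses=2, false_alarms=4)
print(pod, round(far, 3))  # → 0.8 0.333
```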
Sustainable Model for Public Health Emergency Operations Centers for Global Settings.
Balajee, S Arunmozhi; Pasi, Omer G; Etoundi, Alain Georges M; Rzeszotarski, Peter; Do, Trang T; Hennessee, Ian; Merali, Sharifa; Alroy, Karen A; Phu, Tran Dac; Mounts, Anthony W
2017-10-01
Capacity to receive, verify, analyze, assess, and investigate public health events is essential for epidemic intelligence. Public health Emergency Operations Centers (PHEOCs) can be epidemic intelligence hubs by 1) having the capacity to receive, analyze, and visualize multiple data streams, including surveillance, and 2) maintaining a trained workforce that can analyze and interpret data from real-time emerging events. Such PHEOCs could be physically located within a ministry of health epidemiology, surveillance, or equivalent department rather than exist as a stand-alone space. They would serve as operational hubs during non-outbreak times but, in emergencies, could scale up according to the traditional Incident Command System structure.
Memory beyond memory in heart beating, a sign of a healthy physiological condition.
Allegrini, P; Grigolini, P; Hamilton, P; Palatella, L; Raffaelli, G
2002-04-01
We describe two types of memory and illustrate each using artificial and actual heartbeat data sets. The first type of memory, yielding anomalous diffusion, implies the inverse power-law nature of the waiting time distribution and the second the correlation among distinct times, and consequently also the occurrence of many pseudoevents, namely, not genuinely random events. Using the method of diffusion entropy analysis, we establish the scaling that would be determined by the real events alone. We prove that the heart beating of healthy patients reveals the existence of many more pseudoevents than in the patients with congestive heart failure.
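The diffusion entropy analysis used above converts a series of steps into a diffusion process and reads a scaling exponent delta from the growth of the Shannon entropy of the displacement distribution, S(t) = A + delta ln t. The sketch below is a compact illustration of that idea on synthetic data; the window sizes and binning are assumptions, and for uncorrelated steps delta should come out near the ordinary-diffusion value of 0.5.

```python
import numpy as np

rng = np.random.default_rng(2)

def diffusion_entropy(steps, window_sizes, bins=60):
    """Diffusion entropy analysis (sketch): for each window length t,
    build the distribution of trajectory displacements x(t) and return
    its Shannon entropy S(t). For a scaling PDF, S(t) = A + delta*ln(t)."""
    walk = np.cumsum(steps)
    entropies = []
    for t in window_sizes:
        disp = walk[t:] - walk[:-t]            # overlapping windows
        hist, edges = np.histogram(disp, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = hist[hist > 0] * dx                # probability mass per bin
        entropies.append(-(p * np.log(p / dx)).sum())
    return np.array(entropies)

# Uncorrelated Gaussian steps: ordinary diffusion, expected delta ~ 0.5.
steps = rng.standard_normal(200000)
ts = np.array([10, 20, 40, 80, 160, 320])
S = diffusion_entropy(steps, ts)
delta = np.polyfit(np.log(ts), S, 1)[0]
```

Heartbeat data with genuine correlated events would instead yield delta above 0.5, which is the signature the method is designed to isolate from pseudoevents.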
New early warning system for gravity-driven ruptures based on codetection of acoustic signal
NASA Astrophysics Data System (ADS)
Faillettaz, J.
2016-12-01
Gravity-driven rupture phenomena in natural media - e.g. landslides, rockfalls, snow or ice avalanches - represent an important class of natural hazards in mountainous regions. To protect the population against such events, a timely evacuation often constitutes the only effective way to secure the potentially endangered area. However, reliable prediction of the imminence of such failure events remains challenging due to the nonlinear and complex nature of geological material failure, hampered by inherent heterogeneity, unknown initial mechanical state, and complex load application (rainfall, temperature, etc.). Here, a simple method for real-time early warning that considers both the heterogeneity of natural media and the characteristics of acoustic emission attenuation is proposed. This new method capitalizes on the codetection of elastic waves emanating from microcracks by multiple, spatially separated sensors. Event codetection is considered a surrogate for large event size, with more frequent codetected events (i.e., detected concurrently on more than one sensor) marking the imminence of catastrophic failure. A simple numerical model based on a fiber bundle model, considering signal attenuation and hypothetical arrays of sensors, confirms the early warning potential of codetection principles. Results suggest that although the statistical properties of attenuated signal amplitudes could lead to misleading results, monitoring the emergence of large events announcing impending failure is possible even with attenuated signals, depending on sensor network geometry and detection threshold. Preliminary application of the proposed method to acoustic emissions during failure of snow samples has confirmed the potential use of codetection as an indicator of imminent failure at lab scale. The applicability of such a simple and cheap early warning system is now being investigated at a larger scale (hillslope). First results of such a pilot field experiment are presented and analysed.
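The codetection principle can be illustrated in a few lines: a detection counts as codetected when at least one other sensor triggers within a coincidence window, and a rising codetection fraction signals the emergence of large events. This is a sketch of the idea only; the fielded trigger logic and window length are assumptions.

```python
import numpy as np

def codetected(ref_picks, other_picks, window):
    """Fraction of reference-sensor detections also seen on at least one
    other sensor within `window` seconds (the codetection criterion).
    `other_picks` is a list of detection-time arrays, one per sensor."""
    hits = 0
    for t in ref_picks:
        if any(np.any(np.abs(np.asarray(p) - t) <= window) for p in other_picks):
            hits += 1
    return hits / len(ref_picks)

# Four detections on the reference sensor; two other sensors.
ref = [10.0, 50.0, 90.0, 130.0]
s2 = [10.2, 89.7]
s3 = [49.9, 200.0]
print(codetected(ref, [s2, s3], window=0.5))  # → 0.75
```

In an early-warning setting this fraction would be tracked in sliding time windows, with a sustained increase taken as a precursor of catastrophic failure.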
NASA Astrophysics Data System (ADS)
Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie
2016-07-01
This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36 000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were forced into HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. Uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasting at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
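The ensemble products discussed above (median forecast, spread, threshold-exceedance probability, per-member peak timing) can be sketched from any member matrix. The numbers below are synthetic stand-ins, not GEFS/R-forced HEC-HMS output; the Gaussian hydrograph shape and 21-member count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 21-member hourly streamflow ensemble over a 5-day horizon:
# Gaussian-shaped flood peaks with member-to-member timing and
# amplitude perturbations (synthetic, illustrative only).
hours = np.arange(120)
members = np.array([
    800 * np.exp(-0.5 * ((hours - 60 - rng.normal(0, 6)) / 12)**2)
    * rng.lognormal(0, 0.15) + 100
    for _ in range(21)
])

median = np.median(members, axis=0)            # ensemble median hydrograph
spread = (np.percentile(members, 90, axis=0)   # 10-90% spread as an
          - np.percentile(members, 10, axis=0))  # uncertainty band

# Probability of exceeding a flood threshold at each lead time,
# as would feed a flood threshold exceedance diagram.
threshold = 600.0
p_exceed = (members > threshold).mean(axis=0)
peak_hours = members.argmax(axis=1)            # per-member time of peak
```

Narrowing of `spread` and convergence of `peak_hours` as the event approaches is the behavior the paper reports 48 h pre-event.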
VME rollback hardware for time warp multiprocessor systems
NASA Technical Reports Server (NTRS)
Robb, Michael J.; Buzzell, Calvin A.
1992-01-01
The purpose of the research effort is to develop and demonstrate innovative hardware to implement specific rollback and timing functions required for efficient queue management and precision timekeeping in multiprocessor discrete event simulations. The previously completed phase 1 effort demonstrated the technical feasibility of building hardware modules which eliminate the state saving overhead of the Time Warp paradigm used in distributed simulations on multiprocessor systems. The current phase 2 effort will build multiple pre-production rollback hardware modules integrated with a network of Sun workstations, and the integrated system will be tested by executing a Time Warp simulation. The rollback hardware will be designed to interface with the greatest number of multiprocessor systems possible. The authors believe that the rollback hardware will provide for significant speedup of large scale discrete event simulation problems and allow multiprocessors using Time Warp to dramatically increase performance.
Donner, Deahn M.; Ribic, Christine; Probst, John R.
2010-01-01
Habitat colonization and abandonment affect the distribution of a species in space and time, ultimately influencing the duration of time habitat is used and the total area of habitat occupied in any given year. Both aspects have important implications for long-term conservation planning. The importance of patch isolation and area to colonization–extinction events is well studied, but little information exists on how changing regional landscape structure and population dynamics influence the variability in the timing of patch colonization and abandonment events. We used 26 years of Kirtland’s Warbler (Dendroica kirtlandii) population data taken during a habitat restoration program (1979–2004) across its historical breeding range to examine the influence of patch attributes and temporal large-scale processes, specifically the rate of habitat turnover and the fraction of occupied patches, on the year-to-year timing of patch colonization and abandonment since patch origin. We found the timing of patch colonization and abandonment was influenced by patch and large-scale regional factors. In this system, larger patches were typically colonized earlier (i.e., at a younger age) and abandoned later than smaller patches. Isolated patches (i.e., patches farther from another occupied patch) were generally colonized later and abandoned earlier. Patch habitat type affected colonization and abandonment; colonization occurred at similar patch ages between plantation and wildfire areas (9 and 8.5 years, respectively), but plantations were abandoned at earlier ages (13.9 years) than wildfire areas (16.4 years), resulting in shorter use. As the fraction of occupied patches increased, patches were colonized and abandoned at earlier ages. Patches were abandoned at older ages when the influx of new habitat patches was at either low or high rates.
Our results provide empirical support for the temporal influence of patch dynamics (i.e., patch destruction, creation, and succession) on local colonization and extinction processes that help explain large-scale patterns of habitat occupancy. Results highlight the need for practitioners to consider the timing of habitat restoration as well as total amount and spatial arrangement of habitat to sustain populations.
Space Environment Modelling with the Use of Artificial Intelligence Methods
NASA Astrophysics Data System (ADS)
Lundstedt, H.; Wintoft, P.; Wu, J.-G.; Gleisner, H.; Dovheden, V.
1996-12-01
Space based technological systems are affected by the space weather in many ways. Several severe failures of satellites have been reported at times of space storms. Our society also increasingly depends on satellites for communication, navigation, exploration, and research. Predictions of the conditions in the satellite environment have therefore become very important. We will here present predictions made with the use of artificial intelligence (AI) techniques, such as artificial neural networks (ANN) and hybrids of AI methods. We are developing a space weather model based on intelligent hybrid systems (IHS). The model consists of different forecast modules, each of which predicts the space weather on a specific time-scale. The time-scales range from minutes to months, with fundamental time-scales of 1-5 minutes, 1-3 hours, 1-3 days, and 27 days. Solar and solar wind data are used as input data. From solar magnetic field measurements, either made on the ground at Wilcox Solar Observatory (WSO) at Stanford, or made from space by the satellite SOHO, solar wind parameters can be predicted and modelled with ANN and MHD models. Magnetograms from WSO are available on a daily basis. However, from SOHO magnetograms will be available every 90 minutes. SOHO magnetograms as input to ANNs will therefore make it possible to predict even solar transient events. Geomagnetic storm activity can today be predicted with very high accuracy by means of ANN methods using solar wind input data. However, at present real-time solar wind data are only available during part of the day from the satellite WIND. With the launch of ACE in 1997, solar wind data will on the other hand be available 24 hours per day. The conditions of the satellite environment are not only disturbed at times of geomagnetic storms but also at times of intense solar radiation and highly energetic particles. These events are associated with increased solar activity.
Predictions of these events are therefore also handled with the modules in the Lund Space Weather Model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrier, C.; Holcman, D., E-mail: david.holcman@ens.fr; Mathematical Institute, Oxford OX2 6GG, Newton Institute
The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involving nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while there are exploring a large portion of the space for binding small targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets and characterized by long-time distribution. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationallymore » greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on a Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie's method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets to trigger vesicular release.« less
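Once the geometry is coarse-grained into a Poissonian rate per particle, arrival times can be drawn with a Gillespie-style simulation instead of tracking Brownian paths. A minimal sketch under that assumption (the single rate `k_escape` and the function name are illustrative, not the paper's calibrated rates):

```python
import random

def gillespie_arrivals(n_particles, k_escape, t_max, seed=0):
    """Simulate arrival times of diffusing particles to a small target.

    Narrow-escape coarse-graining: each free particle reaches the target
    at Poissonian rate k_escape, so the total rate is proportional to the
    remaining population and waiting times are exponential.
    """
    rng = random.Random(seed)
    t, remaining = 0.0, n_particles
    arrivals = []
    while remaining > 0:
        total_rate = remaining * k_escape
        t += rng.expovariate(total_rate)   # exponential waiting time
        if t > t_max:
            break
        arrivals.append(t)
        remaining -= 1
    return arrivals
```

The resulting list is an empirical arrival-time distribution, obtained at a cost independent of the domain geometry, which is exactly the saving the reduced models aim for.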
NASA Astrophysics Data System (ADS)
Philipp, Andy; Kerl, Florian; Büttner, Uwe; Metzkes, Christine; Singer, Thomas; Wagner, Michael; Schütze, Niels
2016-05-01
In recent years, the Free State of Saxony (Eastern Germany) was repeatedly hit by both extensive riverine flooding, as well as flash flood events, emerging foremost from convective heavy rainfall. Especially after a couple of small-scale, yet disastrous events in 2010, preconditions, drivers, and methods for deriving flash flood related early warning products are investigated. This is to clarify the feasibility and the limits of envisaged early warning procedures for small catchments, hit by flashy heavy rain events. Early warning about potentially flash flood prone situations (i.e., with a suitable lead time with regard to required reaction-time needs of the stakeholders involved in flood risk management) needs to take into account not only hydrological, but also meteorological, as well as communication issues. Therefore, we propose a threefold methodology to identify potential benefits and limitations in a real-world warning/reaction context. First, the user demands (with respect to desired/required warning products, preparation times, etc.) are investigated. Second, focusing on small catchments of some hundred square kilometers, two quantitative precipitation forecasts (QPFs) are verified. Third, considering the user needs, as well as the input parameter uncertainty (i.e., foremost emerging from an uncertain QPF), a feasible, yet robust hydrological modeling approach is proposed on the basis of pilot studies, employing deterministic, data-driven, and simple scoring methods.
Global impacts of the 1980s regime shift.
Reid, Philip C; Hari, Renata E; Beaugrand, Grégory; Livingstone, David M; Marty, Christoph; Straile, Dietmar; Barichivich, Jonathan; Goberville, Eric; Adrian, Rita; Aono, Yasuyuki; Brown, Ross; Foster, James; Groisman, Pavel; Hélaouët, Pierre; Hsu, Huang-Hsiung; Kirby, Richard; Knight, Jeff; Kraberg, Alexandra; Li, Jianping; Lo, Tzu-Ting; Myneni, Ranga B; North, Ryan P; Pounds, J Alan; Sparks, Tim; Stübi, René; Tian, Yongjun; Wiltshire, Karen H; Xiao, Dong; Zhu, Zaichun
2016-02-01
Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming unforeseen multiplier effects may occur. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
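The sequential t-test analysis used to date the 1980s shift can be sketched as a moving two-sample comparison: at each point, test whether the means before and after differ significantly. This is a simplified stand-in for the full STARS-type algorithm; the window length and threshold below are illustrative choices, not the study's settings:

```python
import math

def shift_candidates(series, window=10, t_crit=3.0):
    """Flag candidate regime-shift points in a time series.

    At each index, compare the means of the preceding and following
    `window` points with a two-sample t statistic; indices where the
    statistic exceeds `t_crit` are candidate shift points.
    """
    hits = []
    for i in range(window, len(series) - window):
        before = series[i - window:i]
        after = series[i:i + window]
        mb = sum(before) / window
        ma = sum(after) / window
        vb = sum((x - mb) ** 2 for x in before) / (window - 1)
        va = sum((x - ma) ** 2 for x in after) / (window - 1)
        se = math.sqrt((vb + va) / window)
        if se > 0 and abs(ma - mb) / se > t_crit:
            hits.append(i)
    return hits
```

Applied to 72 series independently, the clustering of flagged indices around a common date is what identifies a global, as opposed to local, regime shift.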
2013-07-01
...structure of the data and Gower’s similarity coefficient as the algorithm for calculating the proximity matrices. The following section provides a ... representative set of terrorist event data:

    Attribute   Day       Location  Time      Prim/Attack  Sec/Attack
    Weight      1         1         1         1            1
    Scale       Nominal   Nominal   Interval  Nominal      ...

...to calculate the similarity it uses Gower’s similarity and multidimensional scaling algorithms contained in an R statistical computing environment
Submarine landslides on the north continental slope of the South China Sea
NASA Astrophysics Data System (ADS)
Wang, Weiwei; Wang, Dawei; Wu, Shiguo; Völker, David; Zeng, Hongliu; Cai, Guanqiang; Li, Qingping
2018-02-01
Recent and paleo-submarine landslides are widely distributed within strata in deep-water areas along continental slopes, uplifts, and carbonate platforms on the north continental margin of the South China Sea (SCS). In this paper, high-resolution 3D seismic data and multibeam data based on seismic sedimentology and geomorphology are employed to assist in identifying submarine landslides. In addition, deposition models are proposed that are based on specific geological structures and features, and which illustrate the local stress field over entire submarine landslides in deep-water areas of the SCS. The SCS is one of the largest fluvial sediment sinks in enclosed or semi-enclosed marginal seas worldwide. It therefore provides a set of preconditions for the formation of submarine landslides, including rapid sediment accumulation, formation of gas hydrates, and fluid overpressure. A new concept involving temporal and spatial analyses is tested to construct a relationship between submarine landslides and different time scale trigger mechanisms, and three mechanisms are discussed in the context of spatial scale and temporal frequency: evolution of slope gradient and overpressure, global environmental changes, and tectonic events. Submarine landslides that are triggered by tectonic events are the largest but occur less frequently, while submarine landslides triggered by the combination of slope gradient and over-pressure evolution are the smallest but most frequently occurring events. In summary, analysis shows that the formation of submarine landslides is a complex process involving the operation of different factors on various time scales.
Using Scaling to Understand, Model and Predict Global Scale Anthropogenic and Natural Climate Change
NASA Astrophysics Data System (ADS)
Lovejoy, S.; del Rio Amador, L.
2014-12-01
The atmosphere is variable over twenty orders of magnitude in time (≈10^-3 to 10^17 s) and almost all of the variance is in the spectral "background", which we show can be divided into five scaling regimes: weather, macroweather, climate, macroclimate and megaclimate. We illustrate this with instrumental and paleo data. Based on the signs of the fluctuation exponent H, we argue that while the weather is "what you get" (H>0: fluctuations increasing with scale), it is macroweather (H<0: fluctuations decreasing with scale) - not climate - "that you expect". The conventional framework that treats the background as close to white noise and focuses on quasi-periodic variability assumes a spectrum that is in error by a factor of a quadrillion (≈ 10^15). Using this scaling framework, we can quantify the natural variability, distinguish it from anthropogenic variability, test various statistical hypotheses and make stochastic climate forecasts. For example, we estimate the probability that the warming is simply a giant century-long natural fluctuation to be less than 1%, most likely less than 0.1%, and estimate return periods for natural warming events of different strengths and durations, including the slowdown ("pause") in the warming since 1998. The return period for the pause was found to be 20-50 years, i.e. not very unusual; however, it immediately follows a 6 year "pre-pause" warming event of almost the same magnitude with a similar return period (30-40 years). To improve on these unconditional estimates, we can use scaling models to exploit the long-range memory of the climate process to make accurate stochastic forecasts of the climate, including the pause. We illustrate stochastic forecasts on monthly and annual scale series of global and northern hemisphere surface temperatures.
We obtain forecast skill nearly as high as the theoretical (scaling) predictability limits allow: for example, using hindcasts we find that at 10 year forecast horizons we can still explain ≈ 15% of the anomaly variance. These scaling hindcasts have comparable - or smaller - RMS errors than existing GCMs. We discuss how these can be further improved by going beyond time series forecasts to space-time forecasts.
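The fluctuation exponent H that separates the regimes above is commonly estimated by Haar fluctuation analysis: over an interval of length Δt, the fluctuation is the mean of the second half minus the mean of the first half, and H is the log-log slope of the mean absolute fluctuation against Δt. A minimal sketch (non-overlapping half-stepped windows; the function name is illustrative):

```python
import math

def haar_exponent(series, scales):
    """Estimate the fluctuation exponent H by Haar fluctuation analysis.

    For each even scale dt, the Haar fluctuation of an interval is the
    mean of its second half minus the mean of its first half; H is the
    least-squares slope of log(mean |fluctuation|) versus log(dt).
    """
    log_dt, log_f = [], []
    for dt in scales:
        half = dt // 2
        flucts = []
        for start in range(0, len(series) - dt + 1, half):
            first = series[start:start + half]
            second = series[start + half:start + dt]
            flucts.append(abs(sum(second) / half - sum(first) / half))
        log_dt.append(math.log(dt))
        log_f.append(math.log(sum(flucts) / len(flucts)))
    n = len(scales)
    mx, my = sum(log_dt) / n, sum(log_f) / n
    return sum((x - mx) * (y - my) for x, y in zip(log_dt, log_f)) / \
        sum((x - mx) ** 2 for x in log_dt)
```

A steady trend gives H = 1, while uncorrelated noise gives H ≈ -0.5; the sign of H is the weather/macroweather diagnostic described in the abstract.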
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during its execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and could be distributed in various locations in the applications environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications such as debugging and reactive control tools to improve the application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system which is a large-scale distributed system for collaborative distance learning.
The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
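The core idea of subscription-based event filtering can be shown in a few lines: consumers register predicates, and an event is forwarded only to matching subscribers, so uninteresting traffic is dropped at the source. This is a toy sketch of the general technique, not the IRI system's actual mechanism:

```python
class EventFilter:
    """Minimal sketch of subscription-based event filtering.

    Subscribers register a predicate and a handler; publish() forwards an
    event only to subscribers whose predicate matches, reducing event
    traffic instead of flooding every monitoring endpoint.
    """

    def __init__(self):
        self.subs = []

    def subscribe(self, predicate, handler):
        self.subs.append((predicate, handler))

    def publish(self, event):
        delivered = 0
        for predicate, handler in self.subs:
            if predicate(event):
                handler(event)
                delivered += 1
        return delivered
```

In a distributed setting the same predicates would be pushed close to the event sources, which is how the architecture reduces traffic flow rather than merely post-filtering it.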
NASA Astrophysics Data System (ADS)
Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano
2017-09-01
This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at Campi Flegrei caldera. The method is based on a doubly stochastic approach and is able to combine the uncertainty assessments on the spatial location of the volcanic vent, the size of the flow and the expected time of such an event. The results are obtained by using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originated. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales, are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, also including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years on almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values reduce by a factor of about 3 if the entire eruptive record is considered over the last 15 kyr, i.e. including both eruptive epochs and quiescent periods.
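The Monte Carlo structure of such a hazard map is simple to sketch: sample a vent location and a flow scale per draw, mark the grid points invaded, and average over draws. Here a circular runout footprint stands in for the box-model invasion calculation, and all names are illustrative:

```python
import random

def invasion_probability(grid_pts, vent_pts, runout_sampler, n_draws=1000, seed=1):
    """Monte Carlo sketch of a PDC invasion-probability map.

    Each draw samples a vent location from `vent_pts` and a runout
    distance from `runout_sampler` (a stand-in for a box-model flow
    scale); a grid point is invaded if it lies within that runout.
    Returned probabilities are per-point invasion frequencies.
    """
    rng = random.Random(seed)
    counts = [0] * len(grid_pts)
    for _ in range(n_draws):
        vx, vy = rng.choice(vent_pts)
        r = runout_sampler(rng)
        r2 = r * r
        for i, (gx, gy) in enumerate(grid_pts):
            if (gx - vx) ** 2 + (gy - vy) ** 2 <= r2:
                counts[i] += 1
    return [c / n_draws for c in counts]
```

The doubly stochastic character of the study enters through the samplers: vent locations, flow scales and (for time-windowed maps) event counts are themselves drawn from uncertain distributions rather than fixed.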
NASA Astrophysics Data System (ADS)
Brunsell, N. A.; Nippert, J. B.
2011-12-01
As the climate warms, it is generally acknowledged that the number and magnitude of extreme weather events will increase. We examined an ecophysiological model's responses to precipitation and temperature anomalies in relation to the mean and variance of annual precipitation along a pronounced precipitation gradient from eastern to western Kansas. This natural gradient creates a template of potential responses for both the mean and variance of annual precipitation to compare the timescales of carbon and water fluxes. Using data from several Ameriflux sites (KZU and KFS) and a third eddy covariance tower (K4B) along the gradient, BIOME-BGC was used to characterize water and carbon cycle responses to extreme weather events. Changes in the extreme value distributions were based on SRES A1B and A2 scenarios using an ensemble mean of 21 GCMs for the region, downscaled using a stochastic weather generator. We focused on changing the timing and magnitude of precipitation and altering the diurnal and seasonal temperature ranges. BIOME-BGC was then forced with daily output from the stochastic weather generator, and we examined how potential changes in these extreme value distributions impact carbon and water cycling at the sites across the Kansas precipitation gradient at time scales ranging from daily to interannual. To decompose the time scales of response, we applied a wavelet-based information theory analysis approach. Results indicate impacts on soil moisture memory and carbon allocation processes, which vary in response to both the mean and variance of precipitation along the precipitation gradient. These results suggest the need for a more pronounced focus on ecosystem responses to extreme events across a range of temporal scales in order to fully characterize the water and carbon cycle responses to global climate change.
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph
2018-07-01
To simulate the impacts of within-storm rainfall variabilities on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve rainfall event statistics which are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale-dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated in its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed a markedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events).
The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
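The core of any microcanonical cascade is the mass-conserving split: each interval's total is divided between two sub-intervals with weights (w, 1 - w), with some probability of assigning all rain to one half (intermittency). A minimal sketch with illustrative weight probabilities, not the fitted sigmoid-weight or boundary-constrained variants described above:

```python
import random

def disaggregate(totals, levels, rng=None):
    """Disaggregate coarse precipitation totals by a microcanonical cascade.

    Each branching splits an interval's amount into (w, 1 - w) shares, so
    mass is conserved exactly at every level. With probability 0.2 each,
    all rain goes to the first or second half (dry sub-intervals); the
    splitting probabilities here are illustrative, not calibrated values.
    """
    if rng is None:
        rng = random.Random(42)
    series = list(totals)
    for _ in range(levels):
        finer = []
        for amount in series:
            u = rng.random()
            if u < 0.2:          # all rain in the first half
                w = 1.0
            elif u < 0.4:        # all rain in the second half
                w = 0.0
            else:                # genuine x / (1 - x) split
                w = rng.random()
            finer.extend([amount * w, amount * (1.0 - w)])
        series = finer
    return series
```

Disaggregating one coarse step over k levels yields 2^k fine intervals with the same total, which is the exact mass conservation that distinguishes microcanonical from canonical cascades.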
Echoes from the abyss: Tentative evidence for Planck-scale structure at black hole horizons
NASA Astrophysics Data System (ADS)
Abedi, Jahed; Dykaar, Hannah; Afshordi, Niayesh
2017-10-01
In classical general relativity (GR), an observer falling into an astrophysical black hole is not expected to experience anything dramatic as she crosses the event horizon. However, tentative resolutions to problems in quantum gravity, such as the cosmological constant problem, or the black hole information paradox, invoke significant departures from classicality in the vicinity of the horizon. It was recently pointed out that such near-horizon structures can lead to late-time echoes in the black hole merger gravitational wave signals that are otherwise indistinguishable from GR. We search for observational signatures of these echoes in the gravitational wave data released by the advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), following the three black hole merger events GW150914, GW151226, and LVT151012. In particular, we look for repeating damped echoes with time delays of 8 M log M (+spin corrections, in Planck units), corresponding to Planck-scale departures from GR near their respective horizons. Accounting for the "look elsewhere" effect due to uncertainty in the echo template, we find tentative evidence for Planck-scale structure near black hole horizons at a false detection probability of 1% (corresponding to 2.5σ significance).
North Indian heavy rainfall event during June 2013: diagnostics and extended range prediction
NASA Astrophysics Data System (ADS)
Joseph, Susmitha; Sahai, A. K.; Sharmila, S.; Abhilash, S.; Borah, N.; Chattopadhyay, R.; Pillai, P. A.; Rajeevan, M.; Kumar, Arun
2015-04-01
The Indian summer monsoon of 2013 covered the entire country by 16 June, one month earlier than its normal date. Around that period, heavy rainfall was experienced in the north Indian state of Uttarakhand, which is situated on the southern slope of the Himalayan Ranges. The heavy rainfall and associated landslides caused serious damage and claimed many lives. This study investigates the scientific rationale behind the incidence of the extreme rainfall event against the backdrop of the large scale monsoon environment. It is found that a monsoonal low pressure system that provided increased low level convergence and abundant moisture, and a midlatitude westerly trough that generated strong upper level divergence, interacted with each other and helped the monsoon to cover the entire country and facilitated the occurrence of the heavy rainfall event in the orographic region. The study also examines the skill of an ensemble prediction system (EPS) in predicting the Uttarakhand event on the extended range time scale. The EPS is implemented on both high (T382) and low (T126) resolution versions of the coupled general circulation model CFSv2. Although the models predicted the event 10-12 days in advance, they failed to predict the midlatitude influence on the event. Possible reasons for this are also discussed. In both resolutions of the model, the event was triggered by the generation and northwestward movement of a low pressure system developed over the Bay of Bengal. The study advocates the usefulness of high resolution models in predicting extreme events.
75 FR 42633 - Business Continuity and Disaster Recovery
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-22
... event of a wide-scale disruption affecting such entities' trading or clearing operations. These proposed... objective, in the event of a wide-scale disruption. The proposed amendments also revise application guidance... overall resilience of the U.S. financial system in the event of a wide-scale disruption, and is the...
Stochastic summation of empirical Green's functions
Wennerberg, Leif
1990-01-01
Two simple strategies are presented that use random delay times for repeatedly summing the record of a relatively small earthquake to simulate the effects of a larger earthquake. The simulations do not assume any fault plane geometry or rupture dynamics, but rely only on the ω−2 spectral model of an earthquake source and elementary notions of source complexity. The strategies simulate ground motions for all frequencies within the bandwidth of the record of the event used as a summand. The first strategy, which introduces the basic ideas, is a single-stage procedure that consists of simply adding many small events with random time delays. The probability distribution for delays has the property that its amplitude spectrum is determined by the ratio of ω−2 spectra, and its phase spectrum is identically zero. A simple expression is given for the computation of this zero-phase scaling distribution. The moment rate function resulting from the single-stage simulation is quite simple and hence is probably not realistic for high-frequency (>1 Hz) ground motion of events larger than ML ∼ 4.5 to 5. The second strategy is a two-stage summation that simulates source complexity with a few random subevent delays determined using the zero-phase scaling distribution, and then clusters energy around these delays to get an ω−2 spectrum for the sum. Thus, the two-stage strategy allows simulations of complex events of any size for which the ω−2 spectral model applies. Interestingly, a single-stage simulation with too few ω−2 records to get a good fit to an ω−2 large-event target spectrum yields a record whose spectral asymptotes are consistent with the ω−2 model, but that includes a region in its spectrum between the corner frequencies of the larger and smaller events reasonably approximated by a power law trend.
This spectral feature has also been discussed as reflecting the process of partial stress release (Brune, 1970), an asperity failure (Boatwright, 1984), or the breakdown of ω−2 scaling due to rupture significantly longer than the width of the seismogenic zone (Joyner, 1984).
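The single-stage procedure reduces to delayed superposition of one record. A minimal sketch in which uniform integer sample delays stand in for draws from the zero-phase scaling distribution derived in the paper (names and the delay law are illustrative):

```python
import random

def stochastic_sum(record, n_events, max_delay_samples, seed=0):
    """Single-stage stochastic summation of a small-event record.

    Adds `n_events` copies of `record`, each shifted by a random integer
    sample delay. A uniform delay distribution is used here purely for
    illustration; the paper derives the delay distribution from the ratio
    of the large- and small-event omega^-2 spectra.
    """
    rng = random.Random(seed)
    out = [0.0] * (len(record) + max_delay_samples)
    for _ in range(n_events):
        d = rng.randrange(max_delay_samples + 1)
        for i, x in enumerate(record):
            out[d + i] += x
    return out
```

Because superposition is linear, the simulated spectrum is the small-event spectrum multiplied by the delay distribution's spectrum, which is why shaping the delay distribution alone suffices to hit an ω−2 target.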
NASA Astrophysics Data System (ADS)
Jeffreson, S. M. R.; Kruijssen, J. M. D.; Krumholz, M. R.; Longmore, S. N.
2018-05-01
We apply an analytic theory for environmentally-dependent molecular cloud lifetimes to the Central Molecular Zone of the Milky Way. Within this theory, the cloud lifetime in the Galactic centre is obtained by combining the time-scales for gravitational instability, galactic shear, epicyclic perturbations and cloud-cloud collisions. We find that at galactocentric radii ˜45-120 pc, corresponding to the location of the `100-pc stream', cloud evolution is primarily dominated by gravitational collapse, with median cloud lifetimes between 1.4 and 3.9 Myr. At all other galactocentric radii, galactic shear dominates the cloud lifecycle, and we predict that molecular clouds are dispersed on time-scales between 3 and 9 Myr, without a significant degree of star formation. Along the outer edge of the 100-pc stream, between radii of 100 and 120 pc, the time-scales for epicyclic perturbations and gravitational free-fall are similar. This similarity of time-scales lends support to the hypothesis that, depending on the orbital geometry and timing of the orbital phase, cloud collapse and star formation in the 100-pc stream may be triggered by a tidal compression at pericentre. Based on the derived time-scales, this should happen in approximately 20 per cent of all accretion events onto the 100-pc stream.
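One simple way to combine competing destruction mechanisms into a single cloud lifetime is to treat each as an independent rate 1/t and add the rates; the dominant mechanism is then the one contributing the shortest time-scale. This is an illustrative combination rule under that independence assumption, not necessarily the paper's exact formulation:

```python
def combined_lifetime(timescales):
    """Combine independent cloud-destruction time-scales into one lifetime.

    Treats each mechanism (shear, cloud-cloud collisions, epicyclic
    perturbations, gravitational collapse) as an independent Poisson-like
    rate 1/t; the combined lifetime is the inverse of the summed rates,
    so it is always shorter than the shortest individual time-scale.
    """
    total_rate = sum(1.0 / t for t in timescales)
    return 1.0 / total_rate
```

With, say, a 3 Myr shear time and a 9 Myr collision time, the combined lifetime is 2.25 Myr, shear-dominated, mirroring how the shortest mechanism controls the lifecycle outside the 100-pc stream.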
NASA Astrophysics Data System (ADS)
Wang, S.; Sobel, A. H.; Nie, J.
2015-12-01
Two Madden-Julian Oscillation (MJO) events were observed during October and November 2011 in the equatorial Indian Ocean during the DYNAMO field campaign. Precipitation rates and large-scale vertical motion profiles derived from the DYNAMO northern sounding array are simulated in a small-domain cloud-resolving model using parameterized large-scale dynamics. Three parameterizations of large-scale dynamics are employed: the conventional weak temperature gradient (WTG) approximation, vertical-mode-based spectral WTG (SWTG), and damped gravity wave coupling (DGW). The target temperature profiles and radiative heating rates are taken from a control simulation in which the large-scale vertical motion is imposed (rather than directly from observations), and the model itself is significantly modified from that used in previous work. These methodological changes lead to significant improvement in the results. Simulations using all three methods, with imposed time-dependent radiation and horizontal moisture advection, capture the time variations in precipitation associated with the two MJO events well. The three methods produce significant differences in the large-scale vertical motion profile, however. WTG produces the most top-heavy and noisy profiles, while DGW's is smoother with a peak in midlevels. SWTG produces a smooth profile, somewhere between WTG and DGW, and in better agreement with observations than either of the others. Numerical experiments without horizontal advection of moisture suggest that this process significantly reduces the precipitation and suppresses the top-heaviness of large-scale vertical motion during the MJO active phases, while experiments in which the effects of clouds on radiation are disabled indicate that cloud-radiative interaction significantly amplifies the MJO. Experiments with interactive radiation produce poorer agreement with observations than those with imposed time-varying radiative heating.
Our results highlight the importance of both horizontal advection of moisture and cloud-radiative feedback to the dynamics of the MJO, as well as to accurate simulation and prediction of it in models.
Rupture Synchronicity in Complex Fault Systems
NASA Astrophysics Data System (ADS)
Milner, K. R.; Jordan, T. H.
2013-12-01
While most investigators would agree that the timing of large earthquakes within a fault system depends on stress-mediated interactions among its elements, much of the debate relevant to time-dependent forecasting has been centered on single-fault concepts, such as characteristic earthquake behavior. We propose to broaden this discussion by quantifying the multi-fault concept of rupture synchronicity. We consider a finite set of small, fault-spanning volumes {Vk} within a fault system of arbitrary (fractal) complexity. We let Ck be the catalog of length tmax comprising Nk discrete times {ti(k)} that mark when the kth volume participates in a rupture of magnitude > M. The main object of our analysis is the complete set of event time differences {τij(kk') = ti(k) - tj(k')}, which we take to be a random process with an expected density function ρkk'(t). When k = k', we call this function the auto-catalog density function (ACDF); when k ≠ k', we call it the cross-catalog density function (CCDF). The roles of the ACDF and CCDF in synchronicity theory are similar to those of autocorrelation and cross-correlation functions in time-series analysis. For a renewal process, the ACDF can be written in terms of convolutions of the interevent-time distribution, and many of its properties (e.g., large-t asymptote) can be derived analytically. The interesting information in the CCDF, like that in the ACDF, is concentrated near t = 0. If two catalogs are completely asynchronous, the CCDF collapses to an asymptote given by the harmonic mean of the ACDF asymptotes. Synchronicity can therefore be characterized by the variability of the CCDF about this asymptote. 
The brevity of instrumental catalogs makes the identification of synchronicity at large M difficult, but we will illustrate potentially interesting behaviors through the analysis of a million-year California catalog generated by the earthquake simulator RSQSim (Dieterich & Richards-Dinger, 2010), which we sampled at a dozen fault-spanning volumes. At the magnitude threshold M = 7, the ACDF can be well fit by renewal models with fairly small aperiodicity parameters (α < 0.2) for all fault volumes but one (on the San Jacinto fault). At interseismic (Reid) time scales, we observe pairs of fault segments that are tightly locked, such as the Cholame and Carrizo sections of the San Andreas Fault (SAF), where the CCDF and two ACDFs are nearly equal; segments out of phase (Carrizo-SAF/Coachella-SAF and Coachella-SAF/San Jacinto), where the CCDF variation is an odd function of time; and segments where events are in phase with integer ratios of recurrence times (2:1 synchronicity of Coachella-SAF/Mojave-SAF and Carrizo-SAF/Mojave-SAF). At near-seismic (Omori) time scales, we observe various modes of clustering, triggering, and shadowing in RSQSim catalogs; e.g., events on Mojave-SAF trigger Garlock events, and events on Coachella-SAF shut down events on San Jacinto. Therefore, despite its geometrical complexity and multiplicity of time scales, the RSQSim model of the San Andreas fault system exhibits a variety of synchronous behaviors that increase the predictability of large ruptures within the system. A key question for earthquake forecasting is whether the real San Andreas system is equally, or much less, synchronous.
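The ACDF/CCDF construction reduces to histogramming all pairwise event-time differences between two catalogs. A minimal sketch with toy quasi-periodic catalogs (not RSQSim output) shows how synchronous catalogs concentrate mass near zero time difference:

```python
import numpy as np

def time_difference_density(cat_a, cat_b, bins):
    """Histogram of all pairwise event-time differences t_i(a) - t_j(b)."""
    diffs = np.subtract.outer(np.asarray(cat_a), np.asarray(cat_b)).ravel()
    return np.histogram(diffs, bins=bins)

rng = np.random.default_rng(1)
# Toy quasi-periodic catalogs: recurrence ~150 yr with jitter, nearly in phase
t_a = np.cumsum(rng.normal(150.0, 15.0, 50))
t_b = t_a + rng.normal(0.0, 5.0, 50)
counts, edges = time_difference_density(t_a, t_b, bins=np.arange(-400, 401, 25))
# when cat_a == cat_b this is the ACDF analog; for distinct catalogs, the CCDF
```

For a fully asynchronous pair the histogram would flatten toward its large-t asymptote; variability about that level is the synchronicity signal described above.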
Space and time scales of shoreline change at Cape Cod National Seashore, MA, USA
Allen, J.R.; LaBash, C.L.; List, J.H.; Kraus, Nicholas C.; McDougal, William G.
1999-01-01
Different processes cause patterns of shoreline change that are exhibited at different magnitudes and nested within different spatial and temporal hierarchies. The 77-km outer beach at Cape Cod National Seashore offers one of the few U.S. federally owned stretches of beach in which to study shoreline change within the full range of sediment source and sink relationships, barely affected by human intervention. 'Mean trends' of shoreline change are best observed at long time scales but contain much spatial variation; many sites therefore differ in their response. Long-term, earlier-noted trends are confirmed, but the added quantification and resolution greatly improve understanding of the spatial and time scales appropriate to the processes driving bluff retreat and barrier island changes in both the north and south depocenters. Shorter time scales allow comparison of trends and uncertainty in shoreline change at local scales, but depend on some measure of storm intensity and seasonal frequency. Single-event shoreline surveys for one storm, conducted at daily intervals after the erosional phase, suggest a system recovery time of six days, identify three sites with abnormally large change, and indicate that responses at these sites are spatially coherent for as-yet unknown reasons. Areas near inlets are the most variable at all time scales. Hierarchies in both process and form are suggested.
Near-Real-Time Earth Observation Data Supporting Wildfire Management
NASA Astrophysics Data System (ADS)
Ambrosia, V. G.; Zajkowski, T.; Quayle, B.
2013-12-01
During disaster events, the most critical element needed by responding personnel and management teams is situational intelligence and awareness. During rapidly evolving events such as wildfires, timely information is critical to save lives, property, and resources. The wildfire management agencies in the US rely heavily on remote sensing information from both airborne platforms and orbital assets. The ability to readily obtain information, not just data, from those systems is critical to effective control and damage mitigation. NASA has been collaborating with the USFS to mature and operationalize various asset-information capabilities to improve knowledge of fire-prone areas, monitor wildfire events in real time, assess the effectiveness of fire management strategies, and provide rapid post-fire assessment for recovery operations. Specific examples of near-real-time remote sensing asset utility include daily MODIS data employed to assess fire potential / wildfire hazard areas and national-scale hot-spot detection, airborne thermal sensor data collected during wildfire events to inform management strategies, EO-1 ALI 'pointable' satellite sensor data to assess fire-retardant application effectiveness, and Landsat 8 and other sensor data to derive burn severity indices for post-fire remediation work. These cases, in which near-real-time data were used operationally during the previous few fire seasons, will be presented.
Scaling rates of true polar wander in convecting planets and moons
NASA Astrophysics Data System (ADS)
Rose, Ian; Buffett, Bruce
2017-12-01
Mass redistribution in the convecting mantle of a planet causes perturbations in its moment of inertia tensor. Conservation of angular momentum dictates that these perturbations change the direction of the rotation vector of the planet, a process known as true polar wander (TPW). Although the existence of TPW on Earth is firmly established, its rate and magnitude over geologic time scales remain controversial. Here we present scaling analyses and numerical simulations of TPW due to mantle convection over a range of parameter space relevant to planetary interiors. For simple rotating convection, we identify a set of dimensionless parameters that fully characterize true polar wander. We use these parameters to define timescales for the growth of moment of inertia perturbations due to convection and for their relaxation due to true polar wander. These timescales, as well as the relative sizes of convective anomalies, control the rate and magnitude of TPW. This analysis also clarifies the nature of so-called "inertial interchange" TPW events, and relates them to a broader class of events that enable large and often rapid TPW. We expect these events to have been more frequent in Earth's past.
Scaling properties and universality of first-passage-time probabilities in financial markets
NASA Astrophysics Data System (ADS)
Perelló, Josep; Gutiérrez-Roig, Mario; Masoliver, Jaume
2011-12-01
Financial markets provide an ideal frame for the study of crossing or first-passage time events of non-Gaussian correlated dynamics, mainly because large data sets are available. Tick-by-tick data of six futures markets are herein considered, resulting in fat-tailed first-passage time probabilities. The scaling of the return with its standard deviation collapses the probabilities of all markets examined, and also for different time horizons, into single curves, suggesting that first-passage statistics is market independent (at least for high-frequency data). On the other hand, a very closely related quantity, the survival probability, shows, away from the center and tails of the distribution, a hyperbolic t^(-1/2) decay typical of Markovian dynamics, despite the existence of memory in markets. Modifications of the Weibull and Student distributions are good candidates for the phenomenological description of first-passage time properties under certain regimes. The scaling strategies shown may be useful for risk control and algorithmic trading.
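First-passage times of a return series can be extracted by accumulating returns until a threshold, expressed in units of the one-step standard deviation, is crossed. An illustrative sketch with i.i.d. Gaussian returns (real tick data are fat-tailed and correlated, which is exactly what the study probes):

```python
import numpy as np

def first_passage_times(returns, threshold):
    """Steps until the cumulative return first crosses +/- threshold,
    restarting the walk after each crossing."""
    fpts, level, start = [], 0.0, 0
    for i, r in enumerate(returns):
        level += r
        if abs(level) >= threshold:
            fpts.append(i - start + 1)
            level, start = 0.0, i + 1
    return np.array(fpts)

rng = np.random.default_rng(2)
sigma = 0.01
returns = sigma * rng.standard_normal(100_000)   # toy i.i.d. return series
fpt = first_passage_times(returns, threshold=5 * sigma)
# diffusive scaling: mean passage time ~ (threshold / sigma)^2 = 25 steps
```

Expressing the threshold in units of sigma is the normalization that, per the abstract, collapses the first-passage distributions of different markets onto single curves.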
Swift, Arthur; von Grote, Erika; Jonas, Brandie; Nogueira, Alessandra
2017-01-01
The appeal of hyaluronic acid fillers for facial soft tissue augmentation is attributable to both an immediate aesthetic effect and relatively short recovery time. Although recovery time is an important posttreatment variable, as it impacts comfort with appearance and perceived treatment benefit, it is not routinely evaluated. Natural-looking aesthetic outcomes are also a primary concern for many patients. A single-center, noncomparative study evaluated the time (in hours) until subjects returned to social engagement (RtSE) following correction of moderate and severe nasolabial folds (NLFs) with RR (Restylane® Refyne) and RD (Restylane® Defyne), respectively. Twenty subjects (aged 35-57 years) who received bilateral NLF correction documented their RtSE and injection-related events posttreatment. Treatment efficacy was evaluated by improvements in the Wrinkle Severity Rating Scale (WSRS) and a subject satisfaction questionnaire at days 14 and 30, and by the Global Aesthetic Improvement Scale (GAIS) at day 30. Safety was evaluated by injection-related events and treatment-emergent adverse events. Fifty percent of subjects reported RtSE within 2 hours posttreatment. WSRS for the RR group improved significantly from baseline at day 14 (-1.45±0.42) and day 30 (-1.68±0.46) (P<0.001). WSRS for the RD group improved significantly from baseline at day 14 (-2.22±0.44) and day 30 (-2.50±0.50) (P<0.004). All GAIS improvements were clinically significant at day 30. The majority of injection-related events were mild or moderate. Two subjects experienced 3 related treatment-emergent adverse events; 1 RR subject experienced severe bruising, and 1 RD subject experienced severe erythema and mild telangiectasia. Subject satisfaction was high regarding aesthetic outcomes and natural-looking results. Optimal correction of moderate NLFs with RR and severe NLFs with RD involved minimal time to RtSE for most subjects.
Treatments that significantly improved WSRS and GAIS, were generally well-tolerated, and provided natural-looking aesthetic outcomes.
Applying complex networks to evaluate precipitation patterns over South America
NASA Astrophysics Data System (ADS)
Ciemer, Catrin; Boers, Niklas; Barbosa, Henrique; Kurths, Jürgen; Rammig, Anja
2016-04-01
The climate of South America exhibits pronounced differences between the wet and the dry season, which are accompanied by specific synoptic events such as changes in the location of the South American Low Level Jet (SALLJ) and the establishment of the South American Convergence Zone (SACZ). The onset of these events can be related to the presence of typical large-scale precipitation patterns over South America, as previous studies have shown[1,2]. The application of complex network methods to precipitation data has recently received increased scientific attention for the special case of extreme events, since such methods make it possible to analyze the spatiotemporal correlation structure as well as possible teleconnections of these events[3,4]. In these approaches the correlation between precipitation datasets is calculated by means of Event Synchronization, which restricts their applicability to extreme precipitation events. In this work, we propose a method that is able to consider not only extreme precipitation but complete time series. A direct application of standard similarity measures to correlate precipitation time series is impossible due to their intricate statistical properties, such as the large number of zeros. We therefore introduced and evaluated a suitable modification of Pearson's correlation coefficient to construct spatial correlation networks of precipitation. By analyzing the characteristics of spatial correlation networks constructed on the basis of this new measure, we are able to determine coherent areas of similar precipitation patterns, spot teleconnections between correlated areas, and detect central regions of precipitation correlation. By analyzing the change of the network over the year[5], we are also able to determine local and global changes in precipitation correlation patterns. Additionally, global network characteristics such as network connectivity indicate the beginning and end of the wet and dry seasons.
In order to identify large-scale synoptic events like the SACZ and SALLJ onset, detecting changes of correlation over time between certain regions is of particular relevance. [1] Nieto-Ferreira et al., Quarterly Journal of the Royal Meteorological Society (2011); [2] Vera et al., Bulletin of the American Meteorological Society (2006); [3] Quiroga et al., Physical Review E (2002); [4] Boers et al., Nature Communications (2014); [5] Radebach et al., Physical Review E (2013)
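The network construction itself (correlate all site pairs, threshold into an adjacency matrix, read off degrees) can be sketched as follows; plain Pearson correlation is used here only as a stand-in for the paper's zero-adapted modification:

```python
import numpy as np

def correlation_network(series, threshold):
    """Adjacency matrix from thresholded pairwise correlations.
    Plain Pearson stands in for the paper's zero-adapted measure."""
    c = np.corrcoef(series)               # series shape: (n_sites, n_times)
    np.fill_diagonal(c, 0.0)
    return (c > threshold).astype(int)

rng = np.random.default_rng(4)
base = rng.exponential(1.0, 365)          # shared "regional rainfall" signal
series = np.vstack(
    [base + 0.3 * rng.exponential(1.0, 365) for _ in range(3)]
    + [rng.exponential(1.0, 365)]         # one independent site
)
adj = correlation_network(series, threshold=0.5)
degree = adj.sum(axis=1)
# the three coherent sites link to each other; the independent site is isolated
```

Coherent precipitation areas then appear as densely connected components, and network-wide statistics such as connectivity can be tracked through the year, as described above.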
Multiple-scale neuroendocrine signals connect brain and pituitary hormone rhythms
Romanò, Nicola; Guillou, Anne; Martin, Agnès O; Mollard, Patrice
2017-01-01
Small assemblies of hypothalamic “parvocellular” neurons release their neuroendocrine signals at the median eminence (ME) to control long-lasting pituitary hormone rhythms essential for homeostasis. How such rapid hypothalamic neurotransmission leads to slowly evolving hormonal signals remains unknown. Here, we show that the temporal organization of dopamine (DA) release events in freely behaving animals relies on a set of characteristic features that are adapted to the dynamic dopaminergic control of pituitary prolactin secretion, a key reproductive hormone. First, locally generated DA release signals are organized over more than four orders of magnitude (0.001 Hz–10 Hz). Second, these DA events are finely tuned within and between frequency domains as building blocks that recur over days to weeks. Third, an integration time window is detected across the ME and consists of high-frequency DA discharges that are coordinated within the minutes range. Thus, a hierarchical combination of time-scaled neuroendocrine signals displays local–global integration to connect brain–pituitary rhythms and pace hormone secretion. PMID:28193889
NASA Astrophysics Data System (ADS)
Goldberg, D.; Bock, Y.; Melgar, D.
2017-12-01
Earthquake magnitude is a concise metric that illuminates the destructive potential of a seismic event. Rapid determination of earthquake magnitude is currently the main prerequisite for dissemination of a tsunami early warning, thus timely and automated calculation is of high importance. Seismic instrumentation experiences well-documented complications at long periods, making the accurate measurement of ground displacement in the near field unreliable. As a result, the relation between ground motion measured with seismic instrumentation and magnitude saturates, causing underestimation of the size of very large events. In the case of tsunamigenic earthquakes, magnitude underestimation in turn leads to a flawed tsunami inundation assessment, which limits the effectiveness of an early warning, in particular for local tsunamis. Global Navigation Satellite System (GNSS) instrumentation measures the displacement field directly, leading to more accurate magnitude estimates with near-field data. Unlike seismic-only instrumentation, near-field GNSS has been shown to provide an accurate magnitude estimate using the peak ground displacement (PGD) after just 2 minutes [Melgar et al., 2015]. However, GNSS alone is too noisy to detect the first seismic wave arrivals (P-waves), thus it cannot be as timely as a seismic system on its own. Using collocated seismic and geodetic instrumentation, we refine magnitude scaling relations by incorporating a large dataset of earthquakes in Japan. We demonstrate that consideration of the time-dependence of displacement amplitude with respect to P-wave arrival time reduces the time to convergence of the magnitude estimate. We present findings on the growth of events of large magnitude, and demonstrate time-dependent scaling relations that adapt to the amount of recorded data, starting with the P-wave arrival and continuing through PGD. 
We illustrate real-time, automated implementation of this method, and consider network improvements to advance rapid characterization of large events. Improvement of initial magnitude estimates through integration of geodetic and seismogeodetic observations is a top priority of an ongoing collaboration with NASA and NOAA's National and Pacific Tsunami Warning Centers (NOAA/NASA GNSS Tsunami Team).
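PGD-based magnitude estimation inverts a scaling law of the form log10(PGD) = a + (b + c·log10 R)·Mw, where PGD is peak ground displacement and R is hypocentral distance. The coefficients below follow the general flavor of the Melgar et al. (2015) regression but should be treated as illustrative placeholders rather than calibrated values:

```python
import math

def magnitude_from_pgd(pgd_cm, dist_km, a=-4.434, b=1.047, c=-0.138):
    """Invert log10(PGD) = a + (b + c*log10(R)) * Mw for Mw.
    Coefficient values are illustrative placeholders; consult
    Melgar et al. (2015) for the calibrated regression."""
    return (math.log10(pgd_cm) - a) / (b + c * math.log10(dist_km))

mw = magnitude_from_pgd(100.0, 100.0)   # 1 m of PGD observed 100 km away
```

Because GNSS measures displacement directly, this relation does not saturate at large magnitudes the way seismic amplitude relations do, which is the motivation given in the abstract.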
Tree Circumference Dynamics in Four Forests Characterized Using Automated Dendrometer Bands
McMahon, Sean M.; Detto, Matteo; Lutz, James A.; Davies, Stuart J.; Chang-Yang, Chia-Hao; Anderson-Teixeira, Kristina J.
2016-01-01
Stem diameter is one of the most commonly measured attributes of trees, forming the foundation of forest censuses and monitoring. Changes in tree stem circumference include both irreversible woody stem growth and reversible circumference changes related to water status, yet these fine-scale dynamics are rarely leveraged to understand forest ecophysiology and typically ignored in plot- or stand-scale estimates of tree growth and forest productivity. Here, we deployed automated dendrometer bands on 12–40 trees at four different forested sites—two temperate broadleaf deciduous, one temperate conifer, and one tropical broadleaf semi-deciduous—to understand how tree circumference varies on time scales of hours to months, how these dynamics relate to environmental conditions, and whether the structure of these variations might introduce substantive error into estimates of woody growth. Diurnal stem circumference dynamics measured over the bark commonly—but not consistently—exhibited daytime shrinkage attributable to transpiration-driven changes in stem water storage. The amplitude of this shrinkage was significantly correlated with climatic variables (daily temperature range, vapor pressure deficit, and radiation), sap flow and evapotranspiration. Diurnal variations were typically <0.5 mm circumference in amplitude and unlikely to be of concern to most studies of tree growth. Over time scales of multiple days, the bands captured circumference increases in response to rain events, likely driven by combinations of increased stem water storage and bark hydration. Particularly at the tropical site, these rain responses could be quite substantial, ranging up to 1.5 mm circumference expansion within 48 hours following a rain event. 
We conclude that over-bark measurements of stem circumference change sometimes correlate with daily transpiration but have limited potential for estimating it directly; they can, however, be valuable on time scales of days to weeks for characterizing changes in stem growth and hydration. PMID:28030646
NASA Astrophysics Data System (ADS)
Setty, V.; Sharma, A.
2013-12-01
Characterization of extreme conditions of space weather is essential for potential mitigation strategies. The non-equilibrium nature of the magnetosphere makes such efforts complicated, and new techniques to understand its extreme event distribution are required. The heavy-tailed distribution in such systems can be modeled using a stable distribution, whose stability parameter is a measure of scaling in the cumulative distribution and is related to the Hurst exponent. This exponent can be readily measured in stationary time series using several techniques, and detrended fluctuation analysis (DFA) is widely used in the presence of non-stationarities. However, DFA has severe limitations in cases with nonlinear and atypical trends. We propose a new technique that utilizes nonlinear dynamical predictions as a measure of trends and estimates the Hurst exponents. Furthermore, such a measure provides us with a new way to characterize predictability, as perfectly detrended data have no long-term memory, akin to Gaussian noise. Ab initio calculation of weekly Hurst exponents using the auroral electrojet index AL over a span of a few decades shows that these exponents are time varying, and so is the fractal structure. Time series data with time-varying Hurst exponents are modeled well using multifractional Brownian motion, and it is shown that DFA estimates a single time-averaged value of the Hurst exponent in such data. Our results show that using the time-varying Hurst exponent structure, we can (a) estimate the stability parameter, a measure of scaling in heavy tails; (b) define and identify epochs when the magnetosphere switches between regimes with and without extreme events; and (c) study the dependence of the Hurst exponents on solar activity.
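For reference, standard DFA (the method whose limitations the authors address) estimates the Hurst-type exponent from the scaling of window-detrended fluctuations with window size. A minimal sketch, checked against white noise, where the exponent should be near 0.5:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) versus log s."""
    profile = np.cumsum(x - np.mean(x))          # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)         # linear detrend per window
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(ms)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(3)
h_white = dfa_exponent(rng.standard_normal(20_000), [16, 32, 64, 128, 256])
```

The linear fit per window is exactly the detrending step the abstract criticizes: nonlinear or atypical trends leak into F(s), which motivates the authors' prediction-based alternative.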
Small-scale deflagration cylinder test with velocimetry wall-motion diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooks, Daniel E; Hill, Larry G; Pierce, Timothy H
Predicting the likelihood and effects of outcomes resulting from thermal initiation of explosives remains a significant challenge. For certain explosive formulations, the general outcome can be broadly predicted given knowledge of certain conditions. However, there remain unexplained violent events, and increased statistical understanding of outcomes as a function of many variables, or 'violence categorization,' is needed. Additionally, the development of an equation-of-state equivalent for deflagration would be very useful in predicting possible detailed event consequences using traditional hydrodynamic detonation models. For violence categorization, it is desirable that testing be efficient, such that it is possible to statistically define outcomes reliant on the processes of initiation of deflagration, steady-state deflagration, and deflagration-to-detonation transitions. If the test simultaneously acquires information to inform models of violent deflagration events, overall predictive capabilities for event likelihood and consequence might improve remarkably. In this paper we describe an economical scaled deflagration cylinder test. The cyclotetramethylene tetranitramine (HMX) based explosive formulation PBX 9501 was tested using different temperature profiles in a thick-walled copper cylindrical confiner. This test is a scaled version of a recently demonstrated deflagration cylinder test, and is similar to several other thermal explosion tests. The primary difference is the passive velocimetry diagnostic, which enables measurement of confinement vessel wall velocities at failure, regardless of the timing and location of ignition.
Richard, Nelly; Laursen, Bettina; Grupe, Morten; Drewes, Asbjørn M; Graversen, Carina; Sørensen, Helge B D; Bastlund, Jesper F
2017-04-01
Active auditory oddball paradigms are simple tone discrimination tasks used to study the P300 deflection of event-related potentials (ERPs). These ERPs may be quantified by time-frequency analysis. As auditory stimuli cause early high-frequency and late low-frequency ERP oscillations, the continuous wavelet transform (CWT) is often chosen for decomposition due to its multi-resolution properties. However, as the conventional CWT traditionally applies only one mother wavelet to represent the entire spectrum, the time-frequency resolution is not optimal across all scales. To account for this, we developed and validated a novel method specifically refined to analyse P300-like ERPs in rats. An adapted CWT (aCWT) was implemented to preserve high time-frequency resolution across all scales by commissioning multiple wavelets operating at different scales. First, decomposition of simulated ERPs was illustrated using the classical CWT and the aCWT. Next, the two methods were applied to EEG recordings obtained from prefrontal cortex in rats performing a two-tone auditory discrimination task. While only early ERP frequency changes between responses to target and non-target tones were detected by the CWT, both early and late changes were successfully described with strong accuracy by the aCWT in rat ERPs. Increased frontal power and phase synchrony were observed particularly within theta and gamma frequency bands during deviant tones. The study suggests superior performance of the aCWT over the CWT in terms of detailed quantification of time-frequency properties of ERPs. Our methodological investigation indicates that accurate and complete assessment of time-frequency components of short-time neural signals is feasible with the novel analysis approach, which may be advantageous for characterisation of several types of evoked potentials, in rodents in particular.
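The conventional CWT that the aCWT refines can be sketched with complex Morlet wavelets: scaled copies of a single mother wavelet, one per target frequency, so the number of cycles is fixed and time resolution degrades at low frequencies. This is illustrative only; the authors' aCWT instead commissions multiple wavelets at different scales.

```python
import numpy as np

def morlet_cwt(signal, freqs, fs, w0=6.0):
    """Conventional CWT: scaled copies of one complex Morlet mother wavelet.
    Wavelet support must stay shorter than the signal."""
    out = np.empty((len(freqs), len(signal)), dtype=complex)
    for i, f in enumerate(freqs):
        scale = w0 * fs / (2 * np.pi * f)               # samples per scale
        tw = np.arange(-3 * scale, 3 * scale + 1)
        wav = np.exp(1j * w0 * tw / scale) * np.exp(-0.5 * (tw / scale) ** 2)
        wav /= np.sqrt(np.sum(np.abs(wav) ** 2))        # unit-energy wavelet
        out[i] = np.convolve(signal, wav, mode="same")
    return out

fs = 500.0
t = np.arange(0, 1, 1 / fs)
# toy "ERP": early 40 Hz burst plus a late, slow 6 Hz deflection
sig = (np.exp(-((t - 0.1) / 0.02) ** 2) * np.sin(2 * np.pi * 40 * t)
       + np.exp(-((t - 0.4) / 0.08) ** 2) * np.sin(2 * np.pi * 6 * t))
power = np.abs(morlet_cwt(sig, np.array([6.0, 40.0]), fs)) ** 2
```

The power map localizes the early gamma-band burst and the late theta-band deflection, the two regimes the abstract says must both be resolved.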
Microseismicity of an Unstable Rock Mass: From Field Monitoring to Laboratory Testing
NASA Astrophysics Data System (ADS)
Colombero, C.; Comina, C.; Vinciguerra, S.; Benson, P. M.
2018-02-01
The field-scale microseismic (MS) activity of an unstable rock mass is known to be an important tool to assess damage and cracking processes eventually leading to macroscopic failures. However, MS-event rates alone may not be enough for a complete understanding of the trigger mechanisms of mechanical instabilities. Acoustic Emission (AE) techniques at the laboratory scale can be used to provide complementary information. In this study, we report a MS/AE comparison to assess the stability of a granitic rock mass in the northwestern Italian Alps (Madonna del Sasso). An attempt to bridge the gap between the two different scales of observation, and the different site and laboratory conditions, is undertaken to gain insights on the rock mass behavior as a function of external governing factors. Time- and frequency-domain parameters of the MS/AE waveforms are compared and discussed with this aim. At the field scale, special attention is devoted to the correlation of the MS-event rate with meteorological parameters (air temperature and rainfalls). At the laboratory scale, AE rates, waveforms, and spectral content, recorded under controlled temperature and fluid conditions, are analyzed in order to better constrain the physical mechanisms responsible for the observed field patterns. The factors potentially governing the mechanical instability at the site were retrieved from the integration of the results. Abrupt thermal variations were identified as the main cause of the site microseismicity, without highlighting irreversible acceleration in the MS-event rate potentially anticipating the rock mass collapse.
Hydrologic response to stormwater control measures in urban watersheds
NASA Astrophysics Data System (ADS)
Bell, Colin D.; McMillan, Sara K.; Clinton, Sandra M.; Jefferson, Anne J.
2016-10-01
Stormwater control measures (SCMs) are designed to mitigate deleterious effects of urbanization on river networks, but our ability to predict the cumulative effect of multiple SCMs at watershed scales is limited. The most widely used metric to quantify impacts of urban development, total imperviousness (TI), does not contain information about the extent of stormwater control. We analyzed the discharge records of 16 urban watersheds in Charlotte, NC spanning a range of TI (4.1-54%) and area mitigated with SCMs (1.3-89%). We then tested multiple watershed metrics that quantify the degree of urban impact and SCM mitigation to determine which best predicted hydrologic response across sites. At the event time scale, linear models showed TI to be the best predictor of both peak unit discharge and rainfall-runoff ratios across a range of storm sizes. TI was also a strong driver of both a watershed's capacity to buffer small (e.g., 1-10 mm) rain events, and the relationship between peak discharge and precipitation once that buffering capacity is exceeded. Metrics containing information about SCMs did not appear as primary predictors of event hydrologic response, suggesting that the level of SCM mitigation in many urban watersheds is insufficient to influence hydrologic response. Over annual timescales, impervious surfaces unmitigated by SCMs and tree coverage were best correlated with streamflow flashiness and water yield, respectively. The shift in controls from the event scale to the annual scale has important implications for water resource management, suggesting that overall limitation of watershed imperviousness rather than partial mitigation by SCMs may be necessary to alleviate the hydrologic impacts of urbanization.
NASA Astrophysics Data System (ADS)
Martínez, G.; Vanderlinden, K.; Giraldez, J. V.; Espejo, A. J.; Muriel, J. L.
2009-12-01
Soil moisture plays an important role in a wide variety of biogeochemical fluxes in the soil-plant-atmosphere system and governs the (eco)hydrological response of a catchment to an external forcing such as rainfall. Near-surface electromagnetic induction (EMI) sensors that measure the soil apparent electrical conductivity (ECa) provide a fast and non-invasive means for characterizing this response at the field or catchment scale through high-resolution time-lapse mapping. Here we show how ECa maps, obtained before and after an intense rainfall event of 125 mm h-1, elucidate differences in soil moisture patterns and hydrologic response of an experimental field as a consequence of differed soil management. The dryland field (Vertisol) was located in SW Spain and cropped with a typical wheat-sunflower-legume rotation. Both, near-surface and subsurface ECa (ECas and ECad, respectively), were measured using the EM38-DD EMI sensor in a mobile configuration. Raw ECa measurements and Mean Relative Differences (MRD) provided information on soil moisture patterns while time-lapse maps were used to evaluate the hydrologic response of the field. ECa maps of the field, measured before and after the rainfall event showed similar patterns. The field depressions where most of water and sediments accumulated had the highest ECa and MRD values. The SE-oriented soil, which was deeper and more exposed to sun and wind, showed the lowest ECa and MRD. The largest differences raised in the central part of the field where a high ECa and MRD area appeared after the rainfall event as a consequence of the smaller soil depth and a possible subsurface flux concentration. Time-lapse maps of both ECa and MRD were also similar. The direct drill plots showed higher increments of ECa and MRD as a result of the smaller runoff production. Time-lapse ECa increments showed a bimodal distribution differentiating clearly the direct drill from the conventional and minimum tillage plots. 
However, this kind of distribution could not be shown using MRD differences, since they come from standardized distributions. Field-extent time-lapse ECa maps can provide useful images of the hydrological response of agricultural fields, which can be used to evaluate different soil management strategies or to aid the assessment of biogeochemical fluxes at the field scale.
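The Mean Relative Difference (MRD) statistic used above to expose stable soil moisture patterns can be sketched in a few lines: each site's ECa is expressed relative to the field average of its survey, then averaged over surveys. This is a minimal, hedged illustration; the array shape and values are assumptions, not the authors' processing chain.

```python
import numpy as np

def mean_relative_difference(eca):
    """eca: (n_sites, n_surveys) apparent electrical conductivity readings.
    Returns per-site MRD: the temporal mean of each site's relative
    deviation from the field-average ECa of each survey."""
    field_mean = eca.mean(axis=0)          # field average for each survey
    rel_diff = (eca - field_mean) / field_mean
    return rel_diff.mean(axis=1)           # average over surveys

# three sites, two surveys: site 0 consistently wetter (higher ECa)
eca = np.array([[30.0, 40.0],
                [20.0, 26.0],
                [10.0, 12.0]])
mrd = mean_relative_difference(eca)
```

Sites with persistently positive MRD mark the wetter, more conductive zones, such as the field depressions described above; by construction the MRD values average out to roughly zero across the field.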
The observation of possible reconnection events in the boundary changes of solar coronal holes
NASA Technical Reports Server (NTRS)
Kahler, S. W.; Moses, J. Daniel
1989-01-01
Coronal holes are large scale regions of magnetically open fields which are easily observed in solar soft X-ray images. The boundaries of coronal holes are separatrices between large scale regions of open and closed magnetic fields where one might expect to observe evidence of solar magnetic reconnection. Previous studies by Nolte and colleagues using Skylab X-ray images established that large scale (greater than or equal to 9 × 10^4 km) changes in coronal hole boundaries were due to coronal processes, i.e., magnetic reconnection, rather than to photospheric motions. Those studies were limited to time scales of about one day, and no conclusion could be drawn about the size and time scales of the reconnection process at hole boundaries. Sequences of appropriate Skylab X-ray images were used with a time resolution of about 90 min during times of the central meridian passages of the coronal hole labelled Coronal Hole 1 to search for hole boundary changes which can yield the spatial and temporal scales of coronal magnetic reconnection. It was found that 29 of 32 observed boundary changes could be associated with bright points. The appearance of the bright point may be the signature of reconnection between small scale and large scale magnetic fields. The observed boundary changes contributed to the quasi-rigid rotation of Coronal Hole 1.
Emeishan volcanism and the end-Guadalupian extinction: New U-Pb TIMS ages
NASA Astrophysics Data System (ADS)
Mundil, Roland; Denyszyn, Steve; He, Bin; Metcalfe, Ian; Yigang, Xu
2010-05-01
High-resolution geochronology with an age resolution at the permil level is instrumental in testing proposed causal links between continental-scale, short-term volcanic events and environmental crises that affect life globally. Synchroneity with large-scale volcanic events has been shown for three of the five most severe extinctions, namely the end-Permian extinction coinciding with Siberian Traps volcanism, the end-Triassic extinction with Central Atlantic Magmatic Province volcanism, and the end-Cretaceous with Deccan Traps volcanism. Recent studies also show that the magnitude of the extinction is not solely a function of the size (volume) of the volcanic event but more importantly of the eruption rate and also the nature of the host rock that is intruded, and the resulting reactions and release of gases that can affect climate. The end-Guadalupian (end Middle Permian, ca 260 Ma) biotic crisis has traditionally not been included in the 'big five' mass extinctions, possibly because of its close proximity in time to the end-Permian event, although its magnitude (in terms of total extinction rate) is comparable to the three most severe extinctions (end-Ordovician, end-Permian, end-Cretaceous). As a result, research on the end-Guadalupian event has so far been neglected, and its timing as well as its temporal relation to the Emeishan volcanic province in western China is as yet not fully studied. Geochronological data are so far mostly based on ambiguous 40Ar/39Ar analyses of commonly altered basaltic products and U-Pb zircon analyses on felsic products using micro-beam techniques that typically result in radio-isotopic ages with percent-level uncertainty, and thus insufficient for high-resolution correlations of events.
In addition, no precise and accurate radio-isotopic data exist from this time period, so that evolutionary events (extinction and recovery) on land and in the ocean are notoriously difficult to correlate, though biostratigraphic records are available from numerous sedimentary archives. A further complication arises from the severe tectonic (and resulting thermal) overprint, due to the closure of the Tethys and the collision of the Indian plate with Asia, of most of the area where Emeishan volcanic products are exposed. Also, currently existing paleo-environmental data are scarce and insufficient for testing this hypothesis with confidence, because studies using stable isotopes as proxies are restricted to short profiles from only a few sites. Therefore, fundamental questions remain unanswered. We present new U-Pb ID-TIMS ages with permil-level resolution that constrain the timing of Emeishan volcanism and the timing of biotic events recorded in sediments. In detail, U-Pb results are from felsic intercalations within late stage Emeishan products and biostratigraphically calibrated marine sedimentary sections in southwestern and central China, as well as thick tuffs within terrestrial sections from the Bowen Basin in eastern Australia. There is also great potential for obtaining precise U-Pb age results on volcanic products with basaltic composition using the accessory mineral baddeleyite, the occurrence of which we have already confirmed. Geochronological and geochemical research is complemented with paleo-environmental studies and biostratigraphy. We expect that through integration of U-Pb and 40Ar/39Ar geochronology with chemo- and biostratigraphy, the time scale of the Middle through Late Permian will be greatly improved and will lead to a more realistic evaluation of potential causes for the biotic crisis and its aftermath.
NASA Astrophysics Data System (ADS)
Lascu, I.; Feinberg, J. M.; Dorale, J. A.; Cheng, H.; Edwards, R. L.
2015-12-01
Short-lived geomagnetic events are reflections of geodynamo behavior at small length scales. A rigorous documentation of the anatomy, timing, duration, and frequency of centennial-to-millennial scale geomagnetic events can be invaluable for theoretical and numerical geodynamo models, and for understanding the finer dynamics of the Earth's core. A critical ingredient for characterizing such geomagnetic instabilities is a set of tightly constrained age models that enable high-resolution magnetostratigraphies. Here we focus on a North American speleothem geomagnetic record of the Laschamp excursion, which was the first geomagnetic excursion recognized and described in the paleomagnetic record, and remains the most studied event of its kind. The geological significance of the Laschamp lies chiefly in the fact that it constitutes a global time-synchronous geochronological marker. The Laschamp excursion occurred around the time of the demise of Homo neanderthalensis, in conjunction with high-amplitude, rapid climatic oscillations leading into the Last Glacial Maximum, and precedes a major supervolcano eruption in the Mediterranean. Thus, the precise determination of the timing and duration of the Laschamp would help in elucidating major scientific questions situated at the intersection of geology, paleoclimatology, and anthropology. Here we present a geomagnetic record from a stalagmite collected in Crevice Cave, Missouri, which we have dated using a combination of high-precision 230Th ages and annual layer counting using confocal microscopy. We have found a maximum duration for the Laschamp that spans the interval 42,250-39,700 years BP, and an age of 41,100 ± 350 years BP for the height of the excursion. During this period relative paleointensity decreased by an order of magnitude and the virtual geomagnetic pole was located at southerly latitudes.
Our chronology provides the first robust bracketing for the Laschamp excursion, and improves on previous age determinations based on 40Ar/39Ar dating of lava flows, and orbitally-tuned sedimentary and ice-core records.
On the self-organized critical state of Vesuvio volcano
NASA Astrophysics Data System (ADS)
Luongo, G.; Mazzarella, A.; Palumbo, A.
1996-01-01
The catalogue of volcanic earthquakes recorded at Vesuvio (1972-1993) is shown to be complete for events with magnitude between 1.8 and 3.0. This result is converted into significant fractal laws (power laws) relating the distribution of earthquakes to the distribution of energy release, seismic moment, size of fractured zone and linear dimension of faults. The application of the Cantor dust model to the time sequence of Vesuvio seismic and eruptive events allows the determination of significant time-clustering fractal structures. In particular, the Vesuvio eruptive activity shows a double-regime process with a stronger clustering on short-time scales than on long-time scales. The complexity of the Vesuvio system does not depend on the number of geological, geophysical and geochemical factors that govern it, but mainly on the number of their interconnections, on the intensity of such linkages and on the feed-back processes. So, all the identified fractal features are taken as evidence that the Vesuvio system is in a self-organized critical state, i.e., a marginally stable state in which a small perturbation can start a chain reaction that can lead to catastrophe. After the catastrophe, the system regulates itself and begins a new cycle, not necessarily periodic, that will end with a successive catastrophe. The variations of the fractal dimension and of the specific scale ranges, in which the fractal behaviour is found to hold, serve as possible volcanic predictors reflecting changes of the same volcanic process.
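The completeness bounds quoted above (magnitudes 1.8-3.0) are what make power-law fits to a catalogue meaningful. As a hedged aside, one standard such fit is the maximum-likelihood Gutenberg-Richter b-value; the sketch below applies it to a synthetic catalogue. It is not the paper's Cantor-dust analysis, and the catalogue size and true b-value are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki's estimator)
    for a catalogue complete above magnitude m_min."""
    mags = np.asarray(mags)
    mags = mags[mags >= m_min]
    return np.log10(np.e) / (mags.mean() - m_min)

# synthetic catalogue: magnitudes above the completeness threshold 1.8
# follow an exponential law equivalent to a true b-value of 1.0
true_b = 1.0
mags = 1.8 + rng.exponential(np.log10(np.e) / true_b, size=50000)
b = b_value(mags, 1.8)
```

The estimator recovers the exponent of the magnitude-frequency power law from the mean magnitude alone, which is why catalogue completeness (here, above magnitude 1.8) matters so much.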
New analytic results for speciation times in neutral models.
Gernhard, Tanja
2008-05-01
In this paper, we investigate the standard Yule model, and a recently studied model of speciation and extinction, the "critical branching process." We develop an analytic approach, as opposed to the common simulation approach, for calculating the speciation times in a reconstructed phylogenetic tree. Simple expressions for the density and the moments of the speciation times are obtained. Methods for dating a speciation event become valuable when no time scale is available for the reconstructed phylogenetic tree. A missing time scale could be due to supertree methods, morphological data, or molecular data which violates the molecular clock. Our analytic approach is, in particular, useful for the model with extinction, since simulations of birth-death processes which are conditioned on obtaining n extant species today are quite delicate. Further, simulations are very time consuming for large n under both models.
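A minimal simulation counterpart to the analytic results described above: in a Yule (pure-birth) process, the wait between the k-th and (k+1)-th speciation is exponential with rate kλ, so the expected time to grow from 1 to n species is Σ_{k=1}^{n-1} 1/(kλ). The sketch below (with invented parameter values) checks that identity by Monte Carlo, i.e. by exactly the kind of simulation the paper's closed-form densities make unnecessary.

```python
import random

random.seed(1)

def yule_speciation_waits(n, lam):
    """Waiting times between successive speciation events in a Yule
    (pure-birth) process grown from 1 to n species: the k-th wait is
    exponential with rate k*lam, since each of k lineages may split."""
    return [random.expovariate(k * lam) for k in range(1, n)]

# expected total time to reach n species: sum_{k=1}^{n-1} 1/(k*lam)
lam, n, trials = 1.0, 10, 20000
mean_total = sum(sum(yule_speciation_waits(n, lam)) for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, n)) / lam
```

The Monte Carlo mean converges on the harmonic sum, illustrating why closed-form expressions are preferable: the simulation needs tens of thousands of replicates to match what the formula gives exactly.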
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodge, D. A.; Harris, D. B.
2016-03-15
Correlation detectors are of considerable interest to the seismic monitoring communities because they offer reduced detection thresholds and combine detection, location and identification functions into a single operation. They appear to be ideal for applications requiring screening of frequent repeating events. However, questions remain about how broadly empirical correlation methods are applicable. We describe the effectiveness of banks of correlation detectors in a system that combines traditional power detectors with correlation detectors in terms of efficiency, which we define to be the fraction of events detected by the correlators. This paper elaborates and extends the concept of a dynamic correlation detection framework – a system which autonomously creates correlation detectors from event waveforms detected by power detectors; and reports observed performance on a network of arrays in terms of efficiency. We performed a large scale test of dynamic correlation processors on an 11 terabyte global dataset using 25 arrays in the single frequency band 1-3 Hz. The system found over 3.2 million unique signals and produced 459,747 screened detections. A very satisfying result is that, on average, efficiency grows with time and, after nearly 16 years of operation, exceeds 47% for events observed over all distance ranges and approaches 70% for near regional and 90% for local events. This observation suggests that future pipeline architectures should make extensive use of correlation detectors, principally for decluttering observations of local and near-regional events. Our results also suggest that future operations based on correlation detection will require commodity large-scale computing infrastructure, since the numbers of correlators in an autonomous system can grow into the hundreds of thousands.
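The core operation of such a correlation detector is a sliding normalized cross-correlation of a master-event template against continuous data, with detections declared above a threshold. The sketch below is a toy illustration under invented signal, noise, and threshold choices, not the authors' pipeline.

```python
import numpy as np

def correlate_template(trace, template):
    """Sliding normalized cross-correlation of a waveform template
    against a continuous trace; values near 1 flag repeating events."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(trace) - m + 1)
    for i in range(len(out)):
        w = trace[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[i] = (w * t).mean()
    return out

rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 6 * np.pi, 60))  # toy "master event"
trace = 0.1 * rng.standard_normal(500)            # background noise
trace[120:180] += template                        # buried repeat
trace[350:410] += 0.5 * template                  # weaker repeat
cc = correlate_template(trace, template)
detections = np.flatnonzero(cc > 0.8)
```

Because the correlation is amplitude-normalized, the weaker repeat is detected almost as cleanly as the full-amplitude one, which is what gives correlation detectors their reduced detection thresholds relative to power detectors.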
NASA Astrophysics Data System (ADS)
Bastin, Sophie; Champollion, Cédric; Bock, Olivier; Drobinski, Philippe; Masson, Frédéric
2005-03-01
Global Positioning System (GPS) tomography analyses of water vapor, complemented by high-resolution numerical simulations, are used to investigate a Mistral/sea breeze event in the region of Marseille, France, during the ESCOMPTE experiment. This is the first time GPS tomography has been used to validate the three-dimensional water vapor concentration from numerical simulation, and to analyze a small-scale meteorological event. The high spatial and temporal resolution of GPS analyses provides a unique insight into the evolution of the vertical and horizontal distribution of water vapor during the Mistral/sea-breeze transition.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become ubiquitous. This growth has been accompanied by a rise in illicit events, and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol, statistical methods, and timely monitoring of the environment.
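A minimal statistical detector in this spirit flags time bins whose flow count deviates strongly from a baseline. The sketch below (with invented counts and threshold) only illustrates the idea of applying simple statistics to NetFlow-style aggregates; it is not the paper's method.

```python
from statistics import mean, stdev

def flag_anomalies(counts, k=3.0):
    """Flag time bins whose flow count deviates more than k standard
    deviations from the series mean -- a minimal statistical detector
    over NetFlow-style per-bin counts."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts) if abs(c - mu) > k * sigma]

# steady traffic with one burst (e.g. a scan or flood at bin 7)
counts = [100, 98, 103, 101, 99, 102, 97, 500, 100, 101]
alerts = flag_anomalies(counts, k=2.0)
```

In practice the baseline would be estimated from a sliding window of recent history rather than the whole series, so that slow diurnal drift is not mistaken for an attack.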
Forest Fire Management: A Comprehensive And Operational Approach
NASA Astrophysics Data System (ADS)
Fabrizi, Roberto; Perez, Bruno; Gomez, Antonio
2013-12-01
Remote sensing plays an important role in obtaining rapid and complete information on the occurrence and evolution in space and time of forest fires. In this paper, we present a comprehensive study of fire events through Earth Observation data for early warning, crisis monitoring and post-event damage assessment or a synthesis of the fire event, both in a wide spatial range (local to regional) and temporal scale (short to long term). The fire products are stored and distributed by means of a WebGIS and a Geoportal with additional auxiliary geospatial data. These products allow fire managers to perform analysis and decision making in a more comprehensive manner.
The memory remains: Understanding collective memory in the digital age
García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha
2017-01-01
Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881
Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements
NASA Astrophysics Data System (ADS)
Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.
2017-12-01
Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
Time drawings: Spatial representation of temporal concepts.
Leone, María Juliana; Salles, Alejo; Pulver, Alejandro; Golombek, Diego Andrés; Sigman, Mariano
2018-03-01
Time representation is a fundamental property of human cognition. Ample evidence shows that time (and numbers) are represented in space. However, how the conceptual mapping varies across individuals, scales, and temporal structures remains largely unknown. To investigate this issue, we conducted a large online study consisting of five experiments that addressed different time scales and topology: Zones of time, Seasons, Days of the week, Parts of the day and Timeline. Participants were asked to map different kinds of time events to a location in space and to determine their size and color. Results showed that time is organized in space in a hierarchical progression: some features appear to be universal (i.e. selection order), others are shaped by how time is organized in distinct cultures (i.e. location order) and, finally, some aspects vary depending on individual features such as age, gender, and chronotype (i.e. size and color).
NASA Astrophysics Data System (ADS)
Tsai, Christina; Yeh, Ting-Gu
2017-04-01
Extreme weather events are occurring more frequently as a result of climate change. Recently dengue fever has become a serious issue in southern Taiwan. It may have characteristic temporal scales that can be identified. Some researchers have hypothesized that dengue fever incidences are related to climate change. This study applies time-frequency analysis to time series data concerning dengue fever and hydrologic and meteorological variables. Results of three time-frequency analysis methods, the Hilbert-Huang transform (HHT), the wavelet transform (WT), and the short-time Fourier transform (STFT), are compared and discussed. A more effective time-frequency analysis method will be identified to analyze relevant time series data. The most influential time scales of hydrologic and meteorological variables that are associated with dengue fever are determined. Finally, the linkage between hydrologic/meteorological factors and dengue fever incidences can be established.
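Of the three methods compared, the STFT is the simplest to sketch: window the series, Fourier-transform each frame, and track the dominant frequency over time. The example below uses an invented two-regime test signal rather than dengue or meteorological data, and a hand-rolled transform to stay self-contained.

```python
import numpy as np

def stft_power(x, win, hop):
    """Short-time Fourier transform power: Hann-windowed FFT of
    overlapping segments, one spectrum per time frame."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

fs = 100.0                               # invented sampling rate
t = np.arange(2000) / fs
# signal whose dominant periodicity changes halfway through
x = np.where(t < 10, np.sin(2 * np.pi * 2 * t), np.sin(2 * np.pi * 8 * t))
p = stft_power(x, win=200, hop=100)
freqs = np.fft.rfftfreq(200, d=1 / fs)
dominant = freqs[p.argmax(axis=1)]       # dominant frequency per frame
```

The fixed window is the STFT's main limitation: time and frequency resolution are traded off once for all scales, which is exactly what the WT and HHT relax.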
MMS Observations of Parallel Electric Fields During a Quasi-Perpendicular Bow Shock Crossing
NASA Astrophysics Data System (ADS)
Goodrich, K.; Schwartz, S. J.; Ergun, R.; Wilder, F. D.; Holmes, J.; Burch, J. L.; Gershman, D. J.; Giles, B. L.; Khotyaintsev, Y. V.; Le Contel, O.; Lindqvist, P. A.; Strangeway, R. J.; Russell, C.; Torbert, R. B.
2016-12-01
Previous observations of the terrestrial bow shock have frequently shown large-amplitude fluctuations in the parallel electric field. These parallel electric fields are seen as both nonlinear solitary structures, such as double layers and electron phase-space holes, and short-wavelength waves, which can reach amplitudes greater than 100 mV/m. The Magnetospheric Multi-Scale (MMS) Mission has crossed the Earth's bow shock more than 200 times. The parallel electric field signatures observed in these crossings are seen in very discrete packets and evolve over time scales of less than a second, indicating the presence of a wealth of kinetic-scale activity. The high time resolution of the Fast Particle Instrument (FPI) available on MMS offers greater detail of the kinetic-scale physics that occur at bow shocks than ever before, allowing greater insight into the overall effect of these observed electric fields. We present a characterization of these parallel electric fields found in a single bow shock event and how it reflects the kinetic-scale activity that can occur at the terrestrial bow shock.
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
NASA Technical Reports Server (NTRS)
Griffith, Karen
2011-01-01
The purpose of this paper is to look for links between a virtual trainee's interest and self-efficacy in a simulated event and their previously self-reported technical skill level. Ultimately, the idea would be to provide the right amount of support at the right place at the right time to set the conditions for maximum transfer of the skill sets to the work place. An anecdotal recap of a recent experiment of a medium-scale training event produced in a virtual world will provide examples for discussion. In July 2010, a virtual training event was produced for the Air Force Research Lab's Games for Team Training (GaMeTT) at the Patriot Exercise at Volk Field in Wisconsin. There were 29 EMEDS participants who completed the simulated OCO event using the OLIVE gaming engine. Approximately 25 avatars were present at any given time; including role players, observers, coordinators and participants.
Non-stationary least-squares complex decomposition for microseismic noise attenuation
NASA Astrophysics Data System (ADS)
Chen, Yangkang
2018-06-01
Microseismic data processing and imaging are crucial for subsurface real-time monitoring during the hydraulic fracturing process. Unlike active-source seismic events or large-scale earthquake events, the microseismic event is usually of very small magnitude, which makes its detection challenging. The biggest challenge with microseismic data is its low signal-to-noise ratio. Because of the small energy difference between the effective microseismic signal and ambient noise, the effective signals are usually buried in strong random noise. I propose a useful microseismic denoising algorithm that is based on decomposing a microseismic trace into an ensemble of components using least-squares inversion. Based on the predictive property of the useful microseismic event along the time direction, the random noise can be filtered out via least-squares fitting of multiple damping exponential components. The method is flexible and almost automated, since the only parameter that needs to be defined is a decomposition number. I use some synthetic and real data examples to demonstrate the potential of the algorithm in processing complicated microseismic data sets.
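Decomposition of a trace into damped components can be sketched as ordinary linear least squares once candidate components are fixed: build a basis of damped sinusoids, solve for their amplitudes, and take the fitted sum as the denoised trace. The frequencies, damping, and noise level below are invented for illustration; this is a simplified stand-in for the paper's inversion, not its implementation.

```python
import numpy as np

def ls_decompose(trace, freqs, damp, dt):
    """Least-squares fit of damped cosine/sine pairs at candidate
    frequencies: the fitted sum is the signal estimate, the residual
    is treated as random noise."""
    t = np.arange(len(trace)) * dt
    cols = []
    for f in freqs:
        e = np.exp(-damp * t)
        cols += [e * np.cos(2 * np.pi * f * t), e * np.sin(2 * np.pi * f * t)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, trace, rcond=None)
    return A @ coef                       # denoised estimate

rng = np.random.default_rng(3)
dt, n = 0.004, 500
t = np.arange(n) * dt
clean = np.exp(-2.0 * t) * np.cos(2 * np.pi * 30 * t)  # toy wavelet
noisy = clean + 0.3 * rng.standard_normal(n)
denoised = ls_decompose(noisy, freqs=[30.0], damp=2.0, dt=dt)
```

Because the basis has only a few degrees of freedom, very little of the random noise projects onto it, which is why the fitted sum suppresses noise so strongly.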
Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako
2016-11-01
To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3×10^{9} Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
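Ensemble fluctuation scaling of the kind measured here (Taylor's law, σ ∝ μ^α) can be estimated with a log-log regression across items. The sketch below uses synthetic independent Poisson "word counts", for which the exponent is exactly 0.5, rather than blog data; it only illustrates the measurement, not the paper's diffusion model.

```python
import numpy as np

rng = np.random.default_rng(4)

def fluctuation_scaling_exponent(counts):
    """Ensemble fluctuation scaling (Taylor's law): fit sigma ~ mean^alpha
    across items via a log-log least-squares slope."""
    m = counts.mean(axis=1)
    s = counts.std(axis=1)
    slope, _ = np.polyfit(np.log(m), np.log(s), 1)
    return slope

# independent Poisson counts -> sigma = sqrt(mean), i.e. alpha = 0.5
means = np.logspace(0.5, 3, 30)            # 30 "words", varied popularity
counts = rng.poisson(means[:, None], size=(30, 2000))
alpha = fluctuation_scaling_exponent(counts)
```

Deviations of the measured exponent away from the Poisson value 0.5, toward 1, are the signature of the correlated, heterogeneous blogger activity the random diffusion model is built to capture.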
Automated parton-shower variations in PYTHIA 8
Mrenna, S.; Skands, P.
2016-10-03
In the era of precision physics measurements at the LHC, efficient and exhaustive estimations of theoretical uncertainties play an increasingly crucial role. In the context of Monte Carlo (MC) event generators, the estimation of such uncertainties traditionally requires independent MC runs for each variation, for a linear increase in total run time. In this work, we report on an automated evaluation of the dominant (renormalization-scale and nonsingular) perturbative uncertainties in the pythia 8 event generator, with only a modest computational overhead. Each generated event is accompanied by a vector of alternative weights (one for each uncertainty variation), with each set separately preserving the total cross section. Explicit scale-compensating terms can be included, reflecting known coefficients of higher-order splitting terms and reducing the effect of the variations. Finally, the formalism also allows for the enhancement of rare partonic splittings, such as g→bb¯ and q→qγ, to obtain weighted samples enriched in these splittings while preserving the correct physical Sudakov factors.
Willard, D.A.; Bernhardt, C.E.; Korejwo, D.A.; Meyers, S.R.
2005-01-01
We present paleoclimatic evidence for a series of Holocene millennial-scale cool intervals in eastern North America that occurred every ∼1400 years and lasted ∼300-500 years, based on pollen data from Chesapeake Bay in the mid-Atlantic region of the United States. The cool events are indicated by significant decreases in pine pollen, which we interpret as representing decreases in January temperatures of between 0.2° and 2°C. These temperature decreases include excursions during the Little Ice Age (∼1300-1600 AD) and the 8 ka cold event. The timing of the pine minima is correlated with a series of quasi-periodic cold intervals documented by various proxies in Greenland, North Atlantic, and Alaskan cores and with solar minima interpreted from cosmogenic isotope records. These events may represent changes in circumpolar vortex size and configuration in response to intervals of decreased solar activity, which altered jet stream patterns to enhance meridional circulation over eastern North America.
North Atlantic weather regimes: A synoptic study of phase space. M.S. Thesis
NASA Technical Reports Server (NTRS)
Orrhede, Anna Karin
1990-01-01
In the phase space of weather, low frequency variability (LFV) of the atmosphere can be captured in a large scale subspace, where a trajectory connects consecutive large scale weather maps, thus revealing flow changes and recurrences. Using this approach, Vautard applied the trajectory speed minimization method (Vautard and Legras) to atmospheric data. From 37 winters of 700 mb geopotential height anomalies over the North Atlantic and the adjacent land masses, four persistent and recurrent weather patterns, interpreted as weather regimes, were discernible: a blocking regime, a zonal regime, a Greenland anticyclone regime, and an Atlantic regime. These regimes are studied further in terms of maintenance and transitions. A regime survey unveils preferences regarding event durations and precursors for the onset or break of an event. The transition frequencies between regimes vary, and together with the transition times, suggest the existence of easier transition routes. These matters are more systematically studied using complete synoptic map sequences from a number of events.
Networks as Renormalized Models for Emergent Behavior in Physical Systems
NASA Astrophysics Data System (ADS)
Paczuski, Maya
2005-09-01
Networks are paradigms for describing complex biological, social and technological systems. Here I argue that networks provide a coherent framework to construct coarse-grained models for many different physical systems. To elucidate these ideas, I discuss two long-standing problems. The first concerns the structure and dynamics of magnetic fields in the solar corona, as exemplified by sunspots that startled Galileo almost 400 years ago. We discovered that the magnetic structure of the corona embodies a scale-free network, with spots at all scales. A network model representing the three-dimensional geometry of magnetic fields, where links rewire and nodes merge when they collide in space, gives quantitative agreement with available data, and suggests new measurements. Seismicity is addressed in terms of relations between events without imposing space-time windows. A metric estimates the correlation between any two earthquakes. Linking strongly correlated pairs, and ignoring pairs with weak correlation, organizes the spatio-temporal process into a sparse, directed, weighted network. New scaling laws for seismicity are found. For instance, the aftershock decay rate decreases as ~1/t in time up to a correlation time, t_Omori. An estimate from the data gives t_Omori to be about one year for small magnitude 3 earthquakes, about 1400 years for the Landers event, and roughly 26,000 years for the earthquake causing the 2004 Asian tsunami. Our results confirm Kagan's conjecture that aftershocks can rumble on for centuries.
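The ~1/t aftershock decay quoted above can be illustrated on a synthetic catalogue: sample event times with density proportional to 1/t, bin the rate in logarithmic time bins, and recover the decay exponent by log-log regression. The time range and event count below are invented for illustration.

```python
import numpy as np

# Synthetic Omori-style catalogue with exponent p = 1 (rate ~ 1/t).
rng = np.random.default_rng(0)
t_min, t_max, n_events = 0.01, 100.0, 5000
u = rng.random(n_events)
times = t_min * (t_max / t_min) ** u          # inverse-CDF sampling of 1/t

# Bin the occurrence rate in logarithmic time bins and fit a power law.
bins = np.logspace(np.log10(t_min), np.log10(t_max), 30)
counts, edges = np.histogram(times, bins=bins)
rate = counts / np.diff(edges)                # events per unit time per bin
centers = np.sqrt(edges[:-1] * edges[1:])     # geometric bin centers
mask = rate > 0
slope, _ = np.polyfit(np.log(centers[mask]), np.log(rate[mask]), 1)
p_est = -slope                                # should land near 1
```

In a real analysis the fit would stop at the correlation time t_Omori, beyond which the rate flattens into the background seismicity.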
Design of a High Luminosity 100 TeV Proton Antiproton Collider
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveros Tuativa, Sandra Jimena
2017-04-01
Currently new physics is being explored with the Large Hadron Collider at CERN and with Intensity Frontier programs at Fermilab and KEK. The energy scale for new physics is known to be in the multi-TeV range, signaling the need for a future collider which well surpasses this energy scale. A 10^34 cm^-2 s^-1 luminosity, 100 TeV proton-antiproton collider is explored, with 7 times the energy of the LHC. The dipoles are 4.5 T to reduce cost. A proton-antiproton collider is selected as a future machine for several reasons. The cross section for many high-mass states is 10 times higher in pp̄ than in pp collisions. Antiquarks for production can come directly from an antiproton rather than indirectly from gluon splitting. The higher cross sections reduce the synchrotron radiation in superconducting magnets and the number of events per bunch crossing, because lower beam currents can produce the same rare event rates. Events are also more centrally produced, allowing a more compact detector with less space between quadrupole triplets and a smaller β* for higher luminosity. To adjust to antiproton beam losses (burn rate), a Fermilab-like antiproton source would be adapted to disperse the beam into 12 different momentum channels, using electrostatic septa, to increase antiproton momentum capture 12 times. At Fermilab, antiprotons were stochastically cooled in one Debuncher and one Accumulator ring. Because the stochastic cooling time scales as the number of particles, two options of 12 independent cooling systems are presented. One electron cooling ring might follow the stochastic cooling rings for antiproton stacking. Finally, antiprotons in the collider ring would be recycled during runs without leaving the collider ring, by joining them to new bunches with snap bunch coalescence and synchrotron damping. These basic ideas are explored in this work on a future 100 TeV proton-antiproton collider, and the main parameters are presented.
The deadliest storm of the 20th century striking Portugal: Flood impacts and atmospheric circulation
NASA Astrophysics Data System (ADS)
Trigo, Ricardo M.; Ramos, Catarina; Pereira, Susana S.; Ramos, Alexandre M.; Zêzere, José L.; Liberato, Margarida L. R.
2016-10-01
The deadliest storm affecting Portugal since at least the early 19th century took place on 25 and 26 November 1967, causing more than 500 fatalities. This work aims to assess the most relevant aspects of this episode. This includes describing the associated meteorological conditions and key hydrological characteristics, such as the level of exceptionality of the observed precipitation at different temporal scales, or the estimation of peak discharge values in 20 small river catchments affected. Additionally, from a human impact perspective we provide a full account of all the main socio-economic impacts, particularly the numbers and locations of victims (dead, injured, homeless and evacuated). Based on the sub-daily time series of a representative station, and its Intensity-Duration-Frequency curves, we have found that the exceptionality of this rainfall event is particularly linked to rainfall intensities ranging in duration from 4 to 9 h, compatible with return periods of 100 years or more. This range of time scales is similar to the estimated concentration times of the hydrographic basins affected by the flash flood event. From a meteorological perspective, this episode was characterised by strong convection at the regional scale, fuelled by high availability of moisture over the Lisbon region associated with a low pressure system centered near Lisbon that favoured convective instability. Most victims were sleeping or were caught by surprise at home in the small river catchments around the main Lisbon metropolitan area. The majority of people who died or who were severely affected by the flood lived in degraded housing, often raised in a clandestine way, occupying flood plains near the stream beds. This level of destruction is in stark contrast to what was observed in subsequent episodes of similar amplitude. In particular, since 1967 the Lisbon area has been struck by two comparably intense precipitation events, in 1983 and 2008, which generated considerably fewer deaths and evacuations.
2012-01-20
ultrasonic Lamb waves to plastic strain and fatigue life. Theory was developed and validated to predict second harmonic generation for specific modes... Fatigue and damage generation and progression are processes consisting of a series of interrelated events that span large scales of space and time... strain and fatigue life. A set of experiments was completed that worked to relate the acoustic nonlinearity measured with Lamb waves to both the plastic strain and the fatigue life.
NASA Astrophysics Data System (ADS)
Espírito Santo, Fátima; de Lima, Isabel P.; Silva, Álvaro; Pires, Vanda; de Lima, João L. M. P.
2014-05-01
Large-scale atmospheric circulation patterns and their persistence are known to drive inter-annual variability of precipitation in Europe, though their influence depends on geographical location; this includes precipitation extremes and their trends. The vast range of time and space scales involved sometimes leads to precipitation deficits and surpluses, which may affect society, the environment and the economy in different ways at the local and regional scales, depending on specific conditions. In addition, changes in the climate are expected to affect the occurrence of extreme weather and climate events that might significantly influence the distribution, availability and sustainability of regional water resources. The location of mainland Portugal in the Northeast Atlantic region, in south-western Europe, together with other geographical features, makes this territory vulnerable to extreme dry/wet hydro-meteorological events, driven by the strong variability in precipitation. In our study we discuss, for this territory, the relation between the spatio-temporal variability in those events, including their persistence at different scales, and the variability in several modes of low frequency variability; special attention is dedicated to the North Atlantic Oscillation (NAO) and the Scandinavian pattern (SCAND). Some of these dry/wet episodes affect different aspects of the hydrologic cycle and are likely to lead to drought and soil wetness/saturation conditions that can enhance flood events. Such episodes were categorized here using the Standardized Precipitation Index (SPI), which was calculated at short (3- and 6-month) and long (12- and 24-month) time scales from monthly precipitation data recorded in the 1941-2012 period (72 years) at 50 precipitation stations scattered across the study area. Moreover, because the SPI is a normalized index, it is also suitable for providing spatial representations of these conditions, allowing comparison between areas within the same region.
Thus, indices were interpolated for the whole territory using deterministic and geostatistical methods, and the zonal statistics results were mapped; the spatial interpolation, analysis and mapping were implemented in ArcGIS. Results confirm that the precipitation in this region is strongly influenced by the NAO and SCAND, in particular in the wettest months. Moreover, the annual SPI shows a significant increase in the extent of dry extremes and a non-significant decrease in the extent of wet extremes. For shorter time scales, the behaviour depends on the season. We discuss the observed SPI trends and the uncertainties for the precipitation regime in the southern and western parts of the Iberian Peninsula, which includes mainland Portugal. Results underline potential applications of SPI for water resources management, which is discussed in the context of the regional hydrological conditions and increasing demand for water for different uses.
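The SPI computation outlined above (aggregate precipitation at a chosen time scale, then transform to a standard normal index) can be sketched as follows. This is a simplified nonparametric plotting-position variant; the operational SPI typically fits a gamma distribution to the aggregated sums. Station values and parameters below are invented.

```python
import numpy as np
from statistics import NormalDist

def spi(monthly_precip, scale=3):
    # Aggregate precipitation over `scale`-month rolling windows.
    x = np.convolve(monthly_precip, np.ones(scale), mode="valid")
    n = len(x)
    # Empirical CDF via Weibull plotting positions, mapped through the inverse
    # standard normal; a gamma fit would replace this step in the classical SPI.
    ranks = x.argsort().argsort()          # 0-based rank of each window sum
    prob = (ranks + 1) / (n + 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in prob])

rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=30.0, size=12 * 72)  # 72 years of monthly totals
z = spi(precip, scale=3)
# z is standardized: mean near 0; values below about -1.5 flag severe dryness.
```

Because the index is standardized, maps of `z` computed at different stations are directly comparable, which is what makes the spatial interpolation described above meaningful.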
Collective benefits in traffic during mega events via the use of information technologies
Xu, Yanyan; González, Marta C.
2017-01-01
Information technologies today can inform each of us about the route with the shortest time, but they do not contain incentives to manage travellers such that we all get collective benefits in travel times. To that end we need travel demand estimates and target strategies to reduce the traffic volume from the congested roads during peak hours in a feasible way. During large events, the traffic inconveniences in large cities are unusually high, yet temporary, and the entire population may be more willing to adopt collective recommendations for collective benefits in traffic. In this paper, we integrate, for the first time, big data resources to estimate the impact of events on traffic and propose target strategies for collective good at the urban scale. In the context of the Olympic Games in Rio de Janeiro, we first predict the expected increase in traffic. To that end, we integrate data from mobile phones, Airbnb, Waze and transit information, with game schedules and expected attendance in each venue. Next, we evaluate different route choice scenarios for drivers during the peak hours. Finally, we gather information on the trips that contribute the most to the global congestion which could be redirected from vehicles to transit. Interestingly, we show that (i) following new route alternatives during the event with individual shortest times can save more collective travel time than keeping the routine routes used before the event, uncovering the positive value of information technologies during events; (ii) with only a small proportion of people selected from specific areas switching from driving to public transport, the collective travel time can be reduced to a great extent. Results are presented online for evaluation by the public and policymakers (www.flows-rio2016.com (last accessed 3 September 2017)). PMID:28404868
NASA Astrophysics Data System (ADS)
Hinderer, J.; Hector, B.; Séguis, L.; Descloitres, M.; Cohard, J.; Boy, J.; Calvo, M.; Rosat, S.; Riccardi, U.; Galle, S.
2013-12-01
Water storage changes (WSC) are investigated by means of gravity monitoring in Djougou, northern Benin, in the frame of the GHYRAF (Gravity and Hydrology in Africa) project. In this area, WSC are (1) part of the control system for evapotranspiration (ET) processes, a key variable of the West African monsoon cycle, and (2) the state variable for resource management, a critical issue in storage-poor hard-rock basement contexts such as northern Benin. We show the advantages of gravity monitoring for analyzing different processes in the water cycle involved at various time and space scales, using the main gravity sensors available today (FG5 absolute gravimeter, superconducting gravimeter (SG) and CG5 micro-gravimeter). The study area is also part of the long-term observing system AMMA-Catch, and thus under intense hydro-meteorological monitoring (rain, soil moisture, water table level, ET ...). Gravity-derived WSC are compared at all frequencies to hydrological data and to hydrological models calibrated on these data. Discrepancies are analyzed to discuss the pros and cons of each approach. Fast gravity changes (a few hours) are significant when rain events occur, and involve different contributions: rainfall itself, runoff, fast subsurface water redistribution, the screening effect of the gravimeter building, and local topography. We investigate these effects and present the statistical results of a set of rain events recorded with the SG installed in Djougou since July 2010. The intermediate time scale of gravity changes (a few days) is caused by ET and both vertical and horizontal water redistribution. The integrative nature of gravity measurements does not allow us to separate these different contributions, and the screening from the shelter reduces our ability to retrieve ET values. Also, atmospheric corrections are critical at such frequencies and deserve specific attention. 
However, a quick analysis of gravity changes following rain events shows that the values are in accordance with expected ET values (up to about 5 mm/day). Seasonal WSC are analyzed since 2008 using FG5 absolute gravity measurements four times a year, and since 2010 using the continuous SG time series. They can reach up to 12 microGal (≈270 mm) and show clear interannual variability, as can be expected from rainfall variability in the area. This data set allows some estimates of an average specific yield for the local aquifer, together with a scaling factor for Magnetic Resonance Sounding-derived water content.
MAJOR ELECTRON EVENTS AND CORONAL MAGNETIC CONFIGURATIONS OF THE RELATED SOLAR ACTIVE REGIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.; Owen, C. J.; Matthews, S. A.
A statistical survey of 26 major electron events during the period 2002 February through the end of solar cycle 23 is presented. We have obtained electron solar onset times and the peak flux spectra for each event by fitting to a power-law spectrum truncated by an exponential high-energy tail, i.e., f(E) ~ E^(-δ) e^(-E/E0). We also derived the coronal magnetic configurations of the related solar active regions (ARs) from the potential-field source-surface model. It is found that (1) 10 of the 11 well-connected open field-line events are prompt events whose solar onset times coincide with the maxima of flare emission, and 13 of the 14 closed field-line events are delayed events. (2) A not-well-connected open field-line event and one of the closed field-line events are prompt events; they are both associated with large-scale coronal disturbances or dimming. (3) A harder average spectrum is found in open field-line events compared with the closed ones. Specifically, the averaged spectral index δ is 1.6 ± 0.3 in open field-line events and 2.0 ± 0.4 in closed ones. The spectra of three closed field-line events show infinite rollover energies E0. These correlations clearly establish a significant link between the coronal magnetic field-line topology and the escape of charged particles from the flaring ARs into interplanetary space during major solar energetic particle events.
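The truncated power-law form quoted above, f(E) = A · E^(-δ) · e^(-E/E0), becomes linear in log-space, so its parameters can be recovered by ordinary least squares. The sketch below uses synthetic data; the energy grid, amplitude and noise level are illustrative assumptions, not values from the survey.

```python
import numpy as np

# Taking logs of f(E) = A * E**(-delta) * exp(-E/E0) gives
#   log f = log A - delta * log E - E/E0,
# which is linear in the regressors (1, log E, E).
rng = np.random.default_rng(2)
E = np.logspace(1, 3, 40)                        # energy grid (assumed units)
flux = 1e4 * E ** -1.6 * np.exp(-E / 500.0)      # true delta = 1.6, E0 = 500
flux *= np.exp(rng.normal(0.0, 0.05, E.size))    # ~5% multiplicative noise

X = np.column_stack([np.ones_like(E), np.log(E), E])
coef, *_ = np.linalg.lstsq(X, np.log(flux), rcond=None)
delta_est, E0_est = -coef[1], -1.0 / coef[2]
```

A vanishing fitted coefficient on E corresponds to the "infinite rollover energy" case mentioned in the abstract: the exponential cutoff drops out and the spectrum is a pure power law.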
Wang, Yupeng; Ficklin, Stephen P; Wang, Xiyin; Feltus, F Alex; Paterson, Andrew H
2016-01-01
Different modes of gene duplication including whole-genome duplication (WGD), and tandem, proximal and dispersed duplications are widespread in angiosperm genomes. Small-scale, stochastic gene relocations and transposed gene duplications are widely accepted to be the primary mechanisms for the creation of dispersed duplicates. However, here we show that most surviving ancient dispersed duplicates in core eudicots originated from large-scale gene relocations within a narrow window of time following a genome triplication (γ) event that occurred in the stem lineage of core eudicots. We name these surviving ancient dispersed duplicates as relocated γ duplicates. In Arabidopsis thaliana, relocated γ, WGD and single-gene duplicates have distinct features with regard to gene functions, essentiality, and protein interactions. Relative to γ duplicates, relocated γ duplicates have higher non-synonymous substitution rates, but comparable levels of expression and regulation divergence. Thus, relocated γ duplicates should be distinguished from WGD and single-gene duplicates for evolutionary investigations. Our results suggest large-scale gene relocations following the γ event were associated with the diversification of core eudicots.
NASA Astrophysics Data System (ADS)
Chołoniewski, Jan; Chmiel, Anna; Sienkiewicz, Julian; Hołyst, Janusz A.; Küster, Dennis; Kappas, Arvid
2016-09-01
High frequency psychophysiological data create a challenge for quantitative modeling based on Big Data tools since they reflect the complexity of processes taking place in the human body and its responses to external events. Here we present studies of fluctuations in facial electromyography (fEMG) and electrodermal activity (EDA) massive time series and changes of such signals in the course of emotional stimulation. Zygomaticus major (ZYG, "smiling" muscle) activity, corrugator supercilii (COR, "frowning" muscle) activity, and phasic skin conductance (PHSC, sweating) levels of 65 participants were recorded during experiments that involved exposure to emotional stimuli (i.e., IAPS images, reading and writing messages on an artificial online discussion board). Temporal Taylor fluctuation scaling was found when signals for various participants and during various types of emotional events were compared. Values of scaling exponents were close to 1, suggesting an external origin of system dynamics and/or strong interactions between the system's basic elements (e.g., muscle fibres). Our statistical analysis shows that the scaling exponents enable identification of high valence and arousal levels in ZYG and COR signals.
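Taylor's fluctuation scaling referenced above relates each signal's variability to its mean, σ_i ∝ ⟨x_i⟩^α across units; α near 1 points to external driving or strong coupling, whereas α ≈ 0.5 would suggest independent, Poisson-like internal noise. A synthetic sketch (all signal parameters invented):

```python
import numpy as np

# Build many signals whose standard deviation is proportional to their mean
# (alpha = 1), then recover alpha as the log-log slope of std versus mean.
rng = np.random.default_rng(3)
means = np.logspace(0, 3, 50)                   # per-signal mean activity levels
sigs = [m * (1 + 0.2 * rng.standard_normal(2000)) for m in means]

m_emp = np.array([s.mean() for s in sigs])
s_emp = np.array([s.std() for s in sigs])
alpha, _ = np.polyfit(np.log(m_emp), np.log(s_emp), 1)
# alpha near 1: fluctuations scale with the mean (multiplicative/external);
# alpha near 0.5 would indicate independent internal noise.
```

In the study's setting the "units" would be participants or recording channels, and departures of α from the baseline could then be compared across stimulus types.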
NASA Technical Reports Server (NTRS)
Patel, V. L.
1975-01-01
Twenty-one geomagnetic storm events during 1966 and 1970 were studied using simultaneous interplanetary magnetic field and plasma parameters. Explorer 33 and 35 field and plasma data were analyzed on a large scale (hourly) and a small scale (3 min) during the time interval coincident with the initial phase of the geomagnetic storms. The solar-ecliptic Bz component turns southward at the end of the initial phase, thus triggering the main phase decrease in the Dst geomagnetic field. When Bz is already negative, its value becomes further negative. The By component also shows large fluctuations along with Bz. When there are no clear changes in the Bz component, By shows abrupt changes at the main phase onset. From the small-scale behavior of the magnetic and electric fields (E = -V×B), studied in detail for three events, it is found that field fluctuations in By, Bz, Ey and Ez are present in the initial phase. These fluctuations become larger just before the main phase of the storm begins. In the large-scale behavior the field remains quiet because the small-scale variations are averaged out.
Probing interval timing with scalp-recorded electroencephalography (EEG).
Ng, Kwun Kei; Penney, Trevor B
2014-01-01
Humans, and other animals, are able to easily learn the durations of events and the temporal relationships among them in spite of the absence of a dedicated sensory organ for time. This chapter summarizes the investigation of timing and time perception using scalp-recorded electroencephalography (EEG), a non-invasive technique that measures brain electrical potentials on a millisecond time scale. Over the past several decades, much has been learned about interval timing through the examination of the characteristic features of averaged EEG signals (i.e., event-related potentials, ERPs) elicited in timing paradigms. For example, the mismatch negativity (MMN) and omission potential (OP) have been used to study implicit and explicit timing, respectively, the P300 has been used to investigate temporal memory updating, and the contingent negative variation (CNV) has been used as an index of temporal decision making. In sum, EEG measures provide biomarkers of temporal processing that allow researchers to probe the cognitive and neural substrates underlying time perception.
Quantifying memory in complex physiological time-series.
Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R
2013-01-01
In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.
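A simplified, hypothetical analogue of the "memory length" concept can be sketched by asking how long the indicator of rare fluctuations stays autocorrelated. The paper's actual method is inverse statistical analysis, so the threshold rule and noise band below are illustrative assumptions only.

```python
import numpy as np

def memory_length(x, q=0.98, max_lag=200):
    """Lag at which the autocorrelation of the rare-event indicator first
    drops inside the ~95% noise band (2/sqrt(N)); a rough 'memory' scale."""
    thr = np.quantile(x, q)
    e = (x > thr).astype(float)          # indicator of rare fluctuations
    e -= e.mean()
    n = len(e)
    noise = 2.0 / np.sqrt(n)
    for lag in range(1, max_lag):
        acf = np.dot(e[:-lag], e[lag:]) / (e.var() * (n - lag))
        if acf < noise:
            return lag
    return max_lag

rng = np.random.default_rng(4)
iid = rng.standard_normal(20000)         # memoryless series: rare events random
ar = np.zeros(20000)                     # persistent series: rare events cluster
for t in range(1, 20000):
    ar[t] = 0.95 * ar[t - 1] + rng.standard_normal()
# memory_length(ar) should exceed memory_length(iid)
```

The qualitative contrast mirrors the abstract's finding: in a memoryless (healthy-like) series rare events are "forgotten" almost immediately, whereas persistence prolongs the memory for them.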
NASA Astrophysics Data System (ADS)
Kubota, M.; Fukunishi, H.; Okano, S.
2001-07-01
A new optical instrument for studying upper atmospheric dynamics, called the Multicolor All-sky Imaging System (MAIS), has been developed. The MAIS can obtain all-sky images of airglow emission at two different wavelengths simultaneously with a time resolution of several minutes. Since December 1991, imaging observations with the MAIS have been conducted at the Zao observatory (38.09°N, 140.56°E). From these observations, two interesting events with wave structures have been detected in OI 630-nm nightglow images. The first event was observed on the night of June 2/3, 1992 during a geomagnetically quiet period. Simultaneous data of ionospheric parameters showed that they are caused by propagation of the medium-scale traveling ionospheric disturbance (TID). Phase velocity and horizontal wavelength determined from the image data are 45-100 m/s and ~280 km, and the propagation direction is south-westward. The second event was observed on the night of February 27/28, 1992 during a geomagnetic storm. It is found that a large enhancement of OI 630-nm emission is caused by a propagation of the large-scale TID. Meridional components of phase velocities and wavelengths determined from ionospheric data are 305-695 m/s (southward) and 930-5250 km. The source of this large-scale TID appears to be auroral processes at high latitudes.
The predictability of consumer visitation patterns
NASA Astrophysics Data System (ADS)
Krumme, Coco; Llorente, Alejandro; Cebrian, Manuel; Pentland, Alex ("Sandy"); Moro, Esteban
2013-04-01
We consider hundreds of thousands of individual economic transactions to ask: how predictable are consumers in their merchant visitation patterns? Our results suggest that, in the long-run, much of our seemingly elective activity is actually highly predictable. Notwithstanding a wide range of individual preferences, shoppers share regularities in how they visit merchant locations over time. Yet while aggregate behavior is largely predictable, the interleaving of shopping events introduces important stochastic elements at short time scales. These short- and long-scale patterns suggest a theoretical upper bound on predictability, and describe the accuracy of a Markov model in predicting a person's next location. We incorporate population-level transition probabilities in the predictive models, and find that in many cases these improve accuracy. While our results point to the elusiveness of precise predictions about where a person will go next, they suggest the existence, at large time-scales, of regularities across the population.
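The next-location prediction described above can be sketched as a first-order Markov model that backs off to population-level transition counts when an individual's history is sparse; this is one hedged reading of how such probabilities might be combined, and the toy merchant sequences and back-off rule are invented for illustration.

```python
from collections import Counter, defaultdict

def train(sequences):
    """Count per-user and population-level merchant transitions."""
    pop = defaultdict(Counter)           # population transition counts
    per_user = []
    for seq in sequences:
        own = defaultdict(Counter)
        for a, b in zip(seq, seq[1:]):
            own[a][b] += 1
            pop[a][b] += 1
        per_user.append(own)
    return per_user, pop

def predict(user_counts, pop_counts, current):
    """Most likely next merchant: prefer the user's own history, fall back
    to population counts, return None if the location was never seen."""
    counts = user_counts.get(current) or pop_counts.get(current)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

seqs = [list("ababacab"), list("bcbcba"), list("abab")]
per_user, pop = train(seqs)
print(predict(per_user[0], pop, "a"))    # prints 'b': user 0 mostly goes a -> b
```

The back-off step is where the abstract's observation bites: blending in population-level transition probabilities often improves accuracy when individual histories are short.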
Stratigraphy of the Anthropocene.
Zalasiewicz, Jan; Williams, Mark; Fortey, Richard; Smith, Alan; Barry, Tiffany L; Coe, Angela L; Bown, Paul R; Rawson, Peter F; Gale, Andrew; Gibbard, Philip; Gregory, F John; Hounslow, Mark W; Kerr, Andrew C; Pearson, Paul; Knox, Robert; Powell, John; Waters, Colin; Marshall, John; Oates, Michael; Stone, Philip
2011-03-13
The Anthropocene, an informal term used to signal the impact of collective human activity on biological, physical and chemical processes on the Earth system, is assessed using stratigraphic criteria. It is complex in time, space and process, and may be considered in terms of the scale, relative timing, duration and novelty of its various phenomena. The lithostratigraphic signal includes both direct components, such as urban constructions and man-made deposits, and indirect ones, such as sediment flux changes. Already widespread, these are producing a significant 'event layer', locally with considerable long-term preservation potential. Chemostratigraphic signals include new organic compounds, but are likely to be dominated by the effects of CO2 release, particularly via acidification in the marine realm, and man-made radionuclides. The sequence stratigraphic signal is negligible to date, but may become geologically significant over centennial/millennial time scales. The rapidly growing biostratigraphic signal includes geologically novel aspects (the scale of globally transferred species) and geologically will have permanent effects.
The predictability of consumer visitation patterns
Krumme, Coco; Llorente, Alejandro; Cebrian, Manuel; Pentland, Alex ("Sandy"); Moro, Esteban
2013-01-01
We consider hundreds of thousands of individual economic transactions to ask: how predictable are consumers in their merchant visitation patterns? Our results suggest that, in the long-run, much of our seemingly elective activity is actually highly predictable. Notwithstanding a wide range of individual preferences, shoppers share regularities in how they visit merchant locations over time. Yet while aggregate behavior is largely predictable, the interleaving of shopping events introduces important stochastic elements at short time scales. These short- and long-scale patterns suggest a theoretical upper bound on predictability, and describe the accuracy of a Markov model in predicting a person's next location. We incorporate population-level transition probabilities in the predictive models, and find that in many cases these improve accuracy. While our results point to the elusiveness of precise predictions about where a person will go next, they suggest the existence, at large time-scales, of regularities across the population. PMID:23598917
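The next-location prediction described above can be sketched as a first-order Markov chain over merchants, with a fallback to population-level transition counts mirroring the paper's idea of incorporating population-level probabilities. This is a minimal illustrative sketch, not the authors' implementation; merchant names and visit data are invented.

```python
from collections import defaultdict, Counter

def train_markov(visit_sequence):
    """First-order Markov model: count transitions between consecutive merchants."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(visit_sequence, visit_sequence[1:]):
        transitions[prev][nxt] += 1
    return transitions

def predict_next(transitions, current, population_counts=None):
    """Predict the most likely next merchant; fall back to population-level
    transition counts (a hypothetical stand-in for the paper's population
    priors) when the individual has no history from the current merchant."""
    if transitions[current]:
        return transitions[current].most_common(1)[0][0]
    if population_counts is not None and population_counts[current]:
        return population_counts[current].most_common(1)[0][0]
    return None

# Invented visit history for one shopper
visits = ["grocery", "cafe", "grocery", "cafe", "gym", "grocery", "cafe"]
model = train_markov(visits)
print(predict_next(model, "grocery"))  # prints "cafe": its most frequent successor
```

The short-time-scale stochasticity the abstract mentions shows up here as ties and sparse rows in the transition table, which is exactly where the population-level fallback helps.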
Ground-based demonstration of the European Laser Timing (ELT) experiment.
Schreiber, Karl Ulrich; Prochazka, Ivan; Lauber, Pierre; Hugentobler, Urs; Schäfer, Wolfgang; Cacciapuoti, Luigi; Nasca, Rosario
2010-03-01
The development of techniques for the comparison of distant clocks and for the distribution of stable and accurate time scales has important applications in metrology and fundamental physics research. Additionally, the rapid progress of frequency standards in the optical domain presently demands additional efforts to improve the performance of existing time and frequency transfer links. Present clock comparison systems in the microwave domain are based on GPS and two-way satellite time and frequency transfer (TWSTFT). European Laser Timing (ELT) is an optical link presently under study in the frame of the ESA mission Atomic Clock Ensemble in Space (ACES). The on-board hardware for ELT consists of a corner cube retro-reflector (CCR), a single-photon avalanche diode (SPAD), and an event timer board connected to the ACES time scale. Light pulses fired toward ACES by a laser ranging station will be detected by the SPAD diode and time tagged in the ACES time scale. At the same time, the CCR will redirect the laser pulse toward the ground station, providing precise ranging information. We have carried out a ground-based feasibility study at the Geodetic Observatory Wettzell. By using ordinary satellites with laser reflectors and providing a second independent detection port and laser pulse timing unit with an independent time scale, it is possible to evaluate many aspects of the proposed time transfer link before the ACES launch.
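The core arithmetic of this kind of laser time transfer can be sketched as follows, assuming a symmetric up/down light path and ignoring atmospheric and relativistic corrections (which the real ELT analysis must include); all numbers are hypothetical.

```python
def clock_offset(t_fire, t_return, t_board):
    """Estimate the space-vs-ground clock offset from one laser shot.

    t_fire, t_return: fire and echo-return epochs in the ground time scale (s)
    t_board: on-board SPAD detection epoch in the on-board time scale (s)
    Assumes a symmetric uplink/downlink path (sketch only).
    """
    flight = (t_return - t_fire) / 2.0   # one-way light time from the ranging echo
    t_arrival_ground = t_fire + flight   # pulse arrival expressed in ground time
    return t_board - t_arrival_ground    # offset of the on-board scale vs ground

# Hypothetical shot: ~400 km slant range -> ~1.334 ms one-way light time,
# with the on-board clock running 5 ns ahead of the ground clock.
offset = clock_offset(0.0, 2.668e-3, 1.334e-3 + 5e-9)
print(offset)  # ~5e-9 s, i.e. the injected 5 ns offset is recovered
```

The same single shot thus yields both the range (from `flight`) and the clock comparison, which is the appeal of the common-view optical link.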
Scaling differences between large interplate and intraplate earthquakes
NASA Technical Reports Server (NTRS)
Scholz, C. H.; Aviles, C. A.; Wesnousky, S. G.
1985-01-01
A study of large intraplate earthquakes with well-determined source parameters shows that these earthquakes obey a scaling law similar to that of large interplate earthquakes, in which M0 varies as L^2, or u = alpha * L, where L is rupture length and u is slip. In contrast to interplate earthquakes, for which alpha ≈ 1 x 10^-5, for the intraplate events alpha ≈ 6 x 10^-5, which implies that these earthquakes have stress drops about 6 times higher than interplate events. This result is independent of focal mechanism type. It implies that intraplate faults have a higher frictional strength than plate boundaries and, hence, that faults are velocity- or slip-weakening in their behavior. This factor may be important in producing the concentrated deformation that creates and maintains plate boundaries.
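The scaling relation can be illustrated numerically: with seismic moment M0 = mu * A * u and slip u = alpha * L, the moment grows as L^2 for fixed fault width, and the intraplate/interplate contrast reduces to the ratio of alpha values (stress drop scales with slip over length). The rigidity and fault dimensions below are assumed, illustrative values, not from the study.

```python
MU = 3.0e10  # assumed crustal rigidity, Pa (typical textbook value)

def seismic_moment(L, W, alpha):
    """M0 = mu * A * u with average slip u = alpha * L, so M0 grows as L^2
    for a fixed fault width W (the scaling law quoted in the abstract)."""
    u = alpha * L              # average slip (m)
    return MU * (L * W) * u    # seismic moment (N*m)

# Illustrative 100-km-long, 15-km-wide rupture
inter = seismic_moment(100e3, 15e3, 1e-5)  # interplate: alpha ~ 1e-5
intra = seismic_moment(100e3, 15e3, 6e-5)  # intraplate: alpha ~6x larger
print(intra / inter)  # ~6: intraplate slip (and stress drop) about 6x higher
```

For identical rupture dimensions, the moment ratio equals the alpha ratio, which is the abstract's "about 6 times higher stress drop" statement in numerical form.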
NASA Astrophysics Data System (ADS)
Coppola, Erika; Sobolowski, Stefan
2017-04-01
The joint EURO-CORDEX and Med-CORDEX Flagship Pilot Study, dedicated to the frontier research of using convection-permitting models to address the impact of human-induced climate change on convection, has recently been approved; the scientific community behind the project comprises 30 scientific institutes distributed across Europe. The motivations for such a challenge are the availability of large field campaigns dedicated to the study of heavy precipitation events, such as HyMeX, and of high-resolution, dense observation networks like WegenerNet, RdisaggH (CH), COMEPHORE (Fr), SAFRAN (Fr) and EURO4M-APGD (CH); increased computing capacity and model developments; the emerging trend signals in extreme precipitation at daily and especially sub-daily time scales in the Mediterranean and Alpine regions; and the priority given to convective extreme events under the WCRP Grand Challenge on climate extremes, because they carry both society-relevant and scientific challenges. The main objectives of this effort are to investigate convective-scale events, their processes and their changes in a few key regions of Europe and the Mediterranean using convection-permitting RCMs, statistical models and available observations; to provide a collective assessment of modeling capacity at convection-permitting scales; and to shape a coherent, collective assessment of the consequences of climate change for convective-event impacts at local to regional scales. The scientific aims of this research are to investigate how convective events, and the damaging phenomena associated with them, will respond to changing climate conditions in several European regions with different climates; to understand whether an improved representation of convective phenomena at convection-permitting scales will lead to upscaled added value; and finally to assess the possibility of replacing these costly convection-permitting experiments with statistical approaches such as "convection emulators".
The common initial domain will be an extended Alpine domain, and all groups will simulate a minimum 10-year period with ERA-Interim boundary conditions, with the possibility of two other sub-domains, one in northwestern continental Europe and another in the southeastern Mediterranean. The scenario simulations will be completed for three different 10-year time slices: one in the historical period, one in the near future and the last in the far future, for the RCP8.5 scenario. The first target of this scientific community is to have an ensemble of 1-2-year ERA-Interim simulations ready by next summer.
Lee, Hsin-Chieh; Huang, Chia-Lin; Ho, Sui-Hua; Sung, Wen-Hsu
2017-10-01
The aim of this study was to investigate the effects of virtual reality (VR) balance training conducted using Kinect for Xbox® games on patients with chronic stroke. Fifty patients with mild to moderate motor deficits were recruited and randomly assigned to two groups: VR plus standard treatment group and standard treatment (ST) group. In total, 12 training sessions (90 minutes a session, twice a week) were conducted in both groups, and performance was assessed at three time points (pretest, post-test, and follow-up) by a blinded assessor. The outcome measures were the Berg Balance Scale (BBS), Functional Reach Test, and Timed Up and Go Test (cognitive; TUG-cog) for balance evaluations; Modified Barthel Index for activities of daily living ability; Activities-specific Balance Confidence Scale for balance confidence; and Stroke Impact Scale for quality of life. The pleasure scale and adverse events were also recorded after each training session. Both groups exhibited significant improvement over time in the BBS (P = 0.000) and TUG-cog test (P = 0.005). The VR group rated the experience as more pleasurable than the ST group during the intervention (P = 0.027). However, no significant difference was observed in other outcome measures within or between the groups. No serious adverse events were observed during the treatment in either group. VR balance training by using Kinect for Xbox games plus the traditional method had positive effects on the balance ability of patients with chronic stroke. The VR group experienced higher pleasure than the ST group during the intervention.
Context-aware event detection smartphone application for first responders
NASA Astrophysics Data System (ADS)
Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.
2013-05-01
The rise of social networking platforms like Twitter, Facebook, etc., has enabled seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding events happening in their immediate vicinity in real time. The human-centric sensed data generated in this "human-as-sensor" approach is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). When appropriately employed, this real-time data can support the detection of localized events like fires, accidents and shootings as they unfold, and pinpoint the individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that can provide an augmented-reality view of detected events in a given (localized) geographical location and also provide an event-search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that deals with data challenges such as identifying and aggregating important events, analyzing and correlating events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data-processing workflow has been successfully field tested with live user-generated feeds.
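The spatiotemporal aggregation step described above can be sketched with a simple greedy grouping rule: reports close in both space and time are merged into one event. This is a toy stand-in for the actual framework, not its implementation; the coordinates, thresholds and report texts below are invented.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def aggregate_events(reports, max_km=1.0, max_minutes=30):
    """Greedy spatiotemporal grouping: a report joins an existing event if it
    falls within max_km and max_minutes of that event's first report;
    otherwise it seeds a new event."""
    events = []
    for lat, lon, t_min, text in sorted(reports, key=lambda r: r[2]):
        for ev in events:
            lat0, lon0, t0, _ = ev[0]
            if haversine_km(lat0, lon0, lat, lon) <= max_km and t_min - t0 <= max_minutes:
                ev.append((lat, lon, t_min, text))
                break
        else:
            events.append([(lat, lon, t_min, text)])
    return events

# Invented user reports: (lat, lon, minutes since start, text)
reports = [
    (39.760, -84.190, 0, "smoke near 3rd St"),
    (39.761, -84.191, 5, "fire trucks arriving"),
    (39.900, -84.200, 7, "traffic accident on I-75"),
]
print(len(aggregate_events(reports)))  # prints 2: the two nearby reports merge
```

The redundancy the abstract highlights (the same event reported by multiple users) is what makes even this naive rule produce coherent event clusters.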
Laboratory generated M -6 earthquakes
McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.
2014-01-01
We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
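The magnitudes quoted above follow the standard Hanks-Kanamori moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N*m, which makes clear just how tiny these laboratory sources are. A minimal sketch:

```python
import math

def moment_magnitude(M0):
    """Moment magnitude from seismic moment M0 in N*m (Hanks-Kanamori)."""
    return (2.0 / 3.0) * (math.log10(M0) - 9.1)

def moment_from_magnitude(Mw):
    """Inverse relation: seismic moment (N*m) implied by a given Mw."""
    return 10 ** (1.5 * Mw + 9.1)

# A lab event around M -6 carries a seismic moment of only ~1 N*m,
# some 15 orders of magnitude below a great natural earthquake.
print(moment_from_magnitude(-6.0))  # ~1.26 N*m
print(moment_magnitude(1.26))       # ~ -6.0
```

Because the relation is purely logarithmic, the same stress-drop scaling laws can be compared across the M -7 to M 9 range without any change of formula, which is the point of the waveform comparison in the abstract.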
NASA Astrophysics Data System (ADS)
De Vleeschouwer, David; Da Silva, Anne-Christine; Day, James E.; Whalen, Michael; Claeys, Philippe
2016-04-01
Milankovitch cycles (obliquity, eccentricity and precession) result in changes in the distribution of solar energy over seasons, as well as over latitudes, on time scales of tens of thousands to millions of years. These changing patterns in insolation have induced significant variations in Earth's past climate over the last 4.5 billion years. Cyclostratigraphy and astrochronology utilize the geologic imprint of such quasi-cyclic climatic variations to measure geologic time. In recent years, major improvements to the Geologic Time Scale have been proposed through the application of cyclostratigraphy, mostly for the Mesozoic and Cenozoic (Gradstein et al., 2012). However, the field of Paleozoic cyclostratigraphy and astrochronology is still in its infancy, and the application of cyclostratigraphic techniques in the Paleozoic allows for a whole new range of research questions. For example, unraveling the timing and pacing of environmental changes across the Late Devonian mass extinction on a 10^5-year time scale is one such novel research question. Here, we present a global cyclostratigraphic framework for late Frasnian to early Famennian climatic and environmental change, through the integration of globally distributed sections. The backbone of this relative time scale consists of previously published cyclostratigraphies for western Canada and Poland (De Vleeschouwer et al., 2012; De Vleeschouwer et al., 2013). We build on this Euramerican base by integrating new proxy data, interpreted in terms of astronomical climate forcing, from the Iowa basin (USA; magnetic susceptibility and carbon isotope data) and Belgium (XRF and carbon isotope data). Next, we expand this well-established cyclostratigraphic framework toward the Paleo-Tethys Ocean, using magnetic susceptibility and carbon isotope records from the Fuhe section in South China (Whalen et al., 2015).
The resulting global cyclostratigraphic framework implies an important refinement of the late Frasnian to early Famennian stratigraphy, but also allows for an evaluation of the role of astronomical forcing in perturbing the global carbon cycle and pacing anoxic conditions throughout the Late Devonian mass extinction event. The late Frasnian anoxic Kellwasser events, for example, each represent only a portion of a 405-kyr eccentricity cycle, with the onset of both events separated by 500-600 kyr. References: De Vleeschouwer, D., Whalen, M. T., Day, J. E., and Claeys, P., 2012, Cyclostratigraphic calibration of the Frasnian (Late Devonian) time scale (western Alberta, Canada): Geological Society of America Bulletin, v. 124, no. 5-6, p. 928-942. De Vleeschouwer, D., Rakociński, M., Racki, G., Bond, D. P., Sobień, K., and Claeys, P., 2013, The astronomical rhythm of Late-Devonian climate change (Kowala section, Holy Cross Mountains, Poland): Earth and Planetary Science Letters, v. 365, p. 25-37. Gradstein, F. M., Ogg, J. G., Schmitz, M., and Ogg, G., 2012, The Geologic Time Scale 2012 2-Volume Set, Elsevier. Whalen, M. T., Śliwiński, M. G., Payne, J. H., Day, J. E., Chen, D., and da Silva, A.-C., 2015, Chemostratigraphy and magnetic susceptibility of the Late Devonian Frasnian-Famennian transition in western Canada and southern China: implications for carbon and nutrient cycling and mass extinction: Geological Society, London, Special Publications, v. 414.
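The kind of cycle extraction that underlies such a framework can be sketched with a synthetic proxy record: generate a noisy 405-kyr eccentricity signal and recover its period from a periodogram. The data below are synthetic and purely illustrative; real cyclostratigraphic workflows add evolutive spectra, significance testing and astrochronologic tuning.

```python
import numpy as np

# Synthetic proxy series: a 405-kyr eccentricity cycle plus noise,
# sampled every 5 kyr over a 4-Myr record (all values invented).
dt = 5.0                       # kyr per sample
t = np.arange(0.0, 4000.0, dt) # time axis, kyr
rng = np.random.default_rng(0)
proxy = np.sin(2 * np.pi * t / 405.0) + 0.3 * rng.standard_normal(t.size)

# Periodogram: the dominant spectral peak should sit near the 405-kyr period.
freqs = np.fft.rfftfreq(t.size, d=dt)                 # cycles per kyr
power = np.abs(np.fft.rfft(proxy - proxy.mean())) ** 2
dominant_period = 1.0 / freqs[1:][np.argmax(power[1:])]  # skip zero frequency
print(round(dominant_period))  # ~400 kyr (limited by the record's frequency resolution)
```

With a 4-Myr record, the frequency resolution is 1/4000 cycles per kyr, so the 405-kyr line lands on the nearest bin (400 kyr); this resolution limit is one reason long, well-dated sections are needed before events like the Kellwasser horizons can be placed within a single 405-kyr cycle.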